Merged 'master' branch (#3127)
* [src] Change warp-synchronous to cub::BlockReduce (safer but slower) (#3080)

* [src] Fix && and || uses where & and | intended, and other weird errors (#3087)

* [build] Some fixes to Makefiles (#3088)

clang is unhappy with '-rdynamic' in the compile-only step, and the
switch is unnecessary there.

Also, the default location for MKL 64-bit libraries is intel64/; an Intel
rep already declared the em64t/ directory obsolete back in 2010:
https://software.intel.com/en-us/forums/intel-math-kernel-library/topic/285973
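
The practical point, sketched here with hypothetical file names (these are not
Kaldi's actual build rules): '-rdynamic' is a linker flag, so it belongs on the
link line rather than the compile-only line, and MKL is found under intel64/:

    # compile-only step: no -rdynamic here (clang warns the argument is unused)
    g++ -std=c++11 -O2 -c example.cc -o example.o
    # link step: -rdynamic would take effect here, if it were wanted at all,
    # with MKL picked up from intel64/ rather than the obsolete em64t/
    g++ -rdynamic example.o -o example-bin -L"$MKLROOT/lib/intel64" -lmkl_rt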

* [src] Fixed -Wreordered warnings in feat (#3090)

* [egs] Replace bc with perl -e (#3093)
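
The substitution swaps a pipe into bc for an equivalent perl one-liner, since
bc is not installed by default on some systems while perl already is a
dependency of the recipes; a minimal sketch (variable name and numbers made up):

    # before: requires the bc binary
    frame_shift=$(echo "3 * 0.01" | bc)
    # after: the same arithmetic via perl -e
    frame_shift=$(perl -e "print 3 * 0.01;")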

* [scripts] Fix python3 compatibility issue in data-perturbing script (#3084)

* [doc] fix some typos in doc. (#3097)

* [build] Make sure expf() speed probe times sensibly (#3089)

* [scripts] Make sure merge_targets.py works in python3 (#3094)

* [src] ifdef to fix compilation failure on CUDA 8 and earlier (#3103)

* [doc] fix typos and broken links in doc. (#3102)

* [scripts] Fix frame_shift bug in egs/swbd/s5c/local/score_sclite_conf.sh (#3104)

* [src] Fix wrong assertion failure in nnet3-am-compute (#3106)

* [src] Cosmetic changes to natural-gradient code (#3108)

* [src,scripts] Python2 compatibility fixes and code cleanup for nnet1 (#3113)

* [doc] Small documentation fixes; update on Kaldi history (#3031)

* [src] Various mostly-cosmetic changes (copying from another branch) (#3109)

* [scripts] Simplify text encoding in RNNLM scripts (now only support utf-8) (#3065)

* [egs] Add "formosa_speech" recipe (Taiwanese Mandarin ASR) (#2474)

* [egs] python3 compatibility in csj example script (#3123)

* [egs] python3 compatibility in example scripts (#3126)

* [scripts] Bug-fix for removing deleted words (#3116)

The type of --max-deleted-words-kept-when-merging in segment_ctm_edits.py
was a string, which prevented the mechanism from working altogether.
desh2608 authored and danpovey committed Mar 17, 2019
1 parent f93749a commit 2a69516
Showing 295 changed files with 2,009 additions and 316 deletions.
egs/aishell/s5/local/chain/tuning/run_tdnn_1a.sh (1 addition, 1 deletion)
@@ -90,7 +90,7 @@ if [ $stage -le 10 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $treedir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
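
The one-line change that recurs in every file below is the same Python 3
compatibility fix: `print x` is a statement in Python 2 but a SyntaxError under
Python 3, where print is a function, while the parenthesized form is accepted
by both. A quick way to see the difference from the shell (assuming python2 and
python3 binaries are both installed):

    xent_regularize=0.1   # example value; the real scripts set this earlier on
    # old form: a print statement, so it only parses where 'python' is Python 2
    echo "print 0.5/$xent_regularize"   | python3   # SyntaxError
    # new form: a function call under Python 3, and still valid under Python 2
    echo "print (0.5/$xent_regularize)" | python3   # prints 5.0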
egs/aishell/s5/local/chain/tuning/run_tdnn_2a.sh (1 addition, 1 deletion)
@@ -92,7 +92,7 @@ if [ $stage -le 10 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $treedir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/aishell2/s5/local/chain/tuning/run_tdnn_1a.sh (1 addition, 1 deletion)
@@ -103,7 +103,7 @@ fi
 if [ $stage -le 10 ]; then
 echo "$0: creating neural net configs using the xconfig parser";
 num_targets=$(tree-info $treedir/tree | grep num-pdfs | awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 opts="l2-regularize=0.002"
 linear_opts="orthonormal-constraint=1.0"
 output_opts="l2-regularize=0.0005 bottleneck-dim=256"
egs/aishell2/s5/local/chain/tuning/run_tdnn_1b.sh (1 addition, 1 deletion)
@@ -150,7 +150,7 @@ if [ $stage -le 10 ]; then
 echo "$0: creating neural net configs using the xconfig parser";
 feat_dim=$(feat-to-dim scp:data/${train_set}_hires/feats.scp -)
 num_targets=$(tree-info $treedir/tree | grep num-pdfs | awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 opts="l2-regularize=0.002"
 linear_opts="orthonormal-constraint=1.0"
 output_opts="l2-regularize=0.0005 bottleneck-dim=256"
@@ -220,7 +220,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 affine_opts="l2-regularize=0.01 dropout-proportion=0.0 dropout-per-dim=true dropout-per-dim-continuous=true"
 tdnnf_opts="l2-regularize=0.01 dropout-proportion=0.0 bypass-scale=0.66"
 linear_opts="l2-regularize=0.01 orthonormal-constraint=-1.0"
@@ -211,7 +211,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
@@ -235,7 +235,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 tdnn_opts="l2-regularize=0.006"
 lstm_opts="l2-regularize=0.0025 decay-time=20 dropout-proportion=0.0"
 output_opts="l2-regularize=0.001"
egs/ami/s5b/local/chain/tuning/run_cnn_tdnn_lstm_1a.sh (1 addition, 1 deletion)
@@ -184,7 +184,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 lstm_opts="decay-time=20"
egs/ami/s5b/local/chain/tuning/run_cnn_tdnn_lstm_1b.sh (1 addition, 1 deletion)
@@ -176,7 +176,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 lstm_opts="decay-time=20 dropout-proportion=0"
egs/ami/s5b/local/chain/tuning/run_cnn_tdnn_lstm_1c.sh (1 addition, 1 deletion)
@@ -185,7 +185,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 lstm_opts="decay-time=40"
egs/ami/s5b/local/chain/tuning/run_tdnn_1b.sh (1 addition, 1 deletion)
@@ -164,7 +164,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_1c.sh (1 addition, 1 deletion)
@@ -151,7 +151,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_1d.sh (1 addition, 1 deletion)
@@ -163,7 +163,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_1e.sh (1 addition, 1 deletion)
@@ -161,7 +161,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_1f.sh (1 addition, 1 deletion)
@@ -165,7 +165,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_1g.sh (1 addition, 1 deletion)
@@ -166,7 +166,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_1h.sh (1 addition, 1 deletion)
@@ -167,7 +167,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_1i.sh (1 addition, 1 deletion)
@@ -168,7 +168,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 opts="l2-regularize=0.02"
 output_opts="l2-regularize=0.004"
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1a.sh (1 addition, 1 deletion)
@@ -171,7 +171,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1b.sh (1 addition, 1 deletion)
@@ -173,7 +173,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1c.sh (1 addition, 1 deletion)
@@ -172,7 +172,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1d.sh (1 addition, 1 deletion)
@@ -172,7 +172,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1e.sh (1 addition, 1 deletion)
@@ -174,7 +174,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1f.sh (1 addition, 1 deletion)
@@ -173,7 +173,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1g.sh (1 addition, 1 deletion)
@@ -174,7 +174,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1h.sh (1 addition, 1 deletion)
@@ -171,7 +171,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1i.sh (1 addition, 1 deletion)
@@ -174,7 +174,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1j.sh (1 addition, 1 deletion)
@@ -181,7 +181,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 lstm_opts="decay-time=20"
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1k.sh (1 addition, 1 deletion)
@@ -177,7 +177,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 lstm_opts="decay-time=20"
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1l.sh (1 addition, 1 deletion)
@@ -224,7 +224,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1m.sh (1 addition, 1 deletion)
@@ -226,7 +226,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 lstm_opts="decay-time=20 dropout-proportion=0.0"
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1n.sh (1 addition, 1 deletion)
@@ -178,7 +178,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_1o.sh (1 addition, 1 deletion)
@@ -182,7 +182,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 tdnn_opts="l2-regularize=0.025"
 lstm_opts="l2-regularize=0.01"
 output_opts="l2-regularize=0.004"
egs/ami/s5b/local/chain/tuning/run_tdnn_lstm_bs_1a.sh (1 addition, 1 deletion)
@@ -180,7 +180,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 tdnn_opts="l2-regularize=0.003"
 lstm_opts="l2-regularize=0.005"
 output_opts="l2-regularize=0.001"
egs/ami/s5b/local/chain/tuning/run_tdnn_opgru_1a.sh (1 addition, 1 deletion)
@@ -178,7 +178,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 gru_opts="dropout-per-frame=true dropout-proportion=0.0"

 mkdir -p $dir/configs
egs/ami/s5b/local/chain/tuning/run_tdnn_opgru_1b.sh (1 addition, 1 deletion)
@@ -177,7 +177,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 gru_opts="dropout-per-frame=true dropout-proportion=0.0"

 mkdir -p $dir/configs
egs/ami/s5b/local/chain/tuning/run_tdnn_opgru_1c.sh (1 addition, 1 deletion)
@@ -176,7 +176,7 @@ if [ $stage -le 15 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 gru_opts="dropout-per-frame=true dropout-proportion=0.0"

 mkdir -p $dir/configs
egs/aspire/s5/local/chain/tuning/run_blstm_7b.sh (1 addition, 1 deletion)
@@ -138,7 +138,7 @@ if [ $stage -le 11 ]; then

 num_targets=$(tree-info $treedir/tree | grep num-pdfs | awk '{print $2}')
 [ -z $num_targets ] && { echo "$0: error getting num-targets"; exit 1; }
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 lstm_opts="decay-time=20"
egs/aspire/s5/local/chain/tuning/run_tdnn_7b.sh (1 addition, 1 deletion)
@@ -136,7 +136,7 @@ if [ $stage -le 11 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $treedir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/aspire/s5/local/chain/tuning/run_tdnn_lstm_1a.sh (1 addition, 1 deletion)
@@ -151,7 +151,7 @@ if [ $stage -le 12 ]; then
 echo "$0: creating neural net configs using the xconfig parser";

 num_targets=$(tree-info $treedir/tree |grep num-pdfs|awk '{print $2}')
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 lstm_opts="decay-time=40"
egs/babel/s5d/local/chain/tuning/run_tdnn.sh (1 addition, 1 deletion)
@@ -128,7 +128,7 @@ if [ $stage -le 17 ]; then

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
 [ -z $num_targets ] && { echo "$0: error getting num-targets"; exit 1; }
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)

 mkdir -p $dir/configs
 cat <<EOF > $dir/configs/network.xconfig
egs/babel/s5d/local/chain/tuning/run_tdnn_lstm.sh (1 addition, 1 deletion)
@@ -129,7 +129,7 @@ if [ $stage -le 17 ]; then

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
 [ -z $num_targets ] && { echo "$0: error getting num-targets"; exit 1; }
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 lstm_opts="decay-time=20"
 label_delay=5
egs/babel/s5d/local/chain/tuning/run_tdnn_lstm_bab1.sh (1 addition, 1 deletion)
@@ -127,7 +127,7 @@ if [ $stage -le 17 ]; then

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
 [ -z $num_targets ] && { echo "$0: error getting num-targets"; exit 1; }
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 lstm_opts="decay-time=20"
 label_delay=5
egs/babel/s5d/local/chain/tuning/run_tdnn_lstm_bab2.sh (1 addition, 1 deletion)
@@ -127,7 +127,7 @@ if [ $stage -le 17 ]; then

 num_targets=$(tree-info $tree_dir/tree |grep num-pdfs|awk '{print $2}')
 [ -z $num_targets ] && { echo "$0: error getting num-targets"; exit 1; }
-learning_rate_factor=$(echo "print 0.5/$xent_regularize" | python)
+learning_rate_factor=$(echo "print (0.5/$xent_regularize)" | python)
 lstm_opts="decay-time=20"
 label_delay=5