This repository has been archived by the owner on Dec 1, 2021. It is now read-only.

Commit

Change learning rate schedule messages
ruimashita committed Dec 27, 2018
1 parent ea8f63a commit 1b82b44
Showing 12 changed files with 13 additions and 12 deletions.
2 changes: 1 addition & 1 deletion blueoil/blueoil_init.py
@@ -106,7 +106,7 @@
 ("constant", "'constant' -> constant learning rate."),
 ("2-step-decay", "'2-step-decay' -> learning rate decrease by 1/10 on {epochs}/2 and {epochs}-1."),
 ("3-step-decay", "'3-step-decay' -> learning rate decrease by 1/10 on {epochs}/3 and {epochs}*2/3 and {epochs}-1"),
-("3-step-decay-with-warmup", "'3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train same as '3-step-decay'"),
+("3-step-decay-with-warmup", "'3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train the same way as '3-step-decay'"),
 ])
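For reference, the schedule messages above can be sketched as a small Python function. This is a minimal illustration of the documented behavior only, not the project's actual implementation; the function name and 0-indexed epoch convention are assumptions:

```python
def lr_at_epoch(schedule, base_lr, epochs, epoch):
    """Learning rate at a given (0-indexed) epoch under the named schedule."""
    if schedule == "constant":
        return base_lr
    if schedule == "2-step-decay":
        boundaries = [epochs // 2, epochs - 1]
    elif schedule == "3-step-decay":
        boundaries = [epochs // 3, epochs * 2 // 3, epochs - 1]
    elif schedule == "3-step-decay-with-warmup":
        if epoch < 1:
            # warmup: learning rate 1/1000 during the first epoch
            return base_lr / 1000
        boundaries = [epochs // 3, epochs * 2 // 3, epochs - 1]
    else:
        raise ValueError("unknown schedule: %s" % schedule)
    # decrease by 1/10 at each boundary already passed
    return base_lr / (10 ** sum(epoch >= b for b in boundaries))
```

For example, with `2-step-decay`, 100 epochs, and an initial rate of 0.1, the rate drops to 0.01 at epoch 50 and to 0.001 at epoch 99.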
2 changes: 1 addition & 1 deletion blueoil/templates/blueoil-config.tpl.yml
@@ -15,7 +15,7 @@ trainer:
 # 'constant' -> constant learning rate.
 # '2-step-decay' -> learning rate decrease by 1/10 on {epochs}/2 and {epochs}-1.
 # '3-step-decay' -> learning rate decrease by 1/10 on {epochs}/3 and {epochs}*2/3 and {epochs}-1.
-# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train same as '3-step-decay'.
+# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train the same way as '3-step-decay'.
 learning_rate_schedule: {{ learning_rate_schedule }}
 initial_learning_rate: {{ initial_learning_rate_value }}
2 changes: 1 addition & 1 deletion docs/usage/init.md
@@ -22,7 +22,7 @@ This is an example of configuration.
 image size (integer x integer): 32x32
 how many epochs do you run training (integer): 100
 initial learning rate: 0.001
-choose learning rate schedule: '2-step-decay' -> learning rate reduce to 1/10 on epochs/2 and epochs-1.
+choose learning rate schedule ({epochs} is the number of training epochs you entered before): '2-step-decay' -> learning rate decrease by 1/10 on {epochs}/2 and {epochs}-1.
 enable data augmentation? No
 apply quantization at the first layer: yes
 ```
2 changes: 1 addition & 1 deletion tests/config/caltech101_classification.yml
@@ -15,7 +15,7 @@ trainer:
 # 'constant' -> constant learning rate.
 # '2-step-decay' -> learning rate decrease by 1/10 on {epochs}/2 and {epochs}-1.
 # '3-step-decay' -> learning rate decrease by 1/10 on {epochs}/3 and {epochs}*2/3 and {epochs}-1.
-# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train same as '3-step-decay'.
+# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train the same way as '3-step-decay'.
 learning_rate_schedule: constant
 initial_learning_rate: 0.001
2 changes: 1 addition & 1 deletion tests/config/caltech101_classification_has_validation.yml
@@ -15,7 +15,7 @@ trainer:
 # 'constant' -> constant learning rate.
 # '2-step-decay' -> learning rate decrease by 1/10 on {epochs}/2 and {epochs}-1.
 # '3-step-decay' -> learning rate decrease by 1/10 on {epochs}/3 and {epochs}*2/3 and {epochs}-1.
-# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train same as '3-step-decay'.
+# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train the same way as '3-step-decay'.
 learning_rate_schedule: constant
 initial_learning_rate: 0.001
2 changes: 1 addition & 1 deletion tests/config/delta_mark_classification.yml
@@ -15,7 +15,7 @@ trainer:
 # 'constant' -> constant learning rate.
 # '2-step-decay' -> learning rate decrease by 1/10 on {epochs}/2 and {epochs}-1.
 # '3-step-decay' -> learning rate decrease by 1/10 on {epochs}/3 and {epochs}*2/3 and {epochs}-1.
-# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train same as '3-step-decay'.
+# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train the same way as '3-step-decay'.
 learning_rate_schedule: constant
 initial_learning_rate: 0.001
2 changes: 1 addition & 1 deletion tests/config/delta_mark_classification_has_validation.yml
@@ -15,7 +15,7 @@ trainer:
 # 'constant' -> constant learning rate.
 # '2-step-decay' -> learning rate decrease by 1/10 on {epochs}/2 and {epochs}-1.
 # '3-step-decay' -> learning rate decrease by 1/10 on {epochs}/3 and {epochs}*2/3 and {epochs}-1.
-# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train same as '3-step-decay'.
+# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train the same way as '3-step-decay'.
 learning_rate_schedule: constant
 initial_learning_rate: 0.001
2 changes: 1 addition & 1 deletion tests/config/delta_mark_object_detection.yml
@@ -15,7 +15,7 @@ trainer:
 # 'constant' -> constant learning rate.
 # '2-step-decay' -> learning rate decrease by 1/10 on {epochs}/2 and {epochs}-1.
 # '3-step-decay' -> learning rate decrease by 1/10 on {epochs}/3 and {epochs}*2/3 and {epochs}-1.
-# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train same as '3-step-decay'.
+# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train the same way as '3-step-decay'.
 learning_rate_schedule: constant
 initial_learning_rate: 0.001
@@ -15,7 +15,7 @@ trainer:
 # 'constant' -> constant learning rate.
 # '2-step-decay' -> learning rate decrease by 1/10 on {epochs}/2 and {epochs}-1.
 # '3-step-decay' -> learning rate decrease by 1/10 on {epochs}/3 and {epochs}*2/3 and {epochs}-1.
-# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train same as '3-step-decay'.
+# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train the same way as '3-step-decay'.
 learning_rate_schedule: constant
 initial_learning_rate: 0.001
3 changes: 2 additions & 1 deletion tests/config/make_yml_config.py
@@ -119,7 +119,8 @@
 # 'constant' -> constant learning rate.
 # '2-step-decay' -> learning rate decrease by 1/10 on {epochs}/2 and {epochs}-1.
 # '3-step-decay' -> learning rate decrease by 1/10 on {epochs}/3 and {epochs}*2/3 and {epochs}-1.
-# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train same as '3-step-decay'.
+# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, \
+then train the same way as '3-step-decay'.
 """

 trainer_lr_schedules = [
2 changes: 1 addition & 1 deletion tests/config/openimagesv4_object_detection.yml
@@ -15,7 +15,7 @@ trainer:
 # 'constant' -> constant learning rate.
 # '2-step-decay' -> learning rate decrease by 1/10 on {epochs}/2 and {epochs}-1.
 # '3-step-decay' -> learning rate decrease by 1/10 on {epochs}/3 and {epochs}*2/3 and {epochs}-1.
-# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train same as '3-step-decay'.
+# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train the same way as '3-step-decay'.
 learning_rate_schedule: constant
 initial_learning_rate: 0.001
@@ -15,7 +15,7 @@ trainer:
 # 'constant' -> constant learning rate.
 # '2-step-decay' -> learning rate decrease by 1/10 on {epochs}/2 and {epochs}-1.
 # '3-step-decay' -> learning rate decrease by 1/10 on {epochs}/3 and {epochs}*2/3 and {epochs}-1.
-# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train same as '3-step-decay'.
+# '3-step-decay-with-warmup' -> warmup learning rate 1/1000 in first epoch, then train the same way as '3-step-decay'.
 learning_rate_schedule: constant
 initial_learning_rate: 0.001
