This repository has been archived by the owner on Dec 1, 2021. It is now read-only.

[WIP] Add the config file of parameter tuner for object detection #275

Open · wants to merge 792 commits into base: master

Conversation

@nlpng (Contributor) commented on May 16, 2019

Motivation and Context

Previously, no config file was provided for object detection in the experimental parameter tuner. This PR adds one.

Description

  • This PR adds the config file for tuning the lm_yolo model in lmnet, named lm_fyolo_quantize_pascalvoc_2007_2012_tune.py.
  • Since training the model requires a 'warm-up' step, a soft_start parameter is provided to support it (see the sketch below).
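
For illustration, here is a minimal sketch of the kind of search space such a tune config could define, assuming the Ray Tune-based setup described in the commit list below. Only soft_start comes from this PR's description; the other key names, value ranges, and resource numbers are assumptions, not the actual contents of lm_fyolo_quantize_pascalvoc_2007_2012_tune.py.

```python
# Hedged sketch only: tune.grid_search / tune.sample_from are real Ray Tune
# calls, but the dictionary keys and value ranges are illustrative guesses,
# not the contents of lm_fyolo_quantize_pascalvoc_2007_2012_tune.py.
import numpy as np
from ray import tune

TUNE_SPEC = {
    "run": "tunable",
    "resources_per_trial": {"cpu": 4, "gpu": 1},
    "stop": {"training_iteration": 200},
    "config": {
        # learning rate sampled on a log scale for each trial
        "learning_rate": tune.sample_from(
            lambda spec: 10 ** np.random.uniform(-4, -2)),
        # weight decay rate made searchable
        "weight_decay_rate": tune.grid_search([1e-4, 5e-4]),
        # fraction of training used for the learning-rate 'warm-up'
        # (soft_start) that the detection model needs
        "soft_start": tune.grid_search([0.01, 0.05]),
    },
    "num_samples": 10,
}
```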

How has this been tested?

Screenshots (if appropriate):

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature / Optimization (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)

Checklist:

  • My change requires a change to the documentation.
  • I have updated the documentation accordingly.

nlpng and others added 30 commits January 29, 2019 17:03
xml parser use xml.etree.ElementTree instead of BeautifulSoup4
Added pattern matching and new optimization passes
Addressed PR comments on the Optimizer Selector
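
The first commit in the group above replaces BeautifulSoup4 with the standard-library xml.etree.ElementTree for reading Pascal VOC annotation XML. A minimal sketch of that kind of parser follows; the function name is illustrative, not the repository's actual code, and the tags assume the standard VOC annotation layout.

```python
import xml.etree.ElementTree as ET

def parse_voc_annotation(xml_path):
    """Return (class name, [xmin, ymin, xmax, ymax]) pairs from a VOC annotation file."""
    root = ET.parse(xml_path).getroot()
    objects = []
    for obj in root.findall("object"):
        name = obj.find("name").text
        bndbox = obj.find("bndbox")
        box = [int(float(bndbox.find(tag).text))
               for tag in ("xmin", "ymin", "xmax", "ymax")]
        objects.append((name, box))
    return objects
```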
nlpng and others added 21 commits April 26, 2019 17:23
* add new file for ray tune

* add config file for ray tune

* add optuna-based hp optimization contributed by @djKooks

* improve print for more trial detail

* casual clean-up on tune_optuna

* adapt more hyper-parameters for tuning

* add search for weight decay rate parameter

* make optimizer tunable

* reloading the config file every time new trial start

* clear kwargs before assigning new ones

* add perimagestandardization to tune config file

* add search option for lr scheduler of polynomial decay

* move tune_spec to config file

* remove redundant imports

* add dynamic allocation of resources

* fix incompatible problem of easydict

* make compatible with data iterator

* auto detect number of gpus to train

* test lmnet_v1_quantize_cifar10

* add options for saving directory

* saving directory defaults to ~/ray_results

* add config file for tuning segmentation

* add optional id for different tuning sessions

* oops, no weight decay parameter for lm_segnet

* remove redundant lines

* remove more redundant lines

* test various values in config file

* update package requirement for ray

* separate core scripts and config files

* remove un-used imports

* remove optuna implementation

* quick update comments

* update copyright year

* add some comments

* Rename trial to shorter string
* Add try catch to ensure queue import compatibility

* fix queue to Queue typo

* use from instead of direct import
* add build dependency to pep8-dlk

* fix to follow pep8
* create automated_testing dir on FPGA for dlk test

* change to use default python(2.7.12) on FPGA for dlk test

* remove Type Hints for python 2.7

* enable to set fpga host by environment variables

* add FPGA_HOST from environment variable

* add StrictHostKeyChecking option instead of ssh configuration via volume mount of /root/.ssh

* fix mount.err to mount.out

* remove comment 'only available on Jenkins'

* enable to connect to FPGA by user's ssh config and keys

* fix 'Bad owner or permissions on /root/.ssh/config'
* Update docs for CUDA10

* fix > to >=
* allow dataset prefetch only for training

* remove to-do flag

* add comments
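
Several commits above mention auto-detecting the number of GPUs and allocating resources per trial dynamically. A hedged sketch of how that can be done with Ray: ray.cluster_resources() and the tune.run keyword arguments are real API, but the variable names and the CPU/GPU split are assumptions, not the repository's implementation.

```python
import os
import ray
from ray import tune

ray.init()

# Ray reports the GPUs it detected on the node; fall back to CPU-only trials.
num_gpus = int(ray.cluster_resources().get("GPU", 0))
resources_per_trial = {"cpu": 2, "gpu": 1 if num_gpus > 0 else 0}

# Usage sketch: results land under ~/ray_results by default, matching the
# "saving directory defaults to ~/ray_results" commit above.
# tune.run(my_trainable,
#          resources_per_trial=resources_per_trial,
#          local_dir=os.path.expanduser("~/ray_results"))
```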
@iizukak (Member) commented on May 16, 2019

Super nice PR!!
We want hyperparameter search for object detection.

@CLAassistant commented on Jun 12, 2020

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution.
5 out of 13 committers have signed the CLA.

✅ primenumber
✅ kchygoe
✅ tvlenin
✅ iizukak
✅ odoku
❌ nlpng
❌ ruimashita
❌ tsawada
❌ hadusam
❌ lm-jira
❌ Joeper214
❌ wtnb93
❌ yd8534976
You have signed the CLA already but the status is still pending? Let us recheck it.


Labels: CI: auto-run (Run CI automatically)