Hyperparameter optimization with Optuna #24
Merged
Commits on Sep 11, 2024
- 27041fb: Main file for hyperparameter tuning.
  1. Running with a .yaml config file leads to the usual execution.
  2. Running with the --optimized flag tunes hyperparameters with Optuna; no input file is needed.
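The two execution paths above can be sketched with a small dispatch function. The --optimized flag name comes from the commit message; the positional `config` argument name and the return values are assumptions for illustration.

```python
import argparse

def dispatch(argv):
    # Sketch of the main file's dispatch: a YAML config drives the usual
    # execution, while --optimized switches to Optuna tuning (no input file).
    parser = argparse.ArgumentParser(description="FastVPINNs entry point (sketch)")
    parser.add_argument("config", nargs="?", default=None,
                        help="path to a .yaml config file (usual execution)")
    parser.add_argument("--optimized", action="store_true",
                        help="tune hyperparameters with Optuna; no input file needed")
    args = parser.parse_args(argv)
    if args.optimized:
        return "tune"   # hand off to the Optuna tuner
    if args.config is None:
        raise SystemExit("a .yaml config file is required without --optimized")
    return "train"      # usual execution driven by the YAML config
```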
- 65637b7: Module for hyperparameter tuning.
  1. objective.py defines the objective function for tuning; it contains the FastVPINNs object that returns the metric being tuned.
  2. optuna_tuner.py manages the hyperparameter tuning process.
- 9ac45cc
- 5aff902
Commits on Sep 18, 2024
- 9d1618b
- dc1a7f9: Changes to the main file to incorporate hyperparameter tuning using Optuna.
  1. Accept the number of trials and the number of training iterations per trial as arguments.
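The two tuning controls described above could be exposed as command-line arguments along these lines. The flag names `--n-trials` and `--n-iterations` and their defaults are assumptions, not the actual CLI.

```python
import argparse

def parse_tuning_args(argv):
    # Sketch of the tuning controls added to the main file: how many Optuna
    # trials to run, and how long to train within each trial.
    parser = argparse.ArgumentParser(description="Optuna tuning controls (sketch)")
    parser.add_argument("--n-trials", type=int, default=50,
                        help="number of Optuna trials")
    parser.add_argument("--n-iterations", type=int, default=1000,
                        help="training iterations per trial")
    return parser.parse_args(argv)
```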
- 3dd7f00:
  1. Accept an is_optimized argument, True when hyperparameter optimization with Optuna is in use.
  2. If is_optimized is True, the geometry module does not write out the test mesh and VTK file for each trial.
  3. Backward compatibility: the default value of is_optimized is False, so existing code with a config file works as is.
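The backward-compatible signature described in this commit can be sketched as follows; the function name and return value are illustrative, not the actual geometry-module API.

```python
def write_test_mesh_vtk(output_path="test_mesh.vtk", is_optimized=False):
    """Write the test mesh / VTK output, skipped during Optuna trials."""
    if is_optimized:
        # Tuning run: avoid writing a mesh and a VTK file for every trial.
        return None
    # Usual execution: existing callers omit is_optimized and still
    # get the file written, since the default is False.
    return output_path
```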
- 6e05813: Parallel runs with the Optuna tuner.
  1. Creates an SQLite database if one does not exist; it can be used to resume stalled runs or for the parallel implementation.
  2. Lists the number of available GPUs and divides jobs among them.
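The parallel-run bookkeeping above can be sketched with two small helpers: a shared SQLite storage URL that several worker processes (or a resumed run) can attach to, and an even split of trials across the visible GPUs. The database filename and function names are illustrative.

```python
def storage_url(db_path="optuna_study.db"):
    # Optuna's RDB storage creates the SQLite file on first use if it does
    # not already exist, which is what also allows resuming stalled runs.
    return f"sqlite:///{db_path}"

def split_trials(n_trials, n_gpus):
    """Divide n_trials as evenly as possible across n_gpus workers."""
    base, rem = divmod(n_trials, n_gpus)
    return [base + (1 if g < rem else 0) for g in range(n_gpus)]
```

Each worker would then load the shared study via this storage URL and run its share of trials, typically pinned to one GPU (e.g. through CUDA_VISIBLE_DEVICES).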
- 3b63a4c
- 7aa1711
- d09e32f