
Conversation

BernhardAhrens
Collaborator

Trial using Hyperopt.jl; it seems to work.
My implementation is still rough around the edges, especially the Mspec struct.

@lazarusA
Member

On top of this, maybe we should do a version of train without any of the logging/dashboard logic that returns nothing (or only the final accuracy). All the pieces are already in place. After that, running a pmap with Distributed.jl could speed things up and let us run the grid search on the cluster; see the sketch below. I will take a look at this tomorrow.
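
A rough sketch of what that could look like, assuming a hypothetical `train_quiet` function that skips all logging/dashboard logic and returns only the final accuracy (the function body and the grid values are placeholders, not part of this PR):

```julia
using Distributed
addprocs(4)  # local workers; ClusterManagers.jl can add workers on other nodes of the cluster

@everywhere function train_quiet(; lr, batchsize)
    # Hypothetical stand-in for the logging-free train function discussed above.
    # The real training loop would go here and return only the final accuracy;
    # this dummy score just makes the sketch runnable.
    return -abs(log10(lr) + 3) - abs(batchsize - 64) / 100
end

# Placeholder grid of hyperparameter combinations; pmap farms each one out to a worker.
grid = vec([(lr = lr, batchsize = bs) for lr in (1e-4, 1e-3, 1e-2), bs in (32, 64, 128)])
results = pmap(p -> (p, train_quiet(; p...)), grid)

best_params, best_acc = argmax(last, results)
```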

BernhardAhrens linked an issue on Aug 17, 2025 that may be closed by this pull request
@BernhardAhrens
Collaborator Author

I managed to run trials in parallel on threads with the @thyperopt macro that ships with the package. The package also has a @phyperopt macro for using separate workers, but I haven't gotten that to work yet.

Maybe I have not fully understood threads vs. workers: are threads simply different CPU cores on one node, while workers can potentially live on different nodes?
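
For reference, a minimal sketch of a threaded run with Hyperopt.jl's @thyperopt macro; `train_quiet` is again a hypothetical logging-free training function and the search ranges are made up for illustration:

```julia
using Hyperopt

# Hypothetical train function that only returns a validation accuracy;
# the dummy score below just makes the sketch runnable.
train_quiet(; lr, batchsize) = -abs(log10(lr) + 3) - abs(batchsize - 64) / 100

# @thyperopt spreads the 30 trials over Threads.nthreads() threads
# (start Julia with e.g. `julia -t 8`). Hyperopt.jl minimizes the value
# returned by the loop body, so we return the negative accuracy.
ho = @thyperopt for i = 30,
        sampler = RandomSampler(),
        lr = exp10.(LinRange(-5, -1, 100)),
        batchsize = [32, 64, 128]
    -train_quiet(lr = lr, batchsize = batchsize)
end

ho.minimizer, ho.minimum  # best hyperparameter combination and its objective value
```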

BernhardAhrens changed the title from "[WIP] hyperparameter tuning with Hyperopt.jl" to "Hyperparameter tuning with Hyperopt.jl" on Aug 30, 2025
BernhardAhrens merged commit fc329c8 into main on Aug 30, 2025
4 checks passed
BernhardAhrens deleted the ba/hyper branch on August 30, 2025 08:11
Development

Successfully merging this pull request may close these issues:
train, validation, test split and hyperparameter tuning