
Tunny is Grasshopper's optimization component using Optuna, an open source hyperparameter auto-optimization framework.
Basic features are free to use without a license; advanced features require a license to enable.
This component supports the following optimization algorithms:
- Bayesian optimization (TPE)
- Bayesian optimization (GP)
- Genetic algorithm (NSGA-II)
- Genetic algorithm (NSGA-III)
- Evolution strategy (CMA-ES)
- Quasi-Monte Carlo
- Random
- Brute force
- Human-in-the-loop optimization
TPE, GP, NSGA-II, and NSGA-III also support multi-objective optimization with constraints.
It is inspired by components such as Galapagos, Opossum, and Wallacei, and can be used in a similar way.
For more information on how to use it, see the documentation.
The following is taken from the Optuna official website:
Optuna™, an open-source automatic hyperparameter optimization framework, automates the trial-and-error process of optimizing hyperparameters. It automatically finds optimal hyperparameter values based on an optimization target. Optuna is framework agnostic and can be used with most Python frameworks, including Chainer, Scikit-learn, PyTorch, etc.
Optuna is used in PFN projects with good results. One example is the second place award in the Google AI Open Images 2018 – Object Detection Track competition.
Optuna official site : https://optuna.org/
Install
- Go to the Purchase Page if you need a license.
- Download the Tunny yak file from the Release Page. There are several files available, but if you are not particular about the .NET version, use the following:
  - Rhino 7 users: net48
  - Rhino 8 users: net7.0
- Drag and drop the yak file into Rhino.
- Restart Rhino.
- That's it - enjoy optimizing with Grasshopper!
- License Type:
  - Grasshopper for Rhino 8 for Win
  - Grasshopper for Rhino 8 for Mac






