About
Finding the optimum of a black-box function is an essential technique in many technology fields, and many approaches have been researched for this purpose. To help choose the best one, I summarize the differences between these approaches in the table below, followed by minimal code sketches of each.
Comparison
| | TPE | Bayesian Optimization | Genetic Algorithms |
| --- | --- | --- | --- |
| Approach | Splits observed samples into two density distributions (good and bad) and proposes new samples based on their ratio. | Models the objective with a Gaussian process and explores the function shape using its uncertainty estimates. | Evolves a population of candidate solutions through selection, crossover, and mutation, seeking the optimum. |
| Cost | Relatively low (simple calculations for sample proposals). | High (requires kernel-function computation and matrix inversions). | Medium to high (many individuals over many generations, each requiring an evaluation). |
| Scalability | Adapts well to high-dimensional parameter spaces. | Scalability decreases as dimensionality and complexity increase. | Can handle high-dimensional problems if parameters are set appropriately. |
| Parameter Dependency | Few hyperparameters; relatively easy to configure. | Kernel selection and hyperparameter tuning are crucial. | Many parameters to configure, such as crossover rate, mutation rate, and selection method. |
| Ease of Use | User-friendly and implemented in many libraries. | Requires specialized knowledge for proper kernel selection. | Many parameters to set, which may require trial and error. |
| Application Range | Wide range of problems, but may converge to local optima. | Very precise optimization is possible, but practical applications may be limited by computational cost. | Broadly applicable to both continuous and discrete optimization problems. |
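As a concrete reference point, here is a minimal sketch of TPE-based optimization, assuming the Optuna library (whose default sampler implements TPE) is installed; the quadratic objective and the search ranges are illustrative placeholders for a real black-box function.

```python
import optuna

def objective(trial):
    # Placeholder black-box objective: a simple quadratic bowl.
    x = trial.suggest_float("x", -5.0, 5.0)
    y = trial.suggest_float("y", -5.0, 5.0)
    return (x - 1.0) ** 2 + (y + 2.0) ** 2

# TPE proposes new samples based on two densities fitted to good and bad trials.
study = optuna.create_study(direction="minimize",
                            sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=50)
print(study.best_params, study.best_value)
```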
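For comparison, a Gaussian-process-based Bayesian optimization run can be sketched with scikit-optimize, assuming that library is available; the same placeholder objective is reused so that the higher per-iteration cost (kernel computation and matrix inversion) is the only difference.

```python
from skopt import gp_minimize

def black_box(params):
    # Same placeholder quadratic objective, taking a flat parameter list.
    x, y = params
    return (x - 1.0) ** 2 + (y + 2.0) ** 2

# A Gaussian process surrogate models the function and its uncertainty,
# and an acquisition function decides where to evaluate next.
result = gp_minimize(black_box,
                     dimensions=[(-5.0, 5.0), (-5.0, 5.0)],
                     n_calls=30,
                     random_state=0)
print(result.x, result.fun)
```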
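Finally, a self-contained genetic algorithm sketch written from scratch, which illustrates how many parameters (population size, crossover rate, mutation rate, selection method) must be chosen by hand; all of the values below are arbitrary examples, not recommended settings.

```python
import random

def fitness(ind):
    # Placeholder objective: minimize the sphere function.
    return sum(g * g for g in ind)

def evolve(pop_size=30, dim=2, generations=100,
           crossover_rate=0.8, mutation_rate=0.1, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament selection: keep the fitter of two random individuals.
            a, b = random.sample(pop, 2)
            return a if fitness(a) < fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            # Uniform crossover with the configured probability.
            child = ([g1 if random.random() < 0.5 else g2 for g1, g2 in zip(p1, p2)]
                     if random.random() < crossover_rate else p1[:])
            # Gaussian mutation on each gene, clipped to the search bounds.
            child = [min(hi, max(lo, g + random.gauss(0.0, 0.3)))
                     if random.random() < mutation_rate else g
                     for g in child]
            children.append(child)
        pop = children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```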