Model-based Design Optimization Taking into Account Design Viability via Classification


M. Niehoff
D. Bestle
P. Kupijai


Design optimization of real-world industrial products is usually a challenging, high-dimensional task with several multi-modal objectives. The solution therefore has to be found by global optimization algorithms, which require fast surrogate models to enable a large number of design evaluations. However, approximating the original optimization criteria with surrogates may mislead the optimization by offering solutions across the entire design domain, even where designs are not viable in reality. A classification model should therefore be used as an additional optimization constraint to guide the optimizer toward viable results.
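As a minimal illustration of the idea described above (not the authors' implementation), the sketch below trains a simple nearest-neighbour viability classifier on labelled sample designs and uses it as a constraint that penalizes a surrogate objective outside the viable region. All function names, the toy objective, and the sample data are hypothetical.

```python
import math

# Toy viability data: designs labelled viable (1) or non-viable (0).
# In practice such labels come from successful vs. failed simulations.
samples = [((0.2, 0.3), 1), ((0.8, 0.9), 0), ((0.4, 0.1), 1), ((0.9, 0.2), 0)]

def predict_viable(x):
    """1-nearest-neighbour classifier: does design x fall in the viable class?"""
    nearest = min(samples, key=lambda s: math.dist(x, s[0]))
    return nearest[1] == 1

def surrogate_objective(x):
    """Hypothetical fast surrogate of an expensive simulation objective."""
    return (x[0] - 0.3) ** 2 + (x[1] - 0.2) ** 2

def constrained_objective(x):
    """Classifier acts as a constraint: heavy penalty outside the viable region,
    steering a global optimizer away from non-viable designs."""
    return surrogate_objective(x) + (0.0 if predict_viable(x) else 1e6)

print(constrained_objective((0.3, 0.2)))   # viable design near the optimum: small value
print(constrained_objective((0.85, 0.5)))  # non-viable design: heavily penalized
```

In a full setup, the penalized objective (or the classifier as an explicit constraint function) would be handed to a global multi-objective optimizer such as NSGA-II, so that only designs the classifier deems viable survive selection.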

Article Details

How to Cite
Model-based Design Optimization Taking into Account Design Viability via Classification. (2023). Engineering Modelling, Analysis and Simulation, 1.


