Author:
Allie Yuxin Lin
Marketing Writer
Allow me to paint a picture for you. You’re an auto manufacturer, and you realize that the increased demand for fuel efficiency is pushing the industry toward new engine designs that can reduce fuel consumption while abiding by stricter governmental regulations on emissions. To accommodate this, you follow the industry standard and rely on both experimental prototyping and numerical modeling. As you learn more about numerical simulation, you see that there are two approaches you could take, so you start exploring them in depth. The design of experiments (DoE) technique explores the design space through many simulations and creates a response surface to optimize outcomes. This approach allows you to run many concurrent simulations to achieve quick design times. However, traditional linear regression-based response surface methods (RSMs) are unable to capture the complex, non-linear interactions in engine combustion. The second option involves the application of genetic algorithms (GAs), which optimize designs through multiple simulations over many generations.[1] Your research shows that the GA method is very effective at finding optimal designs, but it typically requires many generations to converge, leading to an extended design turnaround of up to several months.
Now you’re facing a difficult predicament. You have two options in front of you: one that will solve the problem within a reasonable timeframe but might miss the optimal solution, and another that is robust but computationally costly.
Enter machine learning (ML) optimization. Offering rapid project turnaround, cost-efficiency, and knowledge of the full design space, ML optimization is a game-changer in the field.[2] Trained on DoE data, the ML tool has access to a wealth of information across the entire design space that would not be obtained through traditional sequential optimization methods. With a sufficiently complex ML model, you can capture the non-linear relationships that a DoE alone cannot, while keeping the optimization turnaround time low.
In previous versions of our software, optimization could be accomplished through our in-house CONVERGE Genetic Optimization (CONGO) utility, which enables you to run a GA optimization or a DoE interrogation study. A GA takes a survival-of-the-fittest approach to optimizing a design: populations of randomly generated input parameters are evolved over successive generations toward the highest user-defined merit.
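To make the survival-of-the-fittest idea concrete, here is a minimal, generic GA sketch (not CONGO itself, whose implementation is proprietary): a population of random parameter sets is repeatedly ranked by a user-defined merit function, and the fittest half produces the next generation through crossover and mutation. The merit function shown is a toy example with a known peak.

```python
import random

def genetic_optimize(merit, bounds, pop_size=20, generations=50,
                     mutation_rate=0.1, seed=0):
    """Maximize a user-defined merit function over box-bounded parameters."""
    rng = random.Random(seed)
    # Randomly generate the initial population of parameter sets
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        # Rank individuals by merit; the fittest half survives as parents
        pop.sort(key=merit, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            # Crossover: each parameter is inherited from one of two parents
            child = [rng.choice(pair) for pair in zip(a, b)]
            # Mutation: occasionally perturb a parameter within its bounds
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mutation_rate:
                    child[i] = rng.uniform(lo, hi)
            children.append(child)
        pop = parents + children
    return max(pop, key=merit)

# Toy merit function with its peak at (1.0, -2.0)
best = genetic_optimize(lambda x: -(x[0] - 1.0)**2 - (x[1] + 2.0)**2,
                        bounds=[(-5, 5), (-5, 5)])
```

Note that each call to `merit` here stands in for an entire CFD simulation, which is exactly why a GA that needs many generations to converge becomes so expensive.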
In late 2024, we released an ML tool in CONVERGE Studio that enables rapid optimization. First, you will identify the parameters that you want to vary during your optimization study (e.g., injection pressure, EGR ratio), and define the performance metrics you will use to assess the merit of your simulation results (e.g., minimum fuel consumption, minimum NOx emissions). The tool will then initialize a DoE by systematically generating a set of input variables for CONVERGE simulations that span your design space. A Latin hypercube sampling approach can be used to maximize the minimum distance between DoE sample points, producing a quasi-random sample that better captures the underlying data distribution compared to a random sample. After generating input files for the DoE, CONVERGE users can run their cases concurrently on CONVERGE Horizon, our cloud computing service that provides affordable, on-demand access to the latest high-performance computing (HPC) technologies.
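The Latin hypercube sampling step described above can be sketched with SciPy's quasi-Monte Carlo module. The two design parameters and their bounds below are illustrative assumptions chosen to match the examples in the text (injection pressure, EGR ratio), not values from an actual CONVERGE setup.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical design parameters: injection pressure [bar] and EGR ratio [-]
lower = [500.0, 0.0]
upper = [2500.0, 0.4]

# Latin hypercube sampler over a 2D unit cube (scrambled by default)
sampler = qmc.LatinHypercube(d=2, seed=42)
unit_samples = sampler.random(n=64)                 # 64 points in [0, 1)^2
doe_points = qmc.scale(unit_samples, lower, upper)  # rescale to physical bounds

# Each row of doe_points defines one CONVERGE case. Each parameter is
# stratified: exactly one sample falls in each of the 64 equal-width bins,
# so the design space is covered far more evenly than with random sampling.
```

Compared with purely random sampling, this stratification is what guarantees that no region of the design space is left unexplored by the DoE.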
The results from the DoE can now serve as the training data for the ML model. Since the most appropriate ML algorithm for a particular set of data cannot be determined a priori, the ML tool will combine several different algorithms through ensemble learning: ridge regression, random forest, gradient boosting, support vector machine, and neural network. This ML meta learning model will identify the combination of the five algorithms that best emulates the CFD setup. You can then use the trained ML meta model to predict the optimal case, evaluated with your predefined performance metrics. Finally, you can run the predicted best case in CONVERGE to confirm the results.
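CONVERGE's meta learning model is proprietary, but the general idea of combining those five algorithm families can be sketched with scikit-learn's stacking ensemble, where a final linear layer learns how to weight each base learner's predictions. The training data below is a synthetic stand-in for DoE results, not output from an actual CFD study.

```python
import numpy as np
from sklearn.ensemble import (GradientBoostingRegressor, RandomForestRegressor,
                              StackingRegressor)
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic stand-in for DoE results: design inputs X -> merit value y
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(256, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.normal(size=256)

# Stack the five base learners; a final Ridge layer learns how to weight them
meta_model = StackingRegressor(
    estimators=[
        ("ridge", Ridge()),
        ("forest", RandomForestRegressor(n_estimators=100, random_state=0)),
        ("boost", GradientBoostingRegressor(random_state=0)),
        ("svm", make_pipeline(StandardScaler(), SVR())),
        ("nnet", make_pipeline(StandardScaler(),
                               MLPRegressor(hidden_layer_sizes=(32, 32),
                                            max_iter=2000, random_state=0))),
    ],
    final_estimator=Ridge(),
)
meta_model.fit(X, y)
score = meta_model.score(X, y)  # R^2 of the ensemble on the training data
```

Because no single algorithm is best for every dataset, the stacking layer effectively performs the "which combination best emulates the CFD setup" selection automatically.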
The ML tool offers a streamlined process for rapid and accurate optimization. The goal is not to replace CFD with ML, but rather to use ML in conjunction with CFD to enable fast, optimization-based design. A simplified schematic of the process can be seen in Figure 1.

While CONVERGE’s ML tool can be called within a user-defined function (UDF) for other purposes, such as reduced-order modeling, the approach is primarily targeted at design optimization. Its flexibility and ease of use enable the tool to process copious amounts of data, uncover nuanced patterns, and provide actionable insights.
To increase the efficiency of internal combustion engines, we partnered with Polaris and Oracle Cloud in 2021 to combine ML, CFD, and HPC for an exhaust port optimization study.
After identifying five exhaust port parameters to vary and parametrizing the geometry, the team used Latin hypercube sampling to set up a DoE study with 256 cases. The cases were run on CONVERGE Horizon in less than a day. We then split the data generated by the DoE study, using 90% of it to train an ML emulator and holding out the remaining 10% to test it. This train-test split ensures the ML emulator can genuinely predict new designs, rather than simply regurgitating the data from the DoE. Having confirmed the efficacy of the ML emulator, the team then used the trained emulator to predict the optimal case that minimized the exhaust port pumping work. The optimization study produced a small yet significant improvement in exhaust port efficiency. A traditional experimental optimization would have been far more expensive and taken significantly more time; thanks to the use of ML and HPC, this study was completed in a few days rather than several months. For more information, read our blog, which goes into detail about the design, methodology, results, and future outlooks of this study.
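The 90/10 train-test workflow described above can be sketched as follows. The five input columns and the target stand in for the exhaust port parameters and pumping work; the data here is synthetic, and the gradient boosting emulator is one plausible choice, not the specific model used in the study.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the 256-case DoE:
# five exhaust port parameters -> pumping work
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(256, 5))
y = X @ np.array([0.5, -1.2, 0.8, 0.3, -0.4]) + 0.02 * rng.normal(size=256)

# Hold out 10% of the DoE cases so the emulator is scored
# only on designs it has never seen
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.1, random_state=0)

emulator = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
test_r2 = r2_score(y_test, emulator.predict(X_test))  # held-out accuracy
```

A high score on the held-out 10% is what separates genuine prediction from memorization: the emulator never saw those cases during training.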
Harnessing wind energy is a cornerstone of the global agenda toward sustainability, since it provides a renewable power source with minimal environmental impact. Advancements in wind turbine technology enable the establishment of wind farms, which can generate significantly more power than a single turbine.
Wind farm layout can influence overall energy output, operational efficiency, and total project costs. In a poorly laid out wind farm, wake effects generated by upwind turbines may decrease the performance of downwind turbines. In such scenarios, ML can help optimize wind farm layout by accurately predicting turbine interactions to ensure each turbine receives optimal wind flow.
For a wind farm of 25 NREL 5MW wind turbines with constant wind speed and neutral atmospheric conditions, CONVERGE’s ML tool optimized the layout of the center five turbines for maximum power production. A DoE study produced the data to train the ensemble ML model, which was used to predict the optimal layout. The ML model, which was fully trained in 1 minute, returned four optima, which were run in CONVERGE to confirm the configuration that produced the most power. Figure 2 shows the optimized wind farm layout, in which the turbines in the center row are staggered.
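Once an emulator is trained, each evaluation costs microseconds instead of hours of CFD, so a global optimizer can sweep the whole layout space to find predicted optima. The sketch below illustrates this final step with synthetic data and a hypothetical two-variable layout (the x-y offset of a single turbine); the random forest and differential evolution choices are illustrative, not CONVERGE's internal method.

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for DoE results:
# hypothetical (x, y) offset of one turbine -> total farm power
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(200, 2))
power = -(X[:, 0] - 0.3) ** 2 - (X[:, 1] + 0.5) ** 2  # peak near (0.3, -0.5)

# Train a cheap emulator on the DoE data
emulator = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, power)

# The emulator is cheap enough for a global search over the layout space;
# minimize the negative predicted power to find the best candidate offset
result = differential_evolution(
    lambda pos: -emulator.predict(pos.reshape(1, -1))[0],
    bounds=[(-1, 1), (-1, 1)], maxiter=100, seed=0)
best_layout = result.x  # candidate layout to confirm with a full CONVERGE run
```

As in the wind farm study, the predicted optimum is not taken on faith: the candidate layout is run through CONVERGE to confirm the power gain.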

Having concluded your research, you breathe a sigh of relief. CONVERGE’s ML tool has the potential to not only transform the engine industry, but also impart important insights in applications such as wind farm layout and reduced-order modeling. By training the model with DoE data, you have access to the entire design space and can uncover hidden patterns that were previously out of reach. With the speed and flexibility of CONVERGE’s ML tool, you no longer have to choose between quick results and accuracy—you can have both.
[1] Pei, Y., Pal, P., Zhang, Y., Traver, M., Cleary, D., Futterer, C., Brenner, M., Probst, D., and Som, S., “CFD-Guided Combustion System Optimization of a Gasoline Range Fuel in a Heavy-Duty Compression Ignition Engine Using Automatic Piston Geometry Generation and a Supercomputer,” SAE Technical Paper 2019-01-0001, 2019, doi:10.4271/2019-01-0001.
[2] Moiz, A.A., Pal, P., Probst, D., Pei, Y., Zhang, Y., Som, S., and Kodavasal, J., “A Machine Learning-Genetic Algorithm (ML-GA) Approach for Rapid Optimization Using High-Performance Computing,” SAE Technical Paper 2018-01-0190, 2018, doi:10.4271/2018-01-0190.