Seminar | Energy Systems Division

Coarse-Grid Computational Fluid Dynamics Error Prediction using Machine Learning

ES Seminar

Abstract: Despite progress in high-performance computing, computational fluid dynamics (CFD) simulations remain computationally expensive for many practical engineering applications, such as large computational domains and highly turbulent flows. A major reason for this expense is the need for a fine grid to resolve phenomena at the relevant scales and obtain a grid-independent solution. Fine-grid requirements often drive the computational time step down, which makes long transient problems prohibitively expensive.

The research presented investigates the feasibility of a coarse-grid CFD (CG-CFD) approach augmented by machine learning. Relying on coarse grids increases the discretization error, so a novel method is proposed: a surrogate model predicts the CG-CFD local error, which is then used to correct the variables of interest. Given high-fidelity data, machine learning regression algorithms are used to train the surrogate to predict the local error as a function of the coarse-grid local fluid flow features.
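The abstract does not specify which regression algorithm or which coarse-grid features are used, so the following is a minimal, self-contained sketch of the error-correction idea, assuming scikit-learn's RandomForestRegressor and synthetic stand-ins for the coarse-grid features and the high-fidelity reference.

```python
# Minimal sketch of the CG-CFD error-correction workflow described above.
# Assumptions (not stated in the abstract): RandomForestRegressor as the
# regression algorithm, and synthetic data standing in for the coarse-grid
# features and the high-fidelity (fine-grid) solution.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-cell coarse-grid features (e.g. local velocity,
# gradients, cell size, position); columns here are purely illustrative.
n_cells, n_features = 10_000, 6
X = rng.normal(size=(n_cells, n_features))

# Coarse-grid solution and a fine-grid reference, faked so the example runs.
u_coarse = X[:, 0] + 0.1 * rng.normal(size=n_cells)
u_fine = X[:, 0] + 0.05 * np.sin(X[:, 1])

# Target: the local discretization error of the coarse solution.
error = u_fine - u_coarse

X_train, X_test, e_train, e_test, u_train, u_test, uf_train, uf_test = \
    train_test_split(X, error, u_coarse, u_fine, random_state=0)

# Train the surrogate that maps coarse-grid features to local error.
surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
surrogate.fit(X_train, e_train)

# Correct the coarse-grid variable of interest with the predicted error.
u_corrected = u_test + surrogate.predict(X_test)
print("RMSE before correction:", np.sqrt(np.mean((u_test - uf_test) ** 2)))
print("RMSE after correction: ", np.sqrt(np.mean((u_corrected - uf_test) ** 2)))
```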

The method is applied to three-dimensional flow in a lid-driven cavity. A set of test scenarios is studied to assess the surrogate model's ability to interpolate within and extrapolate beyond the training data range, covering a range of Reynolds numbers, grid sizes, and aspect ratios (geometries). The proposed method maximizes the benefit of the available data and shows potential for good predictive capability.
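One way such interpolation and extrapolation scenarios could be organized is sketched below; the Reynolds numbers and train/test splits are illustrative choices, not values from the presented study.

```python
# Illustrative partitioning of lid-driven-cavity cases by Reynolds number
# into interpolation and extrapolation tests for the error surrogate.
# The specific values are hypothetical, not taken from the seminar.

scenarios = {
    # Interpolation: the held-out Re lies inside the training range.
    "interpolation": {"train": [100, 400, 3200], "test": [1000]},
    # Extrapolation: the held-out Re lies outside the training range.
    "extrapolation": {"train": [100, 400, 1000], "test": [3200]},
}

for name, split in scenarios.items():
    print(f"{name}: train surrogate on Re={split['train']}, "
          f"evaluate error prediction on Re={split['test']}")
```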