The increasing demand for sustainable energy results in more wind turbines being built offshore. The blades of wind turbines consist of composites, which makes them difficult to design: composites have a micro- and a macro-structure whose interplay determines the global behavior under load. Traditional methods of analysis are therefore often infeasible: full-scale experiments take too long and are too costly, small-scale experiments cannot capture the interaction between the micro- and macro-structure, and analytical theories are often only tractable for simple cases.
$FE^2$ is a numerical method that can analyze the behavior of multi-scale materials such as composites. It consists of a macro-scale finite element model in which the constitutive behavior at each integration point is obtained by homogenizing a representative volume element (RVE) that represents the microstructure. Although this method can accurately predict the behavior of composites, its computational cost is high, as a separate finite element model must be solved for each integration point.
This process can be sped up by replacing the RVE with a surrogate that is cheaper to evaluate. Recently, Gaussian Process Regression (GPR), a probabilistic machine learning model, has been used as a surrogate for the RVEs. It combines prior knowledge with observations of the RVE to make its predictions. Because it is based on Gaussian processes, its predictions follow a multivariate Gaussian distribution, which provides an uncertainty measure alongside the prediction itself. This uncertainty is useful for determining where new observations of the RVE should be collected.
Fully capturing the RVE requires many observations, which remains a computational bottleneck for the surrogate, as each observation is expensive to compute. The GPR model can be enhanced with observations from a low-fidelity model, for example a linear elastic one; this is called multi-fidelity GPR. The low-fidelity observations, which are inexpensive but inaccurate, enhance the prediction of the high-fidelity model, which uses observations that are expensive but accurate. This has been shown to decrease the number of high-fidelity observations needed, so fewer observations must be collected from the RVE. However, these models often assume that the correlation between the low and high fidelity is constant. This is problematic because, in surrogate modeling, the correlation is often non-linear. The literature investigates several extensions of multi-fidelity GPR that assume a non-linear correlation, but these models become more complex and lose their simplicity, as their predictions are no longer Gaussian.
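The constant-correlation structure can be sketched as an autoregressive model in the style of Kennedy and O'Hagan, $y_{hi}(x) \approx \rho\, y_{lo}(x) + \delta(x)$, with a scalar $\rho$ and a GP discrepancy $\delta$. The functions, sample sizes, and the least-squares estimate of $\rho$ below are toy assumptions for illustration, not the thesis's formulation.

```python
# Sketch of constant-correlation multi-fidelity GPR (toy stand-ins throughout):
# y_hi(x) ~ rho * y_lo(x) + delta(x), with a constant scalar rho.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f_lo(x):  # cheap, inaccurate model (e.g. a linear-elastic stand-in)
    return 0.5 * np.sin(2 * np.pi * x)

def f_hi(x):  # expensive, accurate model
    return np.sin(2 * np.pi * x) + 0.1 * x

X_lo = np.linspace(0.0, 1.0, 25).reshape(-1, 1)  # many cheap observations
X_hi = np.linspace(0.0, 1.0, 5).reshape(-1, 1)   # few expensive observations

gp_lo = GaussianProcessRegressor(RBF(0.2), alpha=1e-8).fit(X_lo, f_lo(X_lo).ravel())

# Constant correlation rho estimated by least squares at the high-fidelity points.
y_lo_at_hi = gp_lo.predict(X_hi)
y_hi = f_hi(X_hi).ravel()
rho = float(y_lo_at_hi @ y_hi / (y_lo_at_hi @ y_lo_at_hi))

# A second GP models the discrepancy delta(x) = y_hi(x) - rho * y_lo(x).
gp_delta = GaussianProcessRegressor(RBF(0.2), alpha=1e-8).fit(
    X_hi, y_hi - rho * y_lo_at_hi
)

X = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
y_pred = rho * gp_lo.predict(X) + gp_delta.predict(X)
```

Because $\rho$ is a single scalar for the whole domain, this structure is exactly where the constant-correlation assumption bites when the true low-to-high-fidelity relationship varies across the input space.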
Instead, this thesis keeps the constant-correlation assumption but improves the correlation inference by splitting the model. Splitting means that multiple independent multi-fidelity GPR models are used in different regions. Because splitting results in discontinuous predictions, the thesis also investigates two stitching methods to remove these discontinuities. The first, the constrained boundary (CB) method, constrains the predictive mean and predictive variance of two neighboring models to be equal at their shared boundary. These constraints are added to the optimization procedure for the hyperparameters of the local multi-fidelity GPR models. The second stitching method, the random variable mixture (RVM), uses a weighting procedure based on a mixture-of-experts approach. However, it mixes the random variables instead of the probability density functions, which yields a much simpler and analytically tractable method. For both stitching methods, only the high-fidelity model uses the stitching procedure, while the low-fidelity is split...
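The distinction between mixing random variables and mixing densities can be illustrated numerically. For independent Gaussians $Y_i \sim \mathcal{N}(\mu_i, \sigma_i^2)$ with weights $w_i$, the weighted sum $Y = \sum_i w_i Y_i$ is again Gaussian with mean $\sum_i w_i \mu_i$ and variance $\sum_i w_i^2 \sigma_i^2$, whereas a density mixture is generally non-Gaussian. The weights and Gaussians below are arbitrary illustrative values, not the thesis's actual RVM weighting scheme.

```python
# Contrast: mixing random variables vs mixing probability densities,
# for two local Gaussian predictions at one point (illustrative values).
import numpy as np

mu = np.array([1.0, 3.0])      # local predictive means
sigma = np.array([0.5, 0.2])   # local predictive standard deviations
w = np.array([0.7, 0.3])       # weights summing to one

# Mixing the random variables (RVM idea): Y = sum_i w_i Y_i stays Gaussian.
mix_mean = float(w @ mu)                        # 0.7*1.0 + 0.3*3.0 = 1.6
mix_var = float((w**2) @ (sigma**2))            # sum_i w_i^2 sigma_i^2

# Mixing the densities (classic mixture of experts) gives a non-Gaussian,
# possibly multimodal density with the same mean but a larger variance:
moe_mean = float(w @ mu)
moe_var = float(w @ (sigma**2 + mu**2) - moe_mean**2)
```

Since the random-variable mixture stays Gaussian, the stitched prediction keeps the closed-form mean and variance that make plain GPR attractive in the first place.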