Development of a Rapid Screening Approach to Estimate the Seismic Capacity of Typical Buildings in Groningen


Abstract

Gas extraction in the north-eastern Groningen province of the Netherlands has led to an increase in induced earthquakes, causing significant damage to homes and affecting residents' mental health. While reinforcement measures and compensation schemes are being implemented, progress has been slow. The structural analysis for reinforcing these buildings follows various methodologies in accordance with the Dutch standard NPR 9998, but these methods are time-consuming and repetitive. This study investigates the use of machine learning techniques to optimize and accelerate the structural analysis of unreinforced masonry (URM) residential buildings.

The traditional seismic analysis of URM buildings is computationally expensive, requiring extensive time for model development and assessment. To address this, a classification model and a surrogate model were developed using deep neural networks. The data for both models were obtained from the existing database of Royal HaskoningDHV, and both models indicate whether a building requires reinforcement measures. The classification model performs binary classification on a set of building features, categorizing structures into those requiring reinforcement and those that do not. It generates rapid, implementable outcomes, enabling efficient categorization of buildings and prompt decision-making, particularly in identifying and prioritizing high-risk structures that need more detailed analysis. Surrogate models are efficient approximations of complex analyses, capable of predicting reliable structural responses; here, a surrogate for the SLaMA analysis is built, serving as a proof of concept for developing surrogates of more complex and computationally demanding structural analyses.

The results show that, for the classification model, despite a sufficient number of data points, the correlation between the input features and the target variable remains weak, primarily due to changes in the NPR regulations over the years. These changes prevented a consistent assessment procedure: the same building might comply with a recent NPR but not with older versions. For the surrogate model, two input combinations were tested; the analysis provides insights into the influence of the input parameters, the need for dimensionality reduction, and the number of data points required for computationally expensive models. The input combinations also illustrate the curse of dimensionality: as more parameters were added, the model required significantly more data to maintain its performance. In addition, the relevance of the selected reduced input features was examined to assess how well they capture the key aspects of the original analysis at lower computational cost; the selected features must remain closely aligned with the original analysis to produce comparable results. In conclusion, while machine learning approaches offer significant potential for improving the efficiency of seismic analysis, further refinement and validation of the models are necessary to address their current limitations and enhance their applicability in real-world scenarios.
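For readers unfamiliar with the two model types described above, the sketch below illustrates how a binary "reinforcement needed?" classifier and a regression-type surrogate of a SLaMA response quantity might be set up as deep neural networks. It is a minimal sketch under assumptions not taken from the thesis: the feature set, layer sizes, placeholder data, and the Keras framework are all illustrative choices, not the implementation used in the study.

```python
# Illustrative sketch only: features, architectures, and framework (Keras)
# are assumptions, not the thesis' actual implementation.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_features = 8                                   # hypothetical building features
X = np.random.rand(1000, n_features)             # placeholder feature matrix
y_class = np.random.randint(0, 2, (1000, 1))     # 1 = reinforcement required
y_resp = np.random.rand(1000, 1)                 # placeholder structural response

# Binary classification model: does the building need reinforcement?
clf = keras.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),       # probability of "needs reinforcement"
])
clf.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
clf.fit(X, y_class, epochs=20, batch_size=32, validation_split=0.2, verbose=0)

# Surrogate model: approximate a SLaMA-style response quantity by regression.
surrogate = keras.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(1),                             # predicted response (regression output)
])
surrogate.compile(optimizer="adam", loss="mse", metrics=["mae"])
surrogate.fit(X, y_resp, epochs=20, batch_size=32, validation_split=0.2, verbose=0)
```

As the abstract notes, enlarging the input vector of such a surrogate would demand considerably more training data to maintain comparable performance, which is the curse-of-dimensionality effect observed in the experiments.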

Files

Rithu_Maria_Report.pdf
(pdf | 5.28 Mb)
Unknown license