Abstract: This article presents a comprehensive overview of digital twin technology and its capability levels, with a specific focus on applications in the wind energy industry. It consolidates the definitions of the digital twin and its capability levels on a scale from 0 to 5: 0 (standalone), 1 (descriptive), 2 (diagnostic), 3 (predictive), 4 (prescriptive), and 5 (autonomous). From an industrial perspective, it then identifies the current state of the art and the research needs in the wind energy sector. The article proposes approaches to the identified challenges from the perspective of research institutes and offers a set of recommendations for diverse stakeholders to facilitate acceptance of the technology. The contribution of this article lies in its synthesis of the current state of knowledge and its identification of future research needs and challenges from an industry perspective, ultimately providing a roadmap for future research and development in digital twins and their applications in the wind energy industry.
Abstract: Upcoming technologies such as digital twins, autonomous systems, and artificial intelligence systems with safety-critical applications require models that are accurate, interpretable, computationally efficient, and generalizable. Unfortunately, the two most commonly used modeling approaches, physics-based modeling (PBM) and data-driven modeling (DDM), fail to satisfy all of these requirements. In this work, we demonstrate how a hybrid approach combining the best of PBM and DDM can yield models that outperform both. We do so by combining partial differential equations based on first principles, describing the partially known physics, with a black-box DDM, in this case a deep neural network, compensating for the unknown physics. We first present a mathematical argument for why this approach should work and then apply the hybrid approach to model a two-dimensional heat diffusion problem with an unknown source term. The results demonstrate the method's superior performance in terms of accuracy and generalizability. Additionally, we show how the DDM part can be interpreted within the hybrid framework to make the overall approach reliable.
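The hybrid idea described in this abstract can be sketched in a few lines: an explicit finite-difference solver for 2-D heat diffusion (the known physics) is augmented with a correction term for the unknown source. The sketch below is a minimal illustration, not the paper's code: a manufactured solution defines the "true" system, and the exact missing source stands in for a trained deep neural network, so all names, grid sizes, and parameters are illustrative assumptions.

```python
import numpy as np

# Minimal hybrid PBM+DDM sketch (illustrative, not the paper's code).
# True physics: dT/dt = alpha*(T_xx + T_yy) + g(x, y, t), where the
# source g is "unknown" to the physics-based model (PBM). The hybrid
# model adds a correction standing in for a trained deep neural network.

alpha, n, steps = 1.0, 21, 200
dx = 1.0 / (n - 1)
dt = 0.2 * dx**2 / alpha                 # within explicit-Euler stability limit
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")

def exact(t):
    # Manufactured solution: T = sin(pi x) sin(pi y) exp(-t)
    return np.sin(np.pi * X) * np.sin(np.pi * Y) * np.exp(-t)

def source(t):
    # Exact "unknown physics" implied by the manufactured solution:
    # g = T_t - alpha*(T_xx + T_yy) = (2*pi^2*alpha - 1) * T
    return (2.0 * np.pi**2 * alpha - 1.0) * exact(t)

def laplacian(T):
    L = np.zeros_like(T)
    L[1:-1, 1:-1] = (T[2:, 1:-1] + T[:-2, 1:-1] + T[1:-1, 2:]
                     + T[1:-1, :-2] - 4.0 * T[1:-1, 1:-1]) / dx**2
    return L

def simulate(correction):
    T, t = exact(0.0), 0.0
    for _ in range(steps):
        T = T + dt * (alpha * laplacian(T) + correction(t))
        T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 0.0   # Dirichlet BCs
        t += dt
    return T, t

T_pbm, t_end = simulate(lambda t: 0.0)   # PBM alone: ignores the unknown source
T_hyb, _ = simulate(source)              # hybrid: DDM stand-in supplies it
err_pbm = np.max(np.abs(T_pbm - exact(t_end)))
err_hyb = np.max(np.abs(T_hyb - exact(t_end)))
```

With a perfect correction the hybrid error reduces to the discretization error of the known-physics solver, while the pure PBM drifts badly because the missing source dominates the dynamics; replacing `source` with a network trained on observed residuals is the step the abstract describes.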
Abstract: Hybrid Analysis and Modeling (HAM) is an emerging modeling paradigm which aims to combine physics-based modeling (PBM) and data-driven modeling (DDM) to create generalizable, trustworthy, accurate, computationally efficient and self-evolving models. Here, we introduce, justify and demonstrate a novel approach to HAM -- the Corrective Source Term Approach (CoSTA) -- which augments the governing equation of a PBM model with a corrective source term generated by a deep neural network (DNN). In a series of numerical experiments on one-dimensional heat diffusion, CoSTA is generally found to outperform comparable DDM and PBM models in terms of accuracy -- often reducing predictive errors by several orders of magnitude -- while also generalizing better than pure DDM. Due to its flexible but solid theoretical foundation, CoSTA provides a modular framework for leveraging novel developments within both PBM and DDM, and due to the interpretability of the DNN-generated source term within the PBM paradigm, CoSTA can be a potential door-opener for data-driven techniques to enter high-stakes applications previously reserved for pure PBM.
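The corrective-source-term idea can be illustrated end to end, including the learning step, in a minimal 1-D sketch. Here a simple linear regression stands in for the DNN that generates the corrective term: it is "trained" on the residuals that the known physics fails to explain and then used to augment the governing equation during rollout. The problem setup, manufactured solution, and all names below are hypothetical illustrations, not the paper's code or experiments.

```python
import numpy as np

# CoSTA-style sketch (illustrative). Governing equation used by the PBM:
# dT/dt = alpha*T_xx. True system: dT/dt = alpha*T_xx + sigma, with sigma
# unknown. A cheap linear fit stands in for the DNN generating sigma_hat.

alpha, n = 1.0, 51
dx = 1.0 / (n - 1)
dt = 0.4 * dx**2 / alpha                 # explicit-Euler stable step
x = np.linspace(0.0, 1.0, n)

def exact(t):                            # manufactured truth: T = sin(pi x) e^{-t}
    return np.sin(np.pi * x) * np.exp(-t)

def lap(T):
    L = np.zeros_like(T)
    L[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    return L

# "Training": regress the residual of the known physics against the state.
feats, resid = [], []
for t in np.arange(0, 200) * dt:
    T0, T1 = exact(t), exact(t + dt)
    r = (T1 - T0) / dt - alpha * lap(T0)     # what the PBM fails to explain
    feats.append(T0[1:-1]); resid.append(r[1:-1])
w, b = np.polyfit(np.concatenate(feats), np.concatenate(resid), 1)

def rollout(corrected):
    T, t = exact(0.0), 0.0
    for _ in range(500):
        sigma_hat = (w * T + b) if corrected else 0.0
        T = T + dt * (alpha * lap(T) + sigma_hat)
        T[0] = T[-1] = 0.0                   # Dirichlet BCs
        t += dt
    return T, t

T_pbm, t_end = rollout(False)
T_costa, _ = rollout(True)
err_pbm = np.max(np.abs(T_pbm - exact(t_end)))
err_costa = np.max(np.abs(T_costa - exact(t_end)))
```

Because the learned correction is an explicit additive term in the governing equation, it can be inspected directly (here, `w` recovers the missing reaction coefficient), which is the interpretability property the abstract highlights; in CoSTA proper the linear fit is replaced by a DNN.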
Abstract: Most modeling approaches fall into one of two categories: physics-based or data-driven. Recently, a third approach, combining these deterministic and statistical models, has been emerging for scientific applications. To leverage these developments, this perspective paper explores a number of principal concepts to address the challenges of (i) trustworthiness and generalizability in developing data-driven models, shedding light on the fundamental trade-offs between their accuracy and efficiency, and (ii) the seamless integration of interface learning and multifidelity coupling approaches that transfer and represent information between different entities, particularly when different scales are governed by different physics, each operating on a different level of abstraction. Addressing these challenges could enable a revolution in digital twin technologies for scientific and engineering applications.
Abstract: Recent applications of machine learning, in particular deep learning, motivate the need to address the generalizability of statistical inference approaches in the physical sciences. In this letter, we introduce a modular physics-guided machine learning framework to improve the accuracy of such data-driven predictive engines. The chief idea of our approach is to augment the learning process with knowledge from simplified theories. To emphasize their physical importance, our architecture adds certain features at intermediate layers rather than at the input layer. To demonstrate our approach, we select a canonical airfoil aerodynamics problem enhanced with potential flow theory. We include features obtained by a panel method, which can be computed efficiently for an unseen configuration, in our training procedure. By addressing generalizability concerns, our results suggest that the proposed feature-enhancement approach can be effectively used in many scientific machine learning applications, especially for systems where a theoretical, empirical, or simplified model is available to guide the learning module.
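The architectural idea of injecting theory-based features at an intermediate layer, rather than appending them to the raw input, can be sketched as a plain forward pass. The layer sizes, feature names, and weights below are illustrative assumptions, not the letter's architecture; the physics features play the role of panel-method outputs.

```python
import numpy as np

# Sketch of physics-guided feature injection (illustrative, not the paper's
# architecture). Theory-based features are concatenated with a hidden layer
# instead of the input, so the network receives them as high-level physics
# context rather than raw measurements.

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def init(shape):
    return rng.normal(0.0, 0.1, size=shape)

n_in, n_phys, n_hid, n_out = 6, 2, 16, 1
W1, b1 = init((n_in, n_hid)), np.zeros(n_hid)
W2, b2 = init((n_hid + n_phys, n_hid)), np.zeros(n_hid)  # widened for injection
W3, b3 = init((n_hid, n_out)), np.zeros(n_out)

def forward(x_raw, x_phys):
    h1 = relu(x_raw @ W1 + b1)
    h1 = np.concatenate([h1, x_phys], axis=-1)   # inject physics features here
    h2 = relu(h1 @ W2 + b2)
    return h2 @ W3 + b3

x_raw = rng.normal(size=(4, n_in))       # e.g. flow conditions and geometry
x_phys = rng.normal(size=(4, n_phys))    # e.g. panel-method lift estimates
y = forward(x_raw, x_phys)
y_shifted = forward(x_raw, x_phys + 5.0)  # injected features steer the output
```

Because the injected features enter after the first nonlinearity, the network cannot simply treat them as more raw inputs; they condition the higher-level representation, which is the design choice the abstract motivates.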