Abstract: Alongside the continuous effort to improve AI performance through more sophisticated models, researchers have also turned their attention to the emerging concept of data-centric AI, which emphasizes the central role of data in a systematic machine learning training process. Nonetheless, model development has continued apace. One result of this progress is the Transformer architecture, which has demonstrated strong capability across multiple domains, including Natural Language Processing (NLP), Computer Vision (CV), and Time Series Forecasting (TSF). Its performance, however, depends heavily on input data preprocessing and output data evaluation, justifying a data-centric approach to future research. We argue that data-centric AI is essential for training AI models efficiently, particularly transformer-based TSF models. However, a gap remains in integrating transformer-based TSF with data-centric AI. This survey aims to close this gap through an extensive literature review organized around a proposed taxonomy. We review prior work from a data-centric AI perspective, intending to lay the groundwork for the future development of transformer-based architectures and data-centric AI.
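As an illustrative sketch of the kind of data-centric preprocessing such models depend on, the snippet below builds sliding input/output windows from a univariate series and standardizes each input window; the window lengths, the per-window normalization, and the NumPy-based implementation are assumptions for illustration, not a method taken from the surveyed work.

```python
import numpy as np

def make_windows(series, input_len=96, horizon=24):
    """Slice a univariate series into (input, target) windows for TSF.

    Each input window is standardized by its own mean/std (instance
    normalization), a common data-centric preprocessing step for
    transformer-based forecasters.
    """
    inputs, targets = [], []
    for start in range(len(series) - input_len - horizon + 1):
        x = series[start:start + input_len]
        y = series[start + input_len:start + input_len + horizon]
        mu, sigma = x.mean(), x.std() + 1e-8
        inputs.append((x - mu) / sigma)
        targets.append((y - mu) / sigma)
    return np.stack(inputs), np.stack(targets)

# Example: 1,000 noisy sine points -> windowed training pairs.
t = np.arange(1000)
series = np.sin(0.02 * t) + 0.1 * np.random.randn(1000)
X, Y = make_windows(series)
print(X.shape, Y.shape)  # (881, 96) and (881, 24)
```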
Abstract: This work aims at optimizing injection networks, which consist of a set of long-range links (called bypass links) added to mobile multi-hop ad hoc networks to improve connectivity and overcome network partitioning. To this end, we rely on small-world network properties, namely a high clustering coefficient and a low characteristic path length. We investigate the use of two genetic algorithms (generational and steady-state) to optimize three instances of this topology control problem and present results that provide initial evidence of their capacity to solve it.
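A minimal sketch of the two small-world metrics named above, computed here with networkx; the library choice and the Watts-Strogatz example graph are assumptions for illustration, not the paper's own network model.

```python
import networkx as nx

def small_world_metrics(graph):
    """Return (average clustering coefficient, characteristic path length).

    The characteristic path length is only defined on a connected graph,
    so it is measured on the largest connected component.
    """
    clustering = nx.average_clustering(graph)
    giant = graph.subgraph(max(nx.connected_components(graph), key=len))
    path_length = nx.average_shortest_path_length(giant)
    return clustering, path_length

# Example: a Watts-Strogatz small-world graph (high clustering, short
# paths) as a stand-in for an ad hoc network augmented with bypass links.
g = nx.watts_strogatz_graph(n=100, k=6, p=0.1)
c, l = small_world_metrics(g)
print(f"clustering={c:.3f}, characteristic path length={l:.3f}")
```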