Abstract: Multi-tasking machine learning (ML) models exhibit prediction abilities in domains with little to no available training data (few-shot and zero-shot learning). Over-parameterized ML models are further capable of zero-loss training and near-optimal generalization performance. An open research question is how these novel paradigms can contribute to solving tasks related to enhancing the renewable energy transition and mitigating climate change. A collection of unified ML tasks and datasets from this domain could greatly facilitate the development and empirical testing of such models, but is currently missing. Here, we introduce ETT-17 (Energy Transition Tasks-17), a collection of 17 datasets from six application domains related to enhancing renewable energy, including out-of-distribution validation and testing data. We unify all tasks and datasets so that they can be solved with a single multi-tasking ML model. We further analyse the dimensions of each dataset; investigate what each requires for designing over-parameterized models; introduce a set of dataset scores that describe important properties of each task and dataset; and provide performance benchmarks.
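The unification of heterogeneous tasks into a common format could be represented, for instance, as a single dataset record type shared by all 17 tasks. The sketch below is purely illustrative: the field names and the `dataclass` layout are assumptions, not the actual ETT-17 schema.

```python
from dataclasses import dataclass
import numpy as np


@dataclass
class UnifiedTask:
    """Hypothetical unified record for one task; field names are
    illustrative and do not reflect the actual ETT-17 schema."""
    name: str                 # task identifier, e.g. "load_forecasting"
    domain: str               # one of the six application domains
    X_train: np.ndarray       # training features
    y_train: np.ndarray       # training targets
    X_val_ood: np.ndarray     # out-of-distribution validation features
    y_val_ood: np.ndarray     # out-of-distribution validation targets
    X_test_ood: np.ndarray    # out-of-distribution test features
    y_test_ood: np.ndarray    # out-of-distribution test targets

    def dimensions(self) -> dict:
        # Summarize the dataset dimensions, e.g. for sizing an
        # over-parameterized model against the number of training points.
        return {
            "n_train": len(self.X_train),
            "n_features": self.X_train.shape[1],
            "n_targets": self.y_train.shape[1] if self.y_train.ndim > 1 else 1,
        }
```

A single multi-tasking model could then iterate over a list of such records, since every task exposes the same splits and the same summary interface.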
Abstract: Active learning (AL) could contribute to solving critical environmental problems through improved spatio-temporal predictions. Yet such predictions involve high-dimensional feature spaces with mixed data types and missing data, which existing AL methods have difficulty handling. Here, we propose a novel batch AL method that fills this gap. We encode and cluster the features of candidate data points, and query the best data points based on the distance of their embedded features to the cluster centers. We introduce a new metric of informativeness, which we call embedding entropy, and a general class of neural networks for using it, which we call embedding networks. Empirical tests on forecasting electricity demand show a simultaneous reduction in prediction error of up to 63-88% and in data usage of up to 50-69% compared with passive learning (PL) benchmarks.
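The query step described above (cluster candidate embeddings, then rank candidates by their distance to the assigned cluster center) can be sketched as follows. This is a minimal illustration under stated assumptions: it uses a plain k-means clustering and selects the points farthest from their centers as the batch; the paper's actual embedding networks, embedding-entropy metric, and selection rule are not reproduced here.

```python
import numpy as np


def kmeans(X, k, n_iter=50, seed=0):
    """Simple Lloyd's k-means; stands in for any clustering of embeddings."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(n_iter):
        # Assign each point to its nearest center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centers, keeping empty clusters unchanged.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels


def query_batch(embeddings, k, batch_size):
    """Return indices of the batch_size candidates whose embedded features
    lie farthest from their cluster centers (assumed most informative)."""
    centers, labels = kmeans(embeddings, k)
    dist_to_center = np.linalg.norm(embeddings - centers[labels], axis=1)
    return np.argsort(dist_to_center)[::-1][:batch_size]
```

In a full AL loop, `embeddings` would come from the encoder of an embedding network applied to the candidate pool, and the queried points would be labeled and added to the training set before the model is retrained.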