Deep learning (DL) models have emerged as a promising solution for Internet of Things (IoT) applications. However, due to their computational complexity, DL models consume significant amounts of energy, which can rapidly drain the battery and compromise the performance of IoT devices. For sustainable operation, we consider an edge device with a rechargeable battery and energy harvesting (EH) capabilities. Beyond the stochastic nature of the ambient energy source, the harvesting rate is often insufficient to meet the inference energy requirements, leading to drastic performance degradation in energy-agnostic devices. To mitigate this problem, we propose energy-adaptive dynamic early exiting (EE) to enable efficient and accurate inference in an EH edge intelligence system. Our approach derives an energy-aware EE policy that determines the optimal amount of computational processing on a per-sample basis. The proposed policy matches energy consumption to the limited incoming energy and achieves continuous availability. Numerical results show that accuracy and service rate are improved by up to 25% and 35%, respectively, in comparison with an energy-agnostic policy.
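
To make the idea of an energy-aware EE policy concrete, the following is a minimal, illustrative sketch (not the paper's actual policy): a per-sample exit selector that spreads the stored and expected harvested energy over a planning horizon and picks the deepest exit whose cost fits that budget. The exit energy costs, accuracies, and function names are assumed for illustration; real values would be profiled on the target device.

```python
import numpy as np

# Hypothetical per-exit energy costs (in mJ) and expected accuracies for a
# 4-exit network; real values would come from device-level profiling.
EXIT_ENERGY_MJ = np.array([1.0, 2.2, 3.6, 5.0])
EXIT_ACCURACY = np.array([0.72, 0.81, 0.88, 0.92])

def select_exit(battery_mj, harvest_rate_mj, horizon):
    """Pick the deepest exit whose energy cost fits the per-sample budget.

    battery_mj      -- current battery charge
    harvest_rate_mj -- expected harvested energy per inference slot
    horizon         -- number of upcoming inference requests to budget for
    """
    # Spread stored plus expected harvested energy evenly over the horizon
    # so that consumption matches the limited incoming energy.
    budget = (battery_mj + harvest_rate_mj * horizon) / horizon
    feasible = np.where(EXIT_ENERGY_MJ <= budget)[0]
    if len(feasible) == 0:
        return None  # not enough energy for any exit; skip this sample
    # Among feasible exits, take the deepest (most accurate) one.
    return int(feasible[-1])

# Example: a low battery forces an early (cheap) exit, while a full battery
# allows running the network to its final exit.
print(select_exit(battery_mj=8.0, harvest_rate_mj=0.5, horizon=10))   # -> 0
print(select_exit(battery_mj=50.0, harvest_rate_mj=0.5, horizon=10))  # -> 3
```

In this sketch the per-sample budget is a simple average over the horizon; the policy described in the abstract instead optimizes the exit choice per sample under the stochastic harvesting process, which this toy rule does not capture.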