Specializing Directed Acyclic Graph Federated Learning (SDAGFL) is a new federated learning framework that updates models from devices with similar data distributions through Directed Acyclic Graph Distributed Ledger Technology (DAG-DLT). SDAGFL offers personalization and resistance to single points of failure and poisoning attacks in fully decentralized federated learning. These advantages make SDAGFL well suited to federated learning in IoT scenarios, where devices are usually battery-powered. To promote the application of SDAGFL in IoT, we propose an energy-optimized SDAGFL based on an event-triggered communication mechanism, called ESDAGFL. In ESDAGFL, a new model is broadcast only when it has changed significantly. We evaluate ESDAGFL on a synthetically clustered FEMNIST dataset and a dataset of texts from the works of Shakespeare and Goethe. The experimental results show that our approach reduces energy consumption by 33\% compared with SDAGFL, while achieving the same balance between training accuracy and specialization as SDAGFL.
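As a minimal sketch of the event-triggered broadcast idea described above, the snippet below assumes a threshold \texttt{tau} on the parameter-space distance between the current local model and the last model the device broadcast; the names (\texttt{should\_broadcast}, \texttt{local\_round}, \texttt{tau}) and the distance measure are illustrative assumptions, not the paper's exact formulation.

\begin{verbatim}
# Hypothetical sketch of an event-triggered broadcast rule: a device
# publishes its locally updated model to the DAG only when it differs
# sufficiently from the last model it broadcast; otherwise it keeps
# training locally and saves the communication energy.
import numpy as np

def should_broadcast(new_params, last_broadcast_params, tau=0.1):
    """Return True if the model change exceeds the trigger threshold tau."""
    delta = np.linalg.norm(np.concatenate(
        [(n - o).ravel() for n, o in zip(new_params, last_broadcast_params)]))
    return delta > tau

def local_round(params, last_broadcast, train_step, tau=0.1):
    """One schematic device round: train locally, broadcast only on trigger."""
    new_params = train_step(params)                   # local update
    if should_broadcast(new_params, last_broadcast, tau):
        return new_params, new_params, True           # broadcast; reset reference
    return new_params, last_broadcast, False          # skip broadcast
\end{verbatim}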