Abstract: A number of electric devices in buildings can be considered important demand response (DR) resources, for instance, the battery energy storage system (BESS) and the heating, ventilation, and air conditioning (HVAC) system. Conventional model-based DR methods rely on efficient on-demand computing resources. However, current buildings suffer from the high cost of computing resources and lack a cost-effective automation system, which has become the main obstacle to the popularization and implementation of DR programs. Therefore, in this paper, we present a hybrid cloud and edge control strategy for BESS and HVAC based on deep reinforcement learning (DRL). On the cloud infrastructure, the agent learns the control strategy online based on the proposed continuous dueling deep Q-learning (C-DDQN) algorithm, and the learned strategy is distributed to the edge devices for execution. Under this framework, the data-intensive application of cloud computing in real-time DR offers advantages in processing speed, unlimited data aggregation, fault tolerance, cost saving, security, and confidentiality. However, if every controller were trained from scratch, cloud resources would be wasted to a large extent.
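The dueling architecture underlying C-DDQN is not detailed above, but its core aggregation step can be sketched. The following is a minimal illustration (the function name and NumPy-only setup are our assumptions, not the paper's implementation): the network splits into a state-value head V(s) and an advantage head A(s, a), which are recombined as Q(s, a) = V(s) + A(s, a) − mean_a A(s, a).

```python
import numpy as np

def dueling_q(value, advantages):
    """Dueling aggregation: Q(s,a) = V(s) + A(s,a) - mean_a A(s,a).

    `value` has shape (batch, 1) and `advantages` has shape (batch, n_actions).
    Subtracting the mean advantage keeps V and A identifiable, which
    stabilizes learning compared with naive Q-value estimation.
    """
    return value + advantages - advantages.mean(axis=-1, keepdims=True)
```

For example, with V = 1 and advantages (2, 4), the mean advantage is 3, so the resulting Q-values are (0, 2): relative preferences between actions are preserved while the value scale is carried by V.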
• A hybrid cloud and edge framework is presented for the control of DR resources, including BESS and HVAC units.
• The framework addresses the problem that training the DRL-based method requires powerful computing resources, saving the high cost of local computing hardware. Once the framework is commissioned, learning in the cloud and real-time control at the edge can be fully automated, and the DR resources can respond to the DR plan actively and in a timely manner with limited human participation.
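The cloud-to-edge hand-off described above can be sketched as follows. This is a hypothetical illustration (the function names, JSON serialization format, and single linear Q-layer are assumptions, not the paper's design): the cloud serializes the learned policy weights, and the edge device reconstructs them to select greedy actions locally without any training-grade hardware.

```python
import json
import numpy as np

def export_policy(weights):
    # Cloud side: serialize the learned policy weights for
    # distribution to resource-constrained edge controllers.
    return json.dumps({k: v.tolist() for k, v in weights.items()})

def edge_act(policy_json, state):
    # Edge side: reconstruct the policy and pick the greedy action.
    # A single linear layer stands in for the full Q-network here.
    w = {k: np.array(v) for k, v in json.loads(policy_json).items()}
    q = state @ w["W"] + w["b"]
    return int(np.argmax(q))
```

Inference at the edge is then a cheap matrix-vector product, while all gradient computation stays in the cloud.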
• The optimization process is usually time-consuming, especially when the dimension of the decision variables is very large, so solving the optimization problem efficiently requires powerful computing resources. However, current buildings suffer from the high cost of computing resources and lack a cost-effective automation system, which has become the main obstacle to the popularization of the DR program.
• A similar conclusion holds for the slightly higher illegal-action percentage in the first 500 episodes; the gap narrows after 500 episodes, which shows the significance of transfer learning in online training. From the perspective of average daily cost, Table III shows that both training a new model ($12.1) and transferring from the existing model ($11.7) come close to the optimization result ($11.1), whereas directly using the existing model ($18.2) incurs a 63.96% higher cost than the optimal solution.
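The transfer-learning warm start compared in Table III can be sketched as follows. This is a minimal illustration (the layer names and dictionary-of-arrays parameterization are assumptions, not the authors' code): shared feature layers are copied from the existing model, while the task-specific output head keeps a fresh initialization and is retrained online.

```python
import numpy as np

def warm_start(new_params, pretrained, transferable=("features",)):
    """Initialize a new model from a pretrained one.

    Only the layers listed in `transferable` are copied; the remaining
    (task-specific) layers keep their fresh random initialization, so
    online training starts from a better point than training from scratch.
    """
    for name in transferable:
        new_params[name] = pretrained[name].copy()
    return new_params
```

This is why transferring from the existing model ($11.7) approaches the optimum faster than training a new model ($12.1), while freezing everything (directly reusing the existing model, $18.2) fails to adapt to the new building.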
• Luo et al. pointed out that cloud-based information infrastructure will be widely used in the next-generation power grid. Customer-oriented energy management as a service (EMaaS) under the cloud framework has been put forward. The cloud can provide energy management services by solving the optimization for various types of load, such as electric vehicles (EVs) and air conditioners. With the aid of the cloud, end-users only need to pay on demand, largely reducing the investment and operation costs of local hardware.