DNNOff: Offloading DNN-based Intelligent IoT Applications in Mobile Edge Computing


ABSTRACT :

Deep neural networks (DNNs) have become increasingly popular in industrial IoT scenarios. Due to their high demands on computational capability, it is hard for DNN-based applications to run directly on intelligent end devices with limited resources. Computation offloading offers a feasible solution by moving some computation-intensive tasks to the cloud or the edge. Supporting such a capability is not easy for two reasons: (1) Adaptability: offloading should occur dynamically among computation nodes. (2) Effectiveness: it must be determined which parts are worth offloading. This paper proposes a novel approach, called DNNOff. For a given DNN-based application, DNNOff first rewrites the source code to implement a special program structure that supports on-demand offloading and, at runtime, automatically determines the offloading scheme. We evaluated DNNOff on a real-world intelligent application with three DNN models. Our results show that, compared with other approaches, DNNOff reduces response time by 12.4%-66.6% on average.
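To make the runtime decision concrete, the sketch below (not DNNOff's actual implementation) picks a split point in a chain-structured DNN so that the early layers run on the device and the remaining layers run on an edge server, minimizing estimated response time. The per-layer profiling numbers, the function name, and the simple chain-model assumption are all hypothetical.

```python
def best_split(local_ms, remote_ms, out_kbit, uplink_kbps, input_kbit):
    """Return (k, cost): layers [0, k) run on the device, layers [k, n) on the edge.
    k == n means fully local execution; k == 0 means full offloading."""
    n = len(local_ms)
    best_k, best_cost = n, sum(local_ms)            # fully local baseline
    for k in range(n):
        local = sum(local_ms[:k])
        # Data shipped to the edge: the raw input if nothing runs locally,
        # otherwise the intermediate output of the last local layer.
        upload_kbit = input_kbit if k == 0 else out_kbit[k - 1]
        transfer = upload_kbit / uplink_kbps * 1000  # kbit / (kbit/s) -> ms
        cost = local + transfer + sum(remote_ms[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k, best_cost

# Hypothetical per-layer profile of a small 5-layer model.
local_ms  = [5, 60, 55, 50, 10]        # per-layer latency on the device (ms)
remote_ms = [1, 12, 11, 10, 2]         # per-layer latency on the edge (ms)
out_kbit  = [600, 80, 40, 16, 1]       # size of each layer's output (kilobits)
print(best_split(local_ms, remote_ms, out_kbit, uplink_kbps=1000, input_kbit=600))
# -> (1, 120.0): run the cheap first layer locally to shrink the data, offload the rest
```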

EXISTING SYSTEM :

• The edge server is closer to the device than the cloud server, so it offers higher bandwidth and lower response time.
• However, most existing computation offloading frameworks for blockchain mining services ignore user privacy.
• Existing deep learning methods still have shortcomings, e.g., slow learning speed and the failure of the original network parameters when the environment changes.
• To tackle these challenges, a Deep Meta Reinforcement Learning-based Offloading (DMRO) algorithm has been proposed, which combines multiple parallel DNNs with Q-learning to make fine-grained offloading decisions (see the illustrative sketch below).
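As a purely illustrative aid for the last point, the tabular Q-learning loop below shows the basic shape of learning a binary offload/local decision from latency feedback. This is not DMRO itself (which uses multiple parallel DNNs as function approximators); the toy environment, states, and reward are invented for the sketch.

```python
import random
from collections import defaultdict

ACTIONS = (0, 1)                      # 0 = execute locally, 1 = offload to edge
Q = defaultdict(float)                # Q[(state, action)]
alpha, gamma, eps = 0.1, 0.9, 0.1     # learning rate, discount, exploration

def step(state, action):
    """Toy environment: reward is the negative of a simulated latency."""
    load, channel = state
    latency = 50 + 10 * load if action == 0 else 20 + 40 / (channel + 1)
    next_state = (random.randint(0, 4), random.randint(0, 4))
    return next_state, -latency

state = (2, 2)
for _ in range(10_000):
    # epsilon-greedy action selection
    action = random.choice(ACTIONS) if random.random() < eps else \
             max(ACTIONS, key=lambda a: Q[(state, a)])
    nxt, reward = step(state, action)
    best_next = max(Q[(nxt, a)] for a in ACTIONS)
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
    state = nxt
```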

DISADVANTAGE :

• In light of this situation, computation offloading has been adopted to address the problem.
• A promising technique is to offload computation-intensive tasks to nearby servers with more abundant resources, which is called computation offloading or nomadic services (a first-order latency model is sketched after this list).
• The multi-user computation offloading problem in a multi-channel wireless interference environment has been studied, with game-theoretic methods used to implement effective channel allocation in a distributed manner.
• A two-layer optimization method has been used to decouple the original NP-hard problem into a lower-layer problem that seeks power and sub-carrier allocation and an upper-layer task offloading problem.
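For reference, a commonly used first-order latency model behind the offload-or-not decision discussed above (a simplification, not the exact formulation of the cited works) is

\[
T_{\mathrm{local}} = \frac{C}{f_{\mathrm{ue}}},
\qquad
T_{\mathrm{off}} = \frac{D}{R} + \frac{C}{f_{\mathrm{mes}}},
\]

where \(C\) is the task's required CPU cycles, \(D\) the input data size, \(R\) the uplink rate, and \(f_{\mathrm{ue}}, f_{\mathrm{mes}}\) the CPU frequencies of the device and the edge server; offloading pays off when \(T_{\mathrm{off}} < T_{\mathrm{local}}\) (propagation delay and result download are ignored here).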

PROPOSED SYSTEM :

• The proposed work considers all the important parameters in the cost function and generates a comprehensive training dataset with high computation and complexity.
• The proposed work considers the partitioning process in a partial offloading technique, calculates the cost of each possible partitioning and offloading policy, and then selects the policy with the minimum cost (see the sketch after this list).
• The proposed cost function also accounts for propagation delay, radio resources, and computing resources.
• In the proposed work, a task is first divided into n components; using the partial offloading technique, the UE then offloads some of the components to the MES and executes the remaining components locally.
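The sketch below illustrates the minimum-cost policy search referenced above. Brute-force enumeration and the simple additive cost model (UE computing + radio transmission + propagation delay + MES computing for sequentially executed components) are assumptions for the sketch, and all per-component numbers are hypothetical.

```python
from itertools import product

def policy_cost(policy, cycles_mc, data_kbit, f_ue_mhz, f_mes_mhz,
                uplink_kbps, prop_ms):
    """Total latency (ms) of running each component on the UE (0) or MES (1)."""
    cost = 0.0
    for offload, c, d in zip(policy, cycles_mc, data_kbit):
        if offload:
            cost += d / uplink_kbps * 1000      # radio transmission (ms)
            cost += prop_ms                     # propagation delay (ms)
            cost += c / f_mes_mhz * 1000        # computing on the MES (ms)
        else:
            cost += c / f_ue_mhz * 1000         # computing on the UE (ms)
    return cost

def best_policy(cycles_mc, data_kbit, **params):
    """Enumerate all 2^n offloading policies and return the cheapest one."""
    n = len(cycles_mc)
    return min(product((0, 1), repeat=n),
               key=lambda p: policy_cost(p, cycles_mc, data_kbit, **params))

cycles_mc = [200, 800, 600, 100]          # megacycles per component
data_kbit = [240, 960, 640, 2000]         # input data per component (kilobits)
params = dict(f_ue_mhz=500, f_mes_mhz=4000, uplink_kbps=2000, prop_ms=5)
p = best_policy(cycles_mc, data_kbit, **params)
print(p, policy_cost(p, cycles_mc, data_kbit, **params))   # (1, 1, 1, 0) 1535.0
```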

ADVANTAGE :

• In this paper, we propose a computation offloading strategy for a scenario with multiple users and multiple mobile edge servers that considers the performance of intelligent devices and server resources.
• Computation offloading is a promising technique that can extend the lifetime and improve the performance of smart devices by offloading local computation tasks to edge servers.
• In contrast, prior work did not consider the effect of server resources on computation offloading performance.
• A task scheduling model based on an improved auction is proposed to optimize the time requirements of the tasks and the computation performance of the MEC servers (an illustrative scheduling sketch follows this list).
• The improved auction algorithm proposed in this paper not only has advantages in time complexity but also improves the efficiency of virtual machines.
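As a rough illustration of the scheduling idea in the last two points, the sketch below uses a plain greedy, auction-flavored assignment by earliest finish time. It is not the paper's improved auction algorithm, and the task sizes and VM speeds are made up.

```python
def schedule(task_cycles, vm_speeds):
    """In each round every pending task 'bids' its earliest possible finish
    time on each VM; the single best (task, VM) pair is awarded, the VM's
    queue is updated, and the process repeats until all tasks are placed."""
    vm_free = [0.0] * len(vm_speeds)           # time at which each VM becomes idle
    assignment = {}                            # task index -> VM index
    pending = set(range(len(task_cycles)))
    while pending:
        best = None                            # (finish_time, task, vm)
        for t in pending:
            for v, speed in enumerate(vm_speeds):
                finish = vm_free[v] + task_cycles[t] / speed
                if best is None or finish < best[0]:
                    best = (finish, t, v)
        finish, t, v = best
        assignment[t] = v
        vm_free[v] = finish
        pending.remove(t)
    return assignment, max(vm_free)            # placement and overall makespan

# Hypothetical workload: cycles per task and cycles/second per VM.
print(schedule(task_cycles=[400, 900, 250], vm_speeds=[1000, 2500]))
```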
