Caching Popular Transient IoT Contents in an SDN-based Edge Infrastructure
ABSTRACT :
With more than 75 billion objects expected to be connected by 2025, the Internet of Things (IoT) is the catalyst of the digital revolution, generating large amounts of (transient) data that call into question the storage and processing performance of the conventional cloud. Moving storage resources to the edge can reduce the data retrieval latency and save core network resources, although the actual performance depends on the selected caching policy. Existing edge caching strategies mainly account for content popularity as the crucial decision metric and do not consider the transient nature of IoT data. In this paper, we design a caching orchestration mechanism, deployed as a network application on top of a software-defined networking (SDN) controller in charge of the edge infrastructure, which accounts for the nodes’ storage capabilities, the network links’ available bandwidth, and the IoT data lifetime and popularity. The policy decides which IoT contents have to be cached and in which node of a distributed edge deployment with limited storage resources, with the ultimate aim of minimizing the data retrieval latency. We formulate the optimal content placement as an Integer Linear Programming (ILP) problem and propose a heuristic algorithm to solve it. Results show that the proposal outperforms the considered benchmark solutions in terms of latency and cache hit probability under all the considered simulation settings.
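To make the placement idea concrete, the following is a minimal greedy sketch (not the paper's actual heuristic, whose details are not given here): each transient content is scored by its popularity weighted by its residual lifetime, and edge caches are filled in descending score order. All field names, the scoring formula, and the node-selection rule are illustrative assumptions.

```python
# Hedged sketch: lifetime- and popularity-aware greedy cache placement.
# Scoring and tie-breaking are assumptions, not the paper's algorithm.

def greedy_placement(contents, nodes, now):
    """contents: list of dicts with 'id', 'size', 'popularity',
    'expiry' (absolute expiration time); nodes: dict node_id -> free
    capacity. Returns a mapping content_id -> node_id."""
    placement = {}
    # Score: expected usefulness = popularity x residual lifetime.
    scored = sorted(
        (c for c in contents if c["expiry"] > now),
        key=lambda c: c["popularity"] * (c["expiry"] - now),
        reverse=True,
    )
    for c in scored:
        # Among nodes that can hold the item, pick the one with the
        # most free space (a simple load-balancing assumption).
        fitting = (n for n in nodes if nodes[n] >= c["size"])
        best = max(fitting, key=lambda n: nodes[n], default=None)
        if best is not None:
            placement[c["id"]] = best
            nodes[best] -= c["size"]
    return placement
```

Expired contents are never cached, which captures the transient-data aspect the abstract emphasizes; a real orchestrator would also weigh link bandwidth and retrieval latency per node.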
EXISTING SYSTEM :
• We review the existing caching and replacement policies specifically designed for transient contents in Vehicular Named Data Networking (VNDN) and, finally, outline interesting research perspectives.
• A few survey papers on VNDN can already be found in the literature; however, none of them tackles our targets.
• To reach the targeted objectives, the proposal defines an optimal scheduling of data traffic over the existing links and selects a set of parked vehicles as optimal relays.
• We present a classification and analysis of the existing solutions proposed in the context of VNDN caching.
• If the newly received value is higher, the existing cached item is replaced in favor of the new one.
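The value-based replacement rule in the last bullet can be sketched as follows; the `value` scoring itself is left abstract, since the bullets do not specify it.

```python
# Minimal sketch of value-based cache replacement: when the cache is
# full, a new item evicts the lowest-valued cached item only if the
# new item's value is strictly higher.

def maybe_insert(cache, capacity, item_id, value):
    """cache: dict item_id -> value. Returns True if the item ends up
    cached, False if it was rejected."""
    if item_id in cache or len(cache) < capacity:
        cache[item_id] = value          # free slot or refresh
        return True
    victim = min(cache, key=cache.get)  # lowest-valued cached item
    if value > cache[victim]:
        del cache[victim]
        cache[item_id] = value
        return True
    return False
```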
DISADVANTAGE :
• The optimization problem is modeled as a large-scale linear programming problem that is solved using the column generation method.
• Considering these problems, researchers investigated the possibility of caching content items locally and proactively at the edge of mobile networks (i.e., caching at small base stations (SBSs) and user terminals (UTs)) before users request them.
• They formulate the problem with the aim of increasing the throughput by offloading a large amount of traffic from the main cellular network.
• The objective of the optimization problem is to maximize the offloading probability.
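The offloading objective in the last bullet can be illustrated with a toy model (an assumption, not the cited works' formulation): choose which contents to cache under a capacity budget so that the probability a request is served locally is maximized. With heterogeneous sizes this is a 0/1 knapsack, solvable by dynamic programming at small scale.

```python
# Hedged sketch: maximize offloading probability via 0/1 knapsack DP.
# items: list of (size, request_probability); sizes are integers.

def max_offloading(items, capacity):
    """Returns (best offloading probability, set of chosen indices)."""
    best = [0.0] * (capacity + 1)
    keep = [set() for _ in range(capacity + 1)]
    for i, (size, prob) in enumerate(items):
        # Iterate capacities downward so each item is used at most once.
        for cap in range(capacity, size - 1, -1):
            cand = best[cap - size] + prob
            if cand > best[cap]:
                best[cap] = cand
                keep[cap] = keep[cap - size] | {i}
    return best[capacity], keep[capacity]
```

The column-generation approach mentioned above targets the same kind of objective at scales where this exact DP would be intractable.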
PROPOSED SYSTEM :
• In parallel, several caching strategies have been proposed to improve the availability of contents.
• Multiple approaches have been proposed over the years to limit the adverse effects of packet broadcasting, e.g., high traffic congestion and limited reliability, and improve the forwarding decision.
• Many other caching solutions that outperform the default cache everything everywhere (CEE) policy have been proposed in the literature.
• An analytical model capturing the distributed contention in the vehicular ad hoc network (VANET) is proposed that shows how the Age of Information (AoI) changes with the beacon sending frequency and the vehicle density.
• More advanced scheduling mechanisms are proposed to minimize AoI and, sometimes, jointly satisfy other performance metrics, e.g., limiting the energy consumption or the network congestion.
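The AoI metric the bullets above refer to follows the standard sawtooth model: the age of the freshest cached status grows linearly between beacons and resets when a new beacon is received. The sketch below computes the time-averaged age under the simplifying assumption of zero reception delay (so the age resets to zero at each update).

```python
# Textbook Age-of-Information calculation over a finite horizon,
# assuming updates are received with negligible delay.

def average_aoi(update_times, horizon):
    """update_times: sorted reception instants of fresh updates,
    starting with one at t=0. Returns the time-averaged age over
    [0, horizon], i.e., the area under the sawtooth divided by the
    horizon length."""
    area = 0.0
    for prev, curr in zip(update_times, update_times[1:] + [horizon]):
        dt = curr - prev
        area += dt * dt / 2.0  # triangular area between two updates
    return area / horizon
```

With a constant beacon period T this yields the familiar average age of T/2, which is why raising the beacon frequency lowers AoI until channel contention starts dominating.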
ADVANTAGE :
• To handle this traffic explosion, mobile wireless networks require continuous evolution and improved performance in terms of power consumption, data throughput, and utilization of network resources such as backhaul capacity and bandwidth.
• Service chaining refers to executing multiple service functions in an ordered list to guarantee performance and security requirements.
• The performance indices in these proposals are overall delay, user satisfaction ratio, offloading probability, and total throughput.
• The scalability of the caching algorithm addresses how its performance evolves as the network size grows.
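The service-chaining notion mentioned above can be sketched in a few lines: a packet traverses an ordered list of service functions, each of which may transform it or drop it. The specific functions (firewall, NAT) are hypothetical examples, not part of the proposal.

```python
# Toy service-function chain: ordered functions applied in sequence;
# a function returning None drops the packet.

def apply_chain(packet, chain):
    """chain: ordered list of callables packet -> packet | None."""
    for fn in chain:
        packet = fn(packet)
        if packet is None:
            return None  # dropped by this service function
    return packet

# Hypothetical chain: a firewall that blocks Telnet, then a NAT step.
firewall = lambda p: p if p.get("port") != 23 else None
nat = lambda p: {**p, "src": "10.0.0.1"}
```

Ordering matters: placing the firewall first avoids spending NAT work on packets that will be dropped anyway, which is the kind of performance guarantee a chaining policy encodes.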