
Dependency-Aware Computation Offloading in Mobile Edge Computing: A Reinforcement Learning Approach

2019-09-18

SHENGLI PAN


Abstract

Mobile edge computing (MobEC) builds an information technology (IT) service environment that enables cloud-computing capabilities at the edge of mobile networks. To address the limited battery power and computation capability of mobile devices, task offloading to MobEC has been developed to reduce service latency and to ensure high service efficiency. However, most existing schemes focus only on one-shot offloading and give little consideration to task dependency. Since modern communication networks have become increasingly complicated and dynamic, a more comprehensive and adaptive approach is urgently needed that accounts for both the energy constraint and the inherent dependency of tasks. To this end, in this paper we study the problem of dependency-aware task-offloading decisions in MobEC, aiming to minimize the execution time of mobile applications under constraints on energy consumption. To solve this problem, we propose a model-free approach based on reinforcement learning (RL), i.e., a Q-learning approach that adaptively learns to jointly optimize the offloading decision and energy consumption by interacting with the network environment. Simulation results show that our RL-based approach achieves a significant reduction in total execution time with comparably less energy consumption.
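As a rough illustration of the kind of Q-learning formulation the abstract describes, the sketch below trains a tabular agent to decide, task by task, whether to execute locally or offload to the edge. Everything here is an assumption for illustration: the toy chain of dependent tasks, the latency and energy numbers, the state (next task index), the actions, and the weighted latency-plus-energy reward are not taken from the paper, whose actual MDP is not specified in this abstract.

```python
import random

# Illustrative sketch only: a toy chain of three dependent tasks, each
# executed locally (action 0) or offloaded to the edge server (action 1).
# All numbers below are hypothetical, not from the paper.
LOCAL_TIME = [4.0, 6.0, 5.0]     # assumed local execution times
EDGE_TIME = [2.0, 2.5, 2.0]      # assumed edge times (incl. transfer)
LOCAL_ENERGY = [3.0, 4.5, 3.5]   # assumed local energy costs
EDGE_ENERGY = [1.0, 1.2, 1.0]    # assumed transmit energy costs
BETA = 0.5                       # assumed weight on the energy penalty

def reward(task, action):
    """Negative weighted sum of latency and energy for one task."""
    if action == 0:
        return -(LOCAL_TIME[task] + BETA * LOCAL_ENERGY[task])
    return -(EDGE_TIME[task] + BETA * EDGE_ENERGY[task])

def train(episodes=2000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning over the task chain; state = index of next task."""
    rng = random.Random(seed)
    n = len(LOCAL_TIME)
    q = [[0.0, 0.0] for _ in range(n)]
    for _ in range(episodes):
        for s in range(n):
            # epsilon-greedy action selection
            if rng.random() < eps:
                a = rng.randrange(2)
            else:
                a = 0 if q[s][0] >= q[s][1] else 1
            r = reward(s, a)
            nxt = max(q[s + 1]) if s + 1 < n else 0.0
            q[s][a] += alpha * (r + gamma * nxt - q[s][a])
    return q

q_table = train()
policy = [0 if row[0] >= row[1] else 1 for row in q_table]
print(policy)  # with these toy numbers, offloading wins for every task
```

With the toy costs above, the edge is strictly cheaper in both time and energy, so the learned greedy policy offloads every task; changing the per-task numbers (e.g. making transfer expensive for one task) would flip individual decisions, which is the adaptivity the abstract attributes to the RL approach.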
