A QoE-Oriented Computation Offloading Algorithm based on Deep Reinforcement Learning for Mobile Edge Computing
This repository contains the Python code for reproducing the decentralized QECO (QoE-Oriented Computation Offloading) algorithm, designed for Mobile Edge Computing (MEC) systems.
QECO is designed to balance and prioritize QoE factors based on the requirements of individual mobile devices (MDs) while considering the dynamic workloads at the edge nodes (ENs). The QECO algorithm captures the dynamics of the MEC environment by integrating the Dueling Double Deep Q-Network (D3QN) model with Long Short-Term Memory (LSTM) networks. The algorithm addresses the QoE maximization problem by efficiently utilizing resources from both MDs and ENs.
- D3QN: By integrating both double Q-learning and dueling network architectures, D3QN overcomes overestimation bias in action-value predictions and accurately identifies the relative importance of states and actions. This improves the model's ability to make accurate predictions, providing a foundation for enhanced offloading strategies.
- LSTM: Incorporating LSTM networks allows the model to continuously estimate dynamic workloads at edge servers. This is crucial for dealing with limited global information and adapting to the uncertain MEC environment with multiple MDs and ENs. By predicting the future workload of edge servers, MDs can effectively adjust their offloading strategies to achieve higher QoE.
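The combination described above can be sketched as a single network: an LSTM that summarizes a history of observations (e.g. past edge-node workloads), feeding dueling value and advantage heads. This is a minimal illustrative sketch in PyTorch, not the repository's exact architecture; the layer sizes, class name, and input layout are assumptions.

```python
import torch
import torch.nn as nn

class DuelingLSTMQNet(nn.Module):
    """Illustrative dueling Q-network with an LSTM front end."""

    def __init__(self, obs_dim, n_actions, hidden=64):
        super().__init__()
        # LSTM summarizes a sequence of past observations into a hidden state.
        self.lstm = nn.LSTM(obs_dim, hidden, batch_first=True)
        # Dueling heads: state value V(s) and per-action advantages A(s, a).
        self.value = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1))
        self.advantage = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, n_actions))

    def forward(self, obs_seq):
        # obs_seq: (batch, time, obs_dim)
        out, _ = self.lstm(obs_seq)
        h = out[:, -1, :]                  # hidden state at the last time step
        v = self.value(h)                  # (batch, 1)
        a = self.advantage(h)              # (batch, n_actions)
        # Standard dueling aggregation: Q = V + (A - mean(A)),
        # which keeps V and A identifiable.
        return v + a - a.mean(dim=1, keepdim=True)

net = DuelingLSTMQNet(obs_dim=4, n_actions=3)
q = net(torch.zeros(2, 10, 4))  # batch of 2, history length 10
print(q.shape)                  # torch.Size([2, 3])
```

In double Q-learning, a network like this would select the next action while a periodically synchronized target copy evaluates it, which is what suppresses the overestimation bias.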
- main.py: The main script, including the training and testing structures, implemented using TensorFlow 1.x.
- MEC_Env.py: Contains the code for the mobile edge computing environment.
- D3QN.py: The dueling double deep Q-network (D3QN) reinforcement learning code for mobile devices, implemented using TensorFlow 1.x.
- DDQN_keras.py: D3QN implementation using Keras.
- DDQN_torch.py: D3QN implementation using PyTorch.
- Config.py: Configuration file for MEC entities and neural network setup.
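As a rough picture of what Config.py holds, a configuration of this kind typically pins down the MEC topology and the learning hyperparameters as module-level constants. The names and values below are hypothetical placeholders, not the repository's actual settings; consult Config.py itself before training.

```python
# Hypothetical Config.py-style constants (illustrative values only).

# --- MEC environment ---
N_MD = 50            # number of mobile devices (MDs)
N_EN = 5             # number of edge nodes (ENs)
N_TIME_SLOTS = 100   # time slots per episode

# --- Training ---
N_EPISODES = 1000    # training episodes
LEARNING_RATE = 1e-3
GAMMA = 0.9          # reward discount factor
EPSILON_MAX = 0.99   # final value for epsilon-greedy exploration

print(N_MD, N_EN, GAMMA)
```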
If you use this work in your research, please cite it as follows:
I. Rahmati, H. Shahmansouri, and A. Movaghar, "QECO: A QoE-Oriented Computation Offloading Algorithm based on Deep Reinforcement Learning for Mobile Edge Computing".
@article{rahmati2023qeco,
title={QECO: A QoE-Oriented Computation Offloading Algorithm based on Deep Reinforcement Learning for Mobile Edge Computing},
author={Rahmati, Iman and Shah-Mansouri, Hamed and Movaghar, Ali},
journal={arXiv preprint arXiv:2311.02525},
year={2023}
}
- Iman Rahmati: Research Assistant in the Computer Science and Engineering Department at Sharif University of Technology (SUT).
- Hamed Shah-Mansouri: Assistant Professor in the Electrical Engineering Department at SUT.
- Ali Movaghar: Professor in the Computer Science and Engineering Department at SUT.
Make sure you have the following packages installed:
- TensorFlow (1.x for main.py and D3QN.py)
- PyTorch
- numpy
- matplotlib
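Assuming pip in a Python 3 environment, the packages above can be installed in one step. Note that the TensorFlow 1.x scripts require a TF1-compatible interpreter (Python 3.7 or earlier); on such an environment a `tensorflow<2` pin is one way to get a 1.x release.

```shell
pip install "tensorflow<2" torch numpy matplotlib
```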
- Clone the repository:
git clone https://github.com/ImanRHT/QECO.git
cd QECO
- Configure the MEC environment in Config.py.
- Run the training script:
python main.py
We welcome contributions! Here’s how you can get involved:
- Fork the repository and create a new branch for your contribution.
- Submit a pull request detailing your changes or additions.
- For bug reports or feature requests, open a GitHub issue.
- H. Shah-Mansouri and V. W. Wong, "Hierarchical fog-cloud computing for IoT systems: A computation offloading game," IEEE Internet of Things Journal, May 2018.
- M. Tang and V. W. Wong, "Deep reinforcement learning for task offloading in mobile edge computing systems," IEEE Transactions on Mobile Computing, Nov 2020.
- H. Zhou, K. Jiang, X. Liu, X. Li, and V. C. Leung, "Deep reinforcement learning for energy-efficient computation offloading in mobile-edge computing," IEEE Internet of Things Journal, Jun 2021.
- L. Yang, H. Zhang, X. Li, H. Ji, and V. C. Leung, "A distributed computation offloading strategy in small-cell networks integrated with mobile edge computing," IEEE/ACM Transactions on Networking, Dec 2018.