Embedded and Cloud Computing Integration for Smart Mobile Learning Applications Using Deep Reinforcement Learning
Keywords:
Embedded systems; Deep reinforcement learning; Cloud–edge computing; Smart mobile learning; Computation offloading; Energy efficiency

Abstract
Smart mobile learning applications increasingly depend on embedded devices that operate under tight constraints on computational power, energy, and network stability. Although cloud computing offers scalable processing and storage capacity, integrating embedded platforms with cloud infrastructure remains challenging, particularly for resource-intensive and latency-sensitive learning applications. This paper proposes an embedded–cloud computing framework for smart mobile learning applications based on Deep Reinforcement Learning (DRL). The proposed approach formulates computation offloading and resource allocation as a sequential decision-making problem in which a DRL agent dynamically decides whether learning tasks should be executed locally on the embedded device or offloaded to edge or cloud servers. The state space combines device status, network conditions, and task characteristics, while the reward function balances execution latency, energy consumption, and quality of service. Extensive experiments in a simulated mobile learning environment show that the proposed DRL-based mechanism achieves lower latency and energy consumption than local execution, cloud-only offloading, and heuristic-based approaches. The results demonstrate the scalability and adaptability of DRL in dynamic mobile systems, making the proposed model a promising candidate for next-generation smart mobile learning systems that integrate embedded and cloud computing technologies.
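The formulation summarized above (a state combining device, network, and task features; a discrete local/edge/cloud action space; a reward trading off latency and energy) can be sketched in Python as follows. This is a minimal illustrative model, not the paper's implementation: all field names, cost constants, and weights are assumptions chosen for readability.

```python
from dataclasses import dataclass

# Assumed state features for the offloading agent:
# device status, network status, and task properties.
@dataclass
class State:
    battery_level: float    # remaining energy fraction, 0..1
    cpu_load: float         # local CPU utilisation, 0..1
    bandwidth_mbps: float   # current uplink bandwidth (Mbit/s)
    task_cycles: float      # CPU demand of the task (mega-cycles)
    task_bits: float        # data to transmit if offloaded (Mbit)

# Discrete action space: execute locally, on an edge server, or in the cloud.
ACTIONS = ("local", "edge", "cloud")

def step_cost(state: State, action: str,
              local_mhz=1000.0, edge_mhz=4000.0, cloud_mhz=10000.0,
              edge_rtt_s=0.01, cloud_rtt_s=0.1,
              tx_j_per_mbit=0.5, local_j_per_mcycle=0.002):
    """Return (latency_s, energy_j) under an illustrative cost model.

    Local execution costs compute time and CPU energy; offloading costs
    transmission time/energy plus round-trip delay, but runs on a faster CPU.
    All constants are assumptions, not measured values.
    """
    if action == "local":
        latency = state.task_cycles / local_mhz
        energy = state.task_cycles * local_j_per_mcycle
    else:
        tx_time = state.task_bits / state.bandwidth_mbps
        rtt = edge_rtt_s if action == "edge" else cloud_rtt_s
        remote_mhz = edge_mhz if action == "edge" else cloud_mhz
        latency = tx_time + rtt + state.task_cycles / remote_mhz
        energy = state.task_bits * tx_j_per_mbit  # radio energy only
    return latency, energy

def reward(state: State, action: str, w_lat=1.0, w_energy=0.5):
    """Negative weighted sum of latency and energy; a DRL agent maximises this."""
    latency, energy = step_cost(state, action)
    return -(w_lat * latency + w_energy * energy)
```

For example, a compute-heavy task on a loaded device with decent bandwidth yields a higher reward for edge offloading than for local execution, which is exactly the kind of trade-off the DRL agent is trained to learn; a quality-of-service term (e.g. a deadline penalty) could be added to `reward` in the same way.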
