About this Research Topic
This Research Topic aims to explore the alternatives for executing single tasks, executing interdependent tasks, and placing Service Function Chains that serve multiple users, and to analyze the trade-offs between these alternatives in terms of the achieved QoS and QoE (e.g., response time, delay, bandwidth, and energy consumption). In this regard, cost minimization for the targeted infrastructures can also be an objective, encompassing monetary costs, energy consumption, and resource utilization, alongside the maximization of user satisfaction. Finally, resilience during computational offloading and formal guarantees regarding task execution (Service Level Agreements, SLAs) can be examined as well.
Specific themes that contributors are asked to address include, but are not limited to, Machine Learning algorithms (Reinforcement Learning, Supervised Learning, Deep Learning, etc.) for placing tasks in networking infrastructures, as well as solutions based on Control Theory, Optimization, Game Theory, and Contract Theory. The modern networking infrastructures of interest include 5G and beyond (6G) infrastructures such as Edge and Fog Computing platforms, Serverless Computing architectures, Approximate Computing, and Digital Twins, and their comparison against local (on-device) execution or traditional Cloud-based execution. More broadly, the proposed solutions could seek a balance in distributing task execution across the user-to-Cloud continuum.
Keywords: Machine Learning, Control Theory, Game Theory, Optimization, 5G/6G Networks, Task Placement, Edge Computing, Fog Computing, Cloud Computing
Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.