Enabling Real-time AI Inference on Mobile Devices via GPU-CPU Collaborative Execution

Hao Li*, Joseph K. Ng*, Tarek Abdelzaher

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

3 Citations (Scopus)

Abstract

AI-powered mobile applications are becoming increasingly popular due to recent advances in machine intelligence. They include, but are not limited to, mobile sensing, virtual assistants, and augmented reality. Mobile AI models, especially Deep Neural Networks (DNNs), are usually executed locally, as sensory data are collected and generated by end devices. This imposes a heavy computational burden on resource-constrained mobile phones. Typically, a set of DNN jobs with deadline constraints is waiting for execution. Existing AI inference frameworks process incoming DNN jobs in sequential order, which does not optimally support mobile users' real-time interactions with AI services. In this paper, we propose a framework that achieves real-time inference by exploiting heterogeneous mobile SoCs, which contain both a CPU and a GPU. Considering the characteristics of DNN models, we optimally partition the execution between the mobile GPU and CPU. We present a dynamic programming-based approach to solve the formulated real-time DNN partitioning and scheduling problem. The proposed framework has several desirable properties: 1) computational resources on mobile devices are better utilized; 2) it optimizes inference performance in terms of deadline miss rate; 3) no sacrifices in inference accuracy are made. Evaluation results on an off-the-shelf mobile phone show that our proposed framework provides better real-time support for AI inference tasks on mobile platforms, compared to several baselines.
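To make the idea of dynamic programming-based DNN partitioning concrete, the sketch below shows a simplified layer-level version of the problem: each layer runs on either the CPU or the GPU, switching processors between consecutive layers incurs a data-transfer penalty, and the goal is to minimize end-to-end latency (which can then be checked against a deadline). This is purely an illustrative sketch under those assumptions, not the paper's actual formulation or scheduler; the timing values are made up, and the paper additionally handles multiple jobs and deadline miss rates.

```python
def partition_dnn(cpu_t, gpu_t, xfer):
    """Assign each DNN layer to CPU (0) or GPU (1) to minimize latency.

    cpu_t[i], gpu_t[i] -- execution time of layer i on each processor
    xfer -- transfer penalty paid when consecutive layers switch processors

    DP state: best[i][d] = minimum latency of layers 0..i, given that
    layer i runs on device d. Returns (latency, per-layer assignment).
    """
    n = len(cpu_t)
    times = list(zip(cpu_t, gpu_t))
    INF = float("inf")
    best = [[INF, INF] for _ in range(n)]
    parent = [[0, 0] for _ in range(n)]  # device of the previous layer
    best[0][0], best[0][1] = cpu_t[0], gpu_t[0]

    for i in range(1, n):
        for d in (0, 1):
            stay = best[i - 1][d]               # previous layer on same device
            switch = best[i - 1][1 - d] + xfer  # previous layer on the other one
            if stay <= switch:
                best[i][d] = times[i][d] + stay
                parent[i][d] = d
            else:
                best[i][d] = times[i][d] + switch
                parent[i][d] = 1 - d

    # Pick the cheaper final device, then walk parent pointers backwards.
    d = 0 if best[n - 1][0] <= best[n - 1][1] else 1
    latency = best[n - 1][d]
    plan = [d]
    for i in range(n - 1, 0, -1):
        d = parent[i][d]
        plan.append(d)
    plan.reverse()
    return latency, plan


if __name__ == "__main__":
    # Hypothetical 3-layer model: early layers are GPU-friendly,
    # the last layer is cheaper on the CPU.
    latency, plan = partition_dnn(cpu_t=[4, 4, 1], gpu_t=[1, 1, 5], xfer=1)
    deadline = 5
    print(latency, plan, latency <= deadline)
```

The DP runs in O(n) time for n layers since there are only two devices per state; checking `latency <= deadline` is the admission decision a real-time scheduler would make per job.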

Original language: English
Title of host publication: Proceedings - 2022 IEEE 28th International Conference on Embedded and Real-Time Computing Systems and Applications, RTCSA 2022
Publisher: IEEE
Pages: 195-204
Number of pages: 10
ISBN (Electronic): 9781665453448
ISBN (Print): 9781665453455
Publication status: Published - 23 Aug 2022
Event: 28th IEEE International Conference on Embedded and Real-Time Computing Systems and Applications, RTCSA 2022 - Taipei, Taiwan, Province of China
Duration: 23 Aug 2022 - 25 Aug 2022
https://ieeexplore.ieee.org/xpl/conhome/9904767/proceeding

Publication series

Name: Proceedings - IEEE International Conference on Embedded and Real-Time Computing Systems and Applications (RTCSA)
ISSN (Print): 2325-1271
ISSN (Electronic): 2325-1301

Conference

Conference: 28th IEEE International Conference on Embedded and Real-Time Computing Systems and Applications, RTCSA 2022
Country/Territory: Taiwan, Province of China
City: Taipei
Period: 23/08/22 - 25/08/22

Scopus Subject Areas

  • Computer Networks and Communications
  • Computer Science Applications
  • Information Systems and Management
  • Control and Optimization
