Keyphrases

100% relevance: Two-time Scale, Continual Learning, Edge-cloud Collaborative, Online Management, Two-time-scale Approach

50% relevance: Training Model, Real-Time Application, Deep Learning, Low Computation, Communication Cost, Error Bound, Geographical Location, Optimization Framework, Global Constraints, Distributed Deep Learning, Model Dynamics, Generated Data, Uncertain Future, Network Change, Deep Learning Training, Coupling Effect, Local Model, Data Heterogeneity, Model Convergence, Model Aggregation, Online Optimization, Lyapunov Optimization, Data Offloading, Online Offloading, Continuous Training, Computation Capability, Synchronization Frequency, Model Synchronization, Fine Time-scales, Dynamic Offloading, Compromise Model, Data Dynamics, Computational Node, Long-term Costs
Computer Science

100% relevance: Deep Learning Method, Data Stream

50% relevance: Large Data Set, Optimization Technique, Communication Cost, Real-Time Application, Optimization Framework, Global Constraint, Training Model, Data Heterogeneity, Computation Node, Lyapunov Optimization, Computation Capability