Enabling Edge Devices that Learn from Each Other: Cross Modal Training for Activity Recognition

Abstract Edge devices rely extensively on machine learning for intelligent inferences and pattern matching. However, edge devices use a multitude of sensing modalities and are exposed to wide-ranging contexts. It is difficult to develop separate machine learning models for each scenario, as manual labeling is not scalable. To reduce the amount of labeled data and to speed up the training process, we propose to transfer knowledge between edge devices by using unlabeled data. Our approach, called RecycleML, uses cross modal transfer to accelerate the learning of edge devices across different sensing modalities. Using human activity recognition as a case study, we show that RecycleML reduces the amount of required labeled data by at least 90% and speeds up the training process by up to 50 times in comparison to training the edge device from scratch.
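For readers unfamiliar with cross modal transfer, the sketch below illustrates the general idea in PyTorch: an encoder for a new sensing modality is trained to mimic the latent features of an already-trained modality using unlabeled, time-synchronized observation pairs, after which the existing classifier layers are reused and optionally fine-tuned with a small labeled set. All layer sizes, the pairing loss, and the two-modality setup here are illustrative assumptions, not the paper's exact RecycleML architecture or training recipe.

```python
# Hypothetical sketch of cross-modal knowledge transfer between two sensing
# modalities; shapes and losses are assumptions for illustration only.
import torch
import torch.nn as nn

LATENT_DIM, NUM_ACTIVITIES = 64, 6

def encoder(in_dim: int) -> nn.Module:
    """Maps raw sensor features of one modality into a shared latent space."""
    return nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, LATENT_DIM))

# Modality A (e.g., a camera device) is assumed already trained with labeled data.
enc_a = encoder(in_dim=512)
head = nn.Sequential(nn.ReLU(), nn.Linear(LATENT_DIM, NUM_ACTIVITIES))  # shared upper layers

# Modality B (e.g., an IMU device) starts untrained.
enc_b = encoder(in_dim=32)

# Step 1: align modality B's latent features to modality A's using UNLABELED,
# time-synchronized pairs (x_a, x_b) observing the same activity.
opt = torch.optim.Adam(enc_b.parameters(), lr=1e-3)
for _ in range(200):
    x_a = torch.randn(16, 512)          # placeholder for paired camera features
    x_b = torch.randn(16, 32)           # placeholder for paired IMU features
    with torch.no_grad():
        target_latent = enc_a(x_a)      # frozen "teacher" representation
    loss = nn.functional.mse_loss(enc_b(x_b), target_latent)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Step 2: reuse the existing classifier head on top of the new encoder,
# fine-tuning with only a small amount of labeled modality-B data if available.
imu_model = nn.Sequential(enc_b, head)
logits = imu_model(torch.randn(4, 32))  # activity predictions for IMU input
print(logits.shape)                     # torch.Size([4, 6])
```

The key point the sketch captures is that the expensive, label-hungry upper layers are trained once and recycled; only the lightweight modality-specific encoder must be learned for a new device, and that can be done from unlabeled paired observations.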
Authors
  • Tianwei Xing (UCLA)
  • Sandeep Singh Sandha (UCLA)
  • Bharathan Balaji (UCLA)
  • Supriyo Chakraborty (IBM US)
  • Mani Srivastava (UCLA)
Date June 2018
Venue EdgeSys '18: Proceedings of the 1st International Workshop on Edge Systems, Analytics and Networking, Pages 37-42