Self-Learning Neural Network Architectures for Variable Length Encoding Schemes

Abstract Data analytics problems require real-time processing of complex data gathered by a large network of sensors. As networks grow larger and the analytics become more complex, centralized solutions are becoming increasingly infeasible. Analytics problems leverage the density of sensor networks to make predictions based on spatio-temporal correlations that are difficult to compress into a more easily communicable form. Our previous work uses neural networks to learn a fixed-length encoding scheme that captures these correlations and solves analytics problems using distributed inference. In this paper, we present a novel neural network architecture that allows for variable-size outputs from fully connected neural network layers.
Authors
  • Nick Nordlund (Yale)
  • Heesung Kwon (ARL)
  • Geeth de Mel (IBM UK)
  • Leandros Tassiulas (Yale)
Date Sep-2019
Venue Annual Fall Meeting of the DAIS ITA, 2019
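The abstract does not describe the mechanism behind variable-size outputs, but a minimal sketch of one generic way to get them from a fixed fully connected layer is to size the layer for a maximum length and truncate its output using a separately predicted length. Everything here (the layer sizes, the argmax-based length head, and the truncation scheme) is an illustrative assumption, not the authors' architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

MAX_LEN = 8   # assumed maximum encoding length
D_IN = 4      # assumed input feature dimension

# Fully connected layer sized for the maximum possible output length.
W = rng.standard_normal((D_IN, MAX_LEN))
b = np.zeros(MAX_LEN)

# Assumed "length head": a second linear map whose argmax picks the
# number of output units to keep for this input.
W_len = rng.standard_normal((D_IN, MAX_LEN))

def encode(x):
    """Return a variable-length encoding of x.

    Computes the full MAX_LEN-wide layer output, then truncates it to
    the length chosen by the length head (between 1 and MAX_LEN).
    """
    full = x @ W + b                        # fixed-size layer output
    length = int(np.argmax(x @ W_len)) + 1  # predicted length in 1..MAX_LEN
    return full[:length]

x = rng.standard_normal(D_IN)
code = encode(x)
print(len(code))  # varies with the input, never exceeds MAX_LEN
```

In a trained system, the length head would be learned jointly with the encoder so that simpler inputs receive shorter encodings; this sketch only illustrates the shape of such an interface.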