Abstract
Geographic phenomena such as weather, pollution, and pollen are increasingly monitored through deployments of wide-area networked sensors. However, the coverage of these sensors is limited to key, densely populated regions. A standard approach to inferring the missing spatial and temporal values is regression. In this paper, we present a new approach, NN-SAR, to inferring spatiotemporal values from existing deployed sensors. We model this inference problem as one of learning a spatial representation of the underlying phenomena from the existing data, using a deep-learning-based auto-encoder approach. Classical auto-encoders learn on images or individual time series without taking "spatial" similarities into account. We present a novel mechanism for encoding the spatially distributed sensor readings as "images" and apply an auto-encoder with convolutional layers to learn an efficient representation of the data, which can then be used to infer missing sensor values. Preliminary results indicate that our approach outperforms state-of-the-art Spatial Auto-Regressive (SAR) models by 20% on average.
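To make the core idea concrete, the following is a minimal sketch (in PyTorch, as an assumed framework) of the general approach the abstract describes: sensor readings are rasterized onto a 2-D grid ("image"), and a convolutional auto-encoder learns a compact spatial representation whose decoding fills in cells with no sensor. The grid size, channel counts, and layer choices are illustrative assumptions, not the exact NN-SAR architecture.

```python
# Illustrative convolutional auto-encoder for gridded sensor "images".
# Architecture details (32x32 grid, channel widths) are assumptions for
# the sketch, not the paper's exact NN-SAR model.
import torch
import torch.nn as nn


class ConvAutoEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: compress the 1-channel sensor grid into a small feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),   # 32x32 -> 16x16
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),  # 16x16 -> 8x8
            nn.ReLU(),
        )
        # Decoder: reconstruct the full grid, including unobserved cells.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=3, stride=2,
                               padding=1, output_padding=1),        # 8x8 -> 16x16
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, kernel_size=3, stride=2,
                               padding=1, output_padding=1),        # 16x16 -> 32x32
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))


# Example: a batch of 4 rasterized sensor grids (1 channel, 32x32 cells);
# cells with no deployed sensor could be zero-filled before encoding.
model = ConvAutoEncoder()
grids = torch.zeros(4, 1, 32, 32)
reconstruction = model(grids)  # same shape as the input grid
```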