In this paper, we study how to design resource allocation algorithms for data analytics services that are computationally intensive and have low-latency requirements. As a paradigm application, we consider a video surveillance service in which video streams are analyzed in the cloud with deep learning algorithms (e.g., object detection and image classification). We present a network model that allows data analytics to be processed in multiple stages, and we propose an algorithm that keeps congestion low when the arrival rate is constant over time. The algorithm also allows other types of data analytics to be carried out in the cloud so as to maximize resource utilization. We evaluate the proposed algorithm in simulation and show that it achieves low delay while maximizing the use of resources.
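To illustrate the multi-stage processing model the abstract refers to, the sketch below simulates a two-stage tandem queue with constant (deterministic) arrivals and fixed per-stage service times, and reports the mean end-to-end delay. This is a minimal illustration of staged processing under a constant arrival rate, not the paper's algorithm; the stage count, service times, and function name are assumptions made for the example.

```python
def simulate_tandem(n_jobs=1000, interarrival=1.0, service=(0.4, 0.4)):
    """Illustrative two-stage tandem queue (not the paper's algorithm):
    jobs arrive every `interarrival` time units and pass through each
    stage in order; returns the mean end-to-end delay per job."""
    stage_free = [0.0] * len(service)  # earliest time each stage is idle
    total_delay = 0.0
    for k in range(n_jobs):
        t = k * interarrival           # arrival time of job k
        done = t
        for s, svc in enumerate(service):
            start = max(done, stage_free[s])  # wait if the stage is busy
            done = start + svc
            stage_free[s] = done
        total_delay += done - t
    return total_delay / n_jobs
```

When each stage's service time is below the interarrival time, no queue builds up and the delay is simply the sum of the service times; raising a stage's service time above the interarrival time makes that stage a bottleneck and the mean delay grows with the number of jobs, which is the congestion regime the proposed algorithm aims to avoid.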