Publication:
Latency and resource consumption analysis for serverless edge analytics

Full text at PDC
Publication Date
2022-03-16
Publisher
Springer
Abstract
The serverless computing model, implemented by Function as a Service (FaaS) platforms, can offer several advantages for the deployment of data analytics solutions in IoT environments, such as agile and on-demand resource provisioning, automatic scaling, high elasticity, infrastructure management abstraction, and a fine-grained cost model. Nonetheless, for applications with strict latency requirements, the cold start problem in FaaS platforms can be a significant drawback. The most common techniques to alleviate this problem, mainly based on instance pre-warming and instance reuse mechanisms, are usually not well adapted to different application profiles and generally entail additional resource consumption. In this work, we analyze the effect of instance pre-warming and instance reuse on both application latency (response time) and resource consumption for a typical data analytics use case (a machine learning application for image classification) with different input data patterns. Furthermore, we propose extending the classical centralized cloud-based serverless FaaS platform to a two-tier distributed edge-cloud platform that brings computation closer to the data source and reduces network latencies.
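The instance pre-warming technique mentioned in the abstract is commonly realized by periodically invoking a function so the FaaS platform keeps an instance resident instead of letting it expire. The following is a minimal sketch of that idea only, not the platform or measurements used in the paper; FUNCTION_URL and PING_INTERVAL_S are hypothetical placeholders.

    # Keep-warm sketch: periodically ping an HTTP-triggered FaaS function so an
    # instance stays resident and subsequent requests avoid a cold start.
    # FUNCTION_URL and PING_INTERVAL_S are illustrative values, not from the paper.
    import time
    import urllib.request

    FUNCTION_URL = "https://example-faas.invalid/classify"  # hypothetical endpoint
    PING_INTERVAL_S = 300  # ping interval chosen below the platform's idle timeout

    def ping_once(url: str) -> float:
        """Send a lightweight warm-up request and return its response time in seconds."""
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                resp.read()
        except OSError as exc:
            print(f"warm-up request failed: {exc}")
        return time.monotonic() - start

    if __name__ == "__main__":
        while True:
            latency = ping_once(FUNCTION_URL)
            print(f"warm-up latency: {latency:.3f} s")
            time.sleep(PING_INTERVAL_S)

As the abstract notes, this kind of keep-warm traffic trades extra resource consumption for lower response-time variance, which is the trade-off the paper analyzes.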