Academic
Researcher
Contact information
Name: Eugenio Rubio Drosdov
Publications
Rubio-Drosdov, Eugenio; Díaz-Sánchez, Daniel; Marín-López, Andrés; Almenares-Mendoza, Florina
A Framework for Microservice Migration and Performance Assessment (Proceedings Article)
In: pp. 291–299, 2020, ISBN: 978-1-4503-5988-7.
@inproceedings{pa059,
title = {A Framework for Microservice Migration and Performance Assessment},
author = {Eugenio Rubio-Drosdov and Daniel Díaz-Sánchez and Andrés Marín-López and Florina Almenares-Mendoza},
doi = {10.3233/AISE200053},
isbn = {978-1-4503-5988-7},
year = {2020},
date = {2020-06-25},
urldate = {2020-06-25},
pages = {291--299},
abstract = {In a large Smart Grid, smart meters produce a tremendous amount of data that is hard to process, analyze and store. Fog computing is an environment that offers a place for collecting, computing and storing smart meter data before transmitting them to the cloud. Due to the distributed, heterogeneous and resource-constrained nature of fog computing nodes, fog applications need to be developed as a collection of interdependent, lightweight modules. Since this concept aligns with the goals of microservices architecture (MSA), efficient placement of microservice-based Smart Grid applications within fog environments has the potential to fully leverage the capabilities of fog devices. Microservice architecture is an emerging software architectural style. It is based on microservices and provides several advantages over a monolithic solution, such as autonomy, composability, scalability, and fault tolerance. However, optimizing the migration of microservices from one fog environment to another while assuring a certain level of quality is still a significant open issue. In this paper, we propose an approach for assisting the migration of microservices in MSA-based Smart Grid systems, based on the analysis of their performance within the possible candidate destinations. Developers create microservices that will eventually be deployed on a given infrastructure. Either the developer, considering the design, or the entity deploying the service has good knowledge of the quality required by the microservice. They can therefore create tests that determine whether a destination meets the requirements of a given microservice and embed these tests as part of the microservice. Our goal is to automate the execution of performance tests by attaching a specification that contains the test parameters to each microservice.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
In a large Smart Grid, smart meters produce a tremendous amount of data that is hard to process, analyze and store. Fog computing is an environment that offers a place for collecting, computing and storing smart meter data before transmitting them to the cloud. Due to the distributed, heterogeneous and resource-constrained nature of fog computing nodes, fog applications need to be developed as a collection of interdependent, lightweight modules. Since this concept aligns with the goals of microservices architecture (MSA), efficient placement of microservice-based Smart Grid applications within fog environments has the potential to fully leverage the capabilities of fog devices. Microservice architecture is an emerging software architectural style. It is based on microservices and provides several advantages over a monolithic solution, such as autonomy, composability, scalability, and fault tolerance. However, optimizing the migration of microservices from one fog environment to another while assuring a certain level of quality is still a significant open issue. In this paper, we propose an approach for assisting the migration of microservices in MSA-based Smart Grid systems, based on the analysis of their performance within the possible candidate destinations. Developers create microservices that will eventually be deployed on a given infrastructure. Either the developer, considering the design, or the entity deploying the service has good knowledge of the quality required by the microservice. They can therefore create tests that determine whether a destination meets the requirements of a given microservice and embed these tests as part of the microservice. Our goal is to automate the execution of performance tests by attaching a specification that contains the test parameters to each microservice.
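The idea outlined in the abstract (each microservice carries a performance-test specification, and a migration step runs those tests against candidate fog nodes before moving the service) can be illustrated with a minimal sketch. The Python snippet below is not the published framework; the names TestSpec, Candidate and choose_destination, as well as the latency and throughput parameters, are hypothetical and only show the kind of check the approach automates.

# Minimal sketch (assumed, not the authors' API): a microservice bundles a
# performance-test specification, and a helper probes each candidate fog
# node and keeps the first one whose measurements satisfy that specification.
import random
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class TestSpec:
    """Performance requirements embedded with the microservice."""
    max_latency_ms: float      # worst acceptable round-trip latency
    min_throughput_rps: float  # minimum sustained requests per second

@dataclass
class Candidate:
    """A possible destination fog node and a probe that measures it."""
    name: str
    probe: Callable[[], Tuple[float, float]]  # returns (latency_ms, throughput_rps)

def meets_spec(spec: TestSpec, latency_ms: float, throughput_rps: float) -> bool:
    """Check the measured values against the embedded specification."""
    return latency_ms <= spec.max_latency_ms and throughput_rps >= spec.min_throughput_rps

def choose_destination(spec: TestSpec, candidates: List[Candidate]) -> Optional[Candidate]:
    """Run the embedded performance test on every candidate destination and
    return the first node that satisfies the microservice's requirements."""
    for node in candidates:
        latency_ms, throughput_rps = node.probe()
        if meets_spec(spec, latency_ms, throughput_rps):
            return node
    return None

if __name__ == "__main__":
    # Example usage with simulated measurements for two hypothetical fog nodes.
    spec = TestSpec(max_latency_ms=50.0, min_throughput_rps=200.0)
    candidates = [
        Candidate("fog-node-a", lambda: (random.uniform(30, 80), random.uniform(150, 300))),
        Candidate("fog-node-b", lambda: (random.uniform(10, 40), random.uniform(250, 400))),
    ]
    best = choose_destination(spec, candidates)
    print("migrate to:", best.name if best else "no suitable destination")

In a real deployment the probe would execute the test parameters attached to the microservice against the remote node rather than return simulated numbers; the point of the sketch is only that the decision to migrate is driven by tests shipped with the service itself.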