Performance assessment of OMG compliant data distribution middleware / Esposito, Christiancarmine; Russo, Stefano; Di Crescenzo, D. - PRINT. - (2008), pp. 1-8. (Paper presented at the IEEE International Symposium on Parallel and Distributed Processing, IPDPS 2008, held in Washington DC, 14-18 April 2008) [10.1109/IPDPS.2008.4536566].
Performance assessment of OMG compliant data distribution middleware
ESPOSITO, CHRISTIANCARMINE; RUSSO, STEFANO
2008
Abstract
Event-driven architectures (EDAs) are widely used to make distributed mission-critical software systems more efficient and scalable. In the context of EDAs, the Data Distribution Service (DDS) is a recent standard by the Object Management Group (OMG) that offers rich support for quality-of-service and balances predictable behavior with implementation efficiency. The DDS specification does not prescribe how messages are delivered, so several architectures are available today. This paper focuses on the performance assessment of OMG DDS-compliant middleware technologies. It provides three contributions to the study of evaluating the performance of DDS implementations: 1) it describes the challenges to be addressed; 2) it proposes possible solutions; 3) it defines a representative workload scenario for evaluating the performance and scalability of DDS platforms. At the end of the paper, a case study of DDS performance assessment, performed with the proposed benchmark, is presented.
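To make the quality-of-service support mentioned in the abstract concrete, the sketch below shows how a DDS publisher configures standardized QoS policies before writing data. It is a minimal illustration of the standardized API surface, not the benchmark used in the paper: it assumes the OMG DDS ISO C++ PSM (a later API than the classic PSM available in 2008) and a hypothetical IDL-generated Temperature type.

    // Minimal sketch of a DDS publisher using the OMG ISO C++ PSM.
    // "Temperature.hpp" stands in for a hypothetical IDL-generated type;
    // real projects generate it from an IDL file with a vendor tool.
    #include <dds/dds.hpp>
    #include "Temperature.hpp"

    int main() {
        // Join DDS domain 0; matching participants discover each other.
        dds::domain::DomainParticipant participant(0);

        // A typed topic: the named unit of data the middleware distributes.
        dds::topic::Topic<Temperature> topic(participant, "TempReadings");
        dds::pub::Publisher publisher(participant);

        // Standardized QoS policies: reliable delivery with a bounded
        // per-instance history, two of the knobs a DDS benchmark varies.
        dds::pub::qos::DataWriterQos qos = publisher.default_datawriter_qos();
        qos << dds::core::policy::Reliability::Reliable()
            << dds::core::policy::History::KeepLast(32);

        dds::pub::DataWriter<Temperature> writer(publisher, topic, qos);

        // Publish one sample; delivery honors the QoS contract above.
        Temperature sample(1 /* sensor id */, 21.5 /* degrees C */);
        writer.write(sample);
        return 0;
    }

On the subscribing side, a DataReader requests its own policies, and the middleware matches offered against requested QoS at discovery time; this offered/requested model is what lets a benchmark exercise the same workload under different delivery guarantees.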