Reconciling Predictability and Coherent Caching

Ayoosh Bansal, Jayati Singh, Yifan Hao, Jen Yang Wen, Renato Mancuso, Marco Caccamo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Real-time systems are required to respond to their physical environment within predictable time bounds. While multi-core platforms provide tremendous computational power and throughput, they also introduce new sources of unpredictability. For parallel applications with data shared across multiple cores, the overhead of maintaining data coherence is a major cause of execution time variability. This source of variability can be eliminated by application-level control that limits data caching at different levels of the cache hierarchy, removing the need for explicit coherence machinery on the selected data. We show that such control can reduce the worst-case write request latency on shared data by 52%. Benchmark evaluations show that the proposed technique has a minimal impact on average performance.

Original language: English (US)
Title of host publication: 2020 9th Mediterranean Conference on Embedded Computing, MECO 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728169477
State: Published - Jun 2020
Event: 9th Mediterranean Conference on Embedded Computing, MECO 2020 - Budva, Montenegro
Duration: Jun 8 2020 to Jun 11 2020

Publication series

Name: 2020 9th Mediterranean Conference on Embedded Computing, MECO 2020

Conference: 9th Mediterranean Conference on Embedded Computing, MECO 2020


Keywords

  • cache coherence
  • hardware/software co-design
  • memory contention
  • worst-case execution time

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Computer Science Applications
  • Signal Processing
  • Safety, Risk, Reliability and Quality
  • Instrumentation

