Lossy Compression applied to the Worst Case Execution Time Problem
David Jack Griffin, 2013
Abstract Interpretation and Symbolic Model Checking are powerful techniques in the field of testing. These techniques can verify the correctness of systems by exploring the state space that the systems occupy. As this would normally be intractable for even moderately complicated systems, both techniques employ approximations to reduce the size of the state space considered without compromising the reliability of the results. When applied to Real-time Systems, and in particular to Worst Case Execution Time estimation, Abstract Interpretation and Symbolic Model Checking are primarily used to verify the temporal properties of a system. This yields a large number of applications for the techniques, from verifying the properties of components to determining the values that given variables may take. In turn, it presents researchers with a large problem area: devising the approximations required to reduce the size of the state space whilst ensuring the analysis remains safe.
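As a concrete illustration of the kind of approximation involved, the sketch below uses a standard interval abstraction (a textbook example, not a construction taken from the thesis): instead of tracking every concrete value a variable may take, only a lower and an upper bound are kept, which safely over-approximates the concrete set.

    # Minimal interval-domain sketch: sound but deliberately imprecise.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Interval:
        lo: float
        hi: float

        def join(self, other: "Interval") -> "Interval":
            # The join widens both bounds so the result covers both inputs.
            return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

        def add(self, other: "Interval") -> "Interval":
            # Abstract addition: every concrete sum lies inside the result.
            return Interval(self.lo + other.lo, self.hi + other.hi)

    # Concrete values {1, 3, 7} abstract to [1, 7]; adding [0, 2] yields [1, 9],
    # which contains every concrete sum but also values (e.g. 4) that can
    # never occur. This loss of precision is the price of tractability.
    x = Interval(1, 7)
    y = Interval(0, 2)
    print(x.add(y))   # Interval(lo=1, hi=9)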
This thesis examines the use of Abstract Interpretation and Symbolic Model Checking, focusing in particular on the methods used to create approximations. To this end, it introduces the ideas of Information Theory and Lossy Compression. Information Theory provides a structured framework for quantifying, or valuing, information. In other domains, Lossy Compression utilises this framework to achieve reasonably accurate approximations; unlike Abstract Interpretation or Symbolic Model Checking, however, lossy compression offers guidance on how to identify information that can be removed with minimal consequences. Having introduced these ideas, the thesis presents a generic approach for applying lossy compression to problems encountered in Worst Case Execution Time estimation.
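The following sketch (plain Shannon entropy, again not code from the thesis) shows the sense in which Information Theory assigns a value to information: the entropy of a trace bounds the average number of bits needed to describe each symbol, so low-entropy data is a natural candidate for a lossy scheme to collapse with little consequence.

    # Shannon entropy of a symbol sequence, in bits per symbol.
    from collections import Counter
    from math import log2

    def entropy(symbols):
        counts = Counter(symbols)
        total = len(symbols)
        return -sum((c / total) * log2(c / total) for c in counts.values())

    # A heavily skewed trace carries little information per symbol, so merging
    # or discarding its rare symbols loses very little; a uniform trace has no
    # such cheap information to remove.
    print(entropy("aaaaaaab"))  # ~0.54 bits per symbol
    print(entropy("abcdefgh"))  # 3.0 bits per symbol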
To test that the generic approach works, two distinct problems in Worst Case Execution Time estimation are considered. The first is providing a Must/May analysis for the PLRU cache: whilst PLRU caches are common in practice, their logical complexity renders them difficult to analyse. The second is loop bound analysis, with a particular focus on removing the need for information supplied by annotations, as annotations are inherently unverifiable.
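To give a feel for why PLRU is awkward to analyse, the sketch below simulates a single 4-way PLRU cache set. It is an illustrative model only, not the Must/May analysis developed in the thesis: three tree bits choose the victim, and every hit rewrites bits along its path, so a line's distance from eviction depends on the interleaving of accesses rather than on a simple age.

    # Hedged sketch of a 4-way PLRU (Pseudo-LRU) cache set.
    class PLRUSet:
        def __init__(self):
            self.ways = [None, None, None, None]   # cached addresses
            self.bits = [0, 0, 0]                  # [root, left, right] tree bits

        def _victim(self):
            # Follow the tree bits to the pseudo-least-recently-used way.
            if self.bits[0] == 0:
                return 0 if self.bits[1] == 0 else 1
            return 2 if self.bits[2] == 0 else 3

        def _touch(self, way):
            # Point every bit on the path *away* from the accessed way.
            self.bits[0] = 1 if way in (0, 1) else 0
            if way in (0, 1):
                self.bits[1] = 1 if way == 0 else 0
            else:
                self.bits[2] = 1 if way == 2 else 0

        def access(self, addr):
            if addr in self.ways:
                self._touch(self.ways.index(addr))
                return "hit"
            victim = self._victim()
            self.ways[victim] = addr
            self._touch(victim)
            return "miss"

    s = PLRUSet()
    for a in ["a", "b", "c", "d", "c", "e"]:
        print(a, s.access(a), s.ways)
    # "e" evicts "d" even though "a" is the least recently used line: the tree
    # bits, not true ages, pick the victim, which is what complicates a static
    # Must/May analysis of PLRU caches.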
BibTeX Entry
@phdthesis{PHD_Griffin,
  author    = {David Jack Griffin},
  day       = {1},
  month     = {September},
  publisher = {University of York},
  school    = {University of York},
  title     = {Lossy Compression applied to the Worst Case Execution Time Problem},
  url       = {http://etheses.whiterose.ac.uk/8250/},
  year      = {2013}
}