14th International Conference on Runtime Verification

September 22-25, 2014, Toronto, Canada


1st Intl. Competition of Software for Runtime Verification (CSRV-2014)

held with RV 2014 in Toronto, Canada

 

CSRV-2014 is the 1st International Competition of Software for Runtime Verification, held as part of the 14th International Conference on Runtime Verification (RV 2014). The event will take place in September 2014 in Toronto, Canada. CSRV-2014 will draw attention to the invaluable effort of software developers and researchers who contribute to this field by providing the community with new or updated tools, libraries, and frameworks for the instrumentation and runtime verification of software.

Runtime verification is a technique for analyzing software at execution time: it extracts information from a running system and checks whether the observed behaviors satisfy or violate the properties of interest. During the last decade, many important tools and techniques have been developed and successfully employed. However, there is a pressing need to compare such tools and techniques, since we currently lack a common benchmark suite as well as scientific evaluation methods for validating and testing new prototype runtime verification tools.
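
To make the idea concrete, here is a minimal sketch of an online monitor in Java (a purely illustrative example; the class, the event names, and the open/close property are made up for this page and are not taken from any competition tool or benchmark). A finite-state checker receives "open"/"close" events from instrumentation hooks in the observed program and flags a violation when a close occurs with no matching open:

    import java.util.List;

    // Illustrative online monitor (hypothetical example): checks the property
    // "no close without a matching open, and every open is eventually closed".
    public class FileMonitor {
        private int openCount = 0;        // monitor state: files currently open
        private boolean violated = false; // sticky error state

        // Called by an instrumentation hook at each relevant program event.
        public void onEvent(String event) {
            if (violated) return;         // once violated, stay violated
            switch (event) {
                case "open":
                    openCount++;
                    break;
                case "close":
                    if (openCount == 0) violated = true; // close with no open
                    else openCount--;
                    break;
                default:
                    break;                // ignore irrelevant events
            }
        }

        // End-of-run verdict: no violation and every open was closed.
        public boolean verdict() {
            return !violated && openCount == 0;
        }

        public static void main(String[] args) {
            FileMonitor m = new FileMonitor();
            for (String e : List.of("open", "open", "close", "close")) {
                m.onEvent(e);
            }
            System.out.println(m.verdict() ? "property satisfied"
                                           : "property violated");
        }
    }

An online tool would attach such a monitor to the running program (e.g., via AspectJ-style instrumentation for Java), whereas an offline tool checks recorded traces instead.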

The main aims of the CSRV-2014 competition are to:

  • Stimulate the development of new, efficient, and practical runtime verification tools, as well as the maintenance of those already developed.
  • Produce a benchmark suite for runtime verification tools by sharing case studies and programs that researchers and developers can use in the future to test and validate their prototypes.
  • Discuss the metrics employed for comparing the tools.
  • Provide a comparison of the tools on different benchmarks, evaluated using different criteria.
  • Enhance the visibility of the presented tools among the different communities involved in software monitoring (verification, software engineering, cloud computing, and security).

Please direct any enquiries to the competition co-organizers:

  • Ezio Bartocci (Vienna University of Technology, Austria);
  • Borzoo Bonakdarpour (McMaster University, Canada);
  • Yliès Falcone (Université Joseph Fourier, France).


CSRV-2014 Jury
The CSRV Jury will include a representative for each participating team and some representatives of the Demonstration Tools Committee of the Runtime Verification conference.

 
Call for Participation
The main goal of the CSRV-2014 competition is to compare tools for runtime verification. We invite and encourage participation with benchmarks and tools for the competition. The competition will consist of three main tracks, based on the input language used:

  • Track on monitoring Java programs (online monitoring)
  • Track on monitoring C programs (online monitoring)
  • Track on monitoring of traces (offline monitoring); a sketch contrasting this offline setting with the online one follows this list
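
As a rough illustration of the difference between the tracks, the sketch below checks the same kind of open/close property as above, but offline, by replaying a recorded trace rather than observing a live run. It is hypothetical: the file name trace.log and the one-event-per-line format are made up and are not the competition's trace format.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    // Illustrative offline checker (hypothetical): replays a recorded trace,
    // one event name per line, and checks the open/close property over it.
    public class OfflineChecker {
        public static void main(String[] args) throws IOException {
            int openCount = 0;
            boolean violated = false;
            try (BufferedReader in = new BufferedReader(new FileReader("trace.log"))) {
                String event;
                while ((event = in.readLine()) != null) {
                    if ("open".equals(event)) {
                        openCount++;
                    } else if ("close".equals(event)) {
                        if (openCount == 0) { violated = true; break; }
                        openCount--;
                    } // all other events are ignored
                }
            }
            boolean ok = !violated && openCount == 0;
            System.out.println(ok ? "trace satisfies property"
                                  : "trace violates property");
        }
    }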

The competition will follow three phases:

  • Benchmark/specification collection phase - participants are invited to submit their benchmarks (C or Java programs and/or traces). The organizers will collect them in a common, publicly available repository. Participants will then train their tools on the shared benchmarks;
  • Monitor collection phase - participants are invited to submit their monitors. Participants whose tools/monitors meet the qualification requirements (see the following section for more information) will be qualified for the evaluation phase;
  • Evaluation phase - the qualified tools will be evaluated by running the benchmarks and will be ranked using different criteria (e.g., memory utilization/overhead, CPU utilization/overhead, ...); a sketch of one way overhead could be estimated appears below. The final results will be presented at the RV 2014 conference.
Please refer to the dedicated pages for more details on the three phases.
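
To give a feel for the overhead criteria, here is a hedged sketch of how runtime overhead could be estimated: run a benchmark with and without monitoring and compare wall-clock time and memory in use. The actual CSRV-2014 evaluation procedure, benchmarks, and formulas are defined by the organizers; everything below, including the placeholder workload, is illustrative only.

    // Illustrative overhead estimate (hypothetical, not the official
    // evaluation procedure): time a workload with and without monitoring.
    public class OverheadSketch {
        static void runBenchmark(boolean monitored) {
            // Placeholder workload; a real evaluation would execute the
            // (possibly instrumented) benchmark program here.
            long sum = 0;
            for (int i = 0; i < 50_000_000; i++) sum += monitored ? i ^ 1 : i;
            if (sum == 42) System.out.println(); // defeat dead-code elimination
        }

        static long timeMillis(boolean monitored) {
            long start = System.nanoTime();
            runBenchmark(monitored);
            return (System.nanoTime() - start) / 1_000_000;
        }

        public static void main(String[] args) {
            long base = timeMillis(false); // uninstrumented run
            long mon  = timeMillis(true);  // monitored run
            double overhead = (double) mon / Math.max(base, 1);
            System.out.printf("baseline %d ms, monitored %d ms, overhead x%.2f%n",
                              base, mon, overhead);
            Runtime rt = Runtime.getRuntime();
            System.out.printf("heap in use: %d MiB%n",
                              (rt.totalMemory() - rt.freeMemory()) >> 20);
        }
    }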
 
 
Important Dates
  • Dec. 15, 2013: Declaration of intent (by email to the organizers)
  • March 1, 2014: Submission deadline for benchmark programs and the properties to be monitored
  • March 15, 2014: Tool training by participants starts
  • July 20, 2014: Monitor submission
  • September 1, 2014: Notifications and reviews