WFCD - Foundations and Applications of Component-based Design 2009

October 11th, 2009, Grenoble, France, within ESWeek 2009; organised and funded by ARTIST

Abstracts


 
Albert Benveniste
INRIA
Contract Based System Design with Symbolic Synchronous Modal Specifications
Contract-based reasoning requires the following services regarding contracts (or interfaces): consistency, satisfaction, dominance, and compatibility.
Its application to system design additionally requires two types of combinators on contracts:
- parallel composition to reflect architectural features, and
- conjunction to support multiplicity of viewpoints and requirements.
We shall argue that modal specifications provide an adequate answer to these issues, with low-complexity algorithms. So far, comprehensive theories have been developed for models with interleaving semantics. A full theory and body of algorithms remain to be developed for synchronous systems with symbolic variables, as commonly found in embedded systems engineering.
In my talk I shall develop issues and solutions toward this objective.
This is a presentation of work by Benoît Caillaud, Jean-Baptiste Raclet, Eric Badouel, and Axel Legay.
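
As a concrete, if drastically simplified, illustration of the modal style (an untimed, non-symbolic Python toy; the names are mine, not the authors'): a modal specification separates transitions every implementation must offer from those it may offer, and modal refinement checks both directions at once.

class ModalSpec:
    """A modal specification: must-transitions are required of every
    implementation, may-transitions are permitted (must is a subset of may)."""
    def __init__(self, init, may, must):
        assert set(must) <= set(may)
        self.init, self.may, self.must = init, may, must

def refines(s, t):
    """Modal refinement s <= t for deterministic specs: every must of t
    is matched by a must of s, every may of s is allowed by t."""
    seen, work = set(), [(s.init, t.init)]
    while work:
        p, q = work.pop()
        if (p, q) in seen:
            continue
        seen.add((p, q))
        for (state, a), q2 in t.must.items():   # required behaviour of t
            if state == q:
                if (p, a) not in s.must:
                    return False
                work.append((s.must[(p, a)], q2))
        for (state, a), p2 in s.may.items():    # allowed behaviour of s
            if state == p:
                if (q, a) not in t.may:
                    return False
                work.append((p2, t.may[(q, a)]))
    return True

# An implementation that always acknowledges refines a spec that
# requires 'req' but merely allows 'ack'; the converse fails.
spec = ModalSpec(0, may={(0, "req"): 1, (1, "ack"): 0}, must={(0, "req"): 1})
impl = ModalSpec(0, may={(0, "req"): 1, (1, "ack"): 0},
                    must={(0, "req"): 1, (1, "ack"): 0})
print(refines(impl, spec), refines(spec, impl))  # True False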
 
Prof. Manfred Broy
TU München
Mandatory Properties of Component Concepts
Component concepts are of major significance in software and systems engineering. They are key concepts for basic system notions such as software and systems architecture, encapsulation, reuse and composition, and thus dominate the engineering methodology for the entire development process. The expressive power and richness of system descriptions and specifications based on particular instances of components depend crucially on the selected model of components. Several notions of components have been proposed, based on process algebras, state machines and various conceptions from programming and the field of distributed systems. The question of which component concept is the best choice for a particular approach is difficult to answer. One way to evaluate component and composition concepts is to look at their properties and how they support methodological concepts such as interface abstraction and information hiding, modularity and compositionality with respect to refinement, hierarchical decomposition and various ways of structuring a system's functionality.
 
Prof. Rolf Ernst
TU Braunschweig
Composition of Interval and Probabilistic Timing Models
Timing and schedulability analysis using models based on intervals of performance data, such as execution times, load and service, has been an active research topic and is successfully applied in practice, not least because component analyses can easily be composed and have demonstrated low computational effort. There is a strong incentive to combine such "worst case" (non-deterministic) models with probabilistic models to obtain less conservative results in cases where probabilistic timing guarantees are sufficient. It has been shown, however, that correct analysis imposes strong limitations on probabilistic dependencies that seem to make such models inapplicable in practice.
Nevertheless, when revisiting some of the proposed analysis solutions for practically important systems, it turns out that capturing dependencies with Fréchet bounds leads to meaningful probabilistic timing analysis which can be combined with analysis using worst-case interval models. Even a limited form of analysis composability is possible, with significantly fewer restrictions than in previous work. The talk will present first results.
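
For reference, the two standard Fréchet (Fréchet-Hoeffding) inequalities behind this idea bound the joint behaviour of two events A and B with no assumption about their dependence (notation mine, not the talk's):

\[
\max\bigl(0,\ P(A)+P(B)-1\bigr) \;\le\; P(A \cap B) \;\le\; \min\bigl(P(A),\,P(B)\bigr)
\]
\[
\max\bigl(P(A),\,P(B)\bigr) \;\le\; P(A \cup B) \;\le\; \min\bigl(1,\ P(A)+P(B)\bigr)
\]

For example, if two tasks each miss their deadline with probability at most $10^{-3}$, the probability that at least one of them misses is at most $2 \cdot 10^{-3}$, however strongly the two events are correlated.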
 
Prof. Tom Henzinger
IST Austria
From Boolean to Quantitative System Specifications
The boolean view holds that, given a model and a specification, the specification is either true or false in the model. State-transition models have been extended to accommodate certain numerical quantities, such as time stamps and probabilities. However, the boolean view of specifications is still prevalent, for example in checking whether a timed model satisfies a formula of a real-time logic, or whether a stochastic process satisfies a formula of a probabilistic logic. We advocate the fully quantitative view, where a specification can be satisfied to different degrees; that is, the value of a given specification in a given model is a real number rather than a boolean. We survey some initial results in this direction and present some potential applications, not only in the timed and probabilistic domains, but also for specifying and analyzing the resource use, cost, and reliability of a system.
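
To make the quantitative view concrete, here is a small sketch of one such valuation, a discounted reachability value chosen by me for illustration (the talk surveys a broader family): the value of "eventually reach the target" at a state is lambda^k, where k is the shortest distance to the target.

def discounted_reachability(succ, target, lam=0.9, iters=100):
    """Value iteration for the discounted value of reaching 'target'.
    succ: dict state -> list of successor states."""
    v = {s: (1.0 if s in target else 0.0) for s in succ}
    for _ in range(iters):
        v = {s: 1.0 if s in target
             else lam * max((v[t] for t in succ[s]), default=0.0)
             for s in succ}
    return v

# A three-state chain: the farther a state is from the target,
# the lower (not merely "false") its value.
succ = {"a": ["b"], "b": ["c"], "c": ["c"]}
print(discounted_reachability(succ, {"c"}))  # a: 0.81, b: 0.9, c: 1.0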
 
Prof. Bengt Jonsson
Uppsala University
Generating Interface Models from Test Data
The problem of generating descriptions of how a component interface can be used, or how a protocol entity behaves, has recently gained interest and has been addressed using automata learning techniques. These techniques make it possible to infer a finite state machine model of a component by observing its reaction to a set of selected test cases. We describe how they can be extended to handle various classes of data parameters. One goal is to generate descriptions of common communication protocols, such as TCP, which can be used in documentation, test case generation, and similar activities.
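
To give the flavour of the learning step, here is a toy fragment in the style of L*-based learners, assuming only a membership oracle for the black-box component; real learners add equivalence queries and a counterexample-driven loop, and the extension to data parameters discussed in the talk is not shown.

from itertools import product

def learn_rows(accepts, alphabet, suffixes, max_prefix_len=3):
    """Group prefixes by their row of oracle answers on the suffixes;
    each distinct row approximates one state of the inferred machine."""
    prefixes = [()] + [p for n in range(1, max_prefix_len + 1)
                       for p in product(alphabet, repeat=n)]
    rows = {}
    for p in prefixes:
        row = tuple(accepts(p + s) for s in suffixes)
        rows.setdefault(row, []).append(p)
    return rows

# Black box: accept exactly the words with an even number of 'a's.
accepts = lambda w: w.count("a") % 2 == 0
rows = learn_rows(accepts, alphabet=("a", "b"), suffixes=((), ("a",)))
print(len(rows))  # 2 -- the two states of the parity machine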
 
Prof. Kim Larsen
Aalborg University
Playing Games with Timed Interfaces
In this presentation we investigate the use of Timed I/O Game Automata as a formalism for specifying the behaviour of real-time components.
We use Timed I/O Game Automata (TIOGA) as specifications in order to allow for a variety of different implementations of a given component. A TIOGA is a timed game automaton with actions partitioned into input and output actions, input transitions being controllable (by the user or context of the component) and output transitions uncontrollable. In addition, a TIOGA may have several outputs enabled in a given state, as well as timing uncertainty with respect to output, thus leaving room for variety in the choice of output and time made by a given implementation.

We introduce the notion of timed alternating SIMULATION between TIOGAs. Similarly, CONSISTENCY between two TIOGAs is defined as non-emptiness of the intersection of their implementation sets. TIOGAs are (as their implementations) input-enabled. Thus, in contrast to the interface automata of de Alfaro and Henzinger, assumptions about how a TIOGA will be used are explicitly modelled by having unexpected inputs lead to completely underspecified states of the TIOGA. In combining two TIOGAs we may check for their COMPATIBILITY, in the sense of ensuring that an implementation of one will never provide an output not anticipated by (some implementation of) the other.

In the talk, we shall show how all the above notions — satisfiability, refinement, consistency and compatibility — can be reformulated as safety games on product timed games thus enabling the tool UPPAAL-Tiga to be applied.
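
To give the flavour of that reduction, here is an untimed, turn-based sketch of solving a safety game by a controllable-predecessor fixpoint; UPPAAL-Tiga solves games of this kind on timed state spaces, and the encoding below is a simplification of mine.

def solve_safety(ctrl_states, env_states, edges, bad):
    """Turn-based safety game: the controller picks the successor in
    ctrl_states, the environment in env_states. Returns the states from
    which the controller avoids 'bad' forever (a deadlocked controller
    state counts as losing)."""
    win = (set(ctrl_states) | set(env_states)) - set(bad)
    changed = True
    while changed:
        changed = False
        for s in list(win):
            succs = edges.get(s, [])
            ok = (any(t in win for t in succs) if s in ctrl_states
                  else all(t in win for t in succs))
            if not ok:
                win.discard(s)
                changed = True
    return win

# The controller at c0 avoids 'bad' by moving to e0, whose only
# environment move returns to c0.
edges = {"c0": ["e0", "bad"], "e0": ["c0"]}
print(solve_safety({"c0"}, {"e0"}, edges, {"bad"}))  # {'c0', 'e0'}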

The work has been partly carried out in collaboration with Thomas Chatain, ENS Cachan, France and Alexandre David, CISS, Aalborg University, Denmark, as well as with Ulrik Nyman, CISS, Aalborg University, Denmark, and Andrzej Wasowski, ITU, Copenhagen, Denmark.

 
Prof. Edward A. Lee
UC Berkeley
Ensuring Correct Composition of Components using Lattice-based Ontologies
Joint work with Man-Kit Leung (Berkeley), Thomas Mandl (Bosch), Elizabeth Latronico (Bosch), Charles Shelton (Bosch), and Stavros Tripakis (Berkeley).
Including semantic information in models helps to expose modeling errors early in the design process, engage a designer in a deeper understanding of the model, and standardize concepts and terminology across a development team. It is impractical, however, for model builders to manually annotate every component with semantic properties. This talk describes a correct, scalable and automated method to infer semantic properties using lattice-based ontologies, given relatively few manual annotations. Semantic concepts and their relationships are formalized as a lattice, and relationships within and between components are expressed as a set of constraints and acceptance criteria relative to the lattice. Our inference engine automatically infers properties wherever they are not explicitly specified. Our implementation leverages the infrastructure of the Ptolemy II type system to obtain efficient inference and consistency checking. We demonstrate the approach on a non-trivial Ptolemy II model of an adaptive cruise control system.
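
A minimal sketch of the inference step, using a powerset lattice in which join is set union and a port carrying more than one base concept signals a modeling error; the graph, port names and concepts are invented, and Ptolemy II supports arbitrary user-defined lattices and richer constraints.

def infer(edges, annotations):
    """edges: list of (src_port, dst_port); annotations: port -> concept.
    Propagate concept sets along edges until a fixpoint is reached."""
    props = {p: {c} for p, c in annotations.items()}
    changed = True
    while changed:
        changed = False
        for src, dst in edges:
            merged = props.get(dst, set()) | props.get(src, set())
            if merged != props.get(dst, set()):
                props[dst] = merged
                changed = True
    return props

# 'sensor.out' is annotated as a Time signal; the property flows to the
# unannotated downstream ports automatically.
edges = [("sensor.out", "filter.in"), ("filter.in", "filter.out"),
         ("filter.out", "plot.in")]
props = infer(edges, {"sensor.out": "Time"})
print(props["plot.in"])                               # {'Time'}
errors = [p for p, s in props.items() if len(s) > 1]  # conflicting ports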

 
Joseph Sifakis
VERIMAG Laboratory
Workshop Chair
Correct-by-construction Distributed Implementations for BIP
We propose a method for generating distributed implementations from BIP models. The method is based on two source-to-source transformations preserving observational equivalence: 1) a transformation from a global-state to a partial-state model, which replaces strong synchronization primitives in atomic components by send/receive primitives; 2) a transformation which replaces multiparty interactions with protocols based on asynchronous message passing. The latter transformation may use additional protocols ensuring mutual exclusion between conflicting interactions.
We propose different distributed implementations, ranging from fully decentralized to fully centralized ones. We study their properties, in particular performance criteria such as the degree of parallelism and the overhead for coordination. We also provide a tool for the automatic generation of implementations, together with experimental results and benchmarks for different case studies.
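
As a small illustration of the second transformation in its fully centralized variant (illustrative Python, not BIP's generated code; conflict resolution between overlapping interactions is omitted), a multiparty interaction a|b|c becomes a round of asynchronous ready/go messages.

import queue, threading

to_coord = queue.Queue()
to_comp = {name: queue.Queue() for name in ("a", "b", "c")}

def component(name):
    to_coord.put(("ready", name))        # offer to take part in the interaction
    assert to_comp[name].get() == "go"   # blocked until the coordinator commits
    print(name, "executes its part of the interaction")

def coordinator(parties):
    ready = set()
    while ready != set(parties):         # collect offers from every participant
        _, name = to_coord.get()
        ready.add(name)
    for name in parties:                 # commit: release all participants
        to_comp[name].put("go")

threads = [threading.Thread(target=component, args=(n,)) for n in ("a", "b", "c")]
threads.append(threading.Thread(target=coordinator, args=(("a", "b", "c"),)))
for t in threads: t.start()
for t in threads: t.join()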
 
Prof. Janos Sztipanovits
Vanderbilt University
Composition in Heterogeneous Systems
Component-based design is widely used in both computer science and engineering, but has been most successfully applied in areas where small system size and homogeneity limited complexity. Unfortunately, cyber-physical systems (CPS) are inherently heterogeneous in structure, component behaviors, and types of interactions among components. CPSs have three fundamentally different design layers: the Computation/Communication Layer, the Platform Layer and the Physical Layer. Due to the underlying fundamental difference in time and concurrency models, existing composition frameworks address compositionality and composability differently and neglect the interaction across the layers. The resulting problems are pervasive. For example, establishing compositional properties in physical systems (such as stability) can be nontrivial even in simple cases, and much of the focus in systems theory has been on standardized system architectures, such as feedback loops. However, compositionality theories on the Physical Layer neglect effects of computational implementation, such as time-varying delays and the nonlinear effects of quantization on compositionality. In this talk we focus on inter-layer interactions. We will discuss theories, methods and tools that enable utilizing compositionality results for specific properties in the design layers by using two independent principles: decoupling and regularity. These principles are fundamentally simplification strategies that decrease dimensionality and limit the complexity of interactions. We show the significance and effective use of four methods: (a) passivity and approximate passivity, (b) cross-layer abstractions, (c) orthogonal structures and (d) inter-layer isolation. We show mathematical techniques to orthogonalize design concerns and preserve compositionality with respect to the properties of interest in different layers.
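
As background for method (a), the textbook passivity inequality for a system with input u, output y, state x and a nonnegative storage function V is (standard control-theoretic form, not notation from the talk):

\[
\int_0^T u(t)^{\top} y(t)\,dt \;\ge\; V(x(T)) - V(x(0)) \qquad \text{for all } T \ge 0 .
\]

Its compositional value is that feedback and parallel interconnections of passive systems are again passive, which is what allows stability-related properties to survive composition across the design layers.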
 
Prof. Lothar Thiele
ETH Zurich
Combining State-Based and Analytic Component Models
We advocate a compositional and hybrid approach for obtaining key (performance) metrics of embedded systems. At its core, the proposed methodology models components either as flow-oriented descriptions or as networks of timed automata, where interaction among such heterogeneous components is modeled by streams of discrete activity triggers. We will discuss our experiences with this approach in terms of implementation and experimental results.
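
On the analytic ("flow-oriented") side, the classical real-time-calculus delay bound is the maximal horizontal distance between an upper arrival curve and a lower service curve; here is a discretized sketch with invented curves (the talk's contribution is coupling such analytic components with timed automata, which this fragment does not show).

def delay_bound(alpha, beta, horizon):
    """alpha(t): max arrivals in any window of length t;
    beta(t): min service in any window of length t.
    Assumes beta eventually dominates alpha, so the search terminates."""
    worst = 0
    for t in range(horizon):
        d = 0
        while beta(t + d) < alpha(t):   # smallest d serving all demand of age t
            d += 1
        worst = max(worst, d)
    return worst

# Periodic arrivals (period 5) against a rate-latency service curve
# (latency 2, one event per 3 time units).
alpha = lambda t: (t + 4) // 5 if t > 0 else 0
beta = lambda t: max(0, (t - 2) // 3)
print(delay_bound(alpha, beta, horizon=50))  # 4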
