Marcelo Fiore
A study of type isomorphisms with application to algorithmics
(Joint work with Tom Leinster)
We consider the automorphism group of the finitistic (that is, non-recursive)
programs on the datatype B = B * B. Specifically, we show that when
* is interpreted as the product type constructor, the resulting group
is isomorphic to the group introduced by Richard Thompson known as V.
Further, we also establish that when * is interpreted as a tensor-product
type constructor the resulting group is isomorphic to Richard Thompson's
subgroup of V known as F. As an application, and following Sean Cleary,
we then explain how a well-known finite presentation of the group F
can be used to give an algorithm for transforming any two binary trees
with the same number of leaves into each other by performing rotations
either at the root or at the root's left child.
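For illustration, here is a small Python sketch (not the construction from the talk): any two binary trees with the same number of leaves can be rotated into the same right comb, so a rotation sequence between them always exists. The presentation of F discussed above restricts rotations to the root and the root's left child, whereas this sketch rotates along the whole right spine.

def rotate_right(t):
    """Right rotation at the root: ((a, b), c) -> (a, (b, c))."""
    (a, b), c = t
    return (a, (b, c))

def leaves(t):
    return 1 if t is None else leaves(t[0]) + leaves(t[1])

def to_right_comb(t):
    """Rotate t into the right-comb shape (None, (None, (None, ...)))."""
    if t is None:
        return t
    left, right = t
    while left is not None:              # left child is internal: rotate at the current root
        t = rotate_right(t)
        left, right = t
    return (None, to_right_comb(right))  # continue down the right spine

# Any two trees with the same number of leaves have the same right comb, so
# the rotation sequence of one followed by the reverse of the other
# transforms one tree into the other.
t1 = ((None, None), (None, None))
t2 = (((None, None), None), None)
assert leaves(t1) == leaves(t2) and to_right_comb(t1) == to_right_comb(t2)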
Marcelo Frias
Tool Support for the Argentum Project
In this talk I presented an overview of the Argentum project, and of the
Calculus for Omega Closure Fork Algebras that serves as the homogeneous
formalism into which different logics can be translated. I mentioned that
the close resemblance between Omega Closure Fork Algebras and Alloy
suggests using the latter to analyze properties of the former.
Then I moved on to prove that the Calculus for Omega Closure Fork Algebras
provides a complete calculus for Alloy too. Finally, I presented the
DynAlloy tool and showed that it is better suited than plain Alloy for
analyzing properties of executions of Alloy operations.
References
Frias M.F., Galeotti J.P., Lopez Pombo C.G. and Aguirre N.,
DynAlloy: Upgrading Alloy with Actions,
in Proceedings of ICSE 2005, St. Louis, USA, May 2005.
Frias M.F., Lopez Pombo C.G., Baum G.A., Aguirre N. and Maibaum T.S.E.,
Reasoning About Static and Dynamic Properties in Alloy: A Purely Relational Approach,
to appear in ACM Transactions on Software Engineering and Methodology (TOSEM).

David de Frutos-Escrig
Bisimulations up-to for the linear time-branching time spectrum and beyond
(Joint work with Carlos Gregorio Rodriguez)
We present a nice characterization of all the semantics in the linear
time-branching time spectrum by means of bisimulations up-to, which allows
a coalgebraic treatment of any of them. The idea is to use the preorder
relation generating each of these semantics, instead of the strong
bisimulation equivalence used in the classical up-to approach. Although
this coalgebraic approach is mainly interesting in the case of infinite
processes, we first prove our results for finite processes and then extend
them by applying a continuity result. Ready simulation equivalence, which
is rather close to bisimulation equivalence, represents the limit for our
first main theorem. But we also prove a second theorem that gives the
desired characterization for any simulation semantics fulfilling some
adequate properties which, we think, distinguish the "good" simulation
semantics and entail many of their interesting properties. These include,
for instance, the nested simulation semantics, which are stronger than
ready simulation and therefore were not covered by our first theorem.
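To fix intuitions, the following Python sketch (a finite toy LTS with made-up states, not the coalgebraic development of the talk) computes the plain simulation preorder by greatest-fixpoint refinement; its kernel is the induced equivalence, and preorders of this kind are the starting point of the up-to technique.

def simulation_preorder(states, trans):
    """trans: set of (p, action, q).  Returns the largest simulation R,
    where (p, q) in R means that q simulates p."""
    succ = {}
    for (p, a, q) in trans:
        succ.setdefault((p, a), set()).add(q)
    R = {(p, q) for p in states for q in states}
    changed = True
    while changed:
        changed = False
        for (p, q) in set(R):
            # every move of p must be matched by some move of q that stays in R
            for (p2, a, p_next) in trans:
                if p2 != p:
                    continue
                if not any((p_next, q_next) in R for q_next in succ.get((q, a), set())):
                    R.discard((p, q))
                    changed = True
                    break
    return R

states = {"s", "t", "u"}
trans = {("s", "a", "u"), ("t", "a", "u"), ("t", "b", "u")}
R = simulation_preorder(states, trans)
assert ("s", "t") in R and ("t", "s") not in R   # t simulates s but not vice versa
equiv = {(p, q) for (p, q) in R if (q, p) in R}  # kernel: mutual simulation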
Therefore we have, in practice, a bootstrapping mechanism that allows us to
infer any of the semantic equivalences starting from a part of the
corresponding preorders. However, the usual way to provide such an initial
part of the preorders would be a partial application of their
axiomatizations. Unfortunately, the k-nested semantics have no such finite
axiomatization, and so it seems not easy to apply their coalgebraic
characterization directly in practice. We therefore looked for a more
modular approach that takes into account the nested character of these
semantics. Thus we developed our counting bisimulations, which allow us to
put all the nested simulation semantics together, characterizing them all
by a single bisimulation-like game.
Once again our approach is rather general, since it is based on what we
call constrained simulations, which generate a family of different
semantics depending on the relation used as the concrete constraint.
Whenever such an initial relation is finitely axiomatizable, all the
generalized nested simulation semantics have a common (mutually recursive)
conditional axiomatization, which in fact can be seen as a finite
(unconditional) axiomatization if the nesting levels are considered as a
third argument of a single relation.
Therefore we have established a common framework for studying all the
different semantics in van Glabbeek's spectrum, including the nested
semantics, for which no finite axiomatization exists.

Radu Grosu
Efficient Modeling of Excitable Cells Using Hybrid Automata
We present an approach to modelling complex biological systems based on
hybrid automata (HA), which combine discrete transition graphs with
continuous dynamical systems. Our goal is to efficiently capture the behaviour
of excitable cells previously modeled by systems of nonlinear differential
equations. In particular, we derive HA models from the Hodgkin-Huxley
model of the giant squid axon, the Luo-Rudy dynamic model of a guinea
pig ventricular cell, and a model of a neonatal rat ventricular myocyte.
Our much simpler HA models are able to successfully capture the action-potential
morphology of the different cells, as well as reproduce typical excitable
cell characteristics, such as refractoriness (period of non-responsiveness
to external stimulation) and restitution (adaptation to pacing rates).
To model electrical wave propagation in a cell network, the single-cell
HA models are linked to a classical 2D spatial model. The resulting
simulation framework exhibits significantly improved computational efficiency
in modeling complex wave patterns, such as the spiral waves underlying
pathological conditions in the heart.
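The following toy Python sketch (all constants invented; it is not one of the HA models derived in the talk) shows the general shape of such a model: piecewise dynamics per mode, threshold guards for the discrete jumps, and refractoriness arising because the stimulus only acts in the resting mode.

def simulate(stim_times, T=600.0, dt=0.1):
    """Toy excitable-cell hybrid automaton, integrated with explicit Euler."""
    v, mode, t = 0.0, "resting", 0.0
    trace = []
    while t < T:
        stim = any(s <= t < s + 1.0 for s in stim_times)   # short stimulus pulses
        # continuous flow of the current mode
        if mode == "resting":
            dv = -0.1 * v + (5.0 if stim else 0.0)
        elif mode == "upstroke":
            dv = 8.0 * (1.2 - v)          # fast rise toward the peak
        else:                             # "repolarizing"
            dv = -0.05 * v                # slow decay back to rest
        v += dt * dv
        # discrete jumps guarded by thresholds
        if mode == "resting" and v > 0.2:
            mode = "upstroke"
        elif mode == "upstroke" and v > 1.0:
            mode = "repolarizing"
        elif mode == "repolarizing" and v < 0.05:
            mode = "resting"              # cell is excitable again (refractory period ends)
        trace.append((t, v, mode))
        t += dt
    return trace

trace = simulate(stim_times=[50.0, 300.0])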

Masami Hagiya
Satisfiability Checking of Two-way Modal mu-calculus and Its Applications
Branching-time temporal logic can naturally be used to express properties of trees
and graphs, where two-way logic is more useful because it can handle
both forward and backward edges. We have implemented a tableaux-based
satisfiability checker for alternation-free two-way modal mu-calculus
using BDDs (binary decision diagrams) and applied it to schema checking
in XML and shape analysis. In particular, shape analysis can be considered
as tableaux construction with suitable extensions to temporal logic:
two-way, nominal, and global modality.
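As a drastically simplified point of reference (plain modal logic K only; no fixpoints, no backward modalities, no BDDs, so none of the machinery of the checker described above), a tableau satisfiability test can be sketched in a few lines of Python:

def sat(fs):
    """fs: set of NNF formulas.  Formulas: ('p', name), ('np', name),
    ('and', f, g), ('or', f, g), ('dia', f), ('box', f)."""
    fs = set(fs)
    # saturate boolean connectives
    for f in list(fs):
        if f[0] == 'and':
            return sat((fs - {f}) | {f[1], f[2]})
        if f[0] == 'or':
            return sat((fs - {f}) | {f[1]}) or sat((fs - {f}) | {f[2]})
    # clash check on literals
    atoms = {f[1] for f in fs if f[0] == 'p'}
    negs = {f[1] for f in fs if f[0] == 'np'}
    if atoms & negs:
        return False
    # each diamond spawns a successor world that must also be satisfiable
    boxes = [f[1] for f in fs if f[0] == 'box']
    return all(sat(set([f[1]] + boxes)) for f in fs if f[0] == 'dia')

p, q, np = ('p', 'p'), ('p', 'q'), ('np', 'p')
assert not sat({('dia', p), ('box', np)})   # <>p together with []~p clashes in the successor
assert sat({('dia', p), ('box', q)})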

Deriving Bisimulation Congruences for Graph Transformation Systems
This talk is concerned with the derivation of labelled transitions and bisimulation
congruences from unlabelled reaction rules. After a short introduction
to categorical constructions such as pushouts and pullbacks, we show
how to address this problem in the DPO (double-pushout) approach to
graph rewriting. This is done by deriving so-called borrowed contexts
as labels. It can be shown that bisimilarity for this labelled transition
system is indeed a congruence.
Then, in the second part of the talk, we compare our work to closely related
work by Leifer/Milner and Sassone/Sobocinski in which labels are derived
using the categorical notion of relative pushout. We explain how problems
with automorphisms can arise in this setting and how they are solved
by the various approaches.

Merged Processes - a New Condensed Representation of Petri Net Behaviour
(Joint work with V.Khomenko, A.Kondratyev and W.Vogler)
Model checking based on Petri net unfoldings is an approach widely applied
to cope with the state space explosion problem. In this paper we propose
a new condensed representation of a Petri net's behaviour called merged
processes, which copes well not only with concurrency, but also with
the other sources of state space explosion, viz. sequences of choices
and non-safeness. Moreover, this representation is sufficiently similar
to the traditional unfoldings, so that a large body of results developed
for the latter can be re-used. Experimental results indicate that the
proposed representation of a Petri net's behaviour alleviates the state
space explosion problem to a significant degree and is
suitable for model checking.
Optimal Controller Synthesis for Real-Time Systems
(Joint work with Patricia Bouyer, Franck Cassez, Alexandre David, Emmanuel Fleury and Didier Lime)
This talk reports on work on controller synthesis for real-time systems
through the computation of winning strategies for timed games. Though such
games have long been known to be decidable, there has been a lack of
efficient, truly on-the-fly algorithms for analysing timed games.
Our work provides an efficient symbolic, forward and on-the-fly algorithm
for solving timed games with respect to reachability and safety properties.
The algorithm is theoretically optimal, having worst-case complexity
linear in the underlying region graph. In practice the algorithm may
terminate long before having explored the entire state space, as its
on-the-fly nature allows it to terminate as soon as a winning strategy has
been identified. Also, the individual steps of the algorithm are carried
out efficiently by using so-called zones as the underlying data structure.
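For reference, the following Python sketch shows only the untimed, finite-state analogue of the problem: the backward fixpoint defining the controller's winning states in a turn-based reachability game (states, edges and goal are made up). The algorithm of the talk instead works forward, on-the-fly and symbolically on zones.

def winning_states(ctrl, env, edges, goal):
    """ctrl/env: disjoint sets of states; edges: set of (s, t); goal: set of states.
    Assumes every non-goal state has at least one outgoing edge."""
    succ = {}
    for (s, t) in edges:
        succ.setdefault(s, set()).add(t)
    W = set(goal)
    changed = True
    while changed:
        changed = False
        for s in (ctrl | env) - W:
            outs = succ.get(s, set())
            if s in ctrl and outs & W:             # controller can pick a good move
                W.add(s); changed = True
            elif s in env and outs and outs <= W:  # environment cannot avoid W
                W.add(s); changed = True
    return W

ctrl, env = {"c0", "c1"}, {"e0"}
edges = {("c0", "e0"), ("e0", "c1"), ("e0", "goal"), ("c1", "goal")}
assert winning_states(ctrl, env, edges, goal={"goal"}) == {"goal", "c1", "e0", "c0"}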
A small extension of the algorithm permits time-optimal winning strategies
(for reachability games) to be obtained, and the very nature of the
algorithm points directly to a distributed version (similar to the
distributed version of UPPAAL for reachability analysis of timed automata).
The talk also reports on an efficient implementation of the algorithm using
the DBM library of UPPAAL, experimentally evaluated on a version of the
Production Cell.
We also report on recent results on cost-optimal winning strategies for
games based on priced timed automata. This problem is in general
undecidable; however, under certain non-zenoness assumptions it becomes
decidable. Experiments with a prototype implementation in HyTech have been
made.

Zhiming Liu
Integrating Methods and their Theories for Component Software Modelling, Construction and Verification
This presentation is about an ongoing research programme on object-oriented
and component-based software development at UNU-IIST. We discuss some
of the concepts, difficulties and significant issues that we need
to consider when developing a formal method for component-based software
engineering. We argue that to deal with these challenges, existing theories
and methods of programming need to be linked in order to effectively
support component-based software engineering. We then show initial results
on how this can be achieved in the development of a Relational Calculus,
called rCOS, for object and component systems.

Markus Mueller-Olm
Analysis of Modular Arithmetic
(Joint work with Helmut Seidl (Technische Universitaet Muenchen))

Ernst Ruediger Olderog
More on Verification of Cooperating Traffic Agents
(This talk presents a continuation of the joint work with W. Damm and H. Hungar reported at the meeting of WG 2.2 in Bertinoro, 2004.)
We decompose the proof of collision freedom of cooperating traffic agents
like trains, cars and airplanes by devising a general proof rule that
can be instantiated with concrete applications. In our approach an agent
is modelled as a parallel composition of a plant and a controller each
one represented as a hybrid automaton.
The proof rule exploits a coordination protocol that is typical for achieving
collision avoidance and makes use of concepts like safety envelopes
and criticality measures. The proof rule has a number of premises each
of which, we argue, is substantially easier to verify than the global
property of collision freedom. We formalise the proof rule for the case
of two traffic agents in the State Transition Calculus, an extension
of the Duration Calculus that can represent both interval based state
properties and instantaneous transitions, and prove its soundness.
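As a toy Python illustration of the safety-envelope and criticality ideas only (made-up kinematics and constants; the talk's formalisation in the State Transition Calculus is not attempted here):

def braking_distance(v, a_max):
    """Worst-case stopping distance (m) of a train running at speed v (m/s)."""
    return v * v / (2.0 * a_max)

def criticality(rear_pos, rear_speed, front_pos, a_max=0.7, margin=100.0):
    """Distance left before the rear train's safety envelope reaches the front train."""
    envelope = rear_pos + braking_distance(rear_speed, a_max) + margin
    return front_pos - envelope

def safe(rear_pos, rear_speed, front_pos):
    """Collision freedom for this toy model: the envelope never reaches the front train."""
    return criticality(rear_pos, rear_speed, front_pos) >= 0.0

assert safe(rear_pos=0.0, rear_speed=40.0, front_pos=1500.0)       # envelope is roughly 1243 m
assert not safe(rear_pos=0.0, rear_speed=40.0, front_pos=1000.0)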
As an application we consider a train gate scenario and show how it can
be viewed as an instantiation of the general proof rule, thus proving
collision freedom.

Luke Ong
Game Semantics and Software Model Checking
Game semantics is a way of giving meaning to programming languages using
simple and intuitive notions of game playing. The approach goes back
to Kleene and Gandy in one tradition, and to Kahn and Plotkin, and Berry
and Curien in another. Compared to other semantics of programming languages,
a striking feature of game semantics is that it is denotational (and
so admits compositional methods) and yet has clear operational content;
further, it gives rise to highly accurate (fully abstract) models for
a wide range of programming languages.
A recent
development in game semantics is concerned with the algorithmic representation
of fully abstract game semantics, with a view to applications in software
model checking and program analysis. The goal is to build on the tools
and methods which have been developed in the Verification community,
while exploring the advantages offered by our semantics-based approach.
In this talk, we present a complete classification of the decidable
(with respect to observational equivalence) fragments of a higher-order
procedural language, and briefly survey applications of the approach
to program verification.

Catuscia Palamidessi
Probabilistic and Nondeterministic Aspects of Anonymity
(Joint work with Mohit Bhargava)
The concept of anonymity comes into play in a wide range of situations,
varying from voting and anonymous donations to postings on bulletin
boards and sending email. The systems for ensuring anonymity often use
random mechanisms which can be described probabilistically, while the
agents' interest in performing the anonymous action may be totally
unpredictable, irregular, and hence expressible only nondeterministically.
Formal definitions of the concept of anonymity have been investigated in the
past either in a totally nondeterministic framework, or in a purely
probabilistic one. In this paper, we investigate a notion of anonymity
which combines both probability and nondeterminism, and which is suitable
for describing the most general situation in which both the systems
and the user can have both probabilistic and nondeterministic behavior.
We also investigate the properties of the definition for the particular
cases of purely nondeterministic users and purely probabilistic users.
We formulate our notions of anonymity in terms of observables for processes
in the probabilistic $\pi$-calculus, whose semantics is based on Probabilistic
Automata.
We illustrate our ideas by using the example of the dining cryptographers.
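For concreteness, here is a plain Python simulation of the dining cryptographers protocol (the talk's analysis is instead carried out in the probabilistic pi-calculus, with both probabilistic and nondeterministic choice):

import random

def dining_cryptographers(payer=None, n=3):
    """payer: index of the paying cryptographer, or None if the master pays."""
    coins = [random.randint(0, 1) for _ in range(n)]     # coin i is shared by i and i+1
    announcements = []
    for i in range(n):
        a = coins[i] ^ coins[(i - 1) % n]                # XOR of the two adjacent coins
        if i == payer:
            a ^= 1                                       # the payer flips the announcement
        announcements.append(a)
    # The XOR of all announcements reveals *whether* a cryptographer paid, but
    # (for fair coins) not *which* one: that is the anonymity property.
    some_cryptographer_paid = sum(announcements) % 2 == 1
    return announcements, some_cryptographer_paid

_, paid = dining_cryptographers(payer=1)
assert paid
_, paid = dining_cryptographers(payer=None)
assert not paid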

Jaan Penjam
Deductive and Inductive Methods for Program Synthesis
(Joint work with Elena Sanko, Institute of Cybernetics, Tallinn, Estonia)
Formal specifications that define required system properties play a
central role in software engineering. They can be used for reasoning
about complex programs on a suitable abstract level. Formal models are
inescapable when programs are automatically derived or verified. The
most widely used techniques for automatic program construction are based
on stepwise refinement of a specification until program code is obtained.
In this talk, different approaches to program generation are considered,
where programs are obtained by means of special mathematically based
manipulations, without the construction of intermediate-level
specifications. In particular, we briefly introduce a method based on
proof search, known as structural synthesis of programs (SSP), and, on the
other hand, a technique based on the optimization of a specific fitness
function related to the task to be solved. SSP has been implemented in a
couple of commercial systems and some experimental systems that are
basically of scientific or pedagogical value. SSP's most advanced
implementation is in the system NUT, developed jointly by researchers of
the Institute of Cybernetics (Tallinn) and the Department of
Teleinformatics of KTH.
The SSP method is a deductive approach for program synthesis carried out
via special logical inference rules from a general relational specification
of a problem. This specification is often called the computational model
of the problem. Here we are going to design an inductive approach for
automatic program construction based on the same specification language
(computational model) that is augmented by experimental data that describe
desirable input-output behavior of the program to be composed. A common
specification would bridge deductive and inductive methods of program
construction. We believe that combining these two types of techniques
might provide more general and effective procedures for automation of
software development. This would simulate human reasoning where deductive
inference steps are interleaved with drawing conclusions from samples
of experimental data. In the talk we will introduce the idea of inductive
program construction, as well as an algorithm for coding sequential
programs by real numbers that allows a program synthesis task to be
transformed into an optimisation problem solvable by evolutionary
programming techniques.
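A toy Python sketch of the deductive idea, restricted to a purely propositional computational model (the axiom names and the example problem are invented; SSP proper works by proof search over the computational model, as described above):

def synthesise(axioms, given, goal):
    """axioms: list of (name, inputs, output).  Returns an ordered plan of
    axiom names computing `goal` from the variables in `given`, or None."""
    known = set(given)
    plan = []
    progress = True
    while goal not in known and progress:
        progress = False
        for name, inputs, output in axioms:
            if output not in known and set(inputs) <= known:
                known.add(output)
                plan.append(name)
                progress = True
    return plan if goal in known else None

# computational model of a toy problem: resistance, power and heat from measurements
axioms = [
    ("ohm",   ["u", "i"], "r"),
    ("power", ["u", "i"], "p"),
    ("heat",  ["p", "t"], "q"),
]
plan = synthesise(axioms, given={"u", "i", "t"}, goal="q")
assert plan == ["ohm", "power", "heat"]   # naive forward chaining; "ohm" is a redundant step here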

K.V.S. Prasad
A Calculus of Mobile Broadcasting Systems
Computers often send messages broadcast over ethernets, and point-to-point between
them: globally asynchronous, locally synchronous. This paradigm inspires
the new primitive calculus presented here, MBS (mobile broadcasting
systems). MBS processes communicate by broadcast, but only in {\em rooms},
and move between rooms at unspecified speeds. A broadcaster is heard
instantly by everyone else in the room, but by no one outside. Priorities
are attached to speech and to exits from rooms, and rooms can be blocked
awaiting a message; these primitives suffice to synchronise broadcasting
and moving. MBS borrows the treatment of names from the $\pi$-calculus,
but the latter's ``get/put $b$ on channel $a$'' becomes in MBS ``go
to $a$ and hear/say $b$''.
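A toy Python sketch of the two central primitives (broadcast heard by everyone else in the same room and by no one outside, and movement between rooms); priorities, blocking and name passing from the calculus are not modelled:

class World:
    def __init__(self):
        self.rooms = {}                       # room name -> set of process names

    def enter(self, proc, room):
        for members in self.rooms.values():
            members.discard(proc)             # a process is in at most one room
        self.rooms.setdefault(room, set()).add(proc)

    def broadcast(self, speaker, room, msg):
        """Heard instantly by every other process in `room`, by no one outside."""
        hearers = self.rooms.get(room, set()) - {speaker}
        return {p: msg for p in hearers}

w = World()
w.enter("p", "kitchen"); w.enter("q", "kitchen"); w.enter("r", "garden")
assert w.broadcast("p", "kitchen", "hello") == {"q": "hello"}
w.enter("q", "garden")                        # q moves; it no longer hears kitchen broadcasts
assert w.broadcast("p", "kitchen", "hello") == {}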

Masahiko Sato
Truth by Evidence
We introduce a new theory of expressions where expressions are divided
into the two categories of objects and judgments. The universe of
expressions is open-ended and we can develop mathematics step by step by
adding new constants for objects and judgments. We assign meaning to
judgments by defining what counts as {\em evidence} for a given judgment.
We also give a short demonstration of the computer environment CAL/NF for
formally developing mathematics, which is based on our theory of
expressions.

Martin Steffen
Observability, Connectivity, and Replay in a Sequential Calculus of Classes
Object calculi have been investigated as a semantical foundation for object-oriented
languages. Often, they are object-based, whereas the mainstream of object-oriented
languages is class-based.
We formulate an operational semantics that incorporates the connectivity
information into the scoping mechanism of the calculus. Furthermore,
we formalize a notion of equivalence on traces which captures the
uncertainty of observation caused by the fact that the observer may fall
into separate groups of objects. We use a corresponding trace semantics
for a simple notion of observability. This requires capturing a notion of
\emph{determinism} for traces, since classes may be instantiated into more
than one instance during a run and may thus show an equivalent behavior
twice (doing a ``replay''), a problem absent in an object-based setting.
Keywords: class-based object-oriented languages, formal semantics, determinism, full abstraction

Scott D. Stoller
Checking Atomicity in Concurrent Java Programs
Atomicity is a common correctness requirement for concurrent programs. It
requires that concurrent invocations of a set of methods be equivalent to
performing the invocations serially in some order. This is similar to
serializability in transaction processing. Analysis of atomicity in
concurrent programs is challenging, because synchronization commands may
be scattered differently throughout each program, instead of being handled
by a fixed strategy such as two-phase locking.
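A minimal illustration of the property (in Python rather than Java, and not one of the talk's analyses): a deposit method whose read and write sit in separate lock scopes admits interleavings equivalent to no serial order.

import threading

class Account:
    def __init__(self):
        self.balance = 0
        self.lock = threading.Lock()

    def deposit_bad(self, amount):
        with self.lock:
            current = self.balance            # read under the lock ...
        with self.lock:
            self.balance = current + amount   # ... write under a *separate* lock scope

    def deposit_atomic(self, amount):
        with self.lock:                       # read-modify-write in one critical section
            self.balance += amount

# Hand-interleaved run of two deposit_bad(10) calls: both read 0, both write 10,
# so the final balance is 10, matching no serial order (any serial order gives 20).
acct = Account()
read1 = acct.balance
read2 = acct.balance
acct.balance = read1 + 10
acct.balance = read2 + 10
assert acct.balance == 10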
We are developing techniques for checking atomicity in concurrent programs,
using static analysis, dynamic analysis (run-time monitoring), and novel
combinations of them. This talk surveys our research in this area, discusses
trade-offs between different techniques, and describes our experience
applying them to about 40 KLOC of Java programs.

Paulo Tabuada
On the synthesis of correct-by-design embedded control software
Current state-of-the-art technology has fostered the pervasive use of
microprocessors in our everyday life. Examples include portable accessories
such as mobile phones and PDAs; home appliances such as refrigerators,
toasters and coffee machines; and medical equipment such as defibrillators,
dialysis machines and MRIs, among many other systems. These embedded
computing devices interact with the continuous environment in nontrivial
ways that make their analysis and synthesis difficult. In this talk I will
focus on the problem of synthesizing embedded controllers that are provably
correct by design. This approach contrasts with current techniques where
verification plays an important role in ensuring correctness of operation.
I will show how it is possible to automatically design controllers from
discrete specifications (languages, finite state machines, temporal
logics, etc) for purely continuous systems and how these results lead
to embedded control software. One novelty of this approach is the fact
that we synthesize both continuous control laws and a description of
its software implementation resulting in embedded control systems that
are correct by construction.

Tarmo Uustalu
Compositional natural semantics, Hoare logics and type systems for low-level languages
(Joint work with Ando Saabas, Institute of Cybernetics, Tallinn)
Low-level languages, where a piece of code is a flat set of labelled instructions
with no explicit phrase structure, have traditionally been considered
inherently non-modular and therefore hard to reason about. But the advent
of the paradigm of proof-carrying code (or, more generally, certified
code) has generated significant renewed interest in reasoning about
low-level code. We show that the seemingly banal, implicit and ambiguous
structure of finite unions of pieces of code with non-overlapping support
yields a phrase structure which, if properly exploited, is perfectly
reasonable both from the viewpoint of metatheory and with a view to
practical reasoning about large pieces of software. We show how this
phrase structure leads to compositional natural semantics, Hoare logics
and type systems with beautiful metatheoretical properties. Among other
things, the approach supports compilation of proofs together with programs.
This is new work, which has been partially reported at SOS 2005, but is
ongoing. Some new work by Tan and Appel, as well as by Benton, is related,
but our approach is, we claim, substantially simpler and neater.
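A small Python sketch of the phrase structure in question (instruction set and example invented): a piece of code is a finite map from labels to instructions, and two pieces compose only when their label supports are disjoint. The compositional logics and type systems themselves are not reproduced here.

def compose(piece1, piece2):
    """Union of two code pieces with non-overlapping support."""
    assert not (piece1.keys() & piece2.keys()), "overlapping labels"
    return {**piece1, **piece2}

def run(code, pc=0, acc=0):
    """Tiny accumulator machine: ('add', n), ('jmp', label), ('halt',)."""
    while True:
        op = code[pc]
        if op[0] == 'halt':
            return acc
        if op[0] == 'add':
            acc += op[1]
            pc += 1
        elif op[0] == 'jmp':
            pc = op[1]

# two pieces with disjoint supports, each a meaningful phrase on its own
piece_a = {0: ('add', 1), 1: ('jmp', 10)}
piece_b = {10: ('add', 2), 11: ('halt',)}
assert run(compose(piece_a, piece_b)) == 3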

Igor Walukiewicz
Alternating Timed Automata
(Joint work with Slawomir Lasota)
A notion of alternating timed automata is proposed. It is shown that such
automata with only one clock have a decidable emptiness problem. This gives a new
class of timed languages which is closed under boolean operations and
which has an effective presentation. It is shown that the complexity
of the emptiness problem for alternating timed automata with one clock
is non-primitive recursive. The proof also gives the same lower bound
for the universality problem for nondeterministic timed automata with
one clock thereby answering a question asked in a recent paper by Ouaknine
and Worrell. Undecidability of the emptiness problem over infinite words
is also shown.

Rafal Wisniewski
Investigation of Discrete Event Systems by Means of Dynamical Systems. Application to Concurrency.
There has been a huge amount of interesting work done in the field of
hybrid systems. A lot of attention has been paid to extending the existing
techniques developed for the analysis of discrete event systems to tackle
the dynamics; one of many examples is hybrid automata. As a person devoted
to dynamical systems and their application in control engineering, I will
attack the problem from the opposite point of view: I tackle discrete
event systems with tools developed for the geometric theory of dynamical
systems. The case study I will consider in this talk is concurrency, and
the objective is to classify the executions of a concurrent program. I
will treat a computer program as a flow line of a vector field. The flow
line is born at a point p, the start of the program, and dies at a point
q, the end of the program. Due to different scheduling scenarios, the
execution of the same program may result in flow lines of vector fields
close to each other.
