Report on the WG1.8 Workshop at CONCUR 2007---Part 1
As you may know, Jos Baeten, Wan Fokkink, Anna Ingolfsdottir, Uwe Nestmann and I organized a Workshop on Applying Concurrency Research in Industry, co-located with CONCUR 2007 in Lisbon, on behalf of WG1.8. The workshop was held on the afternoon of Friday, 7 September, and ran concurrently with the last two sessions of the main conference. Despite the competition with CONCUR, we had about 25 participants at the workshop, including several members of the working group. I think that this was a rather decent level of attendance for a "strategic event".
In this post, for the benefit of the WG members and of the community as a whole, I'll try to summarize the main discussion points that were raised during the presentations at the workshop and the ensuing panel discussion. I will also try to recall some of the questions that the audience asked the panel members. I hope that some of the blog's readers will want to contribute their opinions on these points and offer their own answers to those questions.
The organizers of the workshop will use all of the contributions that they'll receive in putting together an article for the concurrency column of the BEATCS.
Report on the Invited Talks
Note: I encourage the speakers to correct anything they said that I may have misinterpreted or misrepresented. I take full responsibility for any error I might have made in relaying the gist of the invited talks, and I'll be happy to post corrections and further comments. This is meant to be a live repository.
The workshop began with four invited presentations delivered by Hubert Garavel, Vincent Danos, Kim G. Larsen and Jan Friso Groote.
Hubert gave an overview of his twenty years of work on the CADP tool set, which is the oldest concurrency-theoretic tool set still in active use. A thread running through his work in what he called "applied concurrency theory" is that we must push the main results of our research area to industry, and that this is only possible with the development of strong tool support for our methods. Hubert said that one of his design principles has been to restrict a tool's functionality for efficiency reasons, and that elegant semantics and efficient execution are two separate (and at times conflicting) concerns.
I have a note to the effect that, during Hubert's presentation, somebody (possibly Hubert himself) said that a lot of code doing bisimulation minimization has been lost over the years. We simply have not been consistent enough in preserving some of our earlier tooling efforts for the present generations of developers. Jan Friso said that current bisimulation minimization algorithms do not scale up to the size of the systems that are currently being analyzed, and asked whether it would be appropriate to rekindle research and implementation efforts on efficient and scalable bisimulation algorithms.
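For readers who have not seen a bisimulation minimization algorithm, here is a deliberately naive partition-refinement sketch for strong bisimilarity on a finite labelled transition system. This is only an illustration of the idea (it is not the algorithm used in CADP, mCRL2 or any other tool mentioned at the workshop); its worst-case behaviour is exactly the kind of thing the scalable algorithms Jan Friso called for would improve on.

```python
def bisimulation_partition(states, transitions):
    """Compute the strong-bisimilarity partition of a finite LTS.

    states: iterable of states; transitions: set of (src, label, dst) triples.
    Repeatedly splits blocks by their (label, target-block) signatures
    until the partition is stable -- naive, but easy to follow.
    """
    partition = [frozenset(states)]
    while True:
        # Map each state to the index of its current block.
        block_of = {s: i for i, block in enumerate(partition) for s in block}

        def signature(s):
            # A state's signature: which blocks it can reach, and via which labels.
            return frozenset((a, block_of[t]) for (p, a, t) in transitions if p == s)

        # Split every block into groups of states sharing a signature.
        new_partition = []
        for block in partition:
            groups = {}
            for s in block:
                groups.setdefault(signature(s), set()).add(s)
            new_partition.extend(frozenset(g) for g in groups.values())

        if len(new_partition) == len(partition):  # stable: no block was split
            return new_partition
        partition = new_partition

# Tiny example: states 1 and 2 are bisimilar (both loop on 'b'); state 0 is not.
lts = {(0, 'a', 1), (0, 'a', 2), (1, 'b', 1), (2, 'b', 2)}
blocks = bisimulation_partition({0, 1, 2}, lts)
```

On the tiny example, the algorithm returns the two blocks {0} and {1, 2}. Each refinement pass touches every transition for every state, which is why this style of implementation does not scale to the state spaces mentioned in the discussion.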
Earlier that day, Vincent had delivered an inspiring tutorial on his work in doing rule-based analysis of biological signalling. Listening to his tutorial, I was left with the impression that he is really having an impact in the life sciences, and that experimental biologists might very well use his tools based on concurrency-theoretic ideas. At the workshop, Vincent presented another way in which concurrency-theoretic ideas can help experimental biologists in their work. Experimental biology has a huge knowledge representation problem. (Vincent mentioned that there are two papers published in that area each minute!) In his opinion, experimental biologists can/should
- use concurrency-inspired languages to express biological understanding and
- display this information in a wiki-type system.
Kim's talk was based on his experience with the ten-year development of the Uppaal tool, and reported on the knowledge-transfer activities that are part of his involvement in CISS. Apart from surveying the development of Uppaal and its recent offshoots, Kim's talk sent out the following general messages to the audience.
- Tools are a necessary condition for the successful transfer of concurrency-theoretic ideas to industry. Tool development is labour intensive, and one needs the sustained effort of many people over many years to produce good software tools.
- Academic courses offered to students and to industry practitioners play a key role.
- Concurrency researchers should try to target different communities of potential users. One never knows where successful applications will come from.
- A good beginning is useful! Being able to start with a success story may lead to further successes and long-term collaborations. However, Kim warned against assuming that a good beginning guarantees a good ending, and recounted the story of the Aalborg collaboration with Bang & Olufsen, who disappeared from sight after Klaus Havelund, Arne Skou and Kim found and fixed a bug in one of their protocols. See here.
- The success of CISS shows that several companies are very interested in applying concurrency-theoretic ideas and tools because this reduces time to market and increases the quality of their products.
- The impact of one's work on the application of concurrency-theoretic research in industry is not always directly proportional to the amount of effort one puts into the work itself. Kim gave the example of the synthesis of control software for the temperature and humidity in an actual pig stable in Denmark. This software was synthesized using Uppaal Cora in a short time, and it is running to the satisfaction of its customers :-)
- Finally, Kim called for an expansion of the scope for use of concurrency theory. We should link our work to testing, optimization and so on, and embed it into standard software-engineering methodologies that are familiar to practitioners.
Some Questions to the Speakers
Here are some questions that were addressed to the speakers during the panel discussion, in no particular order. I encourage readers of this report to post their own answers and further questions as comments to the post. Later on, I will post the answers from the panel members as I recall them.
- What is the role/importance of real-time in modelling? Does industry want dense-time or discrete-time models?
- How does one involve small- and medium-sized companies in collaborations with concurrency theoreticians/practitioners? Does "company size" matter?
- Is there any need for stochastic and probabilistic modelling in applications? More pointedly, have you met an example that you could not model because your tool does not support stochastic or probabilistic phenomena?
- How can we, as a community, foster the building of industrial-strength tools based on sound theories?
- What has concurrency theory offered industry so far? What are the next steps that the concurrency community should take in order to increase the impact of its research in an industrial setting? And what are future promising application areas for concurrency research?
Addendum 14/9/2007: After I wrote this post, it occurred to me that the workshop discussion may have given the impression that industrial impact can be achieved solely by means of tools and joint case studies. Moshe Vardi's work on specification languages such as ForSpec, on the other hand, indicates that the development of theoretically sound and clean specification languages that are actually used by industry is another area in which the community can (and, I believe, should) have an impact.
I hope I can quote Moshe as saying
"In fact, I believe that much of my industrial impact has been achieved through the development of clean and useful theory."