Some Answers By the Workshop Speakers
In a previous post, I listed the questions that the attendees at our CONCUR 2007 workshop raised to the invited speakers and panel members. I also promised to pen down my recollections of their answers. Here they are, in the hope that I do not misrepresent the opinions of these colleagues too much. I am sure that they will correct me if I inadvertently do so. Anything I write below should be taken with a huge pinch of salt :-)
What is the role/importance of real-time in modelling? Does industry want dense-time or discrete-time models?
Kim G. Larsen (KGL): The vast majority of computing systems in operation today are embedded. Proper modelling and analysis of the behaviour of embedded systems requires time. However, industry does not really seem to care whether the model of time used in the models is discrete or continuous. When analyzing embedded systems, quantitative analysis techniques are needed and I expect that stochastics will play an increasing role in the future.
Jan Friso Groote (JFG): Basically, industry does not know what it wants, and there is little point in chasing its ever-changing needs. Concrete modelling and analysis of computing systems should be based on a uniform calculus that is rich enough to model the problem scenarios at hand. As far as real time is concerned, it should be expressible in that uniform calculus.
How does one involve small- and medium-size companies in collaborations with concurrency theoreticians/practitioners? Does "company size" matter?
Both JFG and KGL reported on several examples of interaction with industry in which there seemed to be no relation between company size and the success of the collaboration. Kim described a successful collaboration on the testing of GUI applications with a one-person company that had essentially no technological expertise. He contrasted this with the collaboration with Bang & Olufsen, which was a failure despite the resounding success of their first cooperation.
Jan Friso highlighted the successful cooperations with industry within the LaQuSo laboratory. (See here for more information.)
Hubert Garavel (HG) stated that there are three necessary conditions for a successful collaboration with industry. The company should have
- a strong interest in quality,
- a lot of money and
- a formal modelling group in-house.
Is there any need for stochastic and probabilistic modelling in applications? More pointedly, have you met an example that you could not model because your tool does not support stochastic or probabilistic phenomena?
There seemed to be a general consensus here that probabilistic modelling is nice but not necessary. More precisely, none of the speakers had yet met an example that they could not model adequately because their models and tools did not support stochastic or probabilistic phenomena.
JFG stated that he wants to work on research topics that can have applicability in real-life scenarios. He wants to have interaction with industry mainly as a way to learn what are the good/bad aspects of his modelling language and his tools. He feels that one should push for the use of the same model for verification and performance evaluation. (I have a vague recollection that this opinion was also shared by Kim.)
How can we, as a community, foster the building of industrial-strength tools based on sound theories?
The general feeling amongst the panelists was that our community does not have the infrastructure to support tooling efforts. HG pointed out how the situation is better in France, where the development of tools and languages is supported and recognized by institutions like INRIA. (My note: In hindsight, this is reflected by the success of long-term efforts like those that led to Esterel, Coq and CAML, amongst others.)
The panelists suggested that conferences and journals should be keener to accept tools and case studies as refereed contributions on a par with papers. JFG pointed out that Science of Computer Programming has now a track devoted to "expositions on implementations of and experiments with novel programming languages, systems and methods." The journal's web page also states that "It must be emphasized that papers describing new software tools of relevance to SCP are welcome under the strict condition that the source code of the tools is open." KGL also stated that the TACAS conference series was initiated precisely to give more visibility to tooling efforts in our community.
For a one-slide viewpoint on the economics of tool development look at slide 72 of Kim's presentation at the workshop.
What has concurrency theory offered industry so far? What are the next steps that the concurrency community should take in order to increase the impact of its research in an industrial setting? And what are future promising application areas for concurrency research?
JFG: Theory does not have much to offer to industry. We should probably view concurrency theory as a nice mathematical theory that need not have any real-world application.
As for what we can do as a CONCUR community to assist in the work on tools, JFG's answer is to ensure that as many students as possible are taught the use and effective application of these tools. One of the biggest problems we face is that far too few people in industry understand what formal methods and their tools can actually deliver. Not seeing where the techniques are effective and what they offer, they play it safe and invest their time in other, more pressing needs.
If we teach formal methods, we should teach the most advanced ones that the students can absorb. If they understand the advanced methods, they can apply the more straightforward techniques; the reverse, of course, does not hold. Don't teach them UML and expect them to understand mCRL2. But if you teach them mCRL2, they will have no conceptual difficulty in applying UML.
KGL: We should make our techniques fit into the whole system-development process. We should also make sure that existing model-based tools that are already in use by industry have much stronger semantic foundations.
HG: Our community is in trouble! The model of concurrency that is prevalent in industry is Java-like (threads and shared variables). Our foundational beliefs are exotic to industry, and message passing, as used in our process calculi, is not the order of the day. Our major challenge is to push the clean model of concurrency we like. Every computer science department in every university should play its part in achieving this aim. Education is one of our best weapons for making industry accept our models and the techniques based upon them.