Sunday, May 19, 2013

Some lessons from MiSE at ICSE

I just finished attending the two-day Modeling in Software Engineering workshop at the International Conference on Software Engineering in San Francisco.

Here are some of the take-away lessons for me (these do not necessarily reflect the ideas of the speakers, but rather my interpretations and/or extensions of their ideas).

Industrial use of modeling: There was very interesting discussion about the use of modeling in industry, and there seem to be two key and related directions for such use. Michael Whalen on Saturday gave many examples of the use of Matlab and Simulink in various critical systems (and particularly the use of Stateflow). Lionel Briand, on the other hand, talked about using UML and its profiles to solve various engineering problems; again, however, he mostly focused on critical systems. In a panel he pointed out that most of the Simulink models he had worked with are just graphical representations of what could just as well be written in code (i.e. with little or nothing in the way of additional abstraction).

What struck me was that both presenters, and others, seemed to embrace what I might call 'scruffy' modelling: Briand talked about users adapting UML to their needs, and others talked about Simulink as a tool that does not have the formal basis of competing tools, but nonetheless serves its users well.

Many people in the workshop pointed out that we need to boost the uptake of modelling. Various ways to achieve this were emphasized:

  • Improve education of modelling
  • Build libraries of examples, including exciting real-world ones, and ones that show scaling up
  • Make tools that are simpler and/or better so more 'ordinary' developers will consider taking up modelling
  • Allow modeling notations to work with each other and other languages and tools

It turns out that all four of these have long been objectives of my Umple project. So it seems to me that if the Umple project pushes on at its present pace, we stand to have a big impact.

Speaking of Umple, I gave a short presentation that seemed to be well received, although my personal demonstrations to a number of participants seemed much more effective, with people appearing to be quite impressed. The lesson from this is that people really can see the advantages of our approach, but a hands-on and personal approach may work best as a way to help people see the light.

Context: Another theme of the MiSE workshop that repeatedly appeared was 'context'. Briand pointed out that understanding the problem and its context is critical before working on a model-based solution; the modelling technique to be used will depend deeply on this context. Context can be requirements vs. design, the specifics of the domain (for example, the fact that space systems must be radiation hardened), or some aspect of the particular problem.

In my opinion, this is certainly right: Understanding the context is critical, and the tool, notation or technique needs to be selected to fit the context. However, I also believe that we need to work on generalities that can apply to multiple contexts, in the same manner that general-purpose programming languages can be used in multiple contexts. For example, the general notion of concept/class generalization hierarchies can be applied in almost every context, whether it be modeling the domain, specifying requirements for the types of data to be handled, or designing a system for code generation. I think state machines can also be applied in a wider variety of contexts than those where people currently apply them: They are used in many real-time systems, and they have been used for specifying the navigation in user interfaces. But in my experience they can be applied in other kinds of systems too, such as in this Umple example.
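To illustrate both points, here is a minimal Umple sketch of my own (the class and event names are invented for illustration): a generalization hierarchy and a state machine in a decidedly non-real-time context, course registration.

```
// A generalization hierarchy: GradStudent isA Student
class Student {
  name;
}
class GradStudent {
  isA Student;
  thesisTitle;
}

// A state machine outside the real-time domain:
// the lifecycle of a course section
class CourseSection {
  Integer cap;
  Integer numRegistered = 0;
  status {
    Planned { open -> Open; }
    Open {
      // guard keeps registration within capacity
      register [numRegistered < cap] / {numRegistered++;} -> Open;
      close -> Closed;
    }
    Closed {}
  }
}
```

The same textual model can be used for domain understanding, for specifying requirements about the data to be handled, or as input to code generation.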

Testing: An interesting theme that came up several times related to testing: It was pointed out that it is worthwhile to generate tests from a model, but it must also be recognized that, in the context of a model used to generate code, these tests serve only to verify that the code generator is working properly! Such tests do not validate the model. Additional testing of the system is always essential.

Semantics and analysis: There was a lot of agreement that the power of modeling abstractions can be leveraged to enable analysis of the properties of systems. To do this, however, it seems to me that semantics needs to be pinned down and better defined. 'Scruffy' use of UML and Simulink seems to detract from these possibilities. Again, one of the objectives of Umple is to select a well-defined subset of UML, to define its semantics very precisely, and to be able to analyse system designs in addition to generating systems from the models.
