Sunday, December 30, 2007

A Good Example from Canada

I recently became aware of the Canada Research Chairs programme. That programme has been running since the year 2000, and aims at establishing 2000 research professorships—the so-called Canada Research Chairs—in universities across Canada by 2008. The Canada Research Chairs programme invests $300 million a year to attract and retain some of the world's most accomplished and promising minds.

I encourage the readers of this blog to have a look at the web site for the programme. There is a lot of interesting material there, and I cannot help but think that many countries would be well served by setting up a similar programme to attract the best possible scientists in all disciplines. Now, this is something well worth lobbying for in the coming year, isn't it?

If you do not have time to look at the web site I linked to above, here is my executive summary of the programme.

  • Each eligible degree-granting institution in Canada receives an allocation of Chairs. For each Chair, a university nominates a researcher whose work complements its strategic research plan and who meets the program's high standards.

    Three members of a college of reviewers, composed of experts from around the world, assess each nomination and recommend whether to support it.

  • Universities are allocated Chairs in proportion to the amount of research grant funding they have received from the three federal granting agencies (NSERC, CIHR, and SSHRC) in the three years prior to the year of the allocation.

  • There are two types of Canada Research Chair:

    Tier 1 Chairs, tenable for seven years and renewable, are for outstanding researchers acknowledged by their peers as world leaders in their fields. For each Tier 1 Chair, the university receives $200,000 annually for seven years.

    Tier 2 Chairs, tenable for five years and renewable once, are for exceptional emerging researchers, acknowledged by their peers as having the potential to lead in their field. For each Tier 2 Chair, the university receives $100,000 annually for five years.

  • As you can see, there is a strong financial incentive to attract people to these endowed chairs!
  • Chairholders are also eligible for infrastructure support from the Canada Foundation for Innovation (CFI) to help acquire state-of-the-art equipment essential to their work.

  • If an institution's performance decreases relative to other institutions to the extent that the next recalculation of Chair allocations results in that institution's allocation being reduced, the Chairs Secretariat will reclaim, as appropriate, one or more of its unoccupied Chairs. Should all of the institution's Chairs be occupied, the secretariat will negotiate with the university on how best to reclaim the lost Chair(s).

Of course, the success of a programme like this one should be measured by the quality of the people who take up the chairs. (Italy has a similar programme already in place. You can read about it in a short article in Nature, with commentaries in the blog posts "The Runaway Brains" and "Brain Drain and Brain Gain".) You can look up the chairholders in all disciplines here. A quick browse through the names of the Canada Research chairholders in Information Technology and Mathematics makes me pretty sure that you'll find outstanding people in your area of interest.

Wouldn't it be great if we could convince our own ministries for education, university and research to set up a Research Chairs programme along Canadian lines? Let's see what the new year brings, but I am not holding my breath. I am already doing so while waiting for the results of my pending research grant applications.

I wish a happy and productive 2008 to all readers of this blog.


Thursday, December 20, 2007

Workshop on Women in TCS

The department of CS at Princeton University is hosting a "Women in Theory" student workshop in Princeton on June 14-18, 2008. See http://www.cs.princeton.edu/theory/index.php/Main/WIT08 for more details and a list of confirmed speakers. (Via in theory.)

I am happy to see an initiative like this. In fact, I believe that we should have more events that highlight the achievements of women in TCS and that offer prospective students role models that they can look up to.

During my student days in Pisa, I thought that it was very natural for CS classes to be attended by roughly equal numbers of men and women. I also took a good number of classes where the lecturers or TAs were female members of staff. Back then, I never thought that computer science was a male-dominated subject, and I had no reason to think so. Only much later did I realize that what I thought was the norm was, in fact, an exception; by the time I taught a class in Aalborg attended by only one female student out of about 50 registered ones, the lack of women in CS no longer surprised me.

It is still my impression that Italy has a fairly substantial number of female (theoretical) computer scientists---at least compared to countries in Northern Europe. People often ask me for the reasons behind this phenomenon, and I am always at a loss to explain it.

Do any of you have a good explanation why countries like France and Italy seem to suffer less than others from the lack of women in subjects like CS? Could it be that things are getting worse there too?

Saturday, December 15, 2007

Computer Scientist: 21st Century Renaissance Man

The period from January till May each year is when faculty at the School of Computer Science at Reykjavík University make a determined effort to entice students to study our lovely discipline. (I know, we should do so all year round, but somehow our good intentions do not turn into good deeds on a regular basis by themselves :-()

As part of our 2008 campaign, I have coauthored a short essay entitled Computer Scientist: 21st Century Renaissance Man. There is nothing particularly new in it, but I hope that it is readable and that it carries the message that CS is much more than most laypeople believe it is. Maybe some of you will find it useful for your own PR campaigns. Feel free to use it, if you think it may help.

You can draw some more inspiration from the items in our suggested-reading list. More essays of general interest may be found here. See also the excellent survey collection at Theory Matters.


Friday, December 14, 2007

Positions in Computer Science and Applied Maths at Reykjavík University

Some readers of this blog might be interested in the following job announcement. We are particularly interested in applicants in Computer Security, System Dependability, and related areas within the field of computer science. Moreover, Software Engineering is to be understood in a broad sense, and we welcome applications from people working in, e.g., formal development techniques, model-based software development, and testing and verification.

In order to implement its ambitious strategy in research and teaching, the School of Computer Science at Reykjavík University seeks to hire faculty members for new academic positions. The following links point to pages with more detailed information about the vacant positions.

Applied Mathematics: http://hr.is/?PageID=6595
Computer Science: http://hr.is/?PageID=6596
Software Engineering: http://hr.is/?PageID=6608

In all cases, position levels can range from assistant professor to full professor, depending on the qualifications of the applicant. Salary level is negotiable and relocation assistance is offered. The positions are available immediately, but later starting dates can be negotiated.

Informal communication and discussions are encouraged, and interested candidates are welcome to contact the Dean of the School of Computer Science, Dr. Ari K. Jónsson (ari@ru.is), for further information.


Wednesday, December 12, 2007

Registration for ICALP 2008 is Open

The registration page for ICALP 2008 and affiliated events went live yesterday evening at http://www.ru.is/icalp08/registration.html.

We have done our best to keep the registration fees as competitive as we possibly could. The prices on the registration form are in ISK, but, by way of example, the regular early fee for a non-ICALP participant to a one-day workshop is around 77 euros, which becomes roughly 56 euros for somebody who also registers for ICALP. The early registration fee for ICALP is a little below 380 euros (including the excursion).

If you know that you will attend ICALP 2008, as you should :-), I strongly encourage you to book your flights and accommodation early. July is prime holiday time in Iceland, and you are more likely to get a good deal on your flights if you book as early as you possibly can.

Let me end, in the style of Numb3rs, by noting that ICALP 2008 is

13 workshops
8 days
5 invited speakers
3 tracks
2 prize awards
1 conference.

Thanks to my co-organizer Magnús M. Halldórsson for pointing out this "ICALP for fun" countdown and for noting that this is going to be a Fibonacci-like ICALP: the counts 1, 2, 3, 5, 8 and 13 are consecutive Fibonacci numbers!

Ranking of Excellent European Graduate Programmes in Natural Sciences

This report of the Centre for Higher Education Development (CHE), which was released about a week ago, may be of interest to readers of this blog. The CHE is a think tank for higher education. Based on international comparisons, they develop models for the modernization of higher education systems and institutions.

Their report develops a Ranking of Excellent European Graduate Programmes in Natural Sciences (viz. biology, chemistry, mathematics and physics), which is intended as an orientation guide for undergraduates, helping them find their way around European higher education and, at the same time, choose a suitable university for their graduate studies: Master's and PhD.

At first sight, the report looks very well done, and for an Italian expatriate like me it is good to see that Italian institutions are doing rather well. I'd like to see a similar analysis carried out for programmes in computer science.

Let me try and provide a few remarks on the CHE report. In passing, I'll also offer some personal conclusions related to what the findings of this report may mean for a country like Iceland, which has great ambitions despite its tiny population.

Let me start by focusing on one message from the report that I find most important here (quoted from page 14 of the report). While reading the following text, bear in mind that "subject areas" refers to biology, chemistry, mathematics and physics, and not to a huge array of different disciplines.

"Another interesting finding is the fact that most institutions (33) are selected in only one subject area, 15 in two subject areas, 4 in three and also only 4 in all subject areas. If, even in the relatively closely connected academic fields of the natural sciences and mathematics, only 14% of the very top institutions in one geographic region are featuring three or all four subject areas, this can indeed be taken as an argument against institutionwide rankings."

Even though I enjoy reading the results of university-wide rankings, I believe that what should concern students choosing where to pursue their studies and funding agencies determining where to invest their research funds in specific disciplines is not the overall ranking of a university, but rather its excellence in the specific topic of interest. For this reason, I agree with the finding that subject-specific rankings are much more informative than institution-wide ones.

Of course, in general, a university that scores highly institution-wide won't have any very weak department. However, there may be, and indeed there are, universities that have peaks of true excellence in specific areas, even though they may not be world-beaters in many areas. If I were a prospective PhD student, I would prefer going to study in a department which is known to be top-class in the specific area of my interest rather than going to university X just because it has a globally good reputation. Quoting from the report:

"Prospective doctoral students are possibly less interested in the general performance of a faculty or department than in a specific research group. They usually have very clear ideas about the specialised topic on which they are focusing. Thus, it might be of some value for a student searching for a biology doctoral programme specialising in insects to know that the faculty at University A is excellent in its research output in this domain. However, it might be much more interesting for this individual to
learn that he could delve into honeybee studies in Würzburg's bee group. Or, a student in astrophysics might be attracted less by the overall performance of the Physics Department at the University of Copenhagen than by its research group focusing on dark matter and cosmology."

So my first conclusion is:

Conclusion 1. Our business as academic institutions is reputation. It is better to be known in a few selected areas than to be unknown in many. Icelandic universities should prioritize and place more resources in those areas where they can maintain or build a strong reputation internationally. The competition is growing stronger by the day; nobody stands still and we will need many more resources in the future just to maintain our present standing where we have one.

The second point that I'd like to pick out from the report is the minimum requirement for even entering the evaluation. The 3000 ISI publications from an institution over the evaluation period are indeed a very tall order for any Icelandic institution at this moment in time. Sometimes we pat ourselves on the back and tell each other how well we are doing, and for very good reasons. However, we should never lose sight of the "big picture". A very good practice for any scientist is to remain humble, to know that there is a lot one does not know, and to keep in mind that there are very many strong scientists and departments out there.

As Socrates famously put it, "A wise man is one who knows he does not know." In this setting, I would translate this statement into something like this:

Conclusion 2. A wise rector/dean/head of department is one who knows that her university/faculty/department will need to improve its research quality and output considerably just to maintain its present status, no matter what its present strength is. The only way to do so is to hire the best possible researchers, to give them the best possible working environment and the freedom to follow their research interests. Research output will need to be considered when distributing research money to ensure that the most funding goes where the highest "interests" (read "quality publications in internationally recognized outlets") will be generated.

Two of the indicators considered by the CHE Ranking are
  • the percentage of international and female staff within the group of staff with a doctorate and
  • the percentage of female and international doctoral and master's students.

I am afraid that, despite our snow queens, we score badly on both of these fronts. Ranking measurements aside, it is of paramount importance for science in Iceland to nurture female talent and to seek actively to hire the best available female applicants. Mind you, I am against hiring female applicants just because of their gender. What I am saying is that our departments should have search committees that actively nurture connections with the best possible female applicants for positions, and that outstanding female applicants should be given precedence when they are at least as good as the competition. Here the ministry could also chip in with financial incentives for universities to hire outstanding female applicants. (I won't turn this into a conclusion, though.)

One may wonder whether some research groups from Icelandic universities can make it into the big league. The answer to this natural question that emerges from the CHE report is, I believe, positive. Look at the bottom of page 12 in the report. There you will read:

"Looking at table 2, the United Kingdom not only attains the largest number of gold medals but also the largest number of medals in total within the excellence group. Switzerland, with only three universities in this group, is in third place concerning gold medals and holds the largest relative percentage of gold medals: 16 out of 22 medals in the whole."

This is an outstanding, and not unexpected, performance by Swiss institutions. In fact, ETH Zurich is one of only four universities with gold medals in all of the subjects in the excellence group (the others being Imperial College, the University of Cambridge and the University of Utrecht)! How can Switzerland achieve this outstanding level of academic achievement? Rather than trying to answer this question myself, I will rely on higher authority and freely quote a few excerpts from an interview with the Italian mathematician Alfio Quarteroni (professor at the École Polytechnique Fédérale de Lausanne (EPFL) and at the Politecnico di Milano) published in this book.

  • Switzerland has only two federal universities (ETH Zurich and EPFL).
  • These are two truly international institutions. To wit, about 70% of their professors are foreigners, and so are about 65% of the PhD students and about 33% of their undergraduates.
  • Each of the few and carefully chosen full professors in those institutions has the financial resources to build her own research team. For instance, Quarteroni's team has about 20 members. (As a curiosity, they helped build Alinghi, the boat that won the last two editions of the America's Cup.)
  • Quarteroni roughly says: "EPFL offers outstanding environmental and quality conditions that I have not found elsewhere. I have worked at the University of Minnesota, at Paris VI and, for shorter periods, in about 50 universities and research centres throughout the world, including NASA at Langley; well, on the basis of my personal experience, Lausanne is the place where I have been able to realize my goals in the simplest, fastest and most efficient way."

Conclusion 3. I'll let you draw your own conclusions as to what we need to do here in Iceland in order to approach the lofty heights that those Swiss institutions, as well as several universities in Finland, the Netherlands, and Sweden, have managed to attain. The above opinions of Quarteroni's raise many questions which I hope Icelandic university administrators will be willing to answer.


RU in the Guardian Education

Colin Stirling pointed out this article in the Guardian Education to me yesterday. This article is going to generate a little publicity for my current institution (Reykjavík University) and we can certainly do with that! It is also certainly true that Icelandic (academic) institutions feature more women in leading positions than elsewhere, and that their wages are comparable to those of equally qualified male colleagues. However, I am not so sure that Iceland is a particularly good example of a country that attracts good numbers of female students and members of staff in science and technology. We still have far too few women enrolling in computer science degrees, for instance, and my department employs one female professor and two female assistant professors. (Yes, we now have English web pages!) We will need to work very hard to try and change this situation, and this is one of our tasks for the future as far as recruiting is concerned both at student and staff level.

But enough grumbling: let's enjoy our five minutes of fame in the British media, even though, as my rector Svafa Groenfeldt wrote to me,

"The interview was ok but as always the journalists take a bit of an artistic license when they quote what we said :-)"

Tuesday, December 11, 2007

A Cancellation Theorem for BCCSP

Wan Fokkink, Anna Ingolfsdottir and I have recently completed the paper A Cancellation Theorem for BCCSP, which is now available from the web page where Anna and I collect our papers.

The aim of this paper is to prove a cancellation result for the language BCCSP modulo all the classic semantics in van Glabbeek's linear time-branching time spectrum. The statement of this result is as follows.

Theorem. Let t and u be BCCSP terms that do not contain the variable x as a summand. Let <= be a preorder in van Glabbeek's spectrum. If t+x <= u+x then t <= u.

Apart from having some intrinsic interest, this cancellation result plays a crucial role in the study of the cover equations, in the sense of Fokkink and Nain, that characterize the studied semantics.

Fokkink and Nain proved the instance of the above theorem for failures semantics, with the aim of obtaining an ω-completeness result for that semantics; their proof is rather delicate. To the best of our knowledge, failures semantics has so far been the only semantics in the spectrum for which the above result has been published. In our paper, we provide a proof of the above-mentioned property for all of the other semantics in the linear time-branching time spectrum. Despite the naturalness of the statement, which appears obvious, these proofs are far from trivial (at least for yours truly) and quite technical. I myself was really surprised by the amount of work we needed to do to prove such an "obvious" statement. (In case you wonder, the "obvious" statement is false in general. Consider, for instance, an algebra with carrier {0,1} in which the sum of any two elements is 1. Then the inequation y+x <= z+x holds, but y <= z obviously does not.)
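
In case you want to convince yourself of this, here is a minimal sketch in Python (my own illustration, not material from the paper) that checks the two-element counterexample by brute force:

    from itertools import product

    carrier = (0, 1)

    def plus(a, b):
        return 1  # in this algebra, the sum of any two elements is 1

    def leq(a, b):
        return a <= b  # the usual order on {0, 1}

    # The inequation y+x <= z+x holds under every valuation of x, y and z...
    assert all(leq(plus(y, x), plus(z, x))
               for x, y, z in product(carrier, repeat=3))

    # ...but y <= z fails for the valuation y = 1, z = 0.
    assert not leq(1, 0)

So cancellation may fail in arbitrary ordered algebras; the content of the theorem is that it does hold for BCCSP terms modulo the preorders in the spectrum.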

I hope that some of the readers of this blog will find the paper worth reading. The techniques used in the proof of the cancellation theorem may also have some independent interest.

Sunday, December 09, 2007

Accepted Papers at FOSSACS 2008

The list of accepted papers for FOSSACS 2008 is out. I was on the PC for the conference, and I have to say that this year the number of very good submissions was substantially higher than the number of slots. (And that is without counting the papers for which I had to declare a conflict of interest.)

Looking at the accepted papers, it is striking how many of them have French authors. In fact, France was the country with the largest number of authors of submitted papers this year (67, to be precise), and the acceptance ratio for papers co-authored by French authors was high. By way of comparison, the country with the second largest number of authors was the US, with 29. Germany, Italy and the UK were roughly on a par with the US.

French TCS is hot, judging by these figures. LSV alone contributes at least five papers to FOSSACS 2008. This is quite a way of celebrating their tenth anniversary.

Thursday, December 06, 2007

ACM Fellows - 2007

I just had a look at the list of ACM Fellows for 2007. It has been a good year for TCS, and even concurrency theory and computer-aided verification are well represented in the list. Congratulations to Rajeev Alur, Lance Fortnow, Georg Gottlob, Rajeev Motwani, Amir Pnueli and all the other new fellows.

On the subject of awards, I yesterday saw the call for nominations for the first CAV award. I have at least one person that I'd really like to nominate, but since I would prefer that person to be in cool Reykjavík for ICALP 2008 instead of sultry Princeton for CAV 2008, I guess I'll have to wait for one year before sending in my nomination :-) Who would you nominate for such an award? Note that
The cited contribution(s) must have been made not more recently than five years ago and not over twenty years ago. In addition, the contribution(s) should not yet have received recognition via a major award, such as the ACM Turing or Kanellakis Awards.
Unfortunately, the CAV organizers decided to fix the dates for their conference so that they coincide with the dates for ICALP 2008, which were known long before theirs. Time will tell whether this was an inspired decision on their part.

Wednesday, December 05, 2007

ICALP 2008: Second Call for Papers

The second call for papers for ICALP 2008 has just been posted to several mailing lists. The chances are that your mailboxes will be flooded by a good number of copies of the call for papers, but, in case this does not happen, you have been notified via this post :-) (In fact, I was quite amazed to discover that this modest blog is featured in the Theory of Computing Blog Aggregator, and so the odd post of mine might even be read by a CS theorist or two.)

In fact, submissions to ICALP 2008 are already open! To submit, please follow this link. Registration for the conference and affiliated workshops will be open very soon, allowing prospective participants to book flights and accommodation at a decent price. Iceland is a hot tourist destination in July.

Tuesday, December 04, 2007

ICALP 2008: Co-located Events

The list of events that will be co-located with ICALP 2008 (6-13 July, Reykjavík, Iceland) is now available. We received a record number of workshop proposals, and the number of affiliated events bears witness to the interest in visiting the country that currently hosts me.

A second call for papers for ICALP 2008 will be posted very soon. Sharpen your pencils, and submit your best paper to the conference!

Sunday, December 02, 2007

10 Years of Verification in Cachan: Part I

On November 26 and 27, Anna and I were in Paris for 10 Years of Verification in Cachan, a two-day workshop organized by the Laboratoire Spécification & Vérification (LSV) of the ENS Cachan to celebrate its 10th anniversary. The workshop was centered around two special award ceremonies honouring two very good colleagues and friends of ours: Patricia Bouyer received CNRS's 2007 Bronze Medal for Computer Science (Monday 26th November), and Kim G. Larsen became Doctor Honoris Causa at ENS Cachan (Tuesday 27th November).

Despite being very busy and getting nowhere fast, we felt that we really ought to make the trip to Paris for this event, to which the organizers had kindly invited us to contribute talks. We were visiting professors at LSV in May 1998, when that laboratory was in its infancy, and we have very fond memories, both scientific and social, from that stay. In fact, our connection with LSV started informally long before that time, when François Laroussinie and I shared an office in Aalborg for about a year starting in the autumn of 1994 (when BRICS started its activities).

Today, there is little doubt that LSV is one of the hotbeds of TCS research in the Paris area, where there is really no shortage of talent and of extremely strong CS departments covering the whole gamut of TCS research. According to Philippe Schnoebelen, the present director of LSV, the centre now has about 40 members and has, since its inception, graduated 33 PhD students, seven of whom have been hired by CNRS. This is just one of the many indications of the success achieved by LSV over the first ten years of its existence. To my mind, another definite sign of impact is the number of former members of the laboratory who have taken up high-profile academic positions elsewhere. Here is what I could find on the LSV web site.

The workshop was a really enjoyable event. We had a great time, listened to some excellent talks, met a lot of colleagues and friends, and tasted some very good food. The organization was simply superb, thanks to the sterling efforts of Philippe Schnoebelen, Thomas Chatain, and Stéphanie Delaune.

Apart from my presentation, which ended the event, the two-day workshop featured talks by André Arnold, Pierre Wolper, Kim G. Larsen, Claude Kirchner, Marta Kwiatkowska, Michel Bidoit, Patricia Bouyer, Anna Ingólfsdóttir, Wolfgang Thomas, and Colin Stirling. This was really an embarrassment of riches, and I learned a lot from all of the talks---even from the two delivered in French :-) The quality of the presentations by these colleagues was invariably high, and the talks offered very accessible introductions to several areas of research covered by the members of LSV.

It would take way too long to report on all of the presentations. In this post and in subsequent ones, I'll therefore limit myself to recalling a few opinions and trivia that I heard at the workshop.

In his talk, 25 years of automata and verification, Pierre Wolper went on record as saying that "Complexity, as traditionally measured, is not very relevant in verification." To wit, he pointed out that verification algorithms with high worst-case complexity turn out to perform well in practice and are widely used. Prime examples are the automata-theoretic algorithms for LTL model checking, as well as the decision procedures implemented in the tool MONA for the Weak Second-order Theory of One or Two Successors (aka WS1S/WS2S)---a theory that is not elementary recursive, as shown by Albert Meyer in this seminal paper. On the other hand, he presented an example from his own work yielding efficient automata-theoretic algorithms for CTL model checking, which are not used in practice. His conclusion was that one cannot trust complexity results. (I myself have mixed feelings about this opinion of Wolper's. Maybe I'll devote a post to this topic when life is less hectic. In the meantime, I'd love to hear your opinion.)

Pierre Wolper also said that his LICS 1986 paper An Automata-Theoretic Approach to Automatic Program Verification (Preliminary Report) with Moshe Vardi, which eventually won the LICS Test-of-Time Award and formed the basis for their paper that won the Gödel prize in 2000, had been rejected by two or three conferences before being accepted at LICS 1986! He also recalled the following sequence of reactions by Gerald Holzmann to their automata-theoretic algorithms for LTL model checking:
  1. "It must be wrong!"
  2. "It is impractical!"
  3. "It does not fit into SPIN!"
Eventually, the algorithm was implemented in SPIN, showing the power of elegant ideas in practice.

At the end of his talk, Pierre called for renewed efforts in implementing automata. I keep telling my students that automata are the most basic computational model in computer science, and I cannot help but share Pierre's call to arms.

I hope to report on a few other talks from the workshop when I have managed to catch up with a few of the items on my to-do list.

Tuesday, November 20, 2007

Acknowledging Author Contributions to Papers

Yesterday, I had a quick look at a news item published on the Italian on-line science magazine Galileo. (Warning: It's in Italian.) The news item describes work on computational biology carried out by Niko Beerenwinkel (ETH Zürich) and co-workers. The authors develop a new mathematical model for the somatic evolution of colorectal cancers and use it to derive an analytical approximation for the expected waiting time to the cancer phenotype.

This all sounds very interesting, but what caught my eye as a casual observer was the following information on page 19 of the resulting paper.

Author Contributions
K.W.K., V.E.V., and B.V. performed experiments; N.B., T.A., D.D., A.T., and M.A.N. developed the mathematical model; and N.B., T.A., D.D., B.V., and M.A.N. wrote the paper.

Have any of you ever written anything similar on any of your papers? I wonder what similar piece of text might accompany a TCS paper. Will we ever see a TCS article stating anything like the following piece of text?

X and Y proved Lemmas 1-4 and Theorem 2, Z proved Theorem 1, contributed ideas to the proof of Lemma 3 and found an error in the original proof of Theorem 1. Author W wrote the paper and was the main driving force in previous joint work that led to the development of this paper.

And what about papers that also report on implementations? I can't imagine how detailed the description of author contributions would have to be then.

I feel relieved to live in a research environment where we collaborate and mainly use alphabetical order amongst the authors---and not just because my surname is Aceto :-) I personally like to work by closely following the Hardy-Littlewood Rules for collaboration, whose spirit is very much at odds with the identification of author contributions.

Friday, November 16, 2007

A Wonderful Read On Gödel's Theorem

I recently finished reading the book Gödel's Theorem: An Incomplete Guide to its Use and Abuse by Torkel Franzén. This is a book I would recommend to all of the readers of this blog. (Not "This is a book I would recommend to all of the readers of this book", as I had written originally. Thanks to Luca Trevisan for spotting this embarrassing statement of mine. Haste generates monsters, alas :-() It is accessible yet precise, and it sets the record straight on what Gödel's theorem implies and what it does not imply. While reading it I quickly learned that, despite my previous reading of texts on Gödel's theorem, I suffered from some of the misunderstandings that Franzén discusses in his lovely book.

To quote from the overview chapter in the book:

The aim of the present addition to the literature on Gödel’s theorem is to set out the content, scope and limits of the incompleteness theorem in such a way as to allow a reader with no knowledge of formal logic to form a sober and soundly based opinion of these various arguments and reflections invoking the theorem. To this end, a number of commonly occurring such arguments and reflections will be presented, in an attempt to counteract common misconceptions and clarify the issues. The formulations of these arguments and reflections, when not attributed to any specific author, are adaptations of statements found on the net, representative of many such reflections.

I think that the book succeeds beautifully in doing just that, and it will repay re-reading.

Torkel Franzén also maintained the page Gödel on the net, from which the book eventually originated.

You might have noticed the use of the past tense in the previous sentence. Unfortunately, I just discovered that the author passed away in 2006. He left behind books that, like this one, have received praise from many sources. I certainly enjoyed reading this book, and I am tempted to open my purse and purchase a copy for my personal library.

Addendum (17 November 2007): I encourage any reader who wants to catch a glimpse of what this book is about, and of the writing style of the author, to have a look at this short article Torkel Franzén wrote for the Notices of the AMS (volume 53, number 4). Here is an example of one of the many things I personally learned from the book.

Let us say that a formal system has an arithmetical component if it is possible to interpret some of its statements as statements about the natural numbers, in such a way that the system proves some basic principles of arithmetic having to do with summation and multiplication. Given this, we can produce (using Barkley Rosser’s strengthening of Gödel’s theorem in conjunction with the proof of the Matiyasevich-Davis-Robinson-Putnam theorem about the representability of recursively enumerable sets by Diophantine equations) a particular statement of the form “The Diophantine equation p(x1, ..., xn) = 0 has no solution” which is undecidable in the theory, provided it is consistent.

Franzén remarks that

...it is mathematically a very striking fact that any sufficiently strong consistent formal system is incomplete with respect to this class of statements...

However, no famous arithmetical conjecture has been shown to be undecidable in ZFC... From a logician's point of view, it would be immensely interesting if some famous arithmetical conjecture turned out to be undecidable in ZFC, but it seems too much to hope for.
Wikipedia has a short list of mathematical statements that are independent of ZFC on this page. You may find it interesting.

Saharon Shelah's result to the effect that the Whitehead problem is undecidable in ZFC is discussed in this paper by Eklof. Shelah has also shown that the problem is independent of ZFC+CH.

There is also an interesting post on a combinatorial problem that is independent of ZFC on the complexity blog. That post also includes several pointers to pieces that are well worth reading.

Thursday, November 15, 2007

Rejecta Mathematica

Silvio Capobianco pointed out to me a new mathematical journal, whose existence he discovered by looking at this opinion of Doron Zeilberger's. The journal is called Rejecta Mathematica, and its description reads

Rejecta Mathematica is a new, open access, online journal that publishes only papers that have been rejected from peer-reviewed journals (or conferences with comparable review standards) in the mathematical sciences.
A unique aspect of Rejecta Mathematica is that each paper will be published together with an open letter from the authors discussing the paper's original review process, disclosing any known flaws in the paper and stating the case for the paper's value to the community.

Is this a joke? This and other natural questions are discussed here. In case you wonder who the editors of this journal are, look here.

I am looking forward to seeing what papers will appear in the first volume of the journal :-)

Tuesday, November 13, 2007

My Web Page Has Moved

My directories in Aalborg are about to be closed down, so after three years I have finally had to move my web pages to Reykjavík University. I had been loath to do so until now because Aalborg ensured adequate visibility on the web for my pages. My web page used to come up as the first link in a Google search for "Luca Aceto" and was one of the top hits in a Google search for "Aceto" (where I am competing with the vinegar producers :-)).

My fears have now materialized. This search does not find my RU web page on the first page, which is not good. I'll have to explain to our web administrator that if Google can't find my home page, I basically don't exist.

Luckily, however, this blog is still visible and from there one can find my home page in a rather roundabout way :-)

Monday, November 12, 2007

The Origins of Bisimulation and Coinduction

I recently finished reading the paper On the Origins of Bisimulation, Coinduction, and Fixed Points by Davide Sangiorgi.

I had the pleasure of hearing Davide give a talk on the topic of this survey paper during a workshop I co-organized in August 2005 in Bertinoro, and I had been waiting for the paper to appear in print ever since. The wait has been amply rewarded by this truly excellent piece of scholarship. I suggest that any reader of this blog grab a copy of this paper and read it. I myself learned something new from it and enjoyed reading it immensely.

In this well-organized paper, Davide takes us on a historical journey tracing the origins of the ideas of bisimulation and coinduction through the three fields where they emerged independently: Computer Science, Modal Logic and Set Theory. He highlights the role played by some key figures in the development of bisimulation and coinduction, but he does not forget to mention the interesting contributions of less well-known players. Each section of the paper ends with a discussion of the ideas it presents and speculates on the reasons why some authors came very close to defining bisimulation but did not take the extra step needed to discover/invent bisimilarity.

The level of scholarship in the paper is outstanding. We need to thank Davide for making the effort to find appropriate sources, to read them so carefully and to present his findings in such a well-organized survey paper.

I have no doubt that the writing of this paper took a lot of work, and I hope that our community will recognize the importance of writing articles like this one. Shouldn't journals accept historical papers of this quality? I believe they should!

Thanks again to Davide for making the effort to write this piece. I firmly believe that we should have more articles like this one and hope that this paper will enjoy a wide readership throughout the TCS community as a whole.

Friday, November 09, 2007

BEATCS Is Open Access

Vladimiro Sassone, the editor in chief of the Bulletin of the EATCS, has posted the appended message on several theory-related mailing lists. I hope that my readers will enjoy reading the Bulletin and will feel enticed to contribute to it by sending in survey articles for its columns, technical articles, conference and workshop reports, position papers, puzzles, obituaries etc.

I have always been a staunch supporter of an open-access Bulletin. A professional society like the EATCS must support the TCS research community in all possible ways, and maintaining an open-access publication like the Bulletin is one of the contributions that the EATCS can make. Compare this with SIGACT News, whose access is restricted to SIGACT members. Is that the best way to serve the TCS community?

I also believe that an open-access Bulletin of the EATCS will entice more colleagues to contribute to it, since their contributions will stand a better chance of being read and appreciated by anyone within the TCS community. This, in turn, will increase the quality of the contributions.


Support the EATCS and the open-access Bulletin by becoming a member of the EATCS! Becoming a member is easy, and only costs €30 for a period of one year. (In fact, it is only €25 if you are also a member of SIGACT.)
To quote from the classic Nike ad: "Just do it!"

----------------------------------

Dear Colleagues,

Since 2003 all issues of the Bulletin of the EATCS, the European
Association for Theoretical Computer Science, have been produced
entirely electronically and made available on the web for members only.

The EATCS is now piloting the idea of Open Access for the Bulletin,
in the spirit of best serving its research community. So, until
further notice the volumes from no 79 onwards of the Bulletin of the
EATCS will be available from

http://www.eatcs.org/publications/bulletin.html

With best regards (and apologies for cross-posting)
\vs




Friday, November 02, 2007

Joining Our "User Group"

In case you are using the book Reactive Systems: Modelling, Specification and Verification for one of your courses, I'd love to hear from you! I'll then mention your course on the list of classes that use the book and you'll be informed once we make supplementary material available on the book's website (or via other channels).

I look forward to hearing from you.

Thursday, November 01, 2007

Submissions for ICALP 2008 Now Open

The submission page for ICALP 2008 is up. See here. (The submission site for track C will be available soon.) Be sure to submit your best paper to the conference!

The deadline for workshop proposals passed yesterday, and I can already say that we will be able to offer a very varied and large selection of workshops. Watch this space and be the first to know which workshops will be co-located with the conference :-)

Thursday, October 25, 2007

Does Presenting Your Results Simply Damage Your Early Career Development?

Over the last couple of days, I have been peeking at Nicole Immorlica's enjoyable live report from FOCS 2007. This comment on a post of hers, amongst others, really got me thinking.

There is another factor going on here, which is that younger people often feel the need to "impress" the audience (both those in and outside their area) with how difficult their result is. For better or for worse, this is to some extent essential for grad students hoping to get faculty jobs, and assistant profs looking ahead to their tenure letters.

Once someone gets tenure, they can decide (depending on their personality) whether to keep up this ruse, or whether to aim to present results as simply as possible.

I agree that there is a general tendency amongst researchers at an early stage of their career to show off the hardness of their results, both in their talks and in their papers. (I, for one, would write all of my early papers differently now.) However, I am not at all convinced that this is essential to foster one's early career, as claimed by the commentator.

When it comes to talks, as in other job-related things, I tend to be a follower of Paul Halmos. (See this compilation of his thoughts that recently appeared in the Notices of the AMS.)

...the purpose of a public lecture is to inform, but to do so in a manner that makes it possible for the audience to absorb the information. An attractive presentation with no content is worthless, to be sure, but a lump of indigestible information is worth no more...

Less is more, said the great architect Mies van der Rohe, and if all lecturers remember that adage, all audiences would be both wiser and happier.

Have you ever disliked a lecture because it was too elementary? I am sure that there are people who would answer yes to that question, but not many. Every time I have asked the question, the person who answered said no, and then looked a little surprised at hearing the answer. A public lecture should be simple and elementary; it should not be complicated and technical.

I like to think that this is the message we should pass on to young researchers from the very beginning of their research careers. Building a reputation as a clear speaker, who can make difficult ideas and results accessible to her/his audience, can only help in one's career.

Isn't it going to be a huge bonus to be remembered as a person who gave a highly accessible and entertaining talk at FOCS/ICALP/LICS/SODA/STOC 2nnn? Why don't we tell our young colleagues that a talk at a generalist conference is essentially a public lecture? The result would be better talks at conferences, a happier audience, and an increased reputation for the speakers.

Wednesday, October 24, 2007

Call For Nominations: EATCS Award

The call for nominations for the EATCS Award 2008 has been published (pdf). The EATCS Award committee consists of Catuscia Palamidessi, David Peleg (chair), and Emo Welzl. Nominations should be sent to David Peleg by December 1, 2007. Further details on the award, including the stellar list of previous awardees, are available here.

Be sure to nominate your favourite TCS researcher! The award will be presented at ICALP'08 in Reykjavík.

Monday, October 22, 2007

Call For Nominations For The Gödel Prize 2008

The 2008 Gödel Prize for outstanding journal articles in the area of theoretical computer science, co-sponsored by the EATCS and ACM SIGACT, will be awarded at ICALP 2008, which I am co-organizing in Reykjavík with Magnús M. Halldórsson and Anna Ingolfsdottir. The call for nominations, with a deadline of 31 January 2008, is now available here.

Make sure that you nominate your favourite paper in TCS for the award! One of the authors of the award-winning paper will attend the conference and will deliver a talk.

Sunday, October 14, 2007

Deadline Approaching: Call for Workshops for ICALP 2008

The deadline for workshop proposals for ICALP 2008 is 31 October 2007. See the call for workshops for details on how to submit a proposal.

Friday, October 12, 2007

NordSec 2007

Over the last couple of days, ICE-TCS and the School of Computer Science at Reykjavík University hosted NordSec 2007, the 12th Nordic Workshop on Secure IT-systems. The workshop featured invited talks by Cedric Fournet (Microsoft Research, Cambridge, UK) and Greg Morrisett (Harvard University, USA). I attended a few talks at the workshop (in between visits to my office to take care of typical daily chores and make sure that my inbox did not overflow), and chaired a session with short presentations.

I found the event interesting and enjoyable. In particular, the two invited talks were excellent. Cedric Fournet presented some joint work with Andrew D. Gordon and Sergio Maffeis reported in the paper A Type Discipline for Authorization in Distributed Systems. In his talk, Cedric addressed the following two key questions.
  1. How can we express good authorization policies?
  2. How can we enforce good authorization policies?
Cedric's tenet in the talk was that logics are good languages for expressing policies, and that type systems can be used to enforce good policies at compile time. He also showed the audience what can be guaranteed at compile time when parts of the system are compromised, and the role that "robust safety" (safety despite compromised principals) plays in reasoning about processes.

Greg Morrisett delivered an inspiring talk on the Cyclone project. Cyclone is a safe dialect of C. It is designed so that pure Cyclone programs are not vulnerable to a wide class of bugs that plague C programs, such as buffer overflows. (Greg said early on in his talk that the legacy of C is, if you allow me to phrase one of his slides as a regular expression, (Buffer overrun)* and that he hates C!) Greg's talk was a great ad for some of the topics I am about to cover in my course on semantics of programming languages, and highlighted how good, time-honoured theory in new clothes can help improve the safety of a beast like C.

I think that his talk gave each member of the audience something to take home, and that's one of the main secrets of a successful invited address. I just wish that more of my colleagues had been there to listen.

CONCUR 2009

Breaking news! I am pleased to announce that CONCUR 2009 will be held in Bologna, and will be organized by Mario Bravetti and Gianluigi Zavattaro. This will be the second Italian CONCUR, after Pisa 1996.

Congratulations to Bologna, which hosts one of the strongest CS departments in Italy and is a hotbed of concurrency-theoretic research, and to CONCUR for visiting a lovely city.

Monday, October 08, 2007

Prize Time

It is the time of the year when prize announcements are out fast and furious.

The ink has not yet dried on the news announcing that the 2007 Nobel prize for medicine has been awarded to Mario R. Capecchi, Sir Martin J. Evans and Oliver Smithies "for their discoveries of principles for introducing specific gene modifications in mice by the use of embryonic stem cells". (I notice that Mario Capecchi is yet another Italian expatriate who, like Fermi and Segrè before him, has become a US citizen.)

In mathematics/CS, László Lovász has received the Bolyai Prize, and Ben Green, Hershel Smith Professor of Mathematics at Cambridge University (UK), will receive the 2007 SASTRA Ramanujan Prize "for his phenomenal contributions to number theory by himself and in collaboration with Terence Tao, that have changed the face of combinatorial additive number theory."

Tomorrow is the turn of the Nobel prize for physics. It'll be interesting to see who receives it. I hope for an award to an Italian in Italy :-)

Sunday, October 07, 2007

Frits Vaandrager's Answers

I have received a contribution from Frits Vaandrager to the discussion arising from the CONCUR workshop. I am very pleased to post his answers here.
Enjoy!


# What is the role/importance of real-time in modelling? Does industry want dense-time or discrete-time models?

When modelling embedded systems and/or protocols, real-time aspects have to be dealt with very frequently. If one prefers not to use a real-time model checker, one can often encode real-time constraints in finite-state model checkers. For instance, in an STTT paper from 2002, Brinksma, Mader and Fehnker analyzed a PLC control schedule using SPIN, and obtained results that were (at that time) competitive with Uppaal. Their trick was to advance the discrete clock variable not one by one, but directly to the point of the next event.
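
By way of illustration, here is a minimal sketch in Python of that trick (my own gloss, not code from the paper; the timer names and values are made up): instead of incrementing the clock tick by tick, jump it straight to the earliest pending deadline.

    def advance_clock(timers):
        # Jump the clock directly to the earliest pending deadline and
        # return that time together with the names of the timers firing there.
        next_deadline = min(timers.values())
        fired = [name for name, t in timers.items() if t == next_deadline]
        return next_deadline, fired

    # Example: two pending timers at time 0.
    now, fired = advance_clock({"send_timeout": 7, "retry": 12})
    print(now, fired)  # prints: 7 ['send_timeout'] -- times 1..6 are never explored

The explored state space then only contains the clock values at which something can actually happen.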

In his PhD thesis, Rajeev Alur has made a good case for dense-time models. In the case of open systems with components that have different clocks, dense time is conceptually the right thing to do. But in practice one often gets away with discrete-time models.

Industry has no preference for dense-time or discrete-time: any approach that solves their problems is fine.


# How does one involve small- and medium-size companies in collaborations with concurrency theoreticians/practitioners? Does "company size" matter?

The maturity of model checkers has increased enormously over the last few years, and as a result it has become much easier to apply them, also for non-experts. I give two examples:

- Allen B. Downey has written "The little book of semaphores", a nice book in which he presents solutions to dozens of concurrency problems, ranging from classical ones (like the barbershop) to tricky and obscure problems (like the Sushi bar). This year I asked a class of first-year computer science students with just a few hours of model-checking experience to pick a problem from Downey's book (a different problem for each pair of students) and to model and analyze Downey's solution for it using Uppaal. As a result, my students spotted several mistakes in Downey's book, mistakes which have been confirmed by the author.

- Matthijs Mekking, a student interested in implementing protocols and with no particular expertise in formal methods, just completed an MSc thesis project at NLnet Labs on the SHIM6 protocol, a proposed IETF standard for multi-homing. At some point he started to model the protocol using Uppaal and became enthusiastic. He managed to find several nontrivial mistakes in the standard and greatly impressed the protocol designers. His results directly influenced the new version of the standard. (See http://www.ita.cs.ru.nl/publications/papers/fvaan/SHIM6/)

SMEs usually don't have verification specialists in house, so they need some advice on modelling and analysis. But with today's technology it is possible to get results rather quickly, and they can do part of the work themselves. The situation will further improve when MSc and PhD graduates with a background in model checking get jobs at these companies. My group is collaborating with several SMEs, and the verification problems they provide us with are often interesting from an academic perspective.

Unlike SMEs large companies are able to invest in long-term research.

# Is there any need for stochastic and probabilistic modelling in applications? More pointedly, have you met an example that you could not model because your tool does not support stochastic or probabilistic phenomena?

Yes!! There is a great need for stochastic and probabilistic modelling and analysis techniques, and I would for instance welcome a tool that combines the functionality of Uppaal and PRISM.

The Zeroconf protocol, for instance, is full of probabilistic features that we could not model using Uppaal. If we really want to make impact in the area of wireless sensor networks, mobile ad-hoc networks, and P2P networks we need to extend model checkers with probabilities since the design of these networks can only be properly understood if we take probabilities into account.

# How can we, as a community, foster the building of industrial-strength tools based on sound theories?

Theory for the sake of theory is good and important, but IMHO the concurrency community has too much of it. In order to really push model-checking technology into industry, the performance and functionality of these tools must be further improved by several orders of magnitude. This can only be achieved by a combined and focused effort of a large team of researchers, and is also a major academic challenge.

In order to test new ideas one needs prototype tools to play with. However, I believe it is not healthy that almost every research group on model checking has its own tool. Only a few groups manage to keep their model-checking tool in the air for more than a decade. Developing an industrial-strength model-checking tool requires a huge effort. I think academic groups have to join forces if they want to build (and maintain!) industrial-strength tools. Uppaal has been highly successful in part due to the continued efforts of teams in Aalborg, Uppsala, Nijmegen, Twente and elsewhere. So why don't the CADP and muCRL teams join forces? Why isn't it possible to establish stronger ties between Uppaal and IF? Why are there so many probabilistic model checkers?

By combining efforts we can be much more effective.

# What has concurrency theory offered industry so far? What are the next steps that the concurrency community should take in order to increase the impact of its research in an industrial setting? And what are future promising application areas for concurrency research?

Its contributions so far are very powerful modeling languages and concepts, and of course model checking.

A major challenge ahead of us is to combine features. In model checking, people have proposed symmetry reduction, partial-order reduction, CEGAR, etc. What about the combination of all these features? How can we prove that such a combination is sound? Can we combine probabilistic choice, hybrid aspects, real time, and hierarchy as in statecharts? I believe it will be possible to define such combinations, but I expect we will need theorem provers and proof assistants to help us manage the resulting complexity.

I am aware of applications in other areas, but I believe computer system engineering will remain the most important application area for concurrency theory in the coming decade.

Friday, October 05, 2007

Joost-Pieter Katoen On The Need for Probabilistic and Stochastic Modelling

I am posting this message on behalf of Joost-Pieter Katoen, who sent me his reaction to one of the questions posed to the panel members during our workshop at CONCUR 2007. Enjoy!

I'd like to answer the question on the need for stochastic and probabilistic modeling (and analysis). Some concrete examples of case studies provided by industry for which probabilistic aspects are very important are listed below. The importance of explicitly modeling random effects stands or falls with the kind of property to be established, of course, so I am definitely not claiming that these examples cannot (and should not) be modeled by techniques that do not support random phenomena.

1. Leader election in IEEE 1394: in case of a conflict (two nodes claim to be the leader), each contending node sends a "be my parent" message and randomly waits either a short or a long time. What is the optimal policy to resolve the contention the fastest? (This turns out to be a slightly biased coin; a simulation sketch of this trade-off follows the list below.)

2. In the Ametist EU project, the German industrial partner Axxom generated schedules for a lacquer production plant. While doing so, they abstracted from many details that the lacquer producer supplies, such as the average fraction of time a resource is not operational, the fraction of (operational) time the resource can be used because the necessary human support is present, and so forth. In their abstraction they scheduled tasks conservatively, and they were interested in whether they could improve their schedules while reducing the probability of missing the deadline. Clearly, stochastic modeling is needed, and indeed it has been carried out using a stochastic process algebra.

3. Hubert and Holger should be able to say much more about a recent project they are pursuing with a French company on the validation of multiprocessor multi-threaded architectures. I do not know exactly what they are investigating, but they use stochastic process algebras to model!
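As an aside on the first example, here is a minimal Monte Carlo sketch in Python of why a biased coin can beat a fair one. It is a deliberate over-simplification of IEEE 1394 root contention: the delay values and the resolution rule are illustrative assumptions of mine, not the standard's timing constants, and the published analyses are far more careful. Still, the sketch shows the mean resolution time being minimised at a coin biased towards the short wait.

```python
# Simplified root contention: two contenders repeatedly pick a short or
# a long wait. Different picks resolve the contention after the short
# wait; identical picks cause another collision, and both retry.

import random

SHORT, LONG = 0.24, 0.57  # illustrative delays (assumed, not the real constants)

def contention_time(p_short: float, rng: random.Random) -> float:
    """Simulate one contention; p_short is the probability of a short wait."""
    t = 0.0
    while True:
        a = rng.random() < p_short
        b = rng.random() < p_short
        if a != b:                   # different waits: contention resolved
            return t + SHORT
        t += SHORT if a else LONG    # same waits: collide again, retry

def mean_time(p_short: float, runs: int = 100_000) -> float:
    rng = random.Random(42)
    return sum(contention_time(p_short, rng) for _ in range(runs)) / runs

for p in (0.50, 0.55, 0.60, 0.65, 0.70):
    print(f"P(short) = {p:.2f} -> mean resolution time {mean_time(p):.3f}")
```

Intuitively, a fair coin wastes too many rounds in the costly long/long case; biasing towards the short wait trades a few extra (cheap) short/short collisions for fewer expensive ones.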

Finally, let me say (as Moshe is also indicating) that the interest in probabilistic modeling is growing steadily. To give an example, the European Space Agency (ESA) is currently considering using probabilistic modeling and analysis in the context of AADL, an architecture specification language in which an important ingredient is the failure rates of components.

All in all, it is fair to say that there is a clear demand for probabilistic techniques!

Joost-Pieter Katoen

CNRS Bronze Medal To Concurrency Theorist

I learned last night that Patricia Bouyer (CNRS researcher at LSV, ENS Cachan) has received a 2007 CNRS bronze medal in section 7. The CNRS bronze medal is awarded for outstanding achievement by a junior researcher.

Congratulations to Patricia, with whom I was lucky enough to work on this paper. At that time, Patricia was a freshly minted MSc student, but her talent was already abundantly clear. I recall that we were trying to show that a property language was sufficiently expressive to describe all of the properties of timed automata that can be checked by reachability analysis in the context of a test process. Patricia constructed a test process that had no counterpart in the property language, and this led to a strengthening of the property language, which we were eventually able to prove expressively complete.

From those early days, Patricia has gone from strength to strength, as witnessed by her impressive output. I believe that there is much more to come from her.

Tuesday, October 02, 2007

Reverse Age Discrimination

Every now and then, I rant on this blog about Italian academia. I do this because of the strong ties I have with my country (or, in typical Italian fashion, I should say "well, actually, with my region, Abruzzo" or even "with my home town, Pescara" :-)), and because I am saddened by the frustration I sense in the Italian colleagues I meet at conferences or exchange emails with.

The latest installment of the saga of Italian academia I have read is the commentary Reverse Age Discrimination, written for Nature Physics by Francesco Sylos Labini and Stefano Zapperi, two physicists based in Rome. Their piece paints a rather bleak picture for young Italian scientists and reports on what they call "Lo Tsunami dell'Università Italiana" (The Tsunami of the Italian University). I encourage you to read their opinions and to look at the statistics they report on here.

The figures are amazing. Italian universities have an unusually large fraction of old professors. In Italy, 41% of the university professors are aged 60 or older and only 4% are below 40. If we consider full professors only, we discover that overall in Italy more than 47% are aged 60 or older, but for physicists this percentage reaches a hefty 64%!

Add to this picture that Italian universities have a very irregular recruitment flow, with sudden bursts of hiring based on seniority rather than on merit, and you will see that the only option left for many talented young scientists is to move to other countries. The commentary reports that this year the French National Research Council (CNRS) selected seven young researchers (CR2) in theoretical physics, of whom four were Italians! If we look at CS, let me go on record once more as saying that the Italians in Paris would form a very strong TCS department by any measure.

Let me close by quoting from the last paragraph in the article.

To conclude, we acknowledge that some older scientists are active and productive, and that European countries should do something to keep them engaged. .... The broad question of how to use the experience of older faculty without hindering the advancement of the younger generation remains an important challenge.

I hope that this is a challenge that Italy's academic establishment will face very soon. The future of Italian science depends on it.

Monday, September 24, 2007

Luca de Alfaro in the News

At CONCUR 2007, I learned that our colleague and member of WG1.8 Luca de Alfaro has recently hit the news for his work on an evaluation of trust in contributions to Wikipedia. (See Demo: Coloring the text of the Wikipedia according to its trust.) Luca's work is based on the idea of colouring the text of Wikipedia articles according to a computed value of trust. The trust value of each word of a Wikipedia article is computed as a function of the reputation of the original author, as well as the reputation of all authors who subsequently revised the text.
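For readers who want the flavour of the computation, here is a toy sketch in Python. It is emphatically not Luca's actual algorithm; I am only paraphrasing the one-sentence description above, and the constants and the update rule are invented placeholders of mine. The idea: a word's trust starts as a fraction of its author's reputation, and each later revision that leaves the word in place pulls its trust towards the revising author's reputation, since surviving a revision is a form of implicit endorsement.

```python
# Toy sketch (not de Alfaro's actual algorithm): per-word trust derived
# from the original author's reputation and from the reputations of the
# authors whose revisions left the word in place.

INITIAL_FRACTION = 0.4   # assumed: fresh text is only partly trusted
ENDORSEMENT_PULL = 0.2   # assumed: strength of an implicit endorsement

def initial_trust(author_reputation: float) -> float:
    return INITIAL_FRACTION * author_reputation

def revise_trust(word_trust: float, reviser_reputation: float) -> float:
    """A surviving word's trust moves towards a more reputable reviser."""
    if reviser_reputation > word_trust:
        return word_trust + ENDORSEMENT_PULL * (reviser_reputation - word_trust)
    return word_trust  # assumed: revisions never lower a word's trust

# A word written by a reputation-0.5 author, then left in place by
# revisers with reputations 0.9 and 0.8 (reputations in [0, 1]):
trust = initial_trust(0.5)
for reputation in (0.9, 0.8):
    trust = revise_trust(trust, reputation)
print(f"final trust: {trust:.3f}")  # mapped to a background colour on the page
```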

This work of Luca's was mentioned in the Washington Post (I cannot locate the link anymore), the Jerusalem Post, ACM TechNews and SFGate.com, amongst others.

Luca will also be the PC chair for FOSSACS 2009 in York, but this will give him less media exposure :-)

Sunday, September 23, 2007

Nadia Busi (1968-2007)

During the CONCUR 2007 excursion, while visiting the Arrábida Convent, we were hit by the sad news that our colleague and friend Nadia Busi had suddenly passed away the night before. A paper co-authored by Nadia had been presented at the conference just before we set off for the excursion.

Our community has lost a very active and creative researcher, a woman who could be an example for others to follow and, above all, a very kind human being.

An obituary for Nadia, written by Roberto Gorrieri, will appear in the October issue of the BEATCS. You can read it here.

"We are such stuff
As dreams are made on, and our little life
Is rounded with a sleep."
(Shakespeare, "The Tempest")

Saturday, September 22, 2007

Guest Post from Jorge Perez

I received this contribution from Jorge Perez, and post it with pleasure.

I have found your reports on the WG1.8 Workshop extremely enlightening. I am especially impressed by the invited speakers' opinions on the role of probabilistic/stochastic modeling. In my opinion, and having little experience on the theoretical side of the topic, their positions are, at the very least, certainly a matter for debate.

The topic of the workshop also reminded me of the spirit of my former research group in Colombia. For many reasons, the group (called AVISPA, see http://avispa.puj.edu.co/) is interested essentially in applied concurrency. This means thinking at the same time about theory and about usable applications, beyond prototypes. Perhaps the most successful application area has been computer music. In collaboration with the French Acoustics/Music Research Institute (IRCAM), AVISPA has designed and implemented formal, concurrent languages to assist music composers in their work. I know this is not properly an 'industrial application', but it could be an example of how concurrency theory can be fruitfully applied.

AVISPA has recently launched a project on applied concurrency: it is called REACT (see http://cic.puj.edu.co/wiki/doku.php?id=grupos:avispa:react), and it aims at consolidating our concurrency languages in several application areas, including security, systems biology and computer music (or, more generally, multimedia semantic interaction).

Sorry for the long post, but the topic is very interesting to me and I wanted to share this (perhaps unusual) approach to applied concurrency.

Friday, September 21, 2007

Some Answers By the Workshop Speakers

In a previous post, I listed the questions that the attendees at our CONCUR 2007 workshop raised to the invited speakers and panel members. I also promised to pen down my recollections of their answers. Here they are; I hope that I won't misrepresent the opinions of these colleagues too much, and I am sure that they will correct me if I inadvertently do so. Anything I write below should be taken with a huge pinch of salt :-)

What is the role/importance of real-time in modelling? Does industry want dense-time or discrete-time models?

Kim G. Larsen (KGL): The vast majority of computing systems in operation today are embedded. Proper modelling and analysis of the behaviour of embedded systems requires a notion of time. However, industry does not really seem to care whether the model of time used is discrete or continuous. When analyzing embedded systems, quantitative analysis techniques are needed, and I expect that stochastics will play an increasing role in the future.

Jan Friso Groote (JFG): Basically, industry does not know what it wants, and there is little point in chasing its ever-changing needs. Concrete modelling and analysis of computing systems should be based on a uniform calculus, which is rich enough to model the problem scenarios at hand. As far as real time is concerned, it should be expressible in the uniform calculus.

How does one involve small- and medium-size companies in collaborations with concurrency theoreticians/practitioners? Does "company size" matter?

Both JFG and KGL reported on several examples of interaction with industry in which there seemed to be no relation between "size" and the success of the interaction. Kim described a successful collaboration on the testing of GUI applications with a one-person company having basically no technological expertise. This was contrasted with the collaboration with Bang & Olufsen, which was a failure despite the resounding success of their first cooperation.

Jan Friso highlighted the successful cooperations with industry within the LaQuSo project. (See here for more information.)

Hubert Garavel (HG) stated that there are three necessary conditions for a successful collaboration with industry. The company should have
  1. a strong interest in quality,
  2. a lot of money and
  3. a formal modelling group in house.
These criteria are difficult to meet in a small- or medium-size company. In particular, the third criterion above is useful because it opens up the possibility of embedding a company engineer within the academic research group.

Is there any need for stochastic and probabilistic modelling in applications? More pointedly, have you met an example that you could not model because your tool does not support stochastic or probabilistic phenomena?

There seemed to be a general consensus here that probabilistic modelling is nice, but not necessary. More precisely, none of the speakers had so far met an example that they could not model adequately because their models and tools do not support stochastic or probabilistic phenomena.

JFG stated that he wants to work on research topics that can have applicability in real-life scenarios. He wants to interact with industry mainly as a way to learn what the good and bad aspects of his modelling language and his tools are. He feels that one should push for the use of the same model for verification and performance evaluation. (I have a vague recollection that this opinion was also shared by Kim.)

How can we, as a community, foster the building of industrial-strength tools based on sound theories?

The general feeling amongst the panelists was that our community does not have the infrastructure to support tooling efforts. HG pointed out that the situation is better in France, where the development of tools and languages is supported and recognized by institutions like INRIA. (My note: in hindsight, this is reflected by the success of long-term efforts like those that led to Esterel, Coq and CAML, amongst others.)

The panelists suggested that conferences and journals should be keener to accept tools and case studies as refereed contributions on a par with papers. JFG pointed out that Science of Computer Programming now has a track devoted to "expositions on implementations of and experiments with novel programming languages, systems and methods." The journal's web page also states that "It must be emphasized that papers describing new software tools of relevance to SCP are welcome under the strict condition that the source code of the tools is open." KGL also stated that the TACAS conference series was initiated precisely to give more visibility to tooling efforts in our community.

For a one-slide viewpoint on the economics of tool development, look at slide 72 of Kim's presentation at the workshop.

What has concurrency theory offered industry so far? What are the next steps that the concurrency community should take in order to increase the impact of its research in an industrial setting? And what are future promising application areas for concurrency research?


JFG: Theory does not have much to offer to industry. We should probably view concurrency theory as a nice mathematical theory that need not have any real-world application.

As for what we can do as a CONCUR community to assist in the work on tools, JFG's answer is to ensure that as many students as possible are taught their use and effective application. One of the biggest problems we are facing is that far too few people in industry understand what formal methods and their tools can effectively bring them. Not seeing where the techniques are effective and what they offer, they play it safe and invest their time in other, more pressing needs.

If we teach formal methods, we should teach the most advanced ones that the students can swallow. If they understand the advanced methods, they can apply the more straightforward techniques. Of course, the reverse does not hold. Don't teach them UML and expect them to understand mCRL2. But if you teach them mCRL2, they will have no conceptual difficulty applying UML.

KGL: We should make our techniques fit into the whole system-development process. We should also make sure that existing model-based tools that are already in use by industry have much stronger semantic foundations.

HG: Our community is in trouble! The model of concurrency that is prevalent in industry is Java-like (threads and shared variables). Our foundational beliefs are exotic for industry, and message passing, as used in our process calculi, is not the order of the day. Our major challenge is in pushing the clean model of concurrency we like. Every computer science department in every university should play its part in achieving this aim. Education is one of our best weapons for making industry accept our models and the techniques based upon them.

Saturday, September 15, 2007

Concurrency Column for October 2007

I have just posted the piece that will appear in the concurrency column for the October 2007 issue of the BEATCS. The article, authored by Sergio Maffeis, is entitled Dynamic Web Data and Process Calculi and reports on the use of concepts and techniques from process calculi in the study and analysis of dynamic data.

Enjoy it.

Friday, September 14, 2007

Rance Cleaveland's Answers

It is a great pleasure for me to post some answers by my old mate Rance Cleaveland to the questions raised at the CONCUR workshop. Thanks to Rance for sharing his experience with us!


What an interesting workshop; I deeply regret not being able to attend.

At Luca's gentle suggestion ;-) I thought I would have a go at the questions he posed at the end of his post.


What is the role/importance of real-time in modelling? Does industry want dense-time or discrete-time models?


In my experience with Reactis, my company's testing and verification tool, Reactis customers absolutely need real-time support. This is due to their applications: they work in automotive and aerospace, developing embedded control software. The most widely used commercial modeling languages (Simulink, Stateflow, SCADE) also include real time as an intrinsic part of their semantics.

Ironically, given the sound and fury in the academic community, the industrial people I have interacted with for the most part do not care whether time is discrete or continuous. Sometimes, I have encountered customers who want to do hybrid-systems-style modeling, and for these people continuity is important.


How does one involve small- and medium-size companies in collaborations with concurrency theoreticians/practitioners? Does "company size" matter?



Regarding SMEs (small- and medium-size enterprises ... a common acronym among US policymakers): I think the best way to involve them is via projects funded by third parties (governments, or a large partner). SMEs generally don't have the overheads to support "blue-sky" research, and their investment-return horizons are necessarily of shorter duration. At both Reactive Systems and Fraunhofer, our concurrency-oriented SME collaborations have either involved collaborations on government research grants or project work on behalf of a larger customer. In the latter cases, it was important that we work in commercial notations (e.g. Simulink) rather than research-oriented ones.

Large companies do have resources to put into more basic research, but there is another phenomenon to be aware of: researchers in these companies often view outside researchers as competitors for their internal research funds. So collaborations with these organizations are highly dependent on the personal connections between company and non-company researchers. So-called "business unit" personnel are often the easiest to deal with, but in this case there needs to be a clear, typically short-term, pay-off to them for the collaboration.


Is there any need for stochastic and probabilistic modelling in applications? More pointedly, have you met an example that you could not model because your tool does not support stochastic or probabilistic phenomena?


We support simple probabilistic modeling in Reactis in the form of probability distributions over system inputs that we sample when creating tests. This feature, however, is almost never used by our customers. The reasons for this mostly boil down to the lack of training these engineers receive in stochastic modeling and control, which in turn is tied to the lack of a good (or maybe standard) theory of stochastic differential equations.

More precisely, the engineers in automotive and aero that I've dealt with are usually mechanical or electrical engineers with backgrounds in control theory. The feedback control they use relies on plant models (i.e. "environments") being given as differential equations, which are deterministic. The plant models they devise for testing their control-system designs often have parameters that they tweak in order to test how their ideas work under different conditions.

These engineers talk in the abstract about how useful it would be to develop analytical frameworks for probabilistic plants, but tractable theories of probability spaces of differential equations are unknown, as far as I can tell.
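To make the feature Rance describes concrete, here is a minimal sketch in Python of sampling test inputs from per-signal probability distributions. The signal names and distributions are invented placeholders of mine; this is not Reactis's input format or implementation.

```python
# Minimal sketch: attach a probability distribution to each input signal
# of the system under test and sample all of them at every test step.

import random

rng = random.Random(7)

# Assumed input profile for a hypothetical cruise-control model:
input_distributions = {
    "vehicle_speed": lambda: rng.gauss(mu=90.0, sigma=15.0),  # km/h
    "set_button":    lambda: rng.random() < 0.05,             # rarely pressed
    "brake_pedal":   lambda: rng.random() < 0.10,
}

def generate_test(steps: int) -> list[dict]:
    """One test = a sequence of input vectors, one sample per signal per step."""
    return [{name: draw() for name, draw in input_distributions.items()}
            for _ in range(steps)]

for step, inputs in enumerate(generate_test(steps=3)):
    print(step, inputs)
```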


How can we, as a community, foster the building of industrial-strength tools based on sound theories?


To have an industrial following, tools have to work with languages that industry uses. For most research tools this is a problem, because their input languages are typically invented by the tool developers.

I see two possibilities. One is to work on commercial languages such as Simulink. These languages are often a catastrophe from a mathematical perspective, but they also usually contain subsets that can be nicely formalized for the purposes of giving tool support. If tools have a nice "intermediate notation" into which these cores can be translated, then this offers a pathway for potential industrial customers to experiment with the tools.

The second approach is to become involved in standardization efforts for modeling languages. UML 2.0 has benefited to some extent from concurrency theory, but there are many aspects of that language that remain informal and imprecise.


What has concurrency theory offered industry so far? What are the next steps that the concurrency community should take in order to increase the impact of its research in an industrial setting? And what are future promising application areas for concurrency research?


I think the best way to answer the first question is to "trace backward" from commercial tools / modeling languages that have some basis in concurrency. Such tools include those based on Statecharts (Stateflow, STATEMATE, BetterState); others based on Message Sequence Charts (Rational Rose, other UML tools); the French synchronous-language tools (SCADE, Esterel); tools that include model checkers (the EDA = "electronic design automation" industry); and tools that use model-checking-based ideas for other analyses (Reactis, DesignVerifier).

Unfortunately, the process-algebra community has had relatively little impact on commercial tool development. This is not due to shortcomings in the theory, in my opinion, but to the inattention that compositionality continues to receive in the (non-research) modeling community. In my experience, event-based modeling is also relatively uncommon, at least in auto and aero: sampling of "state variables" is the preferred modeling paradigm.

I personally would like to see work on semantically robust combinations of specification formalisms (e.g. MSCs + state machines, or temporal logic + process algebra) and related tools; theoretically well-founded approaches to verifying systems that use floating-point numbers; and compositional, graphical modeling languages (lots of work has been done already, but there is still no commercial interest).

Moshe Vardi's Answers

Moshe Vardi has kindly sent me some brief answers to the questions I posted yesterday. I post them here in order to foster further discussion, and thank Moshe for his contribution.

  • What is the role/importance of real-time in modelling? Does industry want dense-time or discrete-time models?
Moshe: Complex hardware designs today are locally synchronous, but globally asynchronous. Using multiple clocks to specify temporal properties is very important, but I personally have seen, so far, no application that required dense-time reasoning. (Perhaps the timing analysis of circuits requires dense time?)
  • How does one involve small- and medium-size companies in collaborations with concurrency theoreticians/practitioners? Does "company size" matter?
Moshe: Large companies can afford to think longer term and invest in internal/external research. This is much harder to do with small companies, which typically look for immediate solutions to their problems.
  • Is there any need for stochastic and probabilistic modelling in applications? More pointedly, have you met an example that you could not model because your tool does not support stochastic or probabilistic phenomena?
Moshe: I have not seen it so far, but I do hear people starting to talk about probabilistic behavior of computer circuits. Perhaps we'll see a growing need for stochastic modeling in a few years.

  • How can we, as a community, foster the building of industrial-strength tools based on sound theories?
Moshe: Academia can rarely build industrial-strength tools. Academic tools are often the work of a single graduate student. Industry can put several PhD-level people on a single project.

  • What has concurrency theory offered industry so far? What are the next steps that the concurrency community should take in order to increase the impact of its research in an industrial setting? And what are future promising application areas for concurrency research?
Moshe: IMHO, the biggest successes of the concurrency-theory community, broadly conceived, are model checking and the development of synchronous languages. At the same time, many research directions in concurrency theory, such as process calculi and bisimulation theory, have had fairly minimal impact. The theory community is often attracted to research on the basis of its theoretical appeal, which does not always correlate with its industrial applicability. This does not mean that industrial applicability should be the guiding principle of research. Nevertheless, it would be worthwhile to pause every few years and examine the potential applicability of theoretical research.