I don't blog much in summer. That's mostly because I am either enjoying some time off or I am busy figuring out "new stuff".

So here's a bit of a hint at what currently keeps me busy. If you are reading this in an RSS aggregator, you had better come over to the website for this explanation to make sense.

This page here is composed from several elements. There are navigation elements on the left, including a calendar, a categories tree, and an archive link list, all built from index information in the content store. The rest of the page, header and footer elements aside, contains the articles, which are composed onto the page based on a set of rules, and there's some content aggregation going on to produce, for instance, the comment counter. Each of these jobs takes a little time, and they are worked on sequentially; while the data is acquired from the backend, the website (rendering) thread sits idle.

Likewise, imagine you have an intranet web portal that's customizable and gives the user all sorts of individual information: the items on his/her task list, the unread mails, today's weather at his/her location, a list of current projects and their status, etc. All of these are little visual components on the page that are individually requested, and each data item takes some time (even if not much) to acquire. Likely more than here on this dasBlog site. And all the information comes from several distributed services, with the portal page providing the visual aggregation. Again, usually all these things are worked on sequentially. If you have a dozen of those elements on a page and it takes unusually long to get one of them, you'll still sit there and wait for the whole dozen. If the browser times out on you during the wait, you won't get anything, even if 11 out of 12 information items could have been acquired.
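The arithmetic behind that failure mode is worth spelling out. A minimal sketch, with entirely hypothetical latencies, of why one slow backend can cost you all twelve items when the work happens in sequence:

```python
# Hypothetical latencies (seconds) for a dozen portal widgets;
# eleven backends are quick, one is pathologically slow.
latencies = [0.1] * 11 + [30.0]

# Worked sequentially, every request waits its turn, so the page
# needs the full sum before it can render anything.
sequential_total = sum(latencies)  # roughly 31.1 seconds

browser_timeout = 20.0

# The browser gives up before the sequence completes, so zero items
# are delivered, even though eleven of them were cheap to get.
delivered = len(latencies) if sequential_total <= browser_timeout else 0
print(delivered)  # → 0
```

Run in parallel, the same page would be bounded by the slowest item you are still willing to wait for, not by the sum of all of them.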

One aspect of what I am working on is having all twelve of those things done at the same time, allowing the rendering thread to do a little bit of work whenever one of the items is ready, and allowing the page to proceed whenever it loses its patience with one or more of those jobs. So all of the data acquisition work happens in parallel rather than in sequence, and the results can be collected and processed in random order, as they become ready. What's really exciting about this from an SOA perspective is that I am killing request/response in the process. The model sits entirely on one-way messaging. No return values, no output parameters anywhere in sight.
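To make the shape of that model concrete, here is a minimal sketch in Python (the actual work is on a different platform, and all names here are mine). Each fetch is fire-and-forget: it takes no return value and instead posts a one-way message to a queue, and the collecting side drains that queue in whatever order results arrive, until its patience runs out:

```python
import queue
import threading
import time

def fetch(name, latency, outbox):
    """Simulated data acquisition. Note: no return value -- the result
    travels as a one-way message posted to the outbox queue."""
    time.sleep(latency)
    outbox.put((name, f"data for {name}"))

def gather(items, patience):
    """Kick off all fetches in parallel, then collect whatever arrives
    before the patience budget is exhausted, in arrival order."""
    outbox = queue.Queue()
    for name, latency in items:
        threading.Thread(target=fetch, args=(name, latency, outbox),
                         daemon=True).start()
    results = {}
    deadline = time.monotonic() + patience
    while len(results) < len(items):
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break  # patience exhausted; proceed with what we have
        try:
            name, data = outbox.get(timeout=remaining)
            results[name] = data
        except queue.Empty:
            break
    return results

# Eleven fast widgets and one pathologically slow one; with half a
# second of patience the page proceeds with eleven results instead of
# stalling on the twelfth.
items = [(f"widget{i}", 0.05) for i in range(11)] + [("slowpoke", 10.0)]
print(len(gather(items, patience=0.5)))  # → 11
```

The queue is what stands in for the one-way messaging channel here: nothing in the system asks a service for an answer and blocks on it; producers emit messages, and the aggregating side reacts to them as they show up.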

In case you wondered why it is so silent around here ... that's why.
