I work for Red Hat, where I lead JBoss technical direction and research/development. Prior to this I was SOA Technical Development Manager and Director of Standards. I was Chief Architect and co-founder at Arjuna Technologies, an HP spin-off (where I was a Distinguished Engineer). I've been working in the area of reliable distributed systems since the mid-80's. My PhD was on fault-tolerant distributed systems, replication and transactions. I'm also a Professor at Newcastle University and Lyon.
Sunday, November 29, 2009
RESTful transactions round two
I've been working on a second draft of the RESTful transactions work that I've mentioned before. This time I'm doing it for Bill and his REST-* effort. I revised the original for JavaOne but didn't get a chance to use it in our presentation. So I'm taking this opportunity to apply some standards boilerplate and bring it up to date. Plus it's always good to revisit something you did almost a decade ago and use the benefit of those intervening years and the experience gained.
Monday, November 23, 2009
Enterprise OSGi: two is obviously better than one
I think OSGi is important for several reasons. I think Enterprise OSGi is an interesting approach, particularly as it leverages JEE. I've even contributed to some of the work, for example around the transactions component. JBoss is doing a lot of implementation work around OSGi too.
I have to admit that I haven't been paying close attention to OSGi for a few months. However, I had heard about the new Apache Aries project. Unfortunately I just heard from a friend about the Eclipse Gemini project. Now I've been involved with standards long enough to have experienced first hand the political games that rivals play with each other. It's unfortunate because it rarely benefits users, tending to obscure the reasons for choosing one approach over another, confusing people, and ultimately delaying the uptake of the standard or technology involved.
Maybe I'm missing the underlying reasons why Oracle and SpringSource decided that Aries wasn't the right project for them. However, I really wish that as an industry driven primarily by technologists we could leave the politics behind and try to work far more collaboratively, and particularly where open source is concerned! As a slight aside, that's one of the things I really like about HPTS: it doesn't matter which company you're from, people talk and interact freely to try to better our collective understanding of problems and lessons learnt.
Update: I should point out that in the paragraph above I wasn't siding with Aries over Gemini, simply that Aries started first.
Saturday, November 21, 2009
The future of Java
For one reason or another I've been thinking about the future of Java for a while; more the language than the platform (JEE). I think that the JEE platform will continue to evolve over the coming years, more likely into a series of specific vertical solutions. But this entry isn't about the platform, it's about the language.
Although I've been using Java since it was known as Oak and have written a fair amount with it (for example, I wrote the first ever Java transaction implementation over Christmas 1996), it's never been my favourite programming language. There are some things that I liked about the language from the start, such as threading, but others, such as garbage collection (yes, I like the control of a good delete) and the lack of multiple inheritance, that I didn't. The Java language has certainly evolved over the past decade, mostly for the better (garbage collection is a lot better now and we have generics), but in general the language still takes a lowest common denominator approach. And it has got rather bloated.
Ignoring several assembly languages, over the years I've learnt many high level languages including Pascal, Mesa, Smalltalk-80, Lisp, Prolog, C/C++, Simula, D, Forth and of course Java. My favourite is still C++ though. Yes I know it has its flaws and yes I know it's not the most forgiving of languages. I can't quite put my finger on precisely why I still prefer C++. I remember when we were testing the first pre-releases of Cfront for AT&T back in the mid 1980's and wondering, along with Graeme and Stuart, whether we could port it to our Ataris using Metacomco C. I seem to recall us making some progress, but Linux came on the scene bringing with it gcc. But none of this explains why I prefer C++. Maybe it's the level of direct control it gives you (as with C). Or maybe it's something else. Whatever, I'm sure it's all subjective.
Anyway, I digress. Where does this leave Java? Well I think if you look back over the past 40+ years of high level programming languages one thing is very obvious: change happens. Our ability to reason about complex algorithms and the best way of putting them into code evolves, whether it's declarative, procedural, object-oriented or something else. I think it's fairly clear that Java's dominance in the industry will wane. Yes it'll be legacy for many years to come; the 21st Century COBOL, so it will continue to be important. But something new is coming. Maybe not today and maybe not tomorrow. But it's coming nonetheless.
Friday, November 20, 2009
ArchiteCloud 2010
I have the pleasure of being on the ArchiteCloud 2010 program committee. If you've any papers hiding in your "to do" lists then get them in, as this promises to be a great event! Maybe I can use this as an excuse to visit Australia next year too!
Tuesday, November 17, 2009
Santa is an architect
It's drawing near to that time of the year again when thoughts turn to snow, presents, turkeys and all things festive. So it was that I was watching a program on TV yesterday where Santa was the main character, and my seven-year-old and I began to discuss the ways in which Santa manages to get presents to all of the good little girls and boys around the globe in a single night. Of course we covered all of the usual ideas, such as time dilation, wormholes and even time travel. My son thought that magic was the solution, but I pointed out that these days, what with global warming and the fact that continual use of magic has been shown to harm the environment, it's doubtful. Let's also not forget that magic reindeer produce a lot of CO2 as well as other effluent.
So where does that leave us (apart from with a rapidly disillusioned child)? The answer was obvious: although in the past he's probably used a combination of all of the above techniques (have to placate child), today he's taken a software architecture course and figured out that federation works well and scales. He has millions (billions?) of proxies in each country who do his work for him. He sends them information about what needs getting (in advance of course) and relies on them to buy the presents and distribute them locally. Those proxies may themselves have proxies in a recursive manner. Yes we all know it's the elves who build and distribute the toys to the shops, but it's the masses of proxies that get the delivery work done. And of course these helpers are parents, grandparents etc.
So next time the question arises you'll know the answer: Santa is a coordinator and we're all interposed coordinators in the grand scheme of things ;-)
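For fun, that recursive federation can be sketched in a few lines of Java. This is purely illustrative, and all of the class names here are hypothetical: a coordinator fans work out to its proxies, and a proxy may itself be a coordinator for its own region (interposition).

```java
import java.util.List;

// A deliverer either places a present (a leaf) or delegates further.
interface Deliverer {
    int deliver(String present); // returns the number of children reached
}

// Leaf case: one present placed under one tree.
class Child implements Deliverer {
    public int deliver(String present) { return 1; }
}

// A coordinator delegates to its proxies; any proxy may itself be a
// coordinator, which is exactly the interposition structure in the post.
class Coordinator implements Deliverer {
    private final List<Deliverer> proxies;
    Coordinator(List<Deliverer> proxies) { this.proxies = proxies; }

    public int deliver(String present) {
        int total = 0;
        for (Deliverer d : proxies) total += d.deliver(present);
        return total;
    }
}

public class Santa {
    public static void main(String[] args) {
        Deliverer household = new Coordinator(List.of(new Child(), new Child()));
        Deliverer country = new Coordinator(List.of(household, new Child()));
        Deliverer santa = new Coordinator(List.of(country, country));
        System.out.println(santa.deliver("toy")); // 6 children reached
    }
}
```

Santa only ever talks to his top-level proxies; the fan-out happens recursively below him, which is why the scheme scales.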
Monday, November 09, 2009
In-memory durability and HPTS
Back in the 1980's when I was writing the proposal for my PhD work I was looking at various uses for replication (at that point strong consistency protocols). There are a number of reasons for replicating data or an object, including high-availability, fault tolerance through design diversity and improving application performance. For the latter this could include reading data from a physically closer replica, or one that resides on a faster machine available through a faster network path.
But in terms of how replication and transactions could play well together, it was using replicas as a "fast backing store", i.e. a highly available in-memory log, that seemed the logical thing to concentrate on. We certainly had success with this approach, but the general idea of replication for in-memory durability didn't really seem to take off within the industry until relatively recently. I think one of the important reasons for this is that improvements in network speeds and faster processors have continued to outstrip disk performance, making these kinds of optimization less academic and more mainstream. So it was with a lot of interest that I listened to presentation after presentation at this year's HPTS about this approach. Of course there were presentations on improving disk speeds and using flash drives as a second-level cache too, so it was a good workshop all round.
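To make the "replicas as a fast backing store" idea concrete, here's a minimal Java sketch (all names hypothetical, and with network sends replaced by local method calls): instead of forcing a log record to disk, a record is considered durable once a majority of in-memory replicas have acknowledged it.

```java
import java.util.ArrayList;
import java.util.List;

// One in-memory copy of the log; 'up' simulates whether the node is reachable.
class Replica {
    final List<String> records = new ArrayList<>();
    private final boolean up;
    Replica(boolean up) { this.up = up; }

    // In a real system this would be a network send plus an acknowledgement.
    boolean append(String record) {
        if (!up) return false;
        records.add(record);
        return true;
    }
}

// The record survives a single node failure once a majority hold it,
// which is the sense in which memory replication substitutes for disk.
class ReplicatedLog {
    private final List<Replica> replicas;
    ReplicatedLog(List<Replica> replicas) { this.replicas = replicas; }

    boolean append(String record) {
        int acks = 0;
        for (Replica r : replicas) {
            if (r.append(record)) acks++;
        }
        return acks > replicas.size() / 2; // durable on majority ack
    }
}
```

The appeal is that a round trip to a nearby replica's memory is far cheaper than a forced disk write, and (as noted above) that gap has only widened as networks and processors have outpaced disks.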