Sunday, June 10, 2012

When did we stop remembering?

Over the past year or so I've been reading articles and papers, and watching recorded presentations, on fault tolerance and distributed systems, most of them produced within the last couple of years. And whilst some of it has been good, a common theme throughout has been the lack of reflection on the large body of work that has been done in this area over the past four decades or more! I've mentioned this issue in the past and had hoped that it was a passing trend. Unfortunately I just finished watching a video, recorded earlier this year, from someone at the "cutting edge" of this space who described all of the issues with distributed systems, fault tolerance and reliability; not once did he mention Lamport's "Time, Clocks, and the Ordering of Events in a Distributed System" (yet he discussed the same issues as if they were "new"), failure suspectors, or the work of Gray, Bernstein and others. The list goes on! If this had been a presentation in the 1970s or '80s then it would have been OK. But in the second decade of the 21st century, when most work in the software arena has been digitised and is searchable, there is no excuse!

4 comments:

  1. Mark:

    Great observation. FWIW, I've seen this often and I attribute this problem, in part, to a lack of "history" being taught/promoted in the field. IOW, we've been at this long enough that I think more time/effort should be devoted to chronicling and curating historical trends and 'markers' in the information field.

  2. Open source opened the gates for both good and bad to pass through... welcome to the new era of ego-driven delivery. This is not about ignorance of the past, it's about arrogance in the present, thrown in with a big helping of NIH syndrome and a sprinkling of cloning.

  3. I graduated with a degree in software engineering in 2004. This means I am young and still have a lot to learn. It also means I can comment on what the current generation of young open source developers was taught at university. We were taught some of the good history, but we didn't spend any decent amount of time on it; the great papers and computer scientists were just a series of names crammed into a single concurrent-systems course, memorised for the exam and promptly forgotten. Most of what we were taught about the history of software development, what was drilled into us, and what stuck, was the ugly side: the $100 million failed projects; the lack of process, lack of design, lack of testing, lack of education, lack of everything in the software industry. We were taught that the most important thing for us to do, if we wanted to take the software industry forward, was to invest in process. At the time, agile was just becoming a thing; we were taught waterfall, and taught that some projects would be better suited to agile. From talking to today's graduates, it seems that they are now taught that agile is the only way to do things.

    Basically, we were taught that software development has a dark and horrible history. And looking back, there were a lot of terrible practices and technologies in use; the timeless gems are few and far between. How are we meant to differentiate between what was good and what was bad? So please forgive us for throwing the baby out with the bathwater. We were taught that history was bad and that process was the answer, not computer science, so we ignored the computer science side and focussed on moving forward with process.

  4. This field doesn't employ historians or real historical reflection; I think that is the major cause of this recycling. Plus: we are still too focused on teaching young students single-core algorithmics for nearly all of their studies (at least in my experience in Germany).
