For a while now we've seen various debates around microservices: how they compare to SOA, whether the emphasis should be on size (micro), whether HTTP (and REST) is the preferred communication style, and where, why and when you should adopt them, as well as when you shouldn't. The list goes on and on, and I've participated in a few of these debates.
Recently at work we've been focusing on how best to consider microservices within an existing architecture, i.e., how, why and when to break down so-called monoliths into microservices. A number of our teams have been involved in these discussions, including the Vert.x, WildFly Swarm and OpenShift teams. We've made great progress, but this article isn't about that work - I'll leave it to the various teams and others to report once it's ready.
However, during this work I also went on vacation, and that gave me time to ponder life, the universe and everything microservices related! During the time away I kept coming back to two fundamental questions. The first: why use microservices? The second: how can end-users tell if they're being used to (re-)construct (distributed) applications? Much of what we've heard about microservices has been from the perspective of the developers who will use them, not necessarily the end-users of the (re-)architected applications. And of course you're probably asking a third: how does all of this relate to subatomic particles? Patience, and all will be revealed.
To answer the first question: there are a lot of reasons why people, vendors, analysts etc. suggest you should consider microservices, either as a building block for new applications or, as seems more common at the moment, as a way of refactoring an existing application or service(s) which may be monolithic in nature. At the core, though, is the requirement for an architecture which allows constituent components to be developed, revised and released independently of the entire application. The so-called "Pizza Team" approach, for instance, depends on exactly this kind of independence.
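To make that independence concrete, here's a minimal sketch of what one such independently releasable component might look like, written with Vert.x (one of the toolkits mentioned above). The PriceService name, endpoint and payload are hypothetical, purely for illustration; the point is simply that the component owns a single responsibility, runs as its own process and can be rebuilt and redeployed without touching the rest of the application.

```java
import io.vertx.core.AbstractVerticle;
import io.vertx.core.Vertx;

// Hypothetical "price" microservice: one small responsibility, its own
// process, its own release cycle. Nothing else in the application needs
// to be rebuilt or redeployed when this component changes.
public class PriceService extends AbstractVerticle {

    @Override
    public void start() {
        vertx.createHttpServer()
             .requestHandler(req -> req.response()
                 .putHeader("content-type", "application/json")
                 .end("{\"sku\":\"demo\",\"price\":9.99}"))
             .listen(8080); // port chosen arbitrarily for this sketch
    }

    public static void main(String[] args) {
        // Deploy the verticle standalone; in practice it might run in a
        // container managed by something like OpenShift.
        Vertx.vertx().deployVerticle(new PriceService());
    }
}
```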
This leads nicely to the second question: how can you tell whether an application has been developed, or re-architected, using microservices? If you're a user of a service or application, chances are that unless the source code is available to review, and you've got the inclination to review it, "microservices enabled" isn't going to be one of the slogans used to market it. And, in fact, should you care? Ultimately what you're probably more interested in is a mixture of things such as cost, reliability, performance and suitability for purpose. But let's assume you do want to know. How can you tell?
Well, this is where the subatomic particles come in. My university degree majored in physics and computing, I share an equal love for both, and at times when my mind wanders I like to look for similarities between the two areas. In the past, for instance, I've used Heisenberg's Uncertainty Principle to describe weak-consistency transaction protocols. This time around I was again recalling Heisenberg; those of you who have also studied physics, or have a passing interest, will know that the wave-particle duality of subatomic particles cannot be viewed directly but can be inferred, for instance using Young's slit experiment: fire a single "particle" at two slits and observe an interference pattern reminiscent of those produced by wave interference. This is a pretty extreme example of how we can infer the properties of particles we cannot view directly. Others exist, including Rutherford's original experiment to infer the existence of the atomic nucleus; I'll leave that as an exercise for the interested reader, but will say it's a fascinating area of science.
Where all of this comes full circle is here: if you're an end-user of some piece of software that has been well architected, does its job, is released frequently enough for you to do your work efficiently and basically doesn't get in the way, could you tell whether it was architected or re-architected using microservices? The answer in this case is most probably no. But on the flip side, suppose you've been using an application or service which is released too slowly for you (e.g., bug fixes take months to arrive), and maybe requires a rebuild of your code each time it is released. Then let's assume things change: not only do you get updates on a daily basis, but they often fit seamlessly into your own application usage. Does this mean the developers have switched to microservices? Unfortunately the answer is no more definitive than before, because whilst a correct use of microservices would explain it, there are other approaches which could give the same results - despite what you may have read, good software development has existed for decades.
Therefore, without looking at the code, how can an end-user know whether or not microservices are being used, and why does that matter? It matters because there's a lot of hype around microservices at the moment, and some people are making purchasing decisions based on whether or not they are present, so you probably do need some way to confirm. Architecture diagrams are great, but they're no substitute for code; and if you can't see the code, it's tricky to infer one way or the other. On the flip side, maybe as an end-user you really shouldn't care, as long as you get what you want from the application or service? Good architectures and good software architects win out in the end, using a variety of techniques.
Note: yeah, the other obvious analogy between microservices and subatomic particles could be that microservices are the smallest divisible aspects of your application that make sense; you can't really refactor your code smaller than a microservice, in just the same way that you can't go beyond subatomic particles. However, since there are things smaller than subatomic particles, I didn't want to go there.