In this case, however, I’m not talking simply about creating and controlling interoperability from the developer level. Tools and services like Dell’s Boomi or IBM’s Cast Iron have existed for years, and have had some success in delivering more flexibility to integration between applications and services. However, these services are focused on solving the developer’s key issues with integration: how to make sure messages move between components based on a process definition and one or more translations, if needed.
But today application operators see a tangential set of problems, and these problems are becoming increasingly difficult to deal with. For the operators, the problem of interoperability has several parts:
None of this is a shock to most IT operators, but there is one other element that I’ve hinted at before that is creating the rapid expansion of complexity facing operations today, and that is the sheer volume of integrations between software and data elements both within and across organizational boundaries. It’s no longer a good idea to think of individual applications in isolation, or to assume a data element has one customer, or even one set of customers with a common purpose for using that data.
Today we live in a world where almost everything that matters in business is connected by a finite number of degrees of separation from just about everything else in that category. Cloud computing is one driver, but the success of REST APIs is another, as is the explosion of so-called “big data” and analytics across businesses and industries.
We, in business software, exist in large part to automate the economy, in my opinion. The economy is a massive, highly integrated complex adaptive system. Our software is rapidly coming to mimic it.

We need standard operations interoperability
All of this brings me to the opportunity that this interoperability explosion brings to operators and vendors of operations tools alike. If we are going to manage software and data that interoperates as a system at such a massive scale, we need tools that interoperate in support of that system. We need to begin to implement much of what my friend, Chris Hoff, called for five years ago from the security software community:
We all know that what we need is robust protocols, strong mutual authentication, encryption, resilient operating systems and applications that don’t suck.
But because we can’t wait until the Sun explodes to get this, we need a way for these individual security components to securely communicate and interoperate using a common protocol based upon open standards.
We need to push for an approach to an ecosystem that allows devices that have visibility to our data and the network that interconnects them to tap this messaging bus and either enact a disposition, describe how to, and communicate appropriately when we do so.
We have the technology, we have the ability, we have the need. Now all we need is the vendor gene pool to get off their duff and work together to get it done. The only thing preventing this is GREED.
Amen, Chris. That remains as true today as it was then, as far as I can tell. Only now the scope has exploded to include all of application and infrastructure operations, not just security software. While everyone is looking for standards that allow one tool to talk to another, we are missing the bigger picture. We need standards that allow every component in the operations arsenal to exchange events with any other component, within understood guidelines. That may be as simple as setting the expectation that any piece of operations software will expose both an execution API and a notification API.
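To make the execution/notification pairing concrete, here is a minimal sketch of what that shape could look like. This is purely illustrative: the `OpsTool`, `OpsEvent`, `execute`, and `subscribe` names are my own inventions, not any existing standard, and a real implementation would sit behind a network protocol rather than in-process callbacks.

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical sketch: every operations tool exposes two surfaces,
# an execution API (accept commands from other tools) and a
# notification API (emit events that other tools can subscribe to).

@dataclass
class OpsEvent:
    source: str                 # name of the tool that emitted the event
    kind: str                   # e.g. "deploy.completed", "alert.raised"
    payload: dict = field(default_factory=dict)

class OpsTool:
    """A tool with both a notification API and an execution API."""

    def __init__(self, name: str):
        self.name = name
        self._subscribers: list[Callable[[OpsEvent], None]] = []

    # --- notification API: other tools register to hear our events ---
    def subscribe(self, handler: Callable[[OpsEvent], None]) -> None:
        self._subscribers.append(handler)

    def _notify(self, kind: str, payload: dict) -> None:
        event = OpsEvent(source=self.name, kind=kind, payload=payload)
        for handler in self._subscribers:
            handler(event)

    # --- execution API: other tools ask us to act ---
    def execute(self, action: str, params: dict) -> None:
        # A real tool would dispatch on `action` and do the work;
        # here we only emit the resulting event.
        self._notify(f"{action}.completed", params)

# Usage: a monitoring tool reacts to a deployment tool's events
# without either tool knowing the other's internals.
deployer = OpsTool("deployer")
received: list[OpsEvent] = []
deployer.subscribe(received.append)
deployer.execute("deploy", {"service": "billing", "version": "1.2.0"})
```

The point of the sketch is the symmetry: any component can be driven (execution) and observed (notification) through the same two surfaces, which is what lets tools compose into a system rather than a set of point-to-point integrations.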
Another option is a formal event taxonomy and protocol, but that option doesn’t interest me very much. Those standards tend to become outdated quickly and are far too restrictive.
One last thing: John Palfrey and Urs Gasser have written a book on interoperability which I am in the middle of reading. So far, the most interesting aspect of the model they describe is a multi-tiered view of interoperability that supplements data and software interoperability with human and institutional interoperability. The latter two concepts are incredibly important in the new cloud-based systems world.
It’s not good enough to focus on software, protocols and APIs. We have to begin to work together as an ecosystem to overcome the human and institutional barriers to better IT interoperability. Unfortunately, lack of interoperability often benefits software vendors, and as Hoff noted above, the only thing preventing this is greed.
Photo credit: Cory Doctorow