Inter-Operating: It’s a Real Thing

Integration, interface, and interoperability are confusing buzzwords.

By definition, integration means to add, mix, combine, or unite. An interface, on the other hand, is a border or boundary: a point where two systems, subjects, or organizations meet and interact. Then there’s “interoperability,” which describes the extent to which systems and devices can exchange data and interpret it. Get all three right and you have the most sophisticated approach, and a richer experience at every interface.

In layman’s terms: interoperable systems speak the same language.

Integration, by contrast, is more like having a conversation through an interpreter. (Like going to Canada not knowing how to speak French and needing two Germans to help you communicate: one who speaks English and one who speaks French.) With interoperability, everybody speaks English (or German, or French, or whatever the agreed-upon language is). The point is that the systems can talk to each other with no added complexity or delay.

Every industry has a unique vocabulary, and technology is no exception. The problem: some of those sticky, often-used words take on a life of their own, are easily misunderstood, and can even end up misused to the point of meaninglessness. These so-called “buzzwords” (and their actual meanings) may seem inconsequential on the surface. Still, if we are going to communicate effectively and address challenges in an industry as complex as healthcare, it’s essential that we all clearly understand the terms we’re fighting for, and against. Interoperability is one of the biggest buzzwords in tech today, and people often use it without knowing what it means. So let’s define it.

Interoperability isn’t integration.

People use the words integration and interoperability interchangeably, but there’s a pretty big difference between the two. Integration refers to connecting applications so that data from one system can be accessed by another. Integration involves a third party (in software terms, middleware) that translates data and makes it “work” for the receiving system. In this scenario, information does not take a direct path from point A to point B.

Interoperability is a real-time data exchange between systems without middleware.

When systems are interoperable, they can share information, interpret incoming data, and present it as it was received, preserving its original context.
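The distinction above can be sketched in a few lines of Python. Everything here is hypothetical and invented purely for illustration: the record shapes, the field names, and the middleware_translate helper.

```python
# Integration: System A's proprietary record doesn't match what System B
# expects, so middleware must translate between the two shapes.
record_a = {"pt_name": "Jane Doe", "dob": "1980-01-02"}

def middleware_translate(rec):
    """The third party: maps A's fields onto B's fields."""
    return {"patientName": rec["pt_name"], "birthDate": rec["dob"]}

record_for_b = middleware_translate(record_a)

# Interoperability: both systems agree on one standard shape up front,
# so a record moves from A to B with no translation step at all.
standard_record = {"patientName": "Jane Doe", "birthDate": "1980-01-02"}

def system_b_consume(rec):
    """B interprets the record exactly as it was sent."""
    return f'{rec["patientName"]} (born {rec["birthDate"]})'

print(system_b_consume(record_for_b))      # arrived via middleware
print(system_b_consume(standard_record))   # arrived directly, interoperably
```

Both paths deliver the same information, but the interoperable path has no translation layer to build, maintain, or fail.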

It is about more than semantics.

Currently, most data exchange in our industry is the result of integration, but achieving interoperability is vital to technical operations in the future. Why? The immediate access to information that interoperability makes possible allows for both a complete view of the data and the agility to comply with requests and reporting requirements. And these data-driven activities are crucial to success in a value-based world.

This level of information access is particularly vital for ambulatory surgery centers, which are entities outside of large hospitals (and which typically have their own closed data systems). In an interoperable world, all stakeholders in the continuum can easily access and use the data within other systems, making up-to-date, even up-to-the-minute, information retrieval possible.

The change to true interoperability won’t be an evolution; it’ll be a revolution, requiring a large amount of future-focused thinking. The Kantara Initiative is the most likely place to lead the way with guidelines. Still, the onus is on providers to partner with vendors that support a universal standard. In this way, the future of interoperability is in our hands, and yours. And that’s a sentiment that needs no translation.

Interoperability–An Exchange

In a software sense, and in an optimal implementation, “interoperability is a characteristic of a product or system whose interfaces are completely understood, to work with other products or systems, at present or in the future, in either implementation or access, without any restrictions,” according to Wikipedia.

The definition becomes even more robotic from there: “Semantic interoperability is the ability to automatically interpret the information exchanged meaningfully and accurately to produce useful results as defined by the end-users of both systems.” In my opinion, interoperability means that if you are expecting information from two or more sources, and you get it, and it makes sense, then you have a win on your hands.

Here’s the catch: 

Upon further definition, “Interoperability would allow different systems to work together in their existing state; however, future upgrades, developments, or improvements to any of these products can cause interoperability to cease.”

In short:

  • Interoperability implies exchanges between a range of products (see interface).
  • Interoperable systems work together now, but the future is uncertain.
  • It is a guiding principle rather than a technical specification.
  • Upgrades or product advances can terminate interoperability.

Integration–Full Functionality

With integration, the software products work as one solution instead of passing information between different systems. One system contains the same code and database. Integrated systems work tightly together; the pieces of the whole are one. System updates are easier, as is meeting real-time reporting requirements. Integrated solutions share the same database, so there are no mapping codes between systems, substantially reducing errors and downtime. Any changes are automatically applied to your whole operation. Integration provides a unified user experience that combines data, reporting, and workflow across a single business platform. It is arguably the most unified way a software system can be deployed.
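Here is a minimal sketch of the shared-database idea, using Python’s built-in sqlite3 module; the table, column, and “module” names are made up for illustration. Two parts of an integrated system touch the same database, so an update made by one is immediately visible to the other, with no mapping or synchronization step in between.

```python
import sqlite3

# One system, one database: every module reads and writes the same store.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO patients (name) VALUES ('Jane Doe')")

# The (hypothetical) scheduling module corrects the record...
db.execute("UPDATE patients SET name = 'Jane Q. Doe' WHERE id = 1")

# ...and the (hypothetical) billing module sees the change at once:
# same code, same database, nothing to map or synchronize.
name = db.execute("SELECT name FROM patients WHERE id = 1").fetchone()[0]
print(name)  # Jane Q. Doe
```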

Of Note: 

An integrated system allows a series of products to talk to each other in their current state and provides backward and forward compatibility with future versions of each product within the structure.

  • One uninterrupted system
  • All data is immediately gathered, stored, mediated, and reportable in real time
  • Information is not decentralized; no synchronization is needed
  • Data transfers are reliable and workflow performance is accelerated
  • No mapping updates required; less maintenance
  • Business intelligence reporting is up to the minute

Interface–The Bridge

An interface is like a bridge that lets two programs share information. The information can come from different sources that may use different programming languages. Business systems can send and receive data, but otherwise, they act independently of each other.

An interface doesn’t allow you to sync data between systems in real time. If and when you need to sync data between separate systems, make sure your network is powerful enough to run the sync often enough to stay close to real time.

Another consideration is the maintenance of mapping codes between systems. Mapping codes act as the directory for information moving from one system into another. If any changes are made in either system, your mappings table may have to be updated, or the software might pull information from the wrong place, giving you incorrect data.
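The mapping-table problem can be sketched in a few lines of Python; the code values and the translate helper are invented for illustration. The point is that when a code changes in either system and the table isn’t updated, the lookup breaks instead of delivering correct data.

```python
# Hypothetical mapping table: System 1's codes on the left,
# the codes System 2 expects on the right.
mapping = {
    "SYS1-APPT": "SYS2-VISIT",
    "SYS1-LAB":  "SYS2-LABORDER",
}

def translate(code):
    """Look up System 2's code for a System 1 code.

    If either system changes a code and the table isn't maintained,
    the lookup fails, and at that point the interface is delivering
    errors, not data.
    """
    try:
        return mapping[code]
    except KeyError:
        raise KeyError(f"No mapping for {code}; mappings table needs an update")

print(translate("SYS1-LAB"))  # SYS2-LABORDER
```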

In short:

  • Separate software products communicate with limited capacity.
  • Data is maintained in multiple locations, requiring more administration.
  • Additional steps are needed to exchange data.
  • Mappings must be consistently preserved, monitored, and updated.
  • Real-time synchronization is not available.

What are the differences between nomadicity, portability, and mobility?

  1. Nomadicity is the tendency of a person, or group of people, to move with relative frequency. Leonard Kleinrock and others have written of the need to support today’s increasingly mobile workers with nomadic computing: the use of portable computing devices and, ideally, constant access to the Internet and data on other computers.
  2. Nomadicity means non-restrictive connectivity, or limited connectivity.
  3. Portability means the ability to jump across networks.
  4. Portability means the ability to transfer from one machine or system to another.
  5. Mobility means seamless and wireless connectivity.
  6. Mobility means the ability to move freely and easily.


Portability, on the other hand (at least as the term is used in the computer software domain), concerns the ease with which some software artifact can be made to function correctly in a computing platform environment other than the one for which it was designed. For example, can the software artifact run under a different operating system or execution framework, or on a computer with a different instruction set? How much modification or configuration is required for a given target execution environment? Portability may have some relationship to a software component’s ability to interact with other elements, but it is a distinct concern from interoperability.

Mobility is focused on the ability to run regardless of the system, framework, computer platform, or instruction set. To accomplish mobility with true interoperability, a shared ontology must be present. Software instruction sets can and should behave differently from one another; however, both unstructured data with metadata and fully structured data with defined mappings must behave the same to maintain integrity, speed of delivery, and usefulness to disparate applications, regardless of platform or system.


Welcome to the first day of the rest of your future

The World Wide Web, or as it’s often better termed, the Wild Wild Web: a little history of the mystery

The Web as we know it today is our lifeline. Like our cell phone, it’s hard to imagine life without it. But the Web as we know it today is just the tip of the information iceberg, and to some extent we’re on the Titanic. I know that sounds dramatic, but as I look forward ten years, I see a personalization crisis—to some extent a privacy crisis—unless we do something about it. And I’m not alone. Some of our best minds are hard at work on the issue. This is where you can monitor their progress in “user-speak,” not “techno-speak.” This blog is about the personal Web—the personal cloud, identity management, and regaining control of our personal profiles and information so the Web serves us more effectively.

To do that, we’ll need to explore the iceberg. Today’s Web has come a long way since it first gained traction in the early 90s. Though we all know that the Internet was developed by various defense departments in several countries, early adopters of the World Wide Web itself were primarily university-based scientific departments or physics laboratories such as CERN, Fermilab, and SLAC.

The WorldWideWeb (WWW) project allowed transmission of different kinds of data, not just text, and allowed hyperlinks to be made to any information anywhere. The WWW project at CERN was started to allow physicists to share data, news, and documentation.

The Web is now an always-on, anybody-can-play, all-encompassing resource pool. It’s gone from a trickle of information to a fire hose. I remember first hearing an analogy in 1995 from a professor at the University of Utah, who said, “The Internet and World Wide Web will be the largest library ever in existence, where you can find anything you may want. Someone’s taken all the books off the shelves, piled them up in the middle of the room, and shut the lights off. You have no idea who’s in the library with you, or what they’re looking for.” That’s pretty much true today, too. I’d add that it’s not just that you don’t know who’s in the library with you; the library doesn’t care who you are. The pile of books in front of you should be about what you’re actually interested in. And, importantly, information you’re interested in now, not what you cared about last week.

From early 1990 to 1994, websites intermingled links for both the HTTP web protocol and the then-popular Gopher protocol, which provided access to content through hypertext menus presented as a file system rather than through HTML files. Early Web users would navigate either by bookmarking popular directory pages, such as Berners-Lee’s first site, or by consulting updated lists such as the NCSA “What’s New” page. Some sites were also indexed by WAIS, enabling users to submit full-text searches similar to the capability later provided by search engines.

There was still no graphical browser available for computers besides the NeXT. Who founded NeXT? None other than Steve Jobs, who had been forced out of Apple Computer in 1985. After puttering around for several months, Jobs decided that he would return to the computer industry. He invested $7 million to create a brand-new company, NeXT Inc., which was light years ahead of its competition in the scientific marketplace in terms of its understanding of, and infrastructure for, the human interface. To me, this is more evidence for the theory of the “Steve Factor.” Aside from the several excellent books written about Steve, interested readers might want to check out Outliers, in which Malcolm Gladwell makes excellent observations on the topic of being different, or going beyond one’s peers.

Although the HTTP protocol and human interface innovations were helping to define the Web, there was still something missing. Although HTTP literally means “hypertext transfer protocol,” hypertext was still in its infancy. It had a growth spurt in April 1992 with the release of Erwise, an application developed at Helsinki University of Technology, and in May with ViolaWWW, created by Pei-Yuan Wei, which included advanced features such as embedded graphics, scripting, and animation. ViolaWWW was originally an application for HyperCard from Apple Computer. Both programs ran on the X Window System for Unix.

The turning point for the World Wide Web was the introduction of the Mosaic web browser in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign (UIUC), led by Marc Andreessen. The origins of Mosaic date to 1992. In November 1992, the NCSA established a website. In December 1992, Andreessen and Eric Bina, students attending UIUC and working at the NCSA, began work on Mosaic. They released an X Window browser in February 1993. It gained popularity due to its strong support of integrated multimedia and the authors’ rapid response to user bug reports and recommendations for new features.

The first Microsoft Windows browser was Cello, written by Thomas R. Bruce for the Legal Information Institute at Cornell Law School to provide legal information, since more lawyers had access to Windows than to Unix. Cello was released in June 1993.

After graduation from UIUC, Marc Andreessen and James H. Clark, former CEO of Silicon Graphics, met and formed Mosaic Communications Corporation to develop the Mosaic browser commercially. The company changed its name to Netscape in April 1994, and the browser was developed further as Netscape Navigator. Today we recognize that code, and the efforts behind it, as Firefox from Mozilla. It was at this intersection that we saw the launch of the commercial Web, one driven by corporations and advertising revenue. Another way to look at it is that the current Web is on the supply side of the user relationship. What’s missing is the demand side.

All the milestones, efforts, and breakthroughs of the past couple of decades, pretty much everything we take for granted today, are the supply-side Web. A bright spot on the demand side of the ledger is the Live Web. While today’s Web is the place where everything exists, the Live Web is where things happen. Dr. Windley calls it the Event Web. He sees it as a place where we can realize meshed collaboration on an ad hoc basis. Along with the Live Web, we need a more personal Web. While today’s personalization algorithms are all based on corporations serving up what we needed yesterday, and how we needed it, what’s needed is an infrastructure that helps us define what we want in the moment and trajectories of what we may need in the future. We need more control over what we get from the Web.

This blog is dedicated to an open discussion and comments on how this new Internet of Events―one that is deeply rooted in services―differs from Cloud Computing in general. What is becoming clear is that we are on our way to transforming the undifferentiated Cloud into Personal Clouds and Personal Cloud Services. This blog will speak to Personal Services of all kinds and types. What do you see as the services you’ll need to make the Cloud more useful to you? Perhaps there are efforts underway already to provide those services, and I’ll be able to fill you in on progress. So I need your input. Please ask questions, make observations, and join in our conversations. Thank you!

Posted in Cloud Computing