The Unfulfilled Promise of XML

Michael Vizard

When the XML file format initially came out, it was hailed as a way to finally overcome our interoperability issues. After all, it was a self-describing, data-neutral file format that would make it a lot easier for applications to share data.


Since then, we've seen mainstream adoption of XML, but the vision of XML as a lingua franca or bridge language for interoperability has fallen short. What basically happened is that with so many different industry organizations going off to implement XML on their own, we wound up with a lot of different XML schemas that still need to be integrated. Granted, it's a vast improvement over what we used to have to deal with, but a lot of hand coding and massaging of XML is still required to get two applications to interoperate.
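
To make that concrete, here is a minimal, hypothetical sketch of the kind of hand-written mapping involved. The two vocabularies below (a "Worker" document and an "Employee" document, with made-up namespaces) are invented for illustration, not taken from any real standard; the XSLT 1.0 stylesheet is the sort of one-off glue that still has to be written and maintained for every pair of schemas describing the same data.

<!-- A record in the (hypothetical) vocabulary A -->
<Worker xmlns="urn:example:vocab-a">
  <GivenName>Jane</GivenName>
  <FamilyName>Doe</FamilyName>
  <HireDate>2009-06-01</HireDate>
</Worker>

<!-- Hand-written XSLT 1.0 mapping into the (hypothetical) vocabulary B -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:a="urn:example:vocab-a">
  <xsl:output method="xml" indent="yes"/>
  <!-- Map each field of vocabulary A onto its counterpart in vocabulary B -->
  <xsl:template match="/a:Worker">
    <Employee xmlns="urn:example:vocab-b">
      <FirstName><xsl:value-of select="a:GivenName"/></FirstName>
      <LastName><xsl:value-of select="a:FamilyName"/></LastName>
      <StartDate><xsl:value-of select="a:HireDate"/></StartDate>
    </Employee>
  </xsl:template>
</xsl:stylesheet>

Run it with any XSLT 1.0 processor (for example, xsltproc with the stylesheet and the source document as arguments). Multiply this by every pair of applications, and by every field whose codes, types or nesting don't line up one-to-one, and the integration burden becomes obvious.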


But hope springs eternal. The HR-XML consortium recently announced that version 3.0 of its specification, which is widely used in human resources applications, would finally comply with the XML guidelines defined by the Open Applications Group and the United Nations Centre for Trade Facilitation and Electronic Business (UN/CEFACT). This development should make it easier to integrate HR applications with the rest of the enterprise applications that support XML.


Alas, we still have an array of XML schemas describing any number of processes and documents. Within each vertical industry, we're getting better at defining XML schema standards, but the standards for integrating specific business processes still need a lot of work. And even if those standards are defined better, it's not at all clear that better definitions alone would do much to automate XML schema integration.


So the question is, have we gone as far as we can go in terms of advancing the state of the XML art? Or is this just a moment in time when it only seems like the promise of XML may be unfulfilled, but something profound is right around the corner?



Comments
Nov 3, 2009 10:01 AM Chuck Allen says:

Nice post. I think you've hit the nail on the head. There have been some significant achievements in terms of architectural convergence among XML standards, but there is more work to be done and - in my opinion - not a lot of leadership to see it through.

I was in the thick of the HR-XML/OAGIS convergence -- from selling the idea to developing the new library. While the results are a positive step forward, let's just say this wasn't a very easy sell. It also doesn't mean that major interoperability gains are imminent. At a time when HR-XML should be broadening its influence, it is turning inward. Consider the new license under which HR-XML 3.0 is available.

http://ns.hr-xml.org/schemas/LicenseAgreement.pdf

HR-XML has transitioned from having one of the most liberal licenses (much like simple BSD) to one of the most restrictive. If you take this license literally (which is the way licenses are to be read), it is hard to see that a non-member organization has any rights other than to view the standards, since developing software, creating instances, or publishing documentation based on HR-XML are "derivative" works that only members have the right to produce.

A big part of the problem is that standards organizations are wedded to processes and funding models that are considerably past their prime. The process of drafting specifications over a period of months or years, releasing them, implementing them after the fact, and then issuing new versions based on implementer feedback takes too long and doesn't guarantee that what is produced will work in the real world. Standards organizations need to open up and get iterative and test-driven (all the while maintaining a solid means of handling intellectual property). Unfortunately, this type of change challenges existing funding models. Moreover, there is not enough imagination and too many conflicts of interest for these organizations to make difficult changes.

Dec 14, 2009 11:25 AM Buddy Kresge says:

While XML provides a powerful representation framework, for true interoperability you have to start tackling semantics. There are many ways to accomplish this (e.g., OWL, RDF), but a simple way to inject ontological principles into a schema is to have elements with one and only one meaning -- a very tight meaning. If there are multiple meanings, then you need multiple elements that tease out those differences in meaning.

Models tend to turn into just another object-oriented exercise. Typically, 'standards bodies' focus on 'reusable objects' that are so open that, in order to use them, everyone has to write an implementation guide. For example, create an element called SSN rather than some generic 'PersonIdentifier' where one person fills out the decorating attributes differently than someone else. How are you supposed to have interoperability with such 'built-in chaos'?
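
To illustrate the two styles (these element names are invented for the example, not drawn from HR-XML or any other actual standard): the 'reusable object' approach pushes the meaning out into attributes and implementation guides, while tightly scoped elements carry their meaning in the markup itself.

<!-- 'Reusable object' style: what the value means depends on how each
     implementer fills in the decorating attributes -->
<PersonIdentifier type="national" scheme="US-SSN">123-45-6789</PersonIdentifier>
<PersonIdentifier type="employee" scheme="internal">E-00417</PersonIdentifier>

<!-- One element, one meaning: the semantics travel with the element name -->
<SocialSecurityNumber>123-45-6789</SocialSecurityNumber>
<EmployeeId>E-00417</EmployeeId>

Two instances built the first way can both be schema-valid and still mean different things; with the second style, a validator and a human reader agree on what each element is without consulting an implementation guide.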

For any organization to have interoperability, XML schemas must be designed from the standpoint of relaying semantics, and those semantics cannot be ambiguous. Instead, people tend to turn to a 'reusable object in XML' exercise, which, in my opinion, is doomed from the outset.

The key to success = semantic representation and injecting ontological engineering principles into the design/model.

