Open Documents Panel at GOSCON

I sat on a panel today at the Government Open Source Convention with folks from SUN, IBM, Adobe, and the OpenDocument Foundation. I have been remiss in blogging as of late due to an absolute tidal wave of work that has taken time away from posting my thoughts. I'm sure some of you have been deeply disappointed with my silence. :-)

Before I share my thoughts on the panel - I would like to thank Andy Stein (Director of IT for the City of Newport News, VA) for doing such a great job moderating the panel. Unlike some panels where things rat-hole and the breadth of an issue doesn't get touched upon - Andy kept us in line and moving through topics. Not easy with five opinionated people sharing the stage.

There were two things that struck me from the conversation:

  1. The discussion around a single format is clearly still very active - but seems more and more disconnected from reality every day.  
  2. Detractors of Open XML are so intent on tearing down the standard, they have lost sight of the broader goals around open documents.

Single Format:

The panel started with a presentation from the OpenDocument Foundation. This is an organization whose charter states that they will promote ODF. (I almost want to write that twice...) They are now saying that commercial interests have so distorted the goals of ODF that they have decided a different document format is really the way to go. They are advocating a move to the W3C Compound Document Format.

Ok, I know Rob Weir has already written a big piece on this and is working hard to distance the real ODF crowd from the...well...unreal ODF crowd. More on this later.

As I mentioned, the panel also had Jim King, the technical lead for Adobe's ISO efforts on PDF. Ah, here is a different document format that is rather broadly used as well. I would question Jim's statement that there are more documents in PDF than in any other format. That is most certainly true online - but saved on hard drives around the world? My guess is that MS Office formats would be more common - but that is conjecture on my part. But I digress. The fact is, they have done innovative work on document formats of a different sort and are standardizing that work. But, Jim went on to share the fact that they too have an "XML-friendly" document format up on their site for you to use.

And, there is ODF.

And there is UOF.

And there are >100 vertical industry document formats using XML.

And (and this is a biggie) these formats progress and change over time, so ODF 1.0 docs will need to be handled by apps built only to work with ODF 1.2 (for example).

And...

So, I am a bit baffled by the "there should be only one" discussion still going on. Is it possible that document formats represent a different space than something like a network protocol? That ultimately, the document formats really are a representation of the features/functions of the apps that use them, and that since this is software - translation really is the answer? Hasn't that always been the answer? Isn't that why software is SO VERY different from the physical world?
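To make the translation point concrete, here is a minimal, purely illustrative sketch (Python standard library only) of the smallest unit of work a converter does: mapping one ODF paragraph element to a rough WordprocessingML equivalent. The namespace URIs are the real ones; the function name and sample input are hypothetical, and a real converter would also have to handle styles, lists, tables, embedded objects, and version drift (ODF 1.0 vs. 1.2, and so on).

    # Illustrative only: translate a single ODF <text:p> into a
    # WordprocessingML <w:p>. Real converters map far more than plain text.
    import xml.etree.ElementTree as ET

    ODF_TEXT = "urn:oasis:names:tc:opendocument:xmlns:text:1.0"
    WML_MAIN = "http://schemas.openxmlformats.org/wordprocessingml/2006/main"

    def odf_paragraph_to_wml(odf_p: ET.Element) -> ET.Element:
        """Build a <w:p> containing one run with the paragraph's flattened text."""
        w_p = ET.Element(f"{{{WML_MAIN}}}p")          # <w:p>  paragraph
        w_r = ET.SubElement(w_p, f"{{{WML_MAIN}}}r")  # <w:r>  run
        w_t = ET.SubElement(w_r, f"{{{WML_MAIN}}}t")  # <w:t>  literal text
        w_t.text = "".join(odf_p.itertext())          # flatten any nested spans
        return w_p

    # Example: one ODF paragraph in, one OOXML-style paragraph out.
    odf_xml = f'<text:p xmlns:text="{ODF_TEXT}">Hello, GOSCON.</text:p>'
    print(ET.tostring(odf_paragraph_to_wml(ET.fromstring(odf_xml)), encoding="unicode"))

The point of the sketch is not that translation is trivial - it isn't - but that it is ordinary software work, which is why multiple coexisting formats plus good translators is a plausible end state.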

Tearing Down Instead of Building Up

I really don't want to point fingers or get into an issue of personal style here. The panel members were all professionals working on things they care deeply about. All good on that point. I am just not convinced that putting so much effort into tearing down Open XML actually achieves the stated goals of those interested in the benefits of open documents.

Open XML will be broadly used throughout the world. I don't think this is in question today. Be it through the negative lens many apply to the adoption rate of Microsoft Office - or through my own bias toward the fact that we are seeing hundreds of implementations, with the promise of many, many more - it is going to be used.

So, we get back to a fundamental question - is it a good thing for Open XML to be a standard vs. something completely controlled by Microsoft?

Before many of the pro-ODF crowd flame me - I reject your claims that it is still under our control today. The TC at Ecma working on this is made up of 20+ organizations, and a long-term maintenance plan has already been proposed to keep the JTC-1 and Ecma specs in line with each other (snarky aside: this has not been the case with ODF 1.1 and 1.2...maybe it will happen with 1.3).

Given the global adoption of Open XML, it is fundamentally better to have the specification available as an international standard. The core team focused on tearing down Open XML is acting against the interests of tens of thousands of organizations, millions of users, and certainly most governments around the world.

 

Anyway - it was an interesting day today. Thanks to the other panel members for a really sharp conversation.

Comments

  • Anonymous
    October 16, 2007
Bad name for a format: "Open XML" is totally confusing... it suggests that there is a "closed" XML. Call it Office Open XML (a bad name too, but not as bad).

  • Anonymous
    October 17, 2007
I can see the need for new standards when the old ones don't fit anymore, or when new technology emerges, e.g. 802.11B/802.11G. However, here there's one current problem to be solved: universal access to documents. Just like 802.11G solves current wireless networking needs, one document format could have been sufficient, if the problem is defined correctly. I remember Microsoft pushing Intel and AMD to solve 64-bit computing in new processors in one way, so you could build one version of Windows for both AMD and Intel, so MS knows very well how a single standard can help. I must assume that Microsoft has looked at bringing their document formats to a standards body earlier, but has decided that there is more to gain from keeping the formats proprietary. I don't know where the idea came from to bring the proprietary StarOffice format to OASIS or ISO, but the move appeared to be a brilliant one. It's this brilliance that you are trying to undo with a lot of paper and brute force. You keep saying that there are a lot of people trying to tear OOXML down; however, I think it's the other way around. If you look at http://www.consortiuminfo.org/standardsblog/article.php?story=20071016092352827 you will see that the intense lobbying MS has done over the past few months is now obstructing the work of the entire SC34 committee. So while others may focus on keeping OOXML out of the door, Microsoft doesn't mind obstructing the work of SC34 only to get its proposal through. The least you could have done is inform those new P-members about some basic obligations they have. As I said earlier, I know nothing about the inner workings of either format; however, your tactics make me think you don't have much faith in OOXML yourself, otherwise you could just let the process handle itself.

  • Anonymous
    October 17, 2007
Well, all I can say is, better you than me, pal... After reading the posturing from the foundation folk, and Rob Weir's lament, plus the composition of that panel (and the hype about what the foundation would be offering), all I could wonder about was what you would have to report afterwards. Thanks for the rapid response. I look forward to your thoughtful reflections on aspects that you want to follow up on. I think I was fortunate to be engaged in standards work that, while contentious, was not so poisonous as what you have to face. Thanks for the commitment and civility that you maintain.

  • Anonymous
    October 17, 2007
Henk - thanks for the comment. I am more than happy to be educated more in this space, but I really wonder whether there are different classes of technologies that get standardized. Clearly, a foundational point of discussion about standardization is the idea of consolidation of factors to drive a common approach to a problem. Ok, that is the argument for having a single standard. Yet, that is balanced by the points about innovation and the power of market-based darwinism. The act of creating a standard does not mean that it is market-viable. There are tons of examples of places where multiple solutions begin the game, but time and the market drive the consolidation rather than the standards themselves. So then, that leads to the next point: the role of a document format can be viewed either as something to be unified by all vendors (not a reality today in the least), or as a reflection of the innovations in the tools. If the latter is more true (which is clearly the case today), then is the consumer not better off with document formats being standardized (many of them) so that translation is done better in the future than it was in the past? Dennis - Thanks for the sympathy. It was a good conversation, clearly some differences of opinions, but good overall. If you have issues you want me to address in the blog, I'm always open to suggestions. Jason

  • Anonymous
    October 17, 2007
    The comment has been removed

  • Anonymous
    October 17, 2007
    The comment has been removed

  • Anonymous
    October 17, 2007
    Jason, Your analogy between document formats and network protocols is very interesting, and comparing the two strikes me as very informative:
      • Modern computer networks have a layered set of protocols, with an interoperability protocol (IP) to smooth over the differences between the myriad of hardware protocols (ethernet, token ring, modems...). This allows efficient implementations to be developed for diverse low-level situations, while allowing perfect, effortless interoperability.
      • The above is possible because networking protocols all share the common goal of delivering unprocessed, arbitrary blocks of binary data. Any attempt to develop strong translation between document formats - especially if you want a multiplicity of formats - will require such a common goal to be developed, and will most likely require a standardised meta-format.
      • Networking protocols need a meta-format rather than a single format because different hardware protocols serve different use cases (e.g. ethernet is short-range, modems are long range). I don't think we know yet whether this is actually the case for document formats, and I suspect we'll spend the next 10-20 years finding out.
    - Andrew
  • Anonymous
    October 17, 2007
    The comment has been removed

  • Anonymous
    October 17, 2007
Simon - I have a habit of typing "SUN" instead of "Sun" and I have no idea why. I will stop that - it is a quirk and I'll fix it. Maybe I think you guys should be more declarative about your name. Think big, impressive music, and then say SUN! :-) Thx Jason

  • Anonymous
    October 17, 2007
    The comment has been removed

  • Anonymous
    October 18, 2007
Jason wrote: ================ start ========================== If the latter is more true (which is clearly the case today), then is the consumer not better off with document formats being standardized (many of them) so that translation is done better in the future than it was in the past? ================================= end ============ I'd think that simply documenting the format would have been enough to facilitate the writing of translators. It might even have been a good idea. I have always thought, but correct me if I'm wrong, that the main reason to standardize something like OOXML is the almost automatic credibility gained by the ISO stamp. Within a year almost everyone will have forgotten how that stamp was achieved. Just like some sports games where someone becomes champion because of a misjudgment by a referee. Some people will always remember the mistake, but most people simply celebrate the champion's victory.

  • Anonymous
    October 19, 2007
Forterra is starting a discussion on multiple protocols supported by a single client vs. multiple clients supported by a single server. This question has become pertinent in the virtual worlds interoperability discussions. VW clients (real-time 3D) have been around for some time, so there is a large established base. VW servers have been around for some time, but only with the rise of game platforms and Second Life did this become the big sticking point, although there was a Living Worlds specification in the very late 90s. We seem to be headed toward an architectural impasse or a pass through the mountains. I can't tell which. What is obvious is that multiple divisions in multiple companies working with multiple consortia working with the same standards org are trying to solve the same problem. One wonders if the commonalities are as real as they appear or if the solutions will converge. Interesting problem.

  • Anonymous
    October 19, 2007
Hi Jason, Besides the fact that, as Jim pointed out, "Open XML" is awfully generic and misleading, I'm not sure what to take from the fact that you seem to insist on using that name rather than "OOXML". Is this because you're really talking about Microsoft Office's format and not the standard wannabe your company shoved through ECMA? Aren't the two supposed to be the same? Is this an admission that the two are effectively meant to be different from Microsoft's point of view? If that's the case, what good is OOXML supposed to do exactly? Thanks.

  • Anonymous
    October 19, 2007
    The comment has been removed

  • Anonymous
    October 21, 2007
    The comment has been removed