W3C squeezes XML into portability

Web-standards group the W3C has published its preferred standard for compressing XML documents into something more suitable for transmission over radio, and perhaps everywhere else too. Efficient XML Interchange (EXI) originated with a company called AgileDelta, whose CTO is still editor of the specification which tokenises XML …

COMMENTS

This topic is closed for new posts.

Readability of XML

"XML is good at being human readable". Actually, no.

Some programming languages, such as C++ and Java, employ a wide variety of syntax. Some other programming languages, such as LISP, PostScript and Forth, make use of a unifying concept that dramatically reduces the amount of syntax in the language. For example, LISP treats almost everything as a list: data, function calls, function definitions, mathematical expressions, if-then-else statements and so on. PostScript and Forth both treat almost everything as operations on a stack.

XML is another language that tries to unify many dissimilar concepts into a small amount of syntax. In XML's case, the unifying concept is that everything can be represented by an element and/or attribute.

A significant subset of humans find such highly unified languages to be elegant. But another significant subset of humans find such highly unified languages to be frustratingly confusing.

So the claim that "XML is good at being human readable" is wrong for a significant subset of people.


Myth of "human readable" XML

Agreed that the "human readability" of XML is pointless and in any case is something of a myth. We still need tools to read such files and convert the binary encoding (e.g. ASCII, UTF) into something we can see and understand (e.g. a character glyph on a display screen). EXI is no different in this regard. The only argument is over the availability of tools to do the decoding.
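The decoding point can be made concrete in a couple of lines of Python (my own sketch, not anything from the article):

```python
# Even "human readable" XML is just bytes on disk until some tool decodes it.
raw = b'<greeting lang="en">hello</greeting>'  # what actually sits in the file

text = raw.decode("utf-8")  # the decoding step we take for granted
print(text)                 # only now do we have glyphs a human can read
```

An EXI file differs only in needing a less ubiquitous decoder than a text editor.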

So it would be nice to see the vendors add support for EXI in some popular tools, for example Microsoft Internet Explorer 9, Safari, Firefox 4 and perhaps Altova XMLSpy. People might then be a little less scared by EXI.


About time

<longtagtodescribesomethingnohumanshouldevereadinthisdayandagewhyelsedoyouthinkiboughtacomputer>

Have to agree with the first poster though - XML was readable for about two minutes and just became a data expansion tool.

I used to use XML wrappers to compress the data when networks were slow: parse the whole file and then replace repeated tags with indices for a lookup. Amazed to see most files were 90% tags and sod all data. </longtagtodescribesomethingnohumanshouldevereadinthisdayandagewhyelsedoyouthinkiboughtacomputer>
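Roughly, that lookup trick can be sketched like this in Python (a from-memory illustration with made-up tag names, not the original code):

```python
import re

def tokenise(xml_text):
    """Replace each distinct tag name with an index into a lookup table."""
    table = {}  # tag name -> index
    def repl(match):
        name = match.group(1)
        idx = table.setdefault(name, len(table))
        return match.group(0).replace(name, str(idx), 1)
    # crude: rewrites opening and closing tag names only; real EXI goes much further
    return re.sub(r'</?([A-Za-z][\w.-]*)', repl, xml_text), table

doc = "<order><item><name>widget</name></item><item><name>sprocket</name></item></order>"
compact, table = tokenise(doc)
print(compact)  # <0><1><2>widget</2></1><1><2>sprocket</2></1></0>
print(len(doc), len(compact))  # 81 49 -- most of the saving is tags, not data
```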


About time too

Why anyone thought that XML was a good way to exchange large amounts of data is a mystery to me.

But, hey, storage is getting cheaper and processors are getting faster all the time, so why bother to come up with an encoding that is space efficient and quick to parse when you have something as horrendously inefficient as XML?

Anonymous Coward

@John Miles 1, @Peter X

> Why anyone thought that XML was a good way to exchange large amounts of data is a mystery to me.

It was not designed for this. The problem is people (and companies) who can't use their common sense and see its obvious limits.

> XML is good because it preserves the semantics of data

I don't think you understand what semantics is; it's in the interpretation of data, which means it's not intrinsic to the data, which means XML cannot embody semantics (well, that's my view).

Also bear in mind that

<tag>stuff</tag>

IS NOT XML. It's an XML syntax. Here are other XML syntaxes: <http://www.ibm.com/developerworks/xml/library/x-syntax.html>. Since they can all produce the same underlying conceptual data model (perhaps bar namespaces, which some of these predate), they are equivalent to the conventional XML syntax, and their interpretation is the same. It would be the same as replacing { and } in C with Pascal's begin ... end but changing nothing else: different tokens, same interpretation, if you modified the compiler to recognise them as such.

The distinction between XML semantics and XML syntax was never made clear, hence the confusion. I understand the infoset work was done to help clarify this <http://en.wikipedia.org/wiki/XML_Information_Set> but it's a long time since I read it.

The XML syntax only encodes a conceptual tree-structured thingy; this tree-structured thingy is the XML itself. How you interpret the XML tree-thingy is where semantics come in.
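The syntax-versus-model point is easy to demonstrate (my example, using Python's ElementTree):

```python
import xml.etree.ElementTree as ET

# Two different spellings: attribute order swapped, empty-element tag used.
# The surface syntax differs; the parsed tree -- the XML itself -- does not.
a = ET.fromstring('<point x="1" y="2"></point>')
b = ET.fromstring('<point y="2" x="1"/>')

print(a.tag == b.tag and a.attrib == b.attrib)  # True: same conceptual model
```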

Anyway, back to the point, XML is fine if you don't get stupid with it.

Anonymous Coward

Sounds like WAP to me

It will be dead in no time, I think...


Why XML is good

XML is good because it preserves the semantics of data, as opposed to say a comma delimited file, and includes namespaces so it's easier to add to, or remove data from without then having to change the software wot consumes said data.
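For illustration (my own toy example, with hypothetical field names), here is the same record as CSV and as XML, showing where the field names live:

```python
import csv, io
import xml.etree.ElementTree as ET

# CSV: meaning lives in column position, so every consumer must agree on
# the order and count of columns, forever.
row = next(csv.reader(io.StringIO("Ada,Lovelace,1815")))
surname_csv = row[1]  # position 1 "means" surname, by convention only

# XML: the name travels with the value, so other fields can come and go
# without breaking this lookup.
person = ET.fromstring("<person><surname>Lovelace</surname><born>1815</born></person>")
surname_xml = person.findtext("surname")
print(surname_csv, surname_xml)  # Lovelace Lovelace
```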

The human readable bit is just that you can view/create/update XML data with a simple text editor. Or pretty much any programming language since it's just a text file. This may not seem like a big deal, but if you're stranded managing a legacy system and you need it to talk to something else, things like this really matter.
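For example, a few lines of Python (illustrative element names, not a real system) is all it takes to patch a value in place:

```python
import xml.etree.ElementTree as ET

# The legacy-system payoff: no vendor tooling needed, just any language
# with a string or XML library (or, at a pinch, a text editor).
doc = ET.fromstring('<config><endpoint>http://old-host/api</endpoint></config>')
doc.find('endpoint').text = 'http://new-host/api'  # tweak one value
print(ET.tostring(doc, encoding='unicode'))
# <config><endpoint>http://new-host/api</endpoint></config>
```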

But don't misinterpret "human readable" as meaning it is intended to be consumed by end-users.

On a similar note, building applications in HTML/CSS/JavaScript seems *really* dumb if you're approaching the problem as a "real" programmer, but the advantage is that the barrier to entry is much, much lower. So it could be argued that Firefox became popular because its extension mechanism was much easier to hack about with. XML does the same for exchange of data.

Is XML efficient? Nope. But that wasn't the intent. It is, however, useful as a data exchange format that's easy to work with -- easy as in lower barrier to entry.


Re: Why XML is good

Peter X wrote: 'But don't misinterpret "human readable" as meaning it is intended to be consumed by end-users.'

Section 1.1 of the XML specification lists the original design goals for XML.

http://www.w3.org/TR/xml/#sec-origin-goals

Number 6 in the list is: "XML documents should be human-legible and reasonably clear." There is nothing in the wording to suggest that "human-legible" is inapplicable to, say, end users, so I disagree with you on that point. Even leaving end users aside, XML fails the human-legible test for the significant portion of techies who find it difficult to understand languages (such as XML, LISP, PostScript and Forth) that try to unify many concepts into a minimal amount of syntax.

Peter X went on to write, "Is XML efficient? Nope. But that wasn't the intent. But it is useful as a data exchange format that's easy to work with -- easy as in lower barrier to entry."

I agree with you on this point, but I find it very frustrating that much of the IT industry has been willing to settle on something as mediocre as XML simply because it's slightly less bad than its predecessors.


Oh Wow!

So, back in the olden days file formats were binary, small and efficient. Along rolls XML with its nicely structured, human-readable format, becoming the next big thing in file formats.

And now, the new big thing to address some of the issues surrounding XML (like file size, parser complexity, load speeds, etc) is a binary file format!

*sigh*


Efficient and Human Readable

Some of us dinosaurs remember when MS Word roamed the Earth ... we used it because it was efficient. Our tiny little brains thought that was true.
