Firstly, a factual correction: the type B connector is part of the HDMI 1.0 spec, not just 1.3, and it's pretty much equivalent to dual-link DVI-D. Almost nobody used it, just as it took years for dual-link DVI to appear on consumer graphics cards. HDMI 1.3 keeps the same connector as before, but raises the maximum bandwidth down the (single) link from a 165MHz pixel clock (the DVI single-link maximum, and HDMI 1.2 and below) to the equivalent of 340MHz. HDMI 1.3 retains support for the type B connector, so, if anyone used it, you'd get roughly the same bandwidth as the upper end of DisplayPort.
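To put those clock figures in context, here's a rough back-of-envelope sketch in Python (my own arithmetic - the blanking overheads are a crude approximation of CVT reduced-blanking timings, not spec values):

    # Estimate which link budget a mode needs. The blanking figures below are
    # a rough stand-in for CVT reduced-blanking; real modes use CVT/GTF tables.
    def approx_pixel_clock_mhz(h_active, v_active, refresh_hz,
                               h_blank=160, v_blank=30):
        """Approximate pixel clock (MHz) for a mode, including blanking."""
        return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

    for mode in [(1920, 1200, 60), (2560, 1600, 60), (3840, 2400, 48)]:
        clk = approx_pixel_clock_mhz(*mode)
        if clk <= 165:
            link = "single-link DVI / HDMI 1.2 and below"
        elif clk <= 340:
            link = "single HDMI 1.3 link (or dual-link DVI)"
        else:
            link = "beyond a single HDMI 1.3 link"
        print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz: ~{clk:.0f} MHz -> {link}")

Run that and 1920x1200@60 fits a single DVI link (~154MHz), 2560x1600@60 (~266MHz) needs either dual-link DVI or a single HDMI 1.3 link, and 3840x2400@48 (~467MHz) exceeds both.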
The display industry doesn't seem to be concentrating too hard on actually improving things with each update.
First, we had the VGA cable. People got quite good at producing decent RAMDACs, which gave a pretty good image at up to 2048x1536 at 85Hz on a cheapish 19" CRT; certainly 1600x1200 at 75Hz was common.
Then DVI was launched. Most people implemented only single-link (reserving dual-link for workstation cards), and, lo, these gave better images when driving the $1000 17" LCDs available at the time. CRT-based consumers were rightly unimpressed, and the standard took years to find acceptance - and it has done so now only because millions of office workers decided that a 1280x1024 19" screen with pixels the size of bricks is an improvement over a 19" UXGA CRT. Digital signalling is a good thing, but introducing the technology at a less capable point than the analogue equivalent is a hard sell. For evidence, note that Matrox's DualHead2Go/TripleHead2Go products are mostly analogue-based, because most laptops still don't provide dual-link DVI bandwidth.
However, DVI did two things right: 1) it made the connector capable of carrying analogue data as well, so people didn't have to throw their monitors out (almost no cards are DVI-D only), and 2) it made the dual-link mode backwards-compatible. It's also a reasonably sturdy connector, although I prefer the LVDS connector on the SGI 1600SW.
Then came HDMI. In spite of it being possible to run the HDMI protocol (which is DVI with extra bits added) through a DVI connector with an adaptor, consumers were told that a graphics card with an HDMI connector on it is an "improvement". This is "improvement" as in "doesn't support VGA, and doesn't support dual-link DVI, but is otherwise identical" - i.e. a significant reduction in functionality. I find the decision to run the audio over the video cable a little dubious, but it uses the same signal wires as DVI (the audio "islands" are sent during blanking periods), so, other than saving a few cents, there was little to be gained by switching connectors.
HDMI tried to keep equivalency with DVI via the type B connector, but because it's not a superset of type A, and because everyone *used* type A, nobody added a type B connector "just in case" in the way that the dual-link DVI connector got added (where there was effectively no downside).
Giving up, apparently, on the type B connector as a generally-accepted item, HDMI 1.3 adds more bandwidth for either higher colour depth or higher resolution - both of which are already specified as part of dual-link DVI (that is, there was a standard way of doing this with the type B connector anyway). The spec says that HDMI signals will (now) switch to dual-link only at some unspecified frequency above 340MHz, whereas DVI signals will continue to go dual-link at 165MHz. Note that upping the single-link frequency while effectively obsoleting the type B connector (the standard can't be bothered to say when it should be used, except for DVI data) means that huge numbers of graphics cards with dual-link DVI outputs - and plenty of bandwidth - can't drive HDMI 1.3 devices at full resolution: they'll be limited to 165MHz. This mess might have been avoided if some attempt had been made to promote the type B connector rather than giving up on it - but the two connectors should never have been incompatible in the first place. Allowing higher bandwidths per link would have encouraged wider dual-link DVI support, with backwards compatibility, from consumer graphics cards - and the DVI spec places no upper limit on dual-link performance anyway.
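To make that stranded-bandwidth point concrete, a quick sketch (the pixel clocks are nominal CVT-RB values from the published mode tables; the check itself is my own illustration):

    # A mode a dual-link DVI card generates comfortably, but which an
    # HDMI 1.3 sink will only accept over a *single* link via an adaptor.
    MODES_MHZ = {
        "1920x1200@60 (CVT-RB)": 154.0,
        "2560x1600@60 (CVT-RB)": 268.5,
    }

    for name, clk in MODES_MHZ.items():
        dvi_source_ok = clk <= 2 * 165   # dual-link DVI: 330MHz aggregate
        hdmi13_sink_ok = clk <= 165      # only one link survives the adaptor
        print(f"{name}: card can generate it: {dvi_source_ok}, "
              f"reaches an HDMI 1.3 display: {hdmi13_sink_ok}")

The 2560x1600 line is the problem case: the card has the bandwidth, but the HDMI 1.3 device won't listen to the second link.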
DisplayPort intends to replace HDMI (depending on who you ask), DVI and, for internal connections, LVDS. Is it a "better" standard than HDMI? Probably, from an objective viewpoint (in the way that FireWire is "better" than USB). Does it actually offer the consumer anything more? Probably not: the bandwidth is of the same order as a type B HDMI 1.3 connector, the ability to drive multiple displays is limited and not such an issue for the average user (multiple HDMI cables are arguably more flexible anyway), and the much-vaunted fibre-optic link possibilities are already available for DVI and HDMI from third parties such as Gefen (at a cost). DisplayPort appears to be an attempt by VESA to re-establish themselves, and if it succeeds, it will likely be only through corporate politics.
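For the "same order" claim, the raw numbers as I understand them (the link rates are from the published specs; the 24 bits-per-pixel assumption and the arithmetic are mine):

    # Approximate usable video data rates at 24 bits per pixel, in Gbit/s.
    GBIT = 1e9
    rates = {
        "single-link DVI / HDMI 1.2 and below": 165e6 * 24 / GBIT,      # ~3.96
        "dual-link DVI (type B HDMI)":          2 * 165e6 * 24 / GBIT,  # ~7.92
        "HDMI 1.3 single link":                 340e6 * 24 / GBIT,      # ~8.16
        # DisplayPort: 4 lanes x 2.7 Gbit/s raw, less 8b/10b coding overhead
        "DisplayPort (4 lanes)":                4 * 2.7 * 8 / 10,       # ~8.64
    }
    for name, gbps in rates.items():
        print(f"{name}: ~{gbps:.2f} Gbit/s")

Everything at the top end sits within about 10% of everything else, which rather supports the view that DisplayPort isn't buying the consumer anything new.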
UDI seems to be sinking without trace, and we can be thankful that yet another standard may not be foisted on us.
I can see no reason why DVI, the oldest digital standard being discussed, is inferior to any of the alternatives - other than the minute cost of the connector. As JJ commented, the other connectors are flimsier and more likely to be damaged (although I approve of the HDMI 1.3 mini-HDMI connector, which I accept has its place in portable media devices); I have no problems with the three SCART connectors on the back of my television (they'd be easier to plug in if they were even bigger!), and I doubt the size of the HDMI connector makes much difference to the average 60" plasma screen.
A change in connectors (now that DVI is finally becoming established), of course, forces consumers into an upgrade cycle. You can guess who that benefits, and it's not the consumer. The display industry - especially in home electronics - has a long history of being unable to settle on one standard (I've used all of UHF, composite, S-Video, component, 5-pin DIN, SCART, VGA, LVDS, LFH-60 and DVI in my time, and could have added FireWire, CAT-5 and HDMI - not to mention the number of display standards, and the marketing departments muddying the "what is 1080p" question). It's time the media stood up for the consumer for once.
Whatever can be argued for any of these connectors, I'm in no doubt that the best thing for the consumer is that there be *one* connector, and that any upgrades be backwards-compatible (which hopefully guarantees that they really *are* upgrades). It's not acceptable to agree to disagree and then let the consumers fight it out - the inconvenience to the consumer vastly outweighs the inconvenience to the companies that couldn't choose between so many ways of sending essentially the same data down a cable. This rebounds on the companies: HD adoption would, I'm sure, have been faster if people hadn't been waiting out the HD-DVD/Blu-ray war, the HDCP disagreements, the digital TV/EDTV/1366x768/1080i/1080p30/1080p60/overscanning mess, and so on. By refusing to compromise, and refusing to treat the consumer with respect, everyone loses; the sooner this is accepted, the nearer we can get to a utopia of easy-to-use, high-quality displays.
But I'm not bitter. (Incidentally, I have a desk full of four CRTs and an LVDS 1600SW at work, and a CRT plus a quad-DVI T221 at home. Neither HDMI nor DisplayPort has any appeal for me, which will cause a problem when the next graphics card I buy drops dual-link DVI-I support.)