Walking through MIME fields: Snubbing Steve Jobs to Star Trek tech

Dr Nathaniel Borenstein didn't just reject Steve Jobs once - Borenstein twice defied the man accorded god-like status for conquering whole industries and cowing hardened executives into kneeling before him. Borenstein is the inventor of modern email, or more specifically he created the Multipurpose Internet Mail Extension (MIME …

COMMENTS

This topic is closed for new posts.


Gold badge
Thumb Up

email makes a handy *enabling* protocol

for data transfer between systems.

*Theoretically* EDI standards should have rendered this obsolete *decades* ago.

Kind of like X400 in fact.

And yet in 2012...

People forget just how *proprietary* email was before internet email became common. Which ISP you had *mattered* in *exactly* the same way people feel you have to be on Facebook to be able to connect with other users.

Ideally a critical mass of users will develop, and *internet* standards will emerge for the services that FB and Gmail have identified a *real* demand for, standards that don't tie you hand and foot to them forever.

And it's good to see some folk spotted Jobby's dictatorial nature under the hippy routine.

8
2
Silver badge
Meh

So 2.8

million emails are sent every second.

So that's 100,000 actual messages and 2.7 million spam emails, judging by the ratio in my inbox.

4
1
Thumb Up

Wow, that was close...

Apple nearly had an iron grip on the tech behind email attachments. That would REALLY have helped the Internet and email world to take off.

4
3
Silver badge

Re: Wow, that was close...

Well, it works for WebKit... How's that doing again?

5
2
Silver badge

Re: Wow, that was close...

Unicode is and was an open standard. It has never in any sense been owned by NextStep or Apple.

Had Jobs succeeded, international text support would have been better from day one, except in the real world where a whole bunch of functioning systems would have been unable to implement MIME and it would probably have taken even longer for a standards-based approach that isn't Anglo-centric to win out.

5
4
Facepalm

@ThomH

WTF are you talking about? Unicode started out with people from Apple and Xerox trying to create a universal standard character set in the '80s. Jobs wanting to broaden its use in the early '90s would not have slowed anything down.

I forgot, idiots will be idiots and facts and history mean nothing.

1
6
Silver badge

@ThomH

"Had Jobs succeeded, international text support would have been better from day one, "

Except Jobs probably had in mind 16-bit Unicode, which still isn't enough for all the world's languages, so now we'd be having to hack that to support Korean or Malay or whatever.

"even longer for a standards-based approach that isn't Anglo-centric to win out."

TBH, if these other nationalities really had an issue with the English Latin character set being the basis of computing then they were quite free to go and design and build their own computers. But except for Russia, Britain and Japan hardly any did; they just accepted the hand-me-downs from the USA. So, sucks to be them, eh? If they want their version of the letter 'O' with a line and squiggle they'll just have to put up with having to use 2 or 3 bytes to represent it.

2
2
Silver badge

Re: @ThomH

@a_been: people who live in glass houses shouldn't throw stones. banjomike accuses Jobs of attempting to get _ownership_ of email, I point out that Unicode wasn't _owned_ by NextStep or Apple. The exact point of open standards is to divorce ownership from creation — it doesn't matter where the thing was developed, what matters in this context is that even if Unicode had been used for the initial MIME standard that would have given NextStep and Apple no control whatsoever over email as a whole. I'd suggest that in future you pay more attention to what people have written before charging at them with playground insults.

@boltar: I said 'better', not 'perfect'.

2
0
Silver badge

This guy again?

We were doing all that he lays claim to long before he came along.

2
10
Silver badge

Re: This guy again?

Do any of you downvoters know what uuencoding is, and how it was used long before MIME existed?

1
8
Silver badge
Thumb Down

Re: This guy again?

Yes. I indeed remember stitching together UUEncoded mails and failing to decode them because one line got cut off, and how editing mime.types and mailcap files so that Pine used your viewer of choice was a piece of cake in comparison.

2
0

Re: This guy again?

Yup, I remember uuencode/decode - when it worked, I found it really neat but by $DEITY, it could be finicky and broke regularly. Some mail clients could handle uuencode fairly well (ISTR Eudora did an OK job of it), but MIME mail clients worked far more reliably in my experience.

3
0
Meh

Re: This guy again?

Not a downvoter, but yes, I do remember uuencoding. And used uucp as an e-mail transport. Fine as far as it went, which was basically to avoid borking 8-bit code transmitted through 7-bit links (anyone in the house remember modems? Prestel anybody?). MIME, or more specifically the clients that handled it, made life easier for my users (and by extension the admins).

As stated in several posts as well as the article, it may not be the best or most elegant or least Anglo-centric, but it's done a job sufficiently well to get widely adopted. Whilst that's a rather VHS vs. Betamax view, it's the 'real' world. Want something better? Make sure that adopting it is stupidly easy for consumers (and vendors), painless, and better still gives a blindingly obvious advantage. Ideally, make the change totally invisible. Remember your major audience now is largely consumers, i.e. non-tech-literate. When they ask "Why can't I do 'x'? The old system let me do 'x'!", wittering on about elegance, efficiency, egalitarianism, future nirvana and all the rest doesn't cut it.

4
0
Anonymous Coward

uuencode ...

Rather good actually. It needed a nice front end for modern user types (no idea there is a command line, let alone what a command is). MTAs such as sendmail and MMDF allow message delivery to a programme, such as uudecode (in badly configured installations, a source of a security problem), so that the recipient got his "multimedia" message.
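For anyone who never met it, the scheme itself is trivial; a quick sketch using the legacy routines that Python's standard `binascii` module still ships (the payload here is just illustrative):

```python
import binascii

# uuencode maps each 3 raw bytes onto 4 printable characters in the
# range ' '..'_'; a full line carries at most 45 bytes of payload.
payload = bytes(range(45))        # arbitrary 8-bit data
line = binascii.b2a_uu(payload)   # one encoded line, newline-terminated

# The first character encodes the payload length: chr(32 + 45) == 'M',
# which is why classic uuencoded bodies are walls of lines starting with M.
assert line[:1] == b"M"
assert binascii.a2b_uu(line) == payload  # lossless round trip
```

Fragile in transit, as others note, but perfectly serviceable when the relays left it alone.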

I note that the article author seems to think MIME and WWW simplified email addressing. I can only say that, in 1986, I was working with sendmail, UNIX mailX, MH, etc. using "name@x.y.z", developing UNIX-VMS gateway protocols and an address book. The predecessor was the "!" separated path style. But even that was being remapped by sendmail to the "@" style in cleverer implementations. UKsendmail hid most of the nastiness, for example.

I find it interesting, as an aside, that Steve Jobs was so prescient and quick to spot the possibilities. While I appreciate the reservations of the academic, Jobs's willingness to take a risk and break new ground was usually well founded, as demonstrated by even the current Apple products and their galvanising effect on the industry. It is a shame that the academics did not seize the chance. Perhaps one needs a strong self-belief to do this. It is ironic that, now, only OS X can still compile what must have been more or less pure UNIX code. One could call that "open".

X.400, by the way, was far better and more functional than its terrified detractors claimed. I used it, as did many firms needing security and reliable delivery. PP and Quipu were rather good and used, for instance, for early pager and SMS to X.400 email gateways. ISODE is, I think, still going.

X.500, leading to LDAP, came out of this set of standards.

So don't be too sniffy about things that happened during your childhood or adolescence.


2
0
WTF?

Much as I hate to say it...

Jobs was right. The existing email infrastructure didn't work properly anyway. Fixing it 'enough' for English-speaking academics too polite to send spam is very Internet, but the rest of us might have been better served by starting again.

3
4
Gold badge

Re: Much as I hate to say it...

You must have read a different article from me. Jobs' only suggestion was to use Unicode. That isn't the same as "start again" and wouldn't have helped with spam even if it was.

Spam is caused by the ability of botnets to send email from compromised machines. You don't know who runs the botnet and the victim covers the costs. You could *start* to tackle the problem if there was a reliable way to contact the ISP of the compromised sender and a way to block that ISP at your end if you weren't satisfied with their response. That's a transport layer problem, basically revolving around the issue of identification/authentication. (You or your ISP needs to authenticate the sender or their ISP before the email is allowed to pass.) Email protocols deal with the application layer above. Therefore, re-engineering email protocols would have no effect on spam.

8
0
Anonymous Coward

Re: Much as I hate to say it...

What were you reading?

1st Page, 10th para

Borenstein tells us: "I said: 'Steve, Unicode will be more beautiful and ideal but it's going to break everything in the install base.'

2nd Page 9th para

"If you were starting from scratch, you wouldn't say anytime you want to transmit you wouldn't start at Base64 and 7-bit ASCII," Borenstein says.

Sounds a lot like starting again or not starting again to me. So they didn't start again...

1
2
Thumb Down

Re: Much as I hate to say it...

That's right! Not using Base64 and 7-bit ASCII is *really* going to stop SPAM, which is the major pain with existing e-mail. Oh, wait.

5
2
Anonymous Coward

Exchange

If only we could get Microsoft to bin their annoying TNEF (winmail.dat) attachments - then perhaps email attachments really could be standardised.

8
0
LDS
Silver badge

Re: Exchange

TNEF brings along Exchange-specific message information. Not everything an Exchange message contains fits the RFC 2822 standard.

0
0
Anonymous Coward

TNEF reeks of NIH

They didn't try very hard. Witness the "thread index" headers, where a perfectly good mechanism already existed. I get the distinct impression the people who wrote that pile of... software, as well as its client, didn't have the first clue about writing emails or about existing practices (witness the verbatim quoting of signatures instead of removing them, then the forcible munging of lines to below 65 characters), didn't understand the specifications, and really didn't want to write email software. Which is all too true, as it's a "collaboration server" that does email by way of a plugin designed to play barely, but certainly not well, with others.

They could have done a much better job of making it do all they wanted it to do without resorting to TNEF as we know it today. It would have involved "open standards", possibly a few new MIME types plus published specifications.

7
0

Re: TNEF reeks of NIH

Open Standards is not the Microsoft way. Embrace and Extend is the Microsoft way.

Fortunately for the world, Microsoft is in terminal decline.

4
3
Flame

Re: TNEF reeks of NIH

"Microsoft is in terminal decline"

Whether their decline is terminal or not is unclear. The problem is that there are plenty of others out there who want to get you into their proprietary walled gardens, and they're making nice money at it.

NIH hasn't gone away. Look at the iTunes marketplace for example. Allegedly, it is not easy to put something on it if it "duplicates the functionality" of something already there. That is NIH at its finest!

2
0
Ru
Unhappy

Re: "Open Standards is not the Microsoft way"

"Embrace and Extend" is a bit nineties-to-mid-noughties, really. Nowadays we prefer "Embrace and Partially Implement", which is clearly the industry accepted way of dealing with open standards.

Of late, you may notice that MS is indeed embracing open standards in this way, because it was unable to entice enough people to use its proprietary ones. This is fairly standard in Big Corporation land, and is a pattern you can see going back decades.

NIH, though... that's a tricky one. Big software-patent-wielding companies like MS often cannot risk using open software and standards because they simply cannot guarantee they are patent- and copyright-unencumbered. Live by the sword, die by the sword, and all that... it is safer for them to reimplement similar technologies themselves. It's a slightly more subtle effect of the current (US) IP situation than the usual legal warfare, but still a very wasteful one, given the amount of time and resources expended on such a pointless endeavour instead of making new stuff, or old things better.

0
0
LDS
Silver badge

Kill the email to get its data

I understand why Facebook & Co. would like to kill email - it transports a lot of data that avoids their "information stealing" process. If they can get people to move their data across their services, they can gather a lot more information (your information...) to sell. That's why GMail was born.

And that's exactly why we should stay away from them and use our own mail servers, where we have total control over what gets in and what gets out, who accesses data and who doesn't, without handing our data for free to the hungry data vampires.

13
0
Devil

Re: Kill the email to get its data

And that is exactly why I only use GMail for unimportant stuff like an addy for websites you need to supply an email address to see or technical mailing lists or ...

For real email, I use a totally independent hosting service which also hosts my personal websites and is unconnected to Google, Farcebook or any of the other content snoopers. I may have to pay a little for it but it helps a lot.

BTW, you _can_ use GMail via IMAP/SMTP from Thunderbird et al. and use X.509, PGP or GPG encryption on your stuff as it passes through Google. That may or may not help, depending on whether or not they can grab your keys and decrypt your mail.

2
0
Headmaster

Is there a subeditor in the house?

«"I have trouble believing the idea email is dying when it's still growing but a deeper level we are finding social networking lets different types of communications, and it might be a better for medium for certain things but not or others.»

And that's just on page 3 (fnarr). I make that about the fifth sentence that needs fixing. It's a quotation, but don't you fix mangled quotations?

2
0
Anonymous Coward

The rot in email is in its misuse

Social media? They usually treat email like a convenient dumping ground for "notifications" and messages that exhort you to log in to the proprietary service to read the message waiting for you. If they'd stick it right in my inbox they'd save me a lot of time.

Same with just about every mechanical email delivery. Even those that ought to know better: most ticketing systems that claim "full email integration" also really only mean "treat it as a dumping ground for crap notices nobody wants to read", though maybe you're lucky and somewhere in the system lurk templates that can be configured. But that's all. It's a one-way street, or rather, a waste chute.

Then again, most people don't know how to write a proper email either. You know: snappily written, points separated out into paragraphs, to the point, nicely formatted. But then, replies are possibly worse: top-posted one-liners dragging along possibly megabytes of quoted material. An absolute waste of effort, both in bandwidth and storage costs and in the human effort to deal with it.

Email works best when the people using it can muster that little effort, aided by a little training, to write something worth reading. Nobody's had that training; nobody's seen a proper email in ages. It's no coincidence that USENET, built on much the same technology but expressly geared toward group interaction, is /just too hard/ for the average user these days. Web-based stuff is that much shinier, even if it treats email as its dumping ground. Somehow that's email's fault.

5
1
Silver badge

Another case of

you change your program, I can't be bothered to change mine. Or perhaps: we've changed the standard so you have to change your computer and software for our benefit.

7 bit ASCII is universally understood (almost) and wastes about a millionth as much bandwidth as videos and pdfs that are pointlessly embedded in email when a simple url that could be ignored would do.

The only problem with email and mime is its misuse - and if you can work out a way to stop that ...

2
0
Bronze badge

7 bit US-ASCII - Grrr!

@ Tom 7

"7 bit ASCII is universally understood (almost)"

Yes, but it's actually 7 bit US-ASCII, which is unfriendly for Latin based languages with accented characters. Just moving to 8 bit would have solved a large part of that problem without going to full blown Unicode. But yes, that would also have involved someone else's software, and hardware drivers too.
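The limitation is easy to demonstrate; a quick Python sketch (the sample word is just an illustration):

```python
# 'é' has no 7-bit US-ASCII representation, but fits in one byte of
# 8-bit Latin-1 (ISO-8859-1), which is the point about just going 8-bit.
word = "café"

try:
    word.encode("ascii")
except UnicodeEncodeError:
    print("ASCII cannot represent 'é'")

latin1 = word.encode("latin-1")
assert len(latin1) == 4      # one byte per character
assert latin1[-1] == 0xE9    # 'é': top bit set, hence the 8-bit requirement
```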

2
1
Silver badge

Re: 7 bit US-ASCII - Grrr!

"7 bit US-ASCII, which is unfriendly for Latin based languages"

No. It is not. Do you know what uuencoding is, and why it was invented? Do you know how to properly use it?

1
4

Re: 7 bit US-ASCII - Grrr!

"7 bit US-ASCII, which is unfriendly for Latin based languages"

"No. It is not."

Yes, it is. There's no accented / diacritic characters in 7-bit US-ASCII, which is obviously "unfriendly" at best for non-English Latin-based languages. Or so my friend František tells me...

"Do you know what uuencoding is, and why it was invented?"

Yup. It's a nasty hack to shoehorn 8-bit data into 6-bits worth of US-ASCII printable characters, so it wouldn't get mangled by the variety of non-ASCII character sets in use back then. And it even failed at that, as anyone with the misfortune to have dealt with passing uuencoded data through IBM mainframes in the day could tell you.

I mean, if you were arguing for Base64 encoding you might have a point - but that's pretty much the default, 100%-guaranteed-to-pass-through-anything-and-still-work guts of MIME anyway...
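That guarantee is simple to check; a Python sketch:

```python
import base64

data = bytes(range(256))          # every possible 8-bit value
encoded = base64.b64encode(data)

# The alphabet is only A-Z, a-z, 0-9, '+', '/' and '=' padding:
# printable US-ASCII that survives 7-bit transports unmangled.
assert all(0x20 < b < 0x7F for b in encoded)
assert base64.b64decode(encoded) == data  # lossless round trip
```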

2
0
Silver badge

Re: 7 bit US-ASCII - Grrr!

UUE's not a nasty hack. It works. If you know how to use it.

IBM's SNA spoke an entirely different language.

It worked, and was a bigger network than the then-fledgling Internet ... but it wasn't designed to be compatible with anything other than itself.

Extended Binary Coded Decimal Interchange Code.

I can't believe I typed that ... must be nearly a quarter century since I last saw it spelled out.

0
2
Silver badge

Re: 7 bit US-ASCII - Grrr!

"There's no accented / diacritic characters in 7-bit US-ASCII, which is obviously "unfriendly" at best for non-English Latin-based language"

Not that unfriendly. Most of the accent marks and other character modifiers in latin character set european languages are syntactic fluff and the languages are still perfectly understandable without them.

0
3

Re: 7 bit US-ASCII - Grrr!

<i>Most of the accent marks and other character modifiers in latin character set european languages are syntactic fluff and the languages are still perfectly understandable without them.</i>

You clearly don't know many European languages. Finnish, Estonian and Hungarian are difficult to understand without accented vowels for starters. I'm sure the same goes for a number of other non-Germanic European languages as well.

1
0
Facepalm

Re: 7 bit US-ASCII - Grrr!

"Most of the accent marks and other character modifiers in latin character set european languages are syntactic fluff and the languages are still perfectly understandable without them."

Really? "Syntactic fluff"? Phonemic maybe, but not syntactic, for most languages I know.

Letters are there to represent sounds (ignoring English's so-called spelling rules for the moment).

Change the letter and you change the word. If someone went and replaced all the 'O's in your document with 'U's you'd be a bit upset wouldn't you? People can often guess from context, but not always.

1
0

Re: 7 bit US-ASCII - Grrr!

Of course it's US-ASCII; it's the *ONLY* ASCII there is. You know, the *A*merican *S*tandard *C*ode for *I*nformation *I*nterchange.

-dZ.

1
0
Silver badge
WTF?

Re: 7 bit US-ASCII - Grrr!

"You clearly don't know many European languages. Finnish, Estonian and Hungarian are difficult to understand without accented vowels for starters. I'm sure the same goes for a number of other non-Germanic European languages as well."

Funny how they all manage perfectly well using English keyboards here in London (I work with a lot of foreign nationals) to write emails, surf the web etc, isn't it? Even the Russians, who use the Cyrillic alphabet, can transcode quite happily using ASCII.

0
0
Silver badge

Re: 7 bit US-ASCII - Grrr!

"Change the letter and you change the word. If someone went and replaced all the 'O's in your document with 'U's you'd be a bit upset wouldn't you? People can often guess from context, but not always."

Well, I find French perfectly readable without all the accents and cedillas. Most of them serve no useful purpose, since everyone knows how the word is pronounced anyway; you don't need a line to tell you where the emphasis goes.

0
0

Re: 7 bit US-ASCII - Grrr!

"Funny how they all manage perfectly well using english keyboards here in London"

Let me guess - writing in English. Here's a little eye opener for you: change the front vowels (which includes the accented ones) into back vowels in Finnish words and they entirely change their meaning. The meaning cannot always be interpreted from context either.

0
0
Silver badge
WTF?

Re: 7 bit US-ASCII - Grrr!

"Let me guess - writing in English."

Err no, writing in their own languages to friends and family.

"change the front vowels (which includes the accented ones) into back vowels in Finnish words"

What the hell is a front vowel and a back vowel?

"The meaning cannot always be interpreted from context either."

Ah well, I guess you're just screwed then if you don't have a Finnish keyboard available at all times?

0
0

Re: 7 bit US-ASCII - Grrr!

If you believe that the diacritics in French are used to mark emphasis then you don't know very much French I'm afraid. Having said this, you are right that diacritics are not strictly necessary to understand French, but that rule varies from one language to the next. Some are easier to deal with when diacritics are removed than others.

0
0
Stop

Re: 7 bit US-ASCII - Grrr!

"Ah well, I guess you're just screwed then if you don't have a finnish keyboard available at all times?"

Not completely screwed -- most of us change the keyboard layout and type "blindly" since we remember where those keys are in the Finnish keyboard. So you press the semicolon for ö for example, or apostrophe for ä.

But imagine what the Chinese would have to do? That is *really* hard to decipher from ASCII.

Anyway, technology is supposed to be there to make things easier for the end user, not to create all these unnecessary complications and force changes to writing systems that have been in use for centuries.
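For the record, the gap is easy to quantify in Python (the character is just an example):

```python
# A CJK character has no ASCII or Latin-1 representation at all;
# in UTF-8 it costs three bytes.
ch = "中"  # U+4E2D

for codec in ("ascii", "latin-1"):
    try:
        ch.encode(codec)
    except UnicodeEncodeError:
        print(codec, "cannot encode", ch)

assert len(ch.encode("utf-8")) == 3
```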

0
0
Silver badge

@Cazzo Enorme (was: Re: 7 bit US-ASCII - Grrr!)

"You clearly don't know many European languages."

Apparently, neither do you.

"Finnish, Estonian and Hungarian" aren't "European", no matter how hard you squint at them.

You seem to be well-named, Cazzo. Somehow I doubt the Enorme.

0
0
Silver badge

Re: 7 bit US-ASCII - Grrr!

@Wensleydale Cheese:

There were relays around then that really did lop the 7th bit off, just because. Or add or remove white space. Or if an e-mail went through an EBCDIC relay then who knows what would come out the other end.

The idea of being able to specify the character set and encoding instead of forcing everyone to use Unicode or Latin-1 was probably the best solution available at the time without breaking everything. But then His Jobsness never worried about breaking the previous version.
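That per-part flexibility is visible in any MIME library; a sketch with Python's standard `email` package (the sample text and subject are mine):

```python
from email.message import EmailMessage

# MIME lets each message (or part) declare its own charset and transfer
# encoding, instead of forcing one universal character set on everyone.
msg = EmailMessage()
msg["Subject"] = "Accents intact"
msg.set_content("Voilà: ceci marche.", charset="iso-8859-1")

print(msg["Content-Type"])  # text/plain; charset="iso-8859-1"
assert msg.get_content().strip() == "Voilà: ceci marche."
```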

0
0
Silver badge

Re: 7 bit US-ASCII - Grrr!

"If you believe that the diacritics in French are used to mark emphasis then you don't know very much French I'm afraid"

Funny how the locals never have a problem understanding me then. But the accents generally are where the stress occurs in the pronunciation (if at all), as well as modifying the sound. You almost never find the stress on a non-accented letter if there's an accented one in the same word.

0
0
Silver badge

Re: 7 bit US-ASCII - Grrr!

in english i think itd be right to say its about the same as being forced to use lower case letters for everything or not being able to use punctuation not the end of the world but its a little harder to read certainly rather lacking when it comes editing anything halfway serious

0
0
Silver badge

Re: 7 bit US-ASCII - Grrr!

> IBM's SNA spoke an entirely different language.

> ...

> Extended Binary Coded Decimal Interchange Code.

The relationship between SNA and EBCDIC is political, not technical. IBM systems used EBCDIC long before the SNA acronym was coined. The original S/360 had an ASCII hardware option, but defaulted to EBCDIC in large part because ASCII hadn't been standardized until very late in the S/360's development. (S/360 went GA in '64; first ASCII standard published in '63.)

Other IBM systems followed suit because there was no compelling reason to switch to ASCII. The ASCII option was dropped from the S/360 because there was no demand. It wasn't until they got into the PC business (and shortly after that the Unix workstation business) that ASCII became relevant to them.

SNA was one half of a reaction to the failure of the Future Systems project (the other half was SAA), which itself was a reaction to the changing legal and commercial landscape. Like unbundling software and making it a chargeable item, SNA was part of an effort to increase revenues by creating a locked-in customer base. (Maybe IBM should have patented walled gardens; they could sue Apple now [joke].)

SNA is a vast array of network protocols. I mean vast - even the small fraction of SNA tech docs I have on hand outweighs the full collection of RFCs. SNA does primarily use EBCDIC, because the machines it was designed for were mostly EBCDIC machines, but equating SNA and EBCDIC is like equating the national highway system with its signage.

> It worked, and was a bigger network that then then fledgling Internet ... but it wasn't designed

> to be compatible with anything other than itself.

ASCII wasn't "designed to be compatible with anything other than itself". Neither was TCP/IP. I don't see your point.
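The divergence between the two codes is easy to see from Python, which still ships the CP037 codec:

```python
# The same letters map to entirely different byte values in ASCII and
# EBCDIC (CP037, the US/Canada variant), which is why gateways had to
# translate rather than pass bytes through.
text = "MIME"

ascii_bytes = text.encode("ascii")
ebcdic_bytes = text.encode("cp037")

print(ascii_bytes.hex())    # 4d494d45
print(ebcdic_bytes.hex())   # d4c9d4c5
assert ebcdic_bytes.decode("cp037") == text  # round trip is fine...
assert ascii_bytes != ebcdic_bytes           # ...but the wire bytes differ
```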

0
0

Re: @Cazzo Enorme (was: 7 bit US-ASCII - Grrr!)

@jake: ""Finnish, Estonian and Hungarian" aren't "European", no matter how hard you squint at them."

Eastern European but certainly European. Hence why I studied them at SSEES - The School Of Slavonic and East European Studies. If you want to argue that Hungarian in particular isn't a European language then any credible linguist would be amused to hear you try.

As for the size of my cazzo, that's a secret between me and your wife.

0
1
Silver badge

Re: @Cazzo Enorme (was: 7 bit US-ASCII - Grrr!)

Nope. Over the top and to the West, not along the coast.

My wife laughs in your general direction, piccolo.

0
0
