Playback with Dolby C off was also a crude way of compensating for cheap tapes with lousy top-end frequency response.
Digital audio began life with high ideals and worthy engineering feats; with its extended dynamic range came the promise of noise-free recording. This is a story of how it first charmed and then choked the industry it was designed to enhance.
Very few commercial tapes were created with Dolby C; Dolby B was the norm.
Dolby C appeared on high-end audio systems for users recording their own material, and as far as I am aware no record labels actually sold tapes recorded with Dolby C.
IIRC, Dolby B had a fixed amount of expansion above a single frequency, Dolby C used multiple bands (two or three) with a different amount of expansion for each, and Dolby S used a continuously variable amount of expansion depending on frequency. I can't remember ever seeing home-grade equipment with anything better than Dolby C.
I think what JimC was referring to was using Dolby C during recording and switching NR off completely during playback, which would force a lot of (admittedly messy) high end back out through an otherwise woolly-sounding system - be that because of crappy tape, crappy front-end, or both.
Spot on explanation of dithering, although isn't "granulation noise" usually referred to as "quantization noise"?
It is rather remarkable that a 1/2LSB sine wave without dithering becomes a square wave to the ear, while a 1/2LSB sine wave with dithering sounds like a sine wave + white noise.
A nitpick, but one about a very common misconception.
Dithering does not mask the quantisation artefacts with random noise. That would imply the added noise needs to be of high enough amplitude to drown out the quantisation error; in fact the amplitude of the dither is only about half of one least significant bit. It does not mask the quantisation artefacts, it actually removes them totally. In its simplest form the use of AIWN (additive independent white noise) will totally decorrelate the quantisation error. (It is the fact that the errors in quantisation are correlated with the signal that leads them to contain objectionable audible sound.)

However, noting that the ear has vastly greater sensitivity at some frequencies and much less at others - especially in the top octave - it is possible to craft the dither in such a way that increased noise in those parts of the audible spectrum we are less sensitive to can be traded for higher resolution in those areas where we have greater sensitivity. Thus it is possible to get a genuine 18 bits of resolution in the mid bands of a 16-bit digital recording. Most recording systems provide the engineer with a range of dither options, with different profiles suited to different recordings.
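The decorrelation claim is easy to check numerically. Below is a minimal sketch of my own (not from the thread), assuming a mid-tread quantiser with a 1 LSB step and TPDF dither built from two independent uniform sources. A 0.4 LSB sine vanishes entirely when quantised raw, but survives as sine-plus-white-noise once dithered:

```python
import math
import random

def quantize(x):
    # Mid-tread quantiser with a step of 1 LSB.
    return float(round(x))

def corr(a, b):
    # Pearson correlation coefficient, pure Python.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb) if va and vb else 0.0

random.seed(0)
N = 48000
# A sine of 0.4 LSB amplitude: below the bottom step of the quantiser.
sig = [0.4 * math.sin(2 * math.pi * 997 * n / 48000) for n in range(N)]

plain = [quantize(s) for s in sig]          # no dither
dith = [quantize(s + random.uniform(-0.5, 0.5) + random.uniform(-0.5, 0.5))
        for s in sig]                        # TPDF dither, +/-1 LSB peak

err = [d - s for d, s in zip(dith, sig)]
# Undithered: every sample rounds to 0 - the signal is simply gone.
print(all(v == 0.0 for v in plain))          # True
# Dithered: the sine is still present (first value clearly positive)
# and the error is decorrelated from it (second value near zero).
print(round(corr(dith, sig), 2), round(corr(err, sig), 2))
```

The second correlation being near zero is the whole point: the dithered error behaves as benign noise, not as distortion correlated with the music.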
Surprisingly, it did not mention aliasing at frequencies over 1/2 of the sampling rate.
It is that phenomenon which was mostly responsible for the choice of 44.1kHz (roughly twice 20kHz, plus an allowance for filter roll-off). Anything lower would have meant the bandwidth of the recorded signal had to be below 20kHz, and therefore not acceptable as a vinyl replacement.
It vaguely explained why it was required (needing to capture the peak and trough of the sine wave), but it's the Nyquist-Shannon sampling theorem that dictates the sample rate; aliasing is a symptom of not filtering. 0-20kHz was the desired range, and roughly 2kHz was added as a transition-band "buffer" for the low-pass filter (44.1kHz can't reliably recreate content right up at 22kHz). You have to sample at more than twice the highest frequency you want to keep.
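To make the aliasing point concrete, here is a tiny sketch of my own (not the poster's): sampled at 44.1kHz with no anti-alias filter in front, a 30kHz tone produces exactly the same sample values as a 14.1kHz tone, so it would come back out of the DAC at 14.1kHz.

```python
import math

fs = 44100.0
# cos(2*pi*(fs - f)*n/fs) == cos(2*pi*f*n/fs) for integer n, so the
# 30 kHz tone is indistinguishable from its 14.1 kHz alias once sampled.
hi = [math.cos(2 * math.pi * 30000 * n / fs) for n in range(64)]
lo = [math.cos(2 * math.pi * 14100 * n / fs) for n in range(64)]
print(max(abs(a - b) for a, b in zip(hi, lo)))  # ~0: rounding error only
```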
The ".1" was added because 44.1kHz can be derived from both PAL and NTSC video timing. Similarly, 48kHz is the magic number that accommodates 22kHz plus a filter buffer and also works on PAL/NTSC-based systems.
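The commonly quoted arithmetic behind that (my note, with the usual caveat that the exact-60Hz figure is monochrome NTSC timing; colour-NTSC-locked PCM adaptors actually ran at 44.056kHz) is three samples per line per channel, on 245 usable lines per 60Hz field or 294 usable lines per 50Hz field:

```python
# Both video standards land on exactly the same audio sample rate.
ntsc = 3 * 245 * 60   # 3 samples/line x 245 lines/field x 60 fields/s
pal = 3 * 294 * 50    # 3 samples/line x 294 lines/field x 50 fields/s
print(ntsc, pal)      # 44100 44100
```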
I recall going to Mullard conferences in London where they talked about the CD design choices. 44.1kHz was the lowest figure they wanted to use. The original engineers wanted something like 100kHz, but the marketing guys imposed two constraints:
* The disk had to fit into a machine that would go in a DIN car radio cutout
* The disk had to hold an hour of audio
Given that the discs were going to be pressed on modified record presses, and given the properties of the laser heads of the day, which determined the achievable bit rate in the plastic, they had to squeeze down to 44.1kHz.
44.1 was largely down to legacy video systems.
http://www.cs.columbia.edu/~hgs/audio/44.1.html
It seems we rate convenience over quality these days. If anything killed the progress of digital audio it was tape, and the lack of a reliable alternative at the time. We now have flash memory in sufficient sizes, but that wasn't the case back in the 80s and early 90s.
44.1KHz met the twin targets of achievable anti-aliasing filters on the record side and sufficient bit rate to record in active picture time on a U-matic - or on a standard broadcast video circuit. I recall using both variants in the BBC in the mid-eighties.
Not convinced by the 22.1/8 format matching cassette tape, though: vaguely, 10KHz and about 40dB s/n? A decent cassette with decent tape could better that, particularly if (as you describe) the various Dolby options were used.
Well written, unsentimental, engaging and timely, producing memories of my first sampler (an 8-bit Ensoniq Mirage), my first CD player, and mulling over getting a Minidisc player (I didn't).
Hope you will also touch on those failed digital formats - DCC and Minidisc spring to mind.
I worked as a bench engineer at the Laskys branch at 257 Tottenham Court Road. The big branch was at no. 42, which became Micro Anvika (I think they are on the critical list also). The poster who said that 44.1 was chosen because it could be factored into the 625-line 50-field PAL sync and the 525-line 60-field NTSC sync is correct, as it was expected that modified VCRs would be the initial medium for the largest market segment. It was also above the 20kHz audio bandwidth by a factor of two and a bit.
The real memory was Phil Oakey, wonky-haired vocalist from the second incarnation of the Human League, coming in to buy a Sony PCM-F1 and its matching portable(ish) Beta VCR. There was also a Hitachi-Shibaden VHS machine that was a self-contained unit of VCR deck and ADA amp. The later VCR "Hi-Fi" arrangement was analogue, but with VCR-effective tape speed and 70+dB dynamic range. Most VHS Hi-Fi recorders had nasty IC-based record and play amps, so it never reached its potential. For a long time, with the right recorder, it made more sense than digital for the amateur or semi-pro user.
As I recall you could defeat the SCMS on the Sony F1 with a small DC voltage to the left PCM output. As two machines were a year's wages it was not a practical restriction anyway; you were always going to send analogue to the pressing plant. Some indie recordings were still mixed down to a Dolby C cassette deck, and provided you didn't want a stable stereo image or much dynamic range, neither of which were essential to the synth-based pop music of the day, they sounded just fine.
Years later I gave up my soldering iron and became a sysadmin to finance my... law degree. Thus proving that digital audio only eats itself and all those who sail in it.
Depends on what you count as failure. Minidisc was a consumer failure, but in certain places it was embraced.
Local radio was one. Beforehand we were lugging around Marantz cassette recorders which were big, heavy and the audio was only ever as good as the cassette you put into it (I picked up a load of metal tapes from Tandy cheap which gave excellent quality). Units were also often poorly maintained and nobody ever thought of buying their own.
Then Minidisc came along. Instead of this huge heavy beast, we had this small unit that you could hold in your hand (or larger if you went for one of the "pro" models). Indeed the high end consumer units from the likes of Sony offered the features we needed, were desirable consumer goods and were also affordable. So many of us shelled out the 150 or so notes and bought our own.
My own recorder, purchased from the local Sony centre, travelled thousands of miles, recording in helicopters, planes, even a glider. The only time it ever missed a beat in the field was during a winch launch in a glider which to be fair I was fully expecting.
The quality was excellent as well and the media was very reliable (other than idiot journos ejecting discs before the TOC had been written). The ATRAC codec was generally fine for speech - not without its faults, but it was so far ahead of what we had before that you could forgive it.
These days it's all digital audio recorders using SD cards, but Minidisc was where it all started. Still have both of my units as I don't have the heart to get rid of them: my original silver Sony, which sadly no longer records reliably (thousands of hours of use, I suspect), and a newer Sony Net MD unit in striking metallic blue that I purchased shortly before I left the industry.
Consumer failure? Yes, but I loved earning a living with them!
The Minidisc mostly failed because you couldn't easily put music on it. Although it was digital media (so people expected to be able to copy audio to it) you actually had to record it. In real time.
As such it sort of turned into a "r/o-like" medium and that was bound to fail with all the mp3 players (also cd-based players) which were by then already floating around and were much easier to use.
Ease of use won over quality.
The biggest problem on the consumer side was that minidisc was never affordable. You couldn't pick up a unit for less than 100 quid, and if you wanted a decent Sony unit you were looking at £150+.
I actually put a Sony MD head unit in my car so I could listen to interviews I'd just done before getting back to the edit suite. Cost me about 200 quid. Lovely unit and MD was great in the car because they were so sturdy. But nobody was ever going to pay that kind of money.
And then the iPod came along and finished it off.
Ultimately the biggest problem with the Minidisc was Sony itself. It invented Net MD, a way of copying music onto the player via USB, but hobbled it so badly that it was easier to do it in real time: awful, unreliable software and one-way transfers only (you couldn't copy back from the unit).

If MDs had been more affordable and Sony hadn't dragged their heels on USB transfer, it might have lived in the consumer space longer.

Incidentally I spoke to a friend this morning who confirmed that he still uses MDs for quick dubs in the studio, as he can just eject the disc and run. His quote: "it's easy". Blanks are still readily available from media suppliers as well.
@ShelLuser. Well my Minidisc player came with the awful SonicStage software, which is something Sony specialises in: shit software for their otherwise good-quality hardware. It's the same for the Sony ebook reader - nice bit of hardware, shame about the software.
Didn't Sony also produce a Minidisc player as a hi-fi separate?
I think what really killed the Minidisc (and DAT as well) was that there was no easy way to copy stuff off them.
Still, the Minidisc player does have one advantage over any of the iPood devices: you can replace the battery.
the fail is for the sony copyright mafiaa
@Field Marshal Von Krakenfart Although there was a half hearted attempt to push DAT into the consumer space, its real home was the recording studio. And considering when I last bought a unit in 2006 it cost 800 quid (IIRC!), I'm not sure how they would have ever scaled it down to a price consumers could afford unless they cut features and build quality. And frankly our studio DAT could be rather fragile and was often away for repair (you have no idea how ham fisted local radio presenters can be).
There's hundreds of thousands of hours of audio stored on DAT so you still see them sitting in the corner of studios. Indeed I know someone who bought one when Sony made the announcement they were being discontinued so he had a spare. Was still boxed up last time I checked.
And yes there were hi-fi separates for minidisc. Again I had one myself. There were also some very nice studio units. For many years Radio 2 played most of its jingles from minidisc as they were an ideal replacement for the old tape cart system.
Minidisc found a home in the broadcast and live presentation/theatrical markets because of their reliability, convenience (a 2-U rack unit compared to 1/4" reel-to-reel Revox tape machines?!!), but mostly for their instant start capability. I think MDs used some sort of read-ahead buffer which stored audio from the disc, meaning the audio started as soon as you pushed the button (unlike CDs) - perfect for jingles, cued sound effects, etc!
I think this was probably originally designed to make them more shock resistant than CDs in portable players, the advantage for performance being a useful side-effect?
Minidisc was far more successful than DCC, which had no appeal to anyone and was dead in the water. MD provided a brilliant audio solution for quite a few years (a decade or more) - and the discs were durable in their protective cases; I still have some from the mid 90s which play quite happily - until the storage size of affordable flash memory gave us useful portable MP3 tech, once the cost of a GB made it worth the effort. There's no way MD was a complete failure, despite not being mass-market.
DCC, yes, total waste of time! Digital cassettes? No random access? Tape always wears out? Yuck! MD was like being able to record your own CDs, years before the average PC and CDR burners made that possible for most of us, and it was small and oh so portable.
Great article, but no mention of the ADAT (or if there was I missed it). A very popular eight-track digital tape machine that recorded to high-quality VHS tapes, with the added bonus of being able to sync several machines together to get sixteen-track digital recording. It was a fairly expensive bit of kit, but just about within reach of ordinary musicians. I recorded an album on them back in the late 1990s, and the guitarist in my current band still has two in his loft. Nowadays it's Ableton Live at home, and almost inevitably Pro Tools in any studio we go to.
Looking forward to the next instalment.
Two-channel digital audio was a reality at the BBC in the early '70s, so I was surprised to hear it took another 10 years to get it to work in commercial multi-track systems. The ad on page 1 shows they were using these recorders with an analogue mixing desk, with all the conversion problems that involved. How long did it take for full digital mixing to happen? I still remember record companies making a big fuss about selling compact discs with full 'DDD' recordings, and I'm sure that had become fairly standard in the early 90s.
Almost didn't click on this one as the title made it look like some more crap from Orlowski, glad I did ;).
It and some others got deleted; as of now there is just the original comment and the editor's reply.
I once worked at a computer magazine where the editorial guys were heavily pressured by the advertising guys to write favourable reviews of heavily advertised product. They resisted, but the magazine folded, maybe due to poor advertising sales...
I would hate to think the same is happening to the Great British Hifi Press. But in this case I'd actually prefer it to the alternative - that they actually believe what they say!
What a bunch of children they are.
I've emailed the magazine about my disgust at their behaviour - not something I normally do but it provided great procrastination from doing some actual work here. : )
It doesn't matter how cheap the cable is, as long as it's built OK of course. USB carries digital data, it does not know what that data is, all it knows is that it ALL has to get to the destination. It uses handshaking and error detection, everything always gets through correctly, it simply doesn't behave like analogue cables at all, in any way!
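A toy illustration of my own of why that is (the polynomial below is the reflected 0x8005 form commonly used for 16-bit CRCs; the real USB parameters differ in detail, but the principle is the same): every data packet carries a checksum, so a cable-induced bit flip is detected rather than passed on as subtly "degraded" audio.

```python
def crc16(data: bytes) -> int:
    # Bit-by-bit CRC-16 using the reflected 0x8005 polynomial (0xA001).
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc >> 1) ^ 0xA001) if crc & 1 else crc >> 1
    return crc

packet = b"16-bit/44.1kHz PCM audio payload"
corrupted = bytes([packet[0] ^ 0x01]) + packet[1:]   # one flipped bit
# A CRC built on a polynomial like this catches every single-bit error.
print(crc16(packet) != crc16(corrupted))  # True: the receiver knows
```

The data either arrives bit-perfect or the error is flagged; there is no mechanism by which a posher cable could make correctly received bits sound "sweeter".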
http://www.whathifi.com/review/chord-company-prodac-0
"The change comes down to a change of the RCA terminal to one that uses a (posh) plastic outer casing instead of metal. It is said to damp down vibrations fed into the cable and reduce stray electrical currents.
And there’s no denying this seemingly insignificant change lifts the Prodac’s performance noticeably. It’s sweeter, smoother and more precise than before without losing any of the excitement of the older version."
No denying, eh? Obviously not if you purge the contrary comments.
(A Fool) <----------- easily parted ------------> (A Fool's Money)
Need I say more?
I had this out in a hi-fi shop, and the chap conceded that once you got up to £20 SPDIF cables you really couldn't tell the difference in double-blind testing, and that some people were happy enough with bell-wire and a couple of coaxial plugs. Not USB, mind, but much the same principle.
Not only does What Hi-Fi shove expensive crap with lying reviews, it deletes comments that disagree with them - mine and someone else's comment agreeing with me have consequently been deleted. So much for 'Your Opinion'. They've clearly been paid for good reviews: magazines need the hardware, the hardware needs the magazines. Eating themselves in neat little circles.
What, ONLY fifty-five quid for an audiophool version USB cable? That's so cheap... it actually hurts. Go compare that denon ethernet cable (AK-DL1, complete with "signal directional markings" so the electrons know which way to go) for 500 bucks. Or some of that pear cable for, er, quite a bit more.
Anyway, this was about the recording industry, where even though the kit is spendy it does deliver something useful, not the audiophool side where people claim they can hear impurities in their cables, so they want oxygen-free copper in their "interconnects", or silver, or mercury lines so the sound'll be more fluid, or... Oh dear, I've given someone ideas now, haven't I?
Wow what a great article! I really enjoyed reading that; thank you!
Who remembers 'oversampling'? IIRC I had a Marantz CD player (long since worn out, but it gave very good service) that had 4x oversampling. IIRC it actually spun the disk 4x faster and sampled everything 4 times, or something like that. It was something to do with removing high-frequency noise that was just outside the human hearing range but could still 'annoy' the listener.
Does anyone remember that? Can anyone explain it better than me? I'm sure I've got it wrong!
Thanks
Oversampling means increasing the sample rate. It was originally introduced by Philips because at the time their DAC (TDA1540, from memory) only had 14 bits, so they used this technique to get closer to 16. The disk spun at the same rate; the extra samples are created mathematically in a digital filter (the SAA7220 in Philips' incarnation). With a new sample rate of 176.4kHz the high-frequency noise (images) is easier to remove, being further from the wanted signal.
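For the curious, a minimal sketch of my own of what that filter does (the Hann-windowed-sinc coefficients here are an illustrative stand-in, nothing like the real SAA7220): zero-stuff to four times the rate, then low-pass at the original Nyquist frequency so only the images above ~22kHz are removed.

```python
import math

def upsample4(samples, taps=95):
    L = 4
    # 1) Zero-stuff: three zeros after each sample (x4 rate); the x L
    #    gain compensates for spreading the energy over new samples.
    stuffed = []
    for s in samples:
        stuffed.extend([s * L, 0.0, 0.0, 0.0])
    # 2) Interpolation filter: windowed-sinc low-pass with its cutoff
    #    at the ORIGINAL Nyquist (a quarter of the new sample rate).
    mid = taps // 2
    h = []
    for n in range(taps):
        t = n - mid
        ideal = 1 / L if t == 0 else math.sin(math.pi * t / L) / (math.pi * t)
        hann = 0.5 - 0.5 * math.cos(2 * math.pi * n / (taps - 1))
        h.append(ideal * hann)
    scale = sum(h)
    h = [v / scale for v in h]           # normalise to unity gain at DC
    # 3) Convolve (linear phase: output delayed by `mid` samples).
    out = []
    for i in range(len(stuffed)):
        acc = 0.0
        for k, hk in enumerate(h):
            if 0 <= i - k < len(stuffed):
                acc += hk * stuffed[i - k]
        out.append(acc)
    return out

# A 1 kHz sine at 44.1 kHz in, the same sine at 176.4 kHz out.
x = [math.sin(2 * math.pi * 1000 * n / 44100) for n in range(441)]
y = upsample4(x)
print(len(x), len(y))   # 441 1764
```

With the images pushed up towards 176.4kHz and the audible band untouched, the analogue reconstruction filter after the DAC can be a far gentler (and cheaper) design.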
The album The Visitors by Abba was marked as DDD and that was 1981 (picked up an original release second hand a few years back). Listening to the original release (and not the overcooked remastered versions) it sounds quite thin and clinical compared to later releases by other bands (again not the overcooked modern releases). Perhaps because nobody had any real idea how to master for the new format?
I wonder to what extent the motivation behind the creation of non-recordable formats was driven by being able to control the manufacture and supply of recordings? Think back to the birth of wax cylinders etc. - I imagine a large part of the idea behind their invention was the ability to record as well as play back. By the time you got to bakelite records, was mass production the main idea behind the format, or was controlling or eliminating duplication a major concern even back then?
The contract between the record company and the artist still contains clauses that the artists pay for any product broken in transit. The companies also insist that artists pay towards the R&D on CDs.
The first is a throwback to bakelite 78s, the second a blatant rip-off.
The record companies ALL deserve to die. Horribly. They got fat and greedy in the boom years, then lazy and complacent. They paid over the odds for back catalogue, then shamelessly exploit it. (yes I do mean you T***la M**wn). They buy old songs then force wannabees to sing them on TV talent shows, then flog the dross to grannies and pre-teens.
Die. Soon.
The Mitsubishi multitrack was a dog to get working; it spent more time in the workshop than the studio, so we went with the Sony instead.
I remember Sony had a full-page advert in a trade mag with a 3324 sitting on the back of a Mitsubishi truck. The caption was something like "They make good trucks". It made me laugh at the time.
I actually appeared in What HiFi back in 1996, listening to Vinyl! They did seem like quite nice people.
However I soon learnt while attending a sound recording course that the ideas of 'hi-fi' and studios don't quite match up: a mega-pounds set-up will never quite reproduce the sound you get in a studio, especially a live session while the red light is on.

I'm quite happy with my Technics SU-V90D amp and £10 speakers, and I really couldn't care what anyone thinks about that. Christ, I am still using a 10-year-old Soundblaster Live! Platinum with its add-on drive for my own studio...
This article is good overall, though it makes some annoying Stephen Fry-like statements, for example:
"So don’t ever let anyone tell you digital systems aren’t noisy – digital audio depends on analogue noise patterns to mask the presence of its own artifacts, granulation noise."
Digital systems aren't noisy: the noise floor of digital audio is infinitesimal compared to analogue, regardless of dithering. And what is "analogue noise"? Did the author mean "random noise"?