Compute time was billed at around £750-800 per hour
"Compute" is a verb. The word you're looking for is "computer", which is a noun.
Think today's computing industry moves fast? Try that of decades past. Moore's Law predicts that processor power will double every couple of years or so, but on December 7, 1962, the scientific computing power of the entire UK is said to have doubled in a single day. That was the day they switched on the original Ferranti …
"Compute" is a verb. The word you're looking for is "computer", which is a noun.
Compute time != computer time. ATLAS was a multitasking system, so in modern parlance the CPU time spent solving your pet problem (i.e. the "compute" time) would not be the same as the elapsed time the computer spent running your program. In other words, even if a program took an hour to run to completion, it may have received only five minutes of actual compute time during that period.
Surely time is a noun. Wouldn't that make compute an adjective? More specifically, since it is based on a verb, a participle adjective?
"Surely time is a noun. Wouldn't that make compute an adjective? More specifically, since it is based on a verb, a participle adjective?"
It's a noun used as an adjective.
Use of a verb form (a participle) as an adjective requires the present or past participle of the verb. So, this would require 'computing time' or 'computed time'. (cf. 'baking apple' and 'baked apple', but _not_ 'bake apple')
If you want to use the noun as an adjective (cf. 'apple pie', 'car park') then you'd need to say 'computer time'.
Using the bare verb form, 'compute time', is the equivalent of saying "Don't talk now, it's not talk time, it's eat time." It looks and sounds clumsy and ugly. However, language changes fastly in this modern world and I look forward to more of these disruptive read experiences.
"The word you're looking for is "computer", which is a noun."
In the pre-electronic time a "computer" was a lady with a quill pen. At Harvard College Observatory they were paid $0.25 to $0.35 per hour in the 1920s.
Or perhaps "computing" - also acceptable as a noun.
I'm finding it difficult to comprehend how the author can have got something so obvious so horribly wrong.
"I'm finding it difficult to comprehend how the author can have got something so obvious so horribly wrong."
Don't worry, the beatings will continue until morale improves.
Syntax Error !
We routinely refer to "compute time", which is how many CPU-seconds are used across all processors, vs clock time or the computation time of one cpu.
Pedantry would result in the language police renaming disk drives, without thinking about the actual history of the word.
Computers compute, no matter if human or machine.
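The compute-time vs clock-time distinction drawn above maps directly onto two clocks Python exposes today: `process_time` (CPU-seconds charged to the process) and `perf_counter` (wall clock). A minimal sketch:

```python
import time

def busy_and_idle():
    """Burn some CPU, then sleep; return (wall seconds, CPU seconds)."""
    t_wall = time.perf_counter()
    t_cpu = time.process_time()
    sum(i * i for i in range(10**6))   # compute: advances both clocks
    time.sleep(0.2)                    # idle: advances wall clock only
    return time.perf_counter() - t_wall, time.process_time() - t_cpu

wall, cpu = busy_and_idle()
print(f"elapsed: {wall:.2f}s, compute: {cpu:.2f}s")
```

On a multitasking system the two diverge exactly as described: the sleep (or time spent waiting behind other jobs) shows up in elapsed time but not in compute time.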
I can finally get on with my life - after all, this air traffic won't control itself
Are you going to beat the impiety out of us or beat the happiness into us?
The word you want is computation
I like to note that people here have probably used more processing power than the Atlas was capable of over its lifetime to argue the toss over an irrelevant piece of grammar.
Welcome to the internet. :)
> In the pre-electronic time a "computer" was a lady with a quill pen. At Harvard College Observatory they were paid $0.25 to $0.35 per hour in the 1920s.
Since we all seem to be trying to out-pedant one another, may I just add that quill pens were replaced with steel nibs long before the 1920s? They may still have been dipped in ink, but no geese were plucked in the making of them.
Sounds like a case of US English vs Proper English.
In the Queen's English, compound nouns are formed from participles of verbs and genitive cases of nouns. Also (not directly relevant here), collective nouns are generally treated as plural. In the strange dialect picked up by the descendants of those who forgot to pack a dictionary when the Mayflower sailed from Plymouth, compound nouns are formed from infinitives of verbs and accusative cases of nouns -- and collective nouns are always treated as singular.
So whilst we would say "computing time", Americans might well say "compute time". Compare also "The girls' swimming team have chosen their new mascot" vs. "The girl swim team has chosen its new mascot".
"Don't worry, the beatings will continue until moral improves."
I think you meant "morale"...unless that was deliberate and I missed it.
I speak as a native Brit. I understand that collective nouns can make sense as plurals, but I prefer to treat them as singular when I can, because a group of something is a singular thing and singular things take singular verbs.
And no, the Americans don't get everything right, but they did not forget to pack a dictionary because the Mayflower sailed before Johnson wrote it (and the people who sailed on the Mayflower were all subjects of the British Monarch and largely expected to remain so). What actually happened was that the language changed in different ways on either side of the water. US English is closer to British English than some English dialects, particularly Scots (and Doric along with Ulster Scots/Ullans). Patrick Stewart could talk Yorkshire to you and you wouldn't have a clue what he was saying unless you were from near the village he grew up in.
And at one time, the English who had emigrated to plantation-era Ireland believed the English in England were letting English go to the dogs and that they were speaking a truer form.
"Think today's computing industry moves fast? Try that of decades past. Moore's Law predicts that processor power will double every couple of years or so, but on December 7, 1962, the scientific computing power of the entire UK is said to have doubled in a single day. That was the day they switched on the original Ferranti Atlas."
C'mon! All together now! Let me hear you!
"But could it run Crysis?"
Most likely no, but it might have played Pong by teletype.
"December 7, 1962 .. was the day they switched on the original Ferranti Atlas, the UK's first supercomputer .. The Atlas delivered nearly a hundred-fold advance in processing power over previous computers and it brought many innovations, including virtual memory and a multitasking operating system" ...
What are the specs in modern day parlance?
In beardseconds please.
Preferably how many ips/tips/hips or kips, since beardseconds are inherently variable. Especially female beardseconds, which are difficult to measure.
From the Wikipedia article linked below, a floating point multiply took 5μs - so (very roughly) 0.2Mflops.
The current fastest computer (Titan - also the name of the Atlas 2 prototype) has 20Pflops and cost $100 million (roughly equivalent in modern terms). So a 10^11 improvement in 50 years, or doubling every 16 months - someone should turn that into a law, or something.
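The back-of-envelope arithmetic above can be checked in a few lines; the Atlas and Titan figures are the rough ones quoted in this thread (0.2 Mflops and 20 Pflops), not precise benchmarks:

```python
import math

atlas_flops = 0.2e6   # ~0.2 Mflops (one 5 us floating-point multiply)
titan_flops = 20e15   # ~20 Pflops (Titan, 2012)
years = 50

ratio = titan_flops / atlas_flops            # ~1e11 improvement
doublings = math.log2(ratio)                 # ~36.5 doublings
months_per_doubling = years * 12 / doublings
print(f"{ratio:.0e}x over {years} years: doubling every "
      f"{months_per_doubling:.0f} months")
```

Which does indeed come out at a doubling roughly every 16 months.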
Unless you understand mainframe architecture the specs won't really make sense compared to a desktop machine.
Actually quite substantial, according to the wikipedia page. It had 48 bit words, with 16kw of RAM and 96kw of drum storage --- that's 96kB and 576kB --- mapped into a 24-bit virtual address space, and what looked like paging between drum and core so that applications could use the full address space. It had an MMU, full interrupt systems and was asynchronously clocked (something that's still not done much today). Performance seems to have been about 500000 flops.
The washing machine comment is a bit harsh: this is way more powerful than the kind of embedded PIC you find in that sort of thing, and even beats a lot of 8-bit microcontrollers of today. I think it's roughly equivalent to about the 16-bit microcontroller class, although you'll need to find one with an FPU.
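The core-plus-drum arrangement described above, with both mapped into one virtual address space, can be sketched as a toy pager. This uses the figures quoted in the comment (512-word pages, 16k words of core = 32 frames); the real Atlas used associative page registers and a "learning" replacement algorithm, so plain FIFO here is a loose simplification:

```python
# Toy sketch of Atlas-style demand paging between core and drum.
PAGE_WORDS = 512
CORE_PAGES = 32                     # 16k words of core / 512-word pages

free_frames = list(range(CORE_PAGES))
core = {}                           # virtual page -> core frame
fifo = []                           # residency order, oldest first

def translate(vaddr):
    """Map a virtual address to (core frame, offset), faulting the
    page in from the drum when it is not resident."""
    vpage, offset = divmod(vaddr, PAGE_WORDS)
    if vpage not in core:           # page fault
        if not free_frames:         # core full: evict the oldest page
            victim = fifo.pop(0)    # (and write it back to drum)
            free_frames.append(core.pop(victim))
        core[vpage] = free_frames.pop()
        fifo.append(vpage)
    return core[vpage], offset
```

The point the specs make is that applications saw the full 24-bit address space while only 32 pages were ever actually in core, decades before this was commonplace.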
Does anyone have a reference to the instruction set?
The instruction set can be found here:
Hey! An emulator!
"When we had finished commissioning, we eventually got to a point where the machine would run for ten minutes without fail, and at that point we all cheered and went to the pub to celebrate surviving ten minutes,"
Wind the clock forward 20-ish years and I could have said the same, only it was the software not falling over that we celebrated.
Surely I am not the only person here who actually used that machine? It rocked, until the day its console caught fire and we had to send our jobs to the Harwell Atlas while they rebuilt it.
For performance numbers see http://en.wikipedia.org/wiki/Atlas_Computer_%28Manchester%29#Hardware
The chick who's standing up - do you have her number?
Phone number, or the serial number of her Zimmer frame?
Who carried on using these things until 1976? Amazing.
No you're not the only one - I used London University's Atlas around 1967 when doing post-grad biochemistry, to process readings from a scintillation counter. A lab assistant in a brown coat would take my printouts away, and 3 or 4 days later bring back another pile of fan-fold with the results. Then I'd sit down with a clockwork desk calculator to finish the job.
Now I understand the origin of the HCF assembler command (Halt and Catch Fire). Thank you.
Early masks of the Motorola 68000 CPU had this bug as I recall. There was an opcode that caused one set of data bus buffers to turn on all 1s, and another to turn on all zeroes. The result was a deeply satisfying but expensive crack and a smell of frying epoxy in the commercial version. The ceramic-clad version just sat there and got hot till the wires fell off the leadframe.
"Now I understand the origin of the HCF assembler command (Halt and Catch Fire)."
The ICL 2903 had a built-in card reader that was totally controlled by software. It was possible to get the microcode timing sequence wrong - and effectively issue a command to the picker solenoid of "select, hold, and catch fire".
"Each of the three Atlas machines". A bit unfair, because three Atlas II machines were also built.
This also might be of interest:
The owner of the farm where Flossie is housed has, unfortunately, had to sell the property (the original Darling Buds Of May series location) and a new owner is in the offing, so the future of the machine is uncertain. Flossie is normally demonstrated on the same day as an annual charity classic car show at the farm that I am involved with (slightly off-topic, I know!), so the whole event may be in jeopardy.
I wonder which management genius thought it would be a good idea to build a precision machine in a locomotive factory?
Hard to remember how good Britain used to be at this sort of thing. Not just Atlas but the monster transistor computers built by the University of Manchester.
Britain is still very good at this sort of thing. Both building precision stuff and crap middle management.
Can't think of any washing machines off the top of my head that sport a multitasking OS, unless they happened to be Internet connected or something. Even then a proper MMU and an FPU would be pretty redundant.
Why middle management? It's the people on over £200k a year that seem to have undefinable jobs in failing organisations.
You'd almost think if we got in some Japanese or European mainland management for a British workforce we could turn out world class stuff.
Oh wait, we do.
IIRC, back in the 90s, Philips washing machines used out-of-spec SPARC motherboards.
I have no idea what OS they ran, but Solaris would make sense.
Multitasking may be overkill, but I'd certainly like a washing machine that I could program myself. E.g. to overcome bulletproof Zanussi's inability to rinse satisfactorily.
The manufacturing site mentioned in the video for Atlas, during its later days through ICL & Fujitsu Services, had an informative mural in a corridor showing the evolution of the UK computer industry, including Atlas. I think the tower at West Gorton might be gone now, as it was shedding lumps from on high when I was last there.
Years ago (mid-70's) we had to write programs - FORTRAN, I think - only 5 lines or so - to teach us computing. (Or was it Basic?) We only had one mechanical tty to use.
This was at Coventry Technical College, and I seem to remember the instructor telling us a) the machine was at Manchester, and b) it was frighteningly expensive to use - we only had a couple of minutes or so per student. He'd vet our paper-written programs first, and made us practise keyboard skills from a sheet of copied paper (the copier, IIRC, was based on methylated spirits, and used a drum) so we didn't waste time on "hunt 'n' peck" typing.
Back in the days when my then-girlfriend was using punched cards, she'd sometimes bring a stack home so I could help her debug - and her engineer fell about laughing when I told him "Half a byte is called a nibble" - which it is!
Yoof of today don't know they've been born. Luxury (etc.)
> the copier, IIRC, was based on methylated spirits
Ah, Roneo machines. We used to do our school newsletter on them; I still remember the smell.
Biting the hand that feeds IT © 1998–2017