Perhaps they should get their data into some form of computer analysis system and see if there is some bias in their data.
Mind you, top-quality lawyers would be cheaper than their nearest available option for that.
No, it's an indentation Nazi, which is just as bad. It just shifts the problem somewhere else, it doesn't get rid of it. Because that's not really the problem: it's the underlying concepts that people find difficult.
I have been teaching some youngsters and most prefer brackets when things get a bit complicated. It may be my influence, I'll grant you, but when a code block stretches to a few tens of lines, parentheses matching is a godsend. Who needs a collapsible editor when you can find a matching delimiter? And why do collapsible editors only ever collapse from the top?
I remember when, at one company, we started off trying to work out how to be secure without locking ourselves out. It's not as easy as it sounds. A safe with the encryption code in an envelope? For $190 million that's not safe. You can keep adding layer after layer but it always ends up with a SPOF, either in terms of making it possible for others to get in or locking yourself out. Banks have two-key access to boxes that are freely available to people with diamond-tipped drills.
When working on something that is 'designed' to avoid outside scrutiny, I can easily imagine that, what with keeping it in house and not trusting your colleagues, this loophole was never closed.
I worked at Martlesham - the culture you mention (obfuscation, delay, decision-aversion) only came in with privatisation. We had technology there that BT would love to have now, but they filled the place with accountants and introduced an internal market, et voilà: "Your every move and decision scrutinised."
The pension fund deficit was of their own making - they could have put money into it; they would just have had to pay a little tax, but they knew the moronic tory government that had privatised BT had said it would cover any deficit for all the people like me they made redundant.
Your rant is so far from the reality I remember I can only guess you got it from the tory handbook of making up shit.
That's just going to start a wave of people complaining about their sparrows being eaten. Peregrine Falcons take the odd rock dove - but having twice been near the killing zone of a PF, I can tell you the noise made by the raptor coming in at over 150mph to take a bird by surprise will take you by surprise - and that's a lot more volume than a pigeon dropping!
I think one of the troubles with scientists is they did not spend enough time playing as children. I used to spend days playing in the dunes near my grandparents in Aberdeen, and most of Mars can easily be created with sand and wind alone - no need for water. It's because we have a lot here that we tend to assume it's necessary for there to have been some on Mars at some time, but even rocks can flow like a fluid to cause the erosion seen in many places, and wind will layer sand to make sedimentary rock.
The latest theories about Earth's water (which doesn't match the isotope ratios of comets in the least) seem to suggest it may have been created in the hot depths of the Earth - something that may have existed in Mars for a billion years, leaving 3.5 billion years of wind to create most of what we see now.
Terse but understandable code? That really depends on the problem at hand - many problems I'd have to deal with cannot be written in terse and understandable code because the problem is not terse or understandable even in its simplest breakdown. I always used to tell people who said they could write code without a goto to fuck off and write a device driver, or if that didn't work, run something on their object code that removed all the jmp instructions.
I grew up in chip design and some code worked at the quantum level in emulating how a device worked. When that code didn't work as it should, you couldn't modify it - some of it was the combination of several PhD lifetimes of the best minds in the world. If you found a situation where their code failed, you just had to make sure you never called it in a way that would make it fail.
I used to code as a tool - I'd write stuff that made my chip design easier and less error-prone. The best part of my job then was that computers were slow: even my hyper-efficient jobs used to take a few days to run, which let me go and read all the journals in our library. Around '87 I came across a paper where the author looked at C and wrote simple code to check for buffer overflows, unreleased memory and a whole bunch of other testable errors. I swept these up and added a stage to make which was run before make debug, so by the time my code was make-debugged, 95% of the invisible errors were already eradicated. Just making sure strcpy and malloc are never used in any code you test, except through sandboxed proxies, let alone considered for production, makes your life so much easier (well, once you've stopped using MS programs that need to live in 64k segments).
I've done this in every language I've had to code in for more than a few weeks, as the couple of weeks spent writing the belt and braces pays for itself pretty quickly - even if the morons who run the project won't allow you the time, it's worth sneaking it in under the radar.
Never had a buffer overflow in any language once I'd worked out it was a problem. It isn't beyond you to write a little routine that does the checking for you and to make sure you use that rather than the potentially buggy one.
This stuff was only ever hard when done in assembler without macros.
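For what it's worth, here's a minimal sketch in C of the sort of checked proxies I mean - safe_copy and safe_alloc are just names picked for illustration, not the exact routines from back then:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Checked copy: refuses to write past the destination buffer.
       Returns 0 on success, -1 if the source would not fit. */
    static int safe_copy(char *dst, size_t dst_size, const char *src)
    {
        size_t needed = strlen(src) + 1;   /* include the terminating NUL */
        if (dst_size == 0 || needed > dst_size)
            return -1;                     /* would overflow: fail loudly */
        memcpy(dst, src, needed);
        return 0;
    }

    /* Checked allocator: aborts instead of silently returning NULL. */
    static void *safe_alloc(size_t n)
    {
        void *p = malloc(n);
        if (p == NULL) {
            fprintf(stderr, "allocation of %zu bytes failed\n", n);
            abort();
        }
        return p;
    }

    int main(void)
    {
        char small[8];
        if (safe_copy(small, sizeof small, "far too long for this buffer") != 0)
            fprintf(stderr, "copy refused - buffer too small\n");

        char *buf = safe_alloc(64);
        if (safe_copy(buf, 64, "fits fine") == 0)
            printf("%s\n", buf);
        free(buf);
        return 0;
    }

Swap every bare strcpy/malloc for something along those lines and the overflow either can't happen or fails loudly where you can see it.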
RISC is RISC - it could theoretically have microcode*; it's just that people discovered that, a lot of the time, a compiler could optimise things and make them a lot faster than the same source compiled for the microcoded stuff. When memory was as valuable as the CPU, microcode seemed like a good idea. Sophie Wilson seems to have spotted that the simplicity of the 6502 could be combined with a limited instruction set and 32 bits, and still burn through source code faster than CISC, while using a lot less CO2!
The meaning of 'core' will disappear as number-crunching chips for AI come online - my Raspberry Pi's ARM CPU has vector processing units which are surprisingly nippy for doing some maths stuff, and I can see more generalised arrays of this sort of thing, optimised for AI rather than, say, graphics, becoming very important in the future. But IBM seem to have shown that 8-bit maths is more than adequate for a huge amount of AI work, so how many 8-bit cores does a 64-bit 'core' count as?
* you could argue that memory cache is a form of microcode but for memory access rather than ALU access.
I was lucky enough to be into chip design and I remember getting a numeric co-processor to speed up circuit simulation (PSpice on a 286/287) and boy did it speed it up! ISTR it was faster than on our VAX 780. Agree with you on segmentation, though - I think it put back PC computing by 13, possibly 15, years. I played with CP/M-68K around the same time and it was a dream to code for. I still wonder who the twats at IBM were that wanted Kildall to sign a non-disclosure agreement and wanted to buy CP/M outright, and yet didn't do the same to Gates.
It had a 68008 with an 8-bit data bus, but it was only half as fast as the 68000, which, as a 32-bit chip, was actually pretty damn good. You could write code without worrying about 64k boundaries, and if IBM had chosen it instead of the 8088 (also 8-bit data) and they'd got CP/M-68K instead of MS-DOS, the first 15 years of the PC's life wouldn't have been a complete fucking pain.
Even mainframes were slow at the time. They made you think about how to make them work for you, rather than sending emails to someone else to get them to do it or writing a document to pretend you'd done some! I used to use my first at-work IBM PC to craft scripts to do all sorts of editing to file listings, which would then be modified by another script to create massive batches of jobs to run on the mainframe - which, while slow, could run for a couple of weeks at a time doing my work for me.
Drupal, like many web site arseistants, seems to be built on the false premise that something complicated can be made simpler. All it does is move the hard bit somewhere else, where you will have to meet it not face to face but via a whole new bunch of not-quite-worked-out problems. It's the Fourier transform of a spike: rather than a short click, you end up with signals at every frequency.
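To spell the analogy out - this is just the standard textbook result, nothing specific to Drupal - the Fourier transform of an ideal impulse is flat:

    \mathcal{F}\{\delta(t)\}(\omega) = \int_{-\infty}^{\infty} \delta(t)\, e^{-i\omega t}\, dt = 1 \quad \text{for every } \omega

So something perfectly localised in time shows up at every frequency: squash the difficulty into one spot and it smears out everywhere else.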
The iPad would not do what he wanted it to do for any money he can afford. Ditto macOS.
As for changing to another Windows OS, he's one of these people who bought into "computing is easy", so he's not up for that.
I may offer to help him upgrade his machine to Linux, but not until he's calmed down. He spent ages preparing his demo and getting the people together to watch it, and MS takes over his PC and makes him look like a twat (or so he feels). Why can't they just have the common courtesy to say: your machine needs updating and it will grind to a halt for a while - is that OK, or do you have a living to earn from this shit piece of software we made others put on your computer?
It appeared in many papers as a grid that you would look through, desperately trying to find the shoot.
I found it on one site - it would have been an inch or two off to the left on all the other ones published. It seems most sites crop (crap) the photo to fit the layout without a person with a clue checking it.
The problem seems to be that 90% of the people involved in developing web pages are "definitely not skilled in the art".
I must confess I have my name on a patent, but my job was at stake for not agreeing to write up something I'd knocked up the functional core of in a lunch meeting! I doubt it would stand up to any scrutiny, but my managers thought it was the dog's bollocks, whereas to me it was obvious to anyone reading their first art pamphlet, as Viz would describe it. I was probably skilled in that art, but 25 years on I'm still learning about computing.
People used to brew beer twice from the same grain - the first mash would produce a good strong brew - possibly up to 10%, and then the grain was mashed again and the small beer made.
Modern brewing uses a technique called sparging where a small amount of much hotter water is used to wash the remaining sugars out of the grain after the first (and now only) mash liquor has been drained off.
Conversely, there is a pub near Hexham called The Twice Brewed, where they used to add more grain for the second mash to make an even stronger ale. It's in a village called Once Brewed!
Were LAs doing things inefficiently, or did Crapita just manage to sell the idea that they were? I worked for one small council that had managed to avoid outsourcing, and we had far, far fewer IT staff, in house and out, than all the outsourced ones of similar size.
I believe it's just another tory myth that public services are inefficient - as outsourcing proves time and time again.