* Posts by Steven Jones

1526 publicly visible posts • joined 21 May 2007

Scotland bans smut. What smut? Won't say

Steven Jones

@Chris W

Films with certificates are exempt.

Steven Jones

@ElReg!comments!Pierre

Nope - you are quite simply wrong. The question was asked of the Crown Office and Procurator Fiscal Service under what conditions they would prosecute. Whilst they will only do so if there's a reasonable chance of a conviction, it is not they that decide whether the law has been broken. With a law of this type, where the boundaries are subject to interpretation, only a court can decide and establish the case law.

So if you will, the speed limit sign is there, it's just not that easy to work out where the limit is, and only when case law is established will that become clearer. However, the prosecuting authorities may have other conditions they use to work out what cases are to be prosecuted. For instance, they might decide that it is not in the public interest to prosecute people with just a couple of suspect videos if they have not come to the attention of the police for related offences.

It's about time the great British public learnt the difference between the role of public prosecutors and that of the courts.

Steven Jones

@Danny 14

It is not magistrates who decide whether somebody is prosecuted or not. That decision is made before the case arrives in court. The magistrates' job is essentially to decide on guilt and what any penalty might be. If you are doing 31 mph in a 30 mph zone then, unless there is some doubt or unusual circumstances, the magistrate will have no choice but to find you guilty.

Steven Jones

@Blofeld's Cat

Only if you have a video of the event, and only if the action was designed to provoke sexual gratification. One of the ridiculous things about this law is that you can have a video depicting all sorts of nasty physical acts, but it's only illegal if it's intended to provoke a sexual reaction. So a video depicting a murder in a realistic way is legal whilst the same is not true of a rape scene. That's unless it's gained a certificate, in which case it is all irrelevant.

Steven Jones

@Mike Shepherd

Of course removing the speed limit signs is not an exact analogy. The analogy would be the police not declaring just how far you have to be over a speed limit before they will prosecute. So, for instance, it may be policy not to prosecute those travelling at 44mph in a 40mph zone, but those policies can change, and they don't want everybody driving to the limit of what the authorities would tolerate.

Of course there's a problem in interpretation of the law as this is not such a cut-and-dried situation as a speed limit. The problem here is that nobody is quite sure where the boundary will be, and neither will the prosecuting authorities until there are some test cases.

Crematorium to heat council swimming pool

Steven Jones

Doubtful

I'd like to see the technical details involved there. I suspect they are just chucking the bunnies into an incinerator with a load of combustible rubbish. A body with a lot of fat will contain quite a bit of fuel, but human beings are approximately two-thirds water, not to mention all the mineral parts that don't burn. When the road manager for Gram Parsons used five gallons of gasoline to try and incinerate his body and coffin at Joshua Tree in a sort of improvised cremation, he only partly succeeded.

In any case, just how many bunnies are running amok in Sweden to make any meaningful energy contribution?

Steven Jones

And the issue is?

I'm not sure what the problem is. It would be inappropriate to have happy swimmers splashing around in sight of the crematorium, but that's a proximity issue; reclaiming the waste heat from the furnaces seems perfectly sensible. It's not as if somebody is promoting the composting of bodies or using them for methane production. The energy will be coming from the burning of gas. Dead bodies do not make good fuel.

Some people may recall a scene from Brave New World where the chimneys at Slough Crematorium were equipped to recover valuable chemicals from incinerated bodies, an invention which I feel owes more to artistic license than economics, but this is a step on the way.

There are also those who opt for "green burials" where they do, indeed, expect to fertilise the earth. Then there are those societies that practice sky burials and you get recycled for vulture food.

Big new wind turbines too close together, says top boffin

Steven Jones

Smaller turbines are not the answer

Smaller turbines are less efficient, they spin faster, make more noise, require more masts and, as they span less of the vertical height of the moving air mass, simply do not have access to as much wind energy (and, before anybody suggests it, it is not practicable to vertically stack multiple wind turbines on the same mast for all sorts of reasons).

However, it's hardly surprising that the more energy that's taken out of the moving air stream, the more interaction there will be between nearby turbines and other windfarms.

Man killed by own cock

Steven Jones

Oh Dear

This story is clearly a Sara Bee revenge fantasy. I think the warning needs to be heeded.

BBC rebuilds Civilisation in HD

Steven Jones

Science Documentaries

Far and away the best science series in the last decade (and a bit) was Aubrey Manning's "Earth Story". It was a masterpiece in how to tell a compelling story of discovery without recourse to historical reconstructions or over-emphasis on the histrionics of the presenter. Simply superb, and a lesson to producers and directors on how to do this stuff.

Horizon is still very, very patchy. The recent Ben Miller programme was an insult to the intelligence: he, a Cambridge physics graduate and somebody who started a PhD, pretends to be incapable of explaining to some mythical woman at a dinner party what 1 degree meant, along with the concept of temperature (and later on, normal distributions and averages). It skittered along, larded with pointless shots of his Citroën DS and of him climbing a ladder to read a thermometer on the roof of his house, starting with some very simple illustrations and then throwing in a bit of quantum mechanics, superfluidity and nuclear fusion without adequately explaining some of the basics on the way.

For instance, the importance of triple points as a temperature standard was emphasised, but they failed completely to explain the role of pressure, which would leave a lot of confused folk wondering why they don't see this happening in normal life. Indeed, they didn't explain what a phase change was at all, something pretty critical to understanding what melting and boiling points mean (which they mentioned in passing).

Frankly, a moderately educated primary school teacher can explain what temperature is in qualitative terms. The illustration they did use for temperature (dodgem cars moving faster) was OK in itself, but that's only a metaphor for temperature in a gas, not a liquid or solid.

However, it wasn't primarily the poorly explained science and structure that got to me, it was Ben Miller being forced to go through this pretence that he couldn't explain this stuff perfectly well, and having to rely on a succession of world-leading experts to do it. Indeed, I'm inclined to think that we might next find him in a brain-dead populist SF series looking something like a cross between Hollyoaks and Torchwood, infested with malevolent time-travelling dinosaurs eating large numbers of the population without, apparently, coming to public attention. You mean somebody has thought of that already?

Steven Jones

Wrong series

A lot of Civilisation was patronising and, in its implication that the arts virtually define civilisation, it misses the most important aspects. The discussions are largely through the eye of the artist. The roles of civics, education, governance, rationalism, technology, science, philosophy and mathematics are all subservient to the visual arts. That's not to mention that the focus was narrowly on western Christian culture.

To my mind, a far greater insight into human progression and civilisation can be found in Jacob Bronowski's Ascent of Man. This, inevitably, identifies with western rationalism, Bronowski being an archetypal modern inheritor of the Enlightenment tradition. However, he allowed space for other cultures and was certainly no philistine when it came to the arts. Prior to WWII he moved to Majorca to be near Robert Graves, he married a sculptress (who died only recently) and was a supporter of modern art. If you forgive the crude graphics, Ascent of Man has stood the test of time very well indeed.

Bronowski was the man I secretly suspect Richard Dawkins wishes he was. We were the poorer for his early death. If anybody could communicate the importance of rationalism, of secularism, of the nature of humanity without raising an allergic reaction, it was him. Also, Jacob Bronowski left us Lisa Jardine. Lord Clark begat Alan. Enough said, I feel.

Come on BBC, at least broadcast Ascent of Man again, and think that there's more to civilisation than that seen through the eye of an artist.

Who are the biggest electric car liars - the BBC, or Tesla Motors?

Steven Jones

Numbers

Even if the 9 kWh figure for refining a gallon of petrol is correct (I'd like a reference please), which I assume is thermal energy and not electricity, it will not drive the Tesla for 48km. That's because you've made the classic mistake of not allowing for the thermodynamic efficiency of power generation and distribution. On UK power generation, the electricity delivered at the wall socket is about 31% of the original thermal energy. The reference I have is a little old, so it may have gone up a percent or two as some modern gas-fired power stations are a bit more thermally efficient.

http://www.powerwatch.org.uk/energy/graham.asp

That would turn your 48km into 15km. Ten miles is nice, but a small diesel will do well over 100km on a UK gallon. Even my Focus diesel averages 90km a gallon. If you are going to use a Nissan Leaf as a comparison point for a diesel car, then we need to choose something modern and comparable. The Skoda Fabia Greenline II 1.2 is rated at 89 gm/km of CO2 and 132km on a UK gallon.
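
For anyone wanting to check the arithmetic, here it is as a short Python sketch. The 9 kWh per gallon, 31% efficiency and 48 km figures are the ones quoted in this thread, not verified data:

```python
# Back-of-envelope check of the argument above. The 9 kWh/gallon refining
# figure, 31% grid efficiency and 48 km claim are quoted from the thread,
# not verified data.
THERMAL_KWH_PER_GALLON = 9.0     # claimed refining energy (thermal)
GRID_EFFICIENCY = 0.31           # approx. UK generation + distribution
CLAIMED_KM = 48.0                # range claimed if all 9 kWh reached the car

electric_kwh = THERMAL_KWH_PER_GALLON * GRID_EFFICIENCY  # ~2.79 kWh at the socket
km_per_kwh = CLAIMED_KM / THERMAL_KWH_PER_GALLON         # ~5.33 km per kWh
print(f"realistic range: {electric_kwh * km_per_kwh:.1f} km")  # ~14.9 km, not 48
```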

On CO2 it's not quite up there with the Leaf, but it's only 11% higher and the list price is a lot lower. The electricity costs on the Leaf are only so much lower because there isn't a 200% markup (a tripling of the price) due to duty & VAT. The Leaf also gets subsidised in other ways (like £5k of government money off the purchase price). If we had a rational carbon tax that taxed in proportion to the CO2 produced, the fuel costs of the small diesel and the electric car would only be a couple of pence per mile different (at least in the UK). It will take a fundamental change to the UK's mix of electricity generation plus "smart charging" systems to change that.

For the great majority of ordinary people who have mixed motoring requirements, have limited car parking space, and can't afford the luxury of two different cars, the electric cars simply don't cost in when comparing like-for-like practical runabouts when issues like purchase price are factored in.

Teslas and Corvettes are not in the same league - they are toys for rich folk with spare cash. They are not day-to-day transport for ordinary folk.

The problem with Robert is that he's an evangelist and he quotes selectively (like claiming his Mitsubishi costs only 1.33p per mile). That's rubbish as it neglects things like depreciation and financing costs on the higher list prices. If he wants an informed discussion on energy efficiency, the physics involved and some mathematical modelling, then fine. However, I think he's somewhat outqualified by many here. Personally I prefer the more balanced arguments that you find in the scientific literature, which discusses these things more objectively without his characterisation of those with opposing ideas as mindless petrol heads.

Steven Jones

Still Nonsense

All those things you list don't come anywhere near a 200% overhead. Firstly, gas flaring is vastly reduced as the gas has a value of its own. Tankers are incredibly energy efficient - they use a tiny, tiny fraction of the energy equivalent of the cargo. The evidence? Just compare the size of the fuel tanks on an oil tanker with the cargo tanks. Refineries do, of course, use quite a bit of energy, but a bit of research reveals it's a few percent of the product. Delivering fuel to the petrol station takes energy of course, but again, just look at the size of the road tanker's fuel tank and compare that with the main cargo tank. Also, that fuel tank will handle many trips from the depot.

In all, it's a fraction that's used, not a multiple. There are some sources which are particularly inefficient. Oil shales and sands are the worst, but it's still possible to reach almost 90% efficiency in some cases.

Of course, when doing all these calculations it's important to remember that there are non-fuel products produced. Chemical feedstocks of various sorts, tars and so on. So if you just take the fuel fraction you will get a higher ratio, but properly those overheads belong to the products produced.

Steven Jones

Real World Electric Car Economics

In claiming that your Mitsubishi iMiEV costs only about 1.3p per mile, a rate only achievable if you have access to off-peak power at 5p per kWh, you have glossed over the little issue of that car's list price, which is a princely £23,990, and that's after the government has contributed £5,000.

Compare that with a Skoda Fabia Greenline II, which lists at £11,445 and has a quoted combined mileage figure of about 83.1mpg, or under 7p per mile. Even at the 1.3p per mile figure, it's going to take 220,000 miles to get payback on fuel alone (and if you are paying 10p per kWh for your power, that becomes over 280,000 miles). Of course you will save on maintenance costs and there are some other government subsidies (like no road tax), but those expensive batteries aren't going to last over 200,000 miles, and I suspect they are a deal more expensive than a reconditioned car engine.
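
The payback numbers fall out of simple division; here's the working as a sketch, using the list prices and pence-per-mile figures quoted above (illustrative, not verified):

```python
# Payback arithmetic using the figures quoted above (list prices after the
# £5,000 subsidy, pence-per-mile fuel costs); illustrative, not verified.
imiev_price, fabia_price = 23990, 11445      # pounds
price_gap_pence = (imiev_price - fabia_price) * 100

diesel_p_per_mile = 7.0
for kwh_price, ev_p_per_mile in [("5p/kWh off-peak", 1.3), ("10p/kWh", 2.6)]:
    saving = diesel_p_per_mile - ev_p_per_mile
    print(f"{kwh_price}: {price_gap_pence / saving:,.0f} miles to break even")
```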

There is also the point that the electric car only gets its energy cheap because the tax on electricity is so low. Two-thirds of the pump price is tax, and the fact that an electric car using power largely produced by thermal power stations pays only reduced-rate VAT is a huge subsidy, hardly justified by the difference in CO2 emissions. The Skoda is rated at 89gm/km whilst the iMiEV's works out at about 65gm/km based on typical UK electricity generation (I've used 450gm per kWh, but that's on the generous side).

OK - that's a saving and doesn't include mining/drilling/fuel transportation, but then the power generation numbers don't either. The point is that it's not hugely better (in the US where coal generation is used more widely, electric cars are worse on CO2 emissions - there's a Scientific American article on this).

Also, many of us do not have the luxury of being able to afford two vehicles, one as a local run-around, the other for longer journeys. We don't have the parking space, the power supplies or the finance. This is not to say that electric vehicles don't have a place, but it's a niche at the moment and it simply isn't an option for the great majority of people.

My personal hope is for plug-in range-extended hybrids as there is a case for smart-charging systems which can make use of inherently peaky renewable power supplies. However, nothing beats the energy density of storage of hydrocarbons (when containment is taken into account), and the future for long distance travel probably involves artificially produced hydrocarbons in tandem with some battery storage. However, pure electric cars are almost certainly never going to replace IC engines.

Steven Jones

Robert Llewellyn

I've read Robert Llewellyn ranting on about this rather a lot. He was on TV testing the Nissan Leaf (probably 5th Gear). Whilst he admitted that the CO2 emissions per km travelled with typical UK power generation were only about 20% lower than those of a similar-sized modern IC car, he then went on to say that when you take into account drilling for oil, transportation, refining and so on, the latter went up to something like 365 gms/km (from memory), implying that more than twice as much CO2 is emitted in producing petrol/diesel as in burning it. That is, of course, complete nonsense - we do not burn the equivalent of more than two-thirds of the crude oil in producing diesel & petrol.

However, it's an improvement on some of his earlier claims where he was claiming > 80% efficiency for electrical cars without taking into account the approximately 30-35% thermal efficiency of typical UK power generation and distribution.

For the moment, electric cars are just local run-abouts. The only practical way of dealing with the range issue in the short term is the approach taken by Chrysler with range-extending IC engines. I'm far from convinced there will ever be the equivalent of petrol stations for electric power. Quite simply, dumping 50kWh into a car battery in a couple of minutes requires so much power that there must be serious safety concerns. The equivalent of a petrol station with 8 pumps would be delivering 12MW at that rate. That's simply a huge amount of power - perhaps what you'd deliver to 4,000 houses at some moderately busy times. That's quite a potential explosion if something goes wrong.
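
The 12MW figure is a one-line calculation, sketched here with the scenario numbers from the comment (50 kWh in two minutes, eight "pumps"):

```python
# The 12 MW figure is simple arithmetic: 50 kWh delivered in 2 minutes,
# at 8 "pumps" per station (scenario figures from the comment above).
battery_kwh, charge_minutes, pumps = 50.0, 2.0, 8

kw_per_pump = battery_kwh / (charge_minutes / 60.0)   # 1,500 kW = 1.5 MW
station_mw = kw_per_pump * pumps / 1000.0             # 12 MW
print(f"{kw_per_pump / 1000:.1f} MW per pump, {station_mw:.0f} MW per station")
```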

Trickle charging a hybrid car using "green" electricity with intelligent metering systems makes sense as that can absorb peaks in renewables. Building more thermal power stations to allow for peak delivery of power very fast makes none.

No need for speed, says Oz communications shadow

Steven Jones

Power

In the UK, at least, it's a legal requirement for fixed lines to work through a power cut to allow for emergency calls, although many people use phones (like DECT) which are reliant on mains so this is not always effective. Mobile phones are ubiquitous, and they are not reliant on mains (provided the base station has a UPS). Secondly, it's perfectly possible to produce a fibre termination device with its own battery backup, although whether this is really necessary with mobile phones around is surely a moot point as it will add considerably to costs, bulk and maintenance.

Steven Jones

Just a politician's ploy

It's just a politician doing what a politician does: use the most extreme argument and set up a straw man which you then demolish. It's blindingly obvious that no household could ever use 1TB per second, and I suspect the number of corporations that use that much WAN bandwidth for internal purposes can be counted on the fingers of one hand.

However, it's certainly possible to make a case for 100Mbps (say three channels of hi-def h.264 @ 24Mbps). There is a case to be made for higher peaks (if you use remote backup for your data and you need to restore several hundred GB then 1Gbps would be nice). 100Mbps is just about possible using fibre to the kerb. Beyond that you are into FTTH territory.
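
A quick sanity check of those figures (the stream rate and the 300GB restore size are just the assumptions above, not measurements):

```python
# Sanity check on the bandwidth cases above (stream rate and restore size
# are the assumptions in the comment, not measurements).
hd_streams, mbps_per_stream = 3, 24
print(f"{hd_streams * mbps_per_stream} Mbps for three HD streams")   # 72 Mbps

restore_gb, link_gbps = 300, 1.0          # "several hundred GB" over 1 Gbps
seconds = restore_gb * 8 / link_gbps      # GB -> gigabits, then divide by rate
print(f"{seconds / 60:.0f} minutes to restore {restore_gb} GB")      # 40 minutes
```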

Steven Jones

A terabyte per second at home?

Let me get this right - one terabyte per second? That's 8 terabits per second, or the equivalent of 800 10Gb connections all working in parallel.

I sincerely hope this is a figure for the aggregated capacity of all the links in a city. Otherwise it strikes me as simply hype.

Sainsbury's is abandoning tape

Steven Jones

Agreed (to a point)

Agreed that there are volumes where tape wins out for a full copy. However, one thing that disk allows for which is more difficult with tape is incremental backup. In effect, that's what de-dupe does. It looks like a full backup but is, in fact, an incremental in terms of space occupied. It would be an exceptional organisation that turned over 3PB of updates a day (an average of almost 35GB per second).

However, there's a penalty to de-dupe which is often not mentioned: the very process of de-dupe gets rid of an important level of redundancy. If you take 10 full backups, you no longer have 10 independent copies. All the copies will share the same de-duped blocks (using "block" as a loose term for the unit of de-duping) unless multiple real copies are held. That means putting a huge amount of trust in the resilience of the backup storage solution and its complex mapping database. All those devices have a lot of resilience built in for those functions, but you'd better hope it has been extraordinarily well tested, or a corruption could wreck the entire backup store.
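
The redundancy point can be shown with a toy de-dupe store. This is a sketch of the general content-hashing technique, not any vendor's implementation:

```python
# Toy illustration of why de-duped "full" backups are not independent
# copies: each unique block is stored once, keyed by its hash. This is a
# sketch of the general technique, not any vendor's implementation.
import hashlib

store = {}        # hash -> the single physical copy of each block
catalogue = []    # per-backup list of block hashes (the "mapping database")

data_blocks = [b"block-A" * 512, b"block-B" * 512]   # toy backup contents

for night in range(10):              # ten nightly "full" backups
    refs = []
    for block in data_blocks:
        h = hashlib.sha256(block).hexdigest()
        store.setdefault(h, block)   # stored once, shared by every backup
        refs.append(h)
    catalogue.append(refs)

print(len(catalogue), "logical fulls,", len(store), "physical blocks stored")
# Corrupt either stored block and all ten "independent" backups are damaged.
```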

Also, in order to off-site, you need either to destage to tape or to use storage replication of the backup store over some fairly meaty network connections.

Cable vendor slapped for unproven claims

Steven Jones

How does ADSL work

OK - back to my physics days. The reason why ordinary phone cable can carry ADSL signals is that (from the exchange) it's a balanced pair - at least until you hit the household wiring when ring-line separation (at least in the UK) can be a problem as can some nasty home wiring kits. Both are easily dealt with.

Note that the pairs from the exchange are fairly loosely twisted, as they were designed for noise rejection over a fairly narrow range of audio frequencies. Nevertheless, it's good enough for ADSL. What essentially happens with ADSL is that it uses a relatively low radio frequency band (about that of MW/LW) up to about 2MHz. That bandwidth is separated into a number of sub-channels of, I think, about 20kHz each. Each sub-channel is assigned a number of bits depending on how good the SNR is at the receiving end. The higher the frequency, the more the signal is attenuated (largely due to skin effect - when I did the calculation, at higher frequencies less of the copper is used, and at the characteristic impedance of phone lines the resultant increase in resistance seems to be the main cause of degradation). Anyway, the upshot of this is that the higher frequencies get attenuated more and can carry fewer bits.
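
The bit-loading idea can be sketched in a few lines. The SNR slope and the 9.8dB implementation "gap" below are illustrative values, not the actual DMT parameters:

```python
# Sketch of per-sub-channel bit loading: bits assigned from the SNR, which
# falls as frequency (and attenuation) rises. The SNR slope and the 9.8 dB
# implementation "gap" are illustrative values, not the DMT spec.
import math

def bits_for_snr(snr_db, gap_db=9.8):
    """Shannon-style capacity per symbol, reduced by an implementation gap."""
    snr = 10 ** ((snr_db - gap_db) / 10)
    return max(0, int(math.log2(1 + snr)))

# Assume SNR degrades roughly linearly in dB across 100 sub-channels:
snrs = [40 - 0.35 * n for n in range(100)]
bits = [bits_for_snr(s) for s in snrs]
print(f"lowest channel: {bits[0]} bits, highest channel: {bits[-1]} bits")
```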

Each sub-channel is modulated using an encoding system called QAM (from memory) which uses both amplitude and phase modulation. Indeed, it's what modems used in the dial-up days before broadband, but confined to audio frequencies (and the reason it was confined to audio frequencies was simply that the signal had to pass through a digital exchange system that itself could only encode 64Kbps in Europe, or 56Kbps in the US). As ADSL signals only go as far as the DSLAM in the exchange (or street box), they never hit the exchange switching equipment.
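
For the curious, here's QAM's combined amplitude-and-phase encoding in miniature: a hand-rolled 16-QAM mapper. The Gray-coded levels are a common textbook choice, not a claim about the ADSL standard:

```python
# Miniature hand-rolled 16-QAM mapper: each 4-bit symbol becomes a point
# whose amplitude and phase together carry the data. The Gray-coded levels
# are a common textbook choice, not a claim about the ADSL standard.
import cmath

def qam16_point(bits4):
    """Map 4 bits to a 16-QAM constellation point (2 bits per axis)."""
    levels = {0b00: -3, 0b01: -1, 0b11: 1, 0b10: 3}   # Gray-coded
    i = levels[(bits4 >> 2) & 0b11]                   # in-phase axis
    q = levels[bits4 & 0b11]                          # quadrature axis
    return complex(i, q)

p = qam16_point(0b1101)
print(f"symbol 1101 -> amplitude {abs(p):.2f}, phase {cmath.phase(p):.2f} rad")
```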

Anyway, the really clever stuff is how the signal is modulated and demodulated. Back in the old days it was done using inductors, capacitors and analogue circuits; an ordinary phone line would be lucky to hit 9600 baud and the modems were physically huge. However, these days the modulation is essentially created as a digital signal in the first place, which is then put onto the phone line using a digital-to-analogue converter (DAC). At the far end, the very much attenuated signal (perhaps reduced in power by a factor of a million, or 60dB) is converted back into a digital one using an analogue-to-digital converter. Then a bit of magic happens - the string of numbers representing the, by now, heavily distorted modulated signal is fed into some very clever mathematical algorithms which essentially reconstruct the relevant bit streams.

This bit of magic is called Digital Signal Processing (DSP) and is at the heart of all modern high-speed communication and digital broadcasting. DSP is truly one of the wonders of the modern age, and the underlying mathematics is really only for hard-core mathematicians.

So those that say that there is no such thing as a digital signal are right - well, at least over cabling of any length.

Steven Jones

Directional cable

Nope - it's not a joke, or if it is, it's an expensive one perpetrated on gullible fools. You can genuinely buy audio cables with direction indicators. Originally they came with instructions as to which end was to be connected to which piece of equipment (e.g. one end to the speaker, the other to the amp). Unfortunately, some people found they'd installed the cables the wrong way round and that, lo and behold, they hadn't heard any difference. Of course the explanation wasn't that the directionality is a myth, but that, magically, which direction mattered depended on your particular circumstances and type of kit. Simple solution: some manufacturers kept the magical quality of directionality by putting arrows on the sheaths, but you, the audiophile, with your infallible ears, decide which direction to connect the cable, and the arrows merely enable you to reconnect it the same way.

Yes, and people do believe this stuff. The fact that no measuring equipment on earth can detect the effect is irrelevant. There are others - some people believe in cable "burn-in". You have to run it in for several tens of hours before it gives of its best. You can even buy special pieces of equipment to "burn in" your audio cables.

(NB: there are some valid circumstances where a shield should only be connected at one end, but that's not a matter of directionality, rather of avoiding nasties like earth loops. In that case which connector is at which end does matter, but then the cable isn't symmetrical.)

Steven Jones

Very unlikely

I think you need to go back, do the maths and put this into a circuit modeller. In a real circuit such as you describe - and even such a crude design as the one you have described, which I sincerely hope is not the sort of amplifier people pay thousands for, will not look like that - you simply will not get infinite steps as you describe. Firstly, diodes don't turn on instantaneously when subject to a sine wave input; they also have a finite forward resistance which decreases as the forward bias increases. The transformer has resistance which will damp the current flow, and it simply will not create a step change. It's also easy enough to measure.

In any case, this sudden current flow is most evident at power-up when the smoothing capacitors are discharged. During normal operation this simply won't happen, as the supply only has to cope with the ripple current. In any event, if the power supply were generating bursts of RFI at intervals of 100Hz it would be easily measurable: you'd see it on the amp output and be able to measure it as harmonic distortion. And what on earth (no pun intended) would that have to do with the mains power cable?

This is not to say that there aren't dodgy power supplies that damage audio performance. However, it's not rocket science to design a proper one, and if that's done (including filtering), then the power cable is irrelevant provided it's not so poor that it can't deliver the mains.

Steven Jones

Irrelevant

It's irrelevant whether one side of the loudspeaker feed is earthed or not as far as current flow goes. It's still AC. That's because the potential of the active side transitions from negative to positive and back again and the current direction reverses in both cables and essentially averages out to zero (assuming that the amp isn't so badly designed it puts a DC bias on the output). In any case, if the resistance of a cable varied according to direction of the current flow, it would be easily measurable. I've never seen the slightest evidence of actual measurements demonstrating that. It's simply hokum and yet another myth spread around by audiophiles.

Now to take the other point about differential drives, which is what XLR cables are used for. There are two main reasons for using differential outputs. The first is for noise rejection. The use of balanced wires means that the interference largely cancels out. Differential outputs are used on many systems where noise immunity is important. The second main reason is that on the output side a differential output allows for a much higher voltage swing between the terminals. For a given DC supply the peak-to-peak maximum achievable is doubled, which means four times the power can be delivered. Maybe not too much of an issue with mains powered equipment, but useful if you have an amplifier powered from a 12V DC system. Of course you also get noise rejection, but that really isn't an issue on driving high-powered passive speakers in anywhere but the worst environments.
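
The "four times the power" point is easily checked: bridging doubles the peak voltage across the load, and power goes as voltage squared. The 12V supply and 4-ohm load below are just illustrative values:

```python
# Checking the "four times the power" claim: bridging doubles the peak
# voltage across the load, and power goes as V squared. The 12 V supply and
# 4-ohm load are illustrative values only.
supply_v, load_ohms = 12.0, 4.0

single_ended_peak = supply_v / 2   # output swings about the mid-rail
bridged_peak = supply_v            # the two sides swing in anti-phase

for name, v_peak in [("single-ended", single_ended_peak), ("bridged", bridged_peak)]:
    watts = (v_peak / 2 ** 0.5) ** 2 / load_ohms   # RMS power into the load
    print(f"{name}: {watts:.1f} W")                # 4.5 W vs 18.0 W (4x)
```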

So whether one signal is grounded or not, the current flow is still AC. Directional cables are so much junk science - if they were directional it would be measurable. Full stop.

(In fact you will find some XLR cables do carry a small DC current. That's because some condenser microphones require a "phantom" power feed, which means there is a net current flow in one direction, albeit generally tiny. Needless to say, it has precisely zero effect on the quality of the audio.)

Steven Jones

@crowley

No problem with that - if the cable is so bad that it gets uncorrectable errors then you will get corrupt pictures. Just how the TV displays it will vary. You get the same thing with Freeview when the quality is marginal.

Steven Jones

@Dick Emery

I'm curious - how do you do a blind test on video output...

(alright, I know the rules).

Steven Jones

HDMI cables

It's quite simple: HDMI is a digital interface. In fact, if you are using a commercial Blu-ray disc, most likely it's an encrypted digital link. Either the TV can recover the digital signal or it can't, and any faults will be obvious ones of picture break-up. Indeed, if the encrypted signal were not carried perfectly (with any built-in error correction), the signal would not be recoverable at all.

Of course there are advantages to higher quality cables, like robustness. However, as long as it meets the appropriate HDMI standard for the resolution in question (skew, attenuation, cross-talk etc.), it will either work or it won't. Higher quality cables may be capable of longer runs, but they won't improve picture quality.

http://www.hdmicablecomparison.net/hdmi-cable-quality-comparison/

Note the above does not, of course, apply to SCART or other analogue cables.

Human beings have a wonderful ability to convince themselves that they are seeing a difference which doesn't exist, even when it is technically not possible. It is generally related to the amount of money they've just spent, and it's a characteristic exploited ruthlessly by vendors.

Steven Jones

Mains Supplies

"Mains supplies are one area that can reportedly make a difference"

Unless your mains supply is grossly deficient (maybe a generator) and falls outside normal tolerances, this is junk. It's the job of the power supply in the audio equipment to provide appropriately regulated outputs to the audio components (and all the power interconnects etc.). This includes appropriate shielding and filtering, which should eliminate the effect of any normal power supply variations or any mains-borne RFI on the audio equipment. Designing high-quality power supplies is hardly rocket science, and a decent one will withstand pretty well anything a normal mains supply can throw at it (which also means those mains conditioners are a waste of money too if your audio kit is designed properly).

Of course, if the audio kit isn't properly shielded from interference then that's another problem, but it still comes down to proper design.

Putting a mobile phone next to analogue hi-fi kit isn't a bad way of finding out whether the shielding (and associated external signal cabling) has been done properly. If you don't get breakthrough from that, it's probably pretty well shielded.

MySpace unfriends 500 staff

Steven Jones

Sad for the staff

The only good thing about this is that Murdoch has lost out. I wonder which executives he's had shot (metaphorically speaking, of course) because of the original investment decision?

IBM's mainframe-blade hybrid to do Windows

Steven Jones

The point of this?

I'm not wholly sure I see the point of all this (at least from a customer's point of view). There seems to be precious little real commonality of technology between the mainframe side and Xeon blades. I hope this thing is properly modular so that swapping out your mainframe to a later generation doesn't mean you have to change all the Xeon blades too (or, more likely, vice-versa).

The clue to all this would appear to be the wording about a "land grab" by the mainframe execs. That rather gives the impression of IT department warlords at battle with one another, with this as another weapon. Not a great way to run any company.

I rather suspect that, in the x86/x64 space, IT managers are better off picking the best and most cost-effective solution for their purposes (which might well be from a single-source supplier), but it's not obvious what mainframe integration will do for most customers, even the mainframe ones, unless that's almost all they do.

Lawyers fear Assange faces death penalty in US

Steven Jones

Hype

It's hype - in effect, nobody can be extradited from an EU country without a cast-iron guarantee that the death penalty will not apply (or, for that matter, that they will not be transferred to an extra-judicial environment like Guantanamo Bay). Of course it's possible to argue that once he's in US hands they could do what they like, but it's inconceivable that the US authorities would countenance abrogating such guarantees in so public a case.

This is always assuming that the US could come up with a charge on which Assange could be extradited. That might be rather difficult. Even with our rather watered-down extradition checks (in the UK at least) for the US, there are still far more obstacles to an extradition than there are under the European Arrest Warrant.

It's difficult to see how Assange would be more easily extradited from Sweden to the US than from the UK to the US. However, should the US try, then I rather expect that William Hague would be happy for the Swedes to take all the flak.

NB: even gaining a conviction in the US under the Espionage Act looks very tricky. Whatever their shortcomings, US courts have a very good record of recognising the right to free expression, and there are plenty of powerful figures who would be fighting Assange's corner.

Apple refuses frozen iPhone repair

Steven Jones

Probably condensation

The problem will almost certainly be condensation. If the iPhone was at a sub-zero temperature and was put in a warm car then it's very possible condensation will form internally and that does electronics no good at all. I've known it happen to laptops left in a car boot overnight and turned on in a warm office.

Kindle lets users lend e-books to mates via email

Steven Jones

@Irish Donkey

"And anyone that says other wise is either a lawyer or a liar..."

And those are alternatives?

Arcam Solo rDac wireless digital-to-analogue converter

Steven Jones

@Francis Boyle

I'm in total agreement. So we have to have randomly imprecise DACs which, in some way, make the analogue signal more "musical" by, presumably, introducing some distortion or other.

(Of course there are tools out there to emulate the audio characteristics of analogue devices, such as certain models of guitar amp, but if anybody is to use those, I'd rather it was the originator of the music and not something as mundane as a DAC - its job is to reconstruct the waveform as accurately as possible from the digital content.)

That second link is almost pure hokum.

Steven Jones

@Shades

Perhaps if there were an irony or sarcasm icon available, you might have appreciated the response as such...

Steven Jones

Audiophile stuff

I, for one, am highly suspicious of statements about quality being an "order of magnitude" better - a phrase which has a mathematical, not a subjective, definition. It's quite amazing how, when properly set-up double-blind testing is carried out, a lot of these differences suddenly become undetectable.

As for gold-plating digital connections: well, I suppose it looks pretty, and if you keep your audio equipment in a damp cellar it might prevent corrosion, but it makes sod-all difference to sound quality. The connection will basically either work or not, without much in between - like pretty well all digital communication.

That's not to say that bad DACs don't exist, but if they do, it's objectively measurable with proper instrumentation. Any properly equipped audio lab would be able to measure how accurately the analogue output tracks the digital data for a lossless feed. However, objective measurements like that don't sell over-priced electronics to gullible members of the public...

Samsung SH-B123 internal 12x BD-Rom drive

Steven Jones

Get a burner

For £90 you can get a fairly decent Blu-Ray burner. OK, you won't get the 12 x read rate that this device claims (but doesn't seem to get near in the real world), but just how useful is that ability anyway? I'd go for the burner every time, which is really useful if you produce and distribute more HD content than will fit on a DVD.

Micron revs flashy SSD line

Steven Jones

Almost certainly not the SSD

The ratings of all the SSDs that I've come across are considerably lower than 2.5" HDDs.

The reason your laptop drained the battery faster was almost certainly because something that was previously I/O-bound was able to run faster and more frequently, and hence chewed up a lot more power in the CPU. Very likely it was some sort of background or security-related task. Given that the read latency of an SSD is usually of the order of 100 times lower than that of a 2.5" drive, relieving an I/O bottleneck can have a huge knock-on effect on the other resources used.

Tom's Hardware carried out a flawed benchmark test on power consumption on SSDs that found exactly this.

Who will rid me of these obsolete PCs?

Steven Jones

DC vs AC Power Capacity

I have news for you. A kW is a kW, whether it's delivered as DC, AC or recycled cow-farts. It's one of the universal laws of physics. Of course a power supply will draw more power from the mains than it can deliver at DC level, as they aren't 100% efficient (good ones can be over 80% efficient).

So an 80% efficient power supply delivering 300W of DC power will be drawing 375W from the mains, with 75W disappearing as heat (less the odd few watts used by the cooling fan). In fact it's pretty near impossible for any power supply to reach its actual rating, as that would require each of the separate DC ratings for the various voltages to be maxed out simultaneously.

So if a 1 kW power supply is only drawing the power of a 100W light bulb that's only because the DC side is delivering perhaps 80W.

NB: it's important to note the difference between the rated power output and the rated power consumption. If you want to be picky you can get into power factors, but that's not a subject the average user needs to get into.
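The arithmetic above is just conservation of energy, and can be sketched as a couple of lines (the 300W / 80% figures are the worked example from the text, not any particular supply's specs):

```python
def mains_draw(dc_output_w, efficiency):
    """AC power drawn from the mains for a given DC output.

    Whatever the supply delivers as DC, plus its conversion losses,
    must come in from the mains: input = output / efficiency.
    """
    return dc_output_w / efficiency

# The worked example above: 300 W of DC from an 80% efficient supply.
ac = mains_draw(300, 0.80)
print(round(ac))        # 375 W drawn from the mains
print(round(ac - 300))  # 75 W lost as heat
```

It also shows the converse point: a "1 kW" supply drawing only ~100W from the wall is delivering around 80W of DC, nowhere near its rating.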

Small biz calls for end date on enhanced 17.5% VAT

Steven Jones

But you did...

If you voted at all, you voted for less money in your pocket, as whoever got in would have been increasing tax, pay would have been relatively static and inflation would have continued.

Of course none of the parties would have actually put it that way, but reality breaks through one way or another. We are still subsidising our lifestyles through State borrowing at an unsustainable rate.

Steven Jones

Not even £2.05

In fact the actual increase should have been more like 2.1%, as the actual ratio in question is 1.2/1.175 (I blame the BBC, as they keep calling it a 2.5% increase). Rounded to the nearest whole penny, your £2 cup of coffee should now be £2.04.
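The calculation is simple enough to check: the ex-VAT price is unchanged, so the gross price scales by 1.20/1.175, not by 2.5 percentage points. Using the £2 coffee from the comment:

```python
# VAT rise from 17.5% to 20%: strip the old VAT, apply the new rate.
old_price = 2.00
ex_vat = old_price / 1.175       # the underlying ex-VAT price
new_price = ex_vat * 1.20        # same ex-VAT price, new 20% rate

print(round((1.20 / 1.175 - 1) * 100, 2))  # ~2.13% actual increase
print(round(new_price, 2))                 # 2.04, not 2.05
```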

Intel unveils itsy-bitsy, teeny-weeny SSDs

Steven Jones

@Daniel Evans.

G is the universal gravitational constant, but it is clearly not what was meant in the bloody article. They clearly meant what is normally written as "g", which is an acceleration of about 9.81 metres per second squared.

We were equipped with common sense for a reason, and it's not to head blindly down case-sensitive rat-holes as you would have us do.

Steven Jones

Latency...

"In a data center people would generally prefer to have big storage capacity as they can access drives in parallel for speed. The greater density allows more storage for the power and cooling required."

It's perfectly true that you can get more IOPS (and throughput) by installing more drives, at the cost of power consumption, price and data-centre footprint. However, there is one thing hard drives can never do, and that is consistently achieve random-read latencies of less than about 6ms. With SSDs you can get down to tens of microseconds (or, realistically, hundreds of microseconds on a SAN). If you have an application which is I/O-bound on random reads, then only SSDs will get you out of it once you've exhausted the practicalities of caching. (Random writes are not such an issue - enterprise arrays will cache those in NV store.)

As for 4TB drives in the enterprise, we have had real issues with the reliability of high-density drives. They are OK for some semi-archival uses (for which you'd never use SSD anyway - at least not for the foreseeable future). However, use them hard and they fail at a much higher rate than lower-density enterprise drives. Then there is the problem that rebuilding RAID sets with such high-density drives takes a very long time, as capacity inevitably outgrows throughput (read/write throughput goes up in proportion to linear bit density, capacity as the square of linear bit density). That pushes you to double parity, which removes some of the capacity advantage and can also impact performance.
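That scaling argument can be made concrete: a full-drive rebuild takes roughly capacity divided by sequential throughput, so if capacity grows with the square of linear density while throughput grows only linearly, rebuild time grows linearly with density. The figures below are illustrative round numbers, not real drive specs:

```python
def rebuild_hours(capacity_tb, throughput_mb_s):
    """Idealised full-drive rebuild time: capacity / sustained throughput."""
    bytes_total = capacity_tb * 1e12
    return bytes_total / (throughput_mb_s * 1e6) / 3600

# Doubling linear bit density: ~4x capacity but only ~2x throughput,
# so the rebuild window doubles.
base = rebuild_hours(1.0, 100)    # hypothetical 1 TB drive at 100 MB/s
dense = rebuild_hours(4.0, 200)   # hypothetical 4 TB drive at 200 MB/s
print(round(base, 1), round(dense, 1))  # 2.8 5.6
```

Real rebuilds are slower still, since the array is usually serving production I/O at the same time, which is exactly why the longer window pushes you towards double parity.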

SSDs will erode the top end of the enterprise drive market as prices drop. Realistically, write endurance is not going to be too much of an issue. Enterprise drives fail too, and once they get to the limit of their practical operating life, failure rates start to increase and swap-outs happen more frequently. Of course it's generally covered by maintenance contracts, and exactly the same thing will happen with enterprise SSDs.

Disks won't go away, at least for the foreseeable future; they will just get pushed more and more into the bulk-storage area.

Steven Jones

Microdrive

I'm sure he meant the tiny hard drives that were available in CompactFlash (Type II) format, which had reached as much as 8GB by 2008. However, they haven't generally been available in CF II format for several years, and you'd be lucky to find anything bigger than 2GB.

Steven Jones

Shocking Stats

G in this respect will be the acceleration due to gravity at the Earth's surface, not the universal gravitational constant. The former is of rather more practical use to the average person, not to mention that the units of the universal gravitational constant make no sense at all in this context.

As for 400G/2ms, this is not a single unit but a statement that the device can withstand a deceleration of 400G (about 3,924 metres per second squared) for 2 milliseconds. That is not the same as 200G for 1 millisecond (which would be half the rate of deceleration for half the period). Do the mathematics on this, and you find it allows the drive to be installed in a device that can be dropped a metre or two (depending on the degree of bounce), provided that the SSD is decelerated over a few millimetres (which can be achieved through a deformable case, rubber mountings or some combination of the two). Most laptops won't survive a 2-metre fall onto a concrete surface, although they might onto a heavily carpeted one.

So the 1,500G in 0.5ms means the device will withstand almost four times the rate of deceleration, albeit for a quarter of the time. What this means is that, for the same collision speed, it can be slowed a lot faster, and hence the mounting and casing need to allow for less movement for the same drop height.

As this device weighs only 10g, the 1,500G is equivalent to a force of about 147N, or roughly the weight of a 15kg mass at the Earth's surface. On that basis I suspect the force could well be sustained much longer than 0.5ms, possibly indefinitely. However, it has to be installed in real devices, and those will probably have to be designed for real-world situations (like being dropped off a 1-metre worktop).
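The kinematics behind the shock-rating argument above can be sketched directly: the rating fixes the peak deceleration and how long it can be sustained, which together give the impact speed the device can absorb, the equivalent free-fall drop height, and the stopping distance the mounting must provide. This is an idealised constant-deceleration model, ignoring bounce:

```python
G = 9.81  # standard gravity, m/s^2

def survivable(g_rating, duration_s):
    a = g_rating * G                 # peak deceleration, m/s^2
    dv = a * duration_s              # impact speed absorbed, m/s
    drop_height = dv**2 / (2 * G)    # free-fall height reaching that speed
    stop_dist = dv**2 / (2 * a)      # cushioning travel needed to stop
    return dv, drop_height, stop_dist

dv, h, d = survivable(400, 0.002)    # the 400G / 2ms rating
print(round(dv, 2))        # ~7.85 m/s impact speed
print(round(h, 2))         # ~3.14 m equivalent (frictionless) drop
print(round(d * 1000, 1))  # ~7.8 mm of deceleration distance needed
```

The same function applied to the 1,500G/0.5ms figure gives a similar impact speed but roughly a quarter of the stopping distance, which is the point made above about the mounting needing less movement.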

Massive new US spy airship 'could be used to carry big cargoes'

Steven Jones

Practicality of compressors revisited...

There's plenty of stuff around about compressors on airships. It has been a subject of research, and pretty well every designer has decided it is impractical. Every now and then somebody floats (pun intended) a maverick idea, but nothing much comes of it. Do some research on the Internet and you'll find this type of stuff.

As for the weight - measured in the region of several tonnes. As far as time goes, an airship can rise an awful long way in a couple of minutes. There are cases from the old, pre-WWII days of airships rising many thousands of feet in a minute despite venting hydrogen, which is a lot faster than any practical compressor could handle. Airships are vulnerable not only to the expansion of their lifting gas as they rise, but also to relatively minor changes in air pressure due to variations in local atmospheric and weather conditions.

Generally, using thrust to force the airship down is going to be more efficient than trying to balance buoyancy by compressing relatively large volumes of gas. However, once you lose control there will be no choice but to vent, as nothing else will deal with the situation fast enough.

Steven Jones

The explanation

The reason compressors are impracticable in helium airships is simply that compressing a large volume of gas quickly requires a huge amount of power and heavy equipment. Then you need high-pressure tanks to store the compressed gas. Any compressor light enough to be carried on an airship without hopelessly compromising payload capacity is going to be nothing like powerful enough. The volumes of gas involved are simply enormous: you would need to be able to compress hundreds of thousands of litres in a matter of tens of seconds.

A compressor with a total swept volume of 10 litres (which would be fairly big) would require 10,000 strokes to suck in 100,000 litres (which would probably take a few minutes), and you'd need multiple stages to get it down to a decent volume. The whole thing would need a lot of power, plus fuel, and would weigh rather a lot.
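The back-of-envelope sum above is easy to reproduce; the stroke rate below is an assumed illustrative figure, not a real compressor spec:

```python
def minutes_to_ingest(total_litres, swept_litres, strokes_per_min):
    """Time for a piston compressor to ingest a given gas volume
    at atmospheric pressure (first stage only, idealised)."""
    strokes = total_litres / swept_litres
    return strokes / strokes_per_min

# 100,000 litres through a 10-litre swept volume at ~1,500 strokes/min:
print(round(minutes_to_ingest(100_000, 10, 1500), 1))  # ~6.7 minutes
```

Minutes, not the tens of seconds the airship actually needs, and that is before adding the further stages, power plant and fuel to compress the gas to a storable pressure.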

That's why compressors haven't been used in airships, and ballasting/venting is preferred. The position with subs is not the same: volumes are much smaller, weight isn't a problem and there's usually plenty of power available. Water has the great advantage of being pretty well incompressible; the atmosphere is anything but.

London's tube demands faster-than-NFC ticketing

Steven Jones

"move sensor about 2 paces further from the barrier"?

Have you thought this through? You've increased the gap to almost 2 metres, which is about double what it needs to be. Unless you are going to allow the second person in line to present for authorisation before the first has cleared the system, you are going to add approaching a second to every barrier clearance. If you do allow this overlapping of presentation and clearing, you are in for endless confusion when the person in front gets refused yet there's an authorisation for the one behind. This is quite apart from needing to replace or amend all the barriers and their approaches.
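The "approaching a second" figure can be sanity-checked with a simple walking-time model. With serialised presentation (one authorisation in flight per gate), every extra metre between sensor and barrier adds walking time to every clearance; the walking speed below is an assumed typical figure:

```python
def extra_seconds_per_clearance(extra_gap_m, walking_speed_m_s=1.4):
    """Extra time per passenger from moving the sensor further
    from the barrier, assuming strictly serialised presentation."""
    return extra_gap_m / walking_speed_m_s

# Roughly doubling the gap from ~1 m to ~2 m adds about:
print(round(extra_seconds_per_clearance(1.0), 2))  # ~0.71 s per passenger
```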

Basically it's unworkable...

Steven Jones

Complex multi-purpose via simple & dedicated

There's a more than slightly serious note to this article. The reason why the speed matters should be clear to anybody who travels through a very busy Underground station during the rush hour. In order to get the throughput, you need to minimise the number of times passengers have to break step when passing through the gates. Just a short delay and the gates will close again. You can see what happens when somebody hesitates and interrupts the flow. A certain amount can be tolerated, but if it happens a lot you'll get queuing and a jamming-up of the system. Quite apart from the costs involved, there simply isn't the room to put many more gates into many of the busiest stations to allow for significant slow-downs. Adding the best part of a second to average response times is not going to help.

Also, as anybody who works in performance measurement knows, the average is only part of the story. For real-time systems like this, you need reliable, repeatable transaction times. An average of about a second can easily hide significant outliers several times that. Given that NFC requires a customer's handset - which has many other loads to deal with - to respond very fast, it's easy to see that responses might be somewhat erratic in real life.
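The point about averages hiding outliers is easy to demonstrate with synthetic numbers (these are purely illustrative, not measured NFC timings):

```python
from statistics import mean

# 98 quick taps plus a couple of slow handset responses (seconds):
samples = [0.9] * 98 + [4.0, 5.0]

samples.sort()
p99 = samples[int(0.99 * len(samples)) - 1]  # simple 99th-percentile pick

print(round(mean(samples), 2))  # ~0.97 s - the average looks fine
print(p99)                      # 4.0 s - the tail a mean conceals
```

For a gate line, it's those tail transactions that stall the flow of passengers, not the average.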

Server workloads to go '70% virtual' by 2014

Steven Jones

Big VMs

Large VMs can cause big issues with load balancing in a typical farm. Also, a vCPU is not a real CPU: you can't expect the same sort of throughput, especially on a large VM, given the issues involved with core scheduling and so on. It's a lot better than it was when cores were co-scheduled, but if you can get throughput equivalent to a 4-physical-core machine from an 8-vCPU guest in a mixed-workload environment, then you are doing well, unless you run things at very low contention levels.

As we have some physical workloads running on 48 x64 cores, and a large number of 24-core x86 workloads, it is more efficient to go physical in many cases. Environments like J2EE are already virtual, and creating more OS images is not always the best thing to do. Also, the cost of ESX can't be ignored.

VMs have their place. However, they're not the answer to all things, and if the penalty is generating many more operating systems to manage, they can be positively bad.