* Posts by Steven Jones

1376 posts • joined 21 May 2007


Hackers pop German steel mill, wreck furnace

Steven Jones
Bronze badge

It could be worse

Possibly just as well that the Germans have decided to close down their nuclear power plants if they can't keep critical control systems safe from hackers. Although I would hope that things would at least fail safely, even if not cheaply.

nb. I initially found a few elements of this story not entirely plausible, but as it seems to be official, then so it must be.

1
0

Hipsters ahoy! Top Ten BOARD games for festive family fun

Steven Jones
Bronze badge

Gift idea

So now you know what to take as a gift to apartment 4A, 2311 North Los Robles Avenue, Pasadena, should you ever be invited for Christmas. Personally I'd take a bottle or two of pinot noir to apartment 4B.

0
0

UK air traffic bods deny they 'skimped' on IT investment after server mega-fail

Steven Jones
Bronze badge

Ludicrous sub-headline

If the Register's summary is correct, Richard Deakin didn't make a statement that "90s kit isn't 'ancient'". What he said was that the system had its roots in the '90s. To put this in context, the World Wide Web has its roots in the late '80s. For that matter, the first draft definition of TCP/IP dates from the early '70s.

It's wholly irrelevant when the technology originated. What matters is how it has been developed. After all, we still base our day-to-day use of geometry on what Euclid set out over 2,000 years ago. Roots matter. They stop trees falling over when the wind blows.

In the meantime, please don't misrepresent what was said. The kit isn't from the '90s, and nobody seems to be seriously claiming this was a hardware failure.

1
1

Beware of merging, telcos. CHEAPER SPECTRUM follows

Steven Jones
Bronze badge

Re: Correction: it wasn't Gordon Brown

Gordon Brown set the objective, which was quite simply to maximise the sale value of the 3G licences. That he didn't personally design the auction is not relevant. Given that Gordon Brown loved nothing better than to manipulate figures (like expensive PFI contracts to keep debt off the books), I'd be amazed if he didn't personally approve the final form of the auction.

nb. the economist who advised on the format of the auction was Paul Klemperer, an Oxford academic, who has been very active in defending the decisions made.

2
0
Steven Jones
Bronze badge

Even if it's conceded that the original 3G licence auction maximised the prices paid by the operators, that this didn't result in higher prices to the consumer on the grounds that they were sunk costs (more debatable), and that it didn't adversely impact other aspects like network investment and thereby economic activity (even more debatable), there is a much more fundamental reason why the exercise can't be repeated.

That's because at the time of the 3G auction, there were more potential bidders for bandwidth than there were available chunks of spectrum. In addition to the incumbents, there were a number of other operators seeking entry into the UK market, including the (state-backed) France Telecom and Deutsche Telekom. It was this unique blend of ambitious operators and limited supply, backed by inflated telecom valuations (and some de facto state guarantees), that drove bid prices far past their economic value. Once the shareholders and financiers came round to noticing this, the supply of ready money dried up and auctions all across Europe then got fractions of what was achieved in the UK and Germany.

These circumstances will never happen again. It doesn't matter if there are 3 or 4 operators. The costs of entry into the UK and building a new network are immense. The only way that spectrum prices could be manipulated upwards would be to offer fewer chunks of spectrum than there are operators. By definition, that will lose one operator from the new spectrum. It's quite possible that one of the weaker players might decide the whole thing is not worth pursuing anyway and seek to either run as a low cost operator on existing spectrum or pursue other options. Of course if the spectrum is auctioned off such that all operators can get a chunk, then that's less of an issue, but it will not, of course, recreate the circumstances of the 3G auction.

So now that the fit of hubris of 2000 is over, there is no way that the telecom companies are ever going to fall for this again. The 2013 auction fell short of government targets by about £1bn (it raised £2.5bn vs the £22.5bn of the 3G auction). The circumstances at the turn of the millennium are not going to repeat themselves.

There's also another issue. Seeking to maximise the value of the spectrum to the state simply through the capital cost of the licence, rather than through more continuous revenues from taxation on increased economic activity, is surely short-sighted.

In any event, whether it's 3 or 4 operators is not going to make a great difference to state revenues. The CEO of Telefónica, César Alierta, has noted that the industry is not going to play ball with states that manipulate the circumstances of an auction in order to maximise a one-off return.

(nb. in the US, a similar auction approach to that which was eventually taken by the UK government in 2000 was ruled illegal and had to be retracted.)

3
0

FCC says taxpayer-bankrolled bumpkin broadband must be at least 10Mbps

Steven Jones
Bronze badge

What's in a name...

So they won't call it broadband. Simple.

1
0

Which country has 2nd largest social welfare system in the world?

Steven Jones
Bronze badge

It's perfectly proper to include the publicly funded part of healthcare as welfare. Private expenditure on health is another issue (although you could make a case that tax breaks on health insurance could be included).

2
0
Steven Jones
Bronze badge

Re: Sorta

I did link to a source. I think the 10-11% figure includes private health expenditure, not just public. When it comes to private health expenditure in the UK, it's not just those BUPA policies. There's a significant part of health costs that are only partly covered by the NHS. Expenditure with opticians is primarily private, as is much of the dental work.

6
0
Steven Jones
Bronze badge

One reason why US welfare expenditure is so high is surely down to the incredible inefficiency (from a financial point of view) of the American medical system. It's not commonly appreciated that the US government spends almost the same % of GDP on its public systems (Medicare and Medicaid) as the UK government does on the NHS. Given the difference in coverage, this is astonishing. It's around the 8% mark in both cases, and in the same general area as many large western countries. It can't even be explained by the US having an older population. It doesn't; rather the reverse.

A lot of this must come down to the basic cost structure of the US medical industry, with all the insurance, legal indemnity, billing and other cost issues (and some very well paid medical staff).

http://data.worldbank.org/indicator/SH.XPD.PUBL.ZS

21
0

Hi-torque tank engines: EXTREME car hacking with The Register

Steven Jones
Bronze badge

Re: Then there is Jay Leno's Blastolene Special

Not to mention a motorbike powered by a gas turbine engine from a helicopter.

1
0

Telcos spaff $36bn on gobbling Uncle Sam's radio frequencies for beefier cell coverage

Steven Jones
Bronze badge

Re: False economies here

It's true the UK and several other European auctions were set up to maximise the auction value. The German auction was even more expensive than the UK one. It wasn't helped by a number of state-owned telco operators (like France Telecom) piling into the action with state backing. For the existing operators, it was an existential threat: they either got one of the new bands or they were essentially dead. Unfortunately for some countries slower off the mark, like Italy, the bubble had burst as shareholders and banks took fright, and it all came crashing down.

Fortunately, things have calmed down a bit. To put this in perspective, the UK 3G auction raised $34bn in 2000 (about 2.5% of GDP). Corrected for inflation, that's considerably more than this latest US auction raised in a country with five times the population. Not exactly cheap, but not the insane levels of 2000 in proportion to the market size.
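
As a rough back-of-envelope (in Python for convenience; the ~38% cumulative inflation figure and the 5x population ratio are my approximations, not figures from the article):

# Rough comparison; all figures are approximate assumptions.
uk_3g_2000_usd_bn = 34.0        # UK 3G auction proceeds in 2000
cumulative_inflation = 0.38     # assumed US CPI inflation, 2000 to 2014 (approximate)
us_auction_usd_bn = 36.0        # this latest US auction
population_ratio = 5.0          # US population roughly five times the UK's

uk_in_todays_dollars = uk_3g_2000_usd_bn * (1 + cumulative_inflation)
print(f"UK 3G auction in today's dollars: ~${uk_in_todays_dollars:.0f}bn")
print(f"Scaled per head of population, the UK auction raised roughly "
      f"{uk_in_todays_dollars * population_ratio / us_auction_usd_bn:.1f}x as much")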

0
0

Coming clean: Ten cordless vacuum cleaners

Steven Jones
Bronze badge

Wandered into an alternative universe...

Vacuum cleaner tests. VACUUM CLEANER TESTS? In The Register. What the hell is happening?

0
5

The cloud that goes puff: Seagate Central home NAS woes

Steven Jones
Bronze badge

That's why everybody should have backup policies...

How many times does it have to be pointed out: have a backup strategy, and implement it properly. No device is foolproof. No cloud storage system will be perfect. Even if the hardware is perfect, software is not, and neither are you (the user). And then there's the little issue of ransomware or other malicious software.

Have a backup system, make sure it works and that it validates what it does properly. Decide how much data you can afford to lose, and plan your system appropriately. Relying on a single storage device or service, no matter how well engineered, is insufficient. Nothing on earth can guarantee you won't lose data, but you can improve the odds.

My mechanism involves two independent external backup drives, which I rotate frequently, and I always keep one off site. It is far, far quicker to recover from than any cloud system, and that includes driving 25 miles to my parents' house (where I keep the second copy) to pick up the "disaster recovery" copy. (By all means use cloud storage for small incremental changes as interim backups as well.)

I count this as a low cost solution given the risks of losing all your data.

1
0

Toyota to launch hydrogen (ie, NATURAL GAS) powered fuel cell hybrid

Steven Jones
Bronze badge

Why is this called a hybrid?

Count me as confused, but I don't understand why this is called a hybrid. It seems to be a straight hydrogen fuel-cell car (and there have been other examples, albeit mostly prototypes). In contrast, surely a hybrid (by definition) includes two (or maybe more) power sources.

0
0

Broadband sellers in the UK are UP TO no good, says Which?

Steven Jones
Bronze badge

Re: Not this bloody chestnut again

Go look at the thinkbroadband.com site, and there's very clear evidence of an increase in speeds. The upper quartile measure (which aligns quite well with an estimated 26% take-up of so-called superfast packages) has moved up a long way in the past year or so.

Of course, there will be something approaching 10% that will not get such speeds from the first phase of BDUK, but there's very good reason to believe (as in Cornwall) that the original objectives will be overachieved by some margin. That will still leave some disappointed, of course, albeit there are later funding phases.

Of course, if somebody could magic up the estimated £30bn required for a full fibre network, then all could change. However, nobody has managed to come up with a politically acceptable way of paying for it (equivalent to roughly £4 per line per month over a 30-year period).
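
For illustration, the per-line arithmetic, where the ~25 million line count is my own assumption rather than anything official:

# Spreading an assumed £30bn full-fibre build over 30 years and ~25m lines.
capex_gbp = 30e9          # assumed total cost of a full fibre network
years = 30
lines = 25e6              # assumed number of UK lines/premises (approximate)

per_line_per_month = capex_gbp / lines / (years * 12)
print(f"~£{per_line_per_month:.2f} per line per month")   # roughly £3-4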

0
0

Million Mask March: Anonymous' London Guy Fawkes protest a damp squib

Steven Jones
Bronze badge

Re: Hackery

You're expecting neutral, dispassionate fact reporting from the Register? It's not the BBC news you know, who at least have to pay lip service to the idea.

8
4

Snapper's decisions: Whatever happened to REAL photography?

Steven Jones
Bronze badge

Re: Many good points - however

The only sort of back there is for large format is the digital scanning back, which works rather like a flat-bed scanner in that a linear array physically moves across the focal plane. If you want, one model produces 1.1GB files with 48-bit output.

Of course, they are useless for moving subjects.

http://www.betterlight.com/products4X5.html

0
0
Steven Jones
Bronze badge

Re: Skill is all you need?

I'd certainly like to see a cameraphone produce a good closeup photo of a bird in flight, or a macro photo or a great closeup of an athlete. Or of a myriad of different subjects.

This trope that a great photographer will always surpass the limitations of their equipment and outshine the mere snapper is always trotted out. Of course a gifted photographer will beat the snapper, but it's still the case that for some sorts of photographs you need the right equipment.

0
0
Steven Jones
Bronze badge

Re: Good article

There are very few system cameras with a fully electronic shutter, like the Sony A7S and the Panasonic GH4. Unfortunately, the problem is that they take a long time to scan the sensor as they lack what's called a "global shutter". With the latter, the sensor can, in effect, take an instantaneous "snapshot" of the scene. However, on a CMOS sensor the photosites have to be read sequentially, row by row. On even relatively low resolution sensors of 12-14MP, this process takes perhaps 30ms. In consequence, for even modestly fast shutter speeds, the sensor rows have to be cleared and read as a sort of rolling strip that passes up the sensor. Of course, this is essentially what a focal plane shutter does, by exposing a narrow strip for higher shutter speeds. The difference is that an electronic shutter takes about 1/30th of a second, whilst a half-decent focal plane shutter traverses the sensor in about 1/250th of a second or less. What this means is that the top of the image is exposed before the bottom, so you get "leaning verticals" on moving objects. That's called "rolling shutter". You still get it with focal plane shutters, but it's about an order of magnitude worse with electronic ones. Also, the problem gets worse the higher the resolution of the sensor, which is why you don't see the option on sensors of 16MP upwards.

(A lot of cameras do have an option for something called "EFCS", or electronic first curtain shutter. That's a partially electronic shutter which uses electronics to clear the photosites (which can be done faster than reading), and this runs ahead of a physical second curtain which shuts off the exposure. It's quieter than a fully mechanical shutter, but far from silent.)

You see the problem with "rolling shutter" on a lot of video cameras with CMOS sensors, where you get weird effects like twisted aeroplane propeller blades. It's technically possible to create a CMOS sensor with a global shutter, but (currently at least) only by creating a temporary charge storage area for each pixel, which means giving over silicon real estate which, in turn, means compromising other aspects of sensor performance, like dynamic range and noise.
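
To put rough numbers on that "order of magnitude" point, here's a quick sketch (frame width and subject speed are purely illustrative assumptions):

# How far a moving subject shifts between the first and last sensor rows being
# read, for an electronic vs a mechanical (focal plane) shutter.
frame_width_px = 6000          # assumed horizontal resolution
subject_speed_px_s = 6000      # subject crossing the full frame in one second (assumed)

for name, readout_s in [("electronic shutter", 1 / 30), ("focal plane shutter", 1 / 250)]:
    skew_px = subject_speed_px_s * readout_s
    print(f"{name}: top-to-bottom skew of about {skew_px:.0f} pixels")
# electronic: ~200px of "lean"; mechanical: ~24px, roughly an order of magnitude less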

Having taken more than a few photos at gigs myself, I know the problem of noisy shutters. Of course it depends on the circumstances. In a full-on rock performance, especially if it's outdoors and you are in the pit in front of the stage, no problem. If it's a folk singer or a string quartet in a quiet concert hall, it's nasty (not to mention at wedding ceremonies).

1
0
Steven Jones
Bronze badge

Re: Total light?

Yes, your 6D collects about 2.6 x the amount of light in total (as Canon APS-C has a crop factor of about 1.6). That translates to about 1.5 stops better performance across the ISO range.
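
The arithmetic, for anyone who wants to check it (a quick sketch, using the approximate 1.6 crop factor):

import math

crop_factor = 1.6                      # Canon APS-C vs full frame (approximate)
area_ratio = crop_factor ** 2          # ~2.56x the sensor area, so ~2.6x the light
stops = math.log2(area_ratio)          # ~1.36, i.e. roughly 1.5 stops
print(f"{area_ratio:.2f}x light, {stops:.2f} stops")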

For some things, bigger is better.

1
0
Steven Jones
Bronze badge

Re: Good article

The four-thirds system was actually defined as a joint venture between Olympus and Kodak, not Panasonic. However, it's now a consortium which includes Panasonic. The micro four-thirds system was defined by Olympus and Panasonic and essentially specified a new lens mount with a shorter register, eliminating the mirror box (but the sensor format is that of the original four-thirds system).

In the medium to long term, it seems to me that fully electronic system cameras (like MFT, Sony, Fuji, etc.) will gradually push DSLRs into a niche market, as having mirrors flapping around doesn't seem like the future.

All that's needed now is for a manufacturer to crack the problem of fully electronic shutters on system cameras (existing examples all have serious shortcomings), and we can have properly silent cameras.

2
0
Steven Jones
Bronze badge

Re: CCDs on DSLRs

I see. I do remember the proposal. It would also have been CCD at the time. It hit all sorts of technical issues and, in retrospect, was a dead end: there were numerous integration problems, and a DSLR designed from the ground up would have lots of advantages.

The "system camera" approach is actually returning. If you take something like the Sony A7 series, they have a full-frame sensor and, because of the very short sensor-flange distance, can mount almost every 35mm (or MF) lens made via adapters, excepting only some of those which are fully "fly by wire". OK, it's not quite the same as a film back, as it includes all viewfinder and a lens mount, but it's not so much different in principle to using a digital back on an MF camera.

nb. the CCD vs CMOS argument is one of those "religious war" issues which comes up from time to time.

0
0
Steven Jones
Bronze badge

Re: Total light?

You are right in that image quality is ultimately limited by the total number of photons detected, but that's over the whole image area (and, for a common output resolution, per output pixel). In principle that's purely a function of the lens alone. A smaller sensor requires a proportionately shorter focal length in order to get the same field of view. However, to collect the same number of photons, it will need a proportionately wider aperture. Take the example of a 35mm so-called "full frame" sensor of 24x36mm and imagine you mount a 50mm lens with an aperture of f4. Now imagine a sensor of half the dimensions, 12x18mm (not a usual sensor size, but it makes the arithmetic easier). You will now need a 25mm lens to get the same field of view and, to collect the same total number of photons in a given exposure time, it will have to be f2 (and it will give the same depth of field characteristics). This is all part of what's called "the principle of equivalence". As the f-stop is simply the focal length divided by the aperture diameter, you can see the physical diameter of the aperture will be exactly the same in both cases. As the maximum (physical) diameter of the aperture is the primary factor that dictates the lens diameter, you can see that for the same light gathering power the two lenses will be (broadly) similar in diameter (although not length).
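
A quick check of that equivalence arithmetic (just a sketch; real lenses and sensors add plenty of complications):

# Entrance pupil diameter = focal length / f-number. Equivalent lenses on the
# two formats end up with the same physical aperture diameter.
full_frame = {"focal_mm": 50, "f_number": 4}    # 24x36mm sensor
half_size  = {"focal_mm": 25, "f_number": 2}    # hypothetical 12x18mm sensor

for name, lens in [("full frame 50mm f/4", full_frame), ("half-size 25mm f/2", half_size)]:
    pupil_mm = lens["focal_mm"] / lens["f_number"]
    print(f"{name}: entrance pupil {pupil_mm:.1f}mm")   # both come out at ~12.5mm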

So the question might be asked: why do we need large sensors if we can just use smaller sensors with wider lenses? Leaving aside the issue that lenses with very small f-stops become increasingly difficult and expensive to design (only partly ameliorated by the smaller image circle), there is a major sensor limitation: the ability of a sensor to detect photons before saturating. Broadly speaking, a sensor with 4 times the surface area can detect 4 times the number of photons before saturating (or blowing highlights). Note that this applies not just to sensors, but also to film. Slide film, especially, "blows" highlights, and to collect more light in total you need bigger film.

Of course there is another issue, that for any given output resolution, the smaller sensor will have to have smaller photosites (clearly half the dimensions in this case) and that, in turn, means the 25mm lens would have to be able to resolve twice as well.

As the ultimate dynamic range of the sensor is defined by the ratio between the saturation level and what's called the "noise floor", there is an advantage to the larger sensor. It has the potential for four times the number of detected photons before saturation which means, all other things being equal, it can achieve a couple more EV of dynamic range.

There's a lot more to it than that of course but, essentially, the reason "big is better" just comes down to that ability to detect more photons by dint of the greater surface area.

Try this.

http://theory.uchicago.edu/~ejm/pix/20d/tests/noise/index.html

1
0
Steven Jones
Bronze badge

CCDs on DSLRs

Why the CCD obsession? They were used on early DSLRs, but CMOS technology has long overtaken CCD performance for the size of sensors used in SLRs. Indeed, even in the MF world CMOS sensors have started appearing (all utilising a recently released Sony sensor). Also, Leica have now adopted a CMOS sensor for the M (Typ 240). For comparable cameras, current CMOS sensors (size for size) beat CCD on frame rate, high ISO performance, dynamic range and colour sensitivity. Yes, there are some exceptional specialist CCD sensors for scientific work, but not for still photography.

For example, here's a DXOmark comparison of the Leica M 240 (CMOS) vs a couple of Leica M9 (CCD) models. On all the objective criteria, the CMOS model wins out all through the ISO range.

http://www.dxomark.com/Cameras/Compare/Side-by-side/Leica-M-Typ-240-versus-Leica-M9-P-versus-Leica-M9___844_721_640

Some people claim there's such a thing as CCD "colour". However, both CMOS and CCDs are (close to) colour blind, and what gives them the ability to distinguish colour is the filter matrix. (There is an exception, the Foveon sensor, which distinguishes colour by difference in the silicon depth penetration by different wavelength photons. To be ultra-picky, some video cameras use prismatic separation using multiple sensors, but not any current still cameras).

If you want a CCD DSLR, then there are plenty of second-hand ones around. Here's a list of Nikon models with the sensor type listed. There's also the Leica S MF DSLR if you have very deep pockets and don't care about frame rates or performance at anything much above base ISO.

http://en.wikipedia.org/wiki/Comparison_of_Nikon_DSLR_cameras

0
0
Steven Jones
Bronze badge

m43 resolution 4Mpix???

" A four-thirds format digital camera is unlikely to deliver more than four megapixels of information per frame, irrespective of how much data it outputs. "

This is simply nonsense. Quite apart from it bearing no resemblance to the resolving power of typical lenses, DXOmark have tested a large number of M43 lenses and measured resolutions far in excess of 4MP. A cursory glance came up with several that reached 11MP. There are also plenty of on-line comparative samples which show this.

A further point is that the limit of lens resolution isn't a binary thing (and neither is diffraction limiting for that matter). It manifests as a gradual loss in contrast. There's no sudden "cut-off". It depends on the criteria used.
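
As a sanity check on how implausible a 4MP ceiling is, here's the standard diffraction cut-off estimate for a four-thirds sensor. It's idealised (it ignores aberrations, the Bayer matrix and the contrast criterion used for "resolved"), so treat it as an upper bound rather than a measurement:

# Incoherent diffraction cut-off frequency: 1 / (wavelength * f-number), in cycles/mm.
wavelength_mm = 550e-6                   # green light
f_number = 8.0                           # a fairly stopped-down aperture
sensor_w_mm, sensor_h_mm = 17.3, 13.0    # four-thirds sensor dimensions

cutoff_lp_mm = 1 / (wavelength_mm * f_number)     # ~227 lp/mm
pixels_w = 2 * cutoff_lp_mm * sensor_w_mm         # Nyquist: 2 pixels per line pair
pixels_h = 2 * cutoff_lp_mm * sensor_h_mm
print(f"cut-off ~{cutoff_lp_mm:.0f} lp/mm, ~{pixels_w * pixels_h / 1e6:.0f}MP upper bound")
# comes out at roughly 46MP even at f/8, nowhere near a 4MP ceiling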

Further, even if the sensor does "out-resolve" the lens, there are still advantages, as the 2x2 Bayer matrix found on most digital cameras has twice the pitch of the photosites. Thus you get better colour sampling with the higher resolution sensor.

Also, it takes far more than 4 photosites to output a single RGB pixel. Any algorithm which did so would produce horrible results, apart, maybe, from a 2x downsizing. In practice, demosaicing algorithms are complex beasts and make a huge difference to the final output.

8
0

Sporty in all but name: Peugeot 308 e-THP 110

Steven Jones
Bronze badge

Re: Wake me up when it's less boring or does more mpg

My 5-year-old 1.6 Focus turbo diesel still averages 53-54mpg, and only costs £30 per year to tax (as it just scraped under the relevant CO2 emission target at the time). From memory it has around 112bhp, but that seems to be plenty for my purposes. I think it cost me just under £14K at the time. Yes, there are issues with diesel particulates (although it's got a particulate filter, so I don't know how much that helps).

In all, it's not obvious that much progress has been made in the last 5 years.

0
0

Kingston's aviation empire: From industry firsts to Airfix heroes

Steven Jones
Bronze badge

Re: Hunter

I recently took this picture of a Hunter (XL592) which used to sit on a concrete plinth at Booker airfield looking very sorry for itself. I came across it in a field at the back of the 15th century Ockwells Manor, on the outskirts of Maidenhead. Since I took the photo, its paint job has been completed with RAF roundels.

The Hawk was/is a very tough aircraft too, and as it is relatively simple and, unlike modern aircraft, doesn't need complex electronics to fly, I suspect it will be flown by enthusiasts long after more modern fighters are grounded.

There is something about highly tuned machinery from the 60s, whether it's fighter aircraft or racing cars. They somehow "look right" whilst the modern stuff just seems to have ugly bits tacked on.

https://imageshack.com/i/p46nVzX8j

nb. I'm not sure the article mentioned that Tommy Sopwith almost won the America's Cup in 1934, and there was more than a little controversy about the result.

0
0

You get what you pay for: Kingston's SSDNow V310 960GB whopper

Steven Jones
Bronze badge

Re: A matter of precision

@DougS

In fact, HDDs have to fly their heads nanometres above the disk surface, not microns.

However, to go back to the original question of why HDDs are cheaper (per GB) than flash storage, it's largely down to how the data is stored. On an HDD, the data is stored in the form of magnetic domains on a substrate. The manufacturing process does not require every single bit to be represented by a photo-lithographic process. So once that bit of high-precision engineering needed to fly the heads into the right position has been produced (and decades of engineering have refined that process), the coating comes relatively cheap. That's also why tape storage is (per GB) cheaper than disk. The coating comes cheap.

The other issue is that each new generation of flash storage requires an immense investment in new equipment as it's dealing with fundamentally smaller elements. You can't, for instance, simply take something designed for 20nm elements and adapt it to 14nm. In contrast, the mechanical side of HDDs has remained relatively static for a long time. Platters, bearings, motors, servos and so on are pretty well the same, save that a bit more precision is required in track location and following. The heads have to be designed to fly lower and with narrower gaps, but the process is more one of refinement and evolution than having to throw out a whole plant.

0
0
Steven Jones
Bronze badge

Write endurance /= reliability

Surely write endurance and reliability are two things which are only loosely connected. The first is essentially the lifetime of the device for a given workload pattern, whilst reliability is much better described using failure rate figures within the drive's anticipated lifetime.

However, I'd certainly sit up and take notice of those write endurance figures. If they are to be believed, and there aren't other factors at play, this would really open the device up for use in enterprise storage uses, especially if the storage device can balance write loads over many devices so that "hot spots" don't arise.
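
To illustrate why endurance is rarely the limit for ordinary use, a quick sketch; the endurance rating and daily write volume below are hypothetical, not the figures for this drive:

# Years of service implied by a total-bytes-written (TBW) rating at a given write rate.
tbw_terabytes = 300        # hypothetical endurance rating
writes_gb_per_day = 40     # hypothetical (fairly heavy) desktop workload

years = (tbw_terabytes * 1000) / writes_gb_per_day / 365
print(f"~{years:.0f} years before the rated endurance is reached")   # ~20 years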

0
0

White LED lies: It's great, but Nobel physics prize-winning great?

Steven Jones
Bronze badge

Re: LED Lighting: RGB vs UV/Blue + White/Yellow Phosphor

That's correct. In practice, most white LEDs don't produce "white" light by mixing primaries. They do it by using a phosphor coating which "downshifts" much of the blue light to longer wavelengths, mixing this with the blue light that penetrates the phosphor layer.

Of course, none of these technologies produce the continuous (black-body) spectrum of an incandescent bulb, although many people seem to believe they do.

13
0

Revealed: Malware that forces weak ATMs to spit out 'ALL THE CASH'

Steven Jones
Bronze badge

Physical access to the machine's control system doesn't give you access to the money cassettes. That's totally different. Bank staff, for instance, will have access to some parts of the machine to recover things like "swallowed" cards. What they won't have is access to the hardened money "safe".

It's far easier to gain access to the control panels than the money safe. The real issue is that it is so easy to "infect" a machine with malware.

1
0

Top 10 SSDs: Price, performance and capacity

Steven Jones
Bronze badge

Unless you are doing something truly exceptional, hitting the write endurance limit is simply not an issue. Save that consideration for those running update-intensive server applications. Most likely something else will fail on your machine first. HDDs don't have an indefinite life either (and don't make the mistake of thinking MTBF gives you expected lifetime - it doesn't; it's a statistical value that applies to devices within their rated lifetime and, generally, HDD manufacturers never give you rated lifetimes).
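
The usual way to read an MTBF figure is as an annualised failure rate within the rated service life, not as a life expectancy. Roughly, with an illustrative MTBF figure:

# Converting MTBF to an approximate annualised failure rate (AFR).
mtbf_hours = 1_000_000     # a typical quoted consumer HDD figure (illustrative)
hours_per_year = 24 * 365

afr = hours_per_year / mtbf_hours
print(f"AFR ~{afr:.1%} per year while within the rated service life")   # ~0.9%
# i.e. it emphatically does not mean the drive is expected to last 114 years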

As for relocating MyDocs, MyMusic and so on, that's very easy. Assuming you are using Windows 7 or 8, then what I do is assign a system partition on the SSD large enough to take all the system files, program files and so on with plenty of room for expansion. Next I create a data partition on the SSD, which is where I place my MyDocs folder. Then, on an HDD I create partitions for my major data areas (like video, pictures). I then mount these as sub-folders in MyVideo, MyPictures. That way all these "mass storage" areas appear in subfolders in the relevant storage areas (you can also use symbolic links, but I prefer to "hard partition" the mass storage areas).

Of course you don't have to place MyDocs on the SSD. You can place it on an HDD, but personally I find that it's useful to be able to place some data files on the SSD for speed purposes. For example, I place my email client files on the SSD, and some applications (like Lightroom) greatly benefit from keeping the meta-data files on the SSD. As a general point, be careful to make sure that programs (like email clients) place their data and, as far as possible, config files in the data area. That way it's much easier to move to a new machine.

I back up the system partition using an imaging product (which allows me to restore the system without wiping out data). I back up the data areas using a synchronising backup to an external disk (USB3 in my case, but NAS, eSATA etc. will work too). I prefer a sync-type backup as it allows me to mount the backup on another machine and get at my data.
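
For what it's worth, a "synchronising backup" can be as simple as mirroring new or changed files by size and modification time. A minimal sketch (just to show the idea; it's not what I actually use, and it doesn't delete files removed from the source):

import os, shutil

def mirror(src: str, dst: str) -> None:
    """Copy files from src to dst when missing, or when size/mtime differ."""
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s, d = os.path.join(root, name), os.path.join(target_dir, name)
            s_stat = os.stat(s)
            if (not os.path.exists(d)
                    or os.path.getsize(d) != s_stat.st_size
                    or os.path.getmtime(d) < s_stat.st_mtime):
                shutil.copy2(s, d)   # copy2 preserves timestamps

# e.g. mirror(r"D:\Data", r"F:\Backups\Data")   (paths are purely illustrative)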

The general principle is that you should always have a backup regime designed for the worst case. Any disk can fail catastrophically, so design the regime such that you don't lose everything. Of all the things to worry about, write endurance is one of the last to consider.

0
0
Steven Jones
Bronze badge

Re: alas

Valuing storage solely by £/GB is like comparing food on the basis of how many calories you can buy. Yes, it's a factor, but far from the only one.

I did work for a company once where the accountants actually did work on that principle. They seemed to have great trouble understanding why anything else might matter...

4
2
Steven Jones
Bronze badge

Re: I don't get it

For most users the PCIe performance gain will simply not show. Also, SATA reviews are applicable to more people. They fit laptops as well as desktops. It also doesn't involve complex issues over drivers, boot arrangements and so on.

For the most part, it's not throughput that makes the user experience so much better with SSDs, it's the vastly reduced latency and (the other side of that coin), increased IOPs. I'd venture for most people, 500MBps is going to be plenty. I think PCI-e is almost a separate market and really won't figure in considerations unless you are the ultimate speed freak or have some specialist server application.

If it's things like boot time, system responsiveness and application start-up times that matter, then the real-world difference you'll see on most PCs will be very small. That's not surprising, as most applications will have other resource bottlenecks (like cpu, network activity, or interactions with devices other than storage). These latter start to dominate response times.

For example, see this.

http://www.overclock.net/t/1489684/ssd-interface-comparison-pci-express-vs-sata

If you really must have hyper-fast copying of large files or are running an incredibly I/O-intensive enterprise app, then go ahead. But my guess is that this is irrelevant to most people wanting a "consumer level" SSD. All of them will transform the user experience, and it's probably ease of migration, reliability and price that are most important for this sort of comparison.

10
1

Ellison: Sparc M7 is Oracle's most important silicon EVER

Steven Jones
Bronze badge

Re: Boastful bravado

The announcement appears to be an acceptance that SPARC is no longer a general purpose processing chip but something optimised for running Oracle applications. Admittedly that's an awful lot of the application space - it's not just databases of course. However, it does seem a retreat from what SPARC was once mooted to be: a high-speed, general purpose, cost-effective RISC CPU that could compete on all aspects of performance and suit a wide range of applications.

Of course the real problem is that the T series essentially gave up on single-thread performance in favour of increased aggregate throughput. It's a bet that applications will be developed to suit this architecture. As many of us found when deploying T series machines (often bought by senior managers who swallowed Sun's line on power efficiency, throughput and virtualisation), they were fundamentally crippled for some sorts of applications. It often showed up where latency was an issue. Call centres are expensive to operate, and keeping agents (and customers) hanging around for slow systems is not efficient.

As it is I would not choose SPARC except for reasons of supporting legacy applications.

1
1

WHY did Sunday Mirror stoop to slurping selfies for smut sting?

Steven Jones
Bronze badge

Re: Entrapment???

It's not entrapment in the legal sense by a law enforcement agency, but entrapment according to one of the other recognised definitions. See meaning 2 (a), which would seem to cover it.

As to why quote an example from criminal law, as Andrew did, the justification would appear to be that the press can justify their actions under the press code by analogy with the way entrapment is interpreted by the courts as applying to criminal law. Quite why he chose a US example, though, I'm not sure, as the interpretation of entrapment is different in the two regimes. Generally the US allows its law enforcement agencies far more freedom. Witness the various arms-export cases engineered by the FBI. Those would not be allowed in UK law.

en·trap

tr.v. en·trapped, en·trap·ping, en·traps

1. To catch in or as if in a trap.

2. a. To lure into danger, difficulty, or a compromising situation. See Synonyms at catch.

b. To lure into performing a previously or otherwise uncontemplated illegal act.

0
0
Steven Jones
Bronze badge

Re: the Mirror couldn't even take its own selfie

I've no doubt papers used to be lazy (remember all those stories of never-ending expense-paid lunches, corrupt employment practices for printers and so on). However, those halcyon days have gone. These days newspapers (with a few exceptions) are under enormous financial pressure, with their circulation eroded by media fragmentation and the internet. Their other main source of income, advertising, is being strangled by competition from the on-line world. Even those papers with a successful on-line presence can't get anywhere near recovering the difference.

So the short story is, they aren't so much unwilling to do proper journalism as unable to finance it.

nb. what's true for newspapers is also true for free-to-air broadcasting financed by advertising.

0
0

Ordnance Survey intern plonks houses, trees, rivers and roads on GB Minecraft map

Steven Jones
Bronze badge

Re: Liverpool underground nightclubs!

Down to a sunless sea.

Which accurately describes the Irish Sea most summers.

4
0

SanDisk Extreme Pro SSD – courting speed freaks and gamers

Steven Jones
Bronze badge

Re: An important question : SSD failure modes?

I'm not talking about enterprise arrays (the article was not about such things). Where I used to work we had storage measured in petabytes and databases in the hundreds of GB. Those sorts of devices are actively monitored and, in any case, use RAID protection. Pre-emptive swapouts happened, but complete failures happened too. Sometimes the pre-emptive swapouts were done because unexpected failures across a number of drives suggested batch problems were present.

No, I was referring to my own personal experience of consumer-grade units. The monitoring of those is poor, and I would hazard a guess that the vast majority of users never look at the SMART stats and, in any case, wouldn't have much of a clue what they mean. Also, my experience dates back a long time, to when such stats didn't exist or were poorly monitored. The few clues you used to get were things like sticky bearings stopping the drive spinning up (fixable with a sharp rap from a screwdriver handle). The catastrophic failures tended to involve a lot of clicking or floods of I/O error messages followed by a locked-up machine. Or a laptop that didn't work. (Or a PVR that stopped working.)

0
0
Steven Jones
Bronze badge

Re: An important question : SSD failure modes?

All the HDDs I've had fail did so completely and without warning. If people base their backup model on having warning of an impending failure then they are playing a dangerous game. Admittedly I've not had an HDD failure for a few years, but you have to be prepared for the worst.

Anyway, with 10 year guarantees, perhaps we'll hear less about write endurance which, for the vast majority of uses outside of enterprise servers, is simply not a serious issue.

4
1

What the 4K: High-def DisplayPort vid meets reversible USB Type C

Steven Jones
Bronze badge

Re: sigh

True, in part, but as no commercial (or broadcast) systems generate more than 60fps, and much of the original material is captured at 24 or 25fps, 240Hz gets you nowhere.

Video games might be different, but super-high frame rates for broadcast are pointless.

1
0
Steven Jones
Bronze badge

Re: sigh

Motion blur is part of how moving images work. Indeed there are some cinematic advantages in having the blur in frames rather than the eye having to reconstruct it through the persistence of vision. Bear in mind that the great majority of films used to be shot at 24fps with a two blade shutter to greatly reduce flickering.

Indeed much electronic capture is also done at 24 or 25fps to maintain a "cinematic" look.

What a high (video) frame rate can help with is dealing with flicker through higher refresh rates, and there's some evidence that 100Hz is preferable to 50Hz in that respect. But high refresh rates have nothing to do with "motion blur". High refresh rates also don't have to be present in the source; they can be synthesised by the display device.

1
1

A SCORCHIO fatboy SSD: Samsung SSD850 PRO 3D V-NAND

Steven Jones
Bronze badge

Re: Worst article in ages

"I feel violated, thanks for wasting my time."

What a drama queen...

1
0

Brit telcos warn Scots that voting Yes could lead to HEFTY bills

Steven Jones
Bronze badge

Re: If prices go up, we’ll know whom to blame.

I cannot imagine that the UK (which has a veto) would agree to Scotland joining the EU unless it had a Schengen opt-out. I also expect that would be agreed as part of any independence negotiations, as part of the agreement to have an open border with Scotland. Ireland is, after all, not part of the Schengen agreement. It may, or may not, cause accession problems with the EU. Just one of those unknowns.

2
0
Steven Jones
Bronze badge

Re: Obligations

Scotland would, like any independent state, be legally able to nationalise any asset on their land. However, they'd have to pay compensation or face enormous international consequences (especially if they were a member of the EU).

So the nationalisation of the assets (money aside) is not really the issue. The problem is untangling a highly integrated organisation and, especially, the IT systems which are largely split on functional, not regional grounds. Also the IPRs would remain with the registered company (in the UK) as would any licencing deals. So, plenty of work for lawyers, accountants and IT people...

0
0
Steven Jones
Bronze badge

Re: More scare mongering

Indeed. The Scots could only nationalise the assets (for which they'd have to pay compensation or be an international pariah). However, their real problem will be trying to separate off what is a tightly integrated national company with common IT systems (most of them split on a functional, not regional, basis). All the IPRs for those systems will reside with the original company (registered in the UK), and you can bet that every single software and hardware vendor will require a renegotiation of their licences and support contracts for what will be a new organisation.

However, the big problem will be splitting all those IT systems into separate, national ones. That is, to put it mildly, a huge logistical problem fraught with difficulties and costs. Of course they could decide to just nationalise the assets covered by Openreach and, maybe, BT Wholesale. I'm sure such issues could be fixed, eventually, but it would come at a considerable cost.

Another little interesting issue is who would, in the event of such a renationalisation, be responsible for that part of the pension deficit attributable to BT pensioners (current and future). The employment contracts will have been variously made with the GPO, The Post Office and BT and so I would expect some form of split of responsibility.

So plenty of work for the lawyers, accountants and software people.

nb. a side-effect of the Ofcom/BT resettlement into three separate entities is that it would be operationally much easier to split along the OR/BTW/BTR boundaries than on regional lines. The latter formed no part of Ofcom's considerations.

0
0
Steven Jones
Bronze badge

Re: If prices go up, we'll know who to blame.

Dead easy. It doesn't matter what the country is called; the Scots will be legally leaving the UK. It's also well established in international law that where a relatively small part (in population) of a state gains independence, the "continuing state" effectively remains signatory to international treaties.

Not one single politician of any note that I've heard of, whether UK or EU has questioned that principle.

5
0
Steven Jones
Bronze badge

Re: duh

Scotland can't "stay" in the EU as it's a new country and will have to be admitted as such. As for the UK leaving the EU, that's predicated on the Conservatives getting an overall majority (looking unlikely) and getting a "yes" to leave after the results of any renegotiating.

nb. one benefit to the Scots of not being a member of the EU is that they would have full control of their fishing waters. The Norwegians do a rather better job of administering theirs than the EU does of what it views as, in effect, a common asset.

8
0

Britain's housing crisis: What are we going to do about it?

Steven Jones
Bronze badge

That 2-3% figure of land occupied by buildings, according to the UK National Ecosystem Assessment is what's left of the 10.6% of England categorised as urban after removing the space occupied by gardens, allotments, parks, playing fields, open water and other green spaces.

So yes, you can build more, but at the expense of higher density housing, loss of green spaces, loss of gardens and a general reduction in amenities. Also, all that new housing requires amenities of its own, so that 2-3% number is highly misleading.

The real problem is too many people, but as nobody is going to come up with a solution to that any time soon, we are stuck with it.

So it's certainly possible to increase the supply of housing and, probably, decrease the price, but the environment will get more crowded, there will be less public and private space per person and it will be made even worse by attracting yet more people into the most overcrowded parts of the country. The price of space will continue to go up with the population.

The demand comes not just from people already resident in the UK but also from those with aspirations to be residents.

0
1

Love XKCD? Love science? You'll love a book about science from Randall Munroe

Steven Jones
Bronze badge

Just keep near the surface

I'd be a bit worried if something significant did happen to anybody taking a swim in a spent nuclear fuel pool, unless they were unwise enough to try and swim down near the fuel (and even then, exposure would be limited as there's a limit to how long people can hold their breath; that's assuming nobody is using diving equipment). After all, besides cooling, the pools are designed to be deep enough to shield anybody at the surface from significant levels of radiation exposure. They only need to be about 6m deep for that purpose, and are all at least twice that depth.

9
0
