1366 posts • joined 21 May 2007
Re: False economies here
It's true the UK and several other European auctions were set up to maximise the auction value. The German auction was even more expensive than the UK one. It wasn't helped by a number of state-owned telco operators (like France Telecom) piling into the auction with state backing. For the existing operators, it was an existential threat: they either got one of the new bands or they were essentially dead. Unfortunately for some countries slower off the mark, like Italy, the bubble burst as shareholders and banks took fright, and it all came crashing down.
Fortunately, things have calmed down a bit. To put this in perspective, the UK 3G auction raised $34bn in 2000 (about 2.5% of GDP). Corrected for inflation, that's considerably more than this latest US auction raised, in a country with five times the population. Not exactly cheap, but not the insane levels of 2000 in proportion to the market size.
Wandered into an alternative universe...
Vacuum cleaner tests. VACUUM CLEANER TESTS? In The Register. What the hell is happening?
That's why everybody should have backup policies...
How many times does it have to be pointed out: have a backup strategy, and implement it properly. No device is foolproof. No cloud storage system will be perfect. Even if the hardware is perfect, software is not, and neither are you (the user). And then there's the little issue of ransomware and other malicious software.
Have a backup system, make sure it works, and validate what it does properly. Decide how much data you can afford to lose, and plan your system appropriately. Relying on a single storage device or service, no matter how well engineered, is insufficient. Nothing on earth can guarantee you won't lose data, but you can improve the odds.
My mechanism involves two independent external backup drives, which I rotate frequently, and I always keep one off site. It is far, far quicker to recover from than any cloud system, and that includes driving the 25 miles to my parents' house (where I keep the second copy) to pick up the "disaster recovery" copy. (By all means use cloud for small incremental changes as interim backups as well.)
I count this as a low cost solution given the risks of losing all your data.
Why is this called a hybrid?
Count me as confused, but I don't understand why this is called a hybrid. It seems to be a straight hydrogen fuel-cell car (and there have been other examples, albeit mostly prototypes). In contrast, surely a hybrid (by definition) includes two (or maybe more) power sources.
Re: Not this bloody chestnut again
Go look at the thinkbroadband.com site, and there's very clear evidence of an increase in speeds. The upper-quartile measure (which aligns quite well with an estimated 26% take-up of so-called superfast packages) has moved up a long way in the past year or so.
Of course, there will be something approaching 10% that will not get such speeds from the first phase of BDUK, but there's very good reason to believe (as in Cornwall) that the original objectives will be overachieved by some margin. That will still leave some disappointed, of course, albeit there are later funding phases.
Of course, if somebody could magic up the estimated £30bn required for a full fibre network, then all could change. However, nobody has managed to come up with a viable way of paying for it which is politically acceptable (equivalent to about £4 per line over a 30-year period).
You're expecting neutral, dispassionate fact reporting from the Register? It's not BBC News, you know; they at least have to pay lip service to the idea.
Re: Many good points - however
The only sort of digital back there is for large format is the scanning back, which works rather like a flat-bed scanner in that a linear array physically moves across the focusing plane. If you want, one model produces a 1.1GB file with 48-bit output.
Of course, they are useless for moving subjects.
Re: Skill is all you need?
I'd certainly like to see a cameraphone produce a good closeup photo of a bird in flight, or a macro photo or a great closeup of an athlete. Or of a myriad of different subjects.
This trope that a great photographer will always surpass the limitations of their equipment, and outshine the mere snapper, is regularly trotted out. Of course a gifted photographer will beat the snapper, but it's still the case that for some sorts of photographs you need the right equipment.
Re: Good article
There are a very few system cameras with a fully electronic shutter, like the Sony A7S and the Panasonic GH4. Unfortunately, the problem is that they take a long time to scan the sensor as they lack what's called a "global shutter". With a global shutter, the sensor can, in effect, take an instantaneous "snapshot" of the scene. However, on a CMOS sensor the photosites have to be read sequentially, row by row. On even relatively low resolution sensors of 12-14MP, this process takes, perhaps, 30ms. In consequence, for even modestly fast shutter speeds, the sensor rows have to be cleared and read as a sort of rolling strip that passes up the sensor. Of course, this is essentially what a focal plane shutter does, by exposing a narrow strip for higher shutter speeds. The difference is that electronic shutters take about 1/30th of a second to traverse the sensor, whilst a half-decent focal plane shutter does it in about 1/250th of a second or less. What this means is the top of the image is exposed before the bottom, so you get "leaning verticals" on moving objects. That's called "rolling shutter". You still get it with focal plane shutters, but it's about an order of magnitude worse with electronic ones. The problem also gets worse the higher the resolution of the sensor, which is why you don't see the option on sensors of 16MP upwards. (A lot of cameras do have an option for something called "EFCS", or electronic first curtain shutter. That's a partially electronic shutter which uses electronics to clear the photosites (which can be done faster than reading them), running ahead of a physical second curtain which shuts off the exposure. It's quieter than a fully mechanical shutter, but far from silent.)
You see the problem with "rolling shutter" on a lot of video cameras with CMOS sensors, where you get weird effects like twisted aeroplane propeller blades. It's technically possible to create a CMOS sensor with a global shutter, but (currently at least) only by creating a temporary charge storage area for each pixel, which means giving over silicon real estate which, in turn, means compromising other aspects of sensor performance, like dynamic range and noise.
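To put rough, purely illustrative numbers on that readout-time difference (the subject speed and readout times below are assumptions, not measurements):

```python
# Rolling-shutter "leaning verticals", roughly: the bottom rows are exposed
# later than the top rows by the full readout time, so a horizontally moving
# subject is displaced by readout_time * speed between top and bottom.
def skew_pixels(readout_s: float, speed_px_per_s: float) -> float:
    """Horizontal displacement between top and bottom rows of the frame."""
    return readout_s * speed_px_per_s

# A subject crossing the frame at an assumed 2000 px/s:
electronic = skew_pixels(1 / 30, 2000)   # ~67 px of lean with a slow electronic readout
mechanical = skew_pixels(1 / 250, 2000)  # ~8 px with a fast focal plane shutter
```

The roughly 8x difference between the two figures is the "order of magnitude" mentioned above.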
Having taken more than a few photos at gigs myself, I know the problem of noisy shutters. Of course it depends on the circumstances. In a full-on rock performance, especially if it's outdoors and you are in the pit in front of the stage, no problem. If it's a folk singer or a string quartet in a quiet concert hall, it's nasty (not to mention at wedding ceremonies).
Re: Total light?
Yes, your 6D collects about 2.6 x the amount of light in total (as Canon APS-C has a crop factor of about 1.6). That translates to about 1.5 stops better performance across the ISO range.
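For anyone who wants the arithmetic behind that, a quick sketch: at the same f-number and exposure, the light gathered scales with sensor area, i.e. with the square of the crop factor.

```python
import math

# Crop-factor arithmetic (a sketch): Canon APS-C has a ~1.6x linear crop,
# so full frame has ~1.6^2 times the sensor area and gathers that much
# more light at the same f-number and shutter speed.
crop = 1.6
light_ratio = crop ** 2          # ~2.56x the total light
stops = math.log2(light_ratio)   # ~1.4 stops, i.e. roughly the stop and a half quoted
```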
For some things, bigger is better.
Re: Good article
The four-thirds system was actually defined as a joint venture between Olympus and Kodak, not Panasonic. However, it's now a consortium which includes Panasonic. The micro four-thirds system was defined by Olympus and Panasonic and essentially specified a new lens mount with a shorter register, eliminating the mirror box (but the sensor format is that of the original four-thirds system).
In the medium/long term it seems to me that fully electronic system cameras (like MFT, Sony, Fuji etc.) will gradually push DSLRs into a niche market, as having mirrors flapping around doesn't seem like the future.
All that's needed now is for a manufacturer to crack the problem of the fully electronic shutters on systems cameras (existing examples all have serious shortcomings), and we can have properly silent cameras.
Re: CCDs on DSLRs
I see. I do remember the proposal. It would also have been CCD at the time. It hit all sorts of technical issues and, in retrospect, was a dead end: there were all sorts of integration problems, and a DSLR designed from the ground up would have lots of advantages.
The "system camera" approach is actually returning. If you take something like the Sony A7 series, they have a full-frame sensor and, because of the very short sensor-flange distance, can mount almost every 35mm (or MF) lens ever made via adapters, excepting only some of those which are fully "fly by wire". OK, it's not quite the same as a film back, as it includes a viewfinder and a lens mount, but it's not so different in principle to using a digital back on an MF camera.
nb. the CCD vs CMOS argument is one of those "religious war" issues which comes up from time to time.
Re: Total light?
You are right in that image quality is ultimately limited by the total number of photons detected, but that's over the whole image area (and, for a common output resolution, that's per-pixel). In principle that's purely a function of the lens alone. A smaller sensor requires a proportionately shorter focal length in order to get the same field of view. However, to collect the same number of photons, it will need a proportionately wider aperture. Take the example of a 35mm so-called "full frame" sensor, 24x36mm, and imagine you mount a 50mm lens with an aperture of f4. Now imagine a sensor of half the dimensions, 12x18mm (not a usual sensor size, but it makes the arithmetic easier). You will now need a 25mm lens to get the same field of view, and to collect the same total number of photons in a given exposure time, it will now have to be f2 (and it will have the same depth-of-field characteristics). This is all part of what's called "the principle of equivalence". As the f-stop is simply the focal length divided by the aperture diameter, you can see the physical diameter of the aperture will be exactly the same in both cases. As the maximum (physical) diameter of the aperture is the primary factor that dictates the lens diameter, you can see that for the same light-gathering power the two lenses will be (broadly) similar in diameter (although not length).
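The arithmetic in that example can be sketched in a few lines:

```python
# Equivalence sketch: a 50mm f/4 on full frame vs a sensor with half the
# linear dimensions (the worked example from the text).
ff_focal, ff_fnumber = 50.0, 4.0
crop = 2.0
eq_focal = ff_focal / crop        # 25mm gives the same field of view
eq_fnumber = ff_fnumber / crop    # f/2 collects the same total photons

# f-number = focal length / aperture diameter, so both lenses end up with
# the same physical aperture diameter of 12.5mm:
ff_diameter = ff_focal / ff_fnumber
eq_diameter = eq_focal / eq_fnumber
```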
So the question might be asked: why do we need large sensors if we can just use smaller sensors with wider lenses? Leaving aside the issue that lenses with very low f-numbers become increasingly difficult and expensive to design (only partly ameliorated by the smaller image circle), there is a major sensor limitation: the ability of a sensor to detect photons before saturating. Broadly speaking, a sensor with 4 times the surface area can detect 4 times the number of photons before saturating (or blowing highlights). Note that this applies not just to sensors, but also to film. Slide film, especially, "blows" highlights, and to collect more light in total you need bigger film formats.
Of course there is another issue, that for any given output resolution, the smaller sensor will have to have smaller photosites (clearly half the dimensions in this case) and that, in turn, means the 25mm lens would have to be able to resolve twice as well.
As the ultimate dynamic range of the sensor is defined by the ratio between the saturation level and what's called the "noise floor", there is an advantage to the larger sensor. It has the potential for four times the number of detected photons before saturation, which means, all other things being equal, it can achieve a couple more EV of dynamic range.
There's a lot more to it than that of course but, essentially, the reason "big is better" just comes down to that ability to detect more photons by dint of the greater surface area.
CCDs on DSLRs
Why the CCD obsession? They were used on early DSLRs, but CMOS technology has long overtaken CCD performance for the size of sensors used in SLRs. Indeed, even in the MF world CMOS sensors have started appearing (all utilising a recently released Sony sensor). Also, Leica have now adopted a CMOS sensor for the M (Typ 240). For comparable cameras, current CMOS sensors (size for size) beat CCD on frame rate, high ISO performance, dynamic range and colour sensitivity. Yes, there are some exceptional specialist CCD sensors for scientific work, but not for still photography.
For example, here's a DXOmark comparison of the Leica M 240 (CMOS) vs a couple of Leica M9 (CCD) models. On all the objective criteria, the CMOS model wins out all through the ISO range.
Some people claim there's such a thing as CCD "colour". However, both CMOS and CCDs are (close to) colour blind, and what gives them the ability to distinguish colour is the filter matrix. (There is an exception, the Foveon sensor, which distinguishes colour by the difference in silicon penetration depth of photons of different wavelengths. To be ultra-picky, some video cameras use prismatic separation with multiple sensors, but no current still cameras do.)
If you want a CCD DSLR, then there are plenty of second-hand ones around. Here's a list of Nikon models with the sensor type listed. There's also the Leica S MF DSLR if you have very deep pockets and don't care about frame rates or performance at anything much above base ISO.
m43 resolution 4Mpix???
" A four-thirds format digital camera is unlikely to deliver more than four megapixels of information per frame, irrespective of how much data it outputs. "
This is simply nonsense. Quite apart from it bearing no resemblance to the resolving power of typical lenses, DXOmark have tested a large number of M43 lenses and measured resolutions far in excess of 4MP. A cursory glance came up with several that reached 11MP. There are also plenty of online comparative samples which show this.
A further point is that the limit of lens resolution isn't a binary thing (and neither is diffraction limiting for that matter). It manifests as a gradual loss in contrast. There's no sudden "cut-off". It depends on the criteria used.
Further, even if the sensor does "out-resolve" the lens, there are still advantages, as the 2x2 Bayer matrix found on most digital cameras repeats at twice the pitch of the photosites. Thus you get better colour sampling with the higher resolution sensor.
Also, it takes far more than 4 photosites to output a single RGB pixel. Any algorithm which did so would produce horrible results, apart, maybe, from a 2x downsizing. In practice, demosaicing algorithms are complex beasts and make a huge difference to the final output.
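For the curious, here's a deliberately naive bilinear demosaic sketch for an RGGB mosaic, just to show that each output RGB pixel is interpolated from a neighbourhood of single-colour photosites; real demosaicing algorithms are far more sophisticated than this:

```python
import numpy as np

def demosaic_bilinear(raw: np.ndarray) -> np.ndarray:
    """Naive bilinear demosaic of an RGGB Bayer mosaic (illustrative only)."""
    h, w = raw.shape
    out = np.zeros((h, w, 3))
    mask = np.zeros((h, w, 3), dtype=bool)
    mask[0::2, 0::2, 0] = True   # red photosites
    mask[0::2, 1::2, 1] = True   # green (on red rows)
    mask[1::2, 0::2, 1] = True   # green (on blue rows)
    mask[1::2, 1::2, 2] = True   # blue photosites
    for c in range(3):
        vals = np.pad(np.where(mask[..., c], raw, 0.0), 1)
        cnt = np.pad(mask[..., c].astype(float), 1)
        # each output sample is the mean of that colour's photosites in the
        # surrounding 3x3 window
        num = sum(vals[i:i + h, j:j + w] for i in range(3) for j in range(3))
        den = sum(cnt[i:i + h, j:j + w] for i in range(3) for j in range(3))
        out[..., c] = num / den
    return out

# A flat grey mosaic should come back as flat grey in all three channels:
rgb = demosaic_bilinear(np.full((6, 6), 0.5))
```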
Re: Wake me up when it's less boring or does more mpg
My 5-year-old 1.6 Focus turbo diesel still averages 53-54mpg, and only costs £30 per year to tax (as it just scraped under the relevant CO2 emission target at the time). From memory it had around 112bhp, but that seems to be plenty for my purposes. I think it cost me just under £14K at the time. Yes, there are issues with diesel particulates (although it's got a particulate filter, so I don't know how much that helps).
In all, it's not obvious that much progress has been made in the last 5 years.
I recently took this picture of a Hawk (XL592) which used to sit on a concrete plinth at Booker airfield looking very sorry for itself. I came across it in a field at the back of the 15th-century Ockwell's Manor, on the outskirts of Maidenhead. Since I took the photo its paint job has been completed with RAF roundels.
The Hawk was/is a very tough and relatively simple aircraft. Unlike modern fighters it doesn't need complex electronics to fly, and I suspect it will be flown by enthusiasts long after more modern fighters are grounded.
There is something about highly tuned machinery from the 60s, whether it's fighter aircraft or racing cars. They somehow "look right" whilst the modern stuff just seems to have ugly bits tacked on.
nb. I'm not sure the item mentioned that Tommy Sopwith almost won the America's Cup in 1934, and there was more than a little controversy about the result.
Re: A matter of precision
In fact, HDDs have to fly their heads nanometres above the disk surface, not microns.
However, to go back to the original question of why HDDs are cheaper (per GB) than flash storage, it's largely down to how the data is stored. On an HDD, the data is stored in the form of magnetic domains on a substrate. The manufacturing process does not require every single bit to be represented by a photolithographic process. So once that bit of high-precision engineering has been produced to fly the heads into the right position (and decades of engineering have refined that process), the coating comes relatively cheap. That's also why tape storage is (per GB) cheaper than disk. The coating comes cheap.
The other issue is that each new generation of flash storage requires an immense investment in new equipment, as it's dealing with fundamentally smaller elements. You can't, for instance, simply take something designed for 20nm elements and adapt it to 14nm. In contrast, the mechanical side of HDDs has remained relatively static for a long time. Platters, bearings, motors, servos and so on are pretty well the same, save that a bit more precision is required in track location and following. The heads have to be designed to fly lower and with narrower gaps, but the process is more one of refinement and evolution than having to throw out a whole plant.
Write endurance /= reliability
Surely write endurance and reliability are two things which are only loosely connected. The first is essentially the lifetime of the device for a given workload pattern, whilst reliability is much better described using failure-rate figures within the drive's anticipated lifetime.
However, I'd certainly sit up and take notice of those write endurance figures. If they are to be believed, and there aren't other factors at play, this would really open the device up for use in enterprise storage, especially if the storage system can balance write loads over many devices so that "hot spots" don't arise.
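As a back-of-envelope illustration of why endurance rarely matters outside servers (the rating and workload figures below are hypothetical, not the drive's actual spec):

```python
# Hypothetical figures: a 300TB write-endurance rating against a fairly
# heavy desktop workload of 20GB written per day.
tbw_bytes = 300e12
daily_writes_bytes = 20e9
years_to_limit = tbw_bytes / (daily_writes_bytes * 365)   # ~41 years
```

On those assumptions the rest of the machine will be landfill long before the rating is reached; only sustained server-style write loads change the picture.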
Re: LED Lighting: RGB vs UV/Blue + White/Yellow Phosphor
That's correct. In practice, most white LEDs don't produce "white" light by mixing primaries. They do it by using a phosphor coating which "downshifts" much of the blue light to longer wavelengths, mixing this with the blue light that penetrates the phosphor layer.
Of course, none of these technologies produce the continuous (black-body) spectrum of an incandescent bulb, although many people seem to believe they do.
Physical access to the machine's control system doesn't give you access to the money cassettes. That's totally different. Bank staff, for instance, will have access to some parts of the machine to recover things like "swallowed" cards. What they won't have is access to the hardened money "safe".
It's far easier to gain access to the control panels than the money safe. The real issue is that it is so easy to "infect" a machine with malware.
Unless you are doing something truly exceptional, hitting the write endurance limit is simply not an issue. Save that consideration for those running update-intensive server applications. Most likely something else will fail on your machine first. HDDs don't have an indefinite life either (and don't make the mistake of thinking MTBF gives you expected lifetime - it doesn't; it's a statistical value that applies to devices within their rated lifetime and, generally, HDD manufacturers never give you rated lifetimes).
As for relocating MyDocs, MyMusic and so on, that's very easy. Assuming you are using Windows 7 or 8, what I do is assign a system partition on the SSD large enough to take all the system files, program files and so on, with plenty of room for expansion. Next I create a data partition on the SSD, which is where I place my MyDocs folder. Then, on an HDD, I create partitions for my major data areas (like video and pictures). I then mount these as sub-folders in MyVideo and MyPictures. That way all these "mass storage" areas appear as subfolders in the relevant places (you can also use symbolic links, but I prefer to "hard partition" the mass storage areas).
Of course you don't have to place MyDocs on the SSD. You can place it on an HDD, but personally I find that it's useful to be able to place some data files on the SSD for speed purposes. For example, I place my email client files on the SSD, and some applications (like Lightroom) greatly benefit from keeping the meta-data files on the SSD. As a general point, be careful to make sure that programs (like email clients) place their data and, as far as possible, config files in the data area. That way it's much easier to move to a new machine.
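The same idea can be sketched with a symbolic link; the paths below are made up for illustration, and on Windows you'd typically use an NTFS junction or mount the partition as a folder via Disk Management instead:

```python
import os
import tempfile

# Sketch of "mount the mass storage under the profile": the profile folder
# lives on the fast drive, the bulky data on the big one, joined by a link.
root = tempfile.mkdtemp()
hdd_videos = os.path.join(root, "hdd", "Videos")    # lives on the big HDD
ssd_profile = os.path.join(root, "ssd", "MyDocs")   # profile stays on the SSD
os.makedirs(hdd_videos)
os.makedirs(ssd_profile)

link = os.path.join(ssd_profile, "MyVideo")
os.symlink(hdd_videos, link, target_is_directory=True)

# Anything written under the profile path actually lands on the HDD:
with open(os.path.join(link, "clip.txt"), "w") as f:
    f.write("test")
```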
I back up the system partition using an imaging product (which allows me to restore the system without wiping out data). I back up the data areas using a synchronising backup to an external disk (USB3 in my case, but NAS, eSATA etc. will work too). I prefer sync-type backups as they allow me to mount the copies on another machine and get at my data.
The general principle is you should always have a backup regime which is designed for the worst case. Any disk can fail catastrophically, so design the regime such that you don't lose everything. Of all the things to worry about, write endurance is one of the last to consider.
Valuing storage solely by £/GB is like comparing food on the basis of how many calories you can buy. Yes, it's a factor, but far from the only one.
I did work for a company once where the accountants actually did work on that principle. They seemed to have great trouble understanding why anything else might matter...
Re: I don't get it
For most users the PCIe performance gain will simply not show. Also, SATA reviews are applicable to more people. They fit laptops as well as desktops. It also doesn't involve complex issues over drivers, boot arrangements and so on.
For the most part, it's not throughput that makes the user experience so much better with SSDs, it's the vastly reduced latency and (the other side of that coin) increased IOPS. I'd venture that for most people, 500MBps is going to be plenty. I think PCIe is almost a separate market and really won't figure in considerations unless you are the ultimate speed freak or have some specialist server application.
If it's things like boot time, system responsiveness and application start-up times that matter, then the real-world difference you'll see on most PCs will be very small. That's not surprising, as most applications will have other resource bottlenecks (like CPU, network activity, or interactions with devices other than storage), and these start to dominate response times.
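A quick, purely illustrative sum shows why (the speeds and file size below are assumptions, not benchmarks):

```python
# Pure read time for a hypothetical 200MB application load at typical
# SATA vs PCIe sequential speeds (MB/s figures assumed).
app_mb = 200
sata_s = app_mb / 500      # 0.40 s at a SATA-class 500MB/s
pcie_s = app_mb / 3500     # ~0.06 s at a PCIe-class 3500MB/s
saving_s = sata_s - pcie_s # ~0.34 s, easily swamped by CPU and other waits
```

A third of a second saved on a multi-second application launch simply disappears into the noise of the other bottlenecks.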
For example, see this.
If you really must have hyper-fast copying of large files or are running an incredibly I/O-intensive enterprise app, then go ahead. But my guess is that this is irrelevant to most people wanting a "consumer level" SSD. All of them will transform the user experience, and it's probably ease of migration, reliability and price that's most important for this sort of comparison.
Re: Boastful bravado
The announcement appears to be an acceptance that SPARC is no longer a general-purpose processing chip but something optimised for running Oracle applications. Admittedly that's an awful lot of the application space - it's not just databases of course. However, it does seem a retreat from what SPARC was once mooted to be: a high-speed, general-purpose, cost-effective RISC CPU that could compete on all aspects of performance and suit a wide range of applications.
Of course the real problem is that the T series essentially gave up on single thread performance in favour of increased aggregate throughput. It's a bet that applications will be developed to suit this architecture. As many of us found when deploying T series machines (often bought by senior managers who swallowed SUN's line of power efficiency, throughput and virtualisation), they were fundamentally crippled for some sorts of applications. It often showed itself up where latency was an issue. Call centres are expensive to operate, and keeping agents (and customers) hanging around for slow systems is not efficient.
As it is I would not choose SPARC except for reasons of supporting legacy applications.
It's not entrapment in the legal sense by a law enforcement agency, but entrapment according to one of the other recognised definitions. See meaning 2 (a), which would seem to cover it.
As to why quote an example from criminal law, as Andrew did, the justification would appear to be that the press can justify their actions under the press code by analogy to the way it's interpreted by the courts as applying to criminal law. Quite why he chose a US example, though, I'm not sure, as the interpretation of entrapment is different in the two regimes. Generally the US allows the law enforcement agencies far more freedom. Witness the various cases involving exports of arms set up by the FBI. Those would not be allowed under UK law.
tr.v. en·trapped, en·trap·ping, en·traps
1. To catch in or as if in a trap.
2. a. To lure into danger, difficulty, or a compromising situation. See Synonyms at catch.
b. To lure into performing a previously or otherwise uncontemplated illegal act.
Re: the Mirror couldn't even take its own selfie
I've no doubt papers used to be lazy (remember all those stories of never-ending expense-paid lunches, corrupt employment practices for printers and so on). However, those halcyon days have gone. These days newspapers (with a few exceptions) are under enormous financial pressure, with their circulation eroded by media fragmentation and the internet. Their other main source of income, advertising, is being strangled by competition from the online world. Even those papers with a successful online presence can't get anywhere near recovering the difference.
So the short story is, they aren't so much unwilling to do proper journalism as unable to finance it.
nb. what's true for newspapers is also true for free-to-air broadcasting financed by advertising.
Re: Liverpool underground nightclubs!
Down to a sunless sea.
Which accurately describes the Irish Sea in most summers.
Re: An important question : SSD failure modes?
I'm not talking about enterprise arrays (the article was not about such things). Where I used to work we had storage measured in petabytes and databases in the 100s of GB. Those sorts of devices are actively monitored and, in any case, use RAID protection. Pre-emptive swapouts happened, but complete failures happened too. Sometimes the pre-emptive swapouts were done because unexpected failures had occurred across a number of drives and batch failures were suspected.
No, I was referring to my own personal experience of consumer-grade units. The monitoring of those is poor and I would hazard a guess that the vast majority of users never look at the SMART stats and, in any case, wouldn't have much of a clue what they mean. Also, my experience dates back a long time, to when such stats didn't exist or were poorly monitored. The few clues you used to get were things like sticky bearings stopping the drive spinning up (fixable with a sharp rap from a screwdriver handle). The catastrophic failures tended to involve a lot of clicking or floods of I/O error messages followed by a locked-up machine. Or a laptop that didn't work. (Or a PVR that stopped working.)
Re: An important question : SSD failure modes?
All the HDDs I've had fail did so completely and without warning. If people base their backup model on having warning of an impending failure then they are playing a dangerous game. Admittedly I've not had an HDD failure for a few years, but you have to be prepared for the worst.
Anyway, with 10 year guarantees, perhaps we'll hear less about write endurance which, for the vast majority of uses outside of enterprise servers, is simply not a serious issue.
True, in part, but as no commercial (or broadcast) systems generate more than 60fps, and much of the original material is captured at 24 or 25fps, 240Hz gets you nowhere.
Video games might be different, but super-high frame rates for broadcast are pointless.
Motion blur is part of how moving images work. Indeed, there are some cinematic advantages in having the blur in frames rather than the eye having to reconstruct it through persistence of vision. Bear in mind that the great majority of films used to be shot at 24fps, with a two-blade shutter to greatly reduce flickering.
Indeed much electronic capture is also done at 24 or 25fps to maintain a "cinematic" look.
What a high (video) frame rate can help with is flicker, through higher refresh rates, and there's some evidence that 100Hz is preferable to 50Hz in that respect. But high refresh rates have nothing to do with "motion blur". High refresh rates also don't have to be present in the source; they can be synthesised by the display device.
Re: Worst article in ages
"I feel violated, thanks for wasting my time."
What a drama queen...
Re: If prices go up, we’ll know whom to blame.
I cannot imagine that the UK (which has a veto) would agree to Scotland joining the EU unless it had a Schengen opt-out. I also expect that will be agreed as part of any independence negotiations, as part of the agreement to have an open border with Scotland. Ireland is, after all, not part of the Schengen agreement. It may, or may not, cause accession problems with the EU. Just one of those unknowns.
Scotland would, like any independent state, be legally able to nationalise any asset on its land. However, it would have to pay compensation or face enormous international consequences (especially if it were a member of the EU).
So the nationalisation of the assets (money aside) is not really the issue. The problem is untangling a highly integrated organisation and, especially, the IT systems which are largely split on functional, not regional grounds. Also the IPRs would remain with the registered company (in the UK) as would any licencing deals. So, plenty of work for lawyers, accountants and IT people...
Re: More scare mongering
Indeed. The Scots could only nationalise the assets (for which they'd have to pay compensation or become an international pariah). However, their real problem will be trying to separate off what is a tightly integrated national company with common IT systems (most of them split on a functional, not regional, basis). All the IPRs for those systems will reside with the original company (registered in the UK), and you can bet that every single software and hardware vendor will require a renegotiation of their licences and support contracts for what will be a new organisation.
However, the big problem will be splitting all those IT systems into separate, national ones. That's, to put it mildly, a huge logistical problem fraught with difficulties and costs. Of course they could decide to just nationalise the assets covered by Openreach and, maybe, BT Wholesale. I'm sure such issues could be fixed, eventually, but it would come at a considerable cost.
Another interesting little issue is who would, in the event of such a renationalisation, be responsible for the part of the pension deficit attributable to BT pensioners (current and future). The employment contracts will have been variously made with the GPO, the Post Office and BT, so I would expect some form of split of responsibility.
So plenty of work for the lawyers, accountants and software people.
nb. a side-effect of the Ofcom/BT settlement into three separate entities is that it would be operationally much easier to split along the OR/BTW/BTR boundaries than on regional lines. The latter formed no part of Ofcom's considerations.
Re: If prices go up, we'll know who to blame.
Dead easy. It doesn't matter what the country is called, but the Scots will legally be leaving the UK. It's also well established in international law that where a relatively small part (in population) of a state gains independence, the "continuing state" effectively remains signatory to international treaties.
Not one single politician of any note that I've heard of, whether UK or EU, has questioned that principle.
Scotland can't "stay" in the EU as it would be a new country and would have to be admitted as such. As for the UK leaving the EU, that's predicated on the Conservatives getting an overall majority (looking unlikely) and then getting a "yes" to leave after the results of any renegotiation.
nb. one benefit to the Scots of not being a member of the EU is that they would have full control of their fishing waters. The Norwegians do a rather better job of administering theirs than the EU does of what it views as, in effect, a common asset.
That 2-3% figure of land occupied by buildings, according to the UK National Ecosystem Assessment is what's left of the 10.6% of England categorised as urban after removing the space occupied by gardens, allotments, parks, playing fields, open water and other green spaces.
So yes, you can build more, but at the expense of higher density housing, loss of green spaces, loss of gardens and a general reduction in amenities. Also, all that new housing requires amenities so that 2-3% number is highly misleading.
The real problem is too many people, but as nobody is going to come up with a solution to that any time soon, we are stuck with it.
So it's certainly possible to increase the supply of housing and, probably, decrease the price, but the environment will get more crowded, there will be less public and private space per person and it will be made even worse by attracting yet more people into the most overcrowded parts of the country. The price of space will continue to go up with the population.
The demand comes not only from people already resident in the UK but also from those who aspire to become residents.
Just keep near the surface
I'd be a bit surprised if something significant did happen to anybody taking a swim in a spent nuclear fuel pool, unless they were unwise enough to try to swim down near the fuel (even then, exposure would be limited, as there's a limit to how long people can hold their breath; that's assuming nobody is using diving equipment). After all, besides cooling, the pools are designed to be deep enough to shield anybody at the surface from significant levels of radiation. They only need to be about 6 m deep for that purpose, and are all at least twice that depth.
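To see why a few metres of water is such an effective shield, here's a back-of-envelope sketch of exponential gamma attenuation. The attenuation coefficient used (roughly 0.086 per cm for ~662 keV Cs-137 gammas in water) is a textbook ballpark I'm assuming for illustration, and the calculation ignores build-up, geometry and everything else a real pool designer would worry about:

```python
import math

# Assumed ballpark: linear attenuation coefficient for ~662 keV gammas
# (Cs-137) in water, in units of per centimetre. Illustrative only.
MU_WATER_PER_CM = 0.086

def attenuation_factor(depth_m: float) -> float:
    """Fraction of gamma intensity surviving `depth_m` metres of water
    (simple exponential model; ignores build-up and geometry)."""
    return math.exp(-MU_WATER_PER_CM * depth_m * 100)

for depth in (1, 3, 6):
    print(f"{depth} m of water: intensity down to a factor of "
          f"{attenuation_factor(depth):.2g}")
```

Even on this crude model, 6 m of water knocks the intensity down by something like twenty orders of magnitude, which is why the dose at the surface is dominated by ordinary background rather than the fuel below.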
Re: I don't get it @Tom 64
Supercavitation does not vapourise and recondense all the liquid in the path of the "missile", just a relatively small proportion. The process is used to generate the bubble in which the object travels and massively reduces surface friction.
Most of the water will be displaced, but due to the very low friction the energy used in pushing the water aside will (mostly) be returned when it collapses back together after the "missile" bubble has passed.
Re: Great, maybe...
The crazy thing is when you see somebody storming down the motorway at 70+ with rear fog lights on. Either visibility is good enough that you don't need your fog lights turned on, or it's bad enough that you shouldn't be doing anything like 70 mph.
Personally I hate rear fog lights being turned on at the slightest hint of mist or rain. Not only do they dazzle, but they also mask brake lights as both are rated the same (23 W I think).
Rear fog lights should only be used in really poor visibility, when you should be travelling relatively slowly.
Rather than read this fairly workaday piece by Tim Worstall, I'd suggest getting a copy of "Red Plenty" by Francis Spufford. It's a semi-fictionalised account of the attempt to build a planned economy in the Soviet Union, mixing real and invented characters. It paints a picture of idealistic economists trying to produce workable models and to reconcile them with party dogma. It also portrays the role played by black marketeers and the attempts to cope with the rigidity of the system. For good measure, it's also got an evocative picture of how lung cancer develops and an incredibly painful insight into what might be called a Soviet baby factory.
It's written with a good deal of sympathy for those who were young and idealistic and is no simple condemnation of the system.
It's a bit difficult to categorise as a book, but as with all Francis Spufford's books it's written with enormous panache and style.
Re: Mr Hawking – Over-rated - Big Bang Mythology
Revered worldwide I'd say. Hardly just a UK phenomenon. However, you surely know the answer. It is that image of a great mind (and it is a great mind) trapped in a flawed body. Icons matter.
Absolutely. They should be reporting on percentile figures, like medians, quartiles and so on. A far better way of characterising statistics where there is a huge disparity between the bottom and the top of the range. That's why median income level is favoured over average. It far better represents the "typical" experience.
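A toy example makes the point: with nine modest incomes and one very large one (all the figures below are made up), the mean gets dragged far above anything a typical household actually earns, while the median barely moves:

```python
from statistics import mean, median

# Hypothetical annual incomes for ten households: nine modest, one huge.
incomes = [18_000, 20_000, 21_000, 22_000, 24_000,
           25_000, 27_000, 30_000, 35_000, 2_000_000]

print(f"mean:   {mean(incomes):,.0f}")    # pulled way up by the one outlier
print(f"median: {median(incomes):,.0f}")  # still close to the typical household
```

Here the mean works out to over 200,000 while the median sits near 25,000, which is why the median is the better summary of the "typical" case whenever a distribution has a long tail.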
Re: even if he did...
It's absolutely the same in the UK (with the same issues over interpretation). If somebody acts as to subvert an injunction, then they may - and I repeat may - be committing a criminal act.
Of course it's no surprise that US and UK jurisdictions are similar in this area as both are based on the same common law roots.
There are two interesting things here. The first is the reach of the law. A UK citizen resident in the UK is an easy target. The second, and maybe more worrying, is if the scope was ever extended to those who give advice on how to bypass injunctions on ISPs. The latter would make a huge number of people vulnerable, but as the common law in this area is not well established, who knows.
Incidentally, in this case I suspect the USA authorities will (eventually) close loopholes as the US is, if anything, far more intent on "protecting" IPR than is even the UK or EU authorities. I think there are already treaties being discussed...
The real meaning of DQ...
The real meaning of "Digital Quotient" is not so much how technically savvy somebody is, but rather a measure of how susceptible they are to becoming consumers of the latest products and services from Internet and gadget companies.
So not so much a reflection of understanding, but more a statement of fashion sense in the world of being a consumer of hi-tech services and products...
Re: Jealousy reigns
DEC Alpha didn't streak away from anything. DEC got into trouble because it became greedy and complacent, taking its customer base for granted and charging ridiculous amounts for such basics as TCP/IP stacks. When a whole host of UNIX-based "hot box" companies came along, customers voted with their wallets for what they saw as a more open market where competition drove down pricing. DEC's reaction was too little and too late, and VMS became just a legacy product with high costs.
Of course, the independents selling UNIX-based kit all gradually succumbed to competition and the impact of commodity hardware and the growth of Linux and Windows servers. Only the big boys survived, and even Sun fell into the hands of Oracle.
Re: A little alarmist...
The decision to open up the spectrum is not the same as actually having to deploy it. I know there are border issues so that adjacent countries have to coordinate allocation of bandwidth at a detail level, but that's mostly an issue for countries with land borders and not the UK (well, unless Scotland votes "yes").
So I don't see any reason why if country A wants to use a given band for mobile usage why country B has to implement it.
(nb. I am aware that there are actually issues with overlapping bandwidths between France and the UK such that transmission powers and direction are actually something which are subject to agreement at the narrower parts of the channel, but I don't think it changes the principle).
A little alarmist...
Unlike the analogue switchover, politicians will get involved with this if Freeview was to be crippled. The reason there was no big outcry over loss of analogue was that DTV gave clear advantages. More channels, better quality (assuming the broadcaster didn't choose stupidly low bit rates) plus all the advantages of PVRs. All those VHS recorders had clearly had their day and we were also at the point where people were swapping out old CRTs for flat screens. In the big picture of things, the costs were modest and most people didn't have to do much save buy the right receiving equipment.
In short, all the planets were in alignment and it was, bar a few exceptions, clearly in (almost) everybody's interest. However, if the Freeview platform itself came under threat, then expect all hell to break loose. There are lots of issues with broadband as a universal platform: holes in the coverage, cost to viewers and reliability. If your broadband goes down at the moment, you've still got the option of watching TV. If that's dependent on the same delivery system, then you're out of luck until it's fixed.
That leaves Freesat, but not everybody wants to stick a dish on their house (assuming they've got a reasonable location), and it means chucking away serviceable hardware for no obvious advantage. In a house with many receivers there's also a distribution problem: it's not as simple as putting a distribution amp in the loft space.
If there are sufficient grumbles, then it won't work politically. Think of all the issues that have arisen with the proposed shutdown of FM radio. People see very few advantages, but what they do get is a situation where their car radio won't work without an expensive swap-out or some nasty kludge which means their steering-column controls don't work.
nb. I see little point in 4K as, in the average UK living room, people sit far too far away from the TV screen to see any difference unless you buy a simply huge TV screen. Much better that any more bandwidth or improved codecs are used to enhance HD quality (which is often abysmal).