
* Posts by Steven Jones

1328 posts • joined 21 May 2007

Britain's housing crisis: What are we going to do about it?

Steven Jones
Bronze badge

That 2-3% figure for land occupied by buildings is, according to the UK National Ecosystem Assessment, what's left of the 10.6% of England categorised as urban after removing the space occupied by gardens, allotments, parks, playing fields, open water and other green spaces.

So yes, you can build more, but at the expense of higher density housing, loss of green spaces, loss of gardens and a general reduction in amenities. Also, all that new housing requires amenities so that 2-3% number is highly misleading.

The real problem is too many people, but as nobody is going to come up with a solution to that any time soon, we are stuck with it.

So it's certainly possible to increase the supply of housing and, probably, decrease the price, but the environment will get more crowded, there will be less public and private space per person and it will be made even worse by attracting yet more people into the most overcrowded parts of the country. The price of space will continue to go up with the population.

The demand comes not just from people already resident in the UK, but also from those with aspirations to become residents.

0
1

Love XKCD? Love science? You'll love a book about science from Randall Munroe

Steven Jones
Bronze badge

Just keep near the surface

I'd be a bit worried if something significant did happen to anybody taking a swim in a spent nuclear fuel pool unless they were unwise enough to try to swim down near the fuel (even then, exposure will be limited as there's a limit to how long people can hold their breath; that's assuming nobody is using diving equipment). After all, besides cooling, the pools are designed to be deep enough to shield anybody at the surface from significant levels of exposure to radiation. They only need to be about 6m deep for that purpose, and are all at least twice that depth.

7
0

China building SUPERSONIC SUBMARINE that travels in a BUBBLE

Steven Jones
Bronze badge

Re: I don't get it @Tom 64

Supercavitation does not vaporise and recondense all the liquid in the path of the "missile", just a relatively small proportion. The process is used to generate the bubble in which the object travels and massively reduces surface friction.

Most of the water will be displaced, but due to the very low friction the energy used in pushing the water aside will (mostly) be returned when it collapses back together after the "missile" bubble has passed.

1
0

Bright lights, affordable motor: Ford puts LED headlights onto Mondeo

Steven Jones
Bronze badge

Re: Great, maybe...

The crazy thing is when you see somebody storming down the motorway at 70+ with rear fog lights on. Either visibility is good enough that you don't need your fog lights turned on, or it's poor enough that you shouldn't be doing anything like 70 mph.

Personally I hate rear fog lights being turned on at the slightest hint of mist or rain. Not only do they dazzle, but they also mask brake lights, as both are rated the same (21 W, I think).

Rear fog lights should only be used in really poor visibility, when you should be travelling relatively slowly.

6
0

Govt control? Hah! It's IMPOSSIBLE to have a successful command economy

Steven Jones
Bronze badge

Red Plenty...

Rather than read this fairly workaday piece by Tim Worstall, I'd suggest getting a copy of "Red Plenty" by Francis Spufford. It's a semi-fictionalised account of the attempt to produce a planned economy in the Soviet Union, mixing in real characters. It paints a picture of idealistic economists trying to produce workable plans and attempting to reconcile them with party dogma. It also portrays the role played by black marketeers and the attempts to cope with the rigidity of the system. For good measure, it's also got an evocative picture of how lung cancer develops and an incredibly painful insight into what might be called a Soviet baby factory.

It's written with a good deal of sympathy for those who were young and idealistic and is no simple condemnation of the system.

It's a bit difficult to categorise as a book, but as with all Francis Spufford's books it's written with enormous panache and style.

http://www.amazon.co.uk/Red-Plenty-Francis-Spufford/dp/0571225241/ref=sr_1_1?ie=UTF8&qid=1407933868&sr=8-1&keywords=red+plenty

2
0

Stephen Hawking biopic: Big on romance, not so much with the science?

Steven Jones
Bronze badge

Re: Mr Hawking – Over-rated - Big Bang Mythology

Revered worldwide I'd say. Hardly just a UK phenomenon. However, you surely know the answer. It is that image of a great mind (and it is a great mind) trapped in a flawed body. Icons matter.

1
0

America's hot and cold spots for broadband revealed in new map

Steven Jones
Bronze badge

Re: Average?

@Ole Juul

Absolutely. They should be reporting percentile figures: medians, quartiles and so on. That's a far better way of characterising statistics where there is a huge disparity between the bottom and the top of the range. It's why median income level is favoured over the average; it far better represents the "typical" experience.
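
A minimal sketch of the point in Python, using made-up download speeds where a couple of very fast connections drag the mean well above anything the typical user sees:

import statistics

# Hypothetical download speeds (Mbps) for ten premises; most are modest,
# two are on very fast connections. Figures are purely illustrative.
speeds = [2, 3, 3, 4, 5, 5, 6, 8, 100, 300]

mean_speed = statistics.mean(speeds)      # pulled up by the two outliers: 43.6
median_speed = statistics.median(speeds)  # the "typical" experience: 5.0
quartiles = statistics.quantiles(speeds, n=4)

print(f"mean   = {mean_speed:.1f} Mbps")
print(f"median = {median_speed:.1f} Mbps")
print(f"quartiles = {quartiles}")

The mean says the "average" premises gets over 40 Mbps; the median says half of them get 5 Mbps or less, which is exactly the disparity the percentile figures expose.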

2
0

London cops cuff 20-year-old man for unblocking blocked websites

Steven Jones
Bronze badge

Re: even if he did...

It's absolutely the same in the UK (with the same issues over interpretation). If somebody acts so as to subvert an injunction, then they may - and I repeat may - be committing a criminal act.

Of course it's no surprise that US and UK jurisdictions are similar in this area as both are based on the same common law roots.

There are two interesting things here. The first is the reach of the law: a UK citizen resident in the UK is an easy target. The second, and maybe more worrying, is whether the scope might ever be extended to those who give advice on how to bypass injunctions on ISPs. The latter would make a huge number of people vulnerable, but as the common law in this area is not well established, who knows.

Incidentally, in this case I suspect the US authorities will (eventually) close the loopholes, as the US is, if anything, far more intent on "protecting" IPR than even the UK or EU authorities are. I think there are already treaties being discussed...

2
0

Brit kids match 45-year-old fogies' tech skill level by the age of 6

Steven Jones
Bronze badge

The real meaning of DQ...

The real meaning of "Digital Quotient" is not so much how technically savvy somebody is, but rather a measure of how susceptible they are to being consumers of the latest products and services from Internet and gadget companies.

So not so much a reflection of understanding, but more a statement of fashion sense in the world of being a consumer of hi-tech services and products...

8
1

Call off the firing squad: HP grants stay of execution to OpenVMS

Steven Jones
Bronze badge

Re: Jealousy reigns

DEC Alpha didn't streak away from anything. DEC got into trouble because it became greedy and complacent, taking its customer base for granted and charging ridiculous amounts for such basics as TCP/IP stacks. When a whole host of UNIX-based "hot box" companies came along, customers voted with their wallets and moved to what they saw as a more open market where competition drove down pricing. DEC's reaction was too little and too late, and VMS became just a legacy product with high costs.

Of course the independents selling UNIX-based kit all gradually succumbed to competition, the impact of commodity hardware and the growth of Linux and Windows servers. Only the big boys survived, and even Sun fell into the hands of Oracle.

1
0

4K video on terrestrial TV? Not if the WRC shares frequencies to mobiles

Steven Jones
Bronze badge

Re: A little alarmist...

The decision to open up the spectrum is not the same as actually having to deploy it. I know there are border issues, so adjacent countries have to coordinate allocation of bandwidth at a detailed level, but that's mostly an issue for countries with land borders and not the UK (well, unless Scotland votes "yes").

So I don't see any reason why, if country A wants to use a given band for mobile usage, country B has to follow suit.

(nb. I am aware that there are issues with overlapping frequency bands between France and the UK, such that transmission powers and directions are subject to agreement across the narrower parts of the Channel, but I don't think that changes the principle).

0
0
Steven Jones
Bronze badge

A little alarmist...

Unlike the analogue switchover, politicians will get involved with this if Freeview were to be crippled. The reason there was no big outcry over the loss of analogue was that DTV gave clear advantages: more channels, better quality (assuming the broadcaster didn't choose stupidly low bit rates), plus all the advantages of PVRs. All those VHS recorders had clearly had their day, and we were also at the point where people were swapping out old CRTs for flat screens. In the big picture of things, the costs were modest and most people didn't have to do much beyond buying the right receiving equipment.

In short, all the planets were in alignment and it was, bar a few exceptions, clearly in (almost) everybody's interest. However, if the Freeview platform itself came under threat, then expect all hell to break loose. There are lots of issues with broadband as a universal platform: holes in the coverage, cost to viewers and reliability. If your broadband goes down at the moment, you've still got the option of watching TV. If that's dependent on the same delivery system, then you're out of luck until it's fixed.

That leaves Freesat, but not everybody wants to stick a dish on their house (assuming they've got a reasonable location), and it means chucking away serviceable hardware with no obvious advantage. In a house with many receivers there's also a distribution problem. It's not as simple as putting a distribution amp in the loft space.

If there are sufficient grumbles, then politically it won't work. Think of all the issues that have arisen with the proposed shutdown of FM radio. People see very few advantages, but what they do get is a situation where their car radio won't work without an expensive swap-out or some nasty kludge which means their steering-column controls don't work.

nb. I see little point in 4K as, in the average UK living room, people sit far too far from the screen to see any difference unless they buy a simply huge set. Much better that any extra bandwidth or improved codecs are used to enhance HD quality (which is often abysmal).

3
0

Scotland's BIG question: Will independence cost me my broadband?

Steven Jones
Bronze badge

Re: Promises

From what I've read, it would appear those born in Scotland would automatically be granted Scottish citizenship, but other nationalities resident there, or those with other claims (like Scottish parents or grandparents), would have to apply and be granted citizenship by whatever means is put in place.

An interesting question is going to be which government will be responsible for paying state pensions (or, for that matter, paying out occupational pensions for government employees where there's no funded scheme - such as civil servants). Entitlement to the state pension will have been earned by NI contributions whilst in the UK, but as there's no funded scheme, it just comes out of general taxation. Are future old age pensions then going to be paid according to residency at the time of independence? Or is it going to be by nationality (in which case, what about the very large number who will have dual nationality)?

In all, this is just one of thousands of detailed issues that arise from independence, and it's all meant to be sorted out (amicably) in a couple of years.

nb. interesting question - who will represent the interests of the "RUK" in the event of a yes vote? I would expect citizens of the RUK and Scotland respectively to want the best possible deal for themselves. As the "yes" vote will have been taken in the absence of the actual terms of independence, what happens if agreement isn't reached? Fascinating stuff.

5
2
Steven Jones
Bronze badge

Splitting utilities on geographical lines?

Should Scotland become independent, then I can see that, if the regulation becomes more onerous, it will be in the interests of several utilities, and their respective shareholders, to split into separate subsidiaries on geographic lines and even, perhaps, to divest them as wholly separate companies.

There would be tricky issues to resolve, especially where, historically, services such as IT are fully integrated. This could lead to a considerable increase in costs. But, if the regulatory regimes across the border are very different, then this would inevitably increase costs too.

In the case of some utilities (including BT), it may not be in the interests of Scots, as I'm pretty sure that the (overall) lower population densities and greater distances involved mean that the network north of the border is more expensive to maintain (per property) than that of the more densely populated rest of the UK. That might lead to wholly different calculations for things like wholesale line costs.

Of course it might leave some interesting questions on how pension deficits are to be funded, although I imagine such liabilities could also be split on geographical lines.

4
0

Stick a 4K in them: Super high-res TVs are DONE

Steven Jones
Bronze badge

Re: aware of the benefits of 4K

Personally, I don't see that my experience of a film is going to be much improved, if at all, by tiny perceptual differences like that unless I have a dedicated home cinema room. Even then, I'd gladly swap some sort of techno-fix for quality of writing, acting, plotting and screenplay. Measuring film quality by pixels strikes me as a strange game.

It's reckoned that at 2m or so, you need a 55 inch screen or more to perceive any difference in 4K. As I sit 2.6m away that makes for something more like 75 inches. Far too big for my room. Really something for those with dedicated home cinemas.

nb. the "streamers" ought to think about upgrading their HD bandwidths. If they want to use 10mbps+, then that's probably going to make a bigger qualitative difference to the "viewing experience" than heavily compressed 4K.

15
1

Yorkshire cops fail to grasp principle behind BT Fon Wi-Fi network

Steven Jones
Bronze badge

Re: Not wanting to defend plod, but

There's no "alleged tracability". If you connect via FON, you arrive on the Internet via a completely public IP address using logon credentials. The traffic is just as traceable as that for any traffic coming from a device connected to your home network. Indeed, more so as there are not credentials passed from devices on your home network to the ISP (unless there's a back-door in the router which logs MAC addresses and sends them to the ISP).

Of course, somebody could always steal your details, but that is true of any public network where you log on with a userid and password. Indeed, it's true if somebody gains access to your home network logon details (how many people freely give their WiFi passwords to friends and family to put in their phones and other devices, and how secure are those?). The only systems which are really proof against stolen details are those where one-time password devices are required.

This is plod knowing a little and thinking he's somehow qualified to lecture the world. If he wants a security problem to worry about, then it's about accessing public networks at all. It would be pretty easy to mimic a BT FON connection.

2
0

ARM: We've signed 41 new deals and we are IN to the Internet Of Stuff

Steven Jones
Bronze badge

What's the real market size of x86 vs ARM

ARM is a minnow in comparison with Intel. However, this surely gives an entirely misleading view of the relative value of the ARM processor/GPU market vs that of the Intel x86/GPU one. It would be interesting to see such a comparison to give an idea of the relative financial strengths of the two competing architectures.

0
0

FLAPE – the next BIG THING in storage

Steven Jones
Bronze badge

"El Reg wants to know: Could a disk read/write head work on more than 1 track at a time? Wouldn’t that increase disk I/O bandwidth?"

In theory, yes, but in practice (with modern high-density drives) it's not practicable. It's certainly not possible to put two completely independent head mechanisms on a disk due to vibration and air-flow issues. The other option, a single head assembly with multiple read/write elements, comes right up against the problem that the tracks are simply far too close together on disk to be written simultaneously. I suppose you might conceivably have some sort of staggered arrangement whereby the "parallel" tracks are written a little distance apart, but I suspect problems would remain over heat dissipation, the size and weight of the head, error recovery and so on.

The reason that reading/writing multiple tracks on tape is practical is that they are much wider apart (and the heads don't have to be moved fast).

nb. I seem to recall back in the days of fixed-head disks there were some that could work in parallel, although that may be my imagination.

0
0

BT: Whew, we've been cleared of major privacy breach. Oh SNAP, another webmail blunder

Steven Jones
Bronze badge

Re: Why is BT relying on a US supplier for webmail?

What in-house resources? The great majority of development and support is off-shored in an attempt to keep costs down. Buying in solutions from specialists is the norm in business these days due to economies of scale (did you not notice the previous system was run by Yahoo! ?).

1
1

10Gbps over crumbling COPPER: Boffins cram bits down telco wire

Steven Jones
Bronze badge

Re: "...can't I pay a regular ADSL type service charge?"

ISPs already offer TV & films over broadband in the UK (as do the likes of Netflix). You can buy them as bundles. However, there's a major difference in the regulatory regime in that there's no option to cross-subsidise infrastructure roll-out from retail revenues. Wholesale line rental is subject to extremely tight regulation, and whilst FTTC wholesale pricing is not regulated as yet, it has to be sold as a wholesale service to all operators.

The consequence? A lot of retail competition, but not much money for infrastructure investment.

0
0
Steven Jones
Bronze badge

Re: not round here

VDSL and FTTC make no difference to the copper run to the exchange. Apart from passing through a low-pass filter at the cabinet, there's no difference. Indeed, if you had ADSL before, the line would have passed through a similar low-pass filter at the exchange (plus a similar low-pass filter in your house). Unless there was a poor connection made at the green box, I can't see why there would have been any difference to call quality at all.

0
0

Russian law will force citizens' personal data to be stored locally

Steven Jones
Bronze badge

They could make business more difficult.

The Russians might not easily be able to prevent their citizens using Facebook, Twitter or the like, but they can make it very difficult for such services to make money from local advertising or local financial transactions. Of course, with services like eBay or Amazon, it's essentially impossible to operate without some form of local presence.

Bear in mind the US authorities became very aggressive with companies that offered on-line gambling to their citizens.

4
0

BT and TalkTalk BOTH claim victory as Ofcom tackles fibre price row

Steven Jones
Bronze badge

Re: That old horse:

BT ducting is available under PIA.

http://www.openreach.co.uk/orpg/home/products/ductandpolesharing/ductandpolesharing/ductandpolesharing.do

6
1

How practical is an electric car in London?

Steven Jones
Bronze badge

Re: @ecofeco

Indeed. The only way that the ICE will disappear is if somebody comes up with a cost effective fuel cell which can work off high energy content liquid fuels (perhaps ethanol). I discount liquid (or compressed) hydrogen as producing it is thermodynamically highly inefficient and it's tricky to store and distribute.

Batteries have fundamental capacity constraints dictated by electrochemistry. Lithium is already just about the best candidate we have, as it is the third lightest of the elements, but it still has very poor energy storage (in battery form) when compared to hydrocarbon fuels. Battery powered vehicles could well have a role in short range, commuting and delivery functions, but not for long range delivery or a general purpose family vehicle. For those, some form of easily transportable liquid fuel will surely still be best, and at the moment, the ICE is what we have.
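
For a rough sense of the gap, here's a back-of-envelope sketch; the energy densities and drivetrain efficiencies below are ballpark assumptions of mine, not figures from the article:

# Ballpark figures, all approximate and for illustration only.
LI_ION_KWH_PER_KG = 0.25     # ~250 Wh/kg at cell level
PETROL_KWH_PER_KG = 12.9     # ~46 MJ/kg
EV_DRIVETRAIN_EFF = 0.90     # battery-to-wheels efficiency
ICE_DRIVETRAIN_EFF = 0.25    # tank-to-wheels efficiency

useful_ev = LI_ION_KWH_PER_KG * EV_DRIVETRAIN_EFF    # ~0.23 kWh/kg at the wheels
useful_ice = PETROL_KWH_PER_KG * ICE_DRIVETRAIN_EFF  # ~3.2 kWh/kg at the wheels

print(f"raw energy density ratio (petrol/battery): {PETROL_KWH_PER_KG / LI_ION_KWH_PER_KG:.0f}x")
print(f"useful (at-the-wheels) ratio:              {useful_ice / useful_ev:.0f}x")

Even after crediting the electric drivetrain with far better efficiency, petrol still delivers an order of magnitude more usable energy per kilogram, which is the constraint described above.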

16
1

NASA beams vid from space via laser

Steven Jones
Bronze badge

Wildly misleading NASA claim. This is why...

NASA's claim is wholly misleading. A 1 metre diameter receiving dish might well subtend approximately the same angle as the diameter of a human hair at 30 feet, but that's not the most important factor. What is far more important is the divergence of the laser beam, which you can guarantee has spread it to far more than a metre across by the time it hits the Earth's surface.

Human hair isn't a great standard measure, as the size varies a lot. However, if we take 2/1000th of an inch, it will subtend an angle of about 5 micro-radians at 30 feet. To a good degree of approximation, laser beam divergence depends on the minimum (waist) diameter of the beam and the (1,500nm) wavelength. If we take a reasonable beam "waist" of 1mm, that gives a radial divergence angle of about 470 micro-radians. In other words, the degree of precision required is, perhaps, only about 1/100th of that claimed. Also, of course, the ISS moves in a rather smoother, and more predictable, manner than a human being walking.

To put this in perspective, it's reckoned that competition-level target rifles can manage accuracies as fine as 100 micro-radians, albeit not hand-held, of course.

Plug in the minimum ISS orbital height of 350km, and you get a beam spread of about 160 metres, so the receiver only has to be somewhere in that area. Of course the transmission was likely at a considerably greater distance than the minimum ISS orbital height, but that doesn't change the degree of accuracy required.
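
For what it's worth, here's the back-of-envelope sketch in Python, using the standard Gaussian far-field divergence formula (half-angle ~ wavelength / (pi * waist radius)). I've assumed the quoted 1mm waist is the radius; read it as a diameter, or quote the full angle instead of the half angle, and the numbers shift by a factor of two, but either way the spot at ISS range dwarfs a 1 metre dish:

import math

WAVELENGTH = 1500e-9            # m, the quoted ~1,500 nm
HAIR_DIAMETER = 2e-3 * 0.0254   # 2/1000 inch, in metres
HAIR_DISTANCE = 30 * 0.3048     # 30 feet, in metres
WAIST_RADIUS = 1e-3             # m, assumption: the 1 mm figure taken as the waist radius
ISS_ALTITUDE = 350e3            # m, minimum ISS orbital height

# Angle subtended by a human hair at 30 feet (small-angle approximation)
hair_angle = HAIR_DIAMETER / HAIR_DISTANCE

# Far-field divergence of an ideal Gaussian beam
divergence = WAVELENGTH / (math.pi * WAIST_RADIUS)

# How far the beam has spread after the minimum ISS altitude
spot_radius = divergence * ISS_ALTITUDE

print(f"hair at 30 feet subtends ~{hair_angle * 1e6:.1f} microradians")
print(f"beam divergence          ~{divergence * 1e6:.0f} microradians")
print(f"spot radius              ~{spot_radius:.0f} m at {ISS_ALTITUDE / 1e3:.0f} km")

That prints roughly in line with the figures quoted above (~5 microradians for the hair, ~470 microradians for the beam and a spot on the order of 160 metres), i.e. the pointing job is around a hundred times easier than NASA's comparison implies.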

ps. sorry about the mixed units, but that's NASA for you, quoting wavelengths in nm and distances in feet. You'd have thought by now, having crashed a probe due to mixing up systems, they'd have stuck with the metric system.

0
0

Vodafone: SPOOKS are plugged DIRECTLY into our network

Steven Jones
Bronze badge

Re: Stop taking GCHQ money the first place Vodafone Executives!!!

There's no "allowing" involved. It will be mandated by government.

0
0

Oh, wow. US Secret Service wants a Twitter sarcasm-spotter

Steven Jones
Bronze badge

A bit late for Paul Chambers

If the UK authorities had such a filter, it might just have saved Paul Chambers quite a lot of trouble, as it was fairly clear that the police, prosecution and lower levels of the judiciary were completely unable to do it using normal judgement.

5
0

ARM to open CPU design centre in Taiwan

Steven Jones
Bronze badge

Re: Why Taiwan?

It's very common for multi-national companies to spread their research around the globe. It's particularly important for the likes of ARM, which has a business model that involves licensing technology and working as a partner with customers. By having research facilities close to their main customers, they can have a much closer and more responsive relationship than would be possible from a purely UK base, with all the travel, language and time zone issues. It also has the side-benefit of being seen as a true partner, and not just a foreign supplier.

As far as China is concerned, that's a bit of a different issue. Having research facilities in Taiwan might not be looked upon very favourably by the Chinese government.

0
0

Vodafone turns to EU, asks it to FORCE 'fair' fibre pricing

Steven Jones
Bronze badge

Sensational headline trumps reading The Register's own news items...

Indeed, it's hardly unusual for The Register to confuse telecoms products. In 2012 the Register actually published an item on Ofcom's proposed regulation of BT's leased-line pricing outside London. However, a sensationalist headline seems to trump proper research, even when it's stuff they've published themselves.

http://www.theregister.co.uk/2012/07/05/ofcom_mulls_price_cap/

1
0

Gee thanks, Ofcom! BT 'pleased' to hang onto pricing 'freedom' for Openreach fibre product

Steven Jones
Bronze badge

Re: if BT are delighted......

Whilst it's true that the first tranche of BT shares was sold at £1.30, there were two further tranches at £3.35 and £4.10 respectively. Also, looking at the RPI index, you have to apply a factor of 2.46 to that first tranche price to correct for inflation, so that £1.30 is equal to £3.20 at today's prices. As the two other tranches were in 1991 and 1993 respectively, the effect of inflation was not so high, but you are still looking at equivalent prices well over £5 (although there were staged payments for all offerings). Against that, there are more shares in circulation now (due to a share option issue), which does dilute the price, but then the shareholders stumped up that extra cash so it really balances out. A further calculation ought to involve the O2 offload, but as that business scarcely existed in 1985, it's not really very relevant.
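
The arithmetic, as a quick sketch; the 2.46 RPI factor for the first tranche is the one above, while the factors for the 1991 and 1993 tranches are rough assumptions of mine for illustration only:

# BT tranche issue prices and approximate RPI uplift factors to today's money.
# The 2.46 factor is quoted above; the other two are rough assumptions.
tranches = {
    "1984 (first tranche)":  (1.30, 2.46),
    "1991 (second tranche)": (3.35, 1.75),  # assumed factor
    "1993 (third tranche)":  (4.10, 1.65),  # assumed factor
}

for label, (issue_price, rpi_factor) in tranches.items():
    adjusted = issue_price * rpi_factor
    print(f"{label}: £{issue_price:.2f} then is roughly £{adjusted:.2f} now")

On those (rough) numbers the first tranche comes out at about £3.20 and the later tranches comfortably above £5 in real terms, which is the comparison being made against today's share price.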

If you start doing the mathematics on this, you find that BT's capitalisation is (inflation adjusted) considerably below what it was when the final tranche of shares was issued, even making allowance for all the factors I've mentioned. At the time of privatisation, BT had over 250,000 UK employees, whilst these days it's more like 90,000. As far as UK operations are concerned, it's a much smaller company than before, despite the product range being vastly larger; such is the state of what is now a highly diverse telecoms industry.

0
0

WHOMP! There it is: IBM demos 154TB tape

Steven Jones
Bronze badge

Re: LTO22?

I'm not questioning the future of tape for very large archives and very large backups. For those purposes it's still unbeatable for cost, power consumption and robustness in transport. However, what I'm pointing out is that there's something inherently unscalable about both tapes and disks, and performance becomes a big issue.

nb. I made a bit of an error with the requirement for more heads. In fact, if the number of heads is increased linearly with bit density, then the total read/write time remains fixed. My analysis only works if the number of heads is kept fixed (tracks have to go up, but that's a different matter). LTO has doubled the number of heads in the past, which explains why the total read/write time hasn't increased as much as expected, but this can't go on for ever. So another doubling of head numbers before this theoretical LTO-12 arrives might allow for a total read/write time of about 16 hours.

1
0
Steven Jones
Bronze badge

LTO22?

"Vulture View: With its LTO connections, such an IBM technology could well banish Sony’s impressive 185TB effort to the tape wilderness. Reckoning on LTO generations doubling capacity we could see a hypothetical LTO-22 pass the 154TB mark."

I think Vulture Centre needs a little bit of work on his mathematics. If capacity doubles every generation, it will take 6 generations to go from LTO-6's native 2.5TB to 160TB, which would make it LTO-12 (or LTO-11 if it's compressed capacity at 2:1).

Of course, one of the big problems with a 160TB tape is just how long it takes to read/write the whole thing. Assuming that the bit density increases equally in both directions, if you double the linear density you quadruple the capacity. Even if you double the number of tracks (and therefore read/write heads), you only double the read/write speed, so it will take twice as long to read/write an entire tape as compared to 4 drives of the previous generation. Scale up LTO-6 to LTO-12 and that would, theoretically, mean an uncompressed read/write speed of about 1.28GB per second, or 2.56GB per second on 2:1 compressible data. That means it would take about 34 hours to read/write the whole 160TB. A saving in media space and drives of course, but you create a bottleneck, and all that assumes you can keep doubling the number of heads per generation (unlikely, I would say). It's basically a bottleneck not unrelated to that of (linear) write speed on ever larger disks: capacity goes up with the square of bit density whilst linear access speed only rises linearly.
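
The scaling argument, sketched out; the LTO-6 baseline (roughly 2.5TB native at 160MB/s native) is the usual published figure, and the assumption is that capacity doubles per generation with bit density rising equally in both directions, so linear speed only grows with the square root of capacity:

import math

# Approximate LTO-6 baseline (native, uncompressed).
BASE_CAPACITY_TB = 2.5
BASE_SPEED_MB_S = 160.0

def lto_projection(generations_ahead):
    """Project capacity, speed and full-pass time, assuming capacity doubles
    per generation and linear speed scales with sqrt(capacity)."""
    capacity_tb = BASE_CAPACITY_TB * 2 ** generations_ahead
    speed_mb_s = BASE_SPEED_MB_S * math.sqrt(2 ** generations_ahead)
    full_pass_hours = (capacity_tb * 1e6) / speed_mb_s / 3600
    return capacity_tb, speed_mb_s, full_pass_hours

# Six doublings beyond LTO-6 gives the hypothetical LTO-12 discussed above.
cap, speed, hours = lto_projection(6)
print(f"hypothetical LTO-12: {cap:.0f} TB native, {speed / 1000:.2f} GB/s, "
      f"~{hours:.1f} h for a full read/write pass")

That comes out at 160TB, 1.28GB/s and roughly 35 hours for a full pass, in line with the figures above.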

(Random disk access is another issue which is even worse).

2
0

Chap rebuilds BBC Micro in JavaScript

Steven Jones
Bronze badge

Re: "zippy performance, something the original BBC Micro was not famous for"

@ Jim 59

Complacent? As in designing a brand new RISC processor and a (for those days) very fast computer, complete with a full graphical operating system using such things as scalable typefaces, in the shape of the Archimedes? All on a small fraction of the resources available to the industry giants.

On reflection, it was probably inevitable that Acorn were going to get steamrollered by the sheer mass and inertia of the IBM PC and all the clones that followed. Staying alive in a niche market is about all companies can do in such a tidal wave of commoditisation, and even IBM had to bail out eventually. Only Apple has retained any sizeable market share for an alternative personal computer standard (at least prior to the smartphone/tablet revolution). However, that complacent company you describe left the legacy of ARM, which now designs and licenses by far the most popular CPU architecture (by number) that has ever existed.

1
0
Steven Jones
Bronze badge

Indeed, I've no idea in what era the author first became aware of computers, but the BBC Micro was faster than virtually all of its direct competitors at the time. BBC Basic (which was advanced for the time) was famously much faster than the competition.

The 6502 was clocked at 2MHz whilst some of the competitors ran at 1MHz. Whilst the Z80-based competition was often clocked at 4MHz, this was a somewhat misleading comparison as the latter used more "clock ticks" for most operations (like memory access). The Z80 could be faster in some circumstances as it had more registers to play with, but the 6502 had some tricks of its own in addressing low memory (zero page). Generally I found that my 2MHz BBC Micro outran the 4MHz Nascom II I built.

Anyway, this was long ago, but the BBC was considered pretty fast for its time.

8
0

Now that's PROPER SCIENCE: Boffins teach robo-arm to catch flying beer bottle

Steven Jones
Bronze badge

What's "ultra fast"?

That's impressive, although perhaps the author of the piece might care to define rather more precisely what "ultra fast" means as, literally speaking, it just means "beyond fast". As I'm not at all sure what the limit of "fast" is, I don't know if this is equivalent to a gentle lob, a fast bowler or a speeding bullet.

0
0

Google CAN be told to delete sensitive data from its search results, rules top EU court

Steven Jones
Bronze badge

Re: Barmy (@Psyx)

Driving too fast is most certainly a criminal offence. A criminal offence is (usually) one where the state prosecutes somebody (or an organisation) for breaking the law. Exceptionally, there can be private prosecutions, when a private individual (or some types of organisation) takes on the role of the state as prosecutor. The penalties include such things as fines, community service, being bound over or custodial sentences.

In contrast, civil law is between private individuals or organisations and resolves issues such as contract disputes, nuisance, libel and so on. The penalty for losing such cases is typically a financial award to the plaintiff or various sorts of court order. If there's a fine involved, it is not a civil matter.

What you are probably confusing is whether an offence gets you a criminal record or not. Driving too fast doesn't (in general). However, it most certainly is not a civil offence.

The US does distinguish between (minor) misdemeanours and (major) felonies in criminal law (but neither has anything to do with civil law).

2
1
Steven Jones
Bronze badge

Re: Barmy

It would only violate US law if the filtering extended to those accessing Google from US territory, which, almost certainly, it will not. Indeed if Google were to do such a thing, they'd be in trouble.

Of course that makes it simple to bypass the filtering - just use a proxy service based in the US. I rather suspect any European journalist will use this technique if filtering becomes at all common.

1
0

Minecraft players can now download Denmark – all of it – in 1:1 scale

Steven Jones
Bronze badge

Never mind the nominal scale, it's detail that matters.

1:1 as a definition is clear nonsense. What matters is resolution. In principle, you can zoom any source of mapping information to whatever scale you like, but it's the mapping resolution that gives you the detail. If what is meant by 1:1 scaling is that you can expect to see on your computer screen the level of detail you would see if you were there in person, then that's clearly not the case. Such a model would require vastly more information than a mere 1TB.

So, as always, don't swallow the headline, but find out what it really means.

2
0

ARM brushes off dip in mobile revenues with sunny forecast for coming year

Steven Jones
Bronze badge

Re: Eh?

Simply because the pound has appreciated against the dollar. Of course, if you go back and recalculate the conversion using today's rate, rather than the historical rate applicable at the time, you'll get exactly the same growth rate.
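
A toy illustration of the constant-currency point, with made-up revenue and exchange-rate figures:

# Hypothetical figures: dollar revenue grows 10% year on year,
# but a stronger pound makes the reported sterling number look flat.
rev_usd = {"2013": 100.0, "2014": 110.0}     # $m, illustrative
gbp_per_usd = {"2013": 0.64, "2014": 0.59}   # illustrative historical rates

reported_gbp = {y: rev_usd[y] * gbp_per_usd[y] for y in rev_usd}
constant_gbp = {y: rev_usd[y] * gbp_per_usd["2014"] for y in rev_usd}

def growth(series):
    return (series["2014"] / series["2013"] - 1) * 100

print(f"growth in dollars:                  {growth(rev_usd):.1f}%")       # 10.0%
print(f"growth in reported sterling:        {growth(reported_gbp):.1f}%")  # much lower
print(f"growth at a constant exchange rate: {growth(constant_gbp):.1f}%")  # 10.0%

Restate both years at the same exchange rate and the sterling growth matches the dollar growth exactly, which is the point being made.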

0
0

95 floors in 43 SECONDS: Hitachi's new ultra-high-speed lift

Steven Jones
Bronze badge

Re: One for Newton...

nb. just checking my maths, I'd made a slip, so this is the formal derivation. Always show your work I was told...

Total distance = 440m

Total time is 43 secs

Top Speed is 20 m/s

Rate of acceleration and deceleration are identical

Then take:

a = rate of acceleration (will be -a for deceleration)

t = time take for acceleration (clearly time for deceleration will also be t)

Clearly the acceleration time (t) is simply the top speed (20 m/s) divided by acceleration rate so :-

t = 20 / a

The distance travelled during the acceleration phase is given by the equation

s = ut + 1/2 * at^2

where s = distance travelled

u = initial speed

a = rate of acceleration

t = time

but u = 0, so we get

s = 1/2 * at^2

The distance (and time) travelled during deceleration will be identical. The time spent travelling at top speed will be the total time (43) less the time spent accelerating and decelerating (2t). Substituting 20/a for t (see earlier), and adding in the distances covered during acceleration and deceleration, we get:

(43 - 2t) * 20 + 1/2 * at^2 + 1/2 * at^2 = 440

expanding

860 - 40t + at^2 = 440

rearranging

420 = 40t - at^2

But, we know that t = 20/a so, substituting for t

420 = 800/a - 400/a

therefore

420a = 400

therefore

a = 20 / 21 m/s^2

But t = 20/a, so t = 21 secs

So this actually means accelerating at approx 0.95 m/s^2 for 21 seconds, travelling at 20 m/s for 1 second and then decelerating at approx 0.95 m/s^2 for 21 seconds. So about +/- 0.097g.
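
A quick numerical check of that result, re-plugging the derived values:

# Verify: a = 20/21 m/s^2, 21 s accelerating, ~1 s at top speed, 21 s braking.
TOTAL_DISTANCE = 440.0   # m
TOTAL_TIME = 43.0        # s
TOP_SPEED = 20.0         # m/s
G = 9.81                 # m/s^2

a = 20.0 / 21.0                      # derived acceleration, ~0.95 m/s^2
t_accel = TOP_SPEED / a              # 21 s
t_cruise = TOTAL_TIME - 2 * t_accel  # 1 s
distance = (a * t_accel ** 2         # accel + decel phases (2 * 1/2 * a * t^2)
            + TOP_SPEED * t_cruise)  # cruise phase

print(f"acceleration = {a:.3f} m/s^2 = {a / G:.3f} g")
print(f"phases: {t_accel:.0f} s up to speed, {t_cruise:.0f} s cruise, {t_accel:.0f} s braking")
print(f"distance covered = {distance:.1f} m (target {TOTAL_DISTANCE:.0f} m)")

It comes back with 440.0 m covered in 43 seconds at about 0.097g, confirming the figures above.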

0
0
Steven Jones
Bronze badge

One for Newton...

@TheVogon

Your analysis is flawed as it has a single period of acceleration, which means you'd be exiting the top of the tower at some considerable speed. Modelling the travel as a single period of acceleration is wrong - you have to use two equal periods of acceleration (with opposite signs), use the lift's maximum speed (20 m/s) as the limit and then have the remaining time travelling at top speed.

Assuming acceleration/deceleration at at a constant rate, the maximum speed is 20 m/s, the height traveled is 440m and the time taken is 43s, the lift accelerates for 11 secs at approx 1.82 m/s^2 (110m), then travels for 21 secs at 20 m/s (420m) and then decelerates at approx -1.82 m/s^2 for 11 seconds (110m). The acceleration encountered (net that of gravity) is therefore about +/- 0.185g.

0
2

VAT's all folks: Telecoms and services tax to be set at consumer's homeland rate

Steven Jones
Bronze badge

Re: Next year, I will mostly be living in Luxembourg

I rather think they won't be relying on your IP address to locate you, but on details from your credit card, bank account or other payment method.

0
0

Why won't you DIE? IBM's S/360 and its legacy at 50

Steven Jones
Bronze badge

Re: The first clones...

You are quite right - DME, not DMA (it's been many years). DMA is, of course, Direct Memory Access. I was aware there was a 1900 emulation too, and there were those who swore by the merits of George (if I've remembered the name properly). Of course, the 1900 had absolutely nothing to do with the DNA of the IBM S/360.

0
0
Steven Jones
Bronze badge

The first clones...

It's worth mentioning that in 1965 RCA produced a mainframe which was a semi-clone of the S/360, and almost bankrupted the company in an attempt to compete with IBM. It was binary compatible at the non-privileged code level, but had a rather different "improved" architecture for handling interrupts faster by having multiple (in part) register sets. The idea, in the days when much application code was still written in assembler, was that applications could be ported relatively easily.

The RCA Spectra appeared over in the UK as well, but re-badged as an English Electric System 4/70. Some of these machines were still in use in the early 1980s. Indeed, UK real-time air cargo handling and related customs clearance ran on System 4/70s during this period (as did RAF stores). Of course, English Electric had become part of ICL back in 1968. Eventually, ICL were forced to produce a microcode emulation of the System 4 to run on their 2900 mainframes (a method called DMA) in order to support legacy applications which the government was still running.

In a little bit of irony, the (bespoke) operating systems and applications mentioned were ported back onto IBM mainframes (running under VM), and at least some such applications ran well into the 21st century. Indeed, I'm not sure the RAF stores system isn't still running it...

Of course, this had little to do with the "true" IBM mainframe clone market that emerged in the late 1970s and flowered in the last part of the 20th century, mostly through Amdahl, Hitachi and Fujitsu.

0
0

David Cameron defends BT's taxpayer-funded broadband 'monopoly': It's a 'success story'

Steven Jones
Bronze badge

Re: Gaps?

It was each of the local authorities that carried out the Open Market Reviews under the BDUK framework, and they would have treated all the telecommunications suppliers alike in that respect (although clearly that means VM or BT for the vast majority), and it was they that drew up the intervention areas. Of course it's always going to be difficult for small companies with limited finances to commit expenditure, but frankly that's because in the world of major infrastructure projects the capital requirements are high, as are the risks. If they weren't, there would be hundreds of local companies doing it, and frankly there aren't. It's a game for companies with deep pockets who can absorb risks (like Google, who can afford a large-scale commercial experiment with Google Fibre).

What you appear to be requiring is that a commercial company releases its investment plans to competitors, and I really don't see that happening, especially in an area of investment like telecommunications, where there can be rapid take-up and change.

I suspect that the tendency in fixed-line telecommunications is very similar to that for electricity, water and gas. The economies of scale are with large operators and that's a natural state of the market. What that means, of course, is we end up with a highly interventionist regulated environment (which is what we have), with more competition at the higher and added-value levels. There will be specific areas where smaller companies will make an impact - industrial estates, new apartment blocks and so on (there have been some developments recently) - but I don't think we are going to see the country somehow covered by a patchwork of small, local network suppliers. That's how both electricity and telephone provision started out, and it ended up being consolidated into national networks in virtually every country you care to name (usually nationalised, as in the UK, but privately in places like the US). By a quirk of history, Kingston upon Hull retained its own local telephone network, but that's highly unusual.

One other point. It's rather unfortunate that public money is required at all to subsidise rural roll-out. In the case of the telephone (and other utilities), that subsidy was achieved via cross-subsidy. That continues to this day in that the copper loops in rural areas are cross-subsidised from revenues in urban areas. That can be done, as there is a regulatory regime enforced via a USC, but that's not the route that Ofcom (or government policy) favours. What they went for is a highly competitive market as deep into the network as technology allows, which with ADSL was essentially from the DSLAM onwards. As penetration goes deeper into the network, costs become prohibitive and you end up with a de-facto monopoly on FTTC solutions. However, the structure of the market - with very low-cost competition via LLU operators - means that there isn't the potential to cross-subsidise roll-out.

Perhaps if Ofcom had adopted a model which actually represented the differences in cost structures between urban and rural areas, such that customers in those areas bore the real cost of provision, then subsidies wouldn't have been required (cross or otherwise); the market would have provided. However, I rather suspect that rural dwellers wouldn't much appreciate paying the full commercial costs involved, but as the market was structured, they didn't have the choice.

0
0
Steven Jones
Bronze badge

Re: Gaps?

@ Warm Braw

EU state assistance rules do not allow any substantial overbuild of a comparable existing privately-funded system. In the case of the BDUK-funded scheme, that included VM broadband, as that is capable of exceeding the chosen NGA standards for "superspeed". Indeed, VM keep a close eye on this for obvious reasons and would object to any state-funded competitor encroaching on their "patch" to any significant extent.

The BDUK process included an open market review asking for any (credible) privately-funded schemes before the intervention areas were defined. The length of time required to gain EU approval was responsible for a considerable part of the delay in the scheme as, not surprisingly, politicians tend not to consider such issues before making their announcements.

So I can't be sure whether, in your area, there is a "legal" overlap with the VM network. If there is, it was most probably part of the commercial roll-out. It's extremely common (almost the norm) for some cabinets on an exchange to be part of the commercial roll-out and others to be on the BDUK scheme because they were not considered commercially viable. It can be very difficult to tell the difference. Some authorities (like the Bucks & Herts schemes) actually publish which cabinets are to be enabled as part of the BDUK scheme, but that's far from universal.

nb. my cabinet is similarly in a VM area and was enabled three months ago, but it was part of the commercial roll-out, whilst I expect others on the same exchange (serving smaller communities) to be BDUK enabled.

That doesn't mean there won't be a small amount of overlap, as inevitably the footprint of a cabinet won't line up exactly with the edge of the VM network.

0
0
Steven Jones
Bronze badge

Re: If you're going to plow billions into telecoms infrastructure...

I'm not sure what country you are from, but it's spelt plough in the UK.

As for spending £30bn on an FTTH network (which is a credible figure, roughly in line with what the island of Jersey is spending per property), you'll need to find a legal way of doing it. Overbuilding the VM and BT NGA networks would fall foul of EU laws on state assistance, so you have to factor in re-nationalising their access networks, which will wipe out half or more of your budget. So now you're down (optimistically) to £15bn, which is nothing like enough; so let's make your budget £45bn. Also, how are you going to get people off the copper network? The evidence is that the majority of folk stick with copper as it's cheap (being a sunk cost) and meets most of their needs. Withdraw it, and you've got a whole bunch of LLU operators who'll want compensation for the investments they've made in kit. In reality, any government would be stuck with running both fibre and copper in parallel for many years, and wholesale charges would be forced up to recover the costs.

What you are describing is exactly what the Labour government decided to do in Australia with the National Broadband Network. That was aimed at delivering fibre to 93% of properties (so it didn't cover the remote areas) and, on the latest review, was costed at $72bn (AUS), or about £40bn, albeit for roughly 40% of the number of properties. It's since been downgraded to a mixture of FTTP and FTTC, but it's still going to cost around £24bn (and that's assuming it actually delivers).

Against that, the public expenditure on broadband infrastructure in the UK is very low. In fact, many might argue that there is something wrong with the regulatory and commercial structure if the government is having to spend public money at all. The problem is the path that Ofcom (and most EU regulators) have gone down. They've forced down the price of copper, as a mechanism for minimising retail pricing, to the point where it's very difficult to justify investment in NGAs outside "prime" areas. There is precious little incentive for private investors to put money into infrastructure.

0
0
Steven Jones
Bronze badge

Re: fudging numbers

You are just plain wrong - the percentage of EO lines in rural areas is nothing like 90% (although it could be for individual exchanges). Among other things, relatively few village exchanges serve just one village, and all the others will have cabinets. There are solutions for EO lines, but I rather suspect they aren't a priority as other lines can be covered at lower cost.

1
0
Steven Jones
Bronze badge

Re: This is a real success story for our country

If you think you can connect more than a small minority of properties to fibre with a budget of £1.2bn, you are living in fantasy land.

4
3

UK regulators: We will be CHECKING UP on banks' IT systems

Steven Jones
Bronze badge

Re: "antiquated nature of bank IT systems"

Concentrating on the underlying hardware and OS rather misses the point. Certainly you can run rock-solid IT systems on mainframes, and characterising them as "antiquated" actually tells you nothing about the underlying resilience of the applications. However, even the most reliable and robust systems can be undermined by poorly trained and managed staff. It shouldn't be forgotten that the 2012 RBS outage was not due to dodgy Windows XP, Linux or UNIX systems, but to a problem with the support and maintenance processes around good old CA-7 on a mainframe system. It's not that CA-7 or z/OS is fundamentally unreliable; it was a failure in good operational and support practices.

The real issue is that, in the drive to reduce costs and roll out new features, what is being sacrificed is the quality and experience of operational, technical support and IT management staff and resources. If good practices are not maintained, then even the most reliable hardware in the world will not prevent catastrophic outages.

6
0
