What's the problem?
If you don't want the web shield then turn it off. Is it really that difficult?
Skegness for 'Skegex' - the Midlands Meccano exhibition in July (really, http://www.nmmg.org.uk/skegex.html ). Fantastic ingenuity and dedication by all those constructors.
Openreach carries all broadband connections except for VM's approx 4M - i.e. it carries around 18M connections, so it is a hugely dominant provider. The other providers (e.g. TT, Sky, EE, Vodafone etc.) are all dependent upon a service provided by their dominant competitor.
The current broadband situation is rather as if there were several different chains of petrol stations (e.g. BP, Shell, Texaco etc.) but they all had to use refineries operated by just one company, e.g. BP, and were dependent upon BP for prices and deliveries. We don't provide petrol that way, so why is it a good way to provide broadband?
I've been a recipient of long Cc: lists from well-meaning senders, and have occasionally done this kind of thing by accident (though not on that scale).
In this spam/phishing era perhaps mail programs could help by adding a few more functions and checks:
- Have a 'Send with recipients hidden' button to make it easy/obvious (though with the same functionality as Bcc).
- Have a check that prompts 'Did you really mean to send this to 100 visible recipients?' (where the limit of 100, or whatever, is user configurable) - a quick sketch below shows the idea.
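A minimal sketch of that second check (Python; the function name, prompt and hook are my own invention, not any real mail client's API):

```python
# Hypothetical 'too many visible recipients' check for a mail client.
DEFAULT_VISIBLE_LIMIT = 100  # user-configurable threshold

def confirm_send(to: list[str], cc: list[str],
                 limit: int = DEFAULT_VISIBLE_LIMIT) -> bool:
    """Return True if the send should proceed without a prompt."""
    visible = len(to) + len(cc)  # Bcc recipients are hidden, so not counted
    if visible <= limit:
        return True
    answer = input(
        f"Did you really mean to send this to {visible} visible recipients? [y/N] "
    )
    return answer.strip().lower() == "y"
```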
Now watch it flow back into the jug when I reverse the time direction control
Brilliant critique. Thank you!
Given that Google Maps uses info from phones to update traffic stats on roads, could not a similar approach be used for operator and phone performance on a geographic basis?
An app that occasionally reported GPS location, network operator, phone model, RF signal strength and, optionally, a user 'quality score' would allow a dynamic data set to be built that gave an accurate view of network coverage on different phone types. This would reflect actual performance rather than 'predicted coverage', and also give an indication of how each phone model affects things.
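For illustration, the report itself could be tiny - something like this (all field names are my own guesses, not any real app's format):

```python
# Hypothetical shape of a single crowd-sourced coverage report.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CoverageReport:
    lat: float                           # GPS latitude
    lon: float                           # GPS longitude
    operator: str                        # network operator name
    phone_model: str                     # handset model string
    rssi_dbm: int                        # RF signal strength in dBm
    quality_score: Optional[int] = None  # optional 1-5 user rating

report = CoverageReport(lat=52.2053, lon=0.1218, operator="ExampleNet",
                        phone_model="Model X", rssi_dbm=-95, quality_score=2)
```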
The problem would be getting sufficient take-up, given that the operators are hardly likely to encourage it - they would much rather advertise 98% coverage than have a system report 75% (or whatever).
Didn't LibreOffice/OpenOffice come from a German product, StarOffice, that Sun bought years ago? I wonder if the German comments are a hangover from the original code?
So wind energy and solar are providing 8% (each?) of the energy input they can currently address - that's a pretty good result (and, as several comments have remarked, one might not want a great deal more served by an intermittent source).
Of course there is much more to be done to address non-electrical energy - on the demand side as well as the supply side (smaller, more efficient cars, fewer unnecessary journeys, better insulated houses with more sensible thermostat settings, wearing a sweater in winter etc. etc.).
There is an excellent, balanced and scientific book on this by Prof David MacKay of Cambridge University, "Sustainable Energy - Without the Hot Air" - either buy it or read it on his website.
It's going to take work and dedication to do this, but either we do it or our children in years to come will shake their heads and wonder why their parents were so short-sighted, stupid or selfish.
There's nothing new about 4 x 25G WDM 100G interfaces - we already have 100G LR4 interfaces which will do exactly what they describe in the article, but they are very expensive and consume power and space. Presumably IBM have managed to produce a much smaller and lower-power interface, which will help a lot. I suspect it's still on a separate chip. If one could get true 'on-chip' optical 100G WDM interconnects, that really would be something.
As I understand it, network (non-)neutrality is mainly about ISPs giving preferential or poorer treatment to certain content providers, regardless of what their subscribers want. I can't see how giving each individual subscriber the choice to restrict (or not) access to specific classes of websites relates to an unavoidable restriction by the ISP. The two are quite different site selection paradigms.
Incidentally, I've been running Sky's filter for a year or so with no inconvenience (no 'false positives' as far as I can tell), and whilst it no doubt does not block all content in the specified classes, the alternative of putting individual shields on several PCs, tablets and phones is impractical. If one doesn't want it then it's a 30-second operation to turn it off. What is the problem?
There's a lot of truth in what you say - the BBC (or for that matter other channels) hardly ever deals with science/technology on its own terms. It has to be dumbed down or jazzed up to make it "appealing". Contrast this with, for example, Radio 3's CD Review (which I like). For three hours every Saturday morning on a main national channel, music is discussed, analysed and illustrated at length, using sometimes technical but still accessible terms, for those that want to listen.
'Codes that Changed the World' was a partly flawed step in the right direction, but given the plethora of radio and TV channels now available, when will some broadcaster have the courage to start making 'real' science/technology programs that really appeal to and inform the millions who work in or study such subjects?
Just to add a bit of context - Ofcom (and Cisco VNI) reports indicate that UK mobile data volume is equivalent to only about 5% of the volume of fixed BB data (29PB vs 650PB, June 2013 data). Though of course some of the fixed BB data will include traffic from home WiFi connections to phones.
Having made it all the way down to this lovely and remote part of England, it's also worth seeing the fascinating Cable and Wireless Telegraph Museum at Porthcurno on the Land's End peninsula. A superb collection of telegraph instruments; a fusion of engineering and craftsmanship in works of art. For sea views, try the unique and romantic Minack open-air theatre high up on the nearby cliffs.
Re-engineer the billing so that the calling party pays something for the call to be accepted - this charge could even be waived if the call was held for one minute or the called party entered a code (so that genuine callers were not charged). In that way it would cost much more money for all those silent and no-hope calls to be placed. Not sure how this would actually be implemented (harder with international calls) - but given that the incentive for making the calls is financial, a financial deterrent seems the best approach.
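A toy sketch of the waiver rule, with made-up charge and threshold values:

```python
from typing import Optional

ACCEPT_CHARGE_PENCE = 10   # hypothetical charge to the caller on answer
MIN_HELD_SECONDS = 60      # waived if the call is held for a minute...
WAIVER_CODE = "4321"       # ...or if the called party keys in a code

def charge_for_call(held_seconds: float, code_entered: Optional[str]) -> int:
    """Return the pence actually charged to the calling party."""
    if held_seconds >= MIN_HELD_SECONDS or code_entered == WAIVER_CODE:
        return 0                    # genuine call: charge waived
    return ACCEPT_CHARGE_PENCE      # silent/no-hope call: caller pays

print(charge_for_call(3, None))     # junk call dropped after 3 s -> 10
print(charge_for_call(180, None))   # genuine 3-minute call -> 0
```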
How can they make a loss when they charge 10% of the sale cost (+ PayPal charges) just for running a website - it beggars belief.
Agree Facebook is similar to TV - an entertainment activity - and should be valued on the same basis. It's wrong to value it as work, since some of the remuneration for work is to compensate for the fact that we have to do it, not that we simply want to. Conversely, no one is being paid to use Facebook.
The average UK person watches 3-4 hrs of TV per day (apparently), approx 100 hrs per month. The average pay-TV bill is about £50 per month, so the value placed on watching 'valued' TV is about 50p per hour (or 12-15p if you are a BBC/Freeview-only watcher). That is not too far away from the 40c per hour value placed on Facebook. Multiplying 'TV spend' by 32B hours gives £16B, or $25B. Much closer to the $12B figure given.
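The back-of-envelope sums, worked through (the ~1.55 USD/GBP exchange rate is my assumption):

```python
hours_per_month = 3.5 * 30            # 3-4 hrs/day -> ~105 hrs/month
pay_tv_bill_gbp = 50.0                # average monthly pay-TV bill
per_hour = pay_tv_bill_gbp / hours_per_month
print(f"{per_hour * 100:.0f}p per hour")      # -> ~48p, i.e. roughly 50p

total_hours = 32e9                    # 32 billion hours of viewing
tv_spend_gbp = 0.50 * total_hours     # -> about £16B
usd_per_gbp = 1.55                    # illustrative exchange rate
print(f"£{tv_spend_gbp / 1e9:.0f}B ~= ${tv_spend_gbp * usd_per_gbp / 1e9:.0f}B")
```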
Agree re. wanting a cheap dongle to allow one to cast audio to 'non-DLNA amplifiers' - I've been wanting something cheap and simple to do this for a long time.
Of course it would need power as well (assuming it connects using phono plugs or a 3.5mm jack).
Why is it that I can seldom get a signal on a train to make a quick call, yet the person next to me can be loudly blabbering trivia into their phone for the entire journey (or perhaps they are talking so relentlessly that they don't notice that the call dropped ages ago)?
Small correction - the Bombes didn't decrypt the messages themselves, but rather found the Enigma settings used to encode just one sample message on a particular network, a process that might take an hour or more. Once the settings were known, much simpler machines could be used to decode the messages themselves very quickly. (Different keys were used on different networks and the settings changed each day, so a number of Bombe runs would be needed.)
There is a good description of the Bombe's operation at http://www.ellsbury.com/enigmabombe.htm
TalkTalk, with 36% take-up, has been running its filter by far the longest - which suggests around 30% is the long-run take-up (with it varying higher or lower depending on how the option is presented).
According to ONS 2012 statistics, only approx 30% of households have one or more children - so as it's mainly such households that would use a 'child' filter, it sounds like quite a good take-up.
You say Sky is 'bombarded' with reclassification requests. It has 5M subs; with 8% take-up that means 400k are using the filter, and of those, 110 request a reclassification each month, i.e. under 0.03%. Not much of a bombardment.
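The arithmetic, for anyone checking:

```python
subscribers = 5_000_000
filter_users = subscribers * 0.08        # 8% take-up -> 400,000
monthly_requests = 110
rate = monthly_requests / filter_users
print(f"{filter_users:,.0f} filter users; {rate:.4%} request a change each month")
# -> 400,000 filter users; 0.0275% request a change each month
```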
It will be interesting to see how this plays out. NFV in the home CPE is operating in a rather different environment from the enterprise device. Enterprise networks have massive symmetric bandwidth and almost no latency, so there is no performance overhead in virtualising functions.
But for the typical home user on an ADSL connection, the upstream channel may be constrained to 500-1000 kbit/s, and packetisation and error correction on the link introduce 5-20 ms of round-trip latency. It remains to be seen whether the performance of such a link between the CPE and the centralised NFV server constrains what is possible.
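A rough feel for the per-packet cost, using illustrative numbers rather than measurements:

```python
def round_trip_ms(payload_bytes: int, upstream_kbps: float,
                  link_rtt_ms: float) -> float:
    """Upstream serialisation delay plus the link round-trip time."""
    serialisation_ms = payload_bytes * 8 / upstream_kbps  # bits / (kbit/s) = ms
    return serialisation_ms + link_rtt_ms

# A 1500-byte packet on a 500 kbit/s upstream with a 20 ms RTT:
print(f"{round_trip_ms(1500, 500, 20):.0f} ms")   # -> 44 ms per round trip
```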
Interesting and stimulating article with a lot of cogent points. Agree there is much progress needed on loudspeakers, but I'm not so sure about the differences between bass enclosures.
At bass frequencies all loudspeakers can be viewed as 'motors' driving some sort of acoustic/mechanical filter. In a sealed box the filter is a resonant system comprising the cone mass and the combined springiness of the air and the cone suspension; if done badly there can be a resonant peak. A bass reflex box has an additional filter comprising the air in the port and the cabinet compliance. A transmission line is also some sort of low-pass filter plus a delay. So in all cases you hear a filtered version of what goes in; it's just a matter of how the contribution of the filter is managed, and of trading that off against cabinet size, efficiency etc.
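To illustrate the sealed-box case: it behaves roughly as a second-order high-pass whose Q (Qtc) decides whether there is a resonant peak. A toy model with made-up fc and Qtc values, not any particular driver/cabinet:

```python
import math

def sealed_box_db(f_hz: float, fc_hz: float = 50.0, qtc: float = 0.9) -> float:
    """Magnitude (dB) of a 2nd-order high-pass at frequency f_hz."""
    r = f_hz / fc_hz
    mag = r**2 / math.sqrt((1 - r**2) ** 2 + (r / qtc) ** 2)
    return 20 * math.log10(mag)

for f in (25, 50, 100, 200):
    print(f"{f:>4} Hz: {sealed_box_db(f):+5.1f} dB")
# Qtc ~0.7 gives a maximally flat roll-off; a Qtc above that (as here)
# produces the small resonant peak just above fc mentioned above.
```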
In regard to transmission lines and transients - unlike in a true transmission line, higher frequencies are attenuated in the line, so the port is only emitting delayed low frequencies. This will still change transients (though less than shown) - but so will all other cabinets. Interestingly, in the PMC cross-section the driver is part way down the 'transmission line' - so it will work a bit like a resonant tube as well, meaning there are two 'bass reinforcement' mechanisms in use.
Rather looks like it's moved south of the river as well? Didn't know Shoreditch types ventured down there.
Didn't Ogle also design the Reliant Scimitar? That was a rather different proposition - beautiful and fast.
Surely one might expect such behaviour. If a company takes revenue from customers but manages to avoid paying taxes in the countries where those customers live, then why would one expect it to contribute back to the open source community that it draws so much from?
But would the US be able to use this information without revealing (or at least implying) how they got hold of it? That is always the problem with intelligence gathering; it's hard to make use of without revealing sources.
Years ago when Mary Whitehouse complained about some programs on TV the response was 'well you can always turn it off'. Surely the same answer applies to network filtering; if you don't want it then just turn it off.
For a DSL network you get whatever your sync rate is on the copper line, and then the choke point (where subscribers share bandwidth) is the backhaul link from the exchange to the core network. This can be eliminated very easily and quickly by migrating to a higher-rate backhaul (provided the operator is willing to pay for it). Much the same is true of FTTC.
For a 'cable' network the data flows are like a tree fanning out across the neighbourhood. The choke point is where they all come together at the 'trunk' of the tree, i.e. the head end. You can split it into two trees to remove the 'choking' - but this requires work in the street and lots of re-connections. Difficult, expensive and slow! This is probably the reason why cable networks congest on shared bandwidth: it takes time and investment to relieve it.
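A toy illustration of why one choke point is easy to relieve and the other isn't (all capacities and subscriber counts invented for illustration):

```python
def per_sub_mbps(shared_capacity_mbps: float, active_subs: int) -> float:
    """Bandwidth per subscriber at the shared choke point."""
    return shared_capacity_mbps / active_subs

# DSL/FTTC: the choke point is the exchange backhaul - a single link
# that can simply be re-provisioned at a higher rate.
print(per_sub_mbps(1_000, 400))    # 1 Gbit/s backhaul, 400 active lines -> 2.5
print(per_sub_mbps(10_000, 400))   # after a 10 Gbit/s upgrade -> 25.0

# Cable: the choke point is the shared tree; relief means a node split,
# i.e. physical work in the street to halve the subscriber count.
print(per_sub_mbps(400, 300))      # one segment, 300 active subs -> ~1.33
print(per_sub_mbps(400, 150))      # after a node split -> ~2.67
```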
As well as Wilkes' 100th anniversary, this year is also the 75th anniversary of the Cambridge Computer Lab that he set up (and for which EDSAC was built). I came across an interesting history of the Cambridge Lab published this year. It's a relatively non-technical history of the early days as well as of the rapid developments from the 1980s onwards:
"Cambridge Computing' by Prof Haroon Ahmed
Interesting difference in philosophy between these two approaches. As I understand it, ZigBee is generally based on an in-home hub providing a gateway via your home broadband router, whereas "Weightless" communicates wirelessly with some more remote base station - for example down the street or in the middle of a town.
ZigBee is widely available in a range of devices, and would be very easy and cheap to extend to applications like remote meter reading, but it requires the home router to be functioning, so it could not be universally deployed. Weightless is much earlier in the development process and needs external base stations to be deployed - more costly, but it eliminates the dependency on the home router.
It will be interesting to see which is taken up, and how consumers react to the different operating models.
P.S. I have no connection with either Weightless or ZigBee
Good analysis and many good points in this article, but the alternative scenario it assumes is that "we can go on forever generating cheap electricity in a way that trashes the environment". We can't.
Sustainable energy sources cost more initially, and may well do so for a long time (though the true cost of unsustainable energy will eventually emerge, at which point it may not appear quite so cheap after all).
We will have to get used to this, so the only way to contain or reduce household energy payments will be to reduce consumption by more careful use and energy efficiency measures.
You get MS Office 365 banner ads on the Greater Anglia train WiFi as well.
Unfortunately the WiFi internet connection was not working beyond that, so all one saw was the Microsoft banner with nothing useful actually happening. A reassuringly familiar experience with MS applications!
It's a nice idea - but the closing paragraph says the attenuation is 3.5 dB/km. Current long-haul fibre has attenuation of 0.2-0.3 dB/km, and one can go up to 100 km before amplification. This new fibre would need an amplifier or regenerator every 10 km or so - not very attractive! They need to fix the attenuation problem first.
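The span arithmetic behind that, assuming the ~25 dB loss budget implied by the figures above:

```python
reference_km = 100        # typical amplifier spacing on standard fibre
reference_loss = 0.25     # dB/km for standard long-haul fibre
budget_db = reference_km * reference_loss   # ~25 dB usable loss budget

new_fibre_loss = 3.5      # dB/km quoted in the article
print(f"{budget_db / new_fibre_loss:.1f} km between amplifiers")
# -> ~7 km, the same ballpark as the ~10 km above
```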
Not quite - I think it was an optical switch - it just switched wavelengths between different directions, no packet switching. I saw the prototype: a monster of a device. Chi-Ro (?) was bought by Nortel, who then crashed in 2009, but similar functionality is now available as a small ROADM module from many vendors.
I like the reminder to choose a good algorithm before throwing hardware/systems at it.
Another problem with expensive commercial solutions is that, because they cost so much, they have to be justified by applying them to a wide variety of requirements - and then they struggle to fulfil the one you are interested in. Whereas if there is minimal cost in deploying an open source solution, it can be focused on addressing a particular requirement => a smaller, simpler, faster and more reliable system.
Of course, there is the resultant risk/problem of multiple fragmented solutions. But providing they are not too disparate, several separate solutions that reliably do their own job may be preferable to one monster that never quite delivers.
The worst thought about this is that anyone working with the PC is breathing that junk in as well.
I've got a monitoring device on my line (a SamKnows box) and get a rock-steady 18-19 Mbit/s 24 hours a day (also confirmed by occasional manual speed tests at various times). FYI it's a 0.8km-long line from a supposedly despised provider.
In my work I do a lot of analysis of network performance. Ofcom reports average UK download speeds of ~9 Mbit/s (improved somewhat in 2012!) but Akamai reports 6.3 Mbit/s. (Both these figures are measuring something different from the average UK line sync speed, which is about 12.7 Mbit/s.)
I haven't managed to get the full Akamai report yet, so don't understand their methodology (I do understand the Ofcom methodology - and it is sound). Depending on exactly how Akamai measure things there may be subtle limitations on the speed that would be reported - but it is surprising that the Akamai figure is only 70% of the Ofcom figure.
Incidentally, about 10% of lines are <= 2 Mbit/s; some of these will be very expensive to improve - but others might be improved by sorting out home wiring.
Say I sign up to a 24-month contract at £25 per month (i.e. a £600 commitment) and get a free shiny new i-Thingy valued at £400. In principle the operator has to borrow money to pay for the i-Thingy now, but they can easily do this at a fixed rate if they choose.
So out of the £25 pm, about 2/3 of it can be a fixed cost to the operator. The remainder is the cost of providing the service and is subject to inflation - but this can be estimated fairly accurately (or could be hedged if the operator chooses).
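The sums, for what it's worth:

```python
monthly_gbp = 25.0
months = 24
commitment = monthly_gbp * months      # £600 over the contract
handset_gbp = 400.0                    # the 'free' i-Thingy, funded up front

fixed_share = handset_gbp / commitment              # ~0.67, i.e. about 2/3
service_pm = (commitment - handset_gbp) / months    # ~£8.33 per month
print(f"{fixed_share:.0%} of each bill repays the handset; "
      f"£{service_pm:.2f}/month buys the actual service")
```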
It should not be difficult or that costly for the operator to offer the service at an almost known fixed cost to itself, so why do they need the flexibility to vary the charge?
I wonder if the 'kill it and see if they notice' test is already being applied to FM as well?
For a few hours every 1-2 months the Radio 3 FM signal from Wrotham (Kent) inexplicably drops by 15-20 dB, causing noticeable degradation of the signal. If one reports this (and manages to get a reply) it is said to be due to engineering work or an undetermined fault.
Why does it take the BBC/Arqiva hours to notice the problem?
Why do they make it so hard to report? One has to plough through pages of a 'Why do I have a poor signal?' web-based troubleshooting script before one can tell them it is their equipment at fault.
Perhaps the thinking is: Make it hard enough to report the problem and most people will give up, so only a tiny minority complain, so it obviously isn't affecting most people and can be switched off.
Lots of money saved, and quality down the pan.
First 'hard disk digital recorder'? A company I worked for in 1984 bought a hard disk digital video recording system from a small Californian company called 'PEL' (which went bankrupt shortly thereafter). The system was a bit of a monster - two racks' worth. Disk speeds were slow in those days, so the video was digitised and streamed to 8 separate disk drives in parallel. There was also another system built by Logica that used Ampex multi-platter disks with the video streamed to 8 platters in parallel - that proved unreliable owing to the problem of maintaining simultaneous track alignment across many platters.
Of course these systems did not have recording scheduling software - but they did record video on hard disks. So there is prior art (and probably some patents).
This has a disquieting similarity to Nevil Shute's chilling book "On the Beach" (published in 1957), which describes a world where mankind slowly exterminates itself by unleashing a global nuclear war. The last remaining country (Australia) encases pages of the Encyclopedia Britannica in glass, presumably in the hope it will survive till another human species appears and can read the assembled knowledge. I guess 100M years might be enough.
So if no broadcaster (that doesn't have a TV licence income) can justify having their own HD channel all to themselves, then why not run it as a "shared channel" into which they can pay to slot occasional programs or series? A sort of HD mix or sampler channel.
If one could get 10% off the electricity bill by agreeing to allow interruption of the tumble drier or dishwasher cycle at peak times, it might be quite an attractive option.
The same logic applies to ISPs, but the effect is less noticeable.
It would have been more helpful if the uSwitch report had distinguished between the broadband network on the one hand, and the internet and content providers on the other.
The most recent Ofcom report (May 2011) shows a slow-down of ~10% at peak times. These measurements are done on a much more careful basis and cover the connection from a major internet peering point to the customer. Of course some geographic areas may be worse (and some not so bad).
See http://stakeholders.ofcom.org.uk/binaries/research/telecoms-research/bbspeeds2011/bb-speeds-may2011.pdf (page 35). Not as dull as you might expect!
The uSwitch report is a bit short on detail, but probably includes the slow-down due to the internet and servers outside a provider's network. That is of course what the customer sees in reality, but it's wrong to blame the broadband providers for the problem.
It seems highly inequitable that artists get 50-70 years' copyright protection for their music, whilst the scientists and engineers who invent the machines used to play, record, transmit, publish, store and listen to that music get only 20-25 years' patent protection.
If relatively short patent protection of scientific ideas is a 'good thing' (and helps to nurture innovation and economic success) - then why not apply the same time limits to artistic creation?
There really is something attractive in the speed-up from using just one or two machine operations to perform a single source code operation, e.g. x = x + 1 (rather than the many arising from interpreted steps).
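You can see the interpreter-side cost even for that one-liner with CPython's dis module:

```python
import dis

def bump(x):
    x = x + 1
    return x

# Prints the several interpreted steps behind x = x + 1 (exact opcode
# names vary by Python version: LOAD_FAST / LOAD_CONST / BINARY_ADD or
# BINARY_OP / STORE_FAST ...) where a native compiler would typically
# emit a single increment instruction.
dis.dis(bump)
```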
So the "brick on a stick" is replaced by a paving slab !
Employers 'try it on' as well
I was one of 300 or so employees who were illegally dismissed by our company (after it went into administration, but while it was still trading). The administrators employed top London lawyers and tried every procedure they could think of to block our claim. We had to get barristers to pursue our claim in a tribunal. After about 15 months the administrators did an about-face and said they did not have any grounds for dismissal without notice and so would not contest the claim (though they did argue some technicalities).
We won (and the judge agreed very much with our case and dismissed the technicalities). We should have had a large 'award', except that when a company is in administration it does not have to pay such awards. Instead the government gives the claimants a 'token' sum.
I guess a quicker process would have helped, but don't make it harder for employees.