47 posts • joined 24 Jun 2009
Going south ?
Rather looks like it's moved south of the river as well? Didn't know Shoreditch types ventured down there.
Didn't Ogle also design the Reliant Scimitar - that was a rather different proposition. Beautiful and fast.
A pattern of non-contribution
Surely one might expect such behaviour. If a company takes revenue from customers but manages to avoid paying taxes in the countries where those customers live, why would one expect it to contribute back to the open source community it draws so much from?
But would the US be able to use this information without revealing (or at least implying) how they got hold of it? That is always the problem with intelligence gathering: it's hard to make use of without revealing sources.
you can always turn it off
Years ago, when Mary Whitehouse complained about some programmes on TV, the response was 'well, you can always turn it off'. Surely the same answer applies to network filtering: if you don't want it, just turn it off.
Different Choke points and ease of upgrade
For a DSL network you get whatever your sync rate is on the copper line and then the choke point (where subscribers share bandwidth) is the backhaul link from exchange to the core network. This can be eliminated very easily and quickly by migrating to a higher rate backhaul (providing the operator is willing to pay for it). Much the same is true of FTTC.
For a 'Cable' network the data flows are like a tree fanout across the neighbourhood. The choke point is where they all come together at the 'trunk' of the tree i.e. the head end. You can split it into two trees to remove the 'choking' - but this requires work in the street and lots of re-connections. Difficult, expensive and slow! This is probably the reason why cable networks congest on shared bandwidth, i.e. it takes time and investment to relieve it.
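A back-of-envelope sketch of the shared-bandwidth effect described above. The trunk capacity and subscriber counts are purely illustrative, not any real network's figures:

```python
# Shared bandwidth at the choke point: all active subscribers on a
# tree share the trunk capacity, so splitting into two trees halves
# the contention at peak time. Numbers are illustrative only.
def per_subscriber_mbps(trunk_mbps: float, active_subs: int) -> float:
    """Average bandwidth each active subscriber sees at peak time."""
    return trunk_mbps / active_subs

print(per_subscriber_mbps(1000, 200))  # one tree: 5.0 Mbit/s each
print(per_subscriber_mbps(1000, 100))  # split into two trees: 10.0 each
```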
History of computing at Cambridge
As well as Wilkes' 100th anniversary, this year is also the 75th anniversary of the Cambridge Computer Lab that he set up (and for which EDSAC was built). I came across an interesting history of the Cambridge Lab published this year. It's a relatively non-technical history of the early days as well as the rapid developments from the 1980s onwards.
'Cambridge Computing' by Prof Haroon Ahmed
ZigBee and Weightless - Different deployment models
Interesting difference in philosophy between these two approaches. As I understand it, ZigBee is generally based on an 'in home' hub providing a gateway via your home broadband router, whereas 'Weightless' communicates wirelessly with a more remote base station - for example down the street or in the middle of a town.
ZigBee is widely available in a range of devices, and would be very easy and cheap to extend to applications like remote meter reading, but it requires the home router to be functioning so could not be universally deployed. Weightless is much earlier in the development process and needs external base stations to be deployed, which is more costly, but it eliminates the dependency on the home router.
It will be interesting to see which is taken up, and how consumers react to the different operating models.
P.S. I have no connection with either Weightless or ZigBee
Good analysis, but misses 'the elephant in the room'
Good analysis and many good points in this article, but the alternative scenario it assumes is that "we can go on forever generating cheap electricity in a way that trashes the environment". We can't.
Sustainable energy sources cost more initially, and may well do so for a long time (though the true cost of unsustainable energy will eventually emerge, at which point it may not appear quite so cheap after all).
We will have to get used to this, so the only way to contain or reduce household energy payments will be to reduce consumption by more careful use and energy efficiency measures.
MS Stalled on the train
You get MS Office 365 banner ads on the Greater Anglia train WiFi as well.
Unfortunately the WiFi internet connection was not working beyond that. So all one saw was the Microsoft banner with nothing useful actually happening. A reassuringly familiar experience with MS applications!
Attenuation killer in the small print
It's a nice idea - but the closing paragraph says the attenuation is 3.5 dB/km. Current long-haul fibre has attenuation of 0.2 - 0.3 dB/km and can go up to 100km before amplification. This new fibre would need an amplifier or regenerator roughly every 10km - not very attractive! They need to fix the attenuation problem first.
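To put rough numbers on that, here's a quick reach calculation assuming a ~30 dB loss budget per amplified span (a ballpark figure for illustration; the real budget depends on launch power and receiver sensitivity):

```python
# How far you can go before a given loss budget is used up, per fibre.
LOSS_BUDGET_DB = 30.0  # assumed per-span budget - illustrative only

def max_span_km(attenuation_db_per_km: float) -> float:
    """Distance at which the loss budget is exhausted."""
    return LOSS_BUDGET_DB / attenuation_db_per_km

print(f"standard SMF (0.3 dB/km): {max_span_km(0.3):.0f} km per span")
print(f"new fibre    (3.5 dB/km): {max_span_km(3.5):.1f} km per span")
```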
Re: Simpsons Did It
Not quite - I think it was an optical switch - it just switched wavelengths between different directions, no packet switching. I saw the prototype, a monster of a device. Chi-Ro (?) was bought by Nortel, who then crashed in 2009, but similar functionality is now available as a small ROADM module from many vendors.
Can afford to do focussed solutions with Open Source
I like the reminder to choose a good algorithm before throwing hardware/systems at it.
Another problem with expensive commercial solutions is that, because they cost so much, they have to be justified by applying them to a wide variety of requirements - and then they struggle to fulfil the one you are interested in. Whereas if there is minimal cost in deploying an open source solution, it can be focused on addressing a particular requirement => a smaller, simpler, faster and more reliable system.
Of course, there is the resultant risk/problem of multiple fragmented solutions. But providing they are not too disparate, several separate solutions that reliably do their own job may be preferable to one monster that never quite delivers.
You're breathing this stuff!
The worst thought about this is that anyone working with the PC is breathing that junk in as well.
Rock steady 18 - 19 Mbit/s
I've got a monitoring device on my line (SamKnows box) and get a rock steady 18 - 19 Mbit/s 24 hours a day (also confirmed by occasional manual speed tests at various times). FYI it's a 0.8km long line from a supposedly despised provider.
Why discrepancy between Akamai and Ofcom speeds ?
In my work I do a lot of analysis of network performance. Ofcom reports average UK download speeds of ~9 Mbit/s (improved somewhat in 2012!) but Akamai reports 6.3 Mbit/s. (Both of these figures measure something different from the average UK line sync speed, which is about 12.7 Mbit/s.)
Haven't managed to get the full Akamai report yet, so I don't understand their methodology (I do understand the Ofcom methodology - and it is sound). Depending on exactly how Akamai measures things there may be subtle limitations on the speed that would be reported - but it is surprising that the Akamai figure is only 70% of the Ofcom figure.
Incidentally, about 10% of lines are <= 2 Mbit/s; some of these will be very expensive to improve - but others might be improved by sorting out home wiring.
Operators should be able to fix contract rates by appropriate finance
Say I sign up to a 24 month contract at £25 per month (i.e. £600 commitment) and get a free shiny new i-Thingy valued at £400. In principle the operator has to borrow money to pay for the i-Thingy now, but they can easily do this at a fixed rate if they choose.
So out of the £25 pm about 2/3 of it can be a fixed cost to the operator. The remainder is the cost of providing the service and is subject to inflation - but this can be estimated fairly accurately (or could be hedged if the operator chooses).
It should not be difficult or that costly for the operator to offer the service at an almost fully known fixed cost to itself, so why do they need the flexibility to vary the charge?
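The arithmetic of the example above, as a quick sketch (the £25/month and £400 handset figures are just the illustration from this post, not real tariffs):

```python
# Split the monthly charge into the handset repayment (which the
# operator can finance at a known fixed rate) and the service element
# (the only part genuinely exposed to inflation).
monthly_charge = 25.00   # pounds per month (example figure)
term_months = 24
handset_cost = 400.00    # handset subsidy financed up front

handset_per_month = handset_cost / term_months          # fixable cost
service_per_month = monthly_charge - handset_per_month  # variable part

print(f"handset repayment: {handset_per_month:.2f}/month "
      f"({handset_per_month / monthly_charge:.0%} of the charge)")
print(f"service element:   {service_per_month:.2f}/month")
```

As the post says, about two-thirds of the charge is already fixable before any hedging of the service element.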
FM radio also being fiddled with ?
I wonder if the 'kill it and see if they notice' test is already being applied to FM as well?
For a few hours every 1-2 months the Radio 3 FM signal from Wrotham (Kent) inexplicably reduces by 15-20 dB, causing noticeable degradation of the signal. If one reports this (and manages to get a reply) it is said to be due to engineering work or an undetermined fault.
Why does it take the BBC/Arqiva hours to notice the problem?
Why do they make it so hard to report? One has to plough through pages of 'Why do I have a poor signal' web-based troubleshooting script before one can tell them it is their equipment at fault.
Perhaps the thinking is: Make it hard enough to report the problem and most people will give up, so only a tiny minority complain, so it obviously isn't affecting most people and can be switched off.
Lots of money saved, and quality down the pan.
Re: someone must have invented the DVR?
First 'hard disk digital recorder'? A company I worked for in 1984 bought a hard disk digital video recording system from a small Californian company called 'PEL' (which went bankrupt shortly thereafter). The system was a bit of a monster - two racks' worth. Disk speeds were slow in those days, so the video was digitised and streamed to 8 separate disk drives in parallel. There was also another system built by Logica that used Ampex multiplatter disks with the video streamed to 8 platters in parallel - that proved unreliable owing to the problem of maintaining simultaneous track alignment across many platters.
Of course these systems did not have recording scheduling software - but they did record video on hard disks. So there is prior art (and probably some patents).
Reminds me of 'On the Beach'
This has a disquieting similarity to Nevil Shute's chilling book "On the Beach" (published 1957), which describes a world where mankind slowly exterminates itself by unleashing a global nuclear war. The last remaining country (Australia) encases pages of the Encyclopaedia Britannica in glass, presumably in the hope it will survive till another human species appears and can read the assembled knowledge. I guess 100M years might be enough.
Shared HD channel
So if no broadcaster (without a TV licence income) can justify having their own HD channel all to themselves, then why not run it as a 'shared channel' that they can pay to slot occasional programmes or series into? A sort of HD mix or sampler channel.
If one could get 10% off the electricity bill by agreeing to allow interruption of the tumble drier or dishwasher cycle at peak times, it might be quite an attractive option.
Same logic applies to ISPs, but the effect is less noticeable
Broadband or Internet slowdown ?
It would have been more helpful if the uSwitch report had distinguished between the broadband network and the internet and content providers.
The most recent Ofcom report (May 2011) shows a slow down of ~ 10% at peak times. These measurements are done on a much more careful basis and cover the connection from a major internet peering point to the customer. Of course some geographic areas may be worse (and some not so bad).
See http://stakeholders.ofcom.org.uk/binaries/research/telecoms-research/bbspeeds2011/bb-speeds-may2011.pdf (page 35). Not as dull as you might expect!
The uSwitch report is a bit short on detail, but probably includes the slowdown due to the internet and servers outside of a provider's network. That is of course what the customer sees in reality, but it's wrong to focus on broadband providers for the problem.
Music and the Machine
It seems highly inequitable that artists get 50-70 year copyright protection of their music, whilst the scientists and engineers that invent the machines used to play, record, transmit, publish, store and listen to that music only get 20-25 year patent protection.
If relatively short patent protection of scientific ideas is a 'good thing' (and helps to nurture innovation and economic success) - then why not apply the same time limits to artistic creation?
There really is something attractive in the speed-up from using just one or two machine operations to perform a single source code operation e.g. x = x + 1 (rather than many arising from interpreted steps).
brick on a stick
So the "brick on a stick" is replaced by a paving slab !
It cuts both ways
Employers 'try it on' as well
I was one of 300 or so employees who were illegally dismissed by our company (after it went into administration, but while it was still trading). The administrators employed top London lawyers and tried every procedure they could think of to block our claim. We had to get barristers to pursue our claim in a tribunal. After about 15 months the administrators did an about-face and said they did not have any grounds for dismissal without notice and so would not contest the claim (though they did argue some technicalities).
We won (and the judge agreed very much with our case and dismissed the technicalities). We should have had a large 'award', except that when a company is in administration it does not have to pay such awards. Instead the government gives the claimants a 'token' sum.
I guess a quicker process would have helped, but don't make it harder for employees.
Send in the seals
So if Theresa says no will the US send in the seals for him ?
Rerun of Microsoft IE6 incompatibility ?
Some years ago Microsoft damaged the prospects of other browsers with a raft of incompatible extensions and features that only worked in IE6 etc. (though ultimately this backfired).
I hope Google aren't trying a similar technique to try to kill off the smaller browsers.
Most welcome development
It will be a most welcome development if Spotify can bring its speed, simplicity and 'just works' approach to managing material on an iPod rather than being forced to use bloated iTunes ( which always leaves me wondering "what is happening now?" )
Putting Microsoft software on a Toyota should at least prevent any risk of uncontrolled acceleration
Data driven diagramming...
Thanks for the GraphViz tip (I think it comes from something Bell Labs/AT&T did many years ago for graphs in troff documents!).
Not exactly what I wanted, but it led to the Google directory page of graph drawing systems.
This led to "uDrawGraph" - it lacks quite a few features that NetViz had, but produces decent graphics and has an API, which could be useful.
What about data driven diagrams (e.g. NetViz)
I used to use a program called NetViz some years ago. The graphics and drawing capabilities were not great, and it was far too expensive. But you could very easily import and export attributes associated with nodes and links, and cause the elements to change appearance and annotation depending upon the value of attributes. You could export data associated with a diagram rather than just the 'appearance'.
This made it very easy to generate large diagrams with lots of data/annotation without having to 'draw' it all by hand, and to synchronise other systems with the data shown on the diagram. NetViz seems to have been dropped long ago by its supplier (CA), but I have yet to find a suitable replacement.
Many of the 'diagramming' applications seem to be fancy drawing programs with a networking add-on. What I need is more of a data management program with a data driven diagramming add-on.
Any ideas ?
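One partial approach is to generate Graphviz DOT from the data, so appearance is driven by attributes rather than drawn by hand. A minimal sketch in that spirit - the node names and the 'load' attribute are invented for illustration:

```python
# Emit DOT where node colour is derived from a data attribute,
# in the NetViz spirit; pipe the output into `dot -Tpng` to render.
nodes = {"router1": {"load": 0.9}, "router2": {"load": 0.2}}
links = [("router1", "router2")]

def colour(load: float) -> str:
    """Map a data value to an appearance - the 'data driven' part."""
    return "red" if load > 0.8 else "green"

lines = ["digraph net {"]
for name, attrs in nodes.items():
    lines.append(f'  {name} [style=filled, fillcolor={colour(attrs["load"])}];')
for a, b in links:
    lines.append(f"  {a} -> {b};")
lines.append("}")
print("\n".join(lines))
```

It doesn't give you NetViz's data management or round-trip export, but it does scale to large diagrams without hand drawing.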
About time too
Why anyone thought that XML was a good way to exchange large amounts of data is a mystery to me.
But, hey, storage is getting cheaper and processors are getting faster all the time, so why bother to come up with an encoding that is space efficient and quick to parse when you have something as horrendously inefficient as XML.
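A quick (and admittedly crude) illustration of the overhead, comparing the same three-field record as XML text versus a packed binary encoding, using only the standard library:

```python
import struct

# One small record: an int id, a float reading, an int flags field.
rec_id, temp, flags = 12345, 21.5, 7

xml = (f"<record><id>{rec_id}</id><temp>{temp}</temp>"
       f"<flags>{flags}</flags></record>")
packed = struct.pack("<ifi", rec_id, temp, flags)  # int, float, int

print(len(xml.encode("utf-8")), "bytes as XML")
print(len(packed), "bytes packed")  # 12 bytes: two ints and a float
```

The tags alone dwarf the payload, and that ratio only gets worse with deeper nesting - before even counting the parsing cost.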
Protecting doctors and hiding bad practice
Yes, the downside of keeping information highly protected is that it reduces the ability to perform analyses of things such as patient outcomes, and so identify badly performing doctors or unsuitable practices.
Perhaps this is really behind some of the BMA objections ?
Widening access in a controlled manner introduces some risk of loss of confidentiality, but not doing so allows poor practices to continue that result in premature death or unnecessary suffering.
I know which approach I would prefer.
But what about reflections
The idea of 'nulling' at the central receive antenna sounds good in an ideal environment, but each transmitting antenna will generate reflections of varying intensity, so there will not be a perfect null. Nulling will help, and may be part of a further range of measures. But in a reflective environment I wonder how much attenuation will really be achieved?
Just Garbage collection ?
Not sure what the issue is.
Isn't this just the same paradigm as garbage collection in a program with dynamic memory allocation? When the last pointer goes away, the space should be reclaimed.
On the other hand, if there is still a reference to the data block, then someone still wants it and you are no worse off than if you had not de-duped - the data would still be there because the other guy had not deleted it.
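The GC analogy can be sketched as reference counting over content-addressed blocks. A minimal illustration - the class and method names are invented, not any real product's API:

```python
import hashlib

class DedupeStore:
    """Toy de-duplicated block store with reference counting."""
    def __init__(self):
        self.blocks = {}    # content hash -> data
        self.refcount = {}  # content hash -> number of references

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        if key not in self.blocks:
            self.blocks[key] = data          # first copy actually stored
        self.refcount[key] = self.refcount.get(key, 0) + 1
        return key

    def delete(self, key: str) -> None:
        self.refcount[key] -= 1
        if self.refcount[key] == 0:          # last reference gone:
            del self.blocks[key]             # reclaim, just like GC
            del self.refcount[key]

store = DedupeStore()
k1 = store.put(b"same data")
k2 = store.put(b"same data")   # second reference, no extra copy stored
store.delete(k1)               # the other reference keeps the data alive
print(k2 in store.blocks)      # True
```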
PDP-8, 11 and Vax
Just for the record, the PDP-8 used 12-bit words (and also had a sort of 24-bit floating point capability with an optional expensive hardware accelerator). You could run real scientific programs in 4k, 8k or 12k words of memory.
DEC saw their biggest success with the VAX, and whilst it was an impressive architecture it came a bit late to the market. It was also encumbered by VMS, which struck me as unnecessarily complex (being based on RSX-11), in contrast to the much more cleanly architected VAX Unix that DEC could have picked up from UC Berkeley a year or two later, but instead vacillated over for years.
Based on (admittedly limited) experience of post grad study at one US university (UC Berkeley):
1) Top US PhDs involve a lot of demanding and advanced work, including two years advanced course work (at least when I was there)
2) Many doing post grad study were doing so to help them get a high paying technical job.
With that motivation and dedication it's not wholly surprising that they tend to be employed.
P.S. Succeeding in something challenging is a source of enjoyment and satisfaction.
Elegant, Simple and Powerful
The thing that set Unix apart is the elegance, simplicity and power of its basic design. In the late '70s, PDP-11 computers with 2 Mbyte (max) memory could support 20-30 concurrent users with a core OS API that is substantially unchanged in today's Unixes (though many other features have been added).
It's a credit to the designers that so many of the original ideas have stood the test of time and provided a platform for so many new developments.
Current windows systems also include many 'features' of the original 80's DOS, but how we suffer for that continuity...
What responsibility for carriage of illegal material
Surely the situation for an ISP is rather like an operator of a 'toll road' e.g. Dartford Crossing, or M6 Birmingham relief road.
If on occasions someone pays the toll and then drives a truck with stolen goods along the road, is the road operator supposed to check contents of every truck, the identity of the driver and bear responsibility for the contents ?
Of course not, so why are ISPs supposed to put in costly infrastructure to do just that for a small percentage of the material they might carry?
Jobsworth CRB amplification insanity
One problem with CRB checks is that each (public) organisation feels the need to go one step further.
Recently a friend of ours offered to act as one of the parent helpers on a (one-off) school trip to a museum, accompanying the class her child was in. She was told she couldn't go because she needed to be CRB checked. This surprised me, so I checked with the local education authority on her behalf - and as expected was told that the county DOES NOT require this for a one-off visit; it was purely something the school itself had dreamt up.
Schools lament the lack of parental involvement, and then they put stupid obstacles in the way of parents who offer to help.
The final irony is that the lady concerned is a health visitor, and of course thoroughly CRB checked in that occupation for dealing with vulnerable people. But of course this does not meet the school's own invented requirements.
Pretty good from here
Clouds cleared after about 9.30 so sat in my back garden at 10.30 (in South East of England) and got a pretty good display - a small meteor every 2-3 minutes and every 10 mins or so a cracker with a 'sparkling' trail. Not spectacular, but memorable.
Open Office opens WordPerfect and others
Had to open a very important 50-page WordPerfect 5 file dating from about 1984 the other day, and OpenOffice opened it fine!
It's also been opening Office 2007 files since version 3.1 - a long time now.
There are about twenty file types it opens.
re. funny maths
Nothing unusual about 2.5 + 3.5 adding up to 12.
Nortel execs had a bit of a track record of overstating the numbers when it suited their bonus calcs.
Regaining virginity ?
Useful article, but what is disappointing is that none of the applications gets start-up time anywhere near the original 35 secs of the new clean install - one feels something could be done to recover this. Is it just extraneous applications that slow down boot? If so, are they really needed? Or could they be arranged to splutter into life when the PC is up and going, rather than forcing the user to wait till they sort themselves out before being allowed to do anything useful?
Agree about the memory upgrade, particularly for really 'challenged' PCs, e.g. 256MB - it's quite easy to do as well; you just need to be careful and gentle.
Radio Mic spectrum - replace kit
Quite apart from the politics and economics of charging for spectrum, there is the other issue that all the 'entertainment industry' UHF radio mics will need to be changed or modified. I guess in the business they probably get replaced every few years anyway, but this kit is not cheap - so it could be a major one-off initial cost.
The good news (I think) is that non-professionals who use radio mics on the unlicensed part of the spectrum, e.g. churches, clubs, personal use etc., will still have that part of the spectrum available.
See this article http://www.theaudiofiles.net/?p=386#more-386