1860 posts • joined 15 Jun 2007
You've been reading the Wikipedia article on the Nyquist Frequency, and particularly the section on Aliasing sinusoidal waveforms.
This is a very special case, and does not mean that you can reconstruct any waveform from samples taken at just twice the frequency of its highest component. It's really pointing out the minimum sampling rate that allows you to differentiate between one sine wave and another at an integer multiple of its frequency. The important thing is that you have to know that it is a sine wave before you start.
There are many special cases, and the one that I like to think of is a sine wave at 1/4 of the sampling frequency, which at 44.1 kHz sampling would make the frequency of the sine wave 11.025 kHz, well within the hearing range of most people. This would mean that if sampled at exactly 90 degree intervals, you would get something between a perfect sawtooth and a square wave. Of course, if you know it is a sine wave, you can reconstruct it, but a CD player would be stupid to assume that everything it plays is a sine wave, so it tends to use some mathematical spline to smooth the waveform, and this is what is fed to the analogue part of the system. Different CD implementations use different smoothing functions, but none of them can perfectly reconstruct the original signal in every case.
As has been pointed out, this is a pathological case, but it illustrates that digital sampling can never be anywhere close to perfect unless the sampling rate is many times the maximum frequency, certainly more than twice; a mechanical system, by contrast, could be perfect within a range of frequencies, even though it is unlikely to be so because of material physics.
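A quick sketch of the fs/4 case described above (my own illustration, not from the original post): sampling a sine wave at exactly a quarter of the sampling rate takes one sample every 90 degrees, and the shape you capture depends entirely on where those sample points fall on the wave.

```python
import math

fs = 44100.0            # CD sampling rate, Hz
f = fs / 4              # 11025 Hz: one sample every 90 degrees of the wave

def sample(phase_deg, n=8):
    """Take n successive samples of sin(2*pi*f*t + phase)."""
    phase = math.radians(phase_deg)
    return [math.sin(2 * math.pi * f * k / fs + phase) for k in range(n)]

# Phase 0 degrees:  0, +1, 0, -1, ...           (triangle-ish)
# Phase 45 degrees: +0.707, +0.707, -0.707, ... (square-ish)
for ph in (0, 45):
    print(ph, [round(s, 3) + 0.0 for s in sample(ph)])
```

Same sine wave, same sampling rate, radically different sample sequences: which is why a reconstruction filter that doesn't know the input is a sine wave cannot recover it perfectly.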
The secret is...
... to prevent the batteries dropping below a certain critical temperature. So the batteries powering the heater must be inside the heated enclosure.
It's really a bit of a shame that the internal resistance of the batteries is not a bit higher. If it were, the act of powering the electronics may generate enough heat to keep the batteries warm, or at least slow down the cooling rate!
But according to the specs. Energizer Lithium should be good to around -40 C, so I'd be a bit surprised if they would be a problem for most of the ascent.
Watson is not a single computer any more
While what we saw on Jeopardy! could clearly be seen as a computer cluster running as a single service, what IBM have now is an analytics application that runs as a fenced cloud service. This means that it runs on just your data, and that data is separated from other companies' data, as much as anything is fenced in a cloud service.
So, if you trust company data separation in the cloud, you're just as safe using the IBM Cognitive Computing service as any other cloud service.
I'm not saying how safe I feel that is, however...
Re: And does anyone actually use this in Linux?
At work I support four separate HPC clusters. I have one virtual desktop allocated to each so that I can have all the windows on each cluster grouped together. When you have hundreds of nodes, most of which should be identical but often have their own specific problems, that grouping keeps things manageable.
I have another four, one for a full-screen mail session, one for a full screen web-browser (with multiple tabs), another for various monitoring tools, and one used for anything else that takes my fancy (typically local windows on my workstation).
Counting the open windows I have today (which has been a quiet day), I have 18 windows open, scattered across all 8 desktops. On busy days, I can have between 30 and 40 open windows. I can switch between workspaces easily and know that all of the windows open on one desktop relate to one particular facet of my work. I would hate to fit all of that into even 2 or 3 monitors, even if I were prepared to sacrifice the desk space.
I've been working in a similar fashion to this for nearly 25 years!
I use virtual desktops at home as well on my personal laptop, mainly to separate out different things I am doing at the same time. For example, at the moment I am working out how to typeset music while referring to on-line tutorials (full screen musical notation editor without intruding window decorations in one desktop, browser in another, rapidly switching between them by pressing two keys).
Honestly, unless you are incredibly single-minded and can really concentrate on just one thing at once, I believe that almost anybody could benefit from multiple desktops.
I was using vtwm on UNIX in 1990. Both CDE and OS/2 Warp had it from 1994.
Vtwm was interesting, because rather than separate 'desktops', what it gave you was a scrollable/snappable window over a much larger desktop than the size of the screen. This meant that you could have a huge window that you could move the visible screen over. Coupled with hotkeys to control the window manager rather than on-screen buttons, it made a very usable and flexible environment. I did find the source for it a while ago, and compiled it up again, but I'm afraid that I'm now corrupted by the need to support freedesktop extensions from more modern window managers.
And IIRC, the AT&T 5620 Blit had some rudimentary multi-view extensions to Layers in the mid '80s.
I can't remember whether the Sun 3 that I played with in the early/mid 80's had a virtual extension to SunView. I think that they preferred icon boxes to contain multiple minimised windows that you could open and close as a group.
Re: For heavens sake
He could do a James May.
Get someone to life-size the plastic kit, and have fun building it. He could also put 'himself' into the driving position.
Re: A year left to run on the EE contract?
Firstly, the shares will drop like a stone once the news is out that there is a brick wall ahead, and that may affect the way they can generate operational credit. A potentially solvent company operating without credit is doomed to fail (remember what happened to Woolworths).
Secondly, the current owners may want to bail out of the business, and this looks like a simple way of doing it while offloading the hassle of trying to find a new operating model to someone else. The current owners will just become creditors, and will either get some money back if it is wound up, or will get shares in the newly re-invented company if a new operational model can be found.
Re: Look at my own personal island from the skies...
Why did I think of Tracy Island when looking at the pictures?
Maybe the island, technology gained from Virgin Galactic, and his altruistic tendencies will enable him to set up International Rescue?
Hmmm. Not got enough children though.
Re: Not the Dart!
I followed the link, and then looked for the Dart.
It looks sort of smart from the outside, but the austerity of the inside is a bit bleak. But I think I would probably prefer it to the Goggomobile featured in the advert!
I think that the world could do with smaller engined cars
I'm currently doing a round-trip daily commute of ~90 miles in a three-cylinder, 800cc car, and apart from the fuel savings, don't really see the difference from a larger car.
Re: The BBC is really starting to piss me off. @bill 36
I think you need to understand how satellite transmission footprints work.
In order to be able to cover the whole of Europe, it would be necessary to transmit from several Astra satellites.
The move from Astra 2D to Astra 2E could have been for many reasons. Astra 2D may have been being retired (I know that it wasn't, but it could have been). The BBC's lease of the service on 2D may have expired and they were forced to move to a different satellite. Or maybe UK licence payers in the extreme north may not have been able to get a signal from 2D, but 2E covers them better.
Hmmm. As the BBC's mandate is primarily to provide broadcast media to the UK, the last appears to be a pretty convincing reason. It's enabled them to provide a service to parts of their core area that were previously not serviced.
To me, this seems entirely reasonable. What would you have wanted? That they increase the cost of providing the service by hogging satellite bandwidth, using channels on two satellites?
It strikes me that what expats are suffering from is collateral damage from an entirely justifiable action. Only if you could prove that the BBC did it solely to cut off people outside their core audience could you really claim that it was a deliberate TVWF infringement.
Re: Two important differences for Apple Pay
I've found a use for NFC. I have smart-tags scattered around the place that change the mode of my Sony Xperia depending on where I am.
When I'm in the car, it selects car mode, with big icons and the phone automatically in speaker mode. When at work, it turns the phone to silent mode with vibrate on.
And so on. I'm still finding uses for it, although setting up the actions is a bit tricky. It's really a useful feature, and doesn't appear to affect the battery life too much.
I might be being stupid here...
but... I cannot see anything in the article that suggests the replacement of IP. Indeed, the diagram still has IP listed in layer 2, along with (strangely) UDP. Extrapolating from this, what they may have done is eliminated TCP.
It looks to me like it is a super-network that sits above the network layer, probably as a way to make it network-independent. It's not in itself going to replace IPv4 or IPv6, which may exist for some time until some other alternative comes along.
Try a double-coated Husky. They never seem to stop moulting, and the soft under-fur is great at gumming up the brushes of a vacuum cleaner.
Re: UK too this winter
During "The Winter of Discontent", I used to do my homework by candle-light listening to Radio 2!
It was one of the reasons I asked for my own radio as one of my next birthday presents, just so I could listen to Radio 1 or Radio Luxembourg (I was too far away from the Thames Estuary to get Caroline).
If the lights went out now, I'd probably reach for the guitar and pick away for a few hours. I have a battery powered practice amp, so could even use my electric.
I think my kids would probably play "cards against humanity" or another card game for a while. They've also recently re-discovered board games.
Re: *All* TV programs?
Many is the time I've seen text and graphics on a monitor on the Tardis console generated by a BBC micro in old-era Dr. Who (mode 2/5 is a dead giveaway).
OK, I'll bet that the 'code' shown was nothing to do with the story, but there is a precedent for using a popular micro like the RPi in Dr Who.
What "New Series" are you talking about? If it's Series 8 of the New Era, then that's not surprising, it was the first episode.
If it's the New Era itself, starting with Christopher Eccleston, then you cannot really categorise it as a single "series", seeing how variable it has been.
I hope that they can bring it back from the travesty I feel it had become with Matt Smith as the Doctor, but I fear that the problem now is the lack of imagination of the writers. The last seriously good episode in my opinion was "The Doctor's Wife", which was written by Neil Gaiman, not one of the stock writers.
It's a shame no-one complained about the story!
Derivative, to say the least.
In my view, this was NOT a good start to Peter Capaldi's term as The Doctor.
At least it makes it easy to get better.
Russian sanctions against European sanctions about Ukraine
"Nice satellites you've got there, tovarishch. Would be a shame if they ended up in the wrong orbit, nyeht!"
Re: Good, but Banana Pi is the better beasty. @Gert
Only 30 addresses? My DHCP server struggles to allocate addresses even though it has ~100 to play with (the other 100+ addresses are in reserved ranges for static IP addresses). And I have used something like 30 of these static addresses for machines I want to have fixed addresses - like the main laptops for each of the kids so that I can monitor/arbitrate who is using the most traffic as well.
I have seven adults in the house, with WiFi mobile phones, tablets and eBook readers, laptops and larger gaming rigs. Add to this all the consoles and hand-held games, set-top boxes, and a smattering for the infrastructure devices (WiFi hubs and routers) and we've used up a significant part of a Class-C subnet just in one house! I'm really not looking forward to transitioning IPv6 (I'll probably set up an IPv4 island when I have to!)
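The arithmetic is easy to sanity-check with Python's ipaddress module. A rough sketch (the reserved-range numbers here are illustrative, chosen to match the scale described above, not my actual configuration):

```python
import ipaddress

# A Class-C-sized (/24) home network, as described above.
net = ipaddress.ip_network("192.168.1.0/24")

usable = net.num_addresses - 2            # drop the network and broadcast addresses
static_reserved = 100                     # addresses held back for fixed IPs (illustrative)
dhcp_pool = usable - static_reserved - 1  # -1 for the router itself

print(f"{usable} usable addresses, {dhcp_pool} left for the DHCP pool")
```

With ~100 addresses reserved, a busy multi-person household full of phones, tablets, consoles and infrastructure really can eat a significant chunk of what remains.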
Re: @VinceH (@Peter Gathercole)
The ATPL board had some jumpers, but I think that it synthesized a write enable from the address bus.
As a result, you could not move the various buffers (like the disk buffers) or workspace for DFS into sideways RAM. There were hacked DFSs (I think the Watford DDFS was one) that could work in shadow mode, but it did that by changing the addresses of the buffers in the code, not re-directing the addresses.
The Solidisk board for the Model B was more sophisticated, but I believe that it required a wire either inserted into one of the chip sockets in parallel to the chip pin, or a fly lead soldered to the board.
The way that the ATPL add-in worked was basically that any write to an address above &8000 got directed into the (single) bank of static RAM, regardless of the ROM select register. Some ROM providers got canny to this, and during ROM setup, would do a write to overwrite some of the ROM image (Wordwise was the first one that I came across) to cause the initialisation to crash the BEEB if it was running the ROM image from RAM. This could be prevented by adding a switch to the write-enable line of the static memory (there was a solder link and pads for a switch on the ATPL board) that would disable the writes to the RAM. The sequence would be load the image, write protect the RAM, and reset the BEEB (in fact you did not need to reset the BEEB, there was an OSCLI call to initialise the new image - something I used to enable switching between the runtime and compile ROMs in RAM of the Acornsoft ISO Pascal system, which came as 2 ROMs).
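A toy model of that decode logic (my own sketch in Python, not actual BEEB code): any write into the sideways address range lands in the single static RAM bank regardless of the ROM select register, which is exactly what a Wordwise-style self-corrupting write exploits, and exactly what the write-protect switch defeats.

```python
class SidewaysRAM:
    """Simplified model of an ATPL-style single-bank sideways RAM board."""
    def __init__(self):
        self.ram = bytearray(0x4000)   # 16K static RAM bank at &8000-&BFFF
        self.write_protected = False   # the switch added to the write-enable line

    def write(self, addr, value):
        # Writes to &8000-&BFFF always hit the RAM bank, whatever ROM
        # bank is selected for reads -- unless the switch is thrown.
        if 0x8000 <= addr < 0xC000 and not self.write_protected:
            self.ram[addr - 0x8000] = value

board = SidewaysRAM()
board.ram[0:4] = b"WORD"           # pretend a ROM image has been loaded into RAM
board.write(0x8000, 0x00)          # ROM's self-overwrite during initialisation
print(bytes(board.ram[0:4]))       # image corrupted -- init would now crash

board.ram[0:4] = b"WORD"           # reload the image...
board.write_protected = True       # ...throw the switch before re-initialising
board.write(0x8000, 0x00)
print(bytes(board.ram[0:4]))       # image survives intact
```

The load / write-protect / re-initialise sequence described above maps directly onto those last four lines.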
Back to Wordwise, when I got a Master 128 (at work), which did not have a write defeat switch for the sideways RAM, I hacked Wordwise to remove the offending code in the image to still allow it to work. Not that I used Wordwise. If I was using the BEEB as a word-processor, I preferred View, but if I just wanted an editor, I used the one built in to the ISO Pascal runtime. Most of my documentation was actually done on my (well, work's, but I was the sole sysadmin, so it was "mine") UNIX box using nroff and a Qume Sprint 5 daisy-wheel printer.
The shuffling of the programmes down was something that was done way before the B+ or B+128. You would load a small piece of machine code into the cassette buffer or somewhere, *LOAD the cassette image into a higher memory, and then move the data down before changing the video mode.
Some of the ROM toolkits did this for you. I think that both DISK DOCTOR and the ROM based BEEBUG monitor had this feature.
What the B+ and B+128 did do, however, was allow the disk subsystem to use 'shadow' memory for the various disk buffers, meaning that PAGE remained at &0E00, rather than the &1900 that was normal for a machine with Acorn DFS on either the Intel 8271 or WD1770 disk controllers, or &1A00 for a system with disk and Econet, or &2100 (I think) for a system with ADFS (yes, you could get ADFS for BBC Model B's, it was used to run the 10MB hard-disk in a Level 3 Econet server).
They also moved the screen into shadow memory so that memory up to &7FFF was available regardless of the screen mode. The primary use for the extra 64KB of memory in the B+128 was to hold RAM copies of sideways ROM packages. I have an ATPL Sideways RAM board for that (but only 16K of static memory) so I never invested in a B+ or B+128, or a Solidisk add on shadow RAM board.
Must have a play again sometime.
Re: @LDS - Not sure what you mean. @oldcoder
The first UNIX system I ever used had 2 RK05 cartridge disks, each 2.5MB in size, and 128KB of memory (this pre-dated the PC by several years). It was never about the size of the disk, it was about the speed of the disk and the model used for running commands, especially if they were chained together in a pipeline.
I used a system that had a minimal UNIX-like OS (it was so similar, I wondered whether it was a direct port of V6) on two floppy disks. One was the system, and the other was used for user/application data including the pipe files (if you remember back as far as UNIX Version/Edition 6/7, you will remember that unlinked files were used to keep the data that was in the pipeline).
The amount of thrash that went on between the two disks whenever you ran something as simple as "ls -l | more" (IIRC it was a port of UNIX V6 with some BSD 2.3 enhancements, possibly called IDRIS) was more than anybody could bear, and for these systems, you could only really use the OS as an application launcher, not in the way that a UNIX power user would use it.
AFAIK, all systems that Ken worked on either had Core memory, which was persistent and had the OS loaded from paper tape or DECtape, or had hard-disks. There were no floppy based UNIX systems at Murray Hill.
PDP11s (except for the very smallest ones) had MMUs that allowed them to address up to 256KB or 4 MB of memory dependent on which model they were.
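The address-space arithmetic behind those figures (my own note): each PDP-11 process saw 16-bit addresses, but the MMU mapped them onto a wider physical bus, 18 bits on the Unibus models and 22 bits on the later ones.

```python
# Physical address space for each bus width, in KB.
for bus_bits in (16, 18, 22):
    print(f"{bus_bits}-bit addressing: {2**bus_bits // 1024} KB")
```

Which gives 64 KB with no MMU at all, 256 KB for the 18-bit machines, and 4096 KB (4 MB) for the 22-bit ones.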
@LDS - Not sure what you mean.
I am positing a world in which PC-DOS was never provided by Microsoft. If CP/M-86 had been the OS for the IBM PC, then MS-DOS, OS/2 and Windows would never have happened, and the PC would have evolved into a multi-tasking, protected-mode machine as the hardware became cheap enough, because the rudimentary features were already in CP/M-86. With a proper multi-tasking OS, a windowing desktop would have followed quite naturally.
I've deliberately not mentioned UNIX, although it has been my career, because I'm well aware that in the early '80s, the requirement for a hard disk that UNIX has would have prevented it from appearing on commodity hardware.
Yes, I admit that some historic features of UNIX may be undesirable, particularly the security model, which is effective but probably too simplistic for what is required today, but I would again suggest that if UNIX had been more prominent outside of the server room, there would have been more pressure to modernise some of the least desirable features of UNIX. In some respects, UNIX is a victim of being as good as it was when it was written. It's been just about capable as written, so people were able to work around problems, never requiring a significant re-write.
Nice to see the PDP-11 architecture being used as the reference for mini-computer memory management. Should always be regarded as a classic architecture.
But the final analysis is flawed. There were micros with MMUs available when the IBM PC was produced. There were MMUs for 68000s and Z8000s that would have allowed proper protected-mode OSes like UNIX or Concurrent CP/M-86 to run on the desktop. They were, however, too expensive for the types of machine that IBM envisaged (single-user, single-task machines that worked like Apple ][s, but with a more 'modern' processor). Cost and maximising profit were the main causes of using poor hardware that did not have the required capabilities for security.
It was a failure of imagination that led to the development of the IBM PC and PC/MS-DOS in the first place, and once out there, nobody was going to be able to shake the dominance of these platforms on the desktop, even though they were technically flawed and limited, even when they were new.
Imagine if Gary Kildall had actually met IBM and agreed to supply the OS for the IBM PC. I'm absolutely sure that with a CP/M-86 derived OS, multi-tasking, potentially multi-user, running protected-mode processes under a supervisor-mode OS would have appeared in desktop machines way before WinNT.
Windows even now is still living with the legacy of poor design decisions taken in MS-DOS and early versions of Windows, which persisted well into the times of hardware (and indeed Windows core security capabilities) capable of running properly protected.
Re: Incredible Business Machines @naive
I would not say that it was only disks and tape drives that broke. I've been involved with many other hardware failures across the spectrum, but the one thing RS/6000/pSeries/Power systems will do is actually tell you what is most likely to have failed.
It also had (actually, still has) very good hardware diagnostics (for AIX systems) to back up the POST and BIST checking, although almost everybody has forgotten them. Add in the HMC call-home and remote console functions that were added somewhere around the millennium for the pSeries systems, and you have a platform that is robust, stable and supportable, and is IMHO still best-of-breed (of the UNIX systems) when it comes to running a service.
Re: Standard Windows timings @kain preacher
And people say that changing settings in Linux is obscure and convoluted!
I know, I just could not resist the double-entendre.
Re: Waggle Worry
Chances are that the canards won't have a huge effect at the launch altitude because of the rarefied atmosphere.
They will come into their own as the 'plane descends.
"servers nearing end of life over the next five years."
That would probably be all of the servers that they are currently running, as most companies depreciate capital assets including IT over a period of 3-5 years.
And that's just the financial side. At the current rate of change, they would be technically obsolete before then.
BTW. My home IT infrastructure is built on obsolete or discarded systems, so if anybody wants to get rid of their working 5 year old Xeon or Core Quad system, I would be quite happy to discuss giving it a home (running Linux, of course).
Re: It used to be women
Harmful emissions from CRT tubes wasn't all nonsense, it's just that the concerns persisted well beyond the point where they were relevant.
Shooting high-power electron beams in the direction of people, even with some form of screen between the beam generator and the viewer, did result in various types of radiation, from visible light through to X-rays, low-energy beta and possibly even alpha particles or fast ions.
Very early CRTs probably did emit small amounts of harmful radiation. But by the time they were commonplace in offices and homes, the problems were sufficiently well understood that any alpha (which were probably stopped by the glass anyway), beta and even X-rays were being blocked by coatings on the glass or diverted away from the person sitting at the screen. There is not enough energy in the electron beams to generate gamma radiation.
So any terminal/monitor made after the late 1970s was not a problem, but the information persisted.
An interesting page is the description of the stickers on the tubes of Lear Siegler ADM3a terminals (an early glass TTY) at http://www.tentacle.franken.de/adm3a.
The interview on Radio 4 this morning was talking about posting video clips obtained from television coverage onto YouTube or other social media. With no fair-use provision in UK copyright law, any video obtained from transmitted material that is redistributed is a breach of copyright, unless allowed by a specific waiver of copyright.
What is not copyright breach is using a phone in the ground to record part of the match, and then posting that. That may breach the terms and conditions of the ticket, but would not be a copyright offence (unless the owners of the advertising objected to that appearing - but they'd be stupid to complain about wider distribution of their adverts!)
Re: But this has four!!
You forgot longevity. A pen with only one colour probably has more of that ink, and will last longer.
Another similarity to WinXP!
"Degrees make sense though"
While this is generally true, it depends on what counts as a degree.
I used to think that a degree meant that the person had succeeded in achieving an advanced qualification, one requiring learning, diligence and often independent thinking, without being watched all the time (as at school), while exhibiting restraint against the worst excesses of being free from parental oversight.
This lack of oversight was one of the primary differences between universities and polytechnics. Polys kept a close watch on their students, and offered better support services to advise students and keep them on their courses. Universities often just let the students sort themselves out, or fail.
Nowadays, it seems to me that students are given subjects that are less rigorous, and also have much better support services that attempt to prevent the students from failing. This means that University is much less academically and personally demanding (although I acknowledge that there are financial pressures), resulting in the value of a degree being diminished.
I know that I am generalising. I'm sure some universities are still turning out excellent graduates. But many aren't, and this means that industry no longer values a degree as a guarantee of certain qualities, and that is what is damaging.
Bring back the rigour that a degree used to represent, and I will agree wholeheartedly with your statement.
P.S. I graduated in 1981 from one of the long-established universities, after nearly failing my degree at the end of the first year. The fact that I nearly failed was scary, and taught me a lot, and I believe that it enhanced my resultant work ethic and character.
The large increase in numbers of students
is actually because the Government believed the crap they were being fed about more graduates in the job market leading to higher productivity and a move to a skills-based economy.
They forgot that in order to have that number of students, it was necessary to have courses that kids wanted and were able to do, and that led to degrees in the most unlikely and useless subjects. To cap it all, they turned all the Polytechnics, which were producing useful people with lesser qualifications suited to industry, into second-class universities (I worked at a Poly before the switch, and it was excellent at what it did, but that was not turning out degree graduates). Couple that with the travesty that is a "foundation" degree, which pollutes the meaning of a degree, and it's a real mess.
And then, because there were more students, they could not afford the grant system, so introduced loans, which are not saving *any* money because of the poor rate of pay-back (often graduates never pass the threshold at which they start paying the loan back, because they are not using [or can't work in the field of] their degree).
We need to go back to Universities being elitist, turning out the right number of the right people for the jobs that really need a degree, and move back to apprenticeships and on-the-job training for the majority of young people. Competition for fewer university places means that those that want to go work hard at 'A' levels, and stay working hard to keep on the courses.
Re: to counter mr mugger, you need a panic PIN
What should be done is to make it so the panic PIN works in the hand-held devices, and dispenses money the first time it's used in an ATM while alerting the bank and the Police. The mugger won't know that they don't have the proper PIN, and hopefully will release the victim.
The bank can then flag the card to cause any ATMs to go out-of-service (rather than declining the card) whenever the card is used again, hopefully leading the mugger to be unsure whether the card has been blocked (in case they demand that a second transaction is done by the victim), or whether the ATM is truly faulty. All the time, you pass the location on to the police whenever the card is used.
The customer and the bank may argue who pays for the first cash withdrawal (the bank will want to make sure that it really was a withdrawal under duress), but that should be a small problem.
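The flow described above can be sketched as a small state machine (entirely hypothetical logic, my own illustration, not any real bank's implementation):

```python
class Card:
    def __init__(self, pin, panic_pin):
        self.pin, self.panic_pin = pin, panic_pin
        self.flagged = False   # set once the panic PIN has been used

def atm_transaction(card, entered_pin, alert_police):
    # Normal PIN on an unflagged card: business as usual.
    if entered_pin == card.pin and not card.flagged:
        return "dispense"
    # Panic PIN: silently alert, and dispense on first use only,
    # so the transaction looks normal to the mugger.
    if entered_pin == card.panic_pin:
        alert_police("panic withdrawal, send location")
        if not card.flagged:
            card.flagged = True
            return "dispense"
    # Any use of a flagged card: feign an ATM fault rather than
    # declining, so the mugger can't tell the card is blocked.
    if card.flagged:
        alert_police("flagged card used again")
        return "out of service"
    return "declined"

alerts = []
card = Card(pin="1234", panic_pin="4321")
print(atm_transaction(card, "4321", alerts.append))   # first panic use
print(atm_transaction(card, "4321", alerts.append))   # subsequent use
```

The key design choice is that the flagged card is never "declined": an out-of-service screen is ambiguous, which is what keeps the victim safe.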
Re: Metaphor @Stopeshop
The original question was "Can you tell me which other OS was ported twice to another processor architecture?"
It said nothing about serial ports.
I admit I got it wrong about MacOS. Maybe I should have said NeXTstep(68000)->OSX(powerpc)->OSX(x86-64)!
I think that if you look at the myriad of Linux ports out there, you will find one that is not one port away from x86 anyway.
And I know there are a lot of UNIX ports out there, but how about AIX(ROMP)->AIX(POWER)->AIX(IA64 - although did not last long), and along the way there were s370 and x86 ports as well.
Ummm, off the top of my head.
BeOS (may be stretching things here)
It's more common that you might think.
Re: hmm POSIX
I'm constantly infuriated by my Linux colleagues who assume that Linux is a POSIX compliant operating system, and that anything written for Linux can be easily backported to UNIX or other POSIX compliant operating systems.
I currently work supporting an AIX HPC in an environment where Linux is used extensively for other data manipulation and modelling work. I keep getting questions like "Why is Linux tool X or Y or Z not available on the HPC", and I have to patiently explain that because the tool requires the complete KDE or Gnome environment, or relies on dbus or udev or KMS, none of which are in the POSIX standard, or any number of cumulative package dependencies, a back-port is almost impossible.
They cannot see that Linux has done the Embrace and Extend, and is well down the Extinguish path against UNIX and POSIX in a manner that would make Microsoft proud.
And I would not mind too much if there was a new POSIX standard that was extended to specify parts of the Linux and GNU tool chain that genetic UNIXes could be extended to include, but there is no such thing! There was the LSB, but that's an unmaintained standard that everybody ignores.
There is no workable Linux standard! And to cap it all, there is almost no Linux distribution that has even got full POSIX 1003 compliance, much less the more recent UNIX V7 <rant>(FOR GOD'S SAKE - UNIX V7 ALREADY MEANT SOMETHING! COULD THEY NOT HAVE USED ANOTHER NAME!)</rant> standard.
UNIX is standardised. Linux is not. Linux should work like UNIX, not the other way round.
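A tiny example of the gap (my own, not from the post): scripts written against bash-on-Linux routinely break on a strictly POSIX /bin/sh such as AIX's, even for trivial things.

```shell
#!/bin/sh
# Bashism: [[ ... == pattern ]] is not POSIX. A portable test uses case:
#   if [[ $(uname -s) == Linux* ]]; then ...   # breaks on a strict POSIX sh
case "$(uname -s)" in
    Linux) echo "running on Linux" ;;
    AIX)   echo "running on AIX" ;;
    *)     echo "running on something else" ;;
esac

# Bashism: ${var,,} lowercasing is bash 4+, not POSIX. tr is portable:
name="HPC"
echo "$name" | tr '[:upper:]' '[:lower:]'   # prints: hpc
```

Neither snippet needs anything outside POSIX, which is exactly the discipline that "written for Linux" code rarely keeps.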
Still I think I approve of the extended lease of life for VMS.
Re: biggest reason NOT to vote tory this election
This is what the previous administration was attempting when they designed the databases to back up the Identity Card scheme that the Conservatives were so keen to put down. By adding a super-key associated with someone's identity to all the other databases, it would have enabled them to join together disparate information sources however they wanted.
They tried again in 2009 with Clause 152 of the Coroners and Justice Bill.
I seem to remember one "David Cameron" was particularly keen to oppose the measures.
I'm sure every Government wants to do this, but there are safeguards called Information Sharing Orders that deliberately restrict how government departments share data so as not to upset the citizen vs. state balance. If this plan is implemented, they will be tearing up all of these, to the advantage of the state against its own citizens.
Re: Doom for US tech companies
This is interesting. What about US companies operating government contracts in other countries.
For example, in the UK, IBM run parts of the IT for the DVLA, the ID and Passport Service, parts of DEFRA, and probably other government or civil service entities. I think HP has a strong relationship with the Inland Revenue, and I'm absolutely certain that there is one or more US company associated with running the NHS IT systems.
And the UK Government has said that it intends to use Office 365 (although how that sits with the ODF statement recently, I don't know).
Re: USB Firewall
I ought to point out that on Linux it is perfectly possible to whitelist via your udev rules so that only known devices (manufacturer, ID and function) can be configured.
Of course, this will not prevent a device masquerading as another by using the ID strings of another device, but it would make the attack surface much smaller in that the miscreant would have to know which devices are allowed.
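As a sketch of what such a whitelist might look like: udev can set the kernel's per-device `authorized` attribute, so newly attached USB devices are refused by default and only known vendor/product IDs are enabled. The IDs below are placeholders; you would substitute the output of `lsusb` for your own hardware.

```
# /etc/udev/rules.d/99-usb-whitelist.rules (illustrative sketch)
# Deny-by-default: de-authorize any newly added USB device...
ACTION=="add", SUBSYSTEM=="usb", ATTR{authorized}="0"
# ...then re-authorize only known devices by vendor/product ID.
# 046d:c31c is a placeholder ID -- replace with your own from `lsusb`.
ACTION=="add", SUBSYSTEM=="usb", ATTR{idVendor}=="046d", ATTR{idProduct}=="c31c", ATTR{authorized}="1"
```

As noted below, this only raises the bar: a malicious device can still spoof a whitelisted ID, but the attacker now has to know which IDs are allowed.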
The other thing that I'm spotting here is a suggestion that the code in the USB device could examine the system. I'm not sure whether that is possible, particularly if it is appearing as a keyboard. Flow of data is largely one-way for a keyboard. Those that offer programmability in the hardware (gamers' keyboards, for example) generally appear as more than one USB device anyway, with the non-keyboard device being used as a control point for the controller generating keyboard scan codes. You could block all but the keyboard device.
If it is configured solely as a keyboard, I don't think that the OS would send any data to it for it to be able to look at the system. At least not for a USB device. If it were FireWire, then all bets would be off.
Re: Cut or compress
In order to axe all of the repeats, it would be necessary to produce many times the current amount of new programmes. This would mean either programmes created with extremely low budgets, or the cost of watching TV increasing significantly, whether directly through subscriptions or through more advertising.
Face it. All the time that there is something like the current airtime available, repeats will happen.
I think that there may be scope in eliminating some of the channels. Maybe set a limit of a dozen channels, but make sure that they cover a wide spectrum of quality programmes to appeal to a broad audience.
Re: It doesn't matter how good the display is if there's nothing to display
I'm going senile, obviously!
The TV manufacturers want a repeat of the 'flat panel' effect. It won't happen.
For most people, TVs are a long-term purchase. Provided it still works, they would not normally consider replacing them.
Flat-panel TVs, once they became cheap enough, shifted the paradigm. People replaced perfectly functional CRT TVs, not particularly because the picture was better, but because flat-panel TVs occupy much less space than a CRT. Couple that with a significantly reduced power consumption for LCD TVs at a time when people were being made energy aware, and the CRTs went down to the recycling centres by the truckload. That enabled people to reclaim space in their living rooms so that the TV was no longer the major piece of furniture it had been, feel good about reducing their energy footprint and, by the way, have 'better' pictures (although I still know people who prefer high scan rate CRT TVs over flat-panels).
This was reflected in how fast CRT TVs disappeared from the shops once flat-panel TVs got to within spitting distance of the price of CRTs. And often, it was not the high-cost TVs that generated the profits. It was the wholesale replacement of hundreds of thousands (millions?) of TVs with low-to-midrange price tags that earned the money.
We won't see this happening again unless there is some overwhelming technology leap that provides a must-have feature. 3D and 4K are not that, and I can't really see anything on the horizon that would. Maybe a virtual floating screen so that you don't even need to dedicate wall space, but I doubt that is within current technology.
Planned obsolescence is the manufacturers' best bet to keep TV sales ticking over (maybe that is why they use such damned poor Chinese capacitors - the single most common cause of TV failure), but I'm sure if it was revealed that this was a deliberate policy, the consumer groups would be up in arms!
Re: It doesn't matter how good the display is if there's nothing to display
Sky in the UK delivers 1080i60 (at least that's what my telly and Wikipedia say). That's an interlaced 1920x1080 image delivered as 60 fields a second, so two successive fields make up a full 1920x1080 frame, effectively halving the frame rate to 30 full frames a second (most televisions de-interlace such a signal by weaving the 'odd' and 'even' line fields into a single frame, and actually displaying it at half the field rate).
This means that in most cases, provided that the original was shot at 30 frames per second (and most made-for-TV programmes are), there should be no effective difference between 1080i and 1080p (each pair of adjacent 1080i fields reassembles into one of the original frames). Of course, any material shot at the full 60 frames per second will suffer de-interlacing artefacts when transmitted at 1080i.
You can also get frame-rate conversion artefacts (judder) if the original was shot at 24, 25 or some other number of frames a second. There will be some of this type of error whenever the original frame rate does not divide evenly into the display rate.
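The 24-frames-per-second case is the classic example: the usual fix is "3:2 pulldown", where each film frame is held for alternately three and two fields, so frames are not all shown for the same length of time. A minimal sketch of the mapping (my own illustration, not any particular broadcaster's implementation):

```python
# Sketch: 3:2 pulldown -- mapping 24 fps film frames onto 60 interlaced
# fields per second. Each source frame is held for alternately 3 fields
# and 2 fields; the unequal hold times are what causes visible judder.
def pulldown_32(n_frames):
    fields = []
    for i in range(n_frames):
        hold = 3 if i % 2 == 0 else 2   # alternate 3-field and 2-field holds
        fields.extend([i] * hold)       # field i shows source frame i
    return fields

fields = pulldown_32(4)
# Four film frames occupy 3+2+3+2 = 10 fields: [0,0,0,1,1,2,2,2,3,3],
# so 24 frames fill exactly 60 fields, i.e. one second of 1080i60.
```

25 fps PAL material is typically handled differently (sped up by 4% or converted with more elaborate blending), but the underlying problem is the same: the source rate does not divide the display rate.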
Re: ODF is not open source @J.G.
You obviously don't mean that you save it from View on your BBC. As far as I'm aware, all support and updates for that stopped at least a decade before XML and ODF were defined.
I'm supposing that you are using something that understands View format (as it is from a much earlier age, it's a much simpler format, and one that probably leaked in its entirety into the public domain), and can write ODF.
I don't appear to have any View files convenient at the moment, but the version of LibreOffice I have installed does not appear to have explicit View support, although it may be there.
Re: Short-term vs. Long term
I was of course talking about a term of a government mandate, i.e. the time between elections, not the overall length of government.
What I was alluding to is the fact that this could be being done to go in their election manifesto as propaganda. If it wins them votes, then they benefit, and can work out whether it was a good idea or not, but they're still in power. If they lose, then it's not their problem anyway.
Short-term vs. Long term
Remember that UK Governments last no more than 5 years. This means that a wholesale switch to SaaS will show expense removed from the balance sheet before the next election.
The ongoing costs will be the problem of the next administration. Like with PPP and PFI.
You may also find that software counts as a Capital expenditure, so reducing that is also a win (when presented to the weak-minded electorate) for them in apparently reducing the costs of Government.
It's all a bit smoke-and-mirrors.