1391 posts • joined Friday 15th June 2007 09:17 GMT
Any reasonable system should run its internal clock on UTC all the time, regardless of timezone and whether DST is in effect, and just alter the presentation of local time according to its location, so 'adjusting' the clock should never be required. You should never have the hardware clock changed on a correctly configured UNIX or Linux system, except to take into account leap-seconds or clock-drift.
This has always bugged the hell out of me when using a dual boot Linux/other-OS PC. Linux worked just swell changing the presented time according to DST, and then you boot your other operating system that ALTERS THE FREAKING UNDERLYING CLOCK!!!
When I put Ubuntu Lucid Lynx (10.04) on my Thinkpad last year, I was expecting this, but found that some misguided bright spark has added code to Linux (probably somewhere other than at Canonical) that actually expects the underlying clock to be changed incorrectly by the other OS, and then 'breaks' the working traditional UNIX/Linux time support to take this incorrect clock setting into account. Talk about working around someone else's errors.
Good for all the people who want it to just work and regularly boot both OSes, bad for anybody who actually understands what should be going on. Drove me nuts for an hour or two.
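The UTC-internal, convert-for-display model can be sketched in a few lines of Python (the zoneinfo module is assumed available; the dates are just examples either side of the UK's March 2011 clock change):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

london = ZoneInfo("Europe/London")

# The stored (kernel) value is always UTC; only the rendering differs.
# UK clocks went forward on 27 March 2011, between these two instants.
before = datetime(2011, 3, 26, 12, 0, tzinfo=timezone.utc)
after = datetime(2011, 3, 28, 12, 0, tzinfo=timezone.utc)

print(before.astimezone(london).strftime("%H:%M %Z"))   # 12:00 GMT
print(after.astimezone(london).strftime("%H:%M %Z"))    # 13:00 BST
```

The stored UTC values never change when DST starts or ends; only the presentation does, which is exactly why the hardware clock should never need 'adjusting'.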
Hang on a sec. Back to iOS. Isn't it a UNIX/BSD derived OS?......
The trojan is an ELF executable, presumably for whichever processor runs in the Dlink router, but the vector to get it in there would appear to be a compromised MS Windows system that then attempts to brute-force access to the router. So there are actually two components, one of which infects a windows system, and the second of which is installed on the router by the first.
Thanks for explaining, although I did post this as a springboard to get replies.
As I have supported diskless UNIX systems for several years in the past (and will be again very shortly), I do understand about sharing a system image (which, incidentally, on Windows breaks a whole host of software unless you jump through hoops to redirect stuff away from the C: drive, which will be read-only, to somewhere else - personal experience of pain here), and also identical hardware on the desktop. It's not a new technology except to Windows shops.
Citrix, VMWare and Microsoft are waaaaaaay behind the curve here compared to UNIX, both in diskless and remote display, and I have to feel that bending current Windows to make it fit into a diskless/remote display model is the wrong way to go about it. Better would be to have made a 'new' Windows with native thin client support and some compatibility with 'old' Windows, than to use a crowbar on the existing models. After all, MS did product-switch before with NT. Maybe Longhorn should have been this, but they apparently could not get it to work without ex-DEC system architects and IBM's assistance (WinNT history 101).
And I did talk about de-duplication, which is effectively what shared image is all about, and I did also talk about low-power, diskless desktop display systems, but after a quick search, the only people I could find selling them were Wyse, who sell a diskless system running Windows CE for about the same price (once you factor peripherals in) as a basic PC. Many people in the past tried diskless PCs, and almost all of them are now NOT doing it (the earliest I remember was DEC Pathworks, which had diskless DOS systems with a network filesystem).
My closing comments about having been here before with other architectures still stand IMHO. I still think we have been here before, and I also still think that the current in-vogue implementations are flawed and designed to maximise revenue for suppliers rather than provide a good environment for customers.
Does not affect LTS
LTS releases will still be 12.04 and probably 14.04. No difference.
6.06, 8.04 and 10.04 (Dapper, Hardy and Lucid) were for all versions (server, desktop, netbook), and that will not change.
So let me get this straight
You put all of your PC desktop images into large servers held in the data centre.
You then use something on each desktop to run a virtual session to those large servers.
What are those devices on the desktop? Oh yes, PC's.
I know that the devices on the desktop will be cheap/low power PC's, but bearing in mind how powerful even a basic PC is nowadays, where is the saving?
If you were to sell it to me as an administrative saving, or a deployment cost saving, or even as a data de-duplication saving, then I may be interested. But as a power saving?
Of course, if the desktop devices were diskless, low-power-consumption (ARM-type power) real thin clients then this may make sense, but we've been here before, and commodity PCs always undercut specialist net devices (where are Tektronix, Oracle, NCD et al. with their thin clients, Netstations and X-terminals now? Oh yes, out of that business). The cost ends up being the screen, keyboard and I/O devices, not the PC itself.
Where savings are being made at the moment is that older low-power PC's are being used as the access devices, but this is unlikely to give you a power saving, and is not going to be a model for phase 2 and later roll-outs!
@HMB - Word of advice
Check the native resolution of the display panel in the TV (it should be printed on the box or in the instruction manual). Too many TVs on the market today (and not always just the cheap ones) claim to be 1080p and will accept a 1080p signal, but will then downscale it to 1366x768 or 1680x1050 or whatever their native display panel can do.
I had a big argument with a major on-line retailer about this when the published resolution for a TV I bought from them was wrong on their website, and they were extremely slow to accept the fault. Even then, I needed to go through their onerous RMA process, which takes about 2 weeks, before they would refund.
I actually went through forcing the driver to override the EDID value read from the TV to prove the case, and actually at the end of the day concluded that using the VGA port rather than the HDMI or DVI port was far more flexible and gave more control.
I've just re-read my post, and it's odd.
I've written about fictional things, set in the future, but described them in the past tense!
It seemed right at the time, but it feels funny now. I think the common oxymoron is "a future history".
...there is no guarantee that the overhaul that Obama wants is the one that most of the commenters here want. He may just make it easier for Big Business to get their patents through. It depends on whether the problem he perceives is that the patents are not working, or that they take too long to grant.
I suspect the latter (him being lobbied by the people with the deep pockets), which will probably make things worse from our perspective, not better.
"I always get the impression the utilities were written by clever people who just happen to be lazy and can't spell properly"
Actually, it was more like the Teletype ASR-33's in common use as terminals back when UNIX was being formed being sooooooooooooo sloooooooow, that people abbreviated commands and flags so that they could work reasonably quickly. It is also the reason why ed is incredibly terse, but ultimately very powerful as a line editor. I recommend to every serious UNIX/Linux user that they learn ed, just so they can use vi and sed effectively.
Another explanation is that they were lazy, a positive attribute for all good system admins. (do what's needed efficiently and with the least effort). At least, that's my opinion.
@AC re. Maybe it's just diesels
"lazily cruise along singing along to Radio 4"
What do you listen to on Radio 4 that you can sing along to?
The only thing I can think of is Desert Island Discs, and you don't even get the whole tracks on that. Or maybe it is the music rounds of "I'm Sorry I Haven't a Clue", such as "Pick Up Song", which I admit is difficult not to sing along with, but which only lasts for a minute or two!
I find (especially listening to the Today programme interviewing politicians) that I get angry, and end up shouting, impotently, at the radio!
ST:NG Enterprise-B, C and E not lead ships
NCC1701-B was an Excelsior (updated) class,
NCC1701-C was an Ambassador class,
NCC1701-E was a Sovereign class.
Only the NX-01 (ST:E) was the lead ship of its class.
Anorak please. Yes, with the Star Trek patches sewn on the left breast.
The downside of the VMS DCL command line model was that for every command, you had to add entries to the DCL command dictionary, whatever that was called (it's been a long, long time since I did any DCL configuration, and much of that was on RSX/11M not VMS). All of the command line parsing and in-line help that allowed abbreviations was done by the DCL command processor, although I do believe that it passed any unprocessed arguments through to the program as a last resort.
One of the most useful and irritating features (both at the same time) was that UNIX-like operating systems left that to the commands themselves, although if you look at ancient UNIXes, there were very strong conventions that should have been followed (if you look back at the USENET archives, there are many quite animated discussions about whether the shell should do more command line processing than it did/does). This meant that you could easily add commands to UNIX, and that internal shell commands, functions, aliases and external programs appeared almost seamless.
One of the quirks, however, was that some of the ancient commands, many still used today, never did abide by the conventions (the most obvious of examples of this are the "dd" and "find" commands, that have been there almost since Epoch began, and never did conform).
Over time and across UNIX versions, the conventions broke down. One-character flags gave way to words, and as soon as that happened, you had to code around whether an argument like "-Fart" is equivalent to "-F -a -r -t", or whether it is a wordy argument which actually tells the program to break wind (this is a quick example off the top of my head from the "ls" command. There may be better ones).
Then the BSD/GNU people got involved, and introduced what I regard as the abomination of the -- (double minus) argument, supposedly to allow the syntax to be extended, but actually misused by many post-Linux developers to completely replace the original UNIX flag processing implemented by getopts.
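The bundling ambiguity can be demonstrated with Python's getopt module, which follows the classic getopt(3) conventions (the flag names here are the joke ones from above, not real options of any command):

```python
import getopt

# Under single-character rules, "-Fart" is just four bundled flags:
opts, args = getopt.getopt(["-Fart"], "Fart")
print(opts)    # [('-F', ''), ('-a', ''), ('-r', ''), ('-t', '')]

# GNU-style long options live in a separate namespace behind "--",
# so "--fart" is one wordy option, not four letters:
opts2, args2 = getopt.getopt(["--fart"], "", ["fart"])
print(opts2)   # [('--fart', '')]
```

The same input characters mean completely different things depending on which convention the program chose, which is exactly the per-command guessing game described above.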
Couple this with the fact that often the GNU and GNU-like commands were never documented by proper man pages (or even their supposed native "info" command), or even by informative usage strings, and that there was effectively no controlling influence (a flaw in the Open Source contributory model itself), and you get to today's mess. I can totally sympathise with you about it being difficult and awkward to use (I actually count VAX/VMS as the OS I would prefer to use after UNIX).
I look back fondly to V7 and SVR2 and SVR3 days, when conventions meant something, and people bothered to use them.
My coat is the sad old gabardine raincoat with the Lyons annotated UNIX source in the pocket (which, in case anybody bothers to read to the bottom of my past comments, I lost, but have found again).
I agree completely with your analysis and your sentiments, and even that Ubuntu peaked around Hardy Heron, but what I have difficulty with is whether limiting shareholding to small investors could be achieved, and whether it would work.
I also have a problem in how you would keep the shares with small investors, unless they were bought and sold through a market other than one of the stock markets, or managed completely internally making it a privately held company with an internalised share market (and I don't know how that would be organised or regulated for a large number of investors).
Also, having shares is a method of raising capital to invest. It is not a way of attracting a revenue stream, regardless of how it is implemented.
With shares, there is always the expectation that the shareholder could get their money back. I think that you are alluding to a micro-donation scheme, where you give them money and expect to gain some influence as a result. This is more like a subscription.
I've briefly looked at the charity shares model and I must admit that I don't understand how it works (probably my bad). It looks to me as if you effectively lend your money to a charity for them to use and derive financial value from, and you get the satisfaction of knowing that you are helping, getting value back in the way of ethical satisfaction rather than financial gain. I can't see how that can be a business model for an organisation like Canonical. They still need to get some revenue stream in.
Of course, I may have misunderstood what you are saying (in fact, that is probably quite likely).
What alternatives and why?
Maybe MS are trying to make sure that older OS's are retired as well. I'm sure that IE8 could be made to run on XP, but why give users the choice to not pay MS more money by replacing the OS.
Interestingly, I still run a system that I boot on occasion which is Win 2000 Pro. Firing up IE6 on this (IE 7 cannot be installed) already says "Install another browser", with a link to IE8 which then tells me it can't run on that OS.
I know that Win 2000 is out of support now, so I have no cause for complaint. This system has some legacy payroll software that I can't re-install because of expired licence key restrictions that allow me to run it in query mode but not re-install it on another system - and the supplier no longer exists! I have to keep it running for 3.5 more years to satisfy HMRC data retention requirements. It gets started about once every 6 months just to prove it still works.
@AC re Although....
Although I am in danger of resurrecting a long dead argument, I dispute that the ZX81 had more compute power than the BBC micro.
Although the BEEB's CPU only ran at 2MHz, whereas the ZX81 ran at 3.25MHz, the BEEB's CPU was a 6502 that executed simple machine instructions in as few as two clock ticks, whereas the Z80 in the ZX81 needed at least four clock ticks per instruction, and some of them required far more.
This was the subject of endless controversy between Sinclair/Amstrad owners and BEEB/Apple/VIC20/C64 owners at the time.
In general, the Z80 instruction set was more advanced than the 6502's, containing more instructions, more addressing modes, and even some proto 16-bit arithmetic instructions (by treating pairs of 8-bit registers as a 16-bit register). The 6502 contained enough instructions to do what was required though, and was considerably easier to program (the bible of both processors was written by Rodney Zaks; his "Programming the Z80" was at least twice as fat as his "Programming the 6502", and had smaller print to boot). This made the debate a forerunner of the CISC vs. RISC argument, which hinged around similar concepts.
Remember, though, that BBC BASIC was blindingly fast for the time, and it remained the fastest BASIC available well into the advent of 16 bit micros, as documented by PCW's BASIC benchmark that they ran on all the systems that they reviewed.
And the BBC micro had a built in assembler, and a means of passing arguments between BASIC and your machine code, and also documented all of the OS I/O, sound and graphics calls that you could make from your machine code. And of course, the display in the BBC was totally hardware driven, freeing the CPU up to run your programs.
I used to write both Z80 code on a ZX81 and Spectrum, and 6502 on a BBC, and believe me, 6502 was easier, and for basic data manipulation, faster.
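The back-of-envelope arithmetic supports this. The figures below are best-case minimums (two cycles per 6502 instruction, four T-states per Z80 instruction); real-world averages are higher for both chips, so treat the ratio as illustrative rather than measured:

```python
beeb_clock = 2_000_000       # BBC Micro 6502 clock, Hz
zx81_clock = 3_250_000       # ZX81 Z80A clock, Hz (3.25 MHz)

cycles_6502 = 2              # minimum cycles per 6502 instruction
tstates_z80 = 4              # minimum T-states per Z80 instruction

ops_6502 = beeb_clock / cycles_6502
ops_z80 = zx81_clock / tstates_z80

print(f"6502: {ops_6502:,.0f} instructions/s")   # 1,000,000
print(f"Z80:  {ops_z80:,.0f} instructions/s")    # 812,500
print(f"ratio: {ops_6502 / ops_z80:.2f}x")       # 1.23x
```

So even with a slower clock, the 6502 comes out ahead on raw instruction throughput, before you even count the BEEB's hardware-driven display freeing the CPU.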
Actually, they were not thermal at all. The paper was covered with aluminium, which conducted electricity, and was 'written' by a wire that passed a current through the paper as it whizzed round on a rubber belt. Where the 'spark' hit the paper, the aluminium vaporized, letting the black paper below show through. Crude, noisy, and completely incompatible with listening to the radio. That is why the 'paper' was silver with black print, and also why you got a new, higher-rated power supply for the ZX81 if you bought the printer.
I believe that they were not allowed to sell the printer in the US, because (surprise, surprise) it contravened the US electrical interference regulations.
This was an example of innovative thinking that made Britain good at creating ideas, but pretty crap at exploiting them!
@ForthIsNotDead re. sound.
Although out of the box, the ZX81 did not have sound, there were a number of third party add-ons that gave you sound.
I had a Quicksilver board that had an AY8910 on it, feeding a secondary modulator to add sound to the TV signal.
Quicksilver also had a number of other accessories including a high resolution graphics board and a programmable character board. You needed an interface board that sat between the ZX81 and the RAM pack, which provided two interface slots for the add-ons. Ugly as sin, and made the RAM pack wobble problem different, but as I added an external keyboard to mine, complete with power switch and reset button, I did not have to touch my ZX81 at all.
As I've said before on these forums, my ZX81 actually had 18K of memory, the 16K RAM Pack, the 1K of internal static memory re-mapped to a different address when the RAM Pack was plugged in, and another 1K of static RAM on the ULA side of the data bus isolating resistors to hold programmable characters that were accessible by changing the contents of the Z80's I register, which was used to hold the base address of the character generation table, normally in the ROM. Happy days!
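As I understand the ZX81's display generation, the glyph fetch address is formed from the I register (high byte), the 6-bit character code, and the scanline row, which is what makes a RAM-based character set possible. A sketch of that address arithmetic (the helper function is hypothetical; the hardware does this itself, and the RAM table address is just an example):

```python
def glyph_address(i_register, char_code, row):
    """Address of one scanline of a character glyph, ZX81-style:
    I register supplies the high byte, 6-bit code selects the glyph
    (8 bytes each), and the low 3 bits select the row."""
    assert 0 <= i_register <= 0xFF and 0 <= char_code < 64 and 0 <= row < 8
    return (i_register << 8) + (char_code << 3) + row

print(hex(glyph_address(0x1E, 0, 0)))   # 0x1e00 -- the stock ROM table
print(hex(glyph_address(0x20, 5, 3)))   # 0x202b -- a RAM table at 0x2000
```

Repoint I at 512 bytes of suitably aligned RAM (64 glyphs of 8 bytes) and every character on screen comes from your own table instead of the ROM's.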
It was not called FPGA, although the technology was similar. The real name was a ULA, or Uncommitted Logic Array, which was a bleeding edge technology in 1980/81, invented by Ferranti, a British company.
The ZX81, Spectrum and BBC Micro all had ULA's in them, to consolidate the function of dozens of 7400 TTL chips into a single large chip.
Unfortunately, the technology was still immature, and the production problems that Ferranti had were a large part of the shipping problems for all of the systems mentioned. I waited for nearly 6 months to get my BBC model B that had been ordered as soon as Acorn would take orders.
By the time later systems came along, it was possible to have your own design of chip fabricated moderately cheaply, so the ULA died an ignominious death.
@stucs201 re. MK14
Are you sure it was an MK14? Many universities used a KIM-1 (produced by Commodore, at least at the end), which was a similar product but used a MOS Technology 6502 rather than the Nat. Semi SC/MP which I believe was in the MK14.
I remember having to write a sine-wave generator on a KIM-1, and I managed to get a higher resolution wave than everyone else by having a lookup table with just 1/4 of the whole cycle in the lookup table, whereas everybody else had at least a half cycle (and some of them stupidly coded the whole cycle). Fitting it into 512 BYTES was a real challenge. But then I also managed to write a simple lunar lander on the Sinclair Cambridge programmable calculator in just 32 program steps!
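The quarter-cycle trick generalises nicely. Here is a sketch in Python of reconstructing a full sine cycle from a table covering only the first quadrant (table size and precision are arbitrary choices for illustration):

```python
import math

N = 64                                   # table entries for the first quadrant
STEP = math.pi / (2 * N)                 # angle per table step
quarter = [math.sin(i * STEP) for i in range(N)]   # sin(0) .. just under sin(90 deg)

def sine(step):
    """Full-cycle sine from the quarter table; one cycle is 4*N steps."""
    step %= 4 * N
    quadrant, idx = divmod(step, N)
    if quadrant in (1, 3):               # 2nd and 4th quadrants run backwards
        idx = N - idx
    value = 1.0 if idx == N else quarter[idx]   # sin(90 deg) isn't stored
    return value if quadrant < 2 else -value    # lower half-cycle is negative

print(round(sine(N), 3), round(sine(3 * N), 3))   # 1.0 -1.0
```

Storing N entries buys you 4N points per cycle: the symmetry costs only a quadrant test and an index flip, which is why the quarter-table version fits where a full-cycle table won't.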
Nuclear fission sustainable?
I'm playing devil's advocate here, because I'm pro-nuclear, but I would suggest that the amount of fissionable material on Earth is limited just like any other earthbound energy resource.
This means that in the long term, you cannot use 'sustainable' in connection to nuclear.
The way I look at it is that the Earth has many energy resources, but because of entropy, they are all limited to one extent or another. Fossil fuels are stored energy from the Sun. Nuclear fission uses heavy elements (from the death of older stars) acquired during the formation of the Solar system. Nuclear fusion uses light elements, probably from the formation of the Sun. Wind and ocean currents are driven mainly by solar energy. Tide is gravitational (mainly from the Moon), and tidal drag is slowing the Earth's rotation and causing the Moon to recede, so even this is finite. Geothermal is (probably) natural nuclear fission (see above) combined with tidal effects from the Moon. Biofuels capture energy from the Sun, and direct solar is (obviously) from the Sun.
So if you discount total matter conversion (and boy would that be useful), and fusion of hydrogen electrolysed from water (finite on Earth but lots of it), all energy except direct and indirect solar energy is limited. And the Sun won't last forever!
@AC re firmware licenses
I have not checked, but I'm fairly certain that the license is valid and legal as long as you own the device, whether it is working or not (otherwise you may not have a valid license if the device breaks and you then repair it).
One thing I am not too certain about is transfer of the license. In the Microsoft and IBM EULA's (mainly software) that I've read in the past, there are often quite stringent and restrictive rules applied to the transfer of the license. I'm not sure what happens to the license for Sony devices if you give away a device that contains licensed firmware. Does the recipient have license to use it? Do they have to accept the EULA, and if so, how do they know about it? Can they be held to something if they have no right of return of the original device? All interesting questions. Anybody any ideas?
I'm taking issue with your "if you wanted to run Linux you should have bought a PC" line.
The Cell processor is interesting and different from any Intel/AMD/ARM processor that you will find in commodity PC and other hardware, and using a PS3 to run Linux was the cheapest and easiest way of getting access to it. It is not for you to say that it has no value, that is down to the person doing it. As an intellectual exercise, being able to program a Cell has serious merit to some people (I know, I have talked to some of them)
It is quite clear that Sony did something that had significant impact to a part of their customer base (even if it was only a small part), and that should be investigated. But two wrongs do not make a right, so publishing details of a hack that includes Sony IP is almost certainly against copyright legislation. But if the hack includes no Sony code or firmware, then I'm not sure whether that is illegal. If all that was published was a technique utilising an API, especially if the API was itself published, then I don't think (and IANAL) that that would actually count as copyright infringement.
What may be an issue is whether identifying the technique is against national implementations of the European Union Copyright Directive. This is actually more restrictive than DMCA when it comes to breaking a protection, but as that is not direct legislation, you need to see how any country has enacted it into their own lawbook.
I believe that of all the European Union countries, Germany is the one that has enacted the EUCD most closely in their national legislation, so he may actually be being accused of an offence against that, rather than straightforward copyright infringement. This means that he may be on very shaky ground.
I think this is US "pants", i.e. trousers, slacks etc, rather than UK "pants", i.e. knickers, briefs (or in the US vernacular "panties"). Unorthodox, but not quite so kinky.
If it was her undergarments, I would hope it was a 2.5" drive rather than a 3.5". Ouch.
Silly me. If I had read the referenced article, it is actually clear that this is what is done (and actually says that simple radios would not work). But unfortunately, the Reg. article was trimmed, and the relevant information was one of the bits that was removed. Just proves that it is important not to take Reg. articles at face value, something I should have learned by now.
OK, so are you using one FM frequency per phone? I can't see this scaling well.
I would guess that if you had a number of frequencies corresponding to fixed distances from the stage, you could use a GPS app in the phone to select the correct frequency based on what your phone thinks is the distance from the stage, but you would then have 'banding', where the person immediately in front of you is apparently in a different band and gets the music out of sync with you. I'd have to do some math to see whether this is likely to be a problem, and also to see how many rings, and thus how many frequencies, you would need for say Central Park or one of the big stadiums.
Another thought. If they told you the distance from the stage, and gave you the correct frequency (maybe on information posts in the venue) why use a smart phone at all. An ordinary FM radio with a digital tuner (or any phone with an FM radio) would suffice. Or maybe repeater speakers as part of the sound rig using a similar system. That way it could be encrypted, and not give the show away at all.
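Some quick numbers on the banding question, assuming the usual ~343 m/s speed of sound; the ring width and venue depth are arbitrary assumptions for the sake of the sums:

```python
SPEED_OF_SOUND = 343.0            # m/s in air at ~20 C

def delay_ms(distance_m):
    """Time for the PA sound to reach a listener at this distance."""
    return distance_m / SPEED_OF_SOUND * 1000

ring_width_m = 35                 # assumed width of each frequency 'ring'
depth_m = 250                     # assumed depth of the listening area

mismatch_ms = delay_ms(ring_width_m)     # worst case across a ring boundary
rings = -(-depth_m // ring_width_m)      # ceiling division

print(f"delay at 100 m: {delay_ms(100):.0f} ms")          # 292 ms
print(f"boundary mismatch: up to {mismatch_ms:.0f} ms")   # 102 ms
print(f"frequencies needed: {rings}")                     # 8
```

Delays much beyond roughly 50 ms are generally heard as distinct echoes, so rings that coarse would probably be noticeable at the boundaries; narrower rings fix that but multiply the number of frequencies needed.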
@Steve McPolin re. IB and RDMA
I do not agree with your comparison between Firewire and Infiniband.
Let me say that I am not an Infiniband programmer, but I do support HPC systems using IB and RDMA. Having said that, I do understand, IMHO, a bit about how RDMA is implemented, at least on the UNIX systems I support.
RDMA is, amongst other things, a mechanism for one-sided communication: it allows a system (A) to perform a memory operation in another system's (B) memory space without the involvement of the second system's OS. But that does not mean that system B is completely divorced from the transfer process, nor does it mean that A has full, unrestricted access to B's memory.
Before an RDMA operation can be performed in IB on UNIX, system B has to set up a memory region, and also set up an access window to that region to allow system A to use it. System A is then given access to that region without B's further involvement, but cannot (as long as there is no flaw in the HW/FW/SW stack) go outside that region.
This means that it is perfectly possible to have the benefits of RDMA without compromising the security of the entire OS, and if a long-term window is set up (say, for an HPC-type workload that runs for some time and uses the window for many transfers), the involvement of the OS on B is limited to setting up the window at the beginning, and breaking it down at the end of its use.
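To make the window idea concrete, here is a toy model in Python (emphatically not the real verbs API; all names and numbers are invented for illustration) of the bounds and key checks that B's adapter applies to every remote operation:

```python
class Window:
    """Toy model of a registered RDMA region on system B."""
    def __init__(self, memory, base, length, rkey):
        self.memory = memory      # B's whole memory (a bytearray here)
        self.base = base          # start of the registered region
        self.length = length      # size of the registered region
        self.rkey = rkey          # key A must present with each operation

    def rdma_write(self, rkey, offset, data):
        # Checks done by B's adapter in hardware, with no OS involvement:
        if rkey != self.rkey:
            raise PermissionError("bad remote key")
        if offset < 0 or offset + len(data) > self.length:
            raise PermissionError("access outside registered region")
        start = self.base + offset
        self.memory[start:start + len(data)] = data

# System B: 64 KiB of "memory", with a 4 KiB window registered at 0x1000.
mem_b = bytearray(65536)
win = Window(mem_b, base=0x1000, length=4096, rkey=0x1234)

# System A writes through the window without involving B's OS...
win.rdma_write(0x1234, 0, b"hello")
print(bytes(mem_b[0x1000:0x1005]))   # b'hello'

# ...but cannot reach anything outside it:
try:
    win.rdma_write(0x1234, 4096, b"x")
except PermissionError as e:
    print(e)                          # access outside registered region
```

The point of the model is the two checks in rdma_write: remote access is scoped to one registered region and gated by a key, which is why RDMA-by-design (IB) differs from RDMA-by-default-over-everything (the Firewire situation described below).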
Now I do not know whether Thunderbolt has this ability, or if it has, whether it is configured and used in MacOS, but just because RDMA is available does not mean that the system is completely compromised.
From what I read about Firewire (and this is just from the Web), the default was that RDMA was turned on, and was not limited by default. This is really the flaw, and could almost certainly be addressed by careful system administration, but if you don't know what to fix, you won't do it. I have heard other stories from various Web resources that Firewire really did have this flaw, and that it could really be exploited by plugin hardware. Even if the quoted example illustrates flawed system administration, just think how much useful information can be gleaned from direct access to the memory of a system.
Unfortunately, the good ideas of the hardware engineers do not always match the requirements of real-world environments. But you would have thought that someone in the driver design process would have gone "..Hang on a minute, don't you think that this opens a security hole...", but then I have seen too little joined up thinking in large organizations recently. Too many people still think that a PC is personal.
I reserve my judgement on Thunderbolt until there is more information.
I'm normally on the side of the employee
but in this case, if the published policy was to use encrypted sticks, the worker was given an encrypted stick, and was told to use the encrypted stick, but subsequently didn't merely 'because they had problems', and then did not get the problems addressed, this should be a serious disciplinary issue.
The employee should be reprimanded at the very least, and if the employment policies allow, held up in front to the rest of the work force to illustrate how important these things are. This is especially true if they are in any position of seniority.
If this is not done, the excuse will always be that 'it is an education issue', and we will see these things happening more and more.
@david 12 re MS Office EULA
Interesting. I went there, and was prompted to download a .EXE file that contained a PDF and an XPS file.
WTF? An EXE file!!! What's wrong with a straightforward download of the PDF? This is MS obfuscation at its worst.
Indeed, on the page for reading the license terms there is the following
"Supported Operating Systems:Windows 2000;Windows Server 2003;Windows Vista;Windows XP"
JUST TO READ THE TERMS AND CONDITIONS.
As I am using a Linux system at an enlightened organization at the moment, a .EXE file is of little use without some form of MS tie in. Sort of undermines your comment a bit.
OK. You do that, and you get a list of every style used, including the standard styles which have been slightly modified.
I had to take an MS Word document a few years ago which had been hacked together from several different documents, and do some work on it. When looking at the styles, each time one of the source documents had been modified, or the template changed, and then parts cut out and pasted together, you got another variant of the same style. In the list of styles, I had about 60 listed, many of which, from their name, should have been the same style.
It took me the best part of a week to trawl through the document, eliminating the minor variations, and consolidating the styles together. I got the number of used styles down to about a dozen.
This is not necessarily a problem with Word, but in the fact that most people who use word processors that use styles don't actually know how to use styles properly (I admit that I didn't know before cleaning this mess up).
This reinforces my view that ALL current word processors are too complex, and that all most people need (even those writing quite complex documents) is something only a little more beefy than Wordpad.
I admit that I am biased, as I was an n/troff and runoff user before I came across Word 2 on DOS 2.1. I didn't like Word then, and I have to say, I would prefer not to use it now.
Music ain't what it used to be
When I used to read the Hi-Fi press, music was all about live performance, recorded and listened to in acoustically favourable surroundings. The nirvana was having an acoustically neutral room built around the sound system.
The music quality was important then, and everybody was striving to have the music sound like it did when it was played.
Nowadays, only classical music and a very small number of audiophile recordings have this goal. Almost everything else will be autotuned, compressed, expanded, mixed, spatially processed and otherwise messed around with to death.
As a result, it's pointless paying this much for headphones if you are listening to modern recording, distributed digitally in any lossy format, in noisy environments. Pay a few 10's of pounds for something that is comfortable and produces a sound you like, and buy a few pairs.
My current set for everyday use are generic ear-canal 'phones, Omega brand (a crap stick-the-name-on-anything brand), that cost 6 quid and that I found by trial and error. The sound is adequate, even with uncompressed audio, they block external noise out well, and they are so cheap that I don't worry about breaking or losing them. And I don't look like a twat wearing them on the tube (not that I ride the tube often). I do have a couple of pairs of in-ear Sennheiser 'phones which I prefer the sound of, but I worry about knackering them every time I pull them out of my pocket.
My home-based Hi-Fi, which was always best-of-breed-at-purchase-time low-to-midrange kit (cherry-picked NAD, Project, JVC, Technics and a pair of ancient Keesonic [niche brand from 25 years ago] speakers), has a pair of mid-range over-ear Sennheisers and a pair of elderly Beyerdynamic headphones. I am very happy with the sound, and most people still go Wow! when they compare it to their modern systems. Especially when playing vinyl!
Bring back real music, that's what I say. Especially get rid of autotune. Anybody who can't sing in tune without this should be boo'd off stage.
Does something this small
actually count as a supercomputer?
So bearing in mind how many MS bugs we've known about in the past, how many more were fixed silently? Does that make a further mockery of some of the "Linux - Get the Facts" campaign that they ran a few years back?
Kal-El might be a bit of a problem, but I can't imagine that the same would be true of Wayne, Logan or Stark. After all, Logan and Stark have other meanings, and if DC can prevent Wayne being used, then Mr Rooney and probably Harry Enfield would also have to look out.
Ignoring an image, why 200 bytes? Surely: camera location, <=4 bytes (4 billion locations should be sufficient across the UK); number plate, <=7 bytes for UK plates (let's be generous and allow 10); and a timestamp, again let's be generous and allow 8 bytes (UNIX per-second timestamps can still be represented by 4 bytes until 2038).
Total, 22 bytes per record, maybe with some overhead.
Multiply by your 350 million records a day (I'm not sure this is actually accurate, as not every vehicle is driven every day, and I want to be shown the 10 ANPR cameras a day that I pass), and you still only come up with around 10GB a day. Deem that data useless after 6 months, and you're talking about 2TB, or one current-generation high-end SATA disk in total.
The recorded image is only necessary if you intend to use the records for enforcement (for, say, driving without road tax, MOT, or insurance), but probably not for locating people by spotting their cars. If you had automatic discard of images of vehicles that were legal and not on a hit-list, then you would not need to keep all of the images anyway.
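As a rough sanity check, the back-of-envelope sums above can be sketched in a few lines of Python. The fixed-width record layout here (4-byte camera ID, 10-byte plate, 8-byte timestamp) is just my own assumption for illustration, not any real ANPR schema:

```python
import struct

# Hypothetical 22-byte record: 4-byte camera ID, 10-byte padded plate,
# 8-byte per-second timestamp (big-endian, no padding with '>')
RECORD = struct.Struct(">I10sQ")
assert RECORD.size == 22  # bytes per record, as estimated above

example = RECORD.pack(123456, b"AB12 CDE".ljust(10), 1_300_000_000)

records_per_day = 350_000_000
daily_gb = RECORD.size * records_per_day / 1e9        # raw, before overhead
six_month_tb = daily_gb * 183 / 1000                  # ~6 months retention

print(f"{daily_gb:.1f} GB/day, {six_month_tb:.1f} TB per 6 months")
```

That gives roughly 7.7 GB a day raw, so "around 10GB a day" with indexing overhead, and well under 2TB for six months.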
Bearing in mind that many more times that volume of data is transmitted and stored daily in any number of data applications, it does not seem too far fetched after all.
BTW, I wish to state that I do not condone this information being kept and used for tracking purposes at all. I don't mind it being used to get illegal cars/drivers off the road, but that's it as far as I am concerned.
If I read it correctly, the MPEG-LA consortium are wanting H.264 decoders in the silicon, so that to play it, you do some setup (window size and position etc.) and then just throw the encoded byte stream directly at the hardware. Almost zero CPU usage.
If this is the case, you may be able to use some of the other accelerated features for WebM, but not the direct decoding.
Again, I am happy to be corrected by someone with more knowledge.
Microsoft want to make IE a lever to sell Windows. As such, they need to differentiate it from Firefox, Chrome, Safari and Opera.
Unfortunately, with previous versions of IE, their model appeared to be to make it significantly non-standards-compliant but push it to become a de facto standard, so people who wanted a 'full' internet experience would be inconvenienced by the other browsers. Add a dependency between the browser and new versions of Windows (i.e. you can't run IE9 on WinXP), and hey presto, you've made it so people with totally usable PCs need to upgrade to the new OS to use the 'features', adding to MS's coffers in the process.
They need to learn that this will no longer happen and indeed may backfire, so unless they embrace the standards, they will rapidly lose that lever, and will have to rely on other tactics in order to push Windows. I believe they think that HTML5 including the patent-protected H.264 codec will be the enabler that binds people to Windows for another round of PC purchases.
Re: WINE ?
Yes, but they will not be running IE9 as AFAIK, there is no Vista or Win7 persona for Wine yet. What I've read appears to suggest that MS have made sufficient changes in the last two versions of Windows to require significant work from the Wine community to make it compatible.
It means exactly what it says
using the graphics hardware for operations that you would otherwise have to use the CPU for. This includes things like, but not limited to, scaling, texture fill, pixmap copying and shading, and may go up as far as complex rendering such as hidden-object removal, surface mapping, and shadow and light-source calculation. Of course, much of this is unnecessary for Flash Player, and having a hardware H.264 decoder built into the graphics hardware is probably what MPEG-LA and its members (including Apple) are talking about.
What Adobe may be doing is adding direct support for certain mobile graphics chipsets in Flash. They can do this without the graphics hardware vendors' direct support if the graphics API is well enough documented. Of course, documenting such things in sufficient detail is entirely foreign to some companies.
In the PC world, much of this is not really necessary because you have abstraction layers like DirectX or OpenGL with hardware vendor support, and can have hardware-accelerated graphics while only writing to one or two well-known APIs. As far as I am aware, there is no such abstraction for any of the mobile platforms, but I would be happy for someone to correct me if that is not the case.
@James Hughes 1
I like to think that the moderator team enjoy us playing word games with them sometimes. I really was trying to be humorous myself, but maybe I was too subtle for some people.
If I manage to elicit a direct response from one of them once in a blue moon, then that makes all the comments I write worthwhile.
My word. What a reaction!
Sarah. While I don't disagree with your statement, your reaction seems a bit extreme.
I know that you never claim to be without any bias, but you normally leave it to us to be quite so loud. Does this post infringe house rules 7 or 9? I would hate to see the moderator banned from the forum!
Ah. Maybe you left the screen lock off when you went for coffee or something, and it was not Sarah who wrote this!
How do you tell?
These are tree seeds from earth, taken out and brought back. They may have been subject to vacuum and cold, but that is probably the same as tree seeds in a seedbank.
The only way they could be identified would be if they had done an FF4 and become stretchy, stone-like or invisible, or could burst into flame, or had been labelled and/or otherwise recorded.
This is such a non-story.
You beat me to it
I was going to say exactly the same. Sadly, my iPod died, so my Rovi is currently sitting on a shelf unloved while I listen to music and FM radio on my Android phone.
I'll take issue with the review, however. Quote - "Indeed, the more robust signal you tend to get on the move also makes up for any step down in sound quality" - Where the hell did you test this?
Whenever I try to listen on the move, the dropout rate and the bubbling mud problem from varying signal strength makes DAB impossible to listen to, especially for spoken word programmes. Try listening to a comedy show when you keep missing the punch-lines. I know I live out in the sticks, but I also have this problem travelling down the M5 within 10 miles of a major city with a correctly installed DAB radio specifically made for a car.
With FM, you get a hiss or maybe some distorted output, but you can follow a conversation. Try doing that when you completely lose the channel for seconds at a time.
Don't get me wrong. I'm all for Digital Radio, and have five in total in the house and car, but until the transmitter power levels increase, I'll be listening to FM on the move.
BTW swisstoni. The delay is caused by the time it takes to digitise the audio at source and decode in your receiver. It's never going to go away regardless of what anybody tries to do. You could try buying a more expensive receiver, but there is no guaranteeing that it will have a faster processor in it, and would only address the decoding side. I'm afraid you will just have to learn to live with it.
Never convinced about RM
Many, many years ago I justified and built a computer resource for teaching what was called Computer Appreciation around BBC Micros, Econet and Acorn file servers in a UK Polytechnic. I managed to get so much for the budget, including colour monitors, robot arms, digitizer tablets, touch screens, printers, plotters, speech recognition and synthesizers, teletext decoders, video cameras, mice, lightpens, and a basic CAD system (remember, at the time, an Apple ][ with a Bitstick was considered good). We also had representative word processor, spreadsheet and database products and a good Basic (natch) as well as Acornsoft Pascal. This was to teach people from all learning disciplines, not just computing students, what computers could do in the early 1980's before DOS/Windows PC's had achieved dominance, and also before people had home computers.
The main Computer Unit poured scorn on our efforts, because they wanted to put in RM 480Z's backed up by a 380Z as a file server. Equipping the lab with what they suggested would have blown the whole budget on the computer hardware without being able to afford any of the peripherals, demonstrating that their systems were overpriced. And I don't believe that it could have been used for half of what we actually used it for.
I was very sad to see Acorn lose out in the education sector, especially when RM moved to MS-DOS-based systems. The 80186-based Nimbus 1's were neither cheap, standard, nor IBM PC compatible.
I just looked at their web site, and saw a device for mobile data capture, priced "from £3.75". I thought "Wow, that looks interesting and cheap. Maybe RM have learned". I then looked further. £3.75 apparently buys you an extension cable (basically a 3m M-F 15 pin serial cable from what I can see).
The actual device itself costs £170, plus £19 for the software (you would have thought they could have included that, bearing in mind that it is essential to use with a computer), and all of the more interesting additional sensors cost £50-80. Suddenly I'm not interested.
I'm glad I'm not working in education any more. The thought of herding rooms full of PC's would fill me with dread.
SCO keep telling the bankruptcy court that if they are allowed to continue their claim against IBM, and win, they will have enough money to settle what they owe.
Unless someone convinces said court that the claims are fatuous, and will never be more than a way of paying lawyers, then the court *HAS* to listen, because they have a duty to recover as much value as possible for the creditors.
And although there have been rulings on the admissibility of the evidence and who owns the copyright, the original case is still outstanding. There may still be a small chance that there is a case against IBM because not all of the documents are in the public domain, but even if there is, it is unlikely that SCO would actually be able to collect any of the money.
I keep feeling that the case against IBM has to reach court and be finally ruled on, so that this whole mess can be consigned to the history books.
What worries me is that Novell were the ones who objected the last time UnXis tried to buy the assets, but Novell is not the company it was. Microsoft had a finger in the pie when Attachmate bought Novell. If Microsoft had some way of suggesting to Attachmate that MS would pay more for the IP they bought in the deal if Attachmate/Novell did not object this time round, the whole circus could start anew (shudder).
It was used very often
to kick off a software installer. Even people like PCW used to use it for their cover-mounted CDs and DVDs.
I've installed a recent HP printer, and it used autorun (the installation instructions did document how to run it without auto run, but it was phrased like "If the installer program doesn't automatically start, open the CD, and .....").
My significant other (worded to attempt not to upset the Moderatorix) has some craft software that needs the CD inserted explicitly in the D: drive (and heaven forbid if your CD drive is not D:), and the instructions for this expect autorun to work, and do not contain an alternative. I keep explaining this, and she keeps telling me that her computer is broken because the software does not start. Grrrrrrrr.
I think too many of the people commenting here are in the Windows support business, where they are in control of any software installation, and do not talk to home and SOHO businesses where simplicity and hand-holding is essential for people who just use computers as tools.
I can't be so old that this has passed out of memory, can I?
That's exactly what I think. Contrary to belief in spontaneous human combustion, the human body does not burn well, and you need additional fuel to perform the act. That's why they build funeral pyres.
Seems perfectly sensible to me to recover the heat, and use it.