Oleg kept a close eye out in case Aleksandr or Sergei ever came back. Oh, how he'd punish them for abandoning him in the back-end of beyond, without even as much as an OLPM (One Laptop Per Meerkat!)
Re: Blank Media
My daughter told me recently that several media creation packages (she mentioned iMovie and Creative Suite/Cloud) are now removing the ability to write optical media from their most recent iterations. Many laptop/ultrabooks no longer even come with optical storage devices. It really upset her, as she has no desire to use the Cloud as a transmission path for private media she's editing for a friend.
It strikes me as if optical media is becoming a bit of a pariah. I still don't trust flash memory devices for long term storage. We need some new long-term storage media!
Re: No more Paris articles @Lester
It strikes me that it would have been quicker to set up an explosives factory and get the correct licences in Spain than chase FAA regulatory approval to allow you to import and fly a rocket powered 'drone' aircraft in US airspace, especially one that can be programmed from outside of the US.
They're probably trying to prove it's some nefarious terrorist plot.
What I'm now waiting for is for the license to be permanently declined, and the plane prevented from being exported from the US without a "dual use, foreign defence article" ITAR export license, which will be similarly denied.
Re: No more Paris articles
I'm sure I remember something about that.
Oh yes. It's on my tea cup. What was it trying to do again?
You jest @chris lively
but that is part of the iterative process of developing an accurate model!
If a model doesn't indicate what actually happened, then it's not accurate. So you try to understand why it was wrong, change it so that it does agree more closely with reality, and then wait for the next discrepancy. All the time, you're running it against historical data to see how accurate it is. So a slew of new data is quite useful.
What would be wrong would be to silently correct the models (or even worse, manipulate the data or start conditions), and then claim that they were accurate all along. But I don't think that this is what would happen.
If you think that anybody with serious climate credentials believes that the current climate models are complete or accurate at the moment then you're bonkers (and so are they!). We just aren't that clever.
Unfortunately, a lot of the people taking note of the models have political agendas.
I know that it's a bit of a downer ...
... but after NASA's recent string of outstanding successes, it's almost reassuring to know that they can have mishaps.
They were beginning to look almost infallible.
You know, WH Smiths actually does have a place on the High Street. In many small towns, they are often the only book seller stocking current titles there, and what a lot of people don't realise is that the other newsagents in any area almost certainly get their newspapers and magazines delivered through the WH Smith distribution channels.
I'm not saying that I agree with the way that they are reducing the space set aside for books in the smaller stores, as they now only really stock the big-name author and celebrity books. They may still have one or two books from a couple of dozen other authors, but you can guarantee that you will not be able to buy a complete series from anything other than the major stores. "Oh", they say when asked, "We can always order them in for you". Yes. I can do that too, and Amazon may be cheaper.
But I still value a shop on my High Street that has a reasonable range and quality of stationery, books, magazines, maps, and many other things, when the rest of the chains have abandoned towns with populations under 15,000, so I still go out of my way to buy things from them.
The problem that many people who don't visit small towns don't appreciate is that they are being abandoned by the large shopping chains. You could say that it's my fault for living in such a town, but it's 20+ miles as the crow flies to get to the next largest town, and the roads mean that it's 45 minutes each way. Buying from the Internet is fine, but if I have to do a 40-50 mile round trip, just to buy things over the counter, it can make life more complicated.
Re: Just where do ... @AC re 10 Billion people
Yes, a good war with significant collateral damage, followed by famine and pestilence, may be a way of keeping the population down, but it's not the one that I was aiming at.
I suspect that even though the 20th century had a number of quite serious conflicts, they did not significantly slow the rise of global population. The global flu pandemic of 1918 may have killed more people than the first and second world wars combined, although you could argue that that disaster would not have happened if large numbers of soldiers returning home after WW1 had not carried it with them.
IMHO, in these days of modern, mobile population and low-manpower warfare, wars are just going to displace more people more quickly, rather than killing them.
Re: Just where do ... @AC re 10 Billion people
Ah, yes. But those 10 billion people won't live where the food is produced, so significant amounts of effort and resource would have to be expended getting the food to them (and, yes, I'm aware that the UK is a net importer of food).
If you let the population that lives in marginal areas procreate, and keep them alive beyond their locally available means with aid and medical treatment, these areas will become breeding grounds for people who will migrate, attempting to gain access to already populated less marginal areas.
I don't know where you live, but if it is in Europe, you can't help but notice the number of people trying to cross or skirt the Mediterranean. If things continue as they have been this year, we will soon see ghettos and shanty towns spring up around cities in eastern and southern Europe. These will not seem strange to the people who occupy them (after all, what is the difference between a hut with a tin roof in Athens or Lampedusa and one in Aleppo?), but they will severely degrade the lives of the native citizens.
Imagine how that is going to change if clean water becomes a conflict resource, driving ever more people into migration. Current clean water schemes in drought areas are not sustainable, because most of them are either tapping into aquifers and underground rivers, or even worse, into fossil water. The first will cause surface springs and rivers downstream to dry up, denying water to other people, and the second will not be replenished in the lifetime of current generations. This is not free water. Extracting it all has consequences (look up what's happening in California), and one of these could be future conflict.
No. Whilst I don't practise what Trevor is suggesting (I'm aiming for a stable population, with reasonable procreation levels to help pay for my state-funded pension), I do agree that there are places on earth where we really should not be encouraging population growth. Trying to cap the population to something only a little above today's level is far, far preferable to driving it to the point where it takes all of humanity's efforts just to sustain the higher population.
I know I hold a NIMBY, elitist and uncomfortable view of the future, but I cannot see any alternative short of having a world government that rations out the world's resources evenly.
Re: Silly... @Cynical
Well, who knew that the telcos would steal power from their customers! (although I'm sure that it will be in the terms and conditions of the contract).
I certainly didn't!
I hope it will be isolated from my neighbours. I would not like to let them steal power as well.
I must admit that I hope that the power draw is quite low and suitably protected in the router, because I would not trust the wires in a 4-pair telephone cable to carry any significant amount of current. And, yes, I do know that it's currently providing the power for POTS phones.
Re: Silly... @Cynical
"Up a pole" ignores the fact that the distribution point will need power. Most telecom poles will not have any power at all. Add to that the fact that there may be several houses fed from the same pole, and no matter how small they are, you will end up with something quite sizeable sitting in the air, where it's vulnerable.
No, it's going to be on or under the ground.
Filling your desktop background with icons/folders is only really useful if you can see it!
For years and years (25 or so, pre-dating Windows 95), I have run on UNIX systems with windows covering the background almost completely. Not just a single maximised window, but lots and lots of overlapping ones (at the moment, when I log on to my main work machine, my start-up configuration fires up 11 automatically, one of which is a browser with three tabs opened on start-up, spread over 8 virtual screens. And that is just the start of the day!)
I know that modern window managers often have a 'show the desktop' button or key sequence somewhere, but I don't want to minimise the open windows. I want to be able to pull a menu up over the top of the windows that are open. This makes desktop icons useless (and very ugly) to me.
A configurable pop-up menu, triggered by a suitable mouse/keyboard event, with 'walking' sub menus suits me perfectly. The Start button, on an auto-hide window bar works, so does a key-mouse combination (as I used to use on twm and derivatives) and a swipe to screen edge also works. I don't care beyond the first day or so while the action gets committed to muscle memory.
Of course, I expect the menu order and basic layout to remain fixed (none of this automatic management and re-arrangement of items thank-you-very-much) so that the menu looks the same each time it's brought up. Once I'm used to each system, I can work with it provided it does not change.
Currently, I'm very comfortable with all of win95/XP/Vista/7, Gnome 2 and KDE 3 style systems, mixed and matched on a daily basis. The whole concept of Unity, and my limited exposure to the Windows 8 'Modern' desktop feels foreign to me, and I can only get on with the Android type interfaces on devices where I'm pretty much forced to only do one thing at once.
The old ones are the good ones!
Windows 7 ate 9.
Re: Descisions @AC
Are we going to get to the state where the car's AI goes catatonic because it could not reconcile any of its actions with the Three Laws?
We need Susan Calvin!
Re: Pro Tip @Uffe Seerup
Of course, while there may still be occasions when you need to gain additional privilege to carry out some functions, for most applications it is not necessary if the application is written correctly in the first place.
Historically, this appears to have been a difficult lesson for Windows developers to learn.
Sorry for the direct attack on Windows, but security-illiterate Windows programmers have blighted application development on all platforms for years, and often pass their bad-practice on to newer generations of coders.
Re: Pro Tip
As a committed UNIX and Linux proponent, I've frequently said that the security model of UNIX-like operating systems is one of its weakest features. But the flip side of this is that role-based access control systems, where you acquire additional privilege through further authentication, are complex, and very rarely used properly or correctly.
This can be seen in the slow take-up of RBAC in the proprietary UNIXes that implemented it nearly 20 years ago, and SELinux, as well as the number of times that it is not used, or not used appropriately in other OSs.
Bearing in mind how many people, even working in the industry as a whole, don't understand what RBAC is or how it works, the well-understood UNIX-like SUID, uid and euid mechanism, which is basically less complex and deployed properly by a greater number of people, may be preferable.
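To illustrate what I mean by the uid/euid mechanism, here's a minimal sketch (my own illustration, assuming a POSIX system; not taken from any real application) of the classic pattern: a setuid-root program does the one privileged thing it needs early on, then permanently drops back to the real uid of whoever invoked it.

```python
# Sketch only. Note that the kernel ignores the setuid bit on scripts, so a real
# setuid program would normally be a small compiled wrapper; Python is used here
# purely to show the uid/euid calls, which mirror the C ones.
import os

def drop_privileges():
    """Permanently give up any elevated effective/saved uid gained at start-up."""
    real_uid = os.getuid()    # the user who actually invoked the program
    real_gid = os.getgid()
    if os.geteuid() != real_uid:
        os.setgid(real_gid)                          # drop the group while still privileged
        os.setresuid(real_uid, real_uid, real_uid)   # real, effective and saved uid all set
    # from here on euid == uid == the invoking user, and there is no way back

if __name__ == "__main__":
    # ... do the single privileged operation here (e.g. open a raw socket) ...
    drop_privileges()
    print("now running as uid", os.getuid(), "euid", os.geteuid())
```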
Of course, the large number of senior application developers who cut their teeth on Windows ME and earlier, and who just disable all the security or insist on their code running with privilege on whatever platform they're on to get their applications to work properly, are a serious problem with many applications. Fortunately, the security message is finally getting through, the influence of these people is waning, and their legacy applications are disappearing into history.
Re: Register, please stick with what you are good at...@Rusty 1
Well, in case you hadn't noticed, there is currently an election going on for the leadership of the Labour party in the UK, and Jeremy Corbyn is highly placed in the current polls. And he uses Richard Murphy to justify some of his economic policy.
So it's actually a very timely article, if you wish to try to influence people to not vote for Jeremy.
I personally agree with Tim, and think that the policies being proposed are stark raving bonkers, but I don't have the economics background to put a reasoned argument together. Tim, thanks for doing it in a way that is far, far clearer to people like me.
Whether the Register is a suitable place for this, however, is debatable.
Re: Linux v Windows
Sorry, this is going to be a long, historical post, explaining why there is actually no such thing as a 'winprinter', although 'GDI printer' may be a more accurate description.
Back in the day, printers used to have Page Description Languages, such as ESC/P for Epson printers, and PCL for HP printers (and many others. Each manufacturer defined their own). These were often supersets of plain ol' ASCII in most cases, with some escape sequences to allow things like switching to different fonts, superscript, subscript, bold and italics etc.
In fact, many printers still do. Last time I looked, Epson still included ESC/P in their printers.
The problem with this type of support was that you were limited by what the printer could do, and how well the text formatters knew about them. Anybody remembering Epson FX80 printers used from Wordstar or any similar software would be quite familiar with this, especially loading a printer description into the word processor during the setup.
Some printers, however, were quite 'clever' and included very high level PDLs, examples being PostScript and the later versions of PCL, and these tended to be the printers that would be used on UNIX systems. This was through the very hard to configure successfully System V LP system. Most of the time, this required the formatting program to be aware of the printer type, and LP used to just shunt the bytes to the printer. Some support for slightly more intelligent printers crept in, but generally all they really handled was pagination rather than formatting.
Adobe and/or Microsoft (and possibly others) had a bright idea. Most dot matrix printers had a graphics mode, and they decided to take the responsibility for formatting the page away from the printer, do the formatting to a bitmap in memory, and then send the page out to the printer as a graphics image. What this allowed them to do was to ignore the limitations of the printers built-in capabilities, and use any font, size or any other graphics construct that they cared to code into their software.
When ink-jet printers came along, even if they did have a high level PDL built in, it tended to be ignored, and rendering the page still happened in the computer, sending it out as a graphics image. This became the standard way of handling printers in Windows and MacOS, and eventually became abstracted in the OS, so that the software would use an OS defined printer format that would be rendered by OS components before sending to the printer.
Eventually, some printer manufacturers decided that it was pointless putting significant processing power in the printer, and thus were true 'winprinters' born, especially those using the Graphics Device Interface (GDI) that is a part of Windows. Basically these printers were so dumb that they could do nothing themselves other than take a bitmap of the page, normally in an unpublished proprietary format. But that did not alter the fact that other more capable printers were effectively being treated the same!
The problem, as far as UNIX and Linux was concerned, was that for many years after rendering was being done in other OSs, they still used the old PDL model to drive printers. So printers that did not have any PDL at all could not be used. This seriously limited what could be done without some serious knowledge of the printer and the way it was attached.
Step up Ghostscript, which was originally a way of displaying PostScript on screen. Some clever bod realised that you could use PostScript as a generic PDL, and then use Ghostscript in the computer to render the page into a bitmap, and then send this out to the printer with a suitable graphics converter. Suddenly, it became possible to use very basic printers on UNIX/Linux, and get reasonable results, as almost all programs knew how to write PostScript. Eventually, this became Ghostprint, which became common in most Linuxes.
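To give a flavour of that pipeline, here's a rough sketch (my own illustration of the idea, not the actual Ghostprint or CUPS filter code): it assumes Ghostscript is installed and uses its generic pbmraw bitmap device in place of any real printer's raster format.

```python
# Render application-produced PostScript to plain bitmaps, one file per page.
# A real print filter would then translate each bitmap into whatever raster
# format the dumb printer expects and push it down the wire.
import subprocess

def rasterise(ps_file: str, dpi: int = 300) -> None:
    subprocess.run(
        [
            "gs",
            "-dSAFER", "-dBATCH", "-dNOPAUSE",   # non-interactive, no prompts
            f"-r{dpi}",                          # rendering resolution
            "-sDEVICE=pbmraw",                   # generic 1-bit bitmap output
            "-sOutputFile=page-%03d.pbm",        # one bitmap per page
            ps_file,
        ],
        check=True,
    )

if __name__ == "__main__":
    rasterise("document.ps")
```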
Later, a similar project started using the GIMP (GNU Image Manipulation Program) backend print drivers for a similar print method, and this became Gutenprint, which largely replaced Ghostprint by default in most of the major distros.
When Apple decided to switch MacOS to a BSD UNIX platform (OSX), they decided that the previous print backends were clumsy, and needed improvement. In one of the most useful things that Apple have ever done, they wrote a common backend for all UNIX-like OSs, which is where the Common UNIX Printing System (CUPS) came from. Because CUPS was written as an Open Source project, it has been wildly successful, and has almost completely replaced the older print systems in Linux and UNIX.
So nowadays, even UNIX and Linux effectively drive almost all printers in the same way as Windows, and can often be configured to use so-called winprinters, including some that have required reverse-engineering the unpublished GDI-printer formats.
Re: Linux v Windows @Badmouth
Even average users. Most printers just plug in, get recognised and work. Really.
In the worst case I commented on above, the HP LaserJet 1000 (which, to be fair, was marketed as a Windows only printer, with no official support for anything later than Windows XP), I followed a Google link to the HP website, clicked on download the script, and ran it in a terminal window according to the instructions on the Web page. 20 minutes later (it was an EeePC 701, not the fastest machine on the planet), after answering some very simple questions, I had a working printer.
The HP LaserJet 1000 is an abomination! To save a few cents, it does not even have a large enough bootstrap ROM to hold the operating firmware, let alone Flash memory. Every time it's powered on, it has to download its operating firmware from the connected computer. And there's no power switch, or in fact any switches or buttons. The two indicators are a green power LED and an amber error LED.
Of course, I triggered that 20 minute job after insisting that I, as an 'experienced' Linux user of 17 years 'who could work it out by myself' spent a fruitless couple of hours hacking around in Synaptic, 'Add a printer' dialogues and the CUPS configuration!
Chances are that an average user, doing the sensible thing (if it could ever be considered sensible to actually try to use this crippled printer) documented on the HP support website, would have had it working much quicker. Ho hum. So much for 'experience'.
Re: Linux v Windows @Badmouth
I think that you need to look at more hardware, and maybe more recent Linux distros. The days of having to compile everything up from source are long gone.
I run everything from GDI printers, through HP, Epson and Brother, and although there are some problems, if you're using a fairly mainstream distro, many, many printers have local page-imaging support in CUPS and Gutenprint for many so-called winprinters, and even the most obnoxious printers often have some support from the manufacturers for Linux.
The worst I've come across recently was the GDI HP LaserJet 1000 (ancient, purchased from a car-boot sale for a very specific job), which eventually worked when I used an installation script from the HP support site that adds a special USB driver to the kernel, and then configures CUPS to rasterise the pages in the correct format.
Thankfully, the worst offender (Lexmark) has left the SOHO market, and their business-oriented printers understand PostScript and PCL5e and later, so they work pretty much out of the box with generic drivers that ship with all Linuxes.
Other than bleeding-edge devices, most hardware things work without installing drivers (or even putting a driver disk in). There is niche hardware, of course, but I would say that more and more, hardware vendors are learning that they cannot ignore Linux, and often the support that they write for OSX can be adapted relatively easily for Linux.
For run-of-the-mill hardware that you find in most consumer computers nowadays, it is much easier to do a vanilla installation of Linux than it is to do the same with generic Windows installation media. Windows users rely very heavily on the vendor-tweaked installation media. If they actually had to do it from Microsoft-supplied generic media, they would discover a new world of pain, especially if the network hardware in their machine is not recognised by the standard Windows drivers (as was the case on the last two PCs I built).
The wry comment I was trying to make is that if you are relying on the manufacturers of TV panels to make UHD monitors available, you would have to accept the size of the panel as well.
I cannot really see any TV manufacturer making a UHD TV smaller than 32", and looking at the story, 40" was the smallest TV referenced. That's where 40" came from.
This time, I was not making any comment about whether UHD was really going to increase your computer experience (although I have in the past).
You don't have to dream. Just buy one of these TVs, and get a display adapter that will do the correct level of HDMI, and plug it in. If you don't like HDMI, the TV will probably have component video and maybe DVI as well, looking at the back of the TV under review.
Whether it would be any good as a monitor is a moot point, but it would be an interesting exercise.
I still say that sitting that close to a 40"+ monitor would be an uncomfortable experience, but I thought that back in the early 1980s the first time I saw a 20" black and white monitor on a Sun 2/50 after working on 12" VDUs, and look where we are now!
Yes, but only if you want a 40"+ monitor. I cannot imagine making the desk space for such a beast!
Re: Totally .... @Steve
I deliberately said alkaline for this reason, and also because potentially ruining a disposable battery is preferable to doing the same to a rechargeable.
I should have said why, maybe. Thanks for providing the extra warning for other people.
Re: Totally ....
But if you short out a battery, any battery, the internal resistance of the battery will cause it to become a heater...
And the greater the capacity of the battery, the greater the heat effect.
Try this at home, but take suitable precautions.
Get an alkaline AA battery, and put a wire connecting both ends. Leave it for 10 seconds, disconnect, and then see how hot the battery is. But be careful, it'll be hot!
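To put rough numbers on why it gets hot (my own back-of-the-envelope figures, assuming a nominal 1.5 V cell and an internal resistance of around 0.2 Ω, which is in the right ballpark for a fresh alkaline AA):

```python
# In a dead short, essentially all the power is dissipated in the cell's own
# internal resistance, so the battery itself becomes the heater.
voltage = 1.5        # volts, nominal alkaline cell (assumed)
r_internal = 0.2     # ohms, rough figure for a fresh alkaline AA (assumed)

current = voltage / r_internal       # I = V / R  -> about 7.5 A
power = voltage ** 2 / r_internal    # P = V^2 / R -> about 11 W

print(f"short-circuit current ~{current:.1f} A, heat dissipated ~{power:.1f} W")
# ~11 W into something the size of an AA cell warms it up very quickly, and a
# larger cell (more capacity, lower internal resistance) is worse still.
```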
If these devices were made correctly, they should have some form of current limiter, which would prevent a short causing them to overheat (could be as simple as a fuse). But I guess at least the faulty ones don't, or it does not function correctly.
Ah, spirit copiers.
I used to be one of the copier monitors (you know, turning the handle to operate the Banda machine) at school to run off copies from the carefully preserved masters that used to serve year after year from certain of my teachers. You used to be able to tell how many copies had been run off by how faint they were. And oh, the smell. I'm sure I was high some days at school!
I spent a year teaching at a UK Polytechnic in the mid '80s, and I found it easier and quicker to type up my hand-outs on my BBC micro at home, print it onto a spirit master on a Qume daisy-wheel printer, and then run 40 copies off for the class.
Using the photocopiers for more than 10 copies was banned because of the cost, and the offset-print service that was supposed to be used had a three-day turnaround time. As a new lecturer, all my material was produced new, and very rarely three days before I needed it. Possibly the most challenging year of my life!
Hurrah! The IBM model M!
Sony aren't much better, unless it is for their premium phones.
I have an Xperia SP, which I have owned for about 2 years, and it still works fine, and does pretty much everything I need it to. The only exception is that the internal flash storage is getting full, mainly because the thumbnail cache for the Album app currently sits at about 1GB of the internal flash used.
Although 4.4 was originally promised, it never happened, and Sony are saying that they are not intending to issue further patches for 4.3 on any device. And that's ignoring the ISP.
The problem is, as I see it, that consumers who do not want to update their phone every year are being left stranded with nowhere to go apart from something like Cyanogen.
I tend to pass my phones down to my kids. Until recently, I had a Samsung Galaxy Apollo running 2.3 and a Sony Xperia Neo running 2.4.3 in use by my kids (the Samsung finally gave up the ghost a few weeks back) and I tend to keep phones for 2 years before moving on.
But I look at the phones that I may move on to, and very little in the midrange that I'm looking at is much better than my SP, and those that are are generally still running 4.3 or 4.4, so may already or could soon enter the unpatched category. I don't value a phone enough to either pay £200+, or enter into a £25+ per month contract that would get me a higher end phone that is likely to remain patched for any length of time.
I think that there should be regulation that forces updates for a minimum time on all devices that could be vulnerable, at least as long as the longest contract, measured from the point of initial sale or supply rather than from introduction (something like at least four years from introduction or two years from sale, whichever is later).
Re: The Impact On The Public Was Terrible @Vorland
I dispute that Hood was useless, but in the state she went into the battle with Bismarck, for all the reasons already stated, she was a flawed ship.
Had the deck armour and shell handling been addressed, she would have been better, but it is clear that Bismarck was an even better ship, not just because she was more modern, but because the Germans cheated in their adherence to the Washington and London naval treaties by understating her size.
As built, Hood was nearly as well protected as the Queen Elizabeth class superdreadnought battleships, and had she been modernised in a similar way to Queen Elizabeth, Valiant and Warspite, she would have been a much more serious contender. But the work required would have taken over a year, and priority was put on speeding up building the newer King George V class.
What a lot of people here appear to be missing is the difference in tactics between offensive and defensive naval requirements. Although in its total size the British Home Fleet easily outnumbered the German surface fleet, the British could not force an engagement at their choosing. They had to spread their capability around to get the best chance of any engagement.
The Germans, in comparison, had only to avoid the British ships in order to be a disrupting force, as can be seen in the earlier commerce cruises of Admiral Graf Spee, Admiral Scheer, Admiral Hipper and Lützow, which between them tied up a great many ships escorting convoys and searching for them, preventing their use for other purposes. As an illustrative point, once Tirpitz and Scharnhorst were out of the picture, the Home Fleet was significantly reduced, allowing more ships into the Mediterranean and Pacific.
Imagine if, instead of a ship like Graf Spee, which could be fought off by a number of cruisers in a task force or convoy, Bismarck had been in the Atlantic, sailing from Brest. Effectively, every convoy across the Atlantic would have had to be escorted by a battleship, as a screen of even 6" or 8" armed cruisers, let alone just destroyers, sloops and corvettes, would easily have been swept aside, effectively destroying the convoy and making any remaining merchant ships easy prey for the U-boats. See what happened to PQ17 in the Arctic after the mere threat of an attack from Tirpitz. It would have had a dramatic effect on the war as a whole.
Once Bismarck had been sunk, Hitler became so reluctant to risk his surface naval power that they effectively became impotent, residing in ports to be picked off one at a time by air attack.
Re: The Impact On The Public Was Terrible @Vorland
That's not correct. It is clear that a direct 15" shell hit would cause terrible damage, but battleships, rather than battlecruisers, were armoured to take punishment as well as deal it. Think of the punishment that both Bismarck herself and Scharnhorst suffered from gunfire from British battleships without sinking, before possibly being sunk by torpedo.
Hood was engaged in a turn to bring her aft turrets to bear, which was required to double the number of guns able to fire on Bismarck. She would not have been able to alter course that much without slowing that turn. Couple that with the fact that Bismarck was using ripple fire, where full salvoes were not being fired but instead each turret fired as it was reloaded, and there was not really a gap in which to manoeuvre. The Germans had independent fire control in each turret, and it was generally accepted that German fire control was second to none in the world at the time (typical German efficiency!)
It is arguable that maybe Hood and Prince of Wales should have got closer before turning, but that is debatable, and only makes sense if you realise how much weaker Hood was to plunging shells from long range fire compared to shallower trajectory fire, which would have tended to hit the more protected sides of the ship.
Hood was a ship from a previous generation. Because earlier ships did not have the elevation on their guns, their range was limited. At shorter ranges, shells are more likely to hit the side of the ship rather than the deck. In addition, whatever deck armour Hood had was arranged over several decks, rather than the all-or-nothing thick armour that came about as a result of the analysis of gunfire trials against SMS Baden after Hood's design was cast. As a result, Hood could probably have stood toe-to-toe with Bismarck for a considerable time at close range, but not at the range that the battle was fought.
The "luck" was where the fatal shell hit. It is widely regarded that it penetrated deep into the ship because of the weak deck armour before exploding, and then detonated either close to the fixed above waterline torpedo tubes, or in one of the magazines or the access ways that lead to the magazines. This was the cause of the loss of several battlecruisers at Jutland, and which had only partially been addressed in Hood. There were supposed to have been flash curtains between the magazine and the power room below the turrets, but it is theorized that these were open, because they slowed down the reloading of the guns.
As a result, munitions in the Hood were ignited, which led to her quick loss. There are many other places where had that shell hit, there would have been significant damage, but no loss of the ship.
Re: The Impact On The Public Was Terrible
Yes, but Scharnhorst was not equal to Bismarck, and was engaged by a force consisting of several heavy and light cruisers that could match her in speed if not armament, and HMS Duke of York, which easily out-gunned her. The mistake was to allow DoY to get close enough to engage with her superior firepower. Once that happened, Scharnhorst did not really have a chance. But even then, it is not clear that the British ships actually sank her. The German hull design was outstanding, and it was proved over and over again that German capital ships were difficult to sink.
The real deciding factor was the British radar, which allowed the ships to locate and herd Scharnhorst, and then engage in dark conditions that would have been impossible before the advent of ranging and fire control radar. This enabled the British to fight in the almost total darkness of an Arctic winter night. In theory, Scharnhorst should have been able to run away from the encounter, but her captain made some poor decisions, and did not know where the British ships were.
A similar encounter between a single KGV battleship and either Bismarck or Tirpitz would not have been anything like the same. I would have expected a 1-on-1 battle like this, even with supporting British ships, would have resulted in either both ships leaving damaged, with the worst damage being suffered by the British ship, or the British ship being sunk. As was the case in the Denmark Strait with Hood and Prince of Wales. If PoW had not withdrawn she would have been more seriously damaged than she was.
The KGVs, although modern ships, were smaller, slower, less heavily armoured, and although they had a bigger broadside, it was of smaller calibre guns (14" vs. 15") with a shorter range and penetrating power.
Up until the advent of the IJN Yamato and Musashi, and the US fast battleships, Bismarck and Tirpitz were regarded as the most potent battleships afloat. The British had to rely on numbers rather than the strength of their ships to counter them.
Re: The Impact On The Public Was Terrible @Reticulate
It was not Force 'H' that sank Bismarck, and depending on what you read, it may not have been the British warships that caused her to sink at all.
After Bismarck's rudder was jammed by a similarly lucky torpedo strike from one of Ark Royal's Swordfish, HMS King George V, from Scapa Flow, and HMS Rodney, originally journeying to Canada for refit, engaged Bismarck, and fought her to a near standstill, but Bismarck was still floating and under power when KGV and Rodney had to withdraw because of lack of fuel. It was left to the cruiser and destroyers to try to sink Bismarck. It is debatable whether the numerous ship-launched torpedoes are what caused Bismarck to sink, or whether she sank due to the scuttling valves being opened.
It is clear that Bismarck was finished as a German warship, but as I said, it may not have been the British that sank her.
Re: The Impact On The Public Was Terrible
This is what you get when you make a ship "The Pride of the Royal Navy". The ship was almost a celebrity in its own right, appearing in naval reviews, newsreels, tea and cigarette cards, encyclopaedias, and 'Boys Own...' type books between the wars. The same could be said about HMS Ark Royal, as well.
It is clear that when HMS Hood entered service in the early 1920's, she was one of the most modern ships afloat, with her sheer size, speed and beauty, for a warship, making her easily recognisable.
Unfortunately, she inherited the worst design characteristics of British battle cruisers from the first world war, and rapidly fell behind contemporary capital warship design between the wars.
In the mid 1930s, she was supposed to have had a major refit, strengthening the deck armour, changing the shell supply system, and having 'modern' fire control, aircraft detection radar and defences fitted. But the uncertainty over when hostilities with Germany would start, and the emergency capital shipbuilding programme that was in progress, meant that this never happened, and when the war started, she was in a very poor condition. She should not really have been sent against an adversary such as Bismarck, especially not with HMS Prince of Wales, which had not actually been accepted into service, so was not ready for combat.
But such was the desperate need for capital ships, there was no alternative, and the rest is, as is said, history. There was an element of (bad) luck involved, but the outcome of that battle was almost a foregone conclusion. Hood would never have returned from the encounter in a good condition, and in hindsight, the outcome, although tragic, was actually about as good as could have been expected. This is because sufficient damage was done to Bismarck that rather than continuing on into the Atlantic, she turned and headed for Brest for repairs, which gave Force 'H' the chance to find and damage her further, leading to her eventual demise.
If Bismarck had made it back, and remained a potent force until Tirpitz was completed, the Royal Navy would have had to keep significantly more ships in home waters, and escorting convoys.
Imagine how difficult a force consisting of Bismarck, Tirpitz, Scharnhorst and Gneisenau, together with the Hipper class cruisers, would have been to cope with. It would have been really difficult for the Home Fleet to stand up against it, even though the Royal Navy would technically outnumber them.
The spin-off from that would have changed the outcome of the war in the Mediterranean and the Far East. It is often easy to overlook the value that the British carriers gave in the Indian Ocean and Western Pacific while the US was so woefully short of carriers after the battles of the Coral Sea and Midway, and the Med was critical for North Africa.
So the loss was tragic, as is most war, but it served a purpose.
OfficeJet G55 and OfficeJet 5610 - Integrated print heads. Both are old, but neither are two decades old. And the G55, costing >£400 was not cheap either.
All of the Officejets I have access to use integrated print heads with the cartridge. As does a recent Photosmart that I've used. I'm only really using HP SOHO printers, and I guess mine are quite old, so it may be that the higher or more recent model printers don't use integrated print heads.
I think the answer to your question with regard to replaceable print heads revolves around the fact that the product life-cycle is pretty rapid. Once a model is no longer sold, the parts are no longer manufactured. As a result, there are only a finite number of spares around, and if the company have done their R&D correctly, they probably won't keep more parts than they will need for warranty re-manufacture.
Once the product is out of warranty, chances are the parts are in very limited supply, and the marketing model is such that most people won't go to the bother of stripping a printer down to replace the print head, but will just buy a new printer.
There are two different methods of providing ink, typified by HP on the one hand, and the older Epson printers on the other.
HP, along with Canon and also Lexmark before they left the market, provide you with a cartridge which includes the print head. Every time you change the cartridge, you also change the print head. This makes the cartridges much more complex (and expensive), but at least it should maintain print quality over the lifetime of the printer.
Epson and Brother cartridges are buckets of ink, and if you go back to the late '90s, that's literally all they were. Plastic boxes filled with ink, sometimes with foam to control the ink, together with the required holes to let the ink out. More recently, they've had a small amount of electronics in them, supposedly to allow the cartridge to monitor how much ink is left, but actually IMHO to try to make sure that you only use genuine cartridges.
I currently have an Epson R1800 photo printer that takes T054X cartridges. I have cartridges from other Epson printers. I've recently found that the physical cartridge from another printer designated T06XX (i.e. the next generation of printer) will fit in the R1800, but the electronics prevent the cartridge from being recognised properly. This strikes me as being a blatant artificial control of the post-sale ink market.
I kept a Stylus 1160, and before that a Stylus 880 (both models without electronics in the cartridge), going for many years as my always-on network-attached printers, because the ink was (comparatively) just so cheap. The 880 eventually had some print nozzles permanently blocked regardless of how I cleaned it, and the 1160 developed a power supply problem. That's when I picked up the R1800, hoping it would be similar. Unfortunately not.
So it sounds as if Epson are going to go back to their old method of making the printer everything, and the cartridges/ink tanks nothing other than reservoirs for the ink. Great. Just don't push the printer price up too high for artificial reasons.
Re: Nice title
Maybe El Reg should keep a running downtime count for the Government use of MS Office Online, and use the running total every time they have to carry a story like this!
Re: Rather sad @dajames
Population maintenance in developed countries is a problem, especially in the higher demographics. Many high achievers do not procreate sufficiently, possibly leading to a Darwinian decline in the aggregated capability of the population as a whole. Add to this the fact that the right to found a family is article 16 of the UN Universal Declaration of Human Rights, whereas slobbing out in front of the telly is not listed.
And, <contentious_statement>I think that digging wells for villages in Africa, to increase the health and thus reproductive viability of the people there, will probably have a greater effect on world population than taking time off to have your 2.4 kids.</contentious_statement>
Be careful what examples you use!
Footnote: For the sake of my down-vote count, I completely believe that keeping people alive in a sustainable manner is better than allowing them to die, however...
Re: Drivers etc... (oh, the pain)
There was a mixed bag when it came to software modems. I actually found that I could get mwave modems working relatively well in Thinkpads around the same time as you had trouble. Admittedly at the time, Redhat 7 and 8 (not RHEL) did not have the mwave driver in their repositories, but it was available, and built relatively easily on mainstream Linux distros, and worked quite well.
I did have an HP Riptide sound/modem card combination card that was more troublesome, and I completely gave up on that, both for sound and modem.
I've not tried to get integrated modems in laptops working recently. It's all a bit pointless since 3G dongles and public/guest WiFi are so common.
Re: Indeed, this should be the case
But rather than bitch and whine now about something that cannot be altered, because time as we know it cannot be reversed, the people wanting to change it should bite the bullet, and actually work to ensure that the correct mix of people are entering professions, whatever they are, today for the next 5/10/20 years, depending on the type of role and level.
Like many things, to achieve a particular goal, it is always necessary to make adequate preparations.
Indeed, this should be the case
Now encourage a diverse set of people to start at the bottom, and then wait 5/10/20 years for them to gain the correct qualifications and experience, and hope they stay the course.
I really get fed up when activists of all sorts don't take the training/experience lead time into account when considering diversity. They absolutely need to look at the bottom of the stack, and be prepared to wait sufficient time for people to mature, rather than assuming it can be fixed just using quotas.
Actually, Currys were just fulfilling their legal obligations. When buying TV receiving equipment, by law they have to gather and pass on identity information to the TV Licensing authority.
If you pay by card, they will normally just pass enough information from that so that identity can be obtained from the bank. Alternatively, if you use a store loyalty card, that will suffice too.
I once bought a TV aerial amplifier from Tesco, wanting to pay cash, and having just lost my keyring clubcard. They refused to sell it to me without me providing my name and address. They did not even relent when I pointed out that it was not technically capable of receiving a TV signal, and that where it was going was not my house (I was getting it for my parents).
I know for a fact that they use Tesco clubcard information, because our card has a typo in the name on the card that we've never corrected. And after buying a TV, we got a nasty-o-gram from the license enforcers claiming that they could not identify a valid TV license under the name and address that the clubcard was registered to. I did nothing, waiting to see whether someone would actually spot that garthercole and gathercole actually only differed by one letter, and at some point they must have, because there was never any follow-up. It's a bit of a shame. I would have loved to have seen that go to court to watch it be thrown out.
What really annoyed me was when another shop asked me for the same information for exactly that purpose when I bought a simple DVD player! That really took the piss.
I believe I've heard that deliberately giving false information when buying TV receiving equipment in the UK can be deemed as fraud.
Edit: Hmm. Others beat me to this while I was typing it up. Must remember to be less verbose.
Re: Hang them by their lab. coats.
If you think that all that is required to create an HPC system is to bolt many utility systems together, then you really don't understand the problem. There is a diminishing return as you spread some workloads across more and more cores, even though this is what they are forced to do nowadays because they've pushed single-core performance pretty much to its economic limits.
I was following the porting of the UM weather model onto the Power 7 processor in p7 775 systems, and I can say without any hesitation that there was a turning point where adding more processors made the model run slower, and the drop in performance was very rapid.
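As a back-of-the-envelope illustration of that turning point (a toy model with made-up numbers, nothing to do with the real UM code or the 775's interconnect): take Amdahl's law and add a communication cost that grows with the number of cores, and you get exactly that shape, with the speed-up peaking and then falling away.

```python
# Toy scaling model (illustrative assumptions only):
#   a serial fraction s of the work cannot be parallelised,
#   the parallel part divides by n cores,
#   and communication/synchronisation overhead grows roughly linearly with n.
# Normalised run time: T(n) = s + (1 - s)/n + c*n

def run_time(n: int, s: float = 0.02, c: float = 0.00005) -> float:
    return s + (1.0 - s) / n + c * n

if __name__ == "__main__":
    for n in (1, 16, 64, 256, 1024, 4096):
        t = run_time(n)
        print(f"{n:5d} cores: relative time {t:.4f}, speed-up {1.0 / t:6.1f}x")
    # With these made-up numbers the speed-up peaks at roughly 100-150 cores and
    # then gets worse as the communication term starts to dominate -- the same
    # sort of turning point described above, if not the same numbers.
```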
Understanding why this is the case can be a challenge, and one that cannot be generalised or codified such that it can be addressed by current software development tools. There may come a time when this is possible, but the cost of doing it has not been justified, and it may never be worth the man-effort, so we may have to hope that initiatives such as cognitive computing are able to deliver.
There's been a problem with computing for the last five decades or so. The rate of performance increase has been such that software engineering has never needed to keep up. In fact, the creation of software has been allowed to become spectacularly lazy in the assumption that the machines will just be fast enough to cope with inefficient software. This can be seen in the stupid memory footprint and significantly poor performance of much of the desktop tools that are used today.
The only places where the efficient running of code has remained important are embedded controllers and ... HPC. So maybe, rather than producing ever more software engineering process, software houses should go back to simpler engineering techniques, becoming more like HPC rather than vice-versa.
Your extension of the car analogy is interesting. It's very strange that it needed a serious push from regulation before much of this increase in engineering effort was justified. And with the increased engineering, the cost per visit of the magic monks in blue overalls is getting higher, such that it won't be long before it is cheaper to scrap a vehicle than to repair it relatively early in its life. But in spite of this increased engineering, there's still justification for Bugatti Veyrons and F1 cars, and the skilled drivers to get the absolute best out of them. Just as there will be a requirement for real HPC systems, with manually tweaked code.
BTW, there is a whole sector of commodity HPC systems, bought in fixed configurations, with canned application development tool-sets the like of which, ironically, are used by F1 teams! They're just not the headline systems that are in the news, the top 10% as you put it.
Re: Hang them by their lab. coats.
Unfortunately, there is relatively little commonality between HPC systems from different vendors, and as with most large problems, it's the interconnect between the individual system images in the clusters that is most important, and different vendors quite jealously guard their specific implementation to maximise the value of their investment.
Unlike general purpose servers, there are a lot of tricks that go into HPC servers to make them as fast as they can be. Besides the interconnect, there are different ways of packaging multiple processors in as small a power and space footprint as possible, and once you start putting so many processors into single system images, especially if they are heterogeneous processors (think hybrid or CPU/GPU processors), the way that the memory is laid out and accessed becomes very important. All of this can affect the way that the code has to be written, even though there are relatively efficient abstraction layers such as MPI, OpenMP and MPICH.
This means that in order to get the absolute maximum utilisation, there is a long period of tuning when porting code from one to another. For example, the installation of the Cray XC40s that are currently replacing the IBM P7 775s at UKMO is a project that has been running for over a year, from purchasing decision to final switchover, and much of that is taken up with the porting and resultant checking of the models between the systems.
I suppose that normal commercial systems vs. HPC systems is a bit like the difference between a Ford Transit and a Formula 1 car. You definitely want to invest in making the F1 car as fast as possible.
Any programme that results in a consistent, efficient programming model that abstracts the system specifics to allow increased portability of code would be welcomed by pretty much everybody in the field.
I'm surprised nobody else has commented on the breadth of the T's&C's.
Quotes from the T's&C's
"...to us and the Event Provider(s) ..."
"...actions inside and outside the venue..."
"...regardless of whether before, during or after play or performance..."
"...for any purpose ... any medium or context now known or hereafter developed"
"...without further authorization..."
OK. Reading this completely literally, this means that you've given the organisers and Police a blanket authorisation to record you for ever, and use that data for whatever they want, and this covers any image data that they already held before the event as well.
Anybody else a little bit worried by this? I would hope that it is so broad that it could be challenged, but unless it is deemed unfair by a court, it could have far-reaching effects on the attendees' future rights.
My goodness. Two pints by volume of sweets for 50p. I'm not sure sweets have been that cheap since decimalisation. Is Cameron old enough to remember decimalisation? Probably not, he was only 4 and a half at the time.
Oh. You meant 4 ounces. That would be a quarter, not a quart! About 113g.
Family Friendly Filters
As the bill payer in my household, I'm waiting for the invite to turn the block on, and have been for a while.
I've not seen it, and I can still get to porn if I want, so I must assume that it's not in place.
I was always sceptical about this process. I suppose it's possible that one of the other members of the household may have seen and accepted it, but it was supposed to be such that only the person whose name the account was in is able to complete the form.
Does anybody on Orange/EE as an ISP have experience of how it was supposed to work?
I wish politicians would learn...
...that the Internet is trans-national, and as much as he would like to, he can't penalise a company outside of the UK.
All he can do is to try to get the UK ISPs to block access to offending sites, but as we've seen from TPB, that's like playing whack-a-mole.
I can sympathise with trying to keep certain content away from vulnerable people, but that doesn't mean that I can see a way of doing it without breaking the Internet!
"ARM makes the chips..."
No. How many times do we have to point this out?
ARM designs the ISA and the cores for the chips, and then licenses these out to the actual chip makers.
If ARM actually made all the processors, they would be one of the biggest companies in the semiconductor manufacturing sector.
I don't know about self-cooking
After all, you have to provide light in the first place. Just make it infra-red, and it will cook the pork much more efficiently without turning it into micro lasers.