1351 posts • joined 10 Apr 2007
I see your point, but Intel pushing a bit harder (and fairly - hah!) should push ARM to get better, just like ARM gave Intel the severe kick it needed to stop producing small thermal generators and start making more power-efficient chips instead.
Intel wiping out ARM is a different prospect to Intel vs AMD, as ARM licenses its chip designs and sub-components to be manufactured by a wide variety of companies, some of which are at least as influential as Intel.
I thoroughly agree that x86 is often not the correct chip for the job. These are desktop PC chips at heart, and while subverting them for use in mobile phones and tablets isn't inappropriate - these are genuinely portable computers, often with capabilities far in excess of what we had on desktop systems ten years ago - for lesser systems such as basic PLC and control systems they are overkill. The convoluted and inefficient instruction set, combined with the higher integration requirements and subsequent costs, really holds them back as well.
Competition. Yay! Good! ...and so on.
...now if they were to mandate well (and clearly) defined interfaces between systems then they could pick and choose suppliers as they see fit and choose the solutions that are most effective, reliable and cost-effective for the job.
Unfortunately we all know that this won't happen and, from past experience with the five listed, each will assign twenty project "managers" to each project. These project managers will change every project meeting, and the chance of meeting the same one twice will be slim. Nobody in these five suppliers will take overall responsibility for anything, and all the actual, real work will be sub-contracted out to sub-contractors almost as frustrated as the client (Network Rail).
Re: I've said it before.
Oh no... and it's not even Friday.
My prediction, as ever, is "Yet Another Rubber Faced" loon. Like the previous few.
Noddy Holder would make it special though!
Cloud (online data storage mainly) security, the real side of it, is a continual minefield.
In the end, it's safest to work on the assumption that if the data is replicated out of the EU then it has gone somewhere insecure that has no real concept of privacy, such as the US. The US Safe Harbor agreements are never enforced or checked and are usually so specific in use that your data will bypass the agreements and won't be covered by them... and the Safe Harbor agreements only dictate what the company may voluntarily do with your data in their possession; as noted here, they have nothing to do with legal, government or other processes.
Where I work, a small amount of our data is extremely confidential and sensitive as it relates to high-profile events and court cases; a sizeable chunk is commercially sensitive, such that the company involved would not like a competitor to access it; but most is trivial and of little interest to almost everyone. To be safe we work on the basis that it's all very confidential, and as a result there's no way we can seriously consider cloud data or cloud application hosting - especially if the service provider mirrors the data outside of the EU. We'd also be implicitly trusting all staff, contractors and other third parties involved with the hosting.
The thing is, compared to contemporary PC games at the time of each release, Halo has always been found wanting. The less-than-optimal controls are fine for a console but poor in comparison to the PC FPS gamers' choice of mouse and keyboard. The game itself was unremarkable compared to the features, visuals and gameplay of the PC FPS games that arrived at the same time, and the attempts to convert Halo from XBOX to PC really only served to highlight the gulf in quality. It doesn't help that the XBOX hardware was static while PC hardware always moved on, but that's the nature of console game design and development.
Not that Halo isn't a fun game - it's polished enough that it's good fun to grab some friends and shoot them, or even shoot with them! But compared to similar games it's nothing special, so its cult following on the XBOX does tend to leave PC gamers with bemused expressions.
I'll get me towel...
While some viruses do jump species, in reality complete jumps are very, very rare. We're all exposed to millions of viruses every day, especially those who work or otherwise live with animals, and most of these viruses are just not capable of properly infecting humans.
The reason is down to what viruses actually are. To put it in a fairly basic manner, they are a form of life that cannot reproduce on its own and instead has to invade other life (cells) and hijack the mechanisms of those cells in order to reproduce itself - essentially the lowest form of parasite, and just like higher forms of parasite they need to be specialised to do this effectively. From Wikipedia: "A virus is a small infectious agent that can replicate only inside the living cells of an organism." (http://en.wikipedia.org/wiki/Virus).

To start with, many viruses have to survive in the open environment, or at least outside of a host organism. HIV, for example, is not particularly tough and instead has to be transmitted without using the open environment... which is why you won't get HIV through (non-intimate or blood-sharing) contact with a carrier, through touching what they've touched, or by inhaling after the carrier has sneezed or coughed.

A virus then needs to be lucky enough to find cells that it can utilise, seeking these through the chemical markers of the cells, latching onto them, invading them and hijacking the inner mechanisms of the cell. Through all these steps the virus is exposed to being cleaned up, removed, disabled or rendered useless by basic biological and chemical processes - or, in the cross-species case, simply unable to get far as it will target the wrong genes, mechanisms or chemical signatures.

Once in a cell, a virus uses and controls the cell's mechanisms to reproduce itself (some viruses also cause the cells to reproduce, which is the general cause of virus-induced cancers). Even then, if a virus causes the cell to outwardly reflect that it is not working correctly, the cell, and the virus in it, will be destroyed by the body's immune system. It's not easy being a new virus!
Viruses mutate a lot, so when living continually with animals that are shedding viruses there is always a chance that one virus strain might happen to be able to infect a human. A certain level of "infection" is quite common but usually entirely harmless, as the viruses fail at some step of their hijack-and-reproduction process. For example, the virus might be able to infect a cell but not to reproduce. Even should a virus happen to manage to reproduce itself, it still has to find its way from one human to another, and that's a very different problem, as it involves getting out, crossing between hosts somehow (HIV takes shortcuts on this and needs direct transmission) and then evading the immune system and defences of the new potential host.
I think other primates are immune to (human) HIV because the HIV virus specifically targets human cells - it has to be, it's that specialised and effective. It's not that other primates are immune in the sense that their immune systems can deal with it; rather, other primates have enough differences in their cells that the human HIV virus has no overall success in infecting them. Other primates have their own versions of HIV, and these are similarly too specialised to infect humans.
Many of these are designed to facilitate a Bring Your Own Device (BYOD) environment, such as the ability to print using Wi-Fi Direct, share the screen using Miracast, pair with printers using near field communication (NFC), and have Windows 8.1 devices act as Wi-Fi hotspots via built-in broadband tethering.
Huh? Just how much BYOD FUD is being paid for around here?
Print using Wi-Fi Direct... most useful for tablets and laptops printing to non-domain or non-local printers, i.e. go to a friend's house and print something on their printer. A BYOD system in an office environment will connect to the local network and be given access to printers through the carefully controlled printer-access functionality that the BYOD sellers are selling.
Share a screen with Miracast. Nope, I'm at a loss. This is particularly related to BYOD how?
Pair with printers using NFC? Sounds like basically the same as Wi-Fi Direct...
Windows 8 Wi-Fi Hotspots... so, useful for the home, bugger all of use in a corporate environment and nothing to do with BYOD.
Well that's an odd one isn't it?
In theory, if a company's selling point - like the previously mentioned Abercrombie and similar - is that they have fit, healthy-looking, attractive staff to front their store, should they not be allowed to advertise and look for these people? Likewise, when a fashion house is parading its clothes it needs models of particular looks and sizes to model them; is it wrong to have this requirement? Don't car companies have requirements when picking the models to drape themselves over their latest luxury cars?
Not to say that it's right or wrong, but political correctness can go too far and some jobs do need above average staff - not necessarily just on the physical looks front either.
Re: At least with this site...
Yes. And better to accept this and understand that this is the way things are than to try to deny it.
Re: racism is racism
Not forgetting to get an (Indian*) curry and sit on your (Swiss) furniture. And the flag you're waving will probably have been made in China anyway...
* Yes, I am aware that many "Indian" dishes that we are used to originate outside of India.
Re: Special cable
As this special USB cable would sit between the computer and the standard Apple cable/circuitry, would it still charge? From my understanding this would make the setup little different to a low-powered plug charger.
Supporting touch is good. Even Windows 7 made huge strides on that front.
Crippling the functionality of an OS's UI so that it's only properly usable on a touch screen, that's not so good.
"But we also recognize there are many non-touch devices in use today"
As in 99.999%(*) of the things are non-touch. Which makes targeting an OS UI so that it's only properly usable on (smallish) touch-screen systems a stupidity of amazing incompetence.
"The new tip appears anytime you move the mouse to the bottom left corner of the screen, and is always visible on the taskbar when on the desktop."
Always visible in the desktop mode is a good start. Invisible buttons (and actions) in arbitrary places on the screen are still an example of some of the worst UI design possible. If a user is not using a mouse, how are they meant to know that an arbitrary screen location has any functionality, or what it is? For the rabid lovers of keyboard-only interfaces, or MS shills: it is very important that all actions are obvious or discoverable by users. Users are not telepathic; they have many different experiences, expectations and capabilities, and presenting a largely blank screen to a user and expecting each one to somehow "know" to thumb the correct spot on the screen takes UI design incompetence to a new level. Even I had to Google, using my phone as the computer was unusable, to find out how to unlock the ****ing lock screen on Windows 8 the first time I came across it.
* Just an air plucked figure, but just as likely to be correct compared to many statistics.
Re: Where's The Eadon Angle?
Eadon (http://forums.theregister.co.uk/user/34672/) hasn't made any posts since Friday 24th May. Maybe he's burst a blood vessel or been kidnapped?
So MS partially correct the misuse of invisible UI elements but it's just window dressing, not real change and certainly not what users are screaming for. How not surprised do I look?
Precisely - it's not the lack of a Start menu that's the problem. The full-screen Start menu isn't the worst thing in the world, if done right and with enough flexibility that the damn thing doesn't look like a useless screen full of incomprehensible icons, which I suppose they're starting to fix. It's the idiotic invisible "you need to know to thumb here" mentality in the entire UI that's the most pressing problem. I'm not sure whether the second most serious problem is the context switching between the Metro (whatever) and desktop environments, with the differences between the applications in them, or the fact that half the interface is only properly usable with a touch screen - which is utterly useless for desktop systems, barely useful for laptops and only properly useful for tablets.
Subjectively, I find the entire Metro UI - the capitals everywhere, the irritating lack of a way to easily find functionality, and the bland icons of nothing much - very ugly. But that's my opinion; some people like it.
Re: It's all about value. Time to change
To add to this, when it comes to staff who vaguely know one end of what they're selling from the other... consider the "successful" tech shops - the likes of Carphone Warehouse, Apple Stores and similar. These try hard to have staff who have a clue about what they're selling and can help the customer with it, giving people a reason to go to the store.
Other successful shops try, and even sometimes manage, the same as well... such as Boots (with their makeup specialists), John Lewis with their departmental product specialists, even Tesco have specialists in their mobile sub-shops. Unfortunately it doesn't always work and many stores are often staffed by uncaring imbeciles, but given the pay and working conditions this isn't always a surprise.
The closing stores, the likes of Dixons, were (are) staffed by muppets at every level. Some stores slip through the cracks, like PC World, and continue to survive, but these are the exceptions, and are more down to them being closer to specialised supermarkets than places to learn about what you're buying.
Re: High Street
Agreed. It used to be a running joke that all the banks and post offices in town centres had no staff on at lunchtime - because the staff were at lunch themselves. While it's true that they need lunch, they are there to provide a service, and a vast number of their customers who need their services would use them in the little spare time they have... i.e. at lunchtime.
Some people get this service business, others don't - guess which ones are successful? A friend of mine owns a hair salon where her business plan was to provide an out-of-town salon with in-town quality and style, but at convenient hours for those who work. The result... she and her staff work Tuesday through Thursday evenings until 9pm, are open all day Saturday, and have Monday off and start work later to compensate. It's a very successful salon because it provides services at the times they are needed.
Ah.... Friday again :)
Love the bit about personal emails :)
Re: For people who knew no better
Well, Amiga corp lost the plot, which didn't help them much. They had some fantastic plans for new releases of hardware and OS, plans that all went a "bit" (as in spectacularly) wrong amid lost designs, political infighting and a general loss of direction.
From discussions I had with Amiga engineers around then, one of the planned (or dreamed-of) features at the time was for the Amiga windowing scheme to be more closely linked to the display hardware: if a window needed just 2 bitplanes, or 4 or 8 (or 24, I suspect), it could request this and the window area would be efficiently allocated and managed by the hardware. This way high-colour windows would use just the resources they required, low-colour windows would use as little as possible, and the entire display would be handled in hardware, giving extremely fast and efficient windowing operations. Even from the start Amigas had hardware pointers for the mice (it took years for Windows PCs to approach the smoothness of the Amiga mouse pointer) and multiple, stackable displays with different colour depths and heights; the evolution of this was to support different "screen" widths as well and allow them to be managed as if they were windows.
But things moved on, the Amiga unfortunately faltered and died and here we are now.
The Amiga wasn't the only system doing the rounds at the time - there were of course the Atari ST and the Archimedes as well, both very capable systems in slightly different ways. And given the choice you could always use an expensive PC, if your colour palette of choice was black, white, cyan and purple; or, if you had much more money than sense and could find one, you could buy and use a Mac - often monochrome, and usually very closely tied to the Mac's strength at the time, Desktop Publishing, but very effective for it.
Re: Let's talk photon counts and well sizes
The comparison of the outside lens area, and therefore the difference in maximum photon collection, is all valid. However, it's the poor efficiency of the lenses and the poor efficiency of the sensors that are the biggest problem. A larger collecting (outside) lens will help, and if the quality of the lens and the sensors are the same then it will obviously be better. However, there's much more to it than that...
If the sensor is twice as wide and high in one camera (quadrupling the available area) then the lens does not need to focus the light as tightly, allowing better image detail as there is inherently more accuracy through improved tolerances.
The sensors are also inefficient: in current devices a large proportion of the incoming light is lost as it hits support circuitry and the gaps between sensors and sensor groups rather than hitting the sensors themselves. Improvements to the sensor orientation (vertically stacked circuitry) and other techniques are coming but don't appear to have made it to market yet. I can't remember the exact figures, but right now it's something along the lines of 80% of incoming photons not hitting a sensor. Better collection results in a wider sampling range and therefore more accuracy across the scale.
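To make the area argument concrete, here's a rough back-of-the-envelope sketch. All the numbers (sensor dimensions, fill factor, photon flux) are made up for illustration, not measurements of any real camera; the point is just that doubling each linear dimension quadruples the collecting area, and a poor fill factor scales the whole result down:

```python
# Illustrative arithmetic only - dimensions, fill factor and flux are
# invented example values, not specs of any actual sensor.
def photons_collected(width_mm, height_mm, fill_factor, flux_per_mm2):
    """Photons landing on *active* sensor area for a fixed exposure."""
    area = width_mm * height_mm          # total sensor area, mm^2
    return area * fill_factor * flux_per_mm2

flux = 1_000_000  # photons per mm^2 for some fixed exposure (arbitrary)

small = photons_collected(4.5, 3.4, fill_factor=0.5, flux_per_mm2=flux)
large = photons_collected(9.0, 6.8, fill_factor=0.5, flux_per_mm2=flux)

print(large / small)  # -> 4.0: twice as wide and high, four times the photons
```

The same function shows why fill factor matters as much as size: halving the fill factor halves the photon count exactly as if the sensor were half the area.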
I'm tempted by one like the Rikomagic that was linked in the first post here... connect to WiFi, install XBMC, configure access to the NAS and job done - one more media centre deployed. It's the additional wires that may be needed that could be annoying; these things are unlikely to attach directly to a display (or will look appalling sticking out of the side if they can), so they'll need a (short) extension cable.
But the same would apply to the original item this post was about - it may often not fit without an extension cable, and would need the hassle of external power as well.
Re: Maybe a bit flawed?
I've given up paying any real attention, or suspending my disbelief, whenever I see Forrester given as the source...
They're just paid to write "research" articles that reflect whatever their customer wants. Hence the inevitable BYOD mention in this report, and a lot of statistics that, frankly, given the source figures (or even reality) could be spun to reflect whatever message Forrester's paying customers want.
Re: Real science
@AC - May 2013 04:48 GMT
Please use more CAPITALS. Foaming at the mouth comes across much better on the Internet when typing all in caps.
Re: Let's hope LM get the accelerometer the right way up this time.
Most well-designed industrial electrical or electro-mechanical components (above the very low level) are designed so that they cannot be fitted in an incorrect orientation. It's always possible to stuff up the original drawings, but with modern circuit software and simulations, if the part is correctly described then an incorrect orientation should be flagged up very quickly. Unfortunately I've come across a lot of custom parts that don't stick to this basic principle and have seen a lot of destroyed components as a result - and have done some of this destroying myself. :)
One-off pieces of machinery like these probes are largely assembled by humans, and designing from the ground up on the basis that the assembling human will insert parts incorrectly given the slightest opportunity is the right way to design. Unfortunately I have come across "engineers" who, when faced with parts that didn't fit in the orientation they tried, bent or removed pins to force the part to fit rather than rotating the component 180 degrees. You can't always protect yourself from idiots, but I'd hope that the NASA team employed better assemblers than these.
Competition is definitely good.
Improvements to IDEs for Android are also good, and while Intel's development tools don't have the widespread use of others, they are very competent in places.
Re: Planned for 90 days, still going after 9 years...
Sounds like most "temporary solutions"...
Re: Seriously, PHP?
Hmmm... So because you don't like PHP and/or can't write neatly in it then nobody should?
Not that PHP doesn't feature a lot of stupid, especially in the various attempts at classes and objects, but like any flexible language you can choose to hang yourself with your code or to write it neatly.
Bring on the idiotic holy language wars...
Maybe it should be: If you can't grasp assembly then you shouldn't be writing code...
Re: Nokia was right
Where does it show that other Android mobile makers aren't making money? They may not be making as much as Samsung, down to Samsung's sales volumes and the accompanying economies of scale, but they can still make money.
Re: Hang on...
Completely. But sanity has nothing to do with this: every one of these 3.2% definitely, absolutely, cost the media industry $millions each. That's $millions in sales they'd never have got in the first place.
Not to justify it, just to quantify against some form of reality.
There's an interesting disparity between fair, moral and the letter of the outdated (or caught out) law.
For example, I've downloaded ebook copies of the series "A Song of Ice and Fire" (Game of Thrones). Why? A friend lent me the physical books, in the way that we've borrowed and lent books for years, however I really didn't want to lug a dead tree or two around when I can have the convenience of them on my mobile device. I still have the (lent) pile of books on my shelves, I'm not lending them out to anyone else... but I'm still a pirate.
Likewise, I've downloaded copies of most of the movies that I own. An entire floor-to-ceiling cupboard of DVD-sized boxes takes up a lot of space, and having them all on one central media server (XBMC) makes a lot of sense - we've watched some great movies that we'd previously forgotten we owned, having lost them in the piles. It's worth remembering that the whole point of creative media is to be consumed; it's no good sitting there on a shelf - although the media cartels seem to think it's all about making them richer.

Why download? It's quicker and less hassle than ripping them myself, and I can usually download them in better quality than I can rip them as well. Once ripped, we don't have to sit through the unskippable "piracy is theft" lies fronted on all the DVDs (and the unskippable adverts on Disney films) and can go straight to the movie itself - who really cares about the extras? Then there's the annoyance of Blu-Ray, with its useless further extra features and the extreme tedium of waiting for the disc to get to the point where you can actually start watching what you bought it for. But technically I'm still a pirate.
Re: Oh Dear
Look at the current world's top smartphone manufacturer...
Does this company produce phones for different OSes, hedging its bets? Yes. It might not have a wide range of devices, but it does have a wide range of OSes, thereby keeping knowledge and skills it might otherwise have lost. It also keeps the suppliers on their toes, as they know they need to keep improving.
Does this company somehow manage to promote their brand over the brand of the Operating System? Yes. The platform / Operating System is the enabler, not the crutch.
Now look at Nokia. They have one smartphone OS and they trumpet this as a sales ploy.
Re: If only there could be "Silvermont"-based netbooks
Wasn't it Intel that purposely hamstrung netbooks by restricting what was allowable when using its chips and chipsets? They could have been good...
From the marketing junk I was subjected to, the ribbons were in place to replace toolbars, not menus. The MS hack demonstrated just why the ribbons were so good and every user was an idiot, by enabling every single toolbar in MS Word and going "voila - look at the lack of space this leaves the user for documents; this is why we 'invented' ribbons". A sane attendee (who was shouted down as if he was an idiot) pointed out that no user enables every single toolbar simultaneously, that many of them appeared on demand, and that the ribbon and other new UI crud took more space than toolbars and menus.
The big problem with the Win 8 shell (UI) is that it's an aborted mess of touch-screen-optimised controls, half-baked, with a few non-touch interfaces thrown in and an overriding feel of "how the hell do I do something?" as everything is hidden away. The missing Start button is just one of these idiocies... users need visual prompts that they have options (functions) available; designating random portions of a display that a user needs to thumb to bring up some functionality never has been, and never will be, good intuitive user interface design. There is no such thing as a fully intuitive user interface (it's not possible), but that's no excuse not to try, and the next best thing is a consistent interface.
Re: Not exactly...
Agreed. They are leaps ahead of what they were before. It's no longer a case of "argh, an integrated Intel chipset with an obscure set of digits that'll take a day to track down" and more a case of "that's not so bad, it works OK now".
They're not speed demons, but at least now they are perfectly adequate for the majority of computer users' needs - i.e. using a computer in place of a typewriter and browsing the web a bit.
...and at the very least put a disclaimer on the article stating that it has been "sponsored" by XYZ vendor who happens to foist / sell BYOD management services.
Re: Ahhh I wondered why Flash had gone!
A few years ago I had the misfortune to have to deploy the entire Adobe suite to a small design studio of 12 people. That's 12 people, 8 pieces of software, each of which had a 24 (or so) digit key to type in.
To make the job much more interesting, Adobe didn't bother to indicate which of the 96 keys was to be entered into which product. That's right - all I was given was a list of 96 keys and a couple of these were upgrade keys.
I rang the fuckers up and explained, carefully, why pirating their shit is such a good idea compared to trying to do it legally.
Re: IP profits lawyers and extortion style law suits
I hate feeding trolls, but...
Trademark: usually designated by the symbol TM, otherwise known in industry to really be an abbreviation for "Totally Meaningless". Now a Registered Trade Mark, well, that's something else entirely, which is why it has a different symbol: ®. Registered Trade Marks are meaningful and are worth something.
I have the copyright on a very large number of works. In fact, until now(ish), it includes absolutely everything that I have produced for myself and not for somebody else.
It's a step in the right direction - not the ruling itself, but the principle of being more open and honest about the overall charges. Very much like the finance industry has been made to do with loans.
Now if inkjet manufacturers could just do the same with their "accidentally" DMCA-riddled cartridges and designs that "accidentally" integrate vital components with the disposable ones to force users to buy only genuine (aka extortionate) cartridges from them, the world will be an even better place. Platinum is cheaper than inkjet ink.
Re: So. Jobs was wrong.
From what I know from discussions with real designers of products like it, the exposed antenna was a pretty bad design in the first place. While there's always a need not to shield an antenna by casing it in metal, exposing it was one way round this - but it meant the antenna could be grounded by contact with the user's hands. The models that didn't suffer from reception problems appeared to have a good-quality coating that covered and insulated the antenna; the models that did suffer either lacked this coating or had it poorly or patchily applied.
Oh dear. It's early Friday and already laughing at folk on t'Internet :)
“The amazing thing is that nature seems to have found ways of blowing up a wide range of stars in the most dramatic and violent way.”
That has to be the quote of the day :)
Apple have followed the stable, don't-rock-the-boat path with their phones for a little too long now. The hardware is generally good and well engineered, barring the obligatory build quality failures that plague built-to-a-cost production and a few other poor design choices. Unfortunately iOS itself is really not the whizzy, shiny interface it used to be - it's still smooth and still works, but now lags behind everything.
Microsoft have a much fresher-looking UI, although (subjectively, I find it ugly as sin) it is plagued by more inconsistencies than other phone OSes; while very good for a few tasks, it tends to really fail for others and falls down on quite a lot of basic UI principles. Again, subjectively, I find the look of some of the devices quite disappointing, but this is probably down to the sheer lack of choice, as too many users and manufacturers have been burnt by the 6 to 6.5 to 7 changes. The lack of a wide range of devices and of app support really seems to be hampering uptake, and the Microsoft brand is not one that screams "trust" or "reliability", which is very bad for consumers. While the range of devices is wider than Apple's, so the lack of variety would seem an odd argument, there isn't the cult of Apple or "cool" factor driving uptake.
Android... a very mixed bag of devices. The wide range gives a lot of choice for the consumer, and the manufacturers really are producing a wide selection of looks, feels, features and price points. Unfortunately many of the devices are sub-standard and quite appalling to use, and some manufacturers just fling so many models out there that you're left with no clear idea which are the good ones and which aren't. The manufacturers' own interfaces often detract from rather than enhance the experience, and add horrible delays to the delivery of OS updates, some of which are quite desirable.
...and my take on all of this: the competition is good. Without it we'd be languishing around with whatever crap a few suppliers see fit to give us. Apple really changed the mobile market with a device that was polished at many levels, and while they've stagnated recently it's not impossible for them, bean-counters excepted, to turn things around - though it does feel like they've lost it on the iPhone.
Re: The widespread belief that lithium-ion batteries don't suffer from “charge memory”
From reading the engineering rags relating to this kind of technology, the most damaging thing to do to these batteries is overcharging. There is a very big difference between the cheaper and the more expensive charging control circuits.
To reduce the risk of charge memory (aka tide marks), the majority of modern software-controlled battery management systems have a charge threshold: they won't charge a battery that is past (e.g.) 97% full unless it started below that point. This prevents the device from charging to 100%, running down to 99%, then charging to 100% and so on.
The other big problem is judging the battery capacity itself. The tech and monitoring for this is still improving all the time, and in itself requires quite complicated management software to track the changes over time, and even consider environmental factors, in order to reliably produce even a close approximation of the real battery level.
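The charge-threshold behaviour described above is essentially hysteresis, and can be sketched in a few lines. This is a toy illustration of the idea, not the logic of any real battery management system; the 97% threshold, the class name and the method names are all invented for the example:

```python
# Toy sketch of the charge-threshold (hysteresis) idea - thresholds
# and names are illustrative, not from any real BMS firmware.
class ChargeController:
    RESUME_BELOW = 97  # don't top up again until we dip under this (%)

    def __init__(self):
        self.allow_charge = True

    def should_charge(self, level_percent, on_power):
        # Once full, stop charging and don't restart for tiny dips;
        # this avoids the 100% -> 99% -> 100% micro-cycling that the
        # threshold is there to prevent.
        if level_percent >= 100:
            self.allow_charge = False
        elif level_percent < self.RESUME_BELOW:
            self.allow_charge = True
        return on_power and self.allow_charge

ctl = ChargeController()
print(ctl.should_charge(100, on_power=True))  # False: already full
print(ctl.should_charge(99, on_power=True))   # False: above the resume threshold
print(ctl.should_charge(96, on_power=True))   # True: dipped below 97%, top up
```

The key design point is the state carried between calls: a plain "charge when below 100%" check with no memory would recreate exactly the micro-cycling the threshold exists to avoid.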
Re: @Anon 12:10
But she didn't "cost the majority of miners their livelihood."
Take an industry that is bogged down by tradition, hamstrung by unions and deeply unprofitable... what do you have? You certainly don't have a viable industry. Despite the opinions of the rabid few, these businesses didn't exist solely to provide jobs for the workers who, in between strikes, were employed in them. While I understand that some mines were profitable at the time, and were predicted to continue to be for a reasonable period, most were not.
While there is an element of "greater good" in such infrastructure, there are points at which it makes no sense to prop up failing industries. It's never nice or pleasant for those involved at the actual working (end) level, but these are the realities of life and have been for hundreds of years. Whole industries have grown and disappeared in this country since even before the industrial revolution, many of them fighting tooth and nail, sporting dirty tricks including laws passed through friends in power, the whole lot, but eventually things change. The biggest problem is that areas were so entirely dependent on one industry that when the inevitable happened, it caused wholesale change in the area. These problems were predicted far before this time - mines only have finite resources and have run out many times before - but with growing specialisation and dependency the risks were higher.
Seeing those funky clones reminded me of all the adverts I used to see in the computer rags of the time, featuring replacement cases for Spectrums and similar - giving them hard keys, better angles, all the works really, while largely just moving the internals from one case to another.
Self service requires one of two things:
Trustworthy, knowledgeable users
Systems that are so simple that they can't really be configured wrongly (or if so, where a basic user is able to fix it), or systems so smart that everything is automated.
The last I heard, there were plans to colour (paint) them more appropriately so they don't stick out like ugly sore thumbs. I believe alternative shapes and designs were considered as well; this is really just a cover, as the interior will largely be the same.
But then telephone boxes are big, red, largely useless (now) boxes, often scratched and damaged, but they're part of the sights of Britain now and quite a few are "protected" structures.