Re: Lets clone it.
Yes, but how noisy is the critter? It must somehow live in a quiet cul-de-sac...
It's Biomedical then Engineering(*) and then "others". Between Biomedical and Engineering the rest of the funding is little more than a rounding error.
* Engineering includes topics such as carbon nanotubes and graphene.
Not sure about the large red nose. How about a nose that, for some utterly inexplicable reason, appears to regularly get longer?
Anyway, United is a terrible airline. Got an "upgrade" last year and honestly can't say what the difference was between that and cattle class.
The difference is that you paid more for it? Erm, nope - can't honestly say I saw a difference either. Although at one point a dirty curtain is pulled to separate you from those that didn't pay for it. If you could tell the difference...
Didn't it say that they had to download and use the UA app for this? If so, what has this got to do with the OS features and Android having to have an App?
There are plenty of cross platform streaming libraries out there for precisely this kind of purpose and to ensure that device OS coverage is as wide as possible.
That was a close call there. For one moment I thought you were about to mention... no wait... dammit, I nearly did it myself.
Being within 6ft of pliers is all very well. It's being able to reach the bastards after you've dropped them, for the second time, down a chute which you have to disassemble to retrieve them. Usually you'll require pliers to disassemble this chute...
As big spending customers they have too much leverage for M.S. to piss them off.
It's more fundamental than that. The banks have Microsoft's money...
Nice bank balance you have there, it would be a shame if anything were to happen to it...
That was exactly my thought when it comes to the updates. Technically they are compliant.
* It's not as if Windows XP will suddenly become more vulnerable than it is now.
* These systems run a modularised version of Windows XP with as much of the crud as possible uninstalled or, in the worst cases, disabled. I have configured and deployed Windows XP like this; it is rock solid and the number of vectors for external attack is minimal. For example, you're only vulnerable via the fundamental IP networking stack and your own application listening on it.
* These systems are individually firewalled to control the incoming and outgoing routes of data.
As for why XP? Because of the ease of development and the advantages a "mature" OS brings when it comes to the level and depth of device drivers. While a more restrictive OS would generally provide more security, given that there could be dozens of printer variants to support, dozens of card readers, dozens of screens and so on, separating the application and the device through the OS is the right way to go.
There is nothing "new" for you to jump through in Windows 8 if you've actually bothered to learn the keyboard shortcuts. Which is, incidentally, what you're supposed to do. (Mice are the biggest cause of RSI, not keyboards. You really aren't supposed to use them all the time.)
This may be the ideal, however MS have been steadily hiding keyboard shortcuts in Windows and their applications, making them less obvious and often removing them entirely.
Not really. It would cost £2K per processor I wanted to run it on and I'd have to re-pay yearly or they'd sue me.
You forgot the bit about the documentation being extra, and the product being obsolete and unsupported after a year, leaving a system with serious functional and security holes. And you'll still be expected to pay per processor, or even per core, until you pay for an upgrade and the cycle continues again. Just with higher costs.
(and part 2)
If you kept shooting down all these arguments eventually you'd get to the nicely indefinable ones as people got more defensive. Such as "context switching is disorientating". Oh, grow a brain! You can handle the Start Menu but the Start Screen appearing causes you context disorientation?
See the above point about good user interfaces allowing a user to maintain their point of reference within an interface. Break this and you break the interface. It's not about "growing a brain", it's about producing a good interface in the first place. So it's not about the start screen causing disorientation in itself, it's about where you wind up afterwards in an entirely separate and dysfunctional interface.
I'm not a genius and I seem to manage it fine.
Please don't... I'm struggling right now with this invitation :)
Or "it looks like a child's toy". Well you can't argue against taste so that's fine, but you can set all the panels to grey if you want. It's not a functional argument as to why 8 is objectively worse.
I will admit that its appalling appearance is a subjective point. However the poor design's impact on usability can be objectively measured with time taken for users to find what they want and an aggregate survey of their subjective opinions.
Oh, and let's not forget the video of some chap struggling to launch IE because his son didn't tell him the very basic fact that you can get the Start Screen by clicking in the lower left - something that Windows tells you the first time you start up. Never mind that the moment he was shown this he was fine. Never mind that I could find someone who would struggle with a Fisher-Price toy and video them if I wanted to. This apparently became evidence of how flawed Windows 8 was.
It is good evidence of a single instance of a serious flaw. A good interface should be obvious and consistent. There is no such thing as purely obvious when it comes to user interface design; "adheres to expectations" is the closest we can get. These are expectations such as a "button" indicating an action (not a hidden, arbitrary section of a screen or a mouse right click), and that triggering it gives an indication of success (a poor design will leave the user wondering if the button they pressed worked). Consistency is where you learn something non-obvious but apply it throughout. For example, you love the function of the Windows Key (I do too); however, if this was removed in a later version of Windows, subverted in certain applications or drastically repurposed (e.g. "Windows key now launches Internet Explorer, your portal for everything"), would you be happy and appreciate this? Similarly, a button with a green tick on it indicates acceptance and a button with a red cross on it indicates rejection. Simple things, but without consistency they become worthless.
I feel deeply sorry for the MS engineers. They produced something that was well-thought out, objectively improved in many areas, still had the same capabilities of its predecessors, and when it was unveiled, a large section of the IT community (who should be open to change as much as anyone), did nothing but pour hate and abuse at what they'd worked on.
Some of the MS engineers did a great job, namely those that worked on the stability and reliability of the core Operating System, a chunk of which I suspect was the removal of support for archaic hardware (VM installs of earlier versions of Windows without this legacy crud are also more stable and boot and run considerably faster). Unfortunately those ultimately responsible for the User Interface (User Experience) should be taken to a dark alley and shot as a favour to humanity.
Let's deal with this step by step: Apologies for the length, but I'd rather tackle everything...
Example, the endless mantra that Windows 8 penalized mouse and keyboard users. Windows 8 was better for mouse and keyboard users. Any serious user should have already been in the habit of launching programs by tapping the Windows key (which is permanently a centimetre away from your left hand when using the keyboard) and typing the first few letters of what you want. Want Control Panel? Win-key + 'con'. I can literally launch it in under a second. And this search-launch function works faster in Windows 8 than in 7. Additionally, it includes documents and settings in the search. And people claim that it's all designed around Touch? What I've just described is faster than reaching for a display.
Windows 8, and in particular the Microsoft applications that come with it, further devalue the keyboard and the mouse. Many keyboard shortcuts have been silently removed. Only minor ones such as Ctrl-C, Ctrl-X and Ctrl-V. These no longer work and a user is forced to right click to copy, cut or paste text. Admittedly this does favour the mouse, but it requires a user to change hands and switch concentration to a different input device.
The mouse pointer is not a substitute for a touch interface: it is not possible to "fling" elements around in the same way as one can with touch. The mouse scroll wheel does not scroll the UI elements that respond only to touch; instead you're left using a largely hidden scroll bar that's too small, or repeatedly clicking forward / back buttons (which are sometimes hidden until you guess the exact spot on screen).
A "normal" user does not operate a computer like a keyboard obsessed geek. A "normal" user tends to use a computer as a tool to perform the limited array of tasks they require. They don't give a monkey's what the name of the application they are looking for actually is (it is likely to also deal with their spelling mistakes), however they have learnt that the green "X" signifies Excel (or spreadsheets), the blue "W" signifies Word (or writing in some form) and the orange "O" signifies Outlook, but they'd prefer it was "E" for email. They also recognise "E" as "Internet", (another great mismatch). These normal users look for the visual clues that a visual user interface should be giving them - things like clear indications of their commonly used applications, what is a button and what is not and how to close an application or just shut down the computer.
And if for some reason you're too conditioned by older versions of Windows to adapt to using the keyboard and insist on launching something with the mouse.
The mouse was introduced to ease navigation in a graphical environment for average users. Yet you are already demonstrating crass stupidity in assuming that all users want to use the keyboard (searching for the letter to press each time) and would prefer this to clear functions being presented visually for them to choose from. Much like real life interactions: you look at a display of fruit on a stall and select from what is available. You'd be ****ed off if the display of fruit was hidden and you had to type, on an unrelated keyboard, the name of the fruit you'd like, without knowing what is available or sometimes even what the name might actually be. It's not a great example, but applicable enough: icon imagery is there to show a user what functions or options are available, and the mouse is there to ease the visual link between what you want and how to get it.
Well for those people who really struggle to adapt, mouse approach is also faster than in 7. In both cases [snip]more ranting about keyboard[/snip], you have to move the mouse to the lower left. This too is easier in Windows 8 because in Windows 7 you have to move it only so far and stop on the Start menu, so you must control your mouse movement. In 8 you just whip it to the lower left corner where it will stop by itself. Controlled movement is slower than uncontrolled movement and don't try to say that the extra few pixels travel offsets that because any honest person can try it right now (go ahead - see how fast you can move the mouse to the lower left corner compared to how fast you can move it to a small rectangle near but not at the lower left corner. And don't respond to this point until you've tried it).
So you are recommending that a user, as in a normal person, should somehow magically know that there is a hidden function in the bottom left hand corner of the screen, clicking on which will solve all of their problems? You are quite correct about the difference between constrained and unconstrained movement, but before attacking others and suggesting that they try it, you do realise that Windows 7 works exactly the same way? Yes, you can fling the mouse into the bottom left hand corner of the screen and click on the clear visual indicator. On the "constraint" of options being a very useful tool: a set of visual function indicators gives a neat, constrained list of options and is much better than a keyboard free-type search of "everything". I actually like the ability to search everything with ease, but this should be seen as a useful tool and extension, not a crutch for an unusable user interface.
I regularly use a lot of different programs - far more than most. I counted them and they come to 27. My Start Screen has space for around fifty on the desktop machine, and around thirty-five on my old laptop. You know what that means? No navigating up and down menus carefully like the Start Menu, which could pin only a finite number of things - I can't remember how many but it was less than the Start Screen on even my laptop. Again, it's faster and easier to whip the mouse to a large icon on the screen (and they're grouped by function too!) than it is to go to a menu option in the Start Menu, wait for the sub-menu to appear, move to the option you want, etc.
You have very succinctly proven your exact problem. You class yourself as a "power user". Guess what? 99.999% of Windows users are not power users. They don't run 27 different applications and they don't care about such things. For most users a PC is a glorified typewriter with the advantage that it comes with Internet and Email access.
But no, people clapped their hands over their ears and shouted "a UI designed for touch on a non-touch interface is stupid!" Never mind the facts, they'd found something to be angry about.
A UI designed for touch on a non-touch interface is not stupid... it's mind-bogglingly fuckwit stupid. I first developed touch interfaces over ten years ago and I can assure you, it is extremely difficult to produce a combined interface that works well with both. It introduces a lot of restrictions and in the end you either have a restricted interface (touch) or a much less restricted interface (mouse / keyboard). The chief difficulties are that a mouse is much more accurate, as the hit point is more precise and a user does not have their hand / finger in the way, and a mouse has two or three standard buttons allowing consistent selection / menu actions. Not that right click menus are exactly great from the UX point of view, however they are an established standard and a good tool for providing additional control at the point of use, compared to a user having to find the same control elsewhere in menus / buttons further away from the focus of interest. The point about them not being great for users is very pertinent when you realise that a great many users just don't know that right click menus exist...
The list of stupid objections was endless. The Start Screen would obscure what was on the page. Right - so you navigate the Start Menu without looking at it do you and without stopping from reading what you're reading in the main window? Of course you do...
I'll give you this point, but only on the very narrow aspect of looking at a modal interface (popup window) that either covers the screen or prevents you from using the rest of the screen until it disappears. The problem with the new start screen is that it a) is ugly as sin (subjective of course, and it does depend on the content), b) is a poor way to locate what you want as it's invariably full of junk, which leads on to c) it removes all of the useful features that have been built up in previous incarnations of Windows, such as Most Recently Used documents and applications, pinned items, sorted items, indicators for new items and so on. It's not that the old start menu was great, it's that the new one is a functional step backwards.
Or how about that opening a PDF would, by default, launch the Metro PDF reader, casting the poor confused user into the Hell of Metro land where they would flounder helplessly. I heard that one loads of times. So switch the default app for PDFs, I'd say. It's just a right-click on the file. But users won't know how to do that - they just want to read their PDF. Uh, you do know that Adobe Reader isn't part of Windows 7, right? That if the user just does it on Windows 7 it won't even open at all - it'll just ask them if they want to install something that will read it? Uh, well, they respond, some OEMs pre-installed Adobe Reader. Yeah, and they can do the same on Windows 8. Stop trying so hard to find things to struggle with.
Again you're thinking about a non-average user. An average user will just hate the mind-jarring switch to a deficient interface, and the equally jarring puzzle of how to get back to where they were previously. (This is mind mapping: allowing a user to visualise and refer to their position within an interface. It's a key user interface principle; break it and you will confuse users.)
@ Chairo - I was more thinking about the OS itself rather than the applications themselves. But it is down to the app developers as to what they support.
Finally, now that is a big improvement and hopefully something will carry through to the older iPads as well.
While a lot of people praise Apple for updating their own OS on their own devices much faster than the often non-existent updates from a Carrier applying their ghastly hacks on top of a Device Manufacturer's hacks on top of a common OS provided by another party... they do have a habit of stitching up the older devices. In marketing terms, it's the user's fault for not upgrading of course, and it is therefore a prompt (stick) to upgrade for the new features in a new device (carrot).
The problem is that with this tech you're more likely to be bored to death by the powerpoint "presentation" delivered by the shark mounted laser projector than any time delayed (important to allow the chance of escape) groin dissecting high power laser that's strangely visible as a line in the air...
Death by powerpoint is a most foul, cruel and evil way to kill people but it could take a long time as most people are either immune to this by now (natural selection in action) or have adequate survival techniques already developed.
My thoughts too. How can employing more people of a specific gender specifically save money?
Next: Campaigns bemoaning that fact that 100% of jobs are occupied by humans.
I'm very, very sure that they aren't. Not sure what species they are, but human they ain't.
How many brontosauruses is that?
Blah, blah blah... but what is that in Register Units? :)
I find IKEA is a worthwhile place to look around but only if you can stand the forced routing around their stores without going on a killing rampage (particularly after the piped music in their car parks), however it's often better to look and then buy elsewhere. This isn't always the case though as IKEA do genuinely produce some gems, but you have to be careful as they also push a lot of junk.
However I avoid flat pack furniture as much as reasonably possible, preferring my furniture to have rigidity other than being retrofitted with stiffeners, additional glue and nails, or attached to a wall. The last free standing bedroom furniture I bought was custom made to order by a local furniture maker - it cost only 20% more than the equivalent flat pack, was made to our specifications and is well built, so you don't feel that every time you put something in a drawer the front will fall off or the drawer base will collapse.
These things are a bit of an oddity...
Touch in Windows 8 still sucks even though much of the user interface in Windows 8 (8.1) has been murdered to be "touch friendly"; when configuring these devices you frequently have to head back to desktop applications to get some things properly configured. Such as networking... In essence, the entire Windows 8.1 shell feels and acts like yet another MS bodge job, massacred by clueless UX developers, further trounced by greedy, clueless marketing department drones and then rushed out half implemented.
It's not helped that, as noted by Steve Knox above, their specifications are just wrong for the price. For rather less money you can get a considerably more useful and powerful laptop. While this sounds daft, from experience most users want a keyboard with their Surface devices, which basically means they instantly turn into underpowered, overpriced laptops. The staggering inefficiency of the Windows OS and typical Windows applications really doesn't help either: the raw specifications and processing power do exceed competing tablets, but the final result just isn't the same.
As much as I hate the Windows 8 interface on a desktop PC, it does make sense in use on a touchscreen tablet as the interface conforms more to tablet expectations (which are rather different to desktop or server expectations). I'd like to be able to run windows, or at least some windows applications, in a tablet form factor and the business case of multiple users and easy access to files and documents makes a strong case for them compared to iPads (forget it) or Android tablets (much easier, but no corporate control). But again, even Microsoft's flagship application Office is awful on a tablet as the user interface has been massacred - except for the bits that they couldn't be bothered to update of course, these still popup with the same old desktop windows. The result is MS Office on a tablet feels like just another MS Office skin-refresh bodge job with restricted functionality and poor usability therefore users tend to use the desktop version and for that they require a keyboard and mouse. Touch does substitute for a mouse in a lot of situations but pair a keyboard with a tablet and what do you get? Yep, a laptop of sorts...
Azure recently overtook AWS to become the largest cloud Windows Server hoster...
Really? Sources please.
He keeps a box of unnecessary cable extensions, adapters and gadgets under his desk at home, another in the garage and three more at work, despite the fact that half the contents are racing towards obsolescence while the other half is so old and pointless that they could be sold as collectors’ items.
Ah, erm. I think I may, just possibly, on a tiniest smidgin of an off-chance... have a similar affliction.
(see icon). git.
I presume you are implying that your shed has better decor... and if it's a proper man-shed, heating as well.
Oh hell yes, now I remember the resistor colour codes... in black and white. And other gems such as sample circuits that somehow got mirrored in printing, or power regulating components not-so-carefully skipped (or was it a ploy to make you buy more?).
Yep, I see your point with the finite number of bitcoins: eventually they will start to be worth more and more, particularly as, over time, more go out of circulation through being lost or hoarded.
I didn't mean that money being created requires actual money to be printed or minted; it's that in order to have growth the extra money must come from somewhere. The daft modern fixation with perpetual economic growth is impossible without the new money that this growth "creates" coming from somewhere, otherwise there is in reality no growth: existing money is just being recirculated and, through attrition, gradually lost, in a very similar manner to the finite number of bitcoins. While the banks like to create money by loaning money that they don't actually have (yet), there is only one final result with this scheme and that is an economic collapse, one of which we have recently experienced when banks were lending each other the promise of money in a self-perpetuating circle and it caught up with them.
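As a back-of-the-envelope illustration of that attrition point, here's a tiny sketch. The loss rate is entirely made up for illustration; only the 21 million figure (Bitcoin's supply cap) is real. A fixed-supply currency with even a small annual loss rate shrinks noticeably over decades:

```python
# Illustrative sketch only, not economics: a fixed-supply currency with a
# small annual loss rate (coins lost or hoarded) shrinks the circulating
# pool, so each remaining unit must cover more of the same real economy.
def circulating_supply(initial: float, loss_rate: float, years: int) -> float:
    """Supply left after compounding annual attrition."""
    return initial * (1 - loss_rate) ** years

# Bitcoin's cap is 21 million coins; the 1%-per-year loss rate is a
# hypothetical figure chosen purely to show the shape of the curve.
supply = circulating_supply(21_000_000, 0.01, 50)
print(round(supply))  # roughly 12.7 million coins still circulating
```

Nothing here models demand, of course; it just shows why "finite plus attrition" pushes the value of each remaining unit upwards over time.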
When it comes to it, QE is nothing more than marketing speak for "create more money out of thin air". It's dressed up in various ways but this is all it is, there are no tangible assets behind it which just leaves promises to honour it.
As for the Bitcoin fans who keep banging on about inflation, Bitcoin has built-in deflation, which is worse. That certainly means it can never be a fully-fledged currency, as it makes proper banking pretty much impossible.
Bitcoins have no more "deflation", or perhaps more accurately "de-valuation", than any other currency; they probably have less, and are also more open about it. In order to maintain the financial illusion that an economy is growing, new money must be continually created - it used to be gold mines providing the scarce raw materials for the coinage, but we no longer operate on a gold based economy. Think about it... if new money was not continually created then after a while the money would start to pool in the possession of the "rich" and there would be a scarcity of it at the "poor" end, which doesn't really benefit anyone in the end as those with little money still need money to continue their "low value" transactions between each other. Taxation is one mechanism to force money back into circulation, through the taxing organisation taking the money and redistributing it by spending it - which, incidentally, is where the theory of central government spending its way out of a recession comes into play, as it stimulates the economy through the distribution of money to those who don't have so much of it. Importantly this tactic fails utterly when the money is distributed in such a way that its impact on the wider populace is very low... such as the money leaving the country (this is why some countries historically had a prohibition on the export of money).
A currency, virtual or not, only has value if people perceive it to have value. There are a lot of old currencies that no longer exist or have general value because of this lack of agreed value on the part of the receiver and the giver. You won't get far trying to pay for something in Italian Lira, East German Marks nor the Roman Denarius. Coinage of these defunct currencies has specialist value for collectors, but not a lot more with the exception of the value of the metals themselves.
A currency is nothing more than one step away from bartering. Rather than me having to trade onions for socks, I simply exchange onions for an agreeable number of tokens, and later exchange an agreeable number of these tokens for socks. The important part is that all parties agree that these tokens have a value; in this instance it's the number of onions or socks they are worth. The next most important point is that, while there is a level of trust implied, people will always game the system, therefore the tokens must be hard to reproduce, which is where scarcity comes in.
In some ways, bitcoins are closer to the origin of currency than the current money markets and banking system where "money" is moved around and between databases on the pretence that it actually exists despite the fact that "calling" all of the money will find an enormous, impossible to fulfill, deficit.
So this is where an attacker profiles a target website, going through it and recording the document requests and the number and size of the data responses being returned. In other words, something like "going to page X triggers Y separate requests of a particular size". The number and size of the resources requested are likely to differ between pages, and the pattern of page progression will also indicate where the user is on the site, since a user will typically follow a pattern of page visits because that is how the site is designed.
Clever enough stuff, but it does require that the site is already profiled, probably extensively and a few times... and no doubt regularly in case the site makes changes. This does limit the vector on this approach quite substantially.
The fix, of course, is either to make the page progression vary (pissing off users and making the website hard to use) or to normalise, or randomise, the number and size of requests for each page in a site-wide plan. If the website always produces, e.g., 25 requests for each page and they are a consistent size, then it'll be impossible to track the page progression.
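A minimal sketch of that uniform-requests fix, with made-up padding targets: every page is shaped to the same request count and the same response size, so all pages look identical on the wire:

```python
# Sketch of traffic shaping against size-based fingerprinting. The padding
# targets are illustrative; a real deployment would tune them to the site.
PAD_TO_REQUESTS = 25     # every page triggers exactly this many fetches
PAD_TO_BYTES = 16_384    # every response is padded up to this size

def shape_page(resources: list[bytes]) -> list[bytes]:
    """Pad each resource to a fixed size, then add dummy resources until
    the per-page request count is uniform."""
    padded = [r.ljust(PAD_TO_BYTES, b"\0") for r in resources]
    while len(padded) < PAD_TO_REQUESTS:
        padded.append(b"\0" * PAD_TO_BYTES)  # dummy response
    return padded

# Two very different pages now look identical to a size-counting observer:
home = shape_page([b"html..." * 100, b"css..." * 50])
article = shape_page([b"html..." * 900] + [b"img..." * 200] * 7)
assert len(home) == len(article) == PAD_TO_REQUESTS
assert all(len(r) == PAD_TO_BYTES for r in home + article)
```

The obvious trade-off is bandwidth: every page costs the worst-case page's traffic, which is why sites rarely bother.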
You wouldn't believe how many Internet projects I come across with exactly the same kind of blind arrogance as this. It's enough to give the entire industry a bad name... :)
Not really sure why you'd do this. I'd have guessed Windows expects multiple apps to write to different windows on the screen as they run. I could see the trouble starting when multiple apps on a remote server want in as well.
The pain with Windows GDI is one app with multiple concurrent threads of execution where it makes sense for them to update the interface independently. In theory it shouldn't be a problem, because Windows deals pretty well with multiple applications, with varying processor affinities, updating user interfaces simultaneously; however as soon as you try to put this all into one application the deficiencies in the GDI start to come through. It's not unexpected of course, as Windows was designed as a single user, single processor shell rather than anything more sophisticated, and multi-processor and multi-user support was bolted on later as a virtual afterthought.
In case you're wondering why GDI is/was being used: many of the newer Windows APIs are little more than translation or management layers on top of the underlying GDI layer, so not only do you suffer from the hidden GDI problems but you also have another layer of abstraction and inefficiency to deal with. The aim was to fix this in WPF; however WPF was practically unusable for a long time and brings its own problems to the game.
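The standard way around a drawing layer that hates concurrent callers (GDI being the classic case) is to funnel every UI update through one dedicated thread: workers never touch the "screen" directly, they post update closures to a queue that the UI thread drains. Here's a language-agnostic sketch of the pattern in Python; the names are illustrative and this is not a real GDI binding:

```python
# Single-UI-thread pattern: workers compute concurrently, but only one
# thread ever "draws", so the drawing layer never sees concurrent callers.
import queue
import threading

ui_updates: "queue.Queue" = queue.Queue()
rendered = []  # stand-in for the screen; only the UI thread writes to it

def worker(n: int) -> None:
    # Heavy work happens concurrently; the result is handed to the UI thread.
    result = n * n
    ui_updates.put(lambda: rendered.append(f"item {n} -> {result}"))

def ui_thread() -> None:
    # The one thread allowed to "draw"; drains updates until the sentinel.
    while (update := ui_updates.get()) is not None:
        update()

drainer = threading.Thread(target=ui_thread)
drainer.start()
workers = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for w in workers:
    w.start()
for w in workers:
    w.join()
ui_updates.put(None)  # sentinel: no more updates coming
drainer.join()
print(len(rendered))  # 4 -- all updates applied, none racing on the "screen"
```

This is essentially what Windows message loops, and later frameworks' dispatcher/invoke mechanisms, formalise: serialise UI access rather than lock around every call.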
I know I'm not exactly an "average" developer, but I was working on multi-CPU x86 code in 2002 (on Athlon MP CPUs if it matters).
It's not hard, or at least I didn't find it so, when you are aware of concurrency issues and know how to code parallel tasks and in particular what can be easily run or is appropriate for concurrent processing.
The hardest part was dealing with the utter ball ache that was (and still is in some ways) concurrent access to the Windows GDI, let alone the complete train wreck often involved in running anything ActiveX related concurrently.
The Intel C extensions for parallel code also make it a doddle but, again, you need to know what you are doing. IMHO the historical ghastly native support in Visual Studio for concurrency was a big problem.
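The "know what parallelises" point is most of the battle: once a unit of work is pure input-to-output with no shared mutable state, spreading it over cores is almost mechanical. A minimal Python sketch of that kind of embarrassingly parallel task (the workload itself is a made-up example):

```python
# Embarrassingly parallel work: each call is independent, so a process
# pool can spread the calls across cores with no locking at all.
from concurrent.futures import ProcessPoolExecutor

def busy(n: int) -> int:
    # A pure input->output unit of work with no shared state.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [100_000, 200_000, 300_000, 400_000]
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(busy, inputs))
    # Same answers as the serial version, just computed in parallel.
    assert results == [busy(n) for n in inputs]
```

The hard part, as the comment above says, is recognising which code fits this shape; code entangled with shared state (or with something like the GDI) doesn't.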
As for Intel vs ARM: yes, the x86 instruction set sucks balls compared to the ARM instruction set and this requires a lot more (very) clever optimisations from Intel, but even with this aside, it's just depressing how, in much Windows application code, 95% of the time nothing productive is being done with the CPU cycles.
I can't be bothered to look this up, but I seem to remember that this was a highly successful campaign of negative marketing?
I'd much prefer an upstream speed that doesn't suck balls, but this doesn't grab the marketing headlines in the same way.
Same old shit peddled by the gob-smackingly clueless to satisfy the spectacularly stupid (or lazy, in the case of parenting).
Very similar in level of cluelessness to the "facebook, twitter, and so on must ban nek-nominating" cries. Errr, yeah.
@ Adam Foxton
That sounds like an even better idea... however isn't it theoretically possible to retrieve the contents of RAM even when the power has been lost, or does this only apply to particular types of RAM?
Let's see it erase itself when it's unpowered...
How much electricity is required to wipe flash memory? Have a secondary battery just for that, it could even be built into the flash memory package itself and with multiple control routes to send the "wipe it now" signal it would be very hard to prevent without multiple precision drills hitting at precisely the same time.
But you can replace iPhone batteries and relatively easily. They are intentionally designed so the average consumer can't replace them, not for the buggers to be impossible to replace. Unless you get one where the assembler got a bit over enthusiastic with the glue of course...
Not all iPhone models are as easy as others when it comes to replacing the battery though.
Good. While neither Apple nor Google is directly responsible for the gross piss-taking going on with in-app purchases (Dungeon Keeper, anyone?), they are in control of the ecosystems that deliver these apps and therefore they are in a position to do something about it.
It's hardly a new product, these have been around for quite some time. What's extra is the bluetooth button which is a nice refinement.
While in theory they are good, in practice they suck balls because it's even harder to a) not shake the camera and b) point the camera in the right direction.
What usually makes "selfies" suck balls is that very few people know how to compose a selfie or to pose for it. Still, they are fun and there is nothing wrong with that.
Interesting to hear the process described.
Reuse is (usually) the best form of recycling.
Pah! When I were a lad we wrote code in machine code, none of this assembly business... that's for nancies.
Seriously, I did. I can still remember some of the numeric codes as well.
It's interesting how a nation can turn its citizens into unwanted pariahs abroad.
For quite a while now there appear to have been considerably more Canadians travelling around than (US) Americans. Although their passports tell a different story.
I have a very similar take with my S3: I can't actually see a compelling reason to replace it - it's easily a good enough phone for me and for what I use it for, e.g. calls, texts, mobile internet access and the odd game and note taking session - although I have a 2013 Nexus 7 for the heavier content editing and note taking sessions.
The S4 is a good phone but I see no reason to upgrade, the S5 looks more of the same. If, or more likely when, Samsung give up supplying updates I'll drop a custom ROM on it. I'm tempted to do that now due to the ghastly mess they've made of WiFi with the most recent update and losing the TouchWiz interface and Samsung shovel-ware apps really won't be a loss at all.
Ouch - I thought the sandboxing between applications on iOS was better than that. This sounds like it subverts the APIs that allow inter-app communication, although the way the article is written it could be that specific existing applications with elevated access are the problem.
You are personally anonymous to the government and NHS cronies through security-through-obscurity. However the buyer of your personal information (inevitably somehow linked to the same government and NHS cronies, odd that) will now know all they need to know about you. They will then link this new information to all the other personal information that you never gave permission to be shared.
And then people wonder why I always fill in random information on forms that don't need it, refuse to give my address to random shops for "catchment surveys" and other general flippances. So instead my bank sold, or lost, my personal details for me...
Sure I haven't seen it either. However seeing what it is meant to look like might help jog the grey cells...
Just a random google image search gives me this: http://www.energyroyd.org.uk/wp-content/uploads/2014/01/betterinfobettercare1.jpeg
...and nope, it doesn't ring any bells at all even with a pic of it.