Re: Do not touch the switch ...
Maybe they've upgraded the notice from a post-it note to one applied using tape?
The interesting thing about arsenic is that we need some, in the correct form, in our diet. The not so great thing about arsenic is that even a little too much is very unpleasant. While it has to be in the correct form, the same can be said for Carbon, Oxygen, Hydrogen, Iron and so on.
arsenic is the 20th most common element in the Earth’s crust and the 12th most common element in the human body
I'm unable to find a clear attribution for this statement; however, we do have arsenic in appreciable quantities in our bodies, which would indicate that it's required. I have seen peer-reviewed articles reporting this but can't find them now due to the expected "arsenic news" noise in Internet searches...
The pictures in the article clearly show an open reservoir. Adding human pee to that would hardly contaminate it compared to everything that is in it already.
Pharmacies will never be perfectly stocked all of the time; there are upwards of 3,500 commonly prescribed drugs/doses, and there's no way that your local pharmacy will have all of them to hand, definitely not the less common ones.
The concept of the electronic prescription service is that electronic prescriptions can either be sent to a specific pharmacy or the patient can be given a printed identifier which any pharmacy can use to retrieve a prescription. A pharmacy's dispensing and drug management system is required to support this functionality and the messaging is carried by the N3 network (direct or tunnelled), which for those that don't know is the largest private internet there is.
The advantage of sending a prescription to a pharmacy is that the pharmacy can use this to get the prescription ready for you to pick up, which reduces your potential wait time. It also allows them to smooth out the pharmacy assistants' picking process and to make better use of their dispensing pharmacist's time, because the pharmacist must double-check every prescription before it is handed over.
The electronic prescription is also designed to remove the step where a paper prescription must be confirmed by the pharmacy with the prescribing doctor prior to picking and handover. This is an obvious check to have in place, as a paper prescription is nothing more than an easily forgeable pre-printed slip, but the check does add quite a lot of burden to the process.
As I understand it, this is to dissuade pharmacies from cherry picking the supply of drugs, whether for stocking, financial or other considerations.
Why, when I think of Charles Koch, do I think of the tobacco industry? Smoking has been repeatedly proven to be very damaging to health, and it later leaked that the tobacco industry had known this even earlier. However, they squashed the research, justified their entire business to themselves and carried on as long as they could. Follow the money.
When it comes to climate change, so much damage has been done by those who are involved, including the green movement (nuclear power), governments (ignoring or adjusting reports as they feel fit and pandering to cronies) and I'd most definitely like to slap senseless the muppet (aka greedy economist) who came up with "carbon trading".
Climate modelling is ludicrously complicated, and as models are created new feedback loops and effects are discovered. It is impossible to model such a complicated system from the start taking into account every effect; you have to start a model somewhere. So when something is observed but not yet understood, it is still modelled, with the understanding that this can be improved upon later.
The only perfect model will be one that looks back as to what happened. By then it's too late of course and even that model would not be perfect for predicting future events, it'll be a start but that's it.
At present, Samsung's own apps and services are largely shunned by the hordes who buy its hardware.
Because they are gut wrenchingly appalling, interfere with anything standard and generally make the phone experience somewhat worse? Samsung make great phones, but are utterly clueless about software.
MrXavia's point about the purchases not being transferable from one device to another unless they're both Samsung devices is a good one - just hadn't thought about that before. If Samsung introduced a really good, unobtrusive and well managed app-store that could also be used on all Android installs (it could still include Samsung exclusives), then that's one thing. However with the heap of unremovable junk that Samsung shovel onto phones they aren't helping themselves.
The May release will be the first in more than a decade to not include any bulletins for Windows XP. The venerable OS was officially retired from support by Microsoft last month, though subsequent exploitation of flaws in the OS by miscreants has forced the company to issue an out-of-band update.
...and there I was thinking that the out-of-band update last month was to patch Internet Explorer, and the security-hell-hole that is ActiveX within it. That's not a flaw in XP as such.
Don't know why you've been downvoted, but any discussion of Apple tends to bring out weird responses...
In theory they should all be built as well as each other, regardless of manufacturer, as they should pass the same tests that Apple will have defined. Multiple competing manufacturers should also drive quality up overall - or cost down, at the same quality. Either way it shouldn't be a bad thing.
Spreading manufacture across providers is a wise decision - it's a high risk having a single manufacturer, particularly if Foxconn were to be hit by urgent priorities for other customers, potential ethical and legal issues over its workforce or anything else that could cause severe disruption at just the wrong moment.
As for the iPhone 6 and its rumours... I don't especially care for them, as it's likely to be a largely incremental update rather than anything spectacular. This is not an attack on Apple, just an observation as to where the market is.
This is, of course, how apps for modern operating systems are programmed: they do little or nothing until the user interacts with the UI, or data comes in from the network, say. But it’s not a common approach with embedded devices, such as Arduino.
Errrr... what is the writer of this smoking? Any vaguely sane (*) embedded developer spends their life with event-driven code. Depending on the embedded system, it will either have hardware support and masks for event triggers, or, for simpler systems, the developer has to implement everything in an execution loop - but it's still event driven. Or does the writer think that having events built into a high-level language or API makes an application event driven?
* I've seen some complete balls-ups from developers who plainly had no idea what they were doing at a low level (and often not at a high level either). It was usually quicker to rip everything up and start again than to attempt to step through and debug a spaghetti mess of interwoven interface and state-checking code. The basic flow usually never needed to be complicated, often something like this:
1) Initialisation code
2) Check something
3) Do something depending on what happened in 2.
4) Exit code set? No, go back to 2.
5) Cleanup code [often considered optional]
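The flow above can be sketched in a few lines of C. Everything here (poll_event, handle_event, the canned event script) is illustrative rather than from any real platform; an actual system would read hardware registers or interrupt flags in step 2:

```c
#include <stdbool.h>

/* Illustrative event codes - any real system defines its own. */
enum event { EV_NONE, EV_BUTTON, EV_TIMER, EV_QUIT };

static int presses;   /* state updated by the handler */

/* 2) Check something: here we replay a canned script instead of
 *    polling real hardware, so the sketch is self-contained.    */
static const enum event script[] = { EV_BUTTON, EV_TIMER, EV_BUTTON, EV_QUIT };
static unsigned cursor;

static enum event poll_event(void)
{
    return cursor < sizeof script / sizeof *script ? script[cursor++] : EV_NONE;
}

/* 3) Do something depending on what happened.  Returns false
 *    when the exit condition (step 4) is met.                   */
static bool handle_event(enum event ev)
{
    switch (ev) {
    case EV_BUTTON: presses++; break;
    case EV_TIMER:  /* e.g. kick a watchdog here */ break;
    case EV_QUIT:   return false;
    default:        break;
    }
    return true;
}

int run_loop(void)
{
    presses = 0;                        /* 1) initialisation code  */
    cursor  = 0;
    while (handle_event(poll_event()))  /* 2-4) the event loop     */
        ;
    /* 5) cleanup code would go here */
    return presses;
}
```

Still entirely event driven, with not a high-level language event API in sight.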
UK robins only live a couple of years (fast paced years admittedly) and while some migrate, others don't. Even the robins that do migrate might not migrate very far, many were recorded as migrating just a few hundred miles if that far. They're adaptable (omnivorous) little birds that don't especially need to migrate in the UK as our winters and summers are mild enough. The situation is more confusing because populations of robins swap places...
As a side note, am I the only one thinking the film title should be 'Star Wars Episode 6 1/2: The Smell of Fear'...
It should be "Spaceballs 2: The Search for More Money"
...perhaps followed by "Spaceballs 3: The Search for Spaceballs 2"
Please no, not Zac Efron or any other largely talentless Disney "child star or teen idol". And no ****ing time travel either - while I'll accept there's a chance an actor like Zac Efron might not ruin the film totally, adding time travel to it will.
I've been offered bribes of sex and cash in the past. Unfortunately it was a long time in the past and when I was working on the gates to a concert, nothing IT related.
Yes I'm that shallow.
I'm not. How tall is Anna Chapman? :)
"Effective or necessary" depends on your point of view. Certain certified environments mandate that all systems have antivirus software installed and that the images are regularly updated (I never did get a real answer as to what they meant by "regularly" though).
In some ways AV software is like a biological immune system: you get a shitty cold and gain protection only after the infection, but you are then protected from a repeat of the same virus - the difference is that your protection can be shared with others. One serious problem with AV virus detection is that it is retrospective. It takes time for a virus to be detected in the wild, as the "best" viruses avoid detection for a while, and then for an AV vendor to produce working detection rules and to check these in house against various permutations of the virus and against known "safe" software that shouldn't come up as a false positive. The AV vendors are always behind, and end users always suffer, as each detection rule added to AV software increases the number of checks that need to be made, necessarily slowing your system down to a crawl.
It's not a situation where the AV vendors can ever win; the only "fix" is to improve prevention, and this requires careful operating system design. When the most common PC operating system has roots in a system designed specifically for single-user, stand-alone use, with everything else cobbled on top in a frequently changing direction and "new" products and platforms regularly abandoned and left hanging, it's no wonder we're in the mess we are in. A more secure system is a more closed and controlled system, but how closed can it get before we start reacting to the loss of the freedom we enjoyed before?
The Heartbleed bug could be merrily implemented in any language that supports memory access: at its heart it was a logic error - trusting an attacker-supplied length field - rather than something only C permits.
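To make that concrete, here is a hedged C sketch of the same class of logic error. This is invented code, not OpenSSL's actual implementation - the names and buffer layout are mine - but it shows how trusting a peer-supplied length can leak adjacent data even while every access stays inside an allocated buffer, so a bounds-checked language would not necessarily object either:

```c
#include <string.h>

/* A toy heartbeat responder.  The received record sits at the start of
 * a larger arena that also holds other (secret) data, as happens with
 * pooled allocators.  Layout and names are purely illustrative.       */
enum { ARENA_SIZE = 64 };

/* Echoes 'claimed_len' bytes of "payload" into out.  The bug: claimed_len
 * comes from the peer and is never checked against actual_len.  Every
 * read stays inside the arena, so no bounds check fires - the flaw is
 * in the logic, not in any single memory access.                      */
size_t heartbeat_reply(const char arena[ARENA_SIZE], size_t actual_len,
                       size_t claimed_len, char *out)
{
    (void)actual_len;                  /* the missing validation       */
    if (claimed_len > ARENA_SIZE)
        claimed_len = ARENA_SIZE;      /* keeps us inside the arena    */
    memcpy(out, arena, claimed_len);   /* leaks bytes past the payload */
    return claimed_len;
}
```

The fix, of course, is one line: reject any claimed_len greater than actual_len.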
Modula-2 might be ok but it was ruined by the inane insistence of the designer that it was going to be a single-pass compilation process. In reality this just doesn't work and you either wind up with horrible kludges to the code or progressively more unwieldy development environments.
I'm becoming increasingly convinced that there is simply *no excuse* for writing stuff in C (and C++) any more. There's just better ways to do it these days.
No one language is so superior to all the others that it is usable at every level, from device driver all the way up to user script. As a general rule, the closer you get to the hardware, the lower the level of language that is appropriate. Efficiency really matters at the lower levels: wasting thousands of CPU cycles on boilerplate and support code is almost acceptable at the application level, but it most definitely is not for an API call that could be made hundreds of thousands of times a second. As with everything, there are trade-offs balancing code security with efficiency.
The GOTO statement still has some relevance, but in higher-level languages it should generally be avoided; an algorithm can usually be written in a more structured, clearer manner where a GOTO is no longer required.
I would much prefer to see a GOTO statement than a "BREAK <n>" statement where you have to work through the layers of conditionals and loops to work out how many levels are actually being skipped out of in the parameterised version of the BREAK statement. "COMEFROM" would be clearer :)
Lower level, of course, you will see the exact functionality of GOTO everywhere because it is a fundamental control structure - JUMP and (conditional) BRANCH operators are key to assembly language processes. It's just that with progress we've abstracted their use away to reduce the number of problems they cause.
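For a concrete case, a nested-loop search in C is the classic spot where a goto to a well-named label reads more clearly than counting levels for a parameterised break would. This sketch is mine, not from any particular codebase:

```c
#include <stddef.h>

/* Searching a 2-D table stored row-major in a flat array.  With C's
 * single-level break you'd need a flag variable, and a hypothetical
 * "break 2" would force the reader to count loop levels; a goto to a
 * named label states the intent directly.                            */
int find_in_grid(const int *grid, size_t rows, size_t cols, int wanted,
                 size_t *r_out, size_t *c_out)
{
    for (size_t r = 0; r < rows; r++)
        for (size_t c = 0; c < cols; c++)
            if (grid[r * cols + c] == wanted) {
                *r_out = r;
                *c_out = c;
                goto found;        /* escape both loops at once */
            }
    return 0;                      /* not found */
found:
    return 1;
}
```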
The style definitely doesn't help - and I'm certainly not a "friend" to many of the code formatting styles out there which encourage poorly indented and defined conditional blocks.
It's an absolutely appalling bug to be in place because:
1) An automatic code formatter applied to the code would have shown the problem with ease in a visual review.
2) The compiler would have produced a warning that the code block following it is never executed. Modern compilers are helpful like that. Then utter fuckwit developers either turn off the warnings or ignore them as there are so many. Hint for the clueless: the warnings are there for a reason, deal with them.
3) Testing should have revealed this bug very quickly, as the function would not have behaved as expected. To be fair, what probably happened was that the code was tested, and then the developer hit Ctrl-D (a duplicate-line shortcut in many IDEs) while the cursor happened to be on the badly formatted line, probably while reaching for Ctrl-S to save the current source file. Even so, a commit of the source and the subsequent diff should have revealed this error straight away, unless it was introduced as part of a larger block of changes, in which case the unit tests should have been re-run for all cases and the fault identified.
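A cut-down, hypothetical C sketch of a duplicated-line bug of this kind (invented checks, not the original code) shows how one repeated goto silently makes the following check unreachable and turns a failing verification into unconditional "success":

```c
/* Each check_* argument is 0 on success, non-zero on failure, standing
 * in for real verification steps.  The misleadingly indented duplicate
 * goto is exactly what an auto-formatter or an unreachable-code warning
 * would flag instantly.                                               */
int verify_buggy(int check_a, int check_b, int check_c)
{
    int err;
    if ((err = check_a) != 0)
        goto fail;
    if ((err = check_b) != 0)
        goto fail;
        goto fail;                /* the accidentally duplicated line  */
    if ((err = check_c) != 0)     /* unreachable: never evaluated      */
        goto fail;
    err = 0;
fail:
    return err;                   /* 0 here means "verified OK"        */
}
```

Once the first two checks pass, err is 0 and the stray goto returns it, so the function reports success no matter what the third check would have said.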
A brain is a massively parallel pattern matching system. It also has a relatively ingenious lossy-compression memory that manages to overlay many memories on each other while somehow keeping them intact enough to remain separate.
Any individual, small part of the brain can be implemented faster in silicon, however the ability to form and reform a wide mesh of analog connections between many neurons (quite flexible processing "cores") is something that is very costly to implement.
Repetitive, exacting processes are ideal for procedural computers, while other processes such as pattern matching, approximation and detail substitution are much more suited to neural networks. While one can substitute for the other in most circumstances, it is far less efficient.
You need to get a copy of Settlers 10th anniversary.
Oooh... thanks for that. I think I may need to purchase that straight away!
Settlers... with a cardboard divider splitting the screen in two so we couldn't so easily see what the other one was doing. Unfortunately Settlers got worse and worse for two-player with every new version and lost all the charm of the first. The latest is completely ruined by Ubisoft's moronic insistence on everything being about meaningless "micro-purchases", a ratings ladder and very limited (if pretty) maps - they don't even permit a ****ing save-game feature in two-player because it might be mis-used in the ratings ladder. Guess what: we don't give a flying rat's about the rating system, we just want to play the game. And without Ubisoft's DRM and other intrusive nonsense getting in the way as well.
Sensible Soccer (tournaments) - we took teams, played against each other, drank beer. Some days just never got better.
I'd much rather that "touchwiz" was an optional, uninstallable skin that could be installed (vendor locked) or uninstalled as desired. Same for the other manufacturers' launchers as well. If I never see another bit of "carrier" content again I'll be happy too; I remember too many phones utterly ruined by the total trash that the networks put onto devices while simultaneously removing anything useful that competes.
On a side note, if The Demon Spawn of Redmond, AKA M$ or Windoze (or whatever tired and unimaginative insults you can come up with) had announced plans to do this to their OEMs (or if they were to do so in future), what do you imagine would have been/will be posted on a thread like this one....hmm?
To be honest, this Google plan doesn't sound very different to how MS currently operate with Windows Phone, so they'd be hard pushed to make such an announcement.
While it is an extension of the Nexus devices, which in my opinion appear to be there to keep the other manufacturers on their toes, I'm not sure if this is going too far.
Maybe I'm missing something, but other than the headline where did it mention ARM? The Basic Qualifications section of the role lists "In-depth experience in optimizing workloads for high-performance x86 architecture" with no mention of ARM anywhere.
Intel are also working with integrated or custom dies, and while it's a rather different licensing model to ARM's, the basic principle is similar.
Headline writers... grrr... it's like they're attempting to catch our attention or something :)
Why so many downvotes for somebody simply for saying that they like a product? Bizarre.
Because this topic is fan-troll bait, including as it does both Windows Phone and Android, so even rational posts are going to get lots of spurious downvotes but few counter-arguments.
The only comment I'd have about the above poster's comparison is that IMHO he's not comparing like-for-like devices, but given the number of devices and combinations of features it's hard to really compare them objectively.
I think we need a "I think this would be better demonstrated using playmobil figures" icon... :)
Yes, video playback is a relatively low-CPU-power task - the processor has to do little more than orchestrate the passing of data to the dedicated video decode hardware, which is genuinely efficient. Hence low power, as in a low-power CPU can perform the task.
The display will take more (electrical) power to display the video...
It also aids productivity because it ensures (*) that you concentrate on one thing at a time rather than continually flit like a geriatric lunatic between different tabs and downloads.
* as in, it could only do one thing at a time itself, therefore that is how you had to operate. No downloading in the background, no seeing the page until it was loaded, no tabs (don't remember an "open in new window feature" either)... and no .png support, no scripting... errr... I'll just load up lynx thanks. Did it even support marquee and flashing text?
I think I like the term "earslab" better though.
"....what about C#?...." The problem for C# for many of the Penguinistas is that they see it as firstly a Microsoft product, and it is, in their eyes, therefore too tainted for them to consider, despite it now being an ISO standard CLI language. They even get huffy over the FOSS Mono version, calling it an MS Trojan horse.
C# is a Microsoft product - while it is labelled as an "ISO standard CLI language", we all know how Microsoft rigged the standard for Office documents.
Once the Microsoft dependencies (libraries) are stripped out, there is unfortunately not a lot left to C#. While the same could be said for other languages, at least many of the others have working alternatives for the functional libraries. Without those dependencies there is not a lot of real incentive to use C# over C++, as there are relatively few compatible libraries and pools of example code, although more have been released recently. AIUI it's also quite a bit slower than C++ for many tasks due to the additional baggage that comes with managed code - in theory it is safer though.
It now has 540 million such profiles, of which around 300 million people are said to be active in the Google+ "stream".
Is this "around 300m people" those who haven't yet figured out how to disable, or just haven't yet disabled, the g+ "integration" options on everything Google?
Unless there's a head crash, inserting old Amiga floppy disks into an old Amiga disk drive shouldn't damage them. There is a chance that if the data is magnetically "faded" (not sure what the correct term for this is) then it could be flipped by the read head but in this case the data is probably knackered anyway. Still, the caution that they exhibited wasn't entirely unwarranted given the potential value of what may be on them.
Amiga disks didn't operate at a variable speed; that was a feature of the Macintosh systems. The actual physical drive components used by the Amiga 3 1/2" DD floppy disk drives and PC 3 1/2" DD drives were the same; it was the interfaces that were different.

The biggest problem was that PC operating systems were designed such that supporting formats other than their own was very difficult. AmigaOS, on the other hand, had a very flexible disk operating system and supported different formats with relative ease. Most problems with this support came down to supporting the primitive file systems and their inefficient use of disk space - e.g. 8.3 uppercase file names compared to case-capable but case-insensitive full-length file names, and 720K capacity compared to 880K.

While annoying, it is easy enough to copy content from an Amiga to a PC using a DD floppy disk, although if you want to preserve file names then it's a good idea to compress the content into an archive file of some form - lha and lzh are supported by many PC archive applications. Other transfer alternatives are null modem cables and the huge number of transfer suites that are, or were, available for this, and even IPv4 networking if you have the patience to get it working. One of the most useful tools I remember was software that mounted an FTP site as just another drive on the Amiga - this allowed you to relatively painlessly copy files to and from an FTP server using whatever application you wanted.
Converting data from the majority of IFF files, which encompassed ILBM and a lot of other formats, is not a particularly troublesome task given even basic coding skills. Again there are a few tools still going that help with this.
There is also the issue that what is traditionally referred to as "junk" DNA is in reality not junk at all, and is critical. As a result, comparing a "few" marker genes is in no way a complete comparison of species - it's a starting point though. The actions controlled by this "junk" are very interlinked and resilient, and there are clearly documented cases where different arrangements of this "junk" trigger the same end result.
Whatever it is for (most likely AppleTV of some form), a speech control interface is welcome. Siri may have its faults, but it's a step in the right direction.
Would save so many problems with losing the remote controls all the time...
Come friendly asteroids, land on Milton Keynes?
Just doesn't quite have the same ring to it as bombing Slough. Although I'd argue for Slough, Milton Keynes, Luton and a good few other places as well.
I am far from a luddite (maybe rather closer to a closet tech-geek), but why do the damn interfaces on these things have to be so awful?
It is much nicer to use a push-on/off rotating dimmer switch than dual-function up/down buttons. I hate button re-use with a passion; it makes for some of the worst interfaces. It's not as if switches have to mechanically control the circuit any more, so a digital rotating control, perhaps with a mechanical stop, plus a push on/off button is not hard. And get rid of the bloody LEDs. I have too many of these things glowing away for no readily useful reason, and while a nice subtle LED lighting a switch in a dark room isn't an entirely bad thing, a "burn your eyes out it's so bright" blue LED is what tends to get fitted these days.
And as for the remotes... the cheapest, nastiest, OEM remotes with... wait for it... dual function barely explicably captioned (icon'd) buttons. Gits.
"Free markets have made you, and billions of people all over the world considerably richer."
So has (currency) inflation. Millio, millio, millio...
Faster and faster HFT systems are all very well, but what do they interact with? An external system?
If so, what is the speed of those systems? In any correct system you should have transactions, with negotiation ("promise") stages leading to completion; concurrency/queuing issues with these are far more likely than problems in a trader's own systems. As these systems are dealing with trading finite resources ("resource" can be quite an abstract term, but even shares are finite), there should be a register of who owns what, to ensure that duplication - and therefore fraud - is not committed where multiple parties claim ownership of a resource, or more of a resource is traded than in reality exists.
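As a toy sketch of the kind of register described (names and layout invented, nothing to do with any real trading system), a transfer that checks the seller's balance before moving units makes double-selling impossible by construction:

```c
/* A minimal ownership register: transfers only succeed if the seller
 * actually holds enough of the resource, so the same units can never
 * be promised twice.  Owner ids are assumed valid array indices.     */
enum { MAX_OWNERS = 8 };

typedef struct {
    long holdings[MAX_OWNERS];   /* units held by each owner id */
} registry;

/* Returns 1 and moves the units if the seller's balance covers the
 * trade; returns 0 and rejects it otherwise.                         */
int transfer(registry *reg, int seller, int buyer, long units)
{
    if (units <= 0 || reg->holdings[seller] < units)
        return 0;                /* would create phantom units        */
    reg->holdings[seller] -= units;
    reg->holdings[buyer]  += units;
    return 1;
}
```

The total across all holdings never changes, which is exactly the invariant a real register needs to enforce.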
Just some thoughts that come up from these kind of systems...
Considering how much clothing many builders wear in the summer, she's quite well covered up really.
errr... yeah... I had a train of thought at some point but seem to have misplaced it. For some reason.
And then I noticed that in the (photoshopped) "asus beach babe" image the girl couldn't possibly be using the device due to the angle of the screen. Do I have my priorities right? :)
If the professionals can fit sensors upside down and confuse metric with imperial measurements, I'm sure a missed blown fuse is quite forgivable :)
That's intriguing. It would seem to me that it implies that the FP64 processing is implemented using the multiple steps of the FP32 circuitry (splitting and then re-merging the values?) rather than native FP64 circuitry.
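If so, one common building block for emulating higher precision out of lower-precision units is the classic Knuth/Dekker "two-sum", which captures the rounding error of an addition exactly so a value can be carried as an unevaluated hi + lo pair. This is a sketch of the general technique in C, not a claim about how any particular GPU actually implements FP64:

```c
/* "Double-single" style arithmetic: one higher-precision value is held
 * as the unevaluated sum of two floats (hi + lo).  two_sum is the
 * error-free addition primitive such schemes are built from.          */
typedef struct { float hi, lo; } float2x;

/* Exact addition of two floats: hi holds the rounded sum and lo the
 * rounding error, so hi + lo equals a + b exactly.                    */
float2x two_sum(float a, float b)
{
    float2x r;
    r.hi = a + b;
    float bv = r.hi - a;                  /* the part of b that made it in */
    r.lo = (a - (r.hi - bv)) + (b - bv);  /* what rounding threw away      */
    return r;
}
```

Chaining primitives like this gives extra precision at the cost of several FP32 operations per FP64-style operation, which would fit the performance ratios hinted at.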
Ah yes, the "in-memory database" that's effectively crippled due to lack of support for many standard and commonly used SQL operators.
However I'm sure that if you had specific data requirements that you need to run at an acceptable speed, you could redesign your database, separate the data that you need fast access for and then work around the dependencies. In general it may be useful for specific new cases, useless for speeding up existing databases.
Oh hell, yes. I forgot the bullshit of acronyms everywhere... with DS, DD, OH, LP, DH and everything else that just makes it all as cliquey and incomprehensible as possible.
Some of the info on Mumsnet is actually useful - children are different, and finding out what other parents' solutions, or attempts at solutions, are can be invaluable.
Unfortunately it's hard finding the useful information under the heap of junk posted by the batshit insane.
At full arms' length, those pixels are probably going to be far too small to make a difference
Full arm's length? When was the last time you saw anybody hold a hand held device such as a mobile or tablet at full arm's length? Apart from the logistics of doing this in public, you'll soon realise just how heavy these devices are, and how heavy your arms are, when you try to hold something at full arm's length for any amount of time.
My desktop monitor is only just a full arm's length away from me... strangely this makes it less than ideal for a touchscreen interface.