Only "sort-of" dimmable
Better than any of the other 2-wire "dimmable" LED lamps I know of though, most of which only go down to 20%
I would still like the last 11% though.
Sunshine also has the most unbelievably stupid spacecraft crew ever to screw up on screen.
I still choke on the fact that they really expected us to swallow the idea of flooding a compartment with oxygen to blow out a fire, when they could just vent it to space...
You know, try a technique that might actually work instead of one that would definitely kill everyone on board and probably destroy the craft as well.
The worst part is that the writers could have got exactly the same "oh hell" result of the aftermath by using an actual firefighting technique, instead of something that made no sense whatsoever and must have sent the scientific advisors screaming for the hills and demanding to be removed from the credits...
Nope, this isn't "simply upgrading".
To do a comparison with another well-known portable computing supplier, he's doing the equivalent of getting rid of an iPad and buying a (tablet shaped) Macbook Pro. That's not an upgrade, it's a different class of device.
Windows RT isn't Windows 8. Superficially, they look very similar - but when it comes to what you can actually do with them, they are completely different.
To me this tweet really says "Microsoft exec confused by the difference between a Surface RT and a Surface Pro" as he's bought the wrong one.
No matter which Linux you pick, in ten years time you will definitely still be able to legally deploy that specific version.
That's the issue - right now companies are only just starting to use Win7 embedded in released products.
It takes a year or two to build and test the new system on a new underlying OS, so if you lose the ability to licence it only five years after it first became available, you might only get three years of shipping units before you have to switch to a new underlying OS.
Which takes you two years, so you end up continually re-writing just the OS layers, with only one year in the middle for actual new features etc.
Not sustainable, thus a short lifetime of Embedded licensing is a sure-fire way to kill all sales of it, forever.
As other posters have pointed out, several Embedded systems are already on Linux - and with modern toolkits like Qt, transitioning between WinXP and Linux is much easier than it used to be. It's not "tick the box", and probably won't ever be - but much easier than before.
It's also rather nice how easy a Linux is to lock down - after all, an always full screen application doesn't actually need a window manager...
Linux actually has the low-power embedded market almost completely sewn up - check what your smart TV or STB (router etc) runs!
In earlier Embedded it was normal to kill the shell entirely and replace it with your own, so merely suspending bits of the Win8 shell seems an odd way of describing it.
In almost every use of Win Embedded, the whole point is to kill every aspect of a "normal" Windows UI and replace it with the specialised UI for the particular use.
- If you can still see that it's Windows after the boot screen, you probably didn't do it right.
That said, manufacturers are only now moving to Win 7 Embedded, so usage of Win 8 is a few years away.
Unless they pull the rug out from under us by stopping licensing of new Win7 Embedded machines, in which case it'll scare everybody onto Linux.
One question to think about:
How many applications do you have installed?
Under all previous versions of Windows, the "launcher" had folders to let you organise your installed programs into groupings, and to let the installer put their links together.
That's gone in Windows 8.
If you have more programs than fits a screen, you have to flick through several pages.
If the installer used to put its links into a given folder (eg all from the same supplier), it won't anymore, you'll have to drag them around yourself.
Even iOS has folders on its "launcher" - and most people have far fewer things installed on their phone than on their PC.
The Win8 Start Screen just doesn't scale. If you've only got one screenful of applications, it's ok. As you add applications, it becomes more and more unwieldy.
Personally, I have well over a hundred applications installed - most are "rarely used", but I still need them and I still want them grouped nicely so I can find them quickly.
- As I use them rarely, I may not remember enough about the name to use Start->type, or even recognise the icon, but finding "widget drawing" in the "drawing" folder is easy.
Doing the same in Win8 Start Screen isn't possible. I must recognise the icon or know its name, or it is almost impossible to find.
- One example of a useful UI improvement that was clearly based on proper research is "Pinning" in Win7 - those few applications I use every day end up pinned, while the more rarely-used stay in folders in the Start menu.
There are some things in desktop Win 8 that are quite nice, including the new task manager.
Yes, there are a lot of nice things. It boots faster, has many improvements "under the bonnet" and has quite a few useful tweaks to built-in utilities.
Then they ruined it by ripping out some useful features and cramming a tablet UI on the front.
It's like bringing out a new, faster Ferrari but insisting you cannot buy it in red, and steering now uses a lever.
Obviously you could take it to a paint shop and bolt on a steering wheel, but why should you need to?
How can that be abuse of a monopoly?
The only reason Win8's TIFKAM even exists is because they are a monopoly - consumers are essentially forced to buy it if they want a PC at all. (As the OEMs are pushed away from Win7)
If they had a choice, almost nobody would buy Windows 8 - you can see approximately how many would choose it by looking at sales of Surface and Surface Pro.
I'm still pissed off that they neutered the task bar - so much of the cool stuff it did has gone :(
As a few earlier posts have pointed out, most of the evacuated area is less radioactive than Cornwall.
Unfortunately there is an outright panic attack whenever radioactivity is mentioned, regardless of the actual level.
Going through US airport security (excluding the flight) probably used to give you a bigger dose, thanks to the X-ray backscatter machines, than a day's visit to "Area 2"
- Hard to be sure as the data on those was never published and probably wasn't ever measured for the staff and the queue.
Nope, you want to know the ground-level information, not a few hundred metres up.
I really hope they did include measuring the radiation - in fact, it would also be very interesting to do that everywhere.
I'd love it if radiation levels around the world became well-known, in the medium term it would remove the hysteria and replace it with the simple respect radiation deserves.
We'd all be safer for that.
Yes, but you see, the ASA's dictionary has a few missing pages and they can't afford a new one.
Perhaps we could help them out?
"Hey, let's spend billions buying underpants!"
"Why? Will it help?"
"Just buy billions of underpants and then get back to me."
Power lost = Current * Current * Resistance
Power supplied = Current * Voltage
If you don't understand why these two equations mean higher voltages are needed, you won't understand why the rest of that post wouldn't work either.
Go do some research - it's very interesting.
However, if you don't want to learn any physics or engineering, don't proffer opinions on them because you'll just look foolish.
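A quick sketch of why those two equations push grids to high voltages, using assumed round numbers (1 ohm of line resistance, 10 MW delivered):

```python
# For a fixed power delivered, P = I * V means the current falls as the
# voltage rises - and since the line loss is I^2 * R, the loss falls with
# the *square* of the voltage. The figures below are illustrative only.
def line_loss(power_w, voltage_v, resistance_ohm):
    current = power_w / voltage_v          # P = I * V  =>  I = P / V
    return current ** 2 * resistance_ohm   # P_lost = I^2 * R

low = line_loss(10e6, 11_000, 1.0)    # 11 kV distribution voltage
high = line_loss(10e6, 400_000, 1.0)  # 400 kV transmission voltage

print(f"loss at  11 kV: {low / 1e3:.0f} kW")
print(f"loss at 400 kV: {high / 1e3:.3f} kW")
```

Same power, same wire: roughly a thousand times less energy wasted at the higher voltage.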
You earn money on solar by stealing it from the poor.
You get between four and five times the going rate for your electricity generation, and the Grid is forced to buy whatever you generate, regardless of whether the Grid actually needs it or not, or even whether it gets any of it in the first place.
- Yes, this is the same as Tesco paying you £5 for each cucumber you grow and eat yourself.
It's almost certainly the most regressive taxation to ever come out of the UK's central Government - it drives the poor into fuel poverty while handing money to rich landowners.
Yet it was a Labour idea. So much for their "core values".
Isn't that 1-3% based on simple maths?
Area of surface we are likely to build wind turbines on, multiplied by the watts per square metre an array can generate, compared with giving everybody a reasonable amount of electricity?
Sounds reasonable to me, given the paper's content.
Of course, you can argue his area is too small, or that 2/3 of EU energy consumption for everyone is too large, but I rather doubt it's out by an order of magnitude.
So even accepting that, the upper bound on global wind power would only be 10% or so.
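That back-of-envelope estimate can be reproduced with assumed round numbers (none of these figures are from the paper itself; they are only plausible orders of magnitude):

```python
# Back-of-envelope wind-power bound. Every figure here is an assumption
# for illustration: usable land area, achievable farm output per square
# metre, and a "reasonable" continuous demand per person.
usable_area_m2 = 1.5e12     # roughly 1% of Earth's land surface (assumed)
farm_density_w_m2 = 1.0     # ~1 W/m^2 averaged over a whole wind farm (assumed)
population = 7e9
demand_per_head_w = 2000    # ~2 kW continuous per person (assumed)

supply_w = usable_area_m2 * farm_density_w_m2
demand_w = population * demand_per_head_w
print(f"wind covers roughly {supply_w / demand_w:.0%} of demand")
```

With those inputs the answer lands around 10%, so the conclusion isn't sensitive to arguing over a factor of two either way.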
The huddled masses of the world's really poor aren't going to stay huddled forever - and the sooner they get to the "modern" living standards the better for everyone, because that's the only way the human birth rate will reduce to a steady state. You'd better hope that happens before peak energy and peak food, because human wave attacks are very messy.
Just because leaping off a cliff is madness, doesn't make climbing the cliff sane.
Burning oil is foolish - if nothing else, it's far more useful as a raw material than a fuel.
However, building wind turbines is also foolish.
It's telling that you only feel able to post your ill-informed bigotry completely anonymously.
It would appear you don't even believe your post enough to associate your long-term (semi-anon) handle with it, which doesn't sound like someone with a genuine religious conviction.
You don't believe, you just hate those different to you, and last time I checked, Jesus didn't say to hate your neighbours.
By your definitions, childless couples and disabled husbands are just as bad as gay marriage, as neither provide the benefits you claim marriage is for.
Do you agree with that statement? Would you deny a paraplegic fiancé the right to marry?
Marriage is a public declaration that two people love each other, intend to devote the rest of their lives to each other and remain faithful to each other, regardless of changing circumstances.
That is valid for any two people.
AC@12:41: You could read the next paragraph: "[America] is the only place I've seen "swipe'n'nothing" credit card payment, anyway."
Also, this statement is just plain wrong:
"Contactless payments are authorised, the card provides the authorisation"
That's not authorisation, because it's taking money without asking the account holder anything.
The account holder is the only entity who can authorise money going out of an account. If the account holder didn't authorise a transaction, that transaction is unauthorised by definition.
By (EU) law, you must be refunded if the bank permitted a transaction without that authorisation.
So you're genuinely happy that anybody at all can make multiple transactions, each up to %VALUE% (£50?) once they've nicked your card? (Or even without stealing it, instead remotely using the RFID to determine the card number and doing a few CNP transactions until the anti-fraud trips in and blocks it.)
Leaving you with the fun and games of getting the money back, perhaps bank charges (and even court summons) due to going overdrawn or having cheques, direct debits or standing orders etc refused?
For most people it wouldn't take many £50 transactions to do that - just one may be enough.
That sounds like a dangerously foolish idea to me.
You must be American to like the idea of being able to pay for smallish transactions direct from an account with no authorisation or security process at all.
That's the only place I've seen "swipe'n'nothing" credit card payment, anyway.
The card is not the account holder!
I used to think the SciFi stories were silly where people had credit chips that almost anybody (usually the villain or hero) could easily use to take money from random civilians, but now it looks like the banks really do intend to go there.
The tax man will be very interested in any companies complaining about this behaviour, because it's a clear indication that the person is really an employee.
- So the employer has to pay NI, holiday pay, pension etc.
Not sure about other countries, but in the UK, if you can't subcontract your contract then you're usually considered an employee, and not a self-employed (freelance) contractor.
Maybe you've not heard the term before.
"Dogfooding" means "using your own products internally".
It is almost universally a good thing, as it saves the supplier money and helps find subtle bugs.
Aside from that, would you trust a supplier who doesn't trust their own products enough to rely on them for their own business?
It comes from "Eat your own dogfood".
wxWidgets is a widget toolkit - it's only the GUI.
Qt on the other hand is a complete cross-platform SDK, however its previous owners spent years developing it for a particular mobile phone OS, then set themselves on fire, slit their throats and threw Qt away.
So Qt is back to being a desktop SDK, now with a host of new, shiny, but unusable mobile OS extensions.
Which is a terrible shame; I call it "being Elopped".
The Android (Necessitas) and iOS ports will fix this, but not yet.
You will not find "connect me to a database" widgets in either, you do have to do some of the work yourself.
If you are going to compare Apples, at least pick oranges instead of monkeys.
Armstrong took manual control of the last section of the computer landing in the LEM and drifted it along to find a flatter LZ - because it missed the target due to wrong data, and it wasn't possible for a human to see that until it had nearly landed.
With a hand over the "hard abort" button that would put the computer back in control and throw them back to the CM. That's still the closest to actual off-world piloting ever done. Perhaps the same will be done for a manned Mars lander, but I doubt it.
Apollo 13 did one or two 'manual' burns on a "we'll correct it later when the computer is running again" basis.
However, the burn was still pre-calculated by the boffins on Earth, the crew's job was to keep the craft in the same orientation during the burn.
Not to time the burn, not to work out how long to burn for or which direction to do it, and not even to know that it needed doing at all.
Finally, Gemini etc crews didn't do the rendezvous, just the final docking. Computers got them within a few hundred metres and at near-zero relative velocity, humans only handled the final touch-and-grab.
- Compare taking a ship across the Atlantic to New York with going the last 100m to the quayside.
Humans are also rather poor at it, demonstrated by how much practice was needed to get a small number of extremely experienced and highly trained individuals to be able to do it at all. (And how dented a lot of ships are! The big ones auto-dock now.)
Today it's mostly not done - grab the thing with an arm and drag it into place.
The big advantage a human-crewed mission has is being able to repair stuff. If the computer breaks, a human can turn it off, replace bad components and reload the software. A computer can't do that for itself.
Minor correction - human crewed, not piloted.
Computers have flown all spacecraft with the exception of Apollo 11's LEM*, Apollo 13* and the shuttle during the landing.
Launch and in-space manoeuvring is something humans just can't do - the timings and precision needed are too tight, and we simply can't do the observations either.
Any spaceflight is quite simply pre-calculate and let the computer burn the engines.
Humans can do the last bit of landing, but balancing on a tongue of flame is best left to a computer.
* 'cos it broke.
Sanity Soapbox, your argument simply does not exist at all. It is a non-argument, it not only has ceased to be but never was. It's a statement of an empty belief with no thought behind it whatsoever.
For the hard-of-thinking, here's a brief summary of evolution:
A random change occurs to the offspring of a lifeform compared to its parents.
That change will either be good for the offspring, bad for the offspring or make no detectable difference.
- If the change is good, it is more likely to survive and have offspring of its own, thus the descendants also have that particular change and over time it becomes more common.
- If the change is bad, it is less likely to survive and have further offspring, thus the change will be rare or be lost entirely.
- If the change is indifferent, it has the same chance and so the change may be retained.
It's clear that given time, "advantageous" changes will accumulate (opposable thumbs, better eyesight...) and a variety of "harmless" differences will appear (freckles, hair colour...).
It's also clear that as the environment changes, the definitions of Good, Bad and Indifferent will also change.
Perhaps making something that was previously Indifferent a Good or Bad thing, or even something that was Bad (no eyes) Indifferent or even Good (it's now in a dark cave and needs less food than its eyed cousins), and vice-versa.
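The loop described above can be sketched as a toy simulation; every number here (population size, mutation rate, fitness advantage) is an illustrative assumption, not a biological measurement:

```python
import random

# Toy model of selection: a rare random change gives its carriers a
# slightly better chance of leaving offspring, so over many generations
# it tends to spread through the population.
random.seed(42)

POP = 500
pop = [False] * POP   # False = original form, True = carries the "good" change

for generation in range(200):
    # A random change occasionally appears in an offspring.
    pop = [g or random.random() < 0.001 for g in pop]
    # Carriers of the good change are a bit more likely to leave offspring.
    weights = [1.2 if g else 1.0 for g in pop]
    pop = random.choices(pop, weights=weights, k=POP)

frequency = sum(pop) / POP
print(f"carriers after 200 generations: {frequency:.0%}")
```

Run it a few times with different seeds: "good" changes usually spread, "indifferent" ones (set the weight to 1.0) drift at random, and a weight below 1.0 makes the change die out, exactly the three cases above.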
For newspapers the ad revenue is less than 1/10th of the revenue that they would have got from a print advert.
And the online newspaper pays less than 1/1,000,000 the cost of printing the advert to do so.
- I almost certainly didn't put enough zeros on that number, as the publication costs of online adverts are almost entirely borne directly by the reader of the advert (free) and by the advertising agency (already covered by the reduced revenue).
Yes, you need more throughput to get the same gross profit after paying the staff, but again, that's easier - compare the cost of printing and distributing 1000 newspapers to serving a website to those same 1000 readers.
It's true that many of the old ways of making money have gone. Tell that to the manuscript illuminators, they're the only ones who'll care.
Erm, the trademark had not been granted, and almost certainly won't be.
That's what the "call to send examples" is about - ensuring that Verber cannot be granted the mark by proving it's already in common use in that sphere.
It would also mean that big companies can infringe small companies' patents without any fear of prosecution because they wouldn't be able to afford the legal action.
Loser pays works because you don't start the action unless you're pretty sure of winning.
Interesting, as my iPhone 4S has this exact problem.
It regularly takes over an hour* to realise it's no longer buried in an underground lair or down the Tube and thus there is a usable signal, if it would only look for it.
I've taken to dropping it into "Airplane" mode and back out again to force it to look and connect.
Drives me potty.
Admittedly, that's only iOS 6.0, 6.1 and 6.1.1, it's intermittent and I haven't had 6.1.2 long enough to really see if it's the same.
* Yes, really! Left in my pocket, wandering above ground for an hour or more, blinking in the sunshine, and then looking at it to see no signal. "Airplane" on/off and suddenly a host of missed-call texts.
That he passed out after drinking and dropped his ciggy.
Fitting a Freesat dish and replacing all the Freeview boxes could hit a few grand easily - £500 for the dish & fitting, £500 per Freesat box (equivalent spec to the Freeview box they just killed)
The worst case will be places they can't fit a dish - national parks, listed buildings etc - and "have" to build scaffold (even if a picker is cheaper and better) to reach the masthead amp/antenna.
That could easily max them out - especially as the scaffold companies now know the budget in advance(!)
A Solid-State Relay is just a hard-fired SCR/triac.
Most CFLs and LEDs hate them and die, pretty much the same as if you tried to dim them.
You need an actual relay - best is a latching/pulse relay so it's a pulse of power to turn on, another pulse to turn off - more efficient!
Thought experiment - take an onion and consider what happens if one of the layers explodes.
Some onion is under the layer, some above. So some material is forced inwards, with the rest outwards.
Then consider gravity which pulls it all back together - the explosion must be big enough to push all the material faster than the star's escape velocity.
Therefore, to move everything away, the star-shattering kaboom has to be extremely asymmetrical.
Not very likely.
To start with, you're using an invalid term - it is legally impossible to commit "theft" of any form of intellectual property.
By definition, IP cannot be stolen - only infringed.
Yes, patents are necessary.
However software patents are provably unnecessary and almost certainly damaging.
Mathematics is not patentable as it cannot be invented, only discovered.
Algorithms are mathematics.
Software is algorithms.
Software (and the source for it) should only be protected by copyright, because it's a specific expression of ideas.
Expression, not invention.
On top of that, many patents are being granted that are not only extremely obvious, but are massive land grabs by making extremely wide claims - in some cases, not even merely obvious, but the only apparent way to do a particular task.
Otherwise we might as well patent "Reality TV" - it makes just as much sense, and might result in less of it...
until your hardware gets so old it is no longer supported by the latest and greatest
Which doesn't take long at all - the iPhone 3G (July 2008) didn't take iOS 4.3 (March 2011).
That's less than three years - so the "same OS level" argument is clearly tosh as it's not going to be possible.
Conversely, it is possible to have every Android device (including those not yet made) running the same OS level (although not the same images), because you're able to "roll your own" images. Which could be both a good idea (easy to prevent installation of any non-approved apps) and a bad idea, as it means doing the work to roll their own.
However, locking yourself into a single supplier "for all time" is a fairly standard idiocy of Government and large organisations.
The real question is "What's the exit strategy?" How do they transition to another supplier of hardware and software in ten years time?
I'm practically certain this hasn't even been considered and they may well be locked into Apple forever - although the other end is more likely.
More that the resale value of a stolen iPad is higher than a paper pad.
And while the "Find my iPad" app appears to work reasonably well, it's reliant on the Apple maps...
It makes life even worse, as when you flip the connector over you'll move slightly and now try to put it in the next one - upside down.
First, check it's a GU10 and not an MR16.
The GU10 lamp has "top-hat" prongs, MR16 has straight pins.
Now look at the socket and note the four holes.
Two are round - ignore them, they are screws.
Two are elongated, these are the two to jam your top hat prongs into.
Align roughly with the fatter end of the elongated holes, insert, wiggle slightly and twist clockwise to engage.
If they're actually MR16 then the bigger holes are the screws, so you line the pins up with the two tiny holes and push.
In both cases the lamp will probably light up before you've inserted it all the way, burning your hand.
- Top safety tip - turn the damn thing off first. You can tell if it's off because the lamp that doesn't work is off when it's off, and off when it's on.
Sorry, I should clarify.
No matter what you do or how much money you throw at security companies, as long as you have users or are connected to the Internet there will still be ways for malware to get in.
You can't sit on your laurels.
Excellent start, however constant vigilance is still required.
Vigilance, not just A N Other security tool.
There is an addendum you missed:
"we have had 0 issues with being hit with malware/viruses since about 2002" - that you know of.
It's plausible that some are zombies but you haven't spotted them yet - if their traffic patterns aren't too far away from normal and the end user hasn't complained, how would you know?
The average end user won't complain until the computer is "running really slow", so could be devoting an entire CPU core to malware without noticing.
I recall doing a Malwarebytes sweep and finding half of Sales with possibly bad things installed.
(And nobody in technical roles, but that's self-selection for you)
Electric handbrake seems pretty common over there.
My last US hire car had it, and that was a bog-standard petrol.
I'm guessing it's due to the prevalence of automatics, and idiots forgetting to put the brake on when parked.
But do Apple actually test the updates against all networks?
Or just the original firmware bundles?
Or neither, because Apple are completely in charge of both development and release of iOS?
It's the reason we've got cheap computing hardware at all, take away the x86 platform and you're stuck in the mire of widely variant hardware - costing more and much harder to code for.
Heck, mobile and tablet is the first wave of "impossible to really code for" - you can write software for them but not on them, and having done so you must supplicate at the feet of Apple, Google, Amazon etc before you can sell it to anyone else. Even for free.
The death of WIntel is also the death of Linux and BSD - they need each other. Ok, Windows can afford to lose a lot of market share - but not all of it.
Plus it's equally common to have a faulty ACB anyway.
A few years ago I blew one three times before the EC figured out it was faulty and not set wrong.
Powered up the building, and after about half an hour one corner went dark. So we reset and tried again...
- They don't half go with a bang when they trip.
When the eBook edition costs the same or more than the paperback, something is wrong.
When it's the same or more than the hardback, something is very wrong.
That's the comparison - why is an otherwise-identical something that clearly has a near-zero reproduction and distribution cost sold for the same (or higher!) price as something that clearly has a relatively high reproduction and distribution cost?
Never mind that induction charging is hopelessly inefficient, how much do you think it would cost to install that rail?
For an order-of-magnitude estimate, look up how much it costs per mile to electrify a section of the railway network. It's a lot more than that so add another zero, because you're digging up a road rather than stringing wires between poles.
Wrong metric - it's not % of journeys, it's % of calendar days.
For both hire and purchase you amortise the sunk costs over time, not journeys.
The only place you pay for a hire car by distance is a taxi - except the meter still ticks when you're stationary, so not even then.
Personally, I need the long range for around 10 days a year, except I still need a vehicle at the other end which cranks it up to ~40 days. That's a lot of hire car charges!
Then there's the cost of the EV itself, which even with subsidy is higher than a new mid-sized people carrier - and unlike the people carrier the EV will be worth diddly-squat when I get rid of it, just like my laptop is worth nothing after a few years.
But a significant part of my annual mileage isn't.
So, I'd have to either buy and maintain two cars - one EV for commute, one diesel* for longer range - or hire a diesel* car every time I need the longer range.
At the moment, the EV simply costs far too much and depreciates too fast for either of those to be economically viable.
- I'm also very lucky in that I do have somewhere to charge an EV, most city dwellers don't so couldn't even consider it.
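The amortisation argument above can be sketched with some illustrative figures (every number below is entirely assumed, not a real quote):

```python
# Annual cost of ownership, amortised over calendar years as argued
# above - not per journey. All purchase prices, resale values, running
# costs and hire rates are made-up round numbers for illustration.
def cost_per_year(purchase, resale, years, running_per_year,
                  hire_days=0, hire_rate=0.0):
    depreciation = (purchase - resale) / years
    return depreciation + running_per_year + hire_days * hire_rate

# EV for the commute, plus ~40 days/year of diesel hire for long trips.
ev_plus_hire = cost_per_year(30_000, 5_000, 5, 500,
                             hire_days=40, hire_rate=60)
# One diesel that does everything.
diesel_only = cost_per_year(20_000, 8_000, 5, 1_800)

print(f"EV + hire: £{ev_plus_hire:,.0f}/yr   diesel only: £{diesel_only:,.0f}/yr")
```

With these assumptions the EV route costs nearly twice as much per year, driven almost entirely by the steeper depreciation plus the hire charges - which is the point being made.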
So my question is - where is the car that is both these things?
Plug-in EV for my daily commute, diesel genny for my occasional long journey?
How ****ing hard can it be if even Top Gear can cobble that together?
* For low-carbon long-range, diesel is the only choice.