Re: Toolbars disappearing?
So it was impossible for Microsoft to put the menu down the side, where a 16:9 monitor has tons of wasted space?
They certainly made it impossible for a user to put their toolbars there.
Sorry, I missed the last thread!
They provide 1A (and USB data) in our application, tested in a multitude of low-altitude environments around the world. They're intended for 'industrial' phone charger type applications.
Zero disconnect force. To be more accurate, a little less than zero...
I'll dive through the WEEE bin on Monday, see if we have any in there and drop you a photo.
Probably because there are far, far more of them, most are run by average consumers and they are left running 24x7.
The majority of the commodity hardware sat on the actual Internet or in the DMZ (as opposed to NATed) is running Linux, simply to get a network stack.
ADSL routers, switches, firewall appliances, Internet-webcams - that kind of thing.
Comparatively, there are very few Windows PCs directly on the Internet anymore - it's only really servers these days, and one would hope the admins of the majority are sane and competent.
In most cases the idiocy is probably the device manufacturer, leaving telnet or SSH turned on with a default password.
Very few home users are going to check the security of the WAN port of their home ADSL router, and they aren't going to try SSH or telnet attacks when fitting a webcam to watch the kids from work; they'll just forward the ports or even put it in a DMZ.
It's not quite that easy, as the Stuxnet attack on a uranium-enrichment facility showed.
If you can infect the computer that programs the facility, you can use that to corrupt the PLCs that actually run the place later on.
Effectively a kind of sneakernet.
Which is worrying, and why I'm fairly paranoid about my laptop!
Must have saved many thousands of lives.
Competitive vomit races, and even target vomitry!
Imagine it, vomit bouncing down the street, straight into the face of a passer by.
A perfect ten, by anyone's standards.
Putting a file on a machine puts the file on the machine!
Who would expect that!?
What exactly was the point of this research anyway?
- "Deleting" a file has never destroyed it on any media. These researchers don't show that the client apps claim to destroy the data, and I've not heard the apps' makers claim that either.
Was playing Mass Effect 2 recently and was struck by how "forced" all the facial expressions looked. Very disappointing, given that the fully pre-programmed nature meant they really could have done better than "smile/neutral/grimace".
So something that lets them cheat would help a lot for making believable characters.
Although nothing can save them from such terrible dialogue...
Hint - it isn't Britain.
It would be genuinely stupid for the UK to invest significantly in direct solar energy, given the general lack of sun.
I leave it as an exercise for the reader to figure out why it's massively subsidised in the UK, with huge amounts of money being thrown at wealthy landowners to install it.
It means the market.
In other words, they'll stop taking the photos and take their image libraries offline.
Essentially, if nobody pays for the pictures they'll have to find a "proper" job instead to pay their bills, and photography will be relegated to a part-time hobby.
That's what it says on the actual statute books.
The more you scream about copyright infringement being theft, the less attention anybody will pay you.
FACT is a lie.
Copyright infringement is a crime, but it's not theft - it's more closely related to fraud and breach of contract.
Actually, you can't.
Remote-controlled model aircraft are not permitted to go that high, and the batteries in a quad/octo don't last that long either - 20min is about the longest you'll get.
You can take a lot of really nice photos with a big quad or octo, but the aircraft to carry a decent camera is going to cost £5k or more.
Model aircraft are great for chase cams though!
It can also be done online, you usually don't need to go to court in person.
Costs £20 IIRC, added to the claim.
You will have to show that you've taken "reasonable" steps to settle the matter before going to court.
You'd be wrong though.
It's all about the fade, and a ten-step 1 second fade is very visible.
You need at least 20Hz update rate for fades in small tungsten lamps, for LED you'd want 30-40Hz.
10Hz only works for large tungsten, or if the response is artificially slowed to allow internal smoothing to work.
Better than any of the other 2-wire "dimmable" LED lamps I know of though, most of which only go down to 20%.
I would still like the last 11% though.
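The arithmetic behind those refresh rates is simple enough to sketch. The rates below are just the ones mentioned above; the step sizes are pure arithmetic, not measurements:

```python
# Brightness step size for a full 0-100% fade over 1 second,
# at the update rates discussed above.
def step_size_percent(rate_hz: int, fade_seconds: float = 1.0) -> float:
    steps = rate_hz * fade_seconds   # number of discrete updates in the fade
    return 100.0 / steps             # brightness jump per update, in %

for rate in (10, 20, 40):
    print(f"{rate} Hz -> {step_size_percent(rate):.1f}% per step")
```

At 10 Hz each update jumps the lamp by 10% of full brightness, which a fast-responding LED makes painfully obvious; at 40 Hz the jumps are down to 2.5% and the fade starts to look continuous.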
Sunshine also has the most unbelievably stupid spacecraft crew ever to screw up on screen.
I still choke that they really expected us to swallow the idea of flooding a compartment with oxygen to blow out a fire, when they could just vent it to space...
You know, try a technique that might actually work instead of one that would definitely kill everyone on board and probably destroy the craft as well.
The worst part is that the writers could have got exactly the same "oh hell" result of the aftermath by using an actual firefighting technique, instead of something that made no sense whatsoever and must have sent the scientific advisors screaming for the hills and demanding to be removed from the credits...
Nope, this isn't "simply upgrading".
To do a comparison with another well-known portable computing supplier, he's doing the equivalent of getting rid of an iPad and buying a (tablet shaped) Macbook Pro. That's not an upgrade, it's a different class of device.
Windows RT isn't Windows 8. Superficially, they look very similar - but when it comes to what you can actually do with them, they are completely different.
To me this tweet really says "Microsoft exec confused by the difference between a Surface RT and a Surface Pro" as he's bought the wrong one.
No matter which Linux you pick, in ten years' time you will definitely still be able to legally deploy that specific version.
That's the issue - right now companies are only just starting to use Win7 embedded in released products.
It takes a year or two to build and test the new system on a new underlying OS, so if you lose the ability to license it only five years after it first became available, you might only get three years of shipping units before you have to switch to a new underlying OS.
Which takes you two years, so you end up continually re-writing just the OS layers, with only one year in the middle for actual new features etc.
Not sustainable, thus a short lifetime of Embedded licensing is a sure-fire way to kill all sales of it, forever.
As other posters have pointed out, several Embedded systems are already on Linux - and with modern toolkits like Qt, transitioning between WinXP and Linux is much easier than it used to be. It's not "tick the box", and probably won't ever be - but much easier than before.
It's also rather nice how easy a Linux is to lock down - after all, an always full screen application doesn't actually need a window manager...
Linux actually has the low-power embedded market almost completely sewn up - check what your smart TV or STB (router etc) runs!
In earlier Embedded it was normal to kill the shell entirely and replace it with your own, so merely suspending bits of the Win8 shell seems an odd way of describing it.
In almost every use of Win Embedded, the whole point is to kill every aspect of a "normal" Windows UI and replace it with the specialised UI for the particular use.
- If you can still see that it's Windows after the boot screen, you probably didn't do it right.
That said, manufacturers are only now moving to Win 7 Embedded, so usage of Win 8 is a few years away.
Unless they pull the rug out from under us by stopping licensing of new Win7 Embedded machines, in which case it'll scare everybody onto Linux.
One question to think about:
How many applications do you have installed?
Under all previous versions of Windows, the "launcher" had folders to let you organise your installed programs into groupings, and to let the installer put their links together.
That's gone in Windows 8.
If you have more programs than fit on one screen, you have to flick through several pages.
If the installer used to put its links into a given folder (eg all from the same supplier), it won't anymore, you'll have to drag them around yourself.
Even iOS has folders on its "launcher" - and most people have far fewer things installed on their phone than on their PC.
The Win8 Start Screen just doesn't scale. If you've only got one screenful of applications, it's ok. As you add applications, it becomes more and more unwieldy.
Personally, I have well over a hundred applications installed - most are "rarely used", but I still need them and I still want them grouped nicely so I can find them quickly.
- As I use them rarely, I may not remember enough about the name to use Start->type, or even recognise the icon, but finding "widget drawing" in the "drawing" folder is easy.
Doing the same in Win8 Start Screen isn't possible. I must recognise the icon or know its name, or it is almost impossible to find.
- One example of a useful UI improvement that was clearly based on proper research is "Pinning" in Win7 - those few applications I use every day end up pinned, while the more rarely-used stay in folders in the Start menu.
There are some things in desktop Win 8 that are quite nice, including the new task manager.
Yes, there are a lot of nice things. It boots faster, has many improvements "under the bonnet" and has quite a few useful tweaks to built-in utilities.
Then they ruined it by ripping out some useful features and cramming a tablet UI on the front.
It's like bringing out a new, faster Ferrari but insisting you cannot buy it in red, and steering now uses a lever.
Obviously you could take it to a paint shop and bolt on a steering wheel, but why should you need to?
How can that be abuse of a monopoly?
The only reason Win8's TIFKAM even exists is because they are a monopoly - consumers are essentially forced to buy it if they want a PC at all. (As the OEMs are pushed away from Win7)
If they had a choice, almost nobody would buy Windows 8 - you can see approximately how many would choose it by looking at sales of Surface and Surface Pro.
I'm still pissed off that they neutered the task bar - so much of the cool stuff it did has gone :(
As a few earlier posts have pointed out, most of the evacuated area is less radioactive than Cornwall.
Unfortunately there is an outright panic attack whenever radioactivity is mentioned, regardless of the actual level.
Going through US airport security (excluding the flight) probably used to give you a bigger dose, thanks to the X-ray backscatter machine, than a day's visit to "Area 2".
- Hard to be sure as the data on those was never published and probably wasn't ever measured for the staff and the queue.
Nope, you want to know the ground-level information, not a few hundred metres up.
I really hope they did include measuring the radiation - in fact, it would also be very interesting to do that everywhere.
I'd love it if radiation levels around the world became well-known, in the medium term it would remove the hysteria and replace it with the simple respect radiation deserves.
We'd all be safer for that.
Yes, but you see, the ASA's dictionary has a few missing pages and they can't afford a new one.
Perhaps we could help them out?
"Hey, let's spend billions buying underpants!"
"Why? Will it help?"
"Just buy billions of underpants and then get back to me."
Power lost = Current * Current * Resistance
Power supplied = Current * Voltage
If you don't understand why these two equations mean higher voltages are needed, you won't understand why the rest of that post wouldn't work either.
Go do some research - it's very interesting.
However, if you don't want to learn any physics or engineering, don't proffer opinions on them because you'll just look foolish.
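A quick worked version of those two equations. The line resistance and power figures below are made-up round numbers, purely for illustration:

```python
# Deliver the same power over the same line at two voltages and compare
# the resistive loss (Power lost = I^2 * R, Power supplied = I * V).
P = 1_000_000.0   # watts to deliver (1 MW) - illustrative figure
R = 5.0           # line resistance in ohms - illustrative figure

for V in (10_000.0, 100_000.0):
    I = P / V                  # current needed to supply P at this voltage
    loss = I * I * R           # power lost heating the line
    print(f"{V/1e3:.0f} kV: {I:.0f} A, loss {loss/1e3:.1f} kW")
```

Ten times the voltage means a tenth of the current, and since loss goes as current squared, one hundredth of the loss. That's the whole reason transmission lines run at hundreds of kilovolts.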
You earn money on solar by stealing it from the poor.
You get between four and five times the going rate for your electricity generation, and the Grid is forced to buy whatever you generate, regardless of whether the Grid actually needs it or not, or even whether it gets any of it in the first place.
- Yes, this is the same as Tesco paying you £5 for each cucumber you grow and eat yourself.
It's almost certainly the most regressive taxation to ever come out of the UK's central Government - it drives the poor into fuel poverty while handing money to rich landowners.
Yet it was a Labour idea. So much for their "core values".
Isn't that 1-3% based on simple maths?
Area of surface we are likely to build wind turbines on, multiplied by the watts per square metre an array can generate, compared with giving everybody a reasonable amount of electricity?
Sounds reasonable to me, given the paper's content.
Of course, you can argue his area is too small, or that 2/3 of EU energy consumption for everyone is too large, but I rather doubt it's out by an order of magnitude.
So even accepting that, the upper bound on global wind power would only be 10% or so.
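The back-of-envelope version of that argument can be sketched in a few lines. Every number here is an assumed round figure, not a measurement: an often-quoted ~2 W/m² average wind-farm power density, Earth's total land area, a guessed buildable fraction, and a "modern living standard for everyone" demand estimate:

```python
# Rough upper-bound sketch for global wind power (all inputs are
# assumptions chosen for illustration, not measured data).
power_density = 2.0            # W per m^2 of wind farm (assumed average)
land_area_m2 = 149e12          # total land area of Earth, m^2
buildable = 0.01               # assume 1% of all land is covered in turbines
demand = 8e9 * 3_000.0         # 8 billion people at an assumed 3 kW each

supply = power_density * land_area_m2 * buildable
print(f"wind share of demand: {supply / demand:.0%}")
```

That comes out in the same ~10% ballpark. You can push any of those assumptions around, but it's hard to move the answer by an order of magnitude.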
The huddled masses of the world's really poor aren't going to stay huddled forever - and the sooner they get to the "modern" living standards the better for everyone, because that's the only way the human birth rate will reduce to a steady state. You'd better hope that happens before peak energy and peak food, because human wave attacks are very messy.
Just because leaping off a cliff is madness, doesn't make climbing the cliff sane.
Burning oil is foolish - if nothing else, it's far more useful as a raw material than a fuel.
However, building wind turbines is also foolish.
It's telling that you only feel able to post your ill-informed bigotry completely anonymously.
It would appear you don't even believe your post enough to associate your long-term (semi-anon) handle with it, which doesn't sound like someone with a genuine religious conviction.
You don't believe, you just hate those different to you, and last time I checked, Jesus didn't say to hate your neighbours.
By your definitions, childless couples and disabled husbands are just as bad as gay marriage, as neither provide the benefits you claim marriage is for.
Do you agree with that statement? Would you deny a paraplegic fiancé the right to marry?
Marriage is a public declaration that two people love each other, intend to devote the rest of their lives to each other and remain faithful to each other, regardless of changing circumstances.
That is valid for any two people.
AC@12:41: You could read the next paragraph: "[America] is the only place I've seen "swipe'n'nothing" credit card payment, anyway."
Also, this statement is just plain wrong:
"Contactless payments are authorised, the card provides the authorisation"

That's not authorisation, because it's taking money without asking the account holder anything.
The account holder is the only entity who can authorise money going out of an account. If the account holder didn't authorise a transaction, that transaction is unauthorised by definition.
By (EU) law, you must be refunded if the bank permitted a transaction without that authorisation.
So you're genuinely happy that anybody at all can make multiple transactions, each up to %VALUE% (£50?) once they've nicked your card? (Or even without stealing it, instead remotely using the RFID to determine the card number and doing a few CNP transactions until the anti-fraud trips in and blocks it.)
Leaving you with the fun and games of getting the money back, perhaps bank charges (and even court summons) due to going overdrawn or having cheques, direct debits or standing orders etc refused?
For most people it wouldn't take many £50 transactions to do that - just one may be enough.
That sounds like a dangerously foolish idea to me.
You must be American to like the idea of being able to pay for smallish transactions direct from an account with no authorisation or security process at all.
That's the only place I've seen "swipe'n'nothing" credit card payment, anyway.
The card is not the account holder!
I used to think the SciFi stories were silly where people had credit chips that almost anybody (usually the villain or hero) could easily use to take money from random civilians, but now it looks like the banks really do intend to go there.
The tax man will be very interested in any companies complaining about this behaviour, because it's a clear indication that the person is really an employee.
- So the employer has to pay NI, holiday pay, pension etc.
Not sure about other countries, but in the UK, if you can't subcontract your contract then you're usually considered an employee, and not a self-employed (freelance) contractor.
Maybe you've not heard the term before.
"Dogfooding" means "using your own products internally".
It is almost universally a good thing, as it saves the supplier money and helps find subtle bugs.
Aside from that, would you trust a supplier who doesn't trust their own products enough to rely on them for their own business?
It comes from "Eat your own dogfood".
wxWidgets is a widget toolkit - it's only the GUI.
Qt on the other hand is a complete cross-platform SDK, however its previous owners spent years developing it for a particular mobile phone OS, then set themselves on fire, slit their throats and threw Qt away.
So Qt is back to being a desktop SDK, now with a host of new, shiny, but unusable mobile OS extensions.
Which is a terrible shame; I call it "being Elopped".
The Android (Necessitas) and iOS ports will fix this, but not yet.
You will not find "connect me to a database" widgets in either, you do have to do some of the work yourself.
If you are going to compare Apples, at least pick oranges instead of monkeys.
Armstrong took manual control of the LEM during the final section of the computer-controlled landing and drifted it along to find a flatter LZ - because it had missed the target due to wrong data, and it wasn't possible for a human to see that until it had nearly landed.
With a hand over the "hard abort" button that would put the computer back in control and throw them back to the CM. That's still the closest to actual off-world piloting ever done. Perhaps the same will be done for a manned Mars lander, but I doubt it.
Apollo 13 did one or two 'manual' burns on a "we'll correct it later when the computer is running again" basis.
However, the burn was still pre-calculated by the boffins on Earth, the crew's job was to keep the craft in the same orientation during the burn.
Not to time the burn, not to work out how long to burn for or which direction to do it, and not even to know that it needed doing at all.
Finally, Gemini etc crews didn't do the rendezvous, just the final docking. Computers got them within a few hundred metres and at near-zero relative velocity, humans only handled the final touch-and-grab.
- Compare taking a ship across the Atlantic to New York with going the last 100m to the quayside.
Humans are also rather poor at it, demonstrated by how much practice was needed to get a small number of extremely experienced and highly trained individuals to be able to do it at all. (And how dented a lot of ships are! The big ones auto-dock now.)
Today it's mostly not done - grab the thing with an arm and drag it into place.
The big advantage a human-crewed mission has is being able to repair stuff. If the computer breaks, a human can turn it off, replace bad components and reload the software. A computer can't do that for itself.
Minor correction - human crewed, not piloted.
Computers have flown all spacecraft with the exception of Apollo 11's LEM*, Apollo 13* and the shuttle during the landing.
Launch and in-space manoeuvring are things humans just can't do - the timings and precision needed are too tight, and we simply can't do the observations either.
Any spaceflight is quite simply pre-calculate and let the computer burn the engines.
Humans can do the last bit of landing, but balancing on a tongue of flame is best left to a computer.
* 'cos it broke.
Sanity Soapbox, your argument simply does not exist at all. It is a non-argument, it not only has ceased to be but never was. It's a statement of an empty belief with no thought behind it whatsoever.
For the hard-of-thinking, here's a brief summary of evolution:
A random change occurs to the offspring of a lifeform compared to its parents.
That change will either be good for the offspring, bad for the offspring or make no detectable difference.
- If the change is good, the offspring is more likely to survive and have offspring of its own, thus the descendants also have that particular change and over time it becomes more common.
- If the change is bad, it is less likely to survive and have further offspring, thus the change will be rare or be lost entirely.
- If the change is indifferent, it has the same chance and so the change may be retained.
It's clear that given time, "advantageous" changes will accumulate (opposable thumbs, better eyesight...) and a variety of "harmless" differences will appear (freckles, hair colour...).
It's also clear that as the environment changes, the definitions of Good, Bad and Indifferent will also change.
Perhaps making something that was previously Indifferent a Good or Bad thing, or even something that was Bad (no eyes) Indifferent or even Good (it's now in a dark cave and needs less food than its eyed cousins), and vice-versa.
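Those cases can even be sketched numerically. A minimal deterministic model of the "good change" case above, assuming an illustrative 5% fitness advantage for a change that starts in 1% of a population (both figures are made up purely for the sketch):

```python
# Expected frequency of a "good" change across generations.
# The 1% starting frequency and 5% advantage are illustrative assumptions.
p = 0.01           # fraction of the population carrying the change
w = 1.05           # relative fitness of carriers (a "good" change)

for generation in range(50):
    # Carriers out-reproduce non-carriers in proportion to their fitness,
    # so the carrier fraction is re-weighted each generation.
    p = p * w / (p * w + (1 - p))

print(f"after 50 generations: {p:.1%}")
```

Even a small advantage compounds: after 50 generations the change has gone from 1% of the population to roughly 10%, and it keeps climbing from there. Run the same loop with w below 1 and the change dwindles away instead; with w exactly 1 it just sits where it started.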
For newspapers the ad revenue is less than 1/10th of the revenue that they would have got from a print advert.
And the online newspaper pays less than 1/1,000,000 the cost of printing the advert to do so.
- I almost certainly didn't put enough zeros on that number, as the publication costs of online adverts are almost entirely borne directly by the reader of the advert (free) and by the advertising agency (already covered by the reduced revenue).
Yes, you need more throughput to get the same gross profit after paying the staff, but again, that's easier - compare the cost of printing and distributing 1000 newspapers to serving a website to those same 1000 readers.
It's true that many of the old ways of making money have gone. Tell that to the manuscript illuminators, they're the only ones who'll care.
Erm, the trademark had not been granted, and almost certainly won't be.
That's what the "call to send examples" is about - ensuring that Verber cannot be granted the mark, by proving it's already in common use in that sphere.
It would also mean that big companies could infringe small companies' patents without any fear of prosecution, because the small companies wouldn't be able to afford the legal action.
Loser pays works because you don't start the action unless you're pretty sure of winning.
Interesting, as my iPhone 4S has this exact problem.
It regularly takes over an hour* to realise it's no longer buried in an underground lair or down the Tube and thus there is a usable signal, if it would only look for it.
I've taken to dropping it into "Airplane" mode and back out again to force it to look and connect.
Drives me potty.
Admittedly, that's only iOS 6.0, 6.1 and 6.1.1, it's intermittent and I haven't had 6.1.2 long enough to really see if it's the same.
* Yes, really! Left in my pocket, wandering above ground for an hour or more, blinking in the sunshine, and then looking at it to see no signal. "Airplane" on/off and suddenly a host of missed-call texts.
That he passed out after drinking and dropped his ciggy.
Fitting a Freesat dish and replacing all the Freeview boxes could easily hit a few grand - £500 for the dish and fitting, plus £500 for each Freesat box (of equivalent spec to the Freeview box it replaces).
The worst case will be places where they can't fit a dish - national parks, listed buildings etc - and "have" to build scaffold (even if a cherry picker is cheaper and better) to reach the masthead amp/antenna.
That could easily max them out - especially as the scaffold companies now know the budget in advance(!)
A Solid-State Relay is just a hard-fired SCR/triac.
Most CFLs and LEDs hate them and die, pretty much the same as if you tried to dim them.
You need an actual relay - best is a latching/pulse relay so it's a pulse of power to turn on, another pulse to turn off - more efficient!
Thought experiment - take an onion and consider what happens if one of the layers explodes.
Some onion is under the layer, some above. So some material is forced inwards, and the rest outwards.
Then consider gravity which pulls it all back together - the explosion must be big enough to push all the material faster than the star's escape velocity.
Therefore, to move everything away, the star-shattering kaboom has to be extremely asymmetrical.
Not very likely.
To start with, you're using an invalid term - it is legally impossible to commit "theft" of any form of intellectual property.
By definition, IP cannot be stolen - only infringed.
Yes, patents are necessary.
However software patents are provably unnecessary and almost certainly damaging.
Mathematics is not patentable as it cannot be invented, only discovered.
Algorithms are mathematics.
Software is algorithms.
Software (and the source for it) should only be protected by copyright, because it's a specific expression of ideas.
Expression, not invention.
On top of that, many patents are being granted that are not only extremely obvious, but are massive land grabs by making extremely wide claims - in some cases, not even merely obvious, but the only apparent way to do a particular task.
Otherwise we might as well patent "Reality TV" - it makes just as much sense, and might result in less of it...