Re: "iOS, Android and WP7 only have one real browser. Not true of the others." WTF??
I somehow got the idea they all used the Webkit engine, evidently I was on crack.
Thanks for the reminder that Opera and Firefox do not.
While neither tells you very much (of course), neither of them makes me want to buy their more detailed stats either as their comments imply they just want to put the other down rather than produce a useful dataset.
Both of these stats are about web browsing, and not applications, which means the underlying OS doesn't actually matter, you only care about the browser and form factor.
(Ok, iOS, Android and WP7 only have one real browser. Not true of the others.)
Net Applications' bundling of "tablet" with "phone" commingles completely different form factors, thus giving no guidance whatsoever. How do I tell from that whether I should put more effort into making my site look good on a Windows Phone or Symbian? It feels like a statistic specifically invented to make iOS look good.
StatCounter on the other hand is useful, because it tells you how much effort to put into making your "for phone" version(s) of the website look nice on anything other than the big two, Android and iOS.
Again, it doesn't tell you whether it's worth working on the various tablet versions, but at least it's up front about that.
The argument about page impressions against unique users... What exactly is wrong with tracking both? I'd rather like to know if a huge number of page impressions is really only a small number of users, or if I just get a host of uniques who never come back, as these two are indicators that I'm doing different things wrong.
Nah, she had a lute, see?
"Zero-cross" switching is irrelevant, it's almost exclusively a marketing thing. It has no bearing on device reliability or whether or not it'll damage the telly. It merely reduces EMI switching noise - which is a good thing if you've got a lot of them, otherwise irrelevant.
- Consider a 3-pin-plug or the switch on the socket. Does that do zero-cross detect? Does the act of plugging in a TV damage it?
AC SSRs are actually a pair of back-to-back SCRs or a Triac, just like a dimmer. This means that the waveform they produce is not sinusoidal (both due to being non-resistive and having a recovery time during zero cross) and that they never fully "turn off" - there is always a small leakage current through the SSR. There is also a notable insertion loss (fixed voltage drop) which makes them rather inefficient compared to a relay.
Many power supplies can be damaged by these effects - continually feeding it a small current that's not enough to start it and making the zero-cross point fuzzy are both bad things to do to a power supply. Switch-mode PSUs also draw their current in 'pulses', which means the SSR may not actually stay turned on throughout the cycle and may 'starve' the SMPS. (This has all kinds of strange effects.)
Put simply, you need a physical relay contact that goes "click", because that's what the PSU in your equipment was designed for.
The best is a physically latching one as then no power is consumed keeping the relay "on" - this is impossible to do with SSRs as they must be powered to turn on. Thus relay-based ones can be considerably more efficient than SSR ones.
That's assuming you even come out ahead in the first place - if you have multiple items on the switch then it might be worth it, but just a TV is almost certainly pointless.
They're cheap, and have much lower-quality PSUs than TVs etc.
So how much power do they consume themselves? It is likely that some of them actually draw more power in their own standby than your TV does, and all of them will increase the power consumption when your TV is on.
The only one I could find that gave consumption values was a US version, and it said 0.08W in standby, and 0.50W when on. Let's assume a UK version is the same (it's probably worse due to the higher voltage)
My Plasma TV is specified to draw 0.30W in standby, thus I would save 0.22W when my TV is off, and cost 0.50W when my TV is on.
This means that simply to break even, my TV has to be off for 2.3 times as long as it is on.
Now take a look at a modern LED TV. Oh dear, 0.1W in standby.
That's before you consider the damage the SSR (solid-state-relay) -based ones will do to your delicate electronics.
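For anyone who wants to check the break-even arithmetic above, here it is as a quick Python sketch. The wattages are the ones quoted above (0.08W/0.50W for a standby-saver plug, 0.30W for the plasma TV's own standby); treat them as illustrative, not gospel:

```python
# Break-even sum for a standby-saver plug, using the figures quoted above.
plug_standby_w = 0.08   # plug's own draw while "off"
plug_on_w = 0.50        # plug's own draw while passing power
tv_standby_w = 0.30     # the TV's standby draw the plug is meant to save

saved_when_off_w = tv_standby_w - plug_standby_w  # 0.22 W saved while TV is off
cost_when_on_w = plug_on_w                        # 0.50 W extra while TV is on

# To break even, energy saved while off must equal energy spent while on:
#   saved * t_off = cost * t_on  =>  t_off / t_on = cost / saved
break_even_ratio = cost_when_on_w / saved_when_off_w
print(round(break_even_ratio, 1))  # 2.3: TV must be off 2.3x as long as it's on
```

Swap in the 0.1W standby figure of a modern LED TV and the "saving" goes negative: the plug then costs you power whether the TV is on or off.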
Corporate fines are at best a lagging driver. Nobody takes much notice until somebody else has been hit with a big one, and then it takes a long time to change course anyway.
Aside from this, the risk severity was already as high as it could get - this current disaster would probably have triggered a state takeover of RBS had it happened when they weren't already zombified.
Even as it is, they are going to lose a lot of customers.
At the end of the day, the probability of this kind of failure is directly related to the complexity and importance of the system, and inversely to the experience of the people looking after it.
As they get amalgamated, more financial astronauts come up with more and more complex 'instruments', and experienced staff retire, "retire" or are made redundant, the probability increases.
So this is very likely to happen at least once more, and the only difference is that it's more likely to take a CxO with it next time.
Rubbish. Utter tosh.
If the US wanted him, they'd extradite him directly from the UK, as that's much easier than arranging for Sweden to extradite him and then extradite him again from Sweden.
Sweden aren't going to allow an extraordinary rendition either, because it'll go VERY public very quickly - how would that play in the media? Assange extradited and 'vanished' on the way over? Not going to happen.
So, ask yourself - does it really make any kind of sense to fight two legal battles, one through a puppet and the other in person, to extradite someone instead of just one?
On top of that, the UK has a 'special relationship' with the US, including rather lenient extradition terms for any time the US want somebody. They don't have a similar agreement with Sweden.
If you're looking for a conspiracy theory, this is not one.
Bugger, that should read "..if they're at a petabit/s scale in fibre already without OAM..."
The fail is clearly me..
Once it gets reflected about a bit, the information on the OAM gets lost very quickly and is no longer recoverable.
Thus this technique is really only usable for point-to-point tight-beam transmission.
Which really means it's a bit 'meh' for radio-frequency below the tens-to-hundreds of GHz (when you have to do point-to-point tight-beam anyway as nothing else has reasonable power requirements for decent range), and equally not that useful in optical until you're in free space as the photons get scattered by the atmosphere.
I do think it would work pretty well in single-mode fibre though, which is interesting - if they're at a petabit/s scale in fibre already without OAM, OAM might add another order of magnitude.
As long as you stay on the surface of this world, and don't go up any particularly tall mountains.
Going down any particularly deep mines is also obviously out, as they don't have decent beer down there.
Go on then, why the downvotes?
Yep, that's the smart thing to do.
Never announce something until it's in a container on the way to the shops or sat in your warehouse.
Then pretend to be really surprised when somebody 'accidentally' leaks the details while you're still developing it.
All the exposure of a premature launch, none of the risks of the cries of "vapourware!!!"
Thus assuming your daughter still lives with you, you do still have and use television receiving equipment.
So you or your daughter still needs a licence, and the TV Licensing people remain accurate in their assumption.
If you only watch iPlayer and other on-demand "repeats" and never at the same time as it's broadcast, that's when you don't need a licence.
- Incidentally, there's no longer such a thing as a 'TV detector van'. Many years ago they could tell by the CRT baseband squeal, even knowing which channel, but it's now much easier and more accurate to assume that 99.9% of the population have and use TV receiving equipment, coupled with the database of TV receiving equipment purchases.
That said, Capita do have themselves to blame over continually bugging people who genuinely have no TV receiving kit. And even repeatedly visiting people who already have a TV licence - that one really annoys me.
"An emergency, critical, unavoidable SP pack will fix that."
Even Microsoft wouldn't be that stupid.
Their bread and butter revenue comes from corporate IT, and doing that would quite simply result in corporate IT shutting down all Windows Update and rapidly searching for an alternative operating system that didn't force that kind of change on them.
Can you imagine the CEO of MegaCorp turning on his computer one day and finding Metro instead of his expected desktop photo of his children?
- However, they probably wouldn't actually take up the alternative because the threat alone would force Microsoft to publicly fire whoever they could blame for forcing the update. Thus nobody below Ballmer would sign it off, and even he might get forced out by shareholders if it was proven.
You cannot change out the UI that radically in any corporate environment. The ribbon was a minor change compared to pushing Metro.
It worked beautifully over there, didn't it?
Compared to an Aygo, it's a right looker.
Sounds like it's nicer to drive than an Aygo as well and has a usable boot.
- Had an Aygo as a hire car once. Horrible little thing.
The Ovi (Nokia) store had 116,583 apps for Symbian as of December 2011.
More interesting is that the Windows Marketplace has more than 100,000 registered developers, which implies that the majority of devs have published somewhere between zero and one applications.
"There should be synergies and economies of scale from using the same O/S and same UI across all devices."
No, there are not. And here's why:
The user interface that's great on a 10" touchscreen tablet is hideously painful on a single, high-res monitor with a trackpad (laptop).
It's also very annoying on a multi-monitor desktop.
The different physical UIs need different 'software' UIs.
There might be some synergies in the OS, but probably not as much as you're thinking and it may even be more complicated anyway as cross-platform dev is hard.
Symbian was just getting good when Elop broke its legs in a memo.
Nokia then proceeded to shoot it in the head by deciding not to sell Symbian smartphones in the markets that can afford to buy large numbers of smartphones.
- Have any Symbian smartphones been offered for sale in the UK?
Given the above, rather a lot have been sold.
Plus Windows goes mad and literally blacks out the playback area or cripples the resolution if you try to clone the laptop display onto a projector. (At the time HDCP projectors didn't exist, now they are merely almost impossible to buy.)
This all drives us mad, and I took a sneaky joy in reminding a MS exec that the reason they couldn't see their video on their presenter's screen was Windows copy protection.
It would be great to see the things we used to be able to do easily come back, though I somehow doubt that happening.
I'm with you on the writer front. The guy might be a good musician but he can't write prose for toffee.
TBH It reads like a series of commentard postings, not all by the same person.
There is some interesting info on how artists currently (do/don't) get paid, and how they (did/didn't) get paid in the past, but it is fairly buried.
Fundamentally I suspect that the problem is very few people respect copyright, and the general reasons for that are fairly clear. The 'big media' don't respect it much either, and they've successfully lobbied for ridiculous copyright length - the latter stifles creativity because you can't bounce off existing works until they're truly ancient.
The fundamental problem is one of respect - it's basically been lost.
The "normalisation" you mention is more properly known as "compressed to hell and back", and has nothing to do with spending more or less on recording or mastering.
Leaving the dynamic range in would cost no more, possibly a bit (though not noticeably) less as it's a step you can leave out.
It's a deliberate choice by the producers of the track.
My guess is because every track wants to be the 'loudest' on the radio or the various music TV channels, for the same reason most adverts are also compressed to buggery.
Though compression does cover a multitude of sins, not least an artist's inability to sing. Rather like autotune, but less obvious.
Autotune and sampling is where you look for the corner cutting. Sampling cuts out the musicians, autotune cuts out the vocalists, doing both cuts out most of a recording studio.
There are one heck of a lot of tracks using massive amounts of both, to the point where it could have been anybody 'singing' in the first place.
Not really. Inter-application communication has very tight granularity, but application-to-phone permissions are still quite big buckets:
It's a little odd as there are a few very fine-grained permissions, while most are very large buckets. eg location info has several different permissions, while others let the app do pretty much whatever it wants to "X".
It's still not possible to deny an application a permission while still running it, or alter permissions after installation - for example, almost every social network app seems to want GPS location. What if I don't want it to have my location but am happy for everything else?
Or even more common, I'm happy for it to use the Internet but not for it to use my phone or SMS/MMS. When abroad it's easy to kill Internet, but not possible to kill phone/SMS/MMS.
Say you write a picture-munging application. It clearly needs to be able to read the pictures already on the phone as well as take photos with the camera, and it needs to save the results back to the picture gallery.
Equivalent examples exist for a lot of things.
The obvious solution is for more granularity in permissions - you may see my picture collection but not my sounds library etc.
More importantly, we need the ability to deny a particular permission to the application - eg it gets empty, volatile 'persistent' storage.
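The deny-with-a-harmless-stand-in idea can be sketched like this. This is pure illustration of the concept, not Android's actual API - every name here is invented. The point is that a denied permission returns something empty rather than crashing the app:

```python
# Toy permission broker: denied permissions hand back harmless stand-ins
# instead of refusing to run the app. All names invented for illustration.
DENIED_STANDINS = {
    "location": None,        # app sees "no fix available"
    "contacts": [],          # app sees an empty address book
    "persistent_storage": {} # app sees empty, volatile storage
}

class PermissionBroker:
    def __init__(self, granted):
        self.granted = set(granted)

    def request(self, permission, real_value):
        if permission in self.granted:
            return real_value
        # Denied: the app still gets *something*, just nothing private.
        return DENIED_STANDINS.get(permission)

# User grants internet but denies location to a social app:
broker = PermissionBroker(granted={"internet"})
print(broker.request("internet", "socket-handle"))  # "socket-handle"
print(broker.request("location", (51.5, -0.1)))     # None - no GPS for you
```

The app keeps working, the user keeps their location, and nobody has to choose between "install with everything" and "don't install at all".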
The fun part is that we don't have any living models for a bird or reptile anywhere near that size.
Mammals are clearly a bad model, and living things generally don't scale so birds and reptiles are of limited usefulness. For example, the size of the digestive tract varies immensely among animals of similar dimensions. (Compare horse to cow. Or even horse to hippo!)
All in all, it makes figuring out the likely physical build of a fossil creature incredibly difficult and often subject to bias - the Victorians went with 'slow and fat', hence the very high original weight estimates.
Smaller dinosaurs are much easier - birds are a close model, and we've even got a few skin impressions which helps immensely.
This cow is very small.
Those are very far away.
Small, far away.
Warranty periods cut (so clearly they know they're rubbish), and so many more failures post 'infant death' than before.
Yet nobody to run to, because they're both as bad as each other.
SSDs appear to have more suppliers, but remain rather expensive and still have rather variable reliability.
Two very simple and extremely reliable options, both better than any Heath Robinson contraption you could reasonably build:
1) A clock. After a given time, fire the rocket. Calculate the likely maximum flight time to altitude and add a bit, such that it will fire after balloon burst.
2a) A strain gauge on the cord joining the truss to the balloon (not via the parachute). When the strain drops to near-zero for longer than 5 sec or so, fire the rocket.
2b) If you want to wait for the parachute to have opened and thus stabilised the truss, use two strain gauges - one on the parachute cord, one on the balloon cord.
When parachute cord has higher strain than balloon, fire.
2a/b can be done entirely with discrete solid-state electronics (Wheatstone bridge, op-amp, monostable, MOSFET): no software or moving parts. (Avoid electrolytic caps.)
Use two identical strain gauges in both cases to compensate for temp - both should respond the same.
Can easily test this in the freezer as pressure has no effect, only temperature.
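For anyone who'd rather prototype option 2a in software first, the trigger logic is trivial - here's a sketch. The threshold and hold time are invented example values, not calculated for any real balloon:

```python
# Sketch of trigger 2a: fire once the balloon-cord strain has stayed
# near zero for a sustained period. Thresholds are invented examples.
NEAR_ZERO_N = 0.5    # gauge reading treated as "no load" (newtons)
HOLD_SECONDS = 5.0   # how long it must stay near zero before firing

def should_fire(samples, sample_period_s=0.1):
    """samples: strain readings in newtons, oldest first."""
    needed = int(HOLD_SECONDS / sample_period_s)
    if len(samples) < needed:
        return False
    # Fire only if every sample in the last HOLD_SECONDS window is near zero.
    return all(s < NEAR_ZERO_N for s in samples[-needed:])

flight = [12.0] * 100 + [0.1] * 49   # balloon pulling, then burst
assert not should_fire(flight)        # only 4.9 s of slack so far
assert should_fire(flight + [0.1])    # full 5.0 s of near-zero: fire
```

The discrete-electronics version does exactly the same thing: the bridge and op-amp produce the "near zero?" signal, and the monostable provides the 5-second hold.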
The actual weight measurement wasn't the issue.
The problem was them asking for large amounts of irrelevant data - both parents' full names, dates of birth, full address and phone number?
Is all of that really required for the described intent? Could you justify all of that data to a judge?
Put simply, the school can contact the parent(s). The 3rd party agency doesn't need to know anything at all about the child or parents outside of the data strictly necessary for the study.
They need weight, sex, year and month of birth and that's it. If they really want to do fine-grained geo-location on the data, the full postcode is already too fine-grained for useful statistical analysis.
The phone number is certainly far too much - and even the name is unnecessary.
They probably sold the data on to some other party to use - imagine the phone calls "We sell weight loss plans for your children!" (And we already know they need it...)
What did the ICO say about it?
That's the sound of you missing the point by a few AU.
There were pretty good reasons for the MAJOR.MINOR.PATCH.BUILD version numbering systems used by almost everyone (with various company variations on the BUILD bit).
Going away from that causes confusion and digs holes for both users and developers to fall into.
For example, if the devs crank up the PATCH when they make a fairly big change, users become nervous about any upgrade because "the last minor patch changed everything!"
Pretty soon it ends up in a situation like 1.20.50.xx because now there is nothing big enough to justify cranking the MAJOR. That then confuses people and support even further - when somebody says "1.2.5", do they mean 1.2.5, 1.25.xx or 1.20.50?
Alternatively if you crank the MAJOR every single time, nobody has a clue whether a change is actually significant.
That means some users don't want to upgrade because they think something big must have changed every time and they don't want to learn it. (yet)
Once they realise most are not big changes, they get scared because they cannot easily tell whether anything "big" is there and thus put off every upgrade until they have a week of nothing much going on to test the new version.
So they end up behind, because this major version was really just a patch while that one was major, and the other was minor, and they all happened way before the user got the time (or courage) to try any of them.
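The "1.2.5" ambiguity is easy to demonstrate: version components are integers separated by dots, not decimal fractions, so once any component passes 9 the spoken form collides. A toy sketch:

```python
# Version components are integer tuples, not decimals - so "1.2.5" read
# out loud on a support call could be any of three different releases.
def parse(version):
    return tuple(int(part) for part in version.split("."))

assert parse("1.2.5")   == (1, 2, 5)
assert parse("1.25.0")  == (1, 25, 0)
assert parse("1.20.50") == (1, 20, 50)

# And the ordering surprises anyone reading them as decimals:
# 1.20.50 comes BEFORE 1.25.0, even though "1.2050 > 1.250" as fractions.
assert parse("1.2.5") < parse("1.20.50") < parse("1.25.0")
```

Which is exactly why the support desk ends up playing twenty questions to work out which build the caller actually has.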
For most of the above, for "user" read "Corporate IT Dept".
Most home users will just leave the auto-update at defaults, and then get very surprised if anything changes - my mum always phones me when her "internet" changes, and I doubt she's alone.
There are so many hoops to jump through to get those to work it's unreal.
I never did figure it out properly and it appears that nobody else has either.
My software can detect the proxy and authenticate itself, but cannot use the system authentication, the user has to re-enter login details.
Firefox and many other applications are the other way around.
Even more applications (eg Dropbox) don't do either, requiring you to manually set all the proxy details.
So how are you supposed to do it? The only application that mostly manages is IE8, and that fails if there is a different proxy for HTTP and FTP.
They have been privately traded for a long time, that's what set the IPO price.
And yes, it's a record low as far back as I can be bothered to check (2010)
I disagree, company management can easily cause a business to fail.
There are many examples of a CEO or management team destroying a previously successful business.
Success is rarely directly dependent on management. In some cases it happens because management gets out of the way, in others they genuinely lead. (The latter is usually only in SMEs)
This is probably because destruction is both easier and faster than construction.
Sounds about right for the average man in the street.
I regularly see people trying to touch non-touchscreens and being surprised when it doesn't work.
1920x1080 is quite simply too small.
My previous laptop had 1200 vertical pixels, and I really notice the missing screen space in many applications.
Higher screen resolution also lets you make writing easier to read - compare this text drawn with 5x7 pixels to the text on your current monitor.
Now make the dots smaller while keeping the text the same physical size (using more dots) - again, it becomes nicer to read.
When we look at a 1080p movie, the upscaling of movie pixel to screen pixel again affords the possibility of nicer pixels - when in motion you can estimate what the 'missing' pixels would have been, increasing the effective resolution and making for a nicer movie experience.
Secondly, if the film is instead digitised at a higher resolution we'll get the real detail in the film - maybe even as high as the digital projectors used in the cinema.
Movie houses aren't going to sell those discs/licences until enough people have these 'above-HD' resolution screens for it to be worthwhile.
So who is Metro aimed at on the desktop or laptop PC?
It's not us "Power Users", because we run multiple applications simultaneously.
It's not novice users because "hot corners" and "charms" are undiscoverable and nothing looks "clickable".
It's not the intermediate users because they are used to how things look now. They are confused by minor changes (like "Start" becoming a blue blob), and find major changes genuinely frightening.
What use is that?
Right now I can glance at the taskbar, or see it out of the corner of my eye and know that my video rendering task is less than half done.
Thus it can be safely ignored, I don't need to go there and can keep on writing this bitchy post.
If I had to move the mouse on top of it to see that same information, I've got to move my hand over to the mouse, move the mouse pointer, place it over the application's icon, wait for the popup and find that it's less than half done so doesn't matter yet, then move the mouse back to my actual bitching task and my hand back to the keyboard.
In the meantime I could have bitched about many different things on the internet, or even done some actual work.
Woah, they threw away the taskbar completely?
Bloody hell, the sky HAS fallen. From a usability perspective, ITaskBarList2/3 is by far the most useful API in the whole of Windows 7.
The idea of being able to tell how far gone that massive download/upload/rendering task was and when it/they finished is simply beautiful.
WTF is it with you and this "multi-touch trackpad" malarkey? I'm really starting to think you don't actually know what a trackpad is!
TrackPADs are one of the worst pointing devices known to man, the primary limitation being that you cannot move the pointer very far before you have to lift your finger and return to the other side of the pad to move it further.
Multitouch for a trackpad means that a few shortcuts can be built into the pad - eg right-click, scroll wheel - as it can tell the difference between one, two, many and lots of fingers and a limited number of gestures.
Metro is optimised for a touchSCREEN, and is possibly the worst interface ever developed for use with a trackPAD, multitouch or not.
That's because the Metro 'start menu' requires you to move the pointer further across the screen to get to the thing you wanted than the Windows XP/Vista/7 one, and is therefore considerably worse for trackpad users.
I don't mean they are necessarily very significant, just that they're probably the biggest hidden consumption in an average family home.
What power consumption is there in a home?
Lighting, fridge/freezer, hob/oven/grill, kitchen appliances, white goods, computers, TV/STB/HiFi, chargers, heating/HVAC, garden appliances, CNC milling machine...
The average home will have more chargers than TVs/STBs, so if you assume they all consume the same quiescent power, the sheer numbers mean that they draw more in total.
In reality, they have lower-quality PSUs than TVs and STBs, and draw a bit more. The various proper measurements I've found imply they are between 450mW and 900mW when not charging.
Thus assuming the chargers are idle for 20 hours a day (4 hours to charge the phone), that's between 3.3 and 6.6kWh wasted a year by each charger!
That's between one and two kettle-hours per charger.
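The sums behind those figures, for anyone who wants to plug in their own charger's quiescent draw (the 3kW kettle used for the "kettle-hour" is my assumption; scale to taste):

```python
# Annual waste from an idle phone charger, using the figures above.
HOURS_IDLE_PER_DAY = 20   # 4 h/day actually charging the phone
DAYS_PER_YEAR = 365
KETTLE_KW = 3.0           # assumed kettle rating for "kettle-hours"

def annual_kwh(quiescent_watts):
    """kWh wasted per year by a charger left plugged in but not charging."""
    return quiescent_watts * HOURS_IDLE_PER_DAY * DAYS_PER_YEAR / 1000

low, high = annual_kwh(0.45), annual_kwh(0.90)
print(low, high)                       # roughly 3.3 and 6.6 kWh per year
print(low / KETTLE_KW, high / KETTLE_KW)  # roughly 1.1 to 2.2 kettle-hours
```

Multiply by the number of chargers in the house and it adds up - still small beer next to the fridge, but more than the TV's standby.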
You are aware that modern TVs draw under 0.5W in standby?
Considerably less than a smart meter, and probably less than a dumb one as well.
Standby is irrelevant these days.
The biggest 'hidden' consumer is probably phone chargers. They are built tiny and cheaply so have relatively high quiescent consumption, and are easy to forget about.
Some set-top boxes are pretty terrible though - some of the 'Top-up TV' ones draw a good 20W in standby, because they never actually power down.
None of these will be even visible on a smart meter display of course, as they will be hidden by the fridge or freezer, which draws small bursts whenever it warms up inside.
That worked in Windows XP, and you can do the same in Windows Vista and Windows 7.
However, you cannot turn Metro off in Windows 8.
Microsoft deliberately made that impossible - it was possible in the early developers preview, then the ability was removed and there's no sign that it's going to come back.
I've found it much nicer to use than Visual Studio 2010, notably it's much easier to bring up a program written on somebody else's computer and I've found its code modelling to be excellent.
It's free and open source.
The only annoying bit is that code panes are stuck in one window; once that is fixed it will be damn near perfect for C++ dev.
Equally, the other side has the same duty to avoid running up unreasonable costs.
Multiple delays are clearly running up avoidable costs.
So your lawyer asks for costs, which includes time taken off work.
Your lawyer should also have been pointing out that the repeated delays were unreasonable and could only be a deliberate stalling tactic aimed at increasing said costs, which should therefore be punitive.
Or did you have a useless lawyer?
There do seem to be a lot of those.
These "Smart Pens" with "Smart Paper" have been around for so long that they are the common and accepted way of signing off important documents in some areas of industry.
Two years ago (so before this patent was filed) the shipyard I was at used them for signing off things like "yes, the engine in this ship does actually work" - the kind of thing where rectifying a mistake costs millions so you want to be very sure!
(I never signed with the pen, being a pleb for a subcontractor.)
They'd already been using them for a few years before I saw them - not least for "Yes, that's the right size engine for the ship you'll be building in two years time."
Thus the prior art goes back a very long way.
Reading an early draft of EN-ISO 14819-2, I found a few slightly odd ones:
627 - No Motor Vehicles Without Catalytic Converters (Why would that change?)
628/629 - No Motor Vehicles With Even/Odd-Numbered Registration Plates
28 - Road Closed Intermittently (Huh?)
709 - Blasting Work
37 - Restaurant reopened. (What, no pub?)
1479 - Gunfire on Roadway - Danger (You don't say?)
Possibly the strangest would be: 1477 - Police Checkpoint.
Why would they advertise that?
Of course, the actual standard requires monies to be paid, and I'm not bothered enough to find out what exact changes happened in the end.