1026 posts • joined 10 Apr 2007
Re: Excellent Stuff!
Yep. And before this article I didn't even consider how lifts and lift designs were tested. Obvious sense that they are and should be, but it's nice to see it so well (and visibly) demonstrated.
Re: Holy shit ...
Hahahaha.. That's almost as utterly messed up as the Monster Cables website.
So far the gem of the site has to be the "13 amp - High Performance Hi-Fi Fuses"... from only £34.94.
And the inevitable techno-babble bullshit: "High performance fuses will protect the circuit but the integrety [sic] of the supply line is vastly upgraded, Like a good power cable allow 50+ hours to bed in."
I'm also going to have to have serious words with my "bad" power cables that have a very nasty habit of obeying the laws of physics (and sense) and tend to work straight away without requiring 50+ hours to start to work properly.
Just to add to the flickering (light) topic here... human eyes have considerably more movement sensors on the periphery than in the centre, interestingly balanced by having almost no colour sensors in the periphery, where we see in monochrome and the brain fills in the detail with what it remembers (or guesses from experience).
As a result, many household bulbs don't flicker when looked at directly, but look (!) at them from the corner of your eye and you'll see the flicker. This flicker can also be seen when the light is reflecting off a surface. It's one of the (many) causes behind offices fitted with fluorescent bulbs giving staff headaches. Interestingly, the flicker is also one of the reasons that these bulbs often come in pairs (or more), as this not only gives fail-over in the event of tube failure but also reduces the impact of the flicker through it being masked by neighbouring tubes.
Incandescent bulbs also flicker due to the power supply frequency, but the effect is negligible as they operate by heating an element and this element does not cool enough between cycles for the flicker to be noticeable. However, you can make the flicker visible by using a high-output bulb and reducing its output to minimum with a dimmer switch.
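To put a number on the flicker: the light output of a mains-powered bulb peaks on both the positive and negative halves of each AC cycle, so the flicker sits at double the mains frequency. A trivial sketch:

```python
# Light output peaks on both halves of each AC cycle, so the
# flicker frequency is double the mains frequency.
def flicker_hz(mains_hz: float) -> float:
    return 2 * mains_hz

print(flicker_hz(50))  # UK mains (50 Hz) -> 100 Hz flicker
print(flicker_hz(60))  # US mains (60 Hz) -> 120 Hz flicker
```

That 100 Hz is fast enough for central vision to fuse, but peripheral vision's motion sensitivity can still pick it up.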
Re: Alternative to Windows?
Re: Start button - Never used it anyway.
Unfortunately you are a "power user" and have entirely missed the point.
99.999% of Windows users are not power users. They have not memorised arcane keyboard shortcuts; they often don't even know that there are keyboard shortcuts (not helped by MS's stupid insistence on hiding them as much as possible). They can just about wield a mouse in anger, often don't understand the shift key compared to the caps-lock key, or know how to cursor through text instead of hitting backspace until they find the offending mistake and then retyping what they just deleted.
Users require an indication that there is some functionality available; this is a basic, fundamental aspect of good user interface design. They should not be given an artistically blank and meaningless screen and be expected to somehow "know" how to bring up some functionality on it by clicking / thumbing arbitrary screen locations. This is why buttons were created in user interfaces (and hyperlinks in HTML documents): to give a user an indication that there is an action available. Unfortunately we've now gone backwards, and the "artistic" (form over function) trend is to hide anything vaguely functional, so a user is left having to randomly thumb an interface or patiently wave a mouse cursor over it hoping for something interesting to be revealed.
For what it's worth, I've been a specialist in User Interface design for over 20 years. I also use keyboard shortcuts extensively :)
Re: Sourceforge & Classic Shell to the rescue! (@AndrueC)
Precisely, I've done that for years on dedicated kiosk systems where they need to login (auto-login is a feature that's been there for a long time) and setting the explorer.exe replacement for that particular user.
It's staggering just how many developers of "embedded" (kiosk) systems using full Windows don't know this and calmly boot to explorer as the shell then auto-load their application. All it takes is a user to alt-tab and they have total control of the system. If a maintenance user of the embedded / kiosk system needs access to the explorer shell, it's a simple matter of running explorer.exe from within the kiosk application and access is given.
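For reference, the explorer.exe replacement mentioned above is just a per-user registry value. A sketch from memory - the Winlogon key path is standard, but the kiosk application path here is hypothetical, and your Windows version/policy has to permit per-user shells, so verify before deploying:

```reg
Windows Registry Editor Version 5.00

; Replace the shell for the auto-login kiosk user only (HKCU, not HKLM,
; so other users still get the normal explorer shell).
[HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\Winlogon]
"Shell"="C:\\KioskApp\\kiosk.exe"
```

Delete the value (or log in as a different user) and explorer comes back as the shell.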
Re: *looks at Eadon and laughs*
Rebuild to change the IP address?
I think you're getting confused with the nightmares of Windows NT4 service packs. Arrrrgghhh! I still get shivers just thinking about the farcical things we had to do to NT4 just to make what should have been simple changes.
As for Linux vs BSD - they share a lot of code and features and there's a lot of movement going both ways. This makes a lot of sense and saves reinventing the wheel code wise.
Partly Political Broadcast
They already do... those kinds of entertainment shows have to be fronted by a clear, unambiguous "This is a Partly Political Broadcast on behalf of <insert party here>" statement.
This notifies of an upcoming show where we all need to be prepared to suspend our disbelief circuits and get our chunder buckets ready to watch the grinning lunatics hug otherwise previously innocent, unsoiled children.
When you describe it like that, I completely agree about having a good range of devices - that have differentiating features.
The problem I have is that we're left looking at a line of devices that are (superficially) very similar to look at and who's to know which ones are actually worth the money and which ones are more land-fill?
Take a look at this page: http://www.htc.com/uk/smartphones/
Aside from the two Windows 8 devices (proving that it's not just Nokia that make them), the rest of the phones on the page are distinguishable by small variations in size (don't forget, they're all scaled to one size), by name... err HTC One - One SV, One X+, One XL, One X, One S and One V... wtf? By the marketing tag rubbish such as "Simply stunning" or "Exceptional performance comes standard" and the inevitable near 5 star rating lies that we expect to see on a manufacturer's own website.
There's probably only one or two phones on that page that are worth the bother for the money, a couple that are penis extensions for those with money and the rest? Probably land fill.
Looking at a similar Nokia page: http://www.nokia.com/gb-en/phones/lumia/ the problem's the same. Other than a couple of more rounded, possibly smaller, devices they all largely look the same and feature the same baffling product numbers that seem to make little sense and have no order to them on the page.
Another Lumia? If they're not careful they'll be as bad as HTC are/were and Samsung are getting, with so many damn devices it's hard to know which are the turkeys and which are genuinely good devices for the money.
The build and specifications look good, and while I like some features in WinPhone as some are well thought out and work well, I find a lot of the basics extremely irritating.
As a designer and UI specialist I find (subjectively) that the interface is ghastly and there's too much "hidden" functionality that is not obvious; instead you're left searching around the interface for arbitrary ellipses (...) or swiping randomly in the hope of finding what you're looking for. But then the latest Android versions have gone backwards on this invisible "..." interface front as well, which isn't good.
Re: Does not compute
I read it that DirectAccess functionality is required on every device on your network. :)
Re: Does not compute
I'm not entirely sure how DirectAccess could protect a light bulb... unless this light bulb is also running Windows, in which case that's a scary prospect - both the additional requirements and the sheer inefficiency of running 2 GB of bloat on a light bulb (because we can bet these technologies won't be available on the embedded form).
But back to the other problem... I have a single Internet connection for my home; this is shared between multiple devices and systems as they don't have their own connections, and I'm definitely not stupid enough to run an open routing gateway. Where does it make sense to put the protection? On each device, or on the gateway? Not that DirectAccess couldn't prove useful for Windows-only environments where you don't mind (or care about) the inevitable lock-in - it could be a very useful additional tool - but for protecting arbitrary devices it just doesn't read like it's the right tool.
My guess, on this much more important question than Crysis framerates ;), is that it'll be able to complete the task in a time span marginally longer than the total read time of the disc. Unfortunately Blu-ray drives are not exactly known for read speed...
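A back-of-envelope sketch of that read time. The disc size and drive speed here are my own illustrative assumptions (a full 50 GB dual-layer disc, a 2x drive, 1x BD nominally 36 Mbit/s = 4.5 MB/s), not figures from the article, and real drives vary in speed across the disc:

```python
# Back-of-envelope: time to read a full dual-layer Blu-ray disc.
DISC_BYTES = 50 * 10**9   # dual-layer BD capacity (decimal GB)
BASE_SPEED = 4.5 * 10**6  # 1x BD read speed, bytes per second

def read_time_minutes(speed_multiplier: float) -> float:
    return DISC_BYTES / (BASE_SPEED * speed_multiplier) / 60

print(round(read_time_minutes(2)))  # ~93 minutes at 2x
```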
Armed forces are running their own locked down network that's not the Internet? That's revolutionary! Maybe some of ours might like to consider that doing that (properly) is a good idea?
Re: Coat's First Law Of Optical Media
There must be a similar observation regarding tapes as well...
I, for one, welcome our new metal bug overlords.
See below ;)
Again, we see that all you need for Computing is Maths. And yet, almost without fail, the worst developers that I've come across have also been maths graduates.
I've had Maths specialists spray me with spittle for hours on end, telling me that the "next generation" of computer languages will write themselves, there's no point in learning development and that every application can and should be mathematically described. Even the simplest of optimisation / performance demonstrations between a mathematically described process and one that's been thought about failed to sway most of these spittle spreaders. It was almost a religion to many of them.
There are many things you need to be a good developer; a maths degree is far from the most important unless you're going to be specifically using maths in what you develop, in which case pray for the sanity of those that take over what you've developed if your maths is good but your coding skills are poor. Coding skills? They're entirely separate from maths - more about organisation and planning, logical awareness, experience and knowledge, with a hearty dose of artistry thrown in, than any degree course title.
Re: No you choose your degree at 13
When I got my choices at school as to what I studied, many of the combinations that I'd have liked to do were incompatible due to scheduling. It wasn't just an arbitrary decision that I couldn't take both English and Computer Studies (as my computer course was known); there were three streams, and in one of the choices I could pick either English, Computer Studies or Art. There was some sense to it, so the more science related people could choose science related topics, hence my "maths", "science/physics" and "computer studies" selection, but I'd have liked to do English and Art and these were excluded.
So around the age of 13 is when many of these lifelong decisions are made.
So if it's only operational when transmitting crash data in response to a crash... just how the hell is this going to help when the vehicle is stolen unless the thief subsequently crashes the vehicle?
Some things just don't add up
Last time I flew back from the US (including an internal flight) I didn't even have a passport (it was lost/stolen).
It was almost comedy... "ID?" "I don't have any, I have lost my passport and am flying home" Oh, carry on then.
Landing home in the UK "ID?" "I lost it, but am a UK citizen" "OK, fill in this form" (form filled in - basically, name and address) "OK, welcome back to the UK"
Re: 5 screens sizes
Personally I've found the annoyance of different device and screen aspects and resolutions more annoying on iOS than on Android. Not that the wide range of screen resolutions on Android isn't an issue, but it feels like I have better inbuilt tools to deal with one application and multiple resolutions and ratios than in iOS.
Yes, there are different versions of Android to deal with - currently two main ones unless you want to be cutting edge. But even that's not too hard, as you can target the cutting edge and fall back to the older versions, since the support libraries work quite nicely (at times :-) ). It does require testing, but if you develop applications properly and cleanly separate functionality from interface (Model - View - Controller) then even with entirely different interfaces it is not always that difficult to develop; after all, many of us develop apps that can be operated in landscape or portrait mode, and this kind of model is normal to us.
Re: CEO of company slags off major competitor
"Jobs spent much more time during these events talking about how great Apple stuff was"
...and this is exactly how you should do it. Jobs may have had his faults, but Apple's fortunes did turn around when he was there and negative marketing (criticising competitors) is a defeatist way to operate and is usually a sign of weakness, poor judgement and lousy marketing.
Most products and systems have advantages or disadvantages compared to others, more so when the operating environments are different - and don't forget that while iOS and Android are nominally similar, their operating environments are different: Apple have a tight rein on the hardware, OS and applications where Android is much looser and open. [This isn't an argument as to which is "right", just stating facts - both approaches have major positive and negative points].
So when this guy starts to criticise a competitor like this (negative marketing) then it's an indication of weakness in him and likely his products too. Would you rather deal with somebody who is positive about their own products or somebody who is busy being negative about a competitor's where they should be telling you about theirs?
Well, the HDD (or other storage manufacturers) started to make a mess of things a long while ago. All in the concept of clarity, or maybe just sales and marketing lies...
The two sets of figures now run in parallel - the base 10 (1TB = 1000GB, 1GB = 1000MB, 1MB = 1000K and so on) and the base 2 (1TB = 1024GB, and so on).
So when a HDD manufacturer quotes a capacity in TB, the total number of bytes can be quite different from what you might expect from the packaging: they'll use the base 10 values, and if you're thinking in base 2 then you're going to be quite annoyed...
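The size of the resulting annoyance is easy to work out - a quick sketch comparing the two conventions:

```python
# A "1 TB" drive is labelled in decimal units: 10**12 bytes. Tools that
# report in binary units (1 TiB = 2**40 bytes) show the same drive as
# roughly 9% smaller, hence the annoyance.
decimal_tb = 10**12   # bytes on the box: 1 TB, base 10
binary_tib = 2**40    # bytes in 1 TiB, base 2

ratio = decimal_tb / binary_tib
print(f"A 1 TB drive is {ratio:.3f} TiB")          # ~0.909 TiB
print(f"i.e. {1 - ratio:.1%} 'missing' capacity")  # ~9.1%
```

The gap compounds with each prefix step: about 2.4% at MB, 4.9% at GB, 7.4% at TB in the strict binary reading.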
But this isn't a justification for BYOD. This is a need for a responsive IT / Facilities / whatever department that is progressive enough to supply the right equipment for the job rather than a one-size-fits-nothing (bare minimum) approach.
Re: 70 per cent of US mobile workers now pay for their own kit
Yes. The article seems to be deliberately worded to make it appear that it's the purchase of (computer) equipment, where the reality is much more likely to be mobile phones or sat-navs, but then equipment could even include laptop bags, there's nothing to say that it's even electrical equipment. The original survey may clarify it more, but I don't have the time or inclination to go through a registration process to view it.
So it just looks like this "article" is just another BYOD sales advertorial trying to convince us that insanity is the best way forward as it's already happening and we should follow them off the cliff.
Any sane company that I've encountered recently has a set of laptops for their road botherers to pick from, as the size of a laptop can be quite a personal preference and tastes and requirements can vary quite drastically. Few still say "here's the one laptop model you may use" as there's little point in standardising like that: unless you buy all your laptops in one batch, by the time you next come to order one your chosen model is inevitably obsolete and has been replaced.
I see your point, but Intel pushing a bit harder (and fairly - hah!) should push ARM to get better just like ARM's given Intel the severe kick it needed to stop producing small thermal generators and start making more power efficient chips instead.
Intel wiping out ARM is a different prospect to Intel vs AMD, as ARM licenses its chip designs and sub-components to be manufactured by a wide variety of companies, some of which are at least as influential as Intel.
I thoroughly agree about the x86 often not being the correct chip for the job. These are desktop PC chips at heart, and while subverting them for use in mobile phones and tablets isn't inappropriate (these are genuinely portable computers, often with capabilities far in excess of what we had on desktop systems ten years ago), for lesser systems such as basic PLC and control systems they are overkill, and the convoluted and inefficient instruction set combined with the higher integration requirements and subsequent costs really holds them back as well.
Competition. Yay! Good! ...and so on.
...now if they were to mandate well (and clearly) defined interfaces between systems then they could pick and choose suppliers as they feel fit and choose the solutions that are most effective, reliable and cost-effective for the job.
Unfortunately we all know that this won't happen, and from past experience with the five listed, each will assign twenty project "managers" to each project. These project managers will change every project meeting and the chance of meeting the same one twice will be slim, nobody in these five suppliers will take overall responsibility for anything, and all the actual, real work will be sub-contracted out to sub-contractors almost as frustrated as the client (Network Rail).
Re: I've said it before.
Oh no... and it's not even Friday.
My prediction, as ever, is "Yet Another Rubber Faced" loon. Like the previous few.
Noddy Holder would make it special though!
Cloud (online data storage mainly) security, the real side of it, is a continual minefield.
In the end, it's safest to work on the assumption that if the data is replicated out of the EU then it has gone to somewhere insecure that has no real concept of privacy, such as the US. The US Safe Harbor agreements are never enforced or checked and are usually so specific in use that your data will bypass the agreements and won't be covered by them... and the Safe Harbor agreements only dictate what the company may voluntarily do with your data in their possession; as noted here, they have nothing to do with legal, government or other processes.
Where I work, a small amount of our data is extremely confidential and sensitive as it relates to high profile events and court cases; a sizeable chunk is commercially sensitive, such that the company involved would not like a competitor to access it; but most is trivial and of little interest to almost everyone. To be safe we work on the basis that it's all very confidential, and as a result there's no way we can seriously consider cloud data or cloud application hosting, especially if the service provider mirrors the data outside of the EU - even then we'd be implicitly trusting all staff, contractors and other third parties involved with the hosting.
The thing is, compared to contemporary PC games at the time of each release, Halo has always been found wanting. The less than optimal controls are fine for a console but poor in comparison to the PC FPS gamers' choice of mouse and keyboard. The game itself was unremarkable compared to the features, visuals and gameplay of the PC FPS games that arrived at the same time, and the conversion attempts of Halo from XBOX to PC really only served to highlight the gulf in quality. It's not helped that XBOX hardware was static and PC hardware always moved on, but that's the nature of console game design and development.
Not that Halo isn't a fun game; it's polished enough that it's good fun to grab some friends and shoot them, or even shoot with them! But compared to similar games it's nothing special, so its cult following on the XBOX does tend to leave PC gamers with bemused expressions.
I'll get me towel...
While some viruses do jump species, in reality complete jumps are very, very rare. We're all subject to millions of viruses every day, especially those who work or otherwise live with animals, most of these viruses are just not capable of properly infecting humans.
The reason is down to what viruses actually are. To put it in a fairly basic manner, they are a form of life that cannot reproduce on its own and instead has to invade other life (cells) and hijack the mechanisms of these cells in order to reproduce itself - essentially the lowest form of parasite, and just like higher forms of parasite, they need to be specialised to do this effectively. From Wikipedia: "A virus is a small infectious agent that can replicate only inside the living cells of an organism." (http://en.wikipedia.org/wiki/Virus).

To start with, many viruses have to survive in the open environment, or at least outside of a host organism. HIV, for example, is not particularly tough and instead has to be transmitted without using the open environment... which is why you won't get HIV through (non-intimate or blood-sharing) contact with a carrier, through touching what they've touched, or by inhaling when the carrier has sneezed or coughed.

A virus then needs to be lucky enough to find cells that it can utilise, seeking these through the chemical markers of the cells, latching onto these cells, invading them and hijacking the inner mechanisms of the cell. Through all these steps the virus is exposed to being cleaned up, removed, disabled or rendered useless by basic biological and chemical processes - or, in the cross-species case, simply not able to get far as it will target the wrong genes, mechanisms or chemical signatures. Once in a cell, a virus then uses and controls the cell's mechanisms to reproduce itself (some viruses also cause the cells to reproduce, which is the general cause of virus-induced cancers). Even once in a cell, if a virus causes the cell to outwardly reflect that the cell is not working correctly then the cell, and the virus in it, will be destroyed by the body's immune system. It's not easy being a new virus!
Viruses mutate a lot, therefore when living continually with animals that are shedding viruses there is always a chance that one virus strain might happen to be able to infect a human. A certain level of "infection" is quite common but usually entirely harmless as the viruses fail at some step of their hijack and reproduction process. For example, the virus might be able to infect a cell but not to reproduce. Even should a virus happen to manage to reproduce itself, it still has to find its way from one human to another, and that's a very different problem as that involves getting out, crossing between hosts somehow (HIV takes shortcuts on this and needs direct transmission) and then evading the immune system and defences of the new potential host.
I think other primates are resistant to (human) HIV because the HIV virus specifically targets human cells - it has to be, it's so specialised and effective. It's not a case that other primates are immune, implying that their immune systems can deal with it; rather, other primates have enough differences in their cells that the human HIV virus has no overall success in infecting them. Other primates have their own versions of HIV and these are not successful in infecting humans as, similarly, they are too specialised.
Many of these are designed to facilitate a Bring Your Own Device (BYOD) environment, such as the ability to print using Wi-Fi Direct, share the screen using Miracast, pair with printers using near field communication (NFC), and have Windows 8.1 devices act as Wi-Fi hotspots via built-in broadband tethering.
Huh? Just how much BYOD FUD is being paid for around here?
Print using Wi-Fi Direct... most useful for tablets and laptops printing to non-domain or non-local printers, i.e. go to a friend's house and print something on their printer. A BYOD system in an office environment will connect to the local network and be given access to printers through the carefully controlled printer-access functionality that the BYOD sellers are selling.
Share a screen with Miracast. Nope, I'm at a loss. This is particularly related to BYOD how?
Pair with printers using NFC? Sounds like basically the same as Wi-Fi Direct...
Windows 8 Wi-Fi Hotspots... so, useful for the home, bugger all of use in a corporate environment and nothing to do with BYOD.
Well that's an odd one isn't it?
In theory, if a company's selling point, like the previously mentioned Abercrombie and similar, is that they have fit, healthy-looking, attractive staff to front their store, should they not be allowed to advertise and look for these people? Likewise when a fashion house is parading their clothes, they need models of particular looks and sizes to model them; are they wrong to have this requirement? Don't the car companies that pick models to drape themselves over their latest luxury cars have requirements as well?
Not to say that it's right or wrong, but political correctness can go too far and some jobs do need above average staff - not necessarily just on the physical looks front either.
Re: At least with this site...
Yes. And better to accept this and understand that this is the way things are than to try to deny it.
Re: racism is racism
not forgetting to get an (Indian*) Curry, sit on your (Swiss) furniture. And the flag you're waving will probably have been made in China anyway...
* Yes, I am aware that many "Indian" dishes that we are used to originate outside of India.
Re: Special cable
As this special USB cable would sit between the computer and the standard Apple cable/circuitry would it still charge? From my understanding this would make the setup little different to a low powered plug charger.
Supporting touch is good. Even Windows 7 made huge strides on that front.
Crippling the functionality of an OS's UI so that it's only properly usable on a touch screen, that's not so good.
"But we also recognize there are many non-touch devices in use today"
As in 99.999%(*) of the things are non-touch. Which makes targeting an OS UI so that it's only properly usable on (smallish) touch-screen systems a stupidity of amazing incompetence levels.
"The new tip appears anytime you move the mouse to the bottom left corner of the screen, and is always visible on the taskbar when on the desktop."
Always visible in the desktop mode is a good start. Invisible buttons (and actions) in arbitrary places on the screen are still an example of some of the worst UI design possible. If a user is not using a mouse, how are they meant to know that an arbitrary screen location has any functionality, or what it is? For those rabid lovers of keyboard-only interfaces, or MS shills... it is very important that all actions are obvious or accessible to users. Users are not telepathic. They have many experiences, expectations and capabilities, and presenting a largely blank screen to a user and expecting each one to somehow "know" to thumb the correct spot on the screen takes UI design incompetence to a new level. Even I had to google, using my phone as the computer was unusable, to find out how to unlock the ****ing lock screen on Windows 8 the first time I came across it.
* Just an air plucked figure, but just as likely to be correct compared to many statistics.
Re: Where's The Eadon Angle?
Eadon (http://forums.theregister.co.uk/user/34672/) hasn't made any posts since Friday 24th May. Maybe he's burst a blood vessel or been kidnapped?
So MS partially correct the misuse of invisible UI elements but it's just window dressing, not real change and certainly not what users are screaming for. How not surprised do I look?
Precisely, it's not the lack of a start menu that's the problem. The full-screen start menu isn't the worst thing in the world if done right and with enough flexibility, so the damn thing doesn't look like a useless screen full of incomprehensible icons - which I suppose they're starting to fix. It's the idiotic invisible "you need to know to thumb here" mentality in the entire UI that's the most pressing problem. I'm not sure whether the second most serious problem is the context switching between Metro (or whatever) and desktop environments, and the differences between the applications in them, or the fact that half the interface is only properly usable with a touch screen - utterly useless for desktop systems, barely useful for laptops and only properly useful for tablets.
Subjectively I find the entire Metro UI, the capitals everywhere, the irritating lack of a way to easily find functionality and the bland icons of nothing much very ugly. But that's my opinion, some people like it.
Re: It's all about value. Time to change
To add to this when it comes to staff who know vaguely one end of what they're selling from another... consider the "successful" tech shops, those like Carphone Warehouse, Apple stores and similar. These try hard to have staff that have a clue about what they're selling and can help the customer with it, giving people a reason to go to the store.
Other successful shops try, and even sometimes manage, the same as well... such as Boots (with their makeup specialists), John Lewis with their departmental product specialists, even Tesco have specialists in their mobile sub-shops. Unfortunately it doesn't always work and many stores are often staffed by uncaring imbeciles, but given the pay and working conditions this isn't always a surprise.
The closing stores, the likes of Dixons, were (are) staffed by muppets at every level. Sometimes stores slip through the cracks, like PC World, and continue to survive, but these are the exceptions and are more due to them being closer to specialised supermarkets than places to learn about what you're buying.
Re: High Street
Agreed. It used to be a running joke that all the banks and post offices in town centres had no staff on at lunch time - because they were at lunch themselves. While it's true that they need lunch, they are there to provide a service, and a vast number of their customers who need their services would use them in the little spare time they have... i.e. at lunch time.
Some people get this service business, others don't - guess which ones are successful? A friend of mine owns a hair salon where her business plan was to provide an out-of-town salon with in-town quality and style, but at convenient hours for those that work. The result... she and her staff work Tuesday through Thursday evenings until 9pm, are open all day Saturday, and have Monday off and start work later to compensate. It's a very successful salon because it provides services at the times they are needed.
Ah.... Friday again :)
Love the bit about personal emails :)
Re: For people who knew no better
Well, Amiga Corp lost the plot, which didn't help them much. They had some fantastic plans for the new releases of hardware and OS, plans that all went a "bit" (as in spectacularly) wrong amid lost designs, political infighting and a general loss of direction.
From discussions I had with Amiga engineers around then, one of the planned (or dreamed-of) features at the time was for the Amiga windowing scheme to be more closely linked to the display hardware: if a window needed just 2 bitplanes, or needed 4 or 8 (or 24, I suspect), it could request this and the window area would be efficiently allocated and managed by the hardware. This way efficiency could be kept: high colour windows would use just the resources they required, low colour windows would use as little as possible, and the entire display would be handled in hardware, giving extremely fast and efficient windowing operations. Even from the start Amigas had hardware pointers for the mice (it took years for Windows PCs to approach the smoothness of the Amiga mouse pointer) and multiple, stackable displays with different colour depths and heights, and the evolution of this was to move to supporting different "screen" widths and allow them to be managed as if they were windows.
But things moved on, the Amiga unfortunately faltered and died and here we are now.
The Amiga wasn't the only system doing the rounds at the time; there were of course the Atari ST and the Archimedes as well, both very capable systems in slightly different ways. And given the choice you could always use an expensive PC, if your colour palette of choice was black, white, cyan and purple. Or, if you had much more money than sense and could find one, you could buy and use a Mac - often monochrome and usually very closely tied to the Mac's strength at the time, Desktop Publishing, but very effective for it.
Re: Let's talk photon counts and well sizes
The comparison of the outside lens area, and therefore the difference in maximum photon collection, is all valid. However it's the poor efficiency of the lenses and the poor efficiency of the sensors that are the biggest problems. A larger collecting (outside) lens will help, and if the quality of the lens and the sensors are the same then it will obviously be better. However there's much more to it than that...
If the sensor is twice as wide and high in one camera (quadrupling the available area) then the lens does not need to focus the light as tightly, allowing better image detail as there is inherently more accuracy through improved tolerances.
The sensors are also inefficient: in current devices a large proportion of the incoming light is lost as it hits support circuitry and gaps between sensors and sensor groups rather than hitting the sensors themselves. Improvements to the sensor orientation (vertically stacked circuitry) and other techniques are coming but don't appear to have made it to market yet. I can't remember the exact figures, but right now it's something along the lines of 80% of incoming photons not hitting a sensor. Better collection results in a wider sampling range and therefore more accuracy across the scale.
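To illustrate the scale of the two effects together - note the 20% fill/efficiency figure is just the rough guess from above, not a measured value, and the sensor sizes are arbitrary:

```python
# Rough illustration: photons actually captured scale with sensor area
# times the fill/efficiency factor (fraction of light landing on
# photosites rather than circuitry and gaps). The 0.2 figure is the
# guessed ~20% efficiency from the text, purely illustrative.
def effective_photons(incoming: float, area_ratio: float,
                      fill_factor: float) -> float:
    return incoming * area_ratio * fill_factor

small = effective_photons(1_000_000, area_ratio=1.0, fill_factor=0.2)
large = effective_photons(1_000_000, area_ratio=4.0, fill_factor=0.2)
print(large / small)  # 4.0 - a sensor 2x wider and 2x taller
                      # captures 4x the light at the same efficiency
```

The same arithmetic shows why fill-factor improvements matter as much as area: doubling the fill factor on the small sensor would halve its deficit against the large one.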
I'm tempted by one like the Rikomagic that was linked in the first post here... connect to Wi-Fi, install XBMC, configure access to the NAS and job done, one more media centre deployed. It's the additional wires that may be needed that could be annoying: these things are unlikely to be attachable directly to a display, and will look appalling stuck out of the side if they are, so they'll need a (short) extension cable.
But the same would apply to the original item this post was about: it may often be unfittable without an extension cable, and it would need the hassle of external power as well.
Re: Maybe a bit flawed?
I've given up paying any real attention, or suspending my disbelief, whenever I see the source is Forrester...
They're just paid to write "research" articles that reflect whatever their customer wants. Therefore there's the inevitable BYOD mention in this report and a lot of statistics that frankly given the source figures, or even reality, could be spun to reflect whatever message Forrester's paying customers want.