1364 posts • joined 10 Apr 2007
Re: Does not compute
I'm not entirely sure how DirectAccess could protect a light bulb... unless this light bulb is also running Windows, in which case that's a scary prospect - both the additional requirements and the sheer inefficiency of running 2GB of bloat on a light bulb (because we can bet these technologies won't be available in embedded form).
But back to the other problem... I have a single Internet connection for my home; this is shared between multiple devices and systems, as they don't have their own connections and I'm definitely not stupid enough to run an open routing gateway. Where does it make sense to put the protection? On each device, or on the gateway? Not that DirectAccess couldn't prove useful for Windows-only environments where you don't mind (or care) about the inevitable lock-in - it could be a very useful additional tool - but for protecting arbitrary devices it just doesn't read like it's the right tool.
Another Lumia? If they're not careful they'll be as bad as HTC are/were, and as Samsung are getting, with so many damn devices it's hard to know which are the turkeys and which are genuinely good devices for the money.
The build and specifications look good, and while I like some features in WinPhone - some are well thought out and work well - I find a lot of the basics extremely irritating.
As a designer and UI specialist I find (subjectively) that the interface is ghastly and there's too much "hidden" functionality that is not obvious; instead you're left searching around the interface for arbitrary ellipses (...) or swiping randomly in the hope of finding what you're looking for. But then the latest Android versions have gone backwards on this invisible-interface-of-"..." front as well, which isn't good.
Re: Coat's First Law Of Optical Media
There must be a similar observation regarding tapes as well...
I, for one, welcome our new metal bug overlords.
See below ;)
Again, we see that all you need for Computing is Maths. And yet, almost without fail, the worst developers that I've come across have also been maths graduates.
I've had Maths specialists spray me with spittle for hours on end, telling me that the "next generation" of computer languages will write themselves, there's no point in learning development and that every application can and should be mathematically described. Even the simplest of optimisation / performance demonstrations between a mathematically described process and one that's been thought about failed to sway most of these spittle spreaders. It was almost a religion to many of them.
There are many things you need to be a good developer, and a maths degree is far from the most important unless you're going to be specifically using maths in what you develop - in which case pray for the sanity of those who take over what you've developed if your maths is good but your coding skills are poor. Coding skills? Entirely separate from maths: they're more about organisation and planning, logical awareness, and experience and knowledge, with a hearty dose of artistry thrown in, than about any degree course title.
Re: No you choose your degree at 13
When I got my choices at school as to what I studied, many of the combinations that I'd have liked to do were incompatible due to scheduling. It wasn't just an arbitrary decision that I couldn't take both English and Computer Studies (as my computer course was known): there were three streams, and in one of the choices I could pick either English, Computer Studies or Art. There was some sense to it, so the more science-inclined people could choose science-related topics - hence my "maths", "science/physics" and "computer studies" selection - but I'd have liked to do English and Art, and these were excluded.
So around the age of 13 is when many of these lifelong decisions are made.
So if it's only operational when transmitting crash data in response to a crash... just how the hell is this going to help when the vehicle is stolen unless the thief subsequently crashes the vehicle?
Some things just don't add up
Last time I flew back from the US (including an internal flight) I didn't even have a passport (it was lost/stolen).
It was almost comedy... "ID?" "I don't have any, I have lost my passport and am flying home" Oh, carry on then.
Landing home in the UK "ID?" "I lost it, but am a UK citizen" "OK, fill in this form" (form filled in - basically, name and address) "OK, welcome back to the UK"
Re: 5 screens sizes
Personally I've found the annoyance of different device and screen aspects and resolutions more annoying on iOS than on Android. Not that the wide range of screen resolutions on Android isn't an issue, but it feels like I have better inbuilt tools to deal with one application and multiple resolutions and ratios than in iOS.
Yes, there are different versions of Android to deal with - currently two main ones unless you want to be cutting edge. But even that's not too hard as you can target the cutting edge and have fallback to the older versions as the support libraries work quite nicely (at times :-) ). It does require testing but if you develop applications properly and cleanly separate functionality from interface (Model - View - Controller) then even if you have entirely different interfaces it is not always that difficult to develop, after all, many of us develop apps that can be operated in landscape or portrait mode and this kind of model is normal to us.
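The model/view separation being described can be sketched in a few lines. This is a minimal illustration (plain Python rather than Android code, and all class and function names here are my own invention, not any framework's API): one model, two "views", so supporting another screen layout means adding a render function, not touching the logic.

```python
# Minimal model/view separation sketch: the model holds state and logic
# only, so any number of layouts can render the same data.
class CounterModel:
    """Knows nothing about screens, resolutions or orientations."""
    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1


def render_portrait(model):
    # One hypothetical layout: button below the value.
    return f"[ {model.count} ]\n[ + ]"


def render_landscape(model):
    # Another layout, driven by exactly the same model.
    return f"[ {model.count} ][ + ]"


model = CounterModel()
model.increment()
print(render_portrait(model))   # portrait view of the shared state
print(render_landscape(model))  # landscape view of the same state
```

The point is simply that once the functionality lives entirely in the model, an "entirely different interface" per resolution or orientation is cheap to add.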
Re: CEO of company slags off major competitor
"Jobs spent much more time during these events talking about how great Apple stuff was"
...and this is exactly how you should do it. Jobs may have had his faults, but Apple's fortunes did turn around when he was there and negative marketing (criticising competitors) is a defeatist way to operate and is usually a sign of weakness, poor judgement and lousy marketing.
Most products and systems have advantages or disadvantages compared to others, more so when the operating environments are different - and don't forget that while iOS and Android are nominally similar, their operating environments are different: Apple have a tight rein on the hardware, OS and applications, where Android is much looser and more open. [This isn't an argument as to which is "right", just stating facts - both approaches have major positive and negative points].
So when this guy starts to criticise a competitor like this (negative marketing) then it's an indication of weakness in him and likely his products too. Would you rather deal with somebody who is positive about their own products or somebody who is busy being negative about a competitor's where they should be telling you about theirs?
Well, the HDD (or other storage manufacturers) started to make a mess of things a long while ago. All in the concept of clarity, or maybe just sales and marketing lies...
The two sets of figures are now defined in parallel - the base 10 (1TB = 1000GB, 1GB = 1000MB, 1MB = 1000KB and so on) and the base 2 (1TB = 1024GB, and so on down).
So when a HDD manufacturer quotes a capacity in TB the total number of bytes compared to what they put on the packaging and what you might expect can be quite different as they'll use the base 10 values and if you're using the base 2 then you're going to be quite annoyed...
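The gap is easy to quantify. A quick sketch using nothing beyond the two standard definitions (decimal bytes as printed on the box, binary bytes as many OSes report them):

```python
# Decimal (manufacturer) vs binary (OS) interpretations of "1 TB".
DECIMAL_TB = 1000 ** 4  # 1 TB as sold: 1,000,000,000,000 bytes
BINARY_TB = 1024 ** 4   # 1 TiB as many operating systems count it

drive_bytes = DECIMAL_TB               # what's on the packaging
reported = drive_bytes / BINARY_TB     # what the OS shows
shortfall = 1 - reported

print(f"A '1 TB' drive shows up as {reported:.3f} TB (binary)")  # 0.909
print(f"That's a {shortfall:.1%} apparent shortfall")            # 9.1%
```

So roughly 9% of the advertised capacity "vanishes" at the terabyte scale, and the gap widens with each prefix step - which is exactly why people end up quite annoyed.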
But this isn't a justification for BYOD. This is a need for a responsive IT / Facilities / whatever department that is progressive enough to supply the right equipment for the job rather than a one-size-fits-nothing (bare minimum) approach.
Re: 70 per cent of US mobile workers now pay for their own kit
Yes. The article seems to be deliberately worded to make it appear that it's the purchase of (computer) equipment, where the reality is much more likely to be mobile phones or sat-navs, but then equipment could even include laptop bags, there's nothing to say that it's even electrical equipment. The original survey may clarify it more, but I don't have the time or inclination to go through a registration process to view it.
So it just looks like this "article" is just another BYOD sales advertorial trying to convince us that insanity is the best way forward as it's already happening and we should follow them off the cliff.
Any sane company that I've encountered recently has a set of laptops for their road botherers to pick from, as the size of a laptop can be quite a personal preference and tastes and requirements can vary quite drastically. Few still say "here's the one laptop model you may use", as there's little point in standardising like that: unless you buy all your laptops in one batch, by the time you next come to order one your chosen model is inevitably obsolete and has been replaced.
I see your point, but Intel pushing a bit harder (and fairly - hah!) should force ARM to get better, just like ARM gave Intel the severe kick it needed to stop producing small thermal generators and start making more power-efficient chips instead.
Intel wiping out ARM is a different prospect to Intel vs AMD, as ARM licenses its chip designs and sub-components to be manufactured by a wide variety of companies, some of which are at least as influential as Intel.
I thoroughly agree about x86 often not being the correct chip for the job. These are desktop PC chips at heart, though subverting them for use in mobile phones and tablets isn't inappropriate, as those are genuinely portable computers, often with capabilities far in excess of what we had on desktop systems ten years ago. For lesser systems, such as basic PLC and control systems, they are overkill, and the convoluted and inefficient instruction set, combined with the higher integration requirements and subsequent costs, really holds them back as well.
Competition. Yay! Good! ...and so on.
...now if they were to mandate well (and clearly) defined interfaces between systems then they could pick and choose suppliers as they feel fit and choose the solutions that are most effective, reliable and cost-effective for the job.
Unfortunately we all know that this won't happen, and from past experience with the five listed, each will assign twenty project "managers" to each project. These project managers will change every project meeting and the chance of meeting the same one twice will be slim, nobody in these five suppliers will take overall responsibility for anything, and all the actual, real work will be sub-contracted out to sub-contractors almost as frustrated as the client (Network Rail).
Re: I've said it before.
Oh no... and it's not even Friday.
My prediction, as ever, is "Yet Another Rubber Faced" loon. Like the previous few.
Noddy Holder would make it special though!
Cloud (online data storage mainly) security, the real side of it, is a continual minefield.
In the end, it's safest to work on the assumption that if the data is replicated out of the EU then it has gone somewhere insecure that has no real concept of privacy, such as the US. The US safe harbour agreements are never enforced or checked and are usually so specific in scope that your data will bypass the agreements and won't be covered by them... and the safe harbour agreements only dictate what the company may voluntarily do with your data in their possession; as noted here, they have nothing to do with legal, government or other processes.
Where I work, a small amount of our data is extremely confidential and sensitive as it relates to high-profile events and court cases, a sizeable chunk is commercially sensitive such that the company involved would not like a competitor to access it, but most is trivial and of little interest to almost everyone. To be safe we work on the basis that it's all very confidential, and as a result there's no way we can seriously consider cloud data or cloud application hosting - especially if the service provider mirrors the data outside of the EU, but also because we'd be implicitly trusting all staff, contractors and other third parties involved with the hosting.
The thing is, compared to contemporary PC games at the time of each release, Halo has always been found wanting. The less than optimal controls are fine for a console but poor in comparison to the PC FPS gamers' choice of mouse and keyboard. The game itself was unremarkable compared to the features, visuals and gameplay of the PC FPS games that arrived at the same time, and the conversion attempts of Halo from Xbox to PC really only served to highlight the gulf in quality. It's not helped that Xbox hardware was static while PC hardware always moved on, but that's the nature of console game design and development.
Not that Halo isn't a fun game - it's polished enough that it is good fun to grab some friends and shoot them, or even shoot with them! But compared to similar games it's nothing special, so its cult following on the Xbox does tend to leave PC gamers with bemused expressions.
I'll get me towel...
While some viruses do jump species, in reality complete jumps are very, very rare. We're all subject to millions of viruses every day, especially those who work or otherwise live with animals, most of these viruses are just not capable of properly infecting humans.
The reason is down to what viruses actually are. To put it in fairly basic terms, they are a form of life that cannot reproduce on its own and instead has to invade other life (cells) and hijack the mechanisms of those cells in order to reproduce itself - essentially the lowest form of parasite, and just like higher forms of parasite, they need to be specialised to do this effectively. From Wikipedia: "A virus is a small infectious agent that can replicate only inside the living cells of an organism." (http://en.wikipedia.org/wiki/Virus).

To start with, many viruses have to survive in the open environment, or at least outside of a host organism. HIV, for example, is not particularly tough and instead has to be transmitted without using the open environment... which is why you won't get HIV through (non-intimate or blood-sharing) contact with a carrier, through touching what they've touched, or by inhaling when the carrier has sneezed or coughed.

A virus then needs to be lucky enough to find cells that it can utilise, seeking these through the chemical markers of the cells, latching onto them, invading them and hijacking the inner mechanisms of the cell. Through all these steps the virus is exposed to being cleaned up, removed, disabled or rendered useless by basic biological and chemical processes - or, in the cross-species case, simply unable to get far, as it will target the wrong genes, mechanisms or chemical signatures.

Once in a cell, a virus uses and controls the cell's mechanisms to reproduce itself (some viruses also cause the cells to reproduce, which is the general cause of virus-induced cancers). Even then, if a virus causes the cell to outwardly signal that it is not working correctly, the cell, and the virus in it, will be destroyed by the body's immune system. It's not easy being a new virus!
Viruses mutate a lot, therefore when living continually with animals that are shedding viruses there is always a chance that one virus strain might happen to be able to infect a human. A certain level of "infection" is quite common but usually entirely harmless as the viruses fail at some step of their hijack and reproduction process. For example, the virus might be able to infect a cell but not to reproduce. Even should a virus happen to manage to reproduce itself, it still has to find its way from one human to another, and that's a very different problem as that involves getting out, crossing between hosts somehow (HIV takes shortcuts on this and needs direct transmission) and then evading the immune system and defences of the new potential host.
I think other primates resist (human) HIV because the virus specifically targets human cells - it has to, it's that specialised and effective. It's not a case of other primates being immune in the sense that their immune systems can deal with it; rather, other primates have enough differences in their cells that the human HIV virus has no overall success in infecting them. Other primates have their own versions of HIV, and these are similarly unsuccessful at infecting humans, being too specialised themselves.
Many of these are designed to facilitate a Bring Your Own Device (BYOD) environment, such as the ability to print using Wi-Fi Direct, share the screen using Miracast, pair with printers using near field communication (NFC), and have Windows 8.1 devices act as Wi-Fi hotspots via built-in broadband tethering.
Huh? Just how much BYOD FUD is being paid for around here?
Print using Wi-Fi Direct... most useful for tablets and laptops printing to non-domain or non-local printers, i.e. go to a friend's house, print something on their printer. A BYOD system in an office environment will connect to the local network and be given access to printers through the carefully controlled printer-access functionality that the BYOD sellers are selling.
Share a screen with Miracast. Nope, I'm at a loss. This is particularly related to BYOD how?
Pair with printers using NFC? Sounds like basically the same as Wi-Fi Direct...
Windows 8 Wi-Fi Hotspots... so, useful for the home, bugger all of use in a corporate environment and nothing to do with BYOD.
Well that's an odd one isn't it?
In theory, if a company's selling point is, like the previously mentioned Abercrombie and similar, that they have fit, healthy-looking, attractive staff to front their store, should they not be allowed to advertise for and seek out these people? Likewise, when a fashion house is parading its clothes it needs models of particular looks and sizes to model them - is it wrong to have this requirement? Don't the car companies that pick models to drape themselves over their latest luxury cars have requirements as well?
Not to say that it's right or wrong, but political correctness can go too far and some jobs do need above average staff - not necessarily just on the physical looks front either.
Re: At least with this site...
Yes. And better to accept this and understand that this is the way things are than to try to deny it.
Re: racism is racism
not forgetting to get an (Indian*) Curry, sit on your (Swiss) furniture. And the flag you're waving will probably have been made in China anyway...
* Yes, I am aware that many "Indian" dishes that we are used to originate outside of India.
Re: Special cable
As this special USB cable would sit between the computer and the standard Apple cable/circuitry would it still charge? From my understanding this would make the setup little different to a low powered plug charger.
Supporting touch is good. Even Windows 7 made huge strides on that front.
Crippling the functionality of an OS's UI so that it's only properly usable on a touch screen, that's not so good.
"But we also recognize there are many non-touch devices in use today"
As in 99.999%(*) of the things are non-touch. Which makes targeting an OS UI so that it's only properly usable on (smallish) touch screen systems a stupidity of amazing incompetence levels.
"The new tip appears anytime you move the mouse to the bottom left corner of the screen, and is always visible on the taskbar when on the desktop."
Always visible in the desktop mode is a good start. Invisible buttons (and actions) in arbitrary places on the screen are still an example of some of the worst UI design possible. If a user is not using a mouse, how are they meant to know that an arbitrary screen location has any functionality, or what it is? For those rabid lovers of keyboard-only interfaces, or MS shills: it is very important that all actions are obvious or accessible to users. Users are not telepathic; they have many experiences, expectations and capabilities, and presenting a largely blank screen to a user and expecting each one to somehow "know" to thumb the correct spot on the screen takes UI design incompetence to a new level. Even I had to google - using my phone, as the computer was unusable - to find out how to unlock the ****ing lock screen on Windows 8 the first time I came across it.
* Just an air plucked figure, but just as likely to be correct compared to many statistics.
Re: Where's The Eadon Angle?
Eadon (http://forums.theregister.co.uk/user/34672/) hasn't made any posts since Friday 24th May. Maybe he's burst a blood vessel or been kidnapped?
So MS partially correct the misuse of invisible UI elements but it's just window dressing, not real change and certainly not what users are screaming for. How not surprised do I look?
Precisely: it's not the lack of a Start menu that's the problem. The full-screen Start menu isn't the worst thing in the world if done right and with enough flexibility, so the damn thing doesn't look like a useless screen full of incomprehensible icons - which I suppose they're starting to fix. It's the idiotic invisible "you need to know to thumb here" mentality in the entire UI that's the most pressing problem. For the second most serious problem, I'm not sure whether it's the context switching between Metro (or whatever) and desktop environments and the differences between the applications in them, or the fact that half the interface is only properly usable with a touch screen - which is utterly useless for desktop systems, barely useful for laptops and only properly useful on tablets.
Subjectively I find the entire Metro UI, the capitals everywhere, the irritating lack of a way to easily find functionality and the bland icons of nothing much very ugly. But that's my opinion, some people like it.
Re: It's all about value. Time to change
To add to this, when it comes to staff who know vaguely one end of what they're selling from the other... consider the "successful" tech shops - those like Carphone Warehouse, Apple Stores and similar. These try hard to have staff who have a clue about what they're selling and can help the customer with it, giving people a reason to go to the store.
Other successful shops try, and even sometimes manage, the same as well... such as Boots (with their makeup specialists), John Lewis with their departmental product specialists, even Tesco have specialists in their mobile sub-shops. Unfortunately it doesn't always work and many stores are often staffed by uncaring imbeciles, but given the pay and working conditions this isn't always a surprise.
The closing stores, the likes of Dixons, were (are) staffed by muppets at every level. Sometimes stores slip through the cracks, like PC World, and continue to survive, but these are the exceptions, and more due to them being closer to specialised supermarkets than places to learn about what you're buying.
Re: High Street
Agreed. It used to be a running joke that all the banks and post offices in town centres had no staff on at lunchtime. Because they were at lunch themselves. While it's true that they need lunch, they are there to provide a service, and a vast number of their customers who need their services would use them in the little spare time they have while the branches are open... i.e. at lunchtime.
Some people get this service business, others don't - guess which ones are successful? A friend of mine owns a hair salon where her business plan was to provide an out-of-town salon with in-town quality and style, but at convenient hours for those that work. The result: she and her staff work Tuesday through Thursday evenings until 9pm, are open all day Saturday, and take Monday off and start work later to compensate. It's a very successful salon because it provides services at the times that are needed.
Ah.... Friday again :)
Love the bit about personal emails :)
Re: For people who knew no better
Well, Amiga corp lost the plot, which didn't help them much. They had some fantastic plans for the new releases of hardware and OS - plans that all went a "bit" (as in spectacularly) wrong, with lost designs, political infighting and generally losing the plot.
From discussions I had with Amiga engineers around then, one of the planned (or dreamed-of) features at the time was for the Amiga windowing scheme to be more tightly linked to the display hardware: if a window needed just 2 bitplanes, or 4 or 8 (or 24, I suspect), it could request this and the window area would be efficiently allocated and managed by the hardware. This way efficiency could be maintained - high-colour windows would use just the resources they required, low-colour windows would use as little as possible, and the entire display would be handled in hardware, giving extremely fast and efficient windowing operations. Even from the start Amigas had hardware pointers for the mice (it took years for Windows PCs to approach the smoothness of the Amiga mouse pointer), plus multiple, stackable displays with different colour depths and heights, and the evolution of this was to support different "screen" widths and allow them to be managed as if they were windows.
But things moved on, the Amiga unfortunately faltered and died and here we are now.
The Amiga wasn't the only system doing the rounds at the time; there were of course the Atari ST and the Archimedes as well, both very capable systems in slightly different ways. And given the choice, you could always use an expensive PC if your colour palette of choice was black, white, cyan and purple, or, if you had much more money than sense and could find one, you could buy and use a Mac - often monochrome, and usually very closely tied to the Mac's strength at the time, Desktop Publishing, but very effective at it.
Re: Let's talk photon counts and well sizes
The comparison of the outside lens area and therefore the difference in maximum photon collection is all valid. However it's the poor efficiency of the lenses, and the poor efficiency of the sensors that are the biggest problem. A larger collecting (outside) lens will help and if the quality of the lens and the sensors are the same, then it will obviously be better. However there's much more to it than that...
If the sensor is twice as wide and high in one camera (quadrupling the available area) then the lens does not need to focus the light as tightly, allowing better image detail as there is inherently more accuracy through improved tolerances.
The sensors are also inefficient: in current devices a large proportion of the incoming light is lost as it hits support circuitry and gaps between sensors and sensor groups rather than hitting the sensors. Improvements to the sensor orientation (vertically stacked circuitry) and other techniques are coming but don't appear to have made it to market yet. I can't remember the exact figures, but right now it's something along the lines of 80% of incoming photons not hitting a sensor. Better collection results in a wider sampling range and therefore more accuracy across the scale.
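A rough back-of-the-envelope sketch of the two effects being discussed (the 80% loss figure is the recollection above, not a measured value, and the dimensions and flux are made-up illustrative numbers):

```python
# Rough photon-budget comparison of two sensors (illustrative numbers only).
def collected_photons(width_mm, height_mm, fill_factor, photons_per_mm2):
    """Photons actually landing on active sensor area, not lost to
    circuitry and gaps (fill_factor = fraction of area that collects)."""
    return width_mm * height_mm * fill_factor * photons_per_mm2


FLUX = 1_000_000  # arbitrary photons per mm^2 for the comparison

small = collected_photons(4.5, 3.4, fill_factor=0.2, photons_per_mm2=FLUX)
large = collected_photons(9.0, 6.8, fill_factor=0.2, photons_per_mm2=FLUX)

# Doubling width and height quadruples collection at the same fill factor...
print(f"Large/small ratio: {large / small:.1f}")  # 4.0
# ...while eliminating an 80% loss would be worth another 5x on its own.
print(f"Perfect fill factor gain: {1.0 / 0.2:.1f}x")  # 5.0
```

In other words, if that 80% figure is anywhere near right, fixing the fill factor is worth more than a sensor twice the linear size.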
I'm tempted by one like the Rikomagic that was linked in the first post here... connect to Wi-Fi, install XBMC, configure access to the NAS, and job done: one more media centre deployed. It's the additional wires that may be needed that could be annoying: these things are unlikely to attach directly to a display, or will look appalling sticking out of the side if they do, so they'll need a (short) extension cable.
But the same would apply to the original item this post was about: it may often not fit without an extension cable, and would need the hassle of external power as well.
Re: Maybe a bit flawed?
I've given up paying any real attention, or suspending disbelief, whenever I see Forrester cited as the source...
They're just paid to write "research" articles that reflect whatever their customer wants. Therefore there's the inevitable BYOD mention in this report and a lot of statistics that frankly given the source figures, or even reality, could be spun to reflect whatever message Forrester's paying customers want.
Re: Real science
@AC - May 2013 04:48 GMT
Please use more CAPITALS. Foaming at the mouth comes across much better on the Internet when typing all in caps.
Re: Let's hope LM get the accelerometer the right way up this time.
Most well-designed industrial electrical or electro-mechanical components (other than the lowest-level parts) are designed so that they cannot be fitted in an incorrect orientation. It's always possible to stuff up the original drawings, but with modern circuit software and simulations, if the part is correctly described then an incorrect orientation should be flagged up very quickly. Unfortunately I've come across a lot of custom parts that don't stick to this basic principle and have seen a lot of destroyed components as a result - and have done some of this destroying myself. :)
One-off pieces of machinery like these probes are largely assembled by humans, and designing from the ground up on the basis that the assembling human will insert parts incorrectly given the slightest opportunity to do so is the right way to design. Unfortunately I have come across "engineers" who, when faced with parts that didn't fit in the orientation they tried, bent or removed pins to force the part to fit rather than rotating the component 180 degrees - so you can't always protect yourself from idiots, but I'd hope that the NASA team employed better assemblers than these.
Competition is definitely good.
Improvements to IDEs for Android are also good, and while Intel's development tools don't have the widescale use of others, they are very competent in places.
Re: Planned for 90 days, still going after 9 years...
Sounds like most "temporary solutions"...
Re: Seriously, PHP?
Hmmm... So because you don't like PHP and/or can't write neatly in it then nobody should?
Not that PHP doesn't feature a lot of stupid, especially in the various attempts at classes and objects, but like any flexible language you can choose to hang yourself with your code or write code neatly.
Bring on the idiotic holy language wars...
Maybe it should be: If you can't grasp assembly then you shouldn't be writing code...
Re: Nokia was right
Where does it show that other Android mobile makers aren't making money? They may not be making as much as Samsung, thanks to Samsung's sales volume and the economies of scale that brings, but they can still make money.
Re: Hang on...
Completely. But sanity has nothing to do with this: every one of these 3.2% definitely, absolutely, cost the media industry $millions each. That's $millions in sales they'd never have got in the first place.
Not to justify it, just to quantify against some form of reality.
There's an interesting disparity between fair, moral and the letter of the outdated (or caught out) law.
For example, I've downloaded ebook copies of the series "A Song of Ice and Fire" (Game of Thrones). Why? A friend lent me the physical books, in the way that we've borrowed and lent books for years, however I really didn't want to lug a dead tree or two around when I can have the convenience of them on my mobile device. I still have the (lent) pile of books on my shelves, I'm not lending them out to anyone else... but I'm still a pirate.
Likewise, I've downloaded copies of most of the movies that I own. An entire floor-to-ceiling cupboard of DVD-sized boxes takes up a lot of space, and having them all on one central media server (XBMC) makes a lot of sense; we've watched some great movies that we'd previously forgotten we owned as a result of losing them in the piles. It's worth remembering that the whole point of creative media is to be consumed - it's no good sitting there on a shelf - although the media cartels seem to think that it's all about making them richer.

Why download? It's quicker and less hassle than ripping them myself, and I can usually download them in better quality than I can rip them as well. Once ripped, we don't have to sit through the unskippable "piracy is theft" lies fronted on all the DVDs (or the unskippable adverts on Disney films) and can go straight to the movie itself - who really cares about the extras? To extend this, there's the annoyance of Blu-ray with its useless further extra features and the extreme tedium of waiting to get the disc to the point where you can actually start watching what you bought it for. But technically I'm still a pirate.
Re: Oh Dear
Look at the current world's top smartphone manufacturer...
Does this company produce phones for different OSes, hedging their bets? Yes. They might not have a wide range of devices, but they do have a wide range of OSes, thereby keeping knowledge and skills they might otherwise have lost. It also keeps the suppliers on their toes, as they know they need to keep improving.
Does this company somehow manage to promote their brand over the brand of the Operating System? Yes. The platform / Operating System is the enabler, not the crutch.
Now look at Nokia. They have one smartphone OS and they trumpet this as a sales ploy.
Re: If only there could be "Silvermont"-based netbooks
Wasn't it Intel that purposely hamstrung netbooks by restricting what was allowable when using its chips and chipsets? They could have been good...
From the marketing junk I was subjected to, ribbons were introduced to replace toolbars, not menus. The MS hack "demonstrated" just why ribbons were so good, and why every user was an idiot, by enabling every single toolbar in MS Word and going "voila - look how little space this leaves the user for documents; this is why we 'invented' ribbons". A sane attendee (who was shouted down as if he was an idiot) pointed out that no user enables every single toolbar simultaneously, that many of them appeared on demand, and that the ribbon and other new UI crud took more space than toolbars and menus.
The big problem with the Win 8 shell (UI) is that it's an aborted mess of touch-screen-optimised controls, half baked with a few non-touch interfaces thrown in, and an overriding feel of "how the hell do I do something?" as everything is hidden away. The missing Start button is just one of these idiocies... users need visual prompts that they have options (functions) available; designating random portions of a display that a user needs to thumb to bring up some functionality never has been, and never will be, good intuitive user interface design. There is no such thing as a fully intuitive user interface (it's not possible), but that's no excuse not to try, and the next best thing is a consistent interface.
Re: Not exactly...
Agreed. They are leaps ahead of what they were before. It's no longer a case of "argh, integrated Intel chipset with an obscure set of digits that'll take a day to track down" and more a case of "that's not so bad, it works OK now".
They're not speed demons, but at least now they are perfectly adequate for the majority of computer users' needs. i.e. Using a computer in place of a typewriter and browsing the web a bit.