You're making too much of this
Microsoft knows better than you what you want, so get with it or get to another platform.
What is multitasking? Different people seem to mean different things when they use the word multitasking. The definition chosen has implications for accepting or rejecting the prevailing design choices of modern user interfaces. I have been a vocal critic of Windows 8's Metro interface. My chief complaint is that it does not …
Or, for a Microsoft-biased version -
50% for Internet Explorer to 'Bing' for answers
30% to create/edit config files in Notepad
20% for PowerShell
The principle is the same. Metro might be nice for Grandma browsing the Daily Mail website, but for proper work I'll look elsewhere, be it Linux or Windows XP or 7.
Metro yes. But Windows 8 is in the background, so ditch Metro unless you are going ARM.
Besides, blame Apple for Metro and the like. I'm sick of seeing iPads. I now associate iPads and iPhones with chavs, like McLaren prams and Burberry hats.
Win 8 is so overdue! die apple, die apple!
"Metro yes. But Windows 8 is in the background, so ditch Metro unless you are going ARM."
Oh, Windows is inferior to everything else on the market even if you are going ARM. Android? Better. OSX? Better -- I don't like Apple products one bit, but iPhone, iPad, etc. are essentially OSX for ARM. Linux? You can run a FULL desktop on ARM, not some crippled mono-tasking thing with no apps available. (I've seen it, if you had an ARM netbook you wouldn't know it wasn't x86 until you bring up the hardware info.) Or (for a tablet) something like Unity or the like... which although awful on a desktop is meant for touch screens.
As for OP: multitasking has a definite definition, which Windows does meet. However, I must agree the interface for Win8 looks hideous, and allowing just 1 app or a limited split (as opposed to arbitrarily resizeable and relocatable windows) is a step back to about the early 1980s. Even Works for DOS (this had like a word processor, spreadsheet, and modem app all in one) allowed for 3 or 4 things onscreen, and limited repositioning of those windows.
Windows wasn't built in a day. Don't buy the Metro-only version, that's what I'm saying. And to add to that point, this hasn't stopped the world and his chav from buying iPads, has it?
Android better than Windows? Now that's a laugh. You are comparing a cupcake to a full-sized chocolate gateau. Windows is so much more than what you are making it out to be.
Most of the time I can't even get the cursor to line up in Android.
Back when Windows ME (More Errors) came out I was running a Dell desktop and had so many problems multitasking that I bought the newest version of SUSE Linux and installed that. I immediately found I had 12 different complete desktops that would run simultaneously; what an improvement! I used that until the computer became outdated, and the new HP Pavilion I bought had Windows Vista OEM. It's been working fine for my usage, but I'm getting ready to upgrade to a 64-bit machine. Judging from what I'm hearing about Metro, I'm going back to Linux. I wonder if I can upgrade the SUSE 6.0 version I have? Anyway, Microsoft had better take heed, because I'm not the only one contemplating this. It seems the new upgrades Microsoft comes out with cost more and offer less! That is just plain wrong!
> "Jack of all trades" comes to mind.
Jack of all trades
Jack of all trades,
master of none.
Jack of all trades,
master of none,
Certainly better than a master of one.
Actually, as much as I would like to blame such silly errors on multitasking, I was not multitasking when I did my final proof-read. I had in fact maximised the window and shooed the cat away so I could concentrate. But I still missed it. Embarrassing; doubly so in context.
It is however proof of nothing more than that we are capable of mistakes even when focused. While research agrees that multitasking does raise the error rate of our activities under certain circumstances, this particular error in writing only proves one thing: that I am in fact human after all.
You get to make mistakes. You're human.
That being said, if Microsoft is going to go down the Vista path of thinking that we'll all just get with the program and buy into their "Crap is Good" OS interface philosophy, they are likely to find more people sticking with Windows 7 than they'd like. I will certainly be one of them if Windows 8 absolutely does not allow the Metro interface to be turned off and give me my choice of the usual Windows interface. I'm in no hurry to move from Vista. I have a MacBook Pro running Mac OS 10.7.4 like a champ (and won't upgrade that to Mountain Lion if it breaks, or doesn't like, my Mac OS apps).
Why? Boring reasons, generally. Vendors that only release software for Windows, developers that only write their front end in VB5, lack of first-line support staff for anything else, domain management, all the boring things. Sure, they can be worked around if you're in a company that has any concept of strategy, where there's some collective vision and suchlike. Most companies will be on the Windows treadmill till someone does away with the entire mid-to-senior management chain.
I don't know about him, but I can tell you why I am sticking with Win 7: because it IS a major upgrade over XP thanks to breadcrumbs, jump lists, MUCH better memory management, ReadyBoost (great for netbooks like mine) and, all in all, just a better-built, solid OS. Oh, and not having to run as admin, because XP couldn't handle non-admin worth spit.
As for why I will NOT be going to Win 8: the Metro UI is awful and feels like I'm fighting the OS instead of it getting out of my way; its "always on" social crud slams the network; I often have multiple tasks going (because what's the point of having a hexacore if you only run a single task at a time?); without a touch screen it is a step backwards, and I have NO desire to poke at my screen all day; the cost of replacing my screens with touch screens of similar caliber would be prohibitively expensive; its lack of customization irks me; frankly it just doesn't "feel" good when I'm using it; and finally there's the obvious contempt MSFT is showing by ignoring their customers and not giving you an option to simply use the Win 7 UI.
Look, we're all geeks here, right? We ALL know what this is: it's a "Hail Mary" by Ballmer, who has seen that while x86 is a mature market where folks won't replace the last one before it dies, ARM is undergoing a MHz war like the one that made MSFT rich in the 90s. What Ballmer seems to forget is that nobody runs Windows for MSFT software; they run it for the wealth of x86 third-party stuff that simply won't be there on ARM. If I were to buy an ARM mobile device tomorrow it'd be Android, as that is where the apps are, NOT Win 8. Since Win 7 is good until 2020 I'm advising my customers to just skip it, although I'm sure I'll make out like a bandit for a good year while people who bought a Win 8 device have me wipe it for Win 7, just as I wiped Vista machines for XP.
Well, that is rather embarrassing, now isn't it? It not only speaks to multitasking's issues (which I highlighted in the article) but also to the "mythical man month" concept. After all, three others read this through before it was handed to a sub-editor, who didn't catch it either.
Humans! We have heuristic autocorrection built into our brains!
The older I get, the more I come to believe that these autocorrection sequences correct for such minor errors without our conscious attention. Similar, perhaps, to how dyslexic people learn to read without having to concentrate.
It’s a truly fascinating topic to me; one I thoroughly enjoy researching. Given the number of my family members and friends involved in neural research lately, maybe I’ll even get the chance at some answers.
For the moment, however, yes. Facepalm indeed. And egg on my face. So on and so forth. Cheers!
I'm far more interested in why we seem to produce them with full conscious attention on the task at hand. Not to mention why our brains purposefully skim over the errors, reporting to the conscious mind that "everything is a-okay" when it is in fact not.
It is an interesting and difficult concept for most people to grasp: what we perceive with our conscious mind is not in fact reality. Just because you see something does not mean it is there. Just because you don’t see something does not mean it isn’t there.
Our minds are heuristic processors that perform all sorts of different layers of filtration on raw input before presenting it to our conscious minds for consideration. Our vision alone is a great example: there are dozens of different layers of filtration required to provide us with what we perceive to be a single, homogeneous, three-dimensional view of the world around us.
In reality, each eye is seeing a curved two-dimensional image whose resolution falls off from the center to the edges, in addition to things like our blind spot. Many of us (myself included) actually see colour differently out of each eye. Furthermore, we don't actually "see" (as in have enough photons from a given object strike our eye) everything that we "see." A lot of what we "see" is in fact provided to us by our memories of what an object "should" be.
Add to this that movement changes things. When something moves, some of these filters are actually bypassed to allow quicker access to the raw data by both our conscious minds and our brain stem. (So the endocrine system can make fight-or-flight decisions asynchronously to our relatively slow conscious decision making process.)
Our conscious minds are a high-level application running on top of a rather buggy kernel. Worse: the kernel is in love with Bayesian analysis, and the hardware sensors kind of suck. 10Mbit/sec for our shitty vision? And it requires ~2lbs of our brain dedicated to post-processing before it is even provided to applications for analysis?
Pffft. Back to the drawing board, random processes of evolution that resulted in the complex chemical interactions that allow me to bitch about things on the internets. Back to the drawing board!
In my previous life I had extensive experience in writing complex documents. A major part of the exercise involved reading it numerous times afterwards, testing for different errors.
We usually followed roughly the following regime:
Read for spelling mistakes
Read for grammar
Read for logic
Check bullets and numbering for consistency
Check headings for correctness
Give it a final scan and then hand it over to a colleague for more of the same.
You finally end up with a perfect, flawless document that you are proud to sign off, only to discover that there usually are numerous errors which most embarrassingly jump out at you when you re-read the thing (that is, if the client has not pointed them out to you already).
This used to greatly worry me, until I worked out that we read what should be there, not what is there.
The best way to avoid, no, reduce, those errors is to give the thing to someone with little or no experience or knowledge of the topic - they tend to read what is there.
<== Refers to those infernal errors, of course.
"The best way to avoid, no reduce those errors, are to give the thing to someone with no or little experience or knowledge of the topic - they tend to read what is there."
There are various tricks proofreaders employ which can help. One way to catch spelling errors, for example, is to read the text backwards, from last word to first. That makes it more difficult to read them as what you expect rather than what's actually on the page.
Similarly, some people mark up hardcopy of the text in various ways as they read, as a way to help catch fine (word omissions, homonyms, etc) and gross (structure, logic) errors.
Others find that reading aloud helps.
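The backwards-reading trick is easy enough to automate for a quick self-check. A minimal Python sketch (the function name is just illustrative, not any real tool):

```python
# Print a text's words from last to first, so each word is read out of
# context -- the proofreading trick described above for catching typos.
def backwards_words(text: str) -> list[str]:
    """Return the words of `text` in reverse order."""
    return text.split()[::-1]

sample = "We read what should be there, not what is there"
print(" ".join(backwards_words(sample)))
# -> there is what not there, be should what read We
```

Reading the output aloud, word by word, makes misspellings pop out because the brain can no longer autocomplete the sentence for you.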
Best I can tell, the brain's autocorrection feature is somewhat selfish. It will happily gloss over my own typos, omissions and grammatical errors, but if I then re-read, say, an email I wrote five minutes later (and there does seem to be some sort of minimum delay), I'll spot all of them easily.
Same with things other people write: I'll spot their problems immediately.
According to dictionary.com your "nob" is either referring to my head (nonstandard to say the least), something involving cribbage (wtf?), or you are calling me "a person of wealth or social importance." (I am neither, just by the by.)
I suspect instead you were attempting to call me a knob, which is "taboo (Brit) a slang word for penis." This is a far more context-appropriate method of mocking me for my article's word omission.
Please file this away for future reference. Cheers.
You're mistaken - the post was intended for George.
I was calling the spell check grammar check spotter a nob, not you.
According to the Mansfield dictionary, what you call knob is nob. Always has been, always will be. You cannot refer to a dictionary for globalized or national slang spellings.
I suppose if they've nicked the Dock idea and the App Store idea from Apple, it was only a matter of time before they nicked the idea of telling you to shut up and do what you're bloody told.
I dislike the Metro UI for much the same reason as Trevor. It seems a shame after finally introducing some decent keyboard-shortcut-based Window snapping functions in 7 (which combined with the Desktops add-in from Sysinternals makes 7 a very decent multitasking environment) they're ditching it in favour of Metro's "Giant Mobile Phone" approach. I've tried it repeatedly and frankly it can %^&* off.
You might be right. If they don't expect people to be 'multi-tasking' then, yeah, focus stealing makes more sense. Still damn stupid though, especially if a password entry dialog is replaced by a text editor. Or a confirmation dialog saying 'Do you want to kill yourself and wipe out all living things within ten miles' appears just as you finish typing 'today' :)
It always happens to me when I go to type in my password while staring at the other monitor, going "WTH, why no *'s?", and then realise focus has been stolen. (Applications which continually do that are deleted, or I whine to support if it's something I have paid for, force them to tell me a way to stop it, then delete it and moan on the internet.)
An example of real-world multi-tasking, which can occur easily:
- downloading a torrent
- playing a music (or video) playlist
- email and/or IM
- typing a report
While typing the report, you would like to be kept abreast of what the other apps are doing, and likely all at once. This does not take much cognitive power, as monitoring can be done "passively". This is also not a strange scenario, and is actually very likely.
So MS is correct, and the OP is correct. The apps will still run, but you can no longer monitor them all at once via Metro.
Personally, it sounds like MS is trying to justify Metro as a useful tool. It's not. If you want to do the above, just switch back to "normal". How much time the average user spends in normal vs Metro will ultimately dictate who (MS or OP) was right in the end.
(My personal view: OP)
For me, multi-tasking in the real world is something like this (note I have two large screens attached to my machine in the office, plus two large monitor screens on the wall):
Having six servers open in PuTTY running tail -f on the logs when I know the servers are being a touch flaky, because I want to catch the horrible stack trace as it happens so I can A: have the stack trace at hand and B: restart that server. Behind them I'll have a window with my load balancer, so when the incident occurs I can bring that to the fore and drain the server to perform B. While vaguely paying attention to them on the right-hand screen, I will often be switching between primary tasks, normally email, ticket queues and documenting a process (e.g. how do you install blah software correctly for our environment); that document may be on half the screen while I have one or two more PuTTY windows open for, say, actually installing the software.
I'll likely also have a window open for incident logging, for when the error I am waiting for appears on the other screen, so I can log the error, the time, and various metrics about the server.
I'll also have the rich client for the monitoring software in the background. At the same time, on the wall-mounted screens the monitoring software is running in overview mode; if I see something change state I'll swap to the monitoring on my desktop and see why the state has changed, possibly resulting in me going off into a machine somewhere to fix the issue. Also on the wall monitors are ticket queues, which will prompt me to go into my ticket system when required.
All the while I'll be talking to colleagues about how awful our systems are, how rubbish our main vendors are, and questioning what kind of person thought it was such a great idea to get such a large bunch of useless providers together in the first place.
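For what it's worth, the six-windows-of-tail -f part of a workflow like that can be scripted. A rough Python sketch, with hypothetical log paths and a deliberately crude stack-trace detector (adjust both for your own stack):

```python
# Watch several log files at once and flag lines that look like a stack
# trace -- roughly what six `tail -f` PuTTY windows accomplish by eye.
import os
import threading
import time

def looks_like_trace(line: str) -> bool:
    """Crude stack-trace detector; tune the markers for your own logs."""
    return "Exception" in line or line.startswith("\tat ")

def follow(path: str) -> None:
    """Poll `path` for new lines, printing any that look like a trace."""
    with open(path) as f:
        f.seek(0, 2)                 # jump to end of file, like tail -f
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.5)      # nothing new yet; wait and retry
                continue
            if looks_like_trace(line):
                print(f"{path}: {line.rstrip()}")

# Hypothetical paths -- one daemon thread per log, one "window" each.
for log in ["/var/log/app1.log", "/var/log/app2.log"]:
    if os.path.exists(log):
        threading.Thread(target=follow, args=(log,), daemon=True).start()
```

It doesn't replace eyeballs on the load balancer, but it does mean the stack trace is waiting for you instead of the other way round.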
Sure, I could use classic mode, but what are the chances of that working properly?
End result? Use Office in CrossOver, get the Linux version of the Citrix client, no longer worry.
Get into IT they said, it'll be fun they said!
I'd agree with that ^
Does Metro multi-task - Yes
Does Metro multi-task in a way a user wants - Not for all, no.
To me multi-tasking is when all tasks are doing their thing and not prevented from doing that. That is, if you have a stock tracker and it is tracking stock while you are doing something else, even if you cannot see what it is doing, it is still multi-tasking. If it stops tracking stock when minimised or hidden then it's not multi-tasking.
The confusion is perhaps in Task Switching, where Task Viewing would be a better term: it should mean changing what you view, not the suspending and un-suspending of tasks.
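That distinction - a hidden task that keeps working versus one that is suspended - can be sketched in a few lines of Python. StockTracker and its fields here are purely illustrative, not any real API:

```python
# Toy worker: counts "ticks" of work; optionally pauses when hidden,
# mimicking a UI model that suspends apps that are not on screen.
import threading
import time

class StockTracker:
    def __init__(self, suspend_when_hidden: bool):
        self.suspend_when_hidden = suspend_when_hidden
        self.visible = threading.Event()
        self.visible.set()                 # start "on screen"
        self.ticks = 0

    def run_for(self, seconds: float) -> None:
        deadline = time.monotonic() + seconds
        while time.monotonic() < deadline:
            if self.suspend_when_hidden and not self.visible.is_set():
                time.sleep(0.01)           # suspended: no work while hidden
                continue
            self.ticks += 1                # one unit of "tracking stock"
            time.sleep(0.01)

# Metro-style app: work stops as soon as it is "minimised"
metro = StockTracker(suspend_when_hidden=True)
t1 = threading.Thread(target=metro.run_for, args=(0.2,))
t1.start()
metro.visible.clear()                      # hide it immediately
t1.join()

# Desktop-style app: keeps ticking regardless of visibility
desktop = StockTracker(suspend_when_hidden=False)
t2 = threading.Thread(target=desktop.run_for, args=(0.2,))
t2.start()
desktop.visible.clear()
t2.join()
print(metro.ticks, desktop.ticks)  # hidden-suspended does little or no work
```

By the definition above, only the second tracker is truly multitasking: the first merely resumes when you look at it again.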
My home use: 50% of the time I'll have either a browser or media player open, often part screen, sometimes full screen. I'll then bring my messenger clients to the fore when they start flashing at me in the task bar. 20% of the time I'll be writing or doing some other thing with a similar set-up. Some of the time in these situations, when I'm really only talking to one person, I'll have the messenger app on the right-hand side of the screen at minimum width; if talking to two people I'll have them slightly overlapping so I don't need to use the task bar or alt+tab to switch (I can either click on the top of one, or the bottom of the other, to change focus). The final 30% of the time I'll be playing games.
That's for my main PC, which is plugged into the TV. My work PC at home (by work I mean the PC that doesn't have games or movies on it, and from which I can't see the TV) I use in a similar way to the work PC, except I'll also have a number of RDP sessions to the VirtualBox machines on my TV-connected desktop, and I'm not needing to monitor anything.
The only time I really have just a single program, full screen, is when I'm playing games or when I'm at home and all my friends are at work. It's probably atypical, but it is the way I do things.
The only time I really only have a single program, full screen,...
I'm appalled at the arrogance of some applications (i.e. their writers) who reckon that applications must run full-screen.
I often need to keep an eye on several screens at the same time. I may wish to cut & paste between them - although the fact that the MS Windows window manager will always pop a window to the top of the display stack if you click on it can make that a nightmare, so perhaps it's not surprising that "full screen" is so common on Windows, given that it is pretty useless at letting you handle multiple windows in a reasonable manner. Linux has no problems - you can configure it the way you want. MS Windows has the capability, but MS doesn't give you the ability to use it.
Isn't the entire point that you cannot "just switch back to normal"?
Yes, there is the "classic Desktop", but that's another Metro app and doesn't have the taskbar we're currently used to. There's no switching, and that's where the problem lies -- that Microsoft isn't even allowing us to turn off Metro.
And everyone does it differently; just splitting the screen in 2 with 66/34 won't work for 80% of us (arbitrary percentages make it sound official).
Biting the hand that feeds IT © 1998–2019