234 posts • joined Wednesday 7th May 2008 13:11 GMT
Re: No surprises in any of that.
I clearly detect some bitterness in this comment. I'm afraid there's no getting around the immutable fact that no joypad offering has yet come close to the speed and precision of a mouse and keyboard.
I used to be a UT champion back in the day, and watching people play COD on the consoles, it's like they're swimming through treacle. I used to regularly spin 180 degrees mid-jump and instagib 2-3 players, and I wasn't the fastest! The console world is a far cry from this, as sweaty thumbs mash away at laughably inaccurate "analogue" nipples, and turning a full 360 takes longer than reading the Sunday supplement. This is why PC and console players are kept separate in the vast majority of online gaming.
The other genre in which console players seem laughably oblivious to their unwieldiness is racing games. I really enjoy a good blast round a track now and then, but do so rarely because it means I HAVE to unpack the steering wheel. I won't play without it.
I've even had people try to argue that the joypad is superior. If the joypad really was better, or even if it wasn't suicidally dangerous, someone would have fitted a real production car with one, as the novelty value would sell.
If the thought of someone using a PlayStation controller to guide a one-ton vehicle down the outside lane of the motorway gives you cold sweats, clearly you're not a serious gamer if you use one for the digital variety.
No, the real reason the PC is in decline is that the serious gamer is in decline. A PC gaming rig is expensive to buy and maintain, requiring constant upgrades, so it takes serious incentive to invest. Now look at the games market: nothing but repetitive sequels, as the games giants are too afraid of taking risks given the massive cost of development these days.
Was I the only person who read this article and heard the voice of Nathan Explosion screaming "It's gotta be brutal!"?
Unfortunately I've dealt with enough of these types of customer service issues...
to know with almost absolute certainty that the customer in question here is your typical ignorant arsehole.
He will have been the kind who in general doesn't listen to simple instructions, has no clue how to operate a computer, and will have installed a million browser toolbars, six unwanted antivirus programs, and every piece of crapware available on the internet within minutes of plugging the damn thing in.
He then phones some poor hapless call centre support bod, and shouts for fifteen minutes demanding a brand new computer.
This all too typical genus is incapable of learning, because he/she refuses to accept just how useless they are at using a computer (never actually reading a message that pops up on the screen before clicking wildly in the hope of getting a new screensaver), and therefore makes the call centre bod's extremely difficult job simply impossible.
I mean, it takes incredible diplomacy to explain to someone who's patiently listening that "the reason it's messed up again is basically because you're an idiot" without offending them. Our poor call centre tech bod (not generally known for their social skills) would stand no chance with Mr Shouty here.
Some of you may think I'm being a little harsh toward the customer, but the evidence is in the story itself. An intelligent, patient man would be more interested in getting his computer working. While obviously annoyed by the unnecessary trip, he would chalk it up to a simple misunderstanding, and hey, the PC is here now, and they're willing to fix the problem, so what's the problem?
I'm surprised Mac fanbois aren't proud of this moment
I mean, it's almost as if somebody out there now considers a Mac to nearly be a real computer!
Somebody's actually sat down and made the effort to write a virus for it. Why? That's the real question. What are they going to gain? A huge collection of sepia-toned pictures of hipsters drinking lattes?
No, this is a sign. It's a sign that there's finally something of worth contained within them! Perhaps...
very good, but there are some schoolboy errors in the design
The first and foremost is that users are allowed ONE email address, no more.
This was a disaster when our corporate policy dictated a change in our default email reply-to address.
Firstly, as soon as a user emailed the helpdesk from their new email address (Spiceworks didn't seem to pick up the change in Active Directory), it generated an entirely new user.
This immediately fractured the ticket history. Secondly, you cannot now update the original user's email address, as it has to be unique, and the freshly created user (with no details other than an email address) is now reserving it.
OK, so you delete the newly generated user, sacrificing the associated ticket, and change the address in the original account. Nope. Deleting a user doesn't remove it from the SQL table (it just marks it as hidden), and it continues to reserve the email address. So you've lost the ticket, and still can't fix the problem of the incorrect email address.
This is when you have to get your hands dirty in the SQL tables. Deleting the newly generated user record allows you to update the original user account in Spiceworks, but now the helpdesk crashes.
It didn't take long to figure out why (although the Spiceworks logging is pretty woeful compared to most MS products).
To completely fix it, you have to do a search and replace for the erroneously created user ID in the comments, ticket_involvements, and tickets tables, replacing any reference to that ID with the original.
Of course, if a user then decides to use a different one of their email aliases (they all have several options) the whole fiasco will begin again.
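For anyone facing the same mess, the manual fix boils down to something like the sketch below. Spiceworks keeps its data in a SQLite file, but the table and column names here (users, created_by, user_id) are assumptions reconstructed from the behaviour described above, not the actual schema, so verify them before running anything like this against a live install.

```python
import sqlite3

def merge_duplicate_user(db_path, dup_id, orig_id, new_email):
    """Fold a spuriously created user back into the original account.

    Column names below are assumptions, not the real Spiceworks schema.
    """
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    # Re-point every reference to the duplicate ID at the original user
    for table, col in [("comments", "created_by"),
                       ("ticket_involvements", "user_id"),
                       ("tickets", "created_by")]:
        cur.execute(f"UPDATE {table} SET {col} = ? WHERE {col} = ?",
                    (orig_id, dup_id))
    # Actually delete the duplicate row (the UI only hides it), freeing
    # the unique email address for the original account
    cur.execute("DELETE FROM users WHERE id = ?", (dup_id,))
    cur.execute("UPDATE users SET email = ? WHERE id = ?",
                (new_email, orig_id))
    conn.commit()
    conn.close()
```

The delete has to happen before the email update, or the unique constraint on the address fires.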
I really like Spiceworks. I really like the fact it's free, but I would have some very serious reservations about paying for something with such fundamental issues. Especially now that I've seen under the bonnet!
Re: Media Center and their extenders
@stu_ekins – Yes, I do have a TV licence again. Posting anon though, due to references to download activity, which I continue to do simply because the legal alternatives don’t provide the same quality of services.
Indeed, there are some good products out there, but you have to wade through an enormous pile of dross to find them. On top of this, brown goods documentation is notoriously light on technical info, making your purchase decisions all the more difficult. And to be frank, there is no financial incentive for the manufacturer to provide additional support and upgrades once you've purchased the item, so the many comments here indicating an industry-wide dearth of firmware updates, while sad, come as no real surprise.
One major gripe I have is the complete gamble you have to take with HDMI CEC support. Because it was only an optional component of the standard, the features that work are almost random. Sony barely support it at all, as they have their own solution (which only works on Sony products).
Try finding the supported range of CEC commands for your next TV/hi-fi/DVD player before purchase! I used to work as a multimedia technician, setting up everything from lecterns for presentations to turnkey kiosks, and the very first thing we learned was that if you wanted to make the installation work properly, be idiot-proof enough for unassisted use, and future-proof for new features, you had to use a PC.
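The CEC lottery can be illustrated with a toy compatibility check: the commands you can actually rely on across your setup are the intersection of what every box in the chain implements. The device names and command sets below are invented for illustration; real support varies per model and is rarely documented.

```python
# Made-up support matrix: which CEC-style commands each box implements
CEC_SUPPORT = {
    "TV":        {"power_on", "power_off", "volume_up", "volume_down", "source_select"},
    "amp":       {"power_on", "power_off", "volume_up", "volume_down"},
    "bd_player": {"power_on", "source_select"},
}

def usable_everywhere(devices):
    """Commands you can rely on across the whole chain: the intersection
    of what every device in the setup actually implements."""
    sets = [CEC_SUPPORT[d] for d in devices]
    common = set.intersection(*sets) if sets else set()
    return sorted(common)
```

With all three boxes in the chain, the only command left that works everywhere is power-on, which is about how it feels in practice.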
If something on a PC doesn’t support a certain feature, doesn’t work in the required manner, or doesn’t integrate with your pre-existing configuration, you can change it. Without a computer as the hub, all the individual components that make up your desired solution are a bit like the shapes and holes puzzle. Except that all the shapes were moulded by the same toddlers the puzzle was aimed at. You will get things to fit eventually, but it will never be seamless.
My parents, both in their 70s, are not complete luddites, but they struggled so badly with their smart TV/DVD/PVR/set-top box/hi-fi and the many remotes (oh, god, the remotes!) that I offered to set up an alternative.
Now, with a single remote, they can easily navigate their TV, streaming and catch-up, music, pictures, recordings, and 1.4TB of movies, TV shows, documentaries and stand-up through a single unified interface.
Seriously, have a look on YouTube at MediaBrowser 3. I've tarted it up a bit more than the videos show, so it looks even prettier, and it's astonishingly easy to use.
And I haven’t even touched on the low cost of upgrades to support new features. This all started with a conversation about them buying a new TV to get DVB-T2 support. They now have this, AND Blu-ray, for less than the price they were intending to pay.
Re: More grunt and customisation needed.
Never buy Sony!
This is not a new problem, and it's a very deliberate tactic. Bear in mind, Sony are the single biggest muscle behind the War on Piracy (ironic, since their tape recorder very nearly didn't get released because of exactly the same kind of legal action).
Since the start, Sony CD players haven't recognised various flavours of recordable CD, their DVD players plead ignorance if you put a disc in them that isn't exactly the right colouration to be legit, and support for any digital format that could be linked to unpaid downloading is conspicuous by its absence.
Add to that their woeful support of HDMI CEC (preferring to try and foist their own proprietary solution on you, and lock you into all-Sony kit) and the fact that their reputation for quality has far exceeded their ability to deliver for the last 20 years…
I'd agree with Paul 87 and therums about the Internet, but I'd like to submit XP as another factor
My reasoning is thus:
Before XP, the home and professional markets were completely separate, their two methodologies as alien to each other as carbon- and silicon-based life forms.
If you were designing software for NT, then your target market was clearly identified as a networked, business environment, and you designed your software appropriately.
This meant compliance with networking and security standards. Your software had to be resilient and flexible enough to cope with the myriad network configurations and ACL restrictions, and of course you were answerable to your multinational client with its army of lawyers.
If you were writing software for the home market, on the other hand, it was much more of a Wild West. Games were dumped in the root of C: so that they could be quickly navigated to in DOS, and rules were merely standing in the way of gleaning a couple more FPS out of your game.
You were actually rewarded for bypassing standards, blitting the hardware and taking shortcuts.
Along came XP, and these two worlds collided with such force that we are still feeling the chaotic repercussions today. When the NT kernel became the platform for both, XP was flooded with rule-breaking games and hastily banged-out code by teenagers in their bedrooms.
This quickly gave rise to the situation we are all familiar with: you had to run as nothing less than admin for all your software to work. This bore a vicious circle, with small developers, lacking the resources to fully research the intricacies of the NT platform, simply assuming that this was the norm.
As evidence I submit my time as sysadmin in a school, five years on from XP's release. The niche software, sometimes written by programming teams of one, would make a security consultant break down in tears: storing config files in the Windows folder, ignoring the registry, making assumptions about profile folder rights… I could go on… and on…
Even Mozilla are guilty of many similar faux pas, which is why you don’t see any real corporate take-up. The sudden influx of lazy and/or hacker coders gave birth to a compromised NT environment that lasted more than a decade, giving rise to an entire new generation of coders who believed that this was the way things should be done.
I’ve only recently seen a change in trends with the proliferation of Win7. If the UAC comes up at any time you’re not installing NEW software, the programmer has done it wrong. End of story. The UAC is embarrassing a lot of corporations into going back and doing it the right way, but we’ve still a long way to go.
Perhaps Win8's Android-esque declaration of rights at install time will push things further in the right direction?
If you try to suggest to any knowledgeable IT heads that they switch their corporate usage to Firefox or Chrome, you will be laughed out of the office, and rightly so.
Neither Firefox nor Chrome can be centrally managed to any level that would warrant even a few seconds' consideration. They are barely even written correctly for the Windows platform.
Until only a few revisions ago, Firefox used to store its internet cache in the roaming profile folder, and Chrome used to install itself in the application data folder.
These are primary-school mistakes that barred them from any serious network infrastructure on their own, but it doesn't end there. How do you set the corporate homepage? While the bigwigs in management might be allowed to play, how do you lock down all the configuration options so the plebs in the call centre don't drum up hundreds of support requests a day? How do you restrict Java so it can only be used on your intranet?
You can't even prevent them from installing crippling browser toolbars. When you've got 30,000 workstations to run and maintain on multiple sites, these aren't inconveniences. These issues will bring the world crashing down around you.
As a network administrator, your priorities most often run by uptime of services, and then security. Where are their central update services? How do we ensure that every copy on every workstation has been patched against the vulnerability that has just gone wild?
In just the user section of Group Policy Management, IE has 801 configuration options, allowing you to customise everything from the proxy, homepage, ActiveX filtering, AJAX cross-document messaging, autocomplete, plugins and allowable downloads to a myriad of other things that could potentially be security vulns.
These options can be applied based on the user, the computer, a safe list of websites, IP range, certificate validity, and more, or any combination of the above to provide such granular control that a good sysadmin can lock down the browser to almost read-only levels when entering the wild, yet allow unprecedented access within intranet applications, without the user being aware.
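The zone idea at the heart of this is simple enough to sketch: resolve a URL to a zone, then apply that zone's settings. This toy version (the zone names, networks and settings are illustrative assumptions, not the real Group Policy schema) shows the shape:

```python
from urllib.parse import urlparse
import ipaddress

# Illustrative intranet definition: an IP range plus a DNS suffix.
# Both values are made up for this sketch.
INTRANET_NETS = [ipaddress.ip_network("10.0.0.0/8")]
INTRANET_SUFFIX = ".corp.example"

# Locked down when "entering the wild", permissive on the intranet
ZONE_POLICY = {
    "intranet": {"java": True,  "activex": True,  "downloads": True},
    "internet": {"java": False, "activex": False, "downloads": False},
}

def zone_for(url):
    """Classify a URL as intranet or internet by host/IP."""
    host = urlparse(url).hostname or ""
    try:
        ip = ipaddress.ip_address(host)
        if any(ip in net for net in INTRANET_NETS):
            return "intranet"
    except ValueError:          # hostname rather than a literal IP
        if host.endswith(INTRANET_SUFFIX):
            return "intranet"
    return "internet"

def effective_policy(url):
    return ZONE_POLICY[zone_for(url)]
```

The real mechanism layers user, computer, site list and certificate checks on top, but the principle is the same: the browser's capabilities follow where the request is going, invisibly to the user.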
By comparison, an unmanaged copy of Firefox can do as much damage as a virus.
Oh, and if you want to moan at someone for the lingering existence of IE6, look to your own profession. Do you think we enjoy our users moaning about that crappy old version? Or that we like being out of date on patching? It's because the sprawling corporate intranet hasn't been updated to cope with the newer versions (and certainly won't work with Chrome!).
I'm not calling your ilk lazy. Naive, yes. That intranet is probably tied into 100,000 POS systems whose OS is hard-coded in such a way as to make it unfeasible to upgrade.
I'm by no means against the basic principle of add-ons. Plugins, filters, bolt-ons, themes and extensions have been the saving grace of many applications. The problem comes from too many, or badly designed, add-ons that cripple performance or cause instability in the host application.
Another feature I quite like in IE9 is the way the add-on manager not only keeps you alerted to changes, but also displays the performance hit you accrue with each one.
Given the amount of time I spend using web-based systems, disabling Java made a huge improvement. Alas, the only way I could streamline the browser any further would be to disable the AV add-on.
I only re-enable Java when I have to deal with certain network switches and printers etc. I can't remember the last time I stumbled on an actual website that required it.
The real attraction of IE for me is that I no longer NEED the myriad of add-ons that were prerequisites in the past. Back in the glory days of FF, websites weren't such resource hogs, so if your browser was a little flabby round the midsection, it was barely noticeable.
It still amazes me today how much of a drain a website can be on your system. I've frequently seen browser instances of all three top 300MB of memory usage. With this in mind, streamlining makes a lot of sense.
Actually, in terms of security, resources and stability, Firefox is about the worst at the moment.
IE8 introduced the accelerator feature, which is pretty awesome once you know how to get the most out of it.
IE9 then introduced the pinned sites feature. This is a major boon, as a pinned site acts more like an installed application than a webpage. Sites that support this feature, such as Facebook, Outlook etc., can display notifications on their taskbar icon. This is an extremely useful feature for webmail!
Oh, and who wants yet another browser-slowing plugin (FF, I'm looking straight at you!) like Adblock, when you can import the ad list URL into InPrivate Filtering to achieve the same result?
By comparison, other browsers have demonstrated little or no innovation that has really been a game changer, for me at least. Firefox came close with its new tab management system, but while I was quite excited initially, my usage quickly subsided, consigning it to the 'gimmick' category.
Mozilla have screwed up royally to lose their userbase to Chrome so badly.
Don't get me wrong: I was a devout FF user for many years, until later versions became slow, unstable and massive resource hogs. I also hate the combined URL/search box in IE and Chrome! I've gone through the gamut of browsers over the years. These days, when I need raw performance for something like a Flash game, I load the site in Chrome. But I soon start to miss the notification indicator and the ease of quick searches using accelerators, so it's not long after the game that I revert to IE9. If IE10 delivers on performance, then Chrome will get kicked to the wayside too.
I do pity those who can't keep up with the times. Sorry, but the 90s called asking for their opinions back.
Thank goodness for SCCM
Thankfully, I managed to create a custom task sequence to fix all the clients.
Using file inventory, I managed to create a collection query that listed all the machines containing the agen-xuv.ide file.
I then advertised a task sequence that ran:
net stop savservice
It then deleted said file (several caveats for differing install locations, x64 etc.)
net start savservice
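The guts of that task sequence could be sketched as a little script along these lines. The service name comes from the post, but the install paths, the file handling and the `run` hook are illustrative assumptions; the real fix ran as SCCM task sequence steps, not Python.

```python
import os
import subprocess

# Candidate install locations (x86/x64 variants); these paths are
# assumptions for the sketch, not confirmed Sophos defaults.
CANDIDATE_DIRS = [
    r"C:\Program Files\Sophos\Sophos Anti-Virus",
    r"C:\Program Files (x86)\Sophos\Sophos Anti-Virus",
]

def remove_bad_ide(filename="agen-xuv.ide", dirs=CANDIDATE_DIRS,
                   run=subprocess.call):
    run(["net", "stop", "savservice"])    # stop the AV service first
    removed = []
    for d in dirs:
        path = os.path.join(d, filename)
        if os.path.exists(path):
            os.remove(path)               # delete the offending identity file
            removed.append(path)
    run(["net", "start", "savservice"])   # bring the service back up
    return removed
```

The `run` parameter exists purely so the service stop/start can be stubbed out when dry-running the deletion logic.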
This filtered through and cleaned about 6,000 clients in two hours. I'm just glad I have VPN and RDP on my massively oversized Android phone. I had 90% of the solution in place while I was still on the bus to work.
Our poor email server is another matter - thankfully, not under my care!
All fanboi-ism to one side...
I held off getting a smartphone for a very, very long time. At first it was the data plans/contracts. A smartphone is just a very expensive (and bulky) phone if you can't use a data connection, so until unlimited data contracts fell into line, I wasn't interested.
By which time, WinPho7 was on the horizon. Being a sysadmin of a largely MS estate, the idea of having the same OS on my phone as on my managed network appealed greatly, conjuring dreams of vastly improved integration and manageability. As any sysadmin will tell you, mobile devices have steadily encroached into our lives like an unmanaged viral outbreak. BlackBerry doesn’t go as far as we would like, and Apple are just a joke for Sarbanes–Oxley.
WinPho7 also appealed on a home level, with aspirations that my phone would seamlessly tie into my homegroup, unifying my media and communications experience.
When WinPho7 arrived though, the reality fell far short of the dream. It is NOT related to Win7, as we all know.
So, I plumped for Android as my personal choice, and our corporate mobile policy rattles on, muddled and semi-isolated from the rest of our infrastructure.
While I do really like Android on my HTC Sensation, I’ve already seen the adverse effects that market fragmentation has had on Google's OS. The problem is that the end device specs vary wildly. I wish I’d held out a little longer and got the One X, like my friend. Ever since installing the Ice Cream Sandwich update, response times have been just a little sluggish, and occasionally grind to a crawl when too many apps get left open. At the other end of the spectrum, a great many apps could have had a little more polish, but you get the distinct feeling the developer was going for compatibility with lower-specced models.
I'm loath to admit it, but side by side, the Facebook app on iOS is just that little bit nicer and more responsive, and this loses me vital bragging points down the pub against my much-loathed Apple-toting comrades.
Two things in my mind make Win8 stand out. Firstly, it is essentially the same OS that desktops and laptops will ship with, so should tie into a server 2012 domain very nicely. Obviously optimistic speculation, but in theory, group policy management, centralised updating, and unified message integration should be as easy to manage as the desktop estate. In the home, MS have already done a wonderful job of making Win7 home computers play nicely together, so hopefully they’re planning to up the game even further in Win8.
Secondly, MS have chosen to tightly control the hardware specs. While this does reduce opportunities for innovation, it makes the lives of app developers a whole lot easier, and should in turn mean a smoother, slicker experience for the end user.
It’s still early days, with much dust to settle. While some commenters have expressed scepticism about how much influence this legal wrangling will have, it is not the only variable on the battlefield. MS have a lot of catching up to do, but they show a lot of promise. Perhaps, while the two giants are fighting, it will give MS enough elbow room to push ahead?
Re: Lets not just blame java here
"In how many other OS's could a virus get in through a NON priviledged account"
The OS did NOT let the virus in; the JVM did. If I remember correctly, the last worm to successfully exploit a Windows vulnerability to actively spread from one machine to another without user intervention was the Blaster/Sasser worm. I was running a school network at the time, and although Blaster successfully exploited the RPC vulnerability, the students' machines were so heavily locked down via group policy that the process elevation attempts failed, thanks to certain services being disabled.
There have been ActiveX exploits, but any sysadmin with half a brain can lock those down using the internet zone group policy settings.
Since then, almost all viral infections have used either social engineering tricks or the unholy trio: Acrobat, Flash and Java.
The Windows platform of today features ACL control over the filesystem, registry, and active process utilisation in such granular detail that it far outstrips any *nix variant. It features address space layout randomisation superior to that offered by Linux or OSX. It has a very capable firewall built in and enabled as standard. Almost all network traffic is PKI-encrypted by default. Hard disks can be hardware-encrypted to FIPS 140-2 compliant levels.
But, a chain is only as strong as its weakest link. The problem with the MS platform today is not the underlying OS, but the plethora of badly written software that requires diligent sysadmins to punch dirty great holes in these security features to make them work.
And running any platform without some antivirus software is reckless at best, idiotic at worst.
Java is an elegant language on paper
But the VMs and general implementations are shockingly bad.
I've managed a fair few IT systems within schools in my time, and have therefore been introduced to "educational software". From what I gather, this term means "the developer is trying to educate himself in basic coding practice", and is usually sat at the back of the class with a dunce cap on his head.
Yes, you guessed it. Out of all the many appalling, steaming piles of useless code that crossed my desk, the worst examples always contained large chunks of Java. It doesn't even matter much if the developer has some semi-decent skills, as Sun/Oracle will manage to screw it up with the next release.
I agree completely with foo_bar_baz. Because of its inevitable unreliability, Java should always remain within static environments like a printer BIOS or on a server.
Re: Do we need to talk about radiation?
Thank you, sir, for that truly epic Godwin's Law reference!
I spat my coffee clear across the desk!
Radioactive Hitler! We're doomed!
The reason they leave the liver/kidney
It's the same reason they can be so picky about the food they eat. Their upper palate is extremely sensitive to ammonia, which is given off by decaying and potentially harmful/poisonous meat.
Cats are truly magnificent predators. They are one of the few species on this planet whose digestive tract is optimised purely for a carnivorous diet. You'd be surprised how many "carnivores" or "herbivores" can actually eat alternatives. Even pandas show a preference for carrion when they can find it.
Their entire body is optimised as the perfect predator.
Hearing with directional/distance location accuracy surpassed only by the barn owl.
Natural camouflage in their coats.
Retractable claws and soft paw pads allowing for incredible stealth.
Vertical slit iris optimised to detect rapid horizontal movement.
Reflective retina for night vision.
To name but a few evolutionary specialisms in one of the most successful mammalian predators on the planet.
Yes, as several others have pointed out, VLC's a tired old dog. One of the most annoying features is the delay when you hit pause.
What, am I watching this on VHS?
Media Player Classic is a truly well-rounded app. To get around the codec issue, simply download the K-Lite codec pack and choose "Lots of stuff" during the install, and you get MPC included.
Voila: you have a highly functional media player, with codecs to play just about anything, even the Bink and Smacker A/V codecs EA and Codemasters commonly use as the format for their in-game videos. Crucially, because they are system codecs, your other player apps can use them too, improving overall system flexibility.
For disc writing, I've sworn by Ashampoo Burning Studio (ver. 6, the free one) for years, and can count the writer drives I've been through more easily than the discs. Tiny memory footprint, extremely stable, fully featured, delightfully lacking in bloatware, and I can run multiple instances, writing to multiple drives simultaneously.
I've used XBMC, and while I have no argument over cross-platform issues, the interface is appallingly unintuitive for novice users. I know this because I had a system with it set up for my parents in their living room.
No matter how much I tweaked the preferences or themes, they found it just too difficult to navigate.
The solution came with MediaBrowser, an open source plugin for Windows Media Center which combines similar functionality, incredible beauty and, crucially, ease of use. It means my parents, rather than watching telly, frequently browse through the 2TB of movies and TV using the Media Center remote, with the same ease that they use the DVD player.
It's certainly a testament to the robustness of SD memory card technology.
I'm struggling to imagine another data storage medium that could withstand a tour of the human digestive tract.
Mind, I'm now struggling to imagine another medium that COULD take a tour of the digestive tract.
I can't get the mental image of a man trying to swallow a VHS tape out of my head now.
Re: what is its application
As the article extensively mentioned, render farms are the biggest application.
3D ray tracing (if you can even still call it that, given its umpteen generational jumps from the original concept) for movies requires truly vast quantities of number crunching for every frame of the movie.
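To give a feel for why this takes farms of machines: the innermost loop of a ray tracer is something like the ray-sphere intersection below, repeated for every pixel, every bounce, every frame. A minimal sketch:

```python
import math

def hit_sphere(origin, direction, centre, radius):
    """Return the distance to the nearest intersection of a ray with a
    sphere, or None on a miss. This is the core arithmetic a renderer
    repeats millions of times per frame."""
    # Solve |origin + t*direction - centre|^2 = radius^2 for t (a quadratic)
    oc = [o - c for o, c in zip(origin, centre)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                       # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearest of the two roots
    return t if t > 0 else None
```

Each pixel is independent of its neighbours, which is exactly why the workload parcels out so cleanly across thousands of cores.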
One thing that piqued my curiosity is that I couldn't see any fans or PSU, which suggests there is some kind of blade-like enclosure this will slot into.
I'll bet anything on 3rd party software causing this
My first thought was that it could possibly be a false positive from antivirus software, but I'd guess it's more likely a failed malware infection attempt causing the DLL to not be installed/updated correctly during the patching process.
It all sounds surprisingly good
I do like the idea of kite-marking software. Simply embarrassing software companies into complying with the standards set out for the platform they have chosen to code for would be a huge boon to overall safety.
As a sysadmin, this has been the bane of my life, and the primary reason the Windows platform has been such an easy target. Even going back as far as XP SP2, in the right hands it was a pretty secure platform. With internet security zones and a draconian group policy lockdown, you could make a Windows box pretty resilient.
Until, that is, you tried to use any third-party software. At which point you found yourself turning off every safety feature, because the programmer had decided it would be easier to write his config data into the Program Files folder, or worse, System32.
Adobe might actually pull their finger out and fix their software. As for Spotify, the guy who thought it was a good idea to install the executable in the roaming AppData folder of the user's profile needs to be shot. Repeatedly.
When using Linux, do you have to log in as root just so your web browser doesn't crash? The UAC should cause immediate panic and a feverish antivirus scan. Instead, we've been collectively conditioned by poorly written software to just say 'meh' and blindly click Continue.
This is the crux of the matter.
irony overload from the openbsd fanboi
Yet again, the *nix zealots mistake security by obscurity for perfection in code.
This argument is tiresome and tedious, especially coming from an OS that uses Kerberos protocols from the stone age, laughably simplistic ACLs, no concept of domains or computer accounts, and, until very recently, easily crackable RC4 encryption.
At least MS fix their security vulns, instead of bickering for months on end as to whether something actually IS a vulnerability. Then again, in the slow-moving world of the sleepy *nix, a few months make no difference, because nobody's bothered about trying to exploit it.
I've tried various *nix solutions and found them lacking. Every time we try to integrate some *nix-based system, without fail we have to switch off huge swathes of security settings on our MS systems to make them (and here's the important word) BACKWARDS compatible.
Just this week I've had to switch off AES128 and AES256 because some idiot bought a Solaris server, and don't get me started on the IPv6 switchover! Yet again, the only things that broke were non-MS.
yet another storm in a teacup
All this Chicken Little gnashing of teeth and arm-waving is tediously familiar, and smacks of the same paranoia that surrounded CPUID and TPM technologies when they were first introduced.
At the end of the day, this is merely a step toward improving overall security within the OS. As with all the other "controversial" technologies, there will be an option to SWITCH IT OFF in the BIOS.
There ya go, you can breathe again now.
I think this is only the start
When I first heard about MS buying Skype for a ludicrous amount of money, I thought "Meh, another blue-sky venture", but the more I thought about it, the more I realised that this is gonna be big. Really big.
About 18 months ago, we upgraded our phone system due to the company we bought it from going under. We replaced the lot with a new VOIP system.
The handsets were expensive, the software was expensive, the new POE switches were expensive, and the manhours in setting up a complex VLAN to support it... You get the idea.
The reasons we went with a digital VOIP system were many, but included the fact that we could use our existing ethernet infrastructure. As it turns out, the CAT5 wiring is about the only thing that our phones have in common with the rest of our IT infrastructure, and at the end of the day, it still only handles voice calls.
Now imagine Skype being integrated into Active Directory and Exchange/Outlook. The potential savings are incredible. You can buy a USB microphone/speaker in any form factor you desire for next to nothing.
Contact details, routing and out-of-office all become non-issues, as you already have these set up and configured within AD/Exchange - your phone simply follows your login. Changes and feature additions become trivial.
Of course, this is all omitting the ADDED features, such as video, conference calling, instant messaging...
It's just Apple ADC all over again
Pretty nice monitor.
Very nice Docking Station.
Unfortunately, the Apple price looks even more ludicrous when you realise that, just like every other Apple 'innovation', they will drop support for it within a few years.
When you buy a piece of expensive kit such as this, one of the biggest factors to take into consideration is its product lifespan. I'm afraid you can expect a maximum of 5 years of slowly dwindling support for this standard before you won't be able to plug it into anything without first purchasing a massively overpriced adaptor from Apple, in turn completely defeating the point of the single-wire system.
"But it's based on Displayport!"
Really? Oh, you mean the outside runner of the video standards? The one intended to improve on HDMI, but which couldn't get off the starting blocks fast enough, so HDMI had already equalled its resolution capabilities?
Seriously. How many non-Dell monitors do you see with DisplayPort? Don't get me wrong, it's a nice standard with the daisy-chaining and latching connector, but then again, Betamax was a nice standard too. Philips Video 2000 was even better.
If people consider reporting the crime to be worth less than potentially devaluing the house...
...Then clearly, the 'crime' wasn't worth reporting in the first place.
There once was a time when the police were set up to prevent the daily body turning up in the Thames. Just stop and think about that for a moment. A dead body. In the Thames. Every single day.
Now try to sound serious when you talk about crime rates going up. These days it's apparently a crime serious enough to warrant police intervention when kids are having fun and making noise on the street, despite the fact that there's nowhere else for them to go. Or your birthday party has gone on til 10.05pm - which is simply intolerable!
What you have to remember is that in general, the "house price worriers" set sits in almost total Venn-diagram eclipse with the "nosy neighbours" set. The people who come over to tell you off because YOU haven't mown YOUR lawn, and it's making their property look messy by association.
What we have here is a win-win situation. Because the nosy neighbour set will now have to think twice before crying wolf, police resources should be stretched slightly less, kids might get an inch of breathing room, and hopefully, social attitudes towards what is now considered a 'crime' may settle a little closer to tolerant levels.
While I certainly understand your point, it is hardly justification for lazy and badly written software that forces you to run as administrator, making your entire OS vulnerable.
That's like me designing a generic fuel injector, and when asked why it's squirting petrol all over the manifold, replying "oh, that nozzle is there for a different model of engine, I can't be arsed to redesign it just for your car!"
If you install nothing but well written software, then you can quite happily run Windows under a limited account *as it was intended to be used in the first place*, only elevating to admin status to install software and drivers.
In an ideal world such as this, if the UAC pops up while you're NOT installing software or drivers, then it will give the user genuine pause for thought, rather than just assuming it's Firefox/Java/Adobe et al trying to update itself in a non-compliant manner.
Instead, we have become so punch-drunk from the constant bombardment, most users blindly click OK every time it appears, allowing any virus to run with YOUR administrative privileges, which you've been forced to use because your badly written software breaks otherwise.
That said, I am steadily seeing a shift in trends since the appearance of Win7, with more software becoming compliant. Even Adobe are pulling up their socks.
Nvidia still have a fair way to go with their new driver model; I am seeing a lot of buggy drivers. So much so that we now buy ATI cards for our graphics workstations.
Mozilla once again demonstrating they have no place in an enterprise environment
Firefox is a very nice browser for home users, but that's it.
Nearly every aspect of the software's architecture under the bonnet is wrong in every particular.
Self update should be done by an installed service, not by the running program trying to write back to its program folder, hence the UAC alarm.
Firefox abuses and incorrectly uses the local/roaming profile folders with a jaw-dropping lack of understanding that I would barely expect from a first year student of software engineering.
Finally, your software preferences and configuration need to be stored in the registry. Locally saved config files were passé in the early nineties.
Despite many tweaks and changes in the details, these 3 basic rules have been the backbone of the NT family from the very start. If you can't grasp them after 20 years, then frankly your maturity as a software engineer is in serious doubt.
It's pretty obvious that software developers who ignore these aspects of the Windows OS do so out of a hypocritical disdain for the very operating system they are writing for, and ironically, are the very cause of the Windows insecurity. You don't run your Linux install as root all the time, but you have to with Windows because of 3rd party software that breaks all the rules.
On a corporate network, IE is vastly more secure than any of the alternatives, not because of the underlying code, but because it uses the host OS correctly, it can be centrally managed using Group Policy, and configured to such a granular level that it can be locked down completely when viewing potentially dangerous zones.
You can't even centrally manage the bloody homepage of these rinky-dink browsers.
So, excuse me if I'm a little cynical about Mozilla's and Google's attempts to ratify the next web standard when they can't even comply with the standards of the host OS they are writing for.
I can see how that can be confusing, but if you've ever stood beneath a power pylon on a still day, you can very clearly hear the distinctive 50Hz hum coming from the cables, yet they too have no moving parts.
I believe this comes from electromagnetic resonance. I have also heard IC chips produce a high-pitched whistle.
When I was younger, I used to be able to hear the extremely high pitched whine from cathode ray tubes... from outside the house! This freakish TV detector ability used to fascinate my schoolfriends
These are not the security holes you are looking for
While these newly discovered vulnerabilities are interesting, you need only look at the change in attack vector by viruses in the wild to realise the depth of change related to windows security in recent years.
Long gone are the days of the Blaster/Sasser worms. Even the dreaded Conficker worm uses a combination of social engineering and brute-force dictionary attacks.
And the drive-by web-based attacks rely on exploiting vulnerabilities in commonly installed software like Acrobat, not the OS itself. Therein lies the rub.
All the current security issues on the Windows platform can be laid squarely at the feet of badly written 3rd party software.
It all started when MS ditched the home market DOS based OS and consolidated on the NT platform with XP. Prior to this, people who wrote software for the NT platform understood that it was a network based OS with tightly regimented ACLs, and if they didn't take this into account, their software would not work.
Then came the flood of script kiddies, DOS programmers, and beard-stroking old-school Unix zealots, who refused to comply with the Windows security model, making it so difficult to run as a limited user that we have to run as admins, giving anything we double-click on full rights to the entire OS.
"Program Files? That has a space in it, and would require some improvement in my programming skills. I'll just install in the root of C:"
"Windows registry? Looks complex. I'll just write back to config files in my install directory"
The net result is that as a sysadmin, you spend days tightly locking down your Windows environment, and then weeks punching dirty great holes in it again to get badly written software working. No wonder your average home user is vulnerable; they've been conditioned into thinking that every bit of software out there needs direct kernel access and sufficient rights to re-partition your hard disk, just so it can self-update.
Firefox behaves like a virus, trying to write back to its program folder when updating (instead of using an installed service). I've seen Google Chrome install itself into the user's profile folder before! Don't think the open source crowd do any better. The first thing that happens when you launch GIMP is it does a great steaming dump all over your user profile. You'd think by the way these programs behaved that the coders had never actually seen a Windows computer before in their life.
When these 3rd party programs finally start using the now decade-old, well-documented Windows security model, then so can we! On that day, we will be genuinely worried by the UAC pop-up, rather than just assuming it's Mozilla's crappy updating routine.
Interesting points here
Only a handful of trolls out today, which is quite refreshing.
If I remember correctly, it was the other way round? Netscape BEGAN by selling their product for £30.
Microsoft entered the market, also punting their product for roughly the same price. Unfortunately for MS, theirs was vastly inferior to Netscape's, and didn't sell. Unfortunately for Netscape, MS had extremely deep pockets, and began giving it away for free, then went on to bundle it with the OS.
It was at this point Netscape began giving away their product, in a last desperate bid to stay alive. Since Netscape had no other means of revenue generation, it didn't take long for them to go under. This was long before the large-scale "free software" movement - a movement that has only recently become possible due to the many alternative revenue streams made available by the now well-established internet.
I'm certainly very interested in FF4 (especially the tab management system), but IMHO the 3D aspect is going to be a mere gimmick for some time yet. Don't forget that 3D on the web has been tried many, many times before.
Not just VRML, but 3D content has also been available through Flash and Shockwave for many years (albeit Flash used extra plugins) and has been used quite successfully for a great many web-based games, but whole websites?
Also, the OpenGL standard has been languishing for some time now. Remember that it's the games industry which has pushed 3D standards to their now dizzying heights, and like it or lump it, the vehicle of choice for this progress has been DirectX for a long time now.
MS has worked very closely with the games and graphics industries over the years, folding new innovations into DX. Whether you approve of their methods, the end result is the undeniably powerful DX11.
GL is a poor showing by comparison, and because of this, the vast majority of consumer-level cards pay little more than token lip service to hardware support.
features in IE
I religiously used Firefox as my faithful browser for knocking on ten years. I'd tried Opera, and while it had some nice features (and some very gimmicky ones) I found it struggled to render some pages.
I, like a great many reading/posting on this article, had settled into a comfortable groove. After all, if it ain't broke, why fix it? Then my 7yr old OS install started to crumble. I had literally hundreds of programs installed, and a mixed graphics card setup that made reinstalling nigh impossible.
First Firefox stopped working, despite several hours trying to coax it back to life. Months later, Opera flaked out.
I only give you the backstory because it took this much pressure to force me to fire up the old blue e. It had never been updated, and it only took a few minutes of fighting with v6 before I downloaded 8.
Yes, it was slow. Yes, it was clunky. There was one feature that immediately jumped out, though: the ease with which you could add your own custom search providers.
I filed it away in my favourites with the intent of having another look when I had some genuine time to waste.
IE changed all that. When you click to add a search provider, there's a link at the bottom called "Create Your Own Search Provider". It's so easy to use (search your target website with the word TEST, copy the resultant URL, paste it in the box and give it a name) that I began to add websites which I wouldn't have originally considered worth the effort.
The real game-changer though, is using them through the internet accelerator. Highlight a bit of text on any given website, click the accelerator icon, choose a search from the list, and it opens that website with the search results from the text you selected in a new tab.
This sounds mediocre, until you spend a few seconds creating a set of search providers customised to your browsing. Then you begin to realise you're getting through the web faster and in fewer clicks than ever before.
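The trick described above boils down to simple URL templating. A quick sketch of the same idea (the example.com URL and function names are hypothetical, purely for illustration):

```python
from urllib.parse import quote

def make_search_provider(result_url_for_test: str) -> str:
    """Turn a captured search-result URL (searched for the word TEST)
    into a reusable template, as the IE dialog does behind the scenes."""
    return result_url_for_test.replace("TEST", "{searchTerms}")

def run_search(template: str, query: str) -> str:
    """Substitute the highlighted text into the saved template,
    percent-encoding it so it is safe in a URL."""
    return template.replace("{searchTerms}", quote(query))

# Searching the (hypothetical) site for TEST gave us this URL:
template = make_search_provider("https://example.com/search?q=TEST")
print(template)                          # https://example.com/search?q={searchTerms}
print(run_search(template, "graphics card"))
```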
I've no doubt FF4 will improve on things. They're claiming an even better JS engine, and the tab management feature looks amazing. On the flipside, Mozilla's update mechanism is nothing short of ridiculous. Not only does the self-update break every security rule of the Windows OS set out since XP, but it does this on startup of the application?!?
Either way, this is all healthy competition, and is spurring the market forwards
re: event viewer.
You beat me to the punch on how resource-heavy the new event viewer is, but getting teary-eyed for the old one? Really?
Yes, the old event viewer loaded quickly. It also closed just as snappily after you discovered it had recorded bugger all that would give you a clue as to why the machine had suffered a major breakdown and rebooted.
Just last week we had one user who was suffering frequent crashes. At first we put it down to errant software. After all, the XP install on her machine was nearly 5 years old, and had been passed through many users, filling up with crapware. After trying various band-aids and little fixes we bit the bullet and remastered her machine.
This seemed to fix it, until a few days later, it crashed again. The BSOD indicated a memory problem this time, so we rolled out memtest and various other diags, which it passed with flying colours. There was NOTHING in the XP event viewer to even give me a hint, even with all the logs switched to full. By this point, my slightly cynical nature started leading me to suspect it was somehow user instigated.
We put Win7 on her machine, as we've found in practice it's a lot more resilient and idiot-proof than XP. Again, fine for a few days, then I get another call. This time, it had only lost network connectivity. I looked in the event viewer, and there it was. I can't remember the exact phrasing, but it stated that the HAL had failed to retrieve information from RAM after resuming from sleep.
Bingo. A BIOS update to fix the *known* bug in the ACPI firmware, and we're all sorted. I suppose the lesson to be learnt is to be slightly less cynical. When I asked her what she was doing at the time of the crash and she said "nothing", I might have realised she literally meant nothing, hence the machine went to sleep.
Whereas XP refused to even admit that the machine had even rebooted, Win7 just barely stopped short of whipping the side panel off the case and pointing at the chip!
Of course, this is known as self-diagnosis and monitoring, or as my colleague amusingly calls it, 'navel gazing', and we're seeing it more commonly everywhere. Most modern cars will simply tell you exactly what's gone wrong when you plug a laptop in.
I was indulging in a bit of 'motherboard reading' the other day, and spotted a connector I didn't recognise. After some googling I found it was for a motherboard tester that checked every single pathway and chip. The controller chip for that connector had its own internal bus bridge between the V1 and V2 extended diags.
Wait, what? Even the motherboard self-diagnostic chip is so advanced now that it has multiple internal clock speeds? The self-analysis systems within modern computers and the Windows OS are getting ever more advanced and resilient, but the laws of physics dictate that these diags have to use resources.
So, can I begrudge the event viewer taking a few seconds longer to open? Not really. If I'd had Win7 on that machine to start with, it would have saved both me and the user a week and a half of mucking about, and I can easily foresee it saving me a lot more time in the future.
Yes. Yes, they have indeed buggered up the configuration. What you have described smacks of a lack of planning and/or understanding.
I've been building and implementing a multi-site SCCM infrastructure for about two years now, and while I can certainly understand the daunting complexity of the SCCM system, the power and flexibility is staggering.
Our desktop environments vary wildly in their hardware/software setup, so an image-based OS deployment system like RIS, Ghost, or WDS is completely impractical. The difference between those and SCCM is dramatic. Our server has the drivers and software for every machine, and applies them intelligently during setup.
SCCM takes a lot of investment of time to get right, but once it is, it's magic. One example: we have a large number of machines with M-Audio soundcards. M-Audio haven't written their drivers correctly, so the OS deployment will install the drivers, but not the software needed to configure them (this is possible, as the Intel Media Graphics Accelerator does it correctly).
To compensate, I have a collection whose membership rule is that the machine has said M-Audio card. I then have a sub-collection whose membership rule is that the machine DOES NOT have the M-Audio software. A silent install is advertised to that sub-collection. These get updated each time the machine performs a scheduled hardware inventory, so it's all completely automated.
The collection rules can be based on any information that can be collected from Active Directory, network information or the WMI repository. I've found that WMI can tell you the serial number of the battery in a laptop, if you know how to ask it!
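As a toy model of that sub-collection logic (machine names invented for the example), what gets re-evaluated after each hardware inventory is essentially a set difference:

```python
# Machines the hardware inventory reports as having the M-Audio card,
# and machines the software inventory reports as having its software.
# Names are made up purely for illustration.
machines_with_card = {"WS-001", "WS-002", "WS-003"}
machines_with_software = {"WS-002"}

# Sub-collection membership rule: card present AND software absent.
# The silent install is advertised to exactly this set.
needs_install = machines_with_card - machines_with_software
print(sorted(needs_install))
```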
Sounds like your guys are running the site in mixed-mode rather than native. That's why a second computer account is generated after the OS deployment.
You can avoid inputting the MAC address by simply joining the computer to the domain, pushing the SCCM client out, and then adding it to the OS deployment collection. I've done this a few times, but I've often found that the OEM Windows install takes so long to do its initial configuration, it's quicker to input the MAC and run a bare-metal deployment via PXE.
Dell are the best for this. Not only do they routinely put the MAC on the outside of the packaging, but when you enter the service tag on their site, you can download a CAB file containing all the drivers, which you can import straight into SCCM. No mess, no fuss.
The software suite we install during OS deployment is a pretty standard affair, again, because all the systems are so different. You can create rules during the task sequence that check for variables, but I found it easier to keep that part simple, and have software advertisements based on the computer's OU container, group memberships, and the most commonly logged-on user.
It doesn't deploy all the software during the initial setup, but we give it to the user, and the silent installs cause no disruption. They just suddenly find a new program in their start menu.
nature vs science
Indeed, barrels are used throughout nature. As for chemical explosions, much rarer, but they do exist.
The bombardier beetle releases two chemicals simultaneously from its abdomen. If Wikipedia is accurate, hydroquinone and hydrogen peroxide, which react violently, deterring predators.
Nature is amazing on many levels, but don't let the hippies fool you. The overwhelming majority of machines we produce are vastly more efficient at their task than nature's solutions.
Flight, for example. Your average Boeing uses far less energy to achieve flight than any bird or insect. Nature can't even build a land animal of the same weight, let alone for flight.
In fact, the heaviest animal ever known is the blue whale, tipping the scales at 180 metric tons. The Antonov An-225 could quite happily fly said whale, with 70 tons spare for provisions to keep our whale alive and in relative comfort during the 2,500 mile journey, faster than any lifeform at 500mph.
This is primarily due to the infinitely rotatable axle. Something nature can never achieve.
We have long since outrun nature in everything we strive for.
What a piece of work is a man, how noble in reason,
how infinite in faculties, in form and moving,
how express and admirable in action, how like an angel in apprehension,
how like a god!
I think you'll find...
It's Adobe's and Google's updating routines which are working in direct violation of the Windows architecture.
Adobe are legendary for writing software for the Windows platform that isn't actually compliant with the Windows architecture. Non-standard installers, non-standard interfaces, and of course, non-standard updating routines.
This behaviour was almost acceptable back in the days of Windows 95, when you bypassed the standard Windows libraries so that you could tweak your assembler-written game kernel to run on a 66MHz Pentium, but things have moved on.
We use the incredibly powerful SCCM to manage software distribution and updating, but even with amazing tools like that, each new Adobe update makes me die a little inside.
Having managed a few networks in my time, I've dealt with windows boxes and related security issues on various levels, and nothing was more telling than when dealing with locked-down user accounts.
Most readers on this site will be accustomed to small-to-medium Windows networks where most users are granted a modicum of trust and rights over their own personal systems, but it's in environments like schools, prisons and call centres, where the policy is to "lock it down 'til it squeaks", that you start to see some of the dirty habits of software you previously considered respectable.
Once you've locked down a WinXP system, it is nigh impossible to infect it. Buffer overflow code executions fail when they attempt restricted actions. Process elevations never happen, because policies specify a whitelist of trusted locations, local and external, that executables can be run from.
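The whitelist idea is simple enough to sketch (in Windows terms this is Software Restriction Policies; the trusted paths and executables below are purely illustrative):

```python
from pathlib import PureWindowsPath

# Illustrative trusted locations: executables may only run from here.
TRUSTED = [PureWindowsPath(p) for p in (r"C:\Windows", r"C:\Program Files")]

def allowed(exe: str) -> bool:
    """True if the executable lives under one of the trusted locations.
    PureWindowsPath comparison is case-insensitive, matching Windows."""
    path = PureWindowsPath(exe)
    return any(trusted in path.parents for trusted in TRUSTED)

print(allowed(r"C:\Program Files\App\app.exe"))   # runs fine
print(allowed(r"C:\Users\bob\Downloads\x.exe"))   # blocked by policy
```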
We never had a problem with the students' desktops (the teachers' laptops, on the other hand...)
Secure, that is, until you start having to punch dirty great holes in your own security to get shoddily designed bits of software working.
Firefox is a classic example. Its self-update system breaks several fundamental rules of the Windows environment, the most obvious of which is attempting to write back to its own program folder.
This should never happen. The updating component should have been installed as a local service.
What really irks me is that these aren't brand new rules that you could forgive people struggling to catch up with. The NT family was designed from the get-go so that in everyday use you run as a limited user, but there are still too many lazy coders out there who take shortcuts that compromise the whole system's security, forcing you to run as root.
The UAC isn't intended as a direct security measure. It's there to embarrass the coders into writing their software in compliance with the platform they are developing it for. Just think of it as a big FAIL sticker on the 3rd party software every time you see it.
Our Unix guy isn't here right now, so I can't get all the gory details, but our fileserver gave us equally unexpected and unpleasant results when we started tripping over its hidden limits.
We have a Unix box attached to a SAN with approximately 4TB of storage, which at the time we considered ample capacity for its job. It didn't take long before we hit the dreaded inode limit.
As I say, I can't remember all the precise details, but the EXT filesystem, formatted over a certain size (1 or 2 TB), assumes that the average size of each file will be around 1GB, and therefore allots an appropriate number of inodes for this assumption.
If your average filesize is closer to a few KB, your inodes run out loooong before you reach the drive's space capacity. When he investigated this, he naturally assumed he'd chosen an incorrect option when creating the partition. After significant research, he discovered that was the default, and only, configuration.
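A back-of-envelope sketch of that arithmetic (the 4TB size and 4KB average small-file size are illustrative, and real mkfs tools size inodes from a bytes-per-inode ratio rather than exactly this way):

```python
TB, GB, KB = 1024**4, 1024**3, 1024

capacity = 4 * TB

# Inodes allotted under the "average file is ~1GB" assumption.
inodes = capacity // (1 * GB)

# How many small (~4KB) files that much space could actually hold.
files_wanted = capacity // (4 * KB)

print(inodes)                   # inodes available
print(files_wanted)             # small files the raw space could hold
print(files_wanted // inodes)   # how many times over the inodes run out
```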
I love some of the suggestions that appeared between composing and posting my last comment!
Please tell me you're joking!
Pull out the hard drive? What, one hard drive containing 60m files? Or even a server, and it has ONE hard drive?!? This isn't the 70s, Rus.
1) It's at minimum a RAID 5 array, quite possibly utilising the RAID controller built into the server's motherboard. The RAID controller is integral to keeping the data readable.
2) You don't just 'pull out a hard drive' on a server. In all likelihood, that machine is still live, and hosting a myriad of roles and services for the network.
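For what it's worth, a toy sketch of why the controller is integral to readability: RAID 5 stores parity as the XOR of the data blocks, and any one missing block is rebuilt from the survivors. Block contents here are invented for illustration:

```python
def parity(blocks):
    """XOR a list of equal-length byte blocks together."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

# Three data blocks striped across three disks, parity on a fourth.
d1, d2, d3 = b"AAAA", b"BBBB", b"CCCC"
p = parity([d1, d2, d3])

# Lose the disk holding d2: rebuild it from the surviving blocks + parity.
rebuilt = parity([d1, d3, p])
print(rebuilt == d2)
```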
I'll give you 10/10 for optimism there. Xcopy would have failed just like the other commandline tools Trevor had tried. I've seen many a solution using insanely complex batch and kix scripts fail time and time again. The simple fact of the matter is that in a complex environment such as this, scripted systems invariably fail due to the unexpected and unforeseeable.
We have actually had to shift FROM a Unix file server TO a Windows system. Over 2 terabytes, we encountered severe limitations with the filesystem: we were regularly exceeding the inode limits. After several weeks of research, we discovered that this was a fundamental design flaw of the filesystem, which assumed that over that size the partition was going to be filled with files greater than 1GB, not tens of millions of 1KB files.
On top of that, Unix has a less advanced Kerberos implementation, meaning computer account permissions could not be applied, and the time-saving benefits of giving users access to Volume Shadow Copy dynamic restores meant we weren't forever rooting through tape backups for every user that accidentally overwrote their Word document.
large file transfers are always a challenge
Personally, I swear by Directory Opus, by GPSoftware.
I can attest to its incredibly reliable performance, error handling and insanely flexible advanced features.
Aside from being able to copy vast quantities of data, handle errors, log all actions, migrate NTFS properties, automatically unprotect restricted files and re-copy files if the source is modified, it also has built-in FTP, an advanced synchronisation feature (useful for mopping up failed files after you've fixed the problem that stopped them being copied), and a truly unparalleled batch renaming system which, among other things, can use regular expressions.
It also has tabbed browsing (you can save groups of tabs), duplicate file finding, built in Zip management, custom toolbar command creation, file/folder listing and printing....
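DOpus drives its regex renaming through its own UI, but the underlying idea is easy to sketch (filenames and the pattern below are invented for the example):

```python
import re

# A batch of files to rename; only those matching the pattern change.
files = ["IMG_0012.jpeg", "IMG_0345.jpeg", "notes.txt"]

# Capture the number, keep it via the \1 backreference, change prefix
# and extension in one pass. Non-matching names pass through untouched.
renamed = [re.sub(r"^IMG_(\d+)\.jpeg$", r"holiday_\1.jpg", f) for f in files]
print(renamed)
```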
Strangely, not a lot of sysadmins know about DOpus. I learnt of it during my Amiga days, in what seems like a lifetime ago. I always have a copy installed on my workstation, and on at least one of my servers.
have to agree with harryhedgehog
As a home user I was a devout AVG user for many years, but compared to MSSE, AVG is a total resource hog, and not very effective at its job.
One annoyance with AVG is its schizophrenic behaviour when an infected file gets lodged in a system restore point. Despite the infected file being completely inert (unless of course you put your machine back to that restore point), the monitoring component will periodically scream VIRUS! VIRUS!
So, you run the scanner across your entire hard disk, at the end of which it comes up with "dunno what you're on about, mate. No virus here", because it can't access the System Volume Information.
Symantec's product is less than useless. The first sign of a virus, and it curls up in a corner crying "Not in the face, not in the face!" You can tell your machine may have brushed by some 10-year-old malware, because the Norton icon in the system tray is disabled, and you can't re-enable it. Oh, and the firewall in it continues to block software on your computer from accessing the net even AFTER YOU UNINSTALL IT!
McAfee have come up with an interesting solution. They install so much crap on your system it s-l-o-w-s... t-o... a... c-r-a-w-l. Your computer is literally too slow to catch a cold.
Sophos is an arrogant little turd. You fire up your computer, log in, and it essentially shouts "STOP EVERYTHING YOU ARE DOING! I'm updating myself." It seems to grab nearly every resource off you for the simple act of self-update. And don't get me started on the false positives!
Compared to this lot, MSSE isn't just out in front, it's lapped them several times
I'd have to agree with Ku...
What piqued my interest in Winpho7 was the underlying engine.
We've shifted a great many in-house intranet services to .NET, because the Visual Studio development environment is top notch.
This potentially opens some very interesting integration opportunities.
Time will tell...
Horses for courses indeed
While parallelisation can benefit a great many applications in general, there are also a great many algorithms which simply have to be serially processed, due to the next part relying on results from the prior code.
Plus the greater the number of processes/threads, the greater the overhead of managing them becomes.
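That first point is essentially Amdahl's law: the serial fraction of the work caps the speed-up no matter how many cores you throw at it. A minimal sketch (the 10% serial fraction is an illustrative number, not taken from any real workload):

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Best-case speed-up when only (1 - serial_fraction) parallelises."""
    return 1 / (serial_fraction + (1 - serial_fraction) / cores)

# Even with only 10% of the work forced to run serially, the speed-up
# creeps towards a hard ceiling of 10x and never beyond it.
for cores in (2, 8, 64, 1_000_000):
    print(cores, round(amdahl_speedup(0.10, cores), 2))
```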
Ray tracing is a classic example. While chucking cores at a single image being rendered will greatly increase speed, you quickly hit a wall, due to each pixel's colour being dependent on its adjacent pixels (and beyond), thanks to anti-aliasing, specular flare, and other physical effects that simply aren't known until the other nearby pixels have been calculated.
In this scenario, 2, 4, or even 8 cores will produce dramatic benefit when you split the image up into chunks and render each simultaneously, but each chunk then has to be 'stitched' together at the seams, and this generates more work than if the overall image had been done in one piece. Still a massive benefit overall.
But what happens when you have a 1:1 ratio of pixels to processors? The render re-hashing would be horrific, not just for the system, but also for the poor programmer who would have to figure out how to code for every pixel adjusting itself to its neighbours, while those neighbours are also still adjusting.
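To put rough numbers on the seam overhead, a sketch counting the share of pixels that sit on a chunk edge and so need re-stitching (image size and chunk counts are illustrative):

```python
def seam_fraction(size: int, k: int) -> float:
    """Fraction of pixels lying on the edge of some chunk when a
    size x size image is split into a k x k grid of chunks."""
    chunk = size // k
    edge_per_chunk = 4 * chunk - 4       # perimeter pixels of one chunk
    return (edge_per_chunk * k * k) / (size * size)

# The more chunks, the larger the share of pixels needing seam work.
for k in (2, 8, 32):
    print(k, seam_fraction(1024, k))
```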
At least they apologised
I seem to remember Google had some similar problems early on with their cloud services.
Their response was more along the lines of "Meh. Something happened. We fixed it."
Not exactly a comforting response when your business is utterly dependent on said service.
Bloomin' social networking cr*p
I wish MS would stop trying to ape Apple and their ilk by offering useless fluff.
Windows Phone 7 is not going to make a dent in the fluff market. Apple and Google have sewn it up tight.
It's the corporate market that Redmond really need to focus on. Never mind bloody Facebook/Twatter contacts, how about a user-friendly means of filtering and searching Exchange/LDAP contacts?
WP7 is apparently based on the core of 7, with the ability to bolt on modular components from the full-fat version, like Location and Sensors and BitLocker.
Now that I can see as being useful. Your field engineer wakes up in the morning, switches on his phone, and starts receiving his synchronised calendar schedule for the day, automatically generated from the call centre booking system.
All customer details are secured with BitLocker. The GPS unit, tied into Location and Sensors, feeds the times and locations into some Autoroute-type software and plots out the optimal journey (I know, I know), while the customers' contact details for that day automatically generate a quick-access contact list.
Finally, because this is essentially a 'proper' windows platform, all phones can be managed in the same way as we would laptops, with software updates and configurations automatically managed through SCCM and group policy.
This only touches on the potential offered by the ever-elusive 'unified platform'.
Redmond are tantalisingly close this time, just as long as they don't fumble the ball.
You may say I'm a dreamer, but I'm not the only one...
Firstly, ripping the contents of a game to hard disk vastly speeds loading times.
Games are expensive, and discs get lost or scratched. Yes, Blu-ray too. The odds of either happening are vastly reduced if you only use the disc occasionally.
Sony PS3s are notorious for frequently breaking optical drives, and Sony charge more than £100 to replace what is essentially a £20 Blu-ray reader.
This also opens up the potential for "bedroom programming", something that has long been lost amongst the endless parade of stale sequels.
These are not the veiled arguments of a pirate attempting to defend his illegal ways, these are the legitimate reasons I have not bought a console since 1998.
Yours sincerely, a lost customer.
By what twisted logic can you lay the blame for Flash crashing your machine at the feet of MS?
That's Adobe (one of the flakiest mainstream software developers on the planet) not fully testing their x64 implementation of the Flash plugin.
As for ZoneAlarm, don't touch it with a ten-foot barge pole. I didn't even think they were still going; there's certainly been no market for the product since XP SP2.
I can't remember the last time I saw a BSOD that wasn't caused by dodgy software, dodgy drivers, or hardware failure.
MS's greatest weakness is when you run the OS and application stack on insufficiently powered hardware. If you keep the CPU/GPU/RAM/HDD maxed out for long enough, apps will start to fall over.
This is no different to any other OS.
Now cue the trolls: "Linux has a smaller footprint..."