Vista got the same treatment. But with Vista it really wasn't Microsoft's fault so much as Intel and their crappy drivers and hardware. With Windows 8, though, it's every bit Microsoft's fault and they deserve the criticism. Their new interface made the OS easy to use on tablets but considerably harder to use on traditional desktop machines. They need to do better.
The same thing has been happening in the PC market as well. You used to be able to expect that a high-end PC would come equipped with a Creative Labs, Turtle Beach, or other comparatively high-quality audio card, but these days you're hard pressed to even find them separately, let alone buy a machine with a decent audio card pre-installed. I remember I used to be able to find motherboards with Creative Labs audio chipsets onboard. Now all I see are Realtek chipsets. My home theater PC is poorer for it.
But I also agree that the source audio quality is a big problem these days. I have an old album called the Sierra Soundtrack Collection, recorded by Mark Siebert, that I bought back in the early 90s. The audio fidelity is absolutely amazing. It easily destroys newer stuff like my Disturbed albums in quality. Don't get me wrong, I love Disturbed's stuff, but I do really miss the 90s focus on audio fidelity.
Cold Hard Reality
It's about time someone finally wrote in an article what I've been thinking for the last few years. I live 25 miles out of town in the middle of a desert. If my internet connection drops only 5 times a day, I consider it a very good day. I'm in exactly the situation you describe and there's not a damned thing I can do about it. I have exactly one internet provider here and I pay over $100 a month for a shitty 1.5mbps connection. This always-on bullshit is simply and purely unacceptable. If either next-gen console comes with an always-connected requirement, I simply will not buy that console. I consider this matter non-negotiable. If they want my money, I require that they drop their always-on connection requirement. I WILL NOT and CANNOT budge on this position.
I experienced the same crap for the longest time until I started making my friends and family buy their computers from an official vendor like Dell or HP. When something goes wrong I tell them to call tech support. I'm much happier for it.
It's my opinion that the average buyer doesn't understand the difference between an E-ink screen and a tablet screen. When I walk into a tech store like Best Buy I see the e-book readers and tablets displayed next to each other in the same place in the store, but I never see any displays or placards that explain the benefits of the E-ink screen. So naturally potential buyers see the tablets with their beautiful color screens and wonder why anyone would buy an e-book reader. I think retailers need to do a better job of making sure customers understand the benefit of only having to charge their e-reader every few months no matter how much they use it.
I just wish they would update Top Gear on Netflix with the newer series. They stopped at series 17 and have not posted anything past that.
I think your assertion that Microsoft needs to "Think differently" could be expanded on. The problem, as I see it, is that Microsoft tried to create an interface that works for both desktop and tablet computers rather than creating two separate interfaces optimized for each task. They should have provided a way to actively switch between "Desktop Mode" and "Tablet Mode", with a clear desktop-oriented interface based on what has worked historically for desktop mode and the new Metro interface for tablet mode. This would have made Windows 8 into an OS that could easily slot into a wider set of usage scenarios. Instead, by trying to create this hybrid interface, they've done the opposite, narrowing the effective usage scenarios.
Desktop users want a Desktop OS not a tablet OS and this new interface suits tablets to the exclusion of effective desktop use.
Agreed. I still pay around $90 a month for 1.5mbps connectivity because they're the only high speed low-latency ISP that services my area and they're fully aware of that fact. I'm sick of being gouged by these greedy bastards. I really wish the FCC would do something about it.
Your methodology is flawed. The study in question shows that Mac users donate, on average, more than Windows users do. It's your assumption that Mac users are therefore more generous than Windows users that's flawed. It's the causal relationship that matters here. It's no surprise to anyone that Apple computers generally cost more than comparable products running Windows. There are notable exceptions, but as a general rule this is true. Given that fact, people on a limited budget are significantly more likely to purchase PCs running Windows than they are a Mac. The natural result of this inference is that, percentage-wise, people who use Apple computers tend to have more disposable income to work with than those who use PCs running Windows. That's not an indication that said people are more generous, only that they have more money they don't need.
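To make the income-confound point concrete, here's a tiny hypothetical simulation (Python, with numbers I made up, not the study's data). Platform choice and donation size are both driven by income, everyone is equally "generous", and the Mac group still donates more on average:

import random

random.seed(1)

def simulate(n=100_000):
    mac_donations, win_donations = [], []
    for _ in range(n):
        income = random.lognormvariate(10.5, 0.6)   # assumed income distribution, arbitrary units
        buys_mac = income > 60_000                  # pricier hardware skews toward higher incomes
        donation = 0.01 * income                    # everyone gives the same 1% -- equal generosity
        (mac_donations if buys_mac else win_donations).append(donation)
    return (sum(mac_donations) / len(mac_donations),
            sum(win_donations) / len(win_donations))

mac_avg, win_avg = simulate()
print(f"average Mac-user donation:     {mac_avg:.0f}")
print(f"average Windows-user donation: {win_avg:.0f}")
# The Mac group "donates more" even though every simulated person gives the same share of income.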
I find myself reminded of the South Park episode about the clouds of Smug surrounding eco-friendly car owners =D
I must admit, I simply do not understand how "Smart TV" could possibly be "The next big thing". It just seems to me to be an utterly pointless offering. All TVs sold today have HDMI input and it's literally second nature to just hook up a basic home theater PC that can easily deliver everything a "Smart TV" can deliver and a million things it can never hope to deliver. Why would anyone think that Smart TV was anything other than garbage with a pretty package wrapped around it? At least the 3D revolution had something tangible to offer. If you want a decent entertainment experience, get a decent 3D TV and hook up a decent gaming-ready HTPC. Don't waste any effort, or especially money, on Smart TVs with features that even the cheapest HTPC can easily outclass.
One thing I couldn't help but notice is that you failed to mention how Windows 8's security compares to Windows 7 or Windows Vista. You say Windows 8 can be infected by 16% of the most popular malware when the OS's only protection is Windows Defender. I suspect that if you ran those same tests against Windows Vista or 7 you'd find similar results. My instinct is telling me that Windows 8 is likely as secure as its predecessors and that you're focusing on Windows 8 in an attempt to grab headlines.
When you look at this from the Glass-Half-Full perspective, that means a clean install of Windows 8 is resistant to 84% of malware designed specifically to infect Windows machines. That's pretty good as far as I'm concerned. As the saying goes "You're trying to make a mountain out of a mole hill".
Vote with your Wallet
I applaud the ruling. I'm of the opinion that the current state of television programming is vile and untenable. The way I see it, if I pay for television programming, then that programming had better be 100% ad-free. I refuse to pay for programming that wastes as much as half of my time with useless repetitive twaddle. Since cable operators feel the need to charge for their service yet still shove endless amounts of advertising in my face, as far as I'm concerned their service is utter shite and not worth even $1. I vote with my wallet on this and so should you. If they insist on flooding your peaceful house with disruptive crap then the service needs to be completely free.
I do understand the cable operators' perspective and business model. They charge for the service of bringing the available cable channels to your home, but they don't actually do any of the programming itself. That doesn't change the fact that cable is seriously expensive these days despite the programming being something like 99% ad-supported. I buy every one of the shows I watch on Blu-ray for less than half the cost of six months of cable. Your average season of any particular show will run about $50, and for that I get excellent quality, no commercials, and the ability to watch it on my schedule rather than trying to time-shift it.
I just don't see why cable companies are still in business. It doesn't make sense.
Wow, this article is littered with bad information.
>> Yes, that same Microsoft that blistered the existing gaming competition with the XBox,
What're you smoking?! Microsoft and Sony tied for second place this generation and Microsoft barely qualified as a competitor in the previous one. Nintendo wiped the floor with both of them this generation for both market penetration and profitability.
>> and subsequently set the standard for interactive gaming with Kinect.
All gaming is interactive! That's the whole point!! Microsoft did NOT set the standard or even come close to it. It was only a few days ago that I read a review of Kinect explaining how no serious gamer would even consider waving their arms around to try and play an RPG or ANY game that requires quick reflexes. Kinect was a knee-jerk reaction to Nintendo's success with the Wii, just as Sony's Move is. No serious gamer uses it.
>> Valve CEO Gabe Newell called Windows 8 a "catastrophe" for gaming, citing the Metro interface and Microsoft's closed app store.
I wouldn't call Microsoft's app store a catastrophe because it's closed. I would call it bad for end users. I understand what Microsoft is trying to do but from an end user perspective they don't seem to realize that what they're doing is fragmenting the DRM landscape. Having a central gateway in the form of Steam made it obnoxious but grudgingly acceptable to work with DRM laden software. But every additional gateway that gets added to the landscape makes DRM in general more obnoxious and less acceptable. Microsoft isn't helping here.
>> The former concern is overblown - you don't have to use the Metro interface if you don't want to
It's not overblown! Metro is shit for anyone running a desktop computer and most hardcore PC gamers do so on a custom built desktop machine. To my knowledge you don't have the option to turn Metro off without using third party software to accomplish it. I could cope with the new start screen but every application's insistence on running in full screen metro mode is very much a catastrophe.
I like how fast and well optimized Windows 8 is, but Microsoft really screwed up with the Metro interface. They made the OS completely unusable on most netbooks, and even on any machine running at 720p resolution (my HTPC), by artificially setting an unacceptable minimum resolution.
As usual you appear to have absolutely no idea what you're talking about Matt.
My experience with OpenSuse in general was only what I would call "okay". I liked the look of the interface, but I just kept running into too many strange bugs for my taste. With 11.4, only around half the desktop effects worked correctly on my Intel chipset powered laptop. If I turned on the wrong desktop effect the whole interface would grind to a halt and crash. There were also a lot of widgets that would just inexplicably crash on me. With 12.1 I found that the interface was just slow as tar running down a wall. Even when I turned most of the desktop effects off, it felt like I was constantly waiting for the OS to catch up. Not at all a good experience for me. But it was the strange glitches in the "start menu" that really pushed me away from OpenSuse. There seemed to be some weird memory leak that caused items to vanish from the menu, leaving blank spaces in their place until you moused over those entries. I hate to say it, but even Windows 98 was more stable than that. I haven't seen such bad behavior from an OS interface since Windows ME.
I also gave Ubuntu a try. I liked how fast Ubuntu performed, but I didn't like the Unity interface and I found the bugs in GNOME to be a nightmare. Any time I opened more than 7 or 8 browser windows the whole interface would flip out for no apparent reason.
Now I'm using the KDE release of LinuxMint and so far it's the first Linux distro that's behaved for me on my antiquated laptop.
Anyway, has anyone else run into similar problems with OpenSuse? If so, have they fixed those problems with 12.2?
I've always wondered what it is that turns previously intelligent individuals into raving copy-bots once they're employed in the field of IT journalism. If I had a dollar for every instance of some oblivious journalist claiming Windows Vista is a disaster, I could buy Kim Dotcom's house. At this point, Windows Vista is actually better than Windows 7. On a clean install, Vista with Service Pack 2 is faster and takes a lot less interface customization to make it work well than Windows 7 does. I've got both Windows 7 Ultimate and Windows Vista Ultimate here in my house and I keep Vista on my personal machine because I actually like it. If you genuinely do prefer Windows 7's interface then more power to you. I personally hate it. Please stop repeating the baseless claim that 7 is better than Vista. They're virtually identical under the hood, with 7 having slightly poorer Superfetch optimization. Microsoft tried to shorten the boot-up time with Windows 7, by reducing the effectiveness of the Superfetch service, in response to user complaints over Vista's perceived boot time.
I think you're letting your opinion / perception of what many of these movies should have been cloud your judgement concerning each movie's quality. Some of these movies are indeed turds, but some of them are actually decent movies in and of themselves if you just put aside any preconceptions based on the game of the same name. Doom, Resident Evil, Silent Hill, and Tomb Raider, for example, were all decent flicks that I continue to enjoy to this day. If you put aside any opinions based on the game, those movies aren't actually bad. Most of them do tend to have at least one mediocre actor, granted, but what movie doesn't? Even Mario Brothers, despite having a script targeted at children, was an enjoyable movie. Street Fighter and Double Dragon were pretty much just stinkers though. I can barely get through either of them without falling asleep. For some perspective I recommend you sit down and watch Battlefield Earth or perhaps Manos: The Hands of Fate (without the awesome RiffTrax). Then you'll have an idea what a bad movie really looks like.
Ubisoft appears to be run by a fair lot of incompetents. Their always-online fiasco annoyed me so much I stopped buying their games. I won't spend money on another Ubisoft-published product until they publicly acknowledge that they screwed up and take steps to resolve my concerns. It's a shame really, because I genuinely wanted to play through the Assassin's Creed games that followed the first one, but I refuse to give that shit company my money and I don't pirate my games. I haven't pirated a game since I was an idiot kid living in my parents' house and I have no intention of doing so ever again. I take offense at being treated like a software pirate after paying good money for a game I want to play.
I think there are larger issues at stake here than just whether or not Samsung copied Apple's product design. I think the real issue that this case will influence is whether or not Apple should have exclusivity over the tablet market. The concept of tablet computing in general is not an Apple invention. There have been numerous prior art examples; the data pads everyone carried around in Star Trek: The Next Generation are a prominent one. The concept of portable tablet-based computing has been around for a long time. Many manufacturers have tried to produce devices that fit this segment, but I grudgingly admit that Apple was the first to market with a device that truly broke the geek-only barrier. Bringing existing touch screen tech and online app distribution together in a user-friendly way was the key, in my opinion. But despite this I don't believe Apple should have exclusive rights to tablet computing in general. That's what this trial is really about. Apple is on a quest to eliminate Android so that it can exclusively control the tablet and smartphone markets. Samsung was just the most vulnerable enemy and represents an early step in that long-term goal. I don't believe Apple has genuinely earned the right to own exclusivity over tablet and phone computing. And I don't believe it's in the consumer's best interest for Apple to win this trial.
Woz has the right idea
The buzz on the internet and in most office meeting rooms is "Cloud". It seems to have caught on as "the next big thing". But I agree it's nothing more than the "leet speak" of the business world. Reality, though, is the black to the cloud's white. The simple, undeniable fact is that if you move your critical data into the "cloud" all you're doing is adding one more fail point. Not only do you have to worry about the functional reliability of your service provider's storage hardware, you've now added your own Internet connection, and that same provider's, to the list of things that can go wrong. Moving your data to the cloud increases your downtime rather than decreasing it.
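A rough back-of-envelope on the "one more fail point" argument (the availability figures here are purely assumed, just to show the shape of the math). Serial dependencies multiply, so every extra link can only pull the total down:

# The whole chain is up only when every link is up.
local_disk = 0.999            # assumed availability of a local drive or NAS

provider_storage = 0.999      # assumed availability of the cloud provider's storage
provider_network = 0.995      # assumed availability of the provider's connectivity
my_isp = 0.98                 # assumed availability of a rural ISP like the one described above

cloud_chain = provider_storage * provider_network * my_isp
print(f"local copy available: {local_disk:.3%}")
print(f"cloud copy available: {cloud_chain:.3%}")
# roughly 97.4% vs 99.9% -- each extra link in the chain adds downtime, it never removes any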
The cloud is fine for non-essential things like personal e-mail or for social sharing but for anything more important than that it's just snake oil.
Wow. Someone forgot to take his happy-pills this morning. The game isn't that bad. Not what I would call a Great like Diablo 2 was but it's still fun.
I do agree that Blizzard screwed up on an epic scale in regards to pretty much everything concerning the game's launch and management, but the game itself is decent enough. The fact that a fan died while playing is testament to that.
Technology Matte Black
You know, I was just thinking the exact opposite. I actually LIKE the matte black finish and I feel all tech should be available in basic black. I was looking through the phones the other day at Best Buy and I couldn't help but notice how many of them failed so spectacularly at visual aesthetics. It's like cell phone manufacturers have completely lost any sense of style or taste. Not one of them was completely matte black. Either they have a back cover that's a hideously different color than the rest, or they have this god-awful silver band around the outer edge (Apple style). I really can't stress enough just what an eyesore most Apple products are, and it fizzles my brain to see so many phone manufacturers willfully copying Apple's lack of style.
I still think it should be a crime to use the color silver without an art degree or something XD
Another Plasma Fanboy
I'll start by saying I own a Panasonic 58 inch plasma screen, so I am a bit biased. I did a ton of research before I purchased, so I knew what I was getting into.
I think the predicted death of plasma is premature. LCD is indeed cheaper, but black levels are still a problem with LCD screens. Some of the more expensive LCD screens come reasonably close to plasma in black level detail, but by the time you get there you're actually paying more than you would for plasma. OLED and Sony's Crystal-LED both produce excellent black levels that meet or exceed what plasma can do, but both are 10 times as expensive, so again, plasma is still the winner for people who want excellent quality for a reasonable price. It was only a few months ago that I recall seeing Panasonic announce a flicker-free plasma display at 150+ inches, and as a fan of plasma I am definitely interested in seeing larger models come to market.
I love my current Panasonic screen. I use it for my home theater PC. The burn-in that people so often mention when discussing plasma has largely been corrected in the more current models. My set is 2 years old now and there's no visible burn-in despite the fact that I spend hours a day playing games with stationary on-screen interface elements. Just remember to regularly run the anti-image-retention utility that every current plasma set comes with.
I noticed that a lot of you don't seem to see the point of this article. I agree that the article itself didn't touch on this very well. What the writer was trying to compare were processors that have integrated graphics and therefore do not require a discrete graphics card or motherboard-mounted graphics chip. Both Intel and AMD make what AMD calls an APU. The benefit of an APU is that the CPU and graphics don't have to communicate with each other over the PCIe bus and can thus exchange information faster while using a lot less power. This is perfect for very small form factor machines like Mini-ITX and for laptops. Machines like this are ideal for home theater PC setups or for just general office work where heat output and power usage are critical.
I think I saw several people claim that businesses have no interest in graphics. That may technically be true, but businesses DO have a very keen interest in power usage. When you have an office building with 1000 employees all using computers, I can guarantee every watt matters. And that's where APUs really shine. They can deliver the performance of a mid-range discrete GPU at literally half the power usage of a comparable CPU/GPU combination.
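To put the "every watt matters" point in numbers (all figures below are my own assumptions, for illustration only), even a modest per-machine saving adds up fast across a thousand desktops:

machines = 1000
watts_saved_per_machine = 30          # assumed saving from an APU vs a CPU + discrete GPU
hours_per_year = 8 * 5 * 52           # one shift, five days a week
price_per_kwh = 0.12                  # assumed electricity price in USD

kwh_saved = machines * watts_saved_per_machine * hours_per_year / 1000
print(f"energy saved per year: {kwh_saved:,.0f} kWh")
print(f"cost saved per year:   ${kwh_saved * price_per_kwh:,.0f}")
# roughly 62,400 kWh and about $7,500 a year, before even counting the reduced cooling load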
I confess that I myself built my current office machine around AMD's A8-3870. I used an Asus Mini-ITX motherboard with on-board wireless LAN and Bluetooth, and an Antec ISK-300 Mini-ITX case. What I specifically love about this machine is that it costs less to run than a 100 watt light bulb, and when I feel like playing a game I can even run things as current as Skyrim. Not at its maximum detail, naturally, but anything older than 2 years will run just fine maxed out.
So here's my opinion on the whole Intel / AMD comparison. If you're going to build a low-power machine like this and you want the ability to run basic games on it, there's one thing to keep in mind: Intel, in its entire long history, has never managed to produce a graphics driver that didn't suck. Intel does make faster CPUs, but I wouldn't bet a dollar on their ability to make a GPU that will work correctly with every game you throw at it, due to their shoddy driver work.
Then there's longevity to consider. AMD uses a unified driver architecture, so only a minimal interface between the core driver and the hardware has to be created for each new graphics chip. Their core driver works on every OS from Windows XP to Windows 8 and will likely continue to do so for a long while yet. In addition, their driver package supports all their hardware going back at least 5 years. This means you will not only be able to fully utilize your hardware with the upcoming Windows 8, but it's also likely you'll be able to do so in 3 or 4 years when Windows 9 comes out. You won't get that from Intel unless you have a discrete GPU in there as well, and then your power usage and heat output is right back up there.
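For what it's worth, here's a toy sketch of the general "unified driver" pattern (Python, purely illustrative, nothing to do with AMD's actual driver code): one large shared core, and each new chip only supplies a thin hardware-specific backend.

from abc import ABC, abstractmethod

class GpuBackend(ABC):
    """The small per-chip layer: the only part rewritten for new silicon."""
    @abstractmethod
    def upload(self, data: bytes) -> None: ...
    @abstractmethod
    def submit(self, command: str) -> None: ...

class CoreDriver:
    """The big shared layer: OS hooks, API translation, optimizations, etc."""
    def __init__(self, backend: GpuBackend):
        self.backend = backend

    def draw_triangle(self) -> None:
        self.backend.upload(b"\x00\x01\x02")   # placeholder vertex data
        self.backend.submit("DRAW")

class OldChipBackend(GpuBackend):
    def upload(self, data: bytes) -> None: print("old chip: DMA upload,", len(data), "bytes")
    def submit(self, command: str) -> None: print("old chip: ring buffer <-", command)

class NewChipBackend(GpuBackend):
    def upload(self, data: bytes) -> None: print("new chip: unified memory copy,", len(data), "bytes")
    def submit(self, command: str) -> None: print("new chip: queue <-", command)

# The same core driver drives both generations of hardware unchanged.
for backend in (OldChipBackend(), NewChipBackend()):
    CoreDriver(backend).draw_triangle()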
If you want a dedicated gaming machine, go with an Intel CPU and a decent AMD or nVidia graphics card. But if you want a decent office machine that runs cool and quiet while idle but can still deliver some gaming oomph when it's needed, AMD's APU is the clear winner.
I couldn't help but notice how many commenters here seem to think that American authorities treated this case with a "shoot first and ask questions later" approach. To any American, myself included, this illustrates just how little you know about how things work over here. That gung-ho attitude you attribute to American behavior is a myth. I agree that the seizures were completely illegal, but if you think American authorities didn't fully know this before any action was taken then you've got your head buried up your arse. Money changed hands here. Plain and simple. And judging by the fact that they were willing to risk an international incident, I would say it was a LOT of money. Corruption in politics is something of which we have plenty.
Value For Money
I'm of the opinion that television programming providers here in the US are actually on a downward slide that will, at some point in the future, take a sharp dive. Most of my family's older members spend a lot of time watching television, but it seems the younger viewers prefer to watch shows online. I'm rather in the middle on this trend.
I think television providers like Dish Network, DirecTV, Comcast, etc. have forgotten that viewers pay for value, not for programming. As I've gotten older I've formed the unwavering opinion that the rampant commercialization of today's television content has completely drained the value from that content. So much so that I refuse to pay for cable television. There are channels available that are commercial-free (HBO / Showtime / Cinemax), but none of the content providers will sell you access to just a paid channel by itself without some form of basic or extended package also present on the account. This destroys the value completely. And so I refuse to give them any of my money, opting instead to wait until the show hits Blu-ray (if I really like it) or Netflix, which is commercial-free. I'd love it if, say, Dish Network would sell me access to just one premium channel by itself, HBO for example. That would provide genuine value for my money.
I've found it to be an insurmountable challenge to make television providers understand that I am an all-or-nothing type of viewer. Either I am paying for the content and it had better be 100% commercial-free, or it costs me nothing at all to view the content and I would expect it to be ad-supported. Under no circumstances will I watch a show that I have to pay for AND that has commercials in it. I've voted with my wallet by not purchasing cable television, but my small contribution to this cause is like a fly trying to stop a freight train. I wish more people would stand up and make this opinion known to them.
Anyway. I think that older people are just so used to paying for commercial-infected content that they continue to pay blindly for material that should be totally free to them. I also think that as the older generations pass on and the more tech savvy generations are left that TV providers may suddenly find themselves without any customers.
It's good they are re-evaluating their structure. About 4 months ago I finally gave up on OpenSuse and moved to LinuxMint. I changed because of the excessive number of strange bugs I kept running into. Each successive release would fix some bugs but always introduced even more new ones, so the problem just seemed to keep getting worse. The 12.1 release was so bad it was almost unusable on my netbook. LinuxMint isn't perfect, but there are a lot fewer obvious defects than I saw in OpenSuse.
Can anyone confirm for me either that Metro apps run from the start screen are able to run in classic desktop mode by default, or that Microsoft has enabled the ability to run them at resolutions below 1024x768? I want to use this on my netbook, but arbitrary limitations placed on the Metro interface made it impossible to use Windows 8 on my 1024x600 pixel screen.
Most people aren't going to want to spend an extra $500 for a machine that can do the same thing as a cheap netbook but is a little thinner. The CPU in the ultrabook is faster than a netbook CPU, and Intel claims the GPU is DirectX 11 capable, but Intel has been historically unable to produce drivers that can run even basic old games, let alone anything current. So the end result is still a machine that does exactly the same as a cheap $400 laptop or an even cheaper netbook but just weighs less. Worth an extra $500? Not likely!
I'm glad Sony made this decision. I prefer physical media because where I live the fastest internet connection available is 1.5mbps. It takes days to download a 10gb file and PS4 games are likely to be at least that big and probably bigger.
As far as backwards compatibility goes, I would like the PS4 to be backwards compatible but the rumors suggest this is unlikely. The rumors are saying that Sony is switching to an architecture closer to x86 to make coding for the console easier. If this is true it's unlikely the new machine will be able to emulate the cell processor fast enough to make PS3 games work. It could probably emulate PS1 and PS2 games well enough though if they put enough effort into the emulation engine.
It was only the first two or three models of the PS3 that had backwards compatibility with PS2 games. The original 60GB model had hardware support for it and the subsequent 80GB model had software emulation.
I kind of feel the same way. I first learned to program on my C64, starting with BASIC (I wrote a light cycles clone), then moved to inline assembly, then pure assembly, then machine code. Later, when I got my first PC, I picked up C++ (I miss Borland's Turbo C++ compiler), COBOL, and Pascal. Not because I was any sort of whiz at computer languages so much as that they were all close enough that once you learned one, the rest were just a matter of syntax. I got into it because I wanted to learn to write games like the ones I spent countless hours playing on my C64. Sometimes I feel sad that today's youth won't have that same opportunity to get into computers while it was still a hobby small enough to be enjoyable. These days everything is so abstracted that just getting into programming seems like a daunting task.
When you say 8 bit are you referring to the graphics or the CPU? Looking at the list of games, none of them appear to actually have 8-bit graphics so it must be a CPU reference.
Ant Attack: 5 colors counting the black background
Chuckie Egg: 4 colors
Elite: 6 colors
Hitchhiker's Guide: Seriously?
Hungry Horace: 6 colors
Jet Pac: 6 colors
Knights Lore: Cool looking but still only 5 colors
Manic Miner: 6 colors
Phantom Slayer: Definitely 4 color CGA - 2 bit graphics.
Tranz AM: 7 colors
Actual 8-bit graphics had the ability to display up to 256 colors, meaning each pixel was represented by 8 consecutive bits able to store a number from 0 to 255. For this you had to use the 320x200 VGA resolution, "mode 13h", if I recall correctly. Though it's been forever since I did any graphics programming so I can't remember for sure =D. 4-color graphics were CGA and were essentially 2-bit. 16-color graphics were EGA and were 4 bits per pixel. There were also some VGA modes that could only do 16-color graphics, but that was due more to a memory limitation and the VESA spec than the actual bits involved. With 16 bits per pixel you had a palette of up to 65,536 colors, otherwise known as High Color. This was most commonly found on your Super VGA graphics cards.
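The arithmetic behind those color counts is just 2 raised to the bits-per-pixel, something like this little Python snippet:

modes = {
    "CGA (2 bpp)":           2,
    "EGA (4 bpp)":           4,
    "VGA mode 13h (8 bpp)":  8,
    "High Color (16 bpp)":  16,
}
for name, bpp in modes.items():
    print(f"{name:22} -> {2 ** bpp:,} colors")
# 2 bpp -> 4, 4 bpp -> 16, 8 bpp -> 256, 16 bpp -> 65,536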
It just struck me as odd that none of the games you showed here are actually using 8 bit graphics as the title seems to suggest.
People refer to the NES and the Sega Master System as 8-bit but that's a reference to the CPU I suppose because the NES was never capable of more than 4-bit graphics output.
I can't speak for everyone else but I still buy CDs for several reasons.
The CD represents a physical backup of the uncompressed audio that I can convert to any format I happen to need at the time. My music collection is currently in MP3 format. I use that in my car, on my phone, and on my PCs. If something were to go catastrophically wrong with my digital copies (Hard drives and Flash drives do fail unfortunately) I can always fall back onto my CDs to make new digital copies. Also if I decide to change from MP3 to a newer better sounding format I have that option without having to try and convert one lossy format to another. Converting from the original lossless CD always produces the best results.
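Purely as an illustration of that "convert from the lossless master" workflow, here's a minimal sketch (Python, assuming ffmpeg is installed and that the CD rips are sitting in a hypothetical folder as FLAC files; both the paths and the format choice are my assumptions):

import subprocess
from pathlib import Path

rips = Path("~/music/flac-rips").expanduser()   # hypothetical folder of lossless CD rips
out = Path("~/music/mp3").expanduser()
out.mkdir(parents=True, exist_ok=True)

for flac in sorted(rips.glob("*.flac")):
    target = out / (flac.stem + ".mp3")
    # one lossless-to-lossy step; re-encoding an old MP3 instead would stack two lossy passes
    subprocess.run(
        ["ffmpeg", "-i", str(flac), "-codec:a", "libmp3lame", "-qscale:a", "2", str(target)],
        check=True,
    )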
Then there's the issue of control. If my music were linked to an online account of some kind it would mean I never have complete control over my investment. While most companies couldn't get away with anything truly shady, that fact doesn't do much to ensure my purchase would still be usable if the seller went out of business. The recording industry does seem to think it should have some right to control what it's sold to me after the sale. I disagree and I won't put myself in a position where they have the option of denying me my right to listen to the music I've purchased. Even if they promised never to use that power, I just can't accept giving them the power in the first place.
There have been newer physical formats to try and usurp the CD but so far they all seem to have some form of control mechanism in place that gives the seller the power to take my purchase away after the fact. For the most part people know this which is why I think CDs are still selling strong.
I limited my list to movies that were so bad I had genuine trouble staying awake to the end.
Birdemic: shock and terror
Radar Secret Service
Mac and Me
The Horror of Spider Island
Hercules v.s. Karate
A Boy and his Dog
I failed miserably on Mac and Me. I tried twice to watch that pile of steaming rectal discharge but in both cases I passed out and woke up hours later. Good as any tranquilizer.
3D itself is more popular than 10%. According to the MPAA statistics, 21% of box office sales in 2010 were 3D. That's up 91% from 2009. I couldn't find any statistics for 2011. Still, that doesn't include any adjustments for the number of movies that were released without 3D versions. If you remove sales of 2D-only shows from the results, I suspect 3D showings would prove to be quite a lot more popular than 21% when a 3D showing is available. What's not popular is the high price. There are some 3D TVs available for lower cost, but in general 3D televisions do cost a lot more than non-3D televisions. I think that's why they only took 10% of the market. The difference between a $7 movie ticket and a $10 3D movie ticket is nothing to most people, but the difference between $1500 for an HD television and $2500 for the same model with 3D will make most people balk. Not to mention the $100 price tag for each pair of glasses. I personally own a 3D Panasonic plasma television and I love it. So do all of my friends and family who have ever sat and watched a 3D movie with me. It's the cost of entry for the tech that's slowing the uptake, not the tech itself.
I wouldn't expect these ultrabooks to sell very well. I think there are three usage categories that require specific hardware: Basic, Professional, and Gamer. Your basic profile includes things like web surfing, simple photo editing, and maybe older 2d games. You don't need a lot of RAM or a powerful GPU to accomplish these things. The Professional category is where you need a lot of ram and a decent CPU but not necessarily a decent GPU. That would be stuff like photoshop or render work where you don't actually have to have a decent GPU but it helps. And then there's the gamer usage where you absolutely need a decent GPU and RAM, and usually a decent CPU.
My instinct tells me that these ultrabooks will have a decent CPU and will have enough ram to run more than what you would expect from a netbook. But frankly Intel just sucks at GPUs and that means these ultrabooks will fall into that Professional usage category. Your average person will see the "three times higher than a netbook" price tag on these and realize that these machines aren't going to make surfing e-mail much faster than that $300 netbook will. IE "Why pay $1000 for a machine that can do the same things for me that a decent netbook or just a regular cheapo $500 laptop can do?".
Gamers are going to realize right away that they need a decent nVidia or ATI GPU in order to run anything current.
Most professionals will realize that they can find a decent 17 inch laptop with a decent GPU for only a few hundred more than the cost of an Ultrabook. That makes the ultrabook target audience very small: pretty much limited to business owners who need extreme mobility or people who have enough extra cash that they don't mind the added expense. I just don't see these selling a lot unless Intel gives in and drops the price of the ultrabook chips to something close to a netbook chipset.
Aggression vs. Violence
I remember reading, a few months back, about some findings that had shown that playing violent video games did increase aggression for a short period after playing but that said aggression did not translate to any actual violent behaviour.
However I agree with John Carmack. My own experience has been that most violent video games help me to exorcise my anger in a safe environment.
Customer Relations Fail
Looks like I am back to boycotting Ubisoft again. I sent them a long letter last time explaining that I would not purchase their products while they continued the ridiculous Internet connection requirement policy. For a while they appeared to have learned from their mistake but it looks like they are back to the same old crap.
I require an absolute guarantee on the future playability of the game in exchange for my $60. The requirement of an Internet connection to play offline games compromises that guarantee. Since the only way to make them understand this is to hurt them in the pocketbook, my only option is to boycott their games and make sure they are well aware of the fact that they have lost my entertainment money to one of their competitors.
Depends on how you interpret the statement "not every browser does good" in your own mind. When I read that statement I think of "good" as a metaphor for "stable" and "secure", which I wouldn't personally attribute to Internet Explorer, and "compatible", which I wouldn't attribute to either Chrome or Opera.
I do like the business model statements you quoted above. I think more companies than just open source software providers have forgotten these basic truths. Value means everything to the customer. Value is what customers pay for. Video game companies packing their products with ridiculously limiting DRM is what specifically comes to my own mind.
Since Mozilla doesn't sell their software and their business model is based largely on user install base I think this is probably a fairly effective marketing campaign. Consider the alternatives. Should they claim their browser is faster? More secure? When was the last time you actually believed anything at all that came out of the PR department of any corporation? When any company claims their product is better than their competitors I instantly distrust it and I wouldn't be surprised if that were representative of most people's perception of marketing these days.
I kinda like it when a company markets from a philosophical or emotional position. "Who wants to pretend their hand is a gun?" - Sony's Move marketing campaign was awesome.
@US Law In Contradiction
It may appear to be in contradiction but it is actually not. The law states that if you attack someone, you can be charged with assault. The law does not say that you will be prosecuted, only that you can be prosecuted. In cases where an individual's life is in immediate danger, the judge will usually make an exception in regards to prosecution. If someone tries to kill you, and you shoot and kill them in defense, you are still guilty of aggravated assault and manslaughter. You CAN be sent to prison for defending your own life. However the system will usually choose not to prosecute you under the grounds that your life was in immediate danger and that your response was justified. This information is emphasized in most concealed weapons courses. The officer who taught mine was very specific on this matter. If you choose to defend your own life, you have to be absolutely sure you have no other choice in the matter. And that's really the issue here as far as the law is concerned. Can you walk away? Well if a lunatic is chasing after you with an axe then. . .
Look at the old man's situation. Did he really have to strike the kid to get him to shut off his phone? Was his life in immediate danger? He could probably argue before the judge that he was convinced it was but the judge probably won't accept that as justification. He chose to commit assault and it is unlikely that the judge will consider his actions to be justified. Couldn't the old man have simply pointed the kid out to a flight attendant?
When that would-be terrorist was trying to light his shoe on fire the matter was different. It was obvious to the surrounding passengers that something untoward was afoot (Sorry for the bad pun BTW). The terrorist's actions put their lives in immediate danger and being stuck in a plane there was no possibility of escape. Their actions were justified. The old man's weren't.
Not that I like the situation. I personally think the kid deserved a good walloping and if it were my choice I'd let the gent who hit him walk free. I do hope the airline decides to prosecute the kid's parents for his violation of federal law. Maybe just enough to get the parents to properly discipline their child.
Timing Is Indeed Everything
I purchased my 3d TV a month ago on Black Friday. Previously I had been using a 10+ year old rear projection set that only supported component connections and was getting blurrier by the day. When I was shopping for a new set it was apparent that I could get a non-3d set for a lot cheaper than a 3d set. However when I started looking at the quality of the picture, by the time I got to the sets that had a good picture quality, the price difference between 3d and non-3d seemed almost negligible. That and I already owned a PS3 and I wanted to play some of my games in 3d. The place I got my set was running a package deal that included extra glasses plus the wiring plus a decent receiver to handle the device switching so I decided to go with it.
So far I don't regret my decision. I own a couple 3d movies now and I enjoy the 3d games I play :)
I agree with Stucs201. The right questions make all the difference to a survey like this. I also agree with Peter Coffin: the bundle deal with everything included really did help make the decision :)
Ya know, I really miss the side-of-body lens design Sony used in the DSC-FXXX line of cameras. I own an F828 and I continue to love the thing. I was really hoping they would proceed to a DSLR version, but sadly Sony discontinued this line of cameras.
I really wish someone would make a decent side-of-body DSLR to fill the gap left by Sony with this line, because the center-body lens style just feels uncomfortable to me.
Sexy? Not even close!
Hell no! Sometimes I feel like I am the only person left who likes technology to be plain flat black. I absolutely despise laptops with those god-awful silver or grey sections. This machine's color choice makes me nauseous.
God this article was a retard fest. Including the author =D Therefore I have come to join in the pointless poo flinging fun!
Anonymous Coward @ Vista Smista, I fear your well-reasoned argument will fall upon deaf ears here. Join us on our wheel-free train as we jovially paint our fellow man with feces for his internet opinions!
Private vs. Public
I think Michael Dell taking the company private again is an excellent idea. Having worked for a lot of publicly traded companies, I have come to the conclusion that a company's growth and quality really start to decline shortly after it begins trading publicly.
With a publicly traded company, the CEO is designated by the investors holding controlling interest in the company. Said CEO is required to take direction from the company's chief investors. Investors all too often have no real idea what is actually good for the company. They tend to be driven purely by short-term quarterly gains. This leads to the CEO having to do things he or she knows will ultimately harm the company's long-term viability in order to justify his job to the company's chief investors. So every quarter is spent scrambling to meet the projected quarterly gains so that the stock price doesn't fall. While this is not invariably the case, my impression is that it represents the general rule.
Back when Dell was founded, its direction was singular. Michael Dell knew what he needed to do to grow his company. Once the company started trading publicly, even though he is the chief investor, he is still subject to the pressure of having to sacrifice long term profitability in order to keep the stock price from falling. Going private again allows Dell to make the decisions it needs to make to return to its true glory days.
I personally feel this way about all publicly traded companies. It really upset me when Newegg announced their IPO. To me it means the quality of service will start to level off and eventually start its long fall to obscurity.
It's nice to see an article that mirrors my own view on these matters. When I buy music or movies I insist on physical media. I am fine with online distribution for rentals or for streaming, but for purchases, unless it's some cheap $2 expansion or something that I won't really miss, I demand a physical copy. When it comes right down to it, media companies really cannot be trusted long-term. Look at the PlayStation 3 and its Install Other OS feature: the company actually stated that they would not disable the feature in a future firmware release, and then they went and did it anyway a few years later when they deemed the feature to be a potential threat to them. Media companies always do what is in their own best interest regardless of any prior promises to the contrary. They have a long, nasty history of taking actions to protect themselves that ultimately screw the consumer as an "unfortunate" side effect. I keep reading reports that indicate the world is moving to online distribution, but as long as I live I will stand by my reliance on physical media. As long as I take care of my investment it will continue to be there for me years from now.
On a related subject, you also brought up the idea of a new physical media format to replace CDs. In principle this is an excellent idea. The only problem I see with it is greedy media companies' tendency to rely heavily on digital rights management. When the CD standard was ratified, said companies weren't so concerned about "protecting their interests", and so we ended up with a reliable open standard that balances the needs of the provider and the consumer. When I buy a CD I don't have to worry about losing the ability to play it due to some licensing deal done behind locked doors, or conversely a failure to properly license (see Amazon and that fiasco with Orwell's 1984). I don't have to worry about it failing to play at my friend's place because the disc somehow got locked to my own player. I think it would be nice to have a smaller, more modern format that allows for much higher bit rate music. Something like a minidisc-sized CD with the data density of a Blu-ray disc. Unfortunately my instinct tells me any new format the media corps come up with will likely be laden with some ridiculously restrictive DRM that simply destroys the product's long-term value (glaring balefully at you, Ubisoft).
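Just as a back-of-envelope on that "minidisc-sized disc at Blu-ray density" idea (rough figures I'm assuming: a 25GB single-layer Blu-ray on a 12cm disc, a 6.4cm MiniDisc-sized platter, and capacity scaling roughly with disc area, ignoring the hub):

bd_capacity_gb = 25.0      # single-layer Blu-ray, 12 cm disc (assumed)
bd_diameter_cm = 12.0
mini_diameter_cm = 6.4     # MiniDisc-sized platter (assumed)

scaled = bd_capacity_gb * (mini_diameter_cm / bd_diameter_cm) ** 2
print(f"very rough capacity at Blu-ray density: {scaled:.1f} GB")
# around 7 GB -- crude, but plenty of room for lossless audio on a pocket-sized disc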
Younger buyers may initially try online distribution options but as they grow older and get screwed a time or two by some soulless media conglomerate I think they will develop a greater appreciation for physical media distribution. Thus it is my opinion that physical media distribution is here to stay.
Vista continues to be good
I agree completely with the assessment that Vista is good and that vocal detractors have either never tried it or have some misguided preconceived notion as to what a new Microsoft OS should have been. Vista started out rough, but every OS Microsoft has ever released has had the same problems. I remember back when Windows XP was released. You could install a clean copy and within minutes you would start to run into strange problems with the taskbar. I fielded a lot of complaints from my friends and family over that one. That, and drivers were only available for the most current hardware on the market. If your machine was a year or two old you had a high likelihood of having to replace at least one piece of hardware. Since XP's release in 2001, most people have long forgotten the teething troubles that plagued its early adopters.
As far as I can tell, Vista's bad reputation was primarily the fault of the press / media. Almost everywhere I turn, even today, I constantly run into articles that claim Vista to be terrible or refer to it as Windows ME 2. Yet all of my friends and family who use the OS, as well as just about every IT professional I personally know, agree that the now-matured Vista is actually quite good. I wish more media professionals would do first-hand research rather than just quoting the opinions of other media professionals who also didn't bother doing any first-hand research. You did a nice job here, Gavin! It is refreshing to read an article that genuinely reflects my own experiences with Vista.
There appear to be multiple faults with what happened here. If you have a CCW you are only required to inform the police you are carrying if they ask. If they do ask and you fail to tell them, then that is a violation. Still, it is a good idea to inform them up front; it helps to avoid problems. Brandishing a weapon is considered a felony; however, there will likely be some question in the court over whether or not defending your own property with a firearm by flashing the weapon in public actually constitutes brandishing. If he didn't lay a hand on the weapon and did not threaten the safety of the store clerk in any way, then there is a chance he might get away with a slap on the wrist. He's gonna need a good lawyer though.