126 posts • joined Wednesday 14th December 2011 09:54 GMT
This is really missing a picture of the Trigger Happy TV guy with his torso-sized mobile phone.
Have you never seen the difference?
The 3DS is substantially more powerful than the DS, and has much better resolution, even without the 3D feature. To stay competitive Nintendo has to focus on the newer platform rather than the one seeing very little new software.
Put simply, many of the best-selling 3DS games would be very hard to do well on the NDS without giving up a lot of visual quality and looking bad compared to the cheapest smartphones.
Damn, that was a great site. I saw so much stuff I might never have known existed otherwise.
"Some of the biggest and most profitable names on the computing scene – Oracle, IBM and Microsoft – are currently working on relational database management systems."
Odd wording there. It makes it sound as if those companies, which are long-time players in the RDB industry, are just now preparing their first products.
Re: Skywings 3D?
I believe you are correct. Both games ran on the base hardware with nothing special in the cartridge.
Pilotwings started out as a hardware demo and IIRC there was source code in the early Japanese developer kits. We used to get imported Japanese gaming and home computer mags at the company I worked at in the late 80s, and we spent a fair amount of time translating the article where Nintendo made its first official showing of its next generation hardware to the press. What would become Pilotwings was the main demo for the Mode 7 features. This was in 1989, quite a while before the Super Famicom shipped in Japan.
Way Out was a real-time 3D maze on the Atari 800 way back in 1982. Even before then there were some wireframe games on the home computers inspired by Battlezone in the arcades. The original version of Stellar 7 on the Apple ][, IIRC.
Re: the MegaCD, FAIL all the way through
There were some gems in there, that is true. Sega screwed up in not getting more games developed that took good advantage of the hardware. So many were just cartridge games with some FMV bits strapped on or just awful FMV exercises entirely. The people behind Battlecorps also did Soul Star, another showpiece for the hardware features.
The worst thing about the failure of the Sega CD was that it gave Nintendo a scare and caused them to cancel their very promising SNES-CD. This had much better specs and was intended to launch at $200 in the US at a time when the Sega CD listed for $300. In addition to the much deeper palette of the SNES being far better for FMV, the CD add-on was going to have an FX chip built in. This meant any developer could make use of the chip without worrying about the expense, or needing a game with very low ROM usage to offset the cost. With CD it didn't matter how big your game was, the cost was the same. (Unless, of course, it needed more than one disc, but that was usually limited to awful FMV games.)
There were two games ready to go at launch for the SNES-CD. Konami's Xexex was a Gradius-type shooter with polygonal objects. That one was never released in any form. And Square's Secret of Mana was an action RPG with FMV sequences. The FMV was removed so that the game could be released on cartridge and there are places in the game where it is really obvious something expository is missing.
If the SNES-CD had been launched as planned, it could have altered history quite a bit. 3D would have become a major game feature years earlier, and the N64 would probably have been CD based and more competitive, both in terms of software costs and developers already accustomed to working with polygons.
Re: Learning by shipping or just ignoring
The problem is they shipped two beta versions that were installed by millions of people, got tons of negative response and a lot of suggestions on what needed to change. And ignored all of it.
How much negative response did there need to be to tell them they had a problem on their hands? There was certainly enough to clue Sinofsky in that he was never going to lead the company after this debacle.
That is just how it is. Expecting most people to learn more is banging your head against a very hard wall. And basing your estimate of how the transition to a new design will go based on a much higher level of user expertise than found in reality is a huge mistake.
I was able to adjust quickly to Win8 because I was already a fairly expert user on Windows. But I encounter very few users with comparable understanding of the UI outside of IT folks. A vast portion of the user base knows only exactly as much as they need to get by and nothing more, despite how much better their experience could be if a bit of effort were expended in learning.
The really irksome thing is the arrogance. A lot of the major complaints could have been alleviated with just a few bits of configurability and some minor additions. A tutorial, for instance. All the user gets is a screen hinting at the hot corners during the first-time login. This is grossly inadequate. How insignificant a cost would it have been to hire an outside firm to create an interactive tutorial to ship with the final release? During the betas there were over a dozen simple tutorials and cheat sheets in the app store. Just picking the best of those and adding it to the default install would have made a difference.
Re: Promises, promises
Why would they bother? Desktop users with graphic performance as a primary concern have plenty to choose from in video cards with Nvidia and AMD parts.
Intel is much more interested in design wins where power and physical volume are driving factors. The return on investment is far better for enabling better graphics performance with decent battery life in a notebook than for doing anything other than cutting video on the desktop. And as long as the corporate sector is satisfied with Intel's latest, which is still an improvement over the Ivy Bridge GPUs, they will continue to own more desktops than AMD and Nvidia combined by a huge margin. If a cubicle drone can get Skyrim to play decently on his Intel-only box, bonus!
Got it while the getting was good.
Recently B&N had a nice promotion: buy a Nook HD (HD+ 32GB in my case) and get a Nook Simple Touch Reader for free. This was especially nice as I had some B&N gift cards accumulated. The Simple Touch is a nice upgrade over the original Nook I already had, except the touch function comes and goes with no warning, so I'll have to take it in for a reset or replacement. Them's the breaks.
I'm afraid, though, that B&N just won't last much longer. By throwing open the platform to outside software sellers they've given up the strategy that was supposed to let them sell the tablet for less than a competing unit of comparable features that didn't lock you into a single supply channel. B&N has been teetering for a while and this might be a preliminary move toward folding up shop entirely.
Re: How strange
That might be the case if it were true but it isn't. MS and B&N are partners in a joint venture. Microsoft doesn't have any position in B&N itself.
Re: Why won't someone sell me..
There is no difficulty in removing the DRM from EPUB files purchased from the Google Play store. This lets you put them on any device you like. Or you can simply purchase the item and find a torrent for the book in question. Nobody can really complain so long as you paid. For that matter, you can buy Kindle books that have no EPUB version, remove the DRM, then convert them to EPUB using Calibre.
Since it's just data files and not code, it's easy to move to any device you prefer once you find the right tools. I prefer e-ink for reading, too.
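For what it's worth, the conversion step can be scripted. This is just a sketch of driving Calibre's ebook-convert command-line tool over a folder of already-DRM-free Kindle files; the folder name and file extension are illustrative, and it assumes Calibre is installed so ebook-convert is on the PATH.

```python
import subprocess
from pathlib import Path

# Illustrative location for DRM-free Kindle files; adjust to taste.
library = Path.home() / "books"

for book in library.glob("*.azw3"):
    # ebook-convert infers formats from the file extensions given.
    epub = book.with_suffix(".epub")
    subprocess.run(["ebook-convert", str(book), str(epub)], check=True)
```

If no matching files exist the loop simply does nothing, so it is safe to rerun as the library grows.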
So many problems
Who buys a Macbook to use as a full-time Windows machine? Certainly not anyone I've ever met.
All of the BootCamp and Parallels users I've known consider OS X their primary OS and go into Windows only as needed for specific tasks. Which means they have less third party software installed and a much lower exposure to malware. In some cases the Windows install only talks to the outside world to download updates and leads a very sheltered existence compared to a more typical Windows box. Just the fact that the total hours of run time on their Windows install is relatively low means they're pretty much guaranteed to record fewer crashes unless there are some disastrously bad drivers coming out of Apple.
Re: Bazza Will they go all the way then?
For Microsoft to buy RIM would mean running the anti-trust gauntlet. It might have been accepted when hardly anyone had heard of BlackBerry yet and Windows Mobile wasn't in intense competition for the same corporate customers. (Microsoft was once going to acquire Intuit and that fell through. Imagine how different some things would be with that one.) But that time is long, long ago.
WinRT doesn't necessarily need Outlook but it sure as hell needs a more serious communications hub app. They either need to rapidly grow the feature set in the current one or offer a high-end alternative in the store. Cheapskates on an x86 tablet can at least still get Live Mail and any number of third party alternatives.
The problem solves itself if low-end x86 tablets are more popular than ARM, but they need to hedge their bets.
Re: Preferred @JimC
Lotus had a horrible time trying to wrap their heads around the GUI. When the original 128K Mac launched they announced an integrated all-singing, all-dancing package called Jazz that nearly wrecked the company. They poured huge resources into trying to do something truly new on a platform that simply couldn't support it. The 128K Mac simply wasn't a practical product for anything beyond short MacWrite docs and showing the potential of what GUI could become. It wasn't until the 512K model and upgrades appeared that anyone sane would try to run their business on a Mac. By then Lotus had already gone long past deadline learning how memory-hungry a GUI environment is compared to something like DOS. Even character-mapped pseudo-GUIs needed a lot more memory to deal with all of the needed buffering.
By the time most machines had adequate resources, too many of Lotus' developers had been reduced to mere shells of men, and the company never really got its footing in the GUI world, except for acquired crews like the Ami folks.
Back then you didn't buy Windows so much as you bought a software package that used it. It was more of a development environment that got bundled into the product. Like Ventura Publisher was the primary way Digital Research's GEM got on to PCs. You could buy GEM separately but since Ventura Publisher was the main reason to have it, why bother?
The typical PC was so lacking resources back then that it was nuts to go into the GUI for anything less than a strongly visual app that needed it.
Actually, Lotus did.
Lotus Word Pro was descended from Samna's Ami, the first full function Windows word processor. It was out a year before the first Windows version of Word. I first used it as Lotus Ami Pro on Win3.x, on a NEC laptop. Lotus had a very competent set of office apps but by that point couldn't sell eternal youth. IBM bought them almost entirely for Notes. Microsoft even gave one of its Windows Pioneer Awards to the main coder of Ami Pro, not just for the product but also for the excellent feedback he gave the Windows dev team.
My sister still loves Lotus Word Pro with a bizarre passion and goes to great lengths to keep using it. She was a typesetter in her earlier life and something about LWP resonates with her special form of brain damage.
Because doing one thing really well is such a waste.
Versatility is only a virtue when you don't have any exceptional abilities. If I can afford it, I am going to acquire the best gear for the desired application. I could buy one enduro-type motorcycle that is both street legal and can do alright off-road, but two motorcycles, each stronger in its own role, are going to be far more enjoyable.
Dedicated e-readers are a trivial cost for any gainfully employed adult who spends any significant time reading. Having one in addition to a tablet is a minor expense compared to the cost in eye strain of trying to do it all on one device.
Huge missed opportunity.
The first step in such a project would focus on the portions we have some idea of how to do. FTL drives, planet-destroying beams, and much else about a Death Star are currently beyond our knowledge.
But what if we start with the stuff we do have an idea of how to do and work up from there? We may never have a Death Star amusement park in solar orbit but the intent of creating the framework of such an object would be a good D.D. Harriman sort of dodge to kick start an asteroid mining operation. Once you have that, a vast amount of potential is unleashed. (Whoever produced a cost estimate based on boosting all of the mass needed up from the Earth's gravity well really needs to read more on the subject of large scale extra-planetary construction.)
And yet Microsoft provided the File Format Converter free to all users of older versions of Office. Office 2003 users can output to DOCX and XLSX at no cost beyond the time for a simple download and install.
Re: Still plenty of .doc and .xls files around
There is no shortage of solutions.
Is the File Format Converter installed on the old systems? You might have better results outputting to DOCX on the old machines than going through the newer Office. It depends on what was done that isn't being interpreted the same way by the newer version.
Files requiring long term retention are typically not subject to editing. Just the opposite, they need to remain just as they are. So batch outputting them as print jobs to PDF is a good way to store them for the long haul. You'll probably be able to easily find a PDF reader 50 years from now.
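That batch output can be automated. A minimal sketch, assuming LibreOffice is installed (its headless mode can render legacy .doc files to PDF from the command line); the "legacy" and "archive" directory names are purely illustrative.

```python
import subprocess
from pathlib import Path

# Illustrative directories: old documents in, archival PDFs out.
archive = Path("archive")
archive.mkdir(exist_ok=True)

for doc in Path("legacy").glob("*.doc"):
    # soffice --headless renders each file to PDF without opening a GUI.
    subprocess.run(["soffice", "--headless", "--convert-to", "pdf",
                    "--outdir", str(archive), str(doc)], check=True)
```

Pointing the glob at .xls or .ppt files works the same way, which makes this a one-script answer for a whole cabinet of legacy formats.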
Dealing with legacy systems is a good application for virtualisation. If the old software is on a fairly generic old PC, convert the contents of the hard drive to a VHD and make the system accessible within a much newer system. The same VM that works for ancient games will do just as well for an ancient accounting system.
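The disk-to-VHD step can be done with commodity tools. A hedged sketch using qemu-img from the QEMU tools (its vpc output format is the VHD container); the image file name is illustrative, and the raw image itself would come from dd'ing the legacy drive or from a tool like Disk2vhd.

```python
import subprocess
from pathlib import Path

# Illustrative name for a raw image captured from the old machine's drive.
raw = Path("old-pc.img")

if raw.exists():
    # qemu-img's "vpc" output format is the VHD container used by
    # Hyper-V and VirtualBox.
    subprocess.run(["qemu-img", "convert", "-f", "raw", "-O", "vpc",
                    str(raw), str(raw.with_suffix(".vhd"))], check=True)
```

Once the VHD exists, attach it to a VM in your hypervisor of choice and the ancient accounting system boots inside the new machine.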
Compatible where it matters
Word 2013 will open Word 97 files. That is a straightforward, obvious need for Office 2013 to have value to longtime users. Importing and exporting to/from Outlook is a more esoteric operation used by a far smaller set of users, and the need to do it to/from the older formats is a smaller and shrinking subset of that.
If, when looking at an article like this one, you cannot see how it affects you, it very probably doesn't.
Much ado about nothing
I find myself wondering if you understand what importing and exporting mean in this context. It doesn't mean that if someone using Outlook 2013 receives a message with a Word 2000 document attached, they will be unable to save the attachment and open it in Word.
At worst, if you want to export a set of Outlook 2013 contacts to a Word 2003 file, you add an extra step by exporting to DOCX first, then saving as DOC in Word. Wow, that'll collapse the company for sure.
If there are really lots of businesses that will be cripplingly affected by this, it is an opportunity for companies like Aspose to offer a solution that adds the functionality into Outlook 2013. But the slightly roundabout method is free.
But why is this a big deal at all? You say you're in IT and yet you seem to be completely unaware that Microsoft makes a free add-on for older versions of Office to equip them to handle DOCX and XLSX files. It's been around since Office 2007 launched. I have many cheapskate customers in the field, operating on a shoestring, who still install Office 2000 on new workstations, and the File Format Converter is just part of the install procedure.
Find a real problem
Nothing you've described will be affected in the least by the changes in Outlook. All of those files embedded in your Outlook PST would be just as usable after installing Office 2013 as before. You aren't likely to import them to or export them from Outlook. Outlook doesn't much care what is in a binary attachment beyond security warnings. You don't need to convert anything. The most current versions of Word and Excel will still work with those files just fine BECAUSE IT MATTERS IN THOSE APPS AND NOT IN OUTLOOK. They dropped the functionality from Outlook because it hardly mattered and only added to a workload that could be better applied elsewhere.
And converting those files to an open archival format like PDF is a trivial task that can be automated for far less than the costs you suggest.
And how is this MS forcing someone to pay for an upgrade? "We stopped supporting a thing scarcely anyone does." This is neutral at worst. The small number attached to elderly software aren't likely to make the leap anyway and those who are already on more recent versions simply aren't affected by the change and it doesn't factor into whether the new version is attractive.
Re: Don't be at all surprised if...
You know there are such things as batch converters for just this sort of situation. But then it seems most comments are about whining instead of doing something practical.
Silly all or nothing attitude
I know several indie game developers who have full time jobs and solely do the game work on the side. Some of them give away the games and have a PayPal tip jar or other mechanism for donations. Others go for very low prices, typically 99 cent impulse buys. All of them have been pleasantly surprised by the amount of money that has come in.
They aren't getting rich or quitting their jobs to take up making games full time. But having a hobby that pays the mortgage is pretty sweet.
Proteus IV was a neighbor
I'll always have a soft spot for Demon Seed because the movie used the old Thousand Oaks Civic Center as the exterior for the building where Proteus IV supposedly lived. In the theater, my brother and I supposed that if the TOCC, about four miles from our house, was secretly an AI lab, then the house where Julie Christie lived might be very close by as well.
IBM's biggest enemy was IBM
I remember fondly the 1995 CES in Las Vegas. Windows 95 was being heavily promoted but still eight months away from shipping and nearly everyone was making jokes about how late it was. IBM had a booth for its consumer software division. I spent some time there talking to a VP and noted that everything there was for Windows 3.x and none of it for OS/2 Warp. Not true, he said, all of it works in Warp. But only as Windows apps, not natively with any of the strengths OS/2 offers.
He looked at me as if I'd grown horns and opened a third eye. Native OS/2 software? We have to make money at this. There isn't any money in OS/2 software.
This was the man who led IBM's effort to enter the market for games and educational software.
It was even worse in other divisions. They were openly at war with the PC guys, who they perceived as directly undermining their products. Microsoft didn't have to be a very good fighter when its opponent was constantly being attacked by members of its own family.
Another notable incident, from the Comdex about two months earlier at the same Las Vegas Convention Center. IBM proudly rolled out the latest version of its 'cheap' development package for OS/2. Only $600. The Windows SDK from Microsoft? They slipped the CD in your bag if you came within 50 feet of their booth and held still for a few seconds. By the end of that Comdex I had a dozen of those discs.
Microsoft really wanted it, while IBM wasn't sure if they really cared.
Re: Absolute rip off, but not a toy
You're certain of something but they aren't facts.
Atari and Commodore produced models with everything up to the 68040. They were after different market segments. Because Apple was targeting a much higher price point and margins, it could afford to be first out of the gate with the latest from Motorola. But if it was about something you could afford and use, Atari and Commodore had much to offer.
At the time the SE/30 was introduced, Amiga 2000 models with 68020s and hard drives were readily available. The Mac was the better choice if you had something like desktop publishing in mind. But I was working at game developer Cinemaware then and we had a very early Mac II unit. We naturally wanted info on details that would aid in game development for this fast and colorful machine. When I called Apple and explained what we wanted to do I was essentially told "Steve doesn't like games on his computers." My feeling was, "Screw you, too, Steve."
I've avoided Apple products ever since and have never felt I was missing much from outside the RDF.
Is there any better way for the old guard newspapers to accelerate their decline than to discourage linking to their online edition?
These people have their wages paid by advertising, just like any Google employee. But they're oblivious to how things have changed. People no longer reflexively buy a newspaper or go to a newspaper's site. They go to sites of people who they find interesting and follow the links offered there. Discouraging that is suicidal.
Re: clarification for foreigners and the young
It was part of the culture long before then. It goes with the 70s t-shirt that reads, "If you cannot dazzle them with diamonds, baffle them with bullshit."
I had the same reaction. Nintendo seems reluctant to have people look at the unit itself, preferring the focus to be on what it does. I suppose the tiny size and focus on cost reduction didn't leave a lot of room for style exercises.
I also suspect this unit exists because Nintendo is in a similar position to Sega when they introduced the Mega Drive/Genesis. The Master System had been a failure, mostly due to Sega's slowness to comprehend the advantages of Nintendo's publishing model. Sega had a lot of retailers with hard to sell inventory who weren't excited about carrying anything new from Sega. Sega also had the problem of very little third party support (and later a problem of third parties writing for the Genesis but bypassing Sega's publishing infrastructure) to compete with the massive support for Nintendo.
So the Genesis was designed to be backward compatible. The accessory needed to use Master System games was inexpensive because it did little beyond providing the needed connector types. Nearly all of the needed electronics were already in the console. The ploy succeeded in winning over retailers and the Genesis limped along until two things happened. First, Sega had two hugely popular games in EA's Madden and their own Sonic, and second, Nintendo launched the Super Nintendo but couldn't meet demand that first holiday season. A considerable number of consumers settled for a Sega instead and found it was actually pretty good. Not long after, Sega challenged Nintendo's exclusivity contracts with third party publishers in court and got a settlement. Soon third party games were just as numerous for Sega as for Nintendo. (The NEC TurboGrafx-16 / PC Engine was doomed by this same contract arrangement, which kept the best games from Japan from appearing in the US, but NEC wouldn't sue Nintendo because they were a huge customer for NEC Semiconductor.)
Nintendo has a huge volume of awful third party Wii games flooding the retail channel. Early Wii U buyers are going to ignore the bulk of it and retailers are going to be annoyed at Nintendo. So they've come up with a super cheap Wii to appeal to the super cheap people who will settle for bargain bin crap and help clean out the retail channel of the garbage. And also buy a ton more copies of Mario and Zelda at the same time.
Everybody who cared has long since bought the full feature version of the Wii with WiFi and download capability. Just as everyone who cared about GameCube compatibility when there was still a lot of those games cluttering store shelves has long since been served. So, just as the GameCube support was cut out in favor of a price cut, so now has the online capability.
Expect to see GameCube classic like Mario Sunshine and Zelda Wind Waker appearing as DLC purchases on the Wii U sometime in 2013.
The game discs go into the drive slot, just like they always did. The 5.25-inch optical drive is the one component that determines the minimum dimensions for a device that plays Wii discs. As for the rest, I'd want to know if they did a die shrink on the Wii chipset or just cut stuff out, like the WiFi hardware, to reduce the size and cost. A side project to shrink the Wii chipset while developing the Wii U chipset would possibly save some money since much of the same circuitry is getting used in both.
Nintendo has apparently decided that WiiWare, Virtual Console, and other sources of online sales aren't worth the trouble on the old platform. Hopefully, Wii U owners will have updated access to those items without going into Wii Mode, because I strongly suspect Nintendo wants to cut off the old machine's online support ASAP to reduce costs.
I wouldn't be surprised to see some WiiWare compilation discs appearing as an alternative to selling those games to buyers of the Wii Mini. Likewise for Virtual console collections.
Re: the "gravity" of the situation
The planet takes on many tons of new material every day. Those shooting stars you see at night are masses entering the atmosphere. Most of them just become more atmosphere as they aren't big enough to make it to the ground. Astrophysics majors have devoted great amounts of time to finding out whether the planet is slowly getting more massive in a discernible way or whether the bits of atmosphere that constantly wander off make it all even out. This is hugely important for figuring out the likelihood of finding other planets that have the conditions we evolved in and thus might also have something like us living there.
Most of the stuff getting caught in the Earth's gravity well daily isn't big enough to produce an effect you'd catch looking up at night, but it adds up to a considerable amount that would make you very unhappy, briefly, if you had it all in one place and headed toward your house. But it's spread out over a really immense area. Imagine a light rain that leaves just enough moisture to be visible on all the sidewalks for a mile around your home. Then imagine that thin bit of moisture gathered up in one square meter of sidewalk. Splash. The amount of water didn't change but how it is distributed can be the difference between a cool mist and drowning.
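The arithmetic behind that analogy is easy to sketch. The infall figure below is an assumption for illustration only (published estimates of daily meteoric mass vary widely, with tens of tonnes per day being a commonly quoted order of magnitude); the point is how tiny the per-square-metre share becomes.

```python
import math

daily_infall_kg = 50_000        # assumed ~50 tonnes/day, illustrative
earth_radius_m = 6.371e6        # mean Earth radius

# Spread the daily infall over the whole surface of the sphere.
surface_area_m2 = 4 * math.pi * earth_radius_m ** 2
kg_per_m2_per_day = daily_infall_kg / surface_area_m2

print(f"{kg_per_m2_per_day:.1e} kg per square metre per day")
```

That works out to roughly a tenth of a microgram per square metre per day: a cool mist rather than a drowning, exactly as the sidewalk analogy suggests.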
This was all discussed decades ago.
The writer references several authors but doesn't seem to remember the stories very well. I'm not seeing anything in this piece that wasn't covered back in the 70s. The idiotic UN treaty will turn to dust the moment someone stakes a claim and has the means on site to defend it. The cost of engaging in piracy rather than just finding your own rock to work is such that only a fool would fight over any claim.
Re: Boot on the other foot
Yeah, because there would be millions of people anxiously looking to install Windows on machines sold for running Linux.
It's pathetic, really. After twenty years, Linux is still dependent on the Windows hardware market to provide machines to install upon outside of specific markets where the OS has no identity for end users to consider, such as DVRs and other appliances. If there were really a viable market for, say, an Ubuntu tablet, it would be trivial for an OEM like Acer or Asus to make a generic model with no boot loader security active by default. But is there enough market to make it worth their trouble? It is a very different investment from a generic PC line that a fairly small company can produce and support.
Re: Not any non-Apple hardware
NT 4.0 had a version for MIPS back in the 90s. Microsoft was porting to every architecture that might pick up where Intel left off back then. PowerPC, Alpha, and MIPS all had folders in the NT 4 install disc to handle a wide array of possible machines. Windows CE ran on even more architectures, like Hitachi's SH series, as seen in some ports of PC games like Tomb Raider to the Sega Dreamcast. WinCE provided the DirectX APIs to simplify the task.
But Intel surprised itself and kept ramping up more powerful chips, and the status quo continued. Windows 2000 was available for Alpha, and DEC had a very impressive x86 emulator that got faster as it ran a particular app, but they all faded away as the market for x86 alternatives in the desktop sector never extended beyond Apple in any big way.
There was a standard called CHRP intended to make PowerPC systems into a DIY platform comparable to the PC. IBM had a port of OS/2 Warp for it and it appeared there was some hope for diversity. But one of the first things Steve Jobs did on returning to Apple was pull the plug on their participation in CHRP. This was part of killing off the licensing of Mac OS to outside computer makers. Without Apple CHRP was quickly dead.
Re: No Desktop Need to Push Volume
I agree with the first part about sales volumes at the client not being there to drive demand like previous generations.
But you skipped the better part of a decade between the standard appearing and it becoming typical in new desktop systems. At the time gigabit Ethernet showed up, most PCs didn't have a bus that could drive it properly. You had to have the extended PCI slot found only in high-end workstations until PCI-e started replacing PCI in consumer desktops. Until then the biggest value for gigabit in most networks was relieving backbone congestion.
It hasn't been that long since Intel motherboards included third-party controllers for gigabit networking support. Gigabit is now cheap enough that it's used throughout my entire household network, from the router on down to the switch in the entertainment center, although the PS3 is the only item in there that does gigabit. (The Wii U lacks a wired network port entirely, much to my annoyance.)
Gigabit had a reasonable evolution but there is still plenty of 100 Mb gear being sold, especially on the consumer side. It's 100 Mb that had it really easy. Most of the world never dealt with networking before 100 Mb became the rule. In fact, I find I cannot recall ever seeing 10 Mb embedded on a motherboard.
Re: Mixed feelings @ Yet Another Anonymous coward
The Xbox 360 was NOT sold at a profit or even break-even at launch. It didn't really become profitable before attached software sales until the Slim model. That isn't factoring in the RROD problem, which screwed up the business plan considerably compared to the original intent.
An Xbox 360 sale wasn't nearly as big a squirt of red ink as the PS3 at launch, but it did go down in the debit column. Sony had a ridiculously ambitious concept that had to be revised late in the design cycle to include a dedicated GPU rather than having multiple CELL chips assigned graphics tasks at the coder's whim. On top of that, Sony wanted backward compatibility but didn't come up with a practical means to implement it.
On the PS2, the chip that ran PS1 software also performed jobs like reading the controller input and a lot of other very necessary jobs. Thus it was well integrated into the PS2 and didn't add much cost for the portion that was solely needed for PS1 games. On the PS3 the PS2 support was entirely bolted on and completely separate from the PS3's operations, making it pure added cost to have the chips onboard to run PS2 games. If the original multiple CELL concept had survived they might have gone the emulation route and had only a one-time cost for developing the software rather than an added cost to every machine produced.
Sony overcame those problems but it did mean it took a lot longer to start seeing some return on the staggeringly huge investment. Likewise, if Microsoft had only had their intended testing system ready when Xbox 360 production started, the RROD problem would have been detected and fixed far sooner, saving a huge amount of money and damage to the platform's reputation.
Funny thing, Sony sells plenty of Blu-ray decks and a streaming video box that all handle MKV very ably. It feels redundant to have a Blu-ray deck in addition to my PS3 but it handles a bunch of stuff the consoles cannot, solely for lack of the software support. Apparently, the different divisions at Sony have different opinions on the issue.
Re: ... and not only that
At least until Amazon produces a phone. You can already get tablets from Amazon, Barnes and Noble, and others which do not ship with Google Play installed. Already, most phones will let you install the Amazon app store if you want to deny Google any post-sale profits on the phone.
Corporates would have ignored Win8 regardless
Massive companies that are still in the process of eliminating Windows XP dependencies in their array of mission-critical software are in no hurry to have the latest. Getting past the anything-goes era of ignoring the guidelines and producing software that used antiquated installers and techniques that threatened to trash the registry is a big leap for these outfits after decades of being able to ignore such things. Never mind that it led to huge costs in unreliability that could have been avoided if vendors had been required to follow the XP guidelines published back in 2001. They painted themselves into this corner and, once free, won't be in any hurry for the next change.
Windows 8 could have been completely non-controversial: all under-the-hood improvements and little change to the user-facing portion. And it still wouldn't matter to the corporates who have only recently gotten to Win7 or are still in the process. If Apple had a big footprint in the corporate market it would face a similar problem with its frequent OS releases. At the least, it would have to retain backward compatibility much longer than it does now. Being shut out of that market gives Apple the freedom to do things that serve its other markets better.
It's pretty obvious how Windows RT will be a profit maker even if Microsoft finds it has to pretty much give it away to compete. Every bit of software for those systems, outside of corporate sideloading, comes through the Windows Store, and Microsoft takes a nice slice of that. It's staking out a middle ground between Apple, where you can only buy hardware from them and only get software through them, and Google, whose OS might not even be clearly named on a device like the Kindle Fire and may carry its own app store rather than use Google's.
It appears that if Microsoft accepted the new normal and sold the OS for a token amount in favor of focusing on the app store for profits, it could give Google a serious problem. So long as this is an OS installed solely by OEMs, it shouldn't affect sales of the Windows 8 version. But it may make sense to cut the price a good deal if the app store proves a money maker there as well.
Change is hard and uncomfortable but it has to happen now and then. The hard part is that there is no going back: once the price tag of the OS is drastically reduced it cannot revert. Over the years, if you consider inflation and the massive growth of the feature set, Windows has gotten a lot cheaper. A consumer PC license has always been available for around $100. In the early days this meant getting DOS and Windows 3.x for $50 if you bought them together. Later, it meant an Upgrade package of the latest consumer edition for around $90, or the same price for an OEM full-install version if you built your own machine.
$100 is a much smaller amount for the product than it was in 1992. The product does vastly more and the money is greatly devalued. (In the crudest terms, $100 in 1992 is equivalent to $164.87 today.) This would be fine if everybody were selling OSes as retail items or tying them exclusively to their own brand of hardware. But they aren't. Android is out there and cannot be ignored on devices like tablets the way Linux could be on desktop PCs.
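The comparison is just compound-interest arithmetic. As a sketch (the ~2.4% average annual rate and the 1992–2013 span are my assumptions, picked to roughly reproduce the $164.87 figure; a real calculation would use the actual CPI series):

```python
def adjust_for_inflation(amount, annual_rate, years):
    """Compound an amount forward by an assumed average annual inflation rate."""
    return amount * (1 + annual_rate) ** years

# Assumed: ~2.4% average annual inflation over the 21 years from 1992 to 2013.
adjusted = adjust_for_inflation(100, 0.024, 21)
print(f"${adjusted:.2f}")  # in the neighborhood of $165
```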
So the business model must change. Finding the point between too much and too little is the trick.
Re: @Destroy All Monsters
Elections have consequences. You cannot just move on without consideration for what you're about to tread in. If the last four years are any indication, hip waders will be recommended.
Re: Out of interest ...
Power management is a big one. Intel has a team that works with Microsoft and others on this, and you can see that there are two different HALs that can be installed on a new system depending on the CPU brand. This is the deep-down stuff the vast majority of coders never have to concern themselves with.
A few years ago there was a big problem for owners of certain brands of machine when a Windows Service Pack was installed and left the machine DOA. This happened because the OEM thought it was being clever and used a factory-prep image that had both HALs visible in the system directory, something that would never happen in a normal install; it seemed more efficient to have a single image for both AMD and Intel systems. The Service Pack installer assumed the first HAL it saw was a correct indicator of what kind of machine was being updated, and trying to use the wrong HAL meant the machine died during OS boot.
Intelligence can be highly selective.
It's astounding that someone as smart as Brin, in specific areas at least, could be so incredibly naive about human nature and politics. A political party is merely a convenient form of signage to indicate to the public what side of an issue a politician is likely to take.
Say the politicians do give up any claim to a party alignment. Will they no longer stand for anything or have any beliefs? Won't those who closely watch such people, and the general public, soon slap labels on them to group them by their voting habits?
Language is how we convey ideas, and putting a label on something is how we avoid repeating the same lengthy explanation over and over unto exhaustion. It's far easier to say 'He leans left/right,' so long as there is a mutual understanding of what is meant.
Re: I wonder if it can be done.
IBM had the stomach. What they didn't have was the customers to buy PowerPC desktop/laptop systems in the kind of volume needed to justify the capital investment to stay on par with Intel. They needed far more business than Apple could supply, and nobody else with the needed cash was inclined to jump in. IBM had already strangled its own effort in its crib years earlier.
Without the right numbers no amount of intestinal fortitude would make it a business worth pursuing.
Re: I'm sure this is true, in some shape or form. But that's not the point.
The instruction set is hardly a problem. Intel has been dealing with converting that to something more effective on the fly since the 486. Most of the performance gains of the 486 were a direct result of that new-found freedom to do what they wanted internally. The criticism then was that this translation stage was terribly expensive compared to a native RISC chip. But Moore's Law took care of that, as Intel had known it would all along. The number of transistors needed for the translation stage became trivial in a few years.
The problem is the huge difference in performance levels between the ARM and Intel products. Low power is not a mystery if performance is not an issue. ARM chose to forego desktop level performance a long time ago in pursuit of mobile and embedded markets. (The original Archimedes PC was quite competitive with the high end PCs of that era.) It was an effective strategy that carried a cost as their market now overlaps with Intel's, just as Intel's choices did very well for them but limits their entry into where the growth is today.
Re: It has entertained this idea
Don't forget NT 4 and Windows 2000 for almost every CPU then vying for the workstation market. DEC Alpha, MIPS, PowerPC. And the myriad processors WinCE was offered on.
There was a strong belief that Intel was going to hit a wall and that you needed to be ready to have your OS sold on whatever took over the market. Even Intel tested the waters with the 80860 long before the Itanium. IBM had OS/2 Warp for the PowerPC but killed it after deciding the desktop market wasn't going to happen for non-Apple PPC machines. Perhaps things would have gone differently if Steve Jobs hadn't killed CHRP as one of his first acts upon returning to lead the company.
In both previous architecture shifts the new CPU family was a major boost over what the previous machines had. For instance, at the time the PowerPC 601 was first shipped (although it was not intended to be a production CPU and had been built as a proof of concept) it was the fastest microprocessor BYTE magazine had ever tested. A major jump for that era.
The PowerPC models Apple used also had the advantage of a dedicated bit of transistor real estate to do some of the conversions that would otherwise have added a LOT of overhead to emulating the Moto 68K family.
When it was decided to go to Intel the problem was that Apple alone wasn't a big enough customer for IBM to commit the resources in producing competitive desktop CPUs at a pace to match Intel. The profit margins were tiny compared to IBM mainframes but the level of capital required was greater. Apple had no interest in encouraging an open market for non-Apple PowerPC desktops, so something had to change. By the time the first Intel Macs shipped the last round of CPUs IBM produced for Apple were getting a bit dated by the standards of the PC industry. Once again, the new architecture had plenty of spare horsepower to handle the emulation problem.
Unless ARM's upcoming 64-bit product line offers an unprecedented level of performance gain, at least an order of magnitude over current ARM designs, there is just no way to have a third relatively easy transition. Something has to give: either Apple dumps its professional users and allows its desktops to age into uselessness, or it once again offers some hardware means to ease the transition.
I'd expect the former to occur as those users have become a very small portion of Apple's revenue picture. The cachet of holding that market is no longer as useful as it once was now that they've become so strong in the consumer end.
Another possibility is that Apple continues on x86 but dumps Intel. AMD is currently circling the drain but has some very valuable IP. Apple could buy the lot using the change under the cushions of the couch in Tim Cook's office. One question would be whether to retain the ATI portion of the company or spin it off into an independent company again.