AMD and ATI are officially tying the knot after living together in corporate sin for nearly three years. The company's formerly disparate processor and graphics businesses will merge into one amidst a major reorganization for AMD that will spawn four new operating units. AMD says the goal is to streamline dreaming up chip …
Hope it works!
I've always used AMD chips from way back in the K5 days for my homebuilds, and pushed AMD in the office when Opteron came out, but recently they have fallen behind Intel. I think a strong AMD will be good for both companies, as it fosters competition and innovation to the benefit of us customers. Here's hoping the new structure helps them get back head-to-head with Intel.
But ATi have really burnt their bridges as far as I am concerned. I used to use nothing but ATi graphics cards by choice, but the last two years' worth of drivers have been awful. Even our hp laptops, which mainly have ATi Mobile graphics, haven't proven immune to graphics bugs, to the point where exasperation made me tell the hp rep we only wanted future kit to have nVidia graphics. All my recent homebuilds for myself, friends and family have been with nVidia cards, and I don't see that changing.
Resistance is futile...
AMD and ATI have been assimilated by the Borg
Ahh I see
So they intend to merge two units into four. Yep, coming together by dividing themselves in half, and your annoying drop down menu will now have four categories instead of two.
Seriously though, all this sounds like is code for "we're going to fire lots of people".
I'm a fan of ATI graphics cards because I've never had much trouble with their drivers. Maybe that makes me somewhat unique, but I don't think so.
Thing is, I'm not sure I can continue to buy video cards from what seems to be transforming into a smaller, less financially secure Intel. Intel chipsets are okay, but they're hardly good enough to play top-end games at decent resolutions with all the fancy stuff turned on.
Processor companies in general don't inspire confidence when it comes to buying a gaming video card. That's why it made more sense to keep ATI and AMD separate. The branding issue will cause them problems. I don't see many others willing to buy an AMD video card any more than they would an Intel video card. The only winner I see from this will be nVidia, which is a shame really.
Does this mean we'll start seeing AMD moving away from GPUs and start working on proper general-purpose processors? DEATH TO THE GRAPHICS CARD!
ATi on board, on the mainboard
Perhaps now FPUs will get that long-awaited technological kick up the bahookie?
General processor on graphics chip?
If you have a many-core (16+) graphics processor that can be used for other processing needs, why not use one of those cores as the computer's general/central processor and save on a separate chip/socket/cache/fan?
You could slow or stop the unused cores when the temperature got too high or the battery got too low.
I expect the reason is the lack of a decent interpreter to get the x86 OSes working on them. See OpenCL. As I understand it, that's the problem.
E.g., try installing Windows on an UltraSPARC. I dare you.
If anyone would like to correct me, feel free.
Anyway, I'm hoping that ATI, er, AMD, finally pull their finger out and start putting physics processing on board. Tried the Cryostasis tech demo on my computer the other night [Q6600, 8GB RAM, XFX 4850 XXX 512MB]..
...Four frames per second. Except in the areas where physics wasn't being used [such as when looking at walls, etc] and bang, straight up to 50fps.
What a load of toss!
Paris, as I expect a razor sharp mind like hers is very interested in the GPGPU question as well.
I'd still rather have......
An AMD Processor and an nvidia graphics card in my machine.
@General processor on graphics chip?
GPUs are too specialist: essentially just uber maths co-processors geared around GFX.
But for general tasks, even ignoring the lack of x86 instructions, GPUs simply don't have the grunt or capabilities that a standard CPU has. An OS running on a GPU would run like Vista on a 200MHz 486!
Other maths-intensive tasks can be done by GPUs, though, such as protein analysis (Folding@home via CUDA on nVidia, as one example) and, more recently, hardware physics acceleration via PhysX on recent nVidia cards (8, 9, and 200 series). I believe ATI/AMD are also working on something similar for their cards now as well.
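For anyone curious what the CUDA/Folding@home point above actually means in practice: GPGPU frameworks express work as a tiny "kernel" function that the hardware runs once per data element, across thousands of threads at once. Here's a minimal plain-Python sketch of that model (the function names are mine, not CUDA's, and this runs on the CPU; a real GPU launch needs the CUDA or OpenCL toolkits):

```python
# CPU-side sketch of the data-parallel "kernel" model that CUDA and
# OpenCL expose: one small function applied at every thread index.
# Illustrative only -- no actual GPU is involved here.

def launch_kernel(kernel, n, *args):
    """Mimic a GPU grid launch: call `kernel` once per thread index 0..n-1."""
    return [kernel(i, *args) for i in range(n)]

def saxpy(i, a, x, y):
    """The classic GPGPU 'hello world': out[i] = a * x[i] + y[i]."""
    return a * x[i] + y[i]

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
print(launch_kernel(saxpy, len(x), 2.0, x, y))  # [12.0, 24.0, 36.0]
```

Uniform number-crunching like this maps perfectly onto a GPU's thousands of maths units, which is exactly why folding and physics work well there while branch-heavy general OS code doesn't.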
"But for general tasks, even ignoring the lack of x86 commands, GPU's simply don't have the grunt or capabilities that a standard CPU has. An OS running on a GPU would run like Vista on a 200MHz 486!"
They made 200MHz 486 chips? That's news to me.
/Blue man because research is a wonderful thing