AMD is the best
vi runs faster on AMD than Intel.
Sometimes, experts don't know what they're talking about. Just ask AMD, which over the past 12 months has had to endure second-guessing from financial analysts and industry pundits about slumping market share, a string of financial losses, and costly distribution and design gaffes. On Thursday, AMD's top brass traveled to New …
If they're complaining about experts, that's one thing, but when they start to question "conventional wisdom" - which we all know is simply an anachronistic term for the "wisdom of the crowds" - they risk calling down the wrath of Wales.
No, I don't mean the Principality, but rather the Jimmy. Having read some of the tales of how the Outlaw Jimmy Wales can come down on one, I'd certainly think at least twice before "dissing" conventional wisdom, or even besmirching the honour thereof.
they don't have to replace many dodgy Opterons...
because there *aren't* many Opterons!
AMD's just jealous because they realized that cutting corners on computer processor design is not going to win them points like it used to.
You get what you pay for...
And they divided and got the wrong answer. All the analysts want to see is $$$$, nothing else. If you can't make money, they don't want to hear about it (unless you were a dot-com business in 2000, but I digress).
They will never learn about us "west coast" types until they shed their "suits".
Especially in a home PC, and to be blunt, even in an office environment. There are few applications that can harness the power of dual core, let alone quad core, so what is all the fuss about, really?
In gaming terms their ownership of ATI will (alright, SHOULD) help to integrate the motherboard, CPU and graphics card, letting each do the work it should be doing (I've always thought of graphics cards as floating point units that were under-utilized by the programmers).
But I'm not a technical type of person, so this all may be just bum smoke. My last two cards have been from ATI on an Intel-based PC; however, with many games reportedly being "better" (read: more stable) on the Nvidia chipset, it might be time for a change.
It does sound reasonably reasonable though right?
Yes, tech experts can be wrong and certainly narrow-minded, but for instance I know of one popular stock analyst who doesn't like AMD just because the CEO is Hector Ruiz. Wall Street can be just as narrow-minded as any group of "experts". All in all, if they return to profitability, I think they will need to do it without Wall Street's help.
Otellini weeps over the AMD thing? Ummm... Intel was a company with massive financial resources and many divisions that made money, not just processors. In fact, last I heard, during the "dark years" Intel was making money hand over fist on flash technologies. Let's not forget I/O processors and all the other yummy stuff Intel makes. The only reason the stock was impacted is because the "experts" mentioned in this article were too narrow-minded to understand that there's more to Intel than just the products which compete with AMD.
That being said, a lot of the "experts" kept telling me that Intel CPUs were a dead end and that AMD was the way to go. I kept responding: what's more likely?
1) Intel is just sitting on their big round butts figuring that they'll catch up to AMD someday?
2) Intel is working on something big, and instead of divulging all the technology breakthroughs they're making so AMD can copy them before Intel comes to market, they've decided to take it in the butt for a year or two, since they aren't 100% dependent on the CPU division for survival.
The fact is, Intel was never in any danger. They may have lost a small chunk of market share to AMD, but even Intel has to admit that's a good thing; the money lost on that is made up by five fewer FTC investigations a month.
All that said... my personal opinion isn't that ATI was overpriced at $5.6 billion. If anything, it came cheap. It was AMD's inability to capitalize on the purchase. After all, if Intel, the marketing and business superpower, can't make graphics tech buyouts work, how the heck does AMD think it could? Intel has bought at least four graphics chip companies I can think of, and it all came to nothing in the end. If AMD was smart, they'd focus more on running ATI as one company and AMD as another.
"AMD's just jealous because they realized that cutting corners on computer processor design is not going to win them points like it used to." (AC)
Yes and no. I wouldn't call it "cutting corners", but it's not too far off either. From what I hear, you could say Intel leads at physics and AMD at logic. Now that Intel has stepped off the broken Netburst architecture (picked up the good ideas of AMD? or just went back and thought about the P3 again, with new wisdom about what is definitely not the way to go (the P4)?), it has probably become harder for AMD to offset Intel's "physical advantage" with better logic.
"In gaming terms their ownership of ATI will (alright, SHOULD) help to integrate the motherboard, CPU and graphics card, letting each do the work it should be doing (I've always thought of graphics cards as floating point units that were under-utilized by the programmers)." (Daniel Winstone)
Yeah, AMD may want/have to get its money from products extending beyond the plain CPU (bus architectures, integrated solutions).
On the second part I can't agree: graphics chips aren't good FPUs. They may be excellent stream processors, but even there they eat bucketloads of power. As soon as code has slightly more complex branching (the kind that actually needs the if, not replaceable by bitmasks), a GPU tends to shoot itself in the foot and go on a break to recover. Yes, I may have a bias against using GPUs for every task that can somehow be shoehorned into them. Maybe that will fade over time, but probably not before they also start thinking about efficiency.
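For the curious, the "replaceable by bitmasks" trick mentioned above can be sketched in a few lines: a per-element if (the pattern that makes GPU threads diverge) rewritten as pure mask arithmetic. This is just an illustrative sketch in plain Python; the function names are made up, and on real GPU hardware the same idea shows up as predication or select instructions.

```python
# Branchless selection: replacing a per-element `if` with bitmask
# arithmetic, the kind of rewrite that keeps stream code from diverging.

def select_branchy(cond, a, b):
    """Per-element `if`: the pattern that makes GPU warps diverge."""
    return [x if c else y for c, x, y in zip(cond, a, b)]

def select_branchless(cond, a, b):
    """Same result via masks: mask is all-ones (-1) when cond holds,
    all-zeros otherwise, so exactly one operand survives the OR."""
    out = []
    for c, x, y in zip(cond, a, b):
        mask = -int(bool(c))              # -1 (all bits set) or 0
        out.append((x & mask) | (y & ~mask))
    return out

cond = [1, 0, 1, 0]
a = [10, 20, 30, 40]
b = [1, 2, 3, 4]
assert select_branchy(cond, a, b) == select_branchless(cond, a, b)
```

Branching that can be flattened this way stays fast on a GPU; branching that genuinely needs different code paths per element is where the commenter's complaint bites.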
Maybe they should ask the monopoly that is Intel to bundle their chips with their motherboards.
I'd suggest that the problem with multi-core processors is in what the OS can do.
Like everything else this year, the business is tainted by Windows Vista.
I'm still using XP, and that's taking 650 megabytes of RAM just to read The Register.
It'd be nice if I could afford all that extra CPU power. I do things that could use it. But, right now, the tools to use it are still trailing the chains of Redmond, like a cheap knockoff of Marley's Ghost.
There are loads of applications which can take advantage of multi-core, just not many on the desktop. All the major databases, for example, benefit from extra cores. This began in the early 90s for the likes of Sybase and Oracle (they could do it before that, but there were scaling issues which were mostly overcome in the early 90s).
It's only the Windows/desktop centric world that doesn't know how to take advantage of them.
As for AMD a couple of clients insisted on using them for servers and desktops and, to my surprise, the results have been great. Overall they were cheaper than Intel and worked as well.
I agree on the quad core issue. On laptops the big performance problem is the hard drive. On PCs and servers the issue is memory (and, to a lesser degree, hard drive performance as well).
Intel et al have been pushing developers to write more threaded code, but all that developers have done is write more bloatware. Multi-core CPUs are a waste at the moment. Intel (& AMD) would be better off getting better memory technologies.
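For what it's worth, the "threaded code" the chip makers keep asking for mostly boils down to this shape: split an embarrassingly parallel job into per-core chunks and merge the partial results. A minimal sketch in Python (names are illustrative; note that CPython's GIL keeps threads from actually speeding up pure-Python arithmetic, so this shows the decomposition pattern rather than a real speedup):

```python
# Fan chunks of work out to a pool of workers and combine the results;
# the structure is what multi-core CPUs need from application code.
from concurrent.futures import ThreadPoolExecutor

def chunked_sum(numbers, workers=4):
    """Sum `numbers` by splitting it into roughly per-worker chunks."""
    size = max(1, len(numbers) // workers)
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each worker sums one chunk; the partial sums are then merged.
        return sum(pool.map(sum, chunks))

assert chunked_sum(list(range(1001))) == 500500
```

A database kernel or a game engine does the same thing with heavier tasks per chunk; "bloatware" happens when threads are added without work that decomposes like this.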
If it were me, I'd probably get depressed, give up and close AMD's doors.
Although then you'd end up paying huge amounts for processors from Intel.
Ways for AMD to up their share price
1. Rename themselves as AMD 2.0
2. Stop selling processors (that's so last millennium)
3. Start a social networking website (for anyone; it doesn't matter which group they target)
4. Sit back and watch the share price go through the roof.
Financial experts tend to use fixed models for making predictions. There are very few financial experts in the way there are IT experts, who are expected to have diverse knowledge of computing around their core skills as well as some sort of higher education in a related subject. Financial experts come out of uni with a high grade in a generic degree and are then trained in the processes of the company. There are many examples of financial experts fucking up when the situation they are modelling does not conform to the rules of the modelling process.
"2. Stop selling Processors (that's so last millenium)" ... or also push XXXXClueSieve Algorithms allied to Processors and Information Processing for Future IntelAIgents, Chris? ......... for that Naked Short Sell advantage.
It is QuITe telling that re Barcelona we continue to hear " all anyone wants to talk about is an erratum in Barcelona chips that AMD confirmed last week. It was discovered late in its development ..." and "And when asked what exactly had gone wrong with Barcelona, Meyer would only say that the erratum, as microprocessor bugs are generally called, manifests itself in only obscure conditions. "This is not a very good forum to conduct a design engineering post mortem," he said, declining to elaborate." and ""We blew it and we're very humbled by it and we're going to learn from it and not do that again," ....... and yet who is yet to explain what went wrong.
Some dirty little Private Secret, swept under the carpet, is always still there to be pulled out again if AMD try to lead the Markets again/with gains again. MeThinks someone thinks they have AMD over a barrel and/or AMD thinks that to be so and that suggests that it is personal and political.... but only in the heads of those who think it to be so and it plagues them and effectively would render them impotent as competition.
couldn't agree more -- or would, if cOuLd, but never can quite "say..."say, what?" goodbye to a society, needless to say.
BArcelona -- the agony and the ecstasy -- Drogba scores again, jackets for gOalposts, spirit of 66... ah. aArdvArk
You have lost money every quarter for more than a year. Your purchase of ATI is still costing you money. Your market valuation is smaller than a single quarter of revenue for Intel. You invested a sum equivalent to your current market worth in a new fab that isn't online yet. Your 65nm process isn't fully baked yet. You're having trouble getting your Phenom and Opteron products to market. Benchmarks for your Opteron products are being withdrawn because no one can verify them with real products. Intel's Core2 wiped the floor with you. Intel's Quad core chips at 45nm use a High-K material that cuts leakage allowing higher speed with lower heat thus negating any advantage you previously had in this department. Core2 let Intel catch up in pure performance terms in one leap. Penryn is here running an enhanced Core2 on 45nm and Nehalem is coming within 12 months, yet you're still working on your response to Core2 at 65nm.
OK, you're right Hector, we should ignore all of that and place our faith in your bluster.
Nah, maybe not. We have a word in the UK to handle people like this.
Yep, that's the word, and Hector, you're fitting it pretty well right now. Stop trying to ignore reality and start addressing it.
Ok, this is scary. Either amanfrommars really had something important to say or I have been reading his posts too long, but I understood that. I think I need to go lie down...
AMD makes damn good products in my opinion. They've done pretty well when you consider how small they are compared to Intel.
Intel has FAR more money they can throw at R&D and testing and so forth. You gotta admire a company like AMD really. They really are David taking on Goliath ... or chimpzilla taking on chipzilla, etc.
I hope they shock the computer world in the next year or so with some huge innovations they are working on now. Something that will make their chips dominate vs. Intel chips on benchmarks for 2-3 years.
Maybe AMD has some Einsteins in the company that can make it happen. That would be a great story in the history of business.
I'd like to see AMD have a big comeback like Apple has with their iPods and iPhones. Maybe AMD needs a charismatic leader like Steve Jobs to do it ... a football coach kind of guy who is good with the tech press and good at inspiring the employees to be their best, etc. Hopefully Dirk Meyer is going to be like that when he becomes CEO later on.
You gotta have that optimal combination of well engineered, innovative, high performing products, good marketing and savvy PR, you know ... you need that "cool factor" and all that.
It is a sad story. Once Intel was the monopolist, and then along came AMD. AMD went from underdog to price dog, but is still nothing more than a company. AMD has delivered processors for many years and always kept up with the pace - until now. So what is the problem? The problem is that putting 4 cores on a chip die is a *fucking* milestone in the computer industry! The challenge is to produce something that scales over all 4 corners and sides of a die. This is something Intel has not yet achieved. Intel has merely glued together two 2-cores and cowardly calls it "the first 4-core processor", knowing exactly what the challenge here is. You need to see the geometric, two-dimensional symmetry as well as understand how electrons travel through a chip. The complexity of the job is what makes it so difficult as well as challenging to produce. You do not win this challenge by gluing two 2-cores together. The performance potential that lies in a good chip design has literally gained another dimension.
So what is the damage here? There are hardly any Barcelonas available; the chip simply does not exist yet. All that got damaged were people's expectations. Do we still make news out of this? And the few chips that did get sold run with all four cores - and a patch.
AMD is obviously up for the challenge. Even their execs feel challenged by stupid experts and wish nothing more than to correct them. To me they are the last heroes in an economic world where everyone is stepping back or avoiding confrontation because there may be a rational explanation for everything - there is not.
Is anyone here waiting for Intel to step out of the background and help out with showing the world the first "real" 4-core processor? After they have given us the "first" already? I do not think so.
Biting the hand that feeds IT © 1998–2018