There was a time, not so long ago, when integrated graphics were so feeble they couldn’t pull the metaphorical skin off a rice pudding. Broadly speaking, the integrated graphics processor (IGP) was fit for little more than the two-dimensional Windows desktop, and a graphics card was necessary if you wanted to play games. Intel …
It would be nice to see the article updated after re-testing with a Phenom II, which is much more similar to the Intel processor used.
You indicated that using an Athlon X2 4850e would save approximately 20 watts compared to using an Athlon 64 X2 7750 processor. The 4850e is rated at a maximum power consumption of 45 Watts compared to the 95 Watts of the 7750. This would appear to save around 50 Watts.
The effect would be that the Athlon X2 4850e with the ATI 3300 integrated graphics would draw around 90 Watts under load and less when idle.
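The arithmetic behind this comment can be sketched out as follows. Note this compares rated TDPs only, which is not the same as measured wall-socket draw; the variable names and figures are taken from the comment above.

```python
# Rated TDP figures quoted in the comment (maximum power ratings, in watts).
tdp_4850e = 45  # Athlon X2 4850e, energy-efficient part
tdp_7750 = 95   # Athlon 64 X2 7750

# Difference in rated TDP between the two chips.
tdp_delta = tdp_7750 - tdp_4850e
print(tdp_delta)  # 50 W on paper

# TDP is a thermal design ceiling, not typical consumption, so the
# saving actually measured at the mains can be much smaller.
```

This is why a paper comparison of TDPs (50W apart) can coexist with a measured saving of only about 20W, as the author reports below.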
You also have another error in that the specific MSI motherboard with ATI 3300 integrated graphics includes 1 GB of SidePort graphics memory, which will increase power usage.
I considered using a Phenom II for this feature - I'm writing the review for El Reg right now - but shied away on the grounds of cost. The 3GHz Phenom II X4 940 costs £220, which puts it up against the 2.66GHz Core 2 Quad Q9400, and I'm not sure that either processor is necessary for a PC that has integrated graphics.
Athlon X2 4850e power
I have previously used the X2 4850e in testing for other reviews and those figures are the basis for my observation that the power saving is 20W and that the performance hit is 20 percent.
These are my own measured figures and are not manufacturer's figures.
As for SidePort, I have seen some bold claims for this on-board graphics memory but have never seen any measurable effect, either in terms of performance or in terms of power draw.
That may be due at least in part to my practice of rounding power figures to the nearest 5W.
I can understand the charts!!!
Finally you took the crayons away from the monkey. More of these please, I can actually see which product is fastest.
Desktop integrated graphics shoot-out
I still hope AMD/ATI and Nvidia could come up with an ultra-low-wattage chipset capable of quad displays, with mixed digital/analogue/HDMI outputs, for low-cost built-in motherboards, so the whole world would be able to afford the multi-display gaming luxury at home.
@ Desktop integrated graphics shoot-out
But of course, ATI & nVidia long for the day when they can give away the farm cheap instead of having quad display gamers buy a video card. Perish the thought of paying more for more features.
About the CPU, these boards really should've been fitted with the slowest CPUs that weren't completely castrated with less than 1MB of L2 cache, otherwise it's a bit silly talking about power consumption, and someone cheaping out on the video tends to do the same on the processor. Regardless, at least we had some assurance that the processor (never mind the memory bus) itself wasn't a bottleneck for the video benchmarks.
Please help this idiot (me)
Um... My brain is a little slow today... which of these motherboards would be best for building a cheapish, cool and QUIET media centre PC? Or would a PlayStation 3 (at 250 GBP) be better for this task?
Cometh the AMD fanboyz....
Is it me or are the only whiners here the AMD fanboyz?
I don't really care about speed, but I need to know the maximum resolution supported. Can I, for example, connect two 30" LCDs to an Intel board with two DVI connectors? I do need to know which resolutions the chipset supports.
SiS and other stuff...
If you're looking for a motherboard that uses an SiS chipset, the Intel D201GLY2/2A (with embedded 1.3GHz Celeron CPU) would be one choice. I got one and was really pretty impressed with what it could do, especially considering the limited processing power it has.
Integrated graphics are OK in my book, even if they don't perform all that well. It bothers me that the graphics options from nVidia and ATI either have huge heatsinks or cook themselves into an early grave. I've never lost an integrated graphics chipset...compared to a few ATI and nVidia boards that just got too hot.