38 Comments
SiliconDoc - Monday, July 28, 2008 - link
Very glad this happens: "The good news is that ASUS has replicated several of our problems and we expect a new BIOS release shortly for use in the motherboard review."

That's what I call a useful review that isn't a waste of time. Glad you have the reputation and the pull. (One wonders what they do at Acer - I guess they wait for you guys to tell them...)
piroroadkill - Sunday, May 11, 2008 - link
I agree with the guys who are saying they need to make discrete (not discreet, jesus) GPUs consume much less power when idling, even if that means a HybridPower-style segmentation of the GPU, but it should be done all in hardware, completely transparent to the chipset and system.

KGR - Saturday, May 10, 2008 - link
Maybe Hybrid SLI doesn't help frame rates too much, but it can make sense when NVIDIA integrates the Ageia physics in the GPU; then the mGPU can take the load of physics and the dGPU the graphics in hybrid mode. I don't know if it is possible, but I think it is...

duploxxx - Thursday, May 8, 2008 - link
Always like to read the reviews and comments from your site, but why don't you just provide proof with real data instead of a shot in the dark? You already have big parts of the data in another review (http://www.anandtech.com/video/showdoc.aspx?i=3232...).
" Whether or not this price tag is worth the premium over the nForce 750a SLI boards is up for debate. It's not really in our opinion as we do not believe the current AMD processor series is capable of the required computational power needed to support 3-way SLI or Quad SLI configurations. This is not a knock against NVIDIA as AMD has the same problem with Quad CrossFire; it just reflects the current state of the processor offerings from AMD."
Why don't you just put a 9750-9850 + 790FX + 2/3-way CrossFire against a Q6600/Q9300 + X38 + 2/3-way and compare total price/performance/power? Perhaps you'd need to add an X48 board, given the lack of PCI-E lanes on the X38.

You could do the same with a lower-spec P35, but then again that board has no decent feature set against current AMD chipset offerings when you talk about multi-GPU setups. It would still be interesting to read what happens when using CF on this board against the AMD 770.
gipper - Wednesday, May 7, 2008 - link
So why isn't the 750a going to be the perfect HTPC motherboard?

Aren't the two requirements for the perfect HTPC motherboard native 1080p output via HDMI and 7.1 LPCM audio on the same HDMI connector? Also, the post-processing with a Phenom matches the AMD 780 chipset feature set.
So, I don't understand why you would say that the AMD 780 is better for HTPCs.
Or are you guys suggesting that it's best to wait for the AMD 780 refresh that includes 7.1 LPCM because the integrated graphics perform so much better?
Gary Key - Thursday, May 8, 2008 - link
I personally believe the 750a would make an excellent HTPC system if you use an ATX case design, might plan on gaming with a discrete video card, and can afford it. The GF8300 board that just arrived is a better solution at first glance (if a uATX design and not having SLI capability is important) and compares favorably to the 780G from a price-to-performance viewpoint, more so than the GF8200. We will have an update on it next week.

We received the 175.16 drivers right after the article went live and will have some post-processing comparisons (the 174.14s did not handle this right) this weekend between the two chipsets. Right now, it is a toss-up in my opinion, and due to that fact, I would go NV for the multi-channel LPCM.
The Jedi - Wednesday, May 7, 2008 - link
A couple of points here:

[QUOTE]
This is absolutely unacceptable and would prevent us from recommending the 780a as anything more than just another SLI motherboard. HybridPower is quite possibly the best feature for a high-end SLI user and if it won't work with 30" displays then its usefulness is severely degraded.
[/QUOTE]
I'll tell ya I use a 26" LCD TV on my desktop and it's big enough. I don't need 2560x1600. 1080p (1200p) is fine and matches the pixels on HDTV. Anything 1080p capable is completely reasonable. Just up the AA or AF if the FPS are too fast for ya. Just because Dell or Apple says Uber-users need a 30-inch LCD to be cool doesn't make it true. 24", 26", 27", these are great on today's desks. I really think a 30" LCD on my desk would be too big.
Gary Key, you da man, seriously, but proofread the article for typos.
Last point, and this goes for all of Anandtech's staff: Respect due, but seriously: NO "dGPU". Call it A VIDEO CARD. Or -- A GRAPHICS CARD. Also, no calling a product from a CPU or GPU company a 'part'. Call it a chip -- or a CPU, or a graphics chip, etc.
strikeback03 - Thursday, May 8, 2008 - link
I'd love to see a 2560x1600 24-26" display; the more resolution the better. If that 9-megapixel LCD weren't several thousand dollars it would be sweet.

Wolfcastle - Wednesday, May 7, 2008 - link
The author should clean up the grammar a bit. Anandtech has a large audience.

James5mith - Wednesday, May 7, 2008 - link
Maybe I'm just foolish here, but for the extreme overclocking crowd, I see an immediate and tangible benefit:

If you happen to fry your video card while OC'ing it, you can use the onboard video as a stopgap until you get it repaired.
SiliconDoc - Wednesday, May 7, 2008 - link
Maybe I'm the only one, but I'm so sick of every new PC component having a global warming psychotic power consumption "feature set", as if any of us end users actually give a d-a- you know what.

Heck, maybe I'm a lone gunman here, but it really makes me sick, over and over again, as if I'd buy their piece of crap because they have some wattage bean counter going gaga about their lower power requirements.
Hey, here's an idea. STOP GAMING, lower yer rezz, use a tiny 13 inch monitor, and put monitor sleep mode to kick on in one minute.
Better yet, shut your system off, you're wasting the earth, and get outside for heat from the sun or put on a wool sweater, or dunk ter head in the creek if you're too warm.
Who are they fooling? They're pushing 1,000-watt PSUs, then we have to hear this beany watt-counter crud. Yeah, right after the Q6600, 4 HDs, 2 DVDs, memory card readers, dual monitor outputs, ohhh... and make sure you got a 700-watt-plus supergigajiggawatt or she might not run.....
I for one would just like to say, to no one and nobody in particular, go take a green time out.
PS- this article is no more or less green than any other, so it isn't a target. I guess it's clear this is a power surge and perhaps an overload. Well, good!
Donkey2008 - Wednesday, May 7, 2008 - link
You are absolutely right, especially the application of this technology to notebooks, which is pure insanity. Why would I care if my laptop could switch from discrete to integrated GPU to save battery power and provide me another hour or so of use? I am trying to destroy the earth, so I want as little battery life as possible so I can plug it in and use more resources.

As for desktops, those crazy tree-huggers want you to use less power so that your systems run more efficiently and PUT OUT LESS HEAT. This would be a complete waste for those who dropped several hundred dollars on water-cooling and giant, ridiculous, circus-clown heatsinks. This isn't even mentioning the enviro-psychos who like to use their computer as a floor heater in winter.
How about you take your finger out of your nose because it is clearly in too far and blocking your brain from releasing any common sense.
SiliconDoc - Wednesday, May 7, 2008 - link
Why stop at that? You need the wind-up power notebook, like the ones selling to the 3rd world. No plugging in and no charging any battery except by turning the crank handle.

If you're gaming on a battery, it's not just your finger up your nose, but likely your neighbors' as well, to hold it up so high. Where are you that you cannot plug in ... up in that airplane ... saving all that jet fuel? ... or did you drive your Yugo to some way-out park to hack, away from civilization, also an energy saver, no doubt. Have fun disposing of the polluting battery, too.
Desktops: If your system is putting out so much heat that you need to run a refrigerator to "cool just the system down", you certainly are not "saving" any power either... DUH.

Gigantic heatsinks (and their gargantuan fans) are power-hungry users trying to crank out the last bit of MHz, often with voltage increases, huh ... DUH. Maybe the jet engine they replaced was a cheap sleeve bearing, but they didn't "save power".
Not sure exactly what the donkey you were trying to say, since you didn't make any sense, but then, that's what it's all about, huh. Preening your green self while wildly flailing about and praising the gigantic power (savings ? lol ) drain you are, anyway - while firing up the 250 watt triple 3d sli maxxed super whomper game.
I think if you had any common sense, you'd "get it".
The Jedi - Wednesday, May 7, 2008 - link
Jigga-WHAAAT?!

zander55 - Wednesday, May 7, 2008 - link
Why is ESA only available on the highest-end model? NVIDIA wants the industry to adopt it and implement it in their hardware, but won't even put it into their own stuff?
I don't understand why so many pages and charts are devoted to pure performance for motherboards. Unless there is a physical flaw or bad drivers, the performance difference between these motherboards is normally next to nil!

I understand stability, overclocking, and power consumption. But looking at these charts, a lot of them show minuscule differences that can often be explained by settings, other components, or bad drivers. I am not saying bench testing is not useful. But I don't think it is necessary to view dozens of charts with little or no difference. In fact, it would make more sense to go into detail where there is a significant difference. I think your attention to detail gets the best of you :D
My .02
In general I do think you guys do awesome work.
wjl - Tuesday, May 6, 2008 - link
Right. The benchmarks are not that interesting, and also which IGP runs which game at how many fps more or less is pretty uninteresting - as if the world had only gamers.

As much as I like the image quality provided by Nvidia products, they're still a no-go if you want open source drivers - and here there is much room for improvement. I won't buy (nor sell) any of them unless they follow the good examples of Intel and ATI/AMD.
So my next motherboard - which will definitely have an IGP again - will be from one of the other mentioned makers, depending on whether I need an AMD or an Intel CPU next time.
strikeback03 - Thursday, May 8, 2008 - link
I have to use the restricted drivers on both my desktop (discrete NVIDIA) and laptop (discrete ATi) in Ubuntu.

And I've never understood the point of windows that wobble.
sprockkets - Tuesday, May 6, 2008 - link
Tru, I love not having to install any drivers for compiz-fusion on my Intel G31 system. It actually runs it better than my 6150 AMD system.

But under load, with movies and Compiz and other graphics stuff running, the 6150 doesn't crap out as much.
Good chipset; waiting for Intel's version. I have been an AMD person for a long time, but for $70 a 2GHz Pentium Allendale works great for me.

WTB a gen 2 Shuttle XPC in silver with either the G45 or Intel's. A 3GHz Wolfdale will do nicely.
wjl - Wednesday, May 7, 2008 - link
BTW: I tried movies (MythTV) together with Compiz, and that really didn't look nice, even on my 6150/430 Nvidia. Only after switching off most or all desktop effects did the picture become more stable...

wjl - Wednesday, May 7, 2008 - link
I tried a 2.6GHz Wolfdale (E8200) with Intel's G35, and it's an improvement already - though for "serious" HTPC usage, I would probably wait for the G45, which should be out this summer.

Sure, Intel chipsets are not flawless, and neither are their drivers. But Intel and AMD are moving in the right direction, and I wish this would be honoured more when comparison tests like the one here are performed.
The world isn't only Windows and gamers - wake up, guys. Take the Phoronix test suite if you have to compare and show numbers. I think even this test suite is GPL'ed, so...
Anyway: the ATI/AMD 690G (RS690) will now work with 3D using only open source drivers - and it's news like this that is really important for the rest of us, not which newest chipset gets a few frames per second more or less, which is really ONLY interesting for first-person shooters.
Natfly - Tuesday, May 6, 2008 - link
quote: "HyperTransport 3.0 capability (5.2GT/s+ interface) is included and is important in getting the most out of the 780a graphics core. With a Phenom onboard, the 780a will perform post-processing on high-definition content and it makes a difference in image quality and fluidity during 1080p playback."

How important is HT3 for the IGP? Is 1080p content watchable without it?
Also, is there an equivalent to AMD's sideport memory that may show up in some 780a/8200 boards?
derek85 - Tuesday, May 6, 2008 - link
HT3 is most important when you watch interlaced content (1080i), because the extra HD HQV features require a lot more bandwidth than normal 1080p. Theoretically 1080p should be watchable without HT3, but this largely depends on the K8 model you get.

I'm not sure about a sideport equivalent from NVIDIA; I haven't heard anything related to it, and I highly doubt they will be able to come up with one, because that requires modification of their existing blocks, which they probably won't bother to spend the time on. If you really want that, just get an AMD board ;)
Natfly - Tuesday, May 6, 2008 - link
Well, I was planning on getting a 4850e and have recently been trying to decide between the 780G and 8200. I'd like to get the best IGP performance and also have RAID5 without using any extra cards, but that seems impossible at this point. Maybe a manufacturer will pair up the 780G with the SB750 when it gets released.

derek85 - Thursday, May 8, 2008 - link
If you want to max out 3D performance, HT3 is the way to go. HT1 can provide a maximum of 8GB/s of bandwidth, while HT3 at 1800MHz can provide 14.4GB/s (dual-channel DDR2-800 is 12.8GB/s). The actual improvement this brings in benchmarks such as 3DMark06 is quite significant (>20%), but nonetheless it is still an IGP, so whether you would like to invest more in it is totally up to you.
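Those figures check out if you treat HyperTransport as a 16-bit DDR link counted across both directions - a quick sketch of the arithmetic (the link width and the both-directions counting convention are my assumptions, not vendor specs):

[code]
# Back-of-the-envelope bandwidth math, assuming a 16-bit DDR
# HyperTransport link counted in both directions, and
# 2 x 64-bit channels for dual-channel DDR2.

def ht_bandwidth_gbs(clock_mhz, width_bits=16, directions=2):
    # DDR means two transfers per clock cycle
    return clock_mhz * 1e6 * 2 * (width_bits / 8) * directions / 1e9

def ddr2_bandwidth_gbs(mts, channels=2, width_bits=64):
    return mts * 1e6 * (width_bits / 8) * channels / 1e9

print(ht_bandwidth_gbs(1000))   # HT1 at 1000MHz -> 8.0 GB/s
print(ht_bandwidth_gbs(1800))   # HT3 at 1800MHz -> 14.4 GB/s
print(ddr2_bandwidth_gbs(800))  # dual-channel DDR2-800 -> 12.8 GB/s
[/code]

Von Matrices - Tuesday, May 6, 2008 - link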
Is my PC at fault or does anyone else notice the horrible compression of the charts on page 6?JarredWalton - Tuesday, May 6, 2008 - link
Fixed... Gary changed the chart sizes but didn't update the HTML (where a smaller width and height was hard-coded). Shame on him. I have had him flogged with a Cat-o-nine-SATA-cords.

Mgz - Tuesday, May 6, 2008 - link
On page 4 you have a little typo: "we can't really be sure until NVIDI confirms the details"
I appreciate the effort by Nvidia to reduce idle power consumption, but I would much rather see a discrete GPU that doesn't draw so much power when idling in the first place. ATI has been making significant strides in this department lately with PowerPlay, and EVERY motherboard/configuration benefits. Having two GPUs with redundant framebuffers is going around your elbow to get to your ******* if you ask me.

ChrisRay - Tuesday, May 6, 2008 - link
HomerDog, not sure I entirely understand your problem with HybridPower. It's basically a technology that lets you shut off your discrete GPUs completely. No amount of power-saving tech is going to have that measure of impact. ((Or system noise impact.))

You're right that every motherboard benefits from power-saving tech on discrete GPUs. But the difference in power saving from a feature like HybridPower is huge compared to any idle technology existing on GPUs. Browsing from my desktop with HybridPower enabled and Quad SLI 9800GX2s, my average room temp went down 4-5C after 2 hours of web activity. That's significant.
SLIZONE Forum Admin.
Nvidia User Group
homerdog - Tuesday, May 6, 2008 - link
Don't get me wrong, HybridPower is a cool feature that I will consider when I'm making my next motherboard/GPU purchase.

However, the fact remains that the HD3K cards have a significantly larger delta between their idle and load power consumption figures than the current crop of Nvidia cards. If ATI continues to build on this trend they may not even need a complex mGPU/dGPU hybrid solution to get idle consumption down to near-IGP levels, although they're probably working on one anyway.
JarredWalton - Tuesday, May 6, 2008 - link
Now we just need HybridPower in laptops - where it should have been first, IMO! At the very least, HybridPower should have shipped with support for the 8800 GT/GTS 512 and 9600 cards rather than just the 9800 GTX/GX2.

Also, my two cents on GeForce Boost: hooray for an extra 20% over 20FPS. That sounds fine, until you look at the bigger picture. A GeForce 8400 GS or 8500 GT is terribly slow relative to most discrete GPUs. Sure, they cost $40 to $70 depending on model and features. An extra 20% performance (or even 50%) would be fine. However, a $75 8600 GT is already about twice as fast, and a 9600 GT (available for $110-$120 with rebates) isn't even on the same continent.
If you have an IGP motherboard and you think it's too slow for games, I seriously doubt you're going to want to spend $50 to roughly double the performance. As any mathematician can tell you, multiplying any real number by zero is still zero. It may not be quite that bad, but I'd say a 9600 GT with HybridPower support is what people should shoot for. I figure that will arrive some time in the near future. Then just wait for it to show up on Intel platforms.
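To put rough numbers on that (the frame rates here are purely illustrative assumptions, not benchmark results):

[code]
# Hypothetical fps figures, for illustration only
g8400gs = 20              # assumed 8400 GS frame rate in some game
boost = 1.20              # the ~20% GeForce Boost uplift mentioned above

print(g8400gs * boost)    # 24.0 fps with Boost enabled
print(g8400gs * 2.0)      # ~40 fps from an 8600 GT, no Boost needed
[/code]

Twenty percent of slow is still slow; the mid-range card gets there without any hybrid trickery.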
FITCamaro - Tuesday, May 6, 2008 - link
While I agree with you, I think this is a great idea. An onboard GPU is always going to use less power than a discrete one. The main issue I'm concerned with is: does the system get back the memory used by the onboard GPU when the discrete GPU is in use? Granted, it's likely only going to use 64-128MB of RAM, maybe 256. But still, those are resources that aren't able to be used by games.

Of course, it doesn't really matter for most since it only supports the 9800GTX and 9800GX2, and, in my opinion, you'd have to be stupid to go with the 9800GTX when the 8800GTS 512MB offers nearly identical performance. Heck, even the 8800GT 512MB is only about 5 FPS different.
They need to offer HybridPower support across the entire 8x00 series.
BansheeX - Tuesday, May 6, 2008 - link
Who cares about the Phenom? Where is the Intel variant, aka 730i? Another three-month delay for that one? Sigh.

FITCamaro - Tuesday, May 6, 2008 - link
People who want a Phenom.

DigitalFreak - Wednesday, May 7, 2008 - link
Those mythical people exist?

KnightProdigy - Thursday, May 8, 2008 - link
There are a lot of AMD fans. AMD still has a lot of loyal followers; maybe you forget that AMD had the speed crown for many more years than Intel. I have been an NV fan since it was STB in the early 90s, and I, for one, like the fact that they are offering similar solutions, even though they lag a little.

Gary Key - Tuesday, May 6, 2008 - link
We expect to see the Intel mGPU variants this summer, just in time to compete with the G45.