45 Comments

  • mdma35 - Friday, October 9, 2009 - link

    Epic article, it was a pleasure to read. Thanks for such informative stuff.
  • jamstan - Sunday, July 13, 2008 - link

    I just did a build with an E8500. The temp always shows 30 degrees no matter how high I overclock it or what speed I have my Vantec Tornado at. Being an overclocker, it stinks that I bought a CPU with a temp sensor that doesn't work. I guess it's a common problem with this CPU, and I hear Intel won't RMA a CPU with a bad sensor. I'm gonna be giving them a call.
  • Johnbear007 - Saturday, March 8, 2008 - link

    I'd still like to know which retailer(s) other than Microcenter are carrying the Q6600 for "under $200". I would much rather have a sub-$200 Q6600 than a $260 E8400 from Mwave.
  • MrSpadge - Thursday, March 6, 2008 - link

    I do not agree with much of mindless1's critique on page 3, but we arrive at a somewhat similar conclusion: the section "The Truth About Processor 'Degradation'" is lacking. Rather than addressing my issues with mindless1's post, I'll just explain my point.

    Showing the influence of temperature on reliability is all well and good, but you neglect the factor that is by far the most important: voltage. Its effect on reliability / expected lifetime / MTTF is much greater than that of temperature (within sane limits).

    How did you generate the curves in the first plot on that page? Is it just a guess, or do you have exact data? Since you mention the E8500 specifically, I can imagine that you got the data (or formula) from some insider. If so, I'd be curious what these curves look like if you apply e.g. 1.45 V. There should be a drastic reduction in lifetime.

    If you don't think voltage is that important, and you have no way to adjust the calculations, you could PM dmens here at AT. I'd say he's expert enough in this field.
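
    To illustrate, here is a minimal sketch of the kind of model I have in mind, assuming a standard Arrhenius term for temperature and an exponential (E-model) term for voltage. The activation energy, voltage acceleration factor, and reference operating point below are placeholders I picked for illustration, not Intel's actual reliability data:

        import math

        K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K
        EA_EV = 0.7                # activation energy, eV (assumed placeholder)
        GAMMA_PER_V = 30.0         # voltage acceleration factor, 1/V (assumed placeholder)

        def relative_lifetime(temp_c, vcore, ref_temp_c=60.0, ref_vcore=1.2125):
            """Expected lifetime relative to the assumed reference temperature/voltage."""
            t, t_ref = temp_c + 273.15, ref_temp_c + 273.15
            temp_factor = math.exp((EA_EV / K_BOLTZMANN_EV) * (1.0 / t - 1.0 / t_ref))
            volt_factor = math.exp(-GAMMA_PER_V * (vcore - ref_vcore))
            return temp_factor * volt_factor

        print(relative_lifetime(75, 1.2125))  # +15 C at stock volts: ~0.35x lifetime
        print(relative_lifetime(60, 1.45))    # 1.45 V at stock temp: ~0.0008x lifetime

    With constants anywhere near these, the voltage term dwarfs the temperature term, which is exactly why I'd like to see the article's curves recomputed at 1.45 V.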

    MrS
  • Toferman - Thursday, March 6, 2008 - link

    Another great article, thanks for your work on this Kris. :)
  • xkon - Thursday, March 6, 2008 - link

    Where are the sub-$200 Q6600s? I know Microcenter had some for $200, but they are nowhere near me. Any other ones? Stating it in the article like that makes me think they are available at almost any retailer for that price. Maybe it should be rephrased to something like "they have been known to be priced as low as $200". Then again, maybe I'm not in the know and am just not looking hard enough.
  • TheJian - Thursday, March 6, 2008 - link

    Yet another example of lies. The cheapest Q6600 on pricewatch is $243. And that doesn't come with a 3yr warranty OR a heatsink. So really the cheapest is $253 for retail box with heatsink/fan and 3yr. That's a FAR cry from $200. Cheapest on Cnet.com is $255. Where did they search to find these magical $200 Q6600 chips? I want one. I suspect pricegrabber etc would show the same. I'm too lazy to check now...LOL
  • MaulSidious - Thursday, March 6, 2008 - link

    Dunno about America, but in Britain you can get a Q6600 anywhere for 130-150 pounds.
  • Johnbear007 - Thursday, March 6, 2008 - link

    150 pounds is about $250-300 American, which is nowhere near what the article's author is claiming. One Microcenter deal doesn't really justify claiming you can bag one from retailer(s) for under $200. Also, another poster pointed to what he called a Q6700 for $80. That is not true; it was an E6700, which is dual core, not quad.
  • Karaktu - Wednesday, March 5, 2008 - link

    I would just like to point out that it has been possible to run a sub-90-watt maximum HTPC for nearly two years. In fact, I've been doing it.

    It DOES require a Core Duo or Core 2 Duo mobile chip, but MoD isn't a new concept.

    ASUS N4L-VM DH
    - Using onboard Intel graphics, Realtek SPDIF and Gigabit network
    Core Duo T2500 (2.0GHz)
    - Cooled by a Noctua NC-U6 northbridge cooler and 60mm fan set to low
    2 x 1GB DDR2 667
    Vista View D1N1-E NTSC/ATSC PCI-E tuner
    Vista View D1N1-I NTSC/ATSC PCI tuner
    - (That's two analog and two HDTV tuners)
    1TB WDC GP 5400rpm hard drive
    750GB Samsung Spinpoint F1 7200rpm hard drive
    Antec Fusion case (rev 1)
    - VFD
    - 430-watt 80 Plus power supply
    - 2 x 120mm TriCool fans set to low
    - External IR for remote and keyboard
    Running MCE 2005

    Idles at 68 watts AT THE WALL and draws a maximum of 90 watts at full load (recording 4 shows and watching a fifth show/movie).

    If I ever get around to dropping the PSU to an EA-380, I'm sure the efficiency would go up a little, since I would be closer to that magic 20-80% load range on the power supply.
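
    As a rough check (a sketch only; PSU ratings are DC output, so I assume ~80% conversion efficiency to turn wall draw into DC load):

        # Where the idle draw sits relative to the PSU's rated output.
        def load_pct(wall_watts, psu_rating_watts, efficiency=0.80):
            dc_watts = wall_watts * efficiency  # ratings are DC output, not wall draw
            return 100.0 * dc_watts / psu_rating_watts

        print(round(load_pct(68, 430), 1))  # ~12.7% of the 430-watt unit at idle
        print(round(load_pct(68, 380), 1))  # ~14.3% of an EA-380, closer to 20%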

    Joe
  • chizow - Wednesday, March 5, 2008 - link

    Only had an issue with this statement:

    quote:

    Intel has also worked hard to make all of this performance affordable. Many US retailers now stock the 65nm Q6600 quad-core CPU at less than $200, which places it squarely in the 45nm dual-core price range - something to think about as you make your next purchasing decision. However, if it comes down to the choice between a 65nm and 45nm CPU we would pick the latter every time - they are just that good. The only question now is exactly when Intel will decide to start shipping in volume.


    While this may be true for those building a new system around a new CPU, it might not hold true for those looking to overclock on an existing board. These 45nm CPUs, with their higher stock 1333MHz FSB, will by necessity use lower multipliers, which essentially eats into potential FSB overclocks on FSB-limited chipsets. Considering many NV 6-series chipsets and boards will not support Penryn *at all*, this is a very real consideration for those looking to upgrade to something faster.
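
    To put rough numbers on that (a quick sketch; the 12x part is hypothetical, just to show the multiplier effect):

        # Final clock = multiplier x FSB, and the board caps the FSB, not the CPU.
        def fsb_needed_mhz(target_ghz, multiplier):
            return target_ghz * 1000.0 / multiplier

        # Targeting 4.0 GHz with the E8400's 9x multiplier (stock 9 x 333 = 3.0 GHz):
        print(fsb_needed_mhz(4.0, 9))   # ~444 MHz FSB required
        # A hypothetical chip with a 12x multiplier would only need:
        print(fsb_needed_mhz(4.0, 12))  # ~333 MHz FSB, fine even on older boards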

    Given my experiences with NV 6-series chipsets compared to reviews, I'm not overly optimistic about Penryn results on the 7-series either. Curious as to which board you tested these 45nm chips with? I haven't kept up with P35/X38 capabilities, but the screenshot you showed had you dropping the multiplier, which is a feature I thought was limited to NV chipsets. I might have missed it in the article, but clarification would be appreciated.

  • TheJian - Thursday, March 6, 2008 - link

    I have a problem with this part of that statement "Intel has also worked hard to make all of this performance affordable." They forget it was AMD who forced Intel to cut margins on cpus from 62% (I think that was their high a few years back) to a meager 48% (if memory serves) and their profits to tank 60% in some quarters while driving AMD into the ground. Do they think they were doing it for our sakes? NOT. It was to kill AMD (and it worked). WE just got LUCKY.

    How much INTEL butt kissing can you do in one article? Notice that since AMD has sucked, Intel is starting to do an ABOUT FACE on pricing. Server chips saw an increase of about 20% a month or so ago (it was written about everywhere). Now we see that the E8400, which was $209 on Newegg just a few weeks ago and IN STOCK, is said to go for $250 if you believe AnandTech. Even Newegg now has the future chips priced at $239 for when they come back in stock. That's a $30 increase! What is that, 14% or so? I missed the first E8400s and thought the price would go down; instead Intel restricts volume and causes a price hike to soak us since AMD sucks now. I hope AMD puts out a decent chip shortly (a 3GHz Phenom would be a good start) so we can stop the price hikes.

    What's funny to me is the reviewers let Intel get away with pricing in a review that comes nowhere near what the product ACTUALLY debuts for. They've done the same for Nvidia too (not just Anand, but others as well). The cards always end up being $50 more than MSRP. This screws competitors, because we all wait for said cheap CPU/GPU and end up not buying a very good alternative at the time (on the basis of false pricing from Intel/Nvidia). They should just start saying in reviews, "but expect $50 more at debut than they claim, because they always LIE to get you to not buy the competitor's product". That would at least be more accurate. For the record, I just bought an 8800GT and will buy a Wolfdale in a few weeks. :) My problem here is the reviewers not calling them out on this crap. Why is that?
  • mindless1 - Wednesday, March 5, 2008 - link

    Yes, you are right that the higher default FSB works against them; it would be better if a 45nm Pentium or Celeron started with a lower FSB so the chipsets had enough headroom for a good overclock.

    NV is not the only one that can drop the multiplier. I've not tried it on X38, but I have on P35, and I see no reason why a motherboard manufacturer would drop such a desirable feature unless the board was simply barren of overclocking features, something with an OEM-limited BIOS perhaps.
  • Psynaut - Wednesday, March 5, 2008 - link

    It took me a minute to figure out that they were talking about the chips that were released 6-8 weeks ago.
  • squito - Wednesday, March 5, 2008 - link

    Same here ... maybe they need to be reintroduced?
  • Johnbear007 - Wednesday, March 5, 2008 - link

    Where in the U.S. do you see a Q6600 for under $200???? I still see it at $245 at Newegg, and your own AnandTech shopping search doesn't come anywhere near the price point you mention.
  • XtAzY - Wednesday, March 5, 2008 - link

    If you take a look at Hot Deals last month, you could have gotten a q6700 for $80

    http://hardforum.com/showpost.php?p=1032017513&...

    but of course it's already over
  • smithkt - Wednesday, March 5, 2008 - link

    That was for the E6700, not the Q6700.
  • firewolfsm - Wednesday, March 5, 2008 - link

    With Core 2s you can always do a minor frequency overclock while actually undervolting. With a decent cooler, it could even last longer than stock.
  • ap90033 - Wednesday, March 5, 2008 - link

    Is it me, or was this a little too negative on the OC stuff? I mean really, if you have good cooling and keep the voltage at 1.44 or lower, I bet the CPU would last 2-3 years or more...

    It almost sounds like they are marketing for Intel: "Great overclocker!" But wait, don't OC, just buy the highest model or it will only last 10 minutes!
  • TheJian - Thursday, March 6, 2008 - link

    Agreed. I haven't had a CPU that wasn't heavily overclocked since about 1992. All of these chips, clear back to a 486 100MHz, ran for years for others after I sold them. My sister is still running my Athlon 1700+ in one machine. It's on all day; she doesn't even use standby (except for the monitor)... LOL. It's like 6 years old, probably older, and still running fine. I think it runs at 1400MHz if memory serves (but it's a 1700+, you know AMD's numbering scheme). Every time I upgrade, I sell the used chip at like half off to a customer/relative, and they all run for years. I usually don't keep them for more than a year, but they run well past the 3-year warranty for everyone else, and I drive them hard while I have them. I'd venture to guess that most Intel/AMD chips would last 5+ years on average. The processes are just that good. After a few years in a large company with 600+ PCs, I realized they just don't die. We sent them to schools (4-5 year upgrade schedule) before they died, or sold them to employees for $50-100 as full PCs! I think I saw 3 CPU deaths in 2.5 years, and those were dead weeks after purchase (P4s... Presshots... LOL). Don't even get me started on the hard drive deaths though... ROFL. 40+ a year. Those damn Dell SFFs get hot and kill drives quick (not to mention the stinking plastic smells burnt all day). You're lucky to make it to warranty on those. I digress...
  • mindless1 - Wednesday, March 5, 2008 - link

    Yes, the author is completely wrong about overclocking. Overclocking (within sane bounds, like not letting the processor get hotter than you'd want even if it weren't overclocked) INCREASES the usable lifespan, not decreases it.

    The author has obviously not had much experience overclocking. For example, there are still plenty of Celeron 300MHz processors that ran at 450+MHz for almost ten years and were then retired as beyond their "usable" lifespan, just slow by then-modern standards. Same for the Coppermine P3 and Celeron, the Athlon XP, take your pick. There are almost never examples of a processor failing prematurely after running stable for a couple of years, unless it was due to some external influence like the heatsink falling off or a motherboard power circuit failure.

    Overclocking really isn't a gamble unless you don't use common sense. 2-3 years is the lifespan you'd get if you were doing something extreme, not applying a modest voltage increase with a heatsink that keeps it cool enough.

    I suggest the article page about "The Truth About Processor Degradation" should just be deleted; it's not just misleading but mostly incorrect. Here's the core of the problem:

    "As soon as you concede that overclocking by definition reduces the useful lifetime of any CPU, it becomes easier to justify its more extreme application."

    Absolutely backwards. Overclocking does not, by definition or by any other standard, reduce the useful lifetime of CPUs. It increases the useful lifetime by providing more performance, so the processor stays at the required performance levels (which keep escalating) for a longer period and is then, in most cases, retired before it ever fails. It is wrong to think that if a processor would last 18 years without overclocking and 12 with modest overclocking, this suddenly means "it becomes easier to justify its more extreme application." It means you can do something sanely and have zero problems, or guess randomly, do something "extreme", and then you will find a problem. The author has it completely backwards.

    "Too many people believe overclocking is "safe" as long as they don't increase their processor core voltage - not true."

    There is no evidence of this. Show us even one processor that failed from an increase in clock speed at its default voltage and within its otherwise viable lifespan.

    "Frequency increases drive higher load temperatures, which reduces useful life. "

    Wrong. While it is true that a higher frequency will increase temps, it is not true that a higher temp (so long as it's not excessive) will cause the processor to fail within its "useful life". On the contrary, you have extended the useful life by increasing the performance. Millions upon millions of overclockers know this: a moderate overclock (or even a large one, provided the vcore isn't increased significantly) has no effect, and it's always some other part of the system that fails first from old age, generally the motherboard or PSU. It might be fair to say that overclocking, through its use of more current, is more likely to reduce the viable lifespan of the motherboard or PSU, or both, long before the processor would fail.

    Intel doesn't warranty overclocking because it is definitely possible to make a mistake through ignorance or ineptitude, and because their pricing model is based on speed/performance. It is not based on evidence that experienced overclockers using good judgement will end up with a processor that fails within 8 years, let alone 3!




  • TheJian - Thursday, March 6, 2008 - link

    AMD has recently proved this, and even Intel to some extent with the P4. AMD's recent chips have shipped near their max, with almost no overclocking room (same for quite a few models of P4), and they lived long lives, proving you can run at almost max at default voltages with no worries.

    Where does the author get his data? Just as you said: prove it. I think Intel is tired of us overclocking the crap out of their great cores. With AMD not offering ANY competition, they end up with chips that could all hit 4GHz but have to be marked at 3.16GHz. What do we do? Overclock to near max, and that pisses them off. :) Make a few phone calls to some people and tell them to write a "10 minutes of overclocking and your CPU blows up" article or you won't get our next engineering samples to test. :) Maybe I'm wrong, but it sure is suspicious. Recommending anything with an Intel IGP for HTPC applications is mighty suspicious also. Yes, I read The Tech Report too. The same story is on Tom's Hardware! Check out the FIRST SENTENCE of their 3/4/08 article on the 780G chipset from AMD:
    "With today's introduction of its new 780G chipset, AMD is finally enabling users to build an HTPC or multimedia computer for HDTV, HD-DVD or Blu-ray playback that doesn't require an add-in graphics card. (AMD already included HDCP support and an HDMI interface in its predecessor chipset, the 690G.) The northbridge chip of the new 780G chipset also features an integrated Radeon HD3200 graphics unit that can decode any current high-definition video codec. As a result, CPU load is decreased to such a degree that even a humble AMD Sempron 3200+ is sufficient for HD video playback. Also, while Intel's chipsets get more power-hungry with every generation, AMD's newest design was designed with the goal of reducing power consumption."
    http://www.tomshardware.com/2008/03/04/amd_780g_ch...

    OK, so for the first time we can build an HTPC without an add-in graphics card. Translation: it can't be done on Intel! And even a LOWLY Sempron 3200+ cuts it with this chipset! Translation: no need for an Intel Core 2 2.66-2.83GHz dual core! No need for a dual core at all. Before this chipset it took an A64 6400 DUAL CORE (on AMD's old 690G chipset, and that chipset smokes Intel's IGP), and playback was still choppy. Now they say a 1.8GHz SINGLE CORE Sempron shows only 63% CPU utilization on the 780G, without being choppy! On top of that it will save you money while running; even the chipset itself is the BEST EVER in power use. They openly tell you how BAD Intel's 90nm chipsets are. But AnandTech wants us to buy this crap? Blu-ray finally hit its limit on a 1.6GHz Sempron at Tom's Hardware; they hit 100% CPU in a few spots. I hope AnandTech's 780G chipset review sets this record straight. They'd better say you should AVOID INTEL like the plague, or something is FISHY in the HTPC/Intel/AnandTech world.

    Don't get me wrong, Intel has had the greatest chips for a year now, and I'm personally waiting on the E8400 so I can hand down my E4300 to my dad (it runs at 3.2GHz with ease). But call a spade a spade, man. Intel sucks in HTPC. SERIOUSLY SUCKS as of early this week!
  • Quiet1 - Wednesday, March 5, 2008 - link

    Kris exposes his personal preferences when he writes... "While there is no doubt that the E8500 will excel when subjected to even the most intense processing loads, underclocked and undervolted it's hard to find a better suited Home Theater PC (HTPC) processor. For this reason alone we predict the E8200 (2.66GHz) and E8300 (2.83GHz) processors will become some of the most popular choices ever when it comes to building your next HTPC."

    But what are you going to plug that CPU into??? An Intel motherboard with Intel integrated graphics? Look at the full picture and you'll see that if you're building an HTPC, the CPU just has to be decent enough to get the job done... the really important thing is the integrated graphics performance of your chipset.


    The Tech Report: “AMD's 780G chipset / Integrated graphics all grown up”
    http://www.techreport.com/articles.x/14261/1
    “The first thing to take away from these results is just how completely the 780G's integrated graphics core outclasses the G35 Express. Settings that deliver reasonably playable framerates on the 780G reduce the G35 to little more than an embarrassing slideshow.”

    "Between our integrated graphics platforms, the 780G exhibits much lower CPU utilization than the G35 Express. More importantly, the AMD chipset's playback was buttery smooth throughout. The same can't be said for the G35 Express, whose playback of MPEG2 and AVC movies was choppy enough to be unwatchable."


  • sprockkets - Thursday, March 6, 2008 - link

    That's the problem with Intel's platform, at least without an add-in card. I thought the new nVidia chipset would change all that; then I found out it only uses a single channel of RAM. How retarded is that?

    Then again, having the ability to run the add-in card for games but shut it down afterwards when you don't need it is sweet. I would wait for the 8200 chipset, though, since I know it will be easier to get working in Linux, but I may still go for the 780G for Windows Vista.
  • HilbertSpace - Wednesday, March 5, 2008 - link

    I read that article too, and thought the same thing.
  • Atreus21 - Wednesday, March 5, 2008 - link

    I wish to hell Intel would quit using those penises for marketing their architecture shrinks. Every time I try and read it I'm like, "Ah!"

    One would think they're trying to say something.
  • Atreus21 - Wednesday, March 5, 2008 - link

    I mean, the least they could do is not make it flesh colored.
  • frombauer - Wednesday, March 5, 2008 - link

    I'll finally upgrade my X2 3800+ (@2.5GHz) very soon. The question is, for mostly gaming, will a high-clocked dual core suffice, or will a lower-clocked quad be faster as games become more multi-threaded?
  • 7Enigma - Thursday, March 6, 2008 - link

    I think it really depends on how long you plan on keeping the new system. Since your current rig is a couple of years old, you fall into the same category as 90% of us: we don't throw out the mobo and CPU every time a new chip comes out, we wait out a couple of generations and then rebuild. I'm running at home right now on an A64 3200+ (OC'd to 2.5GHz), so I don't even have dual core yet.

    My thinking is that even though the duals offer potentially better gaming performance right now (the GPU obviously still being the caveat), since I only rebuild every 3-4 years I need something more futureproof than someone who upgrades every year. It would be great to say I'll get a fast dual core today and a quad next year, but 4 out of 5 times the upgrade would require a new mobo anyway, so I'd rather wait another month or two and get a 45nm quad and the 9800 when it comes out.

    My biggest disappointment with my last build was NOT jumping on the "new" slot and instead getting an AGP mobo. That is what has really hampered my gaming the last year or so. Once the main manufacturers stopped producing AGP graphics cards, my upgrade path stopped cold. If I could go back to Jan '05, I would have spent the extra $50-100 on a mobo supporting PCI Express, which would have allowed me to upgrade past my current 6800GT and keep on gaming. Right now I have a box of games I've never played (gifts from Christmas) because my system can't even load them.

    So in short, the duals are the better buy for gaming right NOW, but I'd hedge my bets and splurge on a 45nm quad when they come out. In all honesty, unless you play RTSes heavily, or we see some crazy change of mindset by game producers (not likely), the GPU will continue to be the bottleneck at anything above 17-19" LCD resolutions. I actually just got a really nice 19" LCD this past Christmas to replace my aging 19" CRT, and I did it for a very good reason. All it takes is one look at a game like Crysis to realize that we may not be ready for the resolutions a 20/22/24" display demands, unless we have the cash to burn on top-of-the-line components and upgrade much more frequently.

    2cents.
  • TheJian - Thursday, March 6, 2008 - link

    http://www.newegg.com/Product/Product.aspx?Item=N8...

    You can buy a Radeon 3850 and triple your 6800's performance (assuming it's a GT; with an Ultra it would be just under triple). Check tomshardware.com and compare cards. You'd probably end up closer to double the performance because of the weaker CPU, but still, once you hit your CPU-limited framerate you can crank the hell out of the card for better looks in the game. That's $225 vs. probably $650-700 for a new board + CPU + memory + video card + probably a PSU to handle it all. If you have Socket 939 you can still get a dual-core Opty 144 for $97 on Pricewatch. :) Overclock the crap out of it and you might hit 2.6-2.8GHz. So around $325 for a lot longer life and easy changes; it will continue to get better as dual-core games become dominant. While I would always tell someone to spend the extra money on the Intel currently (jeez, the OCing is amazing: run at default until it feels slow, then bump it up a GHz per core, that's awesome), if you're on a budget a dual-core Opty and a 3850 look pretty good at less than half the cost, and both are easy to swap out. Just a chip and a card; that's like a 15-minute upgrade. Just a thought, in case you didn't know there was an excellent AGP card out there for you. :)
  • mmntech - Wednesday, March 5, 2008 - link

    I'm in the same boat with the X2 3800+. Anyway, when it comes to dual vs. quad, the same rules apply as back when the debate was single versus dual. Very few games support quad core, but a quad will be more futureproof and give better multitasking. The ultimate question is how much you want to spend, how long you intend to keep the processor, and what the future road maps for games and CPU tech look like within that period.

    I'm a long-time AMD/nVidia man, but I'm liking what Intel and ATI are putting out. I'm definitely considering these Wolfdales, especially that sub-$200 one. I'm going to wait for the prices and benchmarks of the triple-core Phenoms, though, before I begin planning an upgrade.
  • Margalus - Wednesday, March 5, 2008 - link

    The current state of affairs generally points to the higher-clocked dual core. Very few games can take advantage of 4 cores, so the more speed you get, the better.
  • Spacecomber - Wednesday, March 5, 2008 - link

    It has been mentioned in a couple of articles now that what these processors can run at with no more than 1.45V core voltage applied is what really matters for most people buying one of these 45nm chips. So the question is: what are the results at this voltage?

    While the section on processor failure was somewhat interesting, I think that it should have been a separate article.
  • retrospooty - Wednesday, March 5, 2008 - link

    "these processors will run (safely) at with no more than 1.45v core voltage applied is what really matters for most people buying one of these 45nm chips. So, it begs the question, what are the results at this voltage"

    Very good point. Since these CPU's are deemed safe up to 1.45 volt, lets see how far they clock at 1.45 volts. 4.5 ghz at 1.6 volts is nice for a suicide run, but lets see it at 1.45.
  • Spoelie - Wednesday, March 5, 2008 - link

    This reads like an excerpt from a press release:

    "We could argue that when it came to winning the admiration and approval of overclockers, enthusiasts, and power users alike, no other single common product change could have garnered the same overwhelming success."

    Except that it was not. It was a knee-jerk reaction to the K8 release way back in 2003, and it was too expensive to matter to anyone except the filthy rich. The FX of that era was more successful. In recent years they have just polished the concept a bit, but gaining admiration and overwhelming success because of it? I think not. The Conroe architecture was the catalyst, not some expensive niche product.

    "Our love affair with the quad-core began not too long ago, starting with the release of Intel's QX6700 Extreme Edition processor. Ever since then Intel has been aggressive in their campaign to promote these processors to users that demand unrivaled performance and the absolute maximum amount of jaw-dropping, raw processing power possible from a single-socket desktop solution. Quickly following their 2.66GHz quad-core offering was the QX6800 processor, a revolutionary release in its own right in that it marked the first time users could purchase a processor with four cores that operated at the same frequency as the current top dual-core bin - at the time the 2.93GHz X6800."

    Speed bump revolutionary? Oh well ;)

    No beef with the rest of the article; those two paragraphs just stand out as more enthusiastic than informative.
  • MaulSidious - Wednesday, March 5, 2008 - link

    This article's a bit late, isn't it? Seeing as they've been out for quite a while now.
  • MrModulator - Wednesday, March 5, 2008 - link

    Well, it's being updated from time to time. I think it is relevant since Cubase 4 is still the latest version of Cubase, and the performance is the same today. What is important here is that they measure two equally clocked processors where the difference is the number of cores. Yes, the quad is better at higher latencies, but it loses the advantage at lower latencies and even gets beaten by the dual core.
    More of a reminder of the limitations of current-day quad cores in some situations. This will probably change when Nehalem is introduced, with its on-die memory controller, a higher-bandwidth bus and faster DDR3 memory.
  • adiposity - Wednesday, March 5, 2008 - link

    Uh, what? I think he's saying these processors were on the shelves over a month ago. This article is acting like they are just about to come out!

    -Dan
  • MrModulator - Wednesday, March 5, 2008 - link

    Yeah, you talk about games, where maximum CPU frequency on a dual core is important, but there are other areas that are much more interesting. Performance for sequencers used to make music (on DAW-based computers) is seldom mentioned. There it is very important to be able to cram out every ounce of real-time performance with a lot of software synthesizers and effects at a low latency setting (not memory latency, but the delay from when you press a key on the synth until it is processed in the computer and sent out from the soundcard, for example).
    Here's an interesting benchmark:

    http://www.adkproaudio.com/benchmarks.cfm
    (Sorry, using the linking button didn't work, you have to copy the link manually)
    If you scroll down to the Cubase4/Nuendo3 test, you can compare the QX6850 (4 cores) with the E6850 (2 cores); both run at 3GHz. Look at what happens when the latency is lowered: the dual core actually beats the quad core, even though these applications use all available cores. The reason could be that all 4 cores compete for the FSB and memory access when the latency is really low. Very interesting indeed, as DAW work is an area in much more need of CPU power than gaming...
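
    For reference, the latency setting in those tests is the audio buffer size, and the math behind it is simple (a sketch, assuming a 44.1 kHz sample rate):

        # Smaller buffers mean less delay, but the CPU must finish each buffer
        # sooner, so scheduling and bus overhead start to dominate.
        SAMPLE_RATE = 44100  # Hz (assumed)

        def buffer_latency_ms(buffer_samples):
            return 1000.0 * buffer_samples / SAMPLE_RATE

        for size in (1024, 256, 64):
            print(f"{size:5d} samples -> {buffer_latency_ms(size):5.1f} ms")
        #  1024 samples -> 23.2 ms
        #   256 samples ->  5.8 ms
        #    64 samples ->  1.5 ms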
  • Casper42 - Wednesday, March 5, 2008 - link

    What about the Q9450 and its friends?

    For that matter, when am I going to be able to BUY an E8xxx part? Newegg has the E8400 listed but always out of stock.

    I tend to be the guy people come to when they want to build a new rig, and right now I am telling them all to hold off and get an E8000-series or Q9000-series CPU and a GF9000-series GPU. But right now none of these parts are REALLY available.

    So what's the REAL release date, Kris?
  • Margalus - Wednesday, March 5, 2008 - link

    I bought an E8400 about 3 weeks ago from http://www.zipzoomfly.com

    I love the thing. With the Thermalright Ultra-120 Extreme it runs at 25°C idle, and with both cores at 100% load it hits 34°C! Room temp is around 21°C.
  • MaulSidious - Wednesday, March 5, 2008 - link

    http://www.overclockers.co.uk/showproduct.php?prod... These have been in stock and well stocked for a few weeks now.
  • Kromis - Wednesday, March 5, 2008 - link

    Most impressive
