34 Comments

  • Sc4freak - Sunday, April 15, 2007 - link

    It's one of the few games out there that benefit greatly from multi-core. It would have been interesting to see how this new CPU benefitted one of the most CPU-bound games out there right now.
  • SilverMirage - Wednesday, April 11, 2007 - link

    AT fails to be completely honest with the situation:

    1. AT conveniently publishes this on the exact day AMD's price cuts come into effect. That's interesting. Although AT mentions this, they could have mentioned that AMD's previous prices were not able to compete.

    2. Benchmarking the 5000+ against the E6300 is inherently biased, since it is the E6320 that will actually be contending with the 5000+.

    "5000+ will have to contend with the E6300"

    3. Now this depends a lot on the mobo, but I'd say that the conclusion from these benchmarks is that the E6320 and E6420 will be better for their price in a week or two.

    "The Athlon 64 X2 6000+ is a realistic alternative to the E6600/E6400, the 5600+ competes well with the E6400/E6300 and the 5000+ can hold its own against the E6300/E4300"

    (AT fails to mention again that the E6300 is an unfair comparison)
  • duploxxx - Monday, April 16, 2007 - link

    In the beginning of the C2D launch we saw many reviews reduce the multiplier to see what 4MB of cache could do against 2MB... it was only a few percent, depending on the type of application... so stop the nonsense that a 6320 will outperform a 5000+, and the same goes for the E6400 vs. the 5600+, because it surely will not!

    As for power consumption: yes, a K8 consumes more power at load, but it also consumes A LOT LESS at idle. How long is your system idle each day? And buy a normal ATI chipset board like the Asus M2R32-MVP and the total power consumption at load will be less than the C2D system...

    Any system can be OC'ed... some better than others. You are talking about 5% of users at most, and the Allendales tend to OC worse these days...

    Nice review, but start using the ATI chipsets too; they perform on par with the NVIDIA ones, consume a lot less power, and are cheap these days.
  • yyrkoon - Wednesday, April 11, 2007 - link

    It is funny, you come here saying the Anandtech crew is full of BS, yet you do not bring any proof with you, so excuse me if I call BS on you. Things do not magically work one way, instead of another, JUST BECAUSE *you* say so.
  • DeepThought86 - Tuesday, April 10, 2007 - link

    I don't understand why only the CPU prices are considered. Shouldn't the overall cost, including a motherboard, be a much more realistic measure? A performance per overall (CPU+MB) cost metric would be very useful.
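    The performance-per-total-cost metric suggested above is easy to sketch. All the prices and benchmark scores below are made-up placeholders for illustration, not measured figures from the article:

```python
# Hypothetical value metric: benchmark score per dollar of CPU + motherboard.
# Every number here is a placeholder, not real pricing or benchmark data.
systems = {
    "X2 5000+":  {"cpu": 167, "mobo": 90,  "score": 100},
    "C2D E6320": {"cpu": 163, "mobo": 130, "score": 108},
}

def value_per_dollar(s):
    """Score divided by the combined CPU + motherboard price."""
    return s["score"] / (s["cpu"] + s["mobo"])

for name, s in systems.items():
    print(f"{name}: {value_per_dollar(s):.3f} points/$")
```

    With these placeholder numbers the cheaper platform wins on value even while losing on raw score, which is exactly why a CPU-only price comparison can mislead.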
  • Griswold - Tuesday, April 10, 2007 - link

    I just skimmed over the article, but where are the numbers for power usage?
  • RedWolf - Tuesday, April 10, 2007 - link

    One thing in AMD's favor is that Dell is now selling AMD machines. The college I work for is buying all AMD machines this year. Even the slowest C2D machines are a few hundred dollars more than the AMD machines. All of our machines this year, including laptops, are Athlon X2 powered because the price was so attractive. I simply could not configure a C2D machine that came close. That price difference allowed us to go to 2GB of RAM and still be under C2D pricing for the same machine. Granted, we aren't building enthusiast machines or buying for business, but we are buying AMD and getting them at great prices.
  • dm - Monday, April 9, 2007 - link

    quote:

    The April 22nd price cuts aren't terribly aggressive, but they do restore a little balance to the equation. The 6000+ goes back to compete with the E6600 instead of the E6400, which does change things thanks to the E6600's larger L2 cache. The 5600+ now goes head to head with the E6400 instead of the E6300, and the 5000+ will have to contend with the E6300.


    It is important to note that Intel is also coming up with better versions of the Intel® Core™ 2 Duo E6300 and E6400, namely the Core 2 Duo E6320 and E6420. Both have the full 4MB L2 cache and will be much better performers. I have done quite a few tests with them here (including the Core 2 Duo E4400 as well):

    http://fanboyreview.blogspot.com/2007/03/brag-fanb...

    quote:

    The price is obviously quite steep, and those who are not opposed to overclocking would be better off buying a Q6600 and simply overclocking it to QX6800 speeds.


    You missed a wonderful processor: the quad core Intel® Xeon® X3210 (2.13GHz/8MB L2/1066MHz), which is an LGA775-socket compatible CPU and would appear to be binned to work at a lower voltage. According to Guru3D (http://www.guru3d.com/newsitem.php?id=4949), the price will be hovering around the $430 range. I have done some testing with this Xeon X3210 here:

    Part I (Stock Benchmark): http://fanboyreview.blogspot.com/2007/04/article-l...
    Part II (Overclocked up to 63%): http://fanboyreview.blogspot.com/2007/04/article-l...

    Anyway, overall, nice article!!!
  • skrewler2 - Tuesday, April 10, 2007 - link

    Doing a google search, I see the price is around $750-800.. Too bad, you got me excited too
  • yacoub - Tuesday, April 10, 2007 - link

    DM, I didn't see any head-to-head comparison of the 6300 to the 6320 or the 6400 to the 6420 in your review of them. Am I reading your graphs wrong?
  • irev210 - Monday, April 9, 2007 - link

    You should also put overclocking as a factor.

    As an enthusiast-oriented site, it makes sense to compare the average overclockability of Intel Core 2 Duo processors against AMD's.

    Most Core 2 Duo processors, in all their forms, can reach at least 3GHz, and of course the best can do 3.8GHz or higher.

    That is something AMD cannot do, and it should be noted.

    You should evaluate performance per watt as well; it is another important deciding factor for us.
  • rqle - Monday, April 9, 2007 - link

    That, and also the fact that clock for clock, Intel is much faster. A 2.6GHz Intel is FASTER than a 3.0GHz AMD X2, and Intel chips can OC much higher.
  • photoguy99 - Monday, April 9, 2007 - link

    As usual a good article, kudos.

    However, since AT has many enthusiast followers, wouldn't readers be interested to note that:

    1) AMD has near-zero headroom for overclocking

    2) AMD's chips use more power than Intel's at the same performance level. Yes, it is not as important as in a data center, but it will still make a small difference in your power bill.

    So to conclude, it seems the answer is yes, AMD is competitive on the low end, but why give up these two advantages when the price is so similar?
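    The power-bill difference is easy to put rough numbers on. Every figure below (the wattage deltas, the hours per day, and the electricity rate) is an assumption chosen purely for illustration:

```python
# Back-of-the-envelope yearly electricity cost; all inputs are assumptions.
def annual_cost_usd(extra_watts, hours_per_day, usd_per_kwh=0.10):
    """Yearly cost of drawing `extra_watts` for `hours_per_day` each day."""
    return extra_watts / 1000 * hours_per_day * 365 * usd_per_kwh

# Suppose the AMD box draws 30 W more at load (4 h/day) but 15 W less
# at idle (8 h/day) than the Intel box:
load_penalty = annual_cost_usd(30, 4)   # extra cost from higher load draw
idle_savings = annual_cost_usd(15, 8)   # savings from lower idle draw
net = load_penalty - idle_savings
print(f"load: +${load_penalty:.2f}/yr, idle: -${idle_savings:.2f}/yr, "
      f"net: ${net:+.2f}/yr")
```

    With these particular assumptions the load penalty and the idle savings happen to cancel out, which is why the idle-versus-load duty cycle matters as much as the peak wattage numbers.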

  • kmmatney - Monday, April 9, 2007 - link

    "1) AMD has near zero headroom for over clocking"

    Seeing as how so many people are overclocking their AMD chips from 1.8 Ghz to 2.8 GHz, your comment is not true.
  • yyrkoon - Monday, April 9, 2007 - link

    To put it a little differently: with the motherboard I use, a 3600+, and perhaps a couple of sticks of Corsair value RAM, you can build a 'modern' system, capable of doing anything most users would need to do, for ~$500 including a small LCD monitor. Try doing that with an Intel system that will keep up with this AMD system. You cannot...
  • yyrkoon - Monday, April 9, 2007 - link

    Simply, because the platform for AMD can be far more inexpensive, with much better features. Platform, of course, meaning the motherboard. Take my ABIT NF-M2 nView for example: find me an Intel board with the same features, for under $150 USD, that is as stable. Keep in mind the NF-M2 nView only runs $90 USD. It is a very stable board with many features, including a heatpipe passively cooled chipset (OTES), excellent overclocking features (read my post above; that will blow your 'minimal headroom for overclocking' theory out of the water ;) ), 4x SATA II ports, 2x PATA ports, 8x USB ports, 1x FireWire port, onboard graphics, onboard GbE, onboard sound, 1x PCIe x16, 1x PCIe x1, 2x PCI, and 4 DIMM slots capable of supporting up to 8GB of DDR2. Also keep in mind that I have read people have had the X2 3600+ up to 3GHz, but of course, this is what I have read, not personally experienced.

    Anyhow, the Intel CPUs are great, and I honestly wanted to upgrade to C2D myself. However, after reading loads of posts all over the web, seeing all the issues with the latest C2D platform boards, and the overall cost for a motherboard that offers as many features as the board I listed above, I did not want to 'break the bank' to do so. Once OEMs come around with a motherboard sporting a modern chipset for C2D, plenty of features, and SLI or single-GPU variations, Intel could THEN, in my eyes, be competitive. Not everyone needs or even wants SLI, or a lot of the other garbage OEMs are putting on their 'top of the line' motherboards, increasing the prices dramatically.
  • dm - Monday, April 9, 2007 - link

    quote:

    Simply, because the platform for AMD can be far more inexpensive, with much better features.


    Not necessarily. Because Intel still uses discrete memory controllers, there are a lot of motherboards out there that can take advantage of this. Take for example the Asus P5PE-VM (http://www.asus.com/products4.aspx?l1=3&l2=11&...), which can be a great upgrade for those who have AGP and DDR1 components but want to experience the power of Conroe. An Intel® Core™ 2 Duo E4300 processor is a perfect match for it. The i865 chipset is stable and has proven itself for quite some time already, and this motherboard goes for under $60!!!

    And there's also a flavor where PCIe and AGP, DDR2 and DDR1 are all acceptable: look at the ASRock 775Dual-VSTA (http://www.asrock.com/mb/overview.asp?Model=775Dua...), which gives users the flexibility to use either DDR1 with PCIe, or DDR2 with AGP.
  • yyrkoon - Tuesday, April 10, 2007 - link

    I would not put an ASRock board in your system, let alone one of mine, ever again. Again, I have looked, long and hard, for a motherboard with as many features as the one I use currently, and there is simply *none* in the same class as this board, period. Motherboards that have the features I want, use any Intel-capable chipset, and offer 100% rock-solid stability simply do not exist in this price range. If I had the money, I would probably end up spending $300-$400 USD for a good Intel board, and you can bet it would not be an Asus board either. Anyhow, the current motherboard I own has only one issue, and one that is not really that big of a deal (it will not POST, or boot Windows, while a USB HDD is powered on and attached to one of its USB ports).

    ABIT, MSI, Gigabyte, and Tyan (Foxconn seems to be OK, and Intel boards nowadays seem to be hit and miss unless you go with a server board) are pretty much the only system board manufacturers I will use in any system in the future, and this is subject to change if any of them start making bad parts.
  • photoguy99 - Monday, April 9, 2007 - link

    Since the article included the quote below, it might be more informative to note that QuickTime can only use two threads at a time, which explains why AMD looked better in this test:

    "The Quicktime H.264 test paints a particularly good picture for AMD, with the 6000+ equalling even the Q6600."
  • yacoub - Monday, April 9, 2007 - link

    quote:

    With a substantial number of our CPU benchmarks available in 64-bit versions, using the 64-bit version of Vista wasn't a difficult choice.


    And so, starting with this test, all benchmark numbers from AnandTech from here on out become that much less relevant to the vast majority of your readership, because they still run WinXP 32-bit and have little reason to upgrade their OS to Vista yet. :[

    Maybe in a year or two your reviews conducted on Vista 64bit will be more relevant to the majority of your readership, as by then driver support will be much improved, games will be Vista/DX10 focused, and overall performance on Vista will be that much better than WinXP and thus create an incentive for WinXP users to finally make the switch.

    Until then, thanks for the less relevant benchmark results...
  • Chadder007 - Monday, April 9, 2007 - link

    Considering that if you are getting a whole new PC, you would most likely be getting a new OS too, in which case I would also go with 64-bit Vista.
  • SunAngel - Monday, April 9, 2007 - link

    Just think: when MS gets Vista fully patched and tweaked, and FSB1333 or 1600 and HT2400 or 3000 for quad- and octo-cores is mainstream, you'll be able to take a 30 min. MS-DVR recording from Media Center and reduce it in like 2 minutes. 18-30 months from now I expect Intel and AMD to perform quite some magic with their processors. Good times ahead indeed.
  • yacoub - Monday, April 9, 2007 - link

    And we all know how certain benchmarks vary WIDELY between XP and Vista - often due to drivers or programming issues - so to say "oh the numbers should be close enough between the two OSes" would be completely untrue.
  • yacoub - Monday, April 9, 2007 - link

    quote:

    While Intel will still hold control of the world's fastest desktop processor title, AMD may actually offer better value at lower price points.

    As long as you don't allow overclocking into the equation, then yes. But if you allow for overclocking, even a modestly overclocked E4300 can match or beat an E6400 and thus best the 6000+.

    We rarely hear much about how the AMD chips overclock these days... is it just due to a lack of overclock-oriented boards? Have all the board manufacturers focused on Intel because that is where all the attention is and where they hope to get the most profit for their boards?

    It would be interesting to see a good update on AMD overclocking on AM2. Do the chips even have much headroom? If so, are there overclocker boards available to OC them with? etc
  • yyrkoon - Monday, April 9, 2007 - link

    From my experience, which by the way is not a lot, I have noticed that desktop-class CPUs from AMD generally do not OC well. That being said, I have an Opteron 1210, paid $150 USD for it, and have had it running at a 310MHz reference clock (2.79GHz overall; stock is 1.8GHz...) on an ABIT NF-M2 nView motherboard, with inexpensive Corsair XMS2 ProMOS memory, using stock cooling. Granted, the system immediately BSoD'd when trying to run SuperPI, but I have little doubt that if I had a better cooler (my case is very compact, so it is pretty difficult to find something small and efficient), it would have been able to run at this speed fine.

    From what I have read, the 3600+ can hit 3GHz using water cooling, but I have no hands-on experience with it personally. I am very happy with my Opteron; it runs every game I play just fine. The only real problem I have with my current system is that my video card is already showing its age, and it is only 6 months old (7600GT) :/
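    The overclock arithmetic behind figures like those is just the reference (HTT) clock times the CPU multiplier. A quick sketch; the 9x multiplier is an assumption inferred from the Opteron 1210's 1.8GHz stock speed at the standard 200MHz reference clock:

```python
# Effective CPU clock = reference (HTT) clock x multiplier.
# The 9x multiplier is an inferred assumption (1800 MHz / 200 MHz).
def effective_clock_mhz(ref_mhz, multiplier):
    return ref_mhz * multiplier

stock = effective_clock_mhz(200, 9)  # 1800 MHz, i.e. 1.8GHz
oced = effective_clock_mhz(310, 9)   # 2790 MHz, i.e. 2.79GHz
print(f"stock {stock} MHz -> overclocked {oced} MHz "
      f"(+{(oced - stock) / stock:.0%})")
```

    Raising the reference clock from 200 to 310MHz at a fixed multiplier is a 55% overclock, which is why the 2.79GHz figure follows directly from the 310MHz setting.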
  • JarredWalton - Monday, April 9, 2007 - link

    I think the best AMD chips will OC to around 3GHz, give or take. The problem is, an E4300 will overclock to around 3.6GHz pretty easily (a better CPU cooler is all you need). At that point, the E4300 is so much faster than anything AMD currently has on the desktop that I think it's a bit silly to consider AMD for serious overclocking - unless you already have an AM2 board. At stock speeds, however, AMD does quite well on pricing, especially post price-cuts.
  • D4LDarksideD4L - Wednesday, March 18, 2009 - link

    No, I think you're wrong here. The best AMD CPU doesn't max out at 3.0GHz; it clocks way faster than that with very extreme cooling. Guys at AMD had overclocked an AMD quad core to an insane 6.5GHz, and it's very stable. To achieve this speed they used liquid nitrogen and liquid helium to bring the temperature down below -200 degrees F. To my understanding, that's the world record.

    http://game.amd.com/us-en/landings/tomslanding.asp...
  • yyrkoon - Monday, April 9, 2007 - link

    I do not disagree with you 100%, Jarred; your points are completely valid. However, the system in question would cost more than a comparable AMD system, and in my case, I am very specific about the features and brand I want, so it would cost me a lot more. Also, the performance difference you speak of looks great on paper and in graphs, but in real-world use, I bet you, me, or anyone else would be hard pressed to notice the difference. Perhaps if all you do is encode/decode, sure, but in general usage and game playing, the noticeable difference just would not be there.

    Now, if you have the cash, I would recommend to anyone who plays games to go with an Intel C2D, but know that you will pay for the speed difference, and the chances are good that unless you spend a lot, you would never know the difference. As I said before, build me a C2D system for ~$500, with an LCD monitor, and then we can talk. Granted, this ~$500 AMD system would not play FEAR, Oblivion, or any other graphically intensive game well, but my system does, using an eVGA 7600GT KO (which can be had easily for ~$130 USD, making the overall cost higher, but still very viable, even on a budget).
  • TA152H - Monday, April 9, 2007 - link

    I agree with you, mainly. My main system, if I have one that can be called that, is a Katmai running at 600MHz. Why? Because it's completely fanless, it uses very little power, and I don't need anything better for what it does. I have a separate development machine, of course, and other machines I use for more demanding stuff, but 90% of the time, unless I'm testing, this computer is more than adequate for what I want.

    Overclocking is important only within the tiny context of the person who is going to buy a processor to do it. Intel and AMD don't care that much about it, unless it makes their chips less reliable and then they are against it since it hurts their reputation. It's done by such a small percentage of people, it's not going to greatly impact their sales. I almost got tarred and feathered back about 10 years ago when I recommended that our company buy the Celeron 300As and overclock them to 450 MHz, rather than using Pentium IIs, which were slower and way more expensive. I never made that mistake again.

    Back in the bad old days, you actually had to have a bit of a clue to overclock; you'd have to unsolder the crystal and replace it. A lot of people did this with the original PC/AT 139, so IBM changed it so you couldn't for the 239 (by putting a timing loop in the ROM). So, I guess it was somewhat more common then. But then again, the computer user back then was much more sophisticated since they were not mainstream devices like they are now.

    AMD is in deep trouble, as evidenced by their recent announcement of extremely poor sales. I just do not understand their timing with respect to ATI, because it gave Intel a golden opportunity. Intel knows AMD is cash strapped and can't fight a price war. AMD doesn't seem to accept this, though, and they are playing chicken driving a VW Bug against a Hummer. Sooner or later, AMD must lose, and they are foolish to act as if Intel does not know this. ATI made it impossible for AMD to follow the course they are on now, but they are following it anyway. Good luck to them. Cost cutting isn't the answer; Intel will just extend their manufacturing lead.

    One thing I don't get is the announced price cuts by Intel not being called aggressive. Call me crazy, but when you chop 1/3 of the price off of already attractive products, that's very substantial.

    A lot of this points to Barcelona (what a stupid name) being really good. Intel is trying to kill AMD before it comes out, and AMD seems to believe they have to keep market share at any cost. Obviously, AMD's path is unsustainable and they would eventually go out of business on it if nothing changed. If Barcelona is really good, they could suffer a few quarters of it while waiting for Barcelona, which would presumably sell quite well if it is as good as they say. The problem is, they are losing market share even with their low prices.

    One positive about all this is how much smarter consumers have become. It used to be Intel could sell whatever they had even if it sucked. But, when Intel had a bad product most recently, they lost share. Now they have a better product, they have gained it. It hasn't always been this way.
  • yyrkoon - Monday, April 9, 2007 - link

    Well, I do not know which way it is going to go, but it is either very, very good for AMD, or very, very bad. Just for the 'monopoly' Intel would gain, in AMD going out of business, and the 'you have this product, and you have to like it' effect we would get from Intel, I think it would be very, very bad for everyone, if AMD went out of business.

    I have been computing for a long, long time (since '82-'83), and am used to AMD being the underdog, so I do not really see this as the nail in their coffin just yet. Only time will tell, and AMD knows how to fight a price war from the bottom up (or middle up, if you ever really considered Cyrix a competitor).
  • TA152H - Monday, April 9, 2007 - link

    I don't think you got my point. I'm not saying AMD will go out of business, just that their path would eventually lead them to it if nothing changed. For example, let's say they didn't have a new product out in a few months. Would they continue the policy they are on? I don't know the answer to that, but I think the upcoming new core has something to do with their current policy. I guess it would have to.

    I've never thought highly of Hector Ruiz, and I think even less of him now. I liked Jerry Sanders a lot, he was charismatic and visionary, and was the only one that could stand up to Intel. Many others tried, and they failed. And it was under his leadership that AMD passed Intel with the Athlon, and much of the current situation is from technology he was responsible for. He never backed down, and made excellent strategic moves like buying NexGen when the K5 ran into snags. I never worried whether AMD would survive under him. Ruiz, I just don't like him and I don't have as much faith in him. I still don't know why they are still on the K7+ core now, it's been way too long and something better should have come out years ago. They came out with a product good enough to beat the miserable P7, but they had to know Intel would come back fighting.

    But, in the worst case, and I'm not saying this will happen, AMD will be bought by someone else rather than disappear into nothingness. IBM makes the most sense, particularly since they are out of the PC business and spend a lot on semiconductor manufacturing and development even without AMD. In fact, I am a little surprised they aren't one company already. AMD by itself is a weakling that can only grow when their competitor missteps, and when their competitor is doing well, they lose money. An IBM/AMD combined company would give Intel fits, and be roughly their equal.

    At any rate, the industry will never allow Intel to be alone as an x86 maker. It's too lucrative a market, and Intel has produced some of the worst processors known to mankind. Uncontested, people might actually have to use them. It won't happen. Then again, we somehow let Microsoft dominate with their lousy products. So, who knows?

    With regards to AMD knowing how to fight a price war, they have no chance in this one, outside of Barcelona. Intel can make the chips cheaper, and they are much better processors. Plus, they still have a better reputation. Intel can take market share from AMD and make them like it right now. There just isn't anything AMD can do with their lousy K7+ and inferior manufacturing, plus high debt. And that's what's happening: Intel is winning back market share, and AMD is selling their rubbish for peanuts to boot. But still, we have Hector the jackass telling the world that AMD will not yield any market share and instead will get to 30% this year. He's a buffoon; he is not in control of the situation, Intel is, and he's trying to use bravado to cover up the fact that Intel will take what they want. It's just a matter of how much money they are willing to give up to do it. I really don't like this man.

    Cyrix is dead now; VIA bought them and promptly killed their line off and went with the IDT Centaur line. I actually have one of their micro-ATX motherboards and processors running at 800MHz. It's totally quiet, and it's elegant in its own way, but for 800MHz it's really slow. To me, it seems roughly equivalent to a 500MHz Katmai, and I'm not exaggerating. But it uses less than 10 watts, so I guess that's to be expected.

  • yyrkoon - Tuesday, April 10, 2007 - link

    This 'war' you speak of is actually a battle; the war goes on indefinitely. But yes, I got your point.

    As for who 'rules' AMD, I couldn't care less, as long as they stay around, giving Intel a reason to make good products, and vice versa. I care more when AMD makes stupid judgement calls, such as switching socket types too often and not supporting them for very long, but they are not the only ones guilty of this, and to be honest, I am not sure there really is much of a choice when technology advances as fast as it does now.

    AMD would never disappear into nothingness, and they *could* go back to making ICs only; they probably just would not make as much money doing that alone.

    Cyrix *has* been dead for a long, long time now, at least in the desktop arena, which in my opinion did not happen soon enough. I remember having a Cyrix P200 and getting 7 FPS playing Quake 2; I popped the CPU out, dropped in an Intel P55 233MMX, bumped the FSB up to 75MHz, and watched it get over 60 FPS with the same settings... (I miss the good ole Super7 days, if only because the platform did not matter; you could use any CPU in it).
  • TA152H - Tuesday, April 10, 2007 - link

    Actually, VIA bought Cyrix and released a few products based on their architecture, then bought the Centaur line from IDT and stopped making the Cyrix-based chips.

    Cyrix chips weren't always bad, at least on paper. I always had problems with them, though, but in some ways they were way ahead of AMD. AMD was just a clone maker until they couldn't be any longer, whereas Cyrix made their own processors without copying Intel microcode. The chips you're talking about had miserable floating point performance, but their integer performance was excellent. They used that silly PR rating stuff where they were actually clocked lower than their rating, because of their superior IPC. Cyrix was also unique in the x86 world for saying that the decoupled architecture of the K5 and P6 was not the way to go, because you'd have too much trouble with out-of-order execution as you got to deeper pipelines. It would be interesting to see if they'd still be running x86 code natively today, or running some inelegant decoupled architecture like Intel and AMD are. Apparently, since AMD and Intel survived, it was the way to go.

    AMD has been making x86 processors almost as long as Intel has. IBM used them extensively in their original PCs and so did Tandy. They were a licensed second source for Intel. Intel got a little greedy with the 386 and decided they didn't want AMD making them as well, although AMD eventually just reverse engineered it and did it anyway, creating a great legal battle. AMD 386s were excellent too, they ran at faster clock speeds, and used a lot less power. Unfortunately, their 486s sucked, they were unreliable. They seemed to have a lot of cache problems, and if you turned off the cache the processor ran OK, but with it, it wouldn't work. Mainly the DX2 80/40 had this, and I have no idea why.

    You complain about the Cyrix chip, but did you ever try a K5? The floating point on that processor would make a man with a hairy back cry. That's just the way it was: Intel had ferocious floating point, and everyone else had lousy floating point. It was how they were able to compete; they didn't "waste" so many transistors on something that most people never used anyway, and that helped them a lot. Even the K6, with its low-latency but non-pipelined floating point unit, was way behind the Pentium. Only the Athlon changed that.

    I don't miss Super 7 at all. You had a bunch of substandard companies backing that standard, and none of the chipsets were particularly stable. The VIA MVP3 was probably the best, the ALI was miserably bad. The MVP3 had terrible memory performance, even with motherboard cache memory. Even though it and the 440BX were both rated at 100 MHz, the memory performance on the 440BX was roughly 50% greater. The P55C didn't use Super 7 though, it used the regular Socket 7 chipsets, like the 430TX and 430HX, both of which were excellent. I still run my print server off a 430HX based Pentium 233 MMX. It does the job fine, and if I want to play any games from the mid to late 1990s, it's ideally suited for it.
  • KhoiFather - Monday, April 9, 2007 - link

    This CPU is smoking hot!!!! I need to pick me up one of these when the price falls about 80%, yup yup!!!!
