36 Comments
Roy2001 - Friday, May 18, 2007 - link
quote: AMD stated that it didn't want to clue Intel in on what it had up its proverbial sleeve
I really want to laugh. 64-bit is really a revolution? What else has AMD invented except the IMC? Intel invented x86 and AMD copied it.
TA152H - Friday, May 18, 2007 - link
AMD invented the integrated memory controller? You're joking, right?
They weren't even the first x86 company to make one, it was NexGen, a company they bought when they realized they weren't competent enough to make their own x86 processors without copying Intel. It was during the K5 debacle, and Jerry Sanders was smart enough to buy them, and that is where the K6 came from.
AMD copied a lot more than x86 from Intel. I actually have one of their 1981 product sheets, they even copied Zilog stuff like the Z8001 and Z8002. They, of course, copied the 8080 and 8085 as well. So, the fellow that goes on about IBM making that relationship is dead wrong. It started a lot sooner than that, and only when Intel got greedy with the 386 was it ended with acrimony. Intel was annoyed AMD would sell everything for so cheap.
Regs - Sunday, May 20, 2007 - link
Thank god for the caveman that invented the wheel or we'd really be screwed.
Do you guys get hard-ons googling information about who "invented" what first just to shove it back in someone's face? Who knows, maybe the guy who invented the IMC now works for AMD. Who the hell knows what original idea or concept it came from either. Though I'd say AMD did a hell of a job making it work!
TA152H - Sunday, May 20, 2007 - link
What a dork. You think I used Google for this? You wouldn't understand because you're a moron, but it bothers me when people say things I know are wrong and perpetuate misinformation.
Computers on a chip had been built well before the x86 world ever saw the IMC, so making weird claims about AMD inventing the IMC is flat wrong. They were the first in the x86 world to make that tradeoff, but anyone could have done it.
Making it work isn't particularly difficult, and I'm not even sure it's proven to be a great idea. You give up a lot to get it, and only when compared to the ultra-sophisticated, and horrible, Pentium 4 did it shine. Against the Core 2, this great feature is part of a product that gets totally outclassed by Intel's solution. So, I'm not sure it was the greatest thing in the world to spend the transistors on; Intel obviously found better ways. Even now, Barcelona still won't have memory disambiguation like the Core 2, and will have only the out-of-order load/store ability of the P6 core. It's a step in the right direction, but I'm thinking they should have spent a few more transistors there, although I think they'd be better off taking them from x87. What is the point of that now?
Starglider - Friday, May 18, 2007 - link
AMD copied x86 at Intel's request. Intel was bidding for IBM's original PC project, IBM said 'you must have a second manufacturing source', so Intel contacted AMD, asked them to manufacture 8086 chips and supplied them with the semiconductor masks to do so. AMD cunningly made the ability to use the x86 ISA in their later chips a requirement of the contract, though it took a court case to confirm the legality of this. But no, AMD did not slavishly clone Intel's ISA, Intel specifically asked them to do so.
hechacker1 - Friday, May 18, 2007 - link
AMD's announcement sounds like a good idea. It will allow them to catch up to current battery-life-improving technologies.
My Dell Inspiron 700m (which cost $800 shipped, new, 2 years ago) features a 1.7GHz single-core Pentium M (Dothan). Hardly the latest technology. I've managed to get 5+ hours with WiFi always on and moderate screen brightness (this screen gets too bright!) in Linux.
In Windows XP battery life is ~3.9 hours with similar usage. In Vista it dwindles even further. (Note: I have 2GB of RAM to prevent swap usage/hard disk activity.)
But Linux has great power management capabilities, and Intel provides a good number of open source drivers. And now they've released this very useful application:
http://www.linuxpowertop.org
Already there are many new patches to further reduce power consumption by badly coded programs. Some of these fixes will work their way over to Windows (Firefox, for example), but I wonder how carefully Microsoft has evaluated the power consumption of its own code.
All of these "extra lower power states" are useless if the software never allows the CPU to reach them. My own computer had several problems that capped my lowest power state at C2. After using PowerTOP I have managed to get my computer to go into the C4 power state. I'll be getting even better battery life now!
As far as AMD is concerned, I believe they have waited too long to match Intel's capabilities in the fastest growing mobile market. Why wait for Griffin when Santa Rosa is essentially the same? And I already know Intel has great open source support (all of my cheap Dell hardware works perfectly).
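To make the PowerTOP point concrete, here is a minimal C sketch of the kind of fix those patches make. The work flag and file descriptor are contrived examples; the point is only that a timer-driven polling loop drags the CPU back to C0 a thousand times a second, while a blocking wait lets it sit in C3/C4:

```c
/* A minimal sketch (hypothetical example, not an actual PowerTOP patch)
   of why "badly coded programs" block deep C-states: the polling version
   fires ~1000 timer wakeups per second, the event-driven version none. */
#include <poll.h>
#include <time.h>

/* Bad: check a flag every millisecond. Each wakeup is a trip back to C0,
   so the idle governor never accumulates residency in C3/C4. */
void polling_wait(volatile int *work_ready) {
    struct timespec one_ms = { .tv_sec = 0, .tv_nsec = 1000000 };
    while (!*work_ready)
        nanosleep(&one_ms, NULL);
}

/* Better: block in the kernel until data actually arrives; the core can
   stay in its deepest available C-state the entire time. */
void blocking_wait(int fd) {
    struct pollfd p = { .fd = fd, .events = POLLIN };
    poll(&p, 1, -1); /* timeout of -1 means sleep indefinitely */
}
```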
TA152H - Friday, May 18, 2007 - link
How is what is being done with the Barcelona and this Griffin different from what AMD did with the CXT core in the K6-2? They called it write combining back then. It seems remarkably similar to me, but I might be missing something. Can someone elaborate on the differences?
tayhimself - Friday, May 18, 2007 - link
AFAICT it is the same thing except one layer down the memory hierarchy. Instead of your (L2) cache controller combining writes to your memory controller, your memory controller combines writes to the memory banks.
TA152H - Friday, May 18, 2007 - link
I thought the same thing, but wanted to make sure.
One little thing: it was the L1 cache controller; the K6-2 didn't have an L2 cache on the chip, it was on the motherboard. The K6-III, K6-2+, and K6-III+ all had the on-chip L2 cache.
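For anyone trying to picture the mechanism being discussed, here is a minimal sketch of threshold-based write combining as the article describes it. It is an illustrative model only, not AMD's actual controller logic; the threshold value and data structures are made up:

```c
/* Toy model of threshold-based write combining: writes are queued rather
   than issued immediately, then burst in one sequential pass once a preset
   threshold is hit, paying the read/write bus-turnaround penalty once per
   burst instead of once per write. */
#include <stdint.h>
#include <stdio.h>

#define WC_THRESHOLD 8  /* preset burst threshold (illustrative value) */

struct wc_entry { uint64_t addr; uint64_t data; };

static struct wc_entry wc_buf[WC_THRESHOLD];
static int wc_count = 0;

static void dram_write(uint64_t addr, uint64_t data) {
    /* stand-in for the actual DRAM write command */
    printf("  write %#llx <- %#llx\n",
           (unsigned long long)addr, (unsigned long long)data);
}

static void wc_flush(void) {
    printf("burst of %d writes (one bus turnaround):\n", wc_count);
    for (int i = 0; i < wc_count; i++)
        dram_write(wc_buf[i].addr, wc_buf[i].data);
    wc_count = 0;
}

/* Writes accumulate in the buffer; reads can keep streaming until full. */
static void wc_post_write(uint64_t addr, uint64_t data) {
    wc_buf[wc_count].addr = addr;
    wc_buf[wc_count].data = data;
    if (++wc_count == WC_THRESHOLD)
        wc_flush();
}

int main(void) {
    for (uint64_t i = 0; i < 20; i++)
        wc_post_write(0x1000 + i * 8, i);
    wc_flush(); /* drain the tail */
    return 0;
}
```

The win is that the costly read/write switch is paid once per burst rather than once per write, which is exactly the penalty the article says Griffin's controller avoids.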
TA152H - Friday, May 18, 2007 - link
OK, this line, "while the K8 is arguably a better starting point for a mobile-specific architecture than the P6", is completely wrong, even though it is qualified.
Are you kidding? Do you remember the K7 when it came out and how it compared to the Pentium III? Sure it was faster, but the power use was WAY higher and in no way consistent with the marginal performance improvement. Once the Pentium III's cache was improved with the Coppermine, and the Athlon's later with the Thunderbird, the difference in performance was greatly reduced. People still preferred the Tualatin 1.4 to the newer stuff until the Core 2 came out, because it was so power efficient and easy to cool, and the performance was excellent, mainly because of the 512K cache. The main reason the Athlon had any advantage (which it did, because of frequency headroom) is that the Pentium III was seriously memory bottlenecked; it wasn't a huge architectural step forward.
More to the point, why didn't they start with the K6-III? It was way better than the Pentium III in terms of performance per watt, although not in floating point. But AMD is always talking about these coprocessors, so take a K6-III that is cleaned up a little to address its deficiencies, integrate the northbridge and graphics into it (it's a tiny die, it would still be small), and put in a small socket for a coprocessor if people need floating point. Why use a power hog like the K8 as the starting point? They could also make a K8 mobile part for the devil-may-care-about-power crowd, but I think the K6-derived part would have extremely low power use, function very well, and be relatively inexpensive to make. A K6-III+ at 600 MHz, even with terrible memory bandwidth and a tiny 256K L2 cache, still does fine surfing the internet, opening office apps, etc. Someone needs to go back to these shorter-pipelined designs for power savings; I was hoping it would be AMD. Intel doesn't really have anything: the Pentium wasn't decoupled, and the Pentium Pro had the long pipeline :( .
Goty - Friday, May 18, 2007 - link
I agree with your points here, but it might have come down to an issue of time and manpower. Intel probably has teams of engineers sitting around twiddling their thumbs just waiting for something to do while AMD probably has everyone working around the clock. It's probably more a result of resources than anything.
TA152H - Friday, May 18, 2007 - link
Goty, I think you're right, but a product like that could make a big difference too. It would be so different from anything out there, and have such advantages for the market it is designed for, that I think it is something they should have looked at.
I guess what's so disappointing for me is that they mentioned they were going to try a completely new design, and then they just did another iteration of the K8 and took a branch from there. I don't think they can seriously differentiate themselves from the Core 2 line that way, and I think they have to. Intel is so much better at manufacturing that if AMD retains design parity, or something close to it, I don't know how they are going to be successful. I think Fusion is a good idea, but I don't think it's enough, and I don't think it would be hard for Intel to duplicate, because, as you say, they have the resources.
Besides, they have the K6; they'd have to widen the memory interface, improve the decoders, and tweak little things here and there, but it's a great processor. Remember how disappointed everyone was when the K7 couldn't beat it clock-normalized on integer, and the K6 was beating Katmais that were clocked 50 MHz higher? This with the putrid VIA MVP3 chipset and its horrible memory performance. It was a really good design.
I am also wondering why they still have such a strong x87 unit in the Athlons. Why even bother these days, particularly in the mobile part? Put in a tiny non-pipelined version for compatibility, and save the space for something more useful. x87 isn't even supported in x86-64 mode, so it's clearly a dead technology.
LoneWolf15 - Friday, May 18, 2007 - link
quote: The Centrino brand simplified notebook purchasing and quickly became a mark associated with a notebook you wanted to buy.
Maybe it did for Joe Sixpack, but it ticks me off. Centrino tells me nothing about what kind of processor is in a system, it just tells me that the system has a wireless card (of some sort, Intel branded but who-knows-which-model). Centrino could mean Pentium-M (Banias or Dothan), Celeron-M, or Core Solo. Centrino Duo at least tells me a system is dual-core, but not whether it's Core Duo or Core 2 Duo (or possibly "Pentium Dual Core", that relabeled Core Duo Intel is putting out in limited quantities). You call this simple? I sure don't.
I can't stand this marketing trend. There is no way to know, at a glance, exactly what you are getting in a given laptop. It's just one more buzzword to learn, when simply saying the laptop has WiFi (which 95% do at this point) and an Intel xxx processor running at such-and-such a speed would be useful. And it looks like it gets review sites like AnandTech sucked into buzzword bingo in the process.
IntelUser2000 - Friday, May 18, 2007 - link
quote: Centrino could mean Pentium-M (Banias or Dothan), Celeron-M, or Core Solo.
Actually, Celeron M based laptops can't be certified as Centrino. Here is the chart from Intel: http://www.intel.com/products/centrino/compare.htm
And you can further differentiate the single core Centrinos from the dual cores. Dual core versions are Centrino Duo, and single core ones are Centrino. It looks like even the logo can be different for Core Solo compared to Pentium M.
acejj26 - Friday, May 18, 2007 - link
Power consumption is linear with respect to frequency and quadratic with respect to voltage, not exponential.
elpresidente2075 - Friday, May 18, 2007 - link
You do know that they are one and the same in this instance, right?
Goty - Friday, May 18, 2007 - link
x^2 is quadratic, a^x is exponential (a being some constant). Big difference.
JarredWalton - Friday, May 18, 2007 - link
a^x is indeed exponential... and yet, x could be something like... 2! :)
Seer - Friday, May 18, 2007 - link
The OP is right and you're completely wrong. I came on here to post the very same message.
The definition of x in this case is that it is a variable. If both x and a were constants... you'd just have a single number. There would be no relationship. Go back to Algebra I if you truly don't understand this.
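For reference, the relation this exchange is arguing over is the standard first-order approximation for CMOS dynamic power — a textbook sketch, not a vendor-published figure:

```latex
% Dynamic power of CMOS logic: \alpha = activity factor, C = switched
% capacitance, V = supply voltage, f = clock frequency.
P_{\mathrm{dyn}} \approx \alpha\, C\, V^{2} f
```

It is polynomial in both variables, as acejj26 says. And because attainable frequency scales roughly linearly with voltage, lowering voltage and frequency together cuts power roughly with the cube of frequency, which is why voltage/frequency scaling saves so much on mobile parts.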
goku - Friday, May 18, 2007 - link
Seeing innovations like this, where they're doing more with less, just really makes me happy. It's nice to see that they're addressing power concerns and working towards a powerful computer that can also be power conservative.
While the hardware industry is getting more efficient, unfortunately the software industry is following the trend of the P4: software is getting more and more inefficient and bloated while hardware is getting more efficient. It'd be nice if this trend would reverse and we'd start seeing better software written again.
yyrkoon - Friday, May 18, 2007 - link
I can not help but think 'WHY' a laptop *NEEDS* more than one core on the CPU. Anyone claiming that their laptop is comparable to the performance of a good desktop is only kidding themselves. From my point of view, a laptop is a tool, used to do whatever you can not do on the desktop, for whatever reason (traveling, away from your desktop, etc.). Yes, I understand that multiple-core CPUs have been available in laptops for some time now, but that does not really answer my question. The point I am getting at here is that if the laptop CPU could be made around a single core, there should be plenty of room for other enhancements, and a potential for a lower TDP.
quote: Now, instead of executing writes as soon as they show up, writes are stored in a buffer and once the buffer reaches a preset threshold the controller bursts the writes sequentially. What this avoids is the costly read/write switch penalty, helping improve bandwidth efficiency and reduce latency.
The statement above, to me, resembles something along the lines of the difference between USB 2.0, 400Mbit FireWire, and synchronous/asynchronous read/write capabilities; there *HAS* to be a performance hit here... Who knows, maybe I am wrong?
Justin Case - Sunday, May 20, 2007 - link
A second core does make a very significant difference in terms of system responsiveness, even if you rarely (or never) max out both CPUs. I started using dual-CPU systems about 8 years ago and never looked back.
More and more people are using a laptop as their only PC, so that theory that laptops are just typewriters isn't true anymore. They used to be just typewriters because that was all they _could_ do.
With proper power management, a dual-core CPU can consume as little power as a single-core one for the same amount of work, so the only issue becomes the price of the chip itself. And since dual-core CPUs are so cheap these days, there's really no reason not to have one.
Eris23007 - Friday, May 18, 2007 - link
quote: I can not help but think 'WHY' a laptop *NEEDS* more than one core on the CPU.
My company issued me a laptop, complete with anti-virus, software firewall, and hard disk encryption preinstalled that I can't turn off. My system runs two antivirus scans a week on the whole hard drive (at about 4 hours each). Given that an antivirus scan requires considerable hard disk access, that means that in addition to the processor load for running the virus-scanning algorithms, there's also substantial load for decryption.
Unfortunately this laptop is a single-core Pentium M - no Core 2 Duo for me. My friend who hired on a little after I did got the Core 2 Duo version. The difference is astounding. My laptop (on XP, of course) has significant usability constraints - even the UI just isn't very responsive because of all the crap running in the background.
I would *LOVE* to have an extra core around, dedicated to handling this bullcrap. It would make a tremendous difference in my ability to actually "get work done" when my system would otherwise be busy running all this junk.
I'm aware that the extra core wouldn't improve the hard disk situation - but it would DEFINITELY help a WHOLE lot.
ADDAvenger - Friday, May 18, 2007 - link
There are more people than you'd think who have a laptop as their only computer, and I am one of them. Yeah, a laptop doesn't NEED dual core, but believe me, it does make more of a difference than you want to give it credit for. I'm in college and take notes on this thing all day; a desktop just wouldn't work for me, and I just don't have the money to have both right now.
Wolfpup - Friday, May 18, 2007 - link
Totally agree. I can't wait for quad core laptops. And yes, I intend to use my laptop as my only computer.
sprockkets - Friday, May 18, 2007 - link
Because it was demonstrated that adding another core did not make any difference in battery life while increasing performance.
quote: The point I am getting at here is that if the laptop CPU could be made around a single core, there should be plenty of room for other enhancements, and a potential for a lower TDP.
The new Intel Merom M0 stepping processors already support something similar. With single-threaded apps, one core goes down to C3 while the other speeds up to a turbo frequency (bin + 1) while staying at or below the specified TDP.
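The policy JackPack describes is simple enough to sketch. This is a toy model only — the multiplier values and the decision rule are assumptions for illustration, not Intel's published algorithm:

```c
/* Toy model of TDP-constrained single-core turbo: when the sibling core
   is parked in C3, its power headroom funds one extra frequency bin on
   the active core. Numbers are illustrative, not Intel's. */
#include <stdbool.h>
#include <stdio.h>

enum { BASE_BIN = 12, TURBO_BIN = 13 }; /* e.g. 12 x 200MHz = 2.4GHz */

/* Choose the active core's multiplier from the sibling's sleep state. */
static int pick_bin(bool sibling_in_c3) {
    /* With one core power-gated, bin+1 on the other still fits the TDP. */
    return sibling_in_c3 ? TURBO_BIN : BASE_BIN;
}

int main(void) {
    printf("sibling in C3:  bin %d\n", pick_bin(true));
    printf("sibling active: bin %d\n", pick_bin(false));
    return 0;
}
```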
goku - Friday, May 18, 2007 - link
I agree with this as well. People really expect laptops to be something that can do everything, and it really is unreasonable. That said, AMD and Intel are making their chips more powerful so that people can start using laptops exclusively, as the market trend is to move away from desktop machines.
I'm wondering if AMD supports completely turning off a core and running in single-core operation, as I'd think running in single-core mode at a slightly higher clock speed (if at all) would be more power efficient than having two slower cores running simultaneously. No need to have two cores running when you're just typing in MS Word. It'd also be nice if Intel and AMD would support far slower clock speeds, as IIRC the minimum clock speed increased from 600MHz on the Dothan and Banias machines to 800MHz on newer machines.
If I could get my laptop to downclock to 100MHz when I'm typing in Word, and reduce the voltage considerably, I bet you'd see a much bigger jump in battery life. But with Vista being so bloated today, the computer would probably never downclock to a speed that low, since XP and Vista seemingly are always busy.
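On Linux, at least, both of the things goku asks for already have knobs. Here is a small C sketch using the kernel's standard sysfs interfaces for CPU hotplug and cpufreq; the paths are the stock kernel ones, but whether they exist depends on your kernel configuration, and writing them requires root:

```c
/* Hedged sketch: park the second core and let the first scale down when
   idle, via the kernel's standard sysfs files. Requires root; paths
   depend on kernel config (CONFIG_HOTPLUG_CPU, cpufreq + ondemand). */
#include <stdio.h>

static int write_sysfs(const char *path, const char *value) {
    FILE *f = fopen(path, "w");
    if (!f) { perror(path); return -1; }
    fputs(value, f);
    return fclose(f);
}

int main(void) {
    /* Take core 1 offline entirely... */
    write_sysfs("/sys/devices/system/cpu/cpu1/online", "0");
    /* ...and have core 0 drop to its lowest P-state when lightly loaded. */
    write_sysfs("/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor",
                "ondemand");
    return 0;
}
```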
CrystalBay - Friday, May 18, 2007 - link
Kinda sounds like Barcelona is not the panacea people were hoping for...
Hulk - Friday, May 18, 2007 - link
Anand pretty clearly stated that Griffin is not based on Barcelona but on the K8.
I'm not really getting how this strategy will help AMD in the mobile market, though. The current K8 core is getting pasted by the C2D today clock-for-clock, and from what I just read I don't think the improvements made to create Griffin will enable it to catch up to the C2D IPC-wise.
Add to that the fact that by the time Griffin is released, Penryn will be widely available at 45nm: more power efficient than current mobile C2D offerings, faster, and with better IPC.
Like I said, I don't get it.
mesyn191 - Friday, May 18, 2007 - link
For the mobile space, a cheap, power efficient platform (not just a CPU but a NB/SB, wireless chipset, graphics, driver support, etc.) is far more important than sheer CPU performance.
Griffin sure won't beat the C2D on IPC or clock speed, but it will provide AMD with a viable, low-cost, one-size-fits-all set of tech that they can sell to the OEMs, and that is where the major volume (and thus money) is.
Haltech - Friday, May 18, 2007 - link
Unless AMD thinks they can bring Barcelona to the mobile market, Intel will continue to gain market share on the mobile front. Plus, it seems like they're implying ATI graphics only when talking about discrete graphics.
One question: why 6 SATA ports and 14 USB ports?
tygrus - Sunday, May 20, 2007 - link
quote: One question: why 6 SATA ports and 14 USB ports?
Using a desktop chipset becomes a problem of overkill, wasting silicon and power. It needs to be slimmed down, but that would still be more expensive due to smaller production volumes.
How much power do they use when inactive?
strikeback03 - Tuesday, May 22, 2007 - link
Are there any desktop chipsets that support 14 USB ports? I don't remember seeing any.
JarredWalton - Friday, May 18, 2007 - link
There are plenty of notebook users that want decent performance with LOOOONG battery life. If AMD can provide better typical battery use, they could still make some waves. At present, Turion X2 is already competitive with C2D in low power states, but at load they fall behind. A few optimizations could help them in both areas.
Lonyo - Friday, May 18, 2007 - link
But that won't really help them in a year's time.
Competing in 12 months with what Intel has today means that when Intel has moved on 12 months, Intel will likely be ahead again, so overall nothing will have changed, although the consumer will reap the benefits :)