61 Comments

  • sykemyke - Wednesday, August 9, 2006 - link

    Hey, why don't we just put an FPGA block on the CPU?

    This way, programmers could create entirely new assembly commands, like a 3DES instruction or something.
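    Below is a minimal, purely illustrative sketch of how software might use such a block: a feature check dispatches to a "custom instruction" when the hypothetical FPGA unit is present, and to a software fallback otherwise. The function names are made up, and the XOR "cipher" is only a stand-in so the example runs; nothing like this exists on current CPUs.

        #include <stdint.h>
        #include <stdio.h>

        /* Stand-in for an operation implemented in on-die FPGA fabric and
         * exposed to software as a new instruction (hypothetical). */
        static uint64_t fpga_custom_op(uint64_t block, uint64_t key)
        {
            return block ^ key;   /* placeholder for the real hardware operation */
        }

        /* Plain software fallback for CPUs without the reconfigurable block. */
        static uint64_t software_op(uint64_t block, uint64_t key)
        {
            return block ^ key;   /* same result, computed without the accelerator */
        }

        int main(void)
        {
            int has_fpga_unit = 0;   /* would come from a CPUID-style feature check */
            uint64_t block = 0x0123456789abcdefULL, key = 0xfedcba9876543210ULL;

            uint64_t out = has_fpga_unit ? fpga_custom_op(block, key)
                                         : software_op(block, key);
            printf("%016llx\n", (unsigned long long)out);
            return 0;
        }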
  • unclebud - Monday, August 7, 2006 - link

    It was good to just read any sort of article from the site owner.
    I was feeling that the reviews section had fallen into the depths of fanboyism, so it was good just to hear somebody at least sometimes impartial THINKING out loud rather than just showing off.
    What's really interesting to me is that the whole article mirrors what was written in the latest (I think) issue of CPU magazine by the selfsame author.
    Good issue, incidentally. Will buy it from Wal-Mart hopefully tomorrow (they have 10% off magazines).
    Cheers, and keep representing -- I still have the 440BX benchmarks/reviews filed away in a notebook.
  • jp327 - Sunday, August 6, 2006 - link

    I'm not a gamer so I usually don't follow the video segment, but looking at the Torrenza
    slide on page 2 (this article), I can't help but see the similarity between what AMD forecasts and the PS3's Cell architecture:

    http://www.anandtech.com/showdoc.aspx?i=2379&p... (Cell)

    http://www.anandtech.com/showdoc.aspx?i=2768&p... (K8L)

    Doesn't AMD have a co-op of some sort w/ IBM?
  • RSMemphis - Sunday, August 6, 2006 - link

    I thought you guys already knew this, but apparently not.
    Most likely, there will be no Fab 30; it will be re-equipped to become Fab 38, a 300 mm fab with 65 nm features.
    Considering all the aging fabs out there, it makes sense to have the 90 nm parts externally manufactured.
  • xsilver - Saturday, August 5, 2006 - link

    Of the $5.4B purchase price for ATI, is most of that due to intellectual property?
    I mean, as you state, ATI has no fabs.


    And then regarding the future of GPUs: with CPUs now becoming more and more multithreaded, couldn't it be fathomable that some of the work is moved back to the CPU in order to fill that workload?
    Unless of course GPUs are also going multithreaded soon? (on die, not just SLI)
  • eugine MW - Saturday, August 5, 2006 - link

    I had to register just to say: well-written article. It has provided me with much more information regarding the merger than any other website.

    Very well written.
  • MadBoris - Thursday, August 3, 2006 - link

    How is the GPU on a CPU even considered a good idea by anyone?

    GPU bandwidth + CPU bandwidth = how the hell are motherboard buses and chipsets going to handle all that competing bandwidth from one socket? Either way, there is a crazy amount of conflicting bandwidth from one socket; I doubt it can be done without serious thrashing penalties.

    When I want to upgrade my video card, I have to buy some $800 CPU/GPU combo. :O

    Call me crazy, but that sounds like an April fools joke. But who's kidding who?

    It's doom and gloom for PC gaming, and AMD just made it worse.
  • JarredWalton - Thursday, August 3, 2006 - link

    Considering that we have the potential for dual socket motherboards with a GPU in the second socket, or a "mostly GPU" CPU in the second socket, GPU on CPU isn't terrible. Look at Montecito: 1.7 billion transistors on a CPU. A couple more process transitions and that figure will be common for desktop CPUs.

    What do you do with another 1.4 billion transistors if you don't put it into a massive L2/L3 cache? Hmmm... A GPU with fast access to the CPU, maybe multiple FSBs so the GPU can still get lots of bandwidth, throw on a physics processor, whatever else you want....

    Short term, GPU + CPU in a package will be just a step up from current IGPs, but long term it has a lot of potential.
  • dev0lution - Thursday, August 3, 2006 - link

    1. There was no mention of the channel in this article, which is the vehicle by which most of these products make it to market. Intel and Nvidia have a leg up on any newly formed ATI/AMD entity, in that they make sure their partners make money and are doing more and more to reward them for supporting their platforms. AMD has been somewhat confused lately, trying to keep promises to their partners on the one hand while trying to meet sales goals on the other.

    2. Intel and Nvidia could ramp up their partnership a whole lot quicker than AMD/ATI can (no pesky merger and integrating cultures to worry about), so now you have Nvidia with a long term, very gradual share shift on the AMD side with a quicker ramp up on the Intel side of things to replace ATI's share. Intel and Nvidia in the short term end up doing pretty well, with plenty of time to develop next gen platforms to compete with whatever the long term AMD/ATI roadmap looks like.

    3. AMD/ATI got more publicity and PR over this whole deal than they probably could have gotten with their annual marketing budgets combined. Everyone inside and outside the tech world has been talking about this merger, which isn't a bad way to get brand recognition for no additional investment.
  • s1wheel4 - Wednesday, August 2, 2006 - link

    This will be the end of AMD and ATI as we know them today... and the end of both in the high-end enthusiast market. When merged, the new company will be nothing more than a mediocre company, lagging behind both Intel and NVIDIA in performance.



  • Zebo - Wednesday, August 2, 2006 - link

    No worries... AMD runs a tight, efficient company that is accustomed to surviving
    through very hard times. AMD survived for a long time making chips that were cheap and almost as powerful as Intel's best. If they have to fall back to that business model to survive, they will. I personally loved those days of $40-$80 chips. But that's not realistic considering where AMD has been, their name and market presence currently, and the products on the table. AMD is a mainstream player now with a good reputation and large OEMs building their boxes with them. They ain't going anywhere.
  • poohbear - Wednesday, August 2, 2006 - link

    So if the deal goes through, will the ATI brand name disappear? Will we see AMD graphics cards instead of ATI graphics cards?
  • Sunrise089 - Wednesday, August 2, 2006 - link

    IMHO there would be no reason to abandon the second most valuable GPU name. When Ford bought Aston Martin they didn't suddenly rename the products things like Ford DB7.
  • erwos - Wednesday, August 2, 2006 - link

    Let me toss out a few random thoughts. I'm more of an economist than a businessman, but I took enough banking and finance to know enough to hurt myself.

    Almost all huge corporate mergers are not huge successes. Indeed, most of them tend to be failures unless the businesses are _very_ similar (gold mining company A buys out gold mining company B). My favorite example is Novell buying out SuSE and Ximian - everyone's doing operating systems, yet the best you can say is that it wasn't a complete failure. Certainly, the promised benefits haven't really emerged. Another good example is AOL and Time Warner.

    The bad news here is that ATI and AMD are in two different sections of the industry, and that for the proposed benefits of this merger to work, they're going to have to integrate very tightly. To make things worse, the benefits of integration aren't all that clear. GPU on a CPU? Who's been asking for that? It has certain implications for the embedded market (think Geode and system on a chip applications), but they hardly needed to buy a company the size of ATI to accomplish that particular goal. And it couldn't be to hand ATI the better fabs, either - as Anand pointed out, AMD isn't going to have any extra fab space in the medium-term outlook.

    My prediction: ATI-AMD will spend the next 9 months after the merger at _vastly_ decreased efficiency. Intel and nVidia will both be able to exploit this and take definitive leads in technology, at least for a while. In the long-term, ATI-AMD's dedication to high-end GPUs will fade, because the former-AMD executives running the company have absolutely no experience in the field. I am pessimistic, because, unfortunately, that is the historical truth.

    Personally, I think that if GPU on a CPU becomes the prevailing way to go, nVidia will just buy out VIA or Transmeta. And they'll probably have just the same problems as ATI and AMD will have, too... But there's no reason to toss those problems on yourself until you have to, and there was no really compelling reason for AMD to buy ATI at this moment in time.
  • Kim Leo - Wednesday, August 2, 2006 - link

    What are you talking about? OK, it's fine to compare other situations like this, but AMD didn't buy ATI just for the "integrated graphics in CPU" idea, and even though Hector Ruiz doesn't have too much experience in this sector, ATI's CEO, who will still be there, does, and I don't think that AMD won't listen to what he has to say about it. I think this will be great; AMD and ATI will both benefit from this, and they both get technologies that can be used in their own products.
  • erwos - Wednesday, August 2, 2006 - link

    There aren't two CEOs. There's one, and his name is Hector Ruiz. At best, ATI's CEO will get pushed into director of the graphics division. More to the point, AMD's the much bigger company, and it's more likely their corporate culture is going to dominate ATI's. ATI's CEO's opinion will matter, but it's not going to sway AMD like it did/does ATI.

    If the plan isn't to integrate GPUs on CPUs, what other benefit was there to acquiring ATI? What technologies is ATI going to give AMD, and vice versa?

    -Erwos
  • Sunrise089 - Wednesday, August 2, 2006 - link

    Couldn't the desire to purchase a healthy company with a high profit margin in a fast-growing industry be a benefit? I think everyone is too focused on integration in the short term. AMD had $$$, $$$ is there to spend or invest, and if the bean-counters at AMD think the ROI for buying ATI is higher than investing in a new fab or whatever, then they make that decision.
  • JarredWalton - Wednesday, August 2, 2006 - link

    The major benefit seems to be AMD getting a company with a reasonable chipset business, and they can work that to create better business platforms, thus helping to penetrate the lucrative business sector. Except, penetrating the business sector is extremely difficult, especially the corporate world. "Buy Intel and Dell" is the standard decision, and even if Dell isn't picked, almost all businesses buy Intel systems. They did this all through the "NetBurst failure", so why would they change now that Intel has a good chip again (Core 2)?
  • yacoub - Wednesday, August 2, 2006 - link

    So will we see a reference cooler design on future ATI cards that is less noisy than the silly thing on the X1800/X1900 series? ;P
  • jones377 - Wednesday, August 2, 2006 - link

    In Q1'06 the market share breakdown for all x86 chipsets was as follows:

    Intel 57%
    VIA 15%
    ATI 12%
    Nvidia 9%
    SiS 6%

    http://www.xbitlabs.com/news/chipsets/display/2006...

    Different breakdown for Intel and AMD platforms. Basically Nvidia has almost no share in the Intel platform market while ATI sells in both.
  • jjunos - Wednesday, August 2, 2006 - link

    I believe the high % ATI is getting here is because of a chipset shortage that Intel was experiencing in the last quarter. As the article states, ATI's chipset market share increased 400%.

    So as such, I wouldn't necessarily take this breakdown as permanent future marketshares.
  • JarredWalton - Wednesday, August 2, 2006 - link

    I added some clarification (the quality commentary). Basically, Intel is the king of Intel platform chipsets, and NVIDIA rules the AMD platform. ATI sells more total chipsets than NVIDIA at present, but a lot of those go into laptops, and ATI chipset performance on Intel platforms has never been stellar. Then again, VIA chipset performance on Intel platforms has never been great, and they still provide 15% of all chipsets sold.

    Of course, the budget sector ships a TON of systems, relatively speaking. That really skews the numbers. Low average profit, lower quality, lower performance, but lots of market share. That's how Intel remains the #1 GPU provider. I'd love to see numbers showing chipset sales if we remove all low-end "budget" configurations. I'm not sure if VIA or SiS would even show up on the charts if we only count systems that cost over $750.
  • jones377 - Wednesday, August 2, 2006 - link

    Before Nvidia bought ULi they had about 5% market share IIRC. And I bet of that current 9% Nvidia share, ULi still represents a good portion (probably on the order of 3-4%), and it's all in the low end. For some reason, Nvidia really sucks at striking OEM deals compared to ATI, which has historically been very good at it. The fact that Intel is selling motherboards with ATI chipsets is really helping ATI, though.

    According to my previous link, ATI has 28% of the AMD platform market despite being a relative newcomer compared to Nvidia there. I think even without this buyout, ATI would have continued to grow this share; now it is a certainty. I think Nvidia also realised this a while back, because they started pushing their chipsets for the Intel platform much more.

    Still, Nvidia will have an uphill struggle in the long term. If Intel chooses a new chipset partner in the future (they might just go it alone instead), they are just as likely to pick SiS or VIA (though I doubt they will go VIA) over Nvidia, and SiS already has a massive market share advantage over Nvidia there (well, since Nvidia has almost none, everyone has). So while ATI will likely lose most/all of their Intel chipset market share eventually, I doubt Nvidia will gobble up all of that. They will face stiff competition from Intel, SiS and VIA, in that order. The one bright spot for Nvidia is that they should continue to hold on to the profitable high-end chipset market for the AMD platform and grow it for the Intel platform. Still, overall this market is very small.

    And let's not even mention the mobile market... Intel has that one all gobbled up for itself with the Centrino brand, and this AMD/ATI deal will ensure that AMD has something similar soon. Given the high power consumption of Nvidia chipsets, which already makes them unsuited for the mobile market, even if they come out with an optimised mobile chipset their window of opportunity is all but gone there now.
  • Sunrise089 - Wednesday, August 2, 2006 - link

    In terms of style, I don't think the initial part of the article (the part with each company's slant) was very clear. Was it pure PR speak (which is how the NVIDIA part read) or AT's targeted analysis (how the Intel part read)?

    Second, I think quotes like this:
    "Having each company operate entirely independently makes no sense, since we've already discussed that it's what these two can do together that makes this acquisition so interesting."
    continue to show you guys are great tech experts, but may also suggest that you guys aren't the best business writers on the web (not saying I am either, of course). A lot of companies are acquired by another company solely for the purpose of making money, not for any sort of integration or creation of a competitive advantage. If a company is perceived to be undervalued, and another company feels it's currently in a good financial situation, it's a smart move to spend the $$$ on an acquisition if you feel the acquired company's long-term growth may outpace that of the currently wealthier company. Yahoo and Google do this sort of thing all the time, and big companies like Berkshire Hathaway do it as well. Do you think Warren Buffett really wanted to invest in Gillette to allow his employees to obtain cheaper shaving products? No, he simply felt he had money available and buying a share of Gillette would net him money in the long term.

    If the ATI/AMD merger only creates the possibility of major collaboration between the two companies in the future (basically as a hedge for both companies against unexpected or uncertain changes in the marketplace) but ATI continues to turn a profit when seen as an individual corporate entity, then the acquisition was the correct thing to do so long as AMD had no better use for the $$$ it will spend on the purchase.
  • defter - Wednesday, August 2, 2006 - link

    quote:

    ATI continue to turn a profit when seen as an individual corporate entity


    If the deal goes through, ATI won't be an individual corporate entity. There will be just one company, and it will be called "AMD".

    That's why Anand's comment makes sense. It would be quite silly e.g. for marketing teams to operate independently. Imagine: first AMD's team goes to OEM to sell the CPU. Then the "ATI" team goes to the same OEM to sell GPU/chipset. Isn't it much better to combine those teams so they can offer CPU+GPU/chipset to the OEM at once?
  • jjunos - Wednesday, August 2, 2006 - link

    I can't see ATI simply throwing away their name. They've spent way too much money and time building up their brand, why throw it away now?
  • Sunrise089 - Wednesday, August 2, 2006 - link

    I'm not sold on the usefulness of combining the marketing at all. Yes, for OEMs you would obviously do it, but why in the retail channel? ATI has massive brand recognition; AMD has none in the GPU marketplace. Even if the teams are the same people, using the ATI name would not at all be a ridiculous notion. Auto manufacturers do this all the time: Ford owns Mazda, and for economies-of-scale purposes builds the Escape and Tribute at the same plant. Then when selling to a rental company, they would both be sold by corporate fleet sales, but when sold to the public, Ford and Mazda products are marketed completely independently of one another.

    Once again, even if both companies are owned by AMD it is not impossible to still keep the two divisions fairly distinct, and that's where my "when seen as an individual corporate entity" comment came from.
  • johnsonx - Wednesday, August 2, 2006 - link

    quote:

    The 3D revolution killed off basically all giants in the graphics industry and spawned new ones, two of which we’re talking about today.


    Obviously you are referring to ATI and NVIDIA. The 3D revolution certainly did spawn NVIDIA, but my recollection says that ATI has been around far longer than that. I think I still have an ATI Mach-32 EISA-bus graphics card in a box somewhere, and that was hardly ATI's first product. ATI products even predate the 2D graphics accelerator, and even predate VGA if I recall correctly (anyone see any 9-pin monitor plugs lately?). I do suppose your statement is correct in the sense that there were far more graphics chip players in the market 'back then'; today there really are just two giants and about 3 also-rans (Matrox, SiS, XGI). ATI was certainly one of the big players 'back then'; indeed it took me (and ATI too, for that matter) quite some time to figure out that in the 3D market ATI was a mere also-ran themselves for a while; the various 3D Rage chips were rather uncompetitive vs. the Voodoo and TNT series of the times.

    No offense to you kids who write for AT, but I actually remember the pre-3D days. I sold and serviced computers with Hercules Monochrome graphics adapters, IBM CGA and EGA cards, etc. The advent of VGA was a *BIG DEAL*, and it took quite some time before it was at all common, as it was VERY expensive. I remember many of ATI's early VGA cards had mouse ports on them too (and shipped with a mouse), since the likely reason to even want VGA was to run Aldus Pagemaker which of course required a mouse (at least some versions of it used a run-time version of Windows 1.0... there was also a competing package that used GEM, but I digress).

    To make a long story short, in turn by making it even longer, ATI was hardly 'spawned' by the 3D revolution.


    now I'll just sit back and wait for the 'yeah, well I remember farther back than you!' flames, along with the 'shut up geezer, no one cares about ancient history!' flames.
  • Wesley Fink - Wednesday, August 2, 2006 - link

    Not everyone at AT is a kid. My 3 children are all older than our CEO, and Gary Key has been around in the notebook business since it started. If I recall, I was using a CP/M-based Cromemco when Bill Gates was out pushing DOS as a cheaper alternative to expensive CP/M. I also had every option possible in my earlier TI-99/4A expansion box. There may even be a Sinclair in a box in the attic - next to the first Apple.

    You are correct in that ATI pre-dates 3-D and had been around eons before nVidia burst on the scene with their TNT. I'm teaching this to my grandchildren so they won't grow up assuming - like some of our readers - that anyone older than 25 is computer illiterate. All my kids are in the Computer Business and they all still call me for advice.
  • Gary Key - Thursday, August 3, 2006 - link

    I am older than dirt. I remember building and selling Heath H8 kits to pay for college expenses. The days of programming in Benton Harbor Basic and then moving up to HDOS and CP/M were exciting times, LOL. My first Computer Science course allowed me to learn the basics to program the H10 paper tape reader and punch unit and sell a number of units into the local Safeway stores with the H9 video/modem (1200 baud) kit. A year later I upgraded everyone with the H-17 drive units, dual 5.25" floppy drives ($975) that required 16k ($375) of RAM to operate (base machine had 4k of RAM).

    Anyway, NVIDIA first started with the infamous NV1 (VRAM) or STG2000 (DRAM) cards that featured 2D/3D graphics and an advanced audio engine (it far exceeded Creative Labs' offerings). Of course, Microsoft failed to support Quadratic Texture Maps in the first version of Direct3D, which effectively killed the cards. I remember having to dispose of several thousand Diamond EDGE 3D cards at a former company. They rebounded of course with the RIVA 128 (after spending a lot of time on the ill-fated NV2 for Sega, but it paid the bills) and the rest is history.

    While ATI pre-dated most graphics manufacturers, they were still circling the drain from a consumer viewpoint and also starting to lose OEM contracts (except for limited Rage Pro sales due to multimedia performance) in 1997 when they acquired Tseng Labs. Thanks to those engineers the Rage 128 became a big OEM hit in 1998/1999, although driver performance was still terrible on the consumer 3D side even though the hardware was competitive, but it led to the once-again OEM hit, the Radeon 64. The biggest break came in 2000 when they acquired ArtX, and a couple of years later we had the R300, aka the Radeon 9700, and the rest is history. If S3 had not failed so badly with driver support and buggy hardware releases in late 1998 with the Savage 3D, ATI very well could have gone the way of Tseng, Trident, and others, as S3 was taking in significant OEM revenue from the Trio and ViRGE series chipsets.

    Enough old fart history for tonight, back to work..... :)
  • johnsonx - Thursday, August 3, 2006 - link

    Yep, you two are both old. Older than me. Heath H8? I didn't think selling candy bars would pay for college. You actually had to build candy bars from a kit back then? Wow. ;)

    Mostly the 'kids' comment was directed at your esteemed CEO, and maybe Kubicki too (who I'm well aware is with Dailytech now), and was of course 99.9% joke. Anand may be young, but he's already accomplished a lot more than many of us ever will.
  • Gary Key - Thursday, August 3, 2006 - link

    where is the edit button... led to
  • PrinceGaz - Wednesday, August 2, 2006 - link

    Well, according to ATI's investor relations webby and also Wikipedia, they were founded in 1985 and started by making integrated-graphics chips for the likes of IBM's PCs, and by 1987 had started making discrete graphics cards (the EGA Wonder and VGA Wonder).

    Yes, they quite obviously do predate the 3D revolution by many years. VGA graphics date from 1987, and no doubt the VGA Wonder was one of the first cards supporting it. I imagine the EGA Wonder card they also made in 1987 would have had the 9-pin monitor connection you mention, as that is the EGA standard (I've never used it but that's what the Wiki says).

    All useless information today really, but a bit of history is worth knowing.
  • johnsonx - Wednesday, August 2, 2006 - link

    Yep, I stuck quite a few EGA and VGA wonder cards in 386's and 486's back then. They were great cards because they could work with any monitor. Another minor historical point: Monochrome VGA was common in those days too - better graphics ability than old Hercules Mono, but hundreds of $ less than an actual color monitor.
  • yacoub - Wednesday, August 2, 2006 - link

    Your comment should get rated up b/c you correctly state that ATI has been around for some time. Let us also not forget that NVidia bought 3dfx; 3dfx did not simply disappear. And Matrox, while mostly focused on the graphic design / CAD market with their products, has also survived their forays into the gaming market with products like the G200 and G400. Perhaps basing your graphics card company in Canada is the trick? :)
  • johnsonx - Wednesday, August 2, 2006 - link

    Well, 3dfx was dead. NVidia was just picking at the carcass. Matrox survives only because they make niche products for professional applications. Their 3D products (G200/G400/G450, Parhelia) were hotly anticipated at the time, but quickly fell flat (late to market, surpassed by the competition by the time they arrived, or very shortly after).
  • mattsaccount - Wednesday, August 2, 2006 - link

    >>NVIDIA also understands that dining with Intel is much like dining with the devil: the food may be great but you never know what else is cooking in the kitchen.

    The food in Intel's cafeteria is actually quite good :)
  • stevty2889 - Wednesday, August 2, 2006 - link

    Not when you work nights... it really sucks then.
  • dev0lution - Thursday, August 3, 2006 - link

    But the menu changes so often you don't get bored ;)
  • NMDante - Wednesday, August 2, 2006 - link

    Night folks get shafted with cafe times.
    That's probably why there are so many 24 hr. fast food offerings around the RR site. LOL
  • HopJokey - Wednesday, August 2, 2006 - link

    quote:

    The food in Intel's cafeteria is actually quite good :)

    I beg to differ. It gets old after a while :(
  • Regs - Tuesday, August 1, 2006 - link

    The distant future looks good, though we have yet to see any more green slides about new core technologies from AMD. It almost seems AMD will be making baby steps for the next 5 or so years to try to compete with the performance Intel is currently offering.

    For stockholders - let's just hope AMD can pull something off to gain revenue from other markets with the help of Dell and ATI. Their growing capital outlays and recent acquisition need some definite profits to pay them off.
  • AnandThenMan - Tuesday, August 1, 2006 - link

    I think it's fair to say the article has a very strong pro-Intel and pro-NVIDIA slant. For starters, it needs to be pointed out that ATI is actually the #2 graphics maker, not NVIDIA. Saying that NVIDIA is #1 in the desktop space covers only part of the market, so why state it that way? Trying to make NVIDIA look good, of course...

    And this:
    quote:

    It really wouldn't be too shocking to see the whole merger evaporate and for ATI and AMD to just continue on their present, independent paths -- certainly no more surprising than the initial announcement.

    This statement is just dumb. Unless the planet is destroyed by an asteroid, the deal is pretty much done. It is HIGHLY unlikely that the deal will not happen.
  • defter - Wednesday, August 2, 2006 - link

    The desktop market is a very important market, since most of the profits are made in the high-end desktop market.

    For example, ATI has a much bigger overall market share than NVidia (27.6% vs 20.3%) and has a lot of presence in other markets (consumer electronics, handhelds). Still, NVidia has bigger revenue, meaning that the ASP of NVidia chips is much higher.

    If you look at profits, the difference is even bigger, during the last quarter, NVidia made three times as much profit as ATI. Thus high-end desktop market is definitely very important.

    Here are some GPU market share numbers for Q2:
    http://www.xbitlabs.com/news/video/display/2006073...
  • PrinceGaz - Wednesday, August 2, 2006 - link

    quote:

    The desktop market is very important market since most of the profits are made in the high-end desktop market.


    Most of the profits are not made in the high-end desktop market, in fact the very high end probably struggles just to break even due to the relatively tiny number of units shipped compared to development costs. Most of the money in discrete graphics is actually made in the low-end discrete graphics segment, cards like the 7300 and the X1300.
  • defter - Wednesday, August 2, 2006 - link

    This is like saying: "most of the revenue is made on $100 CPUs instead of FX/Opteron parts..."

    The revenue can be higher on the low end of the market. But GPUs like the 7300/X1300 are selling at $20 or less, and profit margins for those can't be very high. High-end chips like the 7900/X1900 are selling for about $100 and the margins are much higher. (Compare the die sizes of the 7900 and 7300; the difference isn't THAT big.)
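    As a rough back-of-the-envelope illustration of that margin argument (all prices, costs and volumes below are assumed purely for the sake of the example, not actual ATI/NVIDIA figures):

        #include <stdio.h>

        int main(void)
        {
            /* Assumed, illustrative numbers only: low-end (7300/X1300 class)
             * vs high-end (7900/X1900 class) GPU chips. */
            double low_asp = 20.0,  low_cost = 12.0;   /* selling price, per-chip cost */
            double high_asp = 100.0, high_cost = 45.0;
            double low_units = 5e6, high_units = 1e6;  /* assumed unit volumes */

            printf("low-end margin:  %.0f%%, profit/chip $%.0f\n",
                   100.0 * (low_asp - low_cost) / low_asp, low_asp - low_cost);
            printf("high-end margin: %.0f%%, profit/chip $%.0f\n",
                   100.0 * (high_asp - high_cost) / high_asp, high_asp - high_cost);

            /* Even at 5x the volume, the low end earns less total profit here. */
            printf("low-end total profit:  $%.0f\n", low_units * (low_asp - low_cost));
            printf("high-end total profit: $%.0f\n", high_units * (high_asp - high_cost));
            return 0;
        }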
  • JarredWalton - Wednesday, August 2, 2006 - link

    Hey, I'm a skeptic and you can blame me for the comment. Still, until the deal is well and truly done we have a proposed merger. Government interference, cold feet, whatever other setback you want... these things can and do happen. Do I think the deal *won't* happen? Nope - no more than I think the deal *will* happen. If you had asked me three months ago when I first heard the rumors, I think I would have been about 90% sure it wouldn't happen, so obviously I'm less skeptical now than before.

    As for the NVIDIA and Intel slant, the NVIDIA perspective is their view. That doesn't mean it's correct, any more than the ATI, AMD, or Intel perspectives. However, ATI is #2 for the same reason Intel is #1: integrated graphics, specifically on laptops, and again we're talking about the underpowered, mediocre kind that will choke on Vista's Glass GUI. Wipe out all of the low-end GPUs, and NVIDIA has a clear lead in the market. Not in performance, necessarily, but in mindshare and brand recognition? Definitely. We are an enthusiast website, and so we're looking at the stuff that moves the market forward, not just what suffices to run office apps.
  • AnandThenMan - Wednesday, August 2, 2006 - link

    quote:

    Intel is #1: integrated graphics, specifically on laptops, and again we're talking about the underpowered, mediocre kind that will choke on Vista's Glass GUI. Wipe out all of the low-end GPUs, and NVIDIA has a clear lead in the market.

    Being #1 in one market is not good enough anymore. NVIDIA NEEDS to be in the integrated graphics sector, the ultra thin mobile sector, the console market, the HD devices market etc. etc. This is where ATI is much more diverse than NVIDIA.

    The article is about the implications of AMD/ATI and how it affects Intel, NVIDIA, and the whole industry. I understand what you are saying about the discrete enthusiast market, and naturally this is the most interesting and desirable segment we all like to talk about. But the merger is about much more than that. IMO, NVIDIA has to re-invent itself to be capable of taking on AMD/ATI. NVIDIA has come out and bragged about how they are now the "last man standing", but this is marketing spin at best. NVIDIA went on the record years ago as saying they want to "be wherever there is a pixel", but honestly, AMD/ATI is far better positioned to deliver this than NVIDIA IMO.
  • defter - Wednesday, August 2, 2006 - link

    quote:

    NVIDIA NEEDS to be in the integrated graphics sector, the ultra thin mobile sector


    Care to elaborate? NVidia is doing fine financially, so why does it NEED to be strongly present in those sectors?

    quote:

    the console market


    NVidia has been in the console market since 2001.
  • Calin - Wednesday, August 2, 2006 - link

    NVidia IS in the integrated graphics sector - if you are referring to the "enthusiast" integrated graphics sector.
  • leexgx - Wednesday, August 2, 2006 - link

    I have only seen integrated graphics on Nvidia-based chipsets.
    Most onboard video is VIA/S3 or SiS integrated graphics (Intel chipsets being Intel video).
  • Calin - Thursday, August 3, 2006 - link

    ATI RS480, RS482 and RS485 are in this game too (chipsets with integrated video). They were plagued by southbridge problems - slow USB performance mainly, and lack of features (like SATA 2). Whether or not this was detrimental to them, I don't know.
    (you can find mainboards with ATI integrated chipsets from MSI and ECS)
  • JarredWalton - Wednesday, August 2, 2006 - link

    I agree with that. I think NVIDIA's comments are far more bravado than actual truth. However, if they can convince investors and consumers that they've "won the GPU war", it may not matter.

    My big problem with the deal: I don't know what AMD is doing spending $5.4 billion on ATI. Not that ATI is bad, but that's almost two new fabs. That's a lot of talented engineers making a lot of money for several years at least. I think ATI would be insane not to take the offer, but I feel AMD is almost equally insane to make the offer in the first place.
  • Furen - Wednesday, August 2, 2006 - link

    AMD is borrowing the money to buy a viable, self-sufficient company. Convincing banks to let you borrow money for two fabs that will help you out 3+ years down the line is not very easy, especially considering that most people seem to think that AMD's growth is slowing down. Heck, having two extra fabs in 3 years could mean that AMD will just have lots of extra capacity with no use for it. Also, $5.4B for ATI is dirt cheap. Well, maybe not dirt cheap but undervalued considering that its portfolio rivals or surpasses nVidia's in many ways.
  • AnandThenMan - Wednesday, August 2, 2006 - link

    quote:

    My big problem with the deal: I don't know what AMD is doing spending $5.4 billon on ATI. Not that ATI is bad, but that's almost two new fabs. That's a lot of talented engineers making a lot of money for several years at least. I think ATI would be insane to not take the offer, but I feel AMD is almost equally insane to make the offer in the first place.

    It is a big gamble for AMD, no doubt. A make-or-break deal in fact. But I would hope that the top brass has carefully considered the costs and the future markets/profits/advantages. With any high stakes game, the rewards are spectacular, but the cost of failing can mean you're history.

    I suppose having new fabs does you little good if you can't offer a platform to the Dells and HPs out there. There is no doubt in my mind about one thing: AMD is aiming straight at Intel's integrated platform approach.
  • Calin - Wednesday, August 2, 2006 - link

    AMD is trying to get into the game of long-term selling. The corporate computing initiative they had some years ago (or maybe a year ago) was a first step - "freeze" a computer configuration, which you can then offer for a long time (like 3 years). If a computer breaks, move its hard drive into a new computer in the same line, and have everything working with GUARANTEED no problems.
    AMD did well in taking the enthusiast market by storm - but this market has NO loyalty whatsoever - people will upgrade everything they need and everything they don't in order to get the next big thing. Having guaranteed revenue of mostly guaranteed value beats that (having non-guaranteed revenue of big or small value, like it happens now).
    AMD is much more ready to go into the corporate market - selling desktop computers, not just servers as it did until now.
    Also, take into account that if AMD is behind on the "next big thing" (whatever this might be), it really does not have the money to play catch-up. Intel has both the money and the market inertia to continue to be a big player even when everything else is against its products. So, AMD is putting its future on a bet that the next big thing will be core-integrated graphics. If this works, they would reap huge benefits - just like they were able to with the Athlon64 on desktops/Opteron on servers (and somewhat the Turion on mobiles). Before the Opteron days, AMD was largely nonexistent in the server space (the Athlon MP started to make a buzz, but they had little market share).
    Would the money have been better spent on two fabs? AMD and ATI are both using external partners for creating chips, and this is more expensive only in the long run. In the short run, paying more for chips beats paying $3 billion to have your fab ready in three years. I figure the use of external fabs will continue for a long time in the future, and just the top-of-the-line products will be built in AMD's fabs.
  • darkdemyze - Wednesday, August 2, 2006 - link

    quote:

    Also, take into account that if AMD is behind in the "next big thing" (whatever this might be), it really does not have the money to play catchup.


    You say AMD doesn't have the money to play catch-up, and it's true that the whole deal is "a bet." But how else is AMD supposed to catch up, with C2D being released? Intel is going to have a huge impact on the performance sector by the end of the year - about the same time this merger is projected to be completed. What I mean by this is that Intel is now ahead of the curve on AMD with this new architecture and its "new architecture every 2 years" roadmap, and Intel intends not to let the performance crown slip again as it did with the Pentium 4.

    So what is AMD to do to keep up? As you said, place a bet on "the next big thing" and hope for the best. I'm not discrediting AMD for K8L, or Torrenza for that matter. But I think at the very least Torrenza will be greatly affected by this endeavor. Personally I feel this is a very positive acquisition.
  • Calin - Thursday, August 3, 2006 - link

    K8L is just a few months from launch, and it might get AMD to performance parity with Intel (or exceed Core 2 Duo, or be left behind). I am hoping for a draw or a win for AMD.
    What AMD needs is a cash cow (as Athlon64 was until now). Will the ATI acquisition bring this to table? It could very well be so, and there are enough niches and market slices where this strategy is a winner.
    Unfortunately, this might (or might not) reduce the competition in the high-end video card arena...
  • Nelsieus - Wednesday, August 2, 2006 - link

    I strongly disagree with you.

    This, thus far, has been the best summary coverage I've read on this issue.


  • PeteRoy - Tuesday, August 1, 2006 - link

    AMD did not have its own chipset with integrated graphics, audio and LAN, which is why it never made it into the offices where the big money is.
  • Thatguy97 - Friday, January 29, 2016 - link

    Well that went to shit
