
  • drwho9437 - Saturday, August 8, 2009 - link

    I think Anand is correct that fabs are ridiculously expensive and that the volume leader has a major advantage. However, it remains to be seen if the spin-off is a good move for AMD.

    The honest truth is that 15 nm is probably the end of the road for gate lengths. That is only a few generations away. Lithography can improve all you want it to, but leakage and process variability become more and more important as you go down in size. A 10 nm cube of silicon has about 100,000 atoms in it. Even at 10^20/cm^3 doping levels you are talking about only ~100 dopant atoms in the channel. That's 10% statistical variability in the doping... 32, 28, 22, 18... end of the road, my friends. For traditional silicon, that is. There really isn't anything right now that can easily replace it; it's just so nice chemically...
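
    For anyone who wants to sanity-check that arithmetic, here is a quick back-of-the-envelope sketch in Python. It assumes a crystalline silicon density of ~5e22 atoms/cm^3 (which actually gives ~50,000 atoms, the same order of magnitude as the 100,000 quoted above) and simple Poisson statistics (sqrt(N) fluctuation) for the dopant count:

        # Rough dopant-statistics estimate for a 10 nm cube of silicon.
        si_density = 5e22        # atoms per cm^3, crystalline silicon (assumed)
        doping     = 1e20        # dopant atoms per cm^3
        side_cm    = 10e-7       # 10 nm expressed in cm
        volume     = side_cm**3  # 1e-18 cm^3

        atoms   = si_density * volume  # ~50,000 silicon atoms
        dopants = doping * volume      # ~100 dopant atoms
        sigma   = dopants**0.5         # Poisson fluctuation, ~10 atoms

        print(f"Si atoms: {atoms:.0f}, dopants: {dopants:.0f}, "
              f"doping variability: {100 * sigma / dopants:.0f}%")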

    People will say the end has been predicted forever, and then claim I am wrong (at least that is what normally happens when I say this). I may be wrong, but that isn't why. The sun comes up every day, but one day it won't. One day we will hit the physical limits of silicon, and from my point of view it is the statistics that will certainly be a problem. I think you'll see problems at around 10-15 nm.

    What is exciting is the idea that we will finally have to be more clever than drawing smaller lines in Si and think of something truly original to make progress...

    But given that that original idea may either take time to materialize or be expensive, I think you will see reliable Si for a LONG time at some dimension. 40 or 32 nm may be with us for decades in many types of systems... Given this, the upgrading of fabs may stop because there will be no more upgrades. Then you ask whether you want control over them or not... I can see arguments on both sides. Certainly in the near term AMD had no choice, cash-flow-wise.

    There are ways to innovate other than die shrinks, such as leaning on optics and chemistry. We shall see if we are clever enough.
  • cosminliteanu - Monday, August 3, 2009 - link

    Thank you for this article :)
  • zodiacfml - Sunday, August 2, 2009 - link

    Thanks Anand! Some good posted comments too.
  • toyotabedzrock - Saturday, August 1, 2009 - link

    $88M / 1,400 jobs = $62,857 per job. So a lot of the jobs must be for less highly skilled people? I consider highly skilled to be worth $80-100K.

    Doesn't the US control how much of our most advanced tech is exported to certain countries? We only just recently allowed Intel to build a 65nm fab in China.
  • drwho9437 - Saturday, August 8, 2009 - link

    A typical fab worker would probably hold a BS in EE, Chem, or Mat Sci, starting at about $42-50K when I graduated undergrad. The $100K slots go to the MS/PhDs who run the line or whatever.
  • mesiah - Thursday, July 30, 2009 - link

    Great article! I love reading stuff like this. I am by far not an expert in the field, but it sounds like EUV could revolutionize the chip-making process, although it looks like it still has a significant number of hurdles to overcome, especially when it comes to WPH (wafers per hour). Also an exciting prospect, but many more years off, is the current research into graphene. It's hard to imagine the processing power we will be able to harness in the near future. A lot of people take for granted the devices we use today. Things like iPhones and netbooks were hardly dreamed of when I was a kid. Sure, we talked about seeing stuff like this 15 years ago, but I guess I never believed it would happen so fast.
  • MODEL3 - Wednesday, July 29, 2009 - link

    Great article.

    I just have some questions regarding fab companies such as TSMC:

    Until 180nm, we had a very good tempo for manufacturing process transitions.
    Then suddenly, from 180nm to 90nm, there was a very profound deceleration.
    Then from 90nm to 40nm (and for the medium-term future) we have an increase in tempo again.

    Why do they keep increasing the tempo?

    You said that the R&D cost increased dramatically and that the fab start-up cost also increased dramatically. (I am not arguing; after all, I don't have a technological background.)

    So if, for a company such as TSMC, the cost has increased so much more relative to the past, then it should charge its customers much more per wafer (with wafer size factored in), right?

    Then why, between 90nm and 55nm, did the die size per price range increase exponentially? (I did the calculations with Nvidia chips because the "die information" was more easily found on the net.)

    This is a fact.

    Unless your explanation is simply that Nvidia & ATI made way, way more money back then, or that TSMC was robbing Nvidia/ATI back then, what is your explanation?

    I mean that, yes, it is easy to figure out that Nvidia & ATI made more money back then, but the financial data does not support such a huge difference.
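
    As a rough illustration of those economics, here is a sketch using the classic dies-per-wafer approximation; the $5,000 wafer price is purely a made-up placeholder, and yield is ignored:

        import math

        def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
            # Usable wafer area minus an edge-loss term (standard approximation).
            r = wafer_diameter_mm / 2
            return int(math.pi * r**2 / die_area_mm2
                       - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

        wafer_cost = 5000.0  # hypothetical 300mm wafer price, in dollars
        for die_area in (100, 200, 400):  # die sizes in mm^2
            n = dies_per_wafer(300, die_area)
            print(f"{die_area} mm^2 die: {n} dies/wafer, "
                  f"~${wafer_cost / n:.0f} per perfect-yield die")

    The per-die cost rises faster than die area (yield losses make it even worse), which is why the wafer price a foundry charges matters so much for big GPUs.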

    My guess (out of the conspiracy book) is that a valid successor to silicon-based methods already exists and they are just milking the current technology (for the next 5-10 years).

    (I am not talking about technologies to scale down lithography, like alpha EUV; I am talking about THE next step after silicon-based processes.)


    On another subject, you said:
    "Note that this 28nm process is a “half-node” (between 32nm and 22nm) and where I’m expecting to see ATI (and NVIDIA) GPUs made at Globalfoundries."

    Why are you expecting 28nm to be that process?

    Do you think that after 28nm TSMC will skip 22nm (like they did with 45nm) and focus on 20nm (so the transition will take them more time), giving GF an opportunity to offer something competitive?
  • drwho9437 - Saturday, August 8, 2009 - link

    High-k and SOI, both expensive, gave new breathing room. But they are both stop-gap measures... Bigger problems are coming. Expect 32 nm and the next node to go OK; then bigger fish enter the picture.

    Also, don't expect way higher frequencies. 2 GHz is pretty easy to design for. I can do a PCB that is flat to 2 GHz in about an hour... Designing in the X band, however, is painful.

    Still, there are all sorts of interesting things going on outside Si: organics, molecular switches, terahertz, optics on chips, the return of magnetics. They are all a pain in the butt compared to better lithography though :-)
  • Zagor Tenay - Wednesday, July 29, 2009 - link

    Great article! It is such a relief to see that the whole semiconductor industry is not a captive at the hands of an ugly, greedy beast called INTEL.
  • Zingam - Wednesday, July 29, 2009 - link

    Don't forget that AMD owns GF. If GF makes a profit, AMD gets revenue.
    I don't think it's true that AMD doesn't have foundries anymore; they are just no longer attached to them. It isn't the same as NVIDIA, which doesn't possess anything like GF.
  • iwodo - Wednesday, July 29, 2009 - link

    There are loads of questions to be asked; I hope Anand can provide some insight.

    1. AMD's problem is not having enough fabs to manufacture its chips and therefore keep supply healthy. Consider that AMD has had 30% market share before, without much help from OEMs; they were doing very well. And that's discounting ATI's chips, which are not even produced at GF. They don't even have enough capacity to keep up with their own products, so why are they signing up other players?

    2. The x86 issue - I believe everyone is allowed to design an x86 CPU, but manufacturing one requires a license from Intel. Now GF has the license to make x86 chips for AMD. Doesn't that mean other players could theoretically make x86 CPUs (Nvidia??)? Of course, getting x64 from AMD is a different story.

    3. What is the current physical limit we can shrink down to? Atoms are measured in pm (1 nm = 1000 pm). I know Intel has a roadmap down to 11nm. If we can get to the single-digit nm scale, then I honestly believe ARM doesn't have a fighting chance against Intel. Not to mention that 3D and multi-gate tech will help.

    4. Any info on how many fabs Intel, Samsung, and the other players own?

    5. Are fabs for Flash a lot cheaper?

  • willstoddard - Wednesday, July 29, 2009 - link

    I think the limit is not necessarily going to be a physical limit as much as a fiscal one. The photolithography process is the fab limiter; it always has been. The Nikon S620 is just beginning to ship to customers (currently Intel only). It is targeted at the 32nm node. ASML has the XT:1950Hi at the same node. Intel will push these machines beyond their advertised specs to produce 22nm gate layers.

    Both Nikon and ASML have EUV (x-ray) machines in second-stage beta. However, there are still significant hurdles to overcome. EUV requires exposure under cryo-vacuum pressures, which is new to both companies. X-rays do not penetrate quartz or CaF2 optics, so only mirrors can be used, including in the projection lens system; this causes significant amounts of flare. They will work through these and many other issues, but my crystal ball tells me the biggest hurdle will be making the technology economically viable for IC makers. The cost of one 32nm-node immersion lithography scanner is $40+ million. You can negotiate a better deal if you buy more. ASML tends to be ~$7 to 10 million more per machine than the comparable Nikon because they have to buy the projection lens from Carl Zeiss; Nikon makes their own. The larger fabs have as many as 40 of these machines. These machines have a throughput of 170 (300mm) wafers per hour (ASML) or 200 WPH (Nikon). The EUV machines will cost $80M+ and will have ~40 WPH (300mm) throughput, limited by the illumination source and delivery system.

    When the machines cost twice as much and have 1/5th the output capacity, it is difficult to make a viable business model.
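
    A quick back-of-the-envelope comparison using the numbers above (tool price and throughput only; it ignores uptime, masks, resist, and depreciation schedules):

        # Capital cost per unit of throughput for the two tool types.
        def capital_per_wph(tool_price_usd, wafers_per_hour):
            return tool_price_usd / wafers_per_hour

        immersion = capital_per_wph(40e6, 200)  # ~$40M scanner, ~200 WPH
        euv       = capital_per_wph(80e6, 40)   # ~$80M+ scanner, ~40 WPH

        print(f"EUV capital cost per WPH is {euv / immersion:.0f}x immersion")
        # Twice the price at one fifth the throughput -> roughly 10x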

    So what will happen? I think EUV will still need to mature for another 5-7 years before it is a viable option.

    The article states that immersion technology lowers the wavelength of the light as it passes through water. In fact, it does no such thing. It simply changes the refractive index of the medium the light is travelling through, from ~1.00 for air to ~1.44 for water. This changes the angle of the light leaving the projection lens and allows higher-order diffracted rays to reach the wafer surface, which increases the contrast of the image.
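
    To put numbers on that: the usual Rayleigh criterion is R = k1 * lambda / NA, with NA = n * sin(theta), so raising n from ~1.00 to ~1.44 shrinks the printable feature without changing the 193nm wavelength at all. A quick sketch (the k1 and sin(theta) values are just illustrative assumptions):

        def min_feature_nm(k1, wavelength_nm, n, sin_theta):
            # Rayleigh criterion: R = k1 * lambda / NA, with NA = n * sin(theta)
            return k1 * wavelength_nm / (n * sin_theta)

        k1, lam, sin_theta = 0.30, 193.0, 0.93  # assumed values, for illustration
        print(f"dry (n=1.00):       {min_feature_nm(k1, lam, 1.00, sin_theta):.0f} nm")
        print(f"immersion (n=1.44): {min_feature_nm(k1, lam, 1.44, sin_theta):.0f} nm")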

    I think we will see other innovations in the current lithography systems that will get us to 12nm. Researchers are looking for a replacement for DI water to further increase the index of refraction in immersion systems. Exposing with polarized light allows much greater lens NA values without a loss of contrast or depth of focus. Unfortunately, 157nm as an exposure wavelength is not viable because the optics (PL and beam shaping) degrade too fast and would have to be replaced too often.

    If solutions for the EUV issues are not found within the next 5 years, we may see a very large price spike in high-end ICs (both logic and memory).
  • Anand Lal Shimpi - Wednesday, July 29, 2009 - link

    1) I don't believe AMD is presently capacity constrained. Also remember that AMD is effectively only running one fab right now; as Fab 1 Module 2 migrates to 28nm, it shouldn't be a problem taking care of both ATI's and AMD's needs.

    2) I don't believe GF gets the license to make x86 CPUs; AMD retains the license, and it's their belief that they can use that license to produce CPUs at GF. So far I haven't heard anything more about what AMD is trying to do being illegal.

    3) GF is saying that getting down to 22nm isn't an issue, and with EUV, moving down to 16nm should be possible too. I've heard that moving beyond that is going to be possible as well, but we're going to start seeing a lot more interesting techniques used to put transistors together. Multi-gate transistors and through-silicon vias are coming.

    4) Intel has a ton of fabs, but not all of them are on modern manufacturing processes. Some are still producing at 130nm and 90nm, I believe for chipsets. The number to pay attention to is 4 x 32nm fabs in 2010 for Intel.

    5) I don't believe the fabs themselves are much cheaper, but the chips are far simpler to produce.
  • Shilohen - Wednesday, July 29, 2009 - link

    For 3, you're not exactly correct. Atoms are often measured in Ångströms, and an Ångström is 0.1 nm, so quite far from the 1pm scale. Therefore, I fear there isn't much room left for miniaturization. Furthermore, as you reduce the transistor size, you start getting strange quantum effects, the most problematic one for computers being the tunnel effect, where a moving electron more or less decides to escape its potential well, "tunnelling" through the barrier. See http://en.wikipedia.org/wiki/Quantum_tunnelling.
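
    To see just how sensitive tunnelling is to barrier thickness, here is a rough rectangular-barrier estimate, T ~ exp(-2*kappa*d) with kappa = sqrt(2*m*(V-E))/hbar; the ~3 eV barrier height is roughly the Si/SiO2 band offset, and the thicknesses are just illustrative:

        import math

        M_E  = 9.109e-31   # electron mass, kg
        HBAR = 1.055e-34   # reduced Planck constant, J*s
        EV   = 1.602e-19   # one electronvolt, J

        def transmission(barrier_eV, thickness_nm):
            # Rectangular-barrier tunnelling estimate: T ~ exp(-2*kappa*d)
            kappa = math.sqrt(2 * M_E * barrier_eV * EV) / HBAR
            return math.exp(-2 * kappa * thickness_nm * 1e-9)

        for d in (2.0, 1.0, 0.5):
            print(f"{d} nm barrier: T ~ {transmission(3.0, d):.1e}")

    Every halving of the barrier thickness multiplies the leakage by orders of magnitude, which is exactly why gate oxides only a few atoms thick are such a problem.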

    Basically, I don't think Moore's law is going to hold much longer, maybe 5 years, and AFAIK quantum computing is incredibly far from being ready.
  • OccamsAftershave - Thursday, July 30, 2009 - link

    In a decade, when memristors, memcapacitors and meminductors begin mass production, completely revolutionizing circuit elements and computing itself, the current bumps from new lithography processes will seem like baby steps.
    Add 3D, and Moore's Law is left gasping in the dust.
  • sdsdv10 - Tuesday, August 4, 2009 - link

    And so begins Skynet...
  • nubie - Wednesday, July 29, 2009 - link

    Don't forget they will be using optical tech to transmit data around the chip, so you can lose all of the signal-boosting transistors that route data around the chip, along with their supply circuits.

    Don't forget chip stacking and 3D processes: if you could get 256MB of Level 3 cache on-package ("on"-die), you would be able to keep the cores working much longer and see more efficiency from them.

    If you could design the chip with a corrugated cross-section and build onto it, you would have much greater surface area; combined with the caches on a secondary layer and optical data routing, you would see a great gain.

    I don't think we are going to hit the "wall" anytime soon. I recall an article from ~2000 in PopSci that claimed 32nm was the wall, but it doesn't look like it is.

    I think the tech will get smarter; there are always more ways to get something done. You never know: x86 might become obsolete, giving way to even more efficiency, or be replaced by on-die co-processors for all compute-intensive tasks, leaving only a couple of x86 cores for general-purpose duty.

    Great article Anand, keep up the good work :)
  • Matt Campbell - Wednesday, July 29, 2009 - link

    Always cool to see the entry and upkeep costs of fabs. I'm glad they're coming here (being an upstate resident), but I hope they negotiated some very substantial ongoing tax breaks - taxes are a major downside of doing business in NY.
  • Kibbles - Wednesday, July 29, 2009 - link

    Did the licensing issue on x86 ever get cleared up? The last thing I remember reading was AMD arguing that the license was only for technical help from Intel, and that since they haven't needed Intel's help, they don't need the license.
    I'm curious because, if that's true, does that mean anyone (at least anyone who doesn't need help from Intel) can make an x86 processor? (Nvidia?)
  • philosofool - Wednesday, July 29, 2009 - link

    Good article. It's nice to understand the business perspective on the need for a foundry partner. I'm very interested to see where this actually goes, but it seems like a wise move if the rest of the world's microprocessor-making companies decide to outsource their fabrication (which only makes sense as costs go up).
  • aguilpa1 - Wednesday, July 29, 2009 - link

    MT
  • vol7ron - Wednesday, July 29, 2009 - link

    This is good and bad.

    It's good because costs could be driven down, but as said, it seems like Intel will be on 22nm-12nm before AMD gets a deep breath of the 32nm market. Not to mention, when you branch out into a new factory and a new process like that, you're likely to run into other problems.

    Time will tell. Maybe this will give AMD some time to really think about its chip architecture.
  • TA152H - Wednesday, July 29, 2009 - link

    This article is full of nonsense.

    AMD could easily own fabs without owning the PC market. IBM does. Does IBM ship as many processors as Intel? Hmmmm, doubtful.

    It's kind of funny how AMD couldn't afford fabs when their processors sucked, but could when they had the better processor.

    Fabs ARE expensive. So is not owning one. Buying ATI gave them plenty of uses for their fabs, except they never built ATI stuff on them. Especially with the chipsets, they could have used their older equipment to keep building them.

    Let's face it, AMD ran into trouble because they can't make a decent processor. They make a large processor that can't perform. It's the same size as the i7, but it surely can't sell like one, so they have to sell it really cheap for it to make sense to buyers.

    "Real men have fabs." That's from Jerry Sanders, and he created AMD, only to have Hector Ruiz castrate it and destroy it. The Phenom is just a bad design compared to its competition, and they have no answer for the Atom, which is gaining acceptance. That's the problem. The slow market-share gains AMD was making ended when they ran into a buzzsaw called Conroe, and so did their profits. Sure, it's the fabs. Uh huh. Funny, though, how Conroe made the fabs a lot more expensive, isn't it?

    Selling their fabs was like cutting an arm off and thinking they would win a heavyweight boxing match. Sure, they got their weight down and will have a little more stamina, but they are a one-handed fighter now. That's fine against half-rate companies like Nvidia, which also has no fabs, but against Intel it just won't cut it.

    Inferior design and no fabs :( Little wonder they are losing money. The worst part is, it only looks worse for them from here. Intel's roadmap is full of interesting new products. AMD's? Good grief. Does Jerry Sanders have any kids???? He could pull magic out of a hat. These ass-clowns running AMD are just running it into the ground. Thank goodness they have ATI, though. ATI still makes great products, and was a very good move for them, despite the cost.

    I wish AMD would just stop trying to compete with Intel on performance - let's face it, they can't. They're not smart enough, and they don't even own fabs. If they could make a chip that's smaller, less expensive to make, and slower, instead of one that's the same size and slower, they'd have a market. If they could make something 60% of the size with 75% of the performance, they'd have something. Or something close. Something between the Atom and the i7. But, nope, they make something 100% of the size with 65% of the performance. Give them credit, that's hard to do. I don't know anyone else who would reward such incompetence with jobs.

    Something has to change there. They can't compete with Intel on performance, but they might be able to make something smaller that offers good-enough performance. It's a huge market, and selling an expensive-to-make chip into it isn't the right approach, but that's all they have with their awful design. ATI executed perfectly with their designs by not trying to make them huge and super-fast. AMD should borrow some folks from there.
  • brybir - Wednesday, July 29, 2009 - link

    I fail to see why this article is full of nonsense, as you say.

    The point of the article is that building fabrication facilities is a very high capital cost for a company that does not have the scale to adequately absorb it.

    For example, AMD had two fabs. Say they get close to maxing out those fabs producing the Athlon X2 XXXX+ stuff. Their decision is: "Do we invest in a new fab to the tune of $4bn or not?" Now say they commit the money to build it, $4bn, and then they produce something called the "Phenom," which ends up being a gigantic turd. Suddenly they are hemorrhaging a lot of cash, and the $4bn they started to spend on a fab is no longer needed as they lose market share. This can only happen a few times to a company the size of AMD before it is bankrupt.

    The article basically implies that the company needs to be the size of Intel, IBM, Sony, etc., which can in fact take the risk of building a new fab. If Intel, the current majority owner of the CPU market, builds a fab and the next generation after the i7 and i5 turns out to be another P4-esque piece of crap, they can absorb the extra cost of the foundry they started to build. It's about size and scale. Intel can also stretch its fabs out over longer lifetimes by making chipsets and flash, whereas AMD does not make much of that stuff these days.

    The short story is that AMD is not in a position to build fabs and capitalize them the way a larger company that makes a diverse set of products can.


    As to ATI: the designs are built on a process chosen years before we see them; it's not like they sit down 6 months before product launch and start designing. It's a constant cycle, with the planning done years in advance. Could AMD have produced ATI chips in its fabs? Yes, as they are going to start doing at GF in 2010+. But then AMD is in the same position: what happens if "Bulldozer" ends up being a rockstar and ATI produces an nVidia-killer GPU and everything is being made in two fabs in Germany? It's possible there would be product shortages, and how would that look... AMD finally makes a good, competitive product and can't make enough of it... that has certainly never happened in the past...


    As to your last rant about price/performance... I don't even know what to say. You say things like "something between the Atom and the i7"... that's not even a statement. It means nothing. They execute designs years ahead (notice the trend here on the planning thing) and sometimes it does not go according to plan. Remember when AMD had the better CPU designs for a good while during the P4 years, when Intel made huge CPUs with extremely deep pipelines that were as large or larger and often had to be sold at discounts? There was that bit about Intel abusing its monopoly status and such too, but we can ignore that for now. The point is, things are planned, and sometimes they go accordingly and sometimes they don't. You could swap AMD for Intel in your rant and it would have been true a few years ago.

    Today it's AMD that is failing to execute as well as their competition. But to say that they are not as smart, or that no one should give them jobs, etc.... give me a break. What do you want AMD to do, just stop producing their current CPUs and wait a year or so until their next generation is ready to go? But wait, I'm sure they can design one of the most complicated microprocessors in the world in like 6 weeks or something, right?
  • jconan - Tuesday, August 4, 2009 - link

    But IBM is also a foundry like GF; though not as large as TSMC, they also manufacture chips for other companies, e.g. Nintendo, AMD, Sony, etc.
  • TA152H - Thursday, July 30, 2009 - link

    I'll repeat my other statement. IBM is making money with their fabs because they sell decent processors that they can get margins on. AMD isn't. This article has it all wrong. They could have fabs if they sold processors with a decent margin. That's the root cause.

    I'm not saying AMD shouldn't second-source some of their production, which is what you're implying. I'm simply saying they need their own fabs. There's a synergy between manufacturing and design, and now they don't have it. Intel does, though.

    I'm not suggesting they build another fab, yet. I'm suggesting that in order to compete with Intel on anything near equal terms, you need your own fabs. As Winston Churchill said, wars are not won by retreats. Losing your fabs is a HUGE retreat, and now they have a big competitive disadvantage compared to Intel.

    Intel's Pentium 4 showed just how smart they were, and how stupid. It was an extremely advanced design, far more advanced than anything available at that time, and also a badly performing one. AMD is still making a K7 with some changes, and not enough of them.

    The remarks about the Atom and i7 are insane. I'm not suggesting AMD should have known about the Atom, or that they are smart enough to design something like it. Clearly they're not. But why are they still using a huge x87 unit? x87 instructions were dropped from x86-64 back with the Athlon 64. Why are they still supporting 3DNow! when SSE came out around 2000? Conroe came out in 2006. It's 2009. Why are they so far from being competitive?

    They should ditch their current designers and hire some from IBM. Not that it would be easy, but if you offered a guy who is third in line the chance to be your lead designer, you'd get a lot of takers with that type of promotion. Two companies still know how to make processors: IBM and Intel. AMD can't. They need to get some designers from IBM, or another company that doesn't make failure after failure, and stop rewarding the people in charge. They haven't advanced the design NEARLY enough since 2000. Something has to change, and the constant delays to Bulldozer aren't the type of change that's needed. Maybe they can get some SPARC designers. I don't know how good they are, but at least they can introduce some new ideas. Besides, how could they be worse?

    I still think AMD is best served by the ATI approach: a size that makes sense, and don't try to be the fastest. That was AMD's approach almost forever, and it can work. From 2000-2006 they competed for absolute performance, because Intel made a pig, but that's not the norm, and AMD got sucked into a slugfest they can't win. It's like watching an old fight with guys fighting Marciano. They think they can outpunch him, and they slug away for a while because they aren't entirely unsuccessful, but then they get worn out while he doesn't. In the end, they end up flat on their back wondering why they tried such a strategy. It's tempting, but ultimately disastrous. AMD's flat on their back, with their feet in the air, losing massive amounts of money every quarter. Selling fabs won't change that - designing a good processor might.
  • bh192012 - Wednesday, July 29, 2009 - link

    "Short story is that AMD is not in the position to build fabs and capitalize it on the way a larger company that makes a diverse set of products can. "

    Then the other route they could have taken would be to become a larger company and make more products: their own GFX chips, SSDs, STMicro-style chips, etc. Basically, everything described in the article could have been done without breaking off the fab, except maybe hoping to make Nvidia graphics chips. The biggest difference would be the $6 billion in cash, which is probably the main real reason they did what they did.
  • zebrax2 - Thursday, July 30, 2009 - link

    If only it were as easy as you say. Entering other markets takes money, and money is what they don't have. Add to that the time needed for R&D of the new products; they would have bled to death before they were even able to release them.
  • B3an - Wednesday, July 29, 2009 - link

    I don't know why you bothered to write such a long reply to such an amazingly stupid post by a moron who obviously knows absolutely nothing.

    I particularly like the part:

    "AMD could easily own fabs without owning the PC market. IBM does. Does IBM ship as many processors as Intel? Hmmmm, doubtful."

    That made me LOL. As if AMD even has 1/4 the money IBM has to build and maintain fabs.
  • JonnyDough - Monday, August 3, 2009 - link

    Yeah, that's what I was thinking. I don't think this moron realizes who/what IBM is... they're a giant.
  • TA152H - Thursday, July 30, 2009 - link

    Obviously, you're a moron.

    Think before you post, OK?

    IBM CAN afford just about anything, but they don't keep things that lose money. Perforce, IBM's fabs make money, or they'd jettison them. How is it that they make money with their fabs and AMD can't? AMD sells a lot more processors. The answer is simple, except for you: AMD is making a crappy processor they can't sell for much money.

    I'll say it again, because you're obviously slow. IBM has fabs that make money, or they'd get out of the business. They sell fewer processors. Therefore, saying AMD has to own the x86 business is an idiotic remark. They just need processors they can sell at a higher margin. Got it now, simpleton?
  • HVAC - Wednesday, July 29, 2009 - link

    Dear Brybir,

    Here, let me help you by writing the reply you should have written instead of the diatribe you did submit:

    "@TA152H .... MORON!"
  • brybir - Wednesday, July 29, 2009 - link

    You are correct.

    I got caught in a moment of weakness in my desire to keep the trolls away. I am also bored at work. A very bad combination.

    What I have done here represents all that is bad and wrong with the world. I will turn in my nerd card at the door and go sit in a shallow pool of cold water in a dark corner of the room until I am better.
  • DFranch - Wednesday, July 29, 2009 - link

    strikeback03: I did not realize that Malta, NY had a reputation for rain. It's not exactly Seattle.
  • strikeback03 - Wednesday, July 29, 2009 - link

    I lived in the area for 4 years; it rains enough, and snows more.
  • karhill - Wednesday, July 29, 2009 - link

    Malta's annual rainfall is about 43.5 inches, compared to 37 inches for Seattle.
  • Tuor - Thursday, July 30, 2009 - link

    Heh. You shouldn't be looking at rain totals, but at days per year that it's mostly cloudy/cloudy. I'm pretty sure Seattle will beat out Malta pretty easily in that regard... but maybe not.
  • JarredWalton - Thursday, July 30, 2009 - link

    Olympia (http://countrystudies.us/united-states/weather/was...) is much worse than Seattle (http://countrystudies.us/united-states/weather/was...), according to that site.
  • JarredWalton - Wednesday, July 29, 2009 - link

    Ha! You leave my lovely state of WA out of this. At least here in Olympia, we get very little (if any) rain from June through August. My grass is dead, and current temps are in the mid-90s (supposed to hit 101F today!), which totally sucks. Anything above 80F is too hot for me. :-(

    GIVE ME BACK MY RAIN, DAGNABBIT!

    FYI, Olympia gets more rain than Seattle: estimate is around 180 days of rain per year. LOL
  • just4U - Thursday, July 30, 2009 - link

    Yeah, it's really hot here in Calgary, Canada right now. It's like 61F... and we're expecting it to get up to 82F tomorrow... Ugh!!

    (WTH who thru that shoe at me! oO)

  • ClownPuncher - Wednesday, July 29, 2009 - link

    98F in Redmond currently, no AC in my house... I'm glad I went with some high-CFM fans in my air-cooled PC!
  • BillyAZ1983 - Wednesday, July 29, 2009 - link

    Pffft, here in lovely Bullhead City it's a very chilly 120F. Pretty soon I might have to go get my jacket.
  • JarredWalton - Thursday, July 30, 2009 - link

    Ah, but I'd wager you have AC. Washingtonians don't believe in such things (at least not for homes), since it "never" gets that hot here. Ugh.... My house started at 86F this morning (8AM), got to 91F by noon, and reached a high of 96F by 6PM. Currently it's back down to 92F - yes, at 10PM. The fan by the door isn't helping much, considering it's only a few degrees cooler outside.
  • strikeback03 - Wednesday, July 29, 2009 - link

    ...until I read that it was in Malta. Where there would certainly be rain if it was outdoors.
  • Sottilde - Wednesday, July 29, 2009 - link

    Hey Anand, thanks a ton for turning me on to CNSE. I was just starting my search for a graduate program. I'm determined not to be a CS code monkey for the rest of my life!
  • Pirks - Thursday, July 30, 2009 - link

    I don't see how being a wafer pressing monkey is any better
  • nunocordeiro - Wednesday, August 5, 2009 - link

    Good luck to you! CNSE does look like a good career investment. And don't mind Pirks; he is our own little private joke around these parts.
  • blyndy - Wednesday, July 29, 2009 - link

    The graph shows GF starting 32nm production in Q1 2010, which sounds great considering Intel's 32nm products will start selling in Q4 2009.

    But it's easy to forget that it's many months between the start of volume production and the start of retail availability.

    So, realistically, how long before a 32nm AMD CPU is available at retail? I would imagine the very end of Q3 2010 at the earliest, more likely mid-Q4, with a majority-32nm lineup a year after that. So that is still roughly a year after Intel, but given previous transition timetables, not bad at all!
  • blyndy - Wednesday, July 29, 2009 - link

    I like reading about the insides and technologies in fabs.
  • Einy0 - Wednesday, July 29, 2009 - link

    Yes, very interesting stuff... Nice to see this working out for GF / AMD... In fact, the whole semiconductor industry gets a new player with big money to build high-quality chips on cutting-edge technology.
