Page 2 of 2 FirstFirst 12
Results 17 to 30 of 30

Thread: AMD and Intel do battle over TSMC capacity, says report

  1. #17
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: AMD and Intel do battle over TSMC capacity, says report

    Quote Originally Posted by DanceswithUnix View Post
As I've noted before, AMD seem to be doing much better with him gone. Might be a fluke, but funny nonetheless.
    It wouldn't surprise me if RDNA was actually what Raja Koduri was working on - even the naming sounds like something he would come up with!

  2. #18
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: AMD and Intel do battle over TSMC capacity, says report

    Quote Originally Posted by Iota View Post
Isn't there a danger here that Intel will look closely at how TSMC is producing more useful wafers at lower process scaling, then take that knowledge back and adapt it to their own fabs and processes?
    They will be told that their card is the Three of Diamonds, not how the trick was performed


    Quote Originally Posted by CAT-THE-FIFTH View Post
    It wouldn't surprise me if RDNA was actually what Raja Koduri was working on - even the naming sounds like something he would come up with!
    IME it takes about 6 months after a key figure leaves an engineering group for it to have found its new direction. He has been gone 3 years, so approximately nothing in AMD engineering will be down to him at this point.

  3. #19
    Moosing about! CAT-THE-FIFTH's Avatar

    Re: AMD and Intel do battle over TSMC capacity, says report

    Quote Originally Posted by DanceswithUnix View Post
They will be told that their card is the Three of Diamonds, not how the trick was performed

IME it takes about 6 months after a key figure leaves an engineering group for it to have found its new direction. He has been gone 3 years, so approximately nothing in AMD engineering will be down to him at this point.
I'll give you an example: Jim Keller, and his first stint at AMD - he left in 1999. He originally worked at DEC on the Alpha series CPUs, and AMD brought over a lot of ex-DEC engineers along with him, and licensed many DEC technologies such as the EV6 bus. He did a lot of important work for the K8, aka the Athlon 64 - he worked on HyperTransport and x86-64. So much of the stuff he worked on was only actually finished some 3 years after he left.

Also look at his second stint at AMD, starting in 2012. He left in 2015 and Zen was released in 2017. Zen was revealed in 2015.

RDNA1 was launched in 2019 and Navi was delayed, but references to the GPU family appeared back in 2017:
    https://hexus.net/tech/news/graphics...-linux-driver/

Raja Koduri left at the end of 2017. So I would say it's highly likely he worked on RDNA1. PC enthusiasts just hate on him, but forget that AMD was in a terrible state when he joined, and seemingly ignore his hand in GPUs such as the ATI 9700 PRO.

Enthusiasts also ignore the important role Rory Read played in actually stopping the company from going bankrupt, and in diversifying its sales base. Again, we assume Lisa Su was the person who started the design of Zen, but Zen on the original roadmaps was meant to be a late 2016 release, and Rory Read is the person who hired Jim Keller. So from what I gather, design work for Zen started as far back as 2012. It is Rory Read who people also need to be thanking.

We need to look at what he was involved in when he joined in 2013. GCN1.2 (R9 290)? Probably not. But after that period AMD started cutting R&D anyway, and moved more towards their Bulldozer replacement. RTG was very poorly funded.

GCN1.3, which was in the R9 285/R9 380 and Fiji? Probably, but there were strong hints the parts were made primarily for Apple, as they had enhanced OpenCL performance. Fiji was GCN1.3 and a test vehicle for HBM, and AMD had been making test boards for a few years beforehand. If you look at Fiji, it is basically a doubled-up Tonga using HBM. Apparently parts of the design were farmed out to an external company to save money.

Polaris and Vega are definitely what he worked on, but AMD ended up having to use the inferior GF/Samsung 14nm node due to the WSA. Nvidia GPUs made on the same node were generally worse than their TSMC counterparts, and those were smaller than what AMD was attempting to make. Again, if you look at Polaris, it was launched with a very high stepping number, indicating there were very real problems getting it to market. You saw that with the problems with PCI-E power draw. Getting back to TSMC was the best move AMD made, as GF had very little experience with GPUs.

We saw the problems with the same node with Ryzen barely clocking past 4GHz and being delayed nearly 6 months. But GCN1.3 and GCN1.4 were iterative refreshes done relatively on the cheap. Nvidia made a far bigger change going from Kepler to Maxwell. However, Polaris appeared in the console refreshes, so I would argue they were partially funded by MS and Sony, as their consoles came out the same year.

Vega was realistically never made just for consumer workloads, but actually helped AMD gain sales in commercial markets, and the first Vega based GPUs were Radeon Instinct accelerators. They were the first AMD GPUs to actually try to get into the machine learning market. Because AMD had no GTX 1080 competitor, they basically pushed it to the limit for desktop. Many of the features it had were never even enabled in software, again showing how little money was appropriated to RTG. Yet if you look at the commercial Vega GPUs, they were far more efficient. The Radeon VII is probably something he was involved in too, as it further bumped up machine learning performance.

So I would say RDNA2 is possibly something he didn't work on, but then RDNA1 was the precursor to RDNA2. Then look at CDNA... it's probably Vega refined even further towards commercial workloads.
    Last edited by CAT-THE-FIFTH; 27-07-2020 at 09:45 PM.

  4. #20
    Senior Member Xlucine's Avatar
    Join Date
    May 2014
    Posts
    2,160
    Thanks
    297
    Thanked
    188 times in 147 posts
    • Xlucine's system
      • Motherboard:
      • Asus TUF B450M-plus
      • CPU:
      • 3700X
      • Memory:
      • 16GB @ 3.2 Gt/s
      • Storage:
      • Crucial P5 1TB (boot), Crucial MX500 1TB, Crucial MX100 512GB
      • Graphics card(s):
      • EVGA 980ti
      • PSU:
      • Fractal Design ION+ 560P
      • Case:
      • Silverstone TJ08-E
      • Operating System:
      • W10 pro
      • Monitor(s):
      • Viewsonic vx3211-2k-mhd, Dell P2414H

    Re: AMD and Intel do battle over TSMC capacity, says report

    The next officially scheduled Intel launch will be that of Tiger Lake mobile processors on 2nd September.
    Neat, didn't know that was confirmed yet



    Quote Originally Posted by CAT-THE-FIFTH View Post
...Polaris and Vega are definitely what he worked on, but AMD ended up having to use the inferior GF/Samsung 14nm node due to the WSA. Nvidia GPUs made on the same node were generally worse than their TSMC counterparts, and those were smaller than what AMD was attempting to make. Again, if you look at Polaris, it was launched with a very high stepping number, indicating there were very real problems getting it to market. You saw that with the problems with PCI-E power draw. Getting back to TSMC was the best move AMD made, as GF had very little experience with GPUs...
    What does the node have to do with power distribution on the 12 V side? The chip itself will only see whatever comes through the VRMs

  5. #21
    Moosing about! CAT-THE-FIFTH's Avatar

    Re: AMD and Intel do battle over TSMC capacity, says report

    Quote Originally Posted by Xlucine View Post
    What does the node have to do with power distribution on the 12 V side? The chip itself will only see whatever comes through the VRMs
AMD hinted at R9 290 level performance for months before the RX480 launched. One of the hints was that it would be "entry level" VR performance, and what did AMD consider entry level back then? An R9 290. That is why I said on here and on the OcUK forums that the RX480 would be R9 290/390 level performance, which was confirmed by leaks a few weeks before launch.

If you look at the actual design, it was rated for a 150W power draw via the power connectors, even though AMD reference PCBs can handle more than that (Nvidia is very stingy in that regard). AMD first had to fix the PCI-E slot power draw, and then said the 6-pin was safe to draw over 75W, which is how they fixed it. However, you didn't get designs after that which made those claims. If the PCI-E power connector needs over 75W, an 8-pin is used.

From the very high stepping number, etc., I expect that by the time AMD got the chips back from mass production, they realised they had to bump up the voltage just to hit the clockspeeds they needed for the performance they wanted. But if you have already designed the PCBs and coolers, then that is what you have. They just fudged it all, instead of redesigning the reference PCB with an 8-pin power connector and a better reference cooler.

What AMD did was dodgy TBH. They honestly should have shipped the reference designs with lower voltages and clockspeeds, then there would have been no problem. But they didn't, and got found out.

If you look at many reviews, the reference cards throttled very quickly, so AMD just put through whatever voltage was required to hit that performance target. This is why the RX480 cooler was so crap: I think the whole shebang was meant to be under 150W, and they didn't bother to change it once they realised that was an optimistic assessment.
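The budget arithmetic behind this can be sketched as a quick check (a hypothetical illustration: the 75 W slot/6-pin and 150 W 8-pin limits are the PCIe CEM spec figures, while the board-draw numbers are made up for the example):

```python
# Rough PCIe board-power budget check (illustrative sketch, not AMD's actual figures).
# Per the PCIe CEM spec: the x16 slot supplies up to 75 W, a 6-pin auxiliary
# connector 75 W, and an 8-pin connector 150 W.
SLOT_LIMIT_W = 75
CONNECTOR_LIMIT_W = {"6-pin": 75, "8-pin": 150}

def budget_ok(board_draw_w: float, connector: str) -> bool:
    """True if the total board draw fits within the slot + connector spec limits."""
    return board_draw_w <= SLOT_LIMIT_W + CONNECTOR_LIMIT_W[connector]

budget_ok(150, "6-pin")  # True: exactly at the 75 W + 75 W limit
budget_ok(165, "6-pin")  # False: something (slot or 6-pin) must run over spec
budget_ok(165, "8-pin")  # True: an 8-pin connector would have left headroom
```

This is why a card designed around a 150 W budget has no slack once real silicon needs more voltage than planned.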

If you also look at the Nvidia GPUs made on Samsung 14nm, they had around 10% worse performance/watt than the TSMC-made ones, and the GF process was probably in a worse state, as GF had licensed it late in the game. If you don't believe me, just look at this:
    https://tpucdn.com/review/amd-radeon..._1920_1080.png
    https://tpucdn.com/review/amd-radeon..._3840_2160.png

That is moving from a 28nm planar node to a 14nm FinFET node.

Fiji was around 600mm², but Vega was around 490mm². Both were pushed to the limit in terms of clockspeed.

The same goes for Vega: GF had very little experience with making largish GPUs; TSMC, on the other hand, does. Also, Ryzen was meant to be a late 2016 release too, going by earlier roadmaps. This is probably why Intel did the emergency-edition Kaby Lake series, but it ended up preceding the expected launch date of Ryzen.
    Last edited by CAT-THE-FIFTH; 28-07-2020 at 12:25 AM.

  6. #22
    Senior Member
    Join Date
    May 2014
    Posts
    2,385
    Thanks
    181
    Thanked
    304 times in 221 posts

    Re: AMD and Intel do battle over TSMC capacity, says report

Whelp, using competitor fabs might also be a result of Murthy's sudden ejection.

"You've screwed the pooch on this one; now, to make up for your failure, we have to do something we've never done except for extraneous silicon."

  7. #23
    Senior Member
    Join Date
    Apr 2017
    Posts
    329
    Thanks
    0
    Thanked
    8 times in 7 posts

    Re: AMD and Intel do battle over TSMC capacity, says report

    Quote Originally Posted by DanceswithUnix View Post
    Quote Originally Posted by QuorTek View Post
    With the Intel GFX thing... I think they found out it was a bad idea to hire the AMD guy... or I dunno...
As I've noted before, AMD seem to be doing much better with him gone. Might be a fluke, but funny nonetheless.
    There was a lot of information going around about 18-24 months before he left, maybe more.

    All of this information played out exactly as predicted. He would drag his feet at AMD, talking a big game, then engineer a move to Intel.

  8. #24
    root Member DanceswithUnix's Avatar

    Re: AMD and Intel do battle over TSMC capacity, says report

    Quote Originally Posted by CAT-THE-FIFTH View Post
I'll give you an example: Jim Keller, and his first stint at AMD - he left in 1999. He originally worked at DEC on the Alpha series CPUs, and AMD brought over a lot of ex-DEC engineers along with him, and licensed many DEC technologies such as the EV6 bus. He did a lot of important work for the K8, aka the Athlon 64 - he worked on HyperTransport and x86-64. So much of the stuff he worked on was only actually finished some 3 years after he left.
Now the story I remember at the time was that on learning they had been transferred to Intel, pretty much the entire DEC CPU team resigned on the spot and wandered over to AMD to ask if there were jobs going. So really, much though I respect the guy, it wasn't all about Jim: there was an entire winning-team culture from the amazing 21264 chip transplanted into AMD. One of the key features of the Athlon was its FPU, and no-one seems to remember the name of the guy who designed it, though he has the same ex-DEC background as Jim and possibly had a greater impact on that design. It was a team effort, and a good team has the momentum to keep going if any member leaves.

You can only plan a few months in advance; roadmaps beyond that are nebulous at worst and lacking in detailed clarity at best. A roadmap is a list of things that need to be done, because if you knew how it was going to be done down to a detailed level then it would already be done. That's how modern Agile development works. Silicon products are a little different because there is a 6-month fab+test cycle after tape-out, so I should revise what I said: someone's input into a team lasts at most about 6 months after they leave, and then there will be at least another 6 months before silicon is available, making it look like a year.

  9. #25
    Moosing about! CAT-THE-FIFTH's Avatar

    Re: AMD and Intel do battle over TSMC capacity, says report

    Quote Originally Posted by DanceswithUnix View Post
Now the story I remember at the time was that on learning they had been transferred to Intel, pretty much the entire DEC CPU team resigned on the spot and wandered over to AMD to ask if there were jobs going. So really, much though I respect the guy, it wasn't all about Jim: there was an entire winning-team culture from the amazing 21264 chip transplanted into AMD. One of the key features of the Athlon was its FPU, and no-one seems to remember the name of the guy who designed it, though he has the same ex-DEC background as Jim and possibly had a greater impact on that design. It was a team effort, and a good team has the momentum to keep going if any member leaves.

You can only plan a few months in advance; roadmaps beyond that are nebulous at worst and lacking in detailed clarity at best. A roadmap is a list of things that need to be done, because if you knew how it was going to be done down to a detailed level then it would already be done. That's how modern Agile development works. Silicon products are a little different because there is a 6-month fab+test cycle after tape-out, so I should revise what I said: someone's input into a team lasts at most about 6 months after they leave, and then there will be at least another 6 months before silicon is available, making it look like a year.
Emm, where did I say one engineer was responsible for everything? It is pretty well known that Jim Keller works on high-level concepts, so he would have had dozens or hundreds of engineers who did a lot of the ACTUAL work. But you need someone to direct the research and set a path for where the company heads. In academia, that would be someone called the Principal Investigator (PI), and the lab goals would be planned years ahead, as projects last years and need to follow on from each other. You do appreciate that even a PhD project takes 3~4 years?? You make adjustments along the way, but the original plans are started years before you finish.

The fact is that most of the work he and the other DEC engineers did took years, ie, 3 years, to find its way into AMD CPUs. That does not fit this weird view that it takes 6 months.

A lot of the research takes YEARS from the inception of the project goals, so if you think a design only takes six months you are mistaken. It does not even take six months in academia FFS; six months is nothing. I know plenty of people who worked on engineering projects, and six months is nothing, especially for something as complex as some of these chips. You also don't seem to appreciate that when someone leaves one of these companies, it takes months to find a proper successor.

These are massive corporate machines; change does not happen quickly. That means that for months, the interim successor will just follow whatever roadmaps the person before them made. Lisa Su took over as interim boss until January 2019. Whenever the CEO takes over a division, it is unlikely many changes will be happening until they find a proper replacement.

If you even look at many of the patents for the innovations we eventually find in Intel/AMD/Nvidia products, they are published years ahead. So that means the person in charge would have been pushing those years before. You have forgotten that it's not only the actual GPU design, but designing all the innovations that go into the GPU. Plus, after the design is finalised, it needs to actually be made at the fab partner, and that again takes months and months.

Also, Jim from AdoredTV said that Raja Koduri's FIRST GPU for Intel is actually hinted to be DG2. That means DG1 is not entirely his own design, and that it has taken over two years to finally get something out which he was significantly involved in.

Plus, you are deliberately ignoring that Navi was already a thing in 2017, so he definitely worked on it. Navi was meant to release in January 2019 according to all the rumours, but was delayed. Samples were out in 2H 2018.

    Navi was planned to be taped out by late 2017:
    https://www.overclock3d.net/news/cpu..._end_of_2017/1

That means silicon design would have been finished by early 2018 at worst, with limited test production at TSMC in the following months. So if he left in late 2017, then he had zero to do with RDNA1? So RDNA1 was designed from scratch to samples in less than 12 months? What??

You are making a massive assertion that he had zero to do with RDNA, which actually cannot be backed up, especially if we look at the timelines, the leaks and even the statements made by AMD.

It's hilarious that enthusiasts on the internet have this issue with him and think they know better. If they knew better, then how come he was hired by Apple, AMD and Intel?? For such a crap engineer, he seems to be working for some pretty huge companies.
    Last edited by CAT-THE-FIFTH; 28-07-2020 at 11:15 AM.

  10. #26
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: AMD and Intel do battle over TSMC capacity, says report


  11. #27
    root Member DanceswithUnix's Avatar

    Re: AMD and Intel do battle over TSMC capacity, says report

    Quote Originally Posted by CAT-THE-FIFTH View Post
You do understand that even a PhD project takes 3~4 years?? You make adjustments along the way, but the original plans are started years before you finish.
I have worked on products that took hundreds of engineers years to build; that scale has its own challenges and benefits. I'm an engineer, so as famously pointed out repeatedly on Big Bang Theory I never bothered with a PhD, so whilst I am happy with large engineering efforts I don't see how basic research can map to product design (which is very much a moving target if done right). I guess that is where I am coming from: the destination changes over time, so whilst you hope you never have to do something as expensive as a U-turn in design, and can keep momentum from a start in the right direction, the team has to adapt, learn and re-target. In the Scrum environment that is pretty much universally used in current engineering, that re-targeting happens on a 2-week sprint, so every 2 weeks you get problems thrown up that require a hand on the tiller at the architect level.

I'm not sure why you think I'm hating on Raja. We all need to be cut some slack sometimes (the legendary Mr Keller was responsible for Bulldozer; no-one gets it spot-on every time), but there just seems to be a disjoint between what Raja says and what Raja does. But then I guess if he dialled back the hyperbole the world would be a less interesting place.

  12. #28
    Moosing about! CAT-THE-FIFTH's Avatar

    Re: AMD and Intel do battle over TSMC capacity, says report

    Quote Originally Posted by DanceswithUnix View Post
I have worked on products that took hundreds of engineers years to build; that scale has its own challenges and benefits. I'm an engineer, so as famously pointed out repeatedly on Big Bang Theory I never bothered with a PhD, so whilst I am happy with large engineering efforts I don't see how basic research can map to product design (which is very much a moving target if done right). I guess that is where I am coming from: the destination changes over time, so whilst you hope you never have to do something as expensive as a U-turn in design, and can keep momentum from a start in the right direction, the team has to adapt, learn and re-target. In the Scrum environment that is pretty much universally used in current engineering, that re-targeting happens on a 2-week sprint, so every 2 weeks you get problems thrown up that require a hand on the tiller at the architect level.

I'm not sure why you think I'm hating on Raja. We all need to be cut some slack sometimes (the legendary Mr Keller was responsible for Bulldozer; no-one gets it spot-on every time), but there just seems to be a disjoint between what Raja says and what Raja does. But then I guess if he dialled back the hyperbole the world would be a less interesting place.
The problem is you are not looking at the basic research which leads to the innovations which you engineers can... well... engineer. It takes years for that research to be done; labs don't work in six-month periods, as six months is nothing. That is barely a masters-level project, not even a research-masters one. People don't seem to appreciate that, especially when you need to go down multiple paths to get to where you want to go. Look at so many of the basic patents we end up finding in a number of products... it takes years to get there. Somebody has to direct all that, and it's usually the PI. The minutiae might change, as it's a process of continuous feedback, but someone has to direct the plans, and there is some end goal.

Zen took 5 YEARS to get to its final release. Bulldozer took years of development; it was originally meant to be on 45nm. I'll give you an example of a GPU you have heard of... the RV770, aka the HD4870, from 1H 2008.

    Design work started in 2005:
    https://www.anandtech.com/show/2679/3

If you look at a lot of historical comments from ATI, AMD, Intel and Nvidia, they plan what they are doing years ahead of where they are. You need to appreciate that what AMD, Intel and Nvidia do is very complex, especially since designs may be finished 12~18 months before we see them, but it takes that much time to see if you can actually make them, and then other factors come into play.

Then most forget there is the issue of capacity: one constraint is the earliest time you can get enough capacity for a proper launch. Then, if you have several products, what actually gets priority in your product stack, etc. Then there is something else: inventory. There have been many indications that AMD and Nvidia have delayed products to sell off excess inventory. If not, you end up with deep discounting, and a new product in limited quantities, which sabotages your profit margins on the products you actually sell the most of.

Then there is the software side, ie, making drivers which are usable, sending samples off to games developers so they can patch in support, etc. Then it's most likely there will be people beta testing the GPUs, to try and highlight any glaring problems which need major remedial work.

If you look at when the RX5700XT was released, samples were already floating around in 2H 2018. So Navi existed in silicon for 9~12 months before we got actual GPUs. Even with Polaris and Vega, IIRC, AMD were publicly showing off working samples 6~9 months before release. Vega based Radeon Instinct accelerators appeared months before the Vega based gaming GPUs. That tells me they had samples of some sort months before that.

AMD said the design work was meant to be finished by the end of 2017. So even accounting for delays, that is definitely something Raja Koduri would have been involved in. I doubt that in the space of less than a year, cash-strapped AMD designed a new GPU from scratch. Even if there were changes, I suspect most of them were done to make it more manufacturable.

Even the whole RDNA name sounds like a very Raja Koduri kind of naming (not that I am saying he definitely came up with it). Come on: Radeon DNA?? Just like his "spicy" names. It's kind of cringe but catchy, which is what he did a lot!

    Edit!!

Jim Keller rejoined AMD in 2012. AMD's Bulldozer pre-dated him, as it was originally meant to be on 45nm and was delayed so they could put it on 32nm.

AMD had so little money at that point, they just continued on the trajectory which had been laid down years before:
    https://www.realworldtech.com/bulldozer/

So by 2010 we already had a lot of information about it, which was still firmly back in the days of the K10.
    Last edited by CAT-THE-FIFTH; 28-07-2020 at 11:56 AM.

  13. #29
    Senior Member
    Join Date
    May 2014
    Posts
    2,385
    Thanks
    181
    Thanked
    304 times in 221 posts

    Re: AMD and Intel do battle over TSMC capacity, says report

Raja's biggest problem is over-promising and over-marketing to make up for deficiencies. This was the case with both Vega and the pre-Vega Fury.

Wildly powerful architectures being used in the wrong ways.

  14. #30
    Registered+
    Join Date
    May 2006
    Posts
    95
    Thanks
    0
    Thanked
    1 time in 1 post

    Re: AMD and Intel do battle over TSMC capacity, says report

Maybe Intel are going to pay TSMC for using patents, and for TSMC people to help Intel solve their fab problems?
Most likely just using TSMC for non-CPU chips, eg. GPU, chipset.

