
Thread: Tesla dumps Nvidia, will use its own self-driving chips

  1. #1
    HEXUS.admin
    Join Date
    Apr 2005
    Posts
    31,709
    Thanks
    0
    Thanked
    2,073 times in 719 posts

    Tesla dumps Nvidia, will use its own self-driving chips

    The Tesla FSD is "the best chip in the world… by a huge margin," claims Elon Musk.
    Read more.

  2. #2
    Senior Member
    Join Date
    Feb 2017
    Posts
    246
    Thanks
    3
    Thanked
    17 times in 17 posts

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    With time, Musk's claims appear less and less truthful. But I might be wrong.

  3. Received thanks from:

    Pleiades (23-04-2019)

  4. #3
    Old Geezer
    Join Date
    Jul 2016
    Location
    Under a rusty bucket
    Posts
    540
    Thanks
    53
    Thanked
    42 times in 31 posts

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    Quote Originally Posted by GinoLatino View Post
    With time, Musk's claims appear less and less truthful. But I might be wrong.
    He's sounding more and more like Trump.

  5. #4
    Senior Member
    Join Date
    Feb 2012
    Posts
    527
    Thanks
    2
    Thanked
    55 times in 31 posts

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    Nvidia are pretty hurt by this. No wonder they want to compare their mega-bucks 500W self-driving Pegasus board with this <80W board.

    Note that the 144 TOPS figure includes the redundancy, so effectively it's 72 TOPS. However, it may be that the board can use all 144 TOPS and drop back to 72 TOPS in the case of a single chip failure.

    And the 21x performance increase clearly targets a bottleneck in the Nvidia Drive Xavier solution that is outside of the NNP, as it's only about 3x more powerful there.
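
    To sketch how that failover might work (pure guesswork on the scheme, Tesla haven't published it), here's a toy bit of Python: run every frame through both chips, only trust the result when they agree, and limp along on one chip if the other dies.

        # Toy sketch of dual-chip lockstep redundancy. The failure model and
        # all names are invented for illustration, not Tesla's actual scheme.

        class StubChip:
            """Stand-in for one of the two neural accelerators."""
            def __init__(self, result, alive=True):
                self.result, self.alive = result, alive
            def infer(self, frame):
                if not self.alive:
                    raise RuntimeError("chip dead")
                return self.result

        def process_frame(chip_a, chip_b, frame):
            results = []
            for chip in (chip_a, chip_b):
                try:
                    results.append(chip.infer(frame))
                except RuntimeError:
                    pass  # log the failure, carry on with the survivor
            if len(results) == 2 and results[0] == results[1]:
                return results[0]      # full redundancy: both chips agree
            if len(results) == 1:
                return results[0]      # degraded mode: half the TOPS, still driving
            raise RuntimeError("chips disagree or both down: disengage")

        print(process_frame(StubChip("car ahead"), StubChip("car ahead"), frame=None))
        print(process_frame(StubChip("car ahead"), StubChip(None, alive=False), frame=None))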

  6. Received thanks from:

    Tabbykatze (23-04-2019)

  7. #5
    Registered+
    Join Date
    Apr 2019
    Location
    Portsmouth
    Posts
    29
    Thanks
    0
    Thanked
    1 time in 1 post
    • AlvieM's system
      • Motherboard:
      • Gigabyte G.1 Sniper Z97
      • CPU:
      • i7 4790K
      • Memory:
      • 16GB HyperX Savage 2400MHz
      • Storage:
      • OCZ ARC 100 240GB + 1TB Seagate HDD
      • Graphics card(s):
      • Intel HD Graphics (I know)
      • PSU:
      • EVGA GS650
      • Case:
      • NZXT H440 RED
      • Operating System:
      • Windows 10 Pro
      • Internet:
      • Virgin Media 350

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    I'm surprised it's using ARM chips, but I guess the reduced instruction set means it performs better? It probably wouldn't make sense to start a whole new division just for chips anyway, as that would cost a lot and require a lot of expertise.

  8. #6
    Senior Member
    Join Date
    May 2014
    Posts
    2,385
    Thanks
    181
    Thanked
    304 times in 221 posts

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    Quote Originally Posted by sykobee View Post
    Nvidia are pretty hurt by this. No wonder they want to compare their mega-bucks 500W self-driving Pegasus board with this <80W board.

    Note that the 144 TOPS figure includes the redundancy, so effectively it's 72 TOPS. However, it may be that the board can use all 144 TOPS and drop back to 72 TOPS in the case of a single chip failure.

    And the 21x performance increase clearly targets a bottleneck in the Nvidia Drive Xavier solution that is outside of the NNP, as it's only about 3x more powerful there.
    In journalist/marketing speak, Nvidia did the equivalent of throwing its toys out of the pram.

  9. #7
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    Quote Originally Posted by AlvieM View Post
    I'm surprised it's using ARM chips, but I guess the reduced instruction set means it performs better? It probably wouldn't make sense to start a whole new division just for chips anyway, as that would cost a lot and require a lot of expertise.
    ARM is aimed at this sort of usage. Tesla can choose whatever level of cost/performance tradeoff they want and just buy a hard cell ready to place in their SoC layout. I'm not sure what else you think they could use: PowerPC is basically dead at this stage, MIPS don't have the same range of choice, and RISC-V, whilst one to watch, would be rather new and risky when they are already taking a risk with the neural part.

    Intel/AMD just aren't in the running for stuff like this; they are too big and power hungry.

  10. #8
    Senior Member
    Join Date
    Mar 2012
    Posts
    262
    Thanks
    0
    Thanked
    26 times in 25 posts
    • devBunny's system
      • Motherboard:
      • Asus P9X79 Pro
      • CPU:
      • i7-3930
      • Memory:
      • 8GB Kingston HyperX
      • Storage:
      • 256GB Samsung 830
      • Graphics card(s):
      • 2 x GTX 560Ti
      • PSU:
      • OCZ ZX1000W Gold
      • Case:
      • Xigmatek Elysium
      • Operating System:
      • Win 7 and Win XP in VMs
      • Monitor(s):
      • 3 x Dell 2410M 1920x1200 IPS

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    Quote Originally Posted by Friesiansam View Post
    Quote Originally Posted by GinoLatino View Post
    With time, Musk's claims appear less and less truthful. But I might be wrong.
    He's sounding more and more like Trump.
    Musk is most likely a narcissist but he has the slight advantage over Trump of having a reason to think that he's superior ... Trump is a slobbering moron in comparison.

    There's also very little sense that Musk is a sociopath, a checklist on which Trump has plenty of ticks.

    Musk may have a touch of manic depression. Certain, er, "less than advisable" actions and outbursts on his part suggest at least hypomania.

    All round, Musk is a good guy, which Trump is not, and, as with most divas, you have to cut him a lot of slack even if it grates to do so. :-)

  11. #9
    Senior Member
    Join Date
    Mar 2012
    Posts
    262
    Thanks
    0
    Thanked
    26 times in 25 posts

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    That said, don't ask me to pledge any support for the "Musk for President" campaign, should it ever happen. :-)

  12. #10
    Long member
    Join Date
    Apr 2008
    Posts
    2,427
    Thanks
    70
    Thanked
    404 times in 291 posts
    • philehidiot's system
      • Motherboard:
      • Father's bored
      • CPU:
      • Cockroach brain V0.1
      • Memory:
      • Innebriated, unwritten
      • Storage:
      • Big Yellow Self Storage
      • Graphics card(s):
      • Semi chewed Crayola Mega Pack
      • PSU:
      • 20KW single phase direct grid supply
      • Case:
      • Closed, Open, Cold
      • Operating System:
      • Cockroach
      • Monitor(s):
      • The mental health nurses
      • Internet:
      • Please.

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    Quote Originally Posted by DanceswithUnix View Post
    Intel/AMD just aren't in the running for stuff like this; they are too big and power hungry.
    From a position of total ignorance I was wondering: in ultrasound, we've moved from dedicated hardware towards GPGPU stuff to do the same job. It requires an awful lot of oomph, but the system can be upgraded far more by software updates than it could be when running on hardware beamformers, etc.

    Would this kind of approach work for self-driving cars? Just have a load of raw compute power in there, probably with a PSU to match, and do it all in software, allowing for OTA / easy / cheap upgrades when things inevitably go wrong down the line and recalls happen?

  13. #11
    Senior Member Xlucine's Avatar
    Join Date
    May 2014
    Posts
    2,160
    Thanks
    297
    Thanked
    188 times in 147 posts
    • Xlucine's system
      • Motherboard:
      • Asus TUF B450M-plus
      • CPU:
      • 3700X
      • Memory:
      • 16GB @ 3.2 Gt/s
      • Storage:
      • Crucial P5 1TB (boot), Crucial MX500 1TB, Crucial MX100 512GB
      • Graphics card(s):
      • EVGA 980ti
      • PSU:
      • Fractal Design ION+ 560P
      • Case:
      • Silverstone TJ08-E
      • Operating System:
      • W10 pro
      • Monitor(s):
      • Viewsonic vx3211-2k-mhd, Dell P2414H

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    Quote Originally Posted by philehidiot View Post
    From a position of total ignorance I was wondering: in ultrasound, we've moved from dedicated hardware towards GPGPU stuff to do the same job. It requires an awful lot of oomph, but the system can be upgraded far more by software updates than it could be when running on hardware beamformers, etc.

    Would this kind of approach work for self-driving cars? Just have a load of raw compute power in there, probably with a PSU to match, and do it all in software, allowing for OTA / easy / cheap upgrades when things inevitably go wrong down the line and recalls happen?
    AIUI that's what they're doing, just with a different flavour of silicon than GPUs. I don't believe there's much difference between the neural-network silicon doing DLSS in the 2XXX-series GPUs and that in the Nvidia Drive chips: inference is different enough from GPU work for there to be a big benefit to going custom silicon, but also general enough that the same blocks can be applied wherever people want the "deep learning" buzzword (to share the R&D cost of designing the neural-network bits).

  14. #12
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    Quote Originally Posted by philehidiot View Post
    From a position of total ignorance I was wondering: in ultrasound, we've moved from dedicated hardware towards GPGPU stuff to do the same job. It requires an awful lot of oomph, but the system can be upgraded far more by software updates than it could be when running on hardware beamformers, etc.

    Would this kind of approach work for self-driving cars? Just have a load of raw compute power in there, probably with a PSU to match, and do it all in software, allowing for OTA / easy / cheap upgrades when things inevitably go wrong down the line and recalls happen?
    I'm interested to hear you have GPU-powered ultrasound. For some years I have wondered why ultrasound scanners aren't a cheap(ish) USB peripheral, knocked out by the million in China, that uses a laptop graphics card, given how much compute those now have. I'm sure driving an ultrasound scanner has its complications and risks of false positives etc., but it feels to me as an engineer that every GP should have one by now. Hopefully from what you are saying we are heading slowly down that path.

    Anyway, back to your question...

    Neural networks aren't really about computational power, as the sums aren't that hard. The precision isn't that high either: GPUs go for fp16 as the lowest they can do, though fp16 was generally abandoned for desktop use in the early GPU generations for poor image quality, so it has had to be re-introduced. Some systems, I believe, used fixed point, getting down to 8-bit precision, again something that CPUs in a 64-bit world aren't usually optimised for. On top of that, the output from the multiply-accumulate maths that tallies up the neural inputs is put through a non-linear function, like an S curve, to get the final fire/not-fire result. But the killer is memory access: just like in a living brain, the results are not so much down to the simple processing that a neuron does as to the massive connectivity of the dendrites linking to other brain cells. In silicon, that is replaced with what comes down to a lot of matrix lookups.

    So really, whilst you can get perfectly decent results with a GPU, such low-precision non-linear arithmetic with massive memory bandwidth isn't something that fits any other compute job that well. Custom hardware seems a really good idea.

    Note that this is for the mass-market job of deploying a trained network. The actual network training requires higher precision; a big GPU cluster with HBM is ideal for that.
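
    To make the arithmetic concrete, here's a toy Python version of one neuron (the int8 weights and scale factor are invented for illustration): a multiply-accumulate over the inputs, then the non-linear S curve. The sums really are cheap; the hard part in silicon is streaming the weight matrix in, which is the memory access point above.

        import numpy as np

        # One artificial neuron as described above: a low-precision
        # multiply-accumulate, then a non-linear "S curve" (sigmoid) for the
        # fire/don't-fire output. int8 weights with a made-up scale factor,
        # as in common fixed-point schemes.

        def neuron(inputs_i8, weights_i8, scale=1 / 128.0):
            # Accumulate in 32 bits so the int8 products can't overflow.
            acc = np.dot(inputs_i8.astype(np.int32), weights_i8.astype(np.int32))
            x = acc * scale * scale            # undo the fixed-point scaling
            return 1.0 / (1.0 + np.exp(-x))    # sigmoid activation

        rng = np.random.default_rng(0)
        inputs = rng.integers(-128, 128, size=256, dtype=np.int8)
        weights = rng.integers(-128, 128, size=256, dtype=np.int8)
        print(neuron(inputs, weights))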

  15. #13
    Registered+
    Join Date
    Apr 2019
    Location
    Portsmouth
    Posts
    29
    Thanks
    0
    Thanked
    1 time in 1 post

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    Quote Originally Posted by DanceswithUnix View Post
    ARM is aimed at this sort of usage. Tesla can choose whatever level of cost/performance tradeoff they want and just buy a hard cell ready to place in their SoC layout. I'm not sure what else you think they could use: PowerPC is basically dead at this stage, MIPS don't have the same range of choice, and RISC-V, whilst one to watch, would be rather new and risky when they are already taking a risk with the neural part.

    Intel/AMD just aren't in the running for stuff like this; they are too big and power hungry.
    I'm not gonna lie, I don't actually know much about chips etc. I know Intel and AMD can't perform as well here because they're not aimed at neural stuff; they are more general chips aimed at a wide range of applications.

    I do, however, think it will be interesting to see a RISC-V neural processor, if it's possible, but I guess it does indeed make sense for Tesla to use ARM.

  16. #14
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    Quote Originally Posted by AlvieM View Post
    I'm not gonna lie, I don't actually know much about chips etc. I know Intel and AMD can't perform as well here because they're not aimed at neural stuff; they are more general chips aimed at a wide range of applications.

    I do, however, think it will be interesting to see a RISC-V neural processor, if it's possible, but I guess it does indeed make sense for Tesla to use ARM.
    It isn't really that they are general purpose; Intel/AMD are based on the 8086, which was a rubbish processor even when new. In some ways it was improved with the 386 and again with AMD64, but all that history drags a lot of transistors along with it to remain compatible. At the very high end, like we get in our desktop machines, processors are so complicated anyway that the compatibility overhead is only something like 5% (according to Intel some years ago, so apply a big pinch of salt to that number). That is why the Atom chips were always doomed to failure in the tablet market: for a smaller CPU, the likes of ARM will always be faster and lower power.

    Edit: Note the CPU isn't doing any neural processing, just shuffling data to and from the neural unit and acting on the results.
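
    In toy Python form, something like this (every API name here is invented for illustration; it's not Tesla's real interface): the ARM cores just feed the accelerator and act on what comes back.

        # The CPU's job in sketch form: move data to and from the neural
        # accelerator and act on the results. All names are invented.

        class StubNPU:
            """Stand-in for the neural accelerator."""
            def write_input_buffer(self, frame):  # would be a DMA transfer
                self.frame = frame
            def start(self):                      # kick off inference
                pass
            def wait_for_interrupt(self):         # CPU can do other work here
                pass
            def read_output_buffer(self):         # fetch the detections
                return ["obstacle"] if "dog" in self.frame else []

        def plan(detections):
            # Placeholder decision logic -- this bit runs on the CPU,
            # not the neural unit.
            return "brake" if "obstacle" in detections else "cruise"

        def drive_step(npu, frame):
            npu.write_input_buffer(frame)
            npu.start()
            npu.wait_for_interrupt()
            return plan(npu.read_output_buffer())

        npu = StubNPU()
        for frame in ("empty road", "dog in road"):
            print(frame, "->", drive_step(npu, frame))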

  17. #15
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    What I have seen of the video so far is actually rather good, though it is 4 hours long and I skipped to the first hour to get to the silicon description. It includes a primer on how neural networks can drive a car, if people are interested in that.

  18. #16
    Senior Member
    Join Date
    Mar 2005
    Posts
    4,935
    Thanks
    171
    Thanked
    384 times in 311 posts
    • badass's system
      • Motherboard:
      • ASUS P8Z77-m pro
      • CPU:
      • Core i5 3570K
      • Memory:
      • 32GB
      • Storage:
      • 1TB Samsung 850 EVO, 2TB WD Green
      • Graphics card(s):
      • Radeon RX 580
      • PSU:
      • Corsair HX520W
      • Case:
      • Silverstone SG02-F
      • Operating System:
      • Windows 10 X64
      • Monitor(s):
      • Dell U2311, LG226WTQ
      • Internet:
      • 80/20 FTTC

    Re: Tesla dumps Nvidia, will use its own self-driving chips

    Quote Originally Posted by DanceswithUnix View Post
    It isn't really that they are general purpose; Intel/AMD are based on the 8086, which was a rubbish processor even when new. In some ways it was improved with the 386 and again with AMD64, but all that history drags a lot of transistors along with it to remain compatible. At the very high end, like we get in our desktop machines, processors are so complicated anyway that the compatibility overhead is only something like 5% (according to Intel some years ago, so apply a big pinch of salt to that number). That is why the Atom chips were always doomed to failure in the tablet market: for a smaller CPU, the likes of ARM will always be faster and lower power.

    Edit: Note the CPU isn't doing any neural processing, just shuffling data to and from the neural unit and acting on the results.
    IIRC, since the days of the original Athlon for AMD and the Pentium 4 for Intel, the processors have been RISC-like internally anyway. They effectively have a hardware emulator in the CPU to translate the x86 instructions.
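
    A toy illustration of that translation (the encodings and micro-op names are invented; a real decoder is vastly more involved than a table lookup): a read-modify-write x86 instruction cracks into several RISC-like micro-ops, while a simple register op maps roughly one to one.

        # Toy sketch of x86 instructions being cracked into RISC-like
        # micro-ops. Everything here is invented for illustration.

        DECODE_TABLE = {
            # One CISC read-modify-write instruction...
            "add [rbx], rax": [
                "load  tmp0, [rbx]",       # ...becomes a load,
                "add   tmp0, tmp0, rax",   # a register-to-register add,
                "store [rbx], tmp0",       # and a store.
            ],
            # Simple register-to-register ops map roughly 1:1.
            "add rax, rbx": ["add rax, rax, rbx"],
        }

        for insn, uops in DECODE_TABLE.items():
            print(f"{insn} -> {uops}")
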
    "In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship."
