
Thread: Nvidia Turing GPU is expected to be unveiled next month

  1. #17
    CAT-THE-FIFTH

    Re: Nvidia Turing GPU is expected to be unveiled next month

    Quote Originally Posted by ohmaheid
    If Nvidia continues to increase performance with their new chips, as they have in the last 3 or 4 years, it leaves AMD with a mountain to climb regarding the GPU market. Their efforts with the Vega architecture were less than overwhelming.
    It's not even like the jumps are as big as they were 10 years ago, either, so now we're seeing the bargain AMD had to make to get Zen out.

  2. #18
    watercooled

    Re: Nvidia Turing GPU is expected to be unveiled next month

    Most likely nothing more than a minor coincidence - as scaryjim and I mentioned earlier in the thread, the naming follows on from years of using scientists' names.

    Largely for the reasons I mentioned in my last post, it's unlikely Nvidia would target a GPU architecture at mining - it's too volatile a market, and the turnaround time for complex semiconductor design and manufacturing is substantial. They would have had to start focussing on such a design years ago, too.

    While AMD and Nvidia, with their experience in high-bandwidth memory controllers, are ideally positioned to design and produce products suited to mining the current memory-hard algorithms, some currencies are planning to switch to proof-of-stake (PoS), and who's to say the preferred algorithms won't change again, rendering all of that R&D and manufacturing effort useless - they couldn't even sell a dedicated product on to other markets. Mining-focussed cards are a different story, given it's relatively straightforward to play about with firmware and strip out the display connectors, but it takes a lot more commitment to produce a dedicated ASIC.

    Don't get me wrong, it would be interesting for some product or other to render consumer GPUs useless for mining again (as I understand it, something with a ton of memory bandwidth and fewer 'cores' should do well, though you'd still have the memory shortage to contend with), but there's still motivation to produce different ASIC-resistant algorithms. Whether the speculative market will tolerate switching to new algorithms on a regular basis is another story.
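
    To make the memory-bandwidth point a bit more concrete, here's a minimal Python sketch of the general idea behind a memory-hard proof-of-work. It's purely hypothetical: the dataset size, round count and function names are made up for illustration, and this isn't Ethash or any real coin's algorithm. The point is just that each hash attempt is dominated by pseudo-random reads over a dataset far larger than any cache, which is why fast memory matters more than extra compute units.

    ```python
    import hashlib
    import os
    import struct

    # Illustrative sketch of a memory-hard proof-of-work loop (hypothetical;
    # not Ethash or any real coin's algorithm). Each round does a pseudo-random
    # read from a large dataset, so throughput ends up bound by memory bandwidth
    # rather than by how many hashing "cores" you have.

    DATASET_WORDS = 1 << 22                 # toy dataset: ~16 MB of 4-byte words (real DAGs are GBs)
    DATASET = os.urandom(DATASET_WORDS * 4)

    def memory_hard_hash(header: bytes, nonce: int, rounds: int = 64) -> bytes:
        mix = hashlib.sha256(header + struct.pack("<Q", nonce)).digest()
        for _ in range(rounds):
            # The index depends on the running mix, so reads are unpredictable and cache-unfriendly
            idx = struct.unpack_from("<I", mix)[0] % DATASET_WORDS
            word = DATASET[idx * 4:(idx + 1) * 4]
            mix = hashlib.sha256(mix + word).digest()
        return mix

    def mine(header: bytes, target: int, max_nonce: int = 1_000_000):
        # Standard PoW search: try nonces until the hash falls below the target
        for nonce in range(max_nonce):
            digest = memory_hard_hash(header, nonce)
            if int.from_bytes(digest, "big") < target:
                return nonce, digest
        return None, None

    if __name__ == "__main__":
        nonce, digest = mine(b"example block header", target=1 << 245)
        print(nonce, digest.hex() if digest else None)
    ```

    On real hardware the dataset is several gigabytes and those inner reads saturate the memory bus, which is why cards with fast, plentiful memory mine well and why a dedicated ASIC gains far less here than it does on compute-bound algorithms.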
