
Thread: Nvidia Turing GPU is expected to be unveiled next month

  1. #17
    CAT-THE-FIFTH
    Join Date: Aug 2006

    Re: Nvidia Turing GPU is expected to be unveiled next month

    Quote Originally Posted by ohmaheid View Post
    If Nvidia continues to increase performance with their new chips, as they have in the last 3 or 4 years, it leaves AMD with a mountain to climb regarding the GPU market. Their efforts with the Vega architecture were less than overwhelming.
    It's not even like the jumps are as big as they were 10 years ago, either, so now we see the bargain AMD had to make to get Zen out.

    Those despicable Elk, stealing the pond weed!

  2. #18
    watercooled
    Join Date: Jan 2009

    Re: Nvidia Turing GPU is expected to be unveiled next month

    Most likely nothing more than a minor coincidence - as scaryjim and I mentioned earlier in the thread, the naming follows on from years of using the names of scientists.

    Largely for the reasons I mentioned in my last post, it's unlikely Nvidia would target a GPU architecture at mining - it's too volatile a market, and the turnaround time for complex semiconductor design and manufacturing is substantial. They would have had to start focussing on such a design years ago, too.

    While AMD and Nvidia, with their experience in high-bandwidth memory controllers, are ideally positioned to design and produce products suited to mining the current memory-hard algorithms, some cryptocurrencies are planning to switch to proof-of-stake (PoS), and who's to say the preferred algorithms won't change again, rendering all of that R&D and manufacturing effort useless? They couldn't even sell such a dedicated product to other markets. Mining-focussed cards are a different story, given it's relatively straightforward to play about with firmware and strip out the display connectors, but it takes a lot of commitment to produce a dedicated ASIC.

    Don't get me wrong, it would be interesting for some product or other to render consumer GPUs useless for mining again (as I understand it, something with a ton of memory bandwidth and fewer 'cores' should do well, but then you still have the memory shortage to contend with), but then there's still motivation to produce different ASIC-resistant algorithms. Whether the speculative market will tolerate the cycle of switching to new algorithms on a regular basis is another story.
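    To illustrate what "memory-hard" means in practice, here's a minimal Python sketch of a proof-of-work loop built on scrypt, a memory-hard function available in the standard library's hashlib. The header, salt, and cost parameters below are purely illustrative (not those of any real coin): the point is that the n parameter forces the hasher to allocate and randomly revisit a large buffer, so throughput is bound by memory bandwidth rather than raw compute - which is exactly why commodity GPUs with fast memory do well and narrow ASICs have less of an edge.

    ```python
    # Sketch of a memory-hard proof-of-work, assuming a made-up header/salt.
    # scrypt allocates roughly 128 * n * r bytes, so doubling n doubles
    # the RAM each hash attempt needs.
    import hashlib

    def scrypt_pow(header: bytes, nonce: int, n: int = 2**14) -> bytes:
        # Hash the header plus nonce with scrypt; n=2**14, r=8 needs ~16 MiB.
        return hashlib.scrypt(header + nonce.to_bytes(8, "little"),
                              salt=b"demo", n=n, r=8, p=1, dklen=32)

    def mine(header: bytes, difficulty_bits: int) -> int:
        # Brute-force nonces until the digest has the required leading zero bits.
        target = 1 << (256 - difficulty_bits)
        nonce = 0
        while int.from_bytes(scrypt_pow(header, nonce), "big") >= target:
            nonce += 1
        return nonce
    ```

    A compute-only ASIC gains little here: each attempt still has to stream tens of megabytes through memory, so the bottleneck is bandwidth, not hash logic.
    
    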
