Question: will the new Piledriver be getting a better memory controller, i.e. one that's natively 1600MHz or higher, instead of an 'up to 1866' afterthought?
What? The BD memory controller does support up to 1866 'native', by which I assume you mean without overclocking. The speed the memory runs at depends on the RAM itself; you need 1866MHz RAM to run at that speed at stock.
I meant this...
http://gskill.us/forum/announcement.php?f=2
And to answer my own question...
The built-in memory controller on Piledriver chips would be directly connected to 4 DDR3 DIMMs, with stock frequency support of 1866MHz in dual-channel mode.
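For reference (my own back-of-envelope sums, not from the announcement): dual-channel DDR3-1866 works out to 1866 MT/s x 8 bytes per transfer x 2 channels, roughly 29.9 GB/s of theoretical peak bandwidth, versus about 21.3 GB/s for dual-channel 1333.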
Thanks for that, I wasn't aware of it. A friend has just built a BD system and has been having occasional BSODs; while I suspected RAM settings or faulty RAM, this pretty much confirms it. However, I run 4 sticks myself on Thuban without problems.
Interesting:
http://www.xbitlabs.com/news/other/d...s_for_AMD.html
OTOH, they could be reading too much into the statement, as both IBM and GF are part of the consortium which develops SOI technology.
Yeah, definitely interesting. GloFo have been holding AMD back with delays, yield issues and so on. As much as I want them to be a successful and competitive fab, at least there are a number of competing fabs in existence, rather than just the two major competing x86 CPU companies.
Hmm, my friend is running 1333MHz RAM, so possibly not...
Still trying to get him to run Memtest.
This is interesting:
http://www.extremetech.com/computing...t-overclocking
From the article:
"We will know more when the team presents its research on February 27 at the International Symposium on High Performance Computer Architecture."
It's peer-reviewed research, so it's open to other researchers to investigate the claims. Intel and Nvidia do exactly the same thing - most of the research groups which make claims using those companies' equipment in many cases have funding from them too. Intel and Nvidia also fund university research, and that has been the case for yonks. Even Apple does so (they have equipment grants, for example).
Two edits to the article already
Hmmm, looks like interesting research. Not sure how they simulated an APU with L3 cache, as far as I know there aren't any in the immediate offing, but it's basically someone showing that software can handle what AMD want to do in hardware. It's hardly a revolution, just a new way of getting data to the GPU when it needs it. Wonder how much complexity it adds to coding / debugging and the compiler...
This is the sort of thing I've been expecting (and posting about) since the APU was announced. And it's the sort of thing AMD are ultimately hoping for - using it as an APU, not just a CPU with the GPU used for graphics. The compute-oriented GPU architectures should help a lot too.
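To make the "getting data to the GPU when it needs it" point concrete, here's a minimal OpenCL host-side sketch of my own (nothing to do with the paper's actual simulator, and it assumes an OpenCL 1.1 runtime with a GPU device present): on an APU, where CPU and GPU share physical RAM, a host-visible buffer lets the GPU consume data in place instead of via a bulk copy.

/* Minimal sketch: share data with the GPU without an explicit transfer.
   CL_MEM_ALLOC_HOST_PTR requests host-visible memory, which an APU's GPU
   can read in place - no clEnqueueWriteBuffer bulk copy needed. */
#include <CL/cl.h>

int main(void) {
    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Host-visible buffer: on an APU the GPU shares this physical RAM. */
    size_t bytes = 1024 * sizeof(float);
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_ALLOC_HOST_PTR,
                                bytes, NULL, NULL);

    /* The CPU fills the buffer by mapping it - no explicit copy step. */
    float *p = (float *)clEnqueueMapBuffer(q, buf, CL_TRUE, CL_MAP_WRITE,
                                           0, bytes, 0, NULL, NULL, NULL);
    for (int i = 0; i < 1024; i++)
        p[i] = (float)i;
    clEnqueueUnmapMemObject(q, buf, p, 0, NULL, NULL);

    /* ...a kernel consuming buf would be enqueued here... */
    clFinish(q);

    clReleaseMemObject(buf);
    clReleaseCommandQueue(q);
    clReleaseContext(ctx);
    return 0;
}

On a discrete card the same code still works, but the map/unmap hides a copy across PCIe; as I read it, the paper is about scheduling that data movement cleverly, whether it's done in software like this or in hardware.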
A good article about the future direction of AMD CPUs:
http://techreport.com/articles.x/22452/1
Looks like the new CTO acknowledges they need more than 10% to 15% improvement in CPU performance each year.
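To put a number on that (my own arithmetic, not from the article): 10% a year only compounds to about 1.1^3 = 1.33x over three years, and 15% to about 1.15^3 = 1.52x.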
Some more details about the CPUs AMD is releasing this year:
http://www.techpowerup.com/160274/AM...10-Series.html
So, the top-end Trinity A10 will have an HD7660D IGP. This does indicate the IGP will be in between an HD6570 and HD6670 GPU (these are being rebranded as the HD7570 and HD7670, IIRC).
Hopefully, the A10 has an IGP which is close to an HD6570 GDDR5 or HD5670 GDDR5 in performance. The existing A8 has an IGP similar in performance to an HD5550 GDDR5.
Pretty impressive when you consider what the next Xbox is meant to be using for its GPU!