Desktop (Cy): Intel Core i7 920 D0 @ 3.6GHz, Prolimatech Megahalems, Gigabyte X58-UD5, Patriot Viper DDR3 6GiB @ 1440MHz 7-7-7-20 2T, EVGA NVIDIA GTX 295 Co-Op, Asus Xonar D2X, Hauppauge WinTV Nova TD-500, 2x WD Caviar Black 1TB in RAID 0, 4x Samsung EcoDrive 1.5TB F2s in RAID 5, Corsair HX 750W PSU, Coolermaster RC-1100 Cosmos Sport (Custom), 4x Noctua P12s, 6x Noctua S12Bs, Sony Optiarc DVD+/-RW, Windows 7 Professional Edition, Dell 2408WFP, Mirai 22" HDTV
MacBook Pro (Voyager): Intel Core 2 Duo @ 2.6GHz, 4GiB DDR2 RAM, 200GB 7200RPM HDD, NVIDIA 8600GTM 512MB, SuperDrive, Mac OS X Snow Leopard, 15.4" Matte Display
HTPC (Delta-Flyer): Intel Core 2 Q8200 @ 2.33GHz, Zotac GeForce 9300-ITX, 2GiB of DDR2 Corsair XMS2 RAM, KWorld PE355-2T, Samsung EcoDrive F2 1.5TB, In-Win BP655, Noctua NF-R8, LiteOn Blu-ray ROM Drive, Windows 7 Home Premium, 42" Sony 1080p Television
i7 (Bloomfield) Overclocking Guide
Originally Posted by Spock
Odd, I wonder what the deciding factor is that makes optimal screen size vary so much between people?
I went initially from a 21" CRT to a 24" LCD with no adjustment time needed... but as I said, the jump to 30" took a while.
And that is for gaming. I play everything at 2560x1600.
Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network: Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
Laptop: Dell Precision 5510 || Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive
Yes, I understand that. It's not very good having such a big screen on an FPS, right up until the point you try scopeless sniping. In Fallout 3, for example, using the Repeater I can shoot twice as far as I can with the sniper rifle, because when you're scoped in with the sniper rifle it does that annoying shake.
Originally Posted by Spock
Appropriate use of power vs cost, or to use the better-known phrases: more money than sense? Or even "a fool and his money are soon parted"?
Intel's EEs are STUPIDLY expensive. So were NV's top-end GFX chips. 600 quid for a graphics card?
Yes, AMD at the top end does lose out to Intel at present. Things will roll around as they do. Games are GPU-limited, not core-limited, at present. The next age of computing will be the coding revolution: better effective use of multi-core setups is what is needed. HT is just "cheating" vs real cores. I will give Intel credit though; since they changed the bus it has made a lot of difference, so it's not starved for bandwidth.
Listen, I like AMD too. I have an Athlon X2 in the laptop I'm typing this on, and a Phenom II 940BE in one of my rigs. However, that is because I bought an i7 965 when they came out for my main rig and I can afford to dick around with second-tier stuff to toss some support AMD's way for competition's sake. No one has any real reason to buy an AMD CPU these days. Besides the fact that they are clock-for-clock slower at dual and quad core, they have no Hyper-Threading, and their motherboards aren't up to Intel standards. Not to mention they lock you into CrossFire for multi-card, whereas with Intel you can buy whichever is better, CF or SLI.
AMD locks you in? Sorry, NV locks you OUT. It's been proven before that SLI is disabled except on certified boards. The P45 boards could be unlocked to run SLI.
To me that's far too big. 24" would be fine for me.
Except for one thing: gaming on a 30" 25x16 monitor is about 800x better than gaming on a 21" 16x10 monitor. It fills your field of vision, the resolution is almost double, and everything is bigger and sharper. Not to mention all the 30" panels I've heard of are very high-quality displays. Why is it "daft" if you can easily afford it? I wrote the check for $1200 without a second thought.
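As a rough illustration of the pixel-count side of that argument (assuming 1920x1200 for a typical 24" 16:10 panel and 1600x1200 for an old 21" CRT; actual resolutions will vary):

[code]
# Quick pixel-count comparison; resolutions are assumed typical values,
# not anyone's actual monitors.
monitors = {
    '30" 2560x1600': 2560 * 1600,   # 4,096,000 px
    '24" 1920x1200': 1920 * 1200,   # 2,304,000 px
    '21" 1600x1200': 1600 * 1200,   # 1,920,000 px
}

base = monitors['30" 2560x1600']
for name, pixels in monitors.items():
    print(f'{name}: {pixels:,} px ({pixels / base:.0%} of the 30" panel)')
[/code]

Roughly 1.8x the pixels of a 24" panel and a bit over 2x a 21" CRT, so "almost double" is about right.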
Guess we'll see when Fermi eventually crawls into the light. Tegra is making waves though. Much better use of technology.
I don't think anyone needs to "save NVIDIA":
http://www.tmcnet.com/usubmit/2009/11/05/4466646.htm
NVIDIA is making millions while AMD loses millions, and NVIDIA has no debt (compared to AMD's $5 billion in debt).
Who's going to save AMD? The 58XX line is very good, but remember AMD is a much larger company than ATi; they can't survive on half the computer gaming market.
This sort of thing always amazes me: why do people talk about spending relatively small amounts of money as if the buyer were making a bonfire of hundred-dollar bills in the backyard?
What if the buyer has double the median household income for his/her area coming into the coffers because they've worked hard? Triple? Quadruple? Quintuple? You're going to begrudge them spending $1000 on a chip or monitor they get enjoyment from in their free time because cheaper models are available?
You never want to go shopping with me, it will anger you.
Errrrr....
A lot of people are saying Fermi will launch next month, not exactly a large amount of time after HD5870.
I know those 8-10 weeks may seem like a long time to you, but in the scale of GPU development time, that's pretty much the blink of an eye.
BTW, if you're about to post "They won't have quantity!", you might remember AMD didn't either. I've been thinking about buying one of those wonder 5870s since they're pretty cheap. Every day I look on Newegg: not one 5870 in stock. Go figure.
Slower, but also consuming a lot less energy. Also, I never realised a 5770 was software.
Joe gets what he paid for. A 4550 is a lot cheaper than a 3870.
Or what if Joe buys an ATi 4550 thinking it's a big upgrade over his 3870, only to find it's a crippled version of it with less of everything?!
Why compare apples to oranges? It's like saying RISC sucks because CISC has more complex instructions. Or, a more modern example, Atom versus Coppermine.
The 5870 is the same lousy VLIW arch ATi has been selling since the 2900, with the addition of much-needed TMUs and ROPs. Why do I say "lousy VLIW arch"? Does it seem odd to anyone else that it takes 1440 ATi stream processors in a 5850 to finally best the GTX 285 with 240 stream processors?
Lots of efficiency going on there; it only takes six times as many ATi SPs to offer comparable performance.
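For anyone who wants to check that arithmetic, here's a back-of-the-envelope sketch (assuming the commonly quoted shader counts, and that ATi counts each lane of its 5-wide VLIW units as a "stream processor" while NVIDIA's shader cores are scalar):

[code]
# Shader-count comparison with assumed figures:
# HD 5850 = 1440 ATi "stream processors" (288 five-wide VLIW units),
# GTX 285 = 240 scalar shader cores.
ati_sps = 1440
ati_vliw_units = ati_sps // 5        # 288 VLIW units
nv_cores = 240

print(f"ATi SPs per NV core:        {ati_sps / nv_cores:.1f}x")         # 6.0x
print(f"ATi VLIW units per NV core: {ati_vliw_units / nv_cores:.2f}x")  # 1.20x
[/code]

Counted per lane it's six ATi SPs for every NVIDIA core; counted per VLIW unit the gap is only about 1.2x, depending on how you choose to count.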
Driver trick, lol.
The changes and improvements in Fermi make the changes from the 4870 to the 5870 look like the difference between the 8800GT and the 9800GTX+ by comparison. It wasn't a huge leap from DX10.1 to DX11, and EyeFinity is just a driver trick that gives users the ability to stop worrying about things like whether they can actually see the difference between 4x and 8x AA, and instead to wonder whether they can forget about the 2"-thick opaque black bars running through their field of vision.
Last time I read about EyeFinity, the hardware actually contained the parts needed to output the visuals.
Would having SM3 make much difference? I had an X700 for two years before I encountered BioShock, which only runs with SM3. By then the card was garbage.
ATi took a year and a half to bring SM3 to market, and beyond that TWIMTBP is the predominant dev assist in the market today.
Originally Posted by Rollo
BTW- if you're about to post "They won't have quantity!", you might remember AMD didn't either. I've been thinking about buying one of those wonder 5870s since they're pretty cheap. Every day I look on newegg, not one 5870 in stock. Go figure.
We can't even find a Fermi that is running games.
Why don't you do us a favour and remove yourself from here?
Maybe we don't have money to burn. Not all of us were born with silver spoons...
Wanna post a poll, Rollo? See if NV can hit that deadline. March would be my bet, given the news I've seen.
I'm sure they will have a working card out for launch day, but one card does not make a launch. Paper launches suck. (And yes, both NV and ATI have done this.)
As for stock of ATI's cards... maybe it's because demand is so great? Low power, high performance and available now, versus "it's going to be awesome, please wait while we fix the horrible yield issues we're having."
Scan get them in and they sell out in days. Pre-ordered up to the hilt.
/me shrugs.
The more things change, the more they stay the same. It's funny to see that after four years Rollo is still spouting the same old FUD. The imperious "I'm better than you because I make more money" shtick is still there. The fact remains, I doubt anyone is going to stay wealthy for long if they're blowing $1000 on something that would serve them equally well at a third of the cost.
Has anyone seen or got proof that AMD/ATi have actually provided the code Eidos asked for so that AA can be enabled? Or are ATi just going to build the hack to use the NV code into their next driver release, something they may have done before, perhaps?
You can't link to me saying I'm "better" than anyone. I can't help how it makes you feel when someone says $1000 isn't unreasonable for a piece of hobby equipment, but you shouldn't put words in others' mouths.
Like it or not, the $1000 that is "a lot" to you and a "fair price" to me is "pocket change" and "tip money" to others.
Is any of us "right", or "better"?
Not in my opinion, just a difference of opinion, and/or place in the consumer market.
Well... Time for another piece of "innovation".
Looks like NV is stomping on Lucid.
http://www.anandtech.com/video/showdoc.aspx?i=3646 - Lucid Hydra 200: Vendor Agnostic Multi-GPU, Available in 30 Days
http://en.expreview.com/2009/11/06/r...ydra-chip.html - Report: NVIDIA Plans to Block Lucid’s Hydra Chip
Looks like a case of kill, crush and destroy to me. AMD has no issues with Lucid.
If you're wondering why the highly anticipated MSI Big Bang Trinergy motherboard turned out to be using NVIDIA's nForce 200 SLI chip, with no sign of the Lucid Hydra 200 chip as promised, the answer is simple: NVIDIA does not like it.
Considering the product would impact NVIDIA's profit from SLI licensing fees, the green giant has decided it's time to do something. Firstly, they will break support for Lucid's chip at the driver level, and by unknown means force MSI to postpone their "Big Bang" motherboard.
These stories and the many others that keep popping up are killing NV's reputation.
This is another very short term view by NV.
Why should we trust you? You aren't exactly making it easy for us.
It's not NVIDIA's fault you believe unsubstantiated rumors about them.
This kind of thing is impossible to disprove, of course: how do you prove you didn't do something?
I could tell you that a manager at NVIDIA told me they've never even seen Hydra, much less tried to block it coming to market. (true)
And then you'd likely say "I choose to believe the rumor".
As Lucid is heavily funded by Intel, and Intel are facing another round of lawsuits for anti-competitive behaviour, it seems more likely that they would ask for the chip to be shelved, as they are the ones responsible for trying to push Nvidia out of the chipset business. But blind prejudice cannot see the obvious, it seems.