
Thread: Are we being sold the modern day version of snake-oil?

  1. #1
    No more Mr Nice Guy. Nick's Avatar
    Join Date
    Jul 2003
    Posts
    10,021
    Thanks
    11
    Thanked
    316 times in 141 posts

    Are we being sold the modern day version of snake-oil?

    I was recently very lucky to be at a pre-Bloodline party hosted by those lovely people from ATi... over a few HUGE flagons of Bavarian beer, conversation of course turned to graphics cards and graphics in general.

    Now, it's undisputed that the main reason for these more and more powerful cards is to get the best performance possible to give a better playing experience.

    Or is it?

    There was a rather animated discussion about graphics going on which I was dragged into to comment on the graphics in Half Life 2. The argument was along the lines of 'Isn't the water rendering in HL2 fantastic?' versus 'Yeah, but why can't you produce a set of drivers unified for Windows Media Center AND gaming?... so stuff your flash water effects'.

    My part in the conversation put a whole new swing on things when I was asked what I thought... which was that even on my 9800XT, running Cat 4.1s, HL2 looks great.

    The response was that it would look even better on a more powerful card, but I have to wonder if this is really true.

    Sure, you can quote visual quality measures at me and reel off frame rate charts all day long... but those are measurements made by a computer... can YOU with YOUR naked eye actually see a difference?

    With this thought in mind, I rolled up to Bloodline the next morning and noticed that on our superb HEXUS posters, we had a shot of an nVidia SLi set-up and two screenshots taken from a couple of reviews I did (HL2 and Doom 3). You couldn't see that those shots were done on a 9800XT... more importantly, you COULDN'T see that those shots WEREN'T done on a higher-end card.

    Then I watched the SLi machine running through 3DMark 2005, with the little bar showing how much the cards were waiting for the CPU to send them info to process, and finally those nagging thoughts crystallised into one coherent question, and believe me, it's a biggie.

    Ready? Here we go:

    In our haste to have the latest and greatest, the fastest and flashiest, have the hardware manufacturers succeeded in the biggest con job of all time by leading us into believing we HAVE to have their newest, most expensive bit of kit when a) the cards' speeds are now throttled by even the fastest CPUs and b) our eyes couldn't tell the difference anyway?

    Yes, ladies and gentlemen, I believe that we've reached the point where snake-oil is back... it's just printed on a PCB with a fan on top....
    Quote Originally Posted by Dareos View Post
    "OH OOOOHH oOOHHHHHHHOOHHHHHHH FILL ME WITH YOUR.... eeww not the stuff from the lab"

  2. #2
    ton3s utdmleach's Avatar
    Join Date
    Nov 2004
    Location
    Bradford
    Posts
    591
    Thanks
    0
    Thanked
    0 times in 0 posts
    B*llocks then, you mean I just wasted 450 euros on a card that my PC can't match... oh well. I see what you mean though, it's all down to ourselves falling for the hype, needing the latest and greatest. People are always wanting to overclock their cards when, in many cases, they don't need to, but it's the desire to have the BEST, or a few more points on benchies to differentiate themselves from the rest.

  3. #3
    Going Retro!!! Ferral's Avatar
    Join Date
    Jul 2003
    Location
    North East
    Posts
    7,860
    Thanks
    561
    Thanked
    1,438 times in 876 posts
    • Ferral's system
      • Motherboard:
      • ASUS Z97-P
      • CPU:
      • Intel i7 4790K Haswell
      • Memory:
      • 12Gb Corsair XMS3 DDR3 1600 Mhz
      • Storage:
      • 120Gb Kingston SSD & 2 Tb Toshiba
      • Graphics card(s):
      • Sapphire Radeon R9 380 Nitro 4Gb
      • PSU:
      • Antec Truepower 750 Watt Modular
      • Case:
      • Fractal Design Focus G Mid Tower
      • Operating System:
      • Windows 10 64 bit
      • Monitor(s):
      • 28" iiyama Prolite 4K
      • Internet:
      • 80Mb BT Fiber
    HL2 runs lovely on my machine and I'm running an XFX FX 5700 LE 128 MB. Everything is on high detail at 1024x768, except the shadows on the water, which are set to minimum. Still looks amazing though.

    However, Doom 3 ran better on my Radeon 9600 Pro card (currently about to be sent in for repair under warranty at Connect3D).

    It shows that you don't really need to spend hundreds on a new graphics card when a gig of RAM and a 1.8 GHz processor help gaming more than enough.

  4. #4
    Senior Member
    Join Date
    Jul 2003
    Location
    London
    Posts
    888
    Thanks
    9
    Thanked
    4 times in 4 posts
    I disagree, purely because of the slowdowns I get on my 9800 Pro with AA and AF turned up to a decent level - yes, it can still look good, but an XP2400/9800 Pro/nForce2 system can't make it look good at a constant fps above 40 - and that is why I'd upgrade if I had the money.

    Playable eye candy is the name of the game - a 9800 Pro just doesn't quite cut it for me; shame I'm cashless.

  5. #5
    Sublime HEXUS.net
    Join Date
    Jul 2003
    Location
    The Void.. Floating
    Posts
    11,819
    Thanks
    213
    Thanked
    233 times in 160 posts
    • Stoo's system
      • Motherboard:
      • Mac Pro
      • CPU:
      • 2*Xeon 5450 @ 2.8GHz, 12MB Cache
      • Memory:
      • 32GB 1600MHz FBDIMM
      • Storage:
      • ~ 2.5TB + 4TB external array
      • Graphics card(s):
      • ATI Radeon HD 4870
      • Case:
      • Mac Pro
      • Operating System:
      • OS X 10.7
      • Monitor(s):
      • 24" Samsung 244T Black
      • Internet:
      • Zen Max Pro
    I don't think we really get much in the way of improved visuals over a single generation; after all, most often the cards are just evolutions of a basic design, but they do gain the ability to do things faster.

    After a few generations the increased speed lets you play with new tricks - anti-aliasing is a prime example. 3dfx first tried it a long time ago, but back then it was simply too slow to use; six years on, it's now usable in games.

    The same will be true over time of things like HDRI lighting and subsurface scattering, amongst others. That's why it's really not worth upgrading with every new generation, unless you really *must* have the latest and greatest *all* the time.

    For instance, I won't be upgrading to the X800 series at all; I'll be skipping to the R520 at the earliest, possibly further, depending on how things go.
    (\__/)
    (='.'=)
    (")_(")

  6. #6
    Ah, Mrs. Peel! mike_w's Avatar
    Join Date
    Oct 2003
    Location
    Hertfordshire, England
    Posts
    3,326
    Thanks
    3
    Thanked
    9 times in 7 posts
    I really don't care much about graphics - even on low settings, modern games look great. In my opinion, gameplay is by far the most important factor - after all, we buy games to play them, not just to stare at them.
    "Well, there was your Uncle Tiberius who died wrapped in cabbage leaves but we assumed that was a freak accident."

  7. #7
    john johnnr892's Avatar
    Join Date
    Feb 2004
    Location
    Stowmarket
    Posts
    791
    Thanks
    0
    Thanked
    0 times in 0 posts
    Couldn't agree more, but it is easy to get caught up in the hype. Just looking at fps charts and graphs makes me want to overclock or buy the latest hardware.
    Last edited by johnnr892; 18-12-2004 at 10:54 PM. Reason: didnt type properly
    Cheiftech Matrix/xp 2600@ 2.3ghz/ Abit NF7 v2/1gb GEIL value dual channel pc3200@ 2.5-3-3-6/XFX 6600gt/80gb Western Digital boot disk/80gb maxtor for storage and games/LG cdrw/Nec 3500A

  8. #8
    Now with added sobriety Rave's Avatar
    Join Date
    Jul 2003
    Location
    SE London
    Posts
    9,948
    Thanks
    501
    Thanked
    399 times in 255 posts
    I like turning up the eye candy if I can. I'd quite like a faster card, but the truth is the only game I ever play is Tribes 2, and that plays fine at 1280x960 with everything maxed, with my 9700 core underclocked to 150MHz. A faster card would be a waste at the moment; since it seems that Tribes 2 is dying as an online game, I'm inclined to play as much of it as I can now while there are still people to play against.

    Rich :¬)

  9. #9
    Member
    Join Date
    Dec 2004
    Posts
    111
    Thanks
    0
    Thanked
    0 times in 0 posts
    I'm getting a 6600GT in a few months.

    I have skipped the previous generation, and hope to skip the one after this.

    I think it's the best way to go, as someone said above.
    However, I am also getting the option to go SLI and Socket 939, so if I do see slowdowns in a year or two I can have a small upgrade instead of another massive one.

    It's hard to accept that you can't have the best for more than, say, two months, but in the computer world that's just how it is. If everyone recognised that, we'd all probably be a little richer.

    Chunky
    MY SO CALLED HALF LIFE
    Episode One
    Episode Two
    Episode Three

  10. #10
    No more Mr Nice Guy. Nick's Avatar
    Join Date
    Jul 2003
    Posts
    10,021
    Thanks
    11
    Thanked
    316 times in 141 posts
    Now, feel free to correct me if I'm wrong, but can the human eye actually detect or benefit from an increase in framerate? Surely with a game updating the screen 40 times a second, the eye can't detect a difference as small as one fortieth of a second... that's 0.025 of a second. If our eyes were that sensitive, or our nervous systems to be more exact, then we wouldn't need photo finishes at the racecourse, would we?

    IIRC, films at the cinema run at 24 fps, and I've never noticed any jumpy motion there, have you? Of course, we could go a touch higher and take TV, which refreshes 50 times a second... again, I've not noticed any flickering there either.

    That would lead me to believe that anything above 40 fps is really just wasted 'bang for your buck', as you can't actually see it, can you?
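    To put some rough numbers on that (nothing scientific, just back-of-the-envelope arithmetic), here's how long a single frame actually stays on screen at various rates:

    Code:
    # Frame interval at a given rate: 1000 ms divided by frames per second.
    for fps in (24, 25, 40, 50, 60, 100):
        print(f"{fps:>3} fps -> {1000 / fps:6.1f} ms per frame")

    At 40 fps that's the 25 ms (0.025 s) per frame mentioned above; at 100 fps it's only 10 ms, and the question is whether anyone can honestly see the difference between the two.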

    As for all the bells and whistles, the graphical effects, yes, you might find that you have to upgrade your card to get them as they are hardware dependent, but most of the cost of new cards is from the research and production of ever faster cores. The actual bits that do the stuff like hardware bump mapping etc. AREN'T dependent on core speed; it's just in the chip design... so, if you've already got a card with plenty of oomph, why can't the manufacturers bring out cheaper cards based on slower, less expensive chips that still take advantage of the latest graphical niceties?

    Of course, I could be dead wrong on all this, but it's a thought process I've been having for a while.

    As to AA and AF, my 9800XT handles both with ease even in IL-2.... but NOT in Pacific Fighters... Now this brings in another interesting aspect...

    Remember the days of the Spectrum and the Amiga and all that? Remember how cruddy games looked when they first came out? Back then a new machine was launched maybe once every couple of years... the Amiga remained basically unchanged for AGES... but games looked better and better. How was that possible when the hardware hadn't changed?

    The answer was simple. Programmers got better at writing the code, either to make things look nicer or to run more smoothly... It seems that now, rather than bother to get the code right and optimise it to run as quickly as possible, developers just expect us to go and buy a new card... Doom 3 and HL2 are the latest examples... If you want 40 fps and all the tinsel, go buy a new card.

    I'm sorry, but I've GOT a DX9 card. It's a year old. How about you take some of the £35 or more I paid for the game and write some decent code so I can appreciate all your hard work on misting, fogging, real-time water reflections and all that jazz?

    It seems that the trend is for sloppy coding to be covered up by us having to buy more and more expensive bits of kit. And the price of the new technology isn't dropping either.

    A 3dfx card, the Voodoo 1, cost something like £90 when it was first released. Back then it was as bleeding edge as the 6800 Ultra is today. Inflation hasn't been that high, so why is the new card so expensive? In fact, it's less of a leap forward than the Voodoo, the card that kicked off consumer 3D, so how come it costs so much?
    Quote Originally Posted by Dareos View Post
    "OH OOOOHH oOOHHHHHHHOOHHHHHHH FILL ME WITH YOUR.... eeww not the stuff from the lab"

  11. #11
    Registered User
    Join Date
    Dec 2004
    Posts
    154
    Thanks
    1
    Thanked
    1 time in 1 post
    Quote Originally Posted by Deckard

    Yes, ladies and gentlemen, I believe that we've reached the point where snake-oil is back... it's just printed on a PCB with a fan on top....
    **** me.

    You must have been living a Stygian existence if you think Nvidia are the first or only company to market you something you don't need.

  12. #12
    Goron goron Kumagoro's Avatar
    Join Date
    Mar 2004
    Posts
    3,154
    Thanks
    38
    Thanked
    172 times in 140 posts
    I cannot remember exactly what the response time of the human eye is, but 40 fps is definitely way above it... I think the eye can detect black-and-white changes fastest, maybe.

    I think the reason people want high frame rates is that when the action gets really busy, the frame rate can drop off to a level which the eye can detect.
    I wonder why graphics cards always need to run as fast as they can; I guess they can't work out when they need to... or the manufacturers can't be bothered to implement a system which throttles them.
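    (In principle a throttle is dead simple: draw a frame, then sleep until the next one is due, rather than redrawing flat out. A rough sketch, where render() is just a hypothetical stand-in for drawing one frame:)

    Code:
    import time

    def render():
        # Hypothetical stand-in for drawing one frame of a game.
        pass

    def run_capped(target_fps=60, frames=300):
        # Simple frame limiter: render, then sleep until the next frame is due.
        frame_time = 1.0 / target_fps
        next_frame = time.perf_counter()
        for _ in range(frames):
            render()
            next_frame += frame_time
            sleep_for = next_frame - time.perf_counter()
            if sleep_for > 0:
                time.sleep(sleep_for)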

    When I look at high-res 1600x1200 screenshots, I can barely see any difference with or without anti-aliasing. At lower resolutions I can.

    When I see reviews, I always want to know what other components are suitable to go with that part... It still amazes me that people will buy low-latency RAM when they have, or are going to get, an Athlon XP.
    Last edited by Kumagoro; 19-12-2004 at 11:00 AM.

  13. #13
    Senior Member
    Join Date
    Jul 2003
    Location
    ZA ✈ UK
    Posts
    622
    Thanks
    0
    Thanked
    0 times in 0 posts
    That a game looks better on more expensive cards is rubbish - you could probably run HL2 on a Riva (provided it has enough RAM), and it'd look identical to an X800. The speed at which it generates frames is a different matter - newer cards make games more playable by not having you wait several minutes for each frame to be drawn. For example, a friend of mine played Doom 3 on an MX440, with everything set to minimum. He could play it - to an extent. The MX440 actually helped him cheat: when enemies were nearby, his frame rate would drop from sub-20fps to below 10fps. Obviously, the experience wasn't too good for him - playing a sluggish and unresponsive game makes it difficult to enjoy. But the point is that the game looked much as it would on a high-end card.

    I'd probably be able to see the difference between 40fps and 60fps. But then, that may be for the same reason I can't run at a refresh rate of less than 85Hz (my eyes are rather sensitive to that kind of thing). But higher frame rates do affect gameplay. In Quake 3, for example, some jumps can only be made when the game is running at 125fps. While visually undetectable, the apex of a jump at 125fps is ever so slightly higher than at lower framerates. Until someone explained it to me, I couldn't fathom why others could jump up a certain step and I couldn't, back on my Riva TNT at 60fps.
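    If it sounds odd that a jump can depend on frame rate at all, here's a toy sketch (NOT Quake 3's actual movement code; the gravity and jump-velocity numbers are made up purely for illustration) showing that when the physics is stepped once per rendered frame, the apex comes out slightly different at different rates:

    Code:
    # Toy sketch only: not Quake 3's real code. It just shows that integrating
    # a jump once per rendered frame makes the apex depend on fps.
    GRAVITY = 800.0        # units per second squared (made-up value)
    JUMP_VELOCITY = 270.0  # units per second (made-up value)

    def jump_apex(fps):
        # Step the jump once per frame and record the highest point reached.
        dt = 1.0 / fps
        pos, vel, apex = 0.0, JUMP_VELOCITY, 0.0
        while vel > 0.0 or pos > 0.0:
            pos += vel * dt      # move with the current velocity...
            vel -= GRAVITY * dt  # ...then apply this frame's gravity
            apex = max(apex, pos)
        return apex

    for fps in (60, 125, 333):
        print(f"{fps:>3} fps -> apex {jump_apex(fps):.2f} units")

    The exact direction and size of the difference depend on how the engine integrates and rounds each step (Quake 3's quirk is usually put down to per-frame rounding in its movement code), but the point stands: tie the physics step to the render rate and jump height stops being framerate-independent.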

    Comparing TV/cinema frame rates to computer frame rates is comparing apples to grapes. When the original footage is filmed, fast-moving objects blur across each frame. That blurring is what makes film and TV seem so smooth, even though they only run at 25 to 30fps. Computers, being the precise machines that they are, don't generate blurred frames. Of course, the programmers of a game could force motion blurring, but it may wind up looking jerkier at 30fps with blur than at 70fps without.
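    To picture the difference, think of what each medium actually records per frame. A rough sketch of the idea (position() here is just a hypothetical stand-in for where an object is at time t):

    Code:
    def position(t):
        # Hypothetical stand-in: an object moving at 500 units per second.
        return 500.0 * t

    def game_frame(t):
        # A game renders one crisp instant per frame: no blur at all.
        return position(t)

    def film_frame(t, shutter=1.0 / 48):
        # A film frame records everything the object covers while the shutter
        # is open, smearing it between these two points: that's motion blur.
        return position(t), position(t + shutter)

    # At 24 fps with a 1/48 s shutter, a 500 unit/s object smears across
    # roughly 10.4 units per frame, which is why low frame rates look smooth on film.
    print(game_frame(1.0), film_frame(1.0))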

    Overall, I don't think you need the latest and greatest card to play the newest games. Sure, it's nice to run at 1600x1200 with full anti-aliasing and everything set to maximum detail. But you can still play the same game with as much enjoyment at 800x600 with the same settings on an older card, and still get above playable framerates. That better cards will make you a better gamer is rubbish - those that play games competitively turn off the eye candy anyway.

  14. #14
    No more Mr Nice Guy. Nick's Avatar
    Join Date
    Jul 2003
    Posts
    10,021
    Thanks
    11
    Thanked
    316 times in 141 posts
    Quote Originally Posted by Hobart Paving
    **** me.

    You must have been living a Stygian existence if you think Nvidia are the first or only company to market you something you don't need.
    lol, no, I haven't, but I thought this was a topic relating to computers... that said, has ANYONE actually bought one of those USB kettles?

    I think they're the funniest thing ever, the premise being that you don't need to leave your PC to make a cuppa... except that you'll have to go to the kitchen to fill the thing up with water, get a cup, a tea bag and some milk...

    I just think that the whole GFX card thing takes it to the extreme though. It's a bit like Intel's ad campaign a few years ago telling everyone that their new CPU made the internet faster... I mean, really?
    Quote Originally Posted by Dareos View Post
    "OH OOOOHH oOOHHHHHHHOOHHHHHHH FILL ME WITH YOUR.... eeww not the stuff from the lab"

  15. #15
    Theoretical Element Spud1's Avatar
    Join Date
    Jul 2003
    Location
    North West
    Posts
    7,508
    Thanks
    336
    Thanked
    320 times in 255 posts
    • Spud1's system
      • Motherboard:
      • Gigabyte Aorus Master
      • CPU:
      • 9900k
      • Memory:
      • 16GB GSkill Trident Z
      • Storage:
      • Lots.
      • Graphics card(s):
      • RTX3090
      • PSU:
      • 750w
      • Case:
      • BeQuiet Dark Base Pro rev.2
      • Operating System:
      • Windows 10
      • Monitor(s):
      • Asus PG35VQ
      • Internet:
      • 910/100mb Fibre
    Well tbh HL2 runs fine on my PC

    I play on max details @ 1600x1200, and have framerates that never drop below 50fps. AA and AF are turned off, but I can't tell any difference at that resolution.

    At the moment I am home for Christmas, so I have to use Dad's little 17" monitor [there wasn't space in the car for my 21" beast] and consequently a lower resolution. The difference there is huge; I can clearly see the difference between AA and no AA at this res.

    Anyway, my point is that Deck is right: my 9800 Pro runs sweet on all the latest games atm, no problems with it. They look and perform great, so why should I splash out £400 on a top-of-the-range card? I spent £260 on one a year ago, and hopefully it will last me another year. It's the same story with my CPU: I've had this Athlon 2800+ since AMD started production and never had a problem. OK, so it's overclocked and will be pushed further when I cba to try, but it's still working. No real reason to 'upgrade' to an AMD64 chip yet..


    Of course, this gfx card malarkey can work the other way too. Remember the GF4 MX, the FX5200 or the Radeon 9200? How many people were suckered into buying one of those dead weights believing they were good cards, when a GF3 Ti was a better performer? It's a tricky market ;/

  16. #16
    No more Mr Nice Guy. Nick's Avatar
    Join Date
    Jul 2003
    Posts
    10,021
    Thanks
    11
    Thanked
    316 times in 141 posts
    Quote Originally Posted by Kumagoro
    I think the reason people want high frame rates is that when the action gets really busy, the frame rate can drop off to a level which the eye can detect.
    Fair point, but Serious Sam filled the screen with loads of monsters in an outdoor environment... yeah, it wasn't nearly as detailed as today's games, but the coding was so well done that even rigs that had trouble with an Excel sheet could run it at respectable framerates.

    Quote Originally Posted by Kumagoro
    I wonder why graphics cards always need to run as fast as they can; I guess they can't work out when they need to... or the manufacturers can't be bothered to implement a system which throttles them.
    Which brings us to the driver issue again... or poor coding from developers.... perhaps it's to make us feel that we HAVE to get something more powerful to get our games running smoothly...

    Quote Originally Posted by Kumagoro
    When I look at high-res 1600x1200 screenshots, I can barely see any difference with or without anti-aliasing. At lower resolutions I can.
    I agree... and I'm sure that with some better coding, we could run our games at LEAST one resolution higher with no troubles at all...
    Quote Originally Posted by Dareos View Post
    "OH OOOOHH oOOHHHHHHHOOHHHHHHH FILL ME WITH YOUR.... eeww not the stuff from the lab"
