Page 4 of 6
Results 49 to 64 of 90

Thread: Normal Witcher 3 performance is possible on AMD GPUs

  1. #49
    Moosing about! CAT-THE-FIFTH's Avatar
    Join Date
    Aug 2006
    Location
    Not here
    Posts
    32,039
    Thanks
    3,910
    Thanked
    5,224 times in 4,015 posts
    • CAT-THE-FIFTH's system
      • Motherboard:
      • Less E-PEEN
      • CPU:
      • Massive E-PEEN
      • Memory:
      • RGB E-PEEN
      • Storage:
      • Not in any order
      • Graphics card(s):
      • EVEN BIGGER E-PEEN
      • PSU:
      • OVERSIZED
      • Case:
      • UNDERSIZED
      • Operating System:
      • DOS 6.22
      • Monitor(s):
      • NOT USUALLY ON....WHEN I POST
      • Internet:
      • FUNCTIONAL

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    OS X is the desktop OS; iOS isn't. You cannot truly multitask on either iOS or Android, which sets them apart from proper desktop OSes like Windows and OS X.

  2. #50
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Sorry, I'm not seeing it. If what you claim were correct, why is the 750 Ti (Maxwell) getting lower frame rates than the 760 (Kepler)? That's the reverse of what you say about Kepler doing worse than Maxwell.

    Quote Originally Posted by watercooled View Post
    Looking at it objectively, I really don't come to that conclusion? AMD have their own exclusive features, but I wouldn't choose them over Nvidia because of them; it works both ways before you accuse me of being biased.
    Sorry if you think I accused you of being biased, but it does seem you're ignoring the business reasons for wanting to keep certain features exclusive.
    You say AMD have their own exclusive features, so why is it OK for AMD to have exclusive features but not Nvidia?

    Quote Originally Posted by watercooled View Post
    And that matters how exactly?
    Edit: Didn't notice when I replied, but yeah, CAT is correct. I was referring to iOS, the OS which runs on the iPhone/iPad and is the competitor to Android, not OS X. For some reason I read it as something along the lines of 'Windows is desktop, Android is not'.
    Apologies, that was me confusing OS X and iOS.

    The mobile space is very different. Yes, it may have the free Android OS versus iOS, but the one thing they both have in common is that they attempt to tie people into using one system or the other via their app stores. While one OS is proprietary and the other takes a more open-source approach, they're both attempting the same thing, something Microsoft is trying to replicate: they're all offering what amounts to a system intended to persuade people to choose them over their competitors, at the initial purchase and every purchase after.

    For all intents and purposes, Android and iOS may as well both be proprietary systems; they both achieve the same result of making people more dependent on them than on their competitor, in which they have a vested interest.

    Quote Originally Posted by watercooled View Post
    That same proprietary system which is promising to run iOS and Android apps since there are so few on Windows?
    And that just goes to show why it's more of a benefit to the underdog when it comes to sharing features.
    If Microsoft was being altruistic why hasn't it offered the win32 binaries to iOS and Android so they can run Windows software?

  3. #51
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    I notice in the second graph at http://wccftech.com/witcher-3-initial-benchmarks/ that switching on HairWorks on the 285 (where geometry/tessellation was improved) has not much worse an impact than it does on the GTX 960, so they start and end roughly on par with each other.

    So that says to me that GCN 1.2 is OK here; it's just unfortunate they only have one card that uses it.

    AMD need to get the 390 out the door.

  4. #52
    Anthropomorphic Personification shaithis's Avatar
    Join Date
    Apr 2004
    Location
    The Last Aerie
    Posts
    10,857
    Thanks
    645
    Thanked
    872 times in 736 posts
    • shaithis's system
      • Motherboard:
      • Asus P8Z77 WS
      • CPU:
      • i7 3770k @ 4.5GHz
      • Memory:
      • 32GB HyperX 1866
      • Storage:
      • Lots!
      • Graphics card(s):
      • Sapphire Fury X
      • PSU:
      • Corsair HX850
      • Case:
      • Corsair 600T (White)
      • Operating System:
      • Windows 10 x64
      • Monitor(s):
      • 2 x Dell 3007
      • Internet:
      • Zen 80Mb Fibre

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by CAT-THE-FIFTH View Post
    Emm, I have not met anyone who is a gamer in real life who actually cares about PhysX or TressFX or any of these effects, apart from people like us on forums.

    The biggest-selling PC games are not graphical powerhouses. They are games like LoL, DOTA 2 and Minecraft. Even Blizzard games don't use any of that tech and are more CPU-limited anyway.
    I think you'd be surprised how many do want those features. I know several people (including myself) who shy away from AMD cards because of franchises like Batman, Borderlands and Metro. The difference in those titles between PhysX and no PhysX can be quite significant.

    Of course, a lot of people are going to put price and gameplay ahead of graphical fidelity, and those of us who crave the bells and whistles can still enjoy games like SC2 and D3 (as you say, Blizzard are not the biggest users of GPUs!). But when you can see the differences on offer and one of them is noticeably more enjoyable to look at, it can be hard to pick the blander one, and even harder to choose a graphics card you know excludes you from even having the decision.
    Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
    HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
    HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
    Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
    NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network:Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
    Laptop: Dell Precision 5510 Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive

  5. #53
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by Corky34 View Post
    Sorry, I'm not seeing it. If what you claim were correct, why is the 750 Ti (Maxwell) getting lower frame rates than the 760 (Kepler)? That's the reverse of what you say about Kepler doing worse than Maxwell.
    The 750 Ti is a significantly smaller GPU than the 760. You think it's right that the 960 outperforms the 780, and a 290X matches 780 Ti SLI?

    Quote Originally Posted by Corky34 View Post
    Sorry if you think I accused you of being biased, but it does seem you're ignoring the business reasons for wanting to keep certain features exclusive.
    You say AMD have their own exclusive features, so why is it OK for AMD to have exclusive features but not Nvidia?
    I was pre-empting it (correctly it seems). At no point have I said it's acceptable for one brand to do something, and another brand to not do it. However, AMD's 'exclusive' features tend to be open anyway, so there's nothing stopping them from running well on GPUs from other manufacturers. That option frequently doesn't exist with Nvidia.

    Quote Originally Posted by Corky34 View Post
    The mobile space is very different. Yes, it may have the free Android OS versus iOS, but the one thing they both have in common is that they attempt to tie people into using one system or the other via their app stores. While one OS is proprietary and the other takes a more open-source approach, they're both attempting the same thing, something Microsoft is trying to replicate: they're all offering what amounts to a system intended to persuade people to choose them over their competitors, at the initial purchase and every purchase after.

    For all intents and purposes, Android and iOS may as well both be proprietary systems; they both achieve the same result of making people more dependent on them than on their competitor, in which they have a vested interest.
    This is beginning to sound straw-man-ish now. AOSP is open; there's nothing stopping Android-compatible apps running on different systems, as shown by BlackBerry, Fire OS and now Windows. The point being that 'a closed system is better for business' is a fallacy. It may correlate in some cases, but doesn't in others.

    Quote Originally Posted by Corky34 View Post
    If Microsoft was being altruistic why hasn't it offered the win32 binaries to iOS and Android so they can run Windows software?
    It depends on the sort of app. x86 apps cannot run on ARM, while there are already Android/iOS versions of most .NET apps anyway, so there's little point either way. And the app compatibility on Windows isn't what I'd consider altruistic; it's more an effort to remove a barrier to switching to their OS, since their own app marketplace is badly behind the others.
    Last edited by watercooled; 22-05-2015 at 03:11 PM.

  6. #54
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by shaithis View Post
    I think you'd be surprised how many do want those features. I know several people (including myself) who shy away from AMD cards because of franchises like Batman, Borderlands and Metro. The difference in those titles between PhysX and no PhysX can be quite significant.
    Horses for courses, I guess. I had an Nvidia GPU when I played the first Batman and played it with GPU PhysX off, as it wasn't worth the performance impact IMO. I also found it more gimmicky than anything: having sparks falling randomly from the sky and stacks of paper flying around, in the words of Shania Twain, didn't impress me much. The smoke was pretty cool, but also seemed a bit overused.

    I also seem to remember a config change for Borderlands that would allow 'GPU PhysX' to run fine on the CPU without an Nvidia card installed.
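    For anyone curious, the tweak I'm thinking of was an engine INI edit. I can't find the original guide any more, so the file location and key name below are from memory and may well differ between Borderlands releases; treat it as a sketch rather than a verified recipe:

    ```ini
    ; Borderlands 2 example (path and key name remembered, possibly inexact):
    ; Documents\My Games\Borderlands 2\WillowGame\Config\WillowEngine.ini
    [Engine.Engine]
    ; 0 = low (CPU-safe); higher values enable the 'GPU' particle effects,
    ; which fall back to running on the CPU when no Nvidia card is present.
    PhysXLevel=1
    ```

    If the key doesn't match your install, searching the Config folder for 'PhysX' should turn up the real setting.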

  7. #55
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by watercooled View Post
    The 750 Ti is a significantly smaller GPU than the 760. You think it's right that the 960 outperforms the 780, and a 290X matches 780 Ti SLI?
    Sorry, you said Kepler does worse than Maxwell, something that doesn't appear to be correct.
    That's unless you know of other Kepler vs Maxwell comparisons, and not Maxwell vs Maxwell or dual GPU vs dual GPU.

    Quote Originally Posted by watercooled View Post
    I was pre-empting it (correctly it seems). At no point have I said it's acceptable for one brand to do something, and another brand to not do it. However, AMD's 'exclusive' features tend to be open anyway, so there's nothing stopping them from running well on GPUs from other manufacturers. That option frequently doesn't exist with Nvidia.
    And the only reason they're open is because they have everything to gain from their tech being adopted by the market leader; if AMD were the market leader, do you think they would still be so altruistic?

    Quote Originally Posted by watercooled View Post
    This is beginning to sound straw-man-ish now. AOSP is open, there's nothing stopping Android-compatible apps being run on different systems, as shown by Blackberry, FireOS and now Windows. The point being that, having a closed system=better for business is a fallacy. It may correlate in some cases, but doesn't in others.
    I didn't say it's the best business practice in all situations. I said, and still say, that sharing features is only a benefit when either the market is divided equally or you have the smallest market share.
    If you have a dominant market share, you're basically spending time and money to give your competitors a leg up, an advantage, for free no less.

    Quote Originally Posted by watercooled View Post
    It depends on the sort of app. x86 apps cannot run on ARM, while there are already Android/iOS versions of most .NET apps anyway, so there's little point either way. And the app compatibility on Windows isn't what I'd consider altruistic; it's more an effort to remove a barrier to switching to their OS, since their own app marketplace is badly behind the others.
    So, exactly as I've laid out: the underdog Microsoft (in mobile) is attempting to provide people with the same secret sauce they find on iOS and Android on their OS; it's of benefit to Microsoft to enable people to run the market leaders' technology.

  8. #56
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by Corky34 View Post
    Sorry, you said Kepler does worse than Maxwell, something that doesn't appear to be correct.
    That's unless you know of other Kepler vs Maxwell comparisons, and not Maxwell vs Maxwell or dual GPU vs dual GPU.
    Are you just being deliberately obtuse? Because Kepler's poor performance is blindingly obvious, seemingly even to the people who are happy to lay the blame solely at AMD's feet for the poor performance of GW on AMD cards. And I bolded the SLI part for good reason: a SINGLE 290X matches TWO 780s in SLI, i.e. single GPU vs dual GPU. Crossfire is predictably non-functional, so the 295X is about the same as a 290X.

    Quote Originally Posted by Corky34 View Post
    And the only reason they're open is because they have everything to gain from their tech being adopted by the market leader; if AMD were the market leader, do you think they would still be so altruistic?
    I've no idea, and nor does it matter in this context. If the positions were swapped I'd be equally critical of AMD.

    Quote Originally Posted by Corky34 View Post
    I said, and still say, that sharing features is only a benefit when either the market is divided equally or you have the smallest market share.
    If you have a dominant market share, you're basically spending time and money to give your competitors a leg up, an advantage, for free no less.
    Which is true sometimes and untrue others, just as I said. If PhysX were more widely supported, more games would probably use it, as devs wouldn't have to worry about locking features to certain users. As it stands, it's basically only used in about one TWIMTBP game per year.

    Quote Originally Posted by Corky34 View Post
    So, exactly as I've laid out: the underdog Microsoft (in mobile) is attempting to provide people with the same secret sauce they find on iOS and Android on their OS; it's of benefit to Microsoft to enable people to run the market leaders' technology.
    You've got it back to front - MS are the ones adopting someone else's technology, not 'being open'.
    Last edited by watercooled; 22-05-2015 at 04:15 PM.

  9. #57
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by directhex View Post
    http://www.3dmark.com/compare/fs/4740806/fs/2757339

    When compared directly to each other, I guess? Note especially how much better the 960 is on the physics benches than the 780, which is relevant given this discussion is all about GameWorks
    Missed this post earlier. FYI, the physics score is more CPU-based, and the system with the 960 has an 8-core CPU whereas the 780 system has a 6-core.

  10. #58
    root Member DanceswithUnix's Avatar
    Join Date
    Jan 2006
    Location
    In the middle of a core dump
    Posts
    12,986
    Thanks
    781
    Thanked
    1,588 times in 1,343 posts
    • DanceswithUnix's system
      • Motherboard:
      • Asus X470-PRO
      • CPU:
      • 5900X
      • Memory:
      • 32GB 3200MHz ECC
      • Storage:
      • 2TB Linux, 2TB Games (Win 10)
      • Graphics card(s):
      • Asus Strix RX Vega 56
      • PSU:
      • 650W Corsair TX
      • Case:
      • Antec 300
      • Operating System:
      • Fedora 39 + Win 10 Pro 64 (yuk)
      • Monitor(s):
      • Benq XL2730Z 1440p + Iiyama 27" 1440p
      • Internet:
      • Zen 900Mb/900Mb (CityFibre FttP)

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by Corky34 View Post
    And the only reason their open is because they have everything to gain from their tech being adopted by the market leaders, if AMD was market leader do you think they would still be so altruistic?
    They gave away Mantle, so yes I expect they would.

  11. #59
    Token 'murican GuidoLS's Avatar
    Join Date
    Apr 2013
    Location
    North Carolina
    Posts
    806
    Thanks
    54
    Thanked
    110 times in 78 posts
    • GuidoLS's system
      • Motherboard:
      • Asus P5Q Pro
      • CPU:
      • C2Q 9550 stock
      • Memory:
      • 8gb Corsair
      • Storage:
      • 2x1tb Hitachi 7200's, WD Velociraptor 320gb primary
      • Graphics card(s):
      • nVidia 9800GT
      • PSU:
      • Corsair 750w
      • Case:
      • Antec 900
      • Operating System:
      • Win10/Slackware Linux dual box
      • Monitor(s):
      • Viewsonic 24" 1920x1080
      • Internet:
      • AT&T U-Verse 12mb

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by DanceswithUnix View Post
    They gave away Mantle, so yes I expect they would.
    Is it ironic that what's left of Mantle ended up at a consortium led by an executive from Nvidia?

    Of course, the whole Mantle thing was never about competing with Nvidia; it was about competing with Microsoft and DX, a fight they would never have won and, unlike Khronos, a fight that probably wouldn't even have managed a stalemate. There's still no guarantee that Vulkan will ever be a real competitor, unfortunately. It's great that Valve is backporting part of their library to OpenGL and has promised support in Source 2, but Gabe Newell definitely marches to the beat of his own drum, and a lot of devs may not be willing to join him. That's the key to the success of Vulkan: dev support. An API is pointless if nobody uses it.
    Esse Quam Videri
    Out on the road today I saw a Black Flag Sticker on a Cadillac...


  12. #60
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by GuidoLS View Post
    Is it irony that what's left of Mantle ended up at a consortium led by an executive from Nvidia?

    Of course, the whole Mantle thing was never about competing with Nvidia; it was about competing with Microsoft and DX, a fight they would never have won and, unlike Khronos, a fight that probably wouldn't even have managed a stalemate. There's still no guarantee that Vulkan will ever be a real competitor, unfortunately. It's great that Valve is backporting part of their library to OpenGL and has promised support in Source 2, but Gabe Newell definitely marches to the beat of his own drum, and a lot of devs may not be willing to join him. That's the key to the success of Vulkan: dev support. An API is pointless if nobody uses it.
    Considering what DX12 offers, I'd say they pretty much got what they set out to achieve whatever way you look at it.

  13. #61
    Token 'murican GuidoLS's Avatar
    Join Date
    Apr 2013
    Location
    North Carolina
    Posts
    806
    Thanks
    54
    Thanked
    110 times in 78 posts
    • GuidoLS's system
      • Motherboard:
      • Asus P5Q Pro
      • CPU:
      • C2Q 9550 stock
      • Memory:
      • 8gb Corsair
      • Storage:
      • 2x1tb Hitachi 7200's, WD Velociraptor 320gb primary
      • Graphics card(s):
      • nVidia 9800GT
      • PSU:
      • Corsair 750w
      • Case:
      • Antec 900
      • Operating System:
      • Win10/Slackware Linux dual box
      • Monitor(s):
      • Viewsonic 24" 1920x1080
      • Internet:
      • AT&T U-Verse 12mb

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by watercooled View Post
    Considering what DX12 offers, I'd say they pretty much got what they set out to achieve whatever way you look at it.
    Some did, anyway. Gaben's fantasy of seeing every Microsoft exec ever with a hatchet in their forehead has yet to come to fruition, though.

    Seriously, though, it would be nice to see some simultaneous releases of AAA software on both DX and OpenGL. While not a graphical powerhouse, Obsidian's Pillars of Eternity proved it could be done.


  14. #62
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Maybe we'll see it more with Mantle essentially being absorbed into the OpenGL successor. The move makes a lot of sense when you think about it, rather than competing with two APIs.

  15. #63
    Senior Member
    Join Date
    Dec 2013
    Posts
    3,526
    Thanks
    504
    Thanked
    468 times in 326 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by watercooled View Post
    Are you just being deliberately obtuse? Because Kepler's poor performance is blindingly obvious, seemingly even to the people who are happy to lay the blame solely at AMD's feet for the poor performance of GW on AMD cards. And I bolded the SLI part for good reason: a SINGLE 290X matches TWO 780s in SLI, i.e. single GPU vs dual GPU. Crossfire is predictably non-functional, so the 295X is about the same as a 290X.
    No, but it would seem you're backtracking on what you previously said, that being "And the reason for equally terrible Kepler performance?" and all the other posts complaining about the "relatively poor performance on Nvidia cards previous to Maxwell". Perhaps if you move the goalposts far enough no one will notice.

    Quote Originally Posted by watercooled View Post
    I've no idea, and nor does it matter in this context. If the positions were swapped I'd be equally critical of AMD.
    Of course it matters. The only reason people are up in arms is because company A is using its market dominance to supposedly adversely affect company B, something company B would happily do if the roles were reversed. It's called economics; like I said, a business's job is to make money.

    Quote Originally Posted by watercooled View Post
    Which is true sometimes and untrue others, just as I said. If PhysX were more widely supported, more games would probably use it, as devs wouldn't have to worry about locking features to certain users. As it stands, it's basically only used in about one TWIMTBP game per year.
    And that's still one more game per year than their competitor has; even more so when you choose a graphics card you know excludes you from even having the decision.

    Quote Originally Posted by watercooled View Post
    You've got it back to front - MS are the ones adopting someone else's technology, not 'being open'.
    That's because they have to if they want to compete with the big boys; like I said, it benefits the underdog to share with the market leaders, not the other way around.

  16. #64
    Senior Member watercooled's Avatar
    Join Date
    Jan 2009
    Posts
    11,478
    Thanks
    1,541
    Thanked
    1,029 times in 872 posts

    Re: Normal Witcher 3 performance is possible on AMD GPUs

    Quote Originally Posted by Corky34 View Post
    No, but it would seem you're backtracking on what you previously said, that being "And the reason for equally terrible Kepler performance?" and all the other posts complaining about the "relatively poor performance on Nvidia cards previous to Maxwell". Perhaps if you move the goalposts far enough no one will notice.
    Would you care to explain where I've backtracked? Because I completely stand by what I said: performance is poor on Nvidia cards prior to Maxwell.

    You do understand that the 780 is a Kepler card, right? The only Maxwell cards in the 700 series are the 750s. I really don't see what's so hard to understand, unless you're confusing the model numbers?

    However, expecting a 750 Ti to beat everything Kepler just because it's Maxwell is silly, and not something I implied at any point. Kepler performs comparatively far worse than Maxwell.

