Many people put far too much emphasis on power consumption, assuming even a small difference in load power will add up to massive savings on their electricity bill. For domestic usage that logic is usually deeply flawed, especially given how much extra people will spend with the aim of reducing their bills! I've worked it out more accurately in the past, but if you're convinced it makes a significant difference, actually do the maths and work out how much you're 'saving' each year versus the purchase price of the card.

A very rough ball-park for a 24/7 load is to take the load in Watts, and that's roughly what you'd pay in pounds per year. So if you game for 2 hours per day and the card draws 100W more, that's (again, very roughly, and rounding up) an extra tenner a year. Are you really going to make back the difference in purchase price over the time you own the card, let alone actually save money? It's not as straightforward as people assume, and that's with the UK having fairly expensive electricity compared to many other markets.
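If you want to make the arithmetic explicit, here's a minimal sketch (my own, not from anywhere official); the 11p/kWh unit price is an assumption that roughly matches the "Watts equals pounds per year at 24/7" rule of thumb above, so plug in your own tariff instead:

```python
# Rough annual running-cost estimate for an extra load.
# Assumed unit price: ~£0.11/kWh, which is roughly what the
# "Watts -> pounds per year at 24/7" rule of thumb implies.
# Swap in your actual tariff for a real figure.

def annual_cost_gbp(extra_watts: float, hours_per_day: float,
                    price_per_kwh_gbp: float = 0.11) -> float:
    """Extra electricity cost per year for a given extra load."""
    kwh_per_year = (extra_watts / 1000) * hours_per_day * 365
    return kwh_per_year * price_per_kwh_gbp

# 100W extra draw for 2 hours of gaming a day:
print(f"£{annual_cost_gbp(100, 2):.2f} per year")  # ~£8, i.e. a tenner rounded up
```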
Another thing people will do is look at graphs comparing the factory-OC AMD card in question to an Nvidia reference design, which tends to have substantially lower power consumption than the majority of actual retail cards.
Sure, all else being the same, lower power consumption is a bonus, but all else is not the same!