Most modern (AAA) games have some pretty amazing support for GCN h/w thanks to consoles, but if you look at engines that are properly optimized for all h/w on the market (like UE4), you'll see what I've already said above: GCN needs about twice as much energy for the same performance as current NV tech, and this is purely a h/w design problem. It's also likely to become a catastrophe for AMD in 2018 after the launch of NV's next-gen gaming architecture, which will probably improve perf/watt even further if Volta is any indication.
The thing that disappoints me most about modern AMD is that even when there are clear-cut deficiencies like this, it basically takes a decade for AMD to even try to address them. It just doubles and triples and quadruples down.
Welp... looks like the landscape's a-changing. Intel is going directly after Nvidia. They actually have the R&D budget and experience to make competitive chips, though drivers will still be a huge issue. This move is actually scary!
They tried going after Nvidia with Larrabee too. Apparently it isn't very easy, even with all that money. Feels more like Intel getting scared about Nvidia beginning to encroach on their territory, and hoping they can evolve to stop it.