This is lengthy, but heh, I can't find a shorter way to go through it all:
- While waiting for Digital Foundry to come up with some reliable data on a PC/console comparison, I've compared side-by-side, frame-by-frame footage between a supposedly maxed-quality 4K PC version and the PS4 Pro. I'm not seeing the difference people claim there is. The PC footage is very clearly more defined and higher resolution, but what's on screen is almost identical to the PS4 footage: the same distance ranges, the same geometry, the same trees in the distance, the same grass draw distance, the exact same distance at which shadows pop in. It's close to a 1:1 reproduction. The PS4 seems to have a stronger "haze", and has much less clarity because of the resolution, but it looks like the same image rescaled. So I'm still doubting that the console version is a mix of low and medium settings, unless low and medium are nearly identical to max.
This is the footage I've used:
https://www.youtube.com/watch?v=wZpgt6L89hY (PS4 Pro)
https://www.youtube.com/watch?v=hNutWJ7Xw2Q (PC)
I've compared from 2:05 onward in the first video against 1:42:30 onward in the second, continuing for the next 10 minutes. Since they are scripted sequences in the open, they are quite easy to compare side by side.
- The second point is about the topic itself. The advantage of playing on a console isn't simply that the game "just works", but also that it was built FOR the hardware. On PC, when you have to juggle settings to find a decent compromise between performance and quality, you can spend what feels like three months taking screenshots, comparing every setting, and so on. There are always settings that tank performance while being visually negligible, so you just don't know what the best compromise is unless you really spend hours researching, and even then it's always a rough estimation.
This is on top of emergent technical issues. For example, I remember that for the first Titanfall, the developers explained they had spent a lot of time reorganizing the texture pool on PC, so that all the big, important textures that take priority on screen retained very high quality, while lower-resolution textures were used for things that were more hidden and less noticeable. The result was that in a screenshot comparison there was almost no perceivable difference between texture settings, even though the memory requirements went up dramatically from one setting to the next.
Compare what I just said to Assassin's Creed Unity. If you tried to match the PS4's texture quality on the PC version, you got something that looked HORRENDOUS. That's because the PS4 had a carefully handpicked mix of medium and high textures that was impossible to reproduce on PC. You either selected "high", or "medium" would already be lower than PS4 quality, even though some trivial textures would look better. On PC the texture pools weren't well organized, and the developers only focused on making the high setting look good. The result was that on PC you either maxed textures or the game looked worse than on a PS4. No middle ground.
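To make the texture-pool idea concrete, here is a minimal sketch of what priority-based texture budgeting could look like. This is purely illustrative: the function, the importance scores, and the quarter-size low-res fallback are my assumptions, not anything the Titanfall or Unity developers have described.

```python
# Hypothetical sketch: spend a texture memory budget on the most
# visually important textures first; everything else falls back to
# a low-res version. All names and numbers here are made up.

def assign_texture_quality(textures, budget_mb):
    """Return a dict mapping texture name -> "high" or "low"."""
    # Rank by importance: big, frequently on-screen textures first.
    ranked = sorted(textures, key=lambda t: t["importance"], reverse=True)
    plan = {}
    used = 0.0
    for tex in ranked:
        high_cost = tex["size_mb"]       # cost of the full-res mip chain
        low_cost = high_cost / 4         # quarter-size fallback (assumption)
        if used + high_cost <= budget_mb:
            plan[tex["name"]] = "high"
            used += high_cost
        else:
            plan[tex["name"]] = "low"
            used += low_cost
    return plan

textures = [
    {"name": "hero_weapon", "size_mb": 64,  "importance": 0.95},
    {"name": "terrain",     "size_mb": 128, "importance": 0.90},
    {"name": "vent_grate",  "size_mb": 32,  "importance": 0.10},
    {"name": "ceiling",     "size_mb": 48,  "importance": 0.20},
]
print(assign_texture_quality(textures, budget_mb=200))
# The prominent weapon and terrain get "high"; the rarely-seen
# grate and ceiling get "low", even though they'd fit a naive
# "everything medium" budget.
```

The point of the example is that a single console budget lets developers handpick this split once, while a PC "texture quality" slider just swaps whole pools wholesale.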
Here we come to a conclusion about one aspect I've never seen expressed: games on consoles, because of the single hardware target, are VERY FINELY TUNED FOR ART DIRECTION. It means there are devs whose whole job is to match performance concessions to what looks best. It means there are professionals who spend days on this fine-tuning of details, cutting the corners that matter least. On PC you can replicate some of this through manual settings, but it's a far cry from tuning the code directly, and you can never match the time and care spent by devs paid for the job.
This directly leads, on PC, to the impulse to push everything to the max. It's natural.
And now maybe a personal thing: when I look at PC footage, compared to console footage, it gives me a feeling that things aren't quite "right". It was hard to pinpoint why, but I eventually realized it's because of the animations. Moving to 60 fps footage, the added smoothness has the incidental effect of making the same animations look more robotic and stiff. The same happens with textures: when you increase the resolution to 4K, the much higher definition simply brings out flaws that you wouldn't otherwise notice. Basically the PC, with its added clarity, enhances the problems too, because the game wasn't originally made for that definition. And ultimately it looks "off", weird.
The exact same animation that looks perfectly fine at 30 fps becomes extremely unnatural and robotic at 60, sticking out from the rest of the environment like a sore thumb. The same goes for facial expressions and texture quality. When you see the console footage, the game looks like a marvel because it all blends together naturally. When you see the PC footage, you have this surgical sight that suddenly emphasizes all the things that aren't quite right, and it all appears more glitchy and rough.