My eyes are probably just terrible, but I've been struggling to hit good framerates at 4K in games (which sucks, since 4K is the entire reason I switched to PC gaming), so I did some comparisons between 1440p and 4K. I took screenshots and compared them directly, and... in some games I don't see any difference. If I zoom into the screenshots I can, of course, but viewed normally at full size on screen, I can't.

For example, I compared Tomb Raider (2013): on Ultimate quality, 1440p and 4K look identical to me, yet 1440p runs at 60fps while 4K runs at 30fps. In some other games, though, even with antialiasing enabled, I notice some nasty jaggies at 1440p that disappear at 4K. And I've also run into games that have nasty aliasing even at 4K with the same antialiasing solution.
Is this a game engine thing?