felipepepe is right, but consider the scope of the problem.
Books, or the written word, haven't changed significantly since the invention of the printing press. With some basic education it's even possible to read a play by Shakespeare and not only know what's going on, but appreciate it deeply.
In film you have three distinct eras:
- Silent films
- "Talkies"
- Color
Very few silent films are appreciated today; they are cataloged and listed as 'important', but go largely unseen. You won't find them on your cable station. In the case of B&W films with sound, certain classics like Citizen Kane, Casablanca and Seven Samurai (timeless stories married to an accessible interface) remain appreciated, but only the very best of that era are shown on cable. The introduction of color was the final innovation that cemented what we know and accept as film; color films make up some 95% of what you'll see on television today. The lesson here is that as technology advances, people have less and less patience for the older interfaces (silent films, or black and white with sound). The film has to be exceptional for the viewer to forgive the aged interface.
Computer games suffer the same problem, except much more so. They are terribly complex programs/interfaces that span many dimensions of evolution, accessibility and trends, such as:
- Interface (command line, 2D, introduction of the mouse, isometric, side scrollers, 1st person 3D, 3D Avatars, etc.)
- Controller (joystick, gamepad, lightgun, foot pedals, driving wheels, mouse, keyboard, etc.)
- Platform (arcade games, early consoles, pre-Microsoft PCs, Microsoft PCs, 2nd-gen consoles, heavy GPU-based games, next-gen consoles)
- Type (fighting games, side scrollers, simulators, RPG, FPS, TBS, RTS, etc.)
- Setting (Fantasy, Modern, Mystery, Horror, Science Fiction, War, etc.)
Complicating matters further, there's the issue of games that were bad as a whole but introduced one great technical innovation (such as allowing the user to map their own keys). Should that bad game be remembered, or just the good feature it introduced? Aren't they two different things? Aren't there hundreds of such examples in gaming history?
Ultimately a 'good' game offers immersion, atmosphere, and compelling gameplay.
Unfortunately, there is no uniform interface for games across their time of inception (as there is with books, films, etc.). People know how to read a book and they know how to watch a film, but they have to learn how to interact with each and every new game they attempt to play. Some are very easy (Pong); some are very difficult (NetHack, Dwarf Fortress, System Shock, Gothic). So difficult, in fact, that most modern gamers will be so frustrated by the interface that they quit after 5 minutes and turn to something else. The game may be a classic, but if the interface renders it inaccessible, it may as well not exist (for 99% of gamers).
And let's face it, to some extent we are all graphics whores. It's just a question of where you personally draw the line, and everyone's threshold is different.
Certain interface decisions, platform availability, the hardware or OS needed to run a game, and better-looking iterations of an original but graphically outdated concept all muddy the waters incredibly.
Thus, trying to compare 'computer games', or even just cRPGs, is comparing apples to oranges to grapefruit to pine trees. It just doesn't make sense. Film, literature, music, visual art, sculpture and other classical art forms don't have this problem because there is a relatively universal interface of accessibility. A painting is framed and viewed, music is heard, sculpture is regarded from various angles. How are we to objectively compare computer games? We can't even agree on what an RPG is.