The importance of graphics in video games is debatable. Some consider them unimportant, some say they are secondary, and some say they are what carries a video game. If the story and gameplay are good, the argument goes, games don’t need good graphics. But is that true?
First of all, let’s take a quick look at some video game history. We’ll be looking at consoles and mobile games, not computer or arcade games. Those are a category all their own because of the modular nature of computers.
The first widely spread home video games were pong consoles. These were systems that (mostly) had their games built in, usually a variation of a tennis game. They had really rudimentary graphics: two paddles and a ball (which was actually a square).
One particular console, the Magnavox Odyssey, solved the problem of graphics by providing overlays that were supposed to be placed on the TV screen. The overlays had graphics printed on them.
The next big thing was the Atari 2600 in 1977. It had simple graphics but was definitely a step up from the pong consoles.
In the 80s, something a lot of people call the bit wars started. The main competitors were Nintendo and Sega. Graphics were now measured in bits: the more bits a console had, the better its graphics. Companies heavily marketed the number of bits their consoles had and how the graphics looked.
The 32-bit era had the Amiga CD32, but it didn’t really make an impact. Most console companies just skipped to 64-bit.
EDIT: Sorry, I goofed. The PlayStation was 32-bit as well, so the 32-bit era DID in fact make a huge impact. It marked the birth of one of Nintendo’s greatest competitors, which pushed Nintendo to make a better system (and hence better graphics).
Then came the 64-bit era with the likes of the Nintendo 64 and Atari Jaguar (the Sega Dreamcast belongs to the generation after). Polygon graphics were now the norm, and games were making leaps toward “3D” instead of side-scrolling.
This is where the bit wars stopped. The number of bits wasn’t really important anymore as long as the graphics were good. With the PlayStation 2, GameCube, and Xbox, advanced computer graphics were possible: better textures, higher polygon counts, and higher frame rates.
Every subsequent generation of video games would just up the ante. But Nintendo took a different approach. We’ll get back to that.
My thoughts on game graphics in general
We see that with every subsequent generation the graphics got better. But if story and gameplay are the only things that matter, why would you ever need to build consoles that produce better graphics? And is that always a good thing to do?
Games like the pong consoles and the Atari 2600 really demanded a lot of imagination to play. A square could become a ball and a simple line could become a paddle. In a way that has a certain charm, and it also forces a whole different way of thinking compared to what we are used to today.
Later on, graphics became more representative and detailed, so gamers needed less imagination, but the games were visually more pleasant to look at.
Better graphics also aid the narrative. Imagination is one thing, but if a game developer wants to tell a good story, it’s imperative to provide visual aids. In the days of the Atari 2600 you needed to read the manual to find out more about the story. As graphics got better and better, the narrative and story progression could improve with them.
Another point is that certain games simply rely on immersion. There is no bigger story, and good gameplay wouldn’t really matter if the immersion weren’t there. Graphics and sound are key factors here. Shooters on the Xbox and PlayStation have relied heavily on this lately, and the Oculus Rift is also a good example.
Here we also come to a point I was hinting at earlier. Nintendo took a whole different turn when it comes to graphics. They started this with the GameCube but it became more apparent with the Nintendo Wii. The graphics, while good, were cartoony.
A good example of this was The Legend of Zelda: The Wind Waker. Upon release, fans were disappointed by the graphics, and critics weren’t impressed either. Cel-shaded graphics weren’t considered good enough because the PlayStation 2 and Xbox provided WAY better graphics.
Today we know that Nintendo made the right choice. Playing Wind Waker today is just as satisfying as it was in 2003. Why? Because cartoony cel-shaded graphics age better. They look pleasing and charming in any decade, and you don’t see the flaws you would in realistic graphics.
A game like GoldenEye 007 on the Nintendo 64 was the height of graphics back in 1997 but looks really dated today. That doesn’t take away from the gameplay, but it certainly doesn’t immerse the player the way it did back in the day.
That is also why mobile games and indie games are moving back to simulated 16-bit graphics and retro art styles. Older graphics simply age better because we don’t focus on the flaws as much, and simple cartoony sprites represent just enough.
Graphics aren’t just the world we build and the characters we put into that world. What about the user interface? If the user interface is a mess, it will put players off because it makes the game unplayable.
You can consider a start menu a UI, but in-game menus also fall into this category. They need to be to the point while remaining visually appealing, clear, and easy to navigate.
A good example of the difference a good UI can make is The Secret of Monkey Island, which was re-released with a better UI that was clearly easier to navigate.
But are the graphics REALLY that important?
Let’s make one thing clear: “good” graphics is a subjective term. I like the visuals of the first Super Mario Bros., while some may not because Super Mario Galaxy looks better. We are not going to define what good graphics are, just see whether they make a difference.
Well… this is a toughie. For games that need good graphics to immerse players (first-person shooters or various simulators), definitely yes. Games that rely on character customization also need pleasing graphics, because the player wants to see their character look as close as possible to what they imagine. Otherwise visual customization would be pointless.
There is no denying that a game cannot be saved by graphics alone; games NEED good gameplay. But for me… yes, graphics play an important role in video games. They immerse, narrate, and draw you in even before you press the first button on the controller or screen.
But what kind of graphics you choose for a given game depends on your target audience and on how you want the game to play. That is why you need a good graphics department that does its homework.
Even old Atari graphics have a place in gaming today. Retro is becoming modern again (I know that sounds redundant :P). So if you want to draw in a hipster crowd, use retro graphics. If you want your game to age well, 16-bit graphics or cel-shaded graphics are a good choice. Of course, it also all depends on hardware limitations and personal preference.
Mobile games today rely heavily on pleasing visuals. The first things we see in the app store are screenshots of the game. Would you honestly buy a game off the app store if it looks like a mess in the screenshots? I know I wouldn’t. Like it or not, we are visual creatures, and we place a lot of value on things that look good.
Games like Angry Birds have a huge market even outside the app itself. That really wouldn’t be possible if the characters were just simple squares instead of cute little birds and pigs.
Graphics aid the atmosphere in games; without them we could simply move back to text-based games on DOS. In my opinion, no matter what anyone says, good graphics are important for all video games. What counts as good graphics is debatable, but they are important. Games are a visual, interactive medium, and we need to treat them as such. I know it’s a very generic conclusion, but the point isn’t the conclusion so much as how I got to it, and that’s what this post is all about :).