Graphics and artistry have changed how we see video games
By Brittney MacDonald, Contributor
We’ve all seen them in movies: Flynn’s in Tron, Litwak’s Family Fun Center in Wreck-It Ralph. I’m of course talking about those ancient arenas that are all too scarce today—arcades! This is where video games first began. Long before Mario and Zelda, there were Gauntlet and Centipede; before he teamed up with Diddy Kong, Donkey Kong was kidnapping Pauline and throwing barrels down ramps like a jerk. It’s hard to imagine a time when you had to leave your house just to challenge your friends to a race in the newest Forza, or a time when a single video game was the size of your refrigerator. What’s also hard to imagine is a time when adults weren’t allowed to play video games at all: some video arcades had a policy of not even letting adults onto the arcade floor. Instead, they were relegated to the very simplistic console games available at the time, such as Pong and Death Race—if they could bear the shame of playing with something meant for children.
With games such as God of War, Dante’s Inferno, Dead Space, and Catherine so clearly meant for a more mature audience than your average 13-year-old, I can’t help but wonder what changed in video games over the years to make them acceptable for adults. Not only that, but I wonder what made us a marketable demographic for video games at all.
There are a thousand differences between the games of the past and those of today, but the most obvious one, and the one that trumps all others, is how integral artistry and creativity have become to our enjoyment of modern video games—to the point where most gamers aren’t even aware that they’re there; they just know how horrible it is when they’re not.
No, I’m not talking about games like Flower or Journey, where the whole point of the game is to be a peaceful, hippie love fest. Modern gamers expect their games to be works of art, from the music to the graphics; so much so that game developers are now using technology meant for movies to create a smoother, more realistic experience for their fans.
Games like Beyond: Two Souls and Call of Duty: Ghosts are employing motion capture technology originally meant to produce dangerous or impossible action sequences in film. The same technology that was used to create the Na’vi in James Cameron’s blockbuster Avatar is now being used to make sure that we gamers can’t complain about poor movement animation. The days of the wire-frame model are ending, my friends; instead, we have real actors running obstacle courses so their movements can be scanned and put into our gameplay.
Ideas like these are revolutionizing the video game industry. Although kids today expect flawless graphics and immersive gameplay, it’s very hard for many of us to imagine anyone under the age of 12 appreciating the effort today’s developers are putting in to ensure their game’s a success. In other words, the force driving the industry to be better isn’t the pre-teen market like in the days of old; instead, it’s the once-frowned-upon adult gamers.
I asked Ryan McGechaen, a game developer at Vancouver-based Relic Entertainment, how important artistry and creativity are when developing a game.
“Absolutely important; I really believe that video games are much closer to art than they were even five years ago. If you look at a lot of the indie games that have been coming out, more and more of them are focussed on artistic expression rather than making money.”
When asked how advances in graphic technology affect how Relic approaches a new project, McGechaen replied, “We can do a lot more than we have in past projects. Relic in particular has always been focussed on hi-fidelity experiences—through highly detailed models, effects, and even technology such as our terrain deformation. A lot of this technology feeds back into our game design; for example, when we decided to build Company of Heroes 2, we knew we wanted it to feature the harsh winter of the WWII Eastern Front. Thanks to advances in DirectX [a technology that powers the graphics in many games], we were able to take that a step further, giving us snow that deforms when soldiers move through it, leaving tracks, or ice that can be destroyed and re-freeze.”
Let’s get historical for a second: knowing that many games today are marketed more to adults is one thing, but what encouraged developers to target an older demographic in the first place? The Sonic the Hedgehog I played as a kid certainly wasn’t designed to impress my mother, though I will admit she developed a pretty bad Ecco the Dolphin habit. Long before the days of the Internet, there was the Sega hotline to call for tips, and let’s just say that our phone bill was strangely high for the couple of months it took her to finally beat that game.
As technology changed and many arcade games became console games as well, with the invention of the Super Nintendo and Sega Genesis, adults were suddenly able to try out new games and relive their arcade days without receiving strange looks from frightened parents who assumed they were kidnappers. Plus, kids who had been raised on arcade controls needed time to adjust to a D-pad, which left an opening for adults to learn as well, all in the privacy of their own homes.
It wasn’t long before developers became aware that they were catering to an older market, and saw a chance to reach an even greater demographic pool than they had in the past. This influx of older gamers changed the direction some developers took: they wanted to engage the mature market on a level that set it apart from the younger demographics. The subject matter of their games consequently changed, as did the difficulty of the controls and game maps.
Targeting the older market also brought in all the remaining market pools: teen, young adult, adult, and senior. As interest in video games grew and they became more easily accessible via consoles, handheld devices, and the Internet, no age demographic was left untapped. Nowadays, video games are treated more like movies than anything else: there is a target audience, and the complexity of the graphics, story, and gameplay is all based on who that audience is. Much as a violent or cerebral movie like Psycho may not appeal to a child, a game like Viva Piñata won’t appeal to an adult (unless you’re my mother). Meanwhile, games like Mass Effect, Borderlands, and Assassin’s Creed are more than capable of picking up the torch for older gamers who want a challenge, beautiful graphics, and an intriguing plot.
I suppose if I were to sum it all up, video games aren’t for kids anymore—but neither are they exclusively for adults, teens, or our grandparents. So much goes into video games that they’re now for everyone. The creativity, artistry, and desire to be better with each and every new project have developers working their butts off to satisfy a large market of people waiting to see the latest advancements and to be emotionally moved by the games they choose to invest their time in. Arcades are slowly becoming a thing of the past in North America, and with them goes the idea that video games are only child’s play.