(Disclaimer: The following is an editorial based solely on my own experiences and feelings, and does not represent the opinions of anyone other than myself.)
Once activities are repeated enough and find their way into the routine of life, they become customs. Customs agreed upon by communities are then passed down through generations and become traditions. Traditions in turn find their way into the present day and are unquestioningly grandfathered in simply because “that’s just the way it is.” Buying a freshly cut Christmas tree. The Lambeau Leap. The expression that “boys will be boys.” All of these are examples of things that need to change as society evolves. (Seriously, people, it IS excessive celebration.) Another time-honored tradition that seems well intact is the in-game cinematic, or cutscene. Granted, it is a much more recent example than any of the previous ones, but technology evolves much faster than anything else, so it stands to reason it belongs in the same category.
While cutscenes did, for a time, serve an important purpose, whether that was to further the narrative, introduce players to a new environment, or simply show off impressive CG cinematics, they have since outlived their usefulness. With the advanced AI and graphics this and future generations of consoles provide, the cutscene now feels unnecessary, pulling gamers out of the action entirely. Such an abrupt detachment from our sense of immersion disrupts the flow of gameplay. As most gamers know, action-oriented games require a certain rhythm, a feel for the controls that one must acquire. The last thing a player wants is to be taken out of the action while they’re right in the zone, watch a cutscene featuring a powerful new foe or boss, then be thrust right back into combat having to deal with a more difficult challenge.
As previously mentioned, graphics and AI have improved greatly as video games surged through the ’90s and into today’s next-gen market. Where cutscenes were once needed to showcase amazing technological feats and animations, most games can now accomplish such things using the in-game engine. While it’s certainly not easy to build a blockbuster franchise like Uncharted or God of War, as technology improves it will become more affordable for companies to create giant set pieces that shift and evolve as stages progress. So, if a game can already handle such impressive spectacles during gameplay segments, why take the controller out of the player’s hands? Yes, the games referenced here have cutscenes, but with some clever transitions combined with the epic-in-scope levels they already provide, there is no longer a need for them.
Game companies are clearly struggling with this issue as well. Look at the introduction of Quick Time Events (QTEs) into many modern games. What started as an attempt to keep the player immersed in the experience has become a major annoyance to many gamers. Setting aside the fact that QTEs are often frustrating and poorly implemented, it’s more important to focus on how little sense they make from a gameplay standpoint. If you’re going to stop the action to show a cutscene, a gamer will automatically assume it’s time to break focus and subconsciously switch to a more passive level of concentration. Why then would anyone want to see button prompts flash across the screen, taking their eyes off of the actual action, while the character performs an amazing finisher or acrobatic stunt? If the point was to show the player something cool that they themselves can’t perform during gameplay, why direct their focus to the sides of the screen or, even worse, down at their controller? It’s disruptive to the experience, especially for new players unfamiliar with their console. This design choice undermines the entire point of a cutscene while simultaneously lending credibility to the idea that such events should take place within the flow of gameplay, leaving only one reason for the inclusion of cutscenes: focusing the narrative.
If a game needs a cutscene to convey story or drive the narrative in a certain direction, here’s a newsflash: the developers are doing something wrong. For proof, just turn to some Game of the Year winners and all-around fantastic franchises such as BioShock and The Elder Scrolls. These games have no need for cutscenes because the story is in-game, surrounding the player. It’s found in dialogue and character interactions, in books and audio tapes, as well as in interactive cinematics that leave the player in control of the character. No one would dispute that these games offer amazing, fleshed-out stories, and the real beauty here is that the entire thing unfolds in-game. This type of design and storytelling is also perfect because it allows the player to consume as much story and immerse themselves in as much of the experience as they choose to, but that’s a topic for another day.
It’s a safe bet that some will contest or outright protest the destruction of such a longstanding (in tech years) tradition, so perhaps there’s a fair compromise to be had. Instead of dropping cutscenes entirely, they could be better utilized as bookends at the beginning and end of a game. This eases the player into the game, gives them a proper setup to the story the way the developer intends, and then serves to give the game a definitive conclusion, again, as intended. Everything in between, however, should be as interactive and immersive as possible. After all, interaction is the one thing that separates this medium from all other popular forms of entertainment.
Agree? Disagree? Feel the sudden urge to punch a wall or pull out your hair? Before doing something destructive, share your comments below. The best way to make a difference is by opening up a dialogue and sharing your thoughts. Also, please note I’m simply trying to open up a discussion, and am not in any way intending to attack those who may enjoy cutscenes. I’m just looking to hear your thoughts while sharing a few of my own.