
Frame rate and resolution: Does it really matter?

Article by Ryan Maskell November 23, 2014

It may not have been widely covered by the major news outlets, but there has recently been widespread controversy in the gaming community. Aside from the accusations of misogyny and of a broken journalistic system brought about by GamerGate and Zoe Quinn, pressure has mounted on companies developing for the next generation of consoles, some of which have been seen as deliberately settling for sub-standard resolutions and degraded frame rates.

The debate arose mainly because of a comment about Assassin’s Creed: Unity made by Vincent Pontbriand, a senior producer at Ubisoft. Pontbriand confirmed that the game will run at thirty frames per second and a resolution of 900p across both next-gen platforms, to ‘avoid all the debates and stuff’. This immediately raised concerns, as it implied that the game was being held back to match the Xbox One, which is rumoured to have less capable hardware than the PS4. Ubisoft have since retracted the statement, but the question still stands: is 900p at thirty frames per second good enough?

It seems that many developers now think so, with Assassin’s Creed: Unity being far from the only game running at these specifications. Recent examples include Dragon Age: Inquisition, Ryse: Son of Rome, and Watch Dogs.

‘We made the right decision to focus our resources on delivering the best gameplay experience, and resolution is just one factor,’ Pontbriand said. ‘Those additional pixels could only come at a cost to the gameplay.’ The developers of Ryse: Son of Rome have taken a similar line, stating that thirty frames per second gives a more cinematic effect, akin to a movie, and citing the mixed reaction to The Hobbit’s 48 frames per second as evidence that higher does not always mean better.
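
To put those ‘additional pixels’ into rough numbers: a 900p frame is 1600 by 900, or about 1.44 million pixels, while a 1080p frame is 1920 by 1080, or roughly 2.07 million. Rendering at 1080p therefore means drawing around 44 per cent more pixels every single frame, and that extra work has to come from somewhere.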

Many gamers push back on this point, arguing that any comparison to films is invalid simply because of the way the content is produced. When a film is recorded, each frame captures natural motion blur, which smooths the transition from one frame to the next and removes the jerkiness that might otherwise appear. Motion blur of this kind cannot be effectively imitated in video games; in fact, the fewer frames there are, the more jerky the motion appears. Games are also interactive, requiring split-second decisions that can mean the difference between life and death, and players become frustrated when they cannot tell what is going on. That many games ran below 1080p on the Xbox 360 and PlayStation 3 was accepted because the hardware was many years old, but with the release of the next generation of consoles, many gamers have begun to expect a higher standard.

It’s easy to understand where they are coming from. There is, it is argued, a significant difference between thirty frames per second and sixty, with studies showing that a lower frame rate on consoles increases input lag from the controller. Gamers also point out that sixty frames per second means consistently smoother gameplay and a better overall experience, and it was one of the main selling points developers were pushing at last year’s E3, the industry’s biggest gaming convention. With that on every gamer’s mind, it’s no wonder there is bad blood between them and companies such as Ubisoft, which they see as offering an inferior product. Watch Dogs is a case in point: it was graphically downgraded across all platforms after its E3 demo, only for PC players to find that the showcased features were still present and could be enabled on desktop computers with no problems.
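
The arithmetic behind the input-lag claim is simple enough. At sixty frames per second a new frame is drawn roughly every 16.7 milliseconds; at thirty, every 33.3 milliseconds. In the worst case, then, a button press can wait around twice as long before its effect even begins to appear on screen, and that is before any display or engine latency is added on top.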

Ultimately, there is always a trade-off between smoother gameplay and sharper visuals or other features, and it seems that developers have been consistently choosing the latter. Whether it really matters, though, depends on the situation. No single argument can cover every game and platform; the case has to be made for each title individually, taking into account its genre and how well its limitations are worked around. The more relevant question, perhaps, is why the compromise needs to be made at all. With brand-new consoles already struggling to cope with developers’ visions, it is hard to imagine what things will look like a couple of years down the line. And that opens a new discussion: are the developers to blame, or is it ‘next-gen’ hardware that is simply not good enough?

 
