April 22, 2019
Gamers have discovered a way to use machine learning to improve the graphics of older games. Called “AI upscaling,” the technique uses an algorithm to take a low-resolution image and, based on training data, generate a version with more pixels. Although upping the resolution of images is not new, machine learning has improved both the speed and the quality of the end result. On the r/GameUpscale subreddit, which is moderated by Norwegian teacher and student Daniel Trolie, users share “tips and tricks” on the practice.
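For contrast with the ML approach, classical upscaling only interpolates the pixels already present; it adds no new detail. A minimal nearest-neighbor sketch in numpy (an illustrative baseline, not any modder's actual tool):

```python
import numpy as np

def upscale_nearest(img: np.ndarray, factor: int) -> np.ndarray:
    """Upscale a 2-D image by repeating each pixel `factor` times
    along both axes (classical nearest-neighbor interpolation)."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

# A tiny 2x2 "image": each pixel becomes a 2x2 block in the output.
img = np.array([[0, 255],
                [128, 64]], dtype=np.uint8)
big = upscale_nearest(img, 2)
print(big.shape)  # (4, 4)
```

The result is sharper-looking but blocky, which is exactly the shortcoming learned upscalers try to overcome by hallucinating plausible detail from their training data.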
The Verge reports, “browsing these forums, it’s apparent that the modding [modifying] process is … a job for skilled craftspeople, requiring patience and knowledge … a labor of love, not a quick fix.” But because the AI-enabled process is much speedier than the work of a team, “there’s been an explosion of new graphics for old games over the past six months or so” including “Doom,” “Half-Life 2,” “Metroid Prime 2,” “Final Fantasy VII,” “Grand Theft Auto: Vice City” and “Mass Effect 2.”
A modder going by the alias “hidfan” revealed that his updated “Doom” visuals took at least 200 hours of work, spent tweaking the algorithm’s output and cleaning up the final images by hand.
In other words, “just because AI is involved, doesn’t mean human labor isn’t.” One task modders must do is edit out the noise created by the AI upscaling algorithms. Hidfan reported that he spends between five and 15 hours cleaning up a single monster.
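Modders do this cleanup by hand, but the kind of artifact they are chasing — isolated noisy pixels left by the upscaler — is also what simple automated filters target. A minimal 3x3 median-filter sketch in numpy, purely as an illustration of the idea (not any tool the article describes):

```python
import numpy as np

def median_denoise(img: np.ndarray) -> np.ndarray:
    """Suppress isolated noisy pixels with a 3x3 median filter.
    Edges are handled by padding with the border values."""
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    # Stack the 9 shifted views of the image and take the per-pixel median.
    windows = np.stack([padded[i:i + h, j:j + w]
                        for i in range(3) for j in range(3)])
    return np.median(windows, axis=0).astype(img.dtype)

# A flat gray patch with one bright speck of "upscaler noise".
img = np.full((5, 5), 100, dtype=np.uint8)
img[2, 2] = 255
clean = median_denoise(img)
print(clean[2, 2])  # the speck is replaced by the local median, 100
```

A blanket filter like this also blurs legitimate fine detail, which is one reason the hand-cleanup the modders describe remains so time-consuming.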
At Topaz Labs, a startup offering upscaling software, chief technology officer Albert Yang said the process begins with a GAN (generative adversarial network) that is trained on “millions” of low-res/high-res image pairs. Once trained, the algorithm attempts its own low-res-to-high-res conversions, comparing its output against the training examples. In the process, it creates its own rules for upscaling. Yang noted that this “one-size-fits-all approach … produces mixed results.”
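The low-res/high-res pairs Yang describes are typically manufactured by degrading high-res images, so the network can learn the inverse mapping. A minimal sketch of pair generation in numpy, assuming simple block-average downsampling as the degradation (the actual pipeline Topaz uses is not described in the article):

```python
import numpy as np

def downscale_block_mean(img: np.ndarray, factor: int) -> np.ndarray:
    """Downscale a 2-D image by averaging non-overlapping
    factor x factor blocks -- a simple stand-in for the degradation
    used to manufacture low-res/high-res training pairs."""
    h, w = img.shape
    assert h % factor == 0 and w % factor == 0
    return (img.reshape(h // factor, factor, w // factor, factor)
               .mean(axis=(1, 3)))

def make_training_pair(high_res: np.ndarray, factor: int = 4):
    """Return a (low_res, high_res) pair for supervised training."""
    return downscale_block_mean(high_res, factor), high_res

rng = np.random.default_rng(0)
hi = rng.integers(0, 256, size=(64, 64)).astype(np.float32)
lo, hi_target = make_training_pair(hi, factor=4)
print(lo.shape, hi_target.shape)  # (16, 16) (64, 64)
```

In a GAN setup, the generator is then trained to map `lo` back toward `hi_target` while a discriminator judges whether its outputs look like real high-res images — the adversarial pressure that makes the network invent its own upscaling rules.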
The Verge also says that upscaling images is “often about salvaging memories,” but the results can be jarring. Stefan Rumen, a modder who upscaled “Final Fantasy VII,” argued that “new display technology is as much to blame for this as outdated graphics.” “With the pixel/low polygon graphics of yesteryear, the old TV monitors helped gloss over many imperfections,” he said. “Your mind finished the job and filled in the gaps [but] modern displays show these old games in their un-filtered roughness.”
But, in another way, “these early games are also the perfect target for AI upscaling … partly because of their extensive use of pre-rendered backgrounds, which means modders have to process fewer images.” “They’re not as low-res as pixel art, meaning there’s more information for the machine learning to do its magic, but it’s not a too high resolution that an upscale wouldn’t be needed,” he said.