The software shows just how far neural networks have come. In the past, apps like Prisma have used AI-powered filters to turn your photos into paintings that evoke masters like Van Gogh or Picasso. Both Facebook and Google also brought the so-called "style transfer" feature to their respective platforms. But NVIDIA's new tool goes a step further by creating lifelike pieces from the most basic of outlines (in other words, something from nothing).
There are essentially three tools in the GauGAN software: a paint bucket, pen and pencil. At the bottom of the screen are a bunch of labels like sky, water, rock, and sand. Select "water," draw with the pen, and the tool will turn your blue line into a cascading, photorealistic waterfall. The same goes for turning circles into clouds, or lumps into rocks or a cliffside.
"It's like a coloring book picture that describes where a tree is, where the sun is, where the sky is," said Bryan Catanzaro, vice president of applied deep learning research at NVIDIA. "And then the neural network is able to fill in all of the detail and texture, and the reflections, shadows and colors, based on what it has learned about real images."
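Under the hood, that "coloring book" is typically handed to a conditional generator as a one-hot label map: one channel per class, marking which pixels are sky, water, and so on. Here is a minimal sketch of that input format, assuming a toy label set; the names and helper function are illustrative, not NVIDIA's actual API:

```python
import numpy as np

# Hypothetical label set -- the real model recognizes many more classes.
LABELS = {"sky": 0, "water": 1, "rock": 2, "sand": 3}

def to_one_hot(label_map, num_classes):
    """Convert an (H, W) map of class indices into the (H, W, C)
    one-hot tensor a conditional generator would consume."""
    h, w = label_map.shape
    one_hot = np.zeros((h, w, num_classes), dtype=np.float32)
    for c in range(num_classes):
        one_hot[..., c] = (label_map == c)
    return one_hot

# A tiny 2x2 "doodle": sky on the top row, water below.
sketch = np.array([[0, 0], [1, 1]])
tensor = to_one_hot(sketch, len(LABELS))
```

The generator then learns to map each labeled region to plausible texture, lighting and reflections, which is why redrawing a region's label is enough to change what appears there.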
GauGAN's underlying AI is also intuitive enough to cast a reflection from a tree into the puddle of water below it. Swap a label from "grass" to "snow" and the entire image switches to a winter scene, even plucking the leaves from your tree's branches.
For now, NVIDIA is only showing off a demo that highlights the software's strengths. There's no mention of its approach to man-made objects like buildings and furniture, which, as The Verge notes, are much trickier for GANs to replicate. NVIDIA hopes GauGAN will eventually make it onto its AI Playground: a new website that opens up its image editing, styling, and photorealistic synthesis software demos to the masses. It envisions the tool being used by everyone from architects and urban planners to video game developers.