You’ve probably heard of an AI technique known as “style transfer” — or, if you haven’t heard of it, you’ve seen it. The process uses neural networks to apply the look and feel of one image to another, and appears in apps like Prisma and Facebook. These style transfers, however, are stylistic, not photorealistic. They look good because they look like they’ve been painted. Now a group of researchers from Cornell University and Adobe has augmented style transfer so that it can transfer the look of one photo onto another — while the output still looks like a photo. The results are impressive.
The researchers’ work is outlined in a paper called “Deep Photo Style Transfer.” Essentially, they’ve taken the methods of the original style transfer and added another layer of neural networks to the process — a layer that ensures the details of the original image are preserved.
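To make the idea concrete, here is a minimal sketch in Python of how such an objective could be composed. The paper doesn’t publish this code; the function names, weights, and feature arrays below are hypothetical stand-ins, and the new photorealism term is simply passed in as a number rather than implemented here.

```python
import numpy as np

def content_loss(out_feat, content_feat):
    # Penalize deviation from the content image's network features.
    return np.mean((out_feat - content_feat) ** 2)

def gram(feat):
    # Gram matrix: channel-by-channel feature correlations, the
    # standard "style" statistic from the original style transfer.
    c = feat.reshape(feat.shape[0], -1)
    return c @ c.T / c.shape[1]

def style_loss(out_feat, style_feat):
    # Match the style image's feature correlations.
    return np.mean((gram(out_feat) - gram(style_feat)) ** 2)

def total_loss(out_feat, content_feat, style_feat, photo_reg,
               alpha=1.0, beta=1e3, lam=1e4):
    # Classic style transfer optimizes the first two terms; the new
    # work adds a photorealism penalty (photo_reg) on top. All
    # weights here are illustrative, not the paper's values.
    return (alpha * content_loss(out_feat, content_feat)
            + beta * style_loss(out_feat, style_feat)
            + lam * photo_reg)
```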
From left to right: the original image, the reference image, and the output.
“People are very forgiving when they see [style transfer images] in these painterly styles,” Cornell professor Kavita Bala, a co-author of the study, tells The Verge. “But with real photos there’s a stronger expectation of what we want it to look like, and that’s why it becomes an interesting challenge.”
The added neural network layer pays close attention to what Bala calls “local affine patches.” There’s no quick way to translate this phrase accurately, but it basically refers to the various edges within the image, whether that’s the border between a tree and a lake, or between a building and the sky. While style transfer tends to play fast and loose with these edges, shifting them back and forth as it pleases, photo style transfer preserves them.
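One way to picture a “locally affine” constraint is as a patch-by-patch test: within each small window, the output colors should be reachable from the input colors by a single linear color map plus an offset. The toy penalty below illustrates that idea; it is not the paper’s actual regularizer (which the authors build from a matting Laplacian), and the patch size and function name are invented for the example.

```python
import numpy as np

def local_affine_penalty(inp, out, patch=3):
    """Toy stand-in for a photorealism regularizer.

    For each small patch, fit the affine color map (RGB -> RGB plus
    offset) that best carries the input patch to the output patch,
    then penalize whatever residual the fit cannot absorb.
    Both images are float arrays of shape (H, W, 3).
    """
    h, w, _ = inp.shape
    penalty = 0.0
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            a = inp[y:y + patch, x:x + patch].reshape(-1, 3)
            b = out[y:y + patch, x:x + patch].reshape(-1, 3)
            # Append a constant column so the fit includes an offset.
            a1 = np.hstack([a, np.ones((a.shape[0], 1))])
            coef, *_ = np.linalg.lstsq(a1, b, rcond=None)
            penalty += np.sum((a1 @ coef - b) ** 2)
    return penalty
```

Under a penalty like this, uniformly recoloring a patch costs almost nothing, since one affine map explains the whole change — but blurring or shifting an edge leaves a residual no single map can absorb. That is, roughly, why the constraint keeps borders between objects crisp.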