07-23-15

Why Google’s Deep Dream A.I. Hallucinates In Dog Faces

It turns out Google’s neural network is obsessed with canines for a reason.

By John Brownlee

We’ve all been getting a kick out of what artists and developers have been doing with Google’s Deep Dream, the neural-net-powered hallucination AI. Now you can play with it yourself, thanks to the Dreamscope web app. Just upload an image, pick one of 16 different filters, and turn it into a hallucinogenic nightmare of infinitely repeating dog eyes.

Which probably has you wondering: what’s up with all those dog eyes, anyway? Why does every image Deep Dream coughs up look, to a greater or lesser extent, like Seth Brundle from The Fly crammed his teleport pod full of canines and flipped the switch? As it turns out, there’s a pretty simple answer.

As you may know, Google’s Deep Dream runs on the same type of neural network that powers Google Photos’ ability to identify images by their content. Essentially, the network emulates the neurons in the human brain, with a node of the network ‘firing’ every time it sees a part of an image it thinks it recognizes. Deep Dream’s trippy effects come from giving the network an initial image, then initiating a feedback loop, so that it tries to recognize what it recognizes in what it recognizes. It’s the equivalent of asking Deep Dream to draw a picture of what it thinks a cloud looks like, then draw a picture of what its picture looks like, ad infinitum.
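That feedback loop can be sketched in a few lines of code. The snippet below is only a toy illustration, not Google’s actual implementation (which runs gradient ascent through a deep convolutional network): here a single hand-made “neuron” detects one small pixel pattern, and each iteration nudges the image’s pixels so the neuron fires harder, amplifying whatever the neuron faintly recognizes.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "neuron": one 3x3 pattern the network has learned to fire on
# (a stand-in for the dog-face detectors in the real model).
pattern = np.array([[0., 1., 0.],
                    [1., 2., 1.],
                    [0., 1., 0.]])

def activation(image):
    """Slide the pattern over the image; high values where the image
    resembles the pattern (a naive cross-correlation)."""
    h, w = image.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(image[i:i + 3, j:j + 3] * pattern)
    return out

def dream_step(image, lr=0.1):
    """One feedback iteration: nudge each pixel uphill on the neuron's
    (squared) activation, so regions the neuron already half-recognizes
    get pushed further toward the pattern."""
    act = activation(image)
    grad = np.zeros_like(image)
    h, w = image.shape
    for i in range(h - 2):
        for j in range(w - 2):
            # Gradient of 0.5 * act[i,j]**2 w.r.t. the pixels under it.
            grad[i:i + 3, j:j + 3] += act[i, j] * pattern
    return image + lr * grad / (np.abs(grad).max() + 1e-8)

image = rng.random((16, 16))
before = (activation(image) ** 2).sum()
for _ in range(50):
    image = dream_step(image)
after = (activation(image) ** 2).sum()
# After the loop, the neuron fires far more strongly: the image has been
# warped toward what the "network" was trained to see.
```

Run enough iterations on a real multi-layer network and every patch of the image drifts toward the features its neurons were trained on, which is exactly why Deep Dream’s output is wall-to-wall dog faces.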

So where do the dogs come in? This Reddit thread provides some insight. A neural network’s ability to recognize what’s in an image comes from being trained on an initial data set. In Deep Dream’s case, that data set comes from ImageNet, a database of 14 million human-labeled images created by researchers at Stanford and Princeton. But Google didn’t use the whole database. Instead, it used a smaller subset of ImageNet released in 2012 for use in a contest… a subset that contained a "fine-grained classification of 120 dog sub-classes."

In other words, Google’s Deep Dream sees dog faces everywhere because it was literally trained to see dog faces everywhere. Just imagine how much the Internet would be freaking out about Deep Dream right now if it had been trained on a database that included a fine-grained classification of LOLCATS instead.

[via Rhizome]


About the author

John Brownlee is a design writer who lives in Somerville, Massachusetts. You can email him at [email protected].


