Training Facial Recognition on Some New Furry Friends: Bears

Ed Miller and Mary Nguyen are Silicon Valley software developers by day, but by night they work on an unusually fuzzy problem.

A few years ago the pair became mesmerized, like many of us, by an Alaskan webcam broadcasting brown bears from Katmai National Park. They also happened to be seeking a project to hone their machine learning expertise.

“We thought, machine learning is really great at identifying people, what could it do for bears?” Mr. Miller said. Could artificial intelligence used for face recognition be harnessed to discern one bear face from another?

At Knight Inlet in British Columbia, Canada, Melanie Clapham was pondering the same question. Dr. Clapham, a postdoctoral researcher at the University of Victoria working with Chris Darimont of the Raincoast Conservation Foundation, was keen to explore face recognition technology as an aid to her grizzly bear studies. But her expertise was bear biology, not A.I.

Fortuitously, the four found a match on Wildlabs.net, an online broker of collaborations between technologists and conservationists. Combining their skill sets, Mr. Miller and Ms. Nguyen volunteered spare time over several years for this passion project that would eventually bear fruit, reporting the results of their experiment last week in the journal Ecology and Evolution. The project they produced, BearID, could help conservationists monitor the health of bear populations in various parts of the world, and perhaps aid work with other animals, too.

They got started by looking for other animals that had gotten the deep learning treatment.

“In typical engineering fashion, we’re always looking for a shortcut,” Mr. Miller said.

They discovered “dog hipsterizer,” a program that found the faces, eyes and noses of dogs in photos and placed rimmed glasses and mustaches on them. “That was where we started,” Ms. Nguyen said.

Although trained on dogs, dog hipsterizer worked reasonably well on the similarly shaped faces of bears, giving them a programming head start. Nevertheless, Ms. Nguyen said, the work’s initial stages were tedious. Creating a training data set for the deep learning program involved examining over 4,000 photos with bears in them and then manually highlighting each bear’s eyes, nose and ears by drawing boxes around them so the program could learn to find these features.
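For a rough sense of what that manual labeling produces, here is a minimal sketch of how one annotated photo might be represented in code. The field names, box format and numbers are illustrative assumptions, not the actual BearID annotation files.

```python
# A minimal sketch of one manually labeled training example.
# The schema below is an illustrative assumption, not BearID's real format.
from dataclasses import dataclass

@dataclass
class Box:
    left: int
    top: int
    width: int
    height: int

@dataclass
class LabeledBearPhoto:
    filename: str
    bear_id: str        # the known bear's name, if identified
    face: Box           # box around the whole face
    left_eye: Box
    right_eye: Box
    nose: Box

example = LabeledBearPhoto(
    filename="brooks_river_0421.jpg",   # hypothetical file name
    bear_id="Lucky",
    face=Box(310, 190, 260, 240),
    left_eye=Box(355, 250, 30, 22),
    right_eye=Box(470, 248, 30, 22),
    nose=Box(405, 320, 55, 40),
)
print(example.bear_id, example.face)
```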

The system also had to overcome a challenge posed by brown bears’ physical appearance.

To monitor populations, “we have to be able to recognize individuals,” said Dr. Clapham. But bears don’t have any feature comparable to a fingerprint, such as a zebra’s stripes or a giraffe’s spots.

[Image: The BearID software identified bears at an accuracy rate of 84 percent. Credit: Melanie Clapham]

From 4,675 fully labeled bear faces on DSLR photographs, taken from research and bear-viewing sites at Brooks River, Alaska, and Knight Inlet, they randomly split images into training and testing data sets. Once trained on 3,740 bear faces, deep learning went to work “unsupervised,” Dr. Clapham said, to see how well it could spot differences between known bears from 935 photographs.
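As a loose illustration of that random split, the few lines of Python below shuffle a list of labeled photos and carve it into the two sets. The 4,675, 3,740 and 935 figures come from the article; the file names and random seed are made-up placeholders.

```python
# A rough illustration of the random train/test split described above.
# Only the counts come from the article; everything else is assumed.
import random

def split_dataset(photos, n_train=3740, seed=42):
    shuffled = photos[:]                      # copy so the original order is preserved
    random.Random(seed).shuffle(shuffled)     # reproducible random shuffle
    return shuffled[:n_train], shuffled[n_train:]

all_photos = [f"bear_face_{i:04d}.jpg" for i in range(4675)]
train_set, test_set = split_dataset(all_photos)
print(len(train_set), "training faces,", len(test_set), "test faces")
```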

First, the deep learning algorithm finds the bear face using distinctive landmarks like eyes, nose tip, ears and forehead top. Then the app rotates the face to extract, encode and classify facial features.
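The article does not publish the underlying code, but the detect, align, encode and classify flow it describes could be sketched roughly as follows. Every function name, the dummy landmark coordinates and the nearest-neighbor matching step are assumptions for illustration, not the actual BearID implementation.

```python
# A high-level sketch of the detect -> align -> encode -> classify flow
# described in the article. All names and values here are illustrative.
import math

def find_face_landmarks(image):
    # A trained detector would locate eyes, nose tip, ears and forehead top;
    # here we simply return fixed dummy points.
    return {"left_eye": (355, 250), "right_eye": (470, 248), "nose": (405, 320)}

def align_face(image, landmarks):
    # Rotate the face so the eyes sit on a horizontal line before encoding.
    (x1, y1), (x2, y2) = landmarks["left_eye"], landmarks["right_eye"]
    angle = math.degrees(math.atan2(y2 - y1, x2 - x1))
    return image, angle                       # a real pipeline would warp the pixels

def encode_face(aligned_image):
    # A neural network maps the aligned face to a fixed-length feature vector.
    return [0.12, -0.53, 0.88]                # placeholder embedding

def classify(embedding, known_bears):
    # Match against embeddings of known bears, e.g. by nearest neighbor.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(known_bears, key=lambda name: dist(embedding, known_bears[name]))

known = {"Lucky": [0.10, -0.50, 0.90], "Toffee": [-0.70, 0.20, 0.05]}
image = None                                  # stand-in for a loaded photo
aligned, _ = align_face(image, find_face_landmarks(image))
print(classify(encode_face(aligned), known))  # -> "Lucky"
```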

The system identified bears at an accuracy rate of 84 percent, correctly distinguishing between known bears such as Lucky, Toffee, Flora and Steve.

But how does it actually tell those bears apart? Before the era of deep learning, “we tried to imagine how humans perceive faces and how we distinguish individuals,” said Alexander Loos, a research engineer at the Fraunhofer Institute for Digital Media Technology, in Germany, who was not involved in the study but has collaborated with Dr. Clapham in the past. Programmers would manually input face descriptors into a computer.

But with deep learning, programmers input the images into a neural network that figures out how best to identify individuals. “The network itself extracts the features,” Dr. Loos said, which is a huge advantage.

He also cautioned that “it’s basically a black box. You don’t know what it’s doing,” and that if the data set being examined is unintentionally biased, certain errors can emerge.

For instance, if some bears are photographed more often in light than in dark conditions, the lighting difference can cause misclassification of the bears. (Data bias can be a problem in human facial recognition by A.I., with misidentifications known to be more likely for people of color).

Whatever BearID is really doing, Dr. Clapham, who recognizes many Knight Inlet bears by sight, was surprised and encouraged by where the program fell short.

[Video: The neural network figures out its own ways of telling these two bears apart. Credit: Melanie Clapham]

“The bears that I confused, the network confused as well,” she said, suggesting that the app behaves similarly to the neural network in her brain. However, this first release of BearID is just the start. She hopes the open-source application will become more accurate with more inputs, use and time.

The app is of great interest to the Knight Inlet Lodge in Glendale Cove, which has run grizzly bear tours for decades, and its current owners, the Nanwakolas member First Nations of Canada.

“Fifteen years ago when we started doing land use planning, there was just one provincial bear health expert for the whole province,” said Kikaxklalagee / Dallas Smith, the president of Nanwakolas Council and a member of the Tlowitsis Nation. That hampered the Nations’ understanding of the health of bears on their territory. He said he felt excited that this “Jason Bourne-ish” technology would allow for more informed stewardship of bears. “We’re trying to make it a sustainable, limited footprint operation.”

And BearID may not stop with North American bears, as Dr. Clapham is already in conversation with others keen to use it for species like sloth bears, sun bears and Asiatic bears, as well as wolves.

“What we’d love is that one day we have somewhere where people can upload camera trap images and the system tells you not only what species you’ve seen, but also what individual you’ve seen,” and maybe its sex and age as well, she said.
