Swiss National Park, Switzerland
Markus Marks, University of Zurich / ETH Zurich / Caltech
Luke Franzke, Zurich University of the Arts (ZHdK)
Laurens Bohlen, University of Zurich
Ecologists who study animal behavior often spend weeks or months watching and sorting camera trap footage. That’s why neurotechnologist Markus Marks and a research team at the Institute of Neuroinformatics (INI) at the University of Zurich and ETH Zurich have developed a machine learning application to automate this task (→Marks et al., 2022).
A LONG HISTORY
The question of whether computers can see and think like humans has long fascinated researchers. The first experiments in the field of computer vision started in the 1960s. By the 1980s, computers were able to recognize basic structures like corners and edges. In the 1990s, with the increasing affordability of digital cameras and the growing power of computers, computer vision as a research field began to flourish.
Since then, significant progress has been made in the field. Today, smartphones can recognize faces, self-driving cars can read traffic signs, and AI programs assist medical professionals in diagnosing skin cancer. Such applications are also in demand in ecology, enabling extensive and complex research projects that would be impossible without technological assistance.
For that reason, Markus Marks and his team have developed an application that can automatically identify individual animals, recognize their behavior, and sort video data accordingly. The application detects not only actions, such as whether one chimpanzee is grooming another, but also emotions and moods such as fear, aggression, or joy.

A QUICK SOLUTION
Conventional applications for recognizing animal behavior focus on specific features in video frames. A common approach is pose estimation, which infers the animal’s body posture. For this, the application must locate the same points (so-called landmarks), such as eyes or paws, in every frame. While the method works well, it is labor-intensive: for each new species, the landmarks have to be defined and annotated from scratch.
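To make this concrete: the text names no specific tool, but DeepLabCut is a widely used open-source pose-estimation toolbox, and a minimal sketch of the landmark workflow with it might look as follows (project name and file paths are placeholders):

    import deeplabcut  # open-source pose-estimation toolbox, used here purely for illustration

    # One project per species; names and paths are placeholders.
    config = deeplabcut.create_new_project(
        "chimp-grooming", "researcher", ["videos/chimps.mp4"], copy_videos=True
    )

    deeplabcut.extract_frames(config)            # sample frames from the footage
    deeplabcut.label_frames(config)              # hand-label landmarks (eyes, paws, ...) in a GUI
    deeplabcut.create_training_dataset(config)
    deeplabcut.train_network(config)             # train a species-specific network
    deeplabcut.analyze_videos(config, ["videos/new_footage.mp4"])

Every step from labeling onward has to be repeated for each new species, which is precisely the overhead described above.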
In contrast, Markus’ application works directly with raw video data. This has two major advantages. First, the method outperforms applications based on pose estimation. Second, researchers do not need to train the application anew for each animal species; instead, they can evaluate their data almost instantly.
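What does working on raw video look like? The study’s own architecture is not detailed here, so as a rough illustration under that caveat, a generic pretrained 3D convolutional network from torchvision can classify a short clip end to end, with no landmarks involved:

    import torch
    from torchvision.models.video import r3d_18, R3D_18_Weights

    # A generic pretrained 3D CNN stands in for the team's actual classifier;
    # only the principle matches: pixels in, behavior label out.
    weights = R3D_18_Weights.KINETICS400_V1
    model = r3d_18(weights=weights).eval()

    # Dummy clip in place of a preprocessed camera-trap snippet:
    # (batch, channels, frames, height, width)
    clip = torch.randn(1, 3, 16, 112, 112)

    with torch.no_grad():
        logits = model(clip)

    label = weights.meta["categories"][logits.argmax(dim=1).item()]
    print(label)  # a generic action class; in practice one would fine-tune
                  # on behavior labels such as "grooming"

Because the network consumes the frames themselves, there is nothing species-specific to annotate before a first analysis.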
In the future, this application will enable long-term, in-depth studies in behavioral research. It is particularly interesting for zoos, where numerous species coexist and where it can contribute to improving animal welfare.

AN INTERACTIVE INSTALLATION
Interaction designer Luke Franzke and biologist Laurens Bohlen, in collaboration with Markus Marks, have developed an installation that demonstrates intuitively how this application works.
In Moving Encounters, visitors stand in front of a camera that first detects them. On a screen, they can watch how the computer classifies them. Depending on their behavior, various animals appear in tile-shaped windows around them.
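The inner workings of the installation are not documented here; as a guess at its outer loop, a webcam sketch with OpenCV might buffer a short clip of frames and hand it to a behavior classifier (classify_behavior and the tile logic are hypothetical placeholders):

    import cv2

    cap = cv2.VideoCapture(0)  # the installation's camera (device 0 assumed)
    buffer = []                # a short clip of recent frames

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        buffer.append(frame)
        if len(buffer) == 16:
            # label = classify_behavior(buffer)  # hypothetical model call
            # ... choose matching animal clips for the surrounding tiles ...
            buffer.clear()
        cv2.imshow("Moving Encounters (sketch)", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()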
For the installation, Luke and Laurens worked with camera trap data from the Swiss National Park, allowing Moving Encounters to immerse visitors in the wilderness. Markus’ application could in fact be useful for research in the Swiss National Park, too – for example, to better understand the behavior of wolves.

Markus Marks completed his PhD at the Institute of Neuroinformatics (INI) at the University of Zurich and ETH Zurich in 2022. He is now working as a postdoctoral researcher at the California Institute of Technology (Caltech) in Pasadena, California. Markus is the first author of the study Deep-learning-based identification, tracking, pose estimation, and behavior classification of interacting primates and mice in complex environments, which was published in April 2022 in Nature Machine Intelligence and which forms the basis for the installation Moving Encounters.

Luke Franzke is a scientific staff member at the Zurich University of the Arts (ZHdK). In 2006, he completed his Bachelor’s degree in Multimedia at Victoria University in Melbourne and gained several years of experience in UI design and development. After obtaining his Master’s degree in Interaction Design at ZHdK, he joined the research group Enactive Environments, where he developed new digital devices made of perishable materials. Luke teaches Product Design and basic programming skills, and he explores new material technologies in Interaction Design.

Laurens Bohlen is a project manager at the Art & Science Office of the University of Zurich (UZH). He studied Biology at UZH and gravitated toward Computational Biology during his studies. For his Master’s degree in Quantitative Biology and Systems Biology at UZH, he developed a machine learning system that identifies individual Alpine ibex by their horns.