
Our understanding of behavior, together with the biological, neural, and computational principles underlying it, has advanced dramatically over recent decades. Consequently, the behavioral and neural sciences have moved to study more complex forms of behavior at ever-increasing resolution. This has created a growing demand for methods to measure and quantify behavior, which has been met with a wide range of tools to measure, track, and analyze behavior across a variety of species, conditions, and spatiotemporal scales (Anderson and Perona, 2014; Berman, 2018; Brown and de Bivort, 2018; Krakauer et al., 2017; Dell et al., 2014; Robie et al., 2017a; Todd et al., 2017; Egnor and Branson, 2016). One of the exciting frontiers of the field is the study of collective behavior in group-living organisms, and particularly the behavior of groups of insects. Insects provide an attractive and powerful model for collective and social behavior, as they exhibit a wide range of social complexity, from solitary to eusocial, while allowing for controlled, high-throughput experiments in laboratory settings (Feinerman and Korman, 2017; Lihoreau et al., 2012; Gordon, 2014; Schneider et al., 2012). However, although complex social behavior has been the focus of extensive research for over a century, technological advances are only beginning to enable systematic and simultaneous measurement of behavior in large groups of interacting individuals.

Recent years have seen a surge in methods to track and analyze animal behavior. Solutions for automated video tracking of insects in social groups can be roughly divided into two categories (for reviews, see Dell et al., 2014; Robie et al., 2017a): methods for tracking unmarked individuals (Branson et al., 2009; Pérez-Escudero et al., 2014; Romero-Ferrero et al., 2019; Sridhar et al., 2019; Feldman et al., 2012; Khan et al., 2005; Fasciano et al., 2014; Fasciano et al., 2013; Bozek et al., 2020) and methods for tracking marked individuals (Mersch et al., 2013; Robinson et al., 2012). The former category has the obvious advantages of reduced interference with natural behavior, an unbounded number of tracked individuals, and freedom from the burden of tagging animals and maintaining those tags throughout the experiment. At the same time, when applied to individual tracking, these approaches suffer from a heavier computational burden, higher error rates, and stricter requirements for image quality. The most common approach for tracking unmarked individuals is to follow the trajectory of each individual for the duration of the video. The challenge in this approach is to resolve individuals from one another and to link their locations across consecutive frames during close-range interactions, when they are touching or occluding each other.

Nevertheless, tracking individuals in closely interacting, group-living organisms remains a challenge. Here, we present anTraX, an algorithm and software package for high-throughput video tracking of color-tagged insects. anTraX combines neural network classification of animals with a novel approach for representing tracking data as a graph, enabling individual tracking even in cases where it is difficult to segment animals from one another, or where tags are obscured. The use of color tags, a well-established and robust method for marking individual insects in groups, relaxes requirements for image size and quality and makes the software broadly applicable. anTraX can handle large-scale experiments with minimal human involvement, allowing researchers to simultaneously monitor many social groups over long time periods. anTraX is readily integrated into existing tools and methods for automated image analysis of behavior to further augment its output.
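To make the frame-to-frame linking step concrete, the sketch below implements a toy greedy nearest-neighbor matcher between detections in consecutive frames. This is a hypothetical illustration of the general technique, not anTraX's implementation; the function name, distance threshold, and centroid representation are all assumptions made for the example.

```python
import math

def link_frames(prev, curr, max_dist=20.0):
    """Greedily link detections in consecutive frames by nearest distance.

    prev, curr: lists of (x, y) centroids from two consecutive frames.
    Returns a list of (i, j) pairs matching prev[i] to curr[j]; detections
    left unmatched would start or terminate a trajectory.
    NOTE: toy example -- not the anTraX linking rule.
    """
    # All candidate pairings, cheapest first.
    pairs = sorted(
        ((math.dist(p, c), i, j)
         for i, p in enumerate(prev)
         for j, c in enumerate(curr)),
        key=lambda t: t[0],
    )
    links, used_prev, used_curr = [], set(), set()
    for d, i, j in pairs:
        if d > max_dist:
            break  # remaining pairs are even farther apart
        if i in used_prev or j in used_curr:
            continue  # each detection participates in at most one link
        links.append((i, j))
        used_prev.add(i)
        used_curr.add(j)
    return links
```

Greedy matching like this works well while animals are far apart; it is precisely during close-range interactions, when several detections fall within the distance threshold of each other or merge into one blob, that it breaks down.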
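The idea of representing tracking data as a graph can be illustrated with a toy sketch: tracklets become nodes, temporal adjacency defines directed edges, and identities assigned by a classifier to unambiguous tracklets propagate along unambiguous links. This is a deliberately simplified hypothetical example, not anTraX's actual inference procedure; the function, the data layout, and the propagation rule are assumptions for illustration only.

```python
from collections import deque

def propagate_ids(edges, seed_ids):
    """Propagate identity labels through a toy tracklet graph.

    edges: dict mapping a tracklet to the set of tracklets that follow it
           in time (its possible continuations).
    seed_ids: dict mapping some tracklets to an ID assigned by a classifier.
    A tracklet inherits an ID only when it is the unique continuation of a
    uniquely continued labeled tracklet -- a stand-in for graph inference.
    NOTE: toy example -- not the anTraX propagation rule.
    """
    ids = dict(seed_ids)
    queue = deque(seed_ids)
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, ()):
            # Inherit the ID only if the link is unambiguous in both
            # directions: node has one successor, nxt has one predecessor.
            preds = [n for n, succs in edges.items() if nxt in succs]
            if len(edges[node]) == 1 and preds == [node] and nxt not in ids:
                ids[nxt] = ids[node]
                queue.append(nxt)
    return ids
```

The appeal of the graph view is that an identity established anywhere along a chain of tracklets can be propagated to segments where the tag was obscured or the animal could not be segmented, rather than being lost at the first ambiguous frame.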
