OpenControl: A free opensource software for video tracking and automated control of behavioral mazes

https://doi.org/10.1016/j.jneumeth.2007.06.020

Abstract

Operant animal behavioral tests require the interaction of the subject with sensors and actuators distributed in the experimental environment of the arena. In order to provide user-independent, reliable results and versatile control of these devices, it is vital to use an automated control system. Commercial systems for the control of animal mazes are usually based on software implementations that restrict their application to the proprietary hardware of the vendor. In this paper we present OpenControl: an open-source Visual Basic program that allows a Windows-based computer to function as a system for running fully automated behavioral experiments. OpenControl integrates video tracking of the animal, definition of zones from the video signal for real-time assignment of the animal's position in the maze, control of the maze actuators from either hardware sensors or the online video tracking, and recording of experimental data. Bidirectional communication with the maze hardware is achieved through the parallel-port interface, without the need for expensive AD-DA cards, while video tracking is attained with an inexpensive FireWire digital camera. The OpenControl Visual Basic code is structurally general and versatile, allowing it to be easily modified or extended to fulfill specific experimental protocols and custom hardware configurations. The Visual Basic environment was chosen so that experimenters can easily adapt and extend the code to their own needs.
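The online tracking described above reduces, in its simplest form, to locating the animal in each frame by comparing it against a background image and taking the centroid of the differing pixels. OpenControl itself is written in Visual Basic; the following Python sketch (using NumPy, with an arbitrary threshold value) only illustrates the kind of per-frame computation involved, not the authors' actual algorithm:

```python
# Sketch of frame-by-frame position tracking: threshold the difference
# between the current frame and a background image, then take the
# centroid of the foreground pixels as the animal's position.
# Illustrative only; OpenControl's real tracker is Visual Basic code.
import numpy as np

def track_centroid(frame, background, threshold=30):
    """Return the (x, y) centroid of pixels differing from background."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return None  # animal not detected in this frame
    return float(xs.mean()), float(ys.mean())

# Tiny synthetic example: dark arena with a bright 2x2 "animal".
background = np.zeros((8, 8), dtype=np.uint8)
frame = background.copy()
frame[3:5, 5:7] = 255

print(track_centroid(frame, background))  # -> (5.5, 3.5)
```

The (x, y) returned per frame is what gets tested against user-defined zones for real-time control.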

Introduction

Many behavioral experiments in neuroscience address high-level cognitive processes and are therefore extremely complex to design and analyze. Since a vast number of variables may, and often do, contribute to the outcome of the experiments, it is paramount to reduce possible external sources of variability, of which human factors are particularly noteworthy. The complexity of behavioral testing is further increased by the need to precisely quantify the core variables defined as representative of the animal's behavior. These two requirements, user-independent control and precise quantitative measurement, motivate the use of automated control systems for behavioral mazes. Operant protocols used in cognitive neuroscience research (Buccafusco, 2001) regularly require interaction between the animal and several mechanical or electronic actuators, including levers, nose pokes, shockers, visual or auditory stimulators, and food or drink dispensers. Commercial systems from major vendors already offer software and hardware packages that control all the sensors and actuators of the maze, making it possible to define particular experimental conditions without the experimenter needing computer programming skills. However, in addition to their high cost, these control systems may also present experimental limitations: they cannot be used universally to control actuators from any manufacturer, and they commonly restrict their inputs to analog signals (so video tracking of animal position cannot be used to control actuators). Moreover, many experimental settings require real-time integration of the behavioral data with other methods (e.g. neurophysiological recordings, drug delivery), demanding a degree of customization that commercial control systems cannot anticipate.

Several experimental protocols require well-defined constraints on actuator activity. For example, in radial arm mazes it is convenient to dispense a food pellet only when the animal advances halfway through each baited arm, which is commonly accomplished using photocell sensors. Figure-eight mazes for delayed alternation tasks, or circular mazes for probing the activity of hippocampal place cells, may also operate doors and feeders according to the location of the animal (Belhaoues et al., 2005, Dudchenko, 2004, Lee and Wilson, 2002); a rewarded learning task was also recently described in which animals were trained to use auditory cues of distance to goal in order to locate an unmarked area in an open field (Bao et al., 2004). Video tracking of the animal's location is therefore not only used for offline trajectory calculation but becomes a necessary element for real-time control of the maze hardware. Automated control of these types of mazes is much easier to implement using a video camera instead of photocells for determining the animal's position, and custom solutions for video-based control have been developed to satisfy particular needs (Pedigo et al., 2006). Both free and commercial solutions are already available for either video tracking of animal position (Noldus et al., 2001, Ramazani et al., 2007, Togasaki et al., 2005) or control of maze hardware (Zhang, 2006). However, no free, simple solution exists that integrates animal tracking with maze hardware control, and even commercial solutions for this purpose are uncommon.
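Location-triggered control of the kind described above (e.g. dispensing a pellet once the animal passes the midpoint of a baited arm) reduces to testing whether the tracked position falls inside a user-defined zone, and firing the actuator once per zone entry. A minimal Python sketch of that logic (the zone name and feeder callback are hypothetical; OpenControl itself implements this in Visual Basic):

```python
# Sketch of zone-based actuator triggering with rectangular zones.
# The trigger fires on the rising edge (entry into the zone), so the
# feeder is actuated once per visit, not once per video frame.

class Zone:
    def __init__(self, name, x0, y0, x1, y1):
        self.name = name
        self.x0, self.y0, self.x1, self.y1 = x0, y0, x1, y1

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

class ZoneTrigger:
    """Fire an action once per entry into a zone."""
    def __init__(self, zone, action):
        self.zone = zone
        self.action = action
        self.inside = False

    def update(self, x, y):
        now_inside = self.zone.contains(x, y)
        if now_inside and not self.inside:
            self.action(self.zone.name)  # e.g. dispense pellet in this arm
        self.inside = now_inside

events = []
arm_midpoint = Zone("arm1_mid", 40, 0, 60, 10)  # hypothetical coordinates
trigger = ZoneTrigger(arm_midpoint, events.append)

# Simulated track: the animal crosses the zone once.
for x, y in [(10, 5), (45, 5), (50, 5), (80, 5)]:
    trigger.update(x, y)

print(events)  # -> ['arm1_mid'] : feeder fired exactly once
```

The same edge-detection pattern applies whether the position comes from a photocell or from the video tracker.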

Given the number of hardware customizations that some experimental protocols require, we have developed in our lab a versatile control system for behavioral mazes that allows the free integration of any electronic component deemed necessary and also includes real-time video tracking of the animal's position. Three priorities guided the development of the software running this control system: it should be open-source, it should require no expensive hardware components, and it should be very easy to adapt to specific experimental needs.

OpenControl was designed in a modular way so that every user may easily adapt the code to the needs of their particular situation. In this paper we describe the core of the software and the electronic components that we developed to interface with actuators from commercial manufacturers, and we describe the application of the system to the control of a particular experimental protocol.

Materials and methods

The entire control system comprises five components: the control computer, the OpenControl software, the video camera, the maze hardware modules, and the custom electronics needed to use the parallel (LPT1) port to control generic maze hardware (Fig. 1). Each of these components is described separately below. Additional technical details on the setup and use of the system can be found on the webpage set up for free download of the software: http://sourceforge.net/projects/OpenControl.

Results

The OpenControl system has been used in our laboratory in the control of several protocols, including the Rodent Gambling Task reported elsewhere (Pais-Vieira et al., 2007). It has also been used in experiments involving tethered rats with chronically implanted multielectrodes for simultaneous recording of behavioral and neural activities (Pais-Vieira et al., unpublished results). The use of optocoupling circuitry in the bidirectional connections through the parallel port prevented the

Discussion

We presented a software solution for automated control of behavioral tests. The OpenControl software, including the tracking algorithm, is freely available under the GNU general public license (http://www.gnu.org/licenses/gpl.html). We expect that our program may be valuable for laboratories wanting to have full access and control of the automation software. Despite being written under general assumptions, our program can be easily modified and expanded to face other needs. The process of

Acknowledgements

Supported by Fundação para a Ciência e a Tecnologia (FCT), Grants POCI/SAU-NEU/63034/2004 and BPD/26198/2006, and by the Bial Foundation, Grant 84/04. We also acknowledge Miguel Pais-Vieira for user feedback during the development of the software.
