

Central Nervous System Recovery
Evaluation through Computational With Analysis of Locomotion Kinematics
Abstract
Mouse models are commonly used in the study of central nervous system (CNS) diseases. Areas such as spinal cord injury (SCI) research often require quantification of mouse locomotion ability. The established standard for mouse recovery assessment after SCI, the Basso Mouse Scale (BMS), is limited by reliance on manual observation. Existing computational assessment solutions provide objective and sensitive quantification, but are difficult to reproduce or require expensive proprietary products. This article introduces an accessible and open-source solution for Central Nervous System Recovery Evaluation through Computational With Analysis of Locomotion Kinematics (CNSRECWALK). The pipeline utilizes DEEPLABCUT, an open-source markerless pose estimation model, to identify the position of body parts in videos of mice walking across a transparent channel. CNSRECWALK extracts and calculates 139 parameters categorized into speed, joint flexibility, coordination, stability, stepping, body weight support, and other metrics. These parameters can be aggregated to provide summary scores by category, or reduced through principal component analysis (PCA) to find overall trends or differences between experimental groups. This article describes the computational pipeline and its preliminary usage. It discusses the potential for this open-source method to enable reproducible comparison of condition severity and subtle behavioral differences in a variety of neurological and physiological studies.
Introduction
Central nervous system (CNS) diseases and injuries, such as spinal cord injury, result in varying degrees of locomotion deficits. Spinal cord injury can cause temporary or permanent changes to movement, feeling, strength, and other body functions below the injury site. According to the World Health Organization, approximately 15.4 million people are living with spinal cord injury globally as of 2021 (Spinal Cord Injury, 2024). Many ongoing efforts to develop treatments for spinal cord injury pertain to restoring body functions through research in areas such as CNS axon regeneration and remyelination (Winter et al., 2022).
In order to test hypotheses and evaluate treatments, researchers need methods to interpret and quantify their effects. Some CNS research happens in vitro, such as in cell cultures. However, due to the complexity of neurobiology and the CNS, in vivo testing is often necessary early in the research process. Mouse models are commonly used for in vivo experiments in the study of these diseases and potential treatments. A variety of data types are often collected, including microscopic imaging, electromyography recordings, and kinematic and behavioral data (Zörner et al., 2010). Data representing the ability to carry out various behaviors, such as locomotion (walking), are important because they represent the end result, showing how treatments may directly improve or interact with a human patient's day-to-day life. Kinematic data can be evaluated to quantify this functional recovery (Garrick et al., 2021).
The standard for mouse recovery assessment after spinal cord injury is the Basso Mouse Scale (BMS), an integer score between 0 and 9 assigned through manual observation. To score a mouse, a
pair of experimenters observe it in an open field or track. Observers note parameters such as the extent of ankle movement, the frequency and consistency of plantar stepping (a step where the paw faces the surface as it bears weight, as opposed to a step where the reverse side of the paw contacts or drags along the surface), and the degree of body trunk "stability" (Basso et al., 2006). However, the BMS has several shortcomings. Because it is assessed by eye, it is subject to observer bias and subjectivity. Additionally, by design, some factors of the scale are loosely defined so that they remain easy to track by eye. Multiple locomotor characteristics define each level, so individual parameters are not discernible in the final score, leaving limited room for nuanced information. Because the scale is discrete and non-linear, it also lacks the capacity to capture details in a sensitive manner.
Computational assessment and video analysis solve many of these problems, providing objective and sensitive quantification. Most systems so far have used a side view or bottom-up view of a mouse walking across a clear channel (Garrick et al., 2021). Some use advanced three-dimensional motion tracking using multiple cameras recording at the same time, or treadmill/robotic setups that interact mechanically with the mice (Dominici et al., 2012). These solutions can calculate many parameters describing mouse locomotion, such as temporal features, stepping trajectory, body weight support and stability, overall velocity, and coordination. However, a majority of existing computational pipelines are difficult to reproduce or are commercial products (Diogo et al., 2019). This project aims to create an accessible and open-source solution. As an open-source method, CNSRECWALK seeks to enable reproducible comparison of condition severity and subtle
behavioral differences in a variety of neurological and physiological studies.
Methods
CNSRECWALK utilizes DEEPLABCUT, an open-source markerless pose estimation model (Mathis et al., 2018), to identify the position over time of body parts in videos of mice walking across a transparent channel. The channel is built with easily accessible plastic and aluminum extrusion. The chamber used was 70 centimeters in length, 70 centimeters in height, and 5 centimeters in width. A mirror beneath the channel fixed at a 45 degree angle from the floor of the channel allows a single camera to capture both the side and bottom of the mouse at once. A light bar illuminates the channel, and the camera is placed at a distance of 24 inches from the nearest edge of the chamber. Videos are recorded at 120 frames per second (100 Mbps bitrate, 60p output format), with an f-stop of 9, a shutter speed of 1/320, and an ISO of 2000 using a Sony a6400 camera. The focus of the lens is manually adjusted so that no parts of the mouse are blurred in either the side or bottom view. To prepare for DEEPLABCUT tracking, longer videos are cut into shorter clips, such that the mouse makes a complete pass from one end of the channel to the other in each clip, or, in the case that a mouse is unable to make a complete pass, for a set duration of time. The clips are flipped such that the mouse is always traveling from right to left, to reduce the number of unnecessary variables. (In many models, results are expected to be symmetrical between the left and right sides of the body, so this does not pose an issue. However, whether a clip is flipped or not is noted in case it is necessary to determine the true left/right sides of the body, and for cases where results are not expected to be symmetrical.)
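The flipping step can be sketched as follows, assuming clips are loaded as NumPy arrays of frames and that the net change in the nose x-coordinate indicates travel direction (the function name and array layout are illustrative, not the pipeline's actual interface):

```python
import numpy as np

def normalize_direction(frames, nose_x):
    """Mirror a clip horizontally when the mouse travels left to right,
    so every clip runs right to left. Sketch: `frames` has shape
    (n_frames, height, width, channels); `nose_x` is the tracked nose
    x-coordinate per frame."""
    flipped = nose_x[-1] > nose_x[0]  # net rightward travel -> flip
    if flipped:
        frames = frames[:, :, ::-1, :]  # mirror each frame left-right
    # Return the flip flag so the true left/right sides can be recovered.
    return frames, flipped
```

Recording the returned flag alongside each clip preserves the information needed for analyses where left/right asymmetry matters.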
Twenty-two points are identified and tracked using DEEPLABCUT (Figure 1). As DEEPLABCUT pose estimation is markerless, no markers need to be physically drawn on or attached to the mouse (Mathis et al., 2018). To train the DEEPLABCUT model, individual frames were extracted from videos from multiple datasets spanning a variety of mouse conditions and manually labeled. After manual labeling, the model was manually validated against frames not in the training dataset. Outlier cases were identified, and those with incorrect tracking were manually labeled and added to the training dataset. After training for 800,000 iterations, with a p-cutoff of 0.8, the average training dataset error was 3.86 pixels and the test dataset error was 3.16 pixels.

Figure 1. The 22 points identified and tracked using DEEPLABCUT. The points represent body parts from the top and side views.
CNSRECWALK extracts and calculates 139 parameters categorized into speed, joint flexibility, coordination, stability, stepping, body weight support, and other metrics. The
DEEPLABCUT data is scaled such that all coordinates are normalized relative to the body length of the mouse. In addition, a median filter is applied to the coordinates to remove short-lived tracking errors.
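A minimal sketch of this preprocessing, assuming the DEEPLABCUT output for one body part is an array of per-frame (x, y) coordinates and that body length is approximated by the nose-to-tail-base distance (the function name and this body-length approximation are illustrative assumptions):

```python
import numpy as np
from scipy.signal import medfilt

def preprocess_coordinates(coords, nose_xy, tailbase_xy, kernel=5):
    """Normalize coordinates by body length and median-filter out
    short-lived tracking errors. `coords`, `nose_xy`, and `tailbase_xy`
    are (n_frames, 2) arrays."""
    # Approximate body length as the median nose-to-tail-base distance.
    body_length = np.median(np.linalg.norm(nose_xy - tailbase_xy, axis=1))
    scaled = coords / body_length
    # Median-filter each coordinate series to suppress single-frame jumps.
    return np.apply_along_axis(lambda s: medfilt(s, kernel), 0, scaled)
```

The median filter removes isolated spikes (e.g., a single mistracked frame) while leaving sustained motion intact.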
After preprocessing, the first step in calculating parameters is to identify the gait pattern of the mouse, as many parameters are calculated or aggregated per step (for example, maximum ankle height per step, or steps per second). To do this, the velocities of the paws (tracked from the bottom mirror view) are analyzed. The locomotion cycle can be classified into two phases, swing and stance. During the swing phase, the limb does not support body weight and advances forward. During the stance phase, the limb supports body weight and the paw remains stationary on the ground. To identify the time and duration of steps, CNSRECWALK identifies these swing and stance phases. First, a Savitzky-Golay filter is applied to the velocity over time to smooth out noise while preserving peaks. Additionally, unrealistically high velocity values are discarded, and the distance between the paw and a reference point (such as the base of the tail) is verified. Then, a velocity threshold is determined as a fraction of the maximum velocity of the paw throughout the video. If the resulting threshold is too low, such as when there is no motion at all throughout the video, it is set to a fixed value. Discarding unrealistically high velocities prevents an impossibly high threshold from being set, which would cause real steps to be missed. Next, any period during which the paw velocity is greater than the threshold is considered a swing phase, and any other period is considered a stance phase. A phase is not counted if it is active at the beginning or end of the recording, as it may be partially cut off and not capture the entire stepping motion. Additional filtering is then applied to the identified phases: phases shorter than a preset duration are discarded, as are phases whose peak velocity is less than another threshold, also determined as a fraction of the maximum velocity over the duration of the clip. Each swing phase followed by a stance phase is considered one full step, which is used as a unit of time for calculating other parameters.

Figure 2. Identification of swing and stance phases based on smoothed paw velocity (orange). Also shown is raw velocity (blue) and velocity threshold (blue horizontal line).
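The thresholding logic above can be sketched as follows. This is a simplified illustration, not the pipeline's exact implementation: the constants (threshold fractions, minimum duration, smoothing window) and the 99th-percentile velocity cap are illustrative stand-ins for the filtering steps described in the text.

```python
import numpy as np
from scipy.signal import savgol_filter

def find_swing_phases(velocity, fps=120, thresh_frac=0.3,
                      min_duration_s=0.05, min_peak_frac=0.5, floor=1e-3):
    """Segment a paw-velocity trace into (start, end) swing phases."""
    # Smooth noise while preserving peaks, then clip negatives.
    v = np.clip(savgol_filter(velocity, window_length=11, polyorder=3), 0, None)
    # Ignore unrealistically high values when choosing the threshold.
    vmax = v[v <= np.percentile(v, 99)].max()
    threshold = max(thresh_frac * vmax, floor)  # fixed floor if no motion
    above = v > threshold
    starts = np.flatnonzero(np.diff(above.astype(int)) == 1) + 1
    ends = np.flatnonzero(np.diff(above.astype(int)) == -1) + 1
    # Drop phases active at the clip boundaries (partially cut off).
    if above[0] and len(ends):
        ends = ends[1:]
    if above[-1] and len(starts):
        starts = starts[:-1]
    phases = []
    for s, e in zip(starts, ends):
        too_short = (e - s) / fps < min_duration_s
        too_weak = v[s:e].max() < min_peak_frac * vmax
        if not (too_short or too_weak):
            phases.append((s, e))
    return phases
```

Stance phases are then simply the gaps between consecutive swing phases, and each swing-stance pair forms one full step.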
Several parameters are calculated directly from this timing information, such as swing duration, stance duration, swing/step
duration ratio, step duration, step speed, and step frequency. These per-step values are aggregated to produce median, minimum, and maximum values for the entire video. In addition to simple timing parameters, this information is also used to calculate parameters describing inter-limb coordination of the gait cycle. This includes the ratio of pairs of consecutive steps that are in the correct order (a hindlimb step followed by a forelimb step, or a forelimb step followed by a hindlimb step, as opposed to a forelimb step followed by another forelimb step). The other coordination parameters are the phases between each step of the front left paw and each other paw: for example, the amount of time between a front right paw step and the preceding front left paw step, divided by the amount of time between that front left paw step and the next front left paw step.

Figure 3. Phase_with_step_front_left parameters for an example video clip. Each line represents a step in the video clip, and the angle of the line represents the phase between that step and the previous front_left step.
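The phase calculation can be sketched as follows, assuming step times are given as onset timestamps per paw (the function name is illustrative):

```python
import numpy as np

def phase_with_front_left(front_left_times, other_times):
    """For each other-paw step, its phase within the enclosing front-left
    step cycle, as a fraction in [0, 1). Steps outside any complete
    front-left cycle are skipped."""
    fl = np.sort(np.asarray(front_left_times, dtype=float))
    phases = []
    for t in other_times:
        # Find the front-left cycle [fl[i], fl[i+1]) containing t.
        i = np.searchsorted(fl, t, side="right") - 1
        if 0 <= i < len(fl) - 1:
            phases.append((t - fl[i]) / (fl[i + 1] - fl[i]))
    return phases
```

A phase near 0.5 indicates strict alternation with the front left paw, while phases near 0 or 1 indicate near-synchronous stepping.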
Various positional parameters are determined from the coordinates extracted by DEEPLABCUT. The positional parameters include angles of various joints, heights of various body parts, and distances between body parts. These values are aggregated over time for the entire clip to determine values such as per-clip median toe height. Additionally, many measurements are first aggregated on a per-step basis, finding the minimum, maximum, or median
per step. Parameters related to forelimb stepping are calculated over the timeframe of each forelimb step, and those related to hindlimb stepping over the timeframe of each hindlimb step. Parameters relevant overall are calculated for both. The value of the parameter for each step is then averaged across the entire clip.
The heights of the body parts are calculated relative to a baseline height. The baseline height is determined based on the minimum height of the paw during the timeframe of calculation, to approximate the height of the surface the mouse is walking on. So, for per-clip parameter values, the baseline height is the minimum height of the paw for the entire clip, while for per-step parameters, it is the minimum height of the paw for the step. Parameters related to forelimb stepping use the front paw to calculate the baseline height, while parameters related to the hindlimb use the back paw.
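The baseline-relative height calculation can be sketched as follows, assuming image coordinates have already been converted so that y increases upward (the function name and argument layout are illustrative):

```python
import numpy as np

def heights_above_baseline(part_y, paw_y, step_ranges=None):
    """Heights relative to a baseline taken as the minimum paw height
    over the window of calculation: the whole clip, or each step's
    (start, end) frame range when `step_ranges` is given."""
    part_y, paw_y = np.asarray(part_y, float), np.asarray(paw_y, float)
    if step_ranges is None:            # per-clip baseline
        return part_y - paw_y.min()
    # Per-step baseline: minimum paw height within each step window.
    return [part_y[s:e] - paw_y[s:e].min() for s, e in step_ranges]
```

Using the relevant paw's minimum as the baseline approximates the walking surface without requiring the surface itself to be tracked.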
After parameters are calculated, additional filtering is applied to further remove erroneous data points. Pairs of body parts that are anatomically expected to be at a fixed distance are validated, and affected parameters are removed in frames where these skeletal distances are unrealistically large or small. This alleviates the effects of DEEPLABCUT tracking errors and erratic values in cases where a point is occluded behind other body parts or goes out of frame. Additionally, a median filter is applied to certain value-sensitive but non-time-sensitive parameters, such as paw angle.

Figure 4. Median filter applied to ankle angle parameter to remove erroneous outlier values. The single-frame spikes to large values are noise, as it is physically impossible for the joint to move a large distance in one frame and return in the next frame.
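The skeletal-distance validation described above can be sketched as a per-frame mask; the tolerance value and function name here are illustrative assumptions, not the pipeline's actual defaults:

```python
import numpy as np

def mask_bad_skeleton_frames(p1, p2, expected, tol=0.25):
    """Flag frames where the distance between two anatomically linked
    points (each an (n_frames, 2) array) deviates from its expected
    value by more than a fractional tolerance. Flagged frames can then
    have their affected parameters set to NaN."""
    d = np.linalg.norm(np.asarray(p1, float) - np.asarray(p2, float), axis=1)
    return np.abs(d - expected) > tol * expected
```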
Based on potential differences in the recovery process of spinal cord injury and classified according to functionality, the parameters can roughly be divided into the categories of hindlimb flexibility, coordination, velocity, body weight support, stability, stepping, tail, and other.
Hindlimb flexibility
angle_crest_median: The median angle per video clip about the crest, between the back and hip
angle_hip_median: The median angle per video clip about the hip, between the crest and knee
angle_knee_median: The median angle per video clip about the knee, between the hip and hind ankle
angle_ankle_median: The median angle per video clip about the hind ankle, between the hind toe and knee
step_back_left_angle_crest_max: The mean per video clip value of the max per back-left paw step angle about the crest, between the back and hip
step_back_left_angle_crest_min: The mean per video clip value of the min per back-left paw step angle about the crest, between the back and hip
step_back_left_angle_hip_max: The mean per video clip value of the max per back-left paw step angle about the hip, between the crest and knee
step_back_left_angle_hip_min: The mean per video clip value of the min per back-left paw step angle about the hip, between the crest and knee
step_back_left_angle_knee_max: The mean per video clip value of the max per back-left paw step angle about the knee, between the hip and hind ankle
step_back_left_angle_knee_min: The mean per video clip value of the min per back-left paw step angle about the knee, between the hip and hind ankle
step_back_left_angle_ankle_max: The mean per video clip value of the max per back-left paw step angle about the hind ankle, between the hind toe and knee
step_back_left_angle_ankle_min: The mean per video clip value of the min per back-left paw step angle about the hind ankle, between the hind toe and knee
step_back_left_angle_crest_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the angle about the crest, between the back and hip
step_back_left_angle_hip_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the angle about the hip, between the crest and knee
step_back_left_angle_knee_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the angle about the knee, between the hip and hind ankle
step_back_left_angle_ankle_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the angle about the hind ankle, between the hind toe and knee

Coordination
ratio_step_front_left_with_step_back: The ratio of front left paw steps followed by a hindlimb step, over the number of front left paw steps in a clip
ratio_step_front_right_with_step_back: The ratio of front right paw steps followed by a hindlimb step, over the number of front right paw steps in a clip
ratio_correct_step_order: The ratio of steps in the correct order over the total number of steps in a clip. A step in the correct order is either a hindlimb step followed by a forelimb step, or a forelimb step followed by a hindlimb step
step_back_right_phase_with_front_left: The mean per video clip value of the fraction of time between front left paw steps at which each back right paw step occurs
step_front_right_phase_with_front_left: The mean per video clip value of the fraction of time between front left paw steps at which each front right paw step occurs
step_back_left_phase_with_front_left: The mean per video clip value of the fraction of time between front left paw steps at which each back left paw step occurs

Velocity
step_front_left_swing_speed: The mean front left paw swing speed per video clip
step_back_left_swing_speed: The mean back left paw swing speed per video clip
overall_speed: The mean speed of the tail base per video clip

Body weight support
height_crest_median: The median height above the surface per video clip of the iliac crest
height_hip_median: The median height above the surface per video clip of the hip
height_knee_median: The median height above the surface per video clip of the knee
height_tailBase_median: The median height above the surface per video clip of the tail base
height_back_median: The median height above the surface per video clip of the back
step_front_left_height_back_min: The mean per video clip value of the min per front-left paw step height above the surface of the back
step_front_left_height_back_max: The mean per video clip value of the max per front-left paw step height above the surface of the back
step_front_left_height_back_range: The mean per video clip value of the difference between the min and max values per front-left paw step of the height above the surface of the back
step_back_left_height_crest_min: The mean per video clip value of the min per back-left paw step height above the surface of the iliac crest
step_back_left_height_crest_max: The mean per video clip value of the max per back-left paw step height above the surface of the iliac crest
step_back_left_height_hip_min: The mean per video clip value of the min per back-left paw step height above the surface of the hip
step_back_left_height_hip_max: The mean per video clip value of the max per back-left paw step height above the surface of the hip
step_back_left_height_knee_min: The mean per video clip value of the min per back-left paw step height above the surface of the knee
step_back_left_height_knee_max: The mean per video clip value of the max per back-left paw step height above the surface of the knee
step_back_left_height_tailBase_min: The mean per video clip value of the min per back-left paw step height above the surface of the tail base
step_back_left_height_tailBase_max: The mean per video clip value of the max per back-left paw step height above the surface of the tail base

Stability
width_stance_paws_front_median: The median per video clip lateral distance between the front paws
width_stance_paws_back_median: The median per video clip lateral distance between the back paws
angle_paw_back_left_liftoff: The median per video clip value of the mean values of the back left paw angle from the nose-tail base line for the six frames surrounding the start of each back left paw swing phase
angle_paw_back_right_liftoff: The median per video clip value of the mean values of the back right paw angle from the nose-tail base line for the six frames surrounding the start of each back right paw swing phase
angle_paw_back_left_landing: The median per video clip value of the mean values of the back left paw angle from the nose-tail base line for the six frames surrounding the end of each back left paw swing phase
angle_paw_back_right_landing: The median per video clip value of the mean values of the back right paw angle from the nose-tail base line for the six frames surrounding the end of each back right paw swing phase
step_front_left_width_paws_back_max: The mean per video clip value of the max per front-left paw step lateral distance between the back paws
step_front_left_width_paws_back_min: The mean per video clip value of the min per front-left paw step lateral distance between the back paws
step_front_left_width_paws_back_range: The mean per video clip value of the difference between the min and max values per front-left paw step of the lateral distance between the back paws
step_back_left_height_crest_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the height above the surface of the iliac crest
step_back_left_height_hip_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the height above the surface of the hip
step_back_left_height_knee_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the height above the surface of the knee
step_back_left_height_tailBase_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the height above the surface of the tail base
step_back_left_width_paws_back_max: The mean per video clip value of the max per back-left paw step lateral distance between the back paws
step_back_left_width_paws_back_min: The mean per video clip value of the min per back-left paw step lateral distance between the back paws
step_back_left_width_paws_back_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the lateral distance between the back paws

Stepping
count_step_back_left: The number of back left paw steps per video clip
count_step_back_right: The number of back right paw steps per video clip
count_step_front_left: The number of front left paw steps per video clip
count_step_front_right: The number of front right paw steps per video clip
angle_elbow_median: The median angle per video clip about the elbow, between the front paw and shoulder
height_ankle_median: The median height per video clip above the surface of the ankle
height_toe_median: The median height per video clip above the surface of the back toe
height_shoulder_median: The median height per video clip above the surface of the shoulder
height_elbow_median: The median height per video clip above the surface of the elbow
height_frontPaw_median: The median height per video clip above the surface of the front paw
step_front_left_angle_elbow_max: The mean per video clip value of the max per front-left paw step angle about the elbow, between the front paw and shoulder
step_front_left_angle_elbow_min: The mean per video clip value of the min per front-left paw step angle about the elbow, between the front paw and shoulder
step_front_left_angle_elbow_range: The mean per video clip value of the difference between the min and max values per front-left paw step of the angle about the elbow, between the front paw and shoulder
step_front_left_height_shoulder_min: The mean per video clip value of the min per front-left paw step height above the surface of the shoulder
step_front_left_height_shoulder_max: The mean per video clip value of the max per front-left paw step height above the surface of the shoulder
step_front_left_height_elbow_min: The mean per video clip value of the min per front-left paw step height above the surface of the elbow
step_front_left_height_elbow_max: The mean per video clip value of the max per front-left paw step height above the surface of the elbow
step_front_left_height_frontPaw_max: The mean per video clip value of the max per front-left paw step height above the surface of the front paw
step_front_left_height_shoulder_range: The mean per video clip value of the difference between the min and max values per front-left paw step of the height above the surface of the shoulder
step_front_left_height_elbow_range: The mean per video clip value of the difference between the min and max values per front-left paw step of the height above the surface of the elbow
step_front_left_step_distance: The mean per video clip value of the distance between the leftmost and rightmost points per step of the front left paw
step_front_left_protraction_distance: The mean per video clip value of the max per step positive distance between the shoulder and front paw
step_front_left_retraction_distance: The mean per video clip value of the max per step negative distance between the shoulder and front paw
step_front_left_duration_step: The mean per video clip of front left paw step durations (each step is one swing followed by one stance)
step_front_left_duration_swing: The mean per video clip of front left paw swing durations
step_front_left_stance_duration: The mean per video clip of front left paw stance durations
step_front_left_ratio_duration_swing_step: The mean per video clip value of, for each front left paw step, the swing duration divided by the step duration
step_back_left_height_ankle_min: The mean per video clip value of the min per back-left paw step height above the surface of the back ankle
step_back_left_height_ankle_max: The mean per video clip value of the max per back-left paw step height above the surface of the back ankle
step_back_left_height_toe_max: The mean per video clip value of the max per back-left paw step height above the surface of the back toe
step_back_left_height_ankle_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the height above the surface of the back ankle
step_back_left_step_distance: The mean per video clip value of the distance between the leftmost and rightmost points per step of the back left toe
step_back_left_protraction_distance: The mean per video clip value of the max per step positive distance between the iliac crest and back toe
step_back_left_retraction_distance: The mean per video clip value of the max per step negative distance between the iliac crest and back toe
step_back_left_duration_step: The mean per video clip of back left paw step durations (each step is one swing followed by one stance)
step_back_left_duration_swing: The mean per video clip of back left paw swing durations
step_back_left_stance_duration: The mean per video clip of back left paw stance durations
step_back_left_ratio_duration_swing_step: The mean per video clip value of, for each back left paw step, the swing duration divided by the step duration

Other
angle_eye_median: The median per video clip angle between the surface and the nose-eye vector
angle_back_median: The median per video clip angle about the back, between the shoulder and iliac crest
angle_curve_median: The median per video clip angle about the back, between the nose and tail base
distance_nose_tail_End_median: The median per video clip distance between the nose and the tail end
distance_shoulder_crest_median: The median per video clip distance between the shoulder and iliac crest
distance_elbow_ankle_median: The median per video clip distance between the elbow and back ankle
step_front_left_angle_eye_max: The mean per video clip value of the max per front-left paw step angle between the surface and the nose-eye vector
step_front_left_angle_eye_min: The mean per video clip value of the min per front-left paw step angle between the surface and the nose-eye vector
step_front_left_angle_back_max: The mean per video clip value of the max per front-left paw step angle about the back, between the shoulder and iliac crest
step_front_left_angle_back_min: The mean per video clip value of the min per front-left paw step angle about the back, between the shoulder and iliac crest
step_front_left_angle_eye_range: The mean per video clip value of the difference between the min and max values per front-left paw step of the angle between the surface and the nose-eye vector
step_front_left_angle_back_range: The mean per video clip value of the difference between the min and max values per front-left paw step of the angle about the back, between the shoulder and iliac crest
step_front_left_distance_nose_tail_max: The mean per video clip value of the max per front-left paw step distance between the nose and tail base
step_front_left_distance_nose_tail_min: The mean per video clip value of the min per front-left paw step distance between the nose and tail base
step_front_left_distance_nose_tail_End_max: The mean per video clip value of the max per front-left paw step distance between the nose and tail end
step_front_left_distance_nose_tail_End_min: The mean per video clip value of the min per front-left paw step distance between the nose and tail end
step_front_left_distance_shoulder_crest_max: The mean per video clip value of the max per front-left paw step distance between the shoulder and iliac crest
step_front_left_distance_shoulder_crest_min: The mean per video clip value of the min per front-left paw step distance between the shoulder and iliac crest
step_front_left_distance_elbow_ankle_max: The mean per video clip value of the max per front-left paw step distance between the elbow and back ankle
step_front_left_distance_elbow_ankle_min: The mean per video clip value of the min per front-left paw step distance between the elbow and back ankle
step_front_left_distance_nose_tail_range: The mean per video clip value of the difference between the min and max values per front-left paw step of the distance between the nose and tail base
step_front_left_distance_nose_tail_End_range: The mean per video clip value of the difference between the min and max values per front-left paw step of the distance between the nose and tail end
step_front_left_distance_shoulder_crest_range: The mean per video clip value of the difference between the min and max values per front-left paw step of the distance between the shoulder and iliac crest
step_front_left_distance_elbow_ankle_range: The mean per video clip value of the difference between the min and max values per front-left paw step of the distance between the elbow and back ankle
step_back_left_distance_nose_tail_max: The mean per video clip value of the max per back-left paw step distance between the nose and tail base
step_back_left_distance_nose_tail_min: The mean per video clip value of the min per back-left paw step distance between the nose and tail base
step_back_left_distance_nose_tail_End_max: The mean per video clip value of the max per back-left paw step distance between the nose and tail end
step_back_left_distance_nose_tail_End_min: The mean per video clip value of the min per back-left paw step distance between the nose and tail end
step_back_left_distance_shoulder_crest_max: The mean per video clip value of the max per back-left paw step distance between the shoulder and iliac crest
step_back_left_distance_shoulder_crest_min: The mean per video clip value of the min per back-left paw step distance between the shoulder and iliac crest
step_back_left_distance_elbow_ankle_max: The mean per video clip value of the max per back-left paw step distance between the elbow and back ankle
step_back_left_distance_elbow_ankle_min: The mean per video clip value of the min per back-left paw step distance between the elbow and back ankle
step_back_left_distance_nose_tail_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the distance between the nose and tail base
step_back_left_distance_nose_tail_End_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the distance between the nose and tail end
step_back_left_distance_shoulder_crest_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the distance between the shoulder and iliac crest
step_back_left_distance_elbow_ankle_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the distance between the elbow and back ankle

Tail
angle_tail_median: The median angle per video clip about the tail base, between the tail end and iliac crest
height_tail_median: The median height per video clip of the tail end above the tail base
step_front_left_angle_tail_max: The mean per video clip value of the max per front-left paw step angle about the tail base, between the tail end and iliac crest
step_front_left_angle_tail_min: The mean per video clip value of the min per front-left paw step angle about the tail base, between the tail end and iliac crest
step_front_left_angle_tail_range: The mean per video clip value of the difference between the min and max values per front-left paw step of the angle about the tail base, between the tail end and iliac crest
step_back_left_height_tail_max: The mean per video clip value of the max per back-left paw step height of the tail end above the tail base
step_back_left_height_tail_min: The mean per video clip value of the min per back-left paw step height of the tail end above the tail base
step_back_left_height_tail_range: The mean per video clip value of the difference between the min and max values per back-left paw step of the height of the tail end above the tail base

Table 1. Full list of parameters calculated
These parameters can be aggregated to provide summary scores by category, or reduced through principal component analysis (PCA) to find overall trends or differences between experimental groups. To calculate summary scores, the z-score of each parameter is first calculated for each trial. Next, the mean z-score across only the trials in the baseline group is calculated for each parameter. For any parameter whose baseline-group mean z-score is negative, the z-scores of all trials are multiplied by -1, so that the baseline group always scores positive. The z-scores within each parameter category are then averaged to produce one value per trial per category. A weighted average may also be used: PCA is applied to the normalized parameter values of one category at a time to determine loading factors for each parameter within that category, and before averaging, the parameters are multiplied by weights calculated from these loading factors (Equation 1). In practice, however, the weighted summary scores differed little from the unweighted version, and further analysis is required to determine whether weighting is worthwhile.
Equation 1. Calculation of parameter weights for weighted summary scores.
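As an illustration, the unweighted summary-score procedure described above can be sketched in a few lines of NumPy. This is a minimal sketch, not CNSRECWALK's actual API; the function and variable names are hypothetical.

```python
import numpy as np

def summary_scores(values, baseline_mask):
    """Unweighted category summary scores (hypothetical sketch).

    values: (n_trials, n_params) array of parameter values for ONE category.
    baseline_mask: boolean array marking trials in the baseline group.
    Returns one summary score per trial.
    """
    # z-score each parameter across all trials
    z = (values - values.mean(axis=0)) / values.std(axis=0)
    # flip the sign of parameters whose baseline-group mean z-score is
    # negative, so the baseline group always scores positive
    sign = np.where(z[baseline_mask].mean(axis=0) < 0, -1.0, 1.0)
    z = z * sign
    # average within the category to get one value per trial
    return z.mean(axis=1)
```

The weighted variant would multiply each column of `z` by a weight derived from the PCA loading factors (Equation 1) before taking the mean.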

Figure 5. Example of summary scores.
The parameters extracted by CNSRECWALK can also be used to visualize the data, not just in figures, but also by overlaying information onto the videos. A common technique for analyzing gait is to paint the paws of the mice and have them walk over a sheet of paper, recording their stepping patterns. A similar effect can be achieved with CNSRECWALK by plotting the positions of the paws during stance phases over the video, producing a "footprints" visual, and by plotting swing phases with dotted lines representing each paw's path of travel. Angle
and height parameters can be visualized by annotating the

the ratio of plantar or dorsal (where the reverse side of the paw contacts or drags along the surface) stepping based on the bottom view were also created. This information is also captured in the ankle angles and heights from the side view, but unlike the bottom view, the side view lacks information about the side of the mouse facing away from the camera. A convolutional neural network was trained to classify bottom-view images of paws as plantar or dorsal. In preliminary use, the model achieved 90% classification accuracy. However, the added parameters provided little information beyond the existing parameters, and the convolutional neural network remained less reliable than the DEEPLABCUT-based approaches.
Results and Discussion
CNSRECWALK was used to evaluate experiments conducted as part of several in-progress spinal cord injury studies at the Zhigang He Laboratory, as well as a dyskinesia model through a collaboration with the Stevens Lab of Boston Children's Hospital and The Stanley Center at the Broad Institute.
For one study, mice from four different treatment groups were compared, each with recordings from 1 week, 4 weeks, and 8 weeks post injury. The parameter values were averaged across multiple recording clips (in which the mouse passes from one side to the other) to get one datapoint (one set of parameter values) per mouse per time point. PCA was applied to the parameters to reduce the dimensionality of the data. As expected, there was a clear separation between experimental groups in PCA space. Principal component 1 correlated with overall injury recovery. The parameters that correlated most with principal component 1 were mostly hindlimb heights, suggesting hindlimb stability, followed by coordination parameters and paw angle parameters, suggesting control over stepping. Meanwhile, principal component 2 captured a pattern which approximately formed a "V" shape with respect to PC1. The parameters most correlated with PC2 mostly represented speed, as well as forelimb height. This revealed a negative correlation between forelimb stepping frequency and overall speed.
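The dimensionality reduction step can be sketched with a plain NumPy PCA via SVD. This is a minimal illustration assuming the parameters are z-scored first; the function name is hypothetical, and in practice a library implementation (e.g. scikit-learn) would work equally well.

```python
import numpy as np

def pca_project(X, n_components=2):
    """Z-score the parameters, then project trials onto principal components."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    # SVD of the standardized trial-by-parameter matrix
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = Z @ Vt[:n_components].T       # per-trial PC coordinates
    loadings = Vt[:n_components].T         # per-parameter loading factors
    explained = (S ** 2) / (S ** 2).sum()  # fraction of variance per PC
    return scores, loadings, explained[:n_components]
```

The `loadings` columns correspond to the per-parameter correlations reported in Table 2, and the `scores` give each trial's position in PC space.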

Figure 7. PCA Analysis. PC1 correlates with recovery, while PC2 reveals a trend in stepping speed not captured by BMS.

Table 2. PCA loading factors. The correlation of each parameter to PC1 and PC2 axes.
Separations between experimental groups for individual parameters were also investigated: t-tests were performed between groups for each parameter and principal component, making it possible to determine which parameters differed significantly between groups. Additionally, the distance in PCA space from the baseline was measured for each group, to determine whether the groups differed significantly in overall recovery.
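The distance-to-baseline measure can be sketched as the Euclidean distance from each trial's PC coordinates to the baseline-group centroid. This is a minimal illustration; the function name is hypothetical.

```python
import numpy as np

def distance_to_baseline(pc_scores, baseline_mask):
    """Euclidean distance of each trial from the baseline-group centroid in PC space.

    pc_scores: (n_trials, n_components) array of PC coordinates.
    baseline_mask: boolean array marking trials in the baseline group.
    """
    centroid = pc_scores[baseline_mask].mean(axis=0)
    return np.linalg.norm(pc_scores - centroid, axis=1)
```

The resulting per-trial distances can then be compared between groups with a standard two-sample t-test (e.g. scipy.stats.ttest_ind).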

Figure 8. height_knee_median vs. Basso Mouse Scale score, an example of one parameter that correlated with BMS score.
For time series data, parameters, principal component values, and distance-to-baseline values also changed over time within each treatment group. As expected, treatment groups had roughly equivalent parameter and principal component values immediately after injury, but separated as time went on and recovery took place.

Figure 9. Median ankle angle over time.
Generated parameter visualizations and videos were manually reviewed to ensure all parameters were explainable and calculated correctly. The quality control methods described earlier in this article, along with adjustments to parameter calculations, were implemented to eliminate inaccuracies. It was qualitatively confirmed that left and right limbs have similar parameter values when behavior is expected to be symmetrical, indicating no problem with the preprocessing practice of flipping videos horizontally to improve the reliability of DEEPLABCUT tracking.
One issue encountered in preliminary usage of the pipeline was that many clips, especially those with more severe injury and paralysis, contained no hindlimb steps at all. This generated N/A values for parameters that depended on hindlimb stepping, either directly or as the timeframe of aggregation. Because PCA cannot accommodate N/A values, these trials could not be included in the analysis. To solve this problem, the forelimb stepping pattern was substituted for the hindlimb stepping pattern in cases where the stepping pattern was used only as a timeframe over which to aggregate the parameter value. Per-clip values, such as the median angle per clip, are unaffected by this issue and may be more representative in these cases. However, preliminary usage showed the minimum- and maximum-based per-clip parameters to be susceptible to outliers and largely redundant with the per-step parameters in cases where there were no N/A values. With the forelimb/hindlimb substitution available to mitigate N/A values, there is no compelling reason to prefer the per-clip minimums and maximums. The per-clip median values were retained, but the per-clip minimum and maximum positional parameters, such as minimum crest angle, were removed.
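The forelimb fallback can be sketched as a small guard in the aggregation step. This is a simplified illustration under assumed data structures (per-frame signal plus lists of step windows); the real pipeline's representation may differ.

```python
import numpy as np

def aggregate_per_step(signal, hind_steps, fore_steps):
    """Mean of the per-step max of `signal`, falling back to forelimb step
    windows when no hindlimb steps were detected, so PCA sees no N/A values.

    signal: 1-D array holding one per-frame parameter value.
    hind_steps, fore_steps: lists of (start, end) frame indices, one per step.
    """
    windows = hind_steps if hind_steps else fore_steps
    if not windows:
        return np.nan  # no steps of either kind: the value stays missing
    return float(np.mean([signal[start:end].max() for start, end in windows]))
```

Only parameters that use the stepping pattern purely as an aggregation timeframe would take this fallback; parameters that measure the hindlimb steps themselves would remain N/A.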
Preliminary usage in conjunction with clustering and machine learning techniques also showed potential. Trials were clustered in PCA space through k-means clustering, allowing similar trials to be identified based on the parameters. The resulting clusters were then compared to experimental condition groups or BMS score groups. Additionally, t-SNE reduction was used to visualize clusters in two dimensions after clustering on many-dimensional principal components. Several methods to computationally replicate BMS scores were attempted, including programmatically following the BMS criteria and applying unsupervised and supervised machine learning approaches such as linear regression and decision trees. However, replicating the BMS score appeared to generate less useful information than directly interpreting the parameters.
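The clustering step can be illustrated with a minimal k-means (Lloyd's algorithm with a deterministic farthest-point initialization) over trials in PC space. This is a sketch for illustration only; in practice a library implementation such as scikit-learn's KMeans would typically be used.

```python
import numpy as np

def kmeans(X, k, iters=100):
    """Minimal Lloyd's-algorithm k-means over rows of X (trials in PC space)."""
    # farthest-point initialization keeps this sketch deterministic
    centers = [X[0].astype(float)]
    for _ in range(1, k):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)].astype(float))
    centers = np.array(centers)
    for _ in range(iters):
        # assign each trial to its nearest cluster center
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(axis=-1), axis=1)
        # move each center to the mean of its assigned trials
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers
```

The cluster labels can then be cross-tabulated against experimental condition groups or BMS score groups to see whether the parameter-driven clusters recover the known groupings.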
Conclusions and Future Work
Preliminary use in experiments suggested that CNSRECWALK is a feasible option for quantifying mouse locomotion recovery that could benefit a variety of neurological fields. It can reliably calculate many different parameters describing locomotion at different severity levels. Through PCA, CNSRECWALK can replicate a linear scale representing overall recovery, similar to the BMS (Figure 8), and this score can be compared with BMS values.
CNSRECWALK also successfully surfaced differences not apparent in other quantification methods like BMS. For example, this method revealed a negative correlation between forelimb stepping frequency and overall speed, suggesting that stepping becomes more deliberate with greater body weight support as recovery improves. In addition, CNSRECWALK can be used to calculate scores from parameters categorized according to biological meaning (Figure 5).
Limitations include tracking errors and tradeoffs between cost and accuracy; CNSRECWALK still requires more testing and fine-tuning for reliability and robustness. Early in the preliminary usage and testing of the pipeline, when using prerecorded videos recorded by multiple people under different settings, and not necessarily according to the protocol described in the methods, problems with subpar recording quality were encountered: incorrect framing that left the bottom mirror out of frame, poor focus affecting the top view, the bottom view, or both, motion blur due to improper lighting, camera shake, and tilted videos in which the surface is not horizontally level with the frame.
Another potential limitation is that the trained DEEPLABCUT model may not work for other labs due to slight differences in the recording setup, even if the setup is replicated according to published instructions. Addressing this requires continued testing and retraining of the model across a variety of data collection environments. For example, one limitation encountered when collaborating with the Stevens Lab was that the model did not work at all for "Agouti" mice, whose fur color differed from that of the mice the model had been trained on.
In the future, it will be necessary to demonstrate CNSRECWALK's reliability more rigorously. Although CNSRECWALK was applied to studies at the Zhigang He Laboratory, those studies addressed novel questions, so they could not serve as a perfectly controlled environment for testing CNSRECWALK. While the pipeline's results were compared to BMS scores as "ground truth" manual labels, the end goal of the new method is not simply to replicate BMS, so more detailed and nuanced manual or known-true labels are needed for comparison. It is also necessary to understand exactly which variables must be controlled for the analysis to be valid; for example, the minimum number of steps needed in a clip, or whether parameters are affected by non-locomotion actions such as a mouse pausing to sniff around.
Based on the results of this evaluation, it may make sense to make changes to the list of parameters and how they are calculated, or incorporate additional elements of established or emerging research. For example, one type of information that is not currently captured is the consistency of the path of the limb endpoint between steps (Courtine et al., 2009).
Overall, one necessary improvement is to better package and genericize the pipeline so that anyone can use it, ideally without a technical coding background. This requires the reusable analysis pipeline to be better decoupled from the requirements of specific experiments. It should become a well-documented software library that is easy to integrate into larger data analysis workflows and to adapt to different research questions, with ample practical documentation and potentially a graphical user interface. In the future, CNSRECWALK will be published and open-sourced to benefit the neurobiology research community.
Works Cited
Aljovic, A., Zhao, S., Chahin, M., De La Rosa, C., Van Steenbergen, V., Kerschensteiner, M., & Bareyre, F. M. (2022). A deep learning-based toolbox for Automated Limb Motion Analysis (ALMA) in murine models of neurological disorders. Communications Biology, 5(1), 131. https://doi.org/10.1038/s42003-022-03077-6
Asboth, L., Friedli, L., Beauparlant, J., Martinez-Gonzalez, C., Anil, S., Rey, E., Baud, L., Pidpruzhnykova, G., Anderson, M. A., Shkorbatova, P., Batti, L., Pagès, S., Kreider, J., Schneider, B. L., Barraud, Q., & Courtine, G. (2018). Cortico–reticulo–spinal circuit reorganization enables functional recovery after severe spinal cord contusion. Nature Neuroscience, 21(4), 576–588. https://doi.org/10.1038/s41593-018-0093-5
Basso, D. M., Fisher, L. C., Anderson, A. J., Jakeman, L. B., Mctigue, D. M., & Popovich, P. G. (2006). Basso Mouse Scale for Locomotion Detects Differences in Recovery after Spinal Cord Injury in Five Common Mouse Strains. Journal of Neurotrauma, 23(5), 635–659. https://doi.org/10.1089/neu.2006.23.635
Courtine, G., Gerasimenko, Y., Van Den Brand, R., Yew, A., Musienko, P., Zhong, H., Song, B., Ao, Y., Ichiyama, R. M., Lavrov, I., Roy, R. R., Sofroniew, M. V., & Edgerton, V. R. (2009). Transformation of nonfunctional spinal circuits into functional states after the loss of brain input. Nature Neuroscience, 12(10), 1333–1342. https://doi.org/10.1038/nn.2401
Diogo, C. C., Da Costa, L. M., Pereira, J. E., Filipe, V., Couto, P. A., Geuna, S., Armada-da-Silva, P. A., Maurício, A. C., & Varejão, A. S. P. (2019). Kinematic and kinetic gait analysis to evaluate functional recovery in thoracic spinal cord injured rats. Neuroscience & Biobehavioral Reviews, 98, 18–28. https://doi.org/10.1016/j.neubiorev.2018.12.027
Dominici, N., Keller, U., Vallery, H., Friedli, L., Van Den Brand, R., Starkey, M. L., Musienko, P., Riener, R., & Courtine, G. (2012). Versatile robotic interface to evaluate, enable and train locomotion and balance after neuromotor disorders. Nature Medicine, 18(7), 1142–1147. https://doi.org/10.1038/nm.2845
Garrick, J. M., Costa, L. G., Cole, T. B., & Marsillach, J. (2021). Evaluating Gait and Locomotion in Rodents with the CatWalk. Current Protocols, 1(8), e220. https://doi.org/10.1002/cpz1.220
Mathis, A., Mamidanna, P., Cury, K. M., Abe, T., Murthy, V. N., Mathis, M. W., & Bethge, M. (2018). DeepLabCut: Markerless pose estimation of user-defined body parts with deep learning. Nature Neuroscience, 21(9), 1281–1289. https://doi.org/10.1038/s41593-018-0209-y
Spinal cord injury. (2024, April 16). World Health Organization. https://www.who.int/news-room/fact-sheets/detail/spinal-cord-injury
Squair, J. W., Milano, M., De Coucy, A., Gautier, M., Skinnider, M. A., James, N. D., Cho, N., Lasne, A., Kathe, C., Hutson, T. H., Ceto, S., Baud, L., Galan, K., Aureli, V., Laskaratos, A., Barraud, Q., Deming, T. J., Kohman, R. E., Schneider, B. L., … Anderson, M. A. (2023). Recovery of walking after paralysis by regenerating characterized neurons to their natural target region. Science, 381(6664), 1338–1345. https://doi.org/10.1126/science.adi6412
Takeoka, A., Vollenweider, I., Courtine, G., & Arber, S. (2014). Muscle Spindle Feedback Directs Locomotor Recovery and Circuit Reorganization after Spinal Cord Injury. Cell, 159(7), 1626–1639. https://doi.org/10.1016/j.cell.2014.11.019
Winter, C. C., He, Z., & Jacobi, A. (2022). Axon Regeneration: A Subcellular Extension in Multiple Dimensions. Cold Spring Harbor Perspectives in Biology, 14(3), a040923. https://doi.org/10.1101/cshperspect.a040923
Zörner, B., Filli, L., Starkey, M. L., Gonzenbach, R., Kasper, H., Röthlisberger, M., Bolliger, M., & Schwab, M. E. (2010). Profiling locomotor recovery: Comprehensive quantification of impairments after CNS damage in rodents. Nature Methods, 7(9), 701–708. https://doi.org/10.1038/nmeth.1484
