Understanding the neuromuscular basis of behavior requires measuring both kinematic and physiological data at high resolution in unconstrained conditions: a technically challenging goal. Here we present an integrated experimental-computational pipeline for measuring and quantifying body-part kinematics and muscle activity in freely behaving Drosophila melanogaster. We first present Spotlight, a closed-loop videography system that performs real-time tracking to record untethered flies at high spatial resolution (6 μm/pixel) and high frame rate (330 Hz) while also enabling optical recordings of limb muscle activity via a fluorescent calcium reporter. To analyze these massive datasets without manual image annotation, we introduce PoseForge, a framework that combines morphologically accurate biomechanical simulations, used to generate synthetic training data, with contrastive self-supervised learning to infer 3D keypoints and dense body-part segmentation from a single camera view. Using the resulting 3D kinematic data, we can replay recorded behaviors in a biomechanical digital twin, NeuroMechFly, to infer the forces generated and experienced by the fly’s limbs. Finally, we illustrate the capability of our system to optically record muscle activity: we show how the legs’ long-tendon muscles activate upon mechanical vibration, possibly to engage gripping and maintain a stable posture. Taken together, this workflow enables scalable, high-resolution measurement and modeling of unconstrained, natural behavior.
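The closed-loop tracking principle behind a system like Spotlight can be illustrated with a minimal sketch: segment the animal in each frame, compute its centroid, and return the offset needed to re-center the field of view. This is a hypothetical illustration of the general idea only; the abstract does not describe Spotlight's actual algorithm, and the function name, threshold, and blob model here are assumptions.

```python
import numpy as np

def track_offset(frame, thresh=50):
    """Locate a dark blob (the fly) on a bright background and return the
    (dy, dx) offset from the image center needed to re-center it.
    Hypothetical sketch of a closed-loop tracking step, not Spotlight's
    actual implementation."""
    mask = frame < thresh                 # dark pixels are treated as the fly
    if not mask.any():
        return (0.0, 0.0)                 # no target found: hold position
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()         # blob centroid in pixel coordinates
    h, w = frame.shape
    # Offset from the image center; a real system would convert this to a
    # stage or camera command each frame to keep the animal centered.
    return (cy - h / 2.0, cx - w / 2.0)

# Synthetic frame: bright background with a dark 5x5 "fly" centered at (30, 70)
frame = np.full((100, 100), 200, dtype=np.uint8)
frame[28:33, 68:73] = 10
dy, dx = track_offset(frame)              # → (-20.0, 20.0)
```

In a real 330 Hz loop, this offset would drive a motorized stage or mirror so the untethered fly stays within the high-resolution field of view.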
Toward terminological clarity in digital biomarker research
Digital biomarker research has generated thousands of publications demonstrating associations between sensor-derived measures and clinical conditions, yet clinical adoption remains negligible. We identify a foundational




