Abstract
The capability to track objects in low-light environments such as nighttime is crucial for numerous real-world applications. However, previous Multi-Camera Multi-Target (MCMT) tracking methods focus primarily on daytime tracking under favorable lighting, overlooking the challenges posed by low-light conditions. The main difficulty of tracking under low-light conditions is the lack of detailed appearance features in the visible modality. To address this issue, we incorporate the infrared modality into the MCMT tracking framework to provide complementary information. We construct the first Multi-modality (RGBT) Multi-camera Multi-target tracking dataset, named M3Track, which contains sequences captured in low-light environments, laying a solid foundation for all-day multi-camera tracking. Based on the proposed dataset, we propose an All-Day Multi-Camera Multi-Target tracking network, termed ADMCMT. Specifically, we propose an All-Day Mamba Fusion (ADMF) module to adaptively fuse information from different modalities. Within ADMF, the Lighting Guidance Model (LGM) extracts lighting-relevant information to guide the fusion process. Furthermore, the Nearby Target Collection (NTC) strategy is designed to enhance tracking accuracy by leveraging information from objects surrounding the targets. Experiments conducted on M3Track demonstrate that ADMCMT exhibits strong generalization across different lighting conditions. The code will be released at https://github.com/QTRACKY/ADMCMT.