Automated Live Streaming: Developing a Long-Range Robotic Tripod System with Machine Vision

This study presents a long-range robotic tripod system designed to automate live streaming for institutional and educational events. The system integrates a motorized carbon-fiber tripod, a high-power zoom camera, and a real-time computer vision pipeline to reduce reliance on manual camera operators. Using YOLOv7 for object detection and DeepSORT for multi-object tracking, the tripod autonomously pans and tilts to follow subjects during live coverage. Stepper motors controlled by DRV8825 drivers ensure smooth and precise camera movements. Performance evaluation demonstrated a consistent tracking accuracy of 92% and latency below 100 ms at distances up to 100 meters. User testing with NASA-TLX showed a 37% decrease in operator workload, while QUIS surveys reported an average usability score of 8.2/10. These results confirm that the proposed system can enhance media production efficiency by offering reliable, high-quality automated live streaming in resource-limited environments.
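The core of such a tracking-to-motion pipeline is mapping the tracked subject's bounding-box offset from the frame center to pan/tilt stepper commands. The sketch below illustrates this mapping under stated assumptions; the step count, microstepping factor, frame size, and field-of-view values are illustrative placeholders, not parameters from the paper, and the detection/tracking stage (YOLOv7 + DeepSORT) is assumed to supply the bounding box.

```python
# Hypothetical sketch: converting a tracked bounding box into pan/tilt
# microstep commands for DRV8825-driven steppers. All constants below
# are assumptions for illustration, not values from the study.

STEPS_PER_REV = 200          # typical stepper motor (assumed)
MICROSTEPS = 32              # DRV8825 supports up to 1/32 microstepping
DEG_PER_STEP = 360.0 / (STEPS_PER_REV * MICROSTEPS)

def center_error(bbox, frame_w, frame_h):
    """Pixel offset of the bbox center from the frame center."""
    x, y, w, h = bbox
    return (x + w / 2 - frame_w / 2, y + h / 2 - frame_h / 2)

def error_to_steps(err_px, frame_px, fov_deg):
    """Map a pixel error to microsteps via the camera's angular FOV."""
    deg = err_px * (fov_deg / frame_px)
    return round(deg / DEG_PER_STEP)

def pan_tilt_steps(bbox, frame_w=1920, frame_h=1080,
                   hfov_deg=6.5, vfov_deg=3.7):
    """Pan/tilt microstep counts needed to re-center the subject.
    The narrow FOV values mimic a long-range zoom lens (illustrative)."""
    ex, ey = center_error(bbox, frame_w, frame_h)
    return (error_to_steps(ex, frame_w, hfov_deg),
            error_to_steps(ey, frame_h, vfov_deg))
```

In a real deployment this step command would typically be smoothed (e.g., with a dead-band or PID loop) before being sent to the driver, so that detection jitter does not translate into visible camera shake.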
