The following project description is taken from the Interaction Lab website.
Infants engage in motor babbling, which allows them to explore their space and learn which movements produce desired outcomes. Reduced motor babbling in infants can lead to developmental delays. Our goal is to develop a socially assistive, non-contact, infant-robot interaction system that provides contingent positive feedback to increase exploration and expand early movement practice.
Towards this end, we are collaborating with physical therapists to create approaches that predict the developmental status of infants using wearable sensors; running user studies that explore various robot rewards contingent on the infant's activity, as well as measuring the infant's ability to mimic the robot; and using reinforcement learning to adjust the difficulty of the task presented by the robot, increasing the infant's engagement with the task.
Development Details
My contributions to the project include:
- Detecting and tracking two Sphero SPRK+ robots with a wall-mounted camera
  - Object detection: applied transfer learning to YOLOv3 pre-trained on the MS-COCO dataset (see the first sketch after this list)
  - Visual tracking:
    - With SiamRPN: since SiamRPN outputs a tracking confidence score, detection runs only when the confidence falls below a threshold (second sketch)
    - With the CSRT tracker: CSRT does not output a tracking confidence, so detection runs at a predefined frequency to update the tracked location (third sketch)
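
The description above doesn't name a training framework, so the following is only a minimal sketch of the transfer-learning step, using the Ultralytics API (which ships MS-COCO pre-trained YOLOv3 weights). The checkpoint name `yolov3u.pt`, the dataset config `sphero.yaml` (a single `sphero` class), and all hyperparameters are illustrative assumptions, not the project's actual setup.

```python
# Transfer-learning sketch (not the project's actual training code).
# Assumes: `pip install ultralytics` and a hypothetical dataset config
# `sphero.yaml` describing images labeled with one class: sphero.
from ultralytics import YOLO

model = YOLO("yolov3u.pt")  # YOLOv3 weights pre-trained on MS-COCO

# Fine-tune on the Sphero dataset; freezing the first 10 layers keeps the
# COCO-learned backbone features and mostly adapts the detection head.
model.train(data="sphero.yaml", epochs=50, imgsz=640, freeze=10)

# Inference on a single frame returns boxes for the sphero class.
results = model("frame.jpg")
```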
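For the SiamRPN branch, a confidence-gated loop might look like the sketch below. The `tracker` object is assumed to follow a PySOT-style `init`/`track` interface returning a box and a score, `detect` stands in for the fine-tuned YOLOv3 detector, and the 0.8 threshold is an illustrative value.

```python
import cv2

CONF_THRESHOLD = 0.8  # assumed value; tune per deployment

def track_with_siamrpn(video_path, tracker, detect, init_bbox):
    """Confidence-gated tracking loop (sketch, one robot).

    tracker: SiamRPN-style object with init(frame, bbox) and
             track(frame) -> (bbox, score) (assumed interface).
    detect:  callable frame -> (x, y, w, h), e.g. the YOLOv3 detector above.
    """
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        return
    tracker.init(frame, init_bbox)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        bbox, score = tracker.track(frame)
        # Detection is expensive, so it runs only when the tracker's own
        # confidence drops below the threshold (likely drift or occlusion).
        if score < CONF_THRESHOLD:
            bbox = detect(frame)
            tracker.init(frame, bbox)  # re-initialize on the fresh detection
        # ... use bbox (e.g., map to the Sphero's position in the room)
    cap.release()
```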
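For the CSRT branch, OpenCV's `cv2.TrackerCSRT_create` (from `opencv-contrib-python`) gives no confidence output, so the sketch below simply re-runs detection every `REDETECT_EVERY` frames; that interval and the `detect` callable are assumptions. Tracking both Spheros would use two such tracker instances.

```python
import cv2

REDETECT_EVERY = 30  # assumed: re-detect every 30 frames (~1 s at 30 fps)

def track_with_csrt(video_path, detect):
    """Fixed-frequency re-detection loop (sketch, one robot).

    detect: callable frame -> (x, y, w, h), e.g. the YOLOv3 detector above.
    """
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        return
    bbox = detect(frame)
    tracker = cv2.TrackerCSRT_create()
    tracker.init(frame, bbox)

    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame_idx += 1
        if frame_idx % REDETECT_EVERY == 0:
            # CSRT gives no confidence signal, so periodically replace the
            # tracker state with a fresh detection to correct drift.
            bbox = detect(frame)
            tracker = cv2.TrackerCSRT_create()
            tracker.init(frame, bbox)
        else:
            ok, bbox = tracker.update(frame)
        # ... use bbox
    cap.release()
```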