Autoslug is a Baskin-affiliated organization focused on practical applications of AI, machine learning, and computer vision. Our mission is to create hands-on opportunities for students, enabling them to apply these technologies in real-world environments and move beyond theoretical understanding to explore the intersections of humans, data, and hardware.
Our ASL project aims to create an accessible American Sign Language gesture interpreter. This past year we developed an interpreter that recognizes the static ASL alphabet. Our goal for the upcoming year is to extend it to motion gestures that take full-body context and facial expressions into account. To support this, we will use an NVIDIA Jetson, a camera, and a pair of motion-capture gloves that we build ourselves to collect training data.
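The static-alphabet recognition described above can be sketched in miniature: once a hand tracker (for example, MediaPipe, which reports 21 (x, y) landmark points per hand) has extracted landmarks from a camera frame, a letter can be predicted by comparing them against labeled examples. The snippet below is a hypothetical nearest-neighbor sketch with synthetic two-point "landmarks", not our actual pipeline.

```python
import math

def flatten(landmarks):
    """Flatten a list of (x, y) landmark points into one feature vector."""
    return [coord for point in landmarks for coord in point]

def distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(landmarks, training_set):
    """Return the label of the training example nearest to `landmarks`."""
    features = flatten(landmarks)
    best = min(training_set, key=lambda ex: distance(features, flatten(ex[1])))
    return best[0]

# Tiny synthetic "training set": (label, landmark points). Real examples
# would hold 21 points per hand, one entry per letter per recording.
training = [
    ("A", [(0.10, 0.20), (0.30, 0.40)]),
    ("B", [(0.80, 0.90), (0.70, 0.60)]),
]

print(classify([(0.11, 0.21), (0.29, 0.41)], training))  # -> A
```

A real system would normalize landmarks for hand position and scale and use a trained model rather than raw nearest-neighbor lookup, but the input/output shape is the same.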
Modbot is a project focused on developing and researching a modular robot platform while building the skills of our members. Over the past year, we designed and built a unique modular system that lets parts be swapped freely, drawing on a range of technologies: well-established communication protocols such as I2C, industry-standard microcontrollers such as the RP2040, and custom power systems. Our goals for this year are to continue developing the physical systems and to establish long-term, self-motivated research goals for the group in the field of autonomous driving.
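One way hot-swapping over I2C can work is for the controller to scan the bus and map each responding address to a known module type. The sketch below is hypothetical (the address table is invented, and on real hardware the address list would come from an actual bus scan, e.g. `i2c.scan()` in MicroPython on an RP2040); here the scanned addresses are passed in directly so the lookup logic stands alone.

```python
# Hypothetical address-to-module table; each module type is assumed to
# respond at a fixed 7-bit I2C address.
MODULE_TABLE = {
    0x40: "motor-driver",
    0x48: "temperature-sensor",
    0x52: "power-monitor",
}

def identify_modules(scanned_addresses):
    """Map addresses found on the bus to known module names.

    Unknown addresses are returned separately so a newly attached,
    unrecognized module can be flagged instead of silently ignored.
    """
    known, unknown = [], []
    for addr in scanned_addresses:
        if addr in MODULE_TABLE:
            known.append((addr, MODULE_TABLE[addr]))
        else:
            unknown.append(addr)
    return known, unknown

known, unknown = identify_modules([0x40, 0x52, 0x5A])
print(known)    # [(64, 'motor-driver'), (82, 'power-monitor')]
print(unknown)  # [90]
```

Re-running the scan periodically is what makes swapping possible at runtime: a module that disappears from the scan can be deactivated, and one that appears can be initialized.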
The robot dog project is our collaboration with Professor Steve McGuire at HARE Labs to control a life-size robot dog. The project involves students in active research with a physical robot and integrates a wide range of skills: sensors for terrain perception, control theory for developing a locomotion model, simulation for verifying and testing our models, and AI for perception and control, among other fields. Our aim is to create a useful and robust locomotion model for the dog.
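A small taste of the control-theory side: a locomotion model is ultimately built from joint-level controllers, and the classic building block is a PD (proportional-derivative) law. The sketch below is an illustrative toy, not our controller; the gains, the unit-mass joint, and the Euler-integration simulation are all assumed for the example.

```python
def pd_control(target, position, velocity, kp=10.0, kd=1.0):
    """Torque command driving a joint toward `target`.

    The proportional term pulls toward the setpoint; the derivative
    term damps the resulting oscillation.
    """
    return kp * (target - position) - kd * velocity

def simulate(target, steps=1000, dt=0.01):
    """Euler-integrate a unit-mass, frictionless joint under PD control."""
    position, velocity = 0.0, 0.0
    for _ in range(steps):
        torque = pd_control(target, position, velocity)
        velocity += torque * dt   # unit mass: acceleration == torque
        position += velocity * dt
    return position

final = simulate(1.0)  # after 10 simulated seconds, settles near 1.0
print(round(final, 3))
```

Verifying a controller like this in simulation before running it on hardware mirrors the workflow described above: the model is tested against a simulated plant first, then transferred to the real dog.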