The Arduino UNO Q is here and it is a very exciting product that combines a Qualcomm Dragonwing QRB2210 processor and STM32U585 microcontroller, giving users the best of both worlds in a single convenient package. One of its best features is the ability of the SBC to send data seamlessly to the microcontroller for interfacing with other components. James Bruton took advantage of that to build an electric car that he can steer with his face.
Part of what makes the new UNO Q so compelling is the way it lets makers and engineers do both heavy lifting and low-level control. In this case, that heavy lifting is facial recognition and the low-level control is sending signals to the motor drivers. The Arduino App Lab made it easy for Bruton to take advantage of an Edge Impulse AI model to perform the facial recognition, then send the relevant data (steering commands) to the STM32, which tells the motor drivers how to move the motors.
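The article doesn't show how the steering commands actually travel from the Linux side to the STM32, so here is a minimal, hypothetical sketch of what that handoff could look like: a small binary frame (start byte, command ID, signed steering value, checksum) that a microcontroller could easily parse. The frame layout and constants are assumptions for illustration, not Bruton's or Arduino's actual protocol.

```python
import struct

# Hypothetical framing for a Linux-side -> MCU steering command.
# The real App Lab transport is not described in the article; this
# just shows one simple way to pack a command for a serial link.

START_BYTE = 0xA5
CMD_STEER = 0x01

def encode_steer_command(steering: int) -> bytes:
    """Pack a steering value in [-100, 100] into a 4-byte frame.

    Frame layout: [start][cmd][int8 steering][checksum], where the
    checksum is the low byte of the sum of the first three bytes.
    """
    steering = max(-100, min(100, steering))  # clamp out-of-range input
    body = struct.pack("BBb", START_BYTE, CMD_STEER, steering)
    checksum = sum(body) & 0xFF
    return body + bytes([checksum])

# Example: full left steer
print(encode_steer_command(-100).hex())  # → a5019c42
```

On the receiving end, the STM32 would wait for the start byte, verify the checksum, and translate the steering value into motor-driver signals for the steering rack.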
The vehicle itself is exactly the kind of thing Bruton excels at building. Its chassis is a combination of 3D-printed parts and GRP (glass-reinforced plastic) tubes, which are strong, stiff, and more affordable than carbon fiber. Powerful brushless motors spin the two rear wheels and two DC gearmotors actuate the steering rack for the front wheels.
Power comes from a hobby LiPo battery pack and there is a big e-stop switch for safety.The AI model running on the UNO Q monitors the location of Bruton’s face within the video frame, ignoring any faces that are small (and therefore far away, so it is safe to say they aren’t the driver).If Bruton leans to one side, his face moves in the frame and the car steers in that direction.
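The face-tracking logic described above can be sketched as a small function: ignore faces below a size threshold (too far away to be the driver), then map the remaining face's horizontal position in the frame to a steering value. The threshold and steering range here are assumptions; Bruton's actual Edge Impulse model output and mapping aren't published in this article.

```python
# Minimal sketch of the lean-to-steer mapping. Coordinates are
# normalized to the video frame (0.0-1.0). MIN_FACE_AREA is an
# assumed cutoff, not a value from the project.

MIN_FACE_AREA = 0.02  # faces under 2% of the frame count as "far away"

def face_to_steering(x_center, face_area):
    """Map a detected face to a steering value in [-1.0, 1.0].

    Returns None for faces too small to belong to the driver.
    """
    if face_area < MIN_FACE_AREA:
        return None  # distant face: not the driver, ignore it
    # A centered face means 0.0; leaning shifts it off-center.
    steering = (x_center - 0.5) * 2.0
    return max(-1.0, min(1.0, steering))

print(face_to_steering(0.75, 0.10))   # lean right → 0.5
print(face_to_steering(0.50, 0.005))  # tiny face  → None
```

In the real system this value would be re-computed every frame from the model's bounding-box output and forwarded to the STM32 as a steering command.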
He does, however, have to operate the throttle manually as there wasn’t time before our “From Blink to Think” product launch event for Bruton to implement face-based throttle control.