UC Berkeley AI Hackathon

Smart glasses that served as an IoT-enabled travel companion & guide for the visually impaired.

When I first heard about the latest developments in LLMs and ML, I was excited by their potential to serve as intelligent yet simple-to-use tools that greatly improve human-computer interaction.


For the hackathon, my teammate and I designed a Smart Glasses concept that served as an IoT-enabled travel companion & guide for the visually impaired.

We built the hardware from low-cost components: a camera, a microphone, a bone-conducting speaker, a GPS module, and more. To tie all the data together, we created a unified format that packaged every sensor's readings for the LLM. Users could ask questions into the mic, such as “Is there a water bottle in front of me?” or “What street am I on?”
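As a sketch of the idea (the field names, helper function, and prompt wording below are illustrative, not our exact schema), each query bundled the latest sensor readings into one structured context block for the LLM:

```python
import json

def build_llm_prompt(question, frame_caption, gps_fix, depth_summary):
    """Bundle the latest sensor readings into one structured context block.

    Every field name here is illustrative; the real format unified more
    sensor streams than shown.
    """
    context = {
        "camera": frame_caption,           # e.g. from an image-captioning model
        "gps": {"lat": gps_fix[0], "lon": gps_fix[1]},
        "depth": depth_summary,            # e.g. "nearest obstacle ~0.6 m ahead"
    }
    return (
        "You are a guide for a visually impaired user wearing smart glasses.\n"
        f"Sensor context: {json.dumps(context)}\n"
        f"User question: {question}\n"
        "Answer briefly and concretely."
    )

print(build_llm_prompt(
    "Is there a water bottle in front of me?",
    "a desk with a laptop and a blue water bottle",
    (37.8719, -122.2585),
    "nearest obstacle ~0.6 m ahead",
))
```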


We also integrated a pre-trained depth-estimation model with image processing that builds a bird's-eye-view (BEV) grid map, and ran the A* search algorithm over that grid to give our users a path-finding feature.
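A minimal sketch of how those pieces can fit together, under stated assumptions: the depth map is already metric (ours came from a pre-trained model), the intrinsics `fx`/`cx`, cell size, and grid dimensions are placeholder values, and A* runs 4-connected with a Manhattan heuristic:

```python
import heapq
import numpy as np

def depth_to_bev_grid(depth_m, fx, cx, cell_size=0.1, grid_dim=80):
    """Project a metric depth map into a top-down occupancy grid.

    0 = free, 1 = obstacle; the camera sits at the bottom-center cell.
    fx/cx are camera intrinsics; all numbers here are placeholders.
    """
    grid = np.zeros((grid_dim, grid_dim), dtype=np.uint8)
    h, w = depth_m.shape
    for v in range(h):
        for u in range(w):
            z = depth_m[v, u]                      # forward distance in meters
            if not 0.1 < z < grid_dim * cell_size:
                continue
            x = (u - cx) * z / fx                  # lateral offset in meters
            row = grid_dim - 1 - int(z / cell_size)
            col = grid_dim // 2 + int(x / cell_size)
            if 0 <= col < grid_dim:
                grid[row, col] = 1                 # mark the cell as occupied
    return grid

def astar(grid, start, goal):
    """Shortest free path on a 4-connected occupancy grid (Manhattan heuristic)."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), start)]
    came_from, g_cost = {start: None}, {start: 0}
    while open_heap:
        _, node = heapq.heappop(open_heap)
        if node == goal:                           # walk parents back to the start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nxt[0] < grid.shape[0] and 0 <= nxt[1] < grid.shape[1]
                    and grid[nxt] == 0
                    and g_cost[node] + 1 < g_cost.get(nxt, float("inf"))):
                g_cost[nxt] = g_cost[node] + 1
                came_from[nxt] = node
                heapq.heappush(open_heap, (g_cost[nxt] + h(nxt), nxt))
    return None                                    # goal unreachable
```

On a 4-connected grid the Manhattan heuristic is admissible, so `astar` returns a shortest obstacle-free path, which the glasses could then read out through the bone-conducting speaker.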