
Assistive products for inclusive mobility
While studying optics, I often thought about a simple question: if light helps us navigate everything, then how do people who cannot see navigate? That curiosity led me to two projects for visually impaired users: Navia, a mobility support robot, and Sightify, a pair of glasses that convert images into sound. Each product is a way of testing how technology can make up for the missing “light.”
NAVIA
Mobility assistive robot for the visually impaired
At the end of 2022, I began my first personal project: Navia, a robot that supports mobility for visually impaired users. Through simple experiments with Lego and sensors, I gradually learned that for any device built for a vulnerable community to work safely, every detail must be precise. That was also when I began to see more clearly how AI and robotics could become tools for improving quality of life, starting with very small but meaningful changes.



During development, Navia was equipped with four ultrasonic sensors that continuously measured distances so it could detect obstacles and navigate. A misalignment of just a few centimeters, or a wrong parameter in the algorithm, could make the robot respond inaccurately. There were days when fixing a single line of code made the whole system run more smoothly, and nights when I stayed up checking everything again and again because of one tiny error.
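
To make this concrete, here is a minimal Python sketch, under assumptions of my own, of how four distance readings might be turned into a motion command. The sensor layout, the thresholds, and the read_distance() stub are illustrative placeholders, not Navia's actual control code.

```python
# Sketch of ultrasonic-based obstacle avoidance. Assumes four sensors
# (front, left, right, rear) reporting distances in centimeters.
# read_distance() is a stub; on real hardware it would time an
# ultrasonic echo (e.g., from an HC-SR04 module).
import random

SENSORS = ["front", "left", "right", "rear"]
STOP_CM = 30   # treat anything closer than this as blocking
SLOW_CM = 80   # slow down inside this band

def read_distance(sensor: str) -> float:
    """Stand-in for a real sensor read; returns a random distance."""
    return random.uniform(10.0, 200.0)

def decide(readings: dict[str, float]) -> str:
    """Map the four distances to a simple motion command."""
    if readings["front"] < STOP_CM:
        # Path ahead is blocked: turn toward the more open side.
        return "turn_left" if readings["left"] > readings["right"] else "turn_right"
    if readings["front"] < SLOW_CM:
        return "forward_slow"
    return "forward"

if __name__ == "__main__":
    readings = {s: read_distance(s) for s in SENSORS}
    print(readings, "->", decide(readings))
```

Even in a toy version like this, shifting STOP_CM by a few centimeters visibly changes how the robot behaves, which is exactly the kind of sensitivity the real system kept demonstrating.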


When the two Navia units were completed, I organized a simulation workshop where users could try them out. It was not just a product launch; it was a chance to observe how participants interacted with the robots, listen to their feedback, and adjust Navia until its behavior felt natural. Watching participants move confidently with the robot’s support made me realize that creativity only becomes meaningful when it helps someone.






Video: Testing the robot.
SIGHTIFY
Smart assistive glasses for the visually impaired
Sightify came from a very simple question: if light helps us see the world, how can technology “translate” that world for people who cannot see? From that idea, I developed a concept for smart glasses that support visually impaired users through AI-based computer vision and real-time audio feedback, allowing them to “hear” the surrounding space.
I was responsible for the mechanical design, for integrating the image-processing system, and for training basic object-recognition models that describe objects, distances, and directions through sound. I also built a voice-command interface so users could interact naturally without touching the device.
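
To illustrate the pipeline, here is a minimal Python sketch of the detect-and-describe loop. The Detection type, detect_objects(), and speak() are hypothetical stand-ins for the trained vision model and the text-to-speech output on the real glasses.

```python
# Sketch of a vision-to-audio loop: detect objects in a frame,
# phrase them as one short sentence, and speak it. All names here
# are illustrative placeholders, not Sightify's actual code.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g., "door", "person"
    distance_m: float  # estimated range to the object
    bearing: str       # "left", "ahead", or "right"

def detect_objects(frame) -> list[Detection]:
    """Stand-in for the vision model; returns canned detections."""
    return [Detection("door", 2.5, "ahead"), Detection("person", 1.0, "left")]

def describe(detections: list[Detection]) -> str:
    """Phrase detections as speech, nearest object first."""
    ordered = sorted(detections, key=lambda d: d.distance_m)
    return "; ".join(
        f"{d.label}, {d.distance_m:.1f} meters {d.bearing}" for d in ordered
    )

def speak(text: str) -> None:
    """Stand-in for text-to-speech output."""
    print(text)

if __name__ == "__main__":
    # Prints: person, 1.0 meters left; door, 2.5 meters ahead
    speak(describe(detect_objects(frame=None)))
```

A small choice like announcing the nearest object first matters as much as raw model accuracy, because the sentence a user hears has to be short, ordered, and immediately actionable.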
Developing Sightify was technical work, but it also pushed me to think harder about the real needs of the visually impaired community. What matters is not only the accuracy of the model, but whether visual information can be converted into audio experiences that are genuinely useful and easy to understand. This process of “translating light into sound” helped me better understand how physics, AI, and design reinforce one another when they work toward the same goal: creating technology that can accompany and empower its users.








Seeing with Hands, Creating with Heart
Alongside technology, I have been involved in volunteer activities such as teaching painting to people with visual impairments. These sessions are not about recreating images, but about exploring form, texture, and emotion through touch, movement, and imagination.
Working with visually impaired learners challenged my assumptions about how art is perceived and created. I learned to communicate ideas without relying on sight, to slow down, and to listen more carefully. Each session became a shared space where creativity was guided by sensation rather than vision. This experience deepened my belief that innovation, whether artistic or technological, should begin with empathy and grow from genuine human connection.