
Assistive products for inclusive mobility

While studying optics, I often thought about a simple question: if light helps us navigate everything, then how do people who cannot see navigate? That curiosity led me to two projects for visually impaired users: Navia, a mobility support robot, and Sightify, a pair of glasses that convert images into sound. Each product is a way of testing how technology can make up for the missing “light.”

NAVIA

Mobility assistive robot for the visually impaired

At the end of 2022, I began my first personal project: Navia, a robot that supports mobility for visually impaired users. From simple experiments using Lego and sensors, I gradually understood that for any device made for vulnerable communities to work safely, every detail must be absolutely precise. That was also when I started to see more clearly how AI and robotics could become tools to improve quality of life, beginning with very small but meaningful changes.


During development, Navia was equipped with four ultrasonic sensors that continuously measure distance for obstacle detection and navigation. A misalignment of just a few centimeters, or a single wrong parameter in the algorithm, could make the robot respond inaccurately. Some days, fixing a single line of code made the entire system run more smoothly; other nights I stayed up late rechecking everything because of one tiny error.
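The sensing loop described above can be sketched in a few lines. Everything here is a hypothetical illustration: the function names, thresholds, and steering rule are my assumptions for the sake of the example, not Navia's actual firmware.

```python
from statistics import median

# Illustrative thresholds (assumed, not Navia's real calibration values).
STOP_CM = 30   # closer than this: avoid immediately
SLOW_CM = 80   # closer than this: reduce speed

def filter_reading(samples_cm):
    """Median-filter raw ultrasonic samples to reject spike noise,
    which cheap ultrasonic sensors produce frequently."""
    return median(samples_cm)

def decide(front_cm, left_cm, right_cm):
    """A minimal rule-based avoidance policy (illustrative only):
    steer toward the side with more clearance, else slow or go forward."""
    if front_cm < STOP_CM:
        return "turn_left" if left_cm > right_cm else "turn_right"
    if front_cm < SLOW_CM:
        return "slow"
    return "forward"
```

For example, `decide(filter_reading([120, 118, 400, 119]), 60, 90)` would discard the 400 cm spike and continue forward. The point of the sketch is the one that mattered in practice: a single wrong threshold or an unfiltered spike changes the robot's behavior.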


When the two Navia units were completed, I organized a simulation workshop for users to try them out. It was not just a product introduction event, but also a moment for me to observe how they interacted, listen to their feedback, and adjust Navia to make its behavior feel natural. Watching participants move confidently with the robot’s support made me realize that creativity only becomes meaningful when it helps someone.

SIGHTIFY

Smart assistive glasses for the visually impaired

Sightify came from a very simple question: if light helps us see the world, then how can technology “translate” that world for people who cannot see? From that idea, I developed a concept for smart glasses that support visually impaired users through AI computer vision and real-time audio feedback, allowing them to “hear” the surrounding space.


I was responsible for mechanical design, integrating the image processing system, and training basic object recognition models to describe objects, distances, or directions through sound. At the same time, I built a voice-command interface so users could interact more naturally without touching the device.
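The step from recognition to audio can be sketched as below. This is a hypothetical illustration of the idea, not Sightify's real code: the detection format `(label, distance, bearing)` and the wording of the phrases are assumptions I introduce for the example.

```python
# Hypothetical sketch: convert object detections into spoken descriptions,
# nearest object first, so the most urgent information is heard first.

def bearing_to_words(bearing_deg):
    """Map a horizontal bearing (degrees, 0 = straight ahead) to a phrase."""
    if bearing_deg < -15:
        return "on your left"
    if bearing_deg > 15:
        return "on your right"
    return "ahead"

def describe(detections):
    """Turn (label, distance_m, bearing_deg) tuples into short sentences
    ready for a text-to-speech engine, sorted nearest-first."""
    phrases = []
    for label, dist_m, bearing in sorted(detections, key=lambda d: d[1]):
        phrases.append(f"{label} {bearing_to_words(bearing)}, about {dist_m:.1f} m")
    return phrases
```

For instance, `describe([("door", 3.2, 20), ("chair", 1.4, -30)])` announces the chair before the door. The design choice the sketch highlights is ordering by distance: for a user who cannot see, which object is announced first matters as much as what is announced.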


Developing Sightify was as much about understanding the real needs of the visually impaired community as it was about the technical work itself. What matters is not only the accuracy of the model but the ability to convert visual information into audio experiences that are genuinely useful and easy to understand. This process of “translating light into sound” helped me better understand the roles of physics, AI, and design when they all work toward the same goal: creating technology that can accompany and empower users.


Engagement in scientific research

My experiences with Navia and Sightify made me want to go further: not only making things work, but understanding the essence of each phenomenon. That was also part of my motivation to join research projects with professors and senior scientists, where I had the chance to approach science in a more systematic and rigorous way.

1. High-Pressure Melting of FCC Argon and Neon:

Under the supervision of Assoc. Prof. Nguyen Quang Hoc, I developed theoretical models for rare-gas crystals (Ar, Ne) under high pressure using the Statistical Moment Method and the Work–Heat Equivalence Principle. I calculated binding energies, lattice vibrations, and melting points up to 350 GPa, then compared them with AIMD simulations and experimental data. The publication in Computational Materials Science gave me a deeper understanding of condensed matter and of the precision required in theoretical modeling.

2. Light Focusing Through an Absorbing Interface:

With Dr. Do Mai Trang, I developed a model to simulate light focusing through an absorbing interface based on the Debye–Török–Fedrov–Nakajima framework. I implemented the simulation using Gauss–Legendre quadrature and analyzed intensity attenuation and focal shift when the medium has weak absorption. It was the first time I clearly saw how the sophistication of optical theory can be “translated” into simulations and concrete data.
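The numerical technique named above, Gauss–Legendre quadrature, can be shown on a toy problem. This is not the actual diffraction integral from the project; it is a minimal sketch using an exponentially damped integrand with a known closed form, chosen only to mimic how weak absorption attenuates contributions across the aperture angle.

```python
import numpy as np

def gauss_legendre_integral(f, a, b, n=32):
    """Integrate f over [a, b] with n-point Gauss-Legendre quadrature."""
    x, w = np.polynomial.legendre.leggauss(n)  # nodes/weights on [-1, 1]
    t = 0.5 * (b - a) * x + 0.5 * (b + a)      # map nodes to [a, b]
    return 0.5 * (b - a) * np.sum(w * f(t))

# Toy integrand: an oscillation damped by a weak absorption factor.
alpha = 0.2
approx = gauss_legendre_integral(
    lambda th: np.exp(-alpha * th) * np.cos(th), 0.0, np.pi / 2
)
# Closed form of the same integral, for checking the quadrature:
# integral of e^{-a t} cos t over [0, pi/2] = (e^{-a pi/2} + a) / (1 + a^2)
exact = (np.exp(-alpha * np.pi / 2) + alpha) / (1 + alpha**2)
```

Because the integrand is smooth, a few dozen nodes already reproduce the exact value to near machine precision, which is exactly why this quadrature is the standard choice for aperture-type integrals.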

3. Refractive Index and Absorption of Olive Oil:

In research with Dr. Le Canh Trung, I used a Michelson interferometer to measure refractive index as a function of wavelength, and analyzed absorption and transmission spectra with a NIR spectrometer. Processing the data with the Beer–Lambert law and the Sellmeier equation gave me hands-on experience with precise optical experiments and opened a perspective on applying optics to the study of biological materials.
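The two relations used in the data processing can be sketched as follows. Note the loud caveat: the Sellmeier coefficients below are the standard published values for fused silica, used only as a stand-in, since the fitted coefficients for olive oil are not given here.

```python
import numpy as np

# Three-term Sellmeier coefficients for FUSED SILICA (textbook values),
# NOT fitted values for olive oil -- a stand-in to show the computation.
B = (0.6961663, 0.4079426, 0.8974794)
C = (0.0684043**2, 0.1162414**2, 9.896161**2)  # wavelengths squared, um^2

def sellmeier_n(lam_um):
    """Refractive index n(lambda) from the three-term Sellmeier equation:
    n^2 = 1 + sum_i B_i * lambda^2 / (lambda^2 - C_i)."""
    l2 = lam_um**2
    n2 = 1.0 + sum(b * l2 / (l2 - c) for b, c in zip(B, C))
    return np.sqrt(n2)

def beer_lambert_transmittance(absorbance):
    """Beer-Lambert law: T = 10**(-A), where A = epsilon * l * c."""
    return 10.0 ** (-absorbance)
```

In practice the workflow runs the other way: measured n(lambda) points are fitted to extract the B and C coefficients, and measured transmittance is inverted to absorbance before applying Beer–Lambert.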

At first, the dense theoretical background was overwhelming. But reading papers, running simulations, and directly experimenting in the lab helped me see the “structure” of scientific research: theory provides the path, simulations are the trial frame, and experiments are where every detail must align to produce meaningful results. Thanks to this, I improved my reasoning skills, my precision in handling experiments, and my patience when I had to repeat a procedure many times because of a tiny error.


What I realized in the end is that research is not only numbers or graphs. It is the process of placing each piece, whether data, intuition, or understanding, in the right spot so that the scientific picture becomes complete. And it is that very process that nurtures my curiosity, excitement, and persistence.

The Quan Nguyen


thequandtm2007@gmail.com
(+84) 37 367 2224

Hanoi, Vietnam


© 2026 by The Quan Nguyen.

