DEVICE UX
Designing How the Device Communicates

The device has no screen, yet it constantly communicates with the driver. My role was to define how it does that. I designed the full system of lights and sounds — colour, timing, rhythm, and intensity — that translates internal states into signals drivers can understand instantly. The aim was clarity without distraction: feedback that is noticeable when needed, but otherwise stays in the background.

The system covers alerts, pairing, connection changes, power states, and user actions. Instead of treating these as separate effects, I structured them as one coherent signal language. Events were grouped by intent — brief nudge, caution, urgency, confirmation, or background status — so similar situations behave in consistent ways. Drivers recognise patterns over time, rather than needing to interpret each signal from scratch.
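The grouping described above can be pictured as a simple event-to-intent mapping. This is a hypothetical sketch, not the shipped taxonomy; the event names and categories are illustrative only.

```typescript
// Hypothetical sketch: every device event is assigned one signal intent,
// so events in the same group share the same cue behaviour.
type Intent = "nudge" | "caution" | "urgency" | "confirmation" | "status";

// Illustrative event names; the real event set is product-specific.
const intentOf: Record<string, Intent> = {
  zoneEntered: "nudge",
  speedCameraAhead: "caution",
  hazardReported: "urgency",
  pairingComplete: "confirmation",
  reconnecting: "status",
};

function intentFor(event: string): Intent | undefined {
  return intentOf[event];
}
```

Keeping the mapping as data rather than scattered conditionals is what makes the signal language easy to audit for consistency.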

ALERTS, STATES & CONTINUOUS FEEDBACK
Making the Device Feel Understandable

Driving alerts required clear differentiation without adding stress. Variations in pitch, rhythm, duration, and colour distinguish camera zones, traffic, hazards, and speed events. Lower-impact situations use short, light cues; higher urgency uses deeper tones and stronger light. Recognition comes from pattern, not loudness.
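One way to express that differentiation is a table of cue parameters per urgency tier, with deeper pitch and stronger light as urgency rises. The values below are illustrative placeholders, not the tuned production settings.

```typescript
// Hypothetical sketch: each urgency tier maps to a cue specification.
// Numbers are placeholders for illustration, not the shipped tuning.
interface Cue {
  pitchHz: number;    // lower pitch reads as more urgent
  durationMs: number; // longer cue for higher urgency
  colour: string;
  brightness: number; // 0..1
}

const cues: Record<string, Cue> = {
  nudge:   { pitchHz: 880, durationMs: 120, colour: "white", brightness: 0.3 },
  caution: { pitchHz: 660, durationMs: 250, colour: "amber", brightness: 0.6 },
  urgency: { pitchHz: 440, durationMs: 500, colour: "red",   brightness: 1.0 },
};
```

Because the tiers are ordered along the same physical axes, drivers can read relative urgency from any one parameter, even before they learn the specific pattern.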

Beyond alerts, the device also communicates quieter states: pairing progress, reconnection, background activity, and power transitions. Soft pulses, confirmation tones, and gradual light changes make invisible processes feel clear and controlled. Subtle “heartbeat” signals confirm the device is active without demanding attention. Together, these behaviours make the product feel reliable and predictable rather than technical or opaque.

PROTOTYPING THE BEHAVIOUR
A Web-Based Simulator to Design Before Hardware

A key part of the project was developing a way to design and test behaviour before hardware was ready. I built a web-based device simulator with Cursor-generated code, letting light patterns, timing sequences, and sound combinations be explored interactively in the browser.

This removed the usual slow loop between design, engineering, and the hardware manufacturer. Instead of waiting for firmware updates or physical prototypes, behaviours could be tested, compared, and refined in real time. The simulator made it possible to evaluate the system as a whole — not just individual signals — and significantly accelerated iteration.
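The core idea behind a simulator like this can be sketched as a small function that expands a declarative pattern into a timeline of on/off steps, which a browser preview (or, later, firmware) can play back. This is a minimal sketch under my own assumptions about the tool's internals, not the actual simulator code.

```typescript
// Hypothetical sketch: a light or sound behaviour described as data,
// expanded into a timeline that any playback layer can drive.
interface Pattern {
  pulses: number; // number of flashes or beeps
  onMs: number;   // duration of each pulse
  offMs: number;  // gap between pulses
}

interface Step {
  at: number;  // milliseconds from pattern start
  on: boolean; // light/sound on or off
}

function expand(p: Pattern): Step[] {
  const steps: Step[] = [];
  let t = 0;
  for (let i = 0; i < p.pulses; i++) {
    steps.push({ at: t, on: true });
    t += p.onMs;
    steps.push({ at: t, on: false });
    t += p.offMs;
  }
  return steps;
}

// A gentle two-pulse "heartbeat", for example:
const heartbeat = expand({ pulses: 2, onMs: 80, offMs: 420 });
```

Describing behaviour as data is what makes side-by-side comparison cheap: two candidate patterns are just two objects, and the same timeline can later drive LEDs instead of the browser.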

Building this tool was my own initiative and involved some risk, as AI-assisted coding workflows were still new at the time. It proved highly effective, becoming a practical bridge between design intent and technical implementation.

RESULT
A Consistent System of Signals

The outcome is a consistent language of light and sound that lets the device communicate clearly without a screen. Drivers learn the signals through use rather than instruction. Feedback supports awareness and safety, while remaining calm and unobtrusive.
