Visualising Voice: A UI/UX Exploration with Speek
UI/UX
BRAND
PRODUCT
Speek is a conceptual AR app created for a university project, designed for the Apple Vision Pro. It helps Deaf users interpret vocal tone and volume through real-time, audio-reactive visuals. The project explores accessible UI design and creative experimentation using TouchDesigner.
Initial Research
I conducted initial user research by engaging with members of the Deaf community and consulting with AR designers. This helped me understand the unique needs of Deaf users and explore effective ways to meet those needs within augmented reality experiences.
Visualising Speech
To bring Speek's dynamic visuals to life, I used TouchDesigner to create real-time, audio-reactive graphics. Knowing this visual would be central to both the brand and the UI, I focused on a sleek, engaging design while ensuring it remained accessible and legible.
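The project doesn't specify how audio drives the graphics, but the core idea can be sketched in a few lines. The function below is a hypothetical mapping (the names and scaling factors are my own, not from the project): it takes a buffer of audio samples, estimates loudness via RMS, and returns parameters a visual network like the one in TouchDesigner could consume.

```python
import math

def audio_to_visual_params(samples, base_radius=0.2, max_radius=1.0):
    """Map a buffer of audio samples (floats in [-1, 1]) to visual parameters.

    Returns a dict of illustrative parameters: a radius driven by loudness
    and an opacity floor that keeps quiet speech legible. The normalisation
    factor below is an assumption, not a value from the project.
    """
    if not samples:
        return {"radius": base_radius, "opacity": 0.3}
    # RMS loudness of the buffer, a common proxy for perceived volume.
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    loudness = min(rms * 4.0, 1.0)  # crude normalisation into [0, 1]
    return {
        "radius": base_radius + (max_radius - base_radius) * loudness,
        "opacity": 0.3 + 0.7 * loudness,
    }
```

In a TouchDesigner network, a function like this would typically run per frame, with the returned values wired into the size and transparency of the on-screen visual.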
Iterative Approach
Designing for AR was an entirely new experience for me. Taking an iterative approach was crucial to refining the design into an optimal experience.

Lightweight Interface
By integrating minimal bezels with transparent design components, the UI achieves an airy and unobtrusive feel. This design approach aligns with feedback from user interviews, where participants emphasised the need for an unobtrusive interface that seamlessly integrates with their environment.

Personalisation
Speek assigns each voice a unique colour combination, making it easier for users to identify who is speaking.
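One simple way to realise this behaviour is to derive a stable colour from a speaker identifier, so the same voice always appears in the same hue. The sketch below is illustrative (the function name and the saturation/value choices are assumptions, not part of the project):

```python
import colorsys
import hashlib

def colour_for_voice(voice_id: str) -> tuple:
    """Derive a stable, distinct RGB colour from a speaker identifier.

    Hashing the ID means the same speaker always maps to the same hue;
    saturation and value are fixed so every colour stays legible.
    """
    digest = hashlib.sha256(voice_id.encode("utf-8")).digest()
    hue = digest[0] / 255.0  # stable hue in [0, 1], derived from the hash
    r, g, b = colorsys.hsv_to_rgb(hue, 0.65, 0.95)
    return (round(r * 255), round(g * 255), round(b * 255))
```

Varying only the hue while keeping saturation and brightness constant is a common accessibility choice: colours stay distinguishable without any of them dropping below a readable contrast level.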
Use Cases
Speek's different modes are designed to be accessible in a wide range of scenarios, including identifying non-verbal sounds.

The Brand
Although my priorities were the UI and UX, I felt it was important to contextualise the app with a strong brand, one inspired by the Speek animation itself.

Promotional Mockups