
M.AD

MINUSCULE ADVENTURE

Activate your senses through real-time adaptive narration + music

HCI Design | Interaction Design | Experience Design

Focus : Mental Wellbeing

Tools : IBM Node-RED (JS + SQL) | Figma | Adobe Premiere Pro

12 Weeks | 2020

Team Members : Jeremy Hulse | Joy Zhang | Lisa Aoyama

Role : UX Architect | UX Researcher | Industrial (Hardware) Designer

MINUSCULE ADVENTURE (M.AD) transforms an ordinary neighborhood walk into a dynamic, immersive audio adventure. Narration and music are generated in real time and adapt to your environment, creating a unique experience every day.

This project addresses the growing need for urban dwellers to engage their senses and support mental wellbeing. With increasing screen time and remote work, city residents face higher risks of digital fatigue—M.AD offers a playful, mindful way to reconnect with their surroundings.


The platform combines intelligent software with immersive hardware.

 

The software generates real-time, personalized narration and adaptive music using inputs like GPS, weather, time, and user preferences. The hardware uses bone-conduction audio to layer this experience onto the natural soundscape, and its sensor-equipped, connected earpieces enable a fully screen-free voice interface.

User Journey Flow

CONTEXT + IMPACT


OUR SOLUTION

INSPIRATION + IDEATION

As daily life becomes increasingly digital and virtual, reconnecting with our physical surroundings is crucial for mindfulness and mental wellbeing. Drawing on phenomenology, MINUSCULE ADVENTURE invites users to re-engage with their lifeworlds by amplifying sensory awareness, turning ordinary walks into meaningful, immersive experiences.

DESIGN HYPOTHESIS
Minuscule Adventures help us live mindfully, improving our mental wellbeing.


RESEARCH + DEVELOPMENT

KEY USERS
Urban Dwellers in Sensorially Deprived Environments

We started by interviewing experts and extreme users experiencing sensory deprivation, gaining insight into their most significant challenges and the strategies they use to adapt and cope.


EXPERTS CONSULTED


CO-DESIGN WORKSHOP

We ran a co-design workshop with 17 work-from-home users (ages 20–45) to gather real-world insights and bring potential users into the process from the start. Since the project centers on subjective experience, early involvement was key. Participants also designed their own ideal Minuscule Adventure.


USER TESTING + EXPERIMENTS

Our validation process was inspired by both mindfulness therapy and psychedelic therapy. We used two industry-standard instruments, the MAAS (Mindful Attention Awareness Scale) and the 11D-ASC (Altered States of Consciousness) questionnaire, to conduct our user testing. This systematic surveying let us gather quantitative data alongside the qualitative insights from our user interviews.
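For reference, the MAAS is scored as a simple mean: 15 statements rated from 1 (almost always) to 6 (almost never), with higher averages indicating greater dispositional mindfulness. A minimal scoring sketch in plain JavaScript, written here for illustration rather than taken from our study tooling:

// Illustrative MAAS scoring: 15 items on a 1-6 scale, averaged.
// Higher mean scores indicate greater dispositional mindfulness.
function scoreMAAS(responses) {            // responses: array of 15 ratings (1-6)
  if (responses.length !== 15) throw new Error('MAAS has 15 items');
  const sum = responses.reduce((total, rating) => total + rating, 0);
  return sum / responses.length;           // mean item score, range 1-6
}

// Example: pre/post comparison for one hypothetical participant
const before = scoreMAAS([3, 2, 4, 3, 3, 2, 3, 4, 2, 3, 3, 2, 3, 3, 2]);
const after  = scoreMAAS([4, 4, 5, 4, 4, 3, 4, 5, 4, 4, 4, 3, 4, 4, 3]);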


EXPERIMENT I

FOCUS ON

- Immersive Experience

- Personalization

EXPERIMENT II

FOCUS ON

- Autonomous Experience

- User Generated Content

EXPERIMENT III

FOCUS ON

- User Customization

- User Reflection

Users showed diverse comfort levels and preferences, and the insights from Experiments I and II made it clear that the experience needed to be customizable. We developed a four-tier spectrum, from Total Immersion to Total Autonomy, mixing narration and thematic prompts in different ways.

Testing Tier 3 in Experiment III produced measurable increases in both MAAS and 11D-ASC scores, confirming that the experience helped users slow down, become more present, and activate their senses.

PROTOTYPING + MVP

HARDWARE

The hardware platform is designed to unplug users from their screens: it is fully self-contained, with its own microprocessor, internet connectivity, GPS tracking, and memory card. The earpieces activate the senses, with embedded accelerometers and microphones in both to capture live contextual inputs. Bone-conduction technology lets the user continue to hear all ambient sounds. During an adventure, the earpieces are worn and the case stays in the pocket.


SOFTWARE

The current MVP, built on IBM Node-RED (JS + SQL), stitches geo-tagged narration and music together in real time. The long-term goal is to evolve this into a fully autogenerated system that assembles adaptive narration and music from GPS, weather, time of day, and the sensor data (accelerometers, microphones) captured by the hardware platform.
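To give a sense of the stitching logic, here is a minimal sketch of what a Node-RED function node for it might look like; the clip catalogue, field names, and 50-metre trigger radius are illustrative assumptions, not the production implementation.

// Sketch of a Node-RED function node: pick the nearest geo-tagged
// narration clip for the user's current GPS fix. Clip catalogue,
// field names, and the 50 m radius are illustrative assumptions.
const clips = flow.get('geoTaggedClips') || [];   // e.g. [{ id, lat, lon, audioUrl }]
const { lat, lon } = msg.payload;                 // latest GPS reading

// Equirectangular approximation; accurate enough at walking distances.
function distanceMeters(a, b) {
  const R = 6371000;
  const dLat = (b.lat - a.lat) * Math.PI / 180;
  const dLon = (b.lon - a.lon) * Math.PI / 180 * Math.cos(a.lat * Math.PI / 180);
  return R * Math.sqrt(dLat * dLat + dLon * dLon);
}

// Choose the closest clip inside the trigger radius.
let best = null;
for (const clip of clips) {
  const d = distanceMeters({ lat, lon }, clip);
  if (d < 50 && (!best || d < best.d)) best = { clip, d };
}

if (best) {
  msg.payload = { action: 'play', clip: best.clip };  // hand off to the audio node
  return msg;
}
return null;  // nothing nearby: let the adaptive music layer continue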

UX JOURNEY

Choose from four tiers of immersion—from a minimal themed walk to a fully narrated experience—with adaptive music added at the higher levels.
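As a rough illustration, the tier spectrum can be pictured as a small configuration table; the numbering, intermediate labels, and flags below are assumptions for clarity, not the shipped settings.

// Illustrative immersion tier presets (tier order and middle labels are assumed).
const IMMERSION_TIERS = [
  { tier: 1, label: 'Total Autonomy',  narration: false, themedPrompts: true, adaptiveMusic: false },
  { tier: 2, label: 'Guided Theme',    narration: false, themedPrompts: true, adaptiveMusic: true  },
  { tier: 3, label: 'Narrated Walk',   narration: true,  themedPrompts: true, adaptiveMusic: true  },
  { tier: 4, label: 'Total Immersion', narration: true,  themedPrompts: true, adaptiveMusic: true  },
];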

