Alexa Siu portrait photo. A woman smiling, black hair, gray shirt, standing in front of a plain gray background.

Hello! I'm Alexa, a Research Scientist at Adobe Research.

My research interests lie primarily in the fields of Human-Computer Interaction & Accessibility. I investigate how multimodal perception can improve the way we understand and interact with information.

My work has led to novel haptic interfaces that aim to make spatial information more accessible to people who are blind. Application areas include supporting design, collaboration, information visualization, and VR/AR.

Prior to Adobe, I was part of the SHAPE Lab and the HCI Group at Stanford University, advised by Prof. Sean Follmer. Previously, I completed my M.S. in Mechanical Engineering, also at Stanford, and my B.S. in Biomedical Engineering at Georgia Tech with a minor in Computer Science.

If you are a PhD or undergraduate student looking for summer internship opportunities or collaborations, send me an email with your interests.

Contact:
asiu@adobe.com
Google Scholar

Latest Research

An icon representation of the audio narrative composition. The audio data narrative shows four interleaved segments of description followed by a segment of the line graph.

Supporting Accessible Data Visualization Through Audio Data Narratives

Alexa F Siu, Gene S-H Kim, Sile O’Modhrain, and Sean Follmer

To address the need for accessible data representations on the web that provide direct, multimodal, and up-to-date access to live data visualizations, we investigate audio data narratives, which combine textual descriptions with sonification (the mapping of data to non-speech sounds). We present a dynamic programming approach to generating data narratives that considers perceptual strengths and insights from an iterative co-design process. Data narratives support users in gaining significantly more insights from the data.

Four images: a) Slide-tone: a motorized slider with a mounted platform and a finger sliding on the platform. b) Tilt-tone: a motorized tilt platform and a finger resting on the tilting platform. c) Labelled visual graph: a line graph with three distinct peaks. d) Laser-cut cutout: a physical cut-out of the same line graph and a user's hand exploring the cutout.

Slide-Tone and Tilt-Tone: 1-DOF Haptic Techniques for Conveying Shape Characteristics of Graphs to Blind Users

Danyang Fan, Alexa F Siu, Wing-Sum Law, Raymond Zhen, Sile O’Modhrain, and Sean Follmer

To improve interactive access to data visualizations, we introduce two refreshable, 1-DOF audio-haptic interfaces based on haptic cues fundamental to object shape perception. These devices provide finger position, fingerpad contact inclination, and sonification cues. Our research offers insight into the benefits, limitations, and considerations for adopting these haptic cues into a data visualization context.

An icon of a survey and a speech bubble

COVID-19 Highlights the Issues Facing Blind and Visually Impaired People in Accessing Data on the Web

Alexa F Siu, Danyang Fan, Gene S-H Kim, Hrishikesh V Rao, Xavier Vazquez, Sile O’Modhrain, and Sean Follmer

Dissemination of data on the web has been vital in shaping the public’s response during the COVID-19 pandemic. We postulated that the increased prominence of data might have exacerbated the accessibility gap for the Blind and Visually Impaired (BVI) community and exposed new inequities. Based on a survey (n=127) and contextual inquiry (n=12), we present observations that provide an understanding of the impact that access, or the lack of it, has on the BVI community, along with implications for improving the technologies and modalities available to disseminate data-driven information on the web.

Three images representing the CAD workflow: a script for rendering a 3D model of a cup, the cup rendered on a 2.5D shape display, and the cup 3D printed.

shapeCAD: An Accessible 3D Modelling Workflow for the Blind and Visually-Impaired Via 2.5D Shape Displays

Alexa F. Siu, Son Kim, Joshua A. Miele, Sean Follmer

We describe our participatory design process towards designing an accessible 3D modelling workflow for the blind and visually-impaired. We discuss interactions that enable blind users to design, program, and create using a 2.5D interactive shape display.

A user wearing the haptic guidance device, PantoGuide, and touching a tactile graphic of a bar chart that is placed on top of a touchscreen.

PantoGuide: A Haptic and Audio Guidance System To Support Tactile Graphics Exploration

Elyse D. Z. Chase, Alexa F. Siu, Gene S-H Kim, Abena Boadi-Agyemang, Eric J. Gonzalez, and Sean Follmer

Tactile graphics interpretation is an essential part of building tactile literacy and often requires individualized in-person instruction. PantoGuide is a low-cost system that provides audio and haptic guidance cues while a user explores a tactile graphic. We envision scenarios where PantoGuide can enable students to learn remotely or review class content asynchronously.

A sketch of a table with constructive visualization tokens. On the side, three additional images show users constructing visualizations with the tokens: a 3D bar chart and a stack of tokens.

Constructive Visualization to Inform the Design and Exploration of Tactile Data Representations

Danyang Fan, Alexa F. Siu, Sile O'Modhrain, Sean Follmer

As data visualization has become increasingly important in our society, many challenges prevent people who are blind and visually impaired (BVI) from fully engaging with data graphics. We adopt a constructive visualization framework, using simple and versatile tokens to engage non-data experts in the construction of tactile data representations.

An image of the wrist-worn Haptic PIVOT device used in virtual reality to catch an apple falling from a tree. The device is worn by a user and renders the haptic sensation of an object falling into the user's hand.

Haptic PIVOT: On-Demand Handhelds in VR

Robert Kovacs, Eyal Ofek, Mar Gonzalez Franco, Alexa F. Siu, Sebastian Marwecki, Christian Holz, Mike Sinclair (UIST 2020)

PIVOT is a wrist-worn haptic device that renders virtual objects into the user’s hand on demand. Its simple design comprises a single actuated joint that pivots a haptic handle into and out of the user’s hand, rendering the haptic sensations of grasping, catching, or throwing an object anywhere in space.

Three images. The first shows a person wearing the prototype of a white cane VR controller. The second shows the first-person view that the person would see if they were wearing a head-mounted display in VR. The last shows the overhead view of the map of a virtual environment along with the position of the user.

Virtual Reality Without Vision: A Haptic and Auditory White Cane to Navigate Complex Virtual Worlds

Alexa F. Siu, Mike Sinclair, Robert Kovacs, Christian Holz, Eyal Ofek, Edward Cutrell (CHI 2020)

How might virtual reality (VR) aid a blind person in familiarization with an unexplored space? This project explores the design of an immersive VR haptic and audio experience accessible to people who are blind with the goal of facilitating orientation and mobility training.

Four photos next to each other: 1) A raised-line sketch shows the spatial layout of the different voltmeter components, including a knob, speaker, connectors, push button, toggle switch, and tactile gauge with servo. 2) A 2D sketch on paper, identical to the raised-line sketch, with printed annotations of what each component is and its function. 3) A hobbyist’s hands testing a switch on an in-progress voltmeter prototype, with the PCB and wires coming out from the back. 4) Front view of the voltmeter prototype: the top row shows a speaker, a toggle switch, and a 180-degree arc of tactile markings with a servo in the middle; the bottom row shows binding posts, a selector knob, and a push button.

Making Nonvisually: Lessons from the Field

Cynthia L Bennett, Abigale Stangl, Alexa F Siu, and Joshua A Miele (ASSETS 2019)

The Maker movement promises access to activities from crafting to digital fabrication, enabling anyone to invent and customize technology. But people with disabilities, who could benefit from Making, still encounter significant barriers to doing so. We share our personal experiences Making nonvisually and supporting its instruction through a series of workshops where we introduced Arduino to blind hobbyists.

A 2.5D shape display rendering a raised surface with a hand touching the top.

shapeShift: A Mobile Tabletop Shape Display for Tangible and Haptic Interaction

Alexa F. Siu, Eric J. Gonzalez, Shenli Yuan, Jason B. Ginsberg, Sean Follmer (CHI 2018)

We explore interactions enabled by 2D spatial manipulation and self-actuation of a mobile tabletop shape display (e.g., manipulating spatially aware content, serving as an encountered-type haptic device). We present the design of a novel open-source shape display platform called shapeShift.

Selected Press

Gizmodo

October 2020 & April 2018

These Wrist-Worn Hammers Swing Into Your Hands So You Feel Virtual Objects

read more >>
TechCrunch

October 2019

This Tactile Display Lets Visually Impaired Users Feel On Screen 3D Shapes

read more >>
National Science Foundation

November 2019

4 Awesome Discoveries You Probably Didn't Hear About (Episode 87)

read more >>
Core77

December 2019

Stanford Researchers Develop Tactile Display to Make 3D Modeling More Accessible for Visually Impaired Users

read more >>
Fast Company

April 2018

A Computer Mouse for the Year 3000

read more >>
The Wall Street Journal

April 2018

Volkswagen Brings Sense of Touch to Virtual Reality

read more >>

Designed by Alexa F. Siu. Powered by
Bootstrap 4, Jekyll and Jekyll-Scholar.