Alexa Siu portrait photo. A woman smiling, black hair, gray shirt, standing in front of a plain gray background.

Hello! I'm Alexa, a PhD Student at Stanford University.

My research interests are primarily in the fields of Human-Computer Interaction and Accessibility. I am interested in the design of physical interfaces that leverage multimodal perception to make spatial information more accessible for people who are blind. Application areas of interest include supporting design, collaboration, and information visualization.

Currently, I am part of the shape lab and the Stanford HCI Group, advised by Prof. Sean Follmer. Previously, I completed my M.S. in Mechanical Engineering, also at Stanford, and my B.S. in Biomedical Engineering at Georgia Tech with a minor in Computer Science.

Contact:
afsiu@stanford.edu
Google Scholar

Latest Research

An image of the wrist-worn Haptic PIVOT device used in virtual reality to catch an apple falling from a tree. The device is worn by a user and renders the haptic sensation of an object falling into the user's hand.

Haptic PIVOT: On-Demand Handhelds in VR

Robert Kovacs, Eyal Ofek, Mar Gonzalez Franco, Alexa F. Siu, Sebastian Marwecki, Christian Holz, Mike Sinclair

PIVOT is a wrist-worn haptic device that renders virtual objects into the user’s hand on demand. Its simple design comprises a single actuated joint that pivots a haptic handle into and out of the user’s hand, rendering the haptic sensations of grasping, catching, or throwing an object – anywhere in space.

Three images. The first shows a person wearing the prototype of a white cane VR controller. The second shows the first-person view that the person would see if they were wearing a head-mounted display in VR. The last shows the overhead view of the map of a virtual environment along with the position of the user.

Virtual Reality Without Vision: A Haptic and Auditory White Cane to Navigate Complex Virtual Worlds

Alexa F. Siu, Mike Sinclair, Robert Kovacs, Christian Holz, Eyal Ofek, Edward Cutrell

How might virtual reality (VR) aid a blind person in becoming familiar with an unexplored space? We explore the design of immersive VR experiences accessible to blind people, with the goal of facilitating orientation and mobility training.

Three images representing the CAD workflow: a script for rendering a 3D model of a cup, the cup rendered on a 2.5D shape display, and the cup 3D printed.

shapeCAD: An Accessible 3D Modelling Workflow for the Blind and Visually-Impaired Via 2.5D Shape Displays

Alexa F. Siu, Son Kim, Joshua A. Miele, Sean Follmer

We describe our participatory design process towards designing an accessible 3D modelling workflow for the blind and visually-impaired. We discuss interactions that enable blind users to design, program, and create using a 2.5D interactive shape display.

A 2.5D shape display rendering a raised surface with a hand touching the top.

shapeShift: A Mobile Tabletop Shape Display for Tangible and Haptic Interaction

Alexa F. Siu, Eric J. Gonzalez, Shenli Yuan, Jason B. Ginsberg, Sean Follmer

We explore interactions enabled by 2D spatial manipulation and self-actuation of a mobile tabletop shape display, such as manipulating spatially aware content or serving as an encountered-type haptic device. We present the design of shapeShift, a novel open-source shape display platform.