
Space Compass- AstroAccess Team

Location awareness for people with visual impairments

Published on Oct 19, 2021

Project Overview

Advancing Disability Inclusion in Space - Orientation Awareness in Zero Gravity for Vision Impaired Individuals



Background

When disabled people have equitable access to all jobs, taking on humanity’s most complex tasks, perspectives change. Access to space changes the worldview not only of aspiring explorers, but of those that employ them, and most importantly, those that look up to them.

Disabled astronauts would have inherent strengths and advantages that could enhance mission success. Due to differences in the vestibular system, some deaf individuals are immune or resistant to motion sickness. NASA has known this since the 1950s, when 11 deaf men known as the “Gallaudet 11” participated in extensive research to help shape the future of human space exploration. During these experiments, NASA proved that deaf space flight participants would be more adaptable to the foreign gravitational environments, and yet there has never been a deaf astronaut. Hearing crew members would also benefit from being fluent in sign language as this would allow for non-verbal communication in any emergency situation that results in auditory anomalies. Universal design not only facilitates inclusion, it inherently results in new system redundancies and functionalities that would improve safety measures for all crewmembers. Visit this article for more information on “The Case for Disabled Astronauts.” 

Mission: AstroAccess will allow the next generation of disabled scientists, students, athletes and artists to see that STEM work is truly possible for them. Women entering the space program sparked a steady increase of women in STEM that has continued to the present day (Source). When people with disabilities enter the space program, we expect the same to happen with disabled STEM students. As we give more disabled advocates platforms in STEM, we will see more disabled scientists. We can then expect more science consumers as more members of the public begin to see themselves reflected in scientists in the media. The inclusion of disabled personnel will necessitate changes in space habitats, equipment, policies, and procedures that will benefit everyone. Mission: AstroAccess will enable the initial stages of this research, collecting data crucial to the future of inclusive space exploration. 

For the inaugural flight in 2021, the mission will focus on basic operational tasks that will a) demonstrate the abilities of disabled crew members to work effectively in a microgravity environment and b) investigate minor changes that could be made to this environment in order to promote greater accessibility in the future.

User Profile: Blind / Low Vision

Pains:

  • Locating handles

  • Locating oneself within the vessel

  • Lack of visual orientation for navigation

  • Can’t see where things are

  • No floor to stand on

  • Fonts and background colors

  • Lights / visual cues

  • Seeing / accessibility for handles and switches

  • Doesn’t know where up or down or left or right is

  • Interaction with a floating object / how to grab?

  • Being aware of your surroundings is even more challenging in zero g - a 360 degree volume of space to consider

  • No visuals for zero g orientation

  • May be hard to tell how fast you’re moving and how you’re oriented

  • Other objects / bodies floating into your personal bubble

  • No gravity for down orientation

  • No feedback from movement

  • Cannot see handholds

  • Directing movement

  • Returning to secure position / identifying position

  • Knowing where to go

  • Knowing orientation


Gains:

  • Tether

  • Use of all your arms and legs

  • Can hear where there are sounds

  • Can hear signals from flight crew

  • Potentially lower nausea due to no conflict between visual input and the inner ear

  • Increased sensitivity to the environment, using musical signals as a guide

  • Large exposure area for touch

  • Sensory experiences

  • Vibration sets to navigate the craft

  • Touch surface pathway throughout the craft

  • Potentially reduced motion sickness susceptibility

  • Less susceptible to orientation confusion caused by visual cues


Week 3:

Project Ideation

Initial ideas (from the design sprint)

  • Slate and stylus

  • Pool noodle cane

  • Sonified gyroscope

  • Tone source on wall or floor

  • Braille display

  • AirTags



Ideas to test:

  • Location awareness with beacons with haptic feedback

  • Walking cane to navigate and reorient the user


Miro brainstorming link


Physical Environment Accessibility

  • Can sound beacons be used by blind crew members to orient themselves in 3D space?

  • How could tactile markers be used inside the plane cabin to support access for blind crew members?


Week 4:

Moodboard

Space Sunflower

Space wand

Biomimicry - ants following a trail, sunflowers following the sun, bats using sonar for location awareness


AstroAccess Flight - Initial Prototype

On October 17th, the AstroAccess team flew from Long Beach airport. Many initial prototypes and ideas were tested.

For the blind group, there were a few tests:

Audio Localization - Chimes / Door Bells

What worked?

  • The bells were placed on the ground and could be activated by the user.

  • On the stationary test, the bells were audible

  • Having a choice of different sounds made it easy for the blind users to pick one

What did not work?

  • During the parabolas, the bells were not audible

Haptic Prompts - Using Soundbrenner device

What worked?

  • Vibrations could be felt before entering a parabola

What didn’t work?

  • The device needs to be strapped more tightly to the body for the user to perceive the vibrations in zero gravity


Our early prototype:

Since we only had a few weeks to prepare, we developed an iPhone app that could be paired with multiple Bluetooth beacons. The iPhone vibrated more or less strongly depending on how close it was to the beacons.

Results were inconclusive: we only got to test during a single parabola, and the button on the iPhone that disables the haptic feedback was accidentally pressed.
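A minimal sketch of that proximity-to-haptics mapping (the function name and the RSSI thresholds are our illustration, not the actual app code):

```cpp
#include <algorithm>

// Map a Bluetooth RSSI reading (in dBm) to a haptic intensity in [0, 255].
// The thresholds are illustrative: about -40 dBm is "right at the beacon",
// about -90 dBm is the edge of reception.
int rssiToHapticIntensity(int rssi) {
    const int nearRssi = -40;  // strongest expected signal
    const int farRssi  = -90;  // weakest signal we still react to
    int clamped = std::max(farRssi, std::min(nearRssi, rssi));
    // Linear interpolation: the closer the beacon, the stronger the vibration.
    return (clamped - farRssi) * 255 / (nearRssi - farRssi);
}
```

In the real app this intensity would drive the phone’s haptic engine; keeping the mapping as a pure function makes it easy to test on the ground.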

Design Idea 1: Wearable Haptic Device

The idea here is similar to the iPhone experiment from the flight. The user moves their index finger around, and the device vibrates more strongly as the finger points toward the beacon.

Questions:

  • Is 10 seconds enough time to move the index finger around and establish proper orientation?

  • When floating, is it possible to keep the finger pointing toward the floor while re-orienting the body?
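The pointing logic can be sketched in plain C++ (our own illustration with hypothetical names; the real device would get the finger’s orientation from an inertial sensor and the beacon direction from ranging):

```cpp
#include <cmath>

// Vibration intensity (0..255) as a function of how closely the finger's
// pointing direction matches the direction to the beacon.
// Both vectors are assumed to be unit vectors.
struct Vec3 { double x, y, z; };

double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

int pointingIntensity(const Vec3& finger, const Vec3& toBeacon) {
    double alignment = dot(finger, toBeacon);  // cosine of the angle between them
    if (alignment <= 0.0) return 0;            // pointing away: no vibration
    return static_cast<int>(alignment * 255);  // pointing at the beacon: maximum
}
```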

The electronic scheme for such a device would be as follows:

Design Idea 2: Multiple Wearable Devices

Instead of a single pointing finger, perhaps place a haptic device at multiple locations on the body.

Questions:

  • How easy is it to interpret relative haptics on different body parts? Can a user tell whether their left leg is vibrating more strongly than their right arm?

Design Idea 3: Haptic Orb

This idea is inspired by this watch design for blind people:

Given the short time frame, we want to convey as much information as possible per interaction. Would it be possible to design an object that conveys directional information within a single touch, or a few touches?

The idea is now to create an orb with six time-of-flight ultrasonic sensors, one at each pole, each paired with a haptic feedback motor. When you touch the orb, you can feel where it vibrates; if you rotate it, the vibrations change so that it always vibrates toward the floor.
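The motor-selection logic for such an orb could be as simple as winner-take-all over the six distance readings (a sketch under our assumptions; the pole ordering and the policy are illustrative):

```cpp
// Six poles: +X, -X, +Y, -Y, +Z, -Z. Each pole has an ultrasonic sensor and
// a haptic motor. The motor on the pole with the shortest measured distance
// to the floor beacon is driven, so the orb "vibrates toward the floor".
int poleTowardFloor(const long distancesCm[6]) {
    int best = 0;
    for (int i = 1; i < 6; ++i) {
        if (distancesCm[i] < distancesCm[best]) best = i;
    }
    return best;  // index of the motor to vibrate
}
```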

Very rough sketch:


Electronics

We chose to use an HC-SR04 ultrasonic sensor for each pole. These are cheap (about $4) and abundant both in the lab and from suppliers. In the future, distance/angle measurement using UWB receivers/transmitters (DWM1000 or DWM3000) could achieve better performance. However, these modules are not easy to purchase on a short time frame and would take more time to configure.

The transmitter sends an ultrasonic signal at 40 kHz. The receiver picks up the signal, and from the time it took to travel, the distance between the transmitter and receiver can be calculated. To synchronize the receiver and transmitter, we need another wireless module. We chose the nRF24L01, as it’s common around the lab and easy to use. It transmits at 2.4 GHz.
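Because the 2.4 GHz sync packet travels at the speed of light, its flight time is negligible compared to sound, so the receiver can treat the interval between the sync packet and the ultrasonic burst as the acoustic time of flight. A sketch of the conversion (assuming room-temperature air):

```cpp
// One-way ranging: the RF sync packet arrives effectively instantly, so the
// time between receiving the sync and detecting the 40 kHz ultrasonic burst
// is the acoustic time of flight:
//   distance = time_of_flight * speed_of_sound
long flightTimeToCm(unsigned long microseconds) {
    // Speed of sound in air at room temperature: ~343 m/s = 0.0343 cm/us.
    return static_cast<long>(microseconds * 0.0343);
}
```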

Here is the first, breadboarded prototype:

Status: waiting for haptic feedback motor + driver to prototype


3D Casing of the orb


3D printed lower half


PCB for the beacon

Once the beacon design was working (using an ultrasonic transmitter and RF sync), we could make a neater beacon device by designing a PCB that fits as an Arduino shield:

After a few soldering mistakes, the beacon seems to be working fine. Here is the code:

/*
  Transmit side of one way ultrasonic ranging using HC-SR04 + nRF24L01

  HC-SR04 Ping distance sensor:
  VCC to arduino 5V
  Echo to Arduino pin A1
  Trig to Arduino pin A0

  nRF24L01 2.4GHz Transceiver
  VCC to Arduino 3.3V
  CE to Arduino pin 9
  CSN to Arduino pin 10
  MOSI to Arduino pin 11
  MISO to Arduino pin 12
  SCK to Arduino pin 13

  General technique from here: https://forum.arduino.cc/t/communication-between-2-ultrasonic-sensors/433986/21
  HC-SR04 technique from here: http://arduinobasics.blogspot.com.au/2012/11/arduinobasics-hc-sr04-ultrasonic-sensor.html
  nRF24L01 technique from here: https://create.arduino.cc/projecthub/muhammad-aqib/nrf24l01-interfacing-with-arduino-wireless-communication-0c13d4
*/

#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>

#define echoPin A1 
#define trigPin A0 

#define SyncPeriod 1000   // period for sync in microseconds

RF24 radio(9, 10); // CE, CSN         
const byte address[6] = "00088";     // Address where we will send the data. This should be the same on the receiving side.

boolean syncSeq[16] = {1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 1} ; // Bit sequence for synchronization (unused in this version; sync is done by sending the "START" packet below)


void setup() {
  Serial.begin (9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);

  digitalWrite(trigPin, LOW) ;

  Serial.println("Transmitter init");

  radio.begin();                  //Starting the Wireless communication
  radio.openWritingPipe(address); //Setting the address where we will send the data
  radio.setPALevel(RF24_PA_MIN);  //You can set it as minimum or maximum depending on the distance between the transmitter and receiver.
  radio.stopListening();          //This sets the module as transmitter
}

void loop() {
  TxSync() ;        // Send RF synchronization sequence
  
  digitalWrite(trigPin, HIGH);      // Start ping sequence
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  //Delay 100ms before next reading.
  delay(100);
}

void TxSync()
{
  const char text[] = "START";
  radio.write(&text, sizeof(text));    
  delayMicroseconds(SyncPeriod);
}


User Experience Session with Mona (Nov 16)

Fortunately, we had the opportunity to meet with Mona, who was on the first AstroAccess ZeroG flight, to get feedback and ideas on our prototype. Since the prototype wasn’t ready yet, we simulated it: we took the 3D print of half an orb from our design, added (not-yet-functional) sensors, and added a single vibration motor controlled manually from a voltage supply.

The UX experiment went as follows: the vibration motor was fixed at one side of the half-orb, in a position unknown to Mona. By touching the half-orb, she had to figure out which direction the vibration came from. The idea was to test the hypothesis that our hands (or specifically, blind people’s hands) can quickly recognize the source of a vibration on an object.

The results were very interesting. Mona could figure out the source of the vibration, but only after 7-10 seconds of intricate touches - not feasible for a ZeroG scenario. We tested different vibration intensities: if the vibration is too strong, it propagates through the entire object and the source is impossible to pin down; if it is too weak, the source is also too hard to find. There is a “sweet spot” of vibration intensity. Still, the uniform material and spherical shape of the object make it hard to localize quickly.

Together, we brainstormed and devised a new design in which each sensor is extended far from the center. The motors and sensors sit at the tips of these extended poles, and our new hypothesis is that by moving a hand over the poles, the user might be able to figure out the direction quickly.

The electronics stay the same: six ultrasonic sensors and vibration motors connected to a central microcontroller, with an RF module for sync with the beacon.

Pictures of the new design options:










