Lunar dust interference with depth-data collection
2021.11.15 Lunar Rover Measurements
From the drawing provided to the teams and by estimating the most likely position of the cameras, we found that the distance between the wheels and the camera is:
Height = 5.26 cm
Horizontal = 16.7 cm
All measured from the edges of the wheels
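A quick sketch of the geometry these two offsets imply (straight-line camera-to-wheel distance and look-down angle), assuming both offsets are taken from the same reference point on the wheel edge:

```python
import math

# Offsets from the wheel edge to the estimated camera position,
# taken from the measurements in this entry.
height_cm = 5.26
horizontal_cm = 16.7

# Straight-line distance from the camera to the wheel edge
distance_cm = math.hypot(horizontal_cm, height_cm)
# Depression angle from the camera's horizontal down to the wheel edge
angle_deg = math.degrees(math.atan2(height_cm, horizontal_cm))

print(f"distance: {distance_cm:.1f} cm, depression angle: {angle_deg:.1f} deg")
```

Both come out near 17.5 (cm and degrees, coincidentally), which gives a first number for how far the camera sits from the dust source.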
2021.11.15 Regolith rake
To ensure the regolith isn’t all pushed away by the first wheel rotation, leaving no dust for collection or kick-up in subsequent rotations (parabolas), we’ve designed a regolith rake that can be operated manually after each parabola. The rake (shown below) will be operated by pulling on a string at either end of the containment box through an O-ring-sealed tube (the Viton O-ring will be tight to the string, thus maintaining containment). After each run, the wheel will be lifted and the rake will be passed over the regolith to re-smooth the surface.
The CAD model for the rake (below) will first be 3D printed to test the setup and maneuverability of the design. The final piece will be made out of aluminum to minimize static with the regolith. The rail pieces will be filed down to be rounded so they don’t stick in the 80/20 rails when pulled.
This is a fully parameterized design so that we can easily modify the sizing, thickness, tooth size and spacing.
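As an illustration of that parameterisation, here is a sketch of how tooth count and positions fall out of the width, tooth size and spacing. The dimensions below are made up for the example, not the CAD values:

```python
def rake_tooth_positions(width_mm, tooth_mm, gap_mm):
    """Return the left-edge x-position of each tooth across the rake width.

    Illustrative only - the real dimensions live in the CAD model's
    parameters; this just shows how sizing drives the tooth layout.
    """
    pitch = tooth_mm + gap_mm
    positions = []
    x = 0.0
    while x + tooth_mm <= width_mm:
        positions.append(x)
        x += pitch
    return positions

# Hypothetical example: 120 mm wide rake, 3 mm teeth, 5 mm gaps
teeth = rake_tooth_positions(120, 3, 5)
print(len(teeth), "teeth, last at", teeth[-1], "mm")
```

Changing any one parameter regenerates the layout, which is the point of keeping the CAD fully parameterized.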
A possible fitting would be a 1/16” tubing connector with the metallic ferrule replaced by a Viton O-ring, which will seal the string within the tube as it slides. In the image below, the 1/16” tubing represents the string. The same fitting can be used with a solid 1/16” pipe, which will connect to the wheel to raise and lower it.
2021.11.09 Mechanical Wheel Considerations
We have decided on a static, rotating wheel mechanism: the wheel spins in place rather than translating. The most important considerations are:
Matching 6-10 cm/s rotation speed
Matching weight experienced in micro gravity by the wheel
Preserving the expected position of the camera on the rover
Allowing for the reshuffling of the regolith without damaging the wheel
This will be done by mounting the motor on a slotted mount so the wheel can move up and down under the different forces. Furthermore, this will allow us to “pull up” the wheel so the brush can pass.
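To match the 6-10 cm/s target, the motor speed follows directly from the wheel size. A sketch of that conversion, with a placeholder wheel diameter until we have the real Lunar Outpost wheel dimensions:

```python
import math

def wheel_rpm(surface_speed_cm_s, wheel_diameter_cm):
    """Motor speed needed for a given surface (rim) speed.

    The wheel diameter here is a placeholder assumption; the real
    value comes from the Lunar Outpost rover wheel.
    """
    circumference = math.pi * wheel_diameter_cm
    return surface_speed_cm_s / circumference * 60.0

# Target range from Lunar Outpost: 6-10 cm/s, assuming a 15 cm wheel
for v in (6, 10):
    print(f"{v} cm/s -> {wheel_rpm(v, 15):.1f} RPM")
```

For a 15 cm wheel this lands under 13 RPM, so the motor selection is driven by torque and controllability at low speed rather than by top speed.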
The other component to design is the brush that will move the regolith around. We want it to be metal so that the regolith does not stick to it electrostatically.
We also realised we do not have that much regolith, so we have decided to make a smaller sandbox, even though we are increasing the overall size of the containment box.
2021.11.02 Prototyping and designing for Lunar Outpost
We had the opportunity to meet with Lunar Outpost and requested envelopes for the payloads (camera payload location with respect to wheel and height above the lunar surface). LO will provide these for us. We were also given the speed of the wheel so we can design for a maximum speed, thus a maximum dust ejection from the wheel’s movement. This will also allow us to do preliminary tests to see if the wheel can collect dust/regolith at this speed.
We have also begun testing the camera capability for different lighting conditions.
Key here is that the outdoor capture, while it provides lower resolution and a shallower depth of view, represents a worst-case scenario in full diffuse (light cloud cover) daylight. We will next try a night-time capture with directional spot-lighting, which will be more reflective of the conditions in the experiment box.
2021.10.26 PDR review
Some key developments for our PDR were the technical schematics. This included the first draft of our CAD model:
Here, the main changes that need to be made for the next iteration are:
Update the model to more accurately reflect the Lunar Outpost envelopes - these will be provided by LO in the next few weeks
Increase the size of the box to accommodate the wheel size
Increase the size of the 3D wall to take advantage of the size of the box
Complete the design of the wheel operating mechanism
Next, we presented the Subsystem schematic:
Key in this schematic is noting that the Azure camera is controlled by a laptop which is not within the experiment box. We will be designing to protect the laptop from mechanical damage (e.g. being kicked) and ensuring it is ruggedly attached to the outside of the box.
Finally, we presented the electrical schematic:
This is representative of the wheel and light controls but does not include the laptop and camera controls as these will be controlled by the laptop on a separate electrical system. The main concern for the laptop system is to ensure there is enough battery life for the duration of the flight so no external power is required.
We have also thought through our Concept of Operations and this is summarized here:
Key comments to update for CDR:
Add why this is an important payload - it de-risks the real lunar payload (Lunar Outpost)
Stay as close as possible to real dimensions and spatial configuration for the lunar payload opportunity with Lunar Outpost
Clearly explain why wax is being melted for dust collection (will this actually be done?)
Provide more detail on how the flight experience will be integrated into the outreach component.
Provide a robust prototype
Our complete schedule can be found here: https://docs.google.com/spreadsheets/d/1MA-mLEtphqoF1RVflvRatB9zEdz2V9f_/edit?usp=sharing&ouid=100770675523606880620&rtpof=true&sd=true
2021.10.19 Outreach update
We met with the Cambridge Public School system JK-12 Science Coordinator - Janet MacNeil - who we will be coordinating with to develop the content and schedule for the outreach program. The main components of the program will be:
Develop 4 or 5 short educational videos on microgravity and parabolic flight (15-minute videos). These will be played asynchronously to Grade 7 classes across Cambridge introducing them to science in microgravity.
Have each Grade 7 class submit an experiment proposal for the parabolic flight - shoebox size and passive.
Down-select to 1 or 2 experiments that will be included on the parabolic flight. This will be done like a competition with a panel to select the final experiment.
Have all classes contribute to the final experiment design (this may just be having all students present the science that will be explored with the selected experiment).
Fly the experiment on the Zero-G flight - record the entire flight in 360 VR video including a look at the flyers, a detailed look at the student experiment in flight and a close-up of some of the other experiments on-board.
Have a ‘Flight Day’ with the Grade 7 students explaining the results of their experiment. This will partially be done in VR - we will convert the VR video for the Oculus Quest and provide classrooms with an Oculus Quest to view the video. It will also be available on YouTube.
Janet has provided us with the class content for two of the students’ units - Rollercoasters and Mysteries of the Universe - on which we will be basing our course content.
We have confirmed funding ($1500) to purchase the Oculus Quests for the students and materials for their experiment. We may use some of this funding to buy a dedicated GoPro Max (360 VR video). We would like this to be a program that can run annually if there are 1-2 students in the Zero-G class who would take on the video-recording and Flight Day. The course content can be re-used year-to-year.
2021.10.12 Towards a PDR
This week we reviewed our projects in class and got advice on how to advance to a PDR. In order to incorporate the dust collection wheel into the project design, we’d like to have the wheel be motorized and be able to roll through the lunar dust kicking up the dust into the depth-camera’s view field. Here are a few preliminary sketches of what this will look like.
We’ve included here a 3D printed backdrop with identifiable shapes that we can use to assess the camera’s view-field. The experiment is being designed for lunar gravity, of which we may get anywhere from 2 to 6 parabolas, so we’ll need to design for efficient data collection. The parabola will look something like this:
Camera is initiated and recording is started at the beginning of the flight: data will be parsed after flight to minimize actions required during parabolas.
Wheel is turned on at pre-defined speed at start of lunar gravity
Wheel traverses lunar dust (~2 seconds) and turns off (automated sequence)
Dust is recorded via depth-camera (continuous stream)
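Since the camera streams continuously, only the wheel needs commanding during a parabola. The steps above can be sketched as a minimal automated sequence; the on/off callbacks are placeholders for the motor-control interface, which is still TBD:

```python
import time

def run_parabola_sequence(wheel_on, wheel_off, traverse_s=2.0):
    """One lunar-gravity parabola: the depth camera records the whole
    flight, so the only in-parabola actions are wheel on and wheel off.
    The callbacks stand in for the real (not yet designed) motor driver."""
    wheel_on()              # start at the pre-defined speed at lunar-g onset
    time.sleep(traverse_s)  # wheel traverses the regolith (~2 s)
    wheel_off()             # automated stop; dust is still being recorded

events = []
run_parabola_sequence(lambda: events.append("on"),
                      lambda: events.append("off"),
                      traverse_s=0.01)  # shortened for this demo
print(events)
```

Keeping the in-parabola logic this small is the point: everything else (parsing, density analysis) happens after the flight.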
What we expect to be able to determine from this data:
Can the wheel design collect dust?
How much dust can obstruct the depth-camera view-field while still providing acceptable data:
3D wall will have ‘objects’ of different size/scale to provide a metric for what is considered acceptable data
How scratch-resistant is the camera lens to this type of exposure?
How fast can the wheel move without causing obstructing amounts of dust?
How fast does the dust settle back to an acceptable density for the camera after the wheel stops?
The last two will be dependent on how many parabolas we can do. If we only have 2 parabolas, this will not give us enough to get data for multiple speeds or to have a statistically relevant data set for dust settling.
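For sizing the 3D-wall 'objects', it helps to know how many depth pixels a shape of a given size subtends. A sketch of that calculation; the field of view and depth resolution below are assumptions we still need to confirm against the L515 datasheet:

```python
import math

def pixels_on_target(object_cm, distance_cm, hfov_deg, h_res):
    """Approximate horizontal pixel count an object on the 3D wall
    subtends. FOV and resolution are assumed values, to be checked
    against the camera datasheet."""
    theta = 2 * math.atan(object_cm / (2 * distance_cm))
    return theta / math.radians(hfov_deg) * h_res

# e.g. a 2 cm shape on the wall, 30 cm from the camera,
# assuming a ~70 deg horizontal depth FOV at 1024 px across
print(f"{pixels_on_target(2, 30, 70, 1024):.0f} px")
```

This gives a starting point for choosing object scales: the smallest wall feature should still span enough pixels to be identifiable once dust starts dropping returns.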
We will also be able to run this experiment in microgravity and collect data for the first three questions.
Still to be determined for PDR:
How to quantify dust density
Automation of the wheel movement
3D wall design
Dust containment with wheel movement.
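For the open 'quantify dust density' item, one candidate metric is the fraction of depth pixels returning valid data: dust between the camera and the wall should drop valid returns. A sketch only, with a synthetic frame; thresholds would need calibrating against known dust levels:

```python
def depth_fill_fraction(frame):
    """Fraction of pixels with a valid (non-zero) depth return.

    Candidate dust-density proxy for the open item above - heavier
    dust between the camera and the 3D wall should lower this value.
    """
    total = sum(len(row) for row in frame)
    valid = sum(1 for row in frame for d in row if d > 0)
    return valid / total

# Synthetic 4x4 depth frame (mm) with 4 dropped (zero-return) pixels
frame = [[500,   0, 510, 505],
         [  0, 498,   0, 502],
         [495, 503, 499,   0],
         [501, 500, 497, 504]]
print(depth_fill_fraction(frame))  # 12/16 = 0.75
```

Tracking this fraction over the seconds after wheel stop would also give a settling-time curve per parabola.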
2021.10.05 Creative Design
Cady Coleman spoke to us today about the creative component of her time on the ISS. She gave us some fantastic advice about exploring our creative sides, how music influenced her experience and how we can incorporate this into our projects.
Cady’s advice:
Individual insight can help to share your ideas
What you bring to the moment is by definition enough
Interesting to give context (not just the launch, the people watching it; not just the view from ISS, but the window you're looking through)
Making ripples requires being brave and being open (particularly with art/music)
When you are brave enough to share your art it catalyzes things
Individuality makes the team - it is the connections between these disciplines that achieves the team and achieves the solution.
Sparks imagination and humanizes humans in space
Reminds you there's magic, lets you recharge to come back to your work more completely and gives you a sense of connection with the other people doing the art with you, lets you be more present
Success of a Zero-G flight requires you to have a moment where you truly experience the flight - see how this experience can create ripples in the rest of your life
Simpler is always better - scale back so that you can have the complete experience.
The data will be good and will lead to something, your failures will even lead to something
When you're feeling overwhelmed and feel you can afford it the least, that is the best time to take a break, go speak with someone who doesn't speak that language
Next we heard from Sana Sharma who flew an experiment last year exploring not only painting in zero-g but how the experiment itself was an artistic creation. She created the experiment to reflect who she was and how she could take Earth with her into ‘space’. Here are her key pieces of advice:
How can art and artistic endeavour tie you back to home
Document everything - this is so valuable at the end to pull together an understanding of how you did what you did (film things, take photos, keep sketches)
Be present and experience the embodied sensation in the moment
You will learn things on the flight that are impossible to anticipate on the ground
She also shared some of her memories of the flight itself. Here were her insights:
Surreal
Try to have fun
Discoveries on the microgravity parabolas:
Difficult not to push off the ground with her feet - which pushed her right to the ceiling where she stuck. Instead you can just gently float off the ground
Felt slightly ill - she was advised to lie down during hyper-g and not look around. On the second half of the flight she didn't do this (sat upright in hyper-g) and felt sick
Next time - she would simplify (reduce the number of experiments) and try to increase the amount done in free-float
The final part of class was working on our mood boards.
My focus here was combining the idea of taking depth-images in a dusty or debris-filled environment with techniques for collecting that dust. The images portray how dusty the lunar environment is, especially with the rover wheel’s movement, and what this means for the camera in terms of getting covered in dust and seeing through it. I’ve also included our preliminary design box for the camera and an artist’s rendering of what we will do with this depth-data: putting it into a VR environment for doing real science on the lunar surface in VR.
2021.09.21 Project Ideation
We will be combining our project with another project that is looking at dust collection using a soft wax inside the wheel of a rover. We are exploring how to use this wheel to disturb the dust for the camera capture in a controlled and repeatable method. We would also need to understand how to replace the wax in the wheel without releasing the simulant (glove box?).
We broke down our project into categories of development to assess what components could be bought early and what components need further consideration for the design.
The critical components we discussed (seen above) were:
Containment box:
Lighting - black out the box and include a strip or single point light source (controllable)
Lunar dust simulant - available in one of our team member’s labs
Needs to be fully contained
Mechanism to move the dust (wheel for dust collection?)
Camera system:
Accessible from outside the box for wiring and data collection
Data collection methodology
Ability to remove camera mid-flight to swap out camera with sapphire lens
Landscape for depth-data identification
Design simple 3D object to have as identification parameter
How to connect the box to the aircraft?
Work under vacuum?
Proposing depth-data acquisition in a lunar dust environment:
The RESOURCE team in the Human System Lab in Aero/Astro is examining different types of depth-data acquisition techniques to integrate depth-mapping into a virtual reality (VR) platform for lunar rover exploration missions. Through analogue testing we have determined that LiDAR combined with RGB imagery can provide the most comprehensive mapping while minimizing bandwidth. Using off-the-shelf components, we will be able to quickly adapt the hardware for near-term lunar missions. We have done preliminary testing of data collection using the Intel RealSense L515, which uses an 860 nm Class 1 infrared time-of-flight (ToF) laser combined with a 1920 x 1080-pixel RGB camera with an embedded colour image signal processor. By using a COTS part with integrated ToF and RGB imaging, we reduce the processing required to align different camera view-fields and positions when rendering the VR environment, as well as reducing development costs.
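To make the bandwidth claim concrete, the raw (uncompressed) stream rates can be sketched as below. The depth mode (1024 x 768 at 16 bits) and frame rates are assumptions to confirm against the L515 datasheet:

```python
def raw_data_rate_mb_s(width, height, bytes_per_px, fps):
    """Uncompressed stream data rate in MB/s (1 MB = 1e6 bytes)."""
    return width * height * bytes_per_px * fps / 1e6

# Assumed capture settings, to be confirmed against the camera's modes:
rgb = raw_data_rate_mb_s(1920, 1080, 3, 30)    # 8-bit RGB at 30 fps
depth = raw_data_rate_mb_s(1024, 768, 2, 30)   # 16-bit depth at 30 fps
print(f"RGB {rgb:.0f} MB/s + depth {depth:.0f} MB/s = {rgb + depth:.0f} MB/s")
```

Raw rates in the low hundreds of MB/s reinforce why any lunar deployment would need on-board processing or compression rather than streaming raw frames.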
Lunar dust is one of the primary challenges of returning to the lunar surface. It is ‘very abrasive, highly cohesive, [and impairs] optical instrumentation’ (Johansen, 2020). To prepare for flight-readiness assessments of the COTS part, in addition to vacuum, temperature and vibration testing, we need to determine the capabilities of the camera under lunar dust conditions. There are two main considerations to be tested: 1) the LiDAR data-collection capabilities with dust interference, or veiling (Liebe et al., 2004), and 2) lens scratch resistance with dust exposure. We propose the use of a 15 cm x 15 cm x 30 cm acrylic box with a mounted L515 LiDAR+RGB camera: CAD Model. We will add a layer of fine-grained lunar regolith within the box to simulate lunar dust. When in lunar gravity we will agitate the dust, simulating rover movement, and monitor the output of the L515. We would repeat this test for different levels of agitation of the dust, simulating different rover speeds. This would give a good indication of the camera’s ability to function with interfering particles and would allow us to monitor how soon after rover movement we could capture depth-data (the time required for the dust to settle after agitation). After the experiment we would assess the lens of the COTS part, identifying whether a sapphire lens would be needed, as is used on the Mars cameras (Edgett et al., 2009; Bell et al., 2003). Ideally, we would have two cameras, one with the COTS lens and a second with a sapphire lens, and perform the experiment on each to compare the damage caused by the lunar dust. Ideally this experiment would be performed in repeated lunar-gravity parabolas; however, the dust’s veiling effects and scratch resistance could be tested in microgravity as well.
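A rough first estimate of the settling timescale can be made ballistically. This ignores air drag, which in the pressurized cabin will slow fine particles much further, so it is only a lower bound on how long to wait before a clean capture:

```python
import math

G_EARTH = 9.81   # m/s^2
G_MOON = 1.62    # m/s^2 (lunar surface gravity)

def settle_time_s(height_m, g):
    """Drag-free fall time from height h: t = sqrt(2h/g).

    A lower-bound sketch only - cabin air drag will make fine regolith
    particles settle considerably more slowly than this.
    """
    return math.sqrt(2 * height_m / g)

# Dust lofted 10 cm takes ~2.5x longer to fall back in lunar gravity
print(f"Earth: {settle_time_s(0.10, G_EARTH):.2f} s")
print(f"Lunar: {settle_time_s(0.10, G_MOON):.2f} s")
```

The sqrt(g_earth/g_moon) ≈ 2.5 factor is why settling behaviour during the short lunar-gravity parabolas is worth measuring directly rather than extrapolating from ground tests.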
The ability to capture depth data on the Moon with a COTS part would make visualizing the lunar surface more accessible. Using this type of data to render VR environments would be beneficial not only to scientists exploring the lunar surface, but for outreach as well. It would give the rest of the world the opportunity to be up close and personal with our Moon.
References
· Johansen, M. R., “NASA Dust Mitigation Strategy”. The Impacts of Lunar Dust on Human Exploration, Plenary Address, February, 2020.
· Liebe, C. C., Scherr, L. M. and Wilson, R. G. “Sun-induced veiling glare in dusty camera optics”. Optical Engineering 43(2), February, 2004. https://doi.org/10.1117/1.1635835
· Bell, J. F., Squyres, S. W., Herkenhoff, K. E., Maki, J., Schwochert, M., Dingizian, A., Brown, D., Morris, R. V., Arneson, H. M., Johnson, M. J., Joseph, J., Sohl-Dickstein, J. N. “The Panoramic Camera (PanCam) Investigation on the NASA 2003 Mars Exploration Rover Mission.” Lunar and Planetary Science XXXIV, 2003 p. 1980