In-space communications have the potential to open up a plethora of mission topologies. Traditional mission designs using a single vehicle are losing favor to robust, distributed space systems that are more economical, and perhaps even more capable, thanks to recent advancements in low-power and miniaturized electronics.
Laser communications can enable more efficient, higher-bandwidth links across longer distances than conventional radio frequency (RF) systems. At optical frequencies, beam divergence angles are far narrower than those of RF systems, concentrating the transmitted energy into a much tighter cone and effectively giving a far higher transmit gain than an RF antenna of comparable aperture.
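To put rough numbers on that advantage, here is a minimal sketch comparing diffraction-limited divergence and ideal aperture gain for the same 10 cm aperture at an optical wavelength versus a Ka-band RF wavelength. The specific wavelengths, aperture size, and efficiency are illustrative assumptions, not MOSAIC parameters:

```python
import math

def divergence_half_angle_rad(wavelength_m, aperture_m):
    # Diffraction-limited half-angle to the first Airy null: theta ~ 1.22 * lambda / D
    return 1.22 * wavelength_m / aperture_m

def antenna_gain_db(wavelength_m, aperture_m, efficiency=0.55):
    # Ideal circular-aperture gain: G = eta * (pi * D / lambda)^2
    g = efficiency * (math.pi * aperture_m / wavelength_m) ** 2
    return 10.0 * math.log10(g)

# Same 10 cm aperture: 1550 nm optical vs ~32 GHz Ka-band (lambda ~ 9.4 mm)
optical = divergence_half_angle_rad(1550e-9, 0.10)
rf = divergence_half_angle_rad(9.4e-3, 0.10)
print(f"optical divergence: {optical * 1e6:.1f} urad")
print(f"RF divergence:      {rf * 1e3:.1f} mrad")
print(f"optical gain: {antenna_gain_db(1550e-9, 0.10):.1f} dB")
print(f"RF gain:      {antenna_gain_db(9.4e-3, 0.10):.1f} dB")
```

The shorter wavelength buys roughly 75 dB of gain here, which is exactly why the pointing requirement tightens from milliradians to microradians.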
However, by this same virtue, a laser communication system requires much more precise pointing and tracking (PAT) to establish and maintain links. Conventional PAT systems may use mechanical gimbals or MEMS mirrors. Mechanical gimbals are very capable but heavy, and more prone to failure, especially when subjected to launch and space environments. MEMS mirrors are compact, efficient, and fast, but offer a very limited actuation and beam-steering range, so they may need to be paired with an initial body-pointing stage.
MOSAIC aims to use liquid lens technology to bridge this performance gap. Liquid lenses use an optical fluid whose surface curvature, and therefore focal length, can be changed dynamically. Two technologies are under consideration: electrowetting and pressure-based lensing. Electrowetting works by altering a surface's hydrophobicity with an applied electric field, changing the curvature of a liquid droplet. Pressure-based liquid lenses work on a voice coil principle, changing the pressure in an optical fluid assembly to change the surface curvature.
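The electrowetting effect is usually described by the Young-Lippmann equation, cos θ(V) = cos θ₀ + ε₀εᵣV² / (2γd): the contact angle decreases (the droplet flattens) as voltage rises. A small sketch of that relationship, with illustrative material parameters that are assumptions rather than values for any particular lens:

```python
import math

def contact_angle_deg(voltage, theta0_deg=140.0, eps_r=3.0,
                      dielectric_thickness_m=1e-6, surface_tension=0.04):
    # Young-Lippmann: cos(theta) = cos(theta0) + eps0*eps_r*V^2 / (2*gamma*d)
    eps0 = 8.854e-12  # vacuum permittivity, F/m
    c = math.cos(math.radians(theta0_deg)) + \
        eps0 * eps_r * voltage ** 2 / (2 * surface_tension * dielectric_thickness_m)
    c = min(c, 1.0)  # the real effect saturates; clamp so acos stays defined
    return math.degrees(math.acos(c))

for v in (0, 20, 40, 60):
    print(f"{v:3d} V -> contact angle {contact_angle_deg(v):.1f} deg")
```

The quadratic voltage dependence is also why these drivers need fairly high output voltages and why polarity does not matter for the curvature.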
The current design uses an array of three liquid lenses: two are placed off-axis, perpendicular to each other, to provide two-axis beam steering, and one is placed on-axis to provide focusing capability.
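One way to see how a tunable lens steers at all: a collimated beam passing through a lens at a height h off the lens's optical axis is deflected by roughly θ ≈ h·P, where P = 1/f is the lens's optical power. Varying P therefore varies the deflection. The offset and power range below are hypothetical numbers, not the MOSAIC design values:

```python
def deflection_mrad(beam_offset_m, optical_power_dpt):
    # Small-angle thin-lens result: a ray at height h off the axis of a lens
    # with power P (dioptres, = 1/f) is bent by theta ~ h * P radians.
    return beam_offset_m * optical_power_dpt * 1e3

# Hypothetical: beam offset 2 mm, lens power tunable over -10..+10 dioptres
for power in (-10, 0, 10):
    print(f"P = {power:+3d} dpt -> deflection {deflection_mrad(2e-3, power):+.1f} mrad")
```

Two such lenses on perpendicular offsets give independent deflection axes, and the on-axis third lens can take out the defocus the steering pair introduces.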
This technology has the potential to provide long-range, wide field-of-view communications serving up to an entire hemisphere, without any additional body-pointing system, making it very favorable for small satellites from a systems-level view.
Being liquid, the lenses are also affected by gravity. The optical fluid in both lenses 'sags' and creates asymmetric steering. We wish to investigate the lenses' steering in microgravity, as this data would tell us their true capability in space and any required modifications to the pointing and tracking algorithms.
This week I headed into the lab to try to get the MOSAIC prototype working. Thankfully, the process is mostly as simple as putting the optical elements together, aligning them, and turning on the laser. I haven't quite had the time or willpower to align the optical elements yet, but it needs doing.
The final flight model will forgo the fisheye of the normal MOSAIC test setup and instead just have a camera positioned at the end. I've started working on getting the drivers for this camera working, as the beam will need to be focused on the image sensor for this to work optimally.
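Once frames are coming off the camera, the obvious first measurement is where the beam spot lands on the sensor. A minimal sketch of an intensity-weighted centroid, assuming frames arrive as 2D arrays of pixel intensities (the actual driver interface is still TBD):

```python
def beam_centroid(frame):
    # Intensity-weighted centroid of a 2D list of pixel values.
    # Returns (x, y) in pixel coordinates, or None for an all-dark frame.
    total = wx = wy = 0.0
    for y, row in enumerate(frame):
        for x, val in enumerate(row):
            total += val
            wx += x * val
            wy += y * val
    if total == 0:
        return None
    return (wx / total, wy / total)

# Synthetic 5x5 frame with a single bright pixel at (x=3, y=1)
frame = [[0] * 5 for _ in range(5)]
frame[1][3] = 255
print(beam_centroid(frame))  # -> (3.0, 1.0)
```

The same centroid would feed the tracking loop later, so it is worth getting right before the optics are even aligned.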
Finally, I also started work on the electrical design of the experiment. This is a great excuse to put a couple of liquid lens driver chips on a breakout board, so that I don't need to use the many boards the developer kit comes with. Liquid lens driver chips are a fairly niche bit of kit, but I can at least find one on Digikey. The other chip, the Maxim MAX14574, is interestingly the first chip I can't find anywhere except by going directly to Corning.
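While the breakout board is in flux, the software side can be desk-tested against a fake I2C bus. The register address, device address, and voltage scaling below are placeholders I made up, not values from the MAX14574 datasheet, so treat this purely as a structural sketch:

```python
class LensDriver:
    """Minimal I2C wrapper for a liquid-lens driver chip.

    ASSUMPTION: the address, register, and scaling constants below are
    placeholders, NOT taken from any real datasheet.
    """
    I2C_ADDR = 0x77      # placeholder device address
    REG_VOLTAGE = 0x01   # placeholder output-voltage register
    V_MAX = 70.0         # placeholder full-scale drive voltage

    def __init__(self, bus):
        # bus: anything with write_byte_data(), e.g. smbus2.SMBus(1) on a Pi
        self.bus = bus

    def set_voltage(self, volts):
        # Clamp to range, scale to an 8-bit code, write it out.
        volts = max(0.0, min(self.V_MAX, volts))
        code = round(volts / self.V_MAX * 255)
        self.bus.write_byte_data(self.I2C_ADDR, self.REG_VOLTAGE, code)
        return code

class FakeBus:
    # Records writes so the driver logic can be tested without hardware.
    def __init__(self):
        self.writes = []
    def write_byte_data(self, addr, reg, val):
        self.writes.append((addr, reg, val))

drv = LensDriver(FakeBus())
print(drv.set_voltage(35.0))  # -> 128 (half scale)
```

Swapping `FakeBus` for a real `smbus2.SMBus` instance is then a one-line change once the board exists and the real register map is in hand.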
I gotta write this one, still making diagrams :)
// TODO: Diagram of setup, electrical schems from kicad