Uncrewed Systems Technology 047 | Aergility ATLIS | AI focus | Clevon 1 UGV | Geospatial insight | Intergeo 2022 report | AUSA 2022 report | Infinity fuel cell | BeeX A.IKANBILIS | Propellers focus | Phoenix Wings Orca

Read all back issues online www.ust-media.com UST 47: DEC/JAN 2023 UK £15, USA $30, EUROPE €22 In training Focus on using deep learning networks for image analysis Cutting edge Advances in propeller designs to generate more lift for longer Take a load off How managed autorotation makes the ATLIS cargo carrier cost-effective and energy-efficient

Trusted performance Global leader in aerospace components for UAV and tactical applications. Power systems A wide range of power systems for UAVs, including alternators, starter-alternators, voltage regulators, power management units and starters. Our products are characterized by high power density and optimized design, based on decades of engineering and manufacturing experience. Turbine propulsion Engines designed to satisfy demanding mission requirements. Continuous innovation and solid engineering make them an ideal choice for the most demanding mission requirements. Best-in-class quality, mission assurance, affordability, supply chain security, compliance with DFAR flow-downs, and after-market support. Proudly made in Texas. Actuation Robust electromechanical actuators that meet a wide range of operational requirements, with digital communications and health and usage monitoring. Acutronic has leveraged its long history of systems engineering to deliver the highest-quality servos in this size and product class. Proudly made in Texas. acutronic.com

3 Contents | Uncrewed Systems Technology | December/January 2023
04 Intro: NASA’s latest mission to the Moon highlights the use of uncrewed systems to test out technologies before risking a human crew
06 Platform one: Mission-critical info. UAV sees through walls and exploits a wi-fi loophole, new AI system ranks alongside human car drivers, autonomous robotic dog for surveying construction sites, and much more
20 In conversation: Ben Kinnaman. The CEO of Greensea Systems talks to us about the architecture of the company’s OpenSea offshore robotics platform
24 Dossier: Aergility ATLIS. Design a VTOL system that’s simpler and safer than a helicopter, yet more efficient than an autogyro, and this is one result
38 Focus: AI. Using neural networks for image processing comes down to training and inference. We explain how they are carried out
48 Digest: Clevon 1. The development story behind this UGV, which has been designed and optimised for last-mile deliveries of parcels
56 Insight: Geospatial surveys. New uncrewed systems are proving to be superior to crewed vehicles for geospatial work, making them the go-to choice
66 Show report: Intergeo 2022. Our selection of the uncrewed technology and product highlights at this year’s geospatial survey and mapping show
76 Show report: AUSA 2022. Some of the key offerings for military applications unveiled at this annual Association of the US Army exhibition
78 Dossier: Infinity fuel cells. The need to vent a fuel cell’s waste water to the ambient air makes the technology unsuitable for underwater and space applications, but this company has found an answer to that
90 In operation: BeeX A.IKANBILIS. How this multi-thruster hovering AUV, and a technology called adaptive autonomy, is proving ideal for windfarm inspections
98 Focus: Propellers. Uncrewed systems builders want propellers with more thrust and longer lifespans, prompting a range of innovations by suppliers
108 Digest: Phoenix Wings Orca. How this cargo-carrying UAV has been designed in-house to maximise its range, number of deliveries – and profit
114 PS: Seabed ‘harvesting’ robots. In the search for valuable metal ores, mining companies are turning to the ocean floor. Here’s one eco-friendly approach

4 The launch of the Artemis space programme in November 2022 is a testament to determination. It has struggled to overcome delays for more than 10 years, with hydrogen leaks repeatedly halting the countdown. Now though, the uncrewed Orion capsule, launched by the giant SLS rocket, has travelled further than any previous spacecraft designed to carry humans, around the far side of the Moon and back. Photographs show unparalleled detail on the surface of our lunar satellite. But the programme, which we detailed in the Space insight in our previous issue (UST 46, October/November 2022), is also notable for using the uncrewed module to test technologies before risking a human crew. The mannequins on board are providing vital data on the performance of the module and its control systems throughout the mission. That feedback is forming the basis of a return to the Moon for humans in the next few years, and the possibility of establishing a lunar base. This role of using uncrewed systems to test out technology also extends to the depths of the ocean. For example, using machine learning to identify unexploded munitions on the seabed (see Platform One in this issue) is just one of the ways AI applications are being used to keep humans safe; we also explore using AI to monitor wind turbines in our Focus on page 38. 
Nick Flaherty | Technology Editor To the Moon and back Editorial Director Ian Bamsey Deputy Editor Rory Jackson Technology Editor Nick Flaherty Production Editor Guy Richards Contributor Peter Donaldson Technical Consultants Paul Weighell Ian Williams-Wynn Dr Donough Wilson Prof James Scanlan Design Andrew Metcalfe andrew@highpowermedia.com UST Ad Sales Please direct all enquiries to Freya Williams freya@ust-media.com Subscriptions Frankie Robins frankie@ust-media.com Publishing Director Simon Moss simon@ust-media.com General Manager Chris Perry Intro | December/January 2023 December/January 2023 | Uncrewed Systems Technology Volume Nine | Issue One December/January 2023 High Power Media Limited Whitfield House, Cheddar Road, Wedmore, Somerset, BS28 4EJ, England Tel: +44 (0)1934 713957 www.highpowermedia.com ISSN 2753-6513 Printed in Great Britain ©High Power Media All rights reserved. Reproduction (in whole or in part) of any article or illustration without the written permission of the publisher is strictly prohibited. While care is taken to ensure the accuracy of information herein, the publisher can accept no liability for errors or omissions. Nor can responsibility be accepted for the content of any advertisement. SUBSCRIPTIONS Subscriptions are available from High Power Media at the address above or directly from our website. Overseas copies are sent via air mail. 1 year subscription – 15% discount: UK – £75; Europe – £90 USA – £93.75; ROW – £97.50 2 year subscription – 25% discount: UK – £135; Europe – £162 USA – £168.75; ROW – £175.50 Make cheques payable to High Power Media. Visa, Mastercard, Amex and UK Maestro accepted. 
Quote card number and expiry date (also issue/start date for Maestro) ALSO FROM HPM THE COMMUNICATIONS HUB OF THE RACING POWERTRAIN WORLD SHANE TECKLENBURG: Unleashing extra horses NOV/DEC 2022 UK £15, US/CN $25, EUROPE €22 www.highpowermedia.com RADICAL INJECTION OPTIMISATION Appliance of science to a 401 Ford V8 CLASSIC SUPERBIKE SECRETS Revitalising the Kawasaki ZXR THE HIDDEN LIFE OF ENGINES Focus on advanced simulation tools ISSUE 017 | JAN/FEB 2023 UK £15 USA $30 EUROPE €22 E-MOBILITY ENGINEERING THE COMMUNICATIONS HUB OF THE ELECTRIFIED POWERTRAIN Gentle probing Not too hot, not too cold Non-destructive ways of testing EV batteries Thermal management solutions for EV battery charging and safety Sites of the future ECE’s work on electrifying Doosan excavators The USE network Having now provided several enterprises around the world with the support and connections they need to implement efficient and sustainable technological solutions, we’re keen to continue expanding this free service. If the uncrewed vehicle and/or system you’re working on could benefit from some independent advice, from engineers specialising in the appropriate field, then please do get in touch. Email your question/challenge/dilemma/predicament to thenetwork@uncrewedsystemsengineering.com or visit www.uncrewedsystemsengineering.com and raise a case with us. All questions will be treated in the strictest confidence, and there’s no obligation whatsoever to follow any recommendations made.

The SRoC is a standardized product built for Defence robotics. It offers high communication flexibility by interfacing through our new Swappable Radio Modules (SRM) and/or the Nett Warrior connector. www.uxvtechnologies.com A breakthrough in robotic control UXV Technologies, Inc. SRM compatible | Ruggedized | 25 different inputs THE BRAND NEW SROC Contact us

6 Mission-critical info for UST professionals Platform one Researchers in Canada have developed a UAV that can use wi-fi signals to see through walls (writes Nick Flaherty). The device, nicknamed Wi-Peep, can fly near a building and identify and locate all the wi-fi-enabled devices inside it in a matter of seconds. The Wi-Peep exploits a loophole the researchers call ‘Polite wi-fi’. Even if a network is password protected, smart devices will automatically respond to contact attempts from any device within range. The time-of-flight protocol used by Wi-Peep sends several messages to a device as it flies and then measures the response time for each, enabling it to identify the device’s location to within a metre. “The Wi-Peep devices are like lights in the visible spectrum, and the walls are like glass,” said Dr Ali Abedi, a professor of computer science at the University of Waterloo in Canada. “Using similar technology, one could track the movements of security guards inside a bank by following the location of their phones or smartwatches. Likewise, a thief could identify the location and type of smart devices in a home, including security cameras, laptops and smart TVs, to find a good candidate for a break-in. In addition, the device’s operation via UAV means it can be used quickly and remotely without much chance of the user being detected,” he said. The system is built from a commercial UAV and $20 of easily purchased 2.4 GHz radio detection hardware that weighs 10 g, making it light enough to mount on a UAV. “As soon as the Polite wi-fi loophole was discovered, we realised that this kind of attack was possible,” said Dr Abedi. “On a fundamental level, we need to fix the loophole so that our devices do not respond to strangers. We hope our work will inform the design of next-generation protocols.” Airborne vehicles Wi-fi loophole warning December/January 2023 | Uncrewed Systems Technology The Wi-Peep UAV highlights a loophole in wi-fi-enabled devices
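The ranging principle behind this kind of attack is simple enough to sketch: subtract the responder's fixed reply delay from the measured round-trip time, halve what remains, and multiply by the speed of light. A minimal illustration follows; the function names and the assumption of a known, constant reply delay are ours for illustration, not details of the Wi-Peep implementation.

```python
# Sketch of time-of-flight ranging: the round-trip time (RTT) minus the
# responder's fixed reply delay gives twice the signal's travel time.
# Averaging several noisy RTTs is why a ranging device sends several
# messages to each target.

C = 299_792_458.0  # speed of light, m/s


def distance_from_rtt(rtt_s: float, reply_delay_s: float) -> float:
    """Estimate one-way distance (m) from a single round-trip time."""
    return C * (rtt_s - reply_delay_s) / 2.0


def averaged_distance(rtts, reply_delay_s: float) -> float:
    """Average several RTT measurements before ranging, to reduce noise."""
    mean_rtt = sum(rtts) / len(rtts)
    return distance_from_rtt(mean_rtt, reply_delay_s)
```

Because light covers a metre in about 3.3 ns, metre-level accuracy implies nanosecond-level timing, which is why many exchanges are averaged.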

7 Platform one Uncrewed Systems Technology | December/January 2023 Researchers have developed an alternative positioning system using mobile phone networks and an atomic clock that is more robust and accurate than satnav in urban settings (writes Nick Flaherty). The team, at Delft University of Technology (DUT) and Vrije Universiteit Amsterdam (VUA), achieved an accuracy of 10 cm in a prototype system. Radio signals from navigation satellites such as GPS and Galileo are weak when received on Earth, making accurate positioning impossible if the radio signals are reflected or blocked by buildings. “That can make GPS unreliable in urban settings, for instance,” said Christiaan Tiberius at DUT and coordinator of the project. “That is a problem if we ever want to use automated vehicles. So far, we had no back-up system.”  The project, called SuperGPS, developed an alternative positioning system that makes use of the mobile telecoms network instead of satellites, which could be more robust and accurate than GPS. “We realised that the telecoms network could be transformed into a very accurate alternative positioning system that is independent of GPS,” said Jeroen Koelemeij of VUA. “We have developed a system that can provide connectivity like existing mobile and wi-fi networks, as well as accurate positioning and time distribution like GPS.” The key is connecting the mobile network to a very accurate atomic clock that broadcasts precisely timed messages for positioning. This is hosted by VSL, the National Metrology Institute of the Netherlands, and distributed through radio transmitters that are connected and time-synchronised at the sub-nanosecond level through a fibre optic Ethernet network. VSL has extensive experience with a protocol called White Rabbit, developed at the CERN particle accelerator lab, that allows timing signals to be carried over long distances using fibre cables. 
“We had already been investigating techniques to distribute the national time produced by our atomic clocks to users elsewhere through the telecoms network,” said Erik Dierikx of VSL. “With these techniques we can turn the network into a nationwide distributed atomic clock – with many new applications such as very accurate positioning through mobile networks. “With the hybrid optical-wireless system we have demonstrated, in principle anyone can have wireless access to the national time produced at VSL. Basically it forms an extremely accurate radio clock that is good to one billionth of a second.” The system uses radio signals with a bandwidth much larger than usual. “Buildings reflect radio signals, which can confuse navigation devices,” said Gerard Janssen at DUT. “The large bandwidth of our system helps to sort out these confusing signal reflections, and enables higher positioning accuracy. “At the same time, bandwidth in the radio spectrum is scarce and therefore expensive. We circumvent that by using a number of related small-bandwidth radio signals spread over a large virtual bandwidth. That means only a small fraction of the virtual bandwidth is actually used, and the signals can be very similar to those of mobile phones.” Driverless cars ‘SuperGPS’ for urban roads The system improves on satnav positioning in urban areas by exploiting mobile phone networks
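The positioning step itself can be illustrated with a toy multilateration solver: given time-synchronised transmitters at known positions, the pseudoranges derived from the timed messages pin down the receiver by least squares. This is a sketch of the principle only, in two dimensions and with a simple gradient-descent solver; it is not the SuperGPS processing chain, and the anchor layout is invented.

```python
import math

# Minimal 2-D multilateration: minimise the sum of squared range
# residuals sum(0.5 * (|p - a_i| - r_i)^2) over receiver position p,
# given anchor positions a_i and measured ranges r_i.


def locate(anchors, ranges, guess=(0.0, 0.0), iters=3000, lr=0.05):
    """Gradient descent on the squared range residuals."""
    x, y = guess
    for _ in range(iters):
        gx = gy = 0.0
        for (ax, ay), r in zip(anchors, ranges):
            d = math.hypot(x - ax, y - ay) or 1e-12  # avoid divide-by-zero
            e = d - r                    # range residual
            gx += e * (x - ax) / d       # gradient of 0.5 * e**2
            gy += e * (y - ay) / d
        x -= lr * gx
        y -= lr * gy
    return x, y
```

With three or more well-spread anchors the residual surface has a clear minimum at the true position; reflections (multipath) corrupt the ranges, which is why the article stresses the large virtual bandwidth used to separate reflected signals.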

8 Platform one Researchers in Germany have developed an autonomous AI system that is comparable to a human driver for detecting pedestrians and other road users (writes Nick Flaherty). The Cognitive Neuroinformatics research group at the University of Bremen has worked with automotive supplier Continental to develop the system as part of the Proreta 5 project. The work includes researchers from TU Darmstadt and TU Iasi in Romania, and is the latest stage in a project that has been running for 20 years to develop autonomous technologies. The research vehicle was equipped by Continental with sensors and computers to test the resulting functional and verification methods for the automated driving system directly under real conditions. Methods included multi-modal prediction of dynamic behaviour of an object, specifying and testing traffic rules compliance, and logic-based testing to detect unsafe behaviour of AI modules. “The great advantage of AI is that, after a training phase, it is able to draw the right conclusions in unknown situations based on what it has learned,” said Prof Schill, head of the Cognitive Neuroinformatics working group at the University of Bremen. “One part of the project was to observe the human drivers as they reduce and evaluate the complexity of the environment themselves. The adaptive algorithms are now being trained according to similar principles.” In the project, the Cognitive Neuroinformatics group investigated AI methods for recognising objects and obstacles in the environment. An attention-driven pipeline identified relevant areas in camera images using saliency maps that show where a driver’s attention is focused first, for example on other road users or when signs appear. Then the driver’s gaze was projected into the image to expand the relevant area. 
This distinguished between relevant and non-relevant regions in the image that were used for building mathematically correct models to represent the position, orientation, speed or size of other road users and describe how the other vehicles are moving. The other element of the project was to implement object tracking to perceive road users in the monitoring area and estimate their state over time using radar and Lidar data. A list of tracked objects is then sent to the prediction, planning and control software modules for further processing, and the state of each object is estimated using a probabilistic filter. Another aspect of the project developed new models to describe articulated vehicles such as buses, trams or vehicles with trailers in a mathematically correct way, which were then added to the tracking algorithms. The first Proreta project in 2002 developed emergency braking systems that are now a mainstream technology. Driverless cars AI rivals human drivers December/January 2023 | Uncrewed Systems Technology The system is now being trained according to how people evaluate a driving environment’s complexity
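The probabilistic filter used to estimate each tracked object's state is typically a Kalman filter. A minimal one-dimensional constant-velocity version shows the predict/update cycle; real trackers carry richer states (position, orientation, speed, size) and multi-sensor measurements, but the structure is the same. This is a textbook sketch, not the Proreta 5 implementation.

```python
# Minimal 1-D constant-velocity Kalman filter. State is (position x,
# velocity v); each step predicts forward by dt, then corrects the
# prediction with a position measurement z.


def kalman_track(measurements, dt=0.1, q=0.01, r=1.0):
    x, v = measurements[0], 0.0          # initial state
    pxx, pxv, pvv = 1.0, 0.0, 1.0        # covariance entries
    estimates = []
    for z in measurements[1:]:
        # Predict: x' = x + v*dt, P' = F P F^T + Q
        x += v * dt
        pxx += dt * (2 * pxv + dt * pvv) + q
        pxv += dt * pvv
        pvv += q
        # Update with position measurement z (H = [1, 0])
        s = pxx + r                       # innovation covariance
        k_x, k_v = pxx / s, pxv / s       # Kalman gains
        innov = z - x
        x += k_x * innov
        v += k_v * innov
        pxx_new = (1 - k_x) * pxx
        pxv_new = (1 - k_x) * pxv
        pvv -= k_v * pxv                  # uses pre-update pxv
        pxx, pxv = pxx_new, pxv_new
        estimates.append((x, v))
    return estimates
```

For an object whose motion actually matches the model, the estimates converge on the true position and velocity; process noise `q` keeps the filter responsive when the object manoeuvres.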

Fiberpro Your Navigation Partner For Additional Information, Visit www.fiberpro.com Fiber Optic Gyroscope FG 150 FOG Inertial Measurement Unit FI 200C / FI 200P (ITAR FREE) Inertial Navigation System FN 210 FIBERPRO Fiber Optic Gyroscope since 1995

10 Platform one Trimble has teamed up with Exyn Technologies to develop an autonomous robotic system that can be used for surveying complex environments where existing positioning technologies won’t work (writes Nick Flaherty). The system uses the Spot robotic dog from Boston Dynamics with the Trimble X7 3D laser scanner and the ExynPak machine learning platform. The combination allows Spot to move fully autonomously inside complex and dynamic construction sites to capture consistent and precise surveying data. “Integrating autonomous surveying technology into a construction workflow can improve operational efficiency and transparency throughout a build, while also transforming worker safety for potentially hazardous data collection,” said Aviad Almagor, vice-president of technology innovation at Trimble. A tool called ExynAI can sense and avoid obstacles, dynamically adapting to the changing environment of sites. The ExynPak uses a Velodyne Lidar sensor on a gimbal with two Chameleon 5 MP cameras from FLIR mounted on Spot for Level 4 autonomous operation without the need for satellite navigation. That avoids the need for an operator or for the robot to learn about its environment beforehand. A surveyor defines a 3D volume for a mission, and the integrated robotic solution handles the complexities of self-navigation without needing a map, GPS or wireless infrastructure. Integrating the Trimble X7 provides the 3D laser scanning to capture the state of the environment. The captured data can be uploaded to the Trimble Connect collaboration platform to be shared, which can include a comparison with Building Information Models and previous scans to monitor quality and progress. This creates a detailed map with minimal human intervention and risk.  
Spot the dog’s survey skill Ground vehicles Spot can autonomously survey complex, GNSS-denied construction sites December/January 2023 | Uncrewed Systems Technology Researchers at the University of Bath in the UK have developed a machine learning (ML) technique to detect unexploded bombs at the bottom of the sea (writes Nick Flaherty). The system uses large, unlabelled survey datasets to aid automatic classification from synthetic aperture sonar (SAS) data. The researchers simulated this data to train an AI framework that would be used by vehicles such as the REMUS 620 (see page 16) to autonomously detect munitions that have been discarded. Autonomous underwater vehicles equipped with SAS can survey large areas at centimetre resolution, but that generates a lot of data that needs an automated approach to detecting and classifying unexploded munitions. The ML model encodes a representation of SAS images from which new SAS views can be generated. This requires the model to learn the physics and content of the images without the need for human labels in self-supervised learning. A more accurate graphics technology called ray tracing was used to generate realistic images, and noise was then added to match the statistics of real SAS images. These were 18,000 images of the Skagerrak UXO dumpsite taken using the HISAS 1030 sonar system. The pre-trained model can then be fine-tuned to perform classification on a small amount of labelled examples with 700 training images and 2500 test images. A 250 kg, 1.8 m-long bomb was used to demonstrate the accuracy of the system, which worked better than a traditional self-supervised approach and systems with no pre-training. AI system detects UXBs Seabed safety

T-motor THE SAFER PROPULSION SYSTEM POWER MAKES YOUR EXPLORATION www.tmotor.com Atlas has developed a software-defined radio (SDR) module for UAVs with a reliable anti-interference mode (writes Nick Flaherty). The 3 W radio module supports frequency hopping as a key part of the anti-jamming algorithm in UAVs. It measures 31 x 37 x 4 mm and weighs only 7 g. “The key feature of our SDR is an anti-jamming mode, meaning that the radio has frequency hopping and is able to work on frequencies from 2.2 to 2.7 GHz,” said Ivan Tolchinsky, CEO of Atlas. “The radio scans frequencies, and if it detects any frequency blocking, it starts hopping between the frequencies. “The radio is available on civilian frequencies as well, and it can be used for commercial purposes. Once we have tested it on our own UAV systems, it will be available as a standalone product.” Atlas has also used the latest version of its mesh network radio, AtlasMESH, in the unit to provide longer-range connections by routing signals through other radios. Tolchinsky said the company is working on the next version of the module, which will be able to define the wave factor with a wide range of frequencies. SDR resists interference Airborne vehicles The UAV software-defined radio’s key feature is an anti-jamming mode
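The hopping behaviour Tolchinsky describes reduces to a channel-selection rule: monitor the band, and if the current channel looks blocked, move to the quietest alternative. The channel plan, threshold and function below are illustrative assumptions for a sketch of the idea, not Atlas parameters.

```python
# Sketch of a frequency-hopping decision rule: stay on a clean channel,
# hop to the quietest alternative when interference exceeds a threshold.
# The 2.2-2.7 GHz channel plan and -70 dBm threshold are assumptions.

CHANNELS_MHZ = list(range(2200, 2701, 50))   # candidate channels (assumed)
JAM_THRESHOLD_DBM = -70.0                     # assumed jamming threshold


def next_channel(current_mhz: int, noise_floor_dbm: dict) -> int:
    """Pick the channel to use, given measured interference per channel."""
    if noise_floor_dbm[current_mhz] < JAM_THRESHOLD_DBM:
        return current_mhz                    # current channel still clean
    # hop to the quietest channel other than the current one
    candidates = [c for c in CHANNELS_MHZ if c != current_mhz]
    return min(candidates, key=lambda c: noise_floor_dbm[c])
```

A real link would also coordinate the hop with the far end, typically via a pseudo-random hop sequence both radios share, so that jamming one channel cannot break the link.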

As autonomous vehicles become a reality, emergency vehicle detection will be critical to providing drivers with information in those emergencies. 12 Microphones from Infineon Technologies are being used to detect the sirens of emergency vehicles to ensure that autonomous cars move out of their way (writes Nick Flaherty). Many countries are introducing regulations that require drivers to give way to emergency vehicles. As more autonomous cars are deployed, regulations are likely to include a provision for detecting and responding to emergency vehicles in order to meet safety requirements. This will require a combination of audible and visual warning signals. The system, developed by Cerence, uses MEMS microphones with its Emergency Vehicle Detection (EVD) software to actively detect approaching emergency vehicles, especially when they are not in sight. The system combines an array of the IM67D130A XENSIV MEMS microphones, strategically placed on the outside of the vehicle, with the Cerence EVD. With a total harmonic distortion of less than 0.5% at a sound pressure level of 94 dB and a high acoustic overload point of 130 dBSPL, the microphones can capture distortion-free audio signals in noisy environments. That allows signals to be reliably classified even when background noise obstructs the siren tone. Cerence EVD can be integrated into the automotive assistant or on separate microcontrollers. It can also estimate the source of the sound from sirens of police cars, ambulances and fire trucks. Once a siren is identified, the driving assistant is told to take the appropriate action, whether to pull over or continue to a safe space. “As autonomous vehicles quickly become a reality, emergency vehicle detection will be critical to providing drivers with the information they need in emergency situations,” said Christophe Couvreur, senior vice-president and general manager, core products, at Cerence. 
“By partnering with Infineon, we are providing OEMs with an integrated hardware- and software-based emergency vehicle detection system.” A development kit helps developers to quickly evaluate the MEMS microphones and test different placement configurations on the vehicle to get the best results. Listening out for sirens Driverless cars The Cerence system enables driverless cars to give way to emergency vehicles December/January 2023 | Uncrewed Systems Technology
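Estimating where a siren is coming from with a microphone array rests on arrival-time differences between microphones: sound arriving at an angle reaches one microphone slightly later than another. A textbook two-microphone bearing estimate is sketched below; it illustrates the principle only and is not the Cerence EVD algorithm, whose details are not public here.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

# Two microphones a distance d apart: a plane wave arriving at angle
# theta from broadside produces an inter-mic delay
#   delta_t = d * sin(theta) / c.
# Inverting that delay gives the bearing to the source.


def bearing_from_delay(delta_t_s: float, mic_spacing_m: float) -> float:
    """Return the arrival angle in degrees from the inter-mic delay."""
    s = SPEED_OF_SOUND * delta_t_s / mic_spacing_m
    s = max(-1.0, min(1.0, s))     # clamp against measurement noise
    return math.degrees(math.asin(s))
```

With several microphone pairs placed around the vehicle, bearings from each pair can be intersected to localise the source, which is how an array can tell an assistant whether the siren is approaching from ahead or behind.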

Platform one Researchers in Belgium have developed the first UAV navigation architecture to fuse an event-based camera and a frequency modulated continuous wave radar (writes Nick Flaherty). Each sensor is processed by a bio-inspired spiking neural network (SNN) with continual spike timing-dependent plasticity (STDP) learning similar to that used in the human brain. Unlike current neural networks used for SLAM systems, this approach does not require any offline training phase. Instead, the SNN continuously learns features from the input data on the fly via STDP learning. An event-driven camera (also called a dynamic vision sensor, DVS) is a new type of imaging sensor composed of independent pixels that asynchronously emit spikes whenever there is a change in light intensity. In contrast to traditional image sensors that capture RGB light, DVS cameras perform well in low-light conditions, producing a stream of data that contains patterns in both the spatial and spike timing dimensions. This can then be used as the input for a spiking neural network (see AI focus, page 38). The researchers used a Davis 346 DVS camera from Inivation that weighs 100 g without the lens. It has a measurement area of 346 x 260 pixels and delivers 12 million events per second to the SNN, combined with the data from the radar sensor and an ultra-wideband sensor that provides the distance to the ground. The SNN outputs are then used for map correction. The team, at KU Leuven, conducted numerous experiments to benchmark the system against state-of-the-art AI methods, showing that it performs particularly well when there are big variations in the lighting. It was tested on three different flight sequences in a warehouse environment. Navigation by neural net Airborne vehicles The UAV’s bio-inspired spiking neural net does not require any offline training www.harwin.com Connectors shown actual size ENABLING TECHNOLOGY EVERYWHERE Harwin’s connector products are proven to perform in extreme conditions, with shock, vibration and temperature range rigorously tested. Micro connectors start at 1.25 mm pitch delivering 2 A per contact, up to 8.5 mm and 60 A - we cover a wide range of applications for when SWaP matters most. With our quality, service, support, and highly reliable products, you can depend on Harwin.
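The STDP rule used in spiking networks like the KU Leuven one is classically written as an exponential window: a presynaptic spike shortly before a postsynaptic one strengthens the synapse, the reverse order weakens it, and the effect decays with the spike-time gap. The sketch below uses generic textbook constants, not the values of the network in the article.

```python
import math

# Classic pair-based STDP: weight change depends on the time difference
# dt = t_post - t_pre between a pre- and postsynaptic spike.
# Amplitudes and the 20 ms time constant are generic textbook values.

A_PLUS, A_MINUS = 0.01, 0.012   # potentiation / depression amplitudes
TAU_MS = 20.0                   # decay time constant


def stdp_dw(t_pre_ms: float, t_post_ms: float) -> float:
    """Weight change for one pre/post spike pair."""
    dt = t_post_ms - t_pre_ms
    if dt >= 0:                               # pre before post: potentiate
        return A_PLUS * math.exp(-dt / TAU_MS)
    return -A_MINUS * math.exp(dt / TAU_MS)   # post before pre: depress
```

Because the rule depends only on local spike times, weights can be updated continuously as data streams in, which is what lets such a network learn on the fly without an offline training phase.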

14 Platform one NI has created a unified test system architecture that allows engineers to move iteratively between data replay and hardware-in-the-loop (HiL) testing (writes Nick Flaherty). HiL is used extensively to validate the perception, planning and control algorithms running on ECUs in autonomous driving (AD). The NI Replay and HiL AD system can combine and inject real-world road test data or simulation scenarios to test the ECUs. By providing a unified toolchain, common hardware configurations and test automation infrastructure across the development workflow, engineers can improve test coverage and so reduce the development and test times. The Replay part of the system has to feed the HiL system with external data in the same way as it would be done with the vehicle during actual test drives. This is used to increase test coverage reliability and repeatability through direct injection techniques to insert faults, frame delays and more into the sensor bitstream. The Replay element links to the AD controller through multiple I/O and automotive bus signals, with tight control over timing and data synchronisation. This also has to be adaptable for future I/O and test requirements as systems continue to add more cameras, radar, Lidar and other sensor types. NI uses an open architecture to provide full validation test coverage for AD functions, making existing data more usable throughout the entire product lifecycle. Module interface and processing cards using the PXI standard provide hardware and software fault location with nanosecond synchronisation and timing control for a reliable execution of test cases. Modular hardware generates signals to emulate radar objects, camera interfaces, vehicle bus traffic and general-purpose I/Os to test sensor fusion on the AD controller. An open software approach enables interfacing with and sourcing data from IT infrastructure and cloud service providers such as Microsoft Azure, AWS and Seagate. 
This creates the unified test system architecture that gives the ability to move back and forth between data replay and HiL testing with the same system and a single toolchain for data recording, HiL and test coverage. Developer ZF Mobility Solutions is using the connected workflow and HiL system for AD development. Driverless cars Iterative ECU tester Combining real-world data replay and hardware-in-the-loop testing allows ECU test times to be reduced December/January 2023 | Uncrewed Systems Technology UAV Navigation has enhanced the internal algorithms of its autopilot during the landing phase to include differential braking control of the undercarriage for fixed-wing UAVs (writes Nick Flaherty). The system ensures that the autopilot can autonomously keep the aircraft on the centreline of a runway during the braking manoeuvre. The Vector-600 autopilot is able to control an aircraft on the ground, using brakes on the undercarriage wheels in order to allow a UAV to be steered safely, even at landing speed. This is a complex operation which, if poorly executed, could lead to the brakes becoming locked up and a subsequent loss of control. The algorithm takes platform ground speed into account at each stage, applying the correct amount of braking force to ensure a safe and controlled manoeuvre. Runway landing manoeuvres for fixed-wing UAVs are critical and complex, and precise control is required to enable smooth transitions between the various stages – final approach, transition to level flight and speed reduction over the runway, the flare, initial contact with the runway and finally the braking manoeuvre – to bring them to a safe, controlled stop on the runway centreline. Some flight control computer algorithms complete an autonomous operation when the UAV touches down. Differential braking however enables the flight control solution to execute a truly complete operation of an uncrewed aircraft in a safe and reliable way. 
Safer landings promised | Airborne vehicles: differential braking allows fixed-wing UAVs to keep to a runway’s centreline during landing
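As an illustration of the idea, not UAV Navigation's actual algorithm, a differential braking law might scale overall braking down with ground speed (to avoid locking a wheel) while biasing the left/right split to steer back towards the centreline. All gains and names below are invented for this sketch:

```python
# Hypothetical differential braking sketch for a fixed-wing UAV on rollout.
# Braking authority is eased off at high ground speed to avoid wheel lock-up,
# and the left/right split steers the aircraft back to the runway centreline.

def differential_brake(cross_track_m, ground_speed_ms,
                       gain=0.08, max_cmd=1.0, lock_margin=0.6):
    """Return (left, right) brake commands in [0, max_cmd].

    cross_track_m   : lateral offset from the centreline (+ve = right of it)
    ground_speed_ms : current ground speed; higher speed -> gentler braking
    """
    # Base symmetric braking, reduced at speed to keep a margin from lock-up
    base = max_cmd * lock_margin / (1.0 + 0.1 * ground_speed_ms)
    # Steering term: brake harder on the side we need to yaw towards.
    # Right of centreline (+ve) -> more left brake -> yaw left, back to centre.
    steer = gain * cross_track_m
    left = min(max_cmd, max(0.0, base + steer))
    right = min(max_cmd, max(0.0, base - steer))
    return left, right
```

For example, an aircraft 2 m right of the centreline at 20 m/s gets a noticeably stronger left-brake command than right, while an aircraft on the centreline brakes symmetrically, more gently the faster it is rolling.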

CubePilot ecosystem – Here4: CubePilot's latest multi-band, multi-constellation RTK GNSS module. The Here4 integrates DroneID for flight information sharing and provides additional features for those keen to experiment, like onboard PWM and a full IMU solution. On board: STM32H757 processor, RM3100 compass, barometer, IMU, PWM, RTK and CubeID (DroneID). Copyright © 2022 CubePilot Australia. All Rights Reserved.

A laboratory at the University of Plymouth is setting up a £1.2 million project to create an underwater data network in Smart Sound Plymouth (writes Nick Flaherty). The network, called Smart Sound Connect Subsurface, will be used to test prototype autonomous underwater systems and services.

There is already a ‘wave relay’ mesh of 4G and 5G radio, developed with Steatite and Vodafone, that reaches 20 miles out to sea, running from buoy to buoy. The Plymouth Marine Laboratory will install an underwater acoustic comms network that works to depths of 75 m from each buoy and connects into this surface network. The lab is working with acoustic modems supplier Sonardyne on a range of projects.

“The Smart Sound Connect Subsurface network will integrate into the existing advanced surface networks to deliver a fully connected environment, combining underwater, surface and aerial platforms to deliver a testbed for marine autonomy,” said Dr James Fishwick, Head of Smart Sound Plymouth at the laboratory.

The network is a key element of the National Centre for Coastal Autonomy, the UK’s first fully integrated autonomous coastal observing and monitoring network, which will use the latest autonomous boats and submarines. “This will deliver a unique subsea comms network within Smart Sound Plymouth,” said Dr Alex Nimmo Smith, Associate Professor in Marine Physics at the university.

Marine autonomy network | Subsea systems

The Mission Technologies division of HII in the US is developing a medium-class UUV with a range of 200 nautical miles when carrying a sonar payload (writes Nick Flaherty). The REMUS 620 has a battery life of up to 110 hours and a range of 275 nautical miles unladen, or 78 hours and 200 nautical miles with HII’s standard synthetic aperture sonar payload. It is intended for mine countermeasures, hydrographic surveys, intelligence collection, surveillance and electronic warfare.
A prototype is being built, with production planned for the end of 2023. It has a modular, open-architecture design for a range of payloads and HII’s Odyssey suite of autonomy software. The software allows multiple REMUS 620s to operate collaboratively or to be deployed from submarines, small crewed or uncrewed boats, amphibious ships or even a helicopter. The craft can also be used as a platform to launch and operate other vehicles or payloads.

The batteries and energy modules have a standard, open interface and so are swappable, allowing quick turnaround and the incorporation of alternative energy sources such as fuel cells as they become available.

“The REMUS 620 is the first medium UUV designed to accurately deliver this range of advanced above-and-below water effects at long range,” said Duane Fotheringham, president of Mission Technologies’ Unmanned Systems business group. “It was designed from the ground up for ease of payload integration, and it can support up to three interchangeable energy sections that can be charged inside or outside the vehicle.”

Payloads can be replaced or enhanced for multi-mission capabilities, including intelligence, surveillance and reconnaissance, as well as cyber and electronic warfare operations.

Modular-payload UUV | Underwater vehicles

The network will be used to test prototype autonomous underwater systems. The REMUS 620 is designed for long-endurance missions such as electronic warfare.

A differential oscillator is providing more accurate timing for autonomous vehicles (writes Nick Flaherty). The oscillator, developed by SiTime, uses a MEMS architecture to provide more accurate timing than existing oscillators that use quartz crystals.

As automotive safety systems integrate more sensors and cameras, they are generating an explosion of data that is crucial for safe, autonomous operation. A typical vehicle generates 2 Tbytes of data each hour, according to the Automotive Edge Computing Consortium, but that is set to increase to 20 Tbytes per hour by 2025. All that sensor data must be transferred at very high speeds within the in-car network, even in the most demanding environments, to enable the central processing unit to make timely decisions for a safe, reliable journey.

Timing technology is one of the weakest links in vehicle electronics. Quartz crystals provide a timing signal, but they are susceptible to vibration, shock and extreme temperatures, and their performance degrades over time.

The MEMS oscillator, built in silicon, uses a differential structure to cancel out internal noise, providing a typical jitter of 150 fs (femtoseconds; one femtosecond is a quadrillionth of a second) and ±30 ppm stability. Parts with a stability of ±25 ppm or better are also possible with the architecture, providing a signal from 1 MHz to 920 MHz. The differential output drivers include LVPECL, LVDS, HCSL and low-power HCSL. A proprietary FlexSwing output driver performs like LVPECL but provides independent control of voltage swing and DC offset. The oscillator is designed to interface with chipsets that have non-standard input voltage requirements without requiring external source-bias resistors.

MEMS timer | Driverless cars

Dr Donough Wilson
Dr Wilson is innovation lead at aviation, defence, and homeland security innovation consultants VIVID/futureVision.
His defence innovations include the cockpit vision system that protects military aircrew from asymmetric high-energy laser attack. He was the first to propose the automatic tracking and satellite download of airliner black box and cockpit voice recorder data in the event of an airliner’s unplanned excursion from its assigned flight level or track. For his ‘outstanding and practical contribution to the safer operation of aircraft’ he was awarded the Sir James Martin Award 2018/19 by the Honourable Company of Air Pilots.

Paul Weighell
Paul has been involved with electronics, computer design and programming since 1966. He has worked in the real-time and failsafe data acquisition and automation industry using mainframes, minis, micros and cloud-based hardware, on applications as diverse as defence, Siberian gas pipeline control, UK nuclear power, robotics, the Thames Barrier, Formula One and automated financial trading systems.

Ian Williams-Wynn
Ian has been involved with uncrewed and autonomous systems for more than 20 years. He started his career in the military, working with early prototype uncrewed systems and exploiting imagery from a range of systems from global suppliers. He has also been involved in groundbreaking research, including novel power and propulsion systems, sensor technologies, communications, avionics and physical platforms. His experience covers a broad spectrum of domains, from space and air to maritime and ground, in both defence and civil applications including, more recently, connected autonomous cars.

Professor James Scanlan
Professor Scanlan is the director of the Strategic Research Centre in Autonomous Systems at the University of Southampton, in the UK. He also co-directs the Rolls-Royce University Technical Centre in design at Southampton. He has an interest in design research, and in particular how complex systems (especially aerospace systems) can be optimised.
More recently, he established a group at Southampton that undertakes research into uncrewed aircraft systems. He produced the world’s first ‘printed aircraft’, the SULSA, which was flown by the Royal Navy in the Antarctic in 2016. He also led the team that developed the ULTRA platform, the largest UK commercial UAV, which has flown BVLOS extensively in the UK. He is a qualified full-size aircraft pilot and also holds UAV flight qualifications.

Uncrewed Systems Technology’s consultants

The MEMS timing oscillator is more accurate and robust than those based on quartz crystals
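As a quick back-of-the-envelope check on the figures quoted in the MEMS timer story, a stability rating of ±30 ppm bounds the worst-case frequency offset at 30 parts per million of the nominal frequency:

```python
# Worst-case frequency offset implied by a ppm (parts-per-million) stability
# rating, applied to the two ends of the oscillator's 1 MHz - 920 MHz range.
def max_freq_error_hz(nominal_hz, stability_ppm):
    return nominal_hz * stability_ppm / 1e6

print(max_freq_error_hz(920e6, 30))  # 27600.0 -> up to 27.6 kHz off at 920 MHz
print(max_freq_error_hz(1e6, 30))    # 30.0    -> up to 30 Hz off at 1 MHz
```

The same arithmetic shows why the tighter ±25 ppm parts matter at the top of the frequency range, where each ppm costs 920 Hz of potential offset.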

Uncrewed Systems Technology diary

Geo Week – Monday 13 February to Wednesday 15 February, Denver, CO, USA – www.geo-week.com
Oceanology International Americas – Tuesday 14 February to Thursday 16 February, San Diego, USA – www.oceanologyinternationalamericas.com
IDEX – Monday 20 February to Friday 24 February, Abu Dhabi, United Arab Emirates – www.idexuae.ae
Unmanned & Autonomous Systems Summit – Wednesday 8 March to Thursday 9 March, Maryland, USA – www.unmannedsystems.dsigroup.org
GEO Connect/Drones Asia – Wednesday 15 March to Thursday 16 March, Marina Bay Sands, Singapore – www.dronesasia.com
DSEI Japan – Wednesday 15 March to Friday 17 March, Tokyo, Japan – www.dsei-japan.com
Amsterdam Drone Week – Tuesday 21 March to Thursday 23 March, Amsterdam, The Netherlands – www.amsterdamdroneweek.com
Military Robotics and Autonomous Systems Conference – Monday 17 April to Tuesday 18 April, London, UK – www.smgconferences.com/defence/uk/conference/robotic-autonomous-systems
Ocean Business – Tuesday 18 April to Thursday 20 April, Southampton, UK – www.oceanbusiness.com
Rotorcraft/Unmanned Systems Asia – Wednesday 3 May to Friday 5 May, Singapore – www.rca-umsa.com
Xponential 2023 – Monday 8 May to Thursday 11 May, Denver, USA – www.xponential.org
Undersea Defence Technology – Tuesday 9 May to Thursday 11 May, Rostock, Germany – www.udt-global.com
Uncrewed Maritime Systems Technology – Wednesday 10 May to Thursday 11 May, London, UK – www.smgconferences.com/defence/uk/conference/Unmanned-Maritime-Systems
Paris Airshow – Monday 19 June to Sunday 25 June, Paris, France – www.siae.fr/en
Autonomous Ship Expo Conference – Tuesday 20 June to Thursday 22 June, Amsterdam, The Netherlands – www.autonomousshipexpo.com
MOVE – Wednesday 21 June to Thursday 22 June, London, UK – www.terrapinn.com/exhibition/move
Japan Drone – Monday 26 June to Wednesday 28 June, Chiba, Japan – www.ssl.japan-drone.com
Commercial UAV Expo Americas – Tuesday 5 September to Thursday 7 September, Las Vegas, USA – www.expouav.com
DSEI – Tuesday 12 September to Friday 15 September, London, UK – www.dsei.co.uk
DroneX – Tuesday 26 September to Wednesday 27 September, London, UK – www.dronexpo.co.uk
UAV Show – Thursday 19 October to Friday 20 October, Bordeaux, France – www.uavshow.com
Egypt Defence Expo – Monday 4 December to Thursday 7 December, New Cairo, Egypt – www.egyptdefenceexpo.com

2,163 jobs and counting… Now live: the global hub for uncrewed and autonomous systems engineering vacancies

Ben Kinnaman, founder and CEO of Greensea Systems and originator of the Open Software and Equipment Architecture (OpenSea) robotics platform, came up through the hands-on, practical side of the marine industry. Now in his late 40s, he left school as soon as he could and went to work offshore as a salvage diver before moving into ROVs as a technician and then a supervisor, returning to education with a focus on robotics and control systems along the way.

While working with ROVs, it became obvious to him that the industry had evolved in a way that made the integration of new systems and technologies slow and difficult. “Our industry came up very pragmatically. We had technicians developing and building machines to work deeper, work safer and accomplish more, but in doing so we ended up in a lot of silos: everyone was doing things a bit differently. Other robotics industries grew out of academia and government-funded initiatives, and they brought standards forward early.

“I saw an opportunity to create a platform that would allow our industry to advance at the rate of other industries,” he says.

Breaking down the silos

The purpose of the resulting open architecture platform was to break down those silos and make it easy for the industry to transition to new technologies, share ideas and integrate new sensors, payloads and packages. These days, OpenSea is deployed on about 2500 underwater vehicles and related systems, both crewed and uncrewed.

“From a technical perspective, the challenge is really around how to build a platform that is flexible and scalable, that people can use to advance robotics but that stops short of over-specifying the end result. So it is really about

The head of Greensea Systems explains the technologies behind his company’s open offshore robotics platform.
Open waters – Peter Donaldson reports

The OpenSea robotics platform runs around 2500 UUVs and related systems, including this autonomous underwater crawler from Greensea company Bayonet Ocean Vehicles (Images courtesy of Greensea Systems)

architecture: how do we build a bucket of Legos without predefining what they are supposed to create?”

Kinnaman founded Greensea in 2006, and recalls that building a business around the concept of an open architecture platform before the industry was willing to adopt one was difficult. “Open architecture is now quite a buzzword; everyone says it with their breakfast in the morning, but 16 years ago, nobody was talking about open architectures.”

Building blocks

In explaining the platform’s openness, he returns to the Lego analogy. “Lego is a little building block with a standard interface, and if you have an interface that matches that block, you can plug other blocks into it,” he says. “OpenSea is made up of hundreds of little ‘blocks’ with a public interface that people can build upon.”

Greensea provides application programming interfaces (APIs) and interface control documents (ICDs) to enable customers and partners to develop software that works with OpenSea, and generally shares its knowledge of how to build robust underwater robotic systems.

In the software stack on a vehicle’s main computer, OpenSea usually sits directly on top of the operating system. In most systems on which it is deployed, that OS is a Linux distribution, a choice that emerged from user preference, as many operators and manufacturers like it for its technical performance.

“These days, with edge processors running OpenSea in autonomous robots or sensor sets, most of the OS backbone originates out of an Ubuntu Linux distribution. However, OpenSea is completely cross-platform. We have built packages for OpenSea that support systems from Windows to Red Hat and everything in between.”

Greensea supports current and legacy systems, running a continuous process in-house in which the software engineering team is constantly porting OpenSea onto new platforms and testing it. “Every night we are testing and validating OpenSea across six or seven different platforms,” Kinnaman says.
OpenSea provides common wrappers around OS processes. The wrappers serve as an abstraction layer between the OS and the particular implementation of OpenSea, so that common calls and loggers can run over any platform.

Pragmatism and architecture

On top of that layer sits the OpenSea Library, an application suite that Kinnaman describes as a mixture of pragmatism and intentional architecture. “From a pragmatic perspective, the applications represent our 16 years of writing software for ocean robotics, and include everything from device drivers to utility functions to translation applications to the tools and processes that developers and users need.”

On the architecture side, OpenSea addresses some critical technology, particularly control, navigation and autonomy, through what Kinnaman calls its big five applications. The first of these is OpenINS, an inertial navigation system engine, which processes and fuses sensor data to produce a cohesive state estimate (navigation solution) for a robot platform. The second is OpenCMD, a vehicle platform and control package that provides full six-degrees-of-freedom, multi-state, open- and closed-loop control of a vehicle.

Ben Kinnaman | In conversation

Kinnaman with a diver propulsion device run by the OpenSea platform. Some of these devices have automated modes, enabling them to be summoned from the seabed

The third is Open Manager, known as OpenMNGR, which Kinnaman describes as an autonomy executive, an application that provides a link between autonomous decision-making software and OpenSea. OpenFLS is the fourth, a sonar processing application that supports almost all the forward-looking imaging sonar systems currently on the market, he says. In addition to data processing, OpenFLS also provides technologies including feature detection, SLAM routines for stabilising a navigation system, target tracking and closed-loop control, plus closed-loop feature-relative positioning of sonar data. Finally, OpenBT is a behaviour tree-derived engine for autonomous systems.

Interfaces galore

Naturally, OpenSea is also designed to run third-party applications, and to do so it provides a wide range of interfaces for them. “We have the ‘down and in’ interfaces that OpenSea needs to sit on top of computing platforms, then we have the ‘up and out’ interfaces that other applications use to work with OpenSea,” Kinnaman explains.

“With the lower-level interfaces, OpenSea supports almost every industry interface, from CAN and CANopen to Profibus and Modbus to serial TCP/IP and more. We also have the common industry standard interfaces. Natively, OpenSea uses LCM, and will soon use DDS, as the inner process interfaces.

“We publish all those interfaces, and provide build and software utilities to help people connect to them.”

The company also supports other open architecture platforms, such as the Robot Operating System (ROS), providing ROS bridges and translation layers. “It is not uncommon at all, especially in the academic community, to have a pretty big ROS stack interfacing with an OpenSea platform, although it is less common in defence and commercial applications,” he says.

Greensea also provides support for several proprietary third-party control and autonomy packages, as well as a handful of other open architecture systems.
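The bridge-and-translation-layer idea is straightforward to sketch. The following is a generic, hypothetical illustration, not Greensea's actual ROS bridge: two tiny in-process buses stand in for an OpenSea-native bus (LCM-style) and a ROS-style topic tree, with a translation function converting messages between them. All topic names and message shapes are invented.

```python
# Hypothetical sketch of a pub/sub translation layer, in the spirit of the
# ROS bridges mentioned above. Illustrative only; not Greensea's API.

class Bus:
    """Tiny in-process pub/sub bus standing in for LCM, DDS or ROS topics."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        for cb in self.subscribers.get(topic, []):
            cb(msg)

def bridge(src_bus, src_topic, dst_bus, dst_topic, translate):
    """Forward every message from src to dst, converting its format en route."""
    src_bus.subscribe(src_topic, lambda m: dst_bus.publish(dst_topic, translate(m)))

# Example: one side reports depth in feet, the other expects metres
native, ros = Bus(), Bus()
bridge(native, "nav/depth_ft", ros, "/nav/depth_m", lambda ft: ft * 0.3048)

received = []
ros.subscribe("/nav/depth_m", received.append)
native.publish("nav/depth_ft", 100.0)   # arrives on /nav/depth_m as ~30.48 m
```

The translation function is the whole of the coupling between the two ecosystems, which is why publishing the interfaces (as Greensea does with its APIs and ICDs) is enough for third parties to build such bridges themselves.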
For customers and partners who want to develop their own applications to run on OpenSea, the company provides a software developer kit (SDK), which Kinnaman says contains “all the keys to the kingdom”. In the SDK are the APIs and ICDs, the latter including all the interfaces and examples for interfacing with the OpenSea suite of applications. Greensea also makes bespoke kits for many of its robot manufacturer partners.

The ‘easy’ button

One option, aimed at customers who want a ready-made computing environment on their vehicles into which they can integrate applications, is the OpenSea Hub. This is an embedded processor with an associated high-density I/O module, usually implemented through an FPGA chip. Kinnaman describes the Hub as “the easy button” – a stable, proven hardware platform that comes with OpenSea installed, along with essentially everything needed to plug it in and start developing a robot. “It shortcuts the need for porting OpenSea onto a new hardware platform, or a new OS, or creating a new build environment,” he says. “It is a platform that is known and understood, and that helps manage risk and cost.”

Work with partners on their projects can start at almost any stage in the process, he emphasises. “Sometimes we start very early and are literally there at the table when they start drawing the concepts on the whiteboard. At that point it is easy to provide the Hub as the ‘brain’.

“Sometimes the conversation begins much later, maybe even after years of operating a mature product that they now want to advance with new technologies but whose current architecture inhibits that. In that case, we might replace the processor on the vehicle with the Hub, or port OpenSea onto their existing hardware.”

Kinnaman also notes that Greensea is releasing a new product in late 2022 called OpenSea Edge. This is a vehicle-agnostic autonomy system for ROVs supporting seafloor-to-over-the-horizon comms, and video and sonar perception systems.
It also supports third-party AI/ML libraries. OpenSea Edge is a strapdown solution for untethered autonomy on subsea robotics. Greensea began deploying it in several defence applications in early 2022.

Phased development

Greensea characterises the development it undertakes with partners as a process divided into four phases, numbered 0 to 3. Phase 0 concentrates on fundamental requirements, and involves discussions about how OpenSea can be of most value; it culminates in a technical roadmap. Phase 1 is about initial development,

The ability to run many and diverse vehicles, such as this hull service robot from Armach Robotics (also part of the Greensea group), comes from the platform’s open architecture and multiple interfaces
