Read all back issues online www.ust-media.com UST 49 : APRIL/MAY 2023 UK £15, USA $30, EUROPE €22 Cheaper by the dozen The impact of image sensor advances on uncrewed system designs We need more data Sonar technology makes fresh headway Level headed How the WAM-V’s self-adjustment system keeps it on an even keel
3 April/May 2023 | Contents Uncrewed Systems Technology | April/May 2023
04 Intro Image sensing systems are becoming smaller and less expensive, opening up new applications for UAVs as well as UGVs
06 Platformone: Mission-critical info Intel and Daedalean unveil the first certifiable avionics design using machine learning, MIT develops a simple and scalable way to build AUVs, UAVs automate rice plant counting, and much more
20 In conversation: Noel Heiks The chairman of Censys Technologies explains how she sees AI playing a vital role in high-resolution inspections by UAVs
24 Dossier: Marine Advanced Robotics WAM-V This auto-levelling USV is proving to be a go-to solution for users who want to survey large areas quickly, even in rough seas. We detail its development and the surprising inspiration behind it
38 Focus: Image sensors Sensor prices are falling and they’re getting smaller, setting the stage for a raft of new design options for uncrewed vehicles
48 Digest: Ottonomy Ottobot How Covid-19, social distancing and labour shortages created the conditions for producing this last-mile delivery UGV
58 Digest: Eurolink Systems Beluga Taking its design cue from the bulbous-headed cetacean has led to this quadrotor that’s intended for multiple applications
66 Insight: UGVs If a job’s considered too dangerous or too dirty to be carried out by people, there’s now more than likely to be a UGV that has been designed specifically to do it instead
76 Show report: IDEX 2023 Our round-up of products showcased at the latest outing for this biennial exhibition for the armed services
82 Dossier: Rotron RT600-HC We report on the development of this popular twin-rotor Wankel gasoline engine developed for uncrewed helicopters
92 In operation: Cleo Robotics Dronut X1 Details of the design choices behind this duct-shaped UAV for inspecting hazardous and GNSS-denied indoor spaces
98 Focus: Sonar systems USV and UUV operators are demanding more from sonar technology, leading its developers to continue advancing its capabilities, as we explain here
108 In operation: DroneWorks solar panel inspection How this UAV removes the need to use people to inspect solar arrays, and how the data it generates can save a lot of money
114 PS: The human body as a UAV platform A look at research that shows we don’t mind UAVs landing on us, but it depends on where and what we’re doing at the time
4 April/May 2023 | Uncrewed Systems Technology Intro | April/May 2023
The battle between autonomous systems on the ground and in the air is heating up, at least for making deliveries. One of the latest developments for small delivery UGVs is detailed in our report on page 48 on the Ottonomy Ottobot, which marks a significant move for the commercial roll-out of these delivery systems. At the same time, grape growers in France are turning to ground-based autonomous spraying systems rather than using airborne vehicles, as we report on page 6. More broadly, the latest developments for UGVs are detailed on page 66. Image sensing systems are vital for these smaller systems, being made in higher volumes with lower costs, and the latest spiking AI technology for such sensors is detailed on page 6. The latest Lidar chips detailed on page 38 are also providing lower-power sensing for ground as well as airborne systems. These developments enable much lower-power operation, which also provides opportunities for new UAV applications such as counting rice plants (see page 6) where ground-based systems can’t operate.
Nick Flaherty | Technology Editor When less is more
Editorial Director Ian Bamsey Deputy Editor Rory Jackson Technology Editor Nick Flaherty Production Editor Guy Richards Contributor Peter Donaldson Technical Consultants Paul Weighell Ian Williams-Wynn Dr Donough Wilson Prof James Scanlan Design Andrew Metcalfe andrew@highpowermedia.com UST Ad Sales Please direct all enquiries to Freya Williams freya@ust-media.com Subscriptions Frankie Robins frankie@ust-media.com Publishing Director Simon Moss simon@ust-media.com General Manager Chris Perry
The USE network Having now provided several enterprises around the world with the support and connections they need to implement efficient and sustainable technological solutions, we’re keen to continue expanding this free service. If the uncrewed vehicle and/or system you’re working on could benefit from some independent advice, from engineers specialising in the appropriate field, then please do get in touch. Email your question/challenge/dilemma/predicament to thenetwork@uncrewedsystemsengineering.com or visit www.uncrewedsystemsengineering.com and raise a case with us. All questions will be treated in the strictest confidence, and there’s no obligation whatsoever to follow any recommendations made.
Volume Nine | Issue Three April/May 2023 High Power Media Limited Whitfield House, Cheddar Road, Wedmore, Somerset, BS28 4EJ, England Tel: +44 (0)1934 713957 www.highpowermedia.com ISSN 2753-6513 Printed in Great Britain © High Power Media All rights reserved. Reproduction (in whole or in part) of any article or illustration without the written permission of the publisher is strictly prohibited.
While care is taken to ensure the accuracy of information herein, the publisher can accept no liability for errors or omissions. Nor can responsibility be accepted for the content of any advertisement. SUBSCRIPTIONS Subscriptions are available from High Power Media at the address above or directly from our website. Overseas copies are sent via air mail. 1 year subscription – 15% discount: UK – £75; Europe – £90 USA – £93.75; ROW – £97.50 2 year subscription – 25% discount: UK – £135; Europe – £162 USA – £168.75; ROW – £175.50 Make cheques payable to High Power Media. Visa, Mastercard, Amex and UK Maestro accepted. Quote card number and expiry date (also issue/start date for Maestro) ALSO FROM HPM
6 April/May 2023 | Uncrewed Systems Technology Mission-critical info for UST professionals Platformone Intel and Daedalean have developed the first multi-core reference design for certifiable avionics using machine learning, or ML (writes Nick Flaherty). The design provides vision-based situational awareness using neural networks with high-resolution, high-throughput camera inputs based on Intel’s 11th Gen Core i7 and Agilex F-Series FPGAs. On top of the SWaP constraints, autonomous aircraft developers face two other challenging circumstances when incorporating ML and AI into avionics systems. First, ML and neural network applications have increasingly high computational requirements. Second, no ML application has yet been certified by aviation regulators. The problem is that processor manufacturers typically withhold details about how a multi-core processor’s shared cache works, such as how cache lines are flushed, despite the common understanding that cache operation has a major impact on the determinism of how applications run. Intel has already introduced the Airworthiness Evidence Package (AEP), which provides manufacturers with processor artefacts and the analysis and mitigation of non-deterministic and unintended behaviour to support DO-254 certification up to design assurance level (DAL) A for aircraft. Now Daedalean has designed a system for a sensor computer using the AEP and Intel processors for use in AI/ML aviation applications, with the available documentation to support certification. This is the first real-world working example to provide guidance on how to approach these challenges in general: how to ensure an ML-based system can Airborne vehicles Avionics design uses ML Daedalean’s software provides certifiable machine learning in safety-critical aerospace applications
7 Platformone Uncrewed Systems Technology | April/May 2023 meet the computational requirements, certification requirements and SWaP limitations at the same time. Without understanding the behaviour of multi-core processors, system developers can struggle to guarantee mitigation of all potential failure conditions. This is currently solved by disabling three cores in a four-core system for certification, because the interaction between the cores cannot be sufficiently managed for the certifying authorities. Daedalean’s AI-enhanced situational awareness software provides certifiable ML in safety-critical aerospace applications with a full multi-core system. Careful partitioning of the hardware allows all the software components to be tested independently without interference. While partitioning all resources is not necessary for certification up to DAL-C, Daedalean chooses to conduct partitioning for two reasons – for the potential to upgrade to DAL-B in the future and to simplify the testing process. The Daedalean system runs on Vyper, a lightweight hypervisor developed in-house. For each software component, it determines what it is allowed to execute and when. Vyper’s minimal partitioning hypervisor has a small code base, typically 1500 lines, making it easier to certify, and runs multiple isolated partitions with each partition statically assigned to one of four physical CPU cores. Memory management is a key issue for such systems, and a two-level paging architecture provides each partition with its own isolated address space, allowing memory buffers to be safely shared between partitions. Daedalean does not need to use an RTOS in its system, because the virtual machine extensions are designed to be lightweight and self-contained without relying on third-party code. That allows the system to operate without the need for insulation from external sources.
However, developers using an RTOS or commercial hypervisor can still run Daedalean software, provided that the RTOS supports the 11th Gen Intel Core i7 processor. That allows developer applications to run on the architecture, as Daedalean reserves CPU cores for this purpose. Aviation guidance from CAST-32A and the more recent AMC 20-193 adopted in Europe recommends – and in some cases mandates – that all available resources are partitioned among applications to ensure predictable execution and prevent competition for access. In the case of a system with four cores, Vyper provides robust partitioning of CPU time, cache levels, system memory and device access by partitioning the cores into non-interfering time slices. It uses Intel’s Time Coordinated Computing settings in the processor BIOS to make execution more deterministic, coupled with VMX Virtual Machine Extensions to create the virtual machines that run the partitions. Virtualization Technology for Directed I/O isolates the PCIe devices and assigns them to partitions. Intel’s Resource Director Technology framework, particularly Cache Allocation Technology, is used to partition the shared Level 3 cache memory. This allows each partition to have its own cache and memory, and the only communication primitive used is a FIFO, which is set up at compile time. That allows the system to operate without interrupts. Vyper runs a scheduler on each processor to execute software partitions according to a compile time-defined schedule. To simplify certification, there is no dedicated processor for handling partition hypercalls or managing the overall system state. Instead, the scheduler on each processor is responsible for all partitions pinned to that processor. Software partitions can run either application logic or driver code; this distinction is transparent to Vyper.
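The scheduling model described here – a fixed, compile-time schedule per core, with partitions communicating only through pre-allocated FIFOs and no interrupts – can be sketched as follows. This is a minimal illustration, not Daedalean's code: the partition names, slice lengths and FIFO channels are hypothetical, and a real hypervisor would switch VMX contexts rather than call Python functions.

```python
from collections import deque

# Hypothetical per-core schedules, fixed at "compile time":
# each core cycles through (partition, time slice in ms) pairs.
SCHEDULES = {
    0: [("camera_driver", 5), ("vision_nn", 15)],
    1: [("vision_nn", 20)],
    2: [("tracking", 10), ("output", 10)],
    3: [("health_monitor", 20)],
}

# Statically allocated FIFOs are the only inter-partition primitive;
# bounded sizes mean no dynamic allocation at runtime.
FIFOS = {"frames": deque(maxlen=4), "detections": deque(maxlen=8)}

def run_major_frame(core_id):
    """Execute one major frame of a core's static schedule and return
    the order in which its partitions would run. There is no dedicated
    management core: each core handles only its own pinned partitions."""
    executed = []
    for partition, slice_ms in SCHEDULES[core_id]:
        executed.append((partition, slice_ms))
    return executed
```

Because every schedule is fixed before deployment, worst-case timing can be argued statically, which is what makes this style of design attractive for certification.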
The first implementations are supporting pilots in the cockpit with situational awareness, but the design can scale to fully autonomous systems that are certified as safe to carry passengers. A spiking neural network built using memtransistors can help prevent collisions between autonomous vehicles at night (writes Nick Flaherty). Almost half of fatal accidents occur at night. As vehicles become autonomous, the ways of detecting and avoiding these collisions must evolve too, but current systems are often complicated, resource-intensive or work poorly in the dark. Researchers at Penn State University have therefore developed a sensor based on the neural circuitry insects use to avoid an obstacle. Instead of processing an entire image, they processed only one variable: the intensity of a car’s headlights. Without the need for an onboard camera or image sensor, the detection and processing units were combined, making the overall detector smaller and more energy-efficient. The collision detector uses eight photosensitive memtransistors constructed from a layer of molybdenum disulphide (MoS2) organised as a neural network. Each memtransistor has multiple terminals and combines a memory resistor (memristor) element made from MoS2, which stores the neural network weight, with transistor technology. Multiple memristors can be embedded in a single transistor, enabling it to more accurately model a neuron with multiple synaptic connections. The sensor has an area of only 40 µm² and uses only a few hundred picojoules of energy – tens of thousands of times less than existing systems. In real-life, night-time scenarios, the detector could sense a potential two-car accident 2-3 seconds before it happened, giving the vehicle enough time to take corrective action. Driverless cars Neural net’s night crash assistance
8 Platformone April/May 2023 | Uncrewed Systems Technology A scalable, modular system for developing underwater autonomous robots is aiming to speed up their design process (writes Nick Flaherty). The system, developed by researchers at the Massachusetts Institute of Technology (MIT), builds deformable ‘aquabots’ using simple repeating substructures instead of unique components. The team has demonstrated the system in two example configurations, one like an eel and the other a wing-like hydrofoil. The principle allows virtually unlimited variations in form and scale. Existing approaches to soft robotics for marine applications are generally made on small scales, while many useful real-world applications require devices on scales of metres. Previous designs such as the RoboTuna used 3000 different parts and took about 2 years to design and build. The modular system consists of lattice-like pieces, called voxels, that are mostly hollow structures consisting of cast plastic pieces with narrow struts in complex shapes. The box-like shapes are load-bearing in one direction but soft in others, an unusual combination achieved by blending stiff and flexible components in different proportions. The soft elements allow the researchers to implement flow control to reduce drag and improve propulsive efficiency, resulting in substantial fuel savings. In one of the devices, the voxels are attached end to end in a long row to form a metre-long, eel-like structure. The body is made up of four segments, each consisting of five voxels, with an actuator in the centre that can pull a wire attached to each of the two voxels on either side, contracting them and causing the structure to bend. The whole device of 20 units is then covered with a rib-like supporting structure, and then a tight-fitting waterproof neoprene skin.
The researchers deployed the structure in an MIT tow tank to show its efficiency in the water, and demonstrated that it was capable of generating enough thrust to propel itself forward using undulating motions. The voxel approach allows the designs to be scaled up to larger sizes without requiring the kind of retooling and redesign that would be needed to scale up current systems. “Scalability is a strong point for us,” said MIT researcher Parra Rubio. “Treating soft versus hard robotics is a false dichotomy. This is something in between, a new way to construct things.” There have been many eel-like robots before, for example for NASA missions in space, but they are generally made from bespoke components as opposed to these simple and scalable building blocks. The other device they demonstrated has a wing-like shape, or hydrofoil, made up of an array of the same voxels. It can change its profile shape and so change the lift-to-drag ratio and other properties of the wing. Unlike the eel design, the wing is covered in an array of scale-like overlapping tiles, which are designed to press down on each other to maintain a waterproof seal even as the wing changes its curvature. One possible application might be as an addition to a ship’s hull to reduce the formation of drag-inducing eddies and thus improve the vessel’s overall efficiency. The system can also be applied to a submersible craft, using a morphable body shape to create propulsion. Underwater vehicles Simple way to make AUVs The eel-like version generated enough thrust to propel itself forward
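An undulating gait of the kind the eel-like structure demonstrated is commonly produced by driving each body segment with the same oscillation, phase-shifted along the body. The sketch below is purely illustrative and is not MIT's controller: the four segments match the article's description, but the amplitude, frequency and phase lag are assumed values.

```python
import math

NUM_SEGMENTS = 4          # each segment is five voxels with a central wire-pulling actuator
PHASE_STEP = math.pi / 2  # hypothetical phase lag between neighbouring segments

def bend_commands(t, amplitude_deg=20.0, freq_hz=1.0):
    """Bend angle (degrees) commanded to each segment at time t (seconds).
    Phase-shifting the same sine wave down the body produces a travelling
    wave, which is what generates forward thrust in the water."""
    return [amplitude_deg * math.sin(2 * math.pi * freq_hz * t - i * PHASE_STEP)
            for i in range(NUM_SEGMENTS)]
```

Because every segment is the same repeated voxel unit, scaling the robot up is, in this model, just a matter of increasing `NUM_SEGMENTS` rather than redesigning the controller.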
10 April/May 2023 | Uncrewed Systems Technology Researchers in China and Singapore have developed an automated method of counting rice plants using UAVs (writes Nick Flaherty). Rice is cultivated on nearly 162 million hectares of land worldwide. One of the most common ways to quantify its production is to count the number of rice plants to estimate yield, diagnose growth and assess losses in paddy fields. “The new technique uses UAVs to capture RGB images of a paddy field,” said Professor Jianguo Yao from Nanjing University of Posts and Telecommunications in China, who led the study. “The images are then processed using a deep learning network we have developed, called RiceNet, which can accurately identify the density of rice plants in a field, as well as provide higher-level semantic features such as crop location and size.” The RiceNet machine learning framework consists of one feature extractor, at the front end, that analyses the input images, and three feature decoder modules that are responsible for estimating the density of plants in a paddy field, the location of the plants and their size. The last two features are particularly important for future research into automated crop management techniques, such as spraying fertilisers. As part of the study, the researchers deployed a camera-equipped UAV over rice fields in the Chinese city of Nanchang, capturing images measuring 5472 x 3648 pixels. They then used some of the images as a training data set and the rest as a test data set to validate the analytical findings. Out of the 355 images with 257,793 manually labelled points, 246 were randomly selected and used as training images, while the remaining 109 were used as test images. Each image contained an average of 726 rice plants. The RiceNet technique’s signal-to-noise ratio enables it to efficiently distinguish rice plants from their background.
The results of the study showed that RiceNet’s mean absolute error and root mean square error were 8.6 and 11.2 respectively, which is comparable to the data generated using manual methods. The research threw up some key tips for effective automation of the counting process. For instance, the team does not recommend acquiring images on rainy days. It also suggests collecting UAV-based images within 4 hours after sunrise to minimise the effects of fog as well as the occurrence of rice leaf curl, both of which adversely affect the output quality. “In addition, we further validated the performance of our technique using two other popular crop data sets,” said Prof Yao. “The results showed that our method significantly outperforms other state-of-the-art techniques, which underscores the potential of RiceNet to replace the traditional manual method.” The RiceNet framework can also be used for other UAV- and deep learning-based crop analysis techniques, to help improve the production of food and cash crops worldwide. Airborne vehicles Rice counter crops up Tests showed that the RiceNet technique outperformed other counting methods The RiceNet system can accurately identify the density of rice plants in a paddy field, as well as provide features such as crop location and size
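The mean absolute error and root mean square error quoted for RiceNet compare predicted per-image plant counts against the manually labelled counts. As a minimal sketch of how those two metrics are computed – the counts below are invented for demonstration and are not from the study – the definitions are:

```python
import math

def mae(predicted, actual):
    # Mean absolute error: average size of the per-image counting error.
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)

def rmse(predicted, actual):
    # Root mean square error: penalises large per-image errors more heavily.
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

pred = [712, 731, 690, 744]    # hypothetical predicted counts
manual = [720, 726, 701, 740]  # hypothetical manually labelled counts
print(round(mae(pred, manual), 1), round(rmse(pred, manual), 1))  # → 7.0 7.5
```

Against an average of 726 plants per image, the reported MAE of 8.6 corresponds to a counting error of a little over 1%.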
Platformone Rohde & Schwarz has designed a mounting adapter for installing its VHF/UHF wireless analyser on a medium-sized UAV (writes Nick Flaherty). The R&S EVSD1000 VHF/UHF analyser is intended to work on a UAV to provide the accurate and reliable navigation required for optimising air traffic control. The EVSD1000 is a signal-level and modulation analyser that combines measurements of instrument landing systems (ILS), ground-based augmentation systems and VHF omnirange (VOR) ground stations in a single box. The mechanical and electrical design is optimised for real-time measurements of terrestrial navigation systems from a UAV with up to 100 measurement data sets per second. It provides high-precision signal analysis in the 70-410 MHz frequency range, a critical requirement for UAV-based terrestrial navigation signal measurement systems. It also includes the necessary measurement repeatability to ensure that results from UAV measurements can be compared to flight and ground inspections, in line with the standards set by the International Civil Aviation Organisation. The analyser weighs 1.5 kg, and is designed to carry out measurements on ILS/VOR systems around an airport. It enables runway blocking times to be reduced compared to conventional manual testing methods that use a mast antenna, while providing the necessary measurement repeatability, measurement precision, and GNSS time and location stamps. Airborne vehicles Air traffic optimiser The R&S analyser is designed to be used on medium-sized UAVs to reduce runway blocking times
12 Platformone April/May 2023 | Uncrewed Systems Technology Researchers at the University of Illinois Urbana-Champaign have developed a technique that will help determine the lifetime of electric propulsion systems for autonomous space vehicles (writes Nick Flaherty). The team used data from low-pressure chamber experiments and large-scale computations to develop a model on a supercomputer to better understand the effects of ion erosion on carbon surfaces as the first step in predicting its failure. Electric space propulsion systems use energised atoms to generate thrust. These high-speed beams of ions bump against the graphite surfaces of the thruster, eroding them a little with each hit, and are its primary lifetime-limiting factor. When ion thrusters are tested on the ground in an enclosed chamber, the ricocheting particles of carbon from the graphite chamber walls can also be deposited back onto the thruster surfaces. This changes the thruster’s measured performance characteristics. “We need an accurate assessment of the ion erosion rate on graphite to predict thruster life, but testing facilities have reported varying sputtering [ion erosion] rates, leading to large uncertainties in predictions,” said Huy Tran, a PhD student in the Department of Aerospace Engineering at UIUC, who worked on the project. The research is part of NASA’s Joint Advanced Propulsion Institute, which includes researchers at nine universities, including UIUC. The simulations were performed using the Delta supercomputer at Illinois. A particular difficulty is replicating the environment of space in a laboratory chamber, because it is hard to build a sufficiently large chamber to avoid ion-surface interactions at the chamber walls. Although graphite is typically used for the accelerator grid and pole covers in the thruster, there isn’t agreement on which type of graphite is the most resistant to erosion.
“The fundamental problem with testing an ion thruster in a chamber is that the thruster is continuously spitting out xenon ions that also impact with the chamber walls, which are made from graphite panels, but there are no chamber walls in space,” said Tran. “When these xenon ions hit the graphite panels, they also sputter out carbon atoms that are redeposited on the accelerator grids. So instead of the grid becoming thinner and thinner because of thruster erosion, some people have seen in experiments that the grids actually get thicker with time because the carbon is coming back from the chamber walls.” The simulations resolved the limitations and uncertainties in the experimental data. “Whether it is pyrolytic graphite on the gridded ion optics, isotropic graphite on the pole covers, or poco graphite or anisotropic graphite on the chamber walls, our molecular dynamics simulations show that the sputtering rates and mechanisms are identical across all these different referenced structures,” said Huck Beng Chew, associate professor at UIUC and Tran’s supervisor. The sputtering process creates a unique carbon structure during the bombardment process. “When the ions damage the surface, they are transformed into an amorphous-like structure regardless of the initial carbon structure,” said Prof Chew. “You end up with a sputtered surface with the same unique structural characteristics. That is one of the main findings we have observed from our simulations. “The model we developed bridges the molecular dynamics simulation results and the experimental data. The next thing we want to look at is the evolving surface morphology over time as you put more and more xenon ions into the system. This is relevant to ion thrusters for deep space exploration.” Space vehicles Thruster lifespan tester High-speed ion erosion of a thruster’s graphite surfaces are its primary life-limiting factor
AB Dynamics (ABD) has developed a pedestrian modelling system to improve the effectiveness of sensors in autonomous vehicles (writes Nick Flaherty). ABD worked with Dynamic Research on the ‘Soft Pedestrian 360’ system to support tests for improved sensor perception and categorisation. The system is a real-world model with articulation of the knees, hips, shoulders and neck to more accurately reflect a pedestrian in a sensor output. Actively articulated knees enable the hips and knees to move independently of each other so that the gait of the model can be controlled in different ways with a more varied range of movement. This is critical for vehicle sensor systems to ensure the correct categorisation of a moving pedestrian. The gait of the model is automatically synchronised with the position, speed and acceleration of the target relative to a starting point using the platform’s IMU. This prevents a phenomenon the developers call ‘Flintstoning’, where the foot in the stance portion of the gait is not stationary relative to the ground, and results in better characterisation. The limbs, head and mounting pole are attached to the torso via foam blocks that engage corresponding sockets in the torso, such that when these components separate on impact with a test vehicle there are no exposed hard points. The servos operating the limbs and head of the pedestrian are also completely encased in foam and sit within each component. They use a newly designed slipper clutch to prevent them from being back-driven during a collision. All these factors mean the vehicle under test is protected when the limbs are disconnected upon impact. Driverless cars Real-world test model The pedestrian model has articulated joints for controlling the model’s gait
14 Platformone April/May 2023 | Uncrewed Systems Technology Researchers in Japan have developed a new type of laser diode that can boost the performance of solid-state Lidar sensors (writes Nick Flaherty). Photonic-crystal surface-emitting lasers (PCSELs) are built from two different types of materials with a large refractive index contrast, such as air and semiconductors. PCSELs are different from 2D distributed feedback (DFB) lasers. DFB lasers have periodic structures with a smaller refractive index contrast, and so can only couple the primary, or fundamental, lasing mode to produce the laser light. This same issue has an impact on the vertical-cavity surface-emitting laser diodes commonly used for Lidar systems nowadays. That means the brightness cannot be increased even by increasing the size of the device, as multiple lateral-mode oscillations occur. In PCSELs, however, the lateral-mode oscillations can be kept very small, even as the size increases. That allows the brightness to be increased to up to 10 GW/cm²/sr (gigawatts per square centimetre per steradian), which is comparable to high-performance fibre lasers. PCSELs also have a very narrow divergence and a symmetric beam with very narrow lasing spectra in a single mode, and their temperature dependence is much smaller than that of conventional broad-area Fabry-Perot (FP) semiconductor lasers. Together, these features allow for lens-free and adjustment-free operation in Lidar sensors with a higher signal-to-noise ratio thanks to the higher brightness. This can be used to provide longer range and more accuracy, or smaller sensors. The researchers have developed a double-lattice structure where the light waves diffracted by the individual lattices have an optical path difference of a half-wavelength. That creates destructive interference at the edges of the lattice, which gives a higher beam quality and higher brightness.
When a PCSEL is mounted upside down on a package, the output beam is emitted from the substrate side, with 10 W of output power and a very narrow beam divergence of 0.1°. The PCSEL can also operate reliably across a wide temperature range, of -40 to 100 °C, and the temperature dependence of output power at a fixed current injection is -0.36%/°C on average, which is better than an FP laser. The temperature dependency of the lasing wavelength is as small as 0.08 nm/°C, which is superior to FP lasers. By further expanding the concept of the double-lattice photonic crystal, the researchers have created a PCSEL with a large lasing diameter, of up to 10 mm, with a power output of 100 W to 1 kW. They have also developed a beam-scanning approach for compact solid-state Lidars. This ‘dual modulated’ PCSEL simultaneously modulates the lattice point sizes and positions to direct a beam. They have built a chip for electrical 2D beam scanning based on dual-modulated photonic crystals by integrating 10 x 10 different dual-modulated PCSELs in a 2D matrix, where individual PCSELs can be driven independently. The researchers have now produced a solid-state Lidar using the PCSEL devices. It uses a dually modulated photonic-crystal laser (DM-PCSEL) as its light source, with electronically controlled beam scanning, and flash illumination to acquire a full 3D image with a single flash of light using a time-of-flight (ToF) sensor. This provides both a flash source that can illuminate a 30 x 30° field of view and a beam-scanning source that provides spot illumination with 100 narrow laser beams. “Our DM-PCSEL system lets us range highly reflective and poorly reflective objects simultaneously, which is difficult for other flash Lidar system designs,” said researcher Susumu Noda, from Kyoto University.
"The lasers, ToF camera and all the associated components required to operate the system were assembled in a compact manner, resulting in a total system footprint that is smaller than a business card."
This allows the designers to achieve both flash and scanning illumination without any moving parts or bulky external optical elements such as lenses and diffractive optical elements.
Sensors: Solid-state Lidar boost
The PCSELs allow for lens-free and adjustment-free operation in Lidar sensors
All the components together have a footprint smaller than a business card
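The single-flash ranging performed by the ToF sensor rests on one relationship: the emitted pulse covers the sensor-to-target distance twice, so range is half the round-trip time multiplied by the speed of light. A minimal sketch of that arithmetic (the function name and the 100 ns example are illustrative, not figures from the Kyoto work):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Return target range in metres from a time-of-flight round trip.

    The pulse travels out and back, hence the division by two.
    """
    return C * round_trip_s / 2.0

# A 100 ns round trip corresponds to a target roughly 15 m away.
range_m = tof_range_m(100e-9)
```

A real flash Lidar applies this per pixel of the ToF camera, and per beam when the 100 scanning spots are used; the sketch captures only the timing arithmetic.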
CubePilot ecosystem
CubePilot is proud to offer a reliable solution for FAA remote ID compliance
Herelink | Here4 | Cube ID CAN | Cube ID Serial | CubePilot | DroneShare
Copyright © 2023 CubePilot Australia. All Rights Reserved.
Driverless cars: White light for traffic?
Researchers in the US are proposing adding a fourth light to traffic signals for driverless cars (writes Nick Flaherty).
The idea, from transportation engineers at North Carolina State University, is that a 'white light' would enable autonomous vehicles to help control traffic flow, and let human drivers know what's going on. In computational simulations, the new approach significantly improved travel times through intersections and reduced fuel consumption.
Red lights will still mean stop, green will still mean go and yellow will still warn that the light is about to turn red, while white lights will simply tell human drivers to follow the car in front of them.
The idea exploits the fact that driverless cars communicate with each other and with the computer controlling the traffic signal. When enough of them are approaching the intersection, the controller would activate the white light to coordinate them through it. When too many of the approaching vehicles are being controlled by human drivers, the lights would revert to the conventional green-yellow-red signal pattern.
The system could be used first with autonomous trucks. Trucking has higher rates of autonomous vehicle adoption, so there could be an opportunity to implement a pilot project that could benefit port traffic, say, and commercial transportation.
Researchers say adding a white option to traffic lights would improve travel time and cut fuel consumption

Ground vehicles: Through the grapevine
An autonomous system for spraying grapevines is rolling out in the champagne and wine regions of France (writes Nick Flaherty).
The YV01 smart vineyard robot was developed by Yanmar with wine makers and the CIVC, the organisation that manages the production, distribution and promotion of champagne.
The system is powered by a Honda IGX800 air-cooled, two-cylinder, four-stroke 25.3 bhp petrol engine. It can navigate vineyard slopes with an incline of up to 45% and through the narrow vine alleys, even in muddy and wet conditions, at 4 kph. It weighs 1 ton, around a third of the weight of tractor-based spraying systems, and is designed to be easily transported on a small truck or trailer. Its tank holds up to 200 litres of spray.
The robot uses electrostatic spraying technology to ensure that the aerosol droplets are accurately applied to the vines, both visible and hidden, minimising the environmental impact of the spraying. Navigation is via RTK positioning with GNSS satellite signals, which allows the spray nozzles to maintain a consistent distance from the target vines.
An operator can monitor the YV01 via a simple remote control, safely out of range of the spray and with no risk to the operator if the robot overturns.
The robot was developed with an eye to the changing vineyard business landscape, and will address current and future labour shortages in Europe while helping to manage compliance with ever-stricter environmental requirements.
"The YV01 will ease workloads, reduce costs and increase productivity as well as workplace safety," said Jean-Benoit Bourlon of Yanmar Vineyard Solutions, a Yanmar Group company in France dedicated to the sector. "The YV01 is also much quieter than traditional spraying machines – something that is appreciated by the operators as well as their neighbours."
Yanmar's YV01 can navigate vineyard slopes of up to 45%
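The hand-off between the white phase and the conventional cycle described in the traffic-light story amounts to a threshold test on the mix of approaching vehicles. A minimal sketch of that decision logic; the 60% cut-off and the function name are illustrative assumptions, as the researchers do not quote a specific threshold:

```python
def signal_phase(av_count: int, human_count: int,
                 av_share_threshold: float = 0.6) -> str:
    """Choose between the 'white' AV-coordination phase and the
    conventional green-yellow-red cycle.

    The 0.6 threshold is a placeholder: the NC State proposal only says
    the white light activates when 'enough' approaching vehicles are
    autonomous.
    """
    total = av_count + human_count
    if total == 0:
        return "standard"  # empty approach: nothing to coordinate
    av_share = av_count / total
    return "white" if av_share >= av_share_threshold else "standard"
```

In the proposed system this check would run continuously at the intersection controller, which already receives position reports from the connected autonomous vehicles.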
An optical coating that combines anti-fogging and anti-reflective properties to boost the performance of Lidar systems and image sensors has been unveiled (writes Nick Flaherty).
Researchers in Germany, at the Fraunhofer Institute for Applied Optics and Precision Engineering and the Friedrich Schiller University Jena, say the coating prevents fogging while using porous silicon dioxide nanostructures to reduce reflections. It was designed for Lidar systems but can be tailored to other optical systems.
The coating was developed in response to a need identified by Leica Geosystems, which develops airborne Lidar measurement systems for terrain and city mapping. When there are extreme temperature differences between the environment and the measuring system, fogging sometimes occurs on the optical surfaces.
An anti-fog polymer layer prevents fogging on an optical surface by acting as a water reservoir. However, differences between the refractive indices of the polymer material and the surrounding air lead to unwanted reflections and ghost light. To prevent the reflections, the researchers used very small structures, up to 320 nm high, to create an anti-reflective effect together with water permeability.
To make the coating, the researchers used an AR-plas2 plasma-ion-assisted coating machine developed at the Fraunhofer Institute for Applied Optics and Precision Engineering. It allows several nanostructures to be created on top of each other: a nanostructure is etched into the anti-fog coating, and a second nanostructure is then fabricated on top. That allows the refractive indices of the nanostructures to be adjusted to achieve very low reflections over the wide spectral range required by a Lidar sensor.
Samples manufactured with this coating technology have already been used successfully for a year in several airborne Lidar prototypes operating in various climatic conditions around the world.
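The reflections the nanostructures suppress arise from the refractive-index step at each interface. At normal incidence, the Fresnel equation gives the reflected fraction of light; a short sketch of why grading the index helps (the index values are typical assumptions, not figures from the Fraunhofer work):

```python
def reflectance(n1: float, n2: float) -> float:
    """Normal-incidence Fresnel reflectance at an interface between
    media with refractive indices n1 and n2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# A single air-to-polymer step (n ~ 1.0 to ~ 1.5) reflects about 4% of
# the light. A porous nanostructure with an intermediate effective
# index breaks that one large step into smaller ones, each of which
# reflects far less.
single_step = reflectance(1.0, 1.5)
graded_step = reflectance(1.0, 1.2)  # first step of a graded stack
```

Stacking several such structures, as the AR-plas2 process allows, lets the effective index ramp gradually from air to substrate, which is what keeps the reflectance low across a wide spectral band.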
Because the structures are generated in a standard plasma system, the new approach can be easily incorporated into commercial manufacturing processes.
Sensors: Fog-proof coating

Dr Donough Wilson
Dr Wilson is innovation lead at aviation, defence and homeland security innovation consultants VIVID/futureVision. His defence innovations include the cockpit vision system that protects military aircrew from asymmetric high-energy laser attack. He was the first to propose the automatic tracking and satellite download of airliner black box and cockpit voice recorder data in the event of an airliner's unplanned excursion from its assigned flight level or track. For his 'outstanding and practical contribution to the safer operation of aircraft' he was awarded The Sir James Martin Award 2018/19 by the Honourable Company of Air Pilots.

Paul Weighell
Paul has been involved with electronics, computer design and programming since 1966. He has worked in the real-time and failsafe data acquisition and automation industry using mainframe, mini, micro and cloud-based hardware on applications as diverse as defence, Siberian gas pipeline control, UK nuclear power, robotics, the Thames Barrier, Formula One and automated financial trading systems.

Ian Williams-Wynn
Ian has been involved with uncrewed and autonomous systems for more than 20 years. He started his career in the military, working with early prototype uncrewed systems and exploiting imagery from a range of systems from global suppliers. He has also been involved in ground-breaking research, including novel power and propulsion systems, sensor technologies, communications, avionics and physical platforms. His experience covers a broad spectrum of domains, from space, air, maritime and ground, in both defence and civil applications including, more recently, connected autonomous cars.
Professor James Scanlan
Professor Scanlan is the director of the Strategic Research Centre in Autonomous Systems at the University of Southampton, in the UK. He also co-directs the Rolls-Royce University Technical Centre in design at Southampton. He has an interest in design research, and in particular how complex systems (especially aerospace systems) can be optimised. More recently, he established a group at Southampton that undertakes research into uncrewed aircraft systems. He produced the world's first 'printed aircraft', the SULSA, which was flown by the Royal Navy in the Antarctic in 2016. He also led the team that developed the ULTRA platform, the largest UK commercial UAV, which has flown BVLOS extensively in the UK. He is a qualified full-size aircraft pilot and also has UAV flight qualifications.

Uncrewed Systems Technology's consultants

The coating was designed for Lidar sensors but is suitable for a variety of optical systems
Uncrewed Systems Technology diary

Military Robotics & Autonomous Systems
Monday 17 April – Tuesday 18 April, London, UK
www.smgconferences.com/defence/uk/conference/robotic-autonomous-systems

Ocean Business
Tuesday 18 April – Thursday 20 April, Southampton, UK
www.oceanbusiness.com

C2ISR Global
Tuesday 20 April – Thursday 22 April, London, UK
www.defenceiq.com/events-c2isrweek

Rotorcraft/Unmanned Systems Asia
Wednesday 3 May – Friday 5 May, Singapore
www.rca-umsa.com

XPONENTIAL
Monday 8 May – Thursday 11 May, Denver, USA
www.xponential.org

UDT
Tuesday 9 May – Thursday 11 May, Rostock, Germany
www.udt-global.com

Uncrewed Maritime Systems Technology
Wednesday 10 May – Thursday 11 May, London, UK
www.smgconferences.com/defence/uk/conference/Unmanned-Maritime-Systems

Mobility Live Middle East
Monday 15 May – Tuesday 16 May, Abu Dhabi, UAE
www.terrapinn.com/exhibition/mobility-live-me

Future Mobility Asia
Wednesday 17 May – Friday 19 May, Bangkok, Thailand
www.future-mobility.asia

National Congress on Counter UAS Technology
Tuesday 23 May – Wednesday 24 May, Washington, DC
www.americanconference.com/counter-uas-technology

Critical Communications World
Tuesday 23 May – Wednesday 24 May, Helsinki, Finland
www.expobeds.com/event/critical-communications-world

Energy Drone & Robotics Summit
Monday 12 June – Wednesday 14 June, Texas, USA
www.edrcoalition.com

Paris Airshow
Monday 19 June – Sunday 25 June, Paris, France
www.siae.fr/en

MOVE
Wednesday 21 June – Thursday 22 June, London, UK
www.terrapinn.com/exhibition/move

Japan Drone
Monday 26 June – Wednesday 28 June, Chiba, Japan
www.ssl.japan-drone.com/en_la

Military Robotics and Autonomous Systems
Monday 10 July – Tuesday 11 July, Arlington, USA
www.smgconferences.com/defence/northamerica/conference/robotics-usa

Drone International Expo
Wednesday 26 July – Thursday 27 July, New Delhi, India
www.droneinternationalexpo.com

Commercial UAV Expo Americas
Tuesday 5 September – Thursday 7 September, Las Vegas, USA
www.expouav.com

DSEI
Tuesday 12 September – Friday 15 September, London, UK
www.dsei.co.uk

UAV Technology
Monday 25 September – Tuesday 26 September, London, UK
www.smgconferences.com/defence/uk/conference/UAV-Technology
Searching for the perfect engineering job in the uncrewed industry just got a whole lot easier
The global hub for uncrewed and autonomous systems engineering vacancies
www.uncrewedengineeringjobs.com

DroneX
Tuesday 26 September – Wednesday 27 September, London, UK
www.dronexpo.co.uk

Unmanned Maritime Systems Technology USA
Wednesday 27 September – Thursday 28 September, Arlington, USA
www.smgconferences.com/defence/northamerica/conference/umst-usa

Unmanned Systems West
Wednesday 27 September – Thursday 28 September, San Diego, USA
www.americanconference.com/unmanned-systems-west/

UAV Show
Tuesday 10 October – Thursday 12 October, Bordeaux, France
www.uavshow.com

Egypt Defence Expo
Monday 4 December – Thursday 7 December, New Cairo, Egypt
www.egyptdefenceexpo.com

UMEX
Monday 22 January – Thursday 25 January 2024, Abu Dhabi, UAE
www.umexabudhabi.ae

GEOWEEK
Sunday 11 February – Tuesday 13 February 2024, Denver, Colorado
www.geo-week.com

Oceanology International
Tuesday 12 March – Thursday 14 March 2024, London, UK
www.oceanologyinternational.com/london

ILA Berlin
Wednesday 5 June – Sunday 9 June 2024, Berlin, Germany
www.ila-berlin.de

Eurosatory
Monday 17 June – Friday 21 June 2024, Paris, France
www.eurosatory.com
While she wasn't born among robots and high technology, Noel Heiks was introduced to them at an early age. That set her on a career path as an engineer, serial entrepreneur and investor that has led to positions on the boards of companies including long-range BVLOS UAV systems developer Censys Technologies, and made her an evangelist for the integration of UAVs with advanced sensors and machine learning (ML).
"My dad is the culprit," she says. "He ran a business in Greenville, South Carolina, called Advanced Automation. I started working with him when I was about 14, first as a janitor, then a drill press operator, then a blueprinter, and when I was 18 he said, 'OK, now you can program the robots.'"
Advanced Automation's main customer was Bosch, for whom it made fuel injectors for cars, using robots served by long lines of conveyor belts to assemble them. Programming the robots involved writing code to send data over a local area network and put it into a Lotus 1-2-3 spreadsheet. "I had to figure out how to write the code, so my dad sort of threw a C++ book at me and said, 'Learn this!'"
Growing up in her father's business also made starting and running technology companies seem the natural thing to do. To prepare for that, however, she needed to supplement her early hands-on experience with further education in science and technology, which she pursued through a bachelor's degree in physics and then a master's in electrical engineering, both from Virginia Tech.
After earning her bachelor's, she received a practical introduction to sensor technologies during three years as an employee of a company that built automated processing systems for manufacturing and testing photomultiplier tubes.
Computer visionary
The head of Censys Technologies talks to Peter Donaldson about how she sees UAVs and AI being deployed for fast, high-resolution inspections
The Sentaero can carry a range of sensors, including methane-detecting laser spectroscopy systems (Images courtesy of Censys)
A return to Virginia Tech followed,
where she worked as a research associate building data acquisition systems for fibre optic sensors.
Then came her first start-up, Haleos, which sold micro-fabrication and optoelectronics products. As vice-president, Heiks raised capital and grew the business for six years before selling it. During that time she was also studying for her master's.
Next came two more start-ups, including Nuvotronics, which built phased-array radars and millimetre-wave antenna systems, usually for aerospace applications. While growing Nuvotronics, she raised capital and built alliances with large commercial and defence companies before selling it to Cubic Corporation.

Machine vision
A stint as president and chief operating officer at Duos Technologies brought experience of very high-speed, high-resolution imaging combined with AI/ML for automated safety inspection of freight railcars. Here, Heiks' master's work in computer vision came in handy.
"They build portals through which a train two miles long passes at 70 mph, taking visible and infrared images as it speeds by. A few minutes later you can tell whether a nut is out of place or if a wheel bearing is too hot," she says. "We could detect all kinds of mechanical defects."
Turning this concept around by putting the sensors or cameras on moving vehicles creates opportunities for UAVs in high-speed, high-resolution inspection services aided by computer vision and AI/ML technologies. This is where Censys Technologies comes in, with its long-range Sentaero UAVs.
Invited by Censys to meet an electric utility company for which Censys was demonstrating one of its UAVs, Heiks was impressed by the system, in which truck/trailer-based mobile command centres support multiple UAVs. Some missions might cover hundreds or thousands of miles using multiple trailers and fleets of the aircraft.

Adapting AI
"I watched this in action and I thought about the implications for both government and commercial use," she says.
"I thought these guys were going to go far, but at the time they did not have the AI/ML capability to build on their data sets.
"So I imagined that very soon Censys would be adapting its payloads and taking on all kinds of data – spectrographic, hyperspectral, thermal and so on – and discerning from it certain patterns such as, 'That is too hot, that plant is a different species, that is a car, that guy is carrying a gun and he shouldn't be.' These are the sorts of things you can do when you have smarts on your aircraft."
These capabilities are now integrated into Censys' CensWise AI computer vision system. Heiks then hired technologists from Duos, including AI expert David Ponevac as CTO, and together they set up platforms such as the RailSens software. This takes data from UAV flights over miles of railway, looking for vegetation encroachment, anomalies in the rails themselves and anything else that shouldn't be there. She expects this kind of UAV-based service to replace manned aircraft and more subjective monitoring processes.
"The issue is that you get invasive
Noel Heiks | In conversation
The Sentourion mobile command centre manages Sentaero operations, and applies AI and machine learning techniques to image processing