Read all back issues online: www.ust-media.com
UST 48: Feb/Mar 2023 | UK £15, USA $30, Europe €22
Network news: How 5G is being deployed for uncrewed systems
Recharge account: The latest advances in solar power technology
In for the long haul: Kodiak Robotics' autonomous solution for the US logistics industry
Contents | February/March 2023

04 Intro: Asleep at the wheel? In California, passengers in driverless taxis are nodding off in the back, leading to a growth in 911 call-outs
06 Platform one: Mission-critical info. ZF unveils an SAE Level 4 shuttle for mixed urban traffic, Flowcopter develops a UAV that can lift 150 kg, rFpro uses ray tracing to test vehicle sensors, and much more
20 In conversation: Jarno Puff. LikeAbird's CTO gives his view of the issues surrounding the design and delivery of UAV cellular comms and solutions
24 Dossier: Kodiak Driver. The US logistics industry is facing a dire shortage of truck drivers, which this autonomous system is designed to address
38 Focus: 5G networks. Uncrewed systems of all kinds are now adopting 5G technology thanks to the emergence of network-specific standards
48 Digest: SWL Robotics Tiburon. Everglades-style airboat propulsion enables this USV to carry out surveys where other marine craft simply cannot go
58 Insight: UUVs. A station-keeping capability is becoming the new trend for UUVs, and that is driving a special range of innovations
68 In operation: Skypersonic Skycopter. NASA is using this UAV, and the company's Skyrover UGV, to train astronauts in uncrewed vehicle control on Mars
76 Show report: CES 2023. Innovations in uncrewed perception, navigation and transport were among the highlights at this year's show
82 Dossier: Limbach 2400 DX and 550 EFG. UAVs are becoming bigger, heavier and more power-hungry, so they need propulsion that offers higher torque, a need this company is meeting with these latest engines
94 In operation: NXInnovation 100 Enviro. This autonomous electric USV is already proven in cleaning up litter and spills in urban waters; it's portable and low-cost too
100 Focus: Solar power. PV technology is now accessible to smaller companies, and it can enhance the performance of uncrewed systems as well
108 In operation: Protegimus Protection. We look at how this company uses UAVs to maintain the security and safety of large construction, logistics and other sites
114 PS: Internet of quantum UAVs? The science of quantum computing in UAVs is moving from science fiction to science fact thanks to some recent research
Intro | February/March 2023

Wake-up calls

One consequence of the move to driverless cars has been somewhat unexpected. The growing use of driverless cars in San Francisco for commercial taxi services has led to an increase in call-outs for the emergency services. That should come as a surprise, as autonomous systems are developed to be safer for passengers as well as pedestrians.

Ironically, the growth in call-outs to the fire and ambulance services is a direct result of passengers feeling safe and secure, so much so that they have fallen asleep and cannot be roused. The vehicles then notify their remote operators that their passengers are flat out in the back, sleeping as they might on a bus. Again, this is an indicator of the safety of the system, but the remote operators cannot wake them over the remote link, and have had to call an ambulance and fire engine instead.

With multiple announcements on driverless technologies at the recent Consumer Electronics Show (see page 76), our focus on 5G connectivity (page 38) and our interview with Jarno Puff of LikeAbird on page 20, perhaps the time has come to pay more attention to the technology inside the vehicle to wake up customers who have become too comfortable.

Nick Flaherty | Technology Editor

Editorial Director: Ian Bamsey
Deputy Editor: Rory Jackson
Technology Editor: Nick Flaherty
Production Editor: Guy Richards
Contributor: Peter Donaldson
Technical Consultants: Paul Weighell, Ian Williams-Wynn, Dr Donough Wilson, Prof James Scanlan
Design: Andrew Metcalfe, andrew@highpowermedia.com
UST Ad Sales: Please direct all enquiries to Freya Williams, freya@ust-media.com
Subscriptions: Frankie Robins, frankie@ust-media.com
Publishing Director: Simon Moss, simon@ust-media.com
General Manager: Chris Perry

The USE network

Having now provided several enterprises around the world with the support and connections they need to implement efficient and sustainable technological solutions, we're keen to continue expanding this free service. If the uncrewed vehicle and/or system you're working on could benefit from some independent advice, from engineers specialising in the appropriate field, then please do get in touch.

Email your question/challenge/dilemma/predicament to thenetwork@uncrewedsystemsengineering.com or visit www.uncrewedsystemsengineering.com and raise a case with us. All questions will be treated in the strictest confidence, and there's no obligation whatsoever to follow any recommendations made.
Volume Nine | Issue Two | February/March 2023

High Power Media Limited
Whitfield House, Cheddar Road, Wedmore, Somerset, BS28 4EJ, England
Tel: +44 (0)1934 713957 | www.highpowermedia.com
ISSN 2753-6513 | Printed in Great Britain | © High Power Media

All rights reserved. Reproduction (in whole or in part) of any article or illustration without the written permission of the publisher is strictly prohibited. While care is taken to ensure the accuracy of information herein, the publisher can accept no liability for errors or omissions. Nor can responsibility be accepted for the content of any advertisement.

SUBSCRIPTIONS
Subscriptions are available from High Power Media at the address above or directly from our website. Overseas copies are sent via air mail.
1 year subscription (15% discount): UK – £75; Europe – £90; USA – £93.75; ROW – £97.50
2 year subscription (25% discount): UK – £135; Europe – £162; USA – £168.75; ROW – £175.50
Make cheques payable to High Power Media. Visa, Mastercard, Amex and UK Maestro accepted. Quote card number and expiry date (also issue/start date for Maestro).
Platform one | Mission-critical info for uncrewed systems professionals

Driverless vehicles
Level 4 shuttle unveiled

Vehicle systems supplier ZF has developed an autonomous shuttle for SAE Level 4 operation (writes Nick Flaherty).
The shuttle was launched at the Consumer Electronics Show in the US in January, and is designed for autonomous driving in urban environments and mixed traffic, rather than segregated lanes. ZF plans to supply several thousand of the vehicles to provide an autonomous transport system that includes fleet management, maintenance, repair and training. The shuttles will be run by an operator in Florida called Beep.
The 22-passenger vehicle uses Lidar, radar, camera and audio systems to provide precise environmental detection. It also uses the ZF ProConnect wireless system to connect to the cloud, working with the ZF ProAI supercomputer. This is a modular design that can be equipped with chips from different suppliers, depending on the planned application and the computing power required.
The connectors for the ProAI are compatible with all common plugs on the market, and there are three cooling options depending on the performance level: passive cooling, air cooling and liquid cooling, all of which fit in the same 240 x 138 x 49 mm housing. That is much smaller than its previous models, and therefore gives more freedom with the installation options in the vehicle.
The ProAI runs ZF's autonomous driving software, with the output going to the onboard actuators.
The shuttle will have selectable battery capacities of between 50 and 100 kWh, and a range of 80 miles at a speed of 25 mph, with further development to 50 mph as the accuracy and precision of the driving software improves.
The shuttle also includes an automatic ramp and wheelchair restraints for disabled travellers, and a 'kneeling' function to provide level access. That enables the shuttle to dock precisely, using the front and rear-wheel steering included in the control system.

ZF's autonomous shuttle is designed to travel in mixed traffic in urban areas
Airborne vehicles
UAV is a heavy lifter

Flowcopter, in Edinburgh, Scotland, is developing a UAV that can carry up to 160 kg (writes Nick Flaherty).
The 500 kg UAV uses Digital Displacement transmission technology that is much lighter, more robust and less costly than an equivalent electric transmission. The UAV uses a type-certified engine from a light aircraft with a hydraulic transmission that allows the cylinders to be optimised in real time, using fast digital control, to drive rotors for vertical lift.
The hydraulic system has four separate subsystems, all of which share a common low-pressure connection to a return line that is pre-charged by a pneumatic bladder accumulator. Each rotor subsystem has an independent fluid supply from the pump, which supplies fluid to a fixed-displacement bent-axis hydraulic motor that also acts as the rotor hub. Each rotor subsystem has a pressure sensor and a motor speed sensor.
A system controller sends pressure demands for each rotor circuit to the pump controller, a specialised electronic controller consisting of a microcontroller, an FPGA and power electronics.
The digital displacement pump consists of 12 cylinders arranged in three banks of four along a common crankshaft supported by roller bearings. Each piston bears on the crankshaft via a hydrostatic pad and is sealed by a spherical metal piston ring. Each cylinder has two commutation valves, which are both built into a single valve capsule.
The pump drives the bent-axis hydraulic motors that turn the propellers for lift. However, this has required a new design of propeller, as the motors operate at lower speeds but require more power. MagCAD, in Germany, has designed and produced the 2.2 m-diameter props for a demonstrator, with a speed of 1800 rpm for hovering and up to 2100 rpm for horizontal flight. A hinge helps with the torque and for transport.
"We have specific needs for the UAV on the torque-to-power ratios, where we matched the hydraulic lift motor to the propeller using the MagCAD software, and that gives us more lift for a given power than from an off-the-shelf propeller," said Uwe Stein, technical director of Flowcopter.
"We are now looking to go bigger, for crop spraying and logistics, and we have a project with the Warsaw Aviation Institute where we are looking at our technology for a transitional UAV," he said. "The transmission can syphon off power for four bent-axis lift motors on the wings, so we can get 40 kW out of the 5.5 kg motor. They have an incredible power-to-weight ratio and robustness.
"Hydraulics has always been seen as a robust and low-cost technology, but also as inefficient and not very controllable. By enabling and disabling cylinders we reach 95% efficiency, putting us on a par with electric transmissions but cheaper and lighter."
The UAV is currently being tested.

A hydraulic transmission and a new design of prop contribute to the UAV's carrying capacity
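The per-revolution enable/disable decision at the heart of Digital Displacement control can be pictured in a few lines. The sketch below is a minimal illustration assuming a simple proportional law on pressure error; the cylinder count matches the article, but the stroke volume, gain and function names are invented, and the real pump controller decides per cylinder, per shaft angle, on its microcontroller and FPGA.

```python
# Minimal sketch of a digital-displacement control step (illustrative only;
# not Flowcopter's implementation). Each revolution, every cylinder is either
# enabled (pumping a fixed stroke) or idled, so displacement is set digitally.

NUM_CYLINDERS = 12          # three banks of four, per the article
STROKE_CC = 20.0            # assumed fixed stroke volume per cylinder

def cylinders_to_enable(p_demand_bar, p_actual_bar, gain=0.5):
    """Decide how many cylinders fire this revolution.

    A simple proportional law on pressure error, standing in for the real
    per-cylinder decision made by the microcontroller and FPGA.
    """
    error = p_demand_bar - p_actual_bar
    n = round(gain * error)
    return max(0, min(NUM_CYLINDERS, n))

# Example: a rotor circuit demands 350 bar while the sensor reads 330 bar
print(cylinders_to_enable(350.0, 330.0))   # -> 10 of 12 cylinders enabled
```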
Driverless vehicles
Green light for bus trial

An electric driverless bus is set to start operating on the city streets of Dunfermline, Scotland (writes Nick Flaherty).
The CAVForth 2 project will use the CAVstar Automated Driving System, which combines sensor fusion for cameras, Lidar and radar with a machine learning system. It was developed by Fusion Processing in Bristol, England.
The aim of the project is to show how autonomous buses can improve journey times and quality of service for customers while also reducing energy consumption and emissions. Fusion Processing's latest version of CAVstar will be fitted to a fully electric Enviro100AEV bus from consortium partner Alexander Dennis. The system is capable of operating at SAE Level 4, and features redundancy on all safety-critical systems, together with additional redundancy built into the steering and braking systems.
CAVstar receives information directly from traffic light systems to enable the bus to plan its speed to run smoothly from one green light to the next. That can help reduce energy consumption by up to 20% by cutting unnecessary braking and accelerating. It also means less wear on brakes and tyres, further reducing operating costs, which in turn leads to a reduction in brake and tyre particulates. These will be included in EU emissions regulations for the first time as part of the proposals for Euro VII, which the project sees as a further benefit of using autonomous vehicles in public transport networks to help improve air quality, especially in cities.
"CAVForth 2 builds on our experience of developing a fleet of five full-size SAE Level 4 autonomous buses," said Jim Hutchinson, CEO and co-founder of Fusion Processing.
As with the first CAVForth project, the CAVForth 2 bus will have a specially trained safety driver on board to monitor the vehicle's autonomous systems, and a bus 'Captain' who will walk along the passenger deck, assisting customers and answering any questions they may have.
A trial service in Oxford, England, is also planned using the CAVstar technology. The service, called the MiLink project, will feature a 16-seat, single-decker electric minibus, and will take a circular route around the Milton Science Park. A journey planning app, developed by Zipabout, has been customised for Milton Park to provide real-time updates on the service, which will operate between 7 am and 6.30 pm, Monday to Saturday, running up to every 15 minutes.

The bus, although autonomous, will have a safety driver and a 'Captain' to answer passengers' questions
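In its simplest form, the speed planning described above reduces to choosing a steady speed that reaches the next signal once it has turned green. The sketch below illustrates that idea only; it is not Fusion Processing's algorithm, and the speed limits and function name are assumptions.

```python
# Illustrative green-wave speed planner (a sketch, not CAVstar's logic):
# given the distance to the next signal and the time until it turns green,
# pick a steady speed that avoids stopping and restarting.

def plan_speed(dist_m, t_green_s, v_min=3.0, v_max=13.4):
    """Return a target speed in m/s; v_max ~ 13.4 m/s is roughly 30 mph.

    Both limits are assumptions for the sketch.
    """
    if t_green_s <= 0:
        return v_max                  # light already green: proceed at limit
    v = dist_m / t_green_s            # speed that arrives exactly on the green
    return max(v_min, min(v_max, v))

# 200 m from a light that turns green in 20 s: cruise at 10 m/s, no braking
print(plan_speed(200.0, 20.0))
```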
Simulators
Ray-tracing sensor tests

Simulation firm rFpro has used ray tracing to develop software that can accurately reproduce environments for testing the sensors in autonomous vehicles (writes Nick Flaherty).
The company develops high-fidelity software for driver-in-the-loop (DIL) simulators, and six years ago started extending its technology to testing driverless cars. It has written a simulation engine from the ground up using ray tracing, which traces all the beams that fall on a sensor. These can be visible light for a camera, infrared for a Lidar or RF for a radar sensor. Using ray tracing allows artefacts such as motion blurring to be accurately tested in a virtual environment.
The simulation engine builds up environments, for example an underground parking garage or an urban tunnel at night. All the beams, or rays, in the environment are tracked, including those from external lights and from the vehicle, to recreate what is received by the sensor.
"We have spent 16 years creating immersive real-time, high-bandwidth, low-latency simulation technology for human vision, which is what DIL simulators are all about," said Matt Daley, operations director at rFpro.
"Until now, the fidelity of simulation in the most challenging lighting situations hasn't been high enough to replace real-world data. Our ray-tracing technology is a physically modelled simulation solution that has been developed specifically for sensor systems, to accurately replicate the way they see the world.
"It needs to be engineering-accurate. You have to do things as physically accurately as possible. As soon as you move away from a perfectly lit daytime scene, with lots of other light sources and other vehicles, you have to be able to calculate how the light bounces around the environment. That is why ray tracing is needed for high-fidelity sensor simulation.
"Ray tracing is established in the graphics industry, but it has been focused on making things look good to human eyes. We believe this is the first engine written from the ground up for sensors in autonomous systems.
"It's all about the physics of electromagnetic waves from a source reflecting off materials and arriving at a sensor. It's about how you trace the path."
The model of the sensor is a key element in the simulation. For example, a camera sensor with a rolling shutter can use three capture periods, at 2, 5 and 10 ms for instance, then process that data to give an HDR image. These timings can also change from frame to frame as the sensor adapts to the different light levels while the vehicle moves around. That needs to be included in the model to achieve accurate motion blur in the simulated sensor.
"With rolling shutter sensors, every single line of the chip is being sampled at a slightly different time, so we don't get straight edges," said Daley. "That is fundamentally built into the way the sensor models are coupled with the ray tracing. What we have done is develop the ray tracer alongside the sensor APIs that allow the models to be integrated."

Adapting to varying light levels allows motion blur to be tested virtually
This is tested in the lab using physical cameras that measure light intensity levels and colour reproduction against the simulation, with results within a single-digit percentage. All the calculations are handled as 32-bit floating-point data to represent the strength of the light beams.
However, this is not a real-time engine, as the optimisations used for high-speed rendering in the DIL systems won't work for the highest fidelity sensor simulation. "We still need to be highly efficient with this sensor simulation though, so we make sure every ray we fire is used," said Daley.
The ray tracing incorporates every element in a simulated scene, which has been physically modelled to include accurate material properties, to create the highest fidelity sensor data. The rate of frame rendering is adjusted to suit the level of detail required. That enables high-fidelity rendering to be carried out overnight and then played back in subsequent real-time runs if needed.
The simulations can all be rendered on a commercial GPU board in a PC, taking seconds per frame. The sensor models run on another GPU card, coordinated by the central processor in the PC. This can be extended to cloud computing systems with arrays of CPUs and GPUs to build systems with multiple sensors, as a driverless vehicle could have 40 or more sensors operating simultaneously.
The simulation is managed by a series of APIs. A vehicle API is used for the vehicle being simulated, with a traffic API for additional vehicles, pedestrians, bicycles and so on. Then there is a sensor API to link to the sensor model.
The multi-threaded rFpro simulation is also a synchronous system, and waits for everything to finish before moving to the next step, which again prioritises accuracy and precision over real-time operation. A simulation thread controls all the objects and where they are, then there is an independent rendering thread to produce the ray data for the sensor models.
"We started this development at the time ray-tracing cores appeared on graphics cards, so the ray tracer has been designed for those cores," Daley said. "We have more than 200 digital models, ranging from a 30 km section of a complex highway in central Tokyo to the controlled setting of the Millbrook proving ground in England, modelled in its entirety."
The simulation environment makes it extremely flexible to create thousands of tests, particularly the edge and corner cases that are difficult to reproduce in the real world, such as testing the flicker mitigation in the sensor.
"This represents a move away from proof-of-concept demonstrators for training neural networks using synthetic data to continuous development and testing for an inherently safe process," Daley said.
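Daley's description of multi-exposure rolling-shutter capture can be made concrete with a toy model. The sketch below assumes a simplified structure: three exposures merged naively, with a fixed per-row sampling delay. It is not rFpro's sensor API, and the function names and numbers are invented for illustration.

```python
# Toy rolling-shutter HDR camera model (illustrative assumptions throughout).
import numpy as np

def hdr_frame(scene, exposures_ms=(2.0, 5.0, 10.0), line_skew_us=30.0):
    """scene(t_ms) -> 2-D radiance array at simulation time t_ms."""
    rows, cols = scene(0.0).shape
    merged = np.zeros((rows, cols))
    for e in exposures_ms:
        img = np.empty((rows, cols))
        for r in range(rows):                    # rolling shutter: each row is
            t_row = r * line_skew_us / 1000.0    # sampled slightly later
            img[r] = np.clip(scene(t_row)[r] * e, 0.0, 255.0)
        merged += img / e                        # normalise by exposure time
    return merged / len(exposures_ms)            # naive HDR merge

# A moving bright bar: straight vertical edges in the scene come out skewed
# in the frame, reproducing the "no straight edges" artefact Daley describes.
def scene(t_ms, width=64):
    x0 = int(10 + 2 * t_ms) % width
    frame = np.full((48, width), 2.0)
    frame[:, x0:x0 + 6] = 40.0
    return frame

print(hdr_frame(scene).shape)   # -> (48, 64)
```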
Driverless vehicles
Easier driver takeover

Researchers at EPFL in Switzerland have developed algorithms to allow people to take over control of driverless cars more easily (writes Nick Flaherty).
The team has used haptic shared control on the steering wheel to create a different way for the vehicle to interact with the driver. Working with steering systems supplier JTEKT, in Japan, it has road-tested the system, which integrates different modes of human-robot interaction.
One issue that has arisen with driverless cars is the handover from autonomous operation to driver control. Research has shown that placing too much control of a vehicle in the hands of automation can do more harm than good, as disengagement by human drivers can increase the risk of accidents.
"Current vehicles on the market are either manual or automatic, and there is no clear way of making their control a truly shared experience," said Jurg Schiffmann, head of EPFL's Laboratory for Applied Mechanical Design in the School of Engineering. "That is dangerous, because it tends to lead to driver over-reliance on automation."
"This research was based on the idea that automation systems should adapt to human drivers, not vice versa," said EPFL PhD student and JTEKT researcher Tomohiro Nakade, the lead author of a recent paper in the Communications Engineering journal describing the system.
"In automation in general, when humans are just monitoring a system but not actively involved, they lose the ability to react," said Robert Fuchs, a former EPFL PhD student who is now an r&d general manager at JTEKT. "That's why we wanted to actively improve driver engagement through automation."
Rather than using a camera, the haptic approach uses data from the sensors in the car's steering column. It also encourages continuous engagement between the driver and the automation system, as opposed to current automated systems, which are typically switched either on or off. That means the software-based system can be integrated into standard mass-produced cars without any special equipment.
As the driver operates the vehicle, the system moves between four different interaction modes, depending on the evolving situation on the road. For example, the car might switch from collaboration to competition mode, taking over control to avoid a sudden threat of collision. Still within the same control framework, the system integrates an 'inclusion' function, which re-computes the vehicle's trajectory whenever the driver intervenes, for example by turning the steering wheel, rather than perceiving the intervention as an override and switching off.
To test the system, the researchers developed experiments involving a simulated virtual driver and a human driver using a detached power steering system, a full driving simulator, and field tests with a modified test vehicle. The field tests were carried out with the participation of five drivers on a JTEKT test course in Japan, by connecting the researchers' system to a standard car via an external controller.

The EPFL's haptic approach uses data from the sensors in the car's steering column
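The article names two of the four interaction modes, collaboration and competition, plus the 'inclusion' function. A toy arbitration along those lines might look like the sketch below; the remaining mode names, torque thresholds and risk measure are invented for illustration and are not taken from the published controller.

```python
# Toy haptic shared-control mode arbitration (illustrative assumptions only).

def interaction_mode(driver_torque_nm: float, collision_risk: float) -> str:
    if collision_risk > 0.8:
        return "competition"    # system steers against the driver to avoid a crash
    if abs(driver_torque_nm) > 2.0:
        return "collaboration"  # driver clearly steering: fold their input into
                                # a re-planned trajectory (the 'inclusion' idea)
    if abs(driver_torque_nm) > 0.2:
        return "cooperation"    # light shared guidance felt through the wheel
    return "automation"         # hands-off: system steers alone

print(interaction_mode(3.1, 0.1))   # -> collaboration
```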
Airborne vehicles
GCS is UAV-agnostic

UAV Navigation has launched a next-generation ground station that is designed to handle multiple types of UAV, including those for maritime missions (writes Nick Flaherty).
The GCS is based around the GHU-100, a new ground control hub that helps platform manufacturers connect multiple ground devices, such as the PC, data links and joystick controllers, into a single network segment. Having all the elements in one segment helps minimise network latency, which is particularly relevant for maritime missions that need to combine NMEA (National Marine Electronics Association) maritime data inputs and RTK corrections.
The GHU-100 hub supports ground stations that manage target, fixed-wing, rotary-wing and VTOL platforms, and includes an internal GNSS receiver for autonomous GCS geo-localisation. The receiver has been upgraded from the previous GCS03 to improve accuracy and to be more robust against jamming or spoofing.
UAV Navigation developed its own real-time operating system for the GHU-100 to provide safe and reliable operation if any of the attached PCs crash. The hub has 10 I/O ports that handle multiple Ethernet payload connections, such as redundant radios and the Visionair software running on the PC.
The hub supports bidirectional comms between Visionair on the ground and the onboard autopilot in the air. It also supports routing messages for air-to-air, ground-to-air and air-to-ground comms, a basic requirement for advanced missions using multiple UAVs and GCSs.
The hub also has an integrated NMEA input as a reference source. NMEA 0183 is a serial data protocol issued by the National Marine Electronics Association for marine instruments, and is commonly used by autopilots for GNSS data. The hub also supports efficient dispatch of RTK correction messages from the base directly to the UAV, to provide more accurate positioning information.

The hub of the GCS is the GHU-100
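For a flavour of the NMEA 0183 reference data the hub ingests, the sketch below parses the position fields of a standard GGA sentence. It is generic illustration code rather than UAV Navigation's implementation, and it omits checksum validation and most other fields.

```python
# Minimal NMEA 0183 GGA parser (a sketch; lat/lon fields only).

def parse_gga(sentence):
    """Parse a '$..GGA' sentence, returning (lat, lon) in signed degrees."""
    f = sentence.split(',')
    if not f[0].endswith('GGA') or not f[2]:
        return None
    lat = float(f[2][:2]) + float(f[2][2:]) / 60.0   # ddmm.mmmm -> degrees
    lon = float(f[4][:3]) + float(f[4][3:]) / 60.0   # dddmm.mmmm -> degrees
    if f[3] == 'S':
        lat = -lat
    if f[5] == 'W':
        lon = -lon
    return lat, lon

print(parse_gga("$GPGGA,092750.000,5321.6802,N,00630.3372,W,1,8,1.03,61.7,M,55.2,M,,*76"))
# -> roughly (53.3613, -6.5056)
```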
Driverless vehicles
Data memory boost bid

A project called Memtonomy at Fraunhofer IESE, Germany, is aiming to tackle the memory storage challenge in driverless vehicles (writes Nick Flaherty).
The constantly growing use of sensors and AI components in driverless cars is creating the need for large amounts of data to be recorded, merged and analysed in real time. For cost and energy-efficiency reasons, the control devices used for this increasingly need components originally developed for the consumer market, in particular memory devices such as DRAM and flash.
However, DRAM and flash memory pose major challenges in terms of performance, energy efficiency and functional safety, as currently they are mostly not qualified for safety-critical applications. That was demonstrated last year, when a failure in a flash memory module was the cause of a major recall of Tesla vehicles.
The Memtonomy project aims to increase the bandwidth to memory while reducing the latency and power consumption. At the same time, it has to increase device reliability, because the storage systems have to meet the ISO 26262 safety standard for applications in vehicles.
However, the researchers say it is not yet clear how to use these memory modules and how they will perform in an automotive context with respect to bandwidth, latency, power, temperature, reliability, safety and security. "To the best of our knowledge, there are no investigations or publications that optimise DRAM memory with respect to future automotive applications," they said.
They are using an open source simulation tool called DRAMSys, developed by the Microelectronic Systems Design Research Group of the Technical University of Kaiserslautern and Fraunhofer IESE, to analyse the performance of automotive memory modules for bandwidth, latency and power. The latest version of the tool, DRAMSys4.0, uses virtual models that reflect the DRAM functions, power consumption and temperature. These models enable system designers to analyse the limiting factors in a design and identify issues with the current DRAM standards that apply to autonomous vehicles. The tool also provides a user-friendly trace analyser.
The analysis helps the researchers optimise the DRAM subsystem with respect to the controller architecture, power and thermal management, as well as device selection and channel configuration for driverless car applications.

A trace analyser helps researchers optimise the DRAM subsystem
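A back-of-envelope calculation shows why memory bandwidth becomes the pinch point. The camera count, resolution and bit depth below are illustrative assumptions, not figures from the Memtonomy project.

```python
# Rough sensor-traffic estimate for an autonomous stack (assumed figures).

def camera_bw_gbps(n_cams, mpix, fps, bytes_per_px=2):
    """Raw camera data rate in Gbit/s before any AI or logging traffic."""
    return n_cams * mpix * 1e6 * fps * bytes_per_px * 8 / 1e9

# Six 8 MP cameras at 30 fps, 16-bit raw: ~23 Gbit/s of writes alone,
# which is why DRAM channel configuration and controller policy matter.
print(f"{camera_bw_gbps(6, 8, 30):.1f} Gbit/s")
```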
Driverless vehicles
Fast, low-power vision

Rather than using general-purpose high-performance vision processors for autonomous vehicles, Recogni and Renesas have developed an embedded platform that can deliver up to 2000 TOPS (writes Nick Flaherty).
The Phoenix ECU system combines Recogni's Scorpio AI inference processor and Renesas Electronics' R-Car V4H ADAS/AD SoC. Both chips are built on a 7 nm CMOS process technology.
The Scorpio chip's architecture provides a latency of less than 10 ms from the last pixel out to perception results. That provides ample reaction time for the car to navigate safely, along with an object detection range of up to 300 m in real time under various road and environmental conditions.
"Vision is fundamental to accurate perception processing, and essential to autonomous driving platforms," said RK Anand, founder and chief product officer at Recogni. "From the beginning, we took an approach of processing high-resolution images at the edge to achieve near-perfect object detection and classification, and enable autonomous driving stacks to make better driving decisions.
"The Scorpio can process multiple 8 MP streams at 30 frames per second in less than 10 ms using only 25 W. That's performance an order of magnitude greater than anything else on the market and will, we believe, help accelerate autonomous driving becoming a reality."
The chip can process 1000 TOPS in the 25 W power envelope, and two chips can be used in the platform with a V4H controller for L3 and L4 autonomous applications.
The V4H is also used for machine learning inference, with dedicated deep learning and computer vision processing blocks giving an overall performance of 34 TOPS. The blocks work with four ARM Cortex-A76 general-purpose processor cores running at 1.8 GHz. Three ARM Cortex-R52 cores run in lockstep at 1.4 GHz for a total of 9 kDMIPS, to support ASIL D real-time operation and eliminate the need for external microcontrollers. The lockstep allows the processing functions to be checked by each processor.
Renesas also provides a dedicated power solution for the V4H, based around the RAA271041 pre-regulator and the RAA271005 PMIC. This enables a highly reliable power supply for the R-Car V4H and peripheral memories from the vehicle battery's 12 V supply. These features enable low-power operation while targeting ASIL D compliance for systematic and random hardware faults.
Recogni has an open software platform for the Scorpio, with an L2+-capable AI vision perception stack and a variety of object classification and detection features. These include vehicle, traffic sign, traffic light and lane detection, but developers can run their own perception and driving functions software on the chip.

The Phoenix system can process multiple 8 MP streams at 30 fps in less than 10 ms
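Taken together, the quoted figures imply a striking compute density. The arithmetic below simply sanity-checks them; it is illustrative only, working from peak numbers rather than measured throughput.

```python
# Back-of-envelope check on the quoted Scorpio figures: two 8 MP streams at
# 30 fps within a 10 ms budget on a 1000-TOPS, 25 W part (peak figures).

frame_period_ms = 1000 / 30              # ~33.3 ms between frames, so a 10 ms
                                         # latency fits well inside one period
pixels_per_s = 2 * 8e6 * 30              # two 8 MP cameras at 30 fps
ops_per_pixel = 1000e12 / pixels_per_s   # peak ops available per pixel

print(f"{frame_period_ms:.1f} ms/frame, {ops_per_pixel/1e6:.1f} Mops/pixel")
print(f"{1000 / 25:.0f} TOPS/W power efficiency")
```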
Airborne vehicles
Neural net nano-UAVs

A group of researchers has developed neural network algorithms that can work on nano-UAVs weighing as little as 10 g (writes Nick Flaherty).
Nano-UAVs are increasingly useful for monitoring applications, but struggle with complex autonomous operations that have multiple objectives. Researchers from the University of Bologna, Italy, KU Leuven in Belgium, the Dalle Molle Institute for Artificial Intelligence (USI-SUPSI) in Switzerland and the Integrated Systems Laboratory at ETH Zurich, Switzerland, successfully combined time-of-flight distance sensor data with a vision-based convolutional neural network (CNN) for object detection in tests of the autonomous system with multiple objectives.
The test UAV is a Bitcraze Crazyflie 2.1 quadrotor, a COTS UAV that weighs 27 g and has a diameter of 10 cm. It is equipped with two microcontroller units (MCUs): a single-core ARM Cortex-M4 (STM32) for safe navigation and exploration policies, and a parallel ultra-low-power eight-core GAP8 that uses the RISC-V instruction set. The GAP8 has been optimised for CNN inference, with a power envelope of only 134 mW including the image sensors and external memories.
The challenge of providing autonomous operation for the UAV is addressed by mapping multiple tasks onto the two MCUs. The STM32F405 MCU, with a peak performance of 100 million MAC operations per second, is used for lightweight workloads such as the sensor interfaces. The GAP8 MCU is used for the machine learning and image detection algorithms.
The UAV has three additional PCBs: the Flow deck, the Multi-ranger deck and the AI deck. The Flow deck provides optical flow and height measurements to increase state estimation reliability. The Multi-ranger deck features five single-beam VL53L1x time-of-flight distance sensors mounted on the UAV's top and sides, providing line-of-sight distance measurements within 40 cm. The AI deck is a visual engine running on the GAP8 chip at 175 MHz, with a low-power QVGA-resolution grayscale camera and additional off-chip memories, including 8 Mbytes of HyperRAM and 64 Mbytes of HyperFlash.
The UAV was tested with various algorithms, and flew around a room with multiple objectives. The best results achieved a final detection rate of 90%, exploiting a pseudo-random policy for exploration with the largest object detection model and a mean flight speed of 0.5 m/s. A higher detection rate can be reached by trading off the CNN's detection capabilities against the flight speed.
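The task mapping across the two MCUs can be pictured with a short sketch. Python is used here purely for readability; the actual firmware is C, and the function names, thresholds and stubbed CNN output below are invented for illustration.

```python
# Conceptual split of the nano-UAV workload across its two MCUs (a sketch).

def run_cnn(frame):
    # Stub for the GAP8's quantised CNN; returns (label, score) pairs.
    return [("person", 0.91), ("chair", 0.32)]

def stm32_tick(ranges_mm):
    """Safety and exploration policy: light enough for the Cortex-M4."""
    if min(ranges_mm) < 400:          # ToF sensors see ~40 cm line of sight
        return "turn_away"
    return "explore"                  # pseudo-random exploration policy

def gap8_tick(frame):
    """Perception: CNN inference mapped to the eight RISC-V cores."""
    return [d for d in run_cnn(frame) if d[1] > 0.5]

print(stm32_tick([1200, 350, 900, 800, 1500]), gap8_tick(None))
# -> turn_away [('person', 0.91)]
```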
Uncrewed Systems Technology's consultants

Dr Donough Wilson
Dr Wilson is innovation lead at aviation, defence, and homeland security innovation consultants, VIVID/futureVision. His defence innovations include the cockpit vision system that protects military aircrew from asymmetric high-energy laser attack. He was the first to propose the automatic tracking and satellite download of airliner black box and cockpit voice recorder data in the event of an airliner's unplanned excursion from its assigned flight level or track. For his 'outstanding and practical contribution to the safer operation of aircraft' he was awarded The Sir James Martin Award 2018/19 by the Honourable Company of Air Pilots.

Paul Weighell
Paul has been involved with electronics, computer design and programming since 1966. He has worked in the real-time and failsafe data acquisition and automation industry using mainframes, minis, micros and cloud-based hardware on applications as diverse as defence, Siberian gas pipeline control, UK nuclear power, robotics, the Thames Barrier, Formula One and automated financial trading systems.

Ian Williams-Wynn
Ian has been involved with uncrewed and autonomous systems for more than 20 years. He started his career in the military, working with early prototype uncrewed systems and exploiting imagery from a range of systems from global suppliers. He has also been involved in ground-breaking research, including novel power and propulsion systems, sensor technologies, communications, avionics and physical platforms. His experience covers a broad spectrum of domains, from space, air, maritime and ground, in both defence and civil applications including, more recently, connected autonomous cars.

Professor James Scanlan
Professor Scanlan is the director of the Strategic Research Centre in Autonomous Systems at the University of Southampton, in the UK. He also co-directs the Rolls-Royce University Technical Centre in design at Southampton. He has an interest in design research, in particular how complex systems (especially aerospace systems) can be optimised. More recently, he established a group at Southampton that undertakes research into uncrewed aircraft systems. He produced the world's first 'printed aircraft', the SULSA, which was flown by the Royal Navy in the Antarctic in 2016. He also led the team that developed the ULTRA platform, the largest UK commercial UAV, which has flown BVLOS extensively in the UK. He is a qualified full-size aircraft pilot and also has UAV flight qualifications.
Uncrewed Systems Technology diary

Additive Manufacturing for Aerospace, Defence and Space Conference
Wednesday 22 February – Thursday 23 February, London, UK
www.defenceiq.com/events-additivemanufacturing

Unmanned & Autonomous Systems Summit
Wednesday 8 March – Thursday 9 March, Maryland, USA
www.unmannedsystems.dsigroup.org

Paris Space Week
Thursday 9 March – Friday 10 March, Paris, France
www.paris-space-week.com

GEOConnect/Drones Asia
Wednesday 15 March – Thursday 16 March, Marina Bay Sands, Singapore
www.dronesasia.com

DSEI Japan
Wednesday 15 March – Friday 17 March, Tokyo, Japan
www.dsei-japan.com

Amsterdam Drone Week
Tuesday 21 March – Thursday 23 March, Amsterdam, The Netherlands
www.amsterdamdroneweek.com

Sea-Air-Space
Monday 3 April – Wednesday 5 April, Maryland, USA
www.seaairspace.org

Military Robotics and Autonomous Systems Conference
Monday 17 April – Tuesday 18 April, London, UK
www.smgconferences.com/defence/uk/conference/robotic-autonomous-systems

Ocean Business
Tuesday 18 April – Thursday 20 April, Southampton, UK
www.oceanbusiness.com

Rotorcraft/Unmanned Systems Asia
Wednesday 3 May – Friday 5 May, Singapore
www.rca-umsa.com

XPONENTIAL 2023
Monday 8 May – Thursday 11 May, Denver, USA
www.xponential.org

Undersea Defence Technology
Tuesday 9 May – Thursday 11 May, Rostock, Germany
www.udt-global.com

Uncrewed Maritime Systems Technology
Wednesday 10 May – Thursday 11 May, London, UK
www.smgconferences.com/defence/uk/conference/Unmanned-Maritime-Systems

Mobility Live Middle East
Monday 15 May – Tuesday 16 May, Abu Dhabi, UAE
www.terrapinn.com/exhibition/mobility-live-me

Future Mobility Asia
Wednesday 17 May – Friday 19 May, Bangkok, Thailand
www.future-mobility.asia

Critical Communications World
Tuesday 23 May – Thursday 25 May, Helsinki, Finland
www.critical-communications-world.com

Energy Drone & Robotics Summit
Monday 12 June – Wednesday 14 June, Texas, USA
www.edrcoalition.com

Paris Airshow
Monday 19 June – Sunday 25 June, Paris, France
www.siae.fr/en

Autonomous Ship Expo Conference
Tuesday 20 June – Thursday 22 June, Amsterdam, The Netherlands
www.autonomousshipexpo.com

MOVE
Wednesday 21 June – Thursday 22 June, London, UK
www.terrapinn.com/exhibition/move
Japan Drone
Monday 26 June – Wednesday 28 June, Chiba, Japan
www.ssl.japan-drone.com

Drone International Expo
Wednesday 26 July – Thursday 27 July, New Delhi, India
www.droneinternationalexpo.com

Commercial UAV Expo Americas
Tuesday 5 September – Thursday 7 September, Las Vegas, USA
www.expouav.com

DSEI
Tuesday 12 September – Friday 15 September, London, UK
www.dsei.co.uk

ADAS & Autonomous Vehicle Technology Expo
Wednesday 20 September – Thursday 21 September, Santa Clara, USA
www.autonomousvehicletechnologyexpo.com

DroneX
Tuesday 26 September – Wednesday 27 September, London, UK
www.dronexpo.co.uk

Intergeo
Tuesday 10 October – Thursday 12 October, Berlin, Germany
www.intergeo.de

UAV Show
Thursday 19 October – Friday 20 October, Bordeaux, France
www.uavshow.com

Dubai Airshow
Monday 13 November – Friday 17 November, Dubai, UAE
www.dubaiairshow.aero

Egypt Defence Expo
Monday 4 December – Thursday 7 December, New Cairo, Egypt
www.egyptdefenceexpo.com
All the right signals

The CTO of OEM comms and other systems provider LikeAbird gives Peter Donaldson his take on how to meet customers' requirements

An early enthusiasm for computers and aviation, combined with talents that leaned more towards the practical than the academic, led Jarno Puff, now CTO of uncrewed subsystems developer LikeAbird, into an eclectic career that has encompassed industrial robotics, computing, UAV design and HMI development, emergency aviation and cloud computing. These coalesced into the development of comms for professional-grade UAV systems, including those used in BVLOS missions managed via the cloud.
He first encountered remotely controlled aircraft as a hobby in the early 2000s, through a friend who worked in a model aircraft shop. He recalls having difficulty with the left-right control reversal that takes place when the aircraft comes back towards the operator, and it occurred to him that something like an aircraft cockpit and a set of virtual reality goggles would make such aircraft much easier to fly.

Disliking the left-right control reversal in conventional RC, Jarno Puff co-developed the RealityVision FPV UAV cockpit, with flight test assistance from an aerobatic pilot (Courtesy of Jarno Puff)
Virtual cockpit experiment

The chance to develop such a system came in 2003, through his involvement with Italian volunteer civil defence emergency aviation. UAVs were still a hobby at that point, as by then he had built a career in industrial robotics, automation and computer networking.
Invited to give a talk about UAVs at the ITT Galileo Ferraris college in Verona, he met Prof Athos Arzenton, who was involved in Verona's voluntary civil defence association, and who invited Puff to join. On the first civil defence exercise he attended he met fellow volunteer Davide Burei, a light aeroplane pilot and instructor. Together with another friend they founded the voluntary association's Emergency Flight Department (EFD) to experiment with the use of microlights and UAVs.
"There were no flight controllers available like there are today, so manually controlling a drone was a cost-effective way to get situational awareness from the air. You just needed a licenced 'real aircraft' pilot to fly the drone," he says. "We demonstrated this in 2006 using a test pilot with no RC model experience. He carried out aerobatic manoeuvres we couldn't do as experienced RC pilots, which was impressive and confirmed our supposition."
The system that enabled this was developed in collaboration with a group of German companies, one of which was involved in developing the simulator for the Eurofighter Typhoon. "We used the original Eurofighter simulator throttle, stick and pedals with a CANaerospace-to-PPM converter and a Futaba RC handset," he says. "The FPV display was based on Olympus video goggles and a 2.4 GHz analogue link with six patch antennas and a diversity receiver.
"A Sony Video Walkman GVD1000E was used to record live video on a MiniDV cassette, and an analogue video overlay generator was used to overlay basic telemetry data on the live video."
During this period, he recalls, UAVs were established only in the military, with no rules for civil use. "From a technical point of view, the main challenge was in developing the video and C2 links using legal frequencies and power levels," he says.
Puff spent about 10 years working with the EFD while running his first start-up, a storage area networking (SAN) consultancy, as his main business. SAN is a key technology that, for example, enables multiple servers in multiple data centres to share storage facilities, adding redundancy and reliability.
The UAV HMI work led him to found Advanced Aviation Technology (A2Tech) in 2005, to develop the systems into industrial-grade equipment. In this time frame, however, cost-effective autopilots, miniature stabilised camera gimbals and low-cost, user-friendly multi-copter UAVs became available. These developments reduced the pilot workload involved in UAV operations, including real-time FPV piloting.

Even when UAVs are operated via the internet, primarily through 4G/5G, satcom is an essential back-up. This is LikeAbird's beySAT-NB narrowband 2.4 kbit/s link (Courtesy of LikeAbird)

The backpack GCS for local UAV control is a conceptual descendant of the RealityVision system, and can be used with a VR headset or conventional screen (Courtesy of LikeAbird)