UGVs

Rory Jackson reports on some stand-out applications of intelligent vehicles, from farming to transport
Since this publication started printing, the world of uncrewed ground vehicles (UGVs) has changed from a few SMEs and Google to hundreds of successful OEMs and integrators. Advancements in technology and the availability of high-end electronics for intelligent navigation without GNSS have made uncrewed robotic systems a valuable addition to many industries.
However, intelligent vehicles are nothing without the intelligent application and deployment of such technology. Fortunately, many UGV manufacturers are not just developing technology for technology’s sake, but for some carefully identified niche applications, with specific tailoring and tuning of vehicles for the nuances of each use-case, often thanks to close collaborations or discussions with customers. A handful of examples follow.

Accessibility
During the COVID-19 pandemic, healthcare workers struggled to keep patients alive and comfortable, while also suffering the worst and most frequent exposure to the coronavirus and its long-term effects. Maintaining distance between nurses and patients is especially difficult when a patient must be transported by wheelchair, and even wearing personal protective equipment (PPE) may not guarantee full protection.
Meanwhile, projections indicate that the number of travellers with reduced mobility will double by 2040, and the pandemic has contributed to a shortage of operators willing to assist with wheelchair pushing, exacerbating the strain on many organisations’ resources, particularly at airports.
Seeing this problem, Turin-based Alba Robot has developed SEDIA (SEat Designed for Intelligent Autonomy), essentially an autonomous wheelchair for gently and safely transporting patients through complex indoor areas without someone needing to push it. It was first conceived in 2016 by Alba’s CEO, Andrea Bertaia, whose grandmother had just been forced to start using a wheelchair.
“I was then in a company making autonomous technology for cars, and although we started R&D into how an autonomous wheelchair could work, we gradually realised that it was too complex an application to carry out in our spare time, so I quit to incorporate Alba Robot in 2019,” Bertaia recounts.
“Since then, we’ve worked to develop everything – software, hardware and the full vehicle – because a project such as this requires that we understand every piece of the product. Needing people to push reduced-mobility persons is not especially effective or safe in 2024, so we want SEDIA to function well, whether in hospitals or other locations with wheelchairs, like airports or even amusement parks.”

Alba refers to its solutions as personal mobility vehicles (PMVs), engineered for secure, self-driving micro-mobility from parking areas and through adjacent facilities, such as airports, hospitals, shopping malls, museums and pedestrian areas. That includes autonomy for dynamic route-planning and obstacle avoidance, powered by sensor fusion, with embedded facility maps for indoor localisation and GNSS for outdoor localisation.
Each PMV integrates an array of sensors, including 3D cameras for optical flow measurement, localisation assistance and collision avoidance, as well as ultrasound for detecting glass walls, which cameras can miss.
As well as developing its sensor architecture and fusion to ensure maximum safety for patients (a very different application to autonomous logistics, and a much newer one, making inspiration or guidance harder to find), Bertaia notes: “We must have changed motors and power drives three times. Although all our motors were very good, there was always something missing, like odometry feedback or other data, until we settled on our final models. We also changed the battery several times, and we changed the wheels; there’s not a single component that we didn’t swap out for something else over time.”
The vehicles are monitored and fleet-managed through an application interface, developed for use by the hospital, airport or other end-user, who can view each SEDIA unit’s status, send remote commands or make adjustments to the embedded maps, such as geofencing areas undergoing cleaning or renovation.
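As an illustration of the geofencing described above, a blocked-off area can be represented as a polygon on the embedded map, and each vehicle position checked against it before routing. A minimal sketch; the function name and coordinates are illustrative, not Alba’s software:

```python
# Sketch: a geofence check of the kind a fleet-management interface might
# use to keep PMVs out of areas under cleaning or renovation.
# Standard ray-casting point-in-polygon test; all names are illustrative.

def point_in_polygon(x, y, polygon):
    """Return True if (x, y) lies inside the polygon (a list of vertices)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray from (x, y) towards +infinity
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A corridor section blocked off for cleaning (map coordinates in metres)
blocked_zone = [(0.0, 0.0), (10.0, 0.0), (10.0, 4.0), (0.0, 4.0)]

print(point_in_polygon(5.0, 2.0, blocked_zone))   # vehicle inside the zone
print(point_in_polygon(12.0, 2.0, blocked_zone))  # vehicle clear of it
```

A fleet manager marking a zone via the map interface would, in effect, be adding such a polygon for every vehicle’s planner to avoid.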
“We’re looking into new technologies, like 5G, as well as more reliable GSMs [global systems for mobile communications], like 4G, and more conventional wi-fi systems to ensure persistent and secure connectivity between fleet managers and our vehicles,” Bertaia says.
“We’ve also developed tracking devices for such users to track the locations of their existing, manual wheelchairs, as many will want to take the opportunity provided by our fleet management interface to keep track of all their other assisted mobility inventory.”

Haulage sense
In our 48th issue (February/March 2023), we dove into Kodiak Robotics’ fourth-generation self-driving truck and the suite of technologies equipping it for long-distance logistics. Key to these were its SensorPods, wing-mirror structures containing some of the sensors for real-time localisation, guidance and collision avoidance, and a bar-shaped CentrePod housing the rest of the sensors above the windshield.
Today, the company has completed its sixth-generation truck, which has removed the CentrePod on the roof; all sensors are now in the SensorPods for a more easily maintained layout.
“As ever, the SensorPods can be easily replaced by a technician in 10 minutes, and concentrating all the sensors in the wing-mirror SensorPods means there is now nothing on the roof to maintain, so maintenance doesn’t require a ladder, gantry or a crane. Roof-mounted subsystems are hard to maintain,” says Kodiak Robotics’ CTO, Andreas Wendel.
“These trucks need to be maintained in many places along their routes, and if they’re not driving, they’re not making money, so getting them out of the workshop and back on the road is really important to our customers, which now include IKEA and Werner Enterprises.”
Currently, each SensorPod has three radars, six cameras and two Lidars. They have been slimmed down from 50 lb (22.7 kg) to 35 lb (15.9 kg), primarily thanks to using newer, smaller Lidars, such as Luminar’s Iris. Kodiak Robotics now views the SensorPod design as feature-complete, having also added a microphone for audio detection of emergency-vehicle sirens and other road-critical sounds.

Most of the SensorPod improvements were completed for the fifth-generation Kodiak truck; the key focus on the sixth-generation truck was its compute and actuation systems, particularly with respect to redundancy.
“Whenever you steer, you age your steering components and you risk something like a steering motor on your steering column breaking. It’s the same for your power systems, your braking systems and so on,” Wendel comments.
“We don’t want any single points of failure, and that’s where redundancy comes in. We’re comfortable saying our sixth-generation truck is the first fully redundant autonomous truck in the industry. In general, drive-by-wire is not yet standard in trucking, but absolutely no-one else has dual redundancy of every driving component in a driverless-ready system. So, everything from computers to control surfaces in braking, steering and power throughput is at least doubled up to get those systems to ISO 26262 standards of integrity.”
Across the truck, one finds dual-redundant steering motors, triple-redundant brakes, dual-redundant power connections and dual-redundant computers. Wendel notes that the sixth-generation truck can now react to a fault by not merely pulling over for a safe abort, but switching to a backup system to keep running.
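The failover behaviour described above can be sketched as a supervisor that routes demands to whichever channel is healthy, reserving the safe stop for the case where both channels have failed. This is an illustrative sketch of the pattern only, not Kodiak’s software:

```python
# Sketch of dual redundancy: two channels per driving function, with a
# supervisor that fails over to the backup on a fault rather than
# immediately aborting. All names here are illustrative.

class Channel:
    def __init__(self, name):
        self.name = name
        self.healthy = True

    def command(self, demand):
        if not self.healthy:
            raise RuntimeError(f"{self.name} fault")
        return f"{self.name} applied {demand}"

class RedundantActuator:
    """Routes demands to the primary channel, failing over to the backup."""
    def __init__(self, primary, backup):
        self.channels = [primary, backup]

    def command(self, demand):
        for ch in self.channels:
            if ch.healthy:
                return ch.command(demand)
        # Both channels down: only now must the truck perform a safe stop
        raise RuntimeError("total actuator failure - execute safe stop")

steering = RedundantActuator(Channel("steer-A"), Channel("steer-B"))
print(steering.command("turn 2 deg"))   # primary channel handles the demand
steering.channels[0].healthy = False    # inject a primary-channel fault
print(steering.command("turn 2 deg"))   # backup keeps the truck driving
```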
Lastly, Kodiak has added emergency flashing lights around the truck for select instances. In the event of an accident, regulations ordinarily require a truck driver to place a warning triangle, but autonomous vehicles are increasingly designed to operate without an onboard driver. Some key members of the industry have therefore jointly applied to the US Department of Transportation for an exemption, under which a warning flasher similar to those used on crewed tow trucks could be used instead.

(Image courtesy of Stratom)
Movements in mining
Over the last 20 years, research and investment into automation in the mining industry have predominantly been focused on haulage, drilling and fleet management technologies. With mounting interest in concepts for fully autonomous mines, there is a growing consensus that mining’s ancillary tasks should become autonomous next.
“That includes movement of extracted materials, and critical activities like maintenance and repair. Technicians need to drive out to faulty equipment, diagnose it, and go back to locate the appropriate tool or part if it isn’t on the maintenance truck,” says Zach Savit, senior manager for business development at Stratom, and the company’s lead expert on mining.
“There’s a lot of moving pieces that can benefit from autonomous systems, like Stratom’s autonomous pallet loader (APL). We’re especially getting asked about our ability to deliver things in underground, GNSS-denied environments and how we’d integrate the APL with other systems.”
The APL is a 4,535 kg, rugged, autonomous forklift-type vehicle, which runs on four tracks and has tracked conveyors or ‘rollers’ on its lifter, enabling it to pull objects on or push them off through stiction, while also being able to lift pallets and objects equal to its own body weight.
It is powered by a 2.8 litre Cummins Defense diesel engine, with a 20-gallon tank, enabling up to eight hours of operation, 50 km of distance per refuelling and a top speed of 8 mph (12.9 kph), although the tank can be swapped for a larger or smaller one as needed.
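As a quick consistency check of those figures (assuming US gallons), the stated tank, runtime and range imply roughly the following averages:

```python
# Back-of-envelope check of the APL's stated endurance figures:
# 20 US gal tank, 8 h runtime, 50 km per refuelling, 8 mph top speed.

GALLON_L = 3.785           # litres per US gallon
tank_l = 20 * GALLON_L     # ~75.7 L of diesel per fill
burn_lph = tank_l / 8      # ~9.5 L/h average burn over an 8 h shift
avg_kph = 50 / 8           # ~6.3 km/h average speed per tank
top_kph = 8 * 1.609        # 8 mph is ~12.9 km/h top speed

print(round(tank_l, 1), round(burn_lph, 1), round(avg_kph, 2), round(top_kph, 1))
```

The implied ~6.3 km/h average is well under the top speed, consistent with stop-start loading work rather than continuous transit.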
Underground navigation is aided by the availability of 3D scans and maps of mining environments, although the APL is equipped with Lidars and cameras for localisation and guidance, based on real-time perception of its surroundings.
Mark Gordon, president and CEO of Stratom, adds: “To really tackle subterranean and above-ground, GNSS-denied environments that you can frequently get in mines, we will be utilising SLAM [simultaneous localisation and mapping] and the same Lidars, cameras and IMU that the APL already has, but actually mapping out the environment in real time, so we can navigate through it, even if there have been changes since the last mapping was done.”
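The real-time mapping Gordon describes can be illustrated, in greatly simplified form, by folding lidar returns into an occupancy grid as the vehicle moves, so the map reflects changes since the last survey. A toy sketch only, not Stratom’s actual SLAM pipeline:

```python
# Toy illustration of real-time mapping of the kind SLAM performs: mark
# the grid cell at the end of each lidar beam as occupied, given the
# vehicle's current pose. Greatly simplified; all names are illustrative.
import math

CELL_M = 0.5       # each grid cell is 0.5 m square
occupied = set()   # cells where an obstacle has been observed

def integrate_scan(pose_x, pose_y, ranges):
    """Fold one lidar scan into the map.
    `ranges` maps beam angle (radians) to measured distance in metres."""
    for angle, dist in ranges.items():
        hit_x = pose_x + dist * math.cos(angle)
        hit_y = pose_y + dist * math.sin(angle)
        occupied.add((int(hit_x / CELL_M), int(hit_y / CELL_M)))

# Vehicle at the origin sees a wall 3 m ahead and a pillar 2 m to its left
integrate_scan(0.0, 0.0, {0.0: 3.0, math.pi / 2: 2.0})
print(sorted(occupied))
```

A full SLAM system would also correct the pose itself by matching each scan against the map; here the pose is taken as given.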
Stratom has developed an autonomous refuelling system, consisting of a robotic arm that can be guided by camera, Lidar or other sensors, depending on the environment, and an off-the-shelf fuel nozzle mounted into the end of the arm.
“A vehicle pulls up into a defined ‘box’ area, and through a scanning process the system intelligently locates the vehicle’s fuel-tank port, adjusts the arm to engage with that port and starts fuelling. It is fully autonomous, not just an automated, factory-type process where you’re going from point A to point B,” Gordon notes.
Savit adds: “The Cadia mine in Australia, for example, has Sandvik loaders, which work autonomously in a predefined ‘autonomous zone’, but for refuelling, someone has to go and get fuel and fill them up. That’s a big gap in the efficiency of that autonomous system, and our autonomous refuelling system would reduce the need for personnel to enter the autonomous zone or for vehicles to leave it.”
Refuelling is one of the more hazardous tasks in mines. Some trucks are 10-12 m tall and, over the past five years, the US alone has seen 50 injuries related to climbing up and down trucks, and 280 other non-fatal accidents related to other aspects of refuelling (according to the US Mine Safety and Health Administration’s data-retrieval system), so considerable safety gains stand to be made by automating the task.

Last-mile transport
We last featured Estonia-based AuVe Tech for its Iseauto autonomous shuttle in our 43rd issue (April/May 2022). It has since developed its next-generation shuttle, MiCa (an amalgam of ‘minu’, meaning ‘my’ in Estonian, and ‘car’).
Almost the entire vehicle has been designed from a blank sheet, with every controller, sensor, drivetrain system and structural material chosen from scratch to achieve an architecture that the company views as optimal for last-mile passenger transport.
“Last-mile means operating safely on roads that aren’t very wide and therefore don’t give much room to manoeuvre, but we also wanted to make a vehicle that was a little bit bigger than the Iseauto, mainly to give passengers more comfort in the amount of seating available, as well as greater visibility through more window area around the cabin,” says Johannes Mossov, a member of the management board of AuVe Tech.

“We’ve designed MiCa structurally such that it will be easier to produce in higher volumes, with fast assembly made possible by its part design. Accessibility of the different compartments and components for maintenance and repairs has also been improved, maximising uptime.”
In addition to improving the experience of passengers and technicians, AuVe Tech aims to take a further step towards SAE Level 4 autonomy and safety with MiCa.
To that end, all onboard electronics, as well as the steering and braking systems, are doubled for safety. Additionally, sensors around the vehicle not only give 360° of perception, but their angles and fields of view have been changed to bring minimum visibility much closer to the vehicle than in the Iseauto.
The various sensors are mounted on a dedicated rail on top of MiCa, enabling them to point down to view the immediate space around the vehicle, and to be swapped in and out smoothly. The sensors include eight external cameras (plus two interior ones), seven Lidars and a Smartmicro automotive radar. Also installed onboard is an Xsens GNSS receiver with dual antenna inputs and an IMU.
Power for these (and for MiCa’s 25 kph top speed) comes from a 17.6 kWh battery, integrated with an AC/DC converter that enables 22 kW fast-charging from industrial AC sockets. If driving constantly, MiCa can operate for seven hours straight, but the daily peaks and troughs of transportation demand are such that the vehicle can serve for up to 20 hours between charges (with a total recharging time of three hours and 55 minutes on a 22 kW charge).
The majority of energy consumption still comes from keeping the cabin climate comfortable for passengers in extremely cold or hot climates (which can shrink its maximum continuous operating time from seven to three hours). Future versions of MiCa may, like the Iseauto, incorporate a hydrogen fuel-cell range extender for extra energy.
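Working backwards from the stated figures, the pack capacity and continuous runtimes imply roughly the following average power draws (an estimate derived from the article’s numbers, not AuVe Tech data):

```python
# Rough working of MiCa's stated energy figures: a 17.6 kWh pack, 7 h of
# continuous driving, falling to 3 h with heavy cabin climate load.

pack_kwh = 17.6
drive_draw_kw = pack_kwh / 7      # ~2.5 kW average while driving
extreme_draw_kw = pack_kwh / 3    # ~5.9 kW in extreme hot/cold climates
climate_kw = extreme_draw_kw - drive_draw_kw  # ~3.4 kW of climate load

print(round(drive_draw_kw, 1), round(extreme_draw_kw, 1), round(climate_kw, 1))
```

On these numbers, cabin climate control in extreme conditions can more than double the vehicle’s average draw, which is why it dominates consumption.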
“We have also continued optimising our autonomous driving software for manoeuvring around obstacles on the road, or for making smart decisions at difficult spots, like crossings, but it bears noting that MiCa’s enhanced sensor visibility around itself, compared with the Iseauto’s, including detecting farther ahead and at the back, makes a huge difference for our sensor coverage and safety integrity,” Mossov says.
“That, plus the doubling up of subsystems, has enabled us to now do regular, weekly vehicle tests without a safety operator onboard in closed-off portions of public streets, which, from a safety perspective, is needed to demonstrate that if one subsystem goes out, the vehicle can autonomously make a safe stop and alert the fleet-management system with relevant information.”

Crop cultivation
Arable farming tasks are difficult to automate, as crops can distribute, develop and even move in unpredictable ways, based on weather, wildlife, pests and other factors. Weeding is especially challenging due to the density and variety of forms in which weeds grow, as well as their proximity to valuable crops (and the closer weeds grow to crops, the more nutrients they steal).
As growers find it increasingly difficult to find and train new seasonal personnel for such tasks, Czechia-based Ullmanna is developing technologies by which weeding, and other complex field operations, could be handled by autonomous tractors and other farming UGVs in the near future.
“We have two core technologies. One is Newman, our robotic system with electromechanically actuated implements for running behind a tractor, and the other is AROW, essentially a camera box, which is installed in Newman and is actually the essential part of the machine for intelligent, autonomous control,” says Martin Ullmann, CEO and co-founder of Ullmanna.
“AROW recognises the crops and weeds, and sends control signals to Newman, which has the cultivator units – knives, basically – to prune the weeds. AROW is also offered to other manufacturers for smart vision tasks in agriculture. As farming tasks are very complex and meticulous, you generally don’t find complete agriculture UGVs offered as solutions. They’re coming, definitely, but you need to start with an autonomous implement for going behind a tractor, crewed or uncrewed.”

Newman and AROW can both weed multiple rows of a field at once, with the typical configuration using one 3D stereo camera, one RGB camera and one pair of implements per row. The standard version features six pairs of implements, spaced equally along a horizontal length.
As the platform pulls Newman through a field, AROW’s cameras recognise the crops, having been trained through image-based artificial neural networks, and they send commands to Newman’s implements to prune any plants not recognised as crops.
Ullmanna also performs its own image gathering for AI training, taking around 1000 photos per hour for a new crop species (although the company continually takes more images to improve its crop-recognition models). It manually annotates them to say which plants are that crop and which are not, before feeding them into training software for crop classification.
“Once AROW sees a cluster of plants, based on the movement speed of the vehicle and the locations of the non-crops, it sends commands to the arms mounting the knives and then to the solenoids, which control their opening and closing motions for cutting the weeds,” Ullmann says.
“It is very complex: the knives have to cut as close as possible to the crop centres without damaging them to be able to remove every last weed. We’ve had to integrate a lot of premium-price electronics to execute the cultivation, but one six-row Newman machine can provide the same cultivation throughput as 60 manual workers.
“That’s not just weeding, but actual cultivation. AROW commands the knives to loosen the soil around the crops and that stops further weeds germinating around them.”
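The speed-dependent control Ullmann describes can be sketched as a timing calculation: the camera sees a weed some distance ahead of the knives, so the solenoids must fire after a delay set by ground speed. The offsets and speeds below are illustrative, not Ullmanna’s parameters:

```python
# Sketch of knife timing for an inter-row cultivator: the solenoid must
# open the knives when a detected weed reaches them, based on the vehicle
# speed and the camera-to-knife offset. All values are illustrative.

def fire_delay_s(camera_to_knife_m, speed_mps):
    """Seconds between a weed passing under the camera and reaching the knives."""
    return camera_to_knife_m / speed_mps

def knife_open_window_s(weed_length_m, clearance_m, speed_mps):
    """How long the knives stay engaged, with a margin either side of the weed."""
    return (weed_length_m + 2 * clearance_m) / speed_mps

# A tractor pulling Newman at 1.5 m/s, knives 0.4 m behind the camera view
delay = fire_delay_s(0.4, 1.5)                 # ~0.27 s of travel time
window = knife_open_window_s(0.05, 0.01, 1.5)  # ~0.05 s cutting window
print(round(delay, 3), round(window, 3))
```

The sub-tenth-of-a-second windows involved suggest why Ullmann describes the electronics needed to cut close to crop centres as premium-priced.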
He adds that one potential end-user is interested in using Ullmanna’s technology for detecting and recognising broccoli for harvesting. Each individual broccoli potentially grows at a different size and elevation, and tractors’ cutting implements must adjust for them, but AI training could achieve this.
“With an AI-powered camera, like in our system, we can specify where an implement needs to aim and cut a broccoli based on its size and height. It’s just a matter of using 1000 images with different shapes and sizes of broccoli to get started,” Ullmann adds.

Special qualities
Across these various UGVs and their specialist uses, one sees some common themes behind how they are being designed to suit industry’s demands. Naturally, substituting humans in dull, dirty and dangerous work is common across many uncrewed systems applications, but in addition to these factors, some others stand out.
One is ease of maintenance. Fast vanishing are the days when a team of 10 in-the-loop engineers and 12 specialist mechanics was needed to keep a self-driving truck or bus going. Longer endurance, more uptime and less cumbersome maintenance tasks are prized targets in today’s UGV developments.
Another is comfort for humans, both inside and outside the autonomous vehicle. If UGVs are to become part of daily life, they must cause so little disruption and discomfort (whether stepping in and out of a robo-taxi or being wheeled around by a self-driving micro-mobility platform) that people barely notice the autonomous aspect of the system.
Will other uncrewed vehicle types be able to match UGVs for these qualities? Making delivery multirotors quieter, or helicopters or boats less maintenance-intensive are challenging targets to hit, albeit worthwhile ones for aerial or aquatic mobility.
But, if it pays to specialise and convenience is king, one can expect UGVs to continue being innovated for highly specialised and valuable new applications in the years ahead.