Ottonomy Ottobot

The Ottobot is an all-electric, 4WD delivery robot developed for outdoor as well as indoor autonomous navigation (Images courtesy of Ottonomy)

Product of the pandemic

Covid-19 lockdowns and the social distancing they brought were the inspiration for developing this delivery UGV. By Rory Jackson

Many organisations will remember the difficulties the Covid-19 pandemic caused as they tried to keep their businesses going amid repeated lockdowns and social distancing advice. Uncrewed systems thus became vital for companies to stay connected with their customers, from minimum-exposure deliveries of food and other goods to B2B logistics and more.

The fear of another pandemic in the future, combined with growing shortages of labour, means the demand for intelligent, safe last-mile delivery systems is higher than ever. And while B2C UAVs still have some kinks to iron out (such as workable package drop-off mechanisms), the case for UGVs in home delivery work is increasingly well-established by small robots from companies such as Starship Technologies (UST 37, April/May 2021) and larger ones such as the Ottobot from Ottonomy.

Ottonomy was founded in late 2020 by CEO Ritukar Vijay and CTO Pradyot Korupolu. Its headquarters are in New York, it has a subsidiary in Uttar Pradesh, India, and it employs more than 40 people.

Ottonomy has formed partnerships with several customers for autonomous delivery networks, the most recent being Goggo in Spain

“I’ve been involved in autonomous driving and robotics for around 15 years now,” Vijay says. “I started with UGVs for defence applications, before going into autonomous warehouse robots and then worked on ADAS and autonomy for automotive vehicles.”

That track record, combined with the automotive industry’s focus shifting largely from autonomy to electrification between 2016 and 2020, motivated him and some colleagues to look further afield for use cases where autonomous mobility could solve tangible problems.

“Rather than trying to push autonomy to solve road congestion and accidents – which it might still do in the future – we saw that autonomous last-mile deliveries could immediately help retailers, restaurants and others in the pandemic and amid increasing labour shortages, as UGVs could enable a limited staff to do more while using less energy,” he says.

“So we started creating our first MVP [minimum viable product] during the pandemic, aimed at both indoor and outdoor autonomy. We already knew how to do the latter safely using 3D Lidars and cameras, and although many other developers such as Starship were dependent on GNSS and camera-based teleoperation, we wanted full autonomy from day one – the Waymo way, rather than the Tesla way.”

Indoor autonomy soon became something of a prime focus for Vijay and his team, partly because of early interest from airports including Cincinnati/Northern Kentucky International Airport (CVG) in autonomous robots that could navigate crowded terminals to deliver food and shopping to passengers. A pilot project followed in December 2020, which in addition to achieving successful autonomous navigation and deliveries indoors, provided critical lessons on the UGV’s useability and aesthetics.

“We took those lessons to take it from an MVP to a commercially deployable product, and then launched the Ottobot 1.0 in 2021 at CVG,” Vijay says.

In 2022, Ottonomy unveiled the Ottobot 2.0, which improves upon and replaces the 1.0 version as the company’s flagship product. It is a four-wheeled, fully electrically powered 4WD system weighing around 200 lb (90.7 kg). It features as standard two sliding metal doors within a mostly metallic hull, as well as a large infotainment screen on the front face and a smaller customer interface screen above, for scanning QR codes to confirm the identity of a recipient at hand-off.

The Ottobot 1.0 featured upwards-opening hatch doors and fixed wheels, rather than sliding doors and 4WD

From 1.0 to 2.0

Although the Ottobot 1.0 was optimised over the MVP design for easier use and an improved look (having the outward appearance of a robotic shopping trolley), the various trials it underwent in public spaces provided further crucial data for Ottonomy that informed how it should develop the 2.0 version.

“We’re all engineers at Ottonomy, and as other tech firms will have experienced, that sometimes influences you to make a product for engineers, instead of for restaurants, retailers and airports,” Vijay says. “So on top of making it easier to use, we also undertook an increasing number of trials in kerbside deliveries of food, groceries and consumer goods. That was a quite different use case to airports, so it demanded a different configuration.”

This spurred a redesign of the Ottobot as a modular platform that could be customised to move, navigate and react as appropriate, whether in retail work, airport terminals or other environments.

The design of version 2.0 would also aid in scalability. The company had previously planned two versions of the Ottobot, one for indoor work and the other for outdoors, but it soon realised that could mean needing two production lines and supply chains. A modular approach however would give it the flexibility to produce a wider range of designs using a single manufacturing line and supply base.

“Another issue was manoeuvrability,” Vijay says. “Whether it’s working in a warehouse, along a sidewalk or in an airport terminal, people around the UGV will move differently, and there are so many edge cases where the Ottobot’s movement is restricted for its own safety and that of pedestrians. We soon realised that the best solution was for it to be able to move sideways, which started an internal discussion about the kind of drivetrain we’d need to achieve that.”

The team settled on a 4WD approach with independently steered (that is, azimuthing) wheels, and although that meant packing a more expensive array of actuators and motor controllers inside the robot, the team agreed that it was worth it, as the Ottobot 2.0 gained the ability to navigate tight spaces on the edges of sidewalks.

The final point of focus throughout the trials was how best to improve the experience of the end-user – that is, the recipient of goods bought from the retailer or restaurant – particularly in making the hand-offs intuitive and contactless for users as far as possible.

To improve tactile access in these respects, particularly for wheelchair-bound and otherwise differently abled users, the Ottobot’s shape was changed from a trolley form to its current, flatter shape. It also has a mechanically actuated sliding door (visually similar to a bread bin) that makes packages accessible from the sides, rather than just from a top hatch, as in version 1.0.

Ottonomy worked mainly with two companies to solve these issues. One was Dassault, which selected Ottonomy for its Global Entrepreneur Program, and in doing so granted the team some free one-year licences for the full version of SolidWorks, which became a vital development tool for the Ottobot.

The other was Nvidia. Ottonomy took part in its Inception Program for Startups, which gave it access to considerable support not only in securing supply chains for Nvidia computer systems during the semiconductor shortages, but in the development of its AI technologies and autonomy algorithms.

“Most recently we’ve signed a long-term contract with Ouster for the supply of their Lidar units, to ensure we get them when we need them,” Vijay adds.

“We produced an initial batch of Ottobot 1.0 units, as we knew it would be subject to change based on user feedback. But for the Ottobot 2.0, we already have orders for a first batch of at least 100 robots, and will scale production this year to a much larger number.

“Our manufacturing facility is in India but our immediate market is the US. We were originally based in Santa Monica, but we wanted to test the Ottobots in a wide range of weather conditions, which we couldn’t get in California, so we moved to New York and started our first snow tests in January 2022.”

He adds that the Newlab incubator in New York’s Brooklyn Navy Yard served as an invaluable supporter of Ottonomy’s development, and that its location was useful for testing the Ottobots in different scenarios, including varying traffic conditions, types of sidewalk and indoor versus outdoor spaces.

Newlab also enabled Ottonomy to connect with industrial partners such as Verizon, through which Ottonomy completed a 5G study in late 2022 into the feasibility of cloud-hosting key Ottobot software modules (rather than installing them on the UGV).

“Thanks to Verizon’s very low latency 5G connection, we successfully ran the Ottobots autonomously while some of their key AI computing units for localisation and object classification ran on the cloud in real time,” Vijay says.

“5G-powered cloud computing will therefore be rolled out as a strategic update as the infrastructure is made ready. It’s because of that kind of support that we have Verizon and others as key customers now.”

The Ottobot 2.0’s 4WD enables it to rotate on the spot and sidestep as needed in order to navigate through crowds without bumping into people

Sensor layout

Although much of Ottonomy’s past expertise in autonomous cars has been used in producing its Ottobot, the UGV has a highly contextual form of navigation rather than focusing on GNSS localisation first and immediate object detection second, as is more typical of the kinds of robotaxis Vijay and Korupolu used to work on.

“To understand the context of where they are and the densities of different kinds of objects – including people – they might encounter, our UGVs need enough sensors for a realistic and detailed representation of their environment,” Vijay says. “So at the top and front we have a 360° Ouster 3D Lidar to generate a point cloud around the robot.”

While that maximises real-time point cloud coverage through the use of a single Lidar, it had the potential to create shadow areas unless multiple Lidars were deployed, which would drive up the cost of each unit considerably. Numerous small ultrasonic sensors are therefore installed around the body, to cover these shadow areas and improve detection of glass surfaces, as Lidars as well as cameras can become ‘confused’ by them, particularly glass doors, owing to their refraction and transparency.

“We also have seven or eight cameras covering the UGV’s entire FoV. The actual number depends on how we alter our modular design for different uses and cost cases, and we’ll have both regular HD cameras and depth cameras,” Vijay explains.

A long-term contract has been signed with Ouster for the supply of the Ottobot’s 3D Lidar units

“The 3D Lidar primarily gives us geometric information of objects and obstacles, to a better resolution and detail than any other sensor type, while the cameras identify semantic information that is key to contextual navigation that helps with specific identifications and classifications. The ultrasonic sensors are essentially there to detect things the Lidars and cameras can miss.

“Those aren’t the only failsafe sensors on board: we also have ‘cliff detection’ sensors, which detect not just cliffs [despite the terminology], but also small objects nearby, stairs and other edges or precipices in front of the robot. That provides another layer of safety, because autonomous perception systems can still incorrectly report that it’s safe to move forwards, just because there don’t happen to be any obstacles detected ahead.”

In addition to the sensors fitted around the body, some indicator lights and the four electrically driven wheels and a battery pack sit below. The storage containers can be selected modularly for different requirements.

An LFP battery pack sits at the bottom, a cell chemistry that meets Ottonomy’s requirements for safety and customisation over NMC

Powertrain and control

Korupolu explains here that because of the weight of the robot and the cargo payloads to be carried, Ottonomy needed electric motors with a high load capacity, as well as highly precise initial triggering for accurate odometry.

That influenced the company to design the Ottobot with a 4WD system. Each wheel is driven by its own dedicated BLDC hub motor and steered individually by a rotary electromechanical actuator, providing nimble swerving, sidestepping and zero-radius turns through crowds of people.
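The drivetrain described above – four hub motors, each with its own steering actuator – implies a standard swerve-drive inverse kinematics step: every commanded body motion is decomposed into a speed and a steering angle per wheel. The sketch below illustrates that mapping; the wheel geometry and function names are invented for illustration and are not Ottonomy's actual code.

```python
import math

# Illustrative wheel positions (m) relative to the body centre for a
# hypothetical 4WD chassis; the real Ottobot geometry is not published.
WHEELS = {
    "front_left":  ( 0.3,  0.25),
    "front_right": ( 0.3, -0.25),
    "rear_left":   (-0.3,  0.25),
    "rear_right":  (-0.3, -0.25),
}

def swerve_commands(vx, vy, wz):
    """Map a body twist (m/s, m/s, rad/s) to a (speed, steering angle)
    pair for each independently steered (azimuthing) wheel."""
    cmds = {}
    for name, (x, y) in WHEELS.items():
        # Wheel velocity = body velocity + rotational contribution (wz x r)
        wvx = vx - wz * y
        wvy = vy + wz * x
        speed = math.hypot(wvx, wvy)
        angle = math.atan2(wvy, wvx) if speed > 1e-9 else 0.0
        cmds[name] = (speed, angle)
    return cmds

# Pure sideways motion: every wheel steers to 90 degrees at equal speed,
# which is exactly the sidestep manoeuvre discussed above
print(swerve_commands(0.0, 0.5, 0.0))
```

Setting vx = vy = 0 with a non-zero wz produces equal wheel speeds with tangential steering angles – the zero-radius turn the article mentions.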

“Every motor uses a closed-loop control topology, so we get all the feedback from the motors,” he adds. “Performance is more accurate that way, in that the UGV’s control inputs consistently result in the desired output. On the rare occasion that it doesn’t, it is rapidly corrected.”

“So in addition to the in-wheel motors and steering actuators, our in-house synchroniser software on the main computer ensures execution of all the correct commands – particularly electric motor positions and speeds – and we use field-oriented control to ensure smooth, jerk-free motion of the robot.”

Although Ottonomy was initially tempted to use electric motors and controllers similar to those in e-scooters – as they are also designed for closed-loop operation – their control loops proved not robust enough for the Ottobots, so the company customised its motors and ESCs in-house.

“Key to the motor designs was including encoders for feedback, as well as optimising the top speed for fine steering and acceleration or deceleration control,” Korupolu says. “We use 350 W motors on each wheel, and have a 6 kph autonomous speed cap, although the robot is mechanically capable of a lot more.

“The motor controllers [from Roboteq] are rugged and reliable, and they’re programmed such that even in emergency stops, the torque output is smooth enough to prevent excessively harsh braking. We don’t risk hurting anyone, nor do we hurt the drivetrain.”

Energy is stored in a pack built around lithium iron phosphate (LFP) cells. Nickel manganese cobalt cells were used in early Ottobots and tests, but the company found itself questioning their charging times and safety. By contrast, the greater customisability of LFP enabled the team to shape and ruggedise the pack to its desired safety standards, and Vijay reports no difficulty in sourcing safety-certified LFP cells from vendors worldwide.

“We also use the battery suppliers’ BMS and charging software,” Korupolu adds. “At the moment, charging is manual. A technician either plugs in a connector to the receptacle at the rear, or swaps the entire pack, and charging takes around 75-90 minutes using DC fast charging. We’re also working on integrating wireless charging so the robot can dock itself and leave autonomously to go to a job when recharged.”

After navigating to a customer, the displays on the Ottobot’s front help with identifying and confirming an order

High Information Maps

As might be expected, some mapping information for localisation is key to the Ottobot’s contextual navigation. Ottonomy provides that using its in-house developed High Information Maps (HIMs), which it intends as an improvement on the HD Maps system used by many autonomous cars.

“A lot of the time the Ottobots will use geometric and semantic data inputs to stamp objects like doors, street lamps, pillars and so on with numbers, to keep the density of data easy to handle for the edge hardware,” Vijay says. “They will also record some information on what’s happening around those objects.”

Korupolu adds that despite this keeping the maps lightweight, they retain enough information for the UGVs to localise without needing GNSS. In a new operating environment, a first version of the HIMs is typically created by Ottonomy’s engineers driving the robots around the serviceable areas and pathways of potential delivery routes and customers.

“As they do so, the Ottobot builds its own maps with the fused point cloud and vision data, and these are uploaded to the cloud for the entire fleet to use,” Korupolu says. “With our real-time AI, we detect but also remove objects dynamically, as contextually we can understand that things like people, bikes, vehicles, chairs or tables go missing in streets or building interiors, and the UGV doesn’t get confused by that.”
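The dynamic-object removal Korupolu describes can be illustrated with a simple filtering step: before detected landmarks are committed to the shared map, anything with a transient semantic class is discarded, so people or bikes passing through never pollute the fleet's map. The class list and record format below are invented for illustration, not Ottonomy's actual schema.

```python
# Semantic classes treated as transient; these never become map landmarks.
# The list is a hypothetical example, not Ottonomy's real taxonomy.
DYNAMIC_CLASSES = {"person", "bicycle", "vehicle", "chair", "table"}

def build_map_layer(detections):
    """Keep only static, numbered landmarks from a list of
    (object_class, object_id, position) detections."""
    return [
        {"id": obj_id, "class": cls, "position": pos}
        for cls, obj_id, pos in detections
        if cls not in DYNAMIC_CLASSES
    ]

detections = [
    ("door", 1, (2.0, 0.5)),
    ("person", 2, (2.5, 0.4)),      # transient -> dropped
    ("street_lamp", 3, (5.0, 1.2)),
    ("bicycle", 4, (5.5, 1.0)),     # transient -> dropped
]
print(build_map_layer(detections))
```

Only the door and the street lamp survive into the map layer, matching the article's point that stamped static objects keep the maps lightweight while transient objects are understood to come and go.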

Vijay adds, “GNSS is used mainly for georeferencing of the customer’s location, but we don’t use it very deeply in our navigation. Nonetheless, we have tested and used various companies’ GNSS to identify which companies could give us a consistent and reliable performance as well as supply.”

Path planning and dynamic avoidance

Localisation within the HIMs is essentially what enables the Ottobots to operate indoors as well as outdoors without relying on GNSS: the camera and Lidar data, and the semantics extracted from them, are what tell the UGVs where they are, with GNSS playing no fundamental role in the process.

That hugely diminishes the likelihood of errors occurring due to multi-pathing or outages of GNSS signals that could occur when the Ottobots are navigating between tall buildings or through crowds of people.

As Korupolu explains, “At any given time, the Ottobot is calculating seven to 10 possible trajectories, and selects what it determines as the most cost-effective route. With the swerve capability from our 4WD, we can achieve precise, smart and safe turning radii, as well as sidestepping where necessary; the path-planning algorithm takes that into account.”
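The cost-based selection Korupolu describes – scoring a handful of candidate trajectories and committing to the cheapest – can be sketched as below. The candidate paths and the cost weights (goal distance plus an obstacle-clearance penalty) are invented placeholders; the real planner's cost terms are not published.

```python
# A minimal sketch of cost-based trajectory selection: each candidate is a
# list of (x, y) waypoints, and the cheapest one wins. Weights are made up.
def trajectory_cost(traj, obstacles, goal, w_clear=5.0):
    """Cost = distance of the endpoint from the goal, plus a penalty that
    grows as the path's minimum obstacle clearance shrinks."""
    end = traj[-1]
    goal_term = ((end[0] - goal[0]) ** 2 + (end[1] - goal[1]) ** 2) ** 0.5
    clearance = min(
        ((p[0] - o[0]) ** 2 + (p[1] - o[1]) ** 2) ** 0.5
        for p in traj for o in obstacles
    ) if obstacles else float("inf")
    return goal_term + w_clear / max(clearance, 1e-6)

def select_trajectory(candidates, obstacles, goal):
    return min(candidates, key=lambda t: trajectory_cost(t, obstacles, goal))

# Three candidates: straight through an obstacle, a sidestep around it,
# and a swerve that veers away from the goal
goal = (4.0, 0.0)
obstacles = [(2.0, 0.0)]
candidates = [
    [(1.0, 0.0), (2.0, 0.0), (3.0, 0.0), (4.0, 0.0)],   # hits the obstacle
    [(1.0, 0.5), (2.0, 1.0), (3.0, 0.5), (4.0, 0.0)],   # sidesteps it
    [(1.0, 0.0), (2.0, 0.5), (3.0, 1.5), (4.0, 2.0)],   # misses the goal
]
print(select_trajectory(candidates, obstacles, goal))
```

The sidestepping candidate wins: it reaches the goal while keeping clearance, which is exactly the trade-off the 4WD swerve capability lets the planner exploit.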

As the Ottobot executes its preferred routes, fused information from the cameras and Lidars as well as the ultrasonic sensors and cliff detectors helps inform the main computer how to dynamically recalculate routes and actuator commands to ensure it avoids objects, people and precipices during movement.

Depending on the modular configuration chosen, 12-16 ultrasonic sensors can be installed around the Ottobot, as each one has a much narrower FoV than the cameras and is more prone to performance losses from noise at nearby frequencies, particularly from other ultrasonic sensors such as those on parked cars.

“The probability of interference was actually quite high, so more than 4 months of r&d went into testing and selecting the most robust ultrasonic sensors we could find,” Vijay says. “They’re not COTS or standardised components.”

The cliff detection sensor meanwhile is an IR-filtered camera at the robot’s front that points downwards, at an angle that prevents it being affected by ambient light. It has a very short range, as its role is to scan the floor for anything that might indicate a precipice.

“Among our AI algorithms, we also have a safety algorithm that automatically overrides the robot’s movements and forces a stop if anything comes too close,” Korupolu notes. “It is also what limits the speed to 6 kph, and it checks and validates every command the main autonomy software puts out.”
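The safety layer Korupolu outlines – a last-line governor that validates every command, clamps speed to the 6 kph cap and forces a stop when anything comes too close – might look like the following sketch. The keep-out radius and function names are illustrative assumptions, not Ottonomy's actual values.

```python
# Sketch of a last-line safety governor: it clamps every requested speed
# to the cap and forces a stop when an obstacle is inside a keep-out
# radius. The stop distance is a hypothetical placeholder.
SPEED_CAP_MPS = 6.0 / 3.6      # the article's 6 kph cap, in m/s
STOP_DISTANCE_M = 0.4          # hypothetical keep-out radius

def govern(cmd_speed_mps, nearest_obstacle_m):
    """Validate a requested speed before it reaches the drivetrain."""
    if nearest_obstacle_m < STOP_DISTANCE_M:
        return 0.0                               # emergency stop
    return min(max(cmd_speed_mps, 0.0), SPEED_CAP_MPS)

print(govern(3.0, 2.0))   # over the cap -> clamped to the 6 kph limit
print(govern(1.0, 0.2))   # obstacle too close -> forced stop
```

Because the governor sits between the autonomy stack and the actuators, even a faulty high-level command cannot push the robot past the cap or into a nearby obstacle.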

The ‘bread bin’ type sliding door makes package hand-off easier for differently abled customers

Computing systems

The bulk of the system’s autonomy software is installed and run on an Nvidia Jetson AGX Xavier SoC, although Ottonomy has also used the Xavier NX and Jetson Nano for this function, with some lower level ESCs and ECUs for decentralised control of subsystems.

“Our key requirements for the main computer were a deep integration of the GPU and CPU, and Nvidia is one of the best companies at that,” Korupolu says. “The GPU runs a lot of our AI, while a lot of parallel processing of all kinds of sensor data is going on too, meaning we also needed a CPU with a lot of TFLOPS and onboard memory.”

Comms buses running to and from the Jetson are selected according to the type and load of data. The Lidars communicate over high-speed Ethernet, the cameras use MIPI interfaces, and the motor controllers and some other parts use serial interfaces, as appropriate for their high frequencies but low packet sizes.

Regarding the algorithms themselves, Korupolu says, “Our navigation and path-planning teams used STAGE and Gazebo as the simulation and testing tools, while the perception team used CARLA quite heavily. Most of our code is written in C++, with some in Python.”

Most of the subsystems have their own embedded computers serving as low-level ECUs and watchdogs, and while the high-level AGX Xavier computes software commands, a smaller companion CPU sits between these levels. That runs a vehicle interface function that converts the Xavier’s commands into signals the ECUs can understand.
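The vehicle-interface role described above – a companion CPU translating the Xavier's high-level motion commands into signals the low-level ECUs understand – can be sketched as a simple unit-conversion and framing step. The message fields, wheel radius and scaling below are invented for illustration only.

```python
import math

# Sketch of a vehicle-interface translation: one high-level command from
# the main computer becomes per-subsystem (target, field, value) frames
# that low-level ECUs can consume. All figures here are hypothetical.
WHEEL_RADIUS_M = 0.1   # made-up wheel radius for the example

def to_ecu_frames(linear_mps, steer_rad):
    """Convert a high-level (speed, steering) command into simple frames."""
    wheel_rpm = linear_mps / WHEEL_RADIUS_M * 60.0 / (2 * math.pi)
    return [
        ("drive_ecu", "rpm", round(wheel_rpm, 1)),
        ("steer_ecu", "millirad", int(steer_rad * 1000)),
    ]

print(to_ecu_frames(1.0, 0.25))
```

Keeping this translation in one thin layer means the autonomy software can speak in physical units while each ECU receives values in its own native scale, which is the decoupling the companion CPU provides.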

As of January this year, Ottonomy and Goggo had begun last-mile deliveries in the Spanish cities of Alcobendas and Zaragoza

The Ottonomy ecosystem

Ottonomy has a suite of software applications for overseeing each Ottobot, which it refers to as its ‘ecosystem’.

For fleet managers, the Ottonomy Delivery Engine consists of a cloud and network operations interface for overseeing the performance and progress of the Ottobots. The cloud incorporates a set of APIs for plug-and-play integration with different ordering apps and other point-of-sale systems.

“If a customer doesn’t have a specific app, we have our own end-to-end application suite for ordering, which we can customise and co-brand with each fleet manager as needed,” Vijay says. “That also accounts for how their customers order, and how they track the progress and location of orders.”

For the engineers and technicians who are more directly responsible for the well-being of each UGV, a remote monitoring and teleoperation function accessible in the Ottonomy Delivery Engine dashboard enables a real-time and detailed overview of deliveries and performance. A diagnostics system in the robot detects signs of faults, and alerts its remote operator if help is needed, typically in the form of a controls override and teleoperation to the nearest workshop.

“Teleoperation is a fallback system, not a primary means of using the Ottobot, but it’s still very important to prevent situations like a UGV stopping in the middle of a road,” Vijay says. “That’s why we ensure our data link systems are optimised and use the latest secure technologies where possible.

“There’s also automation in how orders are assigned to robots, and it integrates some smart features like tracking battery SoCs so that long-distance orders aren’t given to Ottobots that don’t have the energy to complete the order and still make it to a charging location afterwards.”
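The SoC-aware assignment Vijay describes amounts to an energy-feasibility check: an order only goes to a robot whose remaining charge covers the delivery plus the ride to a charger afterwards. The sketch below illustrates the idea; the consumption figure and data shapes are made-up placeholders, not Ottonomy's actual dispatch logic.

```python
# Sketch of SoC-aware order assignment: a robot is eligible only if its
# remaining energy covers the order plus the trip to a charger. The
# Wh-per-km figure is a hypothetical placeholder.
WH_PER_KM = 50.0

def pick_robot(robots, order_km, to_charger_km):
    """robots: list of (name, remaining_wh) pairs. Return the first robot
    with enough energy for the order and the ride home, else None."""
    needed_wh = (order_km + to_charger_km) * WH_PER_KM
    for name, remaining_wh in robots:
        if remaining_wh >= needed_wh:
            return name
    return None

robots = [("otto-1", 100.0), ("otto-2", 400.0)]
print(pick_robot(robots, 4.0, 2.0))   # needs 300 Wh -> otto-2 qualifies
```

If no robot in the fleet clears the threshold, the dispatcher would hold the order rather than strand a UGV mid-route, which is the failure mode the battery tracking exists to prevent.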

Infotainment display

The 17 in (diagonal) colour-display infotainment screen on the UGV’s front provides a number of functions. These include interactive roles such as displaying messages to encourage people to give way in crowded environments, or showing a limited set of identifiable information so that delivery recipients can recognise when an Ottobot has arrived for them specifically, for example by their name or a two-factor authentication code.

“Even when Uber was just starting, there were edge cases where it was important to know your taxi’s number plate because two Ubers would show up in the same spot; the screen gives us options for resolving that,” Vijay says. “Depending on where the UGVs are, it also means we can display targeted adverts for nearby shops, restaurants or services that local pedestrians might be interested in.

“There are contextual opportunities beyond that for how the screen can be useful. For instance, in airports, we can display alerts about gates opening, final calls or cancelled flights, and other environments like business or university campuses will have messages their administrators will want disseminated. If another pandemic should occur, it could show reminders for social distancing and hygiene, or information on nearby testing centres.”

Messages can be quickly updated ad hoc via 4G and 5G data links, while the screen itself changes tone and brightness according to the time of day, remaining battery and whether it is indoors or outdoors.

Body

The UGV is predominantly metal, not expressly for collision safety or load bearing but more because Ottonomy is aware that some people might try to steal or break open an Ottobot to seize the goods inside.

“We haven’t deliberately made the UGV overly heavy, but by not focusing too much on light weight, we have something weighing around 200 lb – if someone can hoist that up and fast-carry it home, well at that point frankly they deserve it,” Vijay muses. “But that’s difficult, so our customers can trust the security of their delivery a bit more than with many other B2C delivery UGVs that tend to be far lighter and smaller.”

Manufacturing is outsourced to a distributed chain of micro-manufacturing partners specialising in components such as sheet metal and plastic moulding, to enable Ottonomy to focus on the autonomy and scale up for different batch requirements. The body’s design meanwhile is produced, assembled and tested in-house.

“The robot platform is metal, accounting for around 70% of the UGV, with the other 30% being moulded plastic, such as sensor mounting points and most edges or other potential points of physical contact with pedestrians,” Vijay says. “Therefore, if by chance there is a hard impact with a person, the plastic breaks or deforms to reduce the force.”

Future plans

Ottonomy is continuing its r&d to refine the Ottobot’s design and useability. In addition to providing modular versions of the 2.0, it unveiled in early January 2023 the Ottobot Yeti, which includes a rearwards-opening door and electromechanical rollers in its cargo bay to enable autonomous drop-off of packages for fully contactless deliveries.

The company is also keeping a close eye on evolving regulatory frameworks. Around 20 US states currently allow operations of delivery robots such as the Ottobot, although only two would allow the use of a road-capable version, which Ottonomy hopes to develop once legislation allows. The easing of regulations through proving the safety of systems such as the Ottobots will gradually enable the team to deliver its UGVs to more cities and markets.

Specifications

Ottobot 2.0

  • 4WD
  • All-electric
  • Dimensions: 1090 x 750 x 1320 mm
  • Empty weight: 200 lb (90.7 kg)
  • Maximum operating speed: 6-8 kph (software-limited)
  • Maximum cargo capacity: 120 kg
  • Operating time between charges: 5 hours
  • Charging time: 75-90 minutes (on fast DC charge)

Some key suppliers

  • CAD and related consultation: Dassault SolidWorks
  • GNSS: LORD
  • GPUs and related consultation: Nvidia
  • Lidar: Ouster
  • Motor controllers: Roboteq