Autonomous technology is changing how materials move through factories, warehouses, and even hospitals. If you teach robotics, mechatronics, or industrial technology, you have likely watched this shift with equal parts excitement and curiosity. Students will soon enter workplaces where self-driving machines share the aisle with forklifts and people. Understanding the artificial intelligence (AI) that guides those machines is quickly becoming a core learning objective.
Autonomous Mobile Robots (AMRs) sit at the center of that trend. These robots travel freely, make their own navigation decisions, and take on the repetitive, sometimes risky task of moving parts, pallets, and packages. OTTO Motors (a Rockwell Automation company) is one of the leaders in this space, and its AMRs provide an excellent example of how AI turns a platform on wheels into a smart coworker.
Not only are AMRs a valuable (and increasingly necessary) addition to technical programs, they are also an excellent vehicle for teaching applied artificial intelligence.
AI at the Core of AMR Navigation
Every OTTO robot integrates a suite of sensors—LiDAR for distance mapping, 3D cameras for real-time perception, and inertial measurement units (IMUs) for orientation and movement. These sensors stream data into the AMR’s onboard processor, where artificial intelligence algorithms analyze environmental input in real time.

One foundational technique is simultaneous localization and mapping (SLAM). The robot generates a constantly updating map of its surroundings while determining its precise location within that space. SLAM allows the AMR to navigate without predefined paths, reacting seamlessly to changes on the floor—whether that’s a forklift moving through an intersection or a new pallet blocking its original route.
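To make the mapping half of SLAM concrete, here is a toy sketch (not OTTO's actual algorithm) of how range readings can be turned into an occupancy grid: each beam that returns a hit marks the cell at its endpoint as occupied. The function name, grid representation, and parameters are illustrative assumptions; real SLAM also corrects the pose estimate as the map grows.

```python
import math

def update_occupancy_grid(grid, pose, ranges, angle_step, max_range=10.0, cell=1.0):
    """Mark cells hit by range readings as occupied (toy mapping step).

    grid:   dict mapping (ix, iy) grid cells to "occupied"
    pose:   (x, y, heading) of the robot in world coordinates
    ranges: one distance per beam, scanning counter-clockwise from the heading
    """
    x, y, heading = pose
    for i, r in enumerate(ranges):
        if r >= max_range:                 # no return on this beam: nothing to mark
            continue
        angle = heading + i * angle_step
        hx = x + r * math.cos(angle)       # beam endpoint in the world frame
        hy = y + r * math.sin(angle)
        grid[(int(hx // cell), int(hy // cell))] = "occupied"
    return grid

# A robot at the origin facing +x sees an obstacle 3 m straight ahead.
grid = update_occupancy_grid({}, (0.0, 0.0, 0.0), [3.0], math.pi / 180)
```

A full SLAM system would additionally estimate the robot's own pose from the same data; here the pose is assumed known so the mapping idea stands alone.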
Path planning algorithms help calculate optimal routes based on live facility conditions. AI also supports task management and prioritization, enabling the robot to make decisions when demand exceeds capacity, or when workflow interruptions require rerouting.
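As a rough illustration of path planning around a live obstruction, the classic A* algorithm on a grid is a reasonable stand-in (OTTO's production planners are more sophisticated; the grid size and cell names here are assumptions for the example):

```python
import heapq

def astar(start, goal, blocked, size):
    """A* shortest path on a 4-connected grid; `blocked` is a set of cells."""
    def h(c):  # Manhattan-distance heuristic: admissible on a 4-connected grid
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]  # (priority, cost, cell, path)
    seen = set()
    while frontier:
        _, cost, cur, path = heapq.heappop(frontier)
        if cur == goal:
            return path
        if cur in seen:
            continue
        seen.add(cur)
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nxt[0] < size and 0 <= nxt[1] < size and nxt not in blocked:
                heapq.heappush(frontier,
                               (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
    return None  # no route exists

# A pallet at (1, 0) blocks the direct route; the planner detours around it.
route = astar((0, 0), (2, 0), {(1, 0)}, size=3)
```

When the facility map changes (a new blocked cell), replanning is just another call with an updated `blocked` set, which is the essence of rerouting on live conditions.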
This AI-enabled autonomy is what sets AMRs apart from automated guided vehicles (AGVs)—and highlights the key difference between automation and autonomy. Matt Rendall, CEO of Clearpath Robotics & OTTO Motors, described the difference this way:
Think of any industrial facility or warehouse as a busy city. To quickly transport people around the city, you can use a subway system or taxis. A subway system is like an AGV. It’s cost-effective and automated, but it needs infrastructure to tell it where to go (a track), and it can’t stray from that predefined path. An AMR is more like a taxi. It can navigate a complex road system, redirect when there’s traffic, accidents, or pedestrians, and change course on the fly. Thanks to the human driver, the taxi is autonomous, and with AI, so is the AMR.
AMRs and the Edge-to-Cloud Continuum
To understand how artificial intelligence works in an AMR, it helps to view it through the edge-to-cloud continuum. The AMR’s autonomy begins at the edge (the device itself). Thanks to its array of smart sensors and cameras, decisions about movement, collision avoidance, and local planning happen within the robot, without relying on external processing.
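A simple way to show why these decisions must live at the edge is a speed governor that reacts to the nearest LiDAR return on every control cycle; a round trip to a server would be far too slow for this loop. This is an illustrative sketch, not OTTO's safety implementation, and the thresholds are invented:

```python
def motion_command(lidar_ranges, stop_dist=0.5, slow_dist=1.5, cruise=1.0):
    """Onboard safety logic: scale forward speed by the nearest obstacle.

    Returns a commanded speed in m/s. Runs every control cycle on the robot
    itself, so reaction time doesn't depend on any network.
    """
    nearest = min(lidar_ranges)
    if nearest < stop_dist:
        return 0.0                       # obstacle too close: halt immediately
    if nearest < slow_dist:
        # scale speed linearly between the stop and full-speed distances
        return cruise * (nearest - stop_dist) / (slow_dist - stop_dist)
    return cruise                        # clear path: cruise speed

print(motion_command([4.0, 2.0, 0.3]))  # → 0.0 (a return at 0.3 m forces a stop)
```

In a real AMR this logic is implemented with safety-rated hardware and certified controllers; the sketch only conveys why the computation belongs on the vehicle.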

But the true intelligence of the system scales through the integration of cloud-based orchestration.
OTTO Fleet Manager is the central brain for the entire AMR fleet. Users can leverage the software to create jobs and workflows, analyze the efficiency of current routes, and troubleshoot issues. But layers of complex AI work below the surface to augment what those users are doing.
Fleet Manager collects and analyzes data from every active robot in a facility. This includes delivery timing, route congestion, idle periods, and task efficiency. Using these insights, Fleet Manager dynamically coordinates the fleet—assigning jobs, managing traffic, and optimizing charging schedules. Updates and new rules are sent to each AMR, transforming individual robot autonomy into a coordinated, fleet-wide intelligence.
This closed feedback loop means every AMR benefits from the collective experience of the fleet. If one robot encounters a blocked route or a more efficient path, that information is shared so the whole system adapts. The result is continuous, real-time optimization—not just reacting to current conditions, but steadily improving overall performance.
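The job-assignment part of fleet coordination can be sketched as a greedy dispatcher that gives each job to the closest available robot. This is a deliberately simplified stand-in for what a platform like Fleet Manager does (real dispatchers also weigh battery level, traffic, and job priority); the function and robot names are invented for the example:

```python
def assign_jobs(robots, jobs):
    """Greedy dispatcher: give each job to the closest free robot.

    robots: dict of robot name -> (x, y) position
    jobs:   list of (x, y) pickup points, in priority order
    Returns a dict of job index -> robot name.
    """
    free = dict(robots)
    plan = {}
    for i, (jx, jy) in enumerate(jobs):
        if not free:
            break  # demand exceeds capacity: remaining jobs wait for a free robot
        # pick the free robot with the smallest Manhattan distance to the pickup
        name = min(free, key=lambda n: abs(free[n][0] - jx) + abs(free[n][1] - jy))
        plan[i] = name
        del free[name]  # robot is now busy
    return plan

plan = assign_jobs({"amr1": (0, 0), "amr2": (5, 5)}, [(6, 5), (1, 0)])
```

The "demand exceeds capacity" branch mirrors the prioritization decision described above: when there are more jobs than robots, something has to queue.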
Real-World Applications
AMRs are already widely deployed in advanced manufacturing environments, distribution centers, and logistics hubs. They transport components between production lines, handle pallet movement in warehouses, and assist in order picking and fulfillment. Hospitals use them for non-critical deliveries like linens and medication. Restaurants and hotels have implemented AMRs to transport meals and supplies, and cleanroom environments in semiconductor manufacturing rely on them for sterile material handling.
These are not theoretical use cases—they represent real adoption across sectors, and they illustrate why students need a practical understanding of how autonomous systems operate.
What Students Should Learn About AI in AMRs
Educators don’t need to teach students to code an entire robot operating system, but students should understand the key AI concepts that power AMRs:
- Sensor Fusion: How data from multiple sensors (e.g., LiDAR, cameras, IMUs) is combined to create a unified, accurate model of the environment.
- SLAM: How the robot builds a map of its surroundings while tracking its own position within that map.
- Path Planning and Obstacle Avoidance: How AI algorithms calculate efficient, safe routes and dynamically avoid obstacles in real time.
- Edge Processing: The importance of onboard computation for real-time decision-making and autonomy, reducing reliance on external networks.
- Cloud Optimization and Fleet Management: How centralized platforms (like OTTO Fleet Manager) coordinate multiple robots, optimize workflows, and enable data-driven improvements across the fleet.
- Human-Robot Interaction and Safety: How AI enables safe operation around people, including the use of safety-rated sensors and compliance with industrial safety standards.
- Machine Learning and Adaptation: How robots can learn from data—improving navigation, obstacle detection, and efficiency over time.
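The first concept on the list, sensor fusion, can be demonstrated in a few lines with a complementary filter: a fast-but-drifting gyro is blended with a slower absolute heading sensor so the estimate gets the best of both. This is a generic classroom example, not OTTO's estimator, and the sensor values are made up:

```python
def complementary_filter(gyro_rates, abs_headings, dt=0.1, alpha=0.98):
    """Fuse a drifting rate gyro with a noisy absolute heading sensor.

    Each step trusts the integrated gyro rate for short-term accuracy and
    nudges the estimate toward the absolute reading to cancel drift.
    """
    heading = abs_headings[0]
    for rate, measured in zip(gyro_rates, abs_headings):
        predicted = heading + rate * dt                  # dead-reckoned update
        heading = alpha * predicted + (1 - alpha) * measured
    return heading

# The gyro reports a 1 rad/s turn; the absolute sensor tracks it with lag.
est = complementary_filter([1.0] * 5, [0.0, 0.1, 0.2, 0.3, 0.4])
```

The same blend-prediction-with-measurement idea, generalized to many sensors and full state vectors, is what Kalman-filter-style fusion does in production robots.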
These topics align directly with coursework in robotics, data science, automation engineering, and intelligent systems.
Teaching AMRs + AI Across Different Education Levels
Autonomous mobile robots are already integrated into the operations of today’s most advanced facilities. Their effectiveness depends on intelligent software, real-time responsiveness, and seamless integration with broader automation systems. That makes AI-driven technologies an essential part of modern technical education.
For high schools, Discover AI provides an entry point into this world. Its accessible, hands-on learning modules help students explore foundational AI concepts like machine vision, autonomous navigation, and decision modeling. It’s built specifically to bring artificial intelligence into the classroom in a way that’s engaging and approachable for secondary learners.
At the postsecondary level, technical colleges and universities have the opportunity to go much deeper with the OTTO 100. Programs can incorporate full systems integration—connecting AMRs with collaborative robots, conveyors, smart sensors, and industrial software platforms. Students can study not only how AMRs work, but how they operate within a larger intelligent ecosystem.
From introductory exploration to advanced implementation, educators at every level can equip students with the knowledge and skills to thrive in AI-enabled environments.
Bring the intelligence behind AMRs into your classroom. Learn how Discover AI can support your high school program—or connect with our team to explore integration strategies for advanced training at the college and university level.