Research Project

Campus Autonomous Robot Tours (CART)

Abstract 

Teleoperated rovers can be deployed to survey terrain in hard-to-access locations and explore remote environments. This project investigates the challenges of developing an unmanned ground vehicle (UGV) system for autonomy-assisted teleoperation through the proposed Campus Autonomous Robot Tours (CART) rover system. CART is motivated by JPL Team CoSTAR and the NeBula autonomy solution deployed on robot teams in the DARPA Subterranean Challenge. CART will use autonomy software for teleoperation in urban environments from a base station interface and will operate alongside tour guide personnel to provide remote interactive tours of the university campus. The rover will be designed with perception, autonomy, wireless communication, teleoperation, and safety systems for operation around pedestrians. CART will use lidar sensors to collect data for obstacle avoidance autonomy while mapping the environment. To enhance the interactive experience, the rover will use cameras to provide visual information on points of interest surrounding the rover during teleoperation, and an inertial navigation system for waypoint navigation between points of interest. Operating CART in an urban environment will address the challenges of teleoperation, inertial navigation, and wireless communication in densely structured terrain. Tour participant interaction with the rover perception and navigation systems through the base station interface will provide insight into trust in autonomy. CART will function as a primary rover asset with teleoperation for remote data collection and communication during field operations.

Motivation/Research Problem

Autonomous unmanned ground vehicles (UGVs) are used in missions ranging from industrial inspection to terrestrial and space exploration. A UGV can be deployed as an independent unit to perform a mission or operate alongside other UGVs as part of a robot team to benefit from multiple robots working together to complete a cooperative task. A UGV capable of carrying a payload containing a multitude of sensors, data processing, and communications equipment can serve as a primary rover hub for long-range exploratory missions within a robot or human-robot team. A UGV equipped with teleoperation, autonomous obstacle avoidance and satellite-aided navigation can operate in dynamic environments while providing perception sensor data to a base station for remote operation with human oversight. One example of a UGV system is the Campus Autonomous Robot Tours (CART) project, which is focused on developing a system to deliver remote virtual campus tours to visitors through the eyes of a robotic rover.

Research Team
Lead Researchers:

  • Amiel Hartman, Mechanical Engineering
  • Nhut Ho, Mechanical Engineering
  • Ashley Geng, Electrical and Computer Engineering
  • Li Liu, Computer Science
  • Joe Bautista, Art + Design

Collaborator:

  • Ali-akbar Agha-mohammadi, JPL collaborator

Student Team

  • M. Fadhil Ginting
  • Coulson Aquirre
  • Kyle Strickland
  • CART student team, from Systems Engineering Research Laboratory (SERL) senior design projects and ME 486 senior design course

Funding

  • Funding Organization: NASA
  • Funding Program: MIRO

Alignment, Engagement and Contributions to the priorities of NASA’s Mission Directorates

JPL team CoSTAR uses the NeBula autonomy solution on wheeled and legged UGVs in the DARPA Subterranean (SubT) Challenge for research tasks aligned with robot teaming and exploration on NASA space missions. In collaboration with JPL, the CART project aims to develop a UGV system that showcases robotics technology to the university community by providing interactive tours of the campus while serving as a research platform for autonomy-assisted teleoperation. Dense urban environments are challenging for wireless communication in the absence of network infrastructure and can suffer from latency or bandwidth limitations; however, live data streaming to a ground control station for display and teleoperation is essential for an interactive tour guide system. UGV missions in dynamic environments with pedestrians require autonomous obstacle avoidance and fault-tolerant safety systems to prevent collisions.

Research Questions and Research Objectives

How do we use a robot platform to create a live, interactive, virtual tour that is engaging for visitors to the university and ARCS? How do we integrate the robot tour guide system into the existing infrastructure for tours at the university? How do we integrate the interactive tour display into the ARCS gallery space to showcase robotics technology? How do we safely implement teleoperation with autonomy in an urban environment?

The goal of the CART research project is to develop a primary rover asset that delivers remote interactive tours of the university campus to visitors from a gallery space. The rover and tour guide system will be designed with the following capabilities in mind:

  • Present participants with an interactive remote tour of the university from a gallery space where they are provided with data from the UGV sensors and information about the university as the UGV travels to different locations on campus.
  • Automate UGV mission tasks to reduce control interface complexity and operator load so the system can be controlled by a trained operator using simplified navigation commands.
  • Monitor communications for teleoperation and recover from intermittent loss of control commands or inertial navigation data.
  • Notify the operator of system faults and execute an emergency stop on operator command.
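The communication-monitoring and emergency-stop behavior above can be illustrated with a minimal command watchdog. This is a hedged sketch, not the CART implementation: the class name `CommandWatchdog` and its methods are hypothetical, and in the real system this logic would be wired into the ROS teleoperation pipeline rather than called directly.

```python
class CommandWatchdog:
    """Sketch of a teleoperation safety monitor (hypothetical names).

    Tracks the freshness of base-station commands and latches an
    emergency stop; the rover is only allowed to drive when the link
    is live and no e-stop has been commanded.
    """

    def __init__(self, timeout_s: float = 0.5):
        self.timeout_s = timeout_s          # max allowed command gap
        self.last_command_time = None       # wall-clock time of last command
        self.estopped = False               # latched operator e-stop

    def on_command(self, now: float) -> None:
        # Called whenever a teleoperation command arrives from the base station.
        self.last_command_time = now

    def on_estop(self) -> None:
        # Operator-commanded emergency stop latches until explicitly cleared.
        self.estopped = True

    def safe_to_drive(self, now: float) -> bool:
        # Drive only if not e-stopped and a command arrived recently.
        if self.estopped or self.last_command_time is None:
            return False
        return (now - self.last_command_time) <= self.timeout_s
```

When `safe_to_drive` returns `False`, the drive loop would command zero velocity, which also covers the "recover from intermittent control commands" case: once fresh commands resume, driving is re-enabled automatically (unless the e-stop latch was set).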

Research Methods

Collaboration with JPL team CoSTAR will help define use case scenarios, sensor payloads, and autonomy operation requirements for UGV development, in addition to the tour guide robot concept. The dense urban environment of the university campus provides a field of operation for mission testing and deployment. The Robot Operating System (ROS) will be used to develop the UGV software and autonomy. An electronics payload for perception, navigation, and communications will be developed by integrating commercially available sensors into a design tailored to the tour guide mission requirements. Commercially available UGV chassis platforms will be explored for testing software and autonomy, shortening the development time to rover deployment. A custom-designed UGV chassis could be developed in the future to accommodate the specific environment and operating requirements of the mission.

To further accelerate the development process, simulation software will be used to test the autonomy performance of the robot model in a virtual university campus environment. The simulation could be integrated into the visitor display to enhance the interactive campus tour. Simplified waypoint navigation will be integrated into the ground control system to minimize operator load for UGV control, thereby allowing visitors or university tour guide ambassadors to have partial control of the system during operation, separate from an engineer/operator. Emergency stop capabilities will be integrated into the UGV and ground control safety systems for pedestrian and environment collision avoidance.
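Simplified waypoint navigation of the kind described above ultimately reduces a GNSS fix and a target point of interest to a distance and heading for the navigation stack. A minimal sketch of that computation follows; the function name and the equirectangular approximation are assumptions for illustration (adequate over campus-scale distances), not the CART project's actual navigation code.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius

def waypoint_offset(lat_deg: float, lon_deg: float,
                    wp_lat_deg: float, wp_lon_deg: float):
    """Return (distance_m, bearing_deg) from the rover's GNSS fix to a
    waypoint, using an equirectangular approximation that is adequate
    over the short distances of a campus tour. Bearing is measured
    clockwise from north."""
    lat1, lon1 = math.radians(lat_deg), math.radians(lon_deg)
    lat2, lon2 = math.radians(wp_lat_deg), math.radians(wp_lon_deg)
    # Local east/north offsets in meters.
    east = (lon2 - lon1) * math.cos((lat1 + lat2) / 2.0) * EARTH_RADIUS_M
    north = (lat2 - lat1) * EARTH_RADIUS_M
    distance = math.hypot(east, north)
    bearing = math.degrees(math.atan2(east, north)) % 360.0
    return distance, bearing
```

A tour operator issuing a "go to the library" command would only select the waypoint; the base station would compute offsets like this and hand them to the UGV's navigation and obstacle avoidance software, keeping the operator interface simple.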

Research Deliverables and Products

The research efforts are currently directed toward integrating software and payload electronics into a Husky A200 UGV. A Linux and ROS software architecture is being used to integrate a GNSS-aided inertial navigation system built around an AHRS/IMU for robot localization and waypoint navigation. Stereo depth and 3D lidar sensors are integrated through ROS for obstacle avoidance autonomy. Preliminary teleoperation is achieved using ROS communication between the UGV and a base station computer, while the integration of waypoint navigation and calibrated obstacle avoidance is still in development. A representative model of the university campus has been constructed and integrated into the ROS Gazebo/RViz simulation, which is currently used for testing the waypoint navigation and obstacle avoidance autonomy software; the campus model will be enhanced with lidar mapping data. A combination of WiFi infrastructure and radio communications is being tested to provide a more robust wireless link that can handle teleoperation in addition to high-bandwidth video streaming for the tour participant interface. Expanded power distribution electronics for additional perception sensors and wireless safety stops are in development. Lessons learned from collaborative research with the senior design course project in developing a mobile ground control station will be used to integrate a base station control interface into the ARCS gallery space. The Campus Autonomous Robot Tours abstract has been accepted for the ASCEND 2021 conference, and a conference paper discussing the unique challenges of designing and operating an autonomous outdoor tour guide robot is in preparation.
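The tour itself amounts to sequencing the rover through an ordered list of points of interest, pausing at each to present information. The sketch below illustrates that mission logic under stated assumptions: the class name `TourMission`, the planar waypoint coordinates, and the arrival tolerance are all hypothetical, and in the ROS-based CART stack each goal would be dispatched to the navigation stack asynchronously rather than checked synchronously like this.

```python
class TourMission:
    """Illustrative sketch of a campus tour mission sequencer
    (hypothetical names, not the CART implementation)."""

    def __init__(self, waypoints, arrival_tolerance_m: float = 1.5):
        self.waypoints = list(waypoints)   # [(name, x_m, y_m), ...]
        self.tolerance = arrival_tolerance_m
        self.index = 0                     # next point of interest

    def current_goal(self):
        # The waypoint the rover is currently driving toward, or None
        # when the tour is complete.
        if self.index >= len(self.waypoints):
            return None
        return self.waypoints[self.index]

    def update(self, x_m: float, y_m: float):
        """Advance to the next point of interest once the rover arrives.
        Returns the name of the waypoint just reached, if any, so the
        base station can trigger the matching tour content."""
        goal = self.current_goal()
        if goal is None:
            return None
        name, gx, gy = goal
        if ((x_m - gx) ** 2 + (y_m - gy) ** 2) ** 0.5 <= self.tolerance:
            self.index += 1
            return name
        return None
```

The same sequencer can be driven by the Gazebo/RViz simulation during development and by the live GNSS-aided localization during field operation, which is one motivation for testing the autonomy software in simulation first.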

Research Timeline

Start Date: June 2020
End Date: December 2022
