Columbia Robotics Lab
Mobile Robot Control Interface

Ethan Gold
etgold@cs.columbia.edu

The mobile robot we employ is equipped with several types of sensors and a velocity-oriented drive system. Sonars, CCD cameras, GPS, and odometry sensors, together with four degrees of actuator freedom split between the pan-tilt camera mount and the robot's drive system, present a complex set of data streams to be assimilated and extended by both software and wetware.

Our user-level mobile robot control system is a set of tools built primarily on top of the low-level sensor and actuator layers provided by RWI in the form of CORBA interfaces, plus a small number of independent utilities. Back-end servers and front-end user interfaces communicate by passing variable-length integer and floating-point vector fields back and forth via CORBA, facilitated by a Naming Service running on the robot's embedded computer. We currently use a simple gnuplot 2-dimensional polygon data format for our input maps. The maps serve as the obstacle dataset for the pathplanner module and as the basis for the map image in the main interface. A simple interface has been created to aid in editing and simplifying the map data files for the pathplanner. UI components present the underlying data in a variety of forms suited to a given purpose: raw numbers, plotted range views, or integrated batch management systems.
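
For illustration, such a map file is nothing more than blank-line-separated blocks of x-y vertex pairs, one closed polygon per block; the footprints below are invented, not taken from our campus model:

    # building footprint, coordinates in meters
    10.0  4.0
    18.0  4.0
    18.0 12.0
    10.0 12.0
    10.0  4.0

    # second obstacle
    30.0 20.0
    34.0 20.0
    32.0 25.0
    30.0 20.0

Each block repeats its first vertex to close the polygon, and gnuplot will render the whole file directly with "plot 'map.dat' with lines".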

The navigation (NavServer) component of our control system communicates with the low-level sensor/actuator control classes provided by the RWI architecture and in turn exports a higher-level comprehensive API for more abstract robot motion commands such as "execute the following path" or "tell me where you are and what you are doing." Much of the API is implicit in the data commands rather than expressed as code-readable methods and interfaces, owing to the nature of the data types provided. This layered, multi-threaded, networked approach allows a user (or users) to interact with the robot at several levels of detail simultaneously. A sonar can be viewed as raw data, as a plotted 2-dimensional range view, or on the integrated map canvas along with the visualization of the robot's current position and projected path. The human operator can view the world as seen by the robot and from a detached aerial view at the same time. A further benefit of the consistent RWI APIs, as well as of our own navigation software, is that they may be used on multiple robot platforms with the same UI elements without modification.
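
As a rough sketch of that API's shape (the Java binding below is hypothetical; these are not the actual Mobility or NavServer interface names):

    // Hypothetical Java view of the NavServer CORBA interface.
    // Method and type names are illustrative only.
    public interface NavServerSketch {
        // Submit a batch of motion commands encoded as a flat float
        // vector; fixed-length groups of elements form instructions.
        void executeBatch(float[] commandVector);

        // "Tell me where you are and what you are doing": current
        // pose and activity, again as a flat vector field.
        float[] getStatus();

        // Meta-commands surfaced in the UI toolbar.
        void stop();
        void reset();
        void clear();
    }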

At its simplest, the GUI provides a means of composing robot control commands without resorting to typing raw floating-point NavServer instructions by hand. At the other extreme, the GUI is a rich environment for realtime two-way communication between the remote unit and the user behind a desk. Sensors may be queried, data streams paused, commands edited and issued, all from a single consistent user interface. The primary interaction point with the robot's navigational control system is the NavServer's batch command execution CORBA interface. Batches of robot motion control commands are passed back and forth as a floating-point vector field in which fixed-length groups of contiguous elements comprise individual NavServer instructions. Critical navigation meta-commands such as "STOP", "RESET", and "CLEAR" are made available on an omnipresent toolbar.
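
A minimal client-side sketch of that packing, assuming a hypothetical five-element instruction layout of opcode, x, y, heading, and speed (the real NavServer field layout and opcode values are not reproduced here):

    // Hypothetical packing of NavServer instructions into one flat
    // float vector. The five-element layout is an assumption made
    // for illustration, not the actual NavServer wire format.
    public final class BatchEncoderSketch {
        static final int FIELDS_PER_INSTRUCTION = 5; // opcode, x, y, heading, speed
        static final float OP_GOTO = 1.0f;           // invented opcode value

        // Pack a list of (x, y) waypoints into one contiguous vector field.
        public static float[] packGotoBatch(float[][] waypoints,
                                            float heading, float speed) {
            float[] batch = new float[waypoints.length * FIELDS_PER_INSTRUCTION];
            for (int i = 0; i < waypoints.length; i++) {
                int base = i * FIELDS_PER_INSTRUCTION;
                batch[base]     = OP_GOTO;
                batch[base + 1] = waypoints[i][0]; // x (meters)
                batch[base + 2] = waypoints[i][1]; // y (meters)
                batch[base + 3] = heading;         // radians
                batch[base + 4] = speed;           // meters/second
            }
            return batch;
        }
    }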

The goal of our user interface is to provide a comprehensive realtime view of the robot's location and activities within its environment without cluttering the user's workspace with cumbersome low-level details unless requested. The architecture should provide access to the robot's sensor and control systems at the various levels of integration upon request.

The UI provides both an uncluttered list view of the current and past batches of commands and an integrated map view displaying navigation "targets" as geographical locations and paths. Historical odometric and GPS sensor feedback, as well as the current robot location, are overlaid on the map view in realtime. Direct sensor feedback visualization for individual sensors is accomplished with the provided MOM sensor views in separate windows on the main workspace.

Images from the robot's primary camera can also be displayed (though wireless bandwidth is limited) in a window on the robot's UI workspace, and the camera's PTU can be controlled either through the integrated UI or via external controls and as a webcam. (How often do you see a webcam mounted on a wireless mobile robot?) Realtime image data is useful for remote manual navigation as well as for autonomous visual calibration.

The user edits the batch commands via the click-and-drag map and the comprehensive listbox. Easily configurable maps and zooming capabilities enable precise manual path generation with the mouse via point-and-click. Targets can be adjusted coarsely by dragging them around the map canvas or precisely by entering exact coordinates. Additionally, an external point-to-point path-planning module using a polygon-based external map representation can automatically derive a valid collision-free path (for static objects!) between two arbitrary accessible points in our two-dimensional campus model. The resulting paths and targets are integrated back into the map UI alongside the manually created targets.
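
The planner is an external module, but the geometric test at its core is easy to sketch: a candidate path edge is collision-free against the static map if it crosses no obstacle edge. A minimal version, assuming each polygon arrives as a closed ring of coordinates (degenerate touching cases ignored):

    // Minimal sketch of the static-obstacle test a polygon-based
    // planner relies on: does segment (x1,y1)-(x2,y2) cross any
    // obstacle edge? Proper crossings only; collinear overlaps
    // are not handled in this sketch.
    public final class CollisionSketch {
        // Standard 2-D segment intersection via orientation tests.
        static boolean segmentsCross(double ax, double ay, double bx, double by,
                                     double cx, double cy, double dx, double dy) {
            double d1 = orient(cx, cy, dx, dy, ax, ay);
            double d2 = orient(cx, cy, dx, dy, bx, by);
            double d3 = orient(ax, ay, bx, by, cx, cy);
            double d4 = orient(ax, ay, bx, by, dx, dy);
            return ((d1 > 0) != (d2 > 0)) && ((d3 > 0) != (d4 > 0));
        }

        // Cross product of (a - o) and (p - o): which side of line
        // o->a the point p lies on.
        static double orient(double ox, double oy, double ax, double ay,
                             double px, double py) {
            return (ax - ox) * (py - oy) - (ay - oy) * (px - ox);
        }

        // polygons[i] is a closed ring: {x0,y0, x1,y1, ..., x0,y0}.
        static boolean edgeIsFree(double x1, double y1, double x2, double y2,
                                  double[][] polygons) {
            for (double[] ring : polygons) {
                for (int j = 0; j + 3 < ring.length; j += 2) {
                    if (segmentsCross(x1, y1, x2, y2,
                                      ring[j], ring[j + 1],
                                      ring[j + 2], ring[j + 3])) {
                        return false;
                    }
                }
            }
            return true;
        }
    }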

In addition to the main control GUI we use a number of other tools to help manage our data and systems. Direct-control interfaces such as the PTU UI shown at left are currently separate from the MOM-based integrated UI and Mobility(tm) control layer, and communicate with the hardware directly. These independent UI control prototypes are being merged into the integrated UI as needed. The CORBA robotic control services, which communicate with the hardware and with each other, are managed via a multi-server status GUI that presents the status of the configured set of servers and provides access to their debugging output streams. While this isn't part of the robot control system per se, it is an important interface to our embedded computer system: it prevents conflicts between multiple developers working on the same multi-user embedded robot computer. Future UI development may include separating the high-level control system into an independent CORBA-based UI, in order to gain independence from the moving target of the Mobility codebase. An independent navigational UI and the MOM system would coexist peacefully, providing complementary user services.

MOM-based robot interface (click to enlarge)

The integrated Navigation GUI embedded in the Mobility Object Manager (MOM). The List and Map views are visible along with the meta-command toolbar and the status panel. The dialog is an expanded view of a single target. The NavServer and Mobility interface hierarchy is at left. Our campus origin is marked with a colored splotch.

Independent PTU GUI (click to enlarge)

The stand-alone PTU control interface. The UI provides indicators for the current PTU pan and tilt angles, relative and absolute positioning controls for the mouse, and precise text-based attitude controls.

Mapviewer Tool (click to enlarge)

The mapviewer facilitates polygon mapfile editing for pathplanner input and map simplification.

Server Control Tool (click to enlarge)

This services control GUI tracks the currently running hardware-to-CORBA processes which provide the foundation of the robot control system.