Robots in the Classroom, Research and Space - by Pat Stakem

Project LARGE: [L]IDAR-[A]ssisted [R]obotic [G]roup [E]xploration. A small fleet of robots (a mothership and several workerbots) used 3D LIDAR data to explore novel areas. The software team developed object recognition, mapping, path planning, and other software to autonomously control the workerbots between infrequent contacts with human monitors, and wrote the control programs using ROS.

The LARGE Team :
Mentors : "NASA Mike" Michael Comberiate, Jaime Cervantes, Cornelia Fermuller, Marco Figueiredo, Pat Stakem
Software : Felipe Farias, Bruno Fernades, Thomaz Gaio, Jacqueline Kory, Christopher Lin, Austin Myers, Richard Pang, Robert Taylor, Gabriel Trisca
Hardware : Andrew Gravunder, David Rochell, Gustavo Salazar, Matias Soto, Gabriel Sffair
Others Involved : Mike Huang, William Martin, Randy Westlund


Photo captions - Mothership Robot "Nanook" :
At NASA Goddard Space Flight Center, one of three student research tracked platforms.
The unit was heavy, and was moved around on the cart. Development station in background.
Nanook with the Penguin #1 and Penguin #2 Workerbots.
Front view. The lidar scanned the room and made a map. A lot of data was involved.
Called Penguins because they had been to Antarctica. This one has lidar.



Photo captions - Penguin #1 and Penguin #2 Workerbots :
Lidar is in the front, and a laptop is to the left, for collecting the data.
The power electronics and battery were low in the chassis.
A couple of big students could lift the unit up to a convenient height to work on it.
Aha! I see you. You're the Mothership Robot.
Each unit had two spheres, at different heights. These were Styrofoam balls. From the lidar (or, later, camera) data each unit could be identified.
Another view of the ID spheres.
Side view. These units had a car battery for propulsion power.
The workerbots were built to work with the Mothership Robot.
Click to Enlarge


The goal of the LARGE project is to assemble a networked team of autonomous robots to be used for three-dimensional terrain mapping, high-resolution imaging, and sample collection in unexplored territories. The software we develop in this proof-of-concept project will be transportable from our test vehicles to actual flight vehicles, which could be sent anywhere from toxic waste dumps and disaster zones on Earth to asteroids, moons, and planetary surfaces beyond.

The robot fleet consists of a single motherbot and a set of workerbots. The motherbot is capable of recognizing the location and orientation of each workerbot, allowing her to designate target destinations for any worker and track their progress. Presently, localization and recognition are performed via the detection of spheres mounted in a unique configuration atop each robot. Each worker can independently plot a safe path through the terrain to the goal assigned by the motherbot. Communication between robots is interdependent and redundant, with messages sent over a local network. If communication between workers and the motherbot is lost, the workers will be able to establish a new motherbot and continue the mission. The failure of any single robot or device will not prevent the mission from being completed.
(Image: artificial-color 3D point cloud.)
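The sphere-marker scheme described above can be sketched in a few lines: if each robot carries two spheres mounted at a unique pair of heights, then matching the detected sphere heights against a registry identifies the robot. This is a minimal illustration only; the robot names are from this page, but the height values, tolerance, and function names are assumptions, not the project's actual configuration.

```python
# Minimal sketch of sphere-marker identification, assuming each robot is
# tagged by two spheres at a unique pair of mounting heights.  The height
# values and tolerance below are illustrative, not the project's real ones.

# Fleet registry: robot name -> sorted pair of sphere heights in meters.
FLEET = {
    "nanook":    (0.60, 1.10),
    "penguin_1": (0.50, 0.90),
    "penguin_2": (0.40, 1.00),
}

TOLERANCE = 0.05  # meters of slack allowed when matching detected heights


def identify_robot(detected_heights):
    """Match a pair of detected sphere heights against the fleet registry."""
    lo, hi = sorted(detected_heights)
    for name, (ref_lo, ref_hi) in FLEET.items():
        if abs(lo - ref_lo) <= TOLERANCE and abs(hi - ref_hi) <= TOLERANCE:
            return name
    return None  # no registered robot matches this sphere configuration


print(identify_robot((1.08, 0.61)))  # -> nanook
print(identify_robot((0.91, 0.52)))  # -> penguin_1
```

Because the height pairs are unique across the fleet, a single lidar (or camera) detection of two sphere heights is enough to tell the robots apart.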

The robots use LIDAR sensors to take images of the terrain, stitching successive images together to create global maps. These maps can then be used for navigation. Eventually, several of the workers will carry other imaging sensors, such as cameras for stereo vision or a Microsoft Kinect, to complement the LIDAR and enable the corroboration of data across sensory modalities.
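The stitching step above amounts to transforming each scan's points out of the robot's local frame into a shared map frame using the robot's pose at scan time. The 2D sketch below assumes the poses are already known; in practice a scan-matching step (e.g. ICP) refines them, and all function names here are illustrative.

```python
import math

# Sketch of stitching successive 2D scans into one global map: each scan's
# points are expressed in the robot's local frame, and the robot's pose
# (x, y, heading) at scan time rotates and translates them into the shared
# map frame.  Assumes poses are known; real systems refine them by
# scan matching.

def to_global(pose, local_points):
    """Rotate and translate local scan points into the global map frame."""
    x, y, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    return [(x + c * px - s * py, y + s * px + c * py)
            for px, py in local_points]


def stitch(scans):
    """Merge (pose, points) pairs into a single global point list."""
    global_map = []
    for pose, points in scans:
        global_map.extend(to_global(pose, points))
    return global_map


# Two scans of the same wall point, taken from different poses, should land
# on the same global coordinate.
scan_a = ((0.0, 0.0, 0.0), [(2.0, 1.0)])
scan_b = ((2.0, 1.0, math.pi / 2), [(0.0, 0.0)])
print(stitch([scan_a, scan_b]))  # both points map to (2.0, 1.0)
```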


Mothership Robot "Nanook" came out of NASA Goddard Space Flight Center through "NASA Mike" Michael Comberiate's Robot Boot Camp, with Randy Westlund. The robot surveyed the harsh terrain and weather of the Antarctic. The team outfitted an MMP-40 with loads of sensors and communications gear, including a LADAR unit, a satellite phone communication system, onboard computers, battery heaters, and much more. The robot can be controlled from anywhere in the world via the Internet.

The robotic mothership underwent testing in Antarctica and Alaska while being operated from Maryland. The current test system runs around $30,000, while the final rover that will be sent to Mars can exceed $100 million.

The Internet's standard protocols, for example, are not suited to transmission delays of more than about 3 seconds. When sending data from, say, Mars, line-of-sight transmission can still experience delays of five to ten minutes. Without line of sight, the delay is even longer, often hours.
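The one-way delay follows directly from distance divided by the speed of light. The Earth-Mars distance varies with orbital positions, so the sample distances below are illustrative round numbers, not mission data.

```python
# One-way signal delay is distance divided by the speed of light.  The
# Earth-Mars distance varies with orbital positions; the distances below
# are illustrative sample values, not mission data.

C_KM_PER_S = 299_792.458  # speed of light in km/s

for label, distance_km in [("near opposition",  78_000_000),
                           ("mid-range",        150_000_000),
                           ("near conjunction", 375_000_000)]:
    delay_min = distance_km / C_KM_PER_S / 60
    print(f"{label}: one-way delay ~{delay_min:.1f} minutes")
```

At moderate distances the one-way delay lands in the five-to-ten-minute range quoted above, and a command/response round trip doubles it.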

When a conventional communication protocol times out waiting for an acknowledgment, the usual procedure is to send the transmission again, from the beginning. That approach is not suitable for planetary exploration; hence the need for a new communication protocol that can handle long delays.
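The difference between the two retransmission strategies can be sketched with a simple byte count. This is an illustrative toy, not the team's actual protocol: a naive scheme resends the whole message after any loss, while a delay-tolerant scheme splits the message into chunks, tracks which chunks were acknowledged, and resends only the missing ones.

```python
# Toy comparison of the two retransmission strategies described above
# (illustrative only -- not the team's actual protocol).  A naive protocol
# resends the whole message after any failed attempt; a chunked,
# delay-tolerant one resends only the chunks that were never acknowledged.

def naive_transfer(message, lost_rounds):
    """Resend the entire message from the beginning after each failure."""
    bytes_sent = 0
    for _ in range(lost_rounds):      # each failed attempt wastes a full copy
        bytes_sent += len(message)
    bytes_sent += len(message)        # the final, successful attempt
    return bytes_sent


def chunked_transfer(message, chunk_size, lost_chunks):
    """Send fixed-size chunks; resend only chunks that were not acknowledged."""
    chunks = [message[i:i + chunk_size]
              for i in range(0, len(message), chunk_size)]
    bytes_sent = sum(len(c) for c in chunks)                 # first pass
    bytes_sent += sum(len(chunks[i]) for i in lost_chunks)   # targeted resends
    return bytes_sent


msg = b"x" * 1000
print(naive_transfer(msg, lost_rounds=2))           # 3000 bytes on the wire
print(chunked_transfer(msg, 100, lost_chunks=[3]))  # 1100 bytes on the wire
```

With multi-minute round trips, avoiding full restarts is the difference between a usable link and one that never finishes a transfer.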

In their research, Comberiate and his team developed a robot that is being tested in the Antarctic and that could wind up in the Mars Rover mission. Communicating between their offices in Maryland and the robot at the South Pole is similar to communicating with a roving robot on Mars. The engineers experience satellite synchronization issues with volumes of data as the robot takes digital dot-matrix pictures of objects it finds, similar to what they will experience when transmitting to equipment on another planet. The images are sent to the engineers, who then decide which objects require a closer look. The dot-matrix format is used because it transmits faster than a full digital camera image.

The robot uses laser-based guidance known as LADAR (Laser Detection and Ranging) to find and take images of objects. It is semi-autonomous and has 3D scanning capability with image stitching.

The laser has a spinning mirror inside that sweeps the beam from left to right, measuring the time it takes to return a pulsed beam of infrared light from an object. The mirror spins four times on each horizontal line and then a step motor raises it up ¼ of a degree in the vertical axis. The laser spins again along the horizontal line, building the image one line at a time. “We scan with a ¼ degree of accuracy left and right, and ¼ degree of accuracy up and down,” noted Comberiate, “which gives us a 3D image. The colors show a low resolution of the distance to every point in the scan, but the computer onboard has about 1000 times more data than shown in the images. These images convey the critical information to the operators on Earth, but take 1000 times less time to send than a typical photograph.”
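Each return described above is a range measured at a known azimuth (left-right) and elevation (up-down) angle, stepped in quarter-degree increments, which converts directly to a 3D point. The sketch below shows that conversion; the axis conventions and function names are assumptions for illustration.

```python
import math

# Each LADAR return is a range at a known azimuth and elevation angle,
# stepped in quarter-degree increments as described above.  Converting each
# (range, azimuth, elevation) triple from spherical to Cartesian
# coordinates yields the 3D point cloud.  Axis conventions here are
# illustrative.

STEP_DEG = 0.25  # quarter-degree step in both axes


def to_cartesian(range_m, az_steps, el_steps):
    """Convert a range return at (az, el) quarter-degree steps to x, y, z."""
    az = math.radians(az_steps * STEP_DEG)
    el = math.radians(el_steps * STEP_DEG)
    x = range_m * math.cos(el) * math.cos(az)   # forward
    y = range_m * math.cos(el) * math.sin(az)   # left
    z = range_m * math.sin(el)                  # up
    return x, y, z


# A return straight ahead at 10 m, and one 90 degrees left (360 steps).
print(to_cartesian(10.0, 0, 0))    # (10.0, 0.0, 0.0)
print(to_cartesian(10.0, 360, 0))  # about (0.0, 10.0, 0.0)
```

Coloring each point by its range value then gives exactly the kind of artificial-color distance image the operators review on Earth.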

3D scanning provided by the mothership brings back images that show depth of field plus azimuth plus elevation. The different colors shown in the images depict varying distances from the mothership.

Previous imaging systems could not deliver the needed resolution, and the pictures displayed considerable distortion. “We chose the Lin Engineering step motor because it could handle the arctic conditions of 40 below zero and still deliver smooth motion and hold position,” said Comberiate. “It gives us excellent remote control over the size of each step.”

In these rugged environments, the robot must operate off batteries. “We direct the heat from the electronics to where it is needed throughout the robot and to the batteries to keep them warm. Any motor we choose must be able to handle such environmental conditions.”


Courtesy of Mentor Patrick H. Stakem, a Senior Systems Engineer supporting NASA's Goddard Space Flight Center.


All pictures, information and notes are not those of NASA or Goddard Space Flight Center. No affiliation or endorsement has been made or taken, and infringement is not intended. Contents will be removed if in violation. Please contact me / us if you want to add information (names, pictures, etc.), find any errors, or wish information or pictures removed. We strive for accuracy and will make any changes that are necessary.


Source: Patrick H. Stakem - Updated 01-14-2015