TBM1 - Getting to know my home

Introduction
Guide a robot around the home so it can learn about the locations of objects.

Point of Contact
Joe Daly (jd12587@bristol.ac.uk)

Phase 1:
The robot waits for an instruction to move to a location (e.g. "Go to the kitchen", "Turn right"). When it reaches the target location, the robot is given an instruction in the form "See the [object] on the [location] in the [room]". The robot takes a picture of the object and filters the instruction down to the key words, which are recorded in a file, semantic_map.txt, for marking.
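The instruction-filtering step can be sketched as below. This is a minimal sketch, not the project's actual implementation: the function names (`parse_see_instruction`, `record_entry`) and the comma-separated layout of semantic_map.txt are assumptions, since the marking format is not specified here.

```python
import re

def parse_see_instruction(instruction):
    """Extract the key words from an instruction of the form
    "See the [object] on the [location] in the [room]"."""
    pattern = r"[Ss]ee the (.+) on the (.+) in the (.+?)\.?$"
    match = re.match(pattern, instruction.strip())
    if not match:
        return None  # not a "See the ..." instruction
    obj, location, room = (part.strip() for part in match.groups())
    return {"object": obj, "location": location, "room": room}

def record_entry(entry, path="semantic_map.txt"):
    # Append the filtered key words as one line for marking
    # (comma-separated layout is an assumption).
    with open(path, "a") as f:
        f.write("{},{},{}\n".format(entry["object"], entry["location"], entry["room"]))
```

For example, "See the mug on the table in the kitchen" would be filtered to the key words mug, table, kitchen before being written to semantic_map.txt.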

Phase 2:
TBD

Installation Instructions

 * hearts_tts:
 * to install sound_play:
 * to install gstreamer:
 * sudo apt-get install python-gst0.10

Run Instructions

 * 1) create a map:
 * 2) roslaunch hearts_navigation hearts_navigation_map_begin.launch
 * 3) edit map_name parameter in hearts_navigation_map_end.launch (optional)
 * 4) roslaunch hearts_navigation hearts_navigation_map_end.launch
 * 5) edit map_name parameter in hearts_navigation_navigate.launch to match hearts_navigation_map_end.launch (TODO: remove parameter from know_my_home.launch)
 * 6) input the coordinates of each point of interest into locations.json:
 * 7) rosrun hearts_navigation qeconverter.py
 * 8) roslaunch hearts_navigation hearts_navigation_navigate.launch
 * 9) set 2D pose estimate
 * 10) ensure joystick has control of TIAGo, then set 2D nav goal at point of interest
 * 11) copy coordinates into locations.json
 * 12) repeat steps 10 and 11 for all points of interest
 * 13) set benchmarking box parameters:
 * 14) edit rsbb_key and rsbb_host parameters of roah_rsbb_comm node in know_my_home.launch
 * 15) run launch file:
 * 16) roslaunch tbm1_getting_to_know_my_home know_my_home.launch
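Steps 6 and 11 above write coordinates into locations.json. The exact schema is defined by qeconverter.py and is not shown here, so the following is only a plausible sketch, assuming each named point of interest stores a 2D pose (position plus heading) taken from the 2D nav goal:

```json
{
  "kitchen":     {"x": 1.25, "y": -0.40, "theta": 1.57},
  "living_room": {"x": 3.10, "y": 2.05,  "theta": 0.00}
}
```

The location names would then match the rooms referenced in the spoken instructions (e.g. "Go to the kitchen").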

Dependencies

 * hearts_stt: speech-to-text
 * hearts_tts: text-to-speech
 * hearts_navigation: navigation
 * hearts_camera_saver: save image frames from TIAGo's camera on request

Provided Topics & Services
None