TBM2 - Welcoming Visitors

Introduction
Recognise visitors at the door and take appropriate action depending on who they are.

Point of Contact
Zeke Steer (mail@zekesteer.me.uk)

Approach
The robot waits for the doorbell to be rung, then navigates to the front door, where it instructs the visitor to stand in front of the camera. The robot detects faces in the video stream and matches them against training images of Dr. Kimble using a similarity threshold. If Dr. Kimble is not recognised, the robot attempts to recognise the postman or deliman using statistical properties (e.g. the median red, green and blue values) of the chest region, i.e. the area directly beneath the detected face.
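The chest-colour check described above can be sketched as follows. This is an illustrative sketch only: the real thresholds live in macros.hpp, and the colour ranges and function names below are assumptions, not the package's actual values.

```python
import numpy as np

# Hypothetical colour ranges for each uniform; the real min/max values are
# measured during training and compiled into macros.hpp.
POSTMAN_RANGE = {"r": (150, 255), "g": (0, 90), "b": (0, 90)}    # assumed red uniform
DELIMAN_RANGE = {"r": (0, 90), "g": (100, 200), "b": (0, 90)}    # assumed green uniform

def chest_region(image, face_box):
    """Take the region directly beneath the detected face as the chest."""
    x, y, w, h = face_box
    return image[y + h : y + 2 * h, x : x + w]

def median_rgb(region):
    # Median over all pixels, per channel (region is H x W x 3, RGB order).
    return tuple(np.median(region.reshape(-1, 3), axis=0))

def classify(image, face_box):
    """Label the visitor by whether the chest's median colour falls in a range."""
    r, g, b = median_rgb(chest_region(image, face_box))
    for name, rng in (("postman", POSTMAN_RANGE), ("deliman", DELIMAN_RANGE)):
        if (rng["r"][0] <= r <= rng["r"][1]
                and rng["g"][0] <= g <= rng["g"][1]
                and rng["b"][0] <= b <= rng["b"][1]):
            return name
    return "unknown"
```

Using the median rather than the mean makes the check robust to a few outlier pixels (badge, skin at the collar) in the chest crop.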

If the robot recognises Dr. Kimble, it leads him to the bedroom, then waits for him to leave. The robot detects that Dr. Kimble has left the bedroom by looking for changes in the laser scan data. The robot then navigates back to the front door and wishes Dr. Kimble goodbye.
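The departure check can be sketched as a comparison between the current laser scan and a reference scan taken while Dr. Kimble was still in the room. The tolerance and beam-count thresholds below are illustrative assumptions, not the package's values.

```python
def scan_changed(reference, current, tol=0.2, min_beams=10):
    """Return True if at least `min_beams` ranges differ by more than `tol` metres.

    `reference` and `current` are lists of range readings (one per beam),
    as published in a sensor_msgs/LaserScan message's `ranges` field.
    """
    changed = sum(1 for ref, cur in zip(reference, current) if abs(ref - cur) > tol)
    return changed >= min_beams
```

Requiring several beams to move, rather than one, filters out single-beam sensor noise.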

If the robot recognises the postman, it instructs him to open the door and leave the parcel on the hallway floor. It then wishes him goodbye.

If the robot recognises the deliman, it leads him to the kitchen, then instructs him to leave the breakfast box on the table. The robot then navigates back to the front door and wishes the deliman goodbye.

Otherwise the robot indicates to the visitor that it doesn't recognise them and cannot open the door.

The robot ends the interaction by returning to its home position.
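The interaction flow above can be summarised as a simple dispatch table. The handler bodies are placeholder action lists standing in for the navigation and speech behaviours; the labels match the roles in this README.

```python
# Placeholder behaviours: each handler returns the ordered actions described
# in the Approach section for that visitor.
def greet_dr_kimble():
    return ["lead to bedroom", "wait until he leaves",
            "return to front door", "say goodbye"]

def greet_postman():
    return ["ask to leave parcel on hallway floor", "say goodbye"]

def greet_deliman():
    return ["lead to kitchen", "ask to leave breakfast box on table",
            "return to front door", "say goodbye"]

def greet_unknown():
    return ["explain that the visitor is not recognised and the door stays closed"]

HANDLERS = {
    "dr_kimble": greet_dr_kimble,
    "postman": greet_postman,
    "deliman": greet_deliman,
}

def handle_visitor(label):
    """Dispatch on the recognised visitor; every interaction ends at home."""
    actions = HANDLERS.get(label, greet_unknown)()
    return actions + ["return to home position"]
```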

Installation Instructions

 * axis_camera:
    * git clone https://github.com/ros-drivers/axis_camera.git
    * recompile the workspace
    * source ~/tb_ws/devel/setup.bash
 * opencv (???)

Run Instructions

 * 1) Create a map:
    * roslaunch hearts_navigation hearts_navigation_map_begin.launch
    * edit the map_name parameter in hearts_navigation_map_end.launch (optional)
    * roslaunch hearts_navigation hearts_navigation_map_end.launch
    * edit the map_name parameter in hearts_navigation_navigate.launch to match hearts_navigation_map_end.launch (TODO: remove parameter from tbm2_welcoming_visitors.launch)
 * 2) Input the coordinates of the points of interest into locations.json:
    * rosrun hearts_navigation qeconverter.py
    * roslaunch hearts_navigation hearts_navigation_navigate.launch
    * set the 2D pose estimate
    * ensure the joystick has control of TIAGo, then set a 2D nav goal at the point of interest
    * copy the coordinates into locations.json
    * repeat the previous two steps for all points of interest (home, hallway, entrance, outside bedroom)
 * 3) Set the benchmarking box parameters:
    * edit the rsbb_key and rsbb_host parameters of the roah_rsbb_comm node in tbm2_welcoming_visitors.launch
 * 4) Configure the Axis camera:
    * edit the hostname, username and password parameters of the axis_driver nodes in hearts_uniform.launch and hearts_face_uniform_reg.launch
 * 5) Train the face recogniser:
    * roslaunch hearts_face_uniform_reg hearts_face_uniform_reg.launch
    * have Dr. Kimble stand in front of the camera, making small circular head movements, until the training data have been captured and the launch file has terminated
    * optionally, change the sensitivity threshold in macros.hpp and recompile
 * 6) Train the uniform recogniser:
    * roslaunch hearts_face_uniform hearts_face_uniform.launch
    * have the deliman and postman stand in front of the camera, then record the min/max ranges of the statistical parameters over several image frames
    * edit macros.hpp with the min/max ranges and recompile
 * 7) Run the launch file:
    * roslaunch tbm2_welcoming_visitors tbm2_welcoming_visitors.launch
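The name qeconverter.py suggests a quaternion/Euler converter for turning the RViz 2D nav goal poses into locations.json entries. The sketch below shows the standard planar yaw conversion; this is an assumption about the script's role, not its actual source.

```python
import math

def yaw_from_quaternion(x, y, z, w):
    """Yaw (rotation about z) in radians, from a unit quaternion."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

def quaternion_from_yaw(yaw):
    """Planar orientation back into an (x, y, z, w) quaternion."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))
```

For a planar robot like TIAGo, only the yaw component of the pose orientation is meaningful, so a location can be stored compactly as (x, y, yaw).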

Dependencies

 * hearts_face_uniform: face and uniform recognition
 * hearts_face_uniform_reg: face registration (training)
 * hearts_tts: text-to-speech
 * hearts_navigation: navigation

Provided Topics & Services
None