
Approach 

 

A person wearing the Oculus Rift and LEAP controls the robot’s head, and the depth data from the PR2 is rendered on the Oculus display.

 

Platforms used 

 

  • PR2 - ROS - Linux

  • Oculus - Windows

  • Remote communication through sockets

 

The figure below shows an overview of our approach. Our subsystems run on two separate computers that are connected over the network. The PR2 robot is connected to a machine that runs Linux and the Robot Operating System (ROS). The Oculus Rift and LEAP are connected to another machine that runs Windows. The two machines communicate via sockets over the CSE network, exchanging point clouds and head and hand positions. We use Google Protocol Buffers to serialize and deserialize these messages so that they are OS-independent.
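A sketch of what the exchanged messages might look like as Protocol Buffer definitions (the message and field names here are our own illustration, not the project's actual schema):

```proto
// Hypothetical schema for the two message types crossing the network.
syntax = "proto2";

message PointCloud {
  // Flattened XYZ coordinates (3 floats per point) and one packed
  // color value per point.
  repeated float points = 1 [packed = true];
  repeated uint32 colors = 2 [packed = true];
}

message HeadHandPose {
  // Oculus head orientation (quaternion) and LEAP palm position.
  optional float head_qx = 1;
  optional float head_qy = 2;
  optional float head_qz = 3;
  optional float head_qw = 4;
  optional float hand_x = 5;
  optional float hand_y = 6;
  optional float hand_z = 7;
}
```

Because protobuf handles byte ordering and field encoding itself, the same generated code works on both the Linux and Windows ends.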

[Figure: system overview — PR2/ROS/Linux machine and Oculus + LEAP/Windows machine connected by sockets]

All our communication happens over plain sockets. In total, there are two bi-directional sockets. These sockets use blocking I/O, i.e., read and write operations block on them. To handle this, we use two dedicated threads on each machine, one per socket.
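Under blocking I/O, a read or write can stall indefinitely, which is why each socket gets its own dedicated thread. A minimal sketch of the pattern, using a local socketpair in place of the real cross-machine connection (an assumption for the demo):

```cpp
#include <cstring>
#include <string>
#include <thread>
#include <sys/socket.h>
#include <unistd.h>

// Sketch of the dedicated-thread pattern: each direction of traffic
// lives on its own thread, so a blocked read() or write() never
// freezes the main loop. A socketpair stands in for the network.
std::string exchange() {
    int fds[2];
    if (socketpair(AF_UNIX, SOCK_STREAM, 0, fds) != 0) return "";

    // Writer thread: blocks in write() until the kernel accepts the data.
    std::thread writer([&] {
        const char msg[] = "head_pose";
        write(fds[0], msg, sizeof(msg));  // includes trailing '\0'
        close(fds[0]);
    });

    // Reader thread: blocks in read() until data arrives.
    char buf[64] = {0};
    std::thread reader([&] {
        read(fds[1], buf, sizeof(buf));
    });

    writer.join();
    reader.join();
    close(fds[1]);
    return std::string(buf);
}
```

In the real system the two sockets stay open for the whole session; here the pair is torn down after a single exchange.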

 

The Linux machine runs three threads using the pthreads API. The main thread initializes ROS, subscribes to the registered point clouds from the robot, initiates a client for sending head-motion commands to the robot, creates the sockets, and spawns two threads to handle the socket reads and writes. Whenever the main thread receives a new point cloud from the robot, it updates a global structure with the new data. One of the spawned threads checks for updates to the global structure; on an update, it serializes the point cloud and sends it over the network to the Windows machine. The other thread waits to receive head and hand positions from the Windows machine. On receiving such a message, it commands the robot to move its head toward the desired target direction.
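The hand-off between the ROS callback and the sender thread can be sketched as a mutex-guarded global structure with a sequence counter (the structure, field, and function names here are our own, not the project's actual code):

```cpp
#include <pthread.h>
#include <vector>

// Shared point-cloud structure on the Linux side. The ROS callback
// bumps a sequence number under the mutex; the sender thread compares
// sequence numbers to detect a fresh cloud before serializing it.
struct SharedCloud {
    pthread_mutex_t lock;
    unsigned seq;                // incremented on every new cloud
    std::vector<float> points;   // flattened XYZ data
    SharedCloud() : seq(0) { pthread_mutex_init(&lock, nullptr); }
};

// Called from the ROS point-cloud callback (main thread).
void update_cloud(SharedCloud& s, const std::vector<float>& pts) {
    pthread_mutex_lock(&s.lock);
    s.points = pts;
    ++s.seq;
    pthread_mutex_unlock(&s.lock);
}

// Called in a loop by the sender thread; copies the cloud out only
// when the sequence number has advanced past last_seen.
bool fetch_if_new(SharedCloud& s, unsigned& last_seen,
                  std::vector<float>& out) {
    pthread_mutex_lock(&s.lock);
    bool fresh = (s.seq != last_seen);
    if (fresh) {
        out = s.points;
        last_seen = s.seq;
    }
    pthread_mutex_unlock(&s.lock);
    return fresh;
}
```

The sequence counter means a slow sender simply skips intermediate clouds and always ships the latest one, rather than queueing stale data.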

 

Similar to the Linux machine, the Windows machine also runs three threads. The main thread initializes the Oculus and LEAP, creates the sockets, spawns two threads to handle the sockets, and starts rendering the latest point cloud from the global structure to the Oculus. In the render function, it also gets the latest head and hand positions from the Oculus and LEAP and updates the global structure with these new values. A separate thread constantly checks for new head and hand commands; if any exist, it serializes them and sends them over the network to the Linux machine. Finally, the last spawned thread receives point clouds from the socket and updates the global structure for the next rendering. Currently, the registered point clouds are rendered directly as colored points (there is no mesh).
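If the registered clouds carry their colors packed into a single integer per point (the common 0x00RRGGBB layout is an assumption here, and the helper name is our own), the render step needs to unpack each value into normalized floats before drawing the colored points:

```cpp
#include <array>
#include <cstdint>

// Unpack a 0x00RRGGBB color into the normalized [0, 1] float triple
// that a point renderer expects for each colored vertex.
std::array<float, 3> unpack_rgb(std::uint32_t packed) {
    return {
        ((packed >> 16) & 0xFF) / 255.0f,  // red channel
        ((packed >> 8)  & 0xFF) / 255.0f,  // green channel
        ( packed        & 0xFF) / 255.0f,  // blue channel
    };
}
```

Drawing raw colored points this way avoids any meshing cost per frame, at the price of visible gaps between points when the viewer gets close.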

 
