  • Currently the PR2 simulator is given the "look at" point in the world coordinate system corresponding to the rotation angles (yaw/pitch) from the Oculus HMD. The head joint limits are not handled right now, which results in a non-robust head tracking + rendering loop (see the clamping sketch after this list).


  • Slow data rates make for poor rendering (see the decoupling sketch after this list):

    • Capable of receiving depth data at ~15 Hz, subject to network bandwidth

    • The Gazebo simulator currently runs at 1 Hz, which limits the rendering frame rate


  • We spent too much time on robust communication between the systems.
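
The joint-limit issue noted in the first bullet can be addressed by clamping the HMD yaw/pitch to the head joint range before projecting the "look at" point. Below is a minimal sketch; the limit values and the head-frame convention are illustrative assumptions, and the exact limits should be read from the PR2 URDF.

    import math

    # Approximate PR2 head joint limits in radians (assumed values;
    # read the exact limits from the robot's URDF).
    PAN_MIN, PAN_MAX = -2.8, 2.8       # head_pan_joint
    TILT_MIN, TILT_MAX = -0.37, 1.29   # head_tilt_joint

    def clamp(value, lo, hi):
        return max(lo, min(hi, value))

    def look_at_point(yaw, pitch, distance=2.0):
        # Clamp the HMD angles to the joint range, then project a point
        # 'distance' meters ahead in the head frame (x forward, y left,
        # z up; positive tilt assumed to look down).
        yaw = clamp(yaw, PAN_MIN, PAN_MAX)
        pitch = clamp(pitch, TILT_MIN, TILT_MAX)
        x = distance * math.cos(pitch) * math.cos(yaw)
        y = distance * math.cos(pitch) * math.sin(yaw)
        z = -distance * math.sin(pitch)
        return (x, y, z)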
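Since the HMD pose changes much faster than new depth data arrives (1-15 Hz, as noted above), one mitigation is to decouple rendering from reception: keep redrawing the most recent cloud at display rate and swap in new data as it comes. A minimal sketch follows, with the receive and draw functions as stand-ins for the project's actual networking and rendering code.

    import threading
    import time

    def receive_cloud():
        # Stand-in for the real network receive; sleeping simulates the
        # ~1 Hz update rate of the Gazebo source.
        time.sleep(1.0)
        return "cloud received at %.2f" % time.time()

    def draw_cloud(cloud):
        # Stand-in for the real point cloud renderer.
        print("rendering:", cloud)

    latest = {"cloud": None}
    lock = threading.Lock()

    def receiver():
        while True:
            cloud = receive_cloud()
            with lock:
                latest["cloud"] = cloud

    def render_loop(target_hz=60.0):
        while True:
            with lock:
                cloud = latest["cloud"]
            if cloud is not None:
                draw_cloud(cloud)  # redraw at display rate, even if stale
            time.sleep(1.0 / target_hz)

    threading.Thread(target=receiver, daemon=True).start()
    render_loop()

Rendering stale data keeps the view responsive to head motion even when new frames arrive slowly.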


    Future Work


  • Robust realization of head movements by handling the joint angle limits

  • We are extracting the hand position and direction from the Leap mounted on the Oculus HMD, but due to time constraints we could not close the hand loop on the Linux end (see the sender sketch after this list)

  • It would be useful to investigate different rendering schemes, such as converting the points into a mesh, and to compare it with the existing point cloud rendering (see the meshing sketch after this list)
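
One way to close the hand loop would be to forward the extracted hand data from the HMD machine to the Linux side. The sketch below assumes the Leap Motion v2 Python SDK on the sender and a placeholder UDP address; it is one possible design, not the project's implementation.

    import json
    import socket
    import time

    import Leap  # Leap Motion SDK (assumed available on the sender machine)

    LINUX_HOST, PORT = "192.168.1.10", 9000   # placeholder address/port
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    controller = Leap.Controller()

    while True:
        frame = controller.frame()
        if not frame.hands.is_empty:
            hand = frame.hands[0]
            msg = {
                "position": [hand.palm_position.x,
                             hand.palm_position.y,
                             hand.palm_position.z],
                "direction": [hand.direction.x,
                              hand.direction.y,
                              hand.direction.z],
            }
            sock.sendto(json.dumps(msg).encode(), (LINUX_HOST, PORT))
        time.sleep(1.0 / 60.0)

A listener on the Linux end would decode the JSON and feed the position and direction into the arm controller.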
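As a starting point for the mesh-versus-points comparison, the sketch below uses Open3D (an assumption for illustration, not a library used in the project) to reconstruct a triangle mesh from a saved cloud and view each result in turn.

    import open3d as o3d

    # Load a saved cloud (placeholder file name) and estimate normals,
    # which Poisson reconstruction requires.
    pcd = o3d.io.read_point_cloud("cloud.pcd")
    pcd.estimate_normals()

    # Turn the points into a triangle mesh via Poisson reconstruction.
    mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=8)

    # View the raw points, then the mesh, for a visual comparison.
    o3d.visualization.draw_geometries([pcd])
    o3d.visualization.draw_geometries([mesh])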
