MuJoCo VR and roadmap

Discussion in 'Feature Requests' started by ViktorM, Nov 17, 2017.

  1. Hi Emo,

    I have a feature (documentation) request and a question:

    1) It would be very useful to have some documentation for MuJoCo VR, as well as more default examples shipped with it, even very simple ones, like grasping one object with the Jaco arm and placing it on another. I installed and tried it a few days ago - it quickly worked out of the box and looks really nice. But I have very little understanding of what can be achieved with it: how I can control the fingers of the default Jaco arm and grasp something, or how I can import another robot with a simpler gripper, say Sawyer or Fetch, and map control of its gripper to some of the HTC Vive controller's keys.

    2) Can you share a roadmap for the next release? Approximately when is it planned, and what are the major new features?
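
    As an aside on the gripper question: in MJCF, a gripper joint is typically exposed to external control by attaching an actuator to it; user code (e.g. a VR input handler) then writes targets into `mjData.ctrl`. A minimal sketch, with hypothetical joint/actuator names and ranges:

    ```xml
    <!-- Minimal MJCF sketch: expose a gripper joint to external control.
         The names, ranges, and gains below are hypothetical examples. -->
    <mujoco>
      <worldbody>
        <body name="gripper_finger">
          <joint name="finger_joint" type="hinge" limited="true" range="0 1.2"/>
          <geom type="capsule" size="0.01 0.04"/>
        </body>
      </worldbody>
      <actuator>
        <!-- Position servo on the finger joint; a VR input callback would
             write the target angle to mjData.ctrl[0], e.g. when a Vive
             controller trigger is pressed. -->
        <position joint="finger_joint" kp="10" ctrlrange="0 1.2"/>
      </actuator>
    </mujoco>
    ```

    With something like this in the model, mapping a Vive button to the gripper reduces to setting one entry of `mjData.ctrl` from the controller-polling code.
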


    Thanks,
    Victor
     
  2. Emo Todorov (Administrator, Staff Member)

    The public version of the VR project is hopelessly behind. There is a new version that works with MuJoCo 1.50 and the Luke Hand model from DEKA, but since we are not yet allowed to distribute that model to the general public, the VR release was not updated. The plan is to update the VR project in the upcoming MuJoCo 1.60 release which should be out in a couple of months. At that point it would make sense to write documentation and provide more examples re VR.

    A major new feature in 1.60, which is already implemented, is a native UI rendered in OpenGL, which allows you to add custom controls (buttons, sliders, etc.) to a MuJoCo simulation with surprisingly little programming. Rendering improvements are also planned, in particular allowing explicit texture coordinates to be loaded from file (instead of always generating them automatically).

    It is difficult to commit to a concrete timeline, because the main focus right now is development of optimization tools built into MuJoCo, and this is rather open-ended (but going well). The idea is to have MuJoCo design controllers for you automatically, given cost functions that specify the high-level goals of the movement. This has been the focus of my research over the past 20 years, and my research group has produced many examples of doing it successfully, but always as a research project that requires extensive manual labor by experts. I am trying to develop a product that does it automatically and much faster.