touch sensor for contact between arbitrary geoms

Discussion in 'Simulation' started by Ethan Brooks, Nov 26, 2017.

  1. For the longest time, I could not figure out why my touch sensors were not registering contacts in the sensordata array until I read this in the documentation:

    So presumably if a contact point does not involve a geom attached to the same body as the site, it will not register. My sensors need to register contact with geoms attached to the worldbody, but their sites are attached to robot fingertips. What is the best way to get contacts to register in sensordata?
  2. Emo Todorov


    Not sure what you are trying to do. I am guessing you want to detect contacts between robot fingertips and everything else -- in which case you should indeed attach the sensor sites to the fingertips. Or, are you trying to detect contacts between a worldbody geom and everything else? In that case, the sensor site should be attached to the worldbody geom.

    Keep in mind that contacts involve a pair of geoms, but the touch sensor is only associated with one geom, and does not care what the other geom is that forms the contact.
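    As a concrete illustration of this pattern, here is a minimal MJCF sketch (all names and sizes are illustrative, not taken from any model in this thread): the touch sensor references a site, and the site sits on the same body as the fingertip geom whose contacts should be measured:

    ```xml
    <mujoco>
      <worldbody>
        <!-- the table/floor lives on the worldbody -->
        <geom name="floor" type="plane" size="1 1 0.1"/>
        <body name="fingertip" pos="0 0 0.1">
          <geom name="tip_geom" type="sphere" size="0.01"/>
          <!-- site slightly larger than the geom so contact points fall inside it -->
          <site name="tip_site" type="sphere" size="0.012"/>
        </body>
      </worldbody>
      <sensor>
        <touch name="tip_touch" site="tip_site"/>
      </sensor>
    </mujoco>
    ```

    With this arrangement, a contact between tip_geom and the worldbody floor registers in sensordata: the contact involves a geom attached to the site's body, and the contact point falls inside the site volume.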
  3. Ok. I have a minimal example that reproduces the issue:

    #include "lib.h"
    #ifdef MJ_EGL
    #include "renderEgl.h"
    #else
    #include "renderGlfw.h"
    #endif
    #include "mujoco.h"
    #include "stdio.h"
    #include "stdlib.h"
    #include "string.h"

    int initMujoco(const char *filepath, State * state)
    {
        char error[1000] = "Could not load xml model";
        state->m = mj_loadXML(filepath, 0, error, 1000);
        if (!state->m)
            mju_error_s("Load model error: %s", error);
        state->d = mj_makeData(state->m);
        mj_forward(state->m, state->d);
        mjv_makeScene(&state->scn, 1000);
        mjr_makeContext(state->m, &state->con, 200);
        return 0;
    }

    int setCamera(int camid, State * state)
    {
        mjvScene *scn = &(state->scn);
        mjvCamera *cam = &(state->cam);
        mjvOption *opt = &(state->opt);
        cam->fixedcamid = camid;
        if (camid == -1) {
            cam->type = mjCAMERA_FREE;
        } else {
            cam->type = mjCAMERA_FIXED;
        }
        mjv_updateScene(state->m, state->d, opt, NULL, cam, mjCAT_ALL, scn);
        return 0;
    }

    int closeMujoco(State * state)
    {
        mjv_freeScene(&state->scn);
        mjr_freeContext(&state->con);
        mj_deleteData(state->d);
        mj_deleteModel(state->m);
        return 0;
    }

    //-------------------------------- main function ----------------------------------------
    int main(int argc, const char **argv)
    {
        int H = 800;
        int W = 800;
        char const *filepath = "../zero_shot/environment/models/pick-and-place/world.xml";
        /*char const *filepath = "xml/humanoid.xml";*/
        char const *keypath = "../.mujoco/mjkey.txt";
        State state;
        GraphicsState graphicsState;

        mj_activate(keypath);
        initOpenGL(&graphicsState, &state);
        initMujoco(filepath, &state);
        mj_resetDataKeyframe(state.m, state.d, 0);

        float action0 = 0;
        // main loop
        for (int i = 0; i < 10000; i++) {
            // print all sensor readings for this step
            for (int j = 0; j < state.m->nsensordata; j++)
                printf("%f ", state.d->sensordata[j]);
            printf("\n");
            renderOnscreen(-1, &graphicsState);
            action0 -= .1;
            state.d->ctrl[0] = action0;
            mj_step(state.m, state.d);
        }
        closeMujoco(&state);
        return 0;
    }
    In the attached video of the output, the upper half of the screen shows a terminal printing the contents of sensordata (per the code above). As the video demonstrates, the touch sensors do not activate when they come into contact with the blocks or the table.
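    One way to narrow this down (a debugging sketch, not part of my program above; it assumes a loaded mjModel *m and mjData *d) is to print MuJoCo's contact list directly each step. If contacts between the fingertip geoms and the blocks show up here but not in sensordata, the problem is the sensor/site placement rather than collision detection:

    ```c
    #include "mujoco.h"
    #include <stdio.h>

    // Enumerate the active contacts and their normal forces, independent
    // of any touch sensor defined in the model.
    void printContacts(const mjModel *m, mjData *d)
    {
        for (int i = 0; i < d->ncon; i++) {
            const mjContact *con = &d->contact[i];
            mjtNum force[6];                  // [normal, friction1, friction2, torques]
            mj_contactForce(m, d, i, force);  // force in the contact frame
            const char *n1 = mj_id2name(m, mjOBJ_GEOM, con->geom1);
            const char *n2 = mj_id2name(m, mjOBJ_GEOM, con->geom2);
            printf("contact %d: %s - %s, normal force %f\n", i,
                   n1 ? n1 : "(unnamed)", n2 ? n2 : "(unnamed)", force[0]);
        }
    }
    ```

    Calling this right after mj_step in the main loop shows which geom pairs MuJoCo is actually generating contacts for.
    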


    Last edited: Nov 29, 2017
  4. I just returned to this thread and realized how completely unreproducible these code samples are, so apologies to Emo and the Mujoco community!
    I wish I could provide a tidy model bundled into one or two files, but unfortunately, as you can see from the video, my robot uses quite a few STL meshes and it would be cumbersome to upload them all (or for you to download them).

    Instead, here is a link to a small github repository that contains the code and has a broken branch with the issue and a master branch with the fix:

    Basically, I moved the touch sensor's site up one level in the body hierarchy, and suddenly it became responsive to the blocks. I don't really understand why this works, so I hope Emo can weigh in with a more thorough explanation.
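    For anyone who doesn't want to dig through the repo, the change amounts to something like the following MJCF sketch (body and site names are made up here; the real model is in the repository):

    ```xml
    <!-- broken branch: site on the same body as the fingertip geom,
         but contacts with the blocks did not register -->
    <body name="gripper">
      <body name="fingertip">
        <geom name="tip_geom" type="box" size="0.01 0.01 0.02"/>
        <site name="touch_site" type="box" size="0.011 0.011 0.021"/>
      </body>
    </body>

    <!-- master branch: site moved up one level, onto the parent body -->
    <body name="gripper">
      <site name="touch_site" type="box" size="0.02 0.02 0.03"/>
      <body name="fingertip">
        <geom name="tip_geom" type="box" size="0.01 0.01 0.02"/>
      </body>
    </body>
    ```

    One guess, consistent with the documentation quoted earlier: the sensor only counts contacts whose point falls inside the site volume, so a site that is too small or offset relative to where the contact points actually occur could read zero even when attached to the right body. But I'd welcome a definitive answer.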