
Perception Neuron and HTC Vive, need for some clarifications


  • #1

    Hello,
    this is Marco. I am currently working on my MEng thesis on "Virtual Reality for Sports Training".

    We are trying to create a VR basketball scenario in which the player shoots a free throw while wearing an HTC Vive and a Perception Neuron tracking suit.
    We want to track the player's movements (for the ball we only have a Kinect and computer vision algorithms - OpenCV - but we welcome advice) and reproduce them in the HTC Vive's virtual space, and your latest blog post seems to offer a solution.

    I tried the RUISViveNeuron scene and it's a great starting point, but I have some doubts about it and hope to find a solution here.
    I put on the suit, start Axis Neuron, then start Unity3D and the scene, but the orientations of the head and the body don't seem to be aligned (e.g., I look forward with my hands in front of me, but they appear shifted to the left or right). I also tried pressing the space bar as instructed in the readme file, but that doesn't seem to fix it.

    I also tried reconfiguring the Vive play area from the SteamVR app, and things improved somewhat (before, the body was misaligned by 90 degrees; now it's better, but a few degrees of difference remain).

    Does anyone know if there's something I'm missing?

    Thanks so much,
    Marco

  • #2
    Hi Marco,

    Pressing the space bar worked when I tested the scene, and the Vive and Perception Neuron functioned together perfectly fine for me.

    I'm not sure what you mean by "I see forward with hands in front of me but they seem to be moved left or right". If the skeleton is constantly rotating, then the script is slowly correcting the yaw difference between the Vive headset and the Neuron headband. Pressing the space bar corrects the yaw difference instantaneously, and you usually need to do that every time you run your application, unless you want to wait for the yaw difference to be corrected slowly on its own.

    In Unity you can select the NeuronRobot_SingleMesh GameObject and check from the "RUIS Yaw Drift Corrector" component that the "Reset Correction Button" is set to Space. The same component also has a "Drift Correction Velocity" parameter, but I don't recommend changing it from the default value of 2 (degrees per second).
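
    To illustrate the idea, here's a minimal sketch of that kind of yaw drift correction (not the actual RUIS implementation; the class and field names below are placeholders mirroring the Inspector labels): the skeleton's yaw is nudged towards the headset's yaw at a fixed angular velocity, and a key press snaps it instantly.

    using UnityEngine;

    // Sketch of yaw drift correction (placeholder code, not the RUIS source):
    // rotate the skeleton's yaw towards the headset's yaw at a fixed angular
    // velocity, or snap it instantly when the reset key is pressed.
    public class YawDriftCorrectorSketch : MonoBehaviour
    {
        public Transform headset;       // e.g. the Vive HMD transform
        public Transform skeletonRoot;  // e.g. the Neuron skeleton root
        public KeyCode resetCorrectionButton = KeyCode.Space;
        public float driftCorrectionVelocity = 2f; // degrees per second

        void Update()
        {
            // Signed shortest yaw difference between skeleton and headset
            float yawError = Mathf.DeltaAngle(skeletonRoot.eulerAngles.y,
                                              headset.eulerAngles.y);

            if (Input.GetKeyDown(resetCorrectionButton))
            {
                // Instantaneous correction
                skeletonRoot.Rotate(0f, yawError, 0f, Space.World);
            }
            else
            {
                // Slow continuous correction, clamped to the error magnitude
                float step = Mathf.Sign(yawError)
                           * Mathf.Min(Mathf.Abs(yawError),
                                       driftCorrectionVelocity * Time.deltaTime);
                skeletonRoot.Rotate(0f, step, 0f, Space.World);
            }
        }
    }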

    Have you modified the scene in some way? You can download the package again and check whether the unmodified RUISViveNeuron scene works properly.

    Other things that you can check:
    * Make sure that Unity's Game View is "active" (click it with the mouse), so that pressing the Space key is registered by the application (see the small debug sketch after this list).
    * Is the skeleton animated correctly in the AXIS Neuron software? If not, there might be something wrong with the calibration of the Perception Neuron, or you might be wearing the left-hand trackers on the right hand, etc.
    * Are you wearing both the Vive headset and the Perception Neuron at the same time? You should be.
    * Are you wearing the headband tracker correctly, as intended by the Perception Neuron manufacturer?
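
    If you want to verify that key presses reach Unity at all, a hypothetical debug helper like this (attach it to any GameObject and watch the Console) will tell you whether the Game View actually has input focus:

    using UnityEngine;

    // Hypothetical debug helper: if nothing is logged when you press Space,
    // the Game View does not have input focus (or the key goes elsewhere).
    public class KeyPressDebug : MonoBehaviour
    {
        void Update()
        {
            if (Input.GetKeyDown(KeyCode.Space))
                Debug.Log("Space registered at time " + Time.time);
        }
    }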

    Using Kinect to track a ball can be challenging due to input lag, and because in Kinect 2 the color camera's frame rate switches automatically between 15 and 30 FPS depending on the lighting conditions, whereas the depth camera has a constant frame rate of 30 FPS. But if you get your Kinect ball tracker to work, then you could use RUIS to calibrate the Vive and Kinect coordinate systems and make the two devices work together.
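
    Once that calibration is done, applying it to a tracked ball position boils down to one matrix multiplication. A rough sketch, assuming the calibration yields a 4x4 rigid transform from Kinect to Vive coordinates (the names and the identity placeholder below are hypothetical, not the RUIS API):

    using UnityEngine;

    // Sketch: mapping a ball position from Kinect to Vive coordinates.
    // The matrix is a placeholder; in practice it would come from the
    // RUIS Kinect/Vive calibration procedure.
    public static class KinectViveCalibration
    {
        // 4x4 rigid transform from Kinect space to Vive space
        public static Matrix4x4 kinectToVive = Matrix4x4.identity;

        public static Vector3 ToViveSpace(Vector3 ballInKinectSpace)
        {
            // MultiplyPoint3x4 applies rotation and translation (no projection)
            return kinectToVive.MultiplyPoint3x4(ballInKinectSpace);
        }
    }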
