
Whether the program is common to HTC Vive devices #98

Open
SSSSamZhu opened this issue May 20, 2022 · 32 comments

Comments

@SSSSamZhu

SSSSamZhu commented May 20, 2022

Hi,
For robot display and interaction, we would like to reproduce your work with a VR head-mounted display (teleoperating the robot's upper body through a Vive). However, our VR headset does not seem to be the Oculus you are using. I have read the related discussions below. Does this mean I can currently reproduce your results on a Vive?
#60
#71
from iCub Shenzhen01
Thanks!

@S-Dafarra
Collaborator

S-Dafarra commented May 20, 2022

Hi @SSSSamZhu, thanks for opening the issue.

The pipeline we used in https://www.youtube.com/watch?v=r6bFwUPStOA is still in progress, and it is not thoroughly documented yet. In that case, we are using https://github.com/robotology/human-dynamics-estimation/tree/master to estimate the operator's body posture. Then, we use the Xsens_module of this repo to send references to the walking-controller (see https://github.com/robotology/walking-controllers).

For what concerns the use of the HTC Vive, we exploit this device: https://github.com/ami-iit/yarp-device-openxrheadset. Also in this case, its use is not very documented yet (but I am working on a README https://github.com/ami-iit/yarp-device-openxrheadset/tree/readme).

#71 was a first rough attempt to use the VIVE joysticks to command the robot arms. On the other hand, we are heading in a different direction, and that PR probably will not be merged.

I think that the easiest thing you can try would be to control just the robot neck with the HTC VIVE. The steps more or less would be:

Again, apologies for the missing documentation about the whole pipeline, but it is an active research area at the moment and many things are changing.

Let me know if you need additional details.

@SSSSamZhu
Author

SSSSamZhu commented Jun 7, 2022

Hi @S-Dafarra ,
Thanks for your help. I'm sorry it took me 18 days to get back to this issue. I have recently been learning to use iCub's libraries, and this is my first experience with VR. I hope you won't mind my basic but practical questions.

  1. Does yarp-device-openxrheadset need to be installed on Windows and used together with Oculus_module? Or can it connect to a Vive VR device directly from the superbuild on Linux? I noticed that OpenXR-related plugins were added in the latest release, 2022.05. https://github.com/robotology/robotology-superbuild/releases
  2. Does that mean I need to fully configure the environment from https://github.com/robotology/walking-teleoperation before adding the yarp-device-openxrheadset module you mentioned?
  3. As for the whole system installation, should I install yarp via the superbuild and use a Windows system?
  4. Can the robot perform upper-body motion without the walking-controllers module, i.e., by capturing the end effectors of the operator's hands and head and commanding the corresponding movements? If the walking-controllers module is needed, should yarp and icub-main be installed on the Linux system as well?

Thank you for your answers, and I'm sorry for taking up your time.

@SSSSamZhu
Author

Hi @S-Dafarra,
I'm impressed that you built such a great robot: https://www.youtube.com/watch?v=r6bFwUPStOA. We plan to use iCub mainly for hospitality and presentations. Therefore, we hope to reproduce your project on iCub Shenzhen01 and would appreciate more detailed guidance from you. The only devices we have right now are the iCub v2.0 robot itself and some VR devices, such as the HTC Vive I mentioned earlier. If necessary, we can purchase the same equipment used in your experiments.

Thank you so much for helping us.

@S-Dafarra
Collaborator

Apologies for the late answer, it has been a busy period. Let me try to go through your questions.

Does yarp-device-openxrheadset need to be installed on Windows and used together with Oculus_module? Or can it connect to a Vive VR device directly from the superbuild on Linux? I noticed that OpenXR-related plugins were added in the latest release, 2022.05. https://github.com/robotology/robotology-superbuild/releases

Yes, yarp-device-openxrheadset is now installed by the superbuild when ENABLE_TELEOPERATION is set to ON. This should make the installation much easier. On the other hand, you still need to run yarp-device-openxrheadset on Windows, since the use of SteamVR and the VIVE headset on Linux is not very stable. We did try once, but we had some issues with the visualization inside the headset. So, for starting out, I would suggest running yarp-device-openxrheadset on Windows. The OculusRetargetingModule can run on Linux instead.
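For reference, a minimal sketch of enabling the teleoperation profile when configuring the superbuild, assuming the superbuild's usual ROBOTOLOGY_ENABLE_* naming for profile options (check doc/cmake-options.md for the exact flag name):

cd robotology-superbuild
mkdir build && cd build
cmake -DROBOTOLOGY_ENABLE_TELEOPERATION:BOOL=ON ..
cmake --build . --config Release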

  • Does that mean I need to fully configure the environment from https://github.com/robotology/walking-teleoperation before adding the yarp-device-openxrheadset module you mentioned?
  • As for the whole system installation, should I install yarp via the superbuild and use a Windows system?

Yes, you would need to set up the Windows system using the superbuild. I would suggest following these instructions: https://github.com/robotology/robotology-superbuild/blob/8e2869cef7ff91a3bbf1fa2151a2e79ceb1deb69/doc/conda-forge.md#source-installation
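As a rough sketch of that conda-based setup (the environment name is illustrative, and the exact package list should be taken from the linked conda-forge.md document):

conda create -n robotologyenv -c conda-forge compilers cmake ninja pkg-config git
conda activate robotologyenv
git clone https://github.com/robotology/robotology-superbuild
then configure and build as in the sketch above.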

4. Can the robot perform upper-body motion without the walking-controllers module, i.e., by capturing the end effectors of the operator's hands and head and commanding the corresponding movements? If the walking-controllers module is needed, should yarp and icub-main be installed on the Linux system as well?

Yes, to use walking-controllers you would also need yarp and icub-main installed. Once more though, the use of the VIVE controllers to control the robot hands is not fully supported by us yet. It is currently a work in progress.

@SSSSamZhu
Author

Hi @S-Dafarra ,
Thanks for your nice reply.

In that case, we are using https://github.com/robotology/human-dynamics-estimation/tree/master to estimate the operator's body posture. Then, we use the Xsens_module of this repo to send references to the walking-controller (see https://github.com/robotology/walking-controllers).

Based on what you did before, can I assume that I can use the HTC Vive to reproduce some other functions of your project? I saw the HTC Vive used in the video. Maybe that part of the functionality streams the iCub's camera view back to the HTC Vive? We would like that functionality as well!

  1. Could you please give more details about using yarp-device-openxrheadset with the yarp superbuild on Windows 10? For example, what commands do I need to enter in the terminal?
  2. On Linux, I have read the README file. The first step is to import the XML file https://github.com/robotology/walking-teleoperation/blob/master/app/scripts/Xprize-VisualRetargeting.xml you mentioned into yarpmanager and, at the same time, modify the .ini configuration file. But I did not find these files in the superbuild. Does that mean I need to clone walking-teleoperation into the superbuild directory?
    [screenshot]
    Also, does walking-teleoperation not need to be compiled? I ask because I saw the following in the README:
git clone https://github.com/robotology/walking-teleoperation.git
cd walking-controllers
mkdir build && cd build

Finally, thank you for your patience.

@S-Dafarra
Collaborator

Based on what you did before, can I assume that I can use the HTC Vive to reproduce some other functions of your project? I saw the HTC Vive used in the video. Maybe that part of the functionality streams the iCub's camera view back to the HTC Vive? We would like that functionality as well!

For the control of the gaze, you can check the SRanipalModule: https://github.com/robotology/walking-teleoperation/tree/42004109a4068f584fdaae348f448b8812efecdf/modules/SRanipal_module

Could you please give more details about using yarp-device-openxrheadset with the yarp superbuild on Windows 10? For example, what commands do I need to enter in the terminal?

I would suggest referring to https://www.yarp.it/latest/ and https://github.com/robotology/robotology-superbuild

On Linux, I have read the README file. The first step is to import the XML file https://github.com/robotology/walking-teleoperation/blob/master/app/scripts/Xprize-VisualRetargeting.xml you mentioned into yarpmanager and, at the same time, modify the .ini configuration file. But I did not find these files in the superbuild. Does that mean I need to clone walking-teleoperation into the superbuild directory?

This probably means that the superbuild did not compile correctly.

@SSSSamZhu
Author

SSSSamZhu commented Jul 12, 2022

Hi @S-Dafarra ,
I am very sorry that I approved this pull request by mistake. I hope it will not have too much of a negative impact on your work. If you need me to delete or withdraw the approval, please tell me the detailed steps; I am not familiar with this process. Please forgive my carelessness.
#71

As I mentioned earlier, we want to add solid presentation and interaction capabilities to the iCub we already have, since it may need to do more interactive entertainment in the future. I noticed that your video published in 2019, https://www.youtube.com/watch?v=yELyMYkCyNE, contains segments using the Oculus. I think this would be a stable technical solution, and it shows the result of using Joypad + Oculus. Could you tell us the exact version and model you are using?
[screenshot]

To keep this issue clean and the communication efficient, perhaps we could move the follow-up discussion to https://github.com/robotology/community/discussions.

Finally, thank you very much for your patience.

@S-Dafarra
Collaborator

Hi @S-Dafarra, I am very sorry that I approved this pull request by mistake. I hope it will not have too much of a negative impact on your work. If you need me to delete or withdraw the approval, please tell me the detailed steps; I am not familiar with this process. Please forgive my carelessness. #71

No problem at all, do not worry 😉

As I mentioned earlier, we want to add solid presentation and interaction capabilities to the iCub we already have, since it may need to do more interactive entertainment in the future. I noticed that your video published in 2019, https://www.youtube.com/watch?v=yELyMYkCyNE, contains segments using the Oculus. I think this would be a stable technical solution, and it shows the result of using Joypad + Oculus. Could you tell us the exact version and model you are using?

We were using an old Oculus Rift (one of the first). It is very old, and it cannot be bought anymore. In that case, we were using this other device, but since we moved to the use of the VIVE we are not using it anymore (hence it is not maintained).

Note that in that setup as well, we needed walking-controllers running on the robot.

In order to avoid using walking-controllers, we can resort to the following guide: https://github.com/robotology/human-dynamics-estimation/blob/f55e70d7adebc66552f4359f62b90fbed8b8d7ae/doc/how-to-run-whole-body-retargeting.md
On the other hand, this solution would require the use of the XSens suit.

As mentioned above, we are still working on a simpler retargeting application using just the controllers, but this is still a work in progress.

@SSSSamZhu
Author

SSSSamZhu commented Jul 12, 2022

We were using an old Oculus Rift (one of the first). It is very old, and it cannot be bought anymore.

Do you mean the Oculus Rift CV1? There seems to be some stock on the Chinese market. It would be great to know the exact model you used.

In that case, we were using this other device, but since we moved to the use of the VIVE we are not using it anymore (hence it is not maintained).

In the superbuild documentation, I saw the description of the Oculus option:
https://github.com/robotology/robotology-superbuild/blob/master/doc/cmake-options.md#oculus
Does this correspond to the functionality mentioned above? I'm not sure what the specific function of yarp-device-ovrheadset is.

Notice that also in this way we needed walking-controllers to work on the robot.

In order to avoid using walking-controllers, we can resort to the following guide: https://github.com/robotology/human-dynamics-estimation/blob/f55e70d7adebc66552f4359f62b90fbed8b8d7ae/doc/how-to-run-whole-body-retargeting.md On the other hand, this solution would require the use of the XSens suit.

All in all, does this mean that using Joypad + Oculus alone is not feasible? I saw it described on the wiki, https://github.com/robotology/walking-teleoperation/wiki, but it seems a little outdated.

but since we moved to the use of the VIVE we are not using it anymore (hence it is not maintained).

However, I remember you said that using the VIVE controllers to control the robot hands is not fully supported yet. Do you mean that you use the VIVE as an expression-recognition module, and use XSens for full-body retargeting rather than the VIVE or Oculus?

@S-Dafarra
Collaborator

but since we moved to the use of the VIVE we are not using it anymore (hence it is not maintained).

However, I remember you said that using the VIVE controllers to control the robot hands is not fully supported yet. Do you mean that you use the VIVE as an expression-recognition module, and use XSens for full-body retargeting rather than the VIVE or Oculus?

In the past, we did use the Oculus joysticks to control the robot (through walking-controllers), but we no longer use that headset, nor do we still support the code that was working back then.

Anyhow, I guess we are getting a bit off-topic.

If your aim is to use a VIVE headset to control the robot, right now the easiest thing you can try is to work with the head and gaze retargeting only, as mentioned in #98 (comment). This requires you to install the robotology-superbuild as mentioned above. Once we are there, we can discuss how to proceed 😉

@SSSSamZhu
Author

SSSSamZhu commented Jul 12, 2022

If your aim is to use a VIVE headset to control the robot, right now the easiest thing you can try is to work with the head and gaze retargeting only, as mentioned in #98 (comment).

Sorry, maybe my first question was misleading. What we want to do is reproduce an interactive teleoperation solution on iCub, like the one you built before using Joypad + Oculus, to control the movement of the robot's upper body. The first solution I came up with was to use the VIVE, because we happen to have one in our lab.

Thanks.

@S-Dafarra
Collaborator

If your aim is to use a VIVE headset to control the robot, right now the easiest thing you can try is to work with the head and gaze retargeting only, as mentioned in #98 (comment).

Sorry, maybe my first question was misleading. What we want to do is reproduce an interactive teleoperation solution on iCub, like the one you built before using Joypad + Oculus, to control the movement of the robot's upper body. The first solution I came up with was to use the VIVE, because we happen to have one in our lab.

Thanks.

That's clear.

The systems you have seen in the videos have been very complex to set up and require a lot of configuration and testing. In all those cases, we were controlling all the robot joints through the walking-controller.

Right now I don't have any how-to guide, nor any steps to follow. We do not have anything ready at the moment to control just the upper body using the VIVE or any other joystick, but it should be possible. In any case, this will require running a series of different modules and many things can go wrong in the middle.

Hence, as a first step, I was suggesting starting by controlling only the neck, so that at least we can begin testing the system.

I understand this is not exactly what you want, but I don't know how to help you otherwise.

@SSSSamZhu
Author

SSSSamZhu commented Jul 14, 2022

Morning, @S-Dafarra,

We do not have anything ready at the moment to control just the upper body using the VIVE or any other joystick, but it should be possible. In any case, this will require running a series of different modules and many things can go wrong in the middle.

I also noticed that you used trackers for retargeting; is this approach still supported and maintained? It seems like a great idea.
https://github.com/ami-iit/yarp-openvr-trackers

@S-Dafarra
Collaborator

Morning, @S-Dafarra,

We do not have anything ready at the moment to control just the upper body using the VIVE or any other joystick, but it should be possible. In any case, this will require running a series of different modules and many things can go wrong in the middle.

I also noticed that you used trackers for retargeting; is this approach still supported and maintained? It seems like a great idea. https://github.com/ami-iit/yarp-openvr-trackers

Yes, we do use trackers. They are also supported in https://github.com/ami-iit/yarp-device-openxrheadset. With the same pipeline it is possible to get the position and orientation of the joysticks. Controlling the robot accordingly is another story.
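As a quick illustration, once the device is running, such a pose stream can be inspected from the command line; the port name below is purely hypothetical and must be replaced with the one actually opened by the device:

yarp read ... /headset/left_hand/pose:o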

@SSSSamZhu
Author

Hi, @S-Dafarra
I have learned how to connect YARP nodes so that the Windows host can communicate with the Linux machine. But I still have some problems, and I hope to get your advice.
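For reference, pointing both machines to the same yarpserver can be done roughly as follows (the address is just an example):

yarpserver                        on the machine acting as name server
yarp conf 192.168.1.101 10000     on every other machine, to register the name server address
yarp detect --write               alternatively, to auto-detect the name server and store its address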

  1. I have created a new configuration file with the robot name in the corresponding folder, but I don't think it has taken effect. I hope you can give me more detailed suggestions.

[screenshot]
The log on Windows (see also the note at the end of this comment about the .ini error):

(robsub) C:\Users\samgmzhu>yarprun --server /icub-virtualizer
[INFO] |yarp.os.Port|/icub-virtualizer| Port /icub-virtualizer active at tcp://192.168.1.102:10021/
[INFO] Yarprun successfully started on port:  /icub-virtualizer
[INFO] |yarp.os.impl.PortCoreInputUnit|/icub-virtualizer| Receiving input from /tmp/port/2 to /icub-virtualizer using tcp
STARTED: server=/icub-virtualizer alias=:icub-virtualizeryarpdev--from:openXRHeadsetParameters.ini0 cmd=yarpdev --from openXRHeadsetParameters.ini  pid=6992
[INFO] |yarp.os.impl.PortCoreInputUnit|/icub-virtualizer| Removing input from /tmp/port/2 to /icub-virtualizer
[ERROR] |yarp.os.Property| cannot read from openXRHeadsetParameters.ini
[INFO] |yarp.dev.Drivers| Welcome to yarpdev, a program to create YARP devices
[INFO] |yarp.dev.Drivers| To see the devices available, try:
[INFO] |yarp.dev.Drivers|    yarpdev --list
[INFO] |yarp.dev.Drivers| To create a device whose name you know, call yarpdev like this:
[INFO] |yarp.dev.Drivers|    yarpdev --device DEVICENAME --OPTION VALUE ...
[INFO] |yarp.dev.Drivers|    For example:
[INFO] |yarp.dev.Drivers|    yarpdev --device fakeFrameGrabber --width 32 --height 16 --name /grabber
[INFO] |yarp.dev.Drivers| You can always move options to a configuration file:
[INFO] |yarp.dev.Drivers|    yarpdev [--device DEVICENAME] --from CONFIG_FILENAME
[ERROR] |yarp.dev.Drivers| Unable to find --device option in file openXRHeadsetParameters.ini. Closing.
CLEANUP :icub-virtualizeryarpdev--from:openXRHeadsetParameters.ini0 (6992)
[INFO] |yarp.os.impl.PortCoreInputUnit|/icub-virtualizer| Receiving input from /tmp/port/2 to /icub-virtualizer using tcp
[INFO] |yarp.os.impl.PortCoreInputUnit|/icub-virtualizer| Removing input from /tmp/port/2 to /icub-virtualizer
[INFO] |yarp.os.impl.PortCoreInputUnit|/icub-virtualizer| Receiving input from /tmp/port/2 to /icub-virtualizer using tcp
[INFO] |yarp.os.impl.PortCoreInputUnit|/icub-virtualizer| Removing input from /tmp/port/2 to /icub-virtualizer
[INFO] |yarp.os.impl.PortCoreInputUnit|/icub-virtualizer| Receiving input from /tmp/port/2 to /icub-virtualizer using tcp
[INFO] |yarp.os.impl.PortCoreInputUnit|/icub-virtualizer| Removing input from /tmp/port/2 to /icub-virtualizer
  2. I was wondering how to change the host here. I have changed the original icub-console to my host name, issacliang-LC3. How should I modify icub-console-gui? It cannot start for that reason.
    [screenshot]
    Looking forward to your reply.
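A note on the .ini error in the log above: yarpdev reports that it cannot read openXRHeadsetParameters.ini and therefore cannot find the --device option. This usually means that the file is not in the directory from which yarprun/yarpdev was started (so either start it from that folder or pass an absolute path to --from), or that the file lacks a device entry. Assuming the device registered by yarp-device-openxrheadset is called openxrheadset (please check that repository's documentation for the actual name and parameters), a minimal sketch of the content of openXRHeadsetParameters.ini could be:

device openxrheadset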

@S-Dafarra
Collaborator

Hi @SSSSamZhu, I am afraid this would be much harder than expected. I have been discussing this with some colleagues in the lab and we will try to release an application like those in https://robot-bazaar.iit.it/applications to perform upper-body retargeting.

Nonetheless, this might take several weeks. Hope it is not a problem for you.

@SSSSamZhu
Author

SSSSamZhu commented Jul 27, 2022

Hi @S-Dafarra, thanks for your quick reply.
I seem to have heard you mention this application before. I think it will be the one that has everything we need. To prepare for your upcoming application release, could you please tell us which hardware we need to get in advance? Just the VIVE controllers?

As mentioned above, we are still working on a simpler retargeting application using just the controllers, but this is still a work in progress.

@S-Dafarra
Collaborator

Hi @S-Dafarra, thanks for your quick reply. I seem to have heard you mention this application before. I think it will be the one that has everything we need. To prepare for your upcoming application release, could you please tell us which hardware we need to get in advance? Just the VIVE controllers?

As mentioned above, we are still working on a simpler retargeting application using just the controllers, but this is still a work in progress.

In principle, it should be possible to use every headset compatible with OpenXR. We will probably test the VIVE, but an Oculus connected via Oculus Link should probably also work.

@SSSSamZhu
Author

Hi, @S-Dafarra,

In principle, it should be possible to use every headset compatible with OpenXR. We will probably test the VIVE, but an Oculus connected via Oculus Link should probably also work.

If you don't mind, I would like to take part in your VIVE testing when the prototype application is ready, since this issue is already taking quite a while and I can't wait to see iCub in action. Thanks!

@S-Dafarra
Collaborator

S-Dafarra commented Jul 28, 2022

Hi @SSSSamZhu I have recently put together the following file to run neck and eyes retargeting. Please use the following walking-teleoperation branch: https://github.com/robotology/walking-teleoperation/tree/usability_improvements
Here are the steps to follow (a consolidated command sketch is given at the end of this comment):

  1. Run yarpserver.
  2. Run yarprun --server /icub-head --log on icub-head.
  3. On Windows, download teleoperationFIles.zip and unpack it in a folder of your choice.
  4. On Windows, using a terminal properly configured to run yarp applications, go to the folder where you unpacked the files and run yarpmanager --apppath ./
  5. You should be able to see two applications:
    [screenshot]
  6. Run all the modules in the iCub application and connect all the ports.
  7. On the teleoperation application

The app to run upper body retargeting will take some more time to be developed.
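For convenience, the commands from the steps above gathered in one place (the Windows folder path is only an example):

On the machine running the YARP name server:
yarpserver
On icub-head:
yarprun --server /icub-head --log
On Windows, in a yarp-enabled terminal, from the folder with the unpacked files:
cd C:\path\to\unpacked\files
yarpmanager --apppath ./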

@SSSSamZhu
Author

Hi, @S-Dafarra,

I have recently put together the following file to run neck and eyes retargeting. Please use the following walking-teleoperation branch: https://github.com/robotology/walking-teleoperation/tree/usability_improvements

(robsub) C:\Users\samgmzhu\robotology-superbuild\src\walking-teleoperation>git checkout usability_improvements
Previous HEAD position was 28c685a Merge pull request #99 from robotology/devel
Switched to a new branch 'usability_improvements'
branch 'usability_improvements' set up to track 'origin/usability_improvements'.

I think it will work.

  • run yarpserver
  • run yarprun --server /icub-head --log on icub-head
  • On Windows, download teleoperationFIles.zip and unpack it in a folder of your choice.

There are two types of files in the package. One is .xml, which I put in the applications folder of the superbuild, ~/robotology-superbuild\build\install\share\yarp\applications; but I don't know where to place the .ini configuration file. Can you describe it in more detail?

  • On windows, using a terminal properly configured to run yarp applications, go to the folder where you unpacked the files and run yarpmanager --apppath ./

I have run yarpmanager on my Windows host before without getting any response. The result is the same this time, which does not seem normal. Could you suggest a solution?

(robsub) C:\Users\samgmzhu>yarpmanager

(robsub) C:\Users\samgmzhu>yarpmanager --apppath ./

(robsub) C:\Users\samgmzhu>yarpmanager --apppath ./

(robsub) C:\Users\samgmzhu>

Thanks.

@S-Dafarra
Collaborator

S-Dafarra commented Jul 29, 2022

Hi @SSSSamZhu, that seems to be an issue with your yarp installation, and I would suggest opening an issue in the relevant repository.

Otherwise, I would suggest waiting until we release the app I mentioned above.

@SSSSamZhu
Author

Hi, @S-Dafarra,
OK, I have created the issue. And what about my question regarding oculusConfigNeckOnly.ini?

There are two types of files in the package. One is .xml, which I put in the applications folder of the superbuild, ~/robotology-superbuild\build\install\share\yarp\applications; but I don't know where to place the .ini configuration file. Can you describe it in more detail?

@S-Dafarra
Collaborator

You do not need to copy those files anywhere; it is sufficient to launch yarpmanager as mentioned (with --apppath specified), or to launch the relevant modules from the folder where you keep those files.
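For illustration, both options would look roughly like this (the folder path is an example; pairing oculusConfigNeckOnly.ini with the OculusRetargetingModule and the standard --from option is an assumption to be checked against the module's documentation):

cd C:\path\to\unpacked\files
yarpmanager --apppath ./
or, to launch a module directly from that folder:
OculusRetargetingModule --from oculusConfigNeckOnly.ini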

@SSSSamZhu
Author

SSSSamZhu commented Aug 2, 2022

Hi, @S-Dafarra,
Thanks to your help, I was able to move the robot's neck successfully. But for some reason, the headset does not show the robot's view. Could you suggest how to debug this? These are my Windows screenshots.

[screenshots]
However, I am able to view the camera stream directly from the Linux console.
[screenshot]

@S-Dafarra
Collaborator

Hi @SSSSamZhu, make sure to also connect the ports in the Teleoperation application. If that does not work, make sure that you are using the external graphics card. You can check https://www.dell.com/support/kbdoc/it-it/000190229/how-to-set-nvidia-video-as-the-default-with-computers-that-have-integrated-and-discrete-video-cards?lang=en for how to do that.

@SSSSamZhu
Author

Hi @S-Dafarra ,

make sure to also connect the ports in the Teleoperation application. If that does not work, make sure that you are using the external graphics card.

I tried again to connect the ports in the Teleoperation application, but it failed. So far the problem seems to be in the port connections. Does this have anything to do with my configuration on Linux? I notice that the names at both the "from" and "to" ends have turned green.

[screenshot]

Moreover, when I wanted to select the discrete graphics card as the main device, I found that my computer only lists the discrete graphics card and does not detect the integrated graphics on the motherboard. Perhaps it was already using the discrete card in the first place. Sorry for the inconvenience caused by the Chinese-language system.
[screenshots]

Thanks.

@SSSSamZhu
Author

Hi @S-Dafarra ,

make sure to also connect the ports in the Teleoperation application. If that does not work, make sure that you are using the external graphics card.

I tried again to connect the ports in the Teleoperation application, but it failed. So far the problem seems to be in the port connections. Does this have anything to do with my configuration on Linux? I notice that the names at both the "from" and "to" ends have turned green.

Finally, I changed the carrier value from 'mjpeg' to 'tcp', and the problem was solved. Now I can see iCub's view of the world in my head-mounted display. Thanks!
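For reference, the carrier is simply the last argument of a connection; the command-line equivalent would be something like the following, where the port names are hypothetical placeholders for the camera and headset display ports used in the application (replacing tcp with mjpeg requires the mjpeg carrier on both machines):

yarp connect /icub/cam/left /headset/display/left:i tcp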

@S-Dafarra
Collaborator

Great! You can check if the mjpeg carrier is correctly installed on both the Windows machine and on the iCub head: https://www.yarp.it/git-master/group__carrier__config.html#carrier_config_mjpeg
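To check which carriers are available on a machine, if your YARP version supports it, you can list them with:

yarp connect --list-carriers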

@SSSSamZhu
Author

SSSSamZhu commented Aug 11, 2022

Hi, @S-Dafarra ,
The problem I had:
Today I encountered this error while trying to open the application: unable to find "mjpeg".
[screenshots]

The solution I tried:

Great! You can check if the mjpeg carrier is correctly installed on both the Windows machine and on the iCub head: https://www.yarp.it/git-master/group__carrier__config.html#carrier_config_mjpeg

Then I think I need to turn on YARP_COMPILE_CARRIER_PLUGINS and then ENABLE_yarpcar_mjpeg_carrier in CMake, as mentioned in the link. Awkwardly, because I use the superbuild, I cannot find the YARP build folder to configure it separately. Please tell me what to do next.
[screenshot]
There is no build folder in YARP. I am also aware of this tutorial, but I may need more detailed guidance. https://github.com/robotology/robotology-superbuild#faqs
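A possible way forward, sketched under the assumption that the superbuild keeps a per-project build directory for YARP under build/src (as described in the FAQ linked above) and that the option names from the carrier documentation are correct; please verify both before running:

cd <robotology-superbuild>/build/src/YARP
cmake -DYARP_COMPILE_CARRIER_PLUGINS:BOOL=ON -DENABLE_yarpcar_mjpeg_carrier:BOOL=ON .
cmake --build . --config Release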
Thanks.

@SSSSamZhu
Author

SSSSamZhu commented Aug 12, 2022

Let me fill in some details about the errors.
I got this error when trying to start the first yarpdev of the Teleoperation application. At this point, the problem does not appear to be caused by the mjpeg carrier. Looking forward to your reply. Thank you.

[screenshots]

I have been discussing this with some colleagues in the lab and we will try to release an application like those in https://robot-bazaar.iit.it/applications to perform upper-body retargeting.

Nonetheless, this might take several weeks. Hope it is not a problem for you.

Finally, I would like to know when the application will be available. I hope this question does not come across as too abrupt.
Thanks again.

@S-Dafarra
Collaborator

Hi @SSSSamZhu, sorry but the problems you are listing above are not specific to this repo. I would like to avoid cluttering this issue too much. I would suggest you open specific issues in the relevant repos.

The idea behind the application was specifically meant to avoid these kinds of configuration issues. On the other hand, its development might take some time. I will keep you posted in case there is any update.
