
Stutter during head movement in Oculus Quest and Quest 2 #101

Open
tfurlong opened this issue Dec 10, 2020 · 20 comments
@tfurlong

I think this issue may exist in the Rift as well, but it is much less noticeable. When using the Oculus Quest or Quest 2 and an Oculus Link cable, the image jitters when there is head motion. It can be seen in the mirror view as well as black lines that appear on the sides of the frame. I think the issue may have something to do with TimeWarp because more prediction is required using the Quest/Link configuration.

@tfurlong
Author

This line in OculusDevice::beginFrame looks incorrect, but removing it does not improve the problem:
m_sensorSampleTime = ovr_GetPredictedDisplayTime(m_session, frameIndex);

@tfurlong
Author

The problem does not occur if samples=0 in viewerexample.cpp (default is 4). That is a suitable workaround for me.
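For reference, a minimal sketch of that workaround (an assumption on my part: the actual viewerexample.cpp may instead pass the sample count through its own option parsing or texture setup):

```cpp
// Hypothetical sketch: force multisampling off before realizing the viewer,
// equivalent to running the example with samples=0.
osg::DisplaySettings::instance()->setNumMultiSamples(0);
```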

@bjornblissing
Owner

First of all, great that this library seems to work on the Quest/Link setup as well.

I have some follow-up questions. Are you seeing these issues even with simple models like cow.osg?
What configuration are you running? I have seen some really strange performance problems with computers connected to monitors via DisplayPort->HDMI converters.

Another thing you can try is to change the threading model. It is currently set to SingleThreaded:

viewer.setThreadingModel(osgViewer::Viewer::SingleThreaded);

But other users have reported that other threading models can work as well. See #100
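As a sketch, switching the threading model is a one-line change (CullDrawThreadPerContext is just one candidate; whether it helps on a Quest/Link setup is untested):

```cpp
// Hypothetical alternative: let OSG run cull/draw in a thread per
// graphics context instead of the main thread.
viewer.setThreadingModel(osgViewer::Viewer::CullDrawThreadPerContext);
```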

@bjornblissing
Owner

This line in OculusDevice::beginFrame looks incorrect, but removing it does not improve the problem:
m_sensorSampleTime = ovr_GetPredictedDisplayTime(m_session, frameIndex);

Sorry, I fail to see what is wrong with this line?

@tfurlong
Author

Thanks for the quick reply! I've personally tried 2 different PCs with Quest 1 & 2 and an official and an unofficial Link cable. I also had my colleagues try it out on their Quest & Rift configurations. It seems to be the most pronounced on the Quest 2, which is why I was originally thinking it was due to the motion prediction code. It is pretty obvious with just the cow.osgt model, no need to try anything larger. I can try a different threading model, but I agree with the statement in #100 that SingleThreaded will have the lowest app-to-draw latency.

As to the call to ovr_GetPredictedDisplayTime, it is harmless given the call order of beginFrame and updatePose, but it temporarily changes the meaning of m_sensorSampleTime to be the predicted display time instead of the time stamp of the sampled pose.
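For comparison, the Oculus PC SDK samples record the sensor sample time from the pose query itself rather than from the predicted display time (a sketch only; the variable names are placeholders, not this library's members, and the offset parameter's type varies between SDK versions):

```cpp
// Sketch after the OculusRoomTiny samples: ovr_GetEyePoses writes back the
// time at which the poses were actually sampled, which is what the
// compositor wants in the layer's SensorSampleTime field for timewarp.
double sensorSampleTime = 0.0;
ovrPosef eyeRenderPoses[2];
ovr_GetEyePoses(session, frameIndex, ovrTrue,
                hmdToEyeOffset, eyeRenderPoses, &sensorSampleTime);
```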

@bjornblissing
Owner

Well, what I actually meant is that I have seen issues where the cable connecting the computer to a standard monitor affects the frame rate inside the HMD. In my case, it was related to connecting an HDMI monitor to a DisplayPort output via a passive converter. This locked the frame rate inside the HMD to the monitor's frame rate, and the result was severe stuttering due to missed frames in the compositor.

I have started to look at the latest example code for OpenGL, and it seems like there is a change in how time-warp is handled. This might be the reason for your issues. I will see if I have time to update my code in a similar fashion.

@bjornblissing
Owner

I have just pushed an update which contains support for ovrLayerType_EyeFovDepth. This update should handle the new timewarp features better. Hopefully this will solve your issues.

@tfurlong
Author

On the cable connection question, I understand now. One PC has a direct DisplayPort connection to a monitor, and the other a direct HDMI connection. Both see the same issue. I just tried modifying NVIDIA vsync settings on a per-application basis; the behavior changed depending on whether vsync was enabled, but neither setting eliminated the stuttering.

I tried the latest code, and unfortunately it is worse for my configuration. The stuttering is still present, and it no longer goes away when samples=0. I rolled back to the November commit and verified that the best configuration is with that code base and samples=0.

@bjornblissing
Owner

bjornblissing commented Dec 11, 2020

Ok. So moving to the depth aware timewarp was not the solution to your issues. I guess I will need to provide support for both the traditional warp and the depth aware version.

Question: Have you tried to enable any of the Oculus debug screens to try to find any clues to your performance issues?

@bjornblissing
Owner

bjornblissing commented Dec 11, 2020

Suggestion: One thing you can try is to move the ovrserver_x64.exe to high priority using the Windows task manager. That seems to help some users who experience stutters and low frame rates on the Quest using the Link cable.
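For anyone who prefers scripting that change, the same priority bump can be made from an elevated Windows command prompt (sketch; requires the Oculus runtime service to be running):

```
wmic process where name="ovrserver_x64.exe" CALL setpriority "high priority"
```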

@bjornblissing
Owner

Another question: Do you see the same type of issues in the official samples from Oculus, i.e. OculusRoomTiny_GL?

@tfurlong
Author

No, I don't see these issues in any other app.

@bjornblissing
Owner

No, I don't see these issues in any other app.

Strange! The OculusRoomTiny_GL should be using pretty much exactly the same OpenGL calls as this library, so I would expect them to perform similarly... But apparently not...

@tfurlong
Author

I noticed that too, especially with your recent updates to use the depth texture. Is it something on the OSG side? Something that adds additional or inconsistent delay that the Oculus library doesn't see?

@bjornblissing
Owner

And the problems are only present (or at least detectable) on the Quest/Link combination.
I cannot see any issues when running with the Oculus Rift S.

@tfurlong
Author

Yes, only for Quest/Link. I've tried on a Rift CV1, and a team member has tried on a Rift S. That's the reason I was thinking it was related to the time warp functionality. The Quest/Link combo has to encode video frames and push them out over USB, which means it relies more heavily on prediction than a Rift would.

@bjornblissing
Owner

Did you try the suggestion of raising the process priority on the ovrserver_x64.exe?

@tfurlong
Author

Just did, did not seem to have an effect.

@tfurlong
Author

Is it possible this issue comes from which version of OSG I am using? We are currently at version 3.4.1.

@bjornblissing
Owner

Just did, did not seem to have an effect.

Ok. So then we can rule out the ovrserver_x64.exe process.

Is it possible this issue comes from which version of OSG I am using? We are currently at version 3.4.1.

It might be the reason, but I don't really see why it should be an issue. However, I am running 3.6.4 in my development environment.
