Reed Solomon FEC seems rather ineffective when using x26X compression #361

Open
alatteri opened this issue Nov 17, 2023 · 14 comments
@alatteri

alatteri commented Nov 17, 2023

Hello,

In my quest to lower latency, I am testing the various built-in FEC functions in UG. Currently I am wrapping the output of UG with srt-live-transmit, but that inherently adds 3x (or more) RTT to the glass-to-glass latency. In my testing, Reed-Solomon seems to provide basically zero resilience against even less than 1% packet loss, while LDGM was highly resilient even at 10% loss.

I tested both x264 and x265; x265 was much worse than x264, basically unusable.

Encoder - Ubuntu Server 23.10
Input is HD SDI via BMD uv -t decklink:codec=R12L -c libavcodec:encoder=libx26X:crf=22
Client - Ubuntu Desktop 23.10 uv -d vulkan_sdl2
I am testing on a local LAN with an avg ping of about 0.8ms, and no packet loss.
I am simulating packet loss using sudo iptables -A INPUT -i enp86s0 -m statistic --mode random --probability N -j DROP. (must do sudo ufw disable for this to work)

I've tried several permutations of RS FEC, and all of them seem to provide no resiliency with even a 1% (N=0.01) packet loss.

-f V:rs:175:250
-f V:rs:200:240
-f V:rs:200:250

encoder log:

[DeckLink capture] 121 frames in 5.04129 seconds = 24.0018 FPS
Receiver of 0xaf938d54 reports RTT=30410 usec, loss 0.78% (out of 2262 packets)
[DeckLink capture] 121 frames in 5.04164 seconds = 24.0001 FPS
Receiver of 0xaf938d54 reports RTT=34591 usec, loss 1.17% (out of 3164 packets)
[DeckLink capture] 120 frames in 5.0004 seconds = 23.9981 FPS
Receiver of 0xaf938d54 reports RTT=4592 usec, loss 0.78% (out of 2118 packets)
[DeckLink capture] 121 frames in 5.04142 seconds = 24.0012 FPS
Receiver of 0xaf938d54 reports RTT=9445 usec, loss 0.78% (out of 2679 packets)
[DeckLink capture] 121 frames in 5.04333 seconds = 23.9921 FPS
Receiver of 0xaf938d54 reports RTT=49423 usec, loss 0.78% (out of 3427 packets)
[DeckLink capture] 121 frames in 5.04035 seconds = 24.0063 FPS

receiver log:

[lavc h264 @ 0x7f5724005ac0] corrupted macroblock 6 40 (total_coeff=-1)
[lavc h264 @ 0x7f5724005ac0] error while decoding MB 6 40
[lavc h264 @ 0x7f5724005ac0] mb_type -1094995534 in P slice too large at 107 29
[lavc h264 @ 0x7f5724005ac0] error while decoding MB 107 29
Video dec stats (cumulative): 1489 total / 1435 disp / 54 drop / 304 corr / 0 miss FEC noerr/OK/NOK: 1144/41/304
[lavc h264 @ 0x7f5724005ac0] cbp too large (3199971767) at 71 34
[lavc h264 @ 0x7f5724005ac0] error while decoding MB 71 34
[lavc h264 @ 0x7f5724005ac0] negative number of zero coeffs at 119 35
[lavc h264 @ 0x7f5724005ac0] error while decoding MB 119 35
[VULKAN_SDL2] 121 frames in 5.04106 seconds = 24.0029 FPS
SSRC 0xaf938d54: 3044/3072 packets received (99.0885%), 28 lost, max loss 1
[lavc h264 @ 0x7f5724005ac0] cbp too large (3199971767) at 88 24
[lavc h264 @ 0x7f5724005ac0] error while decoding MB 88 24
[lavc h264 @ 0x7f5724005ac0] corrupted macroblock 86 50 (total_coeff=-1)
[lavc h264 @ 0x7f5724005ac0] error while decoding MB 49 40
[lavc h264 @ 0x7f5724005ac0] corrupted macroblock 86 40 (total_coeff=-1)
[lavc h264 @ 0x7f5724005ac0] error while decoding MB 86 40
[VULKAN_SDL2] 119 frames in 5.03892 seconds = 23.6162 FPS
[lavc h264 @ 0x7f5724005ac0] corrupted macroblock 88 28 (total_coeff=-1)
[lavc h264 @ 0x7f5724005ac0] error while decoding MB 88 28
SSRC 0xaf938d54: 1900/1920 packets received (98.9583%), 20 lost, max loss 1
[lavc h264 @ 0x7f5724005ac0] negative number of zero coeffs at 98 29
[lavc h264 @ 0x7f5724005ac0] error while decoding MB 98 29
[lavc h264 @ 0x7f5724005ac0] cbp too large (3199971767) at 98 40

But if I use LDGM I can sustain 10% (N=0.1) loss reliably, with just a few hits every now and then. Of course 10% loss is an exaggerated scenario.
-f V:LDGM:200:250:5
encoder log:

[DeckLink capture] 120 frames in 5.00002 seconds = 23.9999 FPS
Receiver of 0x2a438faf reports RTT=35049 usec, loss 10.16% (out of 3189 packets)
Receiver of 0x2a438faf reports RTT=2853 usec, loss 9.77% (out of 2563 packets)
[DeckLink capture] 121 frames in 5.04167 seconds = 24 FPS
Receiver of 0x2a438faf reports RTT=19256 usec, loss 9.38% (out of 2611 packets)
[DeckLink capture] 120 frames in 5.00002 seconds = 23.9999 FPS
Receiver of 0x2a438faf reports RTT=2441 usec, loss 10.16% (out of 6802 packets)
[DeckLink capture] 120 frames in 5.00007 seconds = 23.9997 FPS

receiver log:

Video dec stats (cumulative): 744 total / 694 disp / 50 drop / 3 corr / 0 miss FEC noerr/OK/NOK: 37/704/3
[VULKAN_SDL2] 121 frames in 5.04012 seconds = 24.0074 FPS
SSRC 0x2a438faf: 2791/3072 packets received (90.8529%), 281 lost, max loss 3
[VULKAN_SDL2] 120 frames in 5.0024 seconds = 23.9885 FPS
SSRC 0x2a438faf: 4637/5120 packets received (90.5664%), 483 lost, max loss 3
[VULKAN_SDL2] 118 frames in 5.0015 seconds = 23.5929 FPS
SSRC 0x2a438faf: 2508/2816 packets received (89.0625%), 308 lost, max loss 3
[VULKAN_SDL2] 120 frames in 5.04039 seconds = 23.8077 FPS
SSRC 0x2a438faf: 3210/3584 packets received (89.5647%), 374 lost, max loss 4
[VULKAN_SDL2] 121 frames in 5.03794 seconds = 24.0178 FPS
SSRC 0x2a438faf: 3353/3712 packets received (90.3287%), 359 lost, max loss 3
[VULKAN_SDL2] 118 frames in 5.04208 seconds = 23.4031 FPS
SSRC 0x2a438faf: 927/1024 packets received (90.5273%), 97 lost, max loss 3
Video dec stats (cumulative): 1489 total / 1431 disp / 58 drop / 11 corr / 0 miss FEC noerr/OK/NOK: 165/1313/11

I have not tested this with audio, nor have I analyzed how much the additional FEC data adds to the total stream.
I have also not tried to optimize the LDGM settings, so they could be wildly inefficient.

Using percentage-based coverage with LDGM seems to NOT work very well either; smaller percentages produce the same errors.
-f V:LDGM:10%

LDGM: Choosing maximal loss 10.00 percent.
LDGM: Chosen LDGM setting for frame size that is 258.834% higher than your frame size.
You may wish to set the parameters manually.

[control] Unable to initalize FEC!
@sogorman

@alatteri Thanks for all the legwork on this. I am interested and invested in this as well: for our application, FEC is a much better solution than, say, SRT, since we are trying to keep glass-to-glass latency to a minimum. We use H.265 with RS and have had mixed results, as your logs show.

I am going to spend some time this week with tc to simulate jitter and random loss and see how LDGM performs.

@MartinPulec
Collaborator

First remark: <m> for LDGM is not the same as <n> for Reed-Solomon (zfec); rather, n = m + k. So for R-S, '200:240' gives you 20% redundancy, while '200:240:5' for LDGM gives you 120%! (I've fixed the ratio in the wiki – redundancy was given as k/m but should be m/k.)
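The difference in parameter semantics can be sanity-checked with a tiny calculation (a sketch only; the helper names are hypothetical, not UltraGrid's actual parser):

```python
# Redundancy implied by the FEC parameters, per the semantics described
# above.  Sketch only -- these helpers are hypothetical, not UltraGrid API.

def rs_redundancy(k: int, n: int) -> float:
    """Reed-Solomon 'k:n': k data symbols out of n total -> (n-k)/k."""
    return (n - k) / k

def ldgm_redundancy(k: int, m: int) -> float:
    """LDGM 'k:m:c': m parity symbols added to k data (n = k + m) -> m/k."""
    return m / k

print(f"rs:200:240     -> {rs_redundancy(200, 240):.0%} redundancy")
print(f"ldgm:200:240:5 -> {ldgm_redundancy(200, 240):.0%} redundancy")
```

The same digits thus mean a 6x difference in redundancy, which explains why the LDGM run above held up at 10% loss while the R-S runs did not.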

For further evaluation, I'd suggest fixing the input format to testcard:s=128x96 (which gives approximately the same bitrate as your use case, deduced from your data) and MTU=1500.

Commands used were then (both using 20% redundancy):

uv -t testcard:size=128x128 -d dummy -m 1500 -f rs:200:240
uv -t testcard:size=128x96 -d dummy -m 1500 -f ldgm:256:64:5

For 10% loss, neither was able to reconstruct everything, but R-S successfully reconstructed 686/774 frames (plus another 51 incomplete frames) while LDGM managed only 477/774 (taking the first Video dec stats report).

With 5% loss it was 759/774 for R-S and 722/744 for LDGM.

@MartinPulec
Collaborator

Using percentage based coverage with LDGM seems to NOT work very well either. Smaller percentage amounts have same errors.
-f V:LDGM:10%

This is correct; setting the right values for LDGM is actually a bit tricky. Because of that, the percent value was defined to simplify the setting for users by selecting an eligible "preset" for a given frame size.

But as far as I know, LDGM works well only when there is a large number of packets per frame; it is effective for uncompressed video and JPEG but not suitable for H.264/HEVC, which is why there are no presets.

Although the values for LDGM may look similar to those for R-S (aside from the semantic difference), they are actually not. For R-S, the actual values do not matter much, just their ratio (the redundancy). R-S also works very well because with 25% redundancy per frame, it really does repair the frame whenever 80% or more of its packets are received (on a per-frame basis; the iptables or netem 10% "dropper" will certainly drop more than 20% for some frames). For LDGM, neither holds. First, the actual numbers matter: e.g. 1024:256 can give you very different results than 256:64 (which is exactly why the percent syntax was added). Second, it does not guarantee that 25% redundancy catches a 20% loss (that holds statistically for high packet counts, but not for a low-bitrate stream).

TL;DR: Unless you use high-bitrate streams, for which R-S is slow, use R-S. In terms of correction strength, I believe LDGM is at best as good as R-S, since R-S should be an optimal erasure code, and for lower-bitrate streams even that is a very optimistic assumption; LDGM is also more elaborate to work with.
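The per-frame loss variance mentioned above can be illustrated with a quick Monte-Carlo sketch (assumptions: independent random drops, and an idealized erasure code that recovers a frame iff no more packets are lost than there are parity packets):

```python
import random

def frame_loss_rate(n_packets, n_parity, p_drop, trials=50_000, seed=1):
    """Fraction of frames an ideal erasure code cannot recover: a frame
    fails when more than n_parity of its n_packets are dropped."""
    rng = random.Random(seed)
    failed = 0
    for _ in range(trials):
        dropped = sum(rng.random() < p_drop for _ in range(n_packets))
        if dropped > n_parity:
            failed += 1
    return failed / trials

# 25% redundancy against a 10% average drop rate: frames with only a few
# packets blow their parity budget far more often than frames with many.
for n in (4, 20, 100):
    print(n, frame_loss_rate(n, n // 4, 0.10))
```

This matches the observation that FEC with the same nominal redundancy behaves much better on high-bitrate (many packets per frame) streams.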

I'll try to improve the wiki/help according to this information.

@alatteri
Author

alatteri commented Dec 6, 2023

Hello Martin.

I've tried 3 things.
uv -t testcard:size=128x128 -m 1316 -f V:rs:200:240 10.55.118.41
Is stable even through up to 10% loss: sudo iptables -A INPUT -i enp86s0 -m statistic --mode random --probability .1 -j DROP

[2023-12-06 13:03:19] Video dec stats (cumulative): 3107 total / 2809 disp / 281 drop / 17 corr / 17 miss FEC noerr/OK/NOK: 3071/2/17
[2023-12-06 13:03:19] Pbuf: total 5073/5120 packets received (99.08203%).
[2023-12-06 13:03:20] SSRC 0x222c0ffa: 3899/4352 packets received (89.5910%), 453 lost, max loss 3
[2023-12-06 13:03:22] [VULKAN_SDL2] 119 frames in 5.00003 seconds = 23.7998 FPS
[2023-12-06 13:03:25] SSRC 0x222c0ffa: 3997/4480 packets received (89.2188%), 483 lost, max loss 3
[2023-12-06 13:03:27] [VULKAN_SDL2] 121 frames in 5.04154 seconds = 24.0006 FPS
[2023-12-06 13:03:30] SSRC 0x222c0ffa: 3910/4352 packets received (89.8438%), 442 lost, max loss 4
[2023-12-06 13:03:31] Video dec stats (cumulative): 775 total / 744 disp / 31 drop / 67 corr / 0 miss FEC noerr/OK/NOK: 273/435/67
[2023-12-06 13:03:32] [VULKAN_SDL2] 121 frames in 5.0416 seconds = 24.0003 FPS
[2023-12-06 13:03:35] SSRC 0x222c0ffa: 4040/4480 packets received (90.1786%), 440 lost, max loss 4

uv -t testcard:size=128x128 -c libavcodec:encoder=libx265:crf=22 -m 1316 -f V:rs:200:240 10.55.118.41
is unstable even with only 1% loss sudo iptables -A INPUT -i enp86s0 -m statistic --mode random --probability .01 -j DROP

[2023-12-08 11:17:18] Video dec stats (cumulative): 775 total / 573 disp / 200 drop / 0 corr / 2 miss FEC noerr/OK/NOK: 773/0/0
[2023-12-08 11:17:19] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 548
[2023-12-08 11:17:19] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 548
[2023-12-08 11:17:19] [VULKAN_SDL2] 119 frames in 5.00005 seconds = 23.7997 FPS
[2023-12-08 11:17:19] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 555
[2023-12-08 11:17:19] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 555
[2023-12-08 11:17:20] SSRC 0x514c65f2: 128/128 packets received (100.0000%), 0 lost, max loss 0
[2023-12-08 11:17:24] [VULKAN_SDL2] 121 frames in 5.04179 seconds = 23.9994 FPS
[2023-12-08 11:17:25] SSRC 0x514c65f2: 252/256 packets received (98.4375%), 4 lost, max loss 1
[2023-12-08 11:17:28] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 779
[2023-12-08 11:17:28] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 779
[2023-12-08 11:17:29] [VULKAN_SDL2] 121 frames in 5.04125 seconds = 24.002 FPS
[2023-12-08 11:17:30] SSRC 0x514c65f2: 127/128 packets received (99.2188%), 1 lost, max loss 1
[2023-12-08 11:17:34] [VULKAN_SDL2] 120 frames in 5.0003 seconds = 23.9986 FPS
[2023-12-08 11:17:35] SSRC 0x514c65f2: 126/128 packets received (98.4375%), 2 lost, max loss 1
[2023-12-08 11:17:36] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 970
[2023-12-08 11:17:36] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 970
[2023-12-08 11:17:39] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1043
[2023-12-08 11:17:39] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1043
[2023-12-08 11:17:39] [VULKAN_SDL2] 120 frames in 5.00002 seconds = 23.9999 FPS
[2023-12-08 11:17:40] SSRC 0x514c65f2: 128/128 packets received (100.0000%), 0 lost, max loss 0
[2023-12-08 11:17:41] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1093
[2023-12-08 11:17:41] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1093
[2023-12-08 11:17:44] [VULKAN_SDL2] 120 frames in 5.00012 seconds = 23.9994 FPS
[2023-12-08 11:17:45] SSRC 0x514c65f2: 127/128 packets received (99.2188%), 1 lost, max loss 1
[2023-12-08 11:17:46] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1224
[2023-12-08 11:17:46] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1224
[2023-12-08 11:17:49] Video dec stats (cumulative): 1551 total / 1317 disp / 226 drop / 1 corr / 8 miss FEC noerr/OK/NOK: 1540/2/1
[2023-12-08 11:17:49] [VULKAN_SDL2] 120 frames in 5.04162 seconds = 23.8019 FPS
[2023-12-08 11:17:50] SSRC 0x514c65f2: 254/256 packets received (99.2188%), 2 lost, max loss 1
[2023-12-08 11:17:50] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1330
[2023-12-08 11:17:50] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1330
[2023-12-08 11:17:54] [VULKAN_SDL2] 121 frames in 5.04164 seconds = 24.0001 FPS
[2023-12-08 11:17:55] SSRC 0x514c65f2: 127/128 packets received (99.2188%), 1 lost, max loss 1
[2023-12-08 11:17:55] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1457
[2023-12-08 11:17:55] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1457
[2023-12-08 11:17:59] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1539
[2023-12-08 11:17:59] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1539
[2023-12-08 11:17:59] [VULKAN_SDL2] 120 frames in 5.00015 seconds = 23.9993 FPS
[2023-12-08 11:18:00] SSRC 0x514c65f2: 127/128 packets received (99.2188%), 1 lost, max loss 1
[2023-12-08 11:18:04] [VULKAN_SDL2] 121 frames in 5.04164 seconds = 24.0001 FPS
[2023-12-08 11:18:05] SSRC 0x514c65f2: 127/128 packets received (99.2188%), 1 lost, max loss 1
[2023-12-08 11:18:06] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1709
[2023-12-08 11:18:06] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1709

uv -t testcard:size=1920x1080 -c libavcodec:encoder=libx265:crf=22 -m 1316 -f V:rs:200:240 10.55.118.41
is unstable even with only 1% loss sudo iptables -A INPUT -i enp86s0 -m statistic --mode random --probability .01 -j DROP

[2023-12-08 11:20:09] Video dec stats (cumulative): 981 total / 770 disp / 203 drop / 6 corr / 8 miss FEC noerr/OK/NOK: 967/0/6
[2023-12-08 11:20:09] Pbuf: total 1517/1536 packets received (98.76302%).
[2023-12-08 11:20:10] [VULKAN_SDL2] 120 frames in 5.00003 seconds = 23.9999 FPS
[2023-12-08 11:20:11] SSRC 0x4681aaf9: 128/128 packets received (100.0000%), 0 lost, max loss 0
[2023-12-08 11:20:12] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 440
[2023-12-08 11:20:12] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 440
[2023-12-08 11:20:14] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 510
[2023-12-08 11:20:15] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 510
[2023-12-08 11:20:15] Video dec stats (cumulative): 776 total / 573 disp / 197 drop / 5 corr / 6 miss FEC noerr/OK/NOK: 765/0/5
[2023-12-08 11:20:15] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 519
[2023-12-08 11:20:15] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 519
[2023-12-08 11:20:15] [VULKAN_SDL2] 120 frames in 5.00167 seconds = 23.992 FPS
[2023-12-08 11:20:16] SSRC 0x4681aaf9: 126/128 packets received (98.4375%), 2 lost, max loss 1
[2023-12-08 11:20:20] [VULKAN_SDL2] 120 frames in 5.04018 seconds = 23.8087 FPS
[2023-12-08 11:20:21] SSRC 0x4681aaf9: 253/256 packets received (98.8281%), 3 lost, max loss 1
[2023-12-08 11:20:21] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 671
[2023-12-08 11:20:21] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 671
[2023-12-08 11:20:22] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 688
[2023-12-08 11:20:22] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 688
[2023-12-08 11:20:25] [VULKAN_SDL2] 121 frames in 5.04155 seconds = 24.0006 FPS
[2023-12-08 11:20:26] SSRC 0x4681aaf9: 128/128 packets received (100.0000%), 0 lost, max loss 0
[2023-12-08 11:20:26] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 805
[2023-12-08 11:20:26] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 805
[2023-12-08 11:20:30] [VULKAN_SDL2] 120 frames in 5.00019 seconds = 23.9991 FPS
[2023-12-08 11:20:31] SSRC 0x4681aaf9: 126/128 packets received (98.4375%), 2 lost, max loss 1
[2023-12-08 11:20:35] [VULKAN_SDL2] 121 frames in 5.04161 seconds = 24.0003 FPS
[2023-12-08 11:20:36] SSRC 0x4681aaf9: 255/256 packets received (99.6094%), 1 lost, max loss 1
[2023-12-08 11:20:36] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1058
[2023-12-08 11:20:36] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1058
[2023-12-08 11:20:38] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1110
[2023-12-08 11:20:39] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1110
[2023-12-08 11:20:39] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1114
[2023-12-08 11:20:39] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1114
[2023-12-08 11:20:39] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1121
[2023-12-08 11:20:39] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1121
[2023-12-08 11:20:40] [VULKAN_SDL2] 121 frames in 5.04147 seconds = 24.0009 FPS
[2023-12-08 11:20:41] SSRC 0x4681aaf9: 128/128 packets received (100.0000%), 0 lost, max loss 0
[2023-12-08 11:20:43] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1217
[2023-12-08 11:20:43] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1217
[2023-12-08 11:20:45] [VULKAN_SDL2] 120 frames in 5.00018 seconds = 23.9991 FPS
[2023-12-08 11:20:46] SSRC 0x4681aaf9: 124/128 packets received (96.8750%), 4 lost, max loss 1
[2023-12-08 11:20:46] Video dec stats (cumulative): 1552 total / 1319 disp / 222 drop / 8 corr / 11 miss FEC noerr/OK/NOK: 1532/1/8
[2023-12-08 11:20:50] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1396
[2023-12-08 11:20:50] [lavc hevc @ 0x7fdc38018440] Could not find ref with POC 1396
[2023-12-08 11:20:50] [VULKAN_SDL2] 119 frames in 5.0002 seconds = 23.7991 FPS
[2023-12-08 11:20:51] SSRC 0x4681aaf9: 254/256 packets received (99.2188%), 2 lost, max loss 1


@alatteri changed the title from "Reed Solomon FEC seems rather ineffective" to "Reed Solomon FEC seems rather ineffective when using x26X compression" on Dec 7, 2023
@MartinPulec
Collaborator

Hi Alan, wishing all the best to the new year.

I was able to reproduce the behavior you described, but let me explain how FEC in UG works. To get the numbers below, run with -VV and you'll get the frame sizes:

  1. exactly 32768 B
  2. 25-3000 B, but most of the times 25B
    sizes B: 4753 23 21 22 201 353 394 433 176 62 24 64 25 29 25 25 25 25 49 32 3188 922 1261 853 33 25 25 25 33 25 25 25 25 25 25 30 25 25 25 25 3202 865 1017 833 25 1361 203 41 68 25 658 76 125 25 25 25 25 68 25 40 3063 1244 1299 928 61 62 28 1370 276 79 25 67 72 46 25 25 25 28 45 25 3429 895 1177 769 25 25 25 25 25 36 25 25 50 25 25 25 25 47 36 25
  3. 158-2800 B, usually 500-600B
    sizes B: 3171 166 944 169 158 158 158 158 158 158 158 158 158 158 158 158 158 158 158 158 2797 515 504 1770 472 549 1923 558 493 521 515 521 345 1327 523 535 1961 569 568 568 2765 517 499 1812 525 602 1999 559 373 570 534 553 519 1380 582 569 1934 595 589 594 2829 537 498 1769 361 569 1950 536 523 541 547 786 475 1344 564 566 1999 556 566 545 2681 524 487 1735 526 557 1943 849 476 548 552 555 503 1312 535 536 1659 581 569 559

Currently, multiple generated FEC symbols are assembled into a single packet if they fit. So losing one packet in cases 2 and 3 basically means losing the entire frame!

It is worth noting that testcard content is artificial and the default pattern compresses quite easily. You'd get entirely different results with a real picture or, e.g., testcard:pattern=noise.
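The packing effect above can be made concrete with a rough packet count per frame (assumptions: about 1400 B of usable payload per 1500-byte-MTU packet, 20% FEC redundancy, and data plus FEC symbols packed together; this is a sketch, not UltraGrid's actual packetizer):

```python
import math

def packets_per_frame(frame_bytes, redundancy=0.2, payload=1400):
    """Rough packet count after adding FEC (ignores exact header sizes)."""
    return max(1, math.ceil(frame_bytes * (1 + redundancy) / payload))

# The ~25 B and ~600 B frames of cases 2 and 3 fit into a single packet
# even with FEC attached, so losing that one packet loses the whole
# frame.  The 32768 B frames of case 1 spread over dozens of packets,
# giving the erasure code something to work with.
for size in (25, 600, 32768):
    print(size, packets_per_frame(size))
```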

@MartinPulec self-assigned this Jan 4, 2024
@alatteri
Author

alatteri commented Jan 4, 2024

Hi Martin and team,

Happy New Year as well. Hope you got a relaxing break and nice time off with your families.

I don't totally understand the above response, but that is just my lack of technical knowledge. I have tested with real-world footage and find it still quite fragile, especially when the image fades to black during a transition, or with anything that has a lot of flat colors such as title cards and graphics. My assumption is that the content becomes highly compressible, which leads to small "frame sizes", and then for some reason the FEC doesn't work well. If I recall, in my tests uncompressed footage was highly resilient with FEC.

Is there anything that can be done to improve this situation? Currently I wrap the output of UG in SRT, where the minimum latency multiple is 3x RTT, with up to 4x recommended. When going long distances, accounting for the RTT multiplier, the whole process can add several additional frames' worth of latency. Even here locally, with an RTT of around 15 ms, it adds an additional frame of latency at 24 fps.

Thank you again for everything you do for the community. This project is such a great resource.

Alan

MartinPulec added a commit that referenced this issue Jan 5, 2024
The symbol size is printed only once (or, more precisely, a few times, because it is guarded by a thread-local variable and the sending may pick a different runner). This, however, doesn't give representative numbers when frame sizes differ (== compressed input), because then the FEC symbol sizes may differ as well, so print it unconditionally, at least at the debug2 log level.

refers to GH-361
@MartinPulec
Collaborator

Hi Alan,

I don't totally understand the above response

It is not your fault; the behavior is not so straightforward, and I am perhaps not describing it well enough.

the content becomes highly compressible which leads to small "frame sizes" and then for some reason the FEC doesn't work well

exactly

Is there anything that can be done to improve this situation?

From the UG perspective, it depends. I've created commit b257d13 (a build including it can be taken here; I hope you'll be permitted to download it), which duplicates the frame's first packet. It is just an ad hoc improvement, but I believe it could help the situation when there is a low number of packets per video frame.

I don't currently want to put it into the main repository, because we will soon make a new release and I don't want it included yet. Anyway, I'd be glad for any feedback.

The situation for FEC is quite different in these scenarios: 1) a low packet count per frame; 2) tens of packets; 3) hundreds of packets. So it will also be useful for us to know your exact scenario.

@alatteri
Author

alatteri commented Jan 19, 2024

Hi Martin,

Thank you. I will not be able to test this for at least 10 days, maybe a bit more. I will definitely give you feedback when I can.

Thanks,
Alan

@alatteri
Author

Hi Martin,

I have not yet had an opportunity to test your test version, but I'm just thinking: would an easy solution be to use small values for -m or -l, so that there are more small packets allowing FEC recovery, instead of fewer big packets where even a single loss isn't recoverable? It seems like an inefficient way to transmit data, but maybe an easy one?
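A back-of-the-envelope sketch of that trade-off, under assumed per-packet header overhead (IPv4 + UDP + RTP, roughly 40 B; real numbers differ with extensions): a smaller MTU yields more packets per frame for the FEC to work across, at a modest bandwidth cost.

```python
import math

HDR = 20 + 8 + 12  # assumed IPv4 + UDP + RTP header bytes per packet

def split_frame(frame_bytes, mtu):
    """Packets needed for one frame and the resulting header overhead."""
    payload = mtu - HDR
    n = math.ceil(frame_bytes / payload)
    overhead = n * HDR / frame_bytes
    return n, overhead

# A hypothetical ~3 kB compressed frame at various MTUs.
for mtu in (8900, 1500, 512):
    n, ov = split_frame(3000, mtu)
    print(f"MTU {mtu:5}: {n:2} packets, {ov:.1%} header overhead")
```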

@armelvil

I just tried forcing the MTU to 512, and it does seem to make a difference (also re: the discussion I just opened about controlling how drops are rendered).

@alatteri
Author

alatteri commented Feb 5, 2024

Hi Martin,

I got a chance to test rev bcccf5c built Jan 19 2024 10:50:34

It is definitely much better. One thing that I found interesting, and counter to my post above: using a large MTU helped stability a great deal, even though my network is not configured for jumbo packets and all hosts use the standard Ethernet MTU. Both tests were run with sudo iptables -A INPUT -i enp86s0 -m statistic --mode random --probability .02 -j DROP

See below.

./UltraGrid-continuous-x86_64.AppImage -t testcard:size=1920x1080 -c libavcodec:encoder=libx265:crf=22 -m 8900 -f V:rs:200:240 10.55.118.40


SSRC 0x8866371f: 254/256 packets received (99.2188%), 2 lost, max loss 1
[GL] 126 frames in 5.04021 seconds = 24.9989 FPS
Video dec stats (cumulative): 2325 total / 2304 disp / 20 drop / 0 corr / 1 miss FEC noerr/OK/NOK: 2324/0/0
SSRC 0x8866371f: 252/256 packets received (98.4375%), 4 lost, max loss 1
[GL] 126 frames in 5.03641 seconds = 25.0178 FPS
SSRC 0x8866371f: 255/256 packets received (99.6094%), 1 lost, max loss 1
[GL] 125 frames in 5.00258 seconds = 24.9871 FPS
SSRC 0x8866371f: 253/256 packets received (98.8281%), 3 lost, max loss 1
[GL] 126 frames in 5.04071 seconds = 24.9965 FPS
SSRC 0x8866371f: 254/256 packets received (99.2188%), 2 lost, max loss 1
[GL] 126 frames in 5.04109 seconds = 24.9946 FPS
SSRC 0x8866371f: 249/256 packets received (97.2656%), 7 lost, max loss 1
[GL] 126 frames in 5.03761 seconds = 25.0118 FPS
SSRC 0x8866371f: 254/256 packets received (99.2188%), 2 lost, max loss 1
[GL] 125 frames in 5.00158 seconds = 24.9921 FPS
Video dec stats (cumulative): 3100 total / 3079 disp / 20 drop / 0 corr / 1 miss FEC noerr/OK/NOK: 3099/0/0
SSRC 0x8866371f: 252/256 packets received (98.4375%), 4 lost, max loss 1
[GL] 126 frames in 5.03736 seconds = 25.0131 FPS
SSRC 0x8866371f: 253/256 packets received (98.8281%), 3 lost, max loss 1
[GL] 125 frames in 5.00139 seconds = 24.9931 FPS
SSRC 0x8866371f: 123/128 packets received (96.0938%), 5 lost, max loss 1
[GL] 126 frames in 5.04147 seconds = 24.9927 FPS
SSRC 0x8866371f: 249/256 packets received (97.2656%), 7 lost, max loss 1
[GL] 126 frames in 5.0385 seconds = 25.0075 FPS
SSRC 0x8866371f: 250/256 packets received (97.6562%), 6 lost, max loss 1
[GL] 125 frames in 5.00122 seconds = 24.9939 FPS
SSRC 0x8866371f: 248/256 packets received (96.8750%), 8 lost, max loss 1
[GL] 126 frames in 5.03988 seconds = 25.0006 FPS
SSRC 0x8866371f: 249/256 packets received (97.2656%), 7 lost, max loss 1
Video dec stats (cumulative): 3876 total / 3855 disp / 20 drop / 0 corr / 1 miss FEC noerr/OK/NOK: 3875/0/0
[GL] 126 frames in 5.03953 seconds = 25.0023 FPS

./UltraGrid-continuous-x86_64.AppImage -t testcard:size=1920x1080 -c libavcodec:encoder=libx265:crf=22 -f V:rs:200:240 10.55.118.40

SSRC 0xb6f265e6: 252/256 packets received (98.4375%), 4 lost, max loss 1
[GL] 126 frames in 5.03744 seconds = 25.0127 FPS
SSRC 0xb6f265e6: 252/256 packets received (98.4375%), 4 lost, max loss 1
[GL] 125 frames in 5.00064 seconds = 24.9968 FPS
[lavc hevc @ 0x7f96940b5940] Could not find ref with POC 446
    Last message repeated 1 times
SSRC 0xb6f265e6: 255/256 packets received (99.6094%), 1 lost, max loss 1
[lavc hevc @ 0x7f96940b5940] Could not find ref with POC 472
    Last message repeated 1 times
[GL] 123 frames in 5.00089 seconds = 24.5956 FPS
SSRC 0xb6f265e6: 380/384 packets received (98.9583%), 4 lost, max loss 2
[GL] 126 frames in 5.03908 seconds = 25.0046 FPS
SSRC 0xb6f265e6: 251/256 packets received (98.0469%), 5 lost, max loss 1
[lavc hevc @ 0x7f96940b5940] Could not find ref with POC 726
    Last message repeated 1 times
[GL] 124 frames in 5.00039 seconds = 24.7981 FPS
Video dec stats (cumulative): 775 total / 750 disp / 24 drop / 4 corr / 1 miss FEC noerr/OK/NOK: 770/0/4
SSRC 0xb6f265e6: 248/256 packets received (96.8750%), 8 lost, max loss 1
[GL] 126 frames in 5.04183 seconds = 24.9909 FPS
SSRC 0xb6f265e6: 380/384 packets received (98.9583%), 4 lost, max loss 1
[GL] 126 frames in 5.03674 seconds = 25.0162 FPS
SSRC 0xb6f265e6: 247/256 packets received (96.4844%), 9 lost, max loss 1
[lavc hevc @ 0x7f96940b5940] Could not find ref with POC 1136
    Last message repeated 1 times
[GL] 124 frames in 5.00151 seconds = 24.7925 FPS
SSRC 0xb6f265e6: 252/256 packets received (98.4375%), 4 lost, max loss 1
[GL] 126 frames in 5.03953 seconds = 25.0024 FPS
[lavc hevc @ 0x7f96940b5940] Could not find ref with POC 1303
    Last message repeated 1 times
[lavc hevc @ 0x7f96940b5940] Could not find ref with POC 1340
    Last message repeated 1 times
SSRC 0xb6f265e6: 254/256 packets received (99.2188%), 2 lost, max loss 1
[GL] 124 frames in 5.03879 seconds = 24.6091 FPS
SSRC 0xb6f265e6: 375/384 packets received (97.6562%), 9 lost, max loss 1
[GL] 125 frames in 5.0007 seconds = 24.9965 FPS
Video dec stats (cumulative): 1551 total / 1523 disp / 27 drop / 7 corr / 1 miss FEC noerr/OK/NOK: 1541/2/7
SSRC 0xb6f265e6: 253/256 packets received (98.8281%), 3 lost, max loss 1
[GL] 126 frames in 5.04069 seconds = 24.9966 FPS
[lavc hevc @ 0x7f96940b5940] Could not find ref with POC 1706
    Last message repeated 1 times
Video dec stats (cumulative): 1721 total / 1692 disp / 28 drop / 8 corr / 1 miss FEC noerr/OK/NOK: 1710/2/8
Pbuf: total 6171/6272 packets received (98.38967%).

@alatteri
Author

alatteri commented Apr 8, 2024

Hi Martin... any thoughts on the above results from the FEC testing version you gave me?

@armelvil

I, too, am still very interested in any improvements in this area. Right now my Wi-Fi-based test rigs still lose maybe one GOP every two minutes. It's not fatal, as long as a putative audience is tolerant.

MartinPulec added a commit that referenced this issue Jul 16, 2024
Duplicate the first packet to increase resiliency in cases when the traffic is low, usually a single packet for some inter-frame compression like H.264/HEVC. But it will similarly do the job when more packets per frame are used.

The first packet is duplicated instead of the last one because the last packet can have fewer symbols than the first if there is more than 1 packet, eg. `DDDD|DF` (D - primary data; F - FEC; | - packet boundary).

refers to GH-361
@MartinPulec
Collaborator

Hi @alatteri, @armelvil

Hi Martin... any thoughts on the above results from the FEC testing version you gave me?

sorry for the delay; returning to it now. Well, I've added the proposed change to the current code tree. Unfortunately, it is a bit of a hack, but I don't have anything better now. The duplication is used whenever LDGM or Reed-Solomon is used as FEC. It can be disabled by the :nodup option to the fec parameter ("-f").

As @armelvil noted, reducing packet sizes may indeed increase the resiliency (the recovery capability) for lower-bitrate traffic.
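Why duplicating the first packet helps low-packet-count frames can be shown in one line: with independent drops, the first packet is missing only when both copies are lost, squaring its loss probability (a sketch; the function is hypothetical, not UltraGrid code):

```python
def p_first_packet_lost(p_drop: float, duplicated: bool) -> float:
    """Chance the frame's first packet is missing entirely, assuming
    independent drops; with duplication both copies must be lost."""
    return p_drop ** 2 if duplicated else p_drop

print(p_first_packet_lost(0.01, duplicated=False))  # roughly 1 frame in 100
print(p_first_packet_lost(0.01, duplicated=True))   # roughly 1 in 10,000
```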

MartinPulec added a commit to MartinPulec/UltraGrid that referenced this issue Jul 17, 2024
Duplicate the first packet to increase resiliency in cases when the traffic is low, usually a single packet for some inter-frame compression like H.264/HEVC. But it will similarly do the job when more packets per frame are used.

The first packet is duplicated instead of the last one because the last packet can have fewer symbols than the first if there is more than 1 packet, eg. `DDDD|DF` (D - primary data; F - FEC; | - packet boundary).

refers to CESNETGH-361