srt-live-transmit bitrate issue #933
Comments
Hi @ltrayanov This does not look like expected behavior. We will have a look.
We already did. Unfortunately, the result is the same.
Note that using a port number above 32768 for listening is extremely risky, especially if the UDP ports keep their default range.
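(For context, the risk comes from the ephemeral port range that the OS assigns to ordinary outgoing UDP sockets; on Linux it defaults to roughly 32768-60999, so a listener above 32768 can collide with it. A hypothetical check on the test machines, assuming Linux:)

```sh
# Show the ephemeral (local) port range the kernel uses for outgoing sockets.
sysctl net.ipv4.ip_local_port_range
```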
I just performed tests with ports 5555, 4444, 3333 and 2222.
@ltrayanov Could you also check v1.3.4 to understand if it is something newly introduced?
Sorry for my late response. We did try the old versions and we can report the same results.
@ltrayanov Thanks for the update. One more kind request. We can then take a look at the situation.
Hello @ltrayanov I'll let you know if I can reproduce it here.
I am sorry. We were using the srt-live-transmit app on both sides with the following command line: When we use the same UDP source with: We are in the process of building an app for our encoder and I will update with the results.
We also decided to test srt-live-transmit version 1.4.1.
We will have a look and get back to you as soon as possible.
No drops, no retransmissions. @ltrayanov Please recheck with different port numbers like 4200.
Ahoi @ltrayanov
receiver:
Can you please post the exact commands, including all parameters, that you are using?
The receiver should be changed to the following, according to the example above:
Maybe it was just a typo when writing this, but please double check that you did not try to receive another SRT stream by mixing up port numbers on the receiver side. Is there a chance you can create a packet capture of the stream on the receiver side? (Start the capture, then start srt-live-transmit for maybe 20 seconds, stop srt-live-transmit and stop the capture.)
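(For reference, one plain way to take such a capture on a Linux receiver is tcpdump; the interface name, port and output file below are placeholders to adjust to your setup:)

```sh
# Capture ~20 seconds of the incoming SRT (UDP) traffic, then stop with Ctrl+C.
sudo tcpdump -i eth0 -w srt_receiver.pcap udp port 55555
```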
Don't hesitate to get back to me in case you have any questions.
@maxsharabayko What can explain the 10M lower bit rate? Please note that the sender was getting the same input TS in both tests. We are testing with the srt-live-transmit app. I am uploading two more tests with different commits; the same input stream was used for all of them. One of the tests was done with srt 1.4.0 (Oct 4th) commit ef8ba13 and has results similar to our previous tests. We find only the snd-caller to rcv-listener combination to work (with the exception of high CPU load). The other test was done with srt 1.4.0 (Oct 29) commit 0f8e93e, which fixes the high CPU usage. From that commit until now (srt 1.4.1) we see truncated sending rates with any input TS higher than 20M.
@ltrayanov Let me sum up the results so far.
Changing ports to 4400, etc., does not help. Which OS do you have? I see two directions to proceed with this. @ltrayanov, could you please:
From the srt 1.4.0_0f8e93.zip (srt 1.4.0, Oct 29, commit 0f8e93e, with fixes for epoll):
@maxsharabayko Our test machines are Ubuntu 18.04 LTS and we are using the srt-live-transmit app. Commands used for caller (sender) to listener (receiver) are:
Commands used for listener (sender) to caller (receiver) are: Captures and CSV files for tests with SRT 1.3.4, 1.4.0 and 1.4.1 can be downloaded from: I also included a capture from udp://@225.168.111.11:4300 (test_ts.ts) for your reference.
In the data for "SRT tag 1.4.1" I see good transmission results for both cases.
@J-Rogmann The TS used in the test came out of a broadcast encoder with multicast output. The encoder used for that capture was a Radiant Communications VL4522; we had the same results with a Harmonic Electra 8100 and with TS muxes from Moto Sem v8 and Arris CAP1000. All streams are ATSC CBR MPEG TS and, when analyzed, they don't show any errors or abnormalities.
@maxsharabayko The transmission results are good, but our issue is that the input stream has a 59.9M mux rate, while the sender rate is ~48M.
Probably I am already confused; in that case, sorry. All the dumps show 50 Mbps, and they don't include UDP. You say that the input was 60 Mbps.
Could you collect the network capture for that UDP streaming as well, then?
A pcap of the source TS was uploaded to the same location; the file name is mcast source.zip.
Hello @ltrayanov We have also seen some implementations of piping NDI streams through SRT, which run at 150 Mbps on the input side without any issue. We will update the srt-live-transmit documentation and point out this limitation more clearly. This sample application is meant for quick testing and we did not consider such high bitrates. If there is time, we will have a look at this again and improve the UDP input for srt-live-transmit. For now, please continue with your SRT implementation and be assured that SRT can handle >62 Mbps streams. Best regards,
@J-Rogmann, @maxsharabayko, please take a look at #762; it might be related to the current issue.
Hello @ltrayanov Best regards,
Just mind that
Thank you! We will perform a couple of tests.
(Some development notes for future reference.) After some optimization during experiments, the reading loop of srt-live-transmit has the following logic (in pseudocode):
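(The pseudocode block itself did not survive in this copy of the thread; judging from the description in the next paragraph, it had roughly this shape. This is an illustrative reconstruction, not the actual srt-live-transmit source.)

```
loop forever:
    epoll_wait(source_poll_id)           # block until the UDP source is readable
    while the source still has data:     # inner reading subloop, no epoll in between
        payload = read(udp_source)
        send(srt_target, payload)
```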
The app waits for the first time there is something to read from the source (UDP); then the reading subloop runs. Even this optimized version still has severe packet losses. The only way to make it work with UDP input was to remove the epoll_wait completely:
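(Again the original block is missing here; given that epoll_wait was removed entirely, it presumably reduced to something like the following, reconstructed rather than quoted:)

```
loop forever:
    payload = read(udp_source)           # plain blocking read, no epoll at all
    send(srt_target, payload)
```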
This proved to work. However, it is hard to imagine an easy way to fix this behavior of srt-live-transmit. Reading from the UDP socket:
A workaround for the existing behavior: the UDP streaming you have (@ltrayanov) has an average of 60 Mbps, with peaks up to ~110 Mbps. In that case the epoll turned out to be not fast enough, and due to the small receive buffer of the UDP socket, packets were accumulating in it. During these peaks there was not enough space in the buffer to store the packets, and they eventually were dropped on the UDP input. Increasing the UDP buffer helped. First, the system maximum buffer size should be increased the following way:
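(The concrete commands did not survive in this copy; on Linux the system-wide limit is normally raised via sysctl, for example with a value of ~25 MB, which is only an illustration:)

```sh
# Allow sockets to request receive buffers of up to ~25 MB.
sudo sysctl -w net.core.rmem_max=26214400
```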
Then the desired buffer size (in bytes) can be specified on the UDP socket (after PR #1152 is merged).
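(At the socket level this corresponds to the standard SO_RCVBUF option; the sketch below uses plain BSD sockets and is not the srt-live-transmit code, nor necessarily the exact option syntax introduced by PR #1152.)

```cpp
#include <sys/socket.h>

// Request a larger receive buffer (in bytes) on an already-created UDP socket.
// The kernel caps the effective size at net.core.rmem_max.
static bool set_udp_rcvbuf(int fd, int bytes)
{
    return setsockopt(fd, SOL_SOCKET, SO_RCVBUF, &bytes, sizeof(bytes)) == 0;
}
```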
The remaining issue is the reported reduced performance of the receiver in listener mode, while the same receiving in caller mode works well. The current guess is that the listener socket that remains in the listening state after the connection is established (accepted) may impact the performance of the listener-receiver. |
srt-live-transmit closes the listening socket (apps/transmitmedia.cpp:247), so the guess above is disproved.
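(For reference, the pattern being checked there, accepting one caller and then closing the listening socket, looks roughly like this with the public SRT C API. This is an illustrative sketch with error handling omitted, not the code at apps/transmitmedia.cpp:247.)

```cpp
#include <srt/srt.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <cstring>

// Accept a single caller on the given port, then drop the listener.
SRTSOCKET accept_one_caller(int port)
{
    srt_startup();

    SRTSOCKET listener = srt_create_socket();
    sockaddr_in sa;
    std::memset(&sa, 0, sizeof sa);
    sa.sin_family = AF_INET;
    sa.sin_port = htons(port);
    sa.sin_addr.s_addr = INADDR_ANY;

    srt_bind(listener, (sockaddr*)&sa, sizeof sa);
    srt_listen(listener, 1);

    sockaddr_storage peer;
    int peerlen = sizeof peer;
    SRTSOCKET conn = srt_accept(listener, (sockaddr*)&peer, &peerlen);

    // Close the listening socket once the connection is accepted,
    // so it cannot affect the data path afterwards.
    srt_close(listener);
    return conn;
}
```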
When testing release 1.4 with the srt-live-transmit application we found strange behavior between the caller(sender)-to-listener(receiver) and listener(sender)-to-caller(receiver) modes.
In both cases our input was a 54M MPEG TS multicast stream, in the example below 225.168.111.105:51011. The sender had the IP address 192.168.111.14 and the receiver was at 192.168.111.15.
Everything worked as expected with the config below; sending and receiving rates were around 57M and the output multicast stream was fine:
sender
udp://@225.168.111.105:51011 srt://192.168.111.15:55555
receiver
srt://:55555 udp://225.168.111.15:51021
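(For readers trying to reproduce this: assuming the URI pairs above are passed as the two positional source/target arguments of srt-live-transmit, the working configuration corresponds to command lines roughly like these.)

```sh
# sender (caller), on 192.168.111.14
./srt-live-transmit udp://@225.168.111.105:51011 srt://192.168.111.15:55555
# receiver (listener), on 192.168.111.15
./srt-live-transmit srt://:55555 udp://225.168.111.15:51021
```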
When we switched the modes to:
sender
udp://@225.168.111.105:51011 srt://:55555
receiver
srt://192.168.111.14:55555 udp://225.168.111.15:51021
The send and receive rates were down to 40M and the output multicast stream had numerous continuity count and PCR errors.
The problem stopped when the input MPEG TS stream rate was reduced to 20M or below.
Is that expected behavior with SRT in listener->caller mode?
Are there any socket settings that can improve that mode?
Thanks!