
Add the capability for video class to handle a bulk endpoint in the streaming interface. #1985

Merged
merged 6 commits into hathach:master on Jun 15, 2023

Conversation

kkitayam
Collaborator

Describe the PR
Add handling of bulk endpoints in the streaming interface. This feature only supports streaming; the still-image capture function is not yet implemented.

Using bulk endpoints may allow better bandwidth utilization than isochronous endpoints, which may in turn enable larger image sizes and higher frame rates.

Additional context
I have tested this PR on a Raspberry Pi Pico and Windows 11.
I have confirmed that the video_capture example can display color bars at up to 128x96 @ 60 fps.
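
For readers who want to try this out, here is a rough sketch of the configuration side. This is only my reading of the video_capture example touched by this PR: the option names (in particular CFG_TUD_VIDEO_STREAMING_BULK) and the buffer size are assumptions and should be checked against the merged headers.

```c
// tusb_config.h -- sketch only; option names assumed, verify against the
// merged TinyUSB headers and the video_capture example.
#define CFG_TUD_VIDEO                       1    // one video control interface
#define CFG_TUD_VIDEO_STREAMING             1    // one video streaming interface

// Use a bulk endpoint for streaming instead of an isochronous one (assumed name).
#define CFG_TUD_VIDEO_STREAMING_BULK        1

// Endpoint buffer size; illustrative value, tune for your MCU and bus speed.
#define CFG_TUD_VIDEO_STREAMING_EP_BUFSIZE  256
```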

@kkitayam
Collaborator Author

@Staacks

If you have time, could you try this PR with your project?
Using bulk endpoints may allow achieving 160x144 @ 30 fps.

@Staacks
Contributor

Staacks commented Mar 28, 2023

Wow, thanks, that would be fantastic. I will try it soon! I am currently a bit caught up in another project and my day job, so I cannot promise that I will get to it before Easter, but my fingers are definitely itching to test it.

@hathach
Owner

hathach commented Mar 28, 2023

Looks great. I am tied up with other things at the moment, but I will review and test this as soon as I can.

@Staacks
Contributor

Staacks commented May 29, 2023

Short sign of life: the other project is finally finished, and I will try this PR this week. Sorry for the long delay.

@Staacks
Contributor

Staacks commented Jun 1, 2023

I think this is one of those good news / bad news situations...

Good news:
I got it working and of course in the end adapting my code was easy (after spending a few hours searching for the little difference in my descriptors that prevented it from working immediately). So, I think the pull request is good.

Bad news:
I did not manage to achieve a significantly higher frame rate. Unlike in isochronous mode, the UVC driver accepts a 30 fps stream in bulk transfer mode, but when I record with ffmpeg it shows that it expects 30 fps yet has to duplicate a few frames because it does not receive them fast enough. If I just call tud_video_n_frame_xfer() and tud_task() in a loop with nothing else in it, it has to duplicate 61 out of 1000 frames, which amounts to only a bit over 28 fps. With my normal GB Interceptor code, where I wait for the Game Boy's hblank before calling tud_video_n_frame_xfer(), it is even a bit worse.
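
For reference, a minimal version of the loop described above might look like the sketch below. The frame dimensions and fill_color_bar() are placeholders, not the actual GB Interceptor code; only tud_task(), tud_video_n_streaming(), tud_video_n_frame_xfer() and the completion callback are real TinyUSB calls.

```c
#include "tusb.h"

#define FRAME_W     128
#define FRAME_H     96
#define FRAME_SIZE  (FRAME_W * FRAME_H * 2)      // YUY2, 2 bytes per pixel (placeholder)

extern void fill_color_bar(uint8_t *buf);        // placeholder frame source

static uint8_t frame_buf[FRAME_SIZE];
static volatile bool frame_busy = false;

// Called by the video class driver once the previous frame has been sent.
void tud_video_frame_xfer_complete_cb(uint_fast8_t ctl_idx, uint_fast8_t stm_idx) {
  (void) ctl_idx; (void) stm_idx;
  frame_busy = false;
}

void stream_loop(void) {
  while (1) {
    tud_task();                                  // must run often, or the bulk endpoint starves
    if (!tud_video_n_streaming(0, 0)) continue;  // host has not started streaming yet
    if (!frame_busy) {
      fill_color_bar(frame_buf);
      frame_busy = tud_video_n_frame_xfer(0, 0, frame_buf, sizeof(frame_buf));
    }
  }
}
```

The frame-duplication numbers above work out as follows: if 61 of 1000 expected frames are duplicated, only 939 arrive in time, i.e. roughly 30 fps × 0.939 ≈ 28.2 fps effective.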

I first suspected that I was missing a few chances to transfer data while tud_video_n_frame_xfer() is being called, and modified the code so that tud_video_frame_xfer_complete_cb() can return a pointer to the next buffer, allowing us to continue sending without any gaps, but somehow that made it even worse. At this point I am out of ideas on how to improve performance and get the last few frames through the USB bus.
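
To make the double-buffering idea above concrete, a sketch of what such a modified completion callback might look like is shown below. The return-a-pointer signature is hypothetical and not part of TinyUSB's API; it only illustrates the modification being described.

```c
// Hypothetical "handover" callback: on completion, return the next frame so
// transmission can continue without a gap. NOT the stock TinyUSB signature.
static uint8_t frame_bufs[2][FRAME_SIZE];
static int next_idx = 0;

void *frame_xfer_complete_get_next(uint_fast8_t ctl_idx, uint_fast8_t stm_idx,
                                   size_t *len) {
  (void) ctl_idx; (void) stm_idx;
  uint8_t *next = frame_bufs[next_idx];
  next_idx ^= 1;                     // flip between the two buffers
  fill_color_bar(next);              // placeholder frame source (see above)
  *len = FRAME_SIZE;
  return next;                       // the driver would queue this immediately
}
```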

Anyhow, that's only a problem for my project. Generally speaking your code seems to work properly, so thanks!

Owner

@hathach hathach left a comment


Superb! Thank you @kkitayam very much again for this excellent work. Thanks @Staacks for testing it out. Although it did not achieve a faster rate, it looks great. Maybe we are missing something, but this is more than enough for the merge. Thanks again, and sorry for the late response.

@hathach hathach merged commit 433ffe2 into hathach:master Jun 15, 2023
@Staacks
Contributor

Staacks commented Jun 16, 2023

About that rate: I did not test with a minimal project, but I suspect that this is an issue particular to my project, as the RP2040 has quite a lot to do in that case and probably misses a few chances to transmit data when tud_task() is not called in time. I am currently working on a method to achieve 60 fps via MJPEG encoding (thanks to a very clever idea someone suggested to me), and I am having the exact same problem: I managed to encode 160x144 JPEGs at 60 fps, but I have not yet succeeded in splitting the encoding process so that tud_task() can be called often enough, so the data rate is currently lower than in the old implementation (although the FPS is already higher).
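
A sketch of the splitting problem described above: the encoder has to be broken into chunks small enough that tud_task() can run between them. Here encode_mcu_row() and jpeg_finish() are hypothetical stand-ins for a chunked JPEG encoder; only tud_task() is a real TinyUSB call.

```c
#include <stddef.h>
#include <stdint.h>
#include "tusb.h"

#define MCU_ROWS 18                              // placeholder chunk count for a 160x144 frame

// Hypothetical chunked-encoder API; each call returns the number of bytes written.
extern size_t encode_mcu_row(int row, uint8_t *out);
extern size_t jpeg_finish(uint8_t *out);

size_t encode_frame_interleaved(uint8_t *jpeg_out) {
  size_t written = 0;
  for (int row = 0; row < MCU_ROWS; row++) {
    written += encode_mcu_row(row, jpeg_out + written);  // encode one chunk
    tud_task();                                          // service USB between chunks so transfers are not starved
  }
  written += jpeg_finish(jpeg_out + written);            // write the JPEG trailer
  return written;
}
```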

Point is: I would not rule out that the bulk implementation here could yield higher rates.

@hathach
Owner

hathach commented Jun 21, 2023

Great, thank you for your update and explanation.

@Staacks
Contributor

Staacks commented Jul 2, 2023

Sorry, just another addendum (it doesn't change anything, but it might keep someone coming from Google from drawing the wrong conclusions):
I might have mixed up bulk and isochronous in my testing. While preparing the 60 fps MJPEG version for a beta release, I stumbled upon a mistake in my defines that mixed up both modes, except that the smaller endpoint buffer size was still set for the isochronous mode. I am not 100% sure, but I suspect that this mistake was already there when I did the bulk test before implementing MJPEG. So when I had the impression that bulk resulted in a lower frame rate, I was probably testing isochronous with a too-small buffer, and the bulk mode was performing similarly to the original, proper isochronous mode (which also makes more sense to me, given how I struggled with tud_task).
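
For anyone chasing a similar problem, the kind of define mix-up described above might look roughly like the fragment below. Everything here is illustrative: MY_USE_BULK is a hypothetical project switch, and the CFG_TUD_... names and sizes should be checked against the actual headers.

```c
// Illustrative only: a project-level switch that flips the streaming mode but
// leaves a too-small endpoint buffer in place for the isochronous build, so the
// "isochronous" measurement is throttled while the "bulk" build behaves normally.
#if MY_USE_BULK                                    // hypothetical project define
  #define CFG_TUD_VIDEO_STREAMING_BULK        1    // assumed option name
  #define CFG_TUD_VIDEO_STREAMING_EP_BUFSIZE  256
#else
  #define CFG_TUD_VIDEO_STREAMING_BULK        0
  #define CFG_TUD_VIDEO_STREAMING_EP_BUFSIZE  64   // too small for the intended rate
#endif
```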

Anyhow, with even fewer resources available while encoding MJPEG, I found that bulk works better than isochronous, so @kkitayam's bulk implementation is part of why I can soon publish a 60 fps beta version :) Thanks a lot!

@rrrrrrobot

I tried to use bulk transport, but I couldn't get the video stream on Linux, while the picture is produced normally on Windows.

@Staacks
Contributor

Staacks commented Jul 7, 2023

> I tried to use bulk transport, but I couldn't get the video stream on Linux, while the picture is produced normally on Windows.

You will have to give more details, like the dmesg output after connecting, the software used to access the stream, the log of that software, and ideally some of your code. I developed my project using TinyUSB's UVC class primarily under Linux and had no problems with bulk transfers.

@hathach
Owner

hathach commented Jul 7, 2023

Thank you @Staacks for the further update, glad you got it all working. The GB Interceptor is a great project.

@rrrrrrobot Let's keep the problem within your own issue, since you already opened one.

@rrrrrrobot

@Staacks Thank you very much, the issue is at #2139.

@kkitayam kkitayam deleted the uvc_bulk branch July 11, 2023 13:16
@Staacks Staacks mentioned this pull request Jul 11, 2023