[scrcpy-server] In server project, how to get screen bitmap in very shot time? #1951
Comments
Instead, in your case, you could screenshot to the codec input surface:

```java
Surface surface = codec.createInputSurface();
SurfaceControl.screenshot(display, surface);
```

That way, the video stream should only contain the frames you screenshot-ed. (Not tested)
Thank you very much, I now know how to take screenshots into a Surface. But I also want to know: how do I get the pixel byte data from the Surface? I only know that a SurfaceView can provide a Bitmap and copy its pixels to bytes.
https://developer.android.com/reference/android/view/PixelCopy (since API 24). But from what you described in the initial post, you probably don't want to do that: just send the video stream produced by the codec.
Ok, thank you. I will try the PixelCopy API later, and after that I will write the test results into this issue. Please don't close it.
Hmm, one piece of good news and one piece of bad news here. The good news is that taking the screenshot into the Surface works. The bad news is that the callback function (`onPixelCopyFinished`) is never invoked. I can be sure of this because none of the code inside the callback ever runs. Therefore, I can't get the screenshot in a very short time if the callback of `PixelCopy.request()` never fires. So, I will show you the different test code I tried. The first time, I tried the following code:

```java
PixelCopy.request(screenSurface, bitmap, new PixelCopy.OnPixelCopyFinishedListener() {
    @Override
    public void onPixelCopyFinished(int copyResult) {
        if (PixelCopy.SUCCESS == copyResult) {
            // Lots of processing code here, but none of it ever runs.
        } else {
            // This branch never runs either.
        }
    }
}, new Handler(Looper.getMainLooper()));
```

Okay, that way failed, so let me change to another one. I saw another way in a Google example that uses a lambda:

```java
PixelCopy.request(screenSurface, bitmap, copyResult -> {
    if (PixelCopy.SUCCESS == copyResult) {
        // Lots of processing code here, but none of it ever runs.
    } else {
        // This branch never runs either.
    }
}, new Handler(Looper.getMainLooper()));
```

Alright, unsurprisingly, it failed as well. In both cases I never detected any evidence that the code inside the callback had run. I don't know where the problem is, so maybe I'm doing something wrong with the `Handler`.
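The symptom above is consistent with the callback never being dispatched: a `Handler` only posts the callback onto a `Looper`'s message queue, and if no thread is draining that queue, the callback waits forever. A plain-Java analogue of that mechanism (my own sketch, not Android code; a `Runnable` stands in for the PixelCopy callback):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class LooperAnalogue {
    public static void main(String[] args) throws InterruptedException {
        // The "message queue" behind a Handler, reduced to a plain queue of Runnables.
        BlockingQueue<Runnable> queue = new LinkedBlockingQueue<>();

        // Posting a callback only enqueues it; nothing runs yet.
        boolean[] ran = {false};
        queue.offer(() -> ran[0] = true);

        // Without a thread draining the queue (no equivalent of Looper.loop()),
        // the callback never executes.
        Thread.sleep(100);
        System.out.println("before loop: ran = " + ran[0]);

        // Start a loop thread that drains the queue: now the callback is dispatched.
        Thread looper = new Thread(() -> {
            try {
                Runnable r;
                while ((r = queue.poll(200, TimeUnit.MILLISECONDS)) != null) {
                    r.run();
                }
            } catch (InterruptedException ignored) {
            }
        });
        looper.start();
        looper.join();
        System.out.println("after loop: ran = " + ran[0]);
    }
}
```

This is only an analogy, but it shows why passing `new Handler(Looper.getMainLooper())` is useless when nothing is looping on that thread.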
There is no running event loop on the main thread in scrcpy (the server runs as a plain Java process, not as a regular Android app), so a `Handler` bound to the main `Looper` will never dispatch your callback.
Ummm, so what should I do to get a usable `Handler` object? I just went to Google and searched some articles about `Handler` and `Looper`. I think if I solve the `Handler` problem, the callback will work.
Even if you retrieve a working `Handler`, that only gets you the picture on the device side. You want to receive the full pictures on the client side (not only on the device), don't you? Is there a problem with this approach -> #1951 (comment)?
Hmmm, I think I just need the raw pixel byte array, because my client can use it directly, so I don't think I need any compression. I just ran a test, and the following code only takes 11-12 ms:

```java
int bytes = bitmap.getByteCount();
buffer = ByteBuffer.allocate(bytes);
bitmap.copyPixelsToBuffer(buffer);
```

And then I just need to send it over another TCP socket.
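Sending raw pixel buffers over a separate TCP socket needs some framing so the client knows where each picture ends. A minimal sketch (plain Java over loopback; the 4-byte length prefix is my own convention, not anything scrcpy defines):

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class FrameSocket {
    // Write one frame as a 4-byte big-endian length followed by the pixel bytes.
    static void sendFrame(DataOutputStream out, byte[] pixels) throws Exception {
        out.writeInt(pixels.length);
        out.write(pixels);
        out.flush();
    }

    // Read one length-prefixed frame back.
    static byte[] readFrame(DataInputStream in) throws Exception {
        byte[] pixels = new byte[in.readInt()];
        in.readFully(pixels);
        return pixels;
    }

    public static void main(String[] args) throws Exception {
        byte[] frame = new byte[16]; // stand-in for the bitmap's pixel bytes
        frame[0] = 42;

        try (ServerSocket server = new ServerSocket(0)) {
            Thread sender = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", server.getLocalPort())) {
                    sendFrame(new DataOutputStream(s.getOutputStream()), frame);
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            sender.start();
            try (Socket s = server.accept()) {
                byte[] received = readFrame(new DataInputStream(s.getInputStream()));
                System.out.println("received " + received.length + " bytes, first = " + received[0]);
            }
            sender.join();
        }
    }
}
```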
If you don't compress, it will take time to send the picture. For example, a single 1920×1080 picture in RGBA takes 1920×1080×4 bytes (about 8.3 MB).
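The arithmetic behind that figure, spelled out:

```java
public class FrameSize {
    public static void main(String[] args) {
        int width = 1920, height = 1080;
        int bytesPerPixel = 4; // RGBA, 8 bits per channel

        long bytesPerFrame = (long) width * height * bytesPerPixel;
        // Round to one decimal place without locale-dependent formatting.
        double megabytes = Math.round(bytesPerFrame / 1e5) / 10.0;

        System.out.println("bytes per frame: " + bytesPerFrame);
        System.out.println("MB per frame: " + megabytes);
    }
}
```

At even a modest frame rate, that is far more data than an H.264 stream of the same resolution, which is why compression (or the codec-surface approach above) matters.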
Yes, you are right. I tested the network sending: over gRPC it adds about 200 ms of delay, and even over a plain TCP socket it takes about 100 ms. So I think your way is the best solution for my project: it will reduce my Python PyAV CPU decoding usage, which will be great.
But I will soon post the code showing how to use the codec surface to take a screenshot (to pixel bytes) in about 60 ms, because I think other people may need it.
Hello.
I tried to integrate scrcpy-server into my project because Python consumes a lot of CPU decoding H.264 (PyAV does not support FFmpeg hardware acceleration), and I actually only need real-time Android screenshots infrequently. Because of the latency requirements, I need the latest screen image as quickly as possible, but not often: it may be requested about twice in an intensive one-second period, and only once every 5 seconds at other times. So I tried to modify the scrcpy code to get the pixel bytes of the image in real time over another TCP connection (the network part is not the problem), but when I tried to grab the image, I found the following problems:
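For this access pattern (frequent captures, infrequent requests), one simple structure is to keep only the newest frame and let each request read it on demand, instead of queueing every frame. A plain-Java sketch of that idea (my own illustration, not scrcpy code):

```java
import java.util.concurrent.atomic.AtomicReference;

public class LatestFrame {
    // Holds only the most recent frame; older frames are simply dropped.
    private final AtomicReference<byte[]> latest = new AtomicReference<>();

    // Called by the capture side on every new frame.
    void publish(byte[] frame) {
        latest.set(frame);
    }

    // Called by the request side; returns whatever is newest right now.
    byte[] current() {
        return latest.get();
    }

    public static void main(String[] args) {
        LatestFrame holder = new LatestFrame();
        holder.publish(new byte[] {1});
        holder.publish(new byte[] {2}); // overwrites; frame {1} is discarded
        System.out.println("latest first byte: " + holder.current()[0]);
    }
}
```

The point of the design is that a slow or rare reader never causes frames to pile up; it always sees the latest capture.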
The first thing I tried was reflection on the `screenshot` method of `android.view.SurfaceControl`. Very good: this function returns a Bitmap with very low latency. But I found a problem when I tried to send the Bitmap: I used `Bitmap.compress()`, and no matter which encoding I chose (JPEG or PNG), it caused a delay of nearly 300-800 ms.
Then, when I tried `Bitmap.copyPixelsToBuffer()`, I found that the Bitmap returned by `android.view.SurfaceControl.screenshot()` is configured with `Bitmap.Config.HARDWARE`, on which `copyPixelsToBuffer()` is not allowed. When I switched to `Bitmap.copy()` to convert it first, it still added a delay of 100 ms+.
Later, when I saw that scrcpy-server's screen recorder works by binding the display to the Surface returned by MediaCodec, I tried to create a SurfaceView and get its Surface via `SurfaceView.getHolder().getSurface()`. I tried to obtain the Context by copying a line of code from the `com.genymobile.scrcpy.Workarounds.fillAppInfo()` method, but the result seems to be null.
So, I was wondering if there is any good advice on how I can get the latest screen pixel bytes in a shorter period of time?