
[scrcpy-server] In the server project, how to get a screen bitmap in a very short time? #1951

Open

JiXiaoYao opened this issue Dec 4, 2020 · 12 comments

@JiXiaoYao commented Dec 4, 2020

I tried to integrate scrcpy-server into my project because Python consumes a lot of CPU decoding H.264 (PyAV does not support FFmpeg hardware acceleration), and I actually only need real-time Android screenshots infrequently. But the latency requirement is strict: whenever I do request a screenshot, I need the latest screen image as fast as possible. The requests themselves are infrequent: about 2 times within 1 second during a busy period, and only about once every 5 seconds otherwise. So I tried to modify the scrcpy code to serve the pixel bytes of the image in real time over a second TCP socket (the network part is fine), but when getting the image I ran into the following problems:

The first thing I tried was reflection: the "screenshot" method in "android.view.SurfaceControl". Very good, this function returns a Bitmap with very low latency, but I found a problem when I tried to send the Bitmap. I used Bitmap.compress(), but no matter which encoding I chose (JPEG or PNG), it added a delay of nearly 300-800 ms.

Then, when I tried "Bitmap.copyPixelsToBuffer()", I found that the Bitmap returned by "android.view.SurfaceControl.screenshot()" is configured with "Bitmap.Config.HARDWARE", on which copyPixelsToBuffer() is not allowed. When I switched to Bitmap.copy() to convert it first, that still added a delay of 100 ms+.

Later, when I saw that scrcpy-server records the screen by binding the display to the Surface returned by MediaCodec, I tried to create a SurfaceView and use "SurfaceView.getHolder().getSurface()". To do that I tried to obtain a Context by copying a line of code from the "com.genymobile.scrcpy.Workarounds.fillAppInfo()" method, but the result seems to be null.

So, I was wondering if there is any good advice on how I can get the latest screen pixel bytes in a shorter time?

@rom1v (Collaborator) commented Dec 4, 2020

MediaCodec produces a new frame every time its input surface content changes. In scrcpy, the surface is directly drawn from the display, so every change produces a frame.

setDisplaySurface(display, surface, videoRotation, contentRect, unlockedVideoRect, layerStack);

Instead, in your case, you could screenshot to the codec input surface:

     * CAVEAT: Versions of screenshot that return a {@link Bitmap} can be extremely slow; avoid use
     * unless absolutely necessary; prefer the versions that use a {@link Surface} such as
     * {@link SurfaceControl#screenshot(IBinder, Surface)} or {@link GraphicBuffer} such as
     * {@link SurfaceControl#screenshotToBuffer(IBinder, Rect, int, int, boolean, int)}.

https://github.com/aosp-mirror/platform_frameworks_base/blob/e59313abccefba25bef005d89882d9528dd48765/core/java/android/view/SurfaceControl.java#L2007-L2010

Surface surface = codec.createInputSurface();
SurfaceControl.screenshot(display, surface);

That way, the video stream should only contain the frames you screenshotted.

(Not tested)
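
A minimal sketch of that idea (my addition, also untested; SurfaceControl is a hidden API, so the exact method signatures and the way you obtain the display token vary across Android versions — verify against your target):

    import android.media.MediaCodec;
    import android.os.IBinder;
    import android.view.Surface;
    import java.lang.reflect.Method;

    // Assumes `display` is the IBinder display token, obtained the same way
    // scrcpy does it, and `codec` is an already-configured MediaCodec encoder.
    // Exception handling omitted for brevity.
    Surface surface = codec.createInputSurface();
    codec.start();

    // screenshot(IBinder, Surface) is hidden, so call it via reflection.
    Class<?> surfaceControl = Class.forName("android.view.SurfaceControl");
    Method screenshot = surfaceControl.getMethod("screenshot", IBinder.class, Surface.class);

    // Each call pushes the current display content into the codec's input
    // surface, so the encoder emits exactly one frame per call.
    screenshot.invoke(null, display, surface);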

@JiXiaoYao (Author) commented Dec 4, 2020

> Instead, in your case, you could screenshot to the codec input surface:
>
>     Surface surface = codec.createInputSurface();
>     SurfaceControl.screenshot(display, surface);
>
> (Not tested)

Thank you very much, I know how to take screenshots into a Surface now.

But I also want to know: how do I get the pixel byte data out of the Surface? I only know that SurfaceView can provide a Bitmap whose pixels can be copied to bytes.

@rom1v (Collaborator) commented Dec 4, 2020

> But I also want to know: how do I get the pixel byte data out of the Surface?

https://developer.android.com/reference/android/view/PixelCopy (since API 24)

But from what you described in the initial post, you probably don't want to do that: just send the video stream produced by MediaCodec to the client, which decodes the stream containing only your screenshots.
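
For reference, a rough sketch of that approach (my addition, simplified, not the actual scrcpy code; `running` and `outputStream` are assumed here): drain the encoder's output and forward each encoded packet to the client socket.

    import android.media.MediaCodec;
    import java.io.OutputStream;
    import java.nio.ByteBuffer;

    // Assumes `codec` is a started encoder whose input surface receives the
    // screenshots, and `outputStream` is the client socket's output stream.
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    while (running) {
        int index = codec.dequeueOutputBuffer(info, -1); // block until a packet is ready
        if (index >= 0) {
            ByteBuffer packet = codec.getOutputBuffer(index);
            byte[] data = new byte[info.size];
            packet.position(info.offset);
            packet.get(data);
            outputStream.write(data); // forward the encoded H.264 packet
            codec.releaseOutputBuffer(index, false);
        }
    }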

@JiXiaoYao (Author) commented

> https://developer.android.com/reference/android/view/PixelCopy (since API 24)
>
> But from what you described in the initial post, you probably don't want to do that: just send the video stream produced by MediaCodec to the client, which decodes the stream containing only your screenshots.

Ok, thank you, I will try the PixelCopy API later, and after that I will write the test results into this issue. Please don't close it.

@JiXiaoYao (Author) commented

> > But I also want to know: how do I get the pixel byte data out of the Surface?
>
> https://developer.android.com/reference/android/view/PixelCopy (since API 24)
>
> But from what you described in the initial post, you probably don't want to do that: just send the video stream produced by MediaCodec to the client, which decodes the stream containing only your screenshots.

Hmm, one piece of good news and one piece of bad news here.

The good news is that PixelCopy.request() works, and I successfully got the image data from scrcpy-server.

The bad news is that the callback (onPixelCopyFinished) of PixelCopy.request() is never invoked.

I can be sure PixelCopy.request() works because I put Thread.sleep(1000) after the call and then returned the Bitmap, and the Bitmap's data was a correct real-time screenshot of the screen.

So without a working callback, I can't get the screenshot in a very short time, because I don't know when the pixel data has finished copying from the Surface into the Bitmap.

Below are the different variants I tested many times.

First, I tried the following code:

        PixelCopy.request(screenSurface, bitmap, new PixelCopy.OnPixelCopyFinishedListener() {
            @Override
            public void onPixelCopyFinished(int copyResult) {
                if (PixelCopy.SUCCESS == copyResult) {
                    // callback processing code here, but none of it ever runs
                } else {
                    // never reached either
                }
            }
        }, new Handler(Looper.getMainLooper()));

Okay, that way failed, so let me try another one. I saw on Google another way that uses a lambda:

        PixelCopy.request(screenSurface, bitmap, copyResult -> {
            if (PixelCopy.SUCCESS == copyResult) {
                // callback processing code here, but none of it ever runs
            } else {
                // never reached either
            }
        }, new Handler(Looper.getMainLooper()));

Alright, unsurprisingly, it failed just the same.

In both variants, I never saw any evidence that the code inside the callback had run.

I don't know where the problem is.

So I wonder: maybe I just got the Handler's thread wrong? If so, what should I do? And if that's not the reason, what is the real reason?

@rom1v (Collaborator) commented Dec 5, 2020

> In both variants, I never saw any evidence that the code inside the callback had run.

There is no running event loop on the main thread in scrcpy (Looper.loop() is never executed), so the callbacks will never be called.
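
A minimal sketch of one way around this (my addition, not from this thread): run the callback on a dedicated HandlerThread, which owns its own running Looper, and block the requesting thread on a latch until the copy finishes. `screenSurface` and `bitmap` are the same objects as in the snippets above.

    import android.os.Handler;
    import android.os.HandlerThread;
    import android.view.PixelCopy;
    import java.util.concurrent.CountDownLatch;
    import java.util.concurrent.TimeUnit;

    // A HandlerThread runs Looper.loop() internally, so callbacks posted to
    // its Handler actually execute (unlike the main thread in scrcpy-server).
    HandlerThread thread = new HandlerThread("pixel-copy");
    thread.start();
    Handler handler = new Handler(thread.getLooper());

    CountDownLatch latch = new CountDownLatch(1);
    PixelCopy.request(screenSurface, bitmap, copyResult -> {
        if (copyResult == PixelCopy.SUCCESS) {
            // bitmap now holds the screenshot
        }
        latch.countDown(); // wake up the waiting thread either way
    }, handler);

    // Wait (with a timeout) until the copy has finished before using the
    // bitmap. InterruptedException handling omitted for brevity.
    latch.await(500, TimeUnit.MILLISECONDS);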

@JiXiaoYao (Author) commented

> There is no running event loop on the main thread in scrcpy (Looper.loop() is never executed), so the callbacks will never be called.

Ummm, so what should I do to get a working Handler object?

I just searched Google and read some articles about Looper, but I still don't know how to get the correct Looper for PixelCopy.

I think that once the Handler problem is solved, all the code will work.

@rom1v (Collaborator) commented Dec 5, 2020

Even if you retrieve a Bitmap, you'll still have to compress it, and that will take time.

You want to receive the full pictures on the client side (not only on the device), don't you? Is there a problem with this approach -> #1951 (comment)?

@JiXiaoYao (Author) commented Dec 5, 2020

> You want to receive the full pictures on the client side (not only on the device), don't you? Is there a problem with this approach -> #1951 (comment)?

Hmmm, I think I just need to get the raw pixel byte array, because my client can use it directly, so I don't think I need any compression.

I just ran a test, and the following code needs only 11-12 ms:

int bytes = bitmap.getByteCount();
ByteBuffer buffer = ByteBuffer.allocate(bytes);
bitmap.copyPixelsToBuffer(buffer); // raw pixels, no compression

Then I just need to send it over another TCP socket.
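
For what it's worth, a hypothetical sketch of that send (my addition; `socket` is assumed to be the already-connected extra TCP socket): prefix the raw pixels with their dimensions so the client knows how many bytes to read.

    import java.io.DataOutputStream;
    import java.net.Socket;

    // Simple length-prefixed framing over the extra TCP connection.
    DataOutputStream out = new DataOutputStream(socket.getOutputStream());
    out.writeInt(bitmap.getWidth());
    out.writeInt(bitmap.getHeight());
    out.writeInt(bytes);
    out.write(buffer.array(), 0, bytes); // RGBA_8888 pixel data
    out.flush();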

@rom1v (Collaborator) commented Dec 5, 2020

If you don't compress, it will take time to send the picture.

For example, a single 1920×1080 picture in RGBA takes 1920×1080×4 = 8,294,400 bytes (about 8.3 MB).
For 3120×1440, it's about 18 MB.

@JiXiaoYao (Author) commented

> If you don't compress, it will take time to send the picture.
>
> For example, a single 1920×1080 picture in RGBA takes 1920×1080×4 = 8,294,400 bytes (about 8.3 MB).
> For 3120×1440, it's about 18 MB.

Yes, you are right. I did a network sending test: over gRPC it adds about 200 ms of delay, and even over a plain TCP socket it still takes about 100 ms.

So I think your way is the best solution for my project:

It will reduce my Python/PyAV CPU decoding usage, which will be great.

> Instead, in your case, you could screenshot to the codec input surface:
>
>     Surface surface = codec.createInputSurface();
>     SurfaceControl.screenshot(display, surface);

But I will soon post code showing how to use the codec surface to take a screenshot (down to pixel bytes) in 60 ms, because I think other people may need it.

@AVTurovskiy commented

> But I will soon post code showing how to use the codec surface to take a screenshot (down to pixel bytes) in 60 ms, because I think other people may need it.

Hello.
Did anything work?
