[Android] VideoTexture stops working after some time on low-performance devices #2934

itlancer opened this issue Nov 21, 2023 · 0 comments
Problem Description

VideoTexture stops working after some time on low-performance Android devices. After 5-13 minutes video rendering stops (a single frame stays displayed), or black/white or even solid green is shown instead of the VideoTexture.

In this sample the video just plays in a loop and the "end" is detected via onPlayStatus, not NetStream.Play.Stop. If NetStream.Play.Failed is used instead, there is a similar issue with similar error logs but a different actual result.
Maybe onPlayStatus should be deprecated and removed altogether. Discussed here: #1943 (comment)
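
For illustration, a minimal sketch (not part of the attached sample) of how the same loop could instead be restarted from the NetStream.Play.Stop net status event rather than from onPlayStatus; it reuses the playVideo() and handler names from the sample below:

private function netStream_netStatus(event:NetStatusEvent):void {
	trace(event.info.code);
	if (event.info.code == "NetStream.Play.Stop"){
		// Restart playback when the stream reports it has stopped,
		// instead of waiting for the client's onPlayStatus callback.
		playVideo();
	}
}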

Tested with multiple AIR 50.2.x versions, even the latest AIR 50.2.3.8, with different AIR applications, different low-performance devices and different videos.
Tested with the armv7 architecture because the "problem" devices' firmwares don't support armv8.
Tested with multiple Android TV boxes (x96mini and similar) with Amlogic CPUs. Also tested with Android price checkers with a Rockchip RK3229. All devices were tested with different firmwares.
All "problem" devices have Android 7.1.2.
Tested with pure Stage3D and with Starling.
Same issue in all cases.
There is no such issue using Video (see the sketch below).
Also there is no such issue with high-performance devices.
Also there is no such issue with other, non-AIR video players.
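
For comparison, a rough sketch of the kind of plain Video playback that does not show the issue; the class name PlainVideoComparison and the exact loop logic are illustrative only, not taken from the attached sample:

package {
	import flash.display.Sprite;
	import flash.events.NetStatusEvent;
	import flash.media.Video;
	import flash.net.NetConnection;
	import flash.net.NetStream;
	
	// Hypothetical comparison app: same file, same loop idea, but rendered
	// through a classic display-list Video object instead of a VideoTexture.
	public class PlainVideoComparison extends Sprite {
		private var netConnection:NetConnection;
		private var netStream:NetStream;
		private var video:Video;
		
		public function PlainVideoComparison() {
			video = new Video(640, 480);
			addChild(video);
			
			netConnection = new NetConnection();
			netConnection.addEventListener(NetStatusEvent.NET_STATUS, netConnection_netStatus);
			netConnection.connect(null);
		}
		
		private function netConnection_netStatus(e:NetStatusEvent):void {
			if (e.info.code == "NetConnection.Connect.Success"){
				netStream = new NetStream(netConnection);
				netStream.client = {onMetaData: getMeta, onPlayStatus: onPlayStatus};
				video.attachNetStream(netStream);
				netStream.play("neon.mp4");
			}
		}
		
		private function getMeta(mdata:Object):void {
		}
		
		private function onPlayStatus(infoObject:Object):void {
			// Loop by restarting the same file.
			netStream.play("neon.mp4");
		}
	}
}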

Related issues:
#2910
#2268
#1939
#1174
#1159
#587
#93
#92
#82
#16

Steps to Reproduce

Launch the application with the code below on any low-performance Android device and wait up to 30 minutes. It just plays a video in a loop using VideoTexture.
Note: disable the screen lock so the display does not go to sleep.

An example application with sources and the video is attached. Scout and LogCat logs from multiple devices are also attached.
android_videotexture_onplaystatus_render_stop_bug.zip

package {
	import flash.display.Sprite;
	import flash.display.Stage3D;
	import flash.events.Event;
	import flash.events.NetStatusEvent;
	import flash.net.NetStream;
	import flash.display3D.Context3D;
	import flash.display3D.IndexBuffer3D;
	import flash.geom.Matrix3D;
	import flash.net.NetConnection;
	import flash.display3D.textures.VideoTexture;
	import com.adobe.utils.AGALMiniAssembler;
	import flash.display3D.Context3DProgramType;
	import flash.display3D.Context3DVertexBufferFormat;
	import flash.display3D.VertexBuffer3D;
	import flash.display3D.Program3D;
	import flash.display3D.Context3DProfile;
	import flash.desktop.NativeApplication;
	import flash.desktop.SystemIdleMode;
	
	public class AndroidVideoTextureOnPlayStatusRenderStopBug extends Sprite {
		private var stage3D:Stage3D;
		private var context3D:Context3D;
		private var indexbuffer:IndexBuffer3D;
		private var matrix:Matrix3D;
		private var netConnection:NetConnection;
		private var netStream:NetStream;
		private var videoTexture:VideoTexture;
		
		public function AndroidVideoTextureOnPlayStatusRenderStopBug() {
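			// Keep the device awake so the display does not sleep during the long test run.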
			NativeApplication.nativeApplication.systemIdleMode = SystemIdleMode.KEEP_AWAKE;
			
			stage3D = stage.stage3Ds[0];
			stage3D.addEventListener(Event.CONTEXT3D_CREATE, contextCreated);
			stage3D.requestContext3DMatchingProfiles(Vector.<String>([Context3DProfile.BASELINE, Context3DProfile.BASELINE_EXTENDED, Context3DProfile.STANDARD, Context3DProfile.ENHANCED, Context3DProfile.STANDARD_CONSTRAINED, Context3DProfile.STANDARD_EXTENDED]));
		}
		
		private function contextCreated(event:Event):void {
			context3D = stage.stage3Ds[0].context3D;
			context3D.enableErrorChecking = true;
			context3D.configureBackBuffer(640, 480, 4, true, true, true);
			trace(context3D.driverInfo, context3D.profile);
			
			playVideo();
		}
		
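		// Tears down any previous NetStream/NetConnection/VideoTexture, rebuilds the Stage3D quad,
		// shaders and texture, then starts playback again.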
		private function playVideo():void {
			removeEventListener(Event.ENTER_FRAME, enterFrame);
			
			if (videoTexture != null){
				videoTexture.removeEventListener(Event.TEXTURE_READY, renderState);
				videoTexture.attachNetStream(null);
				videoTexture.dispose();
				videoTexture = null;
			}
			if (netStream != null){
				netStream.removeEventListener(NetStatusEvent.NET_STATUS, netStream_netStatus);
				netStream.close();
				netStream = null;
			}
			if (netConnection != null){
				netConnection.removeEventListener(NetStatusEvent.NET_STATUS, netConnection_netStatus);
				netConnection.close();
				netConnection = null;
			}
			
			
			var vertices:Vector.<Number> = Vector.<Number>([
				1, -1, 0, 1, 0,
				1, 1, 0, 1, 1,
				-1, 1, 0, 0, 1,
				-1,-1, 0, 0, 0
			]);
			
			var vertexbuffer:VertexBuffer3D = context3D.createVertexBuffer(4, 5);
			vertexbuffer.uploadFromVector(vertices, 0, 4);
			
			indexbuffer = context3D.createIndexBuffer(6);
			indexbuffer.uploadFromVector(Vector.<uint>([0, 1, 2, 2, 3, 0]), 0, 6);
			
			var vertexShaderAssembler:AGALMiniAssembler = new AGALMiniAssembler();
			vertexShaderAssembler.assemble(Context3DProgramType.VERTEX, "m44 op, va0, vc0\n" + "mov v0, va1");
			
			var fragmentShaderAssembler:AGALMiniAssembler = new AGALMiniAssembler();
			fragmentShaderAssembler.assemble( Context3DProgramType.FRAGMENT, "tex ft1, v0, fs0 <2d,linear, nomip>\n" + "mov oc, ft1");
			
			var program:Program3D = context3D.createProgram();
			program.upload( vertexShaderAssembler.agalcode, fragmentShaderAssembler.agalcode);
			
			context3D.setVertexBufferAt(0, vertexbuffer, 0, Context3DVertexBufferFormat.FLOAT_3);
			context3D.setVertexBufferAt(1, vertexbuffer, 3, Context3DVertexBufferFormat.FLOAT_2);
			
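			// Create the VideoTexture and bind it to sampler fs0 used by the fragment shader.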
			videoTexture = context3D.createVideoTexture();
			context3D.setTextureAt(0, videoTexture);
			
			context3D.setProgram(program);
			matrix = new Matrix3D();
			matrix.appendScale(1, -1, 1);
			context3D.setProgramConstantsFromMatrix(Context3DProgramType.VERTEX, 0, matrix, true);
			
			
			netConnection = new NetConnection();
			netConnection.addEventListener(NetStatusEvent.NET_STATUS, netConnection_netStatus);
			netConnection.connect(null);
		}
		
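		// Once the NetConnection is ready, create the NetStream, attach it to the VideoTexture and start playback.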
		private function netConnection_netStatus(e:NetStatusEvent):void {
			if (e.info.code == "NetConnection.Connect.Success"){
				netStream = new NetStream(netConnection);
				netStream.addEventListener(NetStatusEvent.NET_STATUS, netStream_netStatus);
				netStream.client = {onMetaData:getMeta, onPlayStatus:onPlayStatus};
				netStream.play("neon.mp4");
				videoTexture.attachNetStream(netStream);
				
				videoTexture.addEventListener(Event.TEXTURE_READY, renderState);
			}
		}
		
		private function netStream_netStatus(event:NetStatusEvent):void {
			trace(event.info.code);
			switch (event.info.code){
				case "NetStream.Play.StreamNotFound":
					trace("Stream not found");
					break;
				default:
					break;
			}
		}

		private function getMeta(mdata:Object):void {
			trace("metadata");
		}

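		// The NetStream client reports end of playback here (e.g. NetStream.Play.Complete); restart to loop.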
		private function onPlayStatus(infoObject:Object):void {
			trace("onPlayStatus", infoObject.code);
			playVideo();
		}

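		// First frame is decoded and ready: render it and keep rendering every frame.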
		private function renderState(e:Event):void {
			videoTexture.removeEventListener(Event.TEXTURE_READY, renderState);
			trace("renderState");
			render();
			addEventListener(Event.ENTER_FRAME, enterFrame);
		}
	
		private function enterFrame(event:Event):void {
			render();
		}

		private function render():void {
			context3D.clear(1, 0, 0, 1);
			context3D.drawTriangles(indexbuffer);
			context3D.present();
		}

	}
}

Actual Result:
After 5-13 minutes video rendering stops. A single frame stays displayed, or black/white or even solid green is shown instead of the VideoTexture.
Usually there are no errors at all in the Scout logs; in Scout everything looks as if the video played without any issues. Only once did we get NetStream.Play.Failed. Logs are attached above.

When video playback visibly stops, LogCat shows continuous errors such as:

E vpu     : VPUClientWaitResult ioctl VPU_IOC_GET_REG failed ret -1 errno 110 Connection timed out
E H264_DEBUG: VPUClientWaitResult ret -1
E H264_DEBUG: FATAL ERROR: -257
I ROCKCHIP_VIDEO_DEC: Rkvpu_SendInputData(426): pkt.size:23868, pkt.dts:800000,pkt.pts:800000,pkt.nFlags:0
E vpu     : VPUClientSendReg ioctl VPU_IOC_SET_REG failed ret -1 errno 14 Bad address
I ROCKCHIP_VIDEO_DEC: Rkvpu_SendInputData(426): pkt.size:26070, pkt.dts:840000,pkt.pts:840000,pkt.nFlags:0

or

I CreateVideoDecoder: using MediaCodec Decoder
W System.err: java.lang.NoSuchMethodError: no non-static method "Lcom/adobe/flashruntime/air/VideoTextureSurface;.getHeight()I"
W System.err: 	at com.adobe.air.customHandler.callTimeoutFunction(Native Method)
W System.err: 	at com.adobe.air.Entrypoints$1.handleMessage(Entrypoints.java:318)
W System.err: 	at android.os.Handler.dispatchMessage(Handler.java:102)
W System.err: 	at android.os.Looper.loop(Looper.java:154)
W System.err: 	at com.adobe.air.Entrypoints.run(Entrypoints.java:339)
System.err: 	at java.lang.Thread.run(Thread.java:761)

(see also #2910)
or

OMXNodeInstance: [bf0115:rk._decoder.avc] component does not support metadata mode; using fallback
ACodec  : [OMX.rk.video_decoder.avc] storeMetaDataInBuffers failed w/ err -1010
OMXNodeInstance: getConfig(bf0115:rk._decoder.avc, ConfigCommonOutputCrop(0x700000f)) ERROR: UnsupportedIndex(0x8000101a)
runtime : /4246: AndroidMediaCodec:: configure returned successfully 
ROCKCHIP_VIDEO_DECCONTROL: Rkvpu_OMX_AllocateBuffer(265): Rkvpu_OMX_AllocateBuffer in
ROCKCHIP_VIDEO_DECCONTROL: Rkvpu_OMX_AllocateBuffer(366): Rkvpu_OMX_AllocateBuffer in ret = 0x0
ROCKCHIP_VIDEO_DECCONTROL: Rkvpu_OMX_AllocateBuffer(265): Rkvpu_OMX_AllocateBuffer in
ROCKCHIP_VIDEO_DECCONTROL: Rkvpu_OMX_AllocateBuffer(366): Rkvpu_OMX_AllocateBuffer in ret = 0x0
ROCKCHIP_VIDEO_DECCONTROL: Rkvpu_OMX_AllocateBuffer(265): Rkvpu_OMX_AllocateBuffer in
ROCKCHIP_VIDEO_DECCONTROL: Rkvpu_OMX_AllocateBuffer(366): Rkvpu_OMX_AllocateBuffer in ret = 0x0
ROCKCHIP_VIDEO_DECCONTROL: Rkvpu_OMX_AllocateBuffer(265): Rkvpu_OMX_AllocateBuffer in
ROCKCHIP_VIDEO_DECCONTROL: Rkvpu_OMX_AllocateBuffer(366): Rkvpu_OMX_AllocateBuffer in ret = 0x0
ACodec  : setupNativeWindowSizeFormatAndUsage 991 colorSpace = 0,eDyncRange = 0
SurfaceUtils: set up nativeWindow 0xafcc8808 for 1920x1088, color 0x15, rotation 0, usage 0x2900
ROCKCHIP_VIDEO_DEC: Rkvpu_Dec_ComponentInit(1048): omx decoder info : author:  toor
ROCKCHIP_VIDEO_DEC:  time: Thu, 02 Sep 2021 22:45:44 +0800 git commit aa96f5c75c8838bd5f3e0a8f5543ec8bb0ea3026 
vpu_api : vpu_open_context in
runtime : /4246:  env->GetObjectArrayElement(array, i); is null
runtime : /4246:  env->GetObjectArrayElement(array, i); is null
runtime : /4246:  env->GetObjectArrayElement(array, i); is null

or

runtime : /6237: AndroidMediaCodec::configure() this = 0xd0cfe030
runtime : /6237: AndroidMediaCodec:: about to call configure
runtime : /6237: AndroidMediaCodec:: configure returned successfully 
runtime : /6237: Exception cause - java.lang.IllegalStateException

or

ion     : ioctl c0204900 failed with code -1: Out of memory
[Gralloc-ERROR]: int alloc_backend_alloc(alloc_device_t *, size_t, int, buffer_handle_t *):140 Failed to ion_alloc from ion_client:20
Gralloc1On0Adapter: gralloc0 allocation failed: -1 (Operation not permitted)
GraphicBufferAllocator: Failed to allocate (1920 x 1080) format 17 usage 33564928: 5
GraphicBufferAllocator: Allocated buffers:
GraphicBufferAllocator: 0x7576013820:  280.00 KiB | 1280 (1280) x   56 |        1 | 0x00000f02 | NavigationBar
GraphicBufferAllocator: 0x75760138c0:  280.00 KiB | 1280 (1280) x   56 |        1 | 0x00000f02 | NavigationBar
GraphicBufferAllocator: 0x7576013960:  280.00 KiB | 1280 (1280) x   56 |        1 | 0x00000f02 | NavigationBar
GraphicBufferAllocator: 0x7576013aa0:  140.00 KiB | 1280 (1280) x   28 |        1 | 0x00000f02 | StatusBar
GraphicBufferAllocator: 0x7576013b40:  140.00 KiB | 1280 (1280) x   28 |        1 | 0x00000f02 | StatusBar
GraphicBufferAllocator: 0x7576013be0:  140.00 KiB | 1280 (1280) x   28 |        1 | 0x00000f02 | StatusBar
GraphicBufferAllocator: 0x7576013d20: 3600.00 KiB | 1280 (1280) x  720 |        1 | 0x00000f02 | SurfaceView - android.videotexture.onplaystatus.render.stop.bug/android.videotexture.onplaystatus.render.stop.bug.AIRAppEntry
GraphicBufferAllocator: 0x7576013dc0: 3600.00 KiB | 1280 (1280) x  720 |        1 | 0x00000f02 | SurfaceView - android.videotexture.onplaystatus.render.stop.bug/android.videotexture.onplaystatus.render.stop.bug.AIRAppEntry
GraphicBufferAllocator: 0x7576442780: 3600.00 KiB | 1280 (1280) x  720 |        1 | 0x00001e
        : GraphicBufferAlloc::createGraphicBuffer(w=1920, h=1080) failed (Out of memory), handle=0x0
BufferQueueProducer: [SurfaceTexture-3-5687-26] dequeueBuffer: createGraphicBuffer failed
ACodec  : dequeueBuffer failed: Out of memory (12)
BufferQueueCore: [SurfaceTexture-3-5687-26] Slot 5 is in mFreeSlots and in mActiveBuffers
BufferQueueCore: [SurfaceTexture-3-5687-26] Slot 5 is in mFreeSlots and in mActiveBuffers
BufferQueueCore: [SurfaceTexture-3-5687-26] Slot 5 is in mFreeSlots and in mActiveBuffers
BufferQueueCore: [SurfaceTexture-3-5687-26] Slot 5 is in mFreeSlots and in mActiveBuffers
BufferQueueCore: [SurfaceTexture-3-5687-26] Slot 5 is in mFreeSlots and in mActiveBuffers
ACodec  : Failed to allocate buffers after transitioning to IDLE state (error 0xfffffff4)
ACodec  : signalError(omxError 0x80001001, internalError -12)

or

[EGL-ERROR]: struct mali_image *_egl_android_map_native_buffer_yuv(android_native_buffer_t *):125: Unable to allocate memory for YUV image (1920 x 1080 12536)
GLConsumer: error creating EGLImage: 0x3003
GLConsumer: Failed to create image. size=1920x1080 st=1920 usage=0x2130 fmt=842094169
GLConsumer: [SurfaceTexture-3-13978-282] updateAndRelease: unable to createImage on display=0x1 slot=1
System.err: java.lang.RuntimeException: Error during updateTexImage (see logcat for details)
System.err: 	at android.graphics.SurfaceTexture.nativeUpdateTexImage(Native Method)
System.err: 	at android.graphics.SurfaceTexture.updateTexImage(SurfaceTexture.java:244)
System.err: 	at com.adobe.flashruntime.air.VideoTextureSurface.updateSurfaceTextureTexImage(VideoTextureSurface.java:49)
System.err: 	at com.adobe.air.customHandler.callTimeoutFunction(Native Method)
System.err: 	at com.adobe.air.Entrypoints$1.handleMessage(Entrypoints.java:318)
System.err: 	at android.os.Handler.dispatchMessage(Handler.java:102)
System.err: 	at android.os.Looper.loop(Looper.java:154)
System.err: 	at com.adobe.air.Entrypoints.run(Entrypoints.java:339)
System.err: 	at java.lang.Thread.run(Thread.java:761)

or

runtime : /13998: AndroidMediaCodec::Initialized
runtime : /13998: ndroidMediaFormat::InitClass initialized
MediaPlayerService: MediaPlayerService::getOMX
OMXClient: MuxOMX ctor
OMXMaster: makeComponentInstance(OMX.amlogic.avc.decoder.awesome) in mediacodec process
OmxComponentManagerImpl: support frame mode
OmxComponentManagerImpl: format support multi-instance
OmxComponentManagerImpl: getEntryByName_2_num=9, componentName:OMX.amlogic.avc.decoder.awesome
OMX     : FAILED to allocate omx component 'OMX.amlogic.avc.decoder.awesome' err=InsufficientResources(0x80001000)
ACodec  : Allocating component 'OMX.amlogic.avc.decoder.awesome' failed, try next one.

Full LogCat logs are attached above.
*After a long period (1-2+ hours) the sample may also cause a NetStream.Play.Failed error or an application crash (maybe a memory leak?). It may not be related to this issue.

Expected Result:
The application plays the video in a loop without errors or stops.

Known Workarounds

none
