
Text overlay on top of TextureView and render to save it along with video. Camera2 or CameraX API #571

Open
hiteshtechshslok opened this issue Feb 7, 2024 · 11 comments

Comments

@hiteshtechshslok

Hello,

I am trying to add a text overlay on top of a TextureView using the Camera2 API. I have been searching for the last week and so far have found nothing.

[screenshot: an example of the kind of timestamp overlay desired]
Even something like this would work.

Please let me know if it is possible; any how-tos would be helpful.

Regards
Hitesh

@xizhang

xizhang commented Feb 13, 2024

If you are using CameraX, you can check out the OverlayEffect API.

You want to create an OverlayEffect with a queue depth of 0, targeting both Preview and VideoCapture. Then set a listener via OverlayEffect#setOnDrawListener. For every new frame that is about to be drawn, you will get a callback in the listener; you can then use Frame#getOverlayCanvas to get the Canvas for drawing the text.
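For a concrete starting point, a minimal Kotlin sketch of that setup might look like the following (the paint values, text, and main-looper handler are arbitrary choices for illustration, not requirements of the API):

import android.graphics.Color
import android.graphics.Paint
import android.graphics.PorterDuff
import android.os.Handler
import android.os.Looper
import android.util.Log
import androidx.camera.core.CameraEffect
import androidx.camera.effects.OverlayEffect

val overlayEffect = OverlayEffect(
    CameraEffect.PREVIEW or CameraEffect.VIDEO_CAPTURE, // target both outputs
    0,                                                  // queue depth 0, as described above
    Handler(Looper.getMainLooper())
) { t -> Log.e("Overlay", "OverlayEffect error", t) }

val paint = Paint().apply {
    color = Color.RED
    textSize = 48f
}

overlayEffect.setOnDrawListener { frame ->
    // Clear whatever was drawn for the previous frame, then draw the text.
    frame.overlayCanvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR)
    frame.overlayCanvas.drawText("Hello", 40f, 80f, paint)
    true // returning true tells CameraX to render this frame
}

// Finally, add the effect when binding (UseCaseGroup.Builder#addEffect or
// CameraController#setEffects), as discussed later in this thread.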

Please let us know if you run into any issues.

@hiteshtechshslok
Author

Hello @xizhang

Any idea how to do this with the Camera2 API and a SurfaceTexture?

startPreview() works as-is on the TextureView:

private void startPreview() {
        SurfaceTexture surfaceTexture = mTextureView.getSurfaceTexture();
        surfaceTexture.setDefaultBufferSize(1920, 1080); // Set the desired size

        Surface surface = new Surface(surfaceTexture);

        try {
            final CaptureRequest.Builder captureRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            captureRequestBuilder.addTarget(surface);

            // Apply black-and-white effect
            captureRequestBuilder.set(CaptureRequest.CONTROL_EFFECT_MODE, CaptureRequest.CONTROL_EFFECT_MODE_MONO);

            mCameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    try {
                        session.setRepeatingRequest(captureRequestBuilder.build(), new CameraCaptureSession.CaptureCallback() {
                            @Override
                            public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                                super.onCaptureCompleted(session, request, result);
                                Log.d(TAG, "Capturing");
                                drawTimestampOverlay();

                            }
                        }, null);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                    Log.e(TAG, "Failed to configure camera preview");
                }
            }, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

Any idea how to do the same for the SurfaceTexture-based recording path below?

private void startRecording() {
        if (mCameraDevice == null || mTextureView.getSurfaceTexture() == null) {
            return;
        }

        mMediaRecorder = new MediaRecorder();
        if (((CheckBox) findViewById(R.id.audioCheckbox)).isChecked()) {
            Log.e(TAG, "Check box was set");
            mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        }
        mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
        mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);

        String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault()).format(new Date());
        File videoFile = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM), "VIDEO_" + timeStamp + ".mp4");
        mMediaRecorder.setOutputFile(videoFile.getAbsolutePath());

        mMediaRecorder.setVideoEncodingBitRate(10000000);
        mMediaRecorder.setVideoFrameRate(30);
        mMediaRecorder.setVideoSize(1280, 720);
        mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        if (((CheckBox) findViewById(R.id.audioCheckbox)).isChecked()) {
            Log.e(TAG, "Check box was set");
            mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        }

        try {
            mMediaRecorder.prepare();
        } catch (IOException e) {
            e.printStackTrace();
            return;
        }

        Log.d(TAG, "Recording");
        mLastFrameTime=System.nanoTime();
        drawTimestampOverlay();
        SurfaceTexture surfaceTexture = mTextureView.getSurfaceTexture();
        surfaceTexture.setDefaultBufferSize(1920, 1080); // Set the desired size

        Surface surface = new Surface(surfaceTexture);

        List<Surface> surfaces = new ArrayList<>();
        surfaces.add(surface);
        surfaces.add(mMediaRecorder.getSurface());

        try {
            final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
            captureBuilder.addTarget(surface);
            captureBuilder.addTarget(mMediaRecorder.getSurface());

            mCameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    try {
                        mCaptureSession = session;
                        mCaptureSession.setRepeatingRequest(captureBuilder.build(), new CameraCaptureSession.CaptureCallback() {
                            @Override
                            public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                                super.onCaptureCompleted(session, request, result);
                                Log.d(TAG, "Recording2");
//                                mFrameCount++;

//                                drawTimestampOverlay2();
                            }
                        }, null);
                        mMediaRecorder.start();
                        mIsRecording = true;
                        mRecordButton.setText("Stop");
//                        drawTimestampOverlay();
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                    Log.e(TAG, "Capture session configuration failed");
                }
            }, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

I tried using a Canvas, but since the Surface is already connected to the camera, it cannot be locked.
Any idea how to use OpenGL for this?

@xizhang

xizhang commented Feb 20, 2024

The camera2 API is only recommended for lower-level controls. If you wish to go this route, you can create an OpenGL renderer between the camera and the TextureView/MediaCodec. Then, in the shaders, you can draw the timestamp on top of the input. Example given by Gemini: https://g.co/gemini/share/b1c06fd6a9e8
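To make that architecture a bit more concrete, here is a rough Kotlin sketch of the camera-input side only (the helper name is arbitrary, and it assumes it runs on a GL thread with an EGL context already current; the actual renderer, shaders, and the EGL window surfaces created from the TextureView and MediaRecorder#getSurface() are not shown):

import android.graphics.SurfaceTexture
import android.opengl.GLES11Ext
import android.opengl.GLES20
import android.view.Surface

// The camera no longer draws into the TextureView directly. It draws into a
// GL_TEXTURE_EXTERNAL_OES texture; your renderer samples that texture, draws the
// timestamp on top, and renders the result into two EGL window surfaces: one for
// the preview (TextureView) and one for the MediaRecorder.
fun createCameraInputSurface(onFrameAvailable: (SurfaceTexture) -> Unit): Surface {
    val tex = IntArray(1)
    GLES20.glGenTextures(1, tex, 0)
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0])
    GLES20.glTexParameteri(
        GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MIN_FILTER,
        GLES20.GL_LINEAR
    )

    val cameraTexture = SurfaceTexture(tex[0]).apply {
        setDefaultBufferSize(1920, 1080)
        setOnFrameAvailableListener { st ->
            // Called for every camera frame. On the GL thread: st.updateTexImage(),
            // draw the frame plus the overlay, then eglSwapBuffers() on both outputs.
            onFrameAvailable(st)
        }
    }

    // Use this Surface as the target of the Camera2 capture request instead of the
    // Surface built from the TextureView's SurfaceTexture.
    return Surface(cameraTexture)
}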

However, I should point out that this is basically what CameraX does. It's not an easy path to build yourself and make work across different devices. For overlaying simple text, the existing CameraX API should just work. Please let us know if the existing API does not meet your needs.

@Raulsc9

Raulsc9 commented Apr 24, 2024

If you are using CameraX, you can check out the OverlayEffect API.

You want to create an OverlayEffect with a queue depth of 0, targeting both Preview and VideoCapture. Then set a listener via OverlayEffect#setOnDrawListener. For every new frame that is about to be drawn, you will get a callback in the listener; you can then use Frame#getOverlayCanvas to get the Canvas for drawing the text.

Please let us know if you run into any issues.

Good morning!

Could you give an example of how to use OverlayEffect to apply a watermark to a video shot with CameraX?

However much I read the OverlayEffect API documentation, I can't figure out how to use it.

Best regards

@xizhang

xizhang commented Apr 24, 2024

You can take a look at this WIP change for code samples: https://android-review.git.corp.google.com/c/platform/frameworks/support/+/2797834/9/camera/camera-effects/src/main/java/androidx/camera/effects/BitmapOverlayEffect.java

Otherwise you can post your detailed question on [email protected] and our engineers will be able to help you.

@Raulsc9

Raulsc9 commented Apr 25, 2024

You can take a look at this WIP change for code samples: https://android-review.git.corp.google.com/c/platform/frameworks/support/+/2797834/9/camera/camera-effects/src/main/java/androidx/camera/effects/BitmapOverlayEffect.java

Otherwise you can post your detailed question on [email protected] and our engineers will be able to help you.

The link you passed me requires a "@google.com" account, and the group you suggest I write to does not allow me to post questions.

With what little I understood from the documentation, and taking into account your first answer, I created an OverlayEffect object in this way:

val handler = Handler(Looper.getMainLooper())
        val errorListener = Consumer<Throwable> { error -> println("Error: ${error.message}") }
        val overlayEffect = OverlayEffect(
            CameraEffect.VIDEO_CAPTURE,
            0,
            handler,
            errorListener
        )

        // setOnDrawListener takes a Function<Frame, Boolean>; a lambda is enough here.
        overlayEffect.setOnDrawListener { frame ->
            val canvas = frame.overlayCanvas

            val textPaint = Paint().apply {
                color = Color.RED
                textSize = 50f
                isFakeBoldText = true
                textAlign = Paint.Align.CENTER
            }

            canvas.drawText("WATERMARK", 200f, 200f + textPaint.textSize, textPaint)

            true
        }

But I can't figure out where I should pass the OverlayEffect so that the watermark is applied to the video.

Thank you for your response.

@xizhang

xizhang commented Apr 29, 2024

Sorry about the wrong link. I was having trouble with my work laptop for the past few days. This is the right one: https://android-review.googlesource.com/c/platform/frameworks/support/+/2797834

You set the effect via UseCaseGroup.Builder#addEffect, or the CameraController#setEffects API if you are using CameraController.
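If you go the CameraController route, a sketch would look roughly like this (imports omitted; it assumes the overlayEffect built earlier in this thread, plus a PreviewView named previewView, a context, and a lifecycleOwner, all of which are placeholders):

val controller = LifecycleCameraController(context).apply {
    setEnabledUseCases(CameraController.VIDEO_CAPTURE) // enable video on top of preview
    setEffects(setOf(overlayEffect))                   // same OverlayEffect as above
    bindToLifecycle(lifecycleOwner)
}
previewView.controller = controller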

@minsu-kim-skyautonet

minsu-kim-skyautonet commented Apr 30, 2024

Sorry about the wrong link. I was having trouble with my work laptop for the past few days. This is the right one: https://android-review.googlesource.com/c/platform/frameworks/support/+/2797834

You set the effect via UseCaseGroup.Builder#addEffect, or the CameraController#setEffects API if you are using CameraController.

I applied the overlay effect based on several comments, but when I targeted both the preview and the video capture, the overlay effect was only applied to the preview. When I set only one target, the overlay effect was applied to that target without issue.
How do I apply the overlay effect to both the preview and the video capture?

Overlay effect generation code

        val handler = Handler(Looper.getMainLooper())
        val overlayEffect = OverlayEffect(
            CameraEffect.VIDEO_CAPTURE or CameraEffect.PREVIEW,
            0,
            handler
        ) {
            Logger.e(it, "overlayEffect error")
        }

        val textPaint = Paint().apply {
            color = Color.RED
            textSize = 50f
        }
        overlayEffect.clearOnDrawListener()
        overlayEffect.setOnDrawListener {
            it.overlayCanvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR)
            it.overlayCanvas.drawText(
                getTimeText(),
                30f,
                30f + textPaint.textSize,
                textPaint,
            )
            true
        }

Code to apply the overlay effect

           val useCaseGroupBuilder = UseCaseGroup.Builder()
               .addUseCase(videoCapture)
               .addUseCase(preview)
               .addEffect(overlayEffect)


           val camera = cameraProvider.bindToLifecycle(
               lifecycleOwner,
               CameraSelector.DEFAULT_BACK_CAMERA,
               useCaseGroupBuilder.build()
           )

@xizhang

xizhang commented Apr 30, 2024 via email

@minsu-kim-skyautonet

minsu-kim-skyautonet commented May 2, 2024

Your configuration looks good. It should apply to both preview and video capture. Things to try:

1. Upgrade CameraX to the latest version.
2. Set a ViewPort: https://developer.android.com/reference/androidx/camera/core/UseCaseGroup.Builder#setViewPort(androidx.camera.core.ViewPort). I wonder if the overlay in video capture was cropped out due to transformation issues.
3. Use the CameraController API. CameraController is a high-level API that takes care of configuration such as the viewport, which makes it less error-prone.

Otherwise, if you can upload a minimal reproducible code sample to GitHub, I am happy to take a look at it.

Thank you for your response. I tried moving the text output coordinates to about the center of the screen and it looks fine. You were right about the problem being caused by the cropping.
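For reference, suggestion 2 above (one shared ViewPort so preview and video capture get the same crop) would look roughly like this in the UseCaseGroup setup from the earlier comment; the 9:16 aspect ratio and rotation are placeholder values:

val viewPort = ViewPort.Builder(Rational(9, 16), Surface.ROTATION_0).build()

val useCaseGroup = UseCaseGroup.Builder()
    .setViewPort(viewPort)          // both use cases share this crop region
    .addUseCase(preview)
    .addUseCase(videoCapture)
    .addEffect(overlayEffect)
    .build()

cameraProvider.bindToLifecycle(
    lifecycleOwner,
    CameraSelector.DEFAULT_BACK_CAMERA,
    useCaseGroup
)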

@undextrois

Were you able to implement it?
