
Low Bitrate and framerate issue #523

Open
DannyFilo opened this issue Oct 16, 2023 · 4 comments

Comments

@DannyFilo

Hello everyone. Hope you're having a nice day.

I see similar issues were reported in the past, but they were closed without any real solution.
The default bitrate and framerate for 1920x1080 video are unfortunately terrible: the output has artifacts like a 240p video and tops out at 20-24 fps on a Galaxy S21.

The issue seems to be that the default bitrate is hardcoded to 1650000 in CGEFrameRecorder.java:

    public boolean startRecording(int fps, String filename) {
        return startRecording(fps, 1650000, filename);
    }

Additionally, the method public boolean startRecording(int fps, int bitRate, String filename) from CGEFrameRecorder.java has no matching startRecording overload in CameraRecordGLSurfaceView, and I mostly use CameraRecordGLSurfaceView in my project.
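For scale, a quick heuristic shows why 1650000 bps is low for 1080p. The 0.1 bits-per-pixel factor below is a common H.264 rule of thumb, not a value from this library:

```java
public class BitrateEstimate {

    // width * height * fps * bitsPerPixel, rounded to whole bits per second.
    // ~0.1 bpp is a frequently quoted heuristic for decent H.264 quality.
    static long estimateBitrate(int width, int height, int fps, double bitsPerPixel) {
        return Math.round((long) width * height * fps * bitsPerPixel);
    }

    public static void main(String[] args) {
        // 1080p at 30 fps and ~0.1 bpp -> about 6.2 Mbps,
        // roughly 4x the hardcoded 1650000 bps.
        System.out.println(estimateBitrate(1920, 1080, 30, 0.1)); // prints 6220800
    }
}
```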

Could anyone help me add a proper startRecording method with fps and bitrate parameters to CameraRecordGLSurfaceView, so that I can use my own bitrate and fps?

100 thanks in advance,

@wysaid
Owner

wysaid commented Oct 17, 2023

Sure, change the code to call this function:
https://github.com/wysaid/android-gpuimage-plus/blob/master/library/src/main/java/org/wysaid/nativePort/CGEFrameRecorder.java#L27

Or you can extend CameraRecordGLSurfaceView, call your own record function, and pass the right bitrate.
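The suggestion above might look roughly like this. This is a sketch, not a tested implementation: it assumes the base class keeps its recorder in a protected mFrameRecorder field, and that recording must be started on the GL thread via queueEvent.

```java
// Hypothetical subclass; everything except CGEFrameRecorder.startRecording's
// (fps, bitRate, filename) signature is an assumption.
public class BitrateCameraRecordView extends CameraRecordGLSurfaceView {

    public BitrateCameraRecordView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    // Forward a caller-chosen fps and bitrate to the inherited recorder.
    public void startRecording(final int fps, final int bitRate, final String filename) {
        queueEvent(new Runnable() {
            @Override
            public void run() {
                // mFrameRecorder is assumed to be the protected field that the
                // base view created and initialized on the GL thread.
                if (mFrameRecorder != null) {
                    mFrameRecorder.startRecording(fps, bitRate, filename);
                }
            }
        });
    }
}
```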

@DannyFilo
Author

DannyFilo commented Oct 17, 2023

Hi, many thanks for your suggestion.

I actually implemented ExtendedCameraRecordGLSurfaceView. The code compiles, but I have one issue with the initialisation of private CGEFrameRecorder mFrameRecorder = new CGEFrameRecorder();. startRecording(fps, bitRate, recordFilename) always returns false, so I get the Log.i(LOG_TAG, "start recording failed!") from the code below and the camera doesn't record anything.

I'd appreciate any suggestion on how to correctly initialise mFrameRecorder, as you're the master of this package :)

Full code of ExtendedCameraRecordGLSurfaceView:


import android.Manifest;
import android.annotation.SuppressLint;
import android.app.Activity;
import android.content.Context;
import android.content.ContextWrapper;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.res.AssetManager;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.hardware.Camera;
import android.net.Uri;
import android.os.Build;
import android.util.AttributeSet;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;
import android.view.ViewGroup;
import android.widget.FrameLayout;

import androidx.annotation.NonNull;

import androidx.annotation.Nullable;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import androidx.core.view.ViewCompat;

import org.wysaid.common.Common;
import org.wysaid.myUtils.FileUtil;
import org.wysaid.myUtils.ImageUtil;
import org.wysaid.nativePort.CGEFrameRecorder;
import org.wysaid.nativePort.CGENativeLibrary;
import org.wysaid.view.CameraRecordGLSurfaceView;

import io.flutter.plugin.common.BinaryMessenger;
import io.flutter.plugin.common.MethodCall;
import io.flutter.plugin.common.MethodChannel;
import io.flutter.plugin.common.PluginRegistry;
import io.flutter.plugin.platform.PlatformView;

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Objects;


import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;

public class ExtendedCameraRecordGLSurfaceView extends CameraRecordGLSurfaceView {

   private AudioRecordRunnable mAudioRecordRunnable;
   private Thread mAudioThread;
   private final Object mRecordStateLock = new Object();
   private boolean mShouldRecord = false;

   private CGEFrameRecorder mFrameRecorder = new CGEFrameRecorder();

   public ExtendedCameraRecordGLSurfaceView(Context context, AttributeSet attrs) {
       super(context, attrs);
   }

   public void startRecording(int fps, int bitRate, String recordFilename, MethodChannel.Result result) {
       
       Log.i(LOG_TAG, "I'm in startRecording" + recordFilename);
       new Runnable() {
           @Override
           public void run() {
               Log.i(LOG_TAG, "I'm in run");
               if (mFrameRecorder == null) {
                   Log.i(LOG_TAG, "Error: startRecording after release!!");
                   result.success(false);
                   return;
               }

               if (!mFrameRecorder.startRecording(fps, bitRate, recordFilename)) {
                   Log.i(LOG_TAG, "start recording failed!");
                   result.success(false);
                   return;
               }
               Log.i(LOG_TAG, "glSurfaceView recording, file: " + recordFilename);
               synchronized (mRecordStateLock) {
                   mShouldRecord = true;
                   mAudioRecordRunnable = new AudioRecordRunnable(new StartRecordingCallback() {
                       @Override
                       public void startRecordingOver(boolean success) {
                           result.success(success);
                       }
                   });
                   if (mAudioRecordRunnable.audioRecord != null) {
                       mAudioThread = new Thread(mAudioRecordRunnable);
                       mAudioThread.start();
                   }
               }
           }
       }.run();
   }

   private interface StartRecordingCallback {
       void startRecordingOver(boolean success);
   }

   private class AudioRecordRunnable implements Runnable {
       int bufferSize;
       int bufferReadResult;
       public AudioRecord audioRecord;
       public volatile boolean isInitialized;
       private static final int sampleRate = 44100;
       ByteBuffer audioBufferRef;
       ShortBuffer audioBuffer;
       StartRecordingCallback recordingCallback;

       private AudioRecordRunnable(StartRecordingCallback callback) {
           recordingCallback = callback;
           try {
               bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
               Log.i(LOG_TAG, "audio min buffer size: " + bufferSize);
               audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
               audioBufferRef = ByteBuffer.allocateDirect(bufferSize * 2).order(ByteOrder.nativeOrder());
               audioBuffer = audioBufferRef.asShortBuffer();
           } catch (Exception e) {
               if (audioRecord != null) {
                   audioRecord.release();
                   audioRecord = null;
               }
           }

           if (audioRecord == null && recordingCallback != null) {
               recordingCallback.startRecordingOver(false);
               recordingCallback = null;
           }
       }

       public void run() {
           android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
           this.isInitialized = false;

           if (this.audioRecord == null) {
               recordingCallback.startRecordingOver(false);
               recordingCallback = null;
               return;
           }

           while (this.audioRecord.getState() == 0) {
               try {
                   Thread.sleep(100L);
               } catch (InterruptedException localInterruptedException) {
                   localInterruptedException.printStackTrace();
               }
           }
           this.isInitialized = true;

           try {
               this.audioRecord.startRecording();
           } catch (Exception e) {
               if (recordingCallback != null) {
                   recordingCallback.startRecordingOver(false);
                   recordingCallback = null;
               }
               return;
           }

           if (this.audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
               if (recordingCallback != null) {
                   recordingCallback.startRecordingOver(false);
                   recordingCallback = null;
               }
               return;
           }

           if (recordingCallback != null) {
               recordingCallback.startRecordingOver(true);
               recordingCallback = null;
           }

           while (true) {
               synchronized (mRecordStateLock) {
                   if (!mShouldRecord)
                       break;
               }

               audioBufferRef.position(0);
               bufferReadResult = this.audioRecord.read(audioBufferRef, bufferSize * 2);
               if (mShouldRecord && bufferReadResult > 0 && mFrameRecorder != null &&
                       mFrameRecorder.getTimestamp() > mFrameRecorder.getAudioStreamtime()) {
                   audioBuffer.position(0);
                   mFrameRecorder.recordAudioFrame(audioBuffer, bufferReadResult / 2);
               }
           }
           this.audioRecord.stop();
           this.audioRecord.release();
           Log.i(LOG_TAG, "Audio thread end!");
       }
   }
}


I call it in the following way from my project (mCameraView is an instance of ExtendedCameraRecordGLSurfaceView):

    mCameraView.startRecording(30, 1650000, recordFilename, new MethodChannel.Result() {
        @Override
        public void success(Object result) {
            Log.i(LOG_TAG, "Start recording OK");
        }

        @Override
        public void error(String errorCode, String errorMessage, Object errorDetails) {
            Log.i(LOG_TAG, "Start recording failed");
        }

        @Override
        public void notImplemented() {
            // todo
        }
    });
   

@wysaid
Owner

wysaid commented Oct 18, 2023

Please give a fork of this repo that reproduces the problem.

@DannyFilo
Author

DannyFilo commented Oct 19, 2023

Hi, I have described the problem above.

As per your advice, I extended CameraRecordGLSurfaceView to ExtendedCameraRecordGLSurfaceView and I'm trying to run its startRecording(int fps, int bitRate, String recordFilename, MethodChannel.Result result) method on a new private CGEFrameRecorder mFrameRecorder = new CGEFrameRecorder(); instance, to be able to increase the bitrate and fps of camera recordings.

The only issue I have is that mFrameRecorder.startRecording(fps, bitRate, recordFilename) always returns false, probably because of incorrect initialisation of CGEFrameRecorder.

I guess initialising the instance with CGEFrameRecorder mFrameRecorder = new CGEFrameRecorder(); is not enough to get a correctly working mFrameRecorder object.

So I'd appreciate a hint on how to get a correctly initialised CGEFrameRecorder object that doesn't always return false.
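For what it's worth, one hedged guess at the cause (an assumption, not confirmed in this thread): new CGEFrameRecorder() only creates the Java wrapper, while the base view creates and initializes its own recorder with the camera's GL context on the GL thread; the extra field in the subclass shadows that working instance, and the startRecording call also runs on the caller's thread rather than the GL thread. A sketch of the change, assuming mFrameRecorder in the base class is protected:

```java
// Inside ExtendedCameraRecordGLSurfaceView: drop the extra field
//   private CGEFrameRecorder mFrameRecorder = new CGEFrameRecorder();
// and call the inherited, already-initialized recorder on the GL thread.
public void startRecordingWithBitrate(final int fps, final int bitRate, final String filename) {
    queueEvent(new Runnable() {
        @Override
        public void run() {
            // The inherited mFrameRecorder (assumed protected in the base class)
            // was set up with the camera's GL context, unlike a fresh instance.
            if (mFrameRecorder == null || !mFrameRecorder.startRecording(fps, bitRate, filename)) {
                Log.i(LOG_TAG, "start recording failed!");
            }
        }
    });
}
```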

Many, many thanks
