
Recording audio and video with Camera2 + OpenGL ES + MediaCodec + AudioRecord, and writing H.264 SEI data

A record of my learning process: the task was to record audio and video based on Camera2 + OpenGL ES + MediaCodec + AudioRecord.

Requirements:

  1. Write extra SEI data into every video frame, so that per-frame custom data can be recovered when the stream is decoded later.
  2. When recording is triggered, save the audio/video from N seconds before the trigger to N seconds after it; output files are segmented at 60 s each.

Up front: the project touches on the following topics, so you can quickly check whether it covers what you are looking for.

  • Using MediaCodec, with createInputSurface() creating a Surface that receives the Camera2 frames via EGL
  • Using AudioRecord
  • Using Camera2
  • Basic OpenGL usage
  • A simple example of writing H264 SEI data
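Before diving in, here is the SEI byte layout this post uses, isolated as a plain-Java sketch. It mirrors the simplified format of the `buildSEIData` method shown later (start code, NAL type 6, payload type 5, one size byte, payload); the class name `SeiSketch` is mine, and note a fully spec-compliant SEI would also carry a 16-byte UUID and emulation-prevention bytes.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Simplified SEI NAL unit as used in this post:
// start code (00 00 00 01) + NAL header 0x06 (SEI) + payload type 5
// (user data unregistered) + 1-byte payload size + payload bytes.
public class SeiSketch {
    public static byte[] buildSei(String message) {
        byte[] payload = message.getBytes(StandardCharsets.UTF_8);
        ByteBuffer buf = ByteBuffer.allocate(7 + payload.length);
        buf.put(new byte[]{0, 0, 0, 1, 6, 5}); // start code + NAL type 6 + payload type 5
        buf.put((byte) payload.length);        // single size byte, so payload must stay < 255
        buf.put(payload);
        return buf.array();
    }
}
```

A decoder that knows this layout can scan each access unit for the `00 00 00 01 06 05` prefix and read the custom string back out.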

The overall design is straightforward: open the camera and set up the OpenGL environment, then start a video thread to capture video data and an audio thread to capture audio data. Both streams are buffered in custom lists as caches, and a separate encoding thread muxes the buffered video and audio lists into an MP4 file. The project targets Android SDK 28, because saving files is more cumbersome on API 29 and above. The full project is not uploaded yet; message me if you need it.
Each feature is modularized into its own class. The standalone modules are introduced first.

UI layout

The UI is simple: one GLSurfaceView and two Button controls.


<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <android.opengl.GLSurfaceView
        android:id="@+id/glView"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <Button
        android:id="@+id/recordBtn"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginBottom="80dp"
        android:text="Record"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintLeft_toLeftOf="parent"
        app:layout_constraintRight_toRightOf="parent" />

    <Button
        android:id="@+id/exit"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginTop="20dp"
        android:layout_marginRight="20dp"
        android:text="Exit"
        app:layout_constraintTop_toTopOf="parent"
        app:layout_constraintRight_toRightOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>

Camera2

Using the Camera2 framework is fairly simple. One point worth noting: the Surface passed into startPreview is later used as the argument to mCaptureRequestBuilder.addTarget(surface). That Surface is produced in the following basic steps, briefly listed here; the code follows below.
1. Generate an OpenGL texture: GLES30.glGenTextures(1, mTexture, 0);
2. Wrap the texture in a SurfaceTexture: mSurfaceTexture = new SurfaceTexture(mTexture[0]);
3. Create a Surface from the SurfaceTexture: mSurface = new Surface(mSurfaceTexture);
4. Start the preview: mCamera.startPreview(mSurface);

public class Camera2 {
    private final String TAG = "Abbott Camera2";
    private Context mContext;
    private CameraManager mCameraManager;
    private CameraDevice mCameraDevice;
    private String[] mCamList;
    private String mCameraId;
    private Size mPreviewSize;
    private HandlerThread mBackgroundThread;
    private Handler mBackgroundHandler;
    private CaptureRequest.Builder mCaptureRequestBuilder;
    private CaptureRequest mCaptureRequest;
    private CameraCaptureSession mCameraCaptureSession;

    public Camera2(Context context) {
        mContext = context;
        mCameraManager = (CameraManager) mContext.getSystemService(Context.CAMERA_SERVICE);
        try {
            mCamList = mCameraManager.getCameraIdList();
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        mBackgroundThread = new HandlerThread("CameraThread");
        mBackgroundThread.start();
        mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
    }

    public void openCamera(int width, int height, String id) {
        try {
            Log.d(TAG, "openCamera: id:" + id);
            CameraCharacteristics characteristics = mCameraManager.getCameraCharacteristics(id);
            if (characteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT) {
                // no special handling for front-facing cameras
            }
            StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            mPreviewSize = getOptimalSize(map.getOutputSizes(SurfaceTexture.class), width, height);
            mCameraId = id;
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        try {
            if (ActivityCompat.checkSelfPermission(mContext, android.Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                return;
            }
            Log.d(TAG, "mCameraManager.openCamera: " + mCameraId);
            mCameraManager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private Size getOptimalSize(Size[] sizeMap, int width, int height) {
        List<Size> sizeList = new ArrayList<>();
        for (Size option : sizeMap) {
            if (width > height) {
                if (option.getWidth() > width && option.getHeight() > height) {
                    sizeList.add(option);
                }
            } else {
                if (option.getWidth() > height && option.getHeight() > width) {
                    sizeList.add(option);
                }
            }
        }
        if (sizeList.size() > 0) {
            return Collections.min(sizeList, new Comparator<Size>() {
                @Override
                public int compare(Size lhs, Size rhs) {
                    return Long.signum((long) lhs.getWidth() * lhs.getHeight() - (long) rhs.getWidth() * rhs.getHeight());
                }
            });
        }
        return sizeMap[0];
    }

    private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            mCameraDevice = camera;
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            camera.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            camera.close();
            mCameraDevice = null;
        }
    };

    public void startPreview(Surface surface) {
        try {
            mCaptureRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mCaptureRequestBuilder.addTarget(surface);
            mCameraDevice.createCaptureSession(Collections.singletonList(surface), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    try {
                        mCaptureRequest = mCaptureRequestBuilder.build();
                        mCameraCaptureSession = session;
                        mCameraCaptureSession.setRepeatingRequest(mCaptureRequest, null, mBackgroundHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                }
            }, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
}
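The size-selection logic in getOptimalSize can be checked in isolation. Below is a plain-Java sketch of the same rule (int pairs stand in for android.util.Size so it can run off-device; the class name `OptimalSize` is mine): among all candidates strictly larger than the request, swapping width/height for portrait, pick the one with the smallest area, otherwise fall back to the first entry.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Plain-Java model of Camera2.getOptimalSize.
public class OptimalSize {
    public static int[] pick(int[][] sizes, int width, int height) {
        List<int[]> candidates = new ArrayList<>();
        for (int[] s : sizes) {
            // Landscape request compares directly; portrait compares swapped.
            if (width > height ? (s[0] > width && s[1] > height)
                               : (s[0] > height && s[1] > width)) {
                candidates.add(s);
            }
        }
        if (!candidates.isEmpty()) {
            // Smallest area that still exceeds the request.
            return candidates.stream()
                    .min(Comparator.comparingLong((int[] s) -> (long) s[0] * s[1]))
                    .get();
        }
        return sizes[0]; // nothing large enough: fall back to the first entry
    }
}
```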

ImageList

This class is the cache for video and audio samples. There is not much to explain; just use it as-is.

public class ImageList {
    private static final String TAG = "Abbott ImageList";
    private Object mImageListLock = new Object();
    int kCapacity;
    private List<ImageItem> mImageList = new CopyOnWriteArrayList<>();

    public ImageList(int capacity) {
        kCapacity = capacity;
    }

    public synchronized void addItem(long timestamp, ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        synchronized (mImageListLock) {
            ImageItem item = new ImageItem(timestamp, byteBuffer, bufferInfo);
            mImageList.add(item);
            if (mImageList.size() > kCapacity) {
                int excessItems = mImageList.size() - kCapacity;
                mImageList.subList(0, excessItems).clear();
            }
        }
    }

    public synchronized List<ImageItem> getItemsInTimeRange(long startTimestamp, long endTimestamp) {
        List<ImageItem> itemsInTimeRange = new ArrayList<>();
        synchronized (mImageListLock) {
            for (ImageItem item : mImageList) {
                long itemTimestamp = item.getTimestamp();
                // Keep items whose timestamps fall inside the requested range.
                if (itemTimestamp >= startTimestamp && itemTimestamp <= endTimestamp) {
                    itemsInTimeRange.add(item);
                }
            }
        }
        return itemsInTimeRange;
    }

    public synchronized ImageItem getItem() {
        return mImageList.get(0);
    }

    public synchronized void removeItem() {
        mImageList.remove(0);
    }

    public synchronized int getSize() {
        return mImageList.size();
    }

    public static class ImageItem {
        private long mTimestamp;
        private ByteBuffer mVideoBuffer;
        private MediaCodec.BufferInfo mVideoBufferInfo;

        public ImageItem(long first, ByteBuffer second, MediaCodec.BufferInfo bufferInfo) {
            this.mTimestamp = first;
            this.mVideoBuffer = second;
            this.mVideoBufferInfo = bufferInfo;
        }

        public synchronized long getTimestamp() {
            return mTimestamp;
        }

        public synchronized ByteBuffer getVideoByteBuffer() {
            return mVideoBuffer;
        }

        public synchronized MediaCodec.BufferInfo getVideoBufferInfo() {
            return mVideoBufferInfo;
        }
    }
}
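The two behaviours worth understanding here are the capacity trim in addItem (oldest entries are dropped once the list exceeds kCapacity) and the inclusive timestamp filter in getItemsInTimeRange. A stripped-down, pure-Java model (plain longs standing in for the buffers; the name `TimedCache` is mine) makes both easy to verify:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Minimal model of ImageList: a bounded FIFO of timestamped items plus an
// inclusive [start, end] timestamp filter like getItemsInTimeRange.
public class TimedCache {
    private final int capacity;
    private final Deque<Long> items = new ArrayDeque<>();

    public TimedCache(int capacity) { this.capacity = capacity; }

    public void add(long timestampUs) {
        items.addLast(timestampUs);
        while (items.size() > capacity) {
            items.removeFirst(); // drop the oldest, as ImageList.addItem does
        }
    }

    public List<Long> inRange(long startUs, long endUs) {
        List<Long> out = new ArrayList<>();
        for (long t : items) {
            if (t >= startUs && t <= endUs) out.add(t);
        }
        return out;
    }
}
```

This bounded cache is exactly what makes "record the N seconds before the trigger" possible: the encoder output is always being cached, so the pre-trigger samples are still available when the alarm fires.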

GlProgram

A class that creates the OpenGL program. OpenGL ES 3.0 is used.

public class GlProgram {
    public static final String mVertexShader =
            "#version 300 es \n" +
            "in vec4 vPosition;" +
            "in vec2 vCoordinate;" +
            "out vec2 vTextureCoordinate;" +
            "void main() {" +
            "   gl_Position = vPosition;" +
            "   vTextureCoordinate = vCoordinate;" +
            "}";

    // Note: gl_FragColor is a reserved built-in name in GLSL ES 3.00, so the
    // fragment output is declared with its own name here.
    public static final String mFragmentShader =
            "#version 300 es \n" +
            "#extension GL_OES_EGL_image_external : require \n" +
            "#extension GL_OES_EGL_image_external_essl3 : require \n" +
            "precision mediump float;" +
            "in vec2 vTextureCoordinate;" +
            "uniform samplerExternalOES oesTextureSampler;" +
            "out vec4 fragColor;" +
            "void main() {" +
            "    fragColor = texture(oesTextureSampler, vTextureCoordinate);" +
            "}";

    public static int createProgram(String vertexShaderSource, String fragShaderSource) {
        int program = GLES30.glCreateProgram();
        if (0 == program) {
            Log.e("Arc_ShaderManager", "create program error ,error=" + GLES30.glGetError());
            return 0;
        }
        int vertexShader = loadShader(GLES30.GL_VERTEX_SHADER, vertexShaderSource);
        if (0 == vertexShader) {
            return 0;
        }
        int fragShader = loadShader(GLES30.GL_FRAGMENT_SHADER, fragShaderSource);
        if (0 == fragShader) {
            return 0;
        }
        GLES30.glAttachShader(program, vertexShader);
        GLES30.glAttachShader(program, fragShader);
        GLES30.glLinkProgram(program);
        int[] status = new int[1];
        GLES30.glGetProgramiv(program, GLES30.GL_LINK_STATUS, status, 0);
        if (GLES30.GL_FALSE == status[0]) {
            String errorMsg = GLES30.glGetProgramInfoLog(program);
            Log.e("Arc_ShaderManager", "createProgram error : " + errorMsg);
            GLES30.glDeleteShader(vertexShader);
            GLES30.glDeleteShader(fragShader);
            GLES30.glDeleteProgram(program);
            return 0;
        }
        GLES30.glDetachShader(program, vertexShader);
        GLES30.glDetachShader(program, fragShader);
        GLES30.glDeleteShader(vertexShader);
        GLES30.glDeleteShader(fragShader);
        return program;
    }

    private static int loadShader(int type, String shaderSource) {
        int shader = GLES30.glCreateShader(type);
        if (0 == shader) {
            Log.e("Arc_ShaderManager", "create shader error, shader type=" + type + " , error=" + GLES30.glGetError());
            return 0;
        }
        GLES30.glShaderSource(shader, shaderSource);
        GLES30.glCompileShader(shader);
        int[] status = new int[1];
        GLES30.glGetShaderiv(shader, GLES30.GL_COMPILE_STATUS, status, 0);
        if (0 == status[0]) {
            String errorMsg = GLES30.glGetShaderInfoLog(shader);
            Log.e("Arc_ShaderManager", "createShader shader = " + type + "  error: " + errorMsg);
            GLES30.glDeleteShader(shader);
            return 0;
        }
        return shader;
    }
}

OesTexture

This class connects to the OpenGL program introduced above: it draws the external (OES) texture using the coordinates fed to the vertex and fragment shaders.

public class OesTexture {
    private static final String TAG = "Abbott OesTexture";
    private int mProgram;
    private final FloatBuffer mCordsBuffer;
    private final FloatBuffer mPositionBuffer;
    private int mPositionHandle;
    private int mCordsHandle;
    private int mOESTextureHandle;

    public OesTexture() {
        float[] positions = {
                -1.0f,  1.0f,
                -1.0f, -1.0f,
                 1.0f,  1.0f,
                 1.0f, -1.0f
        };
        float[] texCords = {
                0.0f, 0.0f,
                0.0f, 1.0f,
                1.0f, 0.0f,
                1.0f, 1.0f,
        };
        mPositionBuffer = ByteBuffer.allocateDirect(positions.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
        mPositionBuffer.put(positions).position(0);
        mCordsBuffer = ByteBuffer.allocateDirect(texCords.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
        mCordsBuffer.put(texCords).position(0);
    }

    public void init() {
        this.mProgram = GlProgram.createProgram(GlProgram.mVertexShader, GlProgram.mFragmentShader);
        if (0 == this.mProgram) {
            Log.e(TAG, "createProgram failed");
        }
        mPositionHandle = GLES30.glGetAttribLocation(mProgram, "vPosition");
        mCordsHandle = GLES30.glGetAttribLocation(mProgram, "vCoordinate");
        mOESTextureHandle = GLES30.glGetUniformLocation(mProgram, "oesTextureSampler");
        GLES30.glDisable(GLES30.GL_DEPTH_TEST);
    }

    public void PrepareTexture(int OESTextureId) {
        GLES30.glUseProgram(this.mProgram);
        GLES30.glEnableVertexAttribArray(mPositionHandle);
        GLES30.glVertexAttribPointer(mPositionHandle, 2, GLES30.GL_FLOAT, false, 2 * 4, mPositionBuffer);
        GLES30.glEnableVertexAttribArray(mCordsHandle);
        GLES30.glVertexAttribPointer(mCordsHandle, 2, GLES30.GL_FLOAT, false, 2 * 4, mCordsBuffer);
        GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, OESTextureId);
        GLES30.glUniform1i(mOESTextureHandle, 0);
        GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 4);
        GLES30.glDisableVertexAttribArray(mPositionHandle);
        GLES30.glDisableVertexAttribArray(mCordsHandle);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
    }
}

The next three classes, VideoRecorder, AudioEncoder and EncodingRunnable, work together and need to be read as a unit.

public class AudioEncoder extends Thread {
    private static final String TAG = "Abbott AudioEncoder";
    private static final int SAVEMP4_INTERNAL = Param.recordInternal * 1000 * 1000;
    private static final int SAMPLE_RATE = 44100;
    private static final int CHANNEL_COUNT = 1;
    private static final int BIT_RATE = 96000;
    private EncodingRunnable mEncodingRunnable;
    private MediaCodec mMediaCodec;
    private AudioRecord mAudioRecord;
    private MediaFormat mFormat;
    private MediaFormat mOutputFormat;
    private long nanoTime;
    int mBufferSizeInBytes = 0;
    boolean mExitThread = true;
    private ImageList mAudioList;
    private MediaCodec.BufferInfo mAudioBufferInfo;
    private boolean mAlarm = false;
    private long mAlarmTime;
    private long mAlarmStartTime;
    private long mAlarmEndTime;
    private List<ImageList.ImageItem> mMuxerImageItem;
    private Object mLock = new Object();
    private MediaCodec.BufferInfo mAlarmBufferInfo;

    public AudioEncoder(EncodingRunnable encodingRunnable) throws IOException {
        mEncodingRunnable = encodingRunnable;
        nanoTime = System.nanoTime();
        createAudio();
        createMediaCodec();
        int kCapacity = 1000 / 20 * Param.recordInternal;
        mAudioList = new ImageList(kCapacity);
    }

    public void createAudio() {
        mBufferSizeInBytes = AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        mAudioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, mBufferSizeInBytes);
    }

    public void createMediaCodec() throws IOException {
        mFormat = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, SAMPLE_RATE, CHANNEL_COUNT);
        mFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
        mFormat.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
        mFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 8192);
        mMediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);
        mMediaCodec.configure(mFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    }

    public synchronized void setAlarm() {
        synchronized (mLock) {
            Log.d(TAG, "setAudio Alarm enter");
            mEncodingRunnable.setAudioFormat(mOutputFormat);
            mEncodingRunnable.setAudioAlarmTrue();
            mAlarmTime = mAlarmBufferInfo.presentationTimeUs;
            mAlarmEndTime = mAlarmTime + SAVEMP4_INTERNAL;
            if (!mAlarm) {
                mAlarmStartTime = mAlarmTime - SAVEMP4_INTERNAL;
            }
            mAlarm = true;
            Log.d(TAG, "setAudio Alarm exit");
        }
    }

    @Override
    public void run() {
        super.run();
        mMediaCodec.start();
        mAudioRecord.startRecording();
        while (mExitThread) {
            synchronized (mLock) {
                byte[] inputAudioData = new byte[mBufferSizeInBytes];
                int res = mAudioRecord.read(inputAudioData, 0, inputAudioData.length);
                if (res > 0) {
                    if (mAudioRecord != null) {
                        enCodeAudio(inputAudioData);
                    }
                }
            }
        }
        Log.d(TAG, "AudioRecord run: exit");
    }

    private void enCodeAudio(byte[] inputAudioData) {
        mAudioBufferInfo = new MediaCodec.BufferInfo();
        int index = mMediaCodec.dequeueInputBuffer(-1);
        if (index < 0) {
            return;
        }
        ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
        ByteBuffer audioInputBuffer = inputBuffers[index];
        audioInputBuffer.clear();
        audioInputBuffer.put(inputAudioData);
        audioInputBuffer.limit(inputAudioData.length);
        mMediaCodec.queueInputBuffer(index, 0, inputAudioData.length, (System.nanoTime() - nanoTime) / 1000, 0);
        int status = mMediaCodec.dequeueOutputBuffer(mAudioBufferInfo, 0);
        ByteBuffer outputBuffer;
        if (status == MediaCodec.INFO_TRY_AGAIN_LATER) {
            // no output available yet
        } else if (status == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mOutputFormat = mMediaCodec.getOutputFormat();
        } else {
            while (status >= 0) {
                MediaCodec.BufferInfo tmpaudioBufferInfo = new MediaCodec.BufferInfo();
                tmpaudioBufferInfo.set(mAudioBufferInfo.offset, mAudioBufferInfo.size, mAudioBufferInfo.presentationTimeUs, mAudioBufferInfo.flags);
                mAlarmBufferInfo = new MediaCodec.BufferInfo();
                mAlarmBufferInfo.set(mAudioBufferInfo.offset, mAudioBufferInfo.size, mAudioBufferInfo.presentationTimeUs, mAudioBufferInfo.flags);
                outputBuffer = mMediaCodec.getOutputBuffer(status);
                ByteBuffer buffer = ByteBuffer.allocate(tmpaudioBufferInfo.size);
                buffer.limit(tmpaudioBufferInfo.size);
                buffer.put(outputBuffer);
                buffer.flip();
                if (tmpaudioBufferInfo.size > 0) {
                    if (mAlarm) {
                        mMuxerImageItem = mAudioList.getItemsInTimeRange(mAlarmStartTime, mAlarmEndTime);
                        for (ImageList.ImageItem item : mMuxerImageItem) {
                            mEncodingRunnable.pushAudio(item);
                        }
                        mAlarmStartTime = tmpaudioBufferInfo.presentationTimeUs;
                        mAudioList.addItem(tmpaudioBufferInfo.presentationTimeUs, buffer, tmpaudioBufferInfo);
                        if (tmpaudioBufferInfo.presentationTimeUs - mAlarmTime > SAVEMP4_INTERNAL) {
                            mAlarm = false;
                            mEncodingRunnable.setAudioAlarmFalse();
                            Log.d(TAG, "mEncodingRunnable.setAudio itemAlarmFalse();");
                        }
                    } else {
                        mAudioList.addItem(tmpaudioBufferInfo.presentationTimeUs, buffer, tmpaudioBufferInfo);
                    }
                }
                mMediaCodec.releaseOutputBuffer(status, false);
                status = mMediaCodec.dequeueOutputBuffer(mAudioBufferInfo, 0);
            }
        }
    }

    public synchronized void stopAudioRecord() throws IllegalStateException {
        synchronized (mLock) {
            mExitThread = false;
        }
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        mMediaCodec.stop();
        mMediaCodec.release();
        mMediaCodec = null;
    }
}
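A word on the magic numbers behind the cache capacities. My reading of the code (so treat the per-item durations as assumptions): the audio list is sized as 1000/20 items per buffered second, i.e. roughly one ~20 ms chunk per item, while the video list later uses 1000/40, i.e. one 40 ms frame per item at 25 fps. Isolated (the class name `CapacityMath` is mine):

```java
// Capacity math behind the ImageList caches, with recordSeconds playing the
// role of Param.recordInternal (the N-second pre/post window).
public class CapacityMath {
    public static int audioCapacity(int recordSeconds) {
        // ~20 ms of audio per cached item -> 50 items per second.
        return 1000 / 20 * recordSeconds;
    }

    public static int videoCapacity(int recordSeconds) {
        // 25 fps -> one 40 ms frame per cached item -> 25 items per second.
        return 1000 / 40 * recordSeconds;
    }
}
```

If you change the frame rate or the audio read size, these capacities need to change with them, or the cache will hold more or less than N seconds.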
public class VideoRecorder extends Thread {
    private static final String TAG = "Abbott VideoRecorder";
    private static final int SAVE_MP4_Internal = 1000 * 1000 * Param.recordInternal;
    // EGL
    private static final int EGL_RECORDABLE_ANDROID = 0x3142;
    private EGLContext mEGLContext = EGL14.EGL_NO_CONTEXT;
    private EGLDisplay mEGLDisplay = EGL14.EGL_NO_DISPLAY;
    private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE;
    private EGLContext mSharedContext = EGL14.EGL_NO_CONTEXT;
    private Surface mSurface;
    private int mOESTextureId;
    private OesTexture mOesTexture;
    private ImageList mImageList;
    private List<ImageList.ImageItem> muxerImageItem;
    // Thread
    private boolean mExitThread;
    private Object mLock = new Object();
    private Object object = new Object();
    private MediaCodec mMediaCodec;
    private MediaFormat mOutputFormat;
    private boolean mAlarm = false;
    private long mAlarmTime;
    private long mAlarmStartTime;
    private long mAlarmEndTime;
    private MediaCodec.BufferInfo mBufferInfo;
    private EncodingRunnable mEncodingRunnable;
    private String mSeiMessage;

    public VideoRecorder(EGLContext eglContext, EncodingRunnable encodingRunnable) {
        mSharedContext = eglContext;
        mEncodingRunnable = encodingRunnable;
        int kCapacity = 1000 / 40 * Param.recordInternal;
        mImageList = new ImageList(kCapacity);
        try {
            MediaFormat mediaFormat = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 1920 * 1080 * 25 / 5);
            mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
            mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            mMediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            mSurface = mMediaCodec.createInputSurface();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void run() {
        super.run();
        try {
            initEgl();
            mOesTexture = new OesTexture();
            mOesTexture.init();
            synchronized (mLock) {
                mLock.wait(33);
            }
            guardedRun();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void guardedRun() throws InterruptedException, RuntimeException {
        mExitThread = false;
        while (true) {
            synchronized (mLock) {
                if (mExitThread) {
                    break;
                }
                mLock.wait(33);
            }
            mOesTexture.PrepareTexture(mOESTextureId);
            swapBuffers();
            enCodeVideo();
        }
        Log.d(TAG, "guardedRun: exit");
        unInitEgl();
    }

    private void enCodeVideo() {
        mBufferInfo = new MediaCodec.BufferInfo();
        int status = mMediaCodec.dequeueOutputBuffer(mBufferInfo, 0);
        ByteBuffer outputBuffer;
        if (status == MediaCodec.INFO_TRY_AGAIN_LATER) {
            // no output available yet
        } else if (status == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mOutputFormat = mMediaCodec.getOutputFormat();
        } else if (status == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            // ignored
        } else if (status >= 0) {
            outputBuffer = mMediaCodec.getOutputBuffer(status);
            if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                mBufferInfo.size = 0;
            }
            if (mBufferInfo.size > 0) {
                outputBuffer.position(mBufferInfo.offset);
                // Note: the original set limit(size - offset); offset + size is the correct end.
                outputBuffer.limit(mBufferInfo.offset + mBufferInfo.size);
                mSeiMessage = "avcIndex" + String.format("%05d", 0);
                mEncodingRunnable.setTimeUs(mBufferInfo.presentationTimeUs);
                // Prepend the SEI NAL unit to the encoded frame. The copy must
                // happen before releaseOutputBuffer(), since the codec buffer
                // is no longer valid afterwards (the original released first).
                ByteBuffer seiData = buildSEIData(mSeiMessage);
                ByteBuffer frameWithSEI = ByteBuffer.allocate(outputBuffer.remaining() + seiData.remaining());
                frameWithSEI.put(seiData);
                frameWithSEI.put(outputBuffer);
                frameWithSEI.flip();
                mBufferInfo.size = frameWithSEI.remaining();
                MediaCodec.BufferInfo tmpVideoBufferInfo = new MediaCodec.BufferInfo();
                tmpVideoBufferInfo.set(mBufferInfo.offset, mBufferInfo.size, mBufferInfo.presentationTimeUs, mBufferInfo.flags);
                if (mAlarm) {
                    muxerImageItem = mImageList.getItemsInTimeRange(mAlarmStartTime, mAlarmEndTime);
                    mAlarmStartTime = tmpVideoBufferInfo.presentationTimeUs;
                    for (ImageList.ImageItem item : muxerImageItem) {
                        mEncodingRunnable.push(item);
                    }
                    mImageList.addItem(tmpVideoBufferInfo.presentationTimeUs, frameWithSEI, tmpVideoBufferInfo);
                    if (mBufferInfo.presentationTimeUs - mAlarmTime > SAVE_MP4_Internal) {
                        Log.d(TAG, "mEncodingRunnable.set itemAlarmFalse()");
                        Log.d(TAG, tmpVideoBufferInfo.presentationTimeUs + " " + mAlarmTime);
                        mAlarm = false;
                        mEncodingRunnable.setVideoAlarmFalse();
                    }
                } else {
                    mImageList.addItem(tmpVideoBufferInfo.presentationTimeUs, frameWithSEI, tmpVideoBufferInfo);
                }
            }
            mMediaCodec.releaseOutputBuffer(status, false);
        }
    }

    public synchronized void setAlarm() {
        synchronized (mLock) {
            Log.d(TAG, "setAlarm enter");
            mEncodingRunnable.setMediaFormat(mOutputFormat);
            mEncodingRunnable.setVideoAlarmTrue();
            if (mBufferInfo.presentationTimeUs != 0) {
                mAlarmTime = mBufferInfo.presentationTimeUs;
            }
            mAlarmEndTime = mAlarmTime + SAVE_MP4_Internal;
            if (!mAlarm) {
                mAlarmStartTime = mAlarmTime - SAVE_MP4_Internal;
            }
            mAlarm = true;
            Log.d(TAG, "setAlarm exit");
        }
    }

    public synchronized void startRecord() throws IllegalStateException {
        super.start();
        mMediaCodec.start();
    }

    public synchronized void stopVideoRecord() throws IllegalStateException {
        synchronized (mLock) {
            mExitThread = true;
            mLock.notify();
        }
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        mMediaCodec.signalEndOfInputStream();
        mMediaCodec.stop();
        mMediaCodec.release();
        mMediaCodec = null;
    }

    public void requestRender(int i) {
        synchronized (object) {
            mOESTextureId = i;
        }
    }

    private void initEgl() {
        this.mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        if (this.mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
            throw new RuntimeException("EGL14.eglGetDisplay fail...");
        }
        int[] major_version = new int[2];
        boolean eglInited = EGL14.eglInitialize(this.mEGLDisplay, major_version, 0, major_version, 1);
        if (!eglInited) {
            this.mEGLDisplay = null;
            throw new RuntimeException("EGL14.eglInitialize fail...");
        }
        // Configure the display attributes.
        int[] attrib_list = new int[]{
                EGL14.EGL_SURFACE_TYPE, EGL14.EGL_WINDOW_BIT,
                EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                EGL14.EGL_RED_SIZE, 8,
                EGL14.EGL_GREEN_SIZE, 8,
                EGL14.EGL_BLUE_SIZE, 8,
                EGL14.EGL_ALPHA_SIZE, 8,
                EGL14.EGL_DEPTH_SIZE, 16,
                EGL_RECORDABLE_ANDROID, 1,
                EGL14.EGL_NONE
        };
        EGLConfig[] configs = new EGLConfig[1];
        int[] numConfigs = new int[1];
        boolean eglChose = EGL14.eglChooseConfig(this.mEGLDisplay, attrib_list, 0, configs, 0, configs.length, numConfigs, 0);
        if (!eglChose) {
            throw new RuntimeException("eglChooseConfig [RGBA888 + recordable] ES2 EGL_config_fail...");
        }
        int[] attr_list = {EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE};
        this.mEGLContext = EGL14.eglCreateContext(this.mEGLDisplay, configs[0], this.mSharedContext, attr_list, 0);
        checkEglError("eglCreateContext");
        if (this.mEGLContext == EGL14.EGL_NO_CONTEXT) {
            throw new RuntimeException("eglCreateContext == EGL_NO_CONTEXT");
        }
        int[] surface_attr = {EGL14.EGL_NONE};
        this.mEGLSurface = EGL14.eglCreateWindowSurface(this.mEGLDisplay, configs[0], this.mSurface, surface_attr, 0);
        if (this.mEGLSurface == EGL14.EGL_NO_SURFACE) {
            throw new RuntimeException("eglCreateWindowSurface == EGL_NO_SURFACE");
        }
        Log.d(TAG, "initEgl , display=" + this.mEGLDisplay + " ,context=" + this.mEGLContext + " ,sharedContext= " +
                this.mSharedContext + ", surface=" + this.mEGLSurface);
        boolean success = EGL14.eglMakeCurrent(this.mEGLDisplay, this.mEGLSurface, this.mEGLSurface, this.mEGLContext);
        if (!success) {
            checkEglError("makeCurrent");
            throw new RuntimeException("eglMakeCurrent failed");
        }
    }

    private void unInitEgl() {
        boolean success = EGL14.eglMakeCurrent(mEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_CONTEXT);
        if (!success) {
            checkEglError("makeCurrent");
            throw new RuntimeException("eglMakeCurrent failed");
        }
        if (this.mEGLDisplay != EGL14.EGL_NO_DISPLAY) {
            EGL14.eglDestroySurface(this.mEGLDisplay, this.mEGLSurface);
            EGL14.eglDestroyContext(this.mEGLDisplay, this.mEGLContext);
            EGL14.eglTerminate(this.mEGLDisplay);
        }
        this.mEGLDisplay = EGL14.EGL_NO_DISPLAY;
        this.mEGLContext = EGL14.EGL_NO_CONTEXT;
        this.mEGLSurface = EGL14.EGL_NO_SURFACE;
        this.mSharedContext = EGL14.EGL_NO_CONTEXT;
        this.mSurface = null;
    }

    private boolean swapBuffers() {
        if ((null == this.mEGLDisplay) || (null == this.mEGLSurface)) {
            return false;
        }
        boolean success = EGL14.eglSwapBuffers(this.mEGLDisplay, this.mEGLSurface);
        if (!success) {
            checkEglError("eglSwapBuffers");
        }
        return success;
    }

    private void checkEglError(String msg) {
        int error = EGL14.eglGetError();
        if (error != EGL14.EGL_SUCCESS) {
            throw new RuntimeException(msg + ": EGL_ERROR_CODE: 0x" + Integer.toHexString(error));
        }
    }

    private ByteBuffer buildSEIData(String message) {
        // Build the SEI data: start code + NAL type 6 (SEI) + payload type 5 (user data unregistered).
        int seiSize = 128;
        ByteBuffer seiBuffer = ByteBuffer.allocate(seiSize);
        seiBuffer.put(new byte[]{0, 0, 0, 1, 6, 5});
        // SEI payload size, then the message bytes as the SEI user data.
        String seiMessage = "h264testdata" + message;
        seiBuffer.put((byte) seiMessage.length());
        seiBuffer.put(seiMessage.getBytes());
        seiBuffer.flip();
        return seiBuffer;
    }
}
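For completeness, here is the decode-side counterpart to buildSEIData as a plain-Java sketch (the class name `SeiReader` is mine): scan the frame bytes for the `00 00 00 01 06 05` prefix, read the single size byte, and return the payload text. This matches only the simplified layout used in this post, not arbitrary spec-compliant SEI NAL units (no UUID, no emulation-prevention handling).

```java
import java.nio.charset.StandardCharsets;

// Extracts the custom SEI message written by buildSEIData from a frame's bytes.
public class SeiReader {
    public static String extract(byte[] frame) {
        for (int i = 0; i + 6 < frame.length; i++) {
            if (frame[i] == 0 && frame[i + 1] == 0 && frame[i + 2] == 0
                    && frame[i + 3] == 1 && frame[i + 4] == 6 && frame[i + 5] == 5) {
                int size = frame[i + 6] & 0xFF;       // single-byte payload size
                if (i + 7 + size <= frame.length) {
                    return new String(frame, i + 7, size, StandardCharsets.UTF_8);
                }
            }
        }
        return null; // no SEI of this shape found
    }
}
```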
public class EncodingRunnable extends Thread {
    private static final String TAG = "Abbott EncodingRunnable";
    private Object mRecordLock = new Object();
    private boolean mExitThread = false;
    private MediaMuxer mMediaMuxer;
    private int avcIndex;
    private int mAudioIndex;
    private MediaFormat mOutputFormat;
    private MediaFormat mAudioOutputFormat;
    private ImageList mImageList;
    private ImageList mAudioImageList;
    private boolean itemAlarm;
    private long mAudioImageListTimeUs = -1;
    private boolean mAudioAlarm;
    private int mVideoCapcity = 1000 / 40 * Param.recordInternal;
    private int mAudioCapcity = 1000 / 20 * Param.recordInternal;
    private int recordSecond = 1000 * 1000 * 60;
    long Video60sStart = -1;
    private boolean mIsRecoding = false;

    public EncodingRunnable() {
        mImageList = new ImageList(mVideoCapcity);
        mAudioImageList = new ImageList(mAudioCapcity);
    }

    public void setMediaFormat(MediaFormat OutputFormat) {
        if (mOutputFormat == null) {
            mOutputFormat = OutputFormat;
        }
    }

    public void setAudioFormat(MediaFormat OutputFormat) {
        if (mAudioOutputFormat == null) {
            mAudioOutputFormat = OutputFormat;
        }
    }

    public void setMediaMuxerConfig() {
        long currentTimeMillis = System.currentTimeMillis();
        Date currentDate = new Date(currentTimeMillis);
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault());
        String fileName = dateFormat.format(currentDate);
        File mFile = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM), fileName + ".MP4");
        Log.d(TAG, "setMediaMuxerSavaPath: new MediaMuxer  " + mFile.getPath());
        try {
            mMediaMuxer = new MediaMuxer(mFile.getPath(), MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        } catch (IOException e) {
            e.printStackTrace();
        }
        avcIndex = mMediaMuxer.addTrack(mOutputFormat);
        mAudioIndex = mMediaMuxer.addTrack(mAudioOutputFormat);
        mMediaMuxer.start();
    }

    public void setMediaMuxerSavaPath() {
        if (!mIsRecoding) {
            mExitThread = false;
            setMediaMuxerConfig();
            setRecording();
            notifyStartRecord();
        }
    }

    @Override
    public void run() {
        super.run();
        while (true) {
            synchronized (mRecordLock) {
                try {
                    mRecordLock.wait();
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
            MediaCodec.BufferInfo tmpAudioBufferInfo = new MediaCodec.BufferInfo();
            while (mIsRecoding) {
                if (mAudioImageList.getSize() > 0) {
                    ImageList.ImageItem audioItem = mAudioImageList.getItem();
                    tmpAudioBufferInfo.set(audioItem.getVideoBufferInfo().offset,
                            audioItem.getVideoBufferInfo().size,
                            audioItem.getVideoBufferInfo().presentationTimeUs + mAudioImageListTimeUs,
                            audioItem.getVideoBufferInfo().flags);
                    mMediaMuxer.writeSampleData(mAudioIndex, audioItem.getVideoByteBuffer(), tmpAudioBufferInfo);
                    mAudioImageList.removeItem();
                }
                if (mImageList.getSize() > 0) {
                    ImageList.ImageItem item = mImageList.getItem();
                    if (Video60sStart < 0) {
                        Video60sStart = item.getVideoBufferInfo().presentationTimeUs;
                    }
                    mMediaMuxer.writeSampleData(avcIndex, item.getVideoByteBuffer(), item.getVideoBufferInfo());
                    if (item.getVideoBufferInfo().presentationTimeUs - Video60sStart > recordSecond) {
                        Log.d(TAG, "presentationTimeUs - Video60sStart :" + (item.getVideoBufferInfo().presentationTimeUs - Video60sStart));
                        mMediaMuxer.stop();
                        mMediaMuxer.release();
                        mMediaMuxer = null;
                        setMediaMuxerConfig();
                        Video60sStart = -1;
                    }
                    mImageList.removeItem();
                }
                if (itemAlarm == false && mAudioAlarm == false) {
                    mIsRecoding = false;
                    Log.d(TAG, "mediaMuxer.stop()");
                    mMediaMuxer.stop();
                    mMediaMuxer.release();
                    mMediaMuxer = null;
                    break;
                }
            }
            if (mExitThread) {
                break;
            }
        }
    }

    public synchronized void setRecording() throws IllegalStateException {
        synchronized (mRecordLock) {
            mIsRecoding = true;
        }
    }

    public synchronized void setAudioAlarmTrue() throws IllegalStateException {
        synchronized (mRecordLock) {
            mAudioAlarm = true;
        }
    }

    public synchronized void setVideoAlarmTrue() throws IllegalStateException {
        synchronized (mRecordLock) {
            itemAlarm = true;
        }
    }

    public synchronized void setAudioAlarmFalse() throws IllegalStateException {
        synchronized (mRecordLock) {
            mAudioAlarm = false;
        }
    }

    public synchronized void setVideoAlarmFalse() throws IllegalStateException {
        synchronized (mRecordLock) {
            itemAlarm = false;
        }
    }

    public synchronized void notifyStartRecord() throws IllegalStateException {
        synchronized (mRecordLock) {
            mRecordLock.notify();
        }
    }

    public synchronized void push(ImageList.ImageItem item) {
        mImageList.addItem(item.getTimestamp(), item.getVideoByteBuffer(), item.getVideoBufferInfo());
    }

    public synchronized void pushAudio(ImageList.ImageItem item) {
        synchronized (mRecordLock) {
            mAudioImageList.addItem(item.getTimestamp(), item.getVideoByteBuffer(), item.getVideoBufferInfo());
        }
    }

    public synchronized void setTimeUs(long l) {
        if (mAudioImageListTimeUs != -1) {
            return;
        }
        mAudioImageListTimeUs = l;
        Log.d(TAG, "setTimeUs: " + l);
    }

    public synchronized void setExitThread() {
        mExitThread = true;
        mIsRecoding = false;
        notifyStartRecord();
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
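The 60-second segmentation in EncodingRunnable boils down to a simple rule: remember the first presentationTimeUs of the current file, and once a sample arrives more than 60 s past it, stop the muxer and open a new file. A plain-Java sketch of just that decision (the class name `SegmentRoller` is mine; closed files are modelled as per-segment sample counts):

```java
import java.util.ArrayList;
import java.util.List;

// Models EncodingRunnable's rollover: a segment closes when a sample's
// timestamp exceeds the segment start by more than segmentUs.
public class SegmentRoller {
    private final long segmentUs;
    private long segmentStart = -1;
    private int samplesInCurrent = 0;
    private final List<Integer> samplesPerSegment = new ArrayList<>();

    public SegmentRoller(long segmentUs) { this.segmentUs = segmentUs; }

    public void writeSample(long presentationTimeUs) {
        if (segmentStart < 0) segmentStart = presentationTimeUs;
        samplesInCurrent++;
        if (presentationTimeUs - segmentStart > segmentUs) {
            samplesPerSegment.add(samplesInCurrent); // "stop muxer, open a new file"
            samplesInCurrent = 0;
            segmentStart = -1; // next sample starts the next segment
        }
    }

    public List<Integer> closedSegments() { return samplesPerSegment; }
}
```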

Finally, Camera2Renderer and MainActivity.

Camera2Renderer

Camera2Renderer implements GLSurfaceView.Renderer; this is the class that drives all the other code.

public class Camera2Renderer implements GLSurfaceView.Renderer {
    private static final String TAG = "Abbott Camera2Renderer";
    final private Context mContext;
    final private GLSurfaceView mGlSurfaceView;
    private Camera2 mCamera;
    private int[] mTexture = new int[1];
    private SurfaceTexture mSurfaceTexture;
    private Surface mSurface;
    private OesTexture mOesTexture;
    private EGLContext mEglContext = null;
    private VideoRecorder mVideoRecorder;
    private EncodingRunnable mEncodingRunnable;
    private AudioEncoder mAudioEncoder;

    public Camera2Renderer(Context context, GLSurfaceView glSurfaceView, EncodingRunnable encodingRunnable) {
        mContext = context;
        mGlSurfaceView = glSurfaceView;
        mEncodingRunnable = encodingRunnable;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        mCamera = new Camera2(mContext);
        mCamera.openCamera(1920, 1080, "0");
        mOesTexture = new OesTexture();
        mOesTexture.init();
        mEglContext = EGL14.eglGetCurrentContext();
        mVideoRecorder = new VideoRecorder(mEglContext, mEncodingRunnable);
        mVideoRecorder.startRecord();
        try {
            mAudioEncoder = new AudioEncoder(mEncodingRunnable);
            mAudioEncoder.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES30.glGenTextures(1, mTexture, 0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTexture[0]);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
        mSurfaceTexture = new SurfaceTexture(mTexture[0]);
        mSurfaceTexture.setDefaultBufferSize(1920, 1080);
        mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                mGlSurfaceView.requestRender();
            }
        });
        mSurface = new Surface(mSurfaceTexture);
        mCamera.startPreview(mSurface);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        mSurfaceTexture.updateTexImage();
        mOesTexture.PrepareTexture(mTexture[0]);
        mVideoRecorder.requestRender(mTexture[0]);
    }

    public VideoRecorder getVideoRecorder() {
        return mVideoRecorder;
    }

    public AudioEncoder getAudioEncoder() {
        return mAudioEncoder;
    }
}
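The first requirement is writing SEI data into every encoded frame. As a rough illustration only (the class name, UUID value, and helper are hypothetical, not code from this project), an H.264 SEI NAL unit of payload type 5 (user_data_unregistered) can be assembled like this and prepended to each frame's Annex-B stream; emulation-prevention bytes are omitted here for brevity, so real payloads containing 0x000000/0x000001 byte sequences would need them inserted:

```java
import java.io.ByteArrayOutputStream;

public class SeiBuilder {
    // 16-byte UUID identifying the custom payload (hypothetical value)
    private static final byte[] UUID_BYTES = new byte[]{
            0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77, (byte) 0x88,
            (byte) 0x99, (byte) 0xaa, (byte) 0xbb, (byte) 0xcc,
            (byte) 0xdd, (byte) 0xee, (byte) 0xff, 0x00};

    /** Builds an Annex-B SEI NAL unit (nal_unit_type 6, payloadType 5 user_data_unregistered). */
    public static byte[] build(byte[] userData) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // Annex-B start code
        out.write(0); out.write(0); out.write(0); out.write(1);
        // NAL header: forbidden_zero_bit=0, nal_ref_idc=0, nal_unit_type=6 (SEI)
        out.write(0x06);
        // payloadType: 5 = user_data_unregistered
        out.write(0x05);
        // payloadSize = UUID (16 bytes) + user data, coded in 0xFF chunks
        int size = UUID_BYTES.length + userData.length;
        while (size >= 255) { out.write(0xFF); size -= 255; }
        out.write(size);
        out.write(UUID_BYTES, 0, UUID_BYTES.length);
        out.write(userData, 0, userData.length);
        // rbsp_trailing_bits: stop bit followed by zero padding
        out.write(0x80);
        return out.toByteArray();
    }
}
```

On the decode side, scanning each access unit for NAL type 6 / payload type 5 and matching the UUID recovers the per-frame custom data.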

MainActivity is straightforward: it mostly just requests runtime permissions and wires up the renderer.

public class MainActivity extends AppCompatActivity {
    private static final String TAG = "Abbott MainActivity";
    private static final String FRAGMENT_DIALOG = "dialog";
    private final Object mLock = new Object();
    private GLSurfaceView mGlSurfaceView;
    private Button mRecordButton;
    private Button mExitButton;
    private Camera2Renderer mCamera2Renderer;
    private VideoRecorder mVideoRecorder;
    private EncodingRunnable mEncodingRunnable;
    private AudioEncoder mAudioEncoder;
    private static final int REQUEST_CAMERA_PERMISSION = 1;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Note: the original checked WRITE_EXTERNAL_STORAGE twice; the first check should be CAMERA.
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
            requestCameraPermission();
            return;
        }
        setContentView(R.layout.activity_main);
        mGlSurfaceView = findViewById(R.id.glView);
        mRecordButton = findViewById(R.id.recordBtn);
        mExitButton = findViewById(R.id.exit);
        mGlSurfaceView.setEGLContextClientVersion(3);
        mEncodingRunnable = new EncodingRunnable();
        mEncodingRunnable.start();
        mCamera2Renderer = new Camera2Renderer(this, mGlSurfaceView, mEncodingRunnable);
        mGlSurfaceView.setRenderer(mCamera2Renderer);
        mGlSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mRecordButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                synchronized (MainActivity.this) {
                    startRecord();
                }
            }
        });
        mExitButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                stopRecord();
                Log.d(TAG, "onClick: exit program");
                finish();
            }
        });
    }

    private void requestCameraPermission() {
        if (shouldShowRequestPermissionRationale(Manifest.permission.CAMERA) ||
                shouldShowRequestPermissionRationale(Manifest.permission.WRITE_EXTERNAL_STORAGE) ||
                shouldShowRequestPermissionRationale(Manifest.permission.RECORD_AUDIO)) {
            new ConfirmationDialog().show(getSupportFragmentManager(), FRAGMENT_DIALOG);
        } else {
            requestPermissions(new String[]{
                    Manifest.permission.CAMERA,
                    Manifest.permission.WRITE_EXTERNAL_STORAGE,
                    Manifest.permission.RECORD_AUDIO}, REQUEST_CAMERA_PERMISSION);
        }
    }

    public static class ConfirmationDialog extends DialogFragment {
        @NonNull
        @Override
        public Dialog onCreateDialog(Bundle savedInstanceState) {
            return new AlertDialog.Builder(getActivity())
                    .setMessage(R.string.request_permission)
                    .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
                        @Override
                        public void onClick(DialogInterface dialog, int which) {
                            // Re-request the permissions once the rationale is acknowledged
                            // (the original left this callback empty).
                            Activity activity = getActivity();
                            if (activity != null) {
                                activity.requestPermissions(new String[]{
                                        Manifest.permission.CAMERA,
                                        Manifest.permission.WRITE_EXTERNAL_STORAGE,
                                        Manifest.permission.RECORD_AUDIO}, REQUEST_CAMERA_PERMISSION);
                            }
                        }
                    })
                    .setNegativeButton(android.R.string.cancel, new DialogInterface.OnClickListener() {
                        @Override
                        public void onClick(DialogInterface dialog, int which) {
                            // The dialog is shown from the activity, so getParentFragment()
                            // would be null here; use getActivity() directly.
                            Activity activity = getActivity();
                            if (activity != null) {
                                activity.finish();
                            }
                        }
                    })
                    .create();
        }
    }

    private void startRecord() {
        synchronized (mLock) {
            try {
                if (mVideoRecorder == null) {
                    mVideoRecorder = mCamera2Renderer.getVideoRecorder();
                }
                if (mAudioEncoder == null) {
                    mAudioEncoder = mCamera2Renderer.getAudioEncoder();
                }
                mVideoRecorder.setAlarm();
                mAudioEncoder.setAlarm();
                mEncodingRunnable.setMediaMuxerSavaPath();
                Log.d(TAG, "Start Record ");
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    private void stopRecord() {
        if (mVideoRecorder == null) {
            mVideoRecorder = mCamera2Renderer.getVideoRecorder();
        }
        if (mAudioEncoder == null) {
            mAudioEncoder = mCamera2Renderer.getAudioEncoder();
        }
        mEncodingRunnable.setExitThread();
        mVideoRecorder.stopVideoRecord();
        mAudioEncoder.stopAudioRecord();
    }
}
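The second requirement (keeping the N seconds before the Record press) implies the video and audio cache lists behave as rolling buffers: frames are appended continuously, and anything older than the window is pruned. A minimal sketch of that pruning idea, with hypothetical class and field names (not the project's actual VideoRecorder/AudioEncoder internals), assuming frames carry microsecond timestamps:

```java
import java.util.ArrayDeque;

public class RollingFrameBuffer {
    static class Frame {
        final long ptsUs;   // presentation timestamp in microseconds
        final byte[] data;  // encoded sample bytes
        Frame(long ptsUs, byte[] data) { this.ptsUs = ptsUs; this.data = data; }
    }

    private final ArrayDeque<Frame> frames = new ArrayDeque<>();
    private final long windowUs;    // keep at most this much history (e.g. N seconds)

    public RollingFrameBuffer(long windowUs) { this.windowUs = windowUs; }

    /** Appends a frame and drops anything older than the window. */
    public synchronized void push(Frame f) {
        frames.addLast(f);
        while (!frames.isEmpty() && f.ptsUs - frames.peekFirst().ptsUs > windowUs) {
            frames.removeFirst();
        }
    }

    public synchronized int size() { return frames.size(); }
}
```

When Record is pressed, the encoding thread can drain this buffer (the pre-roll) into the muxer and then keep appending live frames until the 60-second file is full. For video, the prune point should in practice snap to the previous keyframe so the saved stream starts decodable; that detail is omitted here.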
