The foundation of a live-streaming site's source code is the streaming pipeline itself, and in that pipeline everything starts with video capture, normally through the device camera. To implement capture we need two things: permission to use the camera, and access to the raw frames it records.
First, a quick tour of the audio/video fundamentals you need for live-streaming development (the YUV ↔ RGB item is illustrated with a sketch right after this list):
- FFmpeg: a powerful audio/video processing library (CPU/software encoding, among other things)
- MediaCodec: the codec API built into the Android SDK (hardware encoding)
- OpenGL ES: image processing on the GPU
- H.264 / H.265: video compression standards
- YUV420P, NV21, YUV_420_888, I420: pixel formats you should know
- YUV ↔ RGB conversion
- PCM, AAC: audio formats you should know
- CameraX, MediaRecorder, AudioRecord: capture-related APIs

Audio and video on Android generally take two forms: playing back online or local video (recording and playback), and live streaming (pushing and pulling streams). Below we start from video capture and explore it in detail.
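Since the list mentions YUV ↔ RGB conversion, here is a minimal per-pixel sketch. The BT.601 full-range coefficients below are one widely used convention, an assumption on my part rather than anything the article specifies:

import kotlin.math.roundToInt

// Convert one YUV pixel to packed ARGB. Coefficients are the common
// BT.601 full-range approximation (an assumption; other standards differ).
fun yuvPixelToArgb(y: Int, u: Int, v: Int): Int {
    val c = y.toFloat()
    val d = (u - 128).toFloat()
    val e = (v - 128).toFloat()
    val r = (c + 1.402f * e).roundToInt().coerceIn(0, 255)
    val g = (c - 0.344f * d - 0.714f * e).roundToInt().coerceIn(0, 255)
    val b = (c + 1.772f * d).roundToInt().coerceIn(0, 255)
    return (0xFF shl 24) or (r shl 16) or (g shl 8) or b
}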
Video capture
CameraX is a newer camera library in Jetpack. It is well designed, far less cumbersome than the old Camera1 and Camera2 APIs, and it binds to the lifecycle, which makes lifecycle management easy for developers.
1. Add the dependencies
implementation "androidx.camera:camera-camera2:$camerax_version" implementation "androidx.camera:camera-lifecycle:$camerax_version" implementation "androidx.camera:camera-view:1.0.0-alpha10"
2. Create the preview layout
<androidx.camera.view.PreviewView
    android:id="@+id/viewFinder"
    android:layout_width="match_parent"
    android:layout_height="0dp"
    app:layout_constraintTop_toTopOf="parent"
    app:layout_constraintBottom_toTopOf="@+id/camera_capture_button" />
3. Request the camera permission
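The article leaves this step out, so here is one way to do it, as a minimal sketch using the AndroidX Activity Result API inside a ComponentActivity. The helper name ensureCameraPermission and the denial handling are assumptions, and android.permission.CAMERA must also be declared in the manifest:

// Minimal runtime-permission sketch (assumed; not shown in the article).
private val requestCameraPermission =
    registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
        if (granted) startCamera()
        else Toast.makeText(this, "Camera permission denied", Toast.LENGTH_SHORT).show()
    }

private fun ensureCameraPermission() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
        == PackageManager.PERMISSION_GRANTED
    ) {
        startCamera()
    } else {
        requestCameraPermission.launch(Manifest.permission.CAMERA)
    }
}

4. Open the camera and get the preview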
@SuppressLint ( "RestrictedApi" ) private fun startCamera ( ) { initMediaCodec ( 480 , 640 , 20 ) val cameraProviderFuture = ProcessCameraProvider. getInstance ( this) cameraProviderFuture. addListener ( Runnable { bindImage ( cameraProviderFuture) } , ContextCompat. getMainExecutor ( this) ) } private fun bindImage ( cameraProviderFuture: ListenableFuture< ProcessCameraProvider> ) { val cameraProvider: ProcessCameraProvider = cameraProviderFuture. get ( ) preview = Preview. Builder ( ) . build ( ) val cameraSelector = CameraSelector. Builder ( ) . requireLensFacing ( CameraSelector. LENS_FACING_BACK) . build ( ) val imageAnalysis = ImageAnalysis. Builder ( ) . setTargetResolution ( Size ( 1280 , 720 ) ) . setBackpressureStrategy ( ImageAnalysis. STRATEGY_KEEP_ONLY_LATEST) . build ( ) imageAnalysis. setAnalyzer ( cameraExecutor, ImageAnalysis. Analyzer { image -> Thread { val data = ImageUtils. getDataFromImage2 ( image, ImageUtils. COLOR_FormatI420) val out = FileOutputStream ( File ( Environment. getExternalStorageDirectory ( ) , "hhh.yuv" ) ) out. write ( data) image. close ( ) ; } . start ( ) } ) try { cameraProvider. unbindAll ( ) camera = cameraProvider. bindToLifecycle ( this, cameraSelector, imageAnalysis, preview) preview? . setSurfaceProvider ( viewFinder. createSurfaceProvider ( camera? . cameraInfo) ) } catch ( exc: Exception) { Log. e ( TAG, "Use case binding failed" , exc) } }
5. Get the recorded video data

ImageAnalysis.setAnalyzer gives us the raw frames of the recorded video. The default frame format is YUV_420_888, which neither FFmpeg nor MediaCodec can encode directly; it first has to be converted to I420.
YUV_420_888 is a generalized YCbCr format that can represent any 4:2:0 planar or semi-planar layout, with 8 bits per component. An image in this format is backed by three separate Buffers, one per color plane (Plane), and alongside each Buffer it exposes a rowStride and a pixelStride that describe that plane's layout.
According to the official documentation, the first plane holds all the Y data, the second all the U data, and the third all the V data. So we only need to pull the Y, U, and V components out of the raw buffers and then lay them out in I420 order and proportions to get data we can actually use.
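For concreteness, the 4:2:0 bookkeeping for the 1280x720 analysis resolution used above works out as follows (plain arithmetic, not from the article):

val width = 1280
val height = 720
val ySize = width * height         // 921_600 bytes: one Y byte per pixel
val uSize = ySize / 4              // 230_400 bytes: U is subsampled 2x2
val vSize = ySize / 4              // 230_400 bytes: V is subsampled 2x2
val total = ySize + uSize + vSize  // 1_382_400 = width * height * 3 / 2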
In practice, though, I found quite a lot of padding bytes in the raw buffers; by most accounts this padding exists for memory alignment.
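Stripping that padding is the heart of the conversion: for a plane with pixelStride == 1, copy width useful bytes per row, then skip the rowStride - width padding bytes at the end of every row but the last. A compact sketch of just that idea (the full ImageUtils listing follows below):

import java.nio.ByteBuffer

// Copy one tightly packable plane (pixelStride == 1), dropping row padding.
fun copyPlane(buffer: ByteBuffer, rowStride: Int, width: Int, height: Int): ByteArray {
    val out = ByteArray(width * height)
    var offset = 0
    buffer.position(0)
    for (row in 0 until height) {
        buffer.get(out, offset, width)  // the useful bytes of this row
        offset += width
        if (row < height - 1) {         // the last row may omit the padding
            buffer.position(buffer.position() + rowStride - width)
        }
    }
    return out
}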
import android.graphics.ImageFormat;
import android.graphics.Rect;
import androidx.camera.core.ImageProxy;
import java.nio.ByteBuffer;

public class ImageUtils {

    public static final int COLOR_FormatI420 = 1;
    public static final int COLOR_FormatNV21 = 2;

    public static boolean isImageFormatSupported(ImageProxy image) {
        switch (image.getFormat()) {
            case ImageFormat.YUV_420_888:
            case ImageFormat.NV21:
            case ImageFormat.YV12:
                return true;
        }
        return false;
    }

    // Repacks a YUV_420_888 ImageProxy into a tightly packed I420 or NV21 byte
    // array, stripping the rowStride/pixelStride padding that devices insert
    // for alignment. Handles both planar (pixelStride == 1) and semi-planar
    // (pixelStride == 2) chroma layouts.
    public static byte[] getDataFromImage2(ImageProxy image, int colorFormat) {
        if (colorFormat != COLOR_FormatI420 && colorFormat != COLOR_FormatNV21) {
            throw new IllegalArgumentException("only support COLOR_FormatI420 and COLOR_FormatNV21");
        }
        if (!isImageFormatSupported(image)) {
            throw new RuntimeException("can't convert Image to byte array, format " + image.getFormat());
        }
        Rect crop = image.getCropRect();
        int width = crop.width();
        int height = crop.height();
        ImageProxy.PlaneProxy[] planes = image.getPlanes();
        // I420 layout: Y (width*height), then U and V (width*height/4 each).
        byte[] data = new byte[width * height * 3 / 2];
        byte[] rowData = new byte[planes[0].getRowStride()];
        int channelOffset = 0;
        int outputStride = 1;
        for (int i = 0; i < planes.length; i++) {
            switch (i) {
                case 0: // Y plane: always first, tightly packed in the output.
                    channelOffset = 0;
                    outputStride = 1;
                    break;
                case 1: // U plane: second block for I420, interleaved (VU) for NV21.
                    if (colorFormat == COLOR_FormatI420) {
                        channelOffset = width * height;
                        outputStride = 1;
                    } else {
                        channelOffset = width * height + 1;
                        outputStride = 2;
                    }
                    break;
                case 2: // V plane: last block for I420, interleaved (VU) for NV21.
                    if (colorFormat == COLOR_FormatI420) {
                        channelOffset = width * height + width * height / 4;
                        outputStride = 1;
                    } else {
                        channelOffset = width * height;
                        outputStride = 2;
                    }
                    break;
            }
            ByteBuffer buffer = planes[i].getBuffer();
            int rowStride = planes[i].getRowStride();
            int pixelStride = planes[i].getPixelStride();
            // Chroma planes are subsampled by 2 in both dimensions.
            int shift = (i == 0) ? 0 : 1;
            int w = width >> shift;
            int h = height >> shift;
            buffer.position(rowStride * (crop.top >> shift) + pixelStride * (crop.left >> shift));
            for (int row = 0; row < h; row++) {
                int length;
                if (pixelStride == 1 && outputStride == 1) {
                    // Row is already tightly packed: bulk-copy it.
                    length = w;
                    buffer.get(data, channelOffset, length);
                    channelOffset += length;
                } else {
                    // Interleaved source and/or output: copy sample by sample.
                    length = (w - 1) * pixelStride + 1;
                    buffer.get(rowData, 0, length);
                    for (int col = 0; col < w; col++) {
                        data[channelOffset] = rowData[col * pixelStride];
                        channelOffset += outputStride;
                    }
                }
                // Skip the alignment padding at the end of every row but the last.
                if (row < h - 1) {
                    buffer.position(buffer.position() + rowStride - length);
                }
            }
        }
        return data;
    }
}
That wraps up "implementing video capture for a live-streaming site's source code". I hope it helps. When developing live-streaming source code, every detail deserves attention; a tiny oversight can put the whole project far off course.