
iOS 11, Objective-C: handling image buffers from ReplayKit with AVAssetWriterInputPixelBufferAdaptor

I am trying to record my app's screen with ReplayKit and crop out parts of the frame while the video is being recorded. It is not going well.

ReplayKit captures the entire screen, so I decided to receive every frame from ReplayKit (as a CMSampleBuffer, via startCaptureWithHandler:), crop it there, and feed it to the video writer through an AVAssetWriterInputPixelBufferAdaptor. But I am running into trouble before even cropping the image buffer.

Here is my working code that records the entire screen:

// Starts recording with a completion/error handler
-(void)startRecordingWithHandler: (RPHandler)handler
{
    // Sets up AVAssetWriter that will generate a video file from the recording.
    self.writer = [AVAssetWriter assetWriterWithURL:self.outputFileURL
                                           fileType:AVFileTypeQuickTimeMovie
                                              error:nil];

    NSDictionary* outputSettings =
    @{
      AVVideoWidthKey  : @(screen.size.width),   // The whole width of the entire screen.
      AVVideoHeightKey : @(screen.size.height),  // The whole height of the entire screen.
      AVVideoCodecKey  : AVVideoCodecTypeH264,
      };

    // Sets up AVAssetWriterInput that will feed ReplayKit's frame buffers to the writer.
    self.videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                         outputSettings:outputSettings];

    // Lets it know that the input will be realtime using ReplayKit.
    [self.videoInput setExpectsMediaDataInRealTime:YES];

    NSDictionary* sourcePixelBufferAttributes =
    @{
      (NSString*) kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA),
      (NSString*) kCVPixelBufferWidthKey          : @(screen.size.width),
      (NSString*) kCVPixelBufferHeightKey         : @(screen.size.height),
      };

    // Adds the video input to the writer.
    [self.writer addInput:self.videoInput];

    // Sets up ReplayKit itself.
    self.recorder = [RPScreenRecorder sharedRecorder];

    // Arranges the pipeline from ReplayKit to the input.
    RPBufferHandler bufferHandler = ^(CMSampleBufferRef sampleBuffer, RPSampleBufferType bufferType, NSError* error) {
        [self captureSampleBuffer:sampleBuffer withBufferType:bufferType];
    };

    RPHandler errorHandler = ^(NSError* error) {
        if (error) handler(error);
    };

    // Starts ReplayKit's recording session. 
    // Sample buffers will be sent to `captureSampleBuffer` method.
    [self.recorder startCaptureWithHandler:bufferHandler completionHandler:errorHandler];
}

// Receives a sample buffer from ReplayKit every frame.
-(void)captureSampleBuffer:(CMSampleBufferRef)sampleBuffer withBufferType:(RPSampleBufferType)bufferType
{
    // Uses a queue in sync so that the writer-starting logic won't be invoked twice.
    dispatch_sync(dispatch_get_main_queue(), ^{
        // Starts the writer if not started yet. We do this here in order to get the proper source time later.
        if (self.writer.status == AVAssetWriterStatusUnknown) {
            [self.writer startWriting];
            return;
        }

        // Receives a sample buffer from ReplayKit.
        switch (bufferType) {
            case RPSampleBufferTypeVideo:{
                // Initializes the source time when a video frame buffer is received the first time.
                // This prevents the output video from starting with blank frames.
                if (!self.startedWriting) {
                    NSLog(@"self.writer startSessionAtSourceTime");
                    [self.writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
                    self.startedWriting = YES;
                }

                // Appends a received video frame buffer to the writer.
                [self.videoInput appendSampleBuffer:sampleBuffer];
                break;
            }
        }
    });
}

// Stops the current recording session, and saves the output file to the user photo album.
-(void)stopRecordingWithHandler:(RPHandler)handler
{
    // Closes the input.
    [self.videoInput markAsFinished];

    // Finishes up the writer. 
    [self.writer finishWritingWithCompletionHandler:^{
        handler(self.writer.error);

        // Saves the output video to the user photo album.
        [[PHPhotoLibrary sharedPhotoLibrary] performChanges: ^{ [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL: self.outputFileURL]; }
                                          completionHandler: ^(BOOL s, NSError* e) { }];
    }];

    // Stops ReplayKit's recording.
    [self.recorder stopCaptureWithHandler:nil];
}

Here, every sample buffer from ReplayKit is fed straight to the writer (in the captureSampleBuffer:withBufferType: method), so the entire screen is recorded.

Then I replaced that part with the same logic routed through an AVAssetWriterInputPixelBufferAdaptor, which also works fine:

...
case RPSampleBufferTypeVideo:{
    ... // Initializes source time.

    // Gets the timestamp of the sample buffer.
    CMTime time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    // Extracts the pixel image buffer from the sample buffer.
    CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Appends a received sample buffer as an image buffer to the writer via the adaptor.
    [self.videoAdaptor appendPixelBuffer:imageBuffer withPresentationTime:time];
    break;
}
...

The adaptor is set up as follows:

NSDictionary* sourcePixelBufferAttributes =
@{
  (NSString*) kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA),
  (NSString*) kCVPixelBufferWidthKey          : @(screen.size.width),
  (NSString*) kCVPixelBufferHeightKey         : @(screen.size.height),
  };

self.videoAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:self.videoInput
                                                                                     sourcePixelBufferAttributes:sourcePixelBufferAttributes];

So the pipeline works fine.

Then I made a hard copy of the image buffer in main memory and fed that copy to the adaptor:

...
case RPSampleBufferTypeVideo:{
    ... // Initializes source time.

    // Gets the timestamp of the sample buffer.
    CMTime time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    // Extracts the pixel image buffer from the sample buffer.
    CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Hard-copies the image buffer.
    CVPixelBufferRef copiedImageBuffer = [self copy:imageBuffer];

    // Appends a received video frame buffer to the writer via the adaptor.
    [self.videoAdaptor appendPixelBuffer:copiedImageBuffer withPresentationTime:time];
    break;
}
...

// Hard-copies the pixel buffer.
-(CVPixelBufferRef)copy:(CVPixelBufferRef)inputBuffer
{
    // Locks the base address of the buffer
    // so that GPU won't change the data until unlocked later.
    CVPixelBufferLockBaseAddress(inputBuffer, 0); //-------------------------------

    char* baseAddress = (char*)CVPixelBufferGetBaseAddress(inputBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(inputBuffer);
    size_t width = CVPixelBufferGetWidth(inputBuffer);
    size_t height = CVPixelBufferGetHeight(inputBuffer);
    size_t length = bytesPerRow * height;

    // Mallocs the same length as the input buffer for copying.
    char* outputAddress = (char*)malloc(length);

    // Copies the input buffer's data to the malloced space.
    for (int i = 0; i < length; i++) {
        outputAddress[i] = baseAddress[i];
    }

    // Unlocks the base address.
    CVPixelBufferUnlockBaseAddress(inputBuffer, 0);

    // Wraps the copied data in a new pixel buffer.
    // `releaseCallback` is expected to free the malloced space
    // when the buffer is released.
    CVPixelBufferRef outputBuffer = NULL;
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                 width,
                                 height,
                                 kCVPixelFormatType_32BGRA,
                                 outputAddress,
                                 bytesPerRow,
                                 releaseCallback,
                                 NULL,
                                 NULL,
                                 &outputBuffer);
    return outputBuffer;
}
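As an aside on the flat byte-copy loop above: copying bytesPerRow * height bytes in one run also copies each row's alignment padding, and only stays correct if the destination interprets the data with exactly the same stride. A stride-aware copy goes row by row instead. This is a minimal plain-C sketch of that idea (no Core Video involved; copy_rows and its parameters are hypothetical stand-ins for the CVPixelBuffer getters used above):

```c
#include <stddef.h>
#include <string.h>

/* Plain-C sketch of a row-by-row pixel copy. srcStride and dstStride stand
   in for CVPixelBufferGetBytesPerRow() of the source and destination
   buffers; these usually exceed width * bytesPerPixel because rows are
   padded for alignment, and the two buffers need not agree. */
void copy_rows(const unsigned char *src, size_t srcStride,
               unsigned char *dst, size_t dstStride,
               size_t width, size_t height, size_t bytesPerPixel)
{
    size_t rowBytes = width * bytesPerPixel; /* meaningful bytes per row */
    for (size_t y = 0; y < height; y++) {
        /* Copy only the pixel data; skip each buffer's own padding. */
        memcpy(dst + y * dstStride, src + y * srcStride, rowBytes);
    }
}
```

For a 2-pixel-wide BGRA image with a source stride of 16 and a destination stride of 8, this copies source bytes 0..7 into the first destination row and bytes 16..23 into the second, dropping the padding in between.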

This does not work: the saved video comes out looking like the screenshot that was attached here (the image itself has not survived in this copy):

[Screenshot: broken output video]

It looks as if the bytes per row and the color format are wrong. I have researched and experimented with the following, to no avail:

Hard-coding the bytes per row as 4 * width -> "bad access".

Using int or double instead of char -> some strange debugger-terminating exceptions.

Using other image formats -> "not supported" or access errors.

Also, releaseCallback is never called, so memory runs out within about 10 seconds of recording.

What could be causing this, judging from the output?
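One pattern consistent with that symptom: when pixel data written with a padded stride is later read back assuming a tight width * 4 stride (or vice versa), each row starts slightly offset from the previous one and the picture shears diagonally. A small sketch of the arithmetic (plain C; the 100-pixel width and 448-byte stride below are made-up illustration values, not taken from any real buffer):

```c
#include <stddef.h>

/* Byte offset of pixel (x, y) in a BGRA image with the given bytes-per-row.
   If data is written with a padded stride of 448 for a 100-pixel-wide image
   but read back assuming a tight stride of 100 * 4 = 400, the reader falls
   448 - 400 = 48 bytes (12 pixels) further behind on every row, producing a
   progressive sideways shear of the whole picture. */
size_t pixel_offset(size_t x, size_t y, size_t bytesPerRow)
{
    return y * bytesPerRow + x * 4;
}
```

A progressive diagonal tear like that in the saved video is a hint to compare CVPixelBufferGetBytesPerRow of the source against the stride the destination buffer actually uses, rather than assuming width * 4.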

