
Converting YV12 to RGB on iOS with GLSL, with the resulting image shown

This post describes how to use a GLSL shader on iOS to convert YV12 video-frame data to RGB, and shows the converted image. The implementation steps and code samples below cover the whole process, which should be useful for anyone adding video processing to an app.

I referred to this question.

I convert the YV12 frame data to RGB data using a GLSL shader. The raw image is shown below:

[raw image]

But the result image is not the same as the original; it is attached below:

[result image]

The following is my code for uploading the three planes of data to textures:

- (GLuint) textureY: (Byte*)imageData
          widthType: (int) width
         heightType: (int) height
{          
    GLuint texName;    
    glGenTextures( 1, &texName );     
    glBindTexture(GL_TEXTURE_2D, texName);

    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);    

    glTexImage2D( GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, imageData );  
    //free(imageData);

    return texName;    
}    

- (GLuint) textureU: (Byte*)imageData        
          widthType: (int) width       
         heightType: (int) height       
{          
    GLuint texName;    

    glGenTextures( 1, &texName );     
    glBindTexture(GL_TEXTURE_2D, texName);


    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);    

    glTexImage2D( GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, imageData );    

    //free(imageData);
    return texName;    
}    

- (GLuint) textureV: (Byte*)imageData        
          widthType: (int) width       
         heightType: (int) height       
{          
    GLuint texName;    
    glGenTextures( 1, &texName );     
    glBindTexture(GL_TEXTURE_2D, texName);


    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);    

    glTexImage2D( GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, imageData );    

    //free(imageData);
    return texName;    
}    
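Each plane is uploaded as a single-channel texture, so the shader reads it back through the .r component. For the 352x288 frame used below, the luma texture is uploaded at full resolution and the two chroma textures at quarter resolution (176x144). A hypothetical call site (the plane buffer names are illustrative, not from the original code) would look like:

// Hypothetical usage; YPlanarData/UPlanarData/VPlanarData are the plane
// buffers extracted from the YV12 file in readYUVFile below.
_YPlanarTexture = [self textureY:YPlanarData widthType:352 heightType:288];
_UPlanarTexture = [self textureU:UPlanarData widthType:176 heightType:144];
_VPlanarTexture = [self textureV:VPlanarData widthType:176 heightType:144];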


- (void) readYUVFile     
{    
    NSString *file = [[NSBundle mainBundle] pathForResource:@"video" ofType:@"yv12"];
    NSLog(@"%@",file);
    NSData* fileData = [NSData dataWithContentsOfFile:file]; 
    //NSLog(@"%@",[fileData description]);
    NSInteger width  = 352;    
    NSInteger height = 288;
    NSInteger uv_width  = width  / 2;    
    NSInteger uv_height = height / 2;
    NSInteger dataSize = [fileData length];
    NSLog(@"%i\n",dataSize);

    GLint nYsize  = width * height;     
    GLint nUVsize = uv_width * uv_height;      
    GLint nCbOffSet = nYsize;    
    GLint nCrOffSet = nCbOffSet + nUVsize;    

    Byte *spriteData = (Byte *)malloc(dataSize);
    [fileData getBytes:spriteData length:dataSize];


    Byte* uData = spriteData + nCbOffSet;
    //NSLog(@"%@\n",[[NSData dataWithBytes:uData length:nUVsize] description]);
    Byte* vData = spriteData + nCrOffSet;  
    //NSLog(@"%@\n",[[NSData dataWithBytes:vData length:nUVsize] description]);
    /**
    Byte *YPlanarData = (Byte *)malloc(nYsize);
    for (int i=0; i

And my fragment shader code:

precision highp float;

uniform sampler2D SamplerY;
uniform sampler2D SamplerU;
uniform sampler2D SamplerV;

varying highp vec2 coordinate;

void main()
{
    highp vec3 yuv;
    highp vec3 rgb;

    yuv.x = texture2D(SamplerY, coordinate).r;
    yuv.y = texture2D(SamplerU, coordinate).r - 0.5;
    yuv.z = texture2D(SamplerV, coordinate).r - 0.5;

    rgb = mat3(  1.0,      1.0,   1.0,
                 0.0, -0.34414, 1.772,
               1.402, -0.71414,   0.0) * yuv;

    gl_FragColor = vec4(rgb, 1.0);
}
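In case it helps to see the math: a GLSL mat3 constructor is column-major, so the matrix above implements the standard full-range BT.601 (JPEG) YCbCr-to-RGB conversion, with all channels normalized to [0, 1]:

\[
\begin{aligned}
R &= Y + 1.402\,(V - 0.5)\\
G &= Y - 0.34414\,(U - 0.5) - 0.71414\,(V - 0.5)\\
B &= Y + 1.772\,(U - 0.5)
\end{aligned}
\]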

My confusion is about the conversion formula: when I use this same formula to convert the YV12 data to RGB24 directly on the CPU, and then draw an image with

CGImageCreate(iwidth,
              iheight,
              8,
              24,
              iwidth * 3,
              colorSpace,
              bitmapInfo,
              provider,
              NULL,
              NO,
              kCGRenderingIntentDefault);

the result image is correct. But switching to the shader (the direct CPU transform approach is impractical to run on an iOS device) leads to this problem. I have tried some tricks, such as expanding the U and V planes to (2*uv_width) x (2*uv_height) rectangles before uploading the textures, but I ended up with the same overly red image.
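For reference, here is a minimal sketch of such a direct CPU conversion, assuming 8-bit planar YV12 input and the same full-range coefficients as the shader; the function and helper names are mine, not the original code. The resulting RGB24 buffer is what gets wrapped in a CGDataProvider and passed to CGImageCreate as above.

// Hypothetical CPU-side YV12 -> RGB24 conversion (illustrative sketch, not the original code).
static unsigned char clamp255(float v) {
    return (unsigned char)(v < 0.0f ? 0.0f : (v > 255.0f ? 255.0f : v));
}

// rgb must hold width*height*3 bytes; the u and v planes are (width/2)*(height/2) each.
static void yv12_to_rgb24(const unsigned char *y, const unsigned char *u,
                          const unsigned char *v, unsigned char *rgb,
                          int width, int height) {
    for (int row = 0; row < height; row++) {
        for (int col = 0; col < width; col++) {
            float Y = y[row * width + col];
            float U = u[(row / 2) * (width / 2) + col / 2] - 128.0f;
            float V = v[(row / 2) * (width / 2) + col / 2] - 128.0f;
            unsigned char *p = rgb + (row * width + col) * 3;
            p[0] = clamp255(Y + 1.402f * V);                  // R
            p[1] = clamp255(Y - 0.34414f * U - 0.71414f * V); // G
            p[2] = clamp255(Y + 1.772f * U);                  // B
        }
    }
}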

How can I resolve this issue?

Attached is my whole glView.m code:

#import "OpenGLView.h"

typedef struct {
    float Position[3];
    float TexCoord[2];
} Vertex;

const Vertex Vertices[] = {
    {{1, -1, 0},{1,1}},
    {{1, 1, 0},{1,0}},
    {{-1, 1, 0},{0,0}},
    {{-1, -1, 0},{0,1}}
};

const GLubyte Indices[] = {
    0, 1, 2,
    2, 3, 0
};



@interface OpenGLView ()
- (void)setupLayer;
- (void)setupContext;
- (void)setupRenderBuffer;
- (void)setupFrameBuffer;
- (void)render;

- (GLuint)compileShader:(NSString*)shaderName withType:(GLenum)shaderType;
- (void)setupVBOs;
- (void)compileShaders;

- (void) readYUVFile;
@end

@implementation OpenGLView

- (void)setupVBOs {

    GLuint vertexBuffer;
    glGenBuffers(1, &vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices), Vertices, GL_STATIC_DRAW);

    GLuint indexBuffer;
    glGenBuffers(1, &indexBuffer);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(Indices), Indices, GL_STATIC_DRAW);

}



- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
        self.backgroundColor = [UIColor redColor];
        [self setupLayer];
        [self setupContext];
        [self setupRenderBuffer];
        [self setupFrameBuffer];

        [self setupVBOs];
        [self compileShaders];
        [self readYUVFile];

        [self render];

    }
    return self;
}

+ (Class)layerClass{
    return [CAEAGLLayer class];
}

-(void)setupLayer{
    _eaglLayer = (CAEAGLLayer *)self.layer;
    _eaglLayer.opaque = YES;
}

- (void)setupContext{
    EAGLRenderingAPI api = kEAGLRenderingAPIOpenGLES2;
    _context = [[[EAGLContext alloc] initWithAPI:api] autorelease];

    if (!_context) {
        NSLog(@"Failed to initialize OpenGLES 2.0 context");
        exit(1);
    }

    if (![EAGLContext setCurrentContext:_context]) {
        NSLog(@"Failed to set current OpenGL context");
        exit(1);
    }
}

- (void)setupRenderBuffer {
    glGenRenderbuffers(1, &_colorRenderBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderBuffer);        
    [_context renderbufferStorage:GL_RENDERBUFFER fromDrawable:_eaglLayer];    
}

- (void)setupFrameBuffer {    
    GLuint framebuffer;
    glGenFramebuffers(1, &framebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, 
                              GL_RENDERBUFFER, _colorRenderBuffer);
}

- (GLuint) textureY: (Byte*)imageData        
          widthType: (int) width       
         heightType: (int) height       
{          
    GLuint texName;    
    glGenTextures( 1, &texName );     
    glBindTexture(GL_TEXTURE_2D, texName);

    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);    

    glTexImage2D( GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, imageData );  
    //free(imageData);

    return texName;    
}    

- (GLuint) textureU: (Byte*)imageData        
          widthType: (int) width       
         heightType: (int) height       
{          
    GLuint texName;    

    glGenTextures( 1, &texName );     
    glBindTexture(GL_TEXTURE_2D, texName);


    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);    

    glTexImage2D( GL_TEXTURE_2D, 0, GL_RED_EXT, width, height, 0, GL_RED_EXT, GL_UNSIGNED_BYTE, imageData );    

    //free(imageData);
    return texName;    
}    

- (GLuint) textureV: (Byte*)imageData        
          widthType: (int) width       
         heightType: (int) height       
{          
    GLuint texName;    
    glGenTextures( 1, &texName );     
    glBindTexture(GL_TEXTURE_2D, texName);


    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);    
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);    

    glTexImage2D( GL_TEXTURE_2D, 0, GL_RED_EXT, width, height, 0, GL_RED_EXT, GL_UNSIGNED_BYTE, imageData );    

    //free(imageData);
    return texName;    
}    


- (void) readYUVFile     
{    
    NSString *file = [[NSBundle mainBundle] pathForResource:@"video" ofType:@"yv12"];
    NSLog(@"%@",file);
    NSData* fileData = [NSData dataWithContentsOfFile:file]; 
    //NSLog(@"%@",[fileData description]);
    NSInteger width  = 352;    
    NSInteger height = 288;
    NSInteger uv_width  = width  / 2;    
    NSInteger uv_height = height / 2;
    NSInteger dataSize = [fileData length];
    NSLog(@"%i\n",dataSize);

    GLint nYsize  = width * height;     
    GLint nUVsize = uv_width * uv_height;      
    GLint nCbOffSet = nYsize;    
    GLint nCrOffSet = nCbOffSet + nUVsize;    

    Byte *spriteData = (Byte *)malloc(dataSize);
    [fileData getBytes:spriteData length:dataSize];


    Byte* uData = spriteData + nCbOffSet;
    //NSLog(@"%@\n",[[NSData dataWithBytes:uData length:nUVsize] description]);
    Byte* vData = spriteData + nCrOffSet;  
    //NSLog(@"%@\n",[[NSData dataWithBytes:vData length:nUVsize] description]);

    Byte *YPlanarData = (Byte *)malloc(nYsize);
    for (int i=0; i
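The listing is cut off at this point in the original post. For reference only, here is a minimal sketch of how the remaining plane copies and texture uploads might look, using the offsets computed above; it is an illustrative reconstruction, not the author's missing code, and the _YPlanarTexture / _UPlanarTexture / _VPlanarTexture ivars are the ones referenced in the answer below.

    // Illustrative sketch only -- the original readYUVFile is truncated above.
    // Copy each plane out of the packed YV12 buffer ...
    Byte *UPlanarData = (Byte *)malloc(nUVsize);
    Byte *VPlanarData = (Byte *)malloc(nUVsize);
    memcpy(YPlanarData, spriteData, nYsize);
    memcpy(UPlanarData, uData, nUVsize);
    memcpy(VPlanarData, vData, nUVsize);

    // ... then upload them: luma at full size, chroma at quarter size.
    _YPlanarTexture = [self textureY:YPlanarData widthType:(int)width heightType:(int)height];
    _UPlanarTexture = [self textureU:UPlanarData widthType:(int)uv_width heightType:(int)uv_height];
    _VPlanarTexture = [self textureV:VPlanarData widthType:(int)uv_width heightType:(int)uv_height];

    free(spriteData);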

1 Solution

#1

My foolish mistake: the three texture-upload methods and the fragment shader are correct, but the following two pieces of code do not match each other:

glActiveTexture(GL_TEXTURE0); 
glBindTexture(GL_TEXTURE_2D, _YPlanarTexture);
glUniform1i(_textureUniformY, 0);

glActiveTexture(GL_TEXTURE1); 
glBindTexture(GL_TEXTURE_2D, _UPlanarTexture);
glUniform1i(_textureUniformU, 1);

glActiveTexture(GL_TEXTURE2); 
glBindTexture(GL_TEXTURE_2D, _VPlanarTexture);
glUniform1i(_textureUniformV, 2);

and the following:

_YPlanarTexture = glGetUniformLocation(programHandle, "SamplerY");
_UPlanarTexture = glGetUniformLocation(programHandle, "SamplerU");
_VPlanarTexture = glGetUniformLocation(programHandle, "SamplerV");

So replace the latter with this:

_textureUniformY = glGetUniformLocation(programHandle, "SamplerY");
_textureUniformU = glGetUniformLocation(programHandle, "SamplerU");
_textureUniformV = glGetUniformLocation(programHandle, "SamplerV");

Then it works correctly. (The uniform locations were being stored in the texture-name variables, so at draw time the wrong texture objects were bound and the sampler uniforms were never pointed at the intended texture units.)

