Getting Started with OpenGL ES, Part 11: Rendering Camera Video

Preface

This article is part of my systematic study of OpenGL ES and records what I learned along the way.
In this installment we use the OpenGL ES techniques covered so far to render video frames captured from the iOS camera.
Environment: Xcode 8.1 + OpenGL ES 2.0.
The code is available on GitHub: OpenGL ES入門(mén)11-相機(jī)視頻渲染.

Feel free to follow my OpenGL ES tutorial series (OpenGL ES入門(mén)專題).

Result

[Screenshot: rendered camera output]

Background

  • YUV is an image storage format, similar in purpose to RGB. In YUV, Y is luminance: the Y data alone already forms a complete image, just a grayscale one. U and V carry the color differences (also called Cb, the blue difference, and Cr, the red difference). Early television signals used YUV so that color broadcasts stayed compatible with black-and-white sets: strip the U and V from a YUV image and a black-and-white picture remains. YUV can save bandwidth by discarding chroma. A YUV420 image, for example, takes half the bytes of the same image in RGB, and sharing chroma between neighboring pixels is barely noticeable to the human eye.
  • A YUV420 image occupies:
size = width * height + (width * height) / 4 + (width * height) / 4
  • An RGB image occupies:
size = width * height * 3
  • An RGBA image occupies:
size = width * height * 4
  • YUV420 itself comes in several memory layouts: I420, NV12, and NV21.
    I420: three separate planes, Y then U then V: Y0,Y1,…,U0,U1,…,V0,V1,… (one U and one V sample per 2×2 block of Y samples)
    NV12: two planes, Y then interleaved UV: Y0,Y1,…,U0,V0,U1,V1,…
    NV21: same as NV12, but with V before U in each pair.
  • Pixel formats the iOS camera can output (the figure below listed the formats supported by the device):
    [Screenshot: supported device formats]

    kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v': NV12 output, video range (luma=[16,235], chroma=[16,240])
    kCVPixelFormatType_420YpCbCr8BiPlanarFullRange = '420f': NV12 output, full range (luma=[0,255], chroma=[1,255])
    kCVPixelFormatType_32BGRA = 'BGRA': BGRA output
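As a sanity check on the size formulas above, here is a small C sketch (the function names are illustrative, not from the article's code). It also computes where the interleaved UV plane of an NV12 buffer begins, which follows directly from the layout description:

```c
#include <assert.h>
#include <stddef.h>

/* Byte sizes for a width x height frame in the formats above. */
size_t yuv420_size(size_t w, size_t h) { return w * h + (w * h) / 4 + (w * h) / 4; }
size_t rgb_size(size_t w, size_t h)    { return w * h * 3; }
size_t rgba_size(size_t w, size_t h)   { return w * h * 4; }

/* In NV12, plane 0 is Y (w*h bytes) and plane 1 is interleaved UV
   ((w*h)/2 bytes), so the UV plane starts right after the Y plane. */
size_t nv12_uv_offset(size_t w, size_t h) { return w * h; }
```

For the 640x480 frames used later in this article, `yuv420_size` gives 460800 bytes, exactly half of the 921600 bytes an RGB frame would need.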

Implementation

1. Capture video data and set the output format.

- (void)setupSession
{
    _captureSession = [[AVCaptureSession alloc] init];
    [_captureSession beginConfiguration];
    
    // Set the capture resolution
    [_captureSession setSessionPreset:AVCaptureSessionPreset640x480];
    
    // Configure the input device (back camera)
    AVCaptureDevice *inputCamera = nil;
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices)
    {
        if ([device position] == AVCaptureDevicePositionBack)
        {
            inputCamera = device;
        }
    }
    
    if (!inputCamera) {
        return;
    }
    
    NSError *error = nil;
    _videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:inputCamera error:&error];
    if ([_captureSession canAddInput:_videoInput])
    {
        [_captureSession addInput:_videoInput];
    }
    
    // Configure the data output
    _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [_videoOutput setAlwaysDiscardsLateVideoFrames:NO];
    
    // Set the output pixel format (NV12, video range)
    [_videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
    [_videoOutput setSampleBufferDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];
    
    
    if ([_captureSession canAddOutput:_videoOutput]) {
        [_captureSession addOutput:_videoOutput];
    }
    [_captureSession commitConfiguration];
}

2. Implement the delegate method to receive the video output.

#pragma mark - <AVCaptureVideoDataOutputSampleBufferDelegate>
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (!self.captureSession.isRunning) {
        return;
    }else if (captureOutput == _videoOutput) {
        OpenGLESView *glView = (OpenGLESView *)self.view;
        // Dispatch on the renderer's texture format
        if ([glView.render isMemberOfClass:[GLRenderRGB class]]) {
             [self processVideoSampleBufferToRGB1:sampleBuffer];
        }else {
            [self processVideoSampleBufferToYUV:sampleBuffer];
        }
    }
}

3. Set up OpenGLESView. At initialization we hand it a renderer (GLRender), which is responsible for rendering the different data types (RGBA and YUV).

//
//  OpenGLESView.m
//  OpenGLES01-環(huán)境搭建
//
//  Created by qinmin on 2017/2/9.
//  Copyright ? 2017 qinmin. All rights reserved.
//

#import "OpenGLESView.h"
#import <OpenGLES/ES2/gl.h>
#import "GLUtil.h"

@interface OpenGLESView : UIView
@property (nonatomic, strong) GLRender *render;
- (void)setTexture:(GLTexture *)texture;
- (void)setNeedDraw;
@end

@interface OpenGLESView ()
{
    CAEAGLLayer     *_eaglLayer;
    EAGLContext     *_context;
    GLuint          _colorRenderBuffer;
    GLuint          _frameBuffer;

    GLRender        *_render;
}
@end

@implementation OpenGLESView

+ (Class)layerClass
{
    // Only a layer of class [CAEAGLLayer class] supports drawing OpenGL content on it.
    return [CAEAGLLayer class];
}

- (void)dealloc
{
    
}

- (instancetype)initWithFrame:(CGRect)frame
{
    if (self = [super initWithFrame:frame]) {
        [self setupLayer];
        [self setupContext];
    }
    return self;
}

- (void)layoutSubviews
{
    [EAGLContext setCurrentContext:_context];
    
    [self destoryRenderAndFrameBuffer];
    
    [self setupFrameAndRenderBuffer];
}


#pragma mark - Setup
- (void)setupLayer
{
    _eaglLayer = (CAEAGLLayer*) self.layer;
    
    // A CALayer is transparent by default; it must be made opaque for the content to be visible
    _eaglLayer.opaque = YES;
    
    // Set the drawable properties: do not retain the rendered contents, and use RGBA8 as the color format
    _eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                     [NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking, kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];
}

- (void)setupContext
{
    // Use OpenGL ES 2.0. Versions 1.0 and 3.0 are also available; we will cover the differences between 2.0 and 3.0 later. For compatibility we stick with 2.0 here.
    _context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    if (!_context) {
        NSLog(@"Failed to initialize OpenGLES 2.0 context");
        exit(1);
    }
    
    // Make the context we just created the current context
    if (![EAGLContext setCurrentContext:_context]) {
        NSLog(@"Failed to set current OpenGL context");
        exit(1);
    }
}

- (void)setupFrameAndRenderBuffer
{
    glGenRenderbuffers(1, &_colorRenderBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderBuffer);
    // Allocate storage for the color renderbuffer
    [_context renderbufferStorage:GL_RENDERBUFFER fromDrawable:_eaglLayer];
    
    glGenFramebuffers(1, &_frameBuffer);
    // Make this the current framebuffer
    glBindFramebuffer(GL_FRAMEBUFFER, _frameBuffer);
    // Attach _colorRenderBuffer to the GL_COLOR_ATTACHMENT0 attachment point
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                              GL_RENDERBUFFER, _colorRenderBuffer);
}

#pragma mark - Clean
- (void)destoryRenderAndFrameBuffer
{
    glDeleteFramebuffers(1, &_frameBuffer);
    _frameBuffer = 0;
    glDeleteRenderbuffers(1, &_colorRenderBuffer);
    _colorRenderBuffer = 0;
}

#pragma mark - Render
- (void)draw
{
    glClearColor(1.0, 1.0, 1.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT);
    
    glViewport(0, 0, self.frame.size.width, self.frame.size.height);
    
    // Draw
    [_render prepareRender];
    
    // Present the specified renderbuffer on screen; here it is the one bound as the current renderbuffer above. Before a renderbuffer can be presented, renderbufferStorage:fromDrawable: must have allocated storage for it.
    [_context presentRenderbuffer:GL_RENDERBUFFER];
}

#pragma mark - PublicMethod
- (void)setRender:(GLRender *)render
{
    _render = render;
}

- (void)setTexture:(GLTexture *)texture
{
    [_render setTexture:texture];
}

- (void)setNeedDraw
{
    [self draw];
}

@end

4. Create the renderers. A renderer creates the GL program, the vertex buffer data, and the texture objects.

//
//  GLRGBRender.h
//  OpenGLES11-相機(jī)視頻渲染
//
//  Created by mac on 17/3/24.
//  Copyright ? 2017 Qinmin. All rights reserved.
//

#import <UIKit/UIKit.h>
#import "GLTexture.h"
#import "GLUtil.h"

@interface GLRender : NSObject
@property (nonatomic, assign) GLuint program;
@property (nonatomic, assign) GLuint vertexVBO;
@property (nonatomic, assign) int vertCount;
- (void)setTexture:(GLTexture *)texture;
- (void)prepareRender;
@end

@interface GLRenderRGB : GLRender
@property(nonatomic, assign, readonly) GLuint rgb;
@end

@interface GLRenderYUV : GLRender
@property(nonatomic, assign, readonly) GLuint y;
@property(nonatomic, assign, readonly) GLuint u;
@property(nonatomic, assign, readonly) GLuint v;
@end

////////////////GLRender//////////////////////////
@implementation GLRender
- (void)setupGLProgram
{
}

- (void)setTexture:(GLTexture *)texture
{
}

- (void)prepareRender
{
}
@end

////////////////GLRenderRGB//////////////////////////
@implementation GLRenderRGB

- (instancetype)init
{
    if (self = [super init]) {
        [self setupGLProgram];
        [self setupVBO];
        
        _rgb = createTexture2D(GL_RGBA, 640, 480, NULL);
    }
    return self;
}

- (void)setupGLProgram
{
    NSString *vertFile = [[NSBundle mainBundle] pathForResource:@"vert.glsl" ofType:nil];
    NSString *fragFile = [[NSBundle mainBundle] pathForResource:@"frag_rgb.glsl" ofType:nil];
    
    self.program = createGLProgramFromFile(vertFile.UTF8String, fragFile.UTF8String);
    glUseProgram(self.program);
}

- (void)setupVBO
{
    self.vertCount = 6;
    
    GLfloat vertices[] = {
        0.8f,  0.6f, 0.0f, 1.0f, 0.0f,   // top right
        0.8f, -0.6f, 0.0f, 1.0f, 1.0f,   // bottom right
        -0.8f, -0.6f, 0.0f, 0.0f, 1.0f,  // bottom left
        -0.8f, -0.6f, 0.0f, 0.0f, 1.0f,  // bottom left
        -0.8f,  0.6f, 0.0f, 0.0f, 0.0f,  // top left
        0.8f,  0.6f, 0.0f, 1.0f, 0.0f,   // top right
    };
    
    // Create the VBO
    self.vertexVBO = createVBO(GL_ARRAY_BUFFER, GL_STATIC_DRAW, sizeof(vertices), vertices);
}

- (void)setTexture:(GLTexture *)texture
{
    if ([texture isMemberOfClass:[GLTextureRGB class]]) {
        
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
        
        GLTextureRGB *rgbTexture = (GLTextureRGB *)texture;
        glBindTexture(GL_TEXTURE_2D, _rgb);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texture.width, texture.height, GL_RGBA, GL_UNSIGNED_BYTE, rgbTexture.RGBA);
    }
}

- (void)prepareRender
{
    glBindBuffer(GL_ARRAY_BUFFER, self.vertexVBO);
    glEnableVertexAttribArray(glGetAttribLocation(self.program, "position"));
    glVertexAttribPointer(glGetAttribLocation(self.program, "position"), 3, GL_FLOAT, GL_FALSE, sizeof(GLfloat)*5, NULL);
    
    glEnableVertexAttribArray(glGetAttribLocation(self.program, "texcoord"));
    glVertexAttribPointer(glGetAttribLocation(self.program, "texcoord"), 2, GL_FLOAT, GL_FALSE, sizeof(GLfloat)*5, (const GLvoid *)(sizeof(GLfloat)*3));
    
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, _rgb);
    glUniform1i(glGetUniformLocation(self.program, "image0"), 0);
    
    glDrawArrays(GL_TRIANGLES, 0, self.vertCount);
}

@end

////////////////GLRenderYUV//////////////////////////
@implementation GLRenderYUV
- (instancetype)init
{
    if (self = [super init]) {
        [self setupGLProgram];
        [self setupVBO];
        
        _y = createTexture2D(GL_LUMINANCE, 640, 480, NULL);
        _u = createTexture2D(GL_LUMINANCE, 640/2, 480/2, NULL);
        _v = createTexture2D(GL_LUMINANCE, 640/2, 480/2, NULL);
    }
    return self;
}

- (void)setupGLProgram
{
    NSString *vertFile = [[NSBundle mainBundle] pathForResource:@"vert.glsl" ofType:nil];
    NSString *fragFile = [[NSBundle mainBundle] pathForResource:@"frag.glsl" ofType:nil];
    
    self.program = createGLProgramFromFile(vertFile.UTF8String, fragFile.UTF8String);
    glUseProgram(self.program);
}

- (void)setupVBO
{
    self.vertCount = 6;
    
    GLfloat vertices[] = {
        0.8f,  0.6f, 0.0f, 1.0f, 0.0f,   // top right
        0.8f, -0.6f, 0.0f, 1.0f, 1.0f,   // bottom right
        -0.8f, -0.6f, 0.0f, 0.0f, 1.0f,  // bottom left
        -0.8f, -0.6f, 0.0f, 0.0f, 1.0f,  // bottom left
        -0.8f,  0.6f, 0.0f, 0.0f, 0.0f,  // top left
        0.8f,  0.6f, 0.0f, 1.0f, 0.0f,   // top right
    };
    
    // Create the VBO
    self.vertexVBO = createVBO(GL_ARRAY_BUFFER, GL_STATIC_DRAW, sizeof(vertices), vertices);
}

- (void)setTexture:(GLTexture *)texture
{
    if ([texture isMemberOfClass:[GLTextureYUV class]]) {
        GLTextureYUV *yuvTexture = (GLTextureYUV *)texture;
        
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
        
        glBindTexture(GL_TEXTURE_2D, _y);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texture.width, texture.height, GL_LUMINANCE, GL_UNSIGNED_BYTE, yuvTexture.Y);
        glBindTexture(GL_TEXTURE_2D, 0);
        
        glBindTexture(GL_TEXTURE_2D, _u);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texture.width/2, texture.height/2, GL_LUMINANCE, GL_UNSIGNED_BYTE, yuvTexture.U);
        glBindTexture(GL_TEXTURE_2D, 0);
        
        glBindTexture(GL_TEXTURE_2D, _v);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texture.width/2, texture.height/2, GL_LUMINANCE, GL_UNSIGNED_BYTE, yuvTexture.V);
        glBindTexture(GL_TEXTURE_2D, 0);
    }
    }
}

- (void)prepareRender
{
    glBindBuffer(GL_ARRAY_BUFFER, self.vertexVBO);
    glEnableVertexAttribArray(glGetAttribLocation(self.program, "position"));
    glVertexAttribPointer(glGetAttribLocation(self.program, "position"), 3, GL_FLOAT, GL_FALSE, sizeof(GLfloat)*5, NULL);
    
    glEnableVertexAttribArray(glGetAttribLocation(self.program, "texcoord"));
    glVertexAttribPointer(glGetAttribLocation(self.program, "texcoord"), 2, GL_FLOAT, GL_FALSE, sizeof(GLfloat)*5, (const GLvoid *)(sizeof(GLfloat)*3));
    
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, _y);
    glUniform1i(glGetUniformLocation(self.program, "image0"), 0);
    
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, _u);
    glUniform1i(glGetUniformLocation(self.program, "image1"), 1);
    
    glActiveTexture(GL_TEXTURE2);
    glBindTexture(GL_TEXTURE_2D, _v);
    glUniform1i(glGetUniformLocation(self.program, "image2"), 2);
    
    glDrawArrays(GL_TRIANGLES, 0, self.vertCount);
}

@end
5. Create the texture description objects. RGBA data is stored packed in a single buffer, while YUV data is stored as separate planes.

//
//  GLTexture.h
//  OpenGLES11-相機(jī)視頻渲染
//
//  Created by mac on 17/3/24.
//  Copyright ? 2017 Qinmin. All rights reserved.
//

#import <Foundation/Foundation.h>

@interface GLTexture : NSObject
@property (assign, nonatomic) int width;
@property (assign, nonatomic) int height;
@end

@interface GLTextureRGB : GLTexture
@property (nonatomic, assign) uint8_t *RGBA;
@end

@interface GLTextureYUV : GLTexture
@property (nonatomic, assign) uint8_t *Y;
@property (nonatomic, assign) uint8_t *U;
@property (nonatomic, assign) uint8_t *V;
@end

@implementation GLTexture
@end

@implementation GLTextureRGB
- (void)dealloc
{
    if (_RGBA) {
        free(_RGBA);
        _RGBA = NULL;
    }
}
@end

@implementation GLTextureYUV
- (void)dealloc
{
    if (_Y) {
        free(_Y);
        _Y = NULL;
    }
    
    if (_U) {
        free(_U);
        _U = NULL;
    }
    
    if (_V) {
        free(_V);
        _V = NULL;
    }
}
@end

6. Initialize the OpenGLESView and hand it a renderer.

- (void)viewDidLoad {
    [super viewDidLoad];
    
    OpenGLESView *glView = [[OpenGLESView alloc] initWithFrame:self.view.frame];
    // Render via the RGB path
    GLRender *render = [[GLRenderRGB alloc] init];
    [glView setRender:render];
    self.view = glView;
    
    UIButton *btn = [[UIButton alloc] initWithFrame:CGRectMake(0, 30, 120, 30)];
    [btn addTarget:self action:@selector(startBtnClick:) forControlEvents:UIControlEventTouchUpInside];
    [btn setTitle:@"Start" forState:UIControlStateNormal];
    [btn setBackgroundColor:[UIColor greenColor]];
    [self.view addSubview:btn];
    
    [self setupSession];
}

7. Process the video data in the different ways and pass it to OpenGL ES for rendering.

// For video format kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange or kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
- (void)processVideoSampleBufferToYUV:(CMSampleBufferRef)sampleBuffer
{
    //CFAbsoluteTime startTime = CFAbsoluteTimeGetCurrent();
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    
    // Lock the base address before touching the pixel data
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    
    int pixelWidth = (int) CVPixelBufferGetWidth(pixelBuffer);
    int pixelHeight = (int) CVPixelBufferGetHeight(pixelBuffer);
    
    GLTextureYUV *yuv = [[GLTextureYUV alloc] init];
    yuv.width = pixelWidth;
    yuv.height = pixelHeight;
    
    //size_t count = CVPixelBufferGetPlaneCount(pixelBuffer);
    // Copy the Y plane out of the CVImageBufferRef.
    // Note: this assumes bytesPerRow == width; in general, copy row by row
    // using CVPixelBufferGetBytesPerRowOfPlane.
    size_t y_size = pixelWidth * pixelHeight;
    uint8_t *yuv_frame = malloc(y_size);
    uint8_t *y_frame = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    memcpy(yuv_frame, y_frame, y_size);
    yuv.Y = yuv_frame;
    
    // Interleaved UV plane (NV12)
    uint8_t *uv_frame = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
    size_t uv_size = y_size/2;
    
    // De-interleave the U samples
    size_t u_size = y_size/4;
    uint8_t *u_frame = malloc(u_size);
    for (int i = 0, j = 0; i < uv_size; i += 2, j++) {
        u_frame[j] = uv_frame[i];
    }
    yuv.U = u_frame;
    
    // De-interleave the V samples
    size_t v_size = y_size/4;
    uint8_t *v_frame = malloc(v_size);
    for (int i = 1, j = 0; i < uv_size; i += 2, j++) {
        v_frame[j] = uv_frame[i];
    }
    yuv.V = v_frame;
    
    // Unlock
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    
    dispatch_async(dispatch_get_main_queue(), ^{
        OpenGLESView *glView = (OpenGLESView *)self.view;
        [glView setTexture:yuv];
        [glView setNeedDraw];
    });
}
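The two de-interleaving loops above can be sketched as one standalone C function (illustrative, not part of the project code) that splits an NV12 UV plane into separate U and V planes in a single pass:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Split an interleaved NV12 UV plane (U0,V0,U1,V1,...) into separate
   U and V planes. uv_size is the byte length of the interleaved plane;
   u and v each receive uv_size/2 bytes. */
void nv12_split_uv(const uint8_t *uv, size_t uv_size, uint8_t *u, uint8_t *v)
{
    for (size_t i = 0, j = 0; i + 1 < uv_size; i += 2, j++) {
        u[j] = uv[i];
        v[j] = uv[i + 1];
    }
}
```

Walking the pair once instead of twice halves the number of passes over the UV plane compared with the two loops in the method above.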

// For video format kCVPixelFormatType_32BGRA
- (void)processVideoSampleBufferToRGB:(CMSampleBufferRef)sampleBuffer
{
    //CFAbsoluteTime startTime = CFAbsoluteTimeGetCurrent();
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    //size_t count = CVPixelBufferGetPlaneCount(pixelBuffer);
    //printf("%zud\n", count);
    
    // Lock the base address before touching the pixel data
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    
    int pixelWidth = (int) CVPixelBufferGetWidth(pixelBuffer);
    int pixelHeight = (int) CVPixelBufferGetHeight(pixelBuffer);
    
    GLTextureRGB *rgb = [[GLTextureRGB alloc] init];
    rgb.width = pixelWidth;
    rgb.height = pixelHeight;
    
    // BGRA data. The buffer is non-planar, so use CVPixelBufferGetBaseAddress
    // (CVPixelBufferGetBaseAddressOfPlane returns NULL for non-planar buffers).
    // Note: the copy below assumes bytesPerRow == width * 4.
    uint8_t *frame = CVPixelBufferGetBaseAddress(pixelBuffer);
    
    uint8_t *bgra = malloc(pixelHeight * pixelWidth * 4);
    memcpy(bgra, frame, pixelHeight * pixelWidth * 4);
    
    rgb.RGBA = bgra;
    
    // Unlock
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    
    dispatch_async(dispatch_get_main_queue(), ^{
        OpenGLESView *glView = (OpenGLESView *)self.view;
        [glView setTexture:rgb];
        [glView setNeedDraw];
    });
}
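The memcpy above only works when the rows are tightly packed. CVPixelBuffer rows can carry padding (bytesPerRow > width * 4), so a more defensive copy goes row by row; a plain-C sketch (the function name is made up for illustration):

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Copy a width x height BGRA image out of a buffer whose rows are
   src_stride bytes apart (src_stride >= width * 4) into a tightly
   packed destination buffer. */
void copy_bgra_rows(const uint8_t *src, size_t src_stride,
                    uint8_t *dst, size_t width, size_t height)
{
    size_t row_bytes = width * 4;
    for (size_t r = 0; r < height; r++) {
        memcpy(dst + r * row_bytes, src + r * src_stride, row_bytes);
    }
}
```

In the Objective-C method, `src_stride` would come from CVPixelBufferGetBytesPerRow.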

// For video format kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange or kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
- (void)processVideoSampleBufferToRGB1:(CMSampleBufferRef)sampleBuffer
{
    //CFAbsoluteTime startTime = CFAbsoluteTimeGetCurrent();
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    //size_t count = CVPixelBufferGetPlaneCount(pixelBuffer);
    //printf("%zud\n", count);
    
    // Lock the base address before touching the pixel data
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    
    int pixelWidth = (int) CVPixelBufferGetWidth(pixelBuffer);
    int pixelHeight = (int) CVPixelBufferGetHeight(pixelBuffer);
    
    GLTextureRGB *rgb = [[GLTextureRGB alloc] init];
    rgb.width = pixelWidth;
    rgb.height = pixelHeight;
    
    // Y plane
    //size_t y_size = pixelWidth * pixelHeight;
    uint8_t *y_frame = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    
    // Interleaved UV plane
    uint8_t *uv_frame = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
    //size_t uv_size = y_size/2;
    
    // "ARGB" here is BGRA due to byte order: the converted data is BGRA in memory
    uint8_t *bgra = malloc(pixelHeight * pixelWidth * 4);
    NV12ToARGB(y_frame, pixelWidth, uv_frame, pixelWidth, bgra, pixelWidth * 4, pixelWidth, pixelHeight);
    
    rgb.RGBA = bgra;
    
    // Unlock
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    
    dispatch_async(dispatch_get_main_queue(), ^{
        OpenGLESView *glView = (OpenGLESView *)self.view;
        [glView setTexture:rgb];
        [glView setNeedDraw];
    });
}

8. Rotating the image and converting pixels. The image the camera delivers is rotated, so to display it we rotate it about the Z axis; here the vertex shader rotates each vertex by -90 degrees. OpenGL ES processes pixels as RGBA, so we also need to convert the BGRA and YUV data to RGBA.

The rotation matrix about the Z axis is:

    |  cos θ   -sin θ   0 |
    |  sin θ    cos θ   0 |
    |    0        0     1 |
  • Vertex shader: rotates position by -90 degrees.
attribute vec3 position;
attribute vec3 color;
attribute vec2 texcoord;

varying vec2 v_texcoord;

void main()
{
    const float degree = radians(-90.0);
   
    // Build the rotation matrix (mat3 constructors are column-major)
    const mat3 rotate = mat3(
        cos(degree), sin(degree), 0.0,
        -sin(degree), cos(degree), 0.0,
        0.0, 0.0, 1.0
    );
    
    gl_Position = vec4(rotate*position, 1.0);
    v_texcoord = texcoord;
}
  • RGBA fragment shader: because the incoming data is BGRA, it swizzles BGRA back to RGBA.
precision mediump float;

varying vec2 v_texcoord;

uniform sampler2D image0;

void main()
{
    // The texel bytes are B,G,R,A but were uploaded as RGBA,
    // so swizzle the channels back into RGBA order
    vec4 color = texture2D(image0, v_texcoord);

    gl_FragColor = vec4(color.bgr, 1.0);
}
  • YUV fragment shader: because the incoming data is YUV, it converts YUV to RGB.
precision mediump float;

varying vec2 v_texcoord;

uniform sampler2D image0;
uniform sampler2D image1;
uniform sampler2D image2;

void main()
{
    highp float y = texture2D(image0, v_texcoord).r;
    highp float u = texture2D(image1, v_texcoord).r - 0.5;
    highp float v = texture2D(image2, v_texcoord).r - 0.5;
    
    // YUV -> RGB conversion (BT.601-style coefficients)
    highp float r = y + 0.000     + 1.402 * v;
    highp float g = y - 0.344 * u - 0.714 * v;
    highp float b = y + 1.772 * u;
    
    gl_FragColor = vec4(r, g, b, 1.0);
}
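The same conversion can be verified on the CPU; a small C sketch using the shader's coefficients (the function name is illustrative; values are normalized to [0,1], and u/v are raw samples before centering):

```c
#include <assert.h>
#include <math.h>

/* YUV -> RGB with the same coefficients as the shader above.
   y, u, v are in [0,1]; u and v are centered by subtracting 0.5. */
void yuv_to_rgb(float y, float u, float v, float *r, float *g, float *b)
{
    u -= 0.5f;
    v -= 0.5f;
    *r = y + 1.402f * v;
    *g = y - 0.344f * u - 0.714f * v;
    *b = y + 1.772f * u;
}
```

With neutral chroma (u = v = 0.5 before centering), all three outputs collapse to y, i.e. a gray pixel, which matches the earlier observation that the Y plane alone is a grayscale image.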

Notes

  • GL commands must be issued on the thread where the EAGLContext is current; in this project the context was made current on the main thread, so GL calls from other threads have no effect. The camera delegate delivers video data on a background queue, so we explicitly hop back to the main thread before issuing any GL commands.

  • We use libyuv's NV12ToARGB function here. The data it produces is not actually ARGB but BGRA in memory, which comes down to byte order. So in the shader we only need to swizzle BGRA back to RGBA.
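The byte-order point can be demonstrated directly: the packed 32-bit value 0xAARRGGBB is laid out in memory as the bytes B, G, R, A on a little-endian machine such as an iOS device. A small C check (illustrative; assumes little-endian hardware):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Returns 1 if the packed "ARGB" word reads as B,G,R,A byte by byte,
   which is the case on little-endian hardware. */
int argb_bytes_are_bgra(void)
{
    uint32_t argb = 0x11223344u;   /* A = 0x11, R = 0x22, G = 0x33, B = 0x44 */
    uint8_t bytes[4];
    memcpy(bytes, &argb, 4);
    return bytes[0] == 0x44 && bytes[1] == 0x33
        && bytes[2] == 0x22 && bytes[3] == 0x11;
}
```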

References

http://www.cnblogs.com/kesalin/archive/2012/12/06/3D_math.html

https://github.com/BradLarson/GPUImage
