GPUImage Source Code Reading (9)

Overview

GPUImage is a well-known open-source image processing library that lets you apply GPU-accelerated filters and other effects to images, video, and the live camera. Compared with the Core Image framework, it also lets you write custom filters against the interfaces it provides. Project page: https://github.com/BradLarson/GPUImage
This article reads through the source of the GPUImageFilter class in the GPUImage framework. GPUImageFilter is one of the most important and most fundamental classes in GPUImage: it handles framebuffer input and output, but applies no effect to the texture itself; it simply passes the texture through. It mostly serves as the base class for other filters, and concrete filter effects are implemented by its subclasses. It also only handles a single input framebuffer; handling multiple input framebuffers is again left to subclasses. The source is shown below:
GPUImageFilter

Resulting Effects

  • Subclass GPUImageFilter to implement a custom filter effect.
(Image: custom filter effect)
  • Implement a simple lighting effect.
(Image: simple lighting effect)

GPUImageFilter

GPUImageFilter itself does not implement any filter effect; it simply outputs the input texture unchanged. It mostly serves as the base class for other filters: it provides the basic interfaces and drives the overall flow of the filter chain. GPUImageFilter inherits from GPUImageOutput and implements the GPUImageInput protocol, so it can take an input texture, process it, and pass the result downstream, which is how effects are applied to a texture. A single response chain can contain several GPUImageFilter instances, which is how filter effects are stacked.
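As a quick illustration of such a chain, here is a minimal sketch that stacks two stock GPUImage filters (GPUImageGrayscaleFilter and GPUImageGaussianBlurFilter); the image name and the imageView used for display are placeholders:

GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"3.jpg"]];
GPUImageGrayscaleFilter *grayscale = [[GPUImageGrayscaleFilter alloc] init];
GPUImageGaussianBlurFilter *blur = [[GPUImageGaussianBlurFilter alloc] init];

// source -> grayscale -> blur -> imageView (a GPUImageView)
[source addTarget:grayscale];
[grayscale addTarget:blur];
[blur addTarget:imageView];

[source processImage];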

  • Vectors and matrices

GPUImage mainly uses 3-component vectors, 4-component vectors, 4x4 matrices, and 3x3 matrices, corresponding to vec3, vec4, mat4, and mat3 in OpenGL. These vector and matrix types exist to make passing values to shaders convenient: GPUImageFilter defines a set of uniform-setting methods that take them, so handing values to a shader is straightforward. The type definitions are as follows:

struct GPUVector4 {
    GLfloat one;
    GLfloat two;
    GLfloat three;
    GLfloat four;
};
typedef struct GPUVector4 GPUVector4;

struct GPUVector3 {
    GLfloat one;
    GLfloat two;
    GLfloat three;
};
typedef struct GPUVector3 GPUVector3;

struct GPUMatrix4x4 {
    GPUVector4 one;
    GPUVector4 two;
    GPUVector4 three;
    GPUVector4 four;
};
typedef struct GPUMatrix4x4 GPUMatrix4x4;

struct GPUMatrix3x3 {
    GPUVector3 one;
    GPUVector3 two;
    GPUVector3 three;
};
typedef struct GPUMatrix3x3 GPUMatrix3x3;
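As a quick sketch of how these types are used: assuming a custom filter whose fragment shader declares uniform vec3 lightColor; and uniform vec4 overlayColor; (the filter and uniform names are made up for illustration), values can be passed with the uniform-setting methods listed later in this article:

// Hypothetical uniforms in a custom fragment shader: lightColor (vec3), overlayColor (vec4)
GPUVector3 lightColor = {1.0, 0.8, 0.6};
GPUVector4 overlayColor = {0.0, 0.5, 1.0, 1.0};

[customFilter setFloatVec3:lightColor forUniformName:@"lightColor"];
[customFilter setFloatVec4:overlayColor forUniform:@"overlayColor"];
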
  • Shaders

Shader programs are central to a filter: they determine what the filter looks like. The shaders in GPUImageFilter are trivial; they simply sample the texture and do nothing to the pixel data. When writing a custom filter, it is usually enough to replace the fragment shader. If you need several texture inputs, use the multi-input filters covered earlier (also GPUImageFilter subclasses, extended to accept more input framebuffers). The GPUImageFilter shaders are listed below.

NSString *const kGPUImageVertexShaderString = SHADER_STRING
(
 attribute vec4 position;
 attribute vec4 inputTextureCoordinate;
 
 varying vec2 textureCoordinate;
 
 void main()
 {
     gl_Position = position;
     textureCoordinate = inputTextureCoordinate.xy;
 }
 );

#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE

NSString *const kGPUImagePassthroughFragmentShaderString = SHADER_STRING
(
 varying highp vec2 textureCoordinate;
 
 uniform sampler2D inputImageTexture;
 
 void main()
 {
     gl_FragColor = texture2D(inputImageTexture, textureCoordinate);
 }
);

#else

NSString *const kGPUImagePassthroughFragmentShaderString = SHADER_STRING
(
 varying vec2 textureCoordinate;
 
 uniform sampler2D inputImageTexture;
 
 void main()
 {
     gl_FragColor = texture2D(inputImageTexture, textureCoordinate);
 }
);
#endif
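For reference, the SHADER_STRING macro that wraps these shaders is a simple stringification macro defined in GPUImageFilter.h, roughly:

#define STRINGIZE(x) #x
#define STRINGIZE2(x) STRINGIZE(x)
#define SHADER_STRING(text) @ STRINGIZE2(text)
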
  • Instance variables

GPUImageFilter has two particularly important instance variables: firstInputFramebuffer and filterProgram. firstInputFramebuffer is the input framebuffer object, and filterProgram is the GL program.

@interface GPUImageFilter : GPUImageOutput <GPUImageInput>
{
    // Input framebuffer object
    GPUImageFramebuffer *firstInputFramebuffer;
    // GL program
    GLProgram *filterProgram;
    // Attribute locations
    GLint filterPositionAttribute, filterTextureCoordinateAttribute;
    // Texture uniform location
    GLint filterInputTextureUniform;
    // GL clear color
    GLfloat backgroundColorRed, backgroundColorGreen, backgroundColorBlue, backgroundColorAlpha;
    // Whether processing has finished
    BOOL isEndProcessing;
   
    CGSize currentFilterSize;
    // Input rotation mode
    GPUImageRotationMode inputRotation;
    
    BOOL currentlyReceivingMonochromeInput;
    
    // Dictionary of uniform state restoration blocks
    NSMutableDictionary *uniformStateRestorationBlocks;
    // Semaphore guarding still-image capture
    dispatch_semaphore_t imageCaptureSemaphore;
}
  • Initializers

The GPUImageFilter designated initializer takes a vertex shader and a fragment shader; in most cases supplying only a fragment shader is enough. Initialization boils down to these steps: 1) initialize the instance variables; 2) set up the GL context; 3) create (or fetch) the GL program; 4) link the GL program; 5) look up the GL attribute and uniform locations.

- (id)initWithVertexShaderFromString:(NSString *)vertexShaderString fragmentShaderFromString:(NSString *)fragmentShaderString;
- (id)initWithFragmentShaderFromString:(NSString *)fragmentShaderString;
- (id)initWithFragmentShaderFromFile:(NSString *)fragmentShaderFilename;

/******************* Implementation ************************************/
- (id)initWithVertexShaderFromString:(NSString *)vertexShaderString fragmentShaderFromString:(NSString *)fragmentShaderString;
{
    if (!(self = [super init]))
    {
        return nil;
    }
    
    // Initialize instance variables
    uniformStateRestorationBlocks = [NSMutableDictionary dictionaryWithCapacity:10];
    _preventRendering = NO;
    currentlyReceivingMonochromeInput = NO;
    inputRotation = kGPUImageNoRotation;
    backgroundColorRed = 0.0;
    backgroundColorGreen = 0.0;
    backgroundColorBlue = 0.0;
    backgroundColorAlpha = 0.0;
    imageCaptureSemaphore = dispatch_semaphore_create(0);
    dispatch_semaphore_signal(imageCaptureSemaphore);

    runSynchronouslyOnVideoProcessingQueue(^{
        // Set up the GL context
        [GPUImageContext useImageProcessingContext];
        // Create (or fetch a cached) GL program
        filterProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:vertexShaderString fragmentShaderString:fragmentShaderString];
        
        if (!filterProgram.initialized)
        {
            // Bind the attribute locations
            [self initializeAttributes];
            
            // Link the shader program
            if (![filterProgram link])
            {
                // Log the errors
                NSString *progLog = [filterProgram programLog];
                NSLog(@"Program link log: %@", progLog);
                NSString *fragLog = [filterProgram fragmentShaderLog];
                NSLog(@"Fragment shader compile log: %@", fragLog);
                NSString *vertLog = [filterProgram vertexShaderLog];
                NSLog(@"Vertex shader compile log: %@", vertLog);
                filterProgram = nil;
                NSAssert(NO, @"Filter shader link failed");
            }
        }
        // Get the position attribute location
        filterPositionAttribute = [filterProgram attributeIndex:@"position"];
        // Get the texture coordinate attribute location
        filterTextureCoordinateAttribute = [filterProgram attributeIndex:@"inputTextureCoordinate"];
        // Get the texture uniform location
        filterInputTextureUniform = [filterProgram uniformIndex:@"inputImageTexture"]; // This does assume a name of "inputImageTexture" for the fragment shader
        // Make this the active GL program
        [GPUImageContext setActiveShaderProgram:filterProgram];
        // Enable the vertex attribute arrays
        glEnableVertexAttribArray(filterPositionAttribute);
        glEnableVertexAttribArray(filterTextureCoordinateAttribute);    
    });
    
    return self;
}

- (id)initWithFragmentShaderFromString:(NSString *)fragmentShaderString;
{
    if (!(self = [self initWithVertexShaderFromString:kGPUImageVertexShaderString fragmentShaderFromString:fragmentShaderString]))
    {
        return nil;
    }
    
    return self;
}

- (id)initWithFragmentShaderFromFile:(NSString *)fragmentShaderFilename;
{
    NSString *fragmentShaderPathname = [[NSBundle mainBundle] pathForResource:fragmentShaderFilename ofType:@"fsh"];
    NSString *fragmentShaderString = [NSString stringWithContentsOfFile:fragmentShaderPathname encoding:NSUTF8StringEncoding error:nil];

    if (!(self = [self initWithFragmentShaderFromString:fragmentShaderString]))
    {
        return nil;
    }
    
    return self;
}
  • Other methods

Many of GPUImageFilter's methods pass values to the shaders; there are so many of them because uniforms come in different types, such as GLint, GLfloat, vec2, vec3, mat3, and so on. Three methods are particularly important: - (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;, - (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;, and - (void)informTargetsAboutNewFrameAtTime:(CMTime)frameTime;. These three drive the response chain: GPUImageFilter draws the framebuffer it receives into its own output framebuffer using its fragment shader, then hands that output framebuffer to all of its targets and notifies them to process it. The methods are called in this order:

1. Generate a new output framebuffer:
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
2. Perform the GL draw:
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
3. Notify all targets once drawing is done:
- (void)informTargetsAboutNewFrameAtTime:(CMTime)frameTime;

These methods are examined below.

// Transform methods
- (void)setupFilterForSize:(CGSize)filterFrameSize;
- (CGSize)rotatedSize:(CGSize)sizeToRotate forIndex:(NSInteger)textureIndex;
- (CGPoint)rotatedPoint:(CGPoint)pointToRotate forRotation:(GPUImageRotationMode)rotation;

// Query methods
- (CGSize)sizeOfFBO;
+ (const GLfloat *)textureCoordinatesForRotation:(GPUImageRotationMode)rotationMode;
- (CGSize)outputFrameSize;

// Rendering methods
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
- (void)informTargetsAboutNewFrameAtTime:(CMTime)frameTime;

// Set the clear color
- (void)setBackgroundColorRed:(GLfloat)redComponent green:(GLfloat)greenComponent blue:(GLfloat)blueComponent alpha:(GLfloat)alphaComponent;

// Uniform-setting methods
- (void)setInteger:(GLint)newInteger forUniformName:(NSString *)uniformName;
- (void)setFloat:(GLfloat)newFloat forUniformName:(NSString *)uniformName;
- (void)setSize:(CGSize)newSize forUniformName:(NSString *)uniformName;
- (void)setPoint:(CGPoint)newPoint forUniformName:(NSString *)uniformName;
- (void)setFloatVec3:(GPUVector3)newVec3 forUniformName:(NSString *)uniformName;
- (void)setFloatVec4:(GPUVector4)newVec4 forUniform:(NSString *)uniformName;
- (void)setFloatArray:(GLfloat *)array length:(GLsizei)count forUniform:(NSString*)uniformName;
- (void)setMatrix3f:(GPUMatrix3x3)matrix forUniform:(GLint)uniform program:(GLProgram *)shaderProgram;
- (void)setMatrix4f:(GPUMatrix4x4)matrix forUniform:(GLint)uniform program:(GLProgram *)shaderProgram;
- (void)setFloat:(GLfloat)floatValue forUniform:(GLint)uniform program:(GLProgram *)shaderProgram;
- (void)setPoint:(CGPoint)pointValue forUniform:(GLint)uniform program:(GLProgram *)shaderProgram;
- (void)setSize:(CGSize)sizeValue forUniform:(GLint)uniform program:(GLProgram *)shaderProgram;
- (void)setVec3:(GPUVector3)vectorValue forUniform:(GLint)uniform program:(GLProgram *)shaderProgram;
- (void)setVec4:(GPUVector4)vectorValue forUniform:(GLint)uniform program:(GLProgram *)shaderProgram;
- (void)setFloatArray:(GLfloat *)arrayValue length:(GLsizei)arrayLength forUniform:(GLint)uniform program:(GLProgram *)shaderProgram;
- (void)setInteger:(GLint)intValue forUniform:(GLint)uniform program:(GLProgram *)shaderProgram;
- (void)setAndExecuteUniformStateCallbackAtIndex:(GLint)uniform forProgram:(GLProgram *)shaderProgram toBlock:(dispatch_block_t)uniformStateBlock;
- (void)setUniformsForProgramAtIndex:(NSUInteger)programIndex;


/******************* Implementation ************************************/
// Get the texture coordinates for a given rotation mode
+ (const GLfloat *)textureCoordinatesForRotation:(GPUImageRotationMode)rotationMode;
{
    static const GLfloat noRotationTextureCoordinates[] = {
        0.0f, 0.0f,
        1.0f, 0.0f,
        0.0f, 1.0f,
        1.0f, 1.0f,
    };
    
    static const GLfloat rotateLeftTextureCoordinates[] = {
        1.0f, 0.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        0.0f, 1.0f,
    };
    
    static const GLfloat rotateRightTextureCoordinates[] = {
        0.0f, 1.0f,
        0.0f, 0.0f,
        1.0f, 1.0f,
        1.0f, 0.0f,
    };
    
    static const GLfloat verticalFlipTextureCoordinates[] = {
        0.0f, 1.0f,
        1.0f, 1.0f,
        0.0f,  0.0f,
        1.0f,  0.0f,
    };
    
    static const GLfloat horizontalFlipTextureCoordinates[] = {
        1.0f, 0.0f,
        0.0f, 0.0f,
        1.0f,  1.0f,
        0.0f,  1.0f,
    };
    
    static const GLfloat rotateRightVerticalFlipTextureCoordinates[] = {
        0.0f, 0.0f,
        0.0f, 1.0f,
        1.0f, 0.0f,
        1.0f, 1.0f,
    };

    static const GLfloat rotateRightHorizontalFlipTextureCoordinates[] = {
        1.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 1.0f,
        0.0f, 0.0f,
    };

    static const GLfloat rotate180TextureCoordinates[] = {
        1.0f, 1.0f,
        0.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 0.0f,
    };

    switch(rotationMode)
    {
        case kGPUImageNoRotation: return noRotationTextureCoordinates;
        case kGPUImageRotateLeft: return rotateLeftTextureCoordinates;
        case kGPUImageRotateRight: return rotateRightTextureCoordinates;
        case kGPUImageFlipVertical: return verticalFlipTextureCoordinates;
        case kGPUImageFlipHorizonal: return horizontalFlipTextureCoordinates;
        case kGPUImageRotateRightFlipVertical: return rotateRightVerticalFlipTextureCoordinates;
        case kGPUImageRotateRightFlipHorizontal: return rotateRightHorizontalFlipTextureCoordinates;
        case kGPUImageRotate180: return rotate180TextureCoordinates;
    }
}

// A new frame is ready: render it and propagate it to the targets
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
{
    static const GLfloat imageVertices[] = {
        -1.0f, -1.0f,
        1.0f, -1.0f,
        -1.0f,  1.0f,
        1.0f,  1.0f,
    };
    
    // Render into the output framebuffer first
    [self renderToTextureWithVertices:imageVertices textureCoordinates:[[self class] textureCoordinatesForRotation:inputRotation]];
  
    // Then notify all targets
    [self informTargetsAboutNewFrameAtTime:frameTime];
}

// Render into the output framebuffer
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
{
    if (self.preventRendering)
    {
        [firstInputFramebuffer unlock];
        return;
    }
    
    [GPUImageContext setActiveShaderProgram:filterProgram];

    outputFramebuffer = [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:[self sizeOfFBO] textureOptions:self.outputTextureOptions onlyTexture:NO];
    [outputFramebuffer activateFramebuffer];
    if (usingNextFrameForImageCapture)
    {
        [outputFramebuffer lock];
    }

    [self setUniformsForProgramAtIndex:0];
    
    // GL drawing
    glClearColor(backgroundColorRed, backgroundColorGreen, backgroundColorBlue, backgroundColorAlpha);
    glClear(GL_COLOR_BUFFER_BIT);

    glActiveTexture(GL_TEXTURE2);
    glBindTexture(GL_TEXTURE_2D, [firstInputFramebuffer texture]);
    
    glUniform1i(filterInputTextureUniform, 2);  

    glVertexAttribPointer(filterPositionAttribute, 2, GL_FLOAT, 0, 0, vertices);
    glVertexAttribPointer(filterTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, textureCoordinates);
    
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    
    // Unlock the input framebuffer
    [firstInputFramebuffer unlock];
    
    // Still-image capture must wait until drawing has finished
    if (usingNextFrameForImageCapture)
    {
        // Signal that rendering is complete
        dispatch_semaphore_signal(imageCaptureSemaphore);
    }
}

// Notify all targets
- (void)informTargetsAboutNewFrameAtTime:(CMTime)frameTime;
{
    if (self.frameProcessingCompletionBlock != NULL)
    {
        self.frameProcessingCompletionBlock(self, frameTime);
    }
    
    // Hand the output framebuffer to every target
    for (id<GPUImageInput> currentTarget in targets)
    {
        if (currentTarget != self.targetToIgnoreForUpdates)
        {
            NSInteger indexOfObject = [targets indexOfObject:currentTarget];
            NSInteger textureIndex = [[targetTextureIndices objectAtIndex:indexOfObject] integerValue];

            [self setInputFramebufferForTarget:currentTarget atIndex:textureIndex];
            [currentTarget setInputSize:[self outputFrameSize] atIndex:textureIndex];
        }
    }
    
    // Release our hold so it can return to the cache immediately upon processing
    [[self framebufferForOutput] unlock];
    
    if (usingNextFrameForImageCapture)
    {
//        usingNextFrameForImageCapture = NO;
    }
    else
    {
        [self removeOutputFramebuffer];
    }    
    
    // Tell every target that a new frame is ready
    for (id<GPUImageInput> currentTarget in targets)
    {
        if (currentTarget != self.targetToIgnoreForUpdates)
        {
            NSInteger indexOfObject = [targets indexOfObject:currentTarget];
            NSInteger textureIndex = [[targetTextureIndices objectAtIndex:indexOfObject] integerValue];
            // Let the target process the new frame
            [currentTarget newFrameReadyAtTime:frameTime atIndex:textureIndex];
        }
    }
}

// When a still image is requested, consume the semaphore first so the image is only produced after GL drawing has finished
- (void)useNextFrameForImageCapture;
{
    usingNextFrameForImageCapture = YES;

    // Consume the semaphore
    if (dispatch_semaphore_wait(imageCaptureSemaphore, DISPATCH_TIME_NOW) != 0)
    {
        return;
    }
}

// Wait for the render-complete signal, then produce the image
- (CGImageRef)newCGImageFromCurrentlyProcessedOutput
{
    // Give it three seconds to process, then abort if they forgot to set up the image capture properly
    double timeoutForImageCapture = 3.0;
    dispatch_time_t convertedTimeout = dispatch_time(DISPATCH_TIME_NOW, timeoutForImageCapture * NSEC_PER_SEC);

    // Wait for GL drawing to finish, up to the timeout
    if (dispatch_semaphore_wait(imageCaptureSemaphore, convertedTimeout) != 0)
    {
        return NULL;
    }
  
    // Drawing finished within the timeout; create the CGImage
    GPUImageFramebuffer* framebuffer = [self framebufferForOutput];
    
    usingNextFrameForImageCapture = NO;
    dispatch_semaphore_signal(imageCaptureSemaphore);
    
    CGImageRef image = [framebuffer newCGImageFromFramebufferContents];
    return image;
}
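
Putting these two methods together, a typical still-image capture from a filter chain looks roughly like this (a sketch; imageFromCurrentFramebuffer is the GPUImageOutput convenience that ultimately calls newCGImageFromCurrentlyProcessedOutput):

[filter useNextFrameForImageCapture];                     // take the semaphore before rendering
[picture processImage];                                   // drives rendering through the chain
UIImage *result = [filter imageFromCurrentFramebuffer];   // waits for the render-complete signal, then reads back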

Implementation Walkthrough

Implementing a custom filter effect.

1. Create a QMFishEyeFilter class that inherits from GPUImageFilter.

//
//  QMFishEyeFilter.h
//  GPUImageFilter
//
//  Created by qinmin on 2017/6/8.
//  Copyright © 2017 Qinmin. All rights reserved.
//

#import <GPUImage.h>

@interface QMFishEyeFilter : GPUImageFilter

@property (nonatomic, assign) GLfloat radius;

- (instancetype)init;

@end

2. Override the - (instancetype)init; method.

- (instancetype)init
{
    if (self = [super initWithFragmentShaderFromString:kQMFishEyeFilterFragmentShaderString]) {
        
        radiusUniform = [filterProgram uniformIndex:@"radius"];
        self.radius = 0.5;
        
        [self setBackgroundColorRed:0.0 green:1.0 blue:0.0 alpha:1.0];
    }
    return self;
}
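
The radius property forwards its value to the shader. A minimal sketch of its setter (assuming radiusUniform is the ivar fetched in init above) might look like this:

- (void)setRadius:(GLfloat)radius
{
    _radius = radius;
    // Push the new value to the "radius" uniform of the fragment shader
    [self setFloat:radius forUniform:radiusUniform program:filterProgram];
}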

3. Write the custom fragment shader.

NSString *const kQMFishEyeFilterFragmentShaderString = SHADER_STRING
(
 precision highp float;
 
 varying vec2 textureCoordinate;
 uniform sampler2D inputImageTexture;
 
 uniform float radius;
 
 const float PI = 3.1415926535;
 
 void main()
 {
     float aperture = 175.0;
     float apertureHalf = radius * aperture * (PI / 180.0);
     float maxFactor = sin(apertureHalf);
     
     vec2 uv;
     vec2 xy = 2.0 * textureCoordinate - 1.0;
     float d = length(xy);
     if (d < (2.0 - maxFactor)) {
         d = length(xy * maxFactor);
         float z = sqrt(1.0 - d * d);
         float r = atan(d, z) / PI;
         float phi = atan(xy.y, xy.x);
         
         uv.x = r * cos(phi) + radius;
         uv.y = r * sin(phi) + radius;
         
     }else {
         uv = textureCoordinate;
     }
     
     vec4 color = texture2D(inputImageTexture, uv);
     gl_FragColor = color;
 }
 );

4. Use the custom filter.

#pragma mark - Events
- (IBAction)startButtonTapped:(UIButton *)sender
{
    // Load the image
    GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"3.jpg"]];
    
    QMFishEyeFilter *filter = [[QMFishEyeFilter alloc] init];
    
    [picture addTarget:filter];
    [filter addTarget:_imageView];

    [picture processImage];
}

Implementing a simple lighting effect.

1. Create a QM3DLightFilter class that inherits from GPUImageFilter.

//
//  QM3DLightFilter.h
//  GPUImageFilter
//
//  Created by qinmin on 2017/6/8.
//  Copyright © 2017 Qinmin. All rights reserved.
//

#import <GPUImage.h>

@interface QM3DLightFilter : GPUImageFilter

- (instancetype)init;

@end

2. Override the - (instancetype)init; method.

- (instancetype)init
{
    if (self = [super initWithVertexShaderFromString:kQM3DLightFilterVertexShaderString fragmentShaderFromString:kQM3DLightFilterFragmentShaderString]) {
        
        [filterProgram addAttribute:@"normal"];
        filterNormalAttribute = [filterProgram attributeIndex:@"normal"];
        glEnableVertexAttribArray(filterNormalAttribute);
        
        pUniform = [filterProgram uniformIndex:@"P"];
        mvUniform = [filterProgram uniformIndex:@"MV"];
        normalMatUniform = [filterProgram uniformIndex:@"normalMat"];
        
        [self setMVPMatrix];
        
        [self setupSurface];
        
        [self setBackgroundColorRed:1.0 green:1.0 blue:1.0 alpha:1.0];
    }
    return self;
}

3. Set up the matrices and the model. Model loading uses the open-source tiny_obj_loader.

- (void)setMVPMatrix
{
    mat4_t P = mat4_perspective(M_PI/3, 1.0, 1.0, 10.0);
    [self setMatrix4f:*((GPUMatrix4x4 *)&P) forUniform:pUniform program:filterProgram];
    
    mat4_t MV = mat4_create_translation(0, 0, -2.2);
    [self setMatrix4f:*((GPUMatrix4x4 *)&MV) forUniform:mvUniform program:filterProgram];
    
    mat4_t normalMat = mat4_transpose(mat4_inverse(MV, NULL));
    [self setMatrix4f:*((GPUMatrix4x4 *)&normalMat) forUniform:normalMatUniform program:filterProgram];
}

- (void)setupSurface
{
    NSString *path = [[NSBundle mainBundle] pathForResource:@"Sphere" ofType:@"obj"];
    _tinyOBJModel = std::make_shared<TinyOBJModel>();
    _tinyOBJModel->LoadObj(path.UTF8String);
}

4. Write the shaders.

NSString *const kQM3DLightFilterVertexShaderString = SHADER_STRING
(
 attribute vec4 position;
 attribute vec2 inputTextureCoordinate;
 attribute vec3 normal;
 
 uniform mat4 MV;
 uniform mat4 P;
 uniform mat4 normalMat;
 
 varying vec2 textureCoordinate;
 varying vec3 vNormal;
 varying vec3 vPosition;
 
 void main()
 {
     gl_Position = P * MV * position;
     
     textureCoordinate = inputTextureCoordinate;
     vPosition = mat3(MV) * vec3(position);
     vNormal = mat3(normalMat) * normal;
 }
 );

NSString *const kQM3DLightFilterFragmentShaderString = SHADER_STRING
(
 precision highp float;
 
 varying vec2 textureCoordinate;
 varying vec3 vNormal;
 varying vec3 vPosition;
 
 uniform sampler2D inputImageTexture;
 
 void main()
 {
     vec3 lightPos = vec3(5.0, -5.0, 0.0);
     vec3 L = normalize(lightPos);
     vec3 N = normalize(vNormal);
     
     //ambient
     vec4 AmbientLightColor = vec4(0.5, 0.5, 0.5, 1.0);
     vec4 AmbientMaterial = vec4(0.2, 0.2, 0.2, 1.0);
     vec4 ambientColor = AmbientLightColor * AmbientMaterial;
     
     //diffuse
     vec4 DiffuseLightColor = vec4(1.0, 1.0, 1.0, 1.0);
     vec4 DiffuseMaterial = vec4(0.8, 0.8, 0.8, 1.0);
     vec4 diffuseColor = DiffuseLightColor * DiffuseMaterial * max(0.0, dot(N, L));
     
     // Specular
     vec4 SpecularLightColor = vec4(1.0, 1.0, 0.0, 1.0);
     vec4 SpecularMaterial = vec4(0.7, 0.7, 0.7, 1.0);
     vec3 eye = vec3(1.0, -2.0, 5.0) - vPosition;
     vec3 H = normalize(eye + L);
     vec4 specularColor = SpecularLightColor * SpecularMaterial * pow(max(0.0, dot(N, H)), 2.5);
     
     // All light
     vec4 light = ambientColor + diffuseColor + specularColor;
     
     vec4 color = texture2D(inputImageTexture, textureCoordinate);
     gl_FragColor = color * light;
 }
 );

5. Override the rendering method - (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;. The draw enables depth testing, and the vertex data is stored in a VBO.

#pragma mark - Overwrite
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
{
    if (self.preventRendering)
    {
        [firstInputFramebuffer unlock];
        return;
    }
    
    [GPUImageContext setActiveShaderProgram:filterProgram];
    
    outputFramebuffer = [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:[self sizeOfFBO] textureOptions:self.outputTextureOptions onlyTexture:NO];
    [outputFramebuffer activateFramebuffer];
    if (usingNextFrameForImageCapture)
    {
        [outputFramebuffer lock];
    }
    
    [self setUniformsForProgramAtIndex:0];

    // Setup depth render buffer
    int width, height;
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &width);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &height);
    
    // Create a depth buffer that has the same size as the color buffer.
    GLuint depthRenderBuffer;
    glGenRenderbuffers(1, &depthRenderBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, depthRenderBuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, width, height);
    
    // Attach color render buffer and depth render buffer to frameBuffer
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                              GL_RENDERBUFFER, depthRenderBuffer);
    
    glEnable(GL_DEPTH_TEST);
    glClearColor(backgroundColorRed, backgroundColorGreen, backgroundColorBlue, backgroundColorAlpha);
    glClear(GL_COLOR_BUFFER_BIT|GL_DEPTH_BUFFER_BIT);
    
    glActiveTexture(GL_TEXTURE2);
    glBindTexture(GL_TEXTURE_2D, [firstInputFramebuffer texture]);
    glUniform1i(filterInputTextureUniform, 2);
    
    int stride = 8 * sizeof(GLfloat);
    const GLvoid* normalOffset = (const GLvoid*)(5 * sizeof(GLfloat));
    const GLvoid* texCoordOffset = (const GLvoid*)(3 * sizeof(GLfloat));
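    // Vertex layout implied by the stride/offsets above: position.xyz (3 floats), texCoord.uv (2 floats), normal.xyz (3 floats), i.e. 8 floats per vertex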
    
    glBindBuffer(GL_ARRAY_BUFFER, _tinyOBJModel->getVertexBuffer());
    glVertexAttribPointer(filterPositionAttribute, 3, GL_FLOAT, GL_FALSE, stride, 0);
    glVertexAttribPointer(filterNormalAttribute, 3, GL_FLOAT, GL_FALSE, stride, normalOffset);
    glVertexAttribPointer(filterTextureCoordinateAttribute, 2, GL_FLOAT, GL_FALSE, stride, texCoordOffset);
    
    // Draw the triangles.
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _tinyOBJModel->getIndexBuffer());
    glDrawElements(GL_TRIANGLES, _tinyOBJModel->getIndexCount(), GL_UNSIGNED_INT, 0);
    
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
    glDisable(GL_DEPTH_TEST);
    
    [firstInputFramebuffer unlock];
    
    if (usingNextFrameForImageCapture)
    {
        dispatch_semaphore_signal(imageCaptureSemaphore);
    }
}

6. Use the filter.

- (IBAction)filterButtonTapped:(UIButton *)sender
{
    // Load the image
    GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"3.jpg"]];
    
    QM3DLightFilter *filter = [[QM3DLightFilter alloc] init];
    
    [picture addTarget:filter];
    [filter addTarget:_imageView];
    
    [picture processImage];
}

Summary

GPUImageFilter is the foundation of all filters; most other filters inherit from it, directly or indirectly, and these filters can be combined to build complex multi-filter effects. Mapping out its render tree therefore helps a great deal in understanding the GPUImage framework as a whole.

Source code: GPUImage source code reading series, https://github.com/QinminiOS/GPUImage
Series index: GPUImage source code reading, http://www.lxweimin.com/nb/11749791
