Overview
GPUImage is a well-known open-source image-processing library that applies GPU-accelerated filters and other effects to still images, video, and live camera input. Unlike the Core Image framework, GPUImage exposes interfaces for building fully custom filters. Project page: https://github.com/BradLarson/GPUImage
This article walks through the source of the GPUImageTwoInputFilter, GPUImageThreeInputFilter, and GPUImageFourInputFilter classes in the GPUImage framework. These classes handle multiple texture inputs; each inherits, directly or indirectly, from GPUImageFilter, and they are the ones to reach for when a filter needs more than one input texture. The source files covered:
GPUImageTwoInputFilter
GPUImageThreeInputFilter
GPUImageFourInputFilter
Rendered result
GPUImageTwoInputFilter
GPUImageTwoInputFilter accepts two framebuffer inputs and merges them into a single framebuffer output. Since it inherits from GPUImageFilter, it drops straight into a filter chain.
- Instance variables. The defining addition in GPUImageTwoInputFilter is secondInputFramebuffer, which holds the second input framebuffer, together with the attribute, uniform, rotation, and timing state that goes with it.
@interface GPUImageTwoInputFilter : GPUImageFilter
{
// State for the second input framebuffer
GPUImageFramebuffer *secondInputFramebuffer;
GLint filterSecondTextureCoordinateAttribute;
GLint filterInputTextureUniform2;
GPUImageRotationMode inputRotation2;
CMTime firstFrameTime, secondFrameTime;
// Flags that control when the two inputs get rendered
BOOL hasSetFirstTexture, hasReceivedFirstFrame, hasReceivedSecondFrame, firstFrameWasVideo, secondFrameWasVideo;
BOOL firstFrameCheckDisabled, secondFrameCheckDisabled;
}
- Initializers. The vertex shader is optional (a default two-input vertex shader is used when it is omitted), but a fragment shader must be supplied.
// Initializers
- (id)initWithFragmentShaderFromString:(NSString *)fragmentShaderString;
- (id)initWithVertexShaderFromString:(NSString *)vertexShaderString fragmentShaderFromString:(NSString *)fragmentShaderString;
// Initialize with just a fragment shader
- (id)initWithFragmentShaderFromString:(NSString *)fragmentShaderString;
{
if (!(self = [self initWithVertexShaderFromString:kGPUImageTwoInputTextureVertexShaderString fragmentShaderFromString:fragmentShaderString]))
{
return nil;
}
return self;
}
// Initialize with both a vertex shader and a fragment shader
- (id)initWithVertexShaderFromString:(NSString *)vertexShaderString fragmentShaderFromString:(NSString *)fragmentShaderString;
{
if (!(self = [super initWithVertexShaderFromString:vertexShaderString fragmentShaderFromString:fragmentShaderString]))
{
return nil;
}
// Initialize per-input state
inputRotation2 = kGPUImageNoRotation;
hasSetFirstTexture = NO;
hasReceivedFirstFrame = NO;
hasReceivedSecondFrame = NO;
firstFrameWasVideo = NO;
secondFrameWasVideo = NO;
firstFrameCheckDisabled = NO;
secondFrameCheckDisabled = NO;
firstFrameTime = kCMTimeInvalid;
secondFrameTime = kCMTimeInvalid;
runSynchronouslyOnVideoProcessingQueue(^{
[GPUImageContext useImageProcessingContext];
// Look up the second texture coordinate attribute
filterSecondTextureCoordinateAttribute = [filterProgram attributeIndex:@"inputTextureCoordinate2"];
// Look up the second texture uniform
filterInputTextureUniform2 = [filterProgram uniformIndex:@"inputImageTexture2"]; // This does assume a name of "inputImageTexture2" for second input texture in the fragment shader
glEnableVertexAttribArray(filterSecondTextureCoordinateAttribute);
});
return self;
}
- Other methods. GPUImageTwoInputFilter adds only a couple of methods of its own, but its overrides of the parent class are the key to handling two framebuffer inputs, notably -newFrameReadyAtTime:atIndex: and -renderToTextureWithVertices:textureCoordinates:.
// Added methods
- (void)disableFirstFrameCheck;
- (void)disableSecondFrameCheck;
// Overridden methods
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
// Skip waiting for the first texture input
- (void)disableFirstFrameCheck;
{
firstFrameCheckDisabled = YES;
}
// Skip waiting for the second texture input
- (void)disableSecondFrameCheck;
{
secondFrameCheckDisabled = YES;
}
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
{
// Both texture inputs already received; nothing to do
if (hasReceivedFirstFrame && hasReceivedSecondFrame)
{
return;
}
BOOL updatedMovieFrameOppositeStillImage = NO;
// Handle the first texture input
if (textureIndex == 0)
{
hasReceivedFirstFrame = YES;
firstFrameTime = frameTime;
// If the second input is not being checked, mark it as already received
if (secondFrameCheckDisabled)
{
hasReceivedSecondFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if CMTIME_IS_INDEFINITE(secondFrameTime)
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
else
{
hasReceivedSecondFrame = YES;
secondFrameTime = frameTime;
// If the first input is not being checked, mark it as already received
if (firstFrameCheckDisabled)
{
hasReceivedFirstFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if CMTIME_IS_INDEFINITE(firstFrameTime)
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
// || (hasReceivedFirstFrame && secondFrameCheckDisabled) || (hasReceivedSecondFrame && firstFrameCheckDisabled)
// Render once both inputs have arrived, or when a movie frame is paired with a still image
if ((hasReceivedFirstFrame && hasReceivedSecondFrame) || updatedMovieFrameOppositeStillImage)
{
CMTime passOnFrameTime = (!CMTIME_IS_INDEFINITE(firstFrameTime)) ? firstFrameTime : secondFrameTime;
[super newFrameReadyAtTime:passOnFrameTime atIndex:0]; // Bugfix when trying to record: always use time from first input (unless indefinite, in which case use the second input)
hasReceivedFirstFrame = NO;
hasReceivedSecondFrame = NO;
}
}
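The gating logic above boils down to a small state machine: render only when both inputs have arrived, with the disabled-check flags and the movie-frame-opposite-still-image case short-circuiting the wait. Here is a minimal Python sketch of that logic (the class and all names are mine, not GPUImage's; `INVALID`/`INDEFINITE` stand in for kCMTimeInvalid/kCMTimeIndefinite):

```python
INVALID = "invalid"        # stand-in for kCMTimeInvalid (initial state)
INDEFINITE = "indefinite"  # stand-in for kCMTimeIndefinite (still images)

def is_indefinite(t):
    return t == INDEFINITE

class TwoInputSync:
    """Mimics GPUImageTwoInputFilter's gating in newFrameReadyAtTime:atIndex:."""
    def __init__(self, first_check_disabled=False, second_check_disabled=False):
        self.first_check_disabled = first_check_disabled
        self.second_check_disabled = second_check_disabled
        self.has_first = self.has_second = False
        self.first_time = self.second_time = INVALID

    def new_frame(self, frame_time, index):
        """Returns the pass-on frame time when a render should fire, else None."""
        if self.has_first and self.has_second:
            return None
        movie_opposite_still = False
        if index == 0:
            self.has_first, self.first_time = True, frame_time
            if self.second_check_disabled:
                self.has_second = True
            # a timed (movie) frame paired with a still image renders at once
            if not is_indefinite(frame_time) and is_indefinite(self.second_time):
                movie_opposite_still = True
        else:
            self.has_second, self.second_time = True, frame_time
            if self.first_check_disabled:
                self.has_first = True
            if not is_indefinite(frame_time) and is_indefinite(self.first_time):
                movie_opposite_still = True
        if (self.has_first and self.has_second) or movie_opposite_still:
            # always pass on the first input's time unless it is indefinite
            pass_on = self.first_time if not is_indefinite(self.first_time) else self.second_time
            self.has_first = self.has_second = False  # times are deliberately kept
            return pass_on
        return None
```

For example, a movie frame on input 0 followed by a still image (indefinite time) on input 1 renders with the movie's timestamp; once the still image's indefinite time has been recorded, later movie frames render without waiting for input 1 again.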
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
{
if (self.preventRendering)
{
[firstInputFramebuffer unlock];
[secondInputFramebuffer unlock];
return;
}
[GPUImageContext setActiveShaderProgram:filterProgram];
outputFramebuffer = [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:[self sizeOfFBO] textureOptions:self.outputTextureOptions onlyTexture:NO];
[outputFramebuffer activateFramebuffer];
if (usingNextFrameForImageCapture)
{
[outputFramebuffer lock];
}
[self setUniformsForProgramAtIndex:0];
glClearColor(backgroundColorRed, backgroundColorGreen, backgroundColorBlue, backgroundColorAlpha);
glClear(GL_COLOR_BUFFER_BIT);
// Bind the first texture
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, [firstInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform, 2);
// Bind the second texture
glActiveTexture(GL_TEXTURE3);
glBindTexture(GL_TEXTURE_2D, [secondInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform2, 3);
glVertexAttribPointer(filterPositionAttribute, 2, GL_FLOAT, 0, 0, vertices);
glVertexAttribPointer(filterTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, textureCoordinates);
glVertexAttribPointer(filterSecondTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation2]);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
[firstInputFramebuffer unlock];
[secondInputFramebuffer unlock];
if (usingNextFrameForImageCapture)
{
dispatch_semaphore_signal(imageCaptureSemaphore);
}
}
GPUImageThreeInputFilter
GPUImageThreeInputFilter accepts three framebuffer inputs and merges them into a single framebuffer output. It inherits from GPUImageTwoInputFilter and works the same way, mostly adding the state and handling for a third input framebuffer.
- Instance variables. These mirror the second-input state, now for the third input framebuffer.
@interface GPUImageThreeInputFilter : GPUImageTwoInputFilter
{
GPUImageFramebuffer *thirdInputFramebuffer;
GLint filterThirdTextureCoordinateAttribute;
GLint filterInputTextureUniform3;
GPUImageRotationMode inputRotation3;
GLuint filterSourceTexture3;
CMTime thirdFrameTime;
BOOL hasSetSecondTexture, hasReceivedThirdFrame, thirdFrameWasVideo;
BOOL thirdFrameCheckDisabled;
}
- Method list.
// Added method
- (void)disableThirdFrameCheck;
// Overridden methods
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
- (void)disableThirdFrameCheck;
{
thirdFrameCheckDisabled = YES;
}
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
{
if (self.preventRendering)
{
[firstInputFramebuffer unlock];
[secondInputFramebuffer unlock];
[thirdInputFramebuffer unlock];
return;
}
[GPUImageContext setActiveShaderProgram:filterProgram];
outputFramebuffer = [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:[self sizeOfFBO] textureOptions:self.outputTextureOptions onlyTexture:NO];
[outputFramebuffer activateFramebuffer];
if (usingNextFrameForImageCapture)
{
[outputFramebuffer lock];
}
[self setUniformsForProgramAtIndex:0];
glClearColor(backgroundColorRed, backgroundColorGreen, backgroundColorBlue, backgroundColorAlpha);
glClear(GL_COLOR_BUFFER_BIT);
// Bind the first texture
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, [firstInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform, 2);
// Bind the second texture
glActiveTexture(GL_TEXTURE3);
glBindTexture(GL_TEXTURE_2D, [secondInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform2, 3);
// Bind the third texture
glActiveTexture(GL_TEXTURE4);
glBindTexture(GL_TEXTURE_2D, [thirdInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform3, 4);
glVertexAttribPointer(filterPositionAttribute, 2, GL_FLOAT, 0, 0, vertices);
glVertexAttribPointer(filterTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, textureCoordinates);
glVertexAttribPointer(filterSecondTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation2]);
glVertexAttribPointer(filterThirdTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation3]);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
[firstInputFramebuffer unlock];
[secondInputFramebuffer unlock];
[thirdInputFramebuffer unlock];
if (usingNextFrameForImageCapture)
{
dispatch_semaphore_signal(imageCaptureSemaphore);
}
}
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
{
// You can set up infinite update loops, so this helps to short circuit them
if (hasReceivedFirstFrame && hasReceivedSecondFrame && hasReceivedThirdFrame)
{
return;
}
BOOL updatedMovieFrameOppositeStillImage = NO;
if (textureIndex == 0)
{
hasReceivedFirstFrame = YES;
firstFrameTime = frameTime;
// If the second input is not being checked, mark it as already received
if (secondFrameCheckDisabled)
{
hasReceivedSecondFrame = YES;
}
// If the third input is not being checked, mark it as already received
if (thirdFrameCheckDisabled)
{
hasReceivedThirdFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if CMTIME_IS_INDEFINITE(secondFrameTime)
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
else if (textureIndex == 1)
{
hasReceivedSecondFrame = YES;
secondFrameTime = frameTime;
// If the first input is not being checked, mark it as already received
if (firstFrameCheckDisabled)
{
hasReceivedFirstFrame = YES;
}
// If the third input is not being checked, mark it as already received
if (thirdFrameCheckDisabled)
{
hasReceivedThirdFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if CMTIME_IS_INDEFINITE(firstFrameTime)
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
else
{
hasReceivedThirdFrame = YES;
thirdFrameTime = frameTime;
// If the first input is not being checked, mark it as already received
if (firstFrameCheckDisabled)
{
hasReceivedFirstFrame = YES;
}
// If the second input is not being checked, mark it as already received
if (secondFrameCheckDisabled)
{
hasReceivedSecondFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if CMTIME_IS_INDEFINITE(firstFrameTime)
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
// || (hasReceivedFirstFrame && secondFrameCheckDisabled) || (hasReceivedSecondFrame && firstFrameCheckDisabled)
// Render once all three inputs have arrived, or when a movie frame is paired with a still image
if ((hasReceivedFirstFrame && hasReceivedSecondFrame && hasReceivedThirdFrame) || updatedMovieFrameOppositeStillImage)
{
static const GLfloat imageVertices[] = {
-1.0f, -1.0f,
1.0f, -1.0f,
-1.0f, 1.0f,
1.0f, 1.0f,
};
[self renderToTextureWithVertices:imageVertices textureCoordinates:[[self class] textureCoordinatesForRotation:inputRotation]];
[self informTargetsAboutNewFrameAtTime:frameTime];
hasReceivedFirstFrame = NO;
hasReceivedSecondFrame = NO;
hasReceivedThirdFrame = NO;
}
}
GPUImageFourInputFilter
GPUImageFourInputFilter accepts four framebuffer inputs and merges them into a single framebuffer output. It inherits from GPUImageThreeInputFilter and, following the same pattern as the two- and three-input filters, adds the state and handling for a fourth input framebuffer.
- Instance variables. These add the state for the fourth input framebuffer.
@interface GPUImageFourInputFilter : GPUImageThreeInputFilter
{
GPUImageFramebuffer *fourthInputFramebuffer;
GLint filterFourthTextureCoordinateAttribute;
GLint filterInputTextureUniform4;
GPUImageRotationMode inputRotation4;
GLuint filterSourceTexture4;
CMTime fourthFrameTime;
BOOL hasSetThirdTexture, hasReceivedFourthFrame, fourthFrameWasVideo;
BOOL fourthFrameCheckDisabled;
}
- Method list. The key methods of GPUImageFourInputFilter follow the same logic as their counterparts in GPUImageTwoInputFilter and GPUImageThreeInputFilter.
// Added method
- (void)disableFourthFrameCheck;
// Overridden methods
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
- (void)disableFourthFrameCheck;
{
fourthFrameCheckDisabled = YES;
}
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
{
if (self.preventRendering)
{
[firstInputFramebuffer unlock];
[secondInputFramebuffer unlock];
[thirdInputFramebuffer unlock];
[fourthInputFramebuffer unlock];
return;
}
[GPUImageContext setActiveShaderProgram:filterProgram];
outputFramebuffer = [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:[self sizeOfFBO] textureOptions:self.outputTextureOptions onlyTexture:NO];
[outputFramebuffer activateFramebuffer];
if (usingNextFrameForImageCapture)
{
[outputFramebuffer lock];
}
[self setUniformsForProgramAtIndex:0];
glClearColor(backgroundColorRed, backgroundColorGreen, backgroundColorBlue, backgroundColorAlpha);
glClear(GL_COLOR_BUFFER_BIT);
// Bind the first texture
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, [firstInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform, 2);
// Bind the second texture
glActiveTexture(GL_TEXTURE3);
glBindTexture(GL_TEXTURE_2D, [secondInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform2, 3);
// Bind the third texture
glActiveTexture(GL_TEXTURE4);
glBindTexture(GL_TEXTURE_2D, [thirdInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform3, 4);
// Bind the fourth texture
glActiveTexture(GL_TEXTURE5);
glBindTexture(GL_TEXTURE_2D, [fourthInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform4, 5);
glVertexAttribPointer(filterPositionAttribute, 2, GL_FLOAT, 0, 0, vertices);
glVertexAttribPointer(filterTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, textureCoordinates);
glVertexAttribPointer(filterSecondTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation2]);
glVertexAttribPointer(filterThirdTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation3]);
glVertexAttribPointer(filterFourthTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation4]);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
[firstInputFramebuffer unlock];
[secondInputFramebuffer unlock];
[thirdInputFramebuffer unlock];
[fourthInputFramebuffer unlock];
if (usingNextFrameForImageCapture)
{
dispatch_semaphore_signal(imageCaptureSemaphore);
}
}
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
{
// You can set up infinite update loops, so this helps to short circuit them
if (hasReceivedFirstFrame && hasReceivedSecondFrame && hasReceivedThirdFrame && hasReceivedFourthFrame) // the quoted source omitted hasReceivedFourthFrame here, an apparent copy-paste slip
{
return;
}
BOOL updatedMovieFrameOppositeStillImage = NO;
if (textureIndex == 0)
{
hasReceivedFirstFrame = YES;
firstFrameTime = frameTime;
if (secondFrameCheckDisabled)
{
hasReceivedSecondFrame = YES;
}
if (thirdFrameCheckDisabled)
{
hasReceivedThirdFrame = YES;
}
if (fourthFrameCheckDisabled)
{
hasReceivedFourthFrame = YES; // the quoted source set hasReceivedThirdFrame here, an apparent copy-paste slip
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if CMTIME_IS_INDEFINITE(secondFrameTime)
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
else if (textureIndex == 1)
{
hasReceivedSecondFrame = YES;
secondFrameTime = frameTime;
if (firstFrameCheckDisabled)
{
hasReceivedFirstFrame = YES;
}
if (thirdFrameCheckDisabled)
{
hasReceivedThirdFrame = YES;
}
if (fourthFrameCheckDisabled)
{
hasReceivedFourthFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if CMTIME_IS_INDEFINITE(firstFrameTime)
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
else if (textureIndex == 2)
{
hasReceivedThirdFrame = YES;
thirdFrameTime = frameTime;
if (firstFrameCheckDisabled)
{
hasReceivedFirstFrame = YES;
}
if (secondFrameCheckDisabled)
{
hasReceivedSecondFrame = YES;
}
if (fourthFrameCheckDisabled)
{
hasReceivedFourthFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if CMTIME_IS_INDEFINITE(firstFrameTime)
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
else
{
hasReceivedFourthFrame = YES;
fourthFrameTime = frameTime;
if (firstFrameCheckDisabled)
{
hasReceivedFirstFrame = YES;
}
if (secondFrameCheckDisabled)
{
hasReceivedSecondFrame = YES;
}
if (thirdFrameCheckDisabled)
{
hasReceivedThirdFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if CMTIME_IS_INDEFINITE(firstFrameTime)
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
// || (hasReceivedFirstFrame && secondFrameCheckDisabled) || (hasReceivedSecondFrame && firstFrameCheckDisabled)
// Render once all four inputs have arrived, or when a movie frame is paired with a still image
if ((hasReceivedFirstFrame && hasReceivedSecondFrame && hasReceivedThirdFrame && hasReceivedFourthFrame) || updatedMovieFrameOppositeStillImage)
{
static const GLfloat imageVertices[] = {
-1.0f, -1.0f,
1.0f, -1.0f,
-1.0f, 1.0f,
1.0f, 1.0f,
};
[self renderToTextureWithVertices:imageVertices textureCoordinates:[[self class] textureCoordinatesForRotation:inputRotation]];
[self informTargetsAboutNewFrameAtTime:frameTime];
hasReceivedFirstFrame = NO;
hasReceivedSecondFrame = NO;
hasReceivedThirdFrame = NO;
hasReceivedFourthFrame = NO;
}
}
Implementation
- Load an image, run it through two different filters, feed both results into a GPUImageTwoInputFilter, and display the merged output in a GPUImageView.
// Source image
GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"1.jpg"]];
// Filters
GPUImageCannyEdgeDetectionFilter *cannyFilter = [[GPUImageCannyEdgeDetectionFilter alloc] init];
GPUImageGammaFilter *gammaFilter = [[GPUImageGammaFilter alloc] init];
[picture addTarget:cannyFilter];
[picture addTarget:gammaFilter];
// GPUImageTwoInputFilter
GPUImageTwoInputFilter *twoFilter = [[GPUImageTwoInputFilter alloc] initWithFragmentShaderFromString:kTwoInputFragmentShaderString];
[cannyFilter addTarget:twoFilter];
[gammaFilter addTarget:twoFilter];
[twoFilter addTarget:_imageView];
[picture processImage];
The fragment shader program:
NSString *const kTwoInputFragmentShaderString = SHADER_STRING
(
varying highp vec2 textureCoordinate;
varying highp vec2 textureCoordinate2;
uniform sampler2D inputImageTexture;
uniform sampler2D inputImageTexture2;
void main()
{
highp vec4 oneInputColor = texture2D(inputImageTexture, textureCoordinate);
highp vec4 twoInputColor = texture2D(inputImageTexture2, textureCoordinate2);
highp float range = distance(textureCoordinate, vec2(0.5, 0.5));
highp vec4 dstColor = oneInputColor;
// inside a circle of radius 0.25 around the center, show the second input
if (range < 0.25) {
dstColor = twoInputColor;
} else {
// outside: near-black (non-edge) pixels become white, edge pixels red
if (oneInputColor.r < 0.001 && oneInputColor.g < 0.001 && oneInputColor.b < 0.001) {
dstColor = vec4(1.0);
} else {
dstColor = vec4(1.0, 0.0, 0.0, 1.0);
}
}
gl_FragColor = dstColor;
}
);
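Per fragment, this shader shows the second input inside a circle of radius 0.25 around the center, and elsewhere turns the Canny edge image into white-on-red. The same per-pixel decision can be sketched in Python for clarity (the function name and RGBA-tuple convention are mine, not part of the original):

```python
import math

def blend_pixel(coord, color1, color2):
    """coord: (x, y) texture coordinate in [0, 1]^2.
    color1/color2: RGBA tuples sampled from inputImageTexture/inputImageTexture2."""
    # inside a circle of radius 0.25 around the center, show the second input
    if math.dist(coord, (0.5, 0.5)) < 0.25:
        return color2
    # outside: near-black (non-edge) pixels become white, edge pixels red
    r, g, b, _ = color1
    if r < 0.001 and g < 0.001 and b < 0.001:
        return (1.0, 1.0, 1.0, 1.0)
    return (1.0, 0.0, 0.0, 1.0)
```

So with a Canny edge map as the first input, the black background is remapped to white and the detected edges to red, while a circular window in the middle passes the second (gamma-adjusted) image through untouched.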
Summary
GPUImageTwoInputFilter, GPUImageThreeInputFilter, and GPUImageFourInputFilter come in handy whenever you need to merge the results of several filters, and they are straightforward to use. Many filters that ship with GPUImage already inherit from these classes in order to consume multiple texture inputs.
Annotated source: GPUImage source-reading series https://github.com/QinminiOS/GPUImage
Article series: GPUImage source reading http://www.lxweimin.com/nb/11749791