GPUImage Analysis (5) —— Several Base Classes in the Framework

Version History

Version  Date
V1.0     2017.09.02

Preface

GPUImage is a technology that processes video and images directly on the GPU. If you are interested, see the previous articles:
1. GPUImage Analysis (1) —— Basic Overview (1)
2. GPUImage Analysis (2) —— Basic Overview (2)
3. GPUImage Analysis (3) —— Basic Overview (3)
4. GPUImage Analysis (4) —— Installation and Framework Introduction

Several Base Classes in the Framework

The framework really has two parts: one part is the base classes, and the other is the filters. In this article we cover the base-class part of the framework.

// Base classes

#import "GPUImageContext.h"
#import "GPUImageOutput.h"
#import "GPUImageView.h"
#import "GPUImageVideoCamera.h"
#import "GPUImageStillCamera.h"
#import "GPUImageMovie.h"
#import "GPUImagePicture.h"
#import "GPUImageRawDataInput.h"
#import "GPUImageRawDataOutput.h"
#import "GPUImageMovieWriter.h"
#import "GPUImageFilterPipeline.h"
#import "GPUImageTextureOutput.h"
#import "GPUImageFilterGroup.h"
#import "GPUImageTextureInput.h"
#import "GPUImageUIElement.h"
#import "GPUImageBuffer.h"
#import "GPUImageFramebuffer.h"
#import "GPUImageFramebufferCache.h"

Detailed Analysis of the Base Classes

Below we analyze each of these base classes in detail.

1. GPUImageContext

  • Inheritance and properties
@interface GPUImageContext : NSObject

@property(readonly, nonatomic) dispatch_queue_t contextQueue;
@property(readwrite, retain, nonatomic) GLProgram *currentShaderProgram;
@property(readonly, retain, nonatomic) EAGLContext *context;
@property(readonly) CVOpenGLESTextureCacheRef coreVideoTextureCache;
@property(readonly) GPUImageFramebufferCache *framebufferCache;
  • Purpose: GPUImageContext is GPUImage's wrapper around the OpenGL ES context; it adds GPUImage-specific state such as the shader-program cache, the processing queue, and the Core Video texture cache.

Properties

  • contextQueue: the serial dispatch queue on which all processing runs
  • currentShaderProgram: the GLProgram currently in use
  • context: the OpenGL ES context
  • coreVideoTextureCache: the Core Video texture cache
  • framebufferCache: the GPUImageFramebuffer cache
  • shaderProgramCache: the cache of compiled programs
  • shaderProgramUsageHistory: the program usage history

Methods

  • useAsCurrentContext(): before switching, checks whether this context is already the current EAGLContext and only sets it if not (this avoids the performance cost of a context switch, which is paid even when the context being set is the same one).

  • sizeThatFitsWithinATextureForSize(): clamps a requested texture size; if it exceeds the device's maximum texture size, the width and height are scaled down so they fit within the maximum.

  • (GLProgram *)programForVertexShaderString:fragmentShaderString:
    shaderProgramCache is the program cache, keyed by the concatenation of the vertex-shader and fragment-shader strings.

  • (void)useSharegroup:(EAGLSharegroup *)sharegroup;
    EAGLSharegroup manages the OpenGL ES resources (textures, buffers, framebuffers, and render buffers) for one or more EAGLContexts. It is an opaque class with no developer-facing API.

  • (EAGLContext *)context; returns the OpenGL ES 2.0 context and calls glDisable(GL_DEPTH_TEST); the image-processing pipeline does not use the depth buffer by default. A usage sketch follows.
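As a hedged illustration (this is not code from the framework itself; the function name and shader strings are placeholders), the sketch below shows how a filter-like object typically runs on the shared context queue, makes the context current, and fetches a cached program:

// Sketch only: the context-queue + program-cache usage pattern.
#import "GPUImage.h"

static GLProgram *DemoFetchCachedProgram(NSString *vertexShader, NSString *fragmentShader)
{
    __block GLProgram *program = nil;
    // All GL work is funneled through GPUImage's serial context queue.
    runSynchronouslyOnVideoProcessingQueue(^{
        [GPUImageContext useImageProcessingContext]; // calls useAsCurrentContext internally
        // Cached lookup: the key is the vertex + fragment shader strings concatenated.
        program = [[GPUImageContext sharedImageProcessingContext]
                       programForVertexShaderString:vertexShader
                               fragmentShaderString:fragmentShader];
        [GPUImageContext setActiveShaderProgram:program]; // skips redundant glUseProgram calls
    });
    return program;
}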


2. GPUImageOutput

  • Inheritance and properties
/** GPUImage's base source object
 
 Images or frames of video are uploaded from source objects, which are subclasses of GPUImageOutput. These include:
 
 - GPUImageVideoCamera (for live video from an iOS camera) 
 - GPUImageStillCamera (for taking photos with the camera)
 - GPUImagePicture (for still images)
 - GPUImageMovie (for movies)
 
 Source objects upload still image frames to OpenGL ES as textures, then hand those textures off to the next objects in the processing chain.
 */
@interface GPUImageOutput : NSObject
{
    GPUImageFramebuffer *outputFramebuffer;
    
    NSMutableArray *targets, *targetTextureIndices;
    
    CGSize inputTextureSize, cachedMaximumOutputSize, forcedMaximumSize;
    
    BOOL overrideInputSize;
    
    BOOL allTargetsWantMonochromeData;
    BOOL usingNextFrameForImageCapture;
}
  • Purpose: GPUImageOutput uploads still-image textures to OpenGL ES and then hands those textures to the next objects in the processing chain. Its subclasses gain the ability to retrieve the filter-processed image.

3. GPUImageView

  • Inheritance and properties
/**
 UIView subclass to use as an endpoint for displaying GPUImage outputs
 */
@interface GPUImageView : UIView <GPUImageInput>
{
    GPUImageRotationMode inputRotation;
}

/** The fill mode dictates how images are fit in the view, with the default being kGPUImageFillModePreserveAspectRatio
 */
@property(readwrite, nonatomic) GPUImageFillModeType fillMode;

/** This calculates the current display size, in pixels, taking into account Retina scaling factors
 */
@property(readonly, nonatomic) CGSize sizeInPixels;

@property(nonatomic) BOOL enabled;

/** Handling fill mode
 
 @param redComponent Red component for background color
 @param greenComponent Green component for background color
 @param blueComponent Blue component for background color
 @param alphaComponent Alpha component for background color
 */
- (void)setBackgroundColorRed:(GLfloat)redComponent green:(GLfloat)greenComponent blue:(GLfloat)blueComponent alpha:(GLfloat)alphaComponent;

- (void)setCurrentlyReceivingMonochromeInput:(BOOL)newValue;

@end
  • Purpose: the image view; a UIView subclass that serves as the endpoint of a response chain, displaying the filtered output on screen.

4. GPUImageVideoCamera

  • Inheritance and properties
/**
 A GPUImageOutput that provides frames from either camera
*/
@interface GPUImageVideoCamera : GPUImageOutput <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate>
{
    NSUInteger numberOfFramesCaptured;
    CGFloat totalFrameTimeDuringCapture;
    
    AVCaptureSession *_captureSession;
    AVCaptureDevice *_inputCamera;
    AVCaptureDevice *_microphone;
    AVCaptureDeviceInput *videoInput;
    AVCaptureVideoDataOutput *videoOutput;

    BOOL capturePaused;
    GPUImageRotationMode outputRotation, internalRotation;
    dispatch_semaphore_t frameRenderingSemaphore;
        
    BOOL captureAsYUV;
    GLuint luminanceTexture, chrominanceTexture;

    __unsafe_unretained id<GPUImageVideoCameraDelegate> _delegate;
}

/// The AVCaptureSession used to capture from the camera
@property(readonly, retain, nonatomic) AVCaptureSession *captureSession;

/// This enables the capture session preset to be changed on the fly
@property (readwrite, nonatomic, copy) NSString *captureSessionPreset;

/// This sets the frame rate of the camera (iOS 5 and above only)
/**
 Setting this to 0 or below will set the frame rate back to the default setting for a particular preset.
 */
@property (readwrite) int32_t frameRate;

/// Easy way to tell which cameras are present on device
@property (readonly, getter = isFrontFacingCameraPresent) BOOL frontFacingCameraPresent;
@property (readonly, getter = isBackFacingCameraPresent) BOOL backFacingCameraPresent;

/// This enables the benchmarking mode, which logs out instantaneous and average frame times to the console
@property(readwrite, nonatomic) BOOL runBenchmark;

/// Use this property to manage camera settings. Focus point, exposure point, etc.
@property(readonly) AVCaptureDevice *inputCamera;

/// This determines the rotation applied to the output image, based on the source material
@property(readwrite, nonatomic) UIInterfaceOrientation outputImageOrientation;

/// These properties determine whether or not the two camera orientations should be mirrored. By default, both are NO.
@property(readwrite, nonatomic) BOOL horizontallyMirrorFrontFacingCamera, horizontallyMirrorRearFacingCamera;

@property(nonatomic, assign) id<GPUImageVideoCameraDelegate> delegate;
  • Purpose: GPUImageVideoCamera is a subclass of GPUImageOutput that supplies frames from the camera as source data; it usually sits at the head of a response chain. GPUImage uses the AVFoundation framework to capture video.
    AVCaptureSession coordinates the flow of data from AV input devices to the designated outputs. For real-time capture you create an AVCaptureSession, add a suitable input (AVCaptureDeviceInput) and output (for example AVCaptureMovieFileOutput), call startRunning to start the flow from input to output, and call stopRunning to stop it. Note that startRunning takes a noticeable amount of time to return, so it should not be called on the main (UI) thread, to avoid stalls. A typical chain is sketched below.
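The sketch below is a minimal live-camera chain, assuming self.filterView is a GPUImageView already on screen (the sepia filter is an arbitrary choice):

GPUImageVideoCamera *videoCamera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[videoCamera addTarget:sepiaFilter];
[sepiaFilter addTarget:self.filterView];  // GPUImageView endpoint

[videoCamera startCameraCapture];  // internally calls -[AVCaptureSession startRunning]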

5. GPUImageStillCamera

  • Inheritance and properties
@interface GPUImageStillCamera : GPUImageVideoCamera

/** The JPEG compression quality to use when capturing a photo as a JPEG.
 */
@property CGFloat jpegCompressionQuality;

// Only reliably set inside the context of the completion handler of one of the capture methods
@property (readonly) NSDictionary *currentCaptureMetadata;
  • Purpose: GPUImageStillCamera is the class in GPUImage that drives the system camera and applies filters in real time. It subclasses GPUImageVideoCamera and adds photo-capture functionality.

Usage steps

  • Create the preview view, i.e. the required GPUImageView
  • Create the filter
  • Create the camera, i.e. the GPUImageStillCamera we need
  • addTarget the chain together and start processing with startCameraCapture
  • Receive the callback data and write it to the photo album (see the sketch after this list)
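A minimal sketch of those five steps, assuming self.filterView is the preview GPUImageView (the grayscale filter and the album write are just examples):

GPUImageStillCamera *stillCamera =
    [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto
                                        cameraPosition:AVCaptureDevicePositionBack];
stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageGrayscaleFilter *filter = [[GPUImageGrayscaleFilter alloc] init];
[stillCamera addTarget:filter];
[filter addTarget:self.filterView];  // live filtered preview
[stillCamera startCameraCapture];

// Later, capture a filtered photo and write it to the photo album:
[stillCamera capturePhotoAsJPEGProcessedUpToFilter:filter
                             withCompletionHandler:^(NSData *processedJPEG, NSError *error) {
    UIImage *image = [UIImage imageWithData:processedJPEG];
    UIImageWriteToSavedPhotosAlbum(image, nil, NULL, NULL);
}];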

6. GPUImageMovie

  • Inheritance and properties
/** Source object for filtering movies
 */
@interface GPUImageMovie : GPUImageOutput

@property (readwrite, retain) AVAsset *asset;
@property (readwrite, retain) AVPlayerItem *playerItem;
@property(readwrite, retain) NSURL *url;

/** This enables the benchmarking mode, which logs out instantaneous and average frame times to the console
 */
@property(readwrite, nonatomic) BOOL runBenchmark;

/** This determines whether to play back a movie as fast as the frames can be processed, or if the original speed of the movie should be respected. Defaults to NO.
 */
@property(readwrite, nonatomic) BOOL playAtActualSpeed;

/** This determines whether the video should repeat (loop) at the end and restart from the beginning. Defaults to NO.
 */
@property(readwrite, nonatomic) BOOL shouldRepeat;

/** This specifies the progress of the process on a scale from 0 to 1.0. A value of 0 means the process has not yet begun, A value of 1.0 means the conversion is complete.
    This property is not key-value observable.
 */
@property(readonly, nonatomic) float progress;

/** This is used to send the delegate the movie-did-complete-playing callback
 */
@property (readwrite, nonatomic, assign) id <GPUImageMovieDelegate>delegate;

@property (readonly, nonatomic) AVAssetReader *assetReader;
@property (readonly, nonatomic) BOOL audioEncodingIsFinished;
@property (readonly, nonatomic) BOOL videoEncodingIsFinished;
  • Purpose: GPUImageMovie subclasses GPUImageOutput and usually sits at the head of a response chain; it can be initialized with a url, a playerItem, or an asset. A playback sketch follows.
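A minimal playback sketch, assuming a bundled movie file and an on-screen GPUImageView named self.filterView (both placeholders):

NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"m4v"];
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:movieURL];
movieFile.playAtActualSpeed = YES;  // honor the movie's own frame timing

GPUImagePixellateFilter *pixellateFilter = [[GPUImagePixellateFilter alloc] init];
[movieFile addTarget:pixellateFilter];
[pixellateFilter addTarget:self.filterView];

[movieFile startProcessing];  // begins pulling frames through an AVAssetReader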

7. GPUImagePicture

  • Inheritance and properties
@interface GPUImagePicture : GPUImageOutput
{
    CGSize pixelSizeOfImage;
    BOOL hasProcessedImage;
    
    dispatch_semaphore_t imageUpdateSemaphore;
}
  • Purpose: GPUImagePicture is GPUImage's still-image handling class. It subclasses GPUImageOutput and usually sits at the head of a response chain.

Properties

  • pixelSizeOfImage: the pixel size of the image.
  • hasProcessedImage: whether the image has been processed.
  • imageUpdateSemaphore: the GCD semaphore for image processing.

Methods

  • - (id)initWithCGImage:smoothlyScaleOutput: initializes the GPUImagePicture with the source image newImageSource and a flag for whether to use mipmaps.
    If the image exceeds the maximum OpenGL ES texture size, if mipmaps are requested, or if the image data is floating point, in the wrong color space, and so on, the image is first redrawn with Core Graphics.
    The pixel data is then sent to the GPU with glTexImage2D, and finally the CPU-side image data is released.
  • - (BOOL)processImageWithCompletionHandler:; notifies the targets to process the image and invokes the completion block when done. When processing starts it marks hasProcessedImage as YES and calls dispatch_semaphore_wait() to confirm that the previous pass has finished, otherwise this pass is dropped.
  • - (void)addTarget:atTextureLocation:; adds a target to the response chain. If hasProcessedImage is YES, the image is already processed, so it directly sets the target's input size and calls newFrameReadyAtTime() to notify the target. A usage sketch follows.
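A minimal offscreen still-image sketch (the asset name is a placeholder); note that useNextFrameForImageCapture must be called before processImage so the output framebuffer survives long enough to be read back:

UIImage *inputImage = [UIImage imageNamed:@"sample.jpg"];
GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];

[picture addTarget:sepiaFilter];
[sepiaFilter useNextFrameForImageCapture];  // hold the output framebuffer for reading
[picture processImage];                     // push the texture through the chain

UIImage *filteredImage = [sepiaFilter imageFromCurrentFramebuffer];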

8. GPUImageRawDataInput

  • Inheritance and properties
@interface GPUImageRawDataInput : GPUImageOutput
{
    CGSize uploadedImageSize;
    
    dispatch_semaphore_t dataUpdateSemaphore;
}
  • Purpose: GPUImageRawDataInput subclasses GPUImageOutput. It accepts binary data, converts it into an image according to a given color format, and feeds it into the response chain. GPUImageRawDataInput neither copies nor retains the bytes passed in, yet you do not need to free them after use. The bytes are sent to a GPU texture unit; the default format is BGRA with integer (unsigned byte) components.

    • Upload logic: first request an outputFramebuffer, then bind its texture, and finally upload the image data to the GPU with glTexImage2D.
    • processData method: processes the image; if the previous pass has not finished yet, it simply returns. A usage sketch follows.
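A minimal upload sketch, assuming you already hold BGRA bytes of a known size (the tiny static buffer here is a stand-in for real pixel data):

static GLubyte bgraBytes[2 * 2 * 4];  // 2x2 image, 4 bytes per BGRA pixel

GPUImageRawDataInput *rawDataInput =
    [[GPUImageRawDataInput alloc] initWithBytes:bgraBytes size:CGSizeMake(2.0, 2.0)];
GPUImageGammaFilter *gammaFilter = [[GPUImageGammaFilter alloc] init];
[rawDataInput addTarget:gammaFilter];

[rawDataInput processData];  // uploads the bytes as a texture and notifies targets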

9. GPUImageRawDataOutput

  • Inheritance and properties
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
@interface GPUImageRawDataOutput : NSObject <GPUImageInput> {
    CGSize imageSize;
    GPUImageRotationMode inputRotation;
    BOOL outputBGRA;
}
#else
@interface GPUImageRawDataOutput : NSObject <GPUImageInput> {
    CGSize imageSize;
    GPUImageRotationMode inputRotation;
    BOOL outputBGRA;
}
#endif

@property(readonly) GLubyte *rawBytesForImage;
@property(nonatomic, copy) void(^newFrameAvailableBlock)(void);
@property(nonatomic) BOOL enabled;
  • Purpose: GPUImageRawDataOutput implements the GPUImageInput protocol; it can receive image data from the response chain and return it in binary form.

    • rawBytesForImage property: a pointer to the raw bytes;
    • GPUByteColorVector struct: an RGBA color struct that makes the raw bytes easy to read;
    • With supportsFastTextureUpload the color format used is BGRA;
      if RGBA output is needed, one more RGBA -> BGRA swizzle can be applied to the BGRA data, e.g. in a fragment shader:
      texture2D(inputImageTexture, textureCoordinate).bgra;
    • lockNextFramebuffer property: flags whether the image data is about to be read; if YES, CVPixelBufferLockBaseAddress is called to lock the corresponding CVPixelBufferRef. A readback sketch follows.
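A minimal readback sketch: hang a raw-data output off an existing filter and pull pixels on every new frame (the filter variable and output size are placeholders):

GPUImageRawDataOutput *rawDataOutput =
    [[GPUImageRawDataOutput alloc] initWithImageSize:CGSizeMake(640.0, 480.0)
                                 resultsInBGRAFormat:YES];
[filter addTarget:rawDataOutput];

__weak GPUImageRawDataOutput *weakOutput = rawDataOutput;
[rawDataOutput setNewFrameAvailableBlock:^{
    [weakOutput lockFramebufferForReading];
    GLubyte *bytes = [weakOutput rawBytesForImage];
    NSUInteger bytesPerRow = [weakOutput bytesPerRowInOutput];
    // ... consume the BGRA bytes here (bytesPerRow may exceed width * 4) ...
    [weakOutput unlockFramebufferAfterReading];
}];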

10. GPUImageMovieWriter

  • Inheritance and properties
@interface GPUImageMovieWriter : NSObject <GPUImageInput>
{
    BOOL alreadyFinishedRecording;
    
    NSURL *movieURL;
    NSString *fileType;
    AVAssetWriter *assetWriter;
    AVAssetWriterInput *assetWriterAudioInput;
    AVAssetWriterInput *assetWriterVideoInput;
    AVAssetWriterInputPixelBufferAdaptor *assetWriterPixelBufferInput;
    
    GPUImageContext *_movieWriterContext;
    CVPixelBufferRef renderTarget;
    CVOpenGLESTextureRef renderTexture;

    CGSize videoSize;
    GPUImageRotationMode inputRotation;
}

@property(readwrite, nonatomic) BOOL hasAudioTrack;
@property(readwrite, nonatomic) BOOL shouldPassthroughAudio;
@property(readwrite, nonatomic) BOOL shouldInvalidateAudioSampleWhenDone;
@property(nonatomic, copy) void(^completionBlock)(void);
@property(nonatomic, copy) void(^failureBlock)(NSError*);
@property(nonatomic, assign) id<GPUImageMovieWriterDelegate> delegate;
@property(readwrite, nonatomic) BOOL encodingLiveVideo;
@property(nonatomic, copy) BOOL(^videoInputReadyCallback)(void);
@property(nonatomic, copy) BOOL(^audioInputReadyCallback)(void);
@property(nonatomic, copy) void(^audioProcessingCallback)(SInt16 **samplesRef, CMItemCount numSamplesInBuffer);
@property(nonatomic) BOOL enabled;
@property(nonatomic, readonly) AVAssetWriter *assetWriter;
@property(nonatomic, readonly) CMTime duration;
@property(nonatomic, assign) CGAffineTransform transform;
@property(nonatomic, copy) NSArray *metaData;
@property(nonatomic, assign, getter = isPaused) BOOL paused;
@property(nonatomic, retain) GPUImageContext *movieWriterContext;
  • Purpose: GPUImageMovieWriter implements the GPUImageInput protocol and usually sits at the end of a response chain. shouldPassthroughAudio indicates whether the source audio should be used unmodified;
    movieFile.audioEncodingTarget = movieWriter; means the audio comes from the movie file. A re-encoding sketch follows.
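A minimal re-encoding sketch (file -> filter -> writer) with the source audio passed straight through; sourceURL, outputURL, and the output size are placeholders:

GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:sourceURL];
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[movieFile addTarget:sepiaFilter];

GPUImageMovieWriter *movieWriter =
    [[GPUImageMovieWriter alloc] initWithMovieURL:outputURL size:CGSizeMake(640.0, 480.0)];
[sepiaFilter addTarget:movieWriter];

movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;  // audio comes from the movie file
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];

[movieWriter startRecording];
[movieFile startProcessing];

[movieWriter setCompletionBlock:^{
    [sepiaFilter removeTarget:movieWriter];
    [movieWriter finishRecording];
}];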

11. GPUImageFilterPipeline

  • Inheritance and properties
@interface GPUImageFilterPipeline : NSObject
{
    NSString *stringValue;
}

@property (strong) NSMutableArray *filters;

@property (strong) GPUImageOutput *input;
@property (strong) id <GPUImageInput> output;
  • Purpose: GPUImageFilterPipeline is a filter pipeline: it combines the input filters and adds output as the final destination.

    • filters holds the filters; output is the output target;
    • the filters are chained together one after another, like a linked list (see the sketch below).
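A minimal construction sketch, assuming an existing source (videoCamera) and endpoint (self.filterView); the pipeline performs the addTarget wiring internally:

NSArray *orderedFilters = @[[[GPUImageSepiaFilter alloc] init],
                            [[GPUImagePixellateFilter alloc] init]];

GPUImageFilterPipeline *pipeline =
    [[GPUImageFilterPipeline alloc] initWithOrderedFilters:orderedFilters
                                                     input:videoCamera
                                                    output:self.filterView];
[videoCamera startCameraCapture];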

12. GPUImageTextureOutput

  • Inheritance and properties
@interface GPUImageTextureOutput : NSObject <GPUImageInput>
{
    GPUImageFramebuffer *firstInputFramebuffer;
}

@property(readwrite, unsafe_unretained, nonatomic) id<GPUImageTextureOutputDelegate> delegate;
@property(readonly) GLuint texture;
@property(nonatomic) BOOL enabled;
  • Purpose: GPUImageTextureOutput implements the GPUImageInput protocol; it can receive images from the response chain and return the corresponding OpenGL ES texture.

  • delegate property: a callback object that implements the GPUImageTextureOutputDelegate protocol;

  • texture property: the OpenGL ES texture, read-only;

  • enabled property: whether the output is active; defaults to YES;

  • doneWithTexture method: finishes processing the texture image and unlocks firstInputFramebuffer.


13. GPUImageFilterGroup

  • Inheritance and properties
@interface GPUImageFilterGroup : GPUImageOutput <GPUImageInput>
{
    NSMutableArray *filters;
    BOOL isEndProcessing;
}

@property(readwrite, nonatomic, strong) GPUImageOutput<GPUImageInput> *terminalFilter;
@property(readwrite, nonatomic, strong) NSArray *initialFilters;
@property(readwrite, nonatomic, strong) GPUImageOutput<GPUImageInput> *inputFilterToIgnoreForUpdates; 
  • Purpose: a composite filter; adding the same filters in a different order produces a different effect. A sketch of building a group follows.
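A minimal sketch of a two-stage group (sepia feeding a Gaussian blur); GPUImage's own combination filters are assembled the same way:

GPUImageFilterGroup *filterGroup = [[GPUImageFilterGroup alloc] init];

GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
[filterGroup addFilter:sepiaFilter];
[filterGroup addFilter:blurFilter];

[sepiaFilter addTarget:blurFilter];           // internal chain: sepia -> blur
filterGroup.initialFilters = @[sepiaFilter];  // where incoming frames enter
filterGroup.terminalFilter = blurFilter;      // where outgoing frames leave

// filterGroup can now be used like any single filter in a response chain.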

14. GPUImageTextureInput

  • Inheritance and properties
@interface GPUImageTextureInput : GPUImageOutput
{
    CGSize textureSize;
}

// Initialization and teardown
- (id)initWithTexture:(GLuint)newInputTexture size:(CGSize)newTextureSize;

// Image rendering
- (void)processTextureWithFrameTime:(CMTime)frameTime;

@end
  • Purpose: GPUImageTextureInput subclasses GPUImageOutput and can act as the start of a response chain, importing the information behind an existing OpenGL ES texture into the chain for processing. The textureSize property is the texture's size.
    At initialization it allocates a GPUImageFramebuffer that caches the texture unit's information; when processing, it simply calls the targets' frame-ready methods, because the image data already lives in memory controlled by OpenGL ES.

GPUImageTextureOutput and GPUImageTextureInput are used to move textures out of and into OpenGL ES: GPUImage's output can be handed over as an OpenGL ES texture, or an OpenGL ES rendering can be fed into GPUImage as texture input, as sketched below.
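A minimal interop sketch, assuming yourTextureID was created by your own OpenGL ES rendering in a context that shares resources with GPUImage's:

GLuint yourTextureID = 0;  // placeholder: a texture produced by your own GL code
GPUImageTextureInput *textureInput =
    [[GPUImageTextureInput alloc] initWithTexture:yourTextureID
                                             size:CGSizeMake(512.0, 512.0)];
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[textureInput addTarget:sepiaFilter];

// Push one frame through the chain with an explicit timestamp.
[textureInput processTextureWithFrameTime:kCMTimeZero];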


15. GPUImageUIElement

  • Inheritance and properties
@interface GPUImageUIElement : GPUImageOutput

// Initialization and teardown
- (id)initWithView:(UIView *)inputView;
- (id)initWithLayer:(CALayer *)inputLayer;

// Layer management
- (CGSize)layerSizeInPixels;
- (void)update;
- (void)updateUsingCurrentTime;
- (void)updateWithTimestamp:(CMTime)frameTime;

@end
  • Purpose: GPUImageUIElement subclasses GPUImageOutput and acts as the source of a response chain. It renders a UIView into an image with Core Graphics, binds that image via glTexImage2D to the texture behind outputFramebuffer, and finally notifies the targets that the texture is ready. A watermark sketch follows.
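A minimal watermark sketch: blend a UILabel over live camera video, re-rasterizing the label after each camera frame (videoCamera and self.filterView are assumed to exist already):

UILabel *label = [[UILabel alloc] initWithFrame:CGRectMake(0.0, 0.0, 240.0, 60.0)];
label.text = @"GPUImage";

GPUImageUIElement *uiElement = [[GPUImageUIElement alloc] initWithView:label];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;

GPUImageFilter *passthroughFilter = [[GPUImageFilter alloc] init];
[videoCamera addTarget:passthroughFilter];
[passthroughFilter addTarget:blendFilter];  // first input: camera frames
[uiElement addTarget:blendFilter];          // second input: the rendered label
[blendFilter addTarget:self.filterView];

__weak GPUImageUIElement *weakElement = uiElement;
[passthroughFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
    [weakElement update];  // re-render the UIView into a texture for this frame
}];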

16. GPUImageBuffer

  • Inheritance and properties
@interface GPUImageBuffer : GPUImageFilter
{
    NSMutableArray *bufferedFramebuffers;
}

@property(readwrite, nonatomic) NSUInteger bufferSize;

@end
  • Purpose: GPUImageBuffer subclasses GPUImageFilter and keeps a queue of recent framebuffers (bufferedFramebuffers, capped at bufferSize), so its targets receive frames delayed by the depth of the buffer.

17. GPUImageFramebuffer

  • Inheritance and properties
@interface GPUImageFramebuffer : NSObject

@property(readonly) CGSize size;
@property(readonly) GPUTextureOptions textureOptions;
@property(readonly) GLuint texture;
@property(readonly) BOOL missingFramebuffer;
  • Purpose: suppose we wrote our own OpenGL ES program to process an image; there would be these steps:
    • initialize the OpenGL ES environment, and compile and link the vertex and fragment shaders;
    • cache the vertex and texture-coordinate data, and send the image data to the GPU;
    • draw the primitives into a specific framebuffer;
    • read the rendered image back out of the framebuffer.
      GPUImageFilter is responsible for steps one, two, and three.
      GPUImageFramebuffer is responsible for step four (see the sketch below).
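A minimal lifecycle sketch for a cached framebuffer; as far as I can tell, fetchFramebufferForSize: hands the framebuffer back with one lock already held, so a single unlock returns it to the cache:

runSynchronouslyOnVideoProcessingQueue(^{
    [GPUImageContext useImageProcessingContext];

    GPUImageFramebuffer *framebuffer =
        [[GPUImageContext sharedFramebufferCache]
            fetchFramebufferForSize:CGSizeMake(256.0, 256.0) onlyTexture:NO];
    [framebuffer activateFramebuffer];  // binds the FBO and sets the viewport

    // ... issue draw calls into this framebuffer here ...

    [framebuffer unlock];  // reference count drops; the buffer returns to the cache
});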

18. GPUImageFramebufferCache

  • Inheritance and properties
@interface GPUImageFramebufferCache : NSObject

// Framebuffer management
- (GPUImageFramebuffer *)fetchFramebufferForSize:(CGSize)framebufferSize textureOptions:(GPUTextureOptions)textureOptions onlyTexture:(BOOL)onlyTexture;
- (GPUImageFramebuffer *)fetchFramebufferForSize:(CGSize)framebufferSize onlyTexture:(BOOL)onlyTexture;
- (void)returnFramebufferToCache:(GPUImageFramebuffer *)framebuffer;
- (void)purgeAllUnassignedFramebuffers;
- (void)addFramebufferToActiveImageCaptureList:(GPUImageFramebuffer *)framebuffer;
- (void)removeFramebufferFromActiveImageCaptureList:(GPUImageFramebuffer *)framebuffer;

@end
  • Purpose: GPUImageFramebufferCache is the manager class for GPUImageFramebuffer instances.

Properties

  • framebufferCache: the cache dictionary
  • framebufferTypeCounts: the dictionary of per-type cache counts
  • activeImageCaptureList: the list of GPUImageFramebuffers whose image data is currently being read
  • framebufferCacheQueue: the cache dispatch queue

Methods

  • - (NSString *)hashForSize:textureOptions:onlyTexture:;
    Builds a hash string from size, textureOptions, and onlyTexture.
    The hash string plus the current count for that type forms the framebufferCache lookup key.
    If no cached framebuffer is found under the key, a new one is created.

  • - (void)returnFramebufferToCache:; returns a framebuffer to the cache. Again a hash string is built from size, textureOptions, and onlyTexture, and the hash string plus the current count forms the framebufferCache key. (The count is appended because the hash string alone is not unique.)

  • - (void)addFramebufferToActiveImageCaptureList:;
    - (void)removeFramebufferFromActiveImageCaptureList:
    These two methods keep a reference to the GPUImageFramebuffer while newCGImageFromFramebufferContents() is reading the framebuffer's image data; once the read completes, the framebuffer is released in dataProviderUnlockCallback().


后記

未完,待續~~~

最后編輯于
?著作權歸作者所有,轉載或內容合作請聯系作者
平臺聲明:文章內容(如有圖片或視頻亦包括在內)由作者上傳并發布,文章內容僅代表作者本人觀點,簡書系信息發布平臺,僅提供信息存儲服務。

推薦閱讀更多精彩內容