Every beauty camera on the market ships with filters of all kinds, so how do we implement filters ourselves? The natural first step is to look for open-source projects to learn from. On iOS the best-known filter-rendering library is GPUImage, and there are similar projects on Android. Among open-source beauty cameras, the best known is probably MagicCamera by 程序員杠把子 (CSDN blog: http://my.csdn.net/oShunz, GitHub: https://github.com/wuhaoyu1990/MagicCamera). Its frame rate, however, is quite low once filters are applied: on a Redmi Note 2 it runs at only about 13 fps with nothing more than skin smoothing and a color filter, as the log in the figure below shows:
That is far too low compared with commercial beauty cameras. A commercial camera doesn't just smooth skin; it also runs face landmark detection and then fairly expensive rendering such as face slimming and eye enlarging, so such a low frame rate clearly won't do. That's why I built my own camera, CainCamera (GitHub: CainCamera). While writing its filters I drew heavily on MagicCamera, and I'm very grateful to 程序員杠把子 for the effort behind such an excellent open-source camera and its filters.
Enough preamble; let's get to the point. How does my camera perform? Doing the same skin smoothing and color filtering as MagicCamera, on the same Redmi Note 2, CainCamera's real-time preview stays at 19 fps or above, as shown below:
For how mainstream beauty cameras compare on frame rate, see my article on frame-rate optimization for real-time preview rendering on Android (Android預覽實時渲染的幀率優化相關), which benchmarks against the major commercial apps.
The first task is designing the base classes that all filters share. A well-written base class not only cuts down the code needed for each concrete filter, it can also improve rendering efficiency. Here is how my filter base classes are structured:
CainCamera's filter base classes are BaseImageFilter and BaseImageFilterGroup. BaseImageFilter is the base class of every filter and filter group; BaseImageFilterGroup extends it and is the base class of all filter groups, managing the rendering of several filters in sequence. Let's look at how both are written.
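The full BaseImageFilter class is not reproduced here. As context for the draw methods that follow, here is a trimmed-down sketch of the fields and constructor they rely on; the member names match what the draw code below uses, but the GlUtil.createProgram() helper and the exact layout are my assumption of roughly how the class is put together, not a verbatim copy:
public class BaseImageFilter {
    // Handle of the linked shader program and the locations looked up from it
    protected int mProgramHandle;
    protected int maPositionLoc;       // attribute "aPosition"
    protected int maTextureCoordLoc;   // attribute "aTextureCoord"
    protected int muMVPMatrixLoc;      // uniform "uMVPMatrix"
    protected int mTexMatrixLoc;       // uniform "uTexMatrix"
    protected int mInputTextureLoc;    // uniform "inputTexture"

    // Full-screen quad and its texture coordinates, drawn as a triangle strip
    protected FloatBuffer mVertexArray;
    protected FloatBuffer mTexCoordArray;

    public BaseImageFilter(String vertexShader, String fragmentShader) {
        // GlUtil.createProgram is assumed to compile both shaders and link the program
        mProgramHandle = GlUtil.createProgram(vertexShader, fragmentShader);
        maPositionLoc = GLES30.glGetAttribLocation(mProgramHandle, "aPosition");
        maTextureCoordLoc = GLES30.glGetAttribLocation(mProgramHandle, "aTextureCoord");
        muMVPMatrixLoc = GLES30.glGetUniformLocation(mProgramHandle, "uMVPMatrix");
        mTexMatrixLoc = GLES30.glGetUniformLocation(mProgramHandle, "uTexMatrix");
        mInputTextureLoc = GLES30.glGetUniformLocation(mProgramHandle, "inputTexture");
    }

    // drawFrame / drawFrameBuffer / onDrawArraysBegin / onDrawArraysAfter follow below
}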
First, BaseImageFilter. Its core draw methods are as follows:
/**
* Draw the frame
* @param textureId
*/
public boolean drawFrame(int textureId) {
return drawFrame(textureId, mVertexArray, mTexCoordArray);
}
/**
* Draw the frame
* @param textureId
* @param vertexBuffer
* @param textureBuffer
*/
public boolean drawFrame(int textureId, FloatBuffer vertexBuffer,
FloatBuffer textureBuffer) {
if (textureId == GlUtil.GL_NOT_INIT) {
return false;
}
GLES30.glUseProgram(mProgramHandle);
runPendingOnDrawTasks();
vertexBuffer.position(0);
GLES30.glVertexAttribPointer(maPositionLoc, mCoordsPerVertex,
GLES30.GL_FLOAT, false, 0, vertexBuffer);
GLES30.glEnableVertexAttribArray(maPositionLoc);
textureBuffer.position(0);
GLES30.glVertexAttribPointer(maTextureCoordLoc, 2,
GLES30.GL_FLOAT, false, 0, textureBuffer);
GLES30.glEnableVertexAttribArray(maTextureCoordLoc);
GLES30.glUniformMatrix4fv(muMVPMatrixLoc, 1, false, mMVPMatrix, 0);
GLES30.glUniformMatrix4fv(mTexMatrixLoc, 1, false, mTexMatrix, 0);
GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
GLES30.glBindTexture(getTextureType(), textureId);
GLES30.glUniform1i(mInputTextureLoc, 0);
onDrawArraysBegin();
GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, mVertexCount);
onDrawArraysAfter();
GLES30.glDisableVertexAttribArray(maPositionLoc);
GLES30.glDisableVertexAttribArray(maTextureCoordLoc);
GLES30.glBindTexture(getTextureType(), 0);
GLES30.glUseProgram(0);
return true;
}
/**
* Draw into the FBO
* @param textureId
* @return the texture attached to the FBO
*/
public int drawFrameBuffer(int textureId) {
return drawFrameBuffer(textureId, mVertexArray, mTexCoordArray);
}
/**
* Draw into the FBO
* @param textureId
* @param vertexBuffer
* @param textureBuffer
* @return the texture attached to the FBO
*/
public int drawFrameBuffer(int textureId, FloatBuffer vertexBuffer, FloatBuffer textureBuffer) {
if (mFramebuffers == null) {
return GlUtil.GL_NOT_INIT;
}
runPendingOnDrawTasks();
GLES30.glViewport(0, 0, mFrameWidth, mFrameHeight);
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, mFramebuffers[0]);
GLES30.glUseProgram(mProgramHandle);
vertexBuffer.position(0);
GLES30.glVertexAttribPointer(maPositionLoc, mCoordsPerVertex,
GLES30.GL_FLOAT, false, 0, vertexBuffer);
GLES30.glEnableVertexAttribArray(maPositionLoc);
textureBuffer.position(0);
GLES30.glVertexAttribPointer(maTextureCoordLoc, 2,
GLES30.GL_FLOAT, false, 0, textureBuffer);
GLES30.glEnableVertexAttribArray(maTextureCoordLoc);
GLES30.glUniformMatrix4fv(muMVPMatrixLoc, 1, false, mMVPMatrix, 0);
GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
GLES30.glBindTexture(getTextureType(), textureId);
GLES30.glUniform1i(mInputTextureLoc, 0);
onDrawArraysBegin();
GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, mVertexCount);
onDrawArraysAfter();
GLES30.glDisableVertexAttribArray(maPositionLoc);
GLES30.glDisableVertexAttribArray(maTextureCoordLoc);
GLES30.glBindTexture(getTextureType(), 0);
GLES30.glUseProgram(0);
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0);
GLES30.glViewport(0, 0, mDisplayWidth, mDisplayHeight);
return mFramebufferTextures[0];
}
The base class has just two core methods: drawFrame and drawFrameBuffer. drawFrame renders the input texture straight to the current target, while drawFrameBuffer renders it into an FBO; the latter is what multi-filter (chained) rendering is built on. Update, 2017-12-08: both drawFrame and drawFrameBuffer have since been revised. After implementing multi-segment video recording I found a small bug: switching filters while recording caused a one-frame black flash. Recording uses OpenGL ES with multiple threads and shared rendering contexts, so there is a thread-synchronization issue: when another thread switches filters, you must check whether the FBO actually has a texture attached before rendering into it. The fix is easy: if the texture isn't bound yet, return a flag so this FBO pass is skipped, since there is nothing for the current filter layer to render anyway; otherwise the FBO gets bound to an empty texture and you see the momentary black frame when switching filters during recording. This has been fixed.
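A minimal sketch of that guard is shown here; the exact check CainCamera uses may differ, but the idea is simply to bail out of the FBO pass when the attachment isn't ready yet:
// Inside drawFrameBuffer(), before binding the FBO:
if (mFramebuffers == null || mFramebufferTextures == null
        || mFramebufferTextures[0] == 0) {
    // The FBO's color texture hasn't been (re)created yet, e.g. another
    // thread is still switching filters. Skip this pass and return the
    // "not initialized" flag so the caller keeps using the previous texture
    // instead of sampling an empty attachment (the black-frame bug).
    return GlUtil.GL_NOT_INIT;
}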
Next, the core of BaseImageFilterGroup:
@Override
public void onInputSizeChanged(int width, int height) {
super.onInputSizeChanged(width, height);
if (mFilters.size() <= 0) {
return;
}
int size = mFilters.size();
for (int i = 0; i < size; i++) {
mFilters.get(i).onInputSizeChanged(width, height);
}
// Destroy the old framebuffers first when the size changes
if (mFramebuffers != null && (mImageWidth != width
|| mImageHeight != height || mFramebuffers.length != size-1)) {
destroyFramebuffer();
mImageWidth = width;
mImageHeight = height;
}
initFramebuffer(width, height);
}
@Override
public void onDisplayChanged(int width, int height) {
super.onDisplayChanged(width, height);
// Update the size of the display view
if (mFilters.size() <= 0) {
return;
}
int size = mFilters.size();
for (int i = 0; i < size; i++) {
mFilters.get(i).onDisplayChanged(width, height);
}
}
@Override
public boolean drawFrame(int textureId) {
if (mFramebuffers == null || mFrameBufferTextures == null || mFilters.size() <= 0) {
return false;
}
int size = mFilters.size();
mCurrentTextureId = textureId;
for (int i = 0; i < size; i++) {
BaseImageFilter filter = mFilters.get(i);
if (i < size - 1) {
GLES30.glViewport(0, 0, mImageWidth, mImageHeight);
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, mFramebuffers[i]);
GLES30.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
if (filter.drawFrame(mCurrentTextureId)) {
mCurrentTextureId = mFrameBufferTextures[i];
}
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0);
} else {
GLES30.glViewport(0, 0, mDisplayWidth, mDisplayHeight);
filter.drawFrame(mCurrentTextureId);
}
}
return true;
}
@Override
public boolean drawFrame(int textureId, FloatBuffer vertexBuffer, FloatBuffer textureBuffer) {
if (mFramebuffers == null || mFrameBufferTextures == null || mFilters.size() <= 0) {
return false;
}
int size = mFilters.size();
mCurrentTextureId = textureId;
for (int i = 0; i < size; i++) {
BaseImageFilter filter = mFilters.get(i);
if (i < size - 1) {
GLES30.glViewport(0, 0, mImageWidth, mImageHeight);
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, mFramebuffers[i]);
GLES30.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
filter.drawFrame(mCurrentTextureId, vertexBuffer, textureBuffer);
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0);
mCurrentTextureId = mFrameBufferTextures[i];
} else {
GLES30.glViewport(0, 0, mDisplayWidth, mDisplayHeight);
filter.drawFrame(mCurrentTextureId, vertexBuffer, textureBuffer);
}
}
return true;
}
@Override
public int drawFrameBuffer(int textureId) {
if (mFramebuffers == null || mFrameBufferTextures == null || mFilters.size() <= 0) {
return textureId;
}
int size = mFilters.size();
mCurrentTextureId = textureId;
GLES30.glViewport(0, 0, mImageWidth, mImageHeight);
for (int i = 0; i < size; i++) {
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, mFramebuffers[i]);
GLES30.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
if (mFilters.get(i).drawFrame(mCurrentTextureId)) {
mCurrentTextureId = mFrameBufferTextures[i];
}
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0);
}
return mCurrentTextureId;
}
@Override
public int drawFrameBuffer(int textureId, FloatBuffer vertexBuffer, FloatBuffer textureBuffer) {
if (mFramebuffers == null || mFrameBufferTextures == null || mFilters.size() <= 0) {
return textureId;
}
int size = mFilters.size();
mCurrentTextureId = textureId;
GLES30.glViewport(0, 0, mImageWidth, mImageHeight);
for (int i = 0; i < size; i++) {
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, mFramebuffers[i]);
GLES30.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
if (mFilters.get(i).drawFrame(mCurrentTextureId, vertexBuffer, textureBuffer)) {
mCurrentTextureId = mFrameBufferTextures[i];
}
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0);
}
return mCurrentTextureId;
}
@Override
public void release() {
if (mFilters != null) {
for (BaseImageFilter filter : mFilters) {
filter.release();
}
mFilters.clear();
}
destroyFramebuffer();
}
/**
* Initialize the framebuffers. One more FBO than drawFrame strictly needs is created here, to make it easier to scale the output later when recording video
*/
public void initFramebuffer(int width, int height) {
int size = mFilters.size();
// Create the framebuffers and their textures
if (mFramebuffers == null) {
mFramebuffers = new int[size];
mFrameBufferTextures = new int[size];
createFramebuffer(0, size);
}
}
/**
* Create the framebuffers
* @param start
* @param size
*/
private void createFramebuffer(int start, int size) {
for (int i = start; i < size; i++) {
GLES30.glGenFramebuffers(1, mFramebuffers, i);
GLES30.glGenTextures(1, mFrameBufferTextures, i);
GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, mFrameBufferTextures[i]);
GLES30.glTexImage2D(GLES30.GL_TEXTURE_2D, 0, GLES30.GL_RGBA,
mImageWidth, mImageHeight, 0, GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, null);
GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D,
GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);
GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D,
GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_LINEAR);
GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D,
GLES30.GL_TEXTURE_WRAP_S, GLES30.GL_CLAMP_TO_EDGE);
GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D,
GLES30.GL_TEXTURE_WRAP_T, GLES30.GL_CLAMP_TO_EDGE);
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, mFramebuffers[i]);
GLES30.glFramebufferTexture2D(GLES30.GL_FRAMEBUFFER, GLES30.GL_COLOR_ATTACHMENT0,
GLES30.GL_TEXTURE_2D, mFrameBufferTextures[i], 0);
GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, 0);
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0);
}
}
/**
* Destroy the framebuffers
*/
public void destroyFramebuffer() {
if (mFrameBufferTextures != null) {
GLES30.glDeleteTextures(mFrameBufferTextures.length, mFrameBufferTextures, 0);
mFrameBufferTextures = null;
}
if (mFramebuffers != null) {
GLES30.glDeleteFramebuffers(mFramebuffers.length, mFramebuffers, 0);
mFramebuffers = null;
}
}
BaseImageFilterGroup is mainly responsible for creating and destroying the FBOs. onInputSizeChanged is called when the camera preview size (PreviewSize) changes, and onDisplayChanged when the SurfaceView size changes; initFramebuffer creates one FBO per filter, and destroyFramebuffer tears them down. drawFrame is inherited from BaseImageFilter and has to be overridden so that each filter in turn binds its FBO and draws.
With these base classes, writing concrete filters and filter groups becomes much easier. Let's look at one example of each.
For instance, drawing the camera input stream can be written like this:
public class CameraFilter extends BaseImageFilter {
private static final String VERTEX_SHADER =
"uniform mat4 uMVPMatrix; \n" +
"uniform mat4 uTexMatrix; \n" +
"attribute vec4 aPosition; \n" +
"attribute vec4 aTextureCoord; \n" +
"varying vec2 textureCoordinate; \n" +
"void main() { \n" +
" gl_Position = uMVPMatrix * aPosition; \n" +
" textureCoordinate = (uTexMatrix * aTextureCoord).xy; \n" +
"} \n";
private static final String FRAGMENT_SHADER_OES =
"#extension GL_OES_EGL_image_external : require \n" +
"precision mediump float; \n" +
"varying vec2 textureCoordinate; \n" +
"uniform samplerExternalOES inputTexture; \n" +
"void main() { \n" +
" gl_FragColor = texture2D(inputTexture, textureCoordinate); \n" +
"} \n";
private int muTexMatrixLoc;
private float[] mTextureMatrix;
public CameraFilter() {
this(VERTEX_SHADER, FRAGMENT_SHADER_OES);
}
public CameraFilter(String vertexShader, String fragmentShader) {
super(vertexShader, fragmentShader);
muTexMatrixLoc = GLES30.glGetUniformLocation(mProgramHandle, "uTexMatrix");
// view matrix
Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -1, 0f, 0f, 0f, 0f, 1f, 0f);
}
@Override
public void onInputSizeChanged(int width, int height) {
super.onInputSizeChanged(width, height);
float aspect = (float) width / height; // aspect ratio
Matrix.perspectiveM(mProjectionMatrix, 0, 60, aspect, 2, 10);
}
@Override
public int getTextureType() {
return GLES11Ext.GL_TEXTURE_EXTERNAL_OES;
}
@Override
public void onDrawArraysBegin() {
GLES30.glUniformMatrix4fv(muTexMatrixLoc, 1, false, mTextureMatrix, 0);
}
public void updateTextureBuffer() {
mTexCoordArray = TextureRotationUtils.getTextureBuffer();
}
/**
* Set the SurfaceTexture transform matrix
* @param texMatrix
*/
public void setTextureTransformMatirx(float[] texMatrix) {
mTextureMatrix = texMatrix;
}
/**
* Mirror flip of the texture coordinates
* @param coords
* @param matrix
* @return
*/
private float[] transformTextureCoordinates(float[] coords, float[] matrix) {
float[] result = new float[coords.length];
float[] vt = new float[4];
for (int i = 0; i < coords.length; i += 2) {
float[] v = { coords[i], coords[i + 1], 0, 1 };
Matrix.multiplyMV(vt, 0, matrix, 0, v, 0);
result[i] = vt[0]; // mirror along the x axis
// result[i + 1] = vt[1]; // mirror along the y axis
result[i + 1] = coords[i + 1];
}
return result;
}
}
The filter group is even simpler:
public class DefaultFilterGroup extends BaseImageFilterGroup {
// real-time beauty (skin smoothing) layer
private static final int BeautyfyIndex = 0;
// color filter layer
private static final int ColorIndex = 1;
// face slimming / eye enlarging layer
private static final int FaceStretchIndex = 2;
// sticker layer
private static final int StickersIndex = 3;
public DefaultFilterGroup() {
this(initFilters());
}
private DefaultFilterGroup(List<BaseImageFilter> filters) {
mFilters = filters;
}
private static List<BaseImageFilter> initFilters() {
List<BaseImageFilter> filters = new ArrayList<BaseImageFilter>();
filters.add(BeautyfyIndex, FilterManager.getFilter(FilterType.REALTIMEBEAUTY));
filters.add(ColorIndex, FilterManager.getFilter(FilterType.SOURCE));
filters.add(FaceStretchIndex, FilterManager.getFilter(FilterType.FACESTRETCH));
filters.add(StickersIndex, FilterManager.getFilter(FilterType.STICKER));
return filters;
}
@Override
public void changeFilter(FilterType type) {
FilterIndex index = FilterManager.getIndex(type);
if (index == FilterIndex.BeautyIndex) {
changeBeautyFilter(type);
} else if (index == FilterIndex.ColorIndex) {
changeColorFilter(type);
} else if (index == FilterIndex.FaceStretchIndex) {
changeFaceStretchFilter(type);
} else if (index == FilterIndex.MakeUpIndex) {
changeMakeupFilter(type);
} else if (index == FilterIndex.StickerIndex) {
changeStickerFilter(type);
}
}
/**
* Switch the beauty filter
* @param type
*/
private void changeBeautyFilter(FilterType type) {
if (mFilters != null) {
mFilters.get(BeautyfyIndex).release();
mFilters.set(BeautyfyIndex, FilterManager.getFilter(type));
// set input and display sizes
mFilters.get(BeautyfyIndex).onInputSizeChanged(mImageWidth, mImageHeight);
mFilters.get(BeautyfyIndex).onDisplayChanged(mDisplayWidth, mDisplayHeight);
}
}
/**
* Switch the color filter
* @param type
*/
private void changeColorFilter(FilterType type) {
if (mFilters != null) {
mFilters.get(ColorIndex).release();
mFilters.set(ColorIndex, FilterManager.getFilter(type));
// set input and display sizes
mFilters.get(ColorIndex).onInputSizeChanged(mImageWidth, mImageHeight);
mFilters.get(ColorIndex).onDisplayChanged(mDisplayWidth, mDisplayHeight);
}
}
/**
* Switch the face slimming / eye enlarging filter
* @param type
*/
private void changeFaceStretchFilter(FilterType type) {
if (mFilters != null) {
mFilters.get(FaceStretchIndex).release();
mFilters.set(FaceStretchIndex, FilterManager.getFilter(type));
// set input and display sizes
mFilters.get(FaceStretchIndex).onInputSizeChanged(mImageWidth, mImageHeight);
mFilters.get(FaceStretchIndex).onDisplayChanged(mDisplayWidth, mDisplayHeight);
}
}
/**
* Switch the sticker filter
* @param type
*/
private void changeStickerFilter(FilterType type) {
if (mFilters != null) {
mFilters.get(StickersIndex).release();
mFilters.set(StickersIndex, FilterManager.getFilter(type));
// set input and display sizes
mFilters.get(StickersIndex).onInputSizeChanged(mImageWidth, mImageHeight);
mFilters.get(StickersIndex).onDisplayChanged(mDisplayWidth, mDisplayHeight);
}
}
/**
* Switch the makeup filter
* @param type
*/
private void changeMakeupFilter(FilterType type) {
// Do nothing here; makeup filters live in the dedicated makeup filter group
}
}
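Putting CameraFilter and the filter group together, one camera frame boils down to two calls. The renderer class and field names in this sketch are placeholders for illustration rather than the exact CainCamera code:
// Called once per camera frame on the GL thread.
public void onDrawFrame(SurfaceTexture surfaceTexture) {
    surfaceTexture.updateTexImage();
    surfaceTexture.getTransformMatrix(mTransformMatrix);

    // 1. Draw the OES camera texture into an FBO, producing an ordinary 2D texture.
    mCameraFilter.setTextureTransformMatirx(mTransformMatrix);
    int textureId = mCameraFilter.drawFrameBuffer(mCameraTextureId);

    // 2. Run the layered filter group (beauty, color, face-stretch, sticker)
    //    on that texture and draw the result to the screen.
    mFilterGroup.drawFrame(textureId);
}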
Above is the default real-time rendering filter group. As you can see, I organize multiple filters into fixed layers. The reason is simple: it saves setup work before rendering. When real-time performance doesn't matter much, the usual approach is to work out how many filters there are each time some method is called and bind FBOs dynamically while drawing. When real-time performance is critical that is not a good approach, because every drawFrame/drawFrameBuffer call then has to bind and unbind FBOs, which inevitably costs efficiency. In practice we already know before drawing how many filters, and therefore how many FBOs, we need. The draw methods are called extremely often, so any expensive work that isn't strictly necessary must stay out of them; allocating objects there is even worse, because allocation is slow and can cause memory churn.
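One more base-class detail that matters for the same reason: drawFrame()/drawFrameBuffer() begin by calling runPendingOnDrawTasks(). Parameter setters, such as the setFloat() call you will see in LomoFilter below, presumably don't issue GL calls directly; in the GPUImage style they enqueue a small task that is executed on the GL thread at the next draw. A minimal sketch of what that might look like inside BaseImageFilter:
// Deferred GL work, drained on the GL thread at the start of each draw.
private final LinkedList<Runnable> mRunOnDraw = new LinkedList<Runnable>();

protected void runOnDraw(final Runnable runnable) {
    synchronized (mRunOnDraw) {
        mRunOnDraw.addLast(runnable);
    }
}

protected void runPendingOnDrawTasks() {
    synchronized (mRunOnDraw) {
        while (!mRunOnDraw.isEmpty()) {
            mRunOnDraw.removeFirst().run();
        }
    }
}

// Example setter: safe to call from any thread; the glUniform call itself
// only happens during the next draw, on the GL thread.
protected void setFloat(final int location, final float floatValue) {
    runOnDraw(new Runnable() {
        @Override
        public void run() {
            GLES30.glUniform1f(location, floatValue);
        }
    });
}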
Now that we have the filter and filter-group base classes and the default real-time group, what's left is implementing the concrete filters, and that is mostly a matter of GLSL. What if a filter needs more than one texture? No problem: drawFrame in BaseImageFilter calls an empty hook method, onDrawArraysBegin(). Any filter that needs textures besides inputTexture can bind them there, as the LOMO filter below does with the lookup textures it blends with:
public class LomoFilter extends BaseImageFilter {
private static final String FRAGMENT_SHADER =
"precision mediump float;\n" +
" \n" +
" varying mediump vec2 textureCoordinate;\n" +
" \n" +
" uniform sampler2D inputTexture;\n" +
" uniform sampler2D mapTexture;\n" +
" uniform sampler2D vignetteTexture;\n" +
" \n" +
" uniform float strength;\n" +
"\n" +
" void main()\n" +
" {\n" +
" vec4 originColor = texture2D(inputTexture, textureCoordinate);\n" +
" vec3 texel = texture2D(inputTexture, textureCoordinate).rgb;\n" +
"\n" +
" vec2 red = vec2(texel.r, 0.16666);\n" +
" vec2 green = vec2(texel.g, 0.5);\n" +
" vec2 blue = vec2(texel.b, 0.83333);\n" +
"\n" +
" texel.rgb = vec3(\n" +
" texture2D(mapTexture, red).r,\n" +
" texture2D(mapTexture, green).g,\n" +
" texture2D(mapTexture, blue).b);\n" +
"\n" +
" vec2 tc = (2.0 * textureCoordinate) - 1.0;\n" +
" float d = dot(tc, tc);\n" +
" vec2 lookup = vec2(d, texel.r);\n" +
" texel.r = texture2D(vignetteTexture, lookup).r;\n" +
" lookup.y = texel.g;\n" +
" texel.g = texture2D(vignetteTexture, lookup).g;\n" +
" lookup.y = texel.b;\n" +
" texel.b\t= texture2D(vignetteTexture, lookup).b;\n" +
"\n" +
" texel.rgb = mix(originColor.rgb, texel.rgb, strength);\n" +
"\n" +
" gl_FragColor = vec4(texel,1.0);\n" +
" }";
private int mMapTexture;
private int mMapTextureLoc;
private int mVignetteTexture;
private int mVignetteTextureLoc;
private int mStrengthLoc;
public LomoFilter() {
this(VERTEX_SHADER, FRAGMENT_SHADER);
}
public LomoFilter(String vertexShader, String fragmentShader) {
super(vertexShader, fragmentShader);
mMapTextureLoc = GLES30.glGetUniformLocation(mProgramHandle, "mapTexture");
mVignetteTextureLoc = GLES30.glGetUniformLocation(mProgramHandle, "vignetteTexture");
mStrengthLoc = GLES30.glGetUniformLocation(mProgramHandle, "strength");
createTexture();
setFloat(mStrengthLoc, 1.0f);
}
private void createTexture() {
mMapTexture = GlUtil.createTextureFromAssets(ParamsManager.context,
"filters/lomo_map.png");
mVignetteTexture = GlUtil.createTextureFromAssets(ParamsManager.context,
"filters/lomo_vignette.png");
}
@Override
public void onDrawArraysBegin() {
super.onDrawArraysBegin();
GLES30.glActiveTexture(GLES30.GL_TEXTURE1);
GLES30.glBindTexture(getTextureType(), mMapTexture);
GLES30.glUniform1i(mMapTextureLoc, 1);
GLES30.glActiveTexture(GLES30.GL_TEXTURE2);
GLES30.glBindTexture(getTextureType(), mVignetteTexture);
GLES30.glUniform1i(mVignetteTextureLoc, 2);
}
@Override
public void release() {
super.release();
GLES30.glDeleteTextures(2, new int[]{mMapTexture, mVignetteTexture}, 0);
}
}
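GlUtil.createTextureFromAssets() is not reproduced in this article. A helper like this typically just decodes a Bitmap from the assets folder and uploads it as a GL_TEXTURE_2D; the sketch below shows that pattern under the assumption that CainCamera's version works roughly the same way:
public static int createTextureFromAssets(Context context, String assetName) {
    Bitmap bitmap;
    try {
        InputStream is = context.getAssets().open(assetName);
        bitmap = BitmapFactory.decodeStream(is);
        is.close();
    } catch (IOException e) {
        return 0; // 0 is never a valid texture name
    }
    if (bitmap == null) {
        return 0;
    }
    int[] textures = new int[1];
    GLES30.glGenTextures(1, textures, 0);
    GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, textures[0]);
    GLES30.glTexParameteri(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_LINEAR);
    GLES30.glTexParameteri(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);
    GLES30.glTexParameteri(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_WRAP_S, GLES30.GL_CLAMP_TO_EDGE);
    GLES30.glTexParameteri(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_WRAP_T, GLES30.GL_CLAMP_TO_EDGE);
    // Upload the decoded bitmap to the bound texture and release the CPU copy.
    GLUtils.texImage2D(GLES30.GL_TEXTURE_2D, 0, bitmap, 0);
    GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, 0);
    bitmap.recycle();
    return textures[0];
}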
The filters deliberately hold no Context; that way the RenderThread doesn't need to hold one either, which makes the code easier to port to other systems. If you need to, you can of course add a Context to the base class; it's a matter of personal preference.
So we now know how to implement individual filters. But with the filter group split into layers like this, how do we switch filters? For convenience I wrote a few manager classes that handle switching filters and filter groups: FilterIndex identifies the filter layer, FilterType identifies the concrete filter, FilterManager switches filters and filter groups, and ColorFilterManager manages switching between color filters so it's easy to call from an Activity. The implementation is as follows:
FilterIndex:
public enum FilterIndex {
// none
NoneIndex,
// beauty (skin smoothing)
BeautyIndex,
// color
ColorIndex,
// face slimming / eye enlarging
FaceStretchIndex,
// sticker
StickerIndex,
// makeup
MakeUpIndex,
// watermark
WaterMaskIndex,
// image editing
ImageEditIndex
}
FilterType:
public enum FilterType {
NONE, // no filter
// image-editing filters
BRIGHTNESS, // brightness
CONTRAST, // contrast
EXPOSURE, // exposure
GUASS, // Gaussian blur
HUE, // hue
MIRROR, // mirror
SATURATION, // saturation
SHARPNESS, // sharpness
WATERMASK, // watermark
// face beauty / makeup / sticker filters
REALTIMEBEAUTY, // real-time beauty (skin smoothing)
FACESTRETCH, // face reshaping (slimming, eye enlarging, etc.)
STICKER, // sticker
MAKEUP, // makeup
// color filters
SOURCE, // original image
AMARO, // Amaro
ANTIQUE, // Antique
BLACKCAT, // Black Cat
BLACKWHITE, // Black & White
BROOKLYN, // Brooklyn
CALM, // Calm
COOL, // Cool
EARLYBIRD, // Earlybird
EMERALD, // Emerald
EVERGREEN, // Evergreen
FAIRYTALE, // Fairy Tale
FREUD, // Freud
HEALTHY, // Healthy
HEFE, // Hefe
HUDSON, // Hudson
KEVIN, // Kevin
LATTE, // Latte
LOMO, // LOMO
NOSTALGIA, // Nostalgia
ROMANCE, // Romance
SAKURA, // Sakura
SKETCH, // Sketch
SUNSET, // Sunset
WHITECAT, // White Cat
WHITENORREDDEN, // Whiten or Redden
}
FilterManager:
public final class FilterManager {
private static HashMap<FilterType, FilterIndex> mIndexMap = new HashMap<FilterType, FilterIndex>();
static {
mIndexMap.put(FilterType.NONE, FilterIndex.NoneIndex);
// image editing
mIndexMap.put(FilterType.BRIGHTNESS, FilterIndex.ImageEditIndex);
mIndexMap.put(FilterType.CONTRAST, FilterIndex.ImageEditIndex);
mIndexMap.put(FilterType.EXPOSURE, FilterIndex.ImageEditIndex);
mIndexMap.put(FilterType.GUASS, FilterIndex.ImageEditIndex);
mIndexMap.put(FilterType.HUE, FilterIndex.ImageEditIndex);
mIndexMap.put(FilterType.MIRROR, FilterIndex.ImageEditIndex);
mIndexMap.put(FilterType.SATURATION, FilterIndex.ImageEditIndex);
mIndexMap.put(FilterType.SHARPNESS, FilterIndex.ImageEditIndex);
// watermark
mIndexMap.put(FilterType.WATERMASK, FilterIndex.WaterMaskIndex);
// beauty
mIndexMap.put(FilterType.REALTIMEBEAUTY, FilterIndex.BeautyIndex);
// face slimming / eye enlarging
mIndexMap.put(FilterType.FACESTRETCH, FilterIndex.FaceStretchIndex);
// sticker
mIndexMap.put(FilterType.STICKER, FilterIndex.StickerIndex);
// makeup
mIndexMap.put(FilterType.MAKEUP, FilterIndex.MakeUpIndex);
// color filters
mIndexMap.put(FilterType.AMARO, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.ANTIQUE, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.BLACKCAT, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.BLACKWHITE, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.BROOKLYN, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.CALM, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.COOL, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.EARLYBIRD, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.EMERALD, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.EVERGREEN, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.FAIRYTALE, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.FREUD, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.HEALTHY, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.HEFE, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.HUDSON, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.KEVIN, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.LATTE, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.LOMO, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.NOSTALGIA, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.ROMANCE, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.SAKURA, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.SKETCH, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.SOURCE, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.SUNSET, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.WHITECAT, FilterIndex.ColorIndex);
mIndexMap.put(FilterType.WHITENORREDDEN, FilterIndex.ColorIndex);
}
private FilterManager() {}
public static BaseImageFilter getFilter(FilterType type) {
switch (type) {
// basic image property editing filters
// saturation
case SATURATION:
return new SaturationFilter();
// mirror flip
case MIRROR:
return new MirrorFilter();
// Gaussian blur
case GUASS:
return new GuassFilter();
// brightness
case BRIGHTNESS:
return new BrightnessFilter();
// contrast
case CONTRAST:
return new ContrastFilter();
// exposure
case EXPOSURE:
return new ExposureFilter();
// hue
case HUE:
return new HueFilter();
// sharpness
case SHARPNESS:
return new SharpnessFilter();
// TODO the sticker filter needs the face landmarks to be computed first
case STICKER:
return new DisplayFilter();
// return new StickerFilter();
// whiten or redden
case WHITENORREDDEN:
return new WhitenOrReddenFilter();
// real-time skin smoothing
case REALTIMEBEAUTY:
return new RealtimeBeautify();
// Amaro
case AMARO:
return new AmaroFilter();
// Antique
case ANTIQUE:
return new AnitqueFilter();
// Black Cat
case BLACKCAT:
return new BlackCatFilter();
// Black & White
case BLACKWHITE:
return new BlackWhiteFilter();
// Brooklyn
case BROOKLYN:
return new BrooklynFilter();
// Calm
case CALM:
return new CalmFilter();
// Cool
case COOL:
return new CoolFilter();
// Earlybird
case EARLYBIRD:
return new EarlyBirdFilter();
// Emerald
case EMERALD:
return new EmeraldFilter();
// Evergreen
case EVERGREEN:
return new EvergreenFilter();
// Fairy Tale
case FAIRYTALE:
return new FairyTaleFilter();
// Freud
case FREUD:
return new FreudFilter();
// Healthy
case HEALTHY:
return new HealthyFilter();
// Hefe
case HEFE:
return new HefeFilter();
// Hudson
case HUDSON:
return new HudsonFilter();
// Kevin
case KEVIN:
return new KevinFilter();
// Latte
case LATTE:
return new LatteFilter();
// LOMO
case LOMO:
return new LomoFilter();
// Nostalgia
case NOSTALGIA:
return new NostalgiaFilter();
// Romance
case ROMANCE:
return new RomanceFilter();
// Sakura
case SAKURA:
return new SakuraFilter();
// Sketch
case SKETCH:
return new SketchFilter();
// Sunset
case SUNSET:
return new SunsetFilter();
// White Cat
case WHITECAT:
return new WhiteCatFilter();
case NONE: // no filter
case SOURCE: // original image
default:
return new DisplayFilter();
}
}
/**
* Get the default filter group
* @return
*/
public static BaseImageFilterGroup getFilterGroup() {
return new DefaultFilterGroup();
}
public static BaseImageFilterGroup getFilterGroup(FilterGroupType type) {
switch (type) {
// makeup filter group
case MAKEUP:
return new MakeUpFilterGroup();
// default filter group
case DEFAULT:
default:
return new DefaultFilterGroup();
}
}
/**
* Get the layer (index) the given filter type belongs to
* @param Type
* @return
*/
public static FilterIndex getIndex(FilterType Type) {
FilterIndex index = mIndexMap.get(Type);
if (index != null) {
return index;
}
return FilterIndex.NoneIndex;
}
}
ColorFilterManager:
public final class ColorFilterManager {
private static ColorFilterManager mInstance;
private ArrayList<FilterType> mFilterType;
private ArrayList<String> mFilterName;
public static ColorFilterManager getInstance() {
if (mInstance == null) {
mInstance = new ColorFilterManager();
}
return mInstance;
}
private ColorFilterManager() {
initColorFilters();
}
/**
* Initialize the color filters
*/
public void initColorFilters() {
mFilterType = new ArrayList<FilterType>();
mFilterType.add(FilterType.SOURCE); // original image
mFilterType.add(FilterType.AMARO);
mFilterType.add(FilterType.ANTIQUE);
mFilterType.add(FilterType.BLACKCAT);
mFilterType.add(FilterType.BLACKWHITE);
mFilterType.add(FilterType.BROOKLYN);
mFilterType.add(FilterType.CALM);
mFilterType.add(FilterType.COOL);
mFilterType.add(FilterType.EARLYBIRD);
mFilterType.add(FilterType.EMERALD);
mFilterType.add(FilterType.EVERGREEN);
mFilterType.add(FilterType.FAIRYTALE);
mFilterType.add(FilterType.FREUD);
mFilterType.add(FilterType.HEALTHY);
mFilterType.add(FilterType.HEFE);
mFilterType.add(FilterType.HUDSON);
mFilterType.add(FilterType.KEVIN);
mFilterType.add(FilterType.LATTE);
mFilterType.add(FilterType.LOMO);
mFilterType.add(FilterType.NOSTALGIA);
mFilterType.add(FilterType.ROMANCE);
mFilterType.add(FilterType.SAKURA);
mFilterType.add(FilterType.SKETCH);
mFilterType.add(FilterType.SUNSET);
mFilterType.add(FilterType.WHITECAT);
mFilterName = new ArrayList<String>();
mFilterName.add("原圖");
mFilterName.add("阿馬羅");
mFilterName.add("古董");
mFilterName.add("黑貓");
mFilterName.add("黑白");
mFilterName.add("布魯克林");
mFilterName.add("冷靜");
mFilterName.add("冷色調");
mFilterName.add("晨鳥");
mFilterName.add("翡翠");
mFilterName.add("常綠");
mFilterName.add("童話");
mFilterName.add("佛洛伊特");
mFilterName.add("健康");
mFilterName.add("酵母");
mFilterName.add("哈德森");
mFilterName.add("凱文");
mFilterName.add("拿鐵");
mFilterName.add("LOMO");
mFilterName.add("懷舊之情");
mFilterName.add("浪漫");
mFilterName.add("櫻花");
mFilterName.add("素描");
mFilterName.add("日落");
mFilterName.add("白貓");
}
/**
* Get the color filter type at the given position
* @param index
* @return
*/
public FilterType getColorFilterType(int index) {
if (mFilterType == null || mFilterType.isEmpty()) {
return FilterType.SOURCE;
}
int i = index % mFilterType.size();
return mFilterType.get(i);
}
/**
* Get the display name of the color filter at the given position
* @param index
* @return
*/
public String getColorFilterName(int index) {
if (mFilterName == null || mFilterName.isEmpty()) {
return "原圖";
}
int i = index % mFilterName.size();
return mFilterName.get(i);
}
/**
* Get the number of color filters
* @return
*/
public int getColorFilterCount() {
return mFilterType == null ? 0 : mFilterType.size();
}
}
With this in place, switching filters in the outer Activity is as simple as:
@Override
public void swipeBack() {
mColorIndex++;
if (mColorIndex >= ColorFilterManager.getInstance().getColorFilterCount()) {
mColorIndex = 0;
}
DrawerManager.getInstance()
.changeFilterType(ColorFilterManager.getInstance().getColorFilterType(mColorIndex));
if (isDebug) {
Log.d("changeFilter", "index = " + mColorIndex + ", filter name = "
+ ColorFilterManager.getInstance().getColorFilterName(mColorIndex));
}
}
@Override
public void swipeFrontal() {
mColorIndex--;
if (mColorIndex < 0) {
int count = ColorFilterManager.getInstance().getColorFilterCount();
mColorIndex = count > 0 ? count - 1 : 0;
}
DrawerManager.getInstance()
.changeFilterType(ColorFilterManager.getInstance().getColorFilterType(mColorIndex));
if (isDebug) {
Log.d("changeFilter", "index = " + mColorIndex + ", filter name = "
+ ColorFilterManager.getInstance().getColorFilterName(mColorIndex));
}
}
DrawerManager manages the render thread for the camera preview. With this scheme the page (UI) logic never has to care how a filter switch is actually carried out, and even a major UI rework has no impact at all on the filter code.
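DrawerManager itself is beyond the scope of this article; conceptually, changeFilterType() just forwards the request to the render thread's Handler so that the actual switch, which creates and deletes GL programs and FBOs, runs on the thread that owns the EGL context. The structure and names in this sketch (mRenderHandler, mFilterGroup) are illustrative, not the exact CainCamera code:
public final class DrawerManager {
    private static final DrawerManager sInstance = new DrawerManager();

    private Handler mRenderHandler;            // bound to the render thread's Looper
    private BaseImageFilterGroup mFilterGroup; // the group currently being drawn

    private DrawerManager() {}

    public static DrawerManager getInstance() {
        return sInstance;
    }

    /**
     * Post the filter change to the render thread instead of touching GL here.
     */
    public void changeFilterType(final FilterType type) {
        if (mRenderHandler == null) {
            return;
        }
        mRenderHandler.post(new Runnable() {
            @Override
            public void run() {
                // Runs on the GL thread: safe to release the old filter's program
                // and create the new one before the next frame is drawn.
                mFilterGroup.changeFilter(type);
            }
        });
    }
}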
With all of that, you get a beauty camera that is loosely coupled, renders efficiently, and previews at an adequate frame rate.
For the full implementation, see my project CainCamera (GitHub: CainCamera).
Note: as of this article's publication the sticker feature is not implemented yet. Face landmark detection is already integrated via the Face++ SDK and the key points are available, but the sticker filter itself still has to be written. My plan is as follows:
The sticker feature will be made up of a JSON parser, a zip unpacker, a sticker manager and so on, covering unpacking the sticker zip, parsing its JSON, generating the textures, and the rest of the pipeline. I'll publish a follow-up article on the implementation once it's done, on the condition that the frame rate holds up and there is no memory churn or memory overflow. My goal is for CainCamera to reach the level of a commercial camera, something that could be shipped as-is.
For the camera control and rendering details, see my article:
Android預覽實時渲染的幀率優化相關 (frame-rate optimization for real-time preview rendering on Android)
For optimizing the rendering efficiency of the skin-smoothing algorithm, see my article:
Android OpenGLES 實時美顏的優化 (optimizing real-time beautification with OpenGL ES on Android)