Original article: http://blog.csdn.net/yanzi1225627/article/details/33339965
GLSurfaceView is a class from Android's OpenGL package, and it too can be used to preview the Camera; in fact it has its own unique strengths here. What strengths? When SurfaceView simply can't do the job and leaves you tearing your hair out, GLSurfaceView is the answer: it genuinely separates the camera data from its display. Once you understand it, things like running the camera preview without showing anything on screen become trivial. The stock Camera app in Android 4.0 previews with SurfaceView, 4.2 switched to GLSurfaceView, and 4.4 now uses Google's own TextureView, which gives a hint of why TextureView was introduced.
Although the Android 4.2 Camera source previews with GLSurfaceView, it is wrapped in layer upon layer of abstraction, and as an OpenGL beginner I found it impenetrable. My goal was modest: a working photo-capture demo that makes the GLSurfaceView preview flow clear. Baidu turned up nothing, and a long detour through Google didn't help either; most people use GLSurfaceView and SurfaceView together, with SurfaceView showing the preview data and a GLSurfaceView layered on top to draw overlay information. Exploring on my own, I could take pictures and get the frame data, but the screen stayed either solid white or solid black. Eventually a usable link turned up on Stack Overflow, and after another day of tweaking based on it, everything finally worked. Most of the time went into learning the basic OpenGL ES 2.0 drawing flow, which differs a bit from plain OpenGL. The source code follows.
I. CameraGLSurfaceView.java — extends GLSurfaceView and implements two interfaces (Renderer and SurfaceTexture.OnFrameAvailableListener)
```java
package org.yanzi.camera.preview;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import org.yanzi.camera.CameraInterface;

import android.content.Context;
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.GLSurfaceView.Renderer;
import android.util.AttributeSet;
import android.util.Log;

public class CameraGLSurfaceView extends GLSurfaceView implements Renderer, SurfaceTexture.OnFrameAvailableListener {

    private static final String TAG = "yanzi";
    Context mContext;
    SurfaceTexture mSurface;
    int mTextureID = -1;
    DirectDrawer mDirectDrawer;

    public CameraGLSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
        mContext = context;
        setEGLContextClientVersion(2);
        setRenderer(this);
        setRenderMode(RENDERMODE_WHEN_DIRTY);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Log.i(TAG, "onSurfaceCreated...");
        mTextureID = createTextureID();
        mSurface = new SurfaceTexture(mTextureID);
        mSurface.setOnFrameAvailableListener(this);
        mDirectDrawer = new DirectDrawer(mTextureID);
        CameraInterface.getInstance().doOpenCamera(null);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Log.i(TAG, "onSurfaceChanged...");
        GLES20.glViewport(0, 0, width, height);
        if (!CameraInterface.getInstance().isPreviewing()) {
            CameraInterface.getInstance().doStartPreview(mSurface, 1.33f);
        }
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        Log.i(TAG, "onDrawFrame...");
        GLES20.glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        mSurface.updateTexImage();
        float[] mtx = new float[16];
        mSurface.getTransformMatrix(mtx);
        mDirectDrawer.draw(mtx);
    }

    @Override
    public void onPause() {
        super.onPause();
        CameraInterface.getInstance().doStopCamera();
    }

    private int createTextureID() {
        int[] texture = new int[1];
        GLES20.glGenTextures(1, texture, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture[0]);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
        return texture[0];
    }

    public SurfaceTexture _getSurfaceTexture() {
        return mSurface;
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        Log.i(TAG, "onFrameAvailable...");
        this.requestRender();
    }
}
```
A few notes about this class:
1. The Renderer interface has three callbacks: onSurfaceCreated(), onSurfaceChanged() and onDrawFrame(). In the constructor the EGL context version is set with setEGLContextClientVersion(2); without this nothing gets drawn at all, because Android supports OpenGL ES 1.1, 2.0 and now 3.0, and the versions differ greatly, so GLSurfaceView has to be told which API version to render with. After setRenderer(this), the render mode is set to RENDERMODE_WHEN_DIRTY. This matters too; from the API docs:
When renderMode is RENDERMODE_CONTINUOUSLY, the renderer is called repeatedly to re-render the scene. When renderMode is RENDERMODE_WHEN_DIRTY, the renderer only renders when the surface is created, or when requestRender() is called. Defaults to RENDERMODE_CONTINUOUSLY.
Using RENDERMODE_WHEN_DIRTY can improve battery life and overall system performance by allowing the GPU and CPU to idle when the view does not need to be updated.
In short: RENDERMODE_CONTINUOUSLY renders all the time, while RENDERMODE_WHEN_DIRTY only renders when the surface is first created or when requestRender() is called explicitly. The default is the continuous mode; the camera clearly suits the dirty mode, since at roughly 30 frames per second we only want to render when a new frame actually arrives.
2. Precisely because the mode is RENDERMODE_WHEN_DIRTY, something has to tell the GLSurfaceView when to render, i.e. when to enter onDrawFrame(). That is exactly what the SurfaceTexture.OnFrameAvailableListener interface is for: whenever a new camera frame is available, this callback fires and calls requestRender():

```java
@Override
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    Log.i(TAG, "onFrameAvailable...");
    this.requestRender();
}
```
3. Some OpenGL ES samples on the web implement SurfaceTexture.OnFrameAvailableListener in the Activity instead. It really doesn't matter which class implements it; what matters is what the callback does.
4. Compared with TextureView: a TextureView creates its SurfaceTexture automatically because it implements SurfaceTextureListener, whereas with GLSurfaceView you have to create the SurfaceTexture yourself and bind it to a texture ID you generated.
5. This demo opens the Camera in onSurfaceCreated() and starts the preview in onSurfaceChanged(), with a default 1.33 (4:3) aspect ratio. The reason is that, compared with the two previous preview methods, the SurfaceTexture here takes some time to create. If you would rather have the Activity drive the preview start, the GLSurfaceView has to hand the SurfaceTexture it created back to the Activity, for example through a Handler; see the sketch right after this list.
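The hand-off mentioned in point 5 could look roughly like this. It is only a minimal sketch: the mainHandler field, the MSG_SURFACE_READY message code and setMainHandler() are illustrative assumptions, not part of the demo, which opens the camera directly instead.

```java
// Minimal sketch (inside CameraGLSurfaceView), assuming the Activity supplies a Handler.
private Handler mainHandler;                     // android.os.Handler, set by the Activity
private static final int MSG_SURFACE_READY = 1;  // hypothetical message code

public void setMainHandler(Handler handler) {
    mainHandler = handler;
}

@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    mTextureID = createTextureID();
    mSurface = new SurfaceTexture(mTextureID);
    mSurface.setOnFrameAvailableListener(this);
    mDirectDrawer = new DirectDrawer(mTextureID);
    if (mainHandler != null) {
        // The Activity receives the SurfaceTexture in handleMessage() and
        // decides when to call doOpenCamera()/doStartPreview() itself.
        mainHandler.obtainMessage(MSG_SURFACE_READY, mSurface).sendToTarget();
    }
}
```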
II. DirectDrawer.java — this class is crucial: it draws the SurfaceTexture's content onto the screen
```java
package org.yanzi.camera.preview;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.Matrix;

public class DirectDrawer {

    private final String vertexShaderCode =
            "attribute vec4 vPosition;" +
            "attribute vec2 inputTextureCoordinate;" +
            "varying vec2 textureCoordinate;" +
            "void main()" +
            "{" +
            "gl_Position = vPosition;" +
            "textureCoordinate = inputTextureCoordinate;" +
            "}";

    private final String fragmentShaderCode =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;" +
            "varying vec2 textureCoordinate;\n" +
            "uniform samplerExternalOES s_texture;\n" +
            "void main() {" +
            "  gl_FragColor = texture2D( s_texture, textureCoordinate );\n" +
            "}";

    private FloatBuffer vertexBuffer, textureVerticesBuffer;
    private ShortBuffer drawListBuffer;
    private final int mProgram;
    private int mPositionHandle;
    private int mTextureCoordHandle;

    private short drawOrder[] = { 0, 1, 2, 0, 2, 3 };

    private static final int COORDS_PER_VERTEX = 2;
    private final int vertexStride = COORDS_PER_VERTEX * 4;

    static float squareCoords[] = {
            -1.0f,  1.0f,
            -1.0f, -1.0f,
             1.0f, -1.0f,
             1.0f,  1.0f,
    };

    static float textureVertices[] = {
            0.0f, 1.0f,
            1.0f, 1.0f,
            1.0f, 0.0f,
            0.0f, 0.0f,
    };

    private int texture;

    public DirectDrawer(int texture) {
        this.texture = texture;

        ByteBuffer bb = ByteBuffer.allocateDirect(squareCoords.length * 4);
        bb.order(ByteOrder.nativeOrder());
        vertexBuffer = bb.asFloatBuffer();
        vertexBuffer.put(squareCoords);
        vertexBuffer.position(0);

        ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 2);
        dlb.order(ByteOrder.nativeOrder());
        drawListBuffer = dlb.asShortBuffer();
        drawListBuffer.put(drawOrder);
        drawListBuffer.position(0);

        ByteBuffer bb2 = ByteBuffer.allocateDirect(textureVertices.length * 4);
        bb2.order(ByteOrder.nativeOrder());
        textureVerticesBuffer = bb2.asFloatBuffer();
        textureVerticesBuffer.put(textureVertices);
        textureVerticesBuffer.position(0);

        int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
        int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);

        mProgram = GLES20.glCreateProgram();
        GLES20.glAttachShader(mProgram, vertexShader);
        GLES20.glAttachShader(mProgram, fragmentShader);
        GLES20.glLinkProgram(mProgram);
    }

    public void draw(float[] mtx) {
        GLES20.glUseProgram(mProgram);

        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture);

        mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
        GLES20.glEnableVertexAttribArray(mPositionHandle);
        GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
                GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);

        mTextureCoordHandle = GLES20.glGetAttribLocation(mProgram, "inputTextureCoordinate");
        GLES20.glEnableVertexAttribArray(mTextureCoordHandle);
        GLES20.glVertexAttribPointer(mTextureCoordHandle, COORDS_PER_VERTEX,
                GLES20.GL_FLOAT, false, vertexStride, textureVerticesBuffer);

        GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length,
                GLES20.GL_UNSIGNED_SHORT, drawListBuffer);

        GLES20.glDisableVertexAttribArray(mPositionHandle);
        GLES20.glDisableVertexAttribArray(mTextureCoordHandle);
    }

    private int loadShader(int type, String shaderCode) {
        int shader = GLES20.glCreateShader(type);
        GLES20.glShaderSource(shader, shaderCode);
        GLES20.glCompileShader(shader);
        return shader;
    }

    private float[] transformTextureCoordinates(float[] coords, float[] matrix) {
        float[] result = new float[coords.length];
        float[] vt = new float[4];
        for (int i = 0; i < coords.length; i += 2) {
            float[] v = { coords[i], coords[i + 1], 0, 1 };
            Matrix.multiplyMV(vt, 0, matrix, 0, v, 0);
            result[i] = vt[0];
            result[i + 1] = vt[1];
        }
        return result;
    }
}
```
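One caveat worth adding about DirectDrawer: loadShader() never checks whether the shader actually compiled, and the program link isn't checked either, so a shader typo silently produces a blank preview. A hedged sketch of the usual defensive version of loadShader() (not part of the original demo; it would also need android.util.Log imported):

```java
private int loadShader(int type, String shaderCode) {
    int shader = GLES20.glCreateShader(type);
    GLES20.glShaderSource(shader, shaderCode);
    GLES20.glCompileShader(shader);
    // Check the compile status so shader errors show up in logcat
    // instead of silently yielding a white/black screen.
    int[] compiled = new int[1];
    GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
    if (compiled[0] == 0) {
        Log.e("DirectDrawer", "Could not compile shader " + type + ": "
                + GLES20.glGetShaderInfoLog(shader));
        GLES20.glDeleteShader(shader);
        shader = 0;
    }
    return shader;
}
```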
III. With the two classes above, 95% of the work is done. Think of the GLSurfaceView as having a lifecycle of its own: the Camera is closed in its onPause(), and the Activity overrides two methods:
```java
@Override
protected void onResume() {
    super.onResume();
    glSurfaceView.bringToFront();
}

@Override
protected void onPause() {
    super.onPause();
    glSurfaceView.onPause();
}
```
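One detail the demo skips: the GLSurfaceView documentation asks clients to forward the Activity's onResume() to the view as well, mirroring the glSurfaceView.onPause() call above, so the render thread and EGL context are recreated after a pause. A small sketch of that addition (not in the original demo):

```java
@Override
protected void onResume() {
    super.onResume();
    glSurfaceView.onResume();      // restart the render thread / EGL context
    glSurfaceView.bringToFront();
}
```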
The glSurfaceView.bringToFront() call is actually optional. Then just declare the custom GLSurfaceView in the layout and you're done:
```xml
<FrameLayout
    android:layout_width="wrap_content"
    android:layout_height="wrap_content" >

    <org.yanzi.camera.preview.CameraGLSurfaceView
        android:id="@+id/camera_textureview"
        android:layout_width="0dip"
        android:layout_height="0dip" />
</FrameLayout>
```
CameraActivity handles only the UI; CameraGLSurfaceView opens the Camera, starts the preview, and calls DirectDrawer's draw() to render. The rest of the code is not shown here; a rough sketch of what the omitted CameraInterface presumably looks like follows.
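CameraInterface is part of the omitted code, so the following is only a hedged guess at its core, reconstructed from the calls made in CameraGLSurfaceView (the callback parameter type and internal details are assumptions). The essential line is Camera.setPreviewTexture(), which is what routes camera frames into the SurfaceTexture created in onSurfaceCreated():

```java
package org.yanzi.camera;

import java.io.IOException;

import android.graphics.SurfaceTexture;
import android.hardware.Camera;

public class CameraInterface {
    private static CameraInterface instance;
    private Camera camera;
    private boolean previewing = false;

    public static synchronized CameraInterface getInstance() {
        if (instance == null) instance = new CameraInterface();
        return instance;
    }

    public void doOpenCamera(Object callback) {   // callback type is a guess
        camera = Camera.open();                   // default back-facing camera
    }

    public void doStartPreview(SurfaceTexture surface, float previewRate) {
        if (camera == null || previewing) return;
        try {
            // Key step: camera frames are delivered into the SurfaceTexture
            // that is bound to the OES texture, not onto a Surface.
            camera.setPreviewTexture(surface);
        } catch (IOException e) {
            e.printStackTrace();
        }
        // previewRate (e.g. 1.33f) would be used to pick a matching preview size.
        camera.startPreview();
        previewing = true;
    }

    public boolean isPreviewing() {
        return previewing;
    }

    public void doStopCamera() {
        if (camera != null) {
            camera.stopPreview();
            camera.release();
            camera = null;
            previewing = false;
        }
    }
}
```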
A few caveats:
1. In onDrawFrame(), if you don't call mDirectDrawer.draw(mtx), absolutely nothing shows up! This is what makes GLSurfaceView special. Why? Because GLSurfaceView isn't "Android's own child" the way SurfaceView and TextureView are: the framework won't composite the preview for you, so you have to draw each frame yourself following the OpenGL ES pipeline.
2. Where exactly mDirectDrawer.draw(mtx) gets its pixel buffer from, I honestly haven't fully figured out; it never seems to request a buffer explicitly. The key appears to be the texture ID generated before the SurfaceTexture was created in CameraGLSurfaceView: that ID is bound to the SurfaceTexture on one side and handed to DirectDrawer on the other, with the SurfaceTexture acting as the carrier for the rendered camera frames.
3. In the referenced link, someone trying to solve a problem posted the following three snippets:
```java
@Override
public void onDrawFrame(GL10 gl) {
    float[] mtx = new float[16];
    mSurface.updateTexImage();
    mSurface.getTransformMatrix(mtx);
    mDirectVideo.draw(mtx);
}

private float[] transformTextureCoordinates(float[] coords, float[] matrix) {
    float[] result = new float[coords.length];
    float[] vt = new float[4];
    for (int i = 0; i < coords.length; i += 2) {
        float[] v = { coords[i], coords[i + 1], 0, 1 };
        Matrix.multiplyMV(vt, 0, matrix, 0, v, 0);
        result[i] = vt[0];
        result[i + 1] = vt[1];
    }
    return result;
}

textureVerticesBuffer.clear();
textureVerticesBuffer.put(transformTextureCoordinates(textureVertices, mtx));
textureVerticesBuffer.position(0);
```
I have folded all of this code into the demo, but draw() doesn't actually use it: with it enabled the preview came out distorted, and without it everything looks fine. The idea of the snippets above is to obtain the SurfaceTexture's transform matrix via mSurface.getTransformMatrix(), pass that matrix into draw(), apply it to textureVerticesBuffer, and only then draw. The sketch below shows where that would plug in.
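Putting those pieces together, this is roughly where the transform would slot into the draw() from Part II (the path exists in the demo but is deliberately left unused):

```java
public void draw(float[] mtx) {
    // Apply the SurfaceTexture transform to the texture coordinates and
    // re-upload them before drawing (disabled in the demo because the
    // resulting preview looked distorted).
    float[] transformed = transformTextureCoordinates(textureVertices, mtx);
    textureVerticesBuffer.clear();
    textureVerticesBuffer.put(transformed);
    textureVerticesBuffer.position(0);

    // ...then continue with glUseProgram(), the attribute setup and
    // glDrawElements() exactly as in the draw() shown in Part II.
}
```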
The first screenshot shows the preview without the matrix transform; the second shows it with the transform applied. The picture gets warped in a way that is hard to describe, but it illustrates how powerful OpenGL ES rendering is: just by setting a matrix, with no per-frame processing, you get a completely different on-screen result.
----------------------------- This article is original work; please credit the author yanzi1225627 when reposting.
Version: PlayCamera_V3.0.0[2014-6-22].zip
CSDN download link: http://download.csdn.net/detail/yanzi1225627/7547263
Baidu cloud drive:
Attached, a concise OpenGL ES tutorial: http://www.apkbus.com/android-20427-1-1.html
[Repost] Mastering Android Camera Development (3): Previewing the Camera with GLSurfaceView — a basic photo-capture demo (a first on the Chinese web)
Repost source: http://www.cnblogs.com/tc310/p/5258227.html