The previous two articles covered how to drive input on an Android device; this one covers how to get output back, again through adb.
How it works
AdbClient and AdbServer both run on the PC, while AdbDaemon runs on the Android device. So what exactly is the framebuffer?
The frame buffer is a core concept of the Linux video subsystem, so let's first look at what it does.
Because video adapters can be built on different hardware architectures, the upper kernel layers and application code could end up implemented differently for each video card, which would mean a different solution for every card in use. The resulting poor portability and redundant code would demand a lot of effort and maintenance overhead. The frame buffer concept solves this problem: it provides a general abstraction and defines a programming interface, so developers can write application-level and upper-kernel-level code in a platform-independent way. The kernel's frame buffer interface therefore shields applications from changes in the underlying graphics hardware; as long as the application and the display driver follow the frame buffer interface, the application can run on different kinds of video hardware without modification. (reference link)
To put it simply: it holds the screen image data for the current frame and a few previous frames (the color of each pixel, the resolution, and so on).
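Incidentally, on a rooted device you can poke at this kernel interface yourself. Here is a minimal sketch of mine (unrelated to adb; it assumes the framebuffer node is /dev/graphics/fb0, which varies between devices and normally requires root) that queries the screen geometry through the kernel framebuffer API.

//minimal sketch of the kernel framebuffer interface mentioned above
#include <fcntl.h>
#include <linux/fb.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/graphics/fb0", O_RDONLY);  //device-dependent path
    if (fd < 0) { perror("open"); return 1; }

    struct fb_var_screeninfo vinfo;
    if (ioctl(fd, FBIOGET_VSCREENINFO, &vinfo) == 0)
        printf("%ux%u, %u bpp\n", vinfo.xres, vinfo.yres, vinfo.bits_per_pixel);

    close(fd);
    return 0;
}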
Java implementation
Let's look at how the image data for each frame is defined and retrieved in the Android source.
//ddmlib/src/com/android/ddmlib/Device.java
public RawImage getScreenshot()
        throws TimeoutException, AdbCommandRejectedException, IOException {
    return AdbHelper.getFrameBuffer(AndroidDebugBridge.getSocketAddress(), this);
}
//ddmlib/src/com/android/ddmlib/AdbHelper.java
/**
 * Retrieve the frame buffer from the device.
 * @throws TimeoutException in case of timeout on the connection.
 * @throws AdbCommandRejectedException if adb rejects the command
 * @throws IOException in case of I/O error on the connection.
 */
static RawImage getFrameBuffer(InetSocketAddress adbSockAddr, Device device)
        throws TimeoutException, AdbCommandRejectedException, IOException {
    RawImage imageParams = new RawImage();
    byte[] request = formAdbRequest("framebuffer:"); //$NON-NLS-1$
    byte[] nudge = {
        0
    };
    byte[] reply;
    SocketChannel adbChan = null;
    try {
        adbChan = SocketChannel.open(adbSockAddr);
        adbChan.configureBlocking(false);
        // if the device is not -1, then we first tell adb we're looking to talk
        // to a specific device
        setDevice(adbChan, device);
        write(adbChan, request);
        AdbResponse resp = readAdbResponse(adbChan, false /* readDiagString */);
        if (resp.okay == false) {
            throw new AdbCommandRejectedException(resp.message);
        }
        // first the protocol version.
        reply = new byte[4];
        read(adbChan, reply);
        ByteBuffer buf = ByteBuffer.wrap(reply);
        buf.order(ByteOrder.LITTLE_ENDIAN);
        int version = buf.getInt();
        // get the header size (this is a count of int)
        int headerSize = RawImage.getHeaderSize(version);
        // read the header
        reply = new byte[headerSize * 4];
        read(adbChan, reply);
        buf = ByteBuffer.wrap(reply);
        buf.order(ByteOrder.LITTLE_ENDIAN);
        // fill the RawImage with the header
        if (imageParams.readHeader(version, buf) == false) {
            Log.e("Screenshot", "Unsupported protocol: " + version);
            return null;
        }
        Log.d("ddms", "image params: bpp=" + imageParams.bpp + ", size="
                + imageParams.size + ", width=" + imageParams.width
                + ", height=" + imageParams.height);
        write(adbChan, nudge);
        reply = new byte[imageParams.size];
        read(adbChan, reply);
        imageParams.data = reply;
    } finally {
        if (adbChan != null) {
            adbChan.close();
        }
    }
    return imageParams;
}
The two snippets above come from the ddmlib library; they show how AdbClient has to talk to AdbServer to fetch the current framebuffer frame. Note that none of the tools currently bundled with adb expose this feature. If you want it, it's simple: set up a Java program, reference ddmlib and a few other libraries, and call getScreenshot(). How to call it? See 《Android Adb調(diào)試功能漫談》.
buf.order(ByteOrder.LITTLE_ENDIAN);
Note the line above: it touches on the big-endian vs. little-endian issue. For details, see 大字節(jié)序與小字節(jié)序詳解.
Since most consumer CPUs today are little-endian, I did not implement this part when porting the logic into C for adb.exe. If you are interested, feel free to implement the byte-order conversion functions yourself; there is also some reference code online.
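For what it's worth, here is a minimal sketch of how that conversion could look (my own illustration, not part of the adb.exe source): decode the little-endian wire values byte by byte instead of casting the buffer, so the result is correct on any host byte order.

#include <stdint.h>

/* Hypothetical helper, not in adb.exe: decode a 32-bit little-endian value
 * from the wire without relying on the host CPU's byte order. */
static uint32_t le32_to_host(const unsigned char* p)
{
    return (uint32_t)p[0]
         | ((uint32_t)p[1] << 8)
         | ((uint32_t)p[2] << 16)
         | ((uint32_t)p[3] << 24);
}

/* usage: instead of  version = *(int*)(buf);  write
 *        version = (int)le32_to_host((const unsigned char*)buf);  */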
adb.exe implementation
This is just a port of the code above into C, so without further ado, here it is:
//outScreen is allocated inside this function; the caller must free it when
//done (the protocol carries the buffer length)
int getScreen(char* inSerial, unsigned char** outScreen, int* outWidth, int* outHeight)
{
    int fd = 0;
    int rel = 0;
    int dbgRel = 0;
    char buf[10240];
    int version = -1;
    int headerSize = 0;
    unsigned char* retScreen;
    int* cousor;
    unsigned char* cCousor;
    unsigned char nudge[] = {0};
    //image metadata
    int bpp;
    int size;
    int width;
    int height;
    int red_offset;
    int red_length;
    int blue_offset;
    int blue_length;
    int green_offset;
    int green_length;
    int alpha_offset;
    int alpha_length;
    transport_type ttype = kTransportAny;
    int server_port = DEFAULT_ADB_PORT;
    adb_set_transport(ttype, inSerial);
    adb_set_tcp_specifics(server_port);
    fd = adb_connect("framebuffer:");
    if(fd < 0) {
        return 0;
        //read_finished(fd);
        //adb_close(fd);
    }
    //start receiving data
    if (readx(fd,buf,4) == 0){
        version = *(int*)(buf);
        //printf("version:%d\n",version);
    }else{
        fprintf(stderr,"err recv size");
        return 0;
    }
    switch (version)
    {
    case 1:
        headerSize = 12;
        break;
    case 16:
        headerSize = 3;
        break;
    default:
        break;
    }
    if (headerSize == 0)
    {
        adb_close(fd);
        return 0;
    }
    retScreen = (unsigned char*)malloc(headerSize*4);
    if (readx(fd,retScreen,headerSize*4) != 0)
    {
        fprintf(stderr,"err recv size");
        adb_close(fd);
        return 0;
    }
    cousor = (int*)retScreen;
    if (version == 1)
    {
        bpp = *cousor;cousor++;
        size = *cousor;cousor++;
        width = *cousor;cousor++;
        height = *cousor;cousor++;
        red_offset = *cousor;cousor++;
        red_length = *cousor;cousor++;
        blue_offset = *cousor;cousor++;
        blue_length = *cousor;cousor++;
        green_offset = *cousor;cousor++;
        green_length = *cousor;cousor++;
        alpha_offset = *cousor;cousor++;
        alpha_length = *cousor;
    }
    else if(version == 16)
    {
        bpp = 16;
        // read actual values.
        size = *cousor;cousor++;
        width = *cousor;cousor++;
        height = *cousor;cousor++;
        // create default values for the rest. Format is 565
        red_offset = 11;
        red_length = 5;
        green_offset = 5;
        green_length = 6;
        blue_offset = 0;
        blue_length = 5;
        alpha_offset = 0;
        alpha_length = 0;
    }
    else
    {
        fprintf(stderr,"unsupported version");
    }
    //printf("bpp:%d width:%d high:%d size:%d\n",bpp,width,height,size);
    adb_write(fd,nudge,sizeof nudge);
    retScreen = (unsigned char*)realloc(retScreen,size);
    if (readx(fd,retScreen,size) != 0)
    {
        fprintf(stderr,"err recv size should be:%d\n",size);
        adb_close(fd);
        return 0;
    }
    *outScreen = retScreen;
    *outWidth = width;
    *outHeight = height;
    //write png
    //write_PNG(retScreen,"d:\\a.png",width,height);
    //free(retScreen);
    adb_close(fd);
    return 1;
}
With this function you can pull the current framebuffer frame over and process it. C code? How do you use it? See 《讓Adb.exe支持Monkey》.
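For illustration, a caller might look something like this (my own sketch, not from the adb.exe source; the serial number is a placeholder). Note that it frees the buffer returned through outScreen, as the comment above requires.

//hypothetical caller of getScreen(); the serial number is a placeholder
char serial[] = "0123456789ABCDEF";
unsigned char* frame = NULL;
int width = 0;
int height = 0;
if (getScreen(serial, &frame, &width, &height)) {
    printf("got %dx%d framebuffer\n", width, height);
    //... hand the buffer to the processing code in the next section ...
    free(frame);   //the buffer was malloc'd/realloc'd inside getScreen
} else {
    fprintf(stderr, "failed to fetch framebuffer\n");
}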
framebuffer format
So once you have pulled the framebuffer down, how do you process it? That is the question.
Processing the byte stream directly
typedef struct RGBA
{
    RGBA()
    {
        R = 0;
        G = 0;
        B = 0;
        A = 0;
    }
    RGBA(BYTE pR, BYTE pG, BYTE pB, BYTE pA)
    {
        R = pR;
        G = pG;
        B = pB;
        A = pA;
    }
    BYTE R;
    BYTE G;
    BYTE B;
    BYTE A;
}*PRGBA;
This is the RGBA information for one pixel. A is the alpha channel, which relates to transparency; RGB needs no explanation. Each pixel takes 4 bytes, in the order R, G, B, A. The whole byte stream is effectively a two-dimensional array: take the width and height from the framebuffer's metadata (the header), and you can then read the information of every pixel precisely.
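For example, here is a minimal accessor of mine (assuming the 32-bit RGBA layout just described; frame, width and height would come from getScreen() above) that reads the pixel at column x, row y.

//hypothetical accessor for a 32-bit RGBA framebuffer
RGBA getPixel(const unsigned char* frame, int width, int x, int y)
{
    const unsigned char* p = frame + (y * width + x) * 4;  //4 bytes per pixel, row-major
    return RGBA(p[0], p[1], p[2], p[3]);                   //R, G, B, A in that order
}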
Processing with OpenCV
//inputs: framebuffer, frameWidth, frameHeight
...
try
{
    Size size(frameWidth, frameHeight);
    //constructing the Mat this way does not copy the data, it only creates a header
    Mat ref(size, CV_8UC4, framebuffer);
    cvtColor(ref, ref, CV_BGRA2RGB);
    ret = ...(function)(Mat ref)
}
catch (const Exception& e)
{
    fprintf(stderr, "Function execute err:%s\n", e.what());
    ret = false;
}
if (freeFramebuffer)
    free(framebuffer);
...
If you are still not using opencv for image processing these days, you are a bit behind the times :)
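As a trivial example of what that processing function could be (my own sketch, not from this project; it assumes the RGBA byte order described earlier), once the data is wrapped in a Mat you can hand it to any OpenCV routine, for instance writing it straight to a PNG, much like the commented-out write_PNG call earlier.

//hypothetical helper: wrap the framebuffer and write it out with OpenCV
#include <opencv2/opencv.hpp>

bool saveFrameAsPng(unsigned char* framebuffer, int frameWidth, int frameHeight,
                    const char* path)
{
    cv::Mat ref(cv::Size(frameWidth, frameHeight), CV_8UC4, framebuffer); //no copy
    cv::Mat bgr;
    cv::cvtColor(ref, bgr, cv::COLOR_RGBA2BGR);  //imwrite expects BGR channel order
    return cv::imwrite(path, bgr);               //format chosen by the file extension
}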
Postscript
In my testing, the framebuffer is uncompressed, and on high-resolution devices the frame you get is very large, so transferring it from the device to AdbClient can take quite a while; I have seen around 5 seconds. If you want to push this to the limit, you could build a program on the Android side that grabs the framebuffer first, compresses it, and then sends it to AdbClient, which would be much faster. Anyone interested is welcome to get in touch.
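To give a rough idea, the device-side compression step could be as simple as the following (just a sketch assuming zlib is available on the device; none of this is from this article's code). The receiver can uncompress into a buffer of the original size, which it already knows from the framebuffer header.

//hypothetical device-side step: deflate the raw frame before sending it
#include <zlib.h>
#include <vector>

std::vector<unsigned char> compressFrame(const unsigned char* frame, size_t size)
{
    uLongf destLen = compressBound(size);
    std::vector<unsigned char> out(destLen);
    if (compress(out.data(), &destLen, frame, size) != Z_OK)
        out.clear();            //compression failed; caller can fall back to raw data
    else
        out.resize(destLen);    //destLen now holds the actual compressed length
    return out;
}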
Now that both the mechanics and the format of the framebuffer are clear, there is plenty of room to get creative. What will the next article be? Some processing with opencv? Or something fun with tesseract?