I've recently been working on a custom camera demo. The requirement: point the camera at a scene, and once auto-focus succeeds, automatically capture a photo.
- (void)initAVCaptureSession {
    self.session = [[AVCaptureSession alloc] init];
    NSError *error;
    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [self.device addObserver:self
                  forKeyPath:@"adjustingFocus"
                     options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
                     context:nil];
    // The device must be locked before changing its configuration and
    // unlocked afterwards, otherwise the app crashes.
    if ([self.device lockForConfiguration:&error]) {
        // Flash could be configured here, e.g.:
        // [self.device setFlashMode:AVCaptureFlashModeAuto];
        [self.device unlockForConfiguration];
    }
    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:self.device error:&error];
    if (error) {
        NSLog(@"%@", error);
    }
    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    // Output settings: AVVideoCodecJPEG produces JPEG images.
    NSDictionary *outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
    [self.stillImageOutput setOutputSettings:outputSettings];
    if ([self.session canAddInput:self.videoInput]) {
        [self.session addInput:self.videoInput];
    }
    if ([self.session canAddOutput:self.stillImageOutput]) {
        [self.session addOutput:self.stillImageOutput];
    }
    self.session.sessionPreset = AVCaptureSessionPresetHigh;
    // Initialize the preview layer. CALayer work must stay on the main thread,
    // so (assuming this method is called on the main thread) set it up directly
    // instead of hopping through a background queue.
    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    self.previewLayer.frame = CGRectMake(0, 0, kScreenWidth, kScreenHeight);
    self.cameraView.layer.masksToBounds = YES;
    [self.cameraView.layer addSublayer:self.previewLayer];
}
Auto-focus detection is implemented with KVO; the method above registers an observer:
[self.device addObserver:self
forKeyPath:@"adjustingFocus"
options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
context:nil];
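As a side note, a common hardening of the registration above (an addition of mine, not in the original code) is to pass a private context pointer instead of `nil`, so the callback can tell its own observation apart from any a superclass might have registered:

```objc
// File-private context token; any unique address works.
static void *kAdjustingFocusContext = &kAdjustingFocusContext;

// Registration:
[self.device addObserver:self
              forKeyPath:@"adjustingFocus"
                 options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
                 context:kAdjustingFocusContext];

// In the callback, forward anything that is not ours to super:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (context != kAdjustingFocusContext) {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        return;
    }
    // ... handle adjustingFocus here ...
}
```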
Here is the KVO callback for auto-focus:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"adjustingFocus"]) {
        BOOL adjustingFocus = [change[NSKeyValueChangeNewKey] boolValue];
        if (!adjustingFocus) {
            // Focus has just finished adjusting: pause the timer and capture.
            [timer setFireDate:[NSDate distantFuture]];
            [self.device removeObserver:self forKeyPath:@"adjustingFocus"];
            AVCaptureConnection *stillImageConnection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
            UIDeviceOrientation curDeviceOrientation = [[UIDevice currentDevice] orientation];
            AVCaptureVideoOrientation avcaptureOrientation = [self avOrientationForDeviceOrientation:curDeviceOrientation];
            [stillImageConnection setVideoOrientation:avcaptureOrientation];
            [stillImageConnection setVideoScaleAndCropFactor:self.effectiveScale];
            WEAKSELF
            [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                if (error || imageDataSampleBuffer == NULL) {
                    // Bail out early instead of touching an empty buffer.
                    GPLog(@"The captured photo contains no data!");
                    return;
                }
                NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                ALAuthorizationStatus author = [ALAssetsLibrary authorizationStatus];
                if (author == ALAuthorizationStatusRestricted || author == ALAuthorizationStatusDenied) {
                    // No photo-library permission.
                    [weakSelf showHUDText:@"No permission!"];
                    return;
                }
                UIImage *key_image = [UIImage imageWithData:jpegData];
                if (!weakSelf.isPacket) {
                    // UI updates must run on the main queue.
                    dispatch_async(dispatch_get_main_queue(), ^{
                        weakSelf.cameraImg.hidden = NO;
                        weakSelf.cameraImg.image = [UIImage scaleImage:key_image WithSize:CGSizeMake(480, 480)];
                        weakSelf.cameraObj = key_image;
                        weakSelf.packetCannelBtn.hidden = NO;
                        weakSelf.packetHereBtn.hidden = NO;
                    });
                } else {
                    NSURL *key_file = [NSURL URLWithString:weakSelf.storeModel.key_file_url];
                    [[SDWebImageManager sharedManager] loadImageWithURL:key_file options:0 progress:nil completed:^(UIImage * _Nullable image, NSData * _Nullable data, NSError * _Nullable error, SDImageCacheType cacheType, BOOL finished, NSURL * _Nullable imageURL) {
                        double key_image_double = [GetSimilarity getSimilarityValueWithImgA:image ImgB:key_image];
                        if (key_image_double >= 0.75) {
                            [weakSelf performSegueWithIdentifier:@"PacketDetailVC" sender:weakSelf.storeModel];
                        } else {
                            // Not similar enough: re-register the observer and try again.
                            [weakSelf.device addObserver:weakSelf
                                              forKeyPath:@"adjustingFocus"
                                                 options:NSKeyValueObservingOptionNew
                                                 context:nil];
                        }
                    }];
                }
            }];
        }
    }
}
First, a word about the demo's requirements. The goal is essentially an app similar to Alipay's AR red packet feature (the newest Alipay versions no longer expose it). The flow: the camera auto-focuses and captures a photo, the user "hides" a red packet, picture, or video at that spot, and everything is uploaded to the server. When someone nearby sees that a red packet, picture, or video is hidden at the current location, they use the same auto-focus capture to take a photo, which is compared against the one previously uploaded to the server; if the two images match, the red packet, picture, or video is unlocked. That is the whole demo.
On an iPhone 5s (my own phone) this code works without any problem, but on an iPhone 7 it simply does not. On the iPhone 7, the first auto-focus completes very quickly and a photo is taken automatically, but that photo is badly blurred or even pure black, so it has to be retaken. In the code that runs after auto-focus succeeds, I call:
[self.device removeObserver:self forKeyPath:@"adjustingFocus"];
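One failure mode worth ruling out (an assumption on my part, not something confirmed by the original code) is an unbalanced add/remove of this observer: removing twice crashes, and adding twice makes the callback fire more than once per focus event. A small guard flag makes the pairing explicit; `observingFocus` is a hypothetical `BOOL` property added for this sketch:

```objc
- (void)startObservingFocus {
    if (self.observingFocus) { return; }   // already registered; adding again would double-fire
    self.observingFocus = YES;
    [self.device addObserver:self
                  forKeyPath:@"adjustingFocus"
                     options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
                     context:nil];
}

- (void)stopObservingFocus {
    if (!self.observingFocus) { return; }  // not registered; removing would crash
    self.observingFocus = NO;
    [self.device removeObserver:self forKeyPath:@"adjustingFocus"];
}
```

Routing every add/remove through these two methods guarantees the calls stay balanced no matter how often the retake path runs.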
The following method handles the "retake" button and re-registers the auto-focus KVO:
- (IBAction)actionForPacketCannel:(id)sender {
    [self addTimer];
    self.cameraImg.hidden = YES;
    self.packetHereBtn.hidden = YES;
    self.packetCannelBtn.hidden = YES;
    [self.device addObserver:self
                  forKeyPath:@"adjustingFocus"
                     options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
                     context:nil];
}
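On a device where focus never restarts on its own, it may also help to nudge the hardware explicitly when the retake button is tapped, by resetting the focus (and exposure) point of interest. This is a sketch under the assumption that such a one-shot trigger addresses the stuck-focus symptom; it is not something tested in this demo. The capability checks guard against devices that don't support a given mode:

```objc
- (void)triggerAutoFocusAtCenter {
    NSError *error = nil;
    // Configuration changes require the lock, just like in initAVCaptureSession.
    if (![self.device lockForConfiguration:&error]) {
        NSLog(@"lockForConfiguration failed: %@", error);
        return;
    }
    // Points of interest use a normalized 0..1 coordinate space.
    CGPoint center = CGPointMake(0.5, 0.5);
    if (self.device.focusPointOfInterestSupported &&
        [self.device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        self.device.focusPointOfInterest = center;
        self.device.focusMode = AVCaptureFocusModeAutoFocus;  // one-shot focus run
    }
    if (self.device.exposurePointOfInterestSupported &&
        [self.device isExposureModeSupported:AVCaptureExposureModeAutoExpose]) {
        self.device.exposurePointOfInterest = center;
        self.device.exposureMode = AVCaptureExposureModeAutoExpose;
    }
    [self.device unlockForConfiguration];
}
```

Calling this at the end of `actionForPacketCannel:` would force a fresh focus pass, which the `adjustingFocus` observer then sees complete.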
But here is the current problem: on the iPhone 7, shortly after entering the view controller, focus "succeeds" automatically and a photo is taken, but compared with the iPhone 5s the result is far worse; it hardly looks like a focused shot, and sometimes it is even completely black. Since that is not the photo I want, I tap "retake", the KVO is re-added, and the camera starts focusing again, but this time focus never succeeds, no matter what! Only if I point the camera at the floor, or bring it within a few centimeters of an object, does focus succeed, and even then the photo comes out blurry. Why? Why do the iPhone 5s and iPhone 7 behave so differently? Is there something different about the iPhone 7's camera? Any pointers would be much appreciated, thanks!