ARKit feature demos

  1. ARKit: tap the screen to add text
  2. ARKit: tap the screen to add a 3D model
  3. ARKit: automatically add a 3D model when a plane is detected
  4. The simplest use of Quick Look
  5. ARKit face texture mapping
  6. ARKit smile detection
  7. ARKit frown detection
  8. ARKit face parameters: BlendShapes explained
  9. Demo

1. ARKit: tap the screen to add text

(GIF: tapping the screen to add text)
  • Create a new project with Command+Shift+N and choose the Augmented Reality App template

  • For Content Technology, choose SpriteKit

  • Control how far the text sits from the camera (change this z value and watch the effect):

// Place the content 1 meter in front of the camera (in camera space, negative z points forward)
matrix_float4x4 translation = matrix_identity_float4x4;
translation.columns[3].z = -1;
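
In the SpriteKit template this translation is multiplied into the current camera transform to build an ARAnchor, and the ARSKView delegate then supplies a SpriteKit node for that anchor. A minimal sketch close to the Xcode template (the label text is a placeholder; touchesBegan lives in the SKScene subclass, the delegate method in the view controller):

// In the SKScene subclass: anchor a point 1 meter in front of the camera on every tap
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    ARSKView *sceneView = (ARSKView *)self.view;
    ARFrame *currentFrame = sceneView.session.currentFrame;
    if (!currentFrame) return;
    
    matrix_float4x4 translation = matrix_identity_float4x4;
    translation.columns[3].z = -1;
    matrix_float4x4 transform = matrix_multiply(currentFrame.camera.transform, translation);
    
    ARAnchor *anchor = [[ARAnchor alloc] initWithTransform:transform];
    [sceneView.session addAnchor:anchor];
}

// In the view controller (ARSKViewDelegate): return the 2D node to display for each anchor
- (SKNode *)view:(ARSKView *)view nodeForAnchor:(ARAnchor *)anchor {
    SKLabelNode *labelNode = [SKLabelNode labelNodeWithText:@"Hello ARKit"];
    labelNode.horizontalAlignmentMode = SKLabelHorizontalAlignmentModeCenter;
    labelNode.verticalAlignmentMode = SKLabelVerticalAlignmentModeCenter;
    return labelNode;
}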

2. ARKit: tap the screen to add a 3D model

(GIF: tapping the screen to add a 3D model)

2.1 Scene capture

Three classes do most of the work:

  • ARSCNView: displays the rendered scene
  • ARConfiguration: describes what the session should capture and track
    • ARWorldTrackingConfiguration: rear camera
    • ARFaceTrackingConfiguration: front camera, tracks facial expressions in real time
  • ARSession: relays the data between the capture and the view

Initialize the resources in viewDidLoad:

self.arSCNView = [[ARSCNView alloc] initWithFrame:self.view.bounds options:nil];
self.arSCNView.session = [[ARSession alloc] init];
// 1. Create the world-tracking configuration (requires an A9 chip or newer, i.e. iPhone 6s and up)
self.arWordTrackingConfiguration = [[ARWorldTrackingConfiguration alloc] init];
// 2. Detect horizontal planes and enable light estimation
self.arWordTrackingConfiguration.planeDetection = ARPlaneDetectionHorizontal;
self.arWordTrackingConfiguration.lightEstimationEnabled = YES;

Start the session in viewDidAppear:

[self.arSession runWithConfiguration:self.arWordTrackingConfiguration];
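
The counterpart, pausing the session when the view goes away, is worth adding as well; a small sketch assuming the same arSession property:

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    // Stop camera capture and tracking while the view is off screen
    [self.arSession pause];
}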

2.2 Tap to add a 3D model

When the screen is tapped, load an .scn file and add its node as a child of self.arSCNView.scene.rootNode:

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    // 1. Load the .scn file into a scene
    SCNScene *scene = [SCNScene sceneNamed:@"art.scnassets/ship.scn"];
    // 2. Grab the ship node and place it slightly below and in front of the world origin
    SCNNode *shipNode = scene.rootNode.childNodes.firstObject;
    shipNode.position = SCNVector3Make(0, -1, -1);
    
    [self.arSCNView.scene.rootNode addChildNode:shipNode];
}
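
The snippet above always drops the ship at a fixed offset from the world origin. A common refinement is to hit-test the tapped point and place the node wherever ARKit finds a plane or feature point; a sketch under the same arSCNView property (hitTest:types: is the hit-testing API of this ARKit generation):

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    CGPoint point = [touches.anyObject locationInView:self.arSCNView];
    // Ask ARKit what lies along the ray through the tapped pixel
    NSArray<ARHitTestResult *> *results =
        [self.arSCNView hitTest:point
                          types:ARHitTestResultTypeExistingPlaneUsingExtent | ARHitTestResultTypeFeaturePoint];
    ARHitTestResult *hit = results.firstObject;
    if (!hit) return;
    
    SCNScene *scene = [SCNScene sceneNamed:@"art.scnassets/ship.scn"];
    SCNNode *shipNode = scene.rootNode.childNodes.firstObject;
    // The last column of worldTransform is the hit position in world coordinates
    shipNode.position = SCNVector3Make(hit.worldTransform.columns[3].x,
                                       hit.worldTransform.columns[3].y,
                                       hit.worldTransform.columns[3].z);
    [self.arSCNView.scene.rootNode addChildNode:shipNode];
}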

3. ARKit: automatically add a 3D model when a plane is detected

(GIF: a 3D model is added when a plane is detected)

The setup is the same as in 2.1, except that self.arSCNView.delegate = self is also set.
Then implement the delegate method renderer:didAddNode:forAnchor: as follows:

#pragma mark - ARSCNViewDelegate
// Called when a node is added. With plane detection enabled, ARKit automatically adds a node for every plane it finds.
- (void)renderer:(id<SCNSceneRenderer>)renderer didAddNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    
    if (![anchor isMemberOfClass:[ARPlaneAnchor class]]) return;
    
    // Add a flat 3D model. ARKit only detects the plane; the anchor is just a position in space.
    // To actually see that space, we attach a flat 3D model to the node and render it.
    // 1. Get the detected plane anchor
    ARPlaneAnchor *planeAnchor = (ARPlaneAnchor *)anchor;
    // 2. Create a 3D model. The detected plane is an irregular rectangle; here it is turned into a square and scaled down.
    // The parameters are width, height, length and chamfer radius.
    SCNBox *planeBox = [SCNBox boxWithWidth:planeAnchor.extent.x * 0.3 height:0 length:planeAnchor.extent.x * 0.3 chamferRadius:0];
    // 3. Render the model with a material (the default model is white)
    planeBox.firstMaterial.diffuse.contents = [UIColor clearColor];
    // 4. Create a node from the geometry
    SCNNode *planeNode = [SCNNode nodeWithGeometry:planeBox];
    // 5. Position the node at the center of the detected plane anchor.
    // In SceneKit, a node's position is an SCNVector3 in a 3D coordinate system.
    planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);
    
    [node addChildNode:planeNode];
    
    // 6. Create a vase scene
    SCNScene *scene = [SCNScene sceneNamed:@"art.scnassets/vase/vase.scn"];
    // 7. Get the vase node.
    // A scene can contain many nodes, but it has exactly one root node; every other node is a descendant of it.
    SCNNode *vaseNode = scene.rootNode.childNodes.firstObject;
    // 8. Position the vase at the detected plane; without this it sits at the origin, i.e. where the camera started.
    vaseNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);
    // 9. Add the vase node to the scene.
    // NOTE: the vase is added to the node supplied by the delegate, not to the AR view's root node,
    // because the plane anchor's coordinates are local to that node, not world coordinates.
    [node addChildNode:vaseNode];
}
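
ARKit keeps refining the plane estimate after this first callback, so it is common to also resize the geometry in renderer:didUpdateNode:forAnchor:. A sketch that assumes the planeNode added above is still the node's first child and keeps the same square simplification:

- (void)renderer:(id<SCNSceneRenderer>)renderer didUpdateNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    if (![anchor isMemberOfClass:[ARPlaneAnchor class]]) return;
    ARPlaneAnchor *planeAnchor = (ARPlaneAnchor *)anchor;
    
    SCNNode *planeNode = node.childNodes.firstObject;
    if (![planeNode.geometry isKindOfClass:[SCNBox class]]) return;
    SCNBox *planeBox = (SCNBox *)planeNode.geometry;
    
    // Follow the refined extent and re-center the node on the anchor
    planeBox.width = planeAnchor.extent.x * 0.3;
    planeBox.length = planeAnchor.extent.x * 0.3;
    planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);
}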

4. The simplest use of Quick Look

(GIF: basic Quick Look usage)

There is not much to explain here, so straight to the code:

#import "ViewController.h"
#import <QuickLook/QuickLook.h>
#import "WYPreviewItem.h"

@interface ViewController ()<QLPreviewControllerDataSource, QLPreviewControllerDelegate>

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    
}

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    QLPreviewController *preVC = [[QLPreviewController alloc] init];
    preVC.dataSource = self;
    preVC.delegate = self;
    
    [self presentViewController:preVC animated:YES completion:nil];
}

#pragma mark - QLPreviewControllerDataSource && QLPreviewControllerDelegate
- (NSInteger)numberOfPreviewItemsInPreviewController:(QLPreviewController *)controller {
    return 1;
}

- (id<QLPreviewItem>)previewController:(QLPreviewController *)controller previewItemAtIndex:(NSInteger)index {
    
    return [[NSBundle mainBundle] URLForResource:@"plantpot.usdz" withExtension:nil];
}

- (UIImage *)previewController:(QLPreviewController *)controller transitionImageForPreviewItem:(id<QLPreviewItem>)item contentRect:(CGRect *)contentRect {
    
    // Write the source rect back through the out-parameter so the zoom transition starts from it
    *contentRect = CGRectMake(100, 200, 300, 300);
    
    return [UIImage imageNamed:@"wy.jpeg"];
}
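
The WYPreviewItem import suggests a custom QLPreviewItem, although the data source above simply returns an NSURL (NSURL already conforms to QLPreviewItem). The author's class isn't shown; a minimal conforming object might look like this hypothetical sketch, handy when you want to control the preview title:

// Hypothetical WYPreviewItem, not necessarily the author's implementation
#import <Foundation/Foundation.h>
#import <QuickLook/QuickLook.h>

@interface WYPreviewItem : NSObject <QLPreviewItem>
// Redeclared read-write so they are auto-synthesized
@property (nonatomic, strong) NSURL *previewItemURL;
@property (nonatomic, copy) NSString *previewItemTitle;
@end

@implementation WYPreviewItem
@end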

5. ARKit face texture mapping

(GIF: face texture mapping)

設(shè)置session的configuration為ARFaceTrackingConfiguration,然后在ARSCNView的代理renderer:willUpdateNode:forAnchor中增加一個(gè)SCNNode核心代碼如下:

  • Create the SCNNode
    • Try setting fillMesh to YES and see what changes
    • Try setting material.diffuse.contents to a plain color and see what changes
- (SCNNode *)textureMaskNode {
    if (!_textureMaskNode) {
        
        // ARSCNFaceGeometry needs the Metal device that renders the AR view
        id<MTLDevice> device = self.arSCNView.device;
        // fillMesh:NO leaves the eye and mouth openings unfilled
        ARSCNFaceGeometry *geometry = [ARSCNFaceGeometry faceGeometryWithDevice:device fillMesh:NO];
        SCNMaterial *material = geometry.firstMaterial;
        material.fillMode = SCNFillModeFill;
        material.diffuse.contents = [UIImage imageNamed:@"wy.jpg"];
        _textureMaskNode = [SCNNode nodeWithGeometry:geometry];
    }
    _textureMaskNode.name = @"textureMask";
    return _textureMaskNode;
}
  • Add the SCNNode and keep the face geometry updated
- (void)renderer:(id<SCNSceneRenderer>)renderer willUpdateNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    
    if (!anchor || ![anchor isKindOfClass:[ARFaceAnchor class]]) return;
    ARFaceAnchor *faceAnchor = (ARFaceAnchor *)anchor;
    
    if (!_textureMaskNode) {
        [node addChildNode:self.textureMaskNode];
    }
    
    ARSCNFaceGeometry *faceGeometry = (ARSCNFaceGeometry *)self.textureMaskNode.geometry;
    if (faceGeometry && [faceGeometry isKindOfClass:[ARSCNFaceGeometry class]]) {
        [faceGeometry updateFromFaceGeometry:faceAnchor.geometry];
    }
}

6. ARKit smile detection

(GIF: smile detection)

This mainly uses the keys ARBlendShapeLocationMouthSmileLeft and ARBlendShapeLocationMouthSmileRight, which describe a smile.
The demo I provide is meant for tuning the smile threshold.
Core code:

- (void)renderer:(id<SCNSceneRenderer>)renderer didUpdateNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    
    if (!anchor || ![anchor isKindOfClass:[ARFaceAnchor class]]) return;
    
    ARFaceAnchor *faceAnchor = (ARFaceAnchor *)anchor;
    
    NSDictionary *blendShips = faceAnchor.blendShapes;
    CGFloat leftSmile = [blendShips[ARBlendShapeLocationMouthSmileLeft] floatValue];
    CGFloat rightSmile = [blendShips[ARBlendShapeLocationMouthSmileRight] floatValue];
    
    NSLog(@"leftSmile = %f, rightSmile = %f", leftSmile, rightSmile);
    if (leftSmile > self.smileValue && rightSmile > self.smileValue) {
        NSLog(@"smile detected");
        [self.arSession pause];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.resultLabel.hidden = NO;
        });
    }
}
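
Because the session is paused as soon as a smile is detected, detection will not fire again until the session is restarted. A sketch of how it could be resumed (faceConfiguration is a hypothetical retained property holding the ARFaceTrackingConfiguration):

// Restart tracking from scratch after the pause
[self.arSession runWithConfiguration:self.faceConfiguration
                             options:ARSessionRunOptionResetTracking | ARSessionRunOptionRemoveExistingAnchors];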

7. ARKit frown detection

(GIF: frown detection)

Here I use the key for raised inner brows, ARBlendShapeLocationBrowInnerUp.
Core code:

- (void)renderer:(id<SCNSceneRenderer>)renderer didUpdateNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor {
    
    if (!anchor || ![anchor isKindOfClass:[ARFaceAnchor class]]) return;
    
    ARFaceAnchor *faceAnchor = (ARFaceAnchor *)anchor;
    NSDictionary *blendShapes = faceAnchor.blendShapes;
    NSNumber *browInnerUp = blendShapes[ARBlendShapeLocationBrowInnerUp];
    
    if ([browInnerUp floatValue] > self.browValue) {
        [self.arSession pause];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.resultLabel.hidden = NO;
        });
    }
    
    NSLog(@"browInnerUp = %@", browInnerUp);
}

8. ARKit face parameters: BlendShapes explained

  • Available on iOS 11 and later. For a detailed description and comparison image of each parameter, open Xcode -> Window -> Developer Documentation and search for the corresponding key
  • Each key maps to a value between 0 and 1
  • In total there are 51 parameters describing facial features (a quick way to explore them is sketched below)
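
Since every coefficient lives in the faceAnchor.blendShapes dictionary, one quick way to explore them is to log whichever shapes are strongly activated inside the same renderer:didUpdateNode:forAnchor: callback used in sections 6 and 7 (a sketch; the 0.5 cut-off is arbitrary):

NSDictionary<ARBlendShapeLocation, NSNumber *> *blendShapes = faceAnchor.blendShapes;
[blendShapes enumerateKeysAndObjectsUsingBlock:^(ARBlendShapeLocation key, NSNumber *value, BOOL *stop) {
    // Only log the shapes that are clearly activated in this frame
    if (value.floatValue > 0.5) {
        NSLog(@"%@ = %.2f", key, value.floatValue);
    }
}];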


Property: what the coefficient describes

ARBlendShapeLocationBrowDownLeft: the outer part of the left eyebrow moves down
ARBlendShapeLocationBrowDownRight: the outer part of the right eyebrow moves down
ARBlendShapeLocationBrowInnerUp: the inner parts of both eyebrows move up
ARBlendShapeLocationBrowOuterUpLeft: the outer part of the left eyebrow moves up
ARBlendShapeLocationBrowOuterUpRight: the outer part of the right eyebrow moves up
ARBlendShapeLocationCheekPuff: both cheeks puff outward
ARBlendShapeLocationCheekSquintLeft: the cheek rises around and below the left eye
ARBlendShapeLocationCheekSquintRight: the cheek rises around and below the right eye
ARBlendShapeLocationEyeBlinkLeft: the left eye blinks
ARBlendShapeLocationEyeBlinkRight: the right eye blinks
ARBlendShapeLocationEyeLookDownLeft: left eyelid movement consistent with a downward gaze
ARBlendShapeLocationEyeLookDownRight: right eyelid movement consistent with a downward gaze
ARBlendShapeLocationEyeLookInLeft: left eyelid movement consistent with a rightward gaze
ARBlendShapeLocationEyeLookInRight: right eyelid movement consistent with a leftward gaze
ARBlendShapeLocationEyeLookOutLeft: left eyelid movement consistent with a leftward gaze
ARBlendShapeLocationEyeLookOutRight: right eyelid movement consistent with a rightward gaze
ARBlendShapeLocationEyeSquintLeft: the face around the left eye contracts (squint)
ARBlendShapeLocationEyeSquintRight: the face around the right eye contracts (squint)
ARBlendShapeLocationEyeWideLeft: the eyelids around the left eye widen
ARBlendShapeLocationEyeWideRight: the eyelids around the right eye widen
ARBlendShapeLocationJawForward: the jaw moves forward
ARBlendShapeLocationJawLeft: the jaw moves to the left
ARBlendShapeLocationJawOpen: the jaw opens
ARBlendShapeLocationJawRight: the jaw moves to the right
ARBlendShapeLocationMouthClose: the lips close, independent of jaw position
ARBlendShapeLocationMouthDimpleLeft: the left corner of the mouth pulls back
ARBlendShapeLocationMouthDimpleRight: the right corner of the mouth pulls back
ARBlendShapeLocationMouthFrownLeft: the left corner of the mouth moves down
ARBlendShapeLocationMouthFrownRight: the right corner of the mouth moves down
ARBlendShapeLocationMouthFunnel: both lips contract into an open (funnel) shape
ARBlendShapeLocationMouthLeft: both lips move to the left
ARBlendShapeLocationMouthLowerDownLeft: the left side of the lower lip moves down
ARBlendShapeLocationMouthLowerDownRight: the right side of the lower lip moves down
ARBlendShapeLocationMouthPressLeft: the left side of the lower lip presses upward
ARBlendShapeLocationMouthPressRight: the right side of the lower lip presses upward
ARBlendShapeLocationMouthPucker: both closed lips contract and compress (pucker)
ARBlendShapeLocationMouthRight: both lips move to the right
ARBlendShapeLocationMouthRollLower: the lower lip rolls toward the inside of the mouth
ARBlendShapeLocationMouthRollUpper: the upper lip rolls toward the inside of the mouth
ARBlendShapeLocationMouthShrugLower: the lower lip moves outward
ARBlendShapeLocationMouthShrugUpper: the upper lip moves outward
ARBlendShapeLocationMouthSmileLeft: the left corner of the mouth moves up
ARBlendShapeLocationMouthSmileRight: the right corner of the mouth moves up
ARBlendShapeLocationMouthStretchLeft: the left corner of the mouth stretches to the left
ARBlendShapeLocationMouthStretchRight: the right corner of the mouth stretches to the right
ARBlendShapeLocationMouthUpperUpLeft: the left side of the upper lip moves up
ARBlendShapeLocationMouthUpperUpRight: the right side of the upper lip moves up
ARBlendShapeLocationNoseSneerLeft: the left nostril raises
ARBlendShapeLocationNoseSneerRight: the right nostril raises
ARBlendShapeLocationTongueOut: the tongue extends out of the mouth