ARKit Study Notes - 7

Please credit the source when reprinting.
Original Apple documentation: https://developer.apple.com/documentation/arkit/displaying_an_ar_experience_with_metal


Displaying an AR Experience with Metal

Build a custom AR view by rendering camera images and using position-tracking information to display overlay content.


Overview

  • ARKit includes view classes for easily displaying AR experiences with SceneKit or SpriteKit. However, if you instead build your own rendering engine (or integrate with a third-party engine), ARKit also provides all the support necessary to display an AR experience with a custom view.
  • In any AR experience, the first step is to configure an ARSession object to manage camera capture and motion processing. A session defines and maintains a correspondence between the real-world space the device inhabits and a virtual space where you model AR content. To display your AR experience in a custom view, you'll need to:

  • 1. Retrieve video frames and tracking information from the session.

  • 2. Render those frame images as the backdrop for your view.

  • 3. Use the tracking information to position and draw AR content atop the camera image.

Note
This article covers code found in Xcode project templates. For complete example code, create a new iOS application with the Augmented Reality template, and choose Metal from the Content Technology popup menu.

Get Video Frames and Tracking Data from the Session

  • Create and maintain your own ARSession instance, and run it with a session configuration appropriate for the kind of AR experience you want to support. (To do this, see Building a Basic AR Experience.) The session captures video from the camera, tracks the device's position and orientation in a modeled 3D space, and provides ARFrame objects. Each such object contains both an individual video frame image and position tracking information from the moment that frame was captured.
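The session setup described above can be sketched as follows. This is a minimal sketch, not the template's exact code: the view-controller name and the choice of ARWorldTrackingConfiguration are illustrative assumptions.

```swift
import ARKit
import UIKit

final class CustomARViewController: UIViewController {
    // Own and maintain the session for the lifetime of the AR experience.
    let session = ARSession()

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Run the session with a world-tracking configuration so that
        // each ARFrame carries a camera transform in world space.
        let configuration = ARWorldTrackingConfiguration()
        session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause capture and tracking while the view is offscreen.
        session.pause()
    }
}
```

ARKit requires a physical iOS device with an A9 chip or later, so this sketch cannot run in the simulator.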

  • There are two ways to access ARFrame objects produced by an AR session, depending on whether your app favors a pull or a push design pattern.

  • If you prefer to control frame timing (the pull design pattern), use the session's currentFrame property to get the current frame image and tracking information each time you redraw your view's contents. The ARKit Xcode template uses this approach:
// in Renderer class, called from MTKViewDelegate.draw(in:) via Renderer.update()
func updateGameState() {        
    guard let currentFrame = session.currentFrame else {
        return
    }
    
    updateSharedUniforms(frame: currentFrame)
    updateAnchors(frame: currentFrame)
    updateCapturedImageTextures(frame: currentFrame)
    
    if viewportSizeDidChange {
        viewportSizeDidChange = false
        
        updateImagePlane(frame: currentFrame)
    }
}
  • Alternatively, if your app design favors a push pattern, implement the session(_:didUpdate:) delegate method, and the session will call it once for each video frame it captures (at 60 frames per second by default).
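The push approach can be sketched like this. The class and method body are illustrative assumptions, not the template's code; only the ARSessionDelegate method itself comes from the ARKit API.

```swift
import ARKit

final class PushRenderer: NSObject, ARSessionDelegate {
    // ARKit calls this once per captured video frame (60 fps by default),
    // so any work done here must finish well within the frame budget.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Use frame.capturedImage for the backdrop, and
        // frame.camera.transform plus the camera's projection matrix
        // to position overlay content for this frame.
    }
}

// At setup time, before running the session:
// session.delegate = pushRenderer
```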

  • Upon obtaining a frame, you'll need to draw the camera image, then update and render any overlay content your AR experience includes.

Draw the Camera Image

  • Each ARFrame object's capturedImage property contains a pixel buffer captured from the device camera. To draw this image as the backdrop for your custom view, you'll need to create textures from the image content and submit GPU rendering commands that use those textures.

  • The pixel buffer’s contents are encoded in a biplanar YCbCr (also called YUV) data format; to render the image you’ll need to convert this pixel data to a drawable RGB format. For rendering with Metal, you can perform this conversion most efficiently in GPU shader code. Use CVMetalTextureCache APIs to create two Metal textures from the pixel buffer—one each for the buffer’s luma (Y) and chroma (CbCr) planes:

  • 像素緩沖區的內容將被編碼為雙面 (biplanar) YCbCr 數據格式(也成為 YUV);要渲染圖像的話,您需要將這些像素數據轉換為可繪制的 RGB 格式。對于 Metal 渲染而言,最高效的方法便是使用 GPU 著色代碼 (shader code) 來執行這個轉換了。借助CVMetalTextureCache**API,可以從像素緩沖區中生成兩個 Metal 紋理——一個用于決定緩沖區的亮度 (Y),一個用于決定緩沖區的色度 (CbCr) 面。

func updateCapturedImageTextures(frame: ARFrame) {
    // Create two textures (Y and CbCr) from the provided frame's captured image
    let pixelBuffer = frame.capturedImage
    if CVPixelBufferGetPlaneCount(pixelBuffer) < 2 {
        return
    }
    capturedImageTextureY = createTexture(fromPixelBuffer: pixelBuffer, pixelFormat: .r8Unorm, planeIndex: 0)!
    capturedImageTextureCbCr = createTexture(fromPixelBuffer: pixelBuffer, pixelFormat: .rg8Unorm, planeIndex: 1)!
}

func createTexture(fromPixelBuffer pixelBuffer: CVPixelBuffer, pixelFormat: MTLPixelFormat, planeIndex: Int) -> MTLTexture? {
    var mtlTexture: MTLTexture? = nil
    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, planeIndex)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, planeIndex)
    
    var texture: CVMetalTexture? = nil
    let status = CVMetalTextureCacheCreateTextureFromImage(nil, capturedImageTextureCache, pixelBuffer, nil, pixelFormat, width, height, planeIndex, &texture)
    if status == kCVReturnSuccess {
        mtlTexture = CVMetalTextureGetTexture(texture!)
    }
    
    return mtlTexture
}
  • Next, encode render commands that draw those two textures using a fragment function that performs YCbCr-to-RGB conversion with a color transform matrix:
fragment float4 capturedImageFragmentShader(ImageColorInOut in [[stage_in]],
                                            texture2d<float, access::sample> capturedImageTextureY [[ texture(kTextureIndexY) ]],
                                            texture2d<float, access::sample> capturedImageTextureCbCr [[ texture(kTextureIndexCbCr) ]]) {
    
    constexpr sampler colorSampler(mip_filter::linear,
                                   mag_filter::linear,
                                   min_filter::linear);
    
    const float4x4 ycbcrToRGBTransform = float4x4(
        float4(+1.164380f, +1.164380f, +1.164380f, +0.000000f),
        float4(+0.000000f, -0.391762f, +2.017230f, +0.000000f),
        float4(+1.596030f, -0.812968f, +0.000000f, +0.000000f),
        float4(-0.874202f, +0.531668f, -1.085630f, +1.000000f)
    );
    
    // Sample Y and CbCr textures to get the YCbCr color at the given texture coordinate
    float4 ycbcr = float4(capturedImageTextureY.sample(colorSampler, in.texCoord).r,
                          capturedImageTextureCbCr.sample(colorSampler, in.texCoord).rg, 1.0);
    
    // Return converted RGB color
    return ycbcrToRGBTransform * ycbcr;
}

Note

  • Use the displayTransform(withViewportSize:orientation:) method to make sure the camera image covers the entire view. For example use of this method, as well as complete Metal pipeline setup code, see the full Xcode template. (Create a new iOS application with the Augmented Reality template, and choose Metal from the Content Technology popup menu.)
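As a sketch of how this transform is typically applied, you invert it and run the image plane's texture coordinates through it so the camera image fills the viewport. The function name, `viewportSize` parameter, and hard-coded orientation here are illustrative assumptions; note also that newer SDKs rename the method to displayTransform(for:viewportSize:).

```swift
import ARKit

func imagePlaneTexCoords(frame: ARFrame, viewportSize: CGSize) -> [CGPoint] {
    // The display transform maps normalized image coordinates into
    // viewport coordinates; invert it to remap the image plane's
    // texture coordinates instead, cropping the image to the viewport.
    let displayToCameraTransform = frame
        .displayTransform(withViewportSize: viewportSize, orientation: .landscapeRight)
        .inverted()

    // Corners of a full-screen quad in normalized texture space.
    let corners = [CGPoint(x: 0, y: 1), CGPoint(x: 0, y: 0),
                   CGPoint(x: 1, y: 1), CGPoint(x: 1, y: 0)]
    return corners.map { $0.applying(displayToCameraTransform) }
}
```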

Track and Render Overlay Content

  • AR experiences typically focus on rendering 3D overlay content so that the content appears to be part of the real world seen in the camera image. To achieve this illusion, use the ARAnchor class to model the position and orientation of your own 3D content relative to real-world space. Anchors provide transforms that you can reference during rendering.

  • For example, the Xcode template creates an anchor located about 20 cm in front of the device whenever a user taps on the screen:

@objc func handleTap(gestureRecognize: UITapGestureRecognizer) {
    // Create anchor using the camera's current position
    if let currentFrame = session.currentFrame {
        
        // Create a transform with a translation of 0.2 meters in front of the camera
        var translation = matrix_identity_float4x4
        translation.columns.3.z = -0.2
        let transform = simd_mul(currentFrame.camera.transform, translation)
        
        // Add a new anchor to the session
        let anchor = ARAnchor(transform: transform)
        session.add(anchor: anchor)
    }
}
  • In your rendering engine, use the transform property of each ARAnchor object to place visual content. The Xcode template uses each of the anchors added to the session in its handleTap method to position a simple cube mesh:
func updateAnchors(frame: ARFrame) {
    // Update the anchor uniform buffer with transforms of the current frame's anchors
    anchorInstanceCount = min(frame.anchors.count, kMaxAnchorInstanceCount)
    
    var anchorOffset: Int = 0
    if anchorInstanceCount == kMaxAnchorInstanceCount {
        anchorOffset = max(frame.anchors.count - kMaxAnchorInstanceCount, 0)
    }
    
    for index in 0..<anchorInstanceCount {
        let anchor = frame.anchors[index + anchorOffset]
        
        // Flip Z axis to convert geometry from right handed to left handed
        var coordinateSpaceTransform = matrix_identity_float4x4
        coordinateSpaceTransform.columns.2.z = -1.0
        
        let modelMatrix = simd_mul(anchor.transform, coordinateSpaceTransform)
        
        let anchorUniforms = anchorUniformBufferAddress.assumingMemoryBound(to: InstanceUniforms.self).advanced(by: index)
        anchorUniforms.pointee.modelMatrix = modelMatrix
    }
}

Note

  • In a more complex AR experience, you can use hit testing or plane detection to find the positions of real-world surfaces. For details, see the planeDetection property and the hitTest(_:types:) method. In both cases, ARKit provides results as ARAnchor objects, so you still use anchor transforms to place visual content.
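A minimal sketch of anchoring content at a tapped real-world surface via hit testing. The function name, the normalization of the tap point, and the particular hit-test types chosen are assumptions for illustration (a production version should also account for the display transform when normalizing the point):

```swift
import ARKit
import UIKit

func addAnchor(at screenPoint: CGPoint, in view: UIView, session: ARSession) {
    guard let frame = session.currentFrame else { return }

    // hitTest(_:types:) expects normalized image-space coordinates (0...1).
    let normalizedPoint = CGPoint(x: screenPoint.x / view.bounds.width,
                                  y: screenPoint.y / view.bounds.height)

    // Prefer an existing detected plane; fall back to an estimated one.
    let results = frame.hitTest(normalizedPoint,
                                types: [.existingPlaneUsingExtent, .estimatedHorizontalPlane])
    if let nearest = results.first {
        // The hit is returned as a world-space transform; anchor content there.
        session.add(anchor: ARAnchor(transform: nearest.worldTransform))
    }
}
```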

Render with Realistic Lighting

  • When you configure shaders for drawing 3D content in your scene, use the estimated lighting information in each ARFrame object to produce more realistic shading:
// in Renderer.updateSharedUniforms(frame:):
// Set up lighting for the scene using the ambient intensity if provided
var ambientIntensity: Float = 1.0
if let lightEstimate = frame.lightEstimate {
    ambientIntensity = Float(lightEstimate.ambientIntensity) / 1000.0
}
let ambientLightColor: vector_float3 = vector3(0.5, 0.5, 0.5)
uniforms.pointee.ambientLightColor = ambientLightColor * ambientIntensity

Note

  • For the complete set of Metal setup and rendering commands that go with this example, see the full Xcode template. (Create a new iOS application with the Augmented Reality template, and choose Metal from the Content Technology popup menu.)