Places a 3D object on the first plane ARKit detects, and uses SceneKit's positional audio feature to help the user locate the object when it is outside the camera's field of view.
This sample app runs an ARKit world tracking session with content displayed in a SceneKit view. To demonstrate plane detection, the app places a 3D model onto the first plane that ARKit detects. If the model's position is outside the current field of view, the app uses SceneKit's positional audio feature to indicate which direction to turn the device to see the model.
ARKit requires iOS 11 and a device with an A9 (or later) processor. ARKit is not available in the iOS Simulator. Building the sample code requires Xcode 9 or later.
The ARSCNView class is a SceneKit view that includes an ARSession object; the session manages the motion tracking and image processing required to create an AR experience. However, to run a session you must provide a session configuration.
The ARWorldTrackingConfiguration class provides high-precision motion tracking and enables features that help you place virtual content in relation to real-world surfaces. To start an AR session, create a session configuration object with the options you want (such as plane detection), then call the run(_:options:) method on the session object of your ARSCNView instance:
// Start the ARSession.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal
sceneView.session.run(configuration)
Important: If your app requires ARKit for its core functionality, use the arkit key in the UIRequiredDeviceCapabilities section of your app's Info.plist file to make your app available only on devices that support ARKit. If AR is a secondary feature of your app, use the isSupported property to determine whether to offer AR-based features.
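For the secondary-feature case, a minimal runtime check might look like the following sketch; the enterARMode() and showUnsupportedMessage() helpers are hypothetical stand-ins for your app's own UI flow:
import ARKit

// Offer the AR experience only where world tracking is supported.
if ARWorldTrackingConfiguration.isSupported {
    enterARMode()            // hypothetical: present the AR experience
} else {
    showUnsupportedMessage() // hypothetical: hide or disable AR features
}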
After you’ve set up your AR session, you can use SceneKit to place virtual content in the view.
When plane detection is enabled, ARKit adds and updates anchors for each detected plane. By default, the ARSCNView class adds an SCNNode object to the SceneKit scene for each anchor. Your view's delegate can implement the renderer(_:didAdd:for:) method to add content to the scene.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // Place content only for anchors found by plane detection.
    guard anchor is ARPlaneAnchor && previewNode != nil
        else { return }

    // Stop showing a preview version of the object to be placed.
    contentNode.removeFromParentNode()
    previewNode?.removeFromParentNode()
    previewNode = nil

    // Add the contentNode to the scene's root node using the anchor's position.
    guard let cameraTransform = sceneView.session.currentFrame?.camera.transform
        else { return }
    setNewVirtualObjectToAnchor(contentNode, to: anchor, cameraTransform: cameraTransform)
    sceneView.scene.rootNode.addChildNode(contentNode)

    // Disable plane detection after the model has been added by running the
    // session with a fresh configuration that leaves planeDetection at its
    // default (no detection). Passing empty options preserves world tracking.
    let configuration = ARWorldTrackingConfiguration()
    sceneView.session.run(configuration, options: [])

    // Set up positional audio to play in case the object moves offscreen.
    playSound()
}
If you add content as a child of the node corresponding to the anchor, the ARSCNView class automatically moves that content as ARKit refines its estimate of the plane's position and extent. This sample instead adds content as a child of the scene's root node, using the transform provided by the anchor; this alternative technique keeps the content from moving after placement.
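The core of that technique is copying the anchor's world transform onto the node. Here is a minimal sketch of the idea behind the sample's setNewVirtualObjectToAnchor helper; the actual helper also takes the camera transform into account, which this sketch omits:
import ARKit
import SceneKit

// Place a node at an anchor's world-space position without parenting it
// to the anchor's node, so later anchor updates don't move the content.
func place(_ node: SCNNode, at anchor: ARAnchor) {
    // anchor.transform is a world-space 4x4 matrix maintained by ARKit.
    node.simdTransform = anchor.transform
}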
To help the user locate the content node when it is no longer in view, this sample configures an audio player that plays a sound whenever the node is offscreen.
The SCNAudioSource class represents an audio source that can be added to any SCNNode instance. To support positional audio in SceneKit, your application can create instances of SCNAudioSource with a file URL pointing to an audio file. Because SceneKit's audio engine uses panning to position sounds in 3D, you should use mono audio files for best results.
// Audio source for positional audio feedback.
var audioSource: SCNAudioSource = SCNAudioSource(fileNamed: "Assets.scnassets/ping.aif")!

// ...

// Set up the audio source.
audioSource.loops = true
audioSource.load()
To add the SCNAudioSource to the SceneKit graph, initialize an instance of SCNAudioPlayer using that audio source, and then add the audio player to the SCNNode.
contentNode.addAudioPlayer(SCNAudioPlayer(source: audioSource))
Attaching the audio player to an SCNNode object allows for spatialized 3D audio playback based on the position of that node relative to the scene's camera.
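More precisely, SceneKit spatializes the sound relative to the renderer's audio listener, which defaults to the current point of view. If needed, you can set the listener explicitly; this one-line sketch assumes the sample's sceneView outlet:
// Use the view's point of view (the AR camera node) as the audio listener.
sceneView.audioListener = sceneView.pointOfView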
To control playback, your view's delegate can implement the renderer(_:updateAtTime:) method to update the playback state of the audio player as needed. This sample uses the callback to adjust the audio player's volume based on whether the contentNode is visible.
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    if contentNode.parent == nil && previewNode == nil {
        // If the model hasn't been placed and there's no preview yet, set one up.
        setupPreviewNode()
        updatePreviewNode()
    } else {
        updatePreviewNode()
    }
    updateLightEstimate()
    cutVolumeIfPlacedObjectIsInView()
}
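The listing doesn't show cutVolumeIfPlacedObjectIsInView, but the idea can be sketched with SceneKit's frustum test. The following is an assumption about how such a method might work, not the sample's actual code; it assumes the sample's sceneView and contentNode properties:
import ARKit
import SceneKit
import AVFoundation

// Hypothetical implementation of a visibility-based volume cut.
func cutVolumeIfPlacedObjectIsInView() {
    guard contentNode.parent != nil,                      // object has been placed
          let pointOfView = sceneView.pointOfView,        // current camera node
          let player = contentNode.audioPlayers.first,
          let mixer = player.audioNode as? AVAudioMixing  // exposes a volume control
        else { return }

    // Mute the ping while the object is on screen; restore it when offscreen.
    let isVisible = sceneView.isNode(contentNode, insideFrustumOf: pointOfView)
    mixer.volume = isVisible ? 0.0 : 1.0
}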