Implementing a Video Player with RealityKit

David Cordero
3 min read · Aug 30, 2021


This post was originally published on dcordero.me, where you will always find the most up-to-date, spyware-free versions of all my posts.

I am really looking forward to the eventual release of the rumored Apple Glasses. I think AR could be the next big revolution, one that makes some of the technology we use nowadays redundant.

One of the use cases where I see potential for AR technology is the world of video playback apps.

If you think about it… who needs a huge physical TV when you can cast your content to a virtual TV that you can place wherever you want?

That is why I decided to look into creating a video player with RealityKit, and I found out that it is quite easy.

RealityKit Video Player

The first thing we need to do to create an AR video player with RealityKit is to create a standard instance of AVPlayer to play our video asset.

// Create an AVPlayer for an HLS sample stream and start playback
let url = URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/bipbop_16x9_variant.m3u8")!
let playerItem = AVPlayerItem(url: url)
let player = AVPlayer(playerItem: playerItem)
player.play()

Then we need to create a ModelEntity that uses a VideoMaterial backed by our AVPlayer instance.

// A 0.7 m × 0.5 m plane acts as the virtual screen;
// the VideoMaterial renders the player's output onto it
let screenMesh = MeshResource.generatePlane(width: 0.7, height: 0.5)
let videoMaterial = VideoMaterial(avPlayer: player)
let modelEntity = ModelEntity(mesh: screenMesh, materials: [videoMaterial])

And that is all we need to create a RealityKit player; now we only need to place our ModelEntity in the AR world. To do that we need to create an AnchorEntity, which defines a location in the AR world.
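Once we have an AnchorEntity (we will create one in the next step), attaching the screen is just a matter of parenting the ModelEntity to it and adding the anchor to the scene. A minimal sketch, assuming arView is our ARView instance:

// Attach the virtual screen to the anchor and add the anchor to the scene
anchorEntity.addChild(modelEntity)
arView.scene.addAnchor(anchorEntity)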

A simple way to create an AnchorEntity is with a UITapGestureRecognizer: once we have the location of the gesture on the screen, we can translate it into world coordinates using ARView's raycast method.

@objc
private func tapWasReceived(recognizer: UITapGestureRecognizer) {
    let location = recognizer.location(in: arView)

    // Cast a ray from the tap location, looking for horizontal planes
    let results = arView.raycast(from: location, allowing: .estimatedPlane, alignment: .horizontal)

    if let firstResult = results.first {
        // Anchor the screen at the world position hit by the ray
        let anchorEntity = AnchorEntity(world: firstResult.worldTransform)
        addScreen(anchorEntity: anchorEntity)
    }
}

Show me the code

In the following code, you can find a ViewController with everything put together.

import UIKit
import RealityKit
import AVFoundation
import ARKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        setUpView()
        setUpTapDetection()
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        arView.frame = view.bounds
    }

    // MARK: - Private

    private lazy var arView: ARView = {
        let arView = ARView()
        return arView
    }()

    private func setUpView() {
        view.addSubview(arView)
    }

    private func setUpTapDetection() {
        let tapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(tapWasReceived(recognizer:)))
        arView.addGestureRecognizer(tapGestureRecognizer)
    }

    private func addScreen(anchorEntity: AnchorEntity) {
        // Standard AVPlayer playing an HLS sample stream
        let url = URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/bipbop_16x9_variant.m3u8")!
        let playerItem = AVPlayerItem(url: url)
        let player = AVPlayer(playerItem: playerItem)

        // A plane mesh with a VideoMaterial acts as the virtual screen
        let screenMesh = MeshResource.generatePlane(width: 0.7, height: 0.5)
        let videoMaterial = VideoMaterial(avPlayer: player)
        let modelEntity = ModelEntity(mesh: screenMesh, materials: [videoMaterial])

        // Attach the screen to the anchor and add it to the scene
        anchorEntity.addChild(modelEntity)
        arView.scene.addAnchor(anchorEntity)

        player.play()
    }

    // MARK: - Action

    @objc
    private func tapWasReceived(recognizer: UITapGestureRecognizer) {
        let location = recognizer.location(in: arView)

        // Translate the tap into world coordinates by raycasting
        // against horizontal planes estimated by ARKit
        let results = arView.raycast(from: location, allowing: .estimatedPlane, alignment: .horizontal)

        if let firstResult = results.first {
            let anchorEntity = AnchorEntity(world: firstResult.worldTransform)
            addScreen(anchorEntity: anchorEntity)
        }
    }
}
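One note if you want to try this yourself: since ARView drives an ARKit session under the hood, the app's Info.plist must include an NSCameraUsageDescription entry, otherwise the app will crash as soon as the session tries to access the camera.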

Written by David Cordero

iOS and tvOS developer at Zattoo. Passionate about coding and lifelong learning.
