Filming Immersive Video of J1 Professional Soccer Club FC Machida Zelvia, Implementing a Viewer in Swift, and Holding a Demo Event


Slide overview

try! Swift Tokyo 2025


Text on each page
1.

try! Swift Tokyo 2025 Filming FC Machida Zelvia's Immersive Video and Implementing a Viewer in Swift for a viewing event Satoshi Hattori @shmdevelop

2.

Satoshi Hattori
Cyber AI Productions / visionOS Expert at CyberAgent
Host of "visionOS Engineer Meetup Tokyo"
GitHub: satoshi0212 / X: @shmdevelop

3.

visionOS 30 Days Challenge visionOS2 30 Days Challenge

4.

In This Session

5.

Immersive Video Shooting Workflow / Viewer Implementation / Demo Event: Key Efforts and Outcomes

6.

Prerequisites

7.

Prerequisites
What is Immersive Video? / About FC Machida Zelvia

8.

Prerequisites Immersive Video

9.

Prerequisites
Apple Immersive Video: 8K 3D video content featuring a 180-degree field of view and spatial audio, viewable on Apple Vision Pro.
https://www.apple.com/jp/apple-vision-pro/

11.

Have you experienced Apple Immersive Video? ✋

12.

Prerequisites
「VIP」 S1 E1: Feel the excitement of game day like never before as broadcasting great Joe Buck narrates an all-access Bronx experience - with the personalities, players, and spine-tingling moments that make the Yankees' ballpark legendary.
https://tv.apple.com/jp/episode/yankee-stadium/umc.cmc.4awwgdeond5reeitehveujxpu?l=en-US

13.

Prerequisites
https://developer.apple.com/documentation/avfoundation/media_reading_and_writing/converting_side-by-side_3d_video_to_multiview_hevc_and_spatial_video


15.

Prerequisites
https://developer.apple.com/documentation/avfoundation/converting-side-by-side-3d-video-to-multiview-hevc-and-spatial-video

16.

Prerequisites
Specs: ISO Base Media File Format and Apple HEVC Stereo Video / Video Extended Usage box hierarchy
https://developer.apple.com/av-foundation/Stereo-Video-ISOBMFF-Extensions.pdf

17.

Prerequisites Apple Immersive Video

18.

Prerequisites FC Machida Zelvia

19.

FC Machida Zelvia is a professional football club based in Machida, Tokyo, Japan, and a member of the Japan Professional Football League (J.League). The major shareholder is CyberAgent.

23.

Immersive Video Shooting Workflow / Viewer Implementation / Demo Event: Key Efforts and Outcomes


25.

Shooting workflow: Planning / Filming / Editing


27.

Immersive Video × Sports

28.

In 2024, FC Machida Zelvia entered their first year in the J1 League. We set out to create content that allows supporters to relive this historic moment.

29.

ZELVISION XR A project aimed at immersing supporters in the world of FC Machida Zelvia, delivering a new kind of sports experience and demonstrating its value — utilizing the Immersive Video format.

31.

Shooting workflow: Planning / Filming / Editing

32.

Filming
Filming Permissions: Negotiations and coordination with the club / Compliance with J.League regulations / Wearing official bibs
Shoot: 3 times (Sapporo match / Kyoto match)
Farewell before the FC Tokyo match

33.

Filming
Filming Permissions: Negotiations and coordination with the club / Compliance with J.League regulations / Wearing official bibs
Shoot: Mr. Hirose (professional cameraman), 3 times (Sapporo match / Kyoto match)
Farewell before the FC Tokyo match

34.

Canon EOS R5 C + RF5.2mm F2.8 L DUAL FISHEYE

35.

https://x.com/makoto_hirose/status/1862469370653679712

37.

Filming
Camera: Canon EOS R5 C + RF5.2mm F2.8 L DUAL FISHEYE
Camera Settings: 8K 60P (8192 x 4320 pixels, 60 frames per second). Continuously capturing images equivalent to 35 megapixels at 60 frames per second generates considerable heat. The EOS R5 C features an internal fan and has never stopped, even in the scorching heat.
Media: RAW LT format stored on CFexpress Type B memory cards. With RAW LT, 1 TB stores approximately 53 minutes of footage. Simultaneous recording of proxy videos (2048 x 1080 pixels) on SD cards.
Proxy Videos: These are lightweight and perfect for review and rough editing. They are the equivalent of the JPEG files in the RAW+JPEG photo shooting format.
Storage: A large number of high-speed SSDs; in the past six months, 8 TB x2 and 4 TB x2 drives were purchased. In-house environment: 10Gb Ethernet, QNAP NAS with approximately 200 TB of storage.


39.

Filming
Video Recording Settings: Canon Log 3, Cinema Gamut
Essential for 60P RAW Shooting: USB-PD mobile battery
Base ISO Considerations: Daytime: Base ISO 800 / Nighttime: Base ISO 3200
Recommended Aperture: F5.6
Shutter Speed: 1/60, 1/100

40.

Shooting workflow: Planning / Filming / Editing

41.

Editing
① EOS VR Utility: Convert RAW to 180° VR projection and export in Apple ProRes 422 HQ.
② Blackmagic DaVinci Resolve: Apply HDR color grading, sharpening, and noise reduction. Export in ProRes HQ.
③ Adobe Premiere Pro: Edit with lightweight proxy videos. Export edit information as XML.
④ Spatial Metadata GUI: Convert to MV-HEVC with metadata.

42.

Filming & Editing: See also
https://www.youtube.com/watch?v=nOB8u4iSbms

43.

Shooting workflow: Planning / Filming / Editing


45.

Shooting workflow: Planning / Filming / Editing / App Development / Event

46.

Immersive Video Shooting Workflow / Viewer Implementation / Demo Event: Key Efforts and Outcomes


48.

Why do we need a custom viewer?

49.

To deliver a smooth experience tailored to first-time users within a limited timeframe.

50.

Viewer Implementation Core features:

51.

Viewer Implementation
Core features: Retrieve video information / Create a hemispherical mesh for projection / Player

52.

Viewer Implementation Retrieve video information

53.
static func getVideoInfo(asset: AVAsset) async -> VideoInfo? {
    let videoInfo = VideoInfo()
    guard let videoTrack = try? await asset.loadTracks(withMediaType: .video).first else {
        print("No video track found")
        return nil
    }
    guard
        let (naturalSize, formatDescriptions, mediaCharacteristics) =
            try? await videoTrack.load(.naturalSize, .formatDescriptions, .mediaCharacteristics),
        let formatDescription = formatDescriptions.first
    else {
        print("Failed to load video properties")
        return nil
    }
    videoInfo.size = naturalSize
    // MV-HEVC stereo content carries the stereo-multiview media characteristic
    videoInfo.isSpatial = mediaCharacteristics.contains(.containsStereoMultiviewVideo)
    let projection = VideoTools.getProjection(formatDescription: formatDescription)
    videoInfo.projectionType = projection.projectionType
    videoInfo.horizontalFieldOfView = projection.horizontalFieldOfView
    return videoInfo
}
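A minimal usage sketch of the helper above (assumptions: an async calling context, and a hypothetical file URL named videoURL; the printed fields are the VideoInfo properties set above):

// videoURL is a hypothetical file URL to an exported MV-HEVC file
let asset = AVURLAsset(url: videoURL)
if let info = await VideoTools.getVideoInfo(asset: asset) {
    print("size: \(info.size)")              // natural size of the video track
    print("isSpatial: \(info.isSpatial)")    // true for MV-HEVC stereo video
    print("projection: \(String(describing: info.projectionType))")
    print("horizontalFieldOfView: \(String(describing: info.horizontalFieldOfView))")  // 180 for immersive video
}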


57.
static func getProjection(formatDescription: CMFormatDescription) -> (
    projectionType: CMProjectionType?,
    horizontalFieldOfView: Float?
) {
    var projectionType: CMProjectionType?
    var horizontalFieldOfView: Float?
    if let extensions = CMFormatDescriptionGetExtensions(formatDescription) as Dictionary? {
        if let projectionKind = extensions["ProjectionKind" as CFString] as? String {
            projectionType = CMProjectionType(fromString: projectionKind) ?? .rectangular
        }
        if let horizontalFieldOfViewValue =
            extensions[kCMFormatDescriptionExtension_HorizontalFieldOfView] as? UInt32 {
            // The extension stores the value in thousandths of a degree
            horizontalFieldOfView = Float(horizontalFieldOfViewValue) / 1000.0
        }
    }
    return (projectionType, horizontalFieldOfView)
}

58.
(Same getProjection code as above, with a callout: ProjectionKind should be HalfEquirectangular.)

59.
(Same code again, with a second callout: HorizontalFieldOfView should be 180,000.)
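Putting the two callouts together, a quick sanity check one might run on a properly tagged 180° immersive file could look like this (a sketch; formatDescription is assumed to be loaded from the video track, as in getVideoInfo above):

let (projectionType, horizontalFieldOfView) =
    VideoTools.getProjection(formatDescription: formatDescription)
assert(projectionType == .halfEquirectangular)  // "ProjectionKind" == HalfEquirectangular
assert(horizontalFieldOfView == 180.0)          // stored as 180,000, i.e. thousandths of a degree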

60.

Viewer Implementation Creating a hemispherical mesh for projection

61.

Build compelling spatial photo and video experiences
https://developer.apple.com/videos/play/wwdc2024/10166/

62.

Thanks to Mike Swanson https://github.com/mikeswanson/SpatialPlayer

63.
public static func generateVideoSphere(
    radius: Float,
    sourceHorizontalFov: Float,
    sourceVerticalFov: Float,
    clipHorizontalFov: Float,
    clipVerticalFov: Float,
    verticalSlices: Int,
    horizontalSlices: Int
) -> MeshResource? {

    ...
    let mesh = try? MeshResource.generate(from: [meshDescriptor])
    return mesh
}

64.
public static func generateVideoSphere(
    radius: Float,               // 10,000
    sourceHorizontalFov: Float,  // 180
    sourceVerticalFov: Float,    // 180
    clipHorizontalFov: Float,    // 180
    clipVerticalFov: Float,      // 180
    verticalSlices: Int,         // 60
    horizontalSlices: Int        // 60
) -> MeshResource? {

    ...
    let mesh = try? MeshResource.generate(from: [meshDescriptor])
    return mesh
}
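A call sketch using the values annotated above (the VideoTools namespace is assumed, matching the other helpers in this deck):

let sphereMesh = VideoTools.generateVideoSphere(
    radius: 10_000,
    sourceHorizontalFov: 180,
    sourceVerticalFov: 180,
    clipHorizontalFov: 180,
    clipVerticalFov: 180,
    verticalSlices: 60,
    horizontalSlices: 60)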

65.
public static func generateVideoSphere(
    radius: Float,
    sourceHorizontalFov: Float,
    sourceVerticalFov: Float,
    clipHorizontalFov: Float,
    clipVerticalFov: Float,
    verticalSlices: Int,
    horizontalSlices: Int
) -> MeshResource? {
    // Vertices
    var vertices: [simd_float3] = Array(
        repeating: simd_float3(), count: (verticalSlices + 1) * (horizontalSlices + 1))
    let verticalScale: Float = clipVerticalFov / 180.0
    let verticalOffset: Float = (1.0 - verticalScale) / 2.0
    let horizontalScale: Float = clipHorizontalFov / 360.0
    let horizontalOffset: Float = (1.0 - horizontalScale) / 2.0
    for y: Int in 0...horizontalSlices {
        let angle1 = ((Float.pi * (Float(y) / Float(horizontalSlices))) * verticalScale) + (verticalOffset * Float.pi)
        let sin1 = sin(angle1)
        let cos1 = cos(angle1)
        for x: Int in 0...verticalSlices {
            let angle2 =
                ((Float.pi * 2 * (Float(x) / Float(verticalSlices))) * horizontalScale)
                + (horizontalOffset * Float.pi * 2)
            let sin2 = sin(angle2)
            let cos2 = cos(angle2)
            vertices[x + (y * (verticalSlices + 1))] = SIMD3<Float>(
                sin1 * cos2 * radius, cos1 * radius, sin1 * sin2 * radius)
        }
    }

    // Normals
    var normals: [SIMD3<Float>] = []
    for vertex in vertices {
        normals.append(-normalize(vertex))  // Invert to show on inside of sphere
    }

    // UVs
    var uvCoordinates: [simd_float2] = Array(repeating: simd_float2(), count: vertices.count)
    let uvHorizontalScale = clipHorizontalFov / sourceHorizontalFov
    let uvHorizontalOffset = (1.0 - uvHorizontalScale) / 2.0
    let uvVerticalScale = clipVerticalFov / sourceVerticalFov
    let uvVerticalOffset = (1.0 - uvVerticalScale) / 2.0
    for y in 0...horizontalSlices {
        for x in 0...verticalSlices {
            var uv: simd_float2 = [
                (Float(x) / Float(verticalSlices)), 1.0 - (Float(y) / Float(horizontalSlices)),
            ]
            uv.x = (uv.x * uvHorizontalScale) + uvHorizontalOffset
            uv.y = (uv.y * uvVerticalScale) + uvVerticalOffset
            uvCoordinates[x + (y * (verticalSlices + 1))] = uv
        }
    }

    // Indices / triangles
    var indices: [UInt32] = []
    for y in 0..<horizontalSlices {
        for x in 0..<verticalSlices {
            let current: UInt32 = UInt32(x) + (UInt32(y) * UInt32(verticalSlices + 1))
            let next: UInt32 = current + UInt32(verticalSlices + 1)
            indices.append(current + 1)
            indices.append(current)
            indices.append(next + 1)
            indices.append(next + 1)
            indices.append(current)
            indices.append(next)
        }
    }

    var meshDescriptor = MeshDescriptor(name: "proceduralMesh")
    meshDescriptor.positions = MeshBuffer(vertices)
    meshDescriptor.normals = MeshBuffer(normals)
    meshDescriptor.primitives = .triangles(indices)
    meshDescriptor.textureCoordinates = MeshBuffer(uvCoordinates)
    let mesh = try? MeshResource.generate(from: [meshDescriptor])
    return mesh
}

66.
// Vertices
var vertices: [simd_float3] = Array(
    repeating: simd_float3(), count: (verticalSlices + 1) * (horizontalSlices + 1))
let verticalScale: Float = clipVerticalFov / 180.0
let verticalOffset: Float = (1.0 - verticalScale) / 2.0
let horizontalScale: Float = clipHorizontalFov / 360.0
let horizontalOffset: Float = (1.0 - horizontalScale) / 2.0
for y: Int in 0...horizontalSlices {
    let angle1 = ((Float.pi * (Float(y) / Float(horizontalSlices))) * verticalScale) + (verticalOffset * Float.pi)
    let sin1 = sin(angle1)
    let cos1 = cos(angle1)
    for x: Int in 0...verticalSlices {
        let angle2 =
            ((Float.pi * 2 * (Float(x) / Float(verticalSlices))) * horizontalScale)
            + (horizontalOffset * Float.pi * 2)
        let sin2 = sin(angle2)
        let cos2 = cos(angle2)
        vertices[x + (y * (verticalSlices + 1))] = SIMD3<Float>(
            sin1 * cos2 * radius, cos1 * radius, sin1 * sin2 * radius)
    }
}


68.
// Normals
var normals: [SIMD3<Float>] = []
for vertex in vertices {
    normals.append(-normalize(vertex))  // Invert to show on inside of sphere
}


71.

// UVs
var uvCoordinates: [simd_float2] = Array(repeating: simd_float2(), count: vertices.count)
let uvHorizontalScale = clipHorizontalFov / sourceHorizontalFov
let uvHorizontalOffset = (1.0 - uvHorizontalScale) / 2.0
let uvVerticalScale = clipVerticalFov / sourceVerticalFov
let uvVerticalOffset = (1.0 - uvVerticalScale) / 2.0
for y in 0...horizontalSlices {
    for x in 0...verticalSlices {
        var uv: simd_float2 = [
            (Float(x) / Float(verticalSlices)), 1.0 - (Float(y) / Float(horizontalSlices)),
        ]
        uv.x = (uv.x * uvHorizontalScale) + uvHorizontalOffset
        uv.y = (uv.y * uvVerticalScale) + uvVerticalOffset
        uvCoordinates[x + (y * (verticalSlices + 1))] = uv
    }
}


73.

// Indices / triangles (two triangles per quad, wound to face the inside of the sphere)
var indices: [UInt32] = []
for y in 0..<horizontalSlices {
    for x in 0..<verticalSlices {
        let current: UInt32 = UInt32(x) + (UInt32(y) * UInt32(verticalSlices + 1))
        let next: UInt32 = current + UInt32(verticalSlices + 1)
        indices.append(current + 1)
        indices.append(current)
        indices.append(next + 1)
        indices.append(next + 1)
        indices.append(current)
        indices.append(next)
    }
}


75.

var meshDescriptor = MeshDescriptor(name: "proceduralMesh")
meshDescriptor.positions = MeshBuffer(vertices)
meshDescriptor.normals = MeshBuffer(normals)
meshDescriptor.primitives = .triangles(indices)
meshDescriptor.textureCoordinates = MeshBuffer(uvCoordinates)
let mesh = try? MeshResource.generate(from: [meshDescriptor])
return mesh

76.

Viewer Implementation Player

77.

@State private var player: AVPlayer = AVPlayer()
@State private var videoMaterial: VideoMaterial?

RealityView { content in
    guard let url = viewModel.videoURL else { return }
    let asset = AVURLAsset(url: url)
    let playerItem = AVPlayerItem(asset: asset)
    guard let videoInfo = await VideoTools.getVideoInfo(asset: asset) else { return }
    viewModel.videoInfo = videoInfo
    viewModel.isSpatialVideoAvailable = videoInfo.isSpatial
    guard let (mesh, transform) = await VideoTools.makeVideoMesh(videoInfo: videoInfo) else { return }
    videoMaterial = VideoMaterial(avPlayer: player)
    guard let videoMaterial else { return }
    let videoEntity = Entity()
    videoEntity.components.set(ModelComponent(mesh: mesh, materials: [videoMaterial]))
    videoEntity.transform = transform
    content.add(videoEntity)
    player.replaceCurrentItem(with: playerItem)
    player.play()
}
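VideoTools.makeVideoMesh is not shown in the deck; a plausible shape, assuming it wraps generateVideoSphere and derives the field of view from the loaded VideoInfo (the rotation is likewise an assumption, to aim the open half sphere at the viewer):

static func makeVideoMesh(videoInfo: VideoInfo) async -> (mesh: MeshResource, transform: Transform)? {
    let horizontalFov = videoInfo.horizontalFieldOfView ?? 180
    guard let mesh = generateVideoSphere(
        radius: 10_000,
        sourceHorizontalFov: horizontalFov,
        sourceVerticalFov: 180,
        clipHorizontalFov: horizontalFov,
        clipVerticalFov: 180,
        verticalSlices: 60,
        horizontalSlices: 60)
    else { return nil }
    // Orient the hemisphere toward the viewer (assumed convention).
    let transform = Transform(
        scale: .one,
        rotation: simd_quatf(angle: -.pi / 2, axis: [0, 1, 0]),
        translation: .zero)
    return (mesh, transform)
}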


82.

Build compelling spatial photo and video experiences
https://developer.apple.com/videos/play/wwdc2024/10166/

83.

Immersive Video Shooting Workflow / Viewer Implementation / Demo Event: Key Efforts and Outcomes


85.

Demo Event: Key Efforts and Outcomes
Key Efforts / Outcomes

86.

Key Efforts
Simple Experience Design: Prepared an instruction guide / No guest mode required / Glasses-friendly / One action to start the experience / Removed unnecessary features


88.

Development
Menu / File loading feature
Playback thumbnail position tracking / Playback button press detection / Seek bar for Ops
Immersive Space from startup (see the sketch below) / Pre-play music
Playback button animation / Back button behind the screen
End message / Pause and resume functionality
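"Immersive Space from startup" can be achieved on visionOS by making the immersive scene the app's primary scene. A minimal sketch under stated assumptions (the app and model names are hypothetical, and UIApplicationPreferredDefaultSceneSessionRole is set to UISceneSessionRoleImmersiveSpaceApplication in Info.plist):

import SwiftUI

@main
struct ZelvisionViewerApp: App {   // hypothetical name
    @State private var appModel = AppModel()
    var body: some Scene {
        ImmersiveSpace(id: "Viewer") {
            SelectionView()        // shown later in this deck
                .environment(appModel)
        }
        .immersionStyle(selection: .constant(.full), in: .full)
    }
}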


96.

VideoPlayer.swift

func openStream(_ stream: StreamModel) {
    stop()
    title = stream.title
    details = stream.details
    let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let destinationURL = documentsURL.appendingPathComponent(stream.url.lastPathComponent)
    do {
        if FileManager.default.fileExists(atPath: destinationURL.path) {
            copyAndRenameFileInDocuments(fileName: destinationURL.lastPathComponent)
        } else {
            defer { stream.url.stopAccessingSecurityScopedResource() }
            _ = stream.url.startAccessingSecurityScopedResource()
            try FileManager.default.copyItem(at: stream.url, to: destinationURL)
        }
    } catch {
        print(error.localizedDescription)
        return
    }
    let playerItem = AVPlayerItem(url: destinationURL)
    playerItem.preferredPeakBitRate = 200_000_000 // 200 Mbps
    player.replaceCurrentItem(with: playerItem)
    scrubState = .notScrubbing
    setupObservers()
}
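StreamModel itself is not shown in the deck; from its call sites here and in SelectionView.swift, a minimal definition could look like this (an assumption):

import Foundation

struct StreamModel: Identifiable {
    let id = UUID()      // not part of the visible call sites; added for list use
    let title: String
    let details: String
    let url: URL         // security-scoped URL from the file picker
}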


98.
VideoPlayer.swift

private func copyAndRenameFileInDocuments(fileName: String) {
    let fileManager = FileManager.default
    let documentsURL = fileManager.urls(for: .documentDirectory, in: .userDomainMask).first!
    let originalFileURL = documentsURL.appendingPathComponent(fileName)
    let copyFileURL = documentsURL.appendingPathComponent("\(fileName)_copy")
    do {
        guard fileManager.fileExists(atPath: originalFileURL.path) else { return }
        if fileManager.fileExists(atPath: copyFileURL.path) {
            try fileManager.removeItem(at: copyFileURL) // delete the "_copy" file if it exists
        }
        try fileManager.copyItem(at: originalFileURL, to: copyFileURL)
    } catch { return }
    do {
        try fileManager.removeItem(at: originalFileURL)
    } catch { return }
    do {
        try fileManager.moveItem(at: copyFileURL, to: originalFileURL)
    } catch {
        // print error
    }
}

Warning: this is a workaround.



102.
SelectionView.swift

DragGesture(minimumDistance: 0)  // fires immediately on touch down
    .targetedToAnyEntity()
    .onChanged { value in
        guard let entity = value.entity.children.first(where: { $0.name == "PlayButton" }) else { return }
        guard !isPressed else { return }
        if initialTransform == nil {
            initialTransform = entity.transform
            var pressedTransform = Transform()
            pressedTransform.scale = initialTransform!.scale * 0.8
            pressedTransform.translation.x += 0.01
            entity.move(to: pressedTransform, relativeTo: entity.parent, duration: 0.2,
                        timingFunction: .easeInOut)
        }
        isPressed = true
    }
    .onEnded { value in
        guard let entity = value.entity.children.first(where: { $0.name == "PlayButton" }) else { return }
        guard isPressed else { return }
        entity.move(to: initialTransform!, relativeTo: entity.parent, duration: 0.2, timingFunction: .easeInOut)
        initialTransform = nil
        isPressed = false
        guard let str = UserDefaults.standard.string(forKey: AppModel.selectedFileURL),
              let url = URL(string: str)
        else { return }
        let stream = StreamModel(title: url.lastPathComponent, details: "", url: url)
        appModel.selectedStream = stream
        appModel.isSelectionMode = false
    }


104.

Demo Event: Key Efforts and Outcomes
Key Efforts / Outcomes

105.

Outcomes
[Platinum Members Only] Pre-Season Immersive Experience Tour Announcement
Out of 300 platinum members: → 120 applied → 81 participated in the experience

106.

Survey Results
Evaluation of ZELVISION XR at the Demo Event
- Every respondent rated the experience as "satisfactory" or higher.
- Over half said it was "the most enjoyable" among all available content.
Expectations for Future Content
- 88% of respondents said they "strongly want to watch the next one," showing high anticipation for future releases.

107.

Participant Feedback
"The intensity and presence during the match were on a whole different level. Seeing the players' send-off and the pitchside view—things you can't normally experience—were truly unique to ZELVISION XR!"
"The pitchside footage was fantastic, and the stadium atmosphere came through perfectly. I think it's a great way to attract new fans to the stadium."
"I felt incredibly close to the players—it was almost as if I could actually high-five them."

109.

Wrap up You can shoot and implement this. The ecosystem will continue to grow. Building it yourself deepens understanding and opens up new possibilities. X: @shmdevelop

110.

Latest Updates


112.

Latest Updates
From Blackmagic URSA Cine Immersive to Vision Pro: the complete workflow with DaVinci Resolve | NAB 2025
https://www.youtube.com/watch?v=RyDnqD aBoc
Interview: Blackmagic Design's URSA Cine Immersive camera and Apple Vision Pro support
https://www.youtube.com/watch?v=QrV haN HOc