Bring your PC and console games to Mac | Meet with Apple
Apple Developer· 2025-12-15 21:14
Hi there. My name is Rich Forster and I'm here to talk to you about bringing your game and its renderer to Metal. I'm assuming you have a PC or console game, and you want to make it part of the growing catalog of titles on Mac. Bringing your game to Mac means that you can target a range of devices with the common foundation of Apple Silicon. This is inZOI, by the way. Apple Silicon means that you have a strong baseline of advanced rendering features, and knowing the capability of each device, you can give yo ...
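The excerpt above points to using each device's known capabilities to scale a renderer's features. As a rough illustration only (not code from the session), here is a minimal Swift sketch that asks Metal for the system GPU and checks a couple of capability flags; the specific family and feature checks are assumptions chosen for demonstration.

```swift
import Metal

// Illustrative sketch: query the baseline GPU capabilities before enabling
// advanced render paths. The checks below are example choices, not a
// prescribed feature gate from the session.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("Metal is not available on this system")
}

// Apple Silicon Macs report an Apple-family GPU, so a family check can gate
// features such as tile shaders or mesh shading behind a capability test.
let supportsApple7 = device.supportsFamily(.apple7)   // A14/M1-class feature set
let supportsRayTracing = device.supportsRaytracing    // hardware/driver ray tracing query

print("GPU: \(device.name), Apple7 family: \(supportsApple7), ray tracing: \(supportsRayTracing)")
```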
Level up with Apple game technologies | Meet with Apple
Apple Developer· 2025-12-13 00:01
Hi, I'm Alexey and I lead the Game Technologies team here at Apple. Your players expect the best possible experience in your games. And today I want to talk about the frameworks, tools and resources that are available to help you achieve it. In fact, the Apple ecosystem is really outstanding when it comes to gaming. The vast majority of players already have more than one device available, and they are carrying these devices around, using them every day. They don't need an additional dedicated device just ...
Chart your game’s course to Apple platforms | Meet with Apple
Apple Developer· 2025-12-12 22:00
Apple's Unified Gaming Platform
- Apple's unified gaming platform enables developers to target Mac, iPad, and iPhone from a single code base, supporting graphically demanding games across various hardware and platforms [3]
- The unified architecture also serves as the foundation for building games for spatial computing on Apple Vision Pro [3]

Game Engine Support
- Unity and Unreal Engine offer deep support for Apple platforms, leveraging Metal for direct GPU access and simplifying the process of targeting Apple devices [7]
- Apple provides Unity plug-ins that bridge native APIs, allowing developers to integrate features like Game Center, Game Controller support (MFi-compliant controllers, Sony, Microsoft), accessibility features (VoiceOver, Dynamic Type), and Core Haptics [9][10][11][12][13][14][15]
- Apple's Game Center plug-in facilitates integration of Leaderboards, Achievements, Challenges, and matchmaking (a native GameKit sketch of the kind of calls it wraps follows this summary) [11]

Game Porting Toolkit
- Apple offers a Game Porting Toolkit for bringing Windows or console games to Apple platforms, enabling developers to leverage existing PC workflows and tools [21]
- The toolkit includes an evaluation environment to run existing x86 Windows binaries on Mac, Metal developer tools for Windows, and Mac remote developer tools for Windows [22][23][26][28]
- The Metal Shader Converter automatically converts DirectX Intermediate Language (DXIL) into a Metal library [29]
- Metal-cpp, a header-only C++ wrapper for the Metal API, allows developers to use Metal in a familiar language [31]
- The Game Porting Toolkit includes Human Interface Guidelines for games, offering advice on platform conventions, on-screen controls, and input handling [31][32]
- The game porting sample code serves as both a tutorial for porting and a standalone example of a native game built for Mac, iPad, and iPhone [33][34]
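For context on the Game Center bullet above, here is a minimal native Swift sketch of the kind of GameKit calls that plug-in layer bridges: local player authentication and a leaderboard score submission. This is an illustrative assumption, not the plug-in's own code; the leaderboard ID and score value are placeholders.

```swift
import GameKit

// Minimal sketch of native Game Center integration: authenticate the local
// player, then submit a score to a leaderboard. Identifiers are placeholders.
final class GameCenterHelper {
    func authenticate() {
        GKLocalPlayer.local.authenticateHandler = { viewController, error in
            if let error {
                print("Game Center authentication failed: \(error.localizedDescription)")
                return
            }
            // If `viewController` is non-nil, present it so the player can sign in.
            print("Authenticated: \(GKLocalPlayer.local.isAuthenticated)")
        }
    }

    func submitScore(_ score: Int) {
        GKLeaderboard.submitScore(score,
                                  context: 0,
                                  player: GKLocalPlayer.local,
                                  leaderboardIDs: ["com.example.highscores"]) { error in
            if let error {
                print("Score submission failed: \(error.localizedDescription)")
            }
        }
    }
}
```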
Discover game development on Apple platforms | Meet with Apple
Apple Developer· 2025-12-12 20:13
Hi, I'm Greg. It's great to see so many of you here and online. And I'm Adrienne. Greg and I are part of Apple's Worldwide Developer Relations team. We're so excited to welcome you to a full day focused on game development across Apple platforms. Every day we collaborate with developers just like you, from indie dreamers to Triple-A powerhouses, pushing what's possible on iPhone, iPad, Mac and Apple Vision Pro. Today, you'll hear from the Apple teams who build the tools and technologies that make that possibl ...
Hands-on experience with Spatial Audio for Apple Immersive Video | Meet with Apple
Apple Developer· 2025-12-10 00:01
Spatial Audio Design Principles
- Immersive audio design should account for the entire visual frame, as viewers can explore different areas and focus on mini-stories within the scene [1][2][5][6]
- Spatial audio should consider the sonic world behind the viewer, utilizing equirectangular representations to place sounds accurately in 360-degree space [7][8][9][10]
- Headlocked stereo is suitable for elements like voiceover and narration, where the sound should remain constant relative to the viewer's ears rather than being spatialized within the scene [12][13][14][15][42][43][44]

DaVinci Resolve Fairlight Integration
- DaVinci Resolve integrates editing, color correction, and Fairlight audio post-production, enabling a collaborative workflow without data exchange between pages [17][18]
- Fairlight offers standard DAW tools, including faders, spatial panners (azimuth, elevation, Cartesian view; a conversion sketch follows this summary), built-in EQ, dynamics processors (compressor, gate, limiter), and bus routing [18][19][20][21][22]
- Fairlight supports ASAF (Apple Spatial Audio Format) master bus output, allowing mixing in binaural, stereo, or speaker-system configurations optimized for the listening environment [23][24][25][26]
- Buses can be set up in Fairlight to route audio signals to different destinations, including ambisonic formats for Apple Immersive Video [27][28]

ASAF Workflow and Features
- ASAF and ambisonics enable realistic positioning of sounds in the sound field, such as placing stereo sounds in the upper canopy of trees [37][38]
- Headlocking in Fairlight can be achieved by routing a track to the main ASAF output, selecting the track, and enabling the headlocked option in the inspector [43]
- DaVinci Resolve Fairlight's IntelliTrack, an AI-based tracker, automates complex pan moves by tracking objects in the video, saving time for audio engineers [52][53][54][55][56]

Project Setup and Delivery
- Apple Immersive Audio requires DaVinci Resolve Studio (the paid version) and enabling Apple Spatial Audio in the Video and Audio I/O preferences [57][58]
- Final output for Vision Pro or master files can be delivered through the Deliver page, with options for Vision Pro Review or creating a master file bundle [58][59]
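As a companion to the spatial panner bullet above, here is a short Swift sketch of the spherical-to-Cartesian conversion such a panner implies when azimuth and elevation controls are mapped to a position in space. The angle and axis conventions are assumptions for illustration and are not tied to Fairlight's internal implementation.

```swift
import Foundation

// Illustrative sketch: map panner-style azimuth/elevation/distance values to a
// Cartesian position. Assumed conventions: azimuth 0° = straight ahead,
// positive to the right; elevation 0° = ear level, positive upward; meters.
func pannerPosition(azimuthDegrees: Double,
                    elevationDegrees: Double,
                    distance: Double) -> (x: Double, y: Double, z: Double) {
    let az = azimuthDegrees * .pi / 180
    let el = elevationDegrees * .pi / 180
    let x = distance * cos(el) * sin(az)    // left/right
    let y = distance * sin(el)              // up/down
    let z = -distance * cos(el) * cos(az)   // forward is -z in this convention
    return (x, y, z)
}

// Example: a sound 30° to the right and 15° above ear level, 2 m away.
print(pannerPosition(azimuthDegrees: 30, elevationDegrees: 15, distance: 2))
```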
Hands-on experience with visual effects for Apple Immersive Video | Meet with Apple
Apple Developer· 2025-12-09 22:29
Immersive VFX Challenges & Solutions
- Immersive VFX is more complex than traditional VFX due to stereo lenses, high frame rates (90 fps), and 8K resolution per eye [2]
- Even simple VFX elements can have a significant impact in immersive environments if executed well [3]
- Previsualization in 180 degrees is crucial for planning various aspects of production, potentially justifying its budget through cost savings [4][5]
- Data capture on set (HDRIs, photogrammetry, LiDAR scans, etc.) is essential for adding elements, cleanup, and ensuring spatial awareness in stereo [6][7]
- Camera tracking accuracy is critical due to the realistic stereoscopic depth in immersive experiences [9]
- Reviewing VFX work in Apple Vision Pro is necessary at all stages to avoid discrepancies between 2D monitors and the final immersive output [10]

DaVinci Resolve/Fusion Workflow for Immersive Video
- DaVinci Resolve's Fusion tab offers tools for immersive VFX, including Immersive Patcher, LatLong Patcher, and Panomap [15][18]
- The Immersive Patcher converts lens-space images to rectilinear images for easier paint and compositing work, with an adjustable angle of view [20][21]
- The Panomap tool converts between different formats, such as immersive to lat-long, useful for stabilization and CG render mapping (a lat-long mapping sketch follows this summary) [23][24]
- Fusion allows previewing work in the Apple Vision Pro headset by combining left and right eyes into layers using a combiner node [26][27]
- DaVinci Resolve can pass through the ILPD metadata into EXRs, maintaining camera information throughout the VFX pipeline [74]
- The Spherical Stabilizer in Fusion can stabilize shaky camera footage by converting immersive space to lat-long space [86][87]
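To make the lat-long (equirectangular) mapping referenced for Panomap and the LatLong Patcher concrete, here is a short Swift sketch that maps a normalized pixel coordinate in a lat-long image to a direction on the unit sphere. The axis conventions are assumptions for illustration, not the tools' internals.

```swift
import Foundation

// Illustrative sketch: a lat-long (equirectangular) image stores the full
// sphere, so each pixel's normalized (u, v) in [0, 1] corresponds to one
// viewing direction. Assumed conventions: u wraps longitude, v spans latitude,
// and forward is -z.
func directionForLatLongPixel(u: Double, v: Double) -> (x: Double, y: Double, z: Double) {
    let longitude = (u - 0.5) * 2 * .pi     // -π ... π, 0 = straight ahead
    let latitude  = (0.5 - v) * .pi         // -π/2 ... π/2, 0 = horizon
    let x = cos(latitude) * sin(longitude)
    let y = sin(latitude)
    let z = -cos(latitude) * cos(longitude)
    return (x, y, z)
}

// Example: the center pixel of the image looks straight ahead.
print(directionForLatLongPixel(u: 0.5, v: 0.5))   // ≈ (0, 0, -1)
```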
Preserving presence for Apple Immersive Video: Learnings from post production | Meet with Apple
Apple Developer· 2025-12-09 00:00
Immersive Video Post-Production Workflow
- The immersive post-production phases are similar to traditional film or television projects, with a focus on creative aspects because the new Blackmagic workflow simplifies technical considerations [1][2]
- Apple Immersive Video emphasizes viewer presence, requiring post-production to protect and intentionally alter this sense [3]
- Reviewing dailies in Vision Pro and Resolve is crucial for identifying the best moments and addressing capture issues that could interfere with presence [4][5]
- Apple Immersive Video-enabled cameras, like the Blackmagic camera, address common immersive issues such as bad 3D sync and inaccurate world scale, simplifying the media workflow [6]

Creative and Technical Considerations
- Reviewing shots involves both a monitor and Vision Pro, as issues that detract from presence must ultimately be assessed in Vision Pro [7][8]
- Editorial cuts in immersive video are like transporting the viewer, requiring careful attention to timing, pacing, and information delivery to avoid viewer overload [9][10][11][12]
- Editors consider z-space and viewer comfort when cutting, using techniques like a dip to black to reset depth cues and guide attention [13][14]
- Spatial audio is crucial for a convincing sense of presence, requiring mixers to ensure all expected sounds are audible, even if not the focus [15][16][17][18]
- Visual effects in immersive video must adhere to the sense of presence, with elements like title graphics needing to exist in 3D space and integrate with the environment's light [23][24][25][26][27]

Finishing and Color Grading
- The finishing process addresses distractions from presence, such as image noise and compression artifacts, which are particularly noticeable in immersive video [28][29][30][31]
- Creative color grading aims for an idealized reality, pushing the limits of authenticity while maintaining a realistic feel [31][32][33]
- Color grading should be done with the experience of being inside the image in mind, utilizing live preview in Resolve to get the feeling just right [34]

Key Takeaway
- All creative and technical decisions should be made in Vision Pro to accurately assess the sense of presence [35][36][37]
Capturing live events for Apple Immersive Video: Learnings from "Metallica" | Meet with Apple
Apple Developer· 2025-12-08 19:42
Immersive Media Production - Key Takeaways
- Immersive experiences aim to capture the feeling of being part of an event, not just observing it [5]
- Access to prime locations is crucial for capturing immersive experiences, requiring negotiation and navigating legacy workflows [8]
- Collaboration with artists is essential to understand their vision and design coverage accordingly [9][10]

Technical Aspects of Immersive Capture
- Stage mapping and choreography analysis are vital for strategic camera placement [10][13]
- Previsualization in 3D environments like Vision Pro helps refine camera angles and movement [14]
- Stabilized cameras and dynamic rigs like Steadicams and Spidercams are necessary to match the energy of live events [16][17]

Overcoming Traditional Limitations
- Traditional long lenses and camera-pit placements are not effective for immersive capture [6][7]
- Immersive capture requires breaking down the wall between artist and fan [10]
- The goal is to make the audience feel like they are on stage with the performers [6]
Production expectations for Apple Immersive Video: Learnings from "Submerged" | Meet with Apple
Apple Developer· 2025-12-05 23:00
Hi everyone. Hi, everyone streaming at home. I am Alex and I am one of our producers working across many of our immersive projects, and one of the other exciting pieces we released last year was our first scripted project, Submerged. Let's take a quick look at what we made. Now, everything we're doing here is the first time that anyone's ever done it. When you put on the Apple Vision Pro, it does change the way you think about creating a story. It's a wonderful new medium that just expands the horizon of sto ...
Previsualization for Apple Immersive: Learnings from “Open Hearts” by The Weeknd | Meet with Apple
Apple Developer· 2025-12-05 21:00
Hello everyone, I'm Anton Tammi. Today I'm happy to share the benefits of previsualization that I learned when directing the first Apple Immersive music video, Open Hearts for The Weeknd. I've directed music videos for a decade now. Like many in my generation, I started with independent music videos, surrounded by like-minded colleagues, learning directing with an international community of filmmakers. We shared knowledge and tips with each other both offline and online. Eventually, my portfolio grew and big mus ...