Apple is introducing a new feature called Object Capture to its RealityKit 2 engine that will make it easier for app and game developers to create high-quality, immersive AR content.
As detailed at WWDC21, Object Capture is a "simple and powerful" API for macOS Monterey (currently available in beta) that will enable developers to create photorealistic 3D models of real-world objects in minutes, turning photos shot on iPhone, iPad, or DSLR into 3D models optimized for AR.
"These models can be viewed in AR Quick Look or added to AR scenes in Reality Composer or Xcode, making it easier than ever to build amazing AR apps," explained Apple. "Developers like Maxon and Unity are using Object Capture to unlock entirely new ways of creating 3D content."
RealityKit 2, which is the second iteration of Apple's rendering, animation, audio, and physics engine, will also be bolstered with new APIs including custom render passes and dynamic shaders to let developers piece together more realistic and complex AR experiences.
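The dynamic-shader support mentioned above surfaces as RealityKit 2's CustomMaterial type, which binds Metal functions to an entity's material. A hedged sketch follows; the Metal function names ("mySurfaceShader", "myGeometryModifier") and the bundled-library setup are assumptions for illustration.

```swift
import Metal
import RealityKit

// Load the app's default Metal library (assumes the shader functions
// are compiled into the app bundle).
let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!

// Bind hypothetical Metal functions as the material's shaders.
let surfaceShader = CustomMaterial.SurfaceShader(named: "mySurfaceShader", in: library)
let geometryModifier = CustomMaterial.GeometryModifier(named: "myGeometryModifier", in: library)

// Build the custom material and apply it to a simple model entity.
let material = try CustomMaterial(surfaceShader: surfaceShader,
                                  geometryModifier: geometryModifier,
                                  lightingModel: .lit)
let box = ModelEntity(mesh: .generateBox(size: 0.1), materials: [material])
```

The surface shader runs per fragment to compute appearance, while the geometry modifier runs per vertex, letting developers deform meshes or animate surfaces without leaving RealityKit's renderer.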
You can use the following links to find out more about Object Capture and RealityKit 2, including feature breakdowns, developer forums, guides, and much more.