Ncam updates camera tracking solution for AR
April 17, 2020

LONDON — Ncam Technologies (www.ncam-tech.com), a developer of realtime augmented reality technology for the media and entertainment industries, has released its new Mk2 Camera Bar, Mk2 Server and Ncam Reality 2020 software. First shown as a prototype at IBC2019, the Mk2 Camera Bar has since undergone significant developments and enhancements.

Compared to the original Mk1, the Mk2 Camera Bar is 50 percent smaller and lighter while offering more functionality and higher quality. The Mk2 uses Intel RealSense hardware and is well suited for both broadcast and film environments. Its compact size allows it to be easily used with a jib, Steadicam, wire cam or a drone. 
 
Previous-generation hardware required an ethernet tether to return tracking data to a server running Ncam Reality software. The next-generation software now runs on the Mk2 Server, which can be mounted on the camera or rig itself. All camera tracking and lens data is computed locally, and fully wireless tracking is available through a standard RF camera link.
 
The Ncam Reality 2020 software suite has also been redesigned, with key enhancements in several areas: an easy-to-use wizard walks users through setup, hybrid feature extraction combines natural features, markers and fiducials, and remote access is now supported. Additionally, the AR Suite software, in its Lite form, now comes bundled as standard. This provides seamless, future-proofed integration with the latest Unreal Engine 4 toolset, delivering a complete out-of-the-box solution for high-fidelity realtime VFX.
 
“This release marks a huge milestone in the history of Ncam,” comments Nic Hatch, Ncam’s CEO. “This new platform will be the foundation of our technology moving forward and is just the beginning in allowing us to help customers realise their vision without having to worry about technology. The close partnerships we have with the likes of Intel, Epic, and others will allow us to leverage further enhancements in both tracking and rendering technologies, as well as our own developments around spatial environment data capture and its reuse in non-live environments.”