Designs projected into physical space, made almost tangible. Augmented reality solutions let you experience plans beyond two-dimensional visualizations.
After seeing the mixed reality tutorials we wanted to go one step further and take advantage of the Vive's potential. We had the idea of skipping the tracking process on the actual footage entirely, and started working out a solution.
Elements of the system
▪ Live connection between the Tracker and 3DS Max; tracker data is imported into 3DS Max through a Unity application
▪ Live keyed image sent from the camera to a control monitor
▪ The camera in 3DS Max is synchronized with the actual physical camera
▪ A log file records the spatial coordinates of the camera
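The camera log can be as simple as a timestamped text file. Here is a minimal sketch in Python of what such a log could look like; the field layout, file name, and sample values are our assumptions for illustration, not taken from the actual pipeline:

```python
import csv
import time

# Assumed log layout: one row per sample with a timestamp,
# position (x, y, z) and rotation (pitch, yaw, roll) in degrees.
FIELDS = ["t", "x", "y", "z", "pitch", "yaw", "roll"]

def log_pose(writer, pos, rot, t=None):
    """Append one camera pose sample to the CSV log."""
    t = time.time() if t is None else t
    writer.writerow([t, *pos, *rot])

def write_pose_log(path, samples):
    """Write a full tracking session; samples = [(t, pos, rot), ...]."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(FIELDS)
        for t, pos, rot in samples:
            log_pose(writer, pos, rot, t)

# Hypothetical session: two samples of a slowly panning camera.
write_pose_log("camera_track.csv", [
    (0.00, (0.0, 1.6, 2.0), (0.0, 0.0, 0.0)),
    (0.04, (0.0, 1.6, 2.0), (0.0, 1.5, 0.0)),
])
```

A log in this spirit is enough to reproduce the camera move later in 3DS Max, since each row fully describes the camera at one point in time.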
If these work, then:
▪ We can render background footage with a matching camera, without the frustrating tracking process
▪ We can direct the real camera in the studio based on the virtual environment
▪ Without the need for tracking marks, the keying process also becomes much easier
The link between the tracker and 3DS Max was implemented by a program that translates between Unity and Max. Merging the synchronized image from the camera with the Max scene was also done in a custom-developed Unity application.
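We are not showing the actual translator code here; as an illustration of the idea, this Python sketch shows the kind of simple line protocol such a bridge could use, where the Unity side serializes each tracker pose into one text line and the Max side parses it back. The message format is purely our assumption:

```python
def encode_pose(pos, rot):
    """Serialize a pose as one line: 'x y z pitch yaw roll\n'.
    This is what the sending (Unity) side would write to the socket."""
    return " ".join(f"{v:.6f}" for v in (*pos, *rot)) + "\n"

def decode_pose(line):
    """Parse a pose line back into (position, rotation) tuples.
    This is what the receiving (3DS Max) side would do per line."""
    vals = [float(v) for v in line.split()]
    return tuple(vals[:3]), tuple(vals[3:])

# Round trip: what one side sends is exactly what the other reads.
pos, rot = (0.5, 1.6, -2.0), (0.0, 90.0, 0.0)
line = encode_pose(pos, rot)
assert decode_pose(line) == (pos, rot)
```

A one-line-per-pose text protocol like this keeps the two programs loosely coupled: either end can be replaced as long as it speaks the same line format.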
▪ Live keying helps a lot in studio work (an HDMI transmitter is needed)
▪ We struggled with synchronizing the footage and the tracking data in time
▪ This solution gives tracking information equal to the usual workflow
▪ The tracker detection sometimes has accuracy problems
▪ An extended tracking space would be nice; more base stations?
▪ Don't forget to calibrate your DJI Ronin
▪ Don't forget to turn off the camera's built-in stabilization
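The footage/tracking time-sync struggle mentioned above essentially comes down to matching each video frame to the tracker sample closest to it in time. A small sketch of that matching step, assuming timestamped samples like the ones our log produces (the frame rate and sample rate are illustrative):

```python
import bisect

def nearest_sample(frame_time, sample_times):
    """Return the index of the tracking sample closest in time to a frame.
    sample_times must be sorted ascending."""
    i = bisect.bisect_left(sample_times, frame_time)
    if i == 0:
        return 0
    if i == len(sample_times):
        return len(sample_times) - 1
    # Pick whichever neighbour is closer to the frame timestamp.
    before, after = sample_times[i - 1], sample_times[i]
    return i if after - frame_time < frame_time - before else i - 1

# 25 fps footage vs. ~90 Hz tracker samples.
fps = 25.0
samples = [k / 90.0 for k in range(10)]
frame_times = [n / fps for n in range(3)]
matches = [nearest_sample(t, samples) for t in frame_times]  # [0, 4, 7]
```

Any constant offset between the camera clock and the tracker clock still has to be measured once (for example with a clap or a flash visible in both streams) and subtracted from the frame timestamps before matching.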