
Virtual and Augmented Reality Update: Apple Vision Pro and the Metaverse

Please welcome back Dijam Panigrahi to 21st Century Tech Blog. This is his third contribution to this website.

Dijam is co-founder and COO of GridRaster, a company well-versed in developing augmented, virtual and mixed realities for industrial and commercial applications. GridRaster provides cloud-based augmented and virtual reality (AR/VR) platforms and experiences for mobile devices and computing environments. 

The world of digital twinning is no longer in its infancy, and immersive technologies built around AR/VR headsets are prompting aerospace, defence, transportation and manufacturing companies to integrate mixed reality into their planning, product development and production processes.

In this posting, Dijam introduces Apple’s new Vision Pro headset and how it will help manufacturers and engineers leverage the Metaverse. The Metaverse is considered by many to be the next evolution of the Internet. So please enjoy and feel free to comment.


Manufacturing is embracing new technologies. While the Internet and mobile technology have played a major role in this evolution, Apple’s release of the Vision Pro headset, combined with the Metaverse, promises to elevate the experience for manufacturers.

Equipped with two high‑resolution main cameras, six world‑facing tracking cameras, four eye‑tracking cameras and a TrueDepth camera, plus multiple sensors, a LiDAR scanner and more, the Vision Pro provides innovative features to distinguish it from other AR/VR headsets and gives manufacturers an effective immersive tool.

The Vision Pro is controlled with finger and hand gestures and voice commands. It is equipped with two displays, one for each eye, with a combined total of 23 million pixels. The visual experience is nothing short of extraordinary. A custom 3D lens keeps the user interface consistently visible, complemented by features such as high dynamic range (HDR), depth, colour and contrast to enhance image quality.

Does that make Vision Pro just a better virtual gaming tool, or has it been designed to revolutionize manufacturing?

Manufacturers, Vision Pro and the Metaverse

Manufacturers today can leverage the Metaverse and build digital twins to create greater efficiencies in their operations. Vision Pro can make this even better.

The Metaverse is evolving. The shared virtual digital universe, where users interact with each other in a computer-generated environment, has taken off and is being used to create virtual retail spaces, warehouses, and manufacturing facilities.

Digital twins inhabit the Metaverse. They are digital representations or replicas of real-world entities or systems, including physical objects, processes, and people. Using data from sensors, Internet-of-Things (IoT) devices, and other sources, digital twins simulate reality.
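To make the idea concrete, here is a minimal Python sketch of a digital twin kept in sync with sensor readings. The class, fields, thresholds and sample readings are illustrative assumptions, not any specific vendor's API.

```python
# A minimal sketch of a digital twin updated from IoT sensor readings.
# All names, fields and thresholds below are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class MachineTwin:
    """Digital replica of one physical machine on the shop floor."""
    machine_id: str
    temperature_c: float = 0.0
    spindle_rpm: float = 0.0
    last_update: Optional[datetime] = None
    history: list = field(default_factory=list)

    def ingest(self, reading: dict) -> None:
        """Apply one sensor reading to the twin's simulated state."""
        self.temperature_c = reading.get("temperature_c", self.temperature_c)
        self.spindle_rpm = reading.get("spindle_rpm", self.spindle_rpm)
        self.last_update = datetime.now(timezone.utc)
        self.history.append(reading)

    def overheating(self, limit_c: float = 85.0) -> bool:
        """A check the virtual replica can run before the real machine fails."""
        return self.temperature_c > limit_c


# Example: stream two readings into the twin and query its state.
twin = MachineTwin(machine_id="cnc-07")
twin.ingest({"temperature_c": 72.5, "spindle_rpm": 12_000})
twin.ingest({"temperature_c": 88.1, "spindle_rpm": 11_500})
print(twin.overheating())  # True -> flag for maintenance in the virtual model first
```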

Vision Pro features are a perfect companion for the Metaverse. The spatial computing capability of the technology lets workers overlay digital twin information, such as instructions, diagrams, or 3D models, onto reality. This makes operational processes more efficient by providing visual cues, step-by-step guidance, and real-time feedback. Maintenance tasks are streamlined: technicians access digital manuals, diagnostics, and remote expert support hands-free. Overall productivity improves and downtime is reduced.
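As a rough illustration of how step-by-step guidance could be keyed to a twin, the sketch below attaches an ordered list of instructions to a component and serves the next step for the headset to overlay. The component name and instructions are invented for the example.

```python
# A hedged sketch of step-by-step guidance keyed to a digital twin component.
# Component IDs and instructions are invented for illustration.
WORK_INSTRUCTIONS = {
    "hydraulic-pump-03": [
        "Isolate power and lock out the panel.",
        "Remove the four M8 housing bolts.",
        "Swap the seal kit and torque bolts to 24 Nm.",
        "Run the built-in diagnostic and confirm pressure is nominal.",
    ],
}


def next_step(component_id: str, completed_steps: int) -> str:
    """Return the instruction the headset should overlay next."""
    steps = WORK_INSTRUCTIONS.get(component_id, [])
    if completed_steps >= len(steps):
        return "Procedure complete - log the job against the twin."
    return steps[completed_steps]


print(next_step("hydraulic-pump-03", 1))  # -> "Remove the four M8 housing bolts."
```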

Digital twins provide real-time visibility into inventory, production, and distribution. They identify bottlenecks, optimize logistics, reduce new operational startup time, enhance virtual prototyping, and accelerate product testing and design. The result is faster time-to-market at reduced cost.
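A simple way to picture bottleneck identification: compare each station's measured throughput, as reported through the twin, against the rate the plant needs to ship. The stations and numbers below are made up for illustration.

```python
# Toy bottleneck detection from digital-twin throughput data (invented numbers).
line_throughput = {          # units completed per hour, fed by the twin
    "stamping": 140,
    "welding": 95,
    "painting": 130,
    "assembly": 120,
}

target_rate = 110  # units per hour the plant needs to ship

bottlenecks = {name: rate for name, rate in line_throughput.items() if rate < target_rate}
print(bottlenecks)  # {'welding': 95} -> the station to optimize first
```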

In the retail space, the Metaverse experienced through Vision Pro becomes a world of virtual stores and showrooms that enhance the customer experience, allowing shoppers to explore and interact with products in the virtual space and driving online sales.

Manufacturers can use Vision Pro in the Metaverse for employee training simulations, especially where complex machinery is involved. This can improve employee skills and enhance safety. It also creates opportunities for virtual workspaces across a wide geography, promoting collaboration and communication among even remote team members. Manufacturers can also leverage digital twins when physical presence is challenging.

3D & AI in Immersive Mixed Reality

One of the key requirements for mixed reality applications in the Metaverse is precisely overlaying a virtual object onto the physical world. This provides a visual presentation and work instructions for assembly and training. In the case of manufacturing, overlaying objects with a digital twin can catch errors or defects. Users can track objects and adjust renderings as work progresses.

Most on-device object tracking systems use 2D images or marker-based tracking. This severely limits overlay accuracy in 3D because 2D tracking cannot estimate depth, and consequently scale and pose, with high accuracy. So even though users can get what looks like a good match from one angle or position, the overlay loses alignment as the user moves around in 6DOF.

Not familiar with the term 6DOF (six degrees of freedom)? Machines or motion systems on the drawing board are designed to move a certain way but may move differently in the real world. The six degrees of freedom are three translational motions (forward/back, left/right, up/down) and three rotational motions (pitch, yaw and roll). Only 3D digital simulation captures all of them. When you add deep learning-based 3D artificial intelligence (AI), users can identify objects of arbitrary shape and size in various orientations in the virtual space and compare them with real-world counterparts.
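The toy calculation below, a sketch rather than any vendor's tracking code, shows why a depth error from 2D tracking matters: an overlay that lines up perfectly from the original viewpoint drifts by tens of pixels after a single sideways step of the camera. The focal length and distances are assumed values.

```python
# Toy illustration of why a depth error from 2D tracking breaks a 6DOF overlay.
# A 6DOF pose is three translations (x, y, z) plus three rotations (pitch, yaw,
# roll); here a single sideways translation is enough to expose the error.
FOCAL_PX = 1000.0  # assumed pinhole focal length in pixels


def project(point_cam: tuple) -> tuple:
    """Pinhole projection of a point given in camera coordinates (metres)."""
    x, y, z = point_cam
    return (FOCAL_PX * x / z, FOCAL_PX * y / z)


true_anchor = (0.0, 0.0, 2.0)       # real part sits 2 m in front of the camera
estimated_anchor = (0.0, 0.0, 3.0)  # 2D tracking over-estimates depth by 1 m

# From the original viewpoint both project to the image centre: the overlay looks perfect.
print(project(true_anchor), project(estimated_anchor))  # (0.0, 0.0) (0.0, 0.0)

# Move the camera 0.5 m to the right (points shift 0.5 m left in camera coordinates).
u_true, _ = project((-0.5, 0.0, 2.0))
u_est, _ = project((-0.5, 0.0, 3.0))
print(round(abs(u_true - u_est), 1))  # ~83.3 px of misalignment from one sideways step
```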

Working in the Cloud Environment

Technologies like AR/VR have been in use for several years. Many companies have deployed virtual solutions where all the data is stored locally. This severely limits performance and scale for today’s virtual designs. It also limits the ability to share knowledge between organizations, which can prove critical in designing new products and understanding the best approach to virtual buildouts.

To overcome these limitations companies are turning to cloud-based (or remote server-based) AR/VR platforms powered by distributed architecture and 3D vision-based AI. These cloud platforms provide the desired performance and scalability to drive innovation in the industry at speed and scale.
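At a very high level, the split-rendering idea behind such platforms looks like the sketch below: the headset sends its latest 6DOF pose to a remote GPU server and receives an encoded frame to display. The endpoint, payload shape and frame format are hypothetical; a real deployment would also handle pose prediction, compression and very tight latency budgets.

```python
# A high-level sketch of split rendering on a cloud AR/VR platform.
# The server URL, payload shape and frame format are hypothetical.
import json
from urllib import request


def request_remote_frame(pose: dict, server_url: str) -> bytes:
    """Send the headset pose to the render server and return the encoded frame."""
    payload = json.dumps({"pose": pose, "scene": "factory-digital-twin"}).encode()
    req = request.Request(server_url, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req, timeout=0.05) as resp:  # stale frames are useless
        return resp.read()


# Example pose: translation in metres plus rotation as pitch/yaw/roll in degrees.
pose = {"position": [1.2, 1.6, -0.4], "rotation": [0.0, 35.0, 0.0]}
# frame = request_remote_frame(pose, "https://render.example.internal/frame")
# (decode the frame and hand it to the headset compositor)
```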

Vision Pro and its successors are among the technologies shaping the Metaverse now and into the future. New extended reality devices, such as brain-computer interfaces, will continue to emerge. The cloud will continue to grow. We are still at the very beginning of the age of computing, which is barely three-quarters of a century old.

The human digital experience, therefore, remains at an early stage as we enter an alternative to the physical Universe: the Metaverse of our own creation. So prepare to have your mind blown.

 
