Hyundai sends Boston Dynamics’ Spot robot into the metaverse

 


Hyundai has ambitious goals for its robotics development, and it is clearly serious about pursuing them. As evidenced by its acquisition of Boston Dynamics, a deal that valued the robotics pioneer at well over $1 billion, the automaker has so far been willing to put its money where its mouth is.

 

As expected, robotics featured heavily in the company’s CES presentation this week. Earlier this month, Hyundai offered a sneak preview of the Mobile Eccentric Droid (MobED), a four-wheeled modular mobility platform designed for urban environments. With the introduction of its new “metamobility” concept, the company laid out a more comprehensive vision for the future.

 

Hyundai will be revealing more details about its strategy, and we will be speaking with several executives to get a sense of how it might play out in practice. For now, the broad concept goes by “Expanding Human Reach,” and it aims to establish a role for mobility and robotics in a virtual reality metaverse. At this early stage it is difficult to separate industry buzzwords from practical implications, but one important component appears to be using hardware as a real-world proxy for virtual reality interactions.

 

For now, let us simply say that a lot of big promises are being made about addressing the lack of tangible experience that has long been a fundamental issue with virtual reality applications. Here is a statement from Chang Song, president of Hyundai Motor Group:

 

A fundamental premise of metamobility is that space, time, and distance will all become irrelevant in the future. By connecting robots to the metaverse, we will be able to freely move between the real world and virtual reality. The metaverse will provide us with an immersive ‘be there’ proxy experience, but in the future, robots will become an extension of our physical senses, giving us the ability to reshape and enrich our daily lives through Metamobility.

 

A plausible near-term application is using this technology to control a manufacturing robot from a distance. Toyota has been exploring that possibility with its T-HR3 system for some time. Hyundai also suggests that Microsoft Cloud for Manufacturing could serve as a gateway for remote-controlled work, and it’s not difficult to imagine such a system serving a practical purpose in some contexts.
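To make that idea concrete, here is a minimal, purely hypothetical sketch of how remote-controlled work might be wired together in software: an operator publishes commands to a cloud gateway, and a factory robot consumes and executes them. Neither Hyundai nor Microsoft has published such an interface, so the in-process queue below merely stands in for a cloud broker, and every name in it (Command, FactoryRobot, the gateway queue) is an assumption made purely for illustration.

"""Toy sketch of remote robot operation through a cloud gateway.

Hypothetical only: the in-process queue stands in for a cloud message
broker, and all names (Command, FactoryRobot) are invented for
illustration.
"""

import queue
from dataclasses import dataclass


@dataclass
class Command:
    joint: str        # which joint/axis to move
    delta_deg: float  # how far to move it, in degrees


class FactoryRobot:
    """Receives commands from the gateway and 'executes' them."""

    def __init__(self) -> None:
        self.joint_angles = {"base": 0.0, "elbow": 0.0, "wrist": 0.0}

    def execute(self, cmd: Command) -> None:
        # Apply the commanded offset and report the new joint angle.
        self.joint_angles[cmd.joint] += cmd.delta_deg
        print(f"robot: {cmd.joint} -> {self.joint_angles[cmd.joint]:.1f} deg")


def main() -> None:
    gateway: "queue.Queue[Command]" = queue.Queue()  # stand-in for a cloud broker
    robot = FactoryRobot()

    # Operator side: publish commands to the gateway.
    for cmd in (Command("base", 15.0), Command("elbow", -30.0), Command("wrist", 5.0)):
        gateway.put(cmd)

    # Robot side: consume and execute whatever arrives.
    while not gateway.empty():
        robot.execute(gateway.get())


if __name__ == "__main__":
    main()

In a real deployment, the queue would be replaced by an authenticated cloud messaging service, and the robot side would enforce its own safety limits before executing anything it receives.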

 

Other applications are much further off. According to a Hyundai press release, “a user can feed and hug a pet in Korea via an avatar robot when they access a digital twin of their home in the metaverse while traveling away from their physical home. This allows users to participate in real-world experiences through the use of virtual reality.”

 

For now, such ideas remain largely conceptual, though Hyundai is demonstrating what they could eventually look like in person at CES this week. Considering how many people participated in the show virtually during the latest COVID surge, it’s easy to see how remote operations could become more common in the future.

 

Bringing robotics to inanimate objects, transit

 

Hyundai did not spend its entire CES appearance in virtual reality. The company also unveiled a concept called “New Mobility of Things,” which uses robotics to autonomously move inanimate objects of all sizes.

 

Plug & Drive, also known as PnD, is a product that falls under the umbrella term “New Mobility of Things.” Aside from intelligent steering and braking, an in-wheel electric drive, and suspension hardware, this single-wheel unit is also equipped with lidar and camera sensors that allow it to detect and navigate around obstacles.

 

PnD modules are intended to be attached to various objects, such as office tables. A user can direct a table to move closer to them, or schedule it to move at a predetermined time when more space is required.
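As a rough illustration of what that interaction could look like in software, here is a minimal, hypothetical sketch of commanding and scheduling a PnD-equipped table. Hyundai has not published a PnD programming interface, so the class and method names below (PnDTable, move_to, schedule_move) are invented purely for illustration.

"""Hypothetical sketch of commanding a PnD-equipped table.

Hyundai has not published a PnD software interface; the class and
method names below are assumptions made purely for illustration.
"""

from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Tuple


@dataclass
class PnDTable:
    """A piece of furniture driven by (hypothetical) PnD wheel modules."""

    name: str
    position: Tuple[float, float] = (0.0, 0.0)
    scheduled_moves: List[Tuple[datetime, Tuple[float, float]]] = field(default_factory=list)

    def move_to(self, target: Tuple[float, float]) -> None:
        """Drive immediately to a target position; obstacle avoidance is
        assumed to be handled onboard by the module's lidar and cameras."""
        print(f"{self.name}: moving from {self.position} to {target}")
        self.position = target

    def schedule_move(self, when: datetime, target: Tuple[float, float]) -> None:
        """Queue a move for a predetermined time, e.g. clearing floor space
        before a meeting."""
        self.scheduled_moves.append((when, target))

    def run_pending(self, now: datetime) -> None:
        """Execute any scheduled moves whose time has arrived."""
        due = [m for m in self.scheduled_moves if m[0] <= now]
        for when, target in due:
            self.move_to(target)
            self.scheduled_moves.remove((when, target))


if __name__ == "__main__":
    table = PnDTable("office-table-1")
    table.move_to((2.0, 3.5))                                    # "come closer"
    table.schedule_move(datetime(2022, 1, 5, 9, 0), (0.0, 0.0))  # clear space at 9:00
    table.run_pending(datetime(2022, 1, 5, 9, 5))

The sketch assumes obstacle avoidance is handled onboard by the module’s own lidar and cameras, so the caller only supplies a target position or a scheduled time.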

 

The PnD module is designed to be adaptable and expandable to individual needs. In the future, according to Dong Jin Hyun, vice president and head of Hyundai’s robotics lab, you will no longer need to move your belongings; instead, they will move around you. “PnD breathes life into objects that would otherwise be inanimate. It makes it possible to transform virtually any space. It is a method for configuring spaces in a dynamic manner.”

 

Hyundai demonstrated a variety of PnD applications, including a personal mobility pod that can carry a single person to a waiting public transit bus. The pod, built around four 5.5-inch PnD modules, would dock with the “mother shuttle” and later be released.

 

After the bus comes to a stop, the pod, with the person inside, would continue on for the final mile to their destination, assuming everything goes as planned.

 

Hyundai demonstrated the concept in a video showing an elderly woman being handed her cane by a single PnD module before climbing into the pod and zipping off to a waiting bus. If it becomes a reality, the system could provide first- and last-mile public transportation without putting more single-occupant cars on the road.

 

Hyundai also demonstrated a concept called “Drive & Lift,” or DnL, a module capable of lifting objects. The company paired DnL with its MobED robot: a module is mounted on each of MobED’s wheels, allowing the platform to rise and fall in response to uneven terrain or low obstacles such as steps and speed bumps.
