The hottest new concept of 2021 is the metaverse. On October 29, 2021, Facebook announced that it would change its name to Meta; on November 1, 2021, Roblox, widely dubbed the "first metaverse stock," announced that it was back online after a brief outage. The offline and online discussions about the metaverse that followed have been in full swing, which shows just how popular the concept has become.
Dismantling the metaverse layer by layer
Shen Yang, a professor and doctoral supervisor at the School of Journalism of Tsinghua University, explained at an event that the English word Metaverse, read literally, consists of two parts: Meta (transcendence) and Universe. Professor Shen Yang's team has also given a relatively precise definition of the metaverse:
The metaverse is a new type of Internet application and social form that integrates multiple new technologies. It provides an immersive experience based on extended reality technology, generates a mirror image of the real world based on digital twin technology, and builds an economic system based on blockchain technology. In it, the virtual world and the real world are closely integrated across the economic system, the social system, and the identity system, and every user can produce content and edit the world.
Jon Radoff, founder of Beamable, has dismantled the metaverse concept at the industry level into a seven-layer structure: experience; discovery; creator economy; spatial computing; decentralization; human interface; infrastructure.
The experience layer is relatively easy to understand: the gaming and social companies we commonly see today are all working at this layer. The classic example is the game "Second Life," in which users are called "residents" and interact with one another through movable virtual avatars. On top of the usual metaverse basics, the program provides an advanced social networking service: residents can walk around, meet other residents, socialize, take part in individual or group activities, and create and trade virtual property and services with one another. And "Ready Player One" is the archetypal free-ranging imagination of the metaverse at the experience layer.
The discovery layer is the main way users learn about the experience layer, including the various application stores; the main participants are large Internet companies.
Creator economy: helps monetize what metaverse creators build, including design tools, monetization technologies, animation systems, graphics tools, and so on;
Spatial computing: empowers the creator economy layer, including 3D engines, gesture recognition, spatial mapping, and artificial intelligence; the main participants are 3D hardware and software vendors;
Decentralization: companies at this layer mainly help the metaverse ecosystem build a distributed architecture and form a democratized structure;
Human interface: the human-computer interaction layer is mainly the medium through which the public reaches the metaverse, reflected in touch, gesture, voice, and neural interfaces. Products include AR/VR headsets, mobile phones, computers, cars, and smart wearables such as glasses; the main participants are 3D hardware and software manufacturers;
Infrastructure: 5G, semiconductor chips, new materials, cloud computing, telecommunications networks, and so on. The infrastructure layer is likely to be a contest among giants, most of them fundamental hardware companies.
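To make the stack concrete, here is a minimal sketch of Radoff's seven layers as an ordered structure, from the user-facing top to the hardware bottom. This is our own illustration; the participant summaries come from the descriptions above, not from Radoff's materials.

```python
# Illustrative only: Jon Radoff's seven metaverse layers, top to bottom.
# Participant notes are paraphrased from this article, not from Radoff.
METAVERSE_LAYERS = [
    ("Experience",        "games and social platforms (e.g. Second Life)"),
    ("Discovery",         "app stores run by large Internet companies"),
    ("Creator economy",   "design tools, monetization, animation, graphics"),
    ("Spatial computing", "3D engines, gesture recognition, spatial mapping, AI"),
    ("Decentralization",  "distributed architecture, democratized structure"),
    ("Human interface",   "AR/VR, phones, computers, cars, smart wearables"),
    ("Infrastructure",    "5G, chips, new materials, cloud and telecom networks"),
]

for depth, (layer, players) in enumerate(METAVERSE_LAYERS, start=1):
    print(f"{depth}. {layer}: {players}")
```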
It is fair to say that the metaverse is a concentrated outlet for the future needs of the entire human economy: users' hunger for new experiences, capital's hunger for new outlets, and technology's hunger for new frontiers. It is the inevitable new concept that emerges once science and technology develop to a certain stage. Even if the "metaverse" had not appeared in 2021, some other concept such as the "metaworld" or the "metamatrix" might well have appeared instead.
The key supporting technologies of the metaverse
The definition and industry stratification above cover ground that many people have become familiar with recently, but they still do not explain the path to realizing the metaverse. In the final analysis, what we most want to figure out is how to turn the metaverse of our dreams into reality.
From a technical perspective, the key supports of the metaverse can be summarized as "HNCBD": hardware experience devices (Hardware), network and computing power (Networking and Computing), content and application ecosystem (Content), blockchain and NFT (Blockchain), and digital twins (Digital Twin). Different people may carve up these core technologies slightly differently, but the overall picture does not vary much.
In "HNCBD", H belongs to hardware and is not within the scope of regular discussion by software developers; C relies on a blooming application community; and network and computing power, blockchain and NFT, and digital twins, in fact, there is a unified form of carrying , Is cloud computing.
Anyone paying attention has long seen that, setting aside the wheel-reinventing caused by commercial competition, the best path to realizing the metaverse is actually the cloud. In a sense, the cloud carries not only the metaverse's unprecedented demand for computing power and infrastructure, but also the various PaaS and SaaS services on top of that infrastructure. If, in the course of the metaverse's development, every application and content provider had to rebuild the infrastructure themselves, including basic data lake and warehouse services, digital twin services, and machine learning services, the cost would be unimaginable.
At the current stage of cloud computing, beyond providing basic computing power, what matters most is offering sufficiently mature technical products in three directions: games, AI algorithms, and VR. The most representative player here is Amazon Cloud Technology.
Recall the scenes in "Ready Player One" where the characters put on their headsets and enter the game world; that is essentially a typical cloud gaming scenario.
At present, large-scale games use a server-plus-client model that places high demands on client hardware; 3D graphics rendering, in particular, relies almost entirely on the terminal. With the arrival of the 5G era, games will be rendered at scale on cloud GPUs, and game frames will be compressed and streamed to users over high-speed 5G networks.
On the client side, the user's gaming device needs no high-end processor or graphics card, only basic video decompression capability. From the game-development side, a platform can deploy new game features faster and reduce the building and testing required to launch a game, better meeting players' needs.
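The division of labor can be sketched in a few lines of code. The following is a toy illustration of the render-on-server, decode-on-client split, not any vendor's actual protocol: JPEG compression via OpenCV stands in for a real hardware video codec (H.264/H.265), and the "network" is just bytes passed in-process.

```python
# Toy sketch of the cloud-gaming split: the server renders and compresses
# each frame; the thin client only decompresses and displays. JPEG via
# OpenCV is a stand-in for a real hardware video codec, and a real service
# would stream over a network rather than pass bytes in-process.
import numpy as np
import cv2

def server_render_frame(tick: int) -> bytes:
    """'Render' a frame on the cloud GPU (here: a synthetic gradient)."""
    frame = np.zeros((720, 1280, 3), dtype=np.uint8)
    frame[:, :, 0] = (tick * 5) % 256          # animate the blue channel
    ok, encoded = cv2.imencode(".jpg", frame)  # compress before transmission
    assert ok
    return encoded.tobytes()

def client_decode_frame(payload: bytes) -> np.ndarray:
    """The client needs only video decompression, no GPU rendering."""
    buf = np.frombuffer(payload, dtype=np.uint8)
    return cv2.imdecode(buf, cv2.IMREAD_COLOR)

for tick in range(3):
    payload = server_render_frame(tick)    # runs in the cloud
    frame = client_decode_frame(payload)   # runs on the user's device
    print(f"frame {tick}: {len(payload)} bytes on the wire -> {frame.shape}")
```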
In September 2020, Amazon Cloud Technology launched its own cloud gaming platform, Luna, compatible with PC, Mac, Fire TV, iPad, iPhone, and Android. Epic Games, the well-known game and platform maker, also uses Amazon EC2 and other Amazon Cloud Technology services to expand capacity on demand and support remote creators. Amazon EC2 G4 instances drive cloud game rendering with GPUs and stream even the most complex cloud games through the NVIDIA Video Codec SDK. The NVIDIA T4 GPUs on G4 instances also made them the first cloud GPU instances to provide RT Cores and support NVIDIA RTX real-time ray tracing.
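For a sense of what renting this rendering power looks like in practice, here is a minimal boto3 sketch that requests a single T4-backed G4dn instance. The AMI ID and key pair name are placeholders, not real values, and a production setup would also configure networking and security groups.

```python
# Illustrative only: request one NVIDIA T4-backed G4dn instance with boto3.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder: use a GPU-ready AMI
    InstanceType="g4dn.xlarge",       # G4 family: NVIDIA T4 with RT Cores
    KeyName="my-key-pair",            # placeholder key pair name
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```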
The metaverse experience is not limited to cloud gaming, though. Cloud gaming is just one scenario; VR is the path.
The limitations of traditional VR show up mainly in four aspects: the high cost of buying host and terminal hardware, low device utilization, scattered content, and limited mobility.
Combining cloud computing with VR moves GPU rendering from the local device to the cloud, making terminals lighter and cheaper to build and lowering the cost of the hardware users must buy. VR developers can rapidly iterate and publish content in the cloud, and users can click to play without downloading, which also solves the problem of fragmented content.
Take Amazon Sumerian as an example: developers can easily create 3D scenes and embed them in existing web pages. The Amazon Sumerian editor provides ready-made scene templates and intuitive drag-and-drop tools that let content creators, designers, and developers build interactive scenes. Amazon Sumerian uses the latest WebGL and WebXR standards to create immersive experiences directly in the web browser, reachable within seconds through a simple URL, and it runs on the major hardware platforms used for AR/VR.
Beyond cloud gaming and VR, the other key variable in realizing the metaverse is AI. AI can shorten the time needed for digital creation and provide underlying support for the metaverse, mainly in computer vision, intelligent speech and semantics, and machine learning. All three demand enormous computing power and storage, and cloud computing supplies artificial intelligence with effectively unlimited amounts of both. GE Healthcare, for instance, uses Amazon EC2 P4d instances to cut the processing time of customized AI models from days to hours and to train models two to three times faster, supporting a range of telemedicine and diagnostic services.
The value of AI is even more obvious for virtual avatars, and Amazon Cloud Technology's AI services have many practical applications in this field, including AI image generation (automatic coloring, scene adjustment, image stylization), automatic model generation (automatic animation generation, scene prop generation), game bots (AI NPCs, text interaction, voice-driven lip animation, motion capture, expression transfer), and virtual-idol marketing operations (chat monitoring, popularity matching, anti-cheat), among others.
The core work of cloud computing companies in the metaverse field
If the dismantling above is mostly theoretical analysis, then a closer look at the recent moves of the leading cloud computing companies shows that the various technical supports for the metaverse are becoming reality in the cloud.
At the 2021 Amazon Cloud Technology re:Invent conference, Amazon Cloud Technology released Amazon IoT TwinMaker and Amazon Private 5G.
The former lets developers easily aggregate data from multiple sources (such as device sensors, cameras, and business applications) and combine it into a knowledge graph that models the real-world environment, one of the constituent technologies of the industrial metaverse.
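For a sense of the developer-facing shape of the service, a minimal boto3 sketch might look like the following. All names and ARNs are placeholders, and since the service was brand new at the time, the exact parameters should be checked against the current boto3 iottwinmaker documentation.

```python
# Hedged sketch: create an IoT TwinMaker workspace and one entity with boto3.
# Every identifier below is a placeholder; verify parameter details against
# the current boto3 "iottwinmaker" documentation before use.
import boto3

tm = boto3.client("iottwinmaker", region_name="us-east-1")

tm.create_workspace(
    workspaceId="factory-demo",                           # placeholder name
    s3Location="arn:aws:s3:::my-twinmaker-bucket",        # placeholder bucket
    role="arn:aws:iam::123456789012:role/TwinMakerRole",  # placeholder role
)

tm.create_entity(
    workspaceId="factory-demo",
    entityName="conveyor-belt-1",  # a physical device modeled in the twin
)
```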
The latter can automatically set up and deploy a company's private 5G network and scale capacity on demand to support more devices and network traffic, aimed squarely at the huge clusters of sensors and edge devices in Industry 4.0 scenarios. The industrial metaverse just mentioned and the Internet of Vehicles naturally belong to this same lineage.
And then there is Amazon SageMaker Canvas, which takes a no-code approach to building machine learning models and generating predictions, so that services can still be delivered without a data engineering team. This further lowers the threshold for future metaverse content production and helps guarantee content diversity.
Also during the 2021 Amazon Cloud Technology re:Invent global conference, the metaverse company Meta announced that it would deepen its cooperation with Amazon Cloud Technology and use Amazon Cloud Technology as a strategic cloud service provider.
According to reports, Meta uses Amazon Cloud Technology's reliable infrastructure and comprehensive capabilities to complement its existing on-premises infrastructure, and will adopt more of Amazon Cloud Technology's compute, storage, database, and security services to gain better privacy protection, reliability, and scalability in the cloud. That includes running third-party partner applications on Amazon Cloud Technology and using its cloud services to support acquisitions of companies that already run on Amazon Cloud Technology.
Meta will also use Amazon Cloud Technology's compute services to accelerate research and development of artificial intelligence projects in its Meta AI division. In addition, the two companies will work together to help customers improve the performance of the deep learning framework PyTorch running on Amazon Cloud Technology, and to help developers accelerate building, training, deploying, and operating artificial intelligence and machine learning models.
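The workload being tuned here is ordinary PyTorch training. As a generic, minimal sketch (our own illustration, not any API from the Meta/Amazon collaboration), this is the kind of training step whose throughput on cloud GPUs such collaborations aim to improve:

```python
# A minimal, generic PyTorch training loop -- illustrative only. The model,
# data, and hyperparameters are arbitrary; performance work on cloud GPUs
# targets workloads of exactly this shape.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"  # use a cloud GPU if present

model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Dummy batch standing in for real training data.
x = torch.randn(32, 64, device=device)
y = torch.randint(0, 10, (32,), device=device)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)  # forward pass
    loss.backward()              # backward pass
    optimizer.step()             # parameter update
```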
Zhang Wenyi, Amazon global vice president and executive director of Amazon Cloud Technology Greater China, believes this is a field where cloud computing can empower on a massive scale. As she put it: "We believe the metaverse is bound to be a field massively empowered by cloud computing. The metaverse itself needs computing, storage, machine learning, and more, none of which can be separated from cloud computing."
The future is still being drawn
Will the metaverse's technology stack expand in the future, and will the way the metaverse presents itself change drastically?
The answer is almost certainly yes, just as, before 4G phones became widespread, we could not have imagined what the main applications of the 4G ecosystem would be. On computing power alone, the metaverse still has at least a decade or more to go from construction to maturity.
However much things change, though, the fundamentals remain: watching how the metaverse's supporting technologies in the cloud computing field are updated and iterated may be an important way for us to set the bubble aside and observe the real progress of the metaverse ecosystem.