The Metaverse is the next step in the internet's evolution: the convergence of physical, augmented, and virtual reality in a shared online space. It can be thought of as an immersive, three-dimensional version of our current internet, an internet that you're inside of rather than one you're merely looking at. The Metaverse would need a critical mass of interconnected technologies, including the following.

Number one, virtual reality. This is a computer-generated experience that immerses users in life-like simulated environments. Real-life use cases include gaming, social networking, education, and on-the-job training. Unity's CEO, John Riccitiello, predicts AR and VR headsets will be as common as game consoles by 2030. And according to ARK Invest, VR headsets could reach smartphone adoption rates by 2030. By that time, accessories such as full-body haptic suits and VR gloves will likely explode in popularity. There are already multiple platforms that allow you to connect with other VR users. Facebook Horizon lets you explore virtual worlds where you can connect with people across the world, participate in fun challenges, and even create your own virtual worlds. Horizon Workrooms is a collaboration experience that lets people come together to work in the same virtual room, regardless of physical distance. Currently, the most notable VR headsets include the Oculus Quest 2, HTC Vive Cosmos, Sony PlayStation VR, and Valve Index.

Number two, artificial intelligence. AI will benefit the metaverse in numerous ways, as Eric Elliott describes in a Medium article. Realistic-looking, intelligent AI beings could wander the metaverse and interact with us and with each other. They could be programmed with their own life stories, motivations, and objectives. Depending on the type of virtual world they're in, we could participate in pre-planned scenarios with these characters or create our own scenarios. Unreal Engine's MetaHuman Creator could play a major part in the creation of these characters.
And if and when these characters are able to exhibit artificial general intelligence, the results could be pretty amazing and surreal. AI can also help streamline the creation of metaverse assets such as characters, landscapes, buildings, character routines, and more. We may see a future where advanced AI capabilities are integrated with game engines such as Unreal Engine to make this possible. AI can automate software development processes so that we can build increasingly complex assets within the metaverse with less and less effort. And AI can be used to create, audit, and secure smart contracts on the blockchain. Basically, smart contracts allow trusted transactions and agreements to be carried out without the need for a central authority or legal system.

Number three, augmented reality. This is an experience in which designers enhance parts of a user's physical world with computer-generated input. Eventually, AR contact lenses and AR glasses could be used to augment the world around us and facilitate virtual assistance with the help of sophisticated artificial intelligence. This AI would help us navigate both the real world and the virtual world. Currently, the Microsoft HoloLens and the Magic Leap One are the most notable augmented- and mixed-reality headsets on the market. They're primarily used for enterprise purposes, but as their prices decrease, we'll see more AR headsets sold to the general public.

Number four, blockchain technology. In a decentralized metaverse, blockchain technology would be an ideal foundation for facilitating quick and secure digital transactions. Even though blockchain technology came into existence with Bitcoin, blockchain has far-reaching potential beyond cryptocurrency. Basically, a blockchain is a shared database that allows multiple parties to access data and verify that data in real time.

Number five, brain-computer interfaces.
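Before getting to brain-computer interfaces, the blockchain idea described above, a shared ledger that any party can independently verify, can be made concrete with a toy sketch. This is a simplified illustration only (real blockchains add consensus, digital signatures, and networking); the function and field names are hypothetical:

```python
import hashlib
import json

def block_hash(block):
    # Hash a block's contents deterministically (sorted keys give a stable encoding).
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, data):
    # Each new block commits to the previous block's hash,
    # so tampering with any earlier block breaks the chain.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})
    return chain

def verify(chain):
    # Any party holding a copy can re-check the entire history on their own,
    # with no central authority involved.
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

ledger = []
add_block(ledger, {"from": "alice", "to": "bob", "amount": 5})
add_block(ledger, {"from": "bob", "to": "carol", "amount": 2})
print(verify(ledger))                # True
ledger[0]["data"]["amount"] = 500    # tamper with history
print(verify(ledger))                # False: the chain no longer checks out
```

The key design point is that verification requires nothing but the data itself, which is what lets multiple parties trust a shared record without a central authority.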
Brain-computer interfaces would allow us to control our avatars, various objects, and digital transactions with our brain signals. This technology is expected to gain an initial foothold in the video game and workforce-productivity markets. It won't play a major part in the early years of the metaverse; however, according to Ray Kurzweil, by the mid-2030s some early adopters might begin using brain-computer interfaces to connect their neocortices to the cloud. Several companies are already developing brain-computer interfaces; Neuralink, NextMind, and Neurable are among the most notable.

Number six, internet infrastructure. First, 5G and 6G. The metaverse would require extremely high internet speeds, high bandwidth, and low latency, especially when a user enters a vast virtual world with highly detailed textures and unbelievably high polygon counts. 5G operates at extremely high frequencies in the millimeter-wave spectrum, which opens up possibilities like VR experiences that include the sense of touch and AR experiences that let visitors have in-depth conversations with AI characters in real time. Eventually, 6G will replace 5G. 6G isn't a functional technology yet, but several countries have already launched research initiatives. Some experts estimate that it could be 100 times faster than 5G, which would equate to 1 terabyte (TB) per second. At that speed, you could download 142 hours of Netflix movies in one second. According to a white paper by NTT DOCOMO, 6G would make it possible for cyberspace to support human thought and action in real time through wearable devices and micro-devices mounted on the human body. Sensory interfaces would feel and look just like real life.

Next, Web 3.0. In the Web 1.0 era, content creators were scarce, with the vast majority of users simply acting as consumers of content. For the most part, we're currently in the Web 2.0 era.
Web 2.0 brought us the "Web as a Platform", where software applications are built on the web rather than only for desktop computers. This enabled masses of users to participate in content creation on social networks, blogs, sharing sites, and more. However, Web 2.0 also greatly empowers centralized tech giants and enables surveillance and exploitative advertising.

One of Web 3.0's primary advantages is its support for decentralized blockchain protocols, which would enable individuals to connect to an internet where they can own, and be properly compensated for, their time and data. This is more advantageous than a web where giant, centralized companies own the lion's share of the web and can siphon off large percentages of the profits. Additionally, according to Rajarshi Mitra at blockgeeks.com, Web 3.0 is expected to allow computers to understand the semantics, or meanings, of sentences so that they can generate, share, and connect content through search and analysis. Thanks to semantic metadata, Web 3.0 will help facilitate greater connectivity between data sources. As a result, the user experience evolves to another level of connectivity that leverages all of the available information on the internet.

Number seven, mobile device processors. For augmented reality to appeal to the mainstream public, it likely needs to work on normal-looking glasses, which would require small, super-fast mobile processors that can be fitted into them. VR devices will also need fast mobile processors to handle hyper-realistic graphics, low latency, high refresh rates, high frame rates, and so forth. As processors require more cores and components, we may also see the introduction of optical components that would work in conjunction with traditional silicon components. This could result in a 100-fold increase in data transfer speeds.
Lightmatter, a start-up founded at MIT, is working on a chip that computes using light.

Thanks for watching. To learn more about the metaverse, make sure to watch the video to the right.