Redefining our digital lives

Technologies pushing our digital lives forward

Many different technologies and trends make up what we call the metaverse. Schibsted Futures Lab highlights a few that are major drivers of our expanded digital lives and experiences.

Our extended realities are growing

We have seen continuous growth in the adoption of extended reality (XR). It is predicted that the worldwide AR and VR industry will reach USD 209 billion by 2022. Beyond the gaming, healthcare and education sectors, there is an emerging demand for digitally immersive customer experiences. From shopping for clothes and trying on make-up to searching for your next dream home, the ability to interact with goods digitally before visiting the store has made XR a crucial part of the purchasing process.

A hurdle to the widespread adoption of these technologies is the lack of multi-platform standards. To break down this barrier, WebXR has been introduced as a future web standard that facilitates XR experiences through a URL, making them cheaper to implement and accessible without any additional hardware or software. Not only does WebXR make 3D content generation widely accessible to users, it could also be a game changer for enterprises looking to attract younger generations and scale up new businesses rapidly.
Yifan Hu, UX Designer
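
As a concrete illustration, below is a minimal sketch of how a single web page could detect headset support and start an XR session through the standard WebXR Device API. The function name and the button hook are illustrative assumptions, not part of any specific product.

```typescript
// Minimal WebXR sketch: feature-detect the API, then request a session.
// Runs in browsers that ship the WebXR Device API; the cast to `any`
// avoids needing extra type definitions such as @types/webxr.
async function enterImmersiveView(): Promise<void> {
  const xr = (navigator as any).xr; // undefined where WebXR is not available
  if (!xr) {
    console.log("WebXR not available, falling back to a regular 3D canvas.");
    return;
  }

  // Ask whether fully immersive VR is supported on this device and browser.
  const vrSupported: boolean = await xr.isSessionSupported("immersive-vr");
  const mode = vrSupported ? "immersive-vr" : "inline";

  // An "inline" session renders XR content inside the normal page layout,
  // so the same URL degrades gracefully on phones and laptops.
  const session = await xr.requestSession(mode);
  console.log(`Started a ${mode} WebXR session`);

  // Rendering would continue here via WebGL and session.requestAnimationFrame.
  await session.end();
}

// Immersive sessions must be started from a user gesture, e.g. a button:
// document.querySelector("#enter-xr")?.addEventListener("click", enterImmersiveView);
```

Because everything above is reachable through an ordinary URL, the same link can serve a headset, a phone or a laptop, which is exactly what makes WebXR attractive for reaching users without extra hardware or software.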

Game engines help speed up development

As experiences become digitally immersive, we need frameworks that provide the tools and libraries for building digital worlds and the physics for their interactions. This is where game engines come in. Unity and Unreal Engine are the clear market leaders. Since their debuts in the late 90s and early 2000s, they have evolved and expanded into other industries. Unreal Engine’s MetaHuman Creator has demonstrated rapid avatar generation and customisation: what used to take a 3D artist months now takes only minutes.

Many other industries have also begun using game engines to create prototypes and digital twins, from car companies like Porsche to architecture firms to manufacturing plants. As more and more designs are built in UE5, how long will it take before you see your digital avatar driving a Porsche 911 down the street?
Yifan Hu, UX Designer

Synthetic media has opened Pandora’s box

If you have been online at all in the last year, you’ve likely heard something about synthetic media. AI-enabled tools for generating text and images have exploded in recent months. In September of this year, an AI-generated artwork won the Colorado State Fair art prize, likely the first time a rural art competition has been at the centre of global controversy in the fine art world. AI tools like Midjourney, which was used to generate the winning artwork, are related to the technology behind deepfakes.

Neural network AI models are trained on vast amounts of existing works, whether text, images, or, as recently released by Google, short video clips. These models can then be used to generate new works, potentially revolutionising creative industries, and in the process sparking heated debates about intellectual property and the role of human creativity. Synthetic media likely has a large role to play in content creation, not only for the metaverse, but in countless existing industries as well, not least of which is news media. Pandora’s box is open. The question now is whether we can find ways to make responsible and ethical use of what’s inside.
Christopher Pearsell-Ross, UX Designer

Blend the real with the virtual

The scope and areas of application for digital twins have expanded significantly. A digital twin is a digital replica of a real-world object or system that lets you run one or several simulations with access to real-time data. This opens up the possibility of studying, tracking and monitoring multiple processes across various application areas, including urban planning, healthcare services, the automotive industry, manufacturing operations, large structures and power-generation equipment. Digital twins allow for seamless integration between digital and physical spaces, blending the real with the virtual.
Eline Wong, Junior Creative Technologist
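
To make the concept more tangible, here is a toy sketch of a digital twin for a single pump: it ingests real-time telemetry and answers a simple what-if question. The names (PumpTwin, Telemetry) and the naive trend projection are illustrative assumptions, not a real product’s API.

```typescript
// A toy digital twin: an in-memory replica of a physical pump, kept in sync
// with sensor readings and able to run a simple what-if simulation.

interface Telemetry {
  timestamp: number;   // milliseconds since epoch
  flowRate: number;    // litres per minute
  temperature: number; // degrees Celsius
}

class PumpTwin {
  private history: Telemetry[] = [];

  constructor(public readonly assetId: string) {}

  // Called whenever a new reading arrives from the physical asset.
  ingest(reading: Telemetry): void {
    this.history.push(reading);
  }

  get latest(): Telemetry | undefined {
    return this.history[this.history.length - 1];
  }

  // Deliberately simple simulation: project temperature forward assuming
  // the most recent warming trend continues for `minutes` minutes.
  simulateTemperature(minutes: number): number | undefined {
    if (this.history.length < 2) return undefined;
    const [prev, last] = this.history.slice(-2);
    const perMinute =
      (last.temperature - prev.temperature) /
      ((last.timestamp - prev.timestamp) / 60_000);
    return last.temperature + perMinute * minutes;
  }
}

// Usage: stream readings into the twin, then query the mirror instead of the machine.
const twin = new PumpTwin("pump-042");
twin.ingest({ timestamp: Date.now(), flowRate: 118, temperature: 41.2 });
twin.ingest({ timestamp: Date.now() + 60_000, flowRate: 117, temperature: 41.9 });
console.log("Projected temperature in 30 minutes:", twin.simulateTemperature(30));
```

Real deployments stream far richer data and use physics-based or learned models, but the pattern is the same: mirror the asset, keep the mirror updated in real time, and run simulations against the mirror rather than the physical system.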

Gaming has become social

Games are already the new social hubs for next-gen consumers. According to Newzoo’s latest report, more than 90% of Generation Alpha, Generation Z and Millennials have engaged with video games in some form, including live streaming, podcasts and attending in-person live events. The gaming industry’s revenue has grown to exceed USD 200 billion.

Social gaming still monetises at a high level through advertising, in-game offers and virtual goods purchased by players. Minecraft, Roblox and Fortnite top the charts for Gen Alphas. These games share three key concepts: players can build worlds, host events or go on adventures with their customised avatars. Since 2021, demand for building and creation tools has increased by 7%, most pronounced among young players aged eight to 15. While many adults are still pondering what the metaverse is, these young master builders have already begun hammering together their proto-metaverses.
Yifan Hu, UX Designer

Avatars will need to become more realistic

Meta’s release of the new Quest Pro XR headset underlines the importance of avatars, which have moved beyond the perception of them as mere animated Memojis. Rather, live, hyper-realistic avatars will be crucial to how we express ourselves in the future. Meta’s so-called codec avatars and instant avatars will allow users to generate a photorealistic digital self through a face scan with a mobile device.

These avatars should excel at conveying the non-verbal cues, such as facial expressions and eye contact, that we rely on to enhance communication. At the same time, securing your avatar, whether through encryption or an authenticated account, will become a critical challenge. Nvidia recently announced an open-source platform for building cloud-native, AI-powered avatars. From customer support agents to teaching assistants, digital humanoid interfaces will soon no longer be a fantasy.
Yifan Hu, UX Designer
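
As a purely hypothetical sketch of what securing an avatar could look like, the example below binds an avatar payload to an account-held key pair using the standard Web Crypto API: the payload is signed when it is created and verified before it is rendered. The payload format and function names are assumptions made for the example, not any vendor’s actual mechanism.

```typescript
// Hypothetical avatar signing and verification with the Web Crypto API
// (available in modern browsers and recent Node.js versions).

const encoder = new TextEncoder();

async function signAvatar(avatarData: string, privateKey: CryptoKey): Promise<ArrayBuffer> {
  // ECDSA over P-256 with SHA-256.
  return crypto.subtle.sign(
    { name: "ECDSA", hash: "SHA-256" },
    privateKey,
    encoder.encode(avatarData),
  );
}

async function verifyAvatar(
  avatarData: string,
  signature: ArrayBuffer,
  publicKey: CryptoKey,
): Promise<boolean> {
  return crypto.subtle.verify(
    { name: "ECDSA", hash: "SHA-256" },
    publicKey,
    signature,
    encoder.encode(avatarData),
  );
}

async function demo(): Promise<void> {
  // In practice the key pair would be created when the account is
  // authenticated, with the public key published alongside the avatar.
  const { privateKey, publicKey } = await crypto.subtle.generateKey(
    { name: "ECDSA", namedCurve: "P-256" },
    false,
    ["sign", "verify"],
  );

  const avatarData = JSON.stringify({ owner: "user-123", meshUrl: "avatar.glb" });
  const signature = await signAvatar(avatarData, privateKey);

  console.log("Untouched avatar verifies:", await verifyAvatar(avatarData, signature, publicKey));
  // Any tampering with the payload invalidates the signature.
  console.log("Tampered avatar verifies:", await verifyAvatar(avatarData + "x", signature, publicKey));
}

demo().catch(console.error);
```

Verifying the signature before rendering would let a platform reject avatars that have been tampered with or that are not bound to the account presenting them.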