The DeanBeat: A Big Bang week for the metaverse

The metaverse had a couple of Big Bangs this week that should put it on everyone’s radar. First, Epic Games raised $1 billion at a $28.7 billion valuation, which makes the company $11.4 billion more valuable than it was just nine months ago, when it raised $1.78 billion at a $17.3 billion valuation.

And it wasn’t raising this money to invest more in Fortnite. Rather, Epic explicitly said it was investing money for its plans for the metaverse, the universe of virtual worlds that are all interconnected, like in novels such as Snow Crash and Ready Player One. Epic Games CEO Tim Sweeney has made no secret of his ambitions for building the metaverse and how it should be open.

And while that might sound crazy, he received $200 million from Sony in this round, on top of $250 million received from Sony in the last round. I interpret this to mean that Sony doesn’t think Sweeney is crazy, and that it too believes in his dream of making the metaverse happen. And if Sony believes in the metaverse, then we should expect all of gaming to set the metaverse as its North Star. Epic’s $1 billion in cash is going to be spent on the metaverse, and that amount of money is going to look small in the long run.

Epic Games has a foothold to establish the metaverse because it has the users and the cash. It has 350 million-plus registered users for Fortnite. And it has been investing beyond games into things like social networks and virtual concerts, as Sweeney knows that the metaverse — a place where we would live, work, and play — has to be about more than just games. Games are a springboard to the metaverse, but they’re only a part of what must be built.

Above: These people are not people. They are MetaHumans.

Image Credit: Epic Games

One of the keys to the metaverse will be making realistic animated digital humans, and two of Epic’s leaders — Paul Doyle and Vladimir Mastilović — will speak on that topic at our upcoming GamesBeat Summit 2021 conference on April 28 and April 29. This fits squarely with the notion of building out the experience of the metaverse. We need avatars to engage in games, have social experiences, and listen to live music, according to my friend Jon Radoff (CEO of Beamable) in a recent blog post.

Meanwhile, this morning Nvidia announced something called GanVerse, which can take a 2D picture of a car and turn it into a 3D model. It’s one more tool to automate creation for the metaverse.

To make the metaverse come to life, we need many more layers: discovery tools, a creator economy, spatial computing to deliver the wow of 3D experiences, decentralization to make commerce between worlds seamless and permissionless, human interfaces and new devices that make the metaverse believable, and the infrastructure to run it all.

The Omniverse

Above: BMW Group is using Omniverse to build a digital factory that will mirror a real-world place.

Image Credit: Nvidia

And when you think about those things, that is what we got in another Big Bang this week, as Nvidia announced the enterprise version of its Omniverse, a metaverse for engineers. By itself, that doesn’t sound too exciting. But after digging deeper into it, I learned a lot about how important the Omniverse could be in providing the foundational glue for the metaverse.

“The science fiction metaverse is near,” said Nvidia CEO Jensen Huang in a keynote speech this week at the company’s GTC 21 online event.

First, Nvidia has been working on the Omniverse — which can simulate real-world physics — for four years, and it has invested hundreds of millions of dollars in it, said Nvidia’s Richard Kerris in a press briefing.

Nvidia started this as “Project Holodeck,” using proprietary technology. But it soon discovered the Universal Scene Description language that Pixar invented for describing 3D data in an open, standardized way. Pixar invented this “HTML of 3D” and shared it with its vendors because it didn’t want to keep reinventing 3D tools for its animated movies.

“The way to think about USD is the way you would think about HTML for the internet,” Huang said. “This is HTML for 3D worlds. Omniverse is a world that connects all these worlds. The thing that’s unique about Omniverse is its ability to simulate physically and photorealistically.”

Pixar open sourced USD about eight years ago, and it has since spread to multiple industries. One of the best things about USD is that it enables remote collaboration, with multiple artists working on the same 3D model at once.
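
To make the “HTML of 3D” idea concrete, here is a minimal sketch using Pixar’s open source Python bindings for USD (the pxr module). The stage file name and prim paths are purely illustrative, not anything from Pixar’s or Nvidia’s demos.

```python
# A minimal USD sketch using Pixar's open source Python bindings (pxr).
# The file name and prim paths are illustrative only.
from pxr import Usd, UsdGeom

# Create a new stage, the shared scene description that collaborators layer edits onto.
stage = Usd.Stage.CreateNew("factory_scene.usda")

# Define a transform and a placeholder shape beneath it.
root = UsdGeom.Xform.Define(stage, "/Factory")
placeholder = UsdGeom.Cube.Define(stage, "/Factory/RobotArmPlaceholder")
placeholder.GetSizeAttr().Set(2.0)

# Save the layer. Another artist can open or reference the same .usda file
# and author edits on top of it without re-exporting the whole scene.
stage.SetDefaultPrim(root.GetPrim())
stage.GetRootLayer().Save()
```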

Above: The metaverse market map

Image Credit: Jon Radoff

Nvidia made USD the foundation for the Omniverse, adding real-time capabilities. Now BMW Group, Ericsson, Foster + Partners, and WPP are using it, as are 400 enterprises. It has application support from Bentley Systems, Autodesk, Adobe, Epic Games, ESRI, Graphisoft, Trimble, Robert McNeel & Associates, Blender, Marvelous Designer, Reallusion, and Wrnch. That’s just about the entire 3D pipeline for tools used to make things like games, engineering designs, architectural projects, movies, and advertisements.

BMW Group is building a car factory in the Omniverse, replicating exactly what it would build in the real world but doing it first in a “digital twin” before it has to commit any money to physical construction. I saw a demo of the Omniverse, and Nvidia’s engineers told me you could zip through it at 60 frames per second using a computer with a single Nvidia GeForce RTX card (if you can get one).

“You could be in Adobe and collaborate with someone using Autodesk or the Unreal Engine and so on. It’s a world that connects all of the designers using different worlds,” Huang said. “As a result, you’re in a shared world to create a theme or a game. With Omniverse you can also connect AI characters. They don’t have to be real characters. Using design tools for these AI characters, they can be robots. They can be performing not design tasks, but animation tasks and robotics tasks, in one world. That one world could be a shared world, like the simulated BMW factory we demonstrated.”

Above: Bentley’s tools used to create a digital twin of a location in the Omniverse.

Image Credit: Nvidia

Nvidia hopes to test self-driving cars — which use Nvidia’s AI chips — inside the Omniverse, driving them across a virtual U.S., from California to New York. It can’t do that in the real world. Volvo needs the Omniverse to create a city environment around its cars so that it can test them in the right context. And its engineers can virtually sit in the car and walk around it while designing it.

The Omniverse is a metaverse that obeys the laws of physics and supports things that are being created by 3D creators around the world. You don’t have to take a Maya file and export it in a laborious process to the Omniverse. It just works in the Omniverse, and you can collaborate across companies — something that the true metaverse will require. Nvidia wants tens of millions of designers, engineers, architects and other creators — including game designers — to work and live in the Omniverse.

“Omniverse, when you generalize it, is a shared simulated virtual world. Omniverse is the foundation platform for our AR and VR strategies,” Huang said. “It’s also the platform for our design and collaboration strategies. It’s our metaverse virtual world strategy platform, and it’s our robotics and autonomous machine AI strategy platform. You’ll see a lot more of Omniverse. It’s one of the missing links, the missing piece of technology that’s important for the next generation of autonomous AI.”

Why the Omniverse matters to games

Above: Nvidia’s Omniverse is going to be important.

Image Credit: Nvidia

By building the Omniverse for real-time interaction, Nvidia made it better for game designers. Gamers zip through worlds at speeds ranging from 30 frames per second to 120 frames per second or more. With Nvidia’s RTX cards, they can now do that with highly realistic 3D scenery that takes advantage of real-time ray tracing, or realistic lighting and shadows. And Kerris said that most of what you see doesn’t have to be constantly refreshed on every user’s screen, making the real-time updating of the Omniverse more efficient.

Tools like Unreal or Unity can plug into the Omniverse, thanks to USD. They can create games, but once the ecosystem becomes mature, they can also absorb assets from other industries. Games commonly include realistic replicas of cities. Rockstar Games built copies of New York and Los Angeles for its games. Ubisoft has built places such as Bolivia, Idaho, and Paris for its games. Imagine if they built highly realistic replicas and then traded them with each other. The process of creating games could be more efficient, and the idea of building a true metaverse, like the entire U.S., wouldn’t seem so crazy. The Omniverse could make it possible.

Some game companies are thinking about this. One of the studios playing with the Omniverse is Embark Studios. Its founder is Patrick Soderlund, the former head of studios for Electronic Arts. Embark has backing from Nexon, one of the world’s biggest makers of online games. And since the tools for the Omniverse will eventually be simplified, users themselves might one day be able to contribute their designs to the Omniverse.

Huang thinks that game designers will eventually feel more comfortable designing their worlds while inside the Omniverse, using VR headsets or other tools.

Above: Nvidia’s Omniverse can simulate a physically accurate car.

Image Credit: Nvidia

“Game development is one of the most complex design pipelines in the world today,” Huang said. “I predict that more things will be designed in the virtual world, many of them for games, than there will be designed in the physical world. They will be every bit as high quality and high fidelity, every bit as exquisite, but there will be more buildings, more cars, more boats, more coins, and all of them — there will be so much stuff designed in there. And it’s not designed to be a game prop. It’s designed to be a real product. For a lot of people, they’ll feel that it’s as real to them in the digital world as it is in the physical world.”

The Omniverse lets game developers working across this complicated pipeline stay connected, Huang said.

“Now they have Omniverse to connect into. Everyone can see what everyone else is doing, rendering in a fidelity that is at the level of what everyone sees,” he said. “Once the game is developed, they can run it in the Unreal engine that gets exported out. These worlds get run on all kinds of devices. Or Unity. But if someone wants to stream it right out of the cloud, they could do that with Omniverse, because it needs multiple GPUs, a fair amount of computation.”

He added, “That’s how I see it evolving. But within Omniverse, just the concept of designing virtual worlds for the game developers, it’s going to be a huge benefit to their work flow. The metaverse is coming. Future worlds will be photorealistic, obey the laws of physics or not, and be inhabited by human avatars and AI beings.”

Brands and the metaverse

Above: Hasbro’s Nerf guns are appearing inside Roblox.

Image Credit: Hasbro/Roblox

On a smaller scale, Roblox also did something important. It cut a deal with Hasbro’s Nerf brand this week that will bring some new blasters to the game. Roblox doesn’t make the blasters itself. Rather, it picks some talented developers to make them, so that it stays true to its user-generated content mantra. That Roblox can partner with a company like Hasbro shows that brands have confidence in the platform, as it has also demonstrated in deals with Warner Bros.

Usually, user-generated content and brands don’t mix. The users copy the copyrighted brands, and the brands have to take some legal action. But Roblox invests a lot in digital safety and it doesn’t seem to have as big a problem as other entities. That’s important. We know that Roblox is a leading contender for turning into the metaverse because it has the users — 36 million a day. But the real test is whether the brands will come and make that metaverse as lucrative as other places where the brands show up, like luxury malls.

And FYI, we’ve got a panel on Brands and the Metaverse at our GamesBeat Summit 2021 event on April 28 and April 29. Kudos to Steven Augustine of Intel for planting that thought in my brain months ago.

I feel like the momentum for the metaverse is only getting stronger, and it is embedding itself in our brains as a kind of Holy Grail — or some other lost treasure in other cultures — that we must find in order to reach our ultimate goals.

Metacore secures $179.9M in credit from Supercell for casual games

Mobile game studio Metacore has raised $179.9 million in credit from Supercell to continue developing its casual mobile game Merge Mansion.

It’s a huge amount of money to pour into a game studio with one game, but it shows what Helsinki-based Supercell is willing to do with the cash it generates from mobile gaming hits like Clash of Clans, Clash Royale, Hay Day, and Brawl Stars.

Since Metacore released its first game, Merge Mansion, in late 2020, its annual revenue run rate has reached $54 million, putting the company on track to become one of the fastest-growing game studios in Europe. Merge Mansion is a puzzle game with more than 800,000 daily players. The new funding will help boost Merge Mansion’s global operations and strengthen Metacore’s core team.

“We’re off to a really good start and raised this follow-on funding from Supercell to increase the scaling of the game,” said Metacore CEO Mika Tammenkoski in an interview with GamesBeat. “It couldn’t be more exciting than this.”

Supercell has backed the game studio for years, with an initial investment of $5.9 million in 2018 followed by a $17.9 million investment and an $11.9 million credit line in 2020. The new credit line strengthens Metacore’s ability to accelerate its growth while maintaining its current ownership structure and autonomy.

Supercell’s investments lead Jaakko Harlas said in a statement that Metacore is going from strength to strength. He said Merge Mansion launched with high expectations and has met them. He said Supercell invests in strong teams, and Supercell’s role is to remove obstacles.

“Merge Mansion has hit its metrics, and we have been scaling it successfully so far,” Tammenkoski said. “We believe that we can really reach the top of the charts with that game. As you know, getting to the top of the charts, or scaling mobile games, is really capital intensive because of the dynamics of the free-to-play business model. And it means that you have to invest heavily, and then you have to wait for a while to get the return on the investment.”

Metacore looks to fill key roles in game development and brand marketing.

“Most of the money goes into marketing,” he said. “The personnel costs are really marginal compared to what you can spend on performance and brand marketing. And we really want to make Merge Mansion into an entertainment brand in the mobile game space. And that means that we really need to invest in it as well.”

Metacore has a distinctive approach to scaling its studio: It builds and tests games with small teams of two to three people that have full autonomy over the games they develop, and it expands those teams only once the concept has been validated on the market through player feedback. That’s pretty similar to the way that Supercell runs.

Regarding Supercell, “They know how capital intensive scaling these games is,” Tammenkoski said. “We couldn’t have a better partner than this.”

Above: Metacore’s Merge Mansion mixes puzzles with discovery.

Image Credit: Metacore

This enables Metacore to quickly pivot or scrap game projects that players aren’t responding to, but it also means the studio can act swiftly when it’s clear it has a hit like Merge Mansion on its hands.

Metacore has doubled its team size to close to 30 since last fall and is actively recruiting for key roles in game development, brand marketing, and other strategic business functions. Tammenkoski emphasized that the company is not rushing with recruitment and is taking the time to find the right fit.

Tammenkoski and Aki Järvilehto founded the company. Merge Mansion features a grandmother and her granddaughter who bond over an old mansion and try to get it back into livable shape. The advertising will focus on telling a story for a mass market audience, Tammenkoski said.

The funding comes at a time when mobile advertising is in flux, as Apple is prioritizing user privacy over targeted advertising. Tammenkoski said there is turbulence in the market and no one knows how bad it will get, but he said he is not targeting any particular cohort of players. That should make it easier to deal with Apple’s change to the Identifier for Advertisers (IDFA).

“The dynamics will change, but we will go broad with our advertising,” Tammenkoski said.

Speech recognition system trains on radio archive to learn Niger Congo languages

For many of the 700 million illiterate people around the world, speech recognition technology could provide a bridge to valuable information. Yet in many countries, these people tend to speak only languages for which the datasets necessary to train a speech recognition model are scarce. This data deficit persists for several reasons, chief among them the fact that creating products for languages spoken by smaller populations can be less profitable.

Nonprofit efforts are underway to close the gap, including 1000 Words in 1000 Languages, Mozilla’s Common Voice, and the Masakhane project, which seeks to translate African languages using neural machine translation. But this week, researchers at Guinea-based tech accelerator GNCode and Stanford detailed a new initiative that uniquely advocates using radio archives in developing speech systems for “low-resource” languages, particularly Maninka, Pular, and Susu in the Niger Congo family.

“People who speak Niger Congo languages have among the lowest literacy rates in the world, and illiteracy rates are especially pronounced for women,” the coauthors note. “Maninka, Pular, and Susu are spoken by a combined 10 million people, primarily in seven African countries, including six where the majority of the adult population is illiterate.”

The idea behind the new initiative is to make use of unsupervised speech representation learning, demonstrating that representations learned from radio programs can be leveraged for speech recognition. Where large labeled datasets don’t exist, unsupervised pretraining lets a model learn the structure of speech from raw, unlabeled audio; a much smaller labeled set can then be used to fine-tune the model for recognition.

New datasets

The researchers created two datasets intended for applications targeting West African languages: the West African Speech Recognition Corpus and the West African Radio Corpus. The West African Speech Recognition Corpus contains more than 10,000 recordings of speech in French, Maninka, Susu, and Pular from roughly 49 speakers, including Guinean first names and voice commands like “update that,” “delete that,” “yes,” and “no.” The West African Radio Corpus consists of 17,000 audio clips sampled from archives collected from six Guinean radio stations. Its broadcasts span news and shows in languages including French, Guerze, Koniaka, Kissi, Kono, Maninka, Mano, Pular, Susu, and Toma.
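
As a purely hypothetical sketch of how a radio archive might be segmented into short clips for pretraining, the snippet below cuts a long recording into fixed-length pieces. The 30-second window, 16 kHz sample rate, file naming, and the librosa/soundfile tooling are assumptions for illustration, not details from the paper.

```python
# Hypothetical sketch: segmenting a long radio recording into short clips for
# unsupervised pretraining. Clip length, sample rate, and file naming are
# assumed values, not details from the GNCode/Stanford paper.
import librosa
import soundfile as sf

def slice_recording(path: str, out_prefix: str, clip_seconds: int = 30) -> int:
    """Write consecutive fixed-length clips of one recording; return the count."""
    audio, sr = librosa.load(path, sr=16_000, mono=True)
    samples_per_clip = clip_seconds * sr
    count = 0
    for start in range(0, len(audio) - samples_per_clip + 1, samples_per_clip):
        sf.write(f"{out_prefix}_{count:05d}.wav", audio[start:start + samples_per_clip], sr)
        count += 1
    return count

# Example: slice_recording("station1_broadcast.wav", "clips/station1")
```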

To create a speech recognition system, the researchers tapped Facebook’s wav2vec, an open source framework for unsupervised speech processing. Wav2vec uses an encoder module that takes raw audio and outputs speech representations, which are fed into a Transformer that ensures the representations capture whole-audio-sequence information. Created by Google researchers in 2017, the Transformer network architecture was initially intended as a way to improve machine translation. To this end, it uses attention functions instead of a recurrent neural network to predict what comes next in a sequence.
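
To give a rough sense of how pretrained speech representations get reused downstream, here is a minimal sketch. Note that it uses the wav2vec 2.0 wrappers in Hugging Face’s transformers library and a public English checkpoint as a stand-in, rather than the fairseq tooling and West African checkpoints the researchers describe.

```python
# Minimal sketch: transcribing a clip with a pretrained wav2vec 2.0 model via
# Hugging Face transformers. The checkpoint is a public English stand-in, not
# the researchers' WAwav2vec model.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-960h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-960h")

# Load a clip and resample to the 16 kHz mono audio the model expects.
waveform, sample_rate = torchaudio.load("radio_clip.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)
speech = waveform.mean(dim=0).numpy()  # downmix to mono

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits  # frame-level token scores

# Greedy CTC decoding: take the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```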

Above: The accuracies of WAwav2vec.

Despite the fact that the radio dataset includes phone calls as well as background and foreground music, static, and interference, the researchers managed to train a wav2vec model with the West African Radio Corpus, which they call WAwav2vec. In one experiment with speech across French, Maninka, Pular, and Susu, the coauthors say they achieved multilingual speech recognition accuracy (88.01%) on par with Facebook’s baseline wav2vec model (88.79%), even though the baseline model was trained on 960 hours of speech versus WAwav2vec’s 142 hours.

Virtual assistant

As a proof of concept, the researchers used WAwav2vec to create a prototype of a speech assistant. The assistant — which is available in open source along with the datasets — can recognize basic contact management commands (e.g., “search,” “add,” “update,” and “delete”) in addition to names and digits. As the coauthors note, smartphone access has exploded in the Global South, with an estimated 24.5 million smartphone owners in South Africa alone, according to Statista, making this sort of assistant likely to be useful.
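
To illustrate what the contact-management piece of such an assistant might look like once speech has been transcribed, here is a hypothetical sketch. The command words come from the article, but the handlers and names are illustrative, not the researchers’ open source code.

```python
# Hypothetical sketch: routing a recognized utterance to a contact-management
# action. Handlers and names are illustrative, not the project's actual code.
from typing import Callable, Dict

def search_contact(name: str) -> str:
    return f"Searching contacts for '{name}'"

def add_contact(name: str) -> str:
    return f"Adding contact '{name}'"

def update_contact(name: str) -> str:
    return f"Updating contact '{name}'"

def delete_contact(name: str) -> str:
    return f"Deleting contact '{name}'"

COMMANDS: Dict[str, Callable[[str], str]] = {
    "search": search_contact,
    "add": add_contact,
    "update": update_contact,
    "delete": delete_contact,
}

def handle_transcript(transcript: str) -> str:
    """Route an utterance such as 'add Fatoumata' to its command handler."""
    word, _, rest = transcript.strip().partition(" ")
    handler = COMMANDS.get(word.lower())
    return handler(rest) if handler else f"Unrecognized command: '{transcript}'"

print(handle_transcript("add Fatoumata"))   # Adding contact 'Fatoumata'
print(handle_transcript("search Mamadou"))  # Searching contacts for 'Mamadou'
```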

“To the best of our knowledge, the multilingual speech recognition models we trained are the first-ever to recognize speech in Maninka, Pular, and Susu. We also showed how this model can power a voice interface for contact management,” the coauthors wrote. “Future work could expand its vocabulary to application domains such as microfinance, agriculture, or education. We also hope to expand its capabilities to more languages from the Niger-Congo family and beyond, so that literacy or ability to speak a foreign language are not prerequisites for accessing the benefits of technology. The abundance of radio data should make it straightforward to extend the encoder to other languages.”

Gamescom announces online-only festival in August, reversing hybrid event plan

Reversing a plan announced in March, Gamescom will no longer try to do a hybrid gaming expo this summer. Instead, it will focus on an online-only event at the end of August.

The fan-and-business trade show is the world’s biggest game-industry event — with 370,000 people attending the physical event in 2019 — but it had to switch to online-only in 2020 due to the pandemic. The event organizers floated the idea of a hybrid physical event where fans could come see games in person along with digital announcements. The hope was that the coronavirus would subside thanks to vaccinations and that people would want to recapture the excitement of an in-person event.

But today, the Association of the German Games Industry and Koelnmesse decided against that plan, based on responses from potential exhibitors and fans. They plan to hold the main part of the show from August 25 to August 29.

Gamescom Congress will once again take place Thursday, August 26, and Devcom will start off the events August 23. The main days of Gamescom will take place on August 26 and August 27. IGN will produce a show dubbed Awesome Indies. Opening Night Live, which Geoff Keighley produces, will still take place, but it will now be online-only as well. Gamescom was planning to start selling tickets in May.

Above: The crowd at Gamescom 2019 on opening day. The show was online-only in 2020. It will be online-only again in 2021.

Image Credit: Dean Takahashi

“This decision was made after extensive discussions with partners and exhibitors,” the organizers said in a press release. “Thus, the organizers take into account the current situation, in which too many companies are unable to participate in physical events this year due to the still difficult development. In this way, they also meet the partners’ strong need for planning security. This means that Gamescom 2021 will be held exclusively digitally and free of charge for all Gamescom fans.”

Last year, Gamescom had more than 100 million video views across all formats and channels, more than 50 million unique viewers from 180 countries, and 370 partners from 44 countries. Oliver Frese, chief operating officer of Koelnmesse, said in a statement that Gamescom was coming too early for many companies in the industry, as it required so much advance planning amid an uncertain environment. Companies need that planning reliability, he said.

Felix Falk, managing director of the German Games Industry Association, said in a statement that next year the groups will be able to implement more of the concepts they had in mind for a hybrid version of Gamescom. There will be business-to-business matchmaking events such as “indies meet investors and publishers” pitch events.
