The cost of cloud, a trillion dollar paradox

There is no doubt that the cloud is one of the most significant platform shifts in the history of computing. Not only has cloud already impacted hundreds of billions of dollars of IT spend, it’s still in early innings and growing rapidly on a base of over $100B of annual public cloud spend. This shift is driven by an incredibly powerful value proposition — infrastructure available immediately, at exactly the scale needed by the business — driving efficiencies both in operations and economics. The cloud also helps cultivate innovation as company resources are freed up to focus on new products and growth.

source: Synergy Research Group

However, as industry experience with the cloud matures — and we see a more complete picture of the cloud lifecycle’s impact on a company’s economics — it’s becoming evident that while cloud clearly delivers on its promise early in a company’s journey, the pressure it puts on margins can start to outweigh the benefits as the company scales and growth slows. Because this shift happens later in a company’s life, it is difficult to reverse: it is the result of years of development focused on new features rather than infrastructure optimization. Hence the rewrite or significant restructuring needed to dramatically improve efficiency can take years, and is often considered a non-starter.

Now, there is a growing awareness of the long-term cost implications of cloud. As the cost of cloud starts to contribute significantly to the total cost of revenue (COR) or cost of goods sold (COGS), some companies have taken the dramatic step of “repatriating” the majority of workloads (as in the example of Dropbox) or in other cases adopting a hybrid approach (as with CrowdStrike and Zscaler). Those who have done this have reported significant cost savings: Dropbox detailed in its 2018 S-1 a whopping $75M in cumulative savings over the two years prior to IPO due to its infrastructure optimization overhaul, the majority of which entailed repatriating workloads from public cloud.

Yet most companies find it hard to justify moving workloads off the cloud, given the sheer magnitude of such efforts and, quite frankly, the dominant, somewhat singular industry narrative that “cloud is great.” (It is, but we need to consider the broader impact, too.) When evaluated relative to the scale of potentially lost market capitalization — which we present in this post — the calculus changes. As growth (often) slows with scale, near-term efficiency becomes an increasingly key determinant of value in public markets. The excess cost of cloud weighs heavily on market cap by driving lower profit margins.

The point of this post isn’t to argue for repatriation, though; that’s an incredibly complex decision with broad implications that vary company by company. Rather, we take an initial step in understanding just how much market cap is being suppressed by the cloud, so we can help inform the decision-making framework on managing infrastructure as companies scale.

To frame the discussion: We estimate the recaptured savings in the extreme case of full repatriation, and use public data to pencil out the impact on share price. We show (using relatively conservative assumptions!) that across 50 of the top public software companies currently utilizing cloud infrastructure, an estimated $100B of market value is being lost among them due to cloud impact on margins — relative to running the infrastructure themselves. And while we focus on software companies in our analysis, the impact of the cloud is by no means limited to software. Extending this analysis to the broader universe of scale public companies that stands to benefit from related savings, we estimate that the total impact is potentially greater than $500B.

Our analysis highlights how much value can be gained through cloud optimization — whether through system design and implementation, re-architecture, third-party cloud efficiency solutions, or moving workloads to special purpose hardware. This conclusion is deeply counterintuitive given prevailing industry narratives around cloud vs. on-prem. However, it’s clear that when you factor in the impact to market cap in addition to near-term savings, scaling companies can justify nearly any level of work that will help keep cloud costs low.

Unit economics of cloud repatriation: The case of Dropbox, and beyond

To dimensionalize the cost of cloud, and understand the magnitude of potential savings from optimization, let’s start with a more extreme case of large-scale cloud repatriation: Dropbox. When the company embarked on its infrastructure optimization initiative in 2016, it saved nearly $75M over two years by shifting the majority of its workloads from public cloud to “lower cost, custom-built infrastructure in co-location facilities” directly leased and operated by Dropbox. Dropbox gross margins increased from 33% to 67% from 2015 to 2017, which the company noted was “primarily due to our Infrastructure Optimization and an… increase in our revenue during the period.”

source: Dropbox S-1 filed February 2018

But that’s just Dropbox. To help generalize the potential savings from cloud repatriation to a broader set of companies, Thomas Dullien, former Google engineer and co-founder of cloud computing optimization company Optimyze, estimates that repatriating $100M of annual public cloud spend can translate to roughly half that amount, or less, in all-in annual total cost of ownership (TCO) — from server racks, real estate, and cooling to network and engineering costs.

The exact savings obviously vary by company, but several experts we spoke to converged on this “formula”: Repatriation results in one-third to one-half the cost of running equivalent workloads in the cloud. Furthermore, a director of engineering at a large consumer internet company found that public cloud list prices can be 10 to 12x the cost of running one’s own data centers. Discounts driven by use-commitments and volume are common in the industry, and can bring this multiple down to single digits, since cloud compute pricing typically drops by ~30-50% with committed use. But AWS still operates at a roughly 30% blended operating margin net of these discounts and an aggressive R&D budget — implying that potential company savings from repatriation are larger still. The performance lift from managing one’s own hardware may drive even further gains.

Across all our conversations with diverse practitioners, the pattern has been remarkably consistent: If you’re operating at scale, the cost of cloud can at least double your infrastructure bill.
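
To make that arithmetic concrete, here is a minimal sketch in Python of the experts’ “formula,” assuming the quoted one-third to one-half TCO ratios. The function and figures are illustrative, not vendor pricing:

    # Back-of-the-envelope repatriation math using the ranges quoted above.
    # All inputs are illustrative assumptions, not vendor pricing.
    def repatriation_estimate(annual_cloud_spend, tco_ratio):
        """Return (self-run TCO, annual savings) for a given cloud bill.

        tco_ratio: all-in cost of self-managed infrastructure as a fraction
        of the equivalent cloud bill; experts above say one-third to one-half.
        """
        self_run_tco = annual_cloud_spend * tco_ratio
        return self_run_tco, annual_cloud_spend - self_run_tco

    for ratio in (1 / 3, 1 / 2):
        # Dullien's example: $100M of annual public cloud spend
        tco, savings = repatriation_estimate(100e6, ratio)
        print(f"TCO ratio {ratio:.2f}: run it yourself for ${tco / 1e6:.0f}M, "
              f"save ${savings / 1e6:.0f}M per year")

At the one-half ratio, the cloud bill is exactly double the self-run cost, which is the “at least double” pattern practitioners describe.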

The true cost of cloud

When you consider the sheer magnitude of cloud spend as a percentage of the total cost of revenue (COR), 50% savings from cloud repatriation is particularly meaningful. Based on benchmarking public software companies (those that disclose their committed cloud infrastructure spend), we found that contractually committed spend averaged 50% of COR.

Actual spend as a percentage of COR is typically even higher than committed spend: A billion-dollar private software company told us that its public cloud spend amounted to 81% of COR, and that “cloud spend ranging from 75 to 80% of cost of revenue was common among software companies”. Dullien observed (from his time at both industry leader Google and now Optimyze) that companies are often conservative when sizing their cloud commitments, due to fears of being overcommitted on spend, so they commit to only their baseline loads. As a rule of thumb, then, committed spend is often ~20% lower than actual spend… elasticity cuts both ways. Some companies we spoke with reported that they exceeded their committed cloud spend forecast by at least 2X.

If we extrapolate these benchmarks across the broader universe of software companies that utilize some public cloud for infrastructure, our back-of-the-envelope estimate is that the aggregate cloud bill reaches $8B for 50 of the top publicly traded software companies (those that reveal some degree of cloud spend in their annual filings). While some of these companies take a hybrid approach — public cloud and on-premise (which means cloud spend may be a lower percentage of COR relative to our benchmarks) — our analysis balances this by assuming that committed spend equals actual spend across the board. Drawing from our conversations with experts, we assume that cloud repatriation drives a 50% reduction in cloud spend, resulting in total savings of $4B in recovered profit. For the broader universe of scale public software and consumer internet companies utilizing cloud infrastructure, this number is likely much higher.
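
Penciled out in Python, the estimate above looks like this. The sketch simply mirrors the stated assumptions, and the last two lines show why treating committed spend as actual spend is conservative:

    # Mirrors the aggregate estimate above: committed cloud spend across the
    # 50 companies is treated as actual spend, and repatriation recovers 50%.
    aggregate_cloud_bill = 8e9        # ~$8B committed across 50 companies
    repatriation_savings_rate = 0.50  # from the expert conversations above

    recovered_profit = aggregate_cloud_bill * repatriation_savings_rate
    print(f"recovered profit: ${recovered_profit / 1e9:.0f}B")  # -> $4B

    # Conservatism check: if committed spend runs ~20% below actual spend
    # (the rule of thumb above), the recoverable number is larger.
    actual_spend = aggregate_cloud_bill / (1 - 0.20)
    print(f"on actual spend: ${actual_spend * repatriation_savings_rate / 1e9:.0f}B")  # -> $5B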

Cloud spend breakdown. source: company S-1 and 10-K filings; a16z analysis

While $4B of estimated net savings is staggering on its own, this number becomes even more eye-opening when translated to unlocked market capitalization. Since companies are conceptually valued as the present value of their future cash flows, realizing these aggregate annual net savings would create market capitalization well in excess of that $4B.

How much more? One rough proxy is to look at how the public markets value additional gross profit dollars: High-growth software companies that are still burning cash are often valued on gross profit multiples, which reflect assumptions about the company’s long-term growth and margin structure. (Commonly referenced revenue multiples also reflect a company’s long-term profit margin, which is why they tend to increase for higher gross margin businesses even on a growth rate-adjusted basis.) Both capitalization multiples serve as a heuristic for estimating the market’s discounting of a company’s future cash flows.

Among the set of 50 public software companies we analyzed, the average total enterprise value to 2021E gross profit multiple (based on CapIQ at time of publishing) is 24-25X. In other words: For every dollar of gross profit saved, market caps rise on average by 24-25X the net cost savings from cloud repatriation. (This assumes savings are expressed net of depreciation costs incurred from incremental CapEx, if relevant.)

This means an additional $4B of gross profit can be estimated to yield an additional $100B of market capitalization among these 50 companies alone. Moreover, since using a gross profit multiple (vs. a free cash flow multiple) assumes that incremental gross profit dollars are also associated with certain incremental operating expenditures, this approach may underestimate the impact to market capitalization from the $4B of annual net savings.

For a given company, the impact may be even higher depending on its specific valuation. To illustrate this phenomenon [please note this is not investment advice, see full disclosures below and at https://a16z.com/disclosures/], take the example of infrastructure monitoring as a service company Datadog. The company traded at close to 40X 2021 estimated gross profit at time of publishing, and disclosed an aggregate $225M 3-year commitment to AWS in their S-1. If we annualize committed spend to $75M of annual AWS costs — and assume 50% or $37.5M of this may be recovered via cloud repatriation — this translates to roughly $1.5B of market capitalization for the company on committed spend reductions alone!
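
The same arithmetic, step by step, in Python (again, not investment advice; the inputs are the S-1 commitment and the multiple cited above):

    # The Datadog illustration above, penciled out.
    committed_aws_3yr = 225e6             # 3-year AWS commitment from the S-1
    annual_spend = committed_aws_3yr / 3  # -> $75M per year
    recovered = annual_spend * 0.50       # assume 50% recovered via repatriation
    gross_profit_multiple = 40            # ~40x 2021E gross profit at publishing

    print(f"annual savings: ${recovered / 1e6:.1f}M")  # -> $37.5M
    print(f"market cap unlocked: ${recovered * gross_profit_multiple / 1e9:.1f}B")  # -> $1.5B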

While back-of-the-envelope analyses like these are never perfect, the directional findings are clear: market capitalizations of scale public software companies are weighed down by cloud costs to the tune of hundreds of billions of dollars. If we expand to the broader universe of enterprise software and consumer internet companies, this number is likely over $500B — assuming 50% of overall cloud spend is consumed by scale technology companies that stand to benefit from cloud repatriation.

For business leaders, industry analysts, and builders, it’s simply too expensive to ignore the impact on market cap when making both long-term and even near-term infrastructure decisions.

source: CapIQ as of May 2021; note: charts herein are for informational purposes only and should not be relied upon when making any investment decision

The paradox of cloud

Where do we go from here? On one hand, it is a major decision to start moving workloads off of the cloud. For those who have not planned in advance, the necessary rewriting can seem so impractical as to be impossible; any such undertaking requires a strong infrastructure team that may not be in place. And all of this requires building expertise beyond one’s core competency, which is not only distracting, but can itself detract from growth. Even at scale, the cloud retains many of its benefits — such as on-demand capacity, and a host of existing services to support new projects and new geographies.

But on the other hand, we have the phenomenon we’ve outlined in this post, where the cost of cloud “takes over” at some point, locking up hundreds of billions of market cap that are now stuck in this paradox: You’re crazy if you don’t start in the cloud; you’re crazy if you stay on it.

So what can companies do to free themselves from this paradox? As mentioned, we’re not making a case for repatriation one way or the other; rather, we’re pointing out that infrastructure spend should be a first-class metric. What do we mean by this? That companies need to optimize early, often, and, sometimes, also outside the cloud. When you’re building a company at scale, there’s little room for religious dogma.

While there’s much more to say on the mindset shifts and best practices here — especially as the full picture has only more recently emerged — here are a few considerations that may help companies grapple with the ballooning cost of cloud.

Cloud spend as a KPI. Part of making infrastructure a first-class metric is making sure it is a key performance indicator for the business. Take, for example, Spotify’s Cost Insights, a homegrown tool that tracks cloud spend and enables engineers, not just finance teams, to take ownership of that spend. Ben Schaechter, formerly at Digital Ocean and now co-founder and CEO of Vantage, observed that not only is Vantage seeing companies across the industry look at cloud cost metrics alongside core performance and reliability metrics earlier in the lifecycle of their business, but also that “Developers who have been burned by surprise cloud bills are becoming more savvy and expect more rigor with their team’s approach to cloud spend.”
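
As a minimal sketch of what this can look like in practice, the snippet below rolls up a daily billing export by owning team so spend lands on engineering dashboards. The CSV columns are our assumption, not Spotify’s Cost Insights or Vantage’s actual format:

    # Roll up a (hypothetical) daily billing export by owning team so that
    # engineers, not just finance, see their own cloud spend.
    import csv
    from collections import defaultdict

    def spend_by_team(billing_csv_path):
        """Sum cost per team; assumes columns named 'team' and 'cost_usd'."""
        totals = defaultdict(float)
        with open(billing_csv_path, newline="") as f:
            for row in csv.DictReader(f):
                totals[row["team"]] += float(row["cost_usd"])
        return dict(totals)

    # Example: surface the top ten spenders on a team dashboard.
    # for team, usd in sorted(spend_by_team("billing.csv").items(),
    #                         key=lambda kv: -kv[1])[:10]:
    #     print(f"{team}: ${usd:,.0f}")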

Incentivize the right behaviors. Empowering engineers with data from first-class KPIs for infrastructure takes care of awareness, but doesn’t take care of incentives to change the way things are done. A prominent industry CTO told us that at one of his companies, they put in short-term incentives like those used in sales (SPIFFs), so that any engineer who saved a certain amount of cloud spend by optimizing or shutting down workloads received a spot bonus (which still had a high company ROI since the savings were recurring). He added that this approach — basically, “tie the pain directly to the folks who can fix the problem” — actually cost relatively little: bonuses went to roughly 10% of the entire organization, while overall spend came down by $3M in just six months. Notably, the company CFO was key to endorsing this non-traditional model.

Optimization, optimization, optimization. When evaluating the value of any business, one of the most important factors is the cost of goods sold or COGS — for every dollar that a business makes, how many dollars does it cost to deliver? Customer data platform company Segment recently shared how it reduced infrastructure costs by 30% (while simultaneously increasing traffic volume by 25% over the same period) through incremental optimization of its infrastructure decisions. There are also a number of third-party optimization tools that can provide quick gains to existing systems, ranging anywhere from 10-40% in our experience observing this space.

Think about repatriation up front. Just because the cloud paradox exists — where cloud is cheaper and better early on and more costly later in a company’s evolution — doesn’t mean a company has to passively accept it without planning for it. Make sure your system architects are aware of the potential for repatriation early on, because by the time cloud costs start to catch up to or even outpace revenue growth, it’s too late. Even modest, more modular architectural investment early on — including architecting to be able to move workloads to the optimal location and not get locked in — reduces the work needed to repatriate workloads in the future. The popularity of Kubernetes and the containerization of software, which makes workloads more portable, was in part a reaction to companies not wanting to be locked into a specific cloud.

Incrementally repatriate. There’s also no reason that repatriation (if that’s indeed the right move for your business) can’t be done incrementally, and in a hybrid fashion. We need more nuance here beyond either/or discussions: for example, repatriation likely only makes sense for a subset of the most resource-intensive workloads. It doesn’t have to be all or nothing! In fact, of the many companies we spoke with, even the most aggressive take-back-their-workloads ones still retained 10 to 30% or more in the cloud.

While these recommendations are focused on SaaS companies, there are also other things one can do; for instance, if you’re an infrastructure vendor, you may want to consider options for passing through costs — like using the customer’s cloud credits — so that the cost stays off your books. The entire ecosystem needs to be thinking about the cost of cloud.

*     *     *

How the industry got here is easy to understand: The cloud is the perfect platform to optimize for innovation, agility, and growth. And in an industry fueled by private capital, margins are often a secondary concern. That’s why new projects tend to start in the cloud, as companies prioritize velocity of feature development over efficiency.

But now, we know. The long-term implications were less well understood — which is ironic given that over 60% of companies cite cost savings as the very reason to move to the cloud in the first place! For a new startup or a new project, the cloud is the obvious choice. And it is certainly worth paying even a moderate “flexibility tax” for the nimbleness the cloud provides.

The problem is, for large companies — including startups as they reach scale — that tax equates to hundreds of billions of dollars of equity value in many cases… and is levied well after the companies have already deeply committed themselves to the cloud (and are often too entrenched to extricate themselves). Interestingly, one of the most commonly cited reasons to move to the cloud early on — avoiding a large up-front capital outlay (CapEx) — is no longer a barrier to repatriation. Over the last few years, alternatives to public cloud infrastructure have evolved significantly and can be built, deployed, and managed entirely via operating expenses (OpEx) instead of capital expenditures.

Note too that as large as some of the numbers we shared here seem, we were actually conservative in our assumptions. Actual spend is often higher than committed, and we didn’t account for overages-based elastic pricing. The actual drag on industry-wide market caps is likely far higher than penciled.

Will the 30% margins currently enjoyed by cloud providers eventually winnow through competition and change the magnitude of the problem? Unlikely, given that the majority of cloud spend is currently directed toward an oligopoly of three companies. And here’s a bit of dramatic irony: Part of the reason Amazon, Google, and Microsoft — representing a combined ~$5 trillion market cap — are all buffered from competition is that they have high profit margins driven in part by running their own infrastructure, enabling ever greater reinvestment into product and talent while buoying their own share prices.

And so, with hundreds of billions of dollars in the balance, this paradox will likely resolve one way or the other: either the public clouds will start to give up margin, or, they’ll start to give up workloads. Whatever the scenario, perhaps the largest opportunity in infrastructure right now is sitting somewhere between cloud hardware and the unoptimized code running on it.

Acknowledgements: We’d like to thank everyone who spoke with us for this article (including those named above), sharing their insights from the frontlines. 

Note: Companies selected denoted some degree of public cloud infrastructure utilization in their 10-K filings.

Sarah Wang is a partner at Andreessen Horowitz focused on late stage venture investments across enterprise, consumer, fintech, and bio.

Martin Casado is a general partner at Andreessen Horowitz, where he focuses on enterprise investing.

This story originally appeared on A16z.com. Copyright 2021


Varjo Reality Cloud lets you virtually experience a real place via ‘teleportation’


Varjo is unveiling its Reality Cloud platform for virtual teleportation. That means one person can capture the reality of a space in a particular location and share that reality in extreme detail for a remote person to experience, virtually.

The Varjo Reality Cloud shares the details of a room in photorealistic detail, showing someone remotely located a view of the room in real time. Yes, you heard that. Varjo lets one person scan a 3D space and another person experience it virtually at almost the same time, as it can transfer the necessary data in compact streams of 10 megabits to 30 megabits per second with almost no time delays, the company said.

It’s a pretty amazing technology that comes from the pioneering work Varjo has done in creating high-end virtual reality and mixed reality headsets for enterprises such as Volvo, which uses them to design cars in virtual environments. The caveat, of course, is whether the tech really works as envisioned.

“We are introducing Varjo Reality Cloud, and this is something very different from what you’ve seen from Varjo before,” Timo Toikkanen, CEO of Varjo, said in an interview with GamesBeat. “We have been working on a software platform that is the first in the world that enables virtual teleportation.”

The VR and mixed reality tech that Varjo introduced over the past couple of years now comes into play: cameras on a Varjo XR-3 mixed reality headset capture the environment around a person. The platform then transmits that slice of reality to someone else, who uses a headset to experience the exact same physical reality, but in a virtual way. If Varjo can deliver Varjo Reality Cloud with the same quality it shows in its videos, it will feel like you’re “teleporting” from your real location to a virtual location.

“You can be anywhere in the world,” Toikkanen said. “You can scan your surroundings, not just a 3D object or something like that. You can digitize the world around you if you like. And do that in super high fidelity, through Varjo Reality Cloud, so anybody anywhere in the world can join you in that location and see it exactly the way you see it, in perfect color, with lights and reflections, and so forth.”

It’s no joke, as Varjo has been working on this for years and it has raised $100 million to date from investors including Volvo (via the Volvo Cars Tech Fund), Atomico, NordicNinja, EQT Ventures, Lifeline Ventures, Tesi, and Swisscanto Invest by Zürcher Kantonalbank.

“It’s a sci-fi dream come true. But we are fully grounded in reality. So we have been looking at the experience. How can we enable people to have similar interpersonal experiences as you do in real life, and do that remotely?” Toikkanen said. “What really accelerated for us during the last year was the realization that the world will never return to the same after COVID, and travel will forever be changed. And we saw that this is one of those moments when the world is more ready than ever for a transformation of this nature in the way communication and interaction is done. This is the right time to begin that change.”

A realistic metaverse

Above: Varjo is unveiling its way to teleport to virtual spaces today.

Image Credit: Varjo

Toikkanen said that this capturing and sharing of reality is like a true-to-life metaverse, the universe of virtual worlds that are all interconnected, like in novels such as Snow Crash and Ready Player One.

He said that you will be able to see in real-time what your friend is seeing in another place through the cloud-based platform. One person can map their reality by looking around in a room, and that view is transported to the cloud and rebuilt as a room. The person that you share this reality with can view it and feel like they’re there, Toikkanen said.

“It’s a metaverse grounded in reality,” he said. “It really is like the science fiction, beaming yourself to the other end of the world and back. And we think this is a really big deal. If you think of the economic and ecological drivers in the world today, something like this makes travel unnecessary.”

He said it could pave the way for a new form of human interaction and universal collaboration.

“You can engage on a completely different level than you have ever been able to in the history of communications,” Toikkanen said. “It really does change things in a big way, both for businesses as well as for private individuals. You can teleport to other people, to your family, or you can teleport to a work project.”

The system lets anybody scan their surroundings, turning them into 3D imagery using a Varjo XR-3 headset and then transport that 3D space to another person. That person gets to see the exact physical reality, completely bridging the real and the virtual in true-to-life visual fidelity, said Urho Konttori, chief technology officer at Varjo in Helsinki, Finland.

“It’s super important that the latency is kept low enough so that you feel that the interaction is logical, and that you don’t have motion-related latency,” said Konttori. “We have put an immense amount of effort into making it so that a human-eye resolution, fully immersive stream from the cloud can be sent at 10 to 30 megabits per second speeds.”

This real-time reality sharing will usher in a new era in universal collaboration and pave the way for a metaverse of the future, transforming the way people work, interact, and play, Konttori said.

For the past five years, Varjo has been building and perfecting the foundational technologies needed to bring its Varjo Reality Cloud platform to market, such as human-eye resolution, low-latency video pass-through, integrated eye tracking, and the lidar capability of the company’s mixed reality headset.

The company has already delivered these building block technologies in market-ready VR products that enterprises use to design their products. And now Varjo is uniquely positioned to combine them with Varjo Reality Cloud to empower users to enjoy the scale and flexibility of virtual computing in the cloud without compromising performance or quality.

Using Varjo’s proprietary foveated transport algorithm, users will be able to stream the real-time, human-eye resolution, wide-field-of-view 3D video feed at single-digit megabytes per second to any device. This ability to share, collaborate in, and edit one’s environment with other people makes human connection more real and efficient than ever before, eliminating the restrictions of time and place completely.

Dimension10 acquisition


Above: Varjo has been working on the Varjo Reality Cloud for years.

Image Credit: Varjo

To further accelerate bringing the vision for Varjo Reality Cloud to life, Varjo today also announced the acquisition of Dimension10, a Norwegian software company that pioneers industrial 3D collaboration.

“We’re big fans of the company and have been for a long time,” Toikkanen said. “They have been pioneering collaboration in 3D models. And we think collaboration is at the heart of Varjo Reality Cloud, and us joining forces with them expedites progress.”

The Dimension10 virtual meeting suite is designed for architecture, engineering and construction teams and will become a critical component to making virtual collaboration possible within Varjo Reality Cloud. Dimension10 adds 14 people to Varjo’s team.

Additionally, Varjo added Lincoln Wallen to the company’s board of directors. Wallen currently serves as the CTO at Improbable, and he is a recognized scholar in computing and AI.

Previously, Wallen worked as CTO of DreamWorks, where he transitioned global movie production to the cloud, including the development of a cloud-native toolset for asset management, rendering, lighting, and animation.

Varjo Reality Cloud will first be available to existing customers and partners in alpha access starting later this year. For more information about Varjo’s new cloud platform and its vision for the metaverse, tune into a live, virtual event today, June 24, 2021, at 9 a.m. Pacific time via varjo.com.

Tech demos


Above: Varjo lets one person scan a 3D space and another person experience it virtually.

Image Credit: Varjo

In a video tech demo, Varjo showed a simplified example of how the world can be captured and streamed in real time as a 3D representation. It shows a time-lapse of a scene captured in real time from a Varjo XR-3 headset. The video is converted into a 3D space that someone with a viewer and access to the Varjo Reality Cloud can use to see that room from any 3D angle.

At the beginning of the video, the user scans the room and then stops to watch Konttori give a talk. While Konttori is speaking, you see the naturalness of the movement, captured with just a Varjo XR-3 headset in the room, with no additional cameras or recording devices. The camera is able to move freely as it’s all in 3D and not a flat video.

In a second video, Varjo teleports Konttori to the company’s Varjo HQ in Helsinki in mixed reality. A user wearing the headset sees the teleported Konttori mixed into a physical space at the headquarters. Later they mix the teleported surroundings together with the physical space in the headquarters.

Cool technology


Above: Kia is using Varjo headsets to design cars.

Image Credit: Varjo

Varjo was founded in 2016, when other headsets like the Oculus Rift and the HTC Vive first appeared. But instead of targeting entertainment, Varjo went after enterprises with no-compromise technology.

It debuted its first VR headset, the XR-1, in early 2019 with human-eye resolution, or 1,920 pixels x 1,080 pixels per eye, and an 87-degree field of view. That headset cost $10,000, but the company followed it up in December 2020 with its XR-3 and VR-3 headsets, which combined VR and augmented reality in the same device.

That generation had twice the performance of the previous one, with “human-eye resolution” of 1,920 pixels x 1,920 pixels per eye and a 115-degree field of view. It was also cheaper, ranging from $3,195 to $5,495, and was available via lower-cost enterprise subscriptions.

Now these headsets can be the jumping-off point for the Varjo Reality Cloud, as they can connect to the datacenter and upload the scanned environment that someone can see via the cameras on the headset. The quality of the headset capture enables high-quality imagery in the cloud, Konttori said.

“We have innovated for the last five years on making that high fidelity possible,” Toikkanen said. “It links directly to the investment we have made on the headset side into gaze tracking, eye tracking if you like, because that enables innovation. We have also invested in transporting the data between the locations, to the cloud and back, so that we can do this and ensure high quality and super low latency. So that’s essentially where we are. We think of it as nothing less than the next form of human interaction.”

The hard part


Above: Varjo is targeting professionals such as product designers with its XR/VR headsets.

Image Credit: Varjo

“Nobody else is at the place where they have the hardware even near the quality that we have, let alone the software stack that allows us to actually pull this off,” Toikkanen said. “And we have of course been developing this simultaneously. And now is the culmination of all that work.”

Gaze tracking is important because if you can track where someone’s eyes are moving, then you know what they’re looking at and you can transport that view with low latency. That allows the company to create foveated transport algorithms, which means it only sends the data that you can see and that you are looking at, rather than other data that isn’t needed in real time at that moment.

“It’s a huge undertaking, and so we developed a year and a half ago a new way of doing that transport,” Konttori said. “The video stream focuses at the place that you’re looking at. That’s where we have the full resolution in the video stream. And then the resolution degrades gradually from that towards the edges of the screen. And it does that very quickly. It means that we can take the data that we send at the moment on cables from the computer to the headset, which is running at like 20 gigabits per second, and we can send that with our new compression technology at 10 megabits to 30 megabits per second.”
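
As a toy model of that principle (not Varjo’s proprietary algorithm, just the general foveation idea Konttori describes, with an assumed Gaussian falloff):

    # Toy foveation model: full encoding quality at the gaze point, decaying
    # smoothly with angular distance from it (roughly how visual acuity falls
    # off the fovea). The Gaussian falloff and its width are assumptions.
    import math

    def quality_weight(eccentricity_deg, falloff_deg=10.0):
        """Relative bit allocation for a region eccentricity_deg off-gaze."""
        return math.exp(-(eccentricity_deg / falloff_deg) ** 2)

    print(quality_weight(0))   # 1.0 at the gaze point: full resolution
    print(quality_weight(30))  # ~0.0001 in the periphery: almost no bits
    # Spending bits only where the eye can resolve detail is how a raw feed
    # on the order of 20 Gbps can compress into a 10-30 Mbps stream.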

That means you can share imagery with someone 2,000 miles away, Toikkanen said.

Enterprise applications


Above: Varjo’s new XR-3 and VR-3 headsets.

Image Credit: Varjo

It’s a level of quality with 10 times the resolution of other headsets out there, Konttori said.

“You get real-time presence because when we’re scanning, we’re not just making a 3D model of the surroundings that you’re in and making that a teleport location,” Konttori said. “We’re actually updating that in real time.”

You could have a manager on a factory floor put on a headset. They can create a teleport node, and people from other countries can join and see what the manager sees. It’s all updated in real time and people get a sense they are truly at that location. They can fix the things that the manager is looking at, and then take off a headset and be at home.

“If you want to visit your family, it’s the same thing,” Konttori said. “You can share that physical location and people can instantly perceive the world as if they were actually there themselves.”

Once you scan a place, you don’t have to scan it again, Toikkanen said. And you can use any headset to teleport to a location, or use a phone and still have the freedom of movement to look around. But the Varjo XR-3 is the only device that can be the teleportation node that broadcasts and streams the 3D space to someone else.

Toikkanen said it’s like moving from the telephone to a video conference, and moving from that to something that is even more transformative.

“We think there are going to be a billion people using this kind of service over the next 10 years or 20 years,” he said. “We are in the alpha phase with real customers and partners this year.”

A cousin of the Omniverse


Above: BMW Group is using Nvidia’s Omniverse to build a digital factory that will mirror a real-world place.

Image Credit: Nvidia

I asked if this would be a way to scan the world into Nvidia’s Omniverse, the metaverse for engineers that lets them simulate photorealistic details in a virtual world to test how they will work in reality. BMW is using the Omniverse to create a “digital twin,” a car factory it can design in a virtual space before it builds an exact copy in the physical world.

Toikkanen said that both tools are useful for the metaverse and they are complementary.

“They’re both part of the movement towards the metaverse, and this teleport functionality is adding a completely new node into the sphere of discussion of a metaverse, which is that one part of that can be the real world itself,” Toikkanen said. “And we make it so that you get the benefits of a metaverse also in a real-world setting. And we think that’s at least equally transformative as the metaverse which is typically seen only in virtual reality.”


Immutable will launch Ethereum token for Gods Unchained


The gods are evidently fond of tokens. Immutable said today that the Gods Unchained blockchain card game will launch a new Ethereum token dubbed $GODS in partnership with Nine Realms.

Sydney, Australia-based Immutable will launch the $GODS token to scale the game’s trading market and play-to-earn systems. That means players will be able to buy and sell the tokens in the game and gain a voice in how the blockchain game is run.

Immutable said this helps give players a stake in the game and its economy, shifting power from the developers to the players by providing in-game assets that players can actually own.

The $GODS token will sit at the heart of the game’s ecosystem, providing both in-game and external utility. At launch, $GODS will operate as a utility and governance token, giving holders a voice in the digital space, as well as active staking opportunities that allow players to earn rewards through gameplay campaigns. Over time, functionality will expand to embed the token within Gods Unchained’s play-to-earn game loops, allowing players to earn $GODS tokens by simply playing the game. I call this the Leisure Economy, where we get paid to play games.

$GODS will then directly interact with Gods Unchained’s nonfungible token (NFT) assets: items that players can wield in-game and trade or sell on the marketplace. That means those games will have uniquely identifiable digital items that players can earn, buy, or sell, allowing the players to own the items permanently.

Immutable X

Above: The $GODS Unchained token.

Image Credit: Immutable

Immutable X has created a marketplace for players in games such as Gods Unchained to buy and sell the items they have collected. Immutable X is the brainchild of Immutable, an Australian game team that runs the NFT trading card game Gods Unchained. Gods Unchained is an important NFT game, as it is built by a development team headed by Chris Clay, former game director of Magic: The Gathering Arena. Gods Unchained is a “play to earn” game, where players can earn collectibles over time, Immutable cofounder Robbie Ferguson said in a recent interview with GamesBeat. And they can make money by trading those collectibles, including unique NFTs that the blockchain (the secure digital ledger technology) can prove are not copies.

In the past few months, NFTs have exploded in other applications such as art, sports collectibles, and music. NBA Top Shot (a digital take on collectible basketball cards) is one example. Built by Dapper Labs, NBA Top Shot has surpassed $540 million in sales, five months after going public to a worldwide audience. And an NFT digital collage by the artist Beeple sold at Christie’s for $69.3 million. Investors are pouring money into NFTs, and some of those investors are game fans. The prices for NFTs have fallen, but many of those fans are undeterred.

As one of the highest-grossing blockchain games of 2020, Gods Unchained has logged millions of matches during its ongoing beta and boasts over 4 million assets. The token launch comes off the back of Gods Unchained’s latest expansion set, Trial of the Gods. That set completely sold out, and a new expansion is on the horizon.

$GODS is being created, issued and distributed by Nine Realms for use within the Gods Unchained ecosystem.


Above: Gods Unchained

Image Credit: Immutable

$GODS is an ERC-20 token that will interact natively with Immutable X, the layer 2 scaling solution for Ethereum trading. The Immutable X platform allows for peer-to-peer trading without the hindrance of gas fees, and will be expanding to include ERC-20 tokens once the $GODS token drops.
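
For readers unfamiliar with the term, an ERC-20 token is a contract that exposes a standard interface (balanceOf, transfer, and so on) that any wallet or exchange can call uniformly. Below is a minimal sketch using web3.py; the RPC endpoint and addresses are placeholders, not $GODS specifics:

    # Query an ERC-20 balance through the standard interface. The endpoint
    # and addresses are placeholders; $GODS contract details are not public.
    from web3 import Web3

    ERC20_ABI = [{
        "name": "balanceOf", "type": "function", "stateMutability": "view",
        "inputs": [{"name": "owner", "type": "address"}],
        "outputs": [{"name": "", "type": "uint256"}],
    }]

    w3 = Web3(Web3.HTTPProvider("https://example-node/rpc"))  # placeholder RPC
    TOKEN = "0x0000000000000000000000000000000000000000"   # placeholder token
    HOLDER = "0x0000000000000000000000000000000000000000"  # placeholder wallet

    token = w3.eth.contract(address=TOKEN, abi=ERC20_ABI)
    balance = token.functions.balanceOf(HOLDER).call()  # raw units, 10**decimals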

In 2020, Immutable partnered with StarkWare, a company whose technology taps the security of the Ethereum blockchain without incurring huge fees. Immutable X is built on top of StarkWare’s layer 2 scaling technology. Essentially, users don’t have to trust in Immutable lasting permanently in order to keep owning their NFTs — they can just trust in Ethereum. Immutable X’s mainnet is now available as the first layer 2 solution for NFTs on Ethereum, the company said.

Other solutions to Ethereum’s scaling problem create alternative, faster cryptocurrencies with different methods of reaching consensus. But these alternatives aren’t as popular as Ethereum. Another solution is to create a side chain, with a different kind of processing for transactions. But Immutable believes those solutions can fail because their security still isn’t as strong as Ethereum’s. If the security fails, then so does the authenticity of the NFT, and that would be disastrous, Immutable said.

For more information on $GODS, keep an eye on this link for updates around eligibility, distribution methods, and ways to claim and earn tokens. Immutable has about 100 employees, with 40 of them working on Gods Unchained.


Survey-style measurement of IT isn’t effective, a ‘rigged lottery’

Survey-style measurement of IT is a rigged lottery, as it falls far short of providing a true measure of Digital Employee Experience (DEX).