Tim Sweeney testimony and new documents shed light on Epic and Apple’s game businesses

The Apple v. Epic antitrust trial that began Monday is digging into some of the secrets of the business of gaming, as documents and testimony are surfacing some of the key numbers behind each titan’s business practices.

Epic has sued Apple in federal court in the Northern District of California in Oakland, alleging that Apple’s control of its App Store and payment restrictions for mobile games and apps amount to an illegal monopoly. Apple responded with an opening statement that painted Epic Games as a big corporation that is overplaying the victim role and is threatening the security of the App Store.

The lawsuit started last August after Epic Games, the maker of the popular Fortnite battle royale blockbuster game, tried to circumvent Apple’s payment system and implement a discount for consumers that avoided Apple’s 30% fee for App Store’s transactions. Apple kicked Epic’s game out of the App Store, and Epic sued Apple for antitrust violations. (Epic Games also sued Google for similar reasons).

The trial started hilariously as the court accidentally left microphones open for all public callers, allowing gamers to profess their support for Epic and demand that their favorite game return to the iOS App Store. But eventually, outside counsel Katherine Forrest for Epic Games and Karen Dunn for Apple delivered their opening statements.

Email evidence

Above: Epic Games’ opening statement slides make its case against Apple.

Image Credit: Epic Games

Epic pulled out statements going back to former Apple CEO Steve Jobs about his intention to “lock in” consumers to Apple’s ecosystem against rivals such as Google. Apple, meanwhile, noted that Epic CEO Tim Sweeney tried to enlist the help of Apple competitor Microsoft in a scheme to call out Apple on its alleged anti-competitive practices, even as Epic’s own allegations portrayed Apple as far more powerful than any of its competitors.

Much of the case will focus on a couple of issues. Epic has accused Apple of controlling app distribution with draconian policies on its App Store, including its high 30% fee on all transactions. Epic compared this to Apple taking a share of a car’s purchase price and then taking a 30% cut every time the driver buys gas for that car. Epic has also said Apple controls payments through its own payment system, even though rival payment systems work perfectly well. And it noted Apple does not uniformly enforce its policies, as it allows Uber to use a different payment system because it is in a different app category than Epic.

Epic argued the “relevant market” for evaluating anticompetitive behavior is the App Store, as Apple has a lock on a billion wealthy gamers who spend a lot of money and face a lot of switching costs when it comes to defecting to another platform. But Apple contends that the relevant market is far broader, as Apple has less than 50% of the global smartphone market and Android is more dominant. On top of that, it noted that it only accounts for 7% of global Fortnite revenue, while rivals such as Sony account for 46.8%.

Epic said Apple should allow it to sideload apps and use alternative payment systems. But Apple said that allowing sideloading, which would let developers put untested apps inside their App Store apps, would circumvent its security. Apple noted that its iOS devices accounted for only 1.72% of all malware infections, compared to 26.6% for Android and 38.9% for Windows. Epic pointed out that Apple allows sideloading on the Mac without being as restrictive about security as it is on the iPhone.

Dunn said that evidence shows that platform-switching does happen, sometimes as much as 26% of the time when people buy new phones. There are also many alternatives to Apple for digital game transactions, Dunn said. Apple said 95% of its customers can use alternatives to iOS in the home, based on a survey.

Sweeney’s testimony

Above: Epic’s argument about its antitrust case.

Image Credit: Epic Games

Epic Games CEO Tim Sweeney testified on Monday, the first day of an estimated three-week trial, but many of the questions were dull inquiries like “What is a console?” and “What is an avatar?” Sweeney answered those questions patiently. Occasionally, Judge Yvonne Gonzalez Rogers, who is presiding over a bench trial, asked her own questions.

But the documents released through the court turned up more interesting information. Simon Carless of GameDiscoverCo found documents that mentioned the details of Epic’s game giveaways in 2018 and 2019. Subnautica had more than 804,000 users when Epic Games gave away the game on its then-new Epic Games Store. Epic gave the developers $1.4 million, and roughly 17% of the players were new to the Epic store. That means the giveaways were a relatively inexpensive way to acquire new users.

For Borderlands 3, Epic paid $146 million in advances to have the game as an exclusive on the PC. Epic recouped the $80 million minimum guaranteed fee through marketing, bundle deals, and fees. And it got more than 1.56 million players. Of those, half were new to the store.

Some of the case will depend on which company is better at handling security. Epic said that Apple uses security as a reason for denying Epic’s desire to permit other ways of handling purchases on the iOS platform. But Epic said it handles security well while Apple has not over the years, citing evidence of numerous security breaches that have affected millions of users. The court record contains evidence of both sides being weak on security.

In one email, Epic executive Daniel Vogel said that “payment fraud is an existential threat to our store.” In another email entered as evidence, Sweeney had to apologize to Ubisoft CEO Yves Guillemot because 70% of the downloads of Ubisoft’s The Division 2 were fraudulent purchases, and Epic had to temporarily halt downloads for all Ubisoft games in the Epic Games Store at one point in 2019. Apple will likely point to this as evidence that Epic isn’t capable of providing app security in a store.

Sweeney said that Epic generated gross revenue of $5.1 billion in 2020, compared to a plan (in chart below) for $3.8 billion in 2020 revenue. That compares with our own discovery that in 2019, Epic Games reported $4.2 billion in revenue and $730 million in earnings before interest, taxes, depreciation, and amortization (EBITDA, a key measure of profitability). Sweeney said Epic has more than 3,200 employees. The documents also show Fortnite’s budget in 2017, as well as how much Epic paid various vendors for outsourcing tasks. There’s also a redacted copy of Epic’s agreement with Microsoft to publish Fortnite on the Xbox One game console.

Since its launch on iOS on March 16, 2018, Fortnite has generated 88 million downloads and $631 million in revenue. On Android, Fortnite has generated 80 million downloads and $47 million in revenue since August 13, 2018. That is a huge difference in how willing iOS users are to spend money compared to Android users. According to a presentation in June 2020, Fortnite had 81 million monthly active users, the Epic Games Store had 45 million monthly active users, and Epic’s Unreal Engine had 540,000 monthly active users.

Big profits

Above: Epic’s once-secret revenues have been disclosed in court papers.

Image Credit: Epic Games

In an opening statement, Apple’s outside legal counsel Dunn reminded the court that Epic Games isn’t a small David fighting a Goliath, as Epic is valued at $28 billion. She also reminded the court that Apple’s App Store has 1.8 million developers who have generated more than 180 billion downloads from the store. Epic pointed out that Apple’s profit margin on the App Store is 78%. Dunn noted that before Apple came along, publishers typically charged a 70% royalty fee for games and apps; many companies in the industry have now standardized on 30%, and only Epic and Microsoft (as of last week) charge 12%.

Asked why it took 10 years for Epic Games to file a lawsuit against Apple, Sweeney testified, “Epic didn’t initially take a critical view of Apple’s policies. It took a very long time for me to come to the realization of all the negative impacts of Apple’s policy.”

Above: Apple says there is plenty of competition to its App Store.

Image Credit: Apple

Sweeney said in his testimony that Sony requires Epic Games to pay a royalty to Sony for enabling crossplay, or the capability to play Fortnite on PlayStation and to team up with other players on other platforms such as Nintendo or Xbox.

Sweeney said he wanted the world to see the consequence of Apple’s policies when he decided to take the action that led to Apple removing Fortnite from the App Store. He said he had hoped Apple would not take that action but wasn’t sure exactly what it would do. Still, anticipating action from both Apple and Google, Epic had “Project Liberty” ready to go — an antitrust lawsuit against both Apple and Google.

Above: Apple argues its store has been great for developers.

Image Credit: Apple

Apple even tried to cut off Epic’s access to Apple developer tools for the Unreal Engine, but the judge blocked that move, as it would have had a big impact on Epic’s game engine customers.

Sweeney said that Fortnite transcends gaming, with special entertainment features such as concerts and films, and he believes it is a contender to be a metaverse, the universe of interconnected virtual worlds depicted in novels such as Snow Crash and Ready Player One.

GamesBeat

GamesBeat’s creed when covering the game industry is “where passion meets business.” What does this mean? We want to tell you how the news matters to you — not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it.

How will you do that? Membership includes access to:

  • Newsletters, such as DeanBeat
  • The wonderful, educational, and fun speakers at our events
  • Networking opportunities
  • Special members-only interviews, chats, and “open office” events with GamesBeat staff
  • Chatting with community members, GamesBeat staff, and other guests in our Discord
  • And maybe even a fun prize or two
  • Introductions to like-minded parties

Become a member

Understanding dimensionality reduction in machine learning models

Machine learning algorithms have gained fame for being able to ferret out relevant information from datasets with many features, such as tables with dozens of columns and images with millions of pixels. Thanks to advances in cloud computing, you can often run very large machine learning models without noticing how much computational power works behind the scenes.

But every new feature you add to your problem adds to its complexity, making it harder to solve with machine learning algorithms. To counter this, data scientists use dimensionality reduction, a set of techniques that remove excessive and irrelevant features from their machine learning models.

Dimensionality reduction slashes the costs of machine learning and sometimes makes it possible to solve complicated problems with simpler models.

The curse of dimensionality

Machine learning models map features to outcomes. For instance, say you want to create a model that predicts the amount of rainfall in one month. You have a dataset of different information collected from different cities in separate months. The data points include temperature, humidity, city population, traffic, number of concerts held in the city, wind speed, wind direction, air pressure, number of bus tickets purchased, and the amount of rainfall. Obviously, not all this information is relevant to rainfall prediction.

Some of the features might have nothing to do with the target variable. Evidently, population and number of bus tickets purchased do not affect rainfall. Other features might be correlated to the target variable, but not have a causal relation to it. For instance, the number of outdoor concerts might be correlated to the volume of rainfall, but it is not a good predictor for rain. In other cases, such as carbon emission, there might be a link between the feature and the target variable, but the effect will be negligible.

In this example, it is evident which features are valuable and which are useless. In other problems, the excessive features might not be obvious and might require further data analysis.

But why bother to remove the extra dimensions? When you have too many features, you’ll also need a more complex model. A more complex model means you’ll need a lot more training data and more compute power to train your model to an acceptable level.

And since machine learning has no understanding of causality, models try to map any feature included in their dataset to the target variable, even if there’s no causal relation. This can lead to models that are imprecise and erroneous.

On the other hand, reducing the number of features can make your machine learning model simpler, more efficient, and less data-hungry.

The problems caused by too many features are often referred to as the “curse of dimensionality,” and they’re not limited to tabular data. Consider a machine learning model that classifies images. If your dataset is composed of 100×100-pixel images, then your problem space has 10,000 features, one per pixel. However, even in image classification problems, some of the features are excessive and can be removed.

Dimensionality reduction identifies and removes the features that are hurting the machine learning model’s performance or aren’t contributing to its accuracy. There are several dimensionality reduction techniques, each of which is useful for certain situations.

Feature selection

A basic and very efficient dimensionality reduction method is to identify and select a subset of the features that are most relevant to the target variable. This technique is called “feature selection.” Feature selection is especially effective when you’re dealing with tabular data in which each column represents a specific kind of information.

When doing feature selection, data scientists do two things: keep features that are highly correlated with the target variable and contribute the most to the dataset’s variance. Libraries such as Python’s Scikit-learn have plenty of good functions to analyze, visualize, and select the right features for machine learning models.

For instance, a data scientist can use scatter plots and heatmaps to visualize the covariance of different features. If two features are highly correlated to each other, then they will have a similar effect on the target variable, and including both in the machine learning model will be unnecessary. Therefore, you can remove one of them without causing a negative impact on the model’s performance.

Above: Heatmaps illustrate the covariance between different features. They are a good guide to finding and culling features that are excessive.
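
As a rough sketch of that workflow (assuming a small synthetic version of the rainfall dataset described earlier, with illustrative column names and a 0.9 correlation threshold), the snippet below plots the correlation heatmap with pandas and seaborn and drops one feature from any highly correlated pair.

```python
# Hypothetical sketch: visualize feature covariance and drop redundant features.
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = 500
temperature = rng.normal(20, 5, n)
humidity = 0.8 * temperature + rng.normal(0, 1, n)  # strongly correlated with temperature
wind_speed = rng.normal(10, 3, n)
rainfall = 2.0 * humidity + 0.5 * wind_speed + rng.normal(0, 2, n)

df = pd.DataFrame({
    "temperature": temperature,
    "humidity": humidity,
    "wind_speed": wind_speed,
    "rainfall": rainfall,
})

# Visualize pairwise correlations between the features as a heatmap.
corr = df.drop(columns="rainfall").corr()
sns.heatmap(corr, annot=True, cmap="coolwarm")
plt.show()

# Drop one feature from any pair whose absolute correlation exceeds 0.9.
upper = corr.abs().where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
redundant = [col for col in upper.columns if (upper[col] > 0.9).any()]
reduced_df = df.drop(columns=redundant)
print("Dropped:", redundant)  # likely ["humidity"], since it tracks temperature
```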

The same tools can help visualize the correlations between the features and the target variable. This helps remove variables that do not affect the target. For instance, you might find out that out of 25 features in your dataset, seven of them account for 95 percent of the effect on the target variable. This will enable you to shave off 18 features and make your machine learning model a lot simpler without suffering a significant penalty to your model’s accuracy.
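
A hedged sketch of that kind of target-focused selection with scikit-learn is shown below. SelectKBest with f_regression scores each feature against the target and keeps only the top k; the 25-feature, 7-feature split mirrors the example above, and the synthetic data is purely illustrative.

```python
# Hypothetical sketch: keep the 7 features (out of 25) most related to the target.
from sklearn.datasets import make_regression
from sklearn.feature_selection import SelectKBest, f_regression

# 25 features, only 7 of which actually drive the target variable.
X, y = make_regression(n_samples=1000, n_features=25, n_informative=7, random_state=0)

selector = SelectKBest(score_func=f_regression, k=7)
X_reduced = selector.fit_transform(X, y)

print(X_reduced.shape)                     # (1000, 7)
print(selector.get_support(indices=True))  # indices of the selected columns
```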

Projection techniques

Sometimes, you don’t have the option to remove individual features. But this doesn’t mean that you can’t simplify your machine learning model. Projection techniques, also known as “feature extraction,” simplify a model by compressing several features into a lower-dimensional space.

A common example used to represent projection techniques is the “swiss roll” (pictured below), a set of data points that swirl around a focal point in three dimensions. This dataset has three features. The value of each point (the target variable) is measured based on how close it is along the convoluted path to the center of the swiss roll. In the picture below, red points are closer to the center and the yellow points are farther along the roll.

Above: The swiss roll dataset, plotted in three dimensions.

In its current state, creating a machine learning model that maps the features of the swiss roll points to their value is a difficult task and would require a complex model with many parameters. But with the help of dimensionality reduction techniques, the points can be projected to a lower-dimension space that can be learned with a simple machine learning model.

There are various projection techniques. In the case of the above example, we used “locally-linear embedding” (LLE), an algorithm that reduces the dimension of the problem space while preserving the key elements that separate the values of data points. When our data is processed with LLE, the result looks like the following image, which is like an unrolled version of the swiss roll. As you can see, points of each color remain together. In fact, this problem can be simplified further into a single feature and modeled with linear regression, the simplest machine learning algorithm.

Above: The swiss roll dataset after being projected to two dimensions with locally-linear embedding.
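
For readers who want to experiment, the sketch below generates a swiss roll with scikit-learn and unrolls it with locally-linear embedding; the neighbor count and the two-dimensional target space are illustrative choices rather than the exact settings behind the figures above.

```python
# Minimal sketch: unroll a synthetic swiss roll with locally-linear embedding (LLE).
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding
import matplotlib.pyplot as plt

# X has 3 features; t measures each point's position along the roll (its "value").
X, t = make_swiss_roll(n_samples=1500, noise=0.1, random_state=0)

lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, random_state=0)
X_unrolled = lle.fit_transform(X)

# Points with similar t (color) stay together once the roll is unrolled.
plt.scatter(X_unrolled[:, 0], X_unrolled[:, 1], c=t, cmap="viridis", s=5)
plt.show()
```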

While this example is hypothetical, you’ll often face problems that can be simplified if you project the features to a lower-dimensional space. For instance, “principal component analysis” (PCA), a popular dimensionality reduction algorithm, has found many useful applications to simplify machine learning problems.

In the excellent book Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow, data scientist Aurelien Geron shows how you can use PCA to reduce the MNIST dataset from 784 features (28×28 pixels) to about 150 features while preserving 95 percent of the variance. This level of dimensionality reduction has a huge impact on the costs of training and running artificial neural networks.

Above: Dimensionality reduction applied to the MNIST dataset.
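
Geron's result can be approximated with a few lines of scikit-learn, as sketched below. Fetching MNIST through fetch_openml is just one convenient way to get a 784-feature dataset, and passing a fraction to n_components tells PCA to keep enough components to preserve that share of the variance.

```python
# Sketch of PCA compressing MNIST while preserving 95 percent of the variance.
from sklearn.datasets import fetch_openml
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split

X, _ = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X_train, X_test = train_test_split(X, test_size=0.2, random_state=0)

# A float below 1.0 means "keep enough components to explain this much variance."
pca = PCA(n_components=0.95)
X_train_reduced = pca.fit_transform(X_train)
print(pca.n_components_)  # roughly 150 components, as cited above

# New data must be projected with the same fitted PCA before inference.
X_test_reduced = pca.transform(X_test)
```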

There are a few caveats to consider about projection techniques. Once you develop a projection technique, you must transform new data points to the lower dimension space before running them through your machine learning model. However, the costs of this preprocessing step are not comparable to the gains of having a lighter model. A second consideration is that transformed data points are not directly representative of their original features and transforming them back to the original space can be tricky and in some cases impossible. This might make it difficult to interpret the inferences made by your model.
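
Continuing the hypothetical PCA sketch above, the short snippet below illustrates the second caveat: mapping reduced points back with inverse_transform only approximates the original features, which is part of why a reduced model can be harder to interpret.

```python
# Continuing the PCA sketch: reconstruction from the reduced space is only approximate.
import numpy as np

X_test_reconstructed = pca.inverse_transform(X_test_reduced)
reconstruction_error = np.mean((X_test - X_test_reconstructed) ** 2)
print("Mean squared reconstruction error:", reconstruction_error)  # nonzero
```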

Dimensionality reduction in the machine learning toolbox

Having too many features will make your model inefficient. But removing too many features will not help either. Dimensionality reduction is one of many tools data scientists can use to make better machine learning models, and as with every tool, it must be used with caution and care.

Ben Dickson is a software engineer and the founder of TechTalks, a blog that explores the ways technology is solving and creating problems.

This story originally appeared on Bdtechtalks.com. Copyright 2021

VentureBeat

VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative technology and transact.

Our site delivers essential information on data technologies and strategies to guide you as you lead your organizations. We invite you to become a member of our community, to access:

  • up-to-date information on the subjects of interest to you
  • our newsletters
  • gated thought-leader content and discounted access to our prized events, such as Transform 2021: Learn More
  • networking features, and more

Become a member

SolarWinds breach exposes hybrid multicloud security weaknesses

A hybrid multicloud strategy can capitalize on legacy systems’ valuable data and insights while using the latest cloud-based platforms, apps, and tools. But getting hybrid multicloud security right isn’t easy.

Exposing severe security weaknesses in hybrid cloud, authentication, and least privileged access configurations, the high-profile SolarWinds breach laid bare just how vulnerable every business is. Clearly, enterprise leaders must see beyond the much-hyped baseline levels of identity and access management (IAM) and privileged access management (PAM) now offered by cloud providers.

In brief, advanced persistent threat (APT) actors penetrated the SolarWinds Orion software supply chain undetected, modified dynamically linked library (.dll) files, and propagated malware across SolarWinds’ customer base while taking special care to mimic legitimate traffic.

The bad actors methodically studied how persistence mechanisms worked during intrusions and learned which techniques could avert detection as they moved laterally across cloud and on-premises systems. They also learned how to compromise SAML signing certificates while using the escalated Active Directory privileges they had gained access to. The SolarWinds hack shows what happens when bad actors focus on finding unprotected threat surfaces and exploiting them for data using stolen privileged access credentials.

The incursion is particularly notable because SolarWinds Orion is used for managing and monitoring on-premises and hosted infrastructures in hybrid cloud configurations. That is what makes eradicating the SolarWinds code and malware problematic, as it has infected 18 different Orion platform products.

Cloud providers do their part — to a point

The SolarWinds hack occurred in an industry that relies considerably on cloud providers for security control.

A recent survey by CISO Magazine found 76.36% of security professionals believe their cloud service providers are responsible for securing their cloud instances. The State of Cloud Security Concerns, Challenges, and Incidents Study from the Cloud Security Alliance found that use of cloud providers’ additional security controls jumped from 58% in 2019 to 71% in 2021, and 74% of respondents are relying exclusively on cloud providers’ native security controls today.

Above: Cloud providers’ security controls are not enough for most organizations, according to the State of Cloud Security Concerns report.

Image Credit: Cloud Security Alliance

Taking the SolarWinds lessons into account, every organization needs to verify the extent of the coverage provided as baseline functionality for IAM and PAM by cloud vendors. While the concept of a shared responsibility model is useful, it’s vital to look beyond cloud platform providers’ promises based on the framework.

Amazon’s interpretation of its shared responsibility model is a prime example. It’s clear the company’s approach to IAM, while centralizing identity roles, policies, and configuration rules, does not go far enough to deliver a fully secure, scalable, zero trust-based approach.

The Amazon Shared Responsibility Model makes it clear the company takes care of AWS infrastructure, hardware, software, and facilities, while customers are responsible for securing their client-side data, server-side encryption, and network traffic protection — including encryption, operating systems, platforms, and customer data.

Like competitors Microsoft Azure and Google Cloud, AWS provides a baseline level of support for IAM optimized for just its own environments. Any organization operating a hybrid multicloud environment and building out a hybrid IT architecture will have wide, unsecured gaps between cloud platforms because each platform provider offers IAM and PAM only for its own platform.

Above: The AWS Shared Responsibility Model is a useful framework for defining which areas of cloud deployment are customers’ responsibility.

Image Credit: Amazon Web Services

While a useful framework, the Shared Responsibility Model does not come close to providing the security hybrid cloud configurations need. It is also deficient in addressing machine-to-machine authentication and security, an area seeing rapid growth in organizations’ hybrid IT plans today. Organizations are also on their own when it comes to how they secure endpoints across all the public, private, and community cloud platforms they rely on.

There is currently no unified approach to solving these complex challenges, and every CIO and security team must figure it out on their own.

But there needs to be a single, unified security model that scales across on-premises, public, private, and community clouds without sacrificing security, speed, and scale. Averting the spread of a SolarWinds-level attack starts with a single security model across all on-premises and cloud-based systems, with IAM and PAM at the platform level.

Amid hybrid cloud and tool sprawl, security suffers

The SolarWinds attack came just as multicloud methods had started to gain traction. Cloud sprawl is defined as the unplanned and often uncontrolled growth of cloud instances across public, private, and community cloud platforms. The leading cause of cloud sprawl is a lack of control, governance, and visibility into how cloud computing instances and resources are acquired and used. Still, according to Flexera’s 2021 State of the Cloud Report, 92% of enterprises have a multicloud strategy and 82% have a hybrid cloud strategy.

Above: Cloud sprawl will become an increasing challenge, given organizations’ tendency to prioritize multicloud strategies.

Image Credit: Flexera

Cloud sprawl happens when an organization lacks visibility into or control over its cloud computing resources. Organizations are reducing the potential of cloud sprawl by having a well-defined, adaptive, and well-understood governance framework defining how cloud resources will be acquired and used. Without this, IT faces the challenge of keeping cloud sprawl in check while achieving business goals.

Overbuying security tools and overloading endpoints with multiple, often conflicting software clients weakens any network. Buying more tools could actually make a SolarWinds-level attack worse. Security teams need to consider how tool and endpoint agent sprawl is weakening their networks. According to IBM’s Cyber Resilient Organization Report, enterprises deploy an average of 45 cybersecurity-related tools on their networks today. The IBM study also found enterprises that deploy over 50 tools ranked themselves 8% lower in their ability to detect threats and 7% lower in their defensive capabilities than companies employing fewer toolsets.

Rebuilding on a zero trust foundation

The SolarWinds breach is particularly damaging from a PAM perspective. An integral component of the breach was compromising SAML signing certificates the bad actors gained by using their escalated Active Directory privileges. It was all undetectable to SolarWinds Orion, the hybrid cloud-monitoring platform hundreds of organizations use today. Apparently, a combination of hybrid cloud security gaps, lack of authentication on SolarWinds accounts, and lack of least privileged access made the breach undetectable for months, according to a Cybersecurity & Infrastructure Security Agency (CISA) alert. One of the most valuable lessons learned from the breach is the need to enforce least privileged access across every user and administrator account, endpoint, system access account, and cloud administrator account.

The bottom line is that the SolarWinds breach serves as a reminder to plan for and begin implementing zero trust frameworks that enable any organization to take a “never trust, always verify, enforce least privilege” strategy when it comes to their hybrid and multicloud strategies.

Giving users just enough privileges and resources to get their work done and providing least privileged access for a specific time is essential. Getting micro-segmentation right across IT infrastructures will eliminate bad actors’ ability to move laterally throughout a network. And logging and monitoring all activity on a network across all cloud platforms is critical.

Every public cloud platform provider has tools available for doing this. On AWS, for example, there are AWS CloudTrail and Amazon CloudWatch, which record and monitor API activity. Vaulting root accounts and applying multi-factor authentication across all accounts is a given.
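
As a minimal illustration (not a complete audit, and assuming boto3 with AWS credentials already configured), the sketch below checks two of those basics: that at least one CloudTrail trail is actually logging API activity, and that the root account has multi-factor authentication enabled.

```python
# Minimal sketch of two baseline checks; assumes boto3 and AWS credentials are set up.
import boto3

cloudtrail = boto3.client("cloudtrail")
iam = boto3.client("iam")

# Is at least one CloudTrail trail configured and actively logging API activity?
trails = cloudtrail.describe_trails()["trailList"]
print(f"CloudTrail trails configured: {len(trails)}")
for trail in trails:
    status = cloudtrail.get_trail_status(Name=trail["TrailARN"])
    print(trail["Name"], "logging" if status["IsLogging"] else "NOT logging")

# Does the root account have MFA enabled?
summary = iam.get_account_summary()["SummaryMap"]
print("Root account MFA enabled:", bool(summary.get("AccountMFAEnabled")))
```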

Organizations need to move beyond the idea that the baseline levels of IAM and PAM delivered by cloud providers are enough. Then these organizations need to think about how they can use security to accelerate their business goals by providing the users they serve with least privileged access.

Adopting a zero trust mindset and framework is a given today, as every endpoint, system access point, administrative login, and cloud administrator console is at risk if nothing changes.

The long-held assumptions of interdomain trust were proven wrong with SolarWinds. Now it’s time for a new, more intensely focused era of security that centers on enforcing least privilege and zero-trust methods across an entire organization.

How Scopely tries to do game acquisitions the right way

Big game companies could probably use a manual for acquiring game companies at a time when such deals are at an all-time high. In the first quarter, money for acquisitions, public offerings, and investments hit $39 billion, more than the $33 billion for all of last year, according to InvestGame.

So it seems timely for the whole game industry to step back and think about the right way to do acquisitions. We had a couple of executives from mobile game publisher Scopely talk about this subject at our recent GamesBeat Summit 2021 event.

Nick Tuosto, managing director of Liontree and cofounder of Griffin Gaming Partners, moderated the session with Tim O’Brien, chief revenue officer at Scopely, and Amir Rahimi, president of games at Scopely. I also interviewed Aaron Loeb, chief business officer at Scopely, about the company’s culture that enables acquisitions to succeed. In those conversations, we extracted some lessons that all companies could use in making mergers work and distilling the right culture for game companies.

Rahimi became a part of Scopely after the Los Angeles company acquired FoxNext Games, the game division of the entertainment company Fox. Disney acquired Fox and then chose to spin out its game division. That deal in January 2020 was the end of a long journey for Rahimi, who was part of TapZen, a studio that was started in 2013 and was first acquired by Kabam in 2015 to make mobile games. That was the first of six different transactions that Rahimi was part of over seven years as the studio was passed around from owner to owner and finally found its home at Scopely.

Marvel Strike Force’s growth

Above: Left to right: Nick Tuosto of Liontree/Griffin Gaming, Tim O’Brien, and Amir Rahimi.

Image Credit: GamesBeat

That involved a “process filled with twists and turns and drama,” Rahimi said during the session. Rahimi’s studio in Los Angeles had managed to launch Marvel Strike Force, a free-to-play fighting game, in 2018. And in 2020, under Scopely’s ownership, Marvel Strike Force grew its revenues more than 70% to $300 million. It was a big hit, and it became an even bigger one after the Scopely deal.

What’s amazing about that result is that the growth happened during lockdown and as Scopely was integrating FoxNext Games into the company, a task that O’Brien was heavily involved with as an executive on the buying side.

“We were sort of anti-remote work. When shelter-in-place hit, I was terrified and then was subsequently blown away by how well the team rallied and stayed focused and productive,” Rahimi said. The deal closed in February, right before everything shut down. “In terms of the transition from Disney to Scopely, it was a model for how to do M&A.”

It took about eight months to get the deal done. Rahimi said his team spent a lot of time getting to know O’Brien and the Scopely team. They knew their visions for the future were aligned.

“They created the optimal conditions for us to thrive,” Rahimi said. “We talked about how we could power up my studio.”

O’Brien said the whole plan to have gatherings and dinners was no longer possible in the pandemic. But the company had to make a lot of decisions to put the team members first, he said.

Creative freedom and user acquisition

Above: Scopely acquired Marvel Strike Force with its FoxNext games deal.

Image Credit: FoxNext Games

One of the things that Rahimi liked was that Scopely gave Rahimi’s studio creative freedom to make the right decisions for Marvel Strike Force and other games such as an upcoming Avatar title. Rahimi said Scopely’s leaders understood that the team was functioning well and it needed support, rather than a change in direction.

“I personally feel more empowered at Scopely than I have anywhere,” Rahimi said. “That allows me to empower my people and that benefits everyone. Unfortunately, there are companies that approach M&A in a very different way. They make changes in leaders or modify game concepts or swap out intellectual properties. And that proves to be disruptive.”

“You can’t do that,” Rahimi said. “The way you need to approach a team, especially a team that’s already successful, is really understanding what they do well, really understanding what you can bring to the table, and forming a partnership that is based on trust and transparency and candor,” and then accelerating the growth after the acquisition.

“The team went for it, and those decisions paid off,” Rahimi said.

Scopely did know where it had to make investments, and that was in marketing and acquiring new users, Loeb said in our fireside chat. It knew it had to do this in order to capture players for the long term. Scopely applied its talent in user acquisition, something it handles through its central technology services, and that really paid off, Rahimi said. That helped the game find new audiences around the globe. O’Brien said that the combination of the two companies led to accelerated growth for the game and all of its key performance indicators, such as how often players play the game in a week. Scopely has also added a lot of the FoxNext Games team members into leadership positions at Scopely.

“We knew we could learn as much from them as they could from us,” O’Brien said.

How culture matters too

One of the new leaders at Scopely is Aaron Loeb, chief business officer, who was also president of FoxNext Games before the acquisition. I also spoke with Loeb in a session at the GamesBeat Summit 2021 event about how Scopely is gathering its learnings into what it calls the Scopely Operating System. It’s really more about all of the processes and culture at Scopely. That’s all in service of becoming the definitive free-to-play game company, Loeb said.

“Our tools, our technology, everything is built around enabling the vision of the game team to go and do what they’re seeking to do to grow the game based on their vision,” Loeb said. “Scopely is really focused on the core problem set of making a great game, which is going and finding the right talent, the right leadership, people with a vision for what they want, and then supporting them with both the right technology but also the right resources to go out and actually chase that vision.”

Loeb said that Scopely had a game that was lost in the wilderness. It decided to restart the effort and reconstitute the vision.

“It pushed through those dark hours of the soul when it looked like they couldn’t make it,” Loeb said. “That game is now one of our biggest hits.”

A learning machine

Above: Scopely has expanded to a 60,000-square-foot space in Culver City.

Image Credit: Scopely

Loeb said one of the things that defines the culture is humility.

“This culture starts with hiring the smartest people you can find who are also really humble, who are excited about learning from each other,” Loeb said. “Our culture is a learning culture. We often talk about the company as a learning machine.”

The lesson for other companies is that it’s easy to fall into a pattern of operating with blinders on, with a focus on execution. But Loeb believes it’s important to question that pattern and maintain a desire for learning new ways of thinking. It turns out this way of thinking is good for players too. It leads to changes in the experience for gamers, even in areas that were previously thought of as solved problems.

“Our people are incredibly dedicated to challenging their own assumptions,” Loeb said. “We are skeptical of opinions, particularly our own. And I think that’s a really critical factor to building a great learning culture.”
