Activision Blizzard’s Call of Duty drives 27% Q1 2020 revenue growth to $2.28 billion





Activision Blizzard reported earnings today that beat Wall Street’s expectations with 27% revenue growth to $2.28 billion for the first quarter ended March 31. And Call of Duty is playing a significant role in this performance.

Activision’s Call of Duty franchise continued to outperform last year’s results as people continued to play a lot more games during the pandemic. In particular, players have embraced the free-to-play games Call of Duty: Warzone (a battle royale) and Call of Duty: Mobile.

Activision Blizzard CEO Bobby Kotick said in an interview with GamesBeat, “Everything was up, year over year. Call of Duty was up. Candy Crush was up considerably. Blizzard was up. Across the board, the business is going well. The pattern is strong. Free-to-play introduces people to the franchise.”

Those new entry points into the franchise have tripled monthly active users (MAUs) in the past couple of years, and Activision overall now has more than 150 million MAUs. Call of Duty franchise MAUs are up 40% from the first quarter of 2020.

The free-to-play games have also been a year-round onramp, encouraging players to upgrade to paid seasonal content and to purchase Call of Duty: Black Ops — Cold War, which debuted last fall.

The Santa Monica, California-based game publisher said its GAAP revenues for the first quarter ended March 31 were $2.28 billion, up 27% from $1.79 billion a year ago. GAAP earnings per share were 79 cents, compared to 65 cents a share a year earlier. In after-hours trading, Activision Blizzard’s stock price is up 5.6% to $93.67 a share.

Infusing free-to-play throughout the company could replicate this revenue growth across multiple franchises, as Blizzard Entertainment is attempting with Diablo Immortal, a mobile take on the action-RPG series that's in alpha testing.

Analysts expected Activision Blizzard to report non-GAAP earnings of 70 cents a share on revenue of $1.78 billion. The comparable actual results were non-GAAP earnings of 98 cents a share on net bookings of $2.06 billion.

Activision Blizzard has about 10,000 employees. But it still needs to hire more than 2,000 people, Kotick said. Toys for Bob, the maker of games such as Crash Bandicoot, is helping with Call of Duty this year. The company also recently laid off about 2% of the workforce, including a number in physical esports production. Taking recommendations from shareholder groups, Kotick also reduced his compensation, cutting his base salary in half to $875,000. He will still get bonuses based on how the company performs over multiple years, and his contract has been extended through March 31, 2023.

Activision Blizzard (which includes King) had 435 million monthly active users (MAUs), compared to 397 million in the previous quarter. Kotick said in the earnings call that the goal is to hit a billion users, and he noted Call of Duty gained more than 100 million players in the past year.

Call of Duty franchise keeps growing

Above: Wraith is a new Operator in Season Three for Call of Duty.

Image Credit: Activision

The big video game publisher said that both Call of Duty: Warzone (the battle royale mode for Call of Duty: Modern Warfare) and Call of Duty: Mobile drove demand for the quarter. A year ago, when Call of Duty: Warzone launched on March 11, the U.S. was just going into its first pandemic lockdown.

Call of Duty’s premium game, Call of Duty: Black Ops — Cold War, has outsold Modern Warfare, in no small part due to its integration with the free-to-play Warzone, which served to bring in new players and then made it easy for them to upgrade to the $60 game.

Kotick said that Warzone reached 100 million downloads and its recently launched third season is going strong. At 500 million downloads, Call of Duty: Mobile also brought tens of millions of new players into the franchise.

Call of Duty MAUs grew over 40% in Q1 from a year ago, and Cold War saw premium sales well beyond the usual number in the first quarter. Net bookings on the PC and console for Call of Duty grew more than 60% in the first quarter. Call of Duty: Mobile’s debut in China in December generated tens of millions of new players, said Daniel Alegre (the chief operating officer of Activision Blizzard) in the analyst call.

He said Sledgehammer Games is on track for a fall release of the next Call of Duty installment. The game will be announced “soon,” Alegre said.

Blizzard’s growth

Above: Diablo II: Resurrected is a remake of the classic Blizzard game.

Image Credit: Blizzard

Blizzard’s revenue grew 7% from a year ago, led by strong growth in the Warcraft franchise. World of Warcraft’s Shadowlands expansion has grown, and Blizzard had 27 million MAUs in the first quarter.

Hearthstone’s latest expansion, Forged in the Barrens, debuted March 30. Mercenaries, a new mode for Hearthstone, is in the works. Diablo Immortal is in its second phase of testing, and Activision Blizzard said it’s on track to release later this year. Diablo II: Resurrected, a remake of the classic, is also testing well and will launch later this year. Alegre said online viewership of its alpha test was the highest Blizzard has ever seen.

Blizzard didn’t say much new about the status of Overwatch 2, the sequel to its Overwatch team shooter. The title is expected to arrive in 2022. The company recently said that Jeff Kaplan, the game director for Overwatch 2, has left the company. World of Warcraft: Classic will get the Burning Crusade as its next expansion.

King’s progress

Above: Crash Bandicoot: On the Run.

Image Credit: Activision Blizzard

King’s revenue grew 22% in the quarter, and it reported 258 million MAUs, down from 273 million a year ago. One driver of growth was Crash Bandicoot: On the Run, which launched March 25 and has seen more than 30 million downloads to date. That likely helped with MAUs, but the revenues from it will probably materialize in the second quarter, which ends June 30.

King also saw more than 70% growth in advertising net bookings in the first quarter, with significant increases across both direct brand advertisers and partner networks. King’s bookings have stayed strong in the second quarter.

Other forms of entertainment, like sports and movie theaters, remain stalled because of social distancing and shelter-in-place orders during the pandemic. With few other options, more people than ever are turning to gaming, and mobile gaming in particular is benefiting.


As far as outlook goes, the company said it expects non-GAAP earnings per share of 91 cents on revenues of $2.135 billion for the second quarter ending June 30. The company is raising its full-year guidance as well. A thousand dollars invested in Activision Blizzard in 2001 would be worth $45,000, based on the company’s stock price gains. Armin Zerza will serve as the new chief financial officer, replacing Dennis Durkin, who retired. In the analyst call, Zerza said the company is focusing on year-round entertainment and said the year is off to a very strong start.

He noted that competition for talent remains high and that Call of Duty faces tough comparisons for the June quarter, with lower operating performance in Q2 and stronger performance in the second half of the year. Bookings are expected to be $8.6 billion for the full year. Alegre said he believes that the company will benefit from the long-term trend of more people playing more games.

Kotick said that Call of Duty: Mobile has crossed 500 million downloads and that Sledgehammer has expanded as a studio. Sledgehammer Games announced today that it has opened a studio in Toronto, Canada.

Updated 2:45 p.m. Pacific with details from the earnings call.


GamesBeat’s creed when covering the game industry is “where passion meets business.” What does this mean? We want to tell you how the news matters to you — not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it.

How will you do that? Membership includes access to:

  • Newsletters, such as DeanBeat
  • The wonderful, educational, and fun speakers at our events
  • Networking opportunities
  • Special members-only interviews, chats, and “open office” events with GamesBeat staff
  • Chatting with community members, GamesBeat staff, and other guests in our Discord
  • And maybe even a fun prize or two
  • Introductions to like-minded parties

Become a member



Understanding dimensionality reduction in machine learning models





Machine learning algorithms have gained fame for being able to ferret out relevant information from datasets with many features, such as tables with dozens of columns and images with millions of pixels. Thanks to advances in cloud computing, you can often run very large machine learning models without noticing how much computational power works behind the scenes.

But every new feature you add to a problem increases its complexity, making it harder to solve with machine learning algorithms. To counter this, data scientists use dimensionality reduction, a set of techniques that remove excessive and irrelevant features from machine learning models.

Dimensionality reduction slashes the costs of machine learning and sometimes makes it possible to solve complicated problems with simpler models.

The curse of dimensionality

Machine learning models map features to outcomes. For instance, say you want to create a model that predicts the amount of rainfall in one month. You have a dataset of measurements collected from different cities across separate months. The data points include temperature, humidity, city population, traffic, number of concerts held in the city, wind speed, wind direction, air pressure, number of bus tickets purchased, and the amount of rainfall. Obviously, not all of this information is relevant to rainfall prediction.

Some of the features might have nothing to do with the target variable. Evidently, population and number of bus tickets purchased do not affect rainfall. Other features might be correlated to the target variable, but not have a causal relation to it. For instance, the number of outdoor concerts might be correlated to the volume of rainfall, but it is not a good predictor for rain. In other cases, such as carbon emission, there might be a link between the feature and the target variable, but the effect will be negligible.

In this example, it is evident which features are valuable and which are useless. In other problems, the excessive features might not be obvious and may require further data analysis.

But why bother to remove the extra dimensions? When you have too many features, you’ll also need a more complex model. A more complex model means you’ll need a lot more training data and more compute power to train your model to an acceptable level.

And since machine learning has no understanding of causality, models try to map any feature included in their dataset to the target variable, even if there’s no causal relation. This can lead to models that are imprecise and erroneous.

On the other hand, reducing the number of features can make your machine learning model simpler, more efficient, and less data-hungry.

The problems caused by too many features are often referred to as the “curse of dimensionality,” and they’re not limited to tabular data. Consider a machine learning model that classifies images. If your dataset is composed of 100×100-pixel images, then your problem space has 10,000 features, one per pixel. However, even in image classification problems, some of the features are excessive and can be removed.
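To make the pixels-as-features point concrete, here is a minimal NumPy sketch (the image data is random, purely for illustration) showing how a 100×100 grayscale image becomes a 10,000-element feature vector:

```python
import numpy as np

# A grayscale 100x100 image: every pixel is one feature.
image = np.random.rand(100, 100)

# Flatten it into a single feature vector for a classifier.
features = image.reshape(-1)
print(features.shape)  # (10000,)
```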

Dimensionality reduction identifies and removes the features that are hurting the machine learning model’s performance or aren’t contributing to its accuracy. There are several dimensionality reduction techniques, each of which is useful in certain situations.

Feature selection

A basic and very efficient dimensionality reduction method is to identify and select a subset of the features that are most relevant to the target variable. This technique is called “feature selection.” Feature selection is especially effective when you’re dealing with tabular data in which each column represents a specific kind of information.

When doing feature selection, data scientists aim to keep the features that are highly correlated with the target variable and that contribute the most to the dataset’s variance. Libraries such as Python’s scikit-learn have plenty of good functions to analyze, visualize, and select the right features for machine learning models.

For instance, a data scientist can use scatter plots and heatmaps to visualize the covariance of different features. If two features are highly correlated to each other, then they will have a similar effect on the target variable, and including both in the machine learning model will be unnecessary. Therefore, you can remove one of them without causing a negative impact on the model’s performance.
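As a sketch of this idea (the weather-style dataset below is made up for illustration), pandas can compute the correlation matrix that a heatmap would visualize, and one feature from any highly correlated pair can then be dropped:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 500

# Hypothetical weather dataset: "heat_index" is nearly a rescaled copy
# of "temperature", so keeping both would be redundant.
temperature = rng.normal(25, 5, n)
heat_index = temperature * 1.1 + rng.normal(0, 0.5, n)
wind_speed = rng.normal(10, 3, n)

df = pd.DataFrame({
    "temperature": temperature,
    "heat_index": heat_index,
    "wind_speed": wind_speed,
})

# Pairwise correlation matrix: the numbers a heatmap would visualize.
corr = df.corr().abs()
print(corr.round(2))

# Drop one feature from any pair correlated above 0.9.
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [col for col in upper.columns if (upper[col] > 0.9).any()]
reduced = df.drop(columns=to_drop)
```

Here `heat_index` is dropped and `temperature` survives; with real data, the 0.9 threshold is a judgment call.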


Above: Heatmaps illustrate the covariance between different features. They are a good guide to finding and culling features that are excessive.

The same tools can help visualize the correlations between the features and the target variable. This helps remove variables that do not affect the target. For instance, you might find out that out of 25 features in your dataset, seven of them account for 95 percent of the effect on the target variable. This will enable you to shave off 18 features and make your machine learning model a lot simpler without suffering a significant penalty to your model’s accuracy.
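One way to sketch this kind of selection in scikit-learn is `SelectKBest`, shown here on a synthetic dataset built to mirror the hypothetical numbers above (25 features, of which only 7 actually drive the target):

```python
from sklearn.datasets import make_regression
from sklearn.feature_selection import SelectKBest, f_regression

# Synthetic dataset: 25 features, only 7 informative.
X, y = make_regression(
    n_samples=1000, n_features=25, n_informative=7,
    noise=5.0, random_state=42,
)

# Keep the 7 features most strongly related to the target.
selector = SelectKBest(score_func=f_regression, k=7)
X_reduced = selector.fit_transform(X, y)

print(X.shape, "->", X_reduced.shape)  # (1000, 25) -> (1000, 7)
```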

Projection techniques

Sometimes, you don’t have the option to remove individual features. But this doesn’t mean that you can’t simplify your machine learning model. Projection techniques, also known as “feature extraction,” simplify a model by compressing several features into a lower-dimensional space.

A common example used to represent projection techniques is the “swiss roll” (pictured below), a set of data points that swirl around a focal point in three dimensions. This dataset has three features. The value of each point (the target variable) is measured based on how close it is along the convoluted path to the center of the swiss roll. In the picture below, red points are closer to the center and the yellow points are farther along the roll.

Above: The swiss roll dataset.

In its current state, creating a machine learning model that maps the features of the swiss roll points to their value is a difficult task and would require a complex model with many parameters. But with the help of dimensionality reduction techniques, the points can be projected to a lower-dimension space that can be learned with a simple machine learning model.

There are various projection techniques. In the case of the above example, we used “locally linear embedding” (LLE), an algorithm that reduces the dimension of the problem space while preserving the key elements that separate the values of data points. When our data is processed with LLE, the result looks like the following image, which is like an unrolled version of the swiss roll. As you can see, the points of each color remain together. In fact, this problem can be further simplified into a single feature and modeled with linear regression, the simplest machine learning algorithm.

Above: The swiss roll, projected to a lower-dimensional space.
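A minimal sketch of this unrolling with scikit-learn, which ships both a swiss roll generator and an LLE implementation (the parameters here are illustrative, not necessarily those used for the figures):

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# Generate the swiss roll: 3 features per point, plus t, each point's
# position along the roll (the quantity the colors encode).
X, t = make_swiss_roll(n_samples=1000, noise=0.1, random_state=0)

# Unroll it into 2 dimensions while preserving local neighborhoods.
lle = LocallyLinearEmbedding(n_components=2, n_neighbors=10, random_state=0)
X_unrolled = lle.fit_transform(X)

print(X.shape, "->", X_unrolled.shape)  # (1000, 3) -> (1000, 2)
```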

While this example is hypothetical, you’ll often face problems that can be simplified if you project the features to a lower-dimensional space. For instance, “principal component analysis” (PCA), a popular dimensionality reduction algorithm, has found many useful applications to simplify machine learning problems.

In the excellent book Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow, data scientist Aurelien Geron shows how you can use PCA to reduce the MNIST dataset from 784 features (28×28 pixels) to 150 features while preserving 95 percent of the variance. This level of dimensionality reduction has a huge impact on the costs of training and running artificial neural networks.
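The same idea can be sketched on scikit-learn's built-in digits dataset (8×8 images, so 64 pixel features, standing in for the much larger MNIST): passing a fraction to PCA lets it choose however many components are needed to preserve that share of the variance.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

# 1,797 images of handwritten digits, 64 pixel features each.
X, y = load_digits(return_X_y=True)

# Ask PCA for however many components preserve 95% of the variance.
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X)

print(X.shape[1], "->", pca.n_components_)
print(round(pca.explained_variance_ratio_.sum(), 3))
```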

Above: PCA can reduce the dimensionality of the MNIST dataset while preserving most of the variance.

There are a few caveats to consider about projection techniques. Once you develop a projection technique, you must transform new data points to the lower-dimension space before running them through your machine learning model. However, the cost of this preprocessing step is small compared to the gains of having a lighter model. A second consideration is that transformed data points are not directly representative of their original features: transforming them back to the original space can be tricky and in some cases impossible, which might make it difficult to interpret the inferences made by your model.
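One common way to handle the preprocessing caveat is to bundle the projection and the model into a single pipeline, so the projector is fitted once on training data and new points are transformed automatically at inference time. A sketch using scikit-learn's digits dataset:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# The pipeline fits PCA on the training data only; at inference time,
# new points are projected before the classifier ever sees them.
model = make_pipeline(PCA(n_components=0.95),
                      LogisticRegression(max_iter=2000))
model.fit(X_train, y_train)

print(round(model.score(X_test, y_test), 3))
```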

Dimensionality reduction in the machine learning toolbox

Having too many features will make your model inefficient, but removing too many features will not help either. Dimensionality reduction is one of many tools data scientists can use to build better machine learning models, and as with every tool, it must be used with caution and care.

Ben Dickson is a software engineer and the founder of TechTalks, a blog that explores the ways technology is solving and creating problems.

This story originally appeared on TechTalks. Copyright 2021.


VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative technology and transact.

Our site delivers essential information on data technologies and strategies to guide you as you lead your organizations. We invite you to become a member of our community, to access:

  • up-to-date information on the subjects of interest to you
  • our newsletters
  • gated thought-leader content and discounted access to our prized events, such as Transform 2021: Learn More
  • networking features, and more

Become a member



SolarWinds breach exposes hybrid multicloud security weaknesses





A hybrid multicloud strategy can capitalize on legacy systems’ valuable data and insights while using the latest cloud-based platforms, apps, and tools. But getting hybrid multicloud security right isn’t easy.

Exposing severe security weaknesses in hybrid cloud, authentication, and least privileged access configurations, the high-profile SolarWinds breach laid bare just how vulnerable every business is. Clearly, enterprise leaders must see beyond the much-hyped baseline levels of identity and access management (IAM) and privileged access management (PAM) now offered by cloud providers.

In brief, advanced persistent threat (APT) actors penetrated the SolarWinds Orion software supply chain undetected, modified dynamically linked library (.dll) files, and propagated malware across SolarWinds’ customer base while taking special care to mimic legitimate traffic.

The bad actors methodically studied how persistence mechanisms worked during intrusions and learned which techniques could avert detection as they moved laterally across cloud and on-premises systems. They also learned how to compromise SAML signing certificates while using the escalated Active Directory privileges they had gained access to. The SolarWinds hack shows what happens when bad actors focus on finding unprotected threat surfaces and exploiting them for data using stolen privileged access credentials.

The incursion is particularly notable because SolarWinds Orion is used for managing and monitoring on-premises and hosted infrastructures in hybrid cloud configurations. That is what makes eradicating the SolarWinds code and malware problematic, as it has infected 18 different Orion platform products.

Cloud providers do their part — to a point

The SolarWinds hack occurred in an industry that relies considerably on cloud providers for security control.

A recent survey by CISO Magazine found 76.36% of security professionals believe their cloud service providers are responsible for securing their cloud instances. The State of Cloud Security Concerns, Challenges, and Incidents Study from the Cloud Security Alliance found that use of cloud providers’ additional security controls jumped from 58% in 2019 to 71% in 2021, and 74% of respondents are relying exclusively on cloud providers’ native security controls today.

Above: Cloud providers’ security controls are not enough for most organizations, according to the State of Cloud Security Concerns report.

Image Credit: Cloud Security Alliance

Taking the SolarWinds lessons into account, every organization needs to verify the extent of the coverage provided as baseline functionality for IAM and PAM by cloud vendors. While the concept of a shared responsibility model is useful, it’s vital to look beyond cloud platform providers’ promises based on the framework.

Amazon’s interpretation of its shared responsibility model is a prime example. It’s clear the company’s approach to IAM, while centralizing identity roles, policies, and configuration rules, does not go far enough to deliver a fully secure, scalable, zero trust-based approach.

The Amazon Shared Responsibility Model makes it clear the company takes care of AWS infrastructure, hardware, software, and facilities, while customers are responsible for securing their client-side data, server-side encryption, and network traffic protection — including encryption, operating systems, platforms, and customer data.

Like competitors Microsoft Azure and Google Cloud, AWS provides a baseline level of support for IAM optimized for just its environments. Any organization operating a hybrid multicloud environment and building out a hybrid IT architecture will have wide, unsecured gaps between cloud platforms because each platform provider offers IAM and PAM only for its own platform.

Cloud security as shared responsibility

Above: The AWS Shared Responsibility Model is a useful framework for defining which areas of cloud deployment are customers’ responsibility.

Image Credit: Amazon Web Services

While a useful framework, the Shared Responsibility Model does not come close to providing the security hybrid cloud configurations need. It is also deficient in addressing machine-to-machine authentication and security, an area seeing rapid growth in organizations’ hybrid IT plans today. Organizations are also on their own when it comes to how they secure endpoints across all the public, private, and community cloud platforms they rely on.

There is currently no unified approach to solving these complex challenges, and every CIO and security team must figure it out on their own.

But there needs to be a single, unified security model that scales across on-premises, public, private, and community clouds without sacrificing security, speed, and scale. Averting the spread of a SolarWinds-level attack starts with a single security model across all on-premises and cloud-based systems, with IAM and PAM at the platform level.

Amid hybrid cloud and tool sprawl, security suffers

The SolarWinds attack came just as multicloud methods had started to gain traction. Cloud sprawl is defined as the unplanned and often uncontrolled growth of cloud instances across public, private, and community cloud platforms. The leading cause of cloud sprawl is a lack of control, governance, and visibility into how cloud computing instances and resources are acquired and used. Still, according to Flexera’s 2021 State of the Cloud Report, 92% of enterprises have a multicloud strategy and 82% have a hybrid cloud strategy.


Above: Cloud sprawl will become an increasing challenge, given organizations’ tendency to prioritize multicloud strategies.

Image Credit: Flexera

Cloud sprawl happens when an organization lacks visibility into or control over its cloud computing resources. Organizations are reducing the potential of cloud sprawl by having a well-defined, adaptive, and well-understood governance framework defining how cloud resources will be acquired and used. Without this, IT faces the challenge of keeping cloud sprawl in check while achieving business goals.

Overbuying security tools and overloading endpoints with multiple, often conflicting software clients weakens any network. Buying more tools could actually make a SolarWinds-level attack worse. Security teams need to consider how tool and endpoint agent sprawl is weakening their networks. According to IBM’s Cyber Resilient Organization Report, enterprises deploy an average of 45 cybersecurity-related tools on their networks today. The IBM study also found enterprises that deploy over 50 tools ranked themselves 8% lower in their ability to detect threats and 7% lower in their defensive capabilities than companies employing fewer toolsets.

Rebuilding on a zero trust foundation

The SolarWinds breach is particularly damaging from a PAM perspective. An integral component of the breach was compromising SAML signing certificates the bad actors gained by using their escalated Active Directory privileges. It was all undetectable to SolarWinds Orion, the hybrid cloud-monitoring platform hundreds of organizations use today. Apparently, a combination of hybrid cloud security gaps, lack of authentication on SolarWinds accounts, and lack of least privileged access made the breach undetectable for months, according to a Cybersecurity & Infrastructure Security Agency (CISA) alert. One of the most valuable lessons learned from the breach is the need to enforce least privileged access across every user and administrator account, endpoint, system access account, and cloud administrator account.

The bottom line is that the SolarWinds breach serves as a reminder to plan for and begin implementing zero trust frameworks that enable any organization to take a “never trust, always verify, enforce least privilege” strategy when it comes to their hybrid and multicloud strategies.

Giving users just enough privileges and resources to get their work done and providing least privileged access for a specific time is essential. Getting micro-segmentation right across IT infrastructures will eliminate bad actors’ ability to move laterally throughout a network. And logging and monitoring all activity on a network across all cloud platforms is critical.
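The "just enough, just in time" idea above can be sketched as a policy document in the style of an AWS IAM policy. This is a hypothetical illustration: the bucket name and ARNs are invented, and a real deployment would author and attach such a policy through its provider's IAM tooling.

```python
import json
from datetime import datetime, timedelta, timezone

# Access expires when the four-hour work window closes.
expires = datetime.now(timezone.utc) + timedelta(hours=4)

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        # Only the two read actions this task needs, no wildcards.
        "Action": ["s3:GetObject", "s3:ListBucket"],
        # Scoped to one (made-up) bucket, not all resources.
        "Resource": ["arn:aws:s3:::example-reports-bucket",
                     "arn:aws:s3:::example-reports-bucket/*"],
        # Time-boxed via a date condition on the request time.
        "Condition": {"DateLessThan": {"aws:CurrentTime": expires.isoformat()}},
    }],
}

print(json.dumps(policy, indent=2))
```

The point is the shape, not the specifics: narrow actions, narrow resources, and an expiry, rather than a standing `*` grant.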

Every public cloud platform provider has tools available for doing this. On AWS, for example, there are AWS CloudTrail, which records all API activity, and Amazon CloudWatch, which monitors resources and applications. Vaulting root accounts and applying multi-factor authentication across all accounts is a given.

Organizations need to move beyond the idea that the baseline levels of IAM and PAM delivered by cloud providers are enough. Then these organizations need to think about how they can use security to accelerate their business goals by providing the users they serve with least privileged access.

Adopting a zero trust mindset and framework is a given today, as every endpoint, system access point, administrative login, and cloud administrator console is at risk if nothing changes.

The long-held assumptions of interdomain trust were proven wrong with SolarWinds. Now it’s time for a new, more intensely focused era of security that centers on enforcing least privilege and zero-trust methods across an entire organization.




How Scopely tries to do game acquisitions the right way





Big game companies could probably use a manual for acquiring game companies at a time when such acquisitions have reached an all-time record. In the first quarter, the money for acquisitions, public offerings, and investments hit $39 billion, more than the $33 billion for all of last year, according to InvestGame.

So it seems timely for the whole game industry to step back and think about the right way to do acquisitions. We had a couple of executives from mobile game publisher Scopely talk about this subject at our recent GamesBeat Summit 2021 event.

Nick Tuosto, managing director of Liontree and cofounder of Griffin Gaming Partners, moderated the session with Tim O’Brien, chief revenue officer at Scopely, and Amir Rahimi, president of games at Scopely. I also interviewed Aaron Loeb, chief business officer at Scopely, about the company’s culture that enables acquisitions to succeed. In those conversations, we extracted some lessons that all companies could use in making mergers work and distilling the right culture for game companies.

Rahimi became a part of Scopely after the Los Angeles company acquired FoxNext Games, the game division of the entertainment company Fox. Disney acquired Fox and then chose to spin out its game division. That deal in January 2020 was the end of a long journey for Rahimi, who was part of TapZen, a studio that was started in 2013 and was first acquired by Kabam in 2015 to make mobile games. That was the first of six different transactions that Rahimi was part of over seven years as the studio was passed around from owner to owner and finally found its home at Scopely.

Marvel Strike Force’s growth

Above: Left to right: Nick Tuosto of Liontree/Griffin Gaming, Tim O’Brien, and Amir Rahimi.

Image Credit: GamesBeat

That involved a “process filled with twists and turns and drama,” Rahimi said during the session. Rahimi’s studio in Los Angeles had managed to launch Marvel Strike Force, a free-to-play fighting game, in 2018. And in 2020, under Scopely’s ownership, Marvel Strike Force grew its revenues more than 70% to $300 million. It was a big hit, and it became an even bigger one after the Scopely deal.

What’s amazing about that result is that the growth happened during lockdown and as Scopely was integrating FoxNext Games into the company, a task that O’Brien was heavily involved with as an executive on the buying side.

“We were sort of anti-remote work. When shelter-in-place hit, I was terrified and then was subsequently blown away by how well the team rallied and stayed focused and productive,” Rahimi said. The deal closed in February, right before everything shut down. “In terms of the transition from Disney to Scopely, it was a model for how to do M&A.”

It took about eight months to get the deal done. Rahimi said his team spent a lot of time getting to know O’Brien and the Scopely team. They knew their visions for the future were aligned.

“They created the optimal conditions for us to thrive,” Rahimi said. “We talked about how we could power up my studio.”

O’Brien said the whole plan to have gatherings and dinners was no longer possible in the pandemic. But the company had to make a lot of decisions to put the team members first, he said.

Creative freedom and user acquisition

Marvel Strike Force

Above: Scopely acquired Marvel Strike Force with its FoxNext games deal.

Image Credit: FoxNext Games

One of the things Rahimi liked was that Scopely gave his studio the creative freedom to make the right decisions for Marvel Strike Force and other games, such as an upcoming Avatar title. Rahimi said Scopely’s leaders understood that the team was functioning well and needed support rather than a change in direction.

“I personally feel more empowered at Scopely than I have anywhere,” Rahimi said. “That allows me to empower my people and that benefits everyone. Unfortunately, there are companies that approach M&A in a very different way. They make changes in leaders or modify game concepts or swap out intellectual properties. And that proves to be disruptive.

“You can’t do that,” Rahimi said. “The way you need to approach a team, especially a team that’s already successful, is really understanding what they do well, really understanding what you can bring to the table, and forming a partnership that is based on trust and transparency and candor,” and then accelerating the growth after the acquisition.

“The team went for it, and those decisions paid off,” Rahimi said.

Scopely did know where it had to make investments, and that was in marketing and acquiring new users, Loeb said in our fireside chat. It knew it had to do this in order to capture players for the long term. Scopely applied its talent in user acquisition, something it handles through its central technology services, and that really paid off, Rahimi said. That helped the game find new audiences around the globe. O’Brien said the combination of the two companies accelerated the game’s growth across all of its key performance indicators, such as how often players play the game each week. Scopely has also moved many FoxNext Games team members into leadership positions at Scopely.

“We knew we could learn as much from them as they could from us,” O’Brien said.

How culture matters too

One of the new leaders at Scopely is Aaron Loeb, chief business officer, who was also president of FoxNext Games before the acquisition. I also spoke with Loeb in a session at the GamesBeat Summit 2021 event about how Scopely is gathering its learnings into what it calls the Scopely Operating System. It’s really more about all of the processes and culture at Scopely. That’s all in service of becoming the definitive free-to-play game company, Loeb said.

“Our tools, our technology, everything is built around enabling the vision of the game team to go and do what they’re seeking to do to grow the game based on their vision,” Loeb said. “Scopely is really focused on the core problem set of making a great game, which is going and finding the right talent, the right leadership, people with a vision for what they want, and then supporting them with both the right technology but also the right resources to go out and actually chase that vision.”

Loeb said that Scopely had a game that was lost in the wilderness. It decided to restart the effort and reconstitute the vision.

“It pushed through those dark hours of the soul when it looked like they couldn’t make it,” Loeb said. “That game is now one of our biggest hits.”

A learning machine

Above: Scopely has expanded to a 60,000-square-foot space in Culver City.

Image Credit: Scopely

Loeb said one of the things that defines the culture is humility.

“This culture starts with hiring the smartest people you can find who are also really humble, who are excited about learning from each other,” Loeb said. “Our culture is a learning culture. We often talk about the company as a learning machine.”

The lesson for other companies is that it’s easy to fall into a pattern of operating with blinders on, focused solely on execution. But Loeb believes it’s important to question the pattern and keep learning new ways of thinking. That mindset turns out to be good for players, too: it leads to changes in the experience for gamers, even in areas that were previously thought of as solved problems.

“Our people are incredibly dedicated to challenging their own assumptions,” Loeb said. “We are skeptical of opinions, particularly our own. And I think that’s a really critical factor to building a great learning culture.”
