Australian lawmakers push news law after Facebook concessions

Australian lawmakers advanced a bill that will effectively force Google and Facebook to pay media companies for news content, clearing the last major hurdle for legislation that could set precedents for government policies worldwide.

The bill, which was amended and green-lighted by Australia’s Senate, will return to the House of Representatives, where it is expected to pass as early as this week.

Lawmakers introduced amendments to the so-called Media Bargaining Code after Facebook last week escalated a dispute over the new laws by blocking Australian users from sharing and viewing news content on its popular social media platform.

Facebook on Tuesday said it would restore Australian users’ access to news in light of the compromise it had reached with the government.

On Wednesday, Facebook also said it plans to invest at least $1 billion in the news industry over the next three years, following Google’s pledge last October to pay publishers $1 billion over the same period.

“We’ve invested $600 million since 2018 to support the news industry, and plan at least $1 billion more over the next three years,” Nick Clegg, vice president of global affairs at Facebook, said in a blog post published Wednesday.

“Facebook is more than willing to partner with news publishers,” added Clegg. “We absolutely recognize quality journalism is at the heart of how open societies function — informing and empowering citizens and holding the powerful to account.”

After the bill passes both houses, news businesses that want to be paid for content that appears on search engines or social media can sign up — provided they meet some conditions, including earning $150,000 per year in revenue.

“What we’ve sworn to do is create a level playing field,” Australian Treasurer Josh Frydenberg told Sky News on Wednesday. “We’ve sought to sustain public interest journalism in this country, and we’ve also sought to enhance and encourage those commercial deals between the parties.”

One major change is that Frydenberg will be given the discretion to decide that either Facebook or Google need not be subject to the code if they make a “significant contribution to the sustainability of the Australian news industry.”

Australia’s original legislation had required the tech giants to submit to forced arbitration if they could not reach a commercial deal with Australian news companies for their content, effectively allowing the government to set a price.

Some critics worry that small publishers could get cut out of the deal, which is supposed to address the power imbalance between the social media giants and publishers when negotiating payment for news content displayed on the tech firms’ sites.

“The big players could successfully negotiate with Facebook or Google. The minister then doesn’t designate them, and all the little players miss out,” independent senator Rex Patrick, who plans to vote against the amended bill, told Reuters.

Frydenberg said he will give Facebook and Google time to strike deals with Australian media companies before deciding whether to use his new powers.

After first threatening to withdraw its search engine from Australia, Google instead struck a series of deals with several publishers, including a global news deal with News Corp.

Major television broadcaster and newspaper publisher Seven West Media on Tuesday said it had signed a letter of intent to reach a content supply deal with Facebook within 60 days.

Rival Nine Entertainment Co also revealed on Wednesday it was in negotiations with Facebook.

“At this stage, we’re still obviously proceeding with negotiations,” Nine chief executive Hugh Marks told analysts at a company briefing on Wednesday. “It is really positive for our business and positive particularly for the publishing business.”

Age of Empires IV could bring back RTS in a big way this fall

Microsoft gave fans a deep dive into Age of Empires IV today, and the game looks beautiful. It looks like it could give real-time strategy games on the PC a much-needed shot in the arm.

The company’s Relic Entertainment division has been making the game since 2017, and Microsoft finally announced it will arrive on PC via Windows 10, Xbox Game Pass for PC, and Steam in the fall of 2021. The fan event was a celebration of Age of Empires for those who miss the Town Center. I feel like the RTS genre should be a mass market, not a niche.

RTS games are best played with a mouse and keyboard and are far harder to play with a game controller. That has meant the games can’t reach as many fans and haven’t been as popular, even though they’re hard to develop. We’ve also seen long lapses in Blizzard Entertainment’s RTS efforts, with few brand-new installments in franchises such as StarCraft and Warcraft.

Above: You can zoom in pretty close in Age of Empires IV.

Image Credit: Microsoft/Relic

While Microsoft’s Age of Empires franchise has been stalled since 2005 (with the exception of some retro remakes), other key players have been carrying the RTS flag. Sega’s The Creative Assembly has a thriving Total War series, with upcoming titles including Total War: Warhammer III and Total War: Rome Remastered. Meanwhile, Eugen Systems has been doing a great job with its World War II-focused Steel Division series. Other startups working on RTS titles include Frost Giant Studios and SunSpear Games.

But Age of Empires IV could really fill the RTS gap. Microsoft’s success with Age of Empires started in 1997, and the marriage of history and RTS generated so much revenue that, alongside Microsoft Flight Simulator, it enabled a vast expansion in the company’s game investments and ultimately led to the debut of the Xbox game console in 2001.

Age of Empires and its sequels sold more than 20 million copies, but Microsoft shut down Ensemble Studios, the studio that made them, in 2009 during the Great Recession, after attempts to branch out, such as Halo Wars, met with limited success. Other games had higher priorities at Microsoft.

A new game

Above: Age of Empires IV will feature the Delhi Sultanate.

Image Credit: Microsoft/Relic

But this new game takes advantage of the last 15-plus years of graphics improvements, which allow for much more detail in the individual characters and buildings that make up its 4K HDR battlefields.

The game will have eight civilizations, and Microsoft has revealed four so far. Today, Relic showed the Delhi Sultanate (which features elephant units), and it also showed campaigns of William the Conqueror in England, as well as campaigns in China with the Mongols and naval warfare.

Relic said the civilizations will play very differently, with strengths and weaknesses that play out across campaign maps as well as randomly generated maps. The game will have four campaigns, such as the Norman conquest of England.

Above: Age of Empires IV takes advantage of 4K graphics.

Image Credit: Microsoft/Relic

Players will be able to stage ambushes with stealth, which allows players to hide their units from the enemy unless scouts spot them first. Soldiers can also fend off attackers by shooting down from castle walls while attackers can use siege weapons.

Microsoft also said the classic titles — the Definitive Editions of Age of Empires II and III — will be updated: Age of Empires II: Definitive Edition will get a Dawn of the Dukes expansion and co-op play this year, while Age of Empires III: Definitive Edition will add the U.S. as a civilization and an African expansion.

I’m looking forward to having real choices for RTS games in the future, and Age of Empires IV looks like it could consume an awful lot of my time.

Black women, AI, and overcoming historical patterns of abuse

After a 2019 research paper demonstrated that commercially available facial analysis tools fail to work for women with dark skin, AWS executives went on the attack. Instead of offering up more equitable performance results or allowing the federal government to assess their algorithm like other companies with facial recognition tech have done, AWS executives attempted to discredit study coauthors Joy Buolamwini and Deb Raji in multiple blog posts. More than 70 respected AI researchers rebuked this attack, defended the study, and called on Amazon to stop selling the technology to police, a position the company temporarily adopted last year after the death of George Floyd.

But according to the Abuse and Misogynoir Playbook, published earlier this year by a trio of MIT researchers, Amazon’s attempt to smear two Black women AI researchers and discredit their work follows a set of tactics that have been used against Black women for centuries. Moya Bailey coined the term “misogynoir” in 2010 as a portmanteau of “misogyny” and “noir.” Playbook coauthors Katlyn Turner, Danielle Wood, and Catherine D’Ignazio say these tactics were also used to disparage former Ethical AI team co-lead Timnit Gebru after Google fired her in late 2020 and stress that it’s a pattern engineers and data scientists need to recognize.

The Abuse and Misogynoir Playbook is part of the State of AI report from the Montreal AI Ethics Institute and was compiled by MIT professors in response to Google’s treatment of Gebru, a story VentureBeat has covered in depth. The coauthors hope that recognition of the phenomena will prove a first step in ensuring these tactics are no longer used against Black women. Last May, VentureBeat wrote about a fight for the soul of machine learning, highlighting ties between white supremacy and companies like Banjo and Clearview AI, as well as calls for reform from many in the industry, including prominent Black women.

MIT assistant professor Danielle Wood, whose work focuses on justice and space research, told VentureBeat it’s important to recognize that the tactics outlined in the Abuse and Misogynoir Playbook can be used in almost any arena. She noted that while some cling to a belief in the impartiality of data-driven results, the AI field is in no way exempt from this problem.

“This is a process, a series of related things, and the process has to be described step by step or else people won’t get the point,” Wood said. “I can be part of a system that’s actually practicing misogynoir, and I’m a Black woman. Because it’s a habit that is so prolific, it’s something I might participate in without even thinking about it. All of us can.”

Above: The Abuse and Misogynoir Playbook

Image Credit: Design by Melissa Teng

The playbook outlines the intersectional and unique abuse aimed at Black women in five steps:

Step 1: A Black woman scholar makes a contribution that speaks truth to power or upsets the status quo. 

Step 2: Disbelief in her contribution from people who say the results can’t be true and either think a Black woman couldn’t have done the research or find another way to call her contribution into question.

Step 3: Dismissal, discrediting, and gaslighting ensues. AI chief Jeff Dean’s public attempt to discredit Gebru alongside colleagues is a textbook example. Similarly, after current and former Dropbox employees alleged gender discrimination at the company, Dropbox CEO Drew Houston attempted to discredit the report’s findings, according to documents obtained by VentureBeat.

Gaslighting is a term taken from the 1944 movie Gaslight, in which a character goes to extreme lengths to make a woman deny her senses, ignore the truth, and feel like she’s going crazy. It’s not uncommon at this stage for people to consider the targeted Black woman’s contribution an attempt to weaponize pity or sympathy. Another instance that sparked gaslighting allegations involved algorithmic bias, Facebook chief AI scientist Yann LeCun, and Gebru.

Step 4: Erasure. Over time, counter-narratives, deplatforming, and exclusion are used to prevent that person from carrying out their work as part of attempts to erase their contributions.

Step 5: Revisionism seeks to paper over the contributions of Black women and can lead to whitewashed versions of events and slow progress toward justice.

There’s been a steady stream of stories about gender and racial bias in AI in recent years, a point highlighted by news headlines this week. The Wall Street Journal reported Friday that researchers found Facebook’s algorithm shows different job ads to men and women and is discriminatory under U.S. law, while Vice reported on research that found facial recognition used by Proctorio remote proctoring software does not work well for people with dark skin over half of the time. This follows VentureBeat’s coverage of racial bias in ExamSoft’s facial recognition-based remote proctoring software, which was used in state bar exams in 2020.

Investigations by The Markup this week found advertising bans hidden behind an algorithm for a number of terms on YouTube, including “Black in tech,” “antiracism,” and “Black excellence,” but it’s still possible to advertise to white supremacists on the video platform.

Case study: Timnit Gebru and Google

Google’s treatment of Gebru illustrates each step of the playbook. Her status quo-disrupting contribution, Turner told VentureBeat, was an AI research paper about the dangers of using large language models that perpetuate racism or stereotypes and carry an environmental impact that may unduly burden marginalized communities. Other perceived disruptions, Turner said, included Gebru building one of the most diverse teams within Google Research and sending a critical email to the Google Brain Women and Allies internal listserv that was leaked to Platformer.

Shortly after she was fired, Gebru said she was asked to retract the paper or remove the names of Google employees. That was step two from the Misogynoir Playbook. In academia, Turner said, retraction is taken very seriously. It’s generally reserved for scientific falsehood and can end careers, so asking Gebru to remove her name from a valid piece of research was unreasonable and part of efforts to make Gebru herself seem unreasonable.

Evidence of step three, disbelief or discredit, can be found in an email AI chief Jeff Dean sent that calls into question the validity of the paper’s findings. Days later, CEO Sundar Pichai sent a memo to Google employees in which he said the firing of Gebru had prompted the company to explore improvements to its employee de-escalation policy. In an interview with VentureBeat, Gebru characterized that memo as “dehumanizing” and an attempt to fit her into an “angry Black woman” trope.

Despite Dean’s critique, a point that seems lost amid allegations of abuse, racism, and corporate efforts to interfere with academic publication is that the team of researchers behind the stochastic parrots research paper in question was exceptionally well-qualified to deliver critical analysis of large language models. A version of the paper VentureBeat obtained lists Google research scientists Ben Hutchinson, Mark Diaz, and Vinodkumar Prabhakaran as coauthors, as well as then-Ethical AI team co-leads Gebru and Margaret Mitchell. Diaz, Hutchinson, and Prabhakaran have backgrounds in assessing language or NLP for ageism, discrimination against people with disabilities, and racism, respectively. Linguist Emily Bender, a lead coauthor of the paper alongside Gebru, received an award from organizers of a major NLP conference in mid-2020 for work critical of large language models, which VentureBeat also reported.

Gebru is coauthor of the Gender Shades research paper that found commercially available facial analysis models perform particularly poorly for women with dark skin. That project, spearheaded by Buolamwini in 2018 and continued with Raji in a subsequent paper published in early 2019, has helped shape legislative policy in the U.S. and is also a central part of Coded Bias, a documentary now streaming on Netflix. And Gebru has been a major supporter of AI documentation standards like datasheets for datasets and model cards, an approach Google has adopted.

Finally, Turner said, steps four and five of the playbook, erasure and revisionism, can be seen in the departmental reorganization and diversity policy changes Google made in February. As a result of those changes, Google VP Marian Croak was appointed to head up 10 of the Google teams that consider how technology impacts people. She reports directly to AI chief Jeff Dean.

On Tuesday, Google research manager Samy Bengio resigned from his role at the company, according to news first reported by Bloomberg. Prior to the restructuring, Bengio was the direct report manager for the Ethical AI team.

VentureBeat obtained a copy of a letter Ethical AI team members sent to Google leadership in the weeks following Gebru’s dismissal that specifically requested Bengio remain the direct report for the team and that the company not implement any reorganization. A person familiar with ethics and policy matters at Google told VentureBeat that reorganization had been discussed previously, but this source described an environment of fear after Gebru’s dismissal that prevented people from speaking out.

Before being named to her new position, Croak appeared alongside the AI chief in a meeting with Black Google employees in the days following Gebru’s dismissal. Google declined to make Croak available for comment, but Google released a video in which she called for more “diplomatic” conversations about definitions of fairness or safety.

Turner pointed out that the reorganization fits neatly into the playbook.

“I think that revisionism and erasure is important. It serves a function of allowing both people and the news cycle to believe that the narrative arc has happened, like there was some bad thing that was taken care of — ‘Don’t worry about this anymore.’ [It’s] like, ‘Here’s this new thing,’ and that’s really effective,” Turner said.

Origins of the playbook

The playbook’s coauthors said it was constructed following conversations with Gebru. Earlier in the year, Gebru spoke at MIT at Turner and Wood’s invitation as part of an antiracism tech design research seminar series. When the news broke that Gebru had been fired, D’Ignazio described feelings of anger, shock, and outrage. Wood said she experienced a sense of grieving and loss. She also felt frustrated by the fact that Gebru was targeted despite having attempted to address harm through channels that are considered legitimate.

“It’s a really discouraging feeling of being stuck,” Wood said. “If you follow the rules, you’re supposed to see the outcome, so I think part of the reality here is just thinking, ‘Well, if Black women try to follow all the rules and the result is we’re still not able to communicate our urgent concerns, what other options do we have?’”

Wood said she and Turner found connections between historical figures and Gebru in their work in the Space Enabled Lab at MIT examining complex sociotechnical systems through the lens of critical race studies and queer Black feminist groups like the Combahee River Collective.

In addition to instances of misogynoir and abuse at Amazon and Google, coauthors say the playbook represents a historical pattern that has been used to exclude Black women authors and scholars dating back to the 1700s. These include Phillis Wheatley, the first published African American poet, journalist Ida B. Wells, and author Zora Neale Hurston. Generally, the coauthors found that the playbook tactics visit great acts of violence on Black women that can be distinguished from the harms encountered by other groups that challenge the status quo.

The coauthors said women outside of tech who have been targeted by the same playbook include New York Times journalist and 1619 Project creator Nikole Hannah-Jones and politicians like Stacey Abrams and Rep. Ayanna Pressley (D-MA).

The long shadow of history

The researchers also said they took a historical view to demonstrate that the ideas behind the Abuse and Misogynoir Playbook are centuries old. Failure to confront forces of racism and sexism at work, Turner said, can lead to the same problems in new and different tech scenarios. She went on to say that it’s important to understand that historical forces of oppression, categorization, and hierarchy are still with us and warned that “we will never actually get to an ethical AI if we don’t understand that.”

The AI field claims to excel at pattern recognition, so the industry should be able to identify tactics from the playbook, D’Ignazio said.

“I feel like that’s one of the most enormous ignorances, the places where technical fields do not go, and yet history is what would inform all of our ethical decisions today,” she said. “History helps us see structural, macro patterns in the world. In that sense, I see it as deeply related to computation and data science because it helps us scale up our vision and see how things today, like Dr. Gebru’s case, are connected to these patterns and cycles that we still haven’t been able to break out of today.”

The coauthors recognize that power plays a major role in determining what kind of behavior is considered ethical. This corresponds to the idea of privilege hazard, a term coined in the book Data Feminism, which D’Ignazio coauthored last year, to describe an inability to fully comprehend another person’s experience.

A long-term view seems to run counter to the traditional Silicon Valley dogma surrounding scale and growth, a point emphasized by Google Ethical AI team research scientist and sociologist Dr. Alex Hanna weeks before Gebru was fired. A paper Hanna coauthored with independent researcher Tina Park in October 2020 called scale thinking incompatible with addressing social inequality.

The Abuse and Misogynoir Playbook is the latest AI work to turn to history for inspiration. Your Computer Is On Fire, a collection of essays from MIT Press, and Kate Crawford’s Atlas of AI, released in March and April, respectively, examine the toll that datacenter infrastructure and AI take on the environment and civil rights, and how they reinforce colonial habits of extracting value from people and natural resources. Both books also investigate patterns and trends found in the history of computing.

Race After Technology author Ruha Benjamin, who coined the term “new Jim Code,” argues that an understanding of historical and social context is also necessary to safeguard engineers from being party to human rights abuses, like the IBM workers who assisted Nazis during World War II.

A new playbook

The coauthors end by calling for the creation of a new playbook and pose a challenge to the makers of artificial intelligence.

“We call on the AI ethics community to take responsibility for rooting out white supremacy and sexism in our community, as well as to eradicate their downstream effects in data products. Without this baseline in place, all other calls for AI ethics ring hollow and smack of DEI-tokenism. This work begins by recognizing and interrupting the tactics outlined in the playbook — along with the institutional apparatus — that works to disbelieve, dismiss, gaslight, discredit, silence, and erase the leadership of Black women.”

The second half of a panel discussion about the playbook in late March focused on hope and ways to build something better, because, as the coauthors say, it’s not enough to host events with the term “diversity” or “equity” in them. Once abusive patterns are recognized, old processes that led to mistreatment on the basis of gender or race must be replaced with new, liberatory practices.

The coauthors note that making technology with liberation in mind is part of the work D’Ignazio does as director of the Data + Feminism Lab at MIT, and what Turner and Wood do with the Space Enabled research group at MIT Media Lab. That group looks for ways to design complex systems that support justice and the United Nations Sustainable Development Goals.

“Our assumption is we have to show prototypes of liberatory ways of working so that people can understand those are real and then try to adopt those in place of the current processes that are in place,” Wood said. “We hope that our research labs are actually mini prototypes of the future in which we try to behave in a way that’s anticolonial and feminist and queer and colored and has lots of views from people from different backgrounds.”

D’Ignazio said change in tech — and specifically for the hyped, well-funded, and trendy field of AI — will require people considering a number of factors, including who they take money from and choose to work with. AI ethics researcher Luke Stark turned down $60,000 in funding from Google last month, and Rediet Abebe, who cofounded Black in AI with Gebru, has also pledged to reject funding from Google.

In other work at the intersection of AI and gender, the Alan Turing Institute’s Women in Data Science and AI project released a report last month that documents problems women in AI face in the United Kingdom. The report finds that women only hold about 1 in 5 jobs in data science and AI fields in the U.K. and calls for government officials to better track and verify the growth of women in data science and AI.

“Our research findings reveal extensive disparities in skills, status, pay, seniority, industry, job attrition, and education background, which call for effective policy responses if society is to reap the benefits of technological advances,” the report reads.

Members of Congress interested in algorithmic regulation are considering more stringent employee demographic data collection, among other legislative initiatives. Google and Facebook do not currently share diversity data specific to employees working within artificial intelligence.

The Abuse and Misogynoir Playbook is also the latest AI research from people of African descent to advocate taking a historical perspective and adopting anticolonial and antiracist practices.

In an open letter shortly after the death of George Floyd last year, a group of more than 150 Black machine learning and computing professionals outlined a set of actions to bring an end to the systemic racism that has led Black people to leave jobs in the computing field. A few weeks later, researchers from Google’s DeepMind called for reform of the AI industry based on anticolonial practices. More recently, a team of African AI researchers and data scientists have recommended implementing anticolonial data sharing practices as the datacenter industry in Africa continues growing at a rapid pace.

The RetroBeat — Diablo II: Resurrected gives a diabolically good first impression

After the disappointing Warcraft III: Reforged, I wondered if I really wanted a Diablo II remake. When Blizzard announced Diablo II: Resurrected earlier this year, I wanted to remain skeptical. It felt a little better learning that Vicarious Visions, which created fantastic remakes for Crash Bandicoot and Tony Hawk’s Pro Skater, was working on the project. Still, I was worried Blizzard would mess things up again.

Now that I’ve been playing a bit of Resurrected, I feel better. Blizzard has launched a PC technical alpha that runs through the weekend. It gives access to the first two of the game’s five acts. I can play as three of the seven classes: Barbarian, Amazon, and Sorceress. It’s only for single-player, so we can’t try out online multiplayer.

The full game comes out later this year for PlayStation 5, Xbox Series X/S, Switch, PlayStation 4, and Xbox One along with PC. And I’m going to let myself be excited now.

Above: Seeing this guy makes me feel like a 13-year-old again.

Image Credit: GamesBeat

Diablo II is still good

I love the original Diablo II. I played it back in 2000 and had an incredible, dark adventure with my Barbarian. Heck, I still remember the giant, orange-glowing bastard sword that I found toward the end of Act 2 and ended up using for most of the game. I recall the dozens of times I had to use a teleportation scroll so I could go back to town and buy more health potions during my fight with Diablo. It’s one of my favorite game experiences.

That was a bit more than 20 years ago. Starting Diablo II again, everything came back to me. When I was in town but my health wasn’t full, I suddenly remembered that I had to talk to a specific NPC in the camp if I wanted to be healed. I began organizing my equipment the same way I did in 2000, putting my tomes on the far right slots and leaving the far left area for new loot.

It’s all an incredible wave of nostalgia. But Diablo II is more than that. It’s still a great game. It’s less flashy and slower than many modern action-RPGs, especially when compared to Diablo III. But that works in its favor. Yes, your inventory space is extremely small. You will not be spamming dozens of special abilities with dazzling particle effects during fights.

Combat, especially early on, is simple. You click on enemies and swing your weapon at them. As far as I can tell from my memory, it’s the same as it was in 2000. You have a stamina bar, and it limits how often your character can run instead of walk. Diablo II is restrictive. But it works. It just feels a bit more gritty than what we’re used to today. It doesn’t hold your hand and go out of its way to make things easy for you, which makes your victories feel more satisfying.

Above: Never thought that a place called The Den of Evil could be so comforting.

Image Credit: GamesBeat

Remade

So, yes, Diablo II was great, and it’s still great. But we’ve seen Blizzard ruin a classic before with Warcraft III: Reforged. Thankfully, the effort appears much more polished this time. Resurrected looks updated while retaining the spirit of the original. It’s still a dark, dreary world. It’s just less pixelated and jagged. And if you like your jagged pixels, which I know I do, you can still switch to the old graphics by pushing a single button. I spent a lot of time going back and forth between the two, having fun comparing the old characters with their updated looks.

There are also some nice quality of life changes. The transparent map, which used to take up most of the screen, now nestles itself in the corner. You can still have it use the full screen if you want, you adorable purist you, but I like it just fine in its new position. Resurrected also has an option to have you pick up gold automatically when you walk over it. That’s nice. I mean, who ever doesn’t want to pick up gold?

Performance hasn’t been perfect. My framerate has suffered when loading new areas. It also drops when I’m standing on a waypoint. Of course, this is a technical alpha, so I’m not all that worried about it.

There’s a reason so many people had been asking for a Diablo II remake. It’s a special game. It’s something that I probably should have replayed on my own years ago. This remake just gives me a good excuse to do so.

This is just a sample, but it’s a promising one. Resurrected isn’t just bringing back a classic. It could also help revive Blizzard’s retro gaming credentials.
