
Malicious cheats for Call of Duty: Warzone are circulating online



Criminals have been hiding malware inside publicly available software that purports to be a cheat for Activision’s Call of Duty: Warzone, researchers with the game maker warned earlier this week.

Cheats are programs that tamper with in-game events or player interactions so that users gain an unfair advantage over their opponents. The software typically works by accessing computer memory during gameplay and changing health, ammo, score, lives, inventories, or other information. Cheats are almost always forbidden by game makers.

On Wednesday, Activision said that a popular cheating site was circulating a fake cheat for Call of Duty: Warzone that contained a dropper, a type of malware that installs other malicious programs chosen by the person who created it. Named Warzone Cheat Engine, the cheat was available on the site in April 2020 and again last month.

An advertisement on a popular cheat site. (Image credit: Activision)

Shields down

People promoting the cheat instructed users to run the program as an administrator and to disable antivirus software. While these settings are often required for a cheat to work, they also make it easier for malware to survive reboots and to go undetected, since users won’t see warnings about the infection or about software seeking heightened privileges.

“While this method is rather simplistic, it is ultimately a social engineering technique that leverages the willingness of its target (players that want to cheat) to voluntarily lower their security protections and ignore warnings about running potentially malicious software,” Activision researchers wrote in a deep-dive analysis. They provided a long list of Warzone Cheat Engine variants that installed a host of malware, including a cryptojacker, which uses the resources of an infected gaming computer to surreptitiously mine cryptocurrency.

Activision’s analysis said that multiple malware forums have regularly advertised a kit that customizes the fake cheat. The kit makes it easy to create versions of Warzone Cheat Engine that deliver malicious payloads chosen by the criminal using it.

An app available in malware forums that creates custom versions of Warzone Cheat Engine.

The people selling the kit advertised it as an “effective” way to spread malware and “some nice bait for your first malware project.” The sellers have also posted YouTube videos that promote the kit and explain how to use it.

Activision’s report came on the same day that Cisco’s Talos security team disclosed a new malware campaign targeting gamers who use cheats. The malicious cheats used a previously unknown cryptor tool that prevented antivirus programs from detecting the payload. Talos didn’t identify the game titles that were targeted.


Lora DiCarlo launches crowdfunding campaign for sexual wellness products

Lora DiCarlo, the sexual wellness startup at the center of a trade show dustup in 2019, has launched a crowdfunding campaign to raise more money to make its microrobotic pleasure devices.

The Bend, Oregon-based company is raising money via a regulation crowdfunding offering on Republic, and it can raise anywhere from $25,000 to $5 million in the “my first time” campaign. After launching the campaign this morning, the company has already raised more than $48,000, and the offering is set to expire in 79 days.

Lora DiCarlo gained attention at CES 2019 after the Consumer Technology Association banned the company’s first female-oriented sex toy, Osé, from the show. In May 2019, the CTA updated its policy to make CES more “welcoming and inclusive,” and the sex toy was credited with kicking off a positive conversation about female empowerment and female-run startups.

The company already has investors like actress Cara Delevingne (co-owner of Lora DiCarlo), Romulus Capital, VU Fund, and Gaingels. Republic is an SEC-registered investment platform for investors seeking high-growth returns in highly vetted startups.

“We really feel like it’s an amazing opportunity,” said Lora Haddock, the CEO and founder of Lora DiCarlo, in an interview with VentureBeat. “There are not a lot of ways to invest in sexual health and wellness. So we want to better everyone’s sex lives, and what better way to be able to do that than to allow everybody to participate. We’ve taken an inclusive and transparent approach.”

Allied Market Research estimated the sex tech market at $74 billion.

Above: Lora DiCarlo’s line of sexual wellness products.

Image Credit: Lora DiCarlo

After CES changed its policy, sex toy makers such as Lora DiCarlo, OhMiBod, and Lioness were able to exhibit at CES 2020 in January. The show has continued to deny entrance to porn companies after those groups split off and formed their own show years ago. During the past couple of years, Lora DiCarlo has designed 11 products based on its microrobotic engine.

“Our engineering team has gone from a rickety car to a very well oiled machine in the last few years,” Haddock said. “Now we can dump ideas into this hopper and quickly synthesize them in a productive manner. We listen to the problems that people say they have, and we try to solve them.”

Lora DiCarlo started out creating high-tech sex toys in partnership with Oregon State University’s College of Engineering. Osé is a complex product with hundreds of parts. Its flexible body and custom controls allow people, either alone or with a partner, to simultaneously stimulate the G-spot and the clitoris to create a blended orgasm.

“Our mission is really rooted in just eradicating that stigma around sexuality and allowing people to just feel more empowered,” Haddock said.

Lora DiCarlo has also launched a sexual wellness coaching program called WellSx that complements the Osé family of devices. Haddock and other employees at the company got trained in sexual education for the purpose of offering the service. Haddock said the team welcomes constant feedback for its products.

“We were actually able to make some pretty awesome improvements to really better approach a wider breadth of physiology,” Haddock said, regarding the launch of the first product.

Founded in 2017, Lora DiCarlo has 25 employees, not counting contractors, and has raised $6 million. In 2020, the company reported revenues of $7.5 million and it has shipped more than 50,000 products to date. The company has 13 patents.

Lora DiCarlo offers direct-to-consumer delivery in 37 countries with product availability in over 400 retail stores in major markets including the U.S., the United Kingdom, Europe, and Japan. The brand also recently launched a new collection of warming sextech and has additional products set to debut later this year. Funds raised will support continued product development, global marketing, and sales.

Regarding the success so far of the fundraising, Haddock said, “This is not a niche industry. It’s not going away.”


Cloudera partners with Nvidia to expand GPU usage across AI applications

Cloudera and Nvidia announced a collaboration that will allow organizations to use GPUs in more areas across the AI development lifecycle.

Cloudera will integrate its Cloudera Data Platform with Nvidia’s accelerated Apache Spark 3.0 libraries. The integration will make it easier to add machine learning workflows to processes and create architectures without requiring GPU customization. Enterprises will be able to make changes to their data science workflows without having to also update the Nvidia integration manually.

GPUs have shown tremendous promise in enhancing the data science side of AI development, enabling enterprises to run some types of workloads on top of GPUs. However, analytics often involve processes that span multiple teams, forcing enterprises to invest in customizing GPU integrations for those use cases.

Gartner has predicted that creating new architecture patterns that help operationalize data science and ML pipelines will be one of the major trends in 2021.

Benefits of GPU acceleration

The partnership will allow enterprises to use GPUs across modern data workflows that span data preparation, data science, and analytics tasks. A typical workflow includes many steps: data ingestion, data curation, data pipeline automation, data science exploration, model development, testing, deployment, model monitoring and retraining, and delivery into the business. Over the last year, Cloudera has been working to make these processes, and the handoffs between them, much easier.

The Apache Spark 3.0 libraries are accelerated using Nvidia’s RAPIDS platform, which will dramatically accelerate much of the boring prep work required to bring new machine learning models into production. For example, the US Internal Revenue Service is already seeing a threefold improvement in data science workflows for fraud detection, said Joe Ansaldi, IRS technical branch chief for the Research Applied Analytics & Statistics Division, in a statement.

Speeding up data preparation tasks and training models faster will also save on infrastructure costs. GPU-accelerated Apache Spark 3 runs natively on CDP and can plug into high-performance compute tools, Cloudera said.

Above: Comparing the CPU and GPU powered workflows.

Image Credit: Cloudera
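To make the integration more concrete, here is a minimal sketch of how the open source RAPIDS Accelerator for Apache Spark is typically enabled in a PySpark session. The configuration keys come from the RAPIDS Accelerator documentation; the application name, data path, and column names are hypothetical, the rapids-4-spark jars are assumed to already be on the cluster, and the exact setup inside CDP may differ.

```python
from pyspark.sql import SparkSession

# Minimal sketch (not Cloudera's exact setup): turning on the RAPIDS Accelerator
# in a Spark 3 session. Assumes the rapids-4-spark (and matching cuDF) jars are
# already on the cluster classpath; the app name, path, and column names below
# are hypothetical.
spark = (
    SparkSession.builder
    .appName("gpu-accelerated-etl")
    .config("spark.plugins", "com.nvidia.spark.SQLPlugin")    # route SQL/DataFrame work to GPUs
    .config("spark.rapids.sql.enabled", "true")               # allow GPU execution of supported operators
    .config("spark.executor.resource.gpu.amount", "1")        # one GPU per executor
    .config("spark.task.resource.gpu.amount", "0.25")         # share that GPU across 4 concurrent tasks
    .getOrCreate()
)

# Existing DataFrame code runs unchanged; operators the plugin does not support
# simply fall back to the CPU.
df = spark.read.parquet("/data/transactions")                 # hypothetical dataset
df.groupBy("account_id").sum("amount").show()
```

Because unsupported operators fall back to the CPU, the same job also runs on a cluster without GPUs, which is what lets teams adopt the acceleration without rewriting their existing workflows.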

Cloudera’s data portfolio

Cloudera was a trailblazer in the development of data lakes built on top of the Hadoop platform. Cloudera merged with Hortonworks, another Hadoop vendor, in 2018 and combined the technologies into a modern architecture called the Cloudera Data Platform (CDP). At the time, many speculated this spelled the end of Hadoop data warehouses, but Cloudera has continued to innovate and extend Hadoop into a more nimble workflow.

Cloudera added Applied ML Prototypes (AMPs), a framework for packaging AI and ML models for data scientists, to CDP earlier this year. AMPs take the guesswork out of ML projects with prebuilt business application templates for specific use cases, and they often run on Nvidia GPU hardware. Cloudera Data Engineering (CDE) streamlines the data engineering and prep work at the start of a project, addressing common problems data engineers face, such as scheduling and orchestrating complex data pipelines, troubleshooting and tuning the performance of data flows, and collaborating with analytics and data science teams.

The RAPIDS Accelerator for Apache Spark will be available in CDP Private Cloud this summer. Nvidia and Cloudera will roll out additional accelerated offerings in CDP over time, starting with Accelerated Deep Learning and Machine Learning in CDP Public Cloud in May. “This means that no matter where customers require these GPUs (from on-prem to public cloud, to hybrid cloud and beyond), they’ll be able to leverage best-in-class GPUs out of the box,” said Santiago Giraldo, Cloudera director of product marketing for data engineering and machine learning.


Nvidia unveils rental model for DGX Station A100 mini supercomputers


The day has come when you can rent your own mini supercomputer. When you’re done with it, you can return it to Nvidia.

That’s the plan for the company’s new cloud-native supercomputer, the Nvidia DGX Station A100. Nvidia made the announcement during the keynote speech at its GTC 2021 event.

You can use the supercomputer for a short period of time when you need it and then return it after you’re done, said Manuvir Das, head of enterprise computing at Nvidia, in a press briefing. The DGX Station is a multi-tenant supercomputer that can be shared by as many as 28 data scientists. Also announced at GTC is a new Nvidia DGX SuperPod that will be available with Nvidia’s BlueField-2 DPUs, enabling a cloud-native supercomputer. A DGX SuperPod clusters many individual DGX systems into a single machine.

“You can think of us progressing in two directions, one with constant innovation to raise the bar. But the other is to really democratize AI to put it in the hands of as many companies and scientists as we possibly can,” Das said. “We are also announcing for the first time a rental model. Instead of procuring a DGX Station, customers will be able to rent a station directly from Nvidia at a low monthly price point, and they can use it for as long as they choose and then return it to Nvidia. So that’s an important direction that we are taking with the Station.”

This opens the world of AI to more enterprise customers as they investigate areas such as AI, drug discovery, autonomous vehicles, and more. Those DPUs can offload, accelerate, and isolate users’ data — providing customers with secure connections to their AI infrastructure.

The BlueField-2-equipped DGX SuperPod is a full bare-metal supercomputer that is also sharable. In a keynote speech at GTC 21, Nvidia CEO Jensen Huang said the company will focus on three kinds of chips: DPUs, central processing units (CPUs) like Grace, and graphics processing units (GPUs). It will alternate between Arm-based and x86-based products, he said.

“We have to make AI easier to use,” Huang said.

Above: DGX SuperPod

Image Credit: Nvidia

This is the first time Nvidia is turning to a rental model for DGX Stations. The idea is to broaden AI adoption for enterprise IT departments that need to support the work of teams in multiple locations or to help academic and research institutions, which often have to grant outside organizations access to their computers. DGX Stations start at $149,000, while the DGX SuperPod starts at $7 million and scales to $60 million.

The company also announced Nvidia Base Command, which enables multiple users and IT teams to securely access, share, and operate their DGX SuperPod infrastructure. Base Command coordinates AI training and operations on DGX SuperPod infrastructure to enable the work of teams of data scientists and developers located around the globe.

The DGX SuperPods are AI supercomputers featuring 20 or more Nvidia DGX A100 systems and Nvidia InfiniBand HDR networking. Among the latest to deploy DGX SuperPods to power new AI solutions and services is Sony, which uses the DGX SuperPod for its corporate research team to infuse AI across the company.

Other customers are Naver, Recursion, MTS, and VinAI. Additionally, Nvidia and Schrödinger today separately announced a strategic partnership designed to harness DGX SuperPods to further accelerate drug discovery at a supercomputing scale.

Nvidia also introduced a subscription offering for the DGX Station A100. The new subscription program makes it easier for companies at every stage of growth to accelerate AI development outside the datacenter for teams working in corporate offices, research facilities, labs, and home offices.

The DGX SuperPod is a collection of DGX systems working together as one cluster of infrastructure to produce a supercomputer, Das said. Each DGX Station A100 is capable of 2.5 petaflops of AI computing power. Cloud-native, multi-tenant Nvidia DGX SuperPods will be available in Q2 through Nvidia’s global partners.
