
Image by Brett Jordan, from Unsplash
Fake Job Emails Used to Spread BeaverTail Malware
- Written by Kiara Fabbri, Former Tech News Writer
- Fact-Checked by Sarah Frazier, Former Content Manager
A new cyberattack is targeting job seekers by using fake recruitment emails to spread malware disguised as harmless developer files.
In a rush? Here are the quick facts:
- Hackers impersonated recruiters to spread malware via fake developer projects.
- Attackers used BitBucket links to trick victims into downloading files.
- Tropidoor backdoor can steal data, take screenshots, and run commands.
Cybersecurity experts at ASEC, who first identified this malware, explain that the incident reflects an increasingly common tactic in which attackers pose as recruiters or as members of developer communities.
The campaign first emerged on November 29, 2024, when hackers posed as developers from the Dev.to platform.
The attackers sent emails containing links to a BitBucket code repository and asked recipients to review the project. Hidden among the ordinary-looking project files was malware.
The fake files included two major threats: JavaScript-based malware called BeaverTail, disguised as a “tailwind.config.js” file, and a second component called car.dll, which acts as a downloader. When opened, these files work together to steal login details, browser data, and even cryptocurrency wallet information.
“BeaverTail is known to be distributed primarily in phishing attacks disguised as job offers,” researchers at ASEC explained. Previous versions of this attack were spotted on platforms like LinkedIn.
The malware is especially dangerous because it hides its true purpose behind standard system operations. It abuses PowerShell and rundll32, both legitimate Windows utilities, to evade detection by antivirus software.
After penetrating a system, the malware retrieves and executes Tropidoor, an advanced backdoor. The tool establishes encrypted connections with remote servers and supports more than 20 commands, including deleting files, injecting code into programs, and capturing screenshots.
“Tropidoor… collects basic system information and generates a random 0x20 byte key, which is encrypted with an RSA public key,” researchers said. This secure connection lets hackers control infected machines without being noticed.
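To illustrate the general mechanism the researchers describe, here is a minimal Python sketch of that kind of hybrid-encryption handshake: a random 0x20-byte (32-byte) session key is encrypted with an RSA public key so that only the holder of the matching private key can recover it. The padding scheme, key size, and names below are illustrative assumptions, not details confirmed in the ASEC report.

```python
# Sketch of the hybrid-encryption pattern described above (details assumed).
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Stand-in key pair; real malware would embed only the operator's public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Generate a random 0x20-byte (32-byte) symmetric session key.
session_key = os.urandom(0x20)

# Encrypt the session key with the RSA public key (OAEP padding assumed).
encrypted_key = public_key.encrypt(
    session_key,
    padding.OAEP(
        mgf=padding.MGF1(algorithm=hashes.SHA256()),
        algorithm=hashes.SHA256(),
        label=None,
    ),
)
```

Because only the matching private key can recover the session key, the resulting command-and-control traffic is opaque to anyone watching the network, which is what makes this kind of connection so hard to spot.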
Security teams urge job seekers to remain vigilant. Be wary of unexpected recruitment emails, especially those with links to code repositories or requests to download project files, and always verify with the company through official channels before opening any content.

Image by Oberon Copeland, from Unsplash
AI Bots Are Overloading Wikipedia’s Servers
- Written by Kiara Fabbri, Former Tech News Writer
- Fact-Checked by Sarah Frazier, Former Content Manager
The Wikimedia Foundation has raised alarms over the growing pressure on its servers due to automated bots scraping data to train artificial intelligence models.
In a rush? Here are the quick facts:
- AI bots are scraping Wikimedia content at record levels.
- Bots caused a 50% rise in multimedia bandwidth use.
- 65% of high-cost traffic now comes from crawlers.
The Foundation reported in a recent post that machine-generated traffic continues to grow at an unprecedented rate, while human readers account for only a small share of it.
“Since January 2024, we have seen the bandwidth used for downloading multimedia content grow by 50%,” the post states.
“This increase is not coming from human readers, but largely from automated programs that scrape the Wikimedia Commons image catalog of openly licensed images to feed images to AI models,” the post added.
These bots, known as crawlers, scrape large amounts of data from Wikimedia projects, including Wikipedia and Wikimedia Commons, without attribution and without using official access tools. This makes it harder for new users to discover Wikimedia and puts excessive strain on its technical systems.
For example, the post notes that Jimmy Carter’s Wikipedia page received more than 2.8 million views on the day of his death in December 2024, and a video of his 1980 presidential debate also caused a spike in traffic. Wikimedia handled the surge, but just barely. The real problem, according to engineers, is the continuous stream of bot traffic.
“65% of our most expensive traffic comes from bots,” the Foundation wrote. Bots “bulk read” content, especially less popular pages, which triggers expensive requests to Wikimedia’s core datacenters.
While Wikimedia’s content is free to use, its servers are not. “Our content is free, our infrastructure is not,” the Foundation said. The team continues to develop methods for promoting “responsible use of infrastructure” by urging developers to use the API instead of scraping the entire site.
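For developers who do need Wikimedia data, that guidance translates into using the public APIs with an identifiable client rather than bulk-crawling rendered pages. A minimal sketch, assuming the standard MediaWiki Action API on English Wikipedia and a placeholder User-Agent string:

```python
# Fetch a plain-text article summary via the MediaWiki Action API
# instead of scraping HTML pages. The User-Agent value is a placeholder;
# Wikimedia asks clients to identify themselves with contact information.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "ExampleResearchBot/0.1 (contact: you@example.org)"}

params = {
    "action": "query",
    "prop": "extracts",
    "exintro": 1,        # only the lead section
    "explaintext": 1,    # plain text rather than HTML
    "titles": "Jimmy Carter",
    "format": "json",
}

response = requests.get(API_URL, params=params, headers=HEADERS, timeout=30)
response.raise_for_status()

for page in response.json()["query"]["pages"].values():
    print(page["title"])
    print(page["extract"][:300])
```

Requesting only the data you need and identifying your client this way is the kind of “responsible use of infrastructure” the Foundation is asking for.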
The problem affects Wikimedia as well as numerous other websites and publishers. But for the world’s largest open knowledge platform, it’s threatening the stability of services millions rely on.