
Image by Shane uchi, from Unsplash
Pornhub Blocks Access In 17 States Over Privacy Risks
- Written by Kiara Fabbri Former Tech News Writer
- Fact-Checked by Justyn Newman Former Lead Cybersecurity Editor
Nearly two years ago, Louisiana passed a law that set off a wave of similar measures across the southern United States, changing how people access adult content online. 404 Media reported today that three additional states have joined those already affected by age verification laws, bringing to 17 the number of states where Pornhub blocks access.
In a Rush? Here are the Quick Facts!
- Louisiana’s 2022 law led to adult websites requiring government ID for access.
- Louisiana traffic to Pornhub dropped 80%, driving users to unregulated platforms.
- Legal challenges and vetoes question the constitutionality of these restrictions.
These restrictions, intended to protect children, are raising significant concerns about privacy and censorship for adults.
404 Media explains how the trend began with Louisiana’s “Act 440,” introduced by state representative Laurie Schlegel, a counselor specializing in “sex addiction.” The law mandates that websites with adult content verify users’ ages with government-issued IDs.
Noncompliance can lead to steep fines and lawsuits. Rather than risk these penalties while compromising user privacy, Aylo—the parent company of Pornhub and its network of sites—has chosen to block access in states with these regulations, as noted by 404 Media.
The list of affected states includes Virginia, Montana, North Carolina, Arkansas, Utah, Mississippi, Texas, Nebraska, Idaho, Kansas, Kentucky, Indiana, Alabama, Oklahoma, Florida, Tennessee, and South Carolina. Georgia is next in line, with its own age verification law set to take effect in July.
404 Media says that Louisiana offers a preview of how these laws play out in practice. There, users must verify their age through a state-issued digital driver’s license app before accessing sites like Pornhub.
However, the policy has backfired. Aylo reported that traffic to its sites in Louisiana plummeted by 80%, with many users turning to less regulated platforms lacking basic safeguards against harmful content, according to 404 Media.
This workaround trend isn’t unique to Louisiana. In Florida and other states with similar laws, people are increasingly using VPNs (virtual private networks) to bypass access restrictions, as indicated by spikes in related Google searches.
Critics argue these laws create more problems than they solve. Forcing websites to collect sensitive personal information, such as government IDs, exposes users to significant privacy risks. At the same time, banning access to well-moderated platforms only pushes users toward unregulated and potentially harmful alternatives.
Aylo has been vocal in its opposition to these measures. In a statement, the company emphasized its support for age verification in principle but criticized the implementation of these laws as “ineffective, haphazard, and dangerous.”
404 Media notes that some states are pushing back against these measures. Arizona Governor Katie Hobbs vetoed a similar bill, arguing it violated First Amendment protections. Meanwhile, legal challenges to these laws are gaining momentum.
In Florida, the Free Speech Coalition has filed a lawsuit, describing the regulations as invasive and a threat to online privacy and freedom, as reported by 404 Media.
“These laws create a substantial burden on adults who want to access legal sites without fear of surveillance,” Alison Boden, Executive Director of the Free Speech Coalition, stated in a recent press release.
“Despite the claims of the proponents, HB3 is not the same as showing an ID at a liquor store. It is invasive and carries significant risk to privacy. This law and others like it have effectively become state censorship, creating a massive chilling effect for those who speak about, or engage with, issues of sex or sexuality.”

Image by Mliu92, from Wikimedia Commons
Waymo Robotaxi And Serve Delivery Robot Collide In Los Angeles
- Written by Kiara Fabbri Former Tech News Writer
- Fact-Checked by Justyn Newman Former Lead Cybersecurity Editor
On December 27, 2024, a collision occurred between a Waymo robotaxi and a Serve Robotics sidewalk delivery robot at a West Hollywood intersection, as first reported by TechCrunch.
In a Rush? Here are the Quick Facts!
- The incident occurred at a West Hollywood intersection; no significant damage was reported.
- The collision happened when the delivery bot moved into the Waymo robotaxi’s path.
- Waymo’s system detected the robot and applied brakes before contact, hitting it at 4 mph.
The incident was captured in a video that quickly spread on social media, sparking questions about the safety and liability of autonomous vehicles.
Video: “Food Delivery Robot Hit By Waymo,” posted by u/mingoslingo92 on r/waymo
The footage shows the Serve bot crossing a street at night, attempting to climb onto the sidewalk. After backing up to correct its course, it moved toward the curb ramp, but as it did, a Waymo robotaxi making a right turn struck the delivery robot.
TechCrunch says that the person who posted the video claimed the Serve bot ran a red light, though the footage doesn’t clearly support this assertion.
When asked about the incident, a Waymo spokesperson explained to TechCrunch that its Driver system classified the Serve delivery robot as an inanimate object, failing to recognize it as a potential hazard.
The system is designed to prioritize safety by choosing the safest driving path based on available information. For example, it is programmed to be cautious around pedestrians and children.
According to the spokesperson, when the Serve bot moved into the taxi’s path, the Waymo system applied the brakes, but the collision happened due to a misjudgment in timing, as reported by TechCrunch.
Neither vehicle sustained damage, and both robots remained briefly locked together before continuing on their way. The incident raises important questions about what happens when autonomous vehicles collide and how liability is determined.
Waymo’s protocol in such cases involves notifying its Fleet Response and Rider Support teams, with the latter contacting first responders if necessary. In this case, no passengers were inside the robotaxi, said TechCrunch.
Serve confirmed to TechCrunch that the delivery robot was under remote supervision at the time of the incident, as that is standard procedure for crossing intersections.
Neither Waymo nor Serve provided specific details about liability for future collisions, but both companies stated they are working together to prevent similar issues, says TechCrunch.
The collision between the Waymo robotaxi and Serve delivery robot raises concerns for pedestrian safety. If autonomous vehicles fail to accurately recognize hazards, they could inadvertently cause accidents, especially in busy urban environments where unpredictability is common.
Questions surrounding the companies’ responsibility in these cases will likely prompt regulatory scrutiny and a push for clearer laws regarding liability in autonomous vehicle collisions.
As autonomous technology evolves, legal frameworks will need to adapt to ensure accountability and the protection of public safety.