
Image by Alina Perekatenkova, from Unsplash
New Facebook Feature Uses Private Photos For AI-Generated Stories
- Written by Kiara Fabbri, Former Tech News Writer
- Fact-Checked by Sarah Frazier, Former Content Manager
According to new reports, Facebook now asks users to upload private phone photos to power AI suggestions, raising concerns about data privacy, consent, and future model training.
In a rush? Here are the quick facts:
- Facebook requests access to users’ unshared camera roll photos for AI suggestions.
- The opt-in feature enables AI to generate collages and restyled photo content.
- Meta may analyze facial features and other metadata under its AI Terms.
Facebook is quietly rolling out a feature that asks users to let Meta’s AI analyze their entire phone photo collection on their device, as first reported by TechCrunch.
The feature aims to provide content suggestions by generating collages, story recaps, and digitally restyled photos for users’ Stories. To enable it, however, users must opt in to “cloud processing,” which means uploading their private photos to Meta’s servers on an ongoing basis.
A Meta spokesperson described the feature as an optional test in the U.S. and Canada. “These suggestions are opt-in only and only shown to you – unless you decide to share them – and can be turned off at any time,” said Meta’s Maria Cubeta, as reported by TechCrunch. “Camera roll media may be used to improve these suggestions, but are not used to improve AI models in this test,” she added.
Users can find this feature under the “Camera roll sharing suggestions” section of Facebook. As AI features become more deeply integrated into everyday apps, the balance between convenience and privacy continues to raise questions. While Meta describes the camera roll tool as an optional and user-controlled feature, the lack of long-term clarity about how personal data may be used leaves room for concern.

Image by user6702303, from Freepik
Robots That “Feel” With Sound Could Aid Farm Work
- Written by Kiara Fabbri, Former Tech News Writer
- Fact-Checked by Sarah Frazier, Former Content Manager
Scientists have created a robotic sensing system that enables machines to detect touch through sound waves rather than sight or pressure.
In a rush? Here are the quick facts:
- It works even in visually blocked or rough environments like dense farms.
- The system localizes touch with less than 0.5 cm error.
- It’s more durable and cheaper than camera or pressure-based sensors.
The system, called SonicBoom, uses sound to detect where and what a robot arm is touching, as first reported by Spectrum. The researchers note that the technology shows particular promise for agricultural applications, where robots typically struggle to navigate through vines and bushes.
Many robots today depend on tiny camera-based tactile sensors, explains Moonyoung (Mark) Lee, a Ph.D. student at Carnegie Mellon University and co-developer of the system.
These sensors detect touch by looking at how a gel pad deforms, but they can be easily blocked by leaves or damaged in rugged farm environments, as reported by Spectrum. Pressure sensors are another option, but they would have to cover the entire robot to be useful, making them costly and fragile.
SonicBoom works differently. Several small microphones embedded in the robot arm pick up the vibrations produced when the arm makes contact with an object. By analyzing the small differences between the signals each microphone records, the system pinpoints where the contact occurred.
In lab tests, SonicBoom achieved touch detection with an error of less than half a centimeter. The system maintained its performance when detecting new materials, such as plastic and aluminum, even though it had no prior training on these materials.
The researchers trained the system by tapping the robot over 18,000 times with a wooden stick. They’re now working on teaching SonicBoom to recognize what kind of object it touches: a leaf, a branch, or a tree trunk.
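To make the training idea concrete, here is a minimal sketch of how contact data like those 18,000 recorded taps could be used to fit a localization model. It is not the published SonicBoom pipeline: the synthetic amplitude-and-delay features, the four-microphone layout, and the scikit-learn regressor are all illustrative assumptions.

```python
# Illustrative sketch only: a learned acoustic contact-localization model,
# loosely inspired by the article's description. The feature choice,
# microphone layout, and regressor are assumptions, not SonicBoom's design.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

RNG = np.random.default_rng(0)
NUM_MICS = 4          # assumed number of in-arm microphones
ARM_LENGTH_CM = 60.0  # assumed arm length
MIC_POS_CM = np.linspace(5, 55, NUM_MICS)  # assumed mic placement along arm

def simulate_tap_features(contact_pos_cm):
    """Toy stand-in for real vibration recordings: each microphone sees a
    signal whose amplitude decays and whose arrival is delayed with distance
    from the contact point, plus measurement noise."""
    dist = np.abs(MIC_POS_CM - contact_pos_cm)
    amplitude = np.exp(-dist / 20.0) + RNG.normal(0, 0.02, NUM_MICS)
    delay_ms = dist / 50.0 + RNG.normal(0, 0.01, NUM_MICS)  # ~50 cm/ms wave speed
    return np.concatenate([amplitude, delay_ms])

# Build a training set analogous in size to the ~18,000 recorded taps.
positions = RNG.uniform(0, ARM_LENGTH_CM, 18_000)
features = np.stack([simulate_tap_features(p) for p in positions])

X_train, X_test, y_train, y_test = train_test_split(
    features, positions, test_size=0.2, random_state=0)

# Small neural-network regressor maps microphone features -> contact position.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
model.fit(X_train, y_train)

errors = np.abs(model.predict(X_test) - y_test)
print(f"Mean localization error: {errors.mean():.2f} cm")
```

The same supervised setup would extend naturally to classification, which is what recognizing leaves, branches, or trunks would require.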
Lee noted: “With SonicBoom, you can blindly tap around and know where the [contact happens], but at the end of the day, for the robot, the really important information is: Can I keep pushing, or am I hitting a strong trunk and should rethink how to move my arm?”
Lee added that, although the results are promising, real-world farm tests are still to come.