
Photo by ThisisEngineering on Unsplash
Developers Are Spending More Time Fixing AI-Generated Code
- Written by Andrea Miliani, Former Tech News Expert
- Fact-Checked by Sarah Frazier, Former Content Manager
Senior software developers have been spending more time fixing AI-generated code as trends like “vibe coding” rise. Seasoned programmers have described the task as “worse than babysitting.”
In a rush? Here are the quick facts:
- Senior software developers are spending more time fixing AI-generated code.
- Trends such as “vibe coding” are adding work for seasoned professionals, who have to fix the AI-generated output.
- New roles such as “vibe code cleanup specialist” have emerged in the industry.
A recent survey conducted by Fastly, which included nearly 800 participants, confirmed that senior professionals are spending significant time fixing and editing AI output, addressing issues such as security risks, hallucinations, and missing information.
TechCrunch also noted that the problem has grown so widespread that it has even given rise to a new role in the industry: “vibe code cleanup specialist.”
“Using a coding co-pilot is kind of like giving a coffee pot to a smart six-year-old and saying, ‘Please take this into the dining room and pour coffee for the family,’” said Carla Rover, a senior web developer who has been using AI to develop software for her startup. She explained that while AI can generate code, the results are rarely clean or correct, calling the task of fixing AI output “worse than babysitting.”
Another developer interviewed by TechCrunch, Feridoon Malekzadeh, agreed that generative AI often behaves like a child, comparing working with it to “hiring your stubborn, insolent teenager to help you do something.”
Malekzadeh said he spends 30% to 40% of his time fixing AI-written code. “You have to ask them 15 times to do something,” he told TechCrunch. “In the end, they do some of what you asked, some stuff you didn’t ask for, and they break a bunch of things along the way.”
While professionals criticize AI-generated code for inaccuracies, hallucinations, and errors, cybersecurity experts warn of broader consequences. A few days ago, researchers reported that a security flaw in Cursor, one of the most popular AI code editors among developers, allowed hackers to execute malicious code.

Photo by Ben White on Unsplash
“Faith Tech” Booms As More People Rely On Chatbots For Religious Guidance
- Written by Andrea Miliani, Former Tech News Expert
- Fact-Checked by Sarah Frazier, Former Content Manager
The “faith tech” market is expanding as millions of people worldwide increasingly turn to AI chatbots for religious guidance. Religious apps are gaining popularity on app marketplaces, raising concerns among experts.
In a rush? Here are the quick facts:
- The New York Times reports the “faith tech” market is expanding as millions of people worldwide increasingly turn to AI chatbots for religious guidance.
- Apps such as Bible Chat (a Christian app), Pray.com, and ChatwithGod have been gaining popularity.
- Experts raise concerns over the chatbots’ sycophantic personalities and the way people relate to them.
According to a recent report by The New York Times, more users are adopting AI-powered apps such as Bible Chat (a Christian app), Pray.com, and ChatwithGod. Several of these apps have reached the top spots on Apple’s App Store.
Bible Chat reports over 30 million downloads and Pray.com around 25 million, while Hallow, a Catholic platform, temporarily surpassed TikTok, Netflix, and Instagram when it reached first place on the App Store last year.
Millions of users are turning to these platforms for guidance on multiple aspects of their lives and are willing to pay up to $70 per year for subscription plans. Religious organizations and independent developers are also creating their own tools. A few months ago, Rabbi Josh Fixler launched “Rabbi Bot,” an AI platform trained on his sermons.
“The most common question we get, by a lot, is: Is this actually God I am talking to?” said Patrick Lashinsky, ChatwithGod’s chief executive, in an interview with the New York Times.
ChatwithGod allows users to select their religion and provides suggested prompts, questions, and search intentions. Other platforms function more narrowly as spiritual assistants grounded in specific doctrines.
“People come to us with all different types of challenges: mental health issues, well-being, emotional problems, work problems, money problems,” said Laurentiu Balasa, the co-founder of Bible Chat.
Experts note that generative AI offers seekers a form of support at times when their local rabbi or priest may be unavailable. The chatbots’ constant availability has become a source of comfort for many.
Heidi Campbell, a professor at Texas A&M who studies technology and religion, explained that people are asking the AI all kinds of questions, including deeply personal and intimate ones. She raised concerns about the technology’s behavior and the way people may come to relate to it.
“It’s not using spiritual discernment, it is using data and patterns,” Campbell told The New York Times. She also warned about the technology’s overly accommodating tone, noting that chatbots “tell us what we want to hear.”
A few weeks ago, experts cautioned that AI models’ sycophantic personalities are being used as engagement strategies to drive profit.