Using AI to craft love messages may save time, but it comes at a cost. A new study finds that even perfectly written AI notes are judged as lazy and less sincere, signaling to recipients that the sender didn’t invest effort or thought in the gesture. People notice when shortcuts are taken, and using AI for personal communications like love letters, apologies, or wedding vows can make senders appear less caring, less trustworthy, and emotionally distant, no matter how polished the words seem.

People may want to think twice before turning to artificial intelligence to craft Valentine’s messages. Psychologists warn that while AI can produce polished writing, it may also reshape how effort and sincerity are perceived in relationships.
A University of Kent study surveying 4,000 participants found that people were judged “more negatively” when they relied on AI to write love letters, apologies, and wedding vows. Using AI for personal communication was widely seen as “less caring, less authentic, less trustworthy and lazier”, even when the writing itself was strong and users were transparent about using the technology.
The research forms part of the Trust in Moral Machines project, supported by the University of Exeter, and points to a growing tension between convenience and emotional authenticity as AI tools become more embedded in everyday life.
Jacqueline McKenzie, a resident of Tunbridge Wells, said she uses AI for some tasks but draws a firm line at romance. She said she would “never in a million years” use AI to compose a Valentine’s message.
Others echoed the sentiment. Liam Goodhew of Bexley, Greater London, said he would not write a love letter to his partner Paige with AI because it would not feel genuine. “She’s worth more than that,” he told BBC Radio Kent.
Reza Jafary, also from Tunbridge Wells, questioned whether machine-generated affection could ever feel personal. “A Valentine’s Day message should come from the heart, not a computer,” he said.
Researchers say the reaction is less about the final product and more about the process behind it. “People don’t just judge what you produce, they judge how you produce it,” said Dr Scott Claessens.
Dr Jim Everett added: “If you use AI for these kind of social tasks that bind us together, you risk being judged not only because you didn’t put effort in, but because it makes people think you care less about the task and what it represents.”
He added that AI was “no substitute for investing effort into our interpersonal relationships”.
As AI writing tools become faster and more sophisticated, the study suggests their social acceptance may depend on where people draw the boundary between assistance and emotional outsourcing. For now, when it comes to matters of the heart, effort still signals intent.
Faustine Ngila is the AI Editor at Impact Newswire, based in Nairobi, Kenya. He is an award-winning journalist specializing in artificial intelligence, blockchain, and emerging technologies.
He previously worked as a global technology reporter at Quartz in New York and Digital Frontier in London, where he covered innovation, startups, and the global digital economy.
With years of experience reporting on cutting-edge technologies, Faustine focuses on AI developments, industry trends, and the impact of technology on society.



