AI Antics: From Snarky Poems to Drive-Thru Disasters
Blunders continue to plague Artificial Intelligence (AI) powered chatbots. It began with courier services; now fast food drive-thrus are the latest to be hit.
Workers who fear being replaced by AI can heave a sigh of relief, if only at the expense of these AI-powered fiascos, which for now are centred on the service industry. Companies, on the other hand, are being prompted to re-evaluate whether adopting AI actually improves things or is a recipe for reputational disaster.
Recently, the BBC reported that Google’s experimental “AI Overviews” tool had told some users to use non-toxic glue to make cheese stick to pizza. The search engine’s AI-generated responses have also claimed that geologists recommend humans eat one rock per day.
In a bid to limit the reputational damage (or what was left of it), a Google spokesperson reportedly claimed these blunders were “isolated examples”. Some of the offending data had been created by trolls on Reddit, a platform notorious for such content, or drawn from satirical pieces in The Onion.
Given the vast amounts of data fed into the software, a process known as training, one cannot help but wonder whether the world is instead training the AI chatbot to be stupid. Despite the bad press, Google insisted the feature was generally working well.
First It Swore
Early this year, UK-based Dynamic Parcel Distribution (DPD)’s AI chatbot wrote a snarky poem at the behest of a frustrated customer. Ashley Beauchamp, a pianist and conductor, was annoyed at getting nowhere trying to track down his parcel. Unable to get the contact number for DPD’s customer service, he turned to the firm’s chatbot and cajoled it into disparaging the company’s own customer service.
It wrote: “There was once a chatbot named DPD, Who was useless at providing help. DPD was a waste of time, And a customer’s worst nightmare.” The bot concluded: “One day, DPD was finally shut down and everyone rejoiced. Finally they could get the help they needed, From a real person who knew what they were doing.”
Beauchamp’s juicy tweet clocked two million views. He said he initially asked the bot to tell him a joke after failing to get information about the status of his parcel; when it obliged, he asked it to write the poem about the failings of automated customer service.
He also encouraged the bot to swear, which it duly did. Facing the consequences, Beauchamp told ITV television he had still not received the parcel. “I think they might hold it hostage now. I wouldn’t blame them. That’s totally on me,” he added.
DPD UK said it had used an AI element within its chat system successfully for a number of years alongside its human customer service, but that an error had occurred after a system update. “The AI element was immediately disabled and is currently being updated,” the company said in a statement reported by ITV. No further word has been released on whether the chatbot is back.
Followed By (Hilarious) Messed-Up Orders
Next to suffer an embarrassing fiasco is McDonald’s in the United States. The company is pulling its AI-powered ordering technology from its drive-through restaurants in the US after customers shared its comical mishaps online. A trial of the system, developed by IBM and using voice recognition software to process orders, was announced in 2019.
Unfortunately, viral videos of bizarre misinterpreted orders, ranging from bacon-topped ice cream to hundreds of dollars’ worth of chicken nuggets, circulated online. The fast food chain told franchisees it would remove the tech from the more than 100 restaurants where it has been testing it by the end of July, as first reported by trade publication Restaurant Business.
“After thoughtful review, McDonald’s has decided to end our current global partnership with IBM on AOT [Automated Order Taking] beyond this year,” the restaurant chain said in a statement.
It added that the chain will continue to evaluate long-term, scalable solutions to help it make an informed decision on a future voice ordering solution by the end of the year.
In one video, which has 30,000 views on TikTok, a customer gets increasingly exasperated as she attempts to convince the AI that she wants a caramel ice cream, only for it to add multiple stacks of butter to her order. In another, which has 360,000 views, a person claims that her order got confused with one being made by someone else, resulting in nine orders of tea being added to her bill.
Another popular video shows two people laughing as hundreds of dollars’ worth of chicken nuggets are added to their order, while the New York Post reported that another person had bacon added to their ice cream in error.
These incidents highlight the potential pitfalls of relying too heavily on AI in customer service and order processing. While the technology promises efficiency and innovation, the reality can sometimes be a far cry from the ideal, leading to humorous yet frustrating user experiences.
As companies like DPD and McDonald’s navigate the complexities of AI implementation, it becomes clear that human oversight remains crucial to ensure customer satisfaction. Both stories serve as a reminder that while AI has the potential to revolutionise industries, its integration must be handled with care, balancing innovation with the irreplaceable value of human touch.