Recent surveys show that consumers are interested in artificial intelligence (AI) but also increasingly concerned about its potential threats following the arrival of ChatGPT and other generative AI tools.
A recent survey conducted by Forbes Advisor found 76 percent of U.S. consumers were concerned about misinformation from AI tools such as Google Bard, ChatGPT and Bing Chat. Most were concerned about AI’s use for product descriptions, product reviews, chatbots answering questions and personalized advertising.
The findings suggest “a consumer demand for transparency and ethical AI practices to foster trust between businesses and their customers,” according to Forbes.
A survey from CX platform DISQO taken in early March found 34 percent of U.S. adults don’t think generative AI tools should be used for most consumer-facing content (43 percent among Boomers versus 21 percent for Gen Z).
The top five concerns around AI were poorer accuracy, cited by 45 percent; lack of human touch, 38 percent; negative impact on jobs, 36 percent; low emotional depth, 35 percent; and more bias, 29 percent. Sixty-eight percent reported low overall knowledge of AI-generated content tools.
“Consumers are wary and need to be informed and educated about what’s in it for them,” Patrick Egan, director of research and insights, DISQO, said in a statement.
A Morning Consult survey taken in mid-February found that while more than half of the U.S. public believes AI integrations into products and services are the future of technology, just one-third think AI technologies will be developed responsibly, and only one-third trust AI to provide factual results.
“We don’t need to be afraid of it, but we do need to be in control of it,” Massachusetts Congressman Jake Auchincloss told Morning Consult. “It can’t be like social media where it was allowed to scale and to influence much of our private and public lives before we really got a handle on it — and frankly, still haven’t gotten a handle on it.”