
Google warns its staff about using its own AI chatbot Bard

Google says chatbots carry a potential risk of data leakage


Google, the company that has been aggressively pushing its AI chatbot Bard and adding AI-powered features to several of its existing products, is wary of its own technology.

Alphabet, Google’s parent company, has advised its employees not to enter confidential information into any chatbot, including its own Bard, in an effort to safeguard sensitive information.

While programs like Bard and OpenAI’s ChatGPT, powered by generative artificial intelligence, are designed to engage in human-like conversations and provide answers to various types of queries, Google says they also carry a potential risk of data leakage. As reported by Reuters, the company also warned employees not to use the chatbot’s code suggestion feature as it might not be reliable or secure. Google showed off Bard’s code-generating capabilities at Google I/O 2023 last month.

The company maintains that Bard is still a useful tool for programmers, but says it wants to be transparent about the technology’s limitations.

Samsung Electronics has already banned the use of artificial intelligence (AI) tools by employees after discovering that sensitive code was uploaded to OpenAI’s ChatGPT. In February, JPMorgan Chase & Co., Bank of America Corp. and Citigroup Inc. banned or restricted the use of OpenAI’s chatbot service. Italy also barred the use of ChatGPT over privacy fears, though it has since reversed its stance.

Further, ChatGPT was taken down for several hours back in March when a bug exposed chat titles to other users.

Bard’s launch was expedited as Google raced to compete with other players in the AI chatbot market, such as OpenAI’s ChatGPT and Microsoft’s Bing AI. While these chatbots have various use cases, such as writing content, solving problems, and generating code for games, they also pose serious risks, such as spreading misinformation and bias, leaking sensitive data, or infringing on copyrights.

Source: Reuters
