Microsoft reportedly used canned Bing Chat responses to promote Bing search

Microsoft has a knack for being pushy, but this might be too far

Microsoft has a reputation for using Windows to aggressively promote its own products over competitors'. These tactics range from annoying and cringeworthy to downright nefarious, and the latest example falls into the latter category. Microsoft reportedly started using its new Bing Chat to promote its search engine, and the promotion doesn't appear to be generated by GPT-4.

The Verge's senior editor Sean Hollister recently detailed his experience trying to download Google's Chrome browser on a new Windows 11 computer. It started with opening Microsoft's Edge browser, which comes bundled with Windows 11, and typing 'Chrome' into the address bar. Edge's default search engine is Microsoft's Bing, and for a while now, searching for other web browsers has produced a prompt above the results saying there's "no need to download a new web browser."

Except, that’s not what Hollister saw. Instead, he received a full-screen Bing Chat response highlighting features in the Bing search engine.

Hollister goes on to claim that he was able to reproduce the result across several searches and on other computers. He wasn't alone in the experience, either: colleagues in other countries saw the same response. This all suggests that the Bing Chat response isn't generated by GPT-4, the large language model (LLM) powering Bing Chat. Hollister calls it a "completely canned interaction" that takes up the whole screen.

After Hollister published an article about the experience on The Verge, Microsoft reportedly stopped showing the canned response. Moreover, a company spokesperson sent him a statement from Microsoft product marketing director Jason Fischel claiming it was an experiment:

“We often experiment with new features, UX, and behaviors to test, learn, and improve experiences for our customers. These tests are often brief and do not necessarily represent what is ultimately or broadly provided to customers.”

Still, the whole thing seems somewhat shady, and it calls into question the integrity of Bing Chat's responses (not that you should have put much trust in them to begin with). LLMs already have a tendency to make things up, something that has caused several problems for companies like OpenAI. But it's one thing for an LLM to hallucinate, and quite another for a company to fake a response that promotes its products over competitors.

Source: The Verge