Microsoft has a new way to keep ChatGPT ethical, but will it work?

Microsoft shut down its AI ethics team in March 2023, but the company says it’s still working on ways to keep ChatGPT responsible and prevent further scandals.


Microsoft caught a lot of flak when it shut down its artificial intelligence (AI) Ethics & Society team in March 2023. It wasn’t a good look given the near-simultaneous scandals engulfing AI, but the company has just laid out how it intends to keep its future AI efforts responsible and in check.

In a post on Microsoft’s On the Issues blog, Natasha Crampton — the Redmond firm’s Chief Responsible AI Officer — explained that the ethics team was disbanded because “A single team or a single discipline tasked with responsible or ethical AI was not going to meet our objectives.”

Bing Chat shown on a laptop. Jacob Roach / Digital Trends

Instead, Microsoft adopted the approach it has taken with its privacy, security, and accessibility teams, and “embedded responsible AI across the company.” In practice, this means Microsoft has senior staff “tasked with spearheading responsible AI within each core business group,” as well as “a large network of responsible AI ‘champions’ with a range of skills and roles for more regular, direct engagement.”

Beyond that, Crampton said Microsoft has “nearly 350 people working on responsible AI, with just over a third of those (129 to be precise) dedicated to it full time; the remainder have responsible AI responsibilities as a core part of their jobs.”

Crampton noted that after Microsoft shuttered its Ethics & Society team, some of its members were embedded into teams across the company. However, seven members of the group were let go as part of the extensive job cuts that saw 10,000 Microsoft workers laid off at the start of 2023.

Navigating the scandals

Bing Chat saying it wants to be human. Jacob Roach / Digital Trends

AI has hardly been free of scandals in recent months, and it’s those worries that fueled the backlash against Microsoft’s disbanding of its AI ethics team. If Microsoft lacked a dedicated team to help guide its AI products in responsible directions, the thinking went, it would struggle to curtail the kinds of abuses and questionable behavior its Bing chatbot has become notorious for.

The company’s latest blog post is surely aiming to alleviate those concerns among the public. Rather than abandoning its responsible AI efforts entirely, it seems Microsoft is seeking to ensure teams across the company have regular contact with experts in responsible AI.

Still, there’s no doubt that shutting down its AI Ethics & Society team didn’t go over well, and chances are Microsoft still has some way to go to ease the public’s collective mind on this topic. Indeed, even Microsoft itself thinks ChatGPT, whose developer, OpenAI, counts Microsoft as a major investor, should be regulated.

Just yesterday, Geoffrey Hinton, the so-called “godfather of AI,” quit Google and told the New York Times he had serious misgivings about the pace and direction of AI expansion, while a group of leading tech experts recently signed an open letter calling for a pause on AI development so that its risks can be better understood.

Microsoft might not be disregarding worries about ethical AI development, but whether its new approach is the right one remains to be seen. After the controversial start Bing Chat has endured, Natasha Crampton and her colleagues will be hoping things change for the better.


