Microsoft lays off team that warned of the risks of AI technology



It has been revealed that Microsoft has laid off its AI ethics team, which assessed the risks of incorporating OpenAI's technology into its products, as part of the large-scale layoffs that began in early 2023. With Microsoft's Bing Chat, which incorporates OpenAI's large language model, in the spotlight, concerns have been raised in the media about the elimination of a team dedicated to upholding the company's 'Principles for Responsible AI.'

Microsoft just laid off one of its responsible AI teams
https://www.platformer.news/p/microsoft-just-laid-off-one-of-its

According to a source who spoke to Platformer, a newsletter covering Silicon Valley, Microsoft's ethics and society team was disbanded as part of the layoffs of 10,000 employees.

This team, dedicated to guiding ethical and responsible AI innovation, numbered about 30 people at its peak in 2020, including engineers, designers, and experts in philosophy. It was reduced to about seven people in an organizational restructuring in October 2022, but ever since Microsoft partnered with OpenAI, it had been identifying the risks of integrating OpenAI's technology into Microsoft products.



John Montgomery, Microsoft's vice president of AI, told the restructured team in a meeting that he was under pressure from Microsoft's top executives. In an audio recording of the meeting obtained by Platformer, Montgomery said, 'The pressure from Kevin and Satya [CTO Kevin Scott and CEO Satya Nadella] is to get the latest OpenAI models and beyond to customers very quickly.'

When members of the ethics and society team heard this, they pushed back: 'Please reconsider. We understand there are business issues, but what this team has always been deeply concerned about is our impact on society, including the negative impacts, and those impacts are significant.' But Montgomery would not budge.

At the time, Montgomery said the team would not disappear, but most of its members were subsequently transferred to other departments within Microsoft, and on March 6, 2023, the remaining members were told that the team was being eliminated.

Regarding the move, Platformer wrote, 'Anyone involved in AI development would agree that the technology poses potential risks. Even after the elimination of the ethics and society team, three groups within Microsoft continue to work on the issue. Given the technology's dangers, however, the downsizing of a team dedicated to the responsible use of AI is noteworthy.'



Platformer reports that the now-defunct ethics and society team made some prescient recommendations. In October 2022, Microsoft announced Bing Image Creator, which incorporates OpenAI's image-generating AI, DALL-E. In a document it left behind, the team warned, 'We believe that very few artists have consented to the use of their work as training data. Furthermore, many people are still unaware that image-generating technology can produce an edited version of their work in just a few seconds.'

Furthermore, when OpenAI updated its terms of use to give users full ownership of the images they create with DALL-E, the team argued, 'It is ethically questionable that the person who enters the prompt gains full ownership of the resulting image.'

Based on these concerns, the team compiled a list of mitigations, including blocking the use of living artists' names as prompts, but Microsoft went ahead with the experimental release of Bing Image Creator without implementing them.

In a statement, Microsoft said, 'We are committed to developing AI products and experiences safely and responsibly, and we achieve this by investing in the people, processes, and partnerships that make this a priority. Over the past six years, we have increased the number of employees across our product teams and in our Office of Responsible AI (ORA) who are accountable for ensuring our AI principles are put into practice. We are grateful for the pioneering work our Ethics and Society team has done in our ongoing responsible AI journey.'

Posted in AI by darkhorse_log