Microsoft is discontinuing facial recognition technology that, it says, can infer people’s emotions, among other things. According to Microsoft, the technology raises questions about artificial intelligence, privacy, discrimination and other forms of misuse.
The technology concerned is part of the Azure Face service that Microsoft sells to businesses. It claims to recognize not only a person’s emotions, but also their gender, age, hairstyle and makeup.
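For context, these attributes were exposed to Azure customers through the Face service’s “detect” call. The sketch below, in Python with the requests library, shows roughly how a client would have requested the emotion, age, gender, hair and makeup classifiers that are now being withdrawn; the resource name and key are placeholders, and the endpoint path and parameter names follow the classic REST API, which may differ in newer versions.

```python
import requests

# Illustrative values only: replace with a real Face resource and key.
ENDPOINT = "https://<your-face-resource>.cognitiveservices.azure.com"
SUBSCRIPTION_KEY = "<subscription-key>"


def detect_face_attributes(image_url: str) -> list:
    """Ask the classic Face 'detect' endpoint for the attribute classifiers
    mentioned in the article (emotion, age, gender, hair, makeup)."""
    response = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "emotion,age,gender,hair,makeup"},
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/json",
        },
        json={"url": image_url},
    )
    response.raise_for_status()
    # Each entry describes one detected face, with a "faceAttributes" dict
    # containing, among other things, emotion probability scores.
    return response.json()
```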
Microsoft worked with researchers to weigh the pros and cons of the technology. “It raised a lot of questions about privacy and the lack of a clear definition of the word emotion, especially when it comes to emotion recognition,” the company wrote. “The same goes for the inability to reliably link facial expressions to people’s emotional states across different situations and regions.”
The service is being retired to limit the risk of misuse. New users can no longer purchase it, and existing customers will retain access until June 30, 2023.
Facial recognition systems have faced criticism for some time
Experts have long been critical of the results of face-reading software. For example, it can lead to discrimination by employers when hiring new staff. In addition, US and European lawmakers are debating the legality of the technology.
IBM stopped developing facial recognition two years ago. According to the company, the systems are often biased and can be used for mass surveillance and ethnic profiling.
While Microsoft is discontinuing some facial recognition capabilities, others will remain available. One example is Seeing AI, an app for blind and visually impaired people that uses image recognition to describe the user’s surroundings.
Source: NU