Microsoft will stop selling facial recognition tools that identify emotions


Microsoft said Tuesday it will stop selling technology that guesses someone’s emotion based on a facial image and will no longer provide unlimited access to facial recognition technology.

These actions reflect efforts by major cloud providers to rein in sensitive technologies on their own, as lawmakers in the United States and Europe continue to weigh comprehensive legal limits.

For at least the last year, Microsoft has been examining whether emotion recognition systems are rooted in science.


“These efforts have raised significant questions about privacy, the lack of consensus on a definition of ’emotions,’ and the inability to generalize the link between facial expression and emotional state across use cases, regions and demographics,” Sarah Bird, senior product group lead for Microsoft’s Azure AI unit, said in a blog post.

Existing customers will have a year before losing access to artificial intelligence tools that claim to infer emotion, gender, age, smiles, facial hair, hair and makeup.


Last year, Alphabet’s Google Cloud embarked on a similar assessment, first reported by Reuters. Google blocked 13 planned emotions from its emotion-reading tool and placed four existing ones, such as joy and sorrow, under review. The company was weighing a new system that would describe movements such as frowning and smiling without trying to attach them to an emotion.

Google did not immediately respond to a request for comment on Tuesday.


Microsoft also said customers must now obtain permission to use its facial recognition services, which can allow users to log into websites or open locked doors through a face scan.

The company urged customers to avoid situations that invade privacy or in which the technology might struggle, such as identifying minors, but did not explicitly prohibit such uses.
