Microsoft Teams AI could tell you who is most enjoying your video call


Teacher giving a lesson from home

Online teaching can be difficult without feedback

Nathan Stirk/Getty Images

Microsoft has developed an artificial intelligence for its Teams videoconferencing software that aims to put people presenting a remote talk more at ease by highlighting the most positive audience reactions.

The AI, named AffectiveSpotlight, identifies participants' faces and uses a neural network to classify their expressions into emotions such as sadness, happiness and surprise, and to spot movements like head shaking and nodding. It also uses an eyebrow detection system to spot confusion, in the form of a furrowed brow.

Each expression is rated between 0 and 1, with positive responses scoring higher. Every 15 seconds, the AI highlights the person with the highest score in that time period to the presenter.
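The selection step described above can be sketched in a few lines. This is a minimal illustration, not Microsoft's implementation: the function and variable names are invented, and the scores stand in for the neural network's per-participant affect ratings in [0, 1].

```python
# Hypothetical sketch of AffectiveSpotlight's selection logic.
# Assumes each participant has already been assigned an affect score
# between 0 and 1 for the current 15-second window (higher = more positive).

def pick_spotlight(window_scores: dict[str, float]) -> str:
    """Return the participant with the highest affect score in the window."""
    return max(window_scores, key=window_scores.get)

# Example window of scores (values are made up for illustration):
scores = {"alice": 0.42, "bob": 0.91, "carol": 0.67}
print(pick_spotlight(scores))  # prints "bob", the highest-scoring participant
```

In the real system this selection would run once per 15-second window, with the chosen participant's video highlighted to the presenter.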

A Microsoft Research spokesperson told New Scientist that "spotlighting audience responses makes the presenter more aware of their audience and achieves a communicative feedback loop". The research team declined an interview.

In a survey of 175 people conducted by the team, 83 per cent of those who give presentations said they often miss relevant audience feedback when presenting online – particularly non-verbal social cues.

To see whether AffectiveSpotlight could help address this problem, the team tested it against software that highlighted audience members at random. AffectiveSpotlight highlighted only 40 per cent of participants during talks, compared with 87 per cent for the random software. Speakers reported feeling more positive about presenting with AffectiveSpotlight, though audience members couldn't discern any difference in the quality of presentations given with the AI.

Rua M. Williams at Purdue University, Indiana, queries whether the AI is much use. "It is certainly dubious at best that any interpretation based on just audio or video, or both, is ever accurate," they say.

Williams also worries that relying on AI to parse human emotions – which are more complicated than they may first appear – is troublesome. "While some studies like this one may mention issues of privacy and consent, none ever account for how someone might contest an inaccurate interpretation of their affect."

Reference: arxiv.org/abs/2101.12284
