Association for the Advancement of Artificial Intelligence | June 2022
In this paper, we present a novel (and troubling) finding that well-known automatic speech recognition (ASR) systems may produce text content highly inappropriate for kids while transcribing YouTube Kids' videos. We dub this phenomenon inappropriate content hallucination. Our analyses suggest that such hallucinations are far from occasional, and the ASR systems often produce them with high confidence. We release a first-of-its-kind dataset of audio clips for which existing state-of-the-art ASR systems hallucinate content inappropriate for kids. In addition, we demonstrate that some of these errors can be fixed using language models.
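The abstract notes that language models can fix some of these ASR errors. One common way a language model is applied to ASR output is rescoring: when the recogniser offers several candidate transcripts, the language model picks the hypothesis it finds most plausible, which can steer the system away from out-of-domain (and potentially inappropriate) words. The sketch below is purely illustrative and is not the authors' method; the toy corpus, candidate sentences, smoothing constant, and vocabulary size are all invented for the example, and a real system would use a large pretrained language model instead of a hand-built bigram model.

```python
import math

# Toy bigram language model trained on a tiny "child-safe" corpus.
# Illustrative only: a real rescorer would use a large pretrained LM.
corpus = "the little duck swims in the pond the little duck quacks".split()

bigrams = {}
unigrams = {}
for w1, w2 in zip(corpus, corpus[1:]):
    bigrams[(w1, w2)] = bigrams.get((w1, w2), 0) + 1
    unigrams[w1] = unigrams.get(w1, 0) + 1

def log_prob(sentence, alpha=0.1, vocab_size=50):
    """Add-alpha smoothed bigram log-probability of a word sequence."""
    words = sentence.split()
    lp = 0.0
    for w1, w2 in zip(words, words[1:]):
        num = bigrams.get((w1, w2), 0) + alpha
        den = unigrams.get(w1, 0) + alpha * vocab_size
        lp += math.log(num / den)
    return lp

def pick_transcript(candidates):
    """Choose the ASR hypothesis the language model scores highest."""
    return max(candidates, key=log_prob)

# The in-domain hypothesis outscores the one with an unseen word.
best = pick_transcript(["the little duck swims", "the little truck swims"])
```

Here "truck" never occurs in the toy corpus, so the smoothed bigram score for that hypothesis is far lower and the rescorer prefers the in-domain transcript; the same mechanism, with a child-appropriate language model, can demote hallucinated inappropriate words.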
Sumeet Kumar is an Assistant Professor of Information Systems at the Indian School of Business (ISB). He studies problems at the intersection of technology and society. He is interested in analysing user behaviour, quantifying polarisation on online forums, and finding advertisements disguised as regular content on online platforms. His current focus is on identifying implicit or hidden advertisements in videos posted on children's platforms such as YouTube Kids.
Additionally, Professor Kumar has conducted research in software design and development, with particular emphasis on user experience. He has investigated the use of mobile phone sensors during emergencies to improve situational awareness. His study on the Wireless Emergency Alerts (WEA) service in the United States addressed several issues of critical importance to the effectiveness and adoption of emergency alerts. Notably, some of his research recommendations were included in the US Federal Communications Commission's (FCC) proposed changes to WEA.
He completed his undergraduate education at the Indian Institute of Technology (IIT) Kanpur. He holds two Master's degrees, one in Software Engineering and one in Machine Learning, both from Carnegie Mellon University, where he also earned his doctorate.
