Truecaller’s AI assistant can now clone your voice to answer calls


Dialer photo

Joe Hindy / Android Authority

TL;DR

  • Truecaller’s AI Assistant offers features like automatic call answering, message taking, and spam detection.
  • Users can now train the AI Assistant to answer calls using a digital clone of their voice.
  • Previously, users had to choose from seven pre-set digital voices for the assistant.

Truecaller, the popular dialer and spam-blocking app, has introduced a significant upgrade to its AI Assistant feature. The app now lets paid users create a digital clone of their voice to handle incoming calls. This innovative, if mildly discomforting, development stems from Truecaller’s collaboration with Microsoft and uses the new Personal Voice technology from Azure AI Speech.

Truecaller’s AI Assistant, initially launched in September 2022, boasts an array of AI-driven capabilities, such as automatic call answering, screening, message taking, and call recording. It also engages callers to determine the purpose of their call, filtering out spam with a claimed accuracy exceeding 90%. Previously, users could choose from seven pre-set digital voices for their assistant.


What’s new is that users can now generate a personalized digital voice clone for the same purpose. After giving their consent, users record a few seconds of a script in their own voice to create the digital copy. To maintain transparency, however, Truecaller has restricted the ability to customize the initial greeting when the personal voice option is in use, ensuring callers know they are talking to a “digital” version of the user.
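For the curious, the underlying Azure AI Speech Personal Voice feature works by mapping a short consented recording to a speaker profile ID, which is then referenced in the SSML sent to the text-to-speech service. The sketch below builds such an SSML request in Python; the profile ID, voice name, and greeting text are all placeholders for illustration, not Truecaller's actual implementation.

```python
# Hypothetical sketch: building an SSML request that asks Azure AI Speech
# to synthesize text with a "personal voice" (cloned) speaker profile.
# The speakerProfileId below is a placeholder; a real one is issued after
# the user records a short, consented voice sample.

def build_personal_voice_ssml(speaker_profile_id: str, text: str) -> str:
    """Returns SSML referencing a personal-voice speaker profile."""
    return (
        "<speak version='1.0' xmlns='http://www.w3.org/2001/10/synthesis' "
        "xmlns:mstts='http://www.w3.org/2001/mstts' xml:lang='en-US'>"
        "<voice name='DragonLatestNeural'>"  # base model name (assumption)
        f"<mstts:ttsembedding speakerProfileId='{speaker_profile_id}'>"
        f"{text}"
        "</mstts:ttsembedding>"
        "</voice></speak>"
    )

# Placeholder profile ID and greeting for demonstration only.
ssml = build_personal_voice_ssml(
    "00000000-0000-0000-0000-000000000000",
    "Hi, this is a digital version of me. How can I help?",
)
print(ssml)

# With real credentials, the request could then be sent via the Azure
# Speech SDK, roughly like:
#   import azure.cognitiveservices.speech as speechsdk
#   config = speechsdk.SpeechConfig(subscription="KEY", region="REGION")
#   synthesizer = speechsdk.SpeechSynthesizer(speech_config=config)
#   synthesizer.speak_ssml_async(ssml).get()
```

Note how the greeting text is fixed in the SSML rather than caller-editable, which mirrors the transparency restriction Truecaller describes.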

The rollout of the Personal Voice feature is being staggered across different regions, commencing with the USA, Canada, Australia, South Africa, India, Sweden, and Chile. It is expected to become available in other countries soon.

While this advancement offers a unique level of personalization, it also raises questions about the potential misuse of voice-cloning technology. Myriad AI voice-generating apps are already available, and reports of scammers in India using AI to mimic the voices of acquaintances and trick people out of money are increasingly common.

As the line between genuine and synthetic content blurs, one can’t help but wonder if all this AI progress is happening too fast, outpacing our ability to grasp the potential consequences fully.

