First, OpenAI offered a tool that allowed people to create digital images simply by describing what they wanted to see. Then, it built similar technology that generated full-motion video like something out of a Hollywood movie.

Now, it has unveiled technology that can recreate someone's voice.

The high-profile A.I. start-up said on Friday that a small group of businesses was testing a new OpenAI system, Voice Engine, that can recreate a person's voice from a 15-second recording. If you upload a recording of yourself and a paragraph of text, it can read the text using a synthetic voice that sounds like yours.

The text does not have to be in your native language. If you are an English speaker, for example, it can recreate your voice in Spanish, French, Chinese or many other languages.

OpenAI is not sharing the technology more broadly because it is still trying to understand its potential dangers. Like image and video generators, a voice generator could help spread disinformation across social media. It could also allow criminals to impersonate people online or during phone calls.

The company said it was particularly worried that this kind of technology could be used to break voice authenticators that control access to online banking accounts and other personal applications.

"This is a sensitive thing, and it is important to get it right," an OpenAI product manager, Jeff Harris, said in an interview.

The company is exploring ways of watermarking synthetic voices or adding controls that prevent people from using the technology with the voices of politicians or other prominent figures.

Last month, OpenAI took a similar approach when it unveiled its video generator, Sora. It showed off the technology but did not publicly release it.

OpenAI is among the many companies that have developed a new breed of A.I. technology that can quickly and easily generate synthetic voices. They include tech giants like Google as well as start-ups like the New York-based ElevenLabs. (The New York Times has sued OpenAI and its partner, Microsoft, on claims of copyright infringement involving artificial intelligence systems that generate text.)

Businesses can use these technologies to generate audiobooks, give voice to online chatbots or even build an automated radio station D.J. Since last year, OpenAI has used its technology to power a version of ChatGPT that speaks. And it has long offered businesses an array of voices that can be used for similar applications. All of them were built from clips provided by voice actors.

But the company has not yet offered a public tool that would allow individuals and businesses to recreate voices from a short clip as Voice Engine does. The ability to recreate any voice in this way, Mr. Harris said, is what makes the technology dangerous. It could be particularly dangerous in an election year, he said.

In January, New Hampshire residents received robocall messages that discouraged them from voting in the state primary in a voice that was most likely artificially generated to sound like President Biden. The Federal Communications Commission later outlawed such calls.

Mr. Harris said OpenAI had no immediate plans to make money from the technology. He said the tool could be particularly useful to people who had lost their voices through illness or accident.

He demonstrated how the technology had been used to recreate a woman's voice after brain cancer damaged it. She could now speak, he said, after providing a brief recording of a presentation she had once given as a high school student.
