Google’s New Music Technology Is Like a Real-Life Daft Punk

Photo: Adam Berry (Getty Images)

Google is getting into the music business. In what sounds like the plot of a Silicon Valley episode, the tech giant has revealed that it is developing a new program to synthesize sound using machine learning techniques, or in human speak, artificial intelligence (a.k.a. robots).

The real-life Daft Punk program is called “NSynth” (Neural Synthesizer) and is the brainchild of Google’s Magenta team, a group of AI researchers charged with finding out whether AI is capable of creating music.


NSynth uses deep neural networks to generate sounds at the level of individual samples, unlike, say, a traditional synthesizer, which generates audio from hand-designed components like oscillators and wavetables.
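
To see the difference, here is a minimal sketch (not Google’s code, and assuming NumPy plus a 16 kHz sample rate) of the “hand-designed” approach: a traditional synth builds its sound from components like this sine oscillator, whereas NSynth’s neural network learns to predict each of those individual samples itself. The names `SAMPLE_RATE` and `sine_oscillator` are purely illustrative.

```python
import numpy as np

SAMPLE_RATE = 16000  # illustrative sample rate in Hz


def sine_oscillator(freq_hz: float, duration_s: float, amplitude: float = 0.5) -> np.ndarray:
    """Generate raw audio samples from a single hand-designed sine oscillator."""
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    return amplitude * np.sin(2 * np.pi * freq_hz * t)


# One second of A4 (440 Hz): an array of 16,000 individual samples,
# each of which a model like NSynth would instead learn to generate directly.
note = sine_oscillator(440.0, 1.0)
print(note.shape)  # (16000,)
```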

Musicians could then take the data NSynth provides to explore never-before-heard sounds that would be difficult or impossible to produce with a hand-tuned synthesizer. Check out a few of their samples below:

If this 2001: A Space Odyssey scenario stirs up the Luddite in you, Google released a statement to ease your fears (for now):

“Part of the goal of Magenta is to close the loop between artistic creativity and machine learning, so stay tuned for upcoming releases that will enable you to make your own music with these technologies.”

For Google’s full statement on NSynth, go HERE.
