Google A.I. Sorts Birdsong


Google has set up a playground for machine learning and AI under AI Experiments. I already knew a few of the experiments from their developers (and had blogged some of them in their rudimentary form, Quick Draw for instance), but the versions here are slicker and more polished.

The prettiest one right off the bat, because it's the most relaxing, is the sorting of thousands of bird calls, and a bit of chill is something we could all use right about now:

Bird sounds vary widely. This experiment uses machine learning to organize thousands of bird sounds. The computer wasn’t given tags or the birds’ names – only the audio. Using a technique called t-SNE, the computer created this map, where similar sounds are placed closer together.
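
The core idea behind that map (and behind The Infinite Drum Machine below) can be sketched in a few lines: compute one feature vector per clip, then let t-SNE squeeze those vectors into two dimensions so that similar sounds land close together. A minimal sketch, assuming mean MFCC features via librosa and scikit-learn's t-SNE; the features and parameters Google actually used aren't specified, and the birdsounds/ folder is hypothetical:

```python
# Sketch of the t-SNE step: audio clips -> feature vectors -> 2-D map.
import glob
import numpy as np
import librosa
from sklearn.manifold import TSNE

def features(path: str) -> np.ndarray:
    """Summarize one clip as the mean of its MFCC frames (assumed feature choice)."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

paths = sorted(glob.glob("birdsounds/*.wav"))   # hypothetical folder of clips
X = np.stack([features(p) for p in paths])      # shape: (n_clips, 20)

# t-SNE maps the 20-D feature vectors down to 2-D, keeping similar sounds close.
xy = TSNE(n_components=2, perplexity=30, init="pca",
          random_state=0).fit_transform(X)

for p, (x, y) in zip(paths, xy):
    print(f"{p}: ({x:.1f}, {y:.1f})")           # coordinates for the map
```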

Also in the lineup:

Giorgio Cam: „This is an experiment built with machine learning that lets you make music with the computer just by taking a picture. It uses image recognition to label what it sees, then it turns those labels into lyrics of a song.“

The Infinite Drum Machine: „Sounds are complex and vary widely. This experiment uses machine learning to organize thousands of everyday sounds. The computer wasn’t given any descriptions or tags – only the audio. Using a technique called t-SNE, the computer placed similar sounds closer together. You can use the map to explore neighborhoods of similar sounds and even make beats using the drum sequencer.“

Quick, Draw!: „This is a game built with machine learning. You draw, and a neural network tries to guess what you’re drawing. Of course, it doesn’t always work. But the more you play with it, the more it will learn. It’s just one example of how you can use machine learning in fun ways.“

Visualizing High-Dimensional Space: „This experiment helps visualize what’s happening in machine learning. It allows coders to see and explore their high-dimensional data. The goal is to eventually make this an open-source tool within TensorFlow, so that any coder can use these visualization techniques to explore their data.“

What Neural Networks See: „This experiment lets you turn on your camera to explore what neural nets see, live, using your camera. Watch the video explainer above to see how each layer of the neural net works.“

Thing Translator: „This experiment lets you take a picture of something to hear how to say it in a different language. It’s just one example of what you can make using Google’s machine learning API’s, without needing to dive into the details of machine learning.“

A.I. Duet: „This experiment lets you make music through machine learning. A neural network was trained on many example melodies, and it learns about musical concepts, building a map of notes and timings. You just play a few notes, and see how the neural net responds. We’re working on putting the experiment on the web so that anyone can play with it. In the meantime, you can get the code and learn about it by watching the video above.“
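
To make the A.I. Duet idea a bit more concrete: at its core there is a sequence model that takes the notes you played and samples a plausible continuation. The real experiment is built on Google's Magenta work; the toy LSTM below is only a sketch under my own assumptions (pitch-only tokens, no timing, untrained weights), not the actual model:

```python
# Sketch of the call-and-response idea: feed the played notes through a tiny
# LSTM, then sample a short continuation note by note.
import torch
import torch.nn as nn

N_PITCHES = 128  # MIDI pitch range 0-127

class MelodyNet(nn.Module):
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.embed = nn.Embedding(N_PITCHES, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, N_PITCHES)

    def forward(self, notes, state=None):
        x, state = self.lstm(self.embed(notes), state)
        return self.head(x), state

def respond(model: MelodyNet, played: list[int], length: int = 8) -> list[int]:
    """Run the user's notes through the net, then sample `length` response notes."""
    logits, state = model(torch.tensor([played]))
    out, last = [], logits[:, -1]
    for _ in range(length):
        nxt = torch.distributions.Categorical(logits=last).sample()
        out.append(int(nxt))
        logits, state = model(nxt.unsqueeze(0), state)
        last = logits[:, -1]
    return out

model = MelodyNet()  # untrained here; would be fit on example melodies
print(respond(model, [60, 62, 64, 65]))  # play C D E F, get eight sampled notes back
```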
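And going back up the list: Giorgio Cam and Thing Translator share the same first step, plain image classification. Run a picture through a pretrained model, keep the top labels, and those labels then become lyrics or get translated. A minimal sketch using a torchvision ImageNet classifier as a stand-in; Google's experiments use their own vision APIs, and the model choice, top_k, and photo.jpg here are my assumptions:

```python
# Sketch of the image -> labels step, with a pretrained ImageNet classifier.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet50_Weights.IMAGENET1K_V2
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()  # resize/crop/normalize as the model expects

def labels_for(image_path: str, top_k: int = 3) -> list[str]:
    """Return the top_k ImageNet labels for an image."""
    img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = model(img).softmax(dim=1)[0]
    top = probs.topk(top_k)
    return [weights.meta["categories"][i] for i in top.indices]

print(labels_for("photo.jpg"))  # e.g. ['acoustic guitar', 'stage', 'microphone']
```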