neural network

10 posts
Neural network creates images from text

OpenAI trained a neural network they call DALL·E on a dataset of text and image pairs. The trained network takes text input and generates matching images, even for odd combinations of descriptors and objects, like a purse in the style of a Rubik’s cube or a teapot imitating Pikachu. Tags: images, neural network, OpenAI, text

Neural network generates convincing songs by famous singers

Jukebox from OpenAI is a generative model that makes music in the styles of many artists you’ll probably recognize: To train this model, we crawled the web to curate a new dataset of 1.2 million songs (600,000 of which are in English), paired with the corresponding lyrics and metadata from LyricWiki. The metadata includes artist, album genre, and year of the songs, along with common moods or playlist keywords...

Text-to-speech models trained on celebrity voices

The Vocal Synthesis channel on YouTube trains text-to-speech models using publicly available celebrity voices. Then, using these computer-generated voices, the celebrities “recite” various scripts. For example, one clip has Jay-Z rapping the “To be, or not to be” soliloquy from Hamlet, but it’s not really him. Find out more about the voice generation method, which was developed in 2017. Maybe more interesting, Jay-Z recently filed a copyright claim against the...

Neural networks to generate music

Kyle McDonald describes some of the history and current research on using algorithms to generate music. On how David Cope incorporated Markov chains to aid his work: In 1981, David Cope began working with algorithmic composition to solve his writer’s block. He combined Markov chains and other techniques (musical grammars and combinatorics) into a semi-automatic system he calls Experiments in Musical Intelligence, or Emmy. David cites Iannis Xenakis and...
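To make the Markov-chain idea concrete, here is a minimal sketch (my own toy example, not Cope’s actual system): count which note tends to follow which in a source melody, then sample a new melody by walking those transition counts.

```python
# Toy first-order Markov chain over notes, in the spirit of (but much simpler
# than) systems like Emmy. The source melody below is made up for illustration.
import random
from collections import defaultdict

def train_transitions(melody):
    """Record which notes were observed to follow each note."""
    transitions = defaultdict(list)
    for current, following in zip(melody, melody[1:]):
        transitions[current].append(following)
    return transitions

def generate(transitions, start, length=16):
    """Walk the chain: each next note is drawn from the notes that
    followed the current note in the source melody."""
    notes = [start]
    for _ in range(length - 1):
        options = transitions.get(notes[-1])
        if not options:
            break
        notes.append(random.choice(options))
    return notes

source = ["C4", "D4", "E4", "C4", "E4", "F4", "G4", "E4", "D4", "C4"]
model = train_transitions(source)
print(generate(model, start="C4"))
```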

Algorithmic art shows what the machine sees

Tom White is an artist who uses neural networks to draw abstract pictures of objects. What looks blobby and fuzzy to us looks more concrete to the machine. James Vincent for The Verge: That “voice” is actually a series of algorithms that White has dubbed his “Perception Engines.” They take the data that machine vision algorithms are trained on — databases of thousands of pictures of objects — and distill...

Neural networks to communicate with Alexa devices using sign language

Many have found Amazon’s Alexa devices to be helpful in their homes, but if you can’t physically speak, it’s a challenge to communicate with these things. So, Abhishek Singh used TensorFlow to train a program to recognize sign language and communicate with Alexa without voice. Nice. Tags: Alexa, neural network, sign language, TensorFlow
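For a sense of what that kind of training could look like, here is a rough TensorFlow sketch. The layer sizes, number of gestures, and data below are placeholders rather than Singh’s actual setup: a small convolutional network maps camera frames to a fixed set of sign labels, and the predicted label could then be forwarded to Alexa as text or synthesized speech.

```python
# Hypothetical sign-language classifier sketch (not the original project's code).
import numpy as np
import tensorflow as tf

NUM_SIGNS = 10  # assumed number of gestures to recognize

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_SIGNS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random arrays standing in for labeled webcam frames of signed gestures.
frames = np.random.rand(100, 64, 64, 3).astype("float32")
labels = np.random.randint(0, NUM_SIGNS, size=100)
model.fit(frames, labels, epochs=1, verbose=0)

# At inference time, the predicted sign would be turned into a request for Alexa.
print(model.predict(frames[:1], verbose=0).argmax())
```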

Knitters and the neural network-trained machine

Janelle Shane, who likes to play with output from neural networks, teamed up with knitters in a discussion forum to produce abstract designs. Shane generates the knitting patterns, and the knitters bring the computer output to life. She calls the project SkyKnit. The neural network produces slightly flawed instructions, but the knitters can figure things out: Knitters are very good at debugging patterns, as it turns out. Not only are...

Here’s what you get when you cross dinosaurs and flowers with deep learning

Neural networks have proven useful for a number of things, but here is an especially practical use case. Chris Rodley used neural networks to create a hybrid of a dinosaur book and a flower book. The world may never be the same again. Tags: dinosaurs, flowers, neural network

What a neural network sees

Neural networks can feel like a black box, because, well, for most people they are. Supply input and a computer spits out results. The trouble with not understanding what goes on under the hood is that it’s hard to improve on what we know. It’s also a problem when someone uses the tech for malicious purposes, as people are prone to do. So, folks from Google Brain break down the...
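One common way to peek under the hood, sketched below with assumed details (a pretrained MobileNetV2 and an arbitrarily chosen middle layer, not necessarily the approach in the Google Brain piece), is to pull out an intermediate layer’s activations and inspect what the network responds to for a given input.

```python
# Probe an intermediate layer of a pretrained image classifier.
# The specific model and layer index are arbitrary choices for illustration.
import numpy as np
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(weights="imagenet", include_top=True)
layer = base.get_layer(index=50)  # some mid-network layer
probe = tf.keras.Model(inputs=base.input, outputs=[layer.output, base.output])

image = np.random.rand(1, 224, 224, 3).astype("float32")  # placeholder for a real photo
activations, predictions = probe.predict(image, verbose=0)
print(activations.shape)  # the feature maps that layer "sees" for this input
```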

Sentence gradients to see the space between two sentences

In a project he calls Sentence Space, Robin Sloan implemented a neural network so that you can enter two sentences and get a gradient of the sentences in between. I’d never even bothered to imagine an interpolation between sentences before encountering the idea in a recent academic paper. But as soon as I did, I found it captivating, both for the thing itself—a sentence… gradient?—and for the larger artifact it...
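The underlying idea is interpolation in a learned latent space. Here is a minimal sketch with placeholder encode and decode functions (Sloan’s project uses a trained autoencoder, which is not reproduced here): embed both sentences as vectors, take evenly spaced points along the line between them, and decode each point back into a sentence.

```python
# Sentence-gradient sketch with stand-in encoder/decoder functions.
import numpy as np

def encode(sentence):
    """Placeholder: a real system maps text to a learned latent vector."""
    rng = np.random.default_rng(abs(hash(sentence)) % (2**32))
    return rng.standard_normal(128)

def decode(vector):
    """Placeholder: a real system maps a latent vector back to text."""
    return f"<decoded sentence, latent norm {np.linalg.norm(vector):.2f}>"

def sentence_gradient(a, b, steps=7):
    """Linearly interpolate between two sentences in latent space."""
    va, vb = encode(a), encode(b)
    return [decode((1 - t) * va + t * vb) for t in np.linspace(0, 1, steps)]

for line in sentence_gradient("It was a dark and stormy night.",
                              "The sun rose over a quiet sea."):
    print(line)
```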
