A sprint through some machine learning paradigms in TensorFlow, in the form of colab notebooks | Nov 2021

There are a ton of great code samples out there for various aspects of machine learning, but a couple more can’t hurt, right? I made these to go over with interested students in Dani Bassett’s group, so there’s a focus on building intuition and making some things off the beaten path (like a recurrent network that classifies MNIST digits from just a random walk over the pixels, or a color space compression optimized for different classification datasets). Take a look if you’re so inclined — they’re easy to open in Google Colab through the GitHub option — and feel free to give feedback!

Sure, you can classify MNIST digits if you see the whole image at once, but what if you’re stuck observing one pixel at a time while hitched to a random walker? The bottom of each image shows the class predictions evolving as the walk runs its course, traced out by the revealed pixels. Train your own in colab 6!
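The core idea can be sketched in a few lines — this is my own minimal version, not the notebook’s exact code (the function name and architecture here are illustrative): turn each image into a sequence of (row, column, intensity) triples along a random walk, then let an LSTM emit a running class prediction at every step.

```python
import numpy as np
import tensorflow as tf

def random_walk_sequence(image, steps=200, seed=0):
    """Reveal an image one pixel at a time along a random walk.

    Returns a (steps, 3) array of (row, col, intensity) triples --
    the only information the recurrent classifier gets to see.
    """
    rng = np.random.default_rng(seed)
    h, w = image.shape
    r, c = int(rng.integers(h)), int(rng.integers(w))
    seq = []
    for _ in range(steps):
        seq.append((r / h, c / w, image[r, c]))
        # take one step in a random direction, staying inside the image
        r = int(np.clip(r + rng.integers(-1, 2), 0, h - 1))
        c = int(np.clip(c + rng.integers(-1, 2), 0, w - 1))
    return np.array(seq, dtype=np.float32)

# An LSTM reads the walk; return_sequences=True gives a prediction
# after every revealed pixel, which is what the animations plot.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(None, 3)),
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.Dense(10, activation="softmax"),
])
```

Feeding normalized coordinates alongside the intensity is what lets the network build up a spatial picture from a purely sequential view.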

Of course, I had to show how to make my favorite network visualizations! Try it yourself in the full version, colab 1, or a shortened one that gets right to the visualizations (including full-color ‘quilts’!).
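One way to build a full-color ‘quilt’ is to tile a layer’s RGB convolution kernels into a single image — this is a sketch of that idea under my own naming (`filter_quilt` is not from the notebook):

```python
import numpy as np

def filter_quilt(conv_weights, pad=1):
    """Tile first-layer conv kernels into one 'quilt' image.

    conv_weights: (k, k, 3, n_filters) array; each RGB kernel is
    normalized to [0, 1] and placed in a roughly square grid.
    """
    k, _, _, n = conv_weights.shape
    cols = int(np.ceil(np.sqrt(n)))
    rows = int(np.ceil(n / cols))
    quilt = np.ones((rows * (k + pad) + pad, cols * (k + pad) + pad, 3))
    for i in range(n):
        f = conv_weights[..., i]
        f = (f - f.min()) / (f.max() - f.min() + 1e-8)  # per-filter rescale
        r, c = divmod(i, cols)
        y, x = pad + r * (k + pad), pad + c * (k + pad)
        quilt[y:y + k, x:x + k] = f
    return quilt

# For example, with a pretrained network's first conv layer:
# w = tf.keras.applications.ResNet50(weights="imagenet") \
#         .get_layer("conv1_conv").get_weights()[0]
# plt.imshow(filter_quilt(w))
```

Because the first layer sees raw RGB, its kernels are directly viewable as tiny color patches; deeper layers need the gradient tricks described below.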


In colab 3 we use gradients at the pixel level to create some cool images; here the original images are tweaked to maximize different features in the 23rd layer of a ResNet50 (a fairly early conv layer, so low-level patterns get accentuated).
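The mechanics are simple gradient ascent on the input pixels. Here is a minimal sketch, not the notebook’s code: I use `weights=None` to avoid the ImageNet download (the notebook presumably uses pretrained weights), and which layer index 23 lands on depends on the Keras version.

```python
import tensorflow as tf

# Maximize the mean activation of one feature map of an intermediate
# ResNet50 layer by gradient ascent on the input pixels.
base = tf.keras.applications.ResNet50(weights=None, include_top=False)
layer = base.layers[23]  # an early-ish conv layer
feature_model = tf.keras.Model(base.input, layer.output)

@tf.function
def ascent_step(image, channel, lr=1.0):
    with tf.GradientTape() as tape:
        tape.watch(image)
        activation = feature_model(image)
        loss = tf.reduce_mean(activation[..., channel])
    grad = tape.gradient(loss, image)
    grad /= tf.norm(grad) + 1e-8   # normalize for stable step sizes
    return image + lr * grad       # ascend the feature, not descend

image = tf.random.uniform((1, 224, 224, 3))
for _ in range(5):                 # many more steps in practice
    image = ascent_step(image, channel=0)
```

Starting from one of the original photos instead of noise, and using a small learning rate, gives the “tweaked image” effect shown above.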

From these GAN-generated images of cars it’s clear I’m no GAN guru, so take this all with a heavy grain of salt. Still, I think it’s amazing to see even imperfect car features emerge from pixel goop as training progresses. Try it yourself in colab 5, though a word of caution: I found it quite frustrating to hunt for decent hyperparameters when you can only train one model at a time (as opposed to the spoiled experience I had at Google, sending off jobs that scanned dozens or even hundreds of hyperparameter settings at once).
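For orientation, the overall shape of such a setup looks roughly like this DCGAN-style sketch — my own toy version for 32×32 images, with made-up layer sizes, not the notebook’s architecture or hyperparameters:

```python
import tensorflow as tf

latent_dim = 128

# Generator: latent vector -> 32x32 RGB image via transposed convs.
generator = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(latent_dim,)),
    tf.keras.layers.Dense(4 * 4 * 256),
    tf.keras.layers.Reshape((4, 4, 256)),
    tf.keras.layers.Conv2DTranspose(128, 4, strides=2, padding="same", activation="relu"),
    tf.keras.layers.Conv2DTranspose(64, 4, strides=2, padding="same", activation="relu"),
    tf.keras.layers.Conv2DTranspose(3, 4, strides=2, padding="same", activation="tanh"),
])

# Discriminator: image -> single real/fake logit.
discriminator = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(64, 4, strides=2, padding="same", activation=tf.nn.leaky_relu),
    tf.keras.layers.Conv2D(128, 4, strides=2, padding="same", activation=tf.nn.leaky_relu),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1),
])

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)
d_opt = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)

@tf.function
def train_step(real_images):
    noise = tf.random.normal((tf.shape(real_images)[0], latent_dim))
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake = generator(noise, training=True)
        real_logits = discriminator(real_images, training=True)
        fake_logits = discriminator(fake, training=True)
        # discriminator: real -> 1, fake -> 0; generator: fool it.
        d_loss = bce(tf.ones_like(real_logits), real_logits) + \
                 bce(tf.zeros_like(fake_logits), fake_logits)
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    return g_loss, d_loss
```

Nearly all of the pain I mention above lives in the handful of numbers here — learning rates, `beta_1`, layer widths — which is exactly why scanning them in parallel is such a luxury.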
