Two weeks ago I saw an interesting lecture from Gil Strang at MIT about the math behind machine learning. Sharing some of those ideas with kids has been on my mind ever since. Today I finally got around to it!

We’ve done a few previous projects that touched on ways to make machine learning accessible to kids. The Martin Gardner hexapawn project is incredibly fun and accessible even to really young kids, and the second project below uses the same TensorFlow website that we played with today:

Today I began by asking the boys what they knew about machine learning, and then I explained a bit about classification problems:
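For readers who want a concrete picture, a classification problem just asks a program to sort inputs into categories. Here’s a toy Python sketch – the points and the inside-the-circle rule are made up for illustration, echoing the circle-shaped dataset you can play with on the TensorFlow site:

```python
# Toy classification problem: label each point in the plane by whether
# it falls inside the unit circle. A neural network would have to *learn*
# this rule from labeled examples; here we just state it directly.
points = [(0.2, 0.1), (1.5, 0.0), (-0.5, 0.5), (1.0, 1.0)]

def label(x, y):
    return "inside" if x**2 + y**2 < 1 else "outside"

for x, y in points:
    print((x, y), "->", label(x, y))
```

The hard (and interesting) part of machine learning is getting a program to discover a rule like this on its own, just from examples.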

Next I drew a (somewhat clumsy) picture of what a neural network might look like and gave an equally clumsy explanation of how one might work. My older son asked a really great question that gets at the difference between the Hexapawn game and how modern neural networks work – so we chatted about that for a bit.
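For anyone who wants to see the picture in code, here is a minimal sketch of the kind of network I drew on the board: two inputs, one small hidden layer, and one output. The weights below are random placeholder numbers – in a real network they would be learned from data, which is the whole point:

```python
import numpy as np

# A tiny neural network sketch: 2 inputs -> 3 hidden neurons -> 1 output.
# The weights are random stand-ins; training would adjust them.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 2))   # weights from inputs to hidden layer
b1 = np.zeros(3)                   # hidden-layer biases
W2 = rng.standard_normal((1, 3))   # weights from hidden layer to output
b2 = np.zeros(1)

def relu(x):
    # the "firing" rule for each hidden neuron
    return np.maximum(0, x)

def forward(point):
    hidden = relu(W1 @ point + b1)  # each hidden neuron fires (or not)
    return W2 @ hidden + b2         # the output combines the hidden neurons

print(forward(np.array([1.0, -1.0])))
```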

Then I talked about the so-called “relu” firing function for neurons.
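The relu function itself is about as simple as a function can be: it passes positive inputs through unchanged and sends everything else to zero. In Python:

```python
import numpy as np

def relu(x):
    # "relu" = rectified linear unit: max(0, x) applied to each entry
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# negative entries become 0; positive entries pass through unchanged
```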

Before moving on to the Tensorflow program, I wanted to spend a few minutes talking about an idea that Gil Strang mentioned in his lecture. That idea is the connection between folding and classification.
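A quick sketch of the folding idea, as I understood it from the lecture: combining relu functions can fold the number line (or the plane) onto itself, so points that start far apart land in the same place – which is exactly the kind of thing a classifier needs to do. For example, relu(x) + relu(-x) is just |x|, a fold of the line at 0:

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

# One relu "fold": relu(x) + relu(-x) equals |x|, which folds the
# number line in half at 0 -- x and -x land on the same spot,
# like folding a strip of paper.
def fold(x):
    return relu(x) + relu(-x)

for x in [-2.0, -1.0, 1.0, 2.0]:
    print(x, "->", fold(x))
```

Stacking layers of relu neurons repeats this trick, folding the input space again and again until the different classes are easy to separate.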

This idea, I think, helps make the classification problem accessible to kids.

Next up was playing with the TensorFlow program and exploring some basic classification examples:

Then I let the kids play with the program by themselves for about 15 minutes – here are a few of the ideas that they found interesting:

Machine learning is an incredibly popular and fast-growing area of math and computer science right now – the TensorFlow website is a great way to share some of its ideas with kids.