Last week I saw Cathy O’Neil talk at Harvard:

Part of the talk was on how algorithms – and black box algorithms, in particular – can create unfair outcomes. O’Neil goes into this topic in much more detail (but also in a very easy to read and understand way) in her book *Weapons of Math Destruction*.

The talk was part of a conference honoring Harvard math professor Barry Mazur who was O’Neil’s PhD advisor. At the end of the talk one of the questions from the audience was (essentially): What can someone who has a focus on academic math do to help the public understand some of the problems inherent in the algorithms that shape our lives?

O’Neil said (again, essentially) that a good approach would be to find ways to communicate the mathematical ideas to the public in ways that were not “gobbledygook.”

Although I’m not an academic mathematician, this exchange was on my mind and I decided to try out a simple idea that I hoped would help the boys understand how small changes can lead to very unequal outcomes. There are no equations in this project, just our new ball dropping machine.

First I asked the boys to look at the results of several trials of the machine dropping balls and tell me what they saw. As always, it is really interesting to hear how kids describe mathematical ideas:

Next I tilted the board a bit by putting a thin piece of plastic under one side. I asked the boys to guess what would happen to the ball distribution now. They gave their guesses and we looked at what happened.

One nice thing was that my younger son noticed that the tails of the distribution changed quite a bit, but the overall distribution changed less than he was expecting:

I’m sorry this part ran long, but hopefully it was a good conversation.

To finish up the project I tried to connect the changes in the tails of the distribution with some of the ideas that O’Neil talked about on Thursday. One thing that I really wanted to illustrate was how small changes in our machine (a small tilt) led to large changes in the tails of our distribution.

I hope this project is a useful way to illustrate one of O’Neil’s main points to kids. Algorithms can create unfairness in ways that are hard to detect. Even a small “tilt” that doesn’t appear to impact the overall distribution very much can lead to big changes in the tails. If we are making decisions in the tails – admitting the “top” 10% of kids into a school, firing the “bottom” 10% of employees, or trying to predict future behavior of a portion of a population, say – that small tilt can be magnified tremendously.
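If you want to see this effect without building the machine, here is a small simulation sketching the same idea. It models the ball dropper as a Galton-board-style process: each ball bounces right with some probability at each row of pegs, and a small “tilt” nudges that probability a little past 1/2. The function names and parameters here are my own choices for illustration, not measurements from our actual machine.

```python
import random

random.seed(0)

def drop_balls(n_balls, n_rows, p_right):
    """Simulate a Galton-board-style machine: each ball bounces right
    with probability p_right at each of n_rows pegs, so its final bin
    is the number of rightward bounces (0 through n_rows)."""
    counts = [0] * (n_rows + 1)
    for _ in range(n_balls):
        bin_idx = sum(random.random() < p_right for _ in range(n_rows))
        counts[bin_idx] += 1
    return counts

n_balls, n_rows = 100_000, 10
level = drop_balls(n_balls, n_rows, 0.50)   # level board
tilted = drop_balls(n_balls, n_rows, 0.55)  # a small "tilt"

# The middle of the distribution barely moves, but the far-right
# bin (the extreme tail) changes by a large factor.
for name, counts in [("level", level), ("tilted", tilted)]:
    middle = sum(counts[4:7]) / n_balls
    far_right = counts[n_rows] / n_balls
    print(f"{name:>6}: middle bins {middle:.1%}, far-right bin {far_right:.3%}")
```

Running it, the three middle bins hold roughly the same share of balls either way, while the far-right bin roughly doubles or triples under the tilt, which is exactly the “decisions made in the tails get magnified” point.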

It may not be so easy for kids to understand the math behind the distributions or the ways the distributions change, but they can understand the idea when they see the balls dropping in this little machine.
