Choosing 3 million points on a 300-dimensional sphere

[sorry this post is a little sloppy – I had a hard stop at 7:00 pm and wanted to get it out the door.]

During the last day of my machine learning class we discussed “word2vec.” I’d heard of it before because of a couple of tweets and blog posts from Jordan Ellenberg. For example:

Jordan Ellenberg’s Messing Around with word2vec

I was reviewing this blog post during one of the breaks and stumbled on this passage:

During class I couldn’t figure out how to think through this problem, but on the bike ride home I had an idea that seemed to work.

I don’t actually know if I’ve arrived at a “close enough” answer by coincidence because higher-dimensional geometry is strange. Here are a few example projects that I’ve done with the boys, which range from fun to mind-blowing:

Did you know that there is a 30-60-90 triangle in a hyper-cube?

Carl Sagan on the 4th dimension

Using snap cubes to talk about the 4th dimension

Sharing 4d shapes with kids

Counting Geometric Properties in 4 and 6 dimensions

A Strange Problem I overheard Bjorn Poonen discussing

Bjorn Poonen’s n-dimensional sphere problem with kids

A fun surprise in Bjorn Poonen’s n-dimensional sphere problem

One strange thing that comes to mind immediately from the statement in Ellenberg’s blog is that it must somehow be hard to find vectors in higher dimensions that meet at angles close to 0 degrees or 180 degrees. But why?
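One way to get a feel for the “but why?” is a quick simulation (a Python sketch I’m adding for illustration, not something from the class): sample random directions in 300 dimensions and look at the cosine of the angle each one makes with a fixed direction. The cosines pile up near 0, i.e. near 90 degrees, with a standard deviation of only about 1/√300 ≈ 0.058.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 300

# 10,000 random directions on the unit sphere in R^300
# (normalized Gaussian samples are uniformly distributed on the sphere).
v = rng.standard_normal((10_000, d))
v /= np.linalg.norm(v, axis=1, keepdims=True)

# Cosine of the angle each one makes with a fixed unit vector.
fixed = np.zeros(d)
fixed[0] = 1.0
cosines = v @ fixed

print(f"mean cosine: {cosines.mean():.4f}")   # near 0, i.e. near 90 degrees
print(f"std: {cosines.std():.4f}")            # near 1/sqrt(300), about 0.058
print(f"range: {cosines.min():.3f} to {cosines.max():.3f}")
```

Even the most extreme of 10,000 random directions stays far away from cosine ±1, which is exactly the “hard to be parallel” phenomenon.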

My solution to the 3 million points on a 300-dimensional sphere problem on the way home went something like this:

(1) Use a 2x2x . . . x2 cube centered at the origin, with all vertices having coordinates of either +1 or -1 in every dimension.

(2) Assume that the first vector is the 300-dimensional vector (1,1, . . . ,1).

(3) Now form 3 million 300-dimensional vectors with +1 or -1 randomly chosen for each position in the vector.

(4) By the dot product formula for vectors, the smallest dot product will produce the largest angle, so now we just have to figure out how many -1’s the vector with the most -1’s should have.

(5) The number of -1’s in our vectors should be described by the binomial distribution with n = 300 and p = 1/2, which has a mean of 150 and a standard deviation of $\sqrt{300/4} = 5\sqrt{3} \approx 8.66$.

(6) To get to a 1 in 3 million chance we have to go out about 5 standard deviations (so about 43 away from the mean), but just to double check I ran this handy dandy Mathematica code:
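The Mathematica snippet didn’t survive in this copy of the post, but the check is easy to reproduce. Here is a Python sketch (my reconstruction, not the original code) that computes the exact binomial upper tail rather than simulating: the expected number of vectors, out of 3,000,000, with at least k entries equal to -1.

```python
import math

n, trials = 300, 3_000_000

mean = n / 2             # 150
sd = math.sqrt(n / 4)    # 5*sqrt(3), about 8.66

# Exact upper tail of Binomial(300, 1/2): the chance that a random
# ±1 vector has at least k entries equal to -1.
def upper_tail(k):
    return sum(math.comb(n, j) for j in range(k, n + 1)) / 2**n

# Expected number of such vectors among 3,000,000 random ones.
for k in (191, 192, 193, 194):
    print(k, trials * upper_tail(k))
```

The expected count is a couple of vectors at 192 and about one at 193, which matches the rough “5 standard deviations” estimate.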

Sure enough, with 3,000,000 trials we expect about one vector to have 192 or 193 -1’s. Let’s say 192.

(7) So, that vector is going to have 84 more -1’s than +1’s (192 versus 108), which means its dot product with our vector (1,1, . . . ,1) will be 108 - 192 = -84. That means the cosine of the angle between the two vectors will be -84/300 = -0.28.
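The arithmetic in step (7) is quick to verify (a small check I’m adding, using the fact that both ±1 vectors have length √300, so the cosine is just the dot product divided by 300):

```python
import math

n = 300
minus_ones = 192
plus_ones = n - minus_ones       # 108

# Dot product with (1,1,...,1): each +1 contributes +1, each -1 contributes -1.
dot = plus_ones - minus_ones     # 108 - 192 = -84

# Both vectors have length sqrt(300), so cos(angle) = dot / 300.
cosine = dot / n                 # -0.28
angle = math.degrees(math.acos(cosine))

print(f"dot product: {dot}")
print(f"cosine: {cosine}")
print(f"angle: {angle:.1f} degrees")
```

The angle works out to roughly 106 degrees, i.e. only about 16 degrees past perpendicular even for the most extreme vector out of 3 million.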

So, close to what Ellenberg stated in his blog.

I’m not sure that this method is 100% right, but I think it captures the general idea. It certainly helped me understand why it is hard for random vectors to be parallel (or anti-parallel) in high dimensions.
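As one last sanity check on the argument (again a Python sketch of my own, scaled down to 100,000 vectors instead of 3 million to keep it fast): draw random ±1 vectors and look at the extreme cosines against (1,1, . . . ,1). Even the most extreme of 100,000 vectors typically only reaches a cosine of roughly ±0.25.

```python
import numpy as np

rng = np.random.default_rng(1)
d, trials = 300, 100_000

# Random ±1 vectors; the dot product with (1,1,...,1) is just the
# sum of the entries, so there is no need to store the all-ones vector.
signs = rng.integers(0, 2, size=(trials, d), dtype=np.int8) * 2 - 1
dots = signs.sum(axis=1, dtype=np.int32)
cosines = dots / d   # both vectors have length sqrt(300)

print(f"most negative cosine: {cosines.min():.3f}")
print(f"most positive cosine: {cosines.max():.3f}")
```

With 30× more vectors the extremes stretch a little further, to about ±0.28 as computed above, but nowhere near ±1.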

Sharing a fun neural network program with kids

I just finished a summer class at MIT on machine learning.  It was a week-long professional education class and it was great.  On the 4th day of the class we discussed neural networks and played around a little with this program:

A Neural Network Playground

Most of the ideas in the class involve math concepts from linear algebra and calculus and are way beyond anything that you would expect kids to know.  This program, however, seemed like something kids could play with – so we played with it this morning.

First I introduced them – in the most basic way possible – to the idea of classification:

After that short introduction I let the boys play with the program for 15 minutes. They (of course) built the most complicated model that they could and we talked about what they saw. They actually had a pretty good intuition for what was going on – it is too bad that the math ideas required to really dive into the topic are so advanced. Even though we didn’t talk about the math, it was really fun to hear their thoughts about how the model was working.

For the last part of the project we watched a video about a computer that learned to control a model helicopter. The original video is here:

and here’s how my kids reacted to it:

So, a really fun week for me learning about data analysis and machine learning. It was nice to have a small way to share some of what I learned with the boys.

5 great plays / lessons from the All-Stars vs. Riot game

The 2016 All-Star tour kicked off last night in Seattle. Lots of hard work went into putting the second season together and it was great to see the games finally getting started. The game film is here:

Here are 5 things that caught my eye watching the game last night (and sorry for the slightly fuzzy videos, I was having some odd computer issues this morning):

(1) 3 good hucks to study

Early in the 3rd point Riot tries to go deep to Hana Kawai, but the throw is pretty difficult. Later in the point the All-Stars score on a beautiful huck from Jesse Shofner to Kate Scarth. I love Scarth’s cut in this clip – watch it a few times to see how she sets it up.

Later in the game (at 3-3) Riot again goes deep to Kawai. This throw has far more margin for error than the first one and Kawai is able to track it down.

Comparing these two throws helps illustrate some of the pros and cons of two common hucking ideas:

(i) same-third hucks, and
(ii) throwing hucks to space.

(2) Riot working the break side

Riot’s goal to go up 3-2 is a great example of a team doing work on the break side. Other than the scoring pass, no pass Riot takes on this possession is remotely difficult. The movement on the break side on this possession is an especially great lesson for anyone learning to play ultimate. I love the pass from Sarah Griffith to (former All-Star) Jaclyn Verzuh right before the goal. Actually, it is worth watching this clip a few times to study Griffith’s positioning during the entire possession.

Also, the scoring cut from Julia Snyder is great. I love the way she plays.

(3) A great hustle play from Jesse Shofner

The play here is a nice look from Claire Revere (who just blows me away every time I see her play) to Kate Scarth. That throw isn’t complete, but watch the great hustle play from Shofner. Watch especially where she is on the field when the throw goes up. It would have been totally easy for her to just watch the throw – instead the All-Stars get a goal!

(4) Great athleticism from Shiori Ogawa

Everything about this play is nice – Fitzgerald to Revere to Ogawa to Fitzgerald to Ode!!! I’m sure I’ll write plenty about all of them during the tour, but Ogawa really caught my eye here.

I love the attacking idea of the throw and cut, and I love the path Ogawa takes to the disc, which forced Qxhna Titcome to take a path that wasn’t nearly as good. I absolutely love how calm Ogawa was after the catch – both with the throw to Fitzgerald and her clear to the endzone afterwards. She’s such a great addition to the All-Star tour!

(5) Kaylor and Walhroos attacking the far side of the field

This play is definitely aggressive, but I think it is still worth studying. The play reminded me a lot of how Australia attacked Japan in the Dream Cup final, which I wrote about here:

Australia’s Michelle Phillips and Moe Sameshima – I’m speechless

I love the aggressive attacking style on display here. There’s no doubt that this throw is risky, but it is also extremely hard to defend. That difficulty is illustrated here since the defender – Sarah Griffith – is one of the two or three best defenders in the game.

So, a great start to the All-Star tour last night. I can’t wait to see more games!!