Does good performance at the NBA Draft Combine actually predict better players? Read on to find out.

I’ve always wondered how NBA teams select their rookies each year. A large part of the evaluation process is the NBA draft combine, where rookies perform a series of athletic tests: sprint speed, wingspan/height/weight measurements, vertical jump height, and more. But is good performance on these tests actually correlated with…
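A minimal sketch of the kind of check this question calls for: correlating one combine measurement with one career outcome. The numbers below are invented for illustration, and real analysis would use the full combine dataset.

```python
# Pure-Python Pearson correlation; the data here is hypothetical.
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Hypothetical data: max vertical jump (inches) vs. career win shares.
vertical_jump = [28.5, 33.0, 36.5, 30.0, 40.0]
win_shares = [10.2, 25.1, 30.4, 12.0, 45.3]

r = pearson_r(vertical_jump, win_shares)
```

A value of `r` near +1 would suggest the measurement tracks the outcome; near 0, that it tells us little.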

Let’s use Natural Language Processing techniques on Chris Cuomo’s show

In this post, I run some natural language processing techniques on transcripts of a popular CNN primetime show: Cuomo Prime Time. This is a followup to another post from a month ago, where I ran the same NLP techniques on Tucker Carlson’s primetime Fox News show. My goal here is…

Let’s use Natural Language Processing techniques on Tucker Carlson’s show

I’m no Fox News watcher, but on YouTube I always see at least one Fox News thumbnail. And more often than not, that thumbnail has Tucker Carlson’s face on it. Because of this, I’ve become curious about his show, and I decided to run some natural language processing techniques on…
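One of the simplest techniques used in transcript analysis of this kind is word-frequency counting after removing stopwords. A sketch, with an invented transcript snippet and a deliberately tiny stopword list:

```python
import re
from collections import Counter

# Small illustrative stopword list; real analyses use larger ones.
STOPWORDS = {"the", "a", "an", "and", "or", "is", "are", "to", "of",
             "in", "on", "it", "that", "this", "i", "you", "we", "about"}

def top_words(text, k=3):
    """Return the k most common non-stopword tokens in the text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    return counts.most_common(k)

# Hypothetical transcript snippet, not a real quote.
transcript = ("Good evening and welcome to the show. Tonight we talk "
              "about the news, and the news tonight is big.")
top = top_words(transcript)
```

Frequency counts like this are usually the first step before heavier techniques such as sentiment scoring or topic modeling.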

Some neural networks are too big to use. There is a way to make them smaller but keep their accuracy. Read on to find out how.

Practical machine learning is all about tradeoffs. We can get better accuracy from neural networks by making them bigger, but in real life, large neural nets are hard to use. Specifically, the problem arises not in training, but in deployment. Large neural nets can be successfully trained on giant supercomputer…
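One well-known approach to shrinking a network while keeping its accuracy is knowledge distillation — my assumption about where this article is headed. Its core ingredient is a temperature-scaled softmax: a big teacher's logits are softened so a small student can learn from the relative probabilities of wrong classes, not just the argmax. A sketch with hypothetical logits:

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature; higher T spreads probability mass out."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

teacher_logits = [8.0, 2.0, 0.5]  # hypothetical teacher output for 3 classes

hard = softmax(teacher_logits)                    # nearly one-hot
soft = softmax(teacher_logits, temperature=4.0)   # "soft targets"
```

The student is trained to match `soft` rather than the hard label, which carries more information per example.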

Catastrophic forgetting used to be an intractable problem. Recently, progress has been made; read on to find out more.

Modern neural networks are very good at learning one particular thing. Whether it be playing chess or folding proteins, with enough data and time, neural networks can achieve amazing results. Unfortunately, a network trained on one task tends to lose that skill when trained on another. You can train a network to be…

Pruning is an important tool to make neural networks more economical. Read on to find out how it works.

One problem of neural networks is their size. The neural networks you see in online tutorials are small enough to run efficiently on your computer, but many neural networks in industry are huge and unwieldy. They often take days to train, and running them sucks up a lot of compute…
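The simplest pruning criterion is magnitude pruning: zero out the weights with the smallest absolute values. A pure-Python sketch on a toy weight matrix (real pruning operates on framework tensors and is followed by fine-tuning):

```python
def prune_by_magnitude(weights, fraction):
    """Zero out the given fraction of weights with the smallest |value|."""
    flat = sorted(abs(w) for row in weights for w in row)
    k = int(len(flat) * fraction)
    threshold = flat[k - 1] if k > 0 else -1.0
    return [[0.0 if abs(w) <= threshold else w for w in row]
            for row in weights]

# Toy 2x3 weight matrix.
W = [[0.9, -0.05, 0.3],
     [-0.01, 0.7, -0.2]]

pruned = prune_by_magnitude(W, fraction=0.5)
```

Half the weights become exact zeros, which sparse storage formats and sparse kernels can then exploit.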

The VC dimension is a mathematical way to formalize model capacity. It also has practical machine learning uses. Read on to learn more.

It is common knowledge in machine learning that some models have more capacity than others. For example, neural networks are able to learn a much larger variety of functions than linear models. Intuitively this makes sense. However, what does having a higher capacity mean mathematically? And how does model capacity…
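The definition behind VC dimension is shattering: a hypothesis class shatters a set of points if it can realize every possible labeling of them. A sketch for 1-D threshold classifiers, h_t(x) = 1 if x ≥ t else 0, whose VC dimension is 1:

```python
def shatters(points, thresholds):
    """Can some threshold classifier realize every labeling of the points?"""
    achievable = set()
    for t in thresholds:
        achievable.add(tuple(1 if x >= t else 0 for x in points))
    return len(achievable) == 2 ** len(points)

# Enough candidate thresholds to hit every distinct labeling.
candidate_ts = [-10.0, 0.5, 1.5, 10.0]

one_point = shatters([1.0], candidate_ts)         # thresholds shatter 1 point
two_points = shatters([1.0, 2.0], candidate_ts)   # but never the labeling (1, 0)
```

No threshold can label 1.0 positive while labeling 2.0 negative, so no pair of points is shattered and the class has low capacity — exactly the intuition that thresholds learn far fewer functions than neural networks.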

It can be very helpful to bound the number of training samples a machine learning algorithm needs. Read on to find out how.

One common problem with machine learning algorithms is that we don’t know how much training data we need. A common workaround is to keep training until the training error stops decreasing. However, there are still issues with this. How do we know we’re not stuck…
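One classic bound of this kind: for a finite hypothesis class H in the realizable PAC setting, m ≥ (1/ε)(ln|H| + ln(1/δ)) samples suffice to guarantee, with probability at least 1 − δ, that the learned hypothesis has error at most ε. A small calculator:

```python
import math

def pac_sample_bound(hypothesis_count, eps, delta):
    """Samples sufficient for error <= eps with probability >= 1 - delta,
    for a finite hypothesis class in the realizable PAC setting."""
    return math.ceil((math.log(hypothesis_count) + math.log(1 / delta)) / eps)

# e.g. one million hypotheses, 5% error, 99% confidence
m = pac_sample_bound(10**6, eps=0.05, delta=0.01)
```

Note the bound grows only logarithmically in |H| and 1/δ, but linearly in 1/ε — tightening the error target is what gets expensive.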

In machine learning we often say something is “learnable”. What does that really mean? Read on to find out.

What makes a particular function, or a group of functions, “learnable”? On the surface, this seems like an easy question. A simple answer would be: a function is learnable if there is some algorithm that, trained on the training set, achieves low error…

Wilson Wang

Amazon Engineer. I was into data before it was big.
