Monthly Archives: May 2013


Over the last few weeks, I’ve introduced two classification methods – Support Vector Machines (SVM) and Logistic Regression – that attempt to find a line, plane or hyperplane (depending on the dimension) that separates two classes of data points. This has … Continue reading

Posted in Classification, Normalization/Kernels | 18 Comments
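The idea behind this post's topic can be illustrated with a tiny sketch (the data and the feature map x → (x, x²) are my own illustrative choices, not from the post): two classes that no single point can separate on the 1-D line become separable by a line once mapped into 2-D, which is the basic intuition behind kernels and feature maps.

```python
# Illustrative example: 1-D points labeled by distance from the origin are
# not linearly separable on the line, but after the feature map x -> (x, x**2)
# the horizontal line y = 1 separates the two classes in the plane.

xs = [-3.0, -2.5, -0.5, 0.0, 0.4, 2.2, 3.1]
labels = [1 if abs(x) > 1 else -1 for x in xs]  # +1 if far from 0, else -1

mapped = [(x, x * x) for x in xs]  # lift each point into 2-D

def predict(point):
    # In the mapped space, the separating line is simply x**2 = 1.
    _, y = point
    return 1 if y > 1 else -1

assert all(predict(p) == label for p, label in zip(mapped, labels))
```

The same trick underlies kernel methods: rather than computing the mapped coordinates explicitly, a kernel function evaluates inner products in the higher-dimensional space directly.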

Logistic Regression

In the last post, I introduced the Support Vector Machine (SVM) algorithm, which attempts to find a line/plane/hyperplane that separates the two classes of points in a given data set. This algorithm adapts elements of linear regression, a statistical tool (namely, … Continue reading

Posted in Classification, Regression | 18 Comments
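A minimal sketch of the logistic regression idea described above, assuming a toy 1-D data set and plain stochastic gradient descent on the log-loss (both choices are mine for illustration, not the post's):

```python
import math

def sigmoid(z):
    # Squashes the linear score w*x + b into a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

# Toy data: label 0 for points on the left, label 1 for points on the right.
data = [(-2.0, 0), (-1.0, 0), (-0.5, 0), (0.5, 1), (1.0, 1), (2.0, 1)]

w, b = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    for x, y in data:
        p = sigmoid(w * x + b)
        # (p - y) is the gradient of the log-loss w.r.t. the linear score.
        w -= lr * (p - y) * x
        b -= lr * (p - y)

predictions = [1 if sigmoid(w * x + b) > 0.5 else 0 for x, _ in data]
```

The fitted model classifies by thresholding the predicted probability at 0.5, which corresponds to the point where the linear score w·x + b crosses zero, i.e. a separating point (or hyperplane, in higher dimensions).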

Linear Separation and Support Vector Machines

So far on this blog, we’ve seen two very different approaches to constructing models that predict data distributions. With regression, we replaced the original data points with an equation defining a relatively simple shape that approximates the data, then used … Continue reading

Posted in Classification | 18 Comments
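As a companion to the separating-line idea, here is a sketch using the perceptron, a simpler relative of the SVM: it is not the SVM's max-margin optimization, but it is the most compact algorithm that finds *some* separating line when one exists (the data set here is my own toy example):

```python
def perceptron(points, labels, epochs=100):
    # Repeatedly nudge the line w . x + b = 0 toward misclassified points.
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        updated = False
        for (x1, x2), y in zip(points, labels):  # y is +1 or -1
            if y * (w[0] * x1 + w[1] * x2 + b) <= 0:  # wrong side (or on line)
                w[0] += y * x1
                w[1] += y * x2
                b += y
                updated = True
        if not updated:  # every point is on its correct side: done
            break
    return w, b

points = [(1, 1), (2, 1), (1, 2), (-1, -1), (-2, -1), (-1, -2)]
labels = [1, 1, 1, -1, -1, -1]
w, b = perceptron(points, labels)
```

Unlike the SVM, the perceptron stops at the first separating line it finds rather than the one with the widest margin, which is exactly the refinement the SVM formulation adds.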

K-Nearest Neighbors

Two posts back, I introduced the Nearest Neighbor classification algorithm and described how it implicitly defines a distribution made up of Voronoi cells around the data points, with each Voronoi cell labeled according to the label of the point that … Continue reading

Posted in Classification | 8 Comments
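The 1-nearest-neighbor rule described above, where a query inherits the label of the Voronoi cell it lands in, can be sketched in a few lines (the training points and labels here are illustrative, not from the post):

```python
import math

def nearest_neighbor(train, query):
    # 1-NN: label the query by its closest training point. This is exactly
    # the label of the Voronoi cell that the query point falls into.
    point, label = min(train, key=lambda pl: math.dist(pl[0], query))
    return label

train = [((0.0, 0.0), "A"), ((1.0, 0.0), "A"), ((5.0, 5.0), "B")]
print(nearest_neighbor(train, (0.2, 0.1)))  # closest to (0, 0), prints A
```

The K-Nearest Neighbors generalization replaces the single closest point with a majority vote over the k closest, which smooths out the hard Voronoi boundaries of the 1-NN rule.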