01/08/2018

We often work in linear space, but how might we capture nonlinearity? The answer lies in basis functions.

Suppose we have data with a single real-valued covariate that follows a quadratic trend, and we'd like to fit a quadratic regression. How can we apply the same tooling we have for linear regression to this problem?

Our problem as a linear regression would look like:

\[y = w_1x_1 + w_0\]

So our $\mathbf{x}$ would look like:

\[\begin{bmatrix} 1 \\ x_1 \end{bmatrix}\]

And our $\mathbf{w}$ would look like:

\[\begin{bmatrix} w_0 \\ w_1 \end{bmatrix}\]

To capture this quadratic form, we can simply use a basis function.

Let $\phi(\mathbf{x})$ be a function that maps an input $\mathbf{x}$ to a new vector of any dimension. For example, we can have $\phi$ do the following transformation:

\[\begin{bmatrix} 1 \\ x_1 \end{bmatrix} \to \begin{bmatrix} 1 \\ x_1 \\ x_1^2 \end{bmatrix}\]

Because we've increased the dimensionality of $\mathbf{x}$, we also need to increase the dimensionality of $\mathbf{w}$. As a result, our new linear regression would look like:
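As a quick sketch, this particular $\phi$ is simple to write down in code (the function name `phi` and the use of numpy here are my own choices, not anything canonical):

```python
import numpy as np

# Basis function mapping [1, x1] -> [1, x1, x1^2]
def phi(x):
    x1 = x[1]
    return np.array([1.0, x1, x1**2])

phi(np.array([1.0, 3.0]))  # -> array([1., 3., 9.])
```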

\[y = \mathbf{w}^\top \phi(\mathbf{x}) = w_2x_1^2 + w_1x_1 + w_0\]

Ta-da! We have a quadratic regression! Of course, we're not limited to polynomial regressions. Our basis function $\phi$ can transform our feature vector into any output we want. Choosing the right functions for the basis transformation is a whole ordeal on its own, but the point is that we can indeed use the same techniques learned for linear regression in nonlinear cases just by applying basis functions. Ultimately, everything ends up being something like a linear model (except for Random Forests and the like, TBA).
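To see the whole pipeline end to end, here's a minimal sketch: generate data from a known quadratic, stack the basis-expanded rows $[1, x, x^2]$ into a design matrix, and solve ordinary least squares exactly as we would for plain linear regression. The coefficients and noise level are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data following a quadratic trend: y = 2x^2 - x + 3 + noise
x = rng.uniform(-3, 3, size=50)
y = 2 * x**2 - x + 3 + rng.normal(scale=0.1, size=50)

# Basis-expanded design matrix: each row is phi(x) = [1, x, x^2]
Phi = np.column_stack([np.ones_like(x), x, x**2])

# Ordinary least squares on the transformed features,
# i.e. the same machinery as linear regression
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
# w = [w0, w1, w2] should recover approximately [3, -1, 2]
```

Nothing about the solver knows the model is quadratic; all the nonlinearity lives in the construction of `Phi`.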
