{ "cells": [ { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "i_f5u2x9nn6I", "slideshow": { "slide_type": "slide" } }, "source": [ " \n", "\n", "# Lecture 11: Kernels\n", "\n", "### Applied Machine Learning\n", "\n", "__Volodymyr Kuleshov__
Cornell Tech" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Part 1: The Kernel Trick: Motivation\n", "\n", "So far, the majority of the machine learning models we have seen have been *linear*.\n", "\n", "In this lecture, we will see a general way to make many of these models *non-linear*. We will use a new idea called *kernels*." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Review: Linear Regression\n", "\n", "Recall that a linear model has the form\n", "$$f(x) = \\sum_{j=0}^d \\theta_j \\cdot x_j = \\theta^\\top x.$$\n", "where $x$ is a vector of features and we used the notation $x_0 = 1$." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "fragment" } }, "source": [ "We pick $\\theta$ to minimize the (L2-regularized) mean squared error (MSE):\n", "$$J(\\theta)= \\frac{1}{2n} \\sum_{i=1}^n(y^{(i)} - \\theta^\\top x^{(i)})^2 + \\frac{\\lambda}{2}\\sum_{j=1}^d \\theta_j^2$$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "skip" } }, "source": [ "# Review: Polynomials\n", "\n", "Recall that a polynomial of degree $p$ is a function of the form\n", "$$\n", "a_p x^p + a_{p-1} x^{p-1} + ... + a_{1} x + a_0.\n", "$$\n", "\n", "Below are some examples of polynomial functions."
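] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "skip" } }, "source": [
"For instance, the degree-3 polynomial $2x^3 - x + 5$ can be evaluated at a few points with `numpy.polyval` (a minimal sketch, not one of the original plotted examples):\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"# Coefficients [a_p, ..., a_1, a_0], highest degree first: 2x^3 - x + 5.\n",
"coeffs = [2.0, 0.0, -1.0, 5.0]\n",
"\n",
"xs = np.array([-1.0, 0.0, 2.0])\n",
"ys = np.polyval(coeffs, xs)  # evaluates a_p x^p + ... + a_1 x + a_0 at each point\n",
"print(ys)\n",
"```"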
] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "skip" } }, "source": [ "# Review: Polynomial Regression\n", "\n", "Specifically, given a one-dimensional continuous variable $x$, we can define a feature function $\\phi : \\mathbb{R} \\to \\mathbb{R}^{p+1}$ as\n", "$$\\phi(x) = \\begin{bmatrix}\n", "1 \\\\\n", "x \\\\\n", "x^2 \\\\\n", "\\vdots \\\\\n", "x^p\n", "\\end{bmatrix}.\n", "$$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "skip" } }, "source": [ "The class of models of the form\n", "$$f_\\theta(x) := \\sum_{j=0}^p \\theta_j x^j = \\theta^\\top \\phi(x)$$\n", "with parameters $\\theta$ and polynomial features $\\phi$ is the set of $p$-degree polynomials." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Towards General Non-Linear Features\n", "\n", "Any non-linear feature map $\\phi(x) : \\mathbb{R}^d \\to \\mathbb{R}^p$ can be used to obtain general models of the form\n", "$$f_\\theta(x) := \\theta^\\top \\phi(x)$$\n", "that are highly non-linear in $x$ but linear in $\\theta$."
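] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "skip" } }, "source": [
"In code, the feature map $\\phi$ and the model $f_\\theta(x) = \\theta^\\top \\phi(x)$ take only a few lines of NumPy (a sketch; the names `phi` and `theta` are illustrative):\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"def phi(x, p):\n",
"    # Polynomial feature map: phi(x) = [1, x, x^2, ..., x^p]\n",
"    return np.array([x ** j for j in range(p + 1)])\n",
"\n",
"theta = np.array([1.0, -2.0, 0.5])  # parameters of a degree-2 model\n",
"f_x = theta @ phi(3.0, p=2)         # f_theta(x) = theta^T phi(x)\n",
"print(f_x)\n",
"```\n",
"\n",
"The model is non-linear in $x$ but linear in $\\theta$, so the usual least-squares machinery still applies."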
] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# The Featurized Design Matrix\n", "\n", "It is useful to represent the featurized dataset as a matrix $\\Phi \\in \\mathbb{R}^{n \\times p}$:\n", "\n", "$$\\Phi = \\begin{bmatrix}\n", "\\phi(x^{(1)})_1 & \\phi(x^{(1)})_2 & \\ldots & \\phi(x^{(1)})_p \\\\\n", "\\phi(x^{(2)})_1 & \\phi(x^{(2)})_2 & \\ldots & \\phi(x^{(2)})_p \\\\\n", "\\vdots \\\\\n", "\\phi(x^{(n)})_1 & \\phi(x^{(n)})_2 & \\ldots & \\phi(x^{(n)})_p\n", "\\end{bmatrix}\n", "=\n", "\\begin{bmatrix}\n", "- & \\phi(x^{(1)})^\\top & - \\\\\n", "- & \\phi(x^{(2)})^\\top & - \\\\\n", "& \\vdots & \\\\\n", "- & \\phi(x^{(n)})^\\top & - \\\\\n", "\\end{bmatrix}\n", ".$$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Featurized Normal Equations\n", "\n", "The normal equations provide a closed-form solution for $\\theta$:\n", "$$\\theta = (X^\\top X + \\lambda I)^{-1} X^\\top y.$$\n", "\n", "When the vectors of attributes $x^{(i)}$ are featurized, we can write this as\n", "$$\\theta = (\\Phi^\\top \\Phi + \\lambda I)^{-1} \\Phi^\\top y.$$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Push-Through Matrix Identity\n", "\n", "We can modify this expression by using a version of the [push-through matrix identity](https://en.wikipedia.org/wiki/Woodbury_matrix_identity#Discussion):\n", "$$(\\lambda I + U V)^{-1} U = U (\\lambda I + V U)^{-1}$$\n", "where $U \\in \\mathbb{R}^{n \\times m}$, $V \\in \\mathbb{R}^{m \\times n}$, and $\\lambda \\neq 0$." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "fragment" } }, "source": [ "Proof sketch: Start with $U (\\lambda I + V U) = (\\lambda I + U V) U$ and multiply both sides by $(\\lambda I + V U)^{-1}$ on the right and $(\\lambda I + U V)^{-1}$ on the left."
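] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "skip" } }, "source": [
"The identity is easy to verify numerically; the sketch below draws random $U$ and $V$ and checks that both sides agree:\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"rng = np.random.default_rng(0)\n",
"n, m, lam = 5, 3, 0.1\n",
"U = rng.standard_normal((n, m))\n",
"V = rng.standard_normal((m, n))\n",
"\n",
"# Compare (lam I_n + U V)^{-1} U  against  U (lam I_m + V U)^{-1}\n",
"lhs = np.linalg.solve(lam * np.eye(n) + U @ V, U)\n",
"rhs = U @ np.linalg.inv(lam * np.eye(m) + V @ U)\n",
"print(np.allclose(lhs, rhs))\n",
"```\n",
"\n",
"Note that the left side inverts an $n \\times n$ matrix while the right side inverts an $m \\times m$ one, which is the source of the speedup below."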
] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Normal Equations: Dual Form\n", "\n", "We can apply the identity $(\\lambda I + U V)^{-1} U = U (\\lambda I + V U)^{-1}$ to the normal equations with $U=\\Phi^\\top$ and $V=\\Phi$\n", "\n", "$$\\theta = (\\Phi^\\top \\Phi + \\lambda I)^{-1} \\Phi^\\top y$$\n", "\n", "to obtain the *dual* form:\n", "\n", "$$\\theta = \\Phi^\\top (\\Phi \\Phi^\\top + \\lambda I)^{-1} y.$$\n", "\n", "The first approach takes $O(p^3)$ time; the second is $O(n^3)$ and is faster when $p > n$." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Feature Representations for Parameters\n", "\n", "An interesting corollary of the dual form\n", "$$\\theta = \\Phi^\\top \\underbrace{(\\Phi \\Phi^\\top + \\lambda I)^{-1} y}_\\alpha$$\n", "is that the optimal $\\theta$ is a linear combination of the $n$ training set features:\n", "$$\\theta = \\sum_{i=1}^n \\alpha_i \\phi(x^{(i)}).$$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "fragment" } }, "source": [ "Here, the weights $\\alpha_i$ are derived from $(\\Phi \\Phi^\\top + \\lambda I)^{-1} y$ and equal\n", "$$\\alpha_i = \\sum_{j=1}^n L_{ij} y_j$$\n", "where $L = (\\Phi \\Phi^\\top + \\lambda I)^{-1}.$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Predictions From Features\n", "\n", "Consider now a prediction $\\phi(x')^\\top \\theta$ at a new input $x'$:\n", "$$\\phi(x')^\\top \\theta = \\sum_{i=1}^n \\alpha_i \\phi(x')^\\top \\phi(x^{(i)}).$$\n", "\n", "The crucial observation is that the features $\\phi(x)$ are never used directly in this equation. Only their dot product is used!\n", "\n", "This observation will be at the heart of a powerful new idea called *the kernel trick*."
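] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "skip" } }, "source": [
"The primal and dual forms can be compared numerically. A minimal NumPy sketch on synthetic data (all variable names illustrative), with $p > n$ so the dual form is the cheaper one:\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"rng = np.random.default_rng(1)\n",
"n, p, lam = 8, 20, 0.5             # more features than examples\n",
"Phi = rng.standard_normal((n, p))  # featurized design matrix\n",
"y = rng.standard_normal(n)\n",
"\n",
"# Primal: theta = (Phi^T Phi + lam I_p)^{-1} Phi^T y   -- O(p^3)\n",
"theta_primal = np.linalg.solve(Phi.T @ Phi + lam * np.eye(p), Phi.T @ y)\n",
"\n",
"# Dual: theta = Phi^T (Phi Phi^T + lam I_n)^{-1} y     -- O(n^3)\n",
"alpha = np.linalg.solve(Phi @ Phi.T + lam * np.eye(n), y)\n",
"theta_dual = Phi.T @ alpha  # a linear combination of the n training features\n",
"\n",
"print(np.allclose(theta_primal, theta_dual))\n",
"```"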
] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Learning From Feature Products\n", "\n", "We also don't need features $\\phi$ for learning $\\theta$, just their dot product! \n", "First, recall that each row $i$ of $\\Phi$ is the $i$-th featurized input $\\phi(x^{(i)})^\\top$." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "fragment" } }, "source": [ "Thus $K = \\Phi \\Phi^\\top$ is a matrix of all dot products between all the $\\phi(x^{(i)})$:\n", "$$K_{ij} = \\phi(x^{(i)})^\\top \\phi(x^{(j)}).$$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "fragment" } }, "source": [ "We can compute $\\alpha = (K+\\lambda I)^{-1}y$ and use it for predictions\n", "$$\\phi(x')^\\top \\theta = \\sum_{i=1}^n \\alpha_i \\phi(x')^\\top \\phi(x^{(i)}),$$\n", "and all this only requires dot products, not features $\\phi$!" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# The Kernel Trick\n", "\n", "The above observations hint at a powerful new idea -- if we can compute dot products of features $\\phi(x)$ efficiently, then we will be able to use high-dimensional features easily.\n", "\n", "It turns out that we can do this for many ML algorithms -- we call this the Kernel Trick." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ " \n", "# Part 2: The Kernel Trick: An Example\n", "\n", "Many ML algorithms can be written down as optimization problems in which the features $\\phi(x)$ only appear as dot products $\\phi(x)^\\top \\phi(z)$ that can be computed efficiently.\n", "\n", "Let's look at an example."
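] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "skip" } }, "source": [
"Putting the pieces together, kernel ridge regression needs only the Gram matrix $K$. The sketch below uses the degree-2 polynomial kernel $(x^\\top z)^2$; the helper `k` and the synthetic data are illustrative:\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"rng = np.random.default_rng(2)\n",
"n, d, lam = 10, 4, 0.1\n",
"X = rng.standard_normal((n, d))\n",
"y = rng.standard_normal(n)\n",
"\n",
"def k(x, z):\n",
"    # Degree-2 polynomial kernel: equals phi(x)^T phi(z) for phi(x)_{ij} = x_i x_j\n",
"    return (x @ z) ** 2\n",
"\n",
"K = np.array([[k(xi, xj) for xj in X] for xi in X])  # Gram matrix K = Phi Phi^T\n",
"alpha = np.linalg.solve(K + lam * np.eye(n), y)      # alpha = (K + lam I)^{-1} y\n",
"\n",
"x_new = rng.standard_normal(d)\n",
"pred = sum(a * k(x_new, xi) for a, xi in zip(alpha, X))  # phi(x')^T theta\n",
"print(pred)\n",
"```\n",
"\n",
"The $d^2$-dimensional features are never constructed; only kernel evaluations are needed."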
] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Review: Linear Regression\n", "\n", "Recall that a linear model has the form\n", "$$f(x) = \\sum_{j=0}^d \\theta_j \\cdot x_j = \\theta^\\top x.$$\n", "where $x$ is a vector of features and we used the notation $x_0 = 1$." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Review: Non-Linear Features\n", "\n", "Any non-linear feature map $\\phi(x) : \\mathbb{R}^d \\to \\mathbb{R}^p$ can be used in this way to obtain general models of the form\n", "$$f_\\theta(x) := \\theta^\\top \\phi(x)$$\n", "that are highly non-linear in $x$ but linear in $\\theta$." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Review: Featurized Design Matrix\n", "\n", "It is useful to represent the featurized dataset as a matrix $\\Phi \\in \\mathbb{R}^{n \\times p}$:\n", "\n", "$$\\Phi = \\begin{bmatrix}\n", "\\phi(x^{(1)})_1 & \\phi(x^{(1)})_2 & \\ldots & \\phi(x^{(1)})_p \\\\\n", "\\phi(x^{(2)})_1 & \\phi(x^{(2)})_2 & \\ldots & \\phi(x^{(2)})_p \\\\\n", "\\vdots \\\\\n", "\\phi(x^{(n)})_1 & \\phi(x^{(n)})_2 & \\ldots & \\phi(x^{(n)})_p\n", "\\end{bmatrix}\n", "=\n", "\\begin{bmatrix}\n", "- & \\phi(x^{(1)})^\\top & - \\\\\n", "- & \\phi(x^{(2)})^\\top & - \\\\\n", "& \\vdots & \\\\\n", "- & \\phi(x^{(n)})^\\top & - \\\\\n", "\\end{bmatrix}\n", ".$$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Review: Normal Equations\n", "\n", "The normal equations provide a closed-form solution for $\\theta$:\n", "\n", "$$\\theta = (\\Phi^\\top \\Phi + \\lambda I)^{-1} \\Phi^\\top y.$$\n", "\n", "They can also be written in this form:\n", "\n", "$$\\theta = \\Phi^\\top (\\Phi \\Phi^\\top + \\lambda I)^{-1} y.$$\n", "\n", "The first approach takes $O(p^3)$ time; the second is $O(n^3)$ and is faster when $p > n$."
] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Learning From Feature Products\n", "\n", "An interesting corollary is that the optimal $\\theta$ is a linear combination of the $n$ training set features:\n", "$$\\theta = \\sum_{i=1}^n \\alpha_i \\phi(x^{(i)}).$$\n", "We can compute a prediction $\\phi(x')^\\top \\theta$ for $x'$ without ever using the features (only their dot products):\n", "$$\\phi(x')^\\top \\theta = \\sum_{i=1}^n \\alpha_i \\phi(x')^\\top \\phi(x^{(i)}).$$\n", "Equally importantly, we can learn $\\theta$ from only dot products." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Review: Polynomial Regression\n", "\n", "Note that a $p$-th degree polynomial\n", "\n", "$$\n", "a_p x^p + a_{p-1} x^{p-1} + ... + a_{1} x + a_0\n", "$$\n", "\n", "forms a linear model with parameters $a_p, a_{p-1}, ..., a_0$.\n", "This means we can use our algorithms for linear models to learn non-linear features!" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "subslide" } }, "source": [ "Specifically, given a one-dimensional continuous variable $x$, we can define a feature function $\\phi : \\mathbb{R} \\to \\mathbb{R}^{p+1}$ as\n", "\n", "$$\\phi(x) = \\begin{bmatrix}\n", "1 \\\\\n", "x \\\\\n", "x^2 \\\\\n", "\\vdots \\\\\n", "x^p\n", "\\end{bmatrix}.\n", "$$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "subslide" } }, "source": [ "Then the class of models of the form\n", "$$f_\\theta(x) := \\sum_{j=0}^p \\theta_j x^j = \\theta^\\top \\phi(x)$$\n", "with parameters $\\theta$ encompasses the set of $p$-degree polynomials. Specifically,\n", "* It is non-linear in the input variable $x$, meaning that we can model complex data relationships.\n", "* It is a linear model as a function of the parameters $\\theta$, meaning that we can use our familiar ordinary least squares algorithm to learn these features."
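] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "skip" } }, "source": [
"A compact sketch of polynomial ridge regression with these features (synthetic data; `np.vander` builds $\\Phi$ with columns $1, x, \\ldots, x^p$):\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"rng = np.random.default_rng(5)\n",
"x = rng.uniform(-1, 1, size=30)\n",
"y = np.sin(3 * x) + 0.1 * rng.standard_normal(30)  # noisy non-linear target\n",
"\n",
"p, lam = 3, 1e-3\n",
"Phi = np.vander(x, p + 1, increasing=True)  # columns: 1, x, x^2, x^3\n",
"theta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(p + 1), Phi.T @ y)\n",
"print(theta.shape)\n",
"```\n",
"\n",
"The fit is linear in $\\theta$, so the familiar ridge normal equations apply unchanged."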
] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# The Kernel Trick: A First Example\n", "\n", "Can we compute the dot product $\\phi(x)^\\top \\phi(x')$ of polynomial features $\\phi(x)$ more efficiently than using the standard definition of a dot product? Let's look at an example.\n", "\n", "To start, consider polynomial features $\\phi : \\mathbb{R}^d \\to \\mathbb{R}^{d^2}$ of the form\n", "\n", "$$\\phi(x)_{ij} = x_i x_j \\;\\text{ for } i,j \\in \\{1,2,\\ldots,d\\}.$$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "subslide" } }, "source": [ "For $d=3$ this looks like\n", "$$\\small \\phi(x) = \\begin{bmatrix}\n", "x_1 x_1 \\\\\n", "x_1 x_2 \\\\\n", "x_1 x_3 \\\\\n", "x_2 x_1 \\\\\n", "x_2 x_2 \\\\\n", "x_2 x_3 \\\\\n", "x_3 x_1 \\\\\n", "x_3 x_2 \\\\\n", "x_3 x_3 \\\\\n", "\\end{bmatrix}.\n", "$$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "subslide" } }, "source": [ "The dot product of $x$ and $z$ in feature space equals:\n", "$$\\phi(x)^\\top \\phi(z) = \\sum_{i=1}^d \\sum_{j=1}^d x_i x_j z_i z_j$$\n", "Computing this dot product involves the sum over $d^2$ terms and takes $O(d^2)$ time." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "subslide" } }, "source": [ "An alternative way of computing the dot product $\\phi(x)^\\top \\phi(z)$ is to instead compute $(x^\\top z)^2$. One can check that this has the same result:\n", "\\begin{align*}\n", "(x^\\top z)^2 & = (\\sum_{i=1}^d x_i z_i)^2 \\\\\n", "& = (\\sum_{i=1}^d x_i z_i) \\cdot (\\sum_{j=1}^d x_j z_j) \\\\\n", "& = \\sum_{i=1}^d \\sum_{j=1}^d x_i z_i x_j z_j \\\\\n", "& = \\phi(x)^\\top \\phi(z)\n", "\\end{align*}\n", "\n", "However, computing $(x^\\top z)^2$ can be done in only $O(d)$ time!" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "subslide" } }, "source": [ "This is a very powerful idea:\n", "* We can compute the dot product between $O(d^2)$ features in only $O(d)$ time.\n", "* We can use high-dimensional features within ML algorithms that only rely on dot products (like kernelized ridge regression) without incurring extra costs." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# The Kernel Trick: Polynomial Features\n", "\n", "The number of polynomial features $\\phi_p$ of degree $p$ when $x \\in \\mathbb{R}^d$ \n", "\n", "$$\\phi_p(x)_{i_1, i_2, \\ldots, i_p} = x_{i_1} x_{i_2} \\cdots x_{i_p} \\;\\text{ for } i_1, i_2, \\ldots, i_p \\in \\{1,2,\\ldots,d\\}$$\n", "\n", "scales as $O(d^p)$. " ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "fragment" } }, "source": [ "However, we can compute the dot product $\\phi_p(x)^\\top \\phi_p(z)$ in this feature space in only $O(d)$ time for any $p$ as:\n", "$$\\phi_p(x)^\\top \\phi_p(z) = (x^\\top z)^p.$$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Algorithm: Kernelized Polynomial Ridge Regression\n", "\n", "* __Type__: Supervised learning (Regression)\n", "* __Model family__: Polynomials.\n", "* __Objective function__: $L2$-regularized ridge regression.\n", "* __Optimizer__: Normal equations (dual form).\n", "* __Probabilistic interpretation__: No simple interpretation!" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# The Kernel Trick: General Idea\n", "\n", "Many types of features $\\phi(x)$ have the property that their dot product $\\phi(x)^\\top \\phi(z)$ can be computed more efficiently than if we had to form these features explicitly."
] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "fragment" } }, "source": [ "Also, we will see that many algorithms in machine learning can be written down as optimization problems in which the features $\\phi(x)$ only appear as dot products $\\phi(x)^\\top \\phi(z)$." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "fragment" } }, "source": [ "The *Kernel Trick* means that we can use complex non-linear features within these algorithms with little additional computational cost." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "subslide" } }, "source": [ "Examples of algorithms in which we can use the Kernel trick:\n", "* Supervised learning algorithms: linear regression, logistic regression, support vector machines, etc.\n", "* Unsupervised learning algorithms: PCA, density estimation.\n", "\n", "We will look at more examples shortly." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ " \n", "# Part 3: The Kernel Trick in SVMs\n", "\n", "Many ML algorithms can be written down as optimization problems in which the features $\\phi(x)$ only appear as dot products $\\phi(x)^\\top \\phi(z)$ that can be computed efficiently.\n", "\n", "We will now see how SVMs can benefit from the Kernel Trick as well." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Review: Binary Classification\n", "\n", "Consider a training dataset $\\mathcal{D} = \\{(x^{(1)}, y^{(1)}), (x^{(2)}, y^{(2)}), \\ldots, (x^{(n)}, y^{(n)})\\}$.\n", "\n", "We distinguish between two types of supervised learning problems depending on the targets $y^{(i)}$. \n", "\n", "1. __Regression__: The target variable $y \\in \\mathcal{Y}$ is continuous: $\\mathcal{Y} \\subseteq \\mathbb{R}$.\n", "2. __Binary Classification__: The target variable $y$ is discrete and takes on one of $K=2$ possible values.\n", "\n", "In this lecture, we assume $\\mathcal{Y} = \\{-1, +1\\}$." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Review: SVM Model Family\n", "\n", "We will consider models of the form\n", "\n", "\\begin{align*}\n", "f_\\theta(x) = \\theta^\\top \\phi(x) + \\theta_0\n", "\\end{align*}\n", "\n", "where $x$ is the input and $y \\in \\{-1, 1\\}$ is the target. " ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Review: Primal and Dual Formulations\n", "\n", "Recall that the max-margin hyperplane can be formulated as the solution to the following *primal* optimization problem.\n", "\\begin{align*}\n", "\\min_{\\theta,\\theta_0, \\xi}\\; & \\frac{1}{2}||\\theta||^2 + C \\sum_{i=1}^n \\xi_i \\; \\\\\n", "\\text{subject to } \\; & y^{(i)}((x^{(i)})^\\top\\theta+\\theta_0)\\geq 1 - \\xi_i \\; \\text{for all $i$} \\\\\n", "& \\xi_i \\geq 0\n", "\\end{align*}" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "subslide" } }, "source": [ "The solution to this problem also happens to be given by the following *dual* problem:\n", "\\begin{align*}\n", "\\max_{\\lambda} & \\sum_{i=1}^n \\lambda_i - \\frac{1}{2} \\sum_{i=1}^n \\sum_{k=1}^n \\lambda_i \\lambda_k y^{(i)} y^{(k)} (x^{(i)})^\\top x^{(k)} \\\\\n", "\\text{subject to } \\; & \\sum_{i=1}^n \\lambda_i y^{(i)} = 0 \\\\\n", "& C \\geq \\lambda_i \\geq 0 \\; \\text{for all $i$}\n", "\\end{align*}" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Review: Primal Solution\n", "\n", "We can obtain a primal solution from the dual via the following equation:\n", "$$\n", "\\theta^* = \\sum_{i=1}^n \\lambda_i^* y^{(i)} \\phi(x^{(i)}).\n", "$$\n", "\n", "Ignoring the $\\theta_0$ term for now, the score at a new point $x'$ will equal\n",
"$$\n", "\\theta^\\top \\phi(x') = \\sum_{i=1}^n \\lambda_i^* y^{(i)} \\phi(x^{(i)})^\\top \\phi(x').\n", "$$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# The Kernel Trick in SVMs\n", "\n", "Notice that in both equations, the features $x$ are never used directly. Only their *dot product* is used.\n", "\\begin{align*}\n", "\\sum_{i=1}^n \\lambda_i - \\frac{1}{2} \\sum_{i=1}^n \\sum_{k=1}^n \\lambda_i \\lambda_k y^{(i)} y^{(k)} \\phi(x^{(i)})^\\top \\phi(x^{(k)}) \\\\\n", "\\theta^\\top \\phi(x') = \\sum_{i=1}^n \\lambda_i^* y^{(i)} \\phi(x^{(i)})^\\top \\phi(x').\n", "\\end{align*}\n", "\n", "If we can compute the dot product efficiently, we can potentially use very complex features." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# The Kernel Trick in SVMs\n", "\n", "More generally, given features $\\phi(x)$, suppose that we have a function $K : \\mathcal{X} \\times \\mathcal{X} \\to [0, \\infty]$ that outputs dot products between vectors in $\\mathcal{X}$\n", "\n", "$$K(x, z) = \\phi(x)^\\top \\phi(z).$$\n", "\n", "We will call $K$ the *kernel* function." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "fragment" } }, "source": [ "Recall that an example of a useful kernel function is\n", "$$K(x,z) = (x \\cdot z)^p$$\n", "because it computes the dot product of polynomial features of degree $p$." 
] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "subslide" } }, "source": [ "Then notice that we can rewrite the dual of the SVM as\n", "\\begin{align*}\n", "\\max_{\\lambda} & \\sum_{i=1}^n \\lambda_i - \\frac{1}{2} \\sum_{i=1}^n \\sum_{k=1}^n \\lambda_i \\lambda_k y^{(i)} y^{(k)} K(x^{(i)}, x^{(k)}) \\\\\n", "\\text{subject to } \\; & \\sum_{i=1}^n \\lambda_i y^{(i)} = 0 \\\\\n", "& C \\geq \\lambda_i \\geq 0 \\; \\text{for all $i$}\n", "\\end{align*}\n", "and predictions at a new point $x'$ are given by $\\sum_{i=1}^n \\lambda_i^* y^{(i)} K(x^{(i)}, x').$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "fragment" } }, "source": [ "Using our earlier trick, we can use polynomial features of any degree $p$ in SVMs without forming these features and at no extra cost!" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Algorithm: Kernelized Support Vector Machine Classification (Dual Form)\n", "\n", "* __Type__: Supervised learning (binary classification)\n", "* __Model family__: Non-linear decision boundaries.\n", "* __Objective function__: Dual of SVM optimization problem.\n", "* __Optimizer__: Sequential minimal optimization.\n", "* __Probabilistic interpretation__: No simple interpretation!" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ " \n", "# Part 4: Types of Kernels\n", "\n", "Now that we have seen the kernel trick, let's look at several examples of kernels." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Review: Linear Model Family\n", "\n", "We will consider models of the form\n", "\n", "\\begin{align*}\n", "f_\\theta(x) = \\theta^\\top \\phi(x) + \\theta_0\n", "\\end{align*}\n", "\n", "where $x$ is the input and $y$ is the target. " ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Kernel Trick for Ridge Regression\n", "\n", "The normal equations provide a closed-form solution for $\\theta$:\n", "\n", "$$\\theta = (\\Phi^\\top \\Phi + \\lambda I)^{-1} \\Phi^\\top y.$$\n", "\n", "They can also be written in this form:\n", "\n", "$$\\theta = \\Phi^\\top (\\Phi \\Phi^\\top + \\lambda I)^{-1} y.$$\n", "\n", "The first approach takes $O(p^3)$ time; the second is $O(n^3)$ and is faster when $p > n$." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "subslide" } }, "source": [ "An interesting corollary is that the optimal $\\theta$ is a linear combination of the $n$ training set features:\n", "$$\\theta = \\sum_{i=1}^n \\alpha_i \\phi(x^{(i)}).$$\n", "We can compute a prediction $\\phi(x')^\\top \\theta$ for $x'$ without ever using the features (only their dot products):\n", "$$\\phi(x')^\\top \\theta = \\sum_{i=1}^n \\alpha_i \\phi(x')^\\top \\phi(x^{(i)}).$$\n", "Equally importantly, we can learn $\\theta$ from only dot products." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Review: Kernel Trick in SVMs\n", "\n", "Notice that in both equations, the features $x$ are never used directly. Only their *dot product* is used.\n", "\\begin{align*}\n", "\\sum_{i=1}^n \\lambda_i - \\frac{1}{2} \\sum_{i=1}^n \\sum_{k=1}^n \\lambda_i \\lambda_k y^{(i)} y^{(k)} \\phi(x^{(i)})^\\top \\phi(x^{(k)}) \\\\\n", "\\theta^\\top \\phi(x') = \\sum_{i=1}^n \\lambda_i^* y^{(i)} \\phi(x^{(i)})^\\top \\phi(x').\n", "\\end{align*}\n", "\n", "If we can compute the dot product efficiently, we can potentially use very complex features."
] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "slide" } }, "source": [ "# Definition: Kernels\n", "\n", "The *kernel* corresponding to features $\\phi(x)$ is a function $K : \\mathcal{X} \\times \\mathcal{X} \\to [0, \\infty]$ that outputs dot products between vectors in $\\mathcal{X}$\n", "$$K(x, z) = \\phi(x)^\\top \\phi(z).$$\n", "\n", "We will also consider general functions $K : \\mathcal{X} \\times \\mathcal{X} \\to [0, \\infty]$ and call these *kernel functions*." ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "subslide" } }, "source": [ "Kernels have various interpretations:\n", "* The dot product or geometrical angle between $x$ and $z$\n", "* A notion of similarity between $x$ and $z$" ] }, { "cell_type": "markdown", "metadata": { "slideshow": { "slide_type": "subslide" } }, "source": [ "In order to illustrate kernels, we will use this dataset." ] }, { "cell_type": "code", "execution_count": 17, "metadata": { "slideshow": { "slide_type": "fragment" } }, "outputs": [ { "data": { "text/plain": [ "(-3.0, 3.0)" ] }, "execution_count": 17, "metadata": {}, "output_type": "execute_result" }, { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAXYAAAD8CAYAAABjAo9vAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy86wFpkAAAACXBIWXMAAAsTAAALEwEAmpwYAAAfZklEQVR4nO3dd3gVZd7G8e8vhRBCD6EnBgXEDhIpiiuroKgoi4JixbKLYsG66oqKDRVRUbEgdrFR7NjABcSyKEFBqYLSWxIIoSQQkvO8f5DllU2Fc3ImmXN/rivXxTnznJl7BG6HmWfmmHMOERHxjyivA4iISGip2EVEfEbFLiLiMyp2ERGfUbGLiPiMil1ExGeCLnYzq2lmP5rZPDNbYGb3hSKYiIgcGAt2HruZGZDgnNtuZrHAt8ANzrlZoQgoIiL7JybYFbg9/2fYXvQytuhHdz2JiHgk6GIHMLNoYA7QGnjWOfdDCWMGAYMAEhISOrZr1y4UmxYRiRhz5szJcs4llTcu6FMx+6zMrD7wAXC9c25+aePS0tJcenp6yLYrIhIJzGyOcy6tvHEhnRXjnNsCTAd6hXK9IiJScaGYFZNUdKSOmcUDPYHFwa5XREQOTCjOsTcDXi86zx4FTHDOTQ7BekVE5ACEYlbML0CHEGQREZEQ0J2nIiI+o2IXEfEZFbuIiM+o2EVEfEbFLiLiMyp2ERGfUbGLiPiMil1ExGdU7CIiPqNiFxHxGRW7iIjPqNhFRHxGxS4i4jMqdhERn1Gxi4j4jIpdRMRnVOwiIj6jYhcR8RkVu4iIz6jYRUR8RsUuIuIzKnYREZ9RsVdD27Zt49FHH+WQNm2JiYmhYWIig6+5lmXLlnkdTXwoEAjw9ttv06Vje2rExlKrZhxnn9GLmTNneh1NSqFir2ays7M5vtuJfPjVN1x29+O8+v1v3PvGp2QWxNK5S1e+++47ryOKjwQCAS696ALuv/U6TknI4q2+B/NS74NomTWP8/r0ZvRTT3odUUpgzrmwbzQtLc2lp6eHfbt+cMFFF5Hj4rj41vsws32Wzft+Bq88cCurV64kLi7Oo4TiJ2PHjuWp++7g3hOSiIvZ9zgwY8du7pixgX/P/I5jjjnGo4SRxczmOOfSyhunI/ZqZOPGjXz66af0vermYqUOcMzx3Wl5SDsmTZrkQTrxG+ccTz42ggGH1i5W6gCNE2Lp1SqB0U+O8iCdlEXFXo2kp6fT9qhjqV23fqljjjmxBzO+1rlPCd62bdtYsWo1RzepVeqYTs1q8c3X08OYSioi6GI3s2Qzm25mC81sgZndEIpgUpyZ4QKBMscEAoESj+ZF9td//xyVdbI24NCftyooFEfsBcAtzrnDgS7AtWZ2eAjWK/+jc+fOLJ3/M1uzN5c6Zu7XX3LKyX8NYyrxqzp16tDm4IOZu35HqWNmrc+j+yk9wphKKiLoYnfOrXfO/VT0623AIqBFsOuV4hITE+l7zrlMfOYRSrroPXva52SuWUnfvn09SCd+dPPt/+LtJdvJ3V1YbNm6bflMXb6dITfe7EEyKUtIZ8WYWSowEzjSObe1tHGaFXPgtm7dyik9TyUQG89pF/2D1HZHsSVrIzM/Gs8PUz7m888+JS2t3IvmIhXinOPqf/ydqZ+8T5+D4+nQLIHdhY5vV29n8h87eOTxUVx55d+9jhkxKjorJmTFbma1ga+B4c6590tYPggYBJCSktJx5cqVIdluJMrLy+O1117j+RdeZNXK5dSpW48LBwzguuuuJTk52et44jPOOT755BOefmIkP/08l+joaE477TRuvOWfOogIs7AWu5nFApOBL51zT5Q3XkfsIiL7L2zz2G3PJfGXgUUVKXUREalcoZgVcwJwCXCymc0t+jkjBOsVEZEDEBPsCpxz3wKayCoiUkXozlMREZ9RsYuI+IyKXUTEZ1TsIiI+o2IXEfEZFbuIiM+o2EVEfEbFLiLiMyp2ERGfUbGLiPiMil1ExGdU7CIiPqN
iFxHxGRW7iIjPqNhFRHxGxS4i4jMqdhERn1Gxi4j4TNBfjSdSGfLz85k2bRoZGRm0bNmSk046iejoaK9jiVQLKnapcp4fM4Zhw+6lccuDaNSsJetX/k7u1i088dhI+vfv73U8kSpPxS5Vyqgnn+SJp5/hltHjSGlz2N73l8ydzbVDriUQCHD++ed7mFCk6jPnXNg3mpaW5tLT08O+XanacnJySDkolfvGTaZJy4OKLV82/2eev2Mwq1auICZGxyQSecxsjnMurbxxungqVcbEiRM5stMJJZY6QOsjO9CgSTOmTJkS5mQi1YuKXaqMlStX0rRV2zLHtDjkUFatWhWmRCLVk4pdqoxGjRqxJXN9mWOyN64nMTExTIlEqicVu1QZ/fv3Z/a0z9mxLafE5ZnrVrN0/s+cccYZYU5W/TnnyMjIYPXq1RQUFHgdRyqZil2qjObNm3PJxZfwzO2Dyd2+bZ9lOZsyGX3b1dz2z3+SkJDgUcLq6e2336b9kYfRutVBHHvU4TRvksRdQ+8kNzfX62hSSTQrRqqUgoICrrt+CO+++y6dTz2LxGbJbFy5jNnTv2DIddfzwAP3Y2Zex6w27rrzDt56aQyXHl6HDs0SiDJjVc4uJizZRn79ZKbN/JZatWp5HVMqKKyzYszsFTPLMLP5oVifRK6YmBjGPP8c8+b+TPf27WhiuZx5YieWLlnCgw8+oFLfD+np6bz4/LPc3y2Jjs1rE1X03y6lXhy3HJdIbPZqRjzysMcppTKE5IjdzP4CbAfecM4dWd54HbGLVL7LLrkIFkzjnHb1S1y+cssuhs/OYc36jbovoJoI6xG7c24msDkU6xKR0Pg5fTZHJcWVuvyg+nEU5O8iIyMjjKkkHMJ28dTMBplZupmlZ2ZmhmuzIhErtkYNdhYESl1eGHDs2l1AXFzp5S/VU9iK3Tk31jmX5pxLS0pKCtdmRSJWn3P78936/FKXz1m/nXZt2+q+AB/SdEcRn/rHoKv4cV0e8zbsKLZsS14B4xZu57ahd1doXTNnzqR/3z6ktmxG64OSuW7w1SxZsiTUkSVEVOwiPtW0aVPe++hjnvoph+d/zuaXDTtYkpXHxEXZ3DJ9A1dcM6TcxyA757j9n7dyQd+zaLj6R25vX4shh8eQ9f2HdD2uIxMnTgzT3sj+CMmlcDN7B+gONDKzNcAw59zLoVi3iBy4k046iQVLfuOlF8fy0XsTyc/PJ63TSUx99Sbat29f7ucnTpzIhNdfYkT3JtSN+/8vOkltUJOuzeO56srL6dChA61bt67EvZD9pRuURKRUnTocQ886m+jcsk6Jy9/4NZvkv/Zn1NOjw5wsMumxvSISlO3bt/PrwkWkNa9d6pjjW8Tz5eefhjGVVISKXURKVFhYSFSUEVXGzb4xUeihYlWQil1ESlS3bl2aNWnCoqy8Usf8tCGPzl2PD2MqqQgVu4iUyMwYcvOtjF+yg4JA8Wtx2XkFfL58B9ffeLMH6aQsKnYRKdU111xD8pFp3P99Jr9u3IFzjt2Fjpkrt3LXNxlcd+MtdOrUyeuY8j/05B8RKVVMTAzvfzyZMWPGMHrU46z45ndcwNGlU0eee/UuzjrrLK8jSgk03VFESuScK/aY5Ly8PKKjo6lRo4ZHqSKbpjuKyH5bsGABl11yEXUS4omOjubglJaMfPRRduzY81iC+Ph4lXo1oGIXEQCmTJnCiV27EJj/b545tSXvndeWwe1i+OiFkZzYtTM5OSV/F61UPSp2EWHbtm1ccF4/buvckH6HNaBBfAzRUcahjeL5Z6dEmu7O5OYbrvc6plSQil1EGDduHIcnxXN4UvHvPzUzBhxWl0mTJpGdne1BOtlfKnYR4dsZ0+iQWHod1K8ZQ6tGdZg7d274QskBU7GLCGZGCfcg7SPgICpKlVEd6HdJROh5+pnMziwsdXlW7m5Wbd5Bx44dw5hKDpSKXUQ4//zzWbG1gPS124stKww43liwlUsHXkrt2qU/6VG
\n", "text/plain": [ "