diff --git a/notebooks/Conjugate Gradient.ipynb b/notebooks/Conjugate Gradient.ipynb index 4c5d7e0d..102a0bd0 100644 --- a/notebooks/Conjugate Gradient.ipynb +++ b/notebooks/Conjugate Gradient.ipynb @@ -1,505 +1,512 @@ { - "metadata": { - "language": "haskell", - "name": "", - "signature": "sha256:8332eed5b1a2647ecfe6b707d1d07de0e8798861c517cac876970de5eb31e43c" - }, - "nbformat": 3, - "nbformat_minor": 0, - "worksheets": [ + "cells": [ { - "cells": [ + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the previous notebook, we set up a framework for doing gradient-based minimization of differentiable functions (via the `GradientDescent` typeclass) and implemented simple gradient descent for univariate functions. Next, let's try to extend this framework to a faster method such as nonlinear Conjugate Gradient, and see what modifications we'll need to make in order to accomodate it.\n", + "$\\newcommand\\vector[1]{\\langle #1 \\rangle}\\newcommand\\p[2]{\\frac{\\partial #1}{\\partial #2}}\\newcommand\\R{\\mathbb{R}}$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Conjugate Gradient\n", + "===\n", + "Before diving in to Haskell, let's go over exactly what the conjugate gradient method is and why it works. The \"normal\" conjugate gradient method is a method for solving systems of linear equations. However, this extends to a method for minimizing quadratic functions, which we can subsequently generalize to minimizing arbitrary functions $f\\!:\\!\\R^n \\to \\R$. We will start by going over the conjugate gradient method of minimizing quadratic functions, and later generalize.\n", + "\n", + "Suppose we have some quadratic function\n", + "$$f(x) = \\frac{1}{2}x^T A x + b^T x + c$$\n", + "for $x \\in \\R^n$ with $A \\in \\R^{n \\times n}$ and $b, c \\in \\R^n$.\n", + "\n", + "We can write any quadratic function in this form, as this generates all the coefficients $x_ix_j$ as well as linear and constant terms. In addition, we can assume that $A = A^T$ ($A$ is symmetric). (If it were not, we could just rewrite this with a symmetric $A$, since we could take the term for $x_i x_j$ and the term for $x_j x_i$, sum them, and then have $A_{ij} = A_{ji}$ both be half of this sum.)\n", + "\n", + "Taking the gradient of $f$, we obtain\n", + "$$\\nabla f(x) = A x + b,$$\n", + "which you can verify by writing out the terms in summation notation.\n", + "\n", + "If we evaluate $-\\nabla f$ at any given location, it will give us a vector pointing towards the direction of steepest descent. This gives us a natural way to start our algorithm - pick some initial guess $x_0$, compute the gradient $-\\nabla f(x_0)$, and move in that direction by some step size $\\alpha$. Unlike normal gradient descent, however, we do not have a fixed step size $\\alpha$ - instead, we perform a line search in order to find the *best* $\\alpha$. 
This $\\alpha$ is the value of $\\alpha$ which brings us to the minimum of $f$ if we are constrainted to move in the direction given by $d_0 = -\\nabla f(x_0)$.\n", + "\n", + "Note that computing $\\alpha$ is equivalent to minimizing the function\n", + "$$\\begin{align*}\n", + "g(\\alpha) &= f(x_0 + \\alpha d_0) \\\\\n", + "&= \\frac{1}{2}(x_0 + \\alpha d_0)^T A (x_0 + \\alpha d_0) + b^T (x_0 + \\alpha d_0) + c\\\\\n", + "&= \\frac{1}{2}\\alpha^2 {d_0}^T A d_0 + {d_0}^T (A x_0 + b) \\alpha + (\\frac{1}{2} {x_0}^T A x_0 + {x_0}^T d_0 + c)\n", + "\\end{align*}$$\n", + "Since this is a quadratic function in $\\alpha$, it has a unique global minimum or maximum. Since we assume we are not at the minimum and not at a saddle point of $f$, we assume that it has a minimum. \n", + "\n", + "The minimum of this function occurs when $g'(\\alpha) = 0$, that is, when\n", + "$$g'(\\alpha) = ({d_i}^T A {d_i})\\alpha + {d_i}^T(A x_i + b) = 0.$$\n", + "\n", + "Solving this for $\\alpha$, we find that the minimum is at\n", + "$$\\alpha = -\\frac{{d_i}^T (A x_i + b)}{{d_i}^T A d_i}.$$\n", + "\n", + "Note that since the directon is the negative of the gradient, a.k.a. the direction of steepest descent, $\\alpha$ will be non-negative. These first steps give us our second point in our iterative algorithm:\n", + "$$x_1 = x_0 - \\alpha \\nabla f(x_0)$$\n", + "\n", + "If this were simple gradient descent, we would iterate this procedure, computing the gradient at each next point and moving in that direction. However, this has a problem - by moving $\\alpha_0$ in direction $d_0$ (to find the minimum in direction $d_0$) and then moving $\\alpha_1$ in direction $d_1$, we may *ruin* our work from the previous iteration, so that we are no longer at a minimum in direction $d_0$. In order to rectify this, we require that our directions be *conjugate* to one another.\n", + "\n", + "We define two vectors $x$ and $y$ to be conjugate with respect to some semi-definite matrix $A$ if $x^T A y = 0$. (Semi-definite matrices are ones where $x^T A x \\ge 0$ for all $x$, and are what we require for conjugate gradient.)\n", + "\n", + "Since we have already moved in the $d_0 = -\\nabla f(x_0)$ direction, we must find a new direction $d_1$ to move in that is conjugate to $d_0$. How do we do this? Well, let's compute $d_1$ by starting with the gradient at $x_1$ and then subtracting off anything that would counter-act the previous direction:\n", + "$$d_1 = -\\nabla f(x_1) + \\beta_0 d_0.$$\n", + "\n", + "This leaves us with the obvious question - what is $\\beta_0$? We can derive that from our definition of conjugacy. Since $d_0$ and $d_1$ must be conjugate, we know that ${d_1}^T A d_0 = 0$. Expanding $d_1$ by using its definition, we get that ${d_1}^T A d_0 = -\\nabla f(x_1)^TAd_0 + \\beta_0 {d_0}^TA d_0 = 0$. Therefore, we must choose $\\beta_0$ such that\n", + "$$\\beta_0 = \\frac{\\nabla f(x_1)^T A d_0}{{d_0}^T A d_0}.$$\n", + "\n", + "Choosing this $\\beta$ gives us a direction conjugate to all previous directions. Interestingly enough, iterating this will *keep* giving us conjugate directions. After generating each direction, we find the best $\\alpha$ for that direction and update the current estimate of position.\n", + "\n", + "Thus, the full Conjugate Gradient algorithm for quadratic functions:\n", + "\n", + "> Let $f$ be a quadratic function $f(x) = \\frac{1}{2}x^T A x + b^T x + c$\n", + "which we wish to minimize.\n", + "> 1. 
**Initialize:** \n", + "Let $i = 0$ and $x_i = x_0$ be our initial guess, and compute $d_i = d_0 = -\\nabla f(x_0)$.\n", + "> \n", + "> 2. **Find best step size:**\n", + "Compute $\\alpha$ to minimize the function $f(x_i + \\alpha d_i)$ via the equation\n", + "$$\\alpha = -\\frac{{d_i}^T (A x_i + b)}{{d_i}^T A d_i}.$$\n", + "> \n", + "> 3. **Update the current guess:**\n", + "Let $x_{i+1} = x_i + \\alpha d_i$.\n", + ">\n", + "> 4. **Update the direction:**\n", + "Let $d_{i+1} = -\\nabla f(x_{i+1}) + \\beta_i d_i$ where $\\beta_i$ is given by\n", + "$$\\beta_i = \\frac{\\nabla f(x_{i+1})^T A d_i}{{d_i}^T A d_i}.$$\n", + ">\n", + "> 5. **Iterate:** Repeat steps 2-4 until we have looked in $n$ directions, where $n$ is the size of your vector space (the dimension of $x$)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Nonlinear Conjugate Gradient\n", + "---\n", + "So, now that we've derived this for quadratic functions, how are we going to use this for general nonlinear optimization of differentiable functions? To do this, we're going to reformulate the above algorithm in *slightly* more general terms.\n", + "\n", + "First of all, we will revise step two. Instead of \n", + "\n", + "> **Find best step size:**\n", + "Compute $\\alpha$ to minimize the function $f(x_i + \\alpha d_i)$ via the equation\n", + "$$\\alpha = -\\frac{{d_i}^T (A x_i + b)}{{d_i}^T A d_i}.$$\n", + "\n", + "we will simply use a line search:\n", + "\n", + "> **Find best step size:**\n", + "Compute $\\alpha$ to minimize the function $f(x_i + \\alpha d_i)$ via a line search in the direction $d_i$.\n", + "\n", + "In addition, we must reformulate the computation of $\\beta_i$. There are several ways to do this, all of which are the same in the quadratic case but are different in the general nonlinear case. We reformulate this computation by generalizing. Note that the difference between $x_{k+1}$ and $x_k$ is entirely in the direction $d_k$, so that for some constant $c$, $x_{k+1} - x_k = c d_k$. Since $\\nabla f(x) = A x + b$, \n", + "$$ \\nabla f(x_{k+1}) - \\nabla f(x_k) = (A x_{k+1} + b) - (A x_k + b) = A(x_{k+1}-x_k) = cA d_k.$$\n", + "\n", + "Therefore, $A d_k = c^{-1} (\\nabla f(x_{k+1}) - \\nabla f(x_k))$. We can now plug this in to the equation for $\\beta_i$ and obtain\n", + "$$\\beta_k = \\frac{\\nabla f(x_{k+1})^T (\\nabla f(x_{k+1}) - \\nabla f(x_k))}{{d_k}^T (\\nabla f(x_{k+1}) - \\nabla f(x_k))}.$$\n", + "\n", + "Conveniently enough, the value of $c$ cancels, as it is both in the numerator and denominator. This gives us the new update rule:\n", + "\n", + "> **Update the direction:**\n", + "Let $d_{k+1} = -\\nabla f(x_{k+1}) + \\beta_k d_k$ where $\\beta_k$ is given by\n", + "$$\\beta_k = \\frac{\\nabla f(x_{k+1})^T (\\nabla f(x_{k+1}) - \\nabla f(x_k))}{{d_k}^T (\\nabla f(x_{k+1}) - \\nabla f(x_k))}.$$\n", + "\n", + "We can now apply this algorithm to any nonlinear and differentiable function! This reformulation of $\\beta$ is known as the Polak-Ribiere method; know that there are others, similar in form and also in use." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Line Search\n", + "---\n", + "The one remaining bit of this process that we haven't covered is step two: the line search. As you can see above, we are given a point $x$, some vector $v$, and a multivariate function $f\\!:\\!\\R^n \\to \\R$, and we wish to find the $\\alpha$ which minimizes $f(x + \\alpha v)$. 
Note that a line search can be viewed simply as root finding, since we know that $v \\cdot \\nabla f(x + \\alpha v)$ should be zero at the minimum. (Since if it were non-zero, we could move from that minimum to a better location.)\n", + "\n", + "There are many ways to do this line search, and they can range from relatively simple linear methods (like the [secant method](http://en.wikipedia.org/wiki/Secant_method)) to more complex (using quadratic or cubic polynomial approximations). \n", + "\n", + "One simple method for a line search is known as the **bisection method**. The bisection method is simply a binary search. To minimize a univariate function $g(x)$, it begins with two points, $a$ and $b$, such that $g(a)$ and $g(b)$ have opposite signs. By the intermediate value theorem, $g(x)$ must have a root in $[a, b]$. (Note that in our case, $g(\\alpha) = v \\cdot \\nabla f(x + \\alpha v)$.) It then computes their midpoint, $c = \\frac{a + b}{2}$, and evaluates the function $g$ to compute $g(c)$. If $g(a)$ and $g(c)$ have opposite signs, the root must be in $[a, c]$; if $g(c)$ and $g(b)$ have opposite signs, then $[c, b]$ must have the root. At this point, the method recurses, continuing its search until it has gotten close enough to the true $\\alpha$.\n", + "\n", + "Another simple method is known as the **secant method**. Like the bisection method, the secant method requires two initial points $a$ and $b$ such that $g(a)$ and $g(b)$ have opposite signs. However, instead of doing a simple binary search, it does linear interpolation. It finds the line between $(a, g(a))$ and $(b, g(b))$:\n", + "$$g(x) \\approx \\frac{g(b) - g(a)}{b - a}(x - a) + g(a)$$\n", + "\n", + "It then finds the root of this linear approximation, setting $g(x) = 0$ and finding that the root is at\n", + "$$\\frac{g(b) - g(a)}{b - a}(x - a) + g(a) = 0 \\implies x = a -\\frac{b - a}{g(b) - g(a)}g(a).$$ \n", + "\n", + "It then evaluates $g$ at this location $x$. As with the bisection method, if $g(x)$ and $g(a)$ have opposite signs, then the root is in $[a, x]$, and if $g(x)$ and $g(b)$ have opposite signs, the root must be in $[x, b]$. As before, root finding continues via iteration, until some stopping condition is reached.\n", + "\n", + "There are more line search methods, but the last one we will examine is one known as **Brent's method**. Brent's method is a combination of the secand method and the bisection method. Unlike the previous two methods, Brent's method keeps track of three points:\n", + "\n", + "- $a_k$: the current \"contrapoint\"\n", + "- $b_k$: the current guess for the root\n", + "- $b_{k-1}$: the previous guess for the root\n", + "\n", + "Brent's method then computes the two possible next values: $m$ (by using the bisection method) and $s$ (by using the secant method with $b_k$ and $b_{k-1}$). (On the very first iteration, $b_{k-1} = a_k$ and it uses the bisection method.) If the secant method result $s$ lies between $b_k$ and $m$, then let $b_{k+1} = s$; otherwise, let $b_{k+1} = m$.\n", + "\n", + "After $b_{k+1}$ is chosen, it is checked to for convergence. If the method has converged, iteration is stopped. If not, the method continues. A new contrapoint $a_{k+1}$ is chosen such that $b_{k+1}$ and $a_{k+1}$ have opposite signs. The two choices for $a_{k+1}$ are either for it to remain unchanged (stay $a_k$) or for it to become $b_k$ - the choice depends on the signs of the function values involved. 
Before repeating, the values of $f(a_k{+1})$ and $f(b_{k+1})$ are examined, and $b_{k+1}$ is swapped with $a_{k+1}$ if it has a higher function value. Finally, the method repeats with the new values of $a_k$, $b_k$, and $b_{k-1}$.\n", + "\n", + "Brent's method is effectively a heuristic method, but is nice in practice; it has the reliability of the bisection method and gains a boost of speed from its use of the secant method." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Implementation\n", + "---\n", + "\n", + "Now that we've reviewed the conjugate gradient method, let's revise our previous gradient descent framework to so that we can implement conjugate gradient (using Brent's method for its line search).\n", + "\n", + "Recall that in the previous notebook, we defined a class that allowed us to do gradient descent on arbitrary function-like data types:" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "-- Extensions and imports we'll need later.\n", + ":set -XTypeFamilies -XFlexibleContexts -XMultiParamTypeClasses -XDoAndIfThenElse -XFlexibleInstances\n", + "import Control.Monad.Writer\n", + "import Text.Printf" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "class Monad m => GradientDescent m a where\n", + " -- Type to represent the parameter space.\n", + " data Params a :: *\n", + " \n", + " -- Compute the gradient at a location in parameter space.\n", + " grad :: a -> Params a -> m (Params a)\n", + " \n", + " -- Move in parameter space.\n", + " paramMove :: Double -- Scaling factor.\n", + " -> Params a -- Direction vector.\n", + " -> Params a -- Original location.\n", + " -> m (Params a) -- New location." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This same class isn't going to work quite as nicely in this case, because we must be able to compute\n", + "$$\\beta_k = \\frac{\\nabla f(x_{k+1})^T (\\nabla f(x_{k+1}) - \\nabla f(x_k))}{{d_k}^T (\\nabla f(x_{k+1}) - \\nabla f(x_k))}.$$\n", + "\n", + "Since both the gradients and the search directions are represented as vectors in the parameter space (`Param a`), we must be able to take the dot product of any two such vectors. We already have the capability to add and subtract them via `paramMove`, though.\n", + "\n", + "One option is to add something like `paramDot` to `GradientDescent`, and call it a day. One one hand, that is simple; on the other hand, it seems to conflate two independent notions - the ability to do gradient descent and the ability to use `Param a` as a vector space. 
Instead of doing that, we can require that the parameters form an inner product space:" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "-- We will call this a vector space, though the definition actually\n", + "-- requires an inner product, since it requires an implementation of `dot`.\n", + "class VectorSpace v where\n", + " -- Add two vectors in this inner product space.\n", + " add :: v -> v -> v\n", + " \n", + " -- Scale a vector.\n", + " scale :: Double -> v -> v\n", + " \n", + " -- Take the inner product of two vectors.\n", + " dot :: v -> v -> Double\n", + " \n", + " -- For convenience.\n", + " minus :: v -> v -> v\n", + " minus a b = add a (scale (-1) b)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, instead of requiring `GradientDescent` instances to provide `paramMove`, we'll just require that the parameters form a vector space:" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "class (Monad m, VectorSpace (Params a)) => GradientDescent m a where\n", + " -- Type to represent the parameter space.\n", + " data Params a :: *\n", + " \n", + " -- Compute the gradient at a location in parameter space.\n", + " grad :: a -> Params a -> m (Params a)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Great! Now we start implementing these methods. In order to avoid spending too much time on line searches, let's just go with a simple bisection search for the time being.\n", + "\n", + "The implementation is pretty simple:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "-- A point consisting of a value and the function at that value.\n", + "-- The stopping condition is implemented as a function\n", + "-- Point -> Point -> Bool\n", + "-- That way, the stopping condition can decide based on convergence\n", + "-- of the x-coordinate or of the function values.\n", + "newtype Point = Point {unPt :: (Double, Double)}\n", + "\n", + "bisectionSearch :: Monad m\n", + " => (Double -> m Double) -- What function f to find the root of\n", + " -> Double -- Starting point\n", + " -> Double -- Second starting point\n", + " -> (Point -> Point -> Bool) -- Whether to stop\n", + " -> m Double -- Approximate root location.\n", + "bisectionSearch f a b stop = do\n", + " let midpoint = (a + b) / 2\n", + " aValue <- f a\n", + " bValue <- f b\n", + " \n", + " -- Check if we're done with these two values.\n", + " if stop (Point (a, aValue)) (Point (b, bValue))\n", + " then \n", + " -- If we are, return their midpoint.\n", + " return midpoint\n", + " else do\n", + " -- If we're not done, change one of the values to the midpoint.\n", + " -- Keep the two values having opposite signs, though.\n", + " midvalue <- f midpoint\n", + " if signum midvalue /= signum aValue\n", + " then bisectionSearch f midpoint a stop\n", + " else bisectionSearch f midpoint b stop" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that we have our line search implemented, we can go ahead and implement the actual conjugate gradient algorithm." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "newtype StopCondition m a = StopWhen (Params a -> Params a -> m Bool)\n", + "\n", + "conjugateGradient :: GradientDescent m a =>\n", + " a -- What to optimize.\n", + " -> StopCondition m a -- When to stop.\n", + " -> Params a -- Initial point (x0).\n", + " -> m (Params a) -- Return: Location of minimum.\n", + "conjugateGradient f (StopWhen stop) x0 = go x0 Nothing\n", + " where\n", + " go x prevDir = do\n", + " -- Compute the search direction\n", + " gradVec <- grad f x\n", + " let dir = case prevDir of\n", + " -- If we have no previous direction, just use the gradient\n", + " Nothing -> scale (-1) gradVec\n", + "\n", + " -- If we have a previous direction, compute Beta and \n", + " -- then the conjugate direction in which to search.\n", + " Just (prevX, prevGrad, prevDir) ->\n", + " let diff = gradVec `minus` prevGrad\n", + " numerator = gradVec `dot` diff\n", + " denominator = prevDir `dot` diff\n", + " beta = max 0 $ numerator / denominator in\n", + " scale beta prevDir `minus` gradVec\n", + "\n", + " -- To minimize f(x + \\alpha d_k), we find the zero of\n", + " -- the dot product of the gradient and the direction\n", + " let lineVal alpha = do\n", + " let loc = x `add` scale alpha dir\n", + " gradient <- grad f loc\n", + " return $ gradient `dot` dir\n", + "\n", + " -- Stop when alpha is close enough\n", + " let stopLineSearch p1 p2 = \n", + " let val1 = fst $ unPt p1\n", + " val2 = fst $ unPt p2 in\n", + " abs (val1 - val2) < 0.1\n", + "\n", + " -- Find the best alpha value\n", + " alpha <- bisectionSearch lineVal 0 0.5 stopLineSearch\n", + "\n", + " -- Compute the new location, and check if we want to continue iterating.\n", + " let xNew = x `add` scale alpha dir\n", + " shouldStop <- stop x xNew\n", + " if shouldStop\n", + " then return xNew\n", + " else go xNew $ Just (x, gradVec, dir)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's try this out on a two-variable function. Since we do a line search, doing a single-dimensional conjugate gradient would be pointless." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "-- We need FlexibleInstances for declarations like these!\n", + "-- We must declare these instances together, because they have recursive dependencies on each other.\n", + "instance VectorSpace (Params (Double -> Double -> Double)) where\n", + " add (Arg a b) (Arg x y) = Arg (a + x) (b + y)\n", + " dot (Arg a b) (Arg x y) = a * x + b * y\n", + " scale s (Arg a b) = Arg (s * a) (s * b)\n", + " \n", + "-- In addition to our usual definition, let's log the number of function\n", + "-- gradient evaluations using a Writer monad.\n", + "instance GradientDescent (Writer [String]) (Double -> Double -> Double) where\n", + " -- The parameter for a function is just its argument.\n", + " data Params (Double -> Double -> Double) = Arg { x :: Double, y :: Double }\n", + "\n", + " -- Use numeric differentiation for taking the gradient.\n", + " grad f (Arg x y) = do\n", + " let dx = f x y - f (x - epsilon) y\n", + " dy = f x y - f x (y - epsilon)\n", + " gradient = (dx / epsilon, dy / epsilon)\n", + " tell [ \"Gradient at\\t\" ++ show' (x, y) ++ \"\\tis\\t\" ++ show' gradient ]\n", + " return $ uncurry Arg gradient\n", + " where epsilon = 0.0001\n", + " show' (x, y) = printf \"%.5f, \\t%.5f \" x y" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can define a function $f = x^2 + y^2 + 3$, which looks like this:\n", + "\n", + "![](data:image/jpeg;base64,/9j/4AAQSkZJRgABAQIAOQA5AAD/2wBDAAMCAgICAgMCAgIDAwMDBAYEBAQEBAgGBgUGCQgKCgkICQkKDA8MCgsOCwkJDRENDg8QEBEQCgwSExIQEw8QEBD/2wBDAQMDAwQDBAgEBAgQCwkLEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBD/wAARCADUAjsDASIAAhEBAxEB/8QAHgABAAIDAAMBAQAAAAAAAAAAAAYHBAUIAgMJAQr/xABKEAABAwQBAgQEAwUFBQUHBQEBAgMEAAUGEQcSIRMiMUEIFDJRQmFxCRUjM4EWJENSYhdygpGhJTRjorEYRFNzksHRJjVUZMTh/8QAGwEBAAIDAQEAAAAAAAAAAAAAAAMEAQIFBgf/xAA+EQABAgMFBQcCBQIGAgMAAAABAAIDESEEEjFBUQUiYXHwBhMygZGhscHRFCNCUuFi8QckM3KS4jSygqLS/9oADAMBAAIRAxEAPwD6p0pSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlfhUlOupQGzobPqawb9frNi9knZHkVzj262WyOuVMlyFhDbDKElSlqJ9AACa5Tv9ru3xGXA59lE67WGzRD4mDQWC5Fl25ST5Ls8lX/vS9AoQpOm2j0qBK3BViz2aJan3GLVzgwTK67pVTcHctTcsEzj3PFNR87xtpHz6UJ6GrnGPZu4Rx6dC9aWgd217SexQpVs1C9jmOLXCRCyDOoSlKVqspSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiV6332YrDkmS8hplpJW44tQSlCQNkknsAB715OONstqddWlCEAqUpR0EgepJ9hXMnIl+k/EuudhNtXLh8VJCmLlcGH1x3slc3osR1pIUmGD9bgI8X6RtHUTLBgvjvuMFVq5waJlL5kj/wARF+auUeYtPF9mkhcCOlJCcnlI0RKcJHeG2r+UgdnVp8Q7SEbl6xVccd5Dd8TvI4bzuT4k+GyV47dF9ITebegaAOtASWk6S4jQ6hpaRoqCbHWa9XYoLIEO6zz5qlEeXGZURzTEnr07ByTHpoteV2BapNluQKgG3CO7TwSQXI7n0uNnsR3GlAEWzw3y7C5Ss8qPOgmz5TYnExb/AGVxW1xHyNpcQf8AEjuAFTTo7KAI7KSpIrPM8vsOC49LyfI5ZjwogSD0IK3HXFKCW2m0J2pbi1FKUoAJKiAKrKx4nysu7K5ziXkWfPVpSm32V5R/d7VrSVKTbZYT/MW51Fa3e5bcI6NpR5oNo2EWrehjfHuOP0W0KLcocF25SoLxHy3Y+WrA5PhRX7Xd7c58rebLL0JVtlAd21gdlJPqhxPlWnRBqdV5cgtMiruKUpSsIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIleLjjbLannnEoQhJUpSjoJA9ST7CvRcrlb7Nb5N2u05iFChtLfkSZDgbaZbSNqWtR7JSACST2Fc55TfLv8AEm4LZGRMtPE6VBT6lhTMrLQD2Ro6U1b+wJ35pAOvK1vxJYUF8Z11i1c4MEyszKMtk8/yXbPYZTjHGkZ1TUuY0spXkjiFaU02odxDCgQpQ7ukEDybKpO1HjxGG4kVhtlhlAbbbbSEpQkDQSAOwAHtVUyGm/h6u7k5t1DPGN3k/wAaOhnSMbmOq7up6eyYbiz5hr+G4rq+lSim2VFKgCCCD3BH2r0djgsgNujHNUYkQvM1E+Q8DtHIVhVZ7k6/Eks
OJk2+4xldEm3ykd232l+ykn29FDaSCCRUawzkl4R7tjvJjkK0ZLizHj3NfiBMeXDAPTcGdns0sA9ST3QoKSfYqsaW/HiR3ZUp9tlllBccccUEpQkDZUSewAHvXPeZYHN+JWbDzKCtq0WLHNyMXkyIoU5epQUlQefSob/dxKEgNdi9/M7BLZVbcS0zZjpr/b+FGDPHBSbEYFw5UvsTlPKIc2FZIKyvFLLKR4Z6SnX7ykt+vjLBIaQru2g7IC1EJspwVGMCz9vL2pdqutuVZsmsxS1d7Q6vqUwo/S42rt4rC9EocA0RsHSgpIk66twAJTBnNRuJmohkNgvkG+xuROPJzVuy62t+ElTu/lrnG3sw5QH1Nn8K/qbV5k+4N58S8uY/yxZXpMBtduvNsWI16sslQ+atsjX0LA9Uq9UODyrT3HuBz/yRnczHTCxfEosW5ZhfipFqgvu9LbaE/wAyW/rulhoHZI7qUUoHdQrS4/xQ7x5MYzfj+7eBniFLeuF4lFRTfFLILrUxIPmaJGkJH8oBIb6dCudtLZothL4A3xjx4c+jkp4Mfu6OwXalKrbiHm2y8oMvWibbnsfy62Nhd1sEtXU4xslIdZc0EyGFEeV1HbuAoJVtIsmvKOa5hLXCRC6AIImEpSlarKUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlURcvic/sFzLdeL+XcTex+1u9EzH8macS5bpEEtpC1yVEhTCkO7QpRSW09bfUpPUkqveq55w4ud5JxZLthcjxcrsalzbDLkJJaD/TpTDw/Ew8nbbif8qtjzAEFgzlRWEw+xKZbkxnkPMupC23G1BSVpI2CCOxB+9eyuS+On8yx61KybhJ1ERmKtUG8cd3uSsQ4cxhag8xFd8yoLmyenpSplY6FBACuurx475ww/P7grGXUysfyyOwH5WO3dAZmIT7rb7lMhsHt4jRUnfqQe1FoyK2JQY6Kw6UpRSJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiVoM4zrFuOMalZZmF1RBt0QAFRSVuOuKOkNNNpBU66tRCUtpBUpRAAJNRnkzme1YLJbxiw2mRlWZzGw5Dx+3uJDobKukPyXD5Y0cHe3F+uiEBatJMEsmD5Her+zyBzBdYd7yNjZtsGIhQtdjSr1TGbWduPeypKwFqGwkNpJRViBZ3xzTDVRxIohjiobbcluPxHZncYvJVsuGO2jF5CXoWCTWyhc1pYBj3GeoEokJOldLCdttKB6+paR0XCoAAAAAAaAHtUM5Pwe7ZCLfleGTWbfmGOqW5a5L2/BkNrA8WHIA+pl0BIPqUqCVjSkg1nYFnVuz2yrnx2VQ7hBeVCu1tdUC9bpiP5jLn6eqVei0lKhsEGu3Z4bYIuAfyqER5fvLcXCFFuMORb58dD8aU0pl5pwbS4hQ0pJHuCDVQ47c3uFMgj8cZXdXXcTualDFbtLc38mUp2bZIcPp0gEsuK+pI6CepKeu3bnPhWqDIudzmMRIcRpT8iQ+4ENtNpG1KUo9gABsk1Sd9sX/tQwv3fe4EqHxU6kOJQ4FMSsjWD5HAOy2IqSApJ8q3To9kfXadld8XXstAZ44L2tLn8/XUSnmFM8YQHiW2nAQrKHknstQ//AISVDYB/nEdx4fZVsKASOkAAAAAAdgKrjBcnueFXaHw9yHJBnIaUjHLwrSWr3FbHZB12RKbRrrR+IDrT26gmx3KsQaieea0eVBeQ8AcyVyJkuNXFNmy2zpUbbc+jqSpJ7qjSEj+bHXodSfUHSkkKANRf/bxaYthmN3yzSI2aW99u3PYs0rxJT81wHwkxz/iMuAFaXuyQgKKunoUBYWYZZY8IsUjIshmfLxI+kgJSVuPOKOkNNoHdbilaSlIBJJFU5I4jy7Ori3zXc5y8c5CjI1jsVR8SNaoQ6v7nJSk6fLwUS8r1SSkII6OpUrrzT+Xjn9+emvusAgjeUz45we52MTMuzOU1PzHIA2u5yG0ANxW0jbcGP7hhratb7qUVrPdWhLnCai2D8jMZU6/jt7trliy22socudmfVtTYJIDrK/pfYUQelxP6KCVApEocNXoF26LuHWPFQvcZ1USzfHYU5prJ2b87jV6sKFyIGQRlpQ7AA0pXUVeVbJ6R4ja/ItI0oaqf/Dz8UMPktmFjWeRk2XJJyVuWh9bKo8TIoqfpkxAskoUpI6ywo+IE+YbTsikZ7g5uvkiyNIdGB2KX4U98KHRf5jZ80ZI9TGaUP4h7BxY6O6UrCptkuNWbJ7cLZd4niNtrS8w42otuxnkd0OsuJ0ptxJ7pWkgg+hqlbdls2mC9lHDA68+HHHyU0G0mz0NR8Lrelcv4Xz/k/FLMexcyvSL7jTISxHy2OwpcmMgI+q5NJ3sdiDIbGu460p7rrpa13W2Xy3RrvZrhGnQZjaXo8mM6lxp1tQ2FJUkkKBHuK8ZabLGsj+7jNkesF1ocVkVt5hWVSlKrqRKUpREpSlESlKURKUpREpSlESlKURKUpREpSlESlKURKUpREpSsG+X2y4zaZd/yO7w7XbIDSn5UyY+llhhtI2VrWohKQB7k0RZ1VPz78TnE/wAOVhVdM9vXiXFxlT0KyQul2fMA9VIbJHSge7iylA91egrm7nL9oa9LMnF/h4twcTpTbmW3Ng+Ak6HeHGVpTx7nTjnSgaBCXAaq/wCDn4f5vP8AyhM5c5Ek3C9WHHp6HZU+5ul96/XZshSGVKWD1R2PqUlOkhfQhI0lYHomdnbRAsR2ltD8uF+kHxPOQaDgMy40AqAaLS+CboXfnCGYZ3yBxva825DxFnFrjewqaxZ0ul12HDWdx0vqOtvFvpUsAAAnWtg1Pa/AABoV+151bpSlKIqH5mw+Zx/kb3PGHW9b8ZUdLOZWxgEmVDb7pntISCVSGBvY9VtbT3KUa9GS4vifLWLx3BNOpDHzNnvdue6JUJS07RIivp7oVoggg96v1aEOoU24hK0LBSpKhsEH1BFcUZ9xnnPwx8hKvnFt+Sxx9llyDqLLch/2Tbbg7sGKXAFLiNvuEKacSOhLhKFJUFo6chU7TBJ/MZiFuOF+aPiEwdmfg3JttPIsnFHRHuD0RKI16+VUT4E1DaulqaytAJ6klDgUlaCFqSSelcD5TwPkqO+7h+QsTH4auiZCWFMy4a/8rzCwHGj/ALyRv1GxXLt7zA5jPgZLjlpfxrlzFG3vBxq7vIYcvETsZMNLg2mQwsI6m3UEhDiW1KA7pNhR7FxtzbaLXn8Bh+LckpIiXeA6qHdLe6k9LjKnUEKSpC0lK2lbTtJCgRW12eCjZbC3xhdGUrme4c48ncG5LYMY5Bs0vP8AGsgdVEg5Fb2mY9wiyUp2mPKYJS08twBRQtstlRSUhClFIVdeEcr8e8ipcTiOURJkqP8A95gr6mJkY9uz0dwJdaPcdlpB7itSCMVdZEbEE2lS2lKVhbpSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlK/FKShJWtQSkdySdAVUN85+avbsixc
H2L+290bUtldxS94NjhOpV0qD83RC1JO9tsBxYI0QkHYyGlxkFgkNEyrUul1tljt0i73m4xoEGI2p6RJkupaaaQkbKlKUQEgD3NUfeuUc75hItfCrzmPYs4VJk5vKjJW5ISCAU2uO4CHNjepLqfDHYoS7vYgF/x7I7VyJar58SF5i5zjt5W2xBUIi2LVj11KiEIMPqU2tlzaUokPFbiHEgbAWOnoAgJT0gAADQA9BV2DZZmb/RVYlpyYqGkYWj4cbs/wAiYzIudxxe6qQc1amyVS5Yd7JF3Di9rWpI7PIB0UAKQkFHSq6Y8yLPiszoMlqRGkNpdZeaWFIcQobCkkdiCO4NZLqEOoU06hK0LBSpKhsKB9QRVLWxSfh8yNvGpanBxxkU3ps74b/h49OeX/3NZH0xnVq/hKIAbWejelI10mgQ8MFULr/NW+uqY5rKOKnXefbDIjx5EJDMW/W5RSgX6J19LbaSdbloKz4PqVElv0XsTjPOT8dweTDsriJF2yO67FssNuCXJszRAKgkkBttOx1OuFKE+6vSo/jPHl9vl/j8h8uuxZd8iLWuz2eKsuW+xJIKdtlQBeklJIU+oD1KUJQN9Ux3t0Y/C1BlUqM4vb5vxEx7byDmrYjYSVfNWXFg6FKkKSryyLn09i4kg6i7KGz/ADOpYARcKgANAAAdgKq3IsbvfE1/nch8fwn5+PXFwyckxpgdSuv8U+En8Luu7jQ7OAbGl/VYdlvtoyiywsisE9qbbriwmTGkNHaXG1DYI/8Awe4qeDShx664LR7sxgtRneEWDkHH3sdyKOtbC1JeZeaWW34r6Dtt9lwd23EK0UqH/pVaR+XX+KmpOKc53IJmQmuuz3ppg9OSNbCUtttpH/ftlKVMJ7rKgpAIJCbOzXM8fwOwyMjyWd8vEY0lKUJK3X3D2S002nzOOKPZKEgkk1VrvFlz5lQ5lXL8eTbF9l41ZY8npcsHoUS1rQdG4EgErSSGh5EE7Wtczp3vy/F1j9PtNahwlvYLPxDEciy6/scpcpQzFlshRx/HSsLasrShrxXdeVyYsfUvuGwehH4lLsZw1AMYz662S/R+NeUCli9uoP7qu+koi31tPqUkdm5IHdbPbf1I2NhM9cIq5Z5Sp56z49clHEcZ1UNz7j+3ZqzHmNTX7Pfrb1Ltd5hgCTDWSDob7LbUUp621bSsDRFVJN5Fy/OMoPANylxLDeWllF8vkF4+FKipbSsswe4U3LdSolTZJLKApQKvITYOe5nf7jeTxpxi6ycjcQly5XNxHiR7BFX6POD0W+ob8Jn3PmVpIO/I8NYP/YhOCyYb8mOHPmlT3Hj8+uaVdRmF8aUHysdXWNd+w7dq3LTEce78+PDnxyw5YDw0b3l9/wCFJLXZrVj1qi2Ox29iDb4LKWI0ZhAS202kaCQB7V5Ofaq4YzTKeNJibJyl1zrEt3w4OXp6EoAIJCJ7aQAwoAa8YDw1H16D2O65CzxrErVGFri/vW+XlwRbJbmjszHyN7JG+lpA2tbh7JSD7kA9CHFYGkmksuvaWOShcDPmtRyNllz+ejccYZpeS3thTiny2HGrVC2EuTHgex1shts/Wv26QojJwGBlvw/RW/8AY5OVMtjY8S4Y1dH9x7ksIALjTuiYr6ikEqSC2oklSCdEe3BMJ/sjDlTrrKbuWSXp0Sr1cw30GS96JQkEkpabT5G0b7JHuSSZA4a2fYYduYRaWznhw5cdfTALDbQ6Afyz/KvPinn3AOWlSLXaZb1ryOAP+0MeuiAxPin/ADdGyHWz7OtlSD999qsivn7y7LssWJb0ItIn5ZLf8HHG2JC40tMnXd1uQ3/EYQ2na1uJPZI13JAM/wCN+c+aeLIkG18lOL5KsqI7SH7lEZQxd4jgSAslvYblN72R9LoHr4hrx1v7OR7O4mzb7RpiPLPyyyXXs+0ocQDvN0n0XYdKinH3KWBcpWxV1wfI41xSzpMmOCW5MRZ9EPsL04yrsfKtIPv6VK686QQZFdLFKUpWESlKURKUpREpSlESlKURKUpREpSlESlKx59wgWqG7cLnNYiRWElbr77gbbbSPUqUewH60RZFeDrrTDannnEttoBUpSjoJA9yT6Vyxzj+0O4e4ztzrHHyV8j3xagzGYsz6PkS+rYShUw7bJ36hvrUkdyAO9cd8q818tc49TXJmUf9krIUMdtXVHtiNaIDid9ckgj1dUU7GwlNen2F2S2nt98oDLrKEudMCR0pM4HD1koItoZCxNV2JzN+0C40wdcix8Xwv9oF6ZBDkiJJDVoika34szSgsjv5WUuHYIUU+tcB5jzHyz8RN5XlvKmXuXKxtu9VkscZn5a2M6J/jpYBJcPshbqlr1s7HUAIfktwgznHccRKag2e3JQu7yAoNtoR2KYwII6SoaKvsnt+IV5RVZjn12tmH8cWh1Eu9SW4FsK0BD0xxXoGG1dkIAG1OuAJSkFWiBuvpWzuy+xuzrvxdoPeOhTMzKpGJDcA1uF505uoCS2Rr97EjUFJqxuK+Lck515DhcX4pIVEU+j5m7XNKOpNrgJIC3T2I8RX0NpPqo7+lKiPrdgGB4xxjhtpwLDbamDZrLGTFisglR6R6qUo91LUSVKUe5UST3NVl8Jnw1Wb4aeM28d8du5ZPd3BPyO791LmSyPoSojq8JseRsHXYE6BUau2vmnajtJG7SWzvXbsNtGt0Gp4nPyGAVuHDEMSSlKV5lSJSlKIlazJsZsOZY/cMWye1sXG1XRhcaXFfTtDrahog/8A2I7g6I71s6URch3fC4Vpv0TgjmBld1jeaZhGQKdcbkyGmkn+F8wCFtzmEE7UlXU43peyesCGoZzn4Ur/AHnPrxd5eYcY3XT17dQyDdbXI2EpnuoTpDzfQAl5aAlZCUrKVEKJ7D5M45sfKWKSMWvjkmPtaJEObEX0SYMpB21IZV7LQruN7B7gggkVyJY+ZM9wW/5HxZ8T+KwU/uZAC8gtbRdg3C3ODpRKfjbUtptfcLUOpCVdYX4YAKpGEGhXLtMAwjeZVpxH2V1fNcbc8YLKgQbtCvtluCOhTsR4FyO6k7StJHmaebUApJ7KSpIPYiofjFhx7kCTL405pskGXnmLsJUzemEmLLuEBe0tXCM+30uNrOul0II6HAR9Kkbrm04Fi+JZdbrTHu8m3Wa/ISnDMzsbyWnYi9KKbXKcTtqQ0Adx/GSpJH8P6gnq33KEfmewRoGSS7Sxfbvi7xlWnJrHGIfU1/jQ58IdS1MPIASpTBWevpcDaSgVMFVaZGTT11irTVaucONYRlcf53/bW2QGSoY9lSOuW8lI34bFyb0sLOtAvoe2T3Ir34B8YOCZHj8G955YL1gCZvU3415aCoKHkL6HGlymypDK0L2lSX/CWCD5azOI+Vsa5iwa3Z1i7jgjzElL8V5PRIhSEHpdYeQe6FoUCCD+R9CKi2XtjhrLpXJ8KO49hmSPIRmUFCOtuC90hDd2SgD0ACUSP9AS5/hq6hgtdUKzCtb2m66q6Is1/sWRQ03CwXmDcoq/peiSEPIPbfZSSR6EVn1Rc/gTi
C7yTfLXjCLBcZCkvqumMy3bRKe7dip6IptTgI7eYkEVCLrP554m5Btlki8wtT8Qvwbg2R3KbYiZ4Nx7/wB0kSGPBdHij+U6tS/MkpIUSnejoDhgrTLZDfQ0XVdKp1vlHmuwDw8q4O/fYT/7xid8jvdQ+5ZmmOUn8gtX5E+lY1y+LHAMYdgsZ/imdYq5cpIhxBOxuRIS9IIJDSVRA8kqOjoA96iLHDEKdsVjsCrrpVWxfih+H999ESVyvYLXJc30R7vI/dzqvyCJAQrf2Gu471MIXIvH9yQl235xYJKFp6kqauTKwR9xpXpWqkxUipWvTkNgWkLRfLepKhsESUEEf86wJfIGCQEeJOzSxR0f5nLiykf9VURb+lVxdPiP4Asz/wApcuaMLZkd/wCB++46nT+iEqKj/wAq1bnxOcdyXAzjNozTJFqG0KtWKXByOrXqPmVNJY2B30XN9teugcgE4LBIGKtulUHafiI5Dzy/3zFcA4Yct1xx9bLc8ZfeWYCmfFR1tueDGEham1p+hXYEhaSUlJFbJ7E+cMqKf7YcyNWKIo9S4OIWlEdZGtFtUqUXlkf6m0tK36EVu2E52SidHhtxKs7LM4w3A7abzmuVWmxQQQn5i4zG46Co+gBWRsn7etVs9zzkGWqficN8ZXa9eH2F4yBDtltR2NhSFOtl99J9QppkoOvrAIJra78d41wLyOzytKta75jFzaZg3q63uSu4z7FJ69NzkyJBU4mMskJdSFdKCELACQquhisKAIII120e2vyqZlnB8RUD7X+0LnHL7TmMjkCxW/4mchayXD8kQ1AhxrSyu32SHdyVgRpjBcUuS28lSUtqeWpHiI0UArTV/W+2W2ywI9os9vjQYURAaYjRmkttNIA0EpSkAAAewrFy3FrFm+N3DE8mgpmWy6MqYkMklJKT7gjulQOiCO4IBqu+M8tveNZE/wAKcj3Vcu9Qm1SMfu8jSVX62j0USAAZLI0l0D18qx2UQLTGiHQKs+IYlSVYt+stpySzzsfvsFqbbriwuNKjujaHWljSkn9Qaq7j7Ir5gOTs8J8gXBcxS2Vu4le3lEqusRGyqK6o+sphAG+5LjYC+5C9W6qqd5vyfBr3Ce44Qzdchy5tTMyDbMcIVcbfICtsSlOEhuKkK79bykpI2PNvRnNKqNrp0VrrISCokAAbJPoBVJZrlk/muJc+NeMbDBu1jlhy33vKLkjxLVGT3C0RkAhUx9J/ykNoUO69p6TH8Ai5jzDNueE/EXkPyd6saW/nMNtBVEhToq0AJlOPBRdmMrV1ApSUNhQU2tC9BR6AhQINpgx7Xa4TEOHEaSxHjsNhttptI0lCUjskAAAAVK2cQcFgkMPFUXxDjVu4OymRgOVM/N3G+rKrRmM5xT0u+oSCfk5LyySJDSR5UbCVo7oSOlYF3rNafOcLsXIGOyMayFha47xS4060vofivpO232Vju26hWlJUO4IqF4FmOSWS+/7KOUZKX7800p6z3pLQQxfoifxaHZuUgfzGu2/rRtPUETw/y93JaudermrFWa5/5NyZPw13d7MLDFevFjyVx5+ViELRmCWlBW5MgN+hQddUhJ0kd3dg9QVYWa8pC13tOB4PaTkmYPoDnySF9Ea3tn0fnPgEMN+4TouL0ehJ0SPHBeNRjk+TmOV3MX/M7owlifdVNlDbTIPUI0Vok+BHBJPSCSo91lRqY75kzHXTrT1Wgddq70Wh46xORlEq38x53dbfe71LjeJZmoD3jW2zRnRvpin0ccUk6XIIClDsAlPlqx3CKqufYb9wpdJeQYJaX7thE5xyXdccigF+2vq8y5UFJICkKOy5H367U33JSqwrFkVkyuyQ8jxy5MXC2z2g9Hksq2haT/6H2IPcEaNWYEhunHqqjiOzyWDl+KWDN7DKxnJbeiZAlp6VoJIUlQO0rQod0LSQFJUCCCAQaofJOWsy4wu6OG5dziXa4SWWU27LJax4VrYcV4aXLsAAEO7H8I9g+rSfKQVGy845Bu716e454wjM3HK/CSuZJdHVCsTS/pelEEdSyDtDAPWvW/KnahlY1xdjGO4zPx2Yx++lXwrdvsy4oS49dnnE6ccf7aII8oQAEoSAlIAAFTFpiO/LMjmfp1hzosB4aN/0+vWKyMLwu1YLY02q2qXIffcMq4T3jt+4S1/zJDqvdSiP0SAlKQEgAbdyqwLmU8JBLDrUnIePWEgJeSVv3KxNJSdhSe6pUcBI0Rt1Gz2WnXTLLnyDh8DEF54q/RpFjDXityoqvHD++yUNBGy4tR8oSnaiTrW6uwXtaLppLLrLj9VC+ZMxWa9Wd5RYMRxuXdskbU/EIEdMRtnxnZrrh6UR22/8RayQkJ99/aqVwzijOePLkvky12yPJkTGnm14cuWVNWiGtwLSxb3lbS24ACXEDTbi9AFASDU/xLGciyi+M8l8kRjFltBRsVh6gpFnaUCPEcI7LlrSdKUNhAPQj8SlT1wip2wvxBER1JYa8/49a4amL3YLR59fXoxfFM6xvOIb0mxzFF+IsszYT6C1KhOjsW3mleZB2D69iO4JBBrxzDKrPhlhlZFe3lIjRgAEoHU484ohKGm0+qlrUQlKR3JIFa7OcHxe4vLzOTcnsdu1ujqP79hvhhxllAKiHSrbbjadk9LoUn31VP4tl2Y3S7Qc/wCWLRNlYrBQoY5cIENQaWVLUn94TIqSpbalt9HhqAKUpWtRCOrQsmO+FJjsTnlzIy+CaT00DQ/eGGn0GqsXBcWvf7xlchZ2hIyS6tlpqGh7xGrRCJBTEbPYFRICnFgeZfbZSlOpc6qvG33e13y3tXWy3KLPhSE9TUiM6lxtwfcKSSDWtybIbPilkmZDfprcSBBbLjzqz6D0AA91E6AA7kkAetdKC1kJk50xn9Zqq97nu4qGcnzLLi6omT25mcxmcp1NvsT1mlmHcJMhQJS14qSNtDutYXtASkkg+lWjgnxGc94FBixOSINs5IYT0fNTra2i2XJvsOspZP8AAf770OprtruaqrCbFd7zdneT80hLjXWaz4Fqtrqgr9zwFdJ8M6H85wgKdPfR6UAkJ2Zk6qqcfYlk2tOLaGSJwIoZanUnQ4CQxmrEO3xbJJkMz1zHl/C6TwT4oeGs8nRrExk/7kv0pXhtWa+tGBMcXvXS2lzSXvbu0pY/OrXBBGwdivmlytcY8i3MYbEtVvut9v8A1NQIs1oOtNJT/MlOpP8AhtA7PptRSkd1Ctjg9vzfi23xIXH/AC9mNs+VYQ0pqRO+fiOqSkAn5eSHG2we500EevbWhXl7V2Kj94W2J4cBrQ8p1BMq5ZarqwtuQ7oMdsp6V89V9HaVxjaPiv54xwD+0uPYblcZsbW5FW/aJHQB6nqL7ale5+hJ+yfaWYZ+0C4xyS1x7xesFziyQ5HV0TBbEz47gCinqR8qtx1SDrYV4YBBBG68/adg7SsbrsWC7yF7/wBZrowtoWWMJtePOnzJdRUql7Z8ZPwz3Ieflm1Ww70U3ht62lPfWyJKEaH5nt71MrTzfwzftfuTljD5+96+WvcZ309fpWa5
T2OhmTwQeII+QFbDg6oKm1K0kbN8Mma+Uy2zPbV0jw57Stn7dleteMzPMHt+/n8xskfp3vxbg0jWvX1V7Vqsre0qvL98RPAmMBP7/wCZsLgqWOpCHr5GC1j/AEp6+pXofQGq8yH4+fhcx1DKnM+l3BUpwMx026yTpIddV9LYWlro6j7AqHv7A6khwYkX/TaTyBPwCtS5rcTJdDUrlK7ftCcIbCkY3xRnd1VryOvNRIbG/souP+IP6NmqW5O+P3nz92OSMYxnFMXLiyzDjjxrtNlOq/loSohpts+vUShYSATvQJrt2fsttm0tL22ZwaM3C6P/ALS9gScMVWdb7M0yvifCvwvoqVBI2ogAe5qtOSPiS4R4nWqLmvIVtj3AelsilUyer09IzAW7ruO/TruNmvm5d8w5Zzm2Mt8rcq5JkUlxsGXHTOVEgLcIHUPlo4baWkaAHUk+m+xJ3Bb27DxtljGcLtsKHdrn1CMhlhKUMIH1yFga8qN+nuopHvXsbJ/hhaxCEfaMYMFN1ovOrgMmzJpKteRVN212ON2E2fE0C6rzv9o9mWU3O74vwrx41ZWIQDf9osiUHilavRKIbSgOvp0sdbvl6k9SPw1z1n2Y5jyI47d+WM+u2Rtt6fWxOk+HbWSjZC0xEdLCenvpRSVaA2o+taizWeHj1ratcHqKG9qccWdrecUdrcWfdSlEkn86geVZIclvxxK0W6RdIUMBdxSwNNvu+qGFO7CUIH1L9SeyQD5hXurB2V2P2bs7Yr4QiRyZC9vG8agASIk3Mhs6E5tVZ1qjWp0pybwp16raWJhV+npy2bGLEdpBas8YgANskd3yB2C1j09wjQ7EqrAyTPGUyTjGJqbuOQSULSygd2I5T9S3ljsAnY2keYkga7145AzNRAVOzG4qcbWoNR7NbFltD7h7JaKzpbpP/CnW9jQrAhSLXhqFKnstOXyajqMGFrpjMp2UtoHZLbSe+1q6QVEk+oFdaLaIsJvcNPdg1e8+KugEw1zsGDeIAndk2ssOGDVflqw6LYrezLyaUu8TWnA6lCWyW1SVn1Q13LjqlnsVbVsgJ6RoV9Pfgt+Flzii1nlLkOF/+vL9F8JMVzoULHCUeoRkFOwXV6Sp1e/UBI7J2qpPgB+GG9X/AOT+InmeysMpKy/h1ieYV/d0fhuLvXrqcI34QKQEpPWO5SR9Aa+M9re0sG3gbO2aC2A3E5vIzJMyQMpmprIUC6kKGW7zsUpSleFU6UpSiJSlKIlKUoiVXXMfDVp5WtsWSzNVZsnsqlPWS9sICnIq1DztrSezrDgAS40rsod+yglQsWlFggOEiuB5HF8q6Rspwu222Fi2WMI/7cxGStxVjuKz3anxAkpXHDihtElnRStJC0qUgipdwHy/kolL4h5mtU2w5ha0gW1+4OoUi/QgNB1l5J6XXUfS4Bonyr6U9RSnoHmriI8mWqLcseuybFmNiUt+x3gNdYbUR5mH09i5Gc0AtvY9EqGlJSRxpkub2DmO05DwT8SOLL48zKwyWwxPW/8A3RqXsmNMiS+xQFlPlCunrSopBUeoCxDfPmuPabO6Fxb7j7qxOY+J/wBwZqvn3CZF3tksRUsZKLI+WpTkdvZTMQ1otyHGh9TTiFBxA1oqSipZZ8l5giWBme/b7By3i90jJcYnWhSIE56K4O/VHdUY7+0n1DjQOtdHeq34wyLmjG0P4w1dWcmvFjbBuWMX6T4ctxrzdMq23Ag+Ow526UvhaknaFPJ1XtxTmHGOKM2FldXKx7F77KUqVj13iiLKxuc4skvtaJbdgOq31raK0NOKCirpWrpnEhVVgXESxl1zW94P5vxHGM6c4InXafBiP9TmJxb5DchzIiEfzbasOJAWlvYLK0FSVNno3tA30DluKWDO8cmYvksJMq3zkdK0nspCh3S4hXqlaVAKSodwQCKivJ3GuH8xYZPwrMrezKhzm/4bxbSpyM4O6Hm1KB0pJAIP5VUPE2NZtYJdw49sfKVzsGYY/wBL0u1XhKrtZ7pFV2amxEurDzTStaU008kNL2kp10KVKARQpea/eBkeqq2uJ8xv8S4S+I+R57b+W2FkyI0zQQL3auvpampSOwWOyHkj6XBvQStG5jneG2XkTFJ2I5A24Ys1IKXGllDsd5BCmnm1DuhxCwlSVA7BSKojlW3fELNgQLqzxzZL1k2NOqn2K945dfAdQ906Wy7DldALLydoWjx1dlAjSkpIkeF/F1xXfYfhZg9ccJvMN4wbrCvsB6OzAmpSkuMKllHgK11AghfdJSr0IpTArap3m+ylfGeXSrzJuvFvJTUKRl2Ntt/NLMdKWrvCX2ZnNIO+ytdK0j6HEqHoUk7HI+BuFsttk+1Xzi7GH2bky4w+sWxlDulpKSpKwnqSrROlA7HtWnz2zRuRrNbs64ryO0OZVYV/NWK5tSUOR5CSAXIby0bKo76dJUBvR6FjzISalPHeeW/kPGWb/EivQpKFri3G3SBp+3zGzp2O6PZSVeh9FJKVDaVAnW7kVm+RvBU3xbxBw9bbzN4i5G4Ywd/ILQlT9qub2Pxf+3rWCAiRvo8z7e0oeT6hRSvQDgFW/C4X4etrhdt/FOIRlkEFTVmjpJBBGthP2JH9axeVePnM+ssZ2y3U2fKLDI/eOP3ZCQVRJYSU6UNeZlxJLbiPxIUffRGTxhyEnPrAt2fAVa7/AGp35C+2te+qFNSB1pGwCptWwttfopCknt3A1DQDKS2MQuE5qB45abX8POcjEmLTEice5pOK7Q+ltKUWe8OqKlwlk9wy+olTR9EuFTf420i7wSkdI7D7DsK02V4zZM1x2fiuRwhKt1yZLL7fUUnR9ClQ7pUDogjuCARUB4sy3ILDeHeF+SZypWQWtkv2e7LBAv8AbE9ID+/T5hvqSh5A9+lY7L0MgSol68J5rM5Vw29i4QeV+OoiXMwx9vwlxfE8NF6tpVt2C4d633K2ln6HAPwqWDK8JzWxchYrb8wxx5bkG4tlSUuJ6HGlpUUuNOJPdLiFpUhST3CkkVvOo+wqlc1LvBWVP8pWlS1YZfJAOY2/q6hb3TpIuzKfwgdg+n0KQFjuk9SUqoHXhLNXDPhw7nDkW64Rm5EWU0pl9lxO0uNqGlJI9wQdVUmD3Cbw/kzHEGTzHXsduCj/AGLur61LISNlVrfWe3iNj+SSdrbHSfMjattdviL4ht7q4dryr+009ASpUHGIr15koCgCkrbiJWUAgggq0Nd6iWev8o814pMxS0cMtWa1T9dNwyy8CHJZUlXU1KjsRA8sLQoJWjrW0oFPfVbUyRpOBwV5KI9ftVHc35Zx3lsU4XZLrcbxnNnlJmWprFUJl3G0T0JPQ66d+FHQQSlXjqQhaVlJ31VHsHxnN8pvr/FnxG8mXe43i2x0Px7faum12y/wUEJEvraAkLWFaDzPihAUobR0LG70xfEMUwa0N2HDMbtlitrRJREt8VEdoE+p6UADZ9z71IJvWC4MKpTDci5i5nXPxDP5rfF860tNIu1ntLyXrvLQtHaS3JO22GFqCkgtBxWwoBxCk1bmH4JiPH1
qNnw+xx7dHccLr6kbW7JdP1OvOqJW64fdayVH3NaXk7jqVk64WXYhcEWfNMfCl2q4FHU28g91w5KQQXIzmgFJ2Ck6WkhSQayePORIWfW2QHIa7XfLU98perQ8rb0CSPUegKm1fU25oBaSD27gSsEjIrDnzExgsbkfjmPmrcO72ud+58qshU9Zby2jqXGcOuptY2PEYXoBxsnSh9iARjcb8kJzVibZr3bxZstsSksXuzLc6lR1n6HW1f4jDgHUhwdiNg6UlQE6UT6CqG5iv1suGZRG+J4si88r462fCRbkgsNRnNFUW5vkpabYWNLDa1eJtIW2kqFTeHeC0Dr26ev4V0zpkSBFenTpLUaNHQXHXnVhCG0AbKlKPYAD3NUPmDV2+JuCizYc5Jx7DoktEgZeW1Nz5TzatpVaknRQkKA/vKx0qHZCVg9Y/eO7UnnREy7cyXVybc7RMEedggQWLdZZCCFIS+1vqmqOgtLzhLSx0qQhNXmoJSAEgADsABoAVM0d6OCwXd2eKpnhWXG49c/2K5RbGLZkbHiSo05K1LRk7Q11zg4sla5H0+MhZKkq7glJBq2lnuTWizzBLBn9pbtl7bcQ7FeTKgzY6y3JgyU76XmXB3QobI7eoJB2CRUHsfJVzwWWvDub7jEhvsg/uzJ3AmPBvDKUk7WT5GJSUpJW0dBWipvttKbEP8vddhr91G51+oxVnLO/SuZs0XlzeVXmJ8MDjxaafcRmDcdpkwmX1EFxUEuEI/eYBWSnu1sjxu/SDPXr1mXNZXFxKRcMTwhWuu/dHh3C8I7EphIUNsMqGx8wodSh/LSAQ5U+x7GrFh9jiY3jNsYt9sgo8NiOynSUjeyT7lRJJKjskkkkk1KG99hQa5+XX3Wt/usanT7qJ8Qp4/GGIXx2D8ouQ4ZynwfnFTgdPfN9XnMjqGldXfsNdtVL3SaguYce3mNd5Od8W3KLaMkeAVNjSWyqBeQlOkokpT3Sv6Ql9PmSAAQpO0nXtc6YrDhyY2aNP47kkBCBJsL6S7KecWrpQIgSP72latBCm9kkgKCVbSLcJwh7r6fHXD5Ubpvq2vz11RS/J8lseJWSXkWSXJmBboSOt9906SkegA91KJIASNkkgDua5+tfFGY3nInea7FaYeOyFSRNtOG3ArTEeB2FS5iUkhmc4gnpUhJ8MaCgs71YlgxG/wCdXaJn/KcL5f5VwSbHjKlBbVrI+l+QR5Xpejvf0tb6UbIK1WE4qrAhfiCC+gGGvP7D10Wne91RuOf2/n0UMw7ka0Zgt60vMu2jI4CEm5WSYQJMVRAOx7ONnfZxG0n77BA38p5phpb77iW220la1qOkpSO5JPsK0ua4BjGamJLvEd1m4WxSnIFyiPKYlw1EaJbdTogEdik7SR2IIqhXslzfM3n7RkKJ+XcVQJbjUvIbZDDcm7dBB8Fxhs7fjIUFJccYTp0jpCOnq3a750CTXic8D9xlxNQtA0RKtMtf415YqcoDXOk1E6Q04rj2A6HIrTiSlOQSEK2HVA9zEQQCkEadVpX0JHXYzmgOkAa9Na7arEx6/wCO5LZIt3xS4xJtrdR0sORVAtgJ7dOh9JTrRSdEEaIGqyHVa2SdAdzXRs7ABenMnPrLT7zVaI8kywAy6zVe3bjSJaJ8vK8DvRxW4PqVImpSnxLdMV6lciOSBv7rQUL7fVrtVaw81veX3S05jydiFxiYna3FP2mVbUGVAmSApSUT32wPHQ306LQUgoT1FalHyETSW7/tsuCosWQv+wFvfKJDrRKRf5CDotJV7xEKBCiOzqhrfQFBVg9LbLSWWm0obQkIShI0AkDQAHsKQ4HfmcMyb7OOstOUpmuArs6N3Yk+p9x56/HxqrRkdiyeAi7Y5eYV0hufS/EfS6g/1STWrzDKrbiFmdvFy8RzzJZjx2UlT0p9Z02y2kd1KUrQH29ToAmtNmWC4HFal5g9IVismM2p6TeLY+Ia+hPcl0jyOjt6OJUPyqvMdtXL9zuEPkm6ohZAxGQ6myWy5OGDLZYV2EpfQgtfMuIJGihPSkhIKSpdXjHjQ5Q7s3HMVAGpGPIVmaaqFrGO350408p4fZTvC8ZuVucm5RlbyH8kvXSZXhuKWzDZTvw4rO9aQjZ2rQK1FSj6gCQPK/Oog9yxarcfCy/H7/jroBKlSoCnmAB6qL7HW0lPp3UpPr6etazIeS7deW4tg40v9pu15u4WGH2JKXmYTCdByU4U7Gk9SQlP4lKSOw2R0IEezwWXWuqMv1EngZEknh6AUheyLEdMj7fUSC9OVS387vz3HdrdWi1QwheRy0K1tCu6YKCDsLWO6z+FsgeqwRLUtMxGG40VpDLLSAhttsdKUJA0AAPQAVjY/YIGMWdm0W9TriG9qcfeX1uyHVHa3XFfiWpRJJ/OvC83WBZbdJu10lIjRIjannnVnshIGya6VmhGGDGineOOgAy5DM5mZ0lVixLxDGYD3Ovn/C0ua5LHxq2JfEVMy4THBFt0IaC5UlQPSgb9B2JUr0SkKJ7CtBYOPrRAti/7R263XW7T3TLuMlcZJS4+QBpAVvpQkAJSn7JG9nZPnjVruF7u6uQsmjOMyXmizaIDoINvik72pOyA+4NFZ9QAlHsSZK6vdWbPBFqd38VtP0gjLUg5nLRvElaRIndDu2GuZ+nIe55BRG94txvbLfJut4xewsw4jSnX3XYTfShAGzvtUZw7j6wS5MjMrriNtiKnoSi328w0JEOIDtPWkDXir7KUfbyp/Ds7NXTyNfOtQ68ZscohIIIFwntK9dEeZlpQ7a7KcH2R3l7hJqSDY4FqiiL3bbjcN0VOBdhgKhupm7RSd7EhtuXjM41NBpjjr5DVasw7HZIzslEODBjsoLjqw2ltCEjuST6AD1qLY+xKyi6jNbtHUxFZ6kWOIsdJbaI0qSsb11uA+XY2lHbsVKr8urjefXp2wpUP7P2d9P7yXsgTZSfMIwPoW0HRc+50n/MK2F0zjErQ98rMv0QSD6RmVeK8f0bRtR/oKth8GI4PeQ2E00wAc4Z5UacNXTP6QrLGOaJATcfYfz8c1srhNjQIj06a+lmPHbU664s6ShCRsk/oKiGPRZGQ3L+3V3Yca6kKatER1OjGjk93VAgFLrmgSPZOh99xy6ZRd+RLo1Gx3EbjMsFsfKpSpahDblS0K8rSw4Ovw0KG1aSdqAB+kg72aznkllcm8ZJabBDZBW98k0XlpbAPUfGd0lOh7+H7UbbWWyJfY1zobMJCQcR+q866LrayxrN2QVyHBMNsiQCfbhITqf4WzyvKbPiFokXq9ykssMJ2E7HW4r2Qge6iewqC45eL5KEi9QMefud3uRBkSpXVEiRWgfIw2paetaUgk9SEkKPUdjYFezEcSt94vbedzI0l1lhKk2gz31vvLST3kq6yejrA8iE6AT30CdDeZrl8XFbcHlrYVLf2iKy88GkqI9VqUfpQkd1H+g2SAdHxYtpb+OtD+7hNndAqT/VMyEzg2TSZGYq6l2GxrPy2iZPUvvX4UUyaNkt6loxeXkSHJ8tHWuPAaUzGhRyTt15XUV
uK7dKE9SQo+qdA1v2GsdwHGWoiHGoNrtrPSFLOvT1J/wAyiTv7kmo/Zp90Zt77lkj+I4/uRcL/AHYFplxzXdSG+y1ISBpI8iQnWlGo41Et025xsuzK6yZ0Zle7SiTvqlu67utRU9gkAHoGiojzKJ7Vz22ptnd38Ns4jxK84mTWzzcaurU3QAXSa0mQV9kMndOAWXFTlWXXY5NLQqyW1ttaYapKE/MNMq9VoSdhC1AbK1+g8oT6qPXfwU/BvD5Kfi8pZtZHI+DMyEyrfElpKnsmdQdpff69lUUEdgr+br/JrqxPgx+GC7fEO+zynylYVW/jaO8VWu1vK/jZC4hWut4D6YoUk7QD/EI0T07Cvp0wwzFZbjRmUNMtJCG20JCUoSBoAAdgAPavlfantRDeHWLZzy6c78Q0LicQ0fpbSRlUiTZ3cerBg3au9F5pSlCQlKQABoAegFftKV87VlKUpREpSlESlKURKUpREpSlESqn52+H6wczQGZrT0e15LAaWxEuTkUSGnoy/wCbDlMkgPxnB2UgkEHSklKhurYpQGSwQHCRXzZu3G3IttnM2LC1HGc1wrxHIWPXCUpxLSQU/wAa1TFjqeguJKUqjubbG0pPgrSCLPwDNMF+J3DpWC8xccMQ8jt6fDveMXqMC4wvukSGCe5bUQSlxB2Pv6E9N8ucN4nzDY27bfFzLdcoSy9ar3bXAzPtj+iA4y5o+xIKFAoUCQoEVwfy7aM7tF3Y49+IRLlmv9kdckYVydjpW25K0lIK3GEDaepJIeZSo7AUpKQlHiC1Cig0P91xrVZHQ95uGuY+491YsC4cycEZ3Gwe33WDlGB3QJaxePe5JZlMKSnzW4TfMVOAAqa8ZKvESOguJUnzbnO8+xTL41ufv6rjxTyFZX/HsMzI2PAYS/ruwqU2ox32HgChbYcJI0ekKCar7FOQc3y3EH8Uz+xwuRIJjocnR4Mhlu7tM7HRMjlBDM1relIeZLS0kAdPWCKl/E3PGN3K7f7G89v8a9GSPAtM25MeFIlNqTsQrhHdAUzLSkHXUkJeSCpPcKAstlgqZJxlUadesx9VdnFPKVu5KsTj5ZRAvlrcES9Wsuha4UkDfYj62ljS23NALQpJ7HYGo5Gxa947fjy9gEJ6fObZSxkVgR0qRfYCAfoSrsJTQJLZ7dY22rsQU1vn/wAOaMXcf5J+GuS9hGYRkID0K2JR8jdoqCSuKqK5tlCiCooWEjS9E+tb7BOXuWrnZ2L01htvzq0FJQ7Lsrwtl0iyEHTkaRb5a+lDqPfT/c+iQCDU4ruuWoI8bDTQ9eil1r4o+Hrku1xM3suC2J1q5tB1qdBimE+rudhZb6FpcSrYIVpSVAg6IqvOQvh7hcZXNPJvFU3PI0IlRyq02jI5bsiYzpITMZQ8tYcfZCd+Gey0FYAKtb0dx5ww3hPNVZYi2ZHjuOZLKT/aSyXe0yYzcGQogfvOK4UlhQ76fbbWSoAOJBIPX07YsmsGTQW7pjl9t91hvd25EOSh5tf6KSSDWtxrqZrYxHw65FQCy4LmVytEO/YL8TGUTrZNYRJgmdBtk6M42obSetMdDq0n83N+vf01Ds34r+InH7xI5dw3lKxXDIYdvEOXBaxZTZu8JKwvw1p+a6FvoHWWTpJ6lqSVBKjrfXDr+He9y8hhsD/Zpe5gfukdG949NdWAqWgE6EVxRBcSNBtRLgGiurnakIdQl1pxK0LAUlSTsKB7gg/ate7nRZ70tqKhVDhx5iz3G4OV4xz7jUu13FBU06jDFIWkglK0KSqVtDiVBSVJUNpUkgjtWFnXBXLmeRYCrjz2mJcrHMRcrPPgY2yy9FlIBAPUpatoUCULRrSkKUkg77bDLLZcuG8kncrYo26/i9yc+YzCyMtKcUg60bnFSnv4iBrxkAfxG07HmSAq1bVeLbfLZEvVnnMzYE9hEmLJYWFtvNLAUlaVDsQQQQa1uToUMUjebhyVEcZ4nI5EauVh5G5b5HkZVjjqYl+tX74btgQpQJQ8yYDbClR3gCptwHetpJCkkCcM/DTwaFqcumARL8tQ8yr++9dir07n5pbmz5R39ewrK5MwO63qXBzzAZUW35tYW1ohPvp0xPjq7rgyiB1FlZAII7oWErAOtHbcd8g2/kKwm4sRlwLlCdVCu9reWC/bZqP5jDmvt6pUOy0lKk7BBrIYJyKy6KSLzSq5x1qH8Ml4Rh64SY/GN5lk2eekDpx+W6rZhPkJHTFWoktOKJCFK8MnRRq71KBrBvVptWRWmZYb5AYnW+4MLjSozyQpDrShpSVA+oINVdh18u/E+QxeKM8urk2zz1FGIX6ST1uoHpbZSyNGQgfy3CduoHfzpJVsG3eS1L79c1LOTeOYHItnZaE6Rar3anvnbJeIp6ZFulgEBaT6KSQSlaCClaSQQa13G/Jb+USZ2GZfBbs+bWFKf3nbgT4b7ROkTIxP8yO5rsRsoVtCu4r3ZZzVxlh01dmumUMyrwlPX+6LW05cLioexEaOFu6Ppvp1Va53J5M5Xes974x4sn43eLLJTJtuS5RKbgJS2f5rJit+JIdacG0LadS130pJBSlQ3oDMLIJIk7BX6o9t77Cud+WM4w9Wbt3Lh6SrIeWLK2hhdusrKpDUuKVd4dyeR/BYQdlSFPLSW1aUAQSlXlxxZst5lYnjmvkGb+8bVJ+Wu+E2dsWyHCdG+kOOIWqRJaWNKQsuhtxPqgHqSLuseP2DFrYzZMZskC02+MnoZiwo6GGm0j2SlIAAqQAxBSi1vCGa1PsqXxKfnfxE2N+bfspGG2RqW5CuWOWNTiLww62fNFmTV9KmVaKSpLCEnRBQ8pJBNtYzieM4TaG7Didjh2qA1shmM2EBSj6rUfVSz6lSiST3JqI57hGRQ76rk7i1cZnJm2kN3G3P6RGyCM3vpYeXrbbqQpXhPdyk6B2gqSd3gXINk5EsZu1rbkRJMZwxrjbZaPDl26UnXWw+j8KhsEH0UCFJJBBqaGJGTsVq98xNuHXU1ouR+PrndLjGz3j+4M2nM7Wjw2nnU/3a5xt7VDlgd1Nn8Kx5m1aUO20nMwbke0ZwJtv8Fy2X+zrSzdrPJOn4iyOyh/8AEaV3KHU+VWiOygpIkOQZBZMXtEq/5Jd4lrtsJBckS5byWmmk/dSlEAVQ2ZwMz5wyCLkPEcZ7B3bKytqFntwiqD0pKxsx2IKwDJjK7Ere6Ug9KmwojYm8Bm3HRatN8SdhqrSz3kvGcAREYurr8y63Rws2uzwG/GnT3db6Wmh7D8S1FKEDupSR3qA3Dh648zxHZHxBR2Hbe51G34rAlLEWB1J0HX3UkGRKTs6V2bQfoBI6z+8DRsZtEifjl4tDsTkq2tJF+kXOSZk6e2o7ElqSsBTsVagSlKAlDZ2jpSRqrVudyt9ogv3S7To8OHFbLr8iQ4G22kAbKlKPYAD3NTsb3om/DT7rVz+7Mm46/ZVlbcsvnFkuNiPJkp+dZFIQzastcAKVHYQI8/pADTvdPS7oIXsg9Kh5o7zN8VGGcWXM4zbLbJynIGQHJcCA6hCYbZGwXXVeVK1D6
UDajsEgDvVPc/fFgrP7fOwLiZKE2Ca2qNcb/Kj7VKaUCFtxGljXSR/jLH36Unssc8Wy2wbRERCtzAZZQSdbJJJ7kknuST3JPeq0W2GHuQTPj1jz+V77s32Gi7Uladogw4ZwGDncf6R85ALvpr4jeMrph0PK8eujl3kXR0w4NliI3cn5oA3G8E90LGx1KXpCUkLKgnzVqBwxKzqYc75WlrRlfg9FnRbJCkN40g99RljRW8TrxHVDza6QAjseL8bu18wfK2s8wee3a78034C3yyHG5TBUkqZfQfrQeke4UPVJBrsbhz4l8U5N+Xxu++Hj+X9GnLa+vTUtSUgqXEcPZ1Pr5eyxo7Gu5uWS1stRuRqHIZH+eGWS5HaTspbOzxMWGC+Ef1DEcHDLngeGCz2uQL7xo8xYuYpIfgOKS1Dy5mOGojpJSlKJiUkiM6SddfZpR9CgkJqwXZLHgmT4yPB6evxOodPTre9+mte9eN9cs6LTNcyAxE2xDK1yzL6fASyBtRc6vL0gb3vtXNTfHeSZ+2/dOOvFtfGTqmVsYrdH3Wo2QIQVKK2xorgxV7TpABS6E7U2Enzda8+Cbrd7QZ/yPfmvHC7FEzT4/v7clOp8u4c5um3WWXIg8eJJTNuLKi29kHqCxGWDtEX/ADvDRc+lGhtRsSNDiW2Gxb7dFaixYraWWGGUBCGm0jSUpSOwAHYAVGMM5MsV/LOOz7a/i+QNI6FWG4pS08kJHcsEeR9sD0W2SACN9J7VK3Faq/ZWtIvgzJz+nDl61UEV5G6RIDqfFQHIOMobk9V/wi6v4pe1OBx1+EgKjSz7iTGOkOgj8XZY9lCqhyXkfK8qXJxTLLLIcw23TVw8hyXGWnnWZ/R9UVLIBdbR1eV5TZcACVICgSemxbvkV35SucrFcHmuQschPLjXvIGjpTyx2VEgq91A9nHvRHdKdr2UTS3223WS2x7RaYjcWHEaDLLLY0lCQNAVuIH4gnujdbnofLTUiU8NSs993Q3xM+489dBWWOgWrxe94perHGewu4W6VamEJYZEBaS0ylIADfSn6OkaHToEVlS5LMZlyRIdQ000krWtZ0lKQNkkn0AFRzJONcXvVwN+YTJst5Hc3O1PGM+rQIAc6fK6kf5XApP5VUTzvJnIAXCbkRs0wO2Sy28+yU26XflI9W9/ynmEL7KKfCS4RrZSFBd8x3wAGuZMnCX1GIAzx4VkFAGNim8HSGc/vgT6fKmMJEnli6s5Bc4gbwyA4l60RXQeq7PJO0zHUnsGUkbaSfqOnD+Cpy8qoYOYMMtgRCydmfiLiOltLV4iGOykdgkJfTtgjuBpKzWbkeeY5YrAMhVcG5rL6kswm4iw6ua+o6QyyE/UpR7DX5k6AJq/ZXwWNLi8E4uOfpiAMAJepNYYveOIF2Qy/vnx+y9Ob5ezi0FpLMF253S4OfL222skeJLe1vWz2ShI8y1nslIJ+wMStnEGKrZkXHNbNbb7frk981NmvR0koc1pLbBI222gaSnWidbO1EmtziuNXNmc/mOXPeNf57ZbS0k/wrbGJChFa+/oCtfqtQ32ASkSF1e/Sr0Gzi0nvI7aZNIBlxONT7CmMyoXRu6F2GeZHxy+eUlBHeLcfgAuWS9ZDZUpBJTFu7xZSNdz4bilNjsPtUCiYxlfIE1yQnkG8PYvbpCF243KDEcE+Q2T/G0ltHUyg66CrfUodQ7BKjLb1Lc5NucnFrc++3jEFZavE1lXT886k94TSwd9A7h1Q+3QDvq1LQ2zGYbjRmkNNNJCG20DSUpA0AB7ACpIVhg2p24JQxoSLx8jK6OVSNBU60xII3jNx1lT2xPsOJpDnbDyKO3+0SMR9zZkb/rpdQm9NciZHeZOC2nP2FtNtEXia3auj5RKk7SylQc7urB2QDtKTvsSnc0zDJZ/zjeH4qrqvk1AWt4JC0W2OToyHAe2/UISfrUPsFEZ1hsMDGbU3a4BdWlBK3Hnl9br7qjtbrivxLUdkn/7VcNkbaXmDDc66PEbzv8AiK/8jkKYmmrIphtvuAmcKD1+2uOArGomHZZChs2xjPBBhx0Bppq3Wplrw0D0CfE6wD+oNRXKseuc6a3hloz3JJl3kthyVKVOSyLfF3ouqDCEArPcIT22dk9kmpxl+TvWgR7PZGWpd+ue0QYy1eVIH1Pu67hpG9k+/ZI7kV+Y5jzGM29Ta5JlTpSy/PnOABcp4jus/YDsEp9EpAA9KndYYMd34aHOQ8RvOoP2jeAmRw3RXEtVmFEewd47E4UFeJph8nzWhh8R8fW+IiALGZTKN+SZJdkJUSdqJC1EbJJJ7epNYN1ajiWrAsChRbWooC7pMhtobEBk+iUgD+csb6R+EbUfYH2Z5yZarEluzWe7Q3r3OX4TSBt8Rk/idcQ3tR0N6T6qOgPcjAx65ybVbBbsVwu9TlLUt1+fcgmGJEhXdTjviHxdqPuGyANAdtVuTYWxPw1mDRLxFom6X7RdBMzmZi6P6jS5DbGLe8iEnSeHOvtryCl8C3wbNb2LXbY6WIsVAbbbB9APuT3J9yT3J71XWQ5TY8glhNzuqI2ORJHhlBT1Ku8hB30ISAVLaQR30D1ka+kHfquCc3zTIFY+7kjcO2RdG7JtKCkN7B1GEhXmWtQ7qKQ30p17kVuf3Zg3GtqcnQ7XFgNNoSz1oR1vvegQ2FHalk9gBs1LFjxLYwthNEOAyhLqTlkGg+EZzImRdwDgbcKEIZmTNx06x/utbked3iPbVS7VYXoTKiG2ZFxb0p5auyEMxknxFrJ15VdHbZ32rCsOFtR1DLs9eZud+6CtUiQE+HBb7kNtj6UhIJ2r1J2a19uyCTebivIWoP74uKQpMbpcCLfamjrYL/dK3SD51NhRH0jQGzFb7dVZBPEjKLy/crZGKVt2uA30NznN+Xy72Wd9gVn+Ir6U60Fce07QhAi0xz3n7Q6QYJfrOVMpB5aM753elCgnwimuvLqXotxe79ec8lrZs8VmPisPa3rlO8seUpJ31BHYutJ16bSlRHckDR6w+E74FZXLEuNybyvFuEXE1JQ5Fal7bn39O99xofLQiPwpCS6DsdKe6p98JXwM3u9GJyl8StoYZYSW5FjwnW2o4B2h6f8A/EX9JDJASn8aersnv5KUpSEpAAA0APavlvaPtabY50GxuJn4nmk+DG/paMJmbjM1EzPrwYAYKr0W+3wbVBj2y2Q2YkSI0llhhlAQ202kaShKR2AAAAArIpSvBKylKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiVHM94+xHkzHXsWzO0Nz4Dq0up8ykOsPJO0PNOJIW24k90rSQR96kdKJivnfzT8M2e8NTRm1gzC6mPB0qBlsK3h963rIPX+9YaPK4yUoQFSWQknt4qSAVHIZe4n5+tFpsXPuM49HydwJNrvFtnARpzoV5XrZOQoL6upO/CJ60kaIUnufoM42h1Cm3UJWhYKVJUNgg+oIrjz4mPgJtGa2O+XThJUGxz7opUy4YxIbAs12fAB60JGjCkkoTp9kp7nagdk1Zhx7tHVXNtGzw83oRkesPthyU
RtGf898O5wzgOZMxeQMduW/7OXXrTDushKQSqKtRAjuyEIG0hZa8QBSgokFI2V35FsETLVZpx869YM1eCUXfEMgaVbDkTSe3S2XQG1ykpB8N1tSgfpUdd088YnneRY9cbhwBycqfaJMEIaTZs4U4YskggtFi4pHVEeGkFDgLjKlDbQT6C3sb5qgWu4R+C/ibxx5MW6smPbLvkERDkW4a2PlpDwBYU7090uBQ6x9SG1djeY8OFDTrNch8N0N0nNrnKkxrL5lzXUOHZpj/ACHjMbIbM548OWlSHGX0AOMODyuMPIP0OIVtKkHuCCKoK8cJcUcOZY5dblhMS34Ld5CnGL7aFuQJuLzXVbUl19hSXBCcWSQvemVq0rTZBR+ZXwffeLI0jN/hivVwsEzxm5F1xwH56Bdo6UpSoNMvK008lCR0eGpAUE9B11BSZDinMHIuQY2q53ni+Hm+NzWVN/PY1IQlx1Ouh1qRbpikKaWlXUlTYccI0R69qsEXqPFfVQsdd3oZ3TlgetFNHOF701Gdh49znncO3yEFCoUxyFdWVNlOukrmMOOqSRve3DsE/lqtbdaOduALnZ8EY5ax17BJ6xBsdxv2PuyVwpbiyUQHVNSGvDbI0GV90jXhaHkrVYp8Q2G8KZNb8FyCRkNrwW7OpYtbmQWiXEcxqQsnphuvOoCHIh9G1hZLOwkkt6KOjZX9i+S8Yl2l922ZBY7qyqO+hp5DzTyFD02kn7ggjuOxFa921/hxC2MR8PxibTw6qo6ib8SkAhDll43vqQdKIuUy2FQ9+3gyNff1P2991Uir18QXw8PTLs7gmBx8Au9wb/uacrmPs49JeUrxHy6YCS3EWsp2kJUELWVbCSdT7GsovvDd8g8ccj3R+4Y3cHRFxfKJStrCz9FvnLPo8PRt49nQAD5/qt2bHh3OFIt1wjtyIsppTL7LiepLiFDSkkH1BB1WO7vYGqd9cNQCD1qoALv8SzoDgxDjVsKGwkZFNcGvv1fKDf8Ayqu8zwj4pouRSuVcDHHdsyJEARZsNhMuU3e2UnyodQpTKS82CS0sq3vaCQlZqTsXqV8PUqHYL6Vv8aPrEeBdVK6lY6o6DceR/wD1PZLv+F2CvJ5kXGHUqAUkggjYIOwRWBDvUJTve7qAJHrVU7htnzzlDG4eUOfENfWYkxJ8SNZLHCt6mHkkpcYdTIQ+40tCwpKkdQUFJ1v75t3+GPjTKYLkHOZ2W5Slzv8A9q5LNW2hQ7oWlhLiWUrSdELCAoEA72K8stxHIsFyGZyhxTbzNdnrDuR40HQhu7gAD5hgq8rUxKQO/ZLoHSvv0qTOcSzGw5zYWMix2YXorxUhSVoKHWHUnS2XUK8zbiFApUhQBBBBrLWA0dijoxG8w0VYcPO2zh+6s8HZFYLTZ5qgpWP3eBb2ocbII6dkghsBKJbY/mN/iA8RPYqCbpUo/eo5neDY7yNjr2N5JFccjrUl5l9lZbkRJCDtt9hwd23UK0UqH/pUCsnKEzjmcnBucb7AiLSlf7oymQ+2xFvDSBsh7uEsSkp+pB0lelLR26kp3a25Q4LUv73eGPWCkOf8ey7tdI2eYRcE2jMbY0WmpBG2LhH9TElp/G0T3B7KQe6SO+/bx7yXDzlqZa59vcsmUWUpavdjkL6nYbh30rQrQDzC9Etup7KHr0qCkiP/APtB4zepDkLjnGcoziQn0XZ7aUQleh7TZJainsQrQcJII0DsAw3NON+d+TLtAzWBcsX4xv8AZmn0W6XD8S7Tnm1pOo8lZDTIZJ6VKb6HgFJBSrY3W2BmyqyJkXYlOvWSu7IcksOK2x69ZNeoVqt8dPU7JmPpabQPzUogVQWaSM75AyNnN/h4wpdsvURCQvJr+tdvtt3jg/8AdnIwSX5add0OKS2E9QUhzRIOw4ixPBLnlUp/P7LcZ3KdhSj51zJJZnLQn8Mq3ggMojrO9KYbbIOwtKVbFXopXapmtMUVoPdaF4gmlT7fyqO4qtFo5K1kXJt1nX7MbDIHz1jubSGI1hmeoDUNBKCNd231l1RHdKxsiroWf/8AlVVzwcNxW3M8mXDPYmDX61Hoh3ZxPiJlp7kw3YwIMtCxv+GnzggKQQpINcrZv8a/K2eY9GtWKWhnCvGj9FxuCVeNMdX6K+WSsaYQR3ClhTgB1pBG63ERsDddj89f2XR2fsm2bbiXbGyYzyDfPD68F0D8TnIXD2JRoScpvk6NmsdCpGPpsJBu7S9jZSdFCGVEaV42mlDYO/SuJeQ+buR+XZ8O28s3HwoLCGkxLbEHRbpDyR3ee1/MeKhvS9IT26E72a0zTKUPPTHHXpEuSQqRKkuqekPq/wAzjiyVLP5kmvKTFizo64syO2+y4NLbcSFJUPzBqpFiuinQafdfWthdi7PssNjRyIkUVExujkMfPzAWSD7V+jtUfDV2x4D5Txbnbh6tKVuQwP8AQT/MT+R83r3PYVtrdc4F2jCXb5SH2ienqSfQj1BHqCPse9QTyXuYUcON11HafbUdGSzQaw7ym0mEpy8ONtMtkKS8pfQppfspCh3SrfoR336Vi3C/NxJItsGOudcFDYjtHQbB9FOK9EJ/XufYGkCzPKkpud9komy0nbSAnTMb/wCWk+/+s9/XWgdVoeC3fEEYGE0B2Rnh568h5yVjcfc13i7T4Vs57fvF+wSCUmClxsKeLiddLtwZSnrktp15QTv3WhZ0U9rY7lOOZlZY+QYpeod0tslO2pEVwLQde3b6SPQpOiCNEV8/Qr86yLBecnwq4qvWB5BIsk5RK3EoHXFkn3DzB8q9/wCbsoeyhXWsG1XWU3YovDXP+eS+a9ov8NmWsGPsp1137DRp/wBsvDyqOK7ozPDcZze2ptuT25ElphxMhhwLU27GdT3S604khTax6hSSCKoVTnLuYuTrfgeTOZNx7CkeC/JmuiDcbqE9QdjQpjadLaB0lTqkJKtKSlz1WNLbeexybd4eIcwzGMOsq0IS98mtSot8fOwWnZJ0YrR7fwz3We3iEbSrpiO1CiQmIttZZZiNNpbYbZSEtobA0kJA7AAemq9PAMLaO/CdIZyoTwI05iuVMfjtustr2NENntkMtfo4UHEa8wZeeEFxHP8ABm1RcGjQnMVnxGg2xY7iwIriUJ0NNdy26Bv1bUofnUtkOoaQpxxaUJSCpSlHQAHuT9qwMtsOL5JaJEHL7Vb51u6Ct1M1tKm0JHfq2r6deu+2tbqhTx/lXIyXouGZpdbXx22pCo9vvjapzV4UlQPlKlJkCEQAAku6cB7AI7K6XeRbPJgbeOUqHzGEhqDwAmucBDi7xN3WdfQ4z5+qmcuXL5p6olskuRcB6lNyZbalNv3zpOi2yoaKYpO+pwd3ANJ8p2Z0zGiW+IzAgRmo0aM2lpllpAShtCRpKUpHYAD2qH/2tzrG2kx8l46MmK0kITKxyQmQhKR6dTDgQ4ka/CjxKxJXOPFzMKRKfyyMw9GT1KgSULYmqJPSlKYzgS6VKV5QOnuTV2zvgwt6K6TjjOnkJ5DQE61KhiCI/dYJtylX1ln5D0UhyrILLjdmk3j
IJCG4bIAUCjrU4pR0ltCBsrWokJSkAkkgCqosXDlhyO6yOQMqxdmyXCZpVvhW1aob0BrR0t1xhSSuSoHzHZCR5BvRKpPj+PXfJby1n2dsJRJY6jZbOFhbdrbUNeIvXZclST5leiAelP4lKlzzlXodmFtIiRm7owBHudOAyxNZAQmObOC2GanEj4H1PkKYwV3Ar3BGrDyhk0RIOw1KLE1Gt+hLrZc/89QS/TOXMguszAMYzeyPhhvV1ujVqdYdghWiGQ4HVJL6k77JSChJ6jo9O5hlOS3fIbs9g2DyCzIa0LteAkKRbUEfQ3vsuSoeie4QD1K/ClW4sVhtGK2hmyWWN4MZnZ7qKluLUdqcWo91rUSSVHuSanbZG2l3dwSWsGJDnCf9IqfM5YCsyMG0GE29EkXHAED1NPQeZpjE7dB5MsVtj2e12zEGokRsNMoQ/ISAkfkUk7+5JJJ71o8mzPlKyPQra1jmNTbnc3PCixWLi8V6/G8rbQAbQDtRJHskd1AGYZdmEbHG2YrMdc+7zyW7fbmjpyQv3JPohCfVSz2A+50Dr8axqRbHX79f5aZ9/uCQJUhIIbZbHdLDCT9LSd/qo7Ue9Xvw7i78PZ4jhKUzMSaNPDiRgMhU0kDoyKCO8iNFcKY++Gp8hnKLYxaOUsdhutv2jG5lxnOfMT7o9eH1OSHSNb8MRwAkDQSkKASAB+Z9WV3rkewW8SZN8x5mTJWI8KDGtb0h+U+R5W0KLyQPQkqKSEjZOgN1MsnyaDjMFMqUh1999wMQ4bA6npbx9G20+57EknskAkkAE1pbBj05M1eV5Y40/e30FtCEHbFvZPfwGd+voOtfqsj2ASkT/hLoFls8R0xib1GjUyAmTWQJmTU0xsw33j3sRo9Mfeg+MBXCO2fjjLG5Ei+X7kiem8XFDXzbluhxkIQEDsy2XG1qDYJUfbZJPqTWJluP45ammGbt+98pu09RagwJk5akvq136mkkNBA2SpRT2B19hW9zfk7GcPjqbdnsTLoshEe2sOhT7iz6bSNlKB6qWRoD89AxbHb1PDjt9YxW93693AJS/NcjCFHZa9UtNCQUrS0nfskqUdqOzWIjLBCP4SEZuzJLnyniZTcC86EcXUABvwhGcO9dQZYCfnSg58ApDhGEW7C4D3hsRDcZ6w9Oejx0tIUsDQQhIHlbSOyR/U7JJOlzHOIyZL1gt17i24slKLjdH3EpbhBXo2gq7KfUD2T+EEE+wOqvNy5Kyq9HGLZd7fZm2ilV0et6C+uGg9/DD69AurGvKGx0g9RP0g+ydbuNOJLYby5bWkPqX0turBkTZTyjshJVtSlKJJOu3udAVIbTdgGDZGiFBZRznbvMNArOfidQzJA3ju24cKb70Q3nHACvr9B9McR3MoeM4641iWNSRCi+VEy4BUdl5xR7FOwXnlrUe3Sg9RPr3rTiyImQWct5tnxHXG3S5Dt7oDcWJ1fSnwwT4jpH3KtHsN+tYFyyK/yAzmOUttWNtBJgpmoLio+x6MRh51vFOwVr1rvpOtk2L8N3wX8yfFHfmM7nGfi+JBQU1kd5R40l1onZ+RYIDZcOv5gAbQD5SpSTvym1+0FnsDQ60bzZbrCLrZZOMMVP9IdTMmeHXs9lLsKanE+vyolBxzPuacsgYBhlkmrRJG2LJCbAlyEAgB2So+SJHHY7cI7diNkAfSb4YfgKw3hqTEzfPnIuR5ewQ7GQhBVAti9DzNBY63nhrXzDnm/ypR33d3C/BPGnAeKIxPjiwIhtr6VzZryvFmXB4DRekPHzOLPc/Yb0kAdqsGvmG2u0tq2u936WHKdToCcANGtAaP6jVdeFAbCEglKUrzamSlKURKUpREpSlESlKURKUpREpSlESlKURKUpREpSlESlKURQPl/g/jPnTF5GKckYzHuMd1tSGpIHhyoqlDXWy8nztq7+x0fQggkVxtyXxDzn8N1jVZk4k7zvw61GLUqLJCX79b2u+yptXlkoA7eUDQ1oICST9BaVvDiOhmbSoY1nh2gXYgn1qvk9xpzaxjFwkzOGOY1XDCkoC5GN5Mw4/NxxQPn623FpkGLogl1tTga0doUnum2LlmmdYVcG+TIXH8u0XCYkvXViyvquuO5AhSB0ulxlPixpHlTqQpnXToLKxopun4lf2f3DHxBOu5LDhpxHMQlRbvNsZCPFc12U+2kp6yP8wKVH3JAArhdCPjE+AbJVRM8iuZRx7IleGl0guQiCvZW3ISNsOq6j/MSApRJI966MC0hxDTT6ev0PkuZH2W7xQZOOhoT54H096ru3jTlnjbn3DFXXHJka5Q5CFR7hbpQSX4qztK2X2tnpVsKHuDrYJHeqplcQcXcLTVIvuNqhYY85qHkVqffhTLGta/KzJfjKStUYKV5HFEhrsDpGimnMq5T+GvnpcLI4WSzOL+QYulwrpKLkBD52D4MiVGWG3GldIGy4FpBJTo1bHGGZ8iZdFuOK2rkaw3mfb2kouFgyiG3ML0dadJXHnRVIS/GWNjxFsrUfRelAiuj4zdcK6/xj1Rcd0KJZpzBbqDMS88Dz9VZt94EOTY/Lx0cz5wbLc2C07Dkuwbiy60odvNKjuLPsQoL6tgHdQaJ/t64eyW3YDfOaokjGJqWYWN5Df8fE3qkBPT8lOcaeZKXVa/huE6c2UkhWgcCyX/m7gi6PGRxE9dePXA4+/Cx26JuDllITsqhtOJadUwopJ+XCVFJV5FaHSZo18RHw7cpYh8jk19jxrJkEUhTWRQ3YDL7R+zjyUoJB9ClWwoexFCxh4HzHQWgfEH9TeAB6Pz8SmRB+IVyM9AuMzjS/RpDamXmnLdLhoWlQIKVJLjwIPoR7g/lo1Ym4/EP8PMCJbZ72FqwKRNLTEx0zJacWbUAG2XVeVaonXsJWf5IUEk9AHTJ7JyMeKFiNfMuayrjx1TKLdkDcpuRJsqFAgNzlJO3I4ISEye6h1fxPQuVc4etV+tnTuLcbfPY9NpdZkMrT/UKSpJ/Qg1jur+Br17LXvjD8QBB4dSPWChDcX4j5SEuNZpxsyhYC0qTj8x8EH00fm07H5/8ASq/yviX4jbTc7jyNhnKuPt3mQ22Lla7HjSoLd2Qgp2vUmTIbEsNpKUO9I35Ur2ANSBiPkXw9noiJmXzjFKnF/LoZW9Oxlrp6vJolUiGkhQCAkuNggDqSNJtm03q2X62RbzZZzE6BNaS/Hkx1hbbrahtKkqHYgg0EIOoZz5oYxh7zZEHh7HrkqkwnA8Q5Zx9jJ5nLHI2SsrUpp9iTfHLS7GkoOlsvsQBHCHEKGigj+h3ut/M+Gngafb5MCdxdYpS5TZbXOkxw/OGxoKTJc26FDewQrYPpXhl+B32y5HK5P4qSw1kEhpKbraH3PChX5CNdPiEA+HICQUofA9wFhSQAPO3fEBxkuxzLtlWSwcPkWp8xLrb7/MZiyIEjt/DXtXSeoKSUqSSlQUCCd1kQ2ijwhiPdWGTLhl6LCxLL8h47vjPGPKM1UmM+oIxvJ1p6WrgjYSmLKI8rcxO0pBOg8NFPm6ki1FLIrl7lD4z/AIbLjZpmKu
2+65/EmNlmTFt9tPgFKh6+NILTZ9fVCiUkexFc1zfiv5ykWWXiGPX5dgsKX1C3yHXROvTUTt0srlKSE9tEdXQpzRA8QkdRy14ZQVC61k2BtDaRBhwi3iaDnX4lyXavxC3LiqwWu35bnXIEXCr3ZnS5ZLwhYM1Diuy2kMjapLax2Wz0qCgN6BAI5vyD4++Qp1jZtmH4VZWrokqak3yYp75R0A6DseGoJdHUPN0urSUHt5x3rk27jJTfJGVPXKbkcqUU/M/vSUp+WQO38N9wlWh/kUdfYitha7zBuza1xHSVtK6HmljpcaV7pUk9wf8A19qjMRxdovb7K7F2WHIbQcXuGWA8jiRwpyWxy6ReeQ7iu+5/kVxv14W54qJ0p3zx1f8AgJHkYT2HlQAPvutGLrc8e8mQdUuCkKULi2jzIG+wdbSO3b8ae3buBW6Bry7EFKgCD2INRFmYxX0CDZoVnYGWYBksJCnmM/nivY0626hLrS0rQsBSVJOwQfcGvYD7io3Ktj1gS9c7A+0y0Op16E8sJjue5KT/AISvzHl79x71iwM0fyZ35HGYyGXko6n3phHSyD2BQhJ2739wQn/V7VoXAUOKtC2NhkMiUccAKz5fzKSlE65wLVH+auEtuO1sJClq11E+gH3J9gO5qLyLfeb7MXcccbdx9LqT4kx5B8SV5SBtg9hrt516UNaA71t7fjsWNJFynvuXGeB2kSNHw+3cNpHlbH6DZ9ya3QPvWpaXYqcw3Wkfm0Ggx9cv/j6qNWG4wMcjN2m7wP3S8pXd5xwuMyXD6r8Y+qj9l6V+tSoH3Fel5liUytiQyh1pwFK0LSFJUD7EH1rSN2O42MdWNzQYye/yEtRLQH2bX9Tf/mSPtWJFvJTwy+zgNAvNGmI8sD5SPAqSBVYN3v8Ab7KhsSlrcffPSxGZT1vPK+yUjufzPoPciovFzx3IH0WixMtw5xKg+7MILbXSSFeH0n+Odj8J0Pcj0qRWXH4VnWuX1uyp74AfmSD1Ouflv0Sn7JToD7VpO94VJDtX4kf5Y0zdkOQzPoBmSsVFluWRafysJaiBQU3aml9TZ13BfVr+Id68o8o1+L1qd4dn/IPGy0owy9IftgASqy3MqdiAA/4JB62D69kko77KTWlCq8v0qSDFiWd1+E4g6hRW3Ylg2pBMC2wxEBzOPMHEeUhwVls832DkjJmrVzEoYpjzCm1R7Usl6FdHxo9cqX0hIbSr0ZUEhRAKifpHRTMyLMjNyoMhp+O4kKbcaUFIUn2II7arg2Xkipj7lpxuM3cZTagl9xStRo+z361+6h/kTs+m9DvWyw9rJcFUZmM5lcoMxxwuupaI+TcJ9U/KnbYT+YHV79W+9ej2d2mdAcRaWXp4uGPphTQSXybbf+Fn4gl2xYtBPddhyDxidZg8TkuwcvzCy4dbP3leH1bccSxGjtJ635T6jpDTSB3WtR9vb1OgCahEHj/+2EheU8t2S2XCa8hTUOzvNIkRbXHUd9HmGnX1aHW56D6UaGyqqsW5audqyGRlXI+MP5Fc1AtxZ1scR0wWP/hsxXSA3v8AEoLUpfv2AAs62c+cXXdTTL2TN2mS8elMe7NqhrKtb6QXAEqP+6TXrrFtOwbScC6I2WTTTzINCdBUDicPl+1Ozu2NhTbaLO9urgCR5ObMS4zBPAY+9/iPBWe9ogzrKR6C0XORCT/9DS0pP6EGoLc7bmdyvL2Lccco30MQyW7rNmojSmIRI7MtKLXiOSNEHRXpA0VdyAZJcclu/IzzlmwOeqFYkktz8ib7l0A6UzC9lK7EKe+lO/L1K30yO0Wa1Y1aY9jsUFESFET0NNI2ffZJJ7qUTslR2SSSTuuzCskK1GUFt1mbgSJ8GyIpq6XBuZHBdaXwROIZu0MjLiZjHQeZ0UIsuIckYra2rNZcsxT5ZnZHiY/I8VxRO1OOL+bPWtR2VK13JJrWZNkPKlgVGhNT8XudznL8OJCagPtrd19SyS6ehCR3Kj2Hp3OgZRlOafu24M4zYIX71yGUkLbiBXS3HbO/48hYB8NsEH2KlHskE+npxzGXLL411vM/95XycAZk1SOhIHqGmk/gaTs6T3PuSSSaussoJ7izOcJUJvGTeApU8MsTkDqIpP5sZoM8BKp48uOeWohtmxXlO0XGbfZcnF7heLidOzXvHHhND6WWkD6Wx9gdk9yT7MquPIdigok3DMLMh2Q6GIsK3WNa5Mp0js02XHyCTreykBIBJ7AmpPl2awcaDMNpoz7tN6hCt7SwFukeqlE9m2wSOpZ7DY9SQDEoN1xawXNWRZvm9pmZDJQptptt9JRDa7FTEZsEq1vXUvXUs63oaSN3QYMD8iFEI1JfINnWZwm45CZOBMhIG3CMSJ+Y9g4ANqfmQGvpVY1p42y2XLaynL+Q7mm+Lj+ApuAzGDEVsnZbbK2lEe3UtPSVdI32AA9OVWTG7K20xdHr5lF1mbRAtj9wcV8wrtvqbSQ2GxsFS1DQH9BXtvHLq5z8mxcf4vdbze0MeKlMiKqJHZB7Bbinug9PuAB5vb3I1totGfWtEq8znbDbpklPiTrrPcXMeUhIJ6ehPhoZQnvpIWoAHZ2dk5DbIR3VlaX/ALnbzgDnVxDS7nQYnANV6G2NO9FIGgoPgTA99NVt8HwG24gh66Pw7eLvNTqQuIwlpiOjewwwkDytj/mo+Y/lGMn5UYu95ew7C5EqU9FV03GXAjl9bI922vwdZ9CtZCUnt3V2qMXh+zZbLR/a7PZz9jCioNfMFl27AH6WYrOiWPTzKClq760NE7a753dhGi4xxjiRiy5a0xre29F6VudR0AxFb24pRUQAFhA2RuqFq27Y7DZ+6ERsGGNCHRH6ya2Upn9RNcpCRXbsWx7XbXd5DhueczIhreZNPKfqse7ZBfsStDVissG24+VoWtluS+JMpQP1yH1bDbQBPUpxSnNnfYkgV6OK7Bl/ImS/2f4fxq58mZrJ38zeXgUW63pJIJ8VQCG2Qdjy6Cj23sgV0Z8O/wCyyzXNJjebfE3kk2PGedTJTZgtK5D+wDtwd0NbGh0kKUBtPSkgKr6VYBxxg3FuOs4px9jECx2pjuI8Rvp6laA61q+paiEgFSiSdDvXzfaHbePEd/k23QKNLq3RqGiTAeMiRlmT3WbLhQGyiOvOzAw83Z8h5nJcqfDp+zlxjC5rGf8AxC3WJyFmSFdbMbwiLRb9EFIbZV3eUCAepYCdgEICh1Hs9KUoSEoSEgdgANAV+0rxMe0RbVEMWM4uccSTMqy1oaJBKUpUKylKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlY86BBukN233KGxLiyEFDrL7YW24k+oUk9iP1rIpRFxNz5+zO4/wAsRNyPgyS1ht6dHWqzOJK7LKPfYDYHVGUe3mb8g1/LOya+eXJHw+8q/DZlEa65BitwxKc26pMOfDkqbYeX9401kgJJB+gkHW9pGzX3mrW5FjWPZdaJGP5TZIN3tktBQ/EmsJeacSfYpUCDVqFa3wxddUccuRUjYhAuuExxX
xaw/wCLvn+zdMeDybImFj6oN7gsSFJ9B5lAJdUPsQsg/c17Mf8AjJ5awnNJF1kY1iDNru3UZ1taYeatsySVdQkJClr+XfJKurQAXvv1FKSOxOcP2VHHGUF6+cJZDIwy5ja2rdIWqRbuvez0E7cj+v4djQ101wpy9wLz38PRXE5XwV2dbPELTd2gIDkaQnegQoeQk+yVFK9fgrqwrZCjyBJaR11L0Wn4GwRpzhgE+XwrcmfFNxze21Kyb4N8AuEhwkrd+caPWT6nzRNgn9T+tQWD8Q2E4tnshy3cPXi2Y5c2iU21nNZcNMaZvyqiPsBPhpI7FpXoUoKOnRBo2JeY0NoOWd9z5YEJMSWFJQj8kO6ISR6dJJH6VuIN3smSR3I6FsvAjTsdzRIH5juCPzGx+dWrkOLKRkfKvsFlux7Hg0H/AJH7q+Lj8TmWr6RilmyKwAE7EjkS5XQKGtAfx0A/9e//AFqr3OTuZMdcku4nlM222qUtb8i12q5T4+nVK2t9GpA6nD79wDr036xT5e62bzWomdF6tmI6v+I2n/w1n2H+VX/MVsrdd4dyChHcIcbOnGljpcbP2Uk9x/6H2obOx1DMHn8K5B2XYW7tz1M1tY+cZ7kR+ae5p5AlNo8q4qslmgIO96WFOFwEfYn/APNeiTardcX3pl0jC4SpI0/JmqMh53trzuL2pX9TWvuFlYnOfOR3Vw5yQAiUz9Xb2UPRQ/I7rwbvsi2LEfIm0sp2lKJjYPgOE9vN7tnf37fnWO7bDo4ef95yXZssKz2fww2jiAF7Wol2x4AW1TlxtyQdxXV/xmh/4aj9Q/0q/ofattbLvBurZchvhRRoONkdLjZ+y0nuk/rXgubFZT1vSmm063tSwO1R683DG5j5fgXN1N0ZASl62tqfdT330rSgEKT69ldv0rVwEPA+X2XSbF7nwmmk/j7fCmYNYFxssa4OomNuORJzSSGpTJ0tO/Yj0UOw8qtioy3l+RRG2mrpj7ccuLKETJL/AILC/sSAFqbJ+yu35963TcK/z0hU6/tMsqG+iAx0kj/5iirf6gCoy4PoArbLSyMLoBPt8yXg7lrePLTEy51mMVdmZSNlt/Q39PcoV+XcfY+1ZQu9+uZKLPZzFaI2mXcPKD+aWges/orpr2Qcfs8EL8OGHFujpcdfUXnHB9lLXskflusVNkuNnJcxyd/B9fkJSipn9EK+pv8ATuB9q0LXjHD3U7XRh4jThj6nHyAPNZUfGkPLbkZBPduz7fcJdSEsJP3S0PL+hV1Efes662O23lCBLZUlxk7ZfaUW3Wj/AKVp0RWFCyOO48mFc2HLbMV2S0+R0uH/AELHlX+g7/cCtx1hI6lKAA9ye1a3WkUV6D3L2kATGevnOvqtSh3I7IAmQg3mIn/FbARKQP8AUnslf6jR/I1s7ZebbeGlPW6Wl3oPS4nulbavspJ7pP5EVrFZdbnXVx7O2/dn2z0rRDAUlB+ynCQgH8id1rbjh83Kn/nb1LNrPhqbbRbV9L3QR6OvfiHr5QAPzNRmnhr1qtmx3MpA3+H/AG+hmt5KyaMJDkC0x3LnNbPStqP9Davs44fKj9Cd/YGvSrHpt8G8rmJdZJ2LfFKkx/Xt1q+p3+ukn/LWJCN1xOI1ANlbmwWvKh23NhC0j7ra9/zKSd9+wrc2zILRdlKbgzULdR3WyoFDqP8AeQrSh/UVjGjlYhvbGN2Oa/twH/b1I4LJlWm1zoiYEuAw5HbACEdAARr06f8ALr21WvFryC1gmzXcS2R3EW4bUf0S8PMP1UFVspc6Jb46pU6S0wyjupxxQSkf1Naf99Xq9HoxuD8vHPrcJzZCSO3dtrspfr6npH23WHAeasxHQgR+7K7j7Zc6L9l53brI2TlMSTaFgEpK0eK27obPQtGwT29Do/lX4y1esuQiRNdXa7Q55kRmHf7xJQfQuOJP8NJ/ypO/ur1FZVsxmDBkG4S3XrjcFApMuWQpYB9UoGulCfySB+e68HMThMrU/YpMizvqJUTFV/CUT92lbQf11v8AOtC1xxWA20v/ANWrf2zkfMih5CQ1JW9gw4duitwYEZuOwynpQ22kJSkfkKyASKhl1vmRYqyl24zrRcUK8rTRC48l5X+VCU9fWr8gBWC9f73dHgnJrTdbFaekLLcdovOPgjuHHGtlsd/QAHt9Q7itSQKKc7RhQvy7pBGUpAcyJgDoBSW4ZMtUldpxuILlcEdnD19MeKfu6v2/3RtX5D1r9g4u2p8XLIpAu08jQU62Ayz69mm+4T6nudqPua/LLesQTEbhWS425tloaSw04lBR390+oOz7/es25X22WmOiRLlpAePSyhHnW8rWwlCR3Ue3oKxIGpUzHQ4g72M8OA4i6PueJ8gFiuYbZksOMWlydZfFV1KNqmORNq+5S2Qknv7ioxMTlTMpdnxHkjMLnOT5XPGvT5jxB6fxFpWAD79IBUT7a3W9+Uv+TKDlzW7abYQf7my5/eXx/wCK4n6Br8KDvv3V7VvoUOHbYrcKBGbjsNDSG20hKUipYdojwv8ATeW8iR9Vy7X2f2XtY3otmYB+66A48pAS5uroFH7LZc4sTby2cqMiRNUHZryZMth2S70hJUtfir2SAACANDWgNV78gvsy02d+TdbAy88U9DcmbmVxdbLh7DbChpX5ISe/pXlKyWRNcVBxSKie+lRQ5JcJTFYIOj1LH1qB/Cnv9yn1rzg2KLEfF5vc794XBsE/Mv6S2yPcNo+lsfn3V9yavQdtW+C24yJTiAflcW0dg+z1qd+TBIP7g9wb/wBuQpxCwcLZhR7S3IuHCtnnTX1Fbsu9XLxH3CT9XSWD4SfsgaIHqN7rc3nNsjsVqd/c2OYjjDz4DLTsYLlOkk9g22lpHWr10NHv31UdmclWeY65Bx24xXloOnJStuNo9iUNo8zp39tJ/wBXtWJb79HTdA3ZrJdsivjyg0qQ80GfCJ15NK0Wkn2SlOz+Z71OztLtKEy5CiBvENYD6hs58Zz4qi7sX2YZWRfqTEcW+TRjyEgMzJb3FZXIka0LYcu0KxMOrU8pTEMLnPrPq8+48txPiHX2OhodtaGovTjuWJ/dj15ul7jvuJbL059SYzq9jSWmGQj5hW9aAHSfvvtXQ3HH7Pb4ieaZTN05Cu6MFx0k/wB2ejnxyPuGD5nd/T/ELWt9QB13+gXBfwjcJfD+lM7D8ZE3IFI6Hsguqvmrgsa0Ql1X8pB90NhKT7g1zrTty3x2d1EjOLcJXiBLkJD54rnR4vZ/ZZuWGyNc4fqMiSeZvS5CvELg7gb9nbydlCE3KbBb49tMvzSLjcoiV3aUnv8Ayog0GRsf4vRre/DVXfHCfwpcL8Co+bw3Gvmr453kX66L+auLx0QT4qh/DGjrpbCU6A7VcFK47nl2K4lt2nabdJsQyYMGijRyA+TMnVKUpWq56UpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJWPOgQbpEdt9yhsS4r6Sh1l9sLbWk+oUk9iKyKURcdc0/syOF
OQ5cnIeO7hP49vrqdpNuSl23qVrt1xleifTyoUlPr2rjblH9nJ8QmGJfMjCrVndsSD03HHlhMoJA9Vxl9LgVvfZorr7HUqxDtUWFgZjQ1CkbFc1fzszcdVjV0fsNzgzBKheR+E647EuEcj2cZJSd6P2B/WvYxj+O3V35+FOniS2OnxBNd8VrfsQo7H6EV92+Wvh14V5zifK8o8e2q9rSgttyltluU0k+yH0EOJH5dWq435S/ZGYxL8W5cN8iz7XJSOpmDeFKW2D/AJUvtAKbT6erazV+FtCGaRGqwy0NPiC+c6bXcbcSLvNus+MD2fjSVpWkf60J0T+qd/oK2USx4tdY3itFc9lXY+LLceH5ghSjo/cGrK5W+Fj4oOEpDy8jxGZPtDCvDFyiwzPjr8u+sOMaKU+38VDfcVSrU5q6uiTEu1sjz1AbcbCo6yd6APdSV/od1eZFgv8ABXn95/IVqHFblVbv+xkG2yHJtjgQCXFdao0pkKSTr8C9FSP+o/KttAv8FlxFvlxFWt/0Q04kJQv/AHFjyq/Te/yrSqn5vAaQHrbbpyQPO6y8pJ/Uo6e/9P8AlXm1Mut9ZMfxrIUKTtbLrbjik/qhXSR/UCt5NadwEHSXXsrsKOGHcEuElMFhp9stOoS4hY0pKhsEVqP7Pv25RexqcYffZiuArjq/RPqj/hP9K0bWP5dbGHRacljd9FEZcVXhJ+4SVLUpP/UflXk1Lk9mskyG7W5fuFIabZP6OpRr/wAwP5Vq8z8TZdaq6LSHSvNkdZge6kDWUIhrRGyKP+7Xl+VLildUdw/6XPQforRr2nMLEV+FClLnud/LCaU/3HsSgED+pFYsbGsccSh9cJE89lJdlLMg/qCsnX9K/UY+u3K68buBgJ9TGWjxI5/RGwU/8JArUtiDqv0CuMjRwMpep+gK9sl+835hURrH2I8V3yqcuZCjr7hpG9/8Skmta9gsqOhpxu4uXlDA38hcnFeArvvya7J+w6gsD07VsU367QARfLK54YOvmIRLyNfco7LH9Af1r3JzPFy2lwX6GSokBHiDxNj1HR9W/wAtVE5jD4jX0691PegRKxXV409sP/ZeyLk9tiFuFcobtmXoJSmQ2Es/olxO0H8hvf5Vvm3kOIDja0rSobBB2CP1qOqyNM4Fi32C4TkqB2XGPBb/AKl3pJH6A1q2MNujjrklu5psBWFD5e0qKkHfurrHST+YQk/nWtcq9eiuMtT20Zvj098FLp96tdpSlVynsR+s6QFrAUo/ZI9SfyFaK4Kcy4IbgY8EtpJ6LhcGlNFrt9TaOzpPp69A/OsW22y6YuouGyMXcqICpbDh+bV+ag8o7H/H+grbIzKzoPTcVSLcoHR+dYU0n/6yOk/0NaGviopfxHei7HN0aS+pp6eqwGsMutukN3CNev3u+ykBDd3BcSgj3bUP5Z/MhR/M1tUZPKijV6xy4xOnsp1hHzLRP+nw9r1+ZSK9Ss0sjjhj2x9V1f1sNQR4v/NQ8qf+Iiil5bdAUtiPY2T6LJEiQR9un6EH+q61kB4euuamhvhwv/GceQ3h6n/9LJVm+KoZU6u+RklJA8Iq06VH0SG/qJPsAN16Ezsqv+v3dG/ccJWv7xKQFyljt9DX0o9+6yT/AKa/Y+HY+nrXOhJuUh0ackTgHnFfl3Gkj8kgD8qxbhaceswS4zfJlnWd+G3HlqIJHfysq6kk/kEmsEOzUzolpInFIDdAZHzJ+kua3Voxy12dxUptLkma4NOTJKy48v8A4j6D8hoflWZcbrbrRFVMuc1mMwn1W4oAH8h9z+QqGMJzu4SEfuq9So0IfVIukVoLWNfgaQlJ/qop/Q17YeJZBbpRub1+ttzl9z8zPhLUtH5I070oHf0SkVj/AGhSQ7Y5rbsCEQNaS54zPVVmyHZOVf8A7bjUZqOe3z11jA7H3bZPmV/xFPr715W3jPD7eFOuWpuRKWouOSVpCVdR9ekI0ED8kgf86wFZTlS3PlbXEtd3ePbrircDSO+vMsjpGvcdRV+RrwvEPJ5IS/kOWWi3RNAGKlpfhqV7hS/EQpX5DsPuDWsgaymojGgPPeOYYjhmQ0AetPdx4rJlRMbbfXCsSbzMmIPSpqBc30ttn/xFFfQn76Pm+wPpXlHwiXNQ+rJ8huC2H09KoTE5wNJT7hTh0pf6jpGu2qw8ZGaZhcGMR4raevlwKi2mNZ7GtSWdevU2nrWB776QnWyVADddR8e/s0ee+RmosnlLI28dgLbQ44zLWhxxXV6j5WPodgPRbwIJ9PXUTnMbiubaNr2CDMxJE6Ayb7Ek/HBcvGHiseP8pjkSbMaYSUdars+iEyB67WpfToa7hIUR9qkvGPAOR8vXAv4Ngs7OX2HChQgNhNtjL9SFyHlBoqH2W5s+oTX0t4x/Zt/Djgqo8/J7RLzm5R1pcQ5fXfEjoIGukMJ0laNfhd8SuorfbrfaobVvtcGPDisJCGmGG0tttpHoEpSAAPyFQOjD9IXn7V2ma6kCE2mExQcm5+Z8l89eNv2ZubX2Mh7mLPImNQwUFuz4r/Fd8PsSlyS4hIQsHY0hCk+vdQrsTij4aOE+FmY/9gsCt0WdHQUJuT7YfmnqGlaeXtSAruShHSjudJFWfSoXPc7Erz9q2jaraZx3k8MvQSCUpStVSSlKURKUpREpSlESlKURKUpREpSlESlKURKUpREpSlESlKURKUpREpSlESlKURKUpREpSlESlKURKUpREpSlESlKURKUpREpSlEX4QCNEbBqpeS/hO+HPl1Tr+ecR4/PlvJKFzGo3y0kg79XWulR9Se5PerbpQGWCLhbL/2TvFT7Ly+MORsoxR9SSGmpBbnxWz31ptQSR6gfUfSubOQv2Y/xQ40Vu2aVYc+iI30LhutRZYGz/hvhCQdaPldNfXylWGWqMygdRSCM9ua+CV34K+IfBn1pv+EZZAZaBKhPxyS5HTr1PzKSU/8AnIqDOZNf4r7kS5/uxoIUpKnW0qeSNHWijqCwf6V/RKQCNEbB+9RTMOJeLuQkBGd8dY1kIA6Qbna2JJA/IrSSP6VZbtF4EiPc/WanbbHt/uvgHHt5lSUu23LoEF5RCyiCwWyvv+JtThSf1Ka2L8XNGHi4m/LmR+2kx0Msuj/60qSr/mmvsZkH7Pb4QMjV4krhi1xljRCob7zAGhoaSlfT/wBPWq2yv9k/8Ol9V4lgv+aY4pKSENwbkgsg67EoUgk9++uoe9Si3QiKgjkSrLLeG5HyJXy7a/s+vRyCTdisHzC4rWhoE+x6dNH/AKitiLZg0qO27GZtjbbf8p6KpLZQT7pWggj+hruy6fsf4SVE2Hm1txvqJDd3xpMrsfbqQ+jvr36TWjl/sjcoLRaZzrC3wkgo3Z3We/ufKpWvet22uCcZeYP8q1D2nDHiA8wf5XFD11FpHRBzOPI39DEpv5lf6JLZCj/Xqrzh5dkjxUn+x0iQhKCoOtOeEFH2HS8EEf8AWusH/wBlL8RcB0jGuSMIjM67JckzFA/lpbK9fbt7V7o37MT4r0I1J5E44cVrsfHlj/8Az0FqhE+
KXXFWYe12T8RHKvyuRTmF6cUUTGY9i7nzSozz4A/3gEIB/wCI/wBazWo2P3VPiXbJk3gK7dC5KUsH8vDRpKh/vbrqhf7L34s3lKCuUOPWka2nw3JJJ/I7jdh+frWwtH7JjlOQpL+T5lhLzySnuVS5SfTuelaEpB/QCtDaYU6mfP8AhSjbMKcnG9zn8CnsuSJjHHjSwhaLYy+kBKRFIQ/0+gA8PzarGKbko+HjEzIgFb80laEsoPp5i+hTnt+EH/rXdLX7I+4vukyuU8XiJT9KmsTLy/Xv6yEa/Xv+lSawfsieN2XWl5Zy7lU5tCR1tWplq3JUv3IP8Qgb9jv9a0daYWXsJLD9uQQaD/iCPenwvnpIsuWOwtXXkIRxva/CjIbSU/YrBCv6gprAjzWMeKnbbeLDIedPQXW7e4txw+oC3vFOz29VH7V9bLL+zO+Eu2ttou2FTsgLaAkLudxcUoke5LfQdn1Pt+VWNi3wd/C/hrrMix8H4oHo+vBdlwUy1tkEEFKnuopPYdx3qJ1qZ+ke6hibeYDehsM9S4k+818ccbicq5qCzYLU0+46QhtFstz9xkFZPp4bRI369gVVPMT+DP4p86eCo/HuRSl9XUZF+iItcdA6tfypKx6dztLalV9q7darZaIjUC026NCisJ6WmY7SW0IH2SlIAFZNROtLjgFWjbetMUSHuSfag9l84uPf2WnJD623+VOdY8SPrTlux+3pcChs9hIcS2UkDXcI9Sft3vjC/wBmr8KuKPmbdsTuOWzekITKyC4uPrSO++yOhJ3seoOtDWu++pqVC6I92JXOjW602gSiPJHNaTFMJw7BLWmx4TitosFuSsuCJbITcVkKPqrobAGzrua3dKVoqiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoi//2Q==)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This function has an obvious minimum at $f(0, 0) = 3$.\n", + "\n", + "Let's minimize this function using our conjugate gradient, and output the minimum and the gradient evaluation logs:" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": false + }, + "outputs": [ { - "cell_type": "markdown", + "data": { + "text/plain": [ + "Min at x = 0.00078, y = 0.00054" + ] + }, "metadata": {}, - "source": [ - "In the previous notebook, we set up a framework for doing gradient-based minimization of differentiable functions (via the `GradientDescent` typeclass) and implemented simple gradient descent for univariate functions. Next, let's try to extend this framework to a faster method such as nonlinear Conjugate Gradient, and see what modifications we'll need to make in order to accomodate it.\n", - "$\\newcommand\\vector[1]{\\langle #1 \\rangle}\\newcommand\\p[2]{\\frac{\\partial #1}{\\partial #2}}\\newcommand\\R{\\mathbb{R}}$" - ] + "output_type": "display_data" }, { - "cell_type": "markdown", + "data": { + "text/plain": [ + "Gradient at\t3.00000, \t2.00000 \tis\t5.99990, \t3.99990 \n", + "Gradient at\t3.00000, \t2.00000 \tis\t5.99990, \t3.99990 \n", + "Gradient at\t0.00005, \t0.00005 \tis\t-0.00000, \t0.00000 \n", + "Gradient at\t1.50002, \t1.00002 \tis\t2.99995, \t1.99995 \n", + "Gradient at\t1.50002, \t1.00002 \tis\t2.99995, \t1.99995 \n", + "Gradient at\t0.00005, \t0.00005 \tis\t-0.00000, \t0.00000 \n", + "Gradient at\t0.75004, \t0.50004 \tis\t1.49997, \t0.99998 \n", + "Gradient at\t0.75004, \t0.50004 \tis\t1.49997, \t0.99998 \n", + "Gradient at\t0.00005, \t0.00005 \tis\t-0.00000, \t0.00000 \n", + "Gradient at\t0.37504, \t0.25004 \tis\t0.74999, \t0.49999" + ] + }, "metadata": {}, - "source": [ - "Conjugate Gradient\n", - "===\n", - "Before diving in to Haskell, let's go over exactly what the conjugate gradient method is and why it works. The \"normal\" conjugate gradient method is a method for solving systems of linear equations. However, this extends to a method for minimizing quadratic functions, which we can subsequently generalize to minimizing arbitrary functions $f\\!:\\!\\R^n \\to \\R$. 
We will start by going over the conjugate gradient method of minimizing quadratic functions, and later generalize.\n",
      "\n",
      "Suppose we have some quadratic function\n",
      "$$f(x) = \\frac{1}{2}x^T A x + b^T x + c$$\n",
      "for $x \\in \\R^n$ with $A \\in \\R^{n \\times n}$ and $b, c \\in \\R^n$.\n",
      "\n",
      "We can write any quadratic function in this form, as this generates all the coefficients $x_ix_j$ as well as linear and constant terms. In addition, we can assume that $A = A^T$ ($A$ is symmetric). (If it were not, we could just rewrite this with a symmetric $A$, since we could take the term for $x_i x_j$ and the term for $x_j x_i$, sum them, and then have $A_{ij} = A_{ji}$ both be half of this sum.)\n",
      "\n",
      "Taking the gradient of $f$, we obtain\n",
      "$$\\nabla f(x) = A x + b,$$\n",
      "which you can verify by writing out the terms in summation notation.\n",
      "\n",
      "If we evaluate $-\\nabla f$ at any given location, it will give us a vector pointing in the direction of steepest descent. This gives us a natural way to start our algorithm - pick some initial guess $x_0$, compute the gradient $-\\nabla f(x_0)$, and move in that direction by some step size $\\alpha$. Unlike normal gradient descent, however, we do not have a fixed step size $\\alpha$ - instead, we perform a line search in order to find the *best* $\\alpha$. This $\\alpha$ is the value which brings us to the minimum of $f$ if we are constrained to move in the direction given by $d_0 = -\\nabla f(x_0)$.\n",
      "\n",
      "Note that computing $\\alpha$ is equivalent to minimizing the function\n",
      "$$\\begin{align*}\n",
      "g(\\alpha) &= f(x_0 + \\alpha d_0) \\\\\n",
      "&= \\frac{1}{2}(x_0 + \\alpha d_0)^T A (x_0 + \\alpha d_0) + b^T (x_0 + \\alpha d_0) + c\\\\\n",
      "&= \\frac{1}{2}\\alpha^2 {d_0}^T A d_0 + {d_0}^T (A x_0 + b) \\alpha + (\\frac{1}{2} {x_0}^T A x_0 + b^T x_0 + c)\n",
      "\\end{align*}$$\n",
      "Since this is a quadratic function in $\\alpha$, it has a unique global minimum or maximum; as long as we are not already at a minimum or saddle point of $f$ (so that $d_0 \\ne 0$) and ${d_0}^T A d_0 > 0$, this extremum is a minimum.\n",
      "\n",
      "The minimum of this function occurs when $g'(\\alpha) = 0$, that is, when\n",
      "$$g'(\\alpha) = ({d_0}^T A {d_0})\\alpha + {d_0}^T(A x_0 + b) = 0.$$\n",
      "\n",
      "Solving this for $\\alpha$, we find that the minimum is at\n",
      "$$\\alpha = -\\frac{{d_0}^T (A x_0 + b)}{{d_0}^T A d_0}.$$\n",
      "(The same formula, with $d_i$ and $x_i$ in place of $d_0$ and $x_0$, applies at every later iteration.)\n",
      "\n",
      "Note that since the direction is the negative of the gradient, a.k.a. the direction of steepest descent, $\\alpha$ will be non-negative: the numerator ${d_0}^T (A x_0 + b) = -\\|\\nabla f(x_0)\\|^2$ is non-positive, while the denominator is positive. These first steps give us our second point in our iterative algorithm:\n",
      "$$x_1 = x_0 - \\alpha \\nabla f(x_0)$$\n",
      "\n",
      "If this were simple gradient descent, we would iterate this procedure, computing the gradient at each next point and moving in that direction. However, this has a problem - by moving $\\alpha_0$ in direction $d_0$ (to find the minimum in direction $d_0$) and then moving $\\alpha_1$ in direction $d_1$, we may *ruin* our work from the previous iteration, so that we are no longer at a minimum in direction $d_0$. In order to rectify this, we require that our directions be *conjugate* to one another.\n",
      "\n",
      "We define two vectors $x$ and $y$ to be conjugate with respect to some matrix $A$ if $x^T A y = 0$. (Positive semi-definite matrices are ones where $x^T A x \\ge 0$ for all $x$; for conjugate gradient we in fact want $A$ to be positive definite, $x^T A x > 0$ for all $x \\ne 0$, so that the quadratic has a unique minimum.)\n",
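      "\n",
      "As a quick worked example of conjugacy (with a matrix we will reuse below): take the symmetric positive definite matrix $A = \\begin{pmatrix} 3 & 1 \\\\ 1 & 2 \\end{pmatrix}$. The vectors $u = \\vector{1, 0}$ and $v = \\vector{1, -3}$ are conjugate with respect to $A$, since\n",
      "$$u^T A v = \\vector{3, 1} \\cdot \\vector{1, -3} = 3 - 3 = 0,$$\n",
      "even though $u \\cdot v = 1 \\ne 0$, so they are not orthogonal in the usual sense.\n",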
      "\n",
      "Since we have already moved in the $d_0 = -\\nabla f(x_0)$ direction, we must find a new direction $d_1$ to move in that is conjugate to $d_0$. How do we do this? Well, let's compute $d_1$ by starting with the gradient at $x_1$ and then subtracting off anything that would counteract the previous direction:\n",
      "$$d_1 = -\\nabla f(x_1) + \\beta_0 d_0.$$\n",
      "\n",
      "This leaves us with the obvious question - what is $\\beta_0$? We can derive that from our definition of conjugacy. Since $d_0$ and $d_1$ must be conjugate, we know that ${d_1}^T A d_0 = 0$. Expanding $d_1$ by using its definition, we get that ${d_1}^T A d_0 = -\\nabla f(x_1)^T A d_0 + \\beta_0 {d_0}^T A d_0 = 0$. Therefore, we must choose $\\beta_0$ such that\n",
      "$$\\beta_0 = \\frac{\\nabla f(x_1)^T A d_0}{{d_0}^T A d_0}.$$\n",
      "\n",
      "Choosing this $\\beta_0$ makes $d_1$ conjugate to $d_0$. Interestingly enough, iterating this construction will *keep* giving us mutually conjugate directions. After generating each direction, we find the best $\\alpha$ for that direction and update the current estimate of position.\n",
      "\n",
      "Thus, the full Conjugate Gradient algorithm for quadratic functions:\n",
      "\n",
      "> Let $f$ be a quadratic function $f(x) = \\frac{1}{2}x^T A x + b^T x + c$\n",
      "which we wish to minimize.\n",
      "> 1. **Initialize:** \n",
      "Let $i = 0$ and $x_i = x_0$ be our initial guess, and compute $d_i = d_0 = -\\nabla f(x_0)$.\n",
      "> \n",
      "> 2. **Find best step size:**\n",
      "Compute $\\alpha$ to minimize the function $f(x_i + \\alpha d_i)$ via the equation\n",
      "$$\\alpha = -\\frac{{d_i}^T (A x_i + b)}{{d_i}^T A d_i}.$$\n",
      "> \n",
      "> 3. **Update the current guess:**\n",
      "Let $x_{i+1} = x_i + \\alpha d_i$.\n",
      ">\n",
      "> 4. **Update the direction:**\n",
      "Let $d_{i+1} = -\\nabla f(x_{i+1}) + \\beta_i d_i$ where $\\beta_i$ is given by\n",
      "$$\\beta_i = \\frac{\\nabla f(x_{i+1})^T A d_i}{{d_i}^T A d_i}.$$\n",
      ">\n",
      "> 5. **Iterate:** Repeat steps 2-4 until we have looked in $n$ directions, where $n$ is the dimension of $x$. For a quadratic, the minimum is then reached exactly (in exact arithmetic)."
     ]
    },
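    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "Before generalizing, it may help to see these five steps as runnable code. The cell below is a self-contained sketch, separate from the framework we build later in this notebook: it hard-codes the matrix $A = \\begin{pmatrix} 3 & 1 \\\\ 1 & 2 \\end{pmatrix}$ and $b = \\vector{-1, 0}$ from the example above, represents vectors as plain pairs, and the names (`cgQuadratic`, `mulA`, and so on) are ours for illustration only."
     ]
    },
    {
     "cell_type": "code",
     "collapsed": false,
     "input": [
      "-- A self-contained sketch of conjugate gradient for one quadratic on R^2.\n",
      "type Vec = (Double, Double)\n",
      "\n",
      "addV :: Vec -> Vec -> Vec\n",
      "addV (a, b) (x, y) = (a + x, b + y)\n",
      "\n",
      "scaleV :: Double -> Vec -> Vec\n",
      "scaleV s (x, y) = (s * x, s * y)\n",
      "\n",
      "dotV :: Vec -> Vec -> Double\n",
      "dotV (a, b) (x, y) = a * x + b * y\n",
      "\n",
      "mulA :: Vec -> Vec                  -- x  |->  A x, with A = [[3, 1], [1, 2]]\n",
      "mulA (x, y) = (3 * x + y, x + 2 * y)\n",
      "\n",
      "gradQ :: Vec -> Vec                 -- grad f(x) = A x + b, with b = (-1, 0)\n",
      "gradQ x = mulA x `addV` (-1, 0)\n",
      "\n",
      "-- Steps 1-5 from above; in R^2, n = 2 directions suffice.\n",
      "cgQuadratic :: Vec -> Vec\n",
      "cgQuadratic x0 = go (0 :: Int) x0 (scaleV (-1) (gradQ x0))\n",
      "  where\n",
      "    go i x d\n",
      "      | i >= 2    = x\n",
      "      | otherwise =\n",
      "          let alpha = negate (d `dotV` gradQ x) / (d `dotV` mulA d)\n",
      "              x'    = x `addV` scaleV alpha d\n",
      "              beta  = (gradQ x' `dotV` mulA d) / (d `dotV` mulA d)\n",
      "              d'    = scaleV beta d `addV` scaleV (-1) (gradQ x')\n",
      "          in  go (i + 1) x' d'\n",
      "\n",
      "cgQuadratic (0, 0) -- evaluates to (0.4, -0.2) (up to rounding): the exact minimizer, reached in two steps"
     ],
     "language": "haskell",
     "metadata": {},
     "outputs": []
    },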
    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "Nonlinear Conjugate Gradient\n",
      "---\n",
      "So, now that we've derived this for quadratic functions, how are we going to use this for general nonlinear optimization of differentiable functions? To do this, we're going to reformulate the above algorithm in *slightly* more general terms.\n",
      "\n",
      "First of all, we will revise step two. Instead of\n",
      "\n",
      "> **Find best step size:**\n",
      "Compute $\\alpha$ to minimize the function $f(x_i + \\alpha d_i)$ via the equation\n",
      "$$\\alpha = -\\frac{{d_i}^T (A x_i + b)}{{d_i}^T A d_i}.$$\n",
      "\n",
      "we will simply use a line search:\n",
      "\n",
      "> **Find best step size:**\n",
      "Compute $\\alpha$ to minimize the function $f(x_i + \\alpha d_i)$ via a line search in the direction $d_i$.\n",
      "\n",
      "In addition, we must reformulate the computation of $\\beta_i$. There are several ways to do this, all of which are the same in the quadratic case but are different in the general nonlinear case. We reformulate this computation by generalizing. Note that the difference between $x_{k+1}$ and $x_k$ is entirely in the direction $d_k$, so that for some constant $c$, $x_{k+1} - x_k = c d_k$. Since $\\nabla f(x) = A x + b$,\n",
      "$$ \\nabla f(x_{k+1}) - \\nabla f(x_k) = (A x_{k+1} + b) - (A x_k + b) = A(x_{k+1}-x_k) = cA d_k.$$\n",
      "\n",
      "Therefore, $A d_k = c^{-1} (\\nabla f(x_{k+1}) - \\nabla f(x_k))$. We can now plug this into the equation for $\\beta_i$ and obtain\n",
      "$$\\beta_k = \\frac{\\nabla f(x_{k+1})^T (\\nabla f(x_{k+1}) - \\nabla f(x_k))}{{d_k}^T (\\nabla f(x_{k+1}) - \\nabla f(x_k))}.$$\n",
      "\n",
      "Conveniently enough, the value of $c$ cancels, as it appears in both the numerator and denominator. This gives us the new update rule:\n",
      "\n",
      "> **Update the direction:**\n",
      "Let $d_{k+1} = -\\nabla f(x_{k+1}) + \\beta_k d_k$ where $\\beta_k$ is given by\n",
      "$$\\beta_k = \\frac{\\nabla f(x_{k+1})^T (\\nabla f(x_{k+1}) - \\nabla f(x_k))}{{d_k}^T (\\nabla f(x_{k+1}) - \\nabla f(x_k))}.$$\n",
      "\n",
      "We can now apply this algorithm to any nonlinear and differentiable function! This particular reformulation of $\\beta$ is known as the Hestenes-Stiefel method; with exact line searches it coincides with the better-known Polak-Ribière formula, which replaces the denominator by $\\nabla f(x_k)^T \\nabla f(x_k)$. There are other variants, similar in form and also in use."
     ]
    },
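    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "To make the formula concrete, here is the same update as a standalone function with gradients and directions as plain lists. This is an illustration only - the name `betaHS` and the list representation are ours; the actual implementation later in this notebook computes the same quantity over an abstract parameter type."
     ]
    },
    {
     "cell_type": "code",
     "collapsed": false,
     "input": [
      "-- The direction-update coefficient from above, over plain lists.\n",
      "betaHS :: [Double] -> [Double] -> [Double] -> Double\n",
      "betaHS gradNew gradOld dir = dotL gradNew diff / dotL dir diff\n",
      "  where\n",
      "    diff = zipWith (-) gradNew gradOld\n",
      "    dotL u v = sum (zipWith (*) u v)\n",
      "\n",
      "-- In practice the result is often clamped to `max 0 beta`, falling back\n",
      "-- to plain steepest descent whenever the formula goes negative; the\n",
      "-- conjugate gradient implementation below does exactly this."
     ],
     "language": "haskell",
     "metadata": {},
     "outputs": []
    },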
    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "Line Search\n",
      "---\n",
      "The one remaining bit of this process that we haven't covered is step two: the line search. As you can see above, we are given a point $x$, some vector $v$, and a multivariate function $f\\!:\\!\\R^n \\to \\R$, and we wish to find the $\\alpha$ which minimizes $f(x + \\alpha v)$. Note that a line search can be viewed simply as root finding, since we know that $v \\cdot \\nabla f(x + \\alpha v)$ should be zero at the minimum. (If it were non-zero, we could move along $v$ to a better location, so we would not be at the minimum.)\n",
      "\n",
      "There are many ways to do this line search, and they range from relatively simple linear methods (like the [secant method](http://en.wikipedia.org/wiki/Secant_method)) to more complex ones (using quadratic or cubic polynomial approximations).\n",
      "\n",
      "One simple method for a line search is known as the **bisection method**. The bisection method is simply a binary search. To find a root of a continuous univariate function $g(x)$, it begins with two points, $a$ and $b$, such that $g(a)$ and $g(b)$ have opposite signs. By the intermediate value theorem, $g(x)$ must have a root in $[a, b]$. (Note that in our case, $g(\\alpha) = v \\cdot \\nabla f(x + \\alpha v)$.) It then computes their midpoint, $c = \\frac{a + b}{2}$, and evaluates the function $g$ to compute $g(c)$. If $g(a)$ and $g(c)$ have opposite signs, the root must be in $[a, c]$; if $g(c)$ and $g(b)$ have opposite signs, then $[c, b]$ must have the root. At this point, the method recurses, continuing its search until it has gotten close enough to the true $\\alpha$.\n",
      "\n",
      "Another simple method is known as the **secant method**. Like the bisection method, the secant method requires two initial points $a$ and $b$ such that $g(a)$ and $g(b)$ have opposite signs. However, instead of doing a simple binary search, it does linear interpolation. It finds the line between $(a, g(a))$ and $(b, g(b))$:\n",
      "$$g(x) \\approx \\frac{g(b) - g(a)}{b - a}(x - a) + g(a)$$\n",
      "\n",
      "It then finds the root of this linear approximation, setting $g(x) = 0$ and finding that the root is at\n",
      "$$\\frac{g(b) - g(a)}{b - a}(x - a) + g(a) = 0 \\implies x = a - \\frac{b - a}{g(b) - g(a)}g(a).$$\n",
      "\n",
      "It then evaluates $g$ at this location $x$. As with the bisection method, if $g(x)$ and $g(a)$ have opposite signs, then the root is in $[a, x]$, and if $g(x)$ and $g(b)$ have opposite signs, the root must be in $[x, b]$. As before, root finding continues via iteration, until some stopping condition is reached."
     ]
    },
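    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "Written out, one step of that interpolating search looks like the cell below. This is a rough sketch for illustration - the name `secantSearch` and the fixed tolerance are ours, and the monadic shape is chosen only to mirror the bisection search we implement later in this notebook."
     ]
    },
    {
     "cell_type": "code",
     "collapsed": false,
     "input": [
      "-- A sketch of the interpolating search described above: keep a bracket\n",
      "-- [a, b] with g(a) and g(b) of opposite signs, and replace an endpoint\n",
      "-- by the root of the secant line through (a, g(a)) and (b, g(b)).\n",
      "secantSearch :: Monad m\n",
      "             => (Double -> m Double) -- The function g to find the root of.\n",
      "             -> Double               -- Bracket endpoint a.\n",
      "             -> Double               -- Bracket endpoint b.\n",
      "             -> Double               -- Tolerance on |g(x)|.\n",
      "             -> m Double             -- Approximate root location.\n",
      "secantSearch g a b tol = do\n",
      "  ga <- g a\n",
      "  gb <- g b\n",
      "  -- Root of the secant line, as derived above.\n",
      "  let x = a - (b - a) / (gb - ga) * ga\n",
      "  gx <- g x\n",
      "  if abs gx < tol\n",
      "  then return x\n",
      "  else if signum gx /= signum ga\n",
      "       then secantSearch g a x tol\n",
      "       else secantSearch g x b tol"
     ],
     "language": "haskell",
     "metadata": {},
     "outputs": []
    },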
    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "There are more line search methods, but the last one we will examine is one known as **Brent's method**. Brent's method is a combination of the secant method and the bisection method. Unlike the previous two methods, Brent's method keeps track of three points:\n",
      "\n",
      "- $a_k$: the current \"contrapoint\"\n",
      "- $b_k$: the current guess for the root\n",
      "- $b_{k-1}$: the previous guess for the root\n",
      "\n",
      "Brent's method then computes the two possible next values: $m$ (by using the bisection method) and $s$ (by using the secant method with $b_k$ and $b_{k-1}$). (On the very first iteration, $b_{k-1} = a_k$ and it uses the bisection method.) If the secant method result $s$ lies between $b_k$ and $m$, then let $b_{k+1} = s$; otherwise, let $b_{k+1} = m$.\n",
      "\n",
      "After $b_{k+1}$ is chosen, it is checked for convergence. If the method has converged, iteration is stopped. If not, the method continues. A new contrapoint $a_{k+1}$ is chosen such that $g(a_{k+1})$ and $g(b_{k+1})$ have opposite signs. The two choices for $a_{k+1}$ are either for it to remain unchanged (stay $a_k$) or for it to become $b_k$ - the choice depends on the signs of the function values involved. Before repeating, the values of $g(a_{k+1})$ and $g(b_{k+1})$ are examined, and $b_{k+1}$ is swapped with $a_{k+1}$ if its function value is larger in absolute value, so that $b_{k+1}$ always remains the better guess. Finally, the method repeats with the new values of $a_k$, $b_k$, and $b_{k-1}$.\n",
      "\n",
      "Brent's method is effectively a heuristic method, but it is nice in practice; it has the reliability of the bisection method and gains a boost of speed from its use of the secant method."
     ]
    },
    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "Implementation\n",
      "---\n",
      "\n",
      "Now that we've reviewed the conjugate gradient method, let's revise our previous gradient descent framework so that we can implement conjugate gradient (eventually using Brent's method for its line search).\n",
      "\n",
      "Recall that in the previous notebook, we defined a class that allowed us to do gradient descent on arbitrary function-like data types:"
     ]
    },
    {
     "cell_type": "code",
     "collapsed": false,
     "input": [
      "-- Extensions and imports we'll need later.\n",
      ":set -XTypeFamilies -XFlexibleContexts -XMultiParamTypeClasses -XDoAndIfThenElse -XFlexibleInstances\n",
      "import Control.Monad.Writer\n",
      "import Text.Printf"
     ],
     "language": "haskell",
     "metadata": {},
     "outputs": [],
     "prompt_number": 1
    },
    {
     "cell_type": "code",
     "collapsed": false,
     "input": [
      "class Monad m => GradientDescent m a where\n",
      "  -- Type to represent the parameter space.\n",
      "  data Params a :: *\n",
      "  \n",
      "  -- Compute the gradient at a location in parameter space.\n",
      "  grad :: a -> Params a -> m (Params a)\n",
      "  \n",
      "  -- Move in parameter space.\n",
      "  paramMove :: Double          -- Scaling factor.\n",
      "            -> Params a        -- Direction vector.\n",
      "            -> Params a        -- Original location.\n",
      "            -> m (Params a)    -- New location."
     ],
     "language": "haskell",
     "metadata": {},
     "outputs": [],
     "prompt_number": 2
    },
    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "This same class isn't going to work quite as nicely in this case, because we must be able to compute\n",
      "$$\\beta_k = \\frac{\\nabla f(x_{k+1})^T (\\nabla f(x_{k+1}) - \\nabla f(x_k))}{{d_k}^T (\\nabla f(x_{k+1}) - \\nabla f(x_k))}.$$\n",
      "\n",
      "Since both the gradients and the search directions are represented as vectors in the parameter space (`Params a`), we must be able to take the dot product of any two such vectors. We already have the capability to add and subtract them via `paramMove`, though.\n",
      "\n",
      "One option is to add something like `paramDot` to `GradientDescent`, and call it a day. On one hand, that is simple; on the other hand, it seems to conflate two independent notions - the ability to do gradient descent and the ability to use `Params a` as a vector space. Instead of doing that, we can require that the parameters form an inner product space:"
     ]
    },
    {
     "cell_type": "code",
     "collapsed": false,
     "input": [
      "-- We will call this a vector space, though it is really an inner product\n",
      "-- space, since the definition requires an implementation of `dot`.\n",
      "class VectorSpace v where\n",
      "  -- Add two vectors in this inner product space.\n",
      "  add :: v -> v -> v\n",
      "  \n",
      "  -- Scale a vector.\n",
      "  scale :: Double -> v -> v\n",
      "  \n",
      "  -- Take the inner product of two vectors.\n",
      "  dot :: v -> v -> Double\n",
      "  \n",
      "  -- For convenience.\n",
      "  minus :: v -> v -> v\n",
      "  minus a b = add a (scale (-1) b)"
     ],
     "language": "haskell",
     "metadata": {},
     "outputs": [],
     "prompt_number": 3
    },
    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "Now, instead of requiring `GradientDescent` instances to provide `paramMove`, we'll just require that the parameters form a vector space:"
     ]
    },
    {
     "cell_type": "code",
     "collapsed": false,
     "input": [
      "class (Monad m, VectorSpace (Params a)) => GradientDescent m a where\n",
      "  -- Type to represent the parameter space.\n",
      "  data Params a :: *\n",
      "  \n",
      "  -- Compute the gradient at a location in parameter space.\n",
      "  grad :: a -> Params a -> m (Params a)"
     ],
     "language": "haskell",
     "metadata": {},
     "outputs": [],
     "prompt_number": 4
    },
    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "Great! Now we can start implementing these methods. In order to avoid spending too much time on line searches, let's just go with a simple bisection search for the time being.\n",
      "\n",
      "The implementation is pretty simple:"
     ]
    },
    {
     "cell_type": "code",
     "collapsed": false,
     "input": [
      "-- A point consisting of a value and the function at that value.\n",
      "-- The stopping condition is implemented as a function\n",
      "--   Point -> Point -> Bool\n",
      "-- That way, the stopping condition can decide based on convergence\n",
      "-- of the x-coordinate or of the function values.\n",
      "newtype Point = Point {unPt :: (Double, Double)}\n",
      "\n",
      "bisectionSearch :: Monad m\n",
      "                => (Double -> m Double)      -- The function g to find the root of.\n",
      "                -> Double                    -- Starting point; g at the two starting\n",
      "                -> Double                    -- points should have opposite signs.\n",
      "                -> (Point -> Point -> Bool)  -- Whether to stop.\n",
      "                -> m Double                  -- Approximate root location.\n",
      "bisectionSearch f a b stop = do\n",
      "  let midpoint = (a + b) / 2\n",
      "  aValue <- f a\n",
      "  bValue <- f b\n",
      "  \n",
      "  -- Check if we're done with these two values.\n",
      "  if stop (Point (a, aValue)) (Point (b, bValue))\n",
      "  then\n",
      "    -- If we are, return their midpoint.\n",
      "    return midpoint\n",
      "  else do\n",
      "    -- If we're not done, change one of the values to the midpoint.\n",
      "    -- Keep the two values having opposite signs, though.\n",
      "    midvalue <- f midpoint\n",
      "    if signum midvalue /= signum aValue\n",
      "    then bisectionSearch f midpoint a stop\n",
      "    else bisectionSearch f midpoint b stop"
     ],
     "language": "haskell",
     "metadata": {},
     "outputs": [],
     "prompt_number": 5
    },
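    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "Before wiring this into conjugate gradient, we can sanity-check the search on its own. The cell below is an illustrative example of ours: it runs the search in the `Identity` monad on $g(x) = x^2 - 2$ with the bracket $[1, 2]$, stopping once the bracketing points are within $10^{-3}$ of each other (the bracket and tolerance are arbitrary choices)."
     ]
    },
    {
     "cell_type": "code",
     "collapsed": false,
     "input": [
      "import Data.Functor.Identity (Identity, runIdentity)\n",
      "\n",
      "-- Find the root of g(x) = x^2 - 2 on [1, 2]; note that g(1) < 0 < g(2).\n",
      "testRoot :: Double\n",
      "testRoot = runIdentity (bisectionSearch g 1 2 close)\n",
      "  where\n",
      "    g x = return (x * x - 2)\n",
      "    -- Stop once the bracket is narrower than 1e-3.\n",
      "    close (Point (x1, _)) (Point (x2, _)) = abs (x1 - x2) < 1e-3\n",
      "\n",
      "testRoot -- approximately sqrt 2 = 1.41421..."
     ],
     "language": "haskell",
     "metadata": {},
     "outputs": []
    },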
    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "Now that we have our line search implemented, we can go ahead and implement the actual conjugate gradient algorithm."
     ]
    },
    {
     "cell_type": "code",
     "collapsed": false,
     "input": [
      "newtype StopCondition m a = StopWhen (Params a -> Params a -> m Bool)\n",
      "\n",
      "conjugateGradient :: GradientDescent m a =>\n",
      "                     a                 -- What to optimize.\n",
      "                  -> StopCondition m a -- When to stop.\n",
      "                  -> Params a          -- Initial point (x0).\n",
      "                  -> m (Params a)      -- Return: Location of minimum.\n",
      "conjugateGradient f (StopWhen stop) x0 = go x0 Nothing\n",
      "  where\n",
      "    go x prevDir = do\n",
      "      -- Compute the search direction\n",
      "      gradVec <- grad f x\n",
      "      let dir = case prevDir of\n",
      "            -- If we have no previous direction, just use the gradient\n",
      "            Nothing -> scale (-1) gradVec\n",
      "\n",
      "            -- If we have a previous direction, compute beta and\n",
      "            -- then the conjugate direction in which to search.\n",
      "            Just (prevGrad, prevD) ->\n",
      "              let diff = gradVec `minus` prevGrad\n",
      "                  numerator = gradVec `dot` diff\n",
      "                  denominator = prevD `dot` diff\n",
      "                  beta = max 0 $ numerator / denominator in\n",
      "                scale beta prevD `minus` gradVec\n",
      "\n",
      "      -- To minimize f(x + \\alpha d_k), we find the zero of\n",
      "      -- the dot product of the gradient and the direction\n",
      "      let lineVal alpha = do\n",
      "            let loc = x `add` scale alpha dir\n",
      "            gradient <- grad f loc\n",
      "            return $ gradient `dot` dir\n",
      "\n",
      "      -- Stop when the bracket around alpha is narrow enough\n",
      "      let stopLineSearch p1 p2 =\n",
      "            let val1 = fst $ unPt p1\n",
      "                val2 = fst $ unPt p2 in\n",
      "              abs (val1 - val2) < 0.1\n",
      "\n",
      "      -- Find the best alpha value\n",
      "      alpha <- bisectionSearch lineVal 0 0.5 stopLineSearch\n",
      "\n",
      "      -- Compute the new location, and check if we want to continue iterating.\n",
      "      let xNew = x `add` scale alpha dir\n",
      "      shouldStop <- stop x xNew\n",
      "      if shouldStop\n",
      "      then return xNew\n",
      "      else go xNew $ Just (gradVec, dir)"
     ],
     "language": "haskell",
     "metadata": {},
     "outputs": [],
     "prompt_number": 6
    },
    {
     "cell_type": "markdown",
     "metadata": {},
     "source": [
      "Let's try this out on a two-variable function. Since we do a line search, doing a single-dimensional conjugate gradient would be pointless."
- ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "-- We need FlexibleInstances for declarations like these!\n", - "-- We must declare these instances together, because they have recursive dependencies on each other.\n", - "instance VectorSpace (Params (Double -> Double -> Double)) where\n", - " add (Arg a b) (Arg x y) = Arg (a + x) (b + y)\n", - " dot (Arg a b) (Arg x y) = a * x + b * y\n", - " scale s (Arg a b) = Arg (s * a) (s * b)\n", - " \n", - "-- In addition to our usual definition, let's log the number of function\n", - "-- gradient evaluations using a Writer monad.\n", - "instance GradientDescent (Writer [String]) (Double -> Double -> Double) where\n", - " -- The parameter for a function is just its argument.\n", - " data Params (Double -> Double -> Double) = Arg { x :: Double, y :: Double }\n", - "\n", - " -- Use numeric differentiation for taking the gradient.\n", - " grad f (Arg x y) = do\n", - " let dx = f x y - f (x - epsilon) y\n", - " dy = f x y - f x (y - epsilon)\n", - " gradient = (dx / epsilon, dy / epsilon)\n", - " tell [ \"Gradient at\\t\" ++ show' (x, y) ++ \"\\tis\\t\" ++ show' gradient ]\n", - " return $ uncurry Arg gradient\n", - " where epsilon = 0.0001\n", - " show' (x, y) = printf \"%.5f, \\t%.5f \" x y" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 7 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "We can define a function $f = x^2 + y^2 + 3$, which looks like this:\n", - "\n", - "![](data:image/jpeg;base64,/9j/4AAQSkZJRgABAQIAOQA5AAD/2wBDAAMCAgICAgMCAgIDAwMDBAYEBAQEBAgGBgUGCQgKCgkICQkKDA8MCgsOCwkJDRENDg8QEBEQCgwSExIQEw8QEBD/2wBDAQMDAwQDBAgEBAgQCwkLEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBD/wAARCADUAjsDASIAAhEBAxEB/8QAHgABAAIDAAMBAQAAAAAAAAAAAAYHBAUIAgMJAQr/xABKEAABAwQBAgQEAwUFBQUHBQEBAgMEAAUGEQcSIRMiMUEIFDJRQmFxCRUjM4EWJENSYhdygpGhJTRjorEYRFNzksHRJjVUZMTh/8QAGwEBAAIDAQEAAAAAAAAAAAAAAAMEAQIFBgf/xAA+EQABAgMFBQcCBQIGAgMAAAABAAIDESEEEjFBUQUiYXHwBhMygZGhscHRFCNCUuFi8QckM3KS4jSygqLS/9oADAMBAAIRAxEAPwD6p0pSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlfhUlOupQGzobPqawb9frNi9knZHkVzj262WyOuVMlyFhDbDKElSlqJ9AACa5Tv9ru3xGXA59lE67WGzRD4mDQWC5Fl25ST5Ls8lX/vS9AoQpOm2j0qBK3BViz2aJan3GLVzgwTK67pVTcHctTcsEzj3PFNR87xtpHz6UJ6GrnGPZu4Rx6dC9aWgd217SexQpVs1C9jmOLXCRCyDOoSlKVqspSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiV6332YrDkmS8hplpJW44tQSlCQNkknsAB715OONstqddWlCEAqUpR0EgepJ9hXMnIl+k/EuudhNtXLh8VJCmLlcGH1x3slc3osR1pIUmGD9bgI8X6RtHUTLBgvjvuMFVq5waJlL5kj/wARF+auUeYtPF9mkhcCOlJCcnlI0RKcJHeG2r+UgdnVp8Q7SEbl6xVccd5Dd8TvI4bzuT4k+GyV47dF9ITebegaAOtASWk6S4jQ6hpaRoqCbHWa9XYoLIEO6zz5qlEeXGZURzTEnr07ByTHpoteV2BapNluQKgG3CO7TwSQXI7n0uNnsR3GlAEWzw3y7C5Ss8qPOgmz5TYnExb/AGVxW1xHyNpcQf8AEjuAFTTo7KAI7KSpIrPM8vsOC49LyfI5ZjwogSD0IK3HXFKCW2m0J2pbi1FKUoAJKiAKrKx4nysu7K5ziXkWfPVpSm32V5R/d7VrSVKTbZYT/MW51Fa3e5bcI6NpR5oNo2EWrehjfHuOP0W0KLcocF25SoLxHy3Y+WrA5PhRX7Xd7c58rebLL0JVtlAd21gdlJPqhxPlWnRBqdV5cgtMiruKUpSsIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIleLjjbLannnEoQhJUpSjoJA9ST7CvRcrlb7Nb5N2u05iFChtLfkSZDgbaZbSNqWtR7JSACST2Fc55TfLv8AEm4LZGRMtPE6VBT6lhTMrLQD2Ro6U1b+wJ35pAOvK1vxJYUF8Z11i1c4MEyszKMtk8/yXbPYZTjHGkZ1TUuY0spXkjiFaU02odxDCgQpQ7ukEDybKpO1HjxGG4kVhtlhlAbbbbSEpQkDQSAOwAHtVUyGm/h6u7k5t1DPGN3k/wAaOhnSMbmOq7up6eyYbiz5hr+G4rq+lSim2VFKgCCCD3BH2r0djgsgNujHNUYkQvM1
E+Q8DtHIVhVZ7k6/EksOJk2+4xldEm3ykd232l+ykn29FDaSCCRUawzkl4R7tjvJjkK0ZLizHj3NfiBMeXDAPTcGdns0sA9ST3QoKSfYqsaW/HiR3ZUp9tlllBccccUEpQkDZUSewAHvXPeZYHN+JWbDzKCtq0WLHNyMXkyIoU5epQUlQefSob/dxKEgNdi9/M7BLZVbcS0zZjpr/b+FGDPHBSbEYFw5UvsTlPKIc2FZIKyvFLLKR4Z6SnX7ykt+vjLBIaQru2g7IC1EJspwVGMCz9vL2pdqutuVZsmsxS1d7Q6vqUwo/S42rt4rC9EocA0RsHSgpIk66twAJTBnNRuJmohkNgvkG+xuROPJzVuy62t+ElTu/lrnG3sw5QH1Nn8K/qbV5k+4N58S8uY/yxZXpMBtduvNsWI16sslQ+atsjX0LA9Uq9UODyrT3HuBz/yRnczHTCxfEosW5ZhfipFqgvu9LbaE/wAyW/rulhoHZI7qUUoHdQrS4/xQ7x5MYzfj+7eBniFLeuF4lFRTfFLILrUxIPmaJGkJH8oBIb6dCudtLZothL4A3xjx4c+jkp4Mfu6OwXalKrbiHm2y8oMvWibbnsfy62Nhd1sEtXU4xslIdZc0EyGFEeV1HbuAoJVtIsmvKOa5hLXCRC6AIImEpSlarKUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlURcvic/sFzLdeL+XcTex+1u9EzH8macS5bpEEtpC1yVEhTCkO7QpRSW09bfUpPUkqveq55w4ud5JxZLthcjxcrsalzbDLkJJaD/TpTDw/Ew8nbbif8qtjzAEFgzlRWEw+xKZbkxnkPMupC23G1BSVpI2CCOxB+9eyuS+On8yx61KybhJ1ERmKtUG8cd3uSsQ4cxhag8xFd8yoLmyenpSplY6FBACuurx475ww/P7grGXUysfyyOwH5WO3dAZmIT7rb7lMhsHt4jRUnfqQe1FoyK2JQY6Kw6UpRSJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiVoM4zrFuOMalZZmF1RBt0QAFRSVuOuKOkNNNpBU66tRCUtpBUpRAAJNRnkzme1YLJbxiw2mRlWZzGw5Dx+3uJDobKukPyXD5Y0cHe3F+uiEBatJMEsmD5Her+zyBzBdYd7yNjZtsGIhQtdjSr1TGbWduPeypKwFqGwkNpJRViBZ3xzTDVRxIohjiobbcluPxHZncYvJVsuGO2jF5CXoWCTWyhc1pYBj3GeoEokJOldLCdttKB6+paR0XCoAAAAAAaAHtUM5Pwe7ZCLfleGTWbfmGOqW5a5L2/BkNrA8WHIA+pl0BIPqUqCVjSkg1nYFnVuz2yrnx2VQ7hBeVCu1tdUC9bpiP5jLn6eqVei0lKhsEGu3Z4bYIuAfyqER5fvLcXCFFuMORb58dD8aU0pl5pwbS4hQ0pJHuCDVQ47c3uFMgj8cZXdXXcTualDFbtLc38mUp2bZIcPp0gEsuK+pI6CepKeu3bnPhWqDIudzmMRIcRpT8iQ+4ENtNpG1KUo9gABsk1Sd9sX/tQwv3fe4EqHxU6kOJQ4FMSsjWD5HAOy2IqSApJ8q3To9kfXadld8XXstAZ44L2tLn8/XUSnmFM8YQHiW2nAQrKHknstQ//AISVDYB/nEdx4fZVsKASOkAAAAAAdgKrjBcnueFXaHw9yHJBnIaUjHLwrSWr3FbHZB12RKbRrrR+IDrT26gmx3KsQaieea0eVBeQ8AcyVyJkuNXFNmy2zpUbbc+jqSpJ7qjSEj+bHXodSfUHSkkKANRf/bxaYthmN3yzSI2aW99u3PYs0rxJT81wHwkxz/iMuAFaXuyQgKKunoUBYWYZZY8IsUjIshmfLxI+kgJSVuPOKOkNNoHdbilaSlIBJJFU5I4jy7Ori3zXc5y8c5CjI1jsVR8SNaoQ6v7nJSk6fLwUS8r1SSkII6OpUrrzT+Xjn9+emvusAgjeUz45we52MTMuzOU1PzHIA2u5yG0ANxW0jbcGP7hhratb7qUVrPdWhLnCai2D8jMZU6/jt7trliy22socudmfVtTYJIDrK/pfYUQelxP6KCVApEocNXoF26LuHWPFQvcZ1USzfHYU5prJ2b87jV6sKFyIGQRlpQ7AA0pXUVeVbJ6R4ja/ItI0oaqf/Dz8UMPktmFjWeRk2XJJyVuWh9bKo8TIoqfpkxAskoUpI6ywo+IE+YbTsikZ7g5uvkiyNIdGB2KX4U98KHRf5jZ80ZI9TGaUP4h7BxY6O6UrCptkuNWbJ7cLZd4niNtrS8w42otuxnkd0OsuJ0ptxJ7pWkgg+hqlbdls2mC9lHDA68+HHHyU0G0mz0NR8Lrelcv4Xz/k/FLMexcyvSL7jTISxHy2OwpcmMgI+q5NJ3sdiDIbGu460p7rrpa13W2Xy3RrvZrhGnQZjaXo8mM6lxp1tQ2FJUkkKBHuK8ZabLGsj+7jNkesF1ocVkVt5hWVSlKrqRKUpREpSlESlKURKUpREpSlESlKURKUpREpSlESlKURKUpREpSsG+X2y4zaZd/yO7w7XbIDSn5UyY+llhhtI2VrWohKQB7k0RZ1VPz78TnE/wAOVhVdM9vXiXFxlT0KyQul2fMA9VIbJHSge7iylA91egrm7nL9oa9LMnF/h4twcTpTbmW3Ng+Ak6HeHGVpTx7nTjnSgaBCXAaq/wCDn4f5vP8AyhM5c5Ek3C9WHHp6HZU+5ul96/XZshSGVKWD1R2PqUlOkhfQhI0lYHomdnbRAsR2ltD8uF+kHxPOQaDgMy40AqAaLS+CboXfnCGYZ3yBxva825DxFnFrjewqaxZ0ul12HDWdx0vqOtvFvpUsAAAnWtg1Pa/AABoV+151bpSlKIqH5mw+Zx/kb3PGHW9b8ZUdLOZWxgEmVDb7pntISCVSGBvY9VtbT3KUa9GS4vifLWLx3BNOpDHzNnvdue6JUJS07RIivp7oVoggg96v1aEOoU24hK0LBSpKhsEH1BFcUZ9xnnPwx8hKvnFt+Sxx9llyDqLLch/2Tbbg7sGKXAFLiNvuEKacSOhLhKFJUFo6chU7TBJ/MZiFuOF+aPiEwdmfg3JttPIsnFHRHuD0RKI16+VUT4E1DaulqaytAJ6klDgUlaCFqSSelcD5TwPkqO+7h+QsTH4auiZCWFMy4a/8rzCwHGj/ALyRv1GxXLt7zA5jPgZLjlpfxrlzFG3vBxq7vIYcvETsZMNLg2mQwsI6m3UEhDiW1KA7pNhR7FxtzbaLXn8Bh+LckpIiXeA6qHdLe6k9LjKnUEKSpC0lK2lbTtJCgRW12eCjZbC3xhdGUrme4c48ncG5LYMY5Bs0vP8AGsgdVEg5Fb2mY9wiyUp2mPKYJS08twBRQtstlRSUhClFIVdeEcr8e8ipcTiOURJkqP8A95gr6mJkY9uz0dwJdaPcdlpB7itSCMVdZEbEE2lS2lKVhbpSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlK/FKShJWtQS
kdySdAVUN85+avbsixcH2L+290bUtldxS94NjhOpV0qD83RC1JO9tsBxYI0QkHYyGlxkFgkNEyrUul1tljt0i73m4xoEGI2p6RJkupaaaQkbKlKUQEgD3NUfeuUc75hItfCrzmPYs4VJk5vKjJW5ISCAU2uO4CHNjepLqfDHYoS7vYgF/x7I7VyJar58SF5i5zjt5W2xBUIi2LVj11KiEIMPqU2tlzaUokPFbiHEgbAWOnoAgJT0gAADQA9BV2DZZmb/RVYlpyYqGkYWj4cbs/wAiYzIudxxe6qQc1amyVS5Yd7JF3Di9rWpI7PIB0UAKQkFHSq6Y8yLPiszoMlqRGkNpdZeaWFIcQobCkkdiCO4NZLqEOoU06hK0LBSpKhsKB9QRVLWxSfh8yNvGpanBxxkU3ps74b/h49OeX/3NZH0xnVq/hKIAbWejelI10mgQ8MFULr/NW+uqY5rKOKnXefbDIjx5EJDMW/W5RSgX6J19LbaSdbloKz4PqVElv0XsTjPOT8dweTDsriJF2yO67FssNuCXJszRAKgkkBttOx1OuFKE+6vSo/jPHl9vl/j8h8uuxZd8iLWuz2eKsuW+xJIKdtlQBeklJIU+oD1KUJQN9Ux3t0Y/C1BlUqM4vb5vxEx7byDmrYjYSVfNWXFg6FKkKSryyLn09i4kg6i7KGz/ADOpYARcKgANAAAdgKq3IsbvfE1/nch8fwn5+PXFwyckxpgdSuv8U+En8Luu7jQ7OAbGl/VYdlvtoyiywsisE9qbbriwmTGkNHaXG1DYI/8Awe4qeDShx664LR7sxgtRneEWDkHH3sdyKOtbC1JeZeaWW34r6Dtt9lwd23EK0UqH/pVaR+XX+KmpOKc53IJmQmuuz3ppg9OSNbCUtttpH/ftlKVMJ7rKgpAIJCbOzXM8fwOwyMjyWd8vEY0lKUJK3X3D2S002nzOOKPZKEgkk1VrvFlz5lQ5lXL8eTbF9l41ZY8npcsHoUS1rQdG4EgErSSGh5EE7Wtczp3vy/F1j9PtNahwlvYLPxDEciy6/scpcpQzFlshRx/HSsLasrShrxXdeVyYsfUvuGwehH4lLsZw1AMYz662S/R+NeUCli9uoP7qu+koi31tPqUkdm5IHdbPbf1I2NhM9cIq5Z5Sp56z49clHEcZ1UNz7j+3ZqzHmNTX7Pfrb1Ltd5hgCTDWSDob7LbUUp621bSsDRFVJN5Fy/OMoPANylxLDeWllF8vkF4+FKipbSsswe4U3LdSolTZJLKApQKvITYOe5nf7jeTxpxi6ycjcQly5XNxHiR7BFX6POD0W+ob8Jn3PmVpIO/I8NYP/YhOCyYb8mOHPmlT3Hj8+uaVdRmF8aUHysdXWNd+w7dq3LTEce78+PDnxyw5YDw0b3l9/wCFJLXZrVj1qi2Ox29iDb4LKWI0ZhAS202kaCQB7V5Ofaq4YzTKeNJibJyl1zrEt3w4OXp6EoAIJCJ7aQAwoAa8YDw1H16D2O65CzxrErVGFri/vW+XlwRbJbmjszHyN7JG+lpA2tbh7JSD7kA9CHFYGkmksuvaWOShcDPmtRyNllz+ejccYZpeS3thTiny2HGrVC2EuTHgex1shts/Wv26QojJwGBlvw/RW/8AY5OVMtjY8S4Y1dH9x7ksIALjTuiYr6ikEqSC2oklSCdEe3BMJ/sjDlTrrKbuWSXp0Sr1cw30GS96JQkEkpabT5G0b7JHuSSZA4a2fYYduYRaWznhw5cdfTALDbQ6Afyz/KvPinn3AOWlSLXaZb1ryOAP+0MeuiAxPin/ADdGyHWz7OtlSD999qsivn7y7LssWJb0ItIn5ZLf8HHG2JC40tMnXd1uQ3/EYQ2na1uJPZI13JAM/wCN+c+aeLIkG18lOL5KsqI7SH7lEZQxd4jgSAslvYblN72R9LoHr4hrx1v7OR7O4mzb7RpiPLPyyyXXs+0ocQDvN0n0XYdKinH3KWBcpWxV1wfI41xSzpMmOCW5MRZ9EPsL04yrsfKtIPv6VK686QQZFdLFKUpWESlKURKUpREpSlESlKURKUpREpSlESlKx59wgWqG7cLnNYiRWElbr77gbbbSPUqUewH60RZFeDrrTDannnEttoBUpSjoJA9yT6Vyxzj+0O4e4ztzrHHyV8j3xagzGYsz6PkS+rYShUw7bJ36hvrUkdyAO9cd8q818tc49TXJmUf9krIUMdtXVHtiNaIDid9ckgj1dUU7GwlNen2F2S2nt98oDLrKEudMCR0pM4HD1koItoZCxNV2JzN+0C40wdcix8Xwv9oF6ZBDkiJJDVoika34szSgsjv5WUuHYIUU+tcB5jzHyz8RN5XlvKmXuXKxtu9VkscZn5a2M6J/jpYBJcPshbqlr1s7HUAIfktwgznHccRKag2e3JQu7yAoNtoR2KYwII6SoaKvsnt+IV5RVZjn12tmH8cWh1Eu9SW4FsK0BD0xxXoGG1dkIAG1OuAJSkFWiBuvpWzuy+xuzrvxdoPeOhTMzKpGJDcA1uF505uoCS2Rr97EjUFJqxuK+Lck515DhcX4pIVEU+j5m7XNKOpNrgJIC3T2I8RX0NpPqo7+lKiPrdgGB4xxjhtpwLDbamDZrLGTFisglR6R6qUo91LUSVKUe5UST3NVl8Jnw1Wb4aeM28d8du5ZPd3BPyO791LmSyPoSojq8JseRsHXYE6BUau2vmnajtJG7SWzvXbsNtGt0Gp4nPyGAVuHDEMSSlKV5lSJSlKIlazJsZsOZY/cMWye1sXG1XRhcaXFfTtDrahog/8A2I7g6I71s6URch3fC4Vpv0TgjmBld1jeaZhGQKdcbkyGmkn+F8wCFtzmEE7UlXU43peyesCGoZzn4Ur/AHnPrxd5eYcY3XT17dQyDdbXI2EpnuoTpDzfQAl5aAlZCUrKVEKJ7D5M45sfKWKSMWvjkmPtaJEObEX0SYMpB21IZV7LQruN7B7gggkVyJY+ZM9wW/5HxZ8T+KwU/uZAC8gtbRdg3C3ODpRKfjbUtptfcLUOpCVdYX4YAKpGEGhXLtMAwjeZVpxH2V1fNcbc8YLKgQbtCvtluCOhTsR4FyO6k7StJHmaebUApJ7KSpIPYiofjFhx7kCTL405pskGXnmLsJUzemEmLLuEBe0tXCM+30uNrOul0II6HAR9Kkbrm04Fi+JZdbrTHu8m3Wa/ISnDMzsbyWnYi9KKbXKcTtqQ0Adx/GSpJH8P6gnq33KEfmewRoGSS7Sxfbvi7xlWnJrHGIfU1/jQ58IdS1MPIASpTBWevpcDaSgVMFVaZGTT11irTVaucONYRlcf53/bW2QGSoY9lSOuW8lI34bFyb0sLOtAvoe2T3Ir34B8YOCZHj8G955YL1gCZvU3415aCoKHkL6HGlymypDK0L2lSX/CWCD5azOI+Vsa5iwa3Z1i7jgjzElL8V5PRIhSEHpdYeQe6FoUCCD+R9CKi2XtjhrLpXJ8KO49hmSPIRmUFCOtuC90hDd2SgD0ACUSP9AS5/hq6hgtdUKzCtb2m66q6Is1/sWRQ03CwXmDcoq/peiSEP
IPbfZSSR6EVn1Rc/gTiC7yTfLXjCLBcZCkvqumMy3bRKe7dip6IptTgI7eYkEVCLrP554m5Btlki8wtT8Qvwbg2R3KbYiZ4Nx7/wB0kSGPBdHij+U6tS/MkpIUSnejoDhgrTLZDfQ0XVdKp1vlHmuwDw8q4O/fYT/7xid8jvdQ+5ZmmOUn8gtX5E+lY1y+LHAMYdgsZ/imdYq5cpIhxBOxuRIS9IIJDSVRA8kqOjoA96iLHDEKdsVjsCrrpVWxfih+H999ESVyvYLXJc30R7vI/dzqvyCJAQrf2Gu471MIXIvH9yQl235xYJKFp6kqauTKwR9xpXpWqkxUipWvTkNgWkLRfLepKhsESUEEf86wJfIGCQEeJOzSxR0f5nLiykf9VURb+lVxdPiP4Asz/wApcuaMLZkd/wCB++46nT+iEqKj/wAq1bnxOcdyXAzjNozTJFqG0KtWKXByOrXqPmVNJY2B30XN9teugcgE4LBIGKtulUHafiI5Dzy/3zFcA4Yct1xx9bLc8ZfeWYCmfFR1tueDGEham1p+hXYEhaSUlJFbJ7E+cMqKf7YcyNWKIo9S4OIWlEdZGtFtUqUXlkf6m0tK36EVu2E52SidHhtxKs7LM4w3A7abzmuVWmxQQQn5i4zG46Co+gBWRsn7etVs9zzkGWqficN8ZXa9eH2F4yBDtltR2NhSFOtl99J9QppkoOvrAIJra78d41wLyOzytKta75jFzaZg3q63uSu4z7FJ69NzkyJBU4mMskJdSFdKCELACQquhisKAIII120e2vyqZlnB8RUD7X+0LnHL7TmMjkCxW/4mchayXD8kQ1AhxrSyu32SHdyVgRpjBcUuS28lSUtqeWpHiI0UArTV/W+2W2ywI9os9vjQYURAaYjRmkttNIA0EpSkAAAewrFy3FrFm+N3DE8mgpmWy6MqYkMklJKT7gjulQOiCO4IBqu+M8tveNZE/wAKcj3Vcu9Qm1SMfu8jSVX62j0USAAZLI0l0D18qx2UQLTGiHQKs+IYlSVYt+stpySzzsfvsFqbbriwuNKjujaHWljSkn9Qaq7j7Ir5gOTs8J8gXBcxS2Vu4le3lEqusRGyqK6o+sphAG+5LjYC+5C9W6qqd5vyfBr3Ce44Qzdchy5tTMyDbMcIVcbfICtsSlOEhuKkK79bykpI2PNvRnNKqNrp0VrrISCokAAbJPoBVJZrlk/muJc+NeMbDBu1jlhy33vKLkjxLVGT3C0RkAhUx9J/ykNoUO69p6TH8Ai5jzDNueE/EXkPyd6saW/nMNtBVEhToq0AJlOPBRdmMrV1ApSUNhQU2tC9BR6AhQINpgx7Xa4TEOHEaSxHjsNhttptI0lCUjskAAAAVK2cQcFgkMPFUXxDjVu4OymRgOVM/N3G+rKrRmM5xT0u+oSCfk5LyySJDSR5UbCVo7oSOlYF3rNafOcLsXIGOyMayFha47xS4060vofivpO232Vju26hWlJUO4IqF4FmOSWS+/7KOUZKX7800p6z3pLQQxfoifxaHZuUgfzGu2/rRtPUETw/y93JaudermrFWa5/5NyZPw13d7MLDFevFjyVx5+ViELRmCWlBW5MgN+hQddUhJ0kd3dg9QVYWa8pC13tOB4PaTkmYPoDnySF9Ea3tn0fnPgEMN+4TouL0ehJ0SPHBeNRjk+TmOV3MX/M7owlifdVNlDbTIPUI0Vok+BHBJPSCSo91lRqY75kzHXTrT1Wgddq70Wh46xORlEq38x53dbfe71LjeJZmoD3jW2zRnRvpin0ccUk6XIIClDsAlPlqx3CKqufYb9wpdJeQYJaX7thE5xyXdccigF+2vq8y5UFJICkKOy5H367U33JSqwrFkVkyuyQ8jxy5MXC2z2g9Hksq2haT/6H2IPcEaNWYEhunHqqjiOzyWDl+KWDN7DKxnJbeiZAlp6VoJIUlQO0rQod0LSQFJUCCCAQaofJOWsy4wu6OG5dziXa4SWWU27LJax4VrYcV4aXLsAAEO7H8I9g+rSfKQVGy845Bu716e454wjM3HK/CSuZJdHVCsTS/pelEEdSyDtDAPWvW/KnahlY1xdjGO4zPx2Yx++lXwrdvsy4oS49dnnE6ccf7aII8oQAEoSAlIAAFTFpiO/LMjmfp1hzosB4aN/0+vWKyMLwu1YLY02q2qXIffcMq4T3jt+4S1/zJDqvdSiP0SAlKQEgAbdyqwLmU8JBLDrUnIePWEgJeSVv3KxNJSdhSe6pUcBI0Rt1Gz2WnXTLLnyDh8DEF54q/RpFjDXityoqvHD++yUNBGy4tR8oSnaiTrW6uwXtaLppLLrLj9VC+ZMxWa9Wd5RYMRxuXdskbU/EIEdMRtnxnZrrh6UR22/8RayQkJ99/aqVwzijOePLkvky12yPJkTGnm14cuWVNWiGtwLSxb3lbS24ACXEDTbi9AFASDU/xLGciyi+M8l8kRjFltBRsVh6gpFnaUCPEcI7LlrSdKUNhAPQj8SlT1wip2wvxBER1JYa8/49a4amL3YLR59fXoxfFM6xvOIb0mxzFF+IsszYT6C1KhOjsW3mleZB2D69iO4JBBrxzDKrPhlhlZFe3lIjRgAEoHU484ohKGm0+qlrUQlKR3JIFa7OcHxe4vLzOTcnsdu1ujqP79hvhhxllAKiHSrbbjadk9LoUn31VP4tl2Y3S7Qc/wCWLRNlYrBQoY5cIENQaWVLUn94TIqSpbalt9HhqAKUpWtRCOrQsmO+FJjsTnlzIy+CaT00DQ/eGGn0GqsXBcWvf7xlchZ2hIyS6tlpqGh7xGrRCJBTEbPYFRICnFgeZfbZSlOpc6qvG33e13y3tXWy3KLPhSE9TUiM6lxtwfcKSSDWtybIbPilkmZDfprcSBBbLjzqz6D0AA91E6AA7kkAetdKC1kJk50xn9Zqq97nu4qGcnzLLi6omT25mcxmcp1NvsT1mlmHcJMhQJS14qSNtDutYXtASkkg+lWjgnxGc94FBixOSINs5IYT0fNTra2i2XJvsOspZP8AAf770OprtruaqrCbFd7zdneT80hLjXWaz4Fqtrqgr9zwFdJ8M6H85wgKdPfR6UAkJ2Zk6qqcfYlk2tOLaGSJwIoZanUnQ4CQxmrEO3xbJJkMz1zHl/C6TwT4oeGs8nRrExk/7kv0pXhtWa+tGBMcXvXS2lzSXvbu0pY/OrXBBGwdivmlytcY8i3MYbEtVvut9v8A1NQIs1oOtNJT/MlOpP8AhtA7PptRSkd1Ctjg9vzfi23xIXH/AC9mNs+VYQ0pqRO+fiOqSkAn5eSHG2we500EevbWhXl7V2Kj94W2J4cBrQ8p1BMq5ZarqwtuQ7oMdsp6V89V9HaVxjaPiv54xwD+0uPYblcZsbW5FW/aJHQB6nqL7ale5+hJ+yfaWYZ+0C4xyS1x7xesFziyQ5HV0TBbEz47gCinqR8qtx1SDrYV4YBBBG68/adg7SsbrsWC7yF7/wBZrowtoWWMJtePOnzJdRUql7Z8ZPwz3Ieflm1Ww70U3ht62lPfWyJKEaH5nt71MrTzfwzftfuTl
jD5+96+WvcZ309fpWa5T2OhmTwQeII+QFbDg6oKm1K0kbN8Mma+Uy2zPbV0jw57Stn7dleteMzPMHt+/n8xskfp3vxbg0jWvX1V7Vqsre0qvL98RPAmMBP7/wCZsLgqWOpCHr5GC1j/AEp6+pXofQGq8yH4+fhcx1DKnM+l3BUpwMx026yTpIddV9LYWlro6j7AqHv7A6khwYkX/TaTyBPwCtS5rcTJdDUrlK7ftCcIbCkY3xRnd1VryOvNRIbG/souP+IP6NmqW5O+P3nz92OSMYxnFMXLiyzDjjxrtNlOq/loSohpts+vUShYSATvQJrt2fsttm0tL22ZwaM3C6P/ALS9gScMVWdb7M0yvifCvwvoqVBI2ogAe5qtOSPiS4R4nWqLmvIVtj3AelsilUyer09IzAW7ruO/TruNmvm5d8w5Zzm2Mt8rcq5JkUlxsGXHTOVEgLcIHUPlo4baWkaAHUk+m+xJ3Bb27DxtljGcLtsKHdrn1CMhlhKUMIH1yFga8qN+nuopHvXsbJ/hhaxCEfaMYMFN1ovOrgMmzJpKteRVN212ON2E2fE0C6rzv9o9mWU3O74vwrx41ZWIQDf9osiUHilavRKIbSgOvp0sdbvl6k9SPw1z1n2Y5jyI47d+WM+u2Rtt6fWxOk+HbWSjZC0xEdLCenvpRSVaA2o+taizWeHj1ratcHqKG9qccWdrecUdrcWfdSlEkn86geVZIclvxxK0W6RdIUMBdxSwNNvu+qGFO7CUIH1L9SeyQD5hXurB2V2P2bs7Yr4QiRyZC9vG8agASIk3Mhs6E5tVZ1qjWp0pybwp16raWJhV+npy2bGLEdpBas8YgANskd3yB2C1j09wjQ7EqrAyTPGUyTjGJqbuOQSULSygd2I5T9S3ljsAnY2keYkga7145AzNRAVOzG4qcbWoNR7NbFltD7h7JaKzpbpP/CnW9jQrAhSLXhqFKnstOXyajqMGFrpjMp2UtoHZLbSe+1q6QVEk+oFdaLaIsJvcNPdg1e8+KugEw1zsGDeIAndk2ssOGDVflqw6LYrezLyaUu8TWnA6lCWyW1SVn1Q13LjqlnsVbVsgJ6RoV9Pfgt+Flzii1nlLkOF/+vL9F8JMVzoULHCUeoRkFOwXV6Sp1e/UBI7J2qpPgB+GG9X/AOT+InmeysMpKy/h1ieYV/d0fhuLvXrqcI34QKQEpPWO5SR9Aa+M9re0sG3gbO2aC2A3E5vIzJMyQMpmprIUC6kKGW7zsUpSleFU6UpSiJSlKIlKUoiVXXMfDVp5WtsWSzNVZsnsqlPWS9sICnIq1DztrSezrDgAS40rsod+yglQsWlFggOEiuB5HF8q6Rspwu222Fi2WMI/7cxGStxVjuKz3anxAkpXHDihtElnRStJC0qUgipdwHy/kolL4h5mtU2w5ha0gW1+4OoUi/QgNB1l5J6XXUfS4Bonyr6U9RSnoHmriI8mWqLcseuybFmNiUt+x3gNdYbUR5mH09i5Gc0AtvY9EqGlJSRxpkub2DmO05DwT8SOLL48zKwyWwxPW/8A3RqXsmNMiS+xQFlPlCunrSopBUeoCxDfPmuPabO6Fxb7j7qxOY+J/wBwZqvn3CZF3tksRUsZKLI+WpTkdvZTMQ1otyHGh9TTiFBxA1oqSipZZ8l5giWBme/b7By3i90jJcYnWhSIE56K4O/VHdUY7+0n1DjQOtdHeq34wyLmjG0P4w1dWcmvFjbBuWMX6T4ctxrzdMq23Ag+Ow526UvhaknaFPJ1XtxTmHGOKM2FldXKx7F77KUqVj13iiLKxuc4skvtaJbdgOq31raK0NOKCirpWrpnEhVVgXESxl1zW94P5vxHGM6c4InXafBiP9TmJxb5DchzIiEfzbasOJAWlvYLK0FSVNno3tA30DluKWDO8cmYvksJMq3zkdK0nspCh3S4hXqlaVAKSodwQCKivJ3GuH8xYZPwrMrezKhzm/4bxbSpyM4O6Hm1KB0pJAIP5VUPE2NZtYJdw49sfKVzsGYY/wBL0u1XhKrtZ7pFV2amxEurDzTStaU008kNL2kp10KVKARQpea/eBkeqq2uJ8xv8S4S+I+R57b+W2FkyI0zQQL3auvpampSOwWOyHkj6XBvQStG5jneG2XkTFJ2I5A24Ys1IKXGllDsd5BCmnm1DuhxCwlSVA7BSKojlW3fELNgQLqzxzZL1k2NOqn2K945dfAdQ906Wy7DldALLydoWjx1dlAjSkpIkeF/F1xXfYfhZg9ccJvMN4wbrCvsB6OzAmpSkuMKllHgK11AghfdJSr0IpTArap3m+ylfGeXSrzJuvFvJTUKRl2Ntt/NLMdKWrvCX2ZnNIO+ytdK0j6HEqHoUk7HI+BuFsttk+1Xzi7GH2bky4w+sWxlDulpKSpKwnqSrROlA7HtWnz2zRuRrNbs64ryO0OZVYV/NWK5tSUOR5CSAXIby0bKo76dJUBvR6FjzISalPHeeW/kPGWb/EivQpKFri3G3SBp+3zGzp2O6PZSVeh9FJKVDaVAnW7kVm+RvBU3xbxBw9bbzN4i5G4Ywd/ILQlT9qub2Pxf+3rWCAiRvo8z7e0oeT6hRSvQDgFW/C4X4etrhdt/FOIRlkEFTVmjpJBBGthP2JH9axeVePnM+ssZ2y3U2fKLDI/eOP3ZCQVRJYSU6UNeZlxJLbiPxIUffRGTxhyEnPrAt2fAVa7/AGp35C+2te+qFNSB1pGwCptWwttfopCknt3A1DQDKS2MQuE5qB45abX8POcjEmLTEice5pOK7Q+ltKUWe8OqKlwlk9wy+olTR9EuFTf420i7wSkdI7D7DsK02V4zZM1x2fiuRwhKt1yZLL7fUUnR9ClQ7pUDogjuCARUB4sy3ILDeHeF+SZypWQWtkv2e7LBAv8AbE9ID+/T5hvqSh5A9+lY7L0MgSol68J5rM5Vw29i4QeV+OoiXMwx9vwlxfE8NF6tpVt2C4d633K2ln6HAPwqWDK8JzWxchYrb8wxx5bkG4tlSUuJ6HGlpUUuNOJPdLiFpUhST3CkkVvOo+wqlc1LvBWVP8pWlS1YZfJAOY2/q6hb3TpIuzKfwgdg+n0KQFjuk9SUqoHXhLNXDPhw7nDkW64Rm5EWU0pl9lxO0uNqGlJI9wQdVUmD3Cbw/kzHEGTzHXsduCj/AGLur61LISNlVrfWe3iNj+SSdrbHSfMjattdviL4ht7q4dryr+009ASpUHGIr15koCgCkrbiJWUAgggq0Nd6iWev8o814pMxS0cMtWa1T9dNwyy8CHJZUlXU1KjsRA8sLQoJWjrW0oFPfVbUyRpOBwV5KI9ftVHc35Zx3lsU4XZLrcbxnNnlJmWprFUJl3G0T0JPQ66d+FHQQSlXjqQhaVlJ31VHsHxnN8pvr/FnxG8mXe43i2x0Px7faum12y/wUEJEvraAkLWFaDzPihAUobR0LG70xfEMUwa0N2HDMbtlitrRJREt8VEdoE+p6UADZ9z71IJvWC4MKpTDci5i5nXPxDP5rfF860tNIu1ntLyXrvLQtHaS3JO22GFqCkgt
BxWwoBxCk1bmH4JiPH1qNnw+xx7dHccLr6kbW7JdP1OvOqJW64fdayVH3NaXk7jqVk64WXYhcEWfNMfCl2q4FHU28g91w5KQQXIzmgFJ2Ck6WkhSQayePORIWfW2QHIa7XfLU98perQ8rb0CSPUegKm1fU25oBaSD27gSsEjIrDnzExgsbkfjmPmrcO72ud+58qshU9Zby2jqXGcOuptY2PEYXoBxsnSh9iARjcb8kJzVibZr3bxZstsSksXuzLc6lR1n6HW1f4jDgHUhwdiNg6UlQE6UT6CqG5iv1suGZRG+J4si88r462fCRbkgsNRnNFUW5vkpabYWNLDa1eJtIW2kqFTeHeC0Dr26ev4V0zpkSBFenTpLUaNHQXHXnVhCG0AbKlKPYAD3NUPmDV2+JuCizYc5Jx7DoktEgZeW1Nz5TzatpVaknRQkKA/vKx0qHZCVg9Y/eO7UnnREy7cyXVybc7RMEedggQWLdZZCCFIS+1vqmqOgtLzhLSx0qQhNXmoJSAEgADsABoAVM0d6OCwXd2eKpnhWXG49c/2K5RbGLZkbHiSo05K1LRk7Q11zg4sla5H0+MhZKkq7glJBq2lnuTWizzBLBn9pbtl7bcQ7FeTKgzY6y3JgyU76XmXB3QobI7eoJB2CRUHsfJVzwWWvDub7jEhvsg/uzJ3AmPBvDKUk7WT5GJSUpJW0dBWipvttKbEP8vddhr91G51+oxVnLO/SuZs0XlzeVXmJ8MDjxaafcRmDcdpkwmX1EFxUEuEI/eYBWSnu1sjxu/SDPXr1mXNZXFxKRcMTwhWuu/dHh3C8I7EphIUNsMqGx8wodSh/LSAQ5U+x7GrFh9jiY3jNsYt9sgo8NiOynSUjeyT7lRJJKjskkkkk1KG99hQa5+XX3Wt/usanT7qJ8Qp4/GGIXx2D8ouQ4ZynwfnFTgdPfN9XnMjqGldXfsNdtVL3SaguYce3mNd5Od8W3KLaMkeAVNjSWyqBeQlOkokpT3Sv6Ql9PmSAAQpO0nXtc6YrDhyY2aNP47kkBCBJsL6S7KecWrpQIgSP72latBCm9kkgKCVbSLcJwh7r6fHXD5Ubpvq2vz11RS/J8lseJWSXkWSXJmBboSOt9906SkegA91KJIASNkkgDua5+tfFGY3nInea7FaYeOyFSRNtOG3ArTEeB2FS5iUkhmc4gnpUhJ8MaCgs71YlgxG/wCdXaJn/KcL5f5VwSbHjKlBbVrI+l+QR5Xpejvf0tb6UbIK1WE4qrAhfiCC+gGGvP7D10Wne91RuOf2/n0UMw7ka0Zgt60vMu2jI4CEm5WSYQJMVRAOx7ONnfZxG0n77BA38p5phpb77iW220la1qOkpSO5JPsK0ua4BjGamJLvEd1m4WxSnIFyiPKYlw1EaJbdTogEdik7SR2IIqhXslzfM3n7RkKJ+XcVQJbjUvIbZDDcm7dBB8Fxhs7fjIUFJccYTp0jpCOnq3a750CTXic8D9xlxNQtA0RKtMtf415YqcoDXOk1E6Q04rj2A6HIrTiSlOQSEK2HVA9zEQQCkEadVpX0JHXYzmgOkAa9Na7arEx6/wCO5LZIt3xS4xJtrdR0sORVAtgJ7dOh9JTrRSdEEaIGqyHVa2SdAdzXRs7ABenMnPrLT7zVaI8kywAy6zVe3bjSJaJ8vK8DvRxW4PqVImpSnxLdMV6lciOSBv7rQUL7fVrtVaw81veX3S05jydiFxiYna3FP2mVbUGVAmSApSUT32wPHQ306LQUgoT1FalHyETSW7/tsuCosWQv+wFvfKJDrRKRf5CDotJV7xEKBCiOzqhrfQFBVg9LbLSWWm0obQkIShI0AkDQAHsKQ4HfmcMyb7OOstOUpmuArs6N3Yk+p9x56/HxqrRkdiyeAi7Y5eYV0hufS/EfS6g/1STWrzDKrbiFmdvFy8RzzJZjx2UlT0p9Z02y2kd1KUrQH29ToAmtNmWC4HFal5g9IVismM2p6TeLY+Ia+hPcl0jyOjt6OJUPyqvMdtXL9zuEPkm6ohZAxGQ6myWy5OGDLZYV2EpfQgtfMuIJGihPSkhIKSpdXjHjQ5Q7s3HMVAGpGPIVmaaqFrGO350408p4fZTvC8ZuVucm5RlbyH8kvXSZXhuKWzDZTvw4rO9aQjZ2rQK1FSj6gCQPK/Oog9yxarcfCy/H7/jroBKlSoCnmAB6qL7HW0lPp3UpPr6etazIeS7deW4tg40v9pu15u4WGH2JKXmYTCdByU4U7Gk9SQlP4lKSOw2R0IEezwWXWuqMv1EngZEknh6AUheyLEdMj7fUSC9OVS387vz3HdrdWi1QwheRy0K1tCu6YKCDsLWO6z+FsgeqwRLUtMxGG40VpDLLSAhttsdKUJA0AAPQAVjY/YIGMWdm0W9TriG9qcfeX1uyHVHa3XFfiWpRJJ/OvC83WBZbdJu10lIjRIjannnVnshIGya6VmhGGDGineOOgAy5DM5mZ0lVixLxDGYD3Ovn/C0ua5LHxq2JfEVMy4THBFt0IaC5UlQPSgb9B2JUr0SkKJ7CtBYOPrRAti/7R263XW7T3TLuMlcZJS4+QBpAVvpQkAJSn7JG9nZPnjVruF7u6uQsmjOMyXmizaIDoINvik72pOyA+4NFZ9QAlHsSZK6vdWbPBFqd38VtP0gjLUg5nLRvElaRIndDu2GuZ+nIe55BRG94txvbLfJut4xewsw4jSnX3XYTfShAGzvtUZw7j6wS5MjMrriNtiKnoSi328w0JEOIDtPWkDXir7KUfbyp/Ds7NXTyNfOtQ68ZscohIIIFwntK9dEeZlpQ7a7KcH2R3l7hJqSDY4FqiiL3bbjcN0VOBdhgKhupm7RSd7EhtuXjM41NBpjjr5DVasw7HZIzslEODBjsoLjqw2ltCEjuST6AD1qLY+xKyi6jNbtHUxFZ6kWOIsdJbaI0qSsb11uA+XY2lHbsVKr8urjefXp2wpUP7P2d9P7yXsgTZSfMIwPoW0HRc+50n/MK2F0zjErQ98rMv0QSD6RmVeK8f0bRtR/oKth8GI4PeQ2E00wAc4Z5UacNXTP6QrLGOaJATcfYfz8c1srhNjQIj06a+lmPHbU664s6ShCRsk/oKiGPRZGQ3L+3V3Yca6kKatER1OjGjk93VAgFLrmgSPZOh99xy6ZRd+RLo1Gx3EbjMsFsfKpSpahDblS0K8rSw4Ovw0KG1aSdqAB+kg72aznkllcm8ZJabBDZBW98k0XlpbAPUfGd0lOh7+H7UbbWWyJfY1zobMJCQcR+q866LrayxrN2QVyHBMNsiQCfbhITqf4WzyvKbPiFokXq9ykssMJ2E7HW4r2Qge6iewqC45eL5KEi9QMefud3uRBkSpXVEiRWgfIw2paetaUgk9SEkKPUdjYFezEcSt94vbedzI0l1lhKk2gz31vvLST3kq6yejrA8iE6AT30CdDeZrl8XFbcHlrYVLf2iKy88GkqI9VqUfpQkd1H+g2SAdHxYtpb+OtD+7hNndAqT/VMyEzg2TSZGYq6l2GxrPy2iZPUvvX4UUyaNkt6loxeXkSHJ8t
4cVrsfHlj/8Az0FqhE+KXXFWYe12T8RHKvyuRTmF6cUUTGY9i7nzSozz4A/3gEIB/wCI/wBazWo2P3VPiXbJk3gK7dC5KUsH8vDRpKh/vbrqhf7L34s3lKCuUOPWka2nw3JJJ/I7jdh+frWwtH7JjlOQpL+T5lhLzySnuVS5SfTuelaEpB/QCtDaYU6mfP8AhSjbMKcnG9zn8CnsuSJjHHjSwhaLYy+kBKRFIQ/0+gA8PzarGKbko+HjEzIgFb80laEsoPp5i+hTnt+EH/rXdLX7I+4vukyuU8XiJT9KmsTLy/Xv6yEa/Xv+lSawfsieN2XWl5Zy7lU5tCR1tWplq3JUv3IP8Qgb9jv9a0daYWXsJLD9uQQaD/iCPenwvnpIsuWOwtXXkIRxva/CjIbSU/YrBCv6gprAjzWMeKnbbeLDIedPQXW7e4txw+oC3vFOz29VH7V9bLL+zO+Eu2ttou2FTsgLaAkLudxcUoke5LfQdn1Pt+VWNi3wd/C/hrrMix8H4oHo+vBdlwUy1tkEEFKnuopPYdx3qJ1qZ+ke6hibeYDehsM9S4k+818ccbicq5qCzYLU0+46QhtFstz9xkFZPp4bRI369gVVPMT+DP4p86eCo/HuRSl9XUZF+iItcdA6tfypKx6dztLalV9q7darZaIjUC026NCisJ6WmY7SW0IH2SlIAFZNROtLjgFWjbetMUSHuSfag9l84uPf2WnJD623+VOdY8SPrTlux+3pcChs9hIcS2UkDXcI9Sft3vjC/wBmr8KuKPmbdsTuOWzekITKyC4uPrSO++yOhJ3seoOtDWu++pqVC6I92JXOjW602gSiPJHNaTFMJw7BLWmx4TitosFuSsuCJbITcVkKPqrobAGzrua3dKVoqiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoiUpSiJSlKIlKUoi//2Q==)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "This function has an obvious minimum at $f(0, 0) = 3$.\n", - "\n", - "Let's minimize this function using our conjugate gradient, and output the minimum and the gradient evaluation logs:" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "-- Create a stop condition that respects a given error tolerance.\n", - "stopCondition f tolerance = StopWhen stop\n", - " where stop (Arg a b) (Arg x y) = do\n", - " Arg dx dy <- grad f (Arg x y)\n", - " return $ abs dx < tolerance && abs dy < tolerance\n", - "\n", - "-- A demo function with minimum at (0, 0)\n", - "function x y = x^2 + y^2 + 3\n", - "\n", - "-- Note that we don't need to set an alpha!\n", - "let tolerance = 1e-2\n", - "let initValue = Arg 3 2\n", - "let writer = conjugateGradient function (stopCondition function tolerance) initValue\n", - " (minLoc, messages) = runWriter writer :: (Params (Double -> Double -> Double), [String])\n", - "\n", - "printf \"Min at x = %.5f, y = %.5f\\n\" (x minLoc) (y minLoc)\n", - "mapM_ putStrLn (take 10 messages)\n", - "printf \"... and so on ... (%d evaluations)\\n\" $ length messages" - ], - "language": "python", - "metadata": {}, - "outputs": [ - { - "metadata": {}, - "output_type": "display_data", - "text": [ - "Min at x = 0.00078, y = 0.00054" - ] - }, - { - "metadata": {}, - "output_type": "display_data", - "text": [ - "Gradient at\t3.00000, \t2.00000 \tis\t5.99990, \t3.99990 \n", - "Gradient at\t3.00000, \t2.00000 \tis\t5.99990, \t3.99990 \n", - "Gradient at\t0.00005, \t0.00005 \tis\t-0.00000, \t0.00000 \n", - "Gradient at\t1.50002, \t1.00002 \tis\t2.99995, \t1.99995 \n", - "Gradient at\t1.50002, \t1.00002 \tis\t2.99995, \t1.99995 \n", - "Gradient at\t0.00005, \t0.00005 \tis\t-0.00000, \t0.00000 \n", - "Gradient at\t0.75004, \t0.50004 \tis\t1.49997, \t0.99998 \n", - "Gradient at\t0.75004, \t0.50004 \tis\t1.49997, \t0.99998 \n", - "Gradient at\t0.00005, \t0.00005 \tis\t-0.00000, \t0.00000 \n", - "Gradient at\t0.37504, \t0.25004 \tis\t0.74999, \t0.49999" - ] - }, - { - "metadata": {}, - "output_type": "display_data", - "text": [ - "... and so on ... (39 evaluations)" - ] - } - ], - "prompt_number": 8 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "That concludes the **Conjugate Gradient** notebook. 
We've derived the nonlinear conjugate gradient algorithm as well as a few simple line searches for it, and then implemented it in our previously-discussed Gradient Descent typeclass heirarchy." - ] + "output_type": "display_data" } ], - "metadata": {} + "source": [ + "-- Create a stop condition that respects a given error tolerance.\n", + "stopCondition f tolerance = StopWhen stop\n", + " where stop (Arg a b) (Arg x y) = do\n", + " Arg dx dy <- grad f (Arg x y)\n", + " return $ abs dx < tolerance && abs dy < tolerance\n", + "\n", + "-- A demo function with minimum at (0, 0)\n", + "function x y = x^2 + y^2 + 3\n", + "\n", + "-- Note that we don't need to set an alpha!\n", + "let tolerance = 1e-2\n", + "let initValue = Arg 3 2\n", + "let writer = conjugateGradient function (stopCondition function tolerance) initValue\n", + " (minLoc, messages) = runWriter writer :: (Params (Double -> Double -> Double), [String])\n", + "\n", + "printf \"Min at x = %.5f, y = %.5f\\n\" (x minLoc) (y minLoc)\n", + "mapM_ putStrLn (take 10 messages)\n", + "printf \"... and so on ... (%d evaluations)\\n\" $ length messages" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "That concludes the **Conjugate Gradient** notebook. We've derived the nonlinear conjugate gradient algorithm as well as a few simple line searches for it, and then implemented it in our previously-discussed Gradient Descent typeclass heirarchy." + ] } - ] -} \ No newline at end of file + ], + "metadata": { + "kernelspec": { + "display_name": "Haskell", + "language": "haskell", + "name": "haskell" + }, + "language_info": { + "name": "haskell", + "version": "7.8.3" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/notebooks/Diagrams.ipynb b/notebooks/Diagrams.ipynb deleted file mode 100644 index f5d913e4..00000000 --- a/notebooks/Diagrams.ipynb +++ /dev/null @@ -1,92 +0,0 @@ -{ - "metadata": { - "language": "haskell", - "name": "", - "signature": "sha256:d3903a8bf1d0cc71c024d88777d7a39da001586a55681e8e24409f5c12e0b13f" - }, - "nbformat": 3, - "nbformat_minor": 0, - "worksheets": [ - { - "cells": [ - { - "cell_type": "code", - "collapsed": false, - "input": [ - ":ext NoMonomorphismRestriction\n", - "import Diagrams.Prelude\n", - "import Data.Colour.SRGB\n", - "import Data.List.Split (chunksOf)" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 1 - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "-- Generate a board of size `size`\n", - "board :: Int -> ((Int, Int) -> Diagram b R2) -> Diagram b R2\n", - "board size sq = vcat rows\n", - " where rows = map hcat $ chunksOf size diagrams\n", - " diagrams = map (sq . tuple) indices\n", - " indices = sequence [[0..size-1], [0..size-1]]\n", - " tuple [a, b] = (a, b)\n", - "\n", - "-- Change visibility\n", - "visible 0 diagram = diagram # opacity 0\n", - "visible _ diagram = diagram\n", - "\n", - "-- Board square\n", - "tile board (row, col) = visible value diagram\n", - " where diagram = roundedRect size size 0.05 # fc color\n", - " # pad padding\n", - " padding = 1.15\n", - " size = 1 / padding\n", - " color = if value == 0\n", - " then white\n", - " else sRGB c c (c/3)\n", - " c = (20.0 - value) / 20.0\n", - " value = fromIntegral $ board !! row !! col \n", - " \n", - "-- Labels\n", - "labels board (row, col) = visible value $ number `atop` sq\n", - " where number = text (show $ 2 ^ value) # fontSize 0.5\n", - " value = board !! row !! 
col\n", - " sq = square 1 # opacity 0\n", - " \n", - "values = [[1, 2, 0], [0, 5, 6], [7, 0, 9]]\n", - "\n", - "diagram $ board 3 (labels values) \n", - " `atop` board 3 (const $ square 1)\n", - " `atop` board 3 (tile values)" - ], - "language": "python", - "metadata": {}, - "outputs": [ - { - "html": [ - "" - ], - "metadata": {}, - "output_type": "display_data", - "png": "iVBORw0KGgoAAAANSUhEUgAAASwAAAEsCAYAAAB5fY51AAAABmJLR0QA/wD/AP+gvaeTAAAgAElEQVR4nO3dd3gU1d4H8O9s300P6Y0QQiD0IiBKiXJRutgLKOALoqjXrvjqVdTL1cd2VVCviq+oqFwFlS4gEIqCSDeEEhJSSM+mbrZmd94/AgmTmd1sS3Yn+X2eZ5+wszO7h5zMd2fOnHOGAZB56UF8b96ln6t8WAbSat6ln6t8WAbSKgMAlvq2DOQKmaAvD3+SCaoPf7JU4usSEEKIsyiwCCGiQYFFCBENCixCiGg4E1iJQUHylRqNVCuTMWYALD2cf8jlElNgoKw8KEj+KYA4J37f7aH68LA+AgKk5YGBXqsP0olk7byeERgoW79oUYr63nt7ymNjVVCrpZ1SsK5Cr7cqior0UWvWFM37+OPzdzY2WqcD2Ofm22UEBEjXz5+frL7rrgR5TIwKKhXVhyv0equiuNgQtW5d8bzPP8/ztD5IJ2PQ3K1hqcBriWq19PSPP14TMHZsRKcWqqvas6cSt99+QGcwWNMBXBRYJfPSzwyB1xLVaunp1atHBYwZ06Ojitit7N9fhblz/3S3Pkjns9+tIShI/tJDD6UqKKy8Z8KESDzwQIoiMFD+gqvbBgXJX1qwoJeCwsp7xo6NwNy5yYrAQKnL9UF8w0EbFjtjzpwkeecVpXu4554khUTCznR9S3bGHXckUH142R13JCgYhnGjPogv2A0sg8Eanpio7syydAu9egVAr7dGurqdwWANT0jQdESRurXkZA0MBtfrg/iG3cCy2SCRSpnOLEu3IJUyYFnXu5PYbJBIJFQf3uZufRDfoIoihIgGBRYhRDQosAghokGBRQgRDQosQohoUGARQkSDAosQIhoUWIQQ0aDAIoSIBgUWIUQ0KLAIIaJBgUUIEQ0KLEKIaFBgEUJEgwKLECIa7d2Ewu9ZLDYYjTYEBsrA0HRRfkevt6KpiQUAMAwQFCT6PzniQ6L66zGZbNi5sxzr1hUjK6sOZWVG1NSYwbKATMYgNFSB9PQgjB8fiSlTYjBkSKivi9yt5eToMHnyPhgM1pZlJSXTfVgiInaiCaxPPsnDq69mo77eIvh6UxOLqioT9u0zYd++KixbdhqTJ8fg1VcHID09uJNLS8xmGxYvPsoJK0I85fdtWNXVZtxyy+94+ukTdsPKnl9+KcP11+/Brl0VHVQ6Ys+yZadx6lS9r4tBuhi/DiyrlcV99x3Cjh3ldtdhmObTQXt0uibcdtsBZGXVdUQRiYBduyqwcuUFXxeDdEF+fUq4bNlp7NlTyVt+zTU9cN99yZgwIRLR0UoAQF5eI06frsdHH+XiwAEtZ32LxYYnnjiB7dvHU8N8B6usNOHxx0+AZX1dEtIV+e0RVlmZEe++e46zTCpl8K9/DcK2beMxe3YSEhLUkMslkMsl6Ns3CLNmxWPbtvFYvnwY7xbuBw9q8e23hZ35X+h2WBZ4/PHjqKoy+boopIvy28D673+LYLVyv6affrovHn001eF2DAPMm5eMF15I57324otZMBqpEbijfPZZHnbvbj0idnSqTog7/DawvvmGezTUs6cGS5b0c3r7Rx9NxdCh3G4NVVUmZGdTQ3BHyMqqw7JlZ1qeh4crMHduMmcdOh0nnvLLwMrPb26PutLNN8e79I0tlTKYNSuetzwriwLL2wwGKxYvPgaLxday7J13Bre0LxLiLX4ZWAUFet6y6dPjXH6f9PQg3jK6Wuh9//jHKZw/r2t5Pnt2Em68McaHJSJdlV8GVnGxgbcsJSXA5fdJTuZvc/Ei/72J+zZtKuVczEhJCcArrwzwYYlIV+aXgVVSwg0VuVyCiAjXTy8uXGjkLevbl3/URdxTUmLAM8+cbHkulzNYsWIYNBqpg60IcZ9f9sO6+uoeeP31QS3PAwLcG9h89GgNb9ngwSGeFI1cYrWyeOSRY6irax198OSTabwLHYR4k18G1tixERg7NsKj92hoaMJ33xXxltOAaO/44IPzOHiwuuX56NHh7XY5IcRTfnlK6A3PPHMCRUXcxvsBA4LRq5frbWGE68iRGvz7362deoODZVi+fBgkEuq3QDpWlwysd945x+vHJZdL8MknI6gvkIfq65uwePGxljmuAGDZskFISFD7sFSku/DLU0J3NTQ04bHHjuGHHy7yXnvmmb50OugFS5b8xTlynTUrHrfeyu/vRkhH6DKBtWlTKZ566gTvCiPQ3Cb29NNpPihV1/L99xfx88/FLc/j49V4442BPiwR6W5EH1gFBXosWXISmzaVCr5+//298PbbgyGXd8mz306Tn9+IF17IankukTD44IOhCA6W+7BUpLsRbWDp9Va89dZZLF+eA5PJxns9JESO118fhHvv7emD0nUtFguLxYuPobGxqWXZQw+lYMyYHj4sFemORBdYLAusWVOIl18+hdJSo+A6d92ViGXLBiEqisayecObb57B8eO1Lc8HDw7Bs886PxCdEG8RVWCVlRmxcOFhZGbyJ/UDgH79gvDuu0MxbpxnfbhIq/37q/DRR7ktz9VqKVasGAa5nC63ks4nmsDavr0cixYdEZwcLipKiRdeSMfcucmQSmlH8qaPPsrlzB569dXhOHKkBkeO8EcRtHXyJHegOcs2z3PW1tSpsXT7L+IUv/8rsVhsWLo0G8uX5/Cm3VUqJfj73/vgySfTEBjo9/8VUbK1aR7cvbuSM0mfq5544gRv2ahR4RRYxCl+/1fy7LMnBW9ocMMN0Xj77SHUc52QbsSvA2vduou8sFIqJXj99UFYuDDFR6UihPiK3wZWXl4jHnnkGGeZQiHBjz9eg/HjI31Uqu5n9uwkTJjg3kWMo0drsXkzt3/cP/7Bn2s/PFzh1vuT7sdvA+v55/+CTtfa74dhgE8/HUFh1clmzIh1e9tvvinkBBbDAA891NsbxSLdlF92/66pMfNunvr442m49dYEH5WIEOIP/DKwfv65hHNDA6VSgkceobmWCOnu/PKUcO1a7mwLGRlRqK+3oL7eYmcL54WGyt2abpkQ4nt+GVhtpzbetq0M27aVeeW9H344FW+8Maj9FQkhfsfvTgkrK02cxnZCCLnM7wIrL49/pxtCCAH8MLByc3Xtr0QI6Zb8rg3rnnuScM89Sb4uBvGC2bOTMHs21SXxHr8
7wiKEEHsosAghokGBRQgRDQosQohoUGARQkSDAosQIhoUWIQQ0aDAIoSIBgUWIUQ0KLAIIaJBgUUIEQ0KLEKIaFBgEUJEgwKLECIaFFiEENGgwCKEiAYFFiFENCiwCCGiQYFFCBENCixCiGhQYBFCRIMCixAiGhRYhBDRsBtYEglYq5XtzLJ0C01NLBgGNle3YxiwNhvVh7e5Wx/EN+wGllot1RYU6DuzLN1CQUEjNBpppavbaTRSbWEh1Ye3FRbqoVa7Xh/ENxycEjIb16wpNHdeUbqHr78uNNtszAbXt2Q2rl17kerDy777rsjMsqwb9UF8wW5gNTRYlq5YkWveubOiM8vTpe3cWYGVK/NMOp3lNVe3bWiwLP3sswvmPXvoYMBb9uypxFdfFZh0OqvL9UF8Q+bgtWKj0Trj7rsPbnz00T7Ku+5KlEdGKjutYF1JVZUJq1blW1auzDMYDNbpAErceJtio9E64/77D29ctChFeeut8fKICKoPd2i1ZnzzTaHlq6/yDUaj2/VBfIABsPTSw574oCD5KwzDTjGb2bBOKVUXo1JJtTYbu6m+3rIMwEUHq2Ze+pnhYJ34oCD5KwA7xWy2UX24QaWSaK1WZpNO55X6IJ1nqaMjrMuKGxosCzq8KF2Y0Wj15ttRfXjIZKKLgmJF/bAIIaJBgUUIEQ0KLEKIaFBgEUJEgwGQf+lBfG8EmuvksK8LQgBQffibZDrC8i+MrwtAOKg+/NBSXxeAtMhEa98f4nuZoPrwJ0vpCIsQIhoUWIQQ0aDAIoSIBgUWIUQ0KLDEJ1Gjka5UqSRaqZQxA2Dp4fxDJmNMGo20XKORfgogzq0a4KL68LA+1Grn68OZwc/Ef2RoNNL1t9ySoJ4yJUYeEaGEUknfOa4wGq2K8nJT1I4d5fPWrr1456XpZfa5+XYZKpV0/fTpceq//S1K3qOHAgoF1YcrjEarorLSHJWZWTFvw4bSduvDmellSOfJvPQzQ+C1RKVScvrNNwcHDBkS2nkl6sKOHq3B88//pTOZbOkQnmYm89LPDIHXEpVKyemlS/sHDBwY0lFF7FZOnqzDq69mO6oP6tYgFhqN9KXbbktQUFh5z/DhYZg1K16h0UhfcHVbjUb60syZcQoKK+8ZPDgE06bFOqwPCiyRYBjMmDw5Ru7rcnQ1kyfHKBgGM93YdMbEiVFUH152/fVRDuuDAkskjEZbeHS0ytfF6HLi4tQwGm2Rrm5nMtnCacpw74uNVTmsDwoskWBZSCQSGtrmbRJJ8+/W1e2oPjqGRMI4rA8KLEKIaFBgEUJEgwKLECIaFFiEENGgwCKEiAYFFiFENCiwCCGiQYFFCBENCixCiGhQYBFCRIMCixAiGhRYhBDRoMAihIgGBRYhRDQosAghokE3oSB+oamJhclkhUYjA0PTTAlqaGiCXM5ApZJ2+mcbjVZYrc3/ZhhAo+n8MgAUWOSSnBwdfv21HGfONECrNUGrNYNlgZgYFWJiVIiNbf45cWIUIiI8m2nTYrHh0KFq7N5didxcHbRaMxoaLGBZQCplEBQkQ3JyAIYNC8U110SgT59AL/0vxYNlgT17KvHnn9UoKTGipMQAvb45MWQyBhERSgwYEIzhw0Nx7bURkEo7LuWLigx44onjMJlsLcs2bry2wz7PEQqsbu7cuQa88cYZ5OU1Cr6en9+I/PzW11auvICZM+Nwzz1J6NFD4fLn/fRTMVauvIDGxibB161WFrW1Fhw/Xovjx2vxxRf5GDOmBxYtSkFycoDLnydGf/xRja+/LkBBgV7w9aYmFmVlRpSVGbFzZwVWry7EAw+k4KqrwrxeFovFhrffPssJK1+iNqxubO3ai1i8+KjdsBJisdiwbt1F3H33QaxfX+L0dvX1Fjz77Em8/36O3bCy58ABLRYvPorDh2tc2k6Mli8/j3/+87TdsBJSWmrEa6+dRmZmpdfL8+WXBS79fXQ0CqxuavfuCqxYcR5NTaxb25vNNnzwQQ6ysuraXddmY7F0aTYOHaq2uw7DwOFpjV5vxZIlJ5Gbq3OrvGLw1VcF2L69XPC18HAF0tICERoqfKMem43F++/noKzM6LXyHDlSgw0bnP9S6gx0StgNlZQY8Pbb53jLw8IUGDcuAmlpQUhLC0RSkgbV1WYUFuqxaVMp9u+v4qxvtbJ45ZVsrFx5FUJC7N/x6osv8nH0KP/oaPDgEEydGovhw8MQHt58ellcbEB+fiPWrr2Iv/7ihmFTE4t//zsHy5cP63IN85s2leKHH7j3DmUYYMqUWMycGYv4eHXL8upqM44dq8Unn+TBYLC2LG9qYrFqVT6WLOnncXlqay34979zwLr3fdZhKLC6oW+/LeSdlsXFqfH224MRF6fmLY+LU+Pqq3sgO7seS5b8hfp6S8vrlZUmrFqVj8ce6yP4WVqtGd98U8hZJpEwePDBFNxxRyJv/Z49NejZU4Px4yOxeXMpPvggB2Zza/tJVlYdtm0rw+TJMS7/v/1VXZ0Fq1blc5YpFBIsWdIXI0eG89YPD1dg4sQo9OkTiNdeO805qvrtNy1On65Henqw2+VhWeC993JQV2dpf+VORqeE3YzRaMWuXRWcZcnJAVi+fBgvrNrq3z8YS5f25526nTnTYHebHTvKYbNxv6bnzEkSDKsrMQwwfXos5s9P5r32n//kckJM7NavL+E1aj/6aKpgWF0pKUmDZ5/ty1v+668VAms7b8OGEhw50npE3JFXIF1FgdXNZGZWtlwev+yxx/o4fcVv+PAwjBsXwVl24UIjL5Qu++WXMs7z2FgV5s5Ndrq8d96ZiLS0IM6y2loLLlzwn4ZgTzQ2NmHz5lLOspEjw5GR4dy9Xfv0CcTQoaGcZZ78bvLyGjlHe8HBckydyj2a9eXpOAVWN3PqVD3neXS0ivcH356+fbkBYjRaUVxs4K1XUmLgdIkAgIyMKJe+sSUSBhMm8Hfe8+e7RuP71q1lvC+QadNcO91t252hsFDvVtuTyWTDW2+d5VyI+fvfU1vaF/0BtWF1M5WVJs7z9PQgl78xhU4dq6vNSEzUcJYJXbG69toern0YgORkDW+ZP11q98TRo7Wc51FRSgwf7lp/qhtuiMagQSGcZSzLgnGxYj/9NA8XL7Z+8dx4YzRGjw5HUZHzXSw6GgVWN9M2sKKjVS6/R04O/+jmyqtY9j7L3nrtEQrIigrvXb73laYmFmfPctv/pkyJcfkLRK2WIiXFs061v/2m5XSpiI9XY8GCXh69Z0egwOpmJk2KxjXXtB7ljB7tuGFXSNvuBiqVFD168IfrtA0smYxBaKjrpxclJfzTzZ49xd/rPTdXx7t4MHZshJ21O05VlQkrVpxveS6TMXjqqTSfjFlsDwVWN3PXXY6vzrUnP78Rp09z28FGjQoXPCoYNCgEDz+c2vJcpZK41WArdBUyNVX84wvbtifKZAyiovjBX1FhQkWFEdXVFgQESJGUpEFkpGfjOS+z2Vi888456HSt3VzuvjvJb8dvUmARp506VY8lS05yjgrkcgkWLUoRXH/IkFAMGeJag35bjY1Ngr
2//XWHckXbII6MVEIiaU50g8GK9etLsG9fFQoL+W1IGk1zcF1/fRQmT3b9NPKy77+/iKys1uAcMCAYt9+e4N6bdQIKLOKQyWRDXp4Oe/dW4aefimE0tl7RkkgYLFjQy612KWctX34e5eXc9qqUlIB2+4yJQXW1mfP8cnvisWO1WL78vGAb4GV6vRVnzjTgzJkGbNtWjkWLUpCeHmR3fSFnzjRgzZqilucBAVI8+WSaX48ioMAiHMeO1eL55/+CWi2F0WjlDP24UnS0Cv/7v/08PoJy5JtvCnn9uGQyBs8/n+7XO5WzrjwNA4DoaCX27q3C22+fdalbQm6uDs89dxJTpsTiwQdTnPrdNDZa8dZbZ2G1tn7QokW9BU9J/QkFFuExGq2cI6m25HIJnnwyrcPCqrGxCe++ew47d/J7bM+Z07NLnA4C/MCqqjLjvff44/fCwxVITtZALpegoECP8nIjbx2WBbZsKYVMxmDhwvav7n30US4qKlqP4MaPj8R11znXWdWXKLCIyywWG5577iQGDAjGI4+kejRura39+6vw3ns5qKrinw4NGRKKOXN6eu2zfK3teM4rh8MAwPjxEZg/P5k3YaLRaMVff9XhvffOc8Z1As3DaoYPD8WIEfb7cu3cWYG9e1unoomMVGLxYuF2SH9DPd0Jh1IpQXJyAHr1CkBYmMLh6cWpU/V48skTOHGi1v5KTiotNeLFF7Pw4otZgmE1c2Yc3nlnCGSyLnAuiObQsTe1j0IhwWuvDcAzz/QVnN1VpZJi5MhwvPPOYCQm8tvyvvqqwO4pZWmpEf/5T17Lc4YBnnyyDwICxHHsIo5Skk7Tv38wVq0a2fK8qYlFdbUZJ07UYv36Et78VwaDFc899xc++2wEr6e7M4xGK1avLsR//1sEi4U/oDkwUIbFi3tj6tRY1/8zfqztcJwrzZ3b06nhUjExKixdOgALFx7hjOXMy2vEhQuNvM6kTU0s3nrrLOd0/9ZbEzBwILeXvD+jIyzi0OW+QZMmRWPFimH4+OPhvG9jo9GKr78ucOl9WRbYvr0cc+YcwurVBYJhNWlSNL7+elSXCyuguR1QyMCBwZgxI87p94mKUgp2NhWaf+ybbwo5oxRSUwMxe3aS05/lDyiwiEvS04Px8sv9W/oLXbZzZ4XTs11qtWY89dQJ/OtfpwVP/5KTA/Dee0PxwgvpCAvzn4G33qRWC/ciHzcu0uUroDNn8gO97eDwEydqsW5d6wSBSqUETz2VJrpTbDolJC4bNSocN9wQzelyYLWyyM6uR0yM47GJf/xRjddfP43aWv7kcGFhCsyfn4zp02N5gdjVyGQMFAoJb2iOUJtUe3r1CgDDgNNuVVfHbdD/8cdizusDB4bg7NkG3lhGIW3Dj2UheAV3zJgeHX77Lwos4pa0tCBeHymhMX+XNTWx+OyzPHz/fRGvQVgul+CuuxJx991JPrvfnS8EBMhgNnM7jyYlud4OqFBIEBGh5HQ0bWjgfiHY2pxxHzlSw7sq6Yr33svhLUtPD4JG07EdeimwupGPPspFTk7rN2p0tMrt+b979+YPPi4vt98ze/nyHMG77IweHY7HHuvTJXquuyohQY2aGm5guTvgWKXitu501VNpCqxupLGxCceOtXZBkMslePbZvm6dfgntEEqlcJPorl0VvLCSyyV4+OHemDUr3uXP7ipSUgJ4M18UFxtcniqGZYGyMu6XRc+erh+piQEFVjfS9nTDYrGhtNTo1lhAodttCYVYcbEBb799lrNMLpfgzTcHY9iwjhvWIwa9e/N77BcV6V0OrMpKE+8qa9vAuvHGaAwf7t7v+8yZBvz+u5az7P77k3nrBQfbv3OSt1BgdSNC/aROnqxzK7CEJvETOk386KNcTp8jhgGef75ftw8roHn6nbaN5YcP1whOCe2IUFtU2/nCPJlna9u2ck5gMQxw882+OTKmbg3diNBUwxs3un6jzKoqE7Zs4d44QaGQ8Do71tdb8Mcf3G/mu+9OwvXXR7n8mV1RRISCN4Rmz55KnDvn/Hz1FosN33/PvZ/h5bGHXREFVjcSF6fGmDHcOdWzs+s5/XPaY7OxeO01freECRMieQ3Ge/ZUcoafyOUSv55ryRduvDGa85xlm+dWF+pIK2TDhlJeX7Z585KhUHTNXZtOCbuZBx/sjUOHqjnTiixffh6lpUYsXtzbYQN8Xl4jVqw4zxs7KJdLcP/9/BkC2t7/cMSIMOh0TbxZCtwRFCS3e9t2MRk9ugcGDQrhNL6fPduAp58+iWef7Wv3dJ1lm8cMrl3L/bJJTw92+hZhYkSB1c307KnB9OmxvKt2a9deRFZWHYYMCUVKSgBSUwMRGqpAcbEBRUV6/PVXHbZv598UFQAWLuyF2Fh+h9G2nRIPHtTi4EEtbz133HZbAh55JLX9Ff1c8+DjNDz66DFOkOflNeLxx09g2rQYpKYGtgxGLyhoRG5uIw4c0OLkyTreezk7H5ZYUWB1Q/PmJeO337S8U4nLM1i64uab4wXv4lxTY3Y4wJe0iohQ4IUX+uGf/zzDmXLGaLRi3bpip99n/vxkj++e4++65okucSgsTIEPPxyGhAT3O2vGxanx+uuD8NhjfQRfF7qxKrFv4MAQvPXWoHaHNgmRShk8/HBvn12560wUWN1UdLQKn3wyAnPnJrs0HEalkuJ//qcXvvxyJK8B/0oUWK5LTNTgww+HYfbsJKd7vI8cGY4VK4Zh8mTX7hYtVnRK2I0FBMgwf34ybr89AUeP1uDo0VqcP69Dfb0F9fUWKJVSJCaqkZCgQUKCGgkJavTtG+TUsI8bb4zBjTd2j53ImxSK5nGVN90UhyNHanDoUA3Ky42oqTGDYRiEhMgRHi7HoEEhGDkyvFPmYL/xxmje1UxfocAiCAyUYfz4SIwf33WvLomNWi3F2LERPrmxqj+jU0JCiGhQYBFCRIMCixAiGhRYhBDRoMAihIgGBRYhRDQosAghokGBRQgRDQosQohoUGARQkSDAosQIhoUWIQQ0aDAIoSIBgUWIUQ0KLAIIaJBgUUIEQ0KLEKIaFBgEUJEgwKLECIaFFiEENGgwCKEiAYFFiFENCiwRIJhwNpsrK+L0eVYrSwYBjZXt6P66Bjt1QcFlkioVBJtaanR18XocsrKjFCpJJWubqdUSrTl5aaOKFK3Vl5uhFJpvz4osESCZbFx+/Zys6/L0dVs2VJmZllscGPTjbt3V1B9eNmOHRVmwH59UGCJhF5vXbp2bZH5zz+rfV2ULuPPP6uxfn2xSa+3vubqtnq9den69SXmY8dqO6Jo3dKxY7XYsqXUYX3QrerFo9hkss148cWsjXfemaicNClaHhqq8HWZRKm21ozNm0stP/9cYjCZbNMBlLjxNsVms23GsmWnN958c7wyIyNSHhIi93ZRu4W6Ogu2by+3bNlSZjCbHdcHA2DppQfxvcxLPzMcrBOv0UhfYRhMsVjYsI4vUtejUEi0Nhu7Sa+3LgNw0cGqmZd+ZjhYh+rDQ3I5o2VZOFMfS+kIS3yK9XrrAl8XQszMZpcvCjpC9eEhs
wstgdSGRQgRDQosQohoUGARQkSDAosQIhoMgPxLD+J7I9BcJ4d9XRACgOrD3yTTEZZ/YXxdAMJB9eGHlvq6AKRFJlr7/hDfywTVhz9ZSkdYhBDRoMAihIgGBRYhRDQosAghouFMYCWqVNKVCoVEK5UyZgAsPZx/yGSMSaWSlKtU0k8BxDnx+ybikqhSSVfK5RKtREL7h6sPqZQxKZWScqVS4tT+0d7g5wyVSrp+8uQY9YQJEfKwMAUUCjooc4XJZFNUVZmifvtNO2/r1tI7L01nss/X5SJekaFQSNZfe20P9ciR4fKQEBnkcto/XGE22xQ1NZaoo0dr5u3bV3Xnpell7O4fjqaXSVQoJKefe65vQHp6cMeUtps5daoeb711Vmc229IhPI1G5qWfGZ1WKOJI5qWfGQKvJcrlktMLFvQK6N07oPNK1IWdP6/D55/n6ywWu/uH/W4NGo30pSlTYhUUVt4zYEAwbrghWqFSSV/wdVmIZ1Qq6UvjxvVQUFh5T2pqIK65JkKhVErs7h92A4tlMWP8+AiaQtHLxo2LVEgkmOnrchBPsTNGjgyn/cPLRo4MVUgkjN39w25gmc228IgImoLX26KjlTCZbJG+LgfxjNnMhoeF0f7hbT16KBzuH46OsCQSCQ2l8jaJhAHLUncSsWNZVsLQ7uF1lzLH7v5BOw4hRDQosAghokGBRQgRDQosQohoULqoImUAABLxSURBVGARQkSDAosQIhoUWIQQ0aDAIoSIBgUWIUQ0KLAIIaJBgUUIEQ0KLEKIaFBgEUJEgwKLECIaFFiEENEQdWBZrSwMBitYtvM+jxBin9XKwmjsuH2yvbvmuIRlgddfP42GhiYAQHp6MO67r6dX3ttiseHkyTocPKhFYaEeNTUWNDY2gWUBqZRBQIAMCQlq9O8fjOHDQ5Gc7Nlc2zpdEw4c0KKkxIjSUgNKSozQak1Qq6WIjVUhJkaFmBg1hg4NBc3rTS4rLTXi4kWDx+8TGipHnz6BHr8PywKffpqHxsbmfTIlJRCzZnnnbnNNTSzOnm3A8eO1KC01or7e0nIAIZEw0GikiI5WITU1AP37ByM+Xu3xZ3o1sHJyGpCVVd/yPDzcO1PIbt9eju+/L4JebxV83WplUV9vQXa2BdnZ9Vi79iKGDQvF3XcnISHBtV8SywJ791bi228LW4L3Snq9Fbm5jcjNbQQA/PjjRYwbF4G77kpCaChN8d3dHTigxe+/az1+n/T0YK8EVkGBHjk5upbnoaHe2Sd/+02LrVvLYDQK75M2Gwudrgk6nQ65uTps21aO/v2DMXVqDGJiVG5/rlcDyxsVdSWdrgkffpiLEydqXd722LFanD7dgCee6INBg0Kc2qa83IhPPsnDmTMNTn9Oc8BV4dChGsyZk4Trr49yuayk66iqMvu6CBzHjtV49f30eiu+/bbQpX3ksuzseuTm6jB3bjLS0twLY6+1YdXXW3DggPcCy2Zj8f77OQ7DimGaTwftMRqtePPNsygs1Lf7eRaLDe++e85hRTiaw9totOLzzy8gK6uu3c8iXZdWa/J1EVrodE04ftx7f482G4uvvipodx9xdC8Ik8mGzz+/gNJSo1tl8MoRlslkw1tvnRU8hXLX2rXFOHWqnre8X78gZGREYcCA4JZTsLKy5naDX34pw9mz3F+m1cri//4vHy+/3N9h4Pzww0UUFXHbHqRSBlOmxGDw4FDExakQGipHdbUZpaVGnDhRh23byjgN8SwLfPhhLl5/fRCdHnZDViuLmhpLy3OplEFIiHt/B0FBnu2aZrMNn3+e39J25Q3bt5fj/Hkdb3mvXgEYPTocqamBLeWuqjKjvNyIffuqcOFCI2d9q5XFunXFePjh3g73SSEe/VbMZhuOHKnB1q1lLW063lBba8GGDSWcZRIJg3vuScTUqbG89ePj1YiPV2PUqHDs3l2BL78sgMVia3n93LkG7NtXifHjhe8edO5cAzZvLuUsCw9X4Kmn0tCrF7dBPSJCiYgIJQYNCsH48RF4//0czrdFXZ0Fq1bl4/HH+7j8/ybiVlNjhs3W+gXWq1cAHnwwpVPLYLHYcOpUPfburUJRUftnFs6qr7dg165KzjKJhMG0aTGYMIG/X0VHKxEd3byfHDpUjZ9+KkZTU+vvJj+/EYcP12DkyDCXyuF0YFksNuTlNaK01IiSkuarZtnZ9XYb3Tyxf38Vp+IB4Kab4gTD6koMA1x/fRQaG6347rtCzmvffluEMWN6QC7nnwXv31/Fuwx73309eWHVVlKSBo88koqXXjrFOdLKyqoDyzo+hSRdT9v2q6goZYd+XlMTi6IiPSorTaioaH7k5upgMtna39hFR4/W8vbJiROjBMPqSgwDjB4dDr3eyjso2Ly5FMOGhUImc35HcTqwCgv1eOWVbKff2BN793KTPDJSiVtuiXd6+2nTYnDwoJZzKFpfb0FRkQEpKfwQunCB+03Us6cGo0aFO/VZvXoFYMKESOzaVdGyTK+3oqTE4JXLuEQ8qqq47VeRkR0bWCUlBnz4YW6HfsZlhw9zG+/DwxWYNMn5C0wTJkTg+PFaFBe3NrvodE0oLTUiMdH5/cTvOo5WVJh4/ViuvjrcYeN6WxIJIxg4Qo3vNhvLW56S4toVjL59g3jL8vK8d4pMxKHtEVZHB1Zn0WrNKCvjNpIPGRLisHG9LYmEwZAh/Kv1paWu9Vnzu8CqrORfZRkxwrXzXACC/a+EAqu62sxp7wKab5ftCqEGUrmczge7m7ZHWB19SthZamr4XTUGDHCuq9CVoqP5/a9KSly7Wuj0KWF8vBovvdTf4TqvvZbtcZf86mr+L8edjmZRUfxthC45h4UpIJMxnAZBVxsrCwr46wtVDunarjzCkskYhIV5p5OmPdHRKixe3NvhOh9/nOvxPllba+Eti4hw/f8mdCAg9N6OOB1YKpUU/frxT32uxDAMWA9/O20DSyplEBTk+qXhigp+csfF8Y+6pFIGsbFqTkjl5OjQ1MQ61RjIssDhw9WcZRIJ41FvXiI+NhvLORKJjFR2+EUXpVIi2CZ7JW/sk3V13FC5PBTOVVot/2AkOtq1o1Cv9nT3hrS0QMyZ0zr+UKWSuFXxQm1I9sYXDh0aygms6mozPv/8AhYtav+S9MaNJbwuHePGRUCtlrpYYiJmNTUWzpXiK9uvjEYrcnJ0qKoyQ6dr7hfVPBZVhZgYpeCVa3+SnByAmTNbxx8qFO7tk0JnLkIHEY74XWClpwcjPT3Yo/cwGKzYt6+Stzw5WSO4/i23xOPgQS2n/WzPnkowDHD77QmCh/Ymkw0//ngRW7aUcZbL5RLcdluCR+Un4iPUfqXVmrFxYwmysxt4XQIuY5jm8Jo2LbbdMxhf6d07wOMB/kajlXelEYDLV9L9LrC84csv83lXbBITNYLtWkDzofWDD6bgzTfPcvqwZGZW4vfftRg1Khzx8WpERChRU2NGSYkBWVl1vM+QShk88ECKy432RPza/i2cP69DZmYlp21UCMs2z/CwcuUFDB0aiptuivO4l7s/+vnnEl57VWysyuV9
pcv9ZtavL8HevVWcZVIpg4ceSnF4GJueHoxXXx2ATz+9gNzc1uEHZrMN+/dX2d/wkogIJR54IAUDB3p2dEjEqe0RVn6+673Mjx+vxdmzDbjjjgSnB+yLwa5dFbyjK6mUwV13JXbu0Bx/YjA0Dz4WmjFi1qx4p+bHSkzU4Lnn+uKNN8641I+KYYCbb47DgAEUVt2VUIPyZVFRSqSnN499VSolqKgwoazMiIICPQwG7kgRg8GK1asLsXBhL6Smej69jC8ZjVasW1eMY8f4ExhMnBjlVsfqLhFYhw/XYNWqfMEuEenpwU5NWGazsfjll3L89FOxywNGWRb47LML2L27Eo88ktpl+t8Q5wn1H9RopLj33p5257W6PFzl0KFqTtcDq7V5VoQlS/pBoxHnxZusrHr89FMx7woj0NwmNnGie9MwiTqwKitN+PrrAsHGPKA5xefNS263l/zlwNmzh99QL5UyiI5WIS5OhchIJWpqLCgtNaCszMgbs3X+vA4vvpiFJUv6tXu5mXQdLMvvjtOjhwILFvRy2Ntdo5Hi9tsTMGxYKD799AKnYV6vt2LHjnLcdJN3ZgftLNXVZmzYUMKZyPNKY8b0wKxZcS6NXLmSKAPLZLLh55+LsWVLGa+XOtD8hzBnTk9kZDgemHnZ6tUFgmE1enQ4Zs/uKdhJzmi0YuPGUmzeXAqzubUMOl0TPv64eYoZVwZ1EvFqarJh0qRoGI1WGI1W2GzA1KkxCAx0bvdKTQ3EpElR2LatnLP84MFqTJ0a4/fdHoDmtt6dOyuwZ4/whQa1WooZM2KdHqNrj6gCi2WbZ1ZYs6ZIcLgAAIwdG4HZs5OcnoeooECPrVvLeMsfe6wPRo+2/8tVqZq/Ha+/Pgovv3yK8w1bXGzApk0lmDXL+QHbRLzkcgn+9jfPZpqdODEK2dn1nDnZLBYbcnJ06N/ff9tGWRY4erQGmzeXob5euNf6iBFhmD491itXP0UTWLW1Fnz00Xm7h5rx8WrMn5/scuVu2lTCWzZpUrTDsLpSjx4KPPRQb/zrX6c57RA//1yCG26IEW0bBOlcEgmDAQNCeJNIFhbq/Taw6ust+O67Is6c8VeKjlbhllvi0Lu39y4eiCKwjh+vxX/+kyeY4CEhctx2WwKuuy7SpdHjQPOp5cGD3GE1gYEy3HNPkkvvM2BAMCZMiERmZutppdlsw8WLeqSl+WdnQOJ/hIZzeXMWX286c6YBa9YUtfTcv1JQkAw33ND8pe/qPtkevw4sq5XFmjVF2LKllDeAUy6XYNq0WMycGQuVyr2jmMpKE+9eg8nJAVAqXW8zSEsL4gQW0HxqSIFFnCUUWEJttL5ktbLYsqUMe/dW8vZJmYxBRkYkrrsuyq19yBl+HVhfflmAX38t5y0fOjQUc+f29HhGBKFL0UlJ7k26JzQJ2ZWTlRHSHonAPu5vc2qtX18i2NexX78g3HxzfIeP8vDbwDpwQMsLK7lcgjlzkjBpUrRXPkOvFzqcde+mAT168P+w3L10S7qn8nL+DCP+NE3R8eO1vLCSyRjMnBmHa67p0Sll8MvAKi834rPPLnCWyWQMnnuur1cbIIOD+eHk7lFR2xkZgeY530nXlpvbiK+/LuAsu+OOBLf+TsvL+Uf8rk6/0lGqqsz44YeLnGVSKYMFCzq3R75fdvBYvbqQc3MLhgEeeqi316+WCM3CIDQZnzOEps5ISqLOo11dYqIaBoP10l2Omx9Hjrh+49/medW4HaAlEgYREf4RWBs3lnA6SjMMcPfdiZ0+fMjvAqv55o/cCp8+PQ5jxnj/kDM+Xs3rr1VUpBcc+9Setn9sMhmDuDj/OZwnHUOhkCAxkXsknZ1dz+lM7IzDh2sE5033h2YFvd7Ku3lqRkYkhg4N7fSy+F1gHTpUzblyJ5dLMHVqTId8FsMIzxf/xRf5Lt0qacuWMvz1F/cOuyNGhPnFHxvpeKmp3CNpi8WGrVvLnJ6auK7Ogm3b+POqTZvm+LZ2neXkyTrOPimTMXbv8dnR/K4Nq22j3oABwdDrrdDrPb//YUCAlNduNWNGLPbtq+JcPq6qMuGVV7KxcGEvh/cmtFpZ7NpVgTVruPdADAiQYe7cZI/LS8Rh8OBQ7N5dydmp9+2rQk2NGffckwSFwv5xwfnzOqxeXcjrzzRhQoTf3D287RlPnz6BLcOQPKVWS50ewgT4YWC1ndbl+PFa3i/MXVOmxODee3tylkVHqzBzZhzWreM2KObnN+If/ziFiROjkJYWiJgYNWJjVTCbbSgrM6KoqHlIj1Bj+733JvnNHxvpeHFxKtx+ewLWrCniLM/Kqsd77+Wgf/9gJCaqkZCgQXCwDOXlJpSUGFBYqMehQzW82UiDgmS47jrPhvp4U9v22dOnG3D69FmvvPe4cREuDfD2q8Cqr7d0yJ2k23PLLfGoq7PwulHYbCx27CjHjh38vmD2DB0a6rPDZeI7V10VhvJyI3bv5nYebr4jM39gvT0qlRT33dezwzpeukqna+qQO0m7y68CS+hopTMwDDB/fjISE9X48UfhOXycMXZsBBYu7OXdwhHRmDo1FlYri99+0/JGUDgjOFje8nfoL9pO/exrfhVYQv1QOgvDNA96Hj8+Elu3lmLTplKn283i49W4+eb4Tus8R/wTwwAzZ8Zh/PhI7NhRjj//5J/uCVGrpbjuukiMGxfhd1PJtJ362de8GlirV4/yaPtx4yIwblyEl0rjHqVSglmz4vG3v0Xj6NHalsn6Lj8YhkFQkAxhYQr06xeEgQODMWBASIffg46IR2ioHLffnoCJE6OQm9uIykpTywNovvij0cgQG6tCnz6BSErSdNgV5TffHOTR9lddFYarrnL9zusdxa+OsPxJYKAM48f7NjyJuIWHKxAeTndQ8ib/Ov4khBAHKLAIIaJBgUUIEQ0KLEKIaFBgEUJEgwKLECIaFFiEENGgwCKEiAYFFiFENCiwCCGiQYFFCBENCixCiGhQYBFCRIMCixAiGhRYhBDRoMAihIgGBRYhRDQosAghokGBRQgRDQosQohoUGARQkSDAosQIhqOAot15iaQxDVWKwuGgf/c+5u4hWEYlqXdw+tsNsf7h93AUiol2ooK/7rra1dQWWmCUimp9HU5iGcUCkar1frXbdy7Aq3WDLmcsbt/2A0shsHG/furqEa8bM+eSrPNhg2+LgfxFLPx6NEa2j+87NChajPL2t8/7AaWXm9dunVrmfnkybqOKVk3dPJkHXbsKDcZjdbXfF0W4hmj0bp0794q87lzDb4uSpdx7lwDDhyoNplMNrv7h6Nb1RebzbYZ7757buO0abHKsWMj5MHBdGd7d9TXN2HXrkrLr7+WGcxm23QAJb4uE/FYscVim/HFFwUbJ0yIUI4YESYPDKT9wx06XRP++KPa8vvvWoPF4nj/YAAsvfSwJ16lkr4ikWCKxcKGebeo3YNczmhtNmwyGq3LAFx0sGrmpZ8ZHV4o4ozMSz8zHKwTr1JJX2EYTLFYbLR/uEEul2htNnaTyWRrb/9Y6sxXQrHRaF3gpbJ1SxaLr0tAOhDtHx5
qarI6vS71wyKEiAYFFiFENCiwCCGiQYFFCBENBs1XQjJ9WwxyybxLP1f5sAyk1bxLP1f5sAykVcb/Ay1wlOkEwQi2AAAAAElFTkSuQmCC" - } - ], - "prompt_number": 4 - }, - { - "cell_type": "code", - "collapsed": false, - "input": [], - "language": "python", - "metadata": {}, - "outputs": [] - } - ], - "metadata": {} - } - ] -} \ No newline at end of file diff --git a/notebooks/Gradient-Descent.ipynb b/notebooks/Gradient-Descent.ipynb index 98c4a8c7..77583a28 100644 --- a/notebooks/Gradient-Descent.ipynb +++ b/notebooks/Gradient-Descent.ipynb @@ -1,395 +1,401 @@ { - "metadata": { - "language": "haskell", - "name": "" - }, - "nbformat": 3, - "nbformat_minor": 0, - "worksheets": [ + "cells": [ { - "cells": [ + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In supervised learning algorithms, we generally have some model (such as a neural network) with a set of parameters (the weights), a data set, and an error function which measures how well our parameters and model fit the data. With many models, the way to train the model and fit the parameters is through an iterative minimization procedure which uses the gradient of the error to find the local minimum in parameter space. This notebook will not go into the details of neural networks or how to compute the derivatives of error functions, but instead focuses on some of the simple minimization methods one can employ. The goal of this notebook is to develop a simple yet flexible framework in Haskell in which we can develop gradient descent methods.\n", + "\n", + "Although you should keep in mind that the goal of these algorithms (for our purposes) is to train neural networks, for now we will just discuss some abstract function $f(\\vec x)$ for which we can compute all partial derivatives.\n", + "$\\newcommand\\vector[1]{\\langle #1 \\rangle}\\newcommand\\p[2]{\\frac{\\partial #1}{\\partial #2}}$\n", + "\n", + "Gradient Descent\n", + "---\n", + "The simplest algorithm for iterative minimization of differentiable functions is known as just **gradient descent**.\n", + "Recall that the gradient of a function is defined as the vector of partial derivatives:\n", + "\n", + "$$\\nabla f(x) = \\vector{\\p{f}{x_1}, \\p{f}{x_2}, \\ldots, \\p{f}{x_n}}$$\n", + "\n", + "and that the gradient of a function always points towards the direction of maximal increase at that point.\n", + "\n", + "Equivalently, it points *away* from the direction of maximum decrease - thus, if we start at any point, and keep moving in the direction of the negative gradient, we will eventually reach a local minimum.\n", + "\n", + "This simple insight leads to the Gradient Descent algorithm. Outlined algorithmically, it looks like this:\n", + "\n", + "1. Pick a point $x_0$ as your initial guess.\n", + "2. Compute the gradient at your current guess:\n", + "$v_i = \\nabla f(x_i)$\n", + "3. Move by $\\alpha$ (your step size) in the direction of that gradient:\n", + "$x_{i+1} = x_i + \\alpha v_i$\n", + "4. Repeat steps 1-3 until your function is close enough to zero (until $f(x_i) < \\varepsilon$ for some small tolerance $\\varepsilon$)\n", + "\n", + "Note that the step size, $\\alpha$, is simply a parameter of the algorithm and has to be fixed in advance. \n", + "\n", + "Though this algorithm is simple, it will be a bit of a challenge to formalize it into executable Haskell code that we can later extend to other algorithms. 
First, note that gradient descent requires three things:\n",
+    "\n",
+    "- Something to optimize (a function)\n",
+    "- What to optimize over (the parameters)\n",
+    "- A way to compute partials of the function\n",
+    "\n",
+    "Note that we don't actually need to *call* the function itself - only the partial derivatives are necessary.\n",
+    "\n",
+    "We're going to define a single class for things on which we can run gradient descent. Although later we may want to modify this class, this serves as a beginning:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [],
+   "source": [
+    ":set -XTypeFamilies\n",
+    "class GradientDescent a where\n",
+    "  -- Type to represent the parameter space.\n",
+    "  data Params a :: *\n",
+    "  \n",
+    "  -- Compute the gradient at a location in parameter space.\n",
+    "  grad :: a -> Params a -> Params a\n",
+    "  \n",
+    "  -- Move in parameter space.\n",
+    "  paramMove :: Double    -- Scaling factor.\n",
+    "            -> Params a  -- Direction vector.\n",
+    "            -> Params a  -- Original location.\n",
+    "            -> Params a  -- New location."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "In order to use some type `a` with our gradient descent, we require that it is an instance of `GradientDescent`. This class requires a few things.\n",
+    "\n",
+    "First off, we use type families in order to define a representation for the parameter space. We want to be able to operate on points in the parameter space of our function; however, while something like a list of values might be nice and simple in one case, it is inappropriate and inefficient when storing the weights of a neural network. Thus, we let each class instance decide how to store its parameters by defining an associated type instance. (We will see an instance of this later!)\n",
+    "\n",
+    "Next, `GradientDescent` requires a single function called `grad`, which takes the thing of type `a` and the current point in parameter space (via a `Param a`) and outputs a set of partial derivatives. The partial derivatives have the same form and dimension as the point in parameter space, so they are also a `Param a`. \n",
+    "\n",
+    "Finally, we must be able to move around in parameter space, so `GradientDescent` defines a function `paramMove` which does exactly that - it takes a parameter vector and an amount by which to move and uses these to generate a new position from an old one.\n",
+    "\n",
+    "Let's go ahead and create the simplest instantiation of this class and type family: a single-argument function. Note that this is just for demo purposes, and we're going to use numerical differentiation to compute the derivative."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [],
+   "source": [
+    "-- We need flexible instances for declarations like these.\n",
+    ":set -XFlexibleInstances\n",
+    "\n",
+    "instance Floating a => GradientDescent (a -> a) where\n",
+    "  -- The parameter for a function is just its argument.\n",
+    "  data Params (a -> a) = Arg { unArg :: a }\n",
+    "\n",
+    "  -- Use numeric differentiation for taking the gradient.\n",
+    "  grad f (Arg value) = Arg $ (f value - f (value - epsilon)) / epsilon\n",
+    "    where epsilon = 0.0001\n",
+    "  \n",
+    "  paramMove scale (Arg vec) (Arg old) = Arg $ old + fromRational (toRational scale) * vec"
+   ]
+  },
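+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "As a quick sanity check of this instance (a sketch, not part of the original run), the numerical gradient of $f(x) = x^2$ at $x = 3$ should come out very close to the true derivative $2x = 6$:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [],
+   "source": [
+    "-- Illustrative only: check the numerical gradient against the known\n",
+    "-- derivative of x^2, which is 2x (so 6 at x = 3).\n",
+    "unArg $ grad (\\x -> x * x) (Arg 3)  -- roughly 5.9999, due to the backward difference"
+   ]
+  },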
  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "We're getting closer to implementing the actual algorithm. However, we have yet to define when the algorithm *stops* - and, in order to give maximum flexibility to the user, we'll let the stopping condition be an argument. This lets the user specify an error tolerance, as well as how they want to derive this error:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [],
+   "source": [
+    "-- Define a way to decide when to stop.\n",
+    "-- This lets the user specify an error tolerance easily.\n",
+    "-- The function takes the previous two sets of parameters and returns\n",
+    "-- `True` to stop the descent and `False` to continue.\n",
+    "newtype StopCondition a = StopWhen (Params a -> Params a -> Bool)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "With a single instance of our class, we can now implement our gradient descent algorithm:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [],
+   "source": [
+    "gradientDescent :: GradientDescent a => a  -- What to optimize.\n",
+    "                -> StopCondition a         -- When to stop.\n",
+    "                -> Double                  -- Step size (alpha).\n",
+    "                -> Params a                -- Initial point (x0).\n",
+    "                -> Params a                -- Return: Location of minimum.\n",
+    "gradientDescent function (StopWhen stop) alpha x0 =\n",
+    "  let iterations = iterate takeStep x0\n",
+    "      iterationPairs = zip iterations $ tail iterations\n",
+    "  in\n",
+    "   -- Drop all elements where the resulting parameters (and previous parameters)\n",
+    "   -- do not meet the stop condition. Then, return just the last parameter set.\n",
+    "   snd . head $ dropWhile (not . uncurry stop) iterationPairs\n",
+    "  where\n",
+    "    -- For each step...\n",
+    "    takeStep params = \n",
+    "      -- Compute the gradient.\n",
+    "      let gradients = grad function params in\n",
+    "        -- And move against the gradient with a step size alpha.\n",
+    "        paramMove (-alpha) gradients params"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Let's go ahead and try this for some simple functions. First, we need to create a stopping condition. In this case, we're going to stop when successive updates to the parameters do not affect the outcome very much - namely, when the difference between the function value at successive parameters is below some $\\varepsilon$-tolerance. In other scenarios, we may want to use a more complicated stopping criterion, such as the one sketched after the next cell."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 5,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [],
+   "source": [
+    "-- Create a stop condition that respects a given error tolerance.\n",
+    "stopCondition :: (Double -> Double) -> Double -> StopCondition (Double -> Double)\n",
+    "stopCondition f tolerance = StopWhen stop\n",
+    "  where stop (Arg prev) (Arg cur) =\n",
+    "          abs (f prev - f cur) < tolerance "
+   ]
+  },
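+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "As an example of such a variation (a sketch only - it is not used below), we could instead stop when the *parameters* stop moving, regardless of what the function values are doing:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [],
+   "source": [
+    "-- An alternative stopping rule: stop once successive parameter values\n",
+    "-- are within a given tolerance of each other.\n",
+    "paramStopCondition :: Double -> StopCondition (Double -> Double)\n",
+    "paramStopCondition tolerance = StopWhen stop\n",
+    "  where stop (Arg prev) (Arg cur) = abs (prev - cur) < tolerance"
+   ]
+  },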
\n", + "\n", + "
\n", + "\n", + "This function has a minimum at $x = -\\frac{3}{2}$." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "-- A demo function with minimum at -3/2\n", + "function x = x^2 + 3 * x" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, let's take the minimum. We're going to use a step size of $\\alpha = 0.1$, start at $x_0 = 12$, and stop with a tolerance of $1\\times 10^{-4}$:" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false + }, + "outputs": [ { - "cell_type": "markdown", + "data": { + "text/plain": [ + "-1.4892542376755242" + ] + }, "metadata": {}, - "source": [ - "In supervised learning algorithms, we generally have some model (such as a neural network) with a set of parameters (the weights), a data set, and an error function which measures how well our parameters and model fit the data. With many models, the way to train the model and fit the parameters is through an iterative minimization procedure which uses the gradient of the error to find the local minimum in parameter space. This notebook will not go into the details of neural networks or how to compute the derivatives of error functions, but instead focuses on some of the simple minimization methods one can employ. The goal of this notebook is to develop a simple yet flexible framework in Haskell in which we can develop gradient descent methods.\n", - "\n", - "Although you should keep in mind that the goal of these algorithms (for our purposes) is to train neural networks, for now we will just discuss some abstract function $f(\\vec x)$ for which we can compute all partial derivatives.\n", - "$\\newcommand\\vector[1]{\\langle #1 \\rangle}\\newcommand\\p[2]{\\frac{\\partial #1}{\\partial #2}}$\n", - "\n", - "Gradient Descent\n", - "---\n", - "The simplest algorithm for iterative minimization of differentiable functions is known as just **gradient descent**.\n", - "Recall that the gradient of a function is defined as the vector of partial derivatives:\n", - "\n", - "$$\\nabla f(x) = \\vector{\\p{f}{x_1}, \\p{f}{x_2}, \\ldots, \\p{f}{x_n}}$$\n", - "\n", - "and that the gradient of a function always points towards the direction of maximal increase at that point.\n", - "\n", - "Equivalently, it points *away* from the direction of maximum decrease - thus, if we start at any point, and keep moving in the direction of the negative gradient, we will eventually reach a local minimum.\n", - "\n", - "This simple insight leads to the Gradient Descent algorithm. Outlined algorithmically, it looks like this:\n", - "\n", - "1. Pick a point $x_0$ as your initial guess.\n", - "2. Compute the gradient at your current guess:\n", - "$v_i = \\nabla f(x_i)$\n", - "3. Move by $\\alpha$ (your step size) in the direction of that gradient:\n", - "$x_{i+1} = x_i + \\alpha v_i$\n", - "4. Repeat steps 1-3 until your function is close enough to zero (until $f(x_i) < \\varepsilon$ for some small tolerance $\\varepsilon$)\n", - "\n", - "Note that the step size, $\\alpha$, is simply a parameter of the algorithm and has to be fixed in advance. \n", - "\n", - "Though this algorithm is simple, it will be a bit of a challenge to formalize it into executable Haskell code that we can later extend to other algorithms. 
First, note that gradient descent requires three things:\n",
-      "\n",
-      "- Something to optimize (a function)\n",
-      "- What to optimize over (the parameters)\n",
-      "- A way to compute partials of the function\n",
-      "\n",
-      "Note that we don't actually need to *call* the function itself - only the partial derivatives are necessary.\n",
-      "\n",
-      "We're going to define a single class for things on which we can run gradient descent. Although later we may want to modify this class, this serves as a beginning:"
-     ]
-    },
-    {
-     "cell_type": "code",
-     "collapsed": false,
-     "input": [
-      ":set -XTypeFamilies\n",
-      "class GradientDescent a where\n",
-      "  -- Type to represent the parameter space.\n",
-      "  data Params a :: *\n",
-      "  \n",
-      "  -- Compute the gradient at a location in parameter space.\n",
-      "  grad :: a -> Params a -> Params a\n",
-      "  \n",
-      "  -- Move in parameter space.\n",
-      "  paramMove :: Double -- Scaling factor.\n",
-      "            -> Params a -- Direction vector.\n",
-      "            -> Params a -- Original location.\n",
-      "            -> Params a -- New location."
-     ],
-     "language": "python",
-     "metadata": {},
-     "outputs": [],
-     "prompt_number": 1
-    },
-    {
-     "cell_type": "markdown",
-     "metadata": {},
-     "source": [
-      "In order to use some type `a` with our gradient descent, we require that it is an instance of `GradientDescent`. This class requires a few things.\n",
-      "\n",
-      "First off, we use type families in order to define a representation for the parameter space. We want to be able to operate on points in the parameter space of our function; however, while something like a list of values might be nice and simple in one case, it is inappropriate and inefficient when storing the weights of a neural network. Thus, we let each class instance decide how to store its parameters by defining an associated type instance. (We will see an instance of this later!)\n",
-      "\n",
-      "Next, `GradientDescent` requires a single function called `grad`, which takes the thing of type `a` and the current point in parameter space (via a `Params a`) and outputs a set of partial derivatives. The partial derivatives have the same form and dimension as the point in parameter space, so they are also a `Params a`. \n",
-      "\n",
-      "Finally, we must be able to move around in parameter space, so `GradientDescent` defines a function `paramMove` which does exactly that - it takes a parameter vector and an amount by which to move and uses these to generate a new position from an old one.\n",
-      "\n",
-      "Let's go ahead and create the simplest instantiation of this class and type family: a single-argument function. Note that this is just for demo purposes, and we're going to use numerical differentiation to compute the derivative."
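To make the associated data family a bit more concrete before the single-argument instance below, here is a hypothetical sketch of what an instance for *two*-argument functions could look like (not part of the original notebook; like the instance below, it needs `FlexibleInstances`):

```haskell
-- A hypothetical instance for functions of two variables: the parameter
-- space is a pair, and we take a backward difference per coordinate,
-- mirroring the numeric differentiation used in the real instance below.
instance Floating a => GradientDescent (a -> a -> a) where
  data Params (a -> a -> a) = Args a a

  grad f (Args x y) = Args dx dy
    where epsilon = 0.0001
          dx = (f x y - f (x - epsilon) y) / epsilon
          dy = (f x y - f x (y - epsilon)) / epsilon

  paramMove scale (Args dx dy) (Args x y) = Args (x + s * dx) (y + s * dy)
    where s = fromRational (toRational scale)
```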
- ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "-- We need flexible instances for declarations like these.\n", - ":set -XFlexibleInstances\n", - "\n", - "instance Floating a => GradientDescent (a -> a) where\n", - " -- The parameter for a function is just its argument.\n", - " data Params (a -> a) = Arg { unArg :: a }\n", - "\n", - " -- Use numeric differentiation for taking the gradient.\n", - " grad f (Arg value) = Arg $ (f value - f (value - epsilon)) / epsilon\n", - " where epsilon = 0.0001\n", - " \n", - " paramMove scale (Arg vec) (Arg old) = Arg $ old + fromRational (toRational scale) * vec" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 2 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "We're getting closer to implementing the actual algorithm. However, we have yet to define when the algorithm *stops* - and, in order to give maximum flexibility to the user, we'll let the stopping condition be an argument. This lets the user specify an error tolerance, as well as how they want to derive this error:" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "-- Define a way to decide when to stop.\n", - "-- This lets the user specify an error tolerance easily.\n", - "-- The function takes the previous two sets of parameters and returns\n", - "-- `True` to continue the descent and `False` to stop.\n", - "newtype StopCondition a = StopWhen (Params a -> Params a -> Bool)" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 3 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "With a single instance of our class, we can now implement our gradient descent algorithm:" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "gradientDescent :: GradientDescent a => a -- What to optimize.\n", - " -> StopCondition a -- When to stop.\n", - " -> Double -- Step size (alpha).\n", - " -> Params a -- Initial point (x0).\n", - " -> Params a -- Return: Location of minimum.\n", - "gradientDescent function (StopWhen stop) alpha x0 =\n", - " let iterations = iterate takeStep x0\n", - " iterationPairs = zip iterations $ tail iterations\n", - " in\n", - " -- Drop all elements where the resulting parameters (and previous parameters)\n", - " -- do not meet the stop condition. Then, return just the last parameter set.\n", - " snd . head $ dropWhile (not . uncurry stop) iterationPairs\n", - " where\n", - " -- For each step...\n", - " takeStep params = \n", - " -- Compute the gradient.\n", - " let gradients = grad function params in\n", - " -- And move against the gradient with a step size alpha.\n", - " paramMove (-alpha) gradients params" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 4 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let's go ahead and try this for some simple functions. First, we need to create a stopping condition. In this case, we're going to stop when successive updates to the parameters do not affect the outcome very much - namely, when the difference between the function value at successive parameters is below some $\\varepsilon$-tolerance. In other scenarios, we may want to use a more complicated stopping criterion." 
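For instance, a stop condition could watch the parameters themselves rather than the function values. A small sketch along those lines (hypothetical, not a cell from the notebook):

```haskell
-- Stop once the argument itself moves by less than the tolerance,
-- regardless of how the function value behaves.
argStopCondition :: Double -> StopCondition (Double -> Double)
argStopCondition tolerance = StopWhen stop
  where stop (Arg prev) (Arg cur) = abs (prev - cur) < tolerance
```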
- ]
-    },
-    {
-     "cell_type": "code",
-     "collapsed": false,
-     "input": [
-      "-- Create a stop condition that respects a given error tolerance.\n",
-      "stopCondition :: (Double -> Double) -> Double -> StopCondition (Double -> Double)\n",
-      "stopCondition f tolerance = StopWhen stop\n",
-      "  where stop (Arg prev) (Arg cur) =\n",
-      "          abs (f prev - f cur) < tolerance "
-     ],
-     "language": "python",
-     "metadata": {},
-     "outputs": [],
-     "prompt_number": 5
-    },
-    {
-     "cell_type": "markdown",
-     "metadata": {},
-     "source": [
-      "Let's finally try to minimize something, such as the relatively trivial function $f(x) = x^2 + 3x$. Its graph looks like this:\n",
-      "\n",
-      "
\n", - "\n", - "
\n", - "\n", - "This function has a minimum at $x = -\\frac{3}{2}$." - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "-- A demo function with minimum at -3/2\n", - "function x = x^2 + 3 * x" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 6 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Finally, let's take the minimum. We're going to use a step size of $\\alpha = 0.1$, start at $x_0 = 12$, and stop with a tolerance of $1\\times 10^{-4}$:" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "let alpha = 1e-1\n", - "let tolerance = 1e-4\n", - "let initValue = 12.0\n", - "unArg $ gradientDescent function (stopCondition function tolerance) alpha (Arg initValue)" - ], - "language": "python", - "metadata": {}, - "outputs": [ - { - "metadata": {}, - "output_type": "display_data", - "text": [ - "-1.4892542376755242" - ] - } - ], - "prompt_number": 7 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Monadic Gradient Descent\n", - "---\n", - "\n", - "Although the above implementation of gradient descent works, we're going to run into problems when our functions are more complicated. For instance, suppose that computing the gradient required a lot of computation, and the computation required communicating with a distributed network of processing nodes. Or suppose that there were some regimes in which the function was non-differentiable, and we wanted to use the `Maybe` type to represent this. In order to support this, we can try to rewrite our class with *monadic* variants of its operations." - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - ":set -XMultiParamTypeClasses\n", - "class Monad m => GradientDescent m a where\n", - " -- Type to represent the parameter space.\n", - " data Params a :: *\n", - " \n", - " -- Compute the gradient at a location in parameter space.\n", - " grad :: a -> Params a -> m (Params a)\n", - " \n", - " -- Move in parameter space.\n", - " paramMove :: Double -- Scaling factor.\n", - " -> Params a -- Direction vector.\n", - " -> Params a -- Original location.\n", - " -> m (Params a) -- New location.\n", - " \n", - " \n", - "-- Since we've redefined GradientDescent, we need to redefine StopCondition.\n", - "newtype StopCondition a = StopWhen (Params a -> Params a -> Bool)" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 8 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In order to utilize this, we're going to have to rewrite our instance to run all computations in a monad. The implementation will look quite familiar, but we won't be able to use as many built-in functions, as they do not have monadic variants in the base packages." 
- ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "gradientDescent :: (GradientDescent m a) => \n", - " a -- What to optimize.\n", - " -> StopCondition a -- When to stop.\n", - " -> Double -- Step size (alpha).\n", - " -> Params a -- Initial point (x0).\n", - " -> m (Params a) -- Return: Location of minimum.\n", - "gradientDescent function (StopWhen stop) alpha x0 = do\n", - " -- Take the next step.\n", - " next <- takeStep x0\n", - " \n", - " -- If we stop, do so, otherwise recurse.\n", - " if stop x0 next\n", - " then return next\n", - " else gradientDescent function (StopWhen stop) alpha next\n", - " where\n", - " takeStep params = do\n", - " gradients <- grad function params\n", - " paramMove (-alpha) gradients params" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 9 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let's try this for something simple. Suppose we're using our old $f(x) = x^2 + 3x$, but for some reason, we are incapable of differentiating if the function value is below zero. We'll use the `Maybe` monad to represent this - if the parameter to a function is negative, we return `Nothing`, otherwise, we return `Just` the derivative." - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "instance (Ord a, Floating a) => GradientDescent Maybe (a -> a) where\n", - " -- The parameter for a function is just its argument.\n", - " data Params (a -> a) = Arg { unArg :: a }\n", - "\n", - " -- Use numeric differentiation for taking the gradient.\n", - " grad f (Arg value) = \n", - " if value > 0\n", - " then Just $ Arg $ (f value - f (value - epsilon)) / epsilon\n", - " else Nothing\n", - " where epsilon = 0.0001\n", - " \n", - " paramMove scale (Arg vec) (Arg old) = Just $ Arg $ old + fromRational (toRational scale) * vec" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 10 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let's go ahead and try this with the same example as before." - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "stopCondition f tolerance = StopWhen stop\n", - " where stop (Arg prev) (Arg cur) =\n", - " abs (f prev - f cur) < tolerance \n", - " \n", - "let x0 = Arg initValue\n", - "let stopper = stopCondition function tolerance\n", - "case gradientDescent function stopper alpha x0 of\n", - " Just x -> print $ unArg x\n", - " Nothing -> putStrLn \"Nothing!\"" - ], - "language": "python", - "metadata": {}, - "outputs": [ - { - "metadata": {}, - "output_type": "display_data", - "text": [ - "Nothing!" - ] - } - ], - "prompt_number": 11 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "We saw in the original example that the minimum is at $-\\frac{3}{2}$, so this gradient descent tries to go into the $x < 0$ region - at which point the differentiation returns `Nothing`, and the gradient descent implicitly stops! This monadic gradient descent can be used to implement things such as bounded optimization, optimization that keeps track of the states it went through, optimization that uses networked IO to do its computation, and so on.\n", - "\n", - "That's it for now. In the next notebook, I'm going to try implementing conjugate gradient in this same framework." 
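Before leaving this example, a contrasting sanity check may be helpful (`safeFunction` is invented for illustration and is not part of the original notebook): a minimum that sits safely inside the differentiable region comes back wrapped in `Just`.

```haskell
-- Minimum at x = 5; starting from x0 = 12 with alpha = 0.1, every iterate
-- stays above 5, so the numeric gradient never hits the x <= 0 region.
safeFunction x = (x - 5) ^ 2

case gradientDescent safeFunction (stopCondition safeFunction 1e-4) 1e-1 (Arg 12.0) of
  Just x  -> print (unArg x)     -- prints a value close to 5.0
  Nothing -> putStrLn "Nothing!" -- not reached for this function
```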
- ] + "output_type": "display_data" } ], - "metadata": {} + "source": [ + "let alpha = 1e-1\n", + "let tolerance = 1e-4\n", + "let initValue = 12.0\n", + "unArg $ gradientDescent function (stopCondition function tolerance) alpha (Arg initValue)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Monadic Gradient Descent\n", + "---\n", + "\n", + "Although the above implementation of gradient descent works, we're going to run into problems when our functions are more complicated. For instance, suppose that computing the gradient required a lot of computation, and the computation required communicating with a distributed network of processing nodes. Or suppose that there were some regimes in which the function was non-differentiable, and we wanted to use the `Maybe` type to represent this. In order to support this, we can try to rewrite our class with *monadic* variants of its operations." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + ":set -XMultiParamTypeClasses\n", + "class Monad m => GradientDescent m a where\n", + " -- Type to represent the parameter space.\n", + " data Params a :: *\n", + " \n", + " -- Compute the gradient at a location in parameter space.\n", + " grad :: a -> Params a -> m (Params a)\n", + " \n", + " -- Move in parameter space.\n", + " paramMove :: Double -- Scaling factor.\n", + " -> Params a -- Direction vector.\n", + " -> Params a -- Original location.\n", + " -> m (Params a) -- New location.\n", + " \n", + " \n", + "-- Since we've redefined GradientDescent, we need to redefine StopCondition.\n", + "newtype StopCondition a = StopWhen (Params a -> Params a -> Bool)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In order to utilize this, we're going to have to rewrite our instance to run all computations in a monad. The implementation will look quite familiar, but we won't be able to use as many built-in functions, as they do not have monadic variants in the base packages." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "gradientDescent :: (GradientDescent m a) => \n", + " a -- What to optimize.\n", + " -> StopCondition a -- When to stop.\n", + " -> Double -- Step size (alpha).\n", + " -> Params a -- Initial point (x0).\n", + " -> m (Params a) -- Return: Location of minimum.\n", + "gradientDescent function (StopWhen stop) alpha x0 = do\n", + " -- Take the next step.\n", + " next <- takeStep x0\n", + " \n", + " -- If we stop, do so, otherwise recurse.\n", + " if stop x0 next\n", + " then return next\n", + " else gradientDescent function (StopWhen stop) alpha next\n", + " where\n", + " takeStep params = do\n", + " gradients <- grad function params\n", + " paramMove (-alpha) gradients params" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's try this for something simple. Suppose we're using our old $f(x) = x^2 + 3x$, but for some reason, we are incapable of differentiating if the function value is below zero. We'll use the `Maybe` monad to represent this - if the parameter to a function is negative, we return `Nothing`, otherwise, we return `Just` the derivative." 
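It is worth seeing why a single `Nothing` will be enough to abort the whole descent: the monadic `gradientDescent` above threads every step through `(>>=)`, and `Maybe`'s bind makes failure final. A small sketch (the standalone `bindMaybe` just spells out the standard `Monad Maybe` instance):

```haskell
-- Maybe's bind, written out: the first failure is final.
bindMaybe :: Maybe a -> (a -> Maybe b) -> Maybe b
bindMaybe Nothing  _ = Nothing
bindMaybe (Just v) f = f v

-- So 'next <- takeStep x0' in gradientDescent desugars to
--   takeStep x0 >>= \next -> ...
-- and once 'grad' returns Nothing, no further steps are attempted.
```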
+ ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "instance (Ord a, Floating a) => GradientDescent Maybe (a -> a) where\n", + " -- The parameter for a function is just its argument.\n", + " data Params (a -> a) = Arg { unArg :: a }\n", + "\n", + " -- Use numeric differentiation for taking the gradient.\n", + " grad f (Arg value) = \n", + " if value > 0\n", + " then Just $ Arg $ (f value - f (value - epsilon)) / epsilon\n", + " else Nothing\n", + " where epsilon = 0.0001\n", + " \n", + " paramMove scale (Arg vec) (Arg old) = Just $ Arg $ old + fromRational (toRational scale) * vec" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's go ahead and try this with the same example as before." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "Nothing!" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "stopCondition f tolerance = StopWhen stop\n", + " where stop (Arg prev) (Arg cur) =\n", + " abs (f prev - f cur) < tolerance \n", + " \n", + "let x0 = Arg initValue\n", + "let stopper = stopCondition function tolerance\n", + "case gradientDescent function stopper alpha x0 of\n", + " Just x -> print $ unArg x\n", + " Nothing -> putStrLn \"Nothing!\"" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We saw in the original example that the minimum is at $-\\frac{3}{2}$, so this gradient descent tries to go into the $x < 0$ region - at which point the differentiation returns `Nothing`, and the gradient descent implicitly stops! This monadic gradient descent can be used to implement things such as bounded optimization, optimization that keeps track of the states it went through, optimization that uses networked IO to do its computation, and so on.\n", + "\n", + "That's it for now. In the next notebook, I'm going to try implementing conjugate gradient in this same framework." + ] } - ] -} \ No newline at end of file + ], + "metadata": { + "kernelspec": { + "display_name": "Haskell", + "language": "haskell", + "name": "haskell" + }, + "language_info": { + "name": "haskell", + "version": "7.8.3" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/notebooks/Homophones.ipynb b/notebooks/Homophones.ipynb index e469f701..39b92774 100644 --- a/notebooks/Homophones.ipynb +++ b/notebooks/Homophones.ipynb @@ -1,615 +1,625 @@ { - "metadata": { - "language": "haskell", - "name": "" - }, - "nbformat": 3, - "nbformat_minor": 0, - "worksheets": [ + "cells": [ { - "cells": [ + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A few days ago, a friend of mine sent me [a fascinating problem](http://math.ucsd.edu/~justin/190hw.html). The problem goes like this:\n", + "\n", + "> The *homophony group* (of English) is the group with 26 generators `a`,`b`, `c`, and so on until `z` and one relation for every pair of English words which sound the same. Prove that the group is trivial!\n", + "\n", + "For example, consider the group elements **knight** and **night**. By the [cancellation laws](http://www.proofwiki.org/wiki/Cancellation_Laws), this implies that **k** must be the identity element. 
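Spelled out, that deduction is just right cancellation applied five times:

$$knight = night \;\Longrightarrow\; knigh = nigh \;\Longrightarrow\; \cdots \;\Longrightarrow\; kn = n \;\Longrightarrow\; k = 1,$$

cancelling $t$, $h$, $g$, $i$, and finally $n$ from the right in turn.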
Recall that a trivial group is one which consists solely of its identity element, so our task is to show that each letter of the English alphabet is the identity element.\n", + "\n", + "Skipping all of the algebraic jargon, we want to show that if we set all homophones \"equal\" to one another, and do left cancellation, right cancellation, and substitution, we can show that all the English letters equal one.\n", + "\n", + "This is a fun exercise to do by hand, but I'd like to do it in Haskell. I've started by compiling a list of homophones in American English, starting with [this list](http://members.peak.org/~jeremy/dictionaryclassic/chapters/homophones.php) and removing all single letters (such as `j` being a homophone with `jay`) and all words with apostrophes and periods, as well as some less commonly used words.\n", + "\n", + "The contents of the file look like this:\n", + "```\n", + "ad add\n", + "add ad\n", + "arc ark\n", + "ark arc\n", + "...\n", + "```\n", + "\n", + "Each line is a space-delimited list of words. The first word in the list sounds identical to all the remaining words in the list. This is why you see repeats - `ad` sounds like `add` but also `add` sounds like `ad`. This repetition isn't necessary, as we could do it programmatically, but is convenient.\n", + "\n", + "Let's go ahead and load this list:" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "import Control.Applicative ((<$>))\n", + "import Data.List.Utils (split)\n", + "\n", + "removeEmpty = filter (not . null)\n", + "homophones <- removeEmpty . map words . lines <$> readFile \"homophones.list\"" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's take a look at a few more of these homophones." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ { - "cell_type": "markdown", + "data": { + "text/plain": [ + "adieu\tado\n", + "ado\tadieu\n", + "affect\teffect\n", + "aid\taide\n", + "aide\taid\n", + "ail\tale\n", + "air\terr\their\n", + "airs\terrs\theirs\n", + "aisle\tisle\n", + "ale\tail" + ] + }, "metadata": {}, - "source": [ - "A few days ago, a friend of mine sent me [a fascinating problem](http://math.ucsd.edu/~justin/190hw.html). The problem goes like this:\n", - "\n", - "> The *homophony group* (of English) is the group with 26 generators `a`,`b`, `c`, and so on until `z` and one relation for every pair of English words which sound the same. Prove that the group is trivial!\n", - "\n", - "For example, consider the group elements **knight** and **night**. By the [cancellation laws](http://www.proofwiki.org/wiki/Cancellation_Laws), this implies that **k** must be the identity element. Recall that a trivial group is one which consists solely of its identity element, so our task is to show that each letter of the English alphabet is the identity element.\n", - "\n", - "Skipping all of the algebraic jargon, we want to show that if we set all homophones \"equal\" to one another, and do left cancellation, right cancellation, and substitution, we can show that all the English letters equal one.\n", - "\n", - "This is a fun exercise to do by hand, but I'd like to do it in Haskell. 
I've started by compiling a list of homophones in American English, starting with [this list](http://members.peak.org/~jeremy/dictionaryclassic/chapters/homophones.php) and removing all single letters (such as `j` being a homophone with `jay`) and all words with apostrophes and periods, as well as some less commonly used words.\n", - "\n", - "The contents of the file look like this:\n", - "```\n", - "ad add\n", - "add ad\n", - "arc ark\n", - "ark arc\n", - "...\n", - "```\n", - "\n", - "Each line is a space-delimited list of words. The first word in the list sounds identical to all the remaining words in the list. This is why you see repeats - `ad` sounds like `add` but also `add` sounds like `ad`. This repetition isn't necessary, as we could do it programmatically, but is convenient.\n", - "\n", - "Let's go ahead and load this list:" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "import Control.Applicative ((<$>))\n", - "import Data.List.Utils (split)\n", - "\n", - "removeEmpty = filter (not . null)\n", - "homophones <- removeEmpty . map words . lines <$> readFile \"homophones.list\"" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 1 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let's take a look at a few more of these homophones." - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "import Control.Monad (forM_)\n", - "import Data.List (intercalate)\n", - "\n", - "-- Show ten of the homophone sets\n", - "forM_ (take 10 homophones) $ \\ homs -> \n", - " putStrLn $ intercalate \"\\t\" homs" - ], - "language": "python", - "metadata": {}, - "outputs": [ - { - "metadata": {}, - "output_type": "display_data", - "text": [ - "adieu\tado\n", - "ado\tadieu\n", - "affect\teffect\n", - "aid\taide\n", - "aide\taid\n", - "ail\tale\n", - "air\terr\their\n", - "airs\terrs\theirs\n", - "aisle\tisle\n", - "ale\tail" - ] - } - ], - "prompt_number": 2 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Note that some of the sets have more than two elements, yet they are all on the same line.\n", - "\n", - "Let's convert this into a more usable format. We'll define a new type `WordPair` which represents a *single pair* of homophones, and convert this list into a list of `WordPair`s." - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "data WordPair = WordPair String String\n", - "\n", - "-- Convert a list of homophones into a list of word pairs.\n", - "-- Note that the wordpairs should only use the first of the \n", - "-- list as the first word, since there will be repeat sets. \n", - "-- For instance, the set [\"a\", \"b\", \"c\"] would only generate \n", - "-- word pairs [WordPair \"a\" \"b\", WordPair \"a\" \"c\"].\n", - "pairs :: [String] -> [WordPair]\n", - "pairs (str:strs) = map (WordPair str) strs\n", - "\n", - "-- All pairs of words we consider homophones.\n", - "wordPairs = concatMap pairs homophones" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 3 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Now that we have this data in a usable form, let's use it to derive relations. \n", - "\n", - "The initial relations we have are simply the set of word pairs. However, we can use two operations in order to derive more relations:\n", - "\n", - "- `reduce`: The reduction operation will be the application of left and right cancellation laws. 
If a relation has the same thing on the left of both sides, we can take it off; same for the right side. This generates a new, simpler relation.\n", - "- `substitute`: The substitution operation will be substituting identity relations in. For instance, if we've derived that `d` is the identity element, then we can remove `d` from all known relations to get new, simpler relations.\n", - "\n", - "In addition to each relation storing what strings it considers equal, we'd also like to be able to track what operations led to the creation of that word pair. So before defining a relation, let's define a history data type:" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "data History = Reduce String String\n", - " | Substitute Char" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 4 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Now, we'd like a relation to store all the transformations that were used to generate it, and also the two things it relates:" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "data Relation = Relation [History] String String" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 5 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Since `Relation` and `WordPair` are slightly different, let's convert all our `WordPair`s to `Relation`s. This gives us our initial set of relations, which we will use to derive all other relations." - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "toRelation :: WordPair -> Relation\n", - "toRelation (WordPair first second) = Relation [] first second\n", - "\n", - "initRelations = map toRelation wordPairs" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 6 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Eventually, we're going to iteratively improve these relations until we have proven that all letters equal the identity. First, though, let's define our two operators, starting with `reduce`.\n", - "\n", - "When we `reduce` a relation, we apply the right and left cancellation laws. 
If we have the equation\n", - "$$ab = ac$$\n", - "we can use the left cancellation law to reduce it to $b = c$; similarly, using the right cancellation law, we can reduce the equation \n", - "$$xa = ya$$\n", - "to just $x = y$.\n", - "\n", - "Our `reduce` operator repeats these steps until it can no longer do so, and then the resulting strings are the reduced relation.\n" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "reduce :: Relation -> Relation\n", - "reduce rel@(Relation hist first second)\n", - " | canReduce first second = go (first, second)\n", - " \n", - " -- Note that we also have to be careful with the history.\n", - " -- If the `reduce` does nothing, then we do not want to add\n", - " -- anything to the history of the relation.\n", - " | otherwise = rel\n", - " \n", - " where\n", - " -- A reduction can happen if both strings are non-zero\n", - " -- and share a common first or last letter.\n", - " canReduce first second =\n", - " not (null first) &&\n", - " not (null second) &&\n", - " (head first == head second ||\n", - " last first == last second)\n", - " \n", - " -- Modified history including this reduction.\n", - " hist' = Reduce first second : hist\n", - " \n", - " -- Base case: if we've reduced a word pair to an empty string \n", - " -- and something else, we're done, as that something else\n", - " -- is equivalent to the identity element.\n", - " go (\"\", word) = Relation hist' word \"\"\n", - " go (word, \"\") = Relation hist' word \"\" \n", - " \n", - " go (first, second)\n", - " -- Chop off the first element if they're equal.\n", - " | head first == head second\n", - " = go (tail first, tail second)\n", - " \n", - " -- Chop off the last element if they're equal.\n", - " | last first == last second\n", - " = go (init first, init second)\n", - " \n", - " -- If netiher first nor last element are equal,\n", - " -- we've simplified the relation down as much\n", - " -- as we can simplify it.\n", - " | otherwise =\n", - " Relation hist' first second" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 7 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "This looks pretty good. Next, let's define the `substitute` operator.\n", - "\n", - "The `substitute` operator removes a character from a relation. For instance, if we know that `d` is the identity, we can simplify the relation $$ad = dyd$$ to just $a = y$. \n", - "\n", - "Just like the `reduce` operator, we avoid modifying the `Relation`'s history if the `substitute` does nothing." - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "import Data.List.Utils (replace)\n", - "\n", - "-- Generate a new relation by removing characters we know to be \n", - "-- the identity. Make sure to update the history of the relation\n", - "-- with this substitution!\n", - "substitute :: Char -> Relation -> Relation\n", - "substitute char rel@(Relation hist first second)\n", - " | canSubstitute first second\n", - " = Relation (Substitute char : hist) (replaced first) (replaced second)\n", - " \n", - " | otherwise = rel\n", - " where\n", - " canSubstitute first second = char `elem` first || char `elem` second\n", - " replaced = replace [char] \"\"" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 8 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "With `substitute` implemented, we've finished all the machinery we're going to use for simplifying our relations. 
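As a quick check of `substitute` (a hypothetical evaluation, not an actual cell from the notebook): once `e` is known to be the identity,

```haskell
-- 'e' occurs in "aide", so the substitution fires and is recorded:
substitute 'e' (Relation [] "aide" "aid")
--  ==> Relation [Substitute 'e'] "aid" "aid"
-- A subsequent 'reduce' then cancels the two equal strings away entirely.
```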
We're going to iteratively reduce and substitute until we've found that all the English letters are the identity element of the homophony group. We're still missing one thing, though - how do we know which letters we've proven to be the identity?\n", - "\n", - "Let's define a quick helper datatype for every identity we find. We're going to store the character that we've proven is the identity, as well as the history; that way, when we want to examine the results, we can see exactly how each letter was reduced to the identity." - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "data FoundIdent = FoundIdent {\n", - " char :: Char,\n", - " hist :: [History]\n", - " }" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 9 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let's also define a function that extracts all the identity elements from a set of relations." - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "-- mapMaybe = map fromJust . filter isJust . map\n", - "import Data.Maybe (mapMaybe)\n", - "\n", - "identities :: [Relation] -> [FoundIdent]\n", - "identities = mapMaybe go\n", - " where\n", - " go :: Relation -> Maybe FoundIdent\n", - " go (Relation hist [char] \"\") = Just $ FoundIdent char hist\n", - " go (Relation hist \"\" [char]) = Just $ FoundIdent char hist\n", - " go _ = Nothing" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 10 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let's finally put all of this together. We're going to start with our initial set of relations, `initRelations`, and then we're going to iteratively simplify them. Initially, we have no known identity elements.\n", - "\n", - "In each iteration, we\n", - "\n", - "- Substitute into each relation each known identity (replacing it with the empty string).\n", - "- Reduce the resulting relations.\n", - "- Collect all known identity elements." - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "import Data.List (nubBy)\n", - "import Data.Function (on)\n", - "\n", - "-- The iteration starts with a list of known identity elements\n", - "-- and the current set of relations. It outputs the updated \n", - "-- relations and all known identity elements.\n", - "iteration :: ([FoundIdent], [Relation]) -> ([FoundIdent], [Relation])\n", - "iteration (idents, relations) = (newIdents, newRelations)\n", - " where\n", - " -- Collect all the substitutions into a single function.\n", - " substitutions = foldl (.) id $ map (substitute . char) idents\n", - " \n", - " -- Do all substitutions, then reduce (for each relation).\n", - " newRelations = map (reduce . substitutions) relations\n", - "\n", - " -- We have to remove duplicate identity elements, because\n", - " -- in each iteration we find multiple ways to prove that some\n", - " -- letters are the identity element. We just want one.\n", - " removeDuplicateIdents =\n", - " nubBy ((==) `on` char)\n", - "\n", - " -- Find all identities in the new relations.\n", - " newIdents = removeDuplicateIdents $ idents ++ identities newRelations" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 11 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let's iterate this process until we have all the identities we want. We want 26 of them, so we can just check the length. 
(If this operation never finishes, we're out of luck!)" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "-- Generate the infinite list of iterations and their results.\n", - "initIdents = []\n", - "iterations = iterate iteration (initIdents, initRelations)\n", - "\n", - "-- Define a completion condition.\n", - "-- We're done when there are 26 known identity elements.\n", - "done (idents, _) = length idents == 26\n", - "\n", - "-- Discard all iteration results until completion.\n", - "-- Take the next one - the first one where the condition is met.\n", - "result = head $ dropWhile (not . done) iterations" - ], - "language": "python", - "metadata": {}, - "outputs": [], - "prompt_number": 12 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Woohoo! We're *done*! Let's take a look at the results!" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "import Data.List (sort)\n", - "\n", - "idents = fst result\n", - "identChars = map char idents\n", - "putStrLn $ sort identChars\n", - "print $ length identChars" - ], - "language": "python", - "metadata": {}, - "outputs": [ - { - "metadata": {}, - "output_type": "display_data", - "text": [ - "abcdefghijklmnopqrstuvwxyz" - ] - }, - { - "metadata": {}, - "output_type": "display_data", - "text": [ - "26" - ] - } - ], - "prompt_number": 13 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Looks like we do indeed have every single letter mapped to the identity. \n", - "\n", - "Let's see if we can deduce, for each letter, how it was mapped to the identity. Instead of doing it in alphabetical order, we'll look at them in the order they were deduced, so it follows some logical flow." - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "import Text.Printf (printf)\n", - "\n", - "forM_ idents $ \\(FoundIdent char hist) -> do\n", - " printf \"Proving %c = 1:\\n\" char\n", - " forM_ (reverse hist) $ \\op ->\n", - " putStrLn $ case op of\n", - " Reduce first second -> \n", - " printf \"Reduce %s and %s\" first second\n", - " Substitute ch ->\n", - " printf \"Substitute %c for ''\" ch\n", - " putStr \"\\n\"" - ], - "language": "python", - "metadata": {}, - "outputs": [ - { - "metadata": {}, - "output_type": "display_data", - "text": [ - "Proving e = 1:\n", - "Reduce aid and aide\n", - "\n", - "Proving a = 1:\n", - "Reduce aisle and isle\n", - "\n", - "Proving u = 1:\n", - "Reduce ant and aunt\n", - "\n", - "Proving t = 1:\n", - "Reduce but and butt\n", - "\n", - "Proving n = 1:\n", - "Reduce cannon and canon\n", - "\n", - "Proving s = 1:\n", - "Reduce cent and scent\n", - "\n", - "Proving h = 1:\n", - "Reduce choral and coral\n", - "\n", - "Proving k = 1:\n", - "Reduce doc and dock\n", - "\n", - "Proving l = 1:\n", - "Reduce filet and fillet\n", - "\n", - "Proving w = 1:\n", - "Reduce hole and whole\n", - "\n", - "Proving b = 1:\n", - "Reduce plum and plumb\n", - "\n", - "Proving g = 1:\n", - "Reduce reign and rein\n", - "\n", - "Proving c = 1:\n", - "Reduce scent and sent\n", - "\n", - "Proving o = 1:\n", - "Reduce to and too\n", - "\n", - "Proving i = 1:\n", - "Reduce waive and wave\n", - "\n", - "Proving r = 1:\n", - "Reduce air and err\n", - "Substitute i for ''\n", - "Substitute a for ''\n", - "Substitute e for ''\n", - "\n", - "Proving d = 1:\n", - "Reduce awed and odd\n", - "Substitute o for ''\n", - "Substitute w for ''\n", - "Substitute a for ''\n", - "Substitute e for ''\n", - "\n", - "Proving y = 1:\n", - "Reduce bite and byte\n", - 
"Substitute i for ''\n", - "\n", - "Proving z = 1:\n", - "Reduce boos and booze\n", - "Substitute s for ''\n", - "Substitute e for ''\n", - "\n", - "Proving q = 1:\n", - "Reduce cask and casque\n", - "Substitute k for ''\n", - "Substitute u for ''\n", - "Substitute e for ''\n", - "\n", - "Proving x = 1:\n", - "Reduce coax and cokes\n", - "Substitute k for ''\n", - "Substitute s for ''\n", - "Substitute a for ''\n", - "Substitute e for ''\n", - "\n", - "Proving p = 1:\n", - "Reduce coo and coup\n", - "Substitute o for ''\n", - "Substitute u for ''\n", - "\n", - "Proving f = 1:\n", - "Reduce draft and draught\n", - "Substitute g for ''\n", - "Substitute h for ''\n", - "Substitute u for ''\n", - "\n", - "Proving m = 1:\n", - "Reduce damned and dammed\n", - "Substitute n for ''\n", - "\n", - "Proving j = 1:\n", - "Reduce genes and jeans\n", - "Substitute g for ''\n", - "Substitute n for ''\n", - "Substitute a for ''\n", - "Substitute e for ''\n", - "\n", - "Proving v = 1:\n", - "Reduce felt and veldt\n", - "Substitute l for ''\n", - "Substitute e for ''\n", - "Substitute f for ''\n", - "Substitute d for ''" - ] - } - ], - "prompt_number": 14 - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "If you scan through the list above, there's a few weird cases, but for the most part, it seems legitimate. (I mildly question `felt` and `veldt`, but it depends on how you pronounce things. If you look at the British English list of homophones, it's totally different anyways!) \n", - "\n", - "So that's that! We've found the ways to reduce every letter to the identity, and shown how to do it.\n", - "\n", - "I wonder if other languages also have trivial homophony groups. It might be fun to try Spanish, French, Russian, and others, and see if the homophony groups tell us anything interesting about the language!\n", - "\n", - "**This work was done in [IHaskell](https://github.com/gibiansky/IHaskell), and what you're reading is the IHaskell notebook exported to HTML for viewing in the browser.**" - ] + "output_type": "display_data" } ], - "metadata": {} + "source": [ + "import Control.Monad (forM_)\n", + "import Data.List (intercalate)\n", + "\n", + "-- Show ten of the homophone sets\n", + "forM_ (take 10 homophones) $ \\ homs -> \n", + " putStrLn $ intercalate \"\\t\" homs" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that some of the sets have more than two elements, yet they are all on the same line.\n", + "\n", + "Let's convert this into a more usable format. We'll define a new type `WordPair` which represents a *single pair* of homophones, and convert this list into a list of `WordPair`s." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "data WordPair = WordPair String String\n", + "\n", + "-- Convert a list of homophones into a list of word pairs.\n", + "-- Note that the wordpairs should only use the first of the \n", + "-- list as the first word, since there will be repeat sets. \n", + "-- For instance, the set [\"a\", \"b\", \"c\"] would only generate \n", + "-- word pairs [WordPair \"a\" \"b\", WordPair \"a\" \"c\"].\n", + "pairs :: [String] -> [WordPair]\n", + "pairs (str:strs) = map (WordPair str) strs\n", + "\n", + "-- All pairs of words we consider homophones.\n", + "wordPairs = concatMap pairs homophones" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that we have this data in a usable form, let's use it to derive relations. 
\n", + "\n", + "The initial relations we have are simply the set of word pairs. However, we can use two operations in order to derive more relations:\n", + "\n", + "- `reduce`: The reduction operation will be the application of left and right cancellation laws. If a relation has the same thing on the left of both sides, we can take it off; same for the right side. This generates a new, simpler relation.\n", + "- `substitute`: The substitution operation will be substituting identity relations in. For instance, if we've derived that `d` is the identity element, then we can remove `d` from all known relations to get new, simpler relations.\n", + "\n", + "In addition to each relation storing what strings it considers equal, we'd also like to be able to track what operations led to the creation of that word pair. So before defining a relation, let's define a history data type:" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "data History = Reduce String String\n", + " | Substitute Char" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, we'd like a relation to store all the transformations that were used to generate it, and also the two things it relates:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "data Relation = Relation [History] String String" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since `Relation` and `WordPair` are slightly different, let's convert all our `WordPair`s to `Relation`s. This gives us our initial set of relations, which we will use to derive all other relations." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "toRelation :: WordPair -> Relation\n", + "toRelation (WordPair first second) = Relation [] first second\n", + "\n", + "initRelations = map toRelation wordPairs" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Eventually, we're going to iteratively improve these relations until we have proven that all letters equal the identity. First, though, let's define our two operators, starting with `reduce`.\n", + "\n", + "When we `reduce` a relation, we apply the right and left cancellation laws. 
If we have the equation\n", + "$$ab = ac$$\n", + "we can use the left cancellation law to reduce it to $b = c$; similarly, using the right cancellation law, we can reduce the equation \n", + "$$xa = ya$$\n", + "to just $x = y$.\n", + "\n", + "Our `reduce` operator repeats these steps until it can no longer do so, and then the resulting strings are the reduced relation.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "reduce :: Relation -> Relation\n", + "reduce rel@(Relation hist first second)\n", + " | canReduce first second = go (first, second)\n", + " \n", + " -- Note that we also have to be careful with the history.\n", + " -- If the `reduce` does nothing, then we do not want to add\n", + " -- anything to the history of the relation.\n", + " | otherwise = rel\n", + " \n", + " where\n", + " -- A reduction can happen if both strings are non-zero\n", + " -- and share a common first or last letter.\n", + " canReduce first second =\n", + " not (null first) &&\n", + " not (null second) &&\n", + " (head first == head second ||\n", + " last first == last second)\n", + " \n", + " -- Modified history including this reduction.\n", + " hist' = Reduce first second : hist\n", + " \n", + " -- Base case: if we've reduced a word pair to an empty string \n", + " -- and something else, we're done, as that something else\n", + " -- is equivalent to the identity element.\n", + " go (\"\", word) = Relation hist' word \"\"\n", + " go (word, \"\") = Relation hist' word \"\" \n", + " \n", + " go (first, second)\n", + " -- Chop off the first element if they're equal.\n", + " | head first == head second\n", + " = go (tail first, tail second)\n", + " \n", + " -- Chop off the last element if they're equal.\n", + " | last first == last second\n", + " = go (init first, init second)\n", + " \n", + " -- If netiher first nor last element are equal,\n", + " -- we've simplified the relation down as much\n", + " -- as we can simplify it.\n", + " | otherwise =\n", + " Relation hist' first second" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This looks pretty good. Next, let's define the `substitute` operator.\n", + "\n", + "The `substitute` operator removes a character from a relation. For instance, if we know that `d` is the identity, we can simplify the relation $$ad = dyd$$ to just $a = y$. \n", + "\n", + "Just like the `reduce` operator, we avoid modifying the `Relation`'s history if the `substitute` does nothing." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "import Data.List.Utils (replace)\n", + "\n", + "-- Generate a new relation by removing characters we know to be \n", + "-- the identity. Make sure to update the history of the relation\n", + "-- with this substitution!\n", + "substitute :: Char -> Relation -> Relation\n", + "substitute char rel@(Relation hist first second)\n", + " | canSubstitute first second\n", + " = Relation (Substitute char : hist) (replaced first) (replaced second)\n", + " \n", + " | otherwise = rel\n", + " where\n", + " canSubstitute first second = char `elem` first || char `elem` second\n", + " replaced = replace [char] \"\"" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With `substitute` implemented, we've finished all the machinery we're going to use for simplifying our relations. 
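To see the operator in action before wiring everything together (a hypothetical evaluation): `reduce` alone already knocks out the motivating example from the introduction.

```haskell
-- The shared suffix "night" is cancelled from the right, letter by letter:
reduce (Relation [] "knight" "night")
--  ==> Relation [Reduce "knight" "night"] "k" ""
-- i.e. exactly the relation k = 1, which 'identities' below will pick up.
```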
We're going to iteratively reduce and substitute until we've found that all the English letters are the identity element of the homophony group. We're still missing one thing, though - how do we know which letters we've proven to be the identity?\n", + "\n", + "Let's define a quick helper datatype for every identity we find. We're going to store the character that we've proven is the identity, as well as the history; that way, when we want to examine the results, we can see exactly how each letter was reduced to the identity." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "data FoundIdent = FoundIdent {\n", + " char :: Char,\n", + " hist :: [History]\n", + " }" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's also define a function that extracts all the identity elements from a set of relations." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "-- mapMaybe = map fromJust . filter isJust . map\n", + "import Data.Maybe (mapMaybe)\n", + "\n", + "identities :: [Relation] -> [FoundIdent]\n", + "identities = mapMaybe go\n", + " where\n", + " go :: Relation -> Maybe FoundIdent\n", + " go (Relation hist [char] \"\") = Just $ FoundIdent char hist\n", + " go (Relation hist \"\" [char]) = Just $ FoundIdent char hist\n", + " go _ = Nothing" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's finally put all of this together. We're going to start with our initial set of relations, `initRelations`, and then we're going to iteratively simplify them. Initially, we have no known identity elements.\n", + "\n", + "In each iteration, we\n", + "\n", + "- Substitute into each relation each known identity (replacing it with the empty string).\n", + "- Reduce the resulting relations.\n", + "- Collect all known identity elements." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "import Data.List (nubBy)\n", + "import Data.Function (on)\n", + "\n", + "-- The iteration starts with a list of known identity elements\n", + "-- and the current set of relations. It outputs the updated \n", + "-- relations and all known identity elements.\n", + "iteration :: ([FoundIdent], [Relation]) -> ([FoundIdent], [Relation])\n", + "iteration (idents, relations) = (newIdents, newRelations)\n", + " where\n", + " -- Collect all the substitutions into a single function.\n", + " substitutions = foldl (.) id $ map (substitute . char) idents\n", + " \n", + " -- Do all substitutions, then reduce (for each relation).\n", + " newRelations = map (reduce . substitutions) relations\n", + "\n", + " -- We have to remove duplicate identity elements, because\n", + " -- in each iteration we find multiple ways to prove that some\n", + " -- letters are the identity element. We just want one.\n", + " removeDuplicateIdents =\n", + " nubBy ((==) `on` char)\n", + "\n", + " -- Find all identities in the new relations.\n", + " newIdents = removeDuplicateIdents $ idents ++ identities newRelations" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's iterate this process until we have all the identities we want. We want 26 of them, so we can just check the length. 
(If this operation never finishes, we're out of luck!)" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "-- Generate the infinite list of iterations and their results.\n", + "initIdents = []\n", + "iterations = iterate iteration (initIdents, initRelations)\n", + "\n", + "-- Define a completion condition.\n", + "-- We're done when there are 26 known identity elements.\n", + "done (idents, _) = length idents == 26\n", + "\n", + "-- Discard all iteration results until completion.\n", + "-- Take the next one - the first one where the condition is met.\n", + "result = head $ dropWhile (not . done) iterations" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Woohoo! We're *done*! Let's take a look at the results!" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "abcdefghijklmnopqrstuvwxyz" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/plain": [ + "26" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "import Data.List (sort)\n", + "\n", + "idents = fst result\n", + "identChars = map char idents\n", + "putStrLn $ sort identChars\n", + "print $ length identChars" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Looks like we do indeed have every single letter mapped to the identity. \n", + "\n", + "Let's see if we can deduce, for each letter, how it was mapped to the identity. Instead of doing it in alphabetical order, we'll look at them in the order they were deduced, so it follows some logical flow." + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "Proving e = 1:\n", + "Reduce aid and aide\n", + "\n", + "Proving a = 1:\n", + "Reduce aisle and isle\n", + "\n", + "Proving u = 1:\n", + "Reduce ant and aunt\n", + "\n", + "Proving t = 1:\n", + "Reduce but and butt\n", + "\n", + "Proving n = 1:\n", + "Reduce cannon and canon\n", + "\n", + "Proving s = 1:\n", + "Reduce cent and scent\n", + "\n", + "Proving h = 1:\n", + "Reduce choral and coral\n", + "\n", + "Proving k = 1:\n", + "Reduce doc and dock\n", + "\n", + "Proving l = 1:\n", + "Reduce filet and fillet\n", + "\n", + "Proving w = 1:\n", + "Reduce hole and whole\n", + "\n", + "Proving b = 1:\n", + "Reduce plum and plumb\n", + "\n", + "Proving g = 1:\n", + "Reduce reign and rein\n", + "\n", + "Proving c = 1:\n", + "Reduce scent and sent\n", + "\n", + "Proving o = 1:\n", + "Reduce to and too\n", + "\n", + "Proving i = 1:\n", + "Reduce waive and wave\n", + "\n", + "Proving r = 1:\n", + "Reduce air and err\n", + "Substitute i for ''\n", + "Substitute a for ''\n", + "Substitute e for ''\n", + "\n", + "Proving d = 1:\n", + "Reduce awed and odd\n", + "Substitute o for ''\n", + "Substitute w for ''\n", + "Substitute a for ''\n", + "Substitute e for ''\n", + "\n", + "Proving y = 1:\n", + "Reduce bite and byte\n", + "Substitute i for ''\n", + "\n", + "Proving z = 1:\n", + "Reduce boos and booze\n", + "Substitute s for ''\n", + "Substitute e for ''\n", + "\n", + "Proving q = 1:\n", + "Reduce cask and casque\n", + "Substitute k for ''\n", + "Substitute u for ''\n", + "Substitute e for ''\n", + "\n", + "Proving x = 1:\n", + "Reduce coax and cokes\n", + "Substitute k for ''\n", + "Substitute s for ''\n", + "Substitute a for ''\n", + 
"Substitute e for ''\n", + "\n", + "Proving p = 1:\n", + "Reduce coo and coup\n", + "Substitute o for ''\n", + "Substitute u for ''\n", + "\n", + "Proving f = 1:\n", + "Reduce draft and draught\n", + "Substitute g for ''\n", + "Substitute h for ''\n", + "Substitute u for ''\n", + "\n", + "Proving m = 1:\n", + "Reduce damned and dammed\n", + "Substitute n for ''\n", + "\n", + "Proving j = 1:\n", + "Reduce genes and jeans\n", + "Substitute g for ''\n", + "Substitute n for ''\n", + "Substitute a for ''\n", + "Substitute e for ''\n", + "\n", + "Proving v = 1:\n", + "Reduce felt and veldt\n", + "Substitute l for ''\n", + "Substitute e for ''\n", + "Substitute f for ''\n", + "Substitute d for ''" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "import Text.Printf (printf)\n", + "\n", + "forM_ idents $ \\(FoundIdent char hist) -> do\n", + " printf \"Proving %c = 1:\\n\" char\n", + " forM_ (reverse hist) $ \\op ->\n", + " putStrLn $ case op of\n", + " Reduce first second -> \n", + " printf \"Reduce %s and %s\" first second\n", + " Substitute ch ->\n", + " printf \"Substitute %c for ''\" ch\n", + " putStr \"\\n\"" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If you scan through the list above, there's a few weird cases, but for the most part, it seems legitimate. (I mildly question `felt` and `veldt`, but it depends on how you pronounce things. If you look at the British English list of homophones, it's totally different anyways!) \n", + "\n", + "So that's that! We've found the ways to reduce every letter to the identity, and shown how to do it.\n", + "\n", + "I wonder if other languages also have trivial homophony groups. It might be fun to try Spanish, French, Russian, and others, and see if the homophony groups tell us anything interesting about the language!\n", + "\n", + "**This work was done in [IHaskell](https://github.com/gibiansky/IHaskell), and what you're reading is the IHaskell notebook exported to HTML for viewing in the browser.**" + ] } - ] -} \ No newline at end of file + ], + "metadata": { + "kernelspec": { + "display_name": "Haskell", + "language": "haskell", + "name": "haskell" + }, + "language_info": { + "name": "haskell", + "version": "7.8.3" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/notebooks/Static Canvas IHaskell Display.ipynb b/notebooks/Static Canvas IHaskell Display.ipynb index 7545ab46..f5ca35df 100644 --- a/notebooks/Static Canvas IHaskell Display.ipynb +++ b/notebooks/Static Canvas IHaskell Display.ipynb @@ -1,316 +1,314 @@ { - "metadata": { - "language": "haskell", - "name": "", - "signature": "sha256:95812c4aac52ed0e9f86f92d457f4647bf9adb83e02b30f67ce5a9362a823c07" - }, - "nbformat": 3, - "nbformat_minor": 0, - "worksheets": [ + "cells": [ { - "cells": [ + "cell_type": "markdown", + "metadata": { + "hidden": false + }, + "source": [ + "Recently, Jeffrey Rosenbluth published (and showcased [on Reddit](http://www.reddit.com/r/haskell/comments/2vpf0t/announcing_staticcanvas_write_html5_canvas_in/)) a pretty cool Haskell package called [static-canvas](https://hackage.haskell.org/package/static-canvas). This package uses the free monad DSL pattern to make a DSL for programming for HTML5 `canvas`, restricted to fairly simple static use cases. While you can't use this to make user interfaces, it's still potentially a pretty cool tool, and there's a few very clear examples on the [GitHub readme](https://github.com/jeffreyrosenbluth/static-canvas)." 
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "hidden": false
+ },
+ "source": [
+ "As with most things involving pretty graphics or pictures, I think this would be a whole ton of fun to experiment with interactively, making it a great fit for [IHaskell](http://www.github.com/gibiansky/IHaskell), an interactive notebook-based environment for Haskell.\n",
+ "\n",
+ "IHaskell allows the creation of \"addon\" packages to specify how to display various data types in its browser-based UI. These addons can render data types as text, as images, or as HTML mixed with Javascript; they can even render them as interactive Javascript widgets that evaluate Haskell code at will. All of this is done without GHCJS or similar Haskell-to-Javascript compilation tools.\n",
+ "\n",
+ "However, these display packages have mostly been written by only a few people, those fairly closely involved with IHaskell development. As the creator of IHaskell, I'd love to have more of these packages, but I obviously can't create display instances for all existing packages, and certainly can't anticipate what people might want for their own packages or new ones. Thus, I'd love to use this very neat library as a showcase and tutorial for how to make IHaskell display packages."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "hidden": false
+ },
+ "source": [
+ "## The Tools\n",
+ "In this section, I'll very briefly introduce you to the tools IHaskell provides for creating display packages. If you'd like to get to the real meat of this tutorial, skip this, read the next section, and maybe come back here if you need to.\n",
+ "\n",
+ "IHaskell internally uses a data type called `Display` to represent possible outputs. The `Display` data type looks like this:\n",
+ "\n",
+ "```haskell\n",
+ "-- In IHaskell.Display\n",
+ "data Display = Display [DisplayData] -- Display just one thing.\n",
+ "             | ManyDisplay [Display] -- Display several things.\n",
+ "```\n",
+ "In turn, the `DisplayData` data type from the `ipython-kernel` package specifies how to actually display the object in the browser:\n",
+ "```haskell\n",
+ "-- In IHaskell.IPython.Display\n",
+ "data DisplayData = DisplayData MimeType Text\n",
+ "\n",
+ "-- All the possible ways to display things.\n",
+ "data MimeType = PlainText\n",
+ "              | MimeHtml\n",
+ "              | MimePng Width Height -- Base64 encoded.\n",
+ "              | MimeJpg Width Height -- Base64 encoded.\n",
+ "              | MimeSvg\n",
+ "              | MimeLatex\n",
+ "              | MimeJavascript\n",
+ "```\n",
+ "\n",
+ "For example, to output the string \"Hello\" in red in the browser, you might construct a value like this:\n",
+ "```haskell\n",
+ "redStr :: Display\n",
+ "redStr = Display [textDisplay, htmlDisplay]\n",
+ "\n",
+ "textDisplay :: DisplayData\n",
+ "textDisplay = DisplayData PlainText \"Hello\"\n",
+ "\n",
+ "htmlDisplay :: DisplayData\n",
+ "htmlDisplay = DisplayData MimeHtml \"<span style='color: red'>Hello</span>\"\n",
+ "```\n",
+ "\n",
+ "You may note that `Display` takes a *list* of `DisplayData` values; this allows IHaskell to choose the proper display mechanism for the frontend. The frontend can be a console or the in-browser notebook, and the in-browser notebook may have different preferences for displays, so by providing different ways to render output, the best possible rendering can be chosen for each interface.\n",
+ "\n",
+ "Instead of constructing these data types by hand, you can use the convenience functions that `IHaskell.Display` exports:\n",
+ "```haskell\n",
+ "-- Construct displays from raw strings of different types.\n",
+ "plain :: String -> DisplayData\n",
+ "html :: String -> DisplayData\n",
+ "svg :: String -> DisplayData\n",
+ "latex :: String -> DisplayData\n",
+ "javascript :: String -> DisplayData\n",
+ "\n",
+ "-- Encode into base 64.\n",
+ "encode64 :: String -> Base64\n",
+ "decode64 :: ByteString -> Base64\n",
+ "\n",
+ "-- Display images.\n",
+ "png :: Int -> Int -> Base64 -> DisplayData\n",
+ "jpg :: Int -> Int -> Base64 -> DisplayData\n",
+ "\n",
+ "-- Create final Displays.\n",
+ "Display :: [DisplayData] -> Display\n",
+ "many :: [Display] -> Display\n",
+ "```\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "hidden": false
+ },
+ "source": [
+ "## Creating a Display\n",
+ "\n",
+ "In order to create a display for some data type, we must first import the main IHaskell display module:\n",
+ "```haskell\n",
+ "import IHaskell.Display\n",
+ "```\n",
+ "This module exports the following typeclass:\n",
+ "```haskell\n",
+ "class IHaskellDisplay a where\n",
+ "  display :: a -> IO Display\n",
+ "```\n",
+ "\n",
+ "In order to display a data type, create an instance of `IHaskellDisplay` for your data type – then, any expression that results in your data type will generate a corresponding display.\n",
+ "\n",
+ "Let's go ahead and do this for `CanvasFree ()` from the `static-canvas` package."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 12,
+ "metadata": {
+ "collapsed": false,
+ "hidden": false
+ },
+ "outputs": [],
+ "source": [
+ "-- Start with necessary imports.\n",
+ "import IHaskell.Display -- From the 'ihaskell' package.\n",
+ "import IHaskell.IPython.Types(MimeType(..))\n",
+ "import Graphics.Static -- From the 'static-canvas' package.\n",
+ "\n",
+ "-- Text conversion functions.\n",
+ "import Data.Text.Lazy.Builder(toLazyText)\n",
+ "import Data.Text.Lazy(toStrict)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "hidden": false
+ },
+ "source": [
+ "Now that we have the imports out of the way, we can define the core instance necessary:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 24,
+ "metadata": {
+ "collapsed": false,
+ "hidden": false
+ },
+ "outputs": [],
+ "source": [
+ "-- Since CanvasFree is a type synonym, we need a language pragma.\n",
+ "{-# LANGUAGE TypeSynonymInstances #-}\n",
+ "\n",
+ "instance IHaskellDisplay (CanvasFree ()) where\n",
+ "  -- display :: CanvasFree () -> IO Display\n",
+ "  display canvas = return $\n",
+ "    let src = toStrict $ toLazyText $ buildScript width height canvas\n",
+ "    in Display [DisplayData MimeHtml src]\n",
+ "\n",
+ "    where (width, height) = (600, 200)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "hidden": false
+ },
+ "source": [
+ "We can now copy and paste the examples from the `static-canvas` Github page, and see them appear right in the notebook!\n",
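+ "\n",
+ "(A quick aside: using the `html` convenience function from the \"Tools\" section, the same instance can be written a bit more compactly. This is only a sketch of the alternative, relying on `html :: String -> DisplayData` as listed above:)\n",
+ "\n",
+ "```haskell\n",
+ "import Data.Text (unpack)\n",
+ "\n",
+ "instance IHaskellDisplay (CanvasFree ()) where\n",
+ "  -- Same rendering as before, but routed through the `html` helper.\n",
+ "  display canvas = return $ Display [html (unpack src)]\n",
+ "    where src = toStrict $ toLazyText $ buildScript 600 200 canvas\n",
+ "```\n",
+ "\n",
+ "(The `unpack` is needed only because `html` takes a `String`, while the builder pipeline produces strict `Text`.)"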
+ ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": { + "collapsed": false, + "hidden": false + }, + "outputs": [ { - "cell_type": "markdown", - "metadata": { - "hidden": false + "data": { + "text/html": [ + "" + ] }, - "source": [ - "Recently, Jeffrey Rosenbluth published (and showcased [on Reddit](http://www.reddit.com/r/haskell/comments/2vpf0t/announcing_staticcanvas_write_html5_canvas_in/)) a pretty cool Haskell package called [static-canvas](https://hackage.haskell.org/package/static-canvas). This package uses the free monad DSL pattern to make a DSL for programming for HTML5 `canvas`, restricted to fairly simple static use cases. While you can't use this to make user interfaces, it's still potentially a pretty cool tool, and there's a few very clear examples on the [GitHub readme](https://github.com/jeffreyrosenbluth/static-canvas)." - ] - }, - { - "cell_type": "markdown", - "metadata": { - "hidden": false - }, - "source": [ - "As with most things involving pretty graphics or pictures, I think this would be a whole ton of fun to experiment with interactively, making it a great fit for [IHaskell](http://www.github.com/gibiansky/IHaskell), an interactive notebook-based environment for Haskell.\n", - "\n", - "IHaskell allows the creation of \"addon\" packages to specify how to display various data types in its browser-based UI. These addons can render data types as text, as images, or even as HTML mixed with Javascript; they can even render them as interactive Javascript widgets that can evaluate Haskell code at will. All of this is done without GHCJS or similar Haskell-to-Javascript compilation tools.\n", - "\n", - "However, these display packages have mostly been written by only a few people, those fairly closely involved with IHaskell development. As the creator of IHaskell, I'd love to have more of these packages, but I obviously can't create display instances for all existing packages, and certainly can't anticipate what people might want for their own packages or new ones. Thus, I'd love to use this very neat library as a showcase and tutorial for how to make IHaskell display packages." - ] - }, - { - "cell_type": "markdown", - "metadata": { - "hidden": false - }, - "source": [ - "## The Tools\n", - "In this section, I'll very briefly introduce you to the tools IHaskell provides for creating IHaskell display packages. If you'd like to get to the real meat of this tutorial, skip this, read the next section, and maybe come back here if you need to.\n", - "\n", - "IHaskell internally uses a data type called `Display` to represent possible outputs. 
The `Display` data types looks like this:\n", - "\n", - "```haskell\n", - "-- In IHaskell.Display\n", - "data Display = Display [DisplayData] -- Display just one thing.\n", - " | ManyDisplay [Display] -- Display several things.\n", - "```\n", - "In turn, the `DisplayData` data type from the `ipython-kernel` package specifies how to actually display the object in the browser:\n", - "```haskell\n", - "-- In IHaskell.IPython.Display\n", - "data DisplayData = DisplayData MimeType Text\n", - "\n", - "-- All the possible ways to display things.\n", - "data MimeType = PlainText\n", - " | MimeHtml\n", - " | MimePng Width Height -- Base64 encoded.\n", - " | MimeJpg Width Height -- Base64 encoded.\n", - " | MimeSvg\n", - " | MimeLatex\n", - " | MimeJavascript\n", - "```\n", - "\n", - "For example, to output the string \"Hello\" in red in the browser, you might construct a value like this:\n", - "```haskell\n", - "redStr :: Display\n", - "redStr = Display [textDisplay, htmlDisplay]\n", - "\n", - "textDisplay :: DisplayData\n", - "textDisplay = DisplayData PlainText \"Hello\"\n", - "\n", - "htmlDisplay :: DisplayData\n", - "htmlDisplay = DisplayData MimeHtml \"Hello\"\n", - "```\n", - "\n", - "You may note that `Display` takes a *list* of `DisplayData` values; this allows IHaskell to choose the proper display mechanism for the frontend. The frontend can be a console or the in-browser notebook, and the in-browser notebook may have different preferences for displays, so by providing different ways to render output, the best possible rendering can be chosen for each interface.\n", - "\n", - "Instead of always using the data types, `IHaskell.Display` exports the following convenience functions:\n", - "```haskell\n", - "-- Construct displays from raw strings of different types.\n", - "plain :: String -> DisplayData\n", - "html :: String -> DisplayData\n", - "svg :: String -> DisplayData\n", - "latex :: String -> DisplayData\n", - "javascript :: String -> DisplayData\n", - "\n", - "-- Encode into base 64.\n", - "encode64 :: String -> Base64\n", - "decode64 :: ByteString -> Base64\n", - "\n", - "-- Display images.\n", - "png :: Int -> Int -> Base64 -> DisplayData\n", - "jpg :: Int -> Int -> Base64 -> DisplayData\n", - "\n", - "-- Create final Displays.\n", - "Display :: [DisplayData] -> Display\n", - "many :: [Display] -> Display\n", - "```\n" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "hidden": false - }, - "source": [ - "## Creating a Display\n", - "\n", - "In order to create a display for some data type, we must first import the main IHaskell display module:\n", - "```haskell\n", - "import IHaskell.Display\n", - "```\n", - "This package contains the following typeclass:\n", - "```haskell\n", - "class IHaskellDisplay a where\n", - " display :: a -> IO Display\n", - "```\n", - "\n", - "In order to display a data type, create an instance of `IHaskellDisplay` for your data type \u2013 then, any expression that results in your data type will generate a corresponding display. \n", - "\n", - "Let's go ahead and do this for `CanvasFree a` from the `static-canvas` package." 
- ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "-- Start with necessary imports.\n", - "import IHaskell.Display -- From the 'ihaskell' package.\n", - "import IHaskell.IPython.Types(MimeType(..))\n", - "import Graphics.Static -- From the 'static-canvas' package.\n", - "\n", - "-- Text conversion functions.\n", - "import Data.Text.Lazy.Builder(toLazyText)\n", - "import Data.Text.Lazy(toStrict)" - ], - "language": "python", - "metadata": { - "hidden": false - }, - "outputs": [], - "prompt_number": 12 - }, - { - "cell_type": "markdown", - "metadata": { - "hidden": false - }, - "source": [ - "Now that we have the imports out of the way, we can define the core instance necessary:" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "-- Since CanvasFree is a type synonym, we need a language pragma.\n", - "{-# LANGUAGE TypeSynonymInstances #-}\n", - "\n", - "instance IHaskellDisplay (CanvasFree ()) where\n", - " -- display :: CanvasFree () -> IO Display\n", - " display canvas = return $\n", - " let src = toStrict $ toLazyText $ buildScript width height canvas\n", - " in Display [DisplayData MimeHtml src]\n", - " \n", - " where (height, width) = (200, 600)" - ], - "language": "python", - "metadata": { - "hidden": false - }, - "outputs": [], - "prompt_number": 24 - }, - { - "cell_type": "markdown", - "metadata": { - "hidden": false - }, - "source": [ - "We can now copy and paste the examples from the `static-canvas` Github page, and see them appear right in the notebook!" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "{-# LANGUAGE OverloadedStrings #-}\n", - "import Graphics.Static.ColorNames\n", - "\n", - "text :: CanvasFree ()\n", - "text = do\n", - " font \"italic 60pt Calibri\"\n", - " lineWidth 6\n", - " strokeStyle blue\n", - " fillStyle goldenrod\n", - " textBaseline TextBaselineMiddle\n", - " strokeText \"Hello\" 150 100 \n", - " fillText \"Hello World!\" 150 100\n", - " \n", - "text" - ], - "language": "python", - "metadata": { - "hidden": false - }, - "outputs": [ - { - "html": [ - "" - ], - "metadata": {}, - "output_type": "display_data" - } - ], - "prompt_number": 34 - }, - { - "cell_type": "markdown", - "metadata": { - "hidden": false - }, - "source": [ - "As we play with this a little more, we see that this is a little bit unsatisfactory. Specifically, the width and the height of the resulting canvas are fixed in the `IHaskellDisplay` instance! 
I would solve this by creating a custom `Canvas` data type that stores these:" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "data Canvas = Canvas {\n", - " width :: Int,\n", - " height :: Int,\n", - " canvas :: CanvasFree ()\n", - " }" - ], - "language": "python", - "metadata": { - "hidden": false - }, - "outputs": [], - "prompt_number": 26 - }, - { - "cell_type": "markdown", - "metadata": { - "hidden": false - }, - "source": [ - "Then we could define an `IHaskellDisplay` that respects this width and height:" - ] - }, - { - "cell_type": "code", - "collapsed": false, - "input": [ - "{-# LANGUAGE TypeSynonymInstances #-}\n", - "instance IHaskellDisplay Canvas where\n", - " -- display :: Canvas -> IO Display\n", - " display cnv = return $\n", - " let src = toStrict $ toLazyText $ buildScript (width cnv) (height cnv) (canvas cnv)\n", - " in Display [DisplayData MimeHtml src]" - ], - "language": "python", - "metadata": { - "hidden": false - }, - "outputs": [], - "prompt_number": 27 - }, - { - "cell_type": "markdown", - "metadata": { - "hidden": false - }, - "source": [ - "Then when we use this we can specify how to display our canvases:\n", - "```haskell\n", - "Canvas 200 600 $ do\n", - " font \"italic 60pt Calibri\"\n", - " lineWidth 6\n", - " strokeStyle blue\n", - " fillStyle goldenrod\n", - " textBaseline TextBaselineMiddle\n", - " strokeText \"Hello\" 150 100 \n", - " fillText \"Hello World!\" 150 100\n", - "```\n", - "\n", - "Sadly, it seems that the `static-canvas` library currently only supports having *one* generated canvas on the page \u2013 if you try to add another one, it simply modifies the pre-existing one. This is probably a bug that should be fixed, though!" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "hidden": false - }, - "source": [ - "## Packaging IHaskell Display Addons\n", - "\n", - "Once you've made an IHaskell display instance, you can easily package it up and stick it on Hackage. Specifically, for a package named `package-name`, you should take everything before the `-`. Then, prepend `ihaskell-` to the package name. Finally, make sure there exists a module `IHaskell.Display.Package`, where `Package` is the first word in `package-name` capitalized. If this is done, then IHaskell will happily load your package and instance upon startup, making it very easy for your users to install the display addon!\n", - "\n", - "For example, the `hatex` library is exposed as an addon through the `ihaskell-hatex` display package and the `IHaskell.Display.Hatex` module in that package. The `juicypixels` library has an addon package called `ihaskell-juicypixels` with a module `IHaskell.Display.Juicypixels`. \n", - "\n", - "As I write this now, I realize that this protocol is a little bit weird. Specifically, I think that perhaps the rule that you take the first thing before the `-` is not too great, but rather that perhaps the `-` should be a word separator, and thus `package-name` would get translated to `ihaskell-package-name` and `IHaskell.Display.PackageName`. (We do need *some* standard!)\n", - "\n", - "If you have any opinions about this, or suggestions for how to improve this process, please let me know!\n", - "\n", - "Anyway, I hope that this brief tutorial / guide can show someone how to write small IHaskell addons. Perhaps someone will find this useful, and please get in touch if you have any questions, comments, or suggestions!" 
- ]
+ "metadata": {},
+ "output_type": "display_data"
 }
 ],
+ "source": [
+ "{-# LANGUAGE OverloadedStrings #-}\n",
+ "import Graphics.Static.ColorNames\n",
+ "\n",
+ "text :: CanvasFree ()\n",
+ "text = do\n",
+ "  font \"italic 60pt Calibri\"\n",
+ "  lineWidth 6\n",
+ "  strokeStyle blue\n",
+ "  fillStyle goldenrod\n",
+ "  textBaseline TextBaselineMiddle\n",
+ "  strokeText \"Hello\" 150 100\n",
+ "  fillText \"Hello World!\" 150 100\n",
+ "\n",
+ "text"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "hidden": false
+ },
+ "source": [
+ "As we play with this a little more, we see that this is a little bit unsatisfactory. Specifically, the width and the height of the resulting canvas are fixed in the `IHaskellDisplay` instance! I would solve this by creating a custom `Canvas` data type that stores these:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 26,
+ "metadata": {
+ "collapsed": false,
+ "hidden": false
+ },
+ "outputs": [],
+ "source": [
+ "data Canvas = Canvas {\n",
+ "    width :: Int,\n",
+ "    height :: Int,\n",
+ "    canvas :: CanvasFree ()\n",
+ "  }"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "hidden": false
+ },
+ "source": [
+ "Then we could define an `IHaskellDisplay` instance that respects this width and height:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 27,
+ "metadata": {
+ "collapsed": false,
+ "hidden": false
+ },
+ "outputs": [],
+ "source": [
+ "instance IHaskellDisplay Canvas where\n",
+ "  -- display :: Canvas -> IO Display\n",
+ "  display cnv = return $\n",
+ "    let src = toStrict $ toLazyText $ buildScript (width cnv) (height cnv) (canvas cnv)\n",
+ "    in Display [DisplayData MimeHtml src]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "hidden": false
+ },
+ "source": [
+ "Then when we use this, we can specify how each canvas should be displayed - here, 600 pixels wide and 200 tall:\n",
+ "```haskell\n",
+ "Canvas 600 200 $ do\n",
+ "  font \"italic 60pt Calibri\"\n",
+ "  lineWidth 6\n",
+ "  strokeStyle blue\n",
+ "  fillStyle goldenrod\n",
+ "  textBaseline TextBaselineMiddle\n",
+ "  strokeText \"Hello\" 150 100\n",
+ "  fillText \"Hello World!\" 150 100\n",
+ "```\n",
+ "\n",
+ "Sadly, it seems that the `static-canvas` library currently only supports having *one* generated canvas on the page – if you try to add another one, it simply modifies the pre-existing one. This is probably a bug that should be fixed, though!"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "hidden": false
+ },
+ "source": [
+ "## Packaging IHaskell Display Addons\n",
+ "\n",
+ "Once you've made an IHaskell display instance, you can easily package it up and stick it on Hackage. Specifically, for a package named `package-name`, you should take everything before the `-` (here, just `package`). Then, prepend `ihaskell-` to that, giving `ihaskell-package` as the addon package's name. Finally, make sure there exists a module `IHaskell.Display.Package`, where `Package` is the first word in `package-name` capitalized. If this is done, then IHaskell will happily load your package and instance upon startup, making it very easy for your users to install the display addon!\n",
+ "\n",
+ "For example, the `hatex` library is exposed as an addon through the `ihaskell-hatex` display package and the `IHaskell.Display.Hatex` module in that package. The `juicypixels` library has an addon package called `ihaskell-juicypixels` with a module `IHaskell.Display.Juicypixels`.\n",
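+ "\n",
+ "To make the convention concrete: by this rule, an addon for `static-canvas` itself would be a package named `ihaskell-static`, exposing a module `IHaskell.Display.Static`. Here is a hypothetical sketch of that module, which just packages up the instance we wrote interactively above:\n",
+ "\n",
+ "```haskell\n",
+ "{-# LANGUAGE TypeSynonymInstances, FlexibleInstances #-}\n",
+ "-- Sketch of a module for a hypothetical package named ihaskell-static.\n",
+ "module IHaskell.Display.Static () where\n",
+ "\n",
+ "import IHaskell.Display\n",
+ "import IHaskell.IPython.Types (MimeType(..))\n",
+ "import Graphics.Static\n",
+ "import Data.Text.Lazy.Builder (toLazyText)\n",
+ "import Data.Text.Lazy (toStrict)\n",
+ "\n",
+ "-- The module exports nothing; it exists only for its instance, so that\n",
+ "-- loading the package makes CanvasFree () values render automatically.\n",
+ "instance IHaskellDisplay (CanvasFree ()) where\n",
+ "  display canvas = return $\n",
+ "    let src = toStrict $ toLazyText $ buildScript 600 200 canvas\n",
+ "    in Display [DisplayData MimeHtml src]\n",
+ "```\n",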
+ "\n",
+ "As I write this now, I realize that this protocol is a little bit weird. Specifically, the rule of taking only the first word before the `-` is probably not ideal; perhaps the `-` should instead be treated as a word separator, so that `package-name` would get translated to `ihaskell-package-name` and `IHaskell.Display.PackageName`. (We do need *some* standard!)\n",
+ "\n",
+ "If you have any opinions about this, or suggestions for how to improve this process, please let me know!\n",
+ "\n",
+ "Anyway, I hope that this brief tutorial / guide can show someone how to write small IHaskell addons. Perhaps someone will find this useful, and please get in touch if you have any questions, comments, or suggestions!"
+ ]
 }
- ]
-}
\ No newline at end of file
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Haskell",
+ "language": "haskell",
+ "name": "haskell"
+ },
+ "language_info": {
+ "name": "haskell",
+ "version": "7.8.3"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 0
+}