# Category Archives: machine learning

## Bellman optimal equation for Q

Q(s, a) is the expected return from starting in state s and taking action a at time t. r(s, a) is the reward received in state s for taking action a. maxQ(s', a') is the maximized expected return for the next state-action pair (s', a'); we need to find the a' that maximizes it. Together these give the Bellman optimality equation: Q(s, a) = r(s, a) + γ · max over a' of Q(s', a').
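The Bellman update above can be sketched as value iteration on a tiny made-up MDP; the states, transitions, and rewards below are illustrative assumptions, not from the post:

```python
# Minimal sketch: iterating the Bellman optimality update for Q on a
# toy deterministic 2-state, 2-action MDP (all numbers are made up).
gamma = 0.9
states = [0, 1]
actions = [0, 1]
# transition[s][a] -> next state s', reward[s][a] -> r(s, a)
transition = {0: {0: 0, 1: 1}, 1: {0: 0, 1: 1}}
reward = {0: {0: 0.0, 1: 1.0}, 1: {0: 0.0, 1: 2.0}}

Q = {s: {a: 0.0 for a in actions} for s in states}
for _ in range(200):  # repeat the update until the values settle
    for s in states:
        for a in actions:
            s_next = transition[s][a]
            # Q(s, a) = r(s, a) + gamma * max over a' of Q(s', a')
            Q[s][a] = reward[s][a] + gamma * max(Q[s_next].values())
```

After enough sweeps the fixed point satisfies the equation exactly; here, for example, Q(1, 1) converges to 2 / (1 − γ) = 20.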

## Linear equation, non-linear equation

A single-layer machine learning model is actually a linear equation. A linear equation is at most a multivariate first-degree polynomial, involving only multiplication by coefficients and addition. Deep learning uses multiple layers, and the stacked layers are considered a non-linear equation. A non-linear equation may apply sin, cos, squaring, etc. to its variables. Both linear and non-linear equations fit curves. However, non-linear is more… Read More »
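A small sketch of why multiple layers need a non-linearity: two linear layers with no activation collapse into a single linear map, while inserting a non-linear function (here tanh, with made-up weights) does not:

```python
import numpy as np

# Two linear layers with no activation collapse into one linear map;
# adding a non-linearity (tanh) between them breaks the collapse.
# The weight matrices here are arbitrary examples.
W1 = np.array([[1.0, 2.0], [0.5, -1.0]])
W2 = np.array([[2.0, 0.0], [1.0, 1.0]])
x = np.array([3.0, -2.0])

linear_stack = W2 @ (W1 @ x)      # same as applying the single matrix W2 @ W1
collapsed = (W2 @ W1) @ x
print(np.allclose(linear_stack, collapsed))  # True

nonlinear_stack = W2 @ np.tanh(W1 @ x)       # tanh makes the stack non-linear
print(np.allclose(nonlinear_stack, collapsed))  # False
```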

## Logistic Regression by Scikit

Open boundary. Assume there are 3 lines separating 3 areas: line 1: y = x (x <= 10); line 2: y = 20 - x (x >= 10); line 3: x = 10 (y >= 10). For each area, there are some data points: A: [-3, -2, 0], [-3, 1, 0], [0, 40, 0], [8, 90, 0], [8, 10, 0], [3, 40, 0]; B: [11, 11, 1], [15, 6, 1], [13,… Read More »
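A minimal sketch of fitting such labeled points with scikit-learn; the points and labels below are made up, loosely following the excerpt's [x, y, label] format (label 0 = area A, 1 = area B):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Made-up training points in [x, y] form, labeled by area.
X = np.array([[-3, -2], [-3, 1], [0, 40], [8, 90],
              [11, 11], [15, 6], [13, 2], [18, 1]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

clf = LogisticRegression().fit(X, y)
print(clf.predict([[0, 50], [16, 3]]))  # which area each new point falls in
```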

## Logistic Regression

Logistic regression answers a YES/NO question. For example, given the size of a tumor, it answers whether the tumor is malignant; given the height and weight of a person, it answers whether the person is a man. Hypothesis: we have the hypothesis function h(x) = 1 / (1 + e^(-θᵀx)), whose values range over (0, 1). And we define the answer to be YES when the hypothesis is greater… Read More »
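The hypothesis can be sketched directly; the weights theta0 and theta1 below are made-up example values, not fitted parameters:

```python
import math

# Sketch of the logistic hypothesis h(x) = 1 / (1 + e^(-z)),
# with z = theta0 + theta1 * x. The weights are illustrative only.
def hypothesis(x, theta0=-4.0, theta1=1.0):
    z = theta0 + theta1 * x
    return 1.0 / (1.0 + math.exp(-z))

# The output always lies in (0, 1); answer YES once it crosses 0.5.
print(hypothesis(4))          # z = 0, so exactly 0.5: the decision boundary
print(hypothesis(10) >= 0.5)  # True: well past the boundary
```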

## Polynomial Linear Regression by Scikit

Suppose we have the 2-degree polynomial function y = 2 + x1 - 3·x2 + 2·x1·x2 - 5·x1² + 6·x2², and let's generate some training data sets. First, let's have some random (x1, x2) pairs. Then, we transform each pair into the form (1, x1, x2, x1·x2, x1², x2²). For this function, we know the coefficients are [2, 1, -3, 2, -5, 6]. Then we will have the training result. Let's make it… Read More »
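A sketch of the same idea with scikit-learn's built-in feature expansion; the sampled points are generated here for illustration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Recover the coefficients of y = 2 + x1 - 3*x2 + 2*x1*x2 - 5*x1^2 + 6*x2^2
# from randomly generated samples.
rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=(50, 2))
x1, x2 = X[:, 0], X[:, 1]
y = 2 + x1 - 3 * x2 + 2 * x1 * x2 - 5 * x1 ** 2 + 6 * x2 ** 2

# PolynomialFeatures(2) expands each row to (1, x1, x2, x1^2, x1*x2, x2^2).
X_poly = PolynomialFeatures(degree=2).fit_transform(X)
model = LinearRegression(fit_intercept=False).fit(X_poly, y)
print(np.round(model.coef_, 6))
```

Note that scikit-learn orders the degree-2 columns as (1, x1, x2, x1², x1·x2, x2²), so the fitted coefficients come back as [2, 1, -3, -5, 2, 6] rather than in the excerpt's column order.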

## Multivariate Linear Regression by Scikit

Suppose we have training sets for y = 3 + 2 * x1 + x2. Run the code below to find the coefficients and intercept:

```python
import numpy as np
from sklearn import datasets, linear_model

# z = 3 + 2 * x1 + x2
X = np.array([[3, 0], [0, 3], [1, 1], [2, 3], [4,
```

… Read More »
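A complete, runnable version of the same fit; the sample points after the excerpt's cutoff are made up but generated from the same equation:

```python
import numpy as np
from sklearn import linear_model

# Fit y = 3 + 2*x1 + x2 with ordinary least squares.
# The last two sample rows are made-up stand-ins for the truncated data.
X = np.array([[3, 0], [0, 3], [1, 1], [2, 3], [4, 2], [5, 5]])
y = 3 + 2 * X[:, 0] + X[:, 1]

reg = linear_model.LinearRegression().fit(X, y)
print(reg.coef_)       # ≈ [2. 1.]
print(reg.intercept_)  # ≈ 3.0
```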

## Simple Linear Regression by Scikit

Suppose we have a training set for y = 5 + 0.5 * x. Run the code below to find the coefficient (slope) and intercept:

```python
import matplotlib.pyplot as plt
import numpy as np
from sklearn import datasets, linear_model

# y = 5 + 0.5 * x
x = [-10.0, -9.5, -9.0, -8.5, -8.0, -7.5, -7.0, -6.5, -6.0, -5.5,
```

… Read More »
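A complete, runnable version of the same fit; the x values here are a stand-in for the excerpt's truncated list, generated over the same range:

```python
import numpy as np
from sklearn import linear_model

# Recover slope 0.5 and intercept 5 for y = 5 + 0.5 * x.
x = np.arange(-10.0, 10.5, 0.5)  # stand-in for the excerpt's x values
y = 5 + 0.5 * x

reg = linear_model.LinearRegression().fit(x.reshape(-1, 1), y)
print(reg.coef_[0])    # ≈ 0.5
print(reg.intercept_)  # ≈ 5.0
```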

## Simple Linear Regression

I’m learning machine learning these days, so let me write down my notes here. Suppose we have m training points (x⁽ⁱ⁾, y⁽ⁱ⁾) in the x-y plane, and we want to find the line that best fits these points. Since it is the simplest line, we can define the line function and call it the hypothesis: h(x) = θ₀ + θ₁·x. A way… Read More »
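One common way to fit such a hypothesis is batch gradient descent on the mean squared error; a minimal sketch, with made-up data and an illustrative learning rate:

```python
# Fit h(x) = theta0 + theta1 * x by batch gradient descent, minimizing
# the mean squared error. Data and learning rate are illustrative choices.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [5.0, 5.5, 6.0, 6.5, 7.0]   # generated from y = 5 + 0.5 * x
m = len(xs)

theta0, theta1 = 0.0, 0.0
alpha = 0.05                      # learning rate
for _ in range(20000):
    # gradients of (1 / 2m) * sum of (h(x) - y)^2
    grad0 = sum(theta0 + theta1 * x - y for x, y in zip(xs, ys)) / m
    grad1 = sum((theta0 + theta1 * x - y) * x for x, y in zip(xs, ys)) / m
    theta0 -= alpha * grad0
    theta1 -= alpha * grad1

print(theta0, theta1)  # converges toward 5.0 and 0.5
```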