  • Machine Learning: Logistic Regression

    Logistic regression is a classification counterpart of linear regression, with the dependent variable $y$ taking binary values.

    Problem: Given a training set $\langle x^{(i)}, y^{(i)} \rangle$, $1 \le i \le m$, $x \in \mathbb{R}^{n+1}$, $x^{(i)}_0 = 1$, $y^{(i)} \in \{0, 1\}$, find a classification function $h_\theta(x)$.
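
    A minimal sketch of such a function, assuming the standard sigmoid hypothesis $h_\theta(x) = \frac{1}{1 + e^{-\theta^T x}}$ (the function and variable names below are illustrative, not the post's code); the classifier predicts $y = 1$ whenever $h_\theta(x) \ge 0.5$:

      import numpy as np

      def h(theta, x):
          # Sigmoid hypothesis: squashes theta^T x into the interval (0, 1).
          return 1.0 / (1.0 + np.exp(-theta @ x))

      def classify(theta, x):
          # Predict y = 1 when the estimated probability reaches 1/2.
          return 1 if h(theta, x) >= 0.5 else 0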

  • Machine Learning: Linear Regression

    Let $y$ be a dependent variable of a feature vector $x$.

    Problem: Given a training set $\langle x^{(i)}, y^{(i)} \rangle$, $1 \le i \le m$, find the value of $y$ for any input vector $x$.

    We solve this problem by constructing a hypothesis function $h_\theta(x)$ using one of the methods below.
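
    As a sketch of one standard such method (batch gradient descent on the squared-error cost $J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} (h_\theta(x^{(i)}) - y^{(i)})^2$, with $h_\theta(x) = \theta^T x$; the names here are illustrative, not the post's code):

      import numpy as np

      def gradient_descent(X, y, alpha=0.01, iters=5000):
          # X is the m x (n+1) design matrix with x_0 = 1 in every row;
          # y is the m-vector of training targets.
          m = X.shape[0]
          theta = np.zeros(X.shape[1])
          for _ in range(iters):
              grad = X.T @ (X @ theta - y) / m   # gradient of J(theta)
              theta -= alpha * grad
          return theta                           # predict with theta @ x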

  • Another Notation as a Tool of Thought

    In his seminal paper, Kenneth Iverson described a new mathematical notation, which soon became A Programming Language.

    Recently, while reading Surely You’re Joking, Mr. Feynman!, I found that Feynman invented his own notation when he was in school.

  • Two series

    Cliff Pickover tweeted a fun puzzle: Which series is bigger?

    The first one is the famous geometric series, whose sum is equal to 1. The second one seems to be bigger, because 1 < n except for the 0th term, but that 0th term makes a big difference.
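
    The series themselves appear as images in the original post; a pair consistent with the description above (an assumption on my part, not taken verbatim from the tweet) is $\sum_{n=0}^{\infty} \frac{1}{2^{n+1}}$ versus $\sum_{n=0}^{\infty} \frac{n}{2^{n+1}}$. A quick partial-sum check:

      # Assumed series, reconstructed from the description above.
      geometric = sum(1 / 2 ** (n + 1) for n in range(60))  # 1/2 + 1/4 + ...
      weighted  = sum(n / 2 ** (n + 1) for n in range(60))  # 0 + 1/4 + 2/8 + ...
      print(geometric, weighted)  # both partial sums approach 1.0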

  • GC visualization

    While I was working through section 5.3 of SICP, I created a small visualization of the stop-and-copy garbage collection algorithm.

    I start with the following memory structure example.

    The content of the root register is a pointer p4 to the list of registers (x y z). The register x points to address 6, where the improper list (1 . 2) is stored. The register y points to address 8, where the list (x x) starts. Finally, the register z points to address 10, where the list (3 4 5) starts. Addresses 1, 3, 5, 9, 11, 13, 15 contain garbage.

    After we ran the GC algorithm, we got the following memory structure.

    root now points to address 0, x to 1, y to 3, and z to 6.
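
    A compact sketch of the algorithm itself (a Cheney-style breadth-first stop-and-copy, not the register-machine code from the book), using SICP's parallel the-cars/the-cdrs representation; here a pair pointer is tagged 'p', plain data 'n', and a moved pair leaves a 'bh' (broken heart) forwarding address behind:

      def gc(cars, cdrs, root):
          """Copy every pair reachable from root into fresh memory;
          return (new_cars, new_cdrs, new_root)."""
          new_cars, new_cdrs = [], []

          def relocate(obj):
              tag, val = obj
              if tag != 'p':                 # not a pointer: keep verbatim
                  return obj
              if cars[val][0] == 'bh':       # already moved: follow the heart
                  return ('p', cars[val][1])
              new_addr = len(new_cars)       # copy the pair to free memory
              new_cars.append(cars[val])
              new_cdrs.append(cdrs[val])
              cars[val] = ('bh', new_addr)   # leave a forwarding address
              return ('p', new_addr)

          new_root = relocate(root)
          scan = 0
          while scan < len(new_cars):        # scan the queue of copied pairs
              new_cars[scan] = relocate(new_cars[scan])
              new_cdrs[scan] = relocate(new_cdrs[scan])
              scan += 1
          return new_cars, new_cdrs, new_root

      # Tiny usage example: the list (1 2) stored at addresses 3 and 1 with
      # garbage elsewhere; gc compacts it to addresses 0 and 1.
      cars = [('n', 'junk'), ('n', 2),     ('n', 'junk'), ('n', 1)]
      cdrs = [('n', 'junk'), ('n', 'nil'), ('n', 'junk'), ('p', 1)]
      new_cars, new_cdrs, new_root = gc(cars, cdrs, ('p', 3))
      # new_root == ('p', 0); only the two live pairs survive.

    The breadth-first copying order is what produces the new layout above: the head of the root list lands at address 0, the pair x points to is copied next (address 1), then the rest of the root list and the pairs y and z point to, level by level.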