Posts

Showing posts from October, 2017

FA presented groups

In some of the previous posts we discussed regular languages and automatic relations from automata theory. That discussion was meant to lead up to the upcoming discussion of automatic groups. Groups are one of the most fundamental algebraic structures, arising almost everywhere in mathematics. Why study automatic groups? This kind of research arose around the sixties, when Rabin and others asked what would happen if we imposed algorithmic conditions on mathematical structures. At first people heavily studied so-called computable structures, i.e. structures recognized by computable functions. These days people study structures under all sorts of algorithmic constraints. Some stay within computability constraints, some go beyond and look at Turing machines that work with reals, while some people, like me, study structures with automata-theoretic constraints. Among the first people to do so were Hodgson, Khoussainov and Nerode. There are several variants of studying...

On Measures of Size and Complexity

Introduction. People like to measure things. In mathematical analysis there is a notion of measure, studied in Measure Theory. In Topology we have the notion of topological dimension. In Information Theory we have the notion of entropy. In Computational Complexity Theory we have the notions of time and space complexity. The general idea is to quantify the size or complexity of a given object, so that we can talk about whether one object is somehow bigger or more complex than another. In this post I would like to say a few things about Hausdorff dimension. Motivation. Why do we need Hausdorff dimension? There are many reasons. The one people mention most is its fineness: Hausdorff measure can provide meaningful information where Lebesgue measure gives none. To see that, observe that there are many sets of measure $0$. A single point has measure $0$ and the Cantor set has measure $0$. However, it is clear that those two are completely different objec...
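To make the contrast concrete, recall a standard computation: a single point has Hausdorff dimension $0$, while the middle-thirds Cantor set $C$ satisfies \[ \dim_H(C) = \frac{\log 2}{\log 3} \approx 0.63, \] so Hausdorff dimension separates two sets to which Lebesgue measure assigns the same value $0$.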

Lagrange Duality and SVM

Several posts ago we talked about SVMs, in particular how to formulate the SVM objective as an optimization problem. In this post we are going to learn about Lagrange duality, an optimization method which will help us solve the optimization problem we ended up with last time. Let us start with a simpler optimization problem: \[ \min_w\, f(w) \] subject to the constraints \[ h_i(w) = 0, \quad i=1,\ldots, l. \] The classical way to solve such a problem is to write down the Lagrangian \[ L(w,\beta) = f(w) + \sum_{i=1}^l \beta_i h_i(w) \] and set \[ \frac{\partial L}{\partial w_i} = 0, \quad \frac{\partial L}{\partial \beta_i}=0. \] This works because $\frac{\partial L}{\partial \beta_i} = h_i(w)$, so at any critical point of $L(w,\beta)$ the constraints $h_i(w)=0$ are automatically satisfied. Now let us consider a more general optimization problem: \[ \min_w f(w) \] subject to the constraints \[ g_i(w) \le 0, \quad i = 1,\ldots,k \] and \[ h_j(w) = 0,\quad j = 1,\ldots,l. \] Let us call the above pr...
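As a quick worked example of the equality-constrained case (a toy problem of my own, not the SVM objective), take $f(w) = w_1^2 + w_2^2$ with the single constraint $h(w) = w_1 + w_2 - 1 = 0$. The Lagrangian is \[ L(w,\beta) = w_1^2 + w_2^2 + \beta\,(w_1 + w_2 - 1), \] and setting the partial derivatives to zero gives \[ 2w_1 + \beta = 0, \quad 2w_2 + \beta = 0, \quad w_1 + w_2 - 1 = 0, \] hence $w_1 = w_2 = \tfrac{1}{2}$ and $\beta = -1$: the point on the line $w_1 + w_2 = 1$ closest to the origin, as expected.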

On Automatic Relations

In one of the previous posts we went through the concept of regular languages. These languages are quite convenient to work with and have a great deal of good properties, which might be the topic of another post. In this post I would like to focus on automatic relations. This notion extends regularity from unary relations to $k$-ary relations. So instead of asking whether $x\in L$ for a given regular language $L$, we ask whether $(x_1,x_2,\ldots,x_n)\in R$ for some relation $R$. Let $R\subseteq (\Sigma^*)^n$ be some $n$-ary relation. Our aim is to check membership of the tuple $(x_1,x_2, \ldots, x_n)$ in $R$. For this we stack, or convolute, these words, so that it might look like this in the case of $n=4$: \[ \begin{bmatrix} 1 & 0 & 0 & 1 & 1\\ 0 & 1 & 1 & & \\ 1 & & & &\\ 1 & 0 & 0 & & \end{bmatrix} \] Observe that the above block is nonuniform in length, so it cannot be generated by the tuple...
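The standard remedy in the literature is to pad the shorter words with a fresh symbol $\diamond \notin \Sigma$, so that the block becomes a word over the product alphabet $(\Sigma\cup\{\diamond\})^n$ that an automaton can read column by column. For the example above the padded block is \[ \begin{bmatrix} 1 & 0 & 0 & 1 & 1\\ 0 & 1 & 1 & \diamond & \diamond\\ 1 & \diamond & \diamond & \diamond & \diamond\\ 1 & 0 & 0 & \diamond & \diamond \end{bmatrix}, \] and $R$ is called automatic when the language of such padded convolutions is regular.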