# Category Archives: Uncategorized

## Hierarchical epsilon machine reconstruction

Having read James P. Crutchfield’s long 1994 paper “Calculi of emergence”, I have to admit that it is very inspiring. Let’s think about the implications. The first achievement of the paper is the definition of statistical complexity. Note …

## Curious abstractions: how do we know that circles don’t have corners?

A simple but deep question. Why do we want to know this? In order to speed up compression, we have mainly considered two ideas: exploiting the compositionality of functions that generate data (incremental compression) and exploiting power laws in natural …

## Universal approximators vs. algorithmic completeness

Finally, it has dawned on me. A problem that I had trouble conceptualizing is the following. On the one hand, for the purposes of universal induction, it is necessary to search in an algorithmically complete space. This is currently not …

## My best paper

I presented this paper at the AGI conference in New York this year: “Some theorems on incremental compression”. It presents a general way of speeding up the search for short descriptions of data that is made up of features …

## The merits of indefinite regress

The whole field of machine learning, and artificial intelligence in general, is plagued by a particular problem: the well-known curse of dimensionality. In a nutshell, this curse means that whenever we try to increase the dimension of our search …
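The excerpt above can be made concrete with a small sketch (my own illustration, not from the original post): discretizing a unit cube at fixed resolution requires a number of cells that grows exponentially with the dimension, which is the quantitative core of the curse of dimensionality.

```python
def grid_cells(resolution_per_axis: int, dims: int) -> int:
    """Number of cells needed to cover [0, 1]^d at a fixed per-axis resolution.

    Grows exponentially in `dims`: resolution_per_axis ** dims.
    """
    return resolution_per_axis ** dims

# With 10 cells per axis, each added dimension multiplies the search space by 10.
for d in (1, 2, 5, 10):
    print(f"d={d}: {grid_cells(10, d)} cells")
```

Exhaustively searching such a grid quickly becomes infeasible, which is why naive scaling of search dimension fails.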

## Using features for the specialization of algorithms

A widespread sickness of present “narrow AI” approaches is the almost irresistible urge to set up rigid algorithms that find solutions in a search space that is as large as possible. This always leads to a narrow search space containing very complex …

## The physics of structure formation

The entropy in equilibrium thermodynamics is defined as $S = k_B \ln \Omega$, which always increases in closed systems. It is clearly a special case of Shannon entropy $H = -\sum_i p_i \log p_i$. If the probabilities are uniform, $p_i = 1/\Omega$, then Shannon entropy boils down to thermodynamic entropy. A …