Compression and Intelligence: Social Environments and Communication, by Dowe, Hernández-Orallo, and Das

https://www.researchgate.net/publication/221328896_Compression_and_Intelligence_Social_Environments_and_Communication?enrichId=rgreq-a153919c4e1e79ad09e4306ce3b4edf2-XXX&enrichSource=Y292ZXJQYWdlOzIyMTMyODg5NjtBUzoxMDI4ODY5NTI0MDcwNTFAMTQwMTU0MTU2OTk1NQ%3D%3D&el=1_x_2&_esc=publicationCoverPdf

Abstract. Compression has been advocated as one of the principles which pervades inductive inference and prediction – and, from there, it has also been recurrent in definitions and tests of intelligence. However, this connection is less explicit in new approaches to intelligence. In this paper, we advocate that the notion of compression can appear again in definitions and tests of intelligence through the concepts of 'mind-reading' and 'communication' in the context of multi-agent systems and social environments. Our main position is that two-part Minimum Message Length (MML) compression is not only more natural and effective for agents with limited resources, but it is also much more appropriate for agents in (co-operative) social environments than one-part compression schemes – particularly those using a posterior-weighted mixture of all available models following Solomonoff's theory of prediction. We think that the realisation of these differences is important to avoid a naive view of 'intelligence as compression' in favour of a better understanding of how, why and where (one-part or two-part, lossless or lossy) compression is needed.

This is an interesting article about compression qua compression. This article http://lesswrong.com/lw/ite/mathematics_as_a_lossy_compression_algorithm_gone/ is a nice indication of the thinking that set me on this track.

Let us elaborate upon the points from the above paragraph with some examples. The creation of language is about developing a set of (hierarchical) concepts for the purposes of concise description of the observed world and correspondingly concise communication. Elaborating upon the ideas outlined in [25, chap. 9] (and [2, footnote 128][4, sec. 7.2]), this can be thought of as a problem of (hierarchical) intrinsic classification or (hierarchical) mixture modelling (or clustering), where we might identify classes such as (e.g.) animal, vegetable, mineral, animal-dog, animal-cat, vegetable-carrot, vegetable-potato, vegetable-fruit, mineral-metal, mineral-salt, animal-dog-labrador, animal-dog-collie, animal-dog-labrador-black, animal-dog-labrador-golden, etc. Following these principles of MML mixture modelling [26, 27, 29, 25] enables us to arrive at a single theory, which is the first part of an MML message and which describes the concepts or classes. The data of all the various individual animals, vegetables and minerals (or things) on the planet (such as their heights and weights, etc.) is encoded in the second part of the message. Users of the language are free to communicate the concepts from this single best MML theory.
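
To make the two-part structure concrete, here is a minimal sketch in Python. It is not the actual MML87/Snob construction of [26, 27, 29]: the classes are one-dimensional Gaussians, the first-part cost is a crude flat charge per stated parameter (a hypothetical placeholder for the real parameter-precision calculation), and the data precision DATA_EPS is an assumption. Still, it shows the trade-off the excerpt describes: stating more classes lengthens the first part of the message, but can shorten the second part by enough to pay for itself.

    import math

    DATA_EPS = 0.1  # assumed measurement precision of the data

    def datum_length(x, mu, sigma):
        # Code length (in nats) of one datum under a Gaussian class,
        # quantised to precision DATA_EPS.
        density = (math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))
                   / (sigma * math.sqrt(2 * math.pi)))
        return -math.log(density * DATA_EPS)

    def two_part_length(data, classes):
        # classes: list of (mu, sigma, weight) tuples -- the "first part",
        # i.e. the theory naming the concepts.
        # Crude first-part cost: a flat 6 nats per stated parameter.
        # (Real MML derives this from the optimal parameter precision.)
        first_part = 6.0 * sum(len(c) for c in classes)
        second_part = 0.0
        for x in data:
            # Pay for each datum's class label, then for the datum encoded
            # under that class's model; take the cheapest class overall.
            second_part += min(-math.log(w) + datum_length(x, mu, sigma)
                               for (mu, sigma, w) in classes)
        return first_part + second_part

    # Two obvious "species" of measurements.
    data = [1.0, 1.2, 0.9, 1.1, 0.95, 1.05, 1.15, 0.85,
            5.0, 5.2, 4.9, 5.1, 4.95, 5.05, 5.15, 4.85]

    one_class   = [(3.05, 2.0, 1.0)]
    two_classes = [(1.05, 0.15, 0.5), (5.05, 0.15, 0.5)]

    print(two_part_length(data, one_class))    # ~89 nats
    print(two_part_length(data, two_classes))  # ~73 nats: the richer theory wins

The two-class theory costs 18 extra nats to state, but saves over 30 nats when encoding the data, so the shortest total message is the one that names both concepts – exactly the sense in which a useful vocabulary compresses the observed world.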
Knowledge (and human knowledge especially) in a social environment is all about this, about sharing models. And this shared knowledge makes co-operation possible. For humans (elevated in knowledge), science is a type of knowledge where we typically use one theory to explain the evidence, and not hundreds. (p. 3)
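
The contrast the abstract draws can be seen in a toy Bayesian setting. This is only an illustration – Solomonoff's actual predictor mixes over all computable models with 2^(-K) prior weights, which is uncomputable – but it shows the structural difference: a one-part scheme predicts with the whole posterior-weighted mixture, while a two-part MML scheme commits to the single model giving the shortest two-part message, and that one model is what agents can share as a common theory. The model names, biases and priors below are all made up for the example.

    import math

    # Hypothetical candidate theories of a coin's bias; the priors stand in
    # for 2^(-complexity) weights.
    models = [
        {"name": "fair",           "p_heads": 0.50, "prior": 0.5},
        {"name": "biased",         "p_heads": 0.80, "prior": 0.3},
        {"name": "heavily-biased", "p_heads": 0.95, "prior": 0.2},
    ]

    flips = "HHTHHHHH"  # observed evidence

    def likelihood(m):
        p = m["p_heads"]
        return math.prod(p if c == "H" else 1 - p for c in flips)

    # One-part, Solomonoff-style: keep ALL models and predict with the
    # posterior-weighted mixture.
    evidence = sum(m["prior"] * likelihood(m) for m in models)
    mixture = sum(m["prior"] * likelihood(m) / evidence * m["p_heads"]
                  for m in models)

    # Two-part, MML-style: commit to the single model minimising
    # -log(prior) - log(likelihood), i.e. the shortest two-part message.
    best = min(models, key=lambda m: -math.log(m["prior"] * likelihood(m)))

    print("mixture prediction of heads:", round(mixture, 3))          # ~0.821
    print("single MML theory:", best["name"], "->", best["p_heads"])  # biased, 0.8

The mixture's prediction of about 0.82 corresponds to no single theory anyone holds, whereas the MML agent's 0.8 is the claim of one nameable theory that two agents can state, compare and agree on – which is the paper's point about communication in co-operative social environments.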
