7 Comments

The connection between time compression and the hallucinations that LLMs output made me think of how certain drugs influence our perception of time when we ingest them.

I wonder what the AI will be capable of when it has the ability to sober itself up. Interesting to consider.

Thanks for this awesome article.

author

I think hallucinations work for us because we have reality as a baseline to compare against. For LLMs today, it's all hallucination (or all reality).


It should be quite fruitful to map all these musings onto the Ashby Space.

The Ashby Space is an amazing tool for visualizing knowledge complexity, based on its namesake, Ashby's law of requisite variety.

It originates in the work of Max Boisot and Bill McKelvey; see also a nice summary here:

https://harishsnotebook.wordpress.com/2019/04/28/exploring-the-ashby-space/

author

Fascinating!


My first thought is that we are misusing and mislabeling TikTok. People there are hungry to learn if we meet them where they are.


Anil Seth’s 'beast machine theory' suggests that consciousness is merely the result of 'controlled hallucinations' that are our brain's attempt to keep us alive.

author

Interesting. I'll have to read more on this. I do believe the self-directed fight against entropy is key to this, though it's a conjecture.
