A Unifying Framework For Memory and Abstraction
Moderator: Ras

A particularly brilliant piece of research, brought to life by Artem Kirsanov. The Tolman-Eichenbaum Machine – a computational model of the hippocampal formation, which unifies memory and spatial navigation under a common framework. And it turns out that, yes, the brain has a transformer network. Wowsa!
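For anyone wondering what "the brain has a transformer network" could mean concretely: the claim in the paper is that the TEM's memory retrieval is closely related to self-attention, with grid-cell-like codes playing the role of positional encodings. Here is a toy sketch of single-head self-attention in Python, purely illustrative – the variable names and sizes are mine, not the paper's:

Code:
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Project inputs to queries, keys and values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Scaled dot-product similarity between every pair of positions
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax over positions (numerically stabilised)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    # Each output row is a weighted recall over the stored values
    return w @ V

rng = np.random.default_rng(0)
d = 16
X = rng.normal(size=(5, d))                      # 5 tokens / "memories"
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)                                 # (5, 16)

Each output row is a weighted recall over the stored values, which is the sense in which attention behaves like an associative memory.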
-
chrisw
- Posts: 4744
- Joined: Tue Apr 03, 2012 4:28 pm
- Location: Midi-Pyrénées
- Full name: Christopher Whittington
-
smatovic
- Posts: 3466
- Joined: Wed Mar 10, 2010 10:18 pm
- Location: Hamburg, Germany
- Full name: Srdja Matovic
Re: A Unifying Framework For Memory and Abstraction
Can it play chess?
--
Srdja
-
towforce
- Posts: 12693
- Joined: Thu Mar 09, 2006 12:57 am
- Location: Birmingham UK
- Full name: Graham Laight
Re: A Unifying Framework For Memory and Abstraction
chrisw wrote: ↑Sun Apr 30, 2023 11:07 pm A particularly brilliant piece of research, brought to life by Artem Kirsanov. The Tolman-Eichenbaum Machine – a computational model of the hippocampal formation, which unifies memory and spatial navigation under a common framework. And it turns out that, yes, the brain has a transformer network. Wowsa!
1. Is James Whittington, who seems to be the guy behind this (link1, link2, link3), any relation?
2. If the human brain has a transformer network, my first question would be: when is it trained? Less than five minutes after being born, a horse is on its feet and running around. Somehow, all the knowledge it needs to do this is already pre-programmed via DNA (along with the instructions for building heart/lungs/liver/legs/digestive system etc.). It generally takes a human a year or so to learn to walk, though. We also know that humans from different time periods (and even in different parts of the world today) have very different skill sets and views.
Human chess is partly about tactics and strategy, but mostly about memory
-
JohnWoe
- Posts: 529
- Joined: Sat Mar 02, 2013 11:31 pm
Re: A Unifying Framework For Memory and Abstraction
The brain has no backpropagation. Nobody even knows why backpropagation works; it just works. And there's no training data set that a brain is trained on.
No human baby's brain is trained on Wikipedia for 5 years.
Babies learn on the go. My biggest problem with LLMs is memory: they have a hard limit on context, at most a few thousand tokens. There are some ad-hoc workarounds, like making a squeezed summary of the conversation so far and feeding that back in next time. And I don't think brain cells form the kind of neural net where every neuron is connected to every other. We do know that connections between neurons get stronger when used. Humans adapt to challenges with their general problem-solving brains.
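"Connections between neurons get stronger when used" is basically Hebb's rule. A minimal sketch of a Hebbian update in Python, just to make the contrast with backpropagation concrete – the update is purely local, no error signal is propagated anywhere, and the learning rate and decay values here are arbitrary:

Code:
import numpy as np

def hebbian_step(W, pre, post, lr=0.1, decay=0.01):
    # Strengthen w_ij in proportion to correlated pre/post activity;
    # a small decay keeps weights from growing without bound.
    return W + lr * np.outer(post, pre) - decay * W

pre = np.array([1.0, 0.0, 1.0, 0.0])    # presynaptic activity
post = np.array([1.0, 1.0, 0.0])        # postsynaptic activity
W = np.zeros((3, 4))                    # 4 inputs -> 3 outputs
for _ in range(10):
    W = hebbian_step(W, pre, post)
print(W.round(3))   # weights grew only where pre AND post were active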
You can't feed the Linux source code to GPT-4 and ask it to improve it, because of this memory problem.
I don't think humans are understood to even 1%. Humans have about 23,000 genes in the genome, so the human "source code" seems very simple... But one cell can contain on the order of 100,000 distinct proteins. (How can you produce 100,000 proteins with only 23,000 genes?) Genes are the source code for proteins (and lipids and such), but one gene can yield many proteins: transcripts can be spliced in alternative ways, and the proteins are further modified after the ribosome assembles them from RNA. RNA, for its part, is just DNA with thymine replaced by uracil (ACTG -> ACUG) and a different sugar. Some amoebas have genomes far larger than ours, yet an amoeba is pretty much the simplest creature there is. Nobody knows why.
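On the 23,000-genes-vs-100,000-proteins puzzle: alternative splicing alone multiplies the count quickly, since each optional exon roughly doubles the number of possible transcripts. A toy enumeration in Python – real splicing has ordering and compatibility constraints this ignores:

Code:
from itertools import combinations

def splice_variants(exons, optional):
    # Toy model: required exons are always kept, each optional exon is
    # independently included or skipped -> up to 2**len(optional) isoforms.
    variants = []
    for r in range(len(optional) + 1):
        for skipped in combinations(optional, r):
            variants.append([e for e in exons if e not in skipped])
    return variants

exons = ["E1", "E2", "E3", "E4", "E5"]
isoforms = splice_variants(exons, optional=["E2", "E4"])
print(len(isoforms))            # 4 isoforms from just two optional exons
for iso in isoforms:
    print("-".join(iso))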
-
smatovic
- Posts: 3466
- Joined: Wed Mar 10, 2010 10:18 pm
- Location: Hamburg, Germany
- Full name: Srdja Matovic
Re: A Unifying Framework For Memory and Abstraction
Maybe worth mentioning: von Neumann, "The Computer and the Brain":
https://en.wikipedia.org/wiki/The_Compu ... _the_Brain
von Neumann mentioned that his computer architecture was inspired by how he thought the brain works, in terms of memory and computation; in this last, unfinished work he goes into the differences between computers and brains (at the state of knowledge of the 1950s).
--
Srdja
-
chrisw
- Posts: 4744
- Joined: Tue Apr 03, 2012 4:28 pm
- Location: Midi-Pyrénées
- Full name: Christopher Whittington
Re: A Unifying Framework For Memory and Abstraction
smatovic wrote: ↑Thu May 04, 2023 8:59 am Maybe worth mentioning: von Neumann, "The Computer and the Brain":
https://en.wikipedia.org/wiki/The_Compu ... _the_Brain
von Neumann mentioned that his computer architecture was inspired by how he thought the brain works, in terms of memory and computation; in this last, unfinished work he goes into the differences between computers and brains (at the state of knowledge of the 1950s).
--
Srdja
This thread is about how the brain works, and they're making substantial progress. I'd never even heard of grid cells, place cells, path integration, or abstraction of structure, let alone how all this stuff knits together.
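Path integration, at least, is easy to state in code: it is dead reckoning, keeping a running position estimate by summing self-motion, which is roughly the computation grid cells are thought to support. A toy sketch in Python – the function and the values are illustrative only:

Code:
import math

def path_integrate(start, steps):
    # steps: list of (heading in radians, distance) self-motion samples
    x, y = start
    for heading, dist in steps:
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
    return x, y

# Walk a unit square: four legs, turning 90 degrees each time,
# and the integrated position returns to the start (up to float error).
square = [(0.0, 1.0), (math.pi / 2, 1.0), (math.pi, 1.0), (3 * math.pi / 2, 1.0)]
print(path_integrate((0.0, 0.0), square))   # ~(0.0, 0.0)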
-
smatovic
- Posts: 3466
- Joined: Wed Mar 10, 2010 10:18 pm
- Location: Hamburg, Germany
- Full name: Srdja Matovic
Re: A Unifying Framework For Memory and Abstraction
Did once intend to apply for a Master's:
https://www.bccn-berlin.de/master-progr ... ation.html
https://en.wikipedia.org/wiki/Principle ... al_Science
"Some prior knowledge of neuroscience. If you don't have any, be prepared to read at least the introductory chapters of E. Kandel et al., Principles of Neural Science."
https://archive.org/search?query=Princi ... nce+Kandel
Well, I did read the introduction
--
Srdja
-
towforce
- Posts: 12693
- Joined: Thu Mar 09, 2006 12:57 am
- Location: Birmingham UK
- Full name: Graham Laight
Re: A Unifying Framework For Memory and Abstraction
JohnWoe wrote: ↑Wed May 03, 2023 5:33 pm The brain has no backpropagation. Nobody even knows why backpropagation works; it just works. And there's no training data set that a brain is trained on.
No human baby's brain is trained on Wikipedia for 5 years.
Babies learn on the go. My biggest problem with LLMs is memory: they have a hard limit on context, at most a few thousand tokens. There are some ad-hoc workarounds, like making a squeezed summary of the conversation so far and feeding that back in next time. And I don't think brain cells form the kind of neural net where every neuron is connected to every other. We do know that connections between neurons get stronger when used. Humans adapt to challenges with their general problem-solving brains.
You can't feed the Linux source code to GPT-4 and ask it to improve it, because of this memory problem.
I don't think humans are understood to even 1%. Humans have about 23,000 genes in the genome, so the human "source code" seems very simple... But one cell can contain on the order of 100,000 distinct proteins. (How can you produce 100,000 proteins with only 23,000 genes?) Genes are the source code for proteins (and lipids and such), but one gene can yield many proteins: transcripts can be spliced in alternative ways, and the proteins are further modified after the ribosome assembles them from RNA. RNA, for its part, is just DNA with thymine replaced by uracil (ACTG -> ACUG) and a different sugar. Some amoebas have genomes far larger than ours, yet an amoeba is pretty much the simplest creature there is. Nobody knows why.
"[Children] learn on the go": They also do a lot of mimicking. They will reuse expressions they've heard before they fully understand the meaning of them (sometimes, they may never know the meaning). I think a lot of the language one hears everyday is like a collection of expressions that have trickled down from intelligent, thoughtful people. People I know who speak multiple languages have said that it's interesting how a language reveals a style of thinking.
"How can you produce 100,000 proteins with only 23,000 genes?": those genes also build the whole body, which contains a huge number of complex, efficient parts. IMO the hidden secret is that highly complex things usually have simple underlying codes (I think that applies to chess too - but that's not for this thread!).
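A standard illustration of "simple code, complex result" is the logistic map: a one-line arithmetic rule whose orbit becomes chaotic for r near 4. A toy sketch in Python – the parameter choices here are mine:

Code:
def logistic_orbit(x, r=3.9, n=8):
    # One-line update rule: x -> r * x * (1 - x)
    orbit = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        orbit.append(round(x, 4))
    return orbit

print(logistic_orbit(0.2))
# starts 0.624, 0.915, 0.3032, ... with no obvious pattern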
Human chess is partly about tactics and strategy, but mostly about memory