THE GHOST IN THE MACHINE
Noam Chomsky discusses how the "mechanical philosophy" that originated in the 17th century with thinkers such as Galileo, Descartes, and Newton viewed the universe as a grand machine that could, in principle, be fully understood through science. However, Newton's theory of gravity, which involved "action at a distance" rather than direct physical contact, undermined this mechanical view.
Full title: The Ghost in the Machine and the Limits of Human Understanding.
Please support us:
Professor Noam Chomsky is the most significant thinker of our generation. Chomsky argues that since Newton, the goal of science has become more modest - rather than trying to understand the true nature of the universe, which may be beyond human comprehension, science aims to construct abstract models that are intelligible to us, even if the underlying reality remains a mystery. He suggests there may be inherent biological limits to human understanding, just as other animals have limits to their cognitive capacities.
The upshot is that we shouldn't necessarily expect a complete unification of scientific knowledge or for complex phenomena like mind and language to be fully explainable in terms of physics. Chomsky provocatively states that after Newton "exorcised the machine" by showing the mechanical philosophy was untenable, only the "ghost" of intelligibility was left in science, which now relies on human-constructed models rather than grasping the true essence of nature. Achieving a direct, intuitive understanding - "exorcising the ghost" - may simply lie beyond the cognitive horizons of the human species.
Panel:
Dr. Tim Scarfe
Dr. Keith Duggar
Dr. Walid Saba
00:00:00 Kick off
00:02:24 C1: LeCun's recent position paper on AI, JEPA, Schmidhuber, EBMs
00:48:38 C2: Emergent abilities in LLMs paper
00:51:32 C3: Empiricism
01:25:33 C4: Cognitive Templates
01:35:47 C5: The Ghost in the Machine
02:00:08 C6: Connectionism and Cognitive Architecture: A Critical Analysis by Fodor and Pylyshyn
02:20:12 C7: We deep-faked Chomsky
02:29:58 C8: Language
02:34:34 C9: Chomsky interview kick-off!
02:35:32 Q1: Large Language Models such as GPT-3
02:39:07 Q2: Connectionism and radical empiricism
02:44:37 Q3: Hybrid systems such as neurosymbolic
02:48:40 Q4: Computationalism silicon vs biological
02:53:21 Q5: Limits of human understanding
03:00:39 Q6: Semantics state-of-the-art
03:06:36 Q7: Universal grammar, I-Language, and language of thought
03:16:20 Q8: Profound and enduring misunderstandings
03:25:34 Q9: Greatest remaining mysteries science and philosophy
03:33:04 Debrief and 'Chuckles' from Chomsky
References:
LeCun Path to Autonomous AI paper
Tim’s marked up version:
Emergent Abilities of Large Language Models [Wei et al] 2022
Connectionism and Cognitive Architecture: A Critical Analysis [Fodor, Pylyshyn] 1988
Ghost in the machine
Noam Chomsky in Greece: Philosophies of Democracy (1994) [Language chapter]
Richard Feynman clip
Chomsky Bryan Magee BBC interview:
Randy Gallistel's work (question 3)
Helmholtz: "NNs: they're damn slow"
Purkinje cells
Barbara Partee
Iris Berent
Penrose Orch OR
Fodor “The Language of Thought”
Least Effort
structure dependence in grammar formation
three models
Darwin's problem
Descartes's problem
Control Theory