"I believe that artificial general intelligence (AGI) is achievable."
And the problem is circular: as the Riley Verge piece citing Fedorenko (https://news.ycombinator.com/item?id=46072838) shows, words aren't directly related to thoughts. Run the thought experiment: neither are symbols, math, or binary. They're all factorized, linear models of real events, and real events are manifold, nested, and scale-invariant, in other words, analog correlations.
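A minimal sketch of the point (my own illustration, not from the post, with a toy nonlinear relationship standing in for an "analog correlation"): a factorized, linear summary of a nonlinear dependence can report that no relation exists at all.

```python
import numpy as np

# Toy "analog" relationship: y is fully determined by x, but nonlinearly.
x = np.linspace(-1.0, 1.0, 201)
y = x ** 2

# Best linear (factorized) fit y ~ a*x + b via least squares.
A = np.vstack([x, np.ones_like(x)]).T
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)

# By symmetry the fitted slope and the Pearson correlation are ~0:
# the linear summary says "no relation" despite perfect dependence.
r = np.corrcoef(x, y)[0, 1]
print(a, r)
```

The linear model isn't wrong so much as blind: it collapses the structure it can't factorize.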
AGI isn't achievable with these current technologies.
The nativist cog-sci program doesn't run in our heads, and it doesn't run externalized either. It's false.
Symbols and binary will never get us to AGI.
"I believe that artificial general intelligence (AGI) is achievable."
And the problem is circular: as the Riley Verge piece citing Fedorenko https://news.ycombinator.com/item?id=46072838 exhibits, words aren't related to thoughts. And do the thought experiment, neither are symbols, math, binary. They're all factorized, linear models of real events, which are manifold, nested, scale invariant, in other words, analog correlations.
AGI isn't solvable under these current technologies.
The nativist cog-sci program doesn't run in our heads or externalized. It's false.