Discussion about this post

Christian Benedict:

Agreed. After reading up on Grok 4 and GPT-5, it seems they have milked pretty much everything possible out of the current transformer architecture and scaled compute to the max. Something new must emerge.

PEG:

This connects beautifully to the ecological view that intelligence isn't something we "have" but something we *do* in dynamic coupling with our environment. Your point about LLMs being bounded by the human knowledge corpus highlights how current AI is essentially doing sophisticated pattern matching on linguistic traces rather than engaging in the embodied sense-making that characterizes biological intelligence.

The constraint isn't in our heads (or silicon) but in the environment's capacity to support intelligent behavior. Even human intelligence emerges through ongoing organism-environment transactions—we don't possess abstract reasoning, we become capable of it through participation in culturally mediated activities.

This suggests AGI isn't about building systems with sufficient internal intelligence, but creating systems capable of the right kinds of environmental coupling. Current approaches hit walls because they can't participate in the dynamic dance of environmental attunement that makes intelligence possible in the first place.

Your "empirical AI" that learns from direct observation points toward this: intelligence as an active process of structural coupling with the world, not computational power applied to static datasets.
