Marvin Von Renchler explains the fatal flaw with the Hard Problem

I have various exchanges going on these days. One such dialogue provided the words that had been eluding me, in particular his third paragraph.

If anyone is confused about what I’m trying to express - and discuss - this should clarify. Marvin made a comment that sounded mighty close to what I keep repeating, and I called him on it. I received the following:

I haven’t written formal articles on it (“Chalmers & Hard Problem”), but I’ve spent a fair amount of time looking at the “hard problem” through an evolutionary and systems lens.

My issue with it is that it seems to assume, rather than demonstrate, a separation between physical processes and subjective experience. Once that split is taken as given, the question becomes why one should produce the other—which makes it look intractable by design.

From a biological perspective, though, everything we observe points to continuity: increasingly complex systems of memory, integration, and self-referential processing emerging stepwise under selection pressures. There’s no clear place where something non-physical needs to be introduced—only increasing organizational complexity.

So I tend to see the “hard problem” less as an unsolved mystery and more as a framing issue. It may be that what feels like an explanatory gap is really a mismatch between philosophical categories and how evolved systems actually work.

I waited to see if there would be any follow-up, and to give this some thought. To me, this is a strawman. I have looked into Chalmers’s argument, and I can’t find where it “seems to assume, rather than demonstrate, a separation between physical processes and subjective experience.”

The description I hear is: how does the physical process produce the subjective experience? It is not “why one should produce the other.” The “why” question should be set aside. That leaves the question of addressing “how,” and that’s hard.

The answer, “increasingly complex systems of memory, integration, and self-referential processing emerging stepwise under selection pressures,” is inadequate. It’s a beginning, but more detail is needed.

I agree that “There’s no clear place where something non-physical needs to be introduced—only increasing organizational complexity,” but what is that complexity? That’s the framing issue.

Perhaps I didn’t word that well. Here’s the AI’s take:


The hard problem of consciousness, a term coined by philosopher David Chalmers, is the question of why and how physical brain processes produce subjective, first-person experiences.