  • Haha “Don’t think about it too hard,” I suggested! But there’s some real value in what you’ve said: I do find your idea of an “audience” very helpful and will have to cogitate on it.

    The first thing it makes me think of is a story about Doctor Who that I once made up while trying to sleep: are you familiar with Doctor Who? It’s a sci-fi show like a home-made British Star Trek with more tinfoil and more time travel.

    In my story, if you’ll indulge me, one of the characters briefly enters a super-perceptual state where she sees some kind of invisible entity steal the consciousness out of her friend’s head, squirt someone else’s consciousness in there, and then make off with the one originally belonging to her friend.

    It’s like the audience from your analogy walked out of one brain and became the audience for another brain. Her friend wakes up and doesn’t seem to be any different: the brain his consciousness inhabits still has all his memories and nobody else’s, so why would she expect him to be any different? But the “audience” that observed their friendship hitherto has been spirited away to who-knows-where, and this grieves her.

    The middle of the story consists of her trying to track down the entity, and also her trying, with only ambiguous success, to detect any difference in his behaviours or thoughts. She becomes increasingly desperate as she imagines the invisible entity getting further and further away.

    In the end, she tracks down a way to re-enter the state in which she perceived the entity originally. This time, she is able to look over a large group of people, and the punch line, of course, is that there are countless such entities swapping our “audiences” from head to head constantly. How many times in the last minute has she herself had her consciousness swapped for somebody else’s?

    It makes me wonder whether the ownership relationship between brain and audience is a real thing. Is that connection a necessary ingredient that is missing from our materialistic definition of consciousness? Maybe that’s the nagging thing that you call a communication channel.

    I’ve got another thought experiment to share which challenges this idea, but I’ve given you far too much to wade through already so I’ll save it.


  • I guess, if we’re looking at trees, I see it more like the shelter provided by the tree. The shelter cannot exist without the tree, it consists entirely of the tree, and when a tree attains certain properties it creates/becomes a shelter almost definitionally.

    Note that here I’m making a distinction between a shelter and a sheltered area. The sheltered area is not the shelter; it is an effect of the shelter. I don’t think there’s room for the sheltered area in my consciousness analogy. So don’t think about it too hard?

    Anyway, do you believe there is any ingredient to consciousness other than the physicality of the brain? For example, a “spark”, a soul, or a connection to something external to the brain?


  • You’re welcome. I too find it very interesting, though my expertise in it is below amateur level.

    I am a little confused about your model of consciousness and the brain: you speak of consciousness appearing to be a manifestation of the brain’s processing, yet you also describe what seems to be a communicative relationship between the two. My understanding is that consciousness is entirely an emergent property of the brain, impossible to distinguish from the squishy mechanics. If yours is significantly different to this, then it is no wonder that our beliefs diverge.


  • Because I experience them, and not just at times, but moment-to-moment, every waking day. And so do you. And so does essentially every single human in existence.

    Or, as you acknowledged before, it seems like you experience them. That experience of weighing up all the inputs, applying your mood and whatever else you bring, feels like making a decision freely.

    I simply include consciousness, and all it entails - reason, value, self-interest, preference, mood, etc. - among those parameters.

    These parameters are all examples of the complex inputs that precede a decision. And each of these inputs could be understood as the inevitable result of a causal chain.

    It’s super complex and likely involves technology that we don’t yet possess, but if I could perfectly simulate a brain identical to yours, with the same neural states, and the same concentrations of relevant chemicals in its simulated blood at the moment of the decision, that simulated brain would have to produce the same output as your meaty one.