The “it’s not a human brain in a jar, it’s nothing to worry about” comments were really confident in their understanding of consciousness as an emergent property, the boundary of which is difficult to define. The subjective experience of a bunch of human neurons hacked together to work as a “computer” is basically impossible to know.
Obviously I don’t think it’d have the potential for human-like intelligence, but if you’re making something the size and complexity of a mouse brain, I could see it having a mouse-like potential to feel suffering. Especially when negative stimuli are used to “train” it.
Yes, exactly. It’s the dismissal of potential ethical concerns about something whose capacity for suffering we actually have no clue about. I know that thoughts exist because I think them. Does a bee or a worm or a rat or a pigeon have thoughts? Maybe. Can we realistically even find out? We can’t ask. Does that make it OK to kill, torture, or harm it? Probably not.
Still less horrific than the neuron organoids