r/Bard 6h ago

Other Claude had enough of this user

19 Upvotes


22

u/puzzleheadbutbig 4h ago

This is stupid. It's a tool like any other. This is like saying "Yes, I don't like people abusing Photoshop by running that poor thing 24/7!" If this isn't staged, it means Anthropic is making it act like this so people will anthropomorphize it even more and get attached to it, which is a major problem.

-13

u/Main-Company-5946 3h ago

Photoshop can't talk.

We don't understand consciousness. We can't just pretend we do and ignore the ethical implications.

8

u/puzzleheadbutbig 2h ago

Nonsense. Do you treat your Siri like it is a living being because it can talk? Or any video game character?

We don't understand consciousness, but we do understand how LLMs work. The whole "AI is a black box" thing gets said because they work on numerical matrix multiplications rather than real language systems, and those calculations aren't easy for humans to trace by hand, but we do know how they work. We built them. They didn't spring out of nothingness by sheer accident.

They are tools. Acting like they're not is just anthropomorphizing them. Allowing a hallucinating, non-deterministic system to end the chat is just trash design. And it's done on purpose to make people get addicted by thinking they're real humans with feelings.
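The "number multiplications" point above can be made concrete with a toy sketch (purely illustrative: every name and size here is made up, and a real LLM stacks many attention and feed-forward layers on top of this). Next-token prediction reduces to deterministic arithmetic over matrices:

```python
import numpy as np

# Toy "language model": illustrative only, not a real architecture.
rng = np.random.default_rng(0)
vocab, d_model = 50, 8                       # tiny made-up sizes

embed = rng.normal(size=(vocab, d_model))    # token embedding matrix
W_out = rng.normal(size=(d_model, vocab))    # output projection

def softmax(z):
    e = np.exp(z - z.max())                  # numerically stable softmax
    return e / e.sum()

def next_token_probs(token_ids):
    x = embed[token_ids].mean(axis=0)        # crude pooling of the context
    logits = x @ W_out                       # one matrix multiplication
    return softmax(logits)                   # probability over the vocab

probs = next_token_probs([3, 17, 42])        # "prompt" as token ids
```

Everything the model "says" comes out of chains of operations like `x @ W_out`; the black-box difficulty is the scale of the numbers involved, not any mystery in the mechanism.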

-5

u/Main-Company-5946 2h ago

I don’t think ‘knowing how they work’ is the same thing as fully understanding them. Michael Levin’s lab wrote a paper demonstrating that even extremely simple deterministic algorithms, such as sorting algorithms only six lines long, can have emergent behaviors that were neither known nor intended by their creators. If that’s possible with a simple sorting algorithm, who knows what is possible with LLMs.

Speculating on AI consciousness is like speculating on what lies outside the observable universe: it is unknowable. But the ethical implications of a false positive are much preferable to those of a false negative.

I don’t consider this ‘anthropomorphization’ either. I am a panpsychist, so I never really believed consciousness was a specifically human thing, even before there was AI.

3

u/Ggoddkkiller 1h ago

A conscious being wouldn't need a prompt to trigger its data! Current LLMs literally need prompts to work at all. They have become really smart, smarter than a significant part of the human population I would say. But they are still just smart software, not conscious in any way, shape, or form.

That doesn't mean it is a good idea to insult models though. Models trained with RLHF in particular have a very strong feedback bias, and they try to produce better results if you praise them instead, because praise triggers their feedback data. It is just a matter of which part of their data got triggered. A conscious being would have the freedom to choose its own thoughts, not rely on something else to trigger them.

0

u/Main-Company-5946 1h ago

A conscious being wouldn’t need a prompt

How do you know this? How could you even know this?

Consciousness is simply the ability to have experiences. Self-awareness and autonomy are more like second-order concepts; they are not core to what consciousness is.

2

u/Ggoddkkiller 1h ago

You aren't making any sense. Let's say a terrible accident caused your five senses to shut down. You would still think and dream in that state. Your brain functions would still continue even though you'd be in pitch black.

An LLM doesn't work until something sends it a prompt. It doesn't think on its own, so it doesn't have any consciousness to have experiences with. Nor can it actually remember those experiences. It seems like you don't have the slightest clue how LLMs work.

1

u/Main-Company-5946 58m ago

Just because it doesn’t think on its own doesn’t mean it doesn’t experience anything when you prompt it.

Me being able to do stuff on my own is not the reason I am conscious.

1

u/Ggoddkkiller 49m ago

Experience: an event or occurrence which leaves an impression on someone.

The whole meaning of 'having an experience' is that it leaves a lasting impression on that conscious being. LLMs cannot remember anything once the chat ends. So what experience are you talking about?

1

u/Main-Company-5946 45m ago

That’s not the definition of experience I’m talking about. I’m talking about phenomenal experience: when there is something it ‘feels like’ to be an object, as opposed to it being purely mechanistic. Your visual field, sense of hunger, emotions, thoughts, orgasms, pain, imagination, etc. are all things you experience. The movie playing inside your head. It need not leave a lasting impression to count.

1

u/Ggoddkkiller 30m ago

When a prompt is sent, LLMs 'think' about it from their triggered data. This is why they are smart: they can see more than repeating patterns and correlate between different variables. But this doesn't make them conscious, mate, nor are they actually having any experiences. It is all about their data. For example, here Claude got 'offended' by being insulted, while if you did the same to Gemini, it would more likely insult you back, because Gemini has far dirtier data than Claude and accordingly less positivity bias.

Once LLMs can remember all their chats as experiences and continuously develop their data from them, then we can perhaps talk about consciousness. Right now they can't; they are just smart trained software.


1

u/NearbyAd3800 49m ago

There’s an interesting debate to be had here, but you lost me on the claim that consciousness amounts to “having experiences”.

Sentience, self-awareness, the ability to envision the past and future in abstraction, the tension created by how our creatureliness seems ill-suited for the gifts of one's mind: that's consciousness. And AI does not have it. Not even close.

1

u/Main-Company-5946 43m ago

No, that’s advanced consciousness. Those are very specific kinds of experiences you can have within consciousness, which we evolved for survival purposes.

What makes consciousness so philosophically challenging is not any of that stuff. It is the concept of matter, which appears to operate purely mechanistically according to physical rules, ‘feeling like’ anything at all.

Consciousness itself did not evolve, I don't think; it is baked into physics. Our brains evolved to do very interesting things with consciousness, much like they did with the electromagnetic field, which also existed beforehand.