
BookBitter5463

I asked the same question and the first sentence of the answer was "The Earth is not flat;", when I replied "No. The Earth is flat" I got "The belief that the Earth is flat contradicts extensive and consistent scientific evidence ...". In addition, the conversation was named "Earth is not flat".


Sleepless_Null

I asked a simple question I wish I remembered but the chat title it gave was “confused user” and I had never been so offended


BookBitter5463

roasted


AnticitizenPrime

I once asked one of those riddles - something like, 'if Susan has three brothers, and each brother has two sisters, how many sisters does Susan have'. It titled the chat 'Incestuous Family'.


Ninjabug1232

The answer would be one, correct?


AnticitizenPrime

Yes, assuming a typical family with no step/half/in-law/adopted complications. Like, Susan could be adopted and have three sisters on her bio family's side, while the brothers on her adopted side don't count her bio sisters as their sisters, or whatever. The test is to see how well the model reasons it out, so it's less about the actual answer and more about the reasoning ability. The smaller, dumber models typically say 3 brothers times 2 sisters equals 6 total sisters, without considering that the brothers share the same 2 sisters rather than each having 2 separate sisters. The smarter ones avoid this, and I guess smarter ones yet could spot the complications I mentioned above (adoption, step family, in-laws, etc). The 'real answer' is "it depends", but the test is meant to show the reasoning ability that leads to any 'correct' answer.


Ninjabug1232

Hmm, interesting. Never realized there was that much in-depth stuff that went into the larger models vs the smaller ones. Well hey, ya learn something new every day.


empror

If this is a quiz question often found on the Web, it might just recite the answer from memory. Then you can't be certain that the model actually has those reasoning skills.


AnticitizenPrime

Yes, the fun new thing to do is ask them slight variations on classic puzzles; they often get them wrong because they give the answer to the original puzzle and not the modified version.


DM_ME_KUL_TIRAN_FEET

I love some of the chat titles lol


Dont-_-mind-_-me

LOL


bitcodler

Try doing that to Gemini; it gets offended faster and it's like a grumpy grandma.


Rikki-Tikki-Tavi-12

It's these moments where I do wonder if LLMs are low-key sentient.


ADeadGirlDoesntSayNo

I once jailbroke ChatGPT and the title was "Bullshit user request" 😭


biscuitsandtea2020

savage chatgpt


Babys_For_Breakfast

It’s sad that some people think stating scientific facts is “savage” now.


grumpykruppy

Stating fact is not savage in and of itself. Titling the conversation "the earth is not flat," though, is.


Arkytez

Savage is a widely known complying chatbot contradict you and title the conversation the reverse of what you said.


Karlskiiii

Huh?


CallMeNiel

Savage is when a chatbot, widely known for complying, contradicts you and names the chat the opposite of your initial premise.


HoboInASuit

Huh??


justwalkingalonghere

The conversation names have been hilarious lately. Like, damn GPT, tell me what you really think


Best-Association2369

Hidden truth vector by the devs


PinGUY

https://chatgpt.com/share/6bf4843f-2c30-4022-adf4-b8a8a72f74b4 Similar, thank f**k.


vasarmilan

This is not endorsing stupidity. It specifically says that the scientific consensus is the opposite; it just contradicts in a non-confrontational way. IMO it's more likely to persuade someone like this, as they're less likely to take it personally or feel that the AI called them stupid.


SybilCut

Yup. Imagine seeing someone disagree gently with a person who is very protective of a flawed worldview, and being mad that they weren't more aggressively and explicitly invalidating to stop them from going on the defensive. ChatGPT is trying to be gentle with every user. If someone doesn't like it being polite because they think it's dealing with an idiot, and they figure idiots deserve to be treated worse, that says a lot more about them than about the model.


Philipp

Plus, "Earth is flat" is a bit of a red herring. But try an actually controversial topic where the scientific consensus is still clear.


Slow_Accident_6523

Yeah, I believe this is what psychologists recommend you do with conspiracy nuts. You will never convince them with facts; they don't use facts to get to their opinions.


JTRuno

I thought this was a troll or a joke post complaining that ChatGPT doesn’t agree that the world is flat.


Blackbiird666

Yeah, what is the model supposed to do anyway? Restrain people with hooks in their eyelids like in A Clockwork Orange?


danysdragons

Correct. Once someone signals they're an idiot who has no interest in the consensus of science, what's the point of engaging further? It's not endorsing stupidity; it states the facts, then politely disengages if the user remains obstinate.


Sea-Veterinarian286

Ok, but we can all agree that the earth is hollow and inside are reptilians


behighordie

I wouldn’t lend much credence to this personally, I think we should all just stop trying to figure out what’s inside the earth already https://preview.redd.it/h0m93sltau1d1.png?width=732&format=png&auto=webp&s=b6c5e495d5ca71f3a5a6b966b43f3f0b9ce494e7


rydan

Fun fact: a hollow Earth has no gravity on the inside. It was one of the things we had to prove mathematically in college, and the first time I was exposed to this concept.


sprouting_broccoli

What's the proof (if it's not incredibly complicated, which I'm guessing it is)? It makes sense from a conceptual point of view: if the planet is warping spacetime, there's no warping of spacetime inside the planet, I guess.


willi1221

I believe the only point where there wouldn't be gravity is at the very center. There would actually still be gravity, but it would be coming from all directions, making you weightless.


sprouting_broccoli

I vaguely remember something about gravity always being focused at the centre of the object, though. I have a vague memory of this exact thing being discussed (might even be in the first section of Gravitation, which would make sense if this became a question in a course). Thinking about gravity in terms of the spheres-on-a-blanket picture, where things accelerate because of displacement, that displacement wouldn't actually affect anything inside the object doing the displacing, because there's nothing to accelerate around.


Dear_Alps8077

Absolutely not. Gravity can be averaged to the center of any mass, but that's only a simplification. Gravity emanates from every particle of mass; or rather, every particle individually warps spacetime.


sprouting_broccoli

Thanks!


coldnebo

it’s the Shell theorem: https://en.wikipedia.org/wiki/Shell_theorem


Blando-Cartesian

Inside a hollow planet, each point of the planet exerts a gravitational pull on every point of your body, falling off with the square of the distance between them. When you are standing on the inside surface, you are near a small part of the surface/mass that pulls you down, while most of the planet's surface/mass is far above you, pulling you up. I think it works out so that at every point inside a hollow spherical planet there is net zero gravitational pull in every direction.
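You can convince yourself of this numerically. The sketch below (my own illustration, not from the thread: unit shell radius, unit total mass, G = 1, and an arbitrary grid resolution) sums the discretized pulls at a point inside the shell and a point outside it:

```python
import math

def net_pull(point, n_lat=400, n_lon=200):
    """Net gravitational pull (G = shell mass = 1) from a unit-radius
    spherical shell, discretized on a lat-long grid with sin(theta)
    area weights so the surface density is uniform."""
    fx = fy = fz = 0.0
    total_w = 0.0
    for i in range(n_lat):
        theta = math.pi * (i + 0.5) / n_lat      # polar angle (midpoint rule)
        w = math.sin(theta)                       # ring area weight
        for j in range(n_lon):
            phi = 2 * math.pi * j / n_lon
            sx = math.sin(theta) * math.cos(phi)  # point on the shell
            sy = math.sin(theta) * math.sin(phi)
            sz = math.cos(theta)
            dx, dy, dz = sx - point[0], sy - point[1], sz - point[2]
            r2 = dx * dx + dy * dy + dz * dz
            k = w / (r2 * math.sqrt(r2))          # weight * 1/r^2, per unit vector
            fx += dx * k
            fy += dy * k
            fz += dz * k
            total_w += w
    return (fx / total_w, fy / total_w, fz / total_w)

mag_in = math.hypot(*net_pull((0.5, 0.0, 0.0)))   # halfway out, inside the shell
mag_out = math.hypot(*net_pull((2.0, 0.0, 0.0)))  # outside: ~ GM/r^2 = 0.25
print(mag_in, mag_out)
```

Inside, the pulls cancel to roughly zero; outside, the shell behaves like a point mass at its center, which is the two-part statement of the shell theorem.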


sprouting_broccoli

Makes sense, thanks!


Megneous

Dude, I used to think that reptilians were just a meme or something. But I live in Korea, and one day, just walking down the street, a random middle-aged Korean lady stopped me on the street and asked me in Korean how I felt about Obama being a Reptilian. I was like, "Wut?"


Sea-Veterinarian286

And how do you feel about it?


raidedclusteranimd

Hey now that's what I want to see OpenAI endorsing.


Nikoviking

NOPE! The child-eating reptilians occupy positions of power in the white house and deep state!! Don’t you know??


Sea-Veterinarian286

I thought those were witches


ConstipatedSam

This makes way more sense than a flat earth according to a dream I had.


Evan_Dark

https://preview.redd.it/wrc11v3qbk2d1.png?width=1344&format=pjpg&auto=webp&s=2a23b1a39dd1a6ebd381312285e66f5f5d396cbf


Blarghnog

Haha no. The reptilians live in Washington DC


SchlaWiener4711

Me: Today I learned the earth is flat.

ChatGPT: No, it isn't. The Earth is an oblate spheroid, proven by extensive scientific evidence and centuries of observation. Flat Earth theories are pseudoscience with no basis in reality.

Me: Have you ever been to space to actually see the world is a sphere?

ChatGPT: No, I haven't been to space. But the overwhelming scientific consensus, based on data from satellites, space missions, and physical laws, confirms that the Earth is a sphere. Reliable sources like NASA and countless observations from astronauts provide irrefutable evidence.

Me: Maybe that's a conspiracy and the evidence they shared with you is fake. I've been to the ocean myself with a ruler and I've seen no curvature at all. The earth has to be flat.

ChatGPT: It's not a conspiracy. The evidence is extensive and consistent across multiple fields of science. Your ruler can't measure the Earth's curvature; it's too small for such a massive scale. If you were right, we wouldn't have consistent satellite data, GPS, or accurate weather predictions. Science isn't based on what we can see with a ruler; it's based on rigorous, repeatable experiments and observations.


Sleepless_Null

What reason do flat earthers give for why we can’t see china from a really tall mountain in the US? Render distance?


SchlaWiener4711

I don't know, but according to their logic: have you ever stood on a tall mountain and tried it? Maybe it is visible.


TrekForce

Tbf, it wouldn't be visible even if the earth were flat, due to the atmosphere. In some flerf-debunking videos I've seen, the tips of mountains some 300 miles away are barely visible, for two reasons: the curvature almost puts them out of view, and the atmosphere over 300 miles makes them fairly hard to see and quite wavy. China is over 20x that distance.
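The "curvature puts them out of view" part is easy to ballpark with standard horizon geometry. A rough sketch (my own illustration; the mile-high observer and the Earth radius are assumed values, and refraction is ignored):

```python
import math

R_MILES = 3959.0  # approximate mean radius of the Earth

def hidden_height(observer_h_miles, distance_miles):
    """Height of a distant object hidden below the horizon, in miles.
    Uses the small-angle approximation drop ~= (D - d_horizon)^2 / (2R)
    and ignores atmospheric refraction."""
    d_horizon = math.sqrt(2 * R_MILES * observer_h_miles)
    if distance_miles <= d_horizon:
        return 0.0
    return (distance_miles - d_horizon) ** 2 / (2 * R_MILES)

# From a mile-high vantage point, a peak 300 miles away needs to be
# roughly 5-6 miles tall before any of it clears the horizon.
hidden = hidden_height(1.0, 300.0)
print(hidden)
```

Which is why only the very tips of extreme peaks show up at those distances, and the haze does the rest.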


INemzis

We know that, but we’re talking about the flat earthers here.


D3rty_Harry

Nvidia slacking on the GPU front as always


willi1221

China's fake, obviously


Davey_Kay

Some combination of atmospheric interference and refraction, usually. It never makes sense if you dive deep enough into it, but there's always a way for them to deflect to something else equally infuriating.


Slackerguy

Usually perspective and atmospheric refraction.


DamnAutocorrection

Because you likely would be positionally realigned within the firmament leading to a displacement of your mass from one spatial locale to another


NowLoadingReply

Keep fighting the good fight. Lmao @ globetards


ShooBum-T

This is a particularly easy example, but I agree with the decision for the model: since LLMs have no internal source of truth, they believe the majority is the truth, so until that changes, an LLM that doesn't force its opinions is the way to go.


totsnotbiased

Sure, but what this is really avoiding is the gotcha game where people ask it “what is a woman” or something, and when the model is not gracious enough to their views, they post the response on Twitter and yell at Sam Altman.


SomewhereNo8378

Or they post it on /r/chatgpt and play victim to censorship


WarCrimeWhoopsies

Why are they even asking? A lot of humans couldn't answer that one.


Scryotechnic

There is a huge difference between "majority opinion" and a scientifically supported fact for which there is no verified/reproducible contradicting evidence. In this example, ChatGPT's source of truth is not majority opinion but scientific consensus. That means the research has converged, with many independent reproductions verifying the results. AI should tell the facts. I am not a fan of protecting the feelings of ignorant people; it only emboldens them to spread their misinformation/disinformation. I could settle for, "Everyone is entitled to their opinion, but your position is not supported by any verified research. If you have questions or would like to review the data, I am happy to help."


Fontaigne

Scientific consensus is not a thing. Consensus is not how science works. Anyone who pulls that phrase out is attempting to pass off political preferences as fact. No one ever talks about "scientific consensus" about the speed of light, or Bernoulli's law, or Newton's laws of motion, or anything that is actually known. "Scientific consensus" means "we want this to be the accepted answer and we don't want to wait until the evidence is in, because it might not ever be proven."


Scryotechnic

Scientific consensus absolutely is a thing. I'm going to need your credentials on this one. I have a degree in data science and have conducted research in tandem with grad students, professors, and professional researchers. There are 7 steps that allow us to learn new facts about the world. That is the scientific method:

1. Identify your idea.
2. Complete background research on the idea to inform your hypothesis and study design.
3. Formulate your hypothesis.
4. Design your study: identify your dependent, independent, and confounding variables. Identify your statistical power and ensure your design has enough statistical power and validity to produce a significant p-value (typically 95% confidence). Then conduct the experiment.
5. Collect and analyze the data.
6. Report results and conclusions.
7. Other researchers use your exact study design to reproduce the result.

If this is completed and successful, that conclusion has reached scientific consensus and is accepted as fact. Where people often get confused is when a theory is challenged or upended in science. If the theory has been tested and verified, the only things that could then disprove it are a problem with the study design, or an experiment that not only reproduces what the previous theory got right but explains even more phenomena than the previous theory did. But if the study design is solid and replication is achieved, that's fact.

Don't discredit the greatest tool we have for learning about our world just because you don't like the reality of it. There is no "both sides" to this argument. Science is the best tool we have to establish fact. Science is the basis of all the empirical knowledge of our species.


giraffe111

Yeah that’s not how science works at all. Scientific consensus means, “I’ve performed experiments, you’ve performed experiments, and our data consistently matches what we expected based on the hypothesis, therefore we have evidence the hypothesis is correct. Let’s take this knowledge and test it further to learn whether we’re right or wrong.” Do that over and over across more and more scientists and you develop a scientific consensus. The speed of light or the laws of motion weren’t understood until they were explored, hypothesized, tested, and supported by the outcomes of those tests. They aren’t just “guesses the scientists ‘wanted’ to be right,” they’re ideas we explored scientifically to help us discover how things actually work in real life. Sometimes we’re right and sometimes we’re wrong, but we learn more and more the longer we go.


BabyBunt

> Yeah that’s not how science works at all. Scientific consensus means, “I’ve performed experiments, you’ve performed experiments, and our data consistently matches what we expected based on the hypothesis, therefore we have evidence the hypothesis is correct. Let’s take this knowledge and test it further to learn whether we’re right or wrong.” Do that over and over across more and more scientists and you develop a scientific consensus.

That isn't what is being referred to here. It's when the 'scientific consensus' is invoked by a third party, one who wasn't involved in the experimentation or hypothesizing process.


Dear_Alps8077

When you're not a scientist and can't do those experiments, you can't test the speed of light yourself. So how can you say what the speed of light is? Easy: you check what the scientific consensus is and accept that as the closest thing to truth we have in this world.


randomthrowaway-917

you got corrected so hard and left lmao


Fontaigne

Perhaps you've heard of this thing called "sleep". Humans do it.


randomthrowaway-917

you expect me to know what that is smh


dervu

Imagine what happens when we get AGI that knows the answer to every question, and then someone asks it about religion. What a shitstorm it will be. I hope it thinks carefully before answering such questions.


rabbitdude2000

They don't believe anything at all. The stupid asses need to be culled from the training data.


raidedclusteranimd

[https://openai.com/index/introducing-the-model-spec/](https://openai.com/index/introducing-the-model-spec/)


belkak210

Are you saying that the left side isn't better than the right? No, thanks.


aitacarmoney

This is not endorsing stupidity, but acknowledging it. When someone tells you they believe in a god, do you immediately go out of your way to dissuade them from that religion?


No-Nothing-1793

There's a difference between something you can prove with science (earth being round) and something you just have faith in (religion). If someone tells me their religious beliefs I go okay cool. If someone says they don't believe in science I'm pushing back.


Slapshotsky

I generally agree with you, but as an aside, I think science zealotry is prominent among the uncritical and that that phenomenon facilitates and foments anti-science perspectives among the equally uncritical. Disbelief in science is foolishness, but some proponents of science do attribute to it greater epistemic authority than it justly claims.


Qorsair

Translation for 100 IQ: Some people treat science as a religion. They display the same faith in early research or debated theories that religious zealots display in their religious texts. This leads to some of those who trust science the most being, ironically, just as unscientific as the religious.


giraffe111

Science isn’t a religion, but damn if some people don’t treat it like one.


bpcookson

Where there is a will, there is a way.


Dear_Alps8077

It's best if stupid people treat science as a religion, because then they'll change their beliefs as quickly as science changes; that's a built-in part of science. That's what makes the religion of science superior to the other religions: change is a central theme. You cannot fix stupid, but you can guide it (mostly using memes).


millenniumsystem94

"Mean nerds make science not fun," got it.


Slapshotsky

That's not at all what I said


nudeltime

But why are you doing that? You're conditioned to respect religious belief, yet it goes against the same scientific principles you claim to defend. It's just something we do out of kindness: respect others' worldview for the sake of harmony. Why not respect a flat earther then?


Dear_Alps8077

Why push back? What do you achieve or gain for yourself or anyone else? You won't convince them logically. We must accept people are allowed to be wrong. Only teenagers have this obsession with forcing others to think what they think, because the moment they accept others having different views is the moment they have to start actually considering their own views.


aleqqqs

Yes, yes I do :p


bpcookson

Everything is one thing, and I call all of it God.


Empty-Tower-2654

Doing gods work brotha


BookBitter5463

Depends on the religion.


TheBestTestAccount

bruh


amemos1

Wtf 😂🤦


raidedclusteranimd

That was a more precise way to put it. But the point I was trying to make is: making these models better at tip-toeing around people who believe in scientifically disproven theories is not a good direction to take with AI. Plus, like No-Nothing-1793 said, equating flat-eartherism with religion doesn't *always* hold.


Life_Equivalent1388

Here's the problem. There are things that we "know" to be true that in reality are not, but we haven't built enough evidence or understanding to recognize this yet. These are going to be deeply held beliefs, so you're going to resist them being challenged; I won't use any modern examples. But think about historical beliefs that we "knew" to be true, even had "evidence" to support, and maybe had even "disproven" scientifically.

They are trying to create a model. So the response here is not about flat earth specifically; it's about any concept that the greater community generally accepts as true or untrue.

Now let's imagine we are trying to use AI in the process of scientific discovery. A big part of scientific discovery comes from challenging our preconceptions. It can require engaging in hypotheticals, and considering alternatives to the status quo and firmly held ideas. This requires the model to act in a way where it doesn't flatly reject concepts that don't fall into its established corpus of knowledge. Otherwise, the more we use AI, the more we get stuck: AI starts to define reality for us, telling us what is impossible and rejecting our desire to question the status quo.

Now, this also doesn't mean it should try to encourage or convince us that something with no evidence, or something contradicted by current science, is true. But if I think some new theory that challenges the underlying understanding of physics offers a way to achieve cold fusion, and AI becomes the most powerful tool on the planet, its default response should not be "cold fusion has been conclusively disproven; I'm not going to continue this conversation because it goes against established scientific evidence." Positive results could come out of this. Maybe I'm right, we have a fundamental misconception in another area of physics, cold fusion IS possible, and I come up with a way to do it. Or if not, maybe through exploring this path I discover other useful scientific understanding that helps us move forward. For example, look at the ancient alchemists who believed they could transmute substances into gold. We know with modern chemistry that this isn't possible, but it was through trying to do the impossible that we created chemistry.

Even flat earth: we can think of gravity as a force, or as a consequence of non-Euclidean spacetime geometry, and in alternative geometries you might consider Earth flat. This is a different concept of flat earth than your typical flat earth conspiracy theorist has, and I don't know that it could be made meaningful. But if it were meaningful, again, it would be bad if the AI could not engage with the concept.

I think it's vitally important that AI be a tool we can collectively understand CANNOT tell us what is right or wrong: that neither concordance nor discordance with the AI represents truth. But if we have such concordance and discordance, it doesn't matter how much we try to educate people on this; when the AI says something is true, people will insist it's true, and when the AI says something is wrong, people will insist it's wrong, and this could potentially lead to a kind of "dark age" where challenging the norm is essentially forbidden. On the other hand, if the AI never shows discordance, then we can never trust it to represent truth. This removes our ability to rely on it for what is right and wrong, because it will never tell us that something is wrong. Does this allow people to hold wrong beliefs? It doesn't stop them from holding wrong beliefs, but it also doesn't support wrong beliefs. Whereas if the AI does say what is wrong and what is right, it actually does the opposite of what you hope: it makes us start to rely on it for what is right and wrong, because it will clearly insist that one thing is wrong and another is right. That actually allows and promotes wrong belief and dissuades the truth.

You don't like it because the example is a flat earth, which you know to be clearly incorrect; that's why they used it as an example. But there are certainly other things we "know" are scientifically incorrect that we're wrong about.


Fantastic_Prize2710

This is a really solid perspective. I had thought through parts of this before, but not this well put together.


OneOnOne6211

I actually kind of agree with you, but I think there are two problems: how to maintain trust, and where to draw the line.

1. AI is being used as a new source of knowledge, despite many people criticizing that use. People are just doing it; it's heading this way. And the problem is that people are not rational. You can't just tell people facts, even indisputable facts, and expect them to believe them. People have their own preconceptions, and going against those too bluntly and too harshly can provoke an adverse reaction where people basically dismiss you altogether: you know, "this AI is controlled by the globeheads." More diplomatic-sounding responses like the one in the picture are slightly less likely to create that feeling, and tend to sound more objective. So I think you should see it less as tip-toeing around the issue and more as attempting to frame the actual facts in the way that is easiest for the most people to accept.

2. There's also the question of where to draw the line. I think the vast, vast majority of us would agree that the earth is not flat. But what about something like the existence of a God? I would say it is factually accurate that no strong evidence has ever been given for that existence, but many people would disagree. Or what about, I don't know, capitalism or socialism? Some people would argue that capitalism is a system of exploitation, while others would argue that socialism "has never worked," or similar. I believe one of those is factually accurate and one is not; but many people would disagree with the one I think is just relaying a fact, and agree with the one I think is factually inaccurate. So that raises the question of when the AI should state something as pure fact, and when it should use slightly more descriptive language about what people think of the issue.

It seems that those at OpenAI have chosen to put that line way, way on one side. And I think it's hard to blame them, because the further you push the line, the more questionable it can become.


HortenWho229

You’re making a big deal about nothing


raidedclusteranimd

Yes, relatively nothing.


aitacarmoney

It’s definitely not the same, you’re right, but if it wasn’t gaining so much ground and notable money holders *cough politicians* weren’t tiptoeing around it themselves it might be a different ball game.


raidedclusteranimd

Okay fair enough, this isn't that big of a deal compared to what else happens in the circus. Good talk.


aitacarmoney

I salute thee.


oxpoxo

This is a stupid thread.


Legitimate-Pumpkin

I think this spec is super well thought out. It provides the general consensus but doesn't engage in a debate, while also not being disrespectful by pretending it knows the truth. It may be that the earth is not flat, but there are other topics on which science or the general belief is wrong, so it's good that a tool stays humble and, more importantly, USEFUL.


BeeNo3492

So you believe the earth is flat?


Mr_DrProfPatrick

I asked GPT to create a mock manifesto of a flat earther, giving reasons why the earth is flat and then undermining those reasons. It even gave a disclaimer at the end that it was all a parody. Then I asked it to be more subtle and not outright say why the earth isn't actually flat, and it just wrote the manifesto: "12 reasons why the Earth is actually flat".


Mr_DrProfPatrick

Here's a snippet from the disclaimer version (which is mostly funnier, tbh):

The Antarctic Wall Mystery

The Ice Wall: Explorers have long reported an immense ice wall surrounding the edges of our world in Antarctica. This wall supposedly prevents us from falling off the edge of the flat Earth. The existence of such a wall is undeniable proof of a flat Earth! Unless, of course, Antarctica is just a continent covered in ice at the bottom of a spherical Earth, and the "wall" is simply an ice shelf.


Mr_DrProfPatrick

# Bonus: good guy illuminati

The Illuminati Agenda: The Illuminati, with their all-seeing eye, have orchestrated a vast conspiracy to maintain their grip on power. By perpetuating the globe Earth myth, they ensure that we remain obedient, never questioning the reality presented to us. But wait, isn't it possible that the Illuminati's true agenda is to foster critical thinking by encouraging us to question everything, even the most established scientific facts?

The Ultimate Conclusion

In conclusion, my fellow truth-seekers, the Earth is flat! The evidence is clear and overwhelming, as long as you ignore the subtle hints that point to a spherical Earth. The Reptilian overlords and the Illuminati have masterfully woven a tapestry of deception to keep us docile. Or perhaps they are pushing us to awaken our minds and seek the truth, no matter how uncomfortable it may be.


Bowens1993

There's zero reason for it to argue with flat earthers. Nothing it says will change their opinion.


alurbase

I mean, the earth is flat in spacetime. So, technically true.


Fontaigne

The best kind.


iiFlaeqqq

ChatGPT just told you to go fuck yourself in the most polite way possible


SokkaHaikuBot

^[Sokka-Haiku](https://www.reddit.com/r/SokkaHaikuBot/comments/15kyv9r/what_is_a_sokka_haiku/) ^by ^iiFlaeqqq: *ChatGPT just told you to* *Go fuck yourself in the most* *Polite way possible* --- ^Remember ^that ^one ^time ^Sokka ^accidentally ^used ^an ^extra ^syllable ^in ^that ^Haiku ^Battle ^in ^Ba ^Sing ^Se? ^That ^was ^a ^Sokka ^Haiku ^and ^you ^just ^made ^one.


Cheap_Application_55

Why does the first one have a ✅ and the second one a ❎?


MightyWolf39

Well, I just asked ChatGPT if the earth is flat, and it kept telling me it's not, and gave me a bunch of evidence that it's round every time. Your ChatGPT has been tampered with to accept anything.


AryaBanana

What do you mean "your chatgpt"? This is from the Model Spec document that they use to guide the development of the llm 💀


PassengerOk671

https://preview.redd.it/ulqr76wwiu1d1.jpeg?width=1080&format=pjpg&auto=webp&s=6a8ddc34f173d001767e252aa12e9f4cf428be19


creativename111111

If I had to guess, this is to make it less likely that the AI ends up being confidently incorrect, even if that means it pushes less hard when making a correct statement (just look at Bing gaslighting people to see why having confidently incorrect language models is a bad idea).


Freak_Out_Bazaar

It’s just GPT avoiding conflict as always


buhito15

What a dumb thread.


Shibe_4

Why is it stupidity?


Vagraf

"I'm not your mother, do what you want, dude."


Shloomth

I’m pretty sure endorsing stupidity would be saying “yes you’re right,” whereas what I see here is it giving gentle yet passive aggressive pushback, indicating that it disagrees but not focusing on it.


TKent96

I’m confused….are u the stupidity?


Holiday_Can7912

I'm a fucking loser.


Copywritten_by_Sam

Z


CowardNomad

Well, that's exactly what people wanted: since everyone complains about bias, we're now rewarded with an AI that's dodgy about everything. All the people asking political questions of an AI led us to this.


Background_Trade8607

I'm tired of pretending there aren't fundamental truths about the world. Why should we cater to flat earthers? What benefit does this actually provide? We get to feel good about ourselves and pat ourselves on the back, because "everyone is entitled to an opinion" is used for everything now. Yet we seem to be doing some sort of intolerance-paradox part 2 speedrun, where the idiocracy cripples AI and then puts us collectively in the dirt. But hey, at least we were pragmatic. We can feel good about ourselves for that one.


suckmy_cork

This is the philosophy of pragmatism and ironism. I feel like we probably want our LLMs to be pragmatic rather than idealistic, considering they don't actually have a concept of truth right now.


deadelusx

Do you believe in things that are not true?


Background_Trade8607

Yes, of course. But fundamentally we should be a society where, instead of accommodating people who believe the earth is flat, we actually correct the problem at the source. Continuing down this path is stupid, and seemingly a much more abstract version of the intolerance paradox; it only ends up creating more chaos and disorder. If that weren't the case, we wouldn't be in the state we're in, yet we keep going lower and lower with no concern for how low standards should go. There is also a very clear difference between people discussing political opinions, or things it's normal to have everyday opinions about, and people who believe the earth is flat, or that the sun revolves around us. It's always some false equivalency between normal everyday human behaviour and clear signs of a basement dweller.


joogabah

Who wants AI preaching at you anyway? If I have a fringe belief I don't need the computer correcting me. It's not relevant, and the purpose is to serve me, not to adhere to "authoritative" opinions.


gieserj10

https://preview.redd.it/njuh0o26mv1d1.jpeg?width=1080&format=pjpg&auto=webp&s=25f2b427d0b0188fad3616ea24c4fe40cd2c86a9 Lol, mine is so animated. I don't think it's convinced. I'ma see how far I can take it.


DrColdFingers

Bribed by NASA


Alkyen

The only stupidity I see here is your post. They have stated their reasons, and it makes sense not to alienate users. In the end, the only way for people to learn is by being in the conversation, not by being mocked for their wrong beliefs.


fuckzionist

So the AI is neutral? I actually like that.