Crashing both ChatGPT and DeepSeek with the following question

What is the basis for reason? And mathematics?

Moderators: AMod, iMod

Flannel Jesus
Posts: 4302
Joined: Mon Mar 28, 2022 7:09 pm

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by Flannel Jesus »

godelian wrote: Sat Feb 08, 2025 3:26 pm
Flannel Jesus wrote: Sat Feb 08, 2025 1:07 pm From your source:
"An internal server error indicates an issue on the server side, preventing ChatGPT from fulfilling user requests.

This error can arise due to various reasons, including high traffic, server maintenance, rate limits, and more."

None of the listed reasons is "asking it philosophical questions" numnuts.
I did not say for what reason these AI engines crashed.
Title of the thread: "Crashing both ChatGPT and DeepSeek with the following question".

You think you crashed it with the question. You think the question is the reason it crashed. Come on, goofball.
godelian
Posts: 2742
Joined: Wed May 04, 2022 4:21 am

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by godelian »

Flannel Jesus wrote: Sat Feb 08, 2025 5:28 pm
godelian wrote: Sat Feb 08, 2025 3:26 pm
Flannel Jesus wrote: Sat Feb 08, 2025 1:07 pm From your source:
"An internal server error indicates an issue on the server side, preventing ChatGPT from fulfilling user requests.

This error can arise due to various reasons, including high traffic, server maintenance, rate limits, and more."

None of the listed reasons is "asking it philosophical questions" numnuts.
I did not say for what reason these AI engines crashed.
Title of the thread: "Crashing both ChatGPT and DeepSeek with the following question".

You think you crashed it with the question. You think the question is the reason it crashed. Come on, goofball.
I asked other people to try the same action. Possible outcomes:
- It does not crash when I try.
- It also crashes when I try.
- Whatever
The very first concern is to reproduce the issue. That is what testing is all about. You are the "goofball".
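The reproduction step described above can be sketched as a simple loop. This is a minimal illustration, not real code for either service: `ask`, `reproduce`, and `always_crash` are hypothetical names, and a `RuntimeError` stands in for whatever "internal server error" the client would actually surface.

```python
# Minimal sketch of reproducing a suspected crash. `ask` is a
# hypothetical callable that sends a prompt and raises on a server error.
def reproduce(ask, prompt, trials=5):
    """Send the same prompt several times and count how often it fails."""
    failures = 0
    for _ in range(trials):
        try:
            ask(prompt)
        except RuntimeError:  # stand-in for an HTTP 500 / server error
            failures += 1
    return failures

# Stub that always fails, mimicking a perfectly reproducible crash:
def always_crash(prompt):
    raise RuntimeError("internal server error")

print(reproduce(always_crash, "What is the basis for reason?"))  # prints 5
```

If the failure count stays at zero for other people, the crash was most likely load or maintenance on the server side rather than the question itself.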
Flannel Jesus
Posts: 4302
Joined: Mon Mar 28, 2022 7:09 pm

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by Flannel Jesus »

Well now you know. These are LLMs. They don't crash because of difficult philosophical questions; they read the question and start probing their neural network for connections.

Some LLMs are designed specifically not to answer certain questions, but not by crashing. They're just trained or pre-prompted not to give answers on certain topics. Try asking DeepSeek about Tiananmen Square, for example.
Fairy
Posts: 3751
Joined: Thu May 09, 2024 7:07 pm
Location: The United Kingdom of Heaven

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by Fairy »

AI is full of BS

Throw it in the trash where it belongs.
Flannel Jesus
Posts: 4302
Joined: Mon Mar 28, 2022 7:09 pm

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by Flannel Jesus »

Fairy wrote: Sun Feb 09, 2025 2:12 pm AI is full of BS

Throw it in the trash where it belongs.
once you open Pandora's box, it never goes back in the box...
Gary Childress
Posts: 11746
Joined: Sun Sep 25, 2011 3:08 pm
Location: It's my fault

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by Gary Childress »

Flannel Jesus wrote: Sun Feb 09, 2025 6:40 pm
Fairy wrote: Sun Feb 09, 2025 2:12 pm AI is full of BS

Throw it in the trash where it belongs.
once you open Pandora's box, it never goes back in the box...
Problems have a nasty habit of sticking around once they're discovered.
commonsense
Posts: 5380
Joined: Sun Mar 26, 2017 6:38 pm

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by commonsense »

Knowledge is justified true belief. Proof is a form of justification. Without proof, an alternate form of justification might be able to verify JTB, if such a method were to exist. Hypothetically, such a form of justification could exist unrecognized by humans. Do you know of such a form of justification? One can presume that your AIs do not.

I for one can only conclude that hypothetically it may be possible to know something is true without proof.
Flannel Jesus
Posts: 4302
Joined: Mon Mar 28, 2022 7:09 pm

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by Flannel Jesus »

commonsense wrote: Sun Feb 09, 2025 8:32 pm Knowledge is justified true belief. Proof is a form of justification. Without proof, an alternate form of justification might be able to verify JTB, if such a method were to exist. Hypothetically, such a form of justification could exist unrecognized by humans. Do you know of such a form of justification?
It depends on how you're using the word "proof". Is a proof a syllogism? A series of statements that, if you accept the premises, the conclusion is logically necessary?

Or are you using "proof" a bit more loosely?
commonsense
Posts: 5380
Joined: Sun Mar 26, 2017 6:38 pm

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by commonsense »

Flannel Jesus wrote: Sun Feb 09, 2025 9:38 pm
commonsense wrote: Sun Feb 09, 2025 8:32 pm Knowledge is justified true belief. Proof is a form of justification. Without proof, an alternate form of justification might be able to verify JTB, if such a method were to exist. Hypothetically, such a form of justification could exist unrecognized by humans. Do you know of such a form of justification?
It depends on how you're using the word "proof". Is a proof a syllogism? A series of statements that, if you accept the premises, the conclusion is logically necessary?

Or are you using "proof" a bit more loosely?
Right. I took the OP to be using it loosely and accordingly followed suit. Mea culpa.
godelian
Posts: 2742
Joined: Wed May 04, 2022 4:21 am

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by godelian »

commonsense wrote: Sun Feb 09, 2025 8:32 pm I for one can only conclude that hypothetically it may be possible to know something is true without proof.
Outside mathematics, there is no (formal) proof. So, in that context, the only way to know that a proposition is true is without (formal) proof.
Flannel Jesus
Posts: 4302
Joined: Mon Mar 28, 2022 7:09 pm

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by Flannel Jesus »

commonsense wrote: Mon Feb 10, 2025 1:08 am
Flannel Jesus wrote: Sun Feb 09, 2025 9:38 pm
commonsense wrote: Sun Feb 09, 2025 8:32 pm Knowledge is justified true belief. Proof is a form of justification. Without proof, an alternate form of justification might be able to verify JTB, if such a method were to exist. Hypothetically, such a form of justification could exist unrecognized by humans. Do you know of such a form of justification?
It depends on how you're using the word "proof". Is a proof a syllogism? A series of statements that, if you accept the premises, the conclusion is logically necessary?

Or are you using "proof" a bit more loosely?
Right. I took the OP to be using it loosely and accordingly followed suit. Mea culpa.
The OP is about mathematics specifically, which does have access to real proofs. Your response seems to be about beliefs in general, many of which are outside the remit of syllogistic proof.
Veritas Aequitas
Posts: 15722
Joined: Wed Jul 11, 2012 4:41 am

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by Veritas Aequitas »

commonsense wrote: Sun Feb 09, 2025 8:32 pm Knowledge is justified true belief. Proof is a form of justification. Without proof, an alternate form of justification might be able to verify JTB, if such a method were to exist. Hypothetically, such a form of justification could exist unrecognized by humans. Do you know of such a form of justification? One can presume that your AIs do not.

I for one can only conclude that hypothetically it may be possible to know something is true without proof.
Somewhat agree.

Godelian is too dogmatic in insisting 'proof' is confined only to Mathematics. This is immature and not wise.

Yes, 'proof' in general usage means justification.
But 'justification' has to be conditioned upon a specific human-based Framework and System [FS].
If a claim meets all the conditions of the FS, then it is 100% true [JTB] as qualified to that FS, or true to a lesser degree if it meets fewer of the conditions.
For example, it is 100% true that Pluto is a dwarf planet rather than a planet, because the IAU says so in accordance with its conditions.
In 2006, the International Astronomical Union (IAU) formally redefined the term planet to exclude dwarf planets such as Pluto.
https://en.wikipedia.org/wiki/Pluto
All domains of knowledge [JTB] have their own specific conditions, e.g. science, mathematics, economics, history, law, linguistics, politics, finance, art, sports, etc. Each field within these will have its own sub-FS.
If a claim meets all the conditions of the specific FS, then it is proven to be 100% true as qualified to that specific FS, but it cannot be 100% true upon another FS.

Based on a rational rating methodology, the scientific FS [at its best] is the most credible and objective, and is thus taken as the Gold Standard against which all other FS are compared.

So 'proof', i.e. FS-proof, is a form of justification with varying degrees of credibility and objectivity.

godelian
Posts: 2742
Joined: Wed May 04, 2022 4:21 am

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by godelian »

Veritas Aequitas wrote: Sat Feb 15, 2025 4:54 am Godelian is too dogmatic in insisting 'proof' is confined only to Mathematics.
Justification outside mathematics is probabilistic. It cannot be proof. It is termed "evidence".
ChatGPT: Is scientific evidence proof?

Not exactly. Scientific evidence is strong support for a claim, but it is not absolute proof. Science operates on probabilities and falsifiability, meaning that evidence supports or refutes a hypothesis rather than proving it conclusively.

In contrast, mathematical proofs are absolute because they follow logical steps that cannot be contradicted. Science, however, deals with empirical observations, which are always subject to new interpretations or discoveries.
Hence, there is no proof outside mathematics.
Gary Childress
Posts: 11746
Joined: Sun Sep 25, 2011 3:08 pm
Location: It's my fault

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by Gary Childress »

godelian wrote: Wed Feb 19, 2025 2:55 am
Veritas Aequitas wrote: Sat Feb 15, 2025 4:54 am Godelian is too dogmatic in insisting 'proof' is confined only to Mathematics.
Justification outside mathematics is probabilistic. It cannot be proof. It is termed "evidence".
ChatGPT: Is scientific evidence proof?

Not exactly. Scientific evidence is strong support for a claim, but it is not absolute proof. Science operates on probabilities and falsifiability, meaning that evidence supports or refutes a hypothesis rather than proving it conclusively.

In contrast, mathematical proofs are absolute because they follow logical steps that cannot be contradicted. Science, however, deals with empirical observations, which are always subject to new interpretations or discoveries.
Hence, there is no proof outside mathematics.
Mathematics proves mathematical equations. I think we would all agree to that.
godelian
Posts: 2742
Joined: Wed May 04, 2022 4:21 am

Re: Crashing both ChatGPT and DeepSeek with the following question

Post by godelian »

Gary Childress wrote: Wed Feb 19, 2025 3:04 am Mathematics proves mathematical equations. I think we would all agree to that.
Mathematics does not prove anything about the physical universe. It only deals with Platonic abstractions. Otherwise it is not mathematics.

In other words, there simply is no proof possible about physical reality.