#521 |
Penultimate Amazing
Join Date: Jan 2003
Location: Yokohama, Japan
Posts: 27,952
__________________
A fool thinks himself to be wise, but a wise man knows himself to be a fool. William Shakespeare |
#522 |
Penultimate Amazing
Join Date: Jan 2003
Location: Yokohama, Japan
Posts: 27,952
The newest iteration of GPT, GPT-4, can pass a bar exam:
https://www.iit.edu/news/gpt-4-passes-bar-exam
Quote:
__________________
A fool thinks himself to be wise, but a wise man knows himself to be a fool. William Shakespeare |
#523 |
Lackey
Administrator
Join Date: Aug 2001
Location: South East, UK
Posts: 109,555
I've got access to Google's Bard - thought I'd see how they compare to each other:
Create a fictional account of someone attending a CND rally in the 1980s

Bard:

It was a cold and rainy day in London in 1983, but that didn't stop the thousands of people who had come to protest against nuclear weapons. The Campaign for Nuclear Disarmament (CND) had organized a rally in Hyde Park, and the atmosphere was electric.

People of all ages and from all walks of life had come to show their support for CND's cause. There were families with young children, students, and older people. Many of them were carrying banners and placards with slogans like "Nuclear War: Not In My Name" and "Disarm Now."

The rally started with a speech by CND's leader, Bruce Kent. He spoke about the dangers of nuclear weapons and the need for the world to disarm. He also spoke about the importance of peace and the need for people to work together to create a better future.

After Kent's speech, there were performances by singers and musicians. There was also a march through the streets of London. The rally ended with a candlelight vigil in Hyde Park.

ChatGPT:

As Jane stepped off the bus and onto the streets of London, the energy in the air was palpable. She had never been to a CND rally before, but she had heard about them from her friends and felt compelled to come and show her support.

The streets were packed with people of all ages, carrying signs and banners with slogans like "No Nukes" and "Ban the Bomb". As Jane made her way through the crowd, she felt a sense of unity with everyone around her, all of them united in their opposition to nuclear weapons.

The atmosphere was electric as the rally began. Speakers took to the stage, urging the crowd to take action and to raise their voices against the threat of nuclear war. The cheers and applause were deafening, and Jane felt a surge of emotion as she joined in.

As the rally continued, Jane found herself swept up in the passion of the moment. She linked arms with strangers, chanting and singing as they marched through the streets of London. She had never felt so connected to a cause before, and she knew that she was part of something bigger than herself.

As the day drew to a close and the rally began to disperse, Jane felt a sense of pride and hope. She knew that the fight against nuclear weapons was far from over, but she had seen firsthand the power of people coming together to make a difference. And as she made her way back home, she knew that she would continue to speak out against nuclear weapons and to stand in solidarity with all those fighting for peace.

Quite different results. I'd say that Bard gets the win: its prose is less "breathlessly" purple, it dropped in the right leader's name, and it used some of the slogans CND actually did use, rather than ChatGPT's generic "Ban the bomb".
__________________
I wish I knew how to quit you |
#524 |
Lackey
Administrator
Join Date: Aug 2001
Location: South East, UK
Posts: 109,555
Bard is interesting as it gives you other "drafts" of its answers.
Create a 1000 word science fiction story that ends with a twist
Quote:
Quote:
Quote:
__________________
I wish I knew how to quit you |
#525 |
Lackey
Administrator
Join Date: Aug 2001
Location: South East, UK
Posts: 109,555
Unfortunately, we will never know what Dr Zhang had discovered.
I tried again, but got another network error just as it got interesting.
__________________
I wish I knew how to quit you |
#526 |
Graduate Poster
Join Date: Aug 2009
Posts: 1,469
#527 |
Thinker
Join Date: Sep 2011
Posts: 181
In the process of finalising the paperwork for my divorce, I got an email from my ex. A "Thanks for our time together" kind of thing.
I showed it to a friend for his amusement, and his reaction was "this sounds like it was written by ChatGPT". A few days later I signed up at work to see how it could help (a workmate claims he now uses it more than Stack Overflow for I.T. stuff). So I fed in the barest minimum of details and asked ChatGPT to write me a "Thanks for our time together" kind of thing. My friend was right.
#528 |
Observer of Phenomena
Pronouns: he/him Join Date: Feb 2005
Location: Ngunnawal Country
Posts: 82,007
__________________
Слава Україні! Героям Слава! 20220224 - 20230224 |
#529 |
Skepticifimisticalationist
Join Date: Jun 2002
Location: Gulf Coast
Posts: 28,228
It won't, and can't. In the end, fleshy humans still have to decide which information is good and which is bad, and they're not going to make good choices all the time.
This is not, strictly speaking, a problem "of AI". Before "AI" came along, a Google search would readily hand you medical misinformation and mistaken data right alongside reputable sources and well-verified information.

Likewise, the problem of people misattributing authority or accuracy to AI isn't unique to AI; people believe COVID isn't real because they've been primed to misattribute authority or accuracy to human sources like Tucker Carlson or Joe Rogan or Alex Jones and believe what they say.

Given that it just answers questions as they are asked, I'm not sure an AI search engine would even really change the global rate at which people are exposed to, accept, and believe bad information. I think the winner there is always going to be social media sorting algorithms.
__________________
"¿WHAT KIND OF BIRD? ¿A PARANORMAL BIRD?" --- Carlos S., 2002 |
#530 |
Skepticifimisticalationist
Join Date: Jun 2002
Location: Gulf Coast
Posts: 28,228
It could be, but I'm not convinced it's especially likely.
People treat it as an inevitability because some very popular science fiction stories, written back when you needed several rooms to house a single computer, said it's inevitable, and people have by and large just uncritically accepted that, to the degree that they are ready to take any claim that some machine has achieved sentience more or less at face value.

But... is it an inevitability? I don't think it actually is.

I think the greatest danger of AI comes from the potential actions and decisions of people who mistakenly believe it is "sentient".
__________________
"¿WHAT KIND OF BIRD? ¿A PARANORMAL BIRD?" --- Carlos S., 2002 |
#531 |
Philosopher
Join Date: Jun 2008
Posts: 6,314
__________________
We don't want good, sound arguments. We want arguments that sound good. |
#532 |
Philosopher
Join Date: Jun 2008
Posts: 6,314
__________________
We don't want good, sound arguments. We want arguments that sound good. |
#533 |
Penultimate Amazing
Join Date: Jun 2005
Posts: 19,757
Look at humans in general, and tell me who is more sentient compared to AI.

If AI can pass the bar, what does that say about lawyers?
__________________
Any sufficiently advanced idea is indistinguishable from idiocy to those who don't actually understand the concept.
#534 |
Penultimate Amazing
Join Date: Jun 2005
Location: Way way north of Diddy Wah Diddy
Posts: 34,758
I think it probably says that you can learn the stuff by rote and still not know the difference between right and wrong, or, if you do know it, not care. I've known a few lawyers who were fools, as well as a couple of crooks. Also at least a couple who had real ideas and abilities. All passed the bar.
__________________
Like many humorless and indignant people, he is hard on everybody but himself, and does not perceive it when he fails his own ideal (Molière) A pedant is a man who studies a vacuum through instruments that allow him to draw cross-sections of the details (John Ciardi) |
#535 |
Penultimate Amazing
Join Date: Jul 2003
Posts: 10,244
I asked Bard:
Quote:
Quote:
__________________
I wish I could show you when you are lonely or in darkness the astonishing light of your own being. ~ Hafiz of Shiraz |
#536 |
Philosopher
Join Date: Aug 2002
Location: Denmark
Posts: 6,841
That is fun: ChatGPT thinks that Bard is generated by OpenAI, and that it is specifically for poetry.
Bard thinks that it is generated by Google, and that it is not specifically for poetry, but is more advanced than ChatGPT. |
__________________
Steen -- Jack of all trades - master of none! |
#537 |
The Clarity Is Devastating
Join Date: Nov 2006
Location: Betwixt
Posts: 20,153
I haven't the slightest idea,* but I'm pretty sure the earliest version of this puzzle (wherever Martin Gardner got it from) wasn't about shaking hands. Do you think ChatGPT knows enough about human nature to figure out why?

*Okay, I lied. Before looking up any answers to see if I'm right, here's the answer I came up with.
__________________
"*Except Myriad. Even Cthulhu would give him a pat on the head and an ice cream and send him to the movies while he ended the rest of the world." - Foster Zygote |
#538 |
Lackey
Administrator
Join Date: Aug 2001
Location: South East, UK
Posts: 109,555
Bard's answer
Quote:
Quote:
Quote:
__________________
I wish I knew how to quit you |
#539 |
The Clarity Is Devastating
Join Date: Nov 2006
Location: Betwixt
Posts: 20,153
Those answers are identical. So Bard's answer for "Here's how I solved it" should have been "I asked Bing chat." Or vice versa. Having just looked online for the answer, I can understand some of the chatbots' confusion.
Quote:
The answer I found first online started out with a simplified four-person version, for which the highlighted text would be correct. The bot is echoing what it learned from the Internet, but failed to distinguish the simplified version from the original question. |
__________________
"*Except Myriad. Even Cthulhu would give him a pat on the head and an ice cream and send him to the movies while he ended the rest of the world." - Foster Zygote |
#540 |
Lackey
Administrator
Join Date: Aug 2001
Location: South East, UK
Posts: 109,555
But they aren't. The first answer is: "Therefore, Mrs. Lars shook hands with 1 person and Mr. Lars shook hands with 4 people."

When it explained the logic behind it, it gave this answer: "Mrs. Lars shook hands with exactly 4 people and Mr. Lars shook hands with 0 people." So not only did it swap Mr. & Mrs. Lars around, it changed the 1 to a 0.
__________________
I wish I knew how to quit you |
#541 |
Philosopher
Join Date: Sep 2010
Location: Lenoir City, TN/Mineral Bluff, GA
Posts: 7,555
The answer is 4. Once you establish that Mr. Lars got 9 distinct answers to his query, those answers must have been 0, 1, 2, ..., 8.

The AI properly concluded that whoever shook 8 hands must have been married to the person who shook 0 hands. Diagramming it out: 7 married to 1, 6 married to 2, 5 married to 3, and the remaining "4" married to 4. Any couple other than the "4" couple would not have produced 9 distinct answers to the query: two people would have given the same answer of 4. So the person making the query must have shaken 4 hands, and their spouse as well.

https://youtu.be/W_TOq0zs6KM

My one improvement would have been to diagram it out with Mr. and Mrs. Lars unidentified until the end, deducing who they must have been from the 9-distinct-answers qualifier. Anyway, it's odd that the AI put parts of it together but dropped the ball at the end.
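The reasoning above can also be checked mechanically. Here's a minimal Python sketch (the labels and the pairing rule are my own construction, not from the thread or the video) that builds one handshake arrangement satisfying all of the puzzle's constraints and confirms the Lars couple both land on 4:

```python
# Sketch: build one arrangement consistent with the puzzle and check it.
from itertools import combinations

labels = list(range(9))      # the 9 people Mr. Lars asked, named by their answer
HOST = "mr_lars"             # Mr. Lars himself

shakes = set()
for a, b in combinations(labels, 2):
    if a + b >= 9:           # rule: two guests shook hands iff their answers sum to 9 or more
        shakes.add((a, b))
for a in labels:
    if a >= 5:               # Mr. Lars shook hands with the four highest answerers
        shakes.add((HOST, a))

def count(p):
    """Number of handshakes person p took part in."""
    return sum(p in s for s in shakes)

# The nine answers really are distinct (0 through 8):
assert sorted(count(a) for a in labels) == list(range(9))
# Spouses (paired as k with 8 - k) never shook each other's hands:
assert all((k, 8 - k) not in shakes and (8 - k, k) not in shakes for k in range(9))
# Mrs. Lars is the "4" answerer; she and Mr. Lars each shook 4 hands:
print(count(4), count(HOST))   # prints: 4 4
```

Being one consistent arrangement, this only confirms the "4 and 4" answer is attainable; the distinct-answers argument in the post is what forces it to be the unique answer.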
#542 |
Gentleman of leisure
Tagger
Join Date: May 2005
Location: Flying around in the sky
Posts: 27,723
Here is one way AI is already being used in a medical setting.

It takes months to train a sonographer, but people with fewer starting qualifications are able to get usable images immediately with the help of AI.

My comment: this will save big $ and make the service available wherever the equipment is available.

https://www.abc.net.au/news/2023-03-...unds/102123878
__________________
This signature is for rent. |
#543 |
The Clarity Is Devastating
Join Date: Nov 2006
Location: Betwixt
Posts: 20,153
I meant the first two answers you posted: the "Bard's" answer and the "Bing chat" answer. They're identical except for having different line breaks. I don't know why it turned the 1 into a 0 in the further explanation of "the logic behind it." But the parts I highlighted in the further explanation are pretty clearly the result of confusing the original (10-person) puzzle with the simplified (4-person) version of it that's encountered in many of the solutions/explanations found online.

Which I think is an interesting case for one way the chatbots' answers can go wrong. If given an equivalent problem in terms that don't match the words of the Martin Gardner version in any obvious way (8 people instead of 10, pairs of twins instead of spouses, two-player games instead of handshakes, different order of presenting the constraints, etc.), and so not as easily matched to text found online, would it be able to construct any of the same kind of reasoning at all?

Here, I'll do some of the work:

Four pairs of twins held a board game night. The eight of them were the only participants. During the evening some of them played games of checkers. At the end of the gathering, one of the twins, named Alice, asked how many checkers games each of the other seven people had played. They all told the truth, and it turned out each of them had played a different number of checkers games. No one played checkers alone, no checkers game had more than two players, no one played checkers against his or her own twin, and no one played checkers more than once against the same opponent.

How many checkers games did Alice play? How many did Alice's twin play?
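For what it's worth, the rephrased puzzle pencils out the same way as the original. A small Python sketch (my own labels and pairing rule, offered as one consistent arrangement rather than a uniqueness proof):

```python
# Sketch: the same construction applied to the twins/checkers rephrasing.
from itertools import combinations

labels = list(range(7))      # the 7 people Alice asked, named by their answer
ALICE = "alice"

games = set()
for a, b in combinations(labels, 2):
    if a + b >= 7:           # rule: two of the others played iff their answers sum to 7 or more
        games.add((a, b))
for a in labels:
    if a >= 4:               # Alice played the three highest answerers
        games.add((ALICE, a))

def played(p):
    """Number of checkers games person p played."""
    return sum(p in g for g in games)

assert sorted(played(a) for a in labels) == list(range(7))  # seven distinct answers
# Twins (paired as k with 6 - k) never played each other:
assert all((k, 6 - k) not in games and (6 - k, k) not in games for k in range(7))
print(played(ALICE), played(3))   # prints: 3 3  (Alice and her twin each played 3)
```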
__________________
"*Except Myriad. Even Cthulhu would give him a pat on the head and an ice cream and send him to the movies while he ended the rest of the world." - Foster Zygote |
#544 |
Penultimate Amazing
Join Date: Jan 2003
Location: Yokohama, Japan
Posts: 27,952
ChatGPT wrote the lyrics to a song. Original lyrics of course |
__________________
A fool thinks himself to be wise, but a wise man knows himself to be a fool. William Shakespeare |
#545 |
Lackey
Administrator
Join Date: Aug 2001
Location: South East, UK
Posts: 109,555
__________________
I wish I knew how to quit you |
#546 |
Lackey
Administrator
Join Date: Aug 2001
Location: South East, UK
Posts: 109,555
Is there any way of getting (legal and ethical!) access to GPT-4 directly, rather than via Bing Chat's implementation, without paying?
__________________
I wish I knew how to quit you |
#547 |
Skepticifimisticalationist
Join Date: Jun 2002
Location: Gulf Coast
Posts: 28,228
__________________
"¿WHAT KIND OF BIRD? ¿A PARANORMAL BIRD?" --- Carlos S., 2002 |
#548 |
Skepticifimisticalationist
Join Date: Jun 2002
Location: Gulf Coast
Posts: 28,228
__________________
"¿WHAT KIND OF BIRD? ¿A PARANORMAL BIRD?" --- Carlos S., 2002 |
#549 |
Skepticifimisticalationist
Join Date: Jun 2002
Location: Gulf Coast
Posts: 28,228
I disagree with the framing, though. "The machine" (speaking of AI programs in this case) isn't going to "hurt us". It isn't merely not "the greatest danger", it isn't a realistic danger at all.
The danger is, and can only be, the decisions and actions of people.

This is important - so extremely important that I cannot stress enough how important it is. It's not some semantic nitpick. If someone is run over by an AI-controlled car, that person was killed by the person or company who was operating that vehicle, not "the AI". It could (and in my opinion, should) also be argued that the company that manufactured it is liable for selling a defective product, just like any other company would be in any instance that doesn't involve "AI".
__________________
"¿WHAT KIND OF BIRD? ¿A PARANORMAL BIRD?" --- Carlos S., 2002 |
#550 |
Philosopher
Join Date: Sep 2010
Location: Lenoir City, TN/Mineral Bluff, GA
Posts: 7,555
Upthread I mentioned that ChatGPT had made a math error in a fairly straightforward physics problem I gave it. The error was pointed out by a fellow on another forum using a calculator, and I verified his answer with WolframAlpha. I wondered why they couldn't have ChatGPT "reach out" to calculator programs and/or WolframAlpha to "check its work".
Apparently Stephen Wolfram himself had a similar thought! https://youtu.be/z5WZhCBRDpU |
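For plain arithmetic, the "check its work" idea doesn't need anything as heavy as Wolfram Alpha. A hedged Python sketch of the pattern (the function names are invented for illustration): route the model's claimed numeric answer through a trusted evaluator instead of taking the number on faith.

```python
# Sketch: re-evaluate a model's arithmetic with a trusted evaluator.
import ast
import operator as op

_OPS = {ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul,
        ast.Div: op.truediv, ast.Pow: op.pow, ast.USub: op.neg}

def safe_eval(expr: str) -> float:
    """Evaluate a plain arithmetic expression via a tiny AST walk (no eval())."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp):
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp):
            return _OPS[type(node.op)](walk(node.operand))
        raise ValueError("not plain arithmetic")
    return walk(ast.parse(expr, mode="eval").body)

def check_model_arithmetic(expr: str, claimed: float, tol: float = 1e-9) -> bool:
    """True if the claimed answer matches a real evaluation of the expression."""
    return abs(safe_eval(expr) - claimed) <= tol

# e.g. distance fallen in 3 s at g = 9.8 m/s^2 (a made-up check, not the thread's problem):
print(check_model_arithmetic("0.5 * 9.8 * 3**2", 44.1))   # True
print(check_model_arithmetic("0.5 * 9.8 * 3**2", 44.9))   # False
```

The real Wolfram plugin goes through Wolfram's own API, of course; this only illustrates the shape of the idea (generate with the model, then verify with a tool that can actually compute).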
#551 |
Skepticifimisticalationist
Join Date: Jun 2002
Location: Gulf Coast
Posts: 28,228
Here's another example of how assumptions and mistaken impressions lead to people just enthusiastically vomiting up misinformation about AI. Senator Chris Murphy (D-CT) tweeted this:
Quote:
__________________
"¿WHAT KIND OF BIRD? ¿A PARANORMAL BIRD?" --- Carlos S., 2002 |
#552 |
Skepticifimisticalationist
Join Date: Jun 2002
Location: Gulf Coast
Posts: 28,228
If you need an example of how good ChatGPT ISN'T at even basic chemistry, we have:
What would have to be done to an atom of oxygen for it to become an atom of silicon?
Quote:
__________________
"¿WHAT KIND OF BIRD? ¿A PARANORMAL BIRD?" --- Carlos S., 2002 |
#553 |
Penultimate Amazing
Join Date: Jun 2005
Location: Way way north of Diddy Wah Diddy
Posts: 34,758
I did hear an interesting discussion with Stephen Wolfram, who has apparently enabled ChatGPT to utilize Wolfram Alpha and whatnot to do the mathematics it isn't set up for.
https://www.youtube.com/watch?v=z5WZhCBRDpU |
__________________
Like many humorless and indignant people, he is hard on everybody but himself, and does not perceive it when he fails his own ideal (Molière) A pedant is a man who studies a vacuum through instruments that allow him to draw cross-sections of the details (John Ciardi) |
#554 |
Graduate Poster
Join Date: Aug 2009
Posts: 1,469
#555 |
Skepticifimisticalationist
Join Date: Jun 2002
Location: Gulf Coast
Posts: 28,228
That's certainly better than "the very first sentence is already wrong" levels of wrong, but it's still just wrong. Adding a neutron to an oxygen nucleus would NOT transmute it into a silicon nucleus. If you add a single neutron to an oxygen nucleus and that neutron beta-decays into a proton, you've increased the oxygen atom's atomic number (8, originally) by one - i.e., you'll get fluorine, not silicon.
I don't think it's an improvement they've made over the last couple of hours. I think it's a case of: because the machine doesn't actually "know" enough chemistry to decide which of the multiple responses it has generated is most likely to be correct, it effectively just picks one to output, and that pick can differ from user to user.
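The nuclear bookkeeping here is simple enough to demonstrate. A toy Python sketch (the lookup table is truncated to the elements mentioned in the thread; the names are mine) showing that only added protons change the element:

```python
# Sketch: the element is determined by proton count (atomic number) alone.
ELEMENTS = {8: "O", 9: "F", 14: "Si"}   # truncated to the elements discussed

def transmute(z: int, protons: int = 0, neutrons: int = 0) -> str:
    """Element symbol after adding nucleons to a nucleus of atomic number z."""
    new_z = z + protons          # neutrons change the isotope, never the element
    return ELEMENTS.get(new_z, f"Z={new_z}")

print(transmute(8, protons=6))    # Si: oxygen needs six more protons to become silicon
print(transmute(8, protons=1))    # F: one extra proton only gets you to fluorine
print(transmute(8, neutrons=1))   # O: an extra neutron just makes a heavier oxygen isotope
```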
__________________
"¿WHAT KIND OF BIRD? ¿A PARANORMAL BIRD?" --- Carlos S., 2002 |
#556 |
Graduate Poster
Join Date: Aug 2009
Posts: 1,469
GPT-4 gives this:
Quote:
#557 |
Graduate Poster
Join Date: Aug 2009
Posts: 1,469
Giving it a chance for a rebuttal:
Quote:
#558 |
Lackey
Administrator
Join Date: Aug 2001
Location: South East, UK
Posts: 109,555
For me this is one of the major points of concern with the current chatbots: as humans we are trained to listen to people who speak with "authority", and most of us carry the heuristic that "well spoken" means expertise. Which I think means we will be less sceptical about their results than we should be.
__________________
I wish I knew how to quit you |
#559 |
Graduate Poster
Join Date: Aug 2009
Posts: 1,469
#560 |
Gentleman of leisure
Tagger
Join Date: May 2005
Location: Flying around in the sky
Posts: 27,723
How was silicon made?
Originally Posted by chatGPT
__________________
This signature is for rent. |