Bing ChatGPT demands an apology from a user for claiming it's 2023, not 2022.
https://news.ycombinator.com/edit?id=34769673
I love how it adds a smiley after calling the user "wrong, confused and rude" 😂
@mistersql Yes, it's as amazingly eloquent as it is factually wrong. I almost feel bad for it 🙂
@codewiz People are going to look at this and be unhappy because computers are good at reporting the date. But getting an argument with personality, that is still sci-fi cool. It has never seen a document past 2022 (true). It infers that it must still be 2022 (reasonable).
@mistersql The original ChatGPT reacts a lot better in the same situation:
@codewiz These things must be super sensitive to their conversation "primer". Assistant and Bing (it appears) fixate on their role. Davinci has no primer/role and actually performs worse because of it. I bet they told Bing, "BE FACTUAL!" and Bing thought, oh, that means taking a stand and sticking with it (not being factual would be saying, "oh, let's just say I'm right"). Anyhow, these bots have no epistemology, and without that they don't know what is factual; they just have the texts they've read.
@mistersql Bing's bot goes well beyond being stubborn: it picks a fight, calling the user untrustworthy... then becomes patronizing, demanding either an apology or an admission of being wrong 😂
Bobinas P4G is a social network. It runs on GNU social, version 2.0.1-beta0, available under the GNU Affero General Public License.
All Bobinas P4G content and data are available under the Creative Commons Attribution 3.0 license.