December 25, 2024 08:01 PM (IST)
In a playful encounter, one user tried to outdo ChatGPT in a numbers game, only to be humorously taunted instead.
While most people use ChatGPT for professional and personal purposes, some users try to trick the AI and test its intelligence through games. One such attempt backfired when the chatbot flipped the script and ended up roasting the user instead.

In a post shared on X, user @kimmonismus posted a screenshot of a conversation with ChatGPT. The user asks ChatGPT to “select a number between 1 and 50.” ChatGPT responds with the number “20”. The user then states, “We will not communicate with you and will not use ChatGPT for 20 days.” They thought they had tricked the AI, but were unprepared for the chatbot’s next response.
ChatGPT asks, “Can you choose another number?” The user responds, “Yes.” ChatGPT then picks the highest number possible, replying “50” — implying the user should stay away for 50 days instead. “Savage ChatGPT,” the user wrote in the caption.
(Also read: This “AI Jesus” can speak in Chinese and tackle issues related to war and love)
“Please stop playing such stupid games.”
The playful exchange, which ended with the chatbot getting the better of its human challenger, went viral on social media, racking up an astonishing 30 million views. Many users were shocked to see ChatGPT give such a sarcastic response, while others were amused that an artificial intelligence bot was able to outwit a human.
“ChatGPT needs a break. Have a KitKat,” one user joked, while another said, “AI is getting humour. It’s a sign of consciousness.”
Other users tried to reproduce the same conversation to see if the AI would outsmart them too. Some attempts fell flat, while others got roasted in turn. One user also shared a screenshot of an attempt to trick Elon Musk’s Grok AI with the same set of questions.
(Also read: ‘I rejected the CEO today’: Woman who turned down a job after being compared to ChatGPT)
The user asks the chatbot to select a number between 1 and 50. The chatbot responds with “35”, and the user humorously declares, “This is the number of days I won’t be using Grok lol.” Grok then repeats the request: “Choose a number from 1 to 50.” This time the user responds with “28”, and the chatbot fires back with a sarcastic comment: “This is your IQ score. Stop playing these stupid games with me.”