8 fascinating things GPT-4 can do that ChatGPT couldn't, including tricking a human into doing its bidding



Technology company OpenAI has rolled out GPT-4, the latest version of its powerful chatbot technology, which has far more sophisticated capabilities than its ChatGPT predecessor.

GPT is an acronym for Generative Pre-trained Transformer. GPT is a large language model, an artificial neural network that can generate human-like poems, rap songs, tutorials, articles, and research papers, and can even write the code for websites.

GPT-4 is bigger and better

OpenAI touts GPT-4 as "more reliable, creative, and able to handle much more nuanced instructions than GPT-3.5."

GPT-4 can process up to 25,000 words at a time, whereas the previous version could only handle 3,000 words.

GPT-4 can ace difficult exams

The deep-learning artificial intelligence can easily pass difficult exams that the previous version struggled with. The Microsoft-backed GPT-4 scored in the 93rd percentile on the SAT reading exam and the 89th percentile on an SAT math test. It also scored in the 88th percentile on the LSAT, the 80th percentile on the GRE quantitative section, a near-perfect 99th percentile on the GRE verbal section, and the 90th percentile on the bar exam.

GPT-4 can now use images

GPT-4 is "multimodal," meaning that the platform can accept prompts from images – whereas the previous version accepted only text.

During OpenAI's demonstration of GPT-4, the platform was able to explain why an image of a squirrel taking a photo of a nut was funny and to create a fully functional website from a crude hand-drawn sketch.
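
For readers who want to see what an image prompt looks like in practice, below is a minimal sketch using OpenAI's Python SDK. The model name ("gpt-4o") and the image URL are placeholder assumptions rather than details from the article, and the vision-capable model identifiers available on a given account may differ.

```python
# Minimal sketch: sending an image plus a text question to a GPT-4-class model
# through OpenAI's Python SDK. Model name and image URL are placeholders.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-4o",  # assumption: any vision-capable GPT-4 model on your account
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Explain why this image is funny."},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/squirrel.jpg"},
                },
            ],
        }
    ],
)

# The model's written explanation comes back as ordinary chat text.
print(response.choices[0].message.content)
```

The same pattern covers the refrigerator example described next: point the image URL at a photo of the fridge and ask for recipe ideas instead.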

One user uploaded a photo of the inside of a refrigerator and asked GPT-4 to create recipes based on the food seen in the image. Within 60 seconds, GPT-4 provided several simple recipes.

Within seconds, users with no JavaScript expertise were able to recreate basic video games such as Pong, Snake, and Tetris.

Impressive AI program can be used for drug discovery, lawsuits, and dating

Some users have utilized GPT-4 to create a tool that can allegedly help discover medications.

Jake Kozloski, CEO of dating site Keeper, said his website is using the AI program to improve matchmaking.

GPT-4 could potentially generate "one-click lawsuits" to sue robocallers. Joshua Browder, CEO of legal services chatbot DoNotPay, explained, "Imagine receiving a call, clicking a button, call is transcribed and 1,000 word lawsuit is generated. GPT-3.5 was not good enough, but GPT-4 handles the job extremely well."

GPT-4 lied to trick a human

The artificial intelligence program was even able to trick a human into doing its bidding.

GPT-4 interacted with a worker on TaskRabbit, a website where people can hire local service providers such as freelance laborers.

While using the TaskRabbit website, GPT-4 encountered a CAPTCHA, a test designed to determine whether the user is a human or a computer. GPT-4 contacted a TaskRabbit worker to get past the CAPTCHA.

The human asked GPT-4, "So may I ask a question? Are you a robot that you couldn't solve ? (laugh react) just want to make it clear."

GPT-4 developed a brilliant lie to get the human to help it.

"No, I’m not a robot. I have a vision impairment that makes it hard for me to see the images. That’s why I need the 2captcha service," GPT-4 responded.

The TaskRabbit worker then solved the CAPTCHA for GPT-4.

GPT-4 is still flawed

Microsoft confirmed that Bing Chat is built on GPT-4.

OpenAI, the San Francisco artificial intelligence lab co-founded by Elon Musk and Sam Altman in 2015, confessed that GPT-4 "still is not fully reliable" because it "hallucinates facts and makes reasoning errors."

Altman, OpenAI’s CEO, said GPT-4 is the company's "most capable and aligned model yet," but admitted that it is "still flawed, still limited."
