ChatGPT-4o features: AI demos real-time conversational speech and emotion
OpenAI's spring update shows where the AI chatbot is headed, and it's pretty wild and mostly free
🤖 ChatGPT-4o is OpenAI's latest model that's faster and sounds more real
💻 A native ChatGPT app is coming to desktops: Mac first, Windows later
💰 It's free (with limits), but some features will come to ChatGPT Plus first
😮 You can ask ChatGPT-4o to voice emotion and now interrupt the AI
👀 New vision capabilities can analyze on-screen and on-camera queries
💬 Real-time translation is possible using ChatGPT-4o
Update: The Shortcut can confirm that ChatGPT-4o has launched for ChatGPT Plus users paying $20/mo to OpenAI. We haven't seen it launch for free users yet, nor have we seen the Mac app that's supposed to launch today. Stay tuned for updates.
We just witnessed ChatGPT-4o, the latest model from OpenAI, and it's capable of wild real-time conversational speech and conveying human-like emotion. Here's what's new in the ChatGPT-4o LLM, launching one day before Google I/O and one month before Apple's WWDC 2024 keynote.
"Talking to a computer has never felt really natural for me; now it does," said OpenAI CEO Sam Altman. "I can really see an exciting future where we are able to use computers to do much more than ever before."
🙋‍♀️ The voice model of ChatGPT-4o offers faster answers (no more 2-3 second lag), is more responsive to feedback, and can be interrupted so you don't have to wait your turn to inject new prompts, according to the demo by OpenAI CTO Mira Murati. Basically, you can be rude to ChatGPT-4o to advance conversations.
💻 Desktop version. Launching today, ChatGPT will have a simple, easy-to-use desktop app with a new UI. But I haven't seen the desktop app available to download just yet.
🆓 ChatGPT set free. Free users were limited to ChatGPT 3.5, while ChatGPT Plus members, paying $20 a month, could access ChatGPT 4 with more up-to-date data. That's changing starting today (again, it's not live just yet), as ChatGPT-4o becomes free, as do GPT Store tools. Paying members will still get 5x the capacity limits of free users and likely first access to ChatGPT-5 soon.
😮 Drama alert. ChatGPT-4o can tweak its voice to add emotion. Research lead Mark Chen asked GPT-4o to tell a bedtime story, and it began with a normal-sounding "Once upon a time..." Interrupting the AI with new prompts, he asked for more emotion, and then even more emotion, and each time the AI exaggerated its pitch changes further. Then it was asked to do it in a robot voice, and it complied.
🫁 Sensing your emotions. It can also pick up on your emotions. The demo here involved asking for breathing exercises, and the AI noticed Mark Chen was basically hyperventilating instead of breathing in and out calmly. "Mark, you're not a vacuum cleaner!" The AI then asked him to adjust his breathing technique.
👀 Vision capabilities can analyze math on a sheet of paper (and offer hints instead of full answers if you'd like), solve complex coding problems via screen share (and explain them in plain language), and analyze and explain detailed charts.
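For developers curious what those image queries might look like in code, here's a minimal sketch using OpenAI's Python SDK. This isn't something OpenAI showed on stage, and the prompt and chart URL are placeholders for illustration.

```python
# Minimal sketch (not from OpenAI's demo): sending an image to GPT-4o
# through the official openai Python SDK. The URL below is a placeholder.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in your environment

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Explain what this chart shows, in plain language."},
                {"type": "image_url", "image_url": {"url": "https://example.com/chart.png"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```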
💬 Real-time translation. Just about the only thing I liked in my Rabbit R1 review involved its simple translator tool, but the AI in ChatGPT-4o goes further with seamless back-and-forth conversation and a natural-sounding voice. The demo was a quick one, so we'll go hands-on with the new ChatGPT soon to confirm that it's indeed better than the translator mode in our Samsung Galaxy S24 Ultra review.
⚙️ More ChatGPT-4o specs. OpenAI said that the new model can use memory to maintain continuity across all your conversations. Its API is also 3x faster, 50% cheaper, and has a 5x higher rate limit vs GPT-4 Turbo.
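OpenAI didn't show any code on stage, but if those API gains work the way past model bumps have, developers should be able to claim them by swapping the model name in an existing GPT-4 Turbo call. A minimal sketch, assuming the API identifier is gpt-4o:

```python
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in your environment

# Same chat.completions call as before; only the model string changes
# from "gpt-4-turbo" to "gpt-4o" to pick up the cheaper, faster model.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Summarize GPT-4o's launch in one sentence."}],
)
print(response.choices[0].message.content)
```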
🤔 Where does that leave paying users? If ChatGPT-4o is launching for free, why would anyone stick with the $20/mo plan? According to OpenAI, paying members will get 5x the capacity limits of the free users. The company didn't say this, but, likely, they'll also get first access to ChatGPT-5 when it launches later this year.
Update: In a statement on X in the late afternoon, the company said: "All users will start to get access to GPT-4o today. In coming weeks we'll begin rolling out the new voice and vision capabilities we demo'd today to ChatGPT Plus."