Sam Altman, the CEO of OpenAI, recently spoke at an AC10 online meetup about the upcoming GPT-4 release. He dispelled the popular belief that GPT-4 would be much bigger than its predecessor GPT-3, saying that OpenAI is getting more performance out of smaller models and that a 100-trillion-parameter model won't be seen anytime soon. Altman also noted that Artificial General Intelligence (AGI) will require new algorithmic breakthroughs rather than simply larger models.

GPT-4 is instead expected to focus on coding tasks. Codex, a descendant of GPT-3 specialized for code generation, has already been released in private beta with API access available to users. Microsoft's Power Apps software likewise uses GPT-3 to convert natural-language commands into code snippets, a capability OpenAI could lean on more heavily in its next instalment of GPTs. When asked about expectations for GPT-5, Altman said it might pass the Turing test, but he does not consider that milestone worth the development effort it would currently require.
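To make the Codex workflow concrete, here is a minimal sketch of generating code from a natural-language prompt through the beta API. It assumes the `openai` Python package and the `davinci-codex` engine name used during the private beta; neither detail comes from Altman's remarks, and the engine name may have changed since.

```python
# Minimal sketch: asking Codex to turn a natural-language instruction into code.
# Assumes beta API access; "davinci-codex" was the engine name during the
# private beta and may differ in later releases.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Completion.create(
    engine="davinci-codex",
    prompt="# Python 3\n# Return the n-th Fibonacci number.\ndef fibonacci(n):",
    max_tokens=150,
    temperature=0,   # deterministic output suits code generation
    stop=["\n\n"],   # stop at the first blank line after the function body
)

print(response.choices[0].text)
```

Power Apps applies the same pattern to a different target language, translating plain-English requests into Power Fx formulas rather than Python.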