
【#Tech24H】The model, internally codenamed “Spud,” completed pre-training on March 24 at the Stargate data center in Texas after two years of clandestine research and development, and could be released as early as April. GPT-6 outperforms GPT-5.4 by more than 40% on coding, reasoning, and agent tasks, and its context window doubles from 1 million tokens in GPT-5.4 to 2 million. Brockman has repeatedly emphasized a key point: GPT-6 is an entirely new foundational model that will serve as the base for all future OpenAI models.
