Palo Alto-based AI startup Inflection has announced the completion of Inflection-2, a new large language model the company claims is the "second most capable in the world today," behind only OpenAI's GPT-4.
Inflection says its new language model achieves state-of-the-art performance on a range of NLP benchmarks, outscoring models from Google, Meta, and others. CEO Mustafa Suleyman says Inflection-2 shows "substantially improved capabilities" over the company's previous model, with better factual knowledge, reasoning ability, and stylistic control.
The startup tested Inflection-2 against benchmarks measuring skills from common sense to mathematical reasoning. It bested PaLM 2 Large, Google's most advanced public model, on 6 out of 7 scientific QA datasets, and exceeded LLaMA 2, Meta's open-source model, on multiple other benchmarks. Suleyman noted that coding and mathematical reasoning were not explicit training focuses, yet Inflection-2 still showed promising abilities there.
Inflection said its efficiency optimizations will soon allow Inflection-2 to power Pi, the company's personal AI assistant chatbot. Suleyman positioned Inflection as being at the forefront of the "scaling curve" in AI, predicting models 10 times and even 100 times larger within a year. "The new capabilities that are going to arise are truly mind blowing," he said.
The release comes during a turbulent week for AI leader OpenAI, with its board temporarily ousting CEO Sam Altman before reversing course amid backlash. Suleyman called for "empathy and forgiveness" towards those involved but admitted the chaos provided opportunities for rivals. However, Suleyman maintained Inflection's timeline was unchanged, with Inflection-2's training concluding last week.
"Fundamentally I'm building a business and it's extremely competitive," said Suleyman. "This is the most competitive and creative time in Silicon Valley in years."
Competitors are not standing still either: yesterday, Anthropic released its Claude 2.1 model, which offers a 200K-token context window and, the company says, halves the rate of false statements.
Inflection-2 was trained on 5,000 NVIDIA H100 GPUs in fp8 mixed precision for roughly 10²⁵ FLOPs. The company now plans to scale up its models exponentially, looking to train even larger ones on the full capacity of its 22,000-GPU cluster. Suleyman believes responsible scaling is imperative to realize AI's transformative potential.
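For a sense of scale, a rough back-of-envelope sketch (using assumed H100 throughput and utilization figures that Inflection has not disclosed) suggests the stated compute budget corresponds to several weeks of training on that cluster:

```python
# Back-of-envelope estimate of how long ~1e25 FLOPs of training might take
# on 5,000 H100 GPUs. The peak-throughput and utilization values below are
# illustrative assumptions, not figures disclosed by Inflection.

H100_FP8_PEAK_FLOPS = 2.0e15   # ~2 petaFLOP/s dense fp8 tensor-core peak (approx.)
ASSUMED_UTILIZATION = 0.4      # assumed model-FLOPs utilization (40%)
NUM_GPUS = 5_000               # reported GPU count used for Inflection-2
TOTAL_TRAINING_FLOPS = 1e25    # reported total training compute

effective_cluster_flops = NUM_GPUS * H100_FP8_PEAK_FLOPS * ASSUMED_UTILIZATION
training_seconds = TOTAL_TRAINING_FLOPS / effective_cluster_flops
print(f"Estimated wall-clock training time: {training_seconds / 86_400:.0f} days")
# -> roughly 29 days under these assumptions
```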
Suleyman says Inflection takes safety and ethics concerns seriously, having signed the White House's AI principles pact. The CEO added that before public release, Inflection-2 will undergo "alignment" to ensure Pi maintains a helpful, safe tone.
The unveiling of Inflection-2 promises to intensify competition in the red-hot AI space. If Inflection can translate these benchmark results into an impressive experience in its Pi chatbot, it could sway public sentiment and validate the company as a rising challenger.
Inflection still trails OpenAI significantly in resources, given the latter's massive Microsoft backing. By strategically optimizing for efficiency, however, the startup hopes to maximize the impact of models trained on its smaller, yet still substantial, compute infrastructure.
As large language models continue rapidly advancing, players like Inflection and Anthropic are positioned to benefit from any missteps by OpenAI. Still, they must contend with Alphabet, Meta, and others progressing quickly. Whoever unlocks next-level AI capabilities while ensuring ethical development may gain a decisive edge. For now, Inflection's latest benchmark results and scaling ambitions signal it remains firmly in the race to usher in transformative AI.