Generative AI coding startup Magic lands $320M investment from Eric Schmidt, Atlassian and others


Magic, an AI startup developing models to generate code and automate a range of software development tasks, says that it’s raised a large tranche of cash from investors including ex-Google CEO Eric Schmidt.

In a blog post this morning, Magic said that it closed a $320 million funding round with contributions from Schmidt, Jane Street, Sequoia, Atlassian, Nat Friedman & Daniel Gross, Elad Gil, CapitalG and others. The fundraising brings the company’s total raised to nearly half a billion dollars ($465 million), catapulting it into a cohort of better-funded generative coding startups that includes Anysphere, Codeium and Augment. (Interestingly, Schmidt’s backing Augment, too.)

Magic also announced a partnership with Google Cloud to build two “supercomputers” on Google Cloud Platform. One — Magic-G4 — will be made up of Nvidia H100 GPUs, while the other — Magic-G5 — will comprise Nvidia’s next-gen Blackwell chips.

Magic says it aims to scale the latter cluster to “tens of thousands” of GPUs over time.

“We are excited to partner with Google and Nvidia to build our next-gen AI supercomputer on Google Cloud,” Magic co-founder and CEO Eric Steinberger said in a statement. “Nvidia’s [Blackwell] system will greatly improve inference and training efficiency for our models, and Google Cloud offers us the fastest timeline to scale, and a rich ecosystem of cloud services.”

Steinberger and Sebastian De Ro co-founded Magic in 2022. Steinberger says that he was inspired by the potential of AI at a young age; in high school, he and his friends wired up the school’s computers to train machine learning algorithms. That experience planted the seeds for Steinberger’s computer science degree and his job at Meta as an AI researcher.

Magic provides AI-driven tools designed to help software engineers write, review, debug and plan code changes. The tools operate like an automated pair programmer, attempting to understand and continuously learn more about the context of coding projects.

Lots of platforms do the same, including the elephant in the room, GitHub Copilot. But one of Magic’s innovations lies in its models’ ultra-long context windows.

A model’s context, or context window, refers to the input data (e.g., text) that the model considers before generating output (e.g., additional text). A simple question — “Who won the 2020 U.S. presidential election?” — can serve as context, as can a movie script, a show or an audio clip. And as context windows grow, so does the size of the documents (or codebases, as the case may be) that can fit into them.

Magic claims its latest model, LTM-2-mini, has a 100 million-token context window. (“Tokens” are subdivided bits of raw data, like the syllables “fan,” “tas” and “tic” in the word “fantastic.”) 100 million tokens is equivalent to around 10 million lines of code — or 750 novels. And it’s by far the largest context window of any commercial model; the next-largest are Google’s flagship Gemini models, at 2 million tokens.
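
For a rough sense of scale, here’s a minimal back-of-the-envelope sketch in Python that estimates whether a codebase would fit in a 100 million-token window. The ~10 tokens-per-line ratio is simply inferred from the article’s figures (100 million tokens ≈ 10 million lines of code), and the file-extension filter is illustrative — neither reflects Magic’s actual tokenizer or ingestion pipeline.

```python
# Back-of-the-envelope check: would a codebase fit in a given context window?
# Assumptions (not Magic's tokenizer): ~10 tokens per line of code, and only
# a handful of common source-file extensions are counted.

import os

TOKENS_PER_LINE = 10              # rough ratio implied by the article's figures
LTM2_MINI_WINDOW = 100_000_000    # 100M-token window claimed for LTM-2-mini

def estimate_tokens(repo_root: str) -> int:
    """Count lines across source files and convert to an estimated token count."""
    total_lines = 0
    for dirpath, _, filenames in os.walk(repo_root):
        for name in filenames:
            if name.endswith((".py", ".js", ".ts", ".go", ".java", ".c", ".cpp")):
                path = os.path.join(dirpath, name)
                try:
                    with open(path, errors="ignore") as f:
                        total_lines += sum(1 for _ in f)
                except OSError:
                    continue  # skip unreadable files
    return total_lines * TOKENS_PER_LINE

if __name__ == "__main__":
    tokens = estimate_tokens(".")
    print(f"~{tokens:,} estimated tokens; fits in window: {tokens <= LTM2_MINI_WINDOW}")
```

By that rough math, even very large monorepos would sit comfortably inside a 100 million-token window, which is the point Magic is making with the comparison to 10 million lines of code.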

Magic says that — thanks to its long context — LTM-2-mini was able to implement a password strength meter for an open source project and create a calculator using a custom UI framework.

The company’s now in the process of training a larger version of the model.
