Stability AI, the company funding the development of open-source music- and image-generating systems like Dance Diffusion and Stable Diffusion, has reached unicorn status after raising funds from some top names in venture capital.
The London- and San Francisco-based company recently announced that it has raised $101 million in a seed round led by Coatue Management and Lightspeed Venture Partners. The tranche values Stability AI at $1 billion post-money, according to a Bloomberg source, and comes as demand for AI-powered content generation revs up.
Emad Mostaque co-founded Stability AI in 2020. After graduating from Oxford with a master's in mathematics and computer science, he worked as an analyst at various hedge funds before developing a personal fascination with AI — and with what he characterized as a lack of "organization" within the open-source AI community.
“Nobody has any voting rights except our employees — no billionaires, big funds, governments or anyone else with control of the company or the communities we support. We’re completely independent,” Mostaque said in an interview. “We plan to use our compute to accelerate open source, foundational AI.”
Stability AI operates a cluster of more than 4,000 Nvidia A100 GPUs in AWS, which it uses to train AI systems, including Stable Diffusion. The cluster is costly to maintain — Business Insider reports that Stability AI's operations and cloud expenses exceeded $50 million. But Mostaque has repeatedly asserted that the company's R&D will enable it to train models more efficiently going forward.
The company has had its fair share of criticism. Observers and the media have speculated that the open-source release of Stable Diffusion has been used to create objectionable content, including graphic violence and nonconsensual celebrity pornography. Getty Images has banned the upload of content generated by systems like Stable Diffusion over fears of intellectual property disputes. (Stable Diffusion was trained on a data set that includes copyrighted works — and even private medical records.)
Stability AI was recently mentioned in a critical letter from U.S. House Representative Anna G. Eshoo (D-CA) to the National Security Advisor and the Office of Science and Technology Policy (OSTP), in which she urged them to address the release of "unsafe AI models" that "do not moderate content made on their platforms."
“A percentage of people are simply unpleasant and weird, but that’s humanity,” Mostaque said in a previous interview. “Indeed, it is our belief this technology will be prevalent, and the paternalistic and somewhat condescending attitude of many AI aficionados is misguided in not trusting society.”
Stability AI intends to make money by training "private" models for customers and acting as a general infrastructure layer. It also offers a platform and API, DreamStudio, through which individual users can access its models. Mostaque told Bloomberg that DreamStudio has more than 1.5 million users who have created over 170 million images, and that Stable Diffusion has more than 10 million daily users "across all channels."