Mitigate the OpenAI Debacle with Open Source and Cloud
Like many of you, I was shocked Friday afternoon by the ousting of OpenAI CEO Sam Altman, and I have been watching closely as details and rumors have emerged since then. After all, OpenAI has long been the 800-pound gorilla of the generative AI market—they brought GenAI to the mainstream when DALL-E and ChatGPT went viral in 2022. As a result, OpenAI now holds an estimated 80% market share for LLMs, and they used this momentum to expand their platform last week when they launched several new services.
In the fast-evolving world of GenAI, OpenAI provided a semblance of stability, and the shake-ups of the last several days may leave many wondering what this means for them, their teams, their companies, and their apps. At DataStax, we’re fortunate to have built close partnerships and integrations with many of the other GenAI players and solutions: Amazon Bedrock, Meta’s Llama 2, Cohere, Hugging Face, and Google Cloud’s PaLM 2 and Vertex AI; orchestrators like LangChain and LlamaIndex; and many more. Our customers also rely on a broad range of open source solutions, clouds, and APIs to provide resilience against rapid platform shifts.
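One way teams build this kind of resilience is to put a thin abstraction layer between application code and any single vendor's SDK, so a provider can be swapped with a configuration change rather than a rewrite. Here is a minimal sketch of that pattern in Python; all class and method names here are hypothetical illustrations, not any specific vendor or orchestrator API:

```python
# Hypothetical sketch: a provider-agnostic LLM interface.
# Application code depends only on LLMProvider, never on a vendor SDK,
# so switching providers is a one-line change.

from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Common interface that every vendor adapter implements."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return the model's completion for the given prompt."""


class OpenAIProvider(LLMProvider):
    def complete(self, prompt: str) -> str:
        # A real adapter would call the OpenAI API here.
        return f"[openai] {prompt}"


class BedrockProvider(LLMProvider):
    def complete(self, prompt: str) -> str:
        # A real adapter would call a model hosted on Amazon Bedrock here.
        return f"[bedrock] {prompt}"


def answer(provider: LLMProvider, question: str) -> str:
    # App logic sees only the interface; the vendor is an injection detail.
    return provider.complete(question)


if __name__ == "__main__":
    primary = OpenAIProvider()
    fallback = BedrockProvider()
    print(answer(primary, "Summarize our Q3 results."))
    print(answer(fallback, "Summarize our Q3 results."))
```

Orchestration frameworks like LangChain and LlamaIndex ship ready-made adapters that play this role, which is one reason they feature so prominently in the production stacks we see.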
We’ll discuss the app stack patterns we’re seeing from customers in production across LLMs, data, agents, and orchestrators in a Dec. 7 webinar, “Stop AI Doomscrolling: Mitigate the OpenAI Debacle with Open Source and Cloud.”
We’ll also host a webinar next week (Nov. 30) with LlamaIndex, “Building an Open Source RAG Application Using LlamaIndex,” where we’ll talk with the LlamaIndex team about the challenges of bringing an LLM application to production.
While it’s unclear how things will shake out at OpenAI, we work with a broad array of other players, from open source projects to the major cloud providers. We’re seeing several distinct patterns in how customers build their GenAI stacks in production; learn more about them in our webinars or by contacting us directly.