Transcript
At Athena Intelligence, we're building an artificial employee named Athena inside an operating system called Olympus, which she operates in.
We're currently deployed with defense contractors, Anheuser-Busch, and other large Fortune 500s. We want to bring faster, first-class access to data analysis to Fortune 500 enterprises and the public sector.
So we've had to rebuild the majority of the business applications that a typical entry-level analyst uses, think Google Sheets, Excel, Microsoft Word, PowerPoint, in an AI-native way, such that the agent, Athena, and her human counterparts, the analysts at these organizations, can operate on them together. The reason we chose Langflow was the ability to iterate with our customers, which was probably the single best way for us to move the needle forward.
When we were working with early-stage customers, what we found was that the iteration rate, when we were building and developing on our own stack, was quite slow. Langflow allowed us to push that to the edge, so iteration happens at the customer site without our involvement. We then pull those learnings back into the core product so we can iterate much more quickly, which in the AI industry is probably the single best thing you can do.
We have Langflow currently deployed on private clouds across AWS, GCP, and Azure. We use the managed service through DataStax, and we've even gone so far as deploying to AWS GovCloud.
All of those instances run the same version of Langflow via a Terraform deployment process, and we've gotten it to the point where we can spin up a new tenant in a single day.
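As an aside, the single-configuration, workspace-per-tenant pattern described here could be sketched roughly as follows. This is a hypothetical illustration only: the var-file layout, variable names, and pinned version string are assumptions for the sketch, not Athena's actual pipeline.

```python
# Hypothetical sketch: one Terraform configuration, one workspace per tenant,
# with the Langflow version pinned so every cloud runs the same build.
# All names below (var files, variables, version) are illustrative assumptions.

LANGFLOW_VERSION = "1.0.12"  # hypothetical pin; same build for every tenant


def terraform_commands(tenant: str, cloud: str) -> list[list[str]]:
    """Build the Terraform CLI invocations to stand up one tenant.

    `cloud` selects a var file (e.g. aws, gcp, azure, aws-govcloud), so the
    same module tree can serve every target environment.
    """
    var_file = f"envs/{cloud}.tfvars"
    return [
        # `-or-create` (Terraform 1.4+) creates the workspace if it is new.
        ["terraform", "workspace", "select", "-or-create", tenant],
        ["terraform", "init", "-input=false"],
        [
            "terraform", "apply", "-auto-approve",
            f"-var-file={var_file}",
            f"-var=langflow_version={LANGFLOW_VERSION}",
            f"-var=tenant={tenant}",
        ],
    ]
```

The sketch only builds the command lists; a real driver would run them with a process runner and per-cloud credentials, which is what makes a one-day tenant spin-up plausible.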
One of the single greatest things about Langflow is that we're able to deploy in our customers' environments, which is a big differentiator in a world where security and privacy are top of mind for every organization. Being able to say, hey, the data is not leaving your environment, Langflow is coming to you, Athena is coming to you, is a differentiator for us. I think the single best thing we've heard from our customers comes up whenever they have a small adjustment they want to make, something isn't working to their liking, or they have a new idea they want to ideate on.
Instead of coming to our team and asking that of us, they can now go directly into the Langflow configuration on the platform, drag and drop, and create version zero of it without our involvement.
The advice I would give someone who wants to operate the fastest in AI: I think the only way you can move quickly is by learning, by making 20 or 50 mistakes and figuring out what works. Because the environment is changing so quickly right now, any hardened thing you build might get torn down by a new open source application, a new open source model, a closed source model, or some combination of the two.