Aqueduct
The easiest way to run open source LLMs
Aqueduct's LLM support makes it easy to run open-source LLMs on any infrastructure you use. With a single API call, you can run an LLM on one prompt or across an entire dataset.
Chenggang Wu left a comment:
Super excited to share! At Aqueduct, we're building an open-source platform that simplifies data teams' lives by raising the level of abstraction for production data science. We're working on some cool stuff under the hood: data caching, parallel operator scheduling, and compilation of high-level workflow definitions into low-level specs that run on powerful compute engines like Kubernetes and AWS...
Aqueduct
Taking data science to production
Aqueduct automates the engineering required to take data science to production. By abstracting away low-level cloud infrastructure, Aqueduct enables data teams to run models anywhere, publish predictions where they're needed, and monitor results reliably.