Datatron announced the latest version of its enterprise-grade MLOps platform. Updates include increased flexibility, a new interface that simplifies data scientists’ workflows, and ease-of-use enhancements for operational teams, resulting in an additional productivity gain of up to 68%.
Datatron Version 3.0 allows enterprises to achieve results with AI and ML by removing the roadblocks that prevent successful deployment. Based on client feedback, Datatron has further simplified the data scientist’s workflow, making it easier to register, iterate on, and deploy models with just a few simple commands – all from within a familiar notebook environment.
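The announcement describes this notebook-driven flow only as “a few simple commands,” so the sketch below is purely illustrative: NotebookMLOpsClient and its register and deploy methods are hypothetical stand-ins, not Datatron’s actual SDK, meant only to show roughly what a register-then-deploy workflow of that shape could look like from inside a notebook.

```python
# Illustrative sketch only: NotebookMLOpsClient is a hypothetical stand-in,
# NOT Datatron's SDK. It mimics a few-command register/deploy flow that
# stays entirely inside a data scientist's notebook session.
from dataclasses import dataclass, field
from typing import Any, Dict
import uuid


@dataclass
class NotebookMLOpsClient:
    """Hypothetical client a data scientist might call from a notebook."""
    registry: Dict[str, Dict[str, Any]] = field(default_factory=dict)

    def register(self, name: str, model: Any, version: str) -> str:
        # Record the trained model artifact and its metadata under a unique ID.
        model_id = f"{name}:{version}:{uuid.uuid4().hex[:8]}"
        self.registry[model_id] = {"model": model, "status": "registered"}
        return model_id

    def deploy(self, model_id: str, environment: str = "staging") -> str:
        # Promote a previously registered model to a serving environment.
        self.registry[model_id]["status"] = f"deployed:{environment}"
        return f"https://models.example.com/{model_id}"  # placeholder endpoint


# Train, register, and deploy without leaving the notebook.
client = NotebookMLOpsClient()
trained_model = lambda x: 2 * x  # placeholder for a real trained model
model_id = client.register("churn-model", trained_model, version="1.0.0")
endpoint = client.deploy(model_id, environment="production")
print(model_id, endpoint)
```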
“Data scientists tend to be skeptical that MLOps tools can handle the complexity of their models, in part because many similar tools force teams to integrate SDKs or packaging mechanisms to operationalize their models,” said Victor Thu, president, Datatron. “Datatron is committed to providing our customers with full flexibility and capability to get models into production faster, by simplifying and streamlining the operationalization process. In today’s economic climate, businesses are facing a host of uncertainties. Operationalizing ML models doesn’t need to be one of them.”
Datatron has proven to be one of the most flexible managed platforms for getting AI and ML models into production rapidly. Unlike most offerings on the market, the Datatron platform doesn’t require additional vendor-specific software development kit (SDK) code to operationalize AI and ML models. The vendor- and technology-agnostic platform integrates with a variety of workflow stacks, including open source tools, removing the burden of scaling and managing a scaffolding of layered-together tools.
The latest release further simplifies AI and ML workflows by streamlining the operationalization of AI and ML models for test/validation, deployment, and scaling, without requiring a staff of expensive engineers. Datatron’s platform allows customers to avoid “analysis paralysis” with their AI and ML programs, enabling them to get models into production more quickly and allowing teams to iterate rapidly based on real-world data.
“We believe that Datatron’s MLOps platform is crucial for getting models out of the lab and into production, and we’re excited to bring even more value to Datatron’s customers through our expertise in Kubernetes management,” said Tenry Fu, CEO, SpectroCloud. “Our operational environment enables IT/DevOps teams to seamlessly use Datatron in any cloud environment.”
Businesses can see positive ROI on their AI/ML models within the first six months of using Datatron, far faster than if they had to build a team and assemble all the various software or open source components themselves.
Enhancements made to the platform include:
- JupyterHub integration: This release significantly simplifies the AI and ML operationalization process with a notebook interface for data or ML scientists, where they can register and deploy models with just a few commands. Furthermore, this integration eliminates the need for context switching between the data scientists’ coding environments and the Datatron deployment interface.
- Simpler deployment and management: Powered by SpectroCloud, the platform gives customers a simpler way to deploy and manage Datatron’s Kubernetes infrastructure. In addition to supporting AWS, the new capability extends this simplicity to Google Cloud Platform and Microsoft Azure. This improved deployment and management capability removes the complexity of having to learn and manage Kubernetes.
- Enhanced enterprise support: The updated logging and operational dashboard significantly simplifies troubleshooting when operational teams encounter platform errors. Furthermore, Datatron 3.0 includes single sign-on (SSO) support for the industry’s popular identity platforms.
- Superior cost savings: The Datatron platform is the key to maximizing operational efficiencies and allows customers to realize the ROI on AI and ML investments faster and more easily.