Kinetica announced the integration of its analytic database with ChatGPT, ushering in ‘conversational querying.’ Users can ask any question of their proprietary data, even complex questions that were not anticipated in advance, and receive an answer in seconds. The combination of ChatGPT’s front-end interface, which converts natural language to Structured Query Language (SQL), and Kinetica’s analytic database, purpose-built for true ad-hoc querying at speed and scale, provides a more intuitive and interactive way of analyzing complex data sets. Together, ChatGPT and Kinetica remove the limits of data exploration and unlock the full potential of an organization’s data.
In today’s fast-paced world, people expect instant gratification and rapid results, and ChatGPT’s ability to deliver on this expectation is a major factor in its popularity. While ChatGPT can convert natural language to SQL, the speed of response for data analytics questions depends on the organization’s underlying data platform. Conventional analytic databases require extensive data engineering, indexing, and tuning to enable fast queries, which means the questions must be known in advance. If a question is not anticipated, the resulting query may take hours to run or may not complete at all.
The Kinetica database provides answers in seconds without the need for pre-engineering data. What makes it possible for Kinetica to deliver on conversational querying is the use of native vectorization. In a vectorized query engine, data is stored in fixed-size blocks called vectors, and query operations are performed on these vectors in parallel, rather than on individual data elements. This allows the query engine to process multiple data elements simultaneously, resulting in radically faster query execution on a smaller compute footprint. Vectorization is made possible by GPUs and the latest advancements in CPUs, which perform simultaneous calculations on multiple data elements, greatly accelerating computation-intensive tasks by allowing them to be processed in parallel across multiple cores or threads.
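As a rough illustration of the block-at-a-time processing described above (a general sketch using NumPy as a stand-in, not Kinetica’s engine), the same filter-and-aggregate can be computed one element at a time in a Python loop, or expressed over whole arrays so the runtime can map the work onto SIMD hardware:

```python
import numpy as np

# One million (price, quantity) pairs as a toy dataset.
rng = np.random.default_rng(0)
prices = rng.uniform(1.0, 100.0, size=1_000_000)
quantities = rng.integers(1, 10, size=1_000_000).astype(np.float64)

# Element-at-a-time: a Python-level loop touching one row per iteration.
total_scalar = 0.0
for p, q in zip(prices, quantities):
    if p > 50.0:
        total_scalar += p * q

# Vectorized: the same filter and aggregate expressed over whole arrays
# (fixed-size blocks of values), which the runtime can execute with
# SIMD instructions across many elements at once.
mask = prices > 50.0
total_vector = float(np.sum(prices[mask] * quantities[mask]))

assert np.isclose(total_scalar, total_vector)
print(total_vector)
```

The two computations produce the same answer; the vectorized form simply removes the per-element interpreter overhead and lets the hardware operate on many values per instruction, which is the same principle a vectorized query engine applies to columns of data.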
Further, Kinetica converges multiple modes of analytics, such as time series, spatial, graph, and machine learning, broadening the types of questions that can be answered, for example: “How can we improve the customer experience considering factors such as seasonality, service locations, and relationships?” Kinetica also ingests massive amounts of streaming data in real time to ensure answers reflect the most up-to-date information, for example: “What is the real-time status of our inventory levels, and how can we reroute active delivery vehicles to reduce the chances of products being out of stock?”
“While ChatGPT integration with analytic databases will become table stakes for vendors in 2023, the real value will come from rapid insights to complex ad-hoc questions,” said Nima Negahban, Cofounder and CEO, Kinetica. “Enterprise users will soon expect the same lightning-fast response times to random text-based questions of their data as they currently do for questions against data in the public domain with ChatGPT.”
With the ChatGPT integration, querying Kinetica becomes more interactive and conversational. Instead of writing complex SQL queries or navigating elaborate user interfaces, users simply ask questions in natural language. ChatGPT interprets the user’s intent and generates SQL based on their questions, and the user can then ask follow-up questions or provide additional context.
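The overall flow looks roughly like the sketch below, in which the natural-language-to-SQL step is stubbed out with a hypothetical generate_sql() helper and a small SQLite table stands in for the analytic database; the actual Kinetica/ChatGPT integration, prompt format, and schema handling are assumptions and are not shown here:

```python
import sqlite3

def generate_sql(question: str, schema: str) -> str:
    """Stand-in for the natural-language-to-SQL step.
    In a real integration, the question and table schema would be sent to
    the language model; here it returns a canned query for the demo question."""
    return (
        "SELECT region, SUM(amount) AS total_sales "
        "FROM sales "
        "WHERE strftime('%Y-%m', sold_at) = '2023-03' "
        "GROUP BY region "
        "ORDER BY total_sales DESC;"
    )

# A small in-memory table stands in for the analytic database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL, sold_at TEXT);
    INSERT INTO sales VALUES
        ('West', 120.0, '2023-03-02'),
        ('East',  75.5, '2023-03-15'),
        ('West',  60.0, '2023-02-28');
""")

# The user asks in natural language; the generated SQL is executed directly.
question = "Which regions had the highest sales in March 2023?"
sql = generate_sql(question, schema="sales(region, amount, sold_at)")
for region, total in conn.execute(sql):
    print(region, total)  # West 120.0, then East 75.5
```

A follow-up question such as “And how does that compare to February?” would be handled the same way: the model receives the new question plus the prior conversation as context and emits a revised query, which is why the experience feels conversational rather than form-driven.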
“Kinetica plus ChatGPT makes complex, ad-hoc queries truly interactive, avoiding the ‘interactus interruptus’ of other integrations between large language models and databases,” said Amit Vij, Cofounder and President, Kinetica. “Generative AI is a killer app for data analytics.”
Conversational querying has several benefits, including:
- Ease of Use: Users can ask questions in their own words and phrasing, which removes the need to write and debug complex SQL queries and makes the system more intuitive and accessible for a wider range of users. This broader data accessibility ultimately leads to more data-driven decisions.
- Increased Productivity: Conversational querying increases productivity by providing rapid access to information. Users get immediate answers to their questions without waiting for long-running queries or for data pipelines to be built, saving time and improving overall efficiency.
- Improved Data Insights: Conversational querying can help users uncover new insights and patterns in their data. Natural-language questions coupled with immediate answers let users discover unexpected correlations and relationships that may not have been immediately apparent, or that would have been too tedious to uncover, through traditional querying methods. This leads to better decision-making and improved business outcomes.