Database expertise for everyone

Chatting with company data: LLM Insight Expert

Thousands of company data records from different sources, spread across various departments: keeping track of all this data is time-consuming and ties up considerable resources. To use this valuable data to optimize services or make better business decisions, IT specialists have to program complex database queries. This is changing with a new technology from the Fraunhofer Cluster of Excellence Cognitive Internet Technologies CCIT: with the AI-supported “LLM Insight Expert,” developed by experts from Fraunhofer IAIS and KI.NRW, large amounts of data can be searched in no time using a chat function. Because queries are posed in natural language, far less specialist knowledge is required.

© KI.NRW
Database queries in no time, in natural language and without complex programming skills: LLM Insight Expert makes this possible by combining language AI with big data technologies.

"Which customer group purchased our product most frequently in the past two summers? Did special offers play a role in German-speaking countries?" Instead of poring over Excel sheets, company data, and marketing documents while also tasking IT specialists with complex database queries, a sales representative types their questions in natural language into the chat screen of a large language model (LLM). Seconds later, they receive a well-founded answer, accompanied by a concise graph showing trends, patterns, and correlations – the basis for a new advertising campaign tailored to the target group.

With LLM Insight Expert, even IT novices can now carry out such data-based analyses in no time at all: the AI technology automatically takes over tasks that previously required complex searching, linking, and programming.

Lack of overview becomes a cost factor

Studies show that up to two-thirds of existing corporate data worldwide remains unused because different formats and a lack of standardization complicate access and linking. “Larger companies in particular often lack an overview of the entire ‘data lake’ in which relevant knowledge is stored across different locations and systems,” explains Tasneem Tazeen Rashid, research engineer at the Fraunhofer Institute for Intelligent Analysis and Information Systems IAIS. The institute developed the AI solution in collaboration with the KI.NRW competence platform as part of a Fraunhofer CCIT project.

Valuable resources are tied up in the search for data sources and access permissions before IT experts can even start time-consuming queries, possibly in different programming languages. In large companies, this results in millions of dollars being wasted every year.

AI chatbot as a “translator”

“Making everyone an expert”: With this motto in mind, LLM Insight Expert creates a virtual database and connects it to a large language model—a trained language AI that, like a GPT model, can understand questions or commands in natural language and create and process text and graphics from all available data sources.

The large language model collects metadata from the various data sources used in the company (e.g., CSV or Parquet files, MongoDB, Cassandra, or MySQL) and converts it into a uniform format using data mapping.
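To illustrate this step, the following Python sketch collects schema metadata from a CSV file, a Parquet file, and a MySQL table and maps it into one uniform catalog. The file names, connection string, and catalog fields are hypothetical placeholders, not details of the Fraunhofer implementation.

```python
# Hypothetical sketch of metadata harvesting and mapping (placeholder
# file names and connection string; not the Fraunhofer implementation).
import pandas as pd
import pyarrow.parquet as pq
from sqlalchemy import create_engine, inspect


def csv_metadata(path: str) -> list[dict]:
    # Read a small sample to infer column names and data types.
    sample = pd.read_csv(path, nrows=100)
    return [{"source": path, "column": col, "dtype": str(dtype)}
            for col, dtype in sample.dtypes.items()]


def parquet_metadata(path: str) -> list[dict]:
    # Parquet files carry an explicit schema that can be read directly.
    schema = pq.read_schema(path)
    return [{"source": path, "column": field.name, "dtype": str(field.type)}
            for field in schema]


def mysql_metadata(url: str, table: str) -> list[dict]:
    # url like "mysql+pymysql://user:password@host/db" (placeholder).
    inspector = inspect(create_engine(url))
    return [{"source": f"{url}/{table}", "column": col["name"], "dtype": str(col["type"])}
            for col in inspector.get_columns(table)]


# The uniform metadata catalog is what the language model later reasons over.
catalog = csv_metadata("sales_2023.csv") + parquet_metadata("orders.parquet")
```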

This makes it possible to consolidate the data in a uniform virtual model (e.g., tabular), correlate it, and summarize it. The virtual database can then be processed with big data technology such as Apache Spark, which is designed to handle huge amounts of data efficiently in a very short time. On this basis, the LLM translates complex questions into standardized SPARQL queries.
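A minimal PySpark sketch of this consolidation might look as follows: each source is registered as a view of one virtual tabular model, and a query derived from the user's question is executed against it. The file names, column names, and the hard-coded query stand in for what the language model would generate; for brevity the sketch uses Spark SQL rather than the SPARQL layer described above.

```python
# Illustrative PySpark sketch of the virtual tabular model (hypothetical
# file and column names; not the Fraunhofer implementation).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("virtual-model-demo").getOrCreate()

# Register each source as a view of the shared virtual model.
spark.read.csv("sales_2023.csv", header=True, inferSchema=True) \
     .createOrReplaceTempView("sales")
spark.read.parquet("orders.parquet") \
     .createOrReplaceTempView("orders")

# In the real system, the LLM would translate the user's question into a
# query against this model; here the query is written by hand.
question = "Which customer group purchased our product most frequently last summer?"
generated_query = """
    SELECT s.customer_group, COUNT(*) AS purchases
    FROM sales s
    JOIN orders o ON s.order_id = o.order_id
    WHERE o.order_date BETWEEN '2024-06-01' AND '2024-08-31'
    GROUP BY s.customer_group
    ORDER BY purchases DESC
"""
spark.sql(generated_query).show()
```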

The LLM then converts the results back into natural language and visualizations such as graphs or diagrams.
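As a simple illustration of this final step, the sketch below turns a hypothetical query result into a one-sentence answer and a bar chart; the figures and wording are invented for the example.

```python
# Hypothetical example of converting a query result into text and a chart.
import matplotlib.pyplot as plt

# Invented result rows: (customer group, number of purchases).
rows = [("Students", 1240), ("Families", 980), ("Seniors", 410)]

top_group, top_count = rows[0]
print(f"The group '{top_group}' purchased the product most frequently "
      f"({top_count} purchases in the selected period).")

groups = [group for group, _ in rows]
counts = [count for _, count in rows]
plt.bar(groups, counts)
plt.ylabel("Purchases")
plt.title("Purchases per customer group")
plt.show()
```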

Customizable and secure

One special feature is that LLM Insight Expert can be adapted to each individual application thanks to the use of various open-source software components.

For example, multi-source integration ensures that a large number of different data sources – specifically those used in the respective company – can be connected via interfaces. And thanks to semantic (machine-readable) links, the company's familiar metadata can continue to be used and maintained after its initial assignment.

Another advantage is the protection of company data: because open-source LLMs are used, all data is processed, and the insights gained are made available, exclusively for the intended purpose. This ensures that the data does not flow into the models of international commercial providers.

Ready for use in just a few days

The adjustments to the system required for the respective company and its infrastructure can be made quickly, regardless of the amount of data: it takes between one day and one week to connect all data sources and assign the metadata used within the company. Once this is done, LLM Insight Expert is ready for use.

Next use case: efficient machine maintenance

Other possible applications for LLM Insight Expert beyond querying stored company data are currently being tested: “In an initial use case, we are using the tool for machine maintenance during ongoing production,” explains Tasneem Tazeen Rashid. This is made possible by integration into the edge-cloud continuum (ECC), a continuous data space spanning cloud computing in central data centers and edge computing at the end devices: the latest machine data, maintenance information, and large language models are hosted in a company-owned cloud. Thanks to modern sensor technology, deviations from defined standards are detected immediately, and feedback, including possible solutions, is sent back via the cloud. “This saves a lot of time: you no longer have to go through individual documents and processes and stop production.”
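A very simplified sketch of such an edge-side check, with hypothetical sensor names, reference ranges, and cloud endpoint, could look like this: readings are compared against reference values, and deviations are reported to the company-owned cloud.

```python
# Hypothetical sketch of an edge-side deviation check (invented sensor
# names, reference ranges, and endpoint; not the actual Fraunhofer setup).
import requests

REFERENCE_RANGES = {"spindle_temp_c": (20.0, 75.0), "vibration_mm_s": (0.0, 4.5)}
CLOUD_ENDPOINT = "https://cloud.example.internal/maintenance/alerts"  # placeholder


def check_reading(sensor: str, value: float) -> None:
    low, high = REFERENCE_RANGES[sensor]
    if not low <= value <= high:
        # Report the deviation; on the cloud side, an LLM could look up
        # maintenance documents and suggest possible solutions.
        requests.post(CLOUD_ENDPOINT,
                      json={"sensor": sensor, "value": value},
                      timeout=5)


check_reading("vibration_mm_s", 6.2)  # out of range -> alert is sent
```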

LLM Insight Expert scores highly with medium-sized and large companies

Whether for handling large amounts of data from different sources or as an assistance system for production, LLM Insight Expert is ready to simplify many tasks, especially in medium-sized and large companies – from accessing information and insights into sales trends to in-depth analyses that support quick and sustainable decisions.

Try it yourself

Do you see potential for using LLM Insight Expert in your company? Start integrating this efficient solution into your company together with Fraunhofer CCIT.