AI Module in DataTalk: Cloud or On-Premises Options

The AI module in DataTalk works in two configurations, tailored to your specific needs: a cloud-based AI LLM solution or a fully on-premises setup. Each option offers distinct benefits, so you can choose the one that best suits your requirements.

Cloud-Based AI LLM

With the cloud solution, DataTalk connects to a cloud-based large language model (LLM) such as ChatGPT. Here’s how it works and what it offers:

  • Internet Requirement: The cloud solution requires an active internet connection, as the AI model is hosted externally.
  • Benefits: Cloud-based LLMs give you access to highly advanced, large-scale models without any local infrastructure. You get the latest AI capabilities and the processing power to handle extensive data analysis and more complex queries (a brief illustrative sketch follows this list).
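
For illustration only: a cloud-hosted LLM is typically reached over the provider's HTTP API. The sketch below assumes the openai Python client and an API key available in the environment; the model name and prompt are examples and do not represent DataTalk's actual integration code.

```python
# Illustrative sketch only: how a cloud-hosted LLM is typically queried.
# Assumes the `openai` Python package (v1+) and an API key in the
# OPENAI_API_KEY environment variable; not DataTalk's internal code.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[
        {"role": "user", "content": "Summarize last month's sales by region."}
    ],
)
print(response.choices[0].message.content)
```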

On-Premises (Local) AI LLM

For organizations that require more control over their data, DataTalk also supports a fully on-premises solution:

  • No Internet Requirement: This configuration allows you to operate the AI module without an internet connection, keeping all data processing on-site.
  • Data Privacy: With an on-premises setup, you have complete assurance that your data remains within your own infrastructure, with no sharing or exposure to third-party providers.
  • Hardware Requirements: Running an LLM locally requires a powerful PC or server, as large language models demand substantial processing power and memory. Due to these hardware constraints, the on-premises option typically uses a smaller model than cloud-based alternatives (a rough sketch of such a local setup follows this list).
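
As a rough sketch of what running locally can look like: many on-premises deployments expose the smaller model through an OpenAI-compatible server on the same machine (for example, Ollama listens on http://localhost:11434/v1 by default). The server, port, and model name below are assumptions for illustration, not DataTalk's prescribed setup.

```python
# Illustrative sketch only: querying a locally hosted, OpenAI-compatible
# LLM server (assumed here to be Ollama on its default port).
# All processing stays on the local machine; no internet connection is used.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local server endpoint
    api_key="not-needed-locally",          # placeholder; local servers typically ignore it
)

response = client.chat.completions.create(
    model="llama3",  # example of a smaller model that fits local hardware
    messages=[
        {"role": "user", "content": "Summarize last month's sales by region."}
    ],
)
print(response.choices[0].message.content)
```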

Flexible and Tailored to Your Needs

At DataTalk, we support both approaches and make it easy to choose the best option for your needs:

  • Seamless Integration: The AI module can be connected either to an external cloud provider or to your local LLM setup, as sketched after this list.
  • Plug-and-Play Hardware Option: If you choose the on-premises route but prefer a hassle-free installation, DataTalk also offers pre-configured hardware designed specifically to run LLMs locally. This plug-and-play solution makes deploying a local AI model simple and efficient.
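
To illustrate what "seamless" can mean in practice, the same client code can often switch between the cloud and a local server through configuration alone. The environment-variable names below are hypothetical and only show the pattern.

```python
# Illustrative pattern only: selecting cloud or on-premises via configuration.
# The variable names DATATALK_LLM_MODE and DATATALK_LLM_URL are hypothetical.
import os
from openai import OpenAI

if os.getenv("DATATALK_LLM_MODE", "cloud") == "local":
    client = OpenAI(
        base_url=os.getenv("DATATALK_LLM_URL", "http://localhost:11434/v1"),
        api_key="not-needed-locally",
    )
    model = "llama3"       # smaller local model
else:
    client = OpenAI()      # hosted provider; key read from OPENAI_API_KEY
    model = "gpt-4o-mini"  # larger cloud model

# From here on, the application code is identical for both modes.
```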

With this flexibility, DataTalk lets you choose between a high-capacity cloud solution and a fully on-premises setup, giving you control over both performance and data privacy.

