
AI at Sensetence – LLMs, AI Integration & Self-hosted Solutions

When people talk about artificial intelligence (AI) these days, many immediately think of large language models (LLMs) such as OpenAI's GPT. But AI is much more than just LLMs: it also includes rule-based systems, machine learning (ML), probabilistic models (e.g. Bayesian networks) and symbolic AI (e.g. expert systems). Neural networks are likewise used in many other areas, for example in image recognition (convolutional neural networks, CNNs) or autonomous systems (reinforcement learning).

Nevertheless, we will focus on LLMs in this article, as they are currently the most widespread AI technology and are particularly relevant for many companies.

Hosting the LLM Yourself?

In principle, there are two ways to use LLMs:

  • Hosted solutions (e.g. OpenAI API, Google Cloud AI, Azure AI, AWS Bedrock)
  • Self-hosted solutions (e.g. via Ollama on your own hardware)

Self-hosted Models

There are now many open source LLMs that can be operated locally. We have the necessary hardware to test and run such models. A powerful graphics card with sufficient VRAM is crucial; our RTX 3060 with 12 GB, for example, is sufficient for the following models:

  • Llama 3 – A powerful open source LLM from Meta
  • Mistral – Optimized for high efficiency and accurate responses
  • Gemma – A family of lightweight open models developed by Google
  • Phi – Microsoft's family of small language models, focused on efficient AI applications
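
As a minimal sketch of what running one of these models locally looks like: Ollama exposes a local HTTP API (port 11434 by default) that can be called from any language. The model name and prompt below are placeholders for illustration; the model must first be pulled, e.g. with "ollama pull llama3".

```python
import json
import urllib.request

# Minimal sketch: query a locally running Ollama instance via its HTTP API.
# Assumes "ollama serve" is running and the model has already been pulled.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",  # any locally available model, e.g. "mistral", "gemma", "phi"
    "prompt": "Summarize the advantages of self-hosted LLMs in two sentences.",
    "stream": False,    # return the full answer at once instead of token by token
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.loads(response.read().decode("utf-8"))

print(result["response"])
```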

Hosted Models

Commercial providers of hosted AI models include OpenAI, Google Cloud AI, Azure AI and AWS Bedrock. Sensetence has primarily worked with the OpenAI API to date.
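
For comparison, using a hosted model typically requires little more than an API key and an SDK call. The following sketch uses the official OpenAI Python SDK; the model name is a placeholder, and the concrete choice depends on the project.

```python
from openai import OpenAI  # official OpenAI Python SDK (pip install openai)

# Minimal sketch of a hosted call; the API key is read from the
# OPENAI_API_KEY environment variable.
client = OpenAI()

completion = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; the concrete model depends on the project
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the trade-offs of hosted LLM APIs in two sentences."},
    ],
)

print(completion.choices[0].message.content)
```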

Advantages of Hosted over Self-hosted Models

  • Model quality: Higher with hosted solutions, as companies like OpenAI use huge resources for training; variable with self-hosted solutions, which are often not as good
  • Costs: Relatively low for hosted solutions (e.g. per API call); high hardware costs for self-hosted solutions (e.g. Nvidia A100 with 80 GB: approx. 26,000 € as of 01.03.2025)
  • Implementation: Simple for hosted solutions (no own infrastructure required); more complex for self-hosted solutions (hosting, infrastructure, maintenance)
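
To make the cost trade-off more concrete, a rough break-even estimate can help. The per-call price below is purely an assumed figure for illustration; the hardware price is taken from the comparison above, and running costs such as electricity, maintenance and staff time are deliberately ignored.

```python
# Rough break-even sketch: hosted per-call pricing vs. one-off hardware cost.
# All figures are illustrative assumptions, not real quotes.
hardware_cost_eur = 26_000       # e.g. Nvidia A100 with 80 GB (see comparison above)
cost_per_api_call_eur = 0.01     # assumed average price per hosted API call

break_even_calls = hardware_cost_eur / cost_per_api_call_eur
print(f"Hardware pays for itself after roughly {break_even_calls:,.0f} API calls")
# Running costs (electricity, maintenance, staff time) are ignored here and
# would push the real break-even point for self-hosting even further out.
```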

Use in Customer Systems

So far, we have primarily used OpenAI models in customer projects, as our experience has shown that they deliver better results than locally hosted models. In addition, our customers have consented to their data being processed by OpenAI.

Exemplary AI use cases in customer systems:

  • Automatic creation of email suggestions
  • Correction and optimization of email texts for marketing campaigns
  • Writing summary texts based on information provided (stock market data, reports, summaries from other sources)
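
To illustrate the first use case, an email suggestion can be generated with a single call to a hosted model. The function name, prompt and model below are simplified placeholders for illustration, not code from a customer project.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def suggest_email_reply(incoming_email: str, tone: str = "friendly and concise") -> str:
    """Draft a reply suggestion for an incoming email (illustrative sketch only)."""
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; the actual model depends on the project
        messages=[
            {
                "role": "system",
                "content": f"You draft {tone} email replies for a customer-facing team.",
            },
            {
                "role": "user",
                "content": f"Suggest a reply to the following email:\n\n{incoming_email}",
            },
        ],
    )
    return completion.choices[0].message.content

print(suggest_email_reply("Hello, could you send me an updated quote for the premium plan?"))
```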

Use in Software Development

In the interest of efficiency, and thus lower costs for our customers, we try to work with the best available AI tools. Among other things, we rely on:

  • GitHub Copilot for more efficient code generation
  • LLM systems for software development, where the customer has agreed to their use
  • Training for developers to ensure that no personal data is leaked into public models

Sensetence: Your AI Integration for Customized Software

Would you like to integrate AI into your software? Sensetence is your partner for customized AI solutions! We specialize in software development and offer tailored AI integration for automated workflows, intelligent data management and much more.

Based in Augsburg, we operate throughout Germany, whether on-site or remotely. Let’s make your software smarter together!