Your data. Your infrastructure. Your call.

Digel runs three ways: private cloud managed by us, cloud infrastructure managed by you, and fully on-premise with a local model. Here is how to choose.

Thomas
Every serious industrial customer asks the same question before they commit to any software. Where does our data go? Not as a compliance checkbox. As a real question, because the data inside a plant is sensitive. Production volumes, failure history, process parameters, sometimes things competitors would be happy to have.

We get asked about this a lot. Prospects walk through it line by line. Which cloud provider. Which region. Whether data is used to train models. What happens when an operator types a question into chat. Where that question goes and who can see it.

There are three options.

Private cloud

The default for most customers. We manage the infrastructure, but your environment is isolated from every other customer. Separate database, separate storage, separate encryption keys. Nothing shared.

This is the Professional tier. We handle uptime and updates. Your data stays inside an environment only your team can reach.

Data residency defaults to Europe. If your procurement or compliance team needs to verify the setup, we make that straightforward. Security documentation is available before you sign.

Customer-managed cloud

Some customers have an existing cloud environment and a hard requirement to keep data inside it. For those, we offer what the industry calls Bring Your Own Cloud (BYOC).

You run Digel on your own Kubernetes cluster. We provide licensed Helm charts and handle updates on your schedule. Your infrastructure team controls the environment. None of your data touches our systems.
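To give a feel for the shape of that deployment, here is a sketch of a values override file. The keys, hostnames, and registry below are illustrative assumptions, not the actual Digel chart schema; the point is that every dependency lives in infrastructure you already control.

```yaml
# values.yaml -- hypothetical overrides for a BYOC install.
# All names and keys are illustrative, not the real chart schema.
image:
  registry: registry.internal.example.com   # images mirrored into your own registry
database:
  host: postgres.internal.example.com       # your managed database, your account
storage:
  bucket: digel-data                        # object storage inside your cloud
ingress:
  host: digel.internal.example.com          # reachable only on your network
```

Applied with a standard `helm upgrade --install` against your cluster, on your schedule, by your infrastructure team.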

This is the Enterprise tier. It takes more work to stand up, and you own the operational responsibility for the cluster. What you get back is full visibility into what is running and where, and an easy answer for any infosec team asking whether your factory data leaves your cloud account.

Security reviews also go faster here. Your team is reviewing infrastructure they already manage, not ours.

On-premise and local model

For customers who can't send data to any external cloud, Digel can run entirely on hardware you own. Air-gapped environments included.

Same Helm-based deployment. Your servers, your network, nothing leaves the building.

On-premise also means you can run Digel with a local language model. If sending queries to Anthropic, Google, or OpenAI is off the table because of data classification or network policy, a local model inside your environment handles inference. Answer quality depends on which model you run. The architecture is the same either way, and the context graph is still what makes the answers useful.
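As a sketch of what pointing inference at a local model could look like: the keys below are assumptions for illustration, not documented Digel configuration, and the endpoint assumes a model server such as vLLM exposing an OpenAI-compatible API inside your network.

```yaml
# Hypothetical inference config for an air-gapped install.
# Keys are illustrative; the endpoint is a local OpenAI-compatible server.
inference:
  provider: local
  baseUrl: http://vllm.internal:8000/v1   # model server on your own hardware
  model: llama-3.1-70b-instruct           # whatever your hardware can serve
  # No external API key: nothing leaves the building.
```

Swapping providers is a configuration change, not an architectural one, which is why the rest of the product behaves identically.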

We configure this with you. If on-premise is a requirement, talk to us early. Model options and hardware sizing are easier to work through before you are deep into an evaluation.

Same product, wherever it runs

Context modeler, triage, chat agent, maintenance, notes, root cause analysis. The agents, the tools, the cooperation model. All of it is identical across the three options.

The only thing that changes is who manages the infrastructure and where the data sits.

If you're in the middle of a security review and want to go deeper on any of this, reach out at hello@digel.io. We can share documentation and get someone technical on a call before you commit to anything.

Want to see Digel in action?

Book a 30-minute demo or try the live playground.