

Getting Started with Conduit

Welcome to Conduit! This guide will help you get up and running with the Industrial Context Mesh in under 30 minutes.

Prerequisites

Before you begin, ensure you have the following:

  • Node.js 20+ (recommended: use nvm or fnm for version management)
  • Docker (version 20.10 or later) with Docker Compose
  • PostgreSQL 15+ (included in Docker Compose setup)
  • Redis (included in Docker Compose setup)
  • Network access to your data sources (Splunk, MQTT brokers, OPC-UA servers)
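
If you want to confirm the toolchain before continuing, the standard version checks are enough (adjust for your shell; nothing here is Conduit-specific):

# Verify local tooling against the minimums listed above
node --version            # expect v20.x or later
docker --version          # expect 20.10 or later
docker-compose --version  # or `docker compose version` with the Compose v2 plugin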

Quick Start

Step 1: Clone the Repository

git clone https://github.com/your-org/conduit.git
cd conduit

Step 2: Configure Environment

Copy the example environment file and configure your settings:

cp .env.example .env

Key environment variables:

# Database
DATABASE_URL=postgresql://conduit:password@localhost:5432/conduit

# Redis
REDIS_URL=redis://localhost:6379

# LLM Provider (choose one)
LLM_PROVIDER=claude          # claude, openai, azure, ollama, mock
ANTHROPIC_API_KEY=your-key   # if using Claude
OPENAI_API_KEY=your-key      # if using OpenAI

# Splunk (if connecting to Splunk)
SPLUNK_HOST=your-splunk-host
SPLUNK_TOKEN=your-splunk-token

Step 3: Start with Docker Compose

docker-compose up -d

This starts:

  • Control Plane (Fastify) on port 3001
  • Portal (React/Vite) on port 5173
  • PostgreSQL with pgvector extension
  • Redis for caching and pub/sub
  • NATS on port 4222 (with JetStream enabled)
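
To confirm the stack came up cleanly, check the container status and make sure the Control Plane is answering on its port. This is a minimal sketch: the service names come from your docker-compose.yml, and no specific health route is assumed here.

docker-compose ps               # all services should report "Up"
docker-compose logs -f          # tail the logs if anything is restarting
curl -i http://localhost:3001/  # Control Plane (Fastify) listens on 3001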

Step 4: Access the Portal

Open your browser and navigate to http://localhost:5173. You should see the Conduit portal.

Configure Your First Translator

Conduit ships with production translators for Splunk, MQTT, and OPC-UA, plus the MCP IoT Gateway for industrial protocols. Each translator runs as a standalone container and auto-registers with Conduit on startup. Queries can span multiple data sources simultaneously through the DAG-based query planner.
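
Because each translator is a standalone container, you can also run one outside the main Compose stack and let it auto-register against the Control Plane. The sketch below is illustrative only: the image name and environment variable names are assumptions, not documented values.

# Hypothetical example - image and variable names are placeholders
docker run -d --name splunk-translator \
  -e CONDUIT_URL=http://localhost:3001 \
  -e SPLUNK_HOST=your-splunk-host \
  -e SPLUNK_TOKEN=your-splunk-token \
  your-org/conduit-translator-splunk:latest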

Connect to Splunk

  1. In the portal, navigate to Settings > Translators
  2. Click Add Translator
  3. Select Splunk from the list
  4. Enter your Splunk connection details:
translator:
  type: splunk
  name: splunk-production
  host: your-splunk-host
  port: 8089
  token: ${SPLUNK_TOKEN}
  5. Click Test Connection to verify connectivity
  6. Click Save to activate the translator
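
If Test Connection fails, first confirm that Splunk's management port (8089) is reachable from the machine running Conduit. A quick check against Splunk's REST API, assuming you are using a Splunk authentication token (-k skips certificate verification for self-signed certificates):

curl -k -H "Authorization: Bearer $SPLUNK_TOKEN" \
  https://your-splunk-host:8089/services/server/info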

Connect to MQTT

translator:
  type: mqtt
  name: mqtt-plant-floor
  broker:
    host: mqtt.company.com
    port: 1883
  subscriptions:
    - topic: plant/+/temperature
      qos: 1
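
To verify the subscription end to end, you can publish a test reading that matches the plant/+/temperature topic filter. This uses mosquitto_pub from the Mosquitto client tools, which are not part of Conduit; the plant segment and payload shape are just examples.

mosquitto_pub -h mqtt.company.com -p 1883 \
  -t "plant/chicago/temperature" -q 1 \
  -m '{"value": 72.4, "unit": "F"}'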

Your First Query

Now that you have a translator configured, let's run your first query using NQE (Natural Query Engine).

Using the Query Interface

  1. Navigate to Query in the portal
  2. Type a natural language question:
Show average temperature by reactor during the last 24 hours where plant is Chicago
  3. Press Enter or click Run

Conduit will:

  1. Interpret your natural language query with the configured LLM provider
  2. Match against Golden Templates for faster, more accurate compilation
  3. Generate a structured query for your review
  4. Execute the query against your connected adapters
  5. Return the results

Example Response

{
  "query": "Show average temperature by reactor during the last 24 hours",
  "compiledTo": "SPL",
  "confidence": 0.94,
  "metadata": {
    "executionTime": 142,
    "rowCount": 3,
    "sources": ["splunk-production"],
    "template": "golden-template-avg-by-group"
  },
  "data": [
    { "reactor": "REACTOR_01", "avg_temp": 72.4, "samples": 1440 },
    { "reactor": "REACTOR_02", "avg_temp": 68.9, "samples": 1438 },
    { "reactor": "REACTOR_03", "avg_temp": 74.1, "samples": 1440 }
  ]
}
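
Queries can also be issued programmatically; see the REST API reference for the actual routes. The endpoint path and payload below are illustrative assumptions, not documented values.

# Hypothetical sketch - consult the REST API reference for the real route and payload
curl -X POST http://localhost:3001/api/query \
  -H "Content-Type: application/json" \
  -d '{"query": "Show average temperature by reactor during the last 24 hours where plant is Chicago"}'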

Next Steps

Now that you have Conduit running, explore these resources:

  • Architecture - Understand the system architecture
  • Adapters - Configure the Splunk, MQTT, OPC-UA, and MCP IoT Gateway translators
  • NQE Guide - Master natural language queries and Golden Templates
  • API Reference - Build integrations with the REST API

Getting Help

  • Documentation: You're here!
  • GitHub Issues: Report bugs or request features
  • Enterprise Support: Contact sales for dedicated support

Ready to dive deeper? Check out the Architecture guide to understand how Conduit's context mesh works.
