Ignition + Flow Software: An Analytics Powerhouse

Real-time performance meets advanced data modeling and governed analytics—an unmatched combo for turning raw data into trusted, actionable intelligence.


Proud Member of the Ignition Technology Ecosystem

Flow Software and Timebase bring powerful data management, modeling, and analytics to your Ignition environment—without requiring custom code or external pipelines. Whether you're building a new system or scaling an existing one, our tools are designed to plug directly into Ignition.

This 1–2 punch of historical performance and data governance turns Ignition into a full-stack analytics powerhouse—scalable, sustainable, and smart from edge to cloud.

A Historian You'll Love Using

A high-speed, lossless historian purpose-built for industrial performance.

Native Ignition 8.3 integration makes setup simple and trending fast—without extra licensing or complexity.

Normalize & Cleanse Your Data

Eliminate messy, inconsistent data and establish trust.

Apply business rules, time alignment, and cleansing logic to create a trusted foundation for analysis and reporting.
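Flow applies these rules through its no-code configuration, but as a rough, plain-Python sketch of what one such cleansing rule does, here is a simple range check (the limits and the -999 fault code are hypothetical, chosen only for illustration):

```python
def cleanse(samples, lo, hi):
    """Apply a range-check business rule: values outside [lo, hi] are
    treated as bad quality and dropped before analysis or reporting."""
    return [(ts, v) for ts, v in samples if lo <= v <= hi]

# -999.0 here stands in for a sensor fault code that must not skew averages.
raw = [(0.0, 71.2), (1.0, -999.0), (2.0, 72.8)]
clean = cleanse(raw, lo=0.0, hi=150.0)
# clean -> [(0.0, 71.2), (2.0, 72.8)]
```

Filtering bad-quality values out before aggregation is what keeps downstream KPIs trustworthy.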

Data Models That Matter

Build a structured, object-oriented model that mirrors your plant or process.

Flow helps turn raw tags into meaningful assets, KPIs, and event structures—without scripting.

High Performance Analytics

Define KPIs, calculations, and performance metrics in Flow’s no-code environment.

Trigger insights based on events, shifts, batches, or custom periods.

Store & Publish Information

Flow captures your calculated results and publishes them using a purpose-built Ignition module—or to the UNS, external SQL databases, or cloud platforms.

Your cleansed, contextualized data goes wherever it’s needed.

Custom Visualization

Integrate seamlessly with Ignition Perspective to visualize analytics alongside real-time controls and status, or use Flow's dashboard tools.

Flow's widgets are all HTML-based and give you the freedom to embed them in your project.

Governance & Scalability

Accelerate deployment with reusable templates and enforce consistent logic across teams and systems.

Flow gives you centralized governance over how data is used—ensuring trusted, repeatable outcomes.

AI-Enablement From Day One

Flow turns your entire operational model into an intelligent, structured interface.

You are immediately ready for natural language interaction, autonomous agents, and ML-driven decisions at scale.

Now Part of the Ignition Technology Ecosystem


Timebase Historian
A modern, high-performance historian built for edge and enterprise. With native Ignition modules (coming for Ignition 8.3), Timebase offers simple setup, lossless compression, and lightning-fast trending.

Flow Information Platform
Model your process data, cleanse it, define events and KPIs, and govern the rules that drive your analytics—all without writing a single script.

Data Sharing Back to Ignition
Everything you model and calculate in Flow can be pushed back into Ignition tags, Perspective dashboards, reports, or the Unified Namespace.



The Unified Analytics Framework (UAF) also serves as an auditable single source of truth, logging all changes, versioning calculations, and maintaining a complete dependency map of all model components. This governance framework empowers citizen developers, particularly within the engineering core, with tools to transform deep process knowledge into valuable data streams accessible to the entire enterprise. By combining the expertise of many operational subject matter experts, the UAF facilitates the creation of comprehensive, context-rich data models, enabling rapid, informed decision-making and fostering the continuous improvement and innovation that drive operational efficiency and effectiveness.


Adding Historic Data Access and Transformations to a Unified Namespace

A Unified Namespace provides access to real-time data, but it does not address the need to query historic data or apply data transformations. Let's examine how the architectures industry has relied on fall short of these needs and how the UAF addresses them.

Data Sources and Data Consumers

Building a Unified Analytics Framework (UAF) requires a comprehensive approach to ensure cross-platform compatibility and freedom from vendor lock-in. This involves integrating and abstracting a wide variety of manufacturing systems, databases, and modern solutions.

A robust UAF must be designed to seamlessly integrate with a multitude of data sources and consumers, ensuring that the entire manufacturing environment can operate efficiently and cohesively. This involves connecting to various types of manufacturing systems and databases, enabling the organization to unify data from disparate sources, and delivering actionable insights to a wide range of systems.

Key Data Sources

Time Series Historians

The UAF must account for a variety of historian vendors and technologies to ensure cross-platform compatibility and free the enterprise from historian vendor lock-in. This includes supporting:

  • Enterprise-Class Historians - AVEVA PI, Canary Labs, etc.
  • Site Historians - Wonderware, Proficy, Citect, FactoryTalk, DeltaV, etc.
  • Open-Source Historians - InfluxDB, Timescale, QuestDB, etc.
  • SQL-based Historians - Ignition, VTScada, etc.

Manufacturing Systems and Solutions

Unifying data from these diverse systems necessitates the ability to connect to and pull data from each of them. This integration must encompass both established and legacy providers, as well as modern solutions, to offer comprehensive and flexible data management and operational capabilities. By doing so, organizations can achieve seamless data integration and efficient operations across their entire manufacturing environment.

  • Laboratory Information Management Systems (LIMS) - STARLIMS, LabWare, Thermo Fisher Scientific SampleManager, etc.
  • Manufacturing Execution Systems (MES) - Siemens Opcenter, Rockwell Automation FactoryTalk, AVEVA MES, Sepasoft, etc.
  • Enterprise Resource Planning (ERP) - SAP, Oracle, etc.
  • Computerized Maintenance Management Systems (CMMS) - Fiix CMMS, UpKeep, Maintenance Connection, etc.
  • Enterprise Asset Management (EAM) - IBM Maximo, Infor, SAP, etc.

Real Time Data Capture

In many manufacturing environments, crucial real-time data is not currently being archived, which limits the ability to analyze and transform this information. To address this gap, the UAF should include the capability to collect and store real-time data, playing a role similar to a traditional data historian but specifically for data that is not already being stored. When connecting to a Unified Namespace (UNS) as a real-time source, it is essential to understand what data is already being historized elsewhere. By identifying and capturing only the data that is not currently stored, the UAF ensures that all relevant information is available for comprehensive analysis and transformation, thereby enhancing decision-making and operational efficiency. This approach prevents redundancy, optimizes storage, and ensures a complete and accurate data set for actionable insights.

Common technologies and protocols that should be supported include OPC servers, MQTT brokers (both vanilla and Sparkplug), web APIs, Kafka streams, and other real-time data sources.
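As an illustrative sketch of the gap-filling behavior described above (the topic names are hypothetical, and a real deployment would receive messages through an MQTT or OPC client rather than direct calls), the core logic amounts to archiving only the topics that no other historian already covers:

```python
from dataclasses import dataclass, field

@dataclass
class GapFillArchiver:
    """Stores only real-time values that no other historian already archives.

    `already_historized` is the set of topic paths known to be stored
    elsewhere (e.g. discovered from an existing historian's tag list).
    """
    already_historized: set
    store: dict = field(default_factory=dict)

    def on_message(self, topic: str, timestamp: float, value: float) -> bool:
        # Skip topics another historian covers, preventing redundant storage.
        if topic in self.already_historized:
            return False
        self.store.setdefault(topic, []).append((timestamp, value))
        return True

# 'site/line1/flow' is historized elsewhere; only 'site/line1/temp' is new.
archiver = GapFillArchiver(already_historized={"site/line1/flow"})
archiver.on_message("site/line1/flow", 0.0, 12.5)   # skipped
archiver.on_message("site/line1/temp", 0.0, 71.2)   # archived
```

Checking incoming topics against the known-historized set is what prevents the redundancy and wasted storage the paragraph above warns about.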

SQL Databases

Transaction-based data is crucial in providing the context needed to slice and interpret time series data effectively. It captures the events and transactions that occur within the manufacturing process, offering a detailed view of operational activities and their impact. Incorporating transaction-based data into the UAF allows for a more comprehensive analysis, enabling better decision-making and insights.

To support this integration, the UAF must be capable of connecting to and pulling data from various SQL databases, including widely used technologies such as:

  • Microsoft SQL Server - A widely deployed commercial database known for its enterprise tooling and integration across the Microsoft ecosystem.
  • MySQL - An open-source database that is popular for its performance, reliability, and ease of use.
  • PostgreSQL - An advanced open-source database that supports complex queries and a wide range of data types.
  • Oracle DB - Renowned for its advanced features, scalability, and strong security measures.

Manual Data Capture

Manually entering data, categorizing events, and capturing comments and context are vital to ensuring that all relevant information is included in the UAF, especially when certain data points are not automatically collected by sensors or systems. This type of data often includes critical insights from human observations, quality checks, or maintenance activities that provide additional context and depth to the automated data streams.

To effectively integrate manual data, the UAF should support various methods for capturing and categorizing this information. This can be achieved through web forms, mobile applications, and by importing CSV files.
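A minimal sketch of the CSV-import path, assuming a simple hypothetical column layout (timestamp, category, comment); a web form or mobile app would feed the same structure:

```python
import csv
import io

# A CSV export of manually captured events: observations, quality checks,
# and maintenance notes that no sensor records automatically.
raw = """timestamp,category,comment
2024-05-01T06:10:00,Downtime,Conveyor jam cleared by operator
2024-05-01T07:45:00,Quality,Off-spec colour noted on visual check
"""

def import_manual_entries(text):
    """Parse manually captured events into dicts so they can be
    categorized and joined with the automated data streams."""
    return list(csv.DictReader(io.StringIO(text)))

entries = import_manual_entries(raw)
# entries[0]["category"] -> 'Downtime'
```

Once parsed, each entry carries a timestamp, so it can be aligned against historian data exactly like any automated stream.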

Major Data Consumers

The value of data already transformed and heavily encoded with operational process knowledge and context cannot be overstated. For enterprises, such data is a gold mine, offering rich insights and actionable intelligence. The UAF ensures that this data is not only accessible but also integrated seamlessly into various systems, saving data teams significant time and effort. This feature is especially crucial for data teams overseeing multiple sites, as it enables them to make informed decisions quickly and efficiently, leveraging a unified, context-rich dataset. By structuring the data to match the existing schemas of these systems, the UAF facilitates smooth integration and usability.

Data lakes and warehouses

Standard connectivity methods ensure that existing enterprise architectures and data strategies are supported, meaning solutions like AWS, Azure, Snowflake, Databricks, Google BigQuery, and Oracle can be fed data streams from the UAF.

Business Intelligence tools

Supports integration with standard BI tools such as Power BI and Tableau to facilitate data visualization and business analytics.

Advanced analytics and ML/AI

Connects to advanced analytics platforms and machine learning and AI tools through standard methods, creating a plug-and-play environment. Structured, contextualized, and even preprocessed data, delivered in wide table formats with normalized timestamps, greatly shortens the time to value for these projects.
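A plain-Python sketch of what "wide table formats with normalized timestamps" means in practice: irregular samples from several tags are aligned to a shared time grid, then pivoted into one row per timestamp. The ten-second grid and the tag names are illustrative only:

```python
from collections import defaultdict

def to_wide(samples, period=10.0):
    """Align irregular (tag, ts, value) samples to a shared time grid and
    pivot them into one wide row per grid timestamp (last value wins
    within a bucket)."""
    grid = defaultdict(dict)
    for tag, ts, value in samples:
        bucket = period * int(ts // period)   # normalize timestamp to grid
        grid[bucket][tag] = value
    return [{"ts": ts, **tags} for ts, tags in sorted(grid.items())]

samples = [("temp", 3.2, 70.1), ("press", 4.8, 1.01), ("temp", 12.9, 72.4)]
wide = to_wide(samples)
# wide -> [{'ts': 0.0, 'temp': 70.1, 'press': 1.01}, {'ts': 10.0, 'temp': 72.4}]
```

ML pipelines expect exactly this shape, one feature column per tag and a common time index, which is why handing them pre-pivoted data shortens the time to value.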

Back to the UNS

Publishes processed and contextualized data back to the Unified Namespace (UNS) to maintain a continuous and updated data flow, ensuring the integrity and accuracy of the entire data ecosystem.
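As a sketch of what publishing back to the UNS involves, the snippet below builds a UNS-style topic path and JSON payload for a calculated result. The site/area/asset hierarchy is an illustrative convention, not a fixed standard, and in a real system an MQTT client would carry the actual publish:

```python
import json

def uns_message(site, area, asset, metric, value, ts):
    """Build a UNS-style topic path and JSON payload for a calculated
    result so downstream consumers receive it in a predictable place."""
    topic = f"{site}/{area}/{asset}/Analytics/{metric}"
    payload = json.dumps({"value": value, "timestamp": ts})
    return topic, payload

topic, payload = uns_message(
    "PlantA", "Line1", "Filler", "OEE", 0.87, "2024-05-01T08:00:00Z"
)
# topic -> 'PlantA/Line1/Filler/Analytics/OEE'
```

Publishing calculated KPIs under a predictable branch of the namespace is what keeps the rest of the ecosystem continuously updated without point-to-point integrations.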

HOW FLOW WORKS

Origins of the Unified Analytics Framework

The Unified Analytics Framework (UAF) was conceived by industry leaders Graeme Welton, Leonard Smit, Walker Reynolds, Allen Ray, and Jeff Knepper, with contributions from manufacturing user groups like IntegrateLive! and the Industry 4.0 community. They recognized a common issue: fragmented data systems and inefficiencies due to disconnected data silos, particularly the spread of data transformation processes across the application layer, making them impossible to govern. Determined to centralize integration and transformation work, they set out to create a framework that would unify data sources while prioritizing OT's requirements.

Through collaborative efforts, discussions, and workshops, the concept of the UAF took shape. The aim was to develop a framework that could integrate, contextualize, and govern data, providing a single, auditable source of truth. The UAF was designed to make OT's job easier and ensure reliable data governance through its ability to log changes, version calculations, and maintain a full dependency map.

The leadership at CESMII, including John Dyck, Jonathan Wise, and John Louka, played a key role in highlighting the problems facing manufacturers and the need for centralized governance, helping to shape the understanding and necessity of the UAF. Today, the UAF continues to evolve with ongoing contributions from the manufacturing community, delivering improved decision-making and operational efficiency across the industry.