Information Management Simplified

Eliminate silos, scale your analytics, and securely deliver actionable information where it’s needed most.

Toss the Industry 3.0 Playbook, Build a Scalable Data Foundation

Data projects that adhere to an Industry 3.0 playbook often result in fragile, unscalable data integrations, burdened with custom scripting and excessive coding. Time and again, we find these projects constructed on top of platforms like BI tools, reporting solutions, historian products or, even worse, in Excel. This widespread dispersion of effort leads to a lack of data governance, rendering any form of templatizable scalability unattainable.

For a true digital transformation, we must rewrite the playbook. Our current architectures are flawed; centralizing information management is no longer optional—it's necessary. Welcome to Flow, where our Information Management Solution is meticulously designed to revolutionize data handling and utilization in the manufacturing industry, addressing these very challenges.
Flow stands out as a specialized Information Management Platform crafted for the manufacturing sector, where it functions as a centralized hub. It is purpose-built to gather, consolidate, and standardize data from various operational and enterprise data silos (regardless of platform or technology), efficiently orchestrating a sophisticated data transformation process. Within this framework, Flow executes detailed calculations and anchors the data within precise timeframes and models before making it readily available as contextualized information.
Flow transforms Industry 3.0 data projects from fragile, scattered systems into a robust, centralized infrastructure that's easy to govern and ready to scale. Don't just manage, innovate.

Will It Scale?

Flow stands out in the manufacturing sector for its scalable capabilities, which hinge on two key principles.

The first principle, often overlooked, is cultural. We cannot overemphasize the importance of collaboration among operational subject matter experts (SMEs). Flow excels by uniting these experts, enabling them to share and integrate their deep, often untapped tribal knowledge directly into the information management process. This collaboration is the cornerstone of scaling analytics effectively in manufacturing environments. As data moves from the operational site to broader platforms like cloud services, data lakes, or business intelligence tools, it is meticulously enriched with operational context and insights from teams specializing in manufacturing, controls, production, quality, and maintenance engineering.

The second principle is technological. Flow has been built from the ground up to ensure scalability, with key features such as:

No Code / Low Code

Flow empowers users to manage and transform data with minimal coding, making it accessible for non-technical subject matter experts to contribute effectively and streamline workflows.

Universal Governance

Flow ensures consistent data governance across the platform, allowing for standardized control and compliance, which is crucial as operations scale. You can ensure your data is cleansed, calculated, and presented the way you want it across your entire organization.

Flexible Architecture

Built on a flexible infrastructure that includes Docker, Linux, and Windows, Flow supports both horizontal and vertical scaling and can be deployed at the edge, on-site, or in the cloud to meet diverse operational needs.

Platform Agnostic

No matter what variety of SCADA, historian, MES, ERP, or SQL sources you need to connect, Flow can unify your data. Flow also maintains compatibility with a wide range of data consumers, ensuring seamless integration and data flow regardless of cloud platform, business intelligence vendor, or database.

Template Library

Flow's Template Library enhances scalability by allowing you to templatize and manage the information models you build. Templates can be instantiated multiple times, nested within each other, and versioned to accommodate variations of the same foundational model, streamlining deployment and maintenance across different scenarios.

API Driven

With its robust API-driven approach, Flow allows for extensive customization and integration, facilitating automation and connectivity between various systems and technologies.

Three Steps to Building Your Information Management Platform

Data Modeling

We start by creating a single consolidated information model to abstract and unify multiple underlying namespaces. That's a mouthful; what does it mean?

Decoupled

Flow works independently of your data sources, remaining platform agnostic. You can connect and unify dozens of data sources while remaining independent of their functional namespaces.

Finally, a way to unite time-series historians, SQL databases, real-time data brokers and servers, and even manual data capture. ERP, MES, Historians, LIMS, CMMS, and many other operational systems can easily be connected.

Abstracted and Cleansed

The Operator, Team Leader or Manager accessing the Flow Model doesn't need to know which tag or SQL query was used to link data to Flow. In fact, they don't want to know, nor do they care! They just want information they can trust, and that requires first cleansing the data of anomalies, outliers, and bad data. Flow makes this easy by giving you a single location to create rules around how you want to cleanse and normalize data before moving it further downstream.
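
To make this concrete, here is a minimal sketch of the kind of cleansing rule you might centralize. The function name, thresholds, and approach are hypothetical illustrations of the concept, not Flow's actual API.

```python
# Hypothetical sketch of a centralized cleansing rule, not Flow's actual API.
from statistics import mean, stdev

def cleanse(samples, low, high, z_limit=3.0):
    """Drop out-of-range values, then discard statistical outliers."""
    in_range = [s for s in samples if low <= s <= high]
    if len(in_range) < 2:
        return in_range
    mu, sigma = mean(in_range), stdev(in_range)
    # Keep only points within z_limit standard deviations of the mean.
    return [s for s in in_range if sigma == 0 or abs(s - mu) / sigma <= z_limit]

raw = [101.2, 99.8, -1.0, 100.4, 5000.0, 98.9]   # -1.0 and 5000.0 are bad data
print(cleanse(raw, low=0.0, high=1000.0))        # [101.2, 99.8, 100.4, 98.9]
```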

Structured Yet Flexible

The Flow Model is hierarchical and generic by design. We can build our model using ISA95, ISA88, PackML, custom asset, twin thing, entity meta-model, or any combination of these. (We're not sure twin thing is really a thing, but you get the idea). The Flow Model represents physical assets, performance indicators, and logical entities. You can structure this model by area, department, or both. The point is - it is flexible. And, despite its hierarchical nature, the Flow Model allows for object "linking" across the structure.
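
As a rough illustration (the names and structure below are hypothetical, not Flow's schema), a hierarchical model with cross-structure linking might look like this:

```python
# Hypothetical sketch of a hierarchical model with cross-structure links.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    children: list = field(default_factory=list)
    links: list = field(default_factory=list)   # references across the tree

site = Node("Site A", children=[
    Node("Packaging Area", children=[
        Node("Line 1", children=[Node("Filler"), Node("Capper")]),
    ]),
    Node("Utilities", children=[Node("Compressed Air")]),
])

# "Linking" lets Line 1 reference the shared Utilities asset
# even though it lives in a different branch of the hierarchy.
line1 = site.children[0].children[0]
line1.links.append(site.children[1].children[0])
```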

Templatized Deployments

The Flow Model can be standardized across multiple sites or production facilities. The source of a KPI will differ across sites, but the name will be consistent. On Site A, the KPI represents tag "FL001-123-FQ001.PV" (see why the managers don't care!) and on Site B, the measure represents a manually input value. But both KPIs are named "Line 1 Filler Volume", and that is what everyone will know it as, everywhere they go. Flow Templates allow for this model standardization.
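
Conceptually, this is a single template with per-site bindings. The sketch below uses hypothetical structures to show the idea; it is not Flow's actual template format.

```python
# Hypothetical sketch: one template, different per-site data bindings.
template = {"kpi": "Line 1 Filler Volume"}

bindings = {
    "Site A": {"source": "historian_tag", "address": "FL001-123-FQ001.PV"},
    "Site B": {"source": "manual_entry",  "address": "filler_volume_form"},
}

for site, binding in bindings.items():
    # Everyone sees the same KPI name; only the binding differs per site.
    print(f"{site}: '{template['kpi']}' <- {binding['source']} ({binding['address']})")
```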

Unified Namespace Plus?

In many ways, the Flow Model is the "uber" Unified Namespace. It consolidates multiple underlying namespaces, whether they are Historian, SQL, or even MQTT namespaces, into one persisted model. And, as you will learn later, it even makes historical data access possible, helping you build value-added analytics apps.

The Flow Model is the "uber" Unified Namespace, consolidating multiple underlying namespaces and exposing ready to use information.

Calculation Engine

For us, the transformation pipeline is the most exciting part. Your engineering core's unparalleled knowledge of your process and operation must be added to your data. This is where Flow really shines.

Context

At its core, Flow enriches every piece of data with two essential contexts: time and model. As data streams into Flow—whether marking an event or contributing to a calculated metric—it is immediately contextualized by these dimensions.

Flow utilizes event-framed periods defined by specific triggers in the data stream, such as machine start and stop events. The creation of these event frames relies heavily on the engineering expertise and intimate process knowledge of your operations team (context that is completely lost when data is just streamed to the cloud). They provide crucial context, like reasons for a machine’s downtime, which adds a layer of rich, meaningful insight to the data. This expertise transforms raw data into actionable information, enabling precise monitoring and analysis of operational events.
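
In simplified terms, deriving event frames from a machine-running signal looks roughly like the sketch below. This is an illustration of the concept, not Flow's implementation.

```python
# Simplified sketch: deriving event frames from start/stop triggers.
def event_frames(events):
    """events: list of (timestamp, value) for a machine-running signal."""
    frames, start = [], None
    for ts, running in events:
        if running and start is None:
            start = ts                      # machine started: open a frame
        elif not running and start is not None:
            frames.append((start, ts))      # machine stopped: close the frame
            start = None
    return frames

signal = [(0, 0), (10, 1), (55, 0), (70, 1), (90, 0)]
print(event_frames(signal))                 # [(10, 55), (70, 90)]
```

The gaps between frames are where your operations team attaches downtime reasons, turning a bare stop event into explained, actionable context.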

In addition to event-based framing, time acts as the continuous thread through all Flow systems, anchoring every piece of information. Flow further breaks down time into comprehensible slices or periods—minutes, hours, shifts, days, weeks, months, quarters, and years. These calendar-based periods are essential for making insightful comparisons, such as assessing shift performance or analyzing year-over-year process efficiencies.
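
A simplified sketch of calendar-period slicing follows; the shift boundaries here are hypothetical examples, since every plant defines its own.

```python
# Simplified sketch: mapping timestamps to shift periods for comparison.
from datetime import datetime

SHIFTS = [("Night", 0), ("Day", 6), ("Afternoon", 14), ("Night", 22)]

def shift_of(ts: datetime) -> str:
    """Map a timestamp to its shift; boundaries are hypothetical."""
    name = SHIFTS[0][0]
    for shift, start_hour in SHIFTS:
        if ts.hour >= start_hour:
            name = shift
    return name

print(shift_of(datetime(2024, 5, 1, 9, 30)))   # Day
print(shift_of(datetime(2024, 5, 1, 23, 5)))   # Night
```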

Calculation Services

As data streams into Flow, it is cleansed, contextualized, and transformed by a set of calculation services that include:

  • primary aggregations and filters
  • cumulative and secondary aggregations
  • moving window calculations
  • expression based calculations
  • evaluations against limits or targets
  • secondary aggregations on event periods

User-defined functions are used to encapsulate complex algorithms and standardize and lock down calculations throughout the Flow Model.
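
For illustration, a moving-window calculation and a user-defined function might compose like the sketch below. The function names and the example OEE formula are our own illustration, not Flow's engine.

```python
# Illustrative sketch of pipeline-style calculations, not Flow's actual engine.
from collections import deque

def moving_average(values, window=3):
    """Moving-window calculation over a stream of primary aggregates."""
    buf, out = deque(maxlen=window), []
    for v in values:
        buf.append(v)
        out.append(sum(buf) / len(buf))
    return out

def oee(availability, performance, quality):
    """User-defined function: one locked-down OEE formula for the whole model."""
    return availability * performance * quality

hourly_counts = [120, 118, 90, 125, 122]
print(moving_average(hourly_counts))          # smoothed trend per hour
print(oee(0.95, 0.88, 0.99))                  # 0.82764
```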

Power of Multiple

The Flow transformation pipeline applies these contextualization and calculation processes to multiple data streams simultaneously, removing the silos between them as they blend in near real-time. The pipeline lets us build calculated measures that take inputs from more than one data source, or trigger event periods from one data source while drawing context from others, whether those sources are time-series or transactional in nature. The possibilities are limitless!
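
For example, event periods framed from a machine signal can be attributed to batches from a transactional source. The data and structures below are made-up illustrations of the idea:

```python
# Hypothetical sketch: event periods from one stream, context from another.
frames = [(10, 55), (70, 90)]                           # from the machine signal
batches = [(0, 60, "Batch 41"), (60, 120, "Batch 42")]  # from an MES/transactional source

def batch_for(ts):
    """Find the batch active at a given timestamp."""
    return next(name for start, end, name in batches if start <= ts < end)

for start, end in frames:
    # Each run period is attributed to the batch active when it started.
    print(f"Run {start}-{end}s -> {batch_for(start)}")
```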

Remove data silos as they blend in near real-time

Information Hub

Flow is anything but a "black box". It contains your information and is open for you to easily access it via industry-standard protocols. Flow is your bridge from OT and IoT data streams to analytics-ready information.

API

Flow exposes an industry-standard REST API for model discovery and information access that can be used to build third-party apps or to integrate with existing applications.
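
As a sketch of what consuming such an API could look like, the endpoint paths, parameters, and auth scheme below are placeholders, not Flow's documented routes:

```python
# Illustrative sketch of REST access; endpoint paths are placeholders only.
import requests

BASE = "https://flow.example.com/api"        # your Flow server (placeholder URL)
headers = {"Authorization": "Bearer <token>"}

# Discover the model, then read a measure's values for a period.
model = requests.get(f"{BASE}/model", headers=headers).json()
data = requests.get(
    f"{BASE}/measures/line-1-filler-volume/data",
    params={"period": "shift", "from": "2024-05-01", "to": "2024-05-02"},
    headers=headers,
).json()
```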

Publish

Flow provides integration components to automatically publish information out to your other systems via industry-standard protocols in near real-time. How about pushing maintenance information like running hours or stroke counts up to your Asset Management system? Or actual production figures up to your ERP system? What about sending information to your Machine Learning platform in the cloud? Or even just back to your SCADA for operator visibility of KPIs calculated from multiple data sources? Flow currently integrates with the systems below (a minimal publish sketch follows the list):

  • Industrial Historians - Canary Historian
  • SQL Databases - Microsoft SQL, MySQL, Oracle, PostgreSQL, etc.
  • Realtime Systems - MQTT (including SparkplugB)
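
Here is the promised publish sketch, using the paho-mqtt client with a plain JSON payload. The broker address, topic, and payload shape are illustrative; a real Sparkplug B integration would use its defined topic namespace and binary payload encoding instead.

```python
# Illustrative sketch: publishing a calculated KPI over plain MQTT.
# A real Sparkplug B integration would use its defined topic namespace
# and payload encoding rather than this ad-hoc JSON.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client()                        # paho-mqtt 1.x style constructor
client.connect("broker.example.com", 1883)    # placeholder broker address

kpi = {"name": "Line 1 Filler Volume", "value": 1834.2, "period": "Shift A"}
client.publish("site-a/line-1/kpi/filler-volume", json.dumps(kpi), qos=1)
client.disconnect()
```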

Flow Tiering

Flow Systems can publish information to other Flow Systems! Why would this be useful? Imagine a multi-site organization, possibly spanning the globe, where each site's Flow System publishes its information up to your HQ Flow System. The HQ Flow System would provide invaluable fleet-wide information for site comparisons, benchmarking, and logistics planning. How about cost or efficiency comparisons between types of equipment? The possibilities are limitless.

Flow is your bridge from OT and IoT data streams to analytics-ready information

Visualize

Ultimately, Flow provides value in the form of decision-support, insight and action by presenting the "single source of truth" in a way that is seen and understood.

Dashboarding

Flow reports, charts and dashboards are easily configured and served via a web browser to operators, team leaders and managers. Chart configuration employs built-in visualization best-practice, thus maximizing the transfer of information to the human visual cortex:

  • Big screens in production areas or hand-over rooms
  • Interactive team meetings, in-person or remote
  • Individual consumption via laptops or devices

Reports and charts enable comment entry to add human context to our information.

Messaging

Sometimes it is more convenient for the information to find us rather than for us to find the information. Flow automatically compiles and distributes information and PDF exports as and when required. Distribution is secure and handled via mechanisms such as:

  • Email
  • Slack
  • Microsoft Teams
  • Telegram
  • SMS

Maximize the transfer of information to the human visual cortex

How does this all fit together?