Information Hub simplifies the management, transformation, and distribution of information

Flow Software's Information Hub is designed to standardize the process of transforming raw data into actionable information distributed across an organization.

SEE A DEMO

Analytics have failed to scale, and we know why: they are still based on Industry 3.0 architectures!

Analytics in manufacturing fail to scale because of fragmented data silos, missing context, and brittle integrations. So we built a system designed to handle the complexity and volume of manufacturing data.

Our approach to the Unified Analytics Framework centralizes integration, contextualizes data, and ensures robust, scalable solutions. By addressing these pain points, Flow Software transforms raw data into actionable insights, driving operational excellence and continuous improvement. We provide the comprehensive information management needed to overcome the limitations of traditional analytics systems.
WHAT IS A UAF?

Toss the Industry 3.0 Playbook, Build a Scalable Data Foundation

Data projects that adhere to an Industry 3.0 playbook result in fragile, unscalable data integrations burdened with custom scripting and excessive coding. Time and again, we find these projects constructed on top of platforms like BI tools, reporting solutions, and historian products, or, even worse, in Excel.

This widespread dispersion of efforts leads to a lack of data governance, rendering any form of templatizable scalability unattainable.

Flow Software's Information Hub stands out as a specialized Industrial Data Management (IDM) solution, crafted for the manufacturing sector, where it leverages an advanced Unified Analytics Framework architecture. It is meticulously designed to gather, amalgamate, and standardize data from various operational and enterprise data silos (regardless of platform or technology), efficiently orchestrating a sophisticated data transformation process.

"Information Hub provides a robust, centralized infrastructure that's easy to govern and ready to scale."

SEE A DEMO

Here's how it works

Flow's Information Hub solution consists of pre-configured, purpose-built tools that let you quickly scale your analytics and build a Unified Analytics Framework following a five-step process:

  • Create an information model
  • Connect to your data sources
  • Transform and contextualize the raw data
  • Publish to other applications
  • Share information visually

Create an information model

Siloed data and numerous naming conventions create significant challenges in manufacturing, leading to inconsistencies and inefficiencies. Without a unified approach, different functional namespaces and disparate data sources make it nearly impossible to gain a holistic, accurate view of operations.

Decoupled

In most cases, we have a number of underlying data sources (Historians, SQL databases, etc.). We access this data using tag names or queries, but we can provide a more meaningful and standardized name for a "piece of information". Let's call this "piece of information" a Measure.
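
To make the Measure idea concrete, here is a minimal sketch, purely illustrative and not Flow's actual configuration format, of how a Measure decouples a standardized name from the tag or query behind it (the tag name is borrowed from the templating example further down):

```python
from dataclasses import dataclass

@dataclass
class Measure:
    """A named "piece of information", decoupled from the source that supplies it."""
    name: str          # the standardized name everyone uses
    source_type: str   # e.g. "historian-tag", "sql-query", "manual-form"
    source_ref: str    # the tag name or query that actually provides the data

# Consumers only ever see the name; the source is an implementation detail.
filler_volume = Measure(
    name="Line 1 Filler Volume",
    source_type="historian-tag",
    source_ref="FL001-123-FQ001.PV",
)
print(filler_volume.name)   # Line 1 Filler Volume
```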

Abstracted

The Operator, Team Leader or Manager accessing the Information Hub Model doesn't need to know which tag or SQL query was used to create that Measure, that "piece of information" that they use to make key decisions. In fact, they don't want to know, nor do they care! They just want their information!

Templatized

The Information Hub Model can be standardized across multiple sites or production facilities. The source of a Measure will differ across sites, but the name will be consistent. On Site A, the measure represents tag "FL001-123-FQ001.PV" (see why the managers don't care!) and on Site B, the measure represents a manually input value. But both measures are named "Line 1 Filler Volume", and that is what everyone will know it as, everywhere they go. Information Hub Templates allow for this model standardization.
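
As a rough sketch of that idea, the measure name stays constant while the per-site source binding varies when the template is instantiated (the dictionary format here is invented for illustration, not Information Hub's template syntax):

```python
# Same measure name everywhere; only the binding to an underlying source
# changes when the template is instantiated at each site.
template_measure = "Line 1 Filler Volume"

site_bindings = {
    "Site A": ("historian-tag", "FL001-123-FQ001.PV"),
    "Site B": ("manual-form", "Line 1 Filler Volume Entry"),   # hypothetical form field
}

for site, (source_type, source_ref) in site_bindings.items():
    print(f'{site}: "{template_measure}" <- {source_type}: {source_ref}')
```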

Structured but flexible

The Model is hierarchical and generic by design. We can build our model using ISA95, ISA88, PackML, custom asset or entity meta-models, or any combination of these. The Model represents physical assets, performance indicators, and logical entities. You can structure this model by area, department, or both. The point is: it is flexible. And, despite its hierarchical nature, the Model allows for object "linking" across the structure.
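
A hierarchical, ISA95-flavoured model with an object link across the structure might be sketched like this (a plain nested dictionary, purely illustrative):

```python
# Illustrative hierarchy: Enterprise -> Site -> Area -> Line, plus a "links"
# entry showing an object reference across the structure.
model = {
    "Enterprise": {
        "Site A": {
            "Packaging": {
                "Line 1": {
                    "measures": ["Line 1 Filler Volume", "Line 1 OEE"],
                    "links": ["Enterprise/Site A/Utilities/Compressed Air"],
                },
            },
            "Utilities": {
                "Compressed Air": {"measures": ["Header Pressure"]},
            },
        },
    },
}

print(model["Enterprise"]["Site A"]["Packaging"]["Line 1"]["links"])
```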

Unified but secure

In many ways, the Information Hub Model is the "uber" Unified Namespace, consolidating multiple underlying namespaces, whether they are Historian namespaces, SQL namespaces or even MQTT namespaces - Flow's Information Hub brings them all together into one persisted model. Together with a configurable security construct, this Unified Information Model presents the foundation for building value-added IT apps.

"The Information Model is the "uber" Unified Namespace, consolidating multiple underlying namespaces."

SEE A DEMO

Connect to your data sources

As we build out an Information Hub Model, we start filling it with information. We do this automatically, using data from existing sources, or manually, through Information Hub Forms.

Data Sources

Information Hub connects to and ingests data from multiple sources, meaning we can leverage the investments you have already made:

  • Industrial Historians - Canary Historian, AVEVA PI (formerly OSIsoft PI) Historian, Ignition Historian, GE Historian, AVEVA Historian (formerly Wonderware Historian), additional OPC HDA based historians, etc.
  • IoT and Cloud Platforms - REST APIs, Metering Solutions, Weather Platforms, Power Distribution APIs, etc.
  • SQL Databases - Microsoft SQL, MySQL, Oracle, PostgreSQL, etc.
  • NoSQL Databases - InfluxDB, etc.
  • Realtime Systems - MQTT, OPCUA, Telegraf, etc.

Scalability Matters

Data contained in the connected data sources is never replicated. Rather, it is referenced when required to perform aggregations and calculations. Information Hub stores only the results of this retrieval process, in the context of time and model. By storing only the resultant information, Information Hub guarantees fast, efficient access via charts and dashboards whenever needed. More importantly, this efficient information storage allows your Information Hub to scale enormously, without losing the ability to drill into the underlying data source when necessary!
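
The retrieve-aggregate-store-the-result idea can be sketched as follows; `read_raw_samples` is a stand-in for whatever historian or SQL retrieval is configured, and only the small aggregate record is persisted:

```python
from datetime import datetime, timedelta
from statistics import mean

def read_raw_samples(tag: str, start: datetime, end: datetime) -> list[float]:
    """Stand-in for a historian or SQL read; the raw data stays in its source."""
    return [101.2, 99.8, 100.5, 100.1]   # dummy values for the sketch

def hourly_average(tag: str, measure: str, hour_start: datetime) -> dict:
    samples = read_raw_samples(tag, hour_start, hour_start + timedelta(hours=1))
    # Only this small result record is stored, in the context of time and model.
    return {
        "measure": measure,
        "period_start": hour_start.isoformat(),
        "aggregate": "avg",
        "value": round(mean(samples), 2),
    }

print(hourly_average("FL001-123-FQ001.PV", "Line 1 Filler Volume",
                     datetime(2024, 1, 1, 8)))
```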

Data Entry

There will always be data that cannot be captured automatically, whether it's data read from an instrument indicator, or external data coming from email or paper-based systems. Information Hub handles manually captured data elegantly through the use of Forms. Information Hub Forms are easily configured and served via a web browser to data capturers in a familiar and intuitive spreadsheet-like interface. No more spreadsheet spaghetti! The best part is that as soon as someone captures data in a Form, any calculations or transforms in the downstream pipeline that depend on that entry are automatically processed and available for additional analytics.
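
A minimal sketch of that behaviour, with invented form field and measure names: as soon as an entry lands, the dependent calculation runs.

```python
# Invented names: a manual "Waste Count" entry feeds a downstream "Line 1 Yield".
dependencies = {"Line 1 Waste Count": ["Line 1 Yield"]}
entries: dict[str, float] = {}

def recalculate(measure: str) -> None:
    if measure == "Line 1 Yield":
        good_units = 950.0                      # would come from an automatic source
        waste = entries["Line 1 Waste Count"]
        print(f"Line 1 Yield = {good_units / (good_units + waste):.1%}")

def submit_form_entry(field: str, value: float) -> None:
    entries[field] = value
    for dependent in dependencies.get(field, []):
        recalculate(dependent)                  # downstream pipeline runs immediately

submit_form_entry("Line 1 Waste Count", 50.0)   # -> Line 1 Yield = 95.0%
```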

"Flow's Information Hub helps me leverage the investments we've already made in our data infrastructure."

SEE A DEMO

Transform and contextualize the raw data

For us, the transformation pipeline is the most exciting part. This is where Information Hub really shines.

Context

Out of the box, and at its foundation, Information Hub enforces two critical pieces of context with which measure information is enriched: time and model. Every data point streaming into the Information Hub, whether used for event framing or calculated into a measure's value, is contextualized by time and model to become part of the information that will ultimately serve our decision-making processes.

Time is the base that runs through all Information Hub systems, a thread against which all information is stored. However, to present and publish this information as analytics-ready, Information Hub normalizes time into slices or periods:

Calendar-based periods include minutes, hours, shifts, days, weeks, months, quarters and years. All these periods are required to make meaningful comparisons to derive insight from your information. For example, how is the current shift running? How does our process this year compare to the same time last year? This information is at your fingertips.

Event-framed periods are derived from triggers in the underlying data. Information Hub monitors for start and stop triggers to generate periods against which you can attribute additional context dynamically. For example, Information Hub will monitor the necessary tags, or combination of tags, to record when a machine stops and starts up again. Additional information, like the reason for the stop, will be attributed to that event period, providing invaluable insight over time as to how often, how long, and why the machine stops.
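
Event framing can be sketched as scanning a running-status signal for stop and start transitions and attaching context to the resulting period; the status samples and the stop reason below are invented for illustration:

```python
from datetime import datetime

# (timestamp, running) samples from a hypothetical machine-status tag
status = [
    (datetime(2024, 1, 1, 8, 0), True),
    (datetime(2024, 1, 1, 8, 42), False),   # stop trigger
    (datetime(2024, 1, 1, 9, 5), True),     # start trigger closes the event
]

events = []
stop_started_at = None
for timestamp, running in status:
    if not running and stop_started_at is None:
        stop_started_at = timestamp          # event period opens
    elif running and stop_started_at is not None:
        events.append({
            "start": stop_started_at,
            "end": timestamp,
            "duration_min": (timestamp - stop_started_at).total_seconds() / 60,
            "reason": "Label roll change",   # context attributed to the period
        })
        stop_started_at = None

print(events)   # one downtime event of 23 minutes
```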

Calculation Services

As data streams into Information Hub, it is cleaned, contextualized, and transformed by a set of calculation services that include:

  • primary aggregations and filters
  • cumulative and secondary aggregations
  • moving window calculations
  • expression based calculations
  • evaluations against limits or targets
  • secondary aggregations on event periods

User-defined functions are used to encapsulate complex algorithms and standardize and lock down calculations throughout the Information Hub Model.
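
As an example of an expression-based calculation wrapped in a user-defined function, here is the textbook OEE formula (availability x performance x quality); this is the standard definition used for illustration, not a Flow-specific algorithm:

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """User-defined function: OEE = availability x performance x quality."""
    return availability * performance * quality

# The inputs would normally be measures produced by the upstream aggregations.
print(f"OEE: {oee(0.92, 0.88, 0.97):.1%}")   # -> OEE: 78.5%
```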

Power of Multiple

The Information Hub transformation pipeline applies these contextualization and calculation processes to multiple data streams simultaneously, removing the silos between them as they blend in near real-time. The pipeline allows us to build calculated measures that take inputs from more than one data source, or to trigger event periods from one data source while attributing context from others, whether those sources are time-series or transactional in nature. The possibilities are limitless!

"Unifying data silos by blending data from multiple platforms in near real-time."

SEE A DEMO

An Information Hub Pilot Is A Great Way To Start Your Journey!

An Information Hub Pilot is a great way to get started. Our team will build the solution based on your use case, train you along the way, and you get to keep an Information Hub Starter license! Most pilots can be completed in 4 weeks for $20,000.

GET STARTED

Publish to other applications

Flow Software's Information Hub is anything but a "black box". It contains your information and is open for you to easily access it via industry-standard protocols. Flow is your bridge from OT and IoT data streams to analytics-ready information.

API

Information Hub exposes an industry-standard REST API for model discovery and information access that can be used to build third-party apps or to integrate with existing applications.
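
A client-side sketch of consuming such an API; the host, endpoints, authentication scheme, and response shapes below are assumptions made for illustration, not Information Hub's documented routes:

```python
import requests

BASE = "https://flow-server.example.com/api"        # hypothetical host
HEADERS = {"Authorization": "Bearer <token>"}       # hypothetical auth scheme

# Discover the model, then fetch values for one measure (hypothetical endpoints).
model = requests.get(f"{BASE}/model", headers=HEADERS, timeout=10).json()
values = requests.get(
    f"{BASE}/measures/line-1-filler-volume/values",
    params={"from": "2024-01-01T00:00:00Z", "to": "2024-01-02T00:00:00Z"},
    headers=HEADERS,
    timeout=10,
).json()

print(model)
print(values)
```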

Publish

Information Hub provides integration components to automatically publish information out to your other systems via industry-standard protocols in near real-time. How about pushing maintenance information like running hours or stroke counts up to your Asset Management system? Or actual production figures up to your ERP system? What about sending information to your Machine Learning platform in the cloud? Or even just back to your SCADA for operator visibility of KPIs calculated from multiple data sources? Information Hub currently integrates with the following targets (a publishing sketch follows the list):

  • Industrial Historians - Canary Historian
  • SQL Databases - Microsoft SQL, MySQL, Oracle, PostgreSQL, etc.
  • Realtime Systems - MQTT (including SparkplugB)
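
Here is a publishing sketch using the paho-mqtt client (assumed installed via `pip install paho-mqtt`); the broker address, topic, and payload shape are assumptions for illustration, not Information Hub's actual output format:

```python
import json
import paho.mqtt.publish as publish

# Hypothetical KPI record produced by the transformation pipeline.
payload = json.dumps({
    "measure": "Line 1 OEE",
    "period": "2024-01-01T08:00:00Z/PT1H",
    "value": 0.785,
})

publish.single(
    topic="site-a/packaging/line-1/oee",   # hypothetical topic structure
    payload=payload,
    hostname="broker.example.com",         # hypothetical broker
    port=1883,
)
```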

Information Hub Tiering

Information Hub systems can publish information to other Information Hub systems! Why would this be useful? Imagine you are a multi-site organization, possibly spanning the globe, with each site's Information Hub system publishing its information up to your HQ Information Hub system. The HQ Information Hub would provide invaluable fleet-wide information for site comparisons, benchmarking, and logistics planning. How about cost or efficiency comparisons between types of equipment? The possibilities are limitless.

"Information Hub is your bridge from OT data systems to enterprise applications."

SEE A DEMO

Share information visually

Ultimately, Information Hub provides value in the form of decision-support, insight and action by presenting the "single source of truth" in a way that is seen and understood.

Dashboarding

Information Hub reports, charts and dashboards are easily configured and served via a web browser to operators, team leaders and managers. Chart configuration employs built-in visualization best practice, maximizing the transfer of information to the human visual cortex, wherever the information is consumed:

  • Big screens in production areas or hand-over rooms
  • Interactive team meetings, in-person or remote
  • Individual consumption via laptops or devices

Reports and charts enable comment entry to add human context to our information.

Messaging

Sometimes it is more convenient for the information to find us rather than for us to find the information. Information Hub automatically compiles and distributes information and PDF exports as and when required. Distribution is secure and handled via mechanisms such as:

  • Email
  • Slack
  • Microsoft Teams
  • Telegram
  • SMS

How does this all fit together?