Insights on Process Analytics

Mastering intuitive data refinement

As highlighted in “Beyond the Dashboard: Why data analysis demands personalization,” raw data rarely yields reliable analysis. Real-world process data is noisy, fragmented, and full of interruptions, and it is often complicated by constantly changing process conditions such as varying grades, raw materials, and speed windows. To transform this tangle of signals into actionable results, everyone who works with data, from process engineers and operators to management, needs tools that support their natural workflows, particularly for data cleansing, filtering, and delay compensation.

Here is how Trimble’s Wedge transforms the concept of Intuitive Data Refinement from a theory into a practical, everyday toolkit for all data users.

Cleansing: Visual and immediate

Analysis built on uncleansed raw data is rarely trustworthy: outliers, sensor failures, and shutdown periods skew results and invalidate conclusions. In many systems, removing them requires complex SQL queries or laborious manual data manipulation.

Wedge approaches this differently. It acknowledges that an experienced process expert is the best judge of which values are irrelevant. Using built-in “cutting tools,” users can visually identify outlier spikes or breaks and remove invalid data with a single mouse click. The system instantly recalculates all analysis and diagnostic results based on this refined dataset, allowing for an iterative, “what-if” style of investigation without altering the original database.

Known, recurring phenomena should not have to be cleansed by hand every time. For these cases, Wedge includes automated, rule-based data cleansing capabilities. Users can create custom formulas and rules, such as trimming artifact values during state changes or excluding data when a machine is down, to refine data automatically. This automates repetitive cleansing tasks, ensuring results are always based on high-quality, pre-screened data without constant human intervention.
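To make the idea concrete, a rule of the kind described above, excluding data when the machine is down and trimming out-of-range artifacts, could be sketched in pandas as follows. The column names (machine_speed, quality_signal) and thresholds are illustrative assumptions, not Wedge's actual rule syntax:

```python
import pandas as pd

def apply_cleansing_rules(df: pd.DataFrame) -> pd.DataFrame:
    """Mark samples invalid according to simple rules; leave the original data untouched."""
    cleaned = df.copy()
    # Rule 1 (assumed threshold): exclude data when the machine is down,
    # i.e. speed below 50 units.
    cleaned.loc[cleaned["machine_speed"] < 50.0, "quality_signal"] = None
    # Rule 2 (assumed range): trim artifact spikes outside a plausible
    # physical range of 0-200 for the quality signal.
    out_of_range = ~cleaned["quality_signal"].between(0.0, 200.0)
    cleaned.loc[out_of_range, "quality_signal"] = None
    return cleaned

# A shutdown sample and a spike are both excluded before any statistics.
raw = pd.DataFrame({
    "machine_speed": [600.0, 10.0, 610.0, 605.0],
    "quality_signal": [82.1, 3.5, 950.0, 81.7],
})
refined = apply_cleansing_rules(raw)
print(refined["quality_signal"].mean())  # computed on valid samples only
```

The key design point mirrored here is that the rules run on a copy: the original database stays intact, and the refined view is what downstream analysis sees.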

Filtering: Focus on what matters

Data refinement isn’t just about removing bad data; it’s about isolating relevant data. You might need to analyze specific conditions, such as a particular product grade, a specific raw material batch, or a single shift.

Wedge allows users to filter data by process state. For example, if you are troubleshooting a quality issue that only appears during the production of “Grade X,” you can apply a filter to exclude all other grades from the view. All result windows, such as histograms, X-Y plots, and statistics, automatically reflect this selection, ensuring you are comparing apples to apples.
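Conceptually, state-based filtering is a selection on a state signal. The sketch below, using hypothetical grade and moisture columns, shows the idea; in Wedge this happens visually rather than in code:

```python
import pandas as pd

# Hypothetical process data tagged with a grade state signal.
data = pd.DataFrame({
    "grade": ["A", "X", "X", "B", "X"],
    "moisture": [7.2, 8.9, 9.1, 7.0, 8.8],
})

# Keep only the samples produced during "Grade X"; every downstream
# result (statistics, histograms, correlations) then uses this subset.
grade_x = data[data["grade"] == "X"]
print(grade_x["moisture"].describe())
```

Because the filter propagates to every result window at once, the Grade X statistics are never accidentally mixed with samples from other grades.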

Dynamic delay compensation: Eliminating process lags

Perhaps the most critical aspect of data refinement in continuous processes is handling time delays (lags). A change in chemical feed at the wet end of a paper machine does not impact the reel scanner for some time. Similarly, in steelmaking, an alloy adjustment made at the furnace won’t be reflected in the quality sensors until the slab has traveled hundreds of meters through the caster and cooling segments. If you analyze these signals based on their raw timestamps, you will find no correlation.

Wedge solves this with Dynamic Process-Delay Compensation. Unlike simple static time-shifting, Wedge allows users to build a delay model using drag-and-drop blocks representing pipes, tanks, and towers. The system calculates the true delay based on current process conditions, taking variable flows, speeds, and tank levels into account.
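As a rough illustration of why static time-shifting falls short, a flow-dependent delay can be reduced to residence times: an ideally mixed tank contributes roughly its volume divided by outflow, and a plug-flow pipe contributes its volume divided by current flow. The function below is a simplified sketch of that idea under those assumptions, not Wedge's delay model:

```python
def transport_delay_s(tank_volume_m3: float, outflow_m3_per_s: float,
                      pipe_volume_m3: float, pipe_flow_m3_per_s: float) -> float:
    """Total transport delay in seconds: tank residence time plus plug-flow pipe delay."""
    tank_delay = tank_volume_m3 / outflow_m3_per_s
    pipe_delay = pipe_volume_m3 / pipe_flow_m3_per_s
    return tank_delay + pipe_delay

# Same equipment, two operating points: the delay changes with
# tank level and flow, so no single static time shift is correct.
print(transport_delay_s(40.0, 0.2, 5.0, 0.2))  # 225.0 s
print(transport_delay_s(20.0, 0.4, 5.0, 0.4))  # 62.5 s
```

Halving the tank level and doubling the flow cuts the delay from 225 seconds to about a quarter of that, which is exactly why the compensation must be recomputed from current process conditions.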

This capability extends to complex scenarios involving intermediate storage and distinct production phases, even where the First-In-First-Out (FIFO) principle does not apply. It lets the system handle processing steps such as reel or coil rewinding, as well as production speeds that vary between phases.

For example, Wedge can track specific production batches, such as a roll or a coil, allowing users to examine the history of an end-product batch even after it has been rewound. By selecting the product ID, users can see exactly what process conditions that specific batch experienced throughout the entire production line.

The delay feature includes a Virtual Tracer, which visualizes material flow through the process. It aligns the data so that the cause (upstream) and effect (downstream) appear simultaneously in your analysis view, regardless of how long the material actually took to travel between those points.

Accurate data for real-world analysis

By combining instant visual cleansing, automated rule-based refinement, state-based filtering, and intelligent delay compensation, Wedge ensures that the data you analyze reflects the physical reality of your process. It turns the chore of data preparation into a seamless part of the troubleshooting workflow.

Want more insights?

Our sales team is full of data analysis experts ready to help you. Learn more about the Wedge process data analysis tool.

Take the next step

Empower your team for efficiency improvement. Stand out from the competition.

Request a trial

Contact Sales