If you are seeing duplicate dump events in Data Acquisition, whether from control room operators capturing historic events too quickly or from mobile operators pressing the "dump" button multiple times (either accidentally or to artificially boost their numbers), here are a few things you can do to mitigate the problem.
Reference Data Configuration
Navigate to the "Reference Edit" module.
In Reference Edit, navigate to "Business Model Settings". (If Business Model Settings is not available, you will need to deploy it under "Deploy Reference Data".)
From here, navigate to the "DuplicateDump" item.
You will mostly use "TimeWindow" to mitigate duplicate dump events. The value is measured in minutes; for example, when it is set to "5", another dump event from the same piece of equipment within 5 minutes of the previous one will be flagged as a duplicate.
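To make the behaviour concrete, here is a minimal Python sketch of how a time-window check like this works. It is an illustration only; the names (is_duplicate_dump, last_dump_times, TIME_WINDOW_MINUTES) are hypothetical and not part of the product.

```python
from datetime import datetime, timedelta

# Illustrative sketch only: the product applies this check internally.
TIME_WINDOW_MINUTES = 5

last_dump_times = {}  # equipment_id -> datetime of its most recent dump


def is_duplicate_dump(equipment_id: str, dump_time: datetime) -> bool:
    """Flag a dump as a duplicate if it occurs within the time window
    of the previous dump recorded for the same piece of equipment."""
    window = timedelta(minutes=TIME_WINDOW_MINUTES)
    previous = last_dump_times.get(equipment_id)
    last_dump_times[equipment_id] = dump_time
    return previous is not None and dump_time - previous < window


# Example: two dumps from TRUCK-07 three minutes apart -> second is flagged.
print(is_duplicate_dump("TRUCK-07", datetime(2024, 1, 1, 8, 0)))  # False
print(is_duplicate_dump("TRUCK-07", datetime(2024, 1, 1, 8, 3)))  # True
```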
Additional settings are available if needed. "PrefferedMeasure" prioritizes one movement measure over another. (If this is not set, the first movement is taken as the priority.)
"IrrelevantLocation" treats the specified location as a lower-priority location when duplicate dumps are detected.
"TimeWindowBasedOnDistanceAndSpeed" is a dynamic option that calculates the required time window based on distance and speed, as illustrated in the sketch below.
Once you are done configuring, check in the new Reference Data and publish the changes.
When the changes are published and a duplicate dump takes place, it will be flagged in Event Editor and will not contribute to total tonnes in reporting.
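For context, the snippet below shows roughly how a flagged duplicate drops out of a tonnage total; the record layout is hypothetical and only meant to illustrate the reporting effect.

```python
# Hypothetical dump records: flagged duplicates are excluded from totals.
dump_events = [
    {"equipment": "TRUCK-07", "tonnes": 180.0, "duplicate": False},
    {"equipment": "TRUCK-07", "tonnes": 180.0, "duplicate": True},   # flagged
    {"equipment": "TRUCK-12", "tonnes": 165.0, "duplicate": False},
]

total_tonnes = sum(e["tonnes"] for e in dump_events if not e["duplicate"])
print(total_tonnes)  # 345.0 -- the flagged duplicate does not contribute
```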