The value of data vs information

We are seeing an explosion in the scale of data being made available to businesses, and traditional data inspection methods must now make way for more comprehensive approaches.

Innovation

Never before have the diversity and granularity of data available to businesses been as great as they are today. This scale shows no sign of abating; instead, it is growing at an exponential rate. This is primarily due to a shift in our technological landscape, where we see:

  • Wider ranges of data being collected, moving from limited equipment sensors to remote field monitoring and more sophisticated asset monitoring equipment.
  • More powerful sensors able to collect data at a higher frequency, exposing underlying time-series characteristics, such as chirps or spikes, that were previously not captured (a simple illustration follows this list).
  • Data transmission and storage getting cheaper with advances in network and IT infrastructure, leading to more and more data being retained and available for future analysis.
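
To make the second point concrete, consider what a higher sampling frequency actually buys an analyst. The Python sketch below is illustrative only: the window size, threshold, and synthetic high-frequency trace are our own assumptions, not drawn from any particular monitoring system. It flags a short-lived spike of the kind a slower sensor would simply never have sampled.

```python
import numpy as np

def detect_spikes(signal: np.ndarray, window: int = 50, threshold: float = 4.0) -> np.ndarray:
    """Flag samples that deviate sharply from their local baseline.

    A simple rolling z-score: a sample is treated as a spike when it sits
    more than `threshold` standard deviations from the mean of the
    preceding `window` samples.
    """
    spikes = []
    for i in range(window, len(signal)):
        baseline = signal[i - window:i]
        mu, sigma = baseline.mean(), baseline.std()
        if sigma > 0 and abs(signal[i] - mu) > threshold * sigma:
            spikes.append(i)
    return np.array(spikes)

# A synthetic high-frequency sensor trace with one injected spike at sample 3000.
t = np.arange(10_000)
trace = np.sin(2 * np.pi * t / 500) + 0.05 * np.random.randn(t.size)
trace[3000] += 5.0
print(detect_spikes(trace))  # expected to include sample 3000
```

Sampled at a tenth of this rate, the injected spike would most likely fall between samples and never appear in the data at all, which is precisely the class of behaviour the newer sensors expose.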

Previously, all data was precious, and there was a direct relationship between the scale of data collected, the level of analysis performed against it, and the value that could be extracted. As the scale of data grows, however, this relationship breaks down: analysts can no longer extract the same level of detail from an excessive volume of data. Compromises follow, with more time spent collating and selectively discarding data, and less time on the investigation itself. Value is still extracted, but at a reduced rate.

To combat this properly, the data must be presented in a format that is readily digestible to those who require it, without compromising its underlying integrity: converting raw data into useful, actionable information. Currently, this takes multiple forms. On the aggregation side, predictive modelling and aggregate asset health scores summarise a large range of measured values into a much smaller set of easily digestible metrics that can be acted upon. On the monitoring side, more intelligent asset monitoring hides healthy or acknowledged assets from view, ensuring that analysts only look at assets that require intervention or a more detailed investigation.
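
Both forms can be sketched in a few lines. The Python below is purely illustrative: the metric names, weights, and the 0.6 threshold are hypothetical stand-ins for whatever a real health model would use. It shows the two ideas side by side: many raw measurements collapsing into one actionable score, and healthy or acknowledged assets dropping out of the analyst's view.

```python
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    metrics: dict                  # normalised readings, 0 = healthy, 1 = failing
    acknowledged: bool = False     # already reviewed by an analyst

# Hypothetical weighting of measured values into a single aggregate score.
WEIGHTS = {"vibration": 0.5, "temperature": 0.3, "pressure": 0.2}

def health_score(asset: Asset) -> float:
    """Collapse a range of measured values into one digestible metric."""
    return sum(WEIGHTS[m] * v for m, v in asset.metrics.items() if m in WEIGHTS)

def needs_attention(assets, threshold: float = 0.6):
    """Hide healthy or acknowledged assets so only those requiring
    intervention or a more detailed investigation reach the analyst."""
    return [a for a in assets if not a.acknowledged and health_score(a) >= threshold]

fleet = [
    Asset("pump-01", {"vibration": 0.9, "temperature": 0.8, "pressure": 0.1}),
    Asset("pump-02", {"vibration": 0.1, "temperature": 0.2, "pressure": 0.1}),
    Asset("pump-03", {"vibration": 0.9, "temperature": 0.9, "pressure": 0.9}, acknowledged=True),
]
for asset in needs_attention(fleet):
    print(asset.name, round(health_score(asset), 2))   # only pump-01 surfaces
```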

Looking ahead at GTS, we recognise that this level of information presentation must continue, and in an increasingly continuous manner. As additional data streams are brought online and connected, more value opportunities will arise. Rather than performing individual investigations, however, the underlying analysis methodologies will be implemented to run automatically and continuously against this ever-growing data set, generating useful information from the raw values. This ensures the information is available to any future investigation, cascading down the value realisation tree. Ad-hoc or unforeseen investigations can therefore draw on this matured information landscape, rather than requiring extraction from the vast array of raw source data.
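
A minimal sketch of what this could look like in practice follows. Everything here is assumed for illustration: a real deployment would consume live data streams and persist derived information to a shared store, rather than use the in-memory structures below. The point is the shape of the pipeline: analysis runs as each raw value arrives, so the derived information already exists when a later, possibly unforeseen, investigation asks for it.

```python
import statistics
from collections import deque

# Derived, investigation-ready information, keyed by asset (an illustrative
# stand-in for a shared information store).
information_store: dict = {}

def process_reading(asset: str, value: float, history: dict, window: int = 100) -> None:
    """Run the analysis continuously as raw values arrive, instead of waiting
    for an individual investigation to be commissioned."""
    buffer = history.setdefault(asset, deque(maxlen=window))
    buffer.append(value)
    information_store[asset] = {
        "latest": value,
        "rolling_mean": statistics.fmean(buffer),
        "rolling_max": max(buffer),
    }

history: dict = {}
for reading in [1.0, 1.1, 0.9, 4.2, 1.0]:       # stand-in for a live data stream
    process_reading("compressor-07", reading, history)

# A later ad-hoc investigation queries matured information, not raw source data.
print(information_store["compressor-07"])
```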