Analyst Viewpoint
Collectively, we are amid a transition from data at rest to data in motion. Large volumes of data are streaming into organizations from all sides and with increasing frequency, forcing organizations to rethink how they process data. And the increasing frequency with which data informs decisions requires a corresponding change in analytic approaches. The batch processing and ad hoc analyses of the past are no longer enough for this new data environment. Organizations that collect this information without analyzing it more quickly put themselves at a competitive disadvantage.
Our research shows that one third of organizations (34%) need to process the data they collect every hour or in real time. Gone are the days when organizations could get by with weekly or monthly processing of the information they collect. There are certainly still monthly, quarterly and annual reporting processes, but these must be combined with more frequent processing of operational data as well. Whether it’s data from websites, point-of-sale registers, call center interactions, products or location information, organizations need the ability to react in real time if they want to affect outcomes while the window of opportunity still exists.
Furthermore, it’s unrealistic to expect individuals to manually review hundreds of thousands or millions of data points and find all the actionable insights. Evaluating such large volumes of information requires new types of analytical techniques based on artificial intelligence and machine learning (AI/ML) algorithms. These algorithms can detect correlations in the data to highlight which factors have the greatest impact on the metrics that matter most to the organization’s performance. Often these analyses involve multiple factors, and it is a combination of circumstances that triggers variation from the targets. Evaluating the myriad combinations can be complex and time consuming. Using AI/ML can speed up the process of diagnosing the root causes of observed changes and determining what actions to take in response. These techniques can also help personalize analyses to individuals based on their past behaviors, just as streaming video services personalize their recommendations to individuals based on their past viewing habits.
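To make the idea concrete, here is a minimal sketch of this kind of factor-ranking analysis in Python. The file name, column names and the choice of a random-forest model are illustrative assumptions, not details from the research; the point is simply that a model can rank candidate factors by their impact on a key metric far faster than a manual review could.

```python
# A minimal sketch, assuming a hypothetical CSV extract of operational records
# with one numeric target metric and several numeric candidate factors.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

factors = ["discount_pct", "call_wait_seconds", "page_load_ms", "ad_spend"]  # hypothetical
target = "daily_revenue"                                                     # hypothetical

df = pd.read_csv("operational_metrics.csv")  # hypothetical extract

# Fit a simple model and rank the factors by how much each one contributes
# to explaining variation in the target metric.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(df[factors], df[target])

importance = pd.Series(model.feature_importances_, index=factors).sort_values(ascending=False)
print(importance)  # highest-impact factors listed first
```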
It must be said that there are significant challenges to creating and applying these models. Our research shows that organizations lack the skills required to create them; this lack of skills is the second most common challenge cited by organizations when deploying AI/ML. And with data streaming into organizations at a rapid pace, the required reaction time frames are becoming compressed. Automation can help address both issues.
As organizations start to think about streaming data as a normal part of operations, automation will be a key tool to help them react and respond more quickly. Without automation, the analysis process often consists of ad hoc data exploration alongside the time-consuming trial-and-error experimentation needed to develop analyses and determine corrective actions. In addition, collecting metrics about the analysis process itself creates a source of information that can be used to improve the process and reach results more quickly.
Automated analyses can and should run continuously as well. There is no longer any reason to run automated analyses only periodically: data is arriving continuously, and the technology to process it continuously is readily available and affordable. Running analyses continuously helps analysts know which pressing issues require their attention. While there is much concern about automation replacing individuals, that is not the objective. The objective of automating processes is to direct individuals to where their attention is needed most so they have more time available to work on interesting, important business problems instead of performing rote analyses.
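As an illustration only, the sketch below shows one way a continuously running check might surface readings that deviate sharply from the recent baseline. The event source and the rolling z-score rule are assumptions, stand-ins for whatever automated analysis an organization actually deploys; what matters is that the check runs as data arrives and flags items for an analyst rather than waiting for a periodic report.

```python
# A minimal sketch of a continuously running check over a stream of readings.
from collections import deque
from statistics import mean, stdev

def flag_anomalies(event_stream, window=500, threshold=3.0):
    """Yield readings that deviate sharply from the recent rolling baseline."""
    recent = deque(maxlen=window)
    for reading in event_stream:
        if len(recent) >= 30:  # wait until a minimal baseline exists
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(reading - mu) > threshold * sigma:
                yield reading  # surface this reading for an analyst's attention
        recent.append(reading)

# Usage: any iterable of numeric readings can serve as the stream, e.g.
# for alert in flag_anomalies(metric_readings()):  # hypothetical source
#     notify_analyst(alert)                        # hypothetical sink
```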
The increased focus on streaming data doesn’t eliminate the need for historical data. As indicated above, AI/ML techniques require historical data to identify and codify the patterns. There are also many time-based analyses that require historical data to see period-over-period comparisons and trends. Therefore, it is important to adopt an approach that can span both the new world of streaming data and the traditional world of historical databases.
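A minimal sketch of spanning both worlds might look like the following, assuming a hypothetical SQLite table of historical orders and a freshly arrived streaming value; the table name, column names and 30-day window are illustrative assumptions, not a prescribed architecture.

```python
# A minimal sketch: compare a newly arrived value against a historical baseline.
import sqlite3

def historical_baseline(db_path="history.db"):
    """Compute a 30-day average from the historical store (hypothetical schema)."""
    conn = sqlite3.connect(db_path)
    try:
        avg_total, n = conn.execute(
            "SELECT AVG(order_total), COUNT(*) FROM orders "
            "WHERE order_date >= date('now', '-30 days')"
        ).fetchone()
    finally:
        conn.close()
    return avg_total, n

def compare_to_baseline(streaming_value, db_path="history.db"):
    """Judge a freshly arrived value against the historical average."""
    avg_total, n = historical_baseline(db_path)
    if not avg_total:
        return "no historical baseline available"
    pct_change = 100.0 * (streaming_value - avg_total) / avg_total
    return f"{pct_change:+.1f}% vs. 30-day average (n={n})"
```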
As organizations contemplate the increasing velocity of data and its impact, they should take a step back and re-evaluate their analytics processes. Organizations operate at a speed and complexity that requires a new approach. This approach can’t be based on fragile, hard-to-maintain, manually created data integration processes, and organizations must adopt an architecture that accommodates new data arriving continuously. New analytics approaches that utilize automation for continuous analyses of this data will accelerate the decision-making process for line-of-business personnel and improve the organization’s performance.
David Menninger
Executive Director, Technology Research
David Menninger leads technology software research and advisory for Ventana Research, now part of ISG. Building on over three decades of enterprise software leadership experience, he guides the team responsible for a wide range of technology-focused data and analytics topics, including AI for IT and AI-infused software.