Article 1 in a three-part series
Our current-day obsession with cloud-based data and analytics has been a long time in the making. Over the last 30 years, organizations have wrestled with how best to organize, manage, and analyze their data in a high-impact but cost-effective way. Now, with cloud-native tools like Snowflake Data Cloud that leave cumbersome and costly infrastructures behind, could the great data struggle finally be over? Before we explore the way forward, let’s reflect on how we got to this moment.
Data history lesson
In the early 1990s, organizations started to heavily invest in enterprise data warehouses, hoping to gain a better understanding of their operational performance and customer behavior from the data they were collecting. Throughout the next decade, investment in analytics was heavily focused on driving new product offerings and improving sales and marketing. The financial crisis in 2008 marked a turning point. In the years after, new regulations in the financial services and healthcare industries drove increased investment in analytics to support regulatory compliance.
Around the mid-2010s is when things really took off. Rapid innovations in technology infrastructure and tools drove down costs dramatically. Organizations now had access to a variety of open source analytics tools like Hadoop that they could run on significantly less expensive commodity hardware (think farms of off-the-shelf servers rather than proprietary Unix systems). At the same time, the variety and volume of data collection skyrocketed thanks to the Internet of Things, mobile devices and apps, and social media. Companies were inspired to experiment with these faster, cheaper approaches to gathering and analyzing data that promised to help them market, sell, and operate more effectively.
The modern era of data arrives—on a cloud
More recently, the emergence of cloud—and platforms like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform—has made owning and maintaining your own hardware largely unnecessary, freeing up funding and resources to pursue more aggressive, more effective, and more widespread use of business and advanced analytics.
The move to the cloud is happening a lot faster than anyone predicted, driven in large part by the need to replace or modernize legacy technology assets that are reaching the end of their lifecycle. Access to the cloud and cloud services can help organizations significantly speed time to delivery and increase flexibility while dramatically reducing total cost of ownership. Add to these advantages a never-ending stream of data sources and easier-to-implement technologies, and you can see how this perfect storm is leading companies to feverishly pursue cloud-powered advanced analytics that help them understand customer behaviors, rapidly iterate products, hyper-personalize marketing, and increase sales.
Better than a silver bullet: A modern data and analytics strategy
While it may be tempting to move to the cloud as fast as you can—and platforms like Snowflake Data Cloud are making it a lot easier—realizing its full potential for your organization requires a modern and comprehensive data and analytics strategy. What do we mean by modern and comprehensive?
- It’s built on best-of-breed technology and industry standards.
- It provides organizations with a scalable framework to collect, process, transform, structure, and deliver information through a standard platform and processes.
- It creates an environment for analytics products to be developed, automated, and propagated across an enterprise based on the organization’s business drivers.
In the face of rapid technology and marketplace change, having a strategic framework in place—one that is grounded in your organization’s stated analytics priorities—will enable smarter, faster decision-making about how to invest in and assemble the necessary components and capabilities. Without the framework’s guardrails, organizations tend to revert to siloed processes, which ultimately leads to duplication of effort, higher costs, and mistrust of the data.
Build your strategy on these 3 pillars
Based on our experience working with large companies across different industries, Aspirent has developed a reference architecture that enables and sustains a successful analytics journey and flexes to meet future needs. It’s built on 3 pillars.
- Data foundation
Organizing your data into raw, curated, and analytic layers lets you blend disparate data sources and unlock the value of all of your data. It’s also important to determine the proper ingestion, transformation, and analytical integration patterns for the data your organization gathers and uses. Having a roadmap that lays out your needs for the next 18-36 months can help you determine the best way to organize the different elements into your ideal future state. (A simplified sketch of this layering appears after the three pillars.)
- Visualization and advanced analytics
Properly designed data layers also reduce overhead for visualization developers and data scientists, providing them with accurate and synthesized data from various sources. Deployed on a strong data foundation, visualization tools help automate and distribute analytical products to the organization, enabling business users to understand complex data via easy-to-interpret visualized data points and become empowered decision-makers. Advanced analytics can deliver even deeper business understanding by uncovering and predicting the drivers of critical decision-making. Identifying how the organization is going to consume these insights is key not only to selecting the right visualization and analytics tools, but also to maximizing their value.
- Operationalization
To ensure your data and analytics strategy delivers on its promise in a meaningful and sustainable way, it’s critical to establish a robust data governance program with a steering committee that collaborates to set and enforce data standards and policies. Without effective data governance, your strategy could be derailed by unresolved data quality issues that cause integration and accuracy problems in your analytics efforts. Performing data audits to track data usage patterns helps organizations manage the costs of cloud platforms—whether, like Snowflake, the platform uses a consumption-based pricing model or a more traditional subscription model. (A brief sketch of such a usage audit also follows the three pillars.)
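To make the first pillar more concrete, here is a minimal sketch of the raw, curated, and analytic layering described above, written in Python with the snowflake-connector-python package. The connection parameters, schema names, stage, and ORDERS tables are illustrative assumptions for this sketch, not prescriptions from Aspirent’s reference architecture.

```python
# A minimal sketch of raw -> curated -> analytic layering in Snowflake.
# All object names (ANALYTICS_DB, @landing_stage, ORDERS_RAW, etc.) are
# hypothetical placeholders; swap in your own objects and credentials.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",        # placeholder credentials
    user="your_user",
    password="your_password",
    warehouse="TRANSFORM_WH",      # assumes this warehouse and database exist
    database="ANALYTICS_DB",
)
cur = conn.cursor()

# 1. Raw layer: land source data exactly as received so nothing is lost.
cur.execute("CREATE SCHEMA IF NOT EXISTS RAW")
cur.execute("CREATE TABLE IF NOT EXISTS RAW.ORDERS_RAW (payload VARIANT)")
cur.execute("""
    COPY INTO RAW.ORDERS_RAW
    FROM @landing_stage/orders/
    FILE_FORMAT = (TYPE = 'JSON')
""")

# 2. Curated layer: typed, cleaned, conformed records built from the raw layer.
cur.execute("CREATE SCHEMA IF NOT EXISTS CURATED")
cur.execute("""
    CREATE OR REPLACE TABLE CURATED.ORDERS AS
    SELECT
        payload:order_id::STRING         AS order_id,
        payload:customer_id::STRING      AS customer_id,
        payload:amount::NUMBER(12, 2)    AS amount,
        payload:order_ts::TIMESTAMP_NTZ  AS order_ts
    FROM RAW.ORDERS_RAW
    WHERE payload:order_id IS NOT NULL
""")

# 3. Analytic layer: business-facing aggregates that dashboards and models query.
cur.execute("CREATE SCHEMA IF NOT EXISTS ANALYTICS")
cur.execute("""
    CREATE OR REPLACE VIEW ANALYTICS.DAILY_REVENUE AS
    SELECT DATE_TRUNC('day', order_ts) AS order_date,
           SUM(amount)                 AS revenue
    FROM CURATED.ORDERS
    GROUP BY 1
""")

cur.close()
conn.close()
```

The same layering can be orchestrated with whatever transformation tooling your organization already uses; the point is that each layer has a distinct, well-defined role.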
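Similarly, the data audits mentioned under operationalization can start as a recurring query against Snowflake’s built-in ACCOUNT_USAGE views. This is a hypothetical sketch: connection details are placeholders, and reading SNOWFLAKE.ACCOUNT_USAGE requires appropriate privileges in your account.

```python
# A brief sketch of a usage audit: credits consumed per virtual warehouse
# over the last 30 days, summarized from Snowflake's ACCOUNT_USAGE share.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",   # placeholder credentials
    user="your_user",
    password="your_password",
)
cur = conn.cursor()

cur.execute("""
    SELECT warehouse_name,
           SUM(credits_used) AS credits_last_30_days
    FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
    WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_last_30_days DESC
""")

# Print the most expensive warehouses first so outliers are easy to spot.
for warehouse_name, credits in cur.fetchall():
    print(f"{warehouse_name}: {credits} credits")

cur.close()
conn.close()
```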
Can Snowflake fast-track how your strategy comes to life?
There is no doubt that Snowflake offers a number of powerful capabilities and potential advantages that can help your move to the cloud progress more smoothly. As with any technology you consider as you map out your data strategy, it’s critical to understand how Snowflake’s capabilities align to your specific operating realities so you can match the right workloads to the right technologies.
For example, if your organization is regularly querying integrated and aggregated data to support a variety of business analytics, then minimizing the replication of data and providing easy, scalable, and workload-level compute resources are going to be top priorities. This is one of Snowflake’s sweet spots. If your organization requires a lot of routine data ingestion and transformation, then processing the data on a different platform and bringing it into Snowflake to support downstream business analytics might provide a more flexible and cost-effective solution.
Ultimately, what you want is the right solution stack for each of the three pillars in your data and analytics strategy. We explore how and where Snowflake can help you do that in our second installment in this series, “How to use Snowflake to its maximum value in your data and analytics strategy.”
Dave Mobley, Chief Practices Officer
Greg Stuhlman, Principal, Data Analytics
Shan-Ming Chiu, Senior Manager, Data Architecture
Learn more about Aspirent’s Data & Analytics Practice.
Subscribe to our Insights newsletter.