In Part I, we examined the on-premises and cloud upgrade options available for SQL Server 2008 as it reaches end of life (EOL). For scenarios where memory management or day-to-day database operations have proved difficult, Snowflake is another strong option, and it is increasingly the default choice for data warehouse and data lake workloads. Snowflake was built from the ground up as a cloud-native data warehouse. The service separates compute from storage, allowing each to scale independently. It can natively load and optimize both structured and semi-structured data and make both available via standard SQL. With fast, flexible, easy-to-use workflows powering both simple and advanced functions, users can run a highly scalable database. Snowflake runs natively on AWS (EC2 and S3) and, since 2018, on Azure (including Azure Blob Storage and Azure compute services), giving users a choice of platforms while freeing them from legacy upgrade cycles.
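As a brief sketch of the semi-structured capability described above, the snippet below lands JSON in a VARIANT column and then queries it with ordinary SQL. All table, stage, file, and field names here are hypothetical, chosen only for illustration.

```sql
-- Hypothetical example: land raw JSON in a VARIANT column.
CREATE TABLE raw_events (payload VARIANT);

-- Load JSON files from a (hypothetical) named stage.
COPY INTO raw_events
  FROM @my_stage/events.json
  FILE_FORMAT = (TYPE = 'JSON');

-- Query nested fields directly with SQL, flattening a JSON array
-- into rows without any upfront schema definition.
SELECT
  payload:customer.id::STRING AS customer_id,
  value:sku::STRING           AS sku
FROM raw_events,
     LATERAL FLATTEN(input => payload:line_items);
```

Because the JSON is stored natively, new attributes arriving in the feed become queryable without a schema change, which is part of what the "load and optimize semi-structured data" claim refers to.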
Before examining the prerequisites for a SQL Server 2008 to Snowflake migration, let us briefly review the major upgrade considerations. Snowflake is a fully managed, cloud-based database that scales as resource needs (CPU, memory, and storage) change. Numerous articles cover Snowflake's technical capabilities and explain why it is a viable cloud database for replacing legacy platforms.
Let’s examine the prerequisites for migrating from SQL Server to Snowflake.
Migration to Snowflake is much easier with Trianz's Evove, a unique, highly automated migration technology. Evove automates roughly 95% of the migration of legacy data platforms to Snowflake's state-of-the-art cloud architecture, with proven reliability and quality. In one recent engagement, a customer maintained a 250 TB appliance purely for redundancy and failover, with a large volume of expensive data sitting idle on it. Snowflake, Azure, or AWS can serve as a lower-cost offload target, in some cases matching the performance of legacy platforms such as Teradata for both data and queries. For clients still under contract with legacy vendors, Trianz can help optimize the hybrid environment: as part of an Evove engagement, we analyze workloads and identify those better suited to Snowflake, freeing up powerful Teradata appliances for the heavy workloads that remain.
For all your data footprint and migration conversations, you can reach out to us at [email protected].
Director of Analytics Practice
Kireet Kokala is a senior data technology leader in the Data and Analytics Practice at Trianz who helps clients with digital transformation and data monetization. The practice works with enterprises to achieve significant competitive advantage through modern cloud technologies, with a particular focus on the Snowflake Computing ecosystem.