SQL Server is an RDBMS that powers businesses globally and has become an enterprise contender with advanced integration and cloud capabilities. Extended Support for SQL Server 2008 and 2008 R2 ends on July 9, 2019, and many customers will need to either upgrade or migrate their databases.
In Part I of this article series, we examine the available options on-premises and in the cloud via Azure (IaaS/PaaS), AWS, and GCP. In Part II, we'll look at the prerequisites for accelerating loads of SQL Server data to Snowflake.
SQL Server has been the go-to database for both OLTP and OLAP applications for over 25 years. According to 2016 data from idatalabs.com, thousands of companies use SQL Server to power their business.
The survey found that, by industry, Computer Software (14%) and IT & Services (11%) are the largest segments of SQL Server customers. More than ten years on, Microsoft is giving customers options for dealing with their SQL Server 2008 and SQL Server 2008 R2 installations, preferably by moving to the Azure cloud. The Azure choices include Azure SQL Database Managed Instance, alongside the on-premises path of upgrading to SQL Server 2017 or 2019.
An upgrade of SQL Server to the latest version generally means higher license fees and the operational cost of preparing and migrating data. However, with cybersecurity attacks, database exploits, and zero-day vulnerabilities on the rise, the cost of NOT upgrading SQL Server is even higher.
The benefits of migrating to the latest SQL Server version include wider data integration options, more language and platform choices, better performance, and advanced security features. Much more can also be achieved in Big Data landscapes and data virtualization along the way. Though preparing for an on-premises or cloud-based SQL Server upgrade has its complexities, the preparation is well worth it. In many cases, the upgrade is part of a cloud-native push or a multiyear cloud migration journey.
Note: SQL Server databases and other objects associated with the previous SQL Server instance must be backed up prior to upgrading.
This summary works well for a lift and shift of a SQL Server database and covers database backup and restore procedures.
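As a minimal sketch of the pre-upgrade backup and restore step noted above, the snippet below generates the corresponding T-SQL statements. The database name and file path are hypothetical placeholders; substitute your own before running the output against an instance.

```python
# Sketch: build the T-SQL for a full, checksummed pre-upgrade backup and
# a later restore. "SalesDB" and the D:\Backups path are example values.

def backup_statement(db: str, path: str) -> str:
    """Full backup of `db` to the file at `path`, with checksum verification."""
    return f"BACKUP DATABASE [{db}] TO DISK = N'{path}' WITH CHECKSUM, INIT;"

def restore_statement(db: str, path: str) -> str:
    """Restore `db` from `path`, bringing the database online (WITH RECOVERY)."""
    return f"RESTORE DATABASE [{db}] FROM DISK = N'{path}' WITH RECOVERY;"

if __name__ == "__main__":
    bak = r"D:\Backups\SalesDB_pre_upgrade.bak"
    print(backup_statement("SalesDB", bak))
    print(restore_statement("SalesDB", bak))
```

The generated statements can be executed via SSMS or sqlcmd; backing up other instance-level objects (logins, jobs, linked servers) still needs to be handled separately.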
Upgrade to SQL Server on Azure: Extended Security Updates will be available for free in Azure for the 2008 and 2008 R2 versions of SQL Server and Windows Server, helping secure workloads for three more years after the end-of-support deadline. Customers can rehost these workloads to Azure with no application code changes, on Windows or Linux instances, via the Azure Portal. Additionally, SQL Server 2008 and 2008 R2 deployments can be moved with no application code changes to Azure SQL Database Managed Instance.
Upgrade to SQL Server on AWS: SQL Server on Windows or Linux on Amazon EC2 lets you increase or decrease capacity within minutes. SQL Server databases can be migrated securely with the AWS Database Migration Service. For SQL Server 2008, databases can be ingested into the AWS cloud and easily upgraded with AWS Systems Manager.
Upgrade to SQL Server on GCP: SQL Server 2008 can run on either Windows or Linux virtual machines via the GCP console. There is wide support for SQL Server versions 2012 through 2017, from Express to Enterprise edition. By the end of the year, Cloud SQL for SQL Server will allow customers to migrate existing on-premises SQL Server workloads to GCP and run them in a fully managed database service that autonomously handles backups, replication, patches, and updates. Cloud SQL will preserve existing apps and data while enabling integration with GCP services like BigQuery.
Licensing Costs and Updates: if performing an on-premises upgrade, the appropriate license key should be procured and applied. If upgrading in the cloud, Azure, AWS, and GCP all offer license mobility with BYOL or volume licensing options. With this benefit, customers save up to 55% on the cost of running SQL Server and Windows Server in the cloud. Customers running SQL Server 2008 or 2008 R2 in Azure virtual machines will receive free extended security updates. On-premises customers with active Software Assurance or subscription licenses, however, can purchase extended security updates annually for 75% of the full license cost of the latest version of SQL Server or Windows Server.
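To make the two figures above concrete, here is a back-of-the-envelope sketch of the quoted percentages. The $40,000 base license cost is a made-up example value, not a real price.

```python
# Illustrative arithmetic for the licensing figures quoted above.
# The 55% cloud savings and the 75% extended-security-update rate come
# from the article; the $40,000 base cost is a hypothetical example.

def cloud_cost_after_savings(on_prem_cost: float, savings_rate: float = 0.55) -> float:
    """Cost of running in the cloud with license mobility (up to 55% savings)."""
    return on_prem_cost * (1 - savings_rate)

def extended_updates_annual_cost(latest_license_cost: float, rate: float = 0.75) -> float:
    """Annual fee for on-premises extended security updates: 75% of the
    full license cost of the latest SQL Server / Windows Server version."""
    return latest_license_cost * rate

if __name__ == "__main__":
    base = 40_000  # hypothetical full license cost
    print(f"Cloud cost after up-to-55% savings:  ${cloud_cost_after_savings(base):,.0f}")
    print(f"Annual extended-security-update fee: ${extended_updates_annual_cost(base):,.0f}")
```

Note that the on-premises extended-update fee recurs each year, which is a large part of why the free updates in Azure are pitched as the more economical path.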
These initial lift-and-shift operations on the most common cloud platforms allow additional time to plan migration activities, including upgrading to newer versions such as SQL Server 2017 and 2019.
For all your data footprint and migration journeys, you can reach out to us at [email protected]. Be sure to connect with us at Snowflake's SnowSummit during June 3-6, where Trianz is a proud sponsor.
Director of Analytics Practice
Kireet Kokala is a senior data technology leader in the Data and Analytics Practice at Trianz who helps clients with digital transformation and data monetization. The Data and Analytics Practice works with enterprises to achieve significant competitive advantage via modern cloud technologies, with a particular focus on the Snowflake Computing ecosystem.