SQL Server is an RDBMS that powers businesses around the globe and has become an enterprise contender thanks to its advanced integration and cloud capabilities. Extended Support for SQL Server 2008 and 2008 R2 ends on July 9, 2019, and many customers will need to either upgrade or migrate their databases.
In Part I of this series, we examine the available options for on-premises and cloud deployments via Azure (IaaS/PaaS), AWS, and GCP. In Part II, we'll look at the prerequisites for accelerating SQL Server data loads into Snowflake.
SQL Server has been the go-to database for both OLTP and OLAP applications for over 25 years. According to 2016 data from idatalabs.com, thousands of companies use SQL Server to power their business.
The survey found that, by industry, Computer Software (14%) and IT & Services (11%) are the largest segments of SQL Server customers. After more than a decade in production, Microsoft is giving customers options for dealing with their SQL Server 2008 and SQL Server 2008 R2 installations, preferably by moving to the Azure cloud. Azure offers several choices, such as SQL Database Managed Instance, alongside the on-premises route of SQL Server 2017 or 2019.
Upgrading SQL Server to the latest version generally translates to higher license fees, plus the operational cost of preparing and migrating data. However, with cybersecurity attacks and database and zero-day vulnerabilities on the rise, the cost of NOT upgrading SQL Server is even higher.
The benefits of migrating to the latest SQL Server version are wider data integration options, improved language and platform choices, better performance, and advanced security features. The newer versions also open up more sophisticated possibilities for Big Data landscapes and data virtualization. Though preparing for an on-premises or cloud-based SQL Server upgrade has its complexities, it is well worth the effort. In many cases, the upgrade may be part of a cloud-native push or a multiyear cloud migration journey.
Note: SQL Server databases and other objects associated with the previous SQL Server instance must be backed up before upgrading.
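As a hedged illustration of that pre-upgrade step (the database name, file paths, and logical file names below are placeholders, not from any specific environment), a full backup, a verification pass, and a restore onto the newer target instance might look like this in T-SQL:

```sql
-- Full backup of the 2008/2008 R2 database prior to upgrade.
-- (COMPRESSION is edition-dependent on 2008-era servers; drop it if unsupported.)
BACKUP DATABASE [SalesDB]
TO DISK = N'D:\Backups\SalesDB_PreUpgrade.bak'
WITH CHECKSUM, STATS = 10;

-- Confirm the backup file is restorable before touching the source instance.
RESTORE VERIFYONLY
FROM DISK = N'D:\Backups\SalesDB_PreUpgrade.bak'
WITH CHECKSUM;

-- On the newer target instance, restore and relocate the data/log files.
RESTORE DATABASE [SalesDB]
FROM DISK = N'D:\Backups\SalesDB_PreUpgrade.bak'
WITH MOVE 'SalesDB'     TO N'E:\Data\SalesDB.mdf',
     MOVE 'SalesDB_log' TO N'E:\Logs\SalesDB_log.ldf',
     STATS = 10;
```

A backup restored this way onto a newer version is upgraded in place; note that once upgraded, the database cannot be restored back onto the older instance.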
This backup-and-restore approach works well for a lift and shift of a SQL Server database.
Upgrade to SQL Server on Azure: extended security updates will be available for free in Azure for the 2008 and 2008 R2 versions of SQL Server and Windows Server, helping secure workloads for three more years after the end-of-support deadline. Customers can rehost these workloads to Azure with no application code changes, on Windows or Linux instances, via the Azure Portal. Additionally, SQL Server 2008 and 2008 R2 deployments can be moved with no application code changes to Azure SQL Database Managed Instance.
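One detail worth noting after a no-code-change move: the restored database initially keeps its old compatibility level, so applications behave as they did on 2008/2008 R2. A minimal sketch (the database name is a placeholder) for checking the carried-over level and raising it once the application has been validated:

```sql
-- Check the compatibility level carried over from SQL Server 2008/2008 R2;
-- a value of 100 corresponds to the 2008/2008 R2 behavior.
SELECT name, compatibility_level
FROM sys.databases
WHERE name = N'SalesDB';

-- After validating application behavior on the new platform,
-- raise the level to unlock newer engine features (140 = SQL Server 2017).
ALTER DATABASE [SalesDB] SET COMPATIBILITY_LEVEL = 140;
```

Raising the compatibility level deliberately, after testing, keeps the rehost itself low-risk while still allowing the workload to benefit from the newer engine over time.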
Upgrade to SQL Server on AWS: SQL Server on Windows or Linux on Amazon EC2 makes it possible to increase or decrease capacity within minutes. SQL Server databases can be migrated securely with the AWS Database Migration Service. For SQL Server 2008, databases can be ingested into the AWS cloud and easily upgraded with AWS Systems Manager.
Upgrade to SQL Server on GCP: SQL Server 2008 can run on either Windows or Linux virtual machines via the GCP console. There is wide support for SQL Server versions 2012 through 2017, from the Express to the Enterprise editions. By the end of the year, Cloud SQL for SQL Server will allow customers to migrate existing on-premises SQL Server workloads to GCP and run them in a fully managed database service that autonomously handles backups, replication, patches, and updates. Cloud SQL will preserve existing apps and data while enabling integration with GCP services like BigQuery.
Licensing Costs and Updates: for an on-premises upgrade, the appropriate license key must be procured and applied. For a cloud upgrade on Azure, AWS, or GCP, license mobility allows BYOL or volume-licensing options. With this benefit, customers save up to 55% on the cost of running SQL Server and Windows Server in the cloud. Customers running SQL Server 2008 or 2008 R2 in Azure virtual machines will receive free extended security updates. On-premises customers with active Software Assurance or subscription licenses, however, can purchase extended security updates annually for 75% of the full license cost of the latest version of SQL Server or Windows Server.
These initial lift-and-shift operations on the most common cloud platforms buy additional time to plan further migration activities, including upgrading to newer versions such as SQL Server 2017 and 2019.
For all your data footprint and migration journeys, you can reach out to us at [email protected]. Be sure to connect with us at Snowflake's SnowSummit, June 3-6, where Trianz is a proud sponsor.
Director of Analytics Practice
Kireet Kokala is a senior technology leader in the Data and Analytics Practice at Trianz who helps clients with digital transformation and data monetization. The Data and Analytics Practice works with enterprises to achieve significant competitive advantage via modern cloud technologies, with a particular focus on the Snowflake Computing ecosystem.