Now that you’ve finished analyzing your current database and worked through the list of items that won’t work with Windows Azure SQL Database (WASD), it’s time to begin testing your new solution. Testing begins with setting up your test instance of WASD and loading your test data; then you point your application at that new instance and run through it end-to-end. Let’s walk through those steps in more detail and cover some of the issues you might discover.
Push a copy of your data to the cloud
After you’ve set up your instance via manage.windowsazure.com, getting your data pushed is pretty simple. You could use either the Migration Wizard or SSIS. I prefer the Migration Wizard, since it has better throughput. All you’ve got to do is pick your source database and tables, target your new instance and database, and click go! It takes care of the rest.
If you choose to use SSIS, remember you’ll have to use ADO.NET connections for your destinations. Also, be sure to set the connection string so that your package knows it’s sending this data to SQL Azure. I haven’t had a connection disruption in the middle of a data flow task, but I have had it fail between data flow tasks.
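If it helps, here’s a rough sketch of what that kind of connection string looks like. The server, database, and credential values below are placeholders, not real ones; the key points are that SQL Azure requires encryption, and the classic tooling expects the user@server login format.

```python
# Building an ADO.NET-style connection string for Windows Azure SQL Database.
# All names here are placeholders -- substitute your own server and credentials.
server = "yourserver"        # the xxxxxx part of xxxxxx.database.windows.net
database = "MigrationTest"
user = "testuser"
password = "<password>"

conn_str = (
    f"Server=tcp:{server}.database.windows.net,1433;"
    f"Database={database};"
    f"User ID={user}@{server};"       # user@server login format
    f"Password={password};"
    "Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
)
print(conn_str)
```

The `Encrypt=True` and `user@server` pieces are what tell your package it’s talking to SQL Azure rather than an on-premises instance.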
Once all your test data is in Azure and you’ve verified it, you’re ready to move on to testing your application.
Test your application
It’s a good idea to invite a small number of your system’s power users to join in on the testing. They’re generally the ones who push your system the hardest today, so you want them to bring that same focus to testing. Systematically test everything your application does with the database. You want to go through everything now, before you turn this over to the end users. There are automated tools for this; I’d direct you to your application architect for more information on those automated test suites.
You want to make sure you exercise every feature your application has. When you find an issue, record as much detail as you can about what you did to reproduce it. Once you’ve gone through the application once, bring those issues back to the programmers and start breaking down what has to change to fix them.
This is where you’ll discover things like your connection pooling isn’t set appropriately for SQL Azure. You might discover your retry policy is too aggressive, or you’re sending atomic updates rather than batching them together. This is when you’ll discover if you’ve been following some programming best practices all along. If you haven’t, no time like the present to improve those skills, right?
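To make the retry-policy point concrete, here’s a minimal sketch of a back-off retry wrapper. It’s illustrative only: the function names are mine, not from any particular library, and a real implementation would retry only on the transient error codes SQL Azure actually returns rather than any connection error.

```python
import random
import time

def execute_with_retry(operation, max_attempts=5, base_delay=1.0):
    """Run `operation`, retrying transient failures with exponential backoff.

    SQL Azure drops connections far more often than an on-premises server,
    so backing off between attempts (instead of hammering the server in a
    tight loop) is the usual recommendation.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except ConnectionError:
            if attempt == max_attempts:
                raise  # out of attempts; surface the error
            # Exponential backoff with a little jitter.
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, 0.1)
            time.sleep(delay)

# Simulated flaky query: fails twice, then succeeds.
attempts = {"count": 0}
def flaky_query():
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise ConnectionError("transient failure")
    return "rows"

print(execute_with_retry(flaky_query, base_delay=0.01))  # -> rows
```

An overly aggressive policy is the opposite of this: zero delay and unlimited attempts, which just piles more load onto a shared server that’s already throttling you.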
Fixing those issues
As you discover issues, you’ll have to prioritize them and decide how to address them. You can choose to fix them now, fix them later, or drop the feature altogether. These decisions come down to a question of time and money; I can’t give you a definitive answer on which kinds of issues are which.
As you work through these issues, you’ll return to the testing phase several times, each time verifying the issues you addressed truly are fixed. Don’t forget you’re also re-testing all the items that passed the first time. You want every test to pass every time. The great news is, once you pass all your tests, you’re ready for the final step.
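The “every test, every time” idea can be sketched as a simple suite runner. The check functions here are stand-ins for real application tests; the point is that the whole suite runs on every pass, not just the tests that failed last time.

```python
def check_orders_report():
    """Stand-in for a real application test (e.g., run a report, compare output)."""
    return True

def check_customer_search():
    return True

def check_batch_import():
    return True

# The full regression suite -- every check, every cycle.
REGRESSION_SUITE = [check_orders_report, check_customer_search, check_batch_import]

def run_full_suite(suite):
    """Run every test and collect the names of any failures."""
    return [test.__name__ for test in suite if not test()]

failures = run_full_suite(REGRESSION_SUITE)
print("all green" if not failures else f"failing: {failures}")  # -> all green
```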
Go live
Going live is very similar to the testing process. It starts with a database push to make sure all the data you need for production is in Azure. Once you have that, you make sure all the applications are wired to hit the new Azure instance, and you run through the test suite one more time. If all your lights “go green,” you announce to your users that the migration is complete and invite the full user base to start using the new system.
And then you wait.
During our first migration this wait was a little longer than usual. We heard no feedback from the end users that first day. All the diagnostics showed the system was performing well. No unusual SQL errors, the application response time was reporting well below the response time threshold. Reports were running as quickly as they had during testing. Why weren’t we hearing anything from the users?
Finally we got a call on the second day. I was manning the phone. One of our most demanding users, one who wasn’t in the test group, was on the line.
“So I’m going through this new system…” she began. She let that pause hang for far too long. “And it’s just so fast!”
I was completely relieved. We spent the next 30 minutes discussing the effort we had put into performance-tuning the reports and working with the test users to tweak how we pulled data back from the import processes. The call ended with the biggest smile crossing my face. It was a great day.
From start to finish this process took a year. There was plenty of work poured into the project, and it shows. The customer now has a system far beyond the one he began with. A year after go-live, he’s doubled the number of clients using his system. During that year I learned a ton about what Azure is and what it’s capable of, and I see the potential for it to become something much more. I’ve also found there’s a data architecture many SQL professionals across the country aren’t familiar with: sharding. It’s the real key to scaling out SQL Server.
And that’s the topic I’m tackling next. Are you ready to see what the OLTP version of Parallel Data Warehouse looks like? Stay tuned!
In the meantime, if you have any questions, send them in. I’m here to help!
Last time I walked you through how to see if your database will work with Windows Azure SQL Database (WASD) using the Migration Wizard on CodePlex. It’s an easy tool to learn, and it answers the question, “What won’t work if I migrate to SQL Azure?” It will even help you perform the migration, copying all the schema and data from your database to a new Azure database. But if you find you have to make changes to your database before migrating to Azure, you’ll discover the Migration Wizard lacks a lot of functionality that will prove useful during the process. That’s why I moved to SQL Server Data Tools (SSDT).
Let’s take the same AdventureWorks2008R2 Database from the previous post, and test it for migration using SSDT. Before trying to follow along, make sure you have downloaded and installed SSDT on your machine. I’ll be using Visual Studio 2012, but I’ve used SSDT in 2010 as well and all the functionality appears to be the same to me.
You’ll then have to create a new connection to your database. In my case I’ll connect to my localhost copy of AdventureWorks2008R2. Then, for Import Settings, I’m only going to capture the application-scoped objects; I’m not concerned with security objects during this test.
I would be careful trying to migrate database settings from on-premises to Azure. Since you’re not going to be able to set up filegroups and the like, don’t even try to import those if you’re using them. It’ll save you hassle later in the migration project.
Once you’ve connected and set your options, just hit start to begin the import. You’ll get a console showing you the import progress. When it’s complete, hit finish to close the console and return to the project.
Now you have all the objects from your database in your project. At this point I’d advise you to save your project and commit it as the initial version of your migration project. This is your starting point. You could deploy this project to another instance of your on-premises database server, but the source control conversation is a talk for another time.
Let’s talk about how we could use this project to see what will and won’t work in Azure SQL Database.
You’ll notice the target platform on my screen is SQL Server 2012. If you imported from another version of SQL Server, you’ll see that listed in the Target platform drop-down. Select Windows Azure SQL Database, then hit save (Ctrl+S). The project now knows you’ve updated the target to WASD.
Hit View -> Error List. You’ll be presented with a list of all the objects and options you’re currently using that aren’t compatible with Azure SQL Database. This is the list of items you’ll have to work through before migrating to Azure. I know I keep mentioning it, but being tied into Visual Studio lets you keep track of your updates (as long as you tie in version control). It also lets you assign tasks, if you tie in TFS. There are many added benefits to using SSDT to migrate your database.
And once you’ve addressed all the errors in that list, you can run a deploy against your target Azure database and publish your schema there! The only thing you really lose in moving from the Migration Wizard to SSDT is the easy way to deploy data from your source database to your new Azure database. You could always use the Import/Export Wizard, but the Migration Wizard will perform better. Since it runs using BCP, it consistently pushes a higher row rate from my lab environment to Azure. My average is just shy of 30k rows per minute. Not screaming fast, but better than I see in SSIS, which ran around 12k rows per minute in my last tests.
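Part of that throughput gap comes down to batching: a BCP-style loader streams rows to the server in sizable batches rather than paying one round trip per row. Here’s a minimal, illustrative sketch of that batching idea (the function and batch size are mine, not any tool’s API):

```python
def batched(rows, batch_size):
    """Yield rows in fixed-size batches, the way a BCP-style loader sends them,
    instead of issuing one round trip per row."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

rows = list(range(10))
batches = list(batched(rows, 4))
print(batches)  # -> [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

Each yielded batch would become a single bulk insert against the Azure database; per-row round trips are exactly what drags a naive SSIS data flow down over a high-latency link.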
I’ll admit I haven’t taken a great deal of time to performance-tune my SSIS package to see if I could squeeze more throughput out of it.
Once you’ve successfully deployed your schema to Azure SQL Database, you’re ready for some integration tests. We’ll go through that next time. Until then, if you have any questions, please send them in. I’m here to help!
As promised, we’re going to dive into the details of how to actually migrate a database from your on-premises server to Windows Azure SQL Database. We’re going to go over how to do this with the SQL Database Migration Wizard. I’ll be honest: going forward we only use SQL Server Data Tools (SSDT), because you get the full database project experience, and once you tie your project into source control (you all do that, right?), you’re set up for the actual project life cycle.
Features that won’t work with WASD (today)
Before we dive in, I want to link you to two pages that list some of the features you can’t currently use with Windows Azure SQL Database. This is the first, and this is the second. Notice I say “today”; that’s because since I started with Azure SQL Databases, features have been “turned on.”
At the beginning you had no backup strategy at all. Then Red Gate and a few other companies rolled out services that would snapshot your database. It wasn’t necessarily a transactionally consistent backup, but it gave you the ability to create a daily backup, restore that data to a second database (on-premises or in the cloud), and recover just the data you lost.
A few months ago, this capability became a first-party feature. Microsoft also added a method to ensure transactional consistency on its snapshots. Depending on the size of your database, this new offering might cost less than the third-party solutions, and given that it added transactional consistency, it was well worth the change in cost.
Now, if only we could get point-in-time backups without having to put in a service request to Microsoft CSS…now that would be even better!
I want to make sure you get two points here:
- Windows Azure SQL Database is an evolving product. I’ve seen changes happen in less than a month. I’m really excited when I get my email from the Azure Marketing team telling me about all the new features that just went General Availability (GA), or those items just entering preview.
- Windows Azure SQL Database isn’t a dedicated private server. You’re sharing it with others, sometimes MANY others. When you notice a feature isn’t supported, ask yourself, “What’s the cost of turning this on for a box with possibly a hundred different SQL workloads?” If you think the cost is too high, that’s a good indication of why that feature isn’t yet available in WASD. This means sometimes you’ll have to alter how you do things. Sometimes it means you’ll have to be more performance and cost conscious. And that’s something I was preaching for years before moving to the cloud!
Analyzing your database with SQL Database Migration Wizard
During the next phase of the migration, you’ll need to find out if you’re using any features on-premises that you can’t use in Azure. After you download and extract the Migration Wizard, fire it up!
When the program opens, you can do a couple of things. First, you could analyze a trace. This is a great option if you have a program that isn’t well documented in terms of what it does with the database. You can start a replay trace on your on-premises server and let it capture your “standard” workload. That could mean running the trace for many hours, possibly a few days. Please run that trace from the server and not your local machine.
Once you’ve collected the trace, bring it down to your workstation, choose the first option, and go. It will churn through your workload.
The steps we’re going to take here just look at the database itself. Choose to analyze/migrate your database and click Next.
You have to authenticate to your local database server, then click Next. On the next screen you’ll see a list of your databases. Choose the database you want to migrate, then click Next again.
Next, you can choose to check specific objects or the full database. In this case, we’re checking the whole thing, AdventureWorks2008R2.
Next, you see a summary of what’s going to be scripted and checked, followed by the obligatory “Are you sure?” Yes/No prompt.
Once the wizard has crawled through your database, you get a two-tab result. The first tab shows a summary of the results. Scan through it looking for red text; those are the items you’ll have to address before moving on to Azure SQL Database. Notice that AdventureWorks is using XML schemas, XML indexes, and a few other no-nos.
The second tab gives you a script you could execute against an Azure database to create your objects there. It also includes comments on things you’ll want to fix before fully moving to Azure.
If you start in the Migration Wizard, you’d then have to create a database project (in SSDT or otherwise) and start working through these problems. Some might require program changes, SSIS changes, or other changes outside the database. This is the hard part: deciding how to change the database to make it work. There’s even a chance this step will reveal problems that make the project too expensive to migrate.
Next time, I’ll cover how to accomplish these same steps with SQL Server Data Tools (SSDT). In the meantime, if you have any questions, send them in. I’m here to help!