Migration to Windows Azure SQL Database: Step 2

2013 October 29
by Shannon Lowder

Let’s continue our discussion on what it takes to migrate a database from on-premises SQL Server to Windows Azure SQL Database (WASD).  At this point in the migration we’ve determined that all the access points in our system have internet connectivity, so they’ll be able to reach our WASD instance.  We’re not going to store any highly sensitive data (credit card data in plain text or government secrets), and our current system was already using SQL Authentication to connect to the database and control who had access to what.

In our next step, we’re going to dig into the code and find out what could slow down the migration, and what might prevent it altogether.

Research and Documentation

I’ve worked with small and medium-sized businesses as well as Fortune 500s, and there’s one thing they all have in common: business systems are never fully documented.  Some are just more documented than others.  The next step in the migration process requires you to find all the documentation that exists on the system as it functions today.  With that in hand, you’re going to see what may no longer work with SQL Azure (WASD).  I like to break my research down into five basic questions.

Who uses the system?

Knowing who your users are will help you when you have questions later.  When you read through the documentation, you’ll inevitably hit that one step where the writer assumes you just know how to do something.  Knowing who normally performs that function will save you a lot of time and energy: you can go straight to the source and ask how it’s done.

Knowing who your power users are will help too.  You’ll want a few of them in your testing team.  After all, these are the people who are pushing your current system.  You’ll want them to push your new system too.  It’s better to push it before going live to see where it’s going to fall short.  Also, if you get some of these power users on your side, they’ll help sell these changes to others in the company.  And since resistance to change is one of the hardest challenges to overcome, you need to recruit as many to that cause as you can.

You can’t just change a system without considering the users.  Get to know them before you begin the process.  Listen to their concerns; if you can address them in the migration, it will make the whole project more successful in the long run.  Besides, you’re making this change to improve their experience…right?

What, what, what?

There are many whats I like to answer.  What are the users doing in the system?  What kinds of questions are they trying to answer?  What does the system do?  What is the system supposed to do?

By answering these questions you’ll get to know the system from the inside out.  You’ll start to spot opportunities for improvement.  When you start answering the question “what questions are you trying to answer?”  you’ll really start to see room for improvement in your reporting solution.

As you answer these what questions, you’ll also start to see areas where the system is actually doing things differently than the users expect.  This is another area where you can make improvements to the system.  While not strictly a part of the migration, this question can help guide your team to make constant little improvements that create a strong relationship between your team and your customers (or co-workers).

These what questions will also help you spot trouble in the migration.  If the system regularly pulls down 300,000 rows to build a report, that’s not going to work well in SQL Azure.  I’d also argue it isn’t working well on-premises either!  These what questions start pointing out problems, but you’ll have to dig further into the code to determine whether you can continue as-is or will have to make changes to the system before you proceed.
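When it’s time to do that digging, the plan cache is a good place to start.  Here’s a minimal sketch that lists the statements doing the most reading on your current on-premises instance; the TOP value and the sort column are just a starting point, so adjust them to your own workload.

-- Which cached statements are doing the most reading?
-- Run this against the current on-premises instance.
SELECT TOP (20)
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text,
    qs.execution_count,
    qs.total_logical_reads,
    qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
    qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time_us
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_reads DESC;

Anything that floats to the top of that list and feeds a report is a candidate for a closer look before the move.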

Where are your users located, and when do they use the system?

These two questions help you manage your maintenance windows.  What’s really frustrating for many smaller shops is not having a clearly defined maintenance window.  I’ve worked in many shops where there was no communication between IT and the end users; maintenance windows were just taken whenever we didn’t see users on the system.

That doesn’t work when you’re dealing with users all over the planet, and even less so when your business is online and open to the public.  By talking to your users, whether through actual discussions, questionnaires, or logging, you can get real insight into where your users are located and when they are using your system.
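If you don’t have that logging in place yet, even a simple connection snapshot gets you started.  This is a minimal sketch against the current on-premises server; the idea is to capture it on a schedule (a SQL Agent job works fine) and aggregate the results by hour and by client address.

-- Who is connected right now, from where, and with what application?
-- Capture this on a schedule to build a picture of usage over a few weeks.
SELECT
    s.login_name,
    s.host_name,
    s.program_name,
    c.client_net_address,
    s.login_time
FROM sys.dm_exec_sessions AS s
JOIN sys.dm_exec_connections AS c
    ON c.session_id = s.session_id
WHERE s.is_user_process = 1;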

You can then start scheduling your regular maintenance windows, establishing service level agreements, and planning for those longer maintenance windows where you’re going to be migrating a ton of data from one data center (on-premises) to the cloud.

Trust me, you’re going to want to communicate that to your end users.  Be up front with them: good faith is hard to build with your users, and communication is one of the few free ways to do it.

Why do your users use this system?

This last question may sound like a marketing question, and in a way it is.  If you can understand why users are using the system, it will help you make certain decisions in the future.  If people use your system because it’s the most accurate, then accuracy needs to stay important to you while you conduct the migration.  If cost is the top priority, then keep that in mind.  Getting into the minds of your users helps you stay on the same page.  It will also help you address problems with the right tools.

In our case study, knowing why users want to see how many people get on at one location and off at another could help fleet managers plan for putting more or fewer buses in service.  In turn, that will help riders wait less and help the fleet save money.  But if you never understand that, you’ll never understand why the user keeps asking for several similar reports.  Once you take the time to understand why they are using the system, you can work with them to design one report that actually answers both questions and accomplishes both goals.  Then you no longer have to design multiple reports to answer the same questions over and over!

Conclusion

Get to know the system you’re migrating, as well as the users of that system.  It makes the process go more smoothly, and it helps you identify opportunities for improvement.  If you skip this step, it is still possible to complete a migration, but you’ll be limited to a lift-and-shift.  Take the time here, and you open up a ton of possibilities.  And you’ll grow a relationship with your customers at the same time.

Next time, we’re going to dive into the technical details around whether or not you can migrate your database from on-premises SQL Server to Windows Azure SQL Database.  We’ll discuss the Migration Wizard that we used for our case study project, as well as how we would do it today with SQL Server Data Tools built into Visual Studio 2012.

As always, if you have any questions about a migration, or anything SQL Server related: feel free to comment below or email me.  I’m here to help!

 

Migration to Windows Azure SQL Database: Step 1

2013 August 19
by Shannon Lowder

Server Closet?

Last time I introduced you to Windows Azure and some of its key features: pay-as-you-go pricing, scalability, and ease of use.  These are the three key points I make when I start talking to business owners about why I like Azure and how it can help their businesses.  But before we try to sell them on Azure, we have to ask a key question.

Would the system work in Azure?

We have to ask some pretty basic questions to figure out if the business system would work in the cloud.  The most basic of these is “Does every part of the system have internet connectivity?”  It goes without saying that if you’re looking to move to the cloud, you’re going to need a connection to the cloud!

In the presentation we are dealing with a company that uses sensors mounted on vehicles to collect data about those vehicles and the number of passengers traveling on them.  These sensors use a cellular data connection to reach the internet.  They were already using that connection to “dial home” and hand the data off to a service that would record it in the database and then acknowledge the packet.  Once the acknowledgement is received, the sensor deletes the data and goes back to collecting more.  I’d like to point out that these sensors will hold on to their packets for up to a year without dialing home.  This helps address possible connectivity loss between the sensors and the central service.  The fact that this fail-safe was built in saved a lot of time and effort in the migration.

We didn’t have to build anything extra to deal with the fact that you cannot assume your connection to the cloud will always be there.  Connections are transient. If you build your solutions assuming the connection isn’t always there, and build in a retry policy, you’ll always be in better shape in the long run!

On the other end, users were already connecting to the system through a Windows-based application that reaches the database over the internet and retrieves data for reporting.  From end to end, the system already had internet connectivity, so we were good to go!

Are you dealing with secret data?

Officially you shouldn’t be putting “top secret” data into Windows Azure.  But you can deploy sensitive data, as long as you take the proper precautions in your system design.  In fact, Microsoft currently holds ISO/IEC 27001:2005 certification and SSAE 16/ISAE 3402 attestation, offers a HIPAA Business Associate Agreement, and goes through an annual review process.  You can find the current references to these certifications here.  The big one that’s missing is PCI, but you should be able to obtain PCI certification for a solution built on Azure, as long as you show you’re meeting everything on the PCI DSS checklist.  Demonstrating controlled access to the data centers would be a breeze, since you never actually get physical access.

Whenever we’re asked to deal with secured data, we always bring in an outside consultant to help us build our solution in a way that ensures the client gets the certification they’re going for.

Of course, I personally like to avoid PCI altogether by using a tokenization solution from an existing credit card processor.  It makes this process so much easier!

In the presentation, we’re only dealing with GPS data and passenger counts: nothing that really needed to be secured beyond user-level access.  So for the case study, we’re off to the next question.

What kind of authentication are you using?

In our case study we found the client was only using SQL security…just one account.  After discussing some of the limitations of a single shared account and developing a plan for improving that security, we were good to go!

In Azure, when it comes to the database layer, you’ve got two options: SQL Server in a virtual machine, or Windows Azure SQL Database.  If your application is already using Windows authentication and you can’t move to SQL authentication, then you’re going to be locked into using a virtual machine (as of the publishing of this post).  If you’re already using SQL authentication, or could move to it, then you can use either virtual machines or SQL Database.
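For our client, the first part of that plan looked roughly like the sketch below: one login per application role instead of one shared account.  The login, password, and role names here are purely illustrative; in Windows Azure SQL Database you create logins in the master database and then map them to users in the application database.

-- Run against the master database on the WASD server (one statement per batch).
CREATE LOGIN report_reader WITH PASSWORD = 'UseAStrongPasswordHere1!';

-- Then run against the application database.
CREATE USER report_reader FROM LOGIN report_reader;
EXEC sp_addrolemember N'db_datareader', N'report_reader';

The same pattern works on-premises today, so you can tighten up the accounts before the migration ever starts.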

The great thing about Windows Azure is it’s constantly improving!  When I started giving this presentation, the only option was SQL authentication.  Almost three months ago, Windows Authentication and Active Directory became valid options!

Final thoughts for this step

You’ve got to do some research up front to figure out whether the system you’re looking at could move to the cloud.  Chances are it could, but there are times when it doesn’t make sense, and you’ve got to be ready for those.  We’ve been asked to look at cloud-based options for manufacturing plants where there was so much electromagnetic interference that a solid internet connection just wasn’t going to be possible.

Once you’ve done this first bit of research, you’ve got your first round of documentation on who is using the system, what they’re doing when they use it, and so on.  In our next step, we’re really going to dig into the code and find out what could slow down the migration, and what might prevent it altogether.

As always, if you have any questions about your systems and if they could move to the cloud, let me know!  If you just have questions about Azure, SQL, or anything else, let me know, I’m here to help.

On-Premises To Azure Migration

2013 August 12
by Shannon Lowder

I’ve been giving this presentation all year, and I’ve tweaked it a little after each session.  Funny thing is, I never blogged that content… so I’d like to take the next few posts to share the content from those sessions in written form.  Let’s jump right in.

First, what is Azure?

What is Azure?

You’ll read and hear people talking about different cloud-based services.  You’ll find references to Platform as a Service (PaaS), and you’ll find references to Infrastructure as a Service (IaaS).  Microsoft Windows Azure has both.  You can set up “private” network segments, Active Directory domains, and virtual machines; all of these are considered Infrastructure as a Service.  You have to do a little more setup.  For example, when standing up a virtual machine you have to choose a number of cores, an amount of memory (right now it’s tied to how many cores you choose), and a number of disks to attach to that virtual machine.  Once you choose that, your service starts up.  IaaS has a drawback: you have to shut down the service to change how many resources you’ve allocated.

But then you also have Platform as a Service options, such as web sites, storage, and SQL Databases.  In this case, all you have to do is name the service, choose a class, and you’re up and running.  For example, if you go to spin up a new database on Windows Azure SQL Database, you name it, choose the Web (0–10 GB) or Business (0–150 GB per instance) edition, and you’re up.  You won’t have to shut down to grow the database.  The downside here is that you don’t get direct control over CPU, memory, or storage.
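To make that concrete, here’s a minimal sketch of what spinning up and later growing a database looks like in T-SQL against the master database of your WASD server.  The database name, edition, and sizes are illustrative, and each statement has to run as its own batch, outside a transaction.

-- Create a new Windows Azure SQL Database in the Web edition.
CREATE DATABASE SensorData (EDITION = 'web', MAXSIZE = 5 GB);

-- Need more room later? Grow it in place; no shutdown required.
ALTER DATABASE SensorData MODIFY (EDITION = 'business', MAXSIZE = 50 GB);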

There are other offerings in Windows Azure that aren’t Platform or Infrastructure as a Service.  Most of these are found in the Azure Marketplace.  You can “lease” access to data sets maintained by other companies.  For example, Melissa Data has opened up their address and email checking services through the Marketplace.  You can add this to one of your solutions and pay a unit cost for each lookup performed! This can make it quicker, cheaper, and easier to deploy than the traditional subscription models.

Windows Azure is Global, flexible, highly available!

Windows Azure has multiple data centers.  As soon as I publish this, it will be out of date, but as of today there are eight fixed data centers.  There are four in the United States: Illinois, Texas, California, and Virginia.  There are two in Asia: Hong Kong and Singapore.  There are two in Europe: Ireland and the Netherlands.  Two more are coming soon in Australia.

Not only do they have these fixed-site data centers, but they also have mobile data centers.  Think of a data center in the back of a tractor trailer!  They can ship one of these to any location with power and network and spin the whole data center up on demand.  This can be useful for large-volume data migrations.  You wouldn’t want to upload thousands of terabytes of data over your corporate internet connection, right?

These on-demand data centers could also be a response to disasters.  If a natural disaster were to strike, they could be deployed for additional coverage until the main data centers could be brought back online.  Cloud-based solutions are all about high availability!

52projects works mostly with small and medium sized companies.  Many of them start out with a single server, and slowly add more as they need extra capacity.  Some never consider the possibility that machines will fail, power will go out, disasters will happen; others don’t see how to justify the extra costs associated with planning for those kinds of failures.  One of the many reasons I’ve embraced Windows Azure is we can now bring this enterprise-level plan for high availability to these smaller companies…and we can do it for less than they are paying for their current services.  Talk about huge wins!

If you’d like to find out how we can help you take advantage of these new solutions, call or email today!  704.645.7226 slowder@52projectsinc.com