My SQL Saturday Chicago Precon–Managing and Architecting Azure Data Platform

Published On: 2019-01-16

After the MVP Summit in March, I'm headed to Chicago to speak at SQL Saturday Chicago, and on Friday, March 22nd, I'll be delivering an all-day training session on the Azure Data Platform. "Data platform" is somewhat of a Microsoft marketing term, but we will talk about a wide variety of topics that will help you get up to speed on Azure.

All of the morning and some of the afternoon will be spent talking about the core infrastructure of Azure. You'll learn about topics like:

• Networking
• Storage
• Virtual Machines

While these topics are normally outside the scope of the DBA, in the cloud you will have to at least understand them. Want to build an Availability Group in Azure? You'll need to build an internal load balancer and map probe ports into your VMs. Remember how you normally complain to the SAN team about your lack of IOPS? In the cloud, you can fix that yourself. You'll also learn what's different about managing SQL Server in the Azure environment.

In the afternoon, we'll spend our time talking about platform as a service (PaaS) offerings from Microsoft. While we will spend most of our time talking about Azure SQL Database and Azure SQL Managed Instance, I'll also spend some time talking about other offerings like Cosmos DB, and when it is appropriate to use them.

It will be a packed day, so put your learning hat on. You can register at Eventbrite here—there are five discounted tickets remaining.

When is the Right Time to Look at New Services?

Published On: 2019-01-14

Microsoft Azure is rolling out new features at a fantastic speed.  But when is the right time to evaluate those new features?  It might be right as a feature is released, or it might be later in its lifecycle.  The basic answer to when to look at using new services is: it depends.

If the company has a problem that needs to be solved and that can't be addressed today using available technology, or if a solution built today could be made better, then a new feature or new service might be worth looking into.

If there’s nothing that needs to be solved, then the new feature isn’t helping you do anything. The feature is just shiny and new.

What it comes down to is this: can the new feature potentially solve a problem that you or the company is having?  If it can't solve the problem, then the new feature isn't going to help you.  Answering that question will mean some research into the new feature to see whether it's the right solution or not.  But if the feature being researched turns out not to be a solution to the problem (or a better solution than what you have today), then it's time to move on to another solution.

All too often, companies decide that they are going to use a particular product for a problem, no matter what the right solution might be, even if the product they have decided must be used costs thousands or even millions of dollars a year.

Selecting a solution to a problem gets more complicated when office politics are involved.  All too often, someone from upper management will decide that some product needs to be included in the solution.  This isn't a new problem, either.

Back in the early 2000s, I was tasked with building a new knowledge base for a department at the company I worked for.  XML was getting popular and was being mentioned in all the magazines.  The knowledge base was supposed to be a normal relational database so that people could look things up.  A senior manager wanted to use XML instead of a relational database.  I refused, because XML wouldn't perform well, XML wouldn't scale well, and it would make no sense to build a database as a single XML file (which is what he wanted).  He insisted we use XML, and I asked him whether he wanted to use XML or wanted the application to scale and perform well.  It took a while, but eventually he saw reason, and we used SQL Server for the relational data of the application.  And, shockingly, the application was used successfully by thousands of employees on a daily basis.

What it came to show is that applications should be built to be successful, not to use some shiny bit of new technology.

Denny


My 2018 Blogging By The Numbers

Published On: 2019-01-06

2018 was a great year for blogging for me.  A decent number of people read the articles that I posted this year.  My numbers are a bit off, as there were some issues I didn't notice when I upgraded the WordPress plugin that counts all the views.  The plugin recorded about 100k views of my posts; based on the stats that were recorded, my estimated page views are about 150k for the year.  Given the recovery this year, the lower number of page views makes sense.

The most popular post was Difference between an Index and a Primary Key, with the next most popular being my post on what MSDTC is, titled What exactly is MSDTC, and when do I need it?

Other popular posts covered SQL Server Replication, Microsoft Ignite announcements, SQL Server NUMA nodes, SQL Server NOLOCK, and how important Disaster Recovery is.

That's a pretty wide spread of topics, but it gives me some idea of what people are interested in reading about.

Here's to a great 2019.

Denny


Azure – Backing up to URL

Published On: 2019-01-04

In a previous post, I talked about some of the available storage options that can be used to back up your databases directly to the cloud.  Doing this is a really good way to get your backups off-site and replicated in a single motion, for minimal cost, depending on the region and redundancy option selected.

So how do you back up to URL, meaning to the cloud?  I'll show you.  Before we start, this assumes that you have a storage account in Azure set up and configured.  If not, go do that first and then come back.  You will also need SQL Server 2016 or higher in order to make this work.

Before you start backing up to Azure, you will need to decide on whether you want to use block or page blobs for your storage.  As I mentioned in my previous post, using block blobs is recommended for cost as well as performance.  However, you can use either.

With storage in Azure, the method used to back up the database will actually determine which storage type is used.  In either case, you must use a credential.  If you use a credential with a Shared Access Signature, the backup will be a block blob.  If you use a credential with a storage account access key, it will be a page blob.

If you look at the blade for the storage account in the Azure portal, you will see options for both Access Keys and the Shared Access Signature.

Page Blobs

Azure allows you to have two access keys which allows you to rotate keys as needed when using page blobs.  You will need one of these keys in order to successfully back up using page blobs.

 

As with any keys, you want to safeguard them.  If someone has a key and knows the storage account name, they could gain access.  In short, rotate your keys regularly.

Once you have the key, you can create the CREDENTIAL.

CREATE CREDENTIAL [sqlbackups] WITH IDENTITY = 'backupsdemo' 
,SECRET = '<mystorageaccountaccesskey>';
GO

The identity value is the name of your storage account, and the secret is the key that you obtained from the portal.  Once the credential has been created, you can then back up your database.

BACKUP DATABASE ScratchDemo 
TO URL = 'https://backupsdemo.blob.core.windows.net/sqlbackups/ScratchDemo_pageblob.bak'
     WITH CREDENTIAL = 'sqlbackups';
GO
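
When you do rotate the storage account keys, any credential based on the old key will stop working until it is updated.  Here's a minimal sketch of refreshing the credential after a rotation, assuming the credential created above (the new key value is a placeholder):

-- Point the existing credential at the newly regenerated access key
ALTER CREDENTIAL [sqlbackups] WITH IDENTITY = 'backupsdemo'
,SECRET = '<mynewstorageaccountaccesskey>';
GO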

Block Blobs

With block blobs, using the Shared Access Signature (SAS) is a little different.  When you configure the signature, you will specify a time frame in which access will be allowed.  You also need to specify an IP address or range.   Once the signature has been created, you’ll see a bunch of options like what is shown below:

Keep in mind that generating the shared access signature utilizes the access keys; you can specify either key1 or key2 for this process.  If you later regenerate the key you used, you will need to regenerate the SAS token and update the credential.  Once the SAS token is generated, a connection string, the token itself, and various URL endpoints will be displayed, as shown below.

In order to use the signature for backups, we need to copy the SAS token; we will use this token to create the corresponding credential.  One thing to note: the generated SAS token will begin with a '?', which SQL Server does not expect in the secret, so remove the '?' from the token when you create the credential.

CREATE CREDENTIAL [https://backupsdemo.blob.core.windows.net/sqlbackups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE', 
SECRET = '<SAS_TOKEN_GOES_HERE>';
GO

The credential name needs to be the URL that points to your storage account name and subsequent container.  The “Secret” is the SAS token that we gathered from the portal after generating the shared access signature (SAS).

BACKUP DATABASE ScratchDemo
TO URL = 'https://backupsdemo.blob.core.windows.net/sqlbackups/ScratchDemo_blockblob.bak'
GO
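
A nice side benefit of block blobs is that a backup can be striped across multiple URLs, which can help with larger databases.  Here's a rough sketch, assuming the same container and SAS credential as above (the stripe file names are made up for illustration):

-- Stripe the backup across two block blobs in the same container
BACKUP DATABASE ScratchDemo
TO URL = 'https://backupsdemo.blob.core.windows.net/sqlbackups/ScratchDemo_stripe1.bak',
   URL = 'https://backupsdemo.blob.core.windows.net/sqlbackups/ScratchDemo_stripe2.bak'
WITH COMPRESSION;
GO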

Note that it is completely possible to have credentials for both types of storage (block and page) created.  The page blob credential can be named anything; however, the block blob credential must be named the URL of the storage endpoint, including the container name.

Verification

Once the database has been backed up, you can see the files in the Azure portal.
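
You can also check the backup from SQL Server itself without downloading anything.  As a minimal sketch, assuming the block blob backup and SAS credential created above:

-- Read the backup header straight from the URL to confirm the file is a readable backup
RESTORE HEADERONLY
FROM URL = 'https://backupsdemo.blob.core.windows.net/sqlbackups/ScratchDemo_blockblob.bak';
GO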

Note that I have these stored in Cool storage, so unless I want to pay a penalty for early deletion, they will sit there for at least 30 days.  In a future post, I'll show you how you can remove the files using PowerShell.  If you are using Ola Hallengren's maintenance solution (which we recommend), it fully supports backing up to URL.
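
As a rough sketch of what that can look like, assuming Ola's DatabaseBackup procedure is installed and the SAS-based credential from above is in place (the parameter values here are purely illustrative):

-- Full backups of all user databases straight to the container
EXECUTE dbo.DatabaseBackup
     @Databases = 'USER_DATABASES',
     @URL = 'https://backupsdemo.blob.core.windows.net/sqlbackups',
     @BackupType = 'FULL',
     @Compress = 'Y';
GO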

Summary

Ensuring your backups are safe and sound is a primary tenet of being a DBA.  Backing up to URL is a great way to accomplish this.  For a minimal cost, your backups can be off-site or even redundant across the country.  Keep in mind, however, that even with Azure, things fail.  If and when your internet connectivity drops, make sure that you have a backup plan to handle temporary on-site backups until connectivity is restored.

Happy backing up!

© 2018 – 2019, John Morehouse. All rights reserved.
