Adaptive Joins in SQL Server

Published On: 2020-09-02

SQL Server 2017 (compatibility level 140) brought us many Intelligent Query Processing (IQP) features, formerly known as Adaptive Query Processing, that improve performance on workloads straight out of the box with no code changes to implement. One of the features introduced was Adaptive Joins. With this feature the join operator is dynamically determined at runtime by the optimizer, which defines a threshold number of rows and then chooses between a Nested Loop or Hash Match join operator. This operator switch can produce a better join and improve the performance of your queries without you having to lift a finger.

Not all queries will qualify for this new feature. The feature only applies to SELECT statements that would normally have used a Nested Loop or Hash Match; no other join types apply. In addition, the query must run in batch mode (using a columnstore index in the query) or use the SQL Server 2019 Batch Mode on Rowstore feature. To find out more about the latter, I recently blogged about Batch Mode on Rowstore here.
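If you want to confirm whether a database is even eligible, here is a minimal T-SQL sketch (using a hypothetical database named YourDatabase) for checking and setting the compatibility level:

-- Check the current compatibility level (140 = SQL Server 2017, 150 = SQL Server 2019)
SELECT name, compatibility_level
FROM sys.databases
WHERE name = 'YourDatabase';

-- Adaptive Joins require compatibility level 140 or higher
ALTER DATABASE [YourDatabase] SET COMPATIBILITY_LEVEL = 140;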

Now let us look at the difference between the two join operators the optimizer will choose from with this feature.

Hash Match – Creates an in-memory hash table from the required columns of one input, then hashes each row of the second input and looks for matches. It is relatively expensive and requires a lot of memory resources.

Nested Loop – Performs a search on the inner table for each row of the outer table. Less expensive than a Hash Match and ideal for small row inputs, it is the fastest join operator, requiring the least I/O and the fewest row comparisons.

For this feature a new operator was introduced to show us that an Adaptive Join was used, and its properties give us details on how the optimizer determined which join to use.

At runtime, if the row count is smaller than the adaptive threshold of rows, a Nested Loop is chosen; if it is larger than the threshold, a Hash Match is chosen. It is that simple. This can be great for workloads that fluctuate between small and large row inputs for the same query. Note the screenshot below: using estimations, the plan would have used a Hash Match, but during actual execution it dynamically changed to a Nested Loop.
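To make that concrete, here is a minimal sketch of the kind of query that can benefit. The table and column names are hypothetical; the assumption is that Sales.Orders has a columnstore index (so the query runs in batch mode) and that the number of rows the WHERE clause returns varies widely between executions:

SELECT c.CustomerName, SUM(o.Amount) AS TotalSales
FROM Sales.Orders AS o
    INNER JOIN Sales.Customers AS c
        ON c.CustomerID = o.CustomerID
WHERE o.OrderDate >= '2020-08-01'   -- a narrow range favors Nested Loops, a wide range favors Hash Match
GROUP BY c.CustomerName;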

The adaptive threshold of rows is determined based on operator cost. The optimizer evaluates the cost of each join operator for the join operation; where those costs intersect (the row-count tipping point) is what it uses as the threshold. Microsoft Docs gives us a good image of this.

As with any SQL Server feature, you have the ability to disable it if you find it is not providing any performance gains or is causing query regressions in your environment.

-- SQL Server 2017

ALTER DATABASE SCOPED CONFIGURATION SET DISABLE_BATCH_MODE_ADAPTIVE_JOINS = ON;

-- Azure SQL Database, SQL Server 2019 and higher

ALTER DATABASE SCOPED CONFIGURATION SET BATCH_MODE_ADAPTIVE_JOINS = OFF;
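If you would rather leave the database-scoped configuration alone and only disable the behavior for a single troublesome query, the same setting is also available as a query hint. A minimal sketch, again using hypothetical tables:

SELECT c.CustomerName, o.Amount
FROM Sales.Orders AS o
    INNER JOIN Sales.Customers AS c
        ON c.CustomerID = o.CustomerID
OPTION (USE HINT('DISABLE_BATCH_MODE_ADAPTIVE_JOINS'));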

Intelligent Query Processing gives us many new “auto” fixes for our queries, and Adaptive Joins is one that has piqued my interest as a database administrator who loves performance tuning. Improper join choices made by the optimizer can really hinder performance, which would likely require me to implement query hints or plan guides. I really like that SQL Server is now making automatic, intelligent decisions and fixing these things for me on the fly without my intervention.


How to Quickly Change Azure SQL Database Service Level Objectives

Published On: 2020-08-28

Sometimes as a DBA, I am lazy and want the ability to execute all of my tasks in one place. Lucky for me, I discovered the other day that I can change my Azure SQL Database Service Level Objective options within SQL Server Management Studio (SSMS) without ever having to go to the Azure Portal. By right-clicking on your database, choosing Properties, and selecting the Configure SQL page, you can change your Subscription and Azure Location. Even better, and what is really cool, is that I can also scale my Edition (which is generally referred to as Service Tier outside of SSMS), Size, and Storage Tiers up and down as well.

You will have to authenticate to Azure the first time you use this.

Once authenticated you will see the below options available.

The dropdown will dynamically change depending on which Service Tier you have selected, such as Basic or Premium, or the more current General Purpose or Business Critical. As you can see in the example above, it is currently set to Premium, so I see only the “P” level objectives (compute and memory levels). However, if I changed my Edition to anything else, the Service Level Objective list would change accordingly. Azure changes frequently, as do the offerings. SSMS makes a call to the SQL resource provider in Azure to get the offerings, so it should always be current, though it may look different than this screenshot. If the option you want is not in the dropdown, you have the ability to simply type the value.

The Max Size option will allow you to see your database's current maximum storage size or scale it up and down when needed. Leaving it blank will set it to the default size for the edition and service level objective.

Clicking OK will implement these changes, and they may cause a brief (should be minimal) downtime event, so be careful. Another thing worth noting is the permissions required to alter a database; per Microsoft Docs, the login must be either:

  • the server-level principal login
  • a member of the dbmanager database role in master
  • a member of the db_owner database role in the current database, or dbo of the database

If you are GUI averse, you can also script these changes out to T-SQL and run those. For example, this script changes the Service Level Objective (SLO) to a Premium P2.

ALTER DATABASE [YOUR DATABASE] MODIFY (SERVICE_OBJECTIVE = 'P2');

GO
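The same statement can also change the edition and max size at the same time, mirroring what the SSMS dialog does. A minimal sketch with placeholder values:

ALTER DATABASE [YOUR DATABASE] MODIFY (EDITION = 'Premium', SERVICE_OBJECTIVE = 'P2', MAXSIZE = 250 GB);

GO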

Once again Microsoft has catered to lazy DBAs like me that want everything in one place. This is another one of those things I am grateful to have the ability to do in SSMS.


What Does Billing Look Like When Changing to an Azure CSP

Published On: 2020-08-26

Billing in Microsoft Azure is a strange beast to get your head around, so it is perfectly normal for questions to come up when moving from Pay As You Go (PAYG) billing in Azure to Cloud Solution Provider (CSP) billing.


PAYG Billing

Azure PAYG and CSP both bill in arrears, but the time frames in which the bill comes out are very different. With a PAYG subscription, Microsoft can generate the invoice up to one month after the usage, which is when they bill your credit card. So for example (using the invoice that I’m looking at right now), an invoice dated August 5, 2020, has usage for the subscription from June 6, 2020, through July 5, 2020. So even if you stopped using the services on July 2, 2020, you’ll still be getting invoiced on August 5th.

What makes these dates look even more strange is that the date the customer is billed is set based on the first subscription the customer had. So if there are multiple subscriptions, each one will have its own usage window, but they will all invoice on the same day. This can make figuring out how the billing breaks down much harder.

Another subscription for this same customer, also billed on August 5, 2020, was for all usage between June 18, 2020, and July 17, 2020. So again, if the service stopped being used on July 2, 2020, Azure won’t bill for that usage until August 5th.

CSP Billing

With CSP billing, the billing is still in arrears, but the time window is much shorter. With a CSP, such as Denny Cherry & Associates Consulting, the billing happens a few days after the end of the month. So in the case of our example, any usage from July 1, 2020, through July 31, 2020, is invoiced on about August 5, 2020 (we typically get our bills between the 5th and the 8th, and we work to get them sent out the day we get them).

Overlap

This can create what appears to be an overlap, since in this case the customer got two bills on August 5, 2020: one from Microsoft for the PAYG usage, and one from DCAC for the CSP usage. But the usage dates are what is important here. The PAYG usage is from June 6, 2020 – July 5, 2020, while the CSP usage is from July 1, 2020 – July 31, 2020 (the services were moved from one subscription to another, so only one service is up and running at a time). It’s because the Microsoft billing shows up a month late that it appears at first glance that double billing is happening.

Looking at the first charges for the subscription can explain some of this. The subscription in question started recording usage for the period August 5, 2019 – September 4, 2019, but the first invoice wasn’t generated for another month, on October 5, 2019; over two months after the usage first started on the subscription. Since it took that long to start billing, it makes sense that it will take some time before the billing from Microsoft finishes on the PAYG subscription.

What this means is that in the first month or two while services are being moved from a PAYG subscription to a CSP subscription it’s going to look like double billing is going on, but it isn’t. It’s because the billing dates are very different, and Microsoft is billing about one month after the usage period ends, while the CSP is billing just a couple of days after the usage period ends.

Hopefully, this helps clarify what some people are seeing on their Azure bills as they make the transition from PAYG to CSP. We will of course help you navigate through this as we move clients from PAYG billing to CSP billing.

More Questions?

If you’ve got more questions about this, or you want to move over to a CSP, contact us and we’ll help get the process going.

Denny


Adding an Azure Active Directory User to Azure SQL Database with Automation

Published On: 2020-08-25

My teammate Meagan (b|t) messaged me in Teams yesterday afternoon to say “Joey, the client created a new database using your automated process, and my ETL user (which is an AAD user) didn’t get created, can you fix it?” Well, after a quick perusal of emails I remembered that I had asked the client to add the create user process to their initial population process, which hadn’t occurred yet. The reason I did this was that creating an Azure Active Directory user in an Azure SQL Database from Azure Automation was painful and maybe not even possible. So I pinged Rob Sewell (b|t) about the best way to do that. This sounded not that bad to do, but I managed to hit land mines around every corner.

The First Problem

Azure Automation is mostly PowerShell only (there is a Python option, but I’ve never used it, and I’m not going to start now). The trick with PowerShell is that it’s great for things you have to do to Azure resources; it’s far less good for things you have to do inside of databases (think creating a user). I typically use the invoke-sqlcmd cmdlet; however, we have a chicken-and-egg problem: I can’t create an AAD user from a SQL connection (a connection made using a SQL login), and invoke-sqlcmd doesn’t support authenticating with AAD. The Azure Automation service allows you to import third-party solutions from the PowerShell Gallery, so you can use dbatools, which I did here. Rob has an excellent blog post here that describes this process.

# Retrieve the service principal app id, client secret, and tenant id from Azure Key Vault
$appid = (Get-AzKeyVaultSecret -VaultName "beard-key-vault" -Name "service-principal-guid").SecretValueText

$Clientsecret = (Get-AzKeyVaultSecret -VaultName "beard-key-vault" -Name "service-principal-secret").SecretValue

$credential = New-Object System.Management.Automation.PSCredential ($appid, $Clientsecret)

$tenantid = (Get-AzKeyVaultSecret -VaultName "beard-key-vault" -Name "Sewells-Tenant-Id").SecretValueText

# Connect to the Azure SQL Database as the service principal and run the query
# ($SqlInstance, $databasename, and $query are set earlier in the runbook)
$AzureSQL = Connect-DbaInstance -SqlInstance $SqlInstance -Database $databasename -SqlCredential $credential -Tenant $tenantid -TrustServerCertificate

Invoke-DbaQuery -SqlInstance $AzureSQL -Database $databasename -SqlCredential $credential -Query $query

The code, which I happily stole from Rob’s blog, allows me to connect as a service principal. To easily facilitate this I made my Automation account part of my DBA group (the Azure AD admin group for the Azure SQL server), which you can assign without this ridiculous process. I threatened to add Meagan’s ETL user to that group, but she was going to out me on Twitter.
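For reference, the $query variable in the snippet above holds the T-SQL that actually creates the AAD user. A minimal sketch, with a hypothetical user name and role, looks like this:

-- Hypothetical AAD user and role; the user name must match the AAD object
CREATE USER [etl-user@yourdomain.com] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [etl-user@yourdomain.com];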

After running that code I could connect from my Automation Run As account to my Azure SQL DB, but my query was failing with the following error:

I’m logged in as a service principal there (hence the weird GUID). You can see that I have basically every privilege in SQL Server, but I can’t create a user from an external provider; PowerShell (and Automation) say that the user could not be resolved.

The Next Land Mine

So I DMed Rob and asked him WTF? It turns out that for this to work, you need to create a service principal for your Azure SQL Database. If you aren’t familiar with service principals, they are analogous to service accounts in an on-premises world. Doing this was the easiest step in the process; I have a PoSH script to hit every server in my subscription, and it was trivial to add a service principal there as well as to add it to my database runbook. However, that was just the first part.

You have to give the service principal the “Directory Readers” permission in Azure AD, and the effective way to do this with Automation is to assign that privilege to a group. Well, it turns out that adding AAD roles to a group is a relatively new feature (it’s in preview) and, more importantly, requires P1 or P2 Azure Active Directory, which has a per-user cost. Which meant I needed to get approval. After much chatter on a DCAC Teams channel I discovered that since this feature is not user-assigned (i.e., it’s enabled for the entire AAD tenant once it’s enabled), I only had to have one AAD license in the tenant (I assigned it to Meagan). Once that was in place, I could grant the directory permission to the SQL Server service principals group.

Are We Done Yet?

I should have noticed, in the documentation provided by the SQL team on assigning groups with PowerShell, that there was a reference to the preview PowerShell module for Azure AD (I did, but I didn’t think it mattered because I was just assigning a user to a group). So I thought I had everything wired up when I started getting the following error:

Add-AzureADGroupMember -ObjectId $g.objectid  -RefObjectId $miIdentity.ObjectId

Add-AzureADGroupMember: Error occurred while executing AddGroupMember

Code: Authorization_RequestDenied

Message: Insufficient privileges to complete the operation.

RequestId: 62301512-a42f-4d00-a798-4e1bd4062df8

DateTimeStamp: Tue, 25 Aug 2020 13:14:08 GMT

HttpStatusCode: Forbidden

HttpStatusDescription: Forbidden

HttpResponseStatus: Completed

I have Global Admin and Subscription Owner in the two environments I was testing in, so clearly this wasn’t a permissions issue. To further prove that point, I was able to add the service accounts I had created to the group through the Azure portal. So after writing about three emails with my further discoveries to the Azure MVP distribution list (I could add the service principal to a regular group, just not one with a role assigned to it), I went back and decided to play with that preview module.

Everything up to this point is me being an idiot, but I’m going to yell at Microsoft for a second. I couldn’t install the AzureADPreview module on my Mac because it’s dependent on WinForms; I thought Az modules were all supposed to be built on .NET Core. I also couldn’t get it to run in Cloud Shell, which may be related to the WinForms thing, or not.

I do have a Windows VM, so I installed the module there, and it successfully worked on the DCAC tenant.  I went to Azure Automation to install the module. If you’ve never imported a module into Azure Automation, you should know that the portal notification about a module import being complete is meaningless, because Azure Automation is a lying liar who lies.

Look on the modules page and hit refresh a lot. It usually takes 1-2 minutes for a module to import. I messaged Kerry in Teams.

And what do you know? It worked. I was concerned and about ready to murder someone, but it worked. Rob’s code is really helpful, and he covers Key Vault in his post. I did have some open GUIDs in some of my code pictures; it’s fine, those aren’t sensitive. However, you should store all your secrets in Key Vault as it’s fast and easy.

The other thing I learned in this process is that you can now make a guest user your Azure Active Directory Admin (this means I could make joey@dcac.com or jdanton1@yahoo.com an admin in the joeydantoni.com tenant), which you weren’t able to do before. Prior to this you could use a group and add a guest user to that group, as I mentioned above. (Note: you should still use a group and not a single user, as it’s a best practice.)
