Add Azure Cache for Redis to Your Azure SQL Performance Tuning Toolbox

Published On: 2021-07-14

One of the biggest impacts on resource consumption for Azure SQL DB is repeated data pulls by the application layer. No matter how fast those queries execute, calling the same procedure or issuing the same SQL statements hundreds, thousands, or millions of times a day can wreak havoc on database performance. Death by a thousand cuts can easily bring a system to its knees. Sometimes it’s hard for DBAs to troubleshoot these actively because the statements execute so quickly they don’t even show in tools like sp_whoisactive. It’s not until you begin to dive into things like Query Performance Insights or Query Store that you start to see the real issue.

Azure Portal

SSMS Query Store Top Consuming Queries with Executions Count Metric

The question is how do you combat this issue? The code has been fine-tuned and runs at optimal performance, it’s just the volume of application calls that is causing issues. One answer is Azure Cache for Redis.

What is Azure Cache for Redis?

Simply put, it is a dedicated in-memory cache data store, based on open-source Redis, that can be accessed by applications within or outside of Azure. It enables you to load data into the in-memory store and read directly from it, reducing the number of calls to your database. Placing the data in this cache layer saves the application from making the same data calls over and over again. By shifting the load to the cache and away from the database layer, this can dramatically improve database performance, reduce latency, and free up resources for other data requests. It does require changes to your application code; however, it can significantly improve database performance. You can see an example of a fairly complex app in Microsoft docs here.
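The pattern described above is commonly called cache-aside: check the cache first, and only go to the database on a miss. Here is a minimal Python sketch of the idea. Everything in it is illustrative: the key format, the five-minute TTL, and the `fake_db_query` function are made up, and the tiny `FakeCache` class stands in for a real `redis.Redis` client pointed at your Azure Cache for Redis endpoint (it mimics the same `get`/`setex` shape as the redis-py library).

```python
import json

class FakeCache:
    """In-memory stand-in for a Redis client (same get/setex shape as redis-py)."""
    def __init__(self):
        self.store = {}
    def get(self, key):
        return self.store.get(key)
    def setex(self, key, ttl_seconds, value):
        self.store[key] = value  # a real client would also expire after ttl_seconds

def get_customer(cache, customer_id, fetch_from_db):
    """Cache-aside: return the cached row if present, otherwise query the DB and cache it."""
    key = f"customer:{customer_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)            # cache hit: no database call at all
    row = fetch_from_db(customer_id)         # cache miss: one trip to Azure SQL
    cache.setex(key, 300, json.dumps(row))   # keep it for 5 minutes (hypothetical TTL)
    return row

calls = []
def fake_db_query(customer_id):
    calls.append(customer_id)                # count how often the "database" is hit
    return {"id": customer_id, "name": "Contoso"}

cache = FakeCache()
for _ in range(1000):                        # 1,000 identical application calls...
    result = get_customer(cache, 42, fake_db_query)
print(len(calls))                            # ...but only the first one touches the database
```

The point of the sketch is the ratio: a thousand application calls become one database call, which is exactly the resource-consumption relief described above.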

This can be not only a performance gain but a monetary one as well: the reduced resource consumption may in fact allow you to scale down your Azure SQL Databases.

Getting Started

Create a resource and, under Databases, choose Azure Cache for Redis.

Pay attention to the cache type options, and be sure to click the link to the pricing tiers so you can pick the correct one for your environment. In this case I am choosing the cheapest one, Basic C0, which only gets me a 250 MB cache at an estimated $16 per month; larger ones can get a little pricey.

For Networking, you will have to choose a Public or Private Endpoint, then choose Next.

Now choose which Redis version you want; note there is a version 6 in Preview. Next takes you to the Tags options, which I skip because I have no need to tag my resources. Lastly, we Review and Create the resource. Now that we have a Redis Cache resource created, there are more steps to take before you can use it, store data in it, and access it through your applications. I’ll leave those steps to you; in this post I just wanted to show you where to find it in the portal and how to create it.


If you work within an environment that makes repeated data calls thousands of times an hour, this may be a really great resource to look into. I highly suggest you add Azure Cache for Redis to your performance tuning toolkit. You can find all the information you need to continue with the process here. Be sure to also read up on the security considerations. There is a lot of useful documentation within Microsoft docs that can be found here as well.


The post Add Azure Cache for Redis to Your Azure SQL Performance Tuning Toolbox appeared first on A Shot of SQLEspresso.

Contact the Author | Contact DCAC

Today we Welcome Bill Fellows (@billinkc) to DCAC

Published On: 2021-07-12

Today DCAC is thrilled to say that Bill Fellows is joining our team. Bill brings with him a plethora of experience in both the core engine as well as the Business Intelligence worlds.

Bill is an Architect and five-time Microsoft Data Platform MVP with a focus on converting data into actionable intelligence. He has been a database developer for the past 20 years with a focus on automation and ETL. He is the organizer of Kansas City’s eight SQL Saturdays, co-author of “The BIML Book”, and maintains the SSIS tag on StackOverflow. Bill blogs at and tweets under @billinkc.

Welcome to the team, Bill.



Calculating Age in Power BI

Published On: 2021-07-09

In week 26 of Workout Wednesday for Power BI, I asked people to calculate the age of Nobel laureates at the time they received the award. I provided some logic, but I didn’t prescribe how to create the age calculation. This inspired a couple of questions and a round of data validation as calculating age may be trickier than you think. In this post, I’ll explore some of the ways people have calculated age in Power BI and the edge cases where those calculations may not work.

In my solution video for Workout Wednesday, I used Power Query to calculate age. This was inspired by several blog posts and videos I had seen previously. There is an Age menu option in the Power Query editor under Date.

Calculating Age with the Power Query Editor user interface

When you select a date column and use that Age option, it calculates the duration between the selected date and the current date in days. You must then replace the current date with the second date column. Next you can choose Total Years under Duration, which divides the days by 365. Finally, you must round that number down to the next integer to get years.

If you follow Ruth’s video, you can do all of that in one step that creates a custom column with the final age value.

 Number.RoundDown(Duration.TotalDays([Date2] - [Date1])/365) 

That is the most common option in Power Query, as there is no DateDiff function.
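To make the behavior of that round-down-days-divided-by-365 approach concrete, here is an illustrative Python sketch (Python’s `datetime` subtraction stands in for `Duration.TotalDays`; this is just a sketch, not the Power Query code itself):

```python
from datetime import date

def age_days_over_365(date1, date2):
    # Equivalent of Number.RoundDown(Duration.TotalDays([Date2] - [Date1]) / 365)
    return (date2 - date1).days // 365

# An ordinary birthday works as expected:
print(age_days_over_365(date(1980, 6, 15), date(2021, 6, 15)))  # 41

# But Feb 29 to the following Feb 28 is exactly 365 days,
# so this method counts it as a full year:
print(age_days_over_365(date(2016, 2, 29), date(2017, 2, 28)))  # 1
```

The second case is the leap-year edge discussed below: dividing by a flat 365 treats Feb 29 to the next Feb 28 as a complete year, which may or may not match your requirements.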

There are a few options for calculating age in DAX. Some people use the DATEDIFF function.

Age DateDiff = DATEDIFF([Date1],[Date2],YEAR) 

Another way I have seen is to use YEARFRAC function.

Age YearFrac = INT ( YEARFRAC ( [Date1], [Date2], 1 ) )

The way Marco Russo suggests is to use QUOTIENT.

Age Quotient (DAX): 

Age Quotient = 

VAR Birthdate = [Date1]

VAR ThisDay = [Date2]

VAR IntBirthdate = YEAR ( Birthdate ) * 10000 + MONTH ( Birthdate ) * 100 + DAY ( Birthdate )

VAR IntThisDay = YEAR ( ThisDay ) * 10000 + MONTH ( ThisDay ) * 100 + DAY ( ThisDay )

VAR Age = QUOTIENT ( IntThisDay - IntBirthdate, 10000 )

VAR CheckedAge = DIVIDE ( Age, NOT ISBLANK ( Birthdate ) )

RETURN
    CheckedAge


As Marco points out, many people were using YEARFRAC, but there is a bug in the DAX implementation that causes it to occasionally return an incorrect answer for this purpose.
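To see why the integer-encoding trick in Marco’s calculation works, here is an illustrative Python sketch of the same logic (Python only; the DAX version above is the one you would actually use in Power BI):

```python
from datetime import date

def age_quotient(birthdate, this_day):
    # Encode each date as an integer YYYYMMDD, subtract, and integer-divide
    # by 10000: the year difference only "completes" once the month and day
    # of the birthdate have been reached in the current year.
    int_birth = birthdate.year * 10000 + birthdate.month * 100 + birthdate.day
    int_this = this_day.year * 10000 + this_day.month * 100 + this_day.day
    return (int_this - int_birth) // 10000

print(age_quotient(date(2016, 2, 29), date(2017, 2, 28)))  # 0: not yet a full year
print(age_quotient(date(2016, 2, 29), date(2020, 2, 29)))  # 4: leap day to leap day
print(age_quotient(date(2000, 3, 1), date(2021, 3, 1)))    # 21: the case YEARFRAC gets wrong
```

In the first case, 20170228 − 20160229 = 9999, which integer-divides to 0; the subtraction never misattributes a year because the month/day digits can never "borrow" more than one year’s worth of value.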

Checking the Numbers

I created a Power BI file to demonstrate the differences in these four calculations. You can download the file here. The image below displays the results in several tests. For each row, I’m using Date1 as the birthdate and Date2 as the “as of” date. You’ll notice that I focused on leap years for a few cases.

Table in Power BI with 10 date ranges showing the results from the four calculations. 6 of the 10 rows have different results across the calculations.
Example date ranges and result of the four age calculations

There are six of ten date ranges that have different results across the different calculation methods.

In the second row, the Power Query age calculation says that Feb 29 to Feb 28 in the following year is a full year. This may or may not be what you want depending on your requirements. I’m noting the difference so you can be aware. A similar thing occurs in the fifth row going from Feb 29, 2016 to Feb 28, 2020, and again on the 9th row going from March 1, 2019 to Feb 29, 2020.

On the third row, notice that the DAX DATEDIFF function calculates Feb 29 to Feb 27 of the following year to be a full year, despite it being a day or two short. Depending on what you do with leap years, you might consider Feb 29 to Feb 28 in the following year to be a full year, but that third row result means DATEDIFF is probably not the calculation I want. We see a similar result going from March 1 to Feb 28 of the following year.

YEARFRAC calculates that Feb 29 to Feb 28 in the following year is not a full year, which may be desirable. But it counts Feb 29, 2016 to Feb 29, 2020 as only three years. And we see that March 1, 2000 to March 1, 2021 is only counted as 20 years. So even without starting on a leap year, we get some incorrect results. Small numbers seem to be correct until it gets to about 13 years.

Using the QUOTIENT function provides what I consider to be the most correct results. It calculates Feb 29 to Feb 28 of the following year to be less than a year. It calculates Feb 29, 2016 to Feb 28, 2020 to be three years and not four. And it calculates March 1 to Feb 29 of the following year to be less than a year.

Which to use?

The QUOTIENT formula produces the most accurate results if you don’t want Feb 29 to Feb 28 of the next year counted as a full year. DATEDIFF and YEARFRAC produce too many incorrect results for me to ever suggest using them. Since there is a DAX option that produces more correct answers, I would just go with QUOTIENT instead of either of these two.

UPDATE: There is a better alternative! Imke Feldmann reminded me that there is a Number.IntegerDivide function in Power Query. So let’s take the logic from Marco’s DAX calculation and move it to Power Query:

(BirthDate as date, EndDate as date) =>
let
    BirthDateInt = Date.Year(BirthDate) * 10000 + Date.Month(BirthDate) * 100 + Date.Day(BirthDate),
    EndDateInt = Date.Year(EndDate) * 10000 + Date.Month(EndDate) * 100 + Date.Day(EndDate),
    Age = Number.IntegerDivide(EndDateInt - BirthDateInt, 10000)
in
    Age

The Power Query custom column created by invoking this function should produce better compression than a DAX calculated column. This might not be significant for a small dataset, but we should be efficient when we can.


Add DBCC INPUTBUFFER to Your Performance Tuning Toolbox

Published On: 2021-07-07

A command I like to use when performance tuning is DBCC INPUTBUFFER. If you have ever run sp_whoisactive or sp_who2 to find out what sessions are executing when CPU is high, for instance, this can be a real quick lifesaver. At times those two options do not return enough information for what I’m looking for, which is the associated stored procedure or object. Using this little helper along with the session id can easily get you that information.

Let’s take a look.

First, I will create a simple procedure to generate a workload.

CREATE PROCEDURE KeepRunning AS
DECLARE @i int = 0;
WHILE (@i < 1000)
BEGIN
    SELECT @@SERVERNAME;
    WAITFOR DELAY '00:00:30';
    SELECT @i = @i + 1;
END

Now I will execute the procedure and capture what it looks like, first using sp_who2 and then sp_whoisactive. Looking at the sp_who2 screenshot, all you can see is the session information, including the command and status being run, but you have no idea what procedure it is being run from.

Exec KeepRunning

Now take it a step further and let’s run sp_whoisactive. Here we get more information, such as the sql_text being run.

Take note of the session id, displayed by either tool, which in this case is 93. Now run DBCC INPUTBUFFER (93) for that session.


BINGO! We’ve now got what we needed: the name of the stored procedure that the statement is associated with.

Now let’s try one last thing. Remember I said sp_whoisactive does not give us the stored procedure name? Well, that wasn’t 100% true. There are fantastic parameter options we can use that get us even more information. Let’s run sp_whoisactive with the parameter @get_outer_command = 1. As shown in the screenshot, this gives essentially the same thing as DBCC INPUTBUFFER, returning the sql_command, i.e. the stored procedure name.


Quickly knowing the stored procedure associated with the query that is causing issues allows us to easily follow the breadcrumbs to identify the root cause. If you cannot run third-party or community tools like sp_whoisactive, DBCC INPUTBUFFER is an alternative for you. Adding this little tidbit to your performance tuning toolbox can be a real time saver, and you may find other useful ways to use it as well. Be sure to have a look.

The post Add DBCC INPUTBUFFER to Your Performance Tuning Toolbox appeared first on A Shot of SQLEspresso.


