Coding Standards Gone Bad in SQL Server

Published On: 2020-04-21

Knowing your data is very important when it comes to writing code. Now, I’ll admit that I am very far from being a developer; however, as a DBA, I spend much of my day performance tuning code. In doing so, I get to see many ways code can introduce excess database reads.

One of the most recent things I have come across has to do with NULLs. The environment I was working in had no default values in their table design, so it was riddled with NULL values. Over the years they had implemented coding standards to try to mitigate these NULL values within their code.

In every column search or join, they would start with (VALUE IS NULL OR (VALUE IS NOT NULL AND VALUE = @parameter)), as one example. Adding this syntax to a query reduces SARGability, causing a lot of unnecessary reads and overhead. This is especially true when the column definition is set to NOT NULL and the check was put in place only as part of a standard. For clarification, SARGability (Search ARGument ability) just means searchability: the ability to seek within an index for faster performance.
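To make that concrete, here is a minimal sketch of the pattern against the Invoices table we will query below (@parameter is hypothetical, and the column is declared NOT NULL):

DECLARE @parameter INT = 7;

-- The standard as written: the redundant NULL logic and the OR can make it
-- harder for the optimizer to seek on an index over the column.
SELECT InvoiceID
FROM [WideWorldImporters].[Sales].[Invoices]
WHERE (SalespersonPersonID IS NULL
       OR (SalespersonPersonID IS NOT NULL AND SalespersonPersonID = @parameter));

-- Sargable equivalent: on a NOT NULL column the extra checks add nothing,
-- so a plain equality predicate lets the optimizer seek.
SELECT InvoiceID
FROM [WideWorldImporters].[Sales].[Invoices]
WHERE SalespersonPersonID = @parameter;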

Another practice I’ve seen lately is the use of the LTRIM and RTRIM functions as a standard cleanup method. First, I highly recommend doing this on the application side, or within whatever insert method you use, as the data is being populated into your tables. It is a much better practice to clean up the data once rather than coding for the cleanup in every WHERE, JOIN, or CASE statement in each query you write.
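As a rough sketch of what that one-time cleanup might look like (assuming the ConfirmedReceivedBy column discussed below, and that a one-off update is acceptable in your environment):

UPDATE [WideWorldImporters].[Sales].[Invoices]
SET ConfirmedReceivedBy = LTRIM(RTRIM(ConfirmedReceivedBy))
WHERE ConfirmedReceivedBy IS NOT NULL
  -- DATALENGTH is used because = comparisons in SQL Server ignore trailing spaces
  AND DATALENGTH(ConfirmedReceivedBy) <> DATALENGTH(LTRIM(RTRIM(ConfirmedReceivedBy)));

From then on, trim values once as they are inserted, and every WHERE, JOIN, and CASE can compare the column directly.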

Let’s look at a simple query that incorporates both of these methods. Note: we will be turning on STATISTICS IO, TIME to look at our performance improvements.

First, let’s look at our table design and note a few things. We will be querying the WideWorldImporters Sales.Invoices table using SalespersonPersonID, which is an INT set to NOT NULL; IsCreditNote, a BIT set to NOT NULL; and ConfirmedReceivedBy, which is an NVARCHAR(4000) that allows NULLs.
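For reference, here is an abbreviated sketch of just those columns (hypothetical; the real table contains many more columns and constraints):

-- Abbreviated sketch of the columns under discussion, not the full definition
CREATE TABLE Sales.Invoices
(
    InvoiceID            INT            NOT NULL,
    SalespersonPersonID  INT            NOT NULL,  -- NOT NULL: a NULL check is redundant
    IsCreditNote         BIT            NOT NULL,  -- NOT NULL: a NULL check is redundant
    ConfirmedReceivedBy  NVARCHAR(4000) NULL       -- nullable free-form text
);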

After looking at our table, let’s take note of the WHERE clause. The first check validates whether IsCreditNote IS NOT NULL, then we check SalespersonPersonID for a specific value, and lastly, we are cleaning up our free-form field ConfirmedReceivedBy to match a specific person’s name.

SET STATISTICS IO, TIME ON;

SELECT BillToCustomerID, OrderID, InvoiceDate
FROM [WideWorldImporters].[Sales].[Invoices]
WHERE IsCreditNote IS NOT NULL
  AND SalespersonPersonID = 7
  AND LTRIM(RTRIM(ConfirmedReceivedBy)) = 'Aile Mae';

Let’s run this and see what the stats and query plan show us.

As I do with any performance tuning, the first things I note are CPU time and elapsed time; then I note table scans and reads. We will use these metrics to measure our improvements. I would also like you to take note of the tables used to complete the query: there is the Invoices table, along with a Workfile and a Worktable it needed in order to return the desired results.

Next, let’s look at our execution plan and note our seeks and scans. You can clearly see we are getting an Index Seek on the Primary Key as well as a Scan on our index for ConfirmedReceivedBy. It’s also letting us know that we have a missing index, but let’s ignore that for now.

Now that we have some performance data, let’s look at some changes we can implement to make it behave better.

Since we looked at our table design, we can comment out the IsCreditNote IS NOT NULL check, simply because the column does not allow NULL values, so there is no need to check for them.

SELECT BillToCustomerID, OrderID, InvoiceDate
FROM [WideWorldImporters].[Sales].[Invoices]
WHERE --IsCreditNote IS NOT NULL
  --AND
  SalespersonPersonID = 7
  AND LTRIM(RTRIM(ConfirmedReceivedBy)) = 'Aile Mae';

Executing the query again, you’ll note that the plan remains the same, but we see a significant reduction in CPU time and elapsed time just by removing unnecessary checks.

Finally, we will execute the query again, this time assuming our application handled the cleanup of our values prior to insertion for the ConfirmedReceivedBy column. We will remove the LTRIM and RTRIM functions, allowing the optimizer to seek on our index instead of scanning all the values.

SELECT BillToCustomerID, OrderID, InvoiceDate
FROM [WideWorldImporters].[Sales].[Invoices]
WHERE --IsCreditNote IS NOT NULL
  --AND
  SalespersonPersonID = 7
  --AND LTRIM(RTRIM(ConfirmedReceivedBy)) = 'Aile Mae'
  AND ConfirmedReceivedBy = 'Aile Mae';

Now you can see a huge improvement. We are now doing only 2 scans with 50 logical reads versus 422, and CPU time is now 0 compared to 47ms. Our elapsed time even dropped from 230ms to 127ms, which is over 100ms less. Furthermore, note that the missing index suggestion is now gone. It is important that you don’t just add indexes because the optimizer suggests them; wait until you are done tuning your code before addressing your index needs. Lastly, note that by making these changes the optimizer no longer requires the use of worktables or workfiles.

There is still one last thing we can do to improve performance, which is out of the scope and purpose of this blog, but let’s do it anyway: get rid of the Key Lookup. You can read more about what those are in my blog here. After adding the required included columns, you can see how much cleaner and faster this simple query now runs. We now run with only 20 logical reads, 0 CPU time, and 90ms elapsed time. That’s a 264.44% performance improvement.
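For reference, the fix looks something like the sketch below; the index name and exact column list are assumptions based on this query, not the optimizer’s literal suggestion:

-- Hypothetical covering index: key on the filtered columns, with the selected
-- columns included so the plan no longer needs a Key Lookup.
CREATE NONCLUSTERED INDEX IX_Invoices_Salesperson_ConfirmedReceivedBy
    ON [Sales].[Invoices] (SalespersonPersonID, ConfirmedReceivedBy)
    INCLUDE (BillToCustomerID, OrderID, InvoiceDate);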

This was a simple tuning exercise to show how implementing coding standards, without making sure they are applicable to your data, can cost you performance. As a developer, you should always know your data. What works with one set of data may not work with another; it is very important to know the differences and code for them.


Moving Your SQL Workload to the Cloud   

Published On: 2020-03-25

Every day, more IT organizations decide to move their SQL Server databases to Azure. In fact, over a million on-premises SQL Server databases have been moved to Azure. There’s an interesting blog about how Microsoft is faster and cheaper than its competitors that’s worth a read. To assist with your move, Microsoft offers a number of migration tools and services to make it as smooth as possible, which I think contributes to their success. Two of those options are below with some informational links.

If you’re migrating a number of large SQL Server instances, Azure Database Migration Service is the best way to migrate databases to Azure at scale.

You can also automate the database migrations using the Azure Database Migration Service PowerShell commands.

Azure Database Migration Service (DMS) is a fully managed service for migrating multiple database sources to Azure data platforms at scale. It supports SQL Server 2005 through SQL Server 2019, as well as a variety of other source-target pairs. Azure DMS enables a seamless migration with minimal downtime or effort.

Let’s see how it works.

Working with Azure Database Migration Service

A typical database migration using Azure Database Migration Service consists of the following steps:

  1. Perform a migration assessment using Data Migration Assistant (DMA). Note any migration blockers that DMA reports and perform recommended remediations.
  2. If you need to assess the whole data estate and find the relative readiness of the databases migrating to Azure SQL, follow the steps provided here.
  3. Once you find the Azure SQL target, use Azure Database Migration Service to find the optimal Azure SQL target SKU that meets your performance needs.
  4. Create your target databases.
  5. Create an instance of Azure Database Migration Service.
  6. Create a migration project specifying the source databases, target databases, and the tables to migrate.
  7. Start the full load. Note: Select Online mode to include transaction log backups.
  8. Pick the subsequent validation.
  9. Perform a manual cutover of your production environment to the new Azure database.

Offline vs. Online Migration

The amount of downtime required is always an important factor in the decision of whether to move to the cloud. Azure Database Migration Service allows you to do either an offline or an online migration. The difference is the amount of downtime: with an offline migration, downtime begins when the migration starts; with an online migration, downtime is limited to the time required to cut over to the new environment at the end of the migration.
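One related prerequisite worth noting: online mode replays transaction log backups, so source databases generally need to be in the FULL recovery model. Here is a quick check you can run on the source instance (a sketch of one common prerequisite, not the full list from the DMS documentation):

-- Databases not in FULL recovery can't ship the log backups an online migration needs
SELECT name, recovery_model_desc
FROM sys.databases
WHERE database_id > 4                       -- skip system databases
  AND recovery_model_desc <> 'FULL';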

Microsoft recommends testing an offline migration first. If you can’t tolerate the downtime, then switch to an online migration. It’s important to note that an online migration requires an instance from the Premium pricing tier. However, Microsoft makes it easy for you by offering that tier free for the first six months.

Summary

Azure Database Migration Service takes a lot of the uncertainty out of migrating large database instances at scale. It migrates your databases with ease using Microsoft’s best practices. Even better, Microsoft continually invests in migrations by working to improve reliability and performance, as well as adding source/target pairs. I can easily see why they continue to lead in cloud technologies.

To learn more about moving your databases to the cloud, here is a great resource: “Future-Proof Your Data Infrastructure with Azure: A Business Case for Database Administrators.”


Being a Woman in the SQL Community

Published On: 2020-02-26

In celebration of Women’s History Month starting next week, I was asked to write this blog about my experiences as a woman in the SQL community, and it really got me thinking. At first, I thought I should be very politically correct and only talk about the great stuff and hype up all the wonderful experiences I’ve had, but then I thought that wouldn’t give a true picture. So, I am writing this as openly and honestly as I can. Here it goes.

The Ugly

I am going to start with the negatives to get those out of the way and draw attention to things that are still happening not only to me but to other women in this community. I don’t want this to be a gripe piece, since that’s not what I do. It is intended to be an honest accounting of my experience. Your experiences will vary, and I am not speaking for anyone else, but know I am not the only woman to have these experiences. I will address the bad and then move on to the fantastic things that this community has to offer for women and what I have been able to achieve because of the opportunities this community has afforded to me.

These are the things that suck about being a woman in the SQL Community that have had an effect on me. I am just going to list them without detail; again, I am just calling these out so others are aware these things happen.

Seeing Diversity and Inclusion Panels with no women

Having men assume I am there with my husband

Being hit on by random “community” guys in person, on Twitter, on LinkedIn, and on my website

Being stalked by other male attendees

Being physically assaulted by male attendees (being touched without permission)

Someone assuming I was only selected to speak BECAUSE they needed women

Not being chosen for something BECAUSE I was a woman

Several people implying I was given the Microsoft MVP award only BECAUSE I was a woman

Needing a male to repeat what I said so my ideas/solution etc. would be heard

Having a male think they need to speak for me

Having people not accept that men and women in the community can be good friends and nothing more

Some assuming I got my current job only because they needed a woman on staff, not because of merit

The Good

Thankfully, unlike other communities, the SQL Server community is very welcoming to women. The benefits of being a member of it FAR outweigh the negatives. This community has helped me achieve so much. I am grateful for all the community has afforded me, and I give back as much as I can because of it.

Here are some of the opportunities and experiences I’ve gotten, not just because I am a woman but because of being an active part of the SQL community.

Building a network of smart, strong, technical women that you can rely on for questions, support, and feedback. Knowing they’ve been there, done that too.

Getting a chance to sit on WIT panels regularly with topics dealing with gender issues or career advice

Running a user group

Being a Regional Mentor

Running a SQL Saturday

Being a speaker at conferences and SQL Saturdays

Ability to mentor younger women

Being amplified as an expert in my field (this is HUGE as a woman)

Being a role model as a successful woman to my daughters by being part of this community and giving back

Being an advocate for other women

Writing and being a published blogger

Seeing more and more men attend WIT panels and speak up

Seeing women on panels because they deserve to be there, not because they needed a woman

Being asked directly to speak somewhere because of my knowledge, not because of my gender

Seeing the community come together and speak up when issues with regards to diversity and inclusion come to light.

Having a Board of Directors that takes action when there is a violation of the Anti-Harassment policy

Being turned to for advice on policies

Becoming a Microsoft MVP because of the work I do in this community

The feeling you get after speaking when someone says they learned something

The unwavering support you get from the SQL Family when times are tough

The knowledge that there are other women in this community who have been there, done that, and that you are not alone

The “you got this” encouragement from the SQL Family that pushes you to venture out of your comfort zone

I could go on listing all the great ways this community has shaped my life and my career. I even got approached for my last job in part because of my involvement in this community. There is so much to be gained by being a part of it, regardless of your gender. We all know there are so many challenges to being a woman in tech. We talk about these all the time. What makes the SQL Community different for me is that we UNDERSTAND that, and we work to continually improve upon it. Having been part of this community for almost a decade, I’ve seen so much change for the better. As Rie and I always say, we are grateful for those who have run the gauntlet before us. We are standing on the shoulders of those women who came before us and fought the good fight. We are blessed to continue to do it for the others who will follow. Thank you, SQL Community, for helping me achieve what I hope will help other women as they rise.


Upcoming International Speaking Engagements

Published On: 2020-02-19

I am very excited and fortunate to have been chosen to speak at both SQLBits and DataGrillen this year. These two conferences are incredible community-run events, and if you can attend either of them, I would highly recommend it. Here’s a little bit about each event and information on how you can register. If you can attend, be sure to check out my sessions.

SQLBits

SQLBits is the largest SQL Server conference in Europe, held in London, England, March 31st through April 4th. It is a conference for leading data professionals, with over 200 sessions from speakers all over the world.

https://sqlbits.com/

Join Me – Thursday, April 2nd, 17:10, Room 10

Always Encrypted for Beginners

One of the biggest challenges to successful implementation of data encryption has been the back and forth between the application and the database. You have to overcome the obstacle of the application decrypting the data it needs. Microsoft tried to simplify this process when it introduced Always Encrypted (AE) in SQL Server 2016 and Azure SQL Database. In this demo-intense session, you will learn what Always Encrypted is, how it works, and the implications for your environment. By the end, you will know how to easily encrypt columns of data and, just as importantly, how to decrypt them. You will also learn about the current limitations of the feature and what your options are to work around them.

 

DataGrillen

https://datagrillen.com/

DataGrillen is a SOLD OUT, 100% free training event featuring the biggest and brightest of the data platform world. Held in Lingen, Germany each year, it is the must-attend event in the region.

Join Me – Thursday, May 28th, 16:30, Room Handschuh

Mastering Tempdb

Have you experienced performance problems caused by contention in TempDB? Have you ever wondered why your TempDB is suddenly 3 TB? In this session, you will learn about all the various components of SQL Server that use TempDB. Whether it be AlwaysOn Availability Groups, Read Committed Snapshot version stores, spills, or simply temporary tables, learn how to identify what SQL Server or your applications are doing in TempDB. Once you understand all the ways SQL Server uses this critical resource, and how to properly configure it, you’ll be better prepared for your workloads, whether on an Azure VM, a physical server, or a container.

Be sure to check out my fellow Denny Cherry and Associates consultants’ sessions at both events. Click the links for the full schedules.

SQLBits – John Morehouse, Joey D’Antoni

DataGrillen – Denny Cherry, Joey D’Antoni, John Morehouse and Meagan Longoria
