Introduction to the performance features on SQL Server on Linux

Published On: 2019-12-04

I’ve been tinkering with SQL Server on Linux lately and noted a few things in regards to performance I thought I would share with you. SQL Server 2019 on Linux uses the same SQL Server database engine with many of the performance features and services you would find on Windows. There are more similarities than you would initially think. However, if you’re a Linux user who is new to SQL Server, I thought the following introduction to some performance features on SQL Server 2019 on Linux will be helpful.

Columnstore index

As I’ve written about before in my 3-part blog series that you can find here, a columnstore index allows you to store and query large stores of data in a columnar data format, called a columnstore. A columnstore index eliminates the need to maintain separate systems for operational and analytics workloads. The columnstore index maintains a copy of the data so that OLTP and analytics workloads run against separate copies of the data. This minimizes the impact of both workloads running simultaneously.

Let’s take a look again at creating one. You’ll note it’s exactly the same as in Windows.

Create a columnstore index

Using TSQL to add a columnstore index to the SalesOrderDetail table:

CREATE NONCLUSTERED COLUMNSTORE INDEX IX_SalesOrderDetail_ColumnStore
   ON Sales.SalesOrderDetail
   (UnitPrice, OrderQty, ProductID);

Now let’s run a query to use our new index.

SELECT ProductID, SUM(UnitPrice) SumUnitPrice, AVG(UnitPrice) AvgUnitPrice,
   SUM(OrderQty) SumOrderQty, AVG(OrderQty) AvgOrderQty
FROM Sales.SalesOrderDetail
   GROUP BY ProductID
   ORDER BY ProductID

To verify that the columnstore index was used, we can view the execution plan, which is just as easy to do as on Windows, or we can look up the object_id for the columnstore index and confirm that it appears in the usage stats for the SalesOrderDetail table:

SELECT * FROM sys.indexes
WHERE name = 'IX_SalesOrderDetail_ColumnStore'

SELECT * FROM sys.dm_db_index_usage_stats
WHERE database_id = DB_ID('AdventureWorks')
   AND object_id = OBJECT_ID('AdventureWorks.Sales.SalesOrderDetail');

In-Memory OLTP

So, here is something I haven’t yet written about. SQL Server’s In-Memory OLTP features can significantly improve the performance of your application systems. However, it’s not for everyone. Here are a couple of examples of workloads that are particularly good candidates for In-Memory OLTP:

  • Environments that are highly transactional with large numbers of SQL INSERTs concurrently
  • Environments that process heavy calculations using TSQL

Some of the things In-Memory OLTP gives us, whether on Linux or Windows, are:

  • Memory-optimized tables—A memory-optimized table stores one copy of its data in active memory and a secondary copy on disk, so the data can be recovered after a shutdown-then-restart of the server or database. The memory-plus-disk duality is unseen and happens only behind the scenes.
  • Natively compiled modules—A natively compiled module references memory-optimized tables only; it does not touch the disk-based copy. Microsoft has seen native compilation result in durations that are 1/100th of the interpreted duration. You can read more about the performance here.

So how do we configure and use In-Memory OLTP?

The first step is to set the database compatibility level to at least 130 (SQL Server 2016). If you don’t know your compatibility level, you can check it with the query below. If you find you need to change it, I have included the script to do so.

Use the following query to check the current compatibility level of AdventureWorks:

USE AdventureWorks
SELECT d.compatibility_level
FROM sys.databases AS d
WHERE d.name = DB_NAME();

If necessary, update the level to 130:

ALTER DATABASE AdventureWorks
SET COMPATIBILITY_LEVEL = 130;
When a transaction involves both a disk-based table (a regular table) and a memory-optimized table, the memory-optimized portion of the transaction must operate at the SNAPSHOT transaction isolation level. Execute the following to turn on MEMORY_OPTIMIZED_ELEVATE_TO_SNAPSHOT, which enforces this level for memory-optimized tables in a cross-container transaction:

ALTER DATABASE AdventureWorks
SET MEMORY_OPTIMIZED_ELEVATE_TO_SNAPSHOT = ON;
Now let’s create a memory-optimized FILEGROUP and a container for data files. Remember we are using Linux, so we use containers for the files, but it’s just a filegroup like in Windows.

ALTER DATABASE AdventureWorks 
ADD FILEGROUP AdventureWorks_mod CONTAINS memory_optimized_data

ALTER DATABASE AdventureWorks 
ADD FILE (NAME='AdventureWorks_mod', FILENAME='/var/opt/mssql/data/AdventureWorks_mod') 
TO FILEGROUP AdventureWorks_mod
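With the filegroup and container in place, you can create a memory-optimized table and a natively compiled procedure against it. Here is a minimal sketch; the table name, procedure name, and bucket count are hypothetical examples, not from the original post:

```sql
USE AdventureWorks;
GO

-- Memory-optimized table: data lives in memory, with a durable copy on disk
CREATE TABLE dbo.OrderCache
(
    OrderID   INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    ProductID INT NOT NULL,
    OrderQty  INT NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
GO

-- Natively compiled module: references memory-optimized tables only
CREATE PROCEDURE dbo.usp_InsertOrderCache
    @OrderID INT, @ProductID INT, @OrderQty INT
WITH NATIVE_COMPILATION, SCHEMABINDING
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    INSERT INTO dbo.OrderCache (OrderID, ProductID, OrderQty)
    VALUES (@OrderID, @ProductID, @OrderQty);
END;
```

Note the ATOMIC block runs at SNAPSHOT isolation, which lines up with the MEMORY_OPTIMIZED_ELEVATE_TO_SNAPSHOT setting discussed above.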

It’s that easy. Now let’s talk about one of my favorite performance tuning tools, Query Store, which is also available on Linux.

Query Store

Query Store collects detailed performance information about queries, execution plans, and runtime statistics. Query Store isn’t active by default, so you need to enable it at the database level for each database you want to track. I could write many blogs on Query Store; this is only meant to let you know it exists on Linux too, so I am only scratching the surface for you here.

Enable Query Store with ALTER DATABASE:

ALTER DATABASE AdventureWorks
SET QUERY_STORE = ON;
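Query Store also accepts several options that control how much it collects and retains. This is a sketch of a few common ones; the values shown are illustrative, not recommendations:

```sql
-- Tune Query Store collection and retention (values are examples only)
ALTER DATABASE AdventureWorks
SET QUERY_STORE
(
    OPERATION_MODE = READ_WRITE,        -- collect new data (READ_ONLY keeps history, stops collection)
    CLEANUP_POLICY = (STALE_QUERY_THRESHOLD_DAYS = 30),
    DATA_FLUSH_INTERVAL_SECONDS = 900,  -- how often in-memory stats are persisted to disk
    MAX_STORAGE_SIZE_MB = 500,
    INTERVAL_LENGTH_MINUTES = 60        -- aggregation window for runtime stats
);
```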

Here is a query you can run to get information about queries and plans in the Query Store:

SELECT Txt.query_text_id, Txt.query_sql_text, Pl.plan_id, Qry.*
FROM sys.query_store_plan AS Pl
   JOIN sys.query_store_query AS Qry
      ON Pl.query_id = Qry.query_id
   JOIN sys.query_store_query_text AS Txt
      ON Qry.query_text_id = Txt.query_text_id ;
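Once you can see which plans a query has used, Query Store also lets you pin a known good plan with the sp_query_store_force_plan procedure. A sketch; the query_id and plan_id values here are hypothetical and would come from the query above:

```sql
-- Force a specific plan for a query (IDs are placeholders; look them up first)
EXEC sp_query_store_force_plan @query_id = 1, @plan_id = 1;

-- Remove the forcing later if the plan stops being a good choice
EXEC sp_query_store_unforce_plan @query_id = 1, @plan_id = 1;
```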

Query dynamic management views

SQL Server on Linux, just like on Windows, gives us dynamic management views (DMVs) that provide server state information that can be used to monitor the health of a server instance, diagnose problems, and tune performance. This is no different than what we are used to on Windows.

Here is one example. Query the sys.dm_os_wait_stats dynamic management view and you can see for yourself that there is nothing different:

SELECT wait_type, wait_time_ms
FROM sys.dm_os_wait_stats;
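The raw view includes many benign system waits, so in practice it helps to filter and sort before drawing conclusions. A sketch; the excluded wait types below are a small, illustrative subset, not a complete list:

```sql
-- Top waits by accumulated wait time, excluding a few common idle/system waits
SELECT TOP (10) wait_type, wait_time_ms, waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP',
                        N'BROKER_TO_FLUSH', N'XE_TIMER_EVENT')
ORDER BY wait_time_ms DESC;
```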

Here I just went over a few performance-related features of SQL Server 2019 as they apply to SQL Server on Linux. I did this to shed some light on the fact that Linux and Windows offer the same tools with just a different underlying OS. Hopefully you can put some of them to use in your own environment. To learn more about what you can do with Microsoft SQL Server 2019, Microsoft has a free Packt guide, Introducing Microsoft SQL Server 2019.

Contact the Author | Contact DCAC

Install ALL Things SQL Server… What?

Published On: 2019-11-26

Does your server look like this?

Many of us have inherited a SQL Server instance that has all SQL Services installed. Someone, maybe even you, went through the SQL Server installation process using the GUI and checked every option available, then just clicked Next, Next, Next and then Install. If this is your environment, please take a moment to evaluate and decide which of these services are actually required.

From a performance tuning perspective, it is important to only run the services that you need. Each of these services can consume resources on your server. Sharing resources reduces what your SQL Server engine, SQL Server (MSSQLSERVER) or a named instance SQL Server (ServerName\NamedInstance), can consume to run your workload efficiently. It is highly recommended that all other services not associated with the engine be run on a separate server.

Sometimes, due to licensing concerns, this is not a viable solution, and that is understandable. However, be sure you are knowledgeable about each configuration setting, like memory, that may need to be adjusted when other services share these resources. As I stated in a prior blog, the max memory setting inside SQL Server does not limit other services like SSIS, SSAS, or SSRS. Those services take their memory from the operating system allocation, which in turn can take memory from the SQL Server engine. Be sure to allocate enough to each service. (Note: you may end up needing to spend more on RAM than on licensing, especially if you can run Standard Edition for BI tools like SSRS or SSIS.)

Take a moment and look at what services you are running versus what you are actually using. If you find that SSIS, for example, is running and you are not using it in your environment, turn off the service. Take the time to set it to DISABLED and not just STOP the service. Stopping the service isn’t enough if it is set to start automatically. When the server is rebooted, the service will diligently start, and you do not want that. Make sure to disable the service to prevent that from occurring.


Simple-Talk Article…Understanding Azure Storage Options

Published On: 2019-11-20

Simple-Talk is a technical journal and community hub from Redgate. This week I have the pleasure of being a guest author and explaining Azure storage options. I cover types of storage accounts, supported services, what performance tiers work with each storage account, and what kind of access you have with those storage types. Be sure to check it out.

Here’s an excerpt….

Editor’s Intro:

The choices found when provisioning storage in Azure can be overwhelming. In this article, Monica Rathbun explains the options to help your organization research which storage might be right for your solution.

Understanding Azure Storage Options

Azure can be a complicated mess of alphabet soup with all the acronyms Microsoft uses to define options within the cloud structure. In this article, I am going to decipher some of that for you starting with storage and redundancy. I’ll cover types of storage accounts, supported services, what performance tiers work with each storage account, and what kind of access you have with those storage types.

Azure Storage offers several different types of options. Both storage accounts and managed disks are part of the offerings: typically, you’re going to use storage accounts for programmatic storage access, whereas managed disks will be used for your Infrastructure as a Service VMs. There are unmanaged disks available for VMs as well, but those are not recommended at all. Unmanaged disks mean that you, not Azure, manage the storage account; the disk is actually a file (.vhd) residing on an Azure Storage account rather than an ARM resource. Conversely, with managed disks all you have to do is specify whether you want Premium or Standard and the size of the disk, and Azure creates and manages the disk for you, simplifying the entire process. I only mention unmanaged disks here so that you know the difference; they are, again, not recommended and an older way of doing things.

Let’s start with Data Access Tiers for Azure Blob Storage which are part of storage accounts.

Azure offers Data Access Tiers for storage, such as Hot, Cool, and Archive, which have the potential to save you money.

  • Hot gives you the ability to optimize your storage for frequently accessed data.
  • Cool is optimized for more infrequently accessed data. Data must remain for a minimum of 30 days. Removal before then will incur a charge.
  • Archive is rarely accessed and kept in storage for a minimum of 180 days.

You can use Cool storage for things like short-term backup and recovery solutions which are not accessed until needed but can be accessed immediately when required. This will save you money compared to Hot storage. To save even more money, you can place things like long-term backups, compliance data, or rarely used data on Archive storage. The expectation is that you will keep the data there long term and won’t require immediate retrieval. Think of this like sending your tape backups to Iron Mountain. You’ll need to wait for the data to be mounted, which can take considerable time (up to 14 hours), but you will pay a much lower storage rate. However, keep in mind that if you take data out too soon, you will be charged an early delete fee of (retention period minus days stored) times the storage rate. For example, deleting data from the 180-day Archive tier after only 60 days would incur a fee equivalent to 120 days of Archive storage.

You can find the full article using the link below.


I’m Speaking at Pass Summit 2019

Published On: 2019-11-01


I am very excited to be heading to PASS Summit 2019 in Seattle next week. It’s one of my favorite weeks of the year. Getting to hang out with and learn from so many data professionals is priceless. I am even more excited this year because I will be giving a Pre Con and a regular session for the first time at Summit. If you have ever attended one of my sessions, you know how passionate I am about speaking and being able to share my knowledge.

What is PASS Summit?

According to PASS it’s “Interactive training on the latest technologies and spotlights on hot topics such as security, cloud, and AI will be led by the best data minds in the industry”.

To me it’s a 5-day convention that brings geeks together to celebrate and learn from each other. It’s networking, learning, mind melds, fun, inspiration, and renewal.

Join Me for My Sessions

SQL Server Performance Tuning and Optimization Pre Con

In Room 6B all day Tuesday, November 5th 8:30-4:30

Do your users complain about slow reports? Are your database servers overwhelmed during times of high usage? Every SQL Server environment can benefit from performance tuning whether your environment has one server or thousands. In this full-day session you will learn about how identify problems using a wide variety of tools and scripts and how to implement best practices across your environment. Additionally, you will learn how to begin reading execution plans and how to tune queries to improve your performance within SQL Server. You will walk away with a list of items to evaluate in your environments and ways to resolve common issues. This session will guide you through real-life performance problems which can be solved by best practices and practical skills. Taught on a level anyone can understand, this session will focus on Microsoft SQL Server 2016 and forward.

You will also learn about maintenance activities and how they affect your server’s overall performance, and how to identify when your infrastructure is affecting your performance. Lastly, we will cover the newest performance enhancements coming with the latest release, SQL Server 2019. You’ll leave this demo-filled session better prepared to tackle many issues that can plague SQL Server performance along with the knowledge of how to resolve them.

I’m It – Survival Techniques for the Lone DBA

In Room 6C Thursday, November 6th 10:45-12:00 (also being live streamed on PASS TV)

Are you the only database person at your company? Are you both the DBA and the developer? Being the only data professional in an environment can seem overwhelming, daunting, and darn near impossible sometimes. However, it can also be extremely rewarding and empowering. This session will cover how you can keep your sanity, get stuff done, and still love your job. We’ll cover how I have survived and thrived being a Lone DBA for 15 years and how you can too. When you finish this session, you’ll know what you can do to make your job easier, where to find help, and how to still be able to advance and enrich your career.

I am especially thrilled to be giving my Lone DBA session. This session is near and dear to my heart and always seems to have a big impact on those who attend it. I walk you through my 16 years as a Lone DBA, how I loved it, and how I managed without being overwhelmed or stressed out. I go over things you can do to be more productive and not feel alone. I’ll help you realize that there is an army of DBAs out there every day, ready and willing to help you for free, and show you how to take advantage of that.

I am really looking forward to seeing everyone and sharing some knowledge. Hope to see you there!


Contact the Author | Contact DCAC

