How Do I Measure IOPs from SQL Server?

Published On: 2015-12-11

One of the more common challenges DBAs face when requesting new SAN environments from storage administrators is gathering the actual number of I/O operations per second that SQL Server performs. You see, storage admins speak about performance in terms of IOPs, latency, and throughput, whereas database folks like to speak in terms of transactions per second. So in the past it was a challenge for the DBA to communicate IOPs requirements to the SAN admin: it involved carefully capturing a lot of different perfmon counters and making sure they all added up correctly. It was definitely more art than science.

Introducing Resource Governor in SQL Server 2014

One of the key feature enhancements in SQL Server 2014 was the addition of I/O control to Resource Governor. Resource Governor is an Enterprise Edition feature that allows DBAs to manage workloads by classifying connections into pools that have specific amounts of hardware resources allocated to them. When this feature was introduced in SQL Server 2008, it had one missing element: the ability to constrain workloads based on I/O. Since CPUs have gotten much faster since 2008, I/O has become the bottleneck for many, many systems. So, in SQL Server 2014, Resource Governor can manage IOPs per volume (note: it's not in the GUI as of this writing; you'll have to use T-SQL to configure it).
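As a minimal sketch of what that T-SQL configuration looks like (the pool name and the 500 IOPs cap here are hypothetical; pick values that match your workload):

```sql
-- Create a resource pool capped at 500 I/O operations per second, per volume
-- (MIN_IOPS_PER_VOLUME / MAX_IOPS_PER_VOLUME are new in SQL Server 2014)
CREATE RESOURCE POOL ReportingPool
    WITH (MIN_IOPS_PER_VOLUME = 0, MAX_IOPS_PER_VOLUME = 500);
GO
-- Apply the change
ALTER RESOURCE GOVERNOR RECONFIGURE;
GO
```

You would still need a workload group and a classifier function to route connections into the pool, same as with the CPU and memory settings.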

I have Standard Edition—How Does This Help Me?

Well, if you have Standard Edition, you won't be able to use Resource Governor to manage resources. However, the perfmon counters from Resource Governor still come with your install, and this is how you can measure IOPs from SQL Server.

Figure 1 Perfmon Counters

That handy SQL Server:Resource Pool Stats counter object and its Disk Read IO/sec and Disk Write IO/sec counters provide you with the data you need to give your SAN admin. The screenshot shows an Enterprise Edition instance, and you can see my resource pools on the left side; so if you are using Resource Governor, you could use this to classify I/O workload by application for potential chargeback situations.
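If you'd rather not babysit the Perfmon GUI, PowerShell's Get-Counter cmdlet can sample the same counters. This is a sketch assuming a default instance and the default pool; a named instance exposes the object as MSSQL$InstanceName:Resource Pool Stats instead:

```powershell
# Sample the default pool's read/write IOPs every 5 seconds, 12 times (one minute)
Get-Counter -Counter @(
    '\SQLServer:Resource Pool Stats(default)\Disk Read IO/sec',
    '\SQLServer:Resource Pool Stats(default)\Disk Write IO/sec'
) -SampleInterval 5 -MaxSamples 12
```

Capture a representative busy period, not a quiet one, before you hand the numbers to your SAN admin.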

Good luck and happy SAN tuning.

Contact the Author | Contact DCAC

Using PowerShell to Generate a Transaction Log Restore Script

Published On: 2015-11-25

This is a really basic PowerShell post; those of you who do lots of PowerShell automation can skip it, and I know I could have taken this three steps further and automated the whole process. Thanks, Mike Fal (b|t)! (Seriously, if you want to read some great SQL PowerShell tips, hit Mike's blog.) Anyway, I'm building an Availability Group for a client with a very large database. I had taken a full backup, and was going to take a differential so I didn't have to apply a full day's worth of transaction log backups to the database. Well, it turned out the workload consisted largely of bulk insert operations, which are minimally logged. That meant my differential backup was huge, but my transaction log backups were quite manageable. Except I had about 100 of them, and I didn't want to hand-enter all of the file names.

One of my favorite techniques for automation is to use the concatenation operator in T-SQL (+) to dynamically create SQL statements (note: if you're reading this and you're a database developer, don't do this, write stored procedures; it's a valid technique for running one-off queries, but it doesn't scale well). We can do the same thing in PowerShell. But first things first: let's figure out how to list the files.
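For reference, the T-SQL version of that trick can build the same restore statements straight from msdb's backup history (the database name here is a placeholder; filter to your own database):

```sql
-- Sketch: generate restore statements for every log backup of one database
SELECT 'restore log [' + bs.database_name + '] from disk=''' +
       bmf.physical_device_name + ''' with norecovery'
FROM msdb.dbo.backupset bs
JOIN msdb.dbo.backupmediafamily bmf
    ON bs.media_set_id = bmf.media_set_id
WHERE bs.type = 'L'                      -- log backups only
  AND bs.database_name = 'databaseName'  -- placeholder
ORDER BY bs.backup_finish_time;
```

In my case the backups came from a different server, so msdb wasn't an option and PowerShell against the file system was the way to go.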



    $backupPath = "D:\Backups\"   # example path; point this at your backup folder
    Get-ChildItem $backupPath

What we are doing with those two commands is defining a variable ($backupPath) and then doing a file listing on it. That listing isn't useful to us yet; it has too much information.

What we want to do is parse the name out of that data and concatenate a restore statement.


Get-ChildItem $backupPath | select {"restore log databaseName from disk='" + $backupPath + $_.name + "' with norecovery"}

We are doing a few things here: we're doing our original listing of files, but we are now piping it (the |) to the next command, and from there we are taking advantage of PowerShell's object model. You'll notice the curly brackets after the select; those specify a script block, which you need in order to concatenate text with the piped-in values. Unlike in dynamic SQL, there is no need to escape the single quotes that our T-SQL command will ultimately need: if a string is in double quotes, PowerShell treats a single quote inside it as just a character and does not evaluate it, so it will print as a single quote. You'll note that our value $_.name is the file name; name is one of the properties on the objects the Get-ChildItem cmdlet returns. We put the $_. in front of name because we are inside a script block, where $_ refers to the current object in the pipeline. For example, if you executed the following code:


Get-ChildItem $backupPath | select name


You will see this:


So what do we get when we run our statement?

I only had one backup here; on my client system I had over 200. I was able to just copy the output, paste it into Management Studio, and run the restores. You will note that this is the bare minimum of what you can do with this: you could use date logic to specify a point in time you wanted to restore to, and you could definitely automate the restores themselves with the SQL Server PowerShell cmdlets. I was keeping it simple and fast, as this was a simple scenario. One thing I probably should have done is use the Out-File cmdlet to send the code to a .sql file. Happy Thanksgiving and Happy Coding!
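A sketch of that last idea, sorting by write time so the statements come out in log-chain order ($backupPath, the database name, and the output path are placeholders):

```powershell
# Build the restore statements in backup order and write them to a .sql file
Get-ChildItem $backupPath |
    Sort-Object LastWriteTime |
    ForEach-Object { "restore log databaseName from disk='" + $backupPath + $_.Name + "' with norecovery" } |
    Out-File -FilePath C:\temp\restores.sql
```

Sorting on LastWriteTime matters: log backups have to be restored in the order they were taken, and an alphabetical file listing won't always guarantee that.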




A Long Day’s Journey to a Saints Game or #joeystravelstories

Published On: 2015-11-01


When I noticed the PASS Summit and MVP Summit were on back-to-back weekends, that there was a New Orleans Saints home game on All Saints' Day (01 November), and that Alaska Airlines had introduced a new nonstop flight to Seattle that would let me watch that Saints game and make it back in time for the MVP Summit, I jumped at the chance. (Apologies for the absurdly long run-on sentence; it's late.) So after a fantastic PASS Summit I went to Seattle's airport early Saturday morning. For reasons I don't remember now, I had booked myself on an American Airlines flight (I do have status with them) through Dallas rather than the inbound nonstop Alaska flight.


The Saints

In case you were wondering about the significance of All Saints' Day in relation to my favorite American football team, the New Orleans Saints, the franchise was granted to the city 49 years ago today, November 1, 1966. I've been to Saints games all over the United States, and even one in London. My dad keeps season tickets and I usually travel in for several games a year. It's a fun bonding experience, and I've seen all sorts of miraculous things happen, like Steve Gleason blocking the punt the night the Superdome reopened, Garrett Hartley kicking the Saints into the Super Bowl, and Tracy Porter closing the deal on that Super Bowl.


In general, I have pretty good travel luck. I suppose it helps that I live in a hub city (Philadelphia) so most of my flights are non-stop. Unlike my good friend Karen Lopez, who has her own hashtag (#karenstravelstories) I mostly have an occasional delay here or there, but I usually get where I am going without issue.

Today, however, that would not be the case. We were a few minutes late leaving Dallas, and I knew there were some pretty bad thunderstorms around the New Orleans area. As we made our approach into Louis Armstrong airport the turbulence was incredible, and I was genuinely concerned about our ability to make a safe landing. As it turns out, so was the captain of the plane. We aborted the landing, and circled briefly before heading to scenic Shreveport, Louisiana (which is famous for Hope Foley having grown up there) to refuel and hopefully come back to New Orleans.

Oh, but American…

So we landed in Shreveport, and the pilot made mention of a minor mechanical issue that had to be checked out by a mechanic in addition to our refueling, so, to his credit, he let everyone off of the plane. The flight crew was absolutely fantastic in general, but there was one problem: we were basically in a complete information blackout. I was chatting with the flight attendants and they were in the dark as well. After a couple of hours of this, I really just wanted them to cancel the flight so I could get my checked bags (yes, slap me now) and get a rental car. The on-call mechanic (who was a contractor and not an AA employee) was apparently trick-or-treating with his kid and couldn't make it.

American didn’t officially cancel the flight until shortly after 10 PM (we landed at Shreveport at 6:30).

My DR Plan, or Why I’m Blogging This..

As a business traveler, you learn to plan for adversity and always have three fallback plans. It's just like dealing with a major database issue: you need to have a plan of attack. As soon as I heard we were landing in Shreveport, I knew I was pretty screwed. It's a tiny airport with next to no service to anywhere that's not a hub. In addition to the Saints game, I needed to make my Alaska flight the next night at 7 so I could get to the MVP Summit. So I had a few options:

  • Wait out the delay—this was problematic, but a plane to where you are going is always the best bet
  • Get a rental car and drive to New Orleans—it’s a 4-5 hour drive, but I couldn’t get my baggage
  • Get a rental car and drive to Dallas and get AA to book me on the first flight to New Orleans tomorrow morning
  • Stand in line with the other people at the airport and try to get rebooked (LOL, don’t ever do this—you’ll get to where you’re going next Thursday)

After a certain point, I decided to book the rental car. I knew we weren't going to get out on the plane until really late, and I started hearing murmurs of cancellation from the flight crew (pro tip: always stand next to the flight crew). As soon as the flight was officially cancelled, I took off to the rental car counter and got a car reserved before the horde got there. I was in my car with my bags within 30 minutes.

Never Outdrive Your Headlights

This is a quote from "The Art of Racing in the Rain" that I used both metaphorically and literally tonight. It was late, but I had the benefit of being on west coast time and having napped on my flight from Seattle. So I didn't feel tired at all, but my fallback plan was to stop and get a hotel if needed. It was also raining the whole way, which made the required level of concentration a little extreme, but I think it helped me stay awake. For example, I was mentally composing this post in my last hour. This same theory applies to solving a database problem: always look at what's in front of you before trying to chase down other problems that might not be real, or that you can't yet see.


It’s Not Always a SAN Problem

Published On: 2015-09-15

Tomorrow I’ll be doing a new webinar on how to identify performance problems related to storage in SQL Server. Sometimes, what looks like a SAN problem to SQL Server may just be related to missing indexes, or not enough memory, or something else that’s forcing SQL Server to do way too much I/O. In this session you’ll learn how to:

  • Identify when SQL Server is having storage-related issues
  • Understand what's happening from the perspective of the O/S
  • Recognize which performance counters are useless when you are on a SAN
  • Talk to your SAN admin about performance

Join me and my friends at Embarcadero for this great webcast at 11:00am Pacific / 1:00pm Central / 2:00pm Eastern on Wednesday September 16.


