My SQL Saturday Chicago Precon: Leveling Up Your Azure Skills

Published On: 2020-01-08

I’m doing another precon at SQL Saturday Chicago on March 20th, 2020. The event is at Benedictine University in Lisle, IL. This time we’re going to dive a little deeper into Azure. While last year’s precon focused on basic infrastructure skills in Azure, this time we’re going to focus on the specifics of the data platform in Azure. I did a variant of this topic in India last August, but I’ve made a few changes based on a number of conversations I had with customers at Microsoft Ignite last year.


Photo by Nate on Pexels.com

In this full day session, here’s what we’re going to cover:

  • VMs and Storage (like it or not, this is still 80% of cloud spend, and it’s how people migrate to the cloud)
  • Understanding the Azure services you need to manage your environment (this is going to focus heavily on Azure Automation, Azure Key Vaults, and Logic Apps)
  • Azure Data Platform–This is a Big Topic, but here’s how it breaks down
    • Understand the differences between
      • SQL Server in a VM
      • Azure SQL Database
      • Azure Managed Instance
      • Azure Synapse (nee Azure SQL Data Warehouse)
      • Cosmos DB
    • We’ll talk about the costs/benefits/performance of each of these options, and how they are incorporated into your broader architecture
    • Don’t worry, I’m a DBA, we’ll cover HA/DR here
  • We’ll also talk a bit about cloud-focused application architecture
    • Why Redis in front of your database is a good idea
    • Taking advantage of dynamically scaling resources

That’s a lot of topics–it’s a full day, and the deep dive part will be on the data platform, but any and all questions are welcome. There are still a few discount tickets left–you can sign up here.

 


Let’s Talk About Ransomware (Again)

Published On: 2020-01-07

I’ve posted about ransomware a few times before. At DCAC, we’ve worked with a few customers whom we’ve fortunately been able to help recover from their attacks. But ransomware attacks are trending upward–one ransomware attacker has pulled in $10 million this year, and we’re only seven days into 2020. There have been a number of announcements about ransomware attacks in recent days–a small company that was driven into bankruptcy, and the money exchange Travelex, which was recently attacked.

Ransomware attacks are quite different from traditional hacking. Traditionally, hackers are in pursuit of high-value data, whether from a government agency, a large enterprise business, or someone with a vast array of customer data like Equifax. Ransomware goes after easy targets–ones the attackers think, or at least hope, will pay. What that means is that if you work for a smaller firm, you are far more likely to be hit with ransomware than with a traditional hacking attack. Theoretically, large enterprises have more network segmentation and better security controls, which make them less vulnerable to these attacks. As the last year has shown, this is definitely not always the case.

It’s All About the Network

The way ransomware typically works is to find a vulnerability, frequently via user accounts, in order to run exploit software on your network. If the only thing that can be reached on your network is user PCs, that’s the only data that’s going to be encrypted. Where organizations get exploited is that their user network is either directly connected to their business network, or there is just one flat network structure. This means that if a user is attacked (typical attack vectors are emailed infected Office documents or PDFs, but others include more advanced vectors like the Pulse VPN vulnerability), the ransomware can go after file servers and domain controllers and start encrypting all of your files.

Assume Breach

One of the most important ideas here is that, in order to build your network structures appropriately, you need to assume that your user network is going to be breached. It’s what Microsoft does with Azure, and it’s what you should do on your network as well. This means you need to do a few things:

  • Segment your network
  • Separate accounts, and maybe even a separate domain, for all of your servers
  • Disconnected backups–you can’t encrypt a safe full of tapes
  • Frequent restore testing
  • Basically, if you can connect to a file or database server from your corporate laptop without multi-factor auth, changing credentials, and/or changing networks, you’re at the mercy of your dumbest user not to launch an infected Word file

All of these controls make our jobs harder, and it sucks, but it’s what we have to do. The one benefit of modern technology is that the cloud makes all of these things approachable to even the smallest of businesses. I can build a segmented network in about 20 minutes in Azure, whereas it would take me a few hours just to find the right equipment at the right price point from a networking vendor if I were trying to do it on-premises.
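To give a rough sense of what those 20 minutes look like, here’s a minimal sketch using the Az PowerShell module. The resource group, address ranges, and names are made up for illustration, and a real design would have more subnets and far tighter rules:

# Requires the Az module (Install-Module Az); all names and ranges here are hypothetical
New-AzResourceGroup -Name 'rg-network-demo' -Location 'eastus'

# Block inbound traffic from the user subnet into the server subnet
$rule = New-AzNetworkSecurityRuleConfig -Name 'DenyUsersToServers' -Access Deny -Direction Inbound -Priority 100 `
    -Protocol '*' -SourceAddressPrefix '10.0.1.0/24' -SourcePortRange '*' `
    -DestinationAddressPrefix '10.0.2.0/24' -DestinationPortRange '*'

$nsg = New-AzNetworkSecurityGroup -Name 'servers-nsg' -ResourceGroupName 'rg-network-demo' `
    -Location 'eastus' -SecurityRules $rule

# Separate subnets for user machines and servers, with the NSG attached to the server subnet
$users   = New-AzVirtualNetworkSubnetConfig -Name 'users' -AddressPrefix '10.0.1.0/24'
$servers = New-AzVirtualNetworkSubnetConfig -Name 'servers' -AddressPrefix '10.0.2.0/24' -NetworkSecurityGroup $nsg

New-AzVirtualNetwork -Name 'vnet-demo' -ResourceGroupName 'rg-network-demo' -Location 'eastus' `
    -AddressPrefix '10.0.0.0/16' -Subnet $users, $servers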

I’m Just the DBA, What Can I Do?

I get it, and I’ve been there–you’re the DBA at a company where IT isn’t exactly the priority. There are a few approaches you can take. The first, which is the most self-serving for me, is to engage a consulting firm like DCAC. While you know what you are talking about, your management sees you as a worker bee, and may not listen to your complaints about having an open WiFi network that your regulatory agency can log on to and see the public file servers (yes, this actually happened to me). However, when they are paying a consultant $ALot/hr, they tend to be more receptive to taking advice, even if it’s the same thing the DBA has been saying for years.

Another approach is to hire a penetration testing firm–many large organizations are required to do this by regulation, and they do a good job of identifying vulnerabilities in a firm.

Finally, and this is the easiest thing to execute: PATCH YOUR SHIT. This is where you can assist the most–you can patch Windows and SQL Server (or Linux if that’s your bag), and you can work with the sysadmin teams in your organization to ensure that all OS, application, and even mouse software is patched regularly. It’s not perfect, and it doesn’t fix all of your problems, but it’s a start. Also, test your god damned restores–it’s not enough to have backups if you don’t know that they work.
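If you want to automate those restore tests, one approach is the dbatools module’s Test-DbaLastBackup, which (as I understand it) restores the most recent backup of a database to a scratch instance, runs a DBCC check, and drops the copy. A minimal sketch–the instance and database names are placeholders, and this assumes dbatools is installed:

# Requires the dbatools module (Install-Module dbatools); instance and database names are hypothetical
Test-DbaLastBackup -SqlInstance 'prod-sql01' -Destination 'restore-test01' -Database 'SalesDB' |
    Export-Csv restore-tests.csv -NoTypeInformation -Append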

 

 


PowerShell for Parsing DiskSpd Results to CSV

Published On: 2019-12-04

We have a customer who is moving platforms, and as part of this, I’ve been tasked with testing a lot of various storage configurations. This means lots of utilization of DiskSpd, which is a disk performance benchmarking tool from Microsoft. We could argue about the value of synthetic disk benchmarks, but they are good for testing a variety of disk configurations with a standardized tool for comparing results. DiskSpd also has the benefit of adding the runtime configuration into the results file, so as long as you have your results file, you know what parameters you ran the test with. (You do have to document your disk configuration–we are using the name of our output file for this.)
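For context, a run might look something like the following–the sizes, durations, and file names here are hypothetical, with the output redirected to a file named for the disk configuration under test:

# Hypothetical DiskSpd run: 8 threads, 8 outstanding IOs per thread, 64K blocks, 30% writes,
# random IO for 5 minutes against a 50GB test file, caching disabled, latency captured (-L)
.\diskspd.exe -c50G -d300 -t8 -o8 -b64K -w30 -r -Sh -L E:\diskspd\testfile.dat > P30_premium_64k.txt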


Photo by Elle Hughes on Pexels.com

Anyway, I have a bunch of these files, and I needed to get the data into Excel. Since I was too lazy to figure out how to parse a text file in C#, my first thought was to use some combination of sed, awk, and grep in a bash shell. I reached out to my friend Anthony Nocentino (b|t) about his thoughts on the best way to do this, and he immediately said PowerShell.

When I asked how to do the things I wanted to do with specific bash commands, he mentioned that I could use bash commands that support standard input and output directly within PowerShell. The linked blog shows how to do this on Windows; however, I wrote all of this code in PowerShell natively on my MacBook Pro.

# Directory containing the DiskSpd output files
$path = '$InsertYourPathHere'
$files = Get-ChildItem -Path $path

foreach ($file in $files)
{
    $content = Get-Content $file

    # The second line of a DiskSpd results file is the command line the test was run with
    $command = $content | Select-Object -First 1 -Skip 1

    # Grab the first "total:" line, then use sed to normalize the delimiters for splitting
    $results = $content | grep total -m1 | sed 's/"|"/","/g' | sed 's/"total:"//g'
    $results = $results.Split(",")

    $Output = New-Object -TypeName PSObject -Property @{
        FileName   = $file.Name
        Command    = $command
        TotalBytes = $results[0].Trim()
        TotalIOs   = $results[1].Trim()
        MiBperSec  = $results[2].Trim()
        IOPs       = $results[3].Trim()
        AvgLatency = $results[4].Trim()
        LatStdDev  = $results[5].Trim()
    } | Select-Object FileName, Command, TotalBytes, TotalIOs, MiBperSec, IOPs, AvgLatency, LatStdDev

    $Output | Export-Csv results.csv -Append
}
As you can see, I’m piping the file’s contents through grep to get the first line matching the word “total”, and then using sed to do a couple of find-and-replace operations to make the line a little easier to parse. After I’ve done all that, I split the string into a comma-delimited array of results and output it to a CSV file. This allows you to grab the results, with headers, and open them in your favorite spreadsheet software. For posterity, the code is available in our GitHub repo here.
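If you just want a quick look at the results without opening a spreadsheet, something like this works too (the cast is needed because everything comes back from the CSV as a string):

# Quick sanity check of the parsed results, highest IOPs first
Import-Csv results.csv | Sort-Object { [double]$_.IOPs } -Descending |
    Format-Table FileName, IOPs, MiBperSec, AvgLatency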
