Introducing our Emergency Response service at DCAC

Published On: 2020-03-12

According to Gartner, the average cost of IT downtime is $5,600 PER MINUTE.


Denny Cherry & Associates Consulting is introducting our Emergency Response service. With this service companies will have peace-of-mind when the inevitable 3am production outage happens. These events could be a: Clock with no hands

  • Catastrophic Event
  • Suddenly Sluggish Production
  • Unexpected Failover
  • Malware Infection

By signing up for our Emergency Response service, you'll have our Emergency Response experts available to you 24×7 to assist you in responding to a production-down event. With staff available 24 hours a day, 7 days a week, you can rest easy knowing that the assistance you need is just a phone call away.

Enjoy peace of mind with the DCAC Emergency Response team at your fingertips:

  • Five Microsoft Gold and Silver Competencies
  • 80+% of DCAC consultants are Microsoft MVPs
  • Featured speakers at Microsoft Ignite and PASS Summit
  • Winner of a dozen national/international awards for IT excellence

Our Emergency Response Package Includes:

  • Annual DR strategy review with a DCAC expert
  • 8 guaranteed hours of U.S.-based IT expertise PER MONTH
  • Live response from an expert within one hour of contact
  • Any hour, including holidays
  • Bi-annual Disaster Recovery testing upon request

Get signed up for our Emergency Response service by contacting one of our team members today.


Changing How I Write Abstracts

Published On: 2020-03-06 By: John Morehouse

Sometimes a swift kick in the rear is needed to become motivated. Recently, I was the recipient of the business end of a boot, and it drove me to take a serious look at how I write abstracts for new sessions that I would like to teach. After reviewing things, I found that my methods weren't the best and that I needed to improve how I write (not just abstracts, but other things like this blog post).

So, I'm trying something new: I'm going to crowdsource feedback for new abstracts using GitHub. Yes, you heard that correctly. I want you to give me honest, full-bore feedback, whether good, bad, or anything in between.

Each title below is a link to a Gist on GitHub where you can leave your feedback. Please be honest and thorough. If you think the abstract is great, tell me that. If you think it sucks and I should start over, tell me that too. Don't like the title? Shout it out. Tell me all of it.

Let’s get this party started!

Improving Performance with Intelligent Query Processing

One of the core pillars of the database administrator role is to ensure your systems run as efficiently as possible. We don't sit around wanting a SQL Server to run slower, so we implement steps to ensure our SQL Servers perform at maximum capacity. Rebranded in SQL Server 2019 from Adaptive Query Processing (AQP, introduced in SQL Server 2017) to Intelligent Query Processing (IQP), Microsoft continues to bring forth enhancements that make the query optimizer a learning machine to help intelligently improve performance. Walking away from this session, you'll have a better understanding of how IQP features, such as scalar user-defined function improvements, table variable deferred compilation, and adaptive joins, will help ensure your SQL Servers run at optimal performance.

Indexes: The Voodoo of SQL Server

Microsoft SQL Server is a large data ecosystem with many facets that can affect how your queries perform: what kind of hardware you're using, how much memory you have, how many CPUs. Many of these facets cost money, and there are many things about them that can be turned on, tweaked, or implemented to help improve performance. But did you know that proper indexing is a low-cost solution? Proper indexes can not only improve query performance but also save money on hardware! In this session, we'll examine the foundation of how indexes work, what the moving parts are, and why they are important. We'll examine some real-world examples of queries that were falling short until proper indexing made them successful. You'll walk away with techniques for knowing where and how to add indexes to help you start saving money!

SQL Server Performance Tuning for Beginners

We've all had to start somewhere in our current career path. My own adventure started what feels like eons ago, when I became an accidental DBA overnight and had absolutely zero clue how to performance tune our SQL Servers. I was stuck and wasn't quite sure where to turn. Things had to perform better, but I had no idea where to start. Does this sound familiar? Ringing any bells? If so, this session is for you! We'll start at the ground floor and talk about the basics of how to performance tune your SQL Servers so that they run at peak performance! We'll look at configuration settings, database options, trace flags, query tuning, and a few tools that can help you squeeze every ounce of performance out of SQL Server! By the end of this session, you'll walk away with a solid understanding of how to start performance tuning, and you'll take away some scripts and tool suggestions that will enable you to hit the ground running back at the office!

The Award for the Two Best SQL Server 2019 Features Goes to…

SQL Server 2019, in my opinion, is one of the greatest releases to come out of Microsoft in the last decade. It comes with a multitude of enhancements across the board in the data platform ecosystem. However, two features significantly impact the day-to-day lives of data professionals everywhere: Accelerated Database Recovery (ADR) and Resumable Index Creation. How impressive would it be to have a rollback operation complete in seconds when previously it would take hours or days? Would you like the ability to manage how you create that index on your multi-billion-row table? Now you can! In this session we will examine these two new features in detail and demonstrate how they can accomplish both of those goals and improve your life when dealing with rollbacks and managing index operations. The days of horror stories around these two activities are a thing of the past thanks to SQL Server 2019.

Data Migration to Azure Made Easy

When you decide to move to any cloud provider, the thought of how to migrate all your data can seem daunting. Thankfully, it isn't as daunting as you might think, thanks to some native tools as well as tools offered by Microsoft. In this session we'll examine the methods and tools that will help you migrate your data to Azure in a safe, secure, cost-effective, and successful manner. We will also look at how these migrations work with three of the Azure data platform products: virtual machines, SQL DB, and Managed Instances. By the end of this session you will have gained newfound confidence to help you get your data up into the cloud!

Optimizing Query Performance in Azure SQL Database

Many think that moving to the cloud will not only brighten your teeth but also solve all the bad coding practices that give you poorly performing queries. Done correctly, implementing Azure SQL Database can help with one of those two, and while it can mask things well, the best solution is to fix the bad code. In this hour-long session, we'll examine several different methods you can use to fix bad query performance, starting with the underlying service tier. Next, we'll investigate what options are available directly from the Azure portal to determine where the bottlenecks might reside, along with possible ways to fix them. Lastly, we will look at which native SQL Server tools within Azure SQL Database can really help solve performance issues you might be having. You'll leave this session with a solid understanding of how to troubleshoot performance issues in Azure SQL Database and what you might be able to do to fix them.

Summary

Sometimes it is good to look at how you develop new things, whether it's writing, building, or communicating. Deeper investigation into those processes should lead to improvements and, hopefully, success. In this case, I'm trying to get better at writing abstracts. The competition for speaker selection is fierce these days, and I want to be better. You can help me get there, so please do!

Thank you in advance for your honest feedback!

© 2020, John Morehouse. All rights reserved.


PolicyViz Podcast Episode on Accessibility

Published On: 2020-03-05 By: Meagan Longoria

I had the pleasure of talking with Jon Schwabish about accessibility in data visualization. The episode was released this week. You can check it out at https://policyviz.com/podcast/episode-169-meagan-longoria/.


If you’ve never thought about accessibility in data visualization before, here is what I want you to know.

  1. Your explanatory data visualization should communicate something to your intended audience. You can't assume that no one in your intended audience has a disability. People with disabilities want to consume data visualizations, too.
  2. We can’t make everything 100% usable for everyone. But that doesn’t mean we should do nothing. Achieving accessibility is a shared responsibility of the tool maker and the visualization designer. There are several things we can do to increase accessibility using any data visualization tool that don’t require much effort. Regardless of the tool you use, you can usually control things like color contrast, keyboard tab/reading order, and removing or replacing jargon.
  3. Accessible design may seem foreign or tedious in the beginning. We tend to design for ourselves because that is the user we understand most. But if we start adding tasks like checking color contrast and setting reading order into our normal design routine, it just becomes habit. Over time, those accessible design habits become easier and more intuitive.

I hope that one day accessible design will just be design. You can be part of that effort, whether you are a professional designer, a database administrator just trying to show some performance statistics, or an analyst putting together a report.

Listen to the podcast for my top 5 things you should do to make your data visualizations more accessible.


Setting up Proper Outbound Email Protection Settings

Published On: 2020-03-02 By: Denny Cherry

Email is the wild, wild west of the internet. Anyone can send email as another person, which is one of the reasons we have junk email. One of the ways companies can help others verify that emails are coming from where they claim to come from is to set up outbound email security. This involves setting a few DNS records in the DNS domain for that company. When emails are received, the receiving email server checks these DNS records against the email that was just received to see if it was sent from an SMTP server listed in the DNS record.

There are a couple of different records that need to be created. The first is called an SPF (Sender Policy Framework) record. This is a text record at the root of the domain.

To add this in, you first need to know what servers will be sending emails for your domain. In the case of DCAC, we use Office 365 as well as SendGrid, QuickBooks Online, and MailChimp.

SPF

This all needs to be added to our SPF record so that emails sent by any of these services are marked as valid from an SPF perspective.

We use the record below.

v=spf1 ip4:206.154.105.0/24 ip4:199.16.139.0/24 ip4:206.108.40.0/24 include:spf.protection.outlook.com include:servers.mcsv.net include:sendgrid.net ~all

Version

Let me explain how this specific record works. The “v=spf1” part says that this is SPF version 1. If version 2 of SPF ever comes out, this value would need to be changed to match.

QuickBooks Online

The “ip4:206.154.105.0/24 ip4:199.16.139.0/24 ip4:206.108.40.0/24” section is the SPF section for QuickBooks Online. QuickBooks Online uses a bunch of outbound email servers, and they are all contained within the three subnets listed.

Office 365

The “include:spf.protection.outlook.com” section is the SPF section for Office 365. This section says that any emails sent from Office 365’s outbound email servers are valid. As Office 365 sends all emails from servers that resolve back to that DNS name, this marks all those emails as valid; and since you can’t send email through Office 365 without authenticating, we can trust this setup.

MailChimp

The “include:servers.mcsv.net” section is the SPF section for MailChimp. Again, since you can’t send email as someone else’s domain without verifying that it’s your domain, we can trust this setup.

SendGrid

The “include:sendgrid.net” section is the SPF section for SendGrid. Again, since you can’t send email as someone else’s domain without verifying that it’s your domain, we can trust this setup.

All

The final part is the “~all” section of the record, which covers everything else. The values within the SPF record are processed in the order listed, so if an email is sent through a server that isn’t listed, it’s processed by the “all” record. The “~” before “all” says that an email matching “all” will be marked as a soft failure. There are a few values that can be used instead of the “~”. You can use a “+”, which means that all emails will pass SPF checking; you probably don’t want this, as unauthorized emails would be marked as valid.

You can use a “-”, which means that emails matching “all” will fail SPF checking. The “~” that we use means that any emails matching “all” are given a soft failure. Eventually, we’ll change this from “~” to “-”.

The final value you can use is “?”, which means that anything matching “all” is given a result of Neutral. This is the same as not having “all” at all: it tells the receiving server that while the email can’t be confirmed as coming from a valid server, it shouldn’t be failed either. The receiving server should effectively ignore SPF and use its other checks to decide whether the email is junk.
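Once the record is published, you can sanity-check what receiving servers will actually see. Here's a quick check from PowerShell, assuming you're on Windows, where the Resolve-DnsName cmdlet ships with the DnsClient module; substitute your own domain:

# Pull the TXT records at the root of the domain and show just the SPF record
Resolve-DnsName -Name {your domain} -Type TXT |
    Where-Object { $_.Strings -match '^v=spf1' } |
    Select-Object -ExpandProperty Strings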

DKIM

After you set up SPF, you will want to set up DKIM (DomainKeys Identified Mail). This is another set of DNS records that needs to be created. Your email provider will give you the values that you need to put in place. Since we use Office 365, we need to set up DKIM in Office 365.

To create this in Office 365 (where DCAC hosts its email), the first thing you need to do is connect to Office 365 in PowerShell. This is done using a simple bit of code, which you'll need to run in a PowerShell window opened in Admin mode.

# Allow signed remote scripts so the Exchange Online session code can run
Set-ExecutionPolicy RemoteSigned
# Prompt for the Office 365 admin username and password
$UserCredential = Get-Credential
# Open a remote PowerShell session to Exchange Online
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
# Load the Exchange Online cmdlets into the current session
Import-PSSession $Session -DisableNameChecking

Once you run that, it'll prompt you for a username and password. Enter the username and password of an account that has admin rights within your O365 environment. Keep in mind that if you have MFA enabled (which you should), this code won't work; you'll need to bypass MFA for your public IP address for this to work.

Once you're logged in to O365, you need to create the DKIM signing configuration for the domain (disabled for now; we'll enable it after the DNS records are in place). This is done using a PowerShell command, which you'll want to run in the same window as above. You may receive an error saying that the object already exists. If you get that error message back, just ignore it.

New-DkimSigningConfig -DomainName {your domain} -Enabled $false

Once that is done, you'll want to ask O365 what the DNS settings for your domain should be. You can get these settings using the PowerShell code below.

Get-DkimSigningConfig -Identity {your domain} | fl Selector1CNAME, Selector2CNAME

This will give you back two records. They will look something like this.

Selector1CNAME : selector1-{your domain}._domainkey.dcassoc.onmicrosoft.com
Selector2CNAME : selector2-{your domain}._domainkey.dcassoc.onmicrosoft.com

Take these values and create two new DNS entries, both CNAME records. The first will be called “selector1._domainkey”, and its value is the one returned as Selector1CNAME. The second will be called “selector2._domainkey”, and its value is the one returned as Selector2CNAME.
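Before turning DKIM on, it's worth confirming that both CNAMEs resolve. A quick check using the same Resolve-DnsName approach as above (substitute your own domain):

# Both lookups should return the matching *.onmicrosoft.com targets from the previous step
Resolve-DnsName -Name "selector1._domainkey.{your domain}" -Type CNAME
Resolve-DnsName -Name "selector2._domainkey.{your domain}" -Type CNAME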

The final command that needs to be run enables DKIM for the domain.

Set-DkimSigningConfig -Identity {your domain} -Enabled $true
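Once it's enabled, you can confirm the state with the same cmdlet used earlier; the Enabled flag should now come back as True:

# Verify DKIM signing is now turned on for the domain
Get-DkimSigningConfig -Identity {your domain} | fl Enabled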

DMARC

Once this is done, that's most of what needs to be set up. The final thing to set up is called DMARC.

To set up DMARC, you need to create another DNS record. This is a TXT record with the name “_dmarc” (yes, there's an underscore in front of the name). This record tells receiving email servers where to send reports about how many emails pass or fail your SPF and DKIM checks, and what should be done with emails that fail those checks.

Our record looks something like this.

v=DMARC1; p=none; rua=mailto:{email address}; ruf=mailto:{email address}; fo=1; pct=100;

Let’s dive into what these various settings mean.

Version

The “v=DMARC1” section simply means the version of DMARC that we are using. Currently (in 2020), only version 1 is in use, so we use DMARC1 as the version.

Response

The “p=none” setting tells the receiving server what to do with emails that fail the SPF and DKIM checks. Currently, we are using a value of none here. The other options are quarantine and reject. If you set “p=quarantine”, then emails that fail SPF or DKIM checks will be sent to quarantine (the junk folder in Outlook). If you set “p=reject”, then the receiving server will reject emails that fail SPF or DKIM checks. The standard process at most companies is to start with a value of none; after the reports have been reviewed, the p-value is moved to quarantine and then finally to reject, as shown below.
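For illustration, here's what that progression would look like in the record itself; only the p tag changes:

v=DMARC1; p=none; rua=mailto:{email address}; ruf=mailto:{email address}; fo=1; pct=100;
v=DMARC1; p=quarantine; rua=mailto:{email address}; ruf=mailto:{email address}; fo=1; pct=100;
v=DMARC1; p=reject; rua=mailto:{email address}; ruf=mailto:{email address}; fo=1; pct=100;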

RUA

The rua value is an email address that feedback on the status of emails is sent to. This feedback is an XML file that is either zipped or gzipped.

RUF

The ruf value is the email address that forensic reports will be sent to.

Report Options

The “fo=1” section tells the receiver which failures should trigger a report. A value of “0” generates a report only if both DKIM and SPF fail. A value of “1” generates a report if either DKIM or SPF fails to produce a DMARC pass result. A value of “d” generates a report if DKIM fails, and a value of “s” generates a report if SPF fails.

Percent

The final section is “pct=100”, which tells the receiving server what percentage of emails should be included in the report: 100 percent is all emails, 50 is half the emails, and so on. The bigger the company and the more emails you send out, the lower this setting should be. As we're a small company that doesn't send a massive number of emails, we want inspection and reports for all the emails we send out.
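As with SPF, you can confirm the record is published once the DNS change has propagated (substitute your own domain):

# The TXT record should come back starting with v=DMARC1
Resolve-DnsName -Name "_dmarc.{your domain}" -Type TXT | Select-Object -ExpandProperty Strings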

What’s Reported On

As users find out about this reporting, they may get nervous about admins receiving reports on the emails they send. These reports only look at the headers and do not include any of the data within the actual email.

Viewing the Reports

The reports that are sent because of your DMARC settings are not meant to be read by people; they are fairly complex XML files. The easiest way to read them is to use a service. We use DMARC Analyzer to view the data, which gives some nice reports summarizing the data that gets sent back.
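If you'd like to peek inside one yourself, here's a minimal PowerShell sketch. It assumes a standard aggregate (rua) report that has already been unzipped to report.xml; the element names come from the DMARC aggregate report schema in RFC 7489:

# Load the aggregate report and summarize each source that sent mail for the domain
[xml]$report = Get-Content -Path .\report.xml
$report.feedback.record | ForEach-Object {
    [pscustomobject]@{
        SourceIP = $_.row.source_ip                # server that sent the mail
        Count    = $_.row.count                    # number of messages seen from it
        SPF      = $_.row.policy_evaluated.spf     # SPF result (pass/fail)
        DKIM     = $_.row.policy_evaluated.dkim    # DKIM result (pass/fail)
    }
} | Format-Table -AutoSize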

Conclusion

Hopefully, this will help people get their email servers properly secured. If you're looking to get onto a nice, secure email platform, contact the sales team at DCAC, and our team can help you get set up in Office 365.

Denny

