With the hurricane bearing down on the east coast of the United States last week, making sure you have a good disaster recovery plan in place should have been a top priority for you. If you happen to be using Azure SQL DB, adding geo-replication is a simple and easy process. On Friday of last week I posted a blog on how to configure asynchronous geo-replication; however, I do not think that post effectively conveyed how easy it is to configure.
So, I created a video. In less than 5 minutes, I show you how to configure geo-replication for an Azure SQL DB database. In this video, you will see me geo-replicate my database from the East Coast of the United States to the West Coast.
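If you prefer scripting it over clicking through the portal, geo-replication can also be set up in T-SQL. The sketch below, run against the master database on the primary server, is a minimal example; the server and database names are hypothetical placeholders, and you would substitute your own:

```sql
-- Run in the master database of the PRIMARY Azure SQL server.
-- [SalesDb] and [myserver-westus] are hypothetical names for illustration.
ALTER DATABASE [SalesDb]
    ADD SECONDARY ON SERVER [myserver-westus]
    WITH (ALLOW_CONNECTIONS = ALL);  -- make the secondary readable
```

The secondary database must not already exist on the target server; Azure seeds it for you as part of this command.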
Keep in mind that this video just shows a test database that has hardly any data in it. If you had to replicate a larger database, it would take some time, so planning ahead is important. The sooner you start, however, the sooner you’ll have a copy of your database safely in another region. Even with the risk of data loss in the event of a manual failover, that may be better than being completely offline for hours, days, or even weeks.
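For context on what that manual failover looks like: if the primary region goes down, you initiate the failover from the secondary side. A minimal sketch, again using a hypothetical database name, run in the master database of the secondary server:

```sql
-- Run in the master database of the SECONDARY server when the primary
-- is unreachable. [SalesDb] is a hypothetical name.
-- This promotes the secondary immediately; any un-replicated
-- transactions on the old primary are lost (hence the name).
ALTER DATABASE [SalesDb] FORCE_FAILOVER_ALLOW_DATA_LOSS;
```

When both regions are healthy and you just want to swap roles, the planned form, `ALTER DATABASE [SalesDb] FAILOVER;`, synchronizes first and loses no data.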
It is worth mentioning that you will incur additional costs by having another copy of the database replicated in another region, but once the need has passed, you can remove the geo-replication link and the secondary database. That is one of the beautiful things about Azure: the ability to scale things up and down as needed. Also, if you replicate to a different country, be mindful of its data privacy regulations. Once you put data there, it might not be able to leave the country.
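Tearing it back down is just as simple. A sketch of the cleanup, using the same hypothetical names as above:

```sql
-- Step 1: run in the master database of the PRIMARY server to break
-- the replication link. [SalesDb] and [myserver-westus] are
-- hypothetical names.
ALTER DATABASE [SalesDb] REMOVE SECONDARY ON SERVER [myserver-westus];

-- Step 2: run on the SECONDARY server to drop the now-independent
-- copy so you stop paying for it.
DROP DATABASE [SalesDb];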
Enjoy the video and be sure to check back for more videos to come! You can also subscribe to the Denny Cherry & Associates YouTube channel to watch videos from the entire team.
Quite often I see database administrators set SQL Server max server memory thinking everything related to SQL Server uses this shared memory pool. This is a mistake. There are many things that rely on memory that are not part of SQL Server. Best practices state that you should leave memory allotted for the operating system. However, did you know that if you are running services like SSIS, SSAS, or SSRS on the same server as the database engine, they do not use the memory you have allocated to SQL Server? If the max memory setting is not configured correctly, these other services could incur memory pressure. While the memory consumed by SSAS and SSRS can be configured, SSIS can be a little more challenging. Beyond this, there are even scenarios where the memory SQL Server consumes can exceed the setting, such as CLR in versions earlier than SQL Server 2012 and some other bugs in SQL Server.
As a consultant, I have seen memory pressure and memory exhaustion too many times to count because the DBA was unaware of this. I applaud those who take the time to properly configure this setting according to what the database engine requires. Let’s take it a step further and take the time to look at what additional services you are using and allot memory accordingly.
Beyond just thinking of what additional services are running, also be aware of additional instances on that server. Again, I have seen, time and time again, over-allocation of memory when other instances are not considered. I have seen setups where each instance has the same maximum memory value, which overextends the available physical memory, instead of spreading that amount across the instances according to each of their workloads. SQL Server makes zero attempt to balance memory usage across instances that reside on the same server.
I am not going to go into how to set your max memory, as there are many great resources out there to help you do that. I am writing this just to put a little bug in your ear about things to consider when choosing a value, above and beyond what your database engine may require. Be sure to leave enough memory for the operating system and those additional things running on your server.
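For reference, the setting itself is changed with `sp_configure`. The sketch below is just a minimal example of the mechanics; the 49152 MB value is purely illustrative, not a recommendation, and the right number depends on your total RAM, your OS reservation, and everything else on the box:

```sql
-- Example only: cap this instance at 48 GB (49152 MB).
-- The value is a hypothetical placeholder for illustration.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 49152;
RECONFIGURE;
```

On a multi-instance server, you would run this on each instance with a value sized to that instance's workload, keeping the sum of all the caps plus the OS and service reservations under the physical memory of the machine.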
Today is the day that I have the privilege of announcing the PASS Summit 2018 Speaker Idol contestants. The contestants have been grouped into three groups, with each group presenting on Wednesday, Thursday, or Friday.
The groups that we have for the PASS Summit 2018 this year are:
Wednesday (4:45 PM)
Dennes Torres de Oliveira
Thursday (4:45 PM)
Friday (11:15 AM)
The winners of these three days will then present again in the finals on Friday afternoon at 3:30 PM, along with one runner-up. Each of these 12 contestants is going to get some great feedback from our panel of judges (which I’ll announce later).
I wish everyone luck, and I’ll see you all at the PASS Summit.
As Microsoft MVPs and Partners, as well as VMware experts, we are summoned by companies all over the world to fine-tune and problem-solve the most difficult architecture, infrastructure, and network challenges.
And sometimes we’re asked to share what we did, at events like Microsoft’s PASS Summit 2015.