The post Enterprise Grade Security for SMBs Without Breaking the Bank appeared first on DCAC Videos.
Contact the Author | Contact DCAC
|Published On: 2021-07-25||By: DCAC|
|Published On: 2021-07-21||By: Denny Cherry|
Last week Joey and I recorded a webcast with Rob Krug, a security researcher from Avast. We talked about what companies should be doing to better secure their infrastructure so that they aren't the next ransomware victim we see on the news.
The recording has been posted online and is available for viewing.
Denny
|Published On: 2021-07-16||By: John Morehouse|
Come to find out, nope, probably not. My good friend Jim Donahoe (blog|twitter) was kind enough to tell me that he's running 100TB of storage in his house across multiple Synology devices. I also know for a fact that another good friend, David Klee (blog|twitter), runs enough storage in his house to help heat his home during winter in Nebraska. I'm nowhere close! You might have guessed that I've added storage to my home network. 18TB of it, to be more precise.
The real question is, what am I going to do with 18TB of storage at home? What would you do with it or even 100TB?
Disclaimer: Synology loaned me a DS920+ in exchange for reviewing it, playing around with it, and writing some blogs. Honestly, I've wanted one for years but never got around to picking one up, so I'm really excited to have this opportunity. It came with two 380GB Synology NVMe SSDs to be used for caching. I purchased my own hard drives so that if or when I have to return the unit, I can keep the drives and ensure my data stays with me. Plus, it never hurts to have some spare hard drives lying around. Yes, I know, I'm paranoid.
Now, back to my question. Why am I doing this?
For a couple of different reasons actually.
- One thing I really want is to get some hands-on experience with storage pools and volumes. While I know the fundamentals, having this DS920+ physically on my network, loaded with drives, gives me the chance to carve out storage pools and volumes.
- This also gives me the chance to work with Synology Hybrid RAID (SHR).
- Backups, backups, backups. With 18TB of raw storage to distribute, I can ensure all of the computers in my house are backed up. I won't get the full 18TB once RAID is applied, but that's still a good chunk of drive space.
- Certain Synology units can act as virtual machine hosts. While I can use Azure for this, sometimes I don't want to mess with remembering to deallocate VMs when I'm done with them. This will also let me work with iSCSI targets and attempt some VMware migrations, along with some other VMware-related ideas.
- The Synology package manager also comes with a slew of useful packages (a VPN server, Docker, backup managers, etc.) that will offer some flexibility for testing things as situations come up. I'm really interested in the VPN capabilities as well as the ability to back up the entire array to cloud storage.
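As a rough illustration of why RAID eats into that 18TB of raw space (this is my own back-of-envelope sketch, not Synology's official calculator, and the three-drive layout is assumed for the example): under SHR with one-disk redundancy, usable capacity is roughly the total raw capacity minus the largest drive.

```python
def shr_usable_tb(drive_sizes_tb, redundancy=1):
    """Approximate usable capacity under Synology Hybrid RAID (SHR).

    With `redundancy` disks' worth of protection, SHR can use roughly
    the total raw capacity minus the largest `redundancy` drives.
    This is an approximation; Synology's RAID calculator is authoritative.
    """
    drives = sorted(drive_sizes_tb, reverse=True)
    return sum(drives) - sum(drives[:redundancy])

# Hypothetical example: 18TB of raw storage as three 6TB drives.
print(shr_usable_tb([6, 6, 6]))  # 12
```

With mixed drive sizes the same rule applies, which is SHR's selling point over classic RAID 5: `shr_usable_tb([8, 6, 4])` gives 10TB usable rather than wasting the larger drives' extra capacity entirely.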
I’m really looking forward to being able to learn some things about Synology as well as provide valuable backup options for my various laptops and workstations.
If you are curious, here are some general specifications for the unit:
- Model – DS920+
- CPU – 4-core 2.0GHz (base) Intel Celeron J4125
- Memory – 4GB expandable to 8GB
- Drive Bays – 4
- M.2 Drive Slots – 2 (NVMe SSD)
- Ethernet – 2 x RJ-45 1GbE LAN Port (link aggregation / failover)
Do you have a Synology device in your home? What do you use it for? Leave a comment and let me know!
© 2021, John Morehouse. All rights reserved.
The post Is 18TB of Storage Enough? first appeared on John Morehouse.
|Published On: 2021-07-14||By: Monica Rathbun|
One of the biggest impacts on resource consumption for Azure SQL DB is repeated data pulls by the application layer. No matter how fast those queries execute, calling the same procedure or issuing the same SQL statement hundreds, thousands, or millions of times a day can wreak havoc on database performance. Death by a thousand cuts can easily bring a system to its knees. Sometimes it's hard for DBAs to troubleshoot this in real time because the statements execute so quickly that they don't even show up in tools like sp_whoisactive. It's not until you begin to dive into things like Query Performance Insights or Query Store that you start to see the real issue.
SSMS Query Store Top Consuming Queries with Executions Count Metric
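To put numbers on the death-by-a-thousand-cuts effect, weight each query by its execution count rather than its per-execution cost. A quick sketch (the queries and stats below are made up for illustration, in the spirit of what Query Store's top-consuming-queries view reports):

```python
# Hypothetical per-query stats: average duration in milliseconds
# and daily execution count.
queries = [
    {"query": "monthly sales report", "avg_ms": 4200.0, "executions": 12},
    {"query": "lookup customer by id", "avg_ms": 0.8, "executions": 3_000_000},
    {"query": "update order status", "avg_ms": 15.0, "executions": 40_000},
]

def total_seconds(q):
    """Total daily time a query occupies the database, in seconds."""
    return q["avg_ms"] * q["executions"] / 1000

# Rank by total impact rather than per-execution cost.
for q in sorted(queries, key=total_seconds, reverse=True):
    print(f'{q["query"]}: {total_seconds(q):,.0f}s/day')
```

The sub-millisecond lookup dwarfs the slow report: at three million executions it accounts for 2,400 seconds of database time a day versus the report's 50, which is exactly the kind of load a cache layer can absorb.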
The question is: how do you combat this issue? The code has been fine-tuned and runs at optimal performance; it's just the volume of application calls that is causing issues. One answer is Azure Cache for Redis.
What is Azure Cache for Redis?
Simply put, it is a dedicated in-memory cache data store, based on open-source Redis, that can be accessed by applications inside or outside of Azure. It lets you load data into the in-memory store, read directly from it, and reduce the number of calls to your database. Placing data in this cache layer keeps the application from making the same data calls over and over again. This can dramatically improve database performance, reduce latency, and free up resources for other data requests by shifting the load to the cache and away from the database layer. It does require changes to your application code; however, it can really increase database performance. You can see an example of a fairly complex app here in Microsoft docs.
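The usual application-side pattern here is cache-aside: check the cache first, fall back to the database on a miss, and store the result for subsequent calls. A minimal sketch, using a plain dict to stand in for Redis so it runs anywhere (with the redis-py client you would swap the dict reads and writes for GET/SETEX calls against your cache endpoint, with a TTL):

```python
db_calls = 0

def query_database(customer_id):
    """Stand-in for an expensive Azure SQL DB call."""
    global db_calls
    db_calls += 1
    return {"id": customer_id, "name": f"Customer {customer_id}"}

cache = {}  # stand-in for Redis; GET/SETEX against the cache in real code

def get_customer(customer_id):
    """Cache-aside: check the cache first, hit the database on a miss."""
    key = f"customer:{customer_id}"
    if key not in cache:
        cache[key] = query_database(customer_id)  # SETEX with a TTL in Redis
    return cache[key]

# A thousand identical calls now cost a single database round trip.
for _ in range(1000):
    get_customer(42)
print(db_calls)  # 1
```

The function names and key format are hypothetical; the point is that every repeated call after the first is served from memory, which is exactly the repeated-pull load described above. In a real deployment you'd also pick a TTL that matches how stale the data is allowed to get.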
This can be not only a performance gain but a monetary one as well: the reduced resource consumption could, in fact, allow you to scale down your Azure SQL Databases.
Create a resource, and under Databases choose Azure Cache for Redis.
Pay attention to the cache type options, and be sure to click the link to the pricing tiers so you can pick the correct one for your environment. In this case I am choosing the cheapest one, Basic C0, which only gets me a 250 MB cache at an estimated $16 per month; larger ones can get a little pricey.
For Networking, you will have to choose a public or private endpoint, then choose Next.
Now choose which Redis version you want; note that version 6 is in preview. Next takes you to the Tags options, which I skip because I have no need to tag my resources. Lastly, we review and create the resource. Now that we have an Azure Cache for Redis resource created, there are quite a few more steps before you can use it, store data in it, and access it from your applications. I'll leave those steps to you; in this post I just wanted to show you where to find it in the portal and how to create it.
If you work in an environment that issues the same data calls thousands of times an hour, this may be a really great resource for you to look into. I highly suggest you add Azure Cache for Redis to your performance tuning toolkit. You can find all the information you need to continue with the process here. Be sure to also read up on the security considerations. There is a lot of useful documentation within Microsoft docs that can be found here as well.
The post Add Azure Cache for Redis to Your Azure SQL Performance Tuning Toolbox appeared first on A Shot of SQLEspresso.