Is 18TB of Storage Enough?

Published On: 2021-07-16 By: John Morehouse

Come to find out, nope, probably not. My good friend Jim Donahoe (blog|twitter) was kind enough to tell me that he’s running 100TB of storage in his house across multiple Synology devices. I also know for a fact that another good friend, David Klee (blog|twitter), runs enough storage in his house to help heat his home during winter in Nebraska. I’m nowhere close! You might have guessed that I’ve added storage to my home network: 18TB of it, to be more precise.

The real question is: what am I going to do with 18TB of storage at home? What would you do with it, or even with 100TB?

Disclaimer: I received a Synology DS920+ on loan from Synology in exchange for reviewing it, playing around with it, and writing some blog posts. Honestly, I’ve wanted one for years but never got around to picking one up, so I’m really excited to have this opportunity. It came with two 380GB Synology NVMe SSDs to be used for caching. I purchased my own hard drives so that if or when I have to return the unit, I can keep the drives and ensure my data stays with me. Plus, it never hurts to have some spare hard drives lying around. Yes, I know, I’m paranoid.

Now, back to my question.  Why am I doing this?

For a few different reasons, actually.

  1. One thing I really want to do is some hands-on work with storage pools and volumes.  While I know the fundamentals, having this DS920+ physically on my network and loaded with drives gives me the chance to carve out storage pools and volumes myself.
  2. This also gives me the chance to work with Synology Hybrid RAID (SHR).
  3. Backups, backups, backups.  With 18TB of raw storage to distribute, I can ensure that every computer in my house is backed up. I won’t get the full 18TB once RAID redundancy is taken out (with SHR across four equal drives, for example, roughly one drive’s worth of capacity goes to protection, leaving about three-quarters of the raw total), but that’s still a good chunk of drive space.
  4. Certain Synology units can run the Virtual Machine Manager package.  While I could use Azure for this, sometimes I don’t want to mess with remembering to deallocate VMs when I’m done with them.  This will also let me work with iSCSI targets and attempt some VMware migrations, along with some other VMware-related ideas.
  5. Synology’s Package Center also offers a slew of useful packages (a VPN server, Docker, backup managers, etc.) that will provide some flexibility for testing things as situations come up.  I’m really interested in the VPN capabilities, as well as the ability to back up the entire array to cloud storage.

I’m really looking forward to learning more about Synology, as well as providing solid backup options for my various laptops and workstations.

If you are curious, here are some general specifications for the unit:

  • Model – DS920+
  • CPU – 4-core 2.0GHz (base) Intel Celeron J4125
  • Memory – 4GB, expandable to 8GB
  • Drive Bays – 4
  • M.2 Drive Slots – 2 (NVMe SSD)
  • Ethernet – 2 x RJ-45 1GbE LAN Port (link aggregation / failover)

Stay tuned for future posts about getting everything set up, as well as the drives I have installed.  I am planning to blog about how I’m using this device and what you can do with it.

Do you have a Synology device in your home? What do you use it for? Leave a comment and let me know!

© 2021, John Morehouse. All rights reserved.


Quickly Determining Azure SQL Database Size

Published On: 2021-07-02 By: John Morehouse

Recently I needed to determine how much storage space each database on a logical server was consuming.  I was doing some DR testing and needed to give the client a rough estimate of how much each database was going to cost.  You can do this through the Azure portal by interrogating each individual database; however, that is a long and tedious process, especially if you have many databases on a given server.

Essentially, I was looking for the used space and allocated space values that the portal displays for each database.

I did not want to have to go into each database and manually record the appropriate values.  Instead, I went looking for a faster way and discovered sys.resource_stats.

What is sys.resource_stats?

This is a system view that is available only in Azure SQL Database; you query it from the master database of the logical server.  In other words, you will not find this view in on-premises installations, including SQL Server instances installed on an Azure Virtual Machine. (Azure SQL Managed Instance exposes a similar view, sys.server_resource_stats.)

This view shows us how much storage a database is consuming, as well as how much storage is currently allocated to it.  It will not, however, show you the maximum amount of storage the database can consume.

Here is a quick script that you can use to easily determine how much consumed and allocated storage each database has.

-- Run this against the master database of the logical server.
-- Note: MAX() returns the peak value within the view's retention
-- window (roughly 14 days), which approximates the current size.
SELECT MAX(start_time)                      AS [Last Collection Time]
     , database_name
     , MAX(storage_in_megabytes)            AS [Current Size (MB)]
     , MAX(allocated_storage_in_megabytes)  AS [Allocated Storage (MB)]
FROM sys.resource_stats
GROUP BY database_name
ORDER BY database_name;

We can see that the values returned by this view match the values provided via the portal.
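As the comment in the script notes, MAX() reports the peak over the view’s retention window rather than the latest sample. If you would rather see the most recent sample for each database, a windowed variant like this sketch should work; it is my own variation, not part of the original script:

-- Most recent sample per database (a sketch; run against master).
WITH latest AS (
    SELECT database_name
         , start_time
         , storage_in_megabytes
         , allocated_storage_in_megabytes
         , ROW_NUMBER() OVER (PARTITION BY database_name
                              ORDER BY start_time DESC) AS rn
    FROM sys.resource_stats
)
SELECT start_time                      AS [Last Collection Time]
     , database_name
     , storage_in_megabytes            AS [Current Size (MB)]
     , allocated_storage_in_megabytes  AS [Allocated Storage (MB)]
FROM latest
WHERE rn = 1
ORDER BY database_name;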

One thing to note is that if the database is in the serverless tier and has not been online for some time, this view will not have captured recent information. If the database has been active, however, the data will be available to you.

Summary

This quick script made it easy to determine how much storage my clients’ databases were consuming.  Armed with this information, it was much easier to calculate the pricing estimates they were asking for without having to interrogate the Azure portal database by database.

© 2021, John Morehouse. All rights reserved.


Issues When Using Temporary Tables in Nested Stored Procedures

Published On: 2021-03-26 By: John Morehouse

SQL Server offers a lot of flexibility when working with objects like stored procedures and temporary tables.  One of its capabilities is allowing stored procedures to call other stored procedures, which is known as “nesting” one stored procedure within the context of the calling procedure.  In addition, you can create a temporary table in the parent procedure and use it within the context of the child procedure.

But wait! I’m not done yet!

You can also instantiate a temporary table with the same name as the parent temporary table within the child procedure.  But there is a catch!

From Microsoft Documentation:

Nested stored procedures can also create temporary tables with the same name as a temporary table that was created by the stored procedure that called it. However, for modifications to resolve to the table that was created in the nested procedure, the table must have the same structure, with the same column names, as the table created in the calling procedure. 

In other words, if the child temporary table reuses the parent’s name, its structure and column names must be identical to those of the parent temporary table.  Otherwise, mass confusion reigns supreme and you can get errors when attempting any data modifications against the child temporary table.

Let’s look at some code so that we can see the behavior.

Examining the behavior

First, here’s the code to show the example:

USE Test
GO

-- Parent procedure: creates #test with a column named "name"
CREATE OR ALTER PROCEDURE dbo.parent
AS
BEGIN
       CREATE TABLE #test (name sysname);
       INSERT #test (name) VALUES ('Joey');

       EXEC dbo.child;

       SELECT * FROM #test;
END
GO

-- Child procedure: creates its own #test, but with a different
-- column name ("fname") than the parent's table
CREATE OR ALTER PROCEDURE dbo.child
AS
BEGIN
       CREATE TABLE #test (fname sysname);
       INSERT #test (fname) VALUES ('John');

       SELECT * FROM #test;
END
GO

-- Execute the parent procedure
EXEC dbo.parent;

Now, let’s step through it so that we have a clear understanding of what it is doing.

  1. Create a parent procedure that creates a temporary table called #test with “name” as the column name.
    1. Insert a row into the parent temporary table
    2. Call a child stored procedure
    3. Select from the parent temporary table
  2. Create a child stored procedure that creates a temporary table also called #test with “fname” as the column name. Note that this column name is different from the parent temporary table.
    1. Insert a row into the child temporary table
    2. Select from the child temporary table

Below is the error that is returned when executing the code block.

[Screenshot: SSMS showing an “invalid column name” error for the fname column]

The error states that the “fname” column does not exist in the temporary table, but we can see from the code block that it most definitely does.  This means that SQL Server is attempting to modify the parent temporary table and NOT the child temporary table, as one might expect.
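Conversely, if the child’s #test matches the parent’s structure exactly, the documentation quoted above says the modifications resolve to the child’s table. Here is a minimal sketch of that scenario (my addition, reusing the setup from the example above); I would expect the child’s SELECT to return ‘John’ while the parent’s final SELECT still returns only ‘Joey’:

-- Child procedure whose #test matches the parent's structure exactly
CREATE OR ALTER PROCEDURE dbo.child
AS
BEGIN
       CREATE TABLE #test (name sysname);
       INSERT #test (name) VALUES ('John');

       SELECT * FROM #test;  -- the child's copy: 'John'
END
GO

EXEC dbo.parent;  -- the parent's final SELECT returns only 'Joey'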

Summary

When working with nested procedures and nested temporary tables, make sure the tables either have identical names and structures or use different names entirely.  Doing so helps eliminate any issues with SQL Server not resolving to the appropriate temporary table, as illustrated in the sketch below.
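For illustration, here is a minimal sketch of the renaming approach; the #testChild name is hypothetical, not from the original post:

-- Child procedure using a distinct temporary table name, so there is
-- no ambiguity with the parent's #test (the name is illustrative only)
CREATE OR ALTER PROCEDURE dbo.child
AS
BEGIN
       CREATE TABLE #testChild (fname sysname);
       INSERT #testChild (fname) VALUES ('John');

       SELECT * FROM #testChild;
END
GO

-- Re-running the parent now succeeds: the child inserts into its own
-- #testChild while the parent's #test remains untouched
EXEC dbo.parent;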

 

© 2021, John Morehouse. All rights reserved.
