Over the past couple of blog posts, I have been talking about the versatility of deploying SQL Server with Docker. This combination is a great way to quickly and easily spin up local SQL Server instances. In the most recent post, I talked about a method to copy and restore a sample database into a Docker container that is running SQL Server. In this post, I am going to talk about an easier way to accomplish this by attaching a persistent volume to the container. With this method you don’t have to copy any files into the container, and it makes the overall process easier and repeatable.
First, before we get into the code, let’s talk about what a volume is. Essentially, a volume is a location on the host machine that can be referenced by the container; I think of it as a shared folder that the container can see. Once the volume is attached, the container can read from and write to it. You can easily declare the volume when you create the container with a simple switch in the command.
Let’s take a look!
Creating a Local Folder
Since I’m going to use a volume to hold backup copies of SQL Server sample databases, I am going to create a new local folder named “sampledbs”. From a terminal session within your local profile, you can issue the “mkdir” command, which stands for “make directory”. The mkdir command exists in several operating systems, including Windows, so it may be familiar to you.
Next, I want to put a copy of the AdventureWorks2017 database backup file into the folder we just created. If you haven’t already, you can download the file from Microsoft’s GitHub repository. In my case, I already have it downloaded, so I’ll copy it to the appropriate folder using the “cp” command, which stands for “copy”.
Shown above, I am in the “mssql” directory in my profile. I issue the copy command to copy the AdventureWorks2017.bak file into the new directory. Then I move into the new sampledbs directory and issue a “list” or “ls” command to show its contents. We can see that the file now exists in the right directory.
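The sequence above can be sketched end-to-end as follows (a placeholder file stands in for the downloaded backup so the commands run as written; substitute the real .bak file in practice):

```shell
# Stand-in for the downloaded backup file
mkdir -p ~/mssql
touch ~/mssql/AdventureWorks2017.bak

# Create the shared folder, copy the backup in, and verify
mkdir -p ~/sampledbs
cp ~/mssql/AdventureWorks2017.bak ~/sampledbs/
ls ~/sampledbs
```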
Next, we need to share the folder within Docker and then we can create a container with a volume.
Sharing the Folder in Docker
In order for the volume to be mounted to the container, Docker must be granted access to do so. This is done by specifying the folder as “shared”.
Open the Docker Dashboard. This can be done by right-clicking the Docker icon (it looks like a ship) in the system tray and selecting Dashboard.
Once the dashboard has opened, do the following:
Next, go to the gear icon in the upper right area.
Then, on the left-hand side, expand Resources and click on File Sharing.
Click on the plus icon and browse to the folder that you just created in the step above.
Apply & Restart.
After restarting, the folder is now able to be attached to a container. Now we can create a container with a volume.
Creating a container with a volume
The process to create a volume when creating a container is straightforward. If you recall from previous posts, we create the SQL Server container with this command,
docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=yourStrong(!)Password' -p 1433:1433 -d mcr.microsoft.com/mssql/server:2019-latest
If we wanted to add a volume to this container, we would include the “-v” or “--volume” switch. As shown below, the volume switch tells Docker that I want to attach the volume “/Users/jmorehouse/sampledbs”, which was defined in the previous step, to the container. Inside the container, it will appear as the folder “sampledbs”.
I think of this as a one-to-one mapping. The volume lives outside the container itself, and inside the container it is represented by the folder “sampledbs”. The syntax is local volume location:location inside the container.
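Putting it together, the full command would look something like this (the host path shown is from my machine; adjust it to your own):

```shell
docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=yourStrong(!)Password' \
  -p 1433:1433 \
  -v /Users/jmorehouse/sampledbs:/sampledbs \
  -d mcr.microsoft.com/mssql/server:2019-latest
```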
So far, we have examined how to create a container with a volume attached. In my previous post, I discussed how to copy a backup file from the host machine into a container. That allowed me to restore a database within the container, but it meant I had to copy the file and duplicate it. By using a volume, I don’t have to copy anything into the container. I can now either issue a command in the container (like a SQLCMD query) to restore it or do it through tools like Azure Data Studio. Let’s take a look at using Azure Data Studio.
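For the command-in-the-container route, a sketch of a sqlcmd restore looks like the following (the container name “sql2019” and the SA password are assumptions; the image ships sqlcmd under /opt/mssql-tools):

```shell
# Run the restore from inside the container via sqlcmd
docker exec -it sql2019 /opt/mssql-tools/bin/sqlcmd \
  -S localhost -U SA -P 'yourStrong(!)Password' \
  -Q "RESTORE DATABASE AdventureWorks2017 FROM DISK = N'/sampledbs/AdventureWorks2017.bak' WITH MOVE 'AdventureWorks2017' TO '/var/opt/mssql/data/AdventureWorks2017.mdf', MOVE 'AdventureWorks2017_log' TO '/var/opt/mssql/data/AdventureWorks2017_log.ldf'"
```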
First, I’ll start the container with the volume mounted.
Once the container is up and running, I start Azure Data Studio and begin a new query. First, I’m going to see if SQL Server can see the AdventureWorks2017 backup file in the sampledbs folder that was created.
RESTORE FILELISTONLY FROM DISK = N'/sampledbs/AdventureWorks2017.bak'
Success! SQL Server can see the file and read its file list. This means that I can now restore it.
RESTORE DATABASE AdventureWorks2017 FROM DISK=N'/sampledbs/AdventureWorks2017.bak'
WITH MOVE 'AdventureWorks2017' TO '/var/opt/mssql/data/AdventureWorks2017.mdf',
MOVE 'AdventureWorks2017_log' TO '/var/opt/mssql/data/AdventureWorks2017_log.ldf'
Remember, the backup file doesn’t actually exist within the container. It lives in a folder on the host, in this case my Mac laptop, and uses the mapping to reach into the host to obtain the file.
This was a brief look at using volumes in conjunction with containers. The volume can now be used with other containers which prevents having to manually copy files, such as backup files, into the container. The container itself can now just reach into the volume and do what it needs.
Last month I blogged about using Docker to run SQL Server as a quick and easy way to get SQL Server up and running. While it continues to be immensely useful, there aren’t any user databases running on it. The only databases present are the system databases. Yes, I could manually create a database, but it would be a lot easier to have a sample database available.
How do we restore a sample database in a container that is running on a Mac laptop? Let’s check it out!
Disclaimer: I’m not a container expert, so there might be other ways of doing this. This is just how I’ve figured out how to make the magic happen.
Also, if you have not read the first blog on how to get SQL Server running in Docker on your Mac, take a few minutes to read through it.
Here are the steps that we will take to make this work:
Download one of the sample databases from Microsoft’s GitHub repository. I have a “mssql” directory in my local profile to make things easier.
Make sure the container is started. You can issue a “docker ps” command in a terminal to see which containers are running.
Create a directory within the container
Copy the sample database backup file into the directory you just created
Restore the database onto the SQL instance that is running within the container
Sounds simple, right? Let’s see!
Create a directory
To create a directory within the container (remember that the container is running Linux), we can issue a command to the container that will create the directory. Using Terminal (you can go to Spotlight to find the Terminal program or it is under Utilities in the Applications folder), execute the following command,
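The screenshot with the exact command is not reproduced here, but based on the container name and destination path used in the steps below, it likely looked like this (a reconstruction, not the original screenshot):

```shell
# Create a backups directory inside the running container
docker exec -it sql2019 mkdir /var/opt/mssql/backups
```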
Next, we copy the backup file into the new directory with a “docker cp” command. Breaking that command down:
docker – this indicates that we are going to be doing something with Docker
cp – this is short for “copy”
./mssql/AdventureWorks2017.bak – this is the path of the source file that is being copied into the container. The “.” indicates starting from whatever working directory I am in, which is my profile directory, as indicated by the “jmorehouse$” prompt
sql2019 – this is the name of the container
:/var/opt/mssql/backups – this is the destination directory that is within the container.
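Putting those pieces together, the full copy command is:

```shell
# Copy the backup from the host into the container's backups directory
docker cp ./mssql/AdventureWorks2017.bak sql2019:/var/opt/mssql/backups
```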
Once the command is complete, we can check to make sure that the file was copied successfully.
docker exec -it sql2019 ls /var/opt/mssql/backups
The “ls” refers to “list”. This is equivalent to executing a “dir” command in DOS.
Restore the Database
The backup file now resides within the container and we just need to tell SQL Server to restore it. In this section, I will be using Azure Data Studio and native T-SQL commands.
Let us first check that SQL Server can see the file.
RESTORE FILELISTONLY FROM DISK=N'/var/opt/mssql/backups/AdventureWorks2017.bak'
Excellent! SQL Server can see the backup file which means that we can restore it. Notice on the left-hand side, there are no user databases, just system databases. Also notice that the physical names of the database shown above are from the Windows Operating System. Since SQL Server is running on Linux within the container, we will have to move the physical files to a different location.
RESTORE DATABASE AdventureWorks2017 FROM DISK=N'/var/opt/mssql/backups/AdventureWorks2017.bak' WITH
MOVE 'AdventureWorks2017' TO '/var/opt/mssql/data/AdventureWorks2017.mdf',
MOVE 'AdventureWorks2017_log' TO '/var/opt/mssql/data/AdventureWorks2017_log.ldf'
Above we can see that the database was restored and then subsequently upgraded to the SQL Server 2019 database version. If you refresh the Databases branch on the left-hand side, the AdventureWorks2017 database is now present!
Docker continues to be my current “go to” when I need a quick and easy SQL Server solution to play around with. While I absolutely recommend Azure and its offerings, utilizing Docker on my local laptop is just faster and, frankly, fun to play around with. Now that I can easily restore databases, it’s that much better of a solution for non-production uses.
Like these posts? Make sure to subscribe to get alerted when new posts arrive!
Sometimes the brilliance of a solution is lost in its simplicity. Full disclosure: this was Joey D’Antoni’s (B|T) idea; however, he has given me permission to blog about it.
Recently, Joey and I were helping a client reconfigure their production SQL Server instances to have a new and improved tempdb configuration. We knew that this would require an outage to ensure the changes had correctly taken effect.
Of course, we ran into a snag. The current configuration had 30-plus tempdb data files, and any attempts to delete the files were blocked because they were actively being used. Nothing we tried gave us the ability to delete the excess files.
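For context, removing a tempdb data file is a T-SQL statement like the following (the logical file name here is hypothetical); statements like this were what kept failing while the files were in use:

```sql
-- Attempt to drop one of the extra tempdb data files
ALTER DATABASE tempdb REMOVE FILE tempdev12;
```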
We even tried to restart the instance into single user mode, however, every time that happened something else would take the connection before we could get into the instance. We eventually restarted the SQL Server instance to normal operation so that we could investigate why we could not get a connection when in single user mode.
Turns out that, given the production nature of the instance, the client’s large farm of application servers was connecting to it faster than we could. This was discovered by using sp_who2; you could also use the DMV sys.dm_exec_connections to see what is connecting to the instance. So, we needed a way to block incoming connections while not being invasive, like shutting down the application servers or making a large network change.
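For example, sys.dm_exec_connections shows each connection along with the client address it came from:

```sql
-- Who is connected, from where, and since when
SELECT session_id, client_net_address, connect_time
FROM sys.dm_exec_connections;
```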
This is where the brilliance comes in.
Disable TCP/IP as a network protocol to SQL Server. BOOM. Instant firewall.
If you open SQL Server Configuration Manager, you will probably see something like this. If you have more features or multiple named instances installed, the list on the right-hand side might look a bit larger, but for the most part it should be similar.
Expand SQL Server Network Configuration in the left-hand menu and select Protocols for the instance you wish to block connections to. In this case, I just have the default instance installed on my laptop.
On the right-hand side, notice that the TCP/IP protocol is currently enabled.
You can disable it by double-clicking on TCP/IP and then changing the Enabled option to No.
An alternative method is that you can also simply right click on the protocol and choose to enable or disable it.
With either method, you will have to acknowledge that the changes will not take effect until the service has been stopped and restarted.
Go ahead and restart the service. Once back online, the TCP/IP protocol will be disabled and any servers wishing to connect will be effectively denied. Keep in mind that we had local access to the server itself and retained connectivity to it even though SQL Server no longer allowed TCP/IP connections. Utilizing the shared memory protocol (it was the only thing enabled), we were able to restart into single user mode, get connected, and successfully remove the extraneous tempdb data files.
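As a sketch of the single-user portion of that sequence (Windows, default instance; the server name is a placeholder): the -m"SQLCMD" startup parameter limits the single admin connection to sqlcmd, and the lpc: prefix forces a shared memory connection.

```
:: Stop the instance, then restart it in single-user mode,
:: allowing only sqlcmd to take the one connection
net stop MSSQLSERVER
net start MSSQLSERVER /m"SQLCMD"

:: Connect locally over shared memory (TCP/IP is disabled)
sqlcmd -S lpc:YourServerName -E
```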
Once the tempdb work was completed, we re-enabled the TCP/IP protocol and restarted the instance back into normal operation. Double-checking sp_who2, we also verified that all the application servers had resumed connectivity to the SQL Server instance.
This is an example of a quick and effortless way to block all inbound connections to the SQL Server instance. This method required zero intervention from any engineers to stop application servers or implement firewall changes. A few simple clicks of the mouse and we were back in business.
Sometimes it is the simple things that are astonishing and brilliant.
My day-to-day work machine is a Windows 10 HP laptop. It remains docked at my desk, where it happily helps me go about my daily business. However, my backup machine is an Apple MacBook Pro. Its small footprint and ease of use allow me to easily shift working locations if needed. Of course, these days I’m maintaining social distancing, so getting away from my desk to work elsewhere means working from my kitchen table for a change of scenery.
When I shift working locations, I work on my MacBook, and I still need access to a SQL Server instance. Yes, I could install SQL Server on the OS, but it’s my personal machine and I’m not that familiar with SQL on Linux (yet). Docker, however, is a great solution that allows me to run SQL Server containers on my MacBook without having to set up anything. Let’s look at how easy it is to set up Docker.
Go out to docker.com, download, and install the Docker Desktop application. This is what will allow you to run container images. The images are what define the container which then in turn runs the application, which in our case is SQL Server. Once you have it installed, you’ll have a dashboard like the one shown below.
At this point, we don’t have any containers running, so we first need to get a container image and then create a container. I got my image from Docker Hub, a library of container images that you can browse and download as needed. Searching for SQL Server images, we can see that Microsoft has its own registry with a few available container images. The featured tags shown below are some often-used images. If you scroll further down the page, you can browse the full list of available tags that can be used to download an image.
For my purposes, using the latest and greatest for testing and validating things is perfect so I’m only interested in the “2019-latest” container image. To obtain the image, you can issue a pull command from within a terminal session. You’ll notice that Docker Hub provides you with the command in the above image.
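For the 2019-latest tag, that pull command is:

```shell
# Download the SQL Server 2019 container image from Microsoft's registry
docker pull mcr.microsoft.com/mssql/server:2019-latest
```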
We can check to make sure that the container image was downloaded as well as view any other images we might have. As shown below, you can see that I have three images available.
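That check is done by listing the local images:

```shell
# List all container images stored locally
docker images
```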
Starting the SQL Server Container
Now that we have a container image downloaded, we can initiate setting up the container itself. Since SQL Server has already been installed within the container image, it must now be configured for it to successfully start. Essentially, we need to tell Docker to run the image and pass in some parameters.
Here’s the command:
docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=yourStrong(!)Password' -p 1433:1433 -d mcr.microsoft.com/mssql/server:2019-latest
Here’s what the parameters mean:
-e represents environment variables. Here we are accepting the end user license agreement (EULA) as well as supplying an SA password (both are required).
-p represents the ports. The value on the left side of the colon is the port on the host, and the value on the right side is the port within the container. In this example, both sides will be listening on port 1433, which is the default port for SQL Server.
-d runs the container in detached mode, meaning it runs in the background. The final argument, mcr.microsoft.com/mssql/server:2019-latest, is the container image to run.
Running this command, if all is correct, will return a GUID-looking value, which is the container ID.
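You can also verify from the terminal that the container is running:

```shell
# Show all currently running containers
docker ps
```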
We can also see that the Docker dashboard now has a running instance.
Using Azure Data Studio, I can now connect to the localhost on port 1433. We can see that the version is 2019 CU4 which is the latest release for SQL Server 2019.
It’s that easy!
If you are stuck on a command, simply run docker + command + “--help” to get a list of available parameters. For example, if I need to see what parameters are available for the run command, I would type “docker run --help”. The output displays all of the available parameters that you can supply with the “run” command.
Using Docker allows me to quickly and easily stand up an instance of SQL Server. Oftentimes I just need an instance to test a feature or figure out how something works. All in all, this process takes about 5-10 minutes and is significantly faster than standing up a VM. If you are new to Docker, try it out. It works on Windows too!
Watch for future blog posts on how to use Docker with SQL Server! Subscribe for email notifications when new content is posted!