One of the challenges of being a consultant is working with a number of clients, each with its own login credentials and accounts. In the early days of Azure this was exceptionally painful, but over time the experience of using the portal with multiple identities and connecting to multiple Azure tenants has gotten much easier. However, when writing PowerShell or Azure CLI code, switching accounts and contexts is still somewhat painful. And when you are doing automation, you may be touching a lot of resources at once, so you want to be extra careful that you are in the right subscription and tenant.
Enter cloud shell.
If you click on the highlighted icon in the Azure Portal, you will launch cloud shell. This requires an Azure Storage account, which will consume a small amount of resources (€$£). Don’t sweat this; it will literally cost pennies per month, unless you decide to upload terabytes of images to your cloud shell (don’t do this). The storage is there so you can maintain a history of commands and even store script files.
With cloud shell you are automatically logged into the tenant associated with your login, but you will still need to select the subscription. As shown below, you can see the subscriptions available to your login.
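For example, from a PowerShell cloud shell you can list what your login can see and explicitly set the context (the subscription name below is a placeholder):
#list every subscription visible to the logged-in account
Get-AzSubscription
#switch the current context to the right subscription
Set-AzContext -Subscription "MySubscription"
#confirm the active subscription and tenant before running anything
Get-AzContext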
The other cool thing about cloud shell is that you also have built-in text editors, including vim and code. This means you can paste code into a text editor and save it in your shell. Since you have a storage account, that data is persisted, so you can keep a whole collection of scripts in your cloud shell. This is great for developing Azure automation, or just running ad-hoc scripts.
You can also go full screen with code–as shown above. While all of the examples I’ve shown have been PowerShell, you can also launch a bash shell running the Azure CLI.
Recently I was setting up a new phone number queue in Microsoft Teams. As part of this, I was trying to assign a new phone number to the queue. The problem was that every time I tried, the Microsoft Teams admin page said that I didn’t have any available phone numbers, even though I knew for a fact that I had a phone number currently available and set up as a service phone number.
The solution was to use the PowerShell cmdlet Set-CsOnlineVoiceApplicationInstance to update the resource account with the correct phone number. The problem was that everything online said to connect to SkypeOnline with PowerShell, but nothing ever said what was needed to get that done. (Yes, this cmdlet is from Skype for Business, not Teams, even though this is being done through Teams.)
I was eventually able to piece together what needed to be done.
The Skype for Business cmdlet modules aren’t available through Install-Module. They need to be downloaded and installed from the Microsoft website.
After the installer is run, open a new PowerShell window in Admin mode.
From there use PowerShell to authenticate and connect to SkypeOnline.
If you don’t have MFA configured use the following PowerShell code.
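(A sketch using the Skype for Business Online Connector module that installer provides; swap in your own admin credentials.)
#load the connector module installed above
Import-Module SkypeOnlineConnector
#prompt for admin credentials (this path only works without MFA)
$credential = Get-Credential
#create the remote session and import its cmdlets locally
$session = New-CsOnlineSession -Credential $credential
Import-PSSession $session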
Now keep in mind that I’m using a US phone number here, hence the +1. If you are in another country, your country code will be different. Whatever phone number you have available as a service phone number is the number you enter here.
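The call that fixed it looked along these lines (the resource account and number here are placeholders, not the real ones):
#placeholder UPN and number; use your own resource account and service number
Set-CsOnlineVoiceApplicationInstance -Identity "aa_queue@yourdomain.com" -TelephoneNumber "+15555551234"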
Once this cmdlet completed, the phone number was allocated to the resource account, and the number was set up and ready to go.
We have a customer who is moving platforms, and as part of this, I’ve been tasked with testing a lot of different storage configurations. This means heavy use of DiskSpd, a disk performance benchmarking tool from Microsoft. We could argue about the value of synthetic disk benchmarks, but they are good for testing a variety of disk configurations with a standardized tool so you can compare results. DiskSpd also has the benefit of writing the runtime configuration into the results file, so as long as you have your results file, you know what parameters the test ran with. (You do still have to document your disk configuration; we are using the name of our output file for this.)
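For context, a typical DiskSpd run looks something like this (the flags and file names are illustrative, not the exact tests from this engagement):
#60-second run, 30% writes, random 8K I/O, 8 threads, 8 outstanding I/Os,
#caching disabled; the output file name documents the disk configuration
.\diskspd.exe -c10G -d60 -r -w30 -t8 -o8 -b8K -Sh D:\testfile.dat > .\P30-single-disk.txt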
Anyway, I have a bunch of these files, and I needed to get the data into Excel. Since I was too lazy to figure out how to parse a text file in C#, my first thought was to use some combination of sed, awk, and grep in a bash shell. I reached out to my friend Anthony Nocentino (b|t) about his thoughts on the best way to do this, and he immediately said PowerShell.
When I asked about how to do things I wanted to do with specific bash commands, he mentioned the fact that I could use bash statements that supported standard input and output in PowerShell. The linked blog shows how to do this in Windows, however I wrote all of this code in PowerShell natively on my MacBook Pro.
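The gist of it looked something like this (a simplified sketch; the sed expressions and CSV headers are illustrative rather than the exact ones from the real script):
#read the DiskSpd results file (placeholder file name)
$Results = Get-Content ./diskspd-results.txt
#grep -m 1 keeps only the first line containing "total:"; the sed
#expressions strip spaces and turn the pipe separators into commas
$total = $Results | grep -m 1 "total:" | sed -e 's/ //g' -e 's/|/,/g'
#drop the leading "total:" label, then write a header row and the values
$fields = $total.Split(":")[1]
"BytesTotal,IOs,MiBperSec,IOPS" | Out-File ./results.csv
$fields | Out-File ./results.csv -Append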
As you can see, I’m passing the output in my $Results variable to a grep to give me the first match on the word “total”, and then using sed to do a couple of find-and-replace operations to make parsing the file a little easier. After I’ve done all that, I split the array into a comma-delimited set of results and output it to a CSV file. This allows you to grab the results, with headers, and open them in your favorite spreadsheet software. For posterity, the code is available in our GitHub repo here.
Out of necessity are born the tools of laziness. This is a good thing. I have found that organizing and running a SQL Saturday event is a great way to create scripts and processes that make an organizer’s life that much easier. The 2019 Louisville SQL Saturday led me to create a quick script that downloads all of my sponsor logos into a single folder.
The script is written in PowerShell simply because PowerShell has great versatility and it suited my needs. The goal was to be able to download the logos and send them off to one of my volunteers, who was going to put them on signage and get them printed. At the time I had no easy way to do this without manually pulling each one off the site.
Let’s get to it!
The Script
First, I need to set some variables to make it easier to use. I could make these into parameters for usability, however, for my needs since I only run this once a year having just variables is acceptable for me.
The event number is the number that corresponds to the particular SQL Saturday event whose logos you want to download. Note that this would work for any event, not just one you might be organizing.
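The variables look something like this (the values are examples, not the real event’s):
#event number from the SQL Saturday website (placeholder value)
$eventNum = 1234
#folder the logos will be downloaded into
$outputFolder = "C:\SQLSaturday\SponsorLogos"
#set to $true to zip the folder up at the end
$compress = $true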
Next, I need to fetch the XML feed from the event. The XML feed has a wealth of information in it, including the image URL for all of the sponsors.
I will also get the sponsor name, what level they sponsored at (that’s the label column) and the URL for their logo.
#let's get the XML from the SQL Saturday website
[xml]$xdoc = (Invoke-WebRequest -Uri "http://www.sqlsaturday.com/eventxml.aspx?sat=$eventNum" -UseBasicParsing).Content
#we only need a subset of each node of the XML, mainly the sponsors
$sponsors = $xdoc.guidebookxml.sponsors.sponsor | Select-Object name, label, imageURL
We want to ensure that our output folder (the path from the variable above) exists; otherwise the process won’t work. If the folder doesn’t exist, it will be created for us.
If there is an error, there is a CATCH block that will capture the error and react accordingly.
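Something like this covers both cases (a minimal sketch):
#create the output folder if it doesn't already exist
try {
    if (-not (Test-Path -Path $outputFolder)) {
        New-Item -Path $outputFolder -ItemType Directory | Out-Null
    }
}
catch {
    #without a destination folder there is nothing to download into
    Write-Error $_
    return
}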
Now that I have a destination folder, I can begin to download the logos into the folder. In this block, I will loop through all of the sponsors.
First, I need to do some clean up in the sponsor names. Some sponsors have commas or “.com” in their name and I wanted to use the sponsor name as the file name so I knew who the logo belonged to. Once the cleanup is done, I used the INVOKE-WEBREQUEST cmdlet to fetch the file from the respective URL and output the file into the destination directory.
#give me all of the logos
foreach ($sponsor in $sponsors){
    #get the file name and clean up spaces, commas, and the dot coms
    $fileName = $sponsor.imageURL | Split-Path -Leaf
    $sponsorName = $sponsor.name.Replace(" ", "").Replace(",", "").Replace(".com", "")
    Invoke-WebRequest -Uri $sponsor.imageURL -OutFile "$outputFolder\$($sponsorName)_$($sponsor.label.ToUpper())_$($fileName)"
}
Since I will be sending this to a volunteer to use, I wanted the process to automatically zip up the folder to make things easier. I’ll name the archive the same as the folder, so I can use the SPLIT-PATH cmdlet to get the leaf of the directory path, which is the folder name. Using the COMPRESS-ARCHIVE cmdlet, I can then compress the folder and put the archive into that same folder.
# zip things up if desired
If ($compress -eq $true){
    $fileName = $outputFolder | Split-Path -Leaf
    Compress-Archive -Path $outputFolder -DestinationPath "$outputFolder\$($fileName).zip"
}
Finally, I wanted the process to open the folder when it was done. This is simply accomplished by calling “explorer” along with the destination folder name, which will launch the folder in Windows Explorer.
# show me the folder
explorer $outputfolder
Summary
PowerShell is a great way to quickly and easily accomplish tasks. Whether you are working with online data or manipulating things on your local computer, this was a quick and easy way to make my life as a SQL Saturday event organizer that much easier.