In this one-day session, we'll review the infrastructure components that make up the Microsoft Azure platform. When it comes to moving SQL Server systems into Azure, a solid understanding of the Azure infrastructure will make migrations more successful and the resulting solutions easier to support. Designing your Azure infrastructure properly from the beginning is extremely important: an improperly designed and configured infrastructure will cause performance and manageability problems that can be difficult to resolve without downtime. As Azure scales out around the world, many more companies will begin moving services from their data centers into the Azure platform, and a solid foundation is the key to successful migrations.
Creating a Power BI report involves data analysis, cognitive science, graphic design, communication, and user experience design. To be successful, we need to approach it holistically, rather than seeing it as "just a data thing".
This session will use lecture, demos, and hands-on activities to learn about Power BI features, techniques for building reports, visual design concepts, and the basics of human perception and learning. You’ll learn how to define your message, choose an appropriate visual, and format the visual to optimize information intake. We’ll also discuss how to arrange our visuals coherently in a report and how to implement storytelling in Power BI.
You'll leave the session with an understanding of how to define success in data visualization, how humans consume visual information and interact with reports, and how to create a report that communicates clearly and stimulates viewer engagement. Attendees will receive a list of resources to use in future report design endeavors.
Have you recently decided to move to Microsoft Azure, but don’t know where to start? Do you have concerns about how to get your data there? Thankfully, migrating to Azure isn’t as daunting as you might think.
In this full-day, hands-on session, you will learn about the foundations of the Microsoft Azure SQL Platform ecosystem. We will examine the differences between Azure SQL Database, Managed Instances, and SQL Server in an Azure VM, as well as how to migrate to each of these options. We will ensure that you understand the basics around networking in Azure and how the network connects back to your on-premises environment. After performing some live migrations, we will explore topics such as ensuring data safety, including high availability, disaster recovery, security controls, and encryption.
By the end of this session, you will have newfound confidence to help you get your data up into the cloud in a safe, secure, cost-effective, and successful manner!
A 2 Day Learning Path
If you are new to the cloud and want two full days that will take you from zero to ready to deploy in Azure, I'd highly recommend Denny's session on Monday, followed by John's session on Tuesday. While Denny's session covers the infrastructure and all the various components you can use in Azure, John's brings that Azure knowledge back to the flavors of SQL Server that Azure focuses on:
Azure SQL Database
Azure SQL Managed Instance
Azure SQL within a Virtual Machine
While Denny will focus on what can be set up, John will focus on how to set up specific features for best performance, as well as how to migrate to those features. For those who don't want to take both sessions, there will be some overlap between the two so that everyone is up to speed on the SQL Server nuts and bolts before getting into the how-to and migration pieces.
We’re extremely proud of our three speakers, and all three of these sessions are going to be fantastic. We hope to see you at these sessions, and the rest of the PASS Summit. So get registered before these great precon sessions sell out.
Note: This is my first blog post in a while. You too may have been stressed by recent events in the world. I'm also working on a large content project that I hope you'll get to enjoy in the near future. This post actually ties into working from home.
So anyway, one of our customers recently spun up a new Azure SQL Database in the Azure South Africa North region, and two of their team members couldn't connect to it from their homes. The common element: both of them used Verizon Fios as their ISP.
It's important to note that connectivity to Azure SQL Database goes to a public IP address. Each region has a public IP address; a DNS lookup takes place, you are connected through a gateway, and you eventually reach your database. The IP address is just a public endpoint. If I run a trace route to a database running in the Azure US East region, I see:
Josephs-MacBook-Pro-3:Dropbox joey$ traceroute dcac-demo.database.windows.net
traceroute to cr4.eastus2-a.control.database.windows.net (22.214.171.124), 64 hops max, 52 byte packets
 1  192.168.115.1 (192.168.115.1)  2.224 ms  2.849 ms  1.953 ms
 2  126.96.36.199 (188.8.131.52)  12.961 ms  12.978 ms  14.438 ms
 3  184.108.40.206 (220.127.116.11)  10.461 ms  9.803 ms  8.983 ms
 4  18.104.22.168 (22.214.171.124)  16.069 ms  6.967 ms  12.133 ms
 5  126.96.36.199 (188.8.131.52)  11.460 ms  15.657 ms  11.314 ms
 6  be-201-ar03.ivyland.pa.panjde.comcast.net (184.108.40.206)  14.815 ms  13.470 ms  13.775 ms
 7  be-33287-cr01.newark.nj.ibone.comcast.net (220.127.116.11)  18.519 ms  17.500 ms  19.012 ms
 8  be-1301-cs03.newark.nj.ibone.comcast.net (18.104.22.168)  16.457 ms  23.452 ms  20.628 ms
 9  be-2303-pe03.newark.nj.ibone.comcast.net (22.214.171.124)  15.248 ms  21.317 ms  22.250 ms
10  126.96.36.199 (188.8.131.52)  16.467 ms  18.986 ms  23.555 ms
11  ae23-0.ear01.ewr30.ntwk.msn.net (184.108.40.206)  24.966 ms  22.497 ms
    ae24-0.ear01.nyc30.ntwk.msn.net (220.127.116.11)  26.774 ms
12  be-21-0.ibr02.ewr30.ntwk.msn.net (18.104.22.168)  32.792 ms
    be-20-0.ibr01.nyc30.ntwk.msn.net (22.214.171.124)  27.437 ms
    be-20-0.ibr01.ewr30.ntwk.msn.net (126.96.36.199)  34.036 ms
You can see that at hops 7-9 my connection jumps onto the Comcast backbone (ibone.comcast.net) network, and at hop 11 it jumps onto the Microsoft Azure network (msn.net). I truncated the output of the trace route there.
Monica Rathbun (b|t), who works with me at DCAC, also had Fios and was able to help us troubleshoot this. When Monica ran a trace route to the public IP address of the Azure SQL Database in South Africa North, it looked like this:
Tracing route to cr1.southafricanorth1-a.control.database.windows.net [188.8.131.52]
over a maximum of 30 hops:
1 <1 ms 1 ms <1 ms Fios_Quantum_Gateway.fios-router.home [192.168.1.1]
2 3 ms 1 ms 2 ms 184.108.40.206
That is not how the internet is supposed to work. We tried to figure out how to get in touch with someone at Verizon to fix this, which through normal consumer channels is not fun. So while we waited for that to get fixed, we had another option.
Introducing Private Link
Azure Private Link recently became generally available for Azure SQL Database and allows you to have a truly private connection to your database. For several years now you have been able to use virtual network service endpoints to allow a specific VM or App Service to connect to Azure SQL, but that design had some limitations. The first is that it still routed the connection to the database over the public IP address. The second is that service endpoints did not support multi-region scenarios. With Private Link, your database has its own private IP address on the virtual network where the endpoint is deployed.
In our client's case, the workaround involved connecting to an Azure VPN in US East, which would then be connected to South Africa North. One interesting thing about Private Link is that it doesn't support Azure virtual network peering, so you will need to create a gateway if you want your traffic to traverse virtual networks. Additionally, we were working in two different subscriptions, which meant we had to use the PowerShell from here. Private Link is easy to configure from your Azure SQL logical server, but be sure you have the right permissions in Azure RBAC; I needed to be granted the Network Contributor role in order to get the GUI to work.
I had some permissions-related issues that delayed our implementation of Private Link, and I was fortunate enough to have some members of the Azure networking product group helping me out. Since Microsoft and Verizon are technically partners (Verizon is an ExpressRoute partner), they were able to get in touch with some folks at Verizon, who were able to resolve the problem. Sort of: the trace routes still look weird, but everyone can connect. Verizon reports that the trace routes look normal from what they can see, which tells me they are doing something really strange on the router itself with how Azure traffic is routed.
Microsoft SQL Server Agent includes a really cool, often overlooked feature: the ability to run SQL Server Agent job steps as a user other than the account that the SQL Agent service runs under. This is done through an authentication mechanism in Microsoft SQL Server called credentials.
Credentials are similar to, but different from, logins in SQL Server. Credentials store a username and password just like logins do, but in this case they store a username and password for an outside system, typically an Active Directory username and password. The other big difference is that when a password is stored for a login, only a hash is stored. With a credential, however, SQL Server needs a way to retrieve the password that is being stored, so the password is encrypted instead of hashed.
The reason SQL Server needs to be able to retrieve the password is that when a credential is used, the password isn't supplied, only the name of the credential. SQL Server retrieves the username and password for the specified credential, then uses them to authenticate.
Credentials are created through Management Studio under the Security section of Object Explorer, or by using the CREATE CREDENTIAL command in T-SQL. Once the credential is created, users can be given the right to use it, which allows users that don't have sysadmin rights on the instance to use the credential. If everyone that sets up SQL Agent jobs has sysadmin rights, then permissions on credentials don't need to be adjusted, as members of the sysadmin fixed server role can use any credential. You will also need to specify which types of job steps can use the credential.
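As a sketch of how this fits together in T-SQL: SQL Agent consumes credentials through proxies, so after creating the credential you create a proxy from it, enable the proxy for the relevant subsystem, and grant a non-sysadmin login the right to use it. All of the names (the credential, proxy, domain accounts, and subsystem choice) below are hypothetical examples, not values from the original post.

```sql
-- Create a credential storing a (hypothetical) Active Directory account.
CREATE CREDENTIAL AgentFileCopyCredential
    WITH IDENTITY = 'CONTOSO\SQLAgentProxyUser',
    SECRET = 'StrongPasswordHere';
GO

-- SQL Agent uses credentials through proxies defined in msdb.
USE msdb;
GO

-- Create a proxy backed by the credential.
EXEC dbo.sp_add_proxy
    @proxy_name      = N'FileCopyProxy',
    @credential_name = N'AgentFileCopyCredential',
    @enabled         = 1;

-- Specify which type of job step (subsystem) may use the proxy.
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name     = N'FileCopyProxy',
    @subsystem_name = N'PowerShell';

-- Let a non-sysadmin job author use the proxy in their job steps.
EXEC dbo.sp_grant_login_to_proxy
    @proxy_name = N'FileCopyProxy',
    @login_name = N'CONTOSO\JobAuthor';
```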
When you create a SQL Server Agent job step of any type other than T-SQL, you'll have an extra drop-down box that allows you to select the credential you want to use. Select the credential from the drop-down, and the next time the job runs, the job step will run as the account specified in the credential.
If you are using a T-SQL job step, there is no credential option when creating the job step, because T-SQL job steps can't use credentials. Instead, for job steps of this type, use the EXECUTE AS syntax that is already available to you within T-SQL.
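A minimal sketch of that approach: inside the body of a T-SQL job step, switch security context with EXECUTE AS and switch back with REVERT when done. The login name here is a hypothetical example.

```sql
-- Run part of a T-SQL job step under a different security context.
EXECUTE AS LOGIN = 'CONTOSO\ReportUser';

-- Any statements here run with CONTOSO\ReportUser's permissions;
-- SUSER_SNAME() would now report that login.
SELECT SUSER_SNAME();

-- Switch back to the original security context.
REVERT;
```

Note that the login (or user, with EXECUTE AS USER) must exist on the instance, and the caller needs IMPERSONATE permission on it unless they are a sysadmin.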
OCEANSIDE, CA April 9, 2020 – In the midst of uncertain financial markets, the Financial Times has nevertheless ranked cloud solution experts Denny Cherry & Associates Consulting [DCAC] #278 on their inaugural list of The Americas’ Fastest Growing Companies 2020 and within the top 100 of the technology list.
The FT list was compiled with Statista, a research company, and ranks entrants from across the Americas by compound annual growth rate (CAGR) in revenue between 2015 and 2018.
CEO Denny Cherry commented, “We are thrilled to make the FT Americas rankings and we are incredibly humbled to be there, considering what coronavirus has done to the global economy.
We intend to keep contributing to the resolution of this crisis the way we know best: getting more businesses migrated to and optimized within a cloud environment. With every company we assist, more people return to work, albeit remotely; their company regains its productivity and contributes to the overall economy, all while helping their employees maintain social distancing so everyone stays safe."
Maxine Kelly, Commissioning Editor, Special Reports at Financial Times commented, “The inaugural FT Americas ranking comes at a perilous and uncertain time for many companies, as the coronavirus severely curtails economies, workforces and ultimately growth. Yet the ranking also highlights 500 businesses across the continent for whom innovation and creativity have paid off — attributes that will underpin resilience and enable many of them to thrive once the worst effects of the pandemic are behind them.”
The rankings are online now and the full report featuring case studies and analysis will be published in the Financial Times on May 12, 2020.
Media inquiries about this release should be directed to publicist Kathleen Hannon (704) 912-0209 or email@example.com.
Award-winning Gold Platform Microsoft Partner Denny Cherry and Associates Consulting assists companies with reliably attaining cloud solutions objectives such as Azure migration, high availability, scalability, SQL Server acceleration, Big Data, and data warehousing, all while finding ways to save on costs. With clients ranging from the Fortune 500 to small businesses, their commitment to each is to provide a deft, high-speed IT environment that optimizes every aspect of their platform: from architecture, to infrastructure, to network.
Register for our webcast featuring Meagan Longoria, Kevin Kline, and Denny Cherry as they explore how to make communications clearer, especially during these stressful times, by improving your report visualization techniques.
As Microsoft MVPs and Partners, as well as VMware experts, we are called upon by companies all over the world to fine-tune and troubleshoot the most difficult architecture, infrastructure, and network challenges.
And sometimes we’re asked to share what we did, at events like Microsoft’s PASS Summit 2015.