I Presented with Live Captioning and Sign Language Interpreters

Published On: 2020-02-13

I had the pleasure of presenting a full-day pre-conference session on the Friday before SQLSaturday Austin-BI last weekend. I could spend paragraphs telling you how enjoyable and friendly and inclusive the event was. But I’d like to focus on one really cool aspect of my speaking experience: I had both closed captioning and sign language interpreters in my pre-con session.

First, let’s talk about the captions. While PowerPoint does have live captions/subtitles, that only works when you are using PowerPoint. When you show a demo or go to a web page, taking PowerPoint off the screen, you lose that ability. So we had a special setup provided by Shawn Weisfeld (Twitter|GitHub).

How the Live Captions Worked

A presenter wears a lavalier mic that sends audio to the Epiphan Pearl. The presenter's computer sends video to the Pearl. The Pearl sends audio to a computer that sends it to Azure and receives captions back. That computer overlays the captions above the image from the presenter's laptop, and the combined output is sent to the projector.
Technology setup at SQLSaturday Austin-BI Edition 2020 that provided live captions

The presenter connects their laptop to the Epiphan Pearl with an HDMI cable so they can send the video (picture) from the laptop. The speaker wears a lavalier microphone, which sends audio to the Pearl. The transcription green screen computer takes audio from the Pearl, sends it to Azure to be transcribed using Cognitive Services, and overlays the returned transcription text on a green screen input that is sent back to the Pearl. The projector gets the combined output of the transcription text and the presenter’s computer video output.

You can see an example of what it looked like from my presentation on Saturday in the tweet below. There are lots more pictures of it on Twitter with the #SqlSatAustinBI hashtag.

While this setup requires a bit more hardware, it worked so well! It took about 10 minutes to get it set up in the morning. As the speaker, I didn’t have to do anything but wear a mic. It transcribed everything I said regardless of what program my laptop was showing. There was very little lag. It seemed to be less than one second between when I would say something and when we would see it on the screen. While I try to speak clearly and slowly, sometimes I slip and fall back into speaking quickly. But the transcription kept up well. Some attendees said it was great to have the captions up on the screen to help them understand what I said when I occasionally spoke too quickly. The captions are placed at the top of the screen, above the image coming from my laptop, so I didn’t have to adjust my slides or anything to allow space for the captions.

The live captions were a big success. They helped not only people who had trouble hearing, but also those who spoke English as a second language and those who weren’t familiar with some of the terms I used and needed to see them spelled out.

Presenting With Sign Language Interpreters

This was my first time presenting with sign language interpreters to help communicate with my audience. Since the pre-con session lasted multiple hours, there were two interpreters in my room. They would switch places about every hour. They were kind enough to answer a few questions for me during breaks.

I asked them if it was difficult to sign all the technical terminology used and if they tried to study up on terms ahead of time. One of them told me that they don’t study the subject and they fingerspell all the technical terms. Most of my terms were spelled on my slides, and I saw the interpreter look at the slide to get the spelling. When someone asked a question about the font I was using, the interpreter asked me to spell it out, since it wasn’t written anywhere. I asked if having printed slides helped (I provided PDFs of the slides to the attendees at the beginning of the session). One of the interpreters told me no, because they were already watching the signer for questions and watching my slides and listening to me.

What I loved most about having the interpreters there was that the person using the service got to fully participate in the session. They asked questions and made comments like anyone else. And they participated in hands-on small group activities.

Check out this great photo of one of the interpreters in action during a small group activity.

5 people sit in a group at a table while a sign language interpreter sits across the table and helps the group communicate
Photo of small group activities during my Power BI pre-con with a sign language interpreter in the group. Photo by Angela Tidwell

Having ASL interpreters didn’t require any extra effort on my part. I didn’t have to practice with them beforehand or provide them with any of my conference materials. They were great professionals and were able to keep up with me through lecture, demos, small group exercises, and Q&A.

Sign language interpreters cost money. And they should – they provide a valuable service. In this case, the interpreters were provided by the State of Texas because the person using the service worked for the state government. Because this was training for their job, the person’s employer was obligated to provide this service. So we were lucky that it didn’t cost us anything.

While the SQLSaturday organizers were coordinating the ASL interpreters, they found out that there is a fund in Texas that can help with accessibility services when a person’s employer doesn’t/can’t provide them. It may not be the same in every state, but it’s definitely something to look into if you need to pay for interpreters for an event like this.

Make Your Next Event More Accessible

I have organized events, and I understand the effort that it requires. I’m so happy that Angela and Mike made the effort to make SQLSaturday Austin-BI a more inclusive event. I would like to challenge you to do the same for the next event you organize or the next presentation you give at a tech conference.

Your conference may not be able to afford the Epiphan Pearl (note: the original model we used is discontinued, but there is a new model) and the Azure costs. I’d like to see SQLSaturdays join together and purchase equipment and share across events – it would be great if PASS would help with this. If we can’t do that, we could always start small with the built-in capabilities in PowerPoint and work our way up from there.

It was a great experience as a speaker and as an audience member to have the live captions. And I was so happy that someone wanted to attend my session and was making the effort to sign up and request the ASL interpreters. I hope we see more of that in the future. But we need to do our part to let people know that we welcome that and we will work to make it happen.


Parameterizing a REST API Linked Service in Data Factory

Published On: 2020-01-30

We can now pass dynamic values to linked services at run time in Data Factory. This enables us to do things like connecting to different databases on the same server using one linked service. Some linked services in Azure Data Factory can be parameterized through the UI. Others require you to modify the JSON to achieve your goal.

Recently, I needed to parameterize a Data Factory linked service pointing to a REST API. At this time, REST API linked services require you to modify the JSON yourself.

In order to pass dynamic values to a linked service, we need to parameterize the linked service, the dataset, and the activity.

I have a pipeline where I log the pipeline start to a database with a stored procedure, look up a username in Key Vault, copy data from a REST API to data lake storage, and log the end of the pipeline with a stored procedure. My username and password are stored in separate secrets in Key Vault, so I had to do a lookup with a web activity to get the username. The password is retrieved using Key Vault inside the linked service, but Data Factory doesn't currently support retrieving the username from Key Vault there, so I had to roll my own lookup.

Data Factory pipeline containing a stored procedure, web activity, copy activity, and stored procedure
Pipeline with a parameterized copy activity
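
The username lookup mentioned above is just a Web activity that calls the Key Vault REST API. A minimal sketch is below, assuming the data factory's managed identity has been granted access to the vault; the activity name, vault name, and secret name are placeholders rather than the ones from my pipeline.

{
    "name": "Get Username",
    "type": "WebActivity",
    "typeProperties": {
        "url": "https://mykeyvault.vault.azure.net/secrets/RestApiUsername?api-version=7.0",
        "method": "GET",
        "authentication": {
            "type": "MSI",
            "resource": "https://vault.azure.net"
        }
    }
}

The secret value returned by Key Vault can then be referenced in downstream activities as @activity('Get Username').output.value.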

I have parameterized my linked service that points to the source of the data I am copying. My linked service has 3 parameters: BaseUrl, Username, and SecretName. The JSON for my linked service is below. You can see that I need to reference the parameter as the value for the appropriate property and also define the parameter at the bottom.

{
    "name": "LS_RESTSourceParam",
    "properties": {
        "annotations": [],
        "type": "RestService",
        "typeProperties": {
            "url": "@{linkedService().BaseUrl}",
            "enableServerCertificateValidation": true,
            "authenticationType": "Basic",
            "userName": "@{linkedService().Username}",
            "password": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "MyKeyVault",
                    "type": "LinkedServiceReference"
                },
            "secretName": "@{linkedService().SecretName}"
            }
        },
        "parameters": {
            "Username": {
                "type": "String"
            },
            "SecretName": {
                "type": "String"
            },
            "BaseUrl": {
                "type": "String"
            }
        }
    }
}

I have defined these three parameters in my dataset, along with one more parameter that is specific to the dataset (that doesn’t get passed to the linked service). I don’t need to set the default value on the Parameters tab of the dataset.

4 parameters defined in a data factory dataset: relativeURL, username, secret, and baseURL.
Parameters defined in the dataset

On the Connection tab of the dataset, I set the values as shown below. Data Factory recognizes that the linked service being used has 3 parameters. The relativeURL is used only in the dataset and is not passed to the linked service. The value of each of these properties must reference the corresponding parameter defined on the Parameters tab of the dataset.

Connection tab of the dataset in data factory, showing 3 linked service properties and one additional dataset property.
Setting the properties on the Connection tab of the dataset
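
Putting the two screenshots together, the dataset JSON ends up looking roughly like the sketch below. The dataset name is a placeholder and the exact layout may differ slightly from what the UI generates; the point is that the linkedServiceName section passes three of the dataset parameters through to the linked service, while relativeURL is used only in the dataset's own typeProperties.

{
    "name": "DS_RESTSourceParam",
    "properties": {
        "linkedServiceName": {
            "referenceName": "LS_RESTSourceParam",
            "type": "LinkedServiceReference",
            "parameters": {
                "BaseUrl": "@dataset().baseURL",
                "Username": "@dataset().username",
                "SecretName": "@dataset().secret"
            }
        },
        "parameters": {
            "relativeURL": { "type": "String" },
            "username": { "type": "String" },
            "secret": { "type": "String" },
            "baseURL": { "type": "String" }
        },
        "annotations": [],
        "type": "RestResource",
        "typeProperties": {
            "relativeUrl": "@dataset().relativeURL"
        }
    }
}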

In my copy activity, I can see my 4 dataset parameters on the Source tab. There, I can write expressions to provide the values that should be passed through to the dataset, 3 of which are passed through to the linked service. In my case, this is a child pipeline that is called from a parent pipeline that passes in some values through pipeline parameters which are used in the expressions in the copy activity source.

The Source tab of the copy activity. It uses the parameterized dataset and contains expressions to set the values of the parameters.
Defining the expressions for the dataset properties on the copy activity source
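
For reference, the relevant portion of the copy activity JSON looks something like the sketch below. The dataset name matches the sketch above, and the pipeline parameter names and upstream activity name are placeholders for whatever your parent pipeline passes in.

"inputs": [
    {
        "referenceName": "DS_RESTSourceParam",
        "type": "DatasetReference",
        "parameters": {
            "relativeURL": {
                "value": "@pipeline().parameters.RelativeURL",
                "type": "Expression"
            },
            "username": {
                "value": "@activity('Get Username').output.value",
                "type": "Expression"
            },
            "secret": {
                "value": "@pipeline().parameters.SecretName",
                "type": "Expression"
            },
            "baseURL": {
                "value": "@pipeline().parameters.BaseUrl",
                "type": "Expression"
            }
        }
    }
]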

And that’s it. I can run my pipeline and have it call different REST APIs using one linked service and one dataset.


New Power BI Report Design Pre-Con in 2020

Published On: 2019-12-05

I’m excited to announce that I will be offering a full-day pre-con about Power BI report design in the coming year called Bookmarks, brain pixels, and bar charts: creating effective Power BI reports. For a full session description and prerequisites, please visit the session page.

Screenshot of the eventbrite page for the pre-con at SQLSaturday Austin

I built this pre-con to help people better approach report design as an interdisciplinary activity where we are communicating with humans, not just regurgitating data or putting shiny things on a page. There are many misconceptions out there about report design. Some people see it as just a “data thing” that only developers do. Many BI developers avoid it and try to focus on what they consider to be more “hardcore data” tasks. I often hear from people that they can’t make a good report because they aren’t artistic. This hands-on session will dispel those misconceptions and help you clarify your definition of a good Power BI report. You will see how you can apply some helpful user interface design and cognitive psychology concepts to improve your reports. And you’ll leave with tips, tricks, and a list of helpful resources to use in your future report design endeavours.

Your report design choices should be intentional, not haphazard or just the Power BI defaults. We’ll review guidelines to help you make good design choices and look at good and bad examples. And we’ll spend some time as a group creating a report to implement the concepts we discuss.

Basic familiarity with Power BI is helpful for attendees. If you know how to add a visual to a report page, populate it with data, and change some colors, that's all you need. If you feel like you lack a good process for report design to ensure your reports are polished and professional, this session will share an approach you can adopt to help accomplish your design and communication goals. If you feel like your reports are lackluster or not well-received by their intended audience, join me to learn some tips to improve. If you are a more experienced report designer and you want to learn some new techniques and see the latest Power BI reporting features, you'll find that information in this session as well.

So far, I’m scheduled to present this session at two SQLSaturdays in Q1 2020:

SQLSaturday Austin – BI – February 7, 2020. Please register on Eventbrite.

SQLSaturday Chicago – March 20, 2020. Please register on Eventbrite.

SQLSaturday pre-cons are very reasonably priced. This is a great way to get a full day of training on a low budget! I hope to see you in Austin or Chicago.


Microsoft Can’t Make Your Power BI Reports Accessible Without Your Help

Published On: 2019-11-30

Every once in a while, someone asks a question like “Can Power BI be accessible?” or “Is Power BI WCAG compliant?” It makes me happy when people recognize the need for accessibility in Power BI. (I’ll save the discussion about compliance not automatically ensuring accessibility for another day.) But most people don’t appreciate the answer to either question.

A woman, man, and person in a wheelchair positioned next to the Power BI logo

The answer is that WCAG compliance and accessible design are highly dependent upon the report creator. Microsoft has added many built-in accessibility features such as keyboard navigation, high contrast view, and screen reader compatibility. But they can't make your report automatically accessible, because some accessibility features require configuration by the report designer. We need to set the tab order and alt text and use descriptive chart titles – there is no artificial intelligence to do this for us (yet?). Beyond that, things like color contrast and colorblind-friendly design are almost entirely the responsibility of the report designer.

Accessible design used to be solely the domain of UI developers. But as we democratize analytics to have everyone building reports, we now have to create awareness and a sense of responsibility among Power BI report creators, especially those who don’t consider themselves developers.

There is a similar challenge going on with data security. It used to be that people thought of it as a concern only for the IT department. Now, it is widely accepted that everyone in an organization plays a role in maintaining data security. I hope the same attitude will be widely adopted when it comes to accessibility in data visualization and analytics.

This challenge is present in any low-code environment with users of diverse backgrounds and technical expertise, which means it is relevant to the entire Power Platform and other similar tools. There is a white paper on PowerApps Accessibility Standards and Guidelines that has a great description of the situation.

PowerApps embodies the idea of “democratization of development”—anyone in your organization can quickly and easily create a powerful app and share it broadly. But the app maker has an ethical, and sometimes legal, obligation to support “democratization of usage” as well—any user of your app must be able to use it as it was intended.

Based upon the popularity of the Power Platform, I’d say the democratization of development has been a wild success. But we still have some steps to take to democratize usage. Microsoft is doing their part to make their products accessible and to fix accessibility bugs quickly. Now we need to recognize and honor our obligation to design inclusively.

The Microsoft Docs on accessible report design were recently updated to provide more guidance. I hope you’ll check them out and start implementing the recommendations in your reports.

