Working with ALS – ALS Awareness Month 2023

I have been living with ALS for the past two years. I was diagnosed in September 2022 but had my first symptoms, and started researching what was going on, in July 2021. As I begin to understand my diagnosis more, I realize that this is a progressive disease, and my needs and situation will change over time. In my case, I lost the use of my arms and hands first, and I’ve begun to lose functionality in my legs and torso over the past few months. To learn more about my journey to this point, check out this video on YouTube, which tells my personal story.

Launching the YouTube Channel

Now that we’ve mentioned YouTube, Data on Wheels is launching its YouTube channel focusing first on my ALS story. While this YouTube channel will support technical content like the Data on Wheels and the Data on Rails blogs, it will also support ALS awareness and accessibility technology.

I realized early on that a blog post was not the best way to share the story or the tools I use while working with ALS. We will have a couple of playlists directly related to this part of my story. One, called Living with ALS, focuses on my personal journey and the various things we’ve learned or experienced in the process. Another, called Working with ALS, will focus on the tools I use to keep me working and how I keep working as my disease progresses.

While I wish we could have started this much earlier, the disease itself has slowed my ability to complete some of these tasks as quickly as I wished. It is with a lot of help from my family who have supported me through video editing, content support, and motivation that we’ve been able to get this launched. I hope you enjoy it for what it is, and I look forward to having a number of you follow and learn more about accessibility.

Global Accessibility Awareness Day 2023

I had the opportunity to work with the Voice Access team from Microsoft on a story video that was used on Global Accessibility Awareness Day, May 19, 2023. I am truly impressed with the work Microsoft has done with Windows and Office to create more accessible experiences for those of us who need help doing our day-to-day jobs. Voice Access has truly helped me stay productive in everyday tasks such as IMs in Teams, so it was a pleasure to work with the team on a video promoting this tool. You can check out the video in the Voice Access area here. You can also learn more about the other accessibility options available in Windows 11.

Supporting the Cause – ALS Walk Lexington, August 2023

As part of our ALS Awareness efforts this month, Data on Wheels is the name of our team doing the ALS Walk in Lexington, KY this year. This is an opportunity for us to do our part to help the local community continue to do great things for the ALS community. Check out our team site and support us if you can. Every little bit counts. If you’re in the area and want to join us for the short walk in August, feel free to sign up on the team.

3Cloud – Inclusive and Supportive Workplace

I was employed at 3Cloud when my disability first surfaced. As you can imagine, it was an emotional roller coaster, and I really needed to determine how much it would affect my ability to do work for the company. As you have seen above or in one of my stories, this disease first affected my hands and arms and my ability to type. Without knowing how fast the disease would progress or how it would impact my workload, I worked with my bosses to find a good place for me in the company that allowed me to contribute while I sorted this out. With their help, I have been able to continue to contribute and participate in the technology that I love to work with. For example, just this past week Microsoft released a new product called Microsoft Fabric, which I have been working on with Microsoft and our team to understand how best to use it in our field. The company has trusted me with this and other initiatives that have allowed me to continue to be productive and contribute to the growth and success of 3Cloud.

Wrapping It Up

ALS has changed a lot about how I work and how I view life. I hope that what I share about what I learn helps others in similar situations. I will continue to be as active as I can both through work and in the community. Each day is a new day. Some days bring good things and some days bring bad, but God is in control of it all. Thank you for your continued support!

Power BI Cruise Recap – Deep Data Dives

Thank you so much to those of you who were able to attend this three-hour, C#-packed session! The Power BI Cruise was an incredible conference, and I cannot recommend it enough if you’re looking for an opportunity to dig deep on Power BI topics with other passionate individuals. From the first-ever Power BI bingo to a three-hour session on TMDL from Mathias himself, this was a trip that will stick with me for a while. In this post, I’ll include notes from my session and the sessions I attended, plus links to the GitHub. Let me know if anything really piques your interest and you’d like to see more detailed blog posts about it!

  1. My Session – Power BI, C#, and TMDL!
  2. TMDL & Source Control – Mathias Thierbach
  3. Hacking the Visuals – Stepan Resl
  4. Ask Me Anything – Jeroen (Jay) ter Heerdt

Huge shout out to the other speakers and of course the incredible organizers who stuck with this amazing vision through the pandemic and made it happen. Thank you all so much for making this event full of passion and unparalleled learning. I don’t know how you all made this happen without sponsors, but it was truly an incredible event. So thank you again Asgeir, Erik, Johan, and Just!

My Session – Power BI, C#, and TMDL!

Now, you may have noticed I give this session – Power BI Meets Programmability – quite often. Bringing programmability to the world of Power BI is something I greatly enjoy, but this session was truly unique. Having three hours to take a deep dive into what C# can do for Power BI development was the opportunity of a lifetime. With two more hours than usual, we explored creating calculation groups for the first time in this session, as well as automatically generating a set of measures based on data types. Not only that, but the attendees were good sports and stuck around for an extra half hour to create calculation groups in C# using TMDL! I plan on writing a very detailed blog post on this in the future, but the GitHub folder for this conference (link below) contains TMDL code live-coded by the creator himself – Mathias Thierbach! The script contains a section that will create a TMDL version of your data model, allow you to interact with TMDL files within your solution, and publish those changes back to the data model. It blew my mind, and I hope it blows yours as well!

Additional Resources

The final code can be found here:

TMDL & Source Control – Mathias Thierbach

Speaking of TMDL, this conference hosted the incredible Mathias Thierbach for a session on TMDL & Source Control. Below are my notes from that session, but if you ever have a chance to hear him speak on the subject, I highly recommend it. The notes are in bullet-point format for now; as I dig further into this incredible language, I look forward to writing more detailed blog posts about what we can do with it.

  • Key goals of TMDL: readable, editable, and allows collaboration
  • There will be a VS Code extension for TMDL coming soon (we were given a special preview in the session) that will have syntax highlighting for both DAX and M as well as TMDL
  • Shortcut to get to the TOM docs:
  • TMDL is completely case insensitive!
    • Default export of .tmdl files is camel case
  • Boolean properties: you only have to say the Boolean property to enforce the default value
    • To explicitly set it, syntax is:
      property: true/false
  • No matter how many times you serialize TMDL, the order of the objects will always be the same
    • Tables: partitions, calc groups, columns, hierarchies…
    • You can reorder objects to your specifications (aka no longer alphabetical)
  • Metadata IsParameterQuery = true makes it a parameter
  • Ordinal is not a table property in TOM, but you can create one in TMDL! It won’t show up in this order in Desktop…yet. But it will allow consistent order. It’s a weak ordinal, so in cases of conflict it won’t cause any errors; it should default to alphabetical order on conflict.
  • Indentation for expressions is not white space sensitive (DAX and M)
    • It finds a shared whitespace to the left to create indentation in the final model
  • Indentation only matters for nested objects
  • To fold levels in VS Code, use Ctrl+K then Ctrl+[level_number]
  • To unfold all levels in VS Code, use Ctrl+K then Ctrl+J
  • Unicode is supported
  • Shared expressions = parameters
  • In Preview 1 (what we have today), perspectives can be created but can’t be read back
    • Will be fixed in Preview 2
  • 4 previews total, goal is for GA at end of the year
  • tableName.columnName etc.
    • If there are spaces, use single quotes as a delimiter
  • Description is added by using /// text on top of the object declaration. Can be multi-line and line break is maintained.
  • No comment support but you can comment within M and DAX
    • Triple quotes ('''[formula]''') will be a safe way to mark M/DAX expression sections. Likely coming in Preview 3 or 4 later this year
  • Language spec will be published
  • Default properties are available to find in Rui’s documentation
    • Default property is what is after the = for the object
    • This can be found on the main TMDL overview page
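Pulling several of those notes together, here is a hedged sketch of what a small TMDL fragment might look like. The table, column, and measure names are invented for illustration, and the syntax reflects the Preview 1 behavior described above: /// descriptions above the object declaration, single quotes around names with spaces, a bare Boolean property meaning true, and the default property (here, the DAX expression) following the =.

```
/// Stores one row per sales transaction
table 'Sales Data'

	measure 'Total Sales' = SUM('Sales Data'[Amount])
		formatString: #,0

	column Amount
		dataType: int64
		isHidden
```

Since TMDL has no comment support yet, any explanation has to live in /// descriptions or inside the embedded M and DAX expressions.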

How to create a TMDL folder from Tabular Editor (requires version 2.18.1 or later)

Open the folder you’ve created within VS Code.

Hacking the Visuals – Stepan Resl

Hacking = use various available means to get better insights from data.

This session absolutely blew my mind. Stepan does an incredible job showing the art of the possible by hacking into expression properties within Tabular Editor as well as calculation group formatting. We had an awesome time connecting at this conference, and I hope to see him again at future conferences! In the meantime, below is a link to his GitHub as well as a blog post that covers a lot of the same material as his session. Be prepared to have your mind blown on this one.

Stepan’s GitHub for this session:

Additional blog post by Stepan on this subject:

  • For bar charts – to easily highlight what is over target, color the portion of the bar above the target and use grey below the target
  • Visual Vocabulary = a great reference for chart types
  • \ used in a format string will escape the wildcard characters
  • Keep in mind audience, green is not positive in Japan
  • topProductByLocation = TOPN(1, ALLSELECTED(Products[ProductName]), [# Total Quantity])
  • # sold quantity by Top Product = VAR _product = [topProductByLocation] RETURN CALCULATE([# Total Quantity], Products[Product name] = _product)
    • This allows us to use a measure as a filter by precalculating it in a variable
  • Best whitespace option is to use UNICHAR(8203) (a zero-width space); it appears as a very, very small dot
  • Color picker: Just Color Picker
  • Use smart text in subtitles in charts to tell people when we hit a target or how close we are
  • You can use format to change numbers to text with a thousands delimiter! See the subtitle measure
  • You can use ; within format to do an if statement! BUT don’t forget to escape any special characters using \
    • Example: "Sales target was set to " & FORMAT(_target, "#,,,,,,,,,,#")
      & " and we have sold " & FORMAT(_sold, "#,,,,,,,,,,#")
      & " which means we " & FORMAT(_sold - _target, "\have;\haven't") & " fulfilled our target."
    • In this case, "h" is a special character that tries to get an hour value and will show 0 if you don’t escape it
  • If you create conditional formatting in bar charts, the colors will stay if you switch to a line chart even though conditional colors are not available in the UI
    • Same thing happens if you have an x constant line in a line chart and switch to bar chart
  • You can customize spacing below titles and subtitles.
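Reformatted for readability, the two measures from the notes above look like this (table, column, and measure names are exactly as captured in the session, so treat this as a sketch of the pattern rather than a drop-in solution):

```dax
topProductByLocation =
    TOPN ( 1, ALLSELECTED ( Products[ProductName] ), [# Total Quantity] )

# sold quantity by Top Product =
    VAR _product = [topProductByLocation]
    RETURN
        CALCULATE ( [# Total Quantity], Products[Product name] = _product )
```

Storing the first measure’s result in a variable is what lets it be used as a filter inside CALCULATE.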

I’ve added the final PBIX from this session to the GitHub; keep in mind a lot of the features that make it possible are only visible through Tabular Editor:

Ask Me Anything – Jeroen (Jay) ter Heerdt

Meeting Jay and chatting during various free time on the ship was definitely a highlight of the trip. He’s incredibly knowledgeable on Power BI (especially the elusive DAX language), and has a deep understanding of the Microsoft strategy on new feature creation. It was a pleasure getting to laugh and brainstorm together, and I can’t wait to hang out again at future conferences. Below are some questions that were asked during the AMA session as well as answers provided and links I found while Jay was talking.

  • Why can’t we use RLS and USERELATIONSHIP() together?
    • RLS blocks USERELATIONSHIP() since USERELATIONSHIP() can (but doesn’t always) go around the RLS relationship and create a security risk
  • Different people own different visuals which is why the options and customizations vary so widely
    • Miguel Myers did a session on the future of visuals in Power BI. Huge plans with him on board to own visuals and align everything
    • There is a LinkedIn group called PBI Core Vision; post comments there (or on Twitter and LinkedIn) about what can be improved in the future
  • When we change the hierarchy name or order via XMLA, it now breaks the visual. Can we get the visual in the service to update with adjusted level orders?
    • Also ask him about Top N + Others
  • Tableau is the primary competition for PBI, Looker (from Google) is up and coming, and third is Qlik because it’s still getting new clients in Europe
    • These are taken into consideration when making a case for new items
    • Cannot blatantly copy items
    • The agreement with Tableau names specific people who cannot even look at Tableau (they can only look as a user, not at the debugger or dll files)
  • In the future, hoping to have functions that could be shared within a community for DAX
  • Could we get Rulers or aligned grid lines in PBI Desktop?
    • Rosie or Ree-ann would own this
    • Can we customize grids?
    • Not part of the visual team; it’s placed under onboarding
  • Advancements to API?
    • Hardening of PBIX will help with getting better Scanner APIs and deeper APIs (visual-level and page-level details on what’s available within the tenant)
  • DataZen got killed and unsure why
  • Jay would investigate how to get rid of filter direction in DAX
  • Why can I not put a measure name in smart narratives?
    • You can do this by creating a shape and using fx to put in the text you want

Typing with Your Tongue Part 2: Voice Access

A few months ago, I wrote a post on how I use voice technology to continue working with my ALS condition. Since that post was written, Microsoft released a new voice technology called Voice Access as a part of the Windows 11 22H2 update. I’m going to talk a little bit about my experience using it. It has changed how I interact with my PC and which tools I choose for voice to text.

Voice Access

You can turn Voice Access on if you’re running the latest version of Windows 11. (Voice Access is not available in Windows 10). You simply go to the Accessibility settings and choose the Voice Access option under Speech. Once you have turned on Voice Access, you will see a bar across the top of your primary screen as shown in the image below. This is how you know you have Voice Access ready to go.

Voice Access is much more than a voice-to-text tool. It includes many command capabilities, including a mouse grid option which allows you to grid your screen and select items using only your voice. Voice Access supports commands such as Open and Close for application windows. You can find the full list of commands available to you in Voice Access here.

How do I use Voice Access?

I typically have Voice Access on all the time except when I’m in meetings. (It will try to type everything that’s said in the meeting, so it needs to be turned off at that time.) There are only two tools where I use Voice Access less often. That would be Outlook and Word. More on that in a bit.

Because Voice Access alternates between dictation and commands, I am able to use it when working with most tools. For example, with Windows Mail I will use it to dictate an e-mail, and then click Send to send the same e-mail. When I say “click send”, Voice Access finds the Send button on the window. If there is more than one, it will give me an option to select which Send I meant to click. I find the overall experience pretty good, as it allows me to switch between dictation and commands without issue.

I use Voice Access a lot when working with Teams, WhatsApp, and other chat-based tools. Voice Access allows me to have a good voice to text tool in applications that typically do not have great accessibility support. At times, I have used Voice Access instead of Dictation in Office 365, especially when working with PowerPoint and Excel. Neither of these are particularly voice to text friendly and Dictation in PowerPoint is significantly lacking.

Using Office 365 Dictation

I primarily still use Office 365 Dictation when working with Word and with Outlook. Dictation in both tools responds quicker than Voice Access. It also handles some of the issues that are currently being worked on in Voice Access such as punctuation. For example, the bulk of this blog post was authored in Word using Office 365 Dictation because it’s quick, simple, and works well within the context of voice to text.

Other Insights into How I Work with These Tools

Office 365 Dictation is fully online. This means if you lose your internet connection, you lose the ability to use that function. Voice Access on the other hand does not have this dependency and will continue to work without a connection.

The commands between these two tools still vary quite a bit. For example, text formatting is very different between the two tools. In Word, they use “capitalize” to capitalize a word using Dictation. (By the way this has been improved since the last time I wrote my blog post. This improved capability in Office 365 Dictation is huge). In Voice Access you have to say “capitalize that” to capitalize a word or selected words. When you use that same command in Office 365, it will capitalize the whole sentence. This is true of other formatting commands between the two tools.

There are enough variances in those commands to cause frequent issues when working with the two tools interchangeably. I find myself using the wrong command to create a new line or change the case of words regularly. This is something you will have to work through if you go back and forth between the two tools.

I started working with Voice Access while it was still in preview. I am actively working with the Voice Access team and providing feedback about the tool. While many updates have been made, there are still issues with the tool as it continues to mature. If you’ve been working with Windows Speech Recognition and Voice Typing, I would still recommend switching to Voice Access as the experience is much better. Just keep in mind it is still new and you may still run into issues here and there as they continue to improve the product.

When Voice Access became available, I switched immediately to Voice Access as my primary voice command and dictation tool when not using Office 365 (Word and Outlook). I think that this technology will continue to improve and make voice accessibility better for those of us with limited functionality in our hands and arms. As you know from previous conversations, I’m not a huge fan of Dragon because of how it changed my workflow and made things more difficult for me to do in general. Voice Access is a more natural fit into my workflow, and I like it considerably better. I do find myself going back and forth between my roller bar mouse and using a complete voice solution when my hands get tired. That speaks volumes of the work done in Voice Access to help someone like me continue to function by having options in the tools I can use. I look forward to the improvements as they come along.

SQL Bits 2023 Recap

I had an amazing time adventuring to Wales with other members of the data community last week! It was my first international conference, and I had heard so many incredible things about it, but still it blew away all my expectations. I’m so honored to have met so many incredible people throughout the week! Thank you to new friends and old for making it a conference to remember.

Below are the links to GitHub for my session as well as some notes from the sessions I attended. Disclaimer: these notes are not even close to the real thing. Be sure to check out these sessions if you see them being held at your local user group or SQL Saturday in the future! I tend to take very bullet-point-oriented notes to best organize my thoughts; hopefully these spark your interest and guide you to professionals who know about specific technologies within the data realm.

Check out the agenda and feel free to reach out to folks with topics you’re interested in! They may have their slide deck easily accessible to folks who may not make it to the conference.

Power BI Meets Programmability with Yours Truly

Thank you so much to everyone who made it out to my session!

It was incredible to have 48 people in person and another 21 virtual! It was a humbling experience to see that room fill up and know they were all there to learn something new and exciting to help with their day-to-day work. Can’t wait to look through the feedback and learn where to tweak my presentation to make an even bigger impact in the future!

  • GitHub with slide deck, sample C# code, and sample PBIX file (with the Excel file that powers it): GitHub Link
  • Category of blog posts related to TOM and C#:
  • Fun tip someone showed me after the session:
    If you use /* and */ to comment out chunks of code (like I do frequently during the session), you can use /* and --*/ so that you only need to remove the /* to run that block. Pretty neat!
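As a hedged sketch of that trick in DAX terms (the measure and variable names here are invented), the block between /* and --*/ is commented out as written; delete just the /* line, and the --*/ becomes an ordinary line comment, so the block runs:

```dax
Example Measure =
VAR _base = [# Total Quantity]
/* delete this line to enable the experimental variable below
VAR _experiment = _base * 2
--*/
RETURN _base
```

The same idea works in C#, where // plays the role of -- (i.e., pair /* with //*/).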

Supercharge Power BI with Azure Synapse Analytics with Mathias Halkjaer

Limitations of PBI

  • Source load minimization doesn’t exist in most cases (lots of queries against the same source)
  • Out of sight transformation layer
  • Not best tool for data warehouse tasks
  • No logs or output from quality checks
  • Doesn’t handle humongous data very well

Enchantments needed for PBI from Synapse

  • Time-travel (historical data & change tracking)
  • Enlarge (massive scale)
  • Counter spell (reverse ETL)
  • Wish (supports multiple programming languages)
  • Divination (AI)
  • Regenerate (CI/CD and source control)

Azure Synapse

  • A toolbox full of tools: serverless sql, data warehouse, spark engine, etc.

Data Lake

  • Landing zone for data
  • Cheap
  • Redundant
  • Quick and dirty analytics
  • Performance
  • Most efficient way to load data into data warehouse
  • Easily consumed

Report Design & Psychology (Quick Tips) with Jon Lunn

Dual process theory: intuitive vs attentive

You can mentally process up to 15 items intuitively. Past that it gets pushed to attentive.

Things can move from system 2 (attentive) to system 1 (intuitive) with repetition (think driving or typing).

Layout – left to right, top to bottom. Move most important KPI to top left. Remove distractions (excess colors, 3d visuals, pie charts, etc). Titles in upper left make it easier to know what’s contained within the visual.

Tip – to visualize dots use dice patterns to move it to a system 1 process.

Ratios (how many, relative) vs percentages (how likely, chance risk probability). Both are good options to contextualize a number more. We usually understand ratios much better because how many is more intuitive than how likely.

Intuitive systems are more prone to errors or bias. If you want to hide something, go for percentage.

Use the power of 10’s (aka 8 out of 10 preferred it). Round and use a ratio to move it into intuitive process.

Power BI Datamarts What, How, Why with Marthe Moengen

You can manage RLS roles within a datamart and join tables via a GUI similar to the PBI Desktop modeling view. Datamarts are in preview only. You can use a SQL endpoint to connect to your datamart.

Why? Access data through a SQL endpoint. Do simple aggregation using SQL script instead of M. There are potential challenges since this is in preview.

It keeps getting easier to move Oracle and Open Source workloads to Azure with David Levy

  • Cloud migration phases:
    • Discover
    • Assess
    • Migrate
    • Optimize
    • Secure & Manage
  • Azure Migrate: central hub of tools for datacenter migration
    • Primary tool for discover and assess
    • Azure Database Migration Service – Gen 2 has much more control for end users to say how many resources that item gets
  • Use Oracle Assessments in Azure Data Studio to get recommendations for sku size
  • Oracle databases in Oracle cloud can use OCI Interconnect to pull data back and forth as Azure IaaS
  • A lot of people move to PostgreSQL to pay less and get open-source resources
  • Migration strategy
    • Rehost
    • Refactor
    • Rearchitect
    • Replace
  • No better way to run software than by SaaS. Why invest the time and resources when the vendor can maintain this for you and allow you to focus on your business?
  • The Oracle Assessments in Azure Data Studio can see all the database details, SKU recommendations, and Feature Compatibility. Each section has a migration effort attached, very nice
  • Azure is a great place for PostgreSQL:
    • High availability
    • Maintained
    • The PostgreSQL migration extension can run an assessment in ADS and understand migration readiness
    • Integrated Assessment
    • Easy set up and config in ADS
  • To migrate PostgreSQL
    • pg_dump scripts out the PostgreSQL schema
    • psql allows creating databases and tables
    • Use the PostgreSQL to Azure Database for PostgreSQL Online Migration Wizard
    • You can start cutover with pending changes
  • Azure Database for MySQL
    • Flexible server is now available, allows mission critical apps to use zone redundancy and fine-grain maintenance scheduling
    • Azure Database Migration Service (ADMS)
      • Migrates schema and data
      • Preview = replicate changes
      • You can do this as an online migration
  • Optimizing your environment
    • Costs during & after migration
      • During: Azure TCO Calculator, Azure Migrate, Azure Hybrid Benefit & Reserved Instances, and join Azure Migration & Modernization Program
        • Do reservations early, you’ll lose money otherwise
        • AMMP = Azure Migration & Modernization Program, provides coaching with best practice
    • Make sure you get good data for trouble times so your migrated instance is ready for the worst case scenario
    • Execute iteratively

5 Things You Can Do to build better looking reports – James McGillivray

  • Design is subjective
  • No clear end point
  • No one solution
  • Design is often seen as less important
  • Blog:
  • Tip Techniques
    • Grid Layout
      • Pro: left brain, less layout tweaking happens, looks professional
      • Con: can look boxy
      • Gutters = white space between visuals
      • Margins = white space along edge of report
    • Maximize Real Estate
      • Pro: bigger visuals = more value, fewer visuals = less distractions
      • Con: often hard to convince stakeholders, all questions not answered by default
    • Beautiful Colors
      • Pro: use color to convey meaning, highlight data, show continuous data, qualitative sequential divergent
      • Con: many pitfalls, accessibility concerns, can be polarising
    • Consider the Audience
      • Pro: most important elements first, remove unneeded elements, hierarchy view (summary on top, grouped middle, detail on bottom)

Identifying and Preventing Unauthorized Power BI Gateways with Angela Henry

  • The problem: available to all, gateways are not only for PBI, results in chaos
  • How do we find gateways?
    • Manage connections & gateways lets us identify gateways. Need to be an Azure AD Global admin, PBI service admin, or Gateway Admin
  • Reason to restrict: governance

Building, Deploying, Sharing a Remote Jupyter Book in Azure Data Studio

When working with Azure Data Studio and its support of Jupyter books, you will find there is an option for remote Jupyter books. As shown in the image below, you can open that Jupyter book and follow the dialog for a couple of Microsoft books that are readily available.

Dialog for adding a remote Jupyter book

So, let’s start at the beginning and lay out what the end game is. What if you want to create and share a Jupyter book full of helpful hints for the world to see? How would you go about doing that? The first question you may be asking is: what is a Jupyter book? As it turns out, in Azure Data Studio a Jupyter book is a collection of folders and files managed by yaml files.

If you’ve been a SQL Server developer for a very long time, this may be meaningless to you. I know it was for me.

Check out this blog post to see how I created my first notebook and Jupyter book in Azure Data Studio. Once you have created your own content, even if it’s just a sample to try, come back to this post and we will walk through how to share your notebook as a remote Jupyter book.

Get it into GitHub

To share your Jupyter book you will need to get your Jupyter book into GitHub. Use File >> Explorer from the menu in Azure Data Studio to open your repo directory that is stored locally. You can either save your notebooks to this location or copy and paste them using File Explorer. I’ll be writing more details around using GitHub with Azure Data Studio in a future blog post.

Now that you have the content in your repo, you can commit the code to GitHub. At this point, if your repo is not public, you will need to make it public to complete the process. My recommendation: if you have a repo you use for a lot of your internal work, create a new repo that is public and copy the content there.
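If you prefer the command line over the Azure Data Studio UI, the copy-and-commit step can be sketched like this. The repository and folder names below are hypothetical, and in a real public repo a git push would follow the commit:

```shell
# Hypothetical layout: a Jupyter book folder named ElasticityBook next to
# a local git repository named notebooks.
mkdir -p ElasticityBook
echo "# sample notebook placeholder" > ElasticityBook/readme.md
mkdir -p notebooks
cd notebooks
git init -q

# Copy the book into the repo and commit it.
cp -r ../ElasticityBook .
git add ElasticityBook
git -c user.name="demo" -c user.email="demo@example.com" \
    commit -qm "Add Jupyter book"
git log --oneline
# A 'git push' would then publish the commit to the public GitHub repo.
```
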

Create a zipped file

Because what you’re working with is a folder structure and a set of files, you will not build this in a traditional code-build fashion. That raises the question: how do you create a release? You create a release of your Jupyter book by zipping up the folder and naming your zip file a very specific way. Before we get into the naming process, remember that Azure Data Studio supports Mac and Linux as well as Windows. Because of this, if you want your remote Jupyter book to be available to Mac and Linux users, you will need to create a tar.gz file as well.

Your file name needs to contain all the relevant properties required in a remote Jupyter book release. This includes the following attributes separated by hyphens:

  • Jupyter book name
  • Release number
  • Language

In the example I am using, the file name looks like this:

Now that you have the file ready, we can create the release.
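As a concrete sketch with an invented book name (release 1.0, language EN), the archives could be produced like this; the hyphen-separated pattern is book name, then release number, then language:

```shell
# Hypothetical Jupyter book folder to package for a release.
mkdir -p ElasticityBook
echo "title: sample" > ElasticityBook/_config.yml

# Mac/Linux consumers need a tar.gz named <book>-<release>-<language>:
tar -czf ElasticityBook-1.0-EN.tar.gz ElasticityBook

# Windows consumers get a zip with the same naming pattern (any zip tool
# works, e.g.: zip -r ElasticityBook-1.0-EN.zip ElasticityBook).
ls ElasticityBook-1.0-EN.tar.gz
```
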

Creating a GitHub release

Time to create that release. In GitHub, click the Releases button to open the releases page. Click the Draft a new release button to open the page that lets you create your next release. Add the zip files by dropping or selecting them where it says to attach binaries to the release. Add additional documents or instructions as needed. Click Publish release to make the release available.

New Release Page at GitHub

Now that you have the release created, you should be able to open that remote Jupyter book.

Opening your remote Jupyter book

As you saw in the beginning only two repos are included by default in Azure Data Studio. Even though you have created a remote Jupyter book, it does not show up on the list. You will also notice that it does not use a proper URL even though that appears to be the value that is requested. To connect to your Jupyter book, you will need to use the following format in the URL text box: repos/your GitHub/your repository name. For example, you can find my example remote notebook on Azure SQL Database Elasticity in this repo: repos/DataOnWheels/notebooks.

It appears that, in the background, Azure Data Studio handles the front end of the URL. Once you have entered this, you should be able to click Search and fill in the related information that you built into the file name that you added to the release. Once you have filled out all the information, you should be able to click Add, and this will import the notebook into your local Azure Data Studio. This effectively loads a disconnected copy of the original notebook. Congratulations, you have now uploaded your own Jupyter book and made it available for others using the remote option.

Why would you ever use this?

Be aware that we would not recommend using a public repository for privileged information that you work with at your company. However, if you find it necessary to share a notebook as the result of building a presentation, or you want to share some great information easily, this might be a good fit for your use case. The Jupyter book that I have shared using this methodology is also available in my standard GitHub repository for code sharing. However, attaching to a remote Jupyter book to get all the example code easily added to Azure Data Studio is a win. It is an easier way to distribute the work; however, it is not well known as a pattern.

If you successfully create an interesting or boring Jupyter book that you want to share, I encourage you to include it in the comments below so we can all have a look.