T-SQL Tuesday #143 – Short code examples

Invitation and wrap-up from John McCormack.

T-SQL Tuesday this month is going back to basics and it’s all about code. I’d like to know: “What are your go-to handy short scripts?”

What are those little short bits of code that you can’t live without? I’m talking about little snippets that are only a few lines, that you may even have memorised. It could be T-SQL, PowerShell, Python or anything else you use day to day.

For example, I manage a lot of SQL Agent jobs. Quite often I need to find out which job has a certain T-SQL string in its command, so I’ll run:

SELECT *
FROM msdb..sysjobs sj
JOIN msdb..sysjobsteps sjs
  ON sj.job_id = sjs.job_id
WHERE sjs.command LIKE 'backup log%';

Of course, there are many other ways to find this out, including dbatools commands, but sometimes I just revert to memory for convenience.

Another one I like gets the estimated completion percentage of a backup or restore. There are better scripts than this, but sometimes nothing beats getting a quick estimate back from a couple of lines of memorised T-SQL.

SELECT percent_complete AS pc, *
FROM sys.dm_exec_requests
ORDER BY pc DESC;

My invitation to you for this month’s #tsql2sday is…

I would like you to share with the community what your go to script snippets are and why you find them useful. By sharing these, you will undoubtedly be helping someone who hasn’t thought of doing it that way, and hopefully you’ll pick up some handy hints as well.

  • Any language is fine, not just T-SQL
  • Please share as many as you wish
  • Perhaps you never do this and always work off saved scripts or convert your snippets to stored procedures? Tell us why this works for you.

T-SQL Tuesday #142: Using descriptive techniques to build database environments

Invitation from Frank Geisler.

In the good old days, you typically had to deal with one, two, or maybe three SQL Servers. As you all know, those times are over. With the rise of the cloud, every one of us must deal with more and more systems, not only infrastructure but also Platform as a Service (PaaS) offerings. The systems themselves are getting more complex through all the new services and technologies that are involved and somehow interconnected. New movements like Azure Arc-enabled Data Services bring a whole new aspect to the table, where you can easily choose whether to run your data workload on your on-premises Kubernetes cluster or in the cloud.

All these systems can easily be built with the Azure Portal, but this is not sustainable. Each time you use the portal, you must remember how to build a certain system and, more importantly, how to apply best practices. You can, for example, build an Azure SQL Database with an open endpoint to the internet and secure it with firewall settings, but this should be done with much caution because you are exposing your database to the internet. A better approach is to build an Azure SQL Database that has no public endpoint, only a private endpoint into an Azure VNet which hosts the systems that must access the database, or which is connected to a local network via a VPN gateway. As you can imagine, there are a lot of moving parts to get such an environment up and running, and you must remember (or document) each of them. This is very cumbersome work. There must be a better solution, and for sure there is one: scripting.

When you write a script, you do the work once, and whenever the same or a similar situation comes up, e.g. deploying an Azure SQL Database according to best practices, you can just pull out your script and there you go. You can take this to another level by parameterizing your script, so you just put in the parameters and let the script do the rest. Using this as a mantra, I developed several PowerShell scripts to build different cloud environments. This has the big advantage that the environment is documented, because you have a script, and versioned as well, because all your scripts are saved in a source control system. The overall approach is called Infrastructure as Code (https://en.wikipedia.org/wiki/Infrastructure_as_code).

But imperative scripting in PowerShell also has its shortcomings. The cloud, and the internet in general, is a very uncertain environment. While the script that deploys your environment is running, many things can happen: your internet connection can break down, the deployment can fail for whatever reason, and so on. So you have to build many conditions into your script: if the resource group exists, skip it and just build the Azure SQL Database; if the VNet exists, skip it; check whether all needed subnets exist; and so on. Right? Wrong! Besides the imperative way of telling the Azure Resource Manager what to do, you can also use a declarative approach to build resources in Azure.

This declarative approach is familiar to everyone who has ever written T-SQL code. When you write a query that selects data, you don’t instruct the database system how to retrieve the data from the underlying file structure. You only tell the system what the data you are looking for should look like: select all the rows of data where the first name is “Frank”. This is the exact same approach that techniques like ARM templates, Bicep templates or, in the Kubernetes world, YAML manifests take. The scripts are a description of what the target environment should look like. How that target environment is reached is fully up to the underlying system, such as the Azure Resource Manager. And there is even more: if you change an existing environment, only the parts that changed in the script will be altered in the target environment. Say you have an Azure SQL Database of a certain size and you change the size in your Bicep script. The next time you deploy the script, the Azure SQL Database will be resized without being deleted and redeployed.
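As a small illustration, a declarative deployment might look like the following Bicep sketch. This is a hypothetical example, not taken from the original post: the resource names, the S0 SKU and the API version are placeholder values you would adapt to your own environment.

```bicep
// Hypothetical sketch: describe the desired state of a logical SQL server
// plus one database. Names, SKU and API version are example values.
param location string = resourceGroup().location
param adminLogin string
@secure()
param adminPassword string

resource sqlServer 'Microsoft.Sql/servers@2021-11-01' = {
  name: 'sql-tsql2sday-demo'
  location: location
  properties: {
    administratorLogin: adminLogin
    administratorLoginPassword: adminPassword
  }
}

resource sqlDb 'Microsoft.Sql/servers/databases@2021-11-01' = {
  parent: sqlServer
  name: 'demo-db'
  location: location
  sku: {
    name: 'S0'
  }
}
```

Deploying the same file twice changes nothing; changing `sku.name` and redeploying resizes the database in place, which is exactly the declarative behaviour described above.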

The ideal process of working with Infrastructure as Code is that the code is checked in to Azure DevOps and an automated process then deploys the changes to your target environment. To change your environment or add resources, you only have to write the needed changes into your Bicep scripts, check them in, and let Azure do the magic.

My invitation to you for this month’s #tsql2sday is:

This is my invitation to you this T-SQL Tuesday: think about deploying SQL components through descriptive methods, and of course blog about it. It does not matter whether you are using Azure with ARM templates or Bicep, or Kubernetes with YAML. Just write about it and build some cool new templates that implement some of your infrastructure or environment best practices. Or write an article about where you have already used descriptive scripts to build environments.

As always, there is a whole lot of material on the internet you can use as a starting point; I have summarized a little bit here.

T-SQL Tuesday #141: Work/Life Balance?

Invitation and wrap up from TJay Belt.

Having gone through this last year of strange upheaval around the world, I’ve been thinking about this topic quite a bit lately. My company has a mentor program where one can sign up and discuss this topic. I signed up. I’ve had several ‘sessions’ with my mentor. We’ve had some productive conversations about what this all means to me, my career, my skillset, and so on.
As I transitioned to work at home, some 500 days ago, it was a shift in perspective, timing, and many things. Trying to keep those scales balanced has been a focus and a struggle from time to time. Especially recently, but throughout my career, it’s a topic I often lapse back into. Trying to find that balance.

My invitation to you for this month’s #tsql2sday is…

I want to invite you to share your experiences dealing with your own personal journey with Work/Life Balance, either through experience or hopeful future goals…

  • What are the cool things you have learned?
  • What things did you try that simply didn’t work out?
  • What rules have you implemented that made it easier to balance?
  • What rules made it harder to find that balance?
  • What tips or tricks can you share to help others?

T-SQL Tuesday #140: What have you been up to with containers?

Invitation and wrap-up from Anthony Nocentino.

In recent years containers have come into the data platform world, exposing new technologies to data professionals. Microsoft put SQL Server in Linux, and shortly after that, SQL Server made its way into containers. SQL Server in containers has become the foundation for things like Big Data Clusters and Azure Arc-enabled Data Services.

My invitation to you for this month’s #tsql2sday is…

I want to invite you to share your experiences using containers and not just SQL Server in containers…

  • What are the cool things you’ve done with containers in your environment, test lab, or even presentation demos?
  • Are you using containers in production? If so, what are the tips or tricks you can share to help others?

If you haven’t tried containers yet…here’s a video showing you how to do the following…

  • Deploy a SQL Server in just a few minutes!
  • Connect to your container-based SQL Server.
  • Upgrade a container-based SQL Server to a new version.

So, if you haven’t used containers before, go ahead and try out the demos from this video, which you can get here, and write about your experience!
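If you want a feel for those first steps before watching the video, a minimal sketch with Docker looks something like this. This is an illustrative example, not the video’s demo script: the container name, image tags and password are placeholders, and you should check Microsoft’s mssql-server container documentation for current values.

```shell
# Hypothetical sketch: deploy, connect to, and upgrade a container-based SQL Server.

# 1. Deploy: run SQL Server 2019 with its data files on a named volume.
docker run -d --name sql1 \
  -e "ACCEPT_EULA=Y" -e "MSSQL_SA_PASSWORD=YourStrong!Passw0rd" \
  -p 1433:1433 -v sqldata:/var/opt/mssql \
  mcr.microsoft.com/mssql/server:2019-latest

# 2. Connect: run sqlcmd inside the container to confirm it is up.
docker exec -it sql1 /opt/mssql-tools/bin/sqlcmd \
  -S localhost -U sa -P 'YourStrong!Passw0rd' -Q 'SELECT @@VERSION'

# 3. Upgrade: remove the old container and start a newer image
#    against the same data volume.
docker rm -f sql1
docker run -d --name sql1 \
  -e "ACCEPT_EULA=Y" -e "MSSQL_SA_PASSWORD=YourStrong!Passw0rd" \
  -p 1433:1433 -v sqldata:/var/opt/mssql \
  mcr.microsoft.com/mssql/server:2022-latest
```

Because the databases live on the `sqldata` volume rather than inside the container, the new container picks them up and upgrades them on first start.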

T-SQL Tuesday #139: The data world is hybrid

Invitation from Ben Weissman.

Our data estates are not just on premises anymore. And they haven’t all moved to the cloud either.

New offerings for a hybrid world like Azure Arc-enabled Data Services pop up but are far from being widely used in production environments.

It has become pretty evident to me that we’ll be living in a hybrid world for a long time – potentially indefinitely. But what does that mean for our data strategy?

My Invitation

On this month’s #tsql2sday, I would therefore like to invite you all to share your hybrid and edge experiences!

Where are you on your journey?

Which challenges did you face?

How did you address them?

T-SQL Tuesday #138: Managing Technology Changes

Invitation and wrap-up from Andy Leonard.

One point I make (repeatedly) in my latest book – titled Building Custom Tasks for SQL Server Integration Services – is “software changes.” In fact, software changed on me between the completion of editing and the release of the book! I wrote about the changes in a post titled Building Custom Tasks for SSIS Second Edition Errata, Chapters 1-9. There’s a live stream video at the bottom of that post, as well.

Changing software inspired this month’s T-SQL Tuesday topic:

“How Do You Respond When Technology Changes Under You?”

T-SQL Tuesday #137: Using Notebooks Every Day

Invitation and wrap-up from Steve Jones.

Jupyter Notebooks

I first heard about Jupyter Notebooks years ago. At the time I was just getting started in Python and I thought these were a great way to share code online with others. However, the setup and administration were a pain, and I quickly gave up.

When I saw Microsoft add notebooks to Azure, I got slightly excited again, but once again, it was a bit of a pain to work with these. Too much administrative overhead, in my opinion.

That changed a bit with Azure Data Studio (ADS), where notebooks have been added and have become easier to use and more stable. If you haven’t tried notebooks, get ADS and give it a try today. I even wrote a short piece on using them.

Your Invitation

For this month’s T-SQL Tuesday, I want you to write about how you have used, or would like to use, a Jupyter notebook. This seemed to be exciting for many people at first, but I haven’t seen a lot of uptake from users in general. So I’m curious if you are using them.

A few resources for you:

There is even a way to run these from PowerShell.

So let me know how you are using notebooks, or your plans for the future.

T-SQL Tuesday #136 – Blog About Your Favorite Data Type (Or Least Favorite)

Invitation and wrap-up from Brent Ozar.

Your mission: write a blog post about your favorite data type, and schedule it for next Tuesday, March 9.

Mad about money? Feverish about float? Tell us what you use your favorite data type for, and what people need to know before they get started using it. Data types have so many interesting edge cases, like when Aaron Bertrand taught us to use lower case for our data types. (I still don’t do that, but for the record, I do feel guilty.)

T-SQL Tuesday #135 – The outstanding tools of the trade that make your job awesome

Invitation and write-up from Mikey Bronowski

TL;DR

Without tools, most of the work would be much harder to do or could not be done at all. Write a blog post about the most helpful and effective tools you use or know of.

If you do not own the blog – Tweet your contribution or post on LinkedIn with #tsql2sday and put the link back in the comments here.

Tools are great

Can you imagine your day without them? How do you make your coffee or slice the bread? How do you cook breakfast or dinner?

We are surrounded by tools, and there is a special class of them called “tools of the trade”. Those are needed to do your job whatever it is.

I would like you to write about the tools that help you at work, those that helped you the most or were the most effective. It can be anything really; it does not have to be related to SQL at all. Here is a list of examples, but you can go beyond that and share the tools you cannot live without. Are you…

  • writing code?
  • recording videos or streaming live sessions?
  • creating podcasts or taking pictures?
  • migrating loads of data or environments?
  • writing documentation?
  • maintaining open-source projects?
  • managing the team?
  • travelling a lot (well, not currently)?
  • learning a new language?
  • renovating a house?

Everyone has their own experience and work style. The goal of this post is for everyone to learn about a new tool, or to start using one, so life gets a little bit easier.

T-SQL Tuesday #134 – Give me a break!

Invitation and roundup from James McGillivray.

Breaks are critical for our mental health. Write a post about relaxation techniques, dream destinations, vacation plans or anything else relating to taking a break for your own mental health.

TAKING A BREAK

2020 was a tough year. And there’s no guarantee that 2021 will be better. Hopefully it is, or at the very least, hopefully our experience in surviving 2020 will allow us to be more prepared.

In the early months of the pandemic, I was burning myself out. South Africa was in a hard lockdown, and we were prohibited from doing many of the things I usually do outside of work. Sport, exercise (outside the house), socialising, and choir were all banned. Additionally, my wife is an Emergency Physician and at high risk of becoming infected, so we isolated at home as much as we possibly could.

With all this time on my hands, and no sport on TV, I started working crazy hours to keep myself occupied. And even when I wasn’t working, I was spending time in front of a screen; watching TV, browsing social media, even having a chat with my parents or friends was happening in front of a screen.

Before long, I was feeling lethargic, listless, and honestly, very down. I was struggling to provide emotional support for my wife when she needed it most, and I realized I was burned out. I knew that I needed to change something… which I will write about next week (but here are some pictures to whet your appetite).

YOUR TURN

For T-SQL Tuesday #134, I’d like to know about your own views on vacations, escapes or other breaks – whether it’s work, technology, or any other situation that you need to get away from. Plans, memories, or relaxing activities, I want them all!

You can choose any of the topics below, or any other variation on the theme.

  • What do you do to take a break when you’re stuck at home?
  • How do you switch off, and get away from being connected 24/7?
  • If money was no issue, what would be your bucket list vacation?
  • What vacation are you dreaming of taking once it is safe to do so?
  • What is the best vacation you’ve ever had?