T-SQL Tuesday #152 – It Depends

Invitation and round up from Deborah Melkin.

I came to a realization lately that I have a few opinions about databases. And I’m pretty sure that you do too. After all, I’ve read your blogs, chatted with you, and seen your Twitter rants.

But we’re database professionals. It’s supposed to depend, right?

Except we all have experiences that shape how we approach our work. One minute your coworker asks you a question about doing X; the next, you're replying with “It Depends…” and launching into a 5-10 minute rant. This may include some or all of the following:

  • Stories starting with “that one time at that client”
  • References to blog posts you read/wrote/should write
  • Commentary on code – the good, the bad, & the ugly
  • Personal theories and philosophies on the topic

All of this is followed by “Thank you for coming to my TED talk” and an “I’m sorry, what was your question again?”

So yeah… this may have been inspired by an actual conversation… or two… or ten. I apologize to my coworkers… again…

So for this month’s T-SQL Tuesday, I want you to give us that rant. Tell us about the experiences, the code, the posts that inspired you, and all the gory details in between. What is it that makes you so passionate about this topic that “It Depends” gets tossed out the window? Pull out your soapbox and tell us all about it!

T-SQL Tuesday #151 – Coding Standards

Invitation and roundup from Mala Mahadevan.

My invite is about coding standards, or what I now call Linting Rules, for T-SQL. What are the T-SQL coding rules that are appropriate for where you work and what you do? If there are exceptions to those rules, state them too! If this is enough, read the blog party rules below and get started!!

  • Your post must be published on Tuesday June 14, 2022.
  • Your post must contain the T-SQL Tuesday logo (see above) and the image must link back to this blog post.
  • Trackbacks should work, but if not please put a link to your post in the comments section so everyone can see your contribution! (My comments are moderated so please don’t worry if yours doesn’t appear right away, I will make sure it does!)
  • If you are on Twitter, include the hashtag #tsql2sday – it helps with RTs and visibility!!

More on why I picked this topic is below:

When I started out as a DBA two decades ago, I had a list of rules that I would carry with me into every job I went to. These are things I look for in T-SQL code and try to enforce as standard. Some examples were casing rules, minimized usage of SELECT *, matching the right data types in columns, avoiding the NOLOCK hint, and so on. Standards ensure quality and consistency in code.

Standards differ for each firm, depending on what is appropriate for an environment. It is even possible to have varying standards in the same company, depending on the environment and what is appropriate for a database. This is an excellent article on the different components that comprise coding standards and why we need them. I am also a big proponent of automated code checking for standards – there are lots of tools available for doing this, including SQL Prompt, a personal favorite of mine, and many others as listed here.

Several tools currently do linting on many relational platforms, not just SQL Server. Almost all of them, though, have rules that the author(s) think are best for the worlds they work in, and do not include conditions they have not encountered yet. A common example I like to use is unnamed primary keys on temporary tables. There is nothing inherently wrong with having an inline primary key constraint/index on a temporary table – but if you use Query Store, plan forcing on a plan that uses this temp table will not work, simply because the constraint gets named differently each time. When I started to look for a linting tool for where I work, I ran into so many outside tools whose rules either didn't exist for my needs or didn't apply to my environment that I decided to write my own using ScriptDOM – a Microsoft-provided library that was created specifically for this purpose.
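
To make that temp-table example concrete, here is a minimal T-SQL sketch (the table and index names are mine, purely illustrative):

-- An unnamed inline primary key gets a system-generated name
-- (something like PK__#orders__...) that differs every time the table
-- is created, which is what breaks Query Store plan forcing.
CREATE TABLE #orders
(
    order_id int NOT NULL PRIMARY KEY,  -- system-named constraint
    order_date date NOT NULL
);
DROP TABLE #orders;

-- One alternative: a named inline index (SQL Server 2016+ for UNIQUE),
-- whose name stays stable across runs. Explicitly named *constraints*
-- on temp tables can collide between concurrent sessions, so a named
-- index is often the safer choice.
CREATE TABLE #orders
(
    order_id int NOT NULL,
    order_date date NOT NULL,
    INDEX ix_orders_order_id UNIQUE CLUSTERED (order_id)
);
DROP TABLE #orders;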

It would help greatly if we had a collection of rules that people can pick from and enforce as appropriate for their environments. It would also help me to code some of these into ScriptDOM and put them out on GitHub, if the rule is one that ScriptDOM can find easily. So, restating the call for this month: What are the T-SQL coding rules that are appropriate for where you work and what you do? If there are exceptions to those rules, state them too!

T-SQL Tuesday #150 – Your First Technical Job

Invitation from Kenneth Fisher.

This month for T-SQL Tuesday I’d like to hear about your first technical job(s). I know most DBAs don’t start out working with databases, so tell us how you did start. I’m generally thinking of your first tech job, but if you had a job early in your career that wasn’t technical at all but still makes for a good story, feel free to share! I can’t wait to hear about it.

T-SQL Tuesday #149 – Blog about T-SQL Advice you’d give to your younger self

Invitation and round up from Camila Henrique.

T-SQL Tuesday is a monthly blogothon where a host chooses one topic and bloggers around the world give their ideas about it. This month is my first time being the host and I’m so excited!

April’s theme: if you could give advice about T-SQL to your younger self, what would you say? I’m not defining any more specific sub-topics here; I’m opening the floor to hear about your past experiences and what you wish you had known back then.

T-SQL Tuesday #148 – Advice on Running a User Group

Invitation from Rie Merritt.

For this edition of T-SQL Tuesday, I’d like to ask everyone to write about the various aspects of running a user group. I’m not asking you to write a James Michener novel that starts at the beginning of time and includes everything a user group leader needs to know. Specifically, I’m asking people to pick one or two things and go deep on what works for you, what didn’t work for you, and lessons learned. Running a user group is a large umbrella of tasks and duties, be it in-person, virtual, or hybrid. My goal this month is to break that up into small, bite-sized pieces of knowledge for people. There are so many directions you can go here: finding speakers, growing your membership, utilizing technology to make things easier, finding sponsors, finding a venue, picking a pattern for when you meet, etc. The possibilities are almost endless!

The Azure Data Community is building a “How to run a user group” wiki, and we’d love to link to your post to offer new and old user group leaders some great resources to build and maintain a successful, healthy community.

Special thanks to Steve Jones for running T-SQL Tuesday and finding a spot for me, thanks to Kenneth Fisher for being flexible with when he sponsors T-SQL Tuesday, and finally to John Morehouse, Annette Allen, and Josh Smith for all of the sweat equity they’ve already put into building the wiki.

T-SQL Tuesday #147 Invitation – Upgrade Strategies

Invitation and wrap-up from Steve Jones

Planning for Upgrades

In my career, we didn’t upgrade production databases very often. In most of my jobs, we’d change versions for new databases, but existing ones often lived on their original version. That’s how I got into a job where I was managing 4 different versions of SQL Server. These days I expect it’s common for many DBAs to have to deal with that many versions, or more.

I do have customers these days that try to upgrade often and limit the number of versions they work with. I have customers now that are on a mix of 2016-2019 only, some that might be working on 2014-2016 only, and I’ve run into a customer that only has SQL Server 2017. Of course, that customer has few databases and looks to upgrade about every 5 years, when mainstream support is running out for their version.

This month I want you to write about how you look at SQL Server upgrades. A few things you might think about:

  • Why do we wait to upgrade?
  • Strategies for testing an upgrade
  • Smoke tests or other ways to verify the upgrade worked
  • Moving to the cloud to avoid upgrades
  • Using compatibility levels to upgrade an instance but not a database (see the sketch after this list)
  • Checklists of things to use in planning
  • The time it takes to upgrade your environment
  • What you evaluate when deciding whether or not to upgrade
  • Anything else
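
On the compatibility-level item, here is a minimal sketch of the idea (the database name SalesDB is hypothetical):

-- Databases keep their old compatibility level after an instance
-- upgrade until you raise it explicitly, so you can upgrade the
-- instance first and move databases forward one at a time.
SELECT name, compatibility_level
FROM sys.databases;

-- Raise a single database once it has been tested on the new version.
-- (SalesDB is a hypothetical name; 150 = SQL Server 2019 behavior.)
ALTER DATABASE SalesDB SET COMPATIBILITY_LEVEL = 150;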

I don’t know when SQL Server 2022 will be released, but many of us will certainly need to consider in 2023 whether or not we want to upgrade our systems. Think about it and write about something that matters to you.

T-SQL Tuesday #146: Upending Preconceived Notions

Invitation and round-up from Andy Yun.

Welcome back to another edition of T-SQL Tuesday! I’m honored to be your host once again!

Theme to Kick off 2022

This month, I’d like to ask everyone to think about something you’ve learned that subsequently changed your opinion/viewpoint/etc. on something. Maybe you’ve had a certain opinion, belief, or bias. Perhaps you’ve always thought something worked a certain way. Or you’ve always thought that a certain something (called “X”) was only good for “A”, only to later learn that it can help with “B” and “C” as well. Regardless, you learned something, and it totally upended and changed that preconceived notion you held previously.

When has this happened to you, Andy?

Let me share an example. In my past as a T-SQL developer, I remember when I first learned about CTEs. I thought the world of them and started using them everywhere! However, there was one slight problem. I was under the mistaken impression that they pre-materialize each sub-query. And they do… in OTHER RDBMSs. Whoops! After a few years, I learned that they don’t behave that way in SQL Server. Instead, the query optimizer inlines the query in the CTE, making it functionally no different than a subquery. And well, let’s just say that made me regret some of the coding decisions I’d made during my “CTEs-are-awesome” phase.
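
A minimal sketch of that inlining behavior (dbo.Orders and its columns are hypothetical): a CTE referenced twice is expanded twice, just like a repeated subquery.

-- SQL Server inlines the CTE rather than materializing it, so each
-- reference below expands to the SELECT inside it.
WITH RecentOrders AS
(
    SELECT TOP (10) order_id, order_total
    FROM dbo.Orders              -- hypothetical table
    ORDER BY order_date DESC
)
SELECT r1.order_id, r1.order_total
FROM RecentOrders AS r1
JOIN RecentOrders AS r2          -- second reference: the query is expanded again
    ON r1.order_id = r2.order_id;

-- If you need the result computed exactly once, stage it yourself:
-- SELECT TOP (10) order_id, order_total
-- INTO #RecentOrders
-- FROM dbo.Orders
-- ORDER BY order_date DESC;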

T-SQL Tuesday #144 – Data Governance reimagination

Invitation from Dr. Victoria Holt.

Data governance is a topic that has raised its head again in the last year, with the introduction of a GA service called Azure Purview. Data governance is not a new topic in the realms of data management. I think over the last few years data governance has had a focus on meeting data protection law, government legislation, and formalised control and standards. Then, with data sovereignty issues being everyday considerations within the global marketplace, along with storage locations in the cloud and the likes of the General Data Protection Regulation, data governance has been very focused on meeting legislative requirements.

There has been a substantial cost involved in setting up data governance within an organisation to avoid the heavy fines if a data breach were to occur. I think because of this, many organisations compartmentalise personal data systems.

A change in perspective, a reimagination, of data governance is occurring. Data governance is really about ‘data erudition’: showing an interest in learning about the data we have, improving its quality, and creating a more productive and trusted data asset. Starting small and incrementing data change in a way that matches the business need is a way to gain targeted business value. Azure Purview provides a great opportunity for us to start at the beginning and create a data catalog, data inventory, data dictionary, and much more in an automated way. Trusting the data quality, and knowing your risks and what data you have, sets your business up for success.

My invitation to you for this month’s #tsql2sday is…

I want to invite you to share your experiences on data governance:

  • The current cost of data governance versus its benefits
  • The amazing things data governance has enabled you to achieve or will enable you to achieve in the future
  • The potential uses for Azure Purview within your estates and the automated deployment options for that

T-SQL Tuesday #143 – Short code examples

Invitation and wrap-up from John McCormack.

T-SQL Tuesday this month is going back to basics and it’s all about code. I’d like to know: “What are your go-to handy short scripts?”

What are those little short bits of code that you can’t live without? I’m talking about little snippets that are only a few lines, that you may even have memorised. It could be T-SQL, PowerShell, Python or anything else you use day to day.

For example, I manage a lot of SQL Agent jobs. Quite often, I need to find out which job has a certain T-SQL string in its command, so I’ll run:

-- Find SQL Agent jobs whose step command matches a given pattern
SELECT *
FROM msdb.dbo.sysjobs AS sj
JOIN msdb.dbo.sysjobsteps AS sjs
    ON sj.job_id = sjs.job_id
WHERE sjs.command LIKE 'backup log%';

Of course, there are many other ways to find this out, including dbatools commands, but sometimes I just revert to memory for convenience.

Another one I like is getting the estimated completion rate of a backup or restore. Now, there are better scripts than this, but sometimes nothing beats getting a quick estimate back from a couple of lines of memorised T-SQL.

-- Estimated completion percentage for currently running requests
SELECT percent_complete AS pc, *
FROM sys.dm_exec_requests
ORDER BY pc DESC;
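
A slightly filtered variant of the same idea – my own tweak, not part of the original snippet – restricts the DMV to backup and restore commands and converts the estimated time remaining (reported in milliseconds) to minutes:

-- Progress of running backups/restores only
SELECT percent_complete,
       command,
       estimated_completion_time / 60000 AS est_minutes_left
FROM sys.dm_exec_requests
WHERE command IN ('BACKUP DATABASE', 'BACKUP LOG',
                  'RESTORE DATABASE', 'RESTORE LOG')
ORDER BY percent_complete DESC;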

My invitation to you for this month’s #tsql2sday is…

I would like you to share with the community what your go-to script snippets are and why you find them useful. By sharing these, you will undoubtedly be helping someone who hasn’t thought of doing it that way, and hopefully you’ll pick up some handy hints as well.

  • Any language is fine, not just T-SQL
  • Please share as many as you wish
  • Perhaps you never do this and always work off saved scripts or convert your snippets to stored procedures? Tell us why this works for you.

T-SQL Tuesday #142: Using descriptive techniques to build database environments

Invitation from Frank Geisler.

In the old glory days, you typically had to deal with one or two or maybe three SQL Servers. As you all know, those times are over. Through the rise of the cloud, every one of us must deal with more and more systems: not only infrastructure, but also Platform as a Service (PaaS) offerings. The systems themselves are getting more complex through all the new services and technologies that are involved and somehow interconnected. New movements like Azure Arc-enabled Data Services bring a whole new aspect to the table, where you can easily choose whether to run your data workload on your on-premises Kubernetes cluster or in the cloud.

All these systems can easily be built with the Azure Portal, but this is not sustainable. Each time you use the portal, you must remember how to build a certain system and – more importantly – how to apply best practices. Sure, you can build, e.g., an Azure SQL Database with an open endpoint to the internet and secure it with firewall settings, but this should be done with great caution because you are exposing your database to the internet. A better approach would be to build an Azure SQL Database that has no public endpoint, only a private endpoint into an Azure VNet that hosts the systems that must access the database, or that is connected to a local network via a VPN gateway. As you can imagine, there are a lot of moving parts to get such an environment up and running, and you must remember (or document) each of them. This is very cumbersome work. There must be a better solution, and for sure there is one: scripting.

When you write a script, you do the work once, and whenever the same or a similar situation comes up – e.g., deploying an Azure SQL Database according to best practices – you can just pull out your script and there you go. This can be taken to another level when your script is parameterized, so you just put in the parameters and let the script do the rest. Using this as a mantra, I developed several scripts to build different cloud environments in PowerShell. This has the big advantage that the environment is documented, because you have a script, and versioned as well, because all our scripts are saved in a source control system. The overall approach is called Infrastructure as Code (https://en.wikipedia.org/wiki/Infrastructure_as_code).

But imperative scripting in PowerShell also has its shortcomings. The cloud, and the internet in general, is a very uncertain environment. While the script that deploys your environment is running, many things can happen: your internet connection can break down, the deployment can fail for whatever reason, and so many other things you can think of. So you have to build many conditions into your script: if the resource group exists, skip it and just build the Azure SQL Database; if the VNet exists, skip it; check whether all needed subnets exist; and so on. Right? Wrong! Besides the imperative way of telling the Azure Resource Manager what to do, you can also use a declarative approach to build resources in Azure.

This declarative approach is very familiar to everyone who has ever written T-SQL code. If you write, e.g., a query that selects data, you don’t instruct the database system how to retrieve the data from the underlying file structure. You only tell the system what the data you are looking for should look like: select all the rows of data where the first name is “Frank”. This is the exact same approach that techniques like ARM templates, Bicep templates or, if we are talking about Kubernetes, YAML scripts take. The scripts are a description of what the target environment should look like. How this target environment is reached is fully up to the underlying system, such as the Azure Resource Manager. And there is even more: if you are changing an existing environment, only the parts that changed in the script will be altered in the target environment. Say you have an Azure SQL Database of a certain size and you change the size in your Bicep script. The next time you deploy the script, the Azure SQL Database will be resized without being deleted and redeployed.
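
As a tiny illustration of that analogy (dbo.Person is a hypothetical table), the T-SQL below only declares which rows we want; the optimizer decides how to fetch them, just as the Azure Resource Manager decides how to realize a Bicep description:

-- We declare WHAT we want: all rows where the first name is 'Frank'.
-- HOW the rows are fetched (scan, seek, etc.) is left to the engine.
SELECT first_name, last_name
FROM dbo.Person          -- hypothetical table
WHERE first_name = 'Frank';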

The ideal process for working with Infrastructure as Code is that the code is checked in to Azure DevOps and an automated process then deploys the changes to your target environment. To change your environment or to add resources, you only have to write the needed changes into your Bicep scripts, check them in, and let Azure do the magic.

My invitation to you for this month’s #tsql2sday is:

This is my invitation to you this T-SQL Tuesday: think about deploying SQL components through descriptive methods and, of course, blog about it. It does not matter whether you are using Azure and ARM templates, or Bicep, or Kubernetes and YAML. Just write about it and build some new cool templates that implement some of your infrastructure and environment best practices. Or you can write an article about where you have already used descriptive scripts to build environments.

As always, there is a whole lot of stuff on the internet you can use as a starting point. I’ll summarize a little bit here: