MSSQL Archives - Thomas LaRock
https://thomaslarock.com/category/mssql/
Thomas LaRock is an author, speaker, data expert, and SQLRockstar. He helps people connect, learn, and share. Along the way he solves data problems, too.

SET NOCOUNT For SQL Server
https://thomaslarock.com/2021/03/set-nocount-for-sql-server/
Tue, 30 Mar 2021
Last week I was reviewing an article and found myself needing information on the use of NOCOUNT as a standard for writing stored procedures. A quick internet search found this old post of mine, written back when I used to work for a living. Apparently, I was once asked to enable NOCOUNT for a specific SQL Server database. As the post suggests, this is not possible. The options for NOCOUNT are to set for the entire instance, for your specific connection, or within your T-SQL code.

Since the post was written well before the new-ish ALTER DATABASE SCOPED CONFIGURATION statement, I was hopeful enabling NOCOUNT for a database was now possible. Turns out you cannot, as the set options listed here do not include NOCOUNT. Sad trombone music.
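For reference, database-scoped configuration covers settings such as MAXDOP and parameter sniffing, but nothing row-count related. A sketch of the statement's shape (MAXDOP chosen only as an illustrative option):

```sql
-- ALTER DATABASE SCOPED CONFIGURATION sets per-database options such as
-- MAXDOP; NOCOUNT is not among the documented options.
ALTER DATABASE SCOPED CONFIGURATION SET MAXDOP = 4;
```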

But of course I tried anyway.

And I failed.

Really failed.

I tried to enable NOCOUNT for my instance of SQL 2019 and it wouldn’t take. At all.

Let me explain.

The Flop

Using the code from my previous post, you enable NOCOUNT for the instance by configuring the user option to 512, like this:

EXEC sys.sp_configure 'user options', 512;
GO

RECONFIGURE;
GO
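If you want to confirm the change took at the instance level, running sp_configure with just the option name echoes the configured and running values:

```sql
-- Confirm the instance-level setting; run_value should show 512 after RECONFIGURE
EXEC sys.sp_configure 'user options';
```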

Now, open a new query window in SQL Server Management Studio (SSMS), set the results to text to make the output easier to see, and run a query. If you are like me, you will see this:

Not exactly the expected behavior! My initial reaction is to assume I have screwed this up somehow. I decide to try Azure Data Studio (ADS) to connect and run the query:

Same result. Two tools, and the result set is showing a count of rows affected, despite the user option clearly having been set.

And the SSMS GUI verifies this as well:

The Turn

Before I go any further, I want to note that SET NOCOUNT OFF is one of those horrible phrases we come across in tech where our brains are forced to think twice about what we are doing. Whoever named it this way should be sacked. A simple SET ROWRESULTS ON|OFF would be far easier to comprehend. </rant>

Anyway, I spend time trying to debug what is happening. I am able to manually set NOCOUNT on and off inside of T-SQL and see a count of rows affected returned (or not). I check and recheck everything I can think of and feel as if I have lost my mind. I’m starting to question how I ever became certified in SQL Server.

I mean, it’s a simple configuration change. This isn’t rocket surgery.

So I do what anyone else in this situation would do.

I turn off my laptop and forget about everything for a few days.

The River

Eventually I decide to reopen my laptop and try again. I am able to reproduce everything. So I ask some friends if they are also seeing similar issues. One friend, Karen López (@datachick), asks me a few follow-up questions, which get me thinking about other ways to test and debug the behavior. I suddenly recall I can check the options set for my current connection:

DECLARE @options INT
SELECT @options = @@OPTIONS

PRINT @options
IF ( (1 & @options) = 1 ) PRINT 'DISABLE_DEF_CNST_CHK'
IF ( (2 & @options) = 2 ) PRINT 'IMPLICIT_TRANSACTIONS'
IF ( (4 & @options) = 4 ) PRINT 'CURSOR_CLOSE_ON_COMMIT'
IF ( (8 & @options) = 8 ) PRINT 'ANSI_WARNINGS'
IF ( (16 & @options) = 16 ) PRINT 'ANSI_PADDING'
IF ( (32 & @options) = 32 ) PRINT 'ANSI_NULLS'
IF ( (64 & @options) = 64 ) PRINT 'ARITHABORT'
IF ( (128 & @options) = 128 ) PRINT 'ARITHIGNORE'
IF ( (256 & @options) = 256 ) PRINT 'QUOTED_IDENTIFIER'
IF ( (512 & @options) = 512 ) PRINT 'NOCOUNT'
IF ( (1024 & @options) = 1024 ) PRINT 'ANSI_NULL_DFLT_ON'
IF ( (2048 & @options) = 2048 ) PRINT 'ANSI_NULL_DFLT_OFF'
IF ( (4096 & @options) = 4096 ) PRINT 'CONCAT_NULL_YIELDS_NULL'
IF ( (8192 & @options) = 8192 ) PRINT 'NUMERIC_ROUNDABORT'
IF ( (16384 & @options) = 16384 ) PRINT 'XACT_ABORT'

Running the above code returns the following result for my connection:

And then it hits me. My connection does not have NOCOUNT enabled! I mean, I’m not really surprised, but it is helpful to see that it is missing. I then decide to open a connection with SQLCMD and observe the default behavior for the connection. Sure enough, NOCOUNT is enabled, as expected:

My connection string has no additional options, and NOCOUNT is respected. This is the expected behavior for the instance.
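As a side note, the bit test in the T-SQL script above is easy to run anywhere. Here is a quick Python sketch that decodes a captured @@OPTIONS value (5496 is the commonly cited value for a default SSMS connection):

```python
# Decode a @@OPTIONS bitmask into named SET options, using the same
# bit values as the T-SQL script above.
OPTIONS = {
    1: "DISABLE_DEF_CNST_CHK", 2: "IMPLICIT_TRANSACTIONS",
    4: "CURSOR_CLOSE_ON_COMMIT", 8: "ANSI_WARNINGS", 16: "ANSI_PADDING",
    32: "ANSI_NULLS", 64: "ARITHABORT", 128: "ARITHIGNORE",
    256: "QUOTED_IDENTIFIER", 512: "NOCOUNT", 1024: "ANSI_NULL_DFLT_ON",
    2048: "ANSI_NULL_DFLT_OFF", 4096: "CONCAT_NULL_YIELDS_NULL",
    8192: "NUMERIC_ROUNDABORT", 16384: "XACT_ABORT",
}

def decode_options(value: int) -> list[str]:
    """Return the names of all options set in the bitmask."""
    return [name for bit, name in OPTIONS.items() if value & bit]

# A default SSMS connection: note that NOCOUNT (512) is not in the list.
print(decode_options(5496))
```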

Now I need to verify what is happening under the hood when you connect to SQL Server using SSMS or ADS. Using the default xEvents session I capture the connection string sent when connecting from ADS and find this gem:

The NOCOUNT user option configuration item is not recognized when you connect using those tools. Other user options appear to be respected, but for some reason NOCOUNT is ignored. This explains why I was seeing the unexpected behavior.

I will keep my certifications for now.

Summary

I don’t know if this is a bug or a feature, but it is certainly a frustrating experience for an end user like myself. If I SET NOCOUNT for SQL Server, I expect it to apply to all users connecting from that point forward. Since other user options appear to be respected, there must be something different about NOCOUNT.

It should not matter how users are connecting. SSMS and ADS should both respect the server settings. I suspect other tools likely use the same code as SSMS and ADS, meaning you should double check the actual connection string used from your application. It could explain unexpected behavior.

You Can’t Marry Your Database, But You Can Have Relations
https://thomaslarock.com/2021/02/you-cant-marry-your-database-but-you-can-have-relations/
Mon, 01 Feb 2021
There’s something you should know about relational databases.

They were designed to store data efficiently, protecting the quality of the data written and stored to disk. I’ve written before about relational engines favoring data quality and integrity, and how relational databases were not designed for the reading of data.

Of course, if you are going through the trouble of writing the data into a relational database, it makes sense that you would want to retrieve the data at some point. Otherwise, why go through the exercise of storing the data inside the database?

The trouble with reading data from a relational database is due to the data not being stored in a format that is friendly for viewing, reading, or retrieving. That’s why we have data professionals, like me, to help you write queries that return the correct data, in the correct format, for you to analyze.

I’m here today to tell you we’ve been doing data wrong the whole damn time.

Let me show you what I mean.

Traditional Data Flow Patterns

Here’s what companies do, every day:

Step 1 – Identify useful data

Step 2 – Import that data into a database

Step 3 – Analyze the data

Step 4 – Export the data into dashboards

Step 5 – Profit (maybe)

The trouble with this process is Step 3, the analyzing of the data. Relational databases were not designed for analytical processing. Relational databases do not store data in a way that is readable, or friendly, for human analysis.

That’s not to say you can’t do analytics inside of a relational database. What I am saying is that it could be better for you not to spin the CPU cycles there, and instead do the analytics somewhere else.

For example, data warehouses help with data storage, retrieval, and analytics. But even a data warehouse can fall short when it comes to the use of unstructured data sources. As a result, we’ve spent decades building ETL processes to curate, collate, consolidate, and consume data.

And we’ve been doing it wrong.

Data Mining for the Data Expert

So, we understand that people find data, store it, and try to use it later. They are engaging in the process of data mining, hoping to find gold in the form of insights, leading to better business decisions.

But as I mentioned before, the data isn’t in a readable format. Let’s look at an example.

Following the common data flow pattern, I found some useful data at NFL Savant https://nflsavant.com/about.php, and imported the data into a SQL Server database:

It looks like any other table in a relational database. In this case, it is a table containing time series data pertaining to play by play records for the 2018 NFL season. Each row represents an entity (a play at a point in time of an NFL game), and the columns represent attributes of the play (down, distance, yards to go, etc.)

Nothing out of place here, this is how data is written to a relational database. In an orderly fashion. As a DBA, I love this type of orderly storage. It’s efficient, and efficient is good.

As a data analyst, I’m not a fan. At least, not yet. I have a bunch of data, but what I want are some answers. So, it’s up to me to ask some questions of the data, find some answers, and use that to help make better business decisions.

For this data, here’s an example of a simple question: What are the average yards to go for NFL teams in 2018? I can get that answer with some simple T-SQL:
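The query in the screenshot was along these lines (a sketch only; the table name `dbo.pbp_2018` and the `OffenseTeam` and `ToGo` column names are assumptions based on the NFL Savant CSV layout):

```sql
-- Average yards to go per offense, 2018 season (illustrative names)
SELECT OffenseTeam,
       AVG(CAST(ToGo AS DECIMAL(5, 2))) AS AvgYardsToGo
FROM dbo.pbp_2018
GROUP BY OffenseTeam
ORDER BY OffenseTeam;
```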

This is great! I was able to take my data, ask a question, and get an answer. What could be better, right?

Of course, now I have more questions about my data. And here’s the first issue you will discover when trying to analyze data stored in a traditional relational database.

T-SQL is excellent at answering one question at a time, but not as great when you need more than one question answered.

So, if we have more questions, we will need to write more queries.

Here’s a good follow-up question that we might want to be answered: Can we examine this data broken down by each quarter?

Fortunately, the answer is yes, because T-SQL comes with a bunch of statements and functions that will help. In this case, I am going to use the PIVOT operator, as follows:
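The PIVOT version looks roughly like this (again a sketch; table and column names are assumptions carried over from the earlier example):

```sql
-- Average yards to go per offense, broken down by quarter (5 = overtime)
SELECT OffenseTeam, [1], [2], [3], [4], [5]
FROM (SELECT OffenseTeam, Quarter, ToGo FROM dbo.pbp_2018) AS src
PIVOT (AVG(ToGo) FOR Quarter IN ([1], [2], [3], [4], [5])) AS p
ORDER BY OffenseTeam;
```

The [5] column is overtime, which is why teams that never played an overtime period show no result there.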

Easy, right?

No, not easy.

And not readable, either. What’s with that row saying NULL? Why do I not have a result for some teams in that last column?

As it turns out, you need a lot of experience writing T-SQL to get to that query. And you need more experience understanding the result set, too. You don’t start on Day 0 as a data professional writing PIVOT queries against a SQL Server database.

Here’s the good news: You don’t need to write PIVOT queries, ever.

Data Mining for the Masses

The data import from NFL Savant was in the form of a CSV file, which I then imported into my database. Because that’s how ETL is done (see above for the common data flow process).

What if…now hear me out…we skipped step 2? Forget about doing the import process. Instead, let’s open that CSV file in Excel.

Here’s what it would look like:

Back to our football questions. We’ve seen examples in T-SQL, let’s look at how to do this in Excel using a Pivot table.

I click on one cell in Excel, insert a pivot table, drag the offense teams in as rows and the yards to go in as values, change it to an average, and we are done. Have a look:

It took but a few seconds to get this magic to happen. Here’s what I want you to know:

1. No T-SQL is necessary. None. Not one line of code.

2. I have the entire table as a pivot table, allowing me to answer more questions WITHOUT needing to write more T-SQL.

3. There is no code. None.

Let’s say that I want to know the yards to go broken down by quarter. With T-SQL, I would need to write a new query. With the pivot table, it’s a simple drag and drop, like this:

Fin.

There is no need to rewrite code to get this result. Because there is no code, it’s drag and drop, and then I have my answer.

And that’s why I believe the inclusion of pivot tables inside Excel is the greatest advancement in the 21st century for data professionals.

Fight me.

Summary

I did not come here to bury relational databases. I came here to help you understand relational databases may not be the right place to do analytical processing.

When it comes to curating and consuming data, I have three simple rules for you to follow:

Rule #1 – Only collect data that you need. Don’t collect data “just in case you may need it later.” The data you collect must be relevant for your needs right now.

Rule #2 – Understand that all data is dirty. You could build a perfect analytical solution, only to have it based on inaccurate data. Know the risks involved in making business decisions based on dirty data.

Rule #3 – Before you collect any data, consider where the data will be processed. Don’t just assume that your database will do everything you need. Take time to list out all the available tools and systems at your disposal. The result may be a simpler solution than first imagined.

I wrote this post to help you understand Rule #3. Analysis of NFL play by play data is best done in a tool such as Excel, or PowerBI, and not (necessarily) inside of SQL Server.

SQL Server is a robust relational database engine, containing integrations with data science-y stuff such as R and Python. Just because you could do your analysis inside the SQL Server engine doesn’t mean you should.

This post originally appeared on PowerPivotPro and I was reminded about its existence while talking with Rob Collie during our Raw Data podcast. I asked Rob if I could repost here. He said yes. True story.

Tune Workloads, Not Queries
https://thomaslarock.com/2020/08/tune-workloads-not-queries/
Mon, 31 Aug 2020
Ask three DBAs about their preferred performance tuning methodology and you will get back seven distinct answers. I bet a pound of bacon one of the answers will be “it depends”.

Of course, it depends! But on what does performance tuning depend?

Context.

Most performance tuning methodologies focus on tuning one or more queries. This is the wrong way of thinking. It is an antiquated way of problem solving.

Let me explain.

The Problem with Traditional Database Monitoring

Traditional database monitoring platforms were built from the point of view of the engine-observer. These tools focus on metrics inside the database engine, and may collect some O/S level metrics. They often assume the database is running on a single server node, and not a collection of nodes. And they are reactive in nature, notifying you after an issue has happened.

But the reality is your database engine is but a process running on top of an operating system, for a server that is likely virtualized, and may be running in your data center or in the cloud. In other words, there are many layers between users and their data. And in a world of globally distributed systems, chances are your database is not on a single node.

This means your in-house legacy accounting application requires different monitoring and performance tuning methods than your on-line ordering system. When you focus on one query, or even a top ten list of queries, you have little to no information regarding the entire application stack. And those engine metrics we know and love will not help you understand the overall end user experience.

But when it comes to database performance tuning methods, there is a heavy focus on tuning activity inside the engine. This makes sense, because that’s what DBAs (and developers) know. That’s the silo in which they operate. They need to prove the issue is not inside the database.

Stop focusing on the database engine and open your mind to the world that exists outside of that database.

Once you turn that corner, the mean time to resolution shrinks. The result is a better end user experience.

Tune Workloads, Not Queries

The Heisenberg Uncertainty principle states that the position and velocity of a particle cannot both be measured exactly at the same time. The more you know about position, the less you know about velocity, and vice versa.

The same theory applies to database performance tuning methods. The more you know about activity happening inside of a database engine, the less you know about the entire system. Nowhere in an execution plan is there a metric for ‘user happiness’, for example.

Therefore, troubleshooting modern distributed systems requires a different approach. Enter the four golden signals: latency, traffic, errors, and saturation. These signals combine to help provide a measure of overall user experience. From there, if you need to dive into a database, you’ll have the context necessary to start tuning at the server, instance, or query level. Over time you can shift to thinking about how to scale out, or up, as necessary.

Put another way, you would not expect your mechanic to tune your Jeep the same way she would tune a Ferrari. Both are vehicles but built for different purposes. The tools and methods are distinct for both. And so are the metrics and dashboards you want for your legacy applications versus a distributed one.

Summary

Slow is the new broke. But things don’t have to be slow to be broke. A poor user experience with your online ordering system will hurt your bottom line. Traditional database monitoring systems are not focused on the user experience. Instead, they focus on the database engine itself. But those engine metrics won’t tell you that Brad in Idaho got frustrated and left his shopping cart with $2,000 worth of potato seeds.

Your performance tuning methodology should include an understanding of the entire system and workload first, before you start looking at any specific query.

101 Ways to Say NO to SysAdmin Requests
https://thomaslarock.com/2020/07/101-ways-to-say-no-to-sysadmin-requests/
Fri, 24 Jul 2020
As an admin, you often get requests from developers asking for elevated permissions on servers and systems. Inside of SQL Server, this is called ‘sysadmin‘ access. As you can imagine, it is not a good idea to give this level of access to just anyone.

I thought I would write a post to help you find a way to decline such requests. Because let’s face it, when a developer asks for sysadmin access, they are asking if they can go all the way with your database.

So, here they are. As always, you’re welcome.

  1. No
  2. I’ll let you know later
  3. How about a pizza instead?
  4. You’re not ready
  5. It’s late on a Friday
  6. I don’t do sysadmin
  7. I don’t like you
  8. I have to go now
  9. You’re just using me
  10. I’d rather you rub my shoulders
  11. I can’t fix what you’ll break
  12. I’m afraid you’ll do something stupid
  13. This isn’t what I had in mind
  14. I know your reputation
  15. Let’s stop working together for a while
  16. If you respected me, you wouldn’t ask
  17. I’ll get caught
  18. Is that all you think about?
  19. I’m allergic to sysadmin
  20. You’ll let everyone use it
  21. My keyboard is too loud
  22. It’s not worth it
  23. I’m waiting for the right developer
  24. It’s against my religion
  25. My boss will kill me
  26. Let’s get something to eat
  27. I want us to be friends
  28. I said no and I mean it
  29. You’re too young to handle the admin
  30. Don’t ask me to make this choice
  31. The sysadmin system is down
  32. Sysadmin can be more trouble than it’s worth
  33. Not everybody has sysadmin, I don’t
  34. With all this ransomware going around?
  35. Don’t make me laugh
  36. I don’t like fixing your mistakes
  37. I don’t just grant sysadmin
  38. I just got the server built
  39. Shhh. I think I hear your boss
  40. We can find other ways for you to screw things up
  41. If I do, you’ll leave an RDP session connected
  42. My boss is waiting for our 1-on-1
  43. I’ve got real work to do
  44. If I do that now, you will ruin things later
  45. I don’t know you well enough
  46. I can’t remember my password
  47. I want you to leave
  48. I’d rather watch Love Island
  49. Have you thought about the consequences?
  50. I think my manager is calling
  51. I don’t want to
  52. People will think I’m “easy”
  53. Go away
  54. I don’t know what other systems you have touched
  55. I want us to stop working together
  56. Maybe later
  57. I’ll show myself out
  58. Support isn’t just about being sysadmin
  59. It’s not what I want to do
  60. My favorite podcast just dropped
  61. I thought you were different
  62. My therapist said not to
  63. I just quit
  64. I want that server to stay clean
  65. I have to go to the toilet
  66. It’s past quitting time
  67. It’s against my strongly held values
  68. I’m too busy
  69. Ask me again in five months
  70. You ignore my advice all the time
  71. I want to keep my career
  72. Thanks anyway
  73. I’m not ready for that type of commitment
  74. We won’t respect each other later
  75. Let’s ask the auditors
  76. I have an early meeting
  77. I’m not sure you’re the right person
  78. Not today
  79. I don’t want that kind of pressure
  80. My teammates will be here any minute
  81. I don’t want our relationship to just be about sysadmin access
  82. That’s a no from me
  83. Not in a million years
  84. Downvoted
  85. Did you say something?
  86. Go home, you’re drunk
  87. We should get to know each other first
  88. I want to, but I can’t
  89. Waiting will make it better when you do get sysadmin access
  90. Quit asking, it’s getting annoying
  91. I have a webinar to attend
  92. I don’t like doing what other people are doing
  93. It’s too risky
  94. My CIO trusts me, and I don’t want to break that trust
  95. It just doesn’t feel like the right time
  96. I have a bad feeling about this
  97. I don’t trust you
  98. That’s funny
  99. No, that’s my final answer
  100. The other devs will want it, too
  101. How about we cuddle instead?

SQL Plan Warnings
https://thomaslarock.com/2020/03/sql-plan-warnings/
Tue, 24 Mar 2020
There are many methods available for optimizing the performance of SQL Server. One method in particular is examining your plan cache, looking for query plan warnings. Plan warnings include implicit conversions, key or RID lookups, and missing indexes to name a few. Each of these warnings is the optimizer giving you the opportunity to take action and improve performance. Unfortunately, these plan warnings are buried inside the plan cache, and not many people want to spend time mining their plan cache. That sounds like work.

That’s why last year our company (SolarWinds) launched a free tool called SQL Plan Warnings. Mining the plan cache often involves custom scripts and forces you to work with text output only. We wanted to make things easier by providing a graphical interface. A GUI gives the user basic application functionality, things like connecting to more than one instance at a time or filtering results with a few clicks.
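For comparison, here is a rough sketch of the kind of hand-rolled plan cache mining the tool replaces; it simply looks for a Warnings element in the cached plan XML (not the tool's actual logic, just an illustration):

```sql
-- Top cached plans by worker time whose plan XML contains a <Warnings> element
SELECT TOP (20)
    qs.execution_count,
    qs.total_worker_time,
    st.[text],
    qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE CAST(qp.query_plan AS NVARCHAR(MAX)) LIKE N'%<Warnings%'
ORDER BY qs.total_worker_time DESC;
```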

Let me give a quick tour of SQL Plan Warnings.

Connect to an instance

The first thing noteworthy here is how SQL Plan Warnings supports connecting to a variety of flavors of SQL Server. There’s the Earthed (on-premises) version, Azure SQL Database, Azure SQL Database Managed Instance, and Amazon RDS for SQL Server as shown here:

From there you fill in your connection details. The login you choose will need either the VIEW SERVER STATE or SELECT permission for the following DMVs: dm_exec_query_stats, dm_exec_sql_text, and dm_exec_text_query_plan. I’ve provided links to the Microsoft docs for each, so you can review the permissions defined there.

Being able to easily connect to instances of SQL Server, no matter where they are located, is a must-have these days.

SQL Plan Warnings Settings

After you connect to your instance, SQL Plan Warnings will return the top 100 plans, with a default sort by CPU time. However, it is possible after connecting you may see no results. This is likely due to the default settings for SQL Plan Warnings. You get to the settings by clicking on the gear icon in the upper-right corner. Here is what the default settings look like:

If you are not seeing any results, change the default settings and refresh plan analysis. For me, I simply made the change to filter by executions, with 1 as the minimum. This returns a lot of noise, so you need to discover what makes the most sense for your particular instance.

Please note these default settings apply to each connected instance. Think of these settings as the highest level filter for all your connected sources. It may be possible you spend time adjusting these settings frequently, depending on the instance, the workload, and your query tuning goals.

Reviewing the SQL Plan Warnings Results

After plan analysis is complete, you will see a list of warnings found. It should look like this:

Note that a plan can have multiple warnings. So this list could be generated by one or more plans found.

From here we are able to filter on a specific warning type with a simple click. This allows us to narrow our focus. Perhaps today we want to focus on Key and RID lookups. We select that filter, then open the plan:

From here we can zoom and scroll, and view the node that has the lookup warning:

If we select the node, a properties dialogue opens to the right. We also see other warnings included in this plan, if we want or need to investigate those at this time. We also have the ability to download the plan, if desired.

Summary

The SQL Plan Warnings tool is easy to use and allows you to be proactive in optimizing your environment. The GUI allows for quick filtering at the plan cache level as well as on the plan warnings themselves. This lets you focus on the plan warnings with the most impact.

One thing to note is the size of the plan cache you choose to analyze. Instances with larger plan caches (1GB or greater) will have a larger number of plans to parse for warnings, so analysis will take longer.

You can download the SQL Plan Warnings tool here.

Use SQLMap to Connect Directly to Azure SQL Database
https://thomaslarock.com/2020/03/use-sqlmap-to-connect-directly-to-azure-sql-database/
Thu, 12 Mar 2020
I’ve written before about using sqlmap to perform sql injection testing against a website. It is also possible to use sqlmap to connect directly against a database. In this post I will show you how to use sqlmap to connect directly to Azure SQL Database. Once connected you can enumerate objects, open a shell, or run custom SQL injection scripts.

The sqlmap documentation is good, but not perfect. For example, if you go looking for details and examples on how to direct connect to a database you will find the following:

(screenshot of the sqlmap documentation showing the direct connection syntax)

There is no example given for SQL Server, so I assume ‘mssql’ is the correct choice for DBMS. A quick test against my Contoso Clinic website database had me trying the following code (you will need to put in the correct login, password, and server host names should you try to replicate my scenarios):

c:\python38\python.exe .\sqlmap.py --batch --flush-session -d "mssql://login:password@dbserver.database.windows.net:1433/Clinic"

This resulted in an error:

[CRITICAL] SQLAlchemy connection issue ('InterfaceError: (pyodbc.InterfaceError) ('IM002', '[IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified (0) (SQLDriverConnect)')')

At first I focused my attention on the driver, thinking that my Surface laptop was not configured properly. I had just rebuilt the machine a few weeks ago, so it was reasonable to think something was amiss. However, it soon dawned on me that my attention should focus on SQLAlchemy, as that was being used by sqlmap to create the connection. So I decided that I would start running some tests using SQLAlchemy.

Use SQLAlchemy to Connect Directly to Azure SQL Database

Here’s the Python script I used as a first test:

import sqlalchemy as sa

# Uses the pymssql driver (pip install pymssql)
engine = sa.create_engine('mssql+pymssql://login:password@dbserver.database.windows.net:1433/Clinic')

connection = engine.connect()
result = connection.execute("select username from users")
for row in result:
    print("username:", row['username'])
connection.close()

This script threw the same error message, so I considered that to be a sign of progress. Now I set about researching how to connect to Azure SQL Database using SQLAlchemy. A few Google searches later, I arrived at the following syntax for a successful connection:

"mssql+pymssql://login@dbserver:password@dbserver.database.windows.net:1433/Clinic"

I needed to add the @dbserver to the end of the login, and I needed to name a driver explicitly; here I chose pymssql. This syntax allowed me to connect SQLAlchemy to an Azure SQL Database. Now that I was able to make a connection from my laptop, I went back to sqlmap.
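To make the two changes explicit, the working string can be sketched as a small snippet that assembles the URL. The login, password, server, and database names below are placeholders from the examples in this post, not real credentials:

```python
# Sketch: assembling the SQLAlchemy URL for Azure SQL Database.
# All credential values here are placeholders, not real secrets.
login = "login"
password = "password"
server = "dbserver"      # the short server name, without the domain suffix
database = "Clinic"

# Azure SQL Database expects the username in login@server form, and the
# pymssql DBAPI is named explicitly so SQLAlchemy does not fall back to pyodbc.
url = (
    f"mssql+pymssql://{login}@{server}:{password}"
    f"@{server}.database.windows.net:1433/{database}"
)
print(url)
```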

Use SQLMap to Connect Directly to Azure SQL Database

The first thing I tried was the following:

c:\python38\python.exe .\sqlmap.py --batch --flush-session -d "mssql+pymssql://login@dbserver:password@dbserver.database.windows.net:1433/Clinic"

This resulted in the following error:

[CRITICAL] invalid target details, valid syntax is for instance 'mysql://USER:PASSWORD@DBMS_IP:DBMS_PORT/DATABASE_NAME' or 'access://DATABASE_FILEPATH'

Again, I considered this a sign of progress. It was a different error message; here sqlmap was clearly telling me there was a syntax error. Since I had made two changes to the string, I decided to remove one and see if that worked. My next test was the following:

c:\python38\python.exe .\sqlmap.py --batch --flush-session -d "mssql://login@dbserver:password@dbserver.database.windows.net:1433/Clinic"

Success! We are able to create a connection:

[INFO] connection to Microsoft SQL Server server 'dbserver.database.windows.net:1433' established

Summary

Connecting to Azure SQL Database with sqlmap is easy; just remember the login@dbserver format. From there you can enumerate objects, open a shell, or run custom SQL injection scripts. This flexibility makes sqlmap a great tool for penetration testing. I also use sqlmap to test alerts configured with Advanced Threat Protection.
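As a sketch of that follow-on workflow, the full command line can be assembled programmatically. The credentials and server names below are placeholders, and `--tables` and `--sql-shell` are standard sqlmap options for enumerating tables and opening an interactive SQL shell:

```python
# Sketch: building a sqlmap command line that enumerates tables over a
# direct connection. Login, password, and server names are placeholders.
target = "mssql://login@dbserver:password@dbserver.database.windows.net:1433/Clinic"

cmd = [
    "python", "sqlmap.py",
    "--batch",          # run non-interactively, accepting default answers
    "--flush-session",  # discard any previously cached session data
    "-d", target,       # connect directly to the database, no web target
    "--tables",         # enumerate tables; swap for --sql-shell to get a shell
]
print(" ".join(cmd))
```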

The post Use SQLMap to Connect Directly to Azure SQL Database appeared first on Thomas LaRock.

]]>
https://thomaslarock.com/2020/03/use-sqlmap-to-connect-directly-to-azure-sql-database/feed/ 1 19757
Modify SQL Audit for Azure SQL Database https://thomaslarock.com/2020/02/modify-sql-audit-for-azure-sql-database/ https://thomaslarock.com/2020/02/modify-sql-audit-for-azure-sql-database/#comments Tue, 11 Feb 2020 15:45:06 +0000 https://thomaslarock.com/?p=19735 At SQL Server Live last November, I demonstrated enabling SQL Audit for Azure SQL Database. During the class discussion I explained you must use Powershell to modify SQL Audit for Azure SQL Database. So, that’s my post today, showing you how it is done. By default, SQL Audit for Azure SQL Database will enable the ... Read more

The post Modify SQL Audit for Azure SQL Database appeared first on Thomas LaRock.

]]>
At SQL Server Live last November, I demonstrated enabling SQL Audit for Azure SQL Database. During the class discussion I explained that you must use PowerShell to modify SQL Audit for Azure SQL Database. So that’s my post today, showing you how it is done.

By default, SQL Audit for Azure SQL Database will enable the following:

SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP
FAILED_DATABASE_AUTHENTICATION_GROUP
BATCH_COMPLETED_GROUP

If you want to alter that list, you must use PowerShell. There is no GUI available. (If you connect to Azure SQL Database with SQL Server Management Studio v18.4, you will notice there is no option for Audit Specifications. I believe this should be possible at some point, so feel free to go upvote this suggestion.)

Using Set-AzSqlServerAudit

Let’s look at how to enable the DATABASE_PERMISSION_CHANGE_GROUP audit action group. I chose that action group for two reasons. First, it’s part of the list I recommend for anyone using SQL Audit along with Security Event Manager. Second, I was curious to track activity around granting UNMASK when using Dynamic Data Masking.

Here’s some sample code that I used to add the DATABASE_PERMISSION_CHANGE_GROUP audit action group. Note that Set-AzSqlServerAudit replaces the entire list of audit action groups, so the three defaults must be restated alongside the new group:

Set-AzSqlServerAudit -ResourceGroupName RGname -ServerName Server -AuditActionGroup SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP, FAILED_DATABASE_AUTHENTICATION_GROUP, BATCH_COMPLETED_GROUP, DATABASE_PERMISSION_CHANGE_GROUP

You can then use Get-AzSqlServerAudit to verify the change.

Viewing Audit Logs with Log Analytics

To test that the activity is captured, I granted and revoked UNMASK for a user. I’m pushing the audit logs to Log Analytics, which returns the rows as expected.
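For reference, a query along these lines can surface those events in Log Analytics. This is a hedged sketch: Azure SQL audit logs typically land in the `AzureDiagnostics` table under the `SQLSecurityAuditEvents` category, but the exact column names can vary by schema version, so treat the field names here as assumptions to verify against your workspace:

```kusto
AzureDiagnostics
| where Category == "SQLSecurityAuditEvents"
| where action_name_s has "PERMISSION"
| project TimeGenerated, server_principal_name_s, statement_s
| order by TimeGenerated desc
```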

From there we can build rules and alerts as needed.

Summary

I have been an advocate of SQL Audit for years, and I was happy to see it added to Azure a while back. However, to modify SQL Audit for Azure SQL Database you must use PowerShell. I’m hopeful Microsoft will get this functionality into SSMS in the near future.

Karen López and I will be delivering a full training day at SQL Konferenz in March. The title of our session is Advanced Data Protection: Security and Privacy Assessments in SQL Server. The above is a sample of the updated content Karen and I will be sharing. If you are in or around Darmstadt on the 3rd of March, we’d love to see you in our class.

REFERENCES:

https://docs.microsoft.com/en-us/sql/relational-databases/security/auditing/sql-server-audit-action-groups-and-actions?view=sql-server-ver15
https://docs.microsoft.com/en-us/powershell/module/az.sql/set-azsqlserveraudit?view=azps-3.4.0
https://docs.microsoft.com/en-us/sql/relational-databases/security/dynamic-data-masking?view=sql-server-ver15

The post Modify SQL Audit for Azure SQL Database appeared first on Thomas LaRock.

]]>
https://thomaslarock.com/2020/02/modify-sql-audit-for-azure-sql-database/feed/ 1 19735