I hate database triggers.
Whew. It feels good to get that out in the open. I have been holding that in for almost ten years. It got harder to admit after I became a DBA, and as time went by it started to feel as if I was trapped in a story by Edgar Allan Poe. But now my secret is out, and I want to tell everyone about triggers, why I hate them, and why you should be wary of them as well.
As I mentioned before, I used to develop software for a living. An actual software company, with a proper SDLC, actual Release Notes, the whole nine yards. And part of our application design was the use of database triggers to enforce business logic. Why? I don’t know; that decision was made a long time before I showed up. Truth be told, I did nothing to stop the horror. In fact, I went along with it, thinking it was the right thing to do. If only I knew then what I know now, I would have fed everyone a steady diet of Jaeger and bacon until they passed out, then simply redesigned everything until they awoke from their dietary comas, at which point they would have looked at me and simply said “Thank You”.
I am sure that at some point in the distant future someone will read these words and say to themselves, “What is wrong with database triggers? I use them ALL the time!” All I can say is this: I am very happy for you. Honestly, I think that is great, and the two of you will live a wonderful life together. However, most of the rest of us would rather eat shards of broken glass than create even one trigger in our databases.
I lost track of the number of hours we spent researching “bugs” in the application code at that company, only to find that there was a trigger involved. I still remember the times when we would finally realize that a trigger was the culprit and we would start to mutter under our breath “I hate triggers”. Yet we would continue to use them. I guess because by then the train had left the station and we just felt like we had no choice but to go along for the ride. My best guess is that the whole idea of triggers in a database comes from the twisted mind of the Mad Hatter, but no one knows for certain the true twisted origins of these vile objects.
Think about this for a minute. There we were, developing code and a database design that had triggers, and the triggers were almost always the last thing we would think about when staring at data and going “WTF?” So, if triggers were always such a pain in the arse for us, how do you think they are for people who almost never use them? Sure, I am not the brightest bulb on the tree, and you could easily dismiss my frustrations as the result of an ID10T error, but how do you explain the rest of the developers there? In my opinion it was clear that relying on triggers to enforce business logic was not the best of practices.
Sure, triggers are a tool, and I know some people love them. And I also understand that there are some cases where triggers are the right tool (or only tool) for some particular job or function. But the idea that I would look to use a trigger to fill a particular need strikes me as similar to what a sailor thinks about while on shore leave in Hong Kong. Sure, you are going to get the job done, but no way would you feel good about yourself or want to tell your friends about that trigger. And before you start to tell me about how you rely on triggers to ensure referential integrity let me stop you and point out that primary and foreign keys also perform that functionality. What’s that? Oh, you need to keep your integrity across databases? Fine, go ahead, use a trigger, and good luck to the people who maintain that database when they try to understand who the hell is manipulating their data.
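To make that concrete, here is a minimal sketch of declarative referential integrity doing the job a trigger would otherwise be written for. Every table and column name here is hypothetical, invented only for illustration:

```sql
-- Hypothetical schema: the foreign key enforces the relationship declaratively,
-- so there is no hidden trigger logic to hunt down later.
CREATE TABLE dbo.Customers
(
    CustomerID   int           NOT NULL CONSTRAINT PK_Customers PRIMARY KEY,
    CustomerName nvarchar(100) NOT NULL
);

CREATE TABLE dbo.Orders
(
    OrderID    int       NOT NULL CONSTRAINT PK_Orders PRIMARY KEY,
    CustomerID int       NOT NULL
        CONSTRAINT FK_Orders_Customers FOREIGN KEY REFERENCES dbo.Customers (CustomerID),
    OrderDate  datetime2 NOT NULL CONSTRAINT DF_Orders_OrderDate DEFAULT (SYSUTCDATETIME())
);

-- The engine now rejects orphaned rows on its own:
-- INSERT INTO dbo.Orders (OrderID, CustomerID) VALUES (1, 999);  -- fails if customer 999 does not exist
```

And unlike a trigger, that foreign key shows up in every E-R diagramming tool I have ever used.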
So, what is my real problem with triggers, besides the idea that I could need shots of penicillin after creating them? They are impossible to track and troubleshoot. I have used many tools to create basic E-R diagrams in my life. Do you know how many of them would contain primary and foreign key relationships? All of them. Do you know how many contained triggers? None. Sure, some software will let you know a trigger exists on a table in an E-R diagram, but they do not give you the full story, and it just makes it harder to figure out the real picture.
And why should it be so hard for us to troubleshoot them? Just thinking about how much time I have lost tracking down mysterious data changes makes me want to reach into SSMS and choke the Database Triggers directory. Here is a thought: why not have a system view that would allow me to immediately see the data that has been modified by a trigger? I am not saying to show me a list of triggers, because we have already seen that just knowing they exist is not enough. No, I am proposing that every table in the database have a column attached that shows us whether the last update on a row was the result of a trigger action.
Sounds similar to auditing, right? Perhaps. And perhaps such a view already exists, but I have not seen one. It would seem to me that this would be simple to track. Maybe it would make more sense to have the column created only when a trigger is defined on a table. That way, when I change some application code and examine the data after pressing a few buttons, I would be able to see that the change was ultimately the result of a trigger action, and I could stop ripping apart my application code wondering why the data was altered unexpectedly. But since I am not aware of a way to see the end result of a trigger action, I hate triggers.
Almost as much as I hate NULLs.
Don’t get me started on those bastards.
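As a partial workaround for the system view I am wishing for above, the closest thing I can sketch is a convention like the one below: when a trigger rewrites data, it also stamps a tracking column so the row itself confesses. Every object and column name here is hypothetical, and it only helps on tables where someone remembers to wire it up:

```sql
-- Hypothetical convention: when a trigger rewrites data to enforce a business rule,
-- it also stamps a tracking column, so rows touched by trigger code identify themselves.
ALTER TABLE dbo.Invoices ADD LastChangeSource sysname NULL;
GO

CREATE TRIGGER dbo.trg_Invoices_EnforceDueDate
ON dbo.Invoices
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Made-up business rule: a due date may never precede the invoice date.
    -- Any row the trigger corrects is flagged as trigger-modified.
    -- (With the default RECURSIVE_TRIGGERS setting of OFF, the UPDATE below
    --  does not re-fire this trigger.)
    UPDATE v
    SET    v.DueDate          = v.InvoiceDate,
           v.LastChangeSource = 'trg_Invoices_EnforceDueDate'
    FROM   dbo.Invoices AS v
    JOIN   inserted     AS i ON i.InvoiceID = v.InvoiceID
    WHERE  v.DueDate < v.InvoiceDate;
END;
```

It is nowhere near the built-in visibility I want, but at least it leaves a fingerprint.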
Triggers are great for writing articles about (having done that). They are also great for retrofitting a database with auditing when you have to have auditing, you can’t (aren’t allowed to) make any other changes, and you aren’t allowed to (can’t afford to) purchase another tool for the same purpose. But in that case you accept the trade-off of the problems you’ve mentioned above in order to get the auditing you want.
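For anyone who hasn’t had to do that retrofit, a bare-bones version looks something like the sketch below. The names are invented, and a real audit trigger would cover more columns and inserts as well:

```sql
-- Bare-bones audit retrofit (all names hypothetical): capture prior values and who
-- changed them, without touching the application or buying a separate auditing tool.
CREATE TABLE dbo.Products_Audit
(
    AuditID   int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    ProductID int               NOT NULL,
    OldPrice  decimal(10, 2)    NULL,
    ChangedAt datetime2         NOT NULL DEFAULT (SYSUTCDATETIME()),
    ChangedBy sysname           NOT NULL DEFAULT (SUSER_SNAME())
);
GO

CREATE TRIGGER dbo.trg_Products_Audit
ON dbo.Products
AFTER UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- "deleted" holds the pre-change image of every affected row.
    INSERT INTO dbo.Products_Audit (ProductID, OldPrice)
    SELECT d.ProductID, d.Price
    FROM   deleted AS d;
END;
```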
Otherwise, I’ve advised my developers that if you want that kind of auditing, etc., build it into the stored procedures or into the application logic so it’s visible when troubleshooting, as you’ve mentioned.
But you need to write that post on NULLs.
Amen brother!
We’ve actually started to move to having triggers use their own UserID when making changes to data. Our database already has Last Change and Creation tracking columns, as well as audit tables (the only valid reason I’ve ever really seen for using triggers). If a trigger blames its change on User A, you have no way of knowing what really caused the data to change. However, if the trigger blames itself, you can always backtrack to the real change that fired the trigger.
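Stripped down, that convention looks something like this (every name here is invented for illustration): when the trigger touches another table, it records its own name in the tracking columns rather than the session user.

```sql
-- Hypothetical sketch: a child-table trigger that updates the parent row
-- identifies itself in the tracking columns instead of the session user,
-- so troubleshooting can tell the change was trigger-driven.
CREATE TRIGGER dbo.trg_OrderLines_TouchParent
ON dbo.OrderLines
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    UPDATE o
    SET    o.LastChangedAt = SYSUTCDATETIME(),
           o.LastChangedBy = 'trg_OrderLines_TouchParent'   -- the trigger "blames itself"
    FROM   dbo.Orders AS o
    WHERE  o.OrderID IN (SELECT OrderID FROM inserted
                         UNION
                         SELECT OrderID FROM deleted);
END;
```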
That is similar to some of the workarounds I have seen (or done) in the past in order to help troubleshoot data issues.
The fact that you and your former colleagues did not know how to properly write or use triggers does not mean triggers are bad. If you were using triggers to enforce referential integrity that indicates a fault in either your database design or your architecture. Quit blaming the tool.
The fact that triggers were “always the last thing we would think about” when tracking bugs indicates just how rarely triggers were the cause of your bugs.
I find more bugs in stored procedures than triggers. Should we stop using those as well?
Thanks for your reply. You are right, it is not triggers that are bad per se. It is more about the fact that they are difficult to work with, especially with the graphical tools currently at our disposal.
We were unit testing code and would not see the expected results because of triggers that were being used to enforce business logic. Triggers are often a hidden layer of code that can be difficult to think about when you are unit testing application code.
I’m not sure that our triggers were badly written, or that they were ever the cause of a bug, and if that was the message you got from my blog post then I apologize for not being able to make things clearer the first time around. The point I wanted to make is that they can be a forgotten piece of logic when you are doing unit testing.
You can decide for yourself about stored procedures. I happen to know a handful of people who would prefer to use nothing but dynamic SQL in web pages. Everyone has their own comfort zone, I suppose.
This is a little old, but still: yes, in SQL Server foreign keys are my first choice, when I can use them (the qualifier being “when”). I sometimes use an ORM, which at times also brings all sorts of impedance mismatches. When I have multiple paths that can cascade, I try to simplify, but when I cannot, triggers are the way. As for the whole changing-of-hands concern, I could use MySQL, which is a little more “full featured” when it comes to multiple cascading paths, but depending on the version you have to be careful that it is not just “hiding” the issue. At least in SQL Server, triggers are the way to ensure deletes are cascaded properly. Updates are a whole OTHER enchilada deserving of a post unto themselves. Can I get a witness? It is handy to have a sane migration code base. Yes, I said code base; I make an effort to unit test rigorously and frequently, and to integrate continuously, locally and in a test box if I can, before going to production.
Thanks for the comment!
In my entire career I have never had to program a trigger in SQL Server (and I have over 20 years of experience working with this database). I have never been on a project that required them.
That being said, I don’t know if you would call them “awful,” but they do add a layer of ambiguity to application development. When something goes wrong, most people review the most obvious areas of concern, which, if it is a database issue, also means reviewing how the database in question is accessed. Even if all of that review is done well, it doesn’t necessarily mean that the troubleshooters will find a trigger attached to a table that is causing the issue. This doesn’t make triggers a poor tool of choice, merely something that has to be considered when doing such reviews.
I have always disliked ambiguity in my development since it doesn’t make it very easy for less experienced personnel to find issues such as this. This is probably one of the reasons I prefer VB.NET to C#, though I am fluent in both.
Thanks for the comment Steve. You are correct, it’s the extra layer of ambiguity. In my shop I was an advocate for not adding in such layers unless absolutely necessary.
I agree with all of the reasons why triggers are bad. I do use them for auditing. But I also have a real use case where I have found them invaluable, and I will continue to use them. I have had many engagements over the years where the client (i.e., the power user) is in direct control of their database, and as such they reach in and perform data modifications outside of the application we develop (you can only admonish them so much). Because of this I use triggers to keep denormalized values up to date, specifically balances: on-hand inventory with respect to inventory transactions, or customer balances with respect to AR transactions. If they decide to delete a transaction (or a thousand), I don’t want the headache of having to figure out why they are now telling me that the inventory valuation report is wrong.
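Stripped to its essentials, that balance trigger looks something like the sketch below. The inventory schema is invented for illustration, and a production version would also need to think about concurrency and performance:

```sql
-- Rough sketch (hypothetical schema): keep Items.QtyOnHand in sync with the
-- transaction table, even when rows are inserted, updated, or deleted by hand.
CREATE TRIGGER dbo.trg_InventoryTransactions_SyncOnHand
ON dbo.InventoryTransactions
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Recompute the balance for every item touched by this statement.
    WITH touched AS
    (
        SELECT ItemID FROM inserted
        UNION
        SELECT ItemID FROM deleted
    )
    UPDATE i
    SET    i.QtyOnHand = ISNULL(t.TotalQty, 0)
    FROM   dbo.Items AS i
    JOIN   touched   AS tc ON tc.ItemID = i.ItemID
    OUTER APPLY (SELECT SUM(tr.Quantity) AS TotalQty
                 FROM   dbo.InventoryTransactions AS tr
                 WHERE  tr.ItemID = i.ItemID) AS t;
END;
```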
Thanks for the comment Bill. Triggers do have a place, I just wish that our tools made it easier to understand that they were in use.
Try using the Redgate SQL search tool.
If you have a table where the values are being changed, a quick search with the Redgate search tool will indicate everywhere that the table is referenced (including all triggers, procedures, functions and views.)
Makes part of your complaint about being unable to see what’s manipulating your data void.
The rest I can understand 🙂
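And if a third-party tool isn’t an option, the built-in catalog views can approximate the same kind of search. A rough, text-match-based sketch (the table name is just an example, and plain text matching can produce false hits from comments or similarly named objects):

```sql
-- Find every trigger, procedure, function, or view whose definition mentions a table.
SELECT o.type_desc,
       SCHEMA_NAME(o.schema_id) AS schema_name,
       o.name                   AS object_name
FROM   sys.sql_modules AS m
JOIN   sys.objects     AS o ON o.object_id = m.object_id
WHERE  m.definition LIKE '%Orders%'   -- the table being investigated
ORDER  BY o.type_desc, o.name;
```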
Ha! Thanks for the comment, I didn’t know about that tool. I will have to take a look the next time I have a chance.
If you’re using triggers, you’re probably doing it wrong.
That’s what I’ve come to know. The more I’ve worked in this field, the more I’ve learned to use my bad-code divining rods and there’s no surer way to make them cross than to come across some triggers (‘cept maybe some cursors).
If you’re using triggers to enforce referential integrity, you’re doing it wrong.
If you’re using triggers to ensure ‘power users’ don’t screw up the data, you’re WAY doing it wrong.
Use stored procedures, roles, and permissions to ensure that ‘power users’ don’t screw up the data. Use foreign keys and good schema design to enforce referential integrity.
Use Triggers to ride off into the sunset.
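For completeness, a minimal sketch of the roles-and-permissions half of that advice. The role and procedure names are hypothetical, and it leans on ownership chaining, so the procedure and the tables it touches must share an owner:

```sql
-- Hypothetical setup: power users can call sanctioned procedures
-- but cannot modify the base tables directly.
CREATE ROLE PowerUsers;

-- No direct DML against anything in the dbo schema...
DENY INSERT, UPDATE, DELETE ON SCHEMA::dbo TO PowerUsers;

-- ...only the sanctioned stored procedure, which owns the business logic.
-- Ownership chaining lets the procedure write to the tables even though
-- the role is denied direct access to them.
GRANT EXECUTE ON OBJECT::dbo.usp_AdjustInventory TO PowerUsers;
```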
Thanks!