Day two was a little shorter than day one, and we went through demos covering several of the new features of SS2K5 in addition to our continuing discussion about upgrading. We covered SQL Agent job migration as well as the Big Kahuna, DTS package migration, and even talked about the new security features of SS2K5. Throw in some database mirroring, Notification Services (SSNS), and replication, and I am starting to wonder how we ever managed to finish ahead of schedule!
Upgrading SQL Agent jobs seems fairly easy. There is an SSIS 'Transfer Jobs' task that works quite nicely, or one could simply script out existing jobs and then run that script against the SS2K5 instance. The issues raised were few:
- To output a job step log file you must be a member of the sysadmin fixed server role; otherwise, you can write your output to a history table in the msdb database instead.
- SQL Server Agent can no longer connect to the instance with a SQL login; it must use a Windows account.
- The format of job step tokens has changed from [token_name] to $(token_name); there is a quick sketch of the new syntax right after this list. Uh, have I been asleep for five years? Job step tokens? Wow, wish I had known about them before today. Not sure how exactly they would have benefited me, but I am certain I can find a use for them in the future.
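For the curious, here is a minimal sketch of a T-SQL job step using the new token syntax. STRTDT and STRTTM are real Agent tokens for the job start date and time; the database name and backup path are made up for illustration.

```sql
-- SQL Agent replaces the $(token) references in the command text
-- before the step runs; under SS2K0 these were written as [STRTDT].
PRINT 'Job started on $(STRTDT) at $(STRTTM)';
BACKUP DATABASE SomeAppDB
    TO DISK = N'D:\Backups\SomeAppDB_$(STRTDT).bak';
```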
I know what you are saying: what about maintenance plans? Unfortunately, maintenance plans are still included in SS2K5. I have grown to dislike these things more and more over the years, but the new versions do look better. No idea how they will perform over the long haul, but they could not do much worse than their predecessors. If necessary, one would be best advised to recreate any such plans by hand in SS2K5, though there are some scripts available that could make the migration easier. Even the scripts require some manual intervention, though, and in the end it might save time to just start over.
We then spent some time discussing the new security features in SS2K5. Why? I am not sure, because there is very little that needs to be addressed as part of an upgrade. I think the time was spent to demonstrate how the security model is now structured around principals, securables, and permissions. I heard many remarks about how the concept of schemas is similar to Oracle. Again, there are only a few things that you might want to do before an upgrade. One, which I mentioned already, is to rename existing procs to use a two-part name. Another might be to assign a default schema to your developers (perhaps dbo). We also discussed some of the other new features, such as password policy enforcement and specific execution context. I think it will be a while before we start developing systems that take advantage of schemas for security. To be honest, I see little difference in security between schemas and database roles.
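Neither of those last two features needs anything done for the upgrade itself, but for reference, here is roughly what they look like in T-SQL. The login, proc, and table names are invented for illustration.

```sql
-- A SQL login that inherits the Windows password policy (new in SS2K5).
CREATE LOGIN WebAppUser
    WITH PASSWORD = 'Str0ng!Passw0rd',
    CHECK_POLICY = ON,
    CHECK_EXPIRATION = ON;
GO

-- A proc that runs under the owner's context instead of the caller's.
CREATE PROCEDURE dbo.GetOrders
    WITH EXECUTE AS OWNER
AS
    SELECT OrderID, OrderDate FROM dbo.Orders;
GO
```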
In the example they showed, there is one table named Sales.Orders that needs to be viewed by two groups (Sales and Accounting). OK then, if you assign that table to that schema, then only those two groups would be able to see it. But if the table were named dbo.Orders, and I had two Windows logins, one for Sales and one for Accounting, and only they were granted SELECT permissions, would that not be the same? I am sure there is a difference, but I have yet to see a good example of why a schema is better for security than groups, roles, and explicit permissions. For whatever reason, people seem to gloss over such details and simply tell me it is better. I keep trying to tell them I am from Missouri, but they do not listen.
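To make the comparison concrete, here is the schema-based grant next to the role-based grant as I understand them. The group and role names are made up, and each would have to exist as a database principal first.

```sql
-- Option 1: grant at the schema level; covers every object in Sales,
-- including any tables added to the schema later.
GRANT SELECT ON SCHEMA::Sales TO SalesGroup, AccountingGroup;

-- Option 2: what we do today; grant on the object to roles.
GRANT SELECT ON dbo.Orders TO SalesRole, AccountingRole;
```

The one practical difference I can point to is that the schema grant automatically picks up new objects added to the schema later, which is convenient, but to my eye that is manageability rather than security.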
We then dived into something that will be front and center for our group very soon: upgrading DTS packages to SSIS. One of the biggest changes people will need to get used to is that in order to develop an SSIS package they will need to use the Business Intelligence Development Studio (BI Studio). As DBAs, we can connect to Integration Services in SSMS, but there is no way to edit a package through SSMS; you must edit it in BI Studio.
It is possible to run an existing DTS package on an SS2K5 instance, but it is not possible to design one there out of the box. To design a DTS package on an SS2K5 instance, one would have to download the 'Feature Pack' and install it on their machine. I believe that would allow a person to use SSMS to design packages, but we are not going to be installing SSMS as a rule. So, perhaps the best way to 'force' people to upgrade to the new tools is to make it difficult for them to work with the old ones. There is an easy way to run the old packages, though: BI Studio includes an 'Execute DTS 2000 Package' task. You can point it at an existing DTS package and run it, and it will make its connections as expected. What I envision at this point is for us to simply tell people to start saving their packages as flat files on their app servers in order to avoid all of these headaches. It really would make life easier for everyone. And, along the way, they could start using BI Studio to upgrade their existing packages to take advantage of some new features.
So, how to migrate from DTS to SSIS? Well, it is not always easy, and not always hard. We can use the Upgrade Advisor to analyze existing DTS packages. The three big issues with an existing package would be:
- Dynamic properties will need to be reset; they will not carry forward to SS2K5.
- Any transaction settings (e.g., batch size) will need to be reset.
- UDL files are not supported.
Oops, and I know of several people who have been using UDL files.
Now, we can perform a migration on an existing DTS package. But if any issue is found, then the migration is not 'successful'. By that I mean the package simply gets wrapped in an 'Execute DTS 2000 Package' task in SSIS. Even if you save the DTS package to a flat file and try to import and migrate it, you get the same result. What that means is you will need to recreate the package by hand. So, it may be a while before we can upgrade all of our DTS packages, but we have what I believe to be an even bigger nightmare on our hands.
I would estimate that we have thousands of DTS packages right now. The nightmare is simple: I also estimate that we have hundreds of packages that are no longer being used, or are no longer useful. What are the chances that our app teams know the current status of each and every package they have ever constructed? I know that many of our teams have been documenting as much of their systems as possible. But I feel that DTS is one of those things where you simply have no idea what is working and where until something breaks and you need to find it. Perhaps this would be a good time to start fresh and force someone to keep track of what is where and who owns it.
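A first pass at that inventory could come straight out of msdb on the SQL 2000 side. sysdtspackages stores one row per saved version of each package, so the query below lists the newest versions first; the rest of the detective work is manual.

```sql
-- Rough inventory of stored DTS packages on a SQL 2000 instance.
SELECT name, owner, createdate, description
FROM msdb.dbo.sysdtspackages
ORDER BY name, createdate DESC;
```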
We then spent some time looking at database mirroring. I have seen it before, but never in the RTM version of SS2K5. It is fairly easy to set up and get running. And the Witness server does not have to be anything special; I believe it can simply be running SQL Express in order to function. I also believe it can be the witness for more than one Principal and Mirror pair, but I am not certain. With database mirroring, the failover time is measured in seconds, as opposed to minutes with a cluster. It seems fairly fast and is very cheap. The only downside I see is that if word gets out, everyone will want one, and with over a thousand databases to keep track of right now, this could become a nightmare. I did have one thought: for systems that want a refresh to test every day, could we set up a mirror instead? How about a combination of a mirror and a snapshot?
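If the mirror-plus-snapshot idea pans out, the moving parts are small. This is only a sketch: the server and database names are made up, both partners would need mirroring endpoints configured first, the database must be in full recovery, and database snapshots require Enterprise Edition.

```sql
-- On the mirror server: point the mirror back at the principal.
ALTER DATABASE SomeAppDB
    SET PARTNER = 'TCP://principal.ourdomain.com:5022';

-- On the principal: complete the pairing and add the witness.
ALTER DATABASE SomeAppDB
    SET PARTNER = 'TCP://mirror.ourdomain.com:5022';
ALTER DATABASE SomeAppDB
    SET WITNESS = 'TCP://witness.ourdomain.com:5022';

-- On the mirror: a snapshot gives testers a readable, point-in-time
-- copy without disturbing the mirror itself.
CREATE DATABASE SomeAppDB_Test
    ON (NAME = SomeAppDB_Data,
        FILENAME = 'D:\Snapshots\SomeAppDB_Test.ss')
    AS SNAPSHOT OF SomeAppDB;
```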
Notification Services was next, with a brief demo of setting up a notification whenever a 'CREATE TABLE' statement was issued. One voice spoke up from the back and asked why they would need to continue to pay Lumigent for Entegra when they could do it all themselves with Service Broker and Notification Services. I wanted to ask why they paid through the nose for Entegra when they could have already built something similar to our DBA_Perform themselves for free, but I held back. No need to be nasty as class was about to end.
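For what it is worth, the plumbing under that demo boils down to a single event notification, with SSNS handling the alerting on top. The notification and service names here are invented, and the matching Service Broker queue and service would have to exist first.

```sql
-- Drop a Service Broker message on a queue whenever someone issues
-- CREATE TABLE in this database; a subscriber turns it into an alert.
CREATE EVENT NOTIFICATION NotifyCreateTable
    ON DATABASE
    FOR CREATE_TABLE
    TO SERVICE 'AuditService', 'current database';
```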
We looked at partitioned tables, and even stepped through replication, but by then I was spent. I was still trying to wrap my noodle around all of the DTS headaches we are about to have in a short period of time. Plus I was itching to brag about DBA_Perform, so I filled out my evaluation and got out of Dodge quick. I would definitely say that it was worth my time, and I feel very good about how we are going about our current upgrade plans for SS2K5. I was quite comfortable with all of the materials presented, and that is a direct result of my having been involved with the product for the past eight to ten months, especially through PASS.
That is more than enough. If you have read this far, you should consider getting a hobby.