Final Thoughts From Spain

Today was the final day of the conference, and I am looking forward to spending the next day and a half touring the city instead of listening to more talks about the SS2K5 Service Broker. I attended four more sessions today, and managed to take away quite a bit of good content, even more than yesterday.

The first session was on SQLCMD, which is being phased in as an eventual replacement for osql. SQLCMD looks quite useful on the surface, but when you start to drill deeper you begin to see some functionality that simply is not there. Don’t get me wrong, I am looking forward to using SQLCMD to replace some of our current osql usage, but I wish it could do more. It is as if Microsoft wanted to expand osql, even giving us a way to run SQLCMD from within SSMS (SQL Server Management Studio), which is quite nice. But what I cannot do is use SQLCMD from within my dynamic SQL. Sure, I could have the dynamic SQL call xp_cmdshell to execute SQLCMD.exe, but what is the point? I suppose the point would be that osql will eventually be going away, but is that a good enough reason for me to rewrite things, given that osql won’t be going anywhere until the next version of SQL Server (in 2010)?
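For what it is worth, that workaround looks something like the sketch below. This is only to show the shape of it; the server name, query, and output path are placeholders, not anything from our environment.

    -- Shelling out to SQLCMD.exe via xp_cmdshell (placeholder server, query, and path)
    DECLARE @cmd varchar(1000)
    SET @cmd = 'sqlcmd -S MYSERVER -E -Q "SELECT name FROM sys.databases" -o C:\temp\dblist.txt'
    EXEC master.dbo.xp_cmdshell @cmd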

What I would really like to do is put together a stored procedure that could make use of SQLCMD functionality. From what I can tell, that is simply not possible. Our current method of choice is stored procs (which use cursors to build and execute dynamic SQL through xp_cmdshell and osql) called from SQL Server Agent jobs. The only real reason I would want to use SQLCMD is to get away from xp_cmdshell, since I believe I should practice what I preach. But if I cannot use SQLCMD in a stored procedure, then I will need to rethink some of the underlying structure of our in-house monitoring system. I know I can use SQLCMD in other ways to get the job done, and the work I will be doing will lay the foundation for other things as well, not to mention it will let me become familiar with something new, which is always a good thing. In other words…I enjoyed the talk and the demos, and I hope to have something in place in development by the end of next week.
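To make that concrete, the pattern we use today looks roughly like the following. The table of monitored servers and the procedure it calls are made-up names, not our actual objects.

    -- Cursor over a (hypothetical) list of monitored servers, building an osql
    -- command for each one and running it through xp_cmdshell
    DECLARE @server sysname, @cmd varchar(2000)

    DECLARE server_cur CURSOR FAST_FORWARD FOR
        SELECT server_name FROM dbo.monitored_servers   -- hypothetical table

    OPEN server_cur
    FETCH NEXT FROM server_cur INTO @server
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @cmd = 'osql -S ' + @server + ' -E -Q "EXEC dbo.usp_collect_health_data"'   -- made-up proc name
        EXEC master.dbo.xp_cmdshell @cmd
        FETCH NEXT FROM server_cur INTO @server
    END
    CLOSE server_cur
    DEALLOCATE server_cur

Wrap that in a SQL Agent job and you have our monitoring system in a nutshell; the SQLCMD question is really whether any of those moving parts can go away.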

The second talk of the day was about the maintenance and monitoring enhancements in SS2K5. More good stuff here, and I learned that in France it is pronounced “dee-bee-ah”, which sounds so cool when you have a real French accent. The new maintenance plans did little to impress me, but they did look nice. And I did like how you can create a new plan by using some SSIS tasks, but I do not see us using the new plans in the near future.

The monitoring enhancements were quite exciting. The new Profiler is cool, right down to the graphical representation of a deadlock and the ability to capture the showplan for every statement in the trace. Also, there is a must-read on database tuning with the new dynamic management views at http://download.microsoft.com/download/1/3/4/134644fd-05ad-4ee8-8b5a-0aed1c18a31e/TShootPerfProbs.doc. Well, I was told it is a must-read, but I have not had a chance to look at it yet, for obvious reasons. Chances are, though, that we will find some very useful ideas in there to make our environment even more stable.
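I have not dug into the dynamic management views yet either, but just to give a flavor of the kind of thing they expose, a query along these lines should surface the heaviest statements sitting in the plan cache (the TOP 10 and the ordering by reads are just one way to slice it):

    -- Ten statements with the most logical reads since their plans were cached
    SELECT TOP 10
        qs.total_logical_reads,
        qs.execution_count,
        st.text AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.total_logical_reads DESC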

There is also a Database Tuning Advisor (DTA) that replaces the Index Tuning Wizard. I am very interested in using this advisor, but there is one small problem: neither the DTA nor Profiler offers a way to schedule a run at some point in the future. There is a stop time (like the current Profiler), but no start time. With Profiler, I believe you will still have to script out the trace and put it into a SQL Agent job if you want it to start at a specific time. With the DTA, I do not believe any scripting is available at all. Why has no one thought to put such functionality into Profiler and/or the DTA? Am I the only one who works for a company that runs batch processing in the middle of the night? Or am I the only one who likes to sleep at night, and dislikes having to get up to kick off a trace? What am I missing here? If we can script out the Profiler trace, why has no one at Microsoft given us a button to create the job for us? How hard is that? They did it for DTS, which had to call DTSRUN.exe; is it really that much harder to kick off a trace?
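For now, scripting the trace and letting the SQL Agent job’s schedule serve as the missing start time is the best option I can see. A minimal sketch of what that scripted trace looks like is below; the file path, stop time, and choice of events are placeholders, and the job itself would simply run this script as its only step.

    -- Server-side trace definition: the stop time is built in, but the start time
    -- comes from whatever schedules this script (e.g., a SQL Agent job)
    DECLARE @traceid int, @maxsize bigint, @stoptime datetime, @on bit
    SET @maxsize = 100                              -- max trace file size in MB (placeholder)
    SET @stoptime = DATEADD(hour, 6, GETDATE())     -- placeholder: stop six hours from now
    SET @on = 1

    EXEC sp_trace_create @traceid OUTPUT, 0, N'C:\traces\nightly_batch', @maxsize, @stoptime
    EXEC sp_trace_setevent @traceid, 12, 1, @on     -- SQL:BatchCompleted, TextData
    EXEC sp_trace_setevent @traceid, 12, 13, @on    -- SQL:BatchCompleted, Duration
    EXEC sp_trace_setstatus @traceid, 1             -- start the trace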

The next session was on Report Builder. I know very little about Reporting Services, but I can now say I know a little bit more. Report Builder looks cool, right down to the creation of a new object model that business users can then build their own reports against. Honestly, if we could get people onto Report Builder, and away from MS Access (or Excel), life would be wonderful. The drilldown capability is very cool too, especially since it does not need an OLAP cube to get at the details. I was impressed, to say the least, but then again I stop to look at shiny things all the time.

The last talk was about the new Service Broker. I am interested in learning more about this functionality, but I am not certain how it could best be applied just yet. I figure if I play with it for a while I should be able to find a way for it to help us, either now or down the road. So, I need some very basic, practical example to try it out on. As soon as I can think of one, I’ll let you know, but it is hard for me to keep my focus on any one thing for an extended period of time, not with all of these shiny objects around me.
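In the meantime, just to have something concrete to poke at when I get back, a bare-bones “hello” conversation inside a single database looks roughly like this (all of the object names are made up, and it assumes Service Broker is enabled on the database):

    -- Minimal Service Broker round trip within one database (illustrative names only)
    CREATE MESSAGE TYPE [//demo/HelloMessage] VALIDATION = WELL_FORMED_XML;
    CREATE CONTRACT [//demo/HelloContract] ([//demo/HelloMessage] SENT BY INITIATOR);
    CREATE QUEUE dbo.HelloInitiatorQueue;
    CREATE QUEUE dbo.HelloTargetQueue;
    CREATE SERVICE [//demo/HelloInitiator] ON QUEUE dbo.HelloInitiatorQueue;
    CREATE SERVICE [//demo/HelloTarget] ON QUEUE dbo.HelloTargetQueue ([//demo/HelloContract]);
    GO

    DECLARE @handle uniqueidentifier;
    BEGIN DIALOG CONVERSATION @handle
        FROM SERVICE [//demo/HelloInitiator]
        TO SERVICE '//demo/HelloTarget'
        ON CONTRACT [//demo/HelloContract]
        WITH ENCRYPTION = OFF;
    SEND ON CONVERSATION @handle
        MESSAGE TYPE [//demo/HelloMessage] (N'<hello>from Spain</hello>');
    GO

    -- The message sits in the target queue until something receives it
    RECEIVE TOP (1) conversation_handle, message_type_name, CAST(message_body AS xml) AS body
    FROM dbo.HelloTargetQueue;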

Overall, this was a good conference to attend. There was quite a bit of valuable content, and I made some additional contacts as well. I look forward to the larger PASS conference in Seattle this November. Now, if you will excuse me, I have a date with my wife.

Adios, and buenas noches.
