I’ve dealt with a lot of scenarios where I needed to troubleshoot and test a Reporting Services subscription simply because it failed. Most of the issues stem from owner permissions, typos in email addresses, or subscriptions left behind by someone who has since left the company. Knowing where to look, and how to map the rather ugly naming convention of the SQL jobs back to their subscriptions, can eat up some valuable time. So let me save you some of it by sharing a script I have used on numerous occasions.
The script joins a few ReportServer-specific tables to msdb.dbo.sysjobs. I know it works well with SQL 2008, and it is safe to say it works with 2008 R2 as well. I cannot recall whether I tested it against SQL 2005, but I am fairly certain it would execute without a hiccup on a 2005 instance. I haven’t tested against 2012 as of yet, but when I do I’ll definitely update this post.
Down to the bread and butter. Here’s the script. The following returns all SQL Agent jobs associated with SQL Server Reporting Services subscriptions.
SELECT cmd = 'EXEC msdb.dbo.sp_start_job @job_name= ''' + sj.[name] + ''';'
      ,ReportName = c.[name]
      ,ReportPath = c.[Path]
      ,LoginName = u.[UserName]
      ,SQLJobCreateDate = sj.[date_created]
FROM msdb.dbo.sysjobs sj
JOIN [ReportServer].[dbo].[ReportSchedule] AS rs WITH (NOLOCK)
    ON sj.[name] = CAST(rs.[ScheduleID] AS NVARCHAR(128))
JOIN [ReportServer].[dbo].[Catalog] c WITH (NOLOCK)
    ON rs.[ReportID] = c.[ItemID]
JOIN [ReportServer].[dbo].[Users] u WITH (NOLOCK)
    ON c.[CreatedByID] = u.[UserID]
ORDER BY sj.[date_created] DESC;
I like to know what was created today, so I run the following:
SELECT cmd = 'EXEC msdb.dbo.sp_start_job @job_name= ''' + sj.[name] + ''';'
      ,ReportName = c.[name]
      ,ReportPath = c.[Path]
      ,LoginName = u.[UserName]
      ,SQLJobCreateDate = sj.[date_created]
FROM msdb.dbo.sysjobs sj
JOIN [ReportServer].[dbo].[ReportSchedule] AS rs WITH (NOLOCK)
    ON sj.[name] = CAST(rs.[ScheduleID] AS NVARCHAR(128))
JOIN [ReportServer].[dbo].[Catalog] c WITH (NOLOCK)
    ON rs.[ReportID] = c.[ItemID]
JOIN [ReportServer].[dbo].[Users] u WITH (NOLOCK)
    ON c.[CreatedByID] = u.[UserID]
WHERE CONVERT(VARCHAR(10), sj.[date_created], 101) = CONVERT(VARCHAR(10), GETDATE(), 101)
ORDER BY sj.[date_created] DESC;
The cmd column provides the command needed to kick off the subscription. Most of my subscriptions are email-based, so if the report hits my inbox I know I am good!
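For example, the cmd column spits out something you can paste straight into a query window. The GUID-style job name below is just a hypothetical placeholder; SSRS names its Agent jobs after the subscription’s ScheduleID, which is exactly what the join above relies on:

/* KICK OFF A SUBSCRIPTION (HYPOTHETICAL JOB NAME) */
EXEC msdb.dbo.sp_start_job @job_name = 'C9A5B5E2-0A3D-4E0F-9D3B-1234567890AB';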
I won’t get into all of the ins and outs of BCP, but feel free to read up on the subject here: Import and Export Bulk Data by Using the bcp Utility (SQL Server)
I have images stored in a SQL Server database and I need to access them without writing, purchasing or using a custom app. What can I do?
Step 1: Identify the table in question
In this case we are working with the Images table inside the [devdb] database.
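For reference, here is a minimal sketch of the table this post assumes. My actual definition may differ slightly, but the three columns used below (id, Image, FileName) are what matter:

/* HYPOTHETICAL SKETCH OF THE Images TABLE */
CREATE TABLE devdb.dbo.Images (
    id INT IDENTITY(1,1) PRIMARY KEY,  -- sequential ids, used by the export loop later
    [Image] VARBINARY(MAX) NOT NULL,   -- the binary contents of the picture
    [FileName] NVARCHAR(50) NOT NULL   -- original file name, reused on export
);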
Step 2: Generate a BCP format file
Create the format file by executing the following statement in a command prompt window.
bcp devdb.dbo.Images format nul -c -fimages.fmt -T -S (local)\DEV12 -v
This will produce a file named images.fmt in the directory you ran the command from; in my case, C:\Temp\.
Step 3: Modify the format file
Open the newly created images.fmt with Notepad. We will (very carefully) edit the contents. Here’s a look at the original file. Note: I modified the following image to fit the width of this layout, meaning I cropped some white-space. Also note I have SQL 2005, 2008 R2 and SQL 2012 installed on my box. The 10.0 on the first line is present because the BCP version is from SQL 2008; it would show 11.0 if I were using BCP for SQL 2012.
Here is what it needs to look like after we make some adjustments. We modified the file to single out the image column: the field count drops to one and the column’s data type changes from SQLCHAR to SQLBINARY. Be sure to SAVE your changes, especially the SQLCHAR-to-SQLBINARY change; otherwise, empty files will be created with a zero file size. Notice the white-space in the following image. I really don’t know if it makes a difference, but the slight modifications I make always work.
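For reference, here is a sketch of what the adjusted format file ends up looking like, assuming the binary column is named Image: one field, SQLBINARY, with the prefix length, data length and terminator all emptied out (the version number on your first line may differ):

10.0
1
1       SQLBINARY       0       0       ""      1       Image       ""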
Step 4: Enable xp_cmdshell
If you don’t have xp_cmdshell enabled, here’s how to do so. Keep in mind, if you don’t use xp_cmdshell often, be sure to disable it when you’re finished. You can read up on the security implications of xp_cmdshell if the subject is new to you.
EXEC sp_configure 'Show Advanced Options', 1;
RECONFIGURE;
GO
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
GO
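And since we are being responsible, here is the reverse to run when you are finished:

/* DISABLE xp_cmdshell WHEN DONE */
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE;
GO
EXEC sp_configure 'Show Advanced Options', 0;
RECONFIGURE;
GO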
Step 5: Write up the query
In the event you don’t have images handy, this is what I used to load up a few sample files. I literally used the sample pictures that come preloaded with Windows 7, which are located in the C:\Users\Public\Pictures\Sample Pictures directory.
/* LOAD TABLE */
INSERT INTO devdb.dbo.Images ([Image],[FileName])
SELECT BulkColumn, 'Chrysanthemum.jpg'
FROM OPENROWSET(BULK 'C:\Users\Public\Pictures\Sample Pictures\Chrysanthemum.jpg', SINGLE_BLOB) AS x;

INSERT INTO devdb.dbo.Images ([Image],[FileName])
SELECT BulkColumn, 'Desert.jpg'
FROM OPENROWSET(BULK 'C:\Users\Public\Pictures\Sample Pictures\Desert.jpg', SINGLE_BLOB) AS x;

INSERT INTO devdb.dbo.Images ([Image],[FileName])
SELECT BulkColumn, 'Hydrangeas.jpg'
FROM OPENROWSET(BULK 'C:\Users\Public\Pictures\Sample Pictures\Hydrangeas.jpg', SINGLE_BLOB) AS x;

INSERT INTO devdb.dbo.Images ([Image],[FileName])
SELECT BulkColumn, 'Jellyfish.jpg'
FROM OPENROWSET(BULK 'C:\Users\Public\Pictures\Sample Pictures\Jellyfish.jpg', SINGLE_BLOB) AS x;

INSERT INTO devdb.dbo.Images ([Image],[FileName])
SELECT BulkColumn, 'Koala.jpg'
FROM OPENROWSET(BULK 'C:\Users\Public\Pictures\Sample Pictures\Koala.jpg', SINGLE_BLOB) AS x;

INSERT INTO devdb.dbo.Images ([Image],[FileName])
SELECT BulkColumn, 'Lighthouse.jpg'
FROM OPENROWSET(BULK 'C:\Users\Public\Pictures\Sample Pictures\Lighthouse.jpg', SINGLE_BLOB) AS x;

INSERT INTO devdb.dbo.Images ([Image],[FileName])
SELECT BulkColumn, 'Penguins.jpg'
FROM OPENROWSET(BULK 'C:\Users\Public\Pictures\Sample Pictures\Penguins.jpg', SINGLE_BLOB) AS x;

INSERT INTO devdb.dbo.Images ([Image],[FileName])
SELECT BulkColumn, 'Tulips.jpg'
FROM OPENROWSET(BULK 'C:\Users\Public\Pictures\Sample Pictures\Tulips.jpg', SINGLE_BLOB) AS x;
Let’s take a look at the contents of the table.
/* VERIFY TABLE CONTENTS */
SELECT id, [Image], [FileName]
FROM devdb.dbo.Images;
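If you want a quick sanity check that the binary actually made it into the table, DATALENGTH does the trick:

/* CONFIRM EACH IMAGE ACTUALLY CONTAINS BYTES */
SELECT id, [FileName], DATALENGTH([Image]) AS ImageBytes
FROM devdb.dbo.Images;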
Step 6: Export images
Now that we know the table definition, have adjusted the format file and have enabled xp_cmdshell, let’s extract some photos! We could run through this one by one, but where’s the fun in that? I used a WHILE loop as opposed to a cursor to extract each photo by id.
/* EXPORT PHOTOS FROM DATABASE */
DECLARE @cnt INT,
        @rcnt INT,
        @cmd NVARCHAR(500), -- roomy enough for the full BCP command line
        @fname NVARCHAR(50),
        @filepath NVARCHAR(50),
        @inst NVARCHAR(50);

/* SET VARIABLES */
SELECT @rcnt = COUNT(*) FROM [devdb].[dbo].[Images];
SET @cnt = 1;
SET @filepath = 'c:\temp\';
SET @inst = '(local)\DEV12';

/* LOOP THROUGH RECORDS */
WHILE @cnt <= @rcnt -- <= so the last row gets exported too
BEGIN
    SELECT @fname = [FileName] FROM devdb.dbo.Images WHERE (id = @cnt);

    SELECT @cmd = 'BCP "SELECT Image FROM [devdb].[dbo].[Images] WHERE (id = '
        + CAST(@cnt AS VARCHAR(10)) + ')" queryout "' + @filepath + @fname
        + '" -T -S ' + @inst + ' -f c:\temp\Images.fmt';

    PRINT @cmd; /* PRINT STATEMENT TO SCREEN */
    EXEC xp_cmdshell @cmd;

    SET @cnt = @cnt + 1;
END
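To confirm the exports landed where expected, you can peek at the folder without leaving SSMS (xp_cmdshell is already enabled, after all):

/* LIST THE EXPORTED FILES */
EXEC xp_cmdshell 'dir c:\temp\*.jpg';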
That’s pretty much it. Happy BCP’ing and exporting!
If you are like me, you save most (if not all) of the scripts and queries you create. It is a huge time saver, and with our aggressive workload we need all the time we can spare. In my case I deal with a significant volume of production deployments/promotions, so I burn through scripts frequently.
On a few occasions (when I need it most, of course) the file names drift from memory, so I look to my recent file list in SSMS to help jog it. Yes, we have deployment requests which I can easily reference, but this post is about SQL Server Management Studio, so there.
Back to the subject… by default SSMS 2012 will only display the last six files, but what if you want to extend that number to 10 or more?
I’d like to point out that Jugal Shah (blog | @imjugal) published an article that covers SSMS 2000, 2005, 2008 and 2008 R2. If you would like to read it, here is the link: Change Setting for Recently Used Files in SQL Server Management Studio SSMS.
The process is pretty much identical in SSMS 2012, and you go about it like so…
Step 1: Tools >> Options
Step 2: Increase the value for: “items shown in recently used lists”
Step 3: Let’s Verify
The change is immediate, so there’s no need to close and reopen SSMS.
Well, today was my first attempt outside of a classroom setting to dig my heels into SSAS DMX. I have an idea I want to put into action (automating a process) that I believe will benefit my organization. So fire up SSMS and connect to an instance that has Analysis Services.
1) Start a New DMX Query
2) Execute Script
SELECT * FROM $system.DBSCHEMA_CATALOGS
3) Returning Specific Columns
Similar to Transact-SQL, you just specify the columns you want; however, you need to wrap the column names in brackets, e.g. [CATALOG_NAME].
SELECT [CATALOG_NAME] FROM $system.DBSCHEMA_CATALOGS
Pretty simple. Now let’s try it with XMLA, which stands for XML for Analysis.
Just like before, connect to an instance that has Analysis Services and published cubes.
1. Start a New XMLA Query
2. Execute the Script
<Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
  <RequestType>DBSCHEMA_CATALOGS</RequestType>
  <Restrictions />
  <Properties />
</Discover>
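If you want to narrow the results to a single database, you can add a restriction to the Discover call. The catalog name below is a hypothetical placeholder; substitute one of your own SSAS databases:

<Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
  <RequestType>DBSCHEMA_CATALOGS</RequestType>
  <Restrictions>
    <RestrictionList>
      <CATALOG_NAME>MySsasDatabase</CATALOG_NAME>
    </RestrictionList>
  </Restrictions>
  <Properties />
</Discover>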
There are many ways to go about getting the last backup date for any and all databases, but I tend to lean toward set-based methods over using the GUI, for many reasons. One in particular: a set-based query can be turned into an actionable report, and monitoring backups is important after all.
The method I use the most is the following, because it only returns information about databases that currently exist. I really do not need to see information about what used to exist, but I will show that example as well. Before I go on, you might want to understand the anatomy of both the master.sys.databases catalog view and the msdb.dbo.backupset table.
/* GET LAST BACKUP DATE FOR ALL EXISTING DATABASES */
SELECT d.name,
       MAX(b.backup_finish_date) AS 'Last Backup Date'
FROM master.sys.databases d
JOIN msdb.dbo.backupset b
    ON d.name = b.database_name
WHERE (b.type = 'D' OR b.type = 'I')
GROUP BY d.name
ORDER BY d.name;
The results indicate (104 row(s) affected), so I have 104 databases on this particular instance.
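One caveat worth noting: because this is an inner join, a database that has never been backed up will not show up at all. If you want those to surface with a NULL last backup date, a LEFT JOIN variant of the same query handles it; note the type filter moves into the join condition so the unmatched rows survive:

/* INCLUDE DATABASES THAT HAVE NEVER BEEN BACKED UP */
SELECT d.name,
       MAX(b.backup_finish_date) AS 'Last Backup Date' -- NULL = never backed up
FROM master.sys.databases d
LEFT JOIN msdb.dbo.backupset b
    ON d.name = b.database_name
   AND (b.type = 'D' OR b.type = 'I')
GROUP BY d.name
ORDER BY d.name;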
This example returns the last backup date straight from history, which may include databases that no longer exist, depending on how long you retain backup history. So this may vary.
/* GET LAST BACKUP DATE FROM BACKUPSET */
SELECT database_name,
       MAX(backup_finish_date) AS 'Last Backup Date'
FROM msdb.dbo.backupset
WHERE (type = 'D' OR type = 'I')
GROUP BY database_name
ORDER BY database_name;
The results indicate (108 row(s) affected), so between the two statements you can see there is a difference of four rows, which is why I avoid using msdb.dbo.backupset as the only source of record.
The easiest way to determine which databases no longer exist is to execute the following.
/* RETURN NON-EXISTING DATABASE NAMES */
SELECT DISTINCT database_name
FROM msdb.dbo.backupset
WHERE database_name NOT IN (SELECT name FROM master.sys.databases);
The results indicate (4 row(s) affected) and list the names of the databases that no longer exist.
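If that stale history bothers you, msdb ships with a cleanup procedure. This sketch keeps one year of history; pick a retention period that fits your environment:

/* PRUNE BACKUP HISTORY OLDER THAN ONE YEAR */
DECLARE @oldest DATETIME;
SET @oldest = DATEADD(YEAR, -1, GETDATE());
EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = @oldest;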
Let’s face it: we have all spent countless hours developing and/or polishing our SQL scripts deep into the wee hours of the night, to the point of near exhaustion. We rise early the next morning (or a few hours later, in most cases), pour a cup of coffee and head over to our laptop to find that our system rebooted. Then reality sinks in and you suddenly realize that you didn’t save anything before calling it a night. Now you’re thinking “[enter swear phrase of choice here]”!
Typically when you re-open SQL Server Management Studio you’re prompted with a nice little screen that politely asks if you would like to recover the selected files or queries, which is extremely helpful. But what happens when you don’t get that prompt? What then?
The answer is easy enough, and may require you to change your folder options to show hidden files and folders. In the event you find yourself in a similar situation, simply navigate to (assuming you are running Windows 7) C:\Users\[your username goes here]\AppData\Local\Temp\ and look for the temporary query files; in my case they were named with a ~vs prefix and a .sql extension.
Words really cannot describe the experience. The SQL Server community is an amazing community to be a part of; the camaraderie among peers is undoubtedly extraordinary. From what I recall there were over 500 first-timers this year, which just proves the PASS community is growing at a great rate. I have only been involved with PASS for a few years now, and I have become the V.P. of the Arizona PASS Chapter, a SQLSaturday Phoenix organizer and now a volunteer/presenter scheduler for the Performance PASS Virtual Chapter. So if you have any questions about PASS, feel free to drop me a line; I would love to chat with you about it.
Back to the summit…
The PASS Summit is technically a three-day, fun-filled event running Wednesday through Friday, but there were also pre-conference seminars (precons) on Monday and Tuesday, which I did not attend. From what I hear, the precons were amazing; hopefully next year I will be fortunate enough to attend those as well. My main focus was on performance, but there were many other tracks to choose from: the ever-so-popular business intelligence, administration, development and professional development. Definitely something for everyone.
The training is just one part of the conference, but the relationships you build from networking are priceless. I finally met, for the first time, so many people whom I have known for several years. I know that sounds odd, but the power of social networking brings people from all walks of life together. The best part is that you feel like you’ve been friends forever even though it is the first time meeting one another. That’s the energy of the SQL community, and I am proud to be a part of it. I learned a lot and have so many thoughts and scenarios running through my mind that I need to organize them into actionable items and prepare to blog about them. There’s so much more I can say about the benefits of attending the PASS Summit, but take it from me (a first-timer): it’s well worth it and you’ll never forget it. Hope to see and/or talk with you soon at a local, national or international event!
I had the extreme pleasure of attending the SSAS Workshop by PragmaticWorks this week, a two-day session with a bonus third day entirely focused on Denali (Expedition Denali). Brian Knight (blog | @brianknight), Dustin Ryan (blog | @SQLDusty) and Lonnie Mejia (LinkedIn) were on site at the Microsoft Southwest District office in Tempe, AZ, which has a beautiful view of Tempe Town Lake.
I had only a little exposure to SQL Server Analysis Services before this, and from what I have learned I know that our own data warehouse group could significantly benefit from this workshop. I am not mocking them whatsoever, but I am saying some processes could be handled differently. For example, cube updates: instead of providing me the entire Visual Studio solution, they could easily provide an XMLA script which I can use in SSMS to deploy the dimension update. Things like this I never knew, so this was a real eye-opener for me and gives me the needed ammo to fight with our developers. Kidding! It does, however, allow me to extend my freshly acquired knowledge to that group, in a non-confrontational way of course (fingers crossed behind back).
Business intelligence has a warm place in my heart, and the time I did spend developing reports was exciting. To be honest, I would love nothing more than to be able to go from zero, to data warehouse, to SSAS slice and dice, to full publish on Reporting Services, SharePoint, etc. in a week or so. I believe that, as a DBA, that would be a valuable skill-set to have under my belt. This course is my step in that direction.
There is no doubt that this workshop packs in a lot of information; the two days are literally bursting at the seams. But this is definitely a MUST for those looking to get into the SSAS world. The PragmaticWorks staff really demystified SSAS. Their lectures and labs are delivered in such a manner that it is really easy to keep up with the pace. Throughout the course you are walked through setting up an SSAS project all the way through creating cubes, dimensions, mining structures, roles and everything in between. The price of the course is a bargain given everything you walk away with.
I think the most action came toward the end of day two. The room was divided down the middle and the groups were paired against each other to build an SSAS project from start to finish following a set of requirements. Then each group needed to create a report, in either Reporting Services or Excel, based on the cube it published. Everyone participated, whether by being the designated driver (at the computer), yelling out the requirements, providing assistance and so on. It was intense! I must mention that my group, “Team Dustin”, WON the challenge against “Team Brian”. Better luck next time, Brian! We literally beat them by 1-2 seconds at best. Nonetheless, a fantastic method to illustrate not only what we had learned but, more importantly, what we had retained. If you get the opportunity to attend this workshop I would highly recommend it. You will not be sorry!
Expedition Denali (day 3) was exceptional. I have not touched Denali at all, but from what Roger Doherty (blog | @Doherty100) and Brian Knight were covering and demoing, I cannot wait till RTM. I would totally spill the beans, because there are so many very cool and sexy things coming… but their “body-guard/new sales guy” Lonnie is a pretty big guy, so I will refrain. Here he is working through the demo.
5/25 started out different. I woke up about an hour earlier than my alarm, despite the fact that I ended up crashing out around 1:30 AM. I woke up completely awake, perfectly content, and I felt very refreshed on only 4.5 hours of rest. I had no need for coffee; just a tuck and roll to the home office with a quick stop at the local bistro (my kitchen) for a power breakfast à la carte. You have to love light traffic.
I logged into work, flew through the critical SQL SCOM alerts and moved on to the SQL Health & Backup report. I finished up right on time to join SQL Sentry’s “Learn How to Tune Queries” webinar. Unfortunately I am not well versed in 3rd-party monitoring tools, having used little beyond the built-in SSMS tools, which is why I tend to participate as much as I can in demos, forums and similar sessions.
I learned a lot about SQL Sentry’s Plan Explorer, and I can honestly say it will be extremely beneficial as I venture into the realm of performance tuning. This literally could not have come at a better time: at my current place of employment there are plans in the works to include me in several tuning efforts, which is why I have immersed myself in Profiler and the Database Engine Tuning Advisor and, thanks to the SQL community, am learning the many available DMVs.
A huge surprise came at the end of the session. They held a raffle and happened to call my name. Unfortunately I couldn’t hear the audio portion when they explained what the item was, but I did hear my name called, so I responded. I figured I had won a license for one of their awesome monitoring software packages, like the SQL Sentry Power Suite or their Performance Advisor for SQL Server. To be honest, it really didn’t matter to me, because either would have been a sweet prize!
Anxious as I was, I turned to Twitter and pinged Brent Ozar (@BrentO | Blog) to ask what I had won, since he was associated with the session. Shortly after, Aaron Bertrand (@AaronBertrand | Blog) responded, and the convo went something like this:
As you can tell, I was (and still am) excited. Everything just worked out perfectly. So I wanted to take this opportunity to say thank you to SQL Sentry, Aaron Bertrand, Greg Gonzalez (@SQLsensei | Blog), Peter Shire (@Peter_Shire) and the rest of the SQL Sentry staff for the awesome presentation and the gift. I also want to thank the organizers of SQLCruise for hosting the event. Too bad I am not going on the cruise, but I’ll be there in spirit!
Thank you SQL Sentry!!!
Did I mention the iPad 2 arrives… on… FRIDAY!!!