When I first encountered this I thought to myself… it must be a database or service of some sort, not being entirely sure what to expect. I searched throughout the instance, starting with the database engine and moving to the SSAS instance, and was unable to find anything remotely named "flight recorder". At this point I thought the error must have arisen from an external call looking for a resource that no longer exists. Little did I realize that it is actually the SSAS log. It's also been around since SQL Server 2005.
If you need to determine whether it is enabled, this is how you go about it.
Using SSMS, connect to Analysis Services for that instance. Once connected, right-click on the instance, select "Properties", and in the Name column, about the 13th row down, you will see "Log \ FlightRecorder \ Enabled".
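The same setting also lives in the SSAS configuration file, msmdsrv.ini, under the instance's Config folder. The fragment below is only a sketch of the relevant nesting (your file will contain many more elements, and you should back it up before touching it):

```xml
<Log>
  <FlightRecorder>
    <!-- 1 = flight recorder enabled, 0 = disabled -->
    <Enabled>1</Enabled>
  </FlightRecorder>
</Log>
```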
Let's face it, we have all spent countless hours developing and/or polishing up our SQL scripts deep into the wee hours of the night to the point of near exhaustion. We rise early the next morning (or a few hours later, in most cases), pour a cup of coffee and head over to our laptop, only to find that our system rebooted. Then reality sinks in and you suddenly realize that you didn't save anything before calling it a night. Now you're thinking "[enter swear phrase of choice here]"!
Typically when you re-open SQL Server Management Studio you're prompted with a nice little screen that politely asks if you would like to recover the selected files or queries. That is extremely helpful, but what happens when you don't get that prompt?
The answer is easy enough, though it may require you to change your folder options to show hidden files and folders. In the event you find yourself in a similar situation, simply navigate to (assuming you are running Windows 7) C:\Users\[your username goes here]\AppData\Local\Temp\ and look for files named similarly to the ones illustrated below.
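If you'd rather not dig through the Temp folder by hand, a small script can surface the candidates. This is only a minimal sketch; the function name and the 24-hour window are my own assumptions, not anything SSMS documents:

```python
import glob
import os
import time


def recent_temp_sql_files(temp_dir, hours=24):
    """Return .sql files under temp_dir modified in the last `hours`, newest first."""
    cutoff = time.time() - hours * 3600
    matches = [f for f in glob.glob(os.path.join(temp_dir, "*.sql"))
               if os.path.getmtime(f) >= cutoff]
    return sorted(matches, key=os.path.getmtime, reverse=True)


if __name__ == "__main__":
    # On Windows the SSMS scratch files land under %TEMP%
    for path in recent_temp_sql_files(os.environ.get("TEMP", "/tmp")):
        print(path)
```

Sort newest-first so the file you lost last night is at the top of the list.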
Words really cannot describe the experience. The SQL Server community is an amazing community to be a part of. The camaraderie among peers is undoubtedly extraordinary. From what I recall there were over 500 first timers this year, which just shows the PASS community is growing at a great rate. I have only been involved with PASS for a few years now and have become the V.P. for the Arizona PASS Chapter, a SQLSaturday Phoenix organizer and now a volunteer/presenter scheduler for the Performance PASS Virtual Chapter. So if you have any questions about PASS, feel free to drop me a line; I would love to chat with you about it.
Back to the summit…
The PASS Summit is technically a three-day fun-filled event from Wednesday through Friday, but there were pre-conference seminars (precons) on Monday and Tuesday which I did not attend. From what I hear the precons were amazing. Hopefully next year I can be fortunate enough to attend those as well. I only attended the Wednesday – Friday sessions. My main focus was on performance, but there were many other tracks to choose from, ranging from the ever-popular business intelligence to administration, development and professional development. Definitely something for everyone.
The training is just one part of the conference, but the relationships you build from networking are priceless. I finally met, for the first time, so many people I have known for several years. I know that sounds odd, but the power of social networking just brings people from all walks of life together. The best part is that you feel like you've been friends forever even though it's the first time meeting one another. That's the energy of the SQL community and I am proud to be a part of it. I learned a lot and have so many thoughts and scenarios running through my mind that I need to organize them into actionable items and prepare to blog about them. There's so much more I can say about the benefits of attending the PASS Summit, but take it from me (a first timer): it's well worth it and you'll never forget it. Hope to see and/or talk with you soon at a local, national or international event!
I had the extreme pleasure of attending the SSAS Workshop by PragmaticWorks this week, which was a two-day session with a bonus third day entirely focused on Denali (Expedition Denali). Brian Knight (blog | @brianknight), Dustin Ryan (blog | @SQLDusty) and Lonnie Mejia (LinkedIn) were on site at the Microsoft Southwest District office in Tempe, AZ which has a beautiful view of Tempe Town Lake.
I have only had a little exposure to SQL Server Analysis Services before this, and from what I have learned I do know that our own data warehouse group could significantly benefit from this workshop. I am not mocking them whatsoever, but I am saying some processes could be handled differently. For example, cube updates: instead of providing me the entire Visual Studio solution, they can easily provide me an XMLA script which I can use in SSMS to deploy the dimension update. Things like this I never knew, so this was a real eye opener for me and gives me the needed ammo to fight with our developers. Kidding! It does, however, allow me to extend my freshly acquired knowledge to that group, in a non-confrontational way of course (fingers crossed behind back).
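For illustration, a dimension update run as XMLA from an SSMS XMLA query window looks roughly like this. The database and dimension IDs below are made up; you would script the real ones out of your own solution:

```xml
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>SalesDW</DatabaseID>
    <DimensionID>Dim Customer</DimensionID>
  </Object>
  <Type>ProcessUpdate</Type>
</Process>
```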
Business intelligence has a warm place in my heart, and the time I did spend developing reports was exciting. To be honest I would love nothing more than to be able to go from zero to data warehouse to SSAS slice and dice to full publish on Reporting Services, SharePoint, etc… in a week or so. I believe as a DBA that would be a valuable skill-set to have under my belt. This course is my step in that direction.
There is no doubt that this workshop packs in a lot of information. The two days are literally bursting at the seams with information but this is definitely a MUST for those looking to get into the SSAS world. The PragmaticWorks staff really demystified SSAS. Their lectures and labs are delivered in such a manner that it is really easy to keep up with the pace. Throughout the course you are walked through the process of setting up an SSAS project all the way through creating cubes, dimensions, mining structures, roles and everything in between. The price of the course is a bargain given everything you walk away with.
I think the most action came toward the end of day two. The room was divided down the middle and the groups were paired against each other to build an SSAS project from start to finish following a set of requirements. Then each team needed to create a report, in either Reporting Services or Excel, based on the cube it published. Everyone participated, whether by being the designated drivers (at the computer), yelling out the requirements, providing assistance and so on. It was intense! I must mention that my group, "Team Dustin", WON the challenge against "Team Brian". Better luck next time, Brian! We literally beat them by 1–2 seconds at best. Nonetheless, a fantastic method to illustrate not only what we had learned but, more importantly, what we had retained. If you get the opportunity to attend this workshop I would highly recommend it. You will not be sorry!
Expedition Denali (day 3) was exceptional. I have not touched Denali at all, but from what Roger Doherty (blog | @Doherty100) and Brian Knight were covering and demoing, I cannot wait until RTM. I would totally spill the beans, because there are so many very cool and sexy things coming… but their "body-guard/new sales guy" Lonnie is a pretty big guy, so I will refrain. Here he is working through the demo.
I know many can and will say I can simply use ipconfig or ping the local computer name, and to an extent that's true. In my case I really only want the IP address and nothing else, just the plain IP address. I don't want the extra verbiage that goes along with it.
To get started let's run through a simple statement, but before we do, know that this is geared towards a command prompt and not a batch file. The syntax is slightly different: in a batch file you double the percent signs on the FOR variables (%%d instead of %d).
Step 1: Get only one reply
ping %computername% -4 -n 1 | find /i "reply"
Step 2: Get all left of the colon
FOR /f "tokens=1 delims=:" %d IN ('ping %computername% -4 -n 1 ^| find /i "reply"') DO ECHO %d
Step 3: Get the IP Address
FOR /f "tokens=1 delims=:" %d IN ('ping %computername% -4 -n 1 ^| find /i "reply"') DO FOR /F "tokens=3 delims= " %g IN ("%d") DO echo %g
Step 4: Get the first octet
You might question why you would only want the first octet and the answer is simple. Based on that single value I can determine what the backup share is. So if I were to return only the first octet into a stored procedure then it can dynamically perform backups accordingly to the appropriate share.
FOR /F "tokens=1 delims=:" %d IN ('ping %computername% -4 -n 1 ^| find /i "reply"') DO FOR /F "tokens=3 delims= " %g IN ("%d") DO FOR /F "tokens=1 delims=." %h IN ("%g") DO ECHO %h
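For readers more comfortable outside the command prompt, here is a rough Python translation of the three nested FOR loops above, applied to a canned "Reply" line instead of a live ping (the loopback address is used purely for illustration):

```python
# Sample output line from ping, matching the string examined below
reply = "Reply from 127.0.0.1: bytes=32 time<1ms TTL=128"

# Step 2: tokens=1 delims=:  -> everything left of the colon
left_of_colon = reply.split(":")[0]        # "Reply from 127.0.0.1"

# Step 3: tokens=3 delims=<space> -> the third space-separated token
ip_address = left_of_colon.split(" ")[2]   # "127.0.0.1"

# Step 4: tokens=1 delims=. -> the first octet
first_octet = ip_address.split(".")[0]     # "127"

print(ip_address, first_octet)
```

Each split mirrors one delims/tokens pair: pick a delimiter, then pick which resulting token you keep.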
At this point you might be asking yourself what the syntax means. Here is the scoop, using (Step 2) as a reference point. Consider tokens as segments of a single item that are separated by a specific value.
Let’s examine the following string:
Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
It looks pretty straightforward for the most part, but if you think about which separating value you'll want to use, the string begins to appear differently. For example, I want to set delims, otherwise known as the delimiter character, to a colon. There is only one colon, therefore making two tokens: all characters left of the colon and all characters right of it.
So by running (Step 2) I am essentially requesting all characters to the left of the colon, because I am only asking for token 1. If I specified token 2, then I would get all characters to the right of the colon, including the leading space.
FOR /f "tokens=2 delims=:" %d IN ('ping %computername% -4 -n 1 ^| find /i "reply"') DO ECHO %d
Now moving on to (Step 3), I am essentially breaking the string apart into three tokens because I am setting delims to a space, which is written as delims= " (there is a space between the = and the closing quote).
Let’s examine the string:
Reply from 127.0.0.1
Hopefully at this point you are able to see the three tokens in the above string. So in order to return only the IP address, I request only token 3.
Q1. What would be the delims value for 127.0.0.1?
Q2. How many tokens will result?
Q3. What token will I need to request to get the first octet?
5/25 started out differently. I woke up about an hour earlier than my alarm despite the fact I ended up crashing out around 1:30 AM. I woke up completely awake, perfectly content, and I felt very refreshed with only 4.5 hours of rest. I had no need for coffee, just a tuck and roll to the home office with a quick stop at the local bistro (my kitchen) for a power breakfast a la carte. You have to love light traffic.
I logged into work and flew through the critical SQL SCOM alerts, then moved on to the SQL Health & Backup report. I finished up right on time to join SQL Sentry's "Learn How to Tune Queries" webinar. Unfortunately I am not well versed with 3rd-party monitoring tools outside of the minimal SSMS tools, which is why I tend to participate as much as I can in demos, forums and similar sessions.
I learned a lot about SQL Sentry's Plan Explorer and I can honestly say it will definitely be extremely beneficial as I venture into the realm of performance tuning. This literally could not have come at a better time. At my current place of employment there are plans in the works to include me in several tuning efforts, which is why I have immersed myself in Profiler and the Database Engine Tuning Advisor and, thanks to the SQL community, am learning the many available DMVs.
A huge surprise came at the end of the session. They held a raffle and they happened to call my name. Unfortunately I couldn’t hear the audio portion when they explained what the item was, but I did hear my name called, so I responded. I had figured I won a license for one of their awesome monitoring software packages like the SQL Sentry: Power Suite or their Performance Advisor for SQL Server. To be honest it really didn’t matter to me because either would have been a sweet prize!
Anxious as I was I turned to twitter and pinged Brent Ozar (@BrentO | Blog) and asked what I had won since he was associated with the session. Then shortly after Aaron Bertrand (@AaronBertrand | Blog) responded and the convo went something like this:
As you can tell I was (and still am) excited. Everything just worked out perfectly. So I wanted to take this opportunity to say thank you to SQL Sentry, Aaron Bertrand, Greg Gonzalez (@SQLsensei | Blog), Peter Shire (@Peter_Shire) and the rest of the SQL Sentry staff for the awesome presentation and the gift. I also want to thank the organizers of SQLCruise for hosting the event. Too bad I am not going on the cruise, but I'll be there in spirit!
Thank you SQL Sentry!!!
Did I mention the iPad 2 arrives… on… FRIDAY!!!