Interviewed for the November 2010 NoCOUG Journal

November 4, 2010

Almost two months ago I was interviewed for an article that was to appear in an upcoming NoCOUG Journal newsletter.  It was a great opportunity to participate in the interview, and Chen (Gwen) Shapira provided me with a great set of thought-provoking questions.  The interview ran a bit long, so some of the questions and answers had to be trimmed from the printed NoCOUG Journal article (Iggy Fernandez, Chen, and the other responsible parties did a fantastic job of picking the best answers for inclusion).  The picture used in the article… let’s just say that of the 50 or so pictures that I reviewed for submission, it was the only one that effectively hid my gray hair, in which I did not show a distressed (or puzzled) expression, and in which I did not appear to have just spent eight hours writing a critical book review (is that a hint that another book review is in the works?).

I have decided to post a couple of the questions and answers that were trimmed due to space constraints (with the exception of the war story, which drifted a bit too far off topic and involved some rather gruesome gutting of computer internals).

———————————

You use Excel as an automation and visualization tool.  You definitely have more Excel expertise than most Oracle DBAs.  How did you get into the Excel business?  Do you have some tricks to share with our readers?

Among other things, I was a computer/technology instructor prior to working for K&M Machine-Fabricating, Inc., and one of the topics that I taught was using Microsoft Excel for data analysis and data visualization.  Excel has a full-featured macro language that is quite similar to Microsoft Visual Basic 6.0 (and also the 2.0 version of Visual Basic), and is somewhat similar to the much simplified VBScript that is integrated into some ERP packages (Infor’s Visual Manufacturing, for example), some engineering applications (AutoCAD, at least for a period of time, supported VBScript), classic ASP web pages, and even the Windows operating system.  It is the macro automation, either from within Excel or another program (whether a Microsoft Office component, a custom-developed program, or even an operating system script), that provides the most useful features in Excel for a DBA.
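
To make the macro language a little more concrete, below is a minimal sketch (the macro name and sample data are just for illustration) that writes a handful of values to the active worksheet and then charts those values through the Excel object model:

Sub QuickChartDemo()
    ' Minimal sketch: write a few sample values to the active worksheet,
    ' then build a line chart from them using the same Excel object model
    ' that macros (and external automation programs) drive.
    Dim i As Long
    Dim cht As ChartObject

    Range("A1").Value = "Hour"
    Range("B1").Value = "Sessions"
    For i = 1 To 8
        Cells(i + 1, 1).Value = i
        Cells(i + 1, 2).Value = 10 + (i * i) Mod 17   ' arbitrary sample values
    Next i

    Set cht = ActiveSheet.ChartObjects.Add(Left:=200, Top:=10, Width:=360, Height:=220)
    cht.Chart.ChartType = xlLine
    cht.Chart.SetSourceData Source:=Range("A1:B9")
End Sub

The same object model is reachable from outside Excel (a VBScript file or a custom program can CreateObject("Excel.Application") and issue the identical calls), which is what makes the automation angle so useful to a DBA.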

In June 2009 I delivered a presentation to a regional ERP users’ group that, among other things, showed how to leverage Microsoft Excel as a query and reporting tool for ERP data.  The Excel portion of the presentation required roughly 90 minutes and showed how to tap into the ERP data for reporting and visualization, as well as how to query flat files, Windows event logs, and the WMI database that is built into all Windows operating systems since the release of Windows 2000.  It took about three months to prepare for that presentation, and the preparation involved reading three books about Microsoft Excel programming:  “Professional Excel Development”, “Excel Power Programming with VBA”, and “Pro Excel 2007 VBA”.  The first two books are great references but are also easily readable from start to end.  The third book really does not introduce anything that could not be discovered through a couple of hours of experimentation and Excel help documentation reading (the help file in Excel 2007 is much better than in earlier versions, and was improved again in Excel 2010).
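
As an example of the WMI querying mentioned above, the following minimal sketch (the macro name and the selected properties are just illustrations; WMI exposes many other classes) lists the running processes on the local computer and writes the results to the active worksheet:

Sub ListProcesses()
    ' Minimal sketch: query the local WMI repository (root\cimv2) for the
    ' running processes and write the process name and working set size
    ' to the active worksheet.
    Dim wmi As Object, procs As Object, proc As Object
    Dim r As Long

    Set wmi = GetObject("winmgmts:\\.\root\cimv2")
    Set procs = wmi.ExecQuery("SELECT Name, WorkingSetSize FROM Win32_Process")

    Range("A1").Value = "Process"
    Range("B1").Value = "Working Set (bytes)"
    r = 2
    For Each proc In procs
        Cells(r, 1).Value = proc.Name
        Cells(r, 2).Value = CDbl(proc.WorkingSetSize)
        r = r + 1
    Next proc
End Sub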

As for tricks, turn on the macro recorder in Excel, make a couple of changes or set up a Microsoft Query that accesses an Oracle Database, and then examine the resulting macro.  This method of generating macros is a great starting point, as long as you keep in mind that the generated code is not necessarily the most efficient approach to producing the intended results, and there is a small chance that the generated code will not work as expected when it is later executed.  If you see an unusual command, click in the middle of the command and press the F1 key for additional information about the command.
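
For instance, recording the creation of a Microsoft Query against an Oracle database typically produces something similar to the following sketch (the DSN name, credentials, and SQL statement are placeholders, not a working configuration):

Sub OracleQueryExample()
    ' Minimal sketch resembling macro recorder output for a Microsoft
    ' Query: pull the result of a SQL statement into the active worksheet
    ' through an ODBC connection.
    With ActiveSheet.QueryTables.Add( _
            Connection:="ODBC;DSN=MyOracleDSN;UID=myuser;PWD=mypassword;", _
            Destination:=Range("A1"))
        .CommandText = "SELECT owner, COUNT(*) objects FROM all_objects " & _
                       "GROUP BY owner ORDER BY owner"
        .Refresh BackgroundQuery:=False
    End With
End Sub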

Tell us about your favorite network tools.  When do you recommend bringing out network tracing?  What about other operating system knowledge: do you feel that DBAs would benefit from looking at more low-level tracing and measuring?

For the rare times when I need to perform network analysis, I typically reach for ping, tnsping, or tracert (traceroute) first.  If I am working on a computer running Windows I might also run either nbtstat or netstat with various parameters.  One of the little-known but key network tools is actually the event log found in a managed network switch, which might provide a clue about misconfigured switch or server settings, failing hardware or failing network links, faulty wiring, and a number of other problems.  For automated network monitoring it is hard to find a better free tool than Nagios.

For a DBA, network monitoring with a packet capture tool such as Wireshark (available on Windows, Unix/Linux with an X Window environment, and Mac) or tcpdump probably should only be attempted when a 10046 trace at level 8 or 12 finds that only a small percentage of the processing time is server-side database related activity, essentially when most of the elapsed time is in SQL*Net type wait events.  By time-sequencing a Wireshark capture with a 10046 trace file, it might be possible to determine why, for example, every time the client submits a specific SQL statement and a FETCH call is found in the 10046 trace file there is a two-second delay before the next entry in the 10046 trace file, essentially answering whether there was a problem/slowness in the network communication, or whether the client was busy using the retrieved data.  If the Wireshark capture shows that no problem was found in the two-second problematic interval, then a switch to client-side tracing with Wireshark and/or another tool such as Microsoft’s Process Monitor, Microsoft’s Spy++, SQL*Net tracing, or another client-side tracing utility might help.
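
As a point of reference, the 10046 trace at level 8 mentioned above can even be enabled for a test session from an Excel macro through ADO, which makes it convenient to capture the client-side activity and the server-side trace from the same place.  Below is a minimal sketch, assuming the Oracle OLE DB provider (OraOLEDB.Oracle) is installed on the client; the data source name and credentials are placeholders:

Sub Enable10046Trace()
    ' Minimal sketch: connect to an Oracle database through ADO, enable a
    ' 10046 trace at level 8 (waits) for the session, execute a query, and
    ' then disable the trace.
    Dim dbConn As Object, rs As Object

    Set dbConn = CreateObject("ADODB.Connection")
    dbConn.Open "Provider=OraOLEDB.Oracle;Data Source=MyDB;" & _
                "User ID=myuser;Password=mypassword;"

    dbConn.Execute "ALTER SESSION SET tracefile_identifier = 'NET_TEST'"
    dbConn.Execute "ALTER SESSION SET EVENTS '10046 trace name context forever, level 8'"

    Set rs = dbConn.Execute("SELECT SYSDATE FROM DUAL")
    Range("A1").Value = rs.Fields(0).Value   ' place the SQL of interest here

    dbConn.Execute "ALTER SESSION SET EVENTS '10046 trace name context off'"
    dbConn.Close
End Sub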

———————————

I hope that you enjoy reading the interview questions and answers; there are several other great articles in the November 2010 NoCOUG Journal.