On the Topic of Technology… 7


September 26, 2014

(Back to the Previous Post in the Series)

As I was walking from the back of the facility where I work to my office just recently, I caught myself repeatedly thinking as I approached the office door, “that huge computer was not here when I started calling my office a second home”.  I guess that I have worked at the same place for so long that I became blind to the interesting technology gadgets (if you can call multi-million dollar computers simply gadgets) that surround me on a daily basis.

A couple of years ago BlackBerry released a small 8″ tablet, and Motorola released a 10″ Xoom tablet (I think that Motorola sold out their Mobility division to Google, who then sold that division to a farmer to use as fertilizer).  At the time the Xoom and BlackBerry tablets were released, my boss was really excited about the Apple iPads, but he did not care to spend $500 or more of his own money for a toy to use at home.  He had a Windows computer at home, but he seemed to always view that computer as excessively slow (mostly when viewing websites), even though he spent close to $3,000 on the computer six years earlier.  I am not much of an Apple fan, so I decided to have a little fun with my boss’ situation.

On the day that the Xoom tablet became available on Amazon, I placed an order for the tablet.  When it arrived, I brought it into work and showed the boss how quickly it could pull up web pages, along with its support of Adobe Flash playback (the iPad never supported Adobe Flash).  Yet, he continued to go on about the iPad, even showing me newspaper articles written by tech gurus that boasted about the fantastic features of the iPad.  A year earlier I had bought a small Windows netbook with a 10″ display, trying to convince him that such a netbook was even better than an iPad, so obviously that prior attempt failed.

When the BlackBerry tablet was released, I made a special trip to Best Buy just to grab the tablet.  I set the tablet up to work with the BlackBerry phone that I had at the time.  Oh neat, I am able to look at the company emails that I receive on the phone using the tablet – certainly, that will convince the boss that something is better than the iPad.  I showed my boss, who was also using a BlackBerry phone at the time, the neat BlackBerry tablet that could not only quickly pull up web pages (along with showing Adobe Flash contents), but could also show company emails and use the phone as a mobile hotspot for viewing web pages.  He spent a couple of minutes looking over the BlackBerry tablet before handing it back to me.  I found a couple more newspaper articles about the iPad on my desk in the weeks that followed.

On a Sunday afternoon, I decided to do some video testing with the two tablets, in a final attempt to convince the boss that something other than an iPad is ideal for his use at home.  I took the two tablets to my second home (that’s the place where my office, and all of those huge computers are located), and decided to do a head-to-head video test with the two tablets.  I planned to show the best looking video from the two tablets to the boss, and finally win him over.  I held the two tablets side-by-side as I walked down the aisles of the huge computers.  As I walked, I wondered what that 40,000 pound part was doing in the big pit that was dug for one of the computers that was expected to arrive in another month or two.  No matter, I continued with my video testing, holding the tablets at head level as I walked.  I received some strange looks from the other employees as I walked about – I simply reassured the other employees that I was just trying to impress the boss.  I took the tablets home and processed the video from the tablets to eliminate meaningless portions.  Both tablets produced 720P video at either 29 or 30 frames per second that was virtually identical in video quality, but the BlackBerry video would play back directly in Windows Media Player, while the Xoom video required conversion to a compatible format.  I showed the boss the resulting video as proof that the BlackBerry tablet could not only quickly pull up web pages (along with showing Adobe Flash contents), show company emails, and use the phone as a mobile hotspot for viewing web pages, but also record 720P video that easily plays back on a Windows computer at home.  The boss thought for a minute or two, and then said, “Did you have a chance to read Walt Mossberg’s latest Wall Street Journal article?  There is a new iPad out now.”

Ah, fond memories.

I recently found the video clips that I recorded using the tablets back in 2011, and after reviewing the videos, I still can’t see much difference between the videos captured by either tablet.  The video looks nice when playing back, but pausing either video to take a screen capture results in a blurry single-frame mess 90% of the time.  The video showed the big pit that was dug for the large computer – yep, that pit now contains a multi-million dollar computer, and the wall that had been next to the pit was removed during a later expansion project.

In the nearly five years since I created the first article on this blog, I really have not said much about the company where I work.  I have posted a lot of Oracle Database book reviews on Amazon, as well as several reviews of security cameras.  Some readers on Amazon were convinced that I worked for a couple of book publishing companies, writing fake book reviews to promote the publishers’ books; people who actually read the book reviews should know better than that – the reviews are brutally honest.  Some other customers on Amazon thought that I was working for a security camera company and/or living in California; no, that is not the case.  As a result, I put together an article that shows some of the interesting technology and multi-million dollar computers that are located just feet from my office at work.  In the article, I included some still frames from the video that I captured in the walk-through with the tablets in 2011.

Below are three pictures from the article that I recently posted.  I am still trying to come up with good captions for the last two pictures – captions such as “taking a break” and “breaking in a new truck” come to mind.

Cincinnati CL-707 Laser Burner Slicing Through 1″ x 10′ x 20′ Plate Steel
In the Deep End
Need a Bigger Truck

 





On the Topic of Technology… 6


March 16, 2014

(Back to the Previous Post in the Series)  (Forward to the Next Post in the Series)

It has been a while since my last post on this blog – the simple answer is that I was busy with a lot of non-Oracle Database related items, and was suffering from a bit of writer’s block (nothing that a block dump can’t fix?).  I am expecting to soon receive the annual bill from WordPress for keeping this blog free of advertisements, as well as a bill for allowing the customized blog theme.

So, given the number of months since my last blog post,  I took the time to update the list of the top five most viewed articles for the past quarter.  The number one article shows how to install the Nagios network monitoring software on a Synology NAS (actually three different Synology NAS units), which means that a low cost NAS unit could be used to not only verify that a server used with Oracle Database responds to a ping request, but also that an Oracle database is reachable and healthy enough to provide a resultset for a simple SQL statement.  The number two article shows how to do a little mathematics with the help of Oracle Database, approximating the distance between two longitude and latitude coordinates.  The number three article shows how to use a programming language that was last updated in the late 1990s with the latest Microsoft operating system and what was the latest version of Oracle Database.
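
For reference, the usual starting point for that kind of calculation is the spherical law of cosines for great-circle distance (a sketch – φ is latitude and λ is longitude in radians, and r is the Earth’s mean radius, roughly 6371 km):

\[ d = r \cdot \arccos\bigl( \sin\varphi_1 \sin\varphi_2 + \cos\varphi_1 \cos\varphi_2 \cos(\lambda_2 - \lambda_1) \bigr) \]

Plugging in two coordinate pairs with r in kilometers yields an approximate distance in kilometers.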

The advancement of technology certainly means that it is important for IT professionals to try staying on top of the advancements in their technology niche, without completely cutting ties with technology of the past, as illustrated by the current number three article on this blog.  For me, that means buying and then reading cover to cover various books, reading articles, and experimenting with technology.  It helps that I am an IT manager in addition to being an Oracle DBA, so my technology niche is rather broad.  In December 2013 I placed an order for the updated version of “Troubleshooting Oracle Performance“, in part because I enjoyed the first version of that book so much that I read it twice, and also because I have not had sufficient time to experiment with Oracle Database 12c – it appears that the second edition might ship next month.  Someone recently left a comment on another book that I reviewed here and on Amazon – I tried ordering that book twice without success, and now there is apparently a new version of the book on Amazon that includes coverage of Oracle Database 12c, and the book is in stock!  Someone will have to spend the $56, write a review, and let me know if the author fixed the items that I and readers of this blog so patiently and clearly mentioned in 2010.  Anyone interested in the challenge?

As I mentioned, the scope of my job responsibilities extends far beyond that of Oracle Database.  I just recently migrated the company’s email system from Microsoft Exchange 2007 to Microsoft Exchange 2013 SP1.  Anyone who remembers the fun of typing cryptic code on a command line would enjoy this experience.  Simply moving the public folders from the old server to the new server was an excellent example of command line fun, reminding me of the fun that I had years ago trying to compile X.509 certificate support into a Linux kernel.  One book that I read and reviewed covered public folders in extensive detail, yet the commands found in the book failed with an error message at step 1.  The other book that I read and reviewed more or less skimmed the topic of public folders, so it was of no help for the task at hand.  No problem, I will just go to the source, Microsoft, for the solution.  A recent article on Microsoft’s site clearly listed all of the steps required to move the public folders from Exchange Server 2007 to Exchange Server 2013… all except for one very important step.  So, I am running command after command on the servers, trying to move the public folders from one server to the next, with only a partial idea of what these commands are doing.  Everything is going great, until I execute the last command listed here:

# Take snapshots of the legacy public folder structure, statistics, and
# permissions for verification after the migration:
Get-PublicFolder -Recurse | Export-CliXML C:\PFMigration\Legacy_PFStructure.xml
Get-PublicFolderStatistics | Export-CliXML C:\PFMigration\Legacy_PFStatistics.xml
Get-PublicFolder -Recurse | Get-PublicFolderClientPermission | Select-Object Identity,User -ExpandProperty AccessRights | Export-CliXML C:\PFMigration\Legacy_PFPerms.xml

# Locate any public folder with a backslash in its name, and rename it -
# such names would otherwise break the migration:
Get-PublicFolderDatabase | ForEach {Get-PublicFolderStatistics -Server $_.Server | Where {$_.Name -like "*\*"}}
Set-PublicFolder -Identity <public folder identity> -Name <new public folder name>

# Confirm that no previous migration attempt is blocking this one, then clean up:
Get-OrganizationConfig | Format-List PublicFoldersLockedforMigration, PublicFolderMigrationComplete
Set-OrganizationConfig -PublicFoldersLockedforMigration:$false -PublicFolderMigrationComplete:$false
Get-PublicFolderMigrationRequest | Remove-PublicFolderMigrationRequest -Confirm:$false

# Verify that no public folder mailboxes already exist on the Exchange 2013
# server, removing any that are found:
Get-Mailbox -PublicFolder
Get-PublicFolder
Get-Mailbox -PublicFolder | Where{$_.IsRootPublicFolderMailbox -eq $false} | Remove-Mailbox -PublicFolder -Force -Confirm:$false
Get-Mailbox -PublicFolder | Remove-Mailbox -PublicFolder -Force -Confirm:$false

# Generate the folder name to folder size mapping file - the command that failed:
.\Export-PublicFolderStatistics.ps1 <Folder to size map path> <FQDN of source server>
...

Spot the error?  Why is this server telling me that I need to provide a comma-separated list of parameters when I execute the Export-PublicFolderStatistics.ps1 script?  So, I submit the script again with commas separating the parameters – no, the same error is returned.  Must be a problem where I need to specify the parameters in double quotes also – no, the same error is returned.  What the four-letter word?  That is right – the same feeling as trying to compile X.509 certificate support into the Linux kernel roughly a decade ago, only now on Microsoft’s premium messaging platform.

So, what is the missing step?  Exchange Server 2007 ships with Microsoft PowerShell 1.0 – this command requires Microsoft PowerShell 2.0 to execute, yet that requirement was never mentioned.  Oh yeah, we forgot a step, get over it – you have another set of 10 cryptic commands to enter – only to be greeted with a failure message during the public folder migration, stating that the migration failed because some folder name that once existed on Microsoft Exchange 5.5 contains a character that is now considered invalid.  These problems never happen with an upgrade in the Oracle Database world, do they?  Advancement of technology, or Back to the Command Line.
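
In hindsight, a ten second check would have revealed the problem before any commands were run.  Below is a minimal sketch of such a check – the $PSVersionTable automatic variable was introduced with PowerShell 2.0, so its absence is itself the answer:

# $PSVersionTable does not exist in PowerShell 1.0, so a $null result means
# that the Exchange 2007 server needs an upgrade before running the script.
if ($PSVersionTable -eq $null) {
    Write-Warning "PowerShell 1.0 detected - Export-PublicFolderStatistics.ps1 requires PowerShell 2.0 or later."
} else {
    $PSVersionTable.PSVersion
}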

I have also spent a bit of time experimenting with IP security cameras.  I put one in my vehicle and went for a drive.  Ah, 1969, someone obviously has not finished compiling the time saving feature into the camera’s firmware? (Click the picture for a larger view.)

NC-239WF-LicensePlateTest-Mod

-

Let’s try a different stop light – these two cars are either turning the wrong direction (obviously an indication of a bug in the camera’s firmware), or are running a red light. (Click the picture for a larger view.)

NC-239WF-RedLightTest

The camera did not pick up much of interest in the vehicle, so I set it up just in time to catch a game of what appears to be football… or maybe it was a game of sock-her? (Click the picture for a larger view.)

NC-239WF-DeerFightTest

Technology is fun, except when it hits you in the nose.





On the Topic of Technology… 5


July 12, 2013

(Back to the Previous Post in the Series) (Forward to the Next Post in the Series)

As many readers of this blog are probably aware, Oracle Corporation has released Oracle Database 12.1.0.1 so far for the Linux, Solaris, and Windows platforms.  Oracle Database 12.1.0.1 may be downloaded from Oracle’s OTN website.  This article is not about Oracle Database 12.1.0.1, at least not specifically about that version.

In the previous article in this blog series last year, I mentioned experimenting with a Synology DiskStation DS212+, as well as a couple of IP based 640×480 resolution security cameras.  Since that time I have had the opportunity to purchase a few additional NAS devices: a Synology DiskStation DS112J, a Synology DiskStation DS412+, and a Synology DiskStation DS1813+.  The DS212+ and DS112J NAS devices have ARM type processors, while the DS412+ (32 bit?) and DS1813+ (64 bit) have Intel Atom D2700 series processors (the processor series for other Synology models may be determined by visiting this link).  The processor type in the NAS partially determines the native capabilities of the NAS, as well as what else may be done with the NAS.  Setting up the Synology NAS devices to support FTP server functionality is fairly easy to accomplish, regardless of the processor type.  That FTP server functionality helps to support the upload functionality of the IP based security cameras.

As an experiment shortly after buying the Synology DiskStation DS212+, I attempted to install the network monitoring tool Nagios, in part to allow keeping track of which IP cameras were offline.  I hit a bit of a brick wall trying to find a precompiled package to permit the Nagios server functionality to run on the Synology DiskStations, which at the core run a version of Linux.  The closest thing that I could find was a plugin for Nagios to permit Nagios running on another machine to monitor a Synology NAS.  I first worked with Red Hat Linux in 1999, implemented dual inline manually-coded iptables firewalls based on a stripped down Red Hat Linux in early 2002, compiled/built a Linux based X.509 certificate supporting VPN server before the Linux kernel supported X.509 certificates (I tried compiling a patched version of the Red Hat kernel, patched with X.509 support, but eventually gave up and compiled the Working Overloaded Kernel), and even tried running Red Hat Enterprise Linux with Samba and Winbind as a member of the company’s file server team.  I first worked with Nagios in 2002, when one of my brothers introduced me to the Linux based network monitoring tool (previously called NetSaint).  Needless to say, I have experience working with Linux and manually compiling software on that platform, but that experience is apparently quite rusty.  The attempt to compile the Nagios source code on the Synology DiskStation DS212+ came to an abrupt halt when I received a message during the compile process essentially stating that the ARM type CPU (Marvell Kirkwood mv6282) did not support fine timer resolutions.

A couple of months later, I tried compiling the Nagios source on the Synology DiskStation DS412+, which features an Intel CPU architecture.  I encountered a couple of unexpected snags in the compile process, and had to put the project on hold for several months.  The paths to the various files on the Linux operating system running on the DiskStation differ a bit from the paths used by the Red Hat variants of Linux – that lack of standardization across the various Linux distributions has frustrated me from time to time over the years.

I recently purchased and reviewed a Synology DiskStation DS1813+.  In the review, I stated the following before testing the theory:

“Additionally, ipkg support permits the installation of roughly 900 additional applications, including C++ compilers – which in theory suggests that the source for the Nagios network monitoring utility can be downloaded and compiled on the DS1813+.”

I am curious to know whether or not anyone is able to get the Nagios server software to run on a Synology DiskStation DS412+ or DS1813+.

I suppose that I should not have proposed that the Nagios network monitoring utility might work on the DiskStation without actually confirming that the utility will work.  I am now able to confirm that the Nagios network monitoring utility will execute on the Synology DiskStation DS1813+, although the check_http plugin failed to compile.  The installation is anything but straightforward – there are no how-tos that are close to being useful, and no Setup.exe to double-click.  The following screen capture also does not help (non-root users are not permitted to use the ping command on the DiskStations):

TopicOfTechnology5-1

At this time, I cannot provide a detailed instruction list for running the Nagios network monitoring utility on a Synology DiskStation.  However, as a starting point it is necessary to add ipkg support to the DiskStation.  The following ipkg items might be necessary: optware-devel, gcc, libtool, mysql, apache, openssl, openssl-dev, sendmail, inetutils.  With a bit of experimentation (and luck), you might see something similar to this when typing the ps command in a telnet session (I actually typed the command a second time so that the column headings would be visible – there certainly are a lot of background processes on the DiskStation):

TopicOfTechnology5-2

As I found out, just because Nagios is in the process list, that does not mean that it is able to do much of anything useful.  A work-around for the non-root ping issue is needed (I might have hinted at part of the solution when listing the various ipkgs), as well as a work-around for the non-root sendmail problem that I did not mention.

When Nagios is working properly, unplugging a monitored device should result in an email message being sent (of course, if you unplug your computer, you probably will not receive an email message stating that the computer is down :-) ):

TopicOfTechnology5-3

There appear to be several Nagios plugins to monitor Oracle databases, although I have not had a chance yet to determine if any of those plugins will compile and work on a Synology DiskStation.  In theory it should wor… wait, I am not headed down that path yet!

In addition to a Synology DiskStation DS212+, the previous article in this series also showed a couple of smart 640×480 resolution IP cameras.  At the time of the previous article, I did not fully comprehend the usefulness of smart IP cameras.  Roughly 30 IP based cameras later, I now have a better understanding of their usefulness and limitations.  Last year I wrote reviews for three 640×480 model cameras here (it appears that Amazon now has this review attached to a different camera), here (it appears that Amazon now has this review attached to a different camera), and here (OK, there is a fourth camera included in this review due to a model changeover).  I was also burned badly (at a loss of $1343) when I bought two 1080P cameras last year that could not meet (or even approach) the manufacturer’s claims for the product.  All of those reviews include video samples produced by the cameras.

This year I bought and reviewed a couple of smart 720P resolution IP cameras, as well as a couple of different (from last year’s 1080P) smart 1080P resolution IP cameras.  As before, the reviews include sample video clips recorded by the cameras (the 720P and 1080P video was uploaded at the native resolution, but it appears that Amazon uses a pretty aggressive compression algorithm, which leads to some lost video quality).  The new 720P and 1080P cameras are not perfect, but the manufacturer appears to be taking steps to address the weaknesses that I outlined in the reviews.  I was sent another updated firmware for the 1080P cameras, as well as an updated PDF that includes the instructions that were missing from the included printed manual.  The support person for the camera company also stated that their website is currently under development, and will probably be online in the next 30 days.  My review mentioned the lack of success at using the recommended P2PCam264 app on a Motorola Xoom tablet for viewing the live video feed from the smart 720P and 1080P cameras.  The support person suggested using the AnyScene app on the Motorola Xoom tablet for viewing the live feed – that app seems to work.  The AnyScene app, while seemingly lacking the sound feed from the cameras, might even work a little too well.  I brought the Xoom tablet to a different network, only to find that the app is somehow able to still pull a live video feed from any of the configured cameras on the other network without poking any holes in the firewall on either network, even with universal plug and play (UPnP) disabled (below is a low resolution picture captured with a cell phone).  I am now left wondering what level of security risk this plug and play technology might pose.

TopicOfTechnology5-4

Sample PNG Generated from 720P Camera’s Video (Click to Display Larger Version):

topicoftechnology5-8

Sample PNG Generated from 1080P Camera’s Video (Same Scene as the Above Example – Click to Display Larger Version):

topicoftechnology5-9

Sample JPG 720P Image from an Edited Video (the 1080P video suffers from fewer out of focus problems and is the same resolution – just with a roughly 50% wider and taller viewing area):

TopicOfTechnology5-6

TriVision User Manual for PC – Aug 2012
TriVision User Manual for PC and Mac – Dec 2013

FIRMWARE_EN_TRIVISION_NC_336PW_HD_1080P_60HZ_V5.78B20140916.mfw (Do NOT open with Microsoft Word – when saving, remove the .doc second extension)

FIRMWARE_EN_TRIVISION_NC_336W_HD_1080P_60HZ_V5.78B20140916.mfw (Do NOT open with Microsoft Word – when saving, remove the .doc second extension)

CameraLive_Setup.exe  (Do NOT open with Microsoft Word – when saving, remove the .doc second extension)





Unexpected Timer Resolution, Unexpected Parked CPUs, Unexpected Power Consumption


April 19, 2013 (Modified May 11, 2013, June 5, 2014)

This blog article is not purely Oracle Database specific, yet it may have some relevance to companies that run Oracle Database on the Windows Server platform – those DBAs lucky/unlucky enough to run Oracle Database on Windows may find this article interesting.

I am in the process of setting up a couple of new Windows servers to perform various non-Oracle Database tasks.  I noticed that one of the servers had an odd issue – the server would occasionally become very slow at responding to mouse movements and keyboard input, for instance taking 30 seconds to move the mouse pointer a short distance across the screen.  These servers are running Windows Server 2012, which shares the same kernel and includes much the same features as Windows 8 – with the exception that the server operating system opens to the desktop rather than Windows 8’s new start screen.

Two years ago I wrote a brain teaser article that asked how it was possible that a 10046 extended SQL trace could output c=15600,e=510 on a line of the trace file when executing a SQL statement without using parallel query – essentially asking how it was possible to consume 0.015600 seconds of CPU time in 0.000510 seconds of elapsed time when the SQL statement was restricted to running on no more than one CPU.  In the comments section of the article I mentioned the ClockRes utility, but did not provide a link for the download of the program.  So, I thought that I would run the ClockRes utility on one of the new servers, make a change to the server, and then run the ClockRes utility again:

UnexpectedClockResOutput

As can be seen above, on the first execution of ClockRes the Current timer interval was 1.001 ms, while on the second execution of the ClockRes program the Current timer interval was 15.626 ms.  There is an odd similarity between that 15.626ms time (which oddly exceeds the reported Maximum timer interval of 15.625ms) and the c=15600 reported in the Oracle 10046 extended SQL trace file.  So, what change did I make to the server between the first execution of ClockRes utility and the second execution?  For now I will just say that I stopped one of the background services on the server (more later).
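
For anyone without the ClockRes utility handy, the same three numbers may be read through the NtQueryTimerResolution function exported by ntdll.dll.  Below is a rough PowerShell sketch – note that NtQueryTimerResolution is undocumented, so the signature (three unsigned 32 bit out parameters, with values in 100 ns units) is an assumption based on commonly published descriptions:

# NtQueryTimerResolution is undocumented - the signature below is an assumption.
$signature = @'
[DllImport("ntdll.dll")]
public static extern int NtQueryTimerResolution(out uint MinimumResolution, out uint MaximumResolution, out uint CurrentResolution);
'@
$ntdll = Add-Type -MemberDefinition $signature -Name NtDllTimer -Namespace Win32 -PassThru
[uint32]$min = 0; [uint32]$max = 0; [uint32]$cur = 0
[void]$ntdll::NtQueryTimerResolution([ref]$min, [ref]$max, [ref]$cur)
# MinimumResolution is the coarsest (default) interval, MaximumResolution the
# finest supported interval; divide the 100 ns units by 10,000 to obtain ms.
"Maximum timer interval: {0} ms" -f ($min / 10000)
"Minimum timer interval: {0} ms" -f ($max / 10000)
"Current timer interval: {0} ms" -f ($cur / 10000)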

I recall performing an experiment a couple of years ago with Oracle Database.  I downloaded a utility that offered to change the Windows default timer resolution from 15.625ms to 1.0ms.  That utility did in fact change the Windows timer resolution, resulting in Oracle Database outputting c= values in increments of 1000, rather than in increments of 15600.  If I am remembering correctly, a second outcome of the experiment was a decrease in performance of the test Oracle database on the computer due to the higher resolution of the Windows timer.
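
That behavior makes sense if the CPU time figures are read from a Windows counter that only advances once per timer tick, with the reported value quantized to whole ticks:

\[ c = n \cdot t_{tick}, \qquad t_{tick} = 15.625\ \textrm{ms} \approx 15600\ \mu\textrm{s} \]

With the default tick, the smallest non-zero CPU figure is one full tick, so a short call that happens to straddle a single tick boundary is charged c=15600 even though only e=510 microseconds of elapsed time were measured; with the timer resolution set to 1ms, the same accounting produces the c= values in increments of 1000.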

Could the change in the resolution of the Windows timer from the Windows default of 15.625ms to 1.001ms be responsible for the occasionally sluggish performance of the server?  One article that I found (and unfortunately did not save the link to) claimed that adjusting the Windows timer from the default of 15.625ms to a lower value, 1ms for example, could cause a significant negative impact in multitasking system performance (roughly 30% decrease, if I recall correctly).  I located an article on Microsoft’s website that offered some level of clarification, below is a short quote from the article:

“Applications can call timeBeginPeriod to increase the timer resolution. The maximum resolution of 1 ms is used to support graphical animations, audio playback, or video playback. This not only increases the timer resolution for the application to 1 ms, but also affects the global system timer resolution, because Windows uses at least the highest resolution (that is, the lowest interval) that any application requests. Therefore, if only one application requests a timer resolution of 1 ms, the system timer sets the interval (also called the “system timer tick”) to at least 1 ms. For more information, see “timeBeginPeriod Function” on the MSDN® website.

Modern processors and chipsets, particularly in portable platforms, use the idle time between system timer intervals to reduce system power consumption. Various processor and chipset components are placed into low-power idle states between timer intervals. However, these low-power idle states are often ineffective at lowering system power consumption when the system timer interval is less than the default.

If the system timer interval is decreased to less than the default, including when an application calls timeBeginPeriod with a resolution of 1 ms, the low-power idle states are ineffective at reducing system power consumption and system battery life suffers.”

The above mentioned Microsoft article also suggested running the following command from the Windows command line:

powercfg /energy
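
As a side note, powercfg also accepts /output and /duration switches, so the report location and the observation window may be specified – a hypothetical example that captures five minutes of activity to a specific file:

powercfg /energy /output C:\temp\energy-report.html /duration 300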

I had actually executed the powercfg /energy command before running the ClockRes program for the first time, and again after running the ClockRes program for the second time.  A very small portion of the powercfg generated HTML file follows, generated prior to the first execution of ClockRes:

Platform Timer Resolution: Platform Timer Resolution
The default platform timer resolution is 15.6ms (15625000ns) and should be used whenever the system is idle. If the timer resolution is increased, processor power management technologies may not be effective. The timer resolution may be increased due to multimedia playback or graphical animations.
  Current Timer Resolution (100ns units): 10009
  Maximum Timer Period (100ns units):     156250

Platform Timer Resolution: Outstanding Timer Request
A program or service has requested a timer resolution smaller than the platform maximum timer resolution.
  Requested Period:        10000
  Requesting Process ID:   536
  Requesting Process Path: \Device\HarddiskVolume4\PROGRA~2\APC\POWERC~1\agent\pbeagent.exe

This is the same section of the generated HTML file, generated after the second execution of ClockRes:

Platform Timer Resolution: Platform Timer Resolution
The default platform timer resolution is 15.6ms (15625000ns) and should be used whenever the system is idle. If the timer resolution is increased, processor power management technologies may not be effective. The timer resolution may be increased due to multimedia playback or graphical animations.
  Current Timer Resolution (100ns units): 156261

That is potentially interesting.  The output of powercfg stated that PROGRA~2\APC\POWERC~1\agent\pbeagent.exe requested a timer of 1.000 ms, which then changed the Windows server system-wide timer to 1.0009ms.  Interesting?  PROGRA~2\APC\POWERC~1\agent\pbeagent.exe resolves to the “APC PBE Agent” service in Windows, which is a component of the American Power Conversion (APC) PowerChute Business Edition software.  That software interfaces with an attached UPS to provide a gentle shutdown of the server in the event of an extended power outage.  The “APC PBE Agent” service happens to be the service that I shut down between the first and second execution of the ClockRes utility.

Interesting?  Does that suggest that installing the APC PowerChute Business Edition software on a server potentially has a significant impact on the performance of that server due to the program’s insistence on changing the Windows system-wide timer resolution to 1ms?  A quick observation indicates that the change made by the APC software to the Windows system-wide timer resolution apparently does NOT affect the reporting of the c=15600 entries in an Oracle Database 10046 extended SQL trace when the APC software is installed on the server.  The question remains whether or not this APC software could significantly decrease the performance of that Oracle Database software (potentially by 30%, as suggested in the one unnamed article).
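
For anyone wanting to reproduce the before and after ClockRes measurements on their own server, the APC service may be stopped and restarted with the standard service cmdlets (a sketch – “APC PBE Agent” is the service name as it appears on my server, and stopping it temporarily disables the UPS monitoring):

Stop-Service -Name "APC PBE Agent"      # timer interval should drift back toward 15.625ms
# ... run ClockRes and/or powercfg /energy here ...
Start-Service -Name "APC PBE Agent"     # timer interval returns to roughly 1ms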

——

The Windows Server that is experiencing occasionally jittery mouse and keyboard input is reasonably high-end for a Windows server: Intel Xeon E5-2690 8 core CPU at 2.9GHz (with hyperthreading enabled, giving the appearance of 16 CPUs in Windows), 64GB of memory, RAID controller with 1GB of battery backed cache, 16 internal 10,000 RPM hard drives, two gigabit network adapters in a teamed configuration, etc.  It should require a substantial load on the server to cause the jittery mouse and keyboard input behavior.

The power option plan in Windows was set to High Performance, while the default plan in Windows Server is Balanced.  Various articles on Microsoft’s website state that the Balanced plan allows the server/operating system to use CPU speed throttling (reducing the CPU speed from the stated speed rating, 2.9GHz in the case of this server), and core parking (essentially putting one or more CPU cores to sleep) in order to reduce energy consumption.  Some articles on Microsoft’s site indicate that, at least with Windows Server 2008, CPU parking may increase IO latencies – that, of course, would be bad if Oracle Database were installed on the server.  Other articles on Microsoft’s site indicate that there are bugs, at least with Windows Server 2008, related to core parking which cause the parked cores not to wake up when the CPU load increases.  I wonder if this particular bug is playing a part in the performance issue faced in this very recent Usenet thread that describes poor performance of Oracle Database running in Hyper-V on Windows?

Here is a screen capture of the Power Options window and Task Manager on the Windows Server 2012 machine that is experiencing occasionally jittery mouse and keyboard input (screen capture taken when the server was mostly idle):

UnexpectedPowerOptionsTaskManager

Notice the inconsistency?  The server’s CPU is throttled down from 2.9GHz to just 1.16GHz while the power option plan is set to High Performance.  The Microsoft published “Performance Tuning Guidelines for Windows Server 2012” document on pages 16-17 states the following (I highlighted some of the words in red):

“Balanced (recommended): Default setting. Targets good energy efficiency with minimal performance impact.  Matches capacity to demand. Energy-saving features balance power and performance.

High Performance: Increases performance at the cost of high energy consumption. Power and thermal limitations, operating expenses, and reliability considerations apply.  Processors are always locked at the highest performance state (including “turbo” frequencies). All cores are unparked.

Power Saver: Limits performance to save energy and reduce operating cost.  Caps processor frequency at a percentage of maximum (if supported), and enables other energy-saving features.”

Well, that is interesting, and is inconsistent with the above screen capture.  Incidentally, when the server was experiencing the worst of the occasionally jittery mouse and keyboard input, the CPU utilization was hovering around 6% and the CPU speed was still coasting at 1.16GHz to 1.18GHz, the network performance hovered between 600Mbps and 1100Mbps, and the server’s internal hard drives barely noticed the traffic passing to/from the disks through the network interface (lower than 75MB/s and 137MB/s, respectively).  6% CPU utilization causes the mouse and keyboard input to become jittery?  With hyperthreading enabled, there are essentially 16 CPU seconds available per second of elapsed time.  A quick check: 1/16 = 0.0625, so 1 (hyperthreaded) CPU at 100% utilization would be reported as a system-wide utilization of 6.25%.  Interesting, but is that statistic relevant?

I happened to have the Windows Resource Monitor open during one of the jittery episodes.  The Resource Monitor showed, shockingly, that 14 (possibly 15) of the hyperthreaded “CPUs” were parked!  That result is also in conflict with the Microsoft document mentioned above regarding “all cores are unparked” when the High Performance power plan is selected.  So, at 6% CPU utilization the server was CPU constrained.  Modifying the server’s BIOS setting that controls whether or not cores may be parked, so that the cores could not be parked, fixed the Windows Server 2012 issue that caused the 30 second delays when moving the mouse pointer a short distance across the screen.
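
For servers where a BIOS change is not practical, a software-side alternative that I have seen suggested is raising the minimum percentage of unparked cores to 100% in the active Windows power plan.  Below is a sketch, assuming that the CPMINCORES alias for the core parking minimum cores setting (hidden from the Power Options GUI by default) is recognized by the powercfg build on the server:

powercfg -setacvalueindex scheme_current sub_processor CPMINCORES 100
powercfg -setactive scheme_current
# The second command re-applies the active plan so that the change takes effect.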

The server still exhibits a bit of jittery behavior with mouse and keyboard input when the server’s teamed network cards are heavily used for file transfers to the server, but at least the CPU activity is no longer confined to a single hyperthreaded “CPU”:

UnexpectedResourceMonitor

Considering that this server was ordered from the manufacturer as “performance optimized”, I am a bit surprised at the power consumption of the server.  The server was ordered with dual (redundant) 1100 watt power supplies.  With the CPU’s 135 watt maximum TDP (per Intel: “Thermal Design Power (TDP) represents the near maximum power a product can draw for a thermally significant period while running commercially available software.”), 16 hard drives, and 64GB of memory, I fully expected the server to consume between 700 and 900 watts of electrical power.

Here is the server’s power consumption when the server is lightly loaded with roughly 68 running processes (note that the server is connected to a 120 volt power outlet):

UnexpectedPowerConsumptionLittleLoad

Here is the server’s power consumption when the server is moderately loaded with between 600Mbps and 1100Mbps of network traffic (the mouse pointer was slightly jittery at this point):

UnexpectedPowerConsumptionNetworkLoad

So, the server consumes 1.2 amps (126 watts) when lightly loaded and 1.4 amps (154 watts) when moderately loaded.  Keeping in mind that many of the popular incandescent light bulbs require 100 watts of power (note that some governments have now restricted the manufacturing of high wattage incandescent light bulbs), this server is consuming just a little more electrical power than a light bulb that might have been hung overhead just a decade or two ago.
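
In case the amp and watt figures appear inconsistent (1.2 amps at 120 volts works out to 144 VA of apparent power), the meter is presumably reporting true power, which implies a power factor that is plausible for a switch-mode power supply:

\[ PF = \frac{P}{V \cdot I} = \frac{126\ \textrm{W}}{120\ \textrm{V} \times 1.2\ \textrm{A}} \approx 0.88 \]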

One of the common arguments for server virtualization is energy savings – the above screen captures may suggest that energy savings may not be a significant cost-savings factor for virtualization with modern server hardware.  One might question how much energy is really being saved when the network interface is maxed out by a single virtualized server, just 6% CPU utilization results in a jittering mouse pointer, and there are eight to ten virtualized servers stacked on the physical hardware (all competing for the scarce CPU and network resources).

Added May 11, 2013:

Dell BIOS setting to enable or disable CPU parking in Windows Server 2012:

UnexpectedPowerConsumptionBIOSProcIdle

With the BIOS option set to enabled, disk activity caused by network traffic results in occasionally jittery mouse movements on the server.  Based on a bit of research, installing the Hyper-V role on either Windows Server 2012 or Windows 8 may disable CPU throttling and/or disable CPU parking.

Added June 5, 2014:

I finally had sufficient time to fully analyze this problem, where a 2.9GHz CPU in a Dell PowerEdge T620 server crawled along at a leisurely pace of about 1.16GHz, actually throttling back performance further as demand for the server’s resources increased.  A second Dell PowerEdge T620 server with a 2.6GHz CPU that was purchased at the same time also coasted along at roughly 1.16GHz, but that server did not seem to throttle back performance further as demand for the server’s resources increased.

As a review, the screen capture shown below at the left shows the Windows Server 2012 Power Options settings and the Performance tab of the Task Manager.  The screen capture below at the right shows the Windows Server 2012 Power Options settings and the Performance tab of the Task Manager after fixing this particular problem – note that the 2.9GHz CPU is now essentially overclocked at 3.28GHz (it has operated at roughly that speed since the fix).

UnexpectedT620PowerOptionsTaskManager UnexpectedT620PowerOptionsTaskManager2

The 2.9GHz PowerEdge T620 and the 2.6GHz PowerEdge T620 are both Active Directory domain controllers and internal DNS servers (along with supporting other tasks), so the occasionally slow (or extremely slow) performance of the servers negatively impacted the performance of other servers as well as client workstations.

There was a BIOS firmware update released in the third quarter of 2013, which was supposed to address some CPU throttling issues – that BIOS update did not seem to help the problem that I experienced.

I thought that the low power consumption of the big server with the 2.9 GHz E5-2690 8 core CPU was a significant clue when I tried troubleshooting the server a year ago since that CPU is rated to consume up to 135 watts and the server liked to hover between 120 watts and 140 watts regardless of the server’s workload.  The Dell T620 (and other recent Dell servers) has some pretty sophisticated power management capabilities.  Dell sells a utility that is able to alter the electrical power profile of a server, and I thought that Dell might have imposed a 140 watt limit on the server for some reason, but I could not find where that limit was specified.  The 2.9 GHz E5-2690 8 core CPU apparently has some additional electrical power limiting capabilities.  A year ago I even tried downloading a demo of Dell’s power management utility – that did not help resolve the issue (I think that the installation might have caused some other issues that I had to fix).  Last week Tuesday I read the following articles:
http://www.dell.com/learn/us/en/19/financial-services-markets-solutions-processor-acceleration-technology
http://www.dell.com/us/business/p/dell-openmanage-power-center/pd
http://www.intel.com/content/www/us/en/data-center/data-center-management/how-to-configure-node-manager-video.html
ftp://ftp.dell.com/Manuals/Common/poweredge-r720_Concept%20Guide_en-us.pdf
http://en.community.dell.com/techcenter/power-cooling/w/wiki/3536.openmanage-power-center-faq.aspx

I rebooted the server, pressed F2, and dug around in the settings a bit.  I found that the System Profile Setting was set to “Performance per Watt” (I believe that this was how it was set when it left the Dell factory).  I changed that setting to “Performance”, saved the changes, and rebooted the server again.  The server is now consuming 200+ watts, and the CPU is freely exceeding its rated speed.  The pictures below show the System BIOS configuration changes that remove the electrical power cap, thus allowing the server to behave as it should have from the factory:

UnexpectedT620SystemProfile1 UnexpectedT620SystemProfile2

UnexpectedT620SystemProfile3 UnexpectedT620SystemProfile4

I suppose that if a Dell PowerEdge T620 (or similar recent model Dell server) seems to be running a bit slower than expected (note that the particular problem mentioned above is NOT Windows specific – a Dell PowerEdge T620 running Linux should be affected in the same way), you might take a quick peek at the System Profile Setting in the System BIOS to make certain that the System Profile is set to Performance.  As shipped from the factory, two Dell PowerEdge T620 servers purchased this year were NOT affected by the problems mentioned in this blog article.





On the Topic of Technology… 4


May 7, 2012

(Back to the Previous Post in the Series) (Forward to the Next Post in the Series)

Today’s blog article has an unusual tie-in with Oracle.

The last couple of weeks I have been experimenting with video technology.  Computer related video capabilities have certainly changed over the years.  In 1996 I purchased a Sony Handycam and a Video Snappy.  The Sony video camera was capable of recording video with about 480 lines of resolution (NTSC standard) with a 10x optical zoom and recording capability in just 0.6 LUX of lighting.  The Video Snappy plugs into a computer’s parallel (old style printer) port, connects to the Sony video camera by an RCA style video cable, and converts the live video feed from the video camera to still photos.  Combined with a 120MHz Pentium processor, it seemed to be a state-of-the-art setup at the time (of course ignoring the capabilities of the Commodore Amiga/Newtek Video Toaster).  A picture of the Sony/Snappy configuration is shown below.

Of course the video capabilities of current tablets, cell phones, and digital cameras far exceed what was available in the 1990s – video recording with those devices was described in the previous blog article in this series (see the link at the top of this article).

A product that I recently found is the Synology DiskStation DS212+, which has some remarkable features considering that its primary objective is to provide an external hard drive array with RAID 1.  This particular unit ships without hard drives, so I purchased two Western Digital 2TB Green hard drives.  The external hard drive enclosure includes an SD media card reader, a single USB 2 port, two USB 3 ports, and an eSATA port to allow connecting additional external hard drives, printers, and wireless cards.  While not much larger than the hard drives installed in the unit, it certainly offers much more than access to those drives.  The DS212+ offers FTP services (including secure FTP), an iSCSI interface, DHCP services, media sharing services, WordPress, MySQL, PHP, a remarkable operating system that fully renders console screens in a web browser without the use of Adobe Flash (uses HTML 5 and CSS 3), and more.

The Synology DiskStation DS212 line’s disk throughput is limited by a combination of CPU performance and gigabit network maximum transfer speed, with the DS212+ offering the fastest rated transfer speed of roughly 108 MB/s read (very close to the maximum speed of gigabit Ethernet) and 66 MB/s write.  The Synology DS212+ is pictured below, measuring roughly the same height as five books on the topic of Oracle Database.
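
To put the 108 MB/s figure in perspective, a rough calculation (the protocol overhead percentage is an estimate):

\[ \frac{1\ \textrm{Gbit/s}}{8\ \textrm{bits/byte}} = 125\ \textrm{MB/s raw} \]

After typical TCP/IP and file sharing protocol overhead, the practical ceiling of a gigabit link is usually somewhere around 110 to 118 MB/s, so the DS212+ read rating sits within roughly 10% of what the wire can deliver.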

So, what does the first picture have in common with the second?  More about that commonality later.

Below is a screen capture of the Synology DS212+ operating system GUI (graphical user interface) rendered in Internet Explorer 9, roughly 21 minutes after powering on the unit for the first time and installing the latest version of the operating system.  As seen below, the 2TB drives were 84% formatted roughly six and a half minutes after I connected for the first time (a verify process lasting several hours started immediately after the format, but the hard drives were accessible during this verify process).

The operating system renders resizable, movable, drag and drop capable, right-clickable windows within the web page.  Several of the free optional packages for the DS212+; a resource meter showing CPU, memory, and network utilization; current network connections; and recent log entries are shown in the picture below.

So, what function does the DS212+ serve other than consuming electricity?  That is still a question that I am trying to answer, but I have only had access to the system for a couple of days.  I installed several of the free optional packages (after downloading the latest version from the company’s website), and experimented a bit.  The screen capture below shows the DS212+ playing an Internet radio stream (the channel was essentially selected at random), while simultaneously playing back a 640 pixel by 480 pixel video.

Incidentally, the above video was captured in a completely dark room using infrared lights that are built into the video camera.

As I mentioned at the beginning of this article, over the last couple of weeks I have spent a bit of time working with video technology.  Pictured below are two TriVision NC-107WF video cameras and a SanDisk 32GB micro SD memory card that works with the cameras.  I have also worked with a couple of TriVision NC-107W video cameras, which lack an infrared cut filter, resulting in poor color rendering.

So, what has 16 years of technology progress provided, comparing to the Sony Handycam shown at the start of this article?  The camera shown below records video at 640 pixels by 480 pixels, much like the Sony Handycam, so that feature has not improved much.  The TriVision camera digitally records nearly a month’s worth of video to a memory card that is about the size of a thumbnail, while the Sony Handycam digitally records between a half hour and two hours of video to a tape that is about the size of an average person’s fist.  The TriVision camera records black and white video in complete darkness due to its built in infrared lights, while the Sony Handycam records excellent completely black videos in the same lighting conditions.

Surprisingly, there are no reviews of the TriVision line of cameras on Amazon.  The cameras appear to be a clone of the (Amazon) highly rated Sharx Security brand of security cameras.  Unlike some of the other security cameras on the market, this camera ships with a well written user manual (with only a small number of typos).  Offering motion detection, support of up to 32 GB of storage, automatic upload of video and still photos to an FTP server, live streaming through desktop web browsers and mobile devices, and a handful of other capabilities, it is hard to believe just how much technology is stuffed into such a small package.  The wireless range when paired with a Cisco 1250 series access point is impressive, but not terribly impressive when paired with a consumer grade Linksys/Cisco wireless router with integrated antennas.  Poor wireless performance is not necessarily a problem, since the camera stores recorded video to the memory card until the specified FTP server is accessible.  The cameras ship with Multi-live software that permits simultaneous viewing and recording of up to 36 cameras directly from the video streams, which is helpful if an FTP server is not configured.

Reliability of the TriVision NC-107WF/NC-107W cameras is still an unknown.  I have experienced occasional glitches accessing the built-in web server, making it impossible to adjust the camera settings (power cycling the camera seems to correct this issue), however those glitches apparently do not affect video recording or uploading of the captured video to FTP servers.

I have also spent a bit of time working with TriVision’s NC-306W outdoor wireless video cameras, which are shown in the picture below.  The NC-306W camera appears to be a clone of the (Amazon) highly rated Sharx video camera.  The web-based configuration of the NC-306W is nearly identical to that of the NC-107WF.  A 32GB memory card with automatic FTP uploading is supported, as is two-way audio (the NC-107WF supports one-way audio).

Since there are no reviews of the TriVision NC-306W, it is difficult to determine the long-term reliability of this camera.  During installation, one of the mounting nuts snapped due to over-torquing, but that nut is only needed for overhead mounting as seen in the picture below (the mounting nut is attached directly between the sun shield at the top of the camera and the white colored dial at the end of the mounting rod).  As with the TriVision NC-107WF/NC-107W cameras, the built-in web server has occasionally stopped responding, but that problem has not affected video capture or FTP upload.

Below is a screen capture of a video stream from a TriVision NC-107WF camera.  The original video quality was slightly better than pictured below (conversion of the screen capture to JPG format caused some detail loss).  The same scene captured by a TriVision NC-107W camera would have a pink, purple, or red cast due to the presence of infrared light (the NC-107WF and NC-306W are able to selectively filter out the infrared light).

I had hoped to upload a couple of videos captured by the cameras; however, WordPress apparently does not support directly uploaded video formats.  I plan to update this blog article as I better understand all of the features that the Synology DiskStation DS212+ offers, and to provide reliability updates of the DS212+ and the TriVision cameras.





OT: Do Search Terms Describe the Visitor?


January 12, 2012

I thought that I would start this slightly off topic blog article with a bit of humor.  Seven months ago I wrote a blog article that refuses to move from the first position in the most visited articles on this blog.  In the process of trying to understand why a mathematics focused article is the most popular article on an Oracle Database specific blog, I regularly review the search keywords that bring people to my blog.  Yesterday alone, the search keywords that apparently pointed to the popular blog article included (not a complete list):

  • finding distance between two points lat long (2)
  • distance spherical points coordinates oracle (2)
  • find distance between two longitude latitude points (1)
  • calculate distance between points in a feature (1)
  • calculate speed longitude latitude (1)
  • distance between two lattitude points (1)
  • how to get distance between two latitudes in meter (1)
  • access table that converts latitude and longitude (1)
  • calculate time and distance between two longitudes (1)
  • distance between two points latitude and longitude transact sql (1)
  • converting to oracle degree minutes (1)
  • distance between latitude (1)
  • select sdo_geom.sdo_distance (1)
  • oracle sdo_geom.dos_distance (1)
  • calculate distance longitude latitude (1)
  • method of measuring distance using plain (1)
  • oracle spatial calculate distance (1)
  • how to calculate distance between two latitude and longitude (1)
  • to calculate distance between two points (1)
  • oracle spatial sort lat lon coordinates (1)
  • longitude and latitude distance between 2 points (1)
  • solve this:find distance between (35 degrees 21′) and (29 degrees 57′) and given that the radius is 3960 (1)
  • distance between two points wordpress (1)
  • how to get distance between two longitudes (1)
  • calculate distance between two lat long points (1)
  • how to find distance between points of longitude (1)
  • sql function calculate distance gps kilometers (1)
  • calculate distance between lattitute longitude (1)
  • excel calculate distance from lat long (1)
  • calculate distance from lat long sql (1) 

I think that the above keywords point to one and only one conclusion: the majority of the visitors to this blog are lost, know only their latitude and longitude coordinates (and the coordinates of where they are supposed to be), and are able to VPN in to their Oracle Database at work to determine how long it will take to become unlost.  :-)

While on the topic of interesting search keywords, I thought that I would mention a favorite search keyword that appeared in the WordPress statistics.  My all time favorite interesting search keyword that was used to access this blog appeared shortly after I reviewed a red covered Oracle related book – the search keywords were “Charles Hooper is getting sued”.  Needless to say, it is interesting that the search keywords found an article on my blog.

Moving on to the point of this blog article, I have also noticed a curious trend in some of the search keywords.  WordPress makes it a bit difficult to go back in time to tabulate somewhat popular search keywords when there are a lot of commonly used keywords that access this site.  However, with a bit of effort, I attempted to locate search keywords that share a common pattern.  Those search keywords that I located (with the minimum number of hits indicated) follow:

  • oracle core essential internals for dbas and developers pdf (8)
  • oracle core essential internals for dbas and developers download (6)
  • oracle core: essential internals for dbas and developers pdf (4)
  • oracle core: essential internals for dbas and developers download (1)
  • oracle tuning the definitive reference second edition pdf (99)
  • oracle tuning: the definitive reference pdf (3)
  • oracle tuning the definitive reference pdf (1)
  • oracle tuning the definitive reference 2nd edition pdf download (2)
  • oracle performance firefighting pdf (18)
  • expert oracle practices download (1)
  • apress.oracle.core.essential.internals.for.dbas.and.developers.nov.2011.rar (1)

Undoubtedly, in the above cases the searcher likely found my review of a particular book.  The above list has me wondering about a couple of items.  I was a full time college student for five years to obtain my degree.  I remember the pain of signing a check for what seemed to be (U.S.) $500 every semester – it did not seem to matter much if I was buying mathematics books (a $204 Calculus book from Amazon oddly seems appropriate here), computer books, or psychology books – the total was always about the same.  Was that money for books a waste?  Maybe I could have just found the book out on the Internet somewhere… they had that neat Gopher tool way back then, but not the WYSIWYG interface that is common now.

I am left wondering what the people searching for the above keywords might have been thinking.  I assume that some of the searchers have spent some time in college, and might even be Oracle developers or DBAs.  So, do you think that search terms describe visitors?  If so, how do you think the above search keywords might describe the searchers?





Internal Server Error – Contact Your System Administrator


December 6, 2011

The Oracle OTN forums changed a bit a year or two ago, and in the process I stopped receiving email notifications when new entries were added to discussion threads.  The absence of email notifications was not a significant loss, although at times it is a bit interesting to see how a post in some of the threads changed a week or two after the initial post.  It appears that the OTN staff have corrected the email notification problems, and the speed of the forums seems to have improved significantly since the dramatic performance drop that was caused by an upgrade to the OTN forums a couple of years ago.

An interesting new problem has arrived.  The unbreakable (but free) forums tell me to contact the system administrator after I attempt to log in – the web server claims that the problem is caused by either an internal error or a misconfiguration.  I tried calling the system administrator here, but his phone is busy every time I try to place the call, and remarkably I always get interrupted when I try calling from a different phone.  ;-)  This is the error that I see immediately after logging in:

What is really irritating is that I received three emails today from OTN telling me that the OP has updated an OTN thread that I responded to, but sadly I cannot reach that thread.  After logging into OTN, I can’t even tell the browser to display forums.oracle.com – this is what I see: 

I can make it into Metalink (My Oracle Support) without an issue – I didn’t even need to log in (no password requested):

So, what happens if I click Sign Out in My Oracle Support?  Let’s try and then head back to forums.oracle.com (this seems to work sometimes):

So, the forums work, just as long as you do not care to contribute.  :-)  If we skip the login step, there are a couple of threads in the Community Feedback and Discussion forum about the problems (thread1, thread2).

Let’s log in again… (something comes to mind about the definition of insanity and doing something over and over again):

Out of curiosity, let’s see where forums.oracle.com is pointing:

The traceroute is successful (ignore the long ping times – that was caused by other traffic on the Internet connection).

I noted that Google’s 8.8.8.8 DNS server is currently resolving forums.oracle.com also to e4606.b.akamaiedge.net which Google’s DNS server indicates is at IP address 184.25.198.174 (using Google’s DNS server does not change the error message that is displayed in the browser):

Without using Google’s DNS server, forums.oracle.com resolves to 23.1.18.174, as indicated by the trace route output.  I was curious what Wireshark might show when attempting to display forums.oracle.com (while logged in), so I fired it up and then told the browser to refresh the screen twice:

TCP ACKed lost segments…  Would the same problem happen when using Google’s DNS server, which points at IP address 184.25.198.174?  Let’s check:

Not exactly the same, but similar.  I have not spent a lot of time trying to dig through the Wireshark captures, but it is a bit odd that there are still TCP retransmissions (I might need to take a closer look at the Internet connection).

I guess that maybe this blog article drifted a bit.  Anyone else fail to connect 100% of the time, or never have a problem connecting?  I can only imagine the staff changes that probably would take place if one of our internal systems offered less than 95% uptime, much less the 99.99999% uptime that seems to be our internal expectation.  It is important to keep in mind that the OTN forums (even the copyrighted error message in the second screen capture) is a free service offered by Oracle Corporation – the problems will be resolved.







