XenApp 6.5 Application Reporting (Part 3)

Posted: December 17, 2013 in Citrix

The first two parts of this series were about getting the data out of XenApp 6.5 and into a usable XML format.  While that felt like a long and arduous process, I didn't realize at the time that it was only about 30% of the work.  The major part of the project was still ahead: turning this data into a usable report.  The goal behind all that effort was to be able to produce multiple, differentiated reports from the same set of data without having to redo the gathering.

The first step was to load the XML file.  Referring back to Dr. Weltner's post, I created an XML object and loaded the data.  I have seen many PowerShell posts where the poster used a Get-Content statement to load the XML file.  Dr. Weltner states that this is a performance issue: using Get-Content instead of the XML Load() method can take as much as 7x as long on larger XML files.  The data from my XA 6.5 farm was pretty sizeable (~750 KB), and this was a pretty small farm.

$FileOpen = "c:\temp\farminfo.xml"
# Strongly type the variable as an XML document, then load the file
[xml]$xml = New-Object -TypeName XML
$xml.Load($FileOpen)

(The [xml] statement strongly types the $xml variable.  This means the variable will always be an XML document, with the XmlDocument methods and properties available up front, which greatly speeds up loading the document and working with it.)

This is a very fast load.  I tried Get-Content just to compare, and it was noticeably slower; I didn't do any formal testing, but I could definitely feel the difference.  Next, I piped $xml to Get-Member to see what I was working with.  The type is System.Xml.XmlDocument, with an interesting number of properties and methods.  Tinkering from there, the various properties and methods were easy to discover.  (Using the ISE {Integrated Scripting Environment} with IntelliSense makes this part substantially easier.)
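If you're following along, the initial poking looked roughly like this:

# Confirm what the type accelerator gave us, then browse the members
$xml.GetType().FullName   # System.Xml.XmlDocument
$xml | Get-Member         # lists the XmlDocument properties and methods

# Dot notation walks the tree; in the ISE, IntelliSense suggests node names
$xml.DocumentElement.Name # name of the root element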

I tried some of the easy things, like $xml.applications.  This gave me a list of applications.  Digging a bit deeper, I could pull the users pretty easily.  However, I could not pull the group users, which made little sense.  (During this testing phase, I noticed the spaces in my published app names, and when I tried to access such a node, PowerShell threw an error.  This is where I discovered I needed the replace statement in part 2.)  I reran the gathering script so I could start with clean data, reloaded the $xml variable, and began digging through it again.  I still could not extract the group users, so I decided to come back to that part.
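The underlying issue is that XML element names cannot contain spaces, which is why the replace statement in part 2 was needed.  A minimal sketch of the idea, using an illustrative app name:

# XML element names cannot contain spaces, so the gathering script has to
# sanitize published app names before using them as node names
# (sketch only; the app name here is illustrative)
$BrowserName = "App1 Application"
$NodeName    = $BrowserName -replace " ", "_"   # yields "App1_Application"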

I moved on to creating the HTML portion of what I was looking for.  Along the way, I visited http://www.w3schools.com and taught myself a small amount of inline CSS.  I won't go into excessive detail on this section, but the HTML portion looked like this:

#Create a temporary HTML file
$HTMLtemp = $env:TEMP + "\CTX" + (Get-Random -Maximum 99999).ToString() + ".htm"
# if the file exists, delete it, since it is a temp file
if (Test-Path $HTMLtemp) {
    Remove-Item -Path $HTMLtemp
}
#Build the HTML header for the report
Add-Content -Path $HTMLtemp -Value "<!DOCTYPE html>`r`n"
Add-Content -Path $HTMLtemp -Value "<html lang='en-US'>`r`n"
Add-Content -Path $HTMLtemp -Value "<head>`r`n"
#CSS information
Add-Content -Path $HTMLtemp -Value "<style>`r`n"
Add-Content -Path $HTMLtemp -Value "body {font-family:Verdana;font-size:10pt;color:blue}`r`n"
Add-Content -Path $HTMLtemp -Value "th {font-family:Verdana;font-size:11pt;color:blue}`r`n"
Add-Content -Path $HTMLtemp -Value "td {font-family:Verdana;font-size:10pt;color:blue}`r`n"
Add-Content -Path $HTMLtemp -Value "p {font-family:Verdana;font-size:14pt;color:blue}`r`n"
Add-Content -Path $HTMLtemp -Value "</style>`r`n"
Add-Content -Path $HTMLtemp -Value "<title>External Application Access Report</title>`r`n"
Add-Content -Path $HTMLtemp -Value "</head>`r`n"
Add-Content -Path $HTMLtemp -Value "<body>`r`n"
Add-Content -Path $HTMLtemp -Value "<p>External Application Access Report</p>`r`n"
Add-Content -Path $HTMLtemp -Value "<table border='1'>`r`n"

I created a temporary file with what should be a unique name.  It's not truly guaranteed to be unique, which is why I used Test-Path to check whether the file exists, and delete it if it does.  The deletion was purely an arbitrary decision on my part: to me, if the file is in the temp directory, it's temporary and does not have to be preserved.
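If a truly unique name mattered, something GUID-based would be safer than Get-Random; a quick sketch of that alternative:

# Sketch of an alternative: a GUID makes a name collision practically
# impossible, so the Test-Path check becomes unnecessary
$HTMLtemp = Join-Path $env:TEMP ("CTX" + [guid]::NewGuid().ToString("N") + ".htm")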

In PowerShell terms, `r`n represents a carriage return and newline.  This causes each line in the resulting HTML document to be separated by Windows-style CRLFs.  Without these, the output looks like one long continuous string and is difficult to read; with them, we get a legible document.

I realize many people would use the ConvertTo-HTML cmdlet.  However, I am very inexperienced with that command, while I knew enough HTML to write my own base files, so I used the Add-Content commands and the temp file.  Overall, this is a reasonably quick setup, although ConvertTo-HTML would probably be substantially faster to write.  I'll end up upgrading the script to use it eventually, but for now this was sufficient.
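For comparison, here's a rough, untested sketch of what the ConvertTo-HTML version might look like; $rows is an assumed collection of objects whose properties become the table columns:

# Untested sketch: ConvertTo-Html builds the whole document in one pass.
# $rows is an assumed collection of report objects (one per table row).
$css = "body,td {font-family:Verdana;font-size:10pt;color:blue}"
$rows | ConvertTo-Html `
    -Head "<title>External Application Access Report</title><style>$css</style>" `
    -PreContent "<p>External Application Access Report</p>" |
    Out-File -FilePath $HTMLtemp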

On to creating the body.  I had multiple application nodes, so some sort of foreach loop would take care of iterating them, and since I was aiming for a table layout, I set up a function to write the table rows.  (I'll cover more detail about this later.)  But parsing the applications proved to be much tougher than expected; $xml.applications turned out something like this:

Applications
------
{Application, Application, Application, Application...}

So, it was more time for Get-Member.  Get-Member turned up another XML node, so that was something of a dead end; I'd already seen XML elements and their methods and properties.  After some poking around, I found out about XPath notation and syntax, and I started tinkering with SelectNodes and friends, trying to find the right syntax to get the actual node information I was looking for.  This turned out to be much tougher than expected.  XPath is case-sensitive, which means you need to know exactly how a node name is capitalized, its correct spelling, and so on.  I won't cover all my successes at finding non-working methodologies 😉
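For illustration, a couple of the experiments that did work, using the folder structure I eventually settled on (shown below); note the exact capitalization:

# XPath is case-sensitive: "/Folders/Folder" matches, "/folders/folder" returns nothing
$xml.SelectNodes("/Folders/Folder")
$xml.SelectSingleNode("//Folder[@AppFolder='Applications/Folder1']")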

I have my external apps gathered into a folder structure with the word "external" in the path name.  My existing code grabbed my external application folder and processed it, but there were several problems: using the application names as node names did not work out easily; making Applications the top-level node made it much more difficult to get at the data I wanted; and the subfolders of my external applications folder were simply ignored.  So I went back and revisited how I was generating the XML file.  As I gave it some thought, it occurred to me that it would be easier to pull all of the folders and add a filtering mechanism at report time.  This would also give me more flexibility in creating new reports with other information that might be requested.  Having tinkered with the XML commands and the XML writer significantly more than when I started, I was able to revisit grabbing the folder information.  I went for a structure more like this:

<Folders>
    <Folder AppFolder="Applications/Folder1">
        <Applications AppName="App1_Application" BrowserName="App1 Application">
            <property>value</property>
            <Group GroupName="Groupname">
                <GroupUser>username</GroupUser>
            </Group>
            <User>username2</User>
        </Applications>
    </Folder>
</Folders>
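This isn't the actual gathering script from part 2, but a minimal XmlTextWriter sketch showing how that shape could be emitted (all values are the placeholders from the sample above):

# Sketch only - emits the sample structure above with placeholder values
$writer = New-Object System.Xml.XmlTextWriter("c:\temp\farminfo.xml", [System.Text.Encoding]::UTF8)
$writer.Formatting = [System.Xml.Formatting]::Indented
$writer.WriteStartDocument()
$writer.WriteStartElement("Folders")
$writer.WriteStartElement("Folder")
$writer.WriteAttributeString("AppFolder", "Applications/Folder1")
$writer.WriteStartElement("Applications")
$writer.WriteAttributeString("AppName", "App1_Application")
$writer.WriteAttributeString("BrowserName", "App1 Application")
$writer.WriteElementString("property", "value")
$writer.WriteStartElement("Group")
$writer.WriteAttributeString("GroupName", "Groupname")
$writer.WriteElementString("GroupUser", "username")
$writer.WriteEndElement()                  # </Group>
$writer.WriteElementString("User", "username2")
$writer.WriteEndElement()                  # </Applications>
$writer.WriteEndElement()                  # </Folder>
$writer.WriteEndElement()                  # </Folders>
$writer.WriteEndDocument()
$writer.Close()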

Ultimately, this turned out to be the best structure: I'd be able to use Where-Object clauses to filter by the attributes.  I went ahead and built in a filter to pick up my external applications.  So, I reran my gathering script to create the XML and started a foreach loop.

#Get the folders from the XML file
$xml.Folders.ChildNodes | foreach {

This worked out extremely well; I was able to pull all of the folders, including all of the external application folders.  I added a quick filter in that same line, and it pulled out exactly the folders I wanted:

Where-Object {$_.AppFolder -ilike "*external*"} |
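Put together, the pipeline looks roughly like this (sketch; the loop body is abbreviated):

# Walk every Folder node, keep only the external application folders
$xml.Folders.ChildNodes |
    Where-Object {$_.AppFolder -ilike "*external*"} |
    foreach {
        # each $_ is a Folder element; its Applications children hold the data
        $_.Applications
    }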

However, the enumeration of the folders really did not work as scripted: I could get it to work in the console, but not in the script.  I turned to the powershell.com forums and got a partial answer, and that was the piece I needed.

$BaseFolder = Get-XAFolder -FolderPath "Applications" -Recurse
$BaseFolder.FolderPath | foreach {
    $AppFolder = $_

Assigning $AppFolder = $_ worked exactly as hoped.  Even though the enumeration should have worked without it, adding this piece made it work smoothly.
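For context, the working loop ended up looking something like this (sketch; the body is abbreviated):

# Recurse through every folder under Applications; capturing the current
# path in a named variable lets nested pipelines still reference it
$BaseFolder = Get-XAFolder -FolderPath "Applications" -Recurse
$BaseFolder.FolderPath | foreach {
    $AppFolder = $_
    # ... gather the applications in $AppFolder and write them to the XML ...
}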

Now, finally… building usable data and a usable report!  
