Mar 13, 2018


We have started to use FSLogix Apps to deliver OneDrive into the Hosted XenApp Desktops that we offer to our customers. This is a great tool that allows users to use the native OneDrive utility in their XenApp sessions. Why would you need this, you may ask? The reason is that OneDrive forces its data to be saved under %LOCALAPPDATA%. If you’re using Citrix Profile Manager (UPM) or really any other profile tool, you would be stuck trying to sync that data during logon/logoff, which isn’t really feasible. You could also just use local profiles, but then you would be saving all of the data across all of your XenApp servers. I have not figured out a way to trick OneDrive into saving to a network location. It was even too smart for a Windows symlink.

This is where FSLogix Apps comes in. FSLogix Apps can mount a VHD for each user, and with its file system filter driver it is able to determine what data is destined for OneDrive and move that data into the VHD. At this point there is only one copy of the OneDrive data, and you can roam it across all XenApp servers in your environment. It works very well and we like it a lot; however, there are a couple of important shortcomings I need to discuss. I will also show you how I worked around these issues with my own testing and scripting.


First: FSLogix Apps cannot pre-mount junction points. If you look at a user who has logged in before configuring OneDrive, you will see the following information:

One, the VHD is mounted fine.

However, look at the redirects:

You can see here that only the Outlook and Lync redirects are in place. I don’t see anything for OneDrive. Let’s see what happens when we configure OneDrive right now. Let’s look at the space on C:

So, look at OneDrive and how much space it thinks is free. The VHD configured for all users is 100GB.

I’m going to select 5.5 GB of “crap” and start a sync. Let’s see what happens.

So, it has taken up space on the C:. Well, this isn’t good! With a couple of users and a small amount of data you can fill up the C: on your XenApp server. Bad! Also, in this type of configuration, where you are only using FSLogix Apps for OneDrive and NOT for profiles, you would be syncing this data back with your profile management solution.

So… What can we do? A couple of things to help this out. I wrote a script to pre-mount the junction points. Let’s take a look.

function Register-Redirect {
    [CmdletBinding()]
    param (
        [string] $TenantName = "VC3, Inc" # Enter your Tenant ID here
    )
    try {
        # Ensure frx.exe is present in the PATH
        if (!([System.Environment]::GetEnvironmentVariable('PATH').Split(';') -contains "$($env:ProgramFiles)\fslogix\apps")) {
            Write-Verbose "frx.exe is not found in 'PATH'...Adding frx.exe to 'PATH'";
            $env:PATH = "$($env:PATH);$($env:ProgramFiles)\fslogix\apps";
        }
        # Pull only this user's redirects from 'frx list-redirects'
        [string] $frxOutput = & 'frx' 'list-redirects' | Where-Object { $_ -match "\\$($env:USERNAME)\\" } | Out-String;
        # If the OneDrive junction points are not mapped yet, parse the volume number out of the existing redirects
        if ($frxOutput -notmatch '\\Microsoft\\OneDrive') {
            $Matches = $null; # Reset the automatic $Matches hashtable.
            $frxOutput -match '\\Device\\HarddiskVolume(\d{1,3})\\Skype4B\\User' | Out-Null;
            $volume = $Matches[1];
            # Map the OneDrive junction points into the same VHD volume as the existing redirects.
            Write-Verbose -Message "Adding redirect => Source: \Device\HarddiskVolume2\Users\$($env:USERNAME)\OneDrive - $($TenantName) Target: \Device\HarddiskVolume$($volume)\OneDrive\User";
            frx add-redirect -src "\Device\HarddiskVolume2\Users\$($env:USERNAME)\OneDrive - $($TenantName)" -dest "\Device\HarddiskVolume$($volume)\OneDrive\User";
            Write-Verbose -Message "Adding redirect => Source: \Device\HarddiskVolume2\Users\$($env:USERNAME)\AppData\Local\Microsoft\OneDrive Target: \Device\HarddiskVolume$($volume)\OneDrive\UserMeta";
            frx add-redirect -src "\Device\HarddiskVolume2\Users\$($env:USERNAME)\AppData\Local\Microsoft\OneDrive" -dest "\Device\HarddiskVolume$($volume)\OneDrive\UserMeta";
        }
    } catch {
        throw "Unable to add redirect";
    }
}

Register-Redirect -Verbose

Start-Sleep -Seconds 5

& "C:\Program Files (x86)\Microsoft OneDrive\OneDriveSetup.exe"

All of this assumes the C: drive of your XenApp server is HarddiskVolume2!!!

What does this do? Ok, first, you put your tenant ID in the top. You need this because your folder is normally in the “OneDrive – TenantID” format. Next, it adds the path to frx.exe to the PATH. Then it checks to see if the junction points are already mounted with `frx list-redirects` (it will have no problem duplicating them, so I had to write this in). It filters on your own username, so if you are logged into a XenApp server with a bunch of other users, you only get back your own information. By default it will list all users.

If the script sees that you don’t have them mounted, it will mount them for you. This is where it gets tricky. Look at the existing `frx list-redirects`

The section on the right is where it mounts the information into the VHD. Notice it is HarddiskVolume4. What I had to do (with a bunch of help from BobFrankly/Atum on CitrixIRC) was write a regex to extract the “4” from this output, so we can use it to mount the OneDrive locations in the same volume. THIS IS CRITICAL. Otherwise you wouldn’t be writing to the VHD, and that would be bad. So the script extracts the volume number, then runs `frx add-redirect` to add the junction points.

The script defines a function, so the last thing it does is call that function, then launch OneDriveSetup.exe.

Let’s run this and see what it does. I added a pause so we could see it. You can see the two “Operation completed successfully!” messages, and the OneDrive Setup has launched.

On the server you see the junction points. Notice the addition of two OneDrive folders.

So, let’s setup OneDrive and see what happens. First, let’s look at the hard drive space again.

But wait, why does it still show the wrong amount of hard drive space? Bear with me.

Let’s sync 5GB of “crap” and see what happens. It shows the proper VHD size after the sync is complete. I’m not sure WHY it shows up incorrectly at first; it just does.

Let’s check out the hard drive.


YOU SHOULD ONLY NEED TO RUN THIS SCRIPT ONCE! Once this process is done, FSLogix will handle the rest and you won’t have to do this ever again! I have this set up to launch through a shortcut in the user’s Start Menu as part of the “OneDrive Onboarding” process.
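For reference, the onboarding shortcut itself is easy to script. This is just a sketch – the share path and script name below are placeholders, not my actual paths.

# Sketch: drop an "OneDrive Onboarding" shortcut into the user's Start Menu that calls the script above.
# The UNC path and script name are placeholders.
$shell = New-Object -ComObject WScript.Shell
$lnk = $shell.CreateShortcut("$env:APPDATA\Microsoft\Windows\Start Menu\Programs\OneDrive Onboarding.lnk")
$lnk.TargetPath = 'powershell.exe'
$lnk.Arguments = '-ExecutionPolicy Bypass -File "\\yourserver\scripts\Register-Redirect.ps1"'
$lnk.Save()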

Second: If you have worked with OneDrive before, you know it’s great, but not perfect. Also, FSLogix doesn’t always perfectly clean up all of the junction points at logoff. You want to make sure they are gone at logoff, especially if something breaks during the initial configuration of OneDrive. I have added a logoff script to kill the junction points. You will have to edit this for your own tenant ID!

frx del-redirect -src "\Device\HarddiskVolume2\Users\%USERNAME%\OneDrive - VC3, Inc"
frx del-redirect -src "\Device\HarddiskVolume2\Users\%USERNAME%\AppData\Local\Microsoft\OneDrive"

One more thing: if you know how OneDrive behaves, you can still have a problem. By default, when you sync with OneDrive, the FIRST place it writes files is C:\OneDriveTemp\<USERSID>. After it’s done processing, it moves them into the OneDrive file location. It does this on a file-by-file basis, but if you had a bunch of users all syncing at the same time, you could still fill up the C: on the XenApp server!

Lastly: The final uber-hack I did for this was to create a symlink on the server to point C:\OneDriveTemp to a network location. This one actually works with OneDrive. In my case I pointed it to a share I created on the same volume I was pointing the FSLogix VHDs to.

mklink /d C:\OneDriveTemp <SomePathHere>

That’s all I have. These are the steps I had to go through to use FSLogix Apps with OneDrive in our production environment. Have fun!

Stay tuned. In a future post I will show you how to set up QoS for OneDrive so you don’t kill your datacenter’s bandwidth when you have a bunch of people uploading files at the same time.

Feb 21, 2018

I plagiarized David Ott’s script for migrating Citrix Profile Manager (UPM) profiles to FSLogix and adapted it for local profiles.

This requires using frx.exe, which means that FSLogix needs to be installed on the server that contains the profiles. The script will create the folders in the USERNAME_SID format, and set all proper permissions.

Use this script. Edit it. Run it (as administrator) from the Citrix server. It will pop up this screen to select what profiles to migrate.

#### EDIT ME
$newprofilepath = "\\\share\path"

#### Don't edit me
$ENV:PATH="$ENV:PATH;C:\Program Files\fslogix\apps\"
$oldprofiles = gci c:\users | ?{$_.psiscontainer -eq $true} | select -Expand fullname | sort | out-gridview -OutputMode Multiple -title "Select profile(s) to convert"

# foreach old profile
foreach ($old in $oldprofiles) {

$sam = ($old | split-path -leaf)
$sid = (New-Object System.Security.Principal.NTAccount($sam)).translate([System.Security.Principal.SecurityIdentifier]).Value

# set the nfolder path to \\newprofilepath\username_sid
$nfolder = join-path $newprofilepath ($sam+"_"+$sid)
# if $nfolder doesn't exist - create it with permissions
if (!(test-path $nfolder)) {New-Item -Path $nfolder -ItemType directory | Out-Null}
& icacls $nfolder /setowner "$env:userdomain\$sam" /T /C
& icacls $nfolder /grant $env:userdomain\$sam`:`(OI`)`(CI`)F /T

# sets vhd to \\nfolderpath\profile_username.vhd
$vhd = Join-Path $nfolder ("Profile_"+$sam+".vhd")

frx.exe copy-profile -filename $vhd -sid $sid
}

Dec 20, 2017

In my environment I have nearly 100 persistent MCS VDAs. Our developers, contractors, and even us IT people need the ability to change things on our VDI desktops (install/uninstall/set registry/create local IIS sites/etc.) and have them persist across reboots. Luckily, the only software I have to maintain on these desktops is the VDA itself. That still means when Citrix releases a new version that I want to move to, I have to upgrade almost 100 individual desktops. The last time I had to do this I wanted to rip my hair out, and decided to write a script to automate the task. It has saved me TONS of time, so I want to share it in hopes that it can save someone else some time (and hair).

I wrote the script specifically for my MCS environment, which runs on XenServer, but with a little tweaking it can be modified for any environment.

How it works (the short version):

  1. Gets the computer(s) you set – can be manual input or a delivery controller query
  2. On each computer it will create 2 scripts, and 2 scheduled tasks
    1. The first script loads auto logon info into the registry
    2. The second script handles the vda removal and install
  3. One scheduled task runs once to run the first script and reboot
  4. The second scheduled task runs the second script at logon (the first script sets up a user – in my case the local administrator – to log on automatically)

Most things are explained in the script, and this time I’ve also created a YouTube video to go through it all and show it in action.
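For a rough idea of what those generated pieces look like, here is a minimal sketch. The task names, script paths, and password are placeholders I made up for illustration – the real script writes its own versions of these onto each VDA.

# (Script 1, e.g. C:\Scripts\Set-AutoLogon.ps1) stage auto-logon for the local administrator, then reboot.
$winlogon = 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon'
Set-ItemProperty -Path $winlogon -Name AutoAdminLogon -Value '1'
Set-ItemProperty -Path $winlogon -Name DefaultUserName -Value 'Administrator'
Set-ItemProperty -Path $winlogon -Name DefaultDomainName -Value $env:COMPUTERNAME
Set-ItemProperty -Path $winlogon -Name DefaultPassword -Value 'LocalAdminPassword'
Restart-Computer -Force

# (Scheduled tasks) one task runs script 1 once, the other runs the VDA removal/install script at logon.
$prep = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-ExecutionPolicy Bypass -File C:\Scripts\Set-AutoLogon.ps1'
Register-ScheduledTask -TaskName 'VDA-Upgrade-Prep' -Action $prep -Trigger (New-ScheduledTaskTrigger -Once -At (Get-Date).AddMinutes(2)) -User 'SYSTEM' -RunLevel Highest -Force
$install = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-ExecutionPolicy Bypass -File C:\Scripts\Upgrade-VDA.ps1'
Register-ScheduledTask -TaskName 'VDA-Upgrade-Install' -Action $install -Trigger (New-ScheduledTaskTrigger -AtLogOn) -User 'SYSTEM' -RunLevel Highest -Force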



Oct 24, 2017

I moved from UPM to FSLogix earlier this year, and decided to write my own PowerShell script to convert the UPM profiles to .vhd. FSLogix has its own conversion process (which I didn’t find a whole lot of info on), but I decided to create my own.

What this script does:

  1. Gets a list of all UPM profile directories in the root path (that you supply) and displays them to you to select which one(s) you would like to convert via out-gridview
  2. For each profile you select:
    1. Gets the username from the profile path – you will have to edit this part for your environment… explanation in the script
    2. Uses the username to get the SID (FSLogix profiles use the username and SID to name the profile folder)
    3. Creates the FSLogix profile folder (if it doesn’t exist)
    4. Sets the user as the owner of that folder, and gives them full control
    5. Creates the .vhd (my default is 30GB dynamic – edit on line 70 if you wish to change it) – if it doesn’t exist (if it does skip 7, 9-10)
    6. Attaches the .vhd
    7. Creates a partition and formats it ntfs
    8. Assigns the letter T to that drive (edit on line 73 if you wish to change that)
    9. Creates the T:\Profile directory
    10. Grants System/Administrators and the user full control of the profile directory
    11. Copies the profile from the UPM path to the T:\Profile directory with /E /Purge – if you are re-running this script on a particular profile it will overwrite everything fresh
    12. Creates T:\Profile\AppData\Local\FSLogix if it doesn’t exist
    13. Creates T:\Profile\AppData\Local\FSLogix\ProfileData.reg if it doesn’t exist (this feeds the ProfileList key at logon)

Here is the script!
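If you just want the shape of the VHD handling (steps 5-11 above), here is a stripped-down sketch using diskpart, with an example username and example paths substituted in. The full script linked above does the real work (SID lookup, ProfileData.reg, re-run handling, etc.).

# Sketch only – 'jdoe', the UNC path, and C:\Temp are placeholders.
$vhd = 'C:\Temp\Profile_jdoe.vhd'
$upmProfile = '\\fileserver\UPM$\jdoe\UPM Profile'

# Create a 30GB dynamic VHD, partition it, format NTFS, and mount it as T:
@"
create vdisk file="$vhd" maximum=30720 type=expandable
select vdisk file="$vhd"
attach vdisk
create partition primary
format fs=ntfs quick
assign letter=T
"@ | Set-Content "$env:TEMP\newvhd.txt" -Encoding ASCII
diskpart /s "$env:TEMP\newvhd.txt"

# Profile directory, permissions, and the copy itself
New-Item -Path 'T:\Profile' -ItemType Directory -Force | Out-Null
icacls 'T:\Profile' /grant 'SYSTEM:(OI)(CI)F' 'Administrators:(OI)(CI)F' "$env:USERDOMAIN\jdoe:(OI)(CI)F"
robocopy $upmProfile 'T:\Profile' /E /PURGE

# Detach the VHD when done
@"
select vdisk file="$vhd"
detach vdisk
"@ | Set-Content "$env:TEMP\detachvhd.txt" -Encoding ASCII
diskpart /s "$env:TEMP\detachvhd.txt"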

Jul 11, 2017

I have been using Full Desktops inside of XenApp forever. Lately I have been working on a project where I will be using only published apps. We are a CSP and a managed service provider who uses LabTech (now ConnectWise Automate) to control all of our systems. LabTech uses a great remote access product called ScreenConnect to connect to the systems. All of this works flawlessly inside of a full desktop. However, when I published LabTech as a seamless app (LTClient.exe), everything seems to work fine except for ScreenConnect. I got a great Citrix engineer on the line who actually used all of the collected data I uploaded and troubleshot the issue. ScreenConnect is actually a “ClickOnce” application which leverages dfsvc.exe to install and launch itself. You can read this super exciting article on ClickOnce applications here.

Technically, neither Citrix nor Microsoft supports any of these ClickOnce applications. Kudos to the Citrix engineer for continuing to work the issue with me, even though this is true. Luckily I had already built a 2012R2 RemoteApp environment and was able to get this working there to show Citrix this was not an application issue, but a Citrix seamless app issue. During troubleshooting he pointed me to this interesting article on ClickOnce and XenApp 6.5 here. I’m on 7.6 LTSR CU3, but it is still a good article on how this stuff works.

Anyway, after looking at the procmon information in the ticket, he quickly found that in the working scenario dfsvc.exe was calling ScreenConnect.WindowsClient.exe, where the seamless app was not. His “solution” was to simply run dfsvc.exe before calling LTClient.exe. Not really a “fix” but hell, it worked! So, I created a simple PowerShell script.

start-process "C:\Windows\Microsoft.NET\Framework\v4.0.30319\dfsvc.exe"
start-process "C:\Program Files (x86)\LabTech Client\LTClient.exe"

Lastly, I added dfsvc.exe to LogoffCheckSysModules per this CTX article.
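The LogoffCheckSysModules change can be scripted as well. Treat this as a hedged example – the key path below is what I’d expect on a 7.x VDA, but it can vary by version and bitness, so check the CTX article for your build.

# Append dfsvc.exe to the comma-separated LogoffCheckSysModules value (key path is an assumption).
$twi = 'HKLM:\SOFTWARE\Citrix\wfshell\TWI'
$current = (Get-ItemProperty -Path $twi -Name LogoffCheckSysModules -ErrorAction SilentlyContinue).LogoffCheckSysModules
$value = if ($current) { "$current,dfsvc.exe" } else { 'dfsvc.exe' }
Set-ItemProperty -Path $twi -Name LogoffCheckSysModules -Value $value -Type String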






Sep 30, 2016


XenApp 7.6

700+ Delivered (published) Applications

60+ Windows servers (2008 R2 and 2012 R2)



Recently I had a request to replicate 100+ applications from PROD to QA, using a QA server configured with identical applications and identical application locations/paths. Obviously all paths to EXE files need to be the same in order for this to work 100% (unless I missed a memo and XA can now publish identical applications from various paths; as far as I know, this was not yet available in 7.6). If the QA server has some of the applications in different paths, not all is lost. You can still use this process and script to migrate a large number of applications between delivery groups and then modify the paths later in Studio.

While I could add a few more lines to my PoSH script to actually replicate each application as well, given the time it took me to create this script and the ability to duplicate applications in Studio, that seemed unnecessary.

My goal was to replicate, or to use the proper Citrix term, duplicate, all applications and then assign them to another delivery group. Seems simple enough for a Citrix and PoSH guru. But those who are just getting their feet wet can use the following process to speed up their delivery time to less than 5 mins and go look for the end of the Internet, while telling the client it took hours 😉



1 – I will be duplicating all requested applications using ol’ Citrix Studio.

2 – I will run the script below to change the Delivery Group and the application folder as visible to the user (your mileage may vary, depending on your requirements)

This script/process is no rocket science, but it might help someone to quickly replicate applications and migrate them to another delivery group, instead of publishing them all over again. Modify the script below according to your environment before running it. (WARNING: It is a fairly simple script, so review it and try to understand exactly what it is doing before executing it.) Also, I am no expert when it comes to creating PowerShell scripts, just another Citrix admin. So, pardon me if you can make it better. Please do improve and share! I am all for helping fellow Citrix admins any way I can. Even if it’s buying a pint!


Step 1

Create an alternative application folder in Studio. For our scenario I am going to create a folder named “QA” inside the already created “Europe” folder.

Right-click on all applications that you need to replicate in QA (you can select multiple applications at once).

Click Duplicate Application 

Now select all duplicates and drag them over to QA folder.  In my scenario I will be dragging these to Europe\QA.

Step 2

The script below will prompt for the admin folder name where all the duplicates reside (that’s the new folder you just created; in my example it’s called Europe\QA). I repeat: do not select your production applications folder, as the script will move all your production apps to the new delivery group. Use the newly created QA folder where you moved all the duplicate applications in step 1 above.

It is assumed that the new delivery group is already created.

Another item to note: there is an optional line (in yellow) to change the client-side folder location of the newly created applications. This is to help users identify whether they are running PROD or QA applications. It also looks cleaner in Storefront or WI. You can add more commands into the foreach loop. Things like modifying the users who have access, changing the actual name of the application, etc. My goal was to keep everything the same and just deliver from the QA server.


asnp Citrix*

$adminfolder = (Get-BrokerApplication -MaxRecordCount 10000).AdminFolderName | sort | select -unique | Out-GridView -Title "Select Admin Folder Name" -OutputMode Single
$applist = Get-Brokerapplication -AdminFolderName $adminfolder
$originalDG = (Get-BrokerDesktopGroup -MaxRecordCount 10000).Name | sort | Out-GridView -Title "Select Original Delivery Group Name" -OutputMode Single
$newDG = (Get-BrokerDesktopGroup -MaxRecordCount 10000).Name | sort | Out-GridView -Title "Select New Delivery Group Name" -OutputMode Single

Write-Host "Migrating all applications in $adminfolder`nFrom $originalDG Delivery Group to $newDG Delivery Group" -ForegroundColor Green

foreach ($app in $applist.ApplicationName){
                Write-host "Migrating $app"
                Get-BrokerApplication -ApplicationName $app | Add-BrokerApplication -DesktopGroup $newDG
                Get-BrokerApplication -ApplicationName $app | Remove-BrokerApplication -DesktopGroup $originalDG
                Get-BrokerApplication -ApplicationName $app | Set-BrokerApplication -ClientFolder "Europe\QA" #optional to show all applications inside QA folder and not in the same folder with production apps
}


BTW, using a similar Add-BrokerApplication command you can publish, or rather deliver, the same application from multiple delivery groups. Just comment out the Remove-BrokerApplication command and it will now launch from servers in prod and QA, or any other DG of your choice. This comes in really handy when you have multiple DGs that host different applications, but some of the applications are identical. You can spread the load across multiple DGs. Think of it as the worker groups concept in XA 6.x with server groups. I had such a requirement that was easily achievable in XA 6.x, but not so much in XA 7.x. I paid for someone’s case of beer when they told me that I can use the above mentioned command to deliver the same application from multiple DGs, as it’s not clearly documented by Citrix. There’s a surprise…
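For example (the application and delivery group names here are just placeholders):

# Deliver an existing published app from a second delivery group without touching the original assignment.
Get-BrokerApplication -ApplicationName "Notepad" | Add-BrokerApplication -DesktopGroup "QA-DG"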

That’s all folks. My first ever citrixirc blog.  Whoo-hoo!

Over and out.

Apr 12, 2016

Thank you Microsoft for changing fundamental things about your operating system, with little or no regard for those of us running in an RDS/XenApp type environment. Check out this TechNet article. It describes the changes that have been made.

“In  Pre-Win 8, apps could set the default handler for a file type/protocol by manipulating the registry, this means you could easily have a script or a group policy manipulating the registry. For example  for Mailto protocol you just needed to change the “default” value under HKEY_CLASSES_ROOT\mailto\shell\open\command”

More importantly, you were able to use Group Policy Preferences (GPP) to set these values inside a GPO. You could also use Item Level Targeting (ILT) with the GPP. This means you could easily have users in SecurityGroupA run Acrobat Pro for .pdfs and users in SecurityGroupB run Adobe Reader for .pdfs. However, the TechNet article goes on to say that post-Windows 8,

“the registry changes are verified by a hash (unique per user and app) “

A little more digging tells us that the new hashing mechanism is also on a per-machine basis. This means that a hash would be different for each user, per app, per XenApp server. Very inconvenient and annoying. This also means that we cannot use the built-in GPP functions in Active Directory to set these file type associations. Also very inconvenient and annoying.

James Rankin did a great blog post on this subject as well. You can read that here. He did a great job overviewing this issue and provided a solution using AppSense. This blog will show you how to do this with good old batch scripting and group policy. To be honest, I’m quite annoyed that I had to put together this “hack” to get around something that worked PERFECTLY FINE in 2008R2 with GPPs. If anyone has a more elegant solution, I’d love to see it. I’m not the best scripter in the world, but I’m very pragmatic. “It works”

The first thing we want to do is create a logoff script to delete HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\FileExts\.pdf. However, because of that pesky Deny in the UserChoice key, we are unable to simply do this. So, I have made a simple “regini” command to overwrite the permissions on that key so that we can delete it. In my environment, I have created FTA_PDF.txt in the NETLOGON directory. Inside this file is simply a registry value and some codes, which allow SYSTEM, Administrators, Interactive User, etc., FULL CONTROL of the key.

Next, I create a FTA_Delete.bat file in NETLOGON. This runs the “regini” command to change the permissions, then a “reg delete” command to delete the key.

Then we need to create the script for the logon process. I’ve busted out good old “ifmember” for this one. It’s a simple executable that will check AD group membership. My script is pretty simple. It checks to see if a user is a member of the Acrobat group. If so, run the “reg add” to add the association to Acrobat. If not, it falls back to the default .pdf reader in this environment. In this case, it’s Adobe Reader. Keep in mind that you can add multiple programs and associations using this method. You can add Foxit here if you would like.
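If you would rather not ship ifmember.exe around, the same branch can be done in native PowerShell. This is only a sketch – the group name is a placeholder, and the commented “reg add” lines stand in for whatever associations your own FTA script writes.

# Check the current user's AD group membership and branch, just like the ifmember logic above.
$groups = [Security.Principal.WindowsIdentity]::GetCurrent().Groups | ForEach-Object {
    try { $_.Translate([Security.Principal.NTAccount]).Value } catch { }
}
if ($groups -contains "$env:USERDOMAIN\Acrobat Pro Users") {
    # reg add ... (Acrobat Pro association)
} else {
    # reg add ... (Adobe Reader association)
}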

So, the sad fact of the matter is that when I tried to set this as an actual “Logon Script” the functionality didn’t work. I had to set this in a User GPO: Administrative Templates\System\Logon\Run these programs at user logon. I’m also the type of person that hates to see a CMD window flash up on the screen right after I log in. So, I wrote ANOTHER script called FTA_Starter.bat to call this script to run in a minimized window.

This is the script I added to the GPO.

So, I fought with this for a long time and it wasn’t working. I had to re-read James’s blog and found this little blurb at the bottom.

“Update – I built a third XenApp server, just to be sure, and this time the solution wouldn’t work until I removed the Registry key HKLM\Software\Classes\.pdf.”

This DID NOT WORK until the HKLM key was deleted from the servers. Do not forget this step.
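If you want to script that cleanup across your XenApp servers, something like this should do it (test on one server first):

# Removes the per-machine .pdf class key mentioned in James's update above.
Remove-Item -Path 'HKLM:\SOFTWARE\Classes\.pdf' -Recurse -Force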

I hope this helps you work through this issue in less time than it took me.

Nov 5, 2015

As a Citrix CSP, having a good set of scripts to deploy a base environment is critical. Setting up 50 environments by hand would take far more time than with good scripting. Now that I’ve finally had some time to sit down and not be working on 6.5 stuff, I have been able to write scripts to install and configure XenDesktop 7.6 with Storefront 3 using PowerShell (and some batch). Full disclosure! I am NOT good at PowerShell, or scripting in general. I’m sure there are a million better/easier ways to do these things, but what I have works, dammit. The point of this blog is to show you the commands, and what they do. I will attach my final scripts where you can tune and tweak as needed.

As always, credit where credit is due. I took some code from Aaron Parker from here. I also took some code from Eric from here. Lastly, I’d like to give a BIG THANKS to Esther Barhel (@virtuEs_IT) for helping me out with the non-documented Storefront 3 PowerShell commands.

If you have read my blog, most of my environments are built for the SMB/SMG space. That being said in this blog we will be setting up a single instance configuration. There will be a single server with both the XDC and Storefront roles on it. You could easily take this code to add additional XDCs or Storefront servers to your environment. I will be using code snippets to explain what they are for during the blog.

The first thing we want to do is install XenDesktop. As this is a small environment, we are going to use SQL Express on the XDC. Personally, I do not like to use SQL instances, so in my configuration I am going to install SQL with the default instance instead of the SQLEXPRESS instance. For this, I have created a directory with the XD7.6 disk extracted and shared on the network. I then created a SQL directory and extracted SQL2014 Express in there. I shared this directory out as XA76. (Long live XenApp!)

Installing SQL and XenDesktop

This first “Script” simply does a net use to the share, installs SQL, then installs XenDesktop with only the Controller and Studio roles. Note that you do not want to install Storefront yet. This script will reboot the server. As a scripting/PowerShell noobie, I learned a lot during this. For example, in PowerShell, the use of --% tells PowerShell to stop parsing the rest of the line. This helped me a lot because PowerShell was trying to parse the \’s and /’s and failing miserably.

net use --% x: \\\XA76
cd '.\SQL'
write-host "Installing SQL"
 .\SETUP.EXE --% /QS /ACTION=install /FEATURES=SQL,Tools /INSTANCENAME=MSSQLSERVER  /IAcceptSQLServerLicenseTerms=true
cd ..
cd '.\x64\XenDesktop Setup'
write-host "Installing XenDesktop"
.\XenDesktopServerSetup.exe --% /components CONTROLLER,DESKTOPSTUDIO /NOSQL /quiet /configure_firewall
shutdown -f -r -t 2

XenDesktop Configuration

After the server is done rebooting, we can start with the configuration of the XenDesktop Site. The first thing we want to do is set up the databases. I’m going to set up the default databases on the local system we just installed SQL on, using the NetBIOS name of the domain as the site name. In this example, $NBN is the NetBIOS name.

New-XDDatabase -AllDefaultDatabases -DatabaseServer $env:COMPUTERNAME -SiteName $NBN

The next thing we want to do is set up the Site. In my configuration, I use the NetBIOS name and append it to the DBs. In this example, $LDB would be CitrixConfigLoggingTEST, where TEST is the NetBIOS name. The same is done with the Monitor Database and Site Database.

New-XDSite -DatabaseServer $env:COMPUTERNAME -LoggingDatabaseName $LDB -MonitorDatabaseName $MDB -SiteDatabaseName $SDB -SiteName $NBN

Next, we want to set up licensing. This is pretty self-explanatory. This sets the license server and the port. It then sets the product and edition. I generally use a generic DNS name for the license server, “ctxlicense” in this example, that points to the license server IP address. Then this sets up the product code and product edition. My example shows a setup for XenDesktop Advanced edition. You can get these values with the following commands.

PS C:\> Get-ConfigProduct

Code                    Name
----                    ----
XDT                     XenDesktop
MPS                     XenApp
PS C:\> Get-ConfigProductEdition -ProductCode XDT

Here is my code to add licensing.

Set-XDLicensing -LicenseServerAddress ctxlicense -LicenseServerPort 27000
Set-ConfigSite -LicensingModel Concurrent -ProductCode XDT -ProductEdition ADV

Next, we set up a Machine Catalog. This assumes that we already have at least one VDA set up, already pointed to this Delivery Controller. This is a simple persistent desktop Machine Catalog. It is set up for XenApp-type Full Desktops (MultiSession), and we add the first XD box to the Machine Catalog.

New-BrokerCatalog -SessionSupport MultiSession -ProvisioningType Manual -AllocationType Random -Name 2012R2 -Description 2012R2 -PersistUserChanges OnLocal -MachinesArePhysical $true
New-BrokerMachine -MachineName TEST-RDU-XD-01 -CatalogUid 1

This next bit of code sets up the Desktop Group. This code snip was taken from Aaron Parker’s blog here and edited for my use. I encourage you to read his blog page to understand what this stuff does. It is much more involved than my code, and does a great job setting up the delivery group. I will try to break down the pieces for all of us though. Let’s walk through the variables first.

$assignedGroup = "$NBN`\$NBN`_CTX_Desktop"
$desktopGroupName = "Some Desktop Group"
$desktopGroupPublishedName = "Some Desktop Group"
$desktopGroupDesc = "Some Desktop Group"
$colorDepth = 'TwentyFourBit'
$deliveryType = 'DesktopsandApps'
$desktopKind = 'Shared'
$sessionSupport = "MultiSession"
$functionalLevel = 'L7_6'
$timeZone = 'EST Eastern Standard Time'
$offPeakBuffer = 10
$peakBuffer = 10
$machineCatalogName = "2012R2"

You can set a TON of stuff in here, and it can get complicated when you are doing MCS/PVS and stuff. In this example, we are setting up a XenApp Full Desktop against the Machine Catalog we created above (2012R2). $deliveryType can be DesktopsOnly, DesktopsAndApps, or AppsOnly. $desktopKind can be Shared or Private. $sessionSupport can be MultiSession or SingleSession. This is a new environment, with all 7.6, so we set the $functionalLevel to L7_6. This can be set to L5, L7, or L7_6. There are so many other commands in here that Aaron has detailed; I won’t go into them here. The command I have used for this example is below.

New-BrokerDesktopGroup -ErrorAction SilentlyContinue -AdminAddress $XDC -Name $desktopGroupName -DesktopKind $desktopKind -DeliveryType $deliveryType -Description $desktopGroupPublishedName -PublishedName $desktopGroupPublishedName -MinimumFunctionalLevel $functionalLevel -ColorDepth $colorDepth -SessionSupport $sessionSupport -InMaintenanceMode $False -IsRemotePC $False -SecureIcaRequired $False -Scope @()

The next thing we need to do is to add machines to the desktop group.

$machineCatalog = Get-BrokerCatalog -AdminAddress $XDC -Name $machineCatalogName
Add-BrokerMachinesToDesktopGroup -AdminAddress $XDC -Catalog $machineCatalog -Count $machineCatalog.UnassignedCount -DesktopGroup $desktopGroup

Then we want to add users to the desktop group

$brokerUsers = New-BrokerUser -AdminAddress $XDC -Name $assignedGroup
New-BrokerEntitlementPolicyRule -AdminAddress $XDC -Name ($desktopGroupName + "_" + $Num.ToString()) -IncludedUsers $brokerUsers -DesktopGroupUid $desktopGroup.Uid -Enabled $True -IncludedUserFilterEnabled $False

Lastly, we want to allow access through Access Gateway

New-BrokerAccessPolicyRule -AdminAddress $XDC -Name $accessPolicyRule -IncludedUsers @($brokerUsers.Name) -AllowedConnections 'ViaAG' -AllowedProtocols @('HDX','RDP') -AllowRestart $True -DesktopGroupUid $desktopGroup.Uid -Enabled $True -IncludedSmartAccessFilterEnabled $True -IncludedSmartAccessTags @() -IncludedUserFilterEnabled $True

Here is the full script from start to finish.

add-pssnapin c*
$nbn = $env:USERDOMAIN
$XDC = $env:COMPUTERNAME # single-server setup: this box is also the Delivery Controller
$assignedGroup = "$NBN`\$NBN`_CTX_Desktop"
$LDB = "CitrixConfigLogging" + $nbn
$MDB = "CitrixMonitor" + $nbn
$SDB = "Citrix" + $nbn

# Desktop Group properties
$desktopGroupName = "Some Desktop Group"
$desktopGroupPublishedName = "Some Desktop Group"
$desktopGroupDesc = "Some Desktop Group"
$colorDepth = 'TwentyFourBit'
$deliveryType = 'DesktopsandApps'
$desktopKind = 'Shared'
$sessionSupport = "MultiSession"
$functionalLevel = 'L7_6'
$timeZone = 'EST Eastern Standard Time'
$offPeakBuffer = 10
$peakBuffer = 10
$machineCatalogName = "2012R2"

write-host "Creating Citrix Databases"
New-XDDatabase -AllDefaultDatabases -DatabaseServer $env:COMPUTERNAME -SiteName $NBN

write-host "Setting up Site"
New-XDSite -DatabaseServer $env:COMPUTERNAME -LoggingDatabaseName $LDB -MonitorDatabaseName $MDB -SiteDatabaseName $SDB -SiteName $NBN

write-host "Setting up licensing"
Set-XDLicensing -LicenseServerAddress ctxlicense -LicenseServerPort 27000
Set-ConfigSite -LicensingModel Concurrent -ProductCode XDT -ProductEdition ADV

write-host "Setting up Machine Catalog"
New-BrokerCatalog -SessionSupport MultiSession -ProvisioningType Manual -AllocationType Random -Name 2012R2 -Description 2012R2 -PersistUserChanges OnLocal -MachinesArePhysical $true
New-BrokerMachine -MachineName TEST-RDU-XD-01 -CatalogUid 1

$VerbosePreference = "Continue"

write-host "Creating Desktop Group"

If (!(Get-BrokerDesktopGroup -Name $desktopGroupName -ErrorAction SilentlyContinue)) {
    Write-Verbose "Creating new Desktop Group: $desktopGroupName"
    $desktopGroup = New-BrokerDesktopGroup -ErrorAction SilentlyContinue -AdminAddress $XDC -Name $desktopGroupName -DesktopKind $desktopKind -DeliveryType $deliveryType -Description $desktopGroupPublishedName -PublishedName $desktopGroupPublishedName -MinimumFunctionalLevel $functionalLevel -ColorDepth $colorDepth -SessionSupport $sessionSupport -InMaintenanceMode $False -IsRemotePC $False -SecureIcaRequired $False -Scope @()
    If ($desktopGroup) {

        Write-Verbose "Getting details for the Machine Catalog: $machineCatalogName"
        $machineCatalog = Get-BrokerCatalog -AdminAddress $XDC -Name $machineCatalogName
        Write-Verbose "Adding $($machineCatalog.UnassignedCount) machines to the Desktop Group: $desktopGroupName"
        $machinesCount = Add-BrokerMachinesToDesktopGroup -AdminAddress $XDC -Catalog $machineCatalog -Count $machineCatalog.UnassignedCount -DesktopGroup $desktopGroup

        Write-Verbose "Creating user/group object in the broker for $assignedGroup"
        If (!(Get-BrokerUser -AdminAddress $XDC -Name $assignedGroup -ErrorAction SilentlyContinue)) {
            $brokerUsers = New-BrokerUser -AdminAddress $XDC -Name $assignedGroup
        } Else {
            $brokerUsers = Get-BrokerUser -AdminAddress $XDC -Name $assignedGroup
        }

        # Find an unused entitlement policy rule name
        $Num = 1
        Do {
            $Test = Test-BrokerEntitlementPolicyRuleNameAvailable -AdminAddress $XDC -Name @($desktopGroupName + "_" + $Num.ToString()) -ErrorAction SilentlyContinue
            If ($Test.Available -eq $False) { $Num = $Num + 1 }
        } While ($Test.Available -eq $False)
        Write-Verbose "Assigning $($brokerUsers.Name) to Desktop Catalog: $machineCatalogName"
        $EntPolicyRule = New-BrokerEntitlementPolicyRule -AdminAddress $XDC -Name ($desktopGroupName + "_" + $Num.ToString()) -IncludedUsers $brokerUsers -DesktopGroupUid $desktopGroup.Uid -Enabled $True -IncludedUserFilterEnabled $False

        # Check whether access rules exist and then create rules for direct access and via Access Gateway
        $accessPolicyRule = $desktopGroupName + "_Direct"
        If (Test-BrokerAccessPolicyRuleNameAvailable -AdminAddress $XDC -Name @($accessPolicyRule) -ErrorAction SilentlyContinue) {
            Write-Verbose "Allowing direct access rule to the Desktop Catalog: $machineCatalogName"
            New-BrokerAccessPolicyRule -AdminAddress $XDC -Name $accessPolicyRule -IncludedUsers @($brokerUsers.Name) -AllowedConnections 'NotViaAG' -AllowedProtocols @('HDX','RDP') -AllowRestart $True -DesktopGroupUid $desktopGroup.Uid -Enabled $True -IncludedSmartAccessFilterEnabled $True -IncludedUserFilterEnabled $True
        } Else {
            Write-Error "Failed to add direct access rule $accessPolicyRule. It already exists."
        }

        $accessPolicyRule = $desktopGroupName + "_AG"
        If (Test-BrokerAccessPolicyRuleNameAvailable -AdminAddress $XDC -Name @($accessPolicyRule) -ErrorAction SilentlyContinue) {
            Write-Verbose "Allowing access via Access Gateway rule to the Desktop Catalog: $machineCatalogName"
            New-BrokerAccessPolicyRule -AdminAddress $XDC -Name $accessPolicyRule -IncludedUsers @($brokerUsers.Name) -AllowedConnections 'ViaAG' -AllowedProtocols @('HDX','RDP') -AllowRestart $True -DesktopGroupUid $desktopGroup.Uid -Enabled $True -IncludedSmartAccessFilterEnabled $True -IncludedSmartAccessTags @() -IncludedUserFilterEnabled $True
        } Else {
            Write-Error "Failed to add Access Gateway rule $accessPolicyRule. It already exists."
        }
    } #End If $desktopGroup
} #End If DesktopGroup

Installing Storefront

This portion of the scripting is going to do a bunch of things. It will install the pre-requisites for Storefront, including IIS. It installs Storefront. It imports a certificate and binds it to the default website. It then sets up the initial Storefront base URL and finishes the configuration. The first thing I did was to copy the 3.x version of CitrixStoreFront-x64 into my share, into the x64\StoreFront directory, and overwrite the default one. Luckily, this works, so we can use XenDesktopServerSetup.exe again to install it.

The first thing we are going to do is install the pre-requisites and install Storefront. Again, I am just going to do a net-use to my share and run everything.

net use --% x: \\\XA76
Import-Module ServerManager
write-host "Installing Storefront Prereqs"
Add-WindowsFeature AS-Net-Framework,Web-Net-Ext45,Web-AppInit,Web-ASP-Net45,Web-ISAPI-Ext,Web-ISAPI-Filter,Web-Default-Doc,Web-HTTP-Errors,Web-Static-Content,Web-HTTP-Redirect,Web-HTTP-Logging,Web-Filtering,Web-Windows-Auth,Web-Client-Auth
cd '.\x64\XenDesktop Setup'
write-host "Installing Storefront"
.\XenDesktopServerSetup.exe --% /components STOREFRONT /NOSQL /quiet

Next, we want to import the certificate and bind it to the default web site. First, we ask for the cert password.

$myPW = read-host -Prompt "Enter Cert Password here"

We then want to import the certificate and assign it to the default website. Copy the .pfx file to the root of C:\. You will need the thumbprint of the certificate to put in the XXXXXXXXXXXXXXXXXX location of the script.

$certpw = ConvertTo-SecureString -String $myPW -Force -AsPlainText
Import-PfxCertificate -FilePath "C:\cert.pfx" -CertStoreLocation Cert:\LocalMachine\My -Password $certpw
New-WebBinding -Name "Default Web Site" -IPAddress "*" -Port 443 -Protocol https
cd IIS:
cd .\SSLBindings
Get-Item Cert:\LocalMachine\My\XXXXXXXXXXXXXXXXXX | New-Item 0.0.0.0!443 # bind the cert (by its thumbprint) to 0.0.0.0:443

Storefront Configuration

This gets us all set to start configuring Storefront. We will first need to import the Storefront PowerShell modules.

. "C:\Program Files\Citrix\Receiver StoreFront\Scripts\ImportModules.ps1"

Let us take a look at some of the variables we will be using here.

$nbn = $env:USERDOMAIN
$GatewayAddress = ""
$Farmname = "XenApp76 Farm"
$Port = "80"
$TransportType = "HTTP"
$sslRelayPort = "443"
$LoadBalance = $false
$FarmType = "XenDesktop"
$fqdn = "$env:computername.$env:userdnsdomain"
$baseurl = "https://" + $fqdn
$SFPath = "/Citrix/" + $nbn.toLower()
$SFPathWeb = "$SFPath`Web"
$SFPathDA = "$SFPath`DesktopAppliance"
$GatewayName = "TEST-RDU-NS-01"
$staservers = "http://" + $fqdn + "/scripts/ctxsta.dll"
$snipIP = ""

Again, keep in mind this is a small environment, so we will be using a single server for the XDC/Storefront roles. My $baseurl variable will resolve to https://server.domain.local. I set the store name to $nbn (NetBIOS Name), in this example it is TEST. Then using some simple PowerShell I set $SFPath, $SFPathWeb, and $SFPathDA to /Citrix/test, /Citrix/testWeb, and /Citrix/testDesktopAppliance respectively. You can set these variables as appropriate for your environment. The first command we want to run will do the initial configuration of Storefront.

Set-DSInitialConfiguration -hostBaseUrl $baseurl -farmName $Farmname -port $Port -transportType $TransportType -sslRelayPort $sslRelayPort -servers $fqdn -loadBalance $LoadBalance -farmType $FarmType -StoreFriendlyName TEST -StoreVirtualPath $SFPath -WebReceiverVirtualPath $SFPathWeb -DesktopApplianceVirtualPath $SFPathDA

The next thing I do here is set up the beacons. I set this up now because if you set up the gateway first, it sets the $baseurl as an external beacon. In my configuration, I do NOT want $baseurl to be an external beacon. At the time of writing this blog, Citrix has not written the full documentation for these PowerShell modules. I have already put in an RFE to get these up on Citrix’s site. That being said, I was not able to figure out HOW to remove an external beacon. The gateway module detects if you have any external beacons configured. If it detects that none are configured, it automatically adds $baseurl as one of the external beacons. Setting up the external beacons is as simple as these commands.

$beaconID = ([guid]::NewGuid()).ToString()
Add-DSGlobalExternalBeacon -BeaconID $beaconID -BeaconAddress
$beaconID = ([guid]::NewGuid()).ToString()
Add-DSGlobalExternalBeacon -BeaconID $beaconID -BeaconAddress

Next, we are going to add a NetScaler Gateway to the configuration. Reference the variables above. There is not much complicated in this command. This box is also the XDC, so the STA setup simply points to http://server.domain.local/scripts/ctxsta.dll.

$GatewayID = ([guid]::NewGuid()).ToString()
Add-DSGlobalV10Gateway -Id $GatewayID -Name $GatewayName -Address $GatewayAddress -Logon Domain -IPAddress $snipIP -SessionReliability $false -SecureTicketAuthorityUrls $staservers -IsDefault $true

The previous command creates the NetScaler gateway. We then have to enable NetScaler authentication and link this gateway to the store.

$gateway = Get-DSGlobalGateway -GatewayId $GatewayID
Set-DSStoreGateways -SiteId 1 -VirtualPath $SFPath -Gateways $gateway
Set-DSStoreRemoteAccess -SiteId 1 -VirtualPath $SFPath -RemoteAccessType "StoresOnly"
Add-DSAuthenticationProtocolsDeployed -SiteId 1 -VirtualPath /Citrix/Authentication -Protocols CitrixAGBasic
Set-DSWebReceiverAuthenticationMethods -SiteId 1 -VirtualPath $SFPathWeb -AuthenticationMethods ExplicitForms,CitrixAGBasic

This next command is from Eric’s blog referenced above. It disables the “check publisher’s certificate revocation” setting to speed up console start-up.

set-ItemProperty -path "REGISTRY::\HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\WinTrust\Trust Providers\Software Publishing\" -name State -value 146944

Lastly, as we are using $fqdn as $baseurl, we will want to set the loopback to OnUsingHttp because the certificate is not going to match. You can look at more details on this command here.

Set-DSLoopback -SiteId 1 -VirtualPath $SFPathWeb -Loopback OnUsingHttp

There we have it. Storefront configuration is DONE, dude! All we need to do is set up an internal DNS CNAME to point to server.domain.local and we have a single URL for internal/external access to your XenDesktop 7.6 environment.
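If your DNS runs on Windows, even the CNAME can be scripted. The zone and record names below are placeholders, and this assumes the DnsServer module on your DNS server.

# Point a friendly name (e.g. citrix.domain.local) at the StoreFront/XDC box.
Add-DnsServerResourceRecordCName -ZoneName "domain.local" -Name "citrix" -HostNameAlias "server.domain.local"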

Full Script is below.

# Certificate Password
$myPW = read-host -Prompt "Enter Cert Password here"

# StoreFront Parameters
$nbn = $env:USERDOMAIN
$GatewayAddress = ""
$Farmname = "XenApp76 Farm"
$Port = "80"
$TransportType = "HTTP"
$sslRelayPort = "443"
$LoadBalance = $false
$FarmType = "XenDesktop"
$fqdn = "$env:computername.$env:userdnsdomain"
$baseurl = "https://" + $fqdn
$SFPath = "/Citrix/" + $nbn.toLower()
$SFPathWeb = "$SFPath`Web"
$SFPathDA = "$SFPath`DesktopAppliance"
$GatewayName = "TEST-RDU-NS-01"
$staservers = "http://" + $fqdn + "/scripts/ctxsta.dll"
$snipIP = ""

#write-host "Mapping Drive"
net use --% x: \\\XA76
Import-Module ServerManager

write-host "Installing Storefront Prereqs"
Add-WindowsFeature AS-Net-Framework,Web-Net-Ext45,Web-AppInit,Web-ASP-Net45,Web-ISAPI-Ext,Web-ISAPI-Filter,Web-Default-Doc,Web-HTTP-Errors,Web-Static-Content,Web-HTTP-Redirect,Web-HTTP-Logging,Web-Filtering,Web-Windows-Auth,Web-Client-Auth
cd '.\x64\XenDesktop Setup'

write-host "Installing Storefront"
.\XenDesktopServerSetup.exe --% /components STOREFRONT /NOSQL /quiet

#write-host "Copy certificate to C:\ before moving on"

write-host "Installing Certificate"
$certpw = ConvertTo-SecureString -String $myPW -Force -AsPlainText
Import-PfxCertificate -FilePath "C:\cert.pfx" -CertStoreLocation Cert:\LocalMachine\My -Password $certpw
New-WebBinding -Name "Default Web Site" -IPAddress "*" -Port 443 -Protocol https
cd IIS:
cd .\SSLBindings
Get-Item Cert:\LocalMachine\My\8CE850C9DCD7C26DF8E8FD4C44BF7D9E586E8AD1 | New-Item 0.0.0.0!443

# Import Storefront module
write-host "Installing Storefront Modules"
. "C:\Program Files\Citrix\Receiver StoreFront\Scripts\ImportModules.ps1"

# Setup Initial Configuration
write-host "Initial Storefront Configuration"
Set-DSInitialConfiguration -hostBaseUrl $baseurl -farmName $Farmname -port $Port -transportType $TransportType -sslRelayPort $sslRelayPort -servers $fqdn -loadBalance $LoadBalance -farmType $FarmType -StoreFriendlyName TEST -StoreVirtualPath $SFPath -WebReceiverVirtualPath $SFPathWeb -DesktopApplianceVirtualPath $SFPathDA

write-host "Configuring Beacons"
$beaconID = ([guid]::NewGuid()).ToString()
Add-DSGlobalExternalBeacon -BeaconID $beaconID -BeaconAddress
$beaconID = ([guid]::NewGuid()).ToString()
Add-DSGlobalExternalBeacon -BeaconID $beaconID -BeaconAddress

$GatewayID = ([guid]::NewGuid()).ToString()
Add-DSGlobalV10Gateway -Id $GatewayID -Name $GatewayName -Address $GatewayAddress -Logon Domain -IPAddress $snipIP -SessionReliability $false -SecureTicketAuthorityUrls $staservers -IsDefault $true
$gateway = Get-DSGlobalGateway -GatewayId $GatewayID
Set-DSStoreGateways -SiteId 1 -VirtualPath $SFPath -Gateways $gateway
Set-DSStoreRemoteAccess -SiteId 1 -VirtualPath $SFPath -RemoteAccessType "StoresOnly"
Add-DSAuthenticationProtocolsDeployed -SiteId 1 -VirtualPath /Citrix/Authentication -Protocols CitrixAGBasic
Set-DSWebReceiverAuthenticationMethods -SiteId 1 -VirtualPath $SFPathWeb -AuthenticationMethods ExplicitForms,CitrixAGBasic

write-host "Disable check publisher's cert revocation"
set-ItemProperty -path "REGISTRY::\HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\WinTrust\Trust Providers\Software Publishing\" -name State -value 146944

write-host "Setting Loopback to OnUsingHttp"
Set-DSLoopback -SiteId 1 -VirtualPath $SFPathWeb -Loopback OnUsingHttp

Mar 20, 2015

EDIT: People have been requesting a tool to deploy SMS2 secret keys en masse, and the developer hasn’t implemented it yet. Until he does, I wrote a PowerShell script that will remotely connect to the SQL database and inject the information needed for each user you select. I have it set up for TOTP keys… which I think is what most people will use.
EDIT2: I created a new script that does basically the same thing as the script posted above, but you can direct it against a specific AD group. Also, if you haven’t yet, upgrade your NetScalers to version 11 – it is much easier to control the portal themes.

Get SMS2

Go to the SMS2 site and register for your free copy – an email will be sent to you with a download link and your XML-based license.

Prepare your environment

You will need SQL/SQLExpress if you don’t already have it (will assume you do). You also need .NET 4 on the RADIUS server (will assume you have that too).

1. On the server you wish to use for RADIUS authentication open server management and click Add Roles and Features

2. Install the Network Policy and Access Services role and add any features that go along with that role – accept all the defaults.

3. Open the Network Policy Server Console

a. Expand Policies and select Network Policies

b. Right click Connections to other access servers and select properties

c. Change it from Deny access to Grant access and hit ok

d. Expand RADIUS Clients and Servers

e. Right click RADIUS Clients and select New

f. Create a connection for the local computer (so you can test connections).
Friendly name – whatever you want to name it
Address – the IP address of the RADIUS server you are creating
Shared secret – type something in that you will remember (will need it later)
Hit OK

g. Do the same thing for your Netscaler(s) using the NSIP(s) – again remember your shared secret – if you have more than one Netscaler use the same shared secret.
Should look something like this when you are done
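Side note: if you would rather script the RADIUS clients than click through NPS, something like this should work on 2012 R2. The names, IPs, and secret are placeholders.

# Create the local-test and NetScaler RADIUS clients from PowerShell (NPS module).
New-NpsRadiusClient -Name "LocalTest" -Address "192.168.1.10" -SharedSecret "YourSharedSecret"
New-NpsRadiusClient -Name "NetScaler1" -Address "192.168.1.11" -SharedSecret "YourSharedSecret"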

4. Install SMS2

a. Next

b. For my purposes I select Custom (I don’t want SMS based authentication – just token)

i. Services I set CloudSMS to not install

ii. Under Clients I set all to install but the Citrix Web Interface Customization and SMS2…

c. Configure AuthEngine – enter the license text from the email you received and hit Check License (should pop up when it expires) – click ok and then Next

d. Leave the account as Local System and hit Next

e. On the next screen change the AuthEngine Address to 0.0.0.0 (it will reply on all IP addresses of the server)
Type in your domain controller name/address and fill in user account credentials of a user with access to AD
optionally you can change the BaseDN, but I’ll leave it as the root of my test domain
test your config and hit Next if successful

f. Enter your SQL server information
If the SQL instance is on the RADIUS server itself (as it is in my case) check the box to “Use named pipes (local)”
Click Test Connection – I get an error about how it could not use the database… it wasn’t there yet. I hit test connection again and it is successful.

g. Enter your email information – uncheck SSL and Use Auth if you don’t need them (straight smtp for me) – Finish

h. Configure OATHCalc – Next – Finish

i. Configure AdminGUI/Clients – Set the AuthEngine Address to the IP of the RADIUS server, and hit Finish

j. Next – install – Finish

Configure SMS2 for Token

1. Browse to C:\Program Files\WrightCCS2\Settings (assuming you installed the 64bit version… if not the Settings directory will be in x86)

2. Open Configuration.xml in notepad and change these settings (by default they are True, which will mess things up)

3. Find the <AuthProviders> line

a. Under CloudSMS – disable it (we didn’t install it anyways)

b. Under OATHCalc set it as default

c. Under PINTAN – disable

d. Under Email – disable

4. Save the .xml file and restart the WrightAuthEngine service (if they are not started – start them)

Setup all users for token (this could potentially take a long time)

1. Launch the SMS2 Admin Console

2. Select the user on the right-hand side, and hit Configuration Menu at the top.

3. Go to the Auth Options tab (don’t need the others)

4. Click TOTP (time-based) and click Generate Shared Secret – record the shared secret if you want

5. Click Save configuration and you will see a popup – click OK and then you will see a QR code – copy it to the clipboard and send it to the user (also keep a record of it if you want)

6. Click Close

7. Do that again and again until you have a token for every user who needs to connect to XenApp/XenDesktop through the gateway

At this point users would download Google Authenticator or Microsoft Authenticator (probably others) to their smartphone and add the account using that QR code. Let’s assume everyone has done that.


1. Download NTRadPing – google it if the link doesn’t work… you will find it

2. From your RADIUS server unzip it and run it (remember we created a client connection for the local computer earlier)
Type the IP of your radius server (port is 1812 if it isn’t there by default)
Leave the reply/retries set to default
Type in your secure string that you associated with the local computer RADIUS client
Type in the domain\username of a user you have configured to use one of the authenticator apps
Type in the password followed immediately by whatever code is showing in your authenticator app. If the password is “P@ssword!” then you would type P@ssword!456123 (where 456123 is the number generated).
Click Send – If you see Reply-Message=Message accepted then you are good to go. If not then something is wrong.

Configure Netscaler

GUI 10.5

1. Log on to your NetScaler and browse to NetScaler Gateway\Policies\Authentication\RADIUS

2. Click the Servers tab and click Add
Give it a name
Select Server IP and punch in the IP of the RADIUS server
Port will be 1812
Type in the secret key you used to create the Netscaler RADIUS clients on the RADIUS server
Click Details and set Accounting* to OFF
Click Create

3. Click the Policies tab and click Add
Name the policy
Select the Server you just created (if it isn’t pre-selected)
Type in “ns_true” into the Expression field and hit Create

4. Bind the policy to your Netscaler Gateway virtual server(s) (NetScaler Gateway\Virtual Servers)
Select the virtual server and hit edit
Click the + on Authentication
Choose RADIUS and Secondary from the drop downs and hit Continue
Click to select the policy
Tick the policy you just created and hit ok
Click Bind
Click done and save

At this point you should be ready to test logging onto the gateway page

Testing the gateway

1. Hit your gateway address – you will probably notice it has changed and looks something like this:
Password1 is your password
Password2 is your token pin

2. Logon using your credentials and the token generated by Google Authenticator (or whatever app you are using).

a. If it works then you are good to go and can move onto customizing the web interface

b. If it does not work unbind the policy and test to figure out where things are going wrong

i. Could be the wrong IP entered in for the Netscaler on the RADIUS server or wrong security string

Fixing the gateway appearance

1. Download Notepad++ and install (

2. Download Tunnelier (

3. Install and run Tunnelier (Bitvise SSH Client)

4. Connect to your netscaler using the password method

5. A command window and a SFTP window will open – select the SFTP window and on the right hand side browse to: /var/netscaler/gui/vpn

6. Select login.js and click the download button at the bottom (will download to your local desktop by default unless you change it on the left side… should be ok).

7. On the right side go into the resources folder and download en.xml

8. Make a backup copy of the files just in case

9. Open login.js using Notepad++
Find this line: if ( pwc == 2 ) { document.write('&nbsp;1'); }
and change it to:
if ( pwc == 2 ) { document.write('&nbsp;'); }
Just remove the 1 basically
Find the line that starts with: document.write('<TR><TD align=right style="padding-top:
and change right to left

10. Save login.js

11. Open en.xml in Notepad++
Find this line: <String id="Password2">Password 2:</String>
Change it to: <String id="Password2">Token:</String>
You can name it whatever you want… I’m just using Token:

12. In Tunnelier upload the files to their respective directories overwriting them

13. Refresh your browser and your changes should be reflected.

The only problem now is that this change will not survive a reboot. In older versions of netscaler you could use a rewrite policy to rewrite the page and that would persist. In 10.1+ you have to use a custom theme.


Set a custom theme so the gateway appearance persists a reboot

NOTE: Linux is case sensitive… type things exactly as I have them.

1. Using Tunnelier switch to your terminal window
cp /nsconfig/ns.conf /nsconfig/
mkdir /var/ns_gui_custom
cd /netscaler
tar -cvzf /var/ns_gui_custom/customtheme.tar.gz ns_gui/*

What we did there was make a backup of ns.conf (in case something goes awry – reverse the “cp” command to restore it), create a folder, and zip the contents of /netscaler/ns_gui to /var/ns_gui_custom/customtheme.tar.gz – that is the file and location that the NetScaler knows to use for a custom theme.

2. Open your netscaler in your browser, logon and navigate to NetScaler Gateway\Global Settings

3. Click the Change Global Settings link on the right side

4. Click the Client Experience tab and scroll to the bottom

5. Switch the UI Theme to Custom and hit OK

6. TEST the gateway page (I use a chrome incognito window when I make a change as it doesn’t use the cached website)

7. If the test is successful save your netscaler configuration

a. If you have a HA pair I am pretty sure you have to mirror all the steps on the secondary except for setting the UI Theme to Custom. On your secondary:

i. Copy the files to the correct locations on the secondary netscaler

ii. Run the commands from the terminal window

iii. Force a sync from the GUI (System\High Availability -> Actions)

Mar 16, 2015

UPDATE: I found that trying to run this script through the NetScaler Gateway failed due to differences in the web pages. I re-wrote the script so it will work internally directly against StoreFront, and externally with NetScaler Gateway. The main caveat is that wficalib.dll doesn’t allow you to log off the session when going through the gateway. I simply set it to look for any wfica32 processes prior to launching the application/desktop, compare that to the processes after launch, and kill the new process (disconnecting the session). The script identifies the webpage as a gateway if */vpn/* is in the path. I also set it to log off of the Storefront/Gateway page when it ends the script. If you were to run it back to back without logging off of the page, it wouldn’t find what it is looking for initially and fail (because you’d probably already be logged on). I may re-write this in the future with more functions to make it a bit shorter/cleaner, but as-is it should work.
NOTE: I tested this with Netscaler 10.5… it may not work with previous versions as is, but if you read the script you should be able to figure out what needs to be changed.
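Purely to illustrate the process-comparison trick described in the update above (this is not the full script, and the timing is just an example):

# Snapshot wfica32 PIDs before launch, then kill whatever is new afterwards (disconnects the test session).
$before = @(Get-Process -Name wfica32 -ErrorAction SilentlyContinue | Select-Object -ExpandProperty Id)
# ... launch the published application/desktop through the gateway page here ...
Start-Sleep -Seconds 60
Get-Process -Name wfica32 -ErrorAction SilentlyContinue |
    Where-Object { $before -notcontains $_.Id } |
    Stop-Process -Force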

I need to give credit to this Citrix blog post for getting me started.  This script will launch an application or desktop as a user you specify from the StoreFront web page, then send you an email to let you know if it was successful or not.

Variables you should edit:

In the send-results function
$smtpserver (line 19)
$msg.From (line 28)
$ (line 29)

In the main script (be sure to read the comments)
$username (line 84)
$passstring (line 86)
$resource (line 92)
$mask (line 94)
$wait (line 96)
$internetexplorer.visible (line 98)
$internetexplorer.navigate2 (line 99)

Run it from PowerShell (x86)! – otherwise you cannot tie into the x86 DLL for Receiver
(If you are going to the Netscaler Gateway it doesn’t matter which version as we won’t tie into the dll in that case)

Add the following to the registry if you are pointing to the internal StoreFront url – if you are pointing to the Netscaler Gateway it won’t matter
Windows Registry Editor Version 5.00


If you are running on an x86 machine change the registry path to exclude Wow6432Node, and change the path of the .dll on line 156 of the script to the correct path of the wficalib.dll.

Here is the script!

Contact me in the channel – David62277