Dec 22 2020
 

I was recently tasked with moving some PVS targets and vDisks to a new PVS environment. There are many ways to achieve this, but this is how I migrate devices from one PVS farm to another. I am sure there are better ways to automate it; my blog is just a simple way to get it done. Another way I have done it in the past was to add new PVS servers into an existing farm, go through the steps to make them the primary servers, and of course decommission the old ones. This is a fairly simple task, and most people already know how to do it, but if not, these steps will help.

Open your PVS Console, select the PVS vDisk, then click Export.

Select the lowest version to grab the vDisk Chain

You will see a Filename.xml get created. This is a manifest file that tells the destination PVS server how to import the vDisk versions.

For me, I like to create Stores for all my different vDisks.

Give it a name

Check all the PVS Servers that will contain the vDisk

Create the folder in the location where you are storing the vDisk, then validate the paths.

I leave the default write cache location as selected, since we still use the legacy “cache on server” write cache method.

Now I just copy the contents from the original server to the destination.
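For the copy itself I typically just use robocopy from one of the new PVS servers. A quick sketch – the store paths below are placeholders for your environment, and you only need the vDisk chain files plus the exported manifest (skip the .lok lock files):

# Placeholder paths – point these at your actual vDisk store locations
$source      = "\\OLD-PVS01\D$\vDiskStore"
$destination = "D:\vDiskStore"

# Copy the base disk, versioned disks, properties files, and the exported XML manifest
# (use *.vhd / *.avhd instead if your vDisks are the older format)
& robocopy $source $destination *.vhdx *.avhdx *.pvp *.xml /XO /R:1 /W:1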

At this point I open up the PVS console and go to the vDisk Pool.

Right click and Import

Select the Store you created earlier, select a PVS server that you copied it to, and click Search.

Or you can import it from the Store, so you don’t have to select anything and take the chance of getting something wrong.

Click Add and close

Click OK after the vDisk is added

Now you can see the vDisk

For my master image updates I use a dedicated VMware device. I am going to move that device over to the new PVS server and update its subnet and MAC address to reflect a new subnet we just added.

Example

Change the Port group and note the MAC

Now at this point I manually create the Device in the new PVS Server.

Add the new MAC, and update the vDisks tab to point to the vDisk you imported.

Change the Type to Maintenance

Now, because I use the ISO BDM method, I need to update the device to a new BDM ISO that points to the new PVS boot path.

If you have more than one new PVS server, make sure you copy the vDisk files to the second server as well. Otherwise, when the target boots and the default load balancing sends it to the second server, it will tell you no vDisk was found.

Check and make sure the replication status is green on both servers
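If you want an extra sanity check beyond the console’s replication status, you can compare file hashes across the servers with PowerShell. A hedged sketch – the server names and store path are placeholders, and hashing large vDisks takes a while:

# Placeholder server names and store path
$servers = 'NEW-PVS01', 'NEW-PVS02'
foreach ($server in $servers) {
    Get-ChildItem "\\$server\D$\vDiskStore" -Recurse -Include *.vhdx, *.avhdx, *.pvp |
        Get-FileHash -Algorithm MD5 |
        Select-Object @{n='Server';e={$server}}, Path, Hash
}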

On the device, you will need to reset the machine account from PVS to recreate the domain trust password.
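(Side note, and not a PVS feature: if the domain trust still comes up broken after the move, Windows’ built-in Reset-ComputerMachinePassword, run on the target itself, can rebuild the secure channel. The DC name below is a placeholder.)

# Run this on the target device itself; -Server and -Credential are optional
Reset-ComputerMachinePassword -Server "DC01" -Credential (Get-Credential)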

Now boot the maintenance machine up and it should stream from the new location and find the vDisk.

Typically, for moving all the targets I would do something like this, then edit the MAC addresses so they align with the new subnet I changed to.

How to Migrate PVS Target Devices | Apps, Desktops, and Virtualization (kylewise.net)

For this situation, I can now create all new devices that will use the same vDisk. I will create a new Machine Catalog/Delivery Group, though you can also reuse the existing ones if needed. I use FSLogix, so the profile data will be intact for the user; the user will not even notice that the device name changed or that it moved to a different PVS server.

Oct 18 2019
 

Update 10/29/19: Added search (username/userfullname) to the Sessions tab.

Studio always seems extremely sluggish to me when trying to navigate between different areas. I’m always waiting a few seconds for the refresh circle to go away every single time I navigate somewhere new.

Most of the time I just need to put a machine in/out of maintenance, perform a power action, or logoff/disconnect a session. I found that PowerShell is MUCH faster, so I wrote a new little tool.

Requirements:
.NET 4.5
The Broker Admin PowerShell snap-in (available on the install media – alternatively you can run this on the delivery controller itself and use localhost as the connection string)

You can select each server/desktop/session individually, or select multiple. Then simply right click and select the operation to perform. I have tested this on versions 7.15 through 1906. Note: There is no warning! When you select the action to perform it will just do it.
Here is the download link
https://david62277.sharefile.com/d-sd0c6f7690ce436ba

Operations allowed (depending on state):
Machines – turn on/off maintenance, shutdown, restart, reset, poweroff
Sessions – Disconnect, Log Off
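For reference, the operations above map to a handful of broker cmdlets. This is not the tool’s source, just a rough sketch of the underlying calls – the DDC, machine, and user names are placeholders:

Add-PSSnapin Citrix.Broker.Admin.V2

$ddc = "localhost"   # or your delivery controller FQDN

# Maintenance mode on/off
$machine = Get-BrokerMachine -AdminAddress $ddc -MachineName "DOMAIN\SERVER01"
Set-BrokerMachineMaintenanceMode -AdminAddress $ddc -InputObject $machine $true

# Power actions (TurnOn, TurnOff, ShutDown, Restart, Reset)
New-BrokerHostingPowerAction -AdminAddress $ddc -MachineName "DOMAIN\SERVER01" -Action Restart

# Session actions
$session = Get-BrokerSession -AdminAddress $ddc -UserName "DOMAIN\jdoe"
Disconnect-BrokerSession -AdminAddress $ddc -InputObject $session   # disconnect
Stop-BrokerSession -AdminAddress $ddc -InputObject $session         # log off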

Mar 13 2018
 

——–UPDATE 3/28/2018 ———-

Starting with FSLogix 2.8.12 you no longer need to do this.  Great work FSLogix!

From their release notes:

“• An issue with first time OneDrive installation has been resolved. The issue occurred when OneDrive was installed for the first time, and OneDrive syncing began. The FSLogix agent would not redirect the OneDrive files to the VHD/X container until the user logged off and then back on again. The FSLogix agent has been enhanced to detect that OneDrive is being installed, and now immediately begins redirecting the OneDrive files to the VHD/X.”

I have tested this multiple times and it works perfectly.

However, I DO still recommend you make the symlink to C:\OneDriveTemp as stated below.

Old post below

———————————————————————————————————————————————————————————

We have started to use FSLogix Apps to deliver OneDrive into our Hosted XenApp Desktops that we offer to our customers. This is a great tool that allows users to use the native OneDrive utility in their XenApp sessions. Why would you need this, you may ask? The reason is that OneDrive forces your OneDrive data to be saved in %LOCALAPPDATA%. If you’re using Citrix Profile Manager (UPM) or really any other profile tool, you would be stuck trying to sync that data during logon/logout, which isn’t really feasible. You could also just use local profiles, but then you would be saving all of the data across all of your XenApp servers. I have not figured out a way to trick OneDrive to save to a network location. It was even too smart for a Windows symlink.

This is where FSLogix Apps comes in. FSLogix Apps can mount a VHD for each user, and with its file system filter driver it’s able to determine what data is destined for OneDrive and move that data into the VHD. At this point there is only one copy of the OneDrive data, and you can roam it across all XenApp servers in your environment. It works very well and we like it a lot, however there are a couple of important shortcomings I need to discuss. I will also show you how I worked around these issues with my own testing and scripting.

Note: THIS IS NOT SUPPORTED BY FSLOGIX. IT WORKS, BUT USE AT YOUR OWN RISK!

First: FSLogix Apps can not pre-mount junction points. If you look at a user who has logged in before configuring OneDrive, you will see the following information:

One, the VHD is mounted fine.

However, look at the redirects:

You can see here that only the Outlook and Lync redirects are in place. I don’t see anything for OneDrive. Let’s see what happens when we configure OneDrive right now. Let’s look at the space on C:

So, look at OneDrive and how much space it thinks is free. The VHD configured for all users is 100GB.

I’m going to select 5.5 GB of “crap” and start a sync. Let’s see what happens.

So, it has taken up space on the C:. Well this isn’t good! With a couple users and a small amount of data you can fill up the C: on your XenApp server. Bad! Also, in this type of configuration where you are only using FSLogix Apps for OneDrive and NOT for profiles, you would be syncing this information back with your profile management solution.

So… What can we do? A couple of things to help this out. I wrote a script to pre-mount the junction points. Let’s take a look.

function Register-Redirect {
    param(
        [string] $TenantName = "VC3, Inc" # Enter your Tenant ID here
    )
    try {
        # Ensure frx.exe is present in the PATH
        if (!([System.Environment]::GetEnvironmentVariable('PATH').Split(';') -contains "$($env:ProgramFiles)\fslogix\apps")) {
            Write-Verbose "frx.exe is not found in 'PATH'...Adding frx.exe to 'PATH'";
            $env:PATH = "$($env:PATH);$($env:ProgramFiles)\fslogix\apps";
        }
        # If the junction point is not mapped, parse out the volume number from the results returned by 'frx list-redirects'
        [string] $frxOutput = & 'frx' 'list-redirects' | Where-Object { $_ -match "\\$($env:USERNAME)\\" } | Out-String;
        if ($frxOutput -match '\\Microsoft\\OneDrive') {
            return;
        } else {
            $Matches = $null; # Reset the hashtable.
            $frxOutput -match '\\Device\\HarddiskVolume([1-9]{1,3})\\Skype4B\\User';
            $volume = $Matches[1];
            # Map the junction points matching the above regex.
            Write-Verbose -Message "Adding redirect => Source: \Device\HarddiskVolume2\Users\$($env:USERNAME)\OneDrive - $($TenantName) Target: \Device\HarddiskVolume$($volume)\OneDrive\User";
            frx add-redirect -src "\Device\HarddiskVolume2\Users\$($env:USERNAME)\OneDrive - $($TenantName)" -dest "\Device\HarddiskVolume$($volume)\OneDrive\User";
            Write-Verbose -Message "Adding redirect => Source: \Device\HarddiskVolume2\Users\$($env:USERNAME)\AppData\Local\Microsoft\OneDrive Target: \Device\HarddiskVolume$($volume)\OneDrive\UserMeta";
            frx add-redirect -src "\Device\HarddiskVolume2\Users\$($env:USERNAME)\AppData\Local\Microsoft\OneDrive" -dest "\Device\HarddiskVolume$($volume)\OneDrive\UserMeta";
        }
    } catch {
        throw "Unable to add redirect";
    }
};
Register-Redirect

sleep 5

& "C:\Program Files (x86)\Microsoft OneDrive\OneDriveSetup.exe"

All of this assumes your C: of your XenApp Server is HardDiskVolume2!!!

What does this do? Ok, first, you put your tenant ID in at the top. You need this because your folder is normally in the “OneDrive – TenantID” format. Next, it adds the path to frx.exe. Then it checks to see if the junction points are already mounted with `frx list-redirects`. (frx will have no problem duplicating them, so I had to write this check in.) It uses logic to only pull your own username, so if you are logged into a XenApp server with a bunch of users, you only return your own information. By default it will list all users.

If the script sees that you don’t have them mounted, it will mount them for you. This is where it gets tricky. Look at the existing `frx list-redirects` output:

The section on the right is where it mounts the information into the VHD. Notice it is HardDiskVolume4. What I had to do (with a bunch of help from BobFrankly/Atum – CitrixIRC) was write a regex to extract the “4” from this output, so we can use it to mount the OneDrive locations in the same volume. THIS IS CRITICAL. Otherwise you wouldn’t be writing to the VHD and that would be bad. So the script extracts the volume number, then runs `frx add-redirect` to add the junction points.

The script wraps all of this in a function, so the last thing it does is run the function and then launch OneDriveSetup.exe.

Let’s run this and see what it does. I added a pause so we could see it. You can see the two “Operation completed successfully!” messages, and the OneDrive Setup has launched.

On the server you see the junction points. Notice the addition of two OneDrive folders.

So, let’s setup OneDrive and see what happens. First, let’s look at the hard drive space again.

But wait, why does it still show the wrong amount of hard drive space? Bear with me.

Let’s sync 5GB of “crap” and see what happens. It shows the proper VHD size after the sync is complete. I’m not sure WHY it shows up incorrectly first, it just does.

Let’s check out the hard drive.

Success!

YOU SHOULD ONLY NEED TO RUN THIS SCRIPT ONCE! Once this process is done, FSLogix will handle the rest and you won’t have to do this ever again! I have this set up to launch through a shortcut in the user’s Start Menu as part of the “OneDrive Onboarding” process.

Second: If you have worked with OneDrive before, it’s great, but not perfect. Also, FSLogix doesn’t always perfectly clean up all of the junction points at logoff. You want to make sure they are gone at logoff, especially if something breaks during initial configuration of OneDrive. I have added a logoff script to kill the junction points. You will have to edit this for your own tenant ID!

frx del-redirect -src "\Device\HarddiskVolume2\Users\%USERNAME%\OneDrive - VC3, Inc"
frx del-redirect -src "\Device\HarddiskVolume2\Users\%USERNAME%\AppData\Local\Microsoft\OneDrive"

Third: even if you know how OneDrive behaves, you can still have a problem. By default, when you sync with OneDrive, the FIRST place it writes files to is C:\OneDriveTemp\<USERSID>. After it’s done processing, it moves them into the OneDrive file location. It does this on a file-by-file basis, but again, if you had a bunch of users all syncing at the same time, you could still fill up the C: on the XenApp server!

Lastly: The final uber-hack I did for this was to create a symlink on the server to point C:\OneDriveTemp to a network location. This one actually works with OneDrive. In my case I pointed it to a share I created on the same volume I was pointing the FSLogix VHDs to.

mklink /d C:\OneDriveTemp <SomePathHere>

That’s all I have. These are the steps I had to go through to use FSLogix Apps with OneDrive in our production environment. Have fun!

Stay tuned. In a future post I will show you how to setup QOS for OneDrive so you don’t kill your datacenter’s bandwidth when you have a bunch of people uploading files at the same time.

Feb 21 2018
 

I plagiarized David Ott’s script for migration of Citrix Profile Manager (UPM) profiles to FSLogix and created it for Local Profiles.

NOTE: This only works between like profile versions, e.g. you can’t migrate your 2008R2 profiles to Server 2016 and expect it to work. See this chart.

This requires using frx.exe, which means that FSLogix needs to be installed on the server that contains the profiles. The script will create the folders in the USERNAME_SID format, and set all proper permissions.

Use this script. Edit it. Run it (as administrator) from the Citrix server. It will pop up this screen to select what profiles to migrate.

#### EDIT ME
$newprofilepath = "\\domain.com\share\path"

#### Don't edit me
$ENV:PATH = "$ENV:PATH;C:\Program Files\fslogix\apps\"
$oldprofiles = gci c:\users | ?{$_.psiscontainer -eq $true} | select -Expand fullname | sort | out-gridview -OutputMode Multiple -title "Select profile(s) to convert"

# foreach old profile
foreach ($old in $oldprofiles) {

    $sam = ($old | split-path -leaf)
    $sid = (New-Object System.Security.Principal.NTAccount($sam)).translate([System.Security.Principal.SecurityIdentifier]).Value

    # set the nfolder path to \\newprofilepath\username_sid
    $nfolder = join-path $newprofilepath ($sam+"_"+$sid)
    # if $nfolder doesn't exist - create it with permissions
    if (!(test-path $nfolder)) {New-Item -Path $nfolder -ItemType directory | Out-Null}
    & icacls $nfolder /setowner "$env:userdomain\$sam" /T /C
    & icacls $nfolder /grant $env:userdomain\$sam`:`(OI`)`(CI`)F /T

    # sets vhd to \\nfolderpath\profile_username.vhdx (you can make vhd or vhdx here)
    $vhd = Join-Path $nfolder ("Profile_"+$sam+".vhdx")

    frx.exe copy-profile -filename $vhd -sid $sid
}
Dec 20 2017
 

In my environment I have nearly 100 persistent MCS VDAs. Our developers, contractors, and even us IT people need the ability to change things on our VDI desktops (install/uninstall/set registry/create local IIS sites/etc.) and have them persist reboots. Luckily, the only software I have to maintain on these desktops is the VDA itself. That still means when Citrix releases a new version that I want to move to, I have to upgrade almost 100 individual desktops. The last time I had to do this I wanted to rip my hair out, and decided to write a script to automate the task. It has saved me TONS of time, so I want to share it in hopes that it can save someone else some time (and hair).

I wrote the script specifically for my MCS environment, which runs on XenServer, but with a little tweaking it can be modified for any environment.

How it works (the short version):

  1. Gets the computer(s) you set – can be manual input or a delivery controller query
  2. On each computer it will create 2 scripts, and 2 scheduled tasks
    1. The first script loads auto logon info into the registry (see the sketch after this list)
    2. The second script handles the vda removal and install
  3. One scheduled task runs once to run the first script and reboot
  4. The second scheduled task runs the second script at logon (first script sets up a user – in my case the local administrator – to logon automatically)
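As a hedged illustration of step 2.1 above (this is not the actual script – grab that from the download and video below), the auto logon piece boils down to the well-known Winlogon registry values plus a run-at-logon scheduled task. The account name, password, and script path here are placeholders:

# Winlogon auto logon values (placeholder credentials – note DefaultPassword is stored in plain text,
# so the real script should clean these values up once the upgrade is finished)
$winlogon = "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"
Set-ItemProperty -Path $winlogon -Name AutoAdminLogon    -Value "1"
Set-ItemProperty -Path $winlogon -Name DefaultUserName   -Value "Administrator"
Set-ItemProperty -Path $winlogon -Name DefaultDomainName -Value $env:COMPUTERNAME
Set-ItemProperty -Path $winlogon -Name DefaultPassword   -Value "P@ssw0rd"

# Scheduled task that runs the VDA removal/install script at logon of that account
schtasks /Create /TN "VDAUpgrade" /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\UpgradeVDA.ps1" /SC ONLOGON /RU "Administrator" /RL HIGHEST /F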

Most things are explained in the script, and this time I’ve also created a YouTube video to go through it all and show it in action.

Script
Video

 

Oct 24 2017
 

NOTE: This only works between like profile versions, e.g. you can’t migrate your 2008R2 profiles to Server 2016 and expect it to work. See this chart.

I moved from UPM to FSLogix earlier this year, and decided to write my own PowerShell script to convert the UPM profiles to .vhd. FSLogix has its own conversion process (which I didn’t find a whole lot of info on), but I decided to create my own.

What this script does:

  1. Gets a list of all UPM profile directories in the root path (that you supply) and displays them to you to select which one(s) you would like to convert via out-gridview
  2. For each profile you select:
    1. Gets the username from the profile path – you will have to edit this part for your environment… explanation in the script
    2. Use the username to get the SID (FSLogix profiles use the username and sid to name the profile folder)
    3. Creates the FSLogix profile folder (if it doesn’t exist)
    4. Sets the user as the owner of that folder, and gives them full control
    5. Creates the .vhd (my default is 30GB dynamic – edit on line 70 if you wish to change it) – if it doesn’t exist (if it does, skip 7, 9-10; see the sketch after this list)
    6. Attaches the .vhd
    7. Creates a partition and formats it ntfs
    8. Assigns the letter T to that drive (edit on line 73 if you wish to change that)
    9. Creates the T:\Profile directory
    10. Grants System/Administrators and the user full control of the profile directory
    11. Copies the profile from the UPM path to the T:\Profile directory with /E /Purge – if you are re-running this script on a particular profile it will overwrite everything fresh
    12. Creates T:\Profile\AppData\Local\FSLogix if it doesn’t exist
    13. Creates T:\Profile\AppData\Local\FSLogix\ProfileData.reg if it doesn’t exist (this feeds the profilelist key at logon)
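The full script is linked below; purely as a sketch of steps 5–8 for a brand new profile (this is not the script itself), the create/attach/format part can be done with the Hyper-V storage cmdlets – assuming that module is available where you run it, and with a placeholder path:

# Sketch only: create a 30GB dynamic VHDX, mount it, partition/format it, and assign drive letter T
$vhd = "\\server\share\jdoe_S-1-5-21-xxxx\Profile_jdoe.vhdx"   # placeholder path

New-VHD -Path $vhd -SizeBytes 30GB -Dynamic |
    Mount-VHD -Passthru |
    Initialize-Disk -PartitionStyle MBR -PassThru |
    New-Partition -DriveLetter T -UseMaximumSize |
    Format-Volume -FileSystem NTFS -Confirm:$false -Force | Out-Null

New-Item -Path "T:\Profile" -ItemType Directory -Force | Out-Null
# ...copy the UPM profile into T:\Profile, drop in ProfileData.reg, then detach:
Dismount-VHD -Path $vhd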

Here is the script!

Jul 11 2017
 

I have been using Full Desktops inside of XenApp forever. Lately I have been working on a project where I will be using only published apps. We are a CSP and a managed service provider who uses LabTech (now ConnectWise Automate) to control all of our systems. LabTech uses a great remote access product called ScreenConnect to connect to the systems. All of this works flawlessly inside of a full desktop. However, when I published LabTech as a seamless app (LTClient.exe), everything seemed to work fine except for ScreenConnect. I got a great Citrix engineer on the line who actually used all of the collected data I uploaded and troubleshot the issue. ConnectWise is actually a “ClickOnce” application which leverages dfsvc.exe to install and launch ScreenConnect. You can read this super exciting article on ClickOnce applications here.

Technically, neither Citrix nor Microsoft supports these ClickOnce applications. Kudos to the Citrix engineer for continuing to work the issue with me, even though this is true. Luckily I already built a 2012R2 RemoteApp environment and was able to get this working to show Citrix this was not an application issue, but a Citrix seamless app issue. During troubleshooting he pointed me to this interesting article on ClickOnce and XenApp 6.5 here. I’m on 7.6 LTSR CU3, but it’s still a good article on how this stuff works.

Anyway, after looking at the procmon information in the ticket, he quickly found that in the working scenario dfsvc.exe was calling ScreenConnect.WindowsClient.exe, whereas in the seamless app it was not. His “solution” was to simply run dfsvc.exe before calling LTClient.exe. Not really a “fix” but hell, it worked! So, I created a simple PowerShell script.

start-process "C:\Windows\Microsoft.NET\Framework\v4.0.30319\dfsvc.exe"
start-process "C:\Program Files (x86)\LabTech Client\LTClient.exe"

Lastly, I added dfsvc.exe to LogoffCheckSysModules per this CTX article.

Enjoy!

 

 

 

 

Sep 30 2016
 

Environment

XenApp 7.6

700+ Delivered (published) Applications

60+ Windows servers (2008 R2 and 2012 R2)

 

Scenario

Recently I had a request to replicate 100+ applications from PROD to QA, using a QA server configured with identical applications and identical application locations/paths. Obviously all paths to EXE files need to be the same in order for this to work 100% (unless I missed a memo and XA can now support publishing identical applications from various paths – as far as I know, this was not yet available in 7.6). If the QA server has some of the applications in different paths, not all is lost: you can still use this process and script to migrate a large number of applications between delivery groups and then modify the paths later in Studio.

While I could add a few more lines to my PoSH script to actually replicate each application one at a time, the amount of time it would take to build that, versus the ability to simply duplicate applications in Studio, made it seem unnecessary.

My goal was to replicate, or the proper Citrix term would be duplicate, all applications and then assign them to another delivery group. Seems simple enough for a Citrix and PoSH guru. But those who are just getting their feet wet can use the following process to speed up their delivery time to less than 5 minutes and go look for the end of the Internet, while telling the client it took hours 😉

 

Process

1 – I will be duplicating all requested applications using ol’ Citrix Studio.

2 – I will run the script below to change the Delivery Group and the application folder as visible to the user (your mileage may vary, depending on your requirements)

This script/process is no rocket science, but it might help someone quickly replicate applications and migrate them to another delivery group instead of publishing them all over again. Modify the script below according to your environment before running it. (WARNING: It is a fairly simple script, so review it and try to understand exactly what it is doing before executing it.) Also, I am no expert when it comes to creating PowerShell scripts, just another Citrix admin. So, pardon me if you can make it better. Please do improve and share! I am all for helping fellow Citrix admins any way I can. Even if it’s buying a pint!

 

Step 1

Create an alternative application folder in Studio. For our scenario I am going to create a folder named “QA” inside the already created “Europe” folder.

Right-click on all applications that you need to replicate in QA (you can select multiple applications at once).

Click Duplicate Application 

Now select all duplicates and drag them over to QA folder.  In my scenario I will be dragging these to Europe\QA.

Step 2

The script below will prompt for the admin folder name where all the duplicates reside (that’s the new folder you just created; in my example it’s called Europe\QA). I repeat: do not select your production applications folder, as the script will move all your production apps to the new delivery group. Use the newly created QA folder where you moved all the duplicate applications in Step 1 above.

It is assumed that the new delivery group is already created.

Another item to note: there is an optional line (marked #optional in the script) to change the client-side folder location of the newly created applications. This is to help users identify whether they are running PROD or QA applications. It also looks cleaner in Storefront or WI. You can add more commands into the Foreach loop – things like modifying which users have access, changing the actual name of the application, etc. My goal was to keep everything the same and just deliver from the QA server.

Script

asnp Citrix*

$adminfolder = (Get-BrokerApplication -MaxRecordCount 10000).AdminFolderName | sort | select -unique | Out-GridView -Title "Select Admin Folder Name" -OutputMode Single
$applist = Get-Brokerapplication -AdminFolderName $adminfolder
$originalDG = (Get-BrokerDesktopGroup -MaxRecordCount 10000).Name | sort | Out-GridView -Title "Select Original Delivery Group Name" -OutputMode Single
$newDG = (Get-BrokerDesktopGroup -MaxRecordCount 10000).Name | sort | Out-GridView -Title "Select New Delivery Group Name" -OutputMode Single

Write-Host "Migrating all applications in $adminfolder`nFrom $originalDG Delivery Group to $newDG Delivery Group" -ForegroundColor Green

foreach ($app in $applist.ApplicationName){
                Write-host "Migrating $app"
                Get-BrokerApplication -ApplicationName $app | Add-BrokerApplication -DesktopGroup $newDG
                Get-BrokerApplication -ApplicationName $app | Remove-BrokerApplication -DesktopGroup $originalDG
                Get-BrokerApplication -ApplicationName $app | Set-BrokerApplication -ClientFolder "Europe\QA" #optional to show all applications inside QA folder and not in the same folder with production apps
 }

Bonus

BTW, using a similar Add-BrokerApplication command you can publish, or rather deliver, the same application from multiple delivery groups. Just comment out the Remove-BrokerApplication command and it will now launch from servers in PROD and QA, or any other DG of your choice. It comes in really handy when you have multiple DGs that host different applications, but some of the applications are identical – you can spread the load across multiple DGs. Think of it as the worker groups concept from XA 6.x with server groups. I had such a requirement that was easily achievable in XA 6.x, but not so much in XA 7.x. I paid for someone’s case of beer when they told me I could use the above-mentioned command to deliver the same application from multiple DGs, as it’s not clearly documented by Citrix. There is a surprise…
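In practice the bonus tip is just this (the application and delivery group names are placeholders):

# Deliver an existing published application from an additional delivery group –
# skip the Remove-BrokerApplication step and it stays available in both DGs
Get-BrokerApplication -ApplicationName "Notepad" | Add-BrokerApplication -DesktopGroup "QA Delivery Group"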

That’s all folks. My first ever citrixirc blog.  Whoo-hoo!

Over and out.

Apr 12 2016
 

Thank you Microsoft for changing fundamental things about your operating system, with little or no regard for those of us running in an RDS/XenApp type environment. Check out this TechNet article, which describes the changes that were made.

“In  Pre-Win 8, apps could set the default handler for a file type/protocol by manipulating the registry, this means you could easily have a script or a group policy manipulating the registry. For example  for Mailto protocol you just needed to change the “default” value under HKEY_CLASSES_ROOT\mailto\shell\open\command”

More importantly, you were able to use Group Policy Preferences (GPP) to set these values inside a GPO. You could also use Item Level Targeting (ILT) on them via GPP. This means you could easily have users in SecurityGroupA run Acrobat Pro for .pdfs and users in SecurityGroupB run Adobe Reader. However, the TechNet article goes on to say that post Windows 8,

“the registry changes are verified by a hash (unique per user and app) “

A little more digging tells us that the new hashing mechanism is also on a per-machine basis. This means that a hash would be different for each user, per app, per XenApp server. Very inconvenient and annoying. This also means that we can not use the built in GPP functions in Active Directory to set these file type associations. Also very inconvenient and annoying.

James Rankin did a great blogpost on this subject as well. You can read that here. He did a great job overviewing this issue and provided a solution using AppSense. This blog will show you how to do this with good old batch scripting and group policy. To be honest, I’m quite annoyed that I had to put together this “hack” to get around something that worked PERFECTLY FINE in 2008R2 with GPPs. If anyone has a more elegant solution, I’d love to see it. I’m not the best scripter in the world, but I’m very pragmatic. “It works”

The first thing we want to do is create a logoff script to delete HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\FileExts\.pdf. However, because of that pesky Deny in the UserChoice key, we are unable to simply do this. So, I have made a simple “regini” command to overwrite the permissions on that key so that we can delete it. In my environment, I have created FTA_PDF.txt in the NETLOGON directory. Inside this file is simply the registry key path and some numeric codes, which grant SYSTEM, Administrators, the Interactive User, etc., FULL CONTROL of the key.

Next, I create a FTA_Delete.bat file in NETLOGON. This runs the “regini” command to change the permissions, then a “reg delete” command to delete the key.

Then we need to create the script for the logon process. I’ve busted out good old “ifmember” for this one. It’s a simple executable that will check AD group membership. My script is pretty simple. It checks to see if a user is a member of the Acrobat group. If so, run the “reg add” to add the association to Acrobat. If not, it falls back to the default .pdf reader in this environment. In this case, it’s Adobe Reader. Keep in mind that you can add multiple programs and associations using this method. You can add Foxit here if you would like.

So, the sad fact of the matter is when I tried to set this as an actual “Logon Script” the functionality didn’t work. I had to set this in a User GPO: Administrative Templates\System\Logon\Run these programs at user logon. I’m also the type of person that hates to see a CMD window flash up on the screen right after I login. So, I wrote ANOTHER script called FTA_Starter.bat to call this script to run in a minimized window.

This is the script I added to the GPO.

So, I fought with this for a long time and it wasn’t working. I had to re-read James’s blog and found this little blurb at the bottom.

“Update – I built a third XenApp server, just to be sure, and this time the solution wouldn’t work until I removed the Registry key HKLM\Software\Classes\.pdf.”

This DID NOT WORK until the HKLM key was deleted from the servers. Do not forget this step.

I hope this helps you work through this issue in less time than it took me to do it.

Nov 05 2015
 

As a Citrix CSP, having a good set of scripts to deploy a base environment is critical. Setting up 50 environments by hand would take far more time than with good scripting. Now that I’ve finally had some time to sit down and not be working on 6.5 stuff, I have been able to write scripts to install and configure XenDesktop 7.6 with Storefront 3 using PowerShell (and some batch). Full disclosure! I am NOT good at PowerShell, or scripting in general. I’m sure there are a million better/easier ways to do these things, but what I have works, dammit. The point of this blog is to show you the commands, and what they do. I will attach my final scripts where you can tune and tweak as needed.

As always, credit where credit is due. I took some code from Aaron Parker from here. I also took some code from Eric from here. Lastly, I’d like to give a BIG THANKS to Esther Barhel (@virtuEs_IT) for helping me out with the non-documented Storefront 3 PowerShell commands.

If you have read my blog, most of my environments are built for the SMB/SMG space. That being said in this blog we will be setting up a single instance configuration. There will be a single server with both the XDC and Storefront roles on it. You could easily take this code to add additional XDCs or Storefront servers to your environment. I will be using code snippets to explain what they are for during the blog.

The first thing we want to do is install XenDesktop. As this is a small environment, we are going to use SQL Express on the XDC. Personally, I do not like to use SQL instances, so in my configuration I am going to install SQL with the default instance instead of the SQLEXPRESS instance. For this, I have created a directory with the XD7.6 disk extracted and shared on the network. I then created a SQL directory and extracted SQL2014 Express in there. I shared this directory out as XA76. (Long live XenApp!)

Installing SQL and XenDesktop

This first “script” simply does a net use to the share, installs SQL, then installs XenDesktop with only the Controller and Studio roles. Note that you do not want to install Storefront yet. This script will reboot the server. As a scripting/PowerShell newbie, I learned a lot during this. For example, in PowerShell, the use of --% tells PowerShell to stop parsing code after those characters. This helped me a lot because PowerShell was trying to parse the \’s and /’s and failing miserably.

net use --% x: \\10.56.90.20\XA76
x:
cd '.\SQL'
write-host "Installing SQL"
 .\SETUP.EXE --% /QS /ACTION=install /FEATURES=SQL,Tools /INSTANCENAME=MSSQLSERVER  /IAcceptSQLServerLicenseTerms=true
cd ..
cd '.\x64\XenDesktop Setup'
write-host "Installing XenDesktop"
.\XenDesktopServerSetup.exe --% /components CONTROLLER,DESKTOPSTUDIO /NOSQL /quiet /configure_firewall
shutdown -f -r -t 2

XenDesktop Configuration

After the server is done rebooting, we can start with the configuration of the XenDesktop Site. First thing we want to do is setup the databases. I’m going to setup the default databases to the local system we just installed SQL on, using the NetBIOS Name of the domain as the site name. In this example $NBN is NetBIOS Name.

New-XDDatabase -AllDefaultDatabases -DatabaseServer $env:COMPUTERNAME -SiteName $NBN

The next thing we want to do is setup the Site. In my configuration, I use the NetBIOS name and append it to the DBs. In this example $LDB would be CitrixConfigLoggingTEST where TEST is the NetBIOS name. The same is done with the Monitor Database and Site Database.

New-XDSite -DatabaseServer $env:COMPUTERNAME -LoggingDatabaseName $LDB -MonitorDatabaseName $MDB -SiteDatabaseName $SDB -SiteName $NBN

Next, we want to set up licensing. This is pretty self-explanatory: it sets the license server and the port, then sets the product code and edition. I generally use a generic DNS name for the licensing server, “ctxlicense” in this example, that points to the license server IP address. My example shows a setup for XenDesktop Advanced edition. You can get these values with the following commands.

PS C:\> Get-ConfigProduct

Code                    Name
----                    ----
XDT                     XenDesktop
MPS                     XenApp
PS C:\> Get-ConfigProductEdition -ProductCode XDT
PLT
ENT
APP
ADV
STD

Here is my code to add licensing.

Set-XDLicensing -LicenseServerAddress ctxlicense -LicenseServerPort 27000
Set-ConfigSite -LicensingModel Concurrent -ProductCode XDT -ProductEdition ADV

Next, we setup a Machine catalog. This assumes that we already have at least one VDA setup, already pointed to this Delivery Controller. This is a simple persistent desktop Machine Catalog. This is setup for XenApp type Full Desktops (MultiSession), and we add the first XD box to the Machine Catalog.

New-BrokerCatalog -SessionSupport MultiSession -ProvisioningType Manual -AllocationType Random -Name 2012R2 -Description 2012R2 -PersistUserChanges OnLocal -MachinesArePhysical $true
New-BrokerMachine -MachineName TEST-RDU-XD-01 -CatalogUid 1

This next bit of code sets up the Desktop Group. This code snip was taken from Aaron Parker’s blog here and edited for my use. I encourage you to read his blog page to understand what this stuff does. It is much more involved than my code, and does a great job setting up the delivery group. I will try to break down the pieces for all of us, though. Let’s walk through the variables first.

$XDC = $env:COMPUTERNAME
$assignedGroup = "$NBN`\$NBN`_CTX_Desktop"
$desktopGroupName = "Some Desktop Group"
$desktopGroupPublishedName = "Some Desktop Group"
$desktopGroupDesc = "Some Desktop Group"
$colorDepth = 'TwentyFourBit'
$deliveryType = 'DesktopsandApps'
$desktopKind = 'Shared'
$sessionSupport = "MultiSession"
$functionalLevel = 'L7_6'
$timeZone = 'EST Eastern Standard Time'
$offPeakBuffer = 10
$peakBuffer = 10
$machineCatalogName = "2012R2"

You can set a TON of stuff in here, and it can get complicated when you are doing MCS/PVS and stuff. In this example, we are setting up a XenApp Full Desktop against the Machine Catalog we created above (2012R2). $deliveryType can be DesktopsOnly, DesktopsAndApps, or AppsOnly. $desktopKind can be Shared or Private. $sessionSupport can be MultiSession or SingleSession. This is a new environment, with all 7.6, so we set the $functionalLevel to L7_6. This can be set to 5, 7, or 7.6. There are so many other commands in here that Aaron has detailed; I won’t go into them here. The command I have used for this example is below.

New-BrokerDesktopGroup -ErrorAction SilentlyContinue -AdminAddress $XDC -Name $desktopGroupName -DesktopKind $desktopKind -DeliveryType $deliveryType -Description $desktopGroupPublishedName -PublishedName $desktopGroupPublishedName -MinimumFunctionalLevel $functionalLevel -ColorDepth $colorDepth -SessionSupport $sessionSupport -InMaintenanceMode $False -IsRemotePC $False -SecureIcaRequired $False -Scope @()

The next thing we need to do is to add machines to the desktop group.

$machineCatalog = Get-BrokerCatalog -AdminAddress $XDC -Name $machineCatalogName
Add-BrokerMachinesToDesktopGroup -AdminAddress $XDC -Catalog $machineCatalog -Count $machineCatalog.UnassignedCount -DesktopGroup $desktopGroup

Then we want to add users to the desktop group

$brokerUsers = New-BrokerUser -AdminAddress $XDC -Name $assignedGroup
New-BrokerEntitlementPolicyRule -AdminAddress $XDC -Name ($desktopGroupName + "_" + $Num.ToString()) -IncludedUsers $brokerUsers -DesktopGroupUid $desktopGroup.Uid -Enabled $True -IncludedUserFilterEnabled $False

Lastly, we want to allow access through Access Gateway

New-BrokerAccessPolicyRule -AdminAddress $XDC -Name $accessPolicyRule -IncludedUsers @($brokerUsers.Name) -AllowedConnections 'ViaAG' -AllowedProtocols @('HDX','RDP') -AllowRestart $True -DesktopGroupUid $desktopGroup.Uid -Enabled $True -IncludedSmartAccessFilterEnabled $True -IncludedSmartAccessTags @() -IncludedUserFilterEnabled $True

Here is the full script from start to finish.

add-pssnapin c*
$XDC = $env:COMPUTERNAME
$nbn = $env:USERDOMAIN
$assignedGroup = "$NBN`\$NBN`_CTX_Desktop"
$LDB = "CitrixConfigLogging" + $nbn
$MDB = "CitrixMonitor" + $nbn
$SDB = "Citrix" + $nbn

# Desktop Group properties
$desktopGroupName = "Some Desktop Group"
$desktopGroupPublishedName = "Some Desktop Group"
$desktopGroupDesc = "Some Desktop Group"
$colorDepth = 'TwentyFourBit'
$deliveryType = 'DesktopsandApps'
$desktopKind = 'Shared'
$sessionSupport = "MultiSession"
$functionalLevel = 'L7_6'
$timeZone = 'EST Eastern Standard Time'
$offPeakBuffer = 10
$peakBuffer = 10
$machineCatalogName = "2012R2"

write-host "Creating Citrix Databases"
New-XDDatabase -AllDefaultDatabases -DatabaseServer $env:COMPUTERNAME -SiteName $NBN

write-host "Setting up Site"
New-XDSite -DatabaseServer $env:COMPUTERNAME -LoggingDatabaseName $LDB -MonitorDatabaseName $MDB -SiteDatabaseName $SDB -SiteName $NBN

write-host "Setting up licensing"
Set-XDLicensing -LicenseServerAddress ctxlicense -LicenseServerPort 27000
Set-ConfigSite -LicensingModel Concurrent -ProductCode XDT -ProductEdition ADV

write-host "Setting up Machine Catalog"
New-BrokerCatalog -SessionSupport MultiSession -ProvisioningType Manual -AllocationType Random -Name 2012R2 -Description 2012R2 -PersistUserChanges OnLocal -MachinesArePhysical $true
New-BrokerMachine -MachineName TEST-RDU-XD-01 -CatalogUid 1

$VerbosePreference = "Continue"

write-host "Creating Desktop Group"

If (!(Get-BrokerDesktopGroup -Name $desktopGroupName -ErrorAction SilentlyContinue)) {
Write-Verbose "Creating new Desktop Group: $desktopGroupName"
$desktopGroup = New-BrokerDesktopGroup -ErrorAction SilentlyContinue -AdminAddress $XDC -Name $desktopGroupName -DesktopKind $desktopKind -DeliveryType $deliveryType -Description $desktopGroupPublishedName -PublishedName $desktopGroupPublishedName -MinimumFunctionalLevel $functionalLevel -ColorDepth $colorDepth -SessionSupport $sessionSupport -InMaintenanceMode $False -IsRemotePC $False -SecureIcaRequired $False -Scope @()
}
If ($desktopGroup) {

Write-Verbose "Getting details for the Machine Catalog: $machineCatalogName"
$machineCatalog = Get-BrokerCatalog -AdminAddress $XDC -Name $machineCatalogName
Write-Verbose "Adding $machineCatalog.UnassignedCount machines to the Desktop Group: $desktopGroupName"
$machinesCount = Add-BrokerMachinesToDesktopGroup -AdminAddress $XDC -Catalog $machineCatalog -Count $machineCatalog.UnassignedCount -DesktopGroup $desktopGroup

Write-Verbose "Creating user/group object in the broker for $assignedGroup"
If (!(Get-BrokerUser -AdminAddress $XDC -Name $assignedGroup -ErrorAction SilentlyContinue)) {
$brokerUsers = New-BrokerUser -AdminAddress $XDC -Name $assignedGroup
} Else {
$brokerUsers = Get-BrokerUser -AdminAddress $XDC -Name $assignedGroup
}

$Num = 1
Do {
$Test = Test-BrokerEntitlementPolicyRuleNameAvailable -AdminAddress $XDC -Name @($desktopGroupName + "_" + $Num.ToString()) -ErrorAction SilentlyContinue
If ($Test.Available -eq $False) { $Num = $Num + 1 }
} While ($Test.Available -eq $False)
Write-Verbose "Assigning $brokerUsers.Name to Desktop Catalog: $machineCatalogName"
$EntPolicyRule = New-BrokerEntitlementPolicyRule -AdminAddress $XDC -Name ($desktopGroupName + "_" + $Num.ToString()) -IncludedUsers $brokerUsers -DesktopGroupUid $desktopGroup.Uid -Enabled $True -IncludedUserFilterEnabled $False

# Check whether access rules exist and then create rules for direct access and via Access Gateway
$accessPolicyRule = $desktopGroupName + "_Direct"
If (Test-BrokerAccessPolicyRuleNameAvailable -AdminAddress $XDC -Name @($accessPolicyRule) -ErrorAction SilentlyContinue) {
Write-Verbose "Allowing direct access rule to the Desktop Catalog: $machineCatalogName"
New-BrokerAccessPolicyRule -AdminAddress $XDC -Name $accessPolicyRule -IncludedUsers @($brokerUsers.Name) -AllowedConnections 'NotViaAG' -AllowedProtocols @('HDX','RDP') -AllowRestart $True -DesktopGroupUid $desktopGroup.Uid -Enabled $True -IncludedSmartAccessFilterEnabled $True -IncludedUserFilterEnabled $True
} Else {
Write-Error "Failed to add direct access rule $accessPolicyRule. It already exists."
}
$accessPolicyRule = $desktopGroupName + "_AG"
If (Test-BrokerAccessPolicyRuleNameAvailable -AdminAddress $XDC -Name @($accessPolicyRule) -ErrorAction SilentlyContinue) {
Write-Verbose "Allowing access via Access Gateway rule to the Desktop Catalog: $machineCatalogName"
New-BrokerAccessPolicyRule -AdminAddress $XDC -Name $accessPolicyRule -IncludedUsers @($brokerUsers.Name) -AllowedConnections 'ViaAG' -AllowedProtocols @('HDX','RDP') -AllowRestart $True -DesktopGroupUid $desktopGroup.Uid -Enabled $True -IncludedSmartAccessFilterEnabled $True -IncludedSmartAccessTags @() -IncludedUserFilterEnabled $True
} Else {
Write-Error "Failed to add Access Gateway rule $accessPolicyRule. It already exists."
}

} #End If DesktopGroup

Installing Storefront

This portion of the scripting is going to do a bunch of things. It will install the pre-requisites for Storefront, including IIS. It installs Storefront. It imports a certificate and binds it to the default website. Then it sets up the initial Storefront base URL and finishes the configuration. The first thing I did was to copy the 3.x version of CitrixStoreFront-x64 into my share’s x64\StoreFront directory and overwrite the default one. Luckily, this works, so we can use XenDesktopServerSetup.exe again to install it.

The first thing we are going to do is install the pre-requisites and install Storefront. Again, I am just going to do a net-use to my share and run everything.

net use --% x: \\10.56.90.20\XA76
Import-Module ServerManager
write-host "Installing Storefront Prereqs"
Add-WindowsFeature AS-Net-Framework,Web-Net-Ext45,Web-AppInit,Web-ASP-Net45,Web-ISAPI-Ext,Web-ISAPI-Filter,Web-Default-Doc,Web-HTTP-Errors,Web-Static-Content,Web-HTTP-Redirect,Web-HTTP-Logging,Web-Filtering,Web-Windows-Auth,Web-Client-Auth
x:
cd '.\x64\XenDesktop Setup'
write-host "Installing Storefront"
.\XenDesktopServerSetup.exe --% /components STOREFRONT /NOSQL /quiet

Next, we want to import the certificate and bind it to the default web site. First, we ask for the cert password.

$myPW = read-host -Prompt "Enter Cert Password here"

We then want to import the certificate and assign it to the default website. Copy the .pfx file to the root of C:\. You will need the thumbprint of the certificate to put in the XXXXXXXXXXXXXXXXXX location of the script.

$certpw = ConvertTo-SecureString -String $myPW -Force -AsPlainText
Import-PfxCertificate -filepath "C:\cert.pfx" Cert:\LocalMachine\My -Password $certpw
New-WebBinding -Name "Default Web Site" -IPAddress "*" -Port 443 -Protocol https
cd IIS:
cd .\SSLBindings
Get-Item Cert:\LocalMachine\My\XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX | new-item 0.0.0.0!443
c:
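If you’d rather not paste the thumbprint in by hand, Import-PfxCertificate returns the imported certificate object, so you can capture it and reuse its thumbprint. A small variation on the block above (it assumes a single certificate in the .pfx):

$certpw = ConvertTo-SecureString -String $myPW -Force -AsPlainText
# Capture the imported certificate so we can reuse its thumbprint for the binding
$cert = Import-PfxCertificate -FilePath "C:\cert.pfx" -CertStoreLocation Cert:\LocalMachine\My -Password $certpw
New-WebBinding -Name "Default Web Site" -IPAddress "*" -Port 443 -Protocol https
cd IIS:
cd .\SSLBindings
Get-Item "Cert:\LocalMachine\My\$($cert.Thumbprint)" | new-item 0.0.0.0!443
c: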

Storefront Configuration

This gets us all set to start configuring Storefront. We will first need to import the Storefront PowerShell modules.

. "C:\Program Files\Citrix\Receiver StoreFront\Scripts\ImportModules.ps1"

Let us take a look at some of the variables we will be using here.

$nbn = $env:USERDOMAIN
$GatewayAddress = "https://site.domain.com"
$Farmname = "XenApp76 Farm"
$Port = "80"
$TransportType = "HTTP"
$sslRelayPort = "443"
$LoadBalance = $false
$FarmType = "XenDesktop"
$fqdn = "$env:computername.$env:userdnsdomain"
$baseurl = "https://" + $fqdn
$SFPath = "/Citrix/" + $nbn.toLower()
$SFPathWeb = "$SFPath`Web"
$SFPathDA = "$SFPath`DesktopAppliance"
$GatewayName = "TEST-RDU-NS-01"
$staservers = "http://" + $fqdn + "/scripts/ctxsta.dll"
$snipIP = "10.56.13.9"

Again, keep in mind this is a small environment, so we will be using a single server for the XDC/Storefront roles. My $baseurl variable will resolve to https://server.domain.local. I set the store name to $nbn (NetBIOS Name), in this example it is TEST. Then using some simple PowerShell I set $SFPath, $SFPathWeb, and $SFPathDA to /Citrix/test, /Citrix/testWeb, and /Citrix/testDesktopAppliance respectively. You can set these variables as appropriate for your environment. The first command we want to run will do the initial configuration of Storefront.

Set-DSInitialConfiguration -hostBaseUrl $baseurl -farmName $Farmname -port $Port -transportType $TransportType -sslRelayPort $sslRelayPort -servers $fqdn -loadBalance $LoadBalance -farmType $FarmType -StoreFriendlyName TEST -StoreVirtualPath $SFPath -WebReceiverVirtualPath $SFPathWeb -DesktopApplianceVirtualPath $SFPathDA

The next thing I do here is setup the beacons. I set this up now because if you setup the gateway first, it sets the $baseurl as an external beacon. In my configuration, I do NOT want $baseurl to be an external beacon. At the time of writing this blog, Citrix has not written the full documentation for these PowerShell modules. I have already put in an RFE to get these up on Citrix’s site. That being said, I was not able to figure out HOW to remove an external beacon. The gateway module detects if you have any external beacons configured. If it detects none are configured, it automatically makes $baseurl and www.citrix.com the two external beacons. Setting up the external beacons is as simple as these commands.

$beaconID = ([guid]::NewGuid()).ToString()
Add-DSGlobalExternalBeacon -BeaconID $beaconID -BeaconAddress http://www.google.com
$beaconID = ([guid]::NewGuid()).ToString()
Add-DSGlobalExternalBeacon -BeaconID $beaconID -BeaconAddress http://www.citrix.com

Next, we are going to add a NetScaler gateway to the configuration. Reference the variables above. There is nothing too complicated in this command. This box is also the XDC, so the STA setup simply points to http://server.domain.local/scripts/ctxsta.dll.

$GatewayID = ([guid]::NewGuid()).ToString()
Add-DSGlobalV10Gateway -Id $GatewayID -Name $GatewayName -Address $GatewayAddress -Logon Domain -IPAddress $snipIP -SessionReliability $false -SecureTicketAuthorityUrls $staservers -IsDefault $true

The previous command creates the NetScaler gateway. We then have to enable NetScaler authentication and link this gateway to the store.

$gateway = Get-DSGlobalGateway -GatewayId $GatewayID
Set-DSStoreGateways -SiteId 1 -VirtualPath $SFPath -Gateways $gateway
Set-DSStoreRemoteAccess -SiteId 1 -VirtualPath $SFPath -RemoteAccessType "StoresOnly"
Add-DSAuthenticationProtocolsDeployed -SiteId 1 -VirtualPath /Citrix/Authentication -Protocols CitrixAGBasic
Set-DSWebReceiverAuthenticationMethods -SiteId 1 -VirtualPath $SFPathWeb -AuthenticationMethods ExplicitForms,CitrixAGBasic

This next command is from Eric’s blog referenced above. It disables the “check for publisher’s certificate revocation” setting to speed up console start-up.

set-ItemProperty -path "REGISTRY::\HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\WinTrust\Trust Providers\Software Publishing\" -name State -value 146944

Lastly, as we are using $fqdn as $baseurl, we will want to set the loopback to OnUsingHttp because the certificate is not going to match. You can look at more details on this command here.

Set-DSLoopback -SiteId 1 -VirtualPath $SFPathWeb -Loopback OnUsingHttp

There we have it. Storefront configuration is DONE, dude! All we need to do is set up an internal DNS CNAME to point site.domain.com to server.domain.local and we have a single URL for internal/external access to your XenDesktop 7.6 environment.

Full Script is below.

# Certificate Password
#==================
$myPW = read-host -Prompt "Enter Cert Password here"

# StoreFront Parameters
#==================
$nbn = $env:USERDOMAIN
$GatewayAddress = "https://site.domain.com"
$Farmname = "XenApp76 Farm"
$Port = "80"
$TransportType = "HTTP"
$sslRelayPort = "443"
$LoadBalance = $false
$FarmType = "XenDesktop"
$fqdn = "$env:computername.$env:userdnsdomain"
$baseurl = "https://" + $fqdn
$SFPath = "/Citrix/" + $nbn.toLower()
$SFPathWeb = "$SFPath`Web"
$SFPathDA = "$SFPath`DesktopAppliance"
$GatewayName = "TEST-RDU-NS-01"
$staservers = "http://" + $fqdn + "/scripts/ctxsta.dll"
$snipIP = "10.56.13.9"

#write-host "Mapping Drive"
net use --% x: \\10.56.90.20\XA76
Import-Module ServerManager

write-host "Installing Storefront Prereqs"
Add-WindowsFeature AS-Net-Framework,Web-Net-Ext45,Web-AppInit,Web-ASP-Net45,Web-ISAPI-Ext,Web-ISAPI-Filter,Web-Default-Doc,Web-HTTP-Errors,Web-Static-Content,Web-HTTP-Redirect,Web-HTTP-Logging,Web-Filtering,Web-Windows-Auth,Web-Client-Auth
x:
cd '.\x64\XenDesktop Setup'

write-host "Installing Storefront"
.\XenDesktopServerSetup.exe --% /components STOREFRONT /NOSQL /quiet

#write-host "Copy certificate to C:\ before moving on"
#pause

write-host "Installing Certificate"
$certpw = ConvertTo-SecureString -String $myPW -Force -AsPlainText
Import-PfxCertificate -filepath "C:\wildcard.vc3advantage.com-NEW.pfx" Cert:\LocalMachine\My -Password $certpw
New-WebBinding -Name "Default Web Site" -IPAddress "*" -Port 443 -Protocol https
cd IIS:
cd .\SSLBindings
Get-Item Cert:\LocalMachine\My\8CE850C9DCD7C26DF8E8FD4C44BF7D9E586E8AD1 | new-item 0.0.0.0!443
c:

# Import Storefront module
#==========================
write-host "Installing Storefront Modules"
. "C:\Program Files\Citrix\Receiver StoreFront\Scripts\ImportModules.ps1"

# Setup Initial Configuration
#============================
write-host "Initial Storefront Configuration"
Set-DSInitialConfiguration -hostBaseUrl $baseurl -farmName $Farmname -port $Port -transportType $TransportType -sslRelayPort $sslRelayPort -servers $fqdn -loadBalance $LoadBalance -farmType $FarmType -StoreFriendlyName TEST -StoreVirtualPath $SFPath -WebReceiverVirtualPath $SFPathWeb -DesktopApplianceVirtualPath $SFPathDA

write-host "Configuring Beacons"
$beaconID = ([guid]::NewGuid()).ToString()
Add-DSGlobalExternalBeacon -BeaconID $beaconID -BeaconAddress http://www.google.com
$beaconID = ([guid]::NewGuid()).ToString()
Add-DSGlobalExternalBeacon -BeaconID $beaconID -BeaconAddress http://www.citrix.com

$GatewayID = ([guid]::NewGuid()).ToString()
Add-DSGlobalV10Gateway -Id $GatewayID -Name $GatewayName -Address $GatewayAddress -Logon Domain -IPAddress $snipIP -SessionReliability $false -SecureTicketAuthorityUrls $staservers -IsDefault $true
$gateway = Get-DSGlobalGateway -GatewayId $GatewayID
Set-DSStoreGateways -SiteId 1 -VirtualPath "/Citrix/test" -Gateways $gateway
Set-DSStoreRemoteAccess -SiteId 1 -VirtualPath /Citrix/test -RemoteAccessType "StoresOnly"
Add-DSAuthenticationProtocolsDeployed -SiteId 1 -VirtualPath /Citrix/Authentication -Protocols CitrixAGBasic

write-host "Disable check publisher's cert revocation"
set-ItemProperty -path "REGISTRY::\HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\WinTrust\Trust Providers\Software Publishing\" -name State -value 146944

write-host "Setting Loopback to OnUsingHttp"
Set-DSLoopback -SiteId 1 -VirtualPath $SFPathWeb -Loopback OnUsingHttp