
Tuesday, July 12, 2016

PowerShell: Force multiple user log off Terminal Services Sessions (RDP) - PSTerminalServices

We have a product that was using terminal services sessions to run multiple processes under multiple users. This was controlled by an overriding service.

This was working well, but now and again the service would fall over, leaving some orphaned TS sessions behind.

We have a batch script set to run on service failure to restart the service and send an alert message to admin staff. However, this did not fix everything: the restart appeared to fail because the orphaned sessions were still present. So we came up with a small PowerShell script that uses the PSTerminalServices module to log off all sessions belonging to specific users.

Luckily the PowerShell command we used allows wildcards, and our users were all named similarly:

  • usernameProcess1
  • usernameProcess2
  • usernameProcess3

We could then easily get the sessions for the users above and force them to log off.

Import-Module PSTerminalServices

#Get the local computer name
[String]$computerName = $env:COMPUTERNAME;

#Get all TS sessions whose username begins with usernameProcess
$sessions = Get-TSSession -UserName usernameProcess* -ComputerName $computerName;

ForEach ($session in $sessions)
{
    #Force the session to log off without prompting for confirmation
    Stop-TSSession -ComputerName $computerName -Id $session.SessionId -Confirm:$false -Force;
}
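As a follow-up check (a sketch only, assuming the same PSTerminalServices module and the usernameProcess* naming convention above), you could re-run the query after the loop to confirm nothing was left behind:

```powershell
#Sketch: re-query for any surviving sessions and warn if some remain.
Import-Module PSTerminalServices

$remaining = Get-TSSession -UserName usernameProcess* -ComputerName $env:COMPUTERNAME;
if ($remaining)
{
    Write-Warning ("{0} session(s) still present after forced logoff." -f @($remaining).Count);
}
```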


Saturday, July 09, 2016

AWS: Snowball fight... Part II :o) - Multiple Parallel copies

So I have previously written an article about the basic commands and process needed to copy to the AWS snowball devices.

In this article I provide a script I used to get round an issue with corrupt files in the source location when copying to the Snowball device.

We have a large SAN system that was going to be transferred to AWS via the snowball device. This SAN had been running for years.

Initially I just tried to copy the entire root folder recursively. However, I soon discovered that before actually performing the copy, the Snowball client scans and analyses the entire folder structure, and if it encounters an error the whole process is brought to a halt. I initially tried to fix the offending files, which turned out to have spurious characters in their names (like trailing spaces), but the scan would take hours to run, only to fall over each time.
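Purely as a sketch of the kind of pre-flight check that would have saved me those hours (standard cmdlets only, nothing Snowball-specific; the share path is a placeholder), you can hunt for names with trailing whitespace before the Snowball scan ever runs:

```powershell
#Sketch: list files and folders whose names carry trailing whitespace,
#one of the "spurious character" problems that halted the Snowball scan.
#'\\sourceserver\subfolder1' is a placeholder path.
Get-ChildItem -Path '\\sourceserver\subfolder1' -Recurse |
    Where-Object { $_.Name -ne $_.Name.TrimEnd() } |
    Select-Object FullName;
```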

Now the folders within the root folder were organised by client, so I decided we should try copying one client folder at a time. In addition, I was hoping to run multiple copies at the same time.

So I created the script below. It runs at most ten copies at a time, and it only attempts the copy on a given client folder once (it looks for a pre-existing log file). This let me run ten copies in parallel and make a single pass through the folder structure; I could then work out which client folders had failed and attack those individually.

The script assumes the initial security setup for the Snowball has been performed; please see my previous article for the details.

##This script assumes that the connection to the snowball device has been established previously

function Get-ScriptDirectory
{
    #Determine the folder in which the script lives.
    $Invocation = (Get-Variable MyInvocation -Scope 1).Value
    Split-Path $Invocation.MyCommand.Path
}

$scriptPath = Get-ScriptDirectory


[String]$scriptCurrentDateTime = Get-Date -format "yyyyMMddHHmmss";
[String]$computerName = $env:computername;
[string]$sourceFolder = '\\sourceserver\subfolder1\subfolder2';


#Remember to Stop-Transcript (last command in script).
Start-Transcript -Path "$scriptPath\psOutput_$scriptCurrentDateTime.log" -NoClobber

#Set the amount of jobs to run in parallel
[int]$maxRunningJobs = 10;


ForEach ($item in (Get-ChildItem -Path $sourceFolder | ?{ $_.PSIsContainer })) 
{
    $running = @(Get-Job | Where-Object { $_.State -eq 'Running' });
    [string]$logLocation = "1>`"$scriptPath\" + $item.Name + ".log`" 2>&1"
    [string]$logLocationPath = "$scriptPath\" + $item.Name + ".log"
    #Skip this folder if the maximum number of jobs is already running,
    #or if a log file already exists (i.e. this folder has been attempted before)
    if ($running.Count -lt $maxRunningJobs -and -not(Test-Path ($logLocationPath)))
    {
        
        [string]$destinationFolder = 's3://awsSnowballJobName/subfolder1/subfolder2/' + $item.Name;
        $debugblock = {
            
            [string]$snowballProgram = 'C:\Program Files (x86)\SnowballClient\bin\snowball.bat';
            
            $commandToRun = "$snowballProgram";
            $commandToRun = "`"$commandToRun`" $($args[0]) $($args[1]) $($args[2]) $($args[3]) $($args[4])";
            #debug - uncomment to record the exact command line that will be run
            #$commandToRun | Add-Content -Path 'e:\test.txt';
            Invoke-Expression "& $commandToRun";
        }
        
        try 
        {
            start-job -ScriptBlock $debugblock -ArgumentList "cp","--recursive",$item.FullName,$destinationFolder,"$logLocation";
        }
        catch
        {
            $MyError = $_
            Throw $MyError
        }
    }
}

#Clear Up completed jobs
Get-Job | Where-Object { $_.State -eq 'Completed' } | Remove-Job

#stop the transcript
stop-transcript;

trap
    {
        #$CRLF added because the usual `r`n in a string does not work within trap.
        [string]$CRLF = [char]13 + [char]10
        $script:errorMessage += 'Error: {0}' -f $_.Exception.Message + $CRLF;
        $Error.Clear();
        continue;
    }
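The script above fires the jobs off and then exits; as a sketch (standard PowerShell job cmdlets, nothing Snowball-specific), you could wait for the outstanding copies and collect their output before clearing up:

```powershell
#Sketch: block until all background copy jobs finish, show their output, then remove them.
Get-Job | Wait-Job | Out-Null;
Get-Job | Receive-Job;
Get-Job | Where-Object { $_.State -eq 'Completed' } | Remove-Job;
```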
 


Thursday, July 07, 2016

AWS: Snowball fight.... :o)

Recently we have started the process of moving our production environment to AWS.

To achieve this aim there are a number of services that can be utilised, these include VPNs, AWS DirectConnect and Snowball.

We chose to do an initial data load using an AWS Snowball device.

For those that don’t know about Snowball: it’s a storage appliance that AWS ships to you; it can be plugged into your network and data copied onto it. Note: the data copied onto the device is encrypted, but the encryption is done by a client you have to download; it’s not encrypted by the device itself.

The device can then be shipped back to AWS where they will load the data onto S3 for you.

To start the process you login to the AWS console and create a job within the Snowball area.

Once done, AWS ship you the device.

The device is a lot larger than I was expecting and actually proved problematic to fit into a normal 19 inch rack. Luckily we had space within a rack to stand the device up. Apart from this, the device is cleverly designed, with two flaps that need to be raised to access the front and rear panels.

The back flap has the power cord and network cables carefully wrapped around a specially designed storage location, although the unit shipped with a US power cable! Anyway, I plugged the power and network cables into the ports on the rear panel (there are ports for copper and fibre SFP+).

There is a power button on the front, above the LCD, to power on the device. Once on, the device is initially set to DHCP, but the network can be configured manually from the front panel. I manually set the network parameters and ensured I could ping the device from a server on the network.

Now the device does not just present network storage; to access it you have to use a special client tool that needs to be downloaded from AWS. On the Windows system I was using, this was a Java tool.

In addition to the tool, you will need an access code and a manifest file from AWS. You will use these two, together with the Snowball client, to access the device.

Once you have the client downloaded and installed on the Windows machine, I found the easiest way was to go to the folder that contains the snowball.bat file. Below I list the commands I used to connect to, test and copy to the device, with an explanation of each.

.\snowball.bat start -i <snowball ip> -m "E:\Snowball\jobName\xxnxnxnnxnnxnxnxn_manifest.bin" -u xxxxx-nnnnn-bbbbb-ccccc-ggggg

The command above extracts the manifest and sets up a secure channel from the server to the device. Once this is in place, we can then use additional commands.

.\snowball.bat ls

This will list the folders on the device; you should find that the device has a folder with the same name as the job you created in the AWS console:

s3://jobname

Now I created a subfolder:

.\snowball.bat mkdir s3://jobname/subfolder

Now you can run a test before you waste time on a full run:

.\snowball.bat test -r -t 5 "\\sourceserver\folder"

Then copy the files to the device; this will create version3 within the subfolder on the Snowball device:

.\snowball.bat cp --recursive \\sourceserver\version3 s3://jobname/subfolder

Snowball fight – Part 2 (multiple parallel copies)

Ref: https://docs.aws.amazon.com/AWSImportExport/latest/ug/using-client-commands.html



Monday, July 04, 2016

PowerShell: Start multiple parallel 7zip archive jobs

I had an automated job that would loop through folders and archive each into an encrypted archive using 7zip. The final script is rushed and should be tidied… but it’s enough for my purposes at the moment.

Every now and then one of the folders would fail to archive. I’m not really sure why; I’m guessing either a network issue or an internal 7zip issue. But on the next run it would work, so I was re-running the archive manually for each of the failed folders.

I decided to put together a script to help me: I can now add the failed folders to an array, and this script will fire off the archives to run in parallel.

I had real issues getting Start-Job to accept the parameters due to the inclusion of spaces and quotes. For some reason, the command using $args would wrap the -p parameter in an additional pair of quotes, and this would throw the 7zip command. In the end I had to create a string and then use Invoke-Expression in the script block. I spent a while trying to get the -p parameter to work normally (using the $args array), but in the end had to cut it short due to time. So the script works, but I would love to fully understand why it wants to wrap that one parameter in additional quotes…
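To make the workaround pattern concrete, here is a stripped-down sketch of the two call styles (the 7z.exe path and arguments are the same placeholders used in the script below; no expected output is claimed, since the re-quoting behaviour is exactly what I never fully pinned down):

```powershell
#Sketch of the two call styles for a native exe with quote-laden arguments.
$arguments = @('a', '-t7z', '-p"ThePassword"', '"C:\out\archive.7z"', '"C:\data"');

#Direct style (the one that misbehaved for me - extra quotes appeared around -p"..."):
#& 'C:\7zipPath\7z.exe' $arguments

#Workaround style (the one the script below uses): flatten everything into a
#single string and let Invoke-Expression re-parse it as one command line.
$commandToRun = "`"C:\7zipPath\7z.exe`" $($arguments -join ' ')";
Invoke-Expression "& $commandToRun";
```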

function Get-ScriptDirectory
{
    #Determine the folder in which the script lives.
    $Invocation = (Get-Variable MyInvocation -Scope 1).Value
    Split-Path $Invocation.MyCommand.Path
}

$scriptPath = Get-ScriptDirectory

[String]$scriptCurrentDateTime = Get-Date -format "yyyyMMddHHmmss";
[String]$computerName = $env:computername;
[string]$7zipX64Path = 'C:\7zipPath\7z.exe'; #Note: background jobs cannot see this variable, so the path is repeated inside the script block below.
[string]$WorkingFolder = '\\UNC\path\path';
[string]$WorkingFolderWithQuotes = '-w"\\UNC\path\path"';
[string]$passwordWithQuotes = '-p"ThePassword"';
[string]$sourceFolder = '\\UNC\path\path';
[string]$password = '"ThePassword"'

$failedClients = @(
'clientfolder1',
'clientfolder2',
'clientfolder3',
'clientfolder4',
'clientfolder5',
'clientfolder6',
'clientfolder7'
)

    
Foreach ($failedClient IN $failedClients) {
    
    
    $sZipFilesArchive=$scriptCurrentDateTime + '_' + $computerName + '_FilesArchiveAdditonalText_' + $failedClient + '_X64.7z';
    $SFfaileClientWithQuotes = '"' + $sourceFolder + '\' + $failedClient + '"';
    $WFzipArchiveWithQuotes = '"' + $WorkingFolder + '\' + $sZipFilesArchive + '"';
    
    $debugblock = {
        #Have to implement an unusual way to do this: using $args directly
        #wraps additional "" around the -p parameter, not sure why, but it screws up the command.
        #& 'C:\7zipPath\7z.exe' $args;
        $commandToRun = 'C:\7zipPath\7z.exe';
        $commandToRun = "`"$commandToRun`" $($args[0]) $($args[1]) $($args[2]) $($args[3]) $($args[4]) $($args[5]) $($args[6]) $($args[7]) $($args[8]) $($args[9]) $($args[10])";
        #debug
        #$commandToRun | Add-Content -Path 'e:\test.txt';
        Invoke-Expression "& $commandToRun";
        
    }

    try 
    {
        start-job -ScriptBlock $debugblock -ArgumentList  "a", "-t7z", "-mx=5", "-m0=LZMA2", "-mmt", "-mhe=on", "-mhc=on", $passwordWithQuotes, $WorkingFolderWithQuotes, $WFzipArchiveWithQuotes, $SFfaileClientWithQuotes;
    }
    catch
    {
        $MyError = $_
        Throw $MyError
    }
}

trap
    {
        #$CRLF added because the usual `r`n in a string does not work within trap.
        [string]$CRLF = [char]13 + [char]10
        $script:errorMessage += 'Error: {0}' -f $_.Exception.Message + $CRLF;
        $Error.Clear();
        continue;
    }
    
    

 
