
Friday, August 05, 2016

Importing KeePass to LastPass

I have been using KeePass for years and it's been great, but I wanted to move to an easier-to-maintain solution, so I bit the bullet and tried to move to LastPass.

Exporting from KeePass and importing into LastPass is very easy, but there were a couple of things that caught me out.

Initially I just did one big export from KeePass; however, this put all the folders in LastPass into one root folder, which was not what I wanted. The easiest thing I found was to export each folder (at the root level in KeePass) individually into an XML file, and then import each XML file into LastPass.

The next issue was how LastPass reads the XML export and decides what type of record to create. I had a number of entries in KeePass without a URL field, either because I hadn't bothered filling it in or because it was a password without a URL. When these entries (without a URL) get imported into LastPass, they get converted into secure note items... This means the password is in free text on the page. I did not want this.

So I needed to change the URL field to have some sort of content. Using Notepad++, I created a search and replace that changed the blank URL fields from

<string>
  <key>URL</key>
  <value />
</string>

to
<string>
  <key>URL</key>
  <value>http://</value>
</string>

Saving this change and then importing into LastPass created all entries as sites, and the password is now hidden on the page.

The search strings in Notepad++ were as follows (note I had to take into account the number of tabs... I suppose I could have come up with a regular expression to do this, but that would have taken time and I am not a regular expression guru... :)

6 tabs

<key>URL</key>\r\n\t\t\t\t\t\t<value />

and the replacement string was

<key>URL</key>\r\n\t\t\t\t\t\t<value>http://</value>

and

7 tabs

<key>URL</key>\r\n\t\t\t\t\t\t\t<value />

and the replacement string was

<key>URL</key>\r\n\t\t\t\t\t\t\t<value>http://</value>

The search string contains the carriage return and line feed (\r\n) and then the tabs (\t).
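In hindsight, the regular expression route is actually short. Here is a minimal PowerShell sketch that does the same replacement regardless of the number of tabs; it is untested against a real KeePass export, and the file path is illustrative:

#Hypothetical path to the KeePass XML export
$xmlPath = 'C:\temp\keepass-export.xml'

#\s* matches any run of whitespace (tabs/newlines), so one pass covers both the 6-tab and 7-tab cases;
#$1 puts the original whitespace back into the replacement.
(Get-Content $xmlPath -Raw) -replace '<key>URL</key>(\s*)<value />', '<key>URL</key>$1<value>http://</value>' | Set-Content $xmlPath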


Tuesday, July 12, 2016

PowerShell: Force multiple user log off Terminal Services Sessions (RDP) - PSTerminalServices

We have a product that was using Terminal Services sessions to run multiple processes under multiple users, all controlled by an overarching service.

This was working well, but now and again we experienced an issue where the service would fall over, leaving some orphaned TS sessions.

We have a batch script set to run on service failure that restarts the service and sends an alert message to admin staff. However, this did not fix all issues; it appeared the restart did not like the fact that the sessions were left behind. We then came up with a small PowerShell script that utilises the PSTerminalServices module to log off all sessions for specific users.

Luckily the PowerShell command we used allowed for wildcards, and our users were all named similarly:

  • usernameProcess1
  • usernameProcess2
  • usernameProcess3

We could then easily get the sessions for the users above and force these to log off.

Import-Module PSTerminalServices

#Get the local computername
[String]$computerName = $env:computername;

#Get all TS sessions with username beginning with usernameProcess*
$Sessions = Get-TSSession -UserName usernameProcess* -ComputerName $computerName;

ForEach ($session in $sessions)
{
    Stop-TSSession -Computername $computerName -ID $session.sessionid -Confirm:$false -force;
}
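As an optional sanity check, the same Get-TSSession call can be re-run afterwards; once the forced log offs complete it should return nothing:

#Confirm no matching sessions remain
Get-TSSession -UserName usernameProcess* -ComputerName $computerName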


Saturday, July 09, 2016

AWS: Snowball fight... Part II :o) - Multiple Parallel copies

So I have previously written an article about the basic commands and process needed to copy to the AWS snowball devices.

In this article I provide a script I used to get round an issue with corrupt files in the source location when copying to the snowball device.

We have a large SAN system that was going to be transferred to AWS via the snowball device. This SAN had been running for years.

Initially I just tried to copy the entire root folder (recursively); however, I soon discovered that the snowball copy process, prior to actually performing the copy, will scan and analyse the entire folder structure. If it encounters an error, the whole process is brought to a halt. Now initially I tried to fix the offending file issues, which turned out to be spurious characters in the file names (like trailing spaces). However, the scan would take hours to run, only to fall over each time.

Now the folders within the root folder were organised by client, so I decided we should try copying a client folder at a time. In addition, I was hoping to run multiple copies at the same time.

So I created the script below. It will run at most ten copies at a time, and it will only attempt the copy on a client folder once (it looks for a pre-existing log file). By doing this I could run ten copies in parallel and also ensure I ran through the folder structure only once. Then I could work out which client folders failed and attack them individually.

The script assumes the initial security setup to the snowball has been performed; please see my previous article for more details.

##This script assumes that the connection to the snowball device has been established previously

function Get-ScriptDirectory
{
    #Determine the folder in which the script lives.
    $Invocation = (Get-Variable MyInvocation -Scope 1).Value
    Split-Path $Invocation.MyCommand.Path
}

$scriptPath = Get-ScriptDirectory


[String]$scriptCurrentDateTime = Get-Date -format "yyyyMMddHHmmss";
[String]$computerName = $env:computername;
[string]$sourceFolder = '\\sourceserver\subfolder1\subfolder2';


#remember to stop-transcript (last command in script).
start-transcript -path $scriptPath\psOutput_$scriptCurrentDateTime.log -noclobber

#Set the amount of jobs to run in parallel
[int]$maxRunningJobs = 10;


ForEach ($item in (Get-ChildItem -Path $sourceFolder | ?{ $_.PSIsContainer })) 
{
    $running = @(Get-Job | Where-Object { $_.State -eq 'Running' });
    [string]$logLocation = "1>`"$scriptPath\" + $item.Name + ".log`" 2>&1"
    [string]$logLocationPath = "$scriptPath\" + $item.Name + ".log"
    #Only start a job if fewer than the maximum are running, and skip folders that already have a log file (already attempted)
    if ($running.Count -lt $maxRunningJobs -and -not(Test-Path ($logLocationPath)))
    {
        
        [string]$destinationFolder = 's3://awsSnowballJobName/subfolder1/subfolder2/' + $item.Name;
        $debugblock = {
            
            [string]$snowballProgram = 'C:\Program Files (x86)\SnowballClient\bin\snowball.bat';
            
            $commandToRun = "$snowballProgram";
            $commandToRun = "`"$commandToRun`" $($args[0]) $($args[1]) $($args[2]) $($args[3]) $($args[4])";
            #debug
            $commandToRun | Add-Content -Path 'e:\test.txt';
            Invoke-Expression "& $commandToRun";
        }
        
        try 
        {
            start-job -ScriptBlock $debugblock -ArgumentList "cp","--recursive",$item.FullName,$destinationFolder,"$logLocation";
        }
        catch
        {
            $MyError = $_
            Throw $MyError
        }
    }
}

#Clear Up completed jobs
Get-Job | Where-Object { $_.State -eq 'Completed' } | Remove-Job

#stop the transcript
stop-transcript;

trap
    {
        #$CRLF added because the usual `r`n in a string does not work within trap.
        [string]$CRLF = [char]13 + [char]10
        $script:errorMessage += 'Error: {0}' -f $_.Exception.Message + $CRLF;
        $Error.Clear();
        continue;
    }
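Once a full pass had completed, the per-client log files showed which folders failed. Something along these lines can pull that list out; note the 'failed|error' pattern is an assumption on my part, so check what your snowball logs actually contain:

#List the per-client log files that mention a failure, so those folders can be retried individually
Get-ChildItem "$scriptPath\*.log" |
    Where-Object { Select-String -Path $_.FullName -Pattern 'failed|error' -Quiet } |
    Select-Object -ExpandProperty Name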
 


Thursday, July 07, 2016

AWS: Snowball fight.... :o)

Recently we have started the process of moving our production environment to AWS.

To achieve this aim there are a number of services that can be utilised; these include VPNs, AWS Direct Connect and Snowball.

We chose to do an initial data load using an AWS Snowball device.

For those that don't know about Snowball, it's a storage appliance that AWS ship to you; it can be plugged into your network and data copied onto it. Note: the data copied onto the device is encrypted, but the encryption is done by a client you have to download; it's not encrypted by the device itself.

The device can then be shipped back to AWS where they will load the data onto S3 for you.

To start the process you log in to the AWS console and create a job within the Snowball area.

Once done, AWS ship you the device.

The device is a lot larger than I was expecting and actually proved problematic to fit into a normal 19-inch rack. Luckily we had space within a rack to stand the device up. Apart from this, the device is cleverly designed, with two flaps on the front and back that need to be raised to access the front and rear panels.

The back flap has the power cord and network cables carefully wrapped around a specially designed storage location, although the unit shipped with a US power cable! Anyway, I plugged the power and the network cable into the ports on the rear panel (there are ports for copper and fibre SFP+).

There is a power button on the front, above the LCD, to power on the device. Once on, the device is initially set to DHCP, but the network can be configured manually from the front panel. I manually set the network parameters and ensured I could ping the device from a server on the network.
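For reference, the reachability check was nothing more sophisticated than a ping; in PowerShell that might look like this (the IP address is illustrative):

#Confirm the snowball answers on the address set via the front panel
Test-Connection -ComputerName 192.168.1.50 -Count 4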

Now the device does not just present network storage; to access it you have to use a special client tool that needs to be downloaded from AWS. This, at least on the Windows system I was using, was a Java tool.

In addition to the tool, you will need an access code and a manifest file from AWS. You will use these two together with the snowball client to access the device.

Once you have the client downloaded and installed on the Windows device, I found the easiest way was to go to the folder that contains the snowball.bat file that we will use. Below I list the commands I used to connect, test and copy to the device, with an explanation of each.

.\snowball.bat start -i <snowball ip> -m "E:\Snowball\jobName\xxnxnxnnxnnxnxnxn_manifest.bin" -u xxxxx-nnnnn-bbbbb-ccccc-ggggg

The command above extracts the manifest and sets up a secure channel to talk to the device from the server. Once this is in place, we can then use additional commands.

.\snowball.bat ls

This will list the folders on the device; you should find that the device has a folder with the same name as the job you created in the AWS console.

s3://jobname

Now I created a subfolder

.\snowball.bat mkdir s3://jobname/subfolder

Now you can run a test before you waste time on a full run:

.\snowball.bat test -r -t 5 "\\sourceserver\folder"

Then copy the files to the device; this will create version3 within the subfolder on the snowball device:

.\snowball.bat cp --recursive \\sourceserver\version3 s3://jobname/subfolder

Snowball fight – Part 2 (multiple parallel copies)

Ref: https://docs.aws.amazon.com/AWSImportExport/latest/ug/using-client-commands.html



Monday, July 04, 2016

Powershell: Start multiple parallel 7zip archive jobs

Had an automated job that would loop through folders and archive them into encrypted archives using 7zip. The final script is rushed and should be tidied... but it's enough for my purposes at the moment.

Every now and then one of the folders would fail to archive. Not really sure why; I am guessing either a network issue or an internal 7zip issue. But on the next run it would work, so I was running the archive manually for each of the failed archives.

Decided to put together a script to help me: I can now add the failed folders to an array and this script will fire off the archives to run in parallel.

Had real issues getting Start-Job to accept the parameters due to the inclusion of spaces and quotes. For some reason the command using $args would wrap the -p parameter in additional quotes, and this would throw the 7zip command. In the end I had to create a string and then use Invoke-Expression in the script block. I spent a while trying to get the -p parameter to work normally (using the $args array), but in the end had to cut it short due to time. So the script works, but I would love to fully understand why it wants to wrap that one parameter in additional quotes... (a possible alternative is sketched after the script).

function Get-ScriptDirectory
{
    #Determine the folder in which the script lives.
    $Invocation = (Get-Variable MyInvocation -Scope 1).Value
    Split-Path $Invocation.MyCommand.Path
}

$scriptPath = Get-ScriptDirectory

[String]$scriptCurrentDateTime = Get-Date -format "yyyyMMddHHmmss";
[String]$computerName = $env:computername;
[string]$7zipX64Path = 'C:\7zipPath\7z.exe';
[string]$WorkingFolder = '\\UNC\path\path';
[string]$WorkingFolderWithQuotes = '-w"\\UNC\path\path"';
[string]$passwordWithQuotes = '-p"ThePassword"';
[string]$sourceFolder = '\\UNC\path\path';
[string]$password = '"ThePassword"'

$failedClients = @(
'clientfolder1',
'clientfolder2',
'clientfolder3',
'clientfolder4',
'clientfolder5',
'clientfolder6',
'clientfolder7'
)

    
Foreach ($failedClient IN $failedClients) {
    
    
    $sZipFilesArchive=$scriptCurrentDateTime + '_' + $computerName + '_FilesArchiveAdditionalText_' + $failedClient + '_X64.7z';
    $SFfailedClientWithQuotes = '"' + $sourceFolder + '\' + $failedClient + '"';
    $WFzipArchiveWithQuotes = '"' + $WorkingFolder + '\' + $sZipFilesArchive + '"';
    
    $debugblock = {
        #Have to implement an unusual way to do this, as using $args
        #wraps additional "" around the -p parameter; not sure why, but it screws up the command.
        #& 'C:\7zipPath\7z.exe' $args;
        $commandToRun = 'C:\7zipPath\7z.exe';
        $commandToRun = "`"$commandToRun`" $($args[0]) $($args[1]) $($args[2]) $($args[3]) $($args[4]) $($args[5]) $($args[6]) $($args[7]) $($args[8]) $($args[9]) $($args[10])";
        #debug
        #$commandToRun | Add-Content -Path 'e:\test.txt';
        Invoke-Expression "& $commandToRun";
        
    }

    try 
    {
        start-job -ScriptBlock $debugblock -ArgumentList  "a", "-t7z", "-mx=5", "-m0=LZMA2", "-mmt", "-mhe=on", "-mhc=on", $passwordWithQuotes, $WorkingFolderWithQuotes, $WFzipArchiveWithQuotes, $SFfailedClientWithQuotes;
    }
    catch
    {
        $MyError = $_
        Throw $MyError
    }
}

trap
    {
        #$CRLF added because the usual `r`n in a string does not work within trap.
        [string]$CRLF = [char]13 + [char]10
        $script:errorMessage += 'Error: {0}' -f $_.Exception.Message + $CRLF;
        $Error.Clear();
        continue;
    }
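For completeness, a commonly suggested alternative is to receive the arguments through a param() block and call 7z.exe with the call operator, which avoids Invoke-Expression and most of the quoting games. This is a sketch only, untested against this exact workload; the paths and password are placeholders:

$altBlock = {
    param([string]$archivePath, [string]$sourcePath, [string]$plainPassword)
    #With the call operator each token is passed as-is; -p is attached to the password so it travels as a single argument
    & 'C:\7zipPath\7z.exe' a -t7z -mx=5 -m0=LZMA2 -mmt -mhe=on -mhc=on "-p$plainPassword" $archivePath $sourcePath
}
Start-Job -ScriptBlock $altBlock -ArgumentList '\\UNC\path\path\archive.7z', '\\UNC\path\path\clientfolder1', 'ThePassword'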
    
    

 


Monday, June 27, 2016

SQL Server: Allow readonly access to all databases (including future)

Recently had to set up read-only access to all databases on a SQL Server. In the past this would have involved assigning roles to a login in each database, and then repeating for all new databases.

In SQL Server 2014, new server-level permissions allow this to be simplified:

GRANT CONNECT ANY DATABASE TO [domain\AD-Group]
GO
GRANT SELECT ALL USER SECURABLES TO [domain\AD-Group]
GO
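If the same grants need rolling out to several servers, a hedged PowerShell sketch like the one below could apply them remotely. It assumes the SqlServer module's Invoke-Sqlcmd is available, and the instance names are illustrative:

Import-Module SqlServer
$grants = @"
GRANT CONNECT ANY DATABASE TO [domain\AD-Group];
GRANT SELECT ALL USER SECURABLES TO [domain\AD-Group];
"@
#Apply the grants to each server in the list
ForEach ($instance IN @('SQLSERVER01','SQLSERVER02')) {
    Invoke-Sqlcmd -ServerInstance $instance -Query $grants;
}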

ref: http://www.sqlservercentral.com/articles/Security/111116/



Tuesday, June 21, 2016

Powershell: sending emails via office365 TLS

Recently had to work on sending emails from a PowerShell script using the Office 365 SMTP server. This meant I had to use TLS and also authentication.

The script below is pretty simple and does not encrypt the password in the script; this is just a proof of concept (see the note after the script).

The script creates an array of users to send the email to ($users), which is then looped through.

This line creates the email object; this is the best way to send email and, I believe, the only way if you need to enable TLS:

$client = New-Object system.Net.Mail.SmtpClient $smtpserver, $port 

Using the new email object, this line enables TLS:

$client.EnableSsl = $true 

This next line creates the user credentials that the object will use to authenticate with the Office 365 system when it tries to send the email:

$client.Credentials = New-Object System.Net.NetworkCredential( $office365User , $office365Password ); 

I also leave in (commented out) an alternative line that uses the credentials of the account the script is run under:

#$client.Credentials = [system.Net.CredentialCache]::DefaultNetworkCredentials 

Full script

[string]$to = ""
[String]$from = "<office365User@office365domain.com>"
[String]$subject = "<subject>"
[String[]]$users = "<user1@domain.com>", "<user2@domain.com>"
[String]$smtpserver = "smtp.office365.com"
[string]$office365User = '<office365User@office365domain.com>';
[string]$office365Password = '<office365UserPassword>';
[String]$port = 25

foreach ($user in $users)
{
    trap
    {
        #$CRLF added because the usual `r`n in a string does not work within trap.
        [string]$CRLF = [char]13 + [char]10
        $script:errorMessage += '"' + $smtpserver + '" : Error: {0}' -f $_.Exception.Message + $CRLF;
        $Error.Clear();
        continue;
    }
    $to      = $user
    $output = "<pre>" + "<font color='#800000'>scriptErrors: `r`n" + $script:errorMessage + "</font>`r`n"
    $output += "<b>" + $outputHeader + "</b><font color='#800000'>" + $outputAlert + "</font>" + "<b>" + $outputSubHeader + "</b>" + $outputAll + "</pre>"

    # Create mail message
    $message = New-Object system.net.mail.MailMessage
    $message.From = $from;
    foreach ($useremail in $to)
    {
        $message.To.Add($useremail);
    }
    $message.Subject = $subject;
    $message.Body = $output;
    $message.IsBodyHtml = $true;
    #Create SMTP client
    $client = New-Object system.Net.Mail.SmtpClient $smtpserver, $port
    $client.EnableSsl = $true
    # Credentials are necessary if the server requires the client
    # to authenticate before it will send e-mail on the client's behalf.
    #$client.Credentials = [system.Net.CredentialCache]::DefaultNetworkCredentials
    $client.Credentials = New-Object System.Net.NetworkCredential( $office365User , $office365Password );
     
    # Try to send the message
    $client.Send($message)
    # reset variables
    $body = ""
    $message.Dispose()
    $client.Dispose()
}
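As noted above, the plain-text password is fine for a PoC. If you take it further, one standard option (a sketch only, not part of the original script; the file path is illustrative) is to capture the credential once and store it encrypted for the current user and machine via DPAPI:

#Run once, interactively, to store the credential (Export-Clixml encrypts the password with DPAPI)
Get-Credential | Export-Clixml -Path 'C:\scripts\office365.cred'

#Then, in the script, replace the plain-text credential line with:
$cred = Import-Clixml -Path 'C:\scripts\office365.cred'
$client.Credentials = $cred.GetNetworkCredential();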


Thursday, June 16, 2016

Powershell: IIS: Set application pool idle-timeout and application pool recycle time.

Most of my IIS scripts set the IIS config options in the applicationhost.config file. This is a personal preference; it puts all config in one central location and prevents the web.config files potentially being modified or deleted in code deployments.

Add-PSSnapin WebAdministration -ErrorAction SilentlyContinue
Import-Module WebAdministration -ErrorAction SilentlyContinue
  
#<name> is the application pool name
 
#Set the idle timeout to 0 (off)
Set-ItemProperty ("IIS:\AppPools\<name>") -Name processModel.idleTimeout -value ( [TimeSpan]::FromMinutes(0))

#Disable the regular recycle time of 1740 minutes
Set-ItemProperty ("IIS:\AppPools\<name>") -Name Recycling.periodicRestart.time -Value "00:00:00"

#Clear any scheduled restart times
Clear-ItemProperty ("IIS:\AppPools\<name>") -Name Recycling.periodicRestart.schedule

#Set scheduled restart times
Set-ItemProperty ("IIS:\AppPools\<name>") -Name Recycling.periodicRestart.schedule -value @{value="05:00:00"}
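To confirm the settings took effect, the same provider paths can be read back; a quick sketch (<name> is the application pool name, as above):

#Read back the idle timeout, recycle time and scheduled restarts
Get-ItemProperty ("IIS:\AppPools\<name>") -Name processModel.idleTimeout
Get-ItemProperty ("IIS:\AppPools\<name>") -Name Recycling.periodicRestart.time
Get-ItemProperty ("IIS:\AppPools\<name>") -Name Recycling.periodicRestart.schedule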


Tuesday, June 14, 2016

Powershell: Ping with timestamp, logging and tail

I include a reference to the page where I found out how to timestamp each ping request at the bottom of this article.

This code excerpt will timestamp each ping request and redirect it to a log file; we can then use the Get-Content command to look at the contents of the log file as it is written to (like tail in Linux).

filter timestamp {"$(Get-Date -Format o): $_"}
ping servername -t | timestamp > c:\temp\logging.txt
#tail log
Get-Content c:\temp\logging.txt -wait

To tail in powershell

Get-Content c:\temp\logging.txt -Tail 10 -Wait

This returns the last 10 lines; -Wait will then wait for new entries.

References:

http://stackoverflow.com/questions/27361452/how-to-add-timestamps-to-individual-lines-of-powershell-output



Wednesday, June 08, 2016

Powershell: Find and Kill defined processes owned by specific user.

[string]$userName = 'user1';
[string]$exeName = 'cmd.exe';
#Note: the WQL string must quote the process name value; the original one-liner was missing the closing quote
Get-WmiObject -Query "Select * from Win32_Process where name = '$exeName'" |
    Select-Object Name, Handle, @{Label='Owner';Expression={$_.GetOwner().User}} |
    Where-Object {$_.Owner -like "$userName"} |
    Select-Object @{Label='Handle';Expression={[int]$_.Handle}} |
    ForEach-Object { Stop-Process -Id $_.Handle -Force }
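For what it's worth, on newer systems the same thing can be done with the CIM cmdlets (the WMI cmdlets are deprecated in later PowerShell versions). A hedged sketch mirroring the pipeline above:

[string]$userName = 'user1';
[string]$exeName = 'cmd.exe';
Get-CimInstance -Query "Select * from Win32_Process where Name = '$exeName'" |
    Where-Object { (Invoke-CimMethod -InputObject $_ -MethodName GetOwner).User -like $userName } |
    ForEach-Object { Stop-Process -Id $_.ProcessId -Force }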


Powershell: IIS add request filtering (CFIDE)

Recently had to add some request filtering deny rules to a number of servers. This was because of a detected vulnerability with ColdFusion and its admin portal.

So to achieve this quickly and consistently, I developed the following PowerShell script.

It adds a deny rule at the server level and then removes the rule for the ColdFusion administration portal site.

The first two lines ensure that the web administration tools are imported in all versions of powershell.

I like my config to be in the central applicationhost.config file, which runs less risk of developers overwriting the config in a web.config file. The line

-pspath 'MACHINE/WEBROOT/APPHOST'

ensures the config is saved in the applicationhost.config file.

-location "CFAdminPortal"

This line ensures that the config is applied to the site called CFAdminPortal

Add-PSSnapin WebAdministration
Import-Module WebAdministration
 
#Add requestfiltering
Add-WebConfigurationProperty -pspath 'MACHINE/WEBROOT/APPHOST' -filter "system.webServer/security/requestFiltering/denyUrlSequences" -name "." -value @{sequence='/CFIDE'}
 
#Remove requestfiltering  for the CFIDE site
Remove-WebConfigurationProperty -pspath 'MACHINE/WEBROOT/APPHOST' -location "CFAdminPortal" -filter "system.webServer/security/requestFiltering/denyUrlSequences" -name "." -AtElement @{sequence='/CFIDE'}
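A quick way to verify the rule landed at the server level is to read the deny sequences back from applicationhost.config; a sketch using the same provider path:

#List the deny sequences now present at server level
(Get-WebConfiguration -pspath 'MACHINE/WEBROOT/APPHOST' -filter "system.webServer/security/requestFiltering/denyUrlSequences").Collection | Select-Object sequence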


Powershell: Cloudflare autoupdate public IP and send SMS

[2020-12-24 Edit-start]
Two things here - 

1. Cloudflare now allows for access tokens, which you can lock down to specific areas and actions, like DNS zones and read/edit. So it's better now to use these than the global API key, which was the only option available when I wrote this article. You will need to create the access token in Cloudflare and then change the $headers variable to something like this:

$headers = @{
    "Authorization" = 'Bearer <access token>';
    "Content-Type" = 'application/json'
}

2. Also, now with the demise of Internet Explorer, I have found that to use Invoke-WebRequest you have to add -UseBasicParsing to the command. So it will read like this:

$request = Invoke-WebRequest -Uri "${cloudFlareApiBaseUrl}/zones/?name=${Zone}" -Method 'GET' -Headers $headers -UseBasicParsing

[2020-12-24 Edit-end]

We had a connection that carried our company VPN, and the VPN was available on a couple of URLs. This connection also had a backup connection, so if the primary link failed it would swap to the backup. However, this would change the public IP address.
I wanted to make it so that when the connection failed over, the VPN URLs were updated.
The script below achieves that by updating the DNS records in Cloudflare with the new public IP of the connection.
It looks at the public IP and then compares it to the record in DNS: if they are the same, it does nothing; if different, it updates the record and sends an SMS.
The SMS API may be different for you depending on the company you use.
 
function Get-ScriptDirectory
{
    #Determine the folder in which the script lives.
    $Invocation = (Get-Variable MyInvocation -Scope 1).Value
    Split-Path $Invocation.MyCommand.Path
}
#[string]$CRLF
#needed to find the location of the script.
$scriptPath = Get-ScriptDirectory

[String]$scriptCurrentDateTime = Get-Date -format "yyyyMMddHHmmss"

$headers = @{
        'X-Auth-Key' = '<auth-key>';
        'X-Auth-Email' = '<email>';
        'Content-Type' = 'application/json'
    }

[string]$urlRequest = 'https://api.smscompanyurl.com/api-adv.php?';
[string]$smsUsername = '<username>';
[string]$smsPassword = '<password>';
[string]$smsNumbers = '<phonenumber1>,<phonenumber2>,<phonenumber3>';
[string]$smsFrom = '<Company Name>';
[string]$smsMessage = [uri]::EscapeDataString('Office IP Address has changed. Attempting auto update.');



$ZoneRecords = @(
'temp.contoso.com',
'tempytemp.contoso.com'
)

[string]$Zone = 'contoso.com';
[string]$cloudFlareApiBaseUrl = 'https://api.cloudflare.com/client/v4';
[string]$public_IP='';


# Get Zone ID from cloudflare
    try 
    {
        $request = Invoke-WebRequest -Uri "${cloudFlareApiBaseUrl}/zones/?name=${Zone}" -Method 'GET' -Headers $headers
    }
    catch
    {
        $MyError = $_
        Throw $MyError
    }
    $zoneId = $(ConvertFrom-Json $request.Content).result[0].id


#Lookup external ip
    try 
    {
        $public_IP = (Invoke-RestMethod https://api.ipify.org?format=json).ip.trim()
    }
    catch
    {
        $MyError = $_
        Throw $MyError
    }
     

Foreach ($Element IN $ZoneRecords) {

    #GET record info from cloudflare
    #$RecordVPNweb = Invoke-WebRequest -Uri "${cloudFlareApiBaseUrl}/zones/${ZoneId}/dns_records/?name=${Element}" -Method 'GET' -Headers $headers
    try 
    {
        $RecordVPN = Invoke-RestMethod -Uri "${cloudFlareApiBaseUrl}/zones/${ZoneId}/dns_records/?name=${Element}" -Method 'GET' -Headers $headers -ContentType 'application/json'   
    }
    catch
    {
        $MyError = $_
        Throw $MyError
    }
    
    #When using Invoke-WebRequest you need to convert from JSON
    #$RecordVPN_ID = $(ConvertFrom-Json $RecordVPN.Content).result[0].id
    #$RecordVPN_IP = $(ConvertFrom-Json $RecordVPN.Content).result[0].content
    $RecordVPN_ID = $RecordVPN.result[0].id
    $RecordVPN_IP = $RecordVPN.result[0].content
    
    if ($RecordVPN_IP -ne $public_IP) {

        $smsMessage = $smsMessage + [uri]::EscapeDataString("Changing from $RecordVPN_IP to new $public_IP");
        $queryString = "username=$smsUsername&password=${smsPassword}&to=${smsNumbers}&from=${smsFrom}&message=${smsMessage}";
        ${urlRequest}+${queryString}
        [string]$textfileName = "$scriptPath\IPChanged$scriptCurrentDateTime.txt"
        
        try 
        {
            #send sms
            [String]$scriptIPChangeDateTime = Get-Date -format "yyyy-MM-dd HHmmss"
            #Note: inside double quotes a literal '+' would end up in the URL, so the two parts are concatenated directly
            [string]$textfileContent = "IP Change attempted at $scriptIPChangeDateTime : ${urlRequest}${queryString}"
            out-file -filepath $textfileName -inputobject $textfileContent -encoding ASCII
            Invoke-RestMethod -Uri "${urlRequest}${queryString}" -Method 'GET';
        }
        catch
        {
            $MyError = $_
            Throw $MyError
        }

        $Data = @{
        'id' = "${RecordVPN_ID}";
        'type' = 'A';
        'name' = "${Element}";
        'content' = "${public_IP}";
        }
    
        # If it exists, UPDATE (PUT), if not, CREATE (POST)
        [string]$method = 'PUT';
        if ($RecordVPN.result.Count -eq 0) {
            $method = 'POST';
            $Data.Remove('id');
        }

        $DataJson = ConvertTo-Json $Data

        try 
        {
            $JSONResponse = Invoke-RestMethod -Uri "${cloudFlareApiBaseUrl}/zones/${ZoneId}/dns_records/${RecordVPN_ID}/" -Headers $Headers -Body $DataJson -Method "${method}" -ContentType 'application/json' -ErrorAction Stop
        }
        catch
        {
            $MyError = $_
            Throw $MyError
        }
       
    }
}
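For the failover to be caught promptly, the script needs to run regularly. A minimal scheduling sketch using the ScheduledTasks module; the task name, script path and interval are all illustrative and not part of the original setup:

#Run the update check every five minutes
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File C:\scripts\Update-CloudflareIP.ps1'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 5) -RepetitionDuration (New-TimeSpan -Days 365)
Register-ScheduledTask -TaskName 'CloudflareIPUpdate' -Action $action -Trigger $trigger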


Tuesday, June 07, 2016

Creating Code block for blogger

It's great news that Live Writer has been made open source by Microsoft. Please see Open Live Writer (OLW):

http://openlivewriter.org/

There is a code plugin for WordPress:

https://richhewlett.com/wlwsourcecodeplugin/

This has been updated to work with Open Live Writer. However, I am still waiting on a plugin for Blogger in OLW, so in the meantime my process is this:

  1. Open the code in notepad++, which does nice simple syntax highlighting.
  2. Use the nppExport plugin to export to html. (note: I could not get this to work to the clipboard; I had to export to a file. A minor pain.)
  3. Then grab the <style> section; this details the css and how the code will be formatted. Paste this into the html view of the blog post in OLW.
  4. The following CSS has been inserted into the blogger template. It defines a class to apply to the div containing the code, and then a style to apply to pre tags within that div.

.divCodeBlock {
  word-wrap: break-word;
  width: 95%;
  background: #ffffff;
  white-space: pre-line;
  border-top: black thin solid;
  border-right: black thin solid;
  border-bottom: black thin solid;
  border-left: black thin solid;
  padding-bottom: 2px;
  padding-top: 2px;
  padding-left: 2px;
  padding-right: 2px;
  margin-left: auto;
  margin-right: auto;
  line-height: 1;
}

div.divCodeBlock pre {
  white-space: pre-wrap; /* Since CSS 2.1 */
  white-space: -moz-pre-wrap; /* Mozilla, since 1999 */
  white-space: -pre-wrap; /* Opera 4-6 */
  white-space: -o-pre-wrap; /* Opera 7 */
  word-wrap: break-word; /* Internet Explorer 5.5+ */
}

  5. In source view in OLW insert the following div tag where you want to place your code block.
    <div class="divCodeBlock"> <pre></pre></div>
  6. Now grab the page content in notepad++; this will be within <div> tags in the <body>. Place it into the
    <div class="divCodeBlock"> <pre><content here></pre></div>

Publish and bang, “Bob's your auntie's live-in lover”



Cloudflare API delete multiple records via Powershell

Recently, whilst setting up Cloudflare DNS, I had to delete a lot of records. With the API you can only do it one at a time, and with thousands of records to remove I put together a little script to do this.

Please ensure you fully test what you are doing with this, as I take no responsibility, but it worked for me.

You will need various things to make this work, and I will assume you know how to use the API by supplying your API key and email etc.

The script below can, with a small change, either read in the records to delete from a file or you can manually create the array in the script.

The script also contains output debugging that I left in.

Hope it helps somebody :)

$headers = @{
"X-Auth-Key" = '<you need to get this from cloudflare>';
"X-Auth-Email" = '<replace this with email address>';
"Content-Type" = 'application/json'
}
#this may need changing when Cloudflare update the api
[string]$cloudFlareApiBaseUrl = "https://api.cloudflare.com/client/v4"
#file with records in it
[string]$recordlist= 'c:\path\fileRecordList.txt';
#dns domain to use
[string]$Zone = "contoso.com";

#Manually create array, add records here to delete.
<#
$ZoneRecords = @(
'example.contoso.com',
'example.subdomain.contoso.com'
)#>


#Create array from file, records to delete.
$ZoneRecords = Get-Content -Path $recordlist

# Get Zone ID from cloudflare
$request = Invoke-WebRequest -Uri "${cloudFlareApiBaseUrl}/zones/?name=${Zone}" -Method "GET" -Headers $headers
$zoneId = $(ConvertFrom-Json $request.Content).result[0].id

Foreach ($Element IN $ZoneRecords) {
#debug
$Element

#GET record info from cloudflare
$Record = Invoke-WebRequest -Uri "${cloudFlareApiBaseUrl}/zones/${ZoneId}/dns_records/?name=${Element}" -Method "GET" -Headers $headers
$RecordID = $(ConvertFrom-Json $Record.Content).result[0].id
$RecordName = $(ConvertFrom-Json $Record.Content).result[0].name

#debug
$Record

#DELETE record info from cloudflare
$RecordDelete = Invoke-WebRequest -Uri "${cloudFlareApiBaseUrl}/zones/${ZoneId}/dns_records/${RecordID}" -Method "DELETE" -Headers $headers

#debug
$RecordDelete
'RecordName:' + $RecordName;
'RecordID:' + $RecordID;
}

The format of the text file is just flat text, one record per line:

test.qa.contoso.com
test.demo.contoso.com
test.demo.contoso.com
test.contoso.com
test1.contoso.com
test2.contoso.com
test.qa2.contoso.com
test2.qa.contoso.com
test.training.contoso.com
test3.contoso.com
test4.contoso.com
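Note the sample above contains a repeated entry; a small optional pre-step is to de-duplicate the list before feeding it to the delete loop, for example:

#Remove duplicate records from the list file (parentheses force the full read before the write)
(Get-Content 'c:\path\fileRecordList.txt') | Sort-Object -Unique | Set-Content 'c:\path\fileRecordList.txt'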
