PowerShell: Extracting NetScaler "Secure Gateway" User Logs


If you are using Citrix NetScaler and you need a report on people who have tried to use the Secure Gateway service (which acts as a remote access portal), then on the NetScaler designated for external access I have used WinSCP to look in the folder:


/var/log

You are then looking for the entries named nsvpn*, as below and highlighted:


I have then copied these files from WinSCP to a folder called "nslog". Notice that many of these will be zipped up, which will look like this:
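Once the files are copied down, a quick inventory from PowerShell confirms what arrived — a small sketch, assuming you run it from the folder that contains "nslog":

```powershell
# Count how many of the copied nslog files are still gzip-compressed
# versus already plain-text logs, grouped by file extension.
Get-ChildItem -Path .\nslog -File |
    Group-Object Extension |
    Select-Object Name, Count
```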


We now need a script to extract these files and place them in a folder called "Logs", which will then contain the 7-Zip extracted files such as "log.0".

Script : Extractor.ps1

# Define paths
$sourceFolder = ".\nslog"
$destinationFolder = ".\Logs"
$7zipPath = "C:\Program Files\7-Zip\7z.exe"
$combinedLogFile = Join-Path $destinationFolder "combined_logs.log"

# Check if 7-Zip is installed
if (-not (Test-Path $7zipPath)) {
    Write-Error "7-Zip is not installed in the expected location: $7zipPath"
    exit 1
}

# Create Logs folder if it doesn't exist
if (-not (Test-Path $destinationFolder)) {
    Write-Host "Creating Logs folder..."
    New-Item -ItemType Directory -Path $destinationFolder | Out-Null
}

# Check if source folder exists
if (-not (Test-Path $sourceFolder)) {
    Write-Error "Source folder 'nslog' not found!"
    exit 1
}

# Process .gz files
Write-Host "Processing .gz files..."
$gzFiles = Get-ChildItem -Path $sourceFolder -Filter "*.gz" -Recurse
foreach ($file in $gzFiles) {
    Write-Host "Extracting: $($file.Name)"
    try {
        # Create a unique subfolder for initial extraction to prevent overwrites
        $tempFolder = Join-Path $destinationFolder $file.BaseName
        New-Item -ItemType Directory -Path $tempFolder -Force | Out-Null       

        # Extract to temp folder first
        & $7zipPath x $file.FullName "-o$tempFolder" -y | Out-Null
        if ($LASTEXITCODE -ne 0) {
            Write-Warning "Failed to extract: $($file.Name)"
            continue
        }       

        # Move extracted files to main Logs folder
        Get-ChildItem -Path $tempFolder -File | ForEach-Object {
            $newName = $file.BaseName + "_" + $_.Name
            Move-Item -Path $_.FullName -Destination (Join-Path $destinationFolder $newName) -Force
        }       

        # Clean up temp folder
        Remove-Item -Path $tempFolder -Recurse -Force
    }
    catch {
        Write-Error "Error extracting $($file.Name): $_"
    }
}

# Copy .log files
Write-Host "Copying .log files..."
$logFiles = Get-ChildItem -Path $sourceFolder -Filter "*.log" -Recurse
foreach ($file in $logFiles) {
    Write-Host "Copying: $($file.Name)"
    try {
        # Add prefix to copied log files to prevent overwrites
        $newName = "copied_" + $file.Name
        Copy-Item -Path $file.FullName -Destination (Join-Path $destinationFolder $newName) -Force
    }
    catch {
        Write-Error "Error copying $($file.Name): $_"
    }
}

# Combine all log files into a single file for easier searching
Write-Host "Combining log files..."
if (Test-Path $combinedLogFile) {
    Remove-Item $combinedLogFile -Force
}
Get-ChildItem -Path $destinationFolder -File |
    Where-Object { $_.Name -ne (Split-Path $combinedLogFile -Leaf) } |
    ForEach-Object { Get-Content $_.FullName | Add-Content -Path $combinedLogFile }

# Summary
$extractedCount = $gzFiles.Count
$copiedCount = $logFiles.Count
$finalLogCount = (Get-ChildItem -Path $destinationFolder -File |
    Where-Object { $_.Name -match '\.log(\.\d+)?$' }).Count
Write-Host "`nOperation completed:"
Write-Host "- Processed $extractedCount .gz files"
Write-Host "- Copied $copiedCount .log files"
Write-Host "- Total log files in destination: $finalLogCount"
Write-Host "All files have been processed to: $destinationFolder"
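As an aside on the extraction step: if 7-Zip is not available, .gz files can also be expanded with the .NET GZipStream class that ships with PowerShell. This is a minimal sketch, not part of the script above — the function name Expand-GzipFile is my own:

```powershell
# Decompress one .gz file with .NET's GZipStream (no 7-Zip dependency).
function Expand-GzipFile {
    param(
        [Parameter(Mandatory)][string]$Path,
        [Parameter(Mandatory)][string]$Destination
    )
    $inStream  = [System.IO.File]::OpenRead($Path)
    $gzStream  = New-Object System.IO.Compression.GZipStream($inStream, [System.IO.Compression.CompressionMode]::Decompress)
    $outStream = [System.IO.File]::Create($Destination)
    try {
        # Stream the decompressed bytes straight into the destination file
        $gzStream.CopyTo($outStream)
    }
    finally {
        $gzStream.Dispose()
        $outStream.Dispose()
        $inStream.Dispose()
    }
}

# Example: Expand-GzipFile -Path .\nslog\nsvpn.log.0.gz -Destination .\Logs\nsvpn.log.0
```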

After running Extractor.ps1, the Logs folder will look like this:


Now we need to look at those log files and extract the relevant information on the logins being processed:

Script : UserAccess.ps1

# Define the paths to the Logs folder and output files
$logsFolder = ".\Logs"
$usernameReportFile = ".\username_frequency.log"

# Verify the Logs folder exists
if (-not (Test-Path $logsFolder)) {
    Write-Error "Logs folder not found!"
    exit 1
}

Write-Host "Analyzing username frequencies..."

# Get all files ending with either log or log.0
$logFiles = Get-ChildItem -Path $logsFolder -Filter "*log*" | 
    Where-Object { $_.Name -match '\.log(\.0)?$' }

# Create hashtable to store username counts
$usernameCounts = @{}

foreach ($logFile in $logFiles) {
    Write-Host "Processing: $($logFile.Name)"
    
    # Read the file content
    $content = Get-Content $logFile.FullName

    # Search for both types of authentication attempts
    $radiusMatches = $content | Select-String -Pattern "process_radius: RADIUS auth: Authentication failed"
    $kernelMatches = $content | Select-String -Pattern "process_kernel_socket: call to authenticate user"

    # Process RADIUS authentication usernames
    foreach ($match in $radiusMatches) {
        if ($match.Line -match 'user (\S+)') {
            $username = $matches[1]
            if ($usernameCounts.ContainsKey($username)) {
                $usernameCounts[$username]++
            } else {
                $usernameCounts[$username] = 1
            }
        }
    }

    # Process kernel socket authentication usernames
    foreach ($match in $kernelMatches) {
        if ($match.Line -match 'user :(\S+)') {
            $username = $matches[1]
            if ($usernameCounts.ContainsKey($username)) {
                $usernameCounts[$username]++
            } else {
                $usernameCounts[$username] = 1
            }
        }
    }
}

# Create the report
"Username Frequency Report" | Set-Content $usernameReportFile
"======================" | Add-Content $usernameReportFile
"Generated: $(Get-Date)" | Add-Content $usernameReportFile
"" | Add-Content $usernameReportFile

# Sort usernames by frequency (highest to lowest) and write to file
"USERNAME FREQUENCIES (Sorted by count)" | Add-Content $usernameReportFile
"----------------------------------" | Add-Content $usernameReportFile
$usernameCounts.GetEnumerator() | 
    Sort-Object Value -Descending | 
    ForEach-Object {
        "$($_.Key): $($_.Value) attempts" | Add-Content $usernameReportFile
    }

# Add summary statistics
"`nSUMMARY" | Add-Content $usernameReportFile
"=======" | Add-Content $usernameReportFile
"Total unique usernames: $($usernameCounts.Count)" | Add-Content $usernameReportFile
"Total authentication attempts: $($usernameCounts.Values | Measure-Object -Sum | Select-Object -ExpandProperty Sum)" | Add-Content $usernameReportFile

Write-Host "`nAnalysis completed:"
Write-Host "- Files searched: $($logFiles.Count)"
Write-Host "- Unique usernames found: $($usernameCounts.Count)"
Write-Host "- Results saved to: $usernameReportFile"

# Display the log files that were searched
Write-Host "`nFiles searched:"
$logFiles | ForEach-Object { Write-Host "- $($_.Name)" }
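To see why the script uses two different capture patterns, here is how they behave on lines shaped like the two NetScaler message types it searches for. The sample lines are illustrative only, not verbatim appliance output:

```powershell
# Sample lines shaped like the two message types the script matches.
$radiusLine = "process_radius: RADIUS auth: Authentication failed : user jsmith from 203.0.113.5"
$kernelLine = "process_kernel_socket: call to authenticate user :jsmith in vpn context"

# The RADIUS message has a space after "user"; the kernel socket
# message uses "user :", hence the two different regexes.
if ($radiusLine -match 'user (\S+)')  { Write-Host $matches[1] }   # prints jsmith
if ($kernelLine -match 'user :(\S+)') { Write-Host $matches[1] }   # prints jsmith
```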

Running UserAccess.ps1 will then produce the output file:


username_frequency.log - contains all the detected usernames, with attempt counts, regardless of success or failure

When you look at username_frequency.log, note some of the silly usernames that have been tried:


This is why, if you have a login form, you should really use reCAPTCHA on it to stop people running brute-force logins against appliances that have no throttling protection. Yes, this activity is against the Computer Misuse Act, but the people attempting these login sprays may not care as much about that law as your corporation does.

If you have MFA or secondary authentication in place, then these events are nothing but noise and no cause for panic; when you run a public-facing website this is quite a common occurrence.