Friday, 20 December 2024

Search for an IP Address in the Last 7 Days of Windows Security Event Logs

This PowerShell script allows you to filter Windows Security event logs for a specific IP address, focusing on events from the past 7 days. The results are saved to a CSV file for further analysis.

The Script

# Define the IP address and output CSV file path
$ipaddress = "10.1.1.1"
$outputFile = "C:\SecurityEvents_Last7Days.csv"

# Define the start date (7 days ago)
$startDate = (Get-Date).AddDays(-7)

# Extract the events from the last 7 days and export to CSV
Get-WinEvent -LogName Security -FilterXPath "*[EventData[Data[@Name='IpAddress']='$ipaddress']]" |
    Where-Object { $_.TimeCreated -ge $startDate } |
    Select-Object TimeCreated, Id, Message |
    Export-Csv -Path $outputFile -NoTypeInformation -Encoding UTF8

# Notify user of completion
Write-Output "Events from the last 7 days successfully exported to $outputFile"

Key Features

  1. Filters by IP Address: Searches for events where the IP address matches the specified value.
  2. Time Range: Limits results to events that occurred in the last 7 days using the TimeCreated property.
  3. CSV Output: Saves event details (timestamp, ID, and message) to a specified CSV file.

How to Use It

  1. Replace 10.1.1.1 with the target IP address.
  2. Save the script to a .ps1 file or run it directly in PowerShell with administrator privileges.
  3. Locate the output file (C:\SecurityEvents_Last7Days.csv) for review.

Script Workflow

  1. Input Definition: The $ipaddress variable holds the IP address, and $outputFile specifies the CSV file location.
  2. Time Range Setup: $startDate is calculated as 7 days prior to the current date.
  3. Event Filtering: Get-WinEvent retrieves log entries matching the IP address. Where-Object ensures only events from the past 7 days are included.
  4. Data Export: Selected details are saved to the CSV file for analysis.
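The XPath filter can also be narrowed to a specific event ID. The following is an illustrative variation of the command above, reusing the same variables; event ID 4625 (failed logons) is only an example and can be swapped for whichever ID you are interested in:

# Illustrative variation: match both a specific event ID and the IP address
Get-WinEvent -LogName Security -FilterXPath "*[System[EventID=4625] and EventData[Data[@Name='IpAddress']='$ipaddress']]" |
    Where-Object { $_.TimeCreated -ge $startDate } |
    Select-Object TimeCreated, Id, Message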

Practical Applications

  • Security Monitoring: Quickly identify events tied to suspicious IP activity.
  • Incident Investigation: Focus on recent logs for faster issue resolution.
  • Data Analysis: Exported CSV files can be reviewed in Excel or other tools.

Conclusion

This script is a concise, efficient way to analyze recent security events related to a specific IP address. Adjust the IP and time range as needed for your specific use case, and use the exported data to inform your network security actions.
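The exported CSV can also be summarised straight back in PowerShell rather than Excel. This is a minimal sketch that assumes the output path defined above and simply counts events per event ID:

# Count how many of each event ID were captured in the export
Import-Csv -Path "C:\SecurityEvents_Last7Days.csv" |
    Group-Object -Property Id |
    Sort-Object -Property Count -Descending |
    Select-Object Name, Count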

Monday, 9 December 2024

Understanding Microsoft SQL Index Fragmentation and How to Manage It

Introduction

Indexes in SQL Server play a crucial role in improving query performance by allowing faster access to data. However, over time, these indexes can become fragmented, leading to slower queries and increased system resource usage. In this post, we’ll explore what index fragmentation is, its types, and how to address it effectively.

What Is Index Fragmentation?

Index fragmentation occurs when the logical order of data pages in an index no longer matches their physical order on disk. This misalignment can cause SQL Server to work harder to retrieve ordered data, negatively impacting performance.

Types of Fragmentation:

  1. Internal Fragmentation:

    • Happens when data pages contain excessive free space, often due to page splits during inserts or updates.
    • Leads to inefficient use of storage and additional I/O operations.
  2. External Fragmentation:

    • Occurs when the logical sequence of pages doesn’t align with their physical storage order.
    • Results in extra effort for SQL Server to return ordered results.
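Both kinds of fragmentation can be measured with sys.dm_db_index_physical_stats. Here is a minimal sketch; dbo.Orders is a placeholder table name, and SAMPLED mode is used because LIMITED mode does not report page-space figures:

-- Placeholder object name: swap dbo.Orders for a table you want to inspect
SELECT
    index_id,
    avg_fragmentation_in_percent,    -- external fragmentation: logical vs physical page order
    avg_page_space_used_in_percent   -- internal fragmentation: how full each page is
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.Orders'), NULL, NULL, 'SAMPLED');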

When to Reorganize or Rebuild Indexes

To manage fragmentation, SQL Server provides two options:

  • Reorganize:
    • A lightweight, online operation that defragments the index at the leaf level by reordering pages.
    • Minimal system resource usage and can be safely interrupted.
  • Rebuild:
    • A more intensive process that completely recreates the index, removing fragmentation.
    • Can be done online or offline, depending on your SQL Server edition.
    • Requires more resources but provides thorough optimization.

Key Considerations for Online Index Rebuilds:

  • Enterprise Edition: Supports online rebuilds, allowing uninterrupted access to data.
  • Standard and Other Editions: Requires offline rebuilds, during which data access is temporarily restricted.

Best Practices

  • Reorganize when fragmentation levels are between 5% and 30%.
  • Rebuild when fragmentation exceeds 30%.

These thresholds may vary depending on workload and system specifics. Regular monitoring of fragmentation levels helps maintain optimal performance.
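Applying those thresholds typically looks like the following; the table and index names below are placeholders:

-- 5% to 30% fragmentation: reorganize (always online, can be stopped safely)
ALTER INDEX IX_Orders_CustomerId ON dbo.Orders REORGANIZE;

-- Above 30% fragmentation: rebuild (ONLINE = ON requires Enterprise Edition; omit it for an offline rebuild)
ALTER INDEX IX_Orders_CustomerId ON dbo.Orders REBUILD WITH (ONLINE = ON);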

The Script

The following script loops through every online user database, measures index fragmentation with sys.dm_db_index_physical_stats, and lists any index that is more than 5% fragmented:

SET NOCOUNT ON;

-- Create a temporary table for results
IF OBJECT_ID('tempdb..#Fragmentation') IS NOT NULL DROP TABLE #Fragmentation;

CREATE TABLE #Fragmentation (
    DatabaseName NVARCHAR(128),
    TableName NVARCHAR(128),
    IndexName NVARCHAR(128),
    IndexType NVARCHAR(60),
    AvgFragmentationPercent FLOAT,
    PageCount BIGINT -- sys.dm_db_index_physical_stats reports page_count as bigint
);

-- Declare variables
DECLARE @DBName NVARCHAR(128);
DECLARE @SQL NVARCHAR(MAX);

-- Iterate through all databases
DECLARE dbCursor CURSOR FOR
SELECT name FROM sys.databases
WHERE state_desc = 'ONLINE' AND name NOT IN ('master', 'tempdb', 'model', 'msdb');

OPEN dbCursor;
FETCH NEXT FROM dbCursor INTO @DBName;


WHILE @@FETCH_STATUS = 0
BEGIN
    SET @SQL = '
    USE [' + @DBName + '];
    INSERT INTO #Fragmentation
    SELECT
        DB_NAME() AS DatabaseName,
        OBJECT_NAME(ips.object_id) AS TableName,
        i.name AS IndexName,
        CASE
            WHEN i.type = 1 THEN ''Clustered Index''
            WHEN i.type = 2 THEN ''Non-Clustered Index''
            WHEN i.type = 3 THEN ''XML Index''
            ELSE ''Unknown''
        END AS IndexType,
        ips.avg_fragmentation_in_percent AS AvgFragmentationPercent,
        ips.page_count AS PageCount
    FROM
        sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, ''LIMITED'') ips
    JOIN
        sys.indexes i ON ips.object_id = i.object_id AND ips.index_id = i.index_id
    WHERE
        ips.avg_fragmentation_in_percent > 5
    AND
        ips.page_count > 1;';

    EXEC sp_executesql @SQL;
    FETCH NEXT FROM dbCursor INTO @DBName;
END;

CLOSE dbCursor;
DEALLOCATE dbCursor;

-- Display results
SELECT * FROM #Fragmentation
ORDER BY DatabaseName, AvgFragmentationPercent DESC;

-- Clean up
DROP TABLE #Fragmentation;


Sunday, 25 August 2024

Understanding and Implementing BIMI TXT Records

Brand Indicators for Message Identification (BIMI) is an innovative standard that empowers brands to showcase their logo in email clients that are compatible with BIMI. This feature not only bolsters brand recognition but also fosters trust among email recipients. Here's a concise guide on what BIMI TXT records are and how to utilize them.

A BIMI TXT record is a string of text incorporated into your domain's DNS records. It contains the URL of your logo file, which should be a Scalable Vector Graphics (SVG) file.

To set up a BIMI record, you first need an SVG logo file hosted at a publicly accessible HTTPS URL on your domain. You then create a TXT record with the following content:

v=BIMI1;l=[your SVG file URL]
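For example, published as a DNS zone-file entry at the default BIMI selector (example.com and the logo URL are placeholders):

default._bimi.example.com.  IN  TXT  "v=BIMI1; l=https://example.com/images/brand-logo.svg"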

This simple step allows your brand's logo to appear in supporting email clients, enhancing your brand's visibility and trustworthiness.

Saturday, 17 August 2024

Windows Memory Compression (More RAM at the expense of CPU)

Windows 10 introduced a feature known as Memory Compression in build 10525, and it is also included in Windows 11. This feature aims to optimize the utilization of your system's physical memory and reduce the need for disk-based pagefile I/O operations.

Memory Compression works by compressing infrequently accessed pages and retaining them in a new compression store within the physical RAM. This process allows your PC’s RAM to store more data than its original capacity, which can enhance your system's performance.

For instance, if your PC has 8 GB of RAM available, and there’s 9 GB of data to be stored on it, Memory Compression will attempt to compress the extra data so it fits within the 8 GB capacity of your RAM. Without Memory Compression, your PC would store the extra data in a file on your hard drive storage, which can slow down your PC as it takes more time to read data from a file on the hard drive than from RAM.

While Memory Compression can improve performance, it does use more CPU resources. If you notice a lot of compressed memory and think it’s slowing down your PC, there are a couple of solutions. One solution is to install more physical memory (RAM). This will allow your system to store more data in RAM without needing to compress it, reducing the CPU usage associated with Memory Compression.

If installing more RAM is not feasible, you can disable Memory Compression. Here’s how:

  1. Open PowerShell as an administrator.
  2. Type the following command and press Enter: `Disable-MMAgent -MemoryCompression`
  3. Restart your computer.
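Before disabling it, you may want to confirm whether Memory Compression is actually enabled. A quick check from the same elevated PowerShell session, along with the command to turn the feature back on later, looks like this:

# Shows the current Memory Manager agent settings, including MemoryCompression (True/False)
Get-MMAgent

# Re-enable Memory Compression if you change your mind
Enable-MMAgent -MemoryCompression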

In conclusion, Memory Compression is a feature designed to optimize your system's performance by making efficient use of your RAM. It's a tool that can be beneficial, but like all tools, it's important to understand how it works and when to use it.

Sunday, 11 August 2024

The risk of AI model collapse / The death of generative AI

Model collapse refers to a phenomenon where machine learning models gradually degrade due to errors stemming from unchecked training on synthetic data. Specifically, this synthetic data includes outputs from other models, including prior versions of the same model. There are two distinct stages of model collapse:

  1. Early Model Collapse: In the early stages, collapse can be hard to detect; performance may even appear to improve while the model starts to lose its grasp of the smaller details.

  2. Late Model Collapse: This is where performance and accuracy both start to suffer greatly, with the AI becoming confused and losing much of its variance.
In a study described by Duke University researcher Emily Wenger, an AI model was given the task of generating dog breeds. At first the model would reproduce the breeds most common in its training data, and it could start to over-represent a particular group of breeds if they appeared more often in that data.

As new generations are trained on data produced by older generations, that over-representation compounds until rare breeds disappear from the generated output altogether. Over time this leads to a total collapse, in which the model outputs only a single breed of dog.

This risk of collapse undermines generative AI as a useful tool. With human-generated content making up a shrinking share of training sets and AI-generated content on the rise, are we heading towards a totally avoidable dumbing down of AI?

Should we not reframe our view of AI in this process and allow it the same freedom of access to online data that a human has, enabling the growth of a tool that could change how we interact with information as a whole?


Monday, 15 July 2024

Enable Wireless Diagnostics in Windows

  1. Open Event Viewer.
  2. Go to View and tick "Show Analytic and Debug Logs".
  3. Go to "Applications and Services Logs > Microsoft > Windows > WLAN-AutoConfig", right-click "Diagnostic" and select "Properties".
  4. Tick "Enable logging".

This should now bring more meaningful results for "netsh wlan show wlanreport".
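The same log can likely also be enabled from an elevated prompt with wevtutil. The channel name below is inferred from the Event Viewer path in step 3, so confirm it first with the listing command:

wevtutil el | findstr WLAN-AutoConfig
wevtutil sl "Microsoft-Windows-WLAN-AutoConfig/Diagnostic" /e:true
netsh wlan show wlanreport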

Wednesday, 3 April 2024

How to Backup BitLocker Key to Azure AD Using PowerShell

BitLocker is a security feature built into Windows that provides encryption for entire volumes. It addresses the threats of data theft or exposure from lost, stolen, or inappropriately decommissioned devices. By encrypting the hard drive where Windows is installed, or the entire computer if it has multiple drives, BitLocker helps protect your data.

BitLocker is particularly useful as it provides protection against unauthorised changes to your system such as firmware-level malware. It also helps mitigate unauthorised data access by enhancing file and system protections. BitLocker is an essential tool for securing your data, especially when data breaches and information theft are common.

The Command

Here is the command that we’ll be using:

BackupToAAD-BitLockerKeyProtector -MountPoint $env:SystemDrive -KeyProtectorId ((Get-BitLockerVolume -MountPoint $env:SystemDrive).KeyProtector | Where-Object { $_.KeyProtectorType -eq "RecoveryPassword" }).KeyProtectorId

This command backs up the BitLocker key protector of type “RecoveryPassword” for the system drive to AAD.

Outputting the Key Protector to the Screen

If you want to output the key protector to the screen, you can use the following command:

(Get-BitLockerVolume -MountPoint C).KeyProtector

This command retrieves the key protector for the C drive and outputs it to the screen.
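If you only want the numerical recovery password protectors rather than every protector type, a small variation using the same cmdlets is:

# List only the RecoveryPassword protectors for the OS drive
(Get-BitLockerVolume -MountPoint $env:SystemDrive).KeyProtector |
    Where-Object { $_.KeyProtectorType -eq "RecoveryPassword" } |
    Select-Object KeyProtectorId, RecoveryPassword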