Getting Started with Wazuh: Setting Up Your Lab Environment for XDR and SIEM

In today’s cybersecurity landscape, having a robust and flexible security information and event management (SIEM) system is crucial.
Wazuh, an open-source security platform, offers comprehensive solutions for threat detection, integrity monitoring, incident response, and compliance.

Wazuh has an interesting history. In 2015, the Wazuh team decided to fork OSSEC, an open-source host-based Intrusion Detection System (IDS), to expand its core functionalities with additional features, enhancements, and a user-friendly interface.
Wazuh has grown significantly since its inception. It is now a comprehensive, open-source security platform that provides unified XDR (Extended Detection and Response) and SIEM (Security Information and Event Management) capabilities. The platform is designed to monitor infrastructures, detect threats, respond to incidents, and ensure regulatory compliance.

This blog will guide you through setting up Wazuh in a lab environment, focusing on its basic capabilities in Extended Detection and Response (XDR) and SIEM.
Whether you’re a cybersecurity professional or an enthusiast, this step-by-step guide will help you get started with Wazuh to secure your systems effectively.
We start with the defaults to keep the lab setup no more complex than necessary.

My lab environment is as follows:

Host           IP             OS
Wazuh-Server   10.50.100.76   Ubuntu 24 LTS
Wazuh-Agent    10.50.100.110  RHEL 9
Wazuh-Agent    10.50.100.111  RHEL 9
Wazuh-Agent    10.50.100.201  Windows

Basic setup of Wazuh-Server

With root rights, execute:

curl -sO https://packages.wazuh.com/4.10/wazuh-install.sh && sudo bash ./wazuh-install.sh -a

Example output:

Please note the autogenerated user/password; you will need it later to access the dashboard.

Linux: Basic setup of Wazuh-Agent

With root rights, run the agent installer (10.50.100.76 = Wazuh server):
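A typical installation on the RHEL 9 agents looks like this (a sketch based on the Wazuh yum repository; the exact package version in the URL is an assumption, so please check the current Wazuh documentation):

    curl -o wazuh-agent.rpm https://packages.wazuh.com/4.x/yum/wazuh-agent-4.10.0-1.x86_64.rpm
    sudo WAZUH_MANAGER="10.50.100.76" rpm -ihv wazuh-agent.rpm

The WAZUH_MANAGER variable tells the agent which Wazuh server it should register with.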

Example output:

Start the Wazuh-Agent and check the status:
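On a systemd-based distribution like RHEL 9 this boils down to the usual systemd commands:

    sudo systemctl daemon-reload
    sudo systemctl enable --now wazuh-agent
    sudo systemctl status wazuh-agent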

Example output:

Windows: Basic setup of Wazuh-Agent

Download the agent installer and execute the following command with admin rights:
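On Windows this is typically a single msiexec call from an elevated prompt (a sketch; the MSI file name depends on the agent version you downloaded):

    msiexec.exe /i wazuh-agent-4.10.0-1.msi /q WAZUH_MANAGER="10.50.100.76"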

Example output:

Access the Dashboard

Open a browser and access: https://10.50.100.76
Don’t be surprised that the connection is flagged as untrusted; we use the default self-signed certificates.


We see the default Dashboard:

Wazuh is shipped with default rules.
In a production environment, the real work would start now:
Create/adapt rules that are suitable for the required purposes and environment.
We will start by fixing the first (easy) vulnerability finding.

Fix a chrony-finding/vulnerability

Let’s pick an RHEL agent and check the details of the chrony finding:

  1. Navigate to Configuration Assessment
  2. Select an agent
  3. Agent ID 02 looks like a good candidate
  4. Filter the findings for chrony
  5. Click on the failed check
  6. Read the finding carefully and check the settings on the agent to get it fixed

Get the chrony finding fixed

The chrony process is not executed by the user chrony, so let’s get it fixed:
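On RHEL the usual CIS-style remediation is to make sure chronyd drops privileges to the chrony user; treat the following as a sketch and follow the remediation text shown in the finding itself:

    # /etc/sysconfig/chronyd: ensure chronyd is started with -u chrony
    OPTIONS="-u chrony"

    # restart chronyd and verify the process owner
    sudo systemctl restart chronyd
    ps -ef | grep chronyd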

To force a new assessment, a restart of the Wazuh agent is necessary.
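On a systemd-based agent that is a single command:

    sudo systemctl restart wazuh-agent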

Go back to the dashboard finding screen and check the chrony finding again:

Chrony looks good now, just another 82 findings to fix

In one of the next sessions, I will go into more detail on Wazuh; it is a great product.

Run DeepSeek LLM locally on your M-series Mac with LM Studio and integrate it with iTerm2

With the integration of LM Studio and iTerm2, powered by the cutting-edge DeepSeek LLM, developers can now streamline their workflows.
This setup enhances coding efficiency while maintaining complete control over their data.

Running DeepSeek LLM locally offers several benefits:

  1. Customization: You have full control over the model and can fine-tune it to better suit your specific needs and preferences.
  2. Offline Access: You can use the model even without an internet connection, making it more reliable in various situations.
  3. Cost Efficiency: Avoiding cloud service fees can be more economical, especially for extensive or long-term use.

These advantages make running DeepSeek LLM locally a powerful option for developers and users who prioritize privacy.

The following steps show the integration of LM Studio with iTerm2.

LM Studio

Download your preferred LLM and load the Model:

  1. Jump to the Developer screen
  2. Open Settings and set the Server Port to: 11434
  3. Start the Engine

The screen now shows a running service:

Click the copy button and close the page.
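Before wiring up iTerm2, you can quickly check that the server answers on its OpenAI-compatible endpoint. A small curl test (the model name below is only a placeholder; use whatever model you loaded in LM Studio):

    curl http://localhost:11434/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{"model": "deepseek-r1-distill-qwen-7b", "messages": [{"role": "user", "content": "Say hello"}]}'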

iTerm2

Open the Settings of iTerm2

  1. Install the AI plugin
  2. Enable AI features
  3. Enter any API key (an entry is required but is not checked locally)
  4. For the first test you can leave the AI prompt unchanged
  5. Use the llama3:latest model
  6. Paste the URL copied from LM Studio and append /v1/chat/completions

    The final URL is then
    http://localhost:11434/v1/chat/completions

Close the Settings window.

Action

  • Press Command-Y in your iTerm2 session
  • Type your question into the window and press Shift-Enter to ask your LLM:

Now you can use your locally running LLM, even when you switch off your network adapter 🙂

Automate Your Cloud Backups: rclone and Duplicati

In today’s digital age, safeguarding your data is more crucial than ever. With the increasing reliance on cloud storage, it’s essential to have a robust backup strategy in place. This blog post will guide you through automating your cloud backups (OneDrive in this example) using rclone and Duplicati on a Linux system (in my case Ubuntu 24.04.1 LTS).

Why rclone and Duplicati?

  • rclone: A versatile command-line tool (inspired by rsync) that supports various cloud storage providers, including OneDrive. It allows you to sync, copy, and mount cloud storage as if it were a local filesystem.
  • Duplicati: An open-source backup solution that offers incremental backups, encryption, and scheduling. It’s designed to work efficiently with cloud storage, making it an ideal choice for automated backups.

We’ll use rclone to mount your OneDrive folder as a local directory seamlessly. This setup allows Duplicati to perform smart incremental backups, ensuring your data is securely backed up without unnecessary duplication. In this guide, I’ll walk you through the steps to set up rclone and Duplicati, making sure your cloud storage is backed up efficiently and securely. Let’s get started!

Install rclone

On Ubuntu, the easiest way is to install rclone from the distribution repositories. rclone also provides an installation script for most Unix-like systems, including Linux and macOS; for Windows, you can download the executable from the rclone website.

Run (with root rights): apt install rclone
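If you prefer the upstream version, the official installation script mentioned above can be used instead (only pipe scripts into bash if you are comfortable with that):

    curl https://rclone.org/install.sh | sudo bash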

Install Duplicati

The installation process of Duplicati is already explained here.

OneDrive homework

By default, rclone uses a shared client ID and key when communicating with OneDrive, unless a custom client_id is specified in the configuration. This means that all rclone users share the same default client ID for their requests. This is anything but optimal, and throttling often occurs.

Recommended step: Create a unique client ID for OneDrive personal

Click New registration on https://portal.azure.com/#blade/Microsoft_AAD_RegisteredApps/ApplicationsListBlade and follow the steps outlined on the rclone page.
My screenshots for this step are attached (for reference; please zoom in to make them readable in your browser).

Setup rclone

This section guides you through the steps to configure rclone to mount your OneDrive folder and use this mount point as the source for Duplicati backups.

Run rclone config and answer the questions.

Example output (Ubuntu 24.04)

For the question Use web browser to automatically authenticate rclone with remote?:
Choose “Yes” if your host supports a GUI.
In my case I had to answer “no” and jump to a GUI-equipped host running the same rclone version to generate the needed OneDrive token with the command: rclone authorize "onedrive"

Now we can mount the OneDrive storage folder as a local mount point.
In this example I use /mnt/onedrive as the mount point (the folder /mnt/onedrive must exist before executing the mount command):

rclone mount onedrive:/ /mnt/onedrive

Let’s create an rclone service to mount the OneDrive folder at startup:

vi /etc/systemd/system/rclonemount.service
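A minimal sketch of such a unit file could look like the following; the remote name onedrive, the config path and the rclone options are assumptions from my setup, so adjust them to yours:

    [Unit]
    Description=rclone mount of OneDrive at /mnt/onedrive
    Wants=network-online.target
    After=network-online.target

    [Service]
    Type=simple
    ExecStart=/usr/bin/rclone mount onedrive:/ /mnt/onedrive --config /root/.config/rclone/rclone.conf --vfs-cache-mode writes
    # adjust the fusermount path (fusermount vs. fusermount3) to your distribution
    ExecStop=/bin/fusermount -u /mnt/onedrive
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target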

Start and test the created rclonemount.service:

systemctl start rclonemount

Run rclonemount.service at startup:

systemctl enable rclonemount

With Duplicati we can now create a new backup job using the source directory /mnt/onedrive, or any specific subfolder like /mnt/onedrive/important_data.

OneDrive can now be backed up fully automatically with a smart backup solution 🙂

As we wrap up our journey with rclone, it’s clear that this powerful tool can significantly streamline your data management tasks. Whether you’re syncing files across multiple cloud services, automating backups, or simply exploring new ways to enhance your workflow, rclone offers a versatile and reliable solution.

Remember, the key to mastering rclone, or any tool, is practice and experimentation. Don’t hesitate to dive into the documentation, explore the various commands, and tailor rclone to fit your unique needs. The possibilities are vast, and the more you experiment, the more you’ll discover the true potential of this remarkable tool.

SSH Security Made Easy: An Introduction to ssh-audit

ssh-audit is a powerful tool designed to help you assess the security of your SSH servers (and clients!). It provides detailed information about the server’s configuration, supported algorithms, and potential vulnerabilities. In this guide, I’ll walk you through the steps to install ssh-audit and run your first security tests. Secure SSH configuration made easy.

Installation on Linux

  1. Clone the Repository: Open your terminal and clone the ssh-audit repository from GitHub:
    git clone https://github.com/jtesta/ssh-audit.git
  2. Navigate to the Directory: Change to the ssh-audit directory:
    cd ssh-audit
  3. Install Dependencies: Ensure you have Python installed on your system. If not, install it using your package manager. For example, on Ubuntu:
    sudo apt-get install python3
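With the repository cloned you can run the tool straight from the checkout (assuming the entry-point script is called ssh-audit.py, as in the GitHub repository):

    python3 ./ssh-audit.py <hostname>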

Installation on macOS

To install ssh-audit, run:
brew install ssh-audit
(You already have Homebrew installed, right?)

Please check the ssh-audit URL for many other setup options (Docker, Windows, etc.).

Test the SSH-Server against vulnerabilities

Execute: ssh-audit <hostname>
Replace <hostname> with the IP address or domain name of the SSH server you want to audit.

Example for the default SSHD setup of Ubuntu 24.04 LTS:

(If you add the -l warn switch, only findings at warning level and above are presented.)

Interpreting the Results: ssh-audit will provide a detailed report of the server’s configuration, including supported key exchange algorithms, encryption ciphers, and MAC algorithms. Look for any warnings or recommendations to improve your server’s security.

Remediation

After running ssh-audit and identifying potential vulnerabilities or weak configurations in your SSH server, it’s important to take steps to remediate these issues. Below are examples of how to do this:

Example for Ubuntu 24.04.1 LTS:

(Note: This is just an example. It eliminates vulnerabilities for the SSH daemon on my system, but it may well be that this snippet does not fit your setup. Handle with care.)
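A minimal sketch of what such a drop-in could look like; the algorithm lists below are assumptions based on common ssh-audit hardening guides, not the literal content of my file, so derive your own lists from your audit output:

    # /etc/ssh/sshd_config.d/51-ssh-harden_202412.conf
    KexAlgorithms sntrup761x25519-sha512@openssh.com,curve25519-sha256,curve25519-sha256@libssh.org
    Ciphers chacha20-poly1305@openssh.com,aes256-gcm@openssh.com,aes128-gcm@openssh.com,aes256-ctr,aes192-ctr,aes128-ctr
    MACs hmac-sha2-512-etm@openssh.com,hmac-sha2-256-etm@openssh.com,umac-128-etm@openssh.com
    HostKeyAlgorithms ssh-ed25519,ssh-ed25519-cert-v01@openssh.com,rsa-sha2-512,rsa-sha2-256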

This snippet creates a configuration file (51-ssh-harden_202412.conf) in the directory /etc/ssh/sshd_config.d/ with the specified settings to enhance the security of your SSH server.

(SSHD restart required)



Example for RHEL 9.4

(Note: This is just an example. It eliminates vulnerabilities for the SSH daemon, but it may well be that this snippet does not fit your setup. Handle with care.)
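On RHEL 9 the SSH algorithm set is largely governed by the system-wide crypto policies, so one hedged option, instead of or in addition to an sshd_config.d drop-in, is to tighten the policy, for example by disabling SHA-1-based algorithms:

    sudo update-crypto-policies --set DEFAULT:NO-SHA1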

(SSHD restart required)

Prove the remediation

Run ssh-audit again!

Example output after remediation:

How can I test whether my SSH client is vulnerable?

If you run ssh-audit with the -c switch, it starts an SSH service on port 2222 and audits every connection attempt:
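For example, start the client audit on the machine where ssh-audit is installed (2222 is the default listening port):

    ssh-audit -c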

Output after the login attempt (ssh 127.0.0.1 -p 2222):


Make your SSH communication more secure; otherwise the SSH service opens an attack surface for uninvited visitors.
Secure SSH configuration is key!

Consider additional security steps like:

  • Secure your SSH communication with certificates
  • Lab setup: Secure your SSH communication with certificates
  • Fail2Ban: ban hosts that cause multiple authentication errors
