A simple container is used only to install the Laravel dependencies.

After cloning the project, execute these commands only once:

    cd api
    cp .env-example .env
    docker run --rm -v $(pwd):/app composer:2.0.7 install
    cd ..
    ./create_volumes.sh
    docker-compose up
    docker-compose exec api php artisan migrate:fresh --seed

The first `docker run` command creates a throwaway container just to install the required API dependencies.
The `create_volumes.sh` script creates the necessary folder structure and extracts the baseline Elasticsearch index.
Finally, the last command creates and populates the database.

## On Keycloak authentication

Update the `KEYCLOAK_REALM_PUBLIC_KEY` value in `.env`.
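If you script the environment setup, the key can be patched in place. Below is a minimal sketch; the `set_env_value` helper is an illustration (not part of the project), and in practice you would point it at `api/.env` and paste the realm public key copied from the Keycloak admin console:

```python
import re
import tempfile
from pathlib import Path

def set_env_value(env_path: str, key: str, value: str) -> None:
    """Replace (or append) a KEY=VALUE entry in a dotenv-style file."""
    path = Path(env_path)
    lines = path.read_text().splitlines() if path.exists() else []
    pattern = re.compile(rf"^{re.escape(key)}=")
    for i, line in enumerate(lines):
        if pattern.match(line):
            lines[i] = f"{key}={value}"  # overwrite the existing entry
            break
    else:
        lines.append(f"{key}={value}")   # key was absent: append it
    path.write_text("\n".join(lines) + "\n")

# Demo on a scratch file; in the project, use env_path="api/.env".
with tempfile.TemporaryDirectory() as tmp:
    env = Path(tmp) / ".env"
    env.write_text("APP_ENV=local\nKEYCLOAK_REALM_PUBLIC_KEY=old\n")
    set_env_value(str(env), "KEYCLOAK_REALM_PUBLIC_KEY", "MIIBIjANBg...")
    updated = env.read_text()
```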

## MLTD proof-of-concept experiment

MLTD comes with an already trained model.
The model was trained on the data that were available in TimescaleDB (tables `XLSIEM`, `ADT`).
The training data are provided as CSV files (`xlsiem.csv`, `adt.csv`) in the `MLTD/csv_files` directory so the results can be reproduced.
To train a model, execute the following POST request with the body shown below:

    http://localhost:5000/api/v1.0/mltd/training
    {
        "description": "CUREX data",
        "timedb_host": "<the timescaleDB host>",
        "timedb_port": 5432,
        "timedb_username": "<the timescaleDB username>",
        "timedb_password": "<the timescaleDB password>",
        "timedb_ssl": "False",
        "timedb_dbname": "kea",
        "asset_id": "server",
        "timedb_adt_table": "adt",
        "timedb_xlsiem_table": "xlsiem",
        "timedb_od_table": "od",
        "timedb_measurement": "artificial_events",
        "mp_thres_X": 10,
        "mp_thres_Y": 2,
        "mp_thres_Z": 10,
        "mp_pat_length": 6,
        "rre": "True",
        "rfe": "True",
        "kofe": "False",
        "mil_over": "True",
        "fs": "False",
        "rf_s": 0.06,
        "rf_midpoint": "2H",
        "hours_before": "4H",
        "time_segments": "20T",
        "dates": []
    }

To obtain the top-k important features, use the following request, where `1` is the trained model id:

    http://127.0.0.1:5000/api/v1.0/mltd/threat-identification/1/
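Both MLTD calls can be scripted. The sketch below only builds the requests with Python's standard library; actually sending them (`urllib.request.urlopen`) requires the stack to be running, and only a subset of the training body is shown with placeholder values:

```python
import json
from urllib import request

BASE = "http://localhost:5000/api/v1.0/mltd"

# Subset of the training body documented above; fill in the remaining
# fields and real TimescaleDB credentials before sending.
payload = {
    "description": "CUREX data",
    "timedb_host": "<the timescaleDB host>",
    "timedb_port": 5432,
    "timedb_dbname": "kea",
    "asset_id": "server",
    "mp_pat_length": 6,
    "dates": [],
}

def build_post(url: str, body: dict) -> request.Request:
    """Prepare a JSON POST request without sending it."""
    return request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

train_req = build_post(f"{BASE}/training", payload)
# When the service is up: response = request.urlopen(train_req)

# Afterwards, the top-k features of trained model 1 live at:
topk_url = f"{BASE}/threat-identification/1/"
```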

## OD pcap files

Inside the OD directory we provide the pcap files used for load testing.

To upload a pcap file for analysis, follow these steps.
First, start an OD task with the following POST request and body:

    http://localhost:9091/api/v1/od
    {
        "timeDb_database": "kea",
        "timeDb_host": "<the timescaleDB host>",
        "timeDb_password": "<the timescaleDB password>",
        "timeDb_port": "5432",
        "timeDb_ssl": "true",
        "timeDb_table": "od",
        "timeDb_username": "postgres",
        "k": "20",
        "measurement": "packets-loss",
        "mqtt_host": "localhost",
        "mqtt_password": "",
        "mqtt_port": "1883",
        "mqtt_topic": "auth/od",
        "mqtt_usermane": "",
        "outlier_life": "0",
        "r": "0.1",
        "slide": "10",
        "w": "60"
    }

Take the returned OD task id and execute the following POST request to upload a pcap file:

    http://127.0.0.1:9091/api/v1/od/analyse/<OD task id>
    Header: Content-Type: application/json
    Body: file=big.pcap
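The two OD calls can be sketched the same way. This only prepares the requests (no network I/O); the JSON-body reading of `file=big.pcap` and the task id `7` are assumptions for illustration, and only a subset of the task body is repeated:

```python
import json
from urllib import request

OD_BASE = "http://localhost:9091/api/v1/od"

# Subset of the task body documented above; complete it with the
# TimescaleDB and MQTT settings before sending.
task_body = {
    "timeDb_database": "kea",
    "timeDb_table": "od",
    "measurement": "packets-loss",
    "k": "20",
    "r": "0.1",
    "slide": "10",
    "w": "60",
}

def build_post(url: str, body: dict) -> request.Request:
    """Prepare a JSON POST request without sending it."""
    return request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

create_req = build_post(OD_BASE, task_body)
# Suppose the create call returned task id 7; the upload request is then:
upload_req = build_post(f"{OD_BASE}/analyse/7", {"file": "big.pcap"})
```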
