##### Simple container just to install Laravel dependencies

After cloning the project, execute these commands only once:

```
cd api
cp .env-example .env
docker run --rm -v $(pwd):/app composer:2.0.7 install
cd ..
./create_volumes.sh
docker-compose up
docker-compose exec api php artisan migrate:fresh --seed
```

The first docker command creates a throwaway container just to install the required API dependencies. The create_volumes.sh script creates the necessary folder structure and extracts the baseline Elasticsearch index. Finally, the last docker command creates and populates the database.

## On Keycloak authentication

Keycloak authentication is enabled by setting the AUTH_ENABLED variable in the .env file to true (no quotes needed). You will also need to update the KEYCLOAK_REALM_PUBLIC_KEY value in your .env file.

##### MLTD proof of concept experiment

MLTD comes with a model already trained. The model was trained on the data available in TimescaleDB (tables XLSIEM, ADT). The training data are provided in the CSV files "xlsiem.csv" and "adt.csv" (directory MLTD/csv_files) so that the results can be reproduced.

To train a model, execute the following POST request with the body shown:

```
http://localhost:5000/api/v1.0/mltd/training

{
    "description": "CUREX data",
    "timedb_host": "",
    "timedb_port": 5432,
    "timedb_username": "",
    "timedb_password": "",
    "timedb_ssl": "False",
    "timedb_dbname": "kea",
    "asset_id": "server",
    "timedb_adt_table": "adt",
    "timedb_xlsiem_table": "xlsiem",
    "timedb_od_table": "od",
    "timedb_measurement": "artificial_events",
    "mp_thres_X": 10,
    "mp_thres_Y": 2,
    "mp_thres_Z": 10,
    "mp_pat_length": 6,
    "rre": "True",
    "rfe": "True",
    "kofe": "False",
    "mil_over": "True",
    "fs": "False",
    "rf_s": 0.06,
    "rf_midpoint": "2H",
    "hours_before": "4H",
    "time_segments": "20T",
    "dates": []
}
```

To obtain the top-k important features, use the following request, where 1 is the id of the trained model:

```
http://127.0.0.1:5000/api/v1.0/mltd/threat-identification/1/
```

##### OD pcap files

Inside the OD directory we provide the pcap files used for load testing. To upload a pcap file for analysis, follow these steps.

First, start an OD task with the following POST request and body:

```
http://localhost:9091/api/v1/od

{
    "timeDb_database": "kea",
    "timeDb_host": "",
    "timeDb_password": "",
    "timeDb_port": "5432",
    "timeDb_ssl": "true",
    "timeDb_table": "od",
    "timeDb_username": "postgres",
    "k": "20",
    "measurement": "packets-loss",
    "mqtt_host": "localhost",
    "mqtt_password": "",
    "mqtt_port": "1883",
    "mqtt_topic": "auth/od",
    "mqtt_usermane": "",
    "outlier_life": "0",
    "r": "0.1",
    "slide": "10",
    "w": "60"
}
```

Get the returned OD task id and execute the following POST request to upload a pcap file:

```
http://127.0.0.1:9091/api/v1/od/analyse/
Header:
Content-Type: application/json
Body:
file=big.pcap
```
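For convenience, the two OD requests above can also be issued from the command line with curl. The sketch below is only an illustration: it assumes the OD service listens on the ports shown above, that the JSON body has been saved to a local file named od_task.json (a hypothetical name), and that the returned task id is appended to the analyse URL.

```
# Start an OD task; od_task.json holds the JSON body shown above (hypothetical file name)
curl -X POST -H "Content-Type: application/json" \
     -d @od_task.json http://localhost:9091/api/v1/od

# Upload a pcap file for analysis, using the task id returned by the previous call
# (here assumed to be 1 and appended to the URL; adjust to the actual API behaviour)
curl -X POST -H "Content-Type: application/json" \
     -d "file=big.pcap" http://127.0.0.1:9091/api/v1/od/analyse/1
```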