
Slips v1.0.10 #438

Merged
36 commits, merged on Jan 15, 2024
Commits
2444830
change the attacker direction of HTTP traffic, C&C channel, and conn …
AlyaGomaa Dec 16, 2023
f3c4dce
threat_intelligence.py: refactor pre_main()
AlyaGomaa Dec 18, 2023
af0046d
p2p_docker: fix problem cloning submodules [skip-ci]
AlyaGomaa Dec 18, 2023
d5b1e0b
store the accumulated threat level of each profileid and twid in the …
AlyaGomaa Dec 18, 2023
4f95ea9
evidence: reset the accumulated threat level of each profileid and tw…
AlyaGomaa Dec 18, 2023
7559d4e
evidence: add a method to check for filtered evidence only
AlyaGomaa Dec 18, 2023
bc73552
evidence.py: only get all the twid evidence if there's an alert
AlyaGomaa Dec 18, 2023
19fcbc8
Merge pull request #435 from stratosphereips/alya/store_acc_threat_le…
AlyaGomaa Dec 19, 2023
1cdce61
output.py: better handling of the termination of PBar
AlyaGomaa Dec 20, 2023
63ddd15
problem setting "vertical portscan" evidence detected by zeek [skip-ci]
AlyaGomaa Dec 20, 2023
6bb23c4
Merge pull request #436 from stratosphereips/alya/fix-problem-stoppin…
AlyaGomaa Dec 20, 2023
4668219
docker for macos m1 and m1 p2p, change the requirements to include te…
eldraco Jan 3, 2024
e481af3
docker for macos m1. Fix the Dockerfile and the requirement.txt
eldraco Jan 4, 2024
6995c4f
handle unable to do rdap lookup
AlyaGomaa Jan 5, 2024
b7f482f
Merge pull request #437 from stratosphereips/alya/fix_ipwhois_issue
AlyaGomaa Jan 5, 2024
e4a7d0b
fix printing the profilied in the accumulated_threat_level field in a…
AlyaGomaa Jan 9, 2024
b447945
p2p_docker: fix problem cloning submodules [skip-ci]
AlyaGomaa Dec 18, 2023
afa5ccc
change the attacker direction of HTTP traffic, C&C channel, and conn …
AlyaGomaa Dec 16, 2023
d760b24
threat_intelligence.py: refactor pre_main()
AlyaGomaa Dec 18, 2023
70f2663
store the accumulated threat level of each profileid and twid in the …
AlyaGomaa Dec 18, 2023
f6ab206
evidence: reset the accumulated threat level of each profileid and tw…
AlyaGomaa Dec 18, 2023
ff20b3d
evidence: add a method to check for filtered evidence only
AlyaGomaa Dec 18, 2023
4e4059a
evidence.py: only get all the twid evidence if there's an alert
AlyaGomaa Dec 18, 2023
80952e8
problem setting "vertical portscan" evidence detected by zeek [skip-ci]
AlyaGomaa Dec 20, 2023
23ed9d6
output.py: better handling of the termination of PBar
AlyaGomaa Dec 20, 2023
bf22695
handle unable to do rdap lookup
AlyaGomaa Jan 5, 2024
6981611
fix printing the profilied in the accumulated_threat_level field in a…
AlyaGomaa Jan 9, 2024
82129fc
Merge remote-tracking branch 'origin/develop' into develop
AlyaGomaa Jan 15, 2024
d562f3b
fix stopping slips daemon
AlyaGomaa Jan 15, 2024
1597ce0
update slips.gif
AlyaGomaa Jan 15, 2024
3c6e230
update CHANGELOG.md with v1.0.10 changes
AlyaGomaa Jan 15, 2024
c1c18e3
bump slips version to 1.0.10
AlyaGomaa Jan 15, 2024
1deca5d
fix typos in CHANGELOG.md
AlyaGomaa Jan 15, 2024
7d90eaa
CI-prod-testing: run "apt-get update --fix-missing" before installing…
AlyaGomaa Jan 15, 2024
5eb395d
remove the test_daemon.py since it has no unit tests
AlyaGomaa Jan 15, 2024
29d17da
force-checkout to develop in CI-prod-testing
AlyaGomaa Jan 15, 2024
24 changes: 8 additions & 16 deletions .github/workflows/CI-production-testing.yml
@@ -23,7 +23,7 @@ jobs:
fetch-depth: ''

- name: Install slips dependencies
run: sudo apt-get -y --no-install-recommends install python3 redis-server python3-pip python3-certifi python3-dev build-essential file lsof net-tools iproute2 iptables python3-tzlocal nfdump tshark git whois golang nodejs notify-osd yara libnotify-bin
run: sudo apt-get update --fix-missing && sudo apt-get -y --no-install-recommends install python3 redis-server python3-pip python3-certifi python3-dev build-essential file lsof net-tools iproute2 iptables python3-tzlocal nfdump tshark git whois golang nodejs notify-osd yara libnotify-bin

- name: Install Zeek
run: |
@@ -47,7 +47,7 @@ jobs:
run: redis-server --daemonize yes

- name: Run unit tests
run: python3 -m pytest tests/ --ignore="tests/test_daemon.py" --ignore="tests/test_database.py" --ignore="tests/integration_tests" -n 7 -p no:warnings -vv -s
run: python3 -m pytest tests/ --ignore="tests/test_database.py" --ignore="tests/integration_tests" -n 7 -p no:warnings -vv -s

- name: Run database unit tests
run: python3 -m pytest tests/test_database.py -p no:warnings -vv
@@ -121,29 +121,21 @@ jobs:
image: stratosphereips/slips:latest
run: |
git reset --hard
git pull & git checkout origin/develop
git pull & git checkout -f origin/develop
redis-server --daemonize yes
python3 -m pytest tests/ --ignore="tests/test_daemon.py" --ignore="tests/test_database.py" --ignore="tests/integration_tests" -n 7 -p no:warnings -vv -s
python3 -m pytest tests/ --ignore="tests/test_database.py" --ignore="tests/integration_tests" -n 7 -p no:warnings -vv -s

- name: Run database tests inside docker
uses: addnab/docker-run-action@v3
with:
image: stratosphereips/slips:latest
run: |
git reset --hard
git pull & git checkout origin/develop
git pull & git checkout -f origin/develop
redis-server --daemonize yes
python3 -m pytest tests/test_database.py -p no:warnings -vv

- name: Run daemon tests inside docker
uses: addnab/docker-run-action@v3
with:
image: stratosphereips/slips:latest
run: |
git reset --hard
git pull & git checkout origin/develop
redis-server --daemonize yes
python3 -m pytest tests/test_daemon.py -p no:warnings -vv


- name: Run integration tests inside docker
uses: addnab/docker-run-action@v3
@@ -154,7 +146,7 @@
options: -v ${{ github.workspace }}/output:/StratosphereLinuxIPS/output
run: |
git reset --hard
git pull & git checkout origin/develop
git pull & git checkout -f origin/develop
redis-server --daemonize yes
python3 -m pytest -s tests/integration_tests/test_dataset.py -p no:warnings -vv

@@ -167,7 +159,7 @@
options: -v ${{ github.workspace }}/output:/StratosphereLinuxIPS/output
run: |
git reset --hard
git pull & git checkout origin/develop
git pull & git checkout -f origin/develop
redis-server --daemonize yes
python3 -m pytest -s tests/integration_tests/test_config_files.py -p no:warnings -vv

9 changes: 9 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,12 @@
- 1.0.10 (January 2024)
- Faster ensembling of evidence.
- Log accumulated threat levels of each evidence in alerts.json.
- Better handling of the termination of the progress bar.
- Re-add support for tensorflow to the dockers for macOS M1 and macOS M1 P2P.
- Fix problem setting 'vertical portscan' evidence detected by Zeek.
- Fix unable to do RDAP lookups
- Fix stopping Slips daemon.

- 1.0.9 (December 2023)
- Fix using -k to kill opened redis servers.
- Better README and docs.
5 changes: 2 additions & 3 deletions README.md
@@ -1,5 +1,5 @@
<h1 align="center">
Slips v1.0.9
Slips v1.0.10
</h1>


@@ -51,8 +51,7 @@ Slips v1.0.9

Slips is a powerful endpoint behavioral intrusion prevention and detection system that uses machine learning to detect malicious behaviors in network traffic. Slips can work with network traffic in real-time, PCAP files, and network flows from popular tools like Suricata, Zeek/Bro, and Argus. Slips threat detection is based on a combination of machine learning models trained to detect malicious behaviors, 40+ threat intelligence feeds, and expert heuristics. Slips gathers evidence of malicious behavior and uses extensively trained thresholds to trigger alerts when enough evidence is accumulated.

<img src="https://raw.githubusercontent.com/stratosphereips/StratosphereLinuxIPS/develop/docs/images/slips.gif" width="850px"
title="Slips in action.">
<img src="https://raw.githubusercontent.com/stratosphereips/StratosphereLinuxIPS/develop/docs/images/slips.gif" width="850px" title="Slips in action.">



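The README paragraph above describes Slips' core detection loop: evidence is gathered per profile and time window, and an alert fires once the accumulated, confidence-weighted threat level crosses a trained threshold. A minimal sketch of that accumulation idea, assuming illustrative names and a made-up threshold rather than Slips' real API:

```python
# Minimal sketch of threshold-based alerting from accumulated evidence.
# The Evidence fields and ALERT_THRESHOLD below are illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class Evidence:
    description: str
    threat_level: float   # 0.0 - 1.0
    confidence: float     # 0.0 - 1.0

ALERT_THRESHOLD = 0.25    # hypothetical per-time-window threshold

def should_alert(evidence: List[Evidence]) -> bool:
    """Alert once the confidence-weighted threat levels add up past the threshold."""
    accumulated = sum(e.threat_level * e.confidence for e in evidence)
    return accumulated >= ALERT_THRESHOLD

if __name__ == "__main__":
    window = [
        Evidence("Unencrypted HTTP traffic", threat_level=0.1, confidence=0.6),
        Evidence("Vertical port scan", threat_level=0.3, confidence=0.9),
    ]
    print(should_alert(window))  # True: 0.06 + 0.27 = 0.33 >= 0.25
```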
2 changes: 1 addition & 1 deletion VERSION
@@ -1 +1 @@
1.0.9
1.0.10
7 changes: 3 additions & 4 deletions docker/P2P-image/Dockerfile
@@ -10,8 +10,6 @@ ENV IS_IN_A_DOCKER_CONTAINER True
ENV SLIPS_DIR /StratosphereLinuxIPS




# Install wget and add Zeek repository to our sources.
RUN apt update && apt install -y --no-install-recommends \
wget \
@@ -48,9 +46,10 @@ RUN apt update && apt install -y --no-install-recommends \
&& ln -s /opt/zeek/bin/zeek /usr/local/bin/bro


RUN git clone --recurse-submodules --remote-submodules https://github.com/stratosphereips/StratosphereLinuxIPS/ ${SLIPS_DIR}/
# Switch to Slips installation dir when login.
RUN git clone https://github.com/stratosphereips/StratosphereLinuxIPS/ ${SLIPS_DIR}/
WORKDIR ${SLIPS_DIR}
RUN git submodule sync && git pull --recurse-submodules
# Switch to Slips installation dir when login.
RUN chmod 774 slips.py && git submodule init && git submodule update


Binary file modified docs/images/slips.gif
5 changes: 3 additions & 2 deletions managers/process_manager.py
@@ -547,8 +547,6 @@ def shutdown_gracefully(self):
analysis_time = self.get_analysis_time()
self.main.print(f"Analysis of {self.main.input_information} "
f"finished in {analysis_time:.2f} minutes")
flows_count: int = self.main.db.get_flows_count()
self.main.print(f"Total flows read (without altflows): {flows_count}", log_to_logfiles_only=True)

graceful_shutdown = True
if self.main.mode == 'daemonized':
@@ -563,6 +561,9 @@
self.main.daemon.delete_pidfile()

else:
flows_count: int = self.main.db.get_flows_count()
self.main.print(f"Total flows read (without altflows): {flows_count}", log_to_logfiles_only=True)

hitlist: Tuple[List[Process], List[Process]] = self.get_hitlist_in_order()
to_kill_first: List[Process] = hitlist[0]
to_kill_last: List[Process] = hitlist[1]
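The process_manager.py hunks above move the total-flows summary into the non-daemonized branch of shutdown_gracefully(), so daemon runs only clean up their pidfile. A rough sketch of the resulting control flow, with `main` as a stand-in for the real object and its attributes:

```python
# Rough sketch of the reordered shutdown logic; 'main' is a stand-in object,
# not the actual Slips Main instance.
def shutdown_gracefully(main):
    if main.mode == 'daemonized':
        # daemon mode: only remove the pidfile, no flow summary is printed
        main.daemon.delete_pidfile()
    else:
        # interactive mode: report the flow count, then terminate children
        flows_count = main.db.get_flows_count()
        main.print(f"Total flows read (without altflows): {flows_count}")
```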
22 changes: 13 additions & 9 deletions modules/flowalerts/flowalerts.py
@@ -2042,10 +2042,12 @@ def main(self):

if msg := self.get_msg('tw_closed'):
profileid_tw = msg['data'].split('_')
profileid, twid = f'{profileid_tw[0]}_{profileid_tw[1]}', profileid_tw[-1]
profileid = f'{profileid_tw[0]}_{profileid_tw[1]}',
twid = profileid_tw[-1]
self.detect_data_upload_in_twid(profileid, twid)

# --- Detect DNS issues: 1) DNS resolutions without connection, 2) DGA, 3) young domains, 4) ARPA SCANs
# --- Detect DNS issues: 1) DNS resolutions without
# connection, 2) DGA, 3) young domains, 4) ARPA SCANs
if msg:= self.get_msg('new_dns'):
data = json.loads(msg['data'])
profileid = data['profileid']
@@ -2060,8 +2062,10 @@
rcode_name = flow_data.get('rcode_name', False)
stime = data.get('stime', False)

# only check dns without connection if we have answers(we're sure the query is resolved)
# sometimes we have 2 dns flows, 1 for ipv4 and 1 fo ipv6, both have the
# only check dns without connection if we have
# answers(we're sure the query is resolved)
# sometimes we have 2 dns flows, 1 for ipv4 and
# 1 fo ipv6, both have the
# same uid, this causes FP dns without connection,
# so make sure we only check the uid once
if answers and uid not in self.connections_checked_in_dns_conn_timer_thread:
@@ -2089,12 +2093,12 @@
domain, stime, profileid, twid, uid
)

if msg:= self.get_msg('new_downloaded_file'):
if msg := self.get_msg('new_downloaded_file'):
ssl_info = json.loads(msg['data'])
self.check_malicious_ssl(ssl_info)

# --- Detect Bad SMTP logins ---
if msg:= self.get_msg('new_smtp'):
if msg := self.get_msg('new_smtp'):
smtp_info = json.loads(msg['data'])
profileid = smtp_info['profileid']
twid = smtp_info['twid']
@@ -2106,7 +2110,7 @@
flow
)
# --- Detect multiple used SSH versions ---
if msg:= self.get_msg('new_software'):
if msg := self.get_msg('new_software'):
msg = json.loads(msg['data'])
flow:dict = msg['sw_flow']
twid = msg['twid']
@@ -2121,10 +2125,10 @@
role='SSH::SERVER'
)

if msg:=self.get_msg('new_weird'):
if msg := self.get_msg('new_weird'):
msg = json.loads(msg['data'])
self.check_weird_http_method(msg)

if msg:= self.get_msg('new_tunnel'):
if msg := self.get_msg('new_tunnel'):
msg = json.loads(msg['data'])
self.check_GRE_tunnel(msg)
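The flowalerts.py hunks above mostly reflow long comments and normalize the `msg := self.get_msg(...)` walrus assignments that route each Redis channel's JSON payload to a handler. A stripped-down, runnable sketch of that consume-and-dispatch pattern; the class and handler here are stand-ins, not the module's real methods:

```python
# Consume-and-dispatch sketch in the style of flowalerts.main().
# FakeModule and check_weird_http_method are illustrative stand-ins.
import json

class FakeModule:
    def __init__(self, inbox):
        self.inbox = inbox  # {channel_name: raw_message}

    def get_msg(self, channel):
        return self.inbox.pop(channel, None)

    def check_weird_http_method(self, msg: dict):
        print("weird HTTP method:", msg.get("name"))

    def process_once(self):
        # walrus operator: assign and test the message in one expression
        if msg := self.get_msg('new_weird'):
            msg = json.loads(msg['data'])
            self.check_weird_http_method(msg)

module = FakeModule({'new_weird': {'data': json.dumps({'name': 'unknown_HTTP_method'})}})
module.process_once()  # prints: weird HTTP method: unknown_HTTP_method
```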
2 changes: 1 addition & 1 deletion modules/flowalerts/set_evidence.py
@@ -672,7 +672,7 @@ def set_evidence_vertical_portscan(
source_target_tag = 'Recon'
conn_count = int(msg.split('least ')[1].split(' unique')[0])
attacker = scanning_ip
victim = msg.splt('ports of ')[-1]
victim = msg.split('ports of ')[-1]
self.db.setEvidence(
evidence_type,
attacker_direction,
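The one-character fix above (`splt` → `split`) matters because both the connection count and the victim are sliced out of the Zeek notice text. A quick standalone check of that parsing, using a hypothetical notice string that only mirrors the expected shape:

```python
# Hypothetical Zeek-style notice text; the real message comes from Zeek's
# vertical-scan notice, this string only imitates its expected shape.
msg = "192.168.1.5 scanned at least 60 unique ports of 10.0.0.7"

conn_count = int(msg.split('least ')[1].split(' unique')[0])
victim = msg.split('ports of ')[-1]

print(conn_count, victim)  # 60 10.0.0.7
```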
2 changes: 1 addition & 1 deletion modules/http_analyzer/http_analyzer.py
@@ -421,7 +421,7 @@ def set_evidence_http_traffic(self, daddr, profileid, twid, uid, timestamp):
source_target_tag = 'SendingUnencryptedData'
category = 'Anomaly.Traffic'
evidence_type = 'HTTPtraffic'
attacker_direction = 'dstip'
attacker_direction = 'srcip'
attacker = daddr
saddr = profileid.split('_')[-1]
description = f'Unencrypted HTTP traffic from {saddr} to {daddr}.'
13 changes: 8 additions & 5 deletions modules/ip_info/asn_info.py
@@ -91,14 +91,17 @@ def get_asn_info_from_geolite(self, ip) -> dict:

return ip_info

def cache_ip_range(self, ip):
def cache_ip_range(self, ip: str):
"""
Get the range of the given ip and
cache the asn of the whole ip range
"""
if not ip:
return False

try:
# Cache the range of this ip
whois_info = ipwhois.IPWhois(address=ip).lookup_rdap()
whois_info: dict = ipwhois.IPWhois(address=ip).lookup_rdap()
asnorg = whois_info.get('asn_description', False)
asn_cidr = whois_info.get('asn_cidr', False)
asn_number = whois_info.get('asn', False)
@@ -115,12 +118,12 @@ def cache_ip_range(self, ip):
except (
ipwhois.exceptions.IPDefinedError,
ipwhois.exceptions.HTTPLookupError,
ipwhois.exceptions.ASNRegistryError,
ipwhois.exceptions.ASNParseError,
):
# private ip or RDAP lookup failed. don't cache
# or ASN lookup failed with no more methods to try
return False
except ipwhois.exceptions.ASNRegistryError:
# ASN lookup failed with no more methods to try
pass


def get_asn_online(self, ip):
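The asn_info.py change widens the exceptions caught around the RDAP lookup so a failed lookup no longer crashes the module (the "handle unable to do rdap lookup" commit). A self-contained sketch of that pattern; collapsing all the listed ipwhois exceptions into one clause and the shape of the returned dict are simplifications, not Slips' exact code:

```python
# RDAP/ASN lookup that degrades gracefully when it fails. The exception
# classes are the ones handled in the diff above; grouping them together
# and the returned dict layout are assumptions for illustration.
import ipwhois

def rdap_asn_info(ip: str) -> dict:
    try:
        whois_info: dict = ipwhois.IPWhois(address=ip).lookup_rdap()
    except (
        ipwhois.exceptions.IPDefinedError,    # private/reserved address
        ipwhois.exceptions.HTTPLookupError,   # RDAP endpoint unreachable/failed
        ipwhois.exceptions.ASNRegistryError,  # no more lookup methods to try
        ipwhois.exceptions.ASNParseError,     # malformed ASN data
    ):
        return {}
    return {
        'org': whois_info.get('asn_description', ''),
        'number': whois_info.get('asn', ''),
        'range': whois_info.get('asn_cidr', ''),
    }

# rdap_asn_info('8.8.8.8') returns org/ASN/CIDR info, or {} if the lookup fails.
```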
2 changes: 1 addition & 1 deletion modules/rnn_cc_detection/rnn_cc_detection.py
@@ -42,7 +42,7 @@ def set_evidence(

tupleid = tupleid.split('-')
dstip, port, proto = tupleid[0], tupleid[1], tupleid[2]
attacker_direction = 'dstip'
attacker_direction = 'srcip'
attacker = dstip
source_target_tag = 'Botnet'
evidence_type = 'Command-and-Control-channels-detection'
21 changes: 14 additions & 7 deletions modules/threat_intelligence/threat_intelligence.py
@@ -106,7 +106,7 @@ def set_evidence_malicious_asn(
"""
:param asn_info: the malicious asn info taken from own_malicious_iocs.csv
"""
attacker_direction = 'dstip'
attacker_direction = 'srcip'
category = 'Anomaly.Traffic'
evidence_type = 'ThreatIntelligenceBlacklistedASN'
confidence = 0.8
@@ -161,16 +161,18 @@ def set_evidence_malicious_ip(

confidence = 1
category = 'Anomaly.Traffic'
if 'src' in attacker_direction:
if 'src' in ip_state:
direction = 'from'
opposite_dir = 'to'
victim = daddr
elif 'dst' in attacker_direction:
attacker_direction = 'srcip'
elif 'dst' in ip_state:
direction = 'to'
opposite_dir = 'from'
victim = profileid.split("_")[-1]
attacker_direction = 'srcip'
else:
# attacker_dir is not specified?
# ip_state is not specified?
return


@@ -997,9 +999,14 @@ def pre_main(self):
# Load the local Threat Intelligence files that are
# stored in the local folder self.path_to_local_ti_files
# The remote files are being loaded by the update_manager
self.update_local_file('own_malicious_iocs.csv')
self.update_local_file('own_malicious_JA3.csv')
self.update_local_file('own_malicious_JARM.csv')
local_files = (
'own_malicious_iocs.csv',
'own_malicious_JA3.csv',
'own_malicious_JARM.csv',
)
for local_file in local_files:
self.update_local_file(local_file)

self.circllu_calls_thread.start()

def main(self):
12 changes: 12 additions & 0 deletions slips_files/core/database/database_manager.py
@@ -127,6 +127,18 @@ def get_output_dir(self, *args, **kwargs):
def get_input_file(self, *args, **kwargs):
return self.rdb.get_input_file(*args, **kwargs)


def get_accumulated_threat_level(self, *args, **kwargs):
return self.rdb.get_accumulated_threat_level(*args, **kwargs)


def set_accumulated_threat_level(self, *args, **kwargs):
return self.rdb.set_accumulated_threat_level(*args, **kwargs)


def update_accumulated_threat_level(self, *args, **kwargs):
return self.rdb.update_accumulated_threat_level(*args, **kwargs)

def setInfoForIPs(self, *args, **kwargs):
return self.rdb.setInfoForIPs(*args, **kwargs)

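The three DBManager pass-throughs added above expose the per-profile, per-time-window accumulated threat level that the evidence process now stores in the database (see the "store the accumulated threat level..." commits). A hedged sketch of how such accessors could fit together; the argument order, threshold, and reset-to-zero step are assumptions, not the real Slips signatures:

```python
# Assumed usage of the new accessors; argument order, ALERT_THRESHOLD, and
# the reset step are illustrative, not taken from Slips' source.
ALERT_THRESHOLD = 0.25

def on_new_evidence(db, profileid: str, twid: str,
                    threat_level: float, confidence: float) -> bool:
    # add this evidence's weighted score to the running total for the window
    db.update_accumulated_threat_level(profileid, twid, threat_level * confidence)
    accumulated: float = db.get_accumulated_threat_level(profileid, twid)

    if accumulated >= ALERT_THRESHOLD:
        # an alert would be raised here; start the next window's count from zero
        db.set_accumulated_threat_level(profileid, twid, 0)
        return True
    return False
```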