This repository has been archived by the owner on Sep 12, 2024. It is now read-only.

Commit

merge conflict resolved
ChartistDev committed Jun 14, 2022
2 parents 8f148ac + 2e7448e commit e3c234c
Showing 112 changed files with 5,019 additions and 2,102 deletions.
63 changes: 43 additions & 20 deletions .env
@@ -2,31 +2,53 @@

# User Configurable Variables
## Webapp URL
CHAOSGENIUS_WEBAPP_URL=http://localhost:8080/ # URL of the Chaos Genius deployment. Usually, this will be http://<ip-address-or-hostname>:8080/ (http://localhost:8080/ in local installations).
# URL of the Chaos Genius deployment. Usually, this will be http://<ip-address-or-hostname>:8080/ (http://localhost:8080/ in local installations).
CHAOSGENIUS_WEBAPP_URL=http://localhost:8080/

## Analytics
### Common Analytics Configuration
DAYS_OFFSET_FOR_ANALTYICS=2 # Sets the days offset from the current date till which your KPI's will run for.
HOURS_OFFSET_FOR_ANALTYICS=0 # Sets the hours offset from the latest data point till which Anomaly Detection will run for your KPI.
TIMEZONE=UTC # Timezone on which all your analytics are reported.
METADATA_SYNC_TIME=03:00 # Synctime for your metadata
# Sets the days offset from the current date till which your KPI's will run for.
DAYS_OFFSET_FOR_ANALTYICS=2
# Sets the hours offset from the latest data point till which Anomaly Detection will run for your KPI.
HOURS_OFFSET_FOR_ANALTYICS=0
# Timezone on which all your analytics are reported.
TIMEZONE=UTC
# Synctime for your metadata
METADATA_SYNC_TIME=03:00

### KPI Configuration
# Sets the maximum number of rows allowed for a KPI to be added.
MAX_ROWS_IN_KPI=10000000

### Anomaly Configuration
MULTIDIM_ANALYSIS_FOR_ANOMALY=False # Enables the generation of multi-dimensional subgroups.
MAX_SUBDIM_CARDINALITY=1000 # Sets the maximum number of unique values allowed in a dimension.
TOP_DIMENSIONS_FOR_ANOMALY_DRILLDOWN=10 # Sets the maximum number of dimensions shown in the Anomaly Drill Downs
MIN_DATA_IN_SUBGROUP=30 # The minimum population in a subgroup.
TOP_SUBDIMENSIONS_FOR_ANOMALY=10 # Sets the maximum number of sub-dimensions shown in the Anomaly Sub-dimensions page.
MAX_FILTER_SUBGROUPS_ANOMALY=250 # Sets the maximum number of subgroups considered for Anomaly Detection
MAX_ANOMALY_SLACK_DAYS=14 # Sets the maximum number of days for which we can have no data and still consider the KPI for Anomaly Detection.

### DeepDrills Configuration
MAX_ROWS_FOR_DEEPDRILLS=10000000 #Sets the maximum number of rows allowed for a KPI to be added.
MAX_DEEPDRILLS_SLACK_DAYS=14 # Sets the maximum number of days for which we can have no data and still consider the KPI for DeepDrills.
DEEPDRILLS_HTABLE_MAX_PARENTS=5 # Sets the maximum number of rows in the first level of the DeepDrills' drilldowns.
DEEPDRILLS_HTABLE_MAX_CHILDREN=5 # Sets the maximum number of rows in the subsequent levels of the DeepDrills' drilldowns.
DEEPDRILLS_HTABLE_MAX_DEPTH=3 # Sets the maximum depth of the drilldowns in DeepDrills.
DEEPDRILLS_ENABLED_TIME_RANGES=last_30_days,last_7_days,previous_day,month_on_month,month_to_date,week_on_week,week_to_date # Sets the enabled time ranges for which DeepDrills is computed as comma separated values.
# Enables the generation of multi-dimensional subgroups.
MULTIDIM_ANALYSIS_FOR_ANOMALY=False
# Sets the maximum number of unique values allowed in a dimension.
MAX_SUBDIM_CARDINALITY=1000
# Sets the maximum number of dimensions shown in the Anomaly Drill Downs.
TOP_DIMENSIONS_FOR_ANOMALY_DRILLDOWN=10
# The minimum population in a subgroup.
MIN_DATA_IN_SUBGROUP=30
# Sets the maximum number of sub-dimensions shown in the Anomaly Sub-dimensions page.
TOP_SUBDIMENSIONS_FOR_ANOMALY=10
# Sets the maximum number of subgroups considered for Anomaly Detection
MAX_FILTER_SUBGROUPS_ANOMALY=250
# Sets the maximum number of days for which we can have no data and still consider the KPI for Anomaly Detection.
MAX_ANOMALY_SLACK_DAYS=14

### Summary and DeepDrills Configuration
# Sets the maximum number of days for which we can have no data and still consider the KPI for Summary and DeepDrills.
MAX_SUMMARY_DEEPDRILLS_SLACK_DAYS=14
# Sets the enabled time ranges for which Summary and DeepDrills is computed as comma separated values.
SUMMARY_DEEPDRILLS_ENABLED_TIME_RANGES=last_30_days,last_7_days,previous_day,month_on_month,month_to_date,week_on_week,week_to_date
# Enables or disables DeepDrills.
DEEPDRILLS_ENABLED=False
# Sets the maximum number of rows in the first level of the DeepDrills' drilldowns.
DEEPDRILLS_HTABLE_MAX_PARENTS=5
# Sets the maximum number of rows in the subsequent levels of the DeepDrills' drilldowns.
DEEPDRILLS_HTABLE_MAX_CHILDREN=5
# Sets the maximum depth of the drilldowns in DeepDrills.
DEEPDRILLS_HTABLE_MAX_DEPTH=3

## Sentry Logging (leave empty to disable backend telemetry)
SENTRY_DSN=
@@ -47,6 +69,7 @@ FLASK_DEBUG=0
FLASK_RUN_PORT=5000
SECRET_KEY="t8GIEp8hWmR8y6VLqd6qQCMXzjRaKsx8nRruWNtFuec="
SEND_FILE_MAX_AGE_DEFAULT=31556926
CORS_ENABLED=False

### Database Configuration
DB_HOST=chaosgenius-db
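The comment-above style adopted in this diff also keeps the `.env` values easy to consume programmatically. A minimal sketch of reading two of the values above (the helper names are illustrative, not part of this commit; the defaults mirror the file):

```python
import os


def enabled_time_ranges(env=os.environ):
    """Split the comma-separated time-range list from the .env file."""
    raw = env.get(
        "SUMMARY_DEEPDRILLS_ENABLED_TIME_RANGES",
        "last_30_days,last_7_days,previous_day",
    )
    # Drop stray whitespace and empty entries from the comma-separated value.
    return [r.strip() for r in raw.split(",") if r.strip()]


def max_slack_days(env=os.environ):
    """Read the slack-day limit as an integer, defaulting to 14 as above."""
    return int(env.get("MAX_SUMMARY_DEEPDRILLS_SLACK_DAYS", "14"))
```

Parsing once at startup (rather than reading `os.environ` at each use) keeps behavior predictable across workers.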
7 changes: 4 additions & 3 deletions .env.local.template
@@ -42,13 +42,14 @@ MAX_SUBDIM_CARDINALITY=1000
TOP_DIMENSIONS_FOR_ANOMALY_DRILLDOWN=10
MIN_DATA_IN_SUBGROUP=30
TOP_SUBDIMENSIONS_FOR_ANOMALY=10
MAX_ROWS_FOR_DEEPDRILLS=10000000
MAX_ROWS_IN_KPI=10000000
MAX_FILTER_SUBGROUPS_ANOMALY=250
MAX_DEEPDRILLS_SLACK_DAYS=14
MAX_SUMMARY_DEEPDRILLS_SLACK_DAYS=14
MAX_ANOMALY_SLACK_DAYS=14
DAYS_OFFSET_FOR_ANALTYICS=2

SUMMARY_DEEPDRILLS_ENABLED_TIME_RANGES=last_30_days,last_7_days,previous_day
DEEPDRILLS_ENABLED=False
DEEPDRILLS_HTABLE_MAX_PARENTS=5
DEEPDRILLS_HTABLE_MAX_CHILDREN=5
DEEPDRILLS_HTABLE_MAX_DEPTH=3
DEEPDRILLS_ENABLED_TIME_RANGES=last_30_days,last_7_days,previous_day
6 changes: 5 additions & 1 deletion .gitignore
@@ -109,7 +109,7 @@ ipython_config.py
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat-schedule*
celerybeat.pid

# SageMath parsed files
@@ -345,3 +345,7 @@ docs/yarn-error.log*
########### Third Party Integrations ##########
.integrations.json
.connectors

########### Gitpod #########################

dump.rdb
10 changes: 10 additions & 0 deletions .gitpod.dockerfile
@@ -0,0 +1,10 @@
FROM gitpod/workspace-postgres

# Install Redis.
RUN sudo apt-get update && \
sudo apt-get install -y redis-server && \
sudo rm -rf /var/lib/apt/lists/*

ENV PYTHONUSERBASE=/workspace/.pip-modules
ENV PATH=$PYTHONUSERBASE/bin:$PATH
ENV PIP_USER=yes
6 changes: 6 additions & 0 deletions .gitpod.env
@@ -0,0 +1,6 @@
CHAOSGENIUS_WEBAPP_URL=CHAOSGENIUS_WEBAPP_URL_HERE

DATABASE_URL_CG_DB=postgresql+psycopg2://gitpod@localhost/postgres

CELERY_RESULT_BACKEND=redis://localhost:6379/1
CELERY_BROKER_URL=redis://localhost:6379/1
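The `.gitpod.env` file points both Celery settings at the local Redis started inside the workspace. A sketch of how such settings might be resolved from the environment (the helper and the fallback-to-broker behavior are assumptions for illustration, not the project's actual config loader):

```python
import os


def celery_urls(env=os.environ):
    """Resolve the broker and result-backend URLs from the environment.

    Falls back to the workspace-local Redis used in .gitpod.env, and reuses
    the broker URL for the result backend when none is given (assumption).
    """
    broker = env.get("CELERY_BROKER_URL", "redis://localhost:6379/1")
    backend = env.get("CELERY_RESULT_BACKEND", broker)
    return broker, backend
```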
77 changes: 77 additions & 0 deletions .gitpod.yml
@@ -0,0 +1,77 @@
image:
file: .gitpod.dockerfile

# List the start up tasks. Learn more https://www.gitpod.io/docs/config-start-tasks/
tasks:
# modify .bashrc here.
# `PGHOSTADDR` is set to the Postgres server running on Gitpod. psycopg2 picks up this
# variable and connects to that DB instead of the one we specify through data sources
# for some reason. So we unset this.
# See `DATABASE_URL_CG_DB` in `.gitpod.env` for the credentials to the Postgres server
# running inside Gitpod.
- before: printf 'unset PGHOSTADDR\n' >> $HOME/.bashrc && exit

# the backend server
- name: API Server
init: |
pip install wheel
pip install -r requirements/dev.txt
# notify that backend requirements have finished installing
gp sync-done backend-reqs
command: |
cp .gitpod.env .env.local
# get the URL for port 5000 exposed through Gitpod and use it as the WEBAPP_URL
# TODO: links to "View KPI", etc. won't work since they are on port 3000
sed -i "s~CHAOSGENIUS_WEBAPP_URL_HERE~`gp url 5000`~g" ".env.local"
# start postgres server
pg_start
# apply migrations
flask db upgrade
# notify that backend has been setup completely
gp sync-done backend-setup
bash dev_server.sh
- name: Webapp
init: |
cd frontend
npm install
command: |
cd frontend
# BASE_URL is set to port 5000 exposed through gitpod
REACT_APP_BASE_URL=`gp url 5000` npm start
- name: Redis
command: redis-server

- name: Workers and Scheduler
# TODO: is the await needed here?
init: gp sync-await backend-reqs
command: |
# wait all of backend setup (incl. migrations, env vars) to be completed
gp sync-await backend-setup
bash dev_workers.sh
ports:
# webapp
- port: 3000
onOpen: open-browser
visibility: "public"
# backend server
- port: 5000
visibility: "public"

vscode:
extensions:
- "ms-python.python"
- "samuelcolvin.jinjahtml"

github:
prebuilds:
# add a check to pull requests (defaults to true)
addCheck: false
# add a "Review in Gitpod" button as a comment to pull requests (defaults to false)
addComment: true
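The `sed` line in the `API Server` task swaps the `CHAOSGENIUS_WEBAPP_URL_HERE` placeholder from `.gitpod.env` for whatever URL `gp url 5000` prints (the `~` delimiter avoids escaping the slashes in the URL). The same substitution, sketched in Python with the workspace URL supplied by the caller:

```python
def fill_webapp_url(env_text: str, workspace_url: str) -> str:
    """Replace the placeholder from .gitpod.env, as the sed command does."""
    return env_text.replace("CHAOSGENIUS_WEBAPP_URL_HERE", workspace_url)
```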
39 changes: 33 additions & 6 deletions CONTRIBUTING.md
@@ -53,14 +53,41 @@ Follow these steps if you want to make a code contribution to Chaos Genius.
## Development Workflow
### Backend/API
We support 3 different development workflows:
- [Gitpod](#gitpod): easiest and fastest way to get started. Has some minor [limitations](GITPOD.md#known-issues) but can be used in most cases.
- [Docker Compose](#docker-compose): if you have Docker and docker-compose set up locally, this is the fastest way to get started on your local system.
- [Local setup](#local-setup): if you prefer running it directly on your system (without layers like Docker), opt for this. It requires running a Postgres and a Redis server yourself and needs some setup.
#### Prerequisites
### **Gitpod**
It can be quite a bit of work to set up Chaos Genius locally for development. Instead, you can use Gitpod which gives you everything you need to run and develop Chaos Genius but in a cloud environment and with the familiar interface of VS Code. See [GITPOD.md](./GITPOD.md) for details.
### **Docker Compose**
We have provided Docker Compose files specifically made for development. They include all the services (Postgres, Redis, etc.) along with auto-reloading backend server, workers and scheduler.
Note: this does not yet support development of the webapp/frontend. Use Gitpod or a local setup instead if you will be making changes to the webapp/frontend/UI.
Run the dev compose using:
```
docker-compose -f docker-compose.dev.yml up
```
If you will be testing third party data sources (go for the above if you're not sure), use this instead:
```
docker-compose -f docker-compose.dev-thirdparty.yml up
```
### **Local setup**
#### **Backend/API**
Prerequisites:
- Python 3.8 with `venv`
- A PostgreSQL server
- A Redis server
#### Steps to set up a development environment
Steps to set up a development environment:
- Create a new python virtual environment.
```bash
@@ -85,13 +112,13 @@ Follow these steps if you want to make a code contribution to Chaos Genius.
```
- Note: this does not start any of the celery schedulers or workers, which are needed if you want to run any analytics.
### Frontend/UI/Webapp
#### **Frontend/UI/Webapp**
#### Prerequisites
Prerequisites:
- Node JS
#### Steps to set up a development environment
Steps to set up a development environment:
- The webapp is present in the `frontend` directory.
```
4 changes: 3 additions & 1 deletion Dockerfile
@@ -7,6 +7,8 @@ RUN apt-get update \

COPY requirements /requirements

RUN pip install -r /requirements/prod.txt --no-cache-dir
ARG DEV

RUN pip install -r /requirements/prod.txt ${DEV:+-r /requirements/dev.txt} --no-cache-dir

COPY . .
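The Dockerfile change uses shell parameter expansion: `${DEV:+-r /requirements/dev.txt}` expands to the extra `-r` flag only when the `DEV` build arg is set and non-empty, so one `RUN` line serves both prod and dev images. The equivalent logic, sketched in Python (the helper is illustrative only):

```python
def pip_install_args(dev=None):
    """Build the pip command, mirroring the shell ${DEV:+word} expansion."""
    args = ["pip", "install", "-r", "/requirements/prod.txt"]
    if dev:  # ${VAR:+word} only expands for set, non-empty values
        args += ["-r", "/requirements/dev.txt"]
    return args + ["--no-cache-dir"]
```

Building the image with `--build-arg DEV=1` would then pull in the dev requirements; omitting it keeps the prod-only install.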
39 changes: 39 additions & 0 deletions GITPOD.md
@@ -0,0 +1,39 @@
# Using Gitpod for Chaos Genius

Gitpod gives you a fully functional VS Code environment in a web page<sup id="f1">[1](#fn1)</sup>. All the requirements, system dependencies and services are set up according to the project's specifications. You don't even need to set up Git. This lets you get started with contributing to Chaos Genius with no setup required.

Use this button to open a Gitpod workspace on Chaos Genius' `develop` branch:

<a href="https://gitpod.io/#https://github.com/chaos-genius/chaos_genius/tree/develop"><img src="https://gitpod.io/button/open-in-gitpod.svg"/></a>

- You may need to login with your GitHub account.
- It may take 5-10 mins for set up if the pre-build was not made or did not complete for some reason.

## Features

- A new page will open with a URL like `3000-chaosgenius-chaosgenius-<something>.gitpod.io`. This is the Chaos Genius webapp running on Gitpod. It is connected to the backend which runs on Gitpod too.
- Select "Remote Explorer" from the left side bar to see all the open ports. The "Open Browser" button will open the port in a new page (use this on port 3000 to get back the page mentioned in the previous point).
- All services required by Chaos Genius are running in separate terminals (see the right side bar of the terminal window)
- `API Server` is the backend server. You will find the backend logs in this terminal.
- Run `bash dev_server.sh` to restart the server if it had stopped for any reason.
- `Webapp` is the frontend UI. You will find React build logs here.
- Use ```REACT_APP_BASE_URL=`gp url 5000` npm start``` to restart the webapp server if it had stopped.
- `Redis` is the message broker used by Celery workers. You will not find anything useful in these logs.
- Use `redis-server` to restart Redis if it had stopped.
- `Workers and Scheduler` runs the Celery workers and Celery beat scheduler. You will find logs of analytics tasks (anomaly, DeepDrills, etc.), data source metadata pre-fetch, alerts (both individual and alert reports), etc.
- Use `bash dev_workers.sh` to restart them if they had stopped.
- Use the `+` button in the terminal window to open a new shell. The python environment will be pre-activated, which allows you to use the Chaos Genius CLI (see `flask --help` for details).

### Docker compose support

If you need to test any functionality that only affects the docker compose deployments, you can use `docker-compose` directly from Gitpod. Both docker and docker-compose are pre-installed. The webapp will need to be accessed from port `8080` (use the Remote Explorer to make it public or to get the link).

## Known issues

- The workspace shuts down after 30 mins of inactivity on the default free plan.
- There is also a 50 hours monthly limit.
- Sending emails via SMTP does not work inside Gitpod. See https://github.com/gitpod-io/gitpod/issues/965.

## Notes

<b id="fn1">1</b> You can even choose to open Gitpod on your system's VS code. Click on the first button on the left side-bar (the hamburger button) and select `Gitpod: Open in VS Code`. [](#f1)
7 changes: 5 additions & 2 deletions chaos_genius/alerts/__init__.py
@@ -4,6 +4,7 @@
VS Code extension (or the Pyright equivalent in other editors) along with flake8 when
developing.
"""
import datetime
import logging
from typing import List, Optional, Tuple

@@ -17,7 +18,9 @@
logger = logging.getLogger()


def check_and_trigger_alert(alert_id: int):
def check_and_trigger_alert(
alert_id: int, last_anomaly_timestamp: Optional[datetime.datetime] = None
):
"""Check the alert and trigger the notification if found.
Args:
@@ -50,7 +53,7 @@ def check_and_trigger_alert(alert_id: int):
elif (
alert_info.alert_type == "KPI Alert" and alert_info.kpi_alert_type == "Anomaly"
):
anomaly_obj = AnomalyAlertController(alert_info)
anomaly_obj = AnomalyAlertController(alert_info, last_anomaly_timestamp)
return anomaly_obj.check_and_send_alert()
elif alert_info.alert_type == "KPI Alert" and alert_info.kpi_alert_type == "Static":
# TODO: is this still needed?
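The change to `chaos_genius/alerts/__init__.py` adds an optional `last_anomaly_timestamp` parameter and threads it through to `AnomalyAlertController`. A stripped-down sketch of that dispatch shape (the class and dict fields below are stand-ins for illustration, not the real controllers or models):

```python
import datetime
from typing import Optional


class _AnomalyController:
    """Stand-in for AnomalyAlertController: records what it was given."""

    def __init__(self, alert_info, last_anomaly_timestamp):
        self.alert_info = alert_info
        self.last_anomaly_timestamp = last_anomaly_timestamp


def dispatch_alert(
    alert_info: dict,
    last_anomaly_timestamp: Optional[datetime.datetime] = None,
):
    """Route anomaly KPI alerts to the controller, passing the timestamp on."""
    if (
        alert_info["alert_type"] == "KPI Alert"
        and alert_info["kpi_alert_type"] == "Anomaly"
    ):
        return _AnomalyController(alert_info, last_anomaly_timestamp)
    return None
```

Making the new parameter optional keeps existing callers of `check_and_trigger_alert` working unchanged.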