diff --git a/DEVELOPMENT.md b/DEVELOPMENT.md
new file mode 100644
index 0000000..89dd90f
--- /dev/null
+++ b/DEVELOPMENT.md
@@ -0,0 +1,200 @@
+# Development
+
+This document describes development techniques that can improve and speed up the development process.
+It gathers information that relates to more than one component; component-specific details can be
+found in the individual README files.
+
+## Table of contents
+
+* [Deploy and set up external components](#deploy-and-set-up-external-components)
+  * [Testing with Kafka in docker](#testing-with-kafka-in-docker)
+  * [Testing with MQTT broker in docker](#testing-with-mqtt-broker-in-docker)
+  * [Testing with Postgres in docker](#testing-with-postgres-in-docker)
+
+## Deploy and set up external components
+
+The solution uses external services and components:
+
+* Kafka/OpenShift Streams
+* MQTT/AMQ Broker
+* Postgres database
+
+For development purposes, the components can be deployed locally as Docker containers.
+
+Prerequisites:
+
+* Docker installed
+* docker-compose installed
+
+### Testing with Kafka in docker
+
+You may need this setup to test:
+
+* Quarkus/MQTT->Kafka bridge
+
+#### Setup
+
+One of the quickest ways to set up a Kafka cluster for development purposes is to use Docker containers.
+
+The procedure to **set up the cluster** boils down to:
+
+```shell
+curl --silent --output docker-compose.yml \
+  https://raw.githubusercontent.com/confluentinc/cp-all-in-one/6.1.0-post/cp-all-in-one/docker-compose.yml
+
+docker-compose up -d
+```
+
+See https://docs.confluent.io/platform/current/quickstart/ce-docker-quickstart.html for the details.
+
+To **create a topic**:
+
+```shell
+docker-compose exec broker kafka-topics --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic ENTRY_EVENT
+```
+
+where
+
+* `broker` is the name of the container hosting the Kafka broker instance
+* `localhost:9092` is the broker's URL
+* `ENTRY_EVENT` is the topic name
+
+To **produce some testing messages**, one can issue one of the following commands:
+
+```shell
+docker-compose exec broker \
+    bash -c "seq 10 | kafka-console-producer --request-required-acks 1 --broker-list localhost:9092 --topic ENTRY_EVENT && echo 'Produced 10 messages.'"
+```
+or
+```shell
+docker-compose exec broker \
+    bash -c "echo '{\"event_type\":\"customer focus\",\"event_timestamp\":\"2001-12-17T09:30:47.0\",\"payload\":{\"customer_id\":3,\"category\":\"Boys\"}}' | kafka-console-producer --request-required-acks 1 --broker-list localhost:9092 --topic FOCUS_EVENTS && echo 'Message produced.'"
+```
+
+where
+
+* `broker` is the name of the container hosting the Kafka broker instance
+* `ENTRY_EVENT` / `FOCUS_EVENTS` are the topic names
+* `localhost:9092` is the broker's URL
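+
+To **consume messages** from a topic (for example, to verify that the producers above work), the
+console consumer shipped with the same image can be used. A minimal sketch, assuming the container,
+broker URL, and topic name from the commands above:
+
+```shell
+docker-compose exec broker \
+    kafka-console-consumer --bootstrap-server localhost:9092 --topic ENTRY_EVENT --from-beginning
+```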
+
+### Testing with MQTT broker in docker
+
+You may need this setup to test:
+
+* Recommendation Service
+* Visualization Application
+* Mobile Application
+* Scenario Player
+
+#### Setup
+
+To **run the container** with the Mosquitto broker:
+
+```shell
+docker run -d --rm --name mosquitto -p 1883:1883 eclipse-mosquitto
+```
+or
+```shell
+docker run -it -p 1883:1883 --name mosquitto eclipse-mosquitto mosquitto -c /mosquitto-no-auth.conf
+```
+
+To **publish to a topic**:
+
+```shell
+docker exec mosquitto mosquitto_pub -h 127.0.0.1 -t test -m "test message"
+```
+
+To **subscribe to a topic**:
+
+```shell
+docker exec mosquitto mosquitto_sub -h 127.0.0.1 -t test
+```
+
+### Testing with Postgres in docker
+
+You may need this setup to test:
+
+* Recommendation Service
+
+#### Setup
+
+CSV files are available in the
+[../training-with-artificial-data/data_0409_0/data4db/](../training-with-artificial-data/data_0409_0/data4db/) path.
+
+To **run the container**:
+
+Go to `training-with-artificial-data/data_0409_0/data4db` and run
+
+```shell
+docker run -v $PWD:/usr/local/pgsql/data -e POSTGRES_PASSWORD=root -p 5432:5432 -d postgres
+```
+
+To **create the tables**:
+
+Install a PostgreSQL client. If you are using Ubuntu, you can use the command
+
+```shell
+sudo apt-get install postgresql-client
+```
+
+Connect to the database using
+
+```shell
+psql -h 127.0.0.1 -p 5432 -U postgres
+```
+
+Create the tables:
+
+```sql
+CREATE TABLE coupon_info (
+    coupon_id INT,
+    coupon_type VARCHAR(16),
+    department VARCHAR(10),
+    discount INT,
+    how_many_products_required INT,
+    start_date VARCHAR(10),
+    end_date VARCHAR(10),
+    product_mean_price REAL,
+    products_available INT,
+    PRIMARY KEY (coupon_id)
+);
+
+CREATE TABLE product_info (
+    product_id INT,
+    name VARCHAR(256),
+    category VARCHAR(50),
+    sizes VARCHAR(50),
+    vendor VARCHAR(50),
+    description VARCHAR(256),
+    buy_price REAL,
+    department VARCHAR(10),
+    PRIMARY KEY (product_id)
+);
+
+CREATE TABLE coupon_product (
+    coupon_id INT,
+    product_id INT,
+    FOREIGN KEY (coupon_id) REFERENCES coupon_info(coupon_id),
+    FOREIGN KEY (product_id) REFERENCES product_info(product_id)
+);
+
+CREATE TABLE customer_info (
+    customer_id INT,
+    gender VARCHAR(1),
+    age INT,
+    mean_buy_price REAL,
+    total_coupons_used INT,
+    mean_discount_received REAL,
+    unique_products_bought INT,
+    unique_products_bought_with_coupons INT,
+    total_items_bought INT,
+    PRIMARY KEY (customer_id)
+);
+```
+
+Fill the DB with data:
+
+```sql
+COPY coupon_info FROM '<>/coupon_info.csv' DELIMITER ',' CSV HEADER;
+COPY product_info FROM '<>/products.csv' DELIMITER ',' CSV HEADER;
+COPY coupon_product FROM '<>/coupon_product.csv' DELIMITER ',' CSV HEADER;
+COPY customer_info FROM '<>/customer_info.csv' DELIMITER ',' CSV HEADER;
+```
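+
+Note that `COPY ... FROM` reads files on the server side, i.e. inside the container; with the volume
+mount shown above, the CSV files are visible to the server under `/usr/local/pgsql/data`. A quick
+sanity check of the import, assuming the connection parameters used above:
+
+```shell
+psql -h 127.0.0.1 -p 5432 -U postgres -c 'SELECT COUNT(*) FROM coupon_info;'
+```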
diff --git a/README.md b/README.md
index 439e3be..3a896cf 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,33 @@
 # Retail Store of the Future
 
-Disclaimer! This solution was created for demo purposes only. It contains simplifications and must not be used for production purposes!
+**Disclaimer!** This solution was created for demo purposes only. It contains simplifications
+and must not be used for production purposes!
+
+This solution shows some potential use-cases and capabilities of RedHat and SAP components in
+Intel-driven environments.
+
+In the Retail Store of the Future, the customer is at the center of the action.
+Thanks to advanced technologies, the store can "feel" the customer's needs and help
+them make better purchasing decisions.
+
+This solution uses AI techniques to increase customer satisfaction by proposing the most
+interesting coupons as the customer moves through the store. In the proposed scenario the system also detects
+browsing customers and sends an assistant to them.
+
+The "Retail Store of the Future" project shows a potential architecture and components that could
+be used in a real-world scenario. Depending on the final solution and needs, some elements
+could be changed, added, or removed.
+
+## Table of contents
+
+* [Solution diagram and components description](#solution-diagram-and-components-description)
+  * [Training](#training)
+  * [Prediction Service](#prediction-service)
+  * [Recommendation Service](#recommendation-service)
+  * [Customers simulator](#customers-simulator)
+  * [Visualization app](#visualization-app)
+* [Development](#development)
+* [Deployment and production](#deployment-and-production)
 
 ## Solution diagram and components description
 
@@ -10,30 +37,52 @@ Disclaimer! This solution was created for demo purposes only. It contains simpli
 
 The training part is made in the jupyter notebooks using Intel DAAL libraries.
 
-### Prediction service
+All scripts and necessary components are available in the
+[training-with-artificial-data](./training-with-artificial-data) directory.
+
+A dedicated data generator was created for training purposes. It is placed in the
+[artificial-data-generator](./artificial-data-generator) directory.
+
+### Prediction Service
 
 Prediction service handles the model. It provides a REST API that can be called to retrieve predictions
 for coupons.
 
-Check [Prediction service README](prediction-service/README.md) for details.
+Check the [Prediction Service](./prediction-service) directory for details.
 
-### Recommendation service
+### Recommendation Service
 
-Recommendation service listens for MQTT requests.
+Recommendation service provides MQTT interfaces for the recommendation process.
 
-When the entry event occurs it calls (TODO) the central resource for client data. The component stores the data in a cache - database (TODO).
+It observes MQTT topics, pulls data, creates the context, and sends it to the Prediction Service.
 
-When the focus event occurs the component gets client and coupon data from the cache (TODO) and calls prediction service.
-The result of the prediction is pushed to MQTT.
+The response is interpreted, processed, and sent to the appropriate MQTT topic.
 
-Check [Recommendation service README](recommendation-service/README.md) for details.
-MQTT topics description and schema can also be found there.
+### Customers simulator
+
+This simulator was made for demo purposes. It can simulate customers' movement through the store.
+See more details in the [README file](scenario-player/README.md).
 
 ### Visualization app
 
-The application was made for demo purposes. [More details int app's README](visualization-app/README.md).
+This application provides the interface for the store's crew. However, for demo purposes, it was also
+extended to simulate the customer's mobile app. [More details](visualization-app).
 
-### Customers simulator
+The visualization app contains:
+
+* Store interface (customers preview, alerts)
+* Simulator interface (a new scenario for the customers simulator can be created using the UI)
+* Mobile app simulation (shows the customer's mobile app behavior while moving through the store)
+
+## Development
+
+Each component has its own instructions describing the development, deployment, structure, interfaces, and more.
+See the particular component's README for details.
+
+As shown in the diagram, there are additional, external components used in this solution. You can set up
+your own development environment using containers. The development instructions can be found [here](DEVELOPMENT.md).
+
+## Deployment and production
 
-This simulator was made for demo purposes. See more details in the [README file](scenario-player/README.md)
+The solution is adapted to run in an OpenShift environment. The deployment process is described in the [infra](./infra) directory.
diff --git a/artificial-data-generator/README.md b/artificial-data-generator/README.md
index 1f9f0aa..1e84931 100644
--- a/artificial-data-generator/README.md
+++ b/artificial-data-generator/README.md
@@ -11,13 +11,17 @@ It makes a few assumptions for that purpose:
 1. People in the age 61 and above buys products from the 2/3 to the end of the list* more often,
 1. The customer has personal preferences which can change the probability of buying products by up to 40% (20% on average) in 20% of cases.
-
 > *There are actually 3 lists - vendors, departments, and categories. The order in all lists is random but always the same for all generated customers. Every customer has his/her own list of preferences for all 3 types. All lists are generated the same way and using the same assumptions described above.
 
 All numbers and functions described above can be changed using the config.py file.
 
+## Table of contents
+
+* [The algorithm](#the-algorithm)
+* [Usage](#usage)
+
 ## The algorithm
 
 Here's an approximate algorithm of the generator. For more details please check the code.
 
 1. Generate customers
diff --git a/infra/README.md b/infra/README.md
index d25329e..86490cd 100644
--- a/infra/README.md
+++ b/infra/README.md
@@ -1,43 +1,64 @@
-### Building images on Openshift Container Platform
+# Build and deployment
 
-If you require a private repository access you must create a secret containing Github Deploy Key
-```
+## Table of contents
+
+* [Building images on Openshift Container Platform](#building-images-on-openshift-container-platform)
+* [Deploy the solution using Helm Charts](#deploy-the-solution-using-helm-charts)
+
+## Building images on Openshift Container Platform
+
+If you require private repository access, you must create a secret containing a GitHub Deploy Key:
+
+```shell
 oc create secret generic retail-git-ssh-key --from-file=ssh-privatekey= --type=kubernetes.io/ssh-auth
 ```
+
 Create BuildConfigs and ImageTags:
-```
+
+```shell
 oc apply -f ocp-buildconfigs.yaml
 ```
+
 Verify build configs have been created:
-```
+
+```shell
 oc get buildconfigs
 ```
-```
+
+```
 NAME                           TYPE     FROM            LATEST
 prediction-service-build       Docker   Git@develop-pl  0
 recommendation-service-build   Docker   Git@develop-pl  0
 ```
+
 Verify ImageTags have been created in your project:
-```bash
+
+```shell
 oc get is
 ```
-```
+
+```
 NAME                     IMAGE REPOSITORY                                                                            TAGS   UPDATED
 prediction-service       default-route-openshift-image-registry.apps.red.ocp.public/retail/prediction-service
 recommendation-service   default-route-openshift-image-registry.apps.red.ocp.public/retail/recommendation-service
 ```
+
 Manually trigger the image builds:
-```bash
+
+```shell
 oc start-build customer-simulation-service
 oc start-build prediction-service
 oc start-build recommendation-service
 oc start-build visualization-service
 ```
+
 Wait for the builds to complete:
-```bash
+
+```shell
 oc get builds --watch
 ```
-```
+
+```
 NAME                         TYPE     FROM             STATUS    STARTED             DURATION
 prediction-service-build-1   Docker   Git@develop-pl   Running   5 seconds ago
 prediction-service-build-1   Docker   Git@72d19cf      Running   12 seconds ago
@@ -52,24 +73,30 @@ prediction-service-build-1   Docker   Git@72d19cf   Complete   About a mi
 ```
 
 See if the ImageTags have been updated:
-```bash
+
+```shell
 oc get is
 ```
-```
+
+```
 NAME                     IMAGE REPOSITORY                                                                            TAGS     UPDATED
 prediction-service       default-route-openshift-image-registry.apps.red.ocp.public/retail/prediction-service       latest   1 minutes ago
 recommendation-service   default-route-openshift-image-registry.apps.red.ocp.public/retail/recommendation-service   latest   1 minutes ago
 ```
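+
+If a build hangs or fails, its logs can be inspected with `oc logs`. A sketch, assuming the
+BuildConfig names created above:
+
+```shell
+oc logs -f bc/prediction-service-build
+```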
 
-### Deploy the solution using Helm Charts
+## Deploy the solution using Helm Charts
 
 Edit the `values.yaml` file and configure your workload parameters:
-```bash
+
+```shell
 vim retail-helm-chart/values.yaml
 ```
 
-Install the Chart with Helm:
-```bash
+Install the Chart with Helm:
+
+```shell
 helm install retail retail-helm-chart/
 ```
+
 ```
 NAME: retail
 LAST DEPLOYED: 2021-04-07 14:52:49.839391 +0000 UTC m=+0.078141486
@@ -77,10 +104,12 @@ NAMESPACE: retail
 STATUS: deployed
 ```
 
-Verify all pods are Running and in a Ready:
-```bash
+Verify all pods are Running and Ready:
+
+```shell
 oc get all
 ```
+
 ```
 NAME                            READY   STATUS    RESTARTS   AGE
 pod/postgres-5f549f5798-9qz72   1/1     Running   0          68s
diff --git a/prediction-service/README.md b/prediction-service/README.md
index 7f0561a..3effe52 100644
--- a/prediction-service/README.md
+++ b/prediction-service/README.md
@@ -2,20 +2,29 @@
 
 REST application which outputs coupon redemption predictions
 
+## Table of contents
+
+* [Development](#development)
+  * [Dependencies](#dependencies)
+  * [Service configuration](#service-configuration)
+  * [Running the service](#running-the-service)
+* [Prediction example](#prediction-example)
+* [Docker image](#docker-image)
+
 ## Development
 
 ### Dependencies
 
 Dependencies of the project are contained in requirements.txt file. All the packages are publicly available.
 
-All the packages can be installed with:
-```
+All the packages can be installed with:
+
+```shell
 pip install -r requirements.txt
 ```
 
 For development purposes creating a dedicated virtual environment is helpful (Python 3.8, all the
 dependencies installed there):
-```
+
+```shell
 python3 -m venv .venv
 source .venv/bin/activate
 pip install -r requirements.txt
@@ -30,7 +39,7 @@ advance.
 
 Then, in order to run the service the following commands can be used:
 
-```
+```shell
 $ . .environment.variables.sh
 $ . .venv/bin/activate
 (venv)$ uvicorn app.main:app --host 0.0.0.0 --reload
@@ -38,7 +47,7 @@ $ . .venv/bin/activate
 
 ## Prediction example
 
-```
+```shell
 curl -X 'POST' \
   'http://10.91.117.45:8002/score' \
   -H 'accept: application/json' \
@@ -80,7 +89,7 @@ curl -X 'POST' \
 
 Example response:
 
-```
+```json
 [
   {
     "coupon_id": 2,
@@ -102,7 +111,8 @@ See https://github.com/tiangolo/uvicorn-gunicorn-fastapi-docker for detail
 on configuring the container (http port, log level, etc.)
 
 In order to build the image use:
-```
+
+```shell
 docker build -t prediction-service:0.0.1 .
 ```
 
@@ -110,7 +120,7 @@ docker build -t prediction-service:0.0.1 .
 > your needs.
 
 To run the service as a Docker container run:
-```
-docker run -d -p 8000:80 -e LOG_LEVEL="warning" prediction-service:0.0.1
+```shell
+docker run -d -p 8000:80 -e LOG_LEVEL="warning" prediction-service:0.0.1
 ```
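+
+Once the container is up, a quick smoke test can confirm the service responds. A sketch, assuming
+the port mapping above and the interactive docs endpoint that the FastAPI stack enables by default:
+
+```shell
+curl -s http://localhost:8000/docs
+```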
from "the central" (datacenter), when the service receives *entry event* -* invoke/call prediction service to decide if the customer is willing to use promotion coupons from given department, - when the service receives *focus event* and send prediction result to a dedicated MQTT topic. +* invoke/call prediction service to decide if the customer is willing to use coupons from a given department, + when the service receives *focus event* and sends the prediction result to a dedicated MQTT topic. ## Table of contents @@ -28,7 +29,8 @@ This service's responsibility is to: * [Testing without MQTT](#testing-without-mqtt) * [Docker image](#docker-image) * [Mock event endpoints](#mock-event-endpoints) - * [Cache - DB](#cache---db) + +# Functionality ## Event payloads @@ -169,7 +171,6 @@ In order to do the actual prediction, a REST call is made. **TBD** (See: [prediction.schema.json](schema/prediction.schema.json)) - # Development ## Dependencies @@ -188,22 +189,21 @@ The service reads the following **environment variables**: |------------------------|-----------------------------------------|--------------:| | MQTT_HOST | comma-separated list of MQTT brokers | - | | MQTT_PORT | MQTT brokers' port | - | -| MQTT_USERNAME | MQTT user username | None | -| MQTT_PASSWORD | MQTT user password | None | -| MQTT_BROKER_CERT_FILE | path to MQTT ssl cert file | None | +| MQTT_USERNAME | MQTT user username | None | +| MQTT_PASSWORD | MQTT user password | None | +| MQTT_BROKER_CERT_FILE | path to MQTT ssl cert file | None | | ENTRY_EVENT_TOPIC_NAME | topic for entry events | - | | FOCUS_EVENT_TOPIC_NAME | topic for focus events | - | | COUPON_PREDICTION_TOPIC_NAME | topic for sending prediction results | - | -(Parameters with `-` in "Default" column are required.) +(Parameters with `-` in the "Default" column are required.) Use [log_config.py](./app/utils/log_config.py) to **configure logging behaviour**. By default, console and file handlers are used. The file appender writes to `messages.log`. - ## Running the service -For my development I created a project with dedicated virtual environment (Python 3.8, all the dependencies installed +For development, I created a project with a dedicated virtual environment (Python 3.8, all the dependencies installed there). The code reads sensitive information (tokens, secrets) from environment variables. They need to be set accordingly in @@ -211,14 +211,13 @@ advance. `.environment.variables.sh` can be used for that purpose. Then, in order to run the service the following commands can be used: -``` +```bash $ . .environment.variables.sh $ . venv/bin/activate (venv)$ uvicorn app.main:app --host 0.0.0.0 --reload --reload-dir app ``` > Please, note `reload-dir` switch. Without it the reloader goes into an infinite loop because it detects log file changes (messages.log). - ## Testing without MQTT For testing purposes, there are two endpoints that simulate events ("entry event", "focus event"), as if they would appear on a dedicated MQTT topic. @@ -227,6 +226,7 @@ In order for the service not to create real MQTT consumers and producers, set `TESTING_NO_MQTT` environment variable to "true". This way, event processing logic can be tested without MQTT, for example: + ```bash curl -X 'POST' \ 'http://127.0.0.1:8000/mock_entry' \ @@ -252,7 +252,8 @@ See https://github.com/tiangolo/uvicorn-gunicorn-fastapi-docker for detail on configuring the container (http port, log level, etc.) 
 
 Use [log_config.py](./app/utils/log_config.py) to **configure logging behaviour**.
 By default, console and file handlers are used. The file appender writes to `messages.log`.
-
 ## Running the service
 
-For my development I created a project with dedicated virtual environment (Python 3.8, all the dependencies installed
+For development, I created a project with a dedicated virtual environment (Python 3.8, all the dependencies installed
 there).
 
 The code reads sensitive information (tokens, secrets) from environment variables. They need to be set accordingly in
@@ -211,14 +211,13 @@ advance.
 `.environment.variables.sh` can be used for that purpose.
 
 Then, in order to run the service the following commands can be used:
-```
+```bash
 $ . .environment.variables.sh
 $ . venv/bin/activate
 (venv)$ uvicorn app.main:app --host 0.0.0.0 --reload --reload-dir app
 ```
 
 > Please, note `reload-dir` switch. Without it the reloader goes into an infinite loop because it detects log file
 changes (messages.log).
-
 ## Testing without MQTT
 
 For testing purposes, there are two endpoints that simulate events ("entry event", "focus event"), as if they would appear on a dedicated MQTT topic.
@@ -227,6 +226,7 @@ In order for the service not to create real MQTT consumers and producers, set
 `TESTING_NO_MQTT` environment variable to "true".
 
 This way, event processing logic can be tested without MQTT, for example:
+
 ```bash
 curl -X 'POST' \
   'http://127.0.0.1:8000/mock_entry' \
@@ -252,7 +252,8 @@ See https://github.com/tiangolo/uvicorn-gunicorn-fastapi-docker for detail
 on configuring the container (http port, log level, etc.)
 
 In order to build the image use:
-```
+
+```bash
 docker build -t recommendation-service:0.0.1 .
 ```
 
@@ -260,16 +261,17 @@ docker build -t recommendation-service:0.0.1 .
 > your needs.
 
 To run the service as a Docker container run:
-```
-docker run -d -e LOG_LEVEL="warning" --name recommendaition-service recommendation-service:0.0.1
+```bash
+docker run -d -e LOG_LEVEL="warning" --name recommendation-service recommendation-service:0.0.1
 ```
 
 ## Mock event endpoints
 
 For testing purposes, there are two endpoints that simulate events ("entry event", "focus event"),
-as if they would appear on dedicated MQTT topic.
+as if they would appear on a dedicated MQTT topic.
 
 This way, event processing logic can be tested without MQTT, for example:
+
 ```bash
 curl -X 'POST' \
   'http://127.0.0.1:8000/mock_entry' \
   -H 'accept: application/json' \
   -H 'Content-Type: application/json' \
   -d '{
   "customer_id": 3,
   "ts": 1617869246,
   "payload": {
     "customer_id": 3
   }
 }'
 ```
+
 or:
-```
+
+```bash
 curl -X 'POST' \
   'http://127.0.0.1:8000/mock_focus' \
   -H 'accept: application/json' \
@@ -298,68 +302,3 @@ curl -X 'POST' \
   }
 }'
 ```
-
-## Cache - DB
-
-This component uses PostgreSQL as a cache. It stores coupons and customer data.
-
-DB tables:
-
-```sql
-
-CREATE TABLE coupon_info (
-    coupon_id INT,
-    coupon_type VARCHAR(16),
-    department VARCHAR(10),
-    discount INT,
-    how_many_products_required INT,
-    start_date VARCHAR(10),
-    end_date VARCHAR(10),
-    product_mean_price REAL,
-    products_available INT,
-    PRIMARY KEY (coupon_id)
-);
-
-CREATE_TABLE product_info (
-    product_id INT,
-    name VARCHAR(256),
-    category VARCHAR(50),
-    sizes VARCHAR(50),
-    vendor VARCHAR(50),
-    description VARCHAR(256),
-    buy_price REAL,
-    department VARCHAR(10),
-    PRIMARY KEY (product_id)
-);
-
-CREATE_TABLE coupon_product (
-    coupon_id INT,
-    product_id INT,
-    FOREIGN KEY (coupon_id) REFERENCES coupon_info(coupon_id),
-    FOREIGN KEY (product_id) REFERENCES products(product_id)
-)
-
-CREATE TABLE customer_info (
-    ustomer_id INT,
-    gender VARCHAR(1),
-    age INT,
-    mean_buy_price REAL,
-    total_coupons_used: INT,
-    mean_discount_received: REAL,
-    unique_products_bought INT,
-    unique_products_bought_with_coupons: INT,
-    total_items_bought:
-    INT, PRIMARY KEY (customer_id)
-);
-```
-
-How to fill DB with data:
-
-```sql
-COPY coupon_info FROM '<>/coupon_info.csv' DELIMITER ',' CSV HEADER;
-COPY product_info FROM '<>/products.csv' DELIMITER ',' CSV HEADER;
-COPY coupon_product FROM '<>/coupon_product.csv' DELIMITER ',' CSV HEADER;
-COPY customer_info FROM '<>/customer_info.csv' DELIMITER ',' CSV HEADER;
-```
-
-CSV files are available in the [../training-with-artificial-data/data_0409_0/data4db/](../training-with-artificial-data/data_0409_0/data4db/) path
diff --git a/scenario-player/README.md b/scenario-player/README.md
index baff830..c2bb47e 100644
--- a/scenario-player/README.md
+++ b/scenario-player/README.md
@@ -1,10 +1,10 @@
-# Project description
+# Scenario Player
 
-This project was a part of broader demo. That broader demo analyzed customers movement in a retail store, determined
-their behaviour (for example: "customer stopped in men's clothes department") and use Machine Learning to model for
+This project was a part of a broader demo. That broader demo analyzed customers' movement in a retail store, determined
+their behavior (for example: "customer stopped in men's clothes department"), and used Machine Learning models for
 purchase/product recommendation. The customer location was determined by movement sensors placed in the store.
 
-This service customer behaviour in a retail shop:
+This service simulates customer behavior in a retail shop:
 
 * customer entering the store
 * customer movement
@@ -12,6 +12,15 @@ by generating proper MQTT messages.
 
+## Table of contents
+
+* [Usage](#usage)
+  * [Main simulator loop](#main-simulator-loop)
+  * [Scenario definitions](#scenario-definitions)
+  * [Messages payloads](#messages-payloads)
+* [Running the service](#running-the-service)
+* [Development information](#development-information)
+
 # Usage
 
 This is a web service (implemented with FastAPI). By default, it works on port 8000.
@@ -22,19 +31,20 @@ When starting, it does the following:
 
 * connects to MQTT server
 * waits for and registers new user movement scenarios (HTTP POST to `/scenario` endpoint)
 * creates a background task (ran every second) that checks if there is something to be sent (if it is time for giving an
-  update on particular customer)
+  update on the particular customer)
 
-### Main simulator loop
+## Main simulator loop
 
 The main loop executes every second and tries to locate events (in the timeline) that should be "replayed".
 If there are any, proper messages are constructed and published (via the `Publisher` object).
 
 > Please, note, that in the current implementation, the main simulator loop replays
 > events from the timeline for **current timestamp** (current date/time).
 
-### Scenario definitions
+## Scenario definitions
 
-Scenario is a list of locations for given customer in certain moments:
-```
+The scenario is a list of locations for a given customer in certain moments, for example:
+
+```json
 {
   "customer": {
     "customer_id": "3"
@@ -50,7 +60,6 @@
     "location": {"x": 320, "y": 150},
     "timestamp": "2021-04-11T10:13:49.614897"
   },
-  ...
   {
     "type": "EXIT",
     "location": {"x": 200, "y": 10},
 ```
 
 The service can register a scenario:
+
 ```shell
 curl -X 'POST' \
   'http://localhost:8000/scenario' \
   -H 'accept: application/json' \
   -H 'Content-Type: application/json' \
   -d '{"customer":{"customer_id":"1"},"path":[{"type":"ENTER","location":{"x":935,"y":50},"timestamp":1618323771180},{"type":"MOVE","location":{"x":588.9128630705394,"y":454.08039288409145},"timestamp":1618323772180},{"type":"EXIT","location":{"x":1075,"y":50},"timestamp":1618323773180}]}'
 ```
 
-After receiving the request, the service all adds all scenario steps (from `path`) to the current timeline
+After receiving the request, the service adds all steps (from `path`) to the current timeline
 with timestamps as defined in the payload.
 
-As the main loop uses current timestamp for "locating" the events on the timeline,
+As the main loop uses the current timestamp for "locating" the events on the timeline,
 it is possible that registered events won't ever be published. In case the timestamp
 of a given event _is in the past_, it will never be retrieved and processed.
 
-For the user convenience, it is possible to reuse a scenario definition that refers to the past.
+For user convenience, it is possible to reuse a scenario definition that refers to the past.
 `recalculate_time=true` parameter for `/scenario` request can be used here:
-```
+
+```shell
 curl -X 'POST' \
   'http://localhost:8000/scenario?recalculate_time=true' \
   -H 'accept: application/json' \
   -H 'Content-Type: application/json' \
   -d '{"customer":{"customer_id":"1"},"path":[{"type":"ENTER","location":{"x":935,"y":50},"timestamp":1618323771180},{"type":"MOVE","location":{"x":588.9128630705394,"y":454.08039288409145},"timestamp":1618323772180},{"type":"EXIT","location":{"x":1075,"y":50},"timestamp":1618323773180}]}'
 ```
 
-In this case, scenario will start with current time and step timestamps will be recalculated appropriately
+In this case, the scenario will start with the current time and step timestamps will be recalculated appropriately
 (they will be "refreshed").
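+
+To watch the messages the player publishes while a scenario runs, subscribe to the customer
+topics. A sketch, assuming the Mosquitto container from [DEVELOPMENT.md](../DEVELOPMENT.md)
+and the default topic names (`customer/enter`, `customer/move`, `customer/exit`):
+
+```shell
+docker exec mosquitto mosquitto_sub -h 127.0.0.1 -t 'customer/#' -v
+```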
-
 ## Messages payloads
 
 ### **customer/exit** Channel
@@ -151,4 +161,3 @@ See [instructions](./development.md#running-the-service) for details on configur
 # Development information
 
 See [development.md](./development.md) for information about configuring the service, how to run tests and run the service.
-
diff --git a/scenario-player/development.md b/scenario-player/development.md
index b41b4c2..631962a 100644
--- a/scenario-player/development.md
+++ b/scenario-player/development.md
@@ -1,13 +1,10 @@
 # Functionality
 
-This service generates messages that simulate customer behaviour in a retail shop:
+This service generates messages that simulate customer behavior in a retail shop:
 * customer entering the store
 * customer movement
 * customer exiting the store
-
-# Table of contents
-* [Functionality](#functionality)
-
+## Table of contents
 * [Development](#development)
   * [Dependencies](#dependencies)
   * [Service configuration](#service-configuration)
@@ -15,11 +12,9 @@ This service generates messages that simulate customer behaviour in a retail sho
   * [Testing with MQTT broker in docker](#testing-with-mqtt-broker-in-docker)
   * [Testing without MQTT](#testing-without-mqtt)
     * [Mock event endpoints](#mock-event-endpoints)
-
 * [Deployment](#deployment)
   * [Docker image](#docker-image)
   * [Connecting to a secured broker](#connecting-to-a-secured-broker)
-
 # Development
 
 ## Dependencies
 
 All the packages can be installed with:
 
@@ -35,28 +30,27 @@ The service reads the following **environment variables**:
-| Variable | Description | Default |
-|------------------------|--------------------------------------|--------------:|
-| STORE_HEIGHT | | 10 |
-| STORE_WIDTH | | 6 |
-| CUSTOMERS_AVERAGE_IN_STORE | | 6 |
-| CUSTOMERS_LIST_FILE | | customers.csv |
-| MQTT_HOST | | - |
-| MQTT_PORT | | 1883 |
-| MQTT_NAME | | demoClient |
-| ENTER_TOPIC | | customer/enter|
-| MOVE_TOPIC | | customer/move |
-| EXIT_TOPIC | | customer/exit |
-
-(Parameters with `-` in "Default" column are required.)
+| Variable                   | Description                              |        Default |
+|----------------------------|------------------------------------------|---------------:|
+| STORE_HEIGHT               |                                          |             10 |
+| STORE_WIDTH                |                                          |              6 |
+| CUSTOMERS_AVERAGE_IN_STORE | average number of customers in the store |              6 |
+| CUSTOMERS_LIST_FILE        | file with the list of customers          |  customers.csv |
+| MQTT_HOST                  | MQTT broker host                         |              - |
+| MQTT_PORT                  | MQTT broker port                         |           1883 |
+| MQTT_NAME                  | MQTT client name                         |     demoClient |
+| ENTER_TOPIC                | topic for customer enter events          | customer/enter |
+| MOVE_TOPIC                 | topic for customer move events           |  customer/move |
+| EXIT_TOPIC                 | topic for customer exit events           |  customer/exit |
+
+(Parameters with `-` in the "Default" column are required.)
 
 Use [log_config.py](./app/utils/log_config.py) to **configure logging behaviour**.
 By default, console and file handlers are used. The file appender writes to `messages.log`.
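+
+A minimal `environment.variables.sh` sketch for local development; the values below are
+illustrative (assuming the local Mosquitto broker from [DEVELOPMENT.md](../DEVELOPMENT.md)),
+and only `MQTT_HOST` has no default:
+
+```shell
+export MQTT_HOST="127.0.0.1"
+export MQTT_PORT=1883
+export CUSTOMERS_LIST_FILE="customers.csv"
+```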
-
 ## Running the service
 
-For my development I created a project with dedicated virtual environment (Python 3.8, all the dependencies installed
+For development, I created a project with a dedicated virtual environment (Python 3.8, all the dependencies installed
 there).
 
 The code reads sensitive information (tokens, secrets) from environment variables. They need to be set accordingly in
@@ -64,34 +58,17 @@ advance.
 `environment.variables.sh` can be used for that purpose.
 
 Then, in order to run the service the following commands can be used:
 
-```
+```shell
 $ . .environment.variables.sh
 $ . venv/bin/activate
 (venv)$ uvicorn app.main:app --host 0.0.0.0 --reload --reload-dir app
 ```
+
 > Please, note `reload-dir` switch. Without it the reloader goes into an infinite loop because it detects log file
 changes (messages.log).
 
 ## Testing with MQTT broker in docker
 
-Quick way to **set up a simple MQTT broker** is to use Docker containers:
-```shell
-docker run -d --rm --name mosquitto -p 1883:1883 eclipse-mosquitto
-```
-or
-```shell
-docker run -it -p 1883:1883 --name mosquitto eclipse-mosquitto mosquitto -c /mosquitto-no-auth.conf
-```
-
-To **publish to a topic**:
-
-```shell
-docker exec mosquitto mosquitto_pub -h 127.0.0.1 -t test -m "test message"
-```
-
-To **subscribe to a topic**:
-```shell
-docker exec mosquitto mosquitto_sub -h 127.0.0.1 -t test
-```
+Check the [DEVELOPMENT tips](../DEVELOPMENT.md) to find out how to deploy and use an MQTT broker for development purposes.
 
 ### Testing without MQTT
 There is an environment variable, `TESTING_MOCK_MQTT`, that will create an MQTT client mock instead of trying to connect
@@ -103,16 +80,15 @@ This may be helpful for local development or testing.
 
 ```shell
 curl http://127.0.0.1:8000/produce_entry -d '{"id": "997", "ts": 192326400}'
- ```
+```
 
 ```shell
 curl http://127.0.0.1:8000/produce_exit -d '{"id": "997", "ts": 192326400}'
- ```
+```
 
 ```shell
 curl http://127.0.0.1:8000/produce_move -d '{"id": "997", "ts": 192326400, "x": 2, "y": 3}'
- ```
+```
 
 # Deployment
 
@@ -123,7 +99,8 @@ See https://github.com/tiangolo/uvicorn-gunicorn-fastapi-docker for the details
 on configuring the container (http port, log level, etc.)
 
 In order to build the image use:
-```
+
+```shell
 docker build -t customersim-service:0.0.1 .
 ```
 
@@ -131,9 +108,9 @@ docker build -t customersim-service:0.0.1 .
 > your needs.
 
 To run the service as a Docker container run:
-```
-docker run -d -e LOG_LEVEL="warning" --name customersim-service customersim-service:0.0.1
+```shell
+docker run -d -e LOG_LEVEL="warning" --name customersim-service customersim-service:0.0.1
 ```
 
 ## Connecting to a secured broker
diff --git a/training-with-artificial-data/README.md b/training-with-artificial-data/README.md
index 1ef875d..23d30e7 100644
--- a/training-with-artificial-data/README.md
+++ b/training-with-artificial-data/README.md
@@ -1,20 +1,20 @@
 # Data preparation and training - artificial data set
 
-
 This directory contains jupyter notebooks for preparing data and training a model for the coupon recommendation service.
 
-In order to run the Jupyter notebooks, original dataset needs to be present. Path to the directory needs to be specified at the top of each script.
+In order to run the Jupyter notebooks, the original dataset needs to be present. The path to the directory needs to be specified at the top of each script.
 
-* `01_data_prep.ipynb` - data preparation. This notebook contains data cleaning, merging, feature engineering and encoding. It results in an input dataset for training.
+* `01_data_prep.ipynb` - data preparation. This notebook contains data cleaning, merging, feature engineering, and encoding. It results in an input dataset for training.
 
 * `02_training_automl.ipynb` - Training using H2O AutoML.
 
-* `03_training.ipynb` - Training model using scikit-learn. Algorithm (GBM) and parameters are selected based on AutoML result. The notebook compares training on unbalanced and balanced dataset.
+* `03_training.ipynb` - Training a model using scikit-learn. The algorithm (GBM) and parameters are selected based on the AutoML result. The notebook compares training on an unbalanced and a balanced dataset.
 
-* `04_demo_data_selection.ipynb` - Using model trained in the previous notebook, select 'good' customer-coupon pairs, i.e. customers for whom there are many 'hit' coupons predicted, along with the 'hit' coupons. Data are saved in a `demo_data` directory. Details can be found in the notebook.
+* `04_demo_data_selection.ipynb` - Using the model trained in the previous notebook, select 'good' customer-coupon pairs, i.e. customers for whom there are many 'hit' coupons predicted, along with the 'hit' coupons. Data are saved in a `demo_data` directory. Details can be found in the notebook.
 
 In order to run the jupyter notebooks, use (specify `ip` and `port` according to your needs):
-```
+
+```shell
 python3 -m venv .venv
 source .venv/bin/activate
 pip install -r requirements.txt
@@ -24,18 +24,17 @@ jupyter notebook --ip 0.0.0.0 --port 8000 --no-browser
 
 The notebooks should be run in the order they are numbered.
 
-
 ## Docker images
 
 This repository contains a Dockerfile for building a docker image. To build it, use:
 
-```
+```shell
 docker build -t coupon-rec:0.0.1 . --build-arg DATA_DIR=
 ```
 
 To run it, use:
 
-```
+```shell
 docker run -it -p 0.0.0.0:8002:8000 coupon-rec:0.0.1
 ```
diff --git a/visualization-app/README.md b/visualization-app/README.md
index 592368e..61e77ed 100644
--- a/visualization-app/README.md
+++ b/visualization-app/README.md
@@ -23,6 +23,17 @@ There are also the entrance and exit points shown.
 
 ## Table of contents
 
+* [Functionality](#functionality)
+* [Usage](#usage)
+  * [Dependencies](#dependencies)
+  * [Service configuration](#service-configuration)
+  * [Running the service](#running-the-service)
+    * [Development](#development)
+    * [Production](#production)
+  * [App interfaces](#app-interfaces)
+  * [Using the UI](#using-the-ui)
+
 # Usage
 
 This is a web service (implemented with FastAPI). By default, it works on port 8000.
 (See instructions for details on configuring and running the service.)
@@ -46,18 +57,17 @@ This application assumes running MQTT broker.
 The service reads the following **environment variables**:
 
-| Variable | Description | Default |
-|-----------------------|---------------------------------------|--------------:|
-| CUSTOMERS_LIST_FILE | | app/resources/customers.json |
-| MQTT_HOST | | - |
-| MQTT_PORT | | 1883 |
-| MQTT_NAME | | demoVisClient |
-| ENTER_TOPIC | | customer/enter|
-| MOVE_TOPIC | | customer/move |
-| EXIT_TOPIC | | customer/exit |
+| Variable            | Description                          |                      Default |
+|---------------------|--------------------------------------|-----------------------------:|
+| CUSTOMERS_LIST_FILE | file with the list of customers      | app/resources/customers.json |
+| MQTT_HOST           | MQTT broker host                     |                            - |
+| MQTT_PORT           | MQTT broker port                     |                         1883 |
+| MQTT_NAME           | MQTT client name                     |                demoVisClient |
+| ENTER_TOPIC         | topic for customer enter events      |               customer/enter |
+| MOVE_TOPIC          | topic for customer move events       |                customer/move |
+| EXIT_TOPIC          | topic for customer exit events       |                customer/exit |
 | SCENARIO_PLAYER_SCENARIO_ENDPOINT | full address (ex: `http://localhost:8004/scenario`) to the scenario-player's `scenario` endpoint | - |
-
 (Parameters with `-` in "Default" column are required.)
 
 Use [log_config.py](./app/utils/log_config.py) to **configure logging behaviour**.
@@ -70,16 +80,17 @@ By default, console and file handlers are used. The file appender writes to `mes
 
 `environment.variables.sh` can be used for that purpose.
 
 Then, in order to run the service the following commands can be used:
-```
+
+```shell
 $ . .environment.variables.sh
 $ . venv/bin/activate
 (venv)$ uvicorn app.main:app --host 0.0.0.0 --reload --reload-dir app
 ```
+
 > Please, note `reload-dir` switch. Without it the reloader goes into an infinite loop because it detects log file
 changes (messages.log).
 
 ### Production
 
-// To be done
+The service is made to be run in an OpenShift environment. The deployment process is described in the [infra](../infra) directory.
 
 ## App interfaces