v1.7.0 Merge #131

Merged 50 commits on May 6, 2019

Commits
d6c6363
stage 1.6.9
dirtycajunrice Apr 19, 2019
29d0158
fix uncaught influxdb connection error
dirtycajunrice Apr 19, 2019
63fa2f6
drop unnamed unifi devices. fixes #126
dirtycajunrice Apr 19, 2019
6527fa1
refactor ombi and add logging for bad requests so we can see data
dirtycajunrice Apr 19, 2019
96cc789
update sonarr error logging
dirtycajunrice Apr 19, 2019
74bc4e1
add a bit more logging
dirtycajunrice Apr 19, 2019
78beaa1
Add album and track totals to artist library from Tautulli #127
samwiseg0 Apr 20, 2019
5f18be6
make an attempt to reauth unifi + let users know they didnt RTFM
dirtycajunrice Apr 22, 2019
cb09ce8
reset check if successful
dirtycajunrice Apr 22, 2019
7b2f2c7
get sites and map for unifi even if its a description (alias)
dirtycajunrice Apr 24, 2019
d9967ff
Merge remote-tracking branch 'origin/develop' into develop
dirtycajunrice Apr 24, 2019
37bc5a2
grafana datasource + official dashboard import script
dirtycajunrice Apr 24, 2019
ced9dc1
Enable Debug by default for docker images
samwiseg0 Apr 24, 2019
82c6853
Enable debug by default and allow it to be disabled
samwiseg0 Apr 24, 2019
dc3704d
Add "Depreciated" to debug help
samwiseg0 Apr 24, 2019
073fc25
copy utilities folder
dirtycajunrice Apr 24, 2019
60d596e
fix attribute error
dirtycajunrice Apr 24, 2019
009cc27
move utilities copy
dirtycajunrice Apr 24, 2019
d006507
add container names + force grafana to wait on proper varken config
dirtycajunrice Apr 24, 2019
a63ae14
move session under do not edit
dirtycajunrice Apr 24, 2019
2d47fff
fix data -> json
dirtycajunrice Apr 24, 2019
de0aaed
pep8
dirtycajunrice Apr 24, 2019
0e3ff46
add regular venv to gitignore
dirtycajunrice Apr 24, 2019
a88c8de
add lidarr functionality
dirtycajunrice Apr 24, 2019
2decab3
add lidarr to example.ini
dirtycajunrice Apr 24, 2019
548e786
wip. broken
dirtycajunrice Apr 24, 2019
ba1c09c
fixed queue
dirtycajunrice Apr 24, 2019
f57d8c2
start with lidarr disabled
dirtycajunrice Apr 24, 2019
1409fe4
cleanup lines
dirtycajunrice Apr 24, 2019
00fa91b
fix dict check for lidarr
dirtycajunrice Apr 24, 2019
d2eae21
Ultra-threaded concurrency. For SCIENCE!
dirtycajunrice Apr 24, 2019
96ddd7a
sonarr refactor
dirtycajunrice Apr 24, 2019
33cc99f
pre-bump for version
dirtycajunrice Apr 24, 2019
aaedd31
Add to .gitignore
samwiseg0 Apr 25, 2019
b343d73
pep8
dirtycajunrice Apr 25, 2019
ec9c461
Partial update to build to get ready for lidarr to master
dirtycajunrice Apr 25, 2019
79f88ad
more updates
dirtycajunrice Apr 25, 2019
a64d7f0
update requirements.txt + desc
dirtycajunrice Apr 25, 2019
97c6756
fixes #129
dirtycajunrice Apr 29, 2019
41fc23e
change default to true + explain policy with verbosity
dirtycajunrice Apr 29, 2019
afd0b4d
pep8
dirtycajunrice Apr 29, 2019
9b5b8fa
only add if no db for now until tau historical built
dirtycajunrice Apr 29, 2019
1059a31
allow historical import of tautulli
dirtycajunrice Apr 29, 2019
5429792
force specific packages
dirtycajunrice Apr 29, 2019
4a3c627
Fix date compare for history gathering in Tautulli
samwiseg0 Apr 30, 2019
dffe5c6
change links to BookStack links
dirtycajunrice May 6, 2019
785f97b
Add lidarr to readme
dirtycajunrice May 6, 2019
00c66c3
reword tagline
dirtycajunrice May 6, 2019
11cd4de
add alts
dirtycajunrice May 6, 2019
21ad430
v1.7.0 Merge
dirtycajunrice May 6, 2019
3 changes: 2 additions & 1 deletion .gitignore
@@ -5,10 +5,11 @@
.Trashes
ehthumbs.db
Thumbs.db
__pycache__
GeoLite2-City.mmdb
GeoLite2-City.tar.gz
data/varken.ini
.idea/
varken-venv/
venv/
logs/
__pycache__
21 changes: 19 additions & 2 deletions CHANGELOG.md
@@ -1,7 +1,24 @@
# Change Log

## [v1.6.8](https://github.com/Boerderij/Varken/tree/v1.6.8) (2019-04-18)
[Full Changelog](https://github.com/Boerderij/Varken/compare/1.6.7...v1.6.8)
## [v1.7.0](https://github.com/Boerderij/Varken/tree/v1.7.0) (2019-05-05)
[Full Changelog](https://github.com/Boerderij/Varken/compare/1.6.8...v1.7.0)

**Implemented enhancements:**

- \[ENHANCEMENT\] Add album and track totals to artist library from Tautulli [\#127](https://github.com/Boerderij/Varken/issues/127)
- \[Feature Request\] No way to show music album / track count [\#125](https://github.com/Boerderij/Varken/issues/125)

**Fixed bugs:**

- \[BUG\] Invalid retention policy name causing retention policy creation failure [\#129](https://github.com/Boerderij/Varken/issues/129)
- \[BUG\] Unifi errors on unnamed devices [\#126](https://github.com/Boerderij/Varken/issues/126)

**Merged pull requests:**

- v1.7.0 Merge [\#131](https://github.com/Boerderij/Varken/pull/131) ([DirtyCajunRice](https://github.com/DirtyCajunRice))

## [1.6.8](https://github.com/Boerderij/Varken/tree/1.6.8) (2019-04-19)
[Full Changelog](https://github.com/Boerderij/Varken/compare/1.6.7...1.6.8)

**Implemented enhancements:**

4 changes: 3 additions & 1 deletion Dockerfile
@@ -2,7 +2,7 @@ FROM amd64/python:3.7.2-alpine

LABEL maintainers="dirtycajunrice,samwiseg0"

ENV DEBUG="False"
ENV DEBUG="True"

WORKDIR /app

@@ -12,6 +12,8 @@ COPY /varken /app/varken

COPY /data /app/data

COPY /utilities /app/data/utilities

RUN apk add --no-cache tzdata && \
python3 -m pip install -r /app/requirements.txt

2 changes: 1 addition & 1 deletion Dockerfile.arm
@@ -2,7 +2,7 @@ FROM arm32v6/python:3.7.2-alpine

LABEL maintainers="dirtycajunrice,samwiseg0"

ENV DEBUG="False"
ENV DEBUG="True"

WORKDIR /app

2 changes: 1 addition & 1 deletion Dockerfile.arm64
@@ -2,7 +2,7 @@ FROM arm64v8/python:3.7.2-alpine

LABEL maintainers="dirtycajunrice,samwiseg0"

ENV DEBUG="False"
ENV DEBUG="True"

WORKDIR /app

20 changes: 10 additions & 10 deletions README.md
@@ -1,5 +1,5 @@
<p align="center">
<img width="800" src="https://bin.cajun.pro/images/varken_full_banner.png">
<img width="800" src="https://bin.cajun.pro/images/varken_full_banner.png" alt="Logo Banner">
</p>

[![Build Status](https://jenkins.cajun.pro/buildStatus/icon?job=Varken/master)](https://jenkins.cajun.pro/job/Varken/job/master/)
@@ -11,19 +11,19 @@

Dutch for PIG. PIG is an Acronym for Plex/InfluxDB/Grafana

Varken is a standalone command-line utility to aggregate data
from the Plex ecosystem into InfluxDB. Examples use Grafana for a
frontend
Varken is a standalone application to aggregate data from the Plex
ecosystem into InfluxDB using Grafana for a frontend

Requirements:
* [Python 3.6.7+](https://www.python.org/downloads/release/python-367/)
* [Python3-pip](https://pip.pypa.io/en/stable/installing/)
* [InfluxDB](https://www.influxdata.com/)
* [Grafana](https://grafana.com/)

<p align="center">
Example Dashboard

<img width="800" src="https://i.imgur.com/3hNZTkC.png">
<img width="800" src="https://i.imgur.com/3hNZTkC.png" alt="dashboard">
</p>

Supported Modules:
@@ -33,6 +33,7 @@ Supported Modules:
* [Tautulli](https://tautulli.com/) - A Python based monitoring and tracking tool for Plex Media Server.
* [Ombi](https://ombi.io/) - Want a Movie or TV Show on Plex or Emby? Use Ombi!
* [Unifi](https://unifi-sdn.ubnt.com/) - The Global Leader in Managed Wi-Fi Systems
* [Lidarr](https://lidarr.audio/) - Looks and smells like Sonarr but made for music.

Key features:
* Multiple server support for all modules
@@ -41,21 +42,20 @@ Key features:


## Installation Guides
Varken Installation guides can be found in the [wiki](https://github.com/Boerderij/Varken/wiki/Installation).
Varken Installation guides can be found in the [wiki](https://wiki.cajun.pro/books/varken/chapter/installation).

## Support
Please read [Asking for Support](https://github.com/Boerderij/Varken/wiki/Asking-for-Support) before seeking support.
Please read [Asking for Support](https://wiki.cajun.pro/books/varken/chapter/asking-for-support) before seeking support.

[Click here for quick access to discord support](http://cyborg.decreator.dev/channels/518970285773422592/530424560504537105/). No app or account needed!

### InfluxDB
[InfluxDB Installation Documentation](https://docs.influxdata.com/influxdb/v1.7/introduction/installation/)
[InfluxDB Installation Documentation](https://wiki.cajun.pro/books/varken/page/influxdb-d1f)

Influxdb is required but not packaged as part of Varken. Varken will create
its database on its own. If you choose to give varken user permissions that
do not include database creation, please ensure you create an influx database
named `varken`
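For the case described above, where Varken's InfluxDB user lacks database-creation rights, the database can be created up front with the InfluxDB 1.x CLI. A sketch, assuming a default local install on port 8086; only the database name `varken` is prescribed by the note above:

```shell
# Pre-create the database Varken expects; the name must be exactly "varken".
influx -execute 'CREATE DATABASE varken'
```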

### Grafana
[Grafana Installation Documentation](http://docs.grafana.org/installation/)
Official dashboard installation instructions can be found in the [wiki](https://github.com/Boerderij/Varken/wiki/Installation#grafana)
[Grafana Installation/Dashboard Documentation](https://wiki.cajun.pro/books/varken/page/grafana)
67 changes: 41 additions & 26 deletions Varken.py
@@ -17,6 +17,7 @@
from varken import VERSION, BRANCH
from varken.sonarr import SonarrAPI
from varken.radarr import RadarrAPI
from varken.lidarr import LidarrAPI
from varken.iniparser import INIParser
from varken.dbmanager import DBManager
from varken.helpers import GeoIPHandler
@@ -28,13 +29,9 @@
PLATFORM_LINUX_DISTRO = ' '.join(x for x in linux_distribution() if x)


def thread():
while schedule.jobs:
job = QUEUE.get()
a = job()
if a is not None:
schedule.clear(a)
QUEUE.task_done()
def thread(job, **kwargs):
worker = Thread(target=job, kwargs=dict(**kwargs))
worker.start()


if __name__ == "__main__":
@@ -43,7 +40,8 @@ def thread():
formatter_class=RawTextHelpFormatter)

parser.add_argument("-d", "--data-folder", help='Define an alternate data folder location')
parser.add_argument("-D", "--debug", action='store_true', help='Use to enable DEBUG logging')
parser.add_argument("-D", "--debug", action='store_true', help='Use to enable DEBUG logging. (Depreciated)')
parser.add_argument("-ND", "--no_debug", action='store_true', help='Use to disable DEBUG logging')

opts = parser.parse_args()

@@ -72,10 +70,15 @@ def thread():
enable_opts = ['True', 'true', 'yes']
debug_opts = ['debug', 'Debug', 'DEBUG']

if not opts.debug:
opts.debug = True

if getenv('DEBUG') is not None:
opts.debug = True if any([getenv(string, False) for true in enable_opts
for string in debug_opts if getenv(string, False) == true]) else False

elif opts.no_debug:
opts.debug = False

# Initiate the logger
vl = VarkenLogger(data_folder=DATA_FOLDER, debug=opts.debug)
vl.logger.info('Starting Varken...')
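The precedence in the hunk above — debug now defaults to on, the `DEBUG` env var overrides, and `--no_debug` only applies when `DEBUG` is unset — can be condensed into a small helper. A sketch under assumptions: the function name is hypothetical, and only the canonical `DEBUG` spelling is checked here, whereas Varken's loop also accepts `debug`/`Debug`:

```python
from os import environ

# Truthy values Varken recognises for the env var (from the diff above)
ENABLE_OPTS = ['True', 'true', 'yes']

def resolve_debug(no_debug_flag):
    # Default flipped to True in v1.7.0
    debug = True
    if environ.get('DEBUG') is not None:
        # Env var wins: enabled only when its value is a recognised truthy string
        debug = environ.get('DEBUG') in ENABLE_OPTS
    elif no_debug_flag:
        # --no_debug is honoured only when DEBUG is unset
        debug = False
    return debug
```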
Expand All @@ -98,72 +101,84 @@ def thread():
SONARR = SonarrAPI(server, DBMANAGER)
if server.queue:
at_time = schedule.every(server.queue_run_seconds).seconds
at_time.do(QUEUE.put, SONARR.get_queue).tag("sonarr-{}-get_queue".format(server.id))
at_time.do(thread, SONARR.get_queue).tag("sonarr-{}-get_queue".format(server.id))
if server.missing_days > 0:
at_time = schedule.every(server.missing_days_run_seconds).seconds
at_time.do(QUEUE.put, SONARR.get_missing).tag("sonarr-{}-get_missing".format(server.id))
at_time.do(thread, SONARR.get_calendar, query="Missing").tag("sonarr-{}-get_missing".format(server.id))
if server.future_days > 0:
at_time = schedule.every(server.future_days_run_seconds).seconds
at_time.do(QUEUE.put, SONARR.get_future).tag("sonarr-{}-get_future".format(server.id))
at_time.do(thread, SONARR.get_calendar, query="Future").tag("sonarr-{}-get_future".format(server.id))

if CONFIG.tautulli_enabled:
GEOIPHANDLER = GeoIPHandler(DATA_FOLDER)
schedule.every(12).to(24).hours.do(QUEUE.put, GEOIPHANDLER.update)
schedule.every(12).to(24).hours.do(thread, GEOIPHANDLER.update)
for server in CONFIG.tautulli_servers:
TAUTULLI = TautulliAPI(server, DBMANAGER, GEOIPHANDLER)
if server.get_activity:
at_time = schedule.every(server.get_activity_run_seconds).seconds
at_time.do(QUEUE.put, TAUTULLI.get_activity).tag("tautulli-{}-get_activity".format(server.id))
at_time.do(thread, TAUTULLI.get_activity).tag("tautulli-{}-get_activity".format(server.id))
if server.get_stats:
at_time = schedule.every(server.get_stats_run_seconds).seconds
at_time.do(QUEUE.put, TAUTULLI.get_stats).tag("tautulli-{}-get_stats".format(server.id))
at_time.do(thread, TAUTULLI.get_stats).tag("tautulli-{}-get_stats".format(server.id))

if CONFIG.radarr_enabled:
for server in CONFIG.radarr_servers:
RADARR = RadarrAPI(server, DBMANAGER)
if server.get_missing:
at_time = schedule.every(server.get_missing_run_seconds).seconds
at_time.do(QUEUE.put, RADARR.get_missing).tag("radarr-{}-get_missing".format(server.id))
at_time.do(thread, RADARR.get_missing).tag("radarr-{}-get_missing".format(server.id))
if server.queue:
at_time = schedule.every(server.queue_run_seconds).seconds
at_time.do(QUEUE.put, RADARR.get_queue).tag("radarr-{}-get_queue".format(server.id))
at_time.do(thread, RADARR.get_queue).tag("radarr-{}-get_queue".format(server.id))

if CONFIG.lidarr_enabled:
for server in CONFIG.lidarr_servers:
LIDARR = LidarrAPI(server, DBMANAGER)
if server.queue:
at_time = schedule.every(server.queue_run_seconds).seconds
at_time.do(thread, LIDARR.get_queue).tag("lidarr-{}-get_queue".format(server.id))
if server.missing_days > 0:
at_time = schedule.every(server.missing_days_run_seconds).seconds
at_time.do(thread, LIDARR.get_calendar, query="Missing").tag(
"lidarr-{}-get_missing".format(server.id))
if server.future_days > 0:
at_time = schedule.every(server.future_days_run_seconds).seconds
at_time.do(thread, LIDARR.get_calendar, query="Future").tag("lidarr-{}-get_future".format(
server.id))

if CONFIG.ombi_enabled:
for server in CONFIG.ombi_servers:
OMBI = OmbiAPI(server, DBMANAGER)
if server.request_type_counts:
at_time = schedule.every(server.request_type_run_seconds).seconds
at_time.do(QUEUE.put, OMBI.get_request_counts).tag("ombi-{}-get_request_counts".format(server.id))
at_time.do(thread, OMBI.get_request_counts).tag("ombi-{}-get_request_counts".format(server.id))
if server.request_total_counts:
at_time = schedule.every(server.request_total_run_seconds).seconds
at_time.do(QUEUE.put, OMBI.get_all_requests).tag("ombi-{}-get_all_requests".format(server.id))
at_time.do(thread, OMBI.get_all_requests).tag("ombi-{}-get_all_requests".format(server.id))
if server.issue_status_counts:
at_time = schedule.every(server.issue_status_run_seconds).seconds
at_time.do(QUEUE.put, OMBI.get_issue_counts).tag("ombi-{}-get_issue_counts".format(server.id))
at_time.do(thread, OMBI.get_issue_counts).tag("ombi-{}-get_issue_counts".format(server.id))

if CONFIG.sickchill_enabled:
for server in CONFIG.sickchill_servers:
SICKCHILL = SickChillAPI(server, DBMANAGER)
if server.get_missing:
at_time = schedule.every(server.get_missing_run_seconds).seconds
at_time.do(QUEUE.put, SICKCHILL.get_missing).tag("sickchill-{}-get_missing".format(server.id))
at_time.do(thread, SICKCHILL.get_missing).tag("sickchill-{}-get_missing".format(server.id))

if CONFIG.unifi_enabled:
for server in CONFIG.unifi_servers:
UNIFI = UniFiAPI(server, DBMANAGER)
at_time = schedule.every(server.get_usg_stats_run_seconds).seconds
at_time.do(QUEUE.put, UNIFI.get_usg_stats).tag("unifi-{}-get_usg_stats".format(server.id))
at_time.do(thread, UNIFI.get_usg_stats).tag("unifi-{}-get_usg_stats".format(server.id))

# Run all on startup
SERVICES_ENABLED = [CONFIG.ombi_enabled, CONFIG.radarr_enabled, CONFIG.tautulli_enabled, CONFIG.unifi_enabled,
CONFIG.sonarr_enabled, CONFIG.sickchill_enabled]
CONFIG.sonarr_enabled, CONFIG.sickchill_enabled, CONFIG.lidarr_enabled]
if not [enabled for enabled in SERVICES_ENABLED if enabled]:
vl.logger.error("All services disabled. Exiting")
exit(1)

WORKER = Thread(target=thread)
WORKER.start()

schedule.run_all()

while schedule.jobs:
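The Varken.py diff above replaces the old single worker draining a shared `QUEUE` with a `thread()` helper that spawns one fire-and-forget `Thread` per scheduled job ("Ultra-threaded concurrency", per the commit). A minimal stdlib sketch of that pattern; the collector function here is a stand-in for methods like `SONARR.get_calendar`, and the helper returns the worker (the real one does not) purely so this example can join it:

```python
from threading import Thread

def thread(job, **kwargs):
    # Mirror of the new helper: run each scheduled job on its own thread
    # instead of funnelling everything through one queue-draining worker.
    worker = Thread(target=job, kwargs=dict(**kwargs))
    worker.start()
    return worker  # returned only so this sketch can wait on it

results = []

def get_calendar(query="Missing"):
    # Stand-in for a collector call such as SONARR.get_calendar(query="Future")
    results.append(query)

w = thread(get_calendar, query="Future")
w.join()
print(results)  # → ['Future']
```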
13 changes: 13 additions & 0 deletions data/varken.example.ini
@@ -1,6 +1,7 @@
[global]
sonarr_server_ids = 1,2
radarr_server_ids = 1,2
lidarr_server_ids = false
tautulli_server_ids = 1
ombi_server_ids = 1
sickchill_server_ids = false
@@ -69,6 +70,18 @@ queue_run_seconds = 300
get_missing = true
get_missing_run_seconds = 300

[lidarr-1]
url = lidarr1.domain.tld:8686
apikey = xxxxxxxxxxxxxxxx
ssl = false
verify_ssl = false
missing_days = 30
missing_days_run_seconds = 300
future_days = 30
future_days_run_seconds = 300
queue = true
queue_run_seconds = 300

[ombi-1]
url = ombi.domain.tld
apikey = xxxxxxxxxxxxxxxx
4 changes: 4 additions & 0 deletions docker-compose.yml
@@ -5,6 +5,7 @@ networks:
services:
influxdb:
hostname: influxdb
container_name: influxdb
image: influxdb
networks:
- internal
@@ -13,6 +14,7 @@ services:
restart: unless-stopped
varken:
hostname: varken
container_name: varken
image: boerderij/varken
networks:
- internal
@@ -27,6 +29,7 @@ services:
restart: unless-stopped
grafana:
hostname: grafana
container_name: grafana
image: grafana/grafana
networks:
- internal
@@ -41,4 +44,5 @@
- GF_INSTALL_PLUGINS=grafana-piechart-panel,grafana-worldmap-panel
depends_on:
- influxdb
- varken
restart: unless-stopped
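The docker-compose.yml changes above pin a `container_name` on each service and add `varken` to grafana's `depends_on`, matching the commit "add container names + force grafana to wait on proper varken config". A trimmed sketch of the resulting relationships (other keys from the full file omitted):

```yaml
services:
  influxdb:
    container_name: influxdb   # fixed name so other tooling can address it
  varken:
    container_name: varken
    depends_on:
      - influxdb               # varken needs its database first
  grafana:
    container_name: grafana
    depends_on:                # grafana now waits for both backends
      - influxdb
      - varken
```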
12 changes: 6 additions & 6 deletions requirements.txt
@@ -2,9 +2,9 @@
# Potential requirements.
# pip3 install -r requirements.txt
#---------------------------------------------------------
requests>=2.20.1
geoip2>=2.9.0
influxdb>=5.2.0
schedule>=0.5.0
distro>=1.3.0
urllib3>=1.22
requests==2.21
geoip2==2.9.0
influxdb==5.2.0
schedule==0.6.0
distro==1.4.0
urllib3==1.24.2