Vueify snatch-selection (#7345)
* Add history apiv2 handler.
* Add paginate_query helper, to provide a proper db pagination solution.

* Fix pagination for all history items.
Fix pagination for when a series_slug is provided.
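
The commit doesn't show the helper itself here, but the idea of `paginate_query` is plain page-based LIMIT/OFFSET pagination. A minimal sketch, assuming a string-based query and 1-indexed pages (the signature and defaults are guesses, not the real helper):

```python
def paginate_query(query, page=1, limit=25):
    """Append LIMIT/OFFSET to a SQL string for 1-indexed page-based results.

    Hypothetical sketch: the real paginate_query in this commit may operate on
    a query object rather than a raw string.
    """
    offset = (page - 1) * limit
    return '{0} LIMIT {1} OFFSET {2}'.format(query, limit, offset)


# Page 3 of the history, 25 rows per page, skips the first 50 rows.
print(paginate_query('SELECT * FROM history ORDER BY date DESC', page=3, limit=25))
```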

* Added store for history [WIP]

* Add history.js

* Use a data_generator for the history pagination.
* update history.js store, to get entire history.

* Add show-history.vue component.

* Add apiv2 providers endpoint.
* Return a list of providers in json.

* Add results path_param.
* Return results filtered by show, season or episode.

* Added snatch-selection.vue
* First concept version of an episode history component in snatchSelection.

* yarn dev

* Store last loaded show in recentShows.
Use the Vue beforeRouteEnter lifecycle.

* Add the visited show to the start.
Cut it off at 5.
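
The recentShows bookkeeping lives in the Vue store, but the logic is small enough to sketch; the function name and list-of-slugs shape are assumptions:

```python
def push_recent_show(recent_shows, slug, max_size=5):
    """Insert the visited show at the start (dropping any duplicate) and cap the list.

    Illustrative only: the commit does this in the frontend store (recentShows).
    """
    return ([slug] + [s for s in recent_shows if s != slug])[:max_size]
```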

* Fix getting history for episode when switching between episodes / shows.

* Added sending a SearchResult through webSockets to the client.
* generic_provider.py: Fix bug in to_json() for TorrentSearchResult.
* Fixed apiv2/providers.py, sending searchResults.

* Added new component to display show results (used by snatch-selection.vue).

* Added store and actions for provider.js.
* Added webSocket support for provider's search results.

* Updated snatch-selection.vue with components for history and results.

* Make sure I get the correct season/episode.

* Merge objects, but do not overwrite the dateAdded field.

* Added date searched / time field.

* Add webSocket support to the queueItems.
* Start webSocket messages when a queueItem starts and finishes.
We can use this to track searches, subtitle search, post-processing, etc.
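
The pattern the queue items follow can be distilled as: push a `QueueItemUpdate` snapshot when the item starts and another when it finishes. A toy sketch with a stand-in for `medusa.ws.Message` (both classes here are illustrative stand-ins, not the real ones):

```python
class Message:
    """Minimal stand-in for medusa.ws.Message, recording pushed events for illustration."""
    pushed = []

    def __init__(self, event, data):
        self.event = event
        self.data = data

    def push(self):
        Message.pushed.append((self.event, self.data))


class BacklogQueueItem:
    """Toy queue item showing the start/finish QueueItemUpdate pattern."""

    def __init__(self, show_slug):
        self.success = None
        self.in_progress = False
        self._to_json = {'name': 'BACKLOG-{0}'.format(show_slug), 'success': None}

    @property
    def to_json(self):
        # Refresh the mutable snapshot before every push.
        self._to_json.update({'inProgress': self.in_progress, 'success': self.success})
        return self._to_json

    def run(self):
        self.in_progress = True
        Message('QueueItemUpdate', dict(self.to_json)).push()  # announce start
        self.success = True  # ... the actual search work would happen here ...
        self.in_progress = False
        Message('QueueItemUpdate', dict(self.to_json)).push()  # announce finish


item = BacklogQueueItem('my-show')
item.run()
```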

* Refactored config related stores.

* use queueItems to update manual search messages.
* Added new store module search.js (which should be used for search related activities)

* Standardize the date format across apiv2 and websocket.
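
Concretely, the format used throughout this diff is ISO 8601 truncated to whole seconds; a small helper capturing it (the helper name is ours, the expression is the one used by the `to_json()` bodies in the diff):

```python
from datetime import datetime


def ws_datetime(dt):
    """Serialize a datetime for apiv2/WebSocket payloads: ISO 8601, second precision."""
    return dt.replace(microsecond=0).isoformat()
```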

* Add other vgt fields to show-results.vue

* Added some js string templating.

* Added quality to history result.

* Fixed api's for history route.

* Changed the key for the pickManualSearch function from rowid to identifier, as identifier is always available.

* Fixed bug in parsing the pubdate for a SearchResult object when calling its to_json().

* This one's also a rowid -> identifier change

* Set current tab (when changing tabs in home) should not send an apiv2 request.

* show-history.vue: finished component

* show-results.vue: finished component

* Fixed vue-good-table styling

* Also the setCurrentTab fix.

* updated history.js and provider.js stores.

* runtime bundles

* Add show to store, when added in backend.
Use websocket.

* Remove comments.
* Fix the routes in handler.py.
* Added a reminder TODO.

* whoops

* Return descriptive fields for the "resource" field.
* Map provider field to descriptive fields, based on specific action ids.

* Fix showing provider/release group in same column.

* Force search, when no initial results found.

* Fix searching when no cache present.

* Add episode title to snatchSelection (showHeader) component.

* Add row style to subtitled history item.
show-results: Fix Seeds/Peers

* Improve history table for subtitle rows

* Fix displaying xem icon in showHeader.

* Add column filters to show-history.vue and show-results.vue tables.

* Fix show-results.vue release name coloring.

* Add season pack support

* Fix switching between season and episode.

* Fix date parse errors

* Fix episode title showing for season pack search

* Better 404 error handling

* Fix double style attribute in history.mako

* Fix provider icons for failed.
* Replace language code with flag.

* Save show-results sorting in cookie.
* Moved vue-good-table sorting methods to manage-cookie.js mixin, to make it easier to re-use.

* Add onError fallback provider image

* Refactor store /config -> /config/general
* Replaced vanilla-lazyload with vue-lazy-images

* Fixed manage_backlogOverview.mako style for allowed.
* Some small bug fixes.

* Fix manage/mass-update

* Fixed show-results.

* Also fix the date console errors, for when new results come in through ws.

* Improve storing cookies for vue-good-table sorting.

* Use provider image, with provider name (tooltip) for show-history.vue and show-results.vue

* Fix clear warnings in sub-menus.js.

* Fixed home.vue smallposter and banner styling.

* Fix app-link.spec.js test.

* Fix save/loading home.vue layout by cookie.

* Refactor state.config -> state.config.general

* Fixed tests

* Enable series asset (poster, banner, fanart) cache-control.

* Wrote my "own" lazy-image component.
* Fixed jest tests

* Fixed typo

* my system trippin

* Fixed lazy-load component.

* Improve speed switching layouts.

* Fix duplicate class error.

* Add jest tests for show-history and show-results

* Fix lint errors

* Status downloaded should not display a provider icon.

* Fix flake warnings.
* Unused imports.

* Last of the flake warnings

* Fix show-history.spec.js

* rebundle
build lock.
p0psicles authored Jul 16, 2020
1 parent 7ecf25c commit ce876a6
Showing 177 changed files with 10,392 additions and 4,916 deletions.
37 changes: 34 additions & 3 deletions medusa/classes.py
@@ -23,7 +23,7 @@

from dateutil import parser

from medusa import app
from medusa import app, ws
from medusa.common import (
MULTI_EP_RESULT,
Quality,
@@ -32,7 +32,6 @@
from medusa.logger.adapters.style import BraceAdapter
from medusa.search import SearchType


from six import itervalues

log = BraceAdapter(logging.getLogger(__name__))
@@ -204,6 +203,34 @@ def __repr__(self):

return '<{0}: {1}>'.format(type(self).__name__, result)

def to_json(self):
"""Return JSON representation."""
return {
'identifier': self.identifier,
'release': self.name,
'season': self.actual_season,
'episodes': self.actual_episodes,
'seasonPack': len(self.actual_episodes) == 0,
'indexer': self.series.indexer,
'seriesId': self.series.series_id,
'showSlug': self.series.identifier.slug,
'url': self.url,
'time': datetime.now().replace(microsecond=0).isoformat(),
'quality': self.quality,
'releaseGroup': self.release_group,
'dateAdded': datetime.now().replace(microsecond=0).isoformat(),
'version': self.version,
'seeders': self.seeders,
'size': self.size,
'leechers': self.leechers,
'pubdate': self.pubdate.replace(microsecond=0).isoformat() if self.pubdate else None,
'provider': {
'id': self.provider.get_id(),
'name': self.provider.name,
'imageName': self.provider.image_name()
}
}

def file_name(self):
return u'{0}.{1}'.format(self.episodes[0].pretty_name(), self.result_type)

@@ -213,6 +240,10 @@ def add_result_to_cache(self, cache):
# FIXME: Added repr parsing, as that prevents the logger from throwing an exception.
# This can happen when there are unicode decoded chars in the release name.
log.debug('Adding item from search to cache: {release_name!r}', release_name=self.name)

# Push an update to any open Web UIs through the WebSocket
ws.Message('addManualSearchResult', self.to_json()).push()

return cache.add_cache_entry(self, parsed_result=self.parsed_result)

def _create_episode_objects(self):
@@ -279,7 +310,7 @@ def update_from_db(self):
self.leechers = int(cached_result['leechers'])
self.release_group = cached_result['release_group']
self.version = int(cached_result['version'])
self.pubdate = cached_result['pubdate']
self.pubdate = parser.parse(cached_result['pubdate']) if cached_result['pubdate'] else None
self.proper_tags = cached_result['proper_tags'].split('|') \
if cached_result['proper_tags'] else []
self.date = datetime.today()
29 changes: 25 additions & 4 deletions medusa/generic_queue.py
@@ -2,11 +2,13 @@

from __future__ import unicode_literals

import datetime
import logging
import threading
from builtins import object
from datetime import datetime
from functools import cmp_to_key
from uuid import uuid4


log = logging.getLogger()

@@ -46,7 +48,7 @@ def add_item(self, item):
:return: item
"""
with self.lock:
item.added = datetime.datetime.now()
item.added = datetime.utcnow()
self.queue.append(item)

return item
@@ -111,19 +113,38 @@ def __init__(self, name, action_id=0):
self.action_id = action_id
self.stop = threading.Event()
self.added = None
self.queue_time = datetime.datetime.now()
self.queue_time = datetime.utcnow()
self.start_time = None
self._to_json = {
'identifier': str(uuid4()),
'name': self.name,
'priority': self.priority,
'actionId': self.action_id,
'queueTime': str(self.queue_time),
'success': None
}

def run(self):
"""Implementing classes should call this."""
self.inProgress = True
self.start_time = datetime.datetime.now()
self.start_time = datetime.utcnow()

def finish(self):
"""Implementing Classes should call this."""
self.inProgress = False
threading.currentThread().name = self.name

@property
def to_json(self):
"""Update queue item JSON representation."""
self._to_json.update({
'inProgress': self.inProgress,
'startTime': str(self.start_time) if self.start_time else None,
'updateTime': str(datetime.utcnow()),
'success': self.success
})
return self._to_json


def fifo(my_list, item, max_size=100):
"""Append item to queue and limit it to 100 items."""
1 change: 1 addition & 0 deletions medusa/indexers/config.py
@@ -146,6 +146,7 @@ def create_config_json(indexer):


def get_indexer_config():
"""Create a per indexer and main indexer config, used by the apiv2."""
indexers = {
indexerConfig[indexer]['identifier']: create_config_json(indexerConfig[indexer]) for indexer in indexerConfig
}
46 changes: 46 additions & 0 deletions medusa/providers/generic_provider.py
@@ -848,3 +848,49 @@
def __unicode__(self):
"""Return provider name and provider type."""
return '{provider_name} ({provider_type})'.format(provider_name=self.name, provider_type=self.provider_type)

def to_json(self):
"""Return a JSON representation of the provider."""
from medusa.providers.torrent.torrent_provider import TorrentProvider
return {
'name': self.name,
'id': self.get_id(),
'config': {
'enabled': self.enabled,
'search': {
'backlog': {
'enabled': self.enable_backlog
},
'manual': {
'enabled': self.enable_manual_search
},
'daily': {
'enabled': self.enable_daily,
'maxRecentItems': self.max_recent_items,
'stopAt': self.stop_at
},
'fallback': self.search_fallback,
'mode': self.search_mode,
'separator': self.search_separator,
'seasonTemplates': self.season_templates,
'delay': {
'enabled': self.enable_search_delay,
'duration': self.search_delay
}
}
},
'animeOnly': self.anime_only,
'type': self.provider_type,
'public': self.public,
'btCacheUrls': self.bt_cache_urls if isinstance(self, TorrentProvider) else [],
'properStrings': self.proper_strings,
'headers': self.headers,
'supportsAbsoluteNumbering': self.supports_absolute_numbering,
'supportsBacklog': self.supports_backlog,
'url': self.url,
'urls': self.urls,
'cookies': {
'enabled': self.enable_cookies,
'required': self.cookies
}
}
69 changes: 68 additions & 1 deletion medusa/search/queue.py
@@ -9,7 +9,7 @@
import time
import traceback

from medusa import app, common, failed_history, generic_queue, history, ui
from medusa import app, common, failed_history, generic_queue, history, ui, ws
from medusa.helpers import pretty_file_size
from medusa.logger.adapters.style import BraceAdapter
from medusa.search import BACKLOG_SEARCH, DAILY_SEARCH, FAILED_SEARCH, MANUAL_SEARCH, SNATCH_RESULT, SearchType
@@ -256,13 +256,22 @@ def __init__(self, scheduler_start_time, force):
self.scheduler_start_time = scheduler_start_time
self.force = force

self.to_json.update({
'success': self.success,
'force': self.force
})

def run(self):
"""Run daily search thread."""
generic_queue.QueueItem.run(self)
self.started = True

try:
log.info('Beginning daily search for new episodes')

# Push an update to any open Web UIs through the WebSocket
ws.Message('QueueItemUpdate', self.to_json).push()

found_results = search_for_needed_episodes(self.scheduler_start_time, force=self.force)

if not found_results:
@@ -315,6 +324,9 @@ def run(self):
if self.success is None:
self.success = False

# Push an update to any open Web UIs through the WebSocket
ws.Message('QueueItemUpdate', self.to_json).push()

self.finish()


@@ -345,6 +357,13 @@ def __init__(self, show, segment, manual_search_type='episode'):
self.segment = segment
self.manual_search_type = manual_search_type

self.to_json.update({
'show': self.show.to_json(),
'segment': [ep.to_json() for ep in self.segment],
'success': self.success,
'manualSearchType': self.manual_search_type
})

def run(self):
"""Run manual search thread."""
generic_queue.QueueItem.run(self)
@@ -359,6 +378,9 @@ def run(self):
}
)

# Push an update to any open Web UIs through the WebSocket
ws.Message('QueueItemUpdate', self.to_json).push()

search_result = search_providers(self.show, self.segment, forced_search=True, down_cur_quality=True,
manual_search=True, manual_search_type=self.manual_search_type)

@@ -396,6 +418,10 @@ def run(self):
if self.success is None:
self.success = False

# Push an update to any open Web UIs through the WebSocket
msg = ws.Message('QueueItemUpdate', self.to_json)
msg.push()

self.finish()


@@ -423,6 +449,13 @@ def __init__(self, show, segment, search_result):
self.results = None
self.search_result = search_result

self.to_json.update({
'show': self.show.to_json(),
'segment': [ep.to_json() for ep in self.segment],
'success': self.success,
'searchResult': self.search_result.to_json()
})

def run(self):
"""Run manual snatch job."""
generic_queue.QueueItem.run(self)
@@ -434,6 +467,10 @@ def run(self):
log.info('Beginning to snatch release: {name}',
{'name': result.name})

# Push an update to any open Web UIs through the WebSocket
msg = ws.Message('QueueItemUpdate', self.to_json)
msg.push()

if result:
if result.seeders not in (-1, None) and result.leechers not in (-1, None):
log.info(
@@ -473,6 +510,10 @@ def run(self):
if self.success is None:
self.success = False

# Push an update to any open Web UIs through the WebSocket
msg = ws.Message('QueueItemUpdate', self.to_json)
msg.push()

self.finish()


@@ -491,6 +532,12 @@ def __init__(self, show, segment):
self.show = show
self.segment = segment

self.to_json.update({
'show': self.show.to_json(),
'segment': [ep.to_json() for ep in self.segment],
'success': self.success
})

def run(self):
"""Run backlog search thread."""
generic_queue.QueueItem.run(self)
@@ -500,6 +547,10 @@ def run(self):
try:
log.info('Beginning backlog search for: {name}',
{'name': self.show.name})

# Push an update to any open Web UIs through the WebSocket
ws.Message('QueueItemUpdate', self.to_json).push()

search_result = search_providers(self.show, self.segment)

if search_result:
@@ -557,6 +608,9 @@ def run(self):
if self.success is None:
self.success = False

# Push an update to any open Web UIs through the WebSocket
ws.Message('QueueItemUpdate', self.to_json).push()

self.finish()


@@ -576,11 +630,21 @@ def __init__(self, show, segment, down_cur_quality=False):
self.segment = segment
self.down_cur_quality = down_cur_quality

self.to_json.update({
'show': self.show.to_json(),
'segment': [ep.to_json() for ep in self.segment],
'success': self.success,
'downloadCurrentQuality': self.down_cur_quality
})

def run(self):
"""Run failed thread."""
generic_queue.QueueItem.run(self)
self.started = True

# Push an update to any open Web UIs through the WebSocket
ws.Message('QueueItemUpdate', self.to_json).push()

try:
for ep_obj in self.segment:

@@ -657,6 +721,9 @@ def run(self):
if self.success is None:
self.success = False

# Push an update to any open Web UIs through the WebSocket
ws.Message('QueueItemUpdate', self.to_json).push()

self.finish()

