Documentation: https://armanckeser.github.io/youtube-history-analysis
Source Code: https://github.com/armanckeser/youtube-history-analysis
PyPI: https://pypi.org/project/youtube-history-analysis/
See how your YouTube interests evolved over time
python -m venv yt-history-venv
# On Windows: yt-history-venv\Scripts\activate.bat
# On macOS: source ./yt-history-venv/bin/activate
pip install youtube-history-analysis
- Visit the Google Cloud Console.
- Click the project drop-down and select or create the project for which you want to add an API key.
- Click the hamburger menu and select APIs & Services > Credentials.
- On the Credentials page, click Create credentials > API key.
- The API key created dialog displays your newly created API key.
Remember to restrict the API key so that it can only be used from certain websites or IP addresses: click the Edit button for the API key and set the restrictions in the Key restriction section.
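If you want to sanity-check the key before running the analysis, here is a minimal sketch (it assumes the google-api-python-client package, which is not part of this project's documented setup and may need to be installed separately):

from googleapiclient.discovery import build

# Build a YouTube Data API v3 client with the key you just created.
youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")

# Make one cheap request: fetch the snippet for a single, well-known video ID.
response = youtube.videos().list(part="snippet", id="dQw4w9WgXcQ").execute()
print(response["items"][0]["snippet"]["title"])

If the key is misconfigured or the YouTube Data API is not enabled for your project, this call fails with an HTTP error, which is easier to catch here than partway through the analysis.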
- Visit Google Takeout and sign in to your Google account.
- Scroll down to the "YouTube" section and click All data included.
- Click the Deselect All button and then select the checkbox next to Watch history.
- Click the Next button at the bottom of the page.
- On the next page, you can select the file type and delivery method for your takeout. Make sure to select JSON as the file type.
- Click the Create export button to start the export process.
Once the export is complete, you will receive an email with a link to download your takeout. The downloaded file will be a zip archive containing your YouTube watch history in JSON format.
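If you want a quick look at the export before running the tool, here is a minimal sketch using the standard library (the field names reflect the usual layout of Takeout's watch-history.json and may differ slightly in your export):

import json

# Load the watch history exported from Google Takeout.
with open("watch-history.json", encoding="utf-8") as f:
    history = json.load(f)

print(f"{len(history)} watch events")

# Each entry usually carries a video title, URL, and an ISO 8601 timestamp.
first = history[0]
print(first.get("title"), first.get("titleUrl"), first.get("time"))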
python -m youtube_history_analysis $API_KEY --watch-history-file-path $WATCH_HISTORY_JSON_PATH
This will create an outputs folder with a bunch of .csv files and a few .png files. Feel free to use the .csv files to do your own analysis.
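As a starting point for your own analysis, here is a minimal pandas sketch (the file name is hypothetical; check the outputs folder for the actual names the tool writes):

import pandas as pd

# Hypothetical file name, purely for illustration; list the outputs folder for the real ones.
df = pd.read_csv("outputs/watch_history.csv")
print(df.columns.tolist())
print(df.head())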
- Clone this repository
- Requirements:
- Poetry
- Python 3.9+
- Create a virtual environment and install the dependencies
poetry install
- Activate the virtual environment
poetry shell
pytest
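If you add functionality, new tests follow the usual pytest layout; a minimal placeholder sketch (the assertion is illustrative, not part of the real suite):

# tests/test_example.py
def test_placeholder():
    # Replace with assertions against the package's public API.
    assert 1 + 1 == 2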
The documentation is automatically generated from the content of the docs directory and from the docstrings of the public signatures of the source code. The documentation is updated and published as a GitHub project page automatically as part of each release.
Trigger the Draft release workflow (press Run workflow). This will update the changelog and version and create a GitHub release in Draft state.
Find the draft release under the GitHub releases and publish it. Publishing a release triggers the release workflow, which creates the PyPI release and deploys the updated documentation.
Pre-commit hooks run all the auto-formatters (e.g. black, isort), linters (e.g. mypy, flake8), and other quality checks to make sure the changeset is in good shape before a commit/push happens.
You can install the hooks with (runs for each commit):
pre-commit install
Or if you want them to run only for each push:
pre-commit install -t pre-push
Or if you e.g. want to run all checks manually for all files:
pre-commit run --all-files
This project was generated using the wolt-python-package-cookiecutter template.