A bash script to build a SQLite database of all EXIF information in a directory, generate a heatmap of the photos/videos based on it, and more.
Dependencies: exiftool, jq, parallel, sqlite3
Usage:
./build_exif_dbv2.sh <directory_with_photos> <output_db_file>
The script will try to use all CPU cores and will display a progress bar with the name of the file currently being processed. It reads all files recursively; everything recognized by exiftool gets indexed.
If run again with the same arguments, it will skip files that were already processed, so you can resume an interrupted run and index only newly added files.
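Internally, the per-file indexing step boils down to something like the sketch below. This is only an illustration, not the actual script: the exif_data table with filename and exif_json columns matches the queries further down, while the example file name and the exact skip/insert logic are assumptions.

f="photo.jpg"                                       # in the real script, file names come from parallel
f_sql=$(printf '%s' "$f" | sed "s/'/''/g")          # escape single quotes for SQL
already=$(sqlite3 database.db "SELECT count(1) FROM exif_data WHERE filename = '${f_sql}';")
if [ "$already" -eq 0 ]; then
    json=$(exiftool -json "$f" | jq -c '.[0]')      # all EXIF tags for the file as one JSON object
    json_sql=$(printf '%s' "$json" | sed "s/'/''/g")
    sqlite3 database.db "INSERT INTO exif_data (filename, exif_json) VALUES ('${f_sql}', '${json_sql}');"
fi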
Once the database is generated, you can use
./map.sh [--photos-at-max-zoom] <db_file>
to generate a heatmap of all the files based on their GPS coordinates. The coordinates will be saved to map-data.js. To view them on a map, open map.html in any browser.
If the --photos-at-max-zoom option is used, the heatmap is replaced at maximum zoom with clickable pins that let you view each photo. For this to work, the files need to be accessible at the full paths stored in the database.
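map.sh reads those coordinates straight from the database. A rough sketch of the kind of query involved (GPSLatitude and GPSLongitude are the tag names exiftool normally emits; whether they come out as decimal numbers or formatted strings depends on how the EXIF was extracted):

sqlite3 database.db "SELECT filename,
    json_extract(exif_json, '$.GPSLatitude') AS lat,
    json_extract(exif_json, '$.GPSLongitude') AS lon
FROM exif_data
WHERE json_extract(exif_json, '$.GPSLatitude') IS NOT NULL;"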
Having a database allows you to search photos by EXIF information fairly quickly. For example, to find all photos and videos taken on Apple devices:
sqlite3 database.db "SELECT filename FROM exif_data WHERE json_extract(exif_json, '$.Make') = 'Apple';"
It also lets you pull interesting statistics on the photos. For example, here is how to count the files taken each year by device model, using the EXIF date and falling back to the file modification date when EXIF is not available:
sqlite3 database.db "SELECT substr(coalesce(json_extract(exif_json, '$.SubSecDateTimeOriginal'),\
json_extract(exif_json, '$.FileModifyDate')), 1, 4) AS Year,\
json_extract(exif_json, '$.Make') ||' ' || json_extract(exif_json, '$.Model') AS Model,\
count(1)\
FROM exif_data\
GROUP BY Year, Model\
ORDER BY Year, Model;"
Do not read from or write to the database while the script is running. Doing so might cause SQLite to error out, and some records will then fail to INSERT.