Copy data from InfluxDB to MySQL for monthly charts and Time of Use pricing #82
Replies: 12 comments 15 replies
-
Based on the discussion in #113, I'm adding to this post to create a step-by-step set of instructions for adding MySQL to Powerwall Dashboard, including the monthly charts and Time of Use pricing.

Part 1 - Install and configure MySQL

Create the kwh table in MySQL:

```
CREATE TABLE `kwh` (
  `from_grid` double DEFAULT NULL,
  `from_pw` double DEFAULT NULL,
  `home` double DEFAULT NULL,
  `month` text DEFAULT NULL,
  `solar` double DEFAULT NULL,
  `to_grid` double DEFAULT NULL,
  `to_pw` double DEFAULT NULL,
  `year` text DEFAULT NULL,
  `datetime` datetime NOT NULL,
  `datetime_local` timestamp NULL DEFAULT NULL,
  `cost_solar` decimal(15,10) DEFAULT NULL,
  `cost_grid` decimal(15,10) DEFAULT NULL,
  `credit_grid` decimal(15,10) DEFAULT NULL,
  PRIMARY KEY (`datetime`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;
```

Part 2 - Copy data from InfluxDB to MySQL
EDIT 2023-03-13: The original Python script automatically created the kwh table in MySQL. This caused issues with DST, as the datetime field was created as a 'timestamp' type rather than a 'datetime' type. These instructions have been updated to create the table manually (above), and the scripts below have been updated so the initial load does not recreate the table.

*[Collapsed: Python script]*
Set up this Python script on a schedule (I use crontab -e on my Pi) and run it every 5 minutes. Set all the same parameters as in the first script. The only difference between the scripts is the SELECT statement from InfluxDB: the repeating script only selects the last two days' data (you can alter this as needed).

*[Collapsed: Python script]*
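For orientation, here is a rough sketch of the shape such a repeating transfer script takes, based on the description above. The hostnames, credentials, database names, and the `build_kwh_query` helper are all placeholder assumptions, not the actual script:

```python
from datetime import datetime, timedelta, timezone

def build_kwh_query(days=2):
    """Build the InfluxQL SELECT for the repeating transfer.

    Only the last `days` of hourly kwh rows are pulled each run;
    everything older is already in MySQL.
    """
    start = datetime.now(timezone.utc) - timedelta(days=days)
    return f"SELECT * FROM kwh WHERE time >= '{start.strftime('%Y-%m-%dT%H:%M:%SZ')}'"

def transfer(days=2):
    # Third-party imports are kept inside the function so the query
    # helper above stays usable without these packages installed.
    from influxdb import DataFrameClient
    from sqlalchemy import create_engine

    # Placeholder connection details -- adjust for your setup.
    influx = DataFrameClient(host="localhost", port=8086, database="powerwall")
    engine = create_engine("mysql+pymysql://user:password@localhost/powerwall_mysql")

    result = influx.query(build_kwh_query(days))  # {measurement: DataFrame}
    df = result["kwh"]
    df.index.name = "datetime"

    # Append-only load; the PRIMARY KEY on datetime is what prevents
    # overlapping pulls from silently creating duplicate rows (a method=
    # callable doing INSERT ... ON DUPLICATE KEY UPDATE is one way to
    # handle the resulting key collisions).
    df.reset_index().to_sql("kwh", con=engine, if_exists="append", index=False)

    # Close pooled connections so MySQL doesn't log aborted connections.
    engine.dispose()
```

A matching crontab entry would look something like `*/5 * * * * /usr/bin/python3 /home/pi/influx_to_mysql.py` (the path is an example).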
Part 3 - Set up Time of Use cost calculations
*[Collapsed: MySQL query]*

*[Collapsed: MySQL query]*

*[Collapsed: MySQL queries]*
Part 4 - Add to Grafana
Final thought: here is the JSON for just the two TOU cost panels (stat and bar chart). You should be able to import this into your Grafana and then copy the panels from it to your other dashboards. For a non-PPA setup you'll want to modify the queries, since the solar would be counted as 'savings' and the total cost is only 'cost_grid' minus 'credit_grid'.

*[Collapsed: Grafana JSON]*
-
I'm following along at home, running Windows 11 with WSL (Ubuntu 22.04.1 LTS, upgraded in place from 20.x) and Docker Desktop 4.13.1 (upgraded from 4.13). In this scenario Docker Desktop runs inside the Windows Subsystem for Linux, so the setup for the Powerwall Dashboard happens at a Linux command line, which sets up Docker images in another Linux VM; historical imports, and this MySQL setup, happen at that same WSL command line. I googled "mysql docker image x64" and hit https://hub.docker.com/r/mysql/mysql-server, which had the install instruction I followed.
This might be semi-platform independent, of sorts. I followed up with
Got to the import phase. I had to figure out how to run Python interactively (hadn't really done that before, just scripts), and ran into an error on the create_engine step.
That did get me a step further, but I got the same error on the df.to_sql step. It looks like even though the grant was supposedly for all hosts, I needed to set up a specific user for the host where I was running the Python import (the WSL host hosting Docker) to fix it.
So that's where I'll stop for the moment: notes on installation on a different platform, and on connecting to the newly installed instance to run the install. I need to think about next steps now, as I have a different pricing structure to @youzer-name, but there's enough here to get me going, and the import steps and instructions were enough to get me up and running with a populated hourly database. Also, I'll tag @jasonacox to consider whether the Linux builds of MySQL discussed at https://hub.docker.com/r/mysql/mysql-server are generic enough to pull into a semi-core build that includes MySQL for at least arm and x64.
-
Another comment: again, this might be a MySQL version issue, but I needed to modify the trigger syntax to get it to work, wrapping it in a BEGIN ... END block with a DELIMITER change.
-
@BJReplay - Nice! Your solution for the daily charge seems like the simplest way to include that. I guess if you brought over the raw data you could try to match the half-hour calculations, but it's not worth the effort if the numbers you're getting are working for you. I'm definitely happy with these numbers being close to reality, even if they will never be a perfect match.

If we want to make the setup for this more universal, it might make sense to have every possible column in the TOU table - solar_cost, solar_savings, daily_charge, etc. - and just leave the ones that don't apply blank. If a few more people set this up we may find a few more columns are needed to cover different scenarios.
-
Here's my state of play for what I expect my billing period to be (October 31 to November 29 inclusive). It will be a little bit out because of 30-minute net rules, but this should be very close. For anyone playing along at home, there was a VPP test dispatch on the 7th of November.
-
I noticed some errors in the MySQL server logs caused by not closing the connection each time the Python script ran to transfer data. I added one line to the Python script and it seems to have fixed the errors. I'll edit the original posts in case someone doesn't read down to the bottom, but the additional line that needs to be at the end of the script is: `engine.dispose()`
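For anyone skimming, a minimal sketch of where that line goes. An in-memory SQLite engine stands in for the real mysql+pymysql connection here; the URL and query are placeholders, not the actual script:

```python
from sqlalchemy import create_engine, text

# Stand-in engine; in the real script this would be the
# mysql+pymysql://user:password@host/powerwall_mysql URL.
engine = create_engine("sqlite://")

try:
    with engine.connect() as conn:
        # ... the script's real work (df.to_sql, etc.) goes here ...
        value = conn.execute(text("SELECT 1")).scalar()
finally:
    # Return all pooled connections so the server doesn't log
    # aborted-connection errors between scheduled runs.
    engine.dispose()
```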
-
After the time change for Daylight Saving in the US earlier today, I noticed the copy from InfluxDB to MySQL was failing. After tracing the issue, there is an error in the process related to the setup of the MySQL table. The "datetime" column in MySQL was created as a 'timestamp' column, but that type is timezone-aware, which isn't good when trying to store UTC time. It may not be an issue if the MySQL server is set to UTC, but mine is set to America/New_York, and the datetime column went wrong when the clocks moved.

The solution is to change that column from 'timestamp' to 'datetime'. The easiest way to fix existing data is probably to truncate the table, alter the column to datetime, and re-import everything. In my case I went through a more complicated process of fixing it in place, since I recently lost my InfluxDB database and had to use the history tool to recreate all the data, so the existing data in my MySQL kwh table should be slightly more accurate than data re-imported by the history tool. I edited the posts above with the key updates to fix this issue.
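The failure mode is easy to reproduce in plain Python: during the fall-back hour, two different UTC instants land on the same local wall-clock time, so a column that converts through the session time zone can't round-trip UTC faithfully.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

NY = ZoneInfo("America/New_York")

# Two different UTC instants, one hour apart, straddling the
# 2023-11-05 US fall-back (02:00 EDT -> 01:00 EST).
utc_a = datetime(2023, 11, 5, 5, 30, tzinfo=timezone.utc)  # 01:30 EDT
utc_b = datetime(2023, 11, 5, 6, 30, tzinfo=timezone.utc)  # 01:30 EST

# Converted to local wall-clock time and stripped of the offset
# (roughly what a timezone-converting column exposes per session
# time zone), both collapse to the same value.
local_a = utc_a.astimezone(NY).replace(tzinfo=None)
local_b = utc_b.astimezone(NY).replace(tzinfo=None)
print(local_a == local_b)  # two UTC instants, one local wall-clock time
```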
@BJReplay, can you update the information in the "Tools" part of the project?
-
The first thing that comes to mind when I try to understand your volume tariff challenge is that the solution will involve building views in MySQL to break the problem down into smaller pieces and then adding the results of the views together to get total cost.
-
This is going back to a post from November 2022, and I don't know how I didn't notice this before, but I just spent about an hour trying to figure out why the last day of each TOU period didn't have any cost/credit numbers after midnight. The end_date comparison in each query needs to refer to DATE(datetime_local), not to datetime_local (which includes the time). For example, for today, 2024-03-30, at 05:00 local, it needs to check that the end_date of the period is less than or equal to 2024-03-30, NOT 2024-03-30 05:00:00. I had null TOU cost values on the last day of the period, for every row after midnight, every month, until I fixed the triggers to compare against DATE(datetime_local) and re-ran the update to recalculate the affected rows.
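The comparison problem itself can be reproduced in plain Python, with dates standing in for the MySQL DATE/DATETIME types (values from the example above):

```python
from datetime import datetime, date

end_date = date(2024, 3, 30)          # last day of the TOU period
row_time = datetime(2024, 3, 30, 5)   # 2024-03-30 05:00 local

# Comparing the DATE against the full DATETIME (MySQL promotes the
# DATE to midnight) excludes every row after 00:00 on the last day:
print(datetime.combine(end_date, datetime.min.time()) >= row_time)  # False

# Comparing date-to-date includes the whole last day:
print(end_date >= row_time.date())  # True
```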
-
I wonder if this is because you're in a GMT-x timezone, while I'm in a GMT+ timezone and not seeing the same issue? Today (11:53am, 31 March, GMT+11), when I run the query, it looks like I'm getting data calculated right up to date. A quirk of being ahead of GMT?
-
Ahh, OK, that's it. Not strictly speaking (some have end dates in the past now), but when active they've always had an end date in the future, and I've always changed to a new retailer before hitting the end date of a ToU period. I don't have seasons, so I don't have a particularly complex plan.

I'll update the triggers anyway, just in case. Thanks for mentioning me in the comment.
-
Edit: For step-by-step instructions to set up MySQL, skip to the second post in this discussion; some of the details in this post are out of date.
While trying to create a chart that would show metrics (solar production, for example) by month of the year, I ran into a number of limitations of InfluxDB and Grafana that made it impossible to create such a chart. I thought that I would have better luck if I could get my data into MySQL or another SQL-type database.
Specifically, there is no 'datepart(datefield)' type function in InfluxQL, so it isn't possible to get just the month part of a date to use in a chart. The downsampled data in kwh has a month field, but that field is alpha (Jan, Feb) which results in the months being sorted alphabetically on the chart rather than in order from January to December.
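The sorting quirk is easy to see in plain Python, using the standard library's English month abbreviations (matching the Jan/Feb values stored in the downsampled data):

```python
import calendar

# Month abbreviations in calendar order: Jan .. Dec
months = [calendar.month_abbr[m] for m in range(1, 13)]

# An alphabetical sort (what the chart does with a text field)
# puts Apr, Aug, Dec first instead of Jan, Feb, Mar.
print(sorted(months)[:3])
```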
I tried using the Flux query language instead of the InfluxQL language, as it is supposed to support operations similar to 'datepart', but didn't have any luck, although someone who is fluent in Flux might be able to achieve the same result without resorting to copying the data.
In my case I have the Powerwall Dashboard stack running under Docker on a Raspberry Pi 4. The first thing I needed was a MySQL server. I initially thought about installing MariaDB on the 32-bit ARM OS on the RPi, but it was easier for me to set up MySQL on a Windows box under Docker Desktop. I'm not sure if using MariaDB on a RPi would cause any issues with this solution. Once it was running, I set the root user password and created a database schema called powerwall_mysql.
[ Edit: I later set this up on the RPi Docker host as noted below. It just required a small change to the query. ]
The next step was to get the kwh data from InfluxDB to MySQL. The kwh data is downsampled to hourly totals for each metric, and I didn't see a need to have more detailed data than that. You could also use the daily downsampled data. I ended up pulling both tables and the script below will export both.
I found a YouTube video that showed a Python script to copy InfluxDB data to MySQL. With a little bit of modification I was able to make that script do what I needed. I had to install several Python modules including sqlalchemy, pymysql, pandas, and influxdb to get the script to run.
There are two scripts below. One is for the initial data pull; it will create the necessary tables in MySQL, overwriting any tables that already exist with the same name.
Scripts
[ Edit: The script has a line that can be uncommented to create a local_datetime column. I recommend doing that, as I later ran into a situation where the local time was needed; see the discussion on Time of Use rates in #113. ]
Once I had the data in MySQL I created a new datasource in Grafana that connected to the MySQL server. Using that datasource I created a new bar chart with this query:
The "any_value" operator was a new one for me. This was to work around the fact that I didn't want to group by the datetime, but MySQL usually requires each metric in the select statement to be included in the group by statement. This could also be solved by disabling the "only_full_group_by" setting in the MySQL configuration. There is some discussion of this here: https://dev.mysql.com/doc/refman/8.0/en/group-by-handling.html
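As an aside, the same constraint can be mimicked in pandas, which makes the role of any_value clearer: every non-grouped column needs some aggregate, even when each group has only one meaningful value. (The frame below is made-up sample data, not the dashboard schema.)

```python
import pandas as pd

df = pd.DataFrame({
    "datetime": pd.to_datetime(["2023-01-05", "2023-01-20", "2023-02-10"]),
    "home": [10.0, 12.0, 9.0],
})
df["month"] = df["datetime"].dt.month

# As in MySQL's only_full_group_by mode, every selected non-grouped
# column gets an explicit aggregate; 'min' on datetime plays the
# role of any_value() in the Grafana query.
out = df.groupby("month").agg(time=("datetime", "min"), home=("home", "sum"))
print(out)
```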
[ Edit: I ended up moving the MySQL from a Windows Docker Desktop container to my Raspberry Pi (which is running ARM32 V7l), using https://github.com/yobasystems/alpine-mariadb. This didn't accept the "any_value" query. A simple workaround was to replace "any_value" with "min" (or "max" or any other aggregate function) so it now reads: SELECT min(datetime) as 'time', ]
I then created charts for 'Home Usage', 'From Powerwall', and 'From Grid', set the colors to match the existing daily (past month) charts, and set a soft max on each chart so they would all be on the same scale. I don't have a full year's data yet as I only got my Powerwall at the end of February, so the end result looks like this:
The queries can be set up with either the kwh or the daily data. If you use the daily data, it won't include the partial sums from today, but since these are monthly charts, the one that includes today is always a partial month anyway, so either way probably works.
The last steps are to add unique keys/constraints to the MySQL tables so that duplicate data can't be created when updating the data, and to schedule the python script to run (via cron or task scheduler).
For the kwh table, a primary key needs to be created on the datetime column. For the daily table, a unique index is needed on the datetime and month fields. This is to deal with data that was logged prior to the bugfix in #80. If you don't have data from before that fix, you can just create a primary key on datetime.
In MySQL:

*[Collapsed: MySQL query]*

and

*[Collapsed: MySQL query]*
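To see why the key matters when the transfer script re-runs over overlapping windows, here is the dedup behavior sketched with Python's built-in sqlite3 standing in for MySQL (INSERT OR IGNORE playing the role of MySQL's INSERT IGNORE):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE kwh (datetime TEXT PRIMARY KEY, home REAL)")

rows = [("2023-01-01 00:00:00", 1.2), ("2023-01-01 01:00:00", 1.5)]
conn.executemany("INSERT OR IGNORE INTO kwh VALUES (?, ?)", rows)

# Re-running the transfer over an overlapping window: the primary key
# on datetime stops the duplicates from being inserted again.
conn.executemany("INSERT OR IGNORE INTO kwh VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM kwh").fetchone()[0]
print(count)  # still 2 rows after the second pass
```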
Finally, using cron or task scheduler, run this script to update the data periodically. I am running it every 5 minutes.
*[Collapsed: Python script]*