Column "domain" does not exist #142
Suggest asking the author of the addon if this is an intentional omission, but I don't think it is. It could also be an issue with the sensor; check the HA issues. The easiest solution is to install detective in editable mode and adjust the query at HASS-data-detective/detective/core.py Line 100 in f861f70
Even easier is just to create a standalone function, along the lines of the sketch below.
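For reference, one way such a standalone function might look (a sketch only, not detective's own code; it assumes the db object returned by detective.db_from_hass_config() exposes its SQLAlchemy engine as db.engine, and mirrors the query discussed later in this thread minus the domain column):

import pandas as pd
from sqlalchemy import text
import detective

db = detective.db_from_hass_config()  # connects using your Home Assistant configuration

def my_fetch_all_data_of(entity_id):
    # same columns the recorder's states table is queried for elsewhere in this thread,
    # but without the 'domain' column that no longer exists
    query = text(
        "SELECT entity_id, state, last_changed, attributes "
        "FROM states "
        "WHERE entity_id = :eid "
        "AND state NOT IN ('unknown', 'unavailable') "
        "ORDER BY last_changed DESC"
    ).bindparams(eid=entity_id)
    return pd.read_sql(query, db.engine)

df1 = my_fetch_all_data_of("sensor.wlan_switch_energy_power")

Binding the entity id as a parameter avoids having to paste the string directly into the query text.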
Hi @robmarkcole. I did as suggested and cloned the repository locally, fixed the SQL queries in core.py, and then uninstalled and reinstalled the HASS-data-detective module using pip install -e. Unfortunately, I could not figure out how to get this to work on HassOS directly. After installing the 'new' version and running the query, it was still showing the old SQL query error. I am not sure if this is because it is somehow getting stored in the docker container or what. Anyway, I gave up on that for now. Instead I would like to just use HASS-data-detective from my local copy of JupyterLab. However, for the life of me I cannot figure out how to connect to the TimescaleDB on my HassOS instance; I keep getting a "Connection to port 5432 refused" error. This is the connection url I am using:
I have tried both with and without the '77b2833f-timescaledb' and it doesn't seem to make any difference. I realize this has nothing to do with your module, but I wanted to ask if you have been able to connect to a postgres db running on Home Assistant OS? I may have to give up and just install a postgres db in a separate docker container.
Almost certainly the port is not exposed; this will be a setting somewhere.
You are correct, Sir. After a lot of Googling I found that I needed to update /data/postgres/pg_hba.conf in the timescaledb docker container, adding a line to allow connections from my local LAN IPs:
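The exact line was not quoted here; a typical pg_hba.conf entry, assuming a 192.168.1.x LAN and password authentication, looks something like the following (adjust the CIDR range and auth method to your own setup):

host    all             all             192.168.1.0/24          md5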
You can also use '0.0.0.0/0' as the address to allow any host, but I preferred to keep it to just my local LAN IPs. I also checked /data/postgres/postgresql.conf to make sure listen_addresses = '*'; this was already set. I may narrow down this setting later to 'localhost' plus a comma-separated list of local LAN IP addresses to lock down access a bit more. Once I made these two changes and restarted the timescaledb docker container, I was suddenly able to connect remotely. This was my first experience working with PostgreSQL, so I am still working my way up the learning curve. The only issue I have now is that I still cannot use HASS-data-detective, even after installing it with pip install -e. Thanks again for your support.
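One generic way to check whether an editable install is actually the copy being imported (plain Python, nothing detective-specific):

import sys
import detective

# which Python interpreter the notebook is running
print(sys.executable)
# which copy of detective that interpreter imports; if this points into the
# add-on container's site-packages, the editable clone is not the one in use
print(detective.__file__)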
Hi @robmarkcole, I am brand new to using Home Assistant. I installed the JupyterLab add-on and ran into similar errors using the GETTING_STARTED example code, after having successfully called db = detective.db_from_hass_config(), which all looks OK. But thereafter the example code in the home-assistant JupyterLab notebook breaks down: df1 = db.fetch_all_data_of(('sensor.wlan_switch_energy_power')) always gives the error "OperationalError: no such column: domain". As I am working within the web UI provided by Home Assistant OS, all the above explanations about pip install etc. remain mysterious to me. Any chance to fix this from within the Home Assistant web interface? Thanks,
@pebe-espana can you access your db using a tool like https://dbeaver.io/ and check what columns are available? I have not been keeping track of all the changes in HA recently and it is possible the schema has changed.
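If DBeaver feels unfamiliar, the same check can be done from the notebook with SQLAlchemy's inspector (a sketch, assuming the db object created by detective exposes its SQLAlchemy engine as db.engine):

from sqlalchemy import inspect

inspector = inspect(db.engine)
print(inspector.get_table_names())             # e.g. states, events, ...
for column in inspector.get_columns("states"):
    print(column["name"], column["type"])      # shows whether a 'domain' column exists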
@robmarkcole - thanks for the prompt reply. Apologies if my answers are of the RTFM kind; this SQL database stuff is still foreign to me. I used DBeaver as you suggested, but am a bit lost in seeing what it tells me about the structure and 'schema' that connects back to the integrations and the 152 entities. I attach two screenshots that break down the database: it seems to contain tables named event_data, events, etc., each of which has columns (with different column names and associated data types). So what am I looking/filtering for to locate e.g. 'sensor.wlan_switch_energy_power'? From the python code it seems the code looks at the table 'states' and a specific 'entity_id' (is that 'sensor.wlan_switch_energy_power'?). Indeed the column domain is absent in the states table (and the other tables). What takes the place of domain, and how do I fix it? I hope this helps.
Looks like
I tried to see if I could isolate the function fetch_all_data_of() as a function my_fetch_all_data_of() in my own notebook, but then got lost on its dependencies on other parts, it being part of a class definition. As the connection is already successful, is there any chance to convert db to pandas and then throw away all the unneeded parts using pandas handling? If you could give me a hint for a minimalistic subsection of the full HASS-data-detective GitHub code I would appreciate it.
The executed SQL query is at HASS-data-detective/detective/core.py Line 99 in f861f70 and it references the missing domain column. You can install detective in editable mode and make changes to the query on the fly.
e.g. try:
SELECT entity_id, state, last_changed, attributes
FROM states
WHERE state NOT IN ('unknown', 'unavailable')
ORDER BY last_changed DESC
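To try that query from the notebook without editing core.py at all, something along these lines should work (again a sketch, assuming db.engine is the SQLAlchemy engine behind the detective db object):

import pandas as pd

query = """
SELECT entity_id, state, last_changed, attributes
FROM states
WHERE state NOT IN ('unknown', 'unavailable')
ORDER BY last_changed DESC
"""

# pull everything into one DataFrame, then slice it with pandas
df = pd.read_sql(query, db.engine)
df_power = df[df["entity_id"] == "sensor.wlan_switch_energy_power"]

This also addresses the earlier question about converting everything to pandas first and discarding the unneeded rows or columns afterwards.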
Thank you. I succeeded with the following code after having fetched the database as above. Note: I had to put the string 'sensor.wlan_switch_energy_power' explicitly into the query; somehow the query string would not allow the variable to be inserted, as in your python class. I could also put two such strings in, and then two entities were extracted by the query. Note: I also then had a problem with the sample code in GETTING STARTED [Popular entities section]. But thanks, the goal of extracting a particular data set works, and I then export it to another PC and python/excel or whatever anyway. Much appreciate your help, and maybe others will read this minimalist solution and workaround and find the 'code snippet.txt' helpful as well!
I have installed PostgreSQL + TimescaleDB following the instructions here. I am able to connect to the PostgreSQL database and execute
db.entities[:10]
and it returns the first 10 entities, so I know I am connected. However, when I try to execute
df1 = db.fetch_all_data_of(('sensor.aqara_multi_sensor_master_bath_humidity',))
on one of my sensors, I get an UndefinedColumn: column "domain" does not exist error. I checked the states table that this function appears to be querying, and the error message is correct: there is no 'domain' column. If I run
!pip show HASS-data-detective
I get the following information. Am I running an old version of this module? Is there some way to change what columns are included in this function? Error message details: