This repository has been archived by the owner on Dec 19, 2024. It is now read-only.

Pools as entities #19

Open
agospher4eg opened this issue Apr 21, 2021 · 29 comments

Comments

@agospher4eg

Hi! You made a great integration! I read through aiotruenas-client and saw that it can list pools. Can you add pools to the integration?

@sdwilsh
Owner

sdwilsh commented Apr 27, 2021

I don't have a lot of time available for this right now, but I would happily review a PR that added it :D

@sdwilsh
Owner

sdwilsh commented May 12, 2021

@agospher4eg, what were you looking to have exposed for Pools in the integration?

@agospher4eg
Author

I want pools exposed with their total space and free space.

@sdwilsh
Owner

sdwilsh commented Jul 2, 2021

Interestingly enough, while the TrueNAS UI shows this information for pools, it's actually datasets that have this information in ZFS (or at least in the API that TrueNAS exposes), so it's a bit more involved because the upstream library doesn't yet have support for datasets. Now, I also own that library, so it's not a process problem to get that support added, but more of a time problem right now.

@sdwilsh
Owner

sdwilsh commented Jul 2, 2021

And for my, or someone else's, reference in the future, pool.dataset.query gives me all the information I need to accomplish this, and we'd want to query at least these properties:

  • name
  • pool
  • type
  • used
  • available
  • compressratio

Weirdly, it doesn't look like the API provides total space, but it can be computed from the used and available numbers (either in the library or in the integration).
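For future reference, here is a minimal sketch of that computation over a pool.dataset.query response. The nested "parsed" key is an assumption about the response shape (the API returns raw and parsed forms of each ZFS property), so adjust it to whatever the query actually hands back:

# Sketch only: pool.dataset.query does not report total space, so derive it
# from the used and available properties of each returned dataset.
from typing import Any, Dict, List


def dataset_summaries(datasets: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
    summaries = []
    for ds in datasets:
        used = int(ds["used"]["parsed"])            # assumed response shape
        available = int(ds["available"]["parsed"])  # assumed response shape
        summaries.append(
            {
                "name": ds["name"],
                "pool": ds["pool"],
                "type": ds["type"],
                "compressratio": ds["compressratio"]["parsed"],
                "used_bytes": used,
                "available_bytes": available,
                "total_bytes": used + available,  # computed, not from the API
            }
        )
    return summaries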

@sdwilsh
Owner

sdwilsh commented Jul 2, 2021

A PR that needs some work is up in sdwilsh/aiotruenas-client#92 to add this to the library. I need to rethink some stuff there, but it's a solid starting point.

@sdwilsh
Owner

sdwilsh commented Jul 2, 2021

v0.8.0 of the client library is out, and it includes Dataset support to provide the values you wanted, @agospher4eg. I'm not sure when I'll next have time to actually add support to the integration, but the hardest part, getting it into the library, is now done.

@sdwilsh
Owner

sdwilsh commented Sep 11, 2021

I just finished merging in v0.8.0 of the library, so we're pretty close to having this. I'm planning to hold off on releasing the next version until we get this. I probably won't have time to do this for a few weeks still, but @colemamd might get to it before then!

@agospher4eg
Author

Great news! I'm waiting. I can help with testing =)

@colemamd
Contributor

I've been looking at this over the last couple of days, and I now have datasets working with id, type, pool_name, compress_ratio, available_bytes, used_bytes, & total_bytes. I just have a couple of questions. The bytes attributes are shown in raw bytes rather than MB, GB, TB, etc. I think the easiest way to handle that is in aiotruenas_client by pulling value rather than rawvalue. But then we'd have to deal with total_bytes.
There is also no good value for unique_id that I can find. The entities still work; they just can't use the entity registry. Thoughts?

@sdwilsh
Owner

sdwilsh commented Sep 14, 2021

The bytes attributes are shown in raw bytes rather than MB, GB, TB, etc. I think the easiest way to handle that is in aiotruenas_client by pulling value rather than rawvalue. But then we'd have to deal with total_bytes.

Based on the tests, at least, it looks like the rawvalue was typically a string, whereas the parsedvalue was an actual number, which is why I went with the latter for the library.

I'm not sure what you mean about having to deal with total_bytes.

It's worth noting that using bytes will result in a big, not-human-friendly number, but it's not hard to template that to something human readable if someone wanted to use it on a dashboard.
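If a human-friendly value is wanted anyway, the conversion is small; here is a quick Python sketch of what such a template or helper would compute, assuming binary (1024-based) units:

def human_readable_bytes(num_bytes: int) -> str:
    """Format a raw byte count as a human-friendly string (binary units)."""
    value = float(num_bytes)
    for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB"):
        if value < 1024 or unit == "PiB":
            return f"{value:.2f} {unit}"
        value /= 1024
    return f"{value:.2f} PiB"  # unreachable; keeps type checkers happy


# Example: 1234567890123 bytes -> "1.12 TiB"
print(human_readable_bytes(1234567890123))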

There is also no good value for unique_id that I can find. The entities still work; they just can't use the entity registry. Thoughts?

There's a native property on the datasets called guid that never changes. I wonder if we can access that somehow via the API. On the flip side, I don't think one can rename a dataset from the UI, so technically id is unique. Someone could rename one from the command line, though, which is why getting the guid would be nice.

@colemamd
Contributor

colemamd commented Sep 14, 2021

I'm not sure what you mean about having to deal with total_bytes.

It's worth noting that using bytes will result in a big, not-human-friendly number, but it's not hard to template that to something human readable if someone wanted to use it on a dashboard.

This is what I meant. I was trying to get a more user-friendly total_bytes, but it can easily be templated if need be.

There's a native property on the datasets called guid that never changes. I wonder if we can access that somehow via the API. On the flip side, I don't think one can rename a dataset from the UI, so technically id is unique. Someone could rename one from the command line, though, which is why getting the guid would be nice.

I was referencing a dump of pool.dataset.query and didn't see a guid at the dataset level, only at the disk level, but you're right, id cannot be altered from the GUI, so it is unique. I'll look more into getting guid, but I'm not sure that's necessary.
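For what it's worth, here is a rough sketch of how the stable dataset id could back unique_id on a Home Assistant sensor so the entity registry works. The class name and the dataset object are illustrative, not the integration's actual code; only SensorEntity and the _attr_* attributes are standard Home Assistant API:

# Illustrative only: TrueNasDatasetUsedSensor and `dataset` are hypothetical.
from homeassistant.components.sensor import SensorEntity


class TrueNasDatasetUsedSensor(SensorEntity):
    """Reports used bytes for a single TrueNAS dataset."""

    _attr_native_unit_of_measurement = "B"

    def __init__(self, dataset) -> None:
        self._dataset = dataset
        self._attr_name = f"Dataset {dataset.id} Used"
        # The dataset id can't be changed from the GUI, so treat it as stable.
        self._attr_unique_id = f"truenas_dataset_{dataset.id}_used"

    @property
    def native_value(self) -> int:
        return self._dataset.used_bytes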

@sdwilsh
Owner

sdwilsh commented Sep 14, 2021

Someone could use the raw zfs commands to rename a dataset, but meh. There is pool.dataset.userprop.query, but I suspect that does not include native properties. We could also file a ticket upstream with iX to get it added to the websocket API for future use.

@sdwilsh
Owner

sdwilsh commented Sep 15, 2021

pool.dataset.userprop.query does not show native properties, alas.

@sdwilsh
Owner

sdwilsh commented Sep 15, 2021

There's an undocumented shell API at websocket/shell we could try to reverse engineer, but I don't think it's worth the investment. It's likely fine to just assume nobody is going to rename a dataset, and if they do, they can deal with having to delete and re-set up the integration.

@sdwilsh
Owner

sdwilsh commented Dec 31, 2021

@colemamd, if you don't have the time to finish this off, I have some time available and could finish what you started if you want to upload it to your fork.

@colemamd
Contributor

colemamd commented Jan 1, 2022

I ran out of time a few months ago when I started a new job, sorry about that. I have time now; if you give me a few days I can clean up what I had been working on and upload it. The one thing I remember still working on is that when you query the datasets, ALL datasets (i.e. child datasets) get pulled. When I was testing on my prod box it was pulling in dozens of datasets, so I was trying to figure out how to minimize how many get pulled into HA. We could just have all child datasets disabled by default in HA, and then the user can go in and enable any that they do want (a rough sketch of that approach is below).
I should be able to have something uploaded by the end of this week, but I'll keep you updated.
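As a sketch of the disabled-by-default idea: Home Assistant's entity_registry_enabled_default attribute controls whether an entity starts out enabled. The helper and the naming convention below (treating any id containing "/" as a child dataset, e.g. tank/media) are assumptions, not the branch's actual code:

# Sketch of "child datasets disabled by default". The _attr_* attributes are
# standard Home Assistant API; the helper and class here are illustrative.
from homeassistant.components.sensor import SensorEntity


def is_child_dataset(dataset_id: str) -> bool:
    """Top-level datasets look like 'tank'; children look like 'tank/media'."""
    return "/" in dataset_id


class TrueNasDatasetSensor(SensorEntity):
    def __init__(self, dataset_id: str) -> None:
        self._attr_unique_id = f"truenas_dataset_{dataset_id}"
        self._attr_name = f"Dataset {dataset_id}"
        # Child datasets start disabled; users enable the ones they care about.
        self._attr_entity_registry_enabled_default = not is_child_dataset(dataset_id)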

@agospher4eg
Author

agospher4eg commented Jan 1, 2022 via email

@sdwilsh
Owner

sdwilsh commented Jan 1, 2022

I have plenty of projects to work on, so happy to let you finish it up. No worries on the delay, and I hope the new job is working out for you!

@colemamd
Contributor

colemamd commented Jan 5, 2022

I just uploaded a new branch to my fork: https://github.com/colemamd/hass-truenas/tree/datasets. I'm still trying to figure out how to change the depth of child datasets that get pulled, but the API docs are a bit lacking.

@sdwilsh
Owner

sdwilsh commented Jan 5, 2022

Yeah, what I tend to do is grab the data from an API call (with scripts/invoke_method.py) and then play around with tests until I get it into a usable state that I'm happy with. I wish they had better docs.

@colemamd
Contributor

colemamd commented Jan 5, 2022

That's what I've been doing as well, but I'm having trouble finding the right arguments to send with scripts/invoke_method.py for pool.dataset.query. I've been able to do it directly over the websocket, but I get an error when using the script.

@sdwilsh
Owner

sdwilsh commented Jan 5, 2022

Passing arguments can be tricky, and I should probably improve the help/documentation for that script to make it clearer (especially because I think I've forgotten myself). If you have some python code that works, I can figure out what that command line should look like later tonight for you (and add the docs).

@colemamd
Contributor

colemamd commented Jan 5, 2022

Actually, I've got a websocket browser extension I've been using to pass JSON. This is what I can pass directly to the server to get the appropriate response:

{"id":"78430116-ef33-47c9-9715-bbe472f5fad0","msg":"method","method":"pool.dataset.query","query-filters":["query-options.extra.flat",false]}

but scripts/invoke_method.py gives me a KeyError: 'result' when I pass:

python3 scripts/invoke_method.py pool.dataset.query --arguments '{"query-filters":["query-options.extra.flat",false]}'

Edit: Just want to add that I believe it's because scripts/invoke_method.py assumes the params key is used, but what I'm trying to pass is the query-options key.

Edit2: Using python3 scripts/invoke_method.py pool.dataset.query --arguments '[[["type", "=", "FILESYSTEM"]]]' will return all datasets of type FILESYSTEM, so I'm making progress.
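For reference, a guess at what the script ends up sending for that working command: the method call wraps everything in a single params array, with the query-filters first and query-options (if any) second. The commented-out query-options line is an assumption, not something confirmed in this thread:

# Sketch of the method-call payload for the working Edit2 command above.
import json
import uuid

message = {
    "id": str(uuid.uuid4()),
    "msg": "method",
    "method": "pool.dataset.query",
    "params": [
        [["type", "=", "FILESYSTEM"]],  # query-filters (matches the command above)
        # {"extra": {"flat": False}},   # query-options would go here if needed
    ],
}

print(json.dumps(message))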

@agospher4eg
Author

Great work! Thank you!

@sdwilsh
Owner

sdwilsh commented Jan 11, 2022

I think @colemamd is planning to add a bit more support for pools here :)

@sdwilsh sdwilsh reopened this Jan 11, 2022
@colemamd
Contributor

I've got encryption status working now. What else are you looking for @agospher4eg? As previously mentioned, the API only provides available space and used space from the datasets, not pools.

@agospher4eg
Author

agospher4eg commented Jan 12, 2022

It's exactly what I wished for.
[Screenshot from 2022-01-12 09-56-00]

@sdwilsh
Owner

sdwilsh commented Jan 12, 2022

Alright, @colemamd, we can get what you have merged in and deal with other additions later then :D
