
Tesla: allow customizing the command proxy to use #14616

Merged
merged 26 commits on Jun 30, 2024

Conversation

FraBoCH
Contributor

@FraBoCH FraBoCH commented Jun 28, 2024

Add support for a custom HTTP proxy such as https://github.com/wimaha/TeslaBleHttpProxy for sending commands in the standard Tesla vehicle template, as discussed in #14252.

The corresponding yaml configuration would be:

vehicles:
- name: tesla
  type: template
  template: tesla
  title: Y not?
  accessToken: XXXX
  refreshToken: XXX
  vin: XXXXX
  capacity: 74
  commandProxy: http://192.168.X.XX:8080

@FraBoCH FraBoCH changed the title Proposition to integrate a BLE HTTP proxy into the standard tesla template Integrate a BLE HTTP proxy into the standard tesla template Jun 28, 2024
@andig andig added the enhancement New feature or request label Jun 29, 2024
@FraBoCH FraBoCH marked this pull request as ready for review June 29, 2024 08:57
templates/definition/vehicle/tesla.yaml Outdated Show resolved Hide resolved
templates/definition/vehicle/tesla.yaml Outdated Show resolved Hide resolved
vehicle/tesla.go Outdated Show resolved Hide resolved
@wimaha
Contributor

wimaha commented Jun 29, 2024

I didn't see that there is another pull request like mine (#14620).

But I really think you have to remove the polling of the Tesla server each time the current is changed. Otherwise you will hit the API limit ...

@FraBoCH
Contributor Author

FraBoCH commented Jun 29, 2024

> I didn't see that there is another pull request like mine (#14620).
>
> But I really think you have to remove the polling of the Tesla server each time the current is changed. Otherwise you will hit the API limit ...

We indeed need to verify that. My understanding is that the limit of "read" requests is higher and sufficient for our use case.

@andig
Member

andig commented Jun 29, 2024

According to the docs there's an API limit, not a write limit.

@wimaha
Contributor

wimaha commented Jun 29, 2024

> > I didn't see that there is another pull request like mine (#14620).
> >
> > But I really think you have to remove the polling of the Tesla server each time the current is changed. Otherwise you will hit the API limit ...
>
> We indeed need to verify that. My understanding is that the limit of "read" requests is higher and sufficient for our use case.

The limit of read requests is 200, so it's up to the weather whether that's enough or not.

@andig andig marked this pull request as draft June 29, 2024 11:34
@FraBoCH FraBoCH changed the title Integrate a BLE HTTP proxy into the standard tesla template Tesla: allow customizing the command proxy to use Jun 29, 2024
@FraBoCH
Contributor Author

FraBoCH commented Jun 29, 2024

> > > I didn't see that there is another pull request like mine (#14620).
> > >
> > > But I really think you have to remove the polling of the Tesla server each time the current is changed. Otherwise you will hit the API limit ...
> >
> > We indeed need to verify that. My understanding is that the limit of "read" requests is higher and sufficient for our use case.
>
> The limit of read requests is 200, so it's up to the weather whether that's enough or not.

I see what you mean. On the other hand, this data is cached (and not fetched above 6A), so I wonder how many real API calls this triggers in one day. IMHO we should keep the existing logic to avoid issue #13007 coming back, and think about a further fix if we reach the API limits in practice and if that prevents evcc from doing its job properly.

@wimaha
Contributor

wimaha commented Jun 29, 2024

> > > > I didn't see that there is another pull request like mine (#14620).
> > > >
> > > > But I really think you have to remove the polling of the Tesla server each time the current is changed. Otherwise you will hit the API limit ...
> > >
> > > We indeed need to verify that. My understanding is that the limit of "read" requests is higher and sufficient for our use case.
> >
> > The limit of read requests is 200, so it's up to the weather whether that's enough or not.
>
> I see what you mean. On the other hand, this data is cached (and not fetched above 6A), so I wonder how many real API calls this triggers in one day. IMHO we should keep the existing logic to avoid issue #13007 coming back, and think about a further fix if we reach the API limits in practice and if that prevents evcc from doing its job properly.

The cache is reset every time the current changes. So in my use case the limit will be reached very fast. 😕

@FraBoCH
Contributor Author

FraBoCH commented Jun 29, 2024

Adding a parameter to cleanly disable the current check loop, so that we can let the (advanced) user decide.

@Raudi1

Raudi1 commented Jun 30, 2024

Hopefully getting vehicle info should be possible over BLE as well. teslamotors/vehicle-command#229

@andig
Member

andig commented Jun 30, 2024

Maybe, but it should not be necessary. The API limits should be fine once you start sending the commands via BLE.

@Raudi1

Raudi1 commented Jun 30, 2024

> Maybe, but it should not be necessary. The API limits should be fine once you start sending the commands via BLE.

Yes, but while it isn't an issue for me, there are people without cellular reception at their charging spots. I think I read somewhere that Wi-Fi alone doesn't work for the API. Could be wrong though.
But it's always a good thing to have local access without any cloud dependency, as this whole debacle with API limits shows, alongside many examples of other cloud services changing, breaking, or shutting down completely.

@wimaha
Contributor

wimaha commented Jul 1, 2024

> Regarding the api.CurrentGetter, let's see if this doesn't already work. It will only trigger a read below 6A, and due to caching it will only do so if the current below 6A was changed. The last improvement (though it would need to be tested) would be to only update when the current goes below 6A for the first time, assuming subsequent requests will all work immediately.

As described above, this is an issue for my use case. I always charge with < 6A, and when the weather is changing the API limit is reached. Maybe we can try your proposed solution. Perhaps I can file a PR for that.

@andig
Member

andig commented Jul 1, 2024

Could you check the logs to see why that happens? We're caching the result and should only call the API if the current is updated, that is at 2, 3, 4, 5 amps. Do we even need to call it if the change remains below 6A, or only when going from 6A to below? Needs to be tested.

@FraBoCH
Contributor Author

FraBoCH commented Jul 1, 2024 via email

@andig
Member

andig commented Jul 1, 2024

Actually, we've just added this in #14622. Since the TWC has phase currents, this should work if you remove the api.CurrentGetter from the vehicle. Could you give it a try?

@FraBoCH
Contributor Author

FraBoCH commented Jul 1, 2024

Mmmmh. Is any change needed, or is it supposed to already work with the TWC in the latest nightly?
I am testing with a custom vehicle getting its data from TeslaMate (without GetMaxCurrent, obviously), but it doesn't detect the phase mismatch when switching from 13A to 1A. The Tesla gets stuck at 5A instead of 1A in that case.

@andig
Member

andig commented Jul 1, 2024

Ahhh, I see. Need to check the error return code, too. Change coming.

@andig
Member

andig commented Jul 1, 2024

@FraBoCH see #14659. It is a bit subtle but should work. The remaining drawback is that we'll only recognize the current as too high if it exceeds the setpoint by at least 1A, so it might not detect the car failing to switch down at a 5A setpoint, but should detect it at 4A due to https://github.com/evcc-io/evcc/pull/14622/files#diff-c456526d4bd9022a488ce32de208b40d86016e2c40b0c81a2dffe557c186377dR723. Seems this is the best we can currently do. Ymmv.

@Raudi1

Raudi1 commented Jul 7, 2024

evcc doesn't always want to use the BLE proxy for me...
I have the latest version of evcc and added
commandProxy: http://192.168.1.52:8080
to my config. But evcc sometimes still uses tesla.evcc.io to control the charging speed, at least until the limit is reached. The error message even shows the URL as http://tesla.evcc.io... Sorry, but I can't get the logs right now.
Then, without me changing anything about the config, it suddenly works. I did restart evcc a couple of times though.

@FraBoCH
Contributor Author

FraBoCH commented Jul 7, 2024 via email

@andig
Member

andig commented Jul 7, 2024

> But evcc sometimes still uses tesla.evcc.io to control the charging speed.

That's pretty much impossible, since it will only ever see the configured URL. In any case, a trace log would show if that happens.

@jannisif

jannisif commented Jul 7, 2024

Hi everyone,

it seems I have the same problem as @Raudi1:

The command proxy is set on the vehicle:

vehicles:
- type: template
  template: tesla
  title: Model 3
  accessToken: xx
  refreshToken: xx
  vin: xx
  capacity: 75
  name: ev1
  commandProxy: http://10.0.0.98:8080 # (also with a trailing "/" it isn't working)

commandProxy is a Raspberry Pi set up with vehicle-command and TeslaBleHttpProxy (which is working).
But evcc (0.128.1) always tries to connect to https://tesla.evcc.io/....

@andig, can I create an anonymized trace log, or should I send it via email (where to send)?

Thanks a lot!

@andig
Member

andig commented Jul 7, 2024

Pls. post here

@wimaha
Contributor

wimaha commented Jul 7, 2024

That's surprising. I have been using 0.182.1 for two days and charging works perfectly with the proxy. 😃

@Raudi1

Raudi1 commented Jul 7, 2024

Perhaps something was still cached or similar? After it started working, it continued to work (at least until some kind of connection issue on the proxy side that I had to resolve with a restart of the proxy).

@jannisif

jannisif commented Jul 7, 2024

evcc-20240707-142559-trace.log

xxx = VIN

Maybe it is a problem with my installation and the issue is on my side... but it seems to be the same problem as @Raudi1's.

@Raudi1

Raudi1 commented Jul 7, 2024

> evcc-20240707-142559-trace.log
>
> xxx = VIN
>
> Maybe it is a problem with my installation and the issue is on my side... but it seems to be the same problem as @Raudi1's.

Yeah, exactly the same: POST "https://tesla.evcc.io/api/1/vehicles/.../command/set_charging_amps" is used while the proxy is set to a local address.
Try restarting evcc a couple of times; at least that seems to have done the trick for me. I'm using the Home Assistant addon, btw.

@andig
Member

andig commented Jul 7, 2024

It won't. There is no secret caching. Thanks for the log, I will take a look. Is there any indication it's using the proxy at all?

@jannisif

jannisif commented Jul 7, 2024

Wait a moment, it looks pretty good currently. I will observe this for the next hour and report back.

@Raudi1

Raudi1 commented Jul 7, 2024

> Is there any indication it's using the proxy at all?

For me there wasn't. The logs for the proxy were completely empty; they just showed the system start and nothing else. After it fixed itself I can see the requests in the proxy log.

@jannisif try running docker logs --since 12h tesla-ble-http-proxy on the proxy and see if it gets commands.

@jannisif

jannisif commented Jul 8, 2024

So my setup has been working for almost a day.
Here are my findings:

  • So far I had only restarted the evcc service in my Proxmox CT. Yesterday I restarted the whole container, and suddenly it works. I'm not sure how many times I restarted, but at some point it started requesting the proxy.
  • So it is maybe a cache "bug" in the container(?).

Sidenote for those having the following problem:

  • The proxy ran for about three hours; then, maybe due to a restart, it wasn't able to send requests to the car.
  • So I tried it via vehicle-command: nope, errors like "ble beacon not found" etc. It seems to be a rare bug in vehicle-command. Then I tried it with root permissions, and it works for now. I don't want to restart my working setup to find out why this was happening.

@Raudi1 thank you for your tip; the proxy logs are still unremarkable, apart from the several attempts it sometimes takes to establish a connection (the car is 4 m from the Raspberry Pi).

Thank you @wimaha for providing the proxy; it works perfectly with Docker.
Thanks everyone for the support :)

Have a nice day!

@Nhozz

Nhozz commented Jul 9, 2024

> It won't. There is no secret caching. Thanks for the log, I will take a look. Is there any indication it's using the proxy at all?

Hi andig, I think I may have the exact same issue as Raudi1. I don't know if it helps, but I'll share my log and config too.
I've installed TeslaBleHttpProxy on a new Raspberry Pi 2W.

Here are the docker logs for docker logs --since 12h tesla-ble-http-proxy:
2024/07/09 12:32:04 INFO Key-Request connection established
2024/07/09 12:32:04 INFO sending command=add-key-request body=map[]
2024/07/09 12:32:04 INFO Sent add-key request to XXX. Confirm by tapping NFC card on center console.
2024/07/09 12:32:04 INFO successfully executed command=add-key-request body=map[]
2024/07/09 12:49:53 INFO TeslaBleHttpProxy 1.2 is loading ...
2024/07/09 12:49:53 INFO BleControl initialized
2024/07/09 12:49:53 INFO TeslaBleHttpProxy is running!

EVCC Log:
evcc-20240709-125948-debug.log

My evcc.yaml vehicle config:

vehicles:
- name: ModelY
  type: template
  template: tesla
  title: Tesla Model Y
  icon: car
  accessToken: xxx
  refreshToken: xxx
  capacity: 75
  commandProxy: http://xxx:8080

I've XXX'd out my IP and VIN.

@jannisif

jannisif commented Jul 9, 2024

@Nhozz
Did you already restart your evcc instance multiple times?
That is what I did to get evcc to request the proxy :)

@andig
Member

andig commented Jul 9, 2024

@Nhozz you're most likely using a different config file than the one you are editing. To verify, look at the start of the log, which shows the config file used, and then

cat /that/file

to check it. Anyway, since this PR has been merged and is confirmed working, I'm closing the discussion here.

@evcc-io evcc-io locked as resolved and limited conversation to collaborators Jul 9, 2024
Labels: devices (Specific device support), enhancement (New feature or request)
6 participants