Having tried this out with a micro:bit V2, it became apparent that, with the faster underlying processor, the overhead of the calls to the microsecond delay is shorter, so the timing skewed fast when setting rates above about 80 kbps. Moreover, the rate calculation was out by a factor of 2: I'd compensated twice, in different ways, for the fact that one on or off clock pulse is half the clock period. A timing diagram from a different I2C device's datasheet also suggests there's a significant inter-byte period during which the slave device processes the byte. Originally I inserted a fairly large fixed gap (a whole clock period), but it seems better to wait for the slave device to free SDA and base the delay on that. I've altered the code to work that way and removed the micro:bit version test I initially used to work around the V2 speed difference. It now works up to the maximum speed (capped by the blocks interface at 200 kbps) on both V1 and V2 units, though noticeably more slowly on V1 if you write to a couple of displays really intensively.
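For illustration, here's a minimal sketch in MakeCode Static TypeScript of the two timing fixes described above. The pin assignments, `RATE_BPS` constant, and function names are assumptions for the sake of the example, not the extension's actual code; only the standard `pins`, `control`, and `input` APIs are used.

```typescript
// Sketch of the timing fixes, assuming SDA/SCL are bit-banged on two GPIO
// pins with external pull-ups. Pin choices and the rate cap are illustrative.
const SDA = DigitalPin.P1      // assumed data pin
const SCL = DigitalPin.P0      // assumed clock pin
const RATE_BPS = 200000        // blocks interface caps the rate at 200 kbps

// One SCL high (or low) phase is HALF the clock period, so each phase delay
// is 500000 / rate microseconds. Halving here AND again elsewhere was the
// factor-of-2 error.
const halfPeriodUs = Math.idiv(500000, RATE_BPS)

// Clock one bit out: set SDA while SCL is low, then pulse SCL high for one
// half period.
function clockOutBit(bit: number) {
    pins.digitalWritePin(SDA, bit)
    control.waitMicros(halfPeriodUs)
    pins.digitalWritePin(SCL, 1)
    control.waitMicros(halfPeriodUs)
    pins.digitalWritePin(SCL, 0)
}

// After the ACK clock, instead of a fixed whole-clock inter-byte gap, poll
// until the slave releases SDA (it reads high again), then delay one half
// period from that point. The timeout guards against a stuck bus.
function waitForSdaRelease(timeoutUs: number) {
    const start = input.runningTimeMicros()
    while (pins.digitalReadPin(SDA) == 0) {
        if (input.runningTimeMicros() - start > timeoutUs) {
            break
        }
    }
    control.waitMicros(halfPeriodUs)
}
```

Because the delay is derived from the observed SDA release rather than a fixed gap, the same code paces itself correctly on both V1 and V2 despite the different per-call overheads, which is why the version test could be dropped.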