
LidarLite driver fix #12756

Merged
merged 7 commits into from
Sep 4, 2019

Conversation

cmic0
Contributor

@cmic0 cmic0 commented Aug 20, 2019

This PR addresses several issues found with Lidar Lite sensors:

  • Unified the maximum distance of the sensor back to 35 meters: for v1/v2 sensors the previous value was 25 meters, but it was obtained empirically. According to the datasheets these sensors should be capable of reading ~40 m on 70%-reflective surfaces, so setting a common value of 35 m makes sense to unify all the versions.
    For altitude estimation the user can still set EKF2_RNG_A_HMAX to a more specific value.

  • Improved probe() logic for distinguishing v1/v2/v3 from v3hp.

  • Fixed reset_sensor() logic: thanks to Wingtra for the fix. The v3 was not starting on current master; the proposed reset logic properly resets the sensors.

  • Fixed v3HP signal quality reporting: the LL40LS_PEAK_STRENGTH_REG register is not available on the v3HP, so signal quality was always reported as 0. For the v3HP the signal quality is now computed from the signal strength (LL40LS_SIGNAL_STRENGTH_REG).

  • Refactored the start() method
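To illustrate the v3HP signal-quality fix above, here is a minimal sketch of mapping a raw signal-strength reading to a 0-100 quality value. The noise-floor and full-scale constants and the helper name are illustrative assumptions, not the driver's actual values:

```cpp
#include <cstdint>

// Illustrative constants -- NOT the driver's actual thresholds.
static constexpr uint8_t SIGNAL_STRENGTH_MIN = 24;   // assumed noise floor
static constexpr uint8_t SIGNAL_STRENGTH_MAX = 255;  // register full scale

// Map a raw LL40LS_SIGNAL_STRENGTH_REG reading to a 0-100 quality percentage.
int signal_quality_from_strength(uint8_t raw)
{
	if (raw < SIGNAL_STRENGTH_MIN) {
		return 0; // below the noise floor: treat as no valid return
	}

	return 100 * (raw - SIGNAL_STRENGTH_MIN) / (SIGNAL_STRENGTH_MAX - SIGNAL_STRENGTH_MIN);
}
```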

Test data / coverage
Tested on v3 and v3hp. Will add logs tomorrow.

FYI @DanielePettenuzzo

@mcsauder
Contributor

Looks great @cmic0, everything checks out from my end! Nice work!


@ar1040 , you might give this a try and let us know if it fixes your issue, (if not we'll keep hammering at it).

Let's get this in!

mcsauder
mcsauder previously approved these changes Aug 20, 2019
Member

@LorenzMeier LorenzMeier left a comment


The range change does not seem to be fully thought through.

@@ -46,8 +46,8 @@

/* Device limits */
#define LL40LS_MIN_DISTANCE (0.05f)
#define LL40LS_MAX_DISTANCE_V3 (35.00f)
#define LL40LS_MAX_DISTANCE_V1 (25.00f)
#define LL40LS_MAX_DISTANCE (35.00f)
Member


I'm not sure about this change. The v1 sensors really don't work on grass at 35 meters, so you end up claiming you have a good signal when you actually have a quite erratic reading. The argument provided in the PR itself is unconvincing: the specs of LIDAR sensors tend to be measured against a material with 70% reflectance (a white sheet of paper), but real-world materials are way, way below that.

[image: spectral reflectance of different materials across visible-to-infrared wavelengths]

Contributor Author


The main issue in this driver is that there is no clear way to identify the different hardware versions.

In the current upstream implementation (see here) v1 sensors are distinguished by device address (0x42 for v1, 0x62 otherwise), which I don't think is correct.
The v1 datasheet states that "The sensor module has a 7-bit slave address with a default value of 0x62 in hexadecimal notation"; I found no reference to a 0x42 address.
If that statement holds, the current logic for distinguishing the v1 from the others is broken. I don't have any v1 hardware at the moment to verify whether this is the case.

The proposed implementation looks at hw_version, sw_version and the device id, which seems more reliable.
I did some tests with v2, v3 and v3hp and the logic holds up well; unfortunately I have no data for the v1 case.
This led me to remove the v1 max distance, as I believe it was never applied anyway.

Summing up I would go for one of the following solutions:

  • If we verify that all v1 sensors really use the 0x42 address, I would re-introduce the address-based logic for setting the v1 limit.
  • If the v1 uses the usual 0x62 address, I would need to define a new case based on the hw/sw version.

Results of the probing:
v3:
INFO [ll40ls] hw: 23, sw:1
INFO [ll40ls] id: 0
INFO [ll40ls] probe success - hw: 23, sw:1

v3HP:
INFO [ll40ls] hw: 0, sw:200
INFO [ll40ls] id: 13113
INFO [ll40ls] probe success - id: 13113

v2:
INFO [ll40ls] hw: 21, sw:2
INFO [ll40ls] id: 23612
INFO [ll40ls] probe success - hw: 21, sw:2
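A hedged sketch of a classification keyed to the exact probe outputs quoted above (the real driver's checks may differ, e.g. by matching ranges of hw/sw versions rather than these single pairs):

```cpp
#include <cstdint>

enum class LL40LSVersion { Unknown, V2, V3, V3HP };

// Illustrative version detection based on the probe results in this thread.
LL40LSVersion classify(uint8_t hw, uint8_t sw, uint16_t id)
{
	if (hw == 0 && sw == 200 && id != 0) { return LL40LSVersion::V3HP; } // hw/sw unset, unit id populated
	if (hw == 23 && sw == 1)             { return LL40LSVersion::V3; }
	if (hw == 21 && sw == 2)             { return LL40LSVersion::V2; }

	return LL40LSVersion::Unknown;
}
```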

Member


Stupid question, and I appreciate this is a complex problem to solve: if we can identify the v3, can't we just say those are 35 meters and the rest is 20 meters? Do we have proof that the v3 indeed delivers 35 meters, or are we just hoping it improved?

Contributor Author


I did some further testing to verify whether there is a common baseline across all the versions.
The flight was performed over grass on a cloudy day:

v2 (drop at 32 m): [log plot]

v3 (drop at 25 m): [log plot]

v3hp (drop at 16 m): [log plot]

For the v3 I also checked older logs, and 25 meters is the best case; usually they drop at about 15-20 meters.

A good baseline seems to be 25 meters, with 35 meters for the v2. What do you think?

@ryanjAA
Contributor

ryanjAA commented Aug 21, 2019

I can try to find some old logs, but we never hit 35 m with the v1/v2. It was actually the source of some hugely botched landings in the early days. 25 m is a stretch even over grass. IIRC the real-world average was 12-15 m reliably; anything beyond that was spotty at best. This does not apply to the v3/HP though.

@ar1040

ar1040 commented Aug 21, 2019

@mcsauder, I can still only get the lidar to work by running the ll40ls start command in the mavlink console. Looks like you guys are making good progress though! Let me know if there is anything else I can do to help!

@mcsauder
Contributor

@ar1040 , does it address this issue? #12278

@ar1040

ar1040 commented Aug 21, 2019

@mcsauder , kind of. Originally there was no way for me to connect to the lidar at all. Now I can read the lidar data, but only if I run the "ll40ls start" command in the MAVLink console; previously this did not work either. I was hoping for a solution on boot-up. So the lidar is getting data, just not on boot, only when running that command in the console.

@cmic0
Contributor Author

cmic0 commented Aug 21, 2019

@ar1040 is there any chance you can provide a record of the boot messages?
You can access the NuttX console to get them.

@mcsauder
Contributor

@ar1040 , can you ensure that your SENS_EN_LL40LS parameter is set to 2? (Or 1 if you are using PWM.)

@cmic0 , it's working for me:

nsh> reboot
WARN  [shutdown] Reboot N�[boot] Rev 0x0 : Ver 0x0 V500
[boot] Fault Log info File No 4 Length 3177 flags:0x01 state:1
[boot] Fault Log is Armed
sercon: Registering CDC/ACM serial driver
sercon: Successfully registered the CDC/ACM serial driver
HW arch: PX4_FMU_V5
HW type: V500
HW version: 0x00000000
HW revision: 0x00000000
FW git-hash: 4d39e8c774bd8d7a395c70f9f668b39813fa0b47
FW version: 1.9.0 0 (17367040)
OS: NuttX
OS version: Release 7.29.0 (119341311)
OS git-hash: d8da511082646d83a54c6905daca13f0a1a609f0
Build datetime: Aug 21 2019 14:43:29
Build uri: localhost
Toolchain: GNU GCC, 7.3.1 20180622 (release) [ARM/embedded-7-branch revision 261907]
PX4GUID: 0002000000003432333830385115002a001b
MCU: STM32F76xxx, rev. Z
[hardfault_log] Fault Log is Armed
INFO  [param] selected parameter default file /fs/mtd_params
INFO  [tune_control] Publishing standard tune 1
Board defaults: /etc/init.d/rc.board_defaults
INFO  [px4_work_queue] creating: wq:lp_default, priority: 205, stack: 1700 bytes
INFO  [px4_work_queue] creating: wq:I2C1, priority: 248, stack: 1250 bytes
rgbled on I2C bus 1 at 0x55 (bus: 100 KHz, max: 100 KHz)
WARN  [rgbled_ncp5623c] no RGB led on bus #1
WARN  [blinkm] I2C init failed
WARN  [blinkm] init failed
INFO  [px4_work_queue] creating: wq:hp_default, priority: 243, stack: 1500 bytes
Board sensors: /etc/init.d/rc.board_sensors
WARN  [mpu6000] no device on bus #3 (SPI1)
MPU6000 on SPI bus 1 at 0 (1000 KHz)
INFO  [px4_work_queue] creating: wq:SPI1, priority: 254, stack: 1400 bytes
BMI055_ACCEL on SPI bus 1 at 3 (10000 KHz)
BMI055_GYRO on SPI bus 1 at 2 (10000 KHz)
IST8310 on I2C bus 1 at 0x0e (bus: 100 KHz, max: 400 KHz)
INFO  [px4_work_queue] creating: wq:I2C2, priority: 247, stack: 1250 bytes
INFO  [ist8310] no device on bus 2
WARN  [hmc5883] no device on bus 1 (type: 2)
WARN  [hmc5883] no device on bus 2 (type: 2)
WARN  [hmc5883] no device on bus 4 (type: 2)
WARN  [qmc5883] no device on bus 1 (type: 2)
WARN  [qmc5883] no device on bus 2 (type: 2)
WARN  [qmc5883] no device on bus 4 (type: 2)
WARN  [lis3mdl] no device on bus 2
WARN  [lis3mdl] no device on bus 2
WARN  [lis3mdl] no device on bus 2
INFO  [px4_work_queue] creating: wq:I2C3, priority: 246, stack: 1250 bytes
IST8310 on I2C bus 3 at 0x0e (bus: 100 KHz, max: 400 KHz)
INFO  [px4_work_queue] creating: wq:SPI5, priority: 250, stack: 1400 bytes
ERROR [pmw3901] driver start failed
MS5611_SPI on SPI bus 4 at 0 (20000 KHz)
INFO  [px4_work_queue] creating: wq:SPI4, priority: 251, stack: 1400 bytes
INFO  [ll40ls] LidarLiteI2C - no device on bus 2
INFO  [ll40ls] LidarLiteI2C - no device on bus 2
INFO  [px4_work_queue] creating: wq:I2C4, priority: 245, stack: 1250 bytes
INFO  [ll40ls] probe success - id: 13113
LL40LS on I2C bus 4 at 0x62 (bus: 100 KHz, max: 100 KHz)
INFO  [px4_work_queue] creating: wq:att_pos_ctrl, priority: 244, stack: 7000 bytes
INFO  [px4_work_queue] creating: wq:rate_ctrl, priority: 255, stack: 1600 bytes
INFO  [commander] Mission #3 loaded, 7 WPs, curr: 0
INFO  [mavlink] mode: Config, data rate: 800000 B/s on /dev/ttyACM0 @ 57600B
Starting Main GPS on /dev/ttyS0
Starting MAVLink on /dev/ttyS1
INFO  [mavlink] mode: Normal, data rate: 1200 B/s on /dev/ttyS1 @ 57600B
INFO  [px4io] default PWM output device
INFO  [init] Mixer: /etc/mixers/quad_x.main.mix on /dev/pwm_output0
INFO  [init] Mixer: /etc/mixers/pass.aux.mix on /dev/pwm_output1
INFO  [ekf2] Found range finder with instance 0
INFO  [logger] logger started (mode=all)

NuttShell (NSH)
nsh> INFO  [ecl/EKF] EKF aligned, (pressure height, IMU buf: 22, OBS buf: 14)

TSC21
TSC21 previously approved these changes Sep 2, 2019
@cmic0
Contributor Author

cmic0 commented Sep 2, 2019

Ok, based on the experimental results I have lowered the maximum distance to 25 meters for all sensors except the v2, which will be initialized with 35 meters (here 1260b2b).
This PR is good to go from my side.
FYI @LorenzMeier @mcsauder @DanielePettenuzzo
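A sketch of the final limits as described above (25 m default, 35 m for the v2). LL40LS_MIN_DISTANCE and LL40LS_MAX_DISTANCE mirror the diff excerpt earlier in the thread, while LL40LS_MAX_DISTANCE_V2 and the selection helper are hypothetical names:

```cpp
// Final device limits per the comment above (sketch, not the merged code).
#define LL40LS_MIN_DISTANCE    (0.05f)
#define LL40LS_MAX_DISTANCE    (25.00f)
#define LL40LS_MAX_DISTANCE_V2 (35.00f)

// Hypothetical helper: pick the max distance for the detected version.
float max_distance_for(bool is_v2)
{
	return is_v2 ? LL40LS_MAX_DISTANCE_V2 : LL40LS_MAX_DISTANCE;
}
```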

@TSC21
Member

TSC21 commented Sep 2, 2019

@dagar we hit a flash overflow on v2 again

@mcsauder
Contributor

mcsauder commented Sep 2, 2019

Hi @cmic0 and @dagar , I just bench tested this PR again and everything still looks/works great from my perspective. I think we should get this PR merged and then we can work through #12695 and #12864.

@TSC21 TSC21 requested a review from LorenzMeier September 2, 2019 20:36
@ryanjAA
Contributor

ryanjAA commented Sep 3, 2019

While I can dig up some really old logs if absolutely necessary, I can definitively say that on more than 20 occasions the v2 didn't work anywhere close to 35 m. These were all flights over grass, in the 20 m/s range. Of course it had its moments of pinging data back, but in real-world flying across differing areas it was half of that. 25 m is pushing it, but I'd keep it there if we want to keep all things equal. We had these in production planes for 1-2 years, and making sure people knew the limitations during auto-landing was a huge effort.

@cmic0
Contributor Author

cmic0 commented Sep 3, 2019

@ryanjAA I just did another test with the v2 and can confirm that its performance in hover is good (>30 m). Maybe for fixed-wing applications the degradation is due to the horizontal velocity.
I think the current values are good to go; the useful range can still be adjusted with EKF2_RNG_A_HMAX to avoid bad data going into the EKF.

@TSC21
Member

TSC21 commented Sep 3, 2019

@cmic0 can you rebase?

@TSC21 TSC21 requested a review from mcsauder September 4, 2019 10:01
Signed-off-by: Claudio Micheli <claudio@auterion.com>
Contributor

@mcsauder mcsauder left a comment


Nice work! Looks good from my end.


Member

@TSC21 TSC21 left a comment


All good from my side! I think we are good to merge now.
