[WIP] Integration tests: Discussion #10791
Conversation
I don't want to add every single developer as a reviewer (it seems like spamming everyone). Please just add your comments and thoughts.
@Stifael this sounds like a great idea. We have the CI capacity to run significantly more tests, and will soon have even more with @julianoes' faster-than-realtime work. I'll spend some time reviewing the submodule project in detail. The main things I want at a high level are to 1) make it easy to add new tests and 2) make it absolutely trivial to reproduce a failure locally.
@julianoes @JonasVautherin any word on the SDK submodule? Is this the "right" way to include the SDK as a dependency for testing?
@mrpollo: I would like it to install the latest release and test against that, but until that's ready the submodule is ok.
Nice, thanks Dennis for kicking this off. This is definitely helpful. In addition to Daniel's points, some things from my side:
Some more test cases that should be considered by the design:
About the negative points:

That is still possible even with post-ulog analysis; it just depends on the test. For instance, one can write a simple mission where the vehicle flies from point A to point B and expect the vehicle to cover that distance within a certain time T. The generated ulog file can still be used to check whether the vehicle accomplished that simple maneuver.
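A minimal sketch of that kind of post-flight check, assuming position and time series have already been extracted from the `.ulg` file (the arrays below are synthetic stand-ins; the 1 m tolerance and the 10 s budget T are illustrative, not values from this PR):

```python
# Post-flight check on a simple A-to-B mission: did the vehicle
# reach B within the time budget T? The trajectory data below is a
# synthetic stand-in for fields extracted from a .ulg via pyulog.
import numpy as np

def time_to_reach(timestamps_s, positions_m, target_m, tolerance_m=1.0):
    """Return the first timestamp at which the vehicle is within
    tolerance of the target position, or None if it never arrives."""
    distances = np.linalg.norm(positions_m - target_m, axis=1)
    arrived = np.nonzero(distances <= tolerance_m)[0]
    return timestamps_s[arrived[0]] if arrived.size else None

# Straight-line flight from A = (0, 0) to B = (20, 0) at 4 m/s,
# sampled at 10 Hz for 5 seconds.
t = np.linspace(0.0, 5.0, 51)
pos = np.column_stack((4.0 * t, np.zeros_like(t)))

arrival = time_to_reach(t, pos, target_m=np.array([20.0, 0.0]))
assert arrival is not None and arrival <= 10.0  # within the budget T
```

The same function would run unchanged on a log from a real flight, which is the point of the log-file-based approach.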
Agreed. However, I suggest having two files: one file with tests that always have to be satisfied, and one file with tests that are specific to a simulated test. For instance, the example above with the simple mission from waypoint A to B and expected time T is a very specific test that only applies to that particular mission. I would keep these tests separated from the other, general tests.

I think this should be possible with post analysis; it just depends on what you want to test. Since most PX4 flight behaviors are controlled by parameters, I think this is possible.

Same as above.

Same as above. Do you have anything in mind that is GPS-loss related and would not be possible to test through ulog?
That is for sure possible. In the simple tilt test that I added as an example here, the test uses the parameter to check the maximum tilt and uses the navigation-state topic to check the tilt for manual/stabilized only. The same logic can be applied to yaw or any other message. Right now I think it would be great to have a PR where the entire pipeline is executed. For that, however, I would need some help with the CI integration tests.
I was thinking about aborting a test or taking some other action. How do you plan to handle that?
Sure, I'd separate these as well. I don't see the general checks as individual tests, but rather as a set of checks that needs to be satisfied in every test. These checks can be made available to all tests through a common base class, for example.
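A minimal sketch of that base-class idea (the dataframe columns, the 5 m/s cruise limit, and the test names are assumptions for illustration, not the actual project layout):

```python
# General checks live in a mixin; every maneuver-specific TestCase
# inherits them, so each log is always run against the common set.
# Column names and limits are illustrative assumptions.
import unittest
import pandas as pd

class GeneralLogChecks:
    """Checks that must be satisfied in every test, real or simulated."""

    log: pd.DataFrame  # provided by the concrete TestCase

    def test_timestamps_monotonic(self):
        self.assertTrue(self.log["timestamp"].is_monotonic_increasing)

    def test_cruise_speed_respected(self):
        # e.g. horizontal speed must stay below a cruise limit + margin
        self.assertLessEqual(self.log["speed_xy"].max(), 5.0 + 0.5)

class SquareMissionTest(GeneralLogChecks, unittest.TestCase):
    """Maneuver-specific test: gets all general checks for free."""

    def setUp(self):
        # stand-in for a resampled .ulg dataframe
        self.log = pd.DataFrame({
            "timestamp": [0.0, 0.1, 0.2, 0.3],
            "speed_xy": [0.0, 3.2, 4.9, 1.0],
        })

    def test_all_waypoints_visited(self):
        self.assertGreater(len(self.log), 1)  # placeholder mission check
```

Keeping the general checks in a mixin (rather than a `TestCase` subclass) prevents the test runner from trying to execute them without a log attached.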
No, I think it will work.
What would be the use case for stopping a test?
I think we are talking about the same thing. Maybe we need to discuss it during the dev call. Right now, the way I planned the testing is as follows:

General tests: all the tests are within one file. Each test can either be called as a single test, or all tests can be run at once. Here, I would distinguish between CI and regular log files.

Regular log file: for a regular log file, I would like to run that file against all tests within the test file.

CI: for CI, we obviously need simulated maneuvers that generate ulog files for us. Some of the simulated tests require a specific test from the general tests.

My point is that I want to have both: either you run the log file against all tests, or you run it against the specific tests that you are interested in.
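The "both modes" idea can be sketched with Python's `unittest` loader — run the whole test file, or just one named test (the test names and checks below are placeholders, not tests from this PR):

```python
# Run either every test in the file or a single named test.
# The checks are placeholders for real ulog-based assertions.
import unittest

class GeneralTests(unittest.TestCase):
    def test_airspeed_sane(self):
        self.assertTrue(True)   # placeholder ulog check

    def test_altitude_sane(self):
        self.assertTrue(True)   # placeholder ulog check

loader = unittest.defaultTestLoader
all_tests = loader.loadTestsFromTestCase(GeneralTests)                   # run everything
one_test = loader.loadTestsFromName("test_airspeed_sane", GeneralTests)  # run one
```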
For example when in a mission, and the vehicle does not make any progress anymore.
Sure, I see it the same way. I guess it's best to see that once it's implemented (is it already complete?).
```shell
read -p "Continue (y/n)?" answer

if [[ $answer == [Yy] ]]; then
    mkdir $build_dir
```
Maybe first check that the submodule is checked out properly; otherwise this will fail and the script continues anyway.
```shell
fi

# start requested maneuver
cd $build_dir && ./maneuvers/$maneuver udp://:14540
```
The newline before EOF is missing.
Yes, that is the goal. It can be used by Flight Review. I think it would make sense to have a simple example of that testing pipeline in action. To make that work, I will probably need @dagar's help. The CI would need to run the following components:
@dagar I have now added a
Right now the logs are stored in the px4-src directory. TODO:
- move yaml to config directory
- deactivate tests based on TestClass name and method
TODO
…dded to the RTL test, it can now be used as well for testing
uloganalysis is now called pyulgresample: https://github.com/YUNEEC/pyulgresample
@Stifael Can you also add some information about this in https://docs.px4.io/en/log/flight_log_analysis.html#analysis-tools ?
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Is this really stale?
Yep. I will open a new PR instead and will close this one since it is outdated by now.
## Background

Currently PX4 evolves primarily through feature addition. However, these features are added without any guarantee that existing legacy behavior does not break. To reduce this uncertainty, there is now a clear effort to expand flight testing through the @PX4/testflights team. However, many implementations cannot be verified directly through flight testing and would instead require in-depth log analysis. In addition, many legacy implementations are no longer well known; they dangle around in the code, waiting for someone to break them so that they can be re-fixed afterwards because a few users depend on them. Although in the current state of PX4 there is no way around breaking and re-fixing (since a feature might simply be unknown to the developer), it would be desirable that after such a re-fix, that part of the code never breaks again, even if the feature is unknown to everyone.

One possible approach to this problem is integration tests. PX4 already uses an integration-test framework based on ROS or Dronekit. However, there are only about five integration tests overall, and they have already aged quite a lot. One reason the set of integration tests is not growing is probably that it is difficult to run these tests on a local machine without going through a nightmare of installing additional dependencies.
Luckily, `dronecode_sdk` has recently evolved quite a lot, which makes it very easy to set up simple missions or offboard-related maneuvers. Offboard is not fully there yet, but that is mainly due to a lack of resources and time. With `dronecode_sdk` as part of the PX4 Firmware, every developer can easily generate simple maneuvers that can then be used for integration testing.

## Integration test pipeline

The suggested integration-test pipeline differs from the integration tests known so far. Originally, an integration test consisted of a simulated maneuver with immediate test checking. The approach I suggest is based on log files: the simulated maneuver generates a `.ulg` file, and that `.ulg` file is the input to the integration tests.

There are several advantages to writing integration tests based on a `.ulg` file:

### Self-contained development of simulated maneuvers

The simulated maneuvers no longer require any tests. They can be implemented in such a way that they could also be flown in reality. For instance, a simple maneuver could be to fly a mission in a square where each waypoint is 5 m apart. Without any extra effort, that same mission can be flown on an actual vehicle as well. This means that if we have a set of maneuvers, the @PX4/testflights team could run these deterministic flight tests in addition to the usual flight tests.
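The square mission above reduces to plain waypoint generation; the maneuver itself would then feed these coordinates into SDK mission items. A sketch (the flat-earth lat/lon conversion is an illustrative approximation, valid only for small offsets):

```python
# Generate the four corners of a 5 m square mission, offset from a
# home position. Flat-earth approximation: fine for metre-scale
# offsets, not for long distances.
import math

EARTH_RADIUS_M = 6371000.0

def offset_latlon(lat_deg, lon_deg, north_m, east_m):
    """Shift a lat/lon by metric north/east offsets (small-offset approx.)."""
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon

def square_waypoints(home_lat, home_lon, side_m=5.0):
    corners = [(0.0, 0.0), (side_m, 0.0), (side_m, side_m), (0.0, side_m)]
    return [offset_latlon(home_lat, home_lon, n, e) for n, e in corners]

wps = square_waypoints(47.397742, 8.545594)  # PX4 SITL default home
```

Because the output is just a list of coordinates, the same four waypoints can be uploaded to SITL or to a real vehicle, which is what makes the maneuver reusable for actual flight tests.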
### Self-contained testing framework based on the `.ulg` file

We are completely free to choose any framework for testing. Since any `.ulg` file can be converted into a Python structure through `pyulog`, a good option would be a Python testing framework.

The tests can be split into general tests and simulated tests. For example, the cruise speed `MPC_XY_CRUISE` applies independently of simulation or actual flight. Consequently, one can write a general test that checks the cruise speed during a mission; this test can be applied to simulated maneuvers as well as to actual flight-test log files.

## Current state
### px4maneuvers

px4maneuvers is a simple project that uses `dronecode_sdk` to generate simple missions (https://github.com/Stifael/px4maneuvers). I added this project to the Firmware (which is this PR). To run the example mission, all you need to do is:

I added a `maneuver.sh` script that simplifies the build process. From the root of the PX4 Firmware, type:

In another terminal, start `sitl` as usual.

This project can now easily be extended. I also think it would help the active development of the SDK. For instance, if the PX4 community decides to use the SDK for integration testing as well, then I am confident that developers will be more active in shaping and contributing to the SDK project.
### uloganalysis

The name `uloganalysis` is not a really good name and needs to change eventually. That said, everything is just [WIP] anyway.

This project is here. Most of the details are provided in the `README.md`. The main objective of this project is to re-sample and merge the `pyulog` output into a pandas dataframe (similar to px4tools). The goal is for this project to stay simple, without adding too many fancy analysis methods. It also should not really analyze anything, but rather provide convenient functions to add and extract information from the `.ulg` file. For instance, roll, pitch, and yaw error can easily be computed from the `vehicle_attitude` and `vehicle_attitude_setpoint` messages. Another example is `tilt`, which is a fundamental constraint within the PX4 Firmware but is not logged. That information can be extracted with `uloganalysis`.
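As an illustration of that kind of derived quantity, tilt can be computed directly from the attitude quaternion: it is the angle between the body z-axis and the world z-axis. A sketch, assuming the `[w, x, y, z]` (Hamilton) ordering that PX4's `vehicle_attitude.q` uses:

```python
# Tilt extraction from a logged attitude quaternion: the angle between
# the body z-axis and the world z-axis.
import numpy as np

def tilt_from_quaternion(q):
    """Tilt angle in radians for a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    # z-component of the rotated body z-axis: R[2][2] = 1 - 2*(x^2 + y^2)
    cos_tilt = 1.0 - 2.0 * (x * x + y * y)
    return float(np.arccos(np.clip(cos_tilt, -1.0, 1.0)))

level = tilt_from_quaternion([1.0, 0.0, 0.0, 0.0])  # identity attitude -> 0.0
rolled = tilt_from_quaternion([np.cos(np.pi / 4), np.sin(np.pi / 4), 0.0, 0.0])  # 90 deg roll
```

Applied column-wise to a resampled `vehicle_attitude` dataframe, this gives exactly the kind of not-directly-logged signal the project is meant to expose.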
Another goal of this project is to enable other projects such as testing, visualization, etc. For instance, `PX4-FlightReview` is a good tool for sharing general flight-related information. `FlightPlot` is a good tool for going into a bit more detail and looking at messages that are not shown in `FlightReview`. However, computations such as the `tilt` computation are not possible there either. Hence, from a developer's perspective, I prefer a setup that is more flexible. Consequently, `uloganalysis` could be used in any Python project (for instance Jupyter notebooks, for sharing) and therefore lets the developer make full use of the Python tooling.

Once this project has matured, it could be packaged and distributed over PyPI (everything should already be in place).
### px4ulogtests

px4ulogtests is a project that consists of test scripts and is based on `uloganalysis`. The purpose of this project is to be used for integration testing, but also for actual flight testing and creating reports. Currently this project contains only one test, in which the vehicle's tilt during `Manual` and `Altitude` flight is checked. This is a general test, which means it can be applied to simulated maneuvers as well as actual flight tests.

## What is needed
First I need to know whether such a pipeline is something the PX4 community is interested in. If it is, then each project needs to be discussed and reviewed again. However, for personal reasons, I definitely don't want to have one large project that contains everything; to me it is crucial that `uloganalysis` stays independent. That said, I am fine with any project layout as long as it fulfills a similar purpose. We also need to discuss how to integrate such a pipeline into the current CI.

Once we have agreed on a pipeline (or declined it entirely), and we still want to use the proposed projects, I would like to move these repos over to the PX4 umbrella.