While implementing #678, I discovered a bug in how CommandLineParser handles sequences. In the Simple_Options class, IntSequence is defined as an Option of type IEnumerable<int> with Min=3 and Max=4, and LongValue is defined as a Value with index 0. I believe what should happen is that CommandLineParser should take the Max=4 on IntSequence into account and stop assigning values to IntSequence once it has assigned 4 of them. The next value (256) would then be a value parameter and would be assigned to LongValue, since LongValue has index 0. What actually happens is that 256 also gets assigned to IntSequence, and ParseArguments then returns a NotParsed object with a SequenceOutOfRangeError, because IntSequence received 5 values and its max is 4.
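The expected behavior described above can be modeled with a short sketch. This is not the library's actual C# implementation, just a hypothetical illustration of the assignment semantics: a sequence option should stop consuming tokens once it holds Max items, and the next token should fall through to the value parameter at index 0.

```python
# Hypothetical model of the max-aware assignment described above
# (not CommandLineParser's actual code).

def assign(tokens, max_items):
    """Split tokens into (sequence_values, leftover_values):
    the sequence takes at most max_items, the rest are left over
    for value parameters."""
    return tokens[:max_items], tokens[max_items:]

# With Max=4 on IntSequence, a token run ending in 256 should split so
# that 256 is left over and binds to the Value with index 0 (LongValue):
int_sequence, leftovers = assign([1, 2, 3, 4, 256], max_items=4)
assert int_sequence == [1, 2, 3, 4]
assert leftovers[0] == 256  # becomes LongValue
```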
There's currently a commented-out code block in TokenPartitioner.cs that would check a sequence's Max attribute (if it has one) and stop assigning Value tokens to that sequence once the max is reached. I added it while implementing #678, then commented it out when I realized it made a different test fail: the Breaking_max_constraint_in_string_sequence_gererates_SequenceOutOfRangeError test in InstanceBuilderTests.cs ("gererates" in its title is a misspelling of "generates") actually expects the current behavior, where the tokenizer swallows too many tokens and then throws an error. I believe this is wrong: the test should expect the parser to succeed, with one leftover arg that didn't get inserted into the Parsed instance. (That extra arg wouldn't be accessible to the user anywhere, but that's what #680 is about.)
Another test that would fail if I enable the "stop when you've reached Max number of items" behavior is the Partition_sequence_multi_instance_with_max test in SequenceTests.cs. It takes a command line like --str=strvalue freevalue --seq seqval0 seqval1 -x freevalue2 --seq seqval2 seqval3 --seq seqval4 seqval5, where seq is a sequence with Max=3. The test expects seq to end up with six values (because --seq is specified three times with 2 values each, so no single use exceeds the max), but I believe it should receive only three values in total. The question here is: what would library users expect when they declare a sequence like StringSequence with Max=3?
Would they expect StringSequence to be able to hold more than 3 items? My answer is no: if library users set Max=3 on StringSequence and it ends up with 6 items, they would rightfully consider that a bug. The Max attribute should describe the maximum total number of items that can be assigned to StringSequence, not the maximum that can be assigned in any one single use of the -s parameter on the command line. So when the command line -s 1 2 -s 3 4 -s 5 6 is parsed, I believe the value of StringSequence should be { "1", "2", "3" } and the three extra args should be ignored.
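The proposed semantics can be sketched as follows. This is a hypothetical model, not the library's partitioner: Max caps the total number of items collected across every occurrence of the option, rather than the number consumed per occurrence.

```python
# Hypothetical model of the proposed semantics: Max limits the TOTAL
# item count across all uses of the option on the command line.

def partition_multi(instances, max_total):
    """instances: one list of values per occurrence of the option
    (e.g. per use of -s). Returns (collected, extra)."""
    collected, extra = [], []
    for values in instances:
        for v in values:
            (collected if len(collected) < max_total else extra).append(v)
    return collected, extra

# "-s 1 2 -s 3 4 -s 5 6" with Max=3 on StringSequence:
seq, ignored = partition_multi([["1", "2"], ["3", "4"], ["5", "6"]], max_total=3)
assert seq == ["1", "2", "3"]
assert ignored == ["4", "5", "6"]
```

Under the current behavior the same input yields all six values in seq, which is what the Partition_sequence_multi_instance_with_max test asserts today.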
I'll be submitting a PR shortly to fix this bug, which was only just introduced when #678 was merged.
+1 for considering this a bug. I spent a lot of time trying to figure out what I was doing wrong while trying to specify a sequence that would accept exactly 2 values, no more and no less. The parser correctly throws an error when the option is used with too few values for the sequence, but extra values do not cause an error, which leaves one to wonder why Max exists as a property at all.
The documentation states "You can also specify a Min constraint or a Max constraint alone: this means you only want check for minimum or maximum number of elements. Breaking a constraint will cause parsing to fail", but this appears to be the case only for the Min constraint at present; the Max constraint appears to be ignored entirely.
[Option('h', "historic", Required = false, Separator = ',', Max = 2, Min = 2, SetName = "historic", HelpText = "This should error if three values are specified, but it does not do so.")]
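The check this commenter expects can be sketched as a simple range validation. This is a hypothetical illustration of the expected behavior, not the library's actual validation code: with Min=2 and Max=2, anything other than exactly two values should produce a SequenceOutOfRangeError.

```python
# Hypothetical sketch of the constraint check the commenter expects:
# a sequence with Min=2 and Max=2 must receive exactly two values.

def check_range(values, min_items, max_items):
    """Return the error name when the count is out of range, else 'ok'."""
    if not (min_items <= len(values) <= max_items):
        return "SequenceOutOfRangeError"
    return "ok"

assert check_range(["a", "b"], 2, 2) == "ok"
assert check_range(["a"], 2, 2) == "SequenceOutOfRangeError"          # too few: errors today
assert check_range(["a", "b", "c"], 2, 2) == "SequenceOutOfRangeError"  # too many: silently accepted today
```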