
Simplify test files #89

Merged · 32 commits · Jul 20, 2018

Conversation

jaydenseric
Owner

@jaydenseric jaydenseric commented Jul 13, 2018

Instead of using three separate JSON, JPG, and SVG test files, use one multipurpose test file.

Because the multipurpose test file (at 100 KB) is larger than the previous JSON and SVG files, it may reveal more bugs, as it will be received in at least 2 chunks.

@mike-marcacci I don't know why, but these changes cause some tests to fail. Maybe you can tell why? Hopefully it's something silly I've done and not a bug with the new approach 😅

@jaydenseric
Owner Author

Intermittently (perhaps once every 5 local test runs?), I have been seeing this "test count exceeds plan" error too:

[Screenshot from 2018-07-13: test output showing the "test count exceeds plan" error among other failures.]

@mike-marcacci
Collaborator

mike-marcacci commented Jul 18, 2018

@jaydenseric - just to confirm, only the "Aborted request" test failure is intermittent, correct? (I see the others consistently failing on my machine.)

@jaydenseric
Owner Author

@mike-marcacci pretty sure.

@jaydenseric
Owner Author

jaydenseric commented Jul 18, 2018

In test-file.txt I put the 0000000end text at, well, the end to match the placement </svg> had. But maybe we should use a 0000000middle0000000 marker instead.
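
Purely for illustration (the real test-file.txt is a checked-in fixture, not generated): roughly 100 KB of filler with a marker either at the end or in the middle, so assertions can tell how much of the stream was received.

const fs = require('fs')

const filler = '0'.repeat(50 * 1024)

// End marker, matching where </svg> used to sit:
fs.writeFileSync('test-file.txt', filler + filler + '0000000end')

// Or a middle marker instead:
// fs.writeFileSync('test-file.txt', filler + '0000000middle0000000' + filler)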

@mike-marcacci
Collaborator

I saw that – I think either strategy should be fine, since we don't ever send the </svg> or end in the transform stream.

Working off of this branch, though, I'm not seeing the failure you describe above (I've probably run it 20 times). What OS / node version are you running? Just want to make sure I'm seeing the same things.

@jaydenseric
Owner Author

Interesting, as Travis failed on Node.js v10.6 (what I also run locally) and v8.5. I'm on macOS v10.13.6.

@mike-marcacci
Collaborator

So, I see the same thing as Travis (I'm also running node v10.6 on macOS): Errors 2 and 3 from your screenshot appear every time; error 1 never appears though. I'll look into 2 and 3, and see if that leads somewhere.

Collaborator

@mike-marcacci mike-marcacci left a comment


These changes account for both errors in Travis, and errors 2 and 3 in your screenshot... but I haven't been able to reproduce the first error in your screenshot.

@@ -851,8 +822,8 @@ t.test('Exceed max file size.', async t => {
})
)

body.append(1, fs.createReadStream(TEST_FILE_PATH_JPG))
body.append(2, fs.createReadStream(TEST_FILE_PATH_JSON))
Collaborator


The test checks:

  1. That an upload that exceeds the max file size triggers an error
  2. That a subsequent upload does not error

Because both uploads now exceed the max file size, they both error, which is the correct behavior. We could either introduce another smaller test file, or remove the second half of the test.
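
For illustration, a minimal sketch of that shape (the fixture path constants here are hypothetical):

const fs = require('fs')

// The first file exceeds the configured max file size and should error;
// the second, smaller file should be accepted.
body.append('1', fs.createReadStream(TEST_FILE_PATH_LARGE))
body.append('2', fs.createReadStream(TEST_FILE_PATH_SMALL))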

Owner Author


I'll add a second smaller test file.

@mike-marcacci
Collaborator

I added a commit to fix these stderr messages in Node 8.

When these requests are aborted, the socket experiences a parse error, which gets emitted as a clientError on the HTTP server. If there is a listener for this event, it is responsible for closing the socket.

In Node 8, the default behavior results in a message being written to stderr, so I'm now destroying the socket without passing the error.
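
A minimal sketch of that approach, assuming a plain Node.js http server rather than the repo's exact setup:

const http = require('http')

const server = http.createServer((request, response) => {
  // …normal request handling…
})

// Once a clientError listener is attached, it is responsible for the
// socket. Destroying it without passing the error avoids Node 8 writing
// the parse error to stderr.
server.on('clientError', (error, socket) => {
  socket.destroy()
})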

@jaydenseric
Owner Author

jaydenseric commented Jul 18, 2018

I'm actually looking at getting rid of actual test files altogether. It's better if tests create exactly the sort of file and assertions they need independently.

Using this alternative form-data API:

- body.append(1, fs.createReadStream(TEST_FILE_PATH))
+ body.append(1, 'Text file content.', { filename: 'test-file.txt' })
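
For context, a hedged sketch of building a request body entirely inline with the form-data package (the operations and map values here are illustrative, not copied from the repo's tests):

const FormData = require('form-data')

const body = new FormData()

body.append('operations', JSON.stringify({ variables: { file: null } }))
body.append('map', JSON.stringify({ 1: ['variables.file'] }))

// No fixture file on disk: the test supplies exactly the content it needs.
body.append('1', 'Text file content.', { filename: 'test-file.txt' })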

@mike-marcacci
Collaborator

Ooh! That's a fantastic idea. I did just push the "small file" commit here and was about to hit you up on Slack. I'll defer to your changes here so we don't step on each other's toes. Let me know if you see that first error again, or have an idea how it might be reproduced!

@jaydenseric
Owner Author

@mike-marcacci there are now only 3 usages of fs.createReadStream(TEST_FILE_PATH). Very strange, but when I swap them for the alternative syntax (e.g. body.append('1', 'a', { filename: 'a.txt' })) the tests error. Theoretically the change should not matter. The only thing I can think is that it runs faster than an fs lookup, revealing a race condition.

With a console.log of ctx.request.body.variables.files[0] in the "Exceed max files with extraneous files interspersed" test, when using fs you can get a pending promise:

Promise {
  <pending>,
  domain:
   Domain {
  jobs: 1,
  buffered: true,
  occupied: false,
  pool: Pool { length: 0, head: null, tail: null },
  queue: [],
  subtests: [],
  output: '    TAP version 13\n    1..2\n',
  skip: false,
  todo: false,
  only: false,
  results: null,
  options: {} } ] } }

But when using the inline-created files, the upload promises are resolved (with a file stream error?) before they can be accessed in the next middleware:

Promise {
  { stream:
   Capacitor {
 _writableState: [WritableState],
 writable: false,
 domain: [Domain],
 _events: [Object],
 _eventsCount: 4,
 _maxListeners: undefined,
 path: '',
 fd: null,
 flags: 'w+',
 mode: 438,
 start: undefined,
 autoClose: false,
 pos: undefined,
 bytesWritten: 0,
 closed: false,
 _reader: [Reader] },
  filename: 'a.txt',
  mimetype: 'text/plain',
  encoding: '7bit' },
  domain:
   Domain {
 domain: null,
 _events:
  { removeListener: [Function: updateExceptionCapture],
    newListener: [Function: updateExceptionCapture],
    error: [Function] },
 _eventsCount: 3,
 _maxListeners: undefined,
 members:
  [ Test { name: 'Express middleware.',
  jobs: 1,
  buffered: true,
  occupied: false,
  pool: Pool { length: 0, head: null, tail: null },
  queue: [],
  subtests: [],
  output:
   '    TAP version 13\n    1..2\n    not ok 1 - 2 max file uploads exceeded.\n      ---\n      stack: |\n        Busboy.parser.once (lib/middleware.js:160:9)\n        Busboy.emit (node_modules/busboy/lib/main.js:38:33)\n        PartStream.<anonymous> (node_modules/busboy/lib/types/multipart.js:179:17)\n        HeaderParser.<anonymous> (node_modules/dicer/lib/Dicer.js:51:16)\n        HeaderParser._finish (node_modules/dicer/lib/HeaderParser.js:68:8)\n        SBMH.<anonymous> (node_modules/dicer/lib/HeaderParser.js:40:12)\n        SBMH._sbmh_feed (node_modules/streamsearch/lib/sbmh.js:159:14)\n      at:\n        line: 160\n        column: 9\n        file: lib/middleware.js\n        function: Busboy.parser.once\n      type: MaxFilesUploadError\n      status: 413\n      expose: true\n      test: Express middleware.\n      source: |\n        new _errors.MaxFilesUploadError(\n      ...\n    \n    # test count(1) != plan(2)\n # failed 1 test\n',
  skip: false,
  todo: false,
  only: false,
  results:
   FinalResults {
 ok: false,
 count: 1,
 pass: 0,
 fail: 1,
 bailout: false,
 todo: 0,
 skip: 0,
 plan:
  FinalPlan { start: 1, end: 2, skipAll: false, skipReason: '', comment: '' },
 failures: [ [Result] ] },
  options: { at: null } } ] } }

@jaydenseric
Owner Author

jaydenseric commented Jul 19, 2018

BTW, I'm aware that the change to use new Promise(async resolve => is an antipattern; I'm looking into the right way to do it. In the first place, I'm not really sure why the outer tests finish before the inner middleware tests complete; this was not an issue when using t.plan() in graphql-api-koa: https://github.com/jaydenseric/graphql-api-koa/blob/v1.0.0/src/test.mjs#L66.
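
For reference, the antipattern in minimal, self-contained form (illustrative only, not the actual test code):

// Antipattern: with an async executor, a throw or rejection before
// resolve() is swallowed by the promise the executor itself returns,
// so this outer promise never settles; only an unhandled rejection
// warning surfaces.
const antipattern = new Promise(async resolve => {
  const value = await Promise.reject(new Error('boom'))
  resolve(value)
})

// One conventional alternative: an async IIFE produces a promise that
// rejects normally and can be awaited or caught.
const alternative = (async () => {
  return await Promise.reject(new Error('boom'))
})()

alternative.catch(error => console.error(error.message)) // logs 'boom'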

@mike-marcacci
Collaborator

OK! It looks like one of these errors was a problem with the test expecting an end that wasn't supplied by the new file.

The other one, however, was actually a bug in that the extraneous stream was unhandled. When the previous file was larger, a stream wasn't created, and so no error needed to be handled. With the smaller file, though, this bug was exposed.

@jaydenseric
Owner Author

@mike-marcacci Thanks for the bugfix! It had me tearing my hair out.

I don't know why the tests are currently passing on one of the Travis Node.js versions, as they hang for me at the Aborted request test. I had tried a lot of things to account for the expected end, but it would always hang.

It seems this is a little too aggressive, and sometimes the tests run pretty slowly. There are a lot of ways errors in these tests can cause the process to hang; perhaps a higher-level Travis or tap timeout would be better.
This reveals an issue where certain tests were not even running.
@mike-marcacci
Collaborator

I just checked in on this, and see why the abort test is hanging: using a single character (especially a number) as the abort signal probably isn't a great idea, since the actual multipart request is going to contain a randomly generated boundary to construct the delimiter between parts:

----------------------------231979539665073886008199
Content-Disposition: form-data; name="operations"
{"variables":{"fileA":null,"fileB":null,"fileC":null}}
----------------------------231979539665073886008199
Content-Disposition: form-data; name="map"
{"1":["variables.fileA"],"2":["variables.fileB"],"3":["variables.fileC"]}
----------------------------231979539665073886008199

Some sort of memory leak though, as the test script no longer closes.
@jaydenseric
Owner Author

jaydenseric commented Jul 20, 2018

Yep, I'm a few steps ahead of that. Made the discovery that it is possible for a request to not be sent at all when aborting, if Node.js has not yet sent any packets. The trick is to use a timeout of 50 or so ms to be sure some of the first packets have been sent out. I fixed the tests (except some todos), but now there is some sort of memory leak I can't easily work out; the test script doesn't exit.
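
A rough sketch of that timing workaround (the options and body content are hypothetical; the real tests differ):

const http = require('http')

// Hypothetical request against a local test server.
const request = http.request({ method: 'POST', port: 3000, path: '/' })

// Aborting surfaces as an error on the request, which is expected here.
request.on('error', () => {})

// Start streaming the multipart body…
request.write('…first part of the multipart body…')

// …then wait ~50 ms so the first packets have actually been sent out
// before aborting.
setTimeout(() => request.abort(), 50)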

@mike-marcacci
Collaborator

Sounds like a socket handle hasn't been released. Instead of using a timeout (which is definitely a bit of a race) or a signal in the data, you could just keep track of the byte count and abort somewhere in the middle of that large payload.
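
A hedged sketch of that byte-counting idea (the helper name and threshold are made up for illustration):

const { Transform } = require('stream')

// Count bytes as the multipart body streams out and abort the request
// once past a threshold known to fall inside the large file part.
const abortPastBytes = (request, threshold) => {
  let seen = 0
  return new Transform({
    transform(chunk, encoding, callback) {
      seen += chunk.length
      if (seen > threshold) request.abort()
      callback(null, chunk)
    }
  })
}

// Hypothetical usage: body.pipe(abortPastBytes(request, 10000)).pipe(request)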

@mike-marcacci
Collaborator

mike-marcacci commented Jul 20, 2018

But, of course, I'm arguing against a straw man of your implementation that I just imagined... so you can probably ignore me :)

I'm off to bed (have to drive my wife to the airport in like 4 hours) but let me know if you want me to look over anything as I'll have some time tomorrow!

@jaydenseric
Owner Author

I celebrated too soon; for some reason the process doesn't exit again. @mike-marcacci I have to go out to an event now; maybe you can work it out. While debugging, I suggest having a task manager open listing all Node processes for manual force-quitting.
