Only 65535 characters are allowed #142

Closed
timcassell opened this issue Aug 25, 2021 · 18 comments · Fixed by #206
Labels
bug Something isn't working

Comments

@timcassell
Contributor

timcassell commented Aug 25, 2021

Tests pass, but the GitHub action fails due to too many characters (I'm not sure exactly what it is attempting to do).

I suspect this is due to me running over 1700 tests. Is there some way for me to get the action to pass properly?

Last lines of test run:

Processing file playmode-results.xml...
Trying to open artifacts/playmode-results.xml
File artifacts/playmode-results.xml parsed...
Start analyzing results: playmode-results.xml
✅ playmode-results.xml - 1721/1721 - Passed in 1353.219s
=================
Analyze result:
✅ Test Results - 1721/1721 - Passed in 1353.219s
Posting results for cdd86d6db71a1e977b579f30a8c8117bf780e8da
Error: Invalid request.

Only 65535 characters are allowed; 233873 were supplied.

[Edit] Forgot to include my yaml:

name: Actions

on: [push, pull_request]

jobs:
  test:
    name: Build my project
    runs-on: ubuntu-latest
    steps:
      # Checkout
      - name: Checkout repository
        uses: actions/checkout@v2

      # Test
      - name: Run tests
        uses: game-ci/unity-test-runner@v2
        env:
          UNITY_LICENSE: ${{ secrets.UNITY_LICENSE_2019_4_29F1_UBUNTU }}
        with:
          githubToken: ${{ secrets.GITHUB_TOKEN }}
          projectPath: ProtoPromise_Unity
          unityVersion: 2019.4.29f1
          testMode: PlayMode

      - uses: actions/upload-artifact@v2
        if: always()
        with:
          name: Test results
          path: artifacts
timcassell added the bug label on Aug 25, 2021
@davidmfinol
Member

That's a lot of tests!

The Unity Test Runner action uses the test results to generate markup and display it using the GitHub Checks API.
It looks like the large number of tests you have is causing too large a request to be sent to the Checks API, and they are rejecting it.
I did find this related issue: https://github.community/t/undocumented-65535-character-limit-on-requests/117564

The easiest workaround would be to remove githubToken: ${{ secrets.GITHUB_TOKEN }}, so that the Checks API is not called. But then you wouldn't be able to see the test results in the UI. You could upload the test results artifact as mentioned here in the GameCI docs, or you could try using a different action to display the results (though I expect they will have the same error):

      - uses: MirrorNG/nunit-reporter@v1.0.11
        if: always()
        with:
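          # note: this output reference assumes the test step above has 'id: test' set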
          path: ${{ steps.test.outputs.artifactsPath }}/*.xml
          access-token: ${{ secrets.GITHUB_TOKEN }}

Alternatively, maybe there's a way to split the results into smaller pieces, so that multiple smaller calls could be made to the Checks API?
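For reference, a minimal sketch (not from the thread) of that first workaround applied to the yaml in the issue description: drop githubToken so no Checks API request is made, and keep the artifact upload for inspecting results. The secret name and paths are copied from the yaml above; adjust as needed.

      # Sketch: test step without githubToken, so the Checks API is never called
      # and the 65535-character limit cannot be hit.
      - name: Run tests
        uses: game-ci/unity-test-runner@v2
        env:
          UNITY_LICENSE: ${{ secrets.UNITY_LICENSE_2019_4_29F1_UBUNTU }}
        with:
          projectPath: ProtoPromise_Unity
          unityVersion: 2019.4.29f1
          testMode: PlayMode

      # Results remain available as a downloadable artifact.
      - uses: actions/upload-artifact@v2
        if: always()
        with:
          name: Test results
          path: artifacts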

@timcassell
Contributor Author

timcassell commented Aug 25, 2021

I already have the store-artifacts step after the test run. I'm actually not too concerned about the UI for the tests; I just want it to report success/fail, and then I can manually inspect the test results artifact if needed. Is there a configuration option for that?

[Edit] I just realized that's exactly what removing the GitHub token does (facepalm). But yeah, it would be nice to be able to have the UI work for a large number of tests.

@timcassell
Contributor Author

Would it be possible to parse the results and only upload the failed tests? Typically those are the only ones I care about, and failures should usually be few enough to stay under that limit.

@davidmfinol
Member

I think most people like seeing the green checkmarks for each passing test.
But we could potentially add an option (maybe checkFailedOnly?) that could be set to true to filter the results.
Feel free to raise a PR if you think this is something you could add.
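Purely to illustrate the idea, usage of such an option might look like this (hypothetical sketch; checkFailedOnly does not exist at the time of writing):

      - name: Run tests
        uses: game-ci/unity-test-runner@v2
        with:
          githubToken: ${{ secrets.GITHUB_TOKEN }}
          projectPath: ProtoPromise_Unity
          unityVersion: 2019.4.29f1
          testMode: PlayMode
          # hypothetical option: report only failed tests to the Checks API,
          # keeping the request under the 65535-character limit
          checkFailedOnly: true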

@timcassell
Contributor Author

timcassell commented Aug 26, 2021

I'd love to, but I'm not too familiar with JS or GitHub Actions, so I wouldn't know where to start. I also don't have much time right now, but maybe I could dig into it in a few weeks or so if no one else does by then.

@timcassell
Contributor Author

I actually found https://github.com/marketplace/actions/test-reporter to solve this issue. It doesn't natively support the XML results that Unity outputs yet, but that's apparently in the works, and a workaround was posted by a user. So I will close this.

@melonmouse

melonmouse commented Mar 9, 2022

> I actually found https://github.com/marketplace/actions/test-reporter to solve this issue. It doesn't natively support the XML results that Unity outputs yet, but that's apparently in the works, and a workaround was posted by a user. So I will close this.

Should this bug/feature request be reopened? I suspect that supporting test suites with more tests / more test output would be a useful feature for many people!

The workaround using test-reporter does not work for me; it fails with a stackTrace.split is not a function error (as it does for others in that thread).

That workaround does work if you set reporter: java-junit (rather than jest-junit). Can't believe I read over that; sorry for any confusion!

@webbertakken
Member

Sure!

@davidmfinol
Member

I created an enhancement request with what I think may be the best solution to this issue: #169

@timcassell
Contributor Author

timcassell commented Mar 9, 2022

> Should this bug/feature request be reopened? I suspect that supporting test suites with more tests / more test output would be a useful feature for many people!
>
> The workaround using test-reporter does not work for me; it fails with a stackTrace.split is not a function error (as it does for others in that thread).

Changing jest-junit to java-junit works for me:

      # Workaround for NUnit XML (see https://github.com/dorny/test-reporter/issues/98#issuecomment-867106931)
      - name: Install NUnit
        if: always()
        run: |
          nuget install NUnit.Console -Version 3.12.0

      - name: Fetch transform code
        if: always()
        run: |
          wget https://raw.githubusercontent.com/nunit/nunit-transforms/master/nunit3-junit/nunit3-junit.xslt
        shell: bash

      - name: Transform NUnit3 to JUnit
        if: always()
        run: |
          Get-ChildItem . -Filter artifacts/*.xml | Foreach-Object {
            $xml = Resolve-Path $_.FullName
            $output = Join-Path ($pwd) ($_.BaseName + '_junit.xml')
            $xslt = New-Object System.Xml.Xsl.XslCompiledTransform;
            $xslt.Load("nunit3-junit.xslt");
            $xslt.Transform($xml, $output);
          }
        shell: pwsh

      - uses: dorny/test-reporter@v1
        if: always()
        with:
          name: "unity-test-results-${{ matrix.buildTarget }}-${{ matrix.scriptingRuntime }}-${{ matrix.mode }}-${{ matrix.progress }}-${{ matrix.pooling }}"
          path: "*_junit.xml"
          reporter: java-junit

      - uses: actions/upload-artifact@v2
        if: always()
        with:
          name: unity-test-results-${{ matrix.buildTarget }}-${{ matrix.scriptingRuntime }}-${{ matrix.mode }}-${{ matrix.progress }}-${{ matrix.pooling }}
          path: artifacts

Though, as someone else mentioned, it does lack stack traces; I can get those from the artifacts when I need them.

@davidmfinol
Member

I like the idea of having different options to display the test results, so maybe we could even make a new compound action that does the NUnit-to-JUnit transformation and then uses dorny/test-reporter to display the test results?
It would mean extra work + maintenance, so I'm not going to push for the idea, but I thought I'd at least suggest the possibility.

@timcassell
Contributor Author

timcassell commented Mar 10, 2022

It would be better if dorny/test-reporter just supported NUnit XML natively, but it seems dorny never got around to it. I posted in that issue asking about it. I think PRs would also be accepted if anyone wants to add that support there, in case dorny doesn't have the time.

Or Unity's test runner could offer extra test result format options, but that's outside the purview of this project (though someone did ask about it on the forums: https://forum.unity.com/threads/unity-test-framework-2-0-ready-for-feedback.1230126/#post-7884664).

@melonmouse

melonmouse commented Mar 10, 2022

For me, skipped/inconclusive tests were simply counted as passed. As a very ugly, hacky workaround, I added the following to my YAML right after the Transform NUnit3 to JUnit step:

      - name: Add "skipped" tags.
        run: |
          sed -i "s/\(<testcase.* status=\".nconclusive\".*>\)/\1 <skipped\/>/g" *_junit.xml
        shell: bash

edit: sorry if this is out of scope for this discussion!

@KurtGokhan

Instead of dorny/test-reporter, using EnricoMi/publish-unit-test-result-action worked better for me. Usage is very similar.

@timcassell
Contributor Author

> Instead of dorny/test-reporter, using EnricoMi/publish-unit-test-result-action worked better for me. Usage is very similar.

It looks like that only officially supports JUnit XML, not NUnit XML. Do you have to convert it?

@KurtGokhan

Yes, you have to convert it too. I used your code from above: https://github.com/ReactUnity/core/blob/main/.github/workflows/test.yml#L108

@timcassell
Contributor Author

timcassell commented Dec 5, 2022

Hm, looks like EnricoMi/publish-unit-test-result-action added NUnit XML support back in June. I'll have to try it out. It may be worth integrating that.

[Edit] Looks like it only works on Linux, not Windows.
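If that support works as advertised, the NUnit-to-JUnit transform steps above could presumably be dropped and the raw Unity results fed in directly. An untested sketch, assuming the action's files input (check its README for the exact input names of the version you pin):

      # Untested sketch: publish Unity's NUnit XML results directly, relying on
      # the action's NUnit support (Linux runners only, per the note above).
      - uses: EnricoMi/publish-unit-test-result-action@v2
        if: always()
        with:
          files: artifacts/*.xml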

@KurtGokhan

@timcassell thanks for the info. I could remove some boilerplate from my workflow file.
