
[BUG]: Tests are incompatible with newer versions of camouflage #967

Closed
2 tasks done
dagardner-nv opened this issue Jun 5, 2023 · 0 comments · Fixed by #1195
Labels
bug Something isn't working

Comments

@dagardner-nv
Contributor

Version

23.07

Which installation method(s) does this occur on?

Source

Describe the bug.

We originally pinned camouflage to 0.9 due to testinggospels/camouflage#203; the fixes for that issue were rolled into camouflage 0.13.0, but our tests still fail against the newer releases.

Minimum reproducible example

npm install -g camouflage-server
pytest -x --run_slow
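For comparison, the tests pass when the previously pinned release is installed instead. A command sketch (the exact npm version spec "@0.9.0" is an assumption; the issue only says "0.9"):

```shell
# Install the known-good pinned release from this issue, then re-run the slow tests.
npm install -g camouflage-server@0.9.0
pytest -x --run_slow
```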

Relevant log output

tests/test_abp.py .F

================================================================================= FAILURES ==================================================================================
___________________________________________________________________________ test_abp_cpp[use_cpp] ___________________________________________________________________________

config = Config(debug=False, log_level=30, log_config_file=None, plugins=None, mode=<PipelineModes.FIL: 'FIL'>, feature_length=...m_clock', 'nvidia_smi_log.gpu.max_clocks.video_clock', 'nvidia_smi_log.gpu.max_customer_boost_clocks.graphics_clock']))
tmp_path = PosixPath('/tmp/pytest-of-dagardner/pytest-15/test_abp_cpp_use_cpp_0')

    @pytest.mark.slow
    @pytest.mark.use_cpp
    @pytest.mark.usefixtures("launch_mock_triton")
    def test_abp_cpp(config, tmp_path):
        config.mode = PipelineModes.FIL
        config.class_labels = ["mining"]
        config.model_max_batch_size = MODEL_MAX_BATCH_SIZE
        config.pipeline_batch_size = 1024
        config.feature_length = FEATURE_LENGTH
        config.edge_buffer_size = 128
        config.num_threads = 1
    
        config.fil = ConfigFIL()
        config.fil.feature_columns = load_labels_file(os.path.join(TEST_DIRS.data_dir, 'columns_fil.txt'))
    
        val_file_name = os.path.join(TEST_DIRS.validation_data_dir, 'abp-validation-data.jsonlines')
        out_file = os.path.join(tmp_path, 'results.csv')
        results_file_name = os.path.join(tmp_path, 'results.json')
    
        pipe = LinearPipeline(config)
        pipe.set_source(FileSourceStage(config, filename=val_file_name, iterative=False))
        pipe.add_stage(DeserializeStage(config))
        pipe.add_stage(PreprocessFILStage(config))
    
        # We are feeding TritonInferenceStage the port to the grpc server because that is what the validation tests do
        # but the code under-the-hood replaces this with the port number of the http server
        pipe.add_stage(
            TritonInferenceStage(config, model_name='abp-nvsmi-xgb', server_url='localhost:8001',
                                 force_convert_inputs=True))
        pipe.add_stage(MonitorStage(config, description="Inference Rate", smoothing=0.001, unit="inf"))
        pipe.add_stage(AddClassificationsStage(config))
        pipe.add_stage(AddScoresStage(config, prefix="score_"))
        pipe.add_stage(
            ValidationStage(config, val_file_name=val_file_name, results_file_name=results_file_name, rel_tol=0.05))
        pipe.add_stage(SerializeStage(config))
        pipe.add_stage(WriteToFileStage(config, filename=out_file, overwrite=False))
    
>       pipe.run()

tests/test_abp.py:155: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
morpheus/pipeline/pipeline.py:598: in run
    asyncio.run(self.run_async())
../conda/envs/m2/lib/python3.10/asyncio/runners.py:44: in run
    return loop.run_until_complete(main)
../conda/envs/m2/lib/python3.10/asyncio/base_events.py:649: in run_until_complete
    return future.result()
morpheus/pipeline/pipeline.py:576: in run_async
    await self.join()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <morpheus.pipeline.linear_pipeline.LinearPipeline object at 0x7f8f9a7cb610>

    async def join(self):
        """
        Suspend execution all currently running stages and the MRC pipeline.
        Typically called after `stop`.
        """
        try:
>           await self._mrc_executor.join_async()
E           RuntimeError: Triton Error while executing 'client->Infer(&results, m_options, inputs, outputs)'. Error: failed to parse the request JSON buffer: Invalid value. at 0
E           ../morpheus/_lib/src/stages/triton_inference.cpp(306)

morpheus/pipeline/pipeline.py:327: RuntimeError
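The `failed to parse the request JSON buffer: Invalid value. at 0` message means the response body was rejected at the very first byte, i.e. the mock server returned something that is not JSON at all. A minimal Python sketch of that failure mode (illustrative only; this is not Morpheus or Triton client code):

```python
import json

def parse_body(buf: str):
    """Return the decoded JSON, or the parser's error message on failure."""
    try:
        return json.loads(buf)
    except json.JSONDecodeError as exc:
        return str(exc)

# An empty or non-JSON body fails at offset 0, analogous to
# the Triton client's "Invalid value. at 0" above.
print(parse_body(""))              # error at char 0
print(parse_body('{"ok": true}'))  # valid JSON parses normally
```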

Full env printout

No response

Other/Misc.

No response

Code of Conduct

  • I agree to follow Morpheus' Code of Conduct
  • I have searched the open bugs and have found no duplicates for this bug report
@dagardner-nv dagardner-nv added the bug Something isn't working label Jun 5, 2023
rapids-bot bot pushed a commit that referenced this issue Sep 18, 2023
…1195)

* Adopt camouflage-server 0.15; we were previously locked on v0.9 due to outstanding bugs introduced in versions 0.10 through 0.14.1:
  - testinggospels/camouflage#203
  - testinggospels/camouflage#223  
  - testinggospels/camouflage#227
  - testinggospels/camouflage#229
* Includes an unrelated fix for running CI locally
* Restrict mlflow to versions prior to 2.7
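A ceiling like the mlflow restriction above is typically expressed as a version specifier in the dependency list; a hypothetical requirements-style line (the PR only states "prior to 2.7", so the form below is an assumption):

```
# hypothetical pip/conda-style specifier matching "versions prior to 2.7"
mlflow<2.7
```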

closes #967
Fixes #1192

## By Submitting this PR I confirm:
- I am familiar with the [Contributing Guidelines](https://github.com/nv-morpheus/Morpheus/blob/main/docs/source/developer_guide/contributing.md).
- When the PR is ready for review, new or existing tests cover these changes.
- When the PR is ready for review, the documentation is up to date with these changes.

Authors:
  - David Gardner (https://github.com/dagardner-nv)

Approvers:
  - Christopher Harris (https://github.com/cwharris)

URL: #1195