
Feature: Add support for ES modules #406

Open
michael-hhai opened this issue Aug 6, 2024 · 11 comments
michael-hhai commented Aug 6, 2024

It seems like OpenLLMetry auto-instrumentation doesn't work with ES modules. For what it's worth, this is chiefly an upstream problem with opentelemetry-js (see also open-telemetry/opentelemetry-js#4845). Just making a record of it here as well.

nirga (Member) commented Aug 6, 2024

Thanks @michael-hhai!
Do you mean that the auto-instrumentation doesn't work?

michael-hhai (Author) commented

Yes. Correct.

nirga (Member) commented Aug 6, 2024

Yes, this is well known. That's why we have this:
https://www.traceloop.com/docs/openllmetry/tracing/js-force-instrumentations

It's indeed more of an inherent problem with Node.js, but hopefully it will be fixed.

michael-hhai (Author) commented

For what it's worth, I do not have a minimal reproduction immediately handy, but I don't think the linked workaround works for ES modules either.

nirga (Member) commented Aug 6, 2024

@michael-hhai I'm 90% sure it works (wanted to say 99% but maybe I need to be more modest)

michael-hhai (Author) commented

I have the following reproduction of the issue with ES modules that I've tried to make as minimal as I can:

  1. Create a local tracer-module package as follows:
    a. Create index.ts as:
import { OpenAI } from 'openai';
import * as traceloop from "@traceloop/node-server-sdk";

class Tracer {
  public init(): void {
    traceloop.initialize({
      baseUrl: "http://example-url-does-not-exist.com/opentelemetry",
      apiKey: "FAKE-API-KEY",
      disableBatch: true,
      instrumentModules: {
        openAI: OpenAI,
      },
    });
  }

  public trace(fn: () => void): void {
    traceloop.withAssociationProperties(
      {
        thing: "thing",
      },
      fn,
    );
  }
}

export const tracer = new Tracer();

b. Create package.json as:

{
  "name": "tracer-module",
  "version": "1.0.0",
  "main": "dist/index.js",
  "type": "module",
  "exports": {
    ".": "./dist/index.js"
  },
  "scripts": {
    "build": "tsc"
  },
  "dependencies": {
    "@anthropic-ai/sdk": "^0.26.1",
    "@aws-sdk/client-bedrock-runtime": "^3.632.0",
    "@azure/openai": "^2.0.0-beta.1",
    "@google-cloud/aiplatform": "^3.26.0",
    "@google-cloud/vertexai": "^1.4.1",
    "@pinecone-database/pinecone": "^3.0.0",
    "@qdrant/js-client-rest": "^1.11.0",
    "@traceloop/node-server-sdk": "^0.10.0",
    "chromadb": "^1.8.1",
    "cohere-ai": "^7.12.0",
    "langchain": "^0.2.16",
    "llamaindex": "^0.5.17",
    "openai": "^4.56.0"
  }
}

c. Create tsconfig.json as:

{
  "compilerOptions": {
    "target": "ES6",
    "sourceMap": true,
    "module": "ESNext",
    "strict": true,
    "esModuleInterop": true,
    "moduleResolution": "node",
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "outDir": "./dist",
    "typeRoots": ["./node_modules/@types", "./types"]
  },
  "include": ["./**/*.ts", "./custom.d.ts"],
  "exclude": ["node_modules"]
}
  2. Build the package with npm install && npm run build.
  3. Link the package with npm link.
  4. Create a separate test project (e.g. js-test) as follows:
    a. Create index.ts like:
import { BatchInterceptor } from '@mswjs/interceptors'
import { ClientRequestInterceptor } from '@mswjs/interceptors/ClientRequest'
import { XMLHttpRequestInterceptor } from '@mswjs/interceptors/XMLHttpRequest'

const interceptor = new BatchInterceptor({
  name: 'my-interceptor',
  interceptors: [
    new ClientRequestInterceptor(),
    new XMLHttpRequestInterceptor(),
  ],
})

interceptor.apply()

interceptor.on('request', ({ request, requestId, controller }) => {
  console.log(request.method, request.url)
})

import { tracer } from 'tracer-module';
tracer.init();

import OpenAI from 'openai';

// src/index.ts
const helloWorld = (): string => {
  return "Hello, World!";
};

const main = async () => {
  const resolvedTracer = await tracer; // Await the tracer if it is a promise

  await resolvedTracer.trace(async () => {
    // Example call to OpenAI
    const openai = new OpenAI({
      apiKey: process.env.OPENAI_API_KEY,
    });

    try {
      const response = await openai.chat.completions.create({
        model: "gpt-4o",
        messages: [{ role: 'user', content: "Say Hello, World!" }],
        max_tokens: 5,
      });

      console.log(response.choices[0]?.message?.content);
    } catch (error) {
      console.error("Error calling OpenAI API:", error);
    }

    // Original helloWorld function call
    console.log(helloWorld());
  });

};

main().catch((error) => console.error(error));

b. Create tsconfig.json like:

{
  "compilerOptions": {
    "target": "es2020",
    "sourceMap": true,
    "module": "commonjs",
    "strict": true,
    "esModuleInterop": true,
    "moduleResolution": "node",
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "outDir": "./dist",
    "typeRoots": ["./node_modules/@types", "./types"]
  },
  "include": ["./**/*.ts", "./custom.d.ts"],
  "exclude": ["node_modules"]
}

c. Create package.json like:

{
    "type": "module",
    "devDependencies": {
        "tsx": "^4.16.5"
    },
    "dependencies": {
        "@mswjs/interceptors": "^0.34.0",
        "@opentelemetry/instrumentation": "^0.52.1",
        "@traceloop/node-server-sdk": "^0.10.0",
        "honeyhive": "^0.6.4",
        "node-request-interceptor": "^0.6.3",
        "openai": "^4.54.0",
        "tracer-module": "file:../tracer-module"
    }
}
  5. Run the test file with npx tsx index.ts. You should see output like:
Traceloop exporting traces to http://example-url-does-not-exist.com/opentelemetry
POST https://api.openai.com/v1/chat/completions
Hello, World!
Hello, World!

Note that there is no call to http://example-url-does-not-exist.com/opentelemetry/v1/traces.
  6. Delete "type": "module" from the package.json so that it now looks like:

{
    "devDependencies": {
        "tsx": "^4.16.5"
    },
    "dependencies": {
        "@mswjs/interceptors": "^0.34.0",
        "@opentelemetry/instrumentation": "^0.52.1",
        "@traceloop/node-server-sdk": "^0.10.0",
        "honeyhive": "^0.6.4",
        "node-request-interceptor": "^0.6.3",
        "openai": "^4.54.0",
        "tracer-module": "file:../tracer-module"
    }
}
  7. Re-run npx tsx index.ts. You should see output like:
Traceloop exporting traces to http://example-url-does-not-exist.com/opentelemetry
POST https://api.openai.com/v1/chat/completions
Hello, World!
Hello, World!
POST http://example-url-does-not-exist.com/opentelemetry/v1/traces

Note that this does make a call to POST http://example-url-does-not-exist.com/opentelemetry/v1/traces.

An interesting detail is that this behavior depends on tracer-module being an external dependency rather than just another file within the same project. If the tracer module is just another file within the same project, it traces correctly regardless of whether the project has "type": "module".

nirga (Member) commented Aug 16, 2024

Thanks! I think it's related to openai/openai-node#903

A possible workaround can be:

import { register } from "node:module";

register("import-in-the-middle/hook.mjs", import.meta.url, {
  parentURL: import.meta.url,
  data: { include: ["openai"]},
});

Then, when running node, you'd load that file with node --import ./loader.js

michael-hhai (Author) commented

I've gone down that rabbit hole and I can't get anything like that to work. Do you know what exactly needs to happen for openllmetry-js (really opentelemetry-js) to successfully trace the calls? Do you know how code execution differs between ES modules and non-ES modules? I don't currently know the answers to those questions, so I feel as if I'm randomly mashing buttons at the moment.

ericallam commented

It is indeed a pain to get this to work, but it's definitely possible (I've done it). A couple of notes:

  • Make sure you aren't bundling the openai package, it must be external.
  • Make sure, like 100% super-duper sure, that you are setting up the instrumentation BEFORE you import "openai". The easiest way to do that is to dynamically import the file that does the import "openai", from the file that does the instrumentation/tracing setup.
  • Do your own setup using the node --import ./loader.js trick @nirga mentions above.
  • This means you probably want to add the import-in-the-middle dependency yourself.
  • Make sure import-in-the-middle is not bundled.
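
The second bullet (setup strictly before the "openai" import) can be sketched as follows. The file name and the data: URL stand-in are illustrative: in a real project the dynamic import would target the file containing import OpenAI from "openai", and the flag would be replaced by traceloop.initialize(...); the flag is used here only so the ordering is observable.

```javascript
// instrument.js (sketch): run instrumentation setup first, then load the app.
globalThis.__instrumentationReady = false;

function initInstrumentation() {
  // stands in for traceloop.initialize(...) / loader-hook registration
  globalThis.__instrumentationReady = true;
}

initInstrumentation();

// A dynamic import is evaluated *after* the statements above, unlike a
// static `import "./app.js"`, which Node hoists before any other code runs.
// The data: URL stands in for "./app.js".
import("data:text/javascript,export const sawInit = globalThis.__instrumentationReady;")
  .then((app) => {
    console.log("app saw instrumentation ready:", app.sawInit); // true
  });
```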

michael-hhai (Author) commented

Would you mind spelling out exactly what all of that entails in terms of the minimal reproduction posted above? I'm not doing any bundling there, so at least that is not an issue.

michael-hhai (Author) commented

@ericallam Any help with the above? I still have had no luck converting the minimal reproduction above into something that works (i.e., one that calls POST http://example-url-does-not-exist.com/opentelemetry/v1/traces when package.json contains "type": "module"). I don't know if I'm deciphering what you're saying correctly.
