
chore(sample-app): langserve example app #1043

Open — wants to merge 1 commit into main
Conversation

nirga (Member) commented May 14, 2024

  • I have added tests that cover my changes.
  • If adding a new instrumentation or changing an existing one, I've added screenshots from some observability platform showing the change.
  • PR name follows conventional commits format: feat(instrumentation): ... or fix(instrumentation): ...
  • (If applicable) I have updated the documentation accordingly.

@nirga nirga changed the title chore(sample-app): lengserve example app chore(sample-app): langserve example app May 14, 2024
@damianoneill commented:
@nirga would it be possible to include in the README.md a couple of examples of what you'd expect to see in the trace? I connected Traceloop as follows:

import logging

from traceloop.sdk import Traceloop

logger = logging.getLogger(__name__)

try:
    Traceloop.init(
        app_name="Langchain Chatbot Application",
        api_endpoint="http://localhost:4318",  # HTTP endpoint for the OpenTelemetry (Jaeger) collector
        disable_batch=True,
    )
except Exception as e:  # pylint: disable=broad-except
    logger.error("Failed to initialize Traceloop: %s", e)

And started up a Jaeger container as follows:

docker run --name jaeger \
  -e COLLECTOR_ZIPKIN_HOST_PORT=:9411 \
  -p 5775:5775/udp \
  -p 6831:6831/udp \
  -p 6832:6832/udp \
  -p 5778:5778 \
  -p 16686:16686 \
  -p 14250:14250 \
  -p 14268:14268 \
  -p 14269:14269 \
  -p 9411:9411 \
  -p 4318:4318 jaegertracing/all-in-one:latest

When I generate a langchain invoke, I can see the LLM call per below.

[Screenshot: Jaeger UI showing the LLM call span]

I'm not sure what is expected to be shown. I didn't see any spans for LangChain itself, e.g. the Runnable FastAPI calls (invoke, stream, etc.). Should I have seen these, or should I only see the LLM call?

Thanks,
Damian.

nirga (Member, Author) commented May 20, 2024

@damianoneill yes, I think we should! Mind opening a bug for that so we can take care of it?
But to answer you now: you should be able to see chains and runnables. Let's discuss this over an issue / discussion or Slack and try to understand what's not working for you.
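For context, the trace being asked about would normally nest the LLM span under chain/runnable spans rather than showing the LLM call alone. A rough stdlib-only sketch of that expected hierarchy (purely illustrative: span names like `RunnableSequence.invoke` and `openai.chat` are placeholders, not the instrumentation's actual span names):

```python
from contextlib import contextmanager

# Collected (name, depth) pairs, standing in for spans exported to Jaeger.
spans = []
_depth = 0

@contextmanager
def span(name):
    """Record a pseudo-span together with its nesting depth."""
    global _depth
    spans.append((name, _depth))
    _depth += 1
    try:
        yield
    finally:
        _depth -= 1

# Expected shape: the chain span wraps the runnable steps,
# which in turn wrap the actual LLM call.
with span("RunnableSequence.invoke"):        # the chain itself
    with span("ChatPromptTemplate.invoke"):  # a runnable step
        pass
    with span("openai.chat"):                # the LLM call that was visible
        pass

for name, depth in spans:
    print("  " * depth + name)
```

Seeing only the innermost `openai.chat`-style span, with no enclosing chain or runnable spans, is the symptom worth filing as the bug.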
