fix(deno): Clear pre-existing OTel global before registering TracerProvider#19723

Open

sergical wants to merge 2 commits into `develop` from `fix/deno-otel-trace-disable`

Conversation


@sergical (Member) commented Mar 9, 2026

Summary

  • Calls `trace.disable()` before `trace.setGlobalTracerProvider()` in `@sentry/deno`'s OTel tracer setup
  • This fixes a silent registration failure when Supabase Edge Runtime (or Deno's native OTel) pre-registers a TracerProvider on the `@opentelemetry/api` global (`Symbol.for('opentelemetry.js.api.1')`)
  • Without this fix, AI SDK OTel spans (`gen_ai.*`) never reach Sentry because Sentry's TracerProvider is never actually set as the global

Context

Supabase Edge Runtime (Deno 2.1.4+) registers its own TracerProvider before user code runs. The OTel API's `setGlobalTracerProvider` is a no-op if a provider is already registered, so Sentry's tracer is silently ignored. Calling `trace.disable()` clears the global, allowing `setGlobalTracerProvider` to succeed.
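The registration semantics can be illustrated with a self-contained sketch. This is a simulation only: the names and shapes below are illustrative, and the real logic lives inside `@opentelemetry/api`, keyed by the same well-known symbol.

```typescript
// Simulates the "first registration wins" semantics of the OTel API global.
const OTEL_GLOBAL = Symbol.for('opentelemetry.js.api.1');

interface FakeProvider { name: string }
interface FakeApiGlobal { trace?: FakeProvider }

const g = globalThis as unknown as Record<symbol, FakeApiGlobal | undefined>;

function setGlobalTracerProvider(provider: FakeProvider): boolean {
  const api = (g[OTEL_GLOBAL] ??= {});
  if (api.trace) {
    return false; // no-op: another provider is already registered
  }
  api.trace = provider;
  return true;
}

function disable(): void {
  delete g[OTEL_GLOBAL]?.trace; // clear the registration
}

// Supabase Edge Runtime (or Deno's native OTel) registers first:
setGlobalTracerProvider({ name: 'deno-native' });

// Sentry's registration then silently fails...
const before = setGlobalTracerProvider({ name: 'sentry' }); // false

// ...unless the global is cleared first, as this PR does:
disable();
const after = setGlobalTracerProvider({ name: 'sentry' }); // true

console.log({ before, after });
```

This is why the failure is silent: `setGlobalTracerProvider` reports the conflict only via its return value, which callers rarely check.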

This matches the pattern already used in `cleanupOtel()` in the test file and is safe because:

  1. It only runs once during Sentry.init()
  2. Any pre-existing provider is immediately replaced by Sentry's
  3. The Cloudflare package was investigated and doesn't have the same issue

Test plan

  • Updated the `should override pre-existing OTel provider with Sentry provider` test — simulates a pre-existing provider and verifies Sentry overrides it
  • Updated the `should override native Deno OpenTelemetry when enabled` test — verifies Sentry captures spans even when `OTEL_DENO=true`
  • Verified manually with Supabase Edge Functions + AI SDK that gen_ai spans appear in Sentry

🤖 Generated with Claude Code

Closes #19724 (added automatically)

…ovider

Supabase Edge Runtime (and Deno's native OTel) pre-registers on the
`@opentelemetry/api` global, causing `trace.setGlobalTracerProvider()`
to silently fail. Call `trace.disable()` first so Sentry's provider
always wins.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

github-actions bot (Contributor) commented Mar 9, 2026

size-limit report 📦

⚠️ Warning: Base artifact is not the latest one, because the latest workflow run is not done yet. This may lead to incorrect results. Try to re-run all tests to get up to date results.

| Path | Size | % Change | Change |
| --- | --- | --- | --- |
| @sentry/browser | 25.64 kB | +0.05% | +12 B 🔺 |
| @sentry/browser - with treeshaking flags | 24.14 kB | +0.03% | +7 B 🔺 |
| @sentry/browser (incl. Tracing) | 42.44 kB | +0.02% | +8 B 🔺 |
| @sentry/browser (incl. Tracing, Profiling) | 47.1 kB | +0.02% | +8 B 🔺 |
| @sentry/browser (incl. Tracing, Replay) | 81.26 kB | +0.02% | +9 B 🔺 |
| @sentry/browser (incl. Tracing, Replay) - with treeshaking flags | 70.88 kB | +0.02% | +8 B 🔺 |
| @sentry/browser (incl. Tracing, Replay with Canvas) | 85.95 kB | +0.02% | +9 B 🔺 |
| @sentry/browser (incl. Tracing, Replay, Feedback) | 98.21 kB | +0.01% | +7 B 🔺 |
| @sentry/browser (incl. Feedback) | 42.44 kB | +0.02% | +7 B 🔺 |
| @sentry/browser (incl. sendFeedback) | 30.31 kB | +0.04% | +11 B 🔺 |
| @sentry/browser (incl. FeedbackAsync) | 35.36 kB | +0.04% | +11 B 🔺 |
| @sentry/browser (incl. Metrics) | 26.8 kB | +0.04% | +9 B 🔺 |
| @sentry/browser (incl. Logs) | 26.95 kB | +0.03% | +8 B 🔺 |
| @sentry/browser (incl. Metrics & Logs) | 27.62 kB | +0.04% | +9 B 🔺 |
| @sentry/react | 27.39 kB | +0.04% | +9 B 🔺 |
| @sentry/react (incl. Tracing) | 44.78 kB | +0.02% | +8 B 🔺 |
| @sentry/vue | 30.09 kB | +0.04% | +10 B 🔺 |
| @sentry/vue (incl. Tracing) | 44.31 kB | +0.03% | +9 B 🔺 |
| @sentry/svelte | 25.66 kB | +0.04% | +9 B 🔺 |
| CDN Bundle | 28.18 kB | +0.04% | +9 B 🔺 |
| CDN Bundle (incl. Tracing) | 43.27 kB | +0.03% | +11 B 🔺 |
| CDN Bundle (incl. Logs, Metrics) | 29.02 kB | +0.04% | +9 B 🔺 |
| CDN Bundle (incl. Tracing, Logs, Metrics) | 44.11 kB | +0.03% | +10 B 🔺 |
| CDN Bundle (incl. Replay, Logs, Metrics) | 68.1 kB | +0.02% | +8 B 🔺 |
| CDN Bundle (incl. Tracing, Replay) | 80.15 kB | +0.02% | +12 B 🔺 |
| CDN Bundle (incl. Tracing, Replay, Logs, Metrics) | 81.01 kB | +0.02% | +10 B 🔺 |
| CDN Bundle (incl. Tracing, Replay, Feedback) | 85.66 kB | +0.02% | +9 B 🔺 |
| CDN Bundle (incl. Tracing, Replay, Feedback, Logs, Metrics) | 86.54 kB | +0.02% | +10 B 🔺 |
| CDN Bundle - uncompressed | 82.38 kB | +0.04% | +26 B 🔺 |
| CDN Bundle (incl. Tracing) - uncompressed | 128.09 kB | +0.03% | +26 B 🔺 |
| CDN Bundle (incl. Logs, Metrics) - uncompressed | 85.21 kB | +0.04% | +26 B 🔺 |
| CDN Bundle (incl. Tracing, Logs, Metrics) - uncompressed | 130.93 kB | +0.02% | +26 B 🔺 |
| CDN Bundle (incl. Replay, Logs, Metrics) - uncompressed | 208.88 kB | +0.02% | +26 B 🔺 |
| CDN Bundle (incl. Tracing, Replay) - uncompressed | 244.98 kB | +0.02% | +26 B 🔺 |
| CDN Bundle (incl. Tracing, Replay, Logs, Metrics) - uncompressed | 247.8 kB | +0.02% | +26 B 🔺 |
| CDN Bundle (incl. Tracing, Replay, Feedback) - uncompressed | 257.89 kB | +0.02% | +26 B 🔺 |
| CDN Bundle (incl. Tracing, Replay, Feedback, Logs, Metrics) - uncompressed | 260.7 kB | +0.01% | +26 B 🔺 |
| @sentry/nextjs (client) | 47.19 kB | +0.02% | +9 B 🔺 |
| @sentry/sveltekit (client) | 42.9 kB | +0.02% | +8 B 🔺 |
| @sentry/node-core | 52.27 kB | +0.07% | +34 B 🔺 |
| @sentry/node | 174.74 kB | +0.03% | +37 B 🔺 |
| @sentry/node - without tracing | 97.44 kB | +0.06% | +53 B 🔺 |
| @sentry/aws-serverless | 113.24 kB | +0.05% | +49 B 🔺 |

View base workflow run

@Lms24 (Member) left a comment


> Without this fix, AI SDK OTel spans (gen_ai.*) never reach Sentry because the Sentry TracerProvider is never actually set as the global

H: Could you clarify this? I suspect that either no spans at all are sent or all of them should be sent. What makes gen_ai spans special here? This sounds to me like an agent partially analyzed a problem and didn't grasp the full scope of it. I don't think we can merge this until we know the consequences and the current state of tracing.

```ts
import { trace } from '@opentelemetry/api';

export function setupOpenTelemetryTracer(): void {
  // Clear any pre-existing OTel global registration (e.g. from Supabase Edge Runtime
  // or Deno's built-in OTel) so Sentry's TracerProvider gets registered successfully.
  trace.disable();
  // ... trace.setGlobalTracerProvider(...) follows, and now succeeds
}
```


m l: I'm wondering if this backfires for people using Sentry with a custom OTel setup or deliberately with Deno's native tracing (OTLP exporter). The good news is that we don't document this setup for Deno, so I think we can just ignore it for the moment and walk back on this change if anyone complains.

Update: I just saw that we gate this function call with skipOpenTelemetrySetup, so users can opt out of it. That's good. So I guess the worst consequence here is that anyone using native tracing with Sentry might need to set this flag now. Which we can classify as a fix because that's how we intended the SDK to work anyway. Downgraded from logaf M to L
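The opt-out mentioned above would look roughly like this for someone who deliberately runs Deno's native tracing or a custom OTel setup. This is a hedged sketch: `skipOpenTelemetrySetup` is the option named in the comment, while the rest of the init call is illustrative placeholder configuration.

```typescript
import * as Sentry from '@sentry/deno';

// Illustrative: opt out of Sentry's OTel tracer registration so a custom
// TracerProvider (or Deno's native one under OTEL_DENO=true) stays in place.
Sentry.init({
  dsn: '__YOUR_DSN__', // placeholder
  skipOpenTelemetrySetup: true,
});
```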



Comment, no direct action required: I'm fine with these unit tests for now, but to be clear, they don't prove that the fix works as intended in an actual app. Long-term I'd like us to add at least one e2e app for Deno (or an integration test setup like the one for Node) to verify this more reliably. This goes back to my main review comment: I don't think we've fully grasped the scope of the current behavior yet. If we had such a test, we could more reliably say that at least some spans are sent.



Development

Successfully merging this pull request may close these issues.

fix(deno): Clear pre-existing OTel global before registering TracerProvider
