# Tracy
Tracy helps you trace, monitor, and evaluate AI-powered features directly from your Kotlin or Java projects.
It provides a unified API for capturing structured traces and is fully compatible with observability platforms like Langfuse and Weights & Biases (W&B).
> [!NOTE]
> This project uses the Tracy Official YouTrack Project for issue tracking. Please file bug reports and feature requests there. Additional details are available in the Contributing Guidelines.
**Standards:** This library implements the OpenTelemetry Generative AI Semantic Conventions for span attributes and event naming, ensuring your traces remain compatible with any OpenTelemetry-compliant backend.
You can use it to:
- Trace AI clients to capture messages, cost, token usage, and execution time. See Client Auto Tracing
- Trace any function to record inputs, outputs, and execution duration. See Annotation-Based Tracing
- Create and manage spans manually. See Manual Tracing
- Export traces to supported backends. See Supported Tracing Backends
## Installation

Select the build system that matches your setup:

### Gradle (Kotlin DSL)

Add dependencies to the `build.gradle.kts` file:

```kotlin
plugins {
    id("org.jetbrains.ai.tracy") version "0.1.0"
}

dependencies {
    // Core module with shared functionality
    implementation("org.jetbrains.ai.tracy:tracy-core:0.1.0")

    // Client-specific auto tracing (select the ones you need)
    implementation("org.jetbrains.ai.tracy:tracy-anthropic:0.1.0")
    implementation("org.jetbrains.ai.tracy:tracy-gemini:0.1.0")
    implementation("org.jetbrains.ai.tracy:tracy-ktor:0.1.0")
    implementation("org.jetbrains.ai.tracy:tracy-openai:0.1.0")
}
```

Make sure that you have `mavenCentral()` in both your project repositories and plugin management sections.

`build.gradle.kts`:

```kotlin
repositories {
    mavenCentral()
}
```

`settings.gradle.kts`:

```kotlin
pluginManagement {
    repositories {
        mavenCentral()
    }
}
```
### Gradle (Groovy)

Add dependencies to the `build.gradle` file:

```groovy
plugins {
    id 'org.jetbrains.ai.tracy' version '0.1.0'
}

dependencies {
    implementation 'org.jetbrains.ai.tracy:tracy-core:0.1.0'

    // Client-specific auto tracing
    implementation 'org.jetbrains.ai.tracy:tracy-anthropic:0.1.0'
    implementation 'org.jetbrains.ai.tracy:tracy-gemini:0.1.0'
    implementation 'org.jetbrains.ai.tracy:tracy-ktor:0.1.0'
    implementation 'org.jetbrains.ai.tracy:tracy-openai:0.1.0'
}
```

Make sure that you have `mavenCentral()` in both your project repositories and plugin management sections.

`build.gradle`:

```groovy
repositories {
    mavenCentral()
}
```

`settings.gradle`:

```groovy
pluginManagement {
    repositories {
        mavenCentral()
    }
}
```
### Maven

Add dependencies to the `pom.xml` file:
```xml
<project>
    <!-- ... other project config (groupId, artifactId, version, etc.) ... -->
    <build>
        <plugins>
            <plugin>
                <groupId>org.jetbrains.kotlin</groupId>
                <artifactId>kotlin-maven-plugin</artifactId>
                <version>${kotlin.version}</version>
                <configuration>
                    <jvmTarget>19</jvmTarget>
                </configuration>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>
                <dependencies>
                    <dependency>
                        <groupId>org.jetbrains.ai.tracy</groupId>
                        <!-- Use the Kotlin version matching your setup (e.g., 2.0.0 or 2.0.20).
                             Keep the same major and minor numbers but round the patch version
                             down to the nearest supported value - either 0 or 20. Examples:
                             - 2.1.19 -> 2.1.0
                             - 2.1.23 -> 2.1.20 -->
                        <artifactId>tracy-compiler-plugin-2.1.0-jvm</artifactId>
                        <version>0.1.0</version>
                    </dependency>
                </dependencies>
            </plugin>
        </plugins>
    </build>
    <dependencies>
        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-core-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>
        <!-- Client-specific auto tracing -->
        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-anthropic-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-gemini-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-ktor-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-openai-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>
    </dependencies>
</project>
```
## Requirements

- Compatible with Kotlin 2.0.0+ and Java 17+.
- If you already have OpenTelemetry set up in your project, versions 1.2+ are supported.
- Supports integration with:
  - OpenAI SDK `1.*`–`4.*`
  - Anthropic SDK `1.*`–`2.*`
  - Gemini SDK `1.8.*`–`1.38.*` (earlier versions are unsupported)
## Usage

> [!TIP]
> See the `examples` directory for complete and runnable examples!
Tracy provides tracing integrations for multiple clients: not only OpenAI, but also Gemini, Anthropic, Ktor, and OkHttp.
All clients can be instrumented in a similar way using the `instrument(...)` function.
### Client Auto Tracing

#### Tracing Sensitive Content

According to the OTel semantic conventions, capturing sensitive data (e.g., user messages or assistant replies) should be disabled by default (read more here). Therefore, by default Tracy disguises sensitive content by replacing it with the placeholder `"REDACTED"`, preserving the trace structure and recorded attributes.
You can enable capturing of inputs and outputs independently in either of the following ways:

- Set the environment variables:

  ```shell
  TRACY_CAPTURE_INPUT=true|false
  TRACY_CAPTURE_OUTPUT=true|false
  ```

- Override the defaults programmatically in code:

  ```kotlin
  // Call a helper method:
  TracingManager.traceSensitiveContent()

  // Or equivalently:
  TracingManager.withCapturingPolicy(
      ContentCapturePolicy(
          captureInputs = true,
          captureOutputs = true,
      )
  )
  ```
#### Tracing Examples
Below is a minimal OpenAI example. For others, check the examples directory:
- Anthropic Client Auto Tracing Example
- Ktor Client Auto Tracing Example
- OkHttp Auto Tracing Example
- Gemini Client Auto Tracing Example
```kotlin
// Enable tracing via the `IS_TRACY_ENABLED` environment variable
// or programmatically, as shown below:
TracingManager.isTracingEnabled = true

// Initialize tracing and export spans to the console
val sdk: OpenTelemetrySdk = configureOpenTelemetrySdk(
    exporterConfig = ConsoleExporterConfig()
)
TracingManager.setSdk(sdk)

// Permit tracing of sensitive content in requests AND responses
TracingManager.traceSensitiveContent()

// Create an OpenAI client and instrument it with tracing capabilities
val instrumentedClient: OpenAIClient = OpenAIOkHttpClient.builder()
    .baseUrl(url)
    .apiKey(apiKey)
    .timeout(timeout)
    .build()
    .apply { instrument(this) }

// Make a request and receive a response
val request = ChatCompletionCreateParams.builder()
    .addUserMessage("Generate a polite greeting and introduce yourself.")
    .model(ChatModel.GPT_4O_MINI)
    .temperature(0.0)
    .build()
val response = instrumentedClient.chat().completions().create(request)
println("OpenAI response: ${response.choices().first().message().content().get()}")

// Traces are automatically flushed based on ExporterCommonSettings:
// - Periodically via flushIntervalMs and flushThreshold
// - On JVM shutdown if flushOnShutdown = true (default)
// For manual control, use TracingManager.flushTraces()
```
This example uses simple console tracing for demonstration. For information about other tracing backends, see Tracing Backends.
Full example: OpenAI Client Auto Tracing Example
### Tracing Configuration

**Runtime toggle:** Spans are emitted only if both conditions are met:

- A tracing SDK instance has been installed via `TracingManager.setSdk(...)`.
- Tracing is enabled through `TracingManager.isTracingEnabled`, which defaults to the value of the `IS_TRACY_ENABLED` environment variable. If the variable is not set, tracing is disabled by default. You can also change this value programmatically at any time.
**Compile-time plugin toggle:**

- Annotation-based tracing requires the Kotlin compiler plugin, which you can enable or disable at compile time via the project property `enableTracyPlugin`. The property itself is optional: if you don't define it, the plugin is enabled by default. You can turn it off whenever needed by setting `enableTracyPlugin=false` in `gradle.properties` or by passing `-PenableTracyPlugin=false` on the command line.
**Configuration:**

- Provide the required backend settings to enable trace collection, either via environment variables or constructor parameters. See Tracing Backends.

**Behavior:**

- SDK missing: a no-op tracer is returned; minimal overhead, no errors.
- SDK set after instrumentation: instrumented clients start emitting spans immediately.
- Runtime toggle: tracing can be dynamically enabled or disabled via `TracingManager.isTracingEnabled`.
### Annotation-Based Tracing

You can trace regular functions (not only client calls) using the `@Trace` annotation.
Make sure to apply the `org.jetbrains.ai.tracy` plugin in your build.
The Kotlin compiler plugin automatically instruments annotated functions, capturing execution details such as start and end time, duration, inputs, and outputs.

⚠️ Annotation-based tracing is supported only in Kotlin. For Java, use Manual Tracing instead.
#### Quick Start

```kotlin
@Trace(name = "GreetUser")
fun greetUser(name: String): String {
    println("Hello, $name!")
    return "Greeting sent to $name"
}

fun main() {
    // Enable tracing via the `IS_TRACY_ENABLED` environment variable
    // or programmatically, as shown below:
    TracingManager.isTracingEnabled = true
    TracingManager.setSdk(configureOpenTelemetrySdk(ConsoleExporterConfig()))

    greetUser("Alice")

    // Traces are automatically flushed based on ExporterCommonSettings:
    // - Periodically via flushIntervalMs and flushThreshold
    // - On JVM shutdown if flushOnShutdown = true (default)
    // For manual control, use TracingManager.flushTraces()
}
```
#### Hierarchical Tracing

Nested calls between annotated functions are traced automatically.
When one traced function invokes another, a hierarchical trace structure is created: the outer call is recorded as a parent span and the inner call as its child span.
This structure provides a clear visual representation of call relationships and execution flow.
See the NestedSpansExample.kt for a
demonstration of hierarchical tracing.
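For illustration, a minimal sketch of two nested annotated functions (the function names `handleOrder` and `chargeCard` are hypothetical, not from the library):

```kotlin
@Trace(name = "HandleOrder")
fun handleOrder(orderId: String): String {
    // This nested call is recorded as a child span of the "HandleOrder" span
    val receipt = chargeCard(orderId, amountCents = 1999)
    return "Order $orderId completed: $receipt"
}

@Trace(name = "ChargeCard")
fun chargeCard(orderId: String, amountCents: Int): String =
    "charged $amountCents cents for $orderId" // placeholder payment logic
```

In a backend such as Langfuse, the resulting trace would show `ChargeCard` nested under `HandleOrder`.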
#### Tracing in Inherited Classes and Interfaces

The tracing annotation is automatically propagated through interfaces and class hierarchies.
If a method in an interface or superclass is annotated with `@Trace`, all overriding or implementing methods inherit the tracing behavior automatically, even if the annotation is not explicitly declared again.
This approach ensures consistent and reusable tracing across an entire inheritance chain without code duplication.
Refer to TracingPropagationExample.kt for a complete example.
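As a sketch (the interface and class names here are hypothetical), annotating only the interface method is enough:

```kotlin
interface Summarizer {
    @Trace(name = "Summarize")
    fun summarize(text: String): String
}

// No @Trace here: the annotation is inherited from the interface method,
// so this override is traced as well
class FirstSentenceSummarizer : Summarizer {
    override fun summarize(text: String): String =
        text.substringBefore(". ")
}
```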
#### Customizing Tracing Behavior

The tracing system offers flexible customization through the SpanMetadataCustomizer interface. Implement it as a Kotlin object (classes are not supported) and reference it in the `@Trace` annotation.

With a custom `SpanMetadataCustomizer`, you can define:

- How input parameters are serialized.
- How output results are serialized.
- How span names are dynamically generated based on runtime data.

This provides fine-grained control over how tracing information is represented.
See an example implementation in MetadataCustomizerExample.kt.
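To sketch the idea only: the member names below (`serializeInput`, `serializeOutput`) are hypothetical, so consult `SpanMetadataCustomizer` and MetadataCustomizerExample.kt for the actual interface.

```kotlin
// Hypothetical member names; the real SpanMetadataCustomizer interface may differ.
object PiiSafeCustomizer /* : SpanMetadataCustomizer */ {
    // Mask email-like substrings before inputs are attached to the span
    fun serializeInput(value: Any?): String =
        value.toString().replace(Regex("""\S+@\S+"""), "<email>")

    // Truncate long outputs to keep span attribute values small
    fun serializeOutput(value: Any?): String =
        value.toString().take(200)
}
```

It would then be referenced from the annotation, e.g. `@Trace(customizer = PiiSafeCustomizer::class)` (the parameter name is likewise an assumption).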
#### Adding Custom Tags

You can enrich your traces with contextual metadata by adding custom tags.
Tags help categorize and filter traces based on your business logic, for example by user type, feature, or environment.
Use the `addLangfuseTagsToCurrentTrace` function to attach tags dynamically within any traced function.
These tags appear in Langfuse and other tracing tools, making it easier to group and analyze trace data.
An example demonstrating how to provide custom tags can be found in LangfuseTagExample.kt.
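For instance (a sketch; the exact signature of `addLangfuseTagsToCurrentTrace` is shown in LangfuseTagExample.kt, and here it is assumed to accept a list of strings):

```kotlin
@Trace(name = "RecommendProducts")
fun recommendProducts(userId: String, isPremium: Boolean): List<String> {
    // Assumed call shape; see LangfuseTagExample.kt for the real signature
    addLangfuseTagsToCurrentTrace(
        listOf(
            if (isPremium) "tier:premium" else "tier:free",
            "feature:recommendations",
        )
    )
    return listOf("item-1", "item-2") // placeholder recommendation logic
}
```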
### Manual Tracing
In addition to annotation-based tracing, you can manually create and manage spans wherever you need them in your code.
This approach gives you full control over what is traced and what metadata is recorded.
Use the withSpan function to define a traceable block of code.
It automatically creates and closes a span around the specified code section, allowing you to attach structured metadata
such as inputs, outputs, or custom attributes.
Manual tracing is especially useful in Java projects where annotation-based tracing is not available.
An example demonstrating manual tracing can be found in
ManualTracingExample.kt.
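A minimal sketch of the pattern, assuming `withSpan` takes a span name and a block that receives the span (see ManualTracingExample.kt for the exact API):

```kotlin
fun summarizeDocument(doc: String): String =
    withSpan("SummarizeDocument") { span ->
        // Attach structured metadata to the span (the attribute keys are illustrative)
        span.setAttribute("doc.length", doc.length.toLong())
        val summary = doc.take(80) // placeholder summarization logic
        span.setAttribute("summary.length", summary.length.toLong())
        summary
    }
```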
### Tracing Backends

Tracy provides integrations with multiple tracing backends, allowing you to choose the one that best fits your workflow. Currently supported backends include Langfuse, Weave (W&B), console, file, and OTLP collectors.

Create an SDK by providing the desired configuration:

```kotlin
// Langfuse
val sdk = configureOpenTelemetrySdk(LangfuseExporterConfig())

// Weave
val sdk = configureOpenTelemetrySdk(WeaveExporterConfig())

// Console-only
val sdk = configureOpenTelemetrySdk(ConsoleExporterConfig())

// File (plain-text and JSON formats supported)
val sdk = configureOpenTelemetrySdk(
    FileExporterConfig(
        filepath = myFile.absolutePathString(),
        append = true,
        format = OutputFormat.JSON, // default is `OutputFormat.PLAIN_TEXT`
    )
)
```
Once the SDK is configured, initialize tracing with TracingManager.setSdk(sdk).
#### Langfuse Configuration

| Property | Environment Variable | Required | Default Value |
|---|---|---|---|
| `langfuseUrl` | `LANGFUSE_URL` | false | `https://cloud.langfuse.com` |
| `langfusePublicKey` | `LANGFUSE_PUBLIC_KEY` | true | - |
| `langfuseSecretKey` | `LANGFUSE_SECRET_KEY` | true | - |
| `traceToConsole` | - | false | `false` |
| `exporterTimeoutSeconds` | - | false | `10` |
| `maxNumberOfSpanAttributes` | `MAX_NUMBER_OF_SPAN_ATTRIBUTES` | false | `256` |
| `maxSpanAttributeValueLength` | `MAX_SPAN_ATTRIBUTE_VALUE_LENGTH` | false | `Int.MAX_VALUE` |
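For example, the same settings can be supplied as constructor parameters (a sketch; the parameter names follow the property column of the table, and only the two keys are required since the URL has a default):

```kotlin
val sdk = configureOpenTelemetrySdk(
    LangfuseExporterConfig(
        langfusePublicKey = System.getenv("LANGFUSE_PUBLIC_KEY"),
        langfuseSecretKey = System.getenv("LANGFUSE_SECRET_KEY"),
        traceToConsole = true, // additionally mirror spans to the console
    )
)
TracingManager.setSdk(sdk)
```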
#### Weave Configuration

| Property | Environment Variable | Required | Default Value |
|---|---|---|---|
| `weaveUrl` | `WEAVE_URL` | false | `https://trace.wandb.ai` |
| `weaveEntity` | `WEAVE_ENTITY` | true | - |
| `weaveProjectName` | `WEAVE_PROJECT_NAME` | true | - |
| `weaveApiKey` | `WEAVE_API_KEY` | true | - |
| `traceToConsole` | - | false | `false` |
| `exporterTimeoutSeconds` | - | false | `10` |
| `maxNumberOfSpanAttributes` | `MAX_NUMBER_OF_SPAN_ATTRIBUTES` | false | `256` |
| `maxSpanAttributeValueLength` | `MAX_SPAN_ATTRIBUTE_VALUE_LENGTH` | false | `Int.MAX_VALUE` |
#### Console Configuration

Configuration for exporting OpenTelemetry traces to the console only. See the Console Setup Example.
#### File Configuration

Configuration for exporting OpenTelemetry traces to a file in either JSON or plain-text format. See the File Setup Example.
#### OTLP Configuration

Both HTTP and gRPC configurations are available. Spans can be exported to any OTLP-compatible HTTP or gRPC collector (for example, Jaeger). See the Jaeger Setup Example.
## Project Structure

- `plugin` — contains the Kotlin compiler plugins for annotation-based tracing. It includes multiple Kotlin Compiler Plugin (KCP) implementations for different Kotlin versions. The `gradle-tracy-plugin` module automatically selects the appropriate KCP version for your Kotlin compiler and applies it to your Gradle project.
- `publishing` — provides a Gradle plugin used for modules that need to be published. It defines the `tracy` version and includes Kotlin DSL logic for publishing artifacts to Space Maven repositories.
- `examples` — contains runnable examples demonstrating how to use various `tracy` features and capabilities.
- `core` — the foundational module providing tracing abstractions, annotations, OpenTelemetry integration, manual tracing, and the tracing integration for the `OkHttp` client.
- `openai` — tracing integration for the `OpenAI` client.
- `gemini` — tracing integration for the `Gemini` client.
- `anthropic` — tracing integration for the `Anthropic` client.
- `ktor` — tracing integration for the `Ktor` HTTP client.
- `test-utils` — shared utilities for testing tracing functionality across modules.
## Limitations

### Span Context Propagation

There are a few known limitations of the library in terms of context propagation. To see complete examples of the recommended usage patterns, refer to the ContextPropagationExample.kt file.
#### Kotlin Coroutines

Context propagation works automatically in structured coroutines (e.g., `withContext`, `launch`).
However, some concurrency models, such as `runBlocking` and raw threads, create new execution boundaries and require manual propagation of the OpenTelemetry context.

`runBlocking` inside suspend functions: use `currentSpanContextElement(...)` to ensure child spans remain linked to their parent. Otherwise, spans become detached and appear as separate traces.

```kotlin
@Trace
suspend fun handleRequestInCoroutine(requestId: String) {
    println("Running on thread: ${Thread.currentThread().name}")
    // runBlocking without context propagation would detach the trace
    runBlocking(currentSpanContextElement(currentCoroutineContext())) {
        println("Running inside nested coroutine on thread: ${Thread.currentThread().name}")
        processUserRequest(requestId)
    }
}
```
#### Multi-Threading

Custom threads (via `thread { ... }`) do NOT inherit the OpenTelemetry context automatically.
Capture and propagate it manually:

```kotlin
val context = currentSpanContext(currentCoroutineContext())
thread {
    context.makeCurrent().use {
        processUserRequest("REQ-303")
    }
}
```
#### Local Functions
Avoid using @Trace on local (nested) functions.
This is a known Kotlin limitation: references to local functions do NOT implement the KCallable
interface correctly. For more details, see the related issue: KT-64873.
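A sketch of the workaround: move the function out of the enclosing scope instead of annotating a local one (`normalizeItem` and `processBatch` are hypothetical names):

```kotlin
fun processBatch(items: List<String>): List<String> {
    // Avoid: declaring a local `fun normalize(...)` here and annotating it
    // with @Trace is not supported (see KT-64873)
    return items.map(::normalizeItem)
}

// Prefer: a top-level (or member) function, which can be traced normally
@Trace(name = "NormalizeItem")
fun normalizeItem(s: String): String = s.trim().lowercase()
```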
## Publishing

This project uses Gradle composite builds, so running plain `publish` or `publishToMavenLocal` is NOT enough: included builds are not published automatically.
Use this task instead to publish to Maven Local:

```shell
./gradlew publishAllToMavenLocal
```

This task publishes the JARs of all modules, all versions of the Kotlin compiler plugin, and the Gradle plugin.
## Versioning

Tracy follows semantic versioning: `<major>.<minor>.<patch>`

- `<major>`: stays 0 until the library exits beta.
- `<minor>`: increases for public releases (first public release: 0.1.0).
- `<patch>`: used for internal development iterations and fixes.
## Contributing
Read the Contributing Guidelines.
## Code of Conduct
This project and the corresponding community are governed by the JetBrains Open Source and Community Code of Conduct. Please make sure you read it.
## License
Tracy is licensed under the Apache 2.0 License.
