We recently announced in our blog post "The OpenTelemetry Tracing Specification Reaches 1.0.0!" that the OpenTelemetry tracing specification has reached v1.0.0, offering long-term stability guarantees for the tracing portion of the OpenTelemetry clients. Today we're excited to share that the first of the language-specific APIs and SDKs have reached v1.0.0, starting with OpenTelemetry Java and OpenTelemetry .NET.
This is a major milestone on the path to finally merging OpenTracing and OpenCensus into one unified, standard tracing API. We'd like to congratulate the ~160 community contributors who helped get the project to this point.
You can get started with OpenTelemetry .NET in five minutes.
First, download and install the .NET Core SDK on your computer.
Create a new console application and run it:
dotnet new console --output getting-started
cd getting-started
dotnet run
You should see the following output:
Hello World!
Install the OpenTelemetry.Exporter.Console package:
dotnet add package OpenTelemetry.Exporter.Console
Next, update the Program.cs file with the getting-started example code.
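If you don't have the linked sample handy, here is a minimal sketch of that Program.cs, following the OpenTelemetry .NET getting-started example (the source name MyCompany.MyProduct.MyLibrary is just an illustrative placeholder):

using System.Diagnostics;
using OpenTelemetry;
using OpenTelemetry.Trace;

public class Program
{
    // "MyCompany.MyProduct.MyLibrary" is an illustrative source name, not required by the SDK.
    private static readonly ActivitySource MyActivitySource = new ActivitySource(
        "MyCompany.MyProduct.MyLibrary");

    public static void Main()
    {
        // Configure the SDK to listen to the ActivitySource above and write spans to the console.
        using var tracerProvider = Sdk.CreateTracerProviderBuilder()
            .AddSource("MyCompany.MyProduct.MyLibrary")
            .AddConsoleExporter()
            .Build();

        // Start the "SayHello" activity and set the tags shown in the output below.
        using (var activity = MyActivitySource.StartActivity("SayHello"))
        {
            activity?.SetTag("foo", 1);
            activity?.SetTag("bar", "Hello, World!");
            activity?.SetTag("baz", new int[] { 1, 2, 3 });
        }
    }
}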
Run the application again (using dotnet run) and you should see the trace output in the console:
Activity.Id: 00-8389584945550f40820b96ce1ceb9299-745239d26e408342-01
Activity.DisplayName: SayHello
Activity.Kind: Internal
Activity.StartTime: 2020-08-12T15:59:10.4461835Z
Activity.Duration: 00:00:00.0066039
Activity.TagObjects:
foo: 1
bar: Hello, World!
baz: [1, 2, 3]
Resource associated with Activity:
service.name: unknown_service:getting-started
Congratulations! You are now capturing traces using OpenTelemetry and displaying them in the console. You can find more information and examples on GitHub.
You can find more information in the OpenTelemetry .NET community announcement.
OpenTelemetry Java v1.0.0 was also launched recently.
First, build the autoconfigure example application:
../gradlew shadowJar
Then start the example application with the logging exporter configured:
java -Dotel.traces.exporter=logging \
-cp build/libs/opentelemetry-examples-autoconfigure-0.1.0-SNAPSHOT-all.jar \
io.opentelemetry.example.autoconfigure.AutoConfigExample
Alternatively, instead of system properties you can use environment variables:
export OTEL_TRACES_EXPORTER=logging
java -cp build/libs/opentelemetry-examples-autoconfigure-0.1.0-SNAPSHOT-all.jar \
io.opentelemetry.example.autoconfigure.AutoConfigExample
Full documentation of all supported properties can be found in the OpenTelemetry SDK Autoconfigure README.
After running the app you should see the trace printed out in the console:
...
INFO: 'important work' : ca3938a5793f6f9aba5c757f536a50cb b5e826c981112198 INTERNAL [tracer: io.opentelemetry.example.autoconfigure.AutoConfigExample:] AttributesMap{data={foo=42, bar=a string!}, capacity=128, totalAddedValues=2}
...
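For reference, the span in that log line comes from code along these lines. This is a sketch inferred from the span name, attributes, and tracer name in the output above; the upstream AutoConfigExample may structure things differently, but the key point is that the autoconfigure extension reads the otel.* properties or OTEL_* environment variables and wires up the SDK behind GlobalOpenTelemetry.

package io.opentelemetry.example.autoconfigure;

import io.opentelemetry.api.GlobalOpenTelemetry;
import io.opentelemetry.api.trace.Span;
import io.opentelemetry.api.trace.Tracer;

public final class AutoConfigExample {
  public static void main(String[] args) {
    // The autoconfigure extension has already configured the SDK from
    // otel.* system properties or OTEL_* environment variables.
    Tracer tracer =
        GlobalOpenTelemetry.getTracer("io.opentelemetry.example.autoconfigure.AutoConfigExample");

    // "important work" and the foo/bar attributes match the log output above.
    Span span = tracer.spanBuilder("important work").startSpan();
    try {
      span.setAttribute("foo", 42);
      span.setAttribute("bar", "a string!");
      // ... do the work being traced ...
    } finally {
      span.end();
    }
  }
}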
Congratulations! You are now collecting traces using OpenTelemetry.
OpenTelemetry Python and OpenTelemetry Erlang v1.0.0 will be GA soon. Check back here or the OpenTelemetry blog for updates.
Whether you build your own OpenTelemetry SDKs or Collectors or use our pre-packaged and supported distribution, you can have confidence that the 1.0 releases contain stable tracing APIs and tracing-related functionality, and that even more integrations are on the way. You can use the .NET and Java SDKs today with Splunk APM by sending traces to an OpenTelemetry Collector that then exports them to Splunk.
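As a sketch of that setup, a minimal Collector configuration might look like the following. This assumes the sapm exporter that ships with the Collector contrib and Splunk distributions; the access token and realm are placeholders you would replace with your own values.

receivers:
  otlp:
    protocols:
      grpc:

exporters:
  sapm:
    # Placeholders: use your own Splunk access token and realm.
    access_token: "${SPLUNK_ACCESS_TOKEN}"
    endpoint: "https://ingest.<realm>.signalfx.com/v2/trace"

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [sapm]

Point the .NET and Java SDKs' OTLP exporters at the Collector's OTLP endpoint (gRPC port 4317 by default) and the traces will flow through to Splunk APM.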
You can read more about this exciting announcement and what’s next with the project in the blog below.
Splunk is a contributor to OpenTelemetry and committed to accelerating the adoption of the project. For more information about Splunk and OpenTelemetry, you can check out our latest OpenTelemetry blog posts.