How to use the Dynatrace API to get distributed trace info using a trace ID

How can I use the Dynatrace API to fetch distributed trace info by trace ID? (I need it to create a custom dashboard.) Any links or examples would help.
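As a starting point, Dynatrace REST calls authenticate with an `Authorization: Api-Token <token>` header against your environment's base URL. The trace endpoint path below (`/api/v2/traces/{traceId}`) is a placeholder assumption, not a confirmed Dynatrace route; check the API explorer in your own environment for the exact path. A minimal sketch:

```python
import urllib.request


def build_trace_request(environment, api_token, trace_id):
    """Build a GET request for one distributed trace by trace ID.

    NOTE: the /api/v2/traces/{trace_id} path is a hypothetical
    placeholder -- verify the real trace endpoint in your tenant's
    API explorer. Only the Api-Token auth header is standard.
    """
    url = f"https://{environment}.live.dynatrace.com/api/v2/traces/{trace_id}"
    return urllib.request.Request(
        url, headers={"Authorization": f"Api-Token {api_token}"}
    )


# Placeholder environment ID, token, and trace ID:
req = build_trace_request("abc12345", "dt0c01.SAMPLETOKEN", "14fe21f8c6e8")
# urllib.request.urlopen(req) would then return the trace JSON,
# which you can reshape into whatever your custom dashboard needs.
```

The same request shape (base URL + token header) works for any of the environment's REST endpoints, so the dashboard code only needs the one helper.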

Related

REST API for getting performance metrics for an HDInsight cluster?

I am looking for a REST API that will allow me to fetch performance metrics for a given HDInsight (Hadoop/Linux) cluster -- such as the amount or percentage of memory used by the cluster, CPU usage, etc. But I haven't come across anything specific. The closest link I have found is this, but it too doesn't have any reference to getting performance metrics. Is this info even exposed as a REST API?
According to my understanding, you want to get the metrics of the cluster. If so, you can use the following REST API (the Ambari API that HDInsight exposes) to get them. For more details, please refer to the document and article.
Method: GET
URL: https://<clustername>.azurehdinsight.net/api/v1/clusters/<cluster-name>?fields=metrics/load[<start time>,<end time>,<step>]
Headers: Authorization: Basic <base64 of username:password>
For example:
Get CPU usage
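The GET above can be issued from any HTTP client; the only subtlety is that the Basic auth value is the base64 encoding of `username:password`. A minimal sketch with placeholder cluster name, credentials, and epoch-second time bounds:

```python
import base64
import urllib.request


def build_metrics_request(cluster, username, password, start, end, step):
    """GET cluster load metrics from the Ambari REST API on HDInsight.

    start/end are epoch seconds and step is the sampling interval in
    seconds; cluster name and credentials here are placeholders.
    """
    url = (
        f"https://{cluster}.azurehdinsight.net/api/v1/clusters/{cluster}"
        f"?fields=metrics/load[{start},{end},{step}]"
    )
    # Basic auth: base64-encode "username:password".
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})


req = build_metrics_request("mycluster", "admin", "secret", 1600000000, 1600003600, 15)
# urllib.request.urlopen(req) would return the JSON metrics payload.
```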

What are the various metrics for Oracle Cloud Compute VM? What is the REST API for fetching all the metrics?

Is any API sample available with request and response? I have gone through the Oracle Cloud documentation; there are no samples, only the REST endpoints.
Examples of using the APIs for fetching metrics can be found here:
MonitoringMetricListExample.java
MonitoringMetricSummarizeExample.java
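Both of those Java examples call the OCI Monitoring service, whose SummarizeMetricsData operation takes a Monitoring Query Language (MQL) expression of the form `metric[interval].statistic()`. As a sketch (assuming, per the OCI docs, that compute-VM metrics such as `CpuUtilization` live in the `oci_computeagent` namespace), the query string can be composed like this:

```python
def build_mql_query(metric, interval, statistic):
    """Compose an MQL expression of the form metric[interval].statistic(),
    the query string a SummarizeMetricsData request carries."""
    return f"{metric}[{interval}].{statistic}()"


# Mean CPU utilization at 1-minute resolution; the request would then be
# sent via the SDK's summarize-metrics-data call with namespace
# "oci_computeagent" for compute VMs.
query = build_mql_query("CpuUtilization", "1m", "mean")
```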

logger messages in zipkin

I am new to Sleuth and Zipkin. I have logged some messages, and Sleuth is appending a trace ID and span ID to those messages. I am using Zipkin to visualize this, and I am able to see timings at different microservices. Can we see the logger messages (that we put at different microservices) in the Zipkin UI by trace ID?
No, you can't. You can use tools like Elasticsearch, Logstash, and Kibana (ELK) to visualize them. You can go to my repo https://github.com/marcingrzejszczak/docker-elk and run ./getReadyForConference.sh; it will start Docker containers with the ELK stack, run the apps, and curl requests to the apps so that you can then check them in ELK.
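The reason ELK can do this is that Sleuth appends a `[service,traceId,spanId,exportable]` context to every log line, which Logstash indexes so Kibana can search by trace ID. A minimal sketch of that correlation (with made-up sample lines), the same filtering Kibana performs:

```python
import re


def lines_for_trace(log_lines, trace_id):
    """Return only the log lines whose Sleuth context carries trace_id.

    Matches the [service,traceId,spanId,exportable] block that Sleuth
    appends to each line -- the field ELK indexes for trace searches.
    """
    pattern = re.compile(rf"\[[^,\]]+,{re.escape(trace_id)},")
    return [line for line in log_lines if pattern.search(line)]


# Hypothetical log lines from two services:
logs = [
    "INFO [svc-a,2485ec27856c56f4,2485ec27856c56f4,true] handling request",
    "INFO [svc-b,9a1b2c3d4e5f6a7b,9a1b2c3d4e5f6a7b,true] unrelated work",
]
matched = lines_for_trace(logs, "2485ec27856c56f4")
# matched holds only the svc-a line
```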

way to check spring cloud stream source and sink data content

Is there any way I can check what data is in a Spring Cloud Data Flow stream source (say some named destination ":mySource") and sink (say "log" as the sink)?
e.g. dataflow:>stream create --name demo --definition ":mySource>log"
What is in mySource and log, and how can I check?
Do I have to check the Spring Cloud Data Flow logs somewhere to get a clue, if it has logs at all? If so, what is the location of the logs in a Windows environment?
If you're interested in the payload content, you can deploy the stream with DEBUG logging for the Spring Integration package, which will print the header + payload information among many other interesting lifecycle details. The logs will show either the payload consumed or the payload produced, depending on the application type (i.e., source, processor, or sink).
In your case, you can view the payload consumed by the log-sink via:
dataflow:>stream create --name demo --definition ":mySource > log --logging.level.org.springframework.integration=DEBUG"
We have plans to add native provenance/lineage support with the help of Zipkin and Sleuth in the future releases.

Bluemix: App diagnostic info

I have an app that uses several Bluemix services, including a database service.
Is it possible to access all available operational info via a single dashboard, or do I need to inspect each component's logs separately?
Thanks in advance
Service log files are not available. You might consider looking at the Monitoring and Analytics service to see if it will meet your needs. Both a free plan and a diagnostics plan are available.