Integrate New Relic in a Flink Scala project

I want to integrate New Relic into my Flink project. I have downloaded the newrelic.yml file from my account and changed only the app name, and I have created a folder named newrelic in my project root folder and placed the newrelic.yml file in it.
I have also added the following dependency to my build.sbt file:
"com.newrelic.agent.java" % "newrelic-api" % "3.0.0"
I am using the following command to run my jar:
flink run -m yarn-cluster -yn 2 -c Main /home/hadoop/test-assembly-0.2.jar
I suspect my code is not able to read the newrelic.yml file, because I can't see my app name in New Relic. Do I need to initialize the New Relic agent somewhere (if yes, how)? Please help me with this integration.

You should only need the newrelic.jar and newrelic.yml files to be accessible and have -javaagent:path/to/newrelic.jar passed to the JVM as an argument. You could try putting both newrelic.jar and newrelic.yml into your lib/ directory so they get copied to the job & task managers, then adding this to your conf/flink-conf.yaml:
env.java.opts: -javaagent:lib/newrelic.jar
Both New Relic files should be in the same directory and you ought to be able to remove the New Relic line from your build.sbt file. Also double check that your license key is in the newrelic.yml file.
I haven't tested this, but the main goal is for the .yml and .jar to be accessible in the same directory (the .yml can go in a different directory, but additional JVM arguments will need to be passed to reference it) and for -javaagent:path/to/newrelic.jar to be passed as a JVM argument. If you run into issues, try checking for New Relic logs in the logs folder of the directory where the .jar is located.
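If you do keep the newrelic-api dependency for custom instrumentation, the API calls become no-ops when the agent is not attached via -javaagent, so they are safe to leave in. A minimal Scala sketch (the metric names here are made up):
import com.newrelic.api.agent.NewRelic

// Safe to call even without the agent attached: these become no-ops.
def recordProcessed(batchSize: Int): Unit = {
  NewRelic.incrementCounter("Custom/EventsProcessed")          // hypothetical counter name
  NewRelic.recordMetric("Custom/BatchSize", batchSize.toFloat) // hypothetical metric name
}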

Related

How to run `forest schema:update` outside project directory?

I'm trying to use the forest-cli schema:update command, but when I do, I keep getting the error:
× We are not able to detect a Forest CLI project file architecture at this path: /PATH/TO/REPO/ROOT.: Error: No "routes" directory.
There is a routes directory, but within src/ below the repo root. I have tried running forest schema:update from inside there, but I get the exact same error. The command only has options for a config file and an output directory.
Googling has turned up nothing, and there's no obvious hint in Forest Admin's documentation. Thanks in advance for any assistance!
According to the forest-cli code available here, the forest schema:update command requires the package.json file to be directly accessible in order to run (i.e., in the same folder where you run the command), so it can check that the version of the agent you are running is indeed compatible with schema:update.
You can also use the -c/--config option to point to another location for your config/database.js, and the -o/--outputDirectory option to output the result to a new location.
In your case, I would say that forest schema:update -c src/config/database.config.js -o tmp should allow you to generate the files in the tmp directory (be aware that this directory must not already exist).
This command should be run where your package.json is located.
However, I don't think you will be able to export files directly to the right location when using a custom folder structure.
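Putting that together, and assuming your package.json sits at the repo root, the invocation might look like this (the config path is the guess from above, and tmp must not already exist):
cd /PATH/TO/REPO/ROOT
forest schema:update -c src/config/database.config.js -o tmp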

How do I change values in a .properties file and run a build using it in Jenkins?

I have a basic job that runs a .bat file to do an export of an application from a file server somewhere. It uses a .properties file in standard format to get the login details, server location, and application name/version etc.
I've made it work from the command line by hard-coding the values in the .properties file and running it. The export works and saves to the directory I specify.
I moved over to Jenkins and it also works using the hard-coded .properties file.
What I want to do now is set the values in the .properties file from inside Jenkins, so it can be updated without having to manually open the .properties file, and then run the same .bat file.
If someone could provide an example of setting just one value in a .properties file through Jenkins, I feel I can do the rest.
You can try the EnvInject Plugin for Jenkins, which allows you to inject environment variables into the build process, and modify your .bat file to use them instead of reading values from the properties file, as sketched after the feature list below.
Here are some of the plugin's use cases/features:
To remove inherited environment variables (PATH, ANT_HOME, ...) at node level (master/slave), available by default for a job run.
To inject variables in the first step of the job (before the SCM checkout)
To inject variables based on user parameter values
To execute an initialization script before a SCM checkout.
To execute an initialization script after a SCM checkout
To inject variables as a build step obtained from a file filled in by a previous build step
To know environment variables used for a build
To inject build cause as environment variable for a build
To inject environment variables from the evaluation of a Groovy script (powered by Script Security Plugin)
To export environment variables as a metadata in your binary repository
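For example, with the option to inject environment variables to the build process enabled in the job configuration, the Properties Content field might contain (names and values here are made up):
EXPORT_SERVER=fileserver.example.com
EXPORT_USER=builduser
APP_VERSION=1.2.3
Your .bat step can then reference %EXPORT_SERVER%, %EXPORT_USER%, and %APP_VERSION% directly instead of reading them from the .properties file.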

sbt run - how to specify working directory? [duplicate]

I would like to be able to run a Java program in a specific directory. I think it is quite convenient to parametrize the working directory, because it makes it easy to manage configurations.
For example, in one folder you could have the configuration for tests, and in another the resources needed for production. You probably think there is the option to manipulate the classpath to include/exclude resources, but such a solution works only if you are interested in resources stored on the classpath and referenced using Classloader.getResource(r). But what if you have some external configuration and you want to access it using a simple instruction like File file = new File("app.properties");?
Let's look at an ordinary example.
Your application uses an app.properties file, where you store credentials for an external service. The application looks for this file in the working directory, because you use the aforementioned File file = new File("app.properties"); instruction to access it. In your tests you want to use an app.properties specific to your tests. In your integration tests you want to use an app.properties specific to another environment. And finally, when you build and release the application, you want to provide yet another app.properties file. You want to access all of these resources in the same way, just by typing File file = new File("app.properties");, instead of something like:
File file;
if (configTest)
    file = new File("testWorkDir/app.properties");
else if (config2)
    file = new File("config2WorkDir/app.properties");
else
    file = new File("app.properties");
or instead of using resources on the classpath:
this.getClass.getClassLoader.getResource("app.properties");
Of course you are a clever programmer and you use a build tool such as Maven, Gradle, or sbt :)
Enough of the preamble. At last, the question:
Is there a way to set the working directory in Java, and if yes, how can it be configured in build tools (especially in sbt)?
Additional info:
Changing the 'user.dir' system property does not work (I've tried changing it programmatically).
In sbt, changing the 'working directory' via the baseDirectory setting for test changes baseDirectory, which is not the base dir in my understanding, and it is not equal to new java.io.File(".").getAbsolutePath.
Providing an environment variable like YOUR_APP_HOME and referencing resources from this path is feasible, but requires you to remember about it in your code.
In sbt, changing the 'working directory' via the baseDirectory setting for test changes baseDirectory, which is not the base dir in my understanding, and it is not equal to new java.io.File(".").getAbsolutePath.
I'm not sure what the above statement means, but with sbt you need to fork to change your working directory during the run or test. This is documented in Enable forking and Change working directory.
If you fork, you can control everything, including the working directory.
http://www.scala-sbt.org/0.13.5/docs/Detailed-Topics/Forking.html
Example code:
fork in run := true
baseDirectory in run := file("/path/to/working/directory/")
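For tests, the analogous Test-scoped settings should behave the same way (a sketch; the path is a placeholder):
fork in Test := true
baseDirectory in Test := file("/path/to/test/working/directory/")
You can sanity-check the effective working directory from your code with println(new java.io.File(".").getAbsolutePath).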

"Not A Valid Jar" When trying to run Map Reduce Job

I am trying to run my MapReduce job by building a jar from Eclipse, but while trying to execute the job, I am getting a "Not a valid Jar" error.
I have tried to follow the link Not a valid Jar but that didn't help.
Can anyone please give me instructions on how to build the jar from Eclipse so that it runs on Hadoop?
I am aware of the process of building a jar file from Eclipse; however, I am not sure whether I have to take any special care when building the jar file so that it runs on Hadoop.
When you submit the command, make certain of the following:
When you indicate the jar, make certain you are pointing to it properly. It may be easiest to use the absolute path; to get it, navigate to the place where the jar is and run the readlink -f command on it. So for you, not just hist.jar, but maybe /home/akash_user/jars/hist.jar or wherever it is on your system. If you are using Eclipse, it may be saving the jar somewhere unexpected, so make sure that is not the problem. The jar cannot be run from HDFS storage; it must run from local storage.
When you name your main class, in your example Histogram, you must use the fully qualified name of the class, including the package. So, usually, if the program/project is named Histogram and there is a HistogramDriver, HistogramMapper, and HistogramReducer, with your main() in HistogramDriver, you need to type Histogram.HistogramDriver to get the program running. (Unless you made your jar runnable, which requires extra setup at the beginning, such as a manifest.)
Make sure that the jar you are submitting (hist.jar) is in the current directory from which you are submitting the 'hadoop jar' command.
If the issue still persists, please tell us the Java, Hadoop, and Linux versions you are using.
You should not keep the jar file in HDFS when executing a MapReduce job. Make sure the jar is available on the local path. The input path and output directory should be paths in HDFS.
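Putting those points together, a submission might look like this (the jar path is from the example above; the HDFS input/output paths are placeholders):
hadoop jar /home/akash_user/jars/hist.jar Histogram.HistogramDriver /user/akash_user/input /user/akash_user/output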

How to access test resources in Play 2.1?

I'm trying to access a .jpg file that is inside my src folders with this line of code:
getClass().getResource("/teste.jpg")
That leads to a NullPointerException.
I tried to put the file in the app/ root directory. I also tried to put it in the test/ directory and in test/resources as well.
When I run the test from Eclipse it works fine, but it does not when run from the command line using play test.
How to solve this?
I believe that you can achieve it by using
Play.application().getFile("file-relative-to-root")
It depends on the running application, but for a unit test you can create your own application. Take a look at the API docs.
http://www.playframework.com/documentation/api/2.1.0/scala/index.html#play.api.Application
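For example, in a test you can start a fake application and resolve the file relative to the application root, using the Scala equivalent of the call above. A minimal sketch for the Play 2.1 Scala API (the relative path is an assumption):
import play.api.Play
import play.api.Play.current
import play.api.test.FakeApplication
import play.api.test.Helpers.running

running(FakeApplication()) {
  // Resolved relative to the application root while the fake app is running.
  val file = Play.getFile("test/resources/teste.jpg")
  println(file.exists)
}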
Place the file under public or app/assets as described in Anatomy of a Play application. You can also have the test resources under test/resources.
[root]> help resourceDirectory
Default unmanaged resource directory, used for user-defined resources.
[root]> show test:resourceDirectory
[info] root/test:resourceDirectory
[info] /Users/jacek/dev/sandbox/play-new-app/test/resources
With the resource file(s) in one of the directories, you can get at them as follows:
io.Source.fromInputStream(getClass.getResourceAsStream("/teste.jpg"))
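Note that io.Source is character-oriented, so for a binary file like a .jpg you will usually want the raw bytes instead. A minimal sketch (assumes the file is on the test classpath):
val in = getClass.getResourceAsStream("/teste.jpg")
require(in != null, "teste.jpg is not on the classpath") // getResourceAsStream returns null when missing
val bytes = Iterator.continually(in.read()).takeWhile(_ != -1).map(_.toByte).toArray
in.close()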