My web project needs some services (processes) that are managed by supervisord.
I thought it would be handy if the supervisord config files for the services were version controlled, so that they don't get lost and don't have to be rebuilt by hand.
I would store them, together with my project files, in a repository and symlink the files into the /etc/supervisor/conf.d directory.
However, since some of the config options take absolute file paths (e.g. command or stdout_logfile), I hesitate to put them under version control.
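For example, a typical entry looks something like this (the program name and paths are just placeholders):

    [program:myapp-worker]
    ; both of these are absolute paths, which is what makes the file
    ; awkward to keep in version control and share across machines
    command=/home/alice/projects/myapp/bin/run-worker.sh
    directory=/home/alice/projects/myapp
    stdout_logfile=/var/log/myapp/worker.log
    autostart=true
    autorestart=true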
What's the best way to go here?
Related
I built a universal tarball using sbt-native-packager.
How do I load configuration files like c3p0 and application.conf from /etc/myapp or any other custom location when I start the app?
I don't want the config files to be part of the distribution tarball itself.
I believe you can use typesafe config's "include" feature to grab from a direct location.
See https://github.com/typesafehub/config#features-of-hocon
That said, this would require you to create different configurations depending on where you're installing, if you want a single global file to serve as the config file.
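As a sketch (the path and key names below are assumptions about your layout), the application.conf shipped inside the tarball could pull in an external file like this:

    # application.conf inside the tarball: sensible defaults first
    myapp.db.url = "jdbc:h2:mem:test"

    # then pull in machine-specific overrides; with a plain "include" a
    # missing file is ignored, so the tarball still runs on its own
    include file("/etc/myapp/overrides.conf")

Values in the included file win over the defaults above, since later settings override earlier ones in HOCON.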
In my Java Eclipse project that contains JUnit tests, I also have a package "resources" that contains all the input data used by the tests. But when the JUnit tests are compiled, Eclipse also copies that data, so I find the same files in the "bin" folder. Is there a way to avoid this?
Thanks.
If you have a particular package within the source path you want to exclude (your resources folder for example), you can right click on the package and select: Build Path > Exclude.
This will tell Eclipse that you don't want to include that package as part of the build.
This is making a couple of assumptions: that you're using Eclipse Helios (because the option might be different in older versions), and that the resources are stored in the same folder as your regular Java source files (because if resources is in a folder by itself, you can remove that entire folder from the build using Build Path > Configure Build Path > Source tab).
Update:
After the discussion in the comments regarding why you would or would not want to copy resources into the bin directory:
The contents of your bin directory should be ignored and not checked into a version control system (when using CVS, bin should be an entry in the .cvsignore file)
The resources are only duplicated on your local machine; copying them is fast and hard discs are big, so I'm not sure you should be worrying about this
If you're using Class.getResource to access those resources, they need to be on the classpath somewhere, and the bin directory is as good a place as any (see the sketch below)
So, realistically (barring some unknown, like the files are hundreds of gigabytes or something), I don't think you need to be concerned about excluding these files from the build.
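To make the last point concrete, here is a minimal sketch of loading a test input from the classpath (the package and file names are just examples):

    // minimal sketch: load a test input file from the classpath
    // (the package and file names are just examples)
    import java.io.InputStream;

    public class ResourceLoadingExample {
        public static void main(String[] args) throws Exception {
            // a leading slash means "relative to the classpath root",
            // which in a default Eclipse project is the bin directory
            try (InputStream in = ResourceLoadingExample.class
                    .getResourceAsStream("/resources/input.txt")) {
                if (in == null) {
                    System.out.println("resource not found on the classpath");
                } else {
                    System.out.println("read " + in.available() + " bytes");
                }
            }
        }
    }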
I am building a web service and am packaging it into a war file for deployment. Right now all of my config files (.properties and .xml) are being packaged into my .war file. This isn't going to work as some of these files will need to be modified for each individual installation. I know that some servlet containers will leave the .war files intact which would mean the config files would never be easily modified. My question is this: what is the best practice for deploying a .war file with these external config files? I'm thinking that the config files will need to be shipped separate from the .war file and placed into a directory that is in the classpath. Is there a default directory setup like this in Tomcat that these files can just be dropped into and my web service will be able to find without much trouble?
Maybe I shouldn't be using a war file for this setup? Maybe I should just be providing a zip file (with the same contents as the war file) and the deployment will simply be to extract the zip into the webapps directory?
I do not know of any default directory in Tomcat for storing configuration. My attempts to solve the same issue have been:
1 - Move configuration to the DB and provide scripts or webpages to modify values.
2 - Have a script to deploy the war. The script would merge configuration from a user directory into web.xml or other deployed config files.
3 - Have webapps look first in a user directory for configuration and, if not found there, fall back to the configuration files deployed by the war.
Least favorite is 3 - it requires all webapps to check two places for configuration, and you end up with two different xml files on the server with different values, where it is not always clear which one is used.
Next favorite is 2 - the webapps can be written without knowledge of multiple config files, but you run into issues when someone deploys from the Tomcat manager instead of using your script.
Favorite is 1. This just works in most cases. The problem is when you don't have a DB, or when the thing you need to configure is how you connect to the DB.
If having the file visible from all webapps is not an issue, you could put it in $CATALINA_HOME/lib.
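Since $CATALINA_HOME/lib is on Tomcat's common classpath, a plain properties file dropped there can then be read like any other classpath resource. A minimal sketch, assuming a file named myapp.properties (the name is just an example):

    // minimal sketch: read a properties file dropped into $CATALINA_HOME/lib
    // (the file name myapp.properties is just an example)
    import java.io.InputStream;
    import java.util.Properties;

    public class CommonLibConfig {
        public static Properties load() throws Exception {
            Properties props = new Properties();
            try (InputStream in = CommonLibConfig.class
                    .getResourceAsStream("/myapp.properties")) {
                if (in != null) {
                    props.load(in);
                }
            }
            return props;
        }
    }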
One solution for modifying a property file after the war has been deployed is to use the ServletContext.getRealPath() method to get the real path of the file, i.e. its path inside the container where the war was deployed, and then modify that file. This changes the copy in the container only, not the original file, so back it up if the modification is important to you. The benefit is that you do not need to redeploy the war, since you are editing the file already deployed in the container.
This approach can also edit a file that is in the web pages folder from a Java class.
If you want more detail on how to do it, let me know; I have done it.
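A minimal sketch of that idea, assuming a properties file at /WEB-INF/app.properties inside the deployed war (the path is just an example):

    // minimal sketch: edit a properties file inside the exploded war
    // (the path /WEB-INF/app.properties is just an example)
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.util.Properties;
    import javax.servlet.ServletContext;

    public class DeployedConfigEditor {
        public static void setProperty(ServletContext ctx, String key,
                                       String value) throws Exception {
            // filesystem path of the deployed copy; note this is null
            // if the container did not explode the war to disk
            String path = ctx.getRealPath("/WEB-INF/app.properties");
            Properties props = new Properties();
            try (FileInputStream in = new FileInputStream(path)) {
                props.load(in);
            }
            props.setProperty(key, value);
            try (FileOutputStream out = new FileOutputStream(path)) {
                // only the deployed copy changes, not the original war
                props.store(out, "updated after deployment");
            }
        }
    }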
I have a Java web app that offloads some environment-specific settings (Hibernate configurations, required directory paths, etc.) into a properties file that is ultimately packaged in the deployed WAR. If I wish to distribute this web app, what's the best way to handle the management of these settings? It's not feasible to ask the user to open up the WAR, update the properties file, repackage the WAR, and then deploy. I was thinking of either creating an installer (e.g. NSIS, WiX) that asks for the properties, writes them into the WAR, and then asks for the deployment location for the WAR. The other option is to have the properties file external to the WAR, and based on convention the web app will know where to read the file. What's the best practice in this case?
Some projects that require this sort of configuration, and face this issue, use the approach of building the projects (and the .war) on the server where it will be deployed.
So instead of:
Copy a pre-packaged .war file to a meaningful location
You get:
Check source code out of SCM (Subversion, CVS, etc.)
Configure to taste
Build the project (automated with Maven or Ant)
Deploy the project (also typically automated using Maven or Ant)
From here you can get fancy by checking each server's configuration files into SCM as well. This approach allows you to version & audit configuration changes.
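For the "Configure to taste" step, one possible (purely illustrative) setup is Maven resource filtering driven by a per-server properties file kept in SCM; the file name below is just an assumption:

    <!-- sketch: substitute ${...} placeholders in src/main/resources using
         values from a per-server properties file checked into SCM
         (the file name is illustrative) -->
    <build>
      <filters>
        <filter>src/main/filters/server1.properties</filter>
      </filters>
      <resources>
        <resource>
          <directory>src/main/resources</directory>
          <filtering>true</filtering>
        </resource>
      </resources>
    </build>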
I was also facing the same problem in my project. The developer before me had applied a crude fix: all the required configurations were added to the hibernate.hbm.cfg.xml file and commented out, and the ones needed were uncommented as required. There is a better solution to the problem, however:
Use a configuration folder schema
Use a configuration parameter reader
Use a ConfigurationReader component
Source : http://www.javaworld.com/javaworld/jw-11-2004/jw-1108-config.html
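As a rough illustration of the configuration-folder idea (this is not the ConfigurationReader from the linked article; the system property and default folder are assumptions):

    // rough sketch of reading from an external configuration folder
    // (the "config.dir" system property and default are assumptions,
    //  not the component described in the linked article)
    import java.io.FileInputStream;
    import java.nio.file.Paths;
    import java.util.Properties;

    public class ConfigFolderReader {
        public static Properties read(String fileName) throws Exception {
            // e.g. start the app with -Dconfig.dir=/etc/myapp
            String dir = System.getProperty("config.dir", "conf");
            Properties props = new Properties();
            try (FileInputStream in = new FileInputStream(
                    Paths.get(dir, fileName).toFile())) {
                props.load(in);
            }
            return props;
        }
    }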
What are good ways of dealing with the issues surrounding plugin code that interacts with an outside system?
To give a concrete and representative example, suppose I would like to use Subversion and Eclipse to develop plugins for WordPress. The main code body of WordPress is installed on the webserver, and the plugin code needs to be available in a subdirectory of that server.
I could see how you could simply check out a copy of your code directly under the web directory on a development machine, but how would you then also integrate this with the IDE?
I am making the assumption here that all the code for the plugin is located under a single directory.
Do most people just add the plugin as a project in an IDE and then place the working folder for the project wherever the 'main' software system wants it to be? Or do people use some kind of symlinks to their home directory?
Short answer - I do have my development and production servers check out the appropriate directories directly from SVN.
For your example:
Develop on the IDE as you would normally, then, when you're ready to test, check in to your local repository. Your development webserver can then have that directory checked out and you can easily test.
Once you're ready for production, merge the change into the production branch, and do an svn update on the production webserver.
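In command form, that workflow looks roughly like this (repository URLs and paths are placeholders):

    # development webserver: check the plugin out where WordPress expects it
    svn checkout http://svn.example.com/myplugin/trunk /var/www/wp-content/plugins/myplugin

    # after committing from the IDE, pull the changes onto the dev server
    svn update /var/www/wp-content/plugins/myplugin

    # for production: in a working copy of the production branch,
    # merge in trunk, commit, then update the production server's checkout
    svn merge http://svn.example.com/myplugin/trunk
    svn commit -m "Merge trunk into production branch"
    svn update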
Where I work some folks like to use the FileSync Plugin for Eclipse for this purpose, though I have seen some oddities with that plugin where files in the target directory occasionally go missing. The whole structure is:
Ant task to create target directory at desired location (via copy commands, mostly)
FileSync Plugin configured to keep files in sync between development location and target location as you code (sync the Eclipse output folder to a location in the Web server's classpath, etc.)
Of course, symlinks may work better on systems that have good support for symlinks :-)
To me, adding a symlink pointing to your development folder seems like a tidy solution to the problem.
If the main project is on a different machine/webserver, you could use something like sshfs to mount your development directory into the right place on the webserver.
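For example (all paths and hostnames are placeholders):

    # same machine: symlink the checked-out plugin into the WordPress tree
    ln -s ~/workspace/myplugin /var/www/wp-content/plugins/myplugin

    # webserver on another machine: mount its plugin directory locally over
    # SSH and point the IDE project (or a symlink) at the mount point
    sshfs user@webserver:/var/www/wp-content/plugins/myplugin ~/mnt/myplugin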