We have a web service developed in Talend and deployed in TAC (Talend Runtime). The service works fine on the local system but not after deployment. We have tried various methods to debug it, such as placing a logger component and adding a logging mechanism in a Java component of Talend, but those messages do not show up in the log file. Please suggest.
Talend Enterprise 5.6 comes with log4j logging (it can be enabled in the project settings). Open Studio may have this feature as well.
If you activate it and start the log server (based on Kibana/Logstash), you get a web interface that shows the log messages in real time, across all your deployments.
We're using this approach for development and some production projects. It shows you all the SQL statements, connection details, execution times, records fetched, etc.
In TAC you should see the same logs if you click the magnifier button on the corresponding job on the Job Conductor tab. If it's empty, check the log4j settings in File -> Edit Project Properties -> Log4J, and make sure that the default CONSOLE appender is enabled. Also try building the project manually and checking the log4j.properties in the built zip file.
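If you need to sanity-check the bundled file, a typical log4j 1.x configuration with the CONSOLE appender enabled looks roughly like this (the level and pattern are illustrative; compare against the log4j.properties in your built zip):

    log4j.rootLogger=INFO, CONSOLE
    log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
    log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
    log4j.appender.CONSOLE.layout.ConversionPattern=%d{ISO8601} %-5p [%t] %c - %m%n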
Finally, check the log level in the job properties on TAC -> Job Conductor, and make sure it is set to the right level.
My current application setup runs on WAS 7.0 and we are planning to migrate to WAS 8.5.
My application has a lot of complex console setup: queues, activation specs, multiple data sources, security configs, work manager, etc.
So I thought of using a WAS 7.0 CAR file in WAS 8.5 to restore the profile configuration.
Is it possible to do that, or do we have to configure everything manually in WAS 8.5?
To be honest I've never tried that, but you have some other supported migration paths:
Use the migration wizard/migration tools - see Migrating product configurations for details. This will scan your v7.0 profile configuration and create a v8.5.5 profile. Probably the easiest option if you have lots of customized changes and don't have any scripts for them.
Use the WebSphere Configuration Migration Tool - this tool is installed as a plugin to Eclipse. It reads an exported configuration and creates a Jython script. The tool migrates the most common resources (like JDBC), but not all. See the tool's page for limitations.
Use AdminTask, which exports the configuration as a properties file; change it according to your needs and apply it on the target environment:
wsadmin(.sh/.bat) -lang jython -c "AdminTask.extractConfigProperties(['-propertiesFileName my.props'])"
See Managing environment configurations with properties files using wsadmin scripting
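As a rough sketch of that round trip (the applyConfigProperties step is my assumption based on the same wsadmin documentation; verify the options against your version):

    # Extract the source profile's configuration to a properties file
    wsadmin.sh -lang jython -c "AdminTask.extractConfigProperties(['-propertiesFileName my.props'])"

    # ...edit my.props as needed, then apply it on the target environment and save
    wsadmin.sh -lang jython -c "AdminTask.applyConfigProperties(['-propertiesFileName my.props'])" -c "AdminConfig.save()"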
Our application, deployed on a Mule server, has credentials in a properties file that are used to access the database and other cloud services. We have to make a change in the properties file. I would like to know whether this will require a redeployment of the running application, whether it is sufficient to restart the application from the Mule Management Console, or whether a server restart is required.
Any suggestion would greatly help.
Thank you in advance
I believe you can still control the applications from MMC, even when they are deployed using the $MULE_HOME/apps directory. You navigate to them through the "Servers" tab: find the application under the server, and there is an option at the top right to "Stop" and "Start" the application. This should allow you to run with the new configuration values.
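For context on why a stop/start is enough: property placeholders are resolved once, when the application starts, not on each message. A hypothetical Mule 3 style snippet illustrating this (the element and property names are assumptions for illustration, not from the original question):

    <!-- Properties are read when the app starts, so a restart picks up edits -->
    <context:property-placeholder location="mule-app.properties"/>

    <db:mysql-config name="dbConfig"
                     host="${db.host}" port="${db.port}"
                     user="${db.user}" password="${db.password}"
                     database="${db.name}"/>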
The simplest option, without restarting your Mule server, is to redeploy the application from the Mule Management Console (Deployment tab).
Hope this helps.
I am using IBM RSA 7.5 and WebSphere Application Server 6.1 as the application server.
I am not able to change the Class Loader Order dropdown box; it is disabled.
How do I enable it? I need to change the setting from "Classes loaded with parent class loader first" to "Classes loaded with application class loader first".
"The classloader options are disabled in the admininstrative console because the application was published in a "loose configuration" manner. What this means is that your application binaries and descriptor files do not reside in the WAS application repository. Since you published them via RAD (which is most likely configured to 'Run with resources in the workspace') then the application binaries exist in the output folders of your various projects and WAS is instructed to read the binaries/descriptor files from that location. As a result, the WAS admin console is not able to make changes to these files so the functionality is disabled.
Here is a document which describes how to can accomplish the task you want when using this publishing mechanism via RAD:
http://publib.boulder.ibm.com/infocenter/radhelp/v7r5/index.jsp?topic=/com.ibm.ws.ast.st.enhanced.ear.doc/topics/tapplicationsv6.html
Using this manner to change the classloader settings results in this information being stored with the application (in the EAR project) so you will no longer need to change it when the application is published to any WAS runtime (i.e. in development or production)."
From the IBM developer help website
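For reference, "Classes loaded with application class loader first" corresponds to WebSphere's PARENT_LAST class loader mode. With the enhanced EAR approach described above, the setting is stored in a deployment.xml packaged with the application, roughly along these lines (an abbreviated, hypothetical sketch; compare it with a file generated by RAD):

    <appdeployment:Deployment xmi:version="2.0">
      <deployedObject xmi:type="appdeployment:ApplicationDeployment"
                      warClassLoaderPolicy="SINGLE">
        <!-- PARENT_LAST means the application class loader is consulted first -->
        <classloader xmi:id="Classloader_1" mode="PARENT_LAST"/>
      </deployedObject>
    </appdeployment:Deployment>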
I struggled with this one a lot, and even stumbled on your question while desperately searching.
Although I had a different setup, publishing my application from Eclipse, unchecking "Minimize application files copied to the server" in the Eclipse server configuration solved the issue for me.
How do I set up an EAR and a GlassFish server so that logging shows at FINE level on the development server, while the same EAR shows INFO level on the production machines?
At the moment I change the configuration in the persistence.xml every time I deploy onto the production machines.
But sometimes I forget, and the machine starts flooding the log files.
You must put
<jvm-options>-Declipselink.logging.level=FINE</jvm-options>
into the java-config element of your domain.xml on your development machine.
And do NOT put the logging level property into the persistence.xml.
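Alternatively, you can add the JVM option with asadmin instead of editing the file by hand (a sketch assuming the default domain; a restart is needed for the option to take effect):

    asadmin create-jvm-options "-Declipselink.logging.level=FINE"
    asadmin restart-domain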
You can also set the EclipseLink log level using System properties (or you could set the log level in code using a SessionCustomizer).
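If you go the SessionCustomizer route, a minimal sketch could look like the following (the class name and the app.log.level system property are my own inventions; you would register the class via the eclipselink.session.customizer persistence unit property):

    import org.eclipse.persistence.config.SessionCustomizer;
    import org.eclipse.persistence.logging.SessionLog;
    import org.eclipse.persistence.sessions.Session;

    // Hypothetical customizer: read the desired level from a system property,
    // so the same EAR can log FINE in development and INFO in production.
    public class LogLevelCustomizer implements SessionCustomizer {
        public void customize(Session session) throws Exception {
            String level = System.getProperty("app.log.level", "INFO");
            session.setLogLevel("FINE".equalsIgnoreCase(level)
                    ? SessionLog.FINE : SessionLog.INFO);
        }
    }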
I developed a Spring Batch application which is deployed as an executable jar using a batch/shell script. It works fine.
Recently I read about the Spring Batch Admin application release. According to its docs, you point it at your job-context.xml, and that allows the Spring Batch app to be started, restarted, and stopped from the admin app. Now my question is: do I have to keep my job-context.xml outside the jar? What are the exact steps? I am confused about this configuration.
Any insight on this would be very useful. By the way, I am using Spring Batch 2.1.
Thanks
The Spring Batch Admin application is a good reference implementation and is highly customizable. All interface implementations may be replaced via Spring DI using your own classes. The UI is also template driven (FreeMarker, I think) and may therefore be customized to display relevant information, change the skin, etc.
I had a similar need to yours: I needed admin functionality included in an app built as a jar. I did not quite like the fact that I had to package my jobs as a .war file. Instead, I extracted the relevant configurations from the Spring Batch Admin source and created a deployment that works off the file system and runs on an embedded Jetty server.
See screen shots here : https://github.com/regunathb/Trooper/wiki/Trooper-Batch-Web-Console
Source, configurations, etc. are available here: https://github.com/regunathb/Trooper/tree/master/batch-core . This project actually creates a .jar, not a .war.
If your application has custom classes and is deployed as a runnable jar, not contained within Spring Batch Admin, you cannot start jobs; you can only view the status of jobs and "kill" their status in the database.
If you look at http://static.springsource.org/spring-batch-admin/reference/reference.xhtml, at the end of the Configuration Upload section, it states:
You can see a new entry in the job registry ("test-job") which is launchable in-process because the application has a reference to the Job. (Jobs which are not launchable were executed out of process, but used the same database for its JobRepository, so they show up with their executions in the UI.)
If your jobs are strictly configurable jobs, that is, you use only XML to define them and do not need any customized item readers/processors/writers or other custom classes, then you can upload the job XML and it will be runnable from within the admin site. If you have custom classes then, in my experience, you will have to deploy the Spring Batch application within your web application and then upload an XML file that contains the jobs you want to run separately.
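As an illustration, a job like the following uses only stock classes (SystemCommandTasklet ships with Spring Batch core), so it could in principle be uploaded and launched in-process; the bean names and the command are made up:

    <beans xmlns="http://www.springframework.org/schema/beans"
           xmlns:batch="http://www.springframework.org/schema/batch"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xsi:schemaLocation="
             http://www.springframework.org/schema/beans
             http://www.springframework.org/schema/beans/spring-beans.xsd
             http://www.springframework.org/schema/batch
             http://www.springframework.org/schema/batch/spring-batch-2.1.xsd">

      <!-- A tasklet built from a stock class only, so no custom code is needed -->
      <bean id="archiveTasklet"
            class="org.springframework.batch.core.step.tasklet.SystemCommandTasklet">
        <property name="command" value="cp logs/app.log archive/"/>
        <property name="timeout" value="60000"/>
      </bean>

      <batch:job id="test-job">
        <batch:step id="archiveStep">
          <batch:tasklet ref="archiveTasklet"/>
        </batch:step>
      </batch:job>
    </beans>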
I personally just used the Admin tool to view job status and provide me with statistics through some custom pages. I left the scheduler to run the jobs, and I didn't want those with access to the admin site to kick off a job when they knew nothing about it. Basically, I used it to give the users a warm fuzzy feeling without allowing them to muck things up. (Leave it to a user to find an edge case you didn't account for.)