We are implementing the updater service using the install4j APIs (without using Updater.exe). We were able to use the APIs below to get the possible update version:
UpdateCheckRequest updateCheckRequest = new UpdateCheckRequest(updatesUrl).applicationDisplayMode(ApplicationDisplayMode.UNATTENDED)
.askForProxy(false).connectTimeout(10000).readTimeout(20000);
UpdateDescriptor updateDescriptor = UpdateChecker.getUpdateDescriptor(updateCheckRequest);
return updateDescriptor.getPossibleUpdateEntry();
However, we are missing the JVM arguments below (to set proxy settings and enable logging), which are available in the updater screens:
-Dinstall4j.noProxyAutoDetect=true
-DproxySet=true
-DproxyHost=""
-DproxyPort=""
-DproxyAuth="true"
-DproxyAuthUser=""
-DproxyAuthPassword=""
-Dinstall4j.keepLog=true
-Dinstall4j.alternativeLogfile=${installer:sys.installationDir}/logs/patch-agent-updater.log
Please let us know how we can pass them to the custom updater.
In this case the update checker runs in the same process, so you can set these properties via System.setProperty.
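A minimal sketch under that assumption, with placeholder values (the real proxy host, credentials, and log path come from your own configuration); the properties just need to be set in the same JVM before the update check runs:

// Set the same properties the updater screens would pass as JVM arguments.
// They must be set before UpdateChecker.getUpdateDescriptor(...) is called.
System.setProperty("install4j.noProxyAutoDetect", "true");
System.setProperty("proxySet", "true");
System.setProperty("proxyHost", "proxy.example.com");      // placeholder
System.setProperty("proxyPort", "8080");                    // placeholder
System.setProperty("proxyAuth", "true");
System.setProperty("proxyAuthUser", "proxyUser");           // placeholder
System.setProperty("proxyAuthPassword", "proxyPassword");   // placeholder
System.setProperty("install4j.keepLog", "true");
System.setProperty("install4j.alternativeLogfile",
        installationDir + "/logs/patch-agent-updater.log"); // placeholder; resolve the installation dir yourself

UpdateDescriptor updateDescriptor = UpdateChecker.getUpdateDescriptor(updateCheckRequest);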
I'm using an integration toolkit for each external system: the externalService definition, server registration data, the ENV that refers to the J2C credentials to use, data mapping, business-error handling, etc. (business-layer toolkit, e.g. TK_SAP).
For common functionality such as logging, tokenizing, pseudonymizing, and common HTTP-error handling, I want to use another toolkit (a generic implementation for the transport layer, e.g. TK_COM).
So the dependency chain looks like this:
ProcessApp -> TK_SAP -> TK_COM
There is a service flow with inputs externalServiceName and operationName; the flow asks for an OAuth token and calls the target system using externalServiceName and operationName.
The problem: when I try to invoke the BPMRESTRequest from TK_COM, I get a NullPointerException because "externalServiceName" cannot be resolved.
var request = new BPMRESTRequest();
request.externalServiceName = "language-translator-v2";
request.operationName="checkout";
...
var response = tw.system.invokeREST(request);
Is it possible to store the service definition in another (upper) toolkit and refer to it from the invoking toolkit?
Or is there a callback for constructing the BPMRESTRequest to say which service definition must be used, so the NPE can be avoided?
Or is there another way to call REST programmatically that supports environments?
I understand that switching the layers could help (putting the service definition in the lower toolkit dependency), but that chain would be illogical:
ProcessApp -> TK_COM -> TK_SAP
The answer is: a JS library implementation.
Implementing the common functionality as a server JS file in TK_COM means that a call into it from TK_SAP instantiates the JS execution context in the TK_SAP namespace, so all external services and variables defined in TK_SAP are accessible while the JS code executes (even though that code is actually provided by the lower-dependency toolkit).
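For illustration only, a minimal sketch of such a server JS file in TK_COM; the helper name callExternalService and its parameters are made up, but because the code runs in the calling toolkit's (TK_SAP's) namespace, the external service name defined there resolves correctly:

// Server JS file in TK_COM (hypothetical helper; names are illustrative).
// When called from a service in TK_SAP, this runs in the TK_SAP namespace,
// so external services defined in TK_SAP can be resolved here.
function callExternalService(externalServiceName, operationName) {
    var request = new BPMRESTRequest();
    request.externalServiceName = externalServiceName; // resolved against the caller's toolkit
    request.operationName = operationName;
    // ... set headers / parameters as needed ...
    return tw.system.invokeREST(request);
}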
Does anyone know the method / commands to send to enable and use multi-config support, so I can store application-specific data?
The SDK 2.0 Developer Guide mentions the AT*CONFIG_IDS command, but I'm not able to make it work. I'm trying the sample commands below:
// set the application ID
AT*CONFIG=12,"CUSTOM:application_id","2902050D"
// clear config ack
AT*CTRL=13,5,0
// set application description, using new app id
AT*CONFIG_IDS=14,"00000000","00000000","2902050D"
AT*CONFIG=15,"CUSTOM:application_desc","My SDK Test"
// clear config ack
AT*CTRL=16,5,0
// re-read config data
AT*CTRL=17,4,0
AT*CTRL=18,5,0
But in the returned config, nothing has changed:
custom:application_id = 00000000
custom:application_desc = Default application configuration
I've also tried prefixing the first CUSTOM:application_id config command with a CONFIG_IDS command, but to no avail:
// set the application ID
AT*CONFIG_IDS=11,"00000000","00000000","00000000"
AT*CONFIG=12,"CUSTOM:application_id","2902050D"
Any ideas as to what I'm doing wrong?
After some trial and error, it seems that all configuration names should be lower case, despite the category being upper-cased in the Developer Guide examples.
Therefore the following works fine:
AT*CONFIG=12,"custom:application_id","2902050D"
AT*CTRL=13,5,0
AT*CONFIG_IDS=14,"00000000","00000000","2902050D"
AT*CONFIG=15,"custom:application_desc","My SDK Test"
AT*CTRL=16,5,0
AT*CTRL=17,4,0
AT*CTRL=18,5,0
My goal is to add a custom PropertySource to spring-cloud-config-server. What I want to achieve is to get some custom properties from that custom source in a spring-cloud-config-client application.
Based on the suggestions in "Adding environment repository in spring-config-server", I've created a spring-cloud-config-server application and a separate project, spring-cloud-config-custom. The latter is based on the spring-cloud-consul-config code. So I've created all the necessary classes (CustomPropertySource, CustomPropertySourceLocator, CustomConfigBootstrapConfiguration, and so on) and configured them in spring.factories.
Finally, I've added a Maven dependency on spring-cloud-config-custom to my spring-cloud-config-server.
So far so good; everything works well. When I start the server, I can see that my CustomPropertySource is on the list of propertySources inside the EnvironmentRepository bean injected into EnvironmentController.
Problem: when I send a GET request to @RequestMapping("/{name}/{profiles}/{label:.*}") (in EnvironmentController), the injected EnvironmentRepository bean is used to find the requested property source (the repository.findOne(name, profiles, label) method).
Unfortunately, my property source cannot be found there. Why?
I've spent a lot of time debugging this. I found that the repository delegates the findOne() call to other repositories: MultipleJGitEnvironmentRepository, which delegates it to NativeEnvironmentRepository. Inside these delegates, findOne() does not use the propertySources from the EnvironmentRepository originally injected into the controller. It creates a new environment repository with a new list of PropertySources and a new, separate SpringApplication. In the end, that list does not contain my CustomPropertySource, which is why findOne() returns empty propertySources in the resulting Environment object.
Am I doing something wrong?
Is CustomPropertySourceLocator (and/or ConsulPropertySourceLocator) supposed to be used (autowired/bootstrapped) in spring-cloud-config-server or in spring-cloud-config-client?
Can spring-cloud-config-server deliver many different kinds of PropertySources at the same time via its REST interface (by "different" I mean Git, Consul, and Zookeeper together)?
What you are doing is adding a property source to the config server itself, not to the configuration it serves. Adding spring-boot-starter-actuator to your config server and viewing /env reveals:
{
  "profiles": [],
  "server.ports": {
    "local.server.port": 8888
  },
  "bootstrapProperties:custom": {
    "test.prop3": "CUSTOM-VALUE-3",
    "test.prop2": "CUSTOM-VALUE-2",
    "test.prop1": "CUSTOM-VALUE-1"
  },
}
To add something that will be served by the config server, you have to implement an EnvironmentRepository.
Support for a composite EnvironmentRepository was recently added.
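For illustration, a rough sketch of a custom EnvironmentRepository (the class name and property values are made up, and the exact interface may vary between Spring Cloud Config versions); a bean like this in the config server contributes properties to the Environment returned from the /{name}/{profiles}/{label} endpoint:

import java.util.HashMap;
import java.util.Map;

import org.springframework.cloud.config.environment.Environment;
import org.springframework.cloud.config.environment.PropertySource;
import org.springframework.cloud.config.server.environment.EnvironmentRepository;

public class CustomEnvironmentRepository implements EnvironmentRepository {

    @Override
    public Environment findOne(String application, String profile, String label) {
        // Echo the requested application name and profile back in the Environment.
        Environment environment = new Environment(application, profile);

        // Properties served by this repository (hypothetical values).
        Map<String, Object> properties = new HashMap<>();
        properties.put("test.prop1", "CUSTOM-VALUE-1");
        environment.add(new PropertySource("custom", properties));
        return environment;
    }
}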
I am trying to understand how to use @Profile in Spring Batch.
I created a Java file with two classes in it:
@Configuration
@Profile("nonlocal")
class NonLocalConfiguration {
}
and
@Configuration
@Profile("local")
class LocalConfiguration {
}
In the main Java class, I am trying to set the profiles as follows:
AbstractApplicationContext context = new ClassPathXmlApplicationContext("applicationcontext.xml");
String run_env = System.getenv("profile");
System.out.println("run_env is: " + run_env);
context.getEnvironment().setActiveProfiles(run_env);
context.refresh();
I set the profile using an environment variable: profile=local.
When the program is executed, I get a NullPointerException, which I think means the profile is not being picked up correctly. How can I use the profiles concept in Spring Batch? Is it different in plain Spring vs. Spring Batch?
As far as I understand, you have to set the profile before the ApplicationContext is created and refreshed. Once the application context has been refreshed, changing the profile will not have any influence.
From the SpringBoot documentation, chapter 25.2 http://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-profiles.html
Programmatically setting profiles
You can programmatically set active profiles by calling SpringApplication.setAdditionalProfiles(…) before your application runs. It is also possible to activate profiles using Spring's ConfigurableEnvironment interface.
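A minimal sketch of that idea for the XML-based setup in the question: create the context without refreshing it (the two-argument ClassPathXmlApplicationContext constructor with refresh=false), set the active profiles, and only then refresh:

// Create the context but do not refresh it yet, so the active profiles
// still influence which @Profile-annotated beans are registered.
ClassPathXmlApplicationContext context =
        new ClassPathXmlApplicationContext(new String[] {"applicationcontext.xml"}, false);

String runEnv = System.getenv("profile");
if (runEnv != null) {
    context.getEnvironment().setActiveProfiles(runEnv);
}
context.refresh();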
In my project there are additional (non-Wicket) applications which need to know the URL representation of some domain objects (e.g. in order to write a link like http://mydomain.com/user/someUserName/ into a notification email).
Now I'd like to create a Spring bean in my Wicket module that exposes the URLs I need without a running Wicket context, so that the other applications can depend on the Wicket module, e.g. offering a method public String getUrlForUser(User u) returning "/user/someUserName/".
I've been searching around the web and through the Wicket source for a complete workday now, and did not find a way to retrieve the URL for a given page class and PageParameters without a current RequestCycle.
Any ideas how I could achieve this? Actually, all the information I need is somehow stored by my WebApplication, in which I define mount points and page classes.
Update: Because the code below caused problems under certain circumstances (in our case, being executed later by a Quartz scheduled job), I dived a bit deeper and finally found a more lightweight solution.
Pros:
No need to construct and run an instance of the WebApplication
No need to mock a ServletContext
Works completely independently of the web application container
Contra (or not, depends on how you look at it):
Need to extract the actual mounting from your WebApplication class and encapsulate it in another class, which can then be used by standalone processes. You can no longer use WebApplication's convenient mountPage() method, but you can easily build your own convenience implementation; just have a look at the Wicket sources.
(Personally, I have never been happy with all the mount configuration making up 95% of my WebApplication class, so it felt good to finally extract it somewhere else.)
I cannot post the actual code, but having a look at this piece of code will give you an idea of how to mount your pages and how to get hold of the URL afterwards:
CompoundRequestMapper rm = new CompoundRequestMapper();
// mounting the pages
rm.add(new MountedMapper("mypage",MyPage.class));
// ... mount other pages ...
// create URL from page class and parameters
Class<? extends IRequestablePage> pageClass = MyPage.class;
PageParameters pp = new PageParameters();
pp.add("param1", "value1");
IRequestHandler handler = new BookmarkablePageRequestHandler(new PageProvider(pageClass, pp));
Url url = rm.mapHandler(handler);
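If you need an absolute link (as in the notification-email example above), the relative Url returned by mapHandler() can simply be prefixed with your host, for example:

String link = "http://mydomain.com/" + url.toString();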
The original solution is below:
After deep-diving into the intestines of the Wicket sources, I was able to glue together this piece of code:
IRequestMapper rm = MyWebApplication.get().getRootRequestMapper();
IRequestHandler handler = new BookmarkablePageRequestHandler(new PageProvider(pageClass, parameters));
Url url = rm.mapHandler(handler);
It works without a current RequestCycle, but still needs to have MyWebApplication running.
However, using Wicket's internal test classes, I have put together the following to construct a dummy instance of MyWebApplication:
MyWebApplication dummy = new MyWebApplication();
dummy.setName("test-app");
dummy.setServletContext(new MockServletContext(dummy, ""));
ThreadContext.setApplication(dummy);
dummy.initApplication();
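One note: since ThreadContext.setApplication() binds the dummy application to the current thread, it may be worth detaching it once you are done (e.g. at the end of the scheduled job) so the thread-local reference does not leak:

ThreadContext.detach();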