Security parameter at startup of a JNLP application (Java Web Start)

We need to change the security property 'networkaddress.cache.ttl' for our JNLP Java application. The application is started by javaws from IcedTea-Web.
I have tried several possibilities in the JNLP file:
<resources>
    <j2se version="1.8+" java-vm-args="-Dnetworkaddress.cache.ttl=30" javaws-vm-args="-J-Dnetworkaddress.cache.ttl=30"/>
    <property name="networkaddress.cache.ttl" value="30"/>
</resources>
javaws from IcedTea ignores every one of these parameters, and that is fair enough, since it is a security property. I also tried the deprecated property 'sun.net.inetaddr.ttl'; that does not work either.
Another option is to set the property in code:
java.security.Security.setProperty("networkaddress.cache.ttl", "30");
But that doesn't work, because the code that reads this property runs before I am able to set it: the class sun.net.InetAddressCachePolicy reads the security property in a static initializer block. For some people this reportedly worked; is that really true?
Changing the java.security file on every client is not an option.
Are there any other possibilities?
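For what it's worth, the reports of Security.setProperty working appear to hinge on timing: sun.net.InetAddressCachePolicy is only initialized when the first hostname lookup happens, so the property is picked up if it is set before any DNS resolution in the JVM. A minimal sketch of that ordering in a plain Java program (the class name and the lookup target are just illustrations):

import java.net.InetAddress;
import java.security.Security;

public class Main {
    public static void main(String[] args) throws Exception {
        // Must run before anything triggers DNS resolution, because
        // sun.net.InetAddressCachePolicy reads the property in its static initializer.
        Security.setProperty("networkaddress.cache.ttl", "30");

        // Only after the property is set should the first lookup happen.
        InetAddress addr = InetAddress.getByName("example.org");
        System.out.println(addr);
    }
}

Under Web Start that ordering is hard to guarantee, because javaws has typically already resolved hostnames (to download the JARs) before the application's main method runs, which would explain why the same call has no effect here.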


Is there a programmatic way to create a custom network profile in Windows without importing an XML file?

I'm looking for a way to add a wireless network profile to a system without importing the XML file. Does anyone know the actual syntax for entering all of that data manually if you have it?
Of course I could use the netsh command below to import a profile, but I want to be able to do it manually; I want to write a PowerShell script that feeds in each of the necessary variables one by one. Is anyone familiar with this process?
Below are the command and an example of the profile XML.
netsh wlan add profile filename="C:\path\HOME.xml"
<?xml version="1.0"?>
<WLANProfile xmlns="http://www.microsoft.com/networking/WLAN/profile/v1">
    <name>HOME</name>
    <SSIDConfig>
        <SSID>
            <hex>6D797374726F</hex>
            <name>mystro</name>
        </SSID>
    </SSIDConfig>
    <connectionType>ESS</connectionType>
    <connectionMode>manual</connectionMode>
    <MSM>
        <security>
            <authEncryption>
                <authentication>WPA2PSK</authentication>
                <encryption>AES</encryption>
                <useOneX>false</useOneX>
            </authEncryption>
            <sharedKey>
                <keyType>passPhrase</keyType>
                <protected>false</protected>
                <keyMaterial>password</keyMaterial>
            </sharedKey>
        </security>
    </MSM>
    <MacRandomization xmlns="http://www.microsoft.com/networking/WLAN/profile/v3">
        <enableRandomization>false</enableRandomization>
    </MacRandomization>
</WLANProfile>
The answer was provided above in the form of a comment, so I cannot accept it as an answer; I leave it here for anyone else who searches for this. Apparently an XML file is required, and manually programming in the variables is not possible at this time.
Microsoft documents it as requiring XML: https://learn.microsoft.com/en-us/windows/win32/nativewifi/wlan-profileschema-wlanprofile-element
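Given that constraint, the closest thing to entering the variables one by one is probably to build the profile XML from parameters at runtime and hand a temporary file to netsh. A rough PowerShell sketch (the parameter names and temp-file handling are my own, and the security block is hard-wired to WPA2-PSK/AES like the example above):

param(
    [string]$ProfileName = "HOME",
    [string]$Ssid = "mystro",
    [string]$Passphrase = "password"
)

# Build the hex form of the SSID and the profile XML that netsh requires.
$hexSsid = ($Ssid.ToCharArray() | ForEach-Object { '{0:X2}' -f [int]$_ }) -join ''
$xml = @"
<?xml version="1.0"?>
<WLANProfile xmlns="http://www.microsoft.com/networking/WLAN/profile/v1">
  <name>$ProfileName</name>
  <SSIDConfig><SSID><hex>$hexSsid</hex><name>$Ssid</name></SSID></SSIDConfig>
  <connectionType>ESS</connectionType>
  <connectionMode>manual</connectionMode>
  <MSM><security>
    <authEncryption>
      <authentication>WPA2PSK</authentication>
      <encryption>AES</encryption>
      <useOneX>false</useOneX>
    </authEncryption>
    <sharedKey>
      <keyType>passPhrase</keyType>
      <protected>false</protected>
      <keyMaterial>$Passphrase</keyMaterial>
    </sharedKey>
  </security></MSM>
</WLANProfile>
"@

# netsh still only accepts a file, so write the XML out, import it, then clean up.
$tmp = Join-Path $env:TEMP "$ProfileName.xml"
Set-Content -Path $tmp -Value $xml -Encoding UTF8
netsh wlan add profile filename="$tmp"
Remove-Item $tmp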

How to get application DEBUG and INFO messages properly logged in Play 2.4?

According to the Play 2.4 documentation, the default application logging level should be DEBUG, right?
<logger name="play" level="INFO" />
<logger name="application" level="DEBUG" />
However, in my logs I only get WARN and ERROR level messages.
For example, this code:
import play.api.Logger
import play.api.mvc._

class Application extends Controller {
  val log = Logger(this.getClass)

  def index = Action {
    log.debug("debug")
    log.info("info")
    log.warn("warn!")
    log.error("ERROR")
    Ok("ok")
  }
}
...only yields this in stdout (ditto in logs/application.log):
[warn] c.Application - warn!
[error] c.Application - ERROR
How to get application DEBUG and INFO messages properly logged?
Using Play 2.4.3, with basically default configs, and no conf/logback.xml at all. (SBT-based project setup, no Typesafe Activator.)
To clarify, I know I can create a custom config file (conf/logback.xml) for Logback. That is obvious from the documentation I linked to in the very first sentence.
The point here was: if my needs are extremely ordinary (get my app's messages logged, including debug and info), do I really need to create a lengthy custom configuration file? One would assume something as basic as this would work by default, or with some minimal config option. After all, Play Framework is touted as having a good developer experience, and many things in it follow the “convention over configuration” principle.
What I learned from a colleague in our backend chat:
Your Application controller probably resides in the controllers package, right? When you do Logger(getClass), getClass is used to look up the package path to Application, which then would be controllers.Application. So you can, for example, add a line <logger name="controllers" level="DEBUG" /> to get debug output from classes in the controllers package.
There is one way to do this without custom configs (which worked for INFO but not DEBUG in a quick experiment), but it has significant drawbacks compared to using more granular loggers (as in my question).
The "application" logger is the default logger name that's used if you
use the Logger object directly, like Logger.info("Hello, world!"), as
opposed to creating your own instances
[...]
But that will clearly backfire quickly since then you lose granular
configuration of your logs and can only filter the logging "globally",
so I never use that.
Also, your logs will not disclose where the log statement is
made but instead just print that it's in "application"
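To make the naming difference concrete, here is a small sketch mirroring my controller (assuming the stock play.api.Logger; the logger names Play derives are shown in the comments):

import play.api.Logger
import play.api.mvc._

class Application extends Controller {
  // Logger named "controllers.Application": can be filtered per package.
  private val log = Logger(this.getClass)

  def index = Action {
    Logger.info("hello")   // goes to the logger named "application"
    log.info("hello")      // goes to "controllers.Application"
    Ok("ok")
  }
}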
I don't want those drawbacks, so I did create conf/logback.xml (starting with a copy of the default) and added custom loggers along the lines of:
<logger name="controllers" level="DEBUG" />
<logger name="services" level="DEBUG" />
<logger name="repositories" level="DEBUG" />
So now my val log = Logger(this.getClass) approach works.
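For reference, the resulting conf/logback.xml does not have to be huge. A trimmed-down sketch along these lines (loosely based on the Play 2.4 default file, so your appender details may differ):

<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date [%level] %logger{15} - %message%n%xException{10}</pattern>
    </encoder>
  </appender>

  <logger name="play" level="INFO" />
  <logger name="application" level="DEBUG" />

  <!-- custom per-package loggers -->
  <logger name="controllers" level="DEBUG" />
  <logger name="services" level="DEBUG" />
  <logger name="repositories" level="DEBUG" />

  <root level="WARN">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>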
But I fail to see how requiring 30-40 lines of custom XML for pretty much the most basic thing imaginable is good developer experience. If some Play advocate or developer could justify why this doesn't work out of the box with default config, I'd be interested in hearing that.
You have no .xml file at all inside the conf folder?
Adding this line should fix it for you:
<logger name="controllers" level="DEBUG" />
You can also override it in application.conf, although that will be deprecated in the future:
logger.controllers=DEBUG

Vertica-Tableau error: Multiple commands cannot be active

We have a dataset in Vertica, and Tableau queries the data (4 billion records) from Vertica for a dashboard, set up as described below.
All lists and graphs are separate worksheets in Tableau and use the same connection to the Vertica DB. Each list is a column in the DB, sorted in descending order of the count of items in that column of the dataset. The graphs are like the lists but calculated in a slightly different manner. Start Date and End Date give the date range for the data to be queried, like a data-connection filter that restricts the query to a fixed amount of data, for example the past week or the last month.
But I get this error:
[Vertica][VerticaDSII] (10) An error occurred during query preparation: Multiple commands cannot be active on the same connection. Consider increasing ResultBufferSize or fetching all results before initiating another command.
Is there any workaround for this issue, or a better way to do this?
You'll need a TDC file which specifies a particular ODBC connection-string option to get around the issue.
The guidance from Vertica was to add an ODBC Connect String parameter with the value “ResultBufferSize=0“. This apparently forces the result buffer to be unlimited, preventing the error. This is simple enough to accomplish when building a connection string manually or working with a DSN, but Vertica is one of Tableau’s native connectors. So how do you tell the native connector to do something else with its connection?
Native Connections in Tableau can be customized using TDC files
“Native connectors” still connect through the vendor’s ODBC drivers, and can be customized just the same as an “Other Databases” / ODBC connection. In the TDC files themselves, “ODBC” connections are referred to as “Generic ODBC”, which is a much more accurate way to think about the difference.
The full guide to TDC customizations, with all of the options, is available here, although it is pretty dense reading. One thing that isn't provided is an example of customizing a "native connector". The basic structure of a TDC file is this:
<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='genericodbc' enabled='true' version='7.7'>
    <vendor name='' />
    <driver name='' />
    <customizations>
    </customizations>
</connection-customization>
When using “Generic ODBC”, the class is “genericodbc” and then the vendor and driver name must be specified so that Tableau can know when the TDC file should be applied. It’s much simpler for a native connector — you just use the native connector name in all three places. The big list of native connector names is at the end of this article. Luckily for us, Vertica is simply referred to as “vertica”. So our Vertica TDC framework will look like:
<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='vertica' enabled='true' version='7.7'>
    <vendor name='vertica' />
    <driver name='vertica' />
    <customizations>
    </customizations>
</connection-customization>
This is a good start, but we need some actual customization tags to cause anything to happen. Per the documentation, to add additional elements to the ODBC connection string, we use a tag named 'odbc-connect-string-extras'. This would look like:
<customization name='odbc-connect-string-extras' value='ResultBufferSize=0;' />
One important thing we discovered was that all ODBC connection-string extras need to be in this single tag. Because we wanted to turn on load balancing in the Vertica cluster, there was a second recommended parameter: ConnectionLoadBalance=1. To get both of these parameters in place, the correct method is:
<customization name='odbc-connect-string-extras' value='ResultBufferSize=0;ConnectionLoadBalance=1;' />
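Putting the framework and the customizations together, the complete TDC file for Vertica would then look something like this (a sketch assembled from the fragments above):

<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='vertica' enabled='true' version='7.7'>
    <vendor name='vertica' />
    <driver name='vertica' />
    <customizations>
        <customization name='odbc-connect-string-extras' value='ResultBufferSize=0;ConnectionLoadBalance=1;' />
    </customizations>
</connection-customization>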
There is a whole set of other customizations you can put into place to see how they affect performance. Make sure you understand how each customization option is worded: if it starts with 'SUPPRESS', then giving a 'yes' value turns the feature off; other times you set the value to 'no' to turn the feature off. Some of the other ones we tried were:
<customization name='CAP_SUPPRESS_DISCOVERY_QUERIES' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPPRESS_PREPARED_QUERY' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPPRESS_SELECT_STAR' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPPRESS_EXECUTED_QUERY' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPRESS_SQLSTATISTICS_API' value='yes' />
<customization name='CAP_CREATE_TEMP_TABLES' value='no' />
<customization name='CAP_SELECT_INTO' value='no' />
<customization name='CAP_SELECT_TOP_INTO' value='no' />
The first set were mostly about reducing the number of queries for metadata detection, while the second set tell Tableau not to use TEMP tables.
The best way to see the results of these customizations is to change the TDC file and restart Tableau Desktop. Once you are satisfied with the changes, move the TDC file to your Tableau Server and restart it.
Where to put the TDC files
Per the documentation:
“For Tableau Desktop on Windows: Documents\My Tableau Repository\Datasources
For Tableau Server: Program Files\Tableau\Tableau Server\\bin
Note: The file must be saved using a .tdc extension, but the name does not matter.”
If you are running a Tableau Server cluster, the .tdc file must be placed in the bin folder on every worker node so that the vizqlserver process can find it. The biggest gotcha of all: edit these files with a real text editor such as Notepad++ or Sublime Text rather than Notepad, because Notepad likes to save files with a hidden .txt ending, and the TDC file will only be recognized if the extension is really .tdc, not .tdc.txt.
Restarting Tableau resolved my issue, which was giving the same error.

Railo REST component has no access to application variables

I've been playing with the REST functionality in Railo 4, with limited success...
I have two paths set up, the first works exactly as I'd expect but requests to the second seem to exist outside of the application...
I have two mappings:
/api and /admin/api
I created /admin/api first; I'm not sure if that makes a difference, but everything works fine there and not in /api.
Both contain a component with the following function:
<cffunction
    name="test"
    access="remote"
    returntype="struct"
    returnformat="json"
    output="true"
    httpmethod="GET">
    <cfscript>
        return Application;
    </cfscript>
</cffunction>
In my application I have a bunch of variables that are created in onApplicationStart. If I run this component from /admin/api they are all available to me; however, running the exact same component from /api, I don't get any of them!
I set up the /admin/api a while ago and have had it functioning with no problems - I'm wondering if I've missed a step when I set up /api...
Can anyone explain why I would have access to the application variables in one path, but not in the other?
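I can't be sure without seeing the directory layout, but one common cause is that Railo associates a REST request with the Application.cfc found in (or inherited by) the physical directory behind the REST mapping. If the folder behind /admin/api picks up your site's Application.cfc and the folder behind /api does not, the /api requests run in a separate, empty application. A hypothetical sketch of an Application.cfc you could place in the /api folder, reusing the same application name (the name "mySite" and the sample variable are placeholders):

component {
    // Must match this.name in the main site's Application.cfc so both
    // REST mappings share the same application scope.
    this.name = "mySite";

    public boolean function onApplicationStart() {
        // Example of a variable the REST components should then see.
        application.someSetting = "value set at application start";
        return true;
    }
}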

Force absolute URLs in TinyMCE [duplicate]

This has been driving me crazy for a few hours. I managed to fix it on my local development machine, and of course when I put it live it's not working.
Here is what I did in my Umbraco set up:
In Config/tinyMceConfig.config I added:
<config key="relative_urls">false</config>
<config key="convert_urls">false</config>
<config key="remove_script_host">false</config>
I also amended the JavaScript code in insertLink.aspx to set localUrl to blank, as this made it work on the dev machine. Does anyone know how to fix this really stupid bug on my live server?
Thanks
Unfortunately, you cannot override "remove_script_host", because it is hardcoded in umbraco.editorControls.tinyMCE3.TinyMCE:
Line 250: config.Add("remove_script_host", "true");
When you add a new value in the config file, you get "true,false" instead of "false", because a NameValueCollection is used.
As we know, when you add two items with the same key to a NameValueCollection, the result is a concatenation of the two values with a comma separator.
So don't waste your time trying to find out what's wrong with your "remove_script_host" configuration.
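To illustrate the NameValueCollection behaviour described above (a standalone sketch, not actual Umbraco code):

using System;
using System.Collections.Specialized;

class Demo
{
    static void Main()
    {
        var config = new NameValueCollection();
        config.Add("remove_script_host", "true");   // the hardcoded value
        config.Add("remove_script_host", "false");  // the value added in the config file

        // Adding the same key twice joins the values with a comma.
        Console.WriteLine(config["remove_script_host"]);  // prints "true,false"
    }
}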