log4c does not work in service mode

I'm new to Linux C programming, and I'm trying to write a service that runs on a CentOS host, with log4c as my logging module.
Everything works just fine during development and debugging, but when I finally install and run it as a service, no logs are ever created.
If I start it from a shell instead, the logging works perfectly.
The script that starts the service is simply:
The script starting the service is simply:
daemon MyService -c 0
If I start it by typing
./MyService -c 0
the logs show up.
Please help: did I get anything wrong somewhere?
Here is the relevant code:
//----------------------wrapper.h------------------
#define SL_LOG_TRACE(cat, fmt, args...) do { \
    const log4c_location_info_t locinfo = LOG4C_LOCATION_INFO_INITIALIZER(NULL); \
    log4c_category_log_locinfo(cat, &locinfo, LOG4C_PRIORITY_TRACE, fmt, ##args); \
} while (0)
//----------------------main.c----------------------
void SomeFunc()
{
    ...
    SL_LOG_TRACE(g_cat, "some logs");
    ...
}
//----------------log4c resource--------------
<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE log4c SYSTEM "">
<log4c version="1.2.1">
<config>
<bufsize>0</bufsize>
<debug level="2"/>
<nocleanup>0</nocleanup>
<reread>1</reread>
</config>
<category name="FileLogger" priority="trace" appender="myrollingfileappender"/>
<appender name="myrollingfileappender" type="rollingfile" logdir="." prefix="sl_log" layout="dated" rollingpolicy="myrollingpolicy"/>
<rollingpolicy name="myrollingpolicy" type="sizewin" maxsize="102400" maxnum="10"/>
</log4c>
//------------------LOG4C_RCPATH="/somedir/log"---------------------
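A likely cause (assuming the service is launched via the stock CentOS init-script daemon function) is that a service started at boot does not inherit the interactive shell's environment or working directory: LOG4C_RCPATH is unset, so log4c never finds its config file, and the relative logdir="." in the appender resolves to whatever directory the daemon happens to run in. A minimal sketch of an init-script fix, with the /somedir/log path taken from the question:
# /etc/init.d/MyService (sketch): export the log4c config path and
# start from a directory the service user can write to
export LOG4C_RCPATH="/somedir/log"
cd /somedir/log
daemon MyService -c 0
Using an absolute path in the appender (e.g. logdir="/somedir/log") instead of "." also removes the working-directory dependency entirely.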

Related

Liquibase 'includeAll' tag generates 2 rows in the databasechangelog table for the same changeset

I am using Liquibase version 4.0.0 to deploy DB migration scripts in PostgreSQL.
I use a master changelog file, which looks like the following.
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd">
<includeAll path="4.460.7" relativeToChangelogFile="true"/>
</databaseChangeLog>
My project structure looks like the following.
I use a docker container to run liquibase update in my Jenkinsfile as the following.
docker run --rm -v /home/jenkins/workspace/mate_DB_Migration_Scripts_master:/liquibase/changelog liquibase/liquibase --url="jdbc:postgresql://host:5432/postgres?currentSchema=schema1" --changeLogFile=../liquibase/changelog/changelog.xml --username=postgres --password=some_password update
docker run --rm -v /home/jenkins/workspace/mate_DB_Migration_Scripts_master:/liquibase/changelog liquibase/liquibase --url="jdbc:postgresql://host:5432/postgres?currentSchema=schema2" --changeLogFile=../liquibase/changelog/changelog.xml --username=postgres --password=some_password update
It runs the update just fine. The issue is that I can see 2 rows in the databasechangelog table for the same changeset.
Does anyone know why this happens? Please let me know if you want any other information to resolve this.
It looks like there is an issue with the Liquibase Docker image.
More specifically with the combination of v4.0.0 and the "includeAll" tag.
As a workaround you can try the "include" tag instead, but you will have to include every single file, like this:
<include file="path/to/<filename>.sql" relativeToChangelogFile="true"/>
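For example, the master changelog from the question would turn into something like this (the two file names under 4.460.7 are hypothetical placeholders for your actual scripts):
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd">
<include file="4.460.7/001-create-tables.sql" relativeToChangelogFile="true"/>
<include file="4.460.7/002-seed-data.sql" relativeToChangelogFile="true"/>
</databaseChangeLog>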
You can also try the sqlFile tag like this:
<changeSet author="SteveZ" id="external-sql+rollback-script-example" context="QA" labels="Jira1000">
<sqlFile dbms="mysql" splitStatements="true" endDelimiter="//" stripComments="true" path="objects/function/myFunction.sql"/>
<rollback>
<sqlFile dbms="mysql" splitStatements="true" endDelimiter="//" stripComments="true" path="objects/function/myFunction_rollback.sql"/>
</rollback>
</changeSet>

XSLT streaming not streaming

I'm using Saxon-EE for streaming XSLT transformation of large XML files. The transformation works fine, but it seems it's not really streaming, since the java.exe process keeps growing: for a 100 MB XML file, process memory increases by ~1 GB. This is the XSLT:
<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="3.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
xmlns:fo="http://www.w3.org/1999/XSL/Format"
xmlns:bb="urn:xx-zz-1.1"
xmlns:aa="urn:xx-yy-1.1">
<xsl:mode streamable="yes"/>
<xsl:output method="text" omit-xml-declaration="yes" indent="no"/>
<xsl:template match="/">
<xsl:for-each select="aa:LevelOne/aa:LevelTwo">
<xsl:iterate select="bb:LevelThree! copy-of(.)">
<xsl:value-of select="concat(bb:fieldOne,',',bb:fieldTwo,'
')"/>
</xsl:iterate>
</xsl:for-each>
</xsl:template>
</xsl:stylesheet>
This is the XML:
<?xml version="1.0" encoding="utf-8"?>
<aa:LevelOne xmlns="urn:xx-zz-1.1" xmlns:aa="urn:xx-yy-1.1">
<aa:LevelTwo xmlns="urn:xx-zz-1.1" xmlns:aa="urn:xx-yy-1.1">
<LevelThree xmlns="urn:xx-zz-1.1">
<fieldOne>f1</fieldOne>
<fieldTwo>f2</fieldTwo>
</LevelThree>
<!-- Level three is repeated many times -->
</aa:LevelTwo>
</aa:LevelOne>
I would like to know whether there is a problem with the XSLT above, and if so, what it is.
The code I use:
net.sf.saxon.s9api.Processor processor = new net.sf.saxon.s9api.Processor(true);
processor.setConfigurationProperty(Feature.STREAMABILITY, "standard");
XsltCompiler compiler = processor.newXsltCompiler();
XsltExecutable stylesheet = compiler.compile(new StreamSource(stylesheetFile));
Serializer out = processor.newSerializer(outputCsvFile);
Xslt30Transformer transformer = stylesheet.load30();
transformer.applyTemplates(new StreamSource(xmlFile), out);
EDIT: Fixed the XSLT so it compiles & added XML example.
Remark: running java -cp "<path>\test;<path>\saxon9ee.jar" com.example.test.Test -t does not output additional info (only the printlns in the code).
Running java -cp "<path>\test;<path>\saxon9ee.jar" -t com.example.test.Test outputs: Unrecognized option: -Xt / Error: Could not create the Java Virtual Machine.
If I change the XSLT to a non-streamable rule (e.g. remove the xsl:iterate line), the program outputs "Template rule is not streamable", also without the -t option.
In that case, if I remove the streamability requirement from the code/XSLT, the error goes away.
Thanks.
Probably the most likely reason Saxon would fall back to non-streaming mode is that it hasn't located a Saxon-EE license. The easiest way to test that is (unintuitively!) by calling processor.isSchemaAware() - that will only be true if you're running Saxon-EE code with a recognized license, which is exactly the same condition to enable streaming.
If it hasn't found a license, the Saxon documentation includes a section on troubleshooting license problems at http://www.saxonica.com/documentation/index.html#!about/license
Also, try it from the command line with option -t; that will give you more information (a) about streaming, and (b) about loading of license files.
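A minimal sketch of the isSchemaAware() check mentioned above, using the same s9api classes as the code in the question:
net.sf.saxon.s9api.Processor processor = new net.sf.saxon.s9api.Processor(true);
// true only when Saxon-EE code is running with a recognized license,
// which is the same condition that enables streaming
System.out.println("EE license found: " + processor.isSchemaAware());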
I think, if the data is as simple and as regular as shown in the question, then you can avoid the use of copy-of() and simply use
<xsl:iterate select="bb:LevelThree">
<xsl:value-of select="bb:*" separator=","/>
<xsl:text>
</xsl:text>
</xsl:iterate>
That in a quick test shows a reduced memory consumption compared to your posted approach.
As for your posted approach not using streaming with Saxon EE 9.9: I have tested the posted XSLT and the input sample with Saxon 9.9 EE from the command line with the -t option, and it shows the input is streamed.
I also think the Java code shown is fine to process the file with streaming with Saxon EE.
For a detailed analysis of the memory consumption and any problems you encounter with that it might be better to raise an issue with all details on the Saxonica support site. I am not sure how the memory info Saxon outputs relates exactly to the one you say you see for java.exe.

Error uploading a file to NextCloud via API

I'm trying to upload a file:
curl -X PUT -u "my_username:pass123" "https://nextcloud.my_domain.com/remote.php/webdav/Shared/dir1/" --data-binary @"/Users/user1/test1.png"
Error:
<?xml version="1.0" encoding="utf-8"?>
<d:error xmlns:d="DAV:" xmlns:s="http://sabredav.org/ns">
<s:exception>Sabre\DAV\Exception\Conflict</s:exception>
<s:message>PUT is not allowed on non-files.</s:message>
</d:error>
Why?
The credentials I'm using are the same ones I use to log in via the browser.
The PUT request needs to refer to the actual file you want to create; right now you are pointing to a directory.
So instead of:
https://nextcloud.my_domain.com/remote.php/webdav/Shared/dir1/
Use:
https://nextcloud.my_domain.com/remote.php/webdav/Shared/dir1/test1.png
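Putting it together, the full upload command from the question would become:
curl -X PUT -u "my_username:pass123" "https://nextcloud.my_domain.com/remote.php/webdav/Shared/dir1/test1.png" --data-binary @"/Users/user1/test1.png"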

Change VMConfig file using azure powershell

I have a vmConfig file. I want to change the subnet and IP address so that I can create a new VM from the config file in a new subnet; none of the other configuration needs to change. I can manually edit the XML file content, but I want to do it through PowerShell so that the whole process can be automated.
Here is the sample vmConfig XML:
<?xml version="1.0" encoding="utf-8"?>
<PersistentVM xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<ConfigurationSets>
<ConfigurationSet xsi:type="NetworkConfigurationSet">
<ConfigurationSetType>NetworkConfiguration</ConfigurationSetType>
<InputEndpoints>
<InputEndpoint>
<LocalPort>5986</LocalPort>
<Name>PowerShell</Name>
<Port>64929</Port>
<Protocol>tcp</Protocol>
<Vip>191.237.20.225</Vip>
<EnableDirectServerReturn>false</EnableDirectServerReturn>
<IdleTimeoutInMinutes xsi:nil="true" />
</InputEndpoint>
</InputEndpoints>
<SubnetNames>
<string>mysubnet</string>
</SubnetNames>
<StaticVirtualNetworkIPAddress>12.13.14.15</StaticVirtualNetworkIPAddress>
<PublicIPs />
<NetworkInterfaces />
I am only interested in changing the IP address and the subnet.
This is basically XML parsing using PowerShell. I hope this works for you:
$path = 'C:\myFolder\XmlVM.xml'
# Load the VM config file as an XML document
[xml]$myXML = Get-Content $path
# Overwrite the subnet name and static IP, then save the file in place
$myXML.PersistentVM.ConfigurationSets.ConfigurationSet.SubnetNames.string="MYNEWSUBNET"
$myXML.PersistentVM.ConfigurationSets.ConfigurationSet.StaticVirtualNetworkIPAddress="10.11.14.115"
$myXML.Save($path)
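If the config file ever contains more than one ConfigurationSet, a sketch like the following (property names taken from the XML above, new values hypothetical) picks out the network one explicitly before editing it:
# Select the network configuration set by its type, then update it
$netCfg = $myXML.PersistentVM.ConfigurationSets.ConfigurationSet |
    Where-Object { $_.ConfigurationSetType -eq 'NetworkConfiguration' }
$netCfg.SubnetNames.string = 'MYNEWSUBNET'
$netCfg.StaticVirtualNetworkIPAddress = '10.11.14.115'
$myXML.Save($path)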

using curl to interact with REST API - error: "Content not allowed in prolog"

I am sending the following curl command from terminal in Mac OSX:
curl -d "OPERATION_NAME=ADD_REQUEST&TECHNICIAN_KEY=1AD….4&INPUT_DATA=TestData.xml" http://xxx.xx.xx.xx/sdpapi/request/
I am getting back the response:
FailedError when performing - ADD_REQUEST - Content is not allowed in prolog.
Here is my xml file:
<?xml version="1.0" encoding="UTF-8"?>
<Operation>
<Details>
<requester>Me</requester>
<subject>Test</subject>
<description>Testing curl input</description>
</Details>
</Operation>
I have checked, and my XML file is indeed a UTF-8 file. I can tell from Google searches that this most likely has to do with my encoding; however, I can't find out how to fix it.
I've also tried saving the XML file as ANSI on my PC; the XML file then looks like this:
<?xml version="1.0"?>
<Operation>
<Details>
<requester>Me</requester>
<subject>Test</subject>
<description>Testing curl input</description>
</Details>
</Operation>
I downloaded Notepad++ and checked the encoding, which is UTF-8 with no BOM.
I am still getting the same error. Can anyone see what I am doing wrong?
Updated 9/19 to add:
In addition to everything I've tried in the comments below, I've also tried this:
curl -d "OPERATION_NAME=ADD_REQUEST&TECHNICIAN_KEY=xxxxxxxxxxxxxxxxx&INPUT_DATA=<?xml version="1.0" encoding="utf-8"?><Operation><Details><requester>Me</requester><subject>Test</subject><description>Testing curl input</description></Details></Operation>" http://xxx.xx.xx.xx/sdpapi/request/
The error I'm getting now is: "Error when performing - ADD_REQUEST - The value following "version" in the XML declaration must be a quoted string."
Anyone have any thoughts?
I replaced <?xml version="1.0" encoding="utf-8"?> with <?xml version=%221.0%22 encoding=%22utf-8%22?>.
It is now working.
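That makes sense: since the whole -d payload is already wrapped in double quotes, the shell was stripping the inner quotes around "1.0" and "utf-8" before curl ever saw them, which matches the server's complaint that the value following "version" was not a quoted string. Percent-encoding the quotes as %22 lets them survive the shell and be decoded on the server side. Assembled from the pieces in the question, the full working command would look like this:
curl -d "OPERATION_NAME=ADD_REQUEST&TECHNICIAN_KEY=xxxxxxxxxxxxxxxxx&INPUT_DATA=<?xml version=%221.0%22 encoding=%22utf-8%22?><Operation><Details><requester>Me</requester><subject>Test</subject><description>Testing curl input</description></Details></Operation>" http://xxx.xx.xx.xx/sdpapi/request/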