Enable logging in Ingres

I need to understand how to enable logging in an Ingres stored procedure. I have read a lot about "printqry", DBMS server query tracing, and security auditing. My question is whether Ingres offers custom logging, where I can log my own messages, for example:
db.trace("The value for x is ", x)

You can use the MESSAGE statement to write an arbitrary message. The message can go to the current SESSION (meaning the calling program has to run INQUIRE_SQL to retrieve the text), to the security audit log, or to the errlog; I suspect the last of these would be most useful.
MESSAGE takes an optional error number and/or message text. If you want to write a message involving values other than a constant string, you'll need to build it in a variable first, e.g.
msg_txt = 'The value for x is ' + VARCHAR(:x);
MESSAGE :msg_txt WITH DESTINATION = (ERROR_LOG);
HTH

Related

Is there any way to check for duplicate message control ids (MSH:10) in the MSH segment using Mirth Connect?
MSH|^~&|sss|xxx|INSTANCE2|KKLIU 0063/2021|20190905162034||ADT^A28^ADT_A05|Zx20190905162034|P|2.4|||NE|NE|||||
Whenever a message arrives, it needs to be validated: has a message with control id Zx20190905162034 already been processed or not?
Mirth will not do this for you, but you can write your own JavaScript transformer to check a database or your own set of previously encountered control ids.
Your JavaScript can make use of any appropriate Java classes.
The database check (which you can implement using a code template) is the easier way out. You might want to designate the column storing MSH:10 values as a primary key, or define a unique index on it, so that lookups stay fast. Alternatives include reading all MSH:10 values already in the database into a global map variable each time the channel is redeployed, or maintaining them behind an API that you send a GET request to as each message is processed. Which option makes sense depends on the number of records involved.
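Since Mirth transformers run on the JVM and can call arbitrary Java classes, a sketch of the database check in Java might look like the following. The table processed_messages, its control_id column, and the PostgreSQL ON CONFLICT syntax are assumptions to adapt to your own schema and DBMS; the key point is that a UNIQUE or PRIMARY KEY constraint on the control id turns the duplicate test into a single cheap insert.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Hypothetical helper callable from a Mirth JavaScript transformer.
// Relies on a UNIQUE / PRIMARY KEY constraint on processed_messages.control_id.
public class ControlIdRegistry {
    private final String url;       // e.g. "jdbc:postgresql://localhost/dedup" (assumed)
    private final String user;
    private final String password;

    public ControlIdRegistry(String url, String user, String password) {
        this.url = url;
        this.user = user;
        this.password = password;
    }

    // Returns true if the control id (MSH:10) was newly recorded,
    // false if it was already present, i.e. the message is a duplicate.
    public boolean recordIfNew(String controlId) throws SQLException {
        String sql = "INSERT INTO processed_messages (control_id) VALUES (?) "
                   + "ON CONFLICT (control_id) DO NOTHING";  // PostgreSQL syntax; adjust per DBMS
        try (Connection con = DriverManager.getConnection(url, user, password);
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, controlId);
            return ps.executeUpdate() == 1;  // 0 rows inserted => duplicate
        }
    }
}

From the transformer you would then call recordIfNew() with the MSH:10 value and filter out the message when it returns false.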

PostgreSQL JDBC driver hide prepared statement parameters from logging

I need to hide prepared statement parameters from debug-level logging and from exception messages, because they contain security-critical values. For example, when using pgp_sym_encrypt and an exception is thrown from the database, the exception message shows the full statement with its parameters, including the second parameter, which is the encryption key password.
Is there any way to hide these kinds of values, especially in exception messages?
The safest way is to do the encryption on the client side and never send the password to the database. Once you send it to the database, it will be very hard to absolutely control what happens to it. Consider that if there is a way to configure the database to suppress this logging, then there is also a way to reverse that configuration.
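For illustration, here is a minimal client-side sketch in Java using AES-GCM from javax.crypto in place of pgp_sym_encrypt; the secrets table and its columns are invented for the example. Because only ciphertext bytes are bound as statement parameters, neither the key nor the plaintext can appear in the statement text that the driver or server might log or echo back in an exception.

import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class ClientSideEncryption {
    private static final SecureRandom RANDOM = new SecureRandom();

    // Generate a key that never leaves the client (in practice you would load it
    // from a key store or secrets manager rather than creating it ad hoc).
    static SecretKey newKey() throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        return kg.generateKey();
    }

    // AES-GCM encrypt; the 12-byte IV is prepended to the ciphertext for later decryption.
    static byte[] encrypt(SecretKey key, byte[] plaintext) throws Exception {
        byte[] iv = new byte[12];
        RANDOM.nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(plaintext);
        byte[] out = new byte[iv.length + ciphertext.length];
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ciphertext, 0, out, iv.length, ciphertext.length);
        return out;
    }

    // Only ciphertext is sent to PostgreSQL; the key and plaintext stay in the JVM.
    static void store(Connection con, SecretKey key, long id, String secret) throws Exception {
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO secrets (id, payload) VALUES (?, ?)")) {
            ps.setLong(1, id);
            ps.setBytes(2, encrypt(key, secret.getBytes(StandardCharsets.UTF_8)));
            ps.executeUpdate();
        }
    }
}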

Mule: after delivering a message, save the current timestamp for later use. What's the correct idiom?

I'm connecting to a third-party web service to retrieve rows from the underlying database. I can optionally pass a parameter like this:
http://server.com/resource?createdAfter=[yyyy-MM-dd hh:ss]
to get only the rows created after a given date.
This means I have to store the current timestamp (using #[function:datestamp:...], no problem) in one message scope and then retrieve it in another.
It also implies the timestamp should be preserved in case of an outage.
Obviously, I could use a subflow containing a file endpoint, saving to a designated file path. But, intuitively, based on my (very!) limited experience, that feels hackish.
What's the correct idiom to solve this?
Thanks!
The Object Store Module is designed just for that: to allow you to save bits of information from your flows.
See:
http://mulesoft.github.io/mule-module-objectstore/mule/objectstore-config.html
https://github.com/mulesoft/mule-module-objectstore/

Mirth losing data in mapper variables

I have a Database Reader channel set up that polls the database at 10-second intervals and sends to a web service just fine. We get a valid response from the web service.
However, I need to update the database record so that it is flagged as having been processed; in this case we are simply changing a field from 100 to 101. But when I try to update the field, or send an email containing any data that has been stored in mapper variables, I get nothing: the database does not update, and the emails contain blanks for those fields.
When I go into the channel messages for processed messages I can see good data in the Raw Message and Encoded Message tabs. There are no values in the Mappings tab.
Any suggestions on troubleshooting?
The Run-on-Update statement does not have access to the channel map, as it runs after message encoding (and even after the post-processor, I believe).
It DOES have access to the globalChannelMap and the responseMap. Put your new ID in the globalChannelMap and you should be good to go.
If you also want to send an email, I would recommend instead adding an SMTP Writer destination, which will have access to any channel map variables created in 'Destination 1', as well as to the globalChannelMap.

quickfixengine: possible to restrict logging?

In QuickFIX Engine, is there a setting to specify the log level and restrict the number of messages logged? It seems that we are logging a lot of data, so we would like to restrict it a bit. I assume that logging too many messages affects performance (I don't have any hard data for or against).
You don't say which language you're using but I believe that this should work with both the C++ and Java APIs.
You will need to implement your own LogFactory and Log classes (the former is responsible for creating instances of the latter). Then you'll pass an instance of your custom LogFactory to your Initiator or Acceptor instance. Your Log class is where you will do the message filtering.
Understand that Log receives messages in string form, so you'll need to filter either with string-matching operations or by converting the strings back to Messages and then filtering on tags, though the latter may end up slowing you down more than just allowing all messages to be logged.
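As a sketch, a filtering LogFactory/Log pair for QuickFIX/J might look like the following (method names follow the quickfix.Log interface in recent versions; older releases also declare a deprecated no-arg create() on LogFactory, so adjust to your version). Here heartbeats (35=0) are dropped by a simple string match and everything else is delegated to whichever log you wrap.

import quickfix.Log;
import quickfix.LogFactory;
import quickfix.SessionID;

public class FilteringLogFactory implements LogFactory {
    private final LogFactory delegate;

    public FilteringLogFactory(LogFactory delegate) {
        this.delegate = delegate;
    }

    @Override
    public Log create(SessionID sessionID) {
        return new FilteringLog(delegate.create(sessionID));
    }

    private static final class FilteringLog implements Log {
        private static final String SOH = "\u0001";
        private final Log inner;

        FilteringLog(Log inner) {
            this.inner = inner;
        }

        // Crude string match on MsgType=Heartbeat; parsing back into a Message and
        // filtering on tags is also possible but costs more per message.
        private boolean isHeartbeat(String fixMessage) {
            return fixMessage.contains(SOH + "35=0" + SOH);
        }

        @Override
        public void onIncoming(String message) {
            if (!isHeartbeat(message)) inner.onIncoming(message);
        }

        @Override
        public void onOutgoing(String message) {
            if (!isHeartbeat(message)) inner.onOutgoing(message);
        }

        @Override
        public void onEvent(String text) {
            inner.onEvent(text);
        }

        @Override
        public void onErrorEvent(String text) {
            inner.onErrorEvent(text);
        }

        @Override
        public void clear() {
            inner.clear();
        }
    }
}

You would then pass, for example, new FilteringLogFactory(new FileLogFactory(settings)) to your SocketInitiator or SocketAcceptor constructor in place of the original log factory.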