neo4j using transactionListener causes read/write error - plugins

I'm trying to use TransactionEventListener in neo4j. There don't seem to be any lifecycle hooks for plugins, so I figure the only way to do it is to have a plugin procedure do it. However, trying to do that gives me this error:
Neo4jError: Writing in read access mode not allowed. Attempted write to internal graph 1 (system)
The procedure uses write mode, even though I'm not actually writing anything to the database; I'm just registering that TransactionEventListener, although that could indeed lead to writes later on. Still, I do have write mode, yet I get the error.
Here's my procedure:
@Procedure(name = "setTransactionListener", mode = Mode.WRITE)
public Stream<BuiltInProcedures.NodeResult> setTaxonomy(
        @Name("taxonomy") Map<String, Map<String, Object>[]> taxonomy
) {
    var managementService = new DatabaseManagementServiceBuilder(Path.of(".")).build();
    var listener = new ValidationTransactionListener(taxonomy);
    managementService.registerTransactionEventListener(db.databaseName(), listener);
    return null;
}
Best guess is that I'm not supposed to register a transaction listener this way. But if not this way, then how? There don't seem to be any lifecycle hooks that get called when the database starts, so how can I possibly register a TransactionEventListener?
Or is there a way I can give myself permission to do this?

What do you actually want to do?
It doesn't work like that; you need to register the listener in the database lifecycle within a KernelExtensionFactory.
See here for an example:
https://github.com/neo4j/apoc/blob/dev/common/src/main/java/apoc/ApocExtensionFactory.java#L53
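To make the idea concrete, here is a minimal sketch of such an extension factory, modelled on the APOC example above. It assumes Neo4j 4.x+ APIs; ValidationExtensionFactory and loadTaxonomy() are made-up names, and it assumes your ValidationTransactionListener can be constructed at startup (e.g. with a taxonomy loaded from configuration or disk) rather than from a procedure argument:
import org.neo4j.annotations.service.ServiceProvider;
import org.neo4j.dbms.api.DatabaseManagementService;
import org.neo4j.kernel.extension.ExtensionFactory;
import org.neo4j.kernel.extension.ExtensionType;
import org.neo4j.kernel.extension.context.ExtensionContext;
import org.neo4j.kernel.internal.GraphDatabaseAPI;
import org.neo4j.kernel.lifecycle.Lifecycle;
import org.neo4j.kernel.lifecycle.LifecycleAdapter;

@ServiceProvider
public class ValidationExtensionFactory extends ExtensionFactory<ValidationExtensionFactory.Dependencies> {

    public interface Dependencies {
        GraphDatabaseAPI graphDatabaseAPI();
        DatabaseManagementService databaseManagementService();
    }

    public ValidationExtensionFactory() {
        super(ExtensionType.DATABASE, "validationTransactionListener");
    }

    @Override
    public Lifecycle newInstance(ExtensionContext context, Dependencies dependencies) {
        return new LifecycleAdapter() {
            private ValidationTransactionListener listener;

            @Override
            public void start() {
                // Register the listener against the already-running DBMS instead of
                // building a new DatabaseManagementService inside a procedure.
                // loadTaxonomy() is a placeholder for however you obtain the taxonomy at startup.
                listener = new ValidationTransactionListener(loadTaxonomy());
                dependencies.databaseManagementService().registerTransactionEventListener(
                        dependencies.graphDatabaseAPI().databaseName(), listener);
            }

            @Override
            public void stop() {
                if (listener != null) {
                    dependencies.databaseManagementService().unregisterTransactionEventListener(
                            dependencies.graphDatabaseAPI().databaseName(), listener);
                }
            }
        };
    }
}
The factory is picked up via the Java ServiceLoader, so it typically also needs a META-INF/services/org.neo4j.kernel.extension.ExtensionFactory entry (the @ServiceProvider annotation processor can generate it). APOC additionally waits for database availability via an AvailabilityListener before registering; a production implementation may want to do the same.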

Related

How to make "compute()" function insert data to sqlite while in isolated process?

I'm working on a Flutter app that uses PHP APIs for the server and sqlite for local data.
The problem is with compute().
Here is the explanation:
I have three functions that receive data from the API on the server and then add the data to a table in my local database (sqlite).
First, a function to get data from the server:
Future<List<Map<String, dynamic>>> getServerData(int vers) async {
  // my code
}
Second function to insert data into local database:
Future<int> addNewData(List<Map<String, dynamic>> data) async {
  // my code
}
Third function to call the first and second function:
Future<bool> checkServerData(int vers) async {
  List<Map<String, dynamic>> sdt = await getServerData(vers);
  int res = await addNewData(sdt);
  return res > 0;
}
I want to call the third function in a compute function:
compute(checkServerData, 2);
When I did that, I got this error:
Null check operator used on a null value.
Note:
If I use it without touching the local database, it works fine.
The error appears when I call the database to insert the data.
When I searched about this issue, I found that it's not allowed to access resources created in one isolate from another isolate. But I didn't understand exactly how to resolve it, or what other approach achieves the same idea.
After searching about this issue, I found these workaround solutions:
1: If the work really needs to run in the background, you can use the classes and functions of the Isolate package, which let isolates share data by sending and receiving messages. It's a bit complex for beginners in Flutter and Dart, except for those who already know about threading in other environments (a small sketch follows at the end of this answer).
To know more about that, here are some links:
Flutter and pub documentation:
https://api.flutter.dev/flutter/dart-isolate/dart-isolate-library.html
https://api.flutter.dev/flutter/dart-isolate/Isolate-class.html
https://pub.dev/packages/flutter_isolate
This is an example on the medium.com website:
https://medium.com/flutter-community/thread-and-isolate-with-flutter-30b9631137f3
2: If the work doesn't have to run in the background, use the traditional approaches such as FutureBuilder or async/await.
You can read more about them here:
https://www.woolha.com/tutorials/flutter-using-futurebuilder-widget-examples
https://dart.dev/codelabs/async-await
and you can review the question and answers in "When should I use a FutureBuilder?".
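For the first option, here is a minimal sketch of the message-passing idea, reusing the getServerData and addNewData functions from the question. The worker isolate only fetches the data and sends it back; the insert stays on the main isolate, which owns the sqlite connection:
import 'dart:isolate';

// Entry point for the background isolate: it must be a top-level (or static) function.
// It only does the network work and sends the result back as a message.
Future<void> fetchWorker(List<Object> args) async {
  final SendPort sendPort = args[0] as SendPort;
  final int vers = args[1] as int;
  final data = await getServerData(vers); // your existing API call
  sendPort.send(data); // lists/maps of simple values can be sent between isolates
}

Future<bool> checkServerDataIsolated(int vers) async {
  final receivePort = ReceivePort();
  await Isolate.spawn(fetchWorker, [receivePort.sendPort, vers]);
  final data = (await receivePort.first as List).cast<Map<String, dynamic>>().toList();
  // Insert on the main isolate, where the sqlite database was opened.
  final res = await addNewData(data);
  return res > 0;
}
Plugins that go through platform channels (such as sqflite) generally cannot be used from a plain background isolate, which is why the database work stays on the main isolate here; flutter_isolate (linked above) is one way around that limitation.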

How to set the offset.commit.policy to AlwaysCommitOffsetPolicy in debezium?

I created a Debezium embedded engine to capture MySQL change data. I want to commit the offsets as soon as I can. In the code, the config is created including the following.
.with("offset.commit.policy",OffsetCommitPolicy.AlwaysCommitOffsetPolicy.class.getName())
Running this throws: java.lang.NoSuchMethodException: io.debezium.embedded.spi.OffsetCommitPolicy$AlwaysCommitOffsetPolicy.<init>(io.debezium.config.Configuration)
However, when I start the embedded engine with
.with("offset.commit.policy", OffsetCommitPolicy.PeriodicCommitOffsetPolicy.class.getName())
it works fine.
Note that the class OffsetCommitPolicy.PeriodicCommitOffsetPolicy constructor includes the config parameter while OffsetCommitPolicy.AlwaysCommitOffsetPolicy doesn't.
public PeriodicCommitOffsetPolicy(Configuration config) {
...
}
How to get the debezium embedded engine to use its AlwaysCommitOffsetPolicy?
Thanks for the report. This is partly a bug (which we would appreciate you logging in our Jira). You can work around it by calling a dedicated method on the embedded engine builder, e.g. `io.debezium.embedded.EmbeddedEngine.create().with(OffsetCommitPolicy.always())`.
Tested with version 1.4.0.Final:
new EmbeddedEngine.BuilderImpl() // create the builder
    .using(config) // regular config
    .using(OffsetCommitPolicy.always()) // explicit commit policy
    .notifying(this::handleEvent) // event processor
    .build(); // and finally build!
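As a usage sketch (a fragment, not a complete program): once the engine is built with the explicit policy object, it can be handed to an executor, since the embedded engine is a Runnable. The config object and handleEvent method are assumed to exist in your code, and the OffsetCommitPolicy package moved between releases, so adjust the import to your version.
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import io.debezium.embedded.EmbeddedEngine;
import io.debezium.embedded.spi.OffsetCommitPolicy; // or io.debezium.engine.spi.OffsetCommitPolicy in newer releases

// Build the engine with the policy object instead of the offset.commit.policy
// string property, which is what triggers the NoSuchMethodException.
EmbeddedEngine engine = (EmbeddedEngine) new EmbeddedEngine.BuilderImpl()
        .using(config)                      // your io.debezium.config.Configuration
        .using(OffsetCommitPolicy.always()) // commit offsets after every processed batch
        .notifying(this::handleEvent)       // your record consumer
        .build();

// The engine is a Runnable: run it on its own thread and stop it on shutdown.
ExecutorService executor = Executors.newSingleThreadExecutor();
executor.execute(engine);
Runtime.getRuntime().addShutdownHook(new Thread(engine::stop));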

SAPUI5 batch submit returns error

I am using the following code, in an attempt to batch upload the changes made on a table:
onConfirmActionPressed: function() {
var oModel = this.getModel();
oModel.setUseBatch(true);
oModel.submitChanges();
}
I am using setProperty() to set the new values, like this:
onSingleSwitchChange: function(oControlEvent) {
var oModel = this.getView().getModel();
var rowBindingContext = oControlEvent.getSource().getBindingContext();
oModel.setProperty(rowBindingContext.sPath + "/Zlspr", "A");
}
When onConfirmActionPressed is executed, I get a server error, saying that "Commit work during changeset processing not allowed" on SAP R3.
When I upload the lines of the table one-by-one, it works fine. However, uploading this way is very slow, and in some cases it takes more than 10 minutes for the process to complete.
Am I doing something wrong while batch submitting? Is there a chance the issue is due to server (R3) misconfiguration?
You need to override methods:
/IWBEP/IF_MGW_APPL_SRV_RUNTIME~CHANGESET_BEGIN
/IWBEP/IF_MGW_APPL_SRV_RUNTIME~CHANGESET_END
Keep track of errors across all calls to the update methods, and if everything went OK, perform the database commit in CHANGESET_END.
edit:
To clarify:
In your Data Provider Class Extension in SAP Gateway you need to find your YOURENTITY_UPDATE_ENTITY method and get rid of any COMMIT WORK statements.
Then you need to redefine the /IWBEP/IF_MGW_APPL_SRV_RUNTIME~CHANGESET_BEGIN method, which is fired before any changeset operation. There you could clear a class attribute such as a table mt_batch_errors.
When you post batch changes from UI5 using oModel.submitChanges(), all the single changes to entities are directed to the appropriate ..._UPDATE_ENTITY methods. You need to keep track of any errors that occur and, if any do, fill your mt_batch_errors table.
After all entities have been updated, the /IWBEP/IF_MGW_APPL_SRV_RUNTIME~CHANGESET_END method is fired, in which you can check the mt_batch_errors table for any errors that occurred during the batch. If there were errors you should probably ROLLBACK WORK, and if not you are free to COMMIT WORK.
That is just an example of how it could be done; I'm curious about other suggestions.
Good luck!
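For illustration only, a rough ABAP sketch of those two redefinitions, assuming a class attribute mt_batch_errors that your *_UPDATE_ENTITY methods fill when something fails (exception handling simplified; parameter details may differ between Gateway releases):
METHOD /iwbep/if_mgw_appl_srv_runtime~changeset_begin.
  " Fired once before the operations of the $batch changeset are processed.
  CLEAR mt_batch_errors.
ENDMETHOD.

METHOD /iwbep/if_mgw_appl_srv_runtime~changeset_end.
  " Fired once after all *_UPDATE_ENTITY calls of the changeset have run.
  IF mt_batch_errors IS INITIAL.
    COMMIT WORK.
  ELSE.
    ROLLBACK WORK.
    RAISE EXCEPTION TYPE /iwbep/cx_mgw_busi_exception.
  ENDIF.
ENDMETHOD.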

Disable logging on FileConfigurationSourceChanged - LogEnabledFilter

I want Administrators to enable/disable logging at runtime by changing the enabled property of the LogEnabledFilter in the config.
There are several threads on SO that explain workarounds, but I want it this way.
I tried to change the Logging Enabled Filter like this:
private static void FileConfigurationSourceChanged(object sender, ConfigurationSourceChangedEventArgs e)
{
var fcs = sender as FileConfigurationSource;
System.Diagnostics.Debug.WriteLine("----------- FileConfigurationSourceChanged called --------");
LoggingSettings currentLogSettings = e.ConfigurationSource.GetSection("loggingConfiguration") as LoggingSettings;
var fdtl = currentLogSettings.TraceListeners.Where(tld => tld is FormattedDatabaseTraceListenerData).FirstOrDefault();
var currentLogFileFilter = currentLogSettings.LogFilters.Where(lfd => { return lfd.Name == "Logging Enabled Filter"; }).FirstOrDefault();
var filterNewValue = (bool)currentLogFileFilter.ElementInformation.Properties["enabled"].Value;
var runtimeFilter = Logger.Writer.GetFilter<LogEnabledFilter>("Logging Enabled Filter");
runtimeFilter.Enabled = filterNewValue;
var test = Logger.Writer.IsLoggingEnabled();
}
But test always shows the initially loaded config value; it does not change.
I thought that when I change the value in the config file, the change would be propagated automatically to the runtime configuration. But this isn't the case!
Setting it programmatically as shown in the code above, doesn't work either.
It's time to rebuild Enterprise Library or shut it down.
You are right that the code you posted does not work. That code is using a config file (FileConfigurationSource) as the method to configure Enterprise Library.
Let's dig a bit deeper and see if programmatic configuration will work.
We will use the Fluent API since it is the preferred method for programmatic configuration:
var builder = new ConfigurationSourceBuilder();
builder.ConfigureLogging()
.WithOptions
.DoNotRevertImpersonation()
.FilterEnableOrDisable("EnableOrDisable").Enable()
.LogToCategoryNamed("General")
.WithOptions.SetAsDefaultCategory()
.SendTo.FlatFile("FlatFile")
.ToFile(@"fluent.log");
var configSource = new DictionaryConfigurationSource();
builder.UpdateConfigurationWithReplace(configSource);
var defaultWriter = new LogWriterFactory(configSource).Create();
defaultWriter.Write("Test1", "General");
var filter = defaultWriter.GetFilter<LogEnabledFilter>();
filter.Enabled = false;
defaultWriter.Write("Test2", "General");
If you try this code the filter will not be updated -- so another failure.
Let's try to use the "old school" programmatic configuration by using the classes directly:
var flatFileTraceListener = new FlatFileTraceListener(
#"program.log",
"----------------------------------------",
"----------------------------------------"
);
LogEnabledFilter enabledFilter = new LogEnabledFilter("Logging Enabled Filter", true);
// Build Configuration
var config = new LoggingConfiguration();
config.AddLogSource("General", SourceLevels.All, true)
.AddTraceListener(flatFileTraceListener);
config.Filters.Add(enabledFilter);
LogWriter defaultWriter = new LogWriter(config);
defaultWriter.Write("Test1", "General");
var filter = defaultWriter.GetFilter<LogEnabledFilter>();
filter.Enabled = false;
defaultWriter.Write("Test2", "General");
Success! The second ("Test2") message was not logged.
So, what is going on here? If we instantiate the filter ourselves and add it to the configuration it works but when relying on the Enterprise Library configuration the filter value is not updated.
This leads to a hypothesis: when using Enterprise Library configuration new filter instances are being returned each time which is why changing the value has no effect on the internal instance being used by Enterprise Library.
If we dig into the Enterprise Library code we (eventually) hit on LoggingSettings class and the BuildLogWriter method. This is used to create the LogWriter. Here's where the filters are created:
var filters = this.LogFilters.Select(tfd => tfd.BuildFilter());
So this line is using the configured LogFilterData and calling the BuildFilter method to instantiate the applicable filter. In this case the BuildFilter method of the configuration class LogEnabledFilterData BuildFilter method returns an instance of the LogEnabledFilter:
return new LogEnabledFilter(this.Name, this.Enabled);
The issue with this code is that this.LogFilters.Select returns a lazily evaluated enumeration that creates LogFilters, and this enumeration is passed into the LogWriter to be used for all filter manipulation. Every time the filters are referenced the enumeration is evaluated and a new filter instance is created! This confirms the original hypothesis.
To make it explicit: every time LogWriter.Write() is called a new LogEnabledFilter is created based on the original configuration. When the filters are queried by calling GetFilter() a new LogEnabledFilter is created based on the original configuration. Any changes to the object returned by GetFilter() have no effect on the internal configuration since it's a new object instance, and Enterprise Library will create yet another instance on the next Write() call anyway.
Not only is this just plain wrong, it is also inefficient to create new objects on every call to Write(), which could be invoked many times.
An easy fix for this issue is to evaluate the LogFilters enumeration by calling ToList():
var filters = this.LogFilters.Select(tfd => tfd.BuildFilter()).ToList();
This evaluates the enumeration only once ensuring that only one filter instance is created. Then the GetFilter() and update filter value approach posted in the question will work.
Update:
Randy Levy provided a fix in his answer above.
Implement the fix and recompile the enterprise library.
Here is the answer from Randy Levy:
Yes, you can disable logging by setting the LogEnabledFilter. The main way to do this would be to manually edit the configuration file -- this is the main intention of that functionality (the developer's guide references administrators tweaking this setting). Other, similar approaches to setting the filter are to programmatically modify the original file-based configuration (which is essentially a reconfiguration of the block), or to reconfigure the block programmatically (e.g. using the fluent interface). None of the programmatic approaches are what I would call simple. – Randy Levy
If you try to get the filter and disable it, I don't think it has any effect without a reconfiguration. So the following code still ends up logging: var enabledFilter = logWriter.GetFilter<LogEnabledFilter>(); enabledFilter.Enabled = false; logWriter.Write("TEST"); One non-EntLib approach would be to just manage the enable/disable yourself with a bool property and a helper class. But I think the priority approach is a pretty straightforward alternative. – Randy Levy
Conclusion:
In your custom Logger class, implement an IsLoggingEnabled property and change/check it at runtime.
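A minimal sketch of that idea follows; the LoggingService name and the way the flag is toggled are illustrative, not part of Enterprise Library:
public static class LoggingService
{
    // Flipped at runtime, e.g. from your FileConfigurationSourceChanged handler
    // after reading the administrator's new setting.
    private static volatile bool isLoggingEnabled = true;

    public static bool IsLoggingEnabled
    {
        get { return isLoggingEnabled; }
        set { isLoggingEnabled = value; }
    }

    public static void Write(string message, string category)
    {
        if (!IsLoggingEnabled)
        {
            return;
        }

        Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write(message, category);
    }
}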
This won't work:
var runtimeFilter = Logger.Writer.GetFilter<LogEnabledFilter>("Logging Enabled Filter");
runtimeFilter.Enabled = false/true;

How to make EF log sql queries globally?

How do I "tell" EF to log queries globally? I was reading this blog post: EF logging which tells in general how to log sql queries. But I still have a few questions regarding this logger.
Where would I need to place this line: context.Database.Log = s => logger.Log("EFApp", s);?
Can it be set globally, or do I have to place it everywhere I do DB operations?
In the "Failed execution" section, the blogger wrote, and I quote:
For commands that fail by throwing an exception, the output contains the message from the exception.
Will this be logged too if I don't use context.Database.Log?
Whenever you want the context to start logging.
It appears to be done on the context object, so it has to be set every time you create a new context. You could add that line of code in your constructor, though, to ensure that it is always enabled.
It will not log if you do not enable the logging.
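To make it effectively global, a sketch of setting it once in the context constructor (EF6; the context and connection-string names here are made up, and Database.Log just takes an Action<string>, so any logger works):
using System.Data.Entity;

public class MyAppContext : DbContext // hypothetical context
{
    public MyAppContext() : base("name=MyAppConnection") // hypothetical connection string name
    {
        // Every instance of this context now logs the SQL it executes.
        Database.Log = s => System.Diagnostics.Debug.WriteLine(s);
    }
}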
I don't recommend relying on that functionality; there isn't much reason to use it outside of debugging.
It is mostly useful for debugging code. But if you want to know more details, see this link: https://cmatskas.com/logging-and-tracing-with-entity-framework-6/
In that case you can use code like this:
public void Mylog()
{
    // Database.Log is a delegate of type Action<string>, so you can point it
    // at any method that takes a string, as shown below.
    context.Database.Log = k => Console.Write(k);
    // Or
    context.Database.Log = k => Test(k);
}

public void Test(string x)
{
    Console.Write(x);
}
I hope that's useful.