I am new to MarkLogic and cannot find a way to use a module invoke (rather than an ad hoc query) to call dls:document-insert-and-manage.
I have a content source connected to my host and port.
Session session = contentSource.newSession();
Request request = session.newModuleInvoke("marklogic.com/xdmp/dls/MarkLogic/dls.xqy");
This does not seem to work.
The dls module is a library module. It provides a set of functions that you can call. You need to import it into a main module (or require it into a JavaScript module) and then call the function you want.
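For example, a minimal main module along these lines (the module path, document URI, and document content below are placeholders) imports the DLS library and calls the function:

xquery version "1.0-ml";
import module namespace dls = "http://marklogic.com/xdmp/dls"
  at "/MarkLogic/dls.xqy";

declare variable $uri as xs:string external;

(: create the document at $uri and place it under DLS management :)
dls:document-insert-and-manage($uri, fn:false(), <doc/>)

If you install that main module in your modules database, say as /lib/manage-doc.xqy (a hypothetical path), you can then invoke it from XCC:

Session session = contentSource.newSession();
Request request = session.newModuleInvoke("/lib/manage-doc.xqy");
request.setNewStringVariable("uri", "/example/doc.xml");
session.submitRequest(request);
session.close();

Alternatively, session.newAdhocQuery() with the same import statement inline works without installing anything.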
I have a Bitnami instance that I use to run DreamFactory (DF) 2.12.0, on which I added a custom "Remote Service" (an HTTP REST API). I would like to use the server-side event scripting functionality to pre-process request data before sending it. I have this pre-processing Node.js test script linked to the "pre_process" event of my resource:
console.log("test");
But after having a look at the DF log file, it seems this script is never executed.
However, all DF built-in functionality, such as the user management service, does seem to work with event scripting: the same log file shows that a script linked to the user.session.get.pre_process event is indeed called.
Strangely, the complete path of my main event script is netwrixapi.search.post.pre_process, but the first log file only mentions a call to the event "netwrixapi.post.pre_process" (without my "search" resource).
I included the "X-DreamFactory-Api-Key" header in my request, which references an app with a full-access role to the API and script sources for all HTTP methods.
I also set APP_DEBUG=true and APP_LOG_LEVEL=debug in my .env file, without any luck.
Any ideas?
Problem finally solved: it seems the log_events variable wasn't set to true by default (even though the official documentation says it is) in my {$HOME}/apps/dreamfactory/htdocs/vendor/dreamfactory/df-core/config/df.php file.
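Concretely, the fix is a one-line change in that config file. A sketch of the relevant entry, assuming it sits in the top-level array that df.php returns (verify against your DF version):

'log_events' => true,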
How do I tell DSC to use a resource/module that comes from our internal code repository (not from a private Gallery feed)?
Do I just use a basic Script Resource and bring the files down (somehow) into $PSModulePath and import them?
Update
There's a cmdlet called Get-DscResource that lists the available resources on a system (i.e., those that reside in the correct paths) and provides information that can be used with Import-DscResource, a 'dynamic keyword' placed within a Configuration block in a DSC script to declare dependencies.
As for getting the resources/modules down to the target system, I'm not sure yet.
If you are using a DSC pull server, then you just need to make sure that your custom module(s) are on that server. I usually put them in Program Files\WindowsPowerShell\Modules.
In the configuration you can then specify that you want to import your custom module and proceed with the custom DSC resource:
Configuration MyConfig {
    Import-DscResource -ModuleName customModule

    Node somenode {
        customresource somename {
            # resource properties go here
        }
    }
}
If you don't have a pull server and you want to push configurations, then you have to make sure that your custom modules are on all target systems. You can use the DSC File resource to copy the modules (see the sketch below), or use a PowerShell script or any other means to copy them, and then use DSC for your custom configurations.
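A minimal sketch of that copy step with the File resource, assuming the module is staged on a network share (the share path and module name are placeholders):

Configuration DeployCustomModule {
    Node somenode {
        # Recursively copy the module folder into the system-wide module path
        File CustomModuleFiles {
            Ensure          = 'Present'
            Type            = 'Directory'
            Recurse         = $true
            SourcePath      = '\\fileserver\dsc\customModule'
            DestinationPath = 'C:\Program Files\WindowsPowerShell\Modules\customModule'
        }
    }
}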
We're in the process of upgrading from MarkLogic 6 to 8 and have run into some problems calling library modules. We have XQuery library modules that are called both from custom REST extensions and from non-REST XQuery.
MarkLogic's documentation says that REST endpoints can use libraries installed either with the new /ext endpoint or libraries installed the old way (placed somewhere else in the modules database). However, when the library module uses, for example, the functx package that comes with MarkLogic, I can't get the cross-over to work.
Say I have two identical library modules, one installed via /ext and one not:
xquery version "1.0-ml";
module namespace test = "test/lib";
import module namespace functx = "http://www.functx.com" at "/MarkLogic/functx/functx-1.0-nodoc-2007-01.xqy";
declare function test:stuff() {
<foo/>
};
The first is installed using this command, in case it matters:
curl --anyauth --user user:pwd -X PUT -i -d "@.\module\testlib-ext.xqy" -H "Content-type: application/xquery" "http://host:8020/v1/ext/test/testlib-ext.xqy?perm:rest-reader=execute"
I have REST endpoints that use each module (the only difference is the namespace and import):
xquery version "1.0-ml";
module namespace te = "http://marklogic.com/rest-api/resource/test-ext-to-ext";
import module namespace test = "test/lib" at "/ext/test/testlib-ext.xqy";
declare function te:get($context as map:map, $params as map:map) as document-node()* {
document { test:stuff() }
};
The endpoint using the library installed via /ext works. The one using the module that's simply placed in the modules database installs without errors, but gives me an error when called by a non-admin user (it works when called by admin):
RESTAPI-INVALIDREQ: (err:FOER0000) Invalid request: reason: Extension test-ext-to-lib does not exist.
I'd just install them all using /ext, but then XQuery that uses xdmp:invoke breaks. It's a different error, but it seems to be the same underlying issue. Invoking a module that uses the library placed in the modules database works. Invoking a module that uses the library installed via /ext fails with this error:
XDMP-MODNOTFOUND: (err:XQST0059) xdmp:invoke("/test/test-module-to-ext.xqy", (), ()) -- Module C:\Program Files\MarkLogic\Modules\MarkLogic\functx\functx-1.0-nodoc-2007-01.xqy not found
If I add the admin role to the calling user, all of them work. They also work even without the admin role if I take the functx import out.
It looks like a permission problem, but I can't find a role or permission that will fix it. The user has a role with every checkbox except admin itself checked. Checking that last checkbox is the only thing I've found that makes this work, and that's obviously not a viable solution.
We don't really care how the libraries are installed, but we don't want to duplicate code. How can we make these imports work with both REST and non-REST XQuery?
For the permissions to work, the main module and each library in the dependency chain must be executable by at least one role assigned to the user (where assignment includes inheritance and amping).
The REST API sets the rest-extension-user role on modules that it installs under /ext.
So, a user who has the rest-extension-user role should be able to invoke a module installed by the REST API under /ext.
More generally, any module (regardless of how it is installed) that is executable by the rest-extension-user role should be able to have dependencies on libraries installed by the REST API under /ext (assuming, of course, that all paths are correct).
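For instance, to make a library that was installed the old way callable from a REST extension, one option is to grant that same role execute permission on it. A sketch, evaluated against the modules database (the module URI here is a placeholder):

xquery version "1.0-ml";

(: make the old-style library executable by REST extension users :)
xdmp:document-add-permissions(
  "/test/testlib-lib.xqy",
  xdmp:permission("rest-extension-user", "execute")
)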
I am using Confluence 4.2.5 (build 3284) with CAS SSO connected to my LDAP server and would like to be able to call synchroniseUserDirectories() from the LDAP server when a user changes their password so that the change is instantaneous.
The way it works now, users have to wait for Confluence to run its periodic LDAP synchronization, which can be disconcerting for them.
I have tried using the XML-RPC interface to call changeUserPassword() (as an administrator), but it doesn't work. The operation raises an exception, "Error changing password for user ...". I presume that's because the user is defined in LDAP, but I can't tell for sure because the exception message isn't clear about the cause.
Here is example code that I would like to be able to use. It doesn't work.
#!/usr/bin/env python
import xmlrpclib
url = 'https://docs.example.com'
admin_user = 'frobisher'
admin_pass = 'supersecretstuff'
username = 'bigbob'
new_password = 'bigbobsbigsecret'
server = xmlrpclib.ServerProxy(url + '/rpc/xmlrpc')
token = server.confluence2.login(admin_user, admin_pass)
# CITATION: https://developer.atlassian.com/display/CONFDEV/Remote+Confluence+Methods
# this doesn't exist but would be my preferred approach.
# It raises a NoSuchMethodException exception.
server.confluence2.synchroniseUserDirectories(token)
# this throws a general exception, because of the LDAP? The message
# wasn't clear about the source of the problem.
#server.confluence2.changeUserPassword(token,
#                                      username,
#                                      new_password)
server.confluence2.logout(token)
Is there any way to do this using SOAP or REST? I was concerned about REST because it sounds like it is still a prototype.
If none of those approaches will work, can it be done with a simple plugin, considering that this must be a push operation from the LDAP server to the Confluence server? I have no experience writing plugins, but I do some Java work occasionally.
Any hints would be greatly appreciated.
The short answer is "no". The ability to synchronise remote user directories is not exposed as a remote operation in Confluence.
The long answer is "yes": you can write a plugin to do this. If you're already familiar with Java, then perhaps the best answer is to just show you some source code I've written that performs a similar function: https://bitbucket.org/jaysee00/confluence-user-sync-api. This plugin gives you SOAP, XML-RPC, and JSON-RPC methods to force an individual user account to be synced into Confluence from a remote directory.
That might suit your purposes as-is, but I imagine it would be possible to edit the source of this plugin and change it to synchronise an entire directory, too.
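For illustration, calling such a plugin method over XML-RPC would look roughly like this (synchroniseUser is a hypothetical method name; check the plugin's source for the remote method it actually registers):

#!/usr/bin/env python
import xmlrpclib

server = xmlrpclib.ServerProxy('https://docs.example.com/rpc/xmlrpc')
token = server.confluence2.login('frobisher', 'supersecretstuff')
# force a single account to be re-synced from the remote directory
server.confluence2.synchroniseUser(token, 'bigbob')
server.confluence2.logout(token)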
I have installed the postgresql module from Puppet Forge.
How can I query PostgreSQL resources using ralsh?
None of the following works:
# ralsh postgresql::db
# ralsh puppetlabs/postgresql::db
# ralsh puppetlabs-postgresql::db
I was hoping to use this to get a list of databases (including attributes such as character sets) and user names/passwords from the current system in a form that I can paste into a puppet manifest to recreate that setup on a different machine.
In principle, any Puppet client gets the current state of your system from another program called Facter. You should create a custom fact (a module for Facter) and then include it in your Puppet client. Afterwards, I think you could call this custom fact from ralsh.
More information about creating a custom fact can be found here.
In creating your own fact, you should execute your SQL query and then save the result into a particular variable.
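A minimal sketch of such a fact, in Ruby (the fact name, connection details, and psql invocation are all assumptions to adapt):

# Hypothetical custom fact listing the databases on the local PostgreSQL server.
Facter.add(:postgres_databases) do
  setcode do
    out = Facter::Util::Resolution.exec(
      "psql -At -U postgres -c 'SELECT datname FROM pg_database'"
    )
    out.split("\n").join(",") if out
  end
end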