Sphinx 2.3.1-beta where is search binary? - sphinx

Having just built Sphinx from source, I cannot find the CLI search tool `search`.
Everything else is in place: the config is set up and searchd runs fine.

The CLI `search` tool was removed from Sphinx in version 2.2.2.
For testing purposes you can now use the mysql client and SphinxQL instead.
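A minimal sketch of testing a query over SphinxQL. It assumes searchd has a `listen = 9306:mysql41` line in sphinx.conf, and the index name `test1` is made up for illustration:

```shell
# Build a SphinxQL test query; actually sending it requires a running searchd
# with a "listen = 9306:mysql41" line in sphinx.conf (index name is made up).
QUERY="SELECT id, WEIGHT() FROM test1 WHERE MATCH('star wars'); SHOW META;"
# mysql -h 127.0.0.1 -P 9306 -e "$QUERY"   # uncomment once searchd is running
echo "$QUERY"
```

`SHOW META` after the SELECT returns the match statistics the old `search` binary used to print.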

Related

PDI MongoDB Input and Output steps

I recently installed PDI 8.2 CE, but it doesn't look like it comes with the MongoDB input and output steps.
I found them here: https://github.com/pentaho/pentaho-mongodb-plugin. I unzipped that archive, put the entire folder into the ../data-integration/plugins directory, and restarted PDI, but there are still no MongoDB input/output steps.
What am I doing wrong?
You do not need to add any additional plugins: under the Big Data section you will find both the MongoDB input and output steps.
Figured this out. It appears that PDI is very picky about which versions of the JDK/JRE you're running on your machine, regardless of what you have set as PENTAHO_JAVA_HOME.
I had to uninstall all of them and then ensure that only OpenJDK 1.8 was installed.
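For reference, a quick way to pull the active JVM's major version out of `java -version` output; the sample line below is hard-coded for illustration, and the parsing assumes the usual `1.8.0_x` / `11.0.x` version formats:

```shell
# Sample first line of `java -version` output; in practice you would use:
#   ver_line=$(java -version 2>&1 | head -n 1)
ver_line='openjdk version "1.8.0_292"'
# Extract the major version ("1.8.0_292" -> 8; "11.0.2" would give 11).
major=$(echo "$ver_line" | sed -E 's/.*"(1\.)?([0-9]+).*/\2/')
echo "$major"
```

If this prints anything other than 8 on your machine, PDI 8.x may refuse to work regardless of PENTAHO_JAVA_HOME.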

mongodb 3.0.4 on ubuntu 14.04 LTS configuration file not yaml?

I'm using MongoDB 3.0.4 on Ubuntu 14.04 LTS and I wanted to change the dbpath in /etc/mongod.conf. However, after checking the MongoDB manuals, I read that the configuration file format was changed to YAML after version 2.6, but my configuration file doesn't look like that. Did I miss something, is this a bug, or should I change the conf file to YAML myself?
Thanks in advance!
The config file's old format (a series of setting=value pairs separated by newlines) is still usable, but deprecated. It will eventually be removed, and certain new settings (like storage engine options) may not be configurable with it. I would recommend switching to the YAML format as soon as you can for future-proofing.
The reason you still have the old format is that the packages (like the one you used to install on Ubuntu) have not been moved to the YAML format yet. The ticket to switch them is complete (SERVER-14750), so you will get the new format in 3.2. The file that will ship in 3.2 can be found here.
For reference, you can find the old format's documentation in the 2.4 docs here.
If you would like some examples of the YAML configs, I've written up a few common ones over on DBA.
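To illustrate the difference for the dbpath case from the question, here is a minimal sketch of the conversion; the paths are the Ubuntu package defaults and may differ on your system:

```yaml
# Old 2.4-style lines:
#   dbpath=/var/lib/mongodb
#   logpath=/var/log/mongodb/mongod.log
#   logappend=true
# YAML equivalents (2.6+):
storage:
  dbPath: /var/lib/mongodb
systemLog:
  destination: file
  path: /var/log/mongodb/mongod.log
  logAppend: true
```

Note that the YAML option names are camelCased (`dbPath`, `logAppend`) and grouped under sections like `storage` and `systemLog`.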

Sitecore Package installation not working

I am trying to install Sitecore PowerShell Extensions-3.0 Package for Sitecore 8
in my sitecore 8 instance (Windows 8.1 machine).
I used Sitecore's Development Tools -> Installation Wizard & Package Manager and chose the above-mentioned package. It shows the "Install a Package" Sitecore dialog with an "Installing" spinner and stays there for over an hour, and nothing happens. No error, it just spins.
First I tried other packages and it was the same; even lower versions of the PowerShell packages still don't install.
Can someone shed some light on what I am missing?
When installing Sitecore packages, entries are written to the Sitecore log. In Sitecore 8 the logs are stored in MongoDB, so if you don't have Mongo running, installing packages appears to hang.
Although disabling Mongo analytics allowed you to install the package, it is not a suitable approach moving forward. Instead you should install MongoDB and then start the Sitecore databases by running a .bat file.
To do that, open Notepad, paste the text below (modifying the path to mongod.exe and the folder containing the Sitecore Mongo databases if required), then Save As SitecoreDbs.bat:
"C:\Program Files\MongoDB 2.6 Standard\bin\mongod.exe" --dbpath "C:\inetpub\wwwroot\Sitecore8\Databases"
The first path is your MongoDB executable; the second is the folder containing your site's databases.
Disabling Mongo analytics did the trick.

Update of elasticsearch plugin

I have developed a couple of plugins for elasticsearch, and these may require (more or less) frequent updates.
My simple question is: is there a way of updating an elasticsearch plugin without having to remove the old version, delete the relevant indexes, install the new version and rebuild the indexes from scratch?
Thanks in advance.
There is no way to update an existing plugin. You need to delete the old version and install the new one.
I didn't quite get your question about indices, though. A plugin doesn't necessarily work against data; it can be just a site plugin, a query parser, etc. If a plugin does work against indices and you want to upgrade it while the Elasticsearch version stays the same, I don't see why you would need to reindex. The only case is if the plugin itself changed in a non-backwards-compatible way.
As of the latest version, 2.4.1 (2016-10-18), there is still no way to do this easily, because the Elasticsearch folks recommend manual plugin updates.
Expect that when you update Elasticsearch and start the service, you may end up with an error because the service won't start when a plugin is one minor version behind the ES version, e.g. "license".
Go to your elasticsearch bin directory and run the following commands:
sudo ./plugin remove <plugin name>
sudo ./plugin install <plugin name>
You could even be so bold as to write an "update" shell script that does this for you.
Here is a bash script that does exactly that:
#!/bin/bash
# Remove and re-install every currently installed plugin by name.
plugin_bin=/usr/share/elasticsearch/bin/plugin
for plugin in $($plugin_bin list | awk '/^[[:space:]]*-/{print $2}')
do
  $plugin_bin remove "$plugin" && $plugin_bin install "$plugin"
done
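To sanity-check the name-extraction step, you can run the awk filter against a captured sample of `plugin list` output (the plugin names below are made up):

```shell
# Sample `plugin list` output; the awk filter keeps only the "- name" lines
# and prints the second field, i.e. the plugin name.
sample='Installed plugins:
    - analysis-icu
    - license'
echo "$sample" | awk '/^[[:space:]]*-/{print $2}'
```

This prints each plugin name on its own line, which is exactly what the `for` loop above iterates over.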

Sphinx search on Ubuntu behaves weird

I'm having some issues with Sphinx Search on Ubuntu. My test setup is on a Mac, where I use the precompiled binary from sphinxsearch.com.
When testing exceptions there, everything is fine. E.g. I want to map "starwars" to "star wars", so I have starwars => star wars in my exceptions.txt, and it works great. When I do the same on our Ubuntu server, it does nothing.
I've tried compiling Sphinx myself, still no luck. What's going on, and does anyone else see this difference between operating systems?
I'm running Ubuntu 10.04 LTS and the latest Sphinx Search (64-bit).
Firstly, check the output of indexer; it may well reveal some issue reading the exceptions file.
Perhaps the permissions aren't set up to allow both indexer AND searchd to read the file?
Or the path is wrong. It's the simple things that usually cause this.
Or it could be that you simply didn't reindex and restart searchd after setting up the exceptions.
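The checks above can be sketched like this; the exceptions path is an assumption, so adjust it to whatever your sphinx.conf's `exceptions` setting actually points at:

```shell
# Recreate the sample mapping from the question and confirm it is readable.
EXC=exceptions.txt
printf 'starwars => star wars\n' > "$EXC"
[ -r "$EXC" ] && echo "readable by current user"
# On the server, verify the sphinx user can read the real file, then rebuild
# and reload so searchd picks the exceptions up:
#   ls -l /etc/sphinxsearch/exceptions.txt
#   indexer --all --rotate
```

`indexer --all --rotate` rebuilds all indexes and signals the running searchd to swap them in, which covers the "didn't reindex and restart" case.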