Querying the Sphinx Search Index

I am using the Sphinx search engine and I have an issue where a few files that definitely should be in the search results are not showing up. I have checked to make sure no information is missing that would prevent these files from appearing.
Is there some way for me to query the index directly to see whether these records, or any specific record, are in there?
I found a similar post on the subject:
Sphinx Search Index
So it appears to be possible, but that post is not detailed enough on how to do it; in other words, I am not following exactly what is going on there. Do I just put this directly into the command line?
Or is there a tutorial available on this? I searched and could not locate one.

Sphinx accepts connections over MySQL's protocol, so you can use any MySQL client to connect and execute queries:
http://dev.mysql.com/doc/refman/5.5/en/programs-client.html
If you install the command-line client, you can connect like this:
$ mysql -h0 -P9306
Sphinx supports a custom subset of SQL called SphinxQL, which you can use to query data from an index. The SphinxQL documentation is here:
http://sphinxsearch.com/docs/latest/sphinxql-reference.html
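For example, to check whether a specific record made it into the index, connect over the MySQL protocol and filter on the document ID. Here is a minimal sketch using Python's pymysql client (the index name myindex and the document ID 123 are placeholders for your own values):

import pymysql

# Connect to searchd's SphinxQL listener (MySQL protocol, default port 9306).
conn = pymysql.connect(host="127.0.0.1", port=9306)
try:
    with conn.cursor() as cur:
        # Look up one document by ID; "myindex" is a placeholder index name.
        cur.execute("SELECT * FROM myindex WHERE id = 123")
        print("found" if cur.fetchone() else "not in the index")
finally:
    conn.close()

If the record is missing here, the problem is on the indexing side (source query or attribute setup) rather than in how you search.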

Related

Is it possible to evaluate a Postgres expression without connecting to a database?

PostgreSQL has excellent support for evaluating JSONPath expressions against JSON data.
For example, this query returns true because the value of the nested field is indeed "foo".
select '{"header": {"nested": "foo"}}'::jsonb @? '$.header ? (@.nested == "foo")'
Notably this query does not reference any schemas or tables. Ideally, I would like to use this functionality of PostgreSQL without creating or connecting to a full database instance. Is it possible to run PostgreSQL in such a way that it doesn't have schemas or tables, but is still able to evaluate "standalone" queries?
Some other context on the project: we need to evaluate JSONPath expressions against JSON data in both a Postgres database and a Python application. Unfortunately, Python does not have any JSONPath libraries that support enough of the spec to be useful to us.
Ideally, I would like to use this functionality of PostgreSQL without creating or connecting to a full database instance.
Well, it is open source. You can always pull out the source code for the functionality you want and adapt it to compile by itself. But that seems like a large and annoying undertaking, and I probably wouldn't do it. And short of that, no.
Why do you need this? Are you worried about scalability, ease of installation, performance, or what? If you are already using PostgreSQL anyway, opening a dummy connection just to fire some queries at the JSONB engine doesn't seem too hard.
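To illustrate the dummy-connection idea: since the query references no tables, any database you can reach will do. A minimal sketch with psycopg2 (the connection parameters are placeholders):

import psycopg2

# Any reachable database works; the query below touches no tables.
conn = psycopg2.connect("dbname=postgres user=postgres host=localhost")
with conn.cursor() as cur:
    # Evaluate a JSONPath expression against a standalone JSON value.
    cur.execute(
        "SELECT %s::jsonb @? %s::jsonpath",
        ('{"header": {"nested": "foo"}}', '$.header ? (@.nested == "foo")'),
    )
    print(cur.fetchone()[0])  # True
conn.close()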

How to run batch "SQL" using Parse Server or directly on MongoDB?

I am going to use SQL terminology because I am new to Parse; apologies if that is confusing. I have a table in an app, and in order to introduce new functionality I need to add a new column and set all the records to a default value. In SQL I would just run
update <table> set <column> = <value>;
Parse Server has MongoDB as the back end, and I am not clear whether the correct approach is to access MongoDB directly and run statements through the command line, or whether this would cause an issue with Parse. I found this helpful link for translating SQL syntax to MongoDB: https://docs.mongodb.com/manual/reference/sql-comparison/.
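Based on that guide, I believe the direct equivalent of the SQL above would be something like the following untested sketch (using pymongo; the connection URI, database, collection, and field names are all placeholders, and I don't know whether bypassing Parse like this is safe):

from pymongo import MongoClient

# Placeholders: point this at the MongoDB instance behind Parse Server.
client = MongoClient("mongodb://localhost:27017")
db = client["parse"]

# Rough equivalent of: update GameScore set newColumn = 'default';
result = db["GameScore"].update_many({}, {"$set": {"newColumn": "default"}})
print(result.modified_count, "documents updated")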
I also noticed that there are some tools, such as Studio 3T, but the ones I saw all required expensive licenses. If direct MongoDB access is OK, any help understanding how to get to it would be appreciated; I installed parse-server from the Bitnami stack on the AWS Marketplace, and to date I have only been interacting with it through the provided dashboard, which doesn't have an "Update all records" option.
Right now my workaround is to write a Swift script that runs the update in a loop, but I have to think that if I had millions of records instead of thousands this would be the wrong approach. What is the proper environment and code to update my existing Parse Server so that I can run something like the SQL above?

When testing POST (create Mongo entries), how to delete entries in the DB with JMeter after testing, if you don't have DELETE endpoints?

I'm sure I can write an easy script that simply drops the entire collection from the database, but that seems very clumsy as a long-term solution.
Currently, we don't have delete endpoints that actually DELETE; we have PUT endpoints that mark an entry as "DONT SHOW/REMOVED" and another "undelete" endpoint that restores it, since we technically don't want to delete any data in our implementation of this medical database, for liability purposes.
Does JMeter have a way for me to make it talk to Mongo and delete? I know there is a deprecated way to talk to Mongo via JMeter, but I am not sure about any modern solutions.
Since I can't add unused code to the repo, does this mean the only solution is for me to make an "extra endpoint" outside of the repo that JMeter can access to delete each entry?
That seems like a viable solution; I'm just not sure if it's the only way to go about it and whether I'm missing something.
The MongoDB test elements were deprecated due to low interest: keeping the MongoDB driver shipped with JMeter up to date would require extra effort, and the number of users of the MongoDB test elements was not that high.
Mailing List Message
Associated JMeter issue
However, given that you don't test MongoDB per se and plan to use the JMeter MongoDB elements only for setup/teardown actions, I believe you can go ahead.
You can get the MongoDB test elements back by adding the following line to the user.properties file:
not_in_menu=
This will "unhide" the MongoDB Source Config and MongoDB Script elements, which you will be able to use for cleaning up the DB. See How to Load Test MongoDB with JMeter for more information, sample queries, tips and tricks.
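If re-enabling the deprecated elements turns out not to be an option, one alternative (not part of the answer above, just a sketch) is to do the cleanup outside JMeter with a small script run as a post-test step, for example with pymongo; the URI, database, collection, and marker field below are placeholders:

from pymongo import MongoClient

# Placeholders: adjust the URI and names for your environment.
client = MongoClient("mongodb://localhost:27017")
db = client["medical"]

# Delete only the documents created by the load-test run.
result = db["entries"].delete_many({"createdBy": "jmeter-test"})
print(result.deleted_count, "test documents removed")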

How to use Solr on PostgreSQL and index a table

I am new to Solr, with the specific need to crawl an existing database table and generate results.
Every online example/tutorial I have found so far only explains how you hand Solr documents and they get indexed, with no indication of how to do the same with a database.
Can anyone please explain the steps to achieve this?
Links like this wiki show everything with a JDBC driver and MySQL, so I even doubt whether Solr supports this with .NET at all. My tech boundaries are C# and PostgreSQL.
You have stumbled over the included support for JDBC already, but you have to use the Postgres JDBC driver. The example will be identical to the MySQL one, but you'll have to use the proper URL for Postgres instead and reference the JDBC driver (which one will depend on which Postgres JDBC driver you use).
jdbc:postgresql://localhost/test
This is a configuration option in Solr, and isn't related to .NET or other external dependencies.
However, the other option is to write the indexing code yourself, and this can often be a good solution, as it makes it easier to pre-process the content and apply certain logic before storing it in Solr. For .NET you have SolrNet, a Solr client that makes it easy both to query Solr and to submit documents to it.
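If you go the write-it-yourself route, the flow is: read rows from Postgres, map each row to a document, and post the batch to Solr's update handler. Here is a minimal sketch in Python to show the shape of it (SolrNet code in C# follows the same pattern; the connection string, query, and core name are placeholders):

import psycopg2
import requests

# Read rows from Postgres (connection string and query are placeholders).
conn = psycopg2.connect("dbname=test user=postgres host=localhost")
with conn.cursor() as cur:
    cur.execute("SELECT id, title, body FROM articles")
    docs = [{"id": r[0], "title": r[1], "body": r[2]} for r in cur]
conn.close()

# Post the documents to Solr's JSON update handler and commit.
resp = requests.post(
    "http://localhost:8983/solr/mycore/update",
    json=docs,
    params={"commit": "true"},
)
resp.raise_for_status()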

Searching with Sphinx

I'm at an impasse that probably has a simple solution, but I can't see it. I've done everything in the Sphinx documentation up to the Quick Tour, but when I test the search using test.php in PuTTY, it returns zero results.
I've put all the correct database info in sphinx.conf and I've assembled the SQL query. I'm not getting any errors at all; it just says it's returning 0 results every time I search.
Is it looking at my databases? Let me know if you need to see any code. searchd is running (as far as I can tell).
Sphinx has 2 different phases:
1) Indexing
2) Searching
I believe from your question that you skipped by mistake the part where you need to index the data (run the indexer) so that searching has data to search through. In the indexing phase, Sphinx takes all of the data from your DB, and searches then actually run against that index, not your DB.
Make sure that indexer --all shows that it found and indexed actual documents.
Besides the API, there is another convenient method to test Sphinx: SphinxQL.
Add the line "listen = 9306:mysql41" to the searchd section of sphinx.conf, as described in http://astellar.com/2011/12/replacing-mysql-full-text-search-with-sphinx/, and start the daemon.
Then run
mysql -h0 -P 9306
and then fire the query against sphinx
SELECT * FROM <your_sphinx_index>;
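If you'd rather script that check, here is a small sketch with Python's pymysql that lists the indexes searchd actually loaded and pulls a few rows (the index name is a placeholder); an empty result means the indexer found no documents:

import pymysql

# Connect to the SphinxQL listener enabled by the "listen" line above.
conn = pymysql.connect(host="127.0.0.1", port=9306)
with conn.cursor() as cur:
    cur.execute("SHOW TABLES")                     # indexes searchd loaded
    print(cur.fetchall())
    cur.execute("SELECT id FROM myindex LIMIT 5")  # placeholder index name
    print(cur.fetchall())
conn.close()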
Hope that helps!