JRockit Dump File - dump

I need a parameter that starts the JRockit JVM without generating dump files, no matter what happens. Something like -DnoDump or -XXnoJrDump would be great.

The parameter to disable dumps in JRockit is -XXdumpSize:none
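For example, on a typical launch line (the application jar name here is just a placeholder):

```shell
java -XXdumpSize:none -jar yourapp.jar
```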


error: driver already registered, aborting

I compiled an (lm75) driver as a module to insert at run time, and when I tried to run:
#insmod ./lm75.ko
I got this output:
Error: Driver 'lm75' is already registered aborting...
insmod: can't insert './lm75.ko': Device or resource busy
So I tried removing it from the kernel:
#rmmod lm75.ko
which printed:
rmmod: can't unload module 'lm75': No such file or directory
Can someone let me know what I'm missing?
I was using a script to run commands in U-Boot, which in turn loads the images (uImage, rootfs, dtb) from predefined locations on the MMC, but the most recent uImage was in the wrong location (my fault). As a result, the uImage and rootfs that were loaded did not match: the uImage was an old one in which LM75 is compiled as a built-in driver, while the rootfs was the latest one, in which LM75 is compiled as a kernel module. When I replaced them with the correct images, insmod and rmmod worked as expected. Hope this helps people like me :)
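As a quick way to spot this kind of mismatch: drivers compiled into the running kernel are listed in its modules.builtin file, and insmod of the same driver will fail with "already registered". A minimal sketch of that check, using a stand-in file so it runs anywhere (on a real system you would grep /lib/modules/$(uname -r)/modules.builtin instead):

```shell
# Stand-in for /lib/modules/$(uname -r)/modules.builtin on the mismatched kernel
builtin_list=$(mktemp)
printf 'kernel/drivers/hwmon/lm75.ko\n' > "$builtin_list"

# If the driver appears here, it is compiled into the kernel:
# inserting the same driver as a module will fail with "already registered"
if grep -q 'lm75' "$builtin_list"; then
  echo "lm75 is built in: do not insmod the module"
fi
rm -f "$builtin_list"
```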

Elasticsearch shows umlauts as "??"

Setup:
Ubuntu 12.04 Server installed via VMWare quick install
PostgreSQL 9.1
ElasticSearch 0.90
Mono 3.2.1
Rails 4
Nginx 1.4.2 + Passenger 4.0.16
I have a C# program that, on startup, writes a new ElasticSearch index and points the alias used by the Rails application to it; the program then keeps running and watches a Redis instance for things to update.
There is another C# program that scrapes data from web pages; once scraped, the pages are put into PostgreSQL and the index writer above is notified via Redis. Those pages have varying encodings and are converted to UTF-8.
The first appearance of this bug was when I made a mistake and encoded data that was already UTF-8 as UTF-8 again.
Investigation
Now I thought that I obviously had some data corruption going on, but the weird thing is: the umlauts are only corrupted when I start the indexing mono process from Rails via nohup. If I kill that process and start it manually from the command line, it works perfectly fine.
When I do a backup/restore of the database, it works again from the web interface, but once the server is rebooted the umlauts are again replaced with ?? when the mono process is started from the web interface.
The first thing I did was purge the affected rows from the database and scrape the data again (without encoding it twice); that didn't help. Since the error only appears when running non-interactively via nohup from the Rails application, I assumed it was caused by the locale settings, so I changed them in both /etc/defaults/locale and /etc/environment to en_US.UTF-8 and en_US:en, but that did not help either.
I really have no idea what else I can do or what exactly causes this error, any help would be appreciated.
Edit: I forgot to clarify the most important part: when umlauts are replaced with ??, ALL umlauts in every single document in the index are replaced.
Put this in the script that you use to start your process:
export LC_ALL=en_US.UTF-8
export LANG=en_US.UTF-8
export LANGUAGE=en_US.UTF-8
The reason your process only picks up UTF-8 when you start it manually is that these settings are not system-wide. I've run into this with JRuby and init.d scripts before, and the solution is not to rely on defaults here.
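As an illustration of the failure mode (not the app's actual code): under the C/POSIX locale a process's default charset is ASCII, and any encoder that substitutes unmappable characters turns every umlaut into a literal '?', python3 is used here only for the demo:

```shell
# Encode UTF-8 text as ASCII with replacement, the way a process with a
# missing/broken locale effectively does
python3 -c 'print("Müller, Köln".encode("ascii", "replace").decode("ascii"))'
# prints: M?ller, K?ln
```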

PostgreSQL issue: could not access file "$libdir/plpgsql": No such file or directory

I get this exception in PostgreSQL:
org.postgresql.util.PSQLException: ERROR: could not access file "$libdir/plpgsql": No such file or directory
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:1721)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1489)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:193)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:452)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:337)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeQuery(AbstractJdbc2Statement.java:236)
at org.apache.commons.dbcp.DelegatingStatement.executeQuery(DelegatingStatement.java:205)
I searched a lot, and most solutions point to a broken installation. But this is my test DB, which has been running without issues for a long time. Inserts are also working; the issue occurs only on SELECT queries.
Apparently, you moved your PostgreSQL lib directory out of place. To confirm this, try the following in psql:
> SET client_encoding TO iso88591;
ERROR: could not access file "$libdir/utf8_and_iso8859_1": No such file or directory
If you get an error message like this, then my theory is correct. You'll need to find out where those files ended up, or you can reinstall PostgreSQL to restore them.
To find out what $libdir is referring to, run the following command:
pg_config --pkglibdir
For me, this produces:
/usr/lib/postgresql
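To illustrate what the server does with that value (a sketch, not PostgreSQL's actual code): a module path beginning with $libdir is resolved against the compiled-in pkglibdir, so plpgsql must exist as a shared library under that directory:

```shell
# Assumed value for illustration; on a real system use: pkglibdir=$(pg_config --pkglibdir)
pkglibdir=/usr/lib/postgresql

# How "$libdir/plpgsql" resolves to an actual file path
module='$libdir/plpgsql'
resolved="${pkglibdir}${module#\$libdir}.so"
echo "$resolved"   # prints: /usr/lib/postgresql/plpgsql.so
```

If that file is missing or belongs to a different server version, you get exactly the "could not access file" error above.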
I had the same problem: another PostgreSQL server instance (8.4) was interfering with the 9.1 one; once the 8.4 instance was removed, it worked.
The other instance can sometimes be removed from the system while still running (e.g. you do a Gentoo update and a depclean without stopping and migrating your data), so the error seems particularly mysterious.
The solution is usually to do a slotted install/eselect of the old version (in Gentoo terms; simply downgrade on other distros), run its pg_dumpall, then uninstall/reinstall the new version and import the data.
This worked pretty painlessly for me.

How to connect to PostgreSQL in Erlang using epgsql driver?

I would like to access a PostgreSQL database in Erlang. I downloaded the epgsql driver, it was a few directories and files, but I don't understand how to use it.
How can I write an Erlang program and use the epgsql driver to access a PostgreSQL database?
I made a new folder and copied all the files from the driver's src/ directory, plus pgsql.hrl, into it. Then I created a simple test program:
-module(dbtest).
-export([dbquery/0]).

dbquery() ->
    {ok, C} = pgsql:connect("localhost", "postgres", "mypassword",
                            [{database, "mydatabase"}]),
    {ok, Cols, Rows} = pgsql:equery(C, "select * from mytable").
Then I started erl and compiled the modules with c(pgsql). and c(dbtest).
But then when I execute dbtest:dbquery(). I get this error:
** exception error: undefined function pgsql:connect/4
in function dbtest:dbquery/0
Any suggestions on how I can connect to a PostgreSQL database using Erlang?
Rebar is a good tool to use, but I find it's good to know how your project should be structured so you can tell what to do when things go wrong. Try organizing your project like this:
/deps/epgsql/
/src/dbtest.erl
/ebin
Then cd into deps/epgsql and run make to build the library.
Your dbtest.erl file should also explicitly reference the library, add this near the top:
-include_lib("deps/epgsql/include/pgsql.hrl").
You'll probably want a Makefile (or rebar) that compiles your code when you make changes, but to compile things right now, try: erlc -I deps/epgsql/ebin -o ebin src/dbtest.erl.
When testing, make sure your load paths are set correctly; try: erl -pz deps/epgsql/ebin/ ebin/. When the erl console loads up, try dbtest:dbquery(). and see what happens!
I don't have PostgreSQL set up on my machine, but I was able to get more reasonable-looking errors with this setup.
I recommend using the ejabberd pgsql driver: https://svn.process-one.net/ejabberd-modules/pgsql/trunk/

Stop Oracle from generating sqlnet.log file

I'm using DBD::Oracle in perl, and whenever a connection fails, the client generates a sqlnet.log file with error details.
The thing is, I already have the error trapped by perl, and in my own log file. I really don't need this extra information.
So, is there a flag or environment for stopping the creation of sqlnet.log?
As the Oracle documentation states: "To ensure that all errors are recorded, logging cannot be disabled on clients or Names Servers."
You can follow DCookie's suggestion and use /dev/null as the log directory. On Windows machines you can use NUL:.
From Metalink:
The logging is automatic, there is no way to turn logging off, but since you are on Unix server, you can redirect the log file to a null device, thus eliminating the problem of disk space consumption.
In the SQLNET.ORA file, set LOG_DIRECTORY_CLIENT and LOG_DIRECTORY_SERVER equal to a null device.
For example:
LOG_DIRECTORY_CLIENT = /dev/null
LOG_FILE_CLIENT = /dev/null
in SQLNET.ORA suppresses client logging completely.
To disable the listener from logging, set this parameter in the LISTENER.ORA file:
logging_listener = off
Are your clients on Windows or *nix? If on *nix, you can set LOG_DIRECTORY_CLIENT=/dev/null in your sqlnet.ora file. I'm not sure you can do much for a Windows client.
EDIT: Doesn't look like it's possible in Windows. The best you could do would be to set the sqlnet.ora parameter above to a fixed location and create a scheduled task to delete the file as desired.
Okay, as Thomas points out, there is a null device on Windows; use the same paradigm.
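For reference, the Windows equivalent in SQLNET.ORA would be the following untested config fragment (NUL is a device name on Windows, not an actual file):

```
LOG_DIRECTORY_CLIENT = NUL
LOG_FILE_CLIENT = NUL
```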
IMPORTANT: DO NOT SET LOG_FILE_CLIENT=/dev/null. This causes the permissions of /dev/null to be reset every time the Oracle library is initialized. If your umask does not permit world-readable/writable bits, those bits get removed from /dev/null, provided the process has permission to chmod that file, i.e. it is running as root.
And running as root may be something trivial, like php --version with the OCI PHP extension present!
Full details here:
http://lists.pld-linux.org/mailman/pipermail/pld-devel-en/2014-May/023931.html
You should use a path inside a directory that doesn't exist:
LOG_FILE_CLIENT = /dev/impossible/path
and hope nobody ever creates the directory /dev/impossible :)
For Windows, NUL: is probably fine, as it's not an actual file there.