ReferenceError: process is not defined - vue-styleguidist

I can't find process anywhere in the webpack config file, so what might be causing this error?
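As background: process is a Node.js global that does not exist in the browser, so it normally only works in bundled client code when the build tool substitutes it at build time. A minimal sketch of how that is commonly done with webpack's DefinePlugin (file name and values here are illustrative, not taken from this project's setup):

// webpack.config.js - an illustrative sketch, not the asker's actual config
const webpack = require('webpack');

module.exports = {
  // ...existing entry/output/loaders...
  plugins: [
    // Replaces process.env.NODE_ENV in the bundled code at build time,
    // so the browser never needs a real `process` global.
    new webpack.DefinePlugin({
      'process.env.NODE_ENV': JSON.stringify(process.env.NODE_ENV || 'development'),
    }),
  ],
};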

Related

Failed to read postmaster.pid file while running embedded-postgres

My Spring application uses yandex-qatools/postgresql-embedded for executing Unit Tests.
While executing them, I am constantly getting the error below:
ERROR 75847 --- [ Test worker] r.y.q.embed.postgresql.PostgresProcess : Failed to read PID file (File '/var/folders/sh/xr6l_7bs1_z9v1jfsyctc45w0000gp/T/postgresql-embed-b05c213f-7416-4200-a586-a3afb3263478/db-content-4f285249-22ea-4625-b771-156adbf5851f/postmaster.pid' does not exist)
java.io.FileNotFoundException: File '/var/folders/sh/xr6l_7bs1_z9v1jfsyctc45w0000gp/T/postgresql-embed-b05c213f-7416-4200-a586-a3afb3263478/db-content-4f285249-22ea-4625-b771-156adbf5851f/postmaster.pid' does not exist
A warning pops up before the exception, but for now let's ignore it.
WARN 75847 --- [ Test worker] r.y.q.embed.postgresql.PostgresProcess : Possibly failed to run initdb:
no data was returned by command ""/private/var/folders/sh/xr6l_7bs1_z9v1jfsyctc45w0000gp/T/postgresql-embed-b05c213f-7416-4200-a586-a3afb3263478/pgsql-10.3-1/pgsql/bin/postgres" -V"
The program "postgres" is needed by initdb but was not found in the
same directory as "/private/var/folders/sh/xr6l_7bs1_z9v1jfsyctc45w0000gp/T/postgresql-embed-b05c213f-7416-4200-a586-a3afb3263478/pgsql-10.3-1/pgsql/bin/initdb".
Check your installation.
I verified that no other instance of Postgres is running on my local machine using
ps -ef|grep postgres
I followed this thread as well, but it doesn't help.
I've run out of options to fix this; can anyone please suggest how to resolve it?
OSX version: 12.1
Thanks in advance
In my case, besides your error, I could also see the following error:
r.y.q.embed.postgresql.PostgresProcess : Possibly failed to run initdb:
initdb: invalid locale settings; check LANG and LC_* environment variables
This message led me to the solution. I just added the environment variables below to my .zshrc file:
export LANG="en_US.UTF-8"
export LC_ALL="en_US.UTF-8"
export LC_CTYPE="en_US.UTF-8"
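After adding these, the settings only take effect in shells (and anything launched from them, such as the Gradle test worker) that have re-read the file. A quick sanity check, assuming zsh:

source ~/.zshrc
locale   # LANG, LC_ALL and LC_CTYPE should now all report en_US.UTF-8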

How to log a useful error if Scala Play config files are missing?

Is it possible to give a better error message if the config filename is mistyped when using Scala's play.api.Configuration?
For example, if I run my application with sbt run -J-Dconfig.file=conf/my-config.conf but the file is actually called my_config.conf, no "file not found" error is raised. Instead, the first error appears only when applicationConfig.has(configPath) is called, at which point there is no clear way to determine programmatically whether a config value is missing from the file or the config file itself is missing.
Here is what I do:
Wrap the configuration in a Config class.
Initialize that class on startup.
Log all property values.
This will surface exceptions at startup. Here is an example: AdaptersContext.scala
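A minimal sketch of such a wrapper, assuming Play 2.6+ and made-up keys (the real-world version is the AdaptersContext.scala linked above):

import javax.inject.{Inject, Singleton}
import play.api.{Configuration, Logger}

@Singleton
class AppConfig @Inject()(config: Configuration) {
  // config.get[...] throws as soon as the class is constructed if a key is missing,
  // so a mistyped -Dconfig.file shows up as a startup failure rather than later on.
  val dbUrl: String = config.get[String]("app.db.url")
  val poolSize: Int = config.get[Int]("app.db.poolSize")

  // Logging the resolved values makes it obvious which config file was actually loaded.
  Logger("config").info(s"app.db.url=$dbUrl, app.db.poolSize=$poolSize")
}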
As a remark:
If you have your config file in the conf directory (on the classpath), use:
config.resource=demo.conf

Crowdin error when I tried to upload translations

I have an issue (I'm still blocked). I created my configuration file like this:
project_identifier: test
api_key: KeepTheAPIkeySecret
base_url: https://api.crowdin.com
base_path: /path/to/your/project
files:
  -
    source: /locale/en/LC_MESSAGES/messages.po
    translation: /locale/%two_letters_code%/LC_MESSAGES/%original_file_name%
See: https://github.com/crowdin/crowdin-cli
However, I receive an error message when I execute the command line to upload translations to Crowdin:
error: Seems Crowdin server API URL is not valid.
Please check the `base_url` parameter in the configuration file.
I don't know why it's not working! Thanks for any help!
Crowdin sent me another JAR; the previous one did not handle Windows paths correctly.
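For anyone hitting the same message on Windows with the older JAR, it may also be worth writing base_path with forward slashes; this is only a guess based on the Windows-path issue above, e.g.:

base_path: C:/path/to/your/project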

Failed to load kafka module

I am trying out the nxlog Kafka output module from the link below.
Link
I get the error messages below:
ERROR Failed to load module from /usr/local/libexec/nxlog/modules/output/om_kafka.so, /usr/local/libexec/nxlog/modules/output/om_kafka.so: undefined symbol: rd_kafka_topic_new;DSO load failed
ERROR module 'outKafka' is not declared at /usr/local/etc/nxlog/nxlog.conf:65
ERROR route tcproute is not functional without output modules, ignored at /usr/local/etc/nxlog/nxlog.conf:65
I am using:
Nxlog Version - nxlog-ce-2.8.1248
Kafka Version - kafka_2.9.2-0.8.1.1
Latest librdkafka
Also, the librdkafka example program (rdkafka) runs fine for both producer and consumer, so I guess the environment is set up correctly for librdkafka,
but I am unable to determine what's causing this problem.
The problem is that om_kafka.so is not linked with librdkafka.
You will need this in Makefile.am:
om_kafka_la_LIBADD = $(LIBRDKAFKA) $(LIBNX)
The value of $(LIBRDKAFKA) should be set properly; normally this is done in configure.in. Otherwise you could just use the full path to the library (.so, .la, or .a).
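For illustration, assuming librdkafka ended up under /usr/local/lib, the full-path variant could look like:

om_kafka_la_LIBADD = $(LIBNX) /usr/local/lib/librdkafka.so

After rebuilding, ldd /usr/local/libexec/nxlog/modules/output/om_kafka.so | grep rdkafka should show the library being resolved, and the undefined rd_kafka_topic_new symbol error should disappear.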

Hadoop: FileNotFoundException - Windows

This problem seems to have been raised on Stack Overflow already, but my case is a bit different. The file/folder location Hadoop is looking for is created under C:/tmp/hadoop-SYSTEM/mapred/local/taskTracker/jobcache/, and job folders are created there while running the wordcount example. Yet even though the files and folders are available, it throws a FileNotFoundException, as if the files are not being recognized. I even tried re-formatting the namenode, which is one of the solutions suggested in forums, but the problem still exists.
Note: Hadoop version 0.20.2
ERROR:
13/04/11 10:24:20 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
13/04/11 10:24:21 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/04/11 10:24:21 INFO input.FileInputFormat: Total input paths to process : 1
13/04/11 10:24:22 INFO mapred.JobClient: Running job: job_201304111023_0001
13/04/11 10:24:23 INFO mapred.JobClient: map 0% reduce 0%
13/04/11 10:24:34 INFO mapred.JobClient: Task Id : attempt_201304111023_0001_m_000002_0, Status : FAILED
java.io.FileNotFoundException: File C:/tmp/hadoop-SYSTEM/mapred/local/taskTracker/jobcache/job_201304111023_0001/attempt_201304111023_0001_m_000002_0/work/tmp does not exist.
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
at org.apache.hadoop.mapred.TaskRunner.setupWorkDir(TaskRunner.java:519)
at org.apache.hadoop.mapred.Child.main(Child.java:155)
Check whether the permissions on that folder have been set properly; this type of error may occur if write permission has not been granted on it.
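Since Hadoop 0.20.x on Windows normally runs under Cygwin, one way to open up that directory (path and mode are illustrative; adjust to the account that actually runs the TaskTracker) is:

chmod -R 777 /cygdrive/c/tmp/hadoop-SYSTEM

or, with native Windows tooling, something like icacls C:\tmp\hadoop-SYSTEM /grant Users:(OI)(CI)F /T.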