(screenshot of the error)
I tried to install Logstash and ran
logstash -f ./conf/logstash-sample.conf
but the command fails with the error shown in the screenshot above.
I am new to the ELK stack.
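Since the error itself isn't visible here, one common culprit is simply pointing -f at the wrong path; a quick, hedged way to check (the paths are assumptions, recent Logstash distributions usually ship the sample file under config/ rather than conf/):
ls config/logstash-sample.conf conf/logstash-sample.conf 2>/dev/null
bin/logstash -f config/logstash-sample.conf --config.test_and_exit   # validate the config without starting the pipeline
bin/logstash -f config/logstash-sample.conf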
Hi, I am new to Apache Flink and I am trying to run the batch WordCount example to start learning about it. I ran
./bin/start-cluster.sh
and then executed
./bin/flink run ./examples/batch/WordCount.jar --input test.txt --output out.txt
and I get the following:
org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: localhost/127.0.0.1:8081
(screenshot of the console messages)
So I think it's a server connection error. I tried a few things, like XAMPP, but nothing helped.
So what's your opinion on that?
It seems like your cluster is not starting. Please try ./bin/start-cluster.sh again and go to http://localhost:8081/ to confirm your cluster is up. After that, the word count example should run fine after specifying the appropriate input and output files.
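If it still doesn't come up, a quick sanity check along these lines may help (ports and endpoints assume a default standalone setup):
./bin/start-cluster.sh
jps                                    # should list a JobManager and a TaskManager process
curl http://localhost:8081/overview    # the REST API answers once the cluster is up
./bin/flink run ./examples/batch/WordCount.jar --input test.txt --output out.txt
If jps shows no TaskManager, the log files under ./log/ usually explain why it failed to start.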
Platform: Magento 2 (v2.4.0)
Server: Linux CentOS
When I run the command below,
COMPOSER_MEMORY_LIMIT=-1 php bin/magento setup:static-content:deploy -f
I get the error,
Could not validate a connection to Elasticsearch. No alive nodes found in your cluster
When I run,
systemctl restart elasticsearch.service
I get the error,
Failed to restart elasticsearch.service: Unit not found.
And when I run the command,
systemctl status elasticsearch
I get the error,
Unit elasticsearch.service could not be found
However, I can see the Elasticsearch folder and files at the following path:
vendor/elasticsearch/elasticsearch
Any suggestions?
Thank you!
Hi, try using these commands; I use them on my Ubuntu instance:
sudo -i service elasticsearch start
service elasticsearch restart
sudo -i service elasticsearch stop
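Note that vendor/elasticsearch/elasticsearch is only the PHP client library that Magento uses; the "Unit not found" error suggests the Elasticsearch server itself was never installed on the CentOS box. A rough sketch of installing it from the Elastic yum repository (the repository details are assumptions, check Elastic's install docs for the exact version Magento 2.4.0 expects):
sudo rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
cat <<'EOF' | sudo tee /etc/yum.repos.d/elasticsearch.repo
[elasticsearch-7.x]
name=Elasticsearch repository for 7.x packages
baseurl=https://artifacts.elastic.co/packages/7.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
EOF
sudo yum install -y elasticsearch
sudo systemctl enable --now elasticsearch
curl http://localhost:9200    # should return the node's info JSON once it is up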
I need help with the following:
Where do I install Filebeat on Ubuntu 18.04, and how do I configure Logstash to fetch logs from the log files?
How do I parse the IPs (Intrusion Prevention logs) using a Logstash grok filter?
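In case it helps, a minimal, hedged sketch of a Logstash pipeline that receives Filebeat events and pulls an IP out of each message (the file name, port, field name, and pattern are assumptions to adapt to your IPS log format):
cat <<'EOF' | sudo tee /etc/logstash/conf.d/ips.conf
input {
  beats { port => 5044 }
}
filter {
  grok {
    # assumed example: capture the first IP address in the line into src_ip
    match => { "message" => "%{IP:src_ip}" }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
EOF
Filebeat itself goes on the Ubuntu 18.04 machine that produces the log files, pointed at Logstash's beats port.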
I'm trying to implement centralised logging for a number of micro-service docker containers.
To achieve this I'm attempting to use the recommended syslog logging driver approach to deliver logs to Loggly:
https://www.loggly.com/docs/docker-logging-driver/
I've done the following...
On the remote docker-machine...
$ curl -O https://www.loggly.com/install/configure-linux.sh
$ sudo bash configure-linux.sh -a SUBDOMAIN -u USERNAME
It verified that everything worked correctly, and I can see that the host events are now going through to the Loggly console.
I then configured the services in docker-compose, like so...
nginx_proxy:
  build: nginx_proxy
  logging:
    driver: "syslog"
    options:
      tag: "{{.ImageName}}/{{.Name}}/{{.ID}}"
I then rebuilt and re-launched the containers, with...
$ docker-compose up --build -d
However, I'm not getting any logs from the containers going to Loggly.
I can verify that the syslog driver update has taken effect by doing...
$ docker-compose logs nginx_proxy
This reports...
nginx_proxy_1 | WARNING: no logs are available with the 'syslog' log driver
Which is what I would expect to see, as this log driver doesn't work for viewing logs locally.
Is there something else I need to do to get this working correctly?
Can you share the Dockerfile in the nginx_proxy directory? Did you confirm that it is generating logs?
To test, can you swap out nginx with a basic Ubuntu container that echoes something, like they show in the Loggly documentation: https://www.loggly.com/docs/docker-logging-driver/
Run:
sudo docker run -d --log-driver=syslog --log-opt tag="{{.ImageName}}/{{.Name}}/{{.ID}}" ubuntu echo "Test Log"
Check:
$ tail /var/log/syslog
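If the Ubuntu test line shows up in /var/log/syslog but never reaches Loggly, the gap is usually in the rsyslog forwarding that configure-linux.sh set up rather than in the Docker driver. A rough end-to-end check, assuming the nginx_proxy service is reachable on the host's port 80 (adjust to your compose setup):
$ docker-compose up --build -d
$ curl -s http://localhost/ > /dev/null            # generate an nginx access-log line
$ grep nginx_proxy /var/log/syslog | tail -n 5     # the tag from the compose file should appear here
$ sudo systemctl status rsyslog                    # confirm the forwarder configure-linux.sh configured is running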
I was trying to configure ELK on one Ubuntu server and Filebeat on another Ubuntu machine. After changing the configuration file, /etc/filebeat/filebeat.yml, I am not seeing Filebeat running.
Where can I check the Filebeat logs, and what am I missing?
If you installed Filebeat from the installer, look for its logs under /var/log/filebeat.
Also check the .yml file for syntax errors, and make sure the user has permission to run Filebeat.
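A few commands that usually narrow this down on a systemd-based Ubuntu install (adjust the config path if yours differs):
sudo filebeat test config -c /etc/filebeat/filebeat.yml   # catches YAML and syntax errors
sudo filebeat test output                                  # checks the connection to Logstash/Elasticsearch
sudo systemctl status filebeat
sudo journalctl -u filebeat --no-pager | tail -n 50        # recent service logs
sudo ls /var/log/filebeat/                                 # log files for package installs (path may vary by version)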