I'm running Snort 2.9.7.0 on the latest Arch Linux on a Raspberry Pi B+ model.
I have tried to run Snort multiple times in NIDS mode: snort -dev -l log -h 192.168.1.0/24 -c snort.conf or snort -c snort.conf -l /log -h 127.0.0.1/24 -s.
I always get this error:
./etc/snort/rules/emerging-icmp.rules(0) Unable to open rules file "./etc/snort/rules/emerging-icmp.rules": No such file or directory
The problem is that this file does exist and is part of the rules directory!
I modified snort.conf as some tutorials and the manual (http://manual.snort.org/node18.html) suggested, but this did not help in any way and I hit a brick wall. I'm not seeing what I'm doing wrong.
Does it have to do with the . before the /?
The ./ will check the directory your snort.conf is in, so if it isn't in the root (/) directory, that is probably why. You should remove the . if the rules file is actually under /etc. It could also be a permissions problem; make sure the permissions on that file are correct for the user you are running Snort as.
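For example, the include line in snort.conf would change from the relative path to an absolute one (a sketch, assuming the rules really do live under /etc/snort/rules):
# before (relative path)
include ./etc/snort/rules/emerging-icmp.rules
# after (absolute path)
include /etc/snort/rules/emerging-icmp.rules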
I've placed a Docker Compose file, project.yaml, at the location /etc/project/project.yaml.
The file as well as the project directory have the same permissions, i.e. -rwxrwxrwx,
but when I run docker-compose
sudo docker-compose -f ./project.yaml up -d
it errors out with the following:
Cannot find the file ./project.yaml
I have checked various times and it seems there is no permission issue. Can anyone tell me why this problem happens and what the solution would be?
Besides using the full path, as commented by quoc9x, double-check your current working directory when you call a command with a relative path like ./project.yaml.
If you are not in the right folder, that would explain the error message.
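For instance (a sketch, assuming the file really is at /etc/project/project.yaml), either change into that directory first or pass the absolute path:
cd /etc/project && sudo docker-compose -f ./project.yaml up -d
sudo docker-compose -f /etc/project/project.yaml up -d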
I'm using Windows 10, if it matters, and I'm trying to feed a file to the "oeminst" app, which will convert this file from .EDR to .CCSS. According to the app's website, its usage summary is this:
oeminst [-options] [inputfiles]
-v Verbose
-n Don't install, show where files would be installed
-c Don't install, save files to current directory
-S d Specify the install scope u = user (def.), l = local system]
infile Manufacturers setup.exe install file(s) or .dll(s) containing install files
infile.[edr|ccss|ccmx] EDR file(s) to translate and install or CCSS or CCMX files to install
If no file is provided, oeminst will look for the install CD.
more info can be found here https://www.argyllcms.com/doc/oeminst.html
So far I tried this code:
C:\Users\PC>oeminst infile. [C:\Users\PC\testfile.edr]
oeminst: Error - Unable to load file 'infile [C:\Users\PC\testfile]'
I'd appreciate it if someone could at least tell me whether I'm doing it right or not.
P.S. sorry for the messed up text. Not sure how to fix it. It looks good in editing mode.
Try this: oeminst infile.edr C:\Users\PC\testfile.edr
Never mind, I got it.
C:\Users\PC>oeminst C:\Users\PC\testfile.edr
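If you only want the converted .CCSS written to the current directory instead of installed, the -c option from the usage summary above should do that (an untested guess based purely on the quoted usage):
C:\Users\PC>oeminst -c C:\Users\PC\testfile.edr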
I am using Pentaho CE 5 on Windows. I would like to use CTools, but I can't make them show up in the File -> New menu.
Being behind a proxy, I cannot use the Marketplace plugin, so I have tried a manual installation.
First, I tried to use ctools-installer.sh. I ran the following command line in Cygwin (wget and unzip are installed):
./ctools-installer.sh -s /cygdrive/d/Users/[user]/Mes\ Programmes/pentaho/biserver-ce/pentaho-solutions/ -w /cygdrive/d/Users/[user]/Mes\ programmes/pentaho/biserver-ce/tomcat/webapps/pentaho/
The script starts, asks me what module I want to install, and begins the downloads.
For each module, I get output like this (set -x added to the script):
+ echo -n 'Downloading CDF...'
Downloading CDF...
+ wget -q --no-check-certificate 'http://ci.analytical-labs.com/job/Webdetails-CDF-5-Release/lastSuccessfulBuild/artifact/bi-platform-v2-plugin/dist/zip/dist.zip' -O .tmp/cdf/dist.zip
SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
+ '[' '!' -z '' ']'
+ rm -f .tmp/dist/marketplace.xml
+ unzip -o .tmp/cdf/dist.zip -d .tmp
End-of-central-directory signature not found. Either this file is not a zipfile, or it constitutes one disk of a multi-part archive. In the latter case the central directory and zipfile comment will be found on the last disk(s) of this archive.
unzip: cannot find zipfile directory in .tmp/cdf/dist.zip, and cannot find .tmp/cdf/dist.zip.zip, period.
+ chmod -R u+rwx .tmp
+ echo Done
Done
Then the script ends. I have seen on this page (pentaho-bi-suite) that this is the normal output. Nevertheless, it seems a bit strange to me, and when I start my Pentaho server (login: admin/password), I cannot see any new tools in the menus.
After looking at a few other tutorials and at the script itself, I downloaded the .zip snapshots for every tool and unzipped them in the system directory of my Pentaho server. Same result.
I would like to make the .sh script work; what can I try or adjust?
Thanks
EDIT 05/06/2014
I checked the dist.zip files downloaded by the script and they are all empty. It seems that wget cannot fetch the zip files, and therefore the installation fails.
When I try to get any webpage through wget, it fails. I think it is because of the proxy.
Here is my .wgetrc file, located in my user's cygwin home folder:
use_proxy=on
http_proxy=http://[url]:[port]
https_proxy=http://[url]:[port]
proxy_user=[user]
proxy_password=[password]
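For reference, wget can also pick up the proxy from environment variables; a sketch using the same placeholders, in case the .wgetrc is not being read:
export http_proxy=http://[user]:[password]@[url]:[port]
export https_proxy=http://[user]:[password]@[url]:[port]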
How could I make this work?
EDIT 10/06/2014
In the end, I changed my network connection settings to bypass the proxy. It seems that there is an offline mode for the installer, so one can download all the needed files in a proxy-free environment and then run the script offline.
I guess this is related to the -r option.
I consider this post solved, since it is not a CTools issue anymore.
It is difficult to identify the issue in the above procedure, but you can refer to this blog; its author is a key member of Pentaho itself.
You can manually install the components from http://www.webdetails.pt/ctools/ or, if you have Pentaho 5.1 or above, add the following parameters to the CATALINA_OPTS option (in start-pentaho.bat or start-pentaho.sh):
-Dhttp.proxyHost= -Dhttp.proxyPort= -Dhttp.nonProxyHosts="localhost|127.0.0.1|10.*.*.*"
http://docs.treasuredata.com/articles/pentaho-dataintegration#tips-how-can-i-use-pentaho-through-a-proxy
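A hedged sketch of what that could look like in start-pentaho.sh (proxy.example.com and 8080 are made-up values, purely for illustration):
# append the proxy settings to CATALINA_OPTS before Tomcat starts
CATALINA_OPTS="$CATALINA_OPTS -Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttp.nonProxyHosts=localhost|127.0.0.1|10.*.*.*"
export CATALINA_OPTS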
I'm trying to install a Joomla site in Parallels Plesk Panel via Akeeba Backup, and I'm facing a file permission issue.
An error occurred:
Could not open /var/www/vhosts/xyz.com/httpdocs/pearl_new/jquery.min.js for writing.
I searched all over, including the Plesk forum, and found that this is a very common problem. Some suggested that installing mod_suphp can solve it. I tried, but I don't know whether it was installed successfully or not.
Then I created a new service plan in which, under the hosting parameters, I selected Run PHP as FastCGI.
After that I moved my domain to that service plan. I thought it would solve the problem, but I am still getting the same error. Can anyone help, please?
On the SSH command line, try:
find /var/www/vhosts/xyz.com/httpdocs/ -type f -exec chmod 664 {} \;
find /var/www/vhosts/xyz.com/httpdocs/ -type d -exec chmod 775 {} \;
These will set the correct permissions so that files (f) and directories (d) are writable by the user and group. You also need to make sure that apache is in the psacln and psaserv groups in the /etc/group file; the lines should look like this:
psaserv:x:504:apache,psaftp,psaadm
psacln:x:505:apache
Then you can run the command:
chown -R siteusername.psacln /var/www/vhosts/xyz.com/httpdocs/*
where "siteusername" is the username of the site's files.
Hope this helps.
This is a common issue on Linux for users with shared hosting.
It is quite simple to fix.
If you have already selected the PHP module with FastCGI, follow these steps:
1. Open File Manager.
2. Make a new folder, "abc".
3. Click "ALL" on the right side to view all files in the tree.
4. Select all files and folders except "plesk-stats".
5. Click the Copy/Move button.
6. In the path field, type /httpdocs/abc/ and click Move.
7. Once all files have been moved, open the "abc" folder.
8. Select all files and folders.
9. Click the Copy/Move button.
10. In the path field, type /httpdocs/.
That's it; the issue should be sorted out.
I tried these steps for many clients.
I hope this helps someone.
Does anybody know the syntax for the wget command on Windows? I tried its basic syntax, but the problem is that the file gets downloaded into the directory in which I opened the command prompt. I want to know whether we can explicitly specify the destination in the command. If possible, please let me know; that would be very helpful.
If anyone reading this wants to save files downloaded to a directory, use "-P".
Example:
wget LINKHERE -P %USERPROFILE%/Downloads
This saves whatever is served by your link to C:\Users\username\Downloads.
According to the manual, -O, --output-document=FILE writes documents to FILE.
So you must give a file name preceded by a valid directory, like this:
C:\cronjobs>wget -q -O C:\Users\Public\Documents\tmp1.txt "http://google.com/"
Note: the -q option means quiet, while -O saves the output to the given file name, and it will work!
Sure you can.
Use the -O option together with the path you want.
I've just tested this with:
C:\users\julien>wget google.com -O "C:\here.html"
And "here.html" was google's index page on the root of my "C:" drive