Snort pcap analysis with ET rules - snort

I am trying to detect malicious behavior in pcap by using snort and Emerging Threat open rules. Here is what I did:
Installed snort 2.9.6.0
Downloaded https://rules.emergingthreats.net/open/snort-2.9.0/emerging.rules.tar.gz and unpacked to /etc/snort/rules
Added an import "include $RULE_PATH/emerging.conf" to snort.conf
Uncommented all rules in emerging.conf
When I run snort via:
snort -r pcap -c /etc/snort/snort.conf
I do not see any alerts in the output. I know that the pcap contains malicious traffic and that there are matching rules for it. What is the missing piece here?

Found it. The missing piece is to add:
-A console
which makes Snort print alerts to stdout instead of only writing them to the configured log outputs.
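For reference, the full invocation then becomes (using the same capture file name, pcap, as in the question):

```shell
# -r reads packets from a capture file instead of a live interface,
# -c loads the configuration (which pulls in the ET rules),
# -A console prints alerts directly to stdout.
snort -r pcap -c /etc/snort/snort.conf -A console
```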

Related

Pktgen error - Invalid PCAP filename, must include port number as P:filename

I'm trying to run Pktgen and record packet data to a pcap file. When I'm running this command:
./usr/local/bin/pktgen --no-telemetry -l 4,6,8 -n 4 -a 0000:03:02.0 -m 1024 -- -T -P -m [6:8].0 -s 0:pcap/captured_packets.pcap
It returns the following error message:
_pcap_open: failed to read the file header
!ERROR!: Invalid PCAP filename (0:pcap/captured_packets.pcap) must include port number as P:filename
What is wrong with the command? It seems like the syntax of the command is correct, so maybe it has something to do with the ports? If I run it without -s 0:pcap/captured_packets.pcap, Pktgen works as it should.
I have also checked out a previous thread on this (Sending pcap file via packetgen dpdk), but it didn't give any further input into how to solve the issue.
For pktgen, the -s <port>:<pcap file> option is only used for playback. It has a couple of drawbacks, including that it doesn't honor pcap timestamps and will just play at whatever rate is configured by the user. The other drawback is that I believe pktgen does not send jumbo packets in their entirety, but will limit any sent packet to the size of the payload section of a single rte_mbuf.
If you're interested in using DPDK to record pcaps, you may look into dpdk-pdump and dumpcap.
I've also had good performance in the past when building my own packet recorder, especially since I know I can program it to receive and record any jumbo packets and run across any number of queues and threads.
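If dpdk-pdump is an option, a minimal sketch looks like this (the port number and output path are assumptions, and the primary DPDK application must already be running with the pdump framework enabled):

```shell
# Attach to port 0 of the running DPDK primary process as a secondary
# process; queue=* captures all RX queues into one pcap file.
sudo dpdk-pdump -- --pdump 'port=0,queue=*,rx-dev=/tmp/capture.pcap'
```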

Unable to install LMD on CentOS 7.9.2009 (core)

Can someone please help me with this? I'm attempting to follow the below guide on installing LMD (Linux Malware Detect) on CentOS.
https://www.tecmint.com/install-linux-malware-detect-lmd-in-rhel-centos-and-fedora/
The issue that I am having is that whenever I attempt to use "wget" on the specified link to LMD, it always pulls an HTML file instead of a .gz file.
Troubleshooting: I've attempted HTTPS instead of HTTP, but that results in an "unable to establish SSL connection" error message (see below). I've already looked around the internet for other guides on installing LMD on Cent and every one of them advised to "wget" the .gz at the below link. I'm hoping that someone can help me to work through this.
http://www.rfxn.com/downloads/maldetect-current.tar.gz
SSL error below
If you need further information from me, please let me know. Thank you.
Best,
B
wget --spider: (screenshots of the wget --spider output omitted)
This is interesting: you requested the asset from http://www.rfxn.com but were ultimately redirected to https://block.charter-prod.hosted.cujo.io, which seems to be a page with text like
Let's stop for a moment
This website has been blocked as it may contain inappropriate content
I am unable to fathom why exactly this happened, but it probably has something to do with your network, as I ran wget --spider and it did detect the file (1.5M) [application/x-gzip].
You replaced http with https in the command. Try wget as it is mentioned in the manual:
wget http://www.rfxn.com/downloads/maldetect-current.tar.gz
Here is what I get with --spider option:
# wget --spider http://www.rfxn.com/downloads/maldetect-current.tar.gz
Spider mode enabled. Check if remote file exists.
--2022-07-06 22:04:57-- http://www.rfxn.com/downloads/maldetect-current.tar.gz
Resolving www.rfxn.com... 172.67.144.156, 104.21.28.71
Connecting to www.rfxn.com|172.67.144.156|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1549126 (1.5M) [application/x-gzip]
Remote file exists.
It was my ISP. They had router-based software preventing Linux extra-network commands from getting past the gateway.

how to read and automatically analyse a pcap from STDIN

I'm about to build an automatic intrusion detection system (IDS) behind my FritzBox router in my home LAN.
I'm using a Raspberry Pi with Raspbian Jessie, but any distribution would be OK.
After some searching and trying things out I found ntop (ntopng to be honest, but I guess my question applies to any version).
ntop can capture network traffic on its own, but that's not what I want, because I want to get all the traffic without putting the Pi between the devices or letting it act as a gateway (for performance reasons). Fortunately my FritzBox OS has a function to simulate a mirror port: you can download a .pcap which is continuously written in realtime. I do it with a script from this link.
The problem is that I can't pipe the wget download to ntop like I could do it with e.g. tshark.
I'm looking for:
wget -O - http://fritz.box/never_ending.pcap | ntopng -f -
While this works fine:
wget -O - http://fritz.box/never_ending.pcap | tshark -i -
Suggestions of other analyzing software is ok (if pretty enough ;) ) but I want to use the FritzBox-pcap-thing...
Thanks for saving another day of mine :)
Edit:
So I'm coming to these approaches:
Make chunks of pcaps and run a script to analyse each pcap one after another. Problem: ntop does not merge the results, and I could get a storage problem if traffic runs hot.
Pipe wget to tshark and overwrite one pcap every time, then analyse it with ntop. Problem: the storage, again.
Pipe wget to tshark, cut some information out and store it in a database. Problem: which info should I store, and what program likes DBs more than pcaps?
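The chunking idea in the first approach can be sketched with tshark's ring buffer, which may sidestep the storage problem (the chunk size and file count below are arbitrary choices):

```shell
# Read the endless pcap stream and write it out as a ring of at most
# 10 files of roughly 100 MB each (filesize is given in kB). Old
# chunks are overwritten, so disk usage stays bounded, and a second
# script can feed each finished chunk to ntop.
wget -O - http://fritz.box/never_ending.pcap | \
  tshark -i - -w /tmp/chunk.pcap -b filesize:100000 -b files:10
```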
The -i option in tshark is to specify an interface, whereas the -f option in ntop is to specify a name for the dump-file.
In ntopng I didn't even know there was a -f option!?
Does this solve your problem?

Implement Intrusion Prevention System from SNORT IDS

I have currently installed Snort 2.9.0.4 on Fedora 14. The Snort IDS mode is running perfectly; now I want to implement an IPS on top of the Snort IDS. I am completely new to the Linux environment.
Configure the Snort config file so that logs are written to a SQL database (create the database in MySQL first) and use BASE (Basic Analysis and Security Engine); that way you can view all logs in a customized manner.
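For Snort 2.x the database logging is a single line in snort.conf; the credentials and database name below are placeholders, and the Snort MySQL schema has to be created beforehand:

```
output database: log, mysql, user=snort password=changeme dbname=snort host=localhost
```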

Slow wget speeds when connecting to https pages

I'm using wget to connect to a secure site like this:
wget -nc -i inputFile
where inputFile consists of URLs like this:
https://clientWebsite.com/TheirPageName.asp?orderValue=1.00&merchantID=36&programmeID=92&ref=foo&Ofaz=0
This page returns a small gif file. For some reason, this is taking around 2.5 minutes. When I paste the same URL into a browser, I get back a response within seconds.
Does anyone have any idea what could be causing this?
The version of wget, by the way, is "GNU Wget 1.9+cvs-stable (Red Hat modified)"
I know this is a year old, but this exact problem plagued us for days.
It turned out to be our DNS server, but I got around it by disabling IPv6 on my box.
You can test it before making the system change by adding "--inet4-only" to the command (without quotes).
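Applied to the original command (with the same inputFile), that test looks like:

```shell
# --inet4-only forces IPv4 and skips the slow or failing IPv6 path.
wget --inet4-only -nc -i inputFile
```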
Try forging your user agent:
-U "Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-GB; rv:1.9.0.1) Gecko/2008070206 Firefox/3.0.1"
Disable certificate checking (the check is slow):
--no-check-certificate
Debug what's happening by enabling verbosity:
-v
Eliminate the need for DNS lookups: hardcode their IP address in your HOSTS file
/etc/hosts
123.122.121.120 foo.bar.com
Have you tried profiling the requests using strace/dtrace/truss (depending on your platform)?
There are a wide variety of issues that could be causing this. What version of openssl is being used by wget - there could be an issue there. What OS is this running on (full information would be useful there).
There could be some form of download slowdown being enforced due to the agent ID being passed by wget implemented on the site to reduce the effects of spiders.
Is wget performing full certificate validation? Have you tried using --no-check-certificate?
Is the certificate on the client site valid? You may want to specify --no-check-certificate if it is a self-signed certificate.
HTTPS (SSL/TLS) Options for wget
One workaround is to drop the https:// prefix, so that wget falls back to plain HTTP (note that you lose the encrypted connection by doing this).
This accelerated my download by around 100 times.
For instance, say you want to download via:
wget https://data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2
You can use the following command instead to speed things up:
wget data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2