How to parse many pcap files in C or Python - pcap

I have 10,000 pcap files and want to parse them to analyze the 10 packets written in each file. I'd like to extract details such as the time between each packet being sent and the message written in each packet.
The thing is, I have no idea how to parse them. Could somebody give me an idea of how to parse the 10,000 files and extract that information?
Thank you in advance
Rodolfo Heron

You can look at this site; be patient while searching for what you need. Good luck.
https://docs.python.org/3.5/library/index.html
You can explore the available components with the dir function.
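To make that more concrete, here is a minimal sketch of the sort of loop you could write in Python, assuming the third-party dpkt library is installed (scapy would work similarly); the glob pattern and the decoding step are placeholders to adapt to your own files:

import glob
import dpkt

for path in glob.glob("captures/*.pcap"):        # placeholder pattern for the 10,000 files
    with open(path, "rb") as f:
        prev_ts = None
        for ts, buf in dpkt.pcap.Reader(f):      # yields (timestamp, raw packet bytes)
            if prev_ts is not None:
                print(path, "time since previous packet:", ts - prev_ts, "seconds")
            prev_ts = ts
            # buf holds the raw bytes of the packet (the "message");
            # e.g. dpkt.ethernet.Ethernet(buf) would decode it further if needed

The same loop can be written in C against libpcap (pcap_open_offline followed by pcap_next_ex per packet), but for a one-off analysis of 10,000 small files the Python version is usually quicker to get working.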

Related

Convert abf to atf file type. Where should I start?

I want to preface this question by first saying I am not 100% sure this is the correct place to ask.
Basically, I want to take files with the .abf (Axon binary file) extension and convert them to .atf (Axon text file). I was wondering if I could simply run a script that converts binary to text, or if it would be more complicated than that.
Right now I'm making a script that takes files with the .abf extension and feeds them into the Clampex9.2 program in order to save them as .atf files. However, I feel like there must be a better way than manually feeding the files into a program and then resaving them with the correct extension.
Again, if this is not the right forum for this type of question, I apologize but thank you in advance if anyone can help me with this problem!
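If a scripted route is acceptable, one possible approach (separate from Clampex) is to read the .abf with the third-party pyabf library and write the sweeps out as tab-separated text yourself. This is only a sketch under assumptions: the filenames are placeholders, and the output is a minimal tab-separated table rather than the exact ATF header Clampex produces, so check the ATF spec before relying on it.

import pyabf  # third-party reader for Axon binary files; assumes it supports your ABF version

abf = pyabf.ABF("example.abf")                # placeholder input filename
with open("example.atf", "w") as out:         # placeholder output filename
    out.write("Time (s)\tSignal ({})\n".format(abf.sweepUnitsY))
    for sweep in range(abf.sweepCount):
        abf.setSweep(sweep)
        for t, y in zip(abf.sweepX, abf.sweepY):
            out.write("{}\t{}\n".format(t, y))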

Perl and Wireshark export file dialog

I am interested in opening a capture file in Wireshark and then exporting the data in "C arrays" format (Wireshark provides that option in its GUI; one can do it via "File->Export->as C arrays file" from the main menu). My question is: how can I do this in Perl? Can someone help me with a script for this?
I would like to parse each and every packet of the Wireshark capture, so I thought I would first convert each packet to an array and then parse it. Do you have any suggestions on this? My capture consists entirely of IEEE 802.11 frames.
If you want to do all the parsing yourself, i.e. look at the raw packet data, I would suggest writing your own program using libpcap to read pcap-format capture files (on UN*X, libpcap 1.1.0 and later can also read pcap-ng-format capture files, which is what Wireshark 1.8.0 and later write by default). No need to write stuff out as C arrays.
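If the C-arrays export is only an intermediate step, you can also read the raw bytes straight from the pcap file and print them in a C-array-like form yourself. The sketch below uses Python with the third-party dpkt library rather than Perl (Perl's Net::Pcap would follow the same shape), and the output merely mimics Wireshark's C-array export rather than reproducing it exactly; the capture filename is a placeholder.

import dpkt

with open("capture.pcap", "rb") as f:         # placeholder capture file
    for i, (ts, buf) in enumerate(dpkt.pcap.Reader(f)):
        # print each packet as a C array initializer, roughly like Wireshark's export
        hex_bytes = ", ".join("0x{:02x}".format(b) for b in buf)
        print("unsigned char pkt{}[{}] = {{ {} }};".format(i, len(buf), hex_bytes))

dpkt also ships decoders for 802.11 frames (dpkt.ieee80211) if you want to go beyond the raw bytes.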

Best way to get a database-friendly list of Veterans Affairs hospitals

I sincerely apologize if this isn't the proper forum to discuss this, but I wasn't sure where to go or what would be the best option.
Basically, I'm trying to find a database-friendly list of Veterans Affairs hospitals. The closest thing that I've been able to find is www.va.gov/ofcadmin/docs/CATB.pdf, as it has all the information I'm looking for:
Region
Address
City in a separate column
Zip Code in a separate column
State
Facility # (also known as StationID)
VISN
Symbol
I've tried exporting that PDF to CSV, but it's a complete nightmare to get working. So I was curious whether anyone had any ideas or insights into how I could accomplish this task.
First, here's a CSV file containing the data found in CATB.pdf. The very first line contains the column headers, and the rest of the file contains the contents.
http://tmp.alexloney.com/CATB.csv
Now, for the more detailed explanation... I took the PDF you linked to, converted it to an HTML document using Adobe Acrobat, and then used a lot of regular expressions to parse the file and clean it up. Once the file was cleaned up enough, I was able to write a program to parse the remainder of the file, grab the state and region, and spit it all out in a nicely formatted CSV.
Hope that helps you!
I believe that PDFill has an option that will convert a PDF file to Excel. Once in Excel, you should have no problem converting to a CSV file.
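As an alternative to the Acrobat-plus-regular-expressions route, a table-extraction library can sometimes pull the columns out directly. Here is a hedged Python sketch using the third-party pdfplumber library; whether it copes with the layout of CATB.pdf is something you would have to test, and the filenames are placeholders.

import csv
import pdfplumber

with pdfplumber.open("CATB.pdf") as pdf, open("CATB.csv", "w", newline="") as out:
    writer = csv.writer(out)
    for page in pdf.pages:
        table = page.extract_table()   # returns rows as lists of cell strings, or None
        if table:
            writer.writerows(table)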

smlnj rephrased question for listdir(filename, directoryname)

I am a newbie learning SML, and the question I've been given involves I/O functions that I don't understand even after reading about them. Here are the two questions I really need help with to get me started. Please provide me with code and some explanation; with that I will be able to work through the other questions by trial and error.
Q1) listdir(filename, directoryname): given the name of a directory, list its contents in a text file. The listing should be in a form that makes it easy to separate filenames, dates and sizes from each other (similar to what MS-DOS does with "dir", but instead of just printing the listing, it places all the files and details into a text file).
Q2) readlist(filename): reads a list of filenames (each of which was produced by listdir in Q1) and combines them into one large list (i.e. reads from the text files produced in Q1 and puts their contents into one big list containing all the information).
Thing is, I have only covered the introductory material with the lecturer at school; there isn't even a system input or output example shown, and not even the "use file" function has been taught. If anyone who knows SML sees this, please help. Thanks to anyone who takes the effort to help me.
Thanks for the reply. Currently I am using SML/NJ to try and do this. Basically, Q1 requires me to list the files of the given "directoryname" into a text file named "filename", and Q2 requires me to read from the "filename" text file and then place the contents into one large list.
Duplicate of: smlnj listdir
As a hint I will say that you have to make use of these functions:
OS.FileSys.openDir(directoryname) - this will open a directory stream for you (Q1)
TextIO.openOut(filename) - this will open the file for writing the listing (Q1)
TextIO.openIn(filename) - this will open the file for reading (Q2)
If you are stuck and don't know how to write the programs, then I will post the full code here, but I suggest you first give it a try.
zubair sheikh
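To make those hints concrete without handing over the SML answer, here is the same two-step idea sketched in Python rather than SML: list a directory into a text file, then read listing files back into one big list. In the SML version, OS.FileSys.openDir/readDir take the place of the first loop and TextIO.openIn/inputLine the place of the second; the field layout below is just one reasonable choice, not the one your lecturer expects.

import os

def listdir(filename, directoryname):
    # Q1: write one "name<TAB>size<TAB>modification time" line per directory entry
    with open(filename, "w") as out:
        for name in os.listdir(directoryname):
            st = os.stat(os.path.join(directoryname, name))
            out.write("{}\t{}\t{}\n".format(name, st.st_size, st.st_mtime))

def readlist(filenames):
    # Q2: read several listing files produced by listdir and combine them into one list
    combined = []
    for fname in filenames:
        with open(fname) as f:
            combined.extend(line.rstrip("\n") for line in f)
    return combined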

Efficient file reading techniques in C#

I am looking for the best way to read a CSV file line by line, and I want to know the most efficient ways of doing it. I am particularly concerned about cases where the file is large. Are the file-reading utilities available in the .NET classes the most efficient?
(PS: I have searched for the word 'efficient' to check whether someone has already posted a similar question before posting this.)
You can try this: fast CSV Parser.
Here's the benchmark result:
To give more down-to-earth numbers: with a 45 MB CSV file containing 145 fields and 50,000 records, the reader was processing about 30 MB/sec. So all in all, it took 1.5 seconds! The machine specs were P4 3.0 GHz, 1024 MB.
The file handling in .NET is fine. If you want to read a line at a time in a comfortable way using foreach, I have a simple LineReader class you might want to look at (with a more general purpose version in MiscUtil).
With that you can use:
foreach (string line in new LineReader(file))
{
// Do the conversion or whatever
}
It's also easily usable with LINQ. You may or may not find this helpful - if the accepted solution works for you, that may be a better "off the shelf" solution.