Yes - that's the idea. Even if you capture some really large trace files
that take too long to load, you can use editcap to split the file into
smaller pieces. Type editcap -h for more information; the basic syntax is:

editcap -c 100000 <capturefilein> <capturefileout>

Where <capturefilein> is the name of your really large capture file and
<capturefileout> is the starting name of your output files. Editcap will
number the output files for you. 
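For example, to split a large trace into 100,000-packet chunks (the filenames here are just placeholders for your own files):

```shell
# Split big.pcap into output files of 100000 packets each.
# editcap numbers the output files automatically, so you end up
# with split_00000_... , split_00001_... , and so on.
editcap -c 100000 big.pcap split.pcap
```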

Laura
[EMAIL PROTECTED]

This message is intended only for the use of the addressee and may contain
information that is privileged and confidential. If you are not the intended
recipient, you are hereby notified that any use and/or dissemination of this
communication is strictly prohibited. If you have received this
communication in error, please delete all copies of the message and its
attachments and notify the sender immediately.

  _____  

From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of ARAMBULO, Norman
R.
Sent: Thursday, January 18, 2007 7:18 PM
To: Wireshark-Users (E-mail); Tcpdump-Workers (E-mail);
Tcpdump-Workers-Owner (E-mail)
Subject: Re: [Wireshark-users] Help on tcpdump or dumpcap
Importance: High

 

Thanks for the response. Yup, I know that Wireshark or Ethereal can't handle
large amounts of data, but are tcpdump and dumpcap capable of handling such
data? Can we use them to capture a large amount of data and save it to
multiple files for Tshark or Tethereal to post-process? Pls advise and thanks
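Both tools can indeed write a capture across multiple files. A sketch, assuming interface eth0 and hypothetical output names:

```shell
# tcpdump: rotate to a new file every ~100 MB (-C, in millions of bytes),
# keeping at most 10 files as a ring buffer (-W)
tcpdump -i eth0 -C 100 -W 10 -w capture.pcap

# dumpcap: equivalent ring buffer; -b filesize is in kB,
# -b files caps the number of files kept
dumpcap -i eth0 -b filesize:100000 -b files:10 -w capture.pcap
```

The resulting files can then be read individually with tshark -r.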


  
  



 "Reality is merely an illusion, albeit a very persistent one."

                                                                -- Albert Einstein

 


_______________________________________________
Wireshark-users mailing list
Wireshark-users@wireshark.org
http://www.wireshark.org/mailman/listinfo/wireshark-users
