[ale] Dealing with really big log files....
Michael B. Trausch
mike at trausch.us
Mon Mar 23 14:41:30 EDT 2009
On Mon, 23 Mar 2009 12:29:02 -0600
JK <jknapka at kneuro.net> wrote:
> Yeah but... who cares? You can just trim any partial lines from
> the front and back of the resulting file. And doing a binary search
> (manually if need be) for the interesting chunk is probably quicker
> than scanning through 100 GB of junk.
Possibly. It depends on how heavily loaded the system is from an I/O
standpoint---a sequential scan gets the benefit of readahead caching,
and that advantage doesn't seem like much until the system is really
heavily bogged down. I wasn't assuming this work was being done on a
lightly-loaded desktop machine. On a very heavily loaded machine a
binary search would really suck, since you'd likely be I/O bound at
every iteration.
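For what it's worth, the "binary search, trimming partial lines" idea
is easy to sketch: seek to the middle of the file, discard the partial
line you landed in, read one full line, and compare its key. A rough
Python sketch, assuming lines are sorted by whatever key_of() extracts
(true of most timestamped logs); the function name and signature are
mine, not from any existing tool:

```python
import io

def first_match_offset(f, target, key_of):
    """Return the byte offset of the first line whose key >= target,
    or the end-of-file offset if no line matches. f is a seekable
    binary file; lines must be sorted by key."""
    # The loop below only ever examines lines that begin *after* some
    # offset, so check the very first line separately.
    f.seek(0)
    first = f.readline()
    if not first or key_of(first) >= target:
        return 0
    f.seek(0, io.SEEK_END)
    lo, hi = 0, f.tell()
    while lo < hi:
        mid = (lo + hi) // 2
        f.seek(mid)
        f.readline()              # discard the partial line we landed in
        line = f.readline()       # first full line after offset mid
        if not line or key_of(line) >= target:
            hi = mid
        else:
            lo = mid + 1
    f.seek(lo)
    f.readline()                  # re-skip the partial line
    return f.tell()
```

Each iteration costs one seek and two short reads, so on a quiet
machine it touches a few dozen blocks of a 100 GB file instead of all
of them---but, per the above, every one of those reads is a cold
random I/O on a busy box.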
Anyway, tomato/tomahto. They're both valid approaches, but the right
one would (as is always the case) depend on more variables than were
ever discussed in the first place.
--- Mike