
Thursday, May 07, 2009

firewall-wizards Digest, Vol 37, Issue 7

Send firewall-wizards mailing list submissions to
firewall-wizards@listserv.icsalabs.com

To subscribe or unsubscribe via the World Wide Web, visit
https://listserv.icsalabs.com/mailman/listinfo/firewall-wizards
or, via email, send a message with subject or body 'help' to
firewall-wizards-request@listserv.icsalabs.com

You can reach the person managing the list at
firewall-wizards-owner@listserv.icsalabs.com

When replying, please edit your Subject line so it is more specific
than "Re: Contents of firewall-wizards digest..."


Today's Topics:

1. Re: Handling large log files (Swaminathan, Gayathri)
2. Re: Handling large log files (Nate Hausrath)
3. Re: Handling large log files (david@lang.hm)
4. Re: Handling large log files (Marcus J. Ranum)
5. EUSecWest 2009 (May27/28) London Agenda and PacSec 2009 (Nov
4/5) CFP deadline: June 1 2009 (Dragos Ruiu)
6. Re: Handling large log files (hugh.fraser@arcelormittal.com)


----------------------------------------------------------------------

Message: 1
Date: Wed, 6 May 2009 08:57:38 -0500
From: "Swaminathan, Gayathri" <gayathri@ou.edu>
Subject: Re: [fw-wiz] Handling large log files
To: Firewall Wizards Security Mailing List
<firewall-wizards@listserv.icsalabs.com>
Message-ID:
<94189A6B8040C44DB83C0A154ECAC4095C4BADB544@XMAIL3.sooner.net.ou.edu>
Content-Type: text/plain; charset="us-ascii"

Hey Nate,

I have used syslog-ng along with Splunk, which improved log review immensely.

Splunk is free for indexing up to 500 MB/day.

good luck!
gayathri
________________________________________
From: firewall-wizards-bounces@listserv.icsalabs.com [firewall-wizards-bounces@listserv.icsalabs.com] On Behalf Of Nate Hausrath [hausrath@gmail.com]
Sent: Tuesday, May 05, 2009 5:41 PM
To: firewall-wizards@listserv.icsalabs.com
Subject: [fw-wiz] Handling large log files

Hello everyone,

I have a central log server set up in our environment that used to
receive around 200-300 MB of messages per day from various devices
(switches, routers, firewalls, etc.). With this volume, logcheck was
able to effectively parse the files and send out a nice email. Now,
however, the volume has increased to around 3-5 GB per day and will
continue growing as we add more systems. Unfortunately, the old
logcheck solution now spends hours trying to parse the logs, and even
if it finishes, it will generate an email that is too big to send.

I'm somewhat new to log management, and I've done quite a bit of
googling for solutions. However, my problem is that I just don't have
enough experience to know what I need. Should I try to work with
logcheck/logsentry in hopes that I can improve its efficiency more?
Should I use filters on syslog-ng to cut out some of the messages I
don't want to see as they reach the box?

I have also thought that it would be useful to cut out all the
duplicate messages and just simply report on the number of times per
day I see each message. After this, it seems likely that logcheck
would be able to effectively parse through the remaining logs and
report the items that I need to see (as well as new messages that
could be interesting).
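As a rough illustration (untested, and the timestamp/PID patterns below
are only guesses at our message formats), something like this Python
sketch is the kind of summarization I have in mind:

    import re
    import sys

    # Strip the syslog timestamp ("May  5 17:41:00 ") and any [pid]
    # so that repeated messages collapse down to a single key.  Both
    # patterns are placeholders, not our real formats.
    TIMESTAMP = re.compile(r'^\w{3}\s+\d+\s+\d\d:\d\d:\d\d\s+')
    PID = re.compile(r'\[\d+\]')

    counts = {}
    for line in sys.stdin:
        key = PID.sub('[]', TIMESTAMP.sub('', line.strip()))
        counts[key] = counts.get(key, 0) + 1

    # Most frequent messages first; logcheck would then only have to
    # parse the much smaller set of distinct messages.
    for msg, n in sorted(counts.items(), key=lambda kv: -kv[1]):
        print('%8d  %s' % (n, msg))

(Run as, say, "python summarize.py < /var/log/messages".)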

Are there other solutions that would be better suited to log volumes
like this? Should I look at commercial products?

Any comments/criticisms/suggestions would be greatly appreciated!
Please let me know if I need to provide more information. Again, my
lack of experience in this area makes me hesitant to make a solid
decision without asking for some guidance first. I don't want to
spend a lot of time going in one direction, only to find that I was
completely wrong.

Thanks!
Nate
_______________________________________________
firewall-wizards mailing list
firewall-wizards@listserv.icsalabs.com
https://listserv.icsalabs.com/mailman/listinfo/firewall-wizards


------------------------------

Message: 2
Date: Wed, 6 May 2009 09:54:26 -0400
From: Nate Hausrath <hausrath@gmail.com>
Subject: Re: [fw-wiz] Handling large log files
To: Firewall Wizards Security Mailing List
<firewall-wizards@listserv.icsalabs.com>
Message-ID:
<87e3982b0905060654l21886c8au910a8d230171d8e9@mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1

First, thanks for the great responses! Aside from the fact that we
need a beefier system (2x P3 1.4 GHz, 3 GB RAM, RAID-5... ouch), it
looks like I have a lot of work to do.

Also, thanks for providing some idea of the specs I will need to use
for a central log server. I believe our goal is to have around 300
servers sending logs (most of them should be less chatty than the
current ones). If you don't mind me asking, roughly how many servers
would it take to generate 1 GB of logs? I realize there really
isn't an accurate answer here, but I'm trying to get a rough ballpark
figure.

> Marcin wrote:
>
> - see if the architecture can be improved. Can you use multiple log
> servers? Is there
> a logical way of segmenting the log traffic - OS to box 1, db
> transactions to box 2, etc.?
> Post to the project's mailing list, there should be people who use it
> for larger installations,
> and willing/able to provide specific suggestions.

I'll see if this is an option. Along these lines, I'd eventually like
to be able to turn log messages into events and be able to correlate
them with other messages, IDS alerts, etc. I think that once I
compress the duplicates, and get rid of a lot of noise, I could
forward the results to an OSSIM box and use it for correlation,
alerts, etc.

> Paul wrote:
>
> What are you trying to achieve with your log analysis, as in, what
> sort of actions would the review of this daily log report trigger?
> Would you want to or should you move to a model where search/analysis
> is happening in near-real time instead of once daily? That's going to
> be helpful in knowing what kind of solution you should be looking at.
> Also, while it's overpowering your logcheck scripts, 5GB/day of log
> data is nothing when you're talking about firewall logs.
>
> PaulM

We are primarily looking for security-related events. Real-time
analysis/reporting of events is an eventual goal, but that seems a lot
more difficult to do in some regards. Initially, I'd like to at least
have a summary I can look at daily (probably along the lines of what
David posted below) and then I could transition to more real-time
analysis. Does that sound reasonable?

> David wrote:
>
> I don't like the idea of filtering out messages completely; the number of
> times that an otherwise 'uninteresting' message shows up can be significant
> (if the number of requests for a web image per day suddenly jumps to 100
> times what it was before, that's a significant thing to know)

Duly noted. Thanks!

>
> the key is to categorize and summarize the data. I have not found a good
> commercial tool to do this job (there are good tools for drilling down and
> querying the logs), the task of summarizing the data is just too site
> specific. I currently get 40-80G of logs per day and have a nightly process
> that summarizes them.

This is good to know as well. I'd like to avoid commercial tools if
possible to save money (although Splunk seems pretty darn useful).

>
> *Solid plan of attack from David*
>

Thanks for all the great information from everyone. I'll be jumping
into this today!

-Nate


------------------------------

Message: 3
Date: Wed, 6 May 2009 10:39:57 -0700 (PDT)
From: david@lang.hm
Subject: Re: [fw-wiz] Handling large log files
To: Firewall Wizards Security Mailing List
<firewall-wizards@listserv.cybertrust.com>
Message-ID: <alpine.DEB.1.10.0905060952140.15782@asgard>
Content-Type: TEXT/PLAIN; charset=US-ASCII; format=flowed

On Wed, 6 May 2009, Nate Hausrath wrote:

> First, thanks for the great responses! Aside from the fact that we
> need a beefier system (2x P3 1.4 GHz, 3 GB RAM, RAID-5... ouch), it
> looks like I have a lot of work to do.

raid 5 is not necessarily a problem.

one surprise I ran into when configuring my splunk systems is that for
read-only situations, raid 5/6 can be as fast as raid 0; the big overhead
of raid 5/6 is when you are writing data.

so what I do is have the incoming logs written to one disk (a pair of
mirrored drives), indexed there, and once all the work is done it gets
copied to the raid6 array, and that array is otherwise read-only.
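in rough terms (python sketch; the paths are invented and this leaves
out all the locking and error handling you'd want in real life), the
nightly roll looks something like:

    import os
    import shutil

    STAGING = '/staging/logs'   # mirrored pair: takes all the write traffic
    ARCHIVE = '/archive/logs'   # raid6 array: read-only once data lands here

    def roll_day(day):
        # move one finished, fully-indexed day of logs from the
        # write-heavy staging disks to the read-mostly raid6 array
        src = os.path.join(STAGING, day)
        dst = os.path.join(ARCHIVE, day)
        shutil.copytree(src, dst)   # raid6 takes its one-time write hit here
        shutil.rmtree(src)          # free the staging disks for new logs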

> Also, thanks for providing some idea of the specs I will need to use
> for a central log server. I believe our goal is to have around 300
> servers sending logs (most of them should be less chatty than the
> current ones). If you don't mind me asking, roughly how many servers
> would it take to generate 1 GB of logs? I realize there really
> isn't an accurate answer here, but I'm trying to get a rough ballpark
> figure.

this depends so much on your systems that any answer is pretty
meaningless.

in the absence of other information, I would just extrapolate from your
current systems.

>> Marcin wrote:
>>
>> - see if the architecture can be improved. Can you use multiple log
>> servers? Is there
>> a logical way of segmenting the log traffic - OS to box 1, db
>> transactions to box 2, etc.?
>> Post to the project's mailing list, there should be people who use it
>> for larger installations,
>> and willing/able to provide specific suggestions.
>
> I'll see if this is an option. Along these lines, I'd eventually like
> to be able to turn log messages into events and be able to correlate
> them with other messages, IDS alerts, etc. I think that once I
> compress the duplicates, and get rid of a lot of noise, I could
> forward the results to an OSSIM box and use it for correlation,
> alerts, etc.

this gets a lot harder than you think, but you don't necessarily need to
pre-filter the logs; the correlation engines are going to be doing regex
matching on the logs themselves.

>> David wrote:
>>
>> the key is to categorize and summarize the data. I have not found a good
>> commercial tool to do this job (there are good tools for drilling down and
>> querying the logs), the task of summarizing the data is just too site
>> specific. I currently get 40-80G of logs per day and have a nightly process
>> that summarizes them.
>
> This is good to know as well. I'd like to avoid commercial tools if
> possible to save money (although Splunk seems pretty darn useful).

you can do everything with free tools, it's just a matter of manpower ;-)

for nightly reports, you can use the plan I listed.

for alert generation and event correlation, look at SEC (simple event
correlator).

the part that is hard to do on the cheap is being able to efficiently
search the logs.

if you have an idea of what you are looking for ahead of time, you can
split the logs into different files for different types of events, then
just search the relevant subset, but if you don't anticipate things, you
end up needing to do a full-text search through your logs. Postgres does
have good full-text indexing capabilities, but as you grow you will get to
the point where it takes more than one machine to get an answer back in a
reasonable amount of time (just due to the fact that you have so much
index data to search through to find where to go for the real data), and
at that point you need some sort of clustered datastore. those aren't
cheap (even for the commercial version of postgres), and if you haven't
already figured out how to do this, there is a lot of value in buying one
of the commercial solutions that have that stuff more-or-less figured out
for you.
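as a trivial illustration of the pre-splitting idea (python, with
made-up categories and patterns; yours will depend entirely on your
devices):

    import re
    import sys

    # invented categories/patterns, purely for illustration
    CATEGORIES = [
        ('firewall', re.compile(r'DENY|DROP|ACCEPT')),
        ('auth',     re.compile(r'sshd|login|sudo')),
        ('network',  re.compile(r'OSPF|BGP|duplex|link')),
    ]

    outputs = dict((name, open('/var/log/split/%s.log' % name, 'a'))
                   for name, _pat in CATEGORIES)
    other = open('/var/log/split/other.log', 'a')

    for line in sys.stdin:
        for name, pattern in CATEGORIES:
            if pattern.search(line):
                outputs[name].write(line)
                break
        else:
            other.write(line)   # keep anything you didn't anticipate

a search for, say, ssh brute-forcing then only has to touch auth.log
instead of the whole day's logs.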

David Lang


------------------------------

Message: 4
Date: Wed, 06 May 2009 14:12:46 -0400
From: "Marcus J. Ranum" <mjr@ranum.com>
Subject: Re: [fw-wiz] Handling large log files
To: marcin@kajtek.org, Firewall Wizards Security Mailing List
<firewall-wizards@listserv.icsalabs.com>
Message-ID: <4A01D31E.5020406@ranum.com>
Content-Type: text/plain; charset=ISO-8859-1; format=flowed

In case anyone wants 'em, my old USENIX system logging
tutorial and notes are downloadable here:
http://www.ranum.com/security/computer_security/archives/logging-notes.pdf
It's a 100+ page book of everything I know/managed to figure out about
logging.

mjr.
--
Marcus J. Ranum CSO, Tenable Network Security, Inc.
http://www.tenablesecurity.com


------------------------------

Message: 5
Date: Wed, 6 May 2009 15:21:41 -0700
From: Dragos Ruiu <dr@kyx.net>
Subject: [fw-wiz] EUSecWest 2009 (May27/28) London Agenda and PacSec
2009 (Nov 4/5) CFP deadline: June 1 2009
To: firewall-wizards@honor.icsalabs.com
Message-ID: <200905061521.41686.dr@kyx.net>
Content-Type: text/plain; charset="iso-8859-1"

EUSecWest 2009 Speakers

Efficient UAK Recovery attacks against DECT
- Ralf-Philipp Weinmann, University of Luxembourg
A year in the life of an Adobe Flash security researcher
- Peleus Uhley, Adobe
Pwning your grandmother's iPhone
- Charley Miller, Independent Security Evaluators
Post-exploitation techniques on OS X and iPhone, and other TBA matters.
- Vincent Iozzo, Zynamics
STOP!! Objective-C Run-TIME.
- nemo
Exploiting Delphi/Pascal
- Ilja Van Sprundel, IOActive
PCI bus based operating system attack and protections
- Christophe Devine & Guillaume Vissian, Thales
Thoughts about Trusted Computing
- Joanna Rutkowska, Invisible Things Lab
Nice NIC you got there... does it come with an SSH daemon?
- Arrigo Trulzi
Evolving Microsoft Exploit Mitigations
- Tim Burrell & Peter Beck, Microsoft
Malware Case Study: the ZeuS evolution
- Vicente Diaz, S21Sec
Writing better XSS payloads
- Alex Kouzemtchenko, SIFT
Exploiting Firefox Extensions
- Roberto Suggi Liverani & Nick Freeman, Security-Assessment.com
Stored Value Gift Cards, Magstripes Revisited
- Adrian Pastor, Gnucitizen, Corsaire
Advanced SQL Injection to operating system control
- Bernardo Damele Assumpcao Guimaraes, Portcullis
Cloning Mifare Classic
- Nicolas Courtois, University of London
Rootkits on Windows Mobile/Embedded
- Petr Matousek, Coseinc


PacSec 2009 CALL FOR PAPERS

World Security Pros To Converge on Japan

TOKYO, Japan -- To address the increasing importance of information
security in Japan, the best known figures in the international
security industry will get together with leading Japanese researchers
to share best practices and technology. The most significant new
discoveries about computer network hack attacks will be presented and
discussed at the seventh annual PacSec conference.

The PacSec meeting provides an opportunity for foreign specialists to
be exposed to Japanese innovation and markets and collaborate on
practical solutions to computer security issues. In an informal
setting, with material translated into both English and Japanese,
the eminent technologists can socialize and attend training sessions.

Announcing the opportunity to submit papers for the PacSec 2009
network security training conference. The conference will be held
November 4/5th in Tokyo. The conference focuses on emerging
information security tutorials - it is a bridge between the
international and Japanese information security technology communities.

Please make your paper proposal submissions before June 1st, 2009.
Slides for the papers must be submitted for translation by October 1,
2009 (which so rarely happens on time that we are going to start asking
for them earlier :-P --dr).

Some invited papers have been confirmed, but a limited number of
speaking slots are still available. The conference is responsible for
travel and accommodations for the speakers. If you have a proposal for
a tutorial session, then please email a synopsis of the material and
your biography, papers, and speaking background to . Tutorials are
one hour in length, but with simultaneous translation should be
approximately 45 minutes, in English or Japanese. Only slides will be
needed for the October paper deadline; full text does not have to be
submitted.

The PacSec conference consists of tutorials on technical details about
current issues, innovative techniques and best practices in the
information security realm. The audiences are a multi-national mix of
professionals involved on a daily basis with security work: security
product vendors, programmers, security officers, and network
administrators. We give preference to technical details and education
for a technical audience.

The conference itself is a single track series of presentations in a
lecture theater environment. The presentations offer speakers the
opportunity to showcase on-going research and collaborate with peers
while educating and highlighting advancements in security products and
techniques. The focus is on innovation, tutorials, and education
instead of product pitches. Some commercial content is tolerated, but
it needs to be backed up by a technical presenter - either giving a
valuable tutorial and best practices instruction or detailing
significant new technology in the products.

Paper proposals should consist of the following information:

1) Presenter, and geographical location (country of origin/passport)
and contact info (e-mail, postal address, phone, fax).
2) Employer and/or affiliations.
3) Brief biography, list of publications and papers.
4) Any significant presentation and educational experience/background.
5) Topic synopsis, proposed paper title, and a one-paragraph
description.
6) Reason why this material is innovative or significant or an
important tutorial.
7) Optionally, any samples of prepared material or outlines that are
ready.
8) Will you have full text available or only slides?
9) Language of preference for submission.
10) Please list any other publications or conferences where this
material has been or will be published/submitted.

Please include the plain text version of this information in your
email as well as any file, pdf, sxw, ppt, or html attachments.

Please forward the above information to to be considered for
placement on the speaker roster.

cheers,
--dr

--
World Security Pros. Cutting Edge Training, Tools, and Techniques
London, U.K. May 27/28 2009 http://eusecwest.com
Tokyo, Japan November 4/5 2009 http://pacsec.jp
Vancouver, Canada March 22-26 2010 http://cansecwest.com
pgpkey http://dragos.com/ kyxpgp


------------------------------

Message: 6
Date: Wed, 6 May 2009 15:56:35 -0400
From: <hugh.fraser@arcelormittal.com>
Subject: Re: [fw-wiz] Handling large log files
To: <firewall-wizards@listserv.cybertrust.com>
Message-ID:
<EC1C66A282D0D44EAF927CB6058C8BC501732657@dof-mxb-qc01.hamilton.dofasco.ca>

Content-Type: text/plain; charset="us-ascii"

As others have mentioned in previous replies, we've used syslog-ng and
Splunk to manage firewall and switch event logs. But sometimes we've
wanted to detect behaviour or anomalies that can't be handled easily by
those tools. For these, I've used SEC (Simple Event Correlator), a Perl
script available from:

http://kodu.neti.ee/~risto/sec/

During the replacement of our campus network, when lots of inter-switch
dependency issues arose, we used it to alert us when a switch that hadn't
had any problems for the past 5 days reported an error, usually
indicating that something external had happened to affect it, or when
events appeared that were new in the past 5 days. We also used it to
identify things like links bouncing (down/up/down within a certain
period of time). The output of SEC was fed back into syslog-ng and
represented in Splunk as "synthetic" events, for which we had special
notification and reporting.

The goal of the process was to do exception reporting, allowing us to
collect all the events but only be notified when certain criteria
occurred.
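For the link-bouncing case, the underlying logic is just a sliding time
window; SEC expresses it as correlation rules, but stripped down to a
rough Python sketch (the message pattern and thresholds here are
invented for illustration) it amounts to:

    import re
    import sys
    import time

    WINDOW = 300      # seconds; arbitrary, for illustration only
    FLAP_LIMIT = 3    # "down" events within WINDOW that count as bouncing

    LINK_DOWN = re.compile(r'Interface (\S+), changed state to down')
    history = {}      # interface -> recent "down" timestamps

    for line in sys.stdin:
        m = LINK_DOWN.search(line)
        if not m:
            continue
        iface = m.group(1)
        now = time.time()
        recent = [t for t in history.get(iface, []) if now - t < WINDOW]
        recent.append(now)
        history[iface] = recent
        if len(recent) >= FLAP_LIMIT:
            # in our setup, a line like this is what gets fed back into
            # syslog-ng as a "synthetic" event for Splunk to report on
            print('SYNTHETIC: link %s bouncing (%d downs in %ds)'
                  % (iface, len(recent), WINDOW))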

-----Original Message-----
From: firewall-wizards-bounces@listserv.cybertrust.com
[mailto:firewall-wizards-bounces@listserv.cybertrust.com] On Behalf Of
Nate Hausrath
Sent: Tuesday, May 05, 2009 6:41 PM
To: firewall-wizards@listserv.cybertrust.com
Subject: [fw-wiz] Handling large log files

Hello everyone,

I have a central log server set up in our environment that used to receive
around 200-300 MB of messages per day from various devices (switches,
routers, firewalls, etc.). With this volume, logcheck was able to
effectively parse the files and send out a nice email. Now, however,
the volume has increased to around 3-5 GB per day and will continue
growing as we add more systems. Unfortunately, the old logcheck
solution now spends hours trying to parse the logs, and even if it
finishes, it will generate an email that is too big to send.

I'm somewhat new to log management, and I've done quite a bit of
googling for solutions. However, my problem is that I just don't have
enough experience to know what I need. Should I try to work with
logcheck/logsentry in hopes that I can improve its efficiency more?
Should I use filters on syslog-ng to cut out some of the messages I
don't want to see as they reach the box?

I have also thought that it would be useful to cut out all the duplicate
messages and just simply report on the number of times per day I see
each message. After this, it seems likely that logcheck would be able
to effectively parse through the remaining logs and report the items
that I need to see (as well as new messages that could be interesting).

Are there other solutions that would be better suited to log volumes
like this? Should I look at commercial products?

Any comments/criticisms/suggestions would be greatly appreciated!
Please let me know if I need to provide more information. Again, my
lack of experience in this area makes me hesitant to make a solid
decision without asking for some guidance first. I don't want to spend
a lot of time going in one direction, only to find that I was completely
wrong.

Thanks!
Nate
_______________________________________________
firewall-wizards mailing list
firewall-wizards@listserv.icsalabs.com
https://listserv.icsalabs.com/mailman/listinfo/firewall-wizards


------------------------------

_______________________________________________
firewall-wizards mailing list
firewall-wizards@listserv.icsalabs.com
https://listserv.icsalabs.com/mailman/listinfo/firewall-wizards


End of firewall-wizards Digest, Vol 37, Issue 7
***********************************************
