About really big log files


#1

I am wondering about the file format for SmartInspect logs. It is a binary format, but it is not a database, is it?
If many log sources log to the same log file, perhaps through backlogging, the file may grow rather fast. I then want to search and filter certain events from it quickly. What performance can I expect when the log file is, say, 2-3 GB? If the file is not treated like a database with an index, I guess you must do a linear scan of the file, which is slow when there is a lot of data.

Regards
Roland Bengtsson


#2

Hello Roland,

It is correct that the log packets are stored in a simple sequential format in the log file and not in a database-like format. The SmartInspect Console currently loads the entire log file into memory, so a 2-3 GB file might be a problem if your machine cannot handle this amount of data.

What you can do in those cases is to automatically split log files into multiple parts with the date-based log rotation functionality of the SmartInspect libraries and then process each log on its own. Other options include using the backlog functionality or custom filtering in your code (with the Filter event of the SmartInspect class) to reduce the amount of logging data directly in your application.
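As a sketch, date-based rotation is configured through the connections string of the SmartInspect libraries. The exact option names and values shown here (`rotate`, `maxparts`) are given from memory and should be checked against the current SmartInspect documentation:

```
file(filename=c:\logs\app.sil, rotate=daily, maxparts=14)
```

With a setting like this, a new log part is started every day and only the newest 14 parts are kept, so each individual file stays small enough for the Console to load comfortably.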

Once a log file has been loaded, the Console passes the packets to the available views and these views then decide if the individual packets should be displayed. So yes, the packets are processed linearly. But we tested the SmartInspect Console with hundreds of thousands of log entries and it performs very well even with such large amounts of data.
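The linear processing described here can be pictured with a small sketch (a simplified model for illustration only; the real .sil format is binary and packet-based):

```python
def filter_packets(packets, predicate):
    """Sequential scan over log packets, as the Console views do.

    Without an index, every packet has to be visited, so filtering
    cost grows linearly with the size of the log.
    """
    return [p for p in packets if predicate(p)]

# Example: keep only error-level entries.
packets = [
    {"level": "error", "title": "disk full"},
    {"level": "info", "title": "started"},
]
errors = filter_packets(packets, lambda p: p["level"] == "error")
```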


#3

Thanks for your answer. I am just trying to compare SmartInspect with our current logging solution. Currently we have log calls to a log class in many places in the source. In case of an exception, the stack is also dumped using JCL. When the log file (which is just text) reaches 1 MB, a new log file is created with the application name + date in the filename. Those log files are stored in a log directory locally on each server. We have 8 servers, so my thought would be to collect all logs in one central place (using TCP/IP) and make it easy to filter the events that are interesting. If the logs appear in many files, that's fine.

But my question is: can I filter across many log files at the same time on criteria like date, application, event type, etc.?

Regards Roland


#4

Hello Roland,

Yes, this is possible. You can load multiple logs into the Console and use the same filters and views for the concatenated log. For the central logging place, you can use the SmartInspect Receiver application which is available in our downloads section.

http://www.gurock.com/downloads/

The Receiver is a simple service application which accepts logging data from multiple applications via TCP/IP and stores it in log files. For each client, a separate log file is created. We have an article in our article section about the Receiver which explains the details.

http://www.gurock.com/products/smartinspect/articles/
Implementing a Simple Server for the SmartInspect Libraries