Parsing web log files
Parsing converts raw web server log files into a database that is then used to generate web analytics reports. Log2Stats lets you continuously add new log files to the database without reparsing files that have already been processed. Reports can be generated for any date range in the database and require no additional processing of the log files. For small and medium web sites (up to several thousand visitors per day) this gives you instant information about your site's performance; large web sites may need a little extra time to generate reports.
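The incremental approach described above can be sketched in a few lines. Note this is only an illustration of the idea, not Log2Stats's actual parser or database schema (which are not public): it parses Common Log Format lines into a SQLite table and records each file name so already-parsed files are skipped.

```python
import re
import sqlite3

# Simplified Common Log Format pattern (a sketch; real parsers handle
# more fields and formats).
CLF = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\S+)'
)

def init_db(conn):
    conn.execute("CREATE TABLE IF NOT EXISTS hits "
                 "(ip TEXT, time TEXT, method TEXT, path TEXT, status INTEGER)")
    conn.execute("CREATE TABLE IF NOT EXISTS parsed_files (name TEXT PRIMARY KEY)")

def add_log(conn, name, lines):
    """Parse one log file into the database; skip it if already parsed."""
    if conn.execute("SELECT 1 FROM parsed_files WHERE name = ?", (name,)).fetchone():
        return 0  # already in the database, no reparsing needed
    count = 0
    for line in lines:
        m = CLF.match(line)
        if m:
            conn.execute("INSERT INTO hits VALUES (?, ?, ?, ?, ?)",
                         (m['ip'], m['time'], m['method'], m['path'], int(m['status'])))
            count += 1
    conn.execute("INSERT INTO parsed_files VALUES (?)", (name,))
    conn.commit()
    return count
```

Calling `add_log` a second time with the same file name returns 0 and leaves the database unchanged, which is the property that makes incremental updates cheap.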
After the log files are parsed, most information in the reports can be filtered using instant filters and data filters. You don't need to reparse log files to remove visits from specific IP addresses, visits that did not access specific pages, visits from a specific country, and so on from your web analytics reports: use instant filters for this.
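Conceptually, an instant filter is just a predicate applied to the already-parsed records, which is why no reparsing is needed. The record fields and function below are hypothetical, used only to illustrate the idea:

```python
def instant_filter(hits, exclude_ips=(), exclude_paths=()):
    """Drop hits from specific IPs or for specific pages.
    The stored data is untouched, so the filter can be changed
    at any time without reparsing the logs."""
    return [h for h in hits
            if h["ip"] not in exclude_ips and h["path"] not in exclude_paths]

hits = [
    {"ip": "1.2.3.4", "path": "/index.html", "country": "US"},
    {"ip": "5.6.7.8", "path": "/robots.txt", "country": "DE"},
    {"ip": "1.2.3.4", "path": "/about.html", "country": "US"},
]
visible = instant_filter(hits, exclude_ips={"1.2.3.4"})
```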
You can also use data filters to highlight or hide information in already generated reports depending on cell values. This works on reports that have already been generated, and the changes take effect instantly.
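A data filter of this kind can be pictured as a rule applied to report cells after the report exists. The thresholds and the "hide below / highlight above" rule below are hypothetical, not Log2Stats's actual filter syntax:

```python
def apply_data_filter(rows, min_hits=1, highlight_above=1000):
    """Hide report rows whose hit count is below min_hits and mark
    rows above highlight_above as highlighted; the underlying data
    is never modified, only its presentation."""
    styled = []
    for page, hits in rows:
        if hits < min_hits:
            continue  # hidden by the data filter
        style = "highlight" if hits > highlight_above else "normal"
        styled.append((page, hits, style))
    return styled

report = [("/index.html", 1200), ("/about.html", 45), ("/old.html", 0)]
```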
However, some parameters can't be changed after the log files are parsed: file types, the visit logging strategy, search engine detection, spider detection, query string handling, and a few others. Changing any of these settings requires reparsing the log files affected by them.
File types and the file-based logging strategy can be changed in Workspace settings - File types.
Search engine and spider detection can't be changed directly; these databases are updated with program updates.
Other settings can be changed via parsing filters.