Scripts-Autolog

From Stadm

Scripts

AutoLog Scripts

cleanall

CLEANALL:

'cleanall' is a very simple script that automates the removal of empty log files for each machine. It simply executes the *clean scripts for each machine in turn.

usage:	cleanall

 - REQUIRED components:
 -- 1) a directory called logclean/ in the same directory as cleanall with permissions "drwxr-xr-x"
 -- 2) logclean/ must contain *clean scripts
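The behavior described above can be sketched in a few lines of sh (a hypothetical reconstruction, not the actual script; only the logclean/ directory and the *clean naming come from this doc):

```shell
#!/bin/sh
# Sketch of cleanall: run every per-machine *clean script found in logclean/.
# Assumes logclean/ sits alongside this script, as required above.
for script in logclean/*clean; do
    [ -x "$script" ] && "$script"
done
```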

created by Joe Mount
2006-02-21

copyall

COPYALL:

'copyall' is a very simple script that automates copying of all log files for each machine. It simply executes the *logcopy scripts for each machine in turn.

usage:	copyall

 - REQUIRED components:
 -- 1) a directory called logcopy/ in the same directory as copyall with permissions "drwxr-xr-x"
 -- 2) logcopy/ must contain *logcopy scripts

created by Joe Mount
2006-02-21

ddtape.csh

DDTAPE.CSH:

'ddtape.csh' is a shell script for porting data from magnetic tapes to hard disk using the 'dd' command. It also accepts arguments that let you control the tape drive remotely, such as rewinding the tape or reporting the current tape position (shown as file numbers and blocks). It automatically sets up the tape drive device (if one exists) on the current machine, then reads the tape from beginning to end, writing the data to .tar files. These tar files can be extracted with the 'tar' command (see the man pages).

A time stamp is printed before and after reading the tape so you can see how long a particular tape takes to read (this can also be used to work out the data transfer rate; use the 'du' command to find the number of bytes transferred). Since ddtape.csh keeps reading the tape until EOT is reached, the last couple of files may be empty; if so, the script automatically removes any empty files.

usage:	ddtape.csh [-help] [-rewind] [-stat] [-eject] [-start filenum] [-prefix fileprefix]

created by Joe Mount
2005-09-06

iautolog

IAUTOLOG:

'iautolog' is a menu-driven csh script that uses 'logsurfer' for log analysis. It automates logsurfer usage by storing configuration files and line-number files between runs.

usage:

iautolog [-log dir] [-conf dir] [-out dir] [-w] [-a] [-wr] [-nm] [-n] [-d] [-h]

  -log: location of log files (*.log)
DEFAULT: /space/stadm/loganalysis/ilogs
  -conf: location of logsurfer-generated config files (*.ls) and line files (*.ln)
DEFAULT: /space/stadm/loganalysis/iconf
NOTE: use this flag if you want to continue a previous log analysis that is still in progress (such as when a log file is too big to finish analyzing in one day); otherwise analysis will start at line 0.
  -out: output directory for logsurfer analysis
DEFAULT:  /space/stadm/loganalysis/iresults
  -w: analyze all lines, regardless of previous line count
  -a: output all lines (from startline on) regardless of config file found
  -wr: write line numbers to file without a prompt
  -nm: new month; resets all line counters to 0
  -n: do nothing, simply update line count
  -d: run in debug mode (status messages)
  -h: show usage

- iautolog can be run with parameters, but there are default values that are most likely to be used.

- REQUIRED components:
   1. ilogs/ directory containing the .log files to be analyzed
   2. output (iresults/) directory for analysis results 
   3. "default.ls" in iconf/ directory for logsurfer
   4. logsurfer script (you shouldn't have to worry about this since it should already be set in your PATH and located automatically; if not, ask Aaron to add it to your PATH)
   

Specifics on logsurfer

logsurfer opens and analyzes a specified file, starting at a certain line number (if specified; use the -conf flag to specify one) and using a certain config file (if specified).
logsurfer uses regular expressions to control line handling, followed by an action to be taken if a match is found.
Most conveniently, it can be used to ignore (or filter out) entries that are unnecessary or unimportant, and keep only that which is necessary for analysis.

- logsurfer generates and relies on line-number (*.ln) files and config (*.ls) files. The line-number files control the line in the selected log at which logsurfer starts; if no line-number file is found, the start line is assumed to be 0. The config files (*.ls) contain filters passed to logsurfer to weed out unimportant entries. These files follow the formats <logname>.log.ln and <logname>.log.ls, respectively.
- at runtime, the script checks for .ln and .ls files corresponding to the selected log. If found, they are passed to logsurfer as parameters; otherwise defaults are used.
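The line-number bookkeeping described above can be sketched as follows. This is not logsurfer itself, just the resume-from-line idea; only the <logname>.log.ln naming comes from this doc:

```shell
#!/bin/sh
# Sketch: print a log starting at the line recorded in <log>.ln (0 if the
# file is missing), then store the new total line count so the next run
# resumes where this one left off.
log=$1
start=0
[ -f "$log.ln" ] && start=$(cat "$log.ln")
tail -n +"$((start + 1))" "$log"
wc -l < "$log" | tr -d ' \t' > "$log.ln"
```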



Mark Gorecki
modified by Joe Mount 2006-02-06

iclean

ICLEAN:

'iclean' is a very simple script that checks the log files that were obtained using 'ilogcopy'. If any of these log files are empty (have a size of 0 bytes), then 'iclean' will automatically delete these files from the appropriate directory since it doesn't make sense to analyze a log file that is empty.

usage:	iclean

 - REQUIRED components:
 -- 1) a directory called ilogs/ in the same directory as iclean
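The core of what iclean is described as doing can be sketched with find (a hypothetical reconstruction; only the ilogs/ directory and the zero-byte criterion come from this doc):

```shell
#!/bin/sh
# Sketch of iclean: delete zero-byte .log files from ilogs/ before analysis,
# since there is no point analyzing an empty log.
find ilogs/ -type f -name '*.log' -size 0 -exec rm -f {} +
```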

created by Joe Mount
2006-02-06

ilogcopy

ILOGCOPY:

'ilogcopy' is a very simple script to copy all ics log files back into the local directory for analyzing.
You do not have to be in the directory that contains the log files to be copied; simply check the path used in the 'ilogcopy' script and verify with Aaron. The script will take care of the copying.

default location of ics logs:		/net/ics/var/log/

usage:		ilogcopy

 - REQUIRED components:
 -- 1) the following path must exist: /space/stadm/loganalysis/ilogs/
  

Joe Mount
2006-02-06

qautolog

QAUTOLOG:

'qautolog' is a menu-driven csh script that uses 'logsurfer' for log analysis. It automates logsurfer usage by storing configuration files and line-number files between runs.

usage:

qautolog [-log dir] [-conf dir] [-out dir] [-w] [-a] [-wr] [-nm] [-n] [-d] [-h]

  -log: location of log files (*.log)
DEFAULT: /space/stadm/loganalysis/qlogs
  -conf: location of logsurfer-generated config files (*.ls) and line files (*.ln)
DEFAULT: /space/stadm/loganalysis/qconf
NOTE: use this flag if you want to continue a previous log analysis that is still in progress (such as when a log file is too big to finish analyzing in one day); otherwise analysis will start at line 0.
  -out: output directory for logsurfer analysis
DEFAULT:  /space/stadm/loganalysis/qresults
  -w: analyze all lines, regardless of previous line count
  -a: output all lines (from startline on) regardless of config file found
  -wr: write line numbers to file without a prompt
  -nm: new month; resets all line counters to 0
  -n: do nothing, simply update line count
  -d: run in debug mode (status messages)
  -h: show usage

- qautolog can be run with parameters, but there are default values that are most likely to be used.

- REQUIRED components:
   1. qlogs/ directory containing the .log files to be analyzed
   2. output (qresults/) directory for analysis results 
   3. "default.ls" in qconf/ directory for logsurfer
   4. logsurfer script (you shouldn't have to worry about this since it should already be set in your PATH and located automatically; if not, ask Aaron to add it to your PATH)
   

Specifics on logsurfer

logsurfer opens and analyzes a specified file, starting at a certain line number (if specified; use the -conf flag to specify one) and using a certain config file (if specified).
logsurfer uses regular expressions to control line handling, followed by an action to be taken if a match is found.
Most conveniently, it can be used to ignore (or filter out) entries that are unnecessary or unimportant, and keep only that which is necessary for analysis.

- logsurfer generates and relies on line-number (*.ln) files and config (*.ls) files. The line-number files control the line in the selected log at which logsurfer starts; if no line-number file is found, the start line is assumed to be 0. The config files (*.ls) contain filters passed to logsurfer to weed out unimportant entries. These files follow the formats <logname>.log.ln and <logname>.log.ls, respectively.
- at runtime, the script checks for .ln and .ls files corresponding to the selected log. If found, they are passed to logsurfer as parameters; otherwise defaults are used.



Mark Gorecki
modified by Joe Mount 2005-02-23

qclean

QCLEAN:

'qclean' is a very simple script that checks the log files that were obtained using 'qlogcopy'. If any of these log files are empty (have a size of 0 bytes), then 'qclean' will automatically delete these files from the appropriate directory since it doesn't make sense to analyze a log file that is empty.

usage:	qclean

created by Joe Mount
2005-08-30

qlogcopy

QLOGCOPY:

'qlogcopy' is a very simple script to copy all quake log files back into the local directory for processing/analyzing.
You do not have to be in the directory that contains the log files to be copied; simply check the path used in the 'qlogcopy' script and verify with Aaron. The script will take care of the copying.

default location of the quake logs:	/net/quake/var/log/

usage:  qlogcopy




Mark Gorecki
modified by Joe Mount 2005-04-14

replace.sh

REPLACE.SH:

'replace.sh' is a very simple script that finds and replaces text (specified by the command-line argument) within the specified files.
It uses a file called "pathfile" that contains the new path.

usage:		replace.sh <old_text> <new_text> <files>

NOTE: "pathfile" is located on fablio at /net/projects/www/projects/mapcat/scripts as of 2006-02
      "replace.sh" is located on fablio at /home/joe/bin as of 2006-02
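A sketch of the find-and-replace core using sed (a hypothetical reconstruction; it assumes the patterns contain no characters special to sed and no '/', and it does not read the "pathfile" mentioned above):

```shell
#!/bin/sh
# Sketch of replace.sh: replace old_text with new_text in each named file.
# usage: replace.sh <old_text> <new_text> <files...>
old=$1
new=$2
shift 2
for f in "$@"; do
    # rewrite the file in place via a temporary copy
    sed "s/$old/$new/g" "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
```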

created by Joe Mount
2005-12-21
modified 2006-02-13

sautolog

SAUTOLOG:

'sautolog' is a menu-driven csh script that uses 'logsurfer' for log analysis. It automates logsurfer usage by storing configuration files and line-number files between runs.

usage:

sautolog [-log dir] [-conf dir] [-out dir] [-w] [-a] [-wr] [-nm] [-n] [-d] [-h]

  -log: location of log files (*.log)
DEFAULT: /space/stadm/loganalysis/slogs
  -conf: location of logsurfer-generated config files (*.ls) and line files (*.ln)
DEFAULT: /space/stadm/loganalysis/sconf
NOTE: use this flag if you want to continue a previous log analysis that is still in progress (such as when a log file is too big to finish analyzing in one day); otherwise analysis will start at line 0.
  -out: output directory for logsurfer analysis
DEFAULT:  /space/stadm/loganalysis/sresults
  -w: analyze all lines, regardless of previous line count
  -a: output all lines (from startline on) regardless of config file found
  -wr: write line numbers to file without a prompt
  -nm: new month; resets all line counters to 0
  -n: do nothing, simply update line count
  -d: run in debug mode (status messages)
  -h: show usage

- sautolog can be run with parameters, but there are default values that are most likely to be used.

- REQUIRED components:
   1. slogs/ directory containing the .log files to be analyzed
   2. output (sresults/) directory for analysis results 
   3. "default.ls" in sconf/ directory for logsurfer
   4. logsurfer script (you shouldn't have to worry about this since it should already be set in your PATH and located automatically; if not, ask Aaron to add it to your PATH)
   

Specifics on logsurfer

logsurfer opens and analyzes a specified file, starting at a certain line number (if specified; use the -conf flag to specify one) and using a certain config file (if specified).
logsurfer uses regular expressions to control line handling, followed by an action to be taken if a match is found.
Most conveniently, it can be used to ignore (or filter out) entries that are unnecessary or unimportant, and keep only that which is necessary for analysis.

- logsurfer generates and relies on line-number (*.ln) files and config (*.ls) files. The line-number files control the line in the selected log at which logsurfer starts; if no line-number file is found, the start line is assumed to be 0. The config files (*.ls) contain filters passed to logsurfer to weed out unimportant entries. These files follow the formats <logname>.log.ln and <logname>.log.ls, respectively.
- at runtime, the script checks for .ln and .ls files corresponding to the selected log. If found, they are passed to logsurfer as parameters; otherwise defaults are used.



Mark Gorecki
modified by Joe Mount 2005-12-20

sclean

SCLEAN:

'sclean' is a very simple script that checks the log files that were obtained using 'slogcopy'. If any of these log files are empty (have a size of 0 bytes), then 'sclean' will automatically delete these files from the appropriate directory since it doesn't make sense to analyze a log file that is empty.

usage:	sclean

created by Joe Mount
2005-08-30
modified 2005-12-20

slogcopy

SLOGCOPY:

'slogcopy' is a very simple script to copy all slate log files back into the local directory for analyzing.
You do not have to be in the directory that contains the log files to be copied; simply check the path used in the 'slogcopy' script and verify with Aaron. The script will take care of the copying.

default location of slate log:		/net/slate/space/root/log/

usage:		slogcopy
  
  

Mark Gorecki
modified by Joe Mount 2005-12-20

suniq

SUNIQ:

'suniq' is a script written to sort a file and pull out the unique lines in that file to simplify log analysis.

usage:
  suniq [value] <filename>
  
[value] is optional; it specifies the whitespace-separated field at which the sorting and uniq'ing starts (if none is given, suniq uses a value of 9, which works for most files).
<filename> is the file to be suniq'ed.

output goes to the same file name with extension .uniq
e.g.: "suniq sample.log"     produces   "sample.log.uniq"
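The behavior above can be sketched as follows (a hypothetical reconstruction of suniq, not the real script: keep the fields from the start position onward, then sort and deduplicate):

```shell
#!/bin/sh
# Sketch of suniq: drop the first N-1 whitespace-separated fields of each
# line, then sort and deduplicate, writing to <file>.uniq. N defaults to 9.
if [ $# -eq 2 ]; then n=$1; f=$2; else n=9; f=$1; fi
awk -v n="$n" '{ out = ""
    for (i = n; i <= NF; i++) out = out (i > n ? " " : "") $i
    print out }' "$f" | sort -u > "$f.uniq"
```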

here are the start values for the various logs (file names are shown with a '.out' extension because that is what mautolog produces as output):

*** NOTE ***
for quake log files, run the suniq command with an argument value of 5 (instead of 9) for each file
e.g.: suniq 5 <filename>

command: 
 	suniq <filename>
files:
 auth.log.out
 daem.log.out
 errs.log.out
 tcpd.log.out

command:
 suniq 4 <filename>
files:
 kern.log.out
 mail.log.out
 user.log.out
 sudo.log.out

Mark Gorecki
modified by Joe Mount 2005-04-12

suniqall

SUNIQALL:

'suniqall' is a simple script that finds all .out files in the current directory and runs 'suniq' on each one for unique sorting for log analysis.  
It also moves *.uniq files from the previous analysis to the old_results/ directory as a temporary save for viewing later if needed.

usage:
  suniqall
  
--creates *.uniq files from the current *.out files.
--also removes the .out files since they are no longer needed.

 - REQUIRED components:
 -- 1) the directory old_results/ must exist in the current directory
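The flow above can be sketched as follows (a hypothetical reconstruction; it assumes a 'suniq' command on the PATH and an existing old_results/ directory, per the requirements):

```shell
#!/bin/sh
# Sketch of suniqall: archive previous *.uniq results, run suniq on every
# *.out file in the current directory, then remove the processed .out files.
mv -f ./*.uniq old_results/ 2>/dev/null
for f in ./*.out; do
    [ -f "$f" ] || continue
    suniq "$f" && rm -f "$f"
done
```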

Joe Mount 2005-07-07
modified 2005-10-31