Various Linux Commands to Play with Log Files, and How to Take Backups
Hello friend,
I am here again to enhance your knowledge and share some very exciting Linux commands that will enlighten your mind. Let's start with some of the most famous and widely used Linux commands in the DevOps journey.
These are searching commands that do similar work, but each has its own specialty.
Grep command: I will not explain what the grep command is, as I have already explained it in my previous blog; you can refer to it here: https://saifali2017.hashnode.dev/useful-linux-command-for-working-professionals
Now, let's enjoy some hacks with grep.
Question: I want to grep recursively for files/directories that contain the word "devops".
$ grep -r devops .
//output will be like below.
tEngagementTime":1.3324283336014272e+16,"lastShortcutLaunchTime":0.0,"pointsAddedToday":15.0,"rawScore":23.58771143911823}},"https://aws.amazon.com:443,*":{"last_modified":"13324321952350304","setting":{"lastEngagementTime":1.3324041009499426e+16,"lastShortcutLaunchTime":0.0,"pointsAddedToday":2.1,"rawScore":2.1}},"https://dhiyanidevops.hashnode.dev:443,*
Doesn't it look messy? Right?...
Let's limit it to a particular address/location and frame a question.
Question: Find "devops" under /home/saif, which means: check whatever is written in the home directory of user saif for the string "devops" and print it.
$ grep -r devops /home/saif
//output will be like this.
/home/saif/Documents/DevOps/myscripts/my_script.sh:mkdir devops
Binary file /home/saif/.local/share/tracker/data/tracker-store.journal matches
Binary file /home/saif/.cache/mozilla/firefox/fodry7li.default-release/cache2/entries/B11417C1FCEE2AFCF074375E3AA7708987B7736D matches
Binary file /home/saif/.cache/mozilla/firefox/fodry7li.default-release/cache2/entries/D38FD6D6EFE7918223E7D5824C73EB53B168EEBD matches
Isn't it crazy??.....
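By the way, if those "Binary file ... matches" lines bother you, GNU grep's -I flag treats binary files as if they contained no match, so only the text hits are shown. A quick sketch:
$ grep -rI devops /home/saif   # same search, but binary files are skipped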
Now let's go one step further.
Assume I have a file called logfile.txt and I want only the TRACE entries from this file.
Question: Parse a logfile and find anything that is TRACE.
$ grep TRACE logfile.txt
// output
03/22 08:54:24 TRACE :.....event_timerT1_expire: T1 expired
03/22 08:54:24 TRACE :......router_forward_getOI: source address: 9.67.116.98
03/22 08:54:24 TRACE :......router_forward_getOI: out inf: 9.67.116.98
03/22 08:54:24 TRACE :......router_forward_getOI: gateway: 0.0.0.0
03/22 08:54:24 TRACE :......router_forward_getOI: route handle: 7f5251c8
Woo hoo.... cool, na?
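A related trick: if you also want to know where each TRACE line sits in the file, grep's -n flag prefixes every match with its line number. A quick sketch:
$ grep -n TRACE logfile.txt   # each matching line is printed with its line number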
Okay, let's go one more step up, and I have a question.
Question: Parse a logfile and find anything that is Trace and put it into a new file called tracesonly.txt
$ grep TRACE logfile.txt > tracesonly.txt
//output
$ ls
Ali logfile.txt saif tracesonly.txt
Note: If you don't know whether TRACE is written in upper case or lower case,
just add -i after grep; the search becomes case-insensitive and finds all case variants. Refer below.
$ grep -i TRACE logfile.txt > tracesonly.txt
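And if all you need is how many lines matched, grep's -c flag prints a count instead of the lines themselves; combined with -i it counts every case variant. A quick sketch:
$ grep -ic trace logfile.txt   # number of lines containing trace/TRACE/Trace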
Now let's discuss the find command.
Find command.
What is the main difference between grep and find?
grep: it searches for a string, inside a file, a directory, etc.
find: it searches for the files and directories themselves, and the results can be filtered; if you are looking for a directory, add -type d, and if you are looking for a file, add -type f.
Let's quickly write an intro command for find.
Question: Find a directory that has name Downloads
$ find /home/saif -type d -name Downloads
//output
/home/saif/Downloads
Question: Find a file that has the name logfile.txt
$ find /home -type f -name logfile.txt
//output
/home/logfile.txt
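find's -name also accepts wildcards, which helps when you only remember part of a name. A small sketch (the *.txt pattern here is just an example):
$ find /home -type f -name "*.txt"   # every .txt file anywhere under /home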
Question: I want a file that belongs to the group saif and is named logfile.txt.
$ find /home -group saif -type f -name logfile.txt
Now, I want to find a file called logfile.txt and grep INFO from that file.
$ find /home -type f -name logfile.txt -exec grep INFO {} \;
//output
03/22 08:51:01 INFO :.main: *************** RSVP Agent started ***************
03/22 08:51:01 INFO :...locate_configFile: Specified configuration file: /u/user10/rsvpd1.conf
03/22 08:51:01 INFO :.main: Using log level 511
03/22 08:51:01 INFO :..settcpimage: Get TCP images rc - EDC8112I Operation not supported on socket.
03/22 08:51:01 INFO :..settcpimage: Associate with TCP/IP image name = TCPCS
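You can take this combination one step further: if you only want the names of the files that contain INFO, not the matching lines, add grep's -l flag. A quick sketch, with the *.txt pattern again just as an illustration:
$ find /home -type f -name "*.txt" -exec grep -l INFO {} \;   # print only the file names that contain INFO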
awk command
awk is a more advanced searching command; it is very powerful and has some very exciting and helpful features.
Let's take a small example.
$ awk '/INFO/' logfile.txt
//output: the same INFO lines we extracted earlier with find and grep.
Now, why is this an advanced command?
Question: I want the rows that have INFO, but only the first and third columns.
$ awk '/INFO/ {print $1,$3}' logfile.txt
//output
03/22 INFO
03/22 INFO
03/22 INFO
03/22 INFO
Now, I want the rows that have INFO, along with their line numbers (NR is awk's built-in record counter).
$ awk '/INFO/ {print NR,$1,$3,$4}' logfile.txt
370 03/22 08:53:53 to
375 03/22 08:53:53 RESVED,
382 03/22 08:54:09 to
387 03/22 08:54:09 RESVED,
395 03/22 08:54:22 RSVP_HOP
397 03/22 08:54:22 RESVED,
403 03/22 08:54:24 to
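One more awk trick before we move on: awk can keep counters while it matches and print them at the end. As a small sketch (not from the original log run), this counts the INFO lines:
$ awk '/INFO/ {count++} END {print count, "INFO lines"}' logfile.txt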
Now let's jump to our second module, called
How to take a backup.
I am going to go step by step, so if you want to do this yourself, you just need to follow this blog and your job is done.
Let's understand it with the command first; then we will move on to the script.
For example, I have a file called logfile.txt and I have to take a backup of it; for that we will use the command below.
$ tar -czvf backup.tar.gz /prod/logfile.txt
Here tar is the archiving tool used to take the backup.
-czvf refers to:
c: create a new archive,
z: filter the archive through gzip,
v: verbose output,
f: use the given archive file.
backup.tar.gz is the name of the backup file, and after it comes the source path to archive.
You will get the below output.
tar: Removing leading `/' from member names
/prod/logfile.txt
$ ls
backup.tar.gz logfile.txt
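Before relying on that archive, it is worth checking what actually went into it; tar's -t flag lists the contents without extracting anything:
$ tar -tzvf backup.tar.gz   # lists the archive contents; you should see prod/logfile.txt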
Now let's do it with a shell script; below are the steps for the same, and the backup's name will be the date on which you take it.
Step one: I am creating a directory called backup (the script below expects it at /backup, so create it there; you may need sudo for a directory directly under /).
$ mkdir /backup
mkdir prints nothing on success; ls / will show the new directory.
Now create an sh file and write the code below.
$ nano backupfile.sh
Now add the code below to the backupfile.sh file:
#!/bin/bash
# Back up everything under src_dir into a timestamped .tgz inside tgt_dir
src_dir=/prod                                 # source from where you will take the backup
tgt_dir=/backup                               # target where the backup will land
curr_timestamp=$(date "+%Y-%m-%d-%H-%M-%S")   # current timestamp
backup_file=$tgt_dir/$curr_timestamp.tgz      # backup file name: timestamped gzip archive in the target directory
echo "Taking backup on $curr_timestamp"       # message
#echo "$backup_file"
tar czf "$backup_file" --absolute-names "$src_dir"   # create the tar file from the source directory into the target
echo "Backup completed"
Save it as backupfile.sh, make it executable with chmod +x backupfile.sh, and run it with ./backupfile.sh
You will get the message:
$ ./backupfile.sh
Taking backup on 2023-03-26-15-13-14
Backup completed
Your backup is done.
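If you want to double-check, list the target directory; assuming the run above, /backup should now hold an archive named after that timestamp:
$ ls /backup   # expect something like 2023-03-26-15-13-14.tgz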
Now the question is: how do we extract the backup tar file?
Very simple, just write the command below.
$ tar -zxvf backup.tar.gz
Here
-z : Work on gzip compression automatically when reading archives.
-x : Extract tar.gz archive.
-v : Produce verbose output (Display progress and extracted file list on screen).
-f : Read the archive from the specified file. (In this case, read backup.tar.gz.)
-t : List the files in the archive.
-r : Append files to the end of the tarball.
Output:
$ tar -zxvf backup.tar.gz
prod/logfile.txt
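By default tar extracts into the current directory; the -C flag lets you pick where the files land instead. A small sketch (the /tmp/restore path is just an example and must exist first):
$ mkdir -p /tmp/restore
$ tar -zxvf backup.tar.gz -C /tmp/restore   # extract into /tmp/restore instead of the current directory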
How to take a backup automatically?
Now I want to run this backup automatically by scheduling it with crontab.
So here I am taking the same backupfile.sh to automate the backup.
Here are the steps you have to follow.
Everything stays the same: you have already created a working shell script to take the backup, so now we just have to create a cron job for it.
Here are the very simple steps.
We just have to edit the crontab with the command below:
$ crontab -e
You will get a similar kind of interface; go to the very end of the file, as shown below, and add the line below.
Note: Only the last line of the example below is the one you add; the rest is the file's default comment block.
#
# Output of the crontab jobs (including errors) is sent through
# email to the user the crontab file belongs to (unless redirected).
#
# For example, you can run a backup of all your user accounts
# at 5 a.m every week with:
# 0 5 * * 1 tar -zcf /var/backups/home.tgz /home/
#
# For more information see the manual pages of crontab(5) and cron(8)
#
# m h dom mon dow command
*/1 * * * * . /prod/backupfile.sh
Here */1 * * * * means your cron job will run every minute, and the rest of the line is the same command you used to run the backup manually.
Now you may be confused about what */1 * * * * is and where it came from; for that, you just need to visit the site Cronitor to get the timer/scheduler expression that we have to add here.
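Running the backup every minute is really just for testing. In practice you would pick a calmer schedule and capture the script's output in a log file; a sketch (the 2 a.m. schedule and log path are just examples):
0 2 * * * /prod/backupfile.sh >> /var/log/backup.log 2>&1   # every day at 2 a.m., appending output and errors to a log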
I hope you got something valuable from this blog.
Happy Learning.....
Best Regards,
Saif Ali