13 May 2014 @ 7:57 AM 

Hello !

A new post after a long time…

 

As you know, variables in bash are global by default: their scope runs from the initialisation of the variable (usually the first time you use it) to its destruction (usually the end of the script), and the variable is visible in every function within the script.

Now, if you want to reduce the scope of a variable and tie it to a specific function, you usually make its name unique (by starting it with an underscore, for instance), initialise it at the beginning of your function, and destroy it (unset) at the end of the function.
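As a minimal sketch of that manual pattern (the variable name _count is purely illustrative):

#!/bin/bash

MyFunction () {
    _count=1                 # "_" prefix marks it as private by convention
    echo "Inside: ${_count}"
    unset _count             # destroy it before leaving the function
}

MyFunction
echo "Outside: ${_count:-unset}"   # prints "unset": the variable is gone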

There is another option: you can use the keyword "local", like in other languages.

Here is an example of how to use it:

#!/bin/bash

MyFunction () {
    local TheVar
    TheVar=1
    echo "In The Function ${TheVar}"
}

declare -i TheVar
TheVar=99
echo "Before the Function ${TheVar}"
MyFunction
echo "After the Function ${TheVar}"

Here is the output:

Before the Function 99
In The Function 1
After the Function 99

And if you comment out the "local" part:

Before the Function 99
In The Function 1
After the Function 1

Another note on this.

According to my tests on my bash version (GNU bash, version 3.2.51(1)-release), you can achieve the same result by simply using "declare" on your variable within a function.

e.g.

#!/bin/bash

MyFunction () {
    declare -i TheVar
    TheVar=1
    echo "In The Function ${TheVar}"
}

declare -i TheVar
TheVar=99
echo "Before the Function ${TheVar}"
MyFunction
echo "After the Function ${TheVar}"

This gives the following results:

Before the Function 99
In The Function 1
After the Function 99

Note also that, once you declare a local variable with the same name as a global one (as in our example), the content of the global variable is no longer accessible within the function!

e.g.

#!/bin/bash

MyFunction () {
    #local TheVar
    echo "Before declaration : ${TheVar}"
    declare -i TheVar
    echo "After declaration, before assignment : ${TheVar}"
    TheVar=1
    echo "After assignment : ${TheVar}"
}

declare -i TheVar
TheVar=99
echo "Before the Function ${TheVar}"
MyFunction
echo "After the Function ${TheVar}"

This gives the following results:

Before the Function 99
Before declaration : 99
After declaration, before assignment :
After assignment : 1
After the Function 99

Think about this the next time you write recursive functions; it might help. Counters (as in while [ "$i" -lt 200 ]) should always be declared local within a function. Keep in mind also that the "local" keyword is only valid inside a function.
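Here is a minimal sketch of why this matters in recursion (the function name countdown is purely illustrative): without local, every level of the recursion would share, and overwrite, the same depth variable.

#!/bin/bash

countdown () {
    local depth                     # each recursive call gets its own copy
    depth=$1
    echo "Level ${depth}"
    if [ "${depth}" -gt 0 ]; then
        countdown $((depth - 1))
    fi
    echo "Back at level ${depth}"   # still correct, thanks to "local"
}

countdown 3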

Hope you enjoyed it, and see you next time!

Posted By: Dimi
Last Edit: 13 May 2014 @ 07:57 AM

 07 Jun 2013 @ 3:56 PM 

Hi Guys,

It has been a long time since we made an entry…

And today will be a short one.

 

How to generate a random number in Bash.

There is a variable called "$RANDOM". Each time you expand it, it provides a new random number between 0 and 32767:

echo $RANDOM
27652

Thus, if you want to generate a random number between two bounds (inclusive), you can use:

$(( RANDOM % (<higher bound> - (<lower bound> - 1)) + <lower bound> ))

e.g.

echo $(($RANDOM % 100 + 1))
54

And, if you want to get a random line from a file:

echo ${RANDOM} >/dev/null; cat AA_AA | sed -n "$(( ${RANDOM} % `cat AA_AA | wc -l` + 1 )) p"

The first echo ${RANDOM} is there to force the re-assignment of the ${RANDOM} value, which, for some reason, did not get updated when used in the formula alone.
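As a side note, if GNU coreutils is available on your system, shuf does both jobs in one go; a sketch (the file name AA_AA is taken from the example above):

# Random integer between 1 and 100 (inclusive)
shuf -i 1-100 -n 1

# Random line from a file
shuf -n 1 AA_AA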

 

Have a good day !

Update : Updated based on KenS comment. Thank you.

Posted By: Dimi
Last Edit: 13 May 2014 @ 06:53 AM

 20 Apr 2012 @ 8:00 PM 

If you happen to write scripts for sftp, ftp (…) file transfers, you might want to check that the remote host accepts connections on the port you specify before triggering anything.

What's the point of doing this test? It can be discussed, actually. But one might prefer to handle the case where the host is not responding or refuses the connection otherwise than by parsing the answer the sftp or ftp binary prints on the command line.

That's what I wanted to do, and my next challenge was to find a way to do this test without using telnet, ftp, sftp, ssh (etc.). The thing is, most of the examples you can find on the web use these commands.

The trick is to use bash sockets, and especially this one: /dev/tcp/. We can try to write something to the socket

exec 3>/dev/tcp/REMOTE_DEST/REMOTE_PORT

and depending on the success or failure, assume we could connect… or not. For example, let's do the following:

exec 3>/dev/tcp/192.168.124.55/22 && echo "OK"

If the connection is not successful you will get the answer:

unix:solaris> exec 3>/dev/tcp/192.168.124.55/22 && echo "OK"
-bash: connect: Connection refused

And if it can connect, it will simply print "OK" on your screen:

unix:solaris> exec 3>/dev/tcp/192.168.124.55/22 && echo "OK"
OK

The return code will be "1" when unsuccessful and "0" when successful.

Then it's up to you to wrap that command in your own function and properly handle the errors/exceptions/logging.

One example of a full test function:

testConnection()
{
    [[ $VERBOSE = TRUE ]] && doVerbose "Testing the connection to the server ..."
    res="not Connected"

    # Note: the return code is 0 if successful and 1 if unsuccessful.
    # The backticks run exec in a subshell, so fd 3 is not kept open in the main script.
    `exec 3>/dev/tcp/${REMOTE_DEST}/${REMOTE_PORT}` && res="Connected" || res="not Connected"
    if [[ ${res} != "Connected" ]]
    then
        doLog "Cannot connect to ${REMOTE_DEST}/${REMOTE_PORT}...[FAILURE]"
        quit "Cannot connect, server ${REMOTE_DEST}/${REMOTE_PORT} does not accept the connection"
    fi
    doLog "${res}...[OK]"
}

Where:

  1. doLog, quit and doVerbose are functions within the same script, dedicated to handling the log files, managing exits and managing the verbose mode, if triggered while executing the script.
  2. ${REMOTE_DEST} and ${REMOTE_PORT} are variables set outside the function.
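A caveat worth testing on your system: when the remote host silently drops packets (a firewall, typically), the connection attempt can hang for the full TCP timeout. If GNU coreutils is available, wrapping the test in timeout bounds the wait; a sketch:

# Fail after 5 seconds instead of waiting for the TCP timeout
if timeout 5 bash -c "exec 3>/dev/tcp/${REMOTE_DEST}/${REMOTE_PORT}"
then
    echo "Connected"
else
    echo "not Connected"
fi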

To go further:

/dev/tcp is one of the pseudo-devices Unix-like systems have (strictly speaking, on Linux it is bash itself that interprets this path, as no /dev/tcp entry exists there). We are more familiar with /dev/null, for example when we are only interested in the STDOUT stream and not STDERR (the famous 2>/dev/null), but there are actually a few of them: /dev/zero, /dev/random and /dev/full, for example.

You can read the Wikipedia page on device files, which is quite interesting.

Posted By: Nicolas
Last Edit: 20 Apr 2012 @ 10:38 AM

 12 Jan 2012 @ 8:00 PM 

Hello all,

Today, something fun (really, you can use this in pranks and all).

Imagine you have a script that needs a deactivation mechanism, and you do not want someone to find the deactivator easily (if you want to "lock" a script, for example).

There are many ways to do this; an easy one is to create a script whose name is " " (a single space) and to execute it from your main script.

To create your space script (note that the backslash is followed by a space, and that you end the input with Ctrl-D):

cat >\ 
echo " You should not start this script"
exit 0

and then you add the following line somewhere, for example after the functions (again, the backslash is followed by a space):

. ./\ 

The script will exit with the message at that point, and it is very difficult for the user to find the cause.

Another option can be to encrypt the script using crypt (you can install it if you do not have it), so that the user cannot grep all the files to find the blocking one. You simply decrypt the file before running it (this can be done by your "space" script).
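If crypt is not at hand, openssl can play the same role; a sketch, with the passphrase and file names purely illustrative:

# Encrypt the blocking script (run once, then delete the clear version)
openssl enc -aes-256-cbc -salt -in blocker.sh -out blocker.enc -pass pass:MySecret

# Decrypt and source it at run time, without leaving a clear copy on disk
. <(openssl enc -aes-256-cbc -d -in blocker.enc -pass pass:MySecret)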

Oh, yes, and a quick but important note on a related but different issue.

If you want to deactivate the following function :

function removeall {
rm -Rf .
}

DO NOT try to comment it out like this:

#function removeall {
rm -Rf .
}

Because the rm line is now at the top level of the file, and it will simply remove everything every time someone loads your library!
Do it like this:

function removeall {
    return 0          # neutralised: we leave before the dangerous line
    rm -Rf .
}

This is all for today; tomorrow we will talk about the misuse of the tr command.

Thank you for reading, and see you tomorrow !

Posted By: Dimi
Last Edit: 12 Jan 2012 @ 08:54 PM

 05 Jan 2012 @ 8:00 PM 

Hello all,

The last post before a long week-end (I will not be able to post tomorrow, nor on Monday and Tuesday). I'll see you on Wednesday.

Today, the topic is "The difference between find -exec and find | xargs".

The situation

You have to look into a specific directory and its subdirectories for all the files named "*log", because you are quite sure one of them contains the word "error".

You have two ways to do this :

1 – using -exec

This is a parameter to the find command that allows you to run an extra command on each found file. The syntax is as follows:

-exec <command> {} \;

where <command> is your command, {} will be replaced by the name of the found file, and \; terminates the command.

i.e. you type :

find . -name \*log -exec grep -i error {} \;

and it will return all the lines containing "error", regardless of the case.

2 – using xargs.

xargs is a command that passes the piped data as parameters to another command. The syntax is as follows:

| xargs <command>

xargs simply appends the piped data to the end of the command. i.e. you type:

find . -name \*log | xargs grep -i error

3 – Using a while loop.

Yes, you can do it like that too, but it is not the topic of this discussion.

 

The difference

What is the main difference between the two?

-exec takes each file and executes the command on it. Using this, you will get a list of all the matching lines, but not the names of the files: grep assumes you know which file it is talking about, as you passed the name as parameter!

A subprocess will be spawned for each file to be checked.

xargs, on the other hand, passes the complete list of files to grep at once. The names of the files will be shown in the result list.

The separator for the parameters is the space, and this is OK as long as there is no space in the names of the files (but who puts spaces in file names? Ah, OK, users…).
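If you are on a system with GNU find and xargs, there are two standard ways around the spaces problem; a sketch:

# NUL-separated names survive spaces (GNU find/xargs)
find . -name '*log' -print0 | xargs -0 grep -i error

# POSIX find can also batch the files itself, like xargs does
find . -name '*log' -exec grep -i error {} +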

In fact, the foolproof way to deal with this specific request is the while loop.
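A sketch of that loop (read -r keeps backslashes intact, the quotes protect names with spaces, and the extra /dev/null argument forces grep to print the file name):

find . -name '*log' | while read -r file
do
    grep -i error "$file" /dev/null
done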

 

The conclusion

Both ways can be useful, depending on the request and the situation. It is important that you understand the different ways these tools work, so that you can choose the best one for your usage.

Thank you for reading, and see you next Wednesday! (told you, long week-end for me 🙂 )

 

Posted By: Dimi
Last Edit: 02 Jan 2012 @ 05:17 PM

 04 Jan 2012 @ 8:00 PM 

Hello, and thank you for coming again !

I have noticed that some of you are coming via the RSS feed, and it is nice to see you are following this blog thoroughly !

Today, the topic is the "Equal Tilde operator".

The situation

You have a variable, and you need to check whether its content is a single digit, or the letter "a", "A", "b" or "B".

The solution

You do not use the equal tilde and you die of boredom:

if [[ ${var} = [0-9] ]] || [ "$var" = "a" ] || [ "$var" = "A" ] || [ "$var" = "b" ] || [ "$var" = "B" ] ; then

or you use the equal tilde operator:

if [[ "${var}" =~ ^([0-9]|a|b|A|B)$ ]] ; then

and you can live another day.

The equal tilde operator allows you to use regexes (the same ones you use with sed, remember?) in an if command.

Note the double square brackets: =~ only works within [[ ]].
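A quick, purely illustrative demonstration loop to see which values match:

for var in 7 a B cc 42
do
    if [[ "${var}" =~ ^([0-9]|a|b|A|B)$ ]] ; then
        echo "${var} matches"
    else
        echo "${var} does not match"
    fi
done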

Thank you for reading, and see you tomorrow !

 

Nota bene :

You should not put the right side of the =~ operator in quotes: a quoted right side is matched as a plain string, not as a regex.

If you do not want to escape everything that might need escaping, just put your complicated regex in a variable:

RegEx="My Complicated Regex"

if [[ "${var}" =~ ${RegEx} ]]; then

 

Posted By: Dimi
Last Edit: 05 Jan 2012 @ 07:54 PM

 03 Jan 2012 @ 8:00 PM 

Today, we'll see something interesting about the scope of variables in bash; it looks like a bug, although, as we will see, it is actually documented behaviour.

Before going further, I want to say that I have never had problems with the scope of variables in Bash, but this specific case is very strange.

The situation

Let's say you have a file "tempo1", containing four lines:

Nocomp
COMP 2 + 3 + 4
Nocomp
COMP 4 + 5 + 6

You want to compute all the lines that contain “COMP” and provide the total result.

You then write the following script (usually on one line):

typeset -i i j
cat tempo1 | while read line; do if echo $line | grep COMP >/dev/null 2>&1; then i=`expr \`echo $line | cut -f2- -d' '\``; echo $i; let j=j+i; fi; done

(if the script is unclear, I’ll detail later on)

You assume that, at the end, "j" will hold the sum of all the numbers, and "i" the last processed one. Well, you're wrong.

The problem

At least if you use bash. Every member of a pipeline runs in its own subshell, so the piped while loop works on a copy of the variables, and the assignments are lost when the pipeline ends. i.e. in the parent shell, both your variables will be defined, but empty.

$ echo $0 i is $i j is $j
-bash i is j is
$

Now, let’s re-run the whole lot in ksh :

echo $0 i is $i j is $j
ksh i is 15 j is 24
$

The reason

Well, after looking on the web, it seems that I am not the only one who was puzzled. This has been reported as a bug more than once, but it is actually how bash is documented to work: each member of a pipeline runs in a subshell, whereas ksh runs the last member in the current shell, which is why the ksh version keeps the values.

The solution

Change your code, to avoid the piped while :

while read line; do if echo $line | grep COMP >/dev/null 2>&1; then i=`expr \`echo $line | cut -f2- -d' '\``; echo $i; let j=j+i; fi; done <tempo1

and the problem disappears. Or use ksh for this specific purpose.
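Two other bash-specific ways around this, sketched here: feed the loop through process substitution (useful when the producer is a real command and not just cat), or, in bash 4.2 and later, set shopt -s lastpipe so the last member of the pipeline runs in the current shell.

# Process substitution: the while loop stays in the current shell
while read line
do
    if echo $line | grep COMP >/dev/null 2>&1; then
        i=`expr \`echo $line | cut -f2- -d' '\``
        let j=j+i
    fi
done < <(cat tempo1)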

Thanks for reading this, and see you tomorrow ! (Should you require more explanation about the script, feel free to read on.)

 

Explanation of the script

Let’s first explode the script on multiple lines :

typeset -i i j
cat tempo1 | while read line
do
    if echo $line | grep COMP >/dev/null 2>&1; then
        i=`expr \`echo $line | cut -f2- -d' '\``
        echo $i
        let j=j+i
    fi
done

So, the first line :

typeset -i i j

is there to define the variables i and j as integers (-i) (equivalent to int i, j; in C).

The next line reads the file, looping over each line and executing everything between the "do" and the "done". The current line is put in the variable "line".

If you had written

cat tempo1 | while read JamesTKirk

then the variable containing the line would have been JamesTKirk, which is a bit long to type, and also a bit of an overkill, having one of the best Starfleet Captains as your variable.

The next line speaks for itself:

if echo $line | grep COMP >/dev/null 2>&1;  then

means "if the return code of the command echo $line | grep COMP is 0, then execute the following", and of course this will only be the case if the line contains the word "COMP" somewhere. We could maybe have used "grep -e '^COMP'" instead, or the funny test we'll see tomorrow, but let's not split hairs.

i=`expr \`echo $line | cut -f2- -d' '\``

expr allows you to do basic arithmetic. This line means "take everything after the first field of the line, space being the separator, compute it, and store the result in the variable i".

let j=j+i

let is another way to do arithmetic operations on integer variables.

and voila, done.
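For what it is worth, modern bash arithmetic can replace both expr and let; a sketch of the same computation (same tempo1 file as above):

typeset -i i j
while read line
do
    case $line in
        COMP*) i=$(( ${line#COMP } ))   # strip the leading "COMP ", evaluate the rest
               echo $i
               j=$(( j + i ));;
    esac
done <tempo1
echo "total: $j"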

Should you require more assistance, feel free to post in the comments, or to man the command you want to know more about.

Thank you for reading, and see you tomorrow for a special test operator.

Posted By: Dimi
Last Edit: 02 Jan 2012 @ 04:37 PM

 29 Dec 2011 @ 8:42 PM 

Today, something a bit more complicated. This is a fun way to use the trap feature we saw yesterday.

The problem

You receive a new requirement from your customer (I did receive this one a couple of times). You need to develop a program that will:

– Not consume CPU or disk until it is necessary.

– Read the content of a directory A, then process each file (move it to another directory B) one by one. Never two files at the same time.

– Once the file has been "taken" by another process (no file left in directory B), the next file can be published.

– Not poll every minute or so, because the files need to be copied as fast as possible.

– The process that provides the files can start another process, but cannot wait more than a millisecond before getting control back.

Good luck…

The Solution

Well, at least the one I found was to have a main process going to sleep, then being woken up by another one (triggered by the "file provider"), and polling directory A until it is empty. Then it goes back to sleep.

How ? well, like this :

First, we need to trap the signal we will use as a wake-up call (number 30 here; check kill -l for the user signals available on your system):

trap “wakeup” 30

Then, we need to create a script, on the fly, that will be called to signal the main script that there is something to process (the “trigger” process)

trigger=trigger.sh
echo "#!/bin/ksh" >$trigger
echo "#Created by $0 on "`date` >>$trigger
echo 'if [ "$1" == STOP ]' >>$trigger
echo "then" >>$trigger
echo "kill $$" >>$trigger
echo "else" >>$trigger
echo "kill -30 $$" >>$trigger
echo "fi" >>$trigger
echo "#End of the script" >>$trigger
chmod 755 $trigger

Next step: we create the function that will be executed when the trap fires.

_From=/from
_To=/to

wakeup () {
trap 'echo "\c"' 30
# This is basically doing a "noop", to avoid being killed by a signal arriving while we work.
echo "Awoken at "`date` >>$0.log
# Note: [ -f ${_From}/* ] would break with more than one file, so we test with ls instead.
while [ -n "`ls -A ${_From}`" ]
do
    while [ -n "`ls -A ${_To}`" ]
    do
        sleep 1
    done
    File2Move=`ls -tr ${_From} | head -1`
    mv ${_From}/${File2Move} ${_To}
done
echo "${_From} is empty" >>$0.log
}

And, last step, we need the code that will be executed while waiting:

while true
do
    sleep 1800&
    wait    # wait is interrupted as soon as a trapped signal arrives; a plain
            # foreground sleep would delay the handler until the sleep finished
done

While the script is waiting, the main part is just looping on sleep. Then, once it receives the signal, the function that does the copying is executed.

Afterwards, we take the first file sent, move it over, then wait until the destination directory is empty again, and look for the next file.
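From the file provider's side, usage could look like this sketch (path and file name purely illustrative); the call returns immediately, which satisfies the millisecond constraint:

# Publish a file, then wake the main script up and give control back at once
cp report.txt /from/ && ./trigger.sh &

# Ask the main script to stop
./trigger.sh STOP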

If you have another idea on how this process could be done, feel free to comment !

Thank you for reading, and I hope to see you on Tuesday!

Posted By: Dimi
Last Edit: 30 Dec 2011 @ 03:15 PM

 28 Dec 2011 @ 8:00 PM 

Hello,

A very short one today, but it is usually not very well known.

As you know, Unix uses signals to communicate with processes. A process receives a signal, then takes an action. If no action is defined, the default action for most signals is to kill the process… which is maybe something you want to avoid.

The problem.

You have written the best script in the world, that does… let’s say… make ice-cream (we can dream).

The problem is that if someone kills your script (i.e. kill -15, SIGTERM, which is the default one), your process will simply stop, and throw ice-cream everywhere on the walls (ice-cream being an image for temporary files here).

How can you catch this signal, close the ice-cream tap and clean up a bit?

The Solution.

Use trap.

At the beginning of your script, you can add the following line (after the shebang) :

trap cleanup 15

This will trap the default SIGTERM, then start the function "cleanup" (you need to define the "cleanup" function somewhere in your script).

You can define a function that will do the cleanup for you: maybe add a message to the log, kill sub-processes, send a mail, whatever is needed.
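A minimal sketch of the whole pattern (the temporary file name is purely illustrative):

#!/bin/bash

TMPFILE=/tmp/icecream.$$

cleanup () {
    echo "SIGTERM received, cleaning up" >>$0.log
    rm -f ${TMPFILE}        # wipe the ice-cream off the walls
    exit 1
}

trap cleanup 15             # or, by name: trap cleanup TERM

echo "making ice-cream" >${TMPFILE}
while true
do
    sleep 1                 # the "work"; try kill -15 <pid> from another shell
done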

The only thing is that you need to foolproof your function, in order to be sure that your process does not end up in an infinite loop, because the only way to kill it afterwards is to use another, harsher kill.

Note also that not all signals can be trapped. Some of them, like signal 9 (SIGKILL), cannot be caught by a script.

Thank you for reading, and if I find some time, I'll talk tomorrow about an inventive way to use the trap feature.

Posted By: Dimi
Last Edit: 27 Dec 2011 @ 08:44 PM

 27 Dec 2011 @ 8:03 PM 

So today, we'll see how to pass a variable itself (and not just the content of a variable) to a function.

The Problem

You need to create a function that updates one or more variables given as parameters. For example, let's say you want to create a "to_upper" function that takes a variable as parameter, then modifies it.

If you do it like this :

to_upper $mytext

your function will just receive the content of the variable. Then, what do you do with it? Print it? You cannot return it to the variable, as you do not know which one it was.

The solution :

Write your code like this :

function to_upper {
eval _text=\$$1
export $1=`echo "${_text}" | tr '[a-z]' '[A-Z]'`
}

and you can execute it like this :

toto="hello"
to_upper toto
echo $toto
HELLO

From there, you can execute anything, like SQL queries, and return the results in the calling variable or in another one… Of course, you still need to protect your function by checking that the given parameter is the name of a variable, and not the variable content itself.
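For the record, newer bash versions (4.3 and later) offer namerefs, which make this pattern cleaner; a sketch:

function to_upper {
    declare -n _ref=$1               # _ref becomes an alias for the named variable
    _ref=`echo "${_ref}" | tr '[a-z]' '[A-Z]'`
}

toto="hello"
to_upper toto
echo $toto   # HELLO

(And bash 4 can even do the upper-casing itself, with ${_ref^^}.)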

Thank you for reading !

Posted By: Dimi
Last Edit: 05 Jan 2012 @ 07:52 PM

