12 Jan 2012 @ 8:00 PM 

Hello all,

Today, something fun (really, you can use this in pranks and such).

Imagine you have a script that will need to be deactivated, and you do not want someone to find the deactivator easily (if you want to “lock” a script, for example).

There are many ways to do this. One easy one is to create a script named “ ” (a single space) and to execute it from your main script.

To create your space script (finish the input with Ctrl-D) :

cat > ' '
echo "You should not start this script"
exit 0

and then you add the following line somewhere in your main script, for example after the functions.

. './ '

The script will exit with the message at that place, and it is very difficult for the user to find the cause.
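To see the trick in action, here is a minimal sketch (the /tmp/spacedemo directory is made up for the demo ; a here-document replaces the interactive cat) :

```shell
# Work in a scratch directory (hypothetical path, only for the demo)
mkdir -p /tmp/spacedemo && cd /tmp/spacedemo

# Create a script whose name is a single space
cat > ' ' <<'EOF'
echo "You should not start this script"
exit 0
EOF

# Source it in a subshell : the message prints and, because of the
# "exit 0", the line after the dot-command is never reached
result=$( . './ '; echo "this line is never reached" )
echo "$result"
```

Running this prints only the warning message, exactly as it would in the middle of your main script.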

Another option is to encrypt the script using crypt (you can install it if you do not have it), so that the user cannot grep all the files to find the blocking one. You simply decrypt the file before running it (this can be done by your “space” script).

Oh, yeah, and a quick important note, on a related but different issue.

If you want to deactivate the following function :

function removeall {
rm -Rf .
}

DO NOT try to comment it out like this :

#function removeall {
rm -Rf .
}

Because this will simply remove everything every time someone tries to load your library !
Do it like this :

function removeall {
return 0
rm -Rf .
}

This is all for today ; tomorrow we will talk about the misuse of the tr command.

Thank you for reading, and see you tomorrow !

Posted By: Dimi
Last Edit: 12 Jan 2012 @ 08:54 PM

Categories: Bash, basics, Snippets
 11 Jan 2012 @ 8:00 PM 


You remember we talked about default variables earlier (look here). Today, we are going to see an even more efficient way to set default values, for Bash users.

Note : the following only works in Bash. Should you be unsure whether the user of your function is going to use bash or ksh, you should use the way we explained earlier.

So, this is the way to give a default value to a variable :

[ -z "$var" ] && var='default'

The same command if you are using Bash :

var=${var:-'default'}

It might look a bit more cryptic, but it’s the correct way to do it if you are sure you will keep using Bash.

The right part of the :- operator can be a string, an integer or another variable.
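A quick illustration of the two cases (the variable names are made up for the demo) :

```shell
unset answer
answer=${answer:-42}      # answer was unset, so the default applies
echo "$answer"            # 42

answer=7
answer=${answer:-42}      # answer is already set, the default is ignored
echo "$answer"            # 7
```

The first echo prints the default, the second prints the value that was already there.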

Thank you for reading, and see you tomorrow !

Posted By: Dimi
Last Edit: 06 Jan 2012 @ 11:51 AM

Categories: Uncategorized
 05 Jan 2012 @ 8:00 PM 

Hello all,

The last post before a long week-end (I will not be able to post tomorrow, nor on Monday and Tuesday). I’ll see you on Wednesday.

Today, the topic is “The difference between find -exec and find | xargs”

The situation

You have to look into a specific directory and subdirectories for all the files named “*log” because you are quite sure one of them contains the word “error”.

You have two ways to do this :

1 – using -exec

This is a parameter to the find command that allows you to run an extra command on the found files. The syntax is as follows :

-exec <command> {} \;

<command> stands for your command.

{} will be replaced by the name of the found file.

\; terminates the command

i.e. you type :

find . -name \*log -exec grep -i error {} \;

and it will return all the lines containing error, regardless of the case.

2 – using xargs.

xargs is a command that allows piped data to be passed as parameters to another command. The syntax is as follows :

| xargs <command>

xargs simply puts the piped data at the end of the command, i.e. you type :

find . -name \*log | xargs grep -i error

3 – Using a while loop.

Yes, you can do it like that, but it is not the topic of this discussion.


The difference

What is the main difference between the two?

-exec is going to take each file and execute the command with it. Using this, you will get a list of all the matching lines, but not the name of the file, as grep assumes you know which file it is talking about, since you passed the name as a parameter !

A sub-process will be spawned for each file to be checked.

xargs, on the other hand, is going to pass the complete list of files to grep. The names of the files will be shown in the result list.
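You can see the difference side by side (a sketch using throwaway files in /tmp) :

```shell
# Two small log files, only one of which contains the word
mkdir -p /tmp/findemo
echo "an ERROR line" > /tmp/findemo/a.log
echo "all fine here" > /tmp/findemo/b.log

# -exec : grep is run once per file, so there is no file name in the output
find /tmp/findemo -name '*log' -exec grep -i error {} \;

# xargs : grep receives all the files at once and prefixes each match
find /tmp/findemo -name '*log' | xargs grep -i error
```

The first command prints only “an ERROR line”, the second prefixes it with “/tmp/findemo/a.log:”. (GNU grep also has a -H option to force the file name even with a single file, by the way.)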

The separator for the parameters is the space, and this is OK as long as there are no spaces in the names of the files (but who puts spaces in file names ? Ah, OK, users…).

In fact, the foolproof way to deal with this specific request is to use the while loop.
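Such a loop could look like this (a sketch ; -print0 and read -d '' are GNU/BSD find and bash extensions that use the NUL character as separator, so spaces in file names are harmless) :

```shell
# A test tree with a space in one file name (demo paths)
mkdir -p /tmp/loopdemo/sub
echo "all fine"          > /tmp/loopdemo/app.log
echo "fatal ERROR found" > '/tmp/loopdemo/sub/my app.log'

# find emits NUL-separated names, read -r -d '' consumes them safely
find /tmp/loopdemo -name '*log' -print0 |
while IFS= read -r -d '' file; do
    grep -il error "$file"    # -l prints the matching file's name
done
```

Even the file with a space in its name is grepped correctly, which neither plain xargs nor a naive for loop guarantees.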


The conclusion

Both ways can be useful, depending on the request and the situation. It is important that you understand the different ways these tools work, so that you can choose which one is best for your usage.

Thank you for reading, and see you next Wednesday ! (told you, long week-end for me 🙂 )


Posted By: Dimi
Last Edit: 02 Jan 2012 @ 05:17 PM

Categories: Bash, basics, Snippets
 04 Jan 2012 @ 8:00 PM 

Hello, and thank you for coming again !

I have noticed that some of you are coming via the RSS feed, and it is nice to see you are following this blog closely !

Today, the topic is “The equal tilde operator”.

The situation

You have a variable, and you need to check whether its content is a single digit, or the letter “a”, “A”, “b” or “B”.

The solution

You do not use the equal tilde and you die of boredom :

if [[ ${var} = [0-9] ]] || [ "$var" = "a" ] || [ "$var" = "A" ] || [ "$var" = "b" ] || [ "$var" = "B" ] ; then

or you use the equal tilde operator :

if [[ "${var}" =~ ^([0-9]|[abAB])$ ]] ; then

and you can live another day.

The equal tilde operator allows you to use regexes (the same ones you used for sed, remember ?) in an if command.

Note the two square brackets.
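A small demo of the operator (the values are arbitrary) :

```shell
# Report whether $1 matches the rule : one digit, or a/A/b/B
check() {
    if [[ $1 =~ ^([0-9]|[abAB])$ ]]; then
        echo "$1 matches"
    else
        echo "$1 no match"
    fi
}

check 7     # 7 matches
check A     # A matches
check 42    # 42 no match  (two characters, so the anchors reject it)
check c     # c no match
```

Note how the ^ and $ anchors make the whole content match, not just a part of it.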

Thank you for reading, and see you tomorrow !


Nota bene :

You should not put the right side of the =~ operator in quotes, as that makes it match a literal string, and not a regex.

If you do not want to escape everything that might need to be escaped, just put your complicated regex in a variable :

RegEx="My Complicated Regex"

if [[ "${var}" =~ ${RegEx} ]]; then


Posted By: Dimi
Last Edit: 05 Jan 2012 @ 07:54 PM

 03 Jan 2012 @ 8:00 PM 

Today, we’ll see something interesting about the scope of variables in bash, and it is maybe more a bug than anything else.

Before going further, I want to say that I have never had problems with the scope of variables in Bash, but this specific case is very strange.

The situation

Let’s say you have a file “tempo1”, containing two lines :

COMP 2 + 3 + 4
COMP 4 + 5 + 6

You want to compute all the lines that contain “COMP” and provide the total result.

You then write the following script (usually in one line) :

typeset -i i j
cat tempo1 | while read line; do if echo $line | grep COMP >/dev/null 2>&1; then i=`expr \`echo $line | cut -f2- -d' '\``; echo $i; let j=j+i; fi; done

(if the script is unclear, I’ll detail later on)

You assume that, at the end, “j” will have the sum of all, and “i” is going to be the last processed number. Well, you’re wrong.

The problem

At least if you use bash. There seems to be an issue with the context copy for a piped while, i.e. both your variables will be defined, but empty.

$ echo $0 i is $i j is $j
-bash i is j is

Now, let’s re-run the whole lot in ksh :

echo $0 i is $i j is $j
ksh i is 15 j is 24

The reason

Well, after looking on the web, it seems that I am not the only one puzzled. This has been reported as a bug, and we’ll see if it changes in the future.

The solution

Change your code, to avoid the piped while :

while read line; do if echo $line | grep COMP >/dev/null 2>&1; then i=`expr \`echo $line | cut -f2- -d' '\``; echo $i; let j=j+i; fi; done <tempo1

and the problem disappears. Or use ksh for this specific purpose.
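Written out on multiple lines with a demo input file, the fixed version behaves as expected (a sketch ; the path /tmp/tempo1 is made up, and $(...) is used instead of nested backticks, which is easier on the eyes) :

```shell
# Demo input file (hypothetical path)
printf 'COMP 2 + 3 + 4\nCOMP 4 + 5 + 6\n' > /tmp/tempo1

typeset -i i j
j=0
while read line; do
    if echo "$line" | grep COMP >/dev/null 2>&1; then
        # everything after the first field, computed by expr
        i=$(expr $(echo "$line" | cut -f2- -d' '))
        let j=j+i
    fi
done < /tmp/tempo1

# No pipe, so i and j survive the loop
echo "i is $i j is $j"    # i is 15 j is 24
```

Because the loop runs in the current shell, i holds the last computed value (15) and j the running total (9 + 15 = 24).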

Thanks for reading this, and see you tomorrow ! (Should you require more explanation about the script, feel free to read on.)


Explanation of the script

Let’s first explode the script on multiple lines :

typeset -i i j
cat tempo1 | while read line; do
if echo $line | grep COMP >/dev/null 2>&1; then
i=`expr \`echo $line | cut -f2- -d' '\``
echo $i
let j=j+i
fi
done

So, the first line :

typeset -i i j

is there to define the variables i and j as integers (-i) (equivalent to int i, j; in C).

The next line parses the file, looping over each line between the “do” and the “done”. The current line is put in the variable “line”.

If you had written

cat tempo1 | while read JamesTKirk

then the variable containing the line would have been JamesTKirk, which is a bit long to type and also a bit of an overkill, having one of the best Starfleet Captains as your variable.

Next line speaks by itself :

if echo $line | grep COMP >/dev/null 2>&1;  then

means “if the return code of the command echo $line | grep COMP is 0, then execute the following”, and of course this will only be the case if the line in question contains the word “COMP” somewhere. We could have done it with “grep -e '^COMP'”, or with the funny test we’ll see tomorrow, but let’s not split hairs.

i=`expr \`echo $line | cut -f2- -d' '\``

expr allows you to do basic arithmetic. This line means “take everything after the first field on the line, space being the separator, then compute it and store it in the variable i”.

let j=j+i

let is another way to do arithmetic operations, using integer variables.
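For instance (a trivial illustration, with made-up values) :

```shell
typeset -i i=5 j=9
let j=j+i          # j is now 14
echo "$j"
j=$((j + i))       # the POSIX $(( )) form does the same thing
echo "$j"          # 19
```

Both forms evaluate the expression arithmetically ; let is a builtin, $(( )) is an expansion you can use anywhere.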

and voila, done.

Should you require more assistance, feel free to post in the comments or to man the command you want to know about.

Thank you for reading, and see you tomorrow for a special test operator.

Posted By: Dimi
Last Edit: 02 Jan 2012 @ 04:37 PM

Categories: Bash, Snippets
 01 Jan 2012 @ 12:01 AM 

Just a small post to wish you all a happy 2012 !

(If you read this via the RSS, no point of coming to the site : there is nothing more in this post)

Posted By: Dimi
Last Edit: 31 Dec 2011 @ 03:39 PM

Categories: Announces
