Saturday, 17 August 2013

What Role Does SSL Play in Web Application Performance?

A web site or page often contains or displays sensitive information about its users, such as credit card numbers, account numbers and other confidential details. This type of data needs to be defended, so such sites are implemented over Secure Sockets Layer (SSL) to encrypt communications and provide privacy.
An SSL handshake involves authentication and the negotiation of cryptographic algorithms using public and private keys, and it inevitably happens for every single new connection. A request processed over SSL therefore usually takes longer to complete than a request over an unencrypted connection, creating significant performance overhead when the site is accessed by many concurrent users.

To reduce this overhead, it is better to encrypt only the pages that carry sensitive information rather than the whole web site. Consider an eCommerce site that encrypts only the checkout page for payment, and not the login, browse and other sections of the site. You can reduce the overhead further by offloading cryptographic operations to dedicated hardware, which takes the load of these computationally expensive transactions off the web server.
Not using SSL at all is very bad, but overusing it is bad too.
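As a sketch of this selective approach, a site can terminate SSL only for its sensitive paths. The fragment below assumes nginx, and the server name, paths and certificate locations are all hypothetical; the same idea applies to any web server:

```nginx
# Plain HTTP for the browse/login sections (hypothetical names throughout).
server {
    listen 80;
    server_name shop.example.com;
    location / {
        root /var/www/shop;
    }
    # Send the sensitive checkout path to the HTTPS server below.
    location /checkout {
        return 301 https://shop.example.com$request_uri;
    }
}

# HTTPS only for the checkout pages, so the handshake cost is paid
# only where the data is actually sensitive.
server {
    listen 443 ssl;
    server_name shop.example.com;
    ssl_certificate     /etc/ssl/shop.example.com.crt;
    ssl_certificate_key /etc/ssl/shop.example.com.key;
    location /checkout {
        root /var/www/shop;
    }
}
```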

Performance Testing - What, Why & Outcome

Performance Tests determine the run-time “behavior” of the application and its supporting infrastructure, under certain conditions.
Performance Testing is used to measure several system characteristics, such as processing speed, response time, resource consumption, throughput etc.
Need for Performance Testing – 
– The internet and IT infrastructure are crucial to business
– Users – employees, business partners, customers – rely on portals, applications, and data to do
their jobs
– Cost of failure can be devastating
– Does the application respond quickly enough for the intended users?
– Will the application handle the expected user load and beyond?
– Is the application stable under expected and unexpected user loads?
– Are you sure that users will have a positive experience on go-live day?
Performance Test Objectives – 
Application response time
How long does it take to complete a task?
Configuration sizing
Which configuration provides the best performance level?
Acceptance
Is the system stable enough to go into production?
Regression
Does the new version of the software adversely affect response time?
Reliability
How stable is the system under a heavy work load?
Capacity planning
At what point does degradation in performance occur?
Bottleneck identification
What is the cause of degradation in performance?
Product evaluation
What is the best server for 100 Users?
Deliverables of Performance Testing – 
– Performance Test Plan – a document with well-defined Performance Objectives, Performance
Goals, Performance Acceptance Criteria etc.
– Performance Test Scenarios – a series of events that occur during a performance test session.
Performance Test Scenarios are constructed from one or more Performance Test Scripts.
– Performance Test Scripts – define the actions that the “Virtual User” performs on the
application being load tested.
– Performance Test Results – are the outputs of the execution of the various Performance Test
Scenarios. Test results include:
Performance Test Result Summary and Analysis
Baseline Performance Test Results
Detailed Performance Test Results
In-depth analysis and Recommendation

Wednesday, 14 August 2013

Basic UNIX Commands

UNIX is case-sensitive.
Files
ls — lists your files
ls -l — lists your files in ‘long format’, which contains lots of useful information, e.g. the exact size of the file, who owns the file and who has the right to look at it, and when it was last modified.
ls -a — lists all files, including the ones whose filenames begin in a dot, which you do not always want to see.
There are many more options, for example to list files by size, by date, recursively etc.
more filename — shows the first part of a file, just as much as will fit on one screen. Just hit the space bar to see more or q to quit. You can use /pattern to search for a pattern.
emacs filename — is an editor that lets you create and edit a file. See the emacs page.
mv filename1 filename2 — moves a file (i.e. gives it a different name, or moves it into a different directory; see below)
cp filename1 filename2 — copies a file
rm filename — removes a file. It is wise to use the option rm -i, which will ask you for confirmation before actually deleting anything. You can make this your default by making an alias in your .cshrc file.
diff filename1 filename2 — compares files, and shows where they differ
wc filename — tells you how many lines, words, and characters there are in a file
chmod options filename — lets you change the read, write, and execute permissions on your files. The default is that only you can look at them and change them, but you may sometimes want to change these permissions. For example, chmod o+r filename will make the file readable for everyone, and chmod o-r filename will make it unreadable for others again. Note that for someone to be able to actually look at the file the directories it is in need to be at least executable. See help protection for more details.
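A quick sketch of chmod in action, run in a scratch directory so no real files are touched (the file name is made up):

```shell
cd "$(mktemp -d)"               # scratch directory
echo secret > notes.txt
chmod go-rwx notes.txt          # strip all group and other permissions
chmod o+r notes.txt             # let others read it again
ls -l notes.txt                 # group now shows ---, others show r--
```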
File Compression
gzip filename — compresses files, so that they take up much less space. Usually text files compress to about half their original size, but it depends very much on the size of the file and the nature of the contents. There are other tools for this purpose, too (e.g. compress), but gzip usually gives the highest compression rate. Gzip produces files with the ending ‘.gz’ appended to the original filename.
gunzip filename — uncompresses files compressed by gzip.
gzcat filename — lets you look at a gzipped file without actually having to gunzip it (same as gunzip -c). You can even print it directly, using gzcat filename | lpr
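For example (on many systems gzcat is spelled zcat or gzip -dc, so the portable form is used here):

```shell
cd "$(mktemp -d)"
printf 'hello hello hello\n' > greeting.txt
gzip greeting.txt               # greeting.txt is replaced by greeting.txt.gz
gzip -dc greeting.txt.gz        # view the contents without uncompressing on disk
gunzip greeting.txt.gz          # restore the original greeting.txt
```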
Printing
lpr filename — print. Use the -P option to specify the printer name if you want to use a printer other than your default printer. For example, if you want to print double-sided, use ‘lpr -Pvalkyr-d’, or if you’re at CSLI, you may want to use ‘lpr -Pcord115-d’. See ‘help printers’ for more information about printers and their locations.
lpq — check out the printer queue, e.g. to get the number needed for removal, or to see how many other files will be printed before yours will come out
lprm jobnumber — remove something from the printer queue. You can find the job number by using lpq. Theoretically you also have to specify a printer name, but this isn’t necessary as long as you use your default printer in the department.
genscript — converts plain text files into postscript for printing, and gives you some options for formatting. Consider making an alias like alias ecop ‘genscript -2 -r \!* | lpr -h -Pvalkyr’ to print two pages on one piece of paper.
dvips filename — print .dvi files (i.e. files produced by LaTeX). You can use dviselect to print only selected pages. See the LaTeX page for more information about how to save paper when printing drafts.
Directories
Directories, like folders on a Macintosh, are used to group files together in a hierarchical structure.
mkdir dirname — make a new directory
cd dirname — change directory. You basically ‘go’ to another directory, and you will see the files in that directory when you do ‘ls’. You always start out in your ‘home directory’, and you can get back there by typing ‘cd’ without arguments. ‘cd ..’ will get you one level up from your current position. You don’t have to walk along step by step – you can make big leaps or avoid walking around by specifying pathnames.
pwd — tells you where you currently are.
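The three commands fit together like this (directory names are made up):

```shell
cd "$(mktemp -d)"          # start from a scratch directory
mkdir project              # make a new directory
cd project                 # go into it
pwd                        # confirm where we are
cd ..                      # and step one level back up
pwd
```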
Finding things
ff — find files anywhere on the system. This can be extremely useful if you’ve forgotten in which directory you put a file, but do remember the name. In fact, if you use ff -p you don’t even need the full name, just the beginning. This can also be useful for finding other things on the system, e.g. documentation.
grep string filename(s) — looks for the string in the files. This can be useful for a lot of purposes, e.g. finding the right file among many, figuring out which is the right version of something, and even doing serious corpus work. grep comes in several varieties (grep, egrep, and fgrep) and has a lot of very flexible options. Check out the man pages if this sounds good to you.
About other people
w — tells you who’s logged in, and what they’re doing. Especially useful: the ‘idle’ part. This allows you to see whether they’re actually sitting there typing away at their keyboards right at the moment.
who — tells you who’s logged on, and where they’re coming from. Useful if you’re looking for someone who’s actually physically in the same building as you, or in some other particular location.
finger username — gives you lots of information about that user, e.g. when they last read their mail and whether they’re logged in. Often people put other practical information, such as phone numbers and addresses, in a file called .plan. This information is also displayed by ‘finger’.
last -1 username — tells you when the user last logged on and off and from where. Without any options, last will give you a list of everyone’s logins.
talk username — lets you have a (typed) conversation with another user
write username — lets you exchange one-line messages with another user
elm — lets you send e-mail messages to people around the world (and, of course, read them). It’s not the only mailer you can use, but the one we recommend. See the elm page, and find out about the departmental mailing lists (which you can also find in /user/linguistics/helpfile).
About your (electronic) self
whoami — returns your username. Sounds useless, but isn’t. You may need to find out who it is who forgot to log out somewhere, and make sure *you* have logged out.
finger & .plan files
Of course you can finger yourself, too. That can be useful, e.g. as a quick check whether you got new mail. Try to create a useful .plan file soon. Look at other people’s .plan files for ideas. The file needs to be readable for everyone in order to be visible through ‘finger’. Do ‘chmod a+r .plan’ if necessary. You should realize that this information is accessible from anywhere in the world, not just to other people on turing.
passwd — lets you change your password, which you should do regularly (at least once a year). See the LRB guide and/or look at help password.
ps -u yourusername — lists your processes. Contains lots of information about them, including the process ID, which you need if you have to kill a process. Normally, when you have been kicked out of a dialin session or have otherwise managed to get yourself disconnected abruptly, this list will contain the processes you need to kill. Those may include the shell (tcsh or whatever you’re using), and anything you were running, for example emacs or elm. Be careful not to kill your current shell – the one with the number closer to the one of the ps command you’re currently running. But if it happens, don’t panic. Just try again :) If you’re using an X-display you may have to kill some X processes before you can start them again. These will show only when you use ps -efl, because they’re root processes.
kill PID — kills (ends) the processes with the ID you gave. This works only for your own processes, of course. Get the ID by using ps. If the process doesn’t ‘die’ properly, use the option -9. But attempt without that option first, because it doesn’t give the process a chance to finish possibly important business before dying. You may need to kill processes for example if your modem connection was interrupted and you didn’t get logged out properly, which sometimes happens.
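A small sketch of this, using sleep to stand in for any long-running process (here the shell’s $! gives us the process ID directly, instead of reading it off ps):

```shell
sleep 30 &              # a throwaway background process
pid=$!                  # the shell records its process ID in $!
kill "$pid"             # politely ask it to terminate (use -9 only as a last resort)
wait "$pid" 2>/dev/null || true   # collect it so it does not linger
```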
quota -v — show what your disk quota is (i.e. how much space you have to store files), how much you’re actually using, and in case you’ve exceeded your quota (which you’ll be given an automatic warning about by the system) how much time you have left to sort them out (by deleting or gzipping some, or moving them to your own computer).
du filename — shows the disk usage of the files and directories in filename (without argument the current directory is used). du -s gives only a total.
last yourusername — lists your last logins. Can be a useful memory aid for when you were where, how long you’ve been working for, and keeping track of your phonebill if you’re making a non-local phonecall for dialling in.
Connecting to the outside world
nn — allows you to read news. It will first let you read the news local to turing, and then the remote news. If you want to read only the local or remote news, you can use nnl or nnr, respectively. To learn more about nn type nn, then :man, then =.*, then Z, then hit the space bar to step through the manual. Or look at the man page. Or check out the hypertext nn FAQ – probably the easiest and most fun way to go.
rlogin hostname — lets you connect to a remote host
telnet hostname — also lets you connect to a remote host. Use rlogin whenever possible.
ftp hostname — lets you download files from a remote host which is set up as an ftp-server. This is a common method for exchanging academic papers and drafts. If you need to make a paper of yours available in this way, you can (temporarily) put a copy in /user/ftp/pub/TMP. For more permanent solutions, ask Emma. The most important commands within ftp are get for getting files from the remote machine, and put for putting them there (mget and mput let you specify more than one file at once). Sounds straightforward, but be sure not to confuse the two, especially when your physical location doesn’t correspond to the direction of the ftp connection you’re making. ftp just overwrites files with the same filename. If you’re transferring anything other than ASCII text, use binary mode.
lynx — lets you browse the web from an ordinary terminal. Of course you can see only the text, not the pictures. You can type any URL as an argument to the G command. When you’re doing this from any Stanford host you can leave out the .stanford.edu part of the URL when connecting to Stanford URLs. Type H at any time to learn more about lynx, and Q to exit.
Miscellaneous tools
webster word — looks up the word in an electronic version of Webster’s dictionary and returns the definition(s)
date — shows the current date and time.
cal — shows a calendar of the current month. Use e.g. ‘cal 10 1995’ to get that for October 95, or ‘cal 1995’ to get the whole year.
You can find out more about these commands by looking up their manpages:
man commandname — shows you the manual page for the command

An Introduction to the UNIX Shell

Introduction
The shell is both a command language and a programming language that provides an interface to the UNIX operating system. This memorandum describes, with examples, the UNIX shell. The first section covers most of the everyday requirements of terminal users. Some familiarity with UNIX is an advantage when reading this section; see, for example, “UNIX for beginners”.
Simple commands – Simple commands consist of one or more words separated by blanks. The first word is the name of the command to be executed; any remaining words are passed as arguments to the command. For example,
who
is a command that prints the names of users logged in. The command
ls -l
prints a list of files in the current directory. The argument -l tells ls to print status information, size and the creation date for each file.
Background commands – To execute a command the shell normally creates a new process and waits for it to finish. A command may be run without waiting for it to finish. For example,
cc pgm.c &
calls the C compiler to compile the file pgm.c. The trailing & is an operator that instructs the shell not to wait for the command to finish. To help keep track of such a process the shell reports its process number following its creation. A list of currently active processes may be obtained using the ps command.
Input output redirection – Most commands produce output on the standard output that is initially connected to the terminal. This output may be sent to a file by writing, for example,
ls -l >file
The notation >file is interpreted by the shell and is not passed as an argument to ls. If file does not exist then the shell creates it; otherwise the original contents of file are replaced with the output from ls. Output may be appended to a file using the notation
ls -l >>file
In this case file is also created if it does not already exist.
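A short sketch of both forms (the file name is made up):

```shell
cd "$(mktemp -d)"
echo first  > log.txt      # > creates log.txt (or replaces its contents)
echo second >> log.txt     # >> appends instead
cat log.txt                # two lines: first, then second
```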
The standard input of a command may be taken from a file instead of the terminal by writing, for example,
wc <file
The command wc reads its standard input (in this case redirected from file) and prints the number of characters, words and lines found. If only the number of lines is required then
wc -l <file
could be used.
Pipes and filters – The standard output of one command may be connected to the standard input of another by writing the ‘pipe’ operator, indicated by |, as in,
ls -l | wc
Two commands connected in this way constitute a pipeline, and the overall effect is the same as
ls -l >file; wc <file
except that no file is used. Instead the two processes are connected by a pipe (see pipe (2)) and are run in parallel.
Pipes are unidirectional and synchronization is achieved by halting wc when there is nothing to read and halting ls when the pipe is full.
A filter is a command that reads its standard input, transforms it in some way, and prints the result as output. One such filter, grep, selects from its input those lines that contain some specified string. For example,
ls | grep old
prints those lines, if any, of the output from ls that contain the string old. Another useful filter is sort. For example,
who | sort
will print an alphabetically sorted list of logged in users.
A pipeline may consist of more than two commands, for example,
ls | grep old | wc -l
prints the number of file names in the current directory containing the string old.
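Run against a small scratch directory, the pipeline above behaves like this (file names are made up):

```shell
cd "$(mktemp -d)"
touch old-notes old-list readme
ls | grep old              # prints old-list and old-notes
ls | grep old | wc -l      # counts the matching names
```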
File name generation – Many commands accept arguments which are file names. For example,
ls -l main.c
prints information relating to the file main.c.
The shell provides a mechanism for generating a list of file names that match a pattern. For example,
ls -l *.c
generates, as arguments to ls, all file names in the current directory that end in .c. The character * is a pattern that will match any string including the null string. In general patterns are specified as follows.
*
Matches any string of characters including the null string.
?
Matches any single character.
[...]
Matches any one of the characters enclosed. A pair of characters separated by a minus will match any character lexically between the pair.
For example,
[a-z]*
matches all names in the current directory beginning with one of the letters a through z.
/usr/fred/test/?
matches all names in the directory /usr/fred/test that consist of a single character. If no file name is found that matches the pattern then the pattern is passed, unchanged, as an argument.
This mechanism is useful both to save typing and to select names according to some pattern. It may also be used to find files. For example,
echo /usr/fred/*/core
finds and prints the names of all core files in sub-directories of /usr/fred. (echo is a standard UNIX command that prints its arguments, separated by blanks.) This last feature can be expensive, requiring a scan of all sub-directories of /usr/fred.
There is one exception to the general rules given for patterns. The character `.' at the start of a file name must be explicitly matched.
echo *
will therefore echo all file names in the current directory not beginning with `.'.
echo .*
will echo all those file names that begin with `.'. This avoids inadvertent matching of the names `.' and `..' which mean `the current directory' and `the parent directory' respectively. (Notice that ls suppresses information for the files `.' and `..'.)
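A scratch-directory illustration of these patterns (file names are made up):

```shell
cd "$(mktemp -d)"
touch main.c util.c notes.txt .hidden
echo *.c                   # main.c util.c
echo *                     # all names except the one starting with a dot
echo .*                    # the dot names, including . and ..
```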
Quoting – Characters that have a special meaning to the shell, such as * ? | &, are called metacharacters. A complete list of metacharacters is given in appendix B. Any character preceded by a \ is quoted and loses its special meaning, if any. The \ is elided so that
echo \?
will echo a single ?, and
echo \\
will echo a single \. To allow long strings to be continued over more than one line the sequence \newline is ignored.
\ is convenient for quoting single characters. When more than one character needs quoting the above mechanism is clumsy and error prone. A string of characters may be quoted by enclosing the string between single quotes. For example,
echo xx'****'xx
will echo
xx****xx
The quoted string may not contain a single quote but may contain newlines, which are preserved. This is the simplest quoting mechanism and is recommended for casual use.
A third quoting mechanism using double quotes is also available that prevents interpretation of some but not all metacharacters. Discussion of the details is deferred to section 3.4.
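The three mechanisms side by side (note that the third, double quotes, still substitutes variables):

```shell
echo \*                           # backslash quotes one character: a literal *
echo '$HOME is not substituted'   # single quotes: no substitution at all
echo "$HOME is substituted"       # double quotes: $HOME is replaced by its value
```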
Prompting – When the shell is used from a terminal it will issue a prompt before reading a command. By default this prompt is ‘$ ’. It may be changed by saying, for example,
PS1=yesdear
that sets the prompt to be the string yesdear. If a newline is typed and further input is needed then the shell will issue the prompt ‘> ’. Sometimes this can be caused by mistyping a quote mark. If it is unexpected then an interrupt (DEL) will return the shell to read another command. This prompt may be changed by saying, for example,
PS2=more
The shell and login – Following login (1) the shell is called to read and execute commands typed at the terminal. If the user’s login directory contains the file .profile then it is assumed to contain commands and is read by the shell before reading any commands from the terminal.
Summary –
ls
Print the names of files in the current directory.
ls >file
Put the output from ls into file.
ls | wc -l
Print the number of files in the current directory.
ls | grep old
Print those file names containing the string old.
ls | grep old | wc -l
Print the number of files whose name contains the string old.
cc pgm.c &
Run cc in the background.
Shell procedures – The shell may be used to read and execute commands contained in a file. For example,
sh file [ args ... ]
calls the shell to read commands from file. Such a file is called a command procedure or shell procedure. Arguments may be supplied with the call and are referred to in file using the positional parameters $1, $2, …. For example, if the file wg contains
who | grep $1
then
sh wg fred
is equivalent to
who | grep fred
UNIX files have three independent attributes, read, write and execute. The UNIX command chmod (1) may be used to make a file executable. For example,
chmod +x wg
will ensure that the file wg has execute status. Following this, the command
wg fred
is equivalent to
sh wg fred
This allows shell procedures and programs to be used interchangeably. In either case a new process is created to run the command.
As well as providing names for the positional parameters, the number of positional parameters in the call is available as $#. The name of the file being executed is available as $0.
A special shell parameter $* is used to substitute for all positional parameters except $0. A typical use of this is to provide some default arguments, as in,
nroff -T450 -ms $*
which simply prepends some arguments to those already given.
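A small sketch pulling these together (the procedure name args is made up):

```shell
cd "$(mktemp -d)"
cat > args <<'EOF'
echo "name: $0 count: $#"
echo "all: $*"
EOF
sh args one two three      # $0 is args, $# is 3, $* is the argument list
```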
Control flow – for
A frequent use of shell procedures is to loop through the arguments ($1, $2, …) executing commands once for each argument. An example of such a procedure is tel that searches the file /usr/lib/telnos that contains lines of the form

fred mh0123
bert mh0789

The text of tel is
for i
do grep $i /usr/lib/telnos; done
The command
tel fred
prints those lines in /usr/lib/telnos that contain the string fred.
tel fred bert
prints those lines containing fred followed by those for bert.
The for loop notation is recognized by the shell and has the general form
for name in w1 w2 …
do command-list
done
A command-list is a sequence of one or more simple commands separated or terminated by a newline or semicolon. Furthermore, reserved words like do and done are only recognized following a newline or semicolon. name is a shell variable that is set to the words w1 w2 … in turn each time the command-list following do is executed. If in w1 w2 … is omitted then the loop is executed once for each positional parameter; that is, in $* is assumed.
Another example of the use of the for loop is the create command whose text is
for i do >$i; done
The command
create alpha beta
ensures that the two files alpha and beta exist and are empty. The notation >file may be used on its own to create or clear the contents of a file. Notice also that a semicolon (or newline) is required before done.
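The create procedure can be exercised like this:

```shell
cd "$(mktemp -d)"
cat > create <<'EOF'
for i do >$i; done
EOF
sh create alpha beta       # both files now exist and are empty
ls
```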
Control flow – case
A multiple way branch is provided for by the case notation. For example,
case $# in
1) cat >>$1 ;;
2) cat >>$2 <$1 ;;
*) echo \'usage: append [ from ] to\' ;;
esac
is an append command. When called with one argument as
append file
$# is the string 1 and the standard input is copied onto the end of file using the cat command.
append file1 file2
appends the contents of file1 onto file2. If the number of arguments supplied to append is other than 1 or 2 then a message is printed indicating proper usage.
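Written out as a file (with plain quoting around the usage message) and exercised:

```shell
cd "$(mktemp -d)"
cat > append <<'EOF'
case $# in
1) cat >>$1 ;;
2) cat >>$2 <$1 ;;
*) echo 'usage: append [ from ] to' ;;
esac
EOF
echo one > a
echo two > b
sh append a b               # append the contents of a onto b
sh append                   # wrong argument count: prints the usage message
```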
The general form of the case command is
case word in
pattern) command-list;;

esac
The shell attempts to match word with each pattern, in the order in which the patterns appear. If a match is found the associated command-list is executed and execution of the case is complete. Since * is the pattern that matches any string it can be used for the default case.
A word of caution: no check is made to ensure that only one pattern matches the case argument. The first match found defines the set of commands to be executed. In the example below the commands following the second * will never be executed.
case $# in
*) … ;;
*) … ;;
esac
Another example of the use of the case construction is to distinguish between different forms of an argument. The following example is a fragment of a cc command.
for i
do case $i in
-[ocs]) … ;;
-*) echo \'unknown flag $i\' ;;
*.c) /lib/c0 $i … ;;
*) echo \'unexpected argument $i\' ;;
esac
done
To allow the same commands to be associated with more than one pattern the case command provides for alternative patterns separated by a |. For example,
case $i in
-x|-y) …
esac
is equivalent to
case $i in
-[xy]) …
esac
The usual quoting conventions apply so that
case $i in
\?) …
will match the character ?.
The shell procedure tel in section 2.1 uses the file /usr/lib/telnos to supply the data for grep. An alternative is to include this data within the shell procedure as a here document, as in,
for i
do grep $i <<!

fred mh0123
bert mh0789

!
done
In this example the shell takes the lines between <<! and ! as the standard input for grep. The string ! is arbitrary, the document being terminated by a line that consists of the string following <<.
Parameters are substituted in the document before it is made available to grep as illustrated by the following procedure called edg.
ed $3 <<%
g/$1/s//$2/g
w
%
The call
edg string1 string2 file
is then equivalent to the command
ed file <<%
g/string1/s//string2/g
w
%
and changes all occurrences of string1 in file to string2. Substitution can be prevented using \ to quote the special character $ as in
ed $3 <<+
1,\$s/$1/$2/g
w
+
(This version of edg is equivalent to the first except that ed will print a ? if there are no occurrences of the string $1.) Substitution within a here document may be prevented entirely by quoting the terminating string, for example,
grep $i <<\!
…
!
and the document is then presented to grep without any substitution. A parameter name may be enclosed in braces, as in ${name}, to separate it from any following letter or digit. For example, if the variable tmp has the value /tmp/ps then
ps a >${tmp}a
will direct the output of ps to the file /tmp/psa, whereas,
ps a >$tmpa
would cause the value of the variable tmpa to be substituted.
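The two here-document behaviours, with and without substitution, can be seen side by side:

```shell
name=fred
cat <<!
hello $name
!
cat <<\!
hello $name
!
```

The first here document prints hello fred because the terminator is unquoted; in the second the terminator is quoted, so the text passes through untouched as hello $name.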
Except for $? the following are set initially by the shell. $? is set after executing each command.
$?
The exit status (return code) of the last command executed as a decimal string. Most commands return a zero exit status if they complete successfully, otherwise a non-zero exit status is returned. Testing the value of return codes is dealt with later under if and while commands.
$#
The number of positional parameters (in decimal). Used, for example, in the append command to check the number of parameters.
$$
The process number of this shell (in decimal). Since process numbers are unique among all existing processes, this string is frequently used to generate unique temporary file names. For example,
ps a >/tmp/ps$$

rm /tmp/ps$$
$!
The process number of the last process run in the background (in decimal).
$-
The current shell flags, such as -x and -v.
Some variables have a special meaning to the shell and should be avoided for general use.
$MAIL
When used interactively the shell looks at the file specified by this variable before it issues a prompt. If the specified file has been modified since it was last looked at the shell prints the message you have mail before prompting for the next command. This variable is typically set in the file .profile, in the user’s login directory. For example,
MAIL=/usr/mail/fred
$HOME
The default argument for the cd command. The current directory is used to resolve file name references that do not begin with a /, and is changed using the cd command. For example,
cd /usr/fred/bin
makes the current directory /usr/fred/bin.
cat wn
will print on the terminal the file wn in this directory. The command cd with no argument is equivalent to
cd $HOME
This variable is also typically set in the user’s login profile.
$PATH
A list of directories that contain commands (the search path). Each time a command is executed by the shell a list of directories is searched for an executable file. If $PATH is not set then the current directory, /bin, and /usr/bin are searched by default. Otherwise $PATH consists of directory names separated by :. For example,
PATH=:/usr/fred/bin:/bin:/usr/bin
specifies that the current directory (the null string before the first :) , /usr/fred/bin, /bin and /usr/bin are to be searched in that order. In this way individual users can have their own `private’ commands that are accessible independently of the current directory. If the command name contains a / then this directory search is not used; a single attempt is made to execute the command.
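A sketch of such a private command directory (the command name mycmd is made up):

```shell
cd "$(mktemp -d)"
mkdir bin
echo 'echo hello from mycmd' > bin/mycmd
chmod +x bin/mycmd          # make the shell procedure executable
PATH=$(pwd)/bin:$PATH       # put the private directory on the search path
mycmd                       # found via $PATH, regardless of current directory
```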
$PS1
The primary shell prompt string, by default ‘$ ’.
$PS2
The shell prompt when further input is needed, by default ‘> ’.
$IFS
The set of characters used by blank interpretation (see section 3.4).
2.5 The test command
The test command, although not part of the shell, is intended for use by shell programs. For example,
test -f file
returns zero exit status if file exists and non-zero exit status otherwise. In general test evaluates a predicate and returns the result as its exit status. Some of the more frequently used test arguments are given here, see test (1) for a complete specification.
test s
true if the argument s is not the null string
test -f file
true if file exists
test -r file
true if file is readable
test -w file
true if file is writable
test -d file
true if file is a directory
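Since test reports its result as an exit status, $? (described above) shows each verdict directly:

```shell
cd "$(mktemp -d)"
mkdir sub
echo data > file
test -f file; echo $?       # 0: file exists and is a regular file
test -d file; echo $?       # non-zero: file is not a directory
test -r file; echo $?       # 0: file is readable
```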
Control flow – while
The actions of the for loop and the case branch are determined by data available to the shell. A while or until loop and an if then else branch are also provided whose actions are determined by the exit status returned by commands. A while loop has the general form
while command-list1
do command-list2
done
The value tested by the while command is the exit status of the last simple command following while. Each time round the loop command-list1 is executed; if a zero exit status is returned then command-list2 is executed; otherwise, the loop terminates. For example,
while test $1
do …
shift
done
is equivalent to
for i
do …
done
shift is a shell command that renames the positional parameters $2, $3, … as $1, $2, … and loses $1.
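A sketch of while and shift together, counting arguments (the $((…)) arithmetic form is a later addition to the shell, used here for brevity; the procedure name count is made up):

```shell
cd "$(mktemp -d)"
cat > count <<'EOF'
n=0
while test $1
do n=$((n+1)); shift
done
echo $n
EOF
sh count red green blue    # the loop runs once per argument and prints 3
```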
Another kind of use for the while/until loop is to wait until some external event occurs and then run some commands. In an until loop the termination condition is reversed. For example,
until test -f file
do sleep 300; done
commands
will loop until file exists. Each time round the loop it waits for 5 minutes before trying again. (Presumably another process will eventually create the file.)
2.7 Control flow – if
Also available is a general conditional branch of the form,
if command-list
then command-list
else command-list
fi
that tests the value returned by the last simple command following if.
The if command may be used in conjunction with the test command to test for the existence of a file as in
if test -f file
then process file
else do something else
fi
An example of the use of if, case and for constructions is given in section 2.10.
A multiple test if command of the form
if …
then …
else if …
then …
else if …

fi
fi
fi
may be written using an extension of the if notation as,
if …
then …
elif …
then …
elif …

fi
The following example is the touch command which changes the `last modified' time for a list of files. The command may be used in conjunction with make (1) to force recompilation of a list of files.
flag=
for i
do case $i in
-c) flag=N ;;
*) if test -f $i
then ln $i junk$$; rm junk$$
elif test $flag
then echo file \'$i\' does not exist
else >$i
fi
esac
done
The -c flag is used in this command to force subsequent files to be created if they do not already exist. Otherwise, if the file does not exist, an error message is printed. The shell variable flag is set to some non-null string if the -c argument is encountered. The commands
ln …; rm …
make a link to the file and then remove it thus causing the last modified date to be updated.
The sequence
if command1
then command2
fi
may be written
command1 && command2
Conversely,
command1 || command2
executes command2 only if command1 fails. In each case the value returned is that of the last simple command executed.
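A brief sketch of both operators (the path names are illustrative):

```shell
# && runs its right-hand command only if the left one succeeds;
# || runs its right-hand command only if the left one fails.
test -d /tmp && echo "/tmp is present"
test -d /no/such/place || echo "not found"
```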
2.8 Command grouping
Commands may be grouped in two ways,
{ command-list ; }
and
( command-list )
In the first command-list is simply executed. The second form executes command-list as a separate process. For example,
(cd x; rm junk )
executes rm junk in the directory x without changing the current directory of the invoking shell.
The commands
cd x; rm junk
have the same effect but leave the invoking shell in the directory x.
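A short sketch of the difference (the directories are illustrative):

```shell
# The parenthesised form runs in a separate process, so the
# invoking shell's current directory is unchanged afterwards.
cd /
( cd /tmp; pwd )   # prints /tmp
pwd                # still prints /
```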
2.9 Debugging shell procedures
The shell provides two tracing mechanisms to help when debugging shell procedures. The first is invoked within the procedure as
set -v
(v for verbose) and causes lines of the procedure to be printed as they are read. It is useful to help isolate syntax errors. It may be invoked without modifying the procedure by saying
sh -v proc …
where proc is the name of the shell procedure. This flag may be used in conjunction with the -n flag which prevents execution of subsequent commands. (Note that saying set -n at a terminal will render the terminal useless until an end-of-file is typed.)
The command
set -x
will produce an execution trace. Following parameter substitution each command is printed as it is executed. (Try these at the terminal to see what effect they have.) Both flags may be turned off by saying
set -
and the current setting of the shell flags is available as $-.
2.10 The man command – The following is the man command which is used to print sections of the UNIX manual. It is called, for example, as
$ man sh
$ man -t ed
$ man 2 fork
In the first the manual section for sh is printed. Since no section is specified, section 1 is used. The second example will typeset (-t option) the manual section for ed. The last prints the fork manual page from section 2.
cd /usr/man
: 'colon is the comment command'
: 'default is nroff ($N), section 1 ($s)'
N=n s=1
for i
do case $i in
[1-9]*) s=$i ;;
-t) N=t ;;
-n) N=n ;;
-*) echo unknown flag \'$i\' ;;
*) if test -f man$s/$i.$s
then ${N}roff man0/${N}aa man$s/$i.$s
else : 'look through all manual sections'
found=no
for j in 1 2 3 4 5 6 7 8 9
do if test -f man$j/$i.$j
then man $j $i
found=yes
fi
done
case $found in
no) echo \'$i: manual page not found\'
esac
fi
esac
done
Keyword parameters – Shell variables may be given values by assignment or when a shell procedure is invoked. An argument to a shell procedure of the form name=value that precedes the command name causes value to be assigned to name before execution of the procedure begins. The value of name in the invoking shell is not affected. For example,
user=fred command
will execute command with user set to fred. The -k flag causes arguments of the form name=value to be interpreted in this way anywhere in the argument list. Such names are sometimes called keyword parameters. If any arguments remain they are available as positional parameters $1, $2, ….
The set command may also be used to set positional parameters from within a procedure. For example,
set - *
will set $1 to the first file name in the current directory, $2 to the next, and so on. Note that the first argument, -, ensures correct treatment when the first file name begins with a -.
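A brief sketch of set supplying positional parameters (fixed words stand in for generated file names):

```shell
# set assigns its arguments to $1, $2, ...; the leading - protects
# against a first argument that begins with a -.
set - one two three
echo $#    # number of positional parameters: 3
echo $2    # prints two
```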
3.1 Parameter transmission
When a shell procedure is invoked both positional and keyword parameters may be supplied with the call. Keyword parameters are also made available implicitly to a shell procedure by specifying in advance that such parameters are to be exported. For example,
export user box
marks the variables user and box for export. When a shell procedure is invoked copies are made of all exportable variables for use within the invoked procedure. Modification of such variables within the procedure does not affect the values in the invoking shell. It is generally true of a shell procedure that it may not modify the state of its caller without explicit request on the part of the caller. (Shared file descriptors are an exception to this rule.)
Names whose value is intended to remain constant may be declared readonly. The form of this command is the same as that of the export command,
readonly name …
Subsequent attempts to set readonly variables are illegal.
3.2 Parameter substitution
If a shell parameter is not set then the null string is substituted for it. For example, if the variable d is not set
echo $d
or
echo ${d}
will echo nothing. A default string may be given as in
echo ${d-.}
which will echo the value of the variable d if it is set and `.' otherwise. The default string is evaluated using the usual quoting conventions so that
echo ${d-'*'}
will echo * if the variable d is not set. Similarly
echo ${d-$1}
will echo the value of d if it is set and the value (if any) of $1 otherwise. A variable may be assigned a default value using the notation
echo ${d=.}
which substitutes the same string as
echo ${d-.}
and if d were not previously set then it will be set to the string `.'. (The notation ${…=…} is not available for positional parameters.)
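The difference between the two notations can be sketched as follows (the variable name d and the word default are illustrative):

```shell
# ${d-x} substitutes a default without setting d;
# ${d=x} substitutes it and assigns it as well.
unset d
echo ${d-default}   # prints default; d is still unset
echo ${d=default}   # prints default and assigns it to d
echo $d             # prints default
```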
If there is no sensible default then the notation
echo ${d?message}
will echo the value of the variable d if it has one, otherwise message is printed by the shell and execution of the shell procedure is abandoned. If message is absent then a standard message is printed. A shell procedure that requires some parameters to be set might start as follows.
: ${user?} ${acct?} ${bin?}

Colon (:) is a command that is built in to the shell and does nothing once its arguments have been evaluated. If any of the variables user, acct or bin are not set then the shell will abandon execution of the procedure.
3.3 Command substitution – The standard output from a command can be substituted in a similar way to parameters. The command pwd prints on its standard output the name of the current directory. For example, if the current directory is /usr/fred/bin then the command
d=`pwd`
is equivalent to
d=/usr/fred/bin
The entire string between grave accents (`…`) is taken as the command to be executed and is replaced with the output from the command. The command is written using the usual quoting conventions except that a ` must be escaped using a \. For example,
ls `echo "$1"`
is equivalent to
ls $1
Command substitution occurs in all contexts where parameter substitution occurs (including here documents) and the treatment of the resulting text is the same in both cases. This mechanism allows string processing commands to be used within shell procedures. An example of such a command is basename which removes a specified suffix from a string. For example,
basename main.c .c
will print the string main. Its use is illustrated by the following fragment from a cc command.
case $A in

*.c) B=`basename $A .c`

esac
that sets B to the part of $A with the suffix .c stripped.
Here are some composite examples.
· for i in `ls -t`; do …
The variable i is set to the names of files in time order, most recent first.
· set `date`; echo $6 $2 $3, $4
will print, e.g., 1977 Nov 1, 23:59:59
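Command substitution composes with basename in the same way; a small sketch deriving an object file name (main.c is illustrative):

```shell
# Replace a .c suffix with .o using basename inside command substitution.
src=main.c
obj=`basename $src .c`.o
echo $obj    # prints main.o
```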
3.4 Evaluation and quoting – The shell is a macro processor that provides parameter substitution, command substitution and file name generation for the arguments to commands. This section discusses the order in which these evaluations occur and the effects of the various quoting mechanisms.
Commands are parsed initially according to the grammar given in appendix A. Before a command is executed the following substitutions occur.
parameter substitution, e.g. $user
command substitution, e.g. `pwd`
Only one evaluation occurs so that if, for example, the value of the variable X is the string $y then
echo $X
will echo $y.
blank interpretation
Following the above substitutions the resulting characters are broken into non-blank words (blank interpretation). For this purpose `blanks’ are the characters of the string $IFS. By default, this string consists of blank, tab and newline. The null string is not regarded as a word unless it is quoted. For example,
echo ''
will pass on the null string as the first argument to echo, whereas
echo $null
will call echo with no arguments if the variable null is not set or set to the null string.
file name generation
Each word is then scanned for the file pattern characters *, ? and [...] and an alphabetical list of file names is generated to replace the word. Each such file name is a separate argument.
The evaluations just described also occur in the list of words associated with a for loop. Only substitution occurs in the word used for a case branch.
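The blank-interpretation step described above can be observed by resetting $IFS; a minimal sketch (run in a subshell so the change does not persist; the colon-separated string is illustrative):

```shell
# With IFS set to a colon, the unquoted substitution of $list
# is broken into three words by blank interpretation.
( list=a:b:c
  IFS=:
  set - $list
  echo $# )    # prints 3
```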
As well as the quoting mechanisms described earlier using \ and '…' a third quoting mechanism is provided using double quotes. Within double quotes parameter and command substitution occurs but file name generation and the interpretation of blanks does not. The following characters have a special meaning within double quotes and may be quoted using \.
$
parameter substitution
`
command substitution
"
ends the quoted string
\
quotes the special characters $ ` " \
For example,
echo "$x"
will pass the value of the variable x as a single argument to echo. Similarly,
echo "$*"
will pass the positional parameters as a single argument and is equivalent to
echo "$1 $2 …"
The notation $@ is the same as $* except when it is quoted.
echo "$@"
will pass the positional parameters, unevaluated, to echo and is equivalent to
echo "$1" "$2" …
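The distinction shows up when a positional parameter itself contains blanks; a small sketch:

```shell
# "$*" joins all parameters into one argument;
# "$@" keeps each parameter as a separate argument.
set - "one two" three
for i in "$*"; do echo "[$i]"; done   # one word:  [one two three]
for i in "$@"; do echo "[$i]"; done   # two words: [one two] [three]
```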
The following table gives, for each quoting mechanism, the shell metacharacters that are evaluated.
metacharacter
      \   $   *   `   "   '
  '   n   n   n   n   n   t
  `   y   n   n   t   n   n
  "   y   y   n   y   t   n
t  terminator
y  interpreted
n  not interpreted
In cases where more than one evaluation of a string is required the built-in command eval may be used. For example, if the variable X has the value $y, and if y has the value pqr then
eval echo $X
will echo the string pqr.
In general the eval command evaluates its arguments (as do all commands) and treats the result as input to the shell. The input is read and the resulting command(s) executed. For example,
wg='eval who|grep'
$wg fred
is equivalent to
who|grep fred
In this example, eval is required since there is no interpretation of metacharacters, such as |, following substitution.
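A comparable sketch, using tr in place of who and grep so it does not depend on logged-in users:

```shell
# Without eval, | would be passed to echo as an ordinary argument;
# eval causes the stored command line to be re-read by the shell.
cmd='echo shell | tr a-z A-Z'
eval $cmd    # prints SHELL
```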
3.5 Error handling – The treatment of errors detected by the shell depends on the type of error and on whether the shell is being used interactively. An interactive shell is one whose input and output are connected to a terminal (as determined by gtty (2)). A shell invoked with the -i flag is also interactive.
Execution of a command (see also 3.7) may fail for any of the following reasons.
Input – output redirection may fail. For example, if a file does not exist or cannot be created.
The command itself does not exist or cannot be executed.
The command terminates abnormally, for example, with a “bus error” or “memory fault”.
The command terminates normally but returns a non-zero exit status.
In all of these cases the shell will go on to execute the next command. Except for the last case an error message will be printed by the shell. All remaining errors cause the shell to exit from a command procedure. An interactive shell will return to read another command from the terminal. Such errors include the following.
Syntax errors. e.g., if … then … done
A signal such as interrupt. The shell waits for the current command, if any, to finish execution and then either exits or returns to the terminal.
Failure of any of the built-in commands such as cd.
The shell flag -e causes the shell to terminate if any error is detected.
1    hangup
2    interrupt
3*   quit
4*   illegal instruction
5*   trace trap
6*   IOT instruction
7*   EMT instruction
8*   floating point exception
9    kill (cannot be caught or ignored)
10*  bus error
11*  segmentation violation
12*  bad argument to system call
13   write on a pipe with no one to read it
14   alarm clock
15   software termination (from kill (1))
UNIX signals – Those signals marked with an asterisk produce a core dump if not caught. However, the shell itself ignores quit which is the only external signal that can cause a dump. The signals in this list of potential interest to shell programs are 1, 2, 3, 14 and 15.
3.6 Fault handling – Shell procedures normally terminate when an interrupt is received from the terminal. The trap command is used if some cleaning up is required, such as removing temporary files. For example,
trap 'rm /tmp/ps$$; exit' 2
sets a trap for signal 2 (terminal interrupt), and if this signal is received will execute the commands
rm /tmp/ps$$; exit
exit is another built-in command that terminates execution of a shell procedure. The exit is required; otherwise, after the trap has been taken, the shell will resume executing the procedure at the place where it was interrupted.
UNIX signals can be handled in one of three ways. They can be ignored, in which case the signal is never sent to the process. They can be caught, in which case the process must decide what action to take when the signal is received. Lastly, they can be left to cause termination of the process without it having to take any further action. If a signal is being ignored on entry to the shell procedure, for example, by invoking it in the background (see 3.7) then trap commands (and the signal) are ignored.
The use of trap is illustrated by this modified version of the touch command (Figure 4). The cleanup action is to remove the file junk$$.
flag=
trap 'rm -f junk$$; exit' 1 2 3 15
for i
do case $i in
-c) flag=N ;;
*) if test -f $i
then ln $i junk$$; rm junk$$
elif test $flag
then echo file \'$i\' does not exist
else >$i
fi
esac
done
The touch command – The trap command appears before the creation of the temporary file; otherwise it would be possible for the process to die without removing the file.
Since there is no signal 0 in UNIX it is used by the shell to indicate the commands to be executed on exit from the shell procedure.
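A minimal sketch of such an exit trap (the file name /tmp/demo$$ is illustrative):

```shell
# The trap on signal 0 runs when the procedure exits, removing the
# temporary file whether or not the commands below succeed.
tmp=/tmp/demo$$
trap 'rm -f $tmp' 0
echo data >$tmp
grep -c data $tmp    # prints 1; $tmp is removed on exit
```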
A procedure may, itself, elect to ignore signals by specifying the null string as the argument to trap. The following fragment is taken from the nohup command.
trap '' 1 2 3 15
which causes hangup, interrupt, quit and kill to be ignored both by the procedure and by invoked commands.
Traps may be reset by saying
trap 2 3
which resets the traps for signals 2 and 3 to their default values. A list of the current values of traps may be obtained by writing
trap
The procedure scan (Figure 5) is an example of the use of trap where there is no exit in the trap command. scan takes each directory in the current directory, prompts with its name, and then executes commands typed at the terminal until an end of file or an interrupt is received. Interrupts are ignored while executing the requested commands but cause termination when scan is waiting for input.
d=`pwd`
for i in *
do if test -d $d/$i
then cd $d/$i
while echo "$i:"
trap exit 2
read x
do trap : 2; eval $x; done
fi
done
The scan command – read x is a built-in command that reads one line from the standard input and places the result in the variable x. It returns a non-zero exit status if either an end-of-file is read or an interrupt is received.
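The behaviour of read can be sketched outside scan as follows:

```shell
# read fills x with one line of standard input at a time and
# returns a non-zero status at end-of-file, ending the loop.
echo "first
second" |
while read x
do echo "got $x"
done
```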
3.7 Command execution – To run a command (other than a built-in) the shell first creates a new process using the system call fork. The execution environment for the command includes input, output and the states of signals, and is established in the child process before the command is executed. The built-in command exec is used in the rare cases when no fork is required and simply replaces the shell with a new command. For example, a simple version of the nohup command looks like
trap '' 1 2 3 15
exec $*
The trap turns off the signals specified so that they are ignored by subsequently created commands and exec replaces the shell by the command specified.
Most forms of input output redirection have already been described. In the following word is only subject to parameter and command substitution. No file name generation or blank interpretation takes place so that, for example,
echo … >*.c
will write its output into a file whose name is *.c. Input output specifications are evaluated left to right as they appear in the command.
> word
The standard output (file descriptor 1) is sent to the file word which is created if it does not already exist.
>> word
The standard output is sent to file word. If the file exists then output is appended (by seeking to the end); otherwise the file is created.
< word
The standard input (file descriptor 0) is taken from the file word.
>& digit
The file descriptor digit is duplicated using the system call dup (2) and the result is used as the standard output.
<& digit
The standard input is duplicated from file descriptor digit.
>&-
The standard output is closed.
Any of the above may be preceded by a digit in which case the file descriptor created is that specified by the digit instead of the default 0 or 1. For example,
… 2>file
runs a command with message output (file descriptor 2) directed to file.
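A short sketch separating the two output streams (the file name /tmp/err$$ is illustrative):

```shell
# Descriptor 2 is redirected to a file; descriptor 1 is untouched.
( echo good; echo bad >&2 ) 2>/tmp/err$$
cat /tmp/err$$
rm -f /tmp/err$$
```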
… 2>&1
runs a command with its standard output and message output merged.
Appendix A – Grammar
input-output:
> file
< file
>> word
<< word
file:
word
& digit
& -
case-part:
pattern ) command-list ;;
pattern:
word
pattern | word
else-part:
elif command-list then command-list else-part
else command-list
empty
empty:
word:
a sequence of non-blank characters
name:
a sequence of letters, digits or underscores starting with a letter
digit:
0 1 2 3 4 5 6 7 8 9
Appendix B – Meta-characters and Reserved Words
a) syntactic
|
pipe symbol
&&
`andf' symbol
||
`orf' symbol
;
command separator
;;
case delimiter
&
background commands
( )
command grouping
<
input redirection
>
output creation
>>
output append
b) patterns
*
match any character(s) including none
?
match any single character
[...]
match any of the enclosed characters
c) substitution
${…}
substitute shell variable
`…`
substitute command output
d) quoting
\
quote the next character
'…'
quote the enclosed characters except for '
"…"
quote the enclosed characters except for $ ` \ "
e) reserved words
if then else elif fi
case in esac
for while until do done
{ }

QTP Interview Question

A list of QTP interview questions is given below -

1. What are the types of Object Repositories in QTP?
2. What is extension of shared repository?
3. Explain the check points in QTP (bitmap & Image checkpoint)?
4. Difference between re-usable & external actions?
5. How regular Expressions will be useful & when ?
6. What is the use of Split Method?
7. What is Smart-Identification?
8. Difference Between Re-Testing & Regression Testing?
9. Explain about Call to copy & Call to Existing?
10. Explain about Bug-life cycle?
11. What is the purpose of creating Synchronization Points in QTP?
12. How many type of parameters are in QTP?
13. What is the use of Environment parameters in QTP?
14. What is difference between Action & Function
15. What is the use of Ordinal Identifiers in QTP?
16. What is the use of Repository Parameters in QTP?
17. What is the use of Transaction Statement?
18. When to use a Recovery Scenario and when to use "on error resume next"?
19. How to close all browsers from QTP?
20. How to count all links on Web page?
21. I have a Microsoft Access database that contains data I would like to use in my test. How do I do this?
22. How can I check that a child window exists?
23. How can I check the properties of an object in an application without using checkpoints?
24. How can I check if a parameter exists in the DataTable or not?
25. How can I import environment variables from a file on disk?
26. How to load a library file at run time? What are its limitations?
27. Explain the difference between .qfl and .vbs files in QTP.