DATA STRUCTURES NOTES
Stacks
Definition and examples, Primitive operations, Example, The stack as an ADT, Representing stacks in C,
Implementing the pop operation, Testing for exceptional conditions, Implementing the push operation,
Examples for infix, postfix and prefix expressions, Basic definition and examples, Program to evaluate a
postfix expression, Converting an expression from infix to postfix. Program to convert an expression
from infix to postfix.
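As a quick sketch of the stack topics listed above, an array-based stack in C with push and pop, including tests for the exceptional conditions of overflow and underflow, might look like the following (the STACKSIZE name and the sentinel return value are assumptions for illustration, not the exact code from the notes):

#include <stdio.h>

#define STACKSIZE 100            /* illustrative capacity, an assumption */

struct stack {
    int top;                     /* index of the top element, -1 when empty */
    int items[STACKSIZE];
};

/* push: place x on top of the stack, testing for overflow first */
void push(struct stack *ps, int x)
{
    if (ps->top == STACKSIZE - 1) {
        printf("stack overflow\n");
        return;
    }
    ps->items[++ps->top] = x;
}

/* pop: remove and return the top element, testing for underflow first */
int pop(struct stack *ps)
{
    if (ps->top == -1) {
        printf("stack underflow\n");
        return 0;                /* arbitrary sentinel for this sketch */
    }
    return ps->items[ps->top--];
}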
Queues and Lists
The queue and its sequential representation, the queue as an ADT, C implementation of queues, insert operation, priority queue, and array implementation of the priority queue.
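Along the same lines, a minimal array-based queue in C might be sketched as below (the count field and the MAXQUEUE name are assumptions for illustration; the notes' own implementation may track front and rear indices instead):

#include <stdio.h>

#define MAXQUEUE 100             /* illustrative capacity, an assumption */

struct queue {
    int items[MAXQUEUE];
    int front;                   /* index of the first element */
    int count;                   /* number of elements currently stored */
};

/* insert: append x at the rear; a full queue is the exceptional condition */
void insert(struct queue *pq, int x)
{
    if (pq->count == MAXQUEUE) {
        printf("queue overflow\n");
        return;
    }
    pq->items[(pq->front + pq->count) % MAXQUEUE] = x;
    pq->count++;
}

/* qremove: take the element at the front of the queue */
int qremove(struct queue *pq)
{
    int x;
    if (pq->count == 0) {
        printf("queue underflow\n");
        return 0;                /* arbitrary sentinel for this sketch */
    }
    x = pq->items[pq->front];
    pq->front = (pq->front + 1) % MAXQUEUE;
    pq->count--;
    return x;
}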
Linked lists, inserting and removing nodes from a list. Linked implementation of stacks, getnode and
freenode operations, Linked implementation of queues, Linked list as a data structure, Example of list
operations, Header nodes, Lists in C, Array implementation of lists, Limitations of array implementation,
allocating and freeing dynamic variables, Linked lists using dynamic variables, Queues as lists in C,
Examples of list operations in C, non-integer and non-homogeneous lists, other list structures: circular lists, stack as a circular list, queue as a circular list, primitive operations on a circular list, doubly linked lists.
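A small sketch of the linked-list node operations listed above, using malloc and free as the dynamic getnode and freenode operations (the helper names insafter and delafter are illustrative assumptions, not necessarily those used in the notes):

#include <stdlib.h>

struct node {
    int info;
    struct node *next;
};

/* getnode: allocate a new node dynamically */
struct node *getnode(void)
{
    return (struct node *) malloc(sizeof(struct node));
}

/* insafter: insert a node carrying x immediately after node p */
void insafter(struct node *p, int x)
{
    struct node *q = getnode();
    q->info = x;
    q->next = p->next;
    p->next = q;
}

/* delafter: remove the node following p, returning its value through px */
void delafter(struct node *p, int *px)
{
    struct node *q = p->next;
    *px = q->info;
    p->next = q->next;
    free(q);                     /* the freenode operation */
}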
Searching
Basic search techniques: algorithmic notations, sequential searching, searching an ordered table, indexed sequential search, binary search, interpolation search. Tree searching: inserting into a binary search tree, deleting from a binary search tree. Hashing: resolving hash clashes by open addressing, choosing a hash function.
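As one example of the search techniques listed above, binary search on a sorted array might be sketched in C as follows (a standard version, not quoted from the notes):

/* binsearch: search a sorted array a[0..n-1] for key; return its index or -1 */
int binsearch(const int a[], int n, int key)
{
    int low = 0, high = n - 1;

    while (low <= high) {
        int mid = low + (high - low) / 2;   /* avoids overflow of low + high */
        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            low = mid + 1;                  /* key lies in the upper half */
        else
            high = mid - 1;                 /* key lies in the lower half */
    }
    return -1;                              /* key not present */
}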
Analysis of Algorithms
INTRODUCTION
A common belief is that a computer can do anything. This is far from the truth. In reality, a computer can perform only certain predefined instructions. The formal representation of this model as a sequence of instructions is called an algorithm, and an algorithm coded in a specific computer language is called a program. Analysis of algorithms has long been an area of research in computer science; the evolution of very high-speed computers has not diluted the need for the design of time-efficient algorithms.
Complexity theory in computer science is a part of the theory of computation dealing with the resources required during computation to solve a given problem. The most common resources are time (how many steps it takes to solve a problem) and space (how much memory it takes to solve a problem). It may be noted that complexity theory differs from computability theory, which deals with whether a problem can be solved at all by an algorithm, regardless of the resources required.
Analysis of algorithms is a field of computer science whose overall goal is to understand the complexity of algorithms. While a great deal of research is devoted to worst-case evaluation, the focus in these notes is on methods for average-case analysis. One can easily see that the focus has shifted from the computer, to computer programming, and then to the creation of an algorithm. This is algorithm design, the heart of problem solving.
Asymptotic Analysis
Asymptotic analysis is based on the idea that as the problem size grows, the complexity can be described as a simple proportionality to some known function. This idea is captured in the "Big O", "Omega" and "Theta" notations for asymptotic performance.
Notations such as "Little o" are similar in spirit to "Big O", but are rarely used in computer science for asymptotic analysis.
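As a quick illustration of these notations (standard textbook definitions, not quoted from the notes): f(n) = O(g(n)) if there are constants c > 0 and n0 such that f(n) <= c·g(n) for all n >= n0; f(n) = Ω(g(n)) if f(n) >= c·g(n) for all n >= n0; and f(n) = Θ(g(n)) if both bounds hold. For example, 3n² + 5n + 2 = O(n²), since 3n² + 5n + 2 <= 4n² for every n >= 6.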
Tradeoff between space and time complexity
We may sometimes seek a tradeoff between space and time complexity. For example, we may have to choose a data structure that requires a lot of storage in order to reduce the computation time. Therefore, the programmer must make a judicious choice from an informed point of view. The programmer must have some verifiable basis on which a data structure or algorithm can be selected. Complexity analysis provides such a basis.
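A small, hypothetical illustration of such a tradeoff (assumed for this sketch, not taken from the notes): a memo table spends O(n) extra storage so that each Fibonacci value is computed only once, turning an exponential-time recursion into a linear-time one.

#include <stdio.h>

#define MAXN 90                  /* illustrative limit; fib(90) still fits in long long */

static long long fib_table[MAXN + 1];   /* extra space traded for computation time */

/* fib: memoized Fibonacci -- O(n) time and O(n) extra space,
   whereas the plain recursive version takes exponential time */
long long fib(int n)
{
    if (n <= 1)
        return n;
    if (fib_table[n] != 0)       /* already computed: constant-time lookup */
        return fib_table[n];
    return fib_table[n] = fib(n - 1) + fib(n - 2);
}

int main(void)
{
    printf("fib(40) = %lld\n", fib(40));   /* fast thanks to the table */
    return 0;
}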
We will learn about various techniques to bound the complexity function. In fact, our aim is not to count the exact number of steps of a program or the exact amount of time required for executing an algorithm. In the theoretical analysis of algorithms, it is common to estimate their complexity in an asymptotic sense, i.e., to estimate the complexity function for a reasonably large length of input ‘n’. Big O notation (O), omega notation (Ω) and theta notation (Θ) are used for this purpose. In order to measure the performance of the algorithm underlying a computer program, our approach is based on a concept called the asymptotic measure of the complexity of an algorithm. There are notations such as O, Ω and Θ for the asymptotic measure of the growth functions of algorithms, the most common being big-O notation. The asymptotic analysis of algorithms is often used because the time taken to execute an algorithm varies with the input ‘n’ and other factors which may differ from computer to computer and from run to run.
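As a concrete illustration of counting steps rather than measuring clock time (a standard example, assumed here rather than taken from the notes): sequential search examines at most n elements, so its worst-case running time grows linearly with n. The time per comparison differs from computer to computer, but the O(n) growth rate does not.

/* seqsearch: sequential search of a[0..n-1]; at most n comparisons, hence O(n) worst case */
int seqsearch(const int a[], int n, int key)
{
    int i;
    for (i = 0; i < n; i++)
        if (a[i] == key)
            return i;            /* found after i + 1 comparisons */
    return -1;                   /* not found after n comparisons */
}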