Call processing delay analysis in cellular networks: a queuing model approach
Abstract
A mobile device generates a call request message while initiating a call, which is then sent to a base station (BS). The BS processes the call request message and then decides whether to accept or reject the call. It is important to analyze the total time, which includes the waiting time and the processing time, spent by a call request message in the system. If this time is greater than the permissible delay, the call will be blocked. Hence, the quality of service is degraded, which is not acceptable to service providers. Further, failures (hardware and software related) and their recovery increase the delay. With this background, in this paper we present queuing models for the analysis of the delay experienced by a call request message for two cases: first, when the services of the BS are not interrupted, and second, when services are interrupted due to the occurrence of failures at the BS. In the end, we discuss special cases of the proposed queuing models based on the distribution of the processing times.
Keywords
Call processing, Delay, Queuing model, M/G/1 queues, Service interruption
1. Introduction
Cellular networks divide a geographic area into smaller regions called cells. A base station (BS), which provides wireless connectivity through wireless channels, serves each cell. Several BSs are connected to a base station controller (BSC), which in turn is connected to a network subsystem. The network subsystem consists of the mobile switching center (MSC), the home location register (HLR) and the visitor location register (VLR). The MSC is responsible for authenticating the mobile user, storing location information and routing users' calls to the appropriate networks.
Figure 1 shows the generic architecture of a cellular network. To establish
a communication session or a call, the mobile station (MS) sends a request
for radio channels through a channel access procedure. On successfully receiving a call request message, the BS interacts with its BSC and the network subsystem to carry out the processing of the call request message and hence takes the decision to accept or reject the call. A BS accepts the call if an idle wireless channel can be allocated to the call for the communication. To utilize the limited radio resources at a BS efficiently, call admission control (CAC) schemes are incorporated. Some of the common CAC schemes include the fixed guard channel scheme proposed by Haring et al. [1], the dynamic CAC scheme proposed by Li et al. [2] and the channel borrowing CAC scheme of Cao et al. [3]. Ahmed [4] gives a comprehensive survey of CAC schemes.
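As an illustration of such an admission decision, the sketch below implements a simple fixed guard-channel policy in Python; the function and parameter names are ours, and the logic is a simplified reading of the idea rather than the exact algorithm of [1].

```python
# Minimal sketch of a fixed guard-channel admission decision.
# `guard` channels are reserved for handoff calls; all names and
# values here are illustrative, not taken from [1]-[3].

def admit(free_channels: int, guard: int, is_handoff: bool) -> bool:
    """Return True if the call request can be accepted."""
    if is_handoff:
        # Handoff calls may use any idle channel, including the guard channels.
        return free_channels > 0
    # New calls are accepted only if idle channels beyond the guard remain.
    return free_channels > guard

# Example: 3 idle channels, 2 of them reserved as guard channels.
print(admit(free_channels=3, guard=2, is_handoff=False))  # True
print(admit(free_channels=2, guard=2, is_handoff=False))  # False (only guard channels left)
print(admit(free_channels=1, guard=2, is_handoff=True))   # True
```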
The processing time of a call request message varies across CAC schemes. For example, in the fixed guard channel scheme, a BS searches for an idle channel in its channel pool. If it finds an idle channel, this channel is allocated to the call request. In the channel borrowing scheme, if a BS does not possess an idle channel, it borrows a channel from one of its neighbor cells and allocates this channel to the call request. It is apparent that the time taken to process a call request message in the two schemes is not the same and is random.
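To illustrate why the processing time is random and differs across schemes, the following Python sketch samples one processing time under each scheme; the channel-pool size, per-channel check time and borrowing probability are hypothetical placeholders, not values from the cited papers.

```python
import random

def guard_channel_processing_time(pool_size: int = 50,
                                   per_channel_check: float = 1e-4) -> float:
    """Time to scan the local pool for an idle channel (position assumed uniform)."""
    return random.randint(1, pool_size) * per_channel_check

def channel_borrowing_processing_time(pool_size: int = 50,
                                      per_channel_check: float = 1e-4,
                                      borrow_overhead_mean: float = 5e-3) -> float:
    """Local scan plus, if no idle channel is found, a random borrowing delay
    (control-message exchange with a neighboring cell)."""
    t = random.randint(1, pool_size) * per_channel_check
    no_idle_channel = random.random() < 0.3   # assumed probability, for illustration only
    if no_idle_channel:
        t += random.expovariate(1.0 / borrow_overhead_mean)
    return t
```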
From the viewpoint of service providers, it is necessary to analyze the delay in processing a call request message and its waiting time in the queue. If this time is greater than the permissible delay, the corresponding call will be blocked. Hence, the Quality of Service (QoS) is degraded, which is unacceptable.
Tipper et al. [5] discussed how failures at a BS increase the call processing time and degrade QoS. Chen et al. [6] discussed hardware failures (such as failures in electromechanical equipment) and software failures
(such as Heisenbugs and Bohrbugs). To mitigate the impact of failures at a BS, several fault-tolerant strategies are incorporated by network designers. Varshney et al. [7] proposed fault tolerance strategies that are responsible for the recovery of failures at a BS. After the recovery, the BS resumes processing of a call request message. Thus, failures and their recovery delay the processing of a call request message. It therefore becomes important to compute the mean processing time of a call request message in the presence of failures and their recovery.
In this paper, we present a queuing model to analyze the processing delay of a call request. Failures at a BS adversely affect the call blocking and dropping probabilities, and the call processing delay is further increased by these failures. Fault tolerance strategies are then applied to restore the services at the BS. Before incorporating fault tolerance strategies, it is essential to analyze the overhead incurred by them. Hence, in the presence of failures and their recovery, we model the processing delay of call request messages by an M/G/1 queuing model with vacations, where a vacation corresponds to a failure and the services of the BS are resumed after recovery from the failure.
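For background, a classical decomposition result for an M/G/1 queue with multiple server vacations states that the mean waiting time equals the ordinary M/G/1 waiting time plus the mean residual vacation time, E[W] = λE[S²]/(2(1−ρ)) + E[V²]/(2E[V]) with ρ = λE[S] < 1. The model developed in Section 4 is of this flavour, although its exact assumptions may differ; the small Python helper below is only a sketch of this textbook formula, not the paper's derivation.

```python
def mgv1_mean_wait(lam: float, es: float, es2: float, ev: float, ev2: float) -> float:
    """Mean waiting time in an M/G/1 queue with multiple vacations
    (textbook decomposition result; requires rho = lam * es < 1).

    lam : arrival rate (lambda)
    es  : E[S],  mean service time        es2 : E[S^2], second moment
    ev  : E[V],  mean vacation length     ev2 : E[V^2], second moment
    """
    rho = lam * es
    if rho >= 1.0:
        raise ValueError("queue is unstable: rho >= 1")
    return lam * es2 / (2.0 * (1.0 - rho)) + ev2 / (2.0 * ev)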
The rest of this paper is organized as follows. In Section 2, we describe the system model of a cellular network. In Section 3, we present the analytical model to compute the mean and variance of the total time spent in the system by a call request message for the case when service is not interrupted by failures at the BS. In Section 4, a similar analysis is carried out by considering failures and recovery at the BS. In Section 5, we obtain the mean and variance of the total time spent in the system for specific service distributions. Finally, in Section 6, we discuss the inferences drawn from these models.
2. System model
We consider future-generation cellular networks in which the network traffic consists of real-time service (RTS) calls (for example, voice and video conferencing) and non-real-time service (NRTS) calls (for example, SMS, data transfer and e-mail). We assume that RTS and NRTS call request messages arrive independently according to Poisson processes with rates λ1 and λ2, respectively. Both types of call request messages join a single queue. Therefore, the arrival of call request messages at the single queue is a Poisson process with rate λ = λ1 + λ2. Each call request message is processed on a first-come, first-served basis. The processing time of a call request message depends on the CAC scheme employed at the BS. After the processing, the decision is taken to accept or reject the call. We assume an infinite buffer for call request messages. This assumption is justified because a call is accepted or rejected only after its request message has been processed.
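As a quick sanity check of the merged-arrival assumption above, the following Python sketch (the rates and horizon are arbitrary assumptions) superposes two independently generated Poisson streams and verifies that the mean inter-arrival time of the merged stream is close to 1/(λ1 + λ2).

```python
import random

def poisson_arrival_times(lam: float, horizon: float) -> list:
    """Arrival instants of a Poisson process with rate lam on [0, horizon]."""
    t, times = 0.0, []
    while True:
        t += random.expovariate(lam)
        if t > horizon:
            return times
        times.append(t)

lam1, lam2, horizon = 0.4, 0.6, 10_000.0   # assumed rates (requests per second)
merged = sorted(poisson_arrival_times(lam1, horizon) + poisson_arrival_times(lam2, horizon))
gaps = [b - a for a, b in zip(merged, merged[1:])]
print(f"mean inter-arrival time ~ {sum(gaps)/len(gaps):.4f}  (expected {1/(lam1+lam2):.4f})")
```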
The service time of a BS, denoted by S, is defined as the time taken by the BS to process a call request message. The distribution of S depends on the CAC scheme. For example, in the channel borrowing scheme of Cao et al. [3], the time to process a call request message is the sum of the time taken to search for an idle channel in the channel pool and the time to borrow an idle channel from one of the neighbor cells. This channel borrowing process involves exchanges of control messages between the BS and its neighbors. The processing time then includes the propagation time, the transmission time and the time for computing the set of channels that can be borrowed from the neighboring cells. Thus, the assumption of a standard exponential distribution for the processing time of a call request message is not valid. We assume that the service times of RTS and NRTS call request messages are independent and identically distributed with cumulative distribution function G(·) and mean 1/μ. Note that the call holding times for RTS and NRTS calls are possibly different.
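For illustration, the sketch below draws service times S from one plausible non-exponential G: a fixed propagation-plus-transmission component and an exponentially distributed computation component, in the spirit of the channel borrowing description above. All numeric values are assumptions, and the Monte Carlo mean approximates 1/μ.

```python
import random

PROPAGATION = 0.5e-3      # seconds, assumed
TRANSMISSION = 1.0e-3     # seconds, assumed
MEAN_COMPUTE = 2.0e-3     # seconds, assumed mean of the computation delay

def sample_service_time() -> float:
    """One draw of S = propagation + transmission + exponential computation time.
    The resulting G is a shifted exponential, i.e. not a standard exponential."""
    return PROPAGATION + TRANSMISSION + random.expovariate(1.0 / MEAN_COMPUTE)

samples = [sample_service_time() for _ in range(100_000)]
mean_service = sum(samples) / len(samples)   # Monte Carlo estimate of 1/mu
print(f"estimated 1/mu = {mean_service:.6f} s")
```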
Failures at a BS further delay the processing of a call request message. Fault tolerance strategies are incorporated to handle these failures. The overhead associated with these strategies is the recovery time of the failures at the BS. It is essential to analyze the impact on the call processing delay of call request messages in the presence of failures and their recovery. For this purpose, we construct a queuing model. In the next section, we first develop an M/G/1 queuing model without considering failures at the BS.
3. Queuing model for delay in processing without service interruptions
We begin by describing the analytical model for computing the mean and variance of the total time spent in the system by a call request message. This total time accounts for the processing delay of a call request message. Let the random variable X(t) represent the number of call request messages at the BS at time t; then {X(t), t ≥ 0} is a stochastic process with state space {0, 1, 2, …}. We assume that the service time of the BS is generally distributed and that the network traffic follows a Poisson process. Further, we assume that no failures occur at the BS. With these assumptions, {X(t), t ≥ 0} is a non-Markovian process and is modeled as an M/G/1 queuing system.
Figure 2 shows the queuing model for the system without service interruptions. The service discipline is first-come, first-served, and the inter-arrival times and service times are independent.
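As a reference point for the analysis that follows, the classical Pollaczek-Khinchine result for the M/G/1 queue gives the mean total time in the system as E[T] = E[S] + λE[S²]/(2(1−ρ)), with ρ = λE[S] < 1. The minimal Python sketch below evaluates this textbook formula; the paper's Section 3 derives the mean and variance formally, and its expressions may be stated differently.

```python
def mg1_mean_sojourn(lam: float, es: float, es2: float) -> float:
    """Mean total time (waiting + processing) in a stable M/G/1 queue,
    from the Pollaczek-Khinchine formula.

    lam : arrival rate lambda     es : E[S]     es2 : E[S^2]
    """
    rho = lam * es
    if rho >= 1.0:
        raise ValueError("queue is unstable: rho >= 1")
    return es + lam * es2 / (2.0 * (1.0 - rho))

# Example with exponential service (E[S^2] = 2 E[S]^2): reduces to the M/M/1 value.
print(mg1_mean_sojourn(lam=0.5, es=1.0, es2=2.0))   # 2.0, i.e. 1/(mu - lambda) with mu = 1
```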