
Thesis
MARKOVIAN MODELS: EFFICIENT NUMERICAL METHODS AND ALGORITHMS

Abstract

A Markov chain is a fundamental mathematical structure, commonly used in the performance and dependability modeling and analysis of complex systems, such as computer and telecommunication systems. One of the important criteria to assess the quality of such systems is efficiency. Efficiency is usually evaluated using a model that captures the performance and dependability behavior of the system, in our case a Markov process. The evaluation of a Markov model involves the computation of transient and steady-state probability distributions, as well as the calculation of other measures of interest. In dealing with real systems, one is usually confronted with (at least) two problems: (1) the size of the state space of the system being modeled is difficult to deal with, a problem known as the state-space explosion problem, and (2) the time required to numerically evaluate the model is very large. In this thesis we accept that the state space of the systems being modeled may be large. Consequently, we focus mainly on developing efficient numerical techniques for the evaluation of transient and steady-state probability distributions in Markov chains (MCs). The results of this thesis fall into three broad topics in the analysis of Markovian models, summarized as follows: (1) Specialized steady-state analysis techniques: we develop new numerical methods to calculate the steady-state probability distribution and the moments of the number of customers in a particular queueing system, the M/Er/1 queue. The techniques presented here can be extended and applied to other queueing models as well, since they are not specifically dependent on this particular model. (2) General transient analysis techniques: we propose a new methodology to accelerate (in some cases, by several orders of magnitude) the calculation of the transient distribution of a Markov model, with respect to currently used techniques. (3) General steady-state analysis techniques: we present a new methodology to evaluate the steady-state distribution of an MC by means of a transformation consisting of three steps: transforming the original MC by modifying its transitions; finding the steady-state solution of the transformed MC using any of the existing methods; and converting the transformed MC's solution into the solution of the original chain.
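To make the steady-state computation mentioned in the abstract concrete, the following is a minimal sketch (not a method from the thesis) of the standard linear-algebra approach: the steady-state distribution pi of an irreducible continuous-time Markov chain with generator Q satisfies pi Q = 0 together with the normalization sum(pi) = 1. The 3-state generator below is a hypothetical example chosen for illustration.

```python
import numpy as np

# Generator matrix Q of a small illustrative CTMC (a hypothetical
# 3-state example, not a model from the thesis); each row sums to zero.
Q = np.array([
    [-3.0,  2.0,  1.0],
    [ 1.0, -4.0,  3.0],
    [ 2.0,  2.0, -4.0],
])

# pi satisfies pi @ Q = 0 with sum(pi) = 1. Replace one balance
# equation with the normalization constraint and solve the system.
A = Q.T.copy()
A[-1, :] = 1.0            # normalization row: entries of pi sum to 1
b = np.zeros(3)
b[-1] = 1.0
pi = np.linalg.solve(A, b)

print(pi)                 # steady-state probabilities
print(pi @ Q)             # should be numerically close to the zero vector
```

For large state spaces this direct solve is exactly what becomes impractical (the state-space explosion problem the abstract refers to), which motivates specialized and transformation-based methods.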
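For the transient distributions discussed in the abstract, a common baseline technique is uniformization; the sketch below implements that textbook method (it is not the accelerated scheme proposed in the thesis) for an assumed small generator Q and initial distribution p0.

```python
import numpy as np

def transient_uniformization(Q, p0, t, eps=1e-10):
    """Transient distribution p(t) = p0 @ expm(Q t) via uniformization.

    With uniformization rate L >= max_i |Q[i, i]|, P = I + Q / L is a
    stochastic matrix and
        p(t) = sum_{k >= 0} e^{-L t} (L t)^k / k!  *  p0 @ P^k.
    The series is truncated once the accumulated Poisson weight
    exceeds 1 - eps.
    """
    L = max(-np.diag(Q))                  # uniformization rate
    P = np.eye(Q.shape[0]) + Q / L        # uniformized DTMC
    weight = np.exp(-L * t)               # Poisson(L t) term for k = 0
    acc = weight
    v = p0.astype(float)                  # running p0 @ P^k
    p_t = weight * v
    k = 0
    while acc < 1.0 - eps:
        k += 1
        v = v @ P
        weight *= L * t / k
        acc += weight
        p_t += weight * v
    return p_t

# Hypothetical 3-state example, starting in state 0.
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -4.0,  3.0],
              [ 2.0,  2.0, -4.0]])
p0 = np.array([1.0, 0.0, 0.0])
p_t = transient_uniformization(Q, p0, 0.5)
print(p_t, p_t.sum())
```

The cost of this baseline grows with L * t (more Poisson terms, each a vector-matrix product), which is why accelerating transient analysis, sometimes by orders of magnitude as the abstract states, matters for realistic models.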

Description

Catalogued from the PDF version of the thesis
