# Markov Analysis and Competing Values Framework

## Introduction

Like decision analysis, Markov analysis is a probabilistic technique, but it does not recommend a decision. Instead, it provides probabilistic information about a situation requiring resolution, which can help a decision-maker make an informed choice; the technique performs no optimization. Markov analysis is therefore a descriptive method that yields probabilistic information. It is generally applied to systems that undergo probabilistic changes from one form, state, or condition to another over time. For instance, Markov analysis can show the probability that a system functions on one day and then fails on the next, or that a customer switches from one product to another (brand switching) in the second month.


Markov analysis is a technique for modeling complex system designs influenced by time, event sequencing, repair, redundancy, and fault tolerance. The analysis is performed by drawing system state transition diagrams and assessing those diagrams to understand how unwanted states are reached and with what probabilities. The technique is applied to determine system performance, availability, reliability, dependability, and safety (Ericson II, 2005). It is used to describe both failed states and degraded states of operation, in which the system is partially failed or running at reduced output, with some functions executed and others not. The technique includes Markov chains, which model random processes whose changes occur only at specific fixed periods. However, most physical occurrences noted in daily life change continuously over time; machine breakdowns, telephone call arrivals, and radioactive decay, for instance, are continuous processes (Ericson II, 2005). In a Markov process, changes occur continuously over time, and the future depends only on the present state, independent of history. Currently known probabilities are thus applied to determine the probabilities of future occurrences (Render, Stair Jr., Hanna, & Hale, 2014), providing a basic framework for assessing system dependability, reliability, and safety. Markov analysis is based on multiple types of Markov processes. This essay presents the evolution of Markov analysis and explains its features and assumptions. It also offers instances of Markov applications, including states and state probabilities. The essay's thesis is that Markov analysis offers robust qualitative and quantitative means of understanding changes in a system or process over time.

## History of Markov Analysis

Andrey Markov (1856–1922), a Russian mathematician, was the father of Markov analysis (Ericson II, 2005). Markov's early work focused on number theory before turning to the systematic mathematical evaluation and description of random processes, which later evolved into probability theory. His research concentrated on the probability of mutually dependent events, and he was able to extend the central limit theorem to such dependent sequences. Markov also presented the model of chained events that formed the foundation for Markov chains and for what is today referred to as Markov analysis. In 1953, Paul Lévy expanded the field by introducing the semi-Markov process, a broader model for probabilistic systems.

The Markov analysis technique is grouped under system design hazard analysis (Ericson II, 2005) and is widely used for that purpose (D’Amico, Manca, Corini, Petroni, & Prattico, 2016). Its main goal is to offer a method for graphically modeling and evaluating system elements to determine issues related to system reliability, safety, and dependability. The graphical model can then be converted into a mathematical model for probability analysis. Markov analysis is preferred because of its capability to accurately model and numerically assess complex system designs, especially systems involving repair and dependencies. Additionally, the analysis is used to model the operation, or the related failures, of complex system designs. Markov models can be built from comprehensive element designs or at a more abstract subsystem design level. The technique usually yields an extensive mathematical model that aids understanding of system failure states, transition states, and timing, and the analysis scales as the system grows in size. However, because systems are complex, attempts are usually made to restrict the analysis to small system applications so that the outcomes remain manageable. Further, the model can be used in the early development of a system, thereby helping to identify system design challenges in different processes. Early use of Markov analysis normally assists system developers in building safety and reliability features into a system at the initial stages rather than embarking on corrective measures after a failure or catastrophe.

It is imperative to acknowledge that Markov analysis is a somewhat complex technique to learn, comprehend, and apply effectively (Ericson II, 2005). A higher level of mathematical comprehension is therefore necessary to use it. Markov analysis requires users to master its techniques, comprehend the materials, and have prior knowledge of the processes being modeled; an analyst should be seasoned and adept with the mathematical models involved. While Ericson II (2005) shows that Markov analysis is difficult, Render et al. (2014) claim that the technique is like many other quantitative techniques in that one can study it at any level of depth and sophistication. Further, Render et al. (2014) assert that the main mathematical requirements are basic matrix manipulations and the ability to solve systems of equations with several unknowns. Users who are not familiar with such techniques may review them for fresh insights, especially matrices and other important mathematical applications.

While Markov analysis is an important analysis tool, it does not necessarily offer robust benefits to system users or safety analysts relative to other tools, such as fault tree analysis (Ericson II, 2005). As previously observed, Markov analysis is often applied in reliability work to assess the availability of a system. The technique, however, does not identify hazards; its major role is to model state transitions to improve comprehension of system operation and to determine failure state probabilities. For complex systems, simplified Markov models are necessary to avoid excessively large and complex analyses. Markov analysis is mainly required only when highly refined probability assessments are necessary. Fault tree analysis, for instance, is recommended in most instances because it relies on a combinatorial model to extract useful data from system designs, and its probability estimates are often equal or extremely close to those obtained from Markov analysis (Ericson II, 2005). In such instances, fault tree analysis is applied to extremely large, complex systems that would be difficult to handle using Markov analysis.


Markov analysis relies on a state transition diagram, a directed graph that shows in one illustration the operational and failure states of a system. The state diagram is flexible enough to represent a single aspect of a system or the whole system; it covers the states of the system, the transitions between them, and the rates of those transitions. The diagram offers adequate information for creating the state equations, which, when solved, yield the probability of every state. In Markov analysis, a state is the condition of a system or component at a given point in time; a state may be a failed state, an operational state, or a degraded state, among others. States are used to enumerate all potential conditions of a system or process. For instance, a system may be functioning correctly or functioning poorly, which reflects two states of the system. States can thus illustrate the functional capabilities and state probabilities of a process or system. In a small town with three grocery stores, a customer can be associated with only a single store at any point in time, so three states can be derived from the grocery stores. Any condition of interest under Markov analysis may be considered a state. According to Render et al. (2014), states in Markov analysis are collectively exhaustive and mutually exclusive. Collectively exhaustive means that all potential states of the system can be listed, so the system has a finite number of states. Mutually exclusive means that the system may be in only a single state at any point in time (Render et al., 2014). For instance, a student may be in only one class at a given time, and a machine is either functional or not, never in two or more states simultaneously. A fundamental task of Markov analysis is therefore to determine the exact state of a system or process at any given point in time.

Markov analysis is based on four assumptions. A given course level may limit a detailed exploration of Markov mathematics, so Markov processes are often restricted to these four assumptions to aid understanding at introductory levels. First, there is a finite or limited number of possible states. Second, the probability of changing states remains constant over time. Third, any future state can be predicted from the current state and the matrix of transition probabilities. Finally, the size and makeup of the system, such as the number of customers and manufacturers, do not change during the evaluation.
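These assumptions can be made concrete with a small sketch. The two-state machine and its numbers below are hypothetical; the check simply verifies that each row of the transition matrix sums to 1, as a finite, collectively exhaustive, mutually exclusive state set requires:

```python
# Hypothetical two-state machine: state 0 = functioning, state 1 = failed.
# P[i][j] is the (assumed time-constant) probability of moving from state i
# to state j in one period; each row must sum to 1.
P = [
    [0.9, 0.1],  # functioning -> functioning, functioning -> failed
    [0.3, 0.7],  # failed -> functioning (repair), failed -> failed
]

for i, row in enumerate(P):
    assert abs(sum(row) - 1.0) < 1e-9, f"row {i} does not sum to 1"
print("valid transition matrix")
```

Because the probabilities are assumed constant (assumption two), this single matrix describes every period of the analysis.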

While the primary rules for constructing a state diagram are not complex, effective comprehension of the system under analysis is important. Construction starts with an assessment of the system and identification of the potential states in which it may be found. The application of Markov analysis is presented in the following table.

Table 1. Application of Markov analysis

| Step | Task | Description |
| --- | --- | --- |
| 1 | Problem definition | A company defines its current problem, e.g., low customer retention and loyalty |
| 2 | Creating a model | Analysts develop a Markov model of the identified problem; classification of the problem may be necessary |
| 3 | Obtaining the relevant data | Analysts collect data on each problem classification to develop transition probabilities, which demonstrate potential states of customer movements across different classes. The analyst constructs the state diagram, develops the mathematical equations from it, and solves them using manual techniques or computer programs |
| 4 | Testing the solution | Analysts create a tool to evaluate customer behaviors and responses to various efforts driven by the company |
| 5 | Analysis of results | Results are evaluated based on the Markov analysis; changes in customer responses to marketing efforts are observed, and related cost savings are also critical to the analysis |
| 6 | Implementation of results | Recommended design changes are implemented as found necessary from the Markov analysis, and new developments are tracked throughout the process |

A Markov process is a stochastic process that reflects the movement of an entity across a finite number of defined states (Abner, Charnigo, & Kryscio, 2013). The entity must occupy exactly one of these states at any given moment, and potential movements are captured using a transition matrix or a state diagram. For the process to terminate, at least one of the states must be absorbing (Abner et al., 2013); that is, once an absorbing state is entered, there is zero probability of exiting it. In the clinical setting, for instance, death is considered an absorbing state, especially when evaluating competing risks in clinical outcomes. Markov processes are either continuous or discrete in time and may or may not be homogeneous.
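The effect of an absorbing state can be sketched numerically. The three-state clinical matrix below is hypothetical; repeatedly applying the transition step shows the probability mass accumulating in the absorbing state:

```python
# Hypothetical three-state clinical model: 0 = healthy, 1 = ill, 2 = dead.
# State 2 is absorbing: once entered, the probability of leaving is zero.
P = [
    [0.80, 0.15, 0.05],
    [0.20, 0.60, 0.20],
    [0.00, 0.00, 1.00],  # absorbing row: remains in state 2 with probability 1
]

def step(pi, P):
    """One transition: pi_next[j] = sum_i pi[i] * P[i][j]."""
    return [sum(pi[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]

pi = [1.0, 0.0, 0.0]   # start in the healthy state
for _ in range(200):   # iterate many periods
    pi = step(pi, P)

# Probability mass ends up almost entirely in the absorbing state.
print(round(pi[2], 4))  # prints 1.0
```

Because the absorbing state is reachable from both transient states, the long-run probability of occupying it approaches 1 regardless of the starting vector.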

## Application of Markov Analysis

Markov analysis has multiple applications in business. Organizations have applied the technique to determine market share, predict college enrollments, forecast bad debts and changes in weather patterns, and assess hazards, such as whether a machine will fail or a patient will die (Abner et al., 2013; Render et al., 2014; Ericson II, 2005). Based on the technique, two firms may, for instance, have different market shares in the first month, and those shares may change in the future with market dynamics. To predict the future state, one must know the likelihood or probability of each potential change from one state to another. For a given problem, these probabilities are arranged in a table or matrix (the matrix of transition probabilities) showing the possibility of the system changing from one period to the next (Render et al., 2014). The resulting transitions form the Markov process, which is used to predict future states or conditions of a system or process.
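A one-step prediction from a matrix of transition probabilities can be sketched as follows; the two-firm market shares and switching rates are hypothetical:

```python
# Hypothetical two-firm brand-switching example.
# pi[i] = probability (market share) of a customer being with firm i this month.
pi = [0.6, 0.4]

# P[i][j] = probability that a customer of firm i is with firm j next month.
P = [
    [0.8, 0.2],  # 80% of firm 1's customers stay, 20% switch to firm 2
    [0.1, 0.9],  # 10% of firm 2's customers switch to firm 1, 90% stay
]

# Next month's shares: the vector-matrix product pi_next = pi * P.
pi_next = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
print([round(x, 2) for x in pi_next])  # prints [0.52, 0.48]
```

Applying the same multiplication repeatedly projects the shares further into the future, one month per multiplication.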

Once the states have been determined, the next step is to identify the probability of the system being in each state. A vector of state probabilities captures that information.

In some instances, especially when a single item is the subject, one can know the state of a system with certainty. For example, a machine is either functioning correctly or not at a given point in time, and a simple vector of probabilities is constructed to reflect this.

Such a vector might show that the system is functioning well: the probability of state 1 (operating correctly) is 1, and the probability of state 2 (not operating correctly) is 0 for that specific period. In most instances, however, systems are complex, and the vector of probabilities captures more than one item.

Grocery stores in small towns, for instance, can apply Markov analysis to determine their market shares. Market share supplies the initial probability values, and because customers tend to switch brands or stores over time, managers are interested in how market share changes. Markov analysis is applied to understand customer loyalty: managers can predict the proportion of repeat customers and the proportion that will switch to competitors in the next month. This leads to the matrix of transition probabilities, which allows managers to relate the current state to potential future states (Sedlmeier, Mieruch, Schädler, & Kottmeier, 2016). Once the state probabilities and the matrix of transition probabilities have been assessed, analysts can predict future state probabilities (Render et al., 2014). Additionally, the technique can be applied to predict machine operations and hazards, determine equilibrium conditions, and reduce marketing costs.
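The equilibrium condition mentioned above can be sketched by iterating the transition step until the shares stop changing. The three-store loyalty figures below are hypothetical:

```python
# Hypothetical three-store brand-switching matrix: P[i][j] is the probability
# that a customer of store i shops at store j next month.
P = [
    [0.80, 0.10, 0.10],
    [0.10, 0.70, 0.20],
    [0.20, 0.20, 0.60],
]

def step(pi):
    """One month of brand switching: pi_next = pi * P."""
    return [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

pi = [1.0, 0.0, 0.0]   # start with every customer at store 1
for _ in range(500):
    pi = step(pi)

# At equilibrium the shares no longer change: pi == pi * P.
assert all(abs(a - b) < 1e-9 for a, b in zip(pi, step(pi)))
print([round(s, 3) for s in pi])  # prints [0.421, 0.316, 0.263]
```

A notable property of this equilibrium is that the same long-run shares emerge no matter which store the customers start at; only the transition probabilities matter.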

Markov analysis offers some major advantages to users. The technique provides an accurate model of design elements such as sequencing, repair, fault tolerance, and other complexities, and it helps users understand system operations, possible failures, and repair (Liu & Kato, 2016). Finally, Markov analysis can be used in early development stages to identify safety and design issues for immediate review. However, Markov analysis also has weaknesses. First, the technique does not identify specific hazards; it only analyzes issues that have already been identified. Second, it does not assess the cause of a problem but only evaluates how components may fail. Third, while computer programs are available to aid users, Markov analysis is a complex technique that requires seasoned analysts to create vectors, generate diagram models, and perform probability calculations. Finally, in complex and large systems, Markov analysis can be difficult to apply; other models, such as fault tree analysis, can assess large systems or processes with simpler probabilistic mathematical models. Analysts may, however, combine Markov analysis with such related techniques to analyze large systems or processes using less complicated quantitative models.

When applying Markov analysis, analysts are advised to avoid some common mistakes that may affect the integrity of results. First, analysts should ensure that they have the necessary training in both qualitative and quantitative analysis. It is also sensible to apply simpler alternatives, such as fault tree analysis, when appropriate, rather than a complex Markov analysis. Analysts must always recognize that the probabilities of transition from one state to another are assumed to remain constant (Ericson II, 2005, p. 332); constant failure and repair rates, for instance, should be confirmed before applying the technique. Finally, only the present state, and not the system's history, determines the transition probabilities: future states are independent of everything except the current state, and the model only represents systems with this memoryless property.

## Conclusion

This essay discussed Markov analysis, including its evolution, history, and applications. Markov analysis is a tool for modeling complex system designs, and organizations apply it to understand issues including market share, customer loyalty, attrition, cost reduction, system failure, hazards, and even quality outcome management. The technique is robust in that it offers both mathematical (probabilistic) and graphical representations of models. Markov analysis is most valuable when precise probability calculations are required; otherwise, it may not be suitable for assessing excessively complex, large systems.

## References

Abner, E. L., Charnigo, R. J., & Kryscio, R. J. (2013). Markov chains and semi-Markov models in time-to-event analysis. Journal of Biometrics & Biostatistics, (S1), e001. Web.

D’Amico, G., Manca, R., Corini, C., Petroni, F., & Prattico, F. (2016). Tornadoes and related damage costs: Statistical modelling with a semi-Markov approach. Geomatics, Natural Hazards and Risk, 7(5), 1600-1609. Web.

Ericson II, C. A. (2005). Hazard analysis techniques for system safety. Hoboken, NJ: John Wiley & Sons, Inc.

Liu, J., & Kato, N. (2016). A Markovian analysis for explicit probabilistic stopping-based information propagation in postdisaster ad hoc mobile networks. IEEE Transactions on Wireless Communications, 15(1), 81-90. Web.

Render, B., Stair, Jr., R. M., Hanna, M. E., & Hale, T. S. (2014). Quantitative analysis for management (12th ed.). New York, NY: Pearson.

Sedlmeier, K., Mieruch, S., Schädler, G., & Kottmeier, C. (2016). Compound extremes in a changing climate – A Markov chain approach. Nonlinear Processes in Geophysics, 23(6), 375-390. Web.

