Search published articles



Rassoul Noorossana, Paria Soleimani,
Volume 23, Issue 3 (9-2012)
Abstract

Profile monitoring in statistical quality control has recently attracted the attention of many researchers. A profile is a functional relationship between a response variable and one or more independent variables. Only a limited number of studies have addressed the monitoring of multivariate profiles; indeed, monitoring correlated multivariate profiles is a new subject in the field of statistical process control. In this paper, we investigate the effect of autocorrelation on monitoring multivariate linear profiles in phase II. The effect of three main models, namely AR(1), MA(1), and ARMA(1,1), on multivariate linear profile monitoring methods is evaluated and compared through a simulation study using the average run length criterion. Results indicate that autocorrelation significantly affects the performance of the existing methods.
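
As a rough illustration of the simulation logic described above (not the paper's multivariate method), the following Python sketch estimates the in-control average run length of a simple chart when the profile errors follow an AR(1) model; the coefficient phi, the 3-sigma limit, and the profile length are all illustrative assumptions.

import numpy as np

def ar1_errors(n, phi, sigma=1.0, rng=None):
    # generate n AR(1) errors with a stationary starting value
    rng = rng or np.random.default_rng()
    e = np.empty(n)
    e[0] = rng.normal(0, sigma / np.sqrt(1 - phi**2))
    for t in range(1, n):
        e[t] = phi * e[t - 1] + rng.normal(0, sigma)
    return e

def estimate_arl(phi, limit=3.0, n_points=10, n_runs=500, seed=1):
    rng = np.random.default_rng(seed)
    run_lengths = []
    for _ in range(n_runs):
        rl = 0
        while True:
            rl += 1
            # plotted statistic: mean residual of one profile, standardized
            # as if the residuals were independent
            z = ar1_errors(n_points, phi, rng=rng).mean() * np.sqrt(n_points)
            if abs(z) > limit:
                break
        run_lengths.append(rl)
    return float(np.mean(run_lengths))

print(estimate_arl(phi=0.0))   # close to the nominal in-control ARL of about 370
print(estimate_arl(phi=0.5))   # noticeably shorter: autocorrelation inflates false alarms
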
Abbas Saghaei, Maryam Rezazadeh-Saghaei, Rasoul Noorossana, Mehdi Doori,
Volume 23, Issue 4 (11-2012)
Abstract

In many industrial and non-industrial applications, the quality of a process or product is characterized by a relationship between a response variable and one or more explanatory variables. This relationship is referred to as a profile. In the past decade, profile monitoring has been extensively studied for normal response variables, but little attention has been paid to profiles with non-normal response variables. In this paper, the focus is on binary responses following the Bernoulli distribution, due to their application in many fields of science and engineering. Some methods have been suggested for monitoring such profiles in phase I, the modeling phase; however, no method has been proposed for monitoring them in phase II, the detecting phase. In this paper, two methods are proposed for phase II logistic profile monitoring. The first method is a combination of two exponentially weighted moving average (EWMA) control charts for monitoring the mean and variance of the residuals defined in logistic regression models, and the second method is a multivariate T2 chart for monitoring the model parameters. A simulation study is conducted to investigate the performance of the proposed methods.
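
A minimal sketch of the first proposed idea, an EWMA chart applied to residuals of a logistic profile in phase II; the in-control coefficients, the intercept shift, the subgroup size, and the chart constants below are illustrative assumptions rather than the paper's settings.

import numpy as np

rng = np.random.default_rng(0)
beta0, beta1 = -1.0, 0.5             # assumed in-control logistic coefficients
x = np.linspace(0, 4, 10)            # fixed explanatory-variable levels
m = 50                               # Bernoulli trials per level
lam, L = 0.2, 3.0                    # EWMA smoothing constant and limit width

p_in = 1 / (1 + np.exp(-(beta0 + beta1 * x)))          # in-control probabilities
p_out = 1 / (1 + np.exp(-(beta0 + 0.5 + beta1 * x)))   # shifted intercept after profile 15

z = 0.0
for t in range(1, 41):
    p_true = p_in if t <= 15 else p_out
    y = rng.binomial(m, p_true)                               # observed counts per level
    resid = (y - m * p_in) / np.sqrt(m * p_in * (1 - p_in))   # Pearson residuals vs in-control model
    z = lam * resid.mean() + (1 - lam) * z                    # EWMA of the mean residual
    sigma_z = np.sqrt(lam / (2 - lam) / len(x))               # asymptotic EWMA standard deviation
    if abs(z) > L * sigma_z:
        print("signal at profile", t)
        break
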
Vorya Zarei, Iraj Mahdavi, Reza Tavakkoli-Moghaddam, Nezam Mahdavi-Amiri,
Volume 24, Issue 1 (2-2013)
Abstract

Existing works that consider a flow-based discount factor in hub-and-spoke problems assume that increasing the amount of flow passing through each edge of the network continuously decreases the unit flow transportation cost. Although a higher volume of flow allows wider links and consequently cheaper transportation, the unit of flow enjoys more discounts only when the current link is replaced by a cheaper link type (i.e., increasing the volume of flow without changing the link type would not affect the unit flow transportation cost). Here, we take a new approach and introduce multi-level capacities for designing hub-and-spoke networks, where alternative links with known capacities, installation costs, and discount factors are available to be installed on each network edge. The flow transportation cost and the link installation cost are calculated according to the type of links installed on the network edges; thus, not only are the optimal hub locations and spoke allocations determined, but the appropriate link type to be installed on each network edge is also specified. The capacitated multiple allocation p-hub median problem (CMApHMP) using the multi-level capacity approach is then formulated as a mixed-integer linear program (MILP). We also present a new MILP for the hub location problem using a similar approach in order to restrict the amount of flow transmitted through the hubs. By defining diseconomies of scale for each hub type, this model represents congestion at the hubs and balances the flow transmitted between them. Two new formulations are presented for both the p-hub median and the hub location problems which require a flow between two non-hub nodes to be transferred directly whenever a direct link between the nodes is available. These models are useful for general cost structures in which the costs are not required to satisfy the triangular inequality. Direct links between non-hub nodes are allowed in all the proposed formulations.
Shervin Asadzadeh , Abdollah Aghaie, Hamid Shahriari ,
Volume 24, Issue 2 (6-2013)
Abstract

Monitoring the reliability of products in both manufacturing and service processes is of major concern in today's competitive market. To this end, statistical process control has been widely used to control reliability-related quality variables. Existing surveillance schemes have addressed processes with independent quality characteristics. In multistage processes, however, the cascade property must be properly accounted for, which entails establishing the relationship among quality variables for the purpose of optimal process monitoring. In some cases, measuring the values of specific covariates is not possible without considerable financial cost. Consequently, the unmeasured covariates impose unobserved heterogeneity, which decreases the detection power of a control scheme. The picture becomes more complicated when a censoring mechanism leads to inaccurate recording of the process response values. Hence, frailty and Cox proportional hazards models are employed and two regression-adjusted monitoring procedures are constructed to effectively account for both observed and unobserved influential covariates in the presence of censoring. A simulation-based study reveals that the proposed scheme based on the cumulative sum control chart outperforms its competing procedure, with smaller out-of-control average run length values.
Ali Yahyatabar Arabi, Abdolhamid Eshraghnia Jahromi, Mohammad Shabannataj,
Volume 24, Issue 2 (6-2013)
Abstract

Redundancy is a well-known technique for enhancing the reliability and availability of non-repairable systems; for repairable systems, however, another factor becomes prominent, namely the number of maintenance resources. In this study, the availability optimization of series-parallel systems is modeled using a Markovian process in which the number of maintenance resources enters the objective model under constraints such as cost, weight, and volume. Due to the complexity of the resulting nonlinear program, solving the model with commercial software is not practical, so a simple heuristic method, simulated annealing, is applied. Our main contribution in this study is the development of a new availability model with a new decision variable, the number of maintenance resources. A numerical example is solved and the results are reported to demonstrate the efficiency of the method.
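
The following is a toy sketch of the solution approach described above: simulated annealing over the number of redundant units and maintenance resources per subsystem under a cost budget. The availability expression is a crude stand-in for the Markovian model of the paper, and all rates and costs are assumed for illustration.

import math, random

LAMBDA = [0.02, 0.03, 0.01]       # assumed failure rates per subsystem
MU = [0.5, 0.4, 0.6]              # assumed repair rates per maintenance resource
UNIT_COST, RES_COST, BUDGET = [3, 2, 4], [5, 5, 5], 60

def availability(n, r):
    # crude approximation: subsystem i is up if at least one of its n[i] units is up,
    # with repairs limited by min(r[i], n[i]) crews
    a = 1.0
    for i in range(3):
        rho = LAMBDA[i] / (MU[i] * min(r[i], n[i]))
        a *= 1.0 - rho ** n[i]
    return a

def cost(n, r):
    return sum(UNIT_COST[i] * n[i] + RES_COST[i] * r[i] for i in range(3))

def neighbour(n, r):
    n, r = n[:], r[:]
    i = random.randrange(3)
    if random.random() < 0.5:
        n[i] = max(1, n[i] + random.choice([-1, 1]))
    else:
        r[i] = max(1, r[i] + random.choice([-1, 1]))
    return n, r

n, r = [1, 1, 1], [1, 1, 1]
best, T = (availability(n, r), n[:], r[:]), 1.0
for _ in range(5000):
    cn, cr = neighbour(n, r)
    if cost(cn, cr) > BUDGET:
        continue
    delta = availability(cn, cr) - availability(n, r)
    if delta > 0 or random.random() < math.exp(delta / T):
        n, r = cn, cr
        if availability(n, r) > best[0]:
            best = (availability(n, r), n[:], r[:])
    T *= 0.999                     # geometric cooling schedule
print(best)
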
Hossein Akbaripour, Ellips Masehian,
Volume 24, Issue 2 (6-2013)
Abstract

The main advantage of heuristic or metaheuristic algorithms compared to exact optimization methods is their ability to handle large-scale instances within a reasonable time, albeit at the expense of losing the guarantee of an optimal solution. Therefore, metaheuristic techniques are appropriate choices for solving NP-hard problems to near optimality. Since the parameters of heuristic and metaheuristic algorithms have a great influence on their effectiveness and efficiency, parameter tuning and calibration have gained importance. In this paper, a new approach for robust parameter tuning of heuristics and metaheuristics is proposed, based on a combination of Design of Experiments (DOE), Signal-to-Noise (S/N) ratio, Shannon entropy, and VIKOR methods, which considers not only the solution quality and the number of fitness function evaluations but also the minimization of the running time. In order to evaluate the performance of the suggested approach, a computational analysis has been performed on the Simulated Annealing (SA) and Genetic Algorithm (GA) methods, which have been successfully applied to the n-queens and the Uncapacitated Single Allocation Hub Location combinatorial problems, respectively. Extensive experimental results show that, using the presented approach, the average number of iterations and the average running time of the SA were improved 12 and 10.2 times, respectively, compared to the un-tuned SA. Also, the quality of certain solutions was improved in the tuned GA, while the average running time was 2.5 times faster than that of the un-tuned GA.
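
As a small illustration of one ingredient of the approach, the smaller-the-better signal-to-noise ratio can be computed for replicated runs of each candidate parameter setting; the settings and running times below are hypothetical.

import numpy as np

def sn_smaller_is_better(y):
    # Taguchi smaller-the-better S/N ratio over replicated responses
    # (e.g., running times) at one parameter setting
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# three hypothetical SA parameter settings, five replicated running times each
settings = {"A": [4.1, 3.9, 4.3, 4.0, 4.2],
            "B": [2.7, 3.1, 2.9, 2.8, 3.0],
            "C": [2.9, 5.5, 2.6, 6.1, 2.8]}
for name, runs in settings.items():
    print(name, round(sn_smaller_is_better(runs), 2))
# Setting B dominates: low mean and low variability, hence the highest S/N ratio.
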
Mohammad Saber Fallah Nezhad,
Volume 24, Issue 4 (12-2013)
Abstract

In this research, the decision on belief (DOB) approach was employed to analyze and classify the states of univariate quality control systems. The concept of DOB and its application in decision-making problems were introduced, and then a methodology for modeling a statistical quality control problem with the DOB approach was discussed. In this iterative approach, the belief that the system is out of control was updated by taking new observations on a given quality characteristic. This can be performed using Bayes' rule and prior beliefs. If the belief exceeds a specified threshold, the system is classified as being in an out-of-control condition. Finally, a numerical example and a simulation study were provided to evaluate the performance of the proposed method.
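
A minimal sketch of the belief-updating step described above, assuming normal in-control and out-of-control densities; the means, prior belief, threshold, and observations are illustrative, not the paper's model.

from scipy.stats import norm

MU0, MU1, SIGMA = 0.0, 1.0, 1.0       # assumed in-control / out-of-control means
THRESHOLD = 0.6                       # illustrative classification threshold

def update_belief(belief, x):
    # Bayes' rule: posterior belief of being out of control given observation x
    like_out = norm.pdf(x, MU1, SIGMA)
    like_in = norm.pdf(x, MU0, SIGMA)
    num = like_out * belief
    return num / (num + like_in * (1.0 - belief))

belief = 0.1                          # prior belief of being out of control
for x in [0.2, 1.4, 0.9, 1.8, 1.1]:   # hypothetical observations
    belief = update_belief(belief, x)
    print(round(belief, 3))
    if belief > THRESHOLD:
        print("classify the process as out of control")
        break
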
Hamidreza Navidi, Amirhossein Amiri, Reza Kamranrad ,
Volume 25, Issue 3 (7-2014)
Abstract

In this paper, a new approach based on game theory is proposed for the optimization of multi-response problems. Game theory is a useful tool for decision making under conflicts of interest between intelligent players, allowing the best joint strategy to be selected through the best joint desirability. The present research applies the game theory approach by defining each response as a player and the factors as that player's strategies. This approach can determine the best settings of the predictor factors in order to obtain the best joint desirability of the responses. To this end, the signal-to-noise (S/N) ratio for each response is calculated for the joint values of the strategies, and the resulting S/N ratios for each strategy are arranged in a game table. Finally, using the Nash equilibrium method, the best strategy, i.e., the best values of the predictor factors, is determined. A real case and a numerical example are given to show the efficiency of the proposed method. In addition, the performance of the proposed method is compared with the VIKOR method.
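
The following sketch illustrates the final step, locating a pure-strategy Nash equilibrium in a payoff table whose entries are S/N ratios of the two responses; the strategies and payoff values are hypothetical, not the case-study data.

import itertools

sn = {  # (strategy of player/response 1, strategy of player/response 2): (S/N of response 1, S/N of response 2)
    ("low", "low"): (12.1, 10.4), ("low", "high"): (11.8, 13.2),
    ("high", "low"): (14.0, 9.7), ("high", "high"): (13.5, 12.9),
}
s1 = sorted({k[0] for k in sn})
s2 = sorted({k[1] for k in sn})

def is_nash(a, b):
    # neither player can improve its own S/N ratio by deviating unilaterally
    u1, u2 = sn[(a, b)]
    best1 = all(sn[(x, b)][0] <= u1 for x in s1)
    best2 = all(sn[(a, y)][1] <= u2 for y in s2)
    return best1 and best2

equilibria = [(a, b) for a, b in itertools.product(s1, s2) if is_nash(a, b)]
print(equilibria)   # joint factor setting(s) taken as the recommended design
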
Mahdi Bashiri, Mahdyeh Shiri, Mohammad Hasan Bakhtiarifar,
Volume 26, Issue 2 (7-2015)
Abstract

There are many real problems in which multiple responses should be optimized simultaneously by setting process variables. One of the common approaches for optimizing multi-response problems is the desirability function. In most real cases, there is a correlation structure between the responses, so ignoring the correlation may lead to misleading results. Hence, in this paper a robust approach based on the desirability function is extended to optimize multiple correlated responses. The main contribution of the current study is incorporating the correlation structure into robust optimization by combining a joint confidence interval with the desirability function method. A genetic algorithm is employed to solve the resulting problem. The effectiveness of the proposed method is illustrated through computational examples, and comparisons with previous methods are performed to show the applicability of the proposed approach. A sensitivity analysis is also provided to show the relationship between correlation and robustness in these approaches.
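
For reference, a minimal sketch of the underlying desirability-function machinery (individual desirabilities and their geometric mean) that the paper extends to correlated responses; the targets and predicted responses are assumed for illustration.

import numpy as np

def d_target(y, low, target, high, s=1.0, t=1.0):
    # Derringer-Suich target-is-best individual desirability in [0, 1]
    if y < low or y > high:
        return 0.0
    if y <= target:
        return ((y - low) / (target - low)) ** s
    return ((high - y) / (high - target)) ** t

responses = [52.0, 7.8]                     # hypothetical predicted responses
specs = [(40, 55, 60), (5, 8, 10)]          # (low, target, high) per response
d = [d_target(y, *spec) for y, spec in zip(responses, specs)]
overall = float(np.prod(d) ** (1.0 / len(d)))   # overall desirability D
print([round(v, 3) for v in d], round(overall, 3))
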


Selva Staub, Sam Khoury, Kouroush Jenab,
Volume 26, Issue 3 (9-2015)
Abstract

Enterprise Resource Planning (ERP) has become one of the most strategic tools an organization can employ, and a leading ERP solution is SAP®. It has been employed by organizations to enable collaboration on different projects and to integrate all aspects of operations. Just as organizations have adopted ERP solutions, they have employed quality initiatives designed to help them maximize efficiency. One of these quality initiatives is Six Sigma. This paper explores the impact of SAP® solutions on the success of Six Sigma implementation in an agile environment, where the needs of the customer change rapidly. The literature shows that SAP®-type ERP products play a critical role in long-running implementation processes such as Six Sigma. The literature also implies that Six Sigma implementation can be ineffective in agile environments.


Rahebe Keshavarzi, Mohammad Hossein Abooie,
Volume 27, Issue 2 (6-2016)
Abstract

Process capability indices (PCIs) can be used as an effective tool for measuring product quality and process performance. In classical quality control, the crisp definition of the PCIs' parameters imposes limitations that prevent a deep and flexible analysis. Fuzzy set theory can be used to add flexibility to process capability analyses. In this study, fuzzy X-bar and moving range (MR) control charts are introduced to monitor a continuous production process in the triangular fuzzy state. Fuzzy PCIs are also produced when the specification limits (SLs) and measurements are triangular fuzzy numbers (TFNs). For this purpose, a computer program is coded in Matlab. The fuzzy control charts are applied in the Yazd fiber production plant. The results show that in continuous production processes a better analysis can be performed using fuzzy measurements; moreover, the fuzzy capability indices allow a flexible analysis of process performance.
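
A minimal sketch of a fuzzy capability index when the specification limits are triangular fuzzy numbers, using standard TFN arithmetic; the limits and standard deviation below are assumptions, not the plant data.

def tfn_sub(x, y):
    # x - y for triangular fuzzy numbers given as (left, mode, right)
    return (x[0] - y[2], x[1] - y[1], x[2] - y[0])

def tfn_scale(x, k):
    # divide a TFN by a crisp positive constant
    return tuple(v / k for v in x)

USL = (10.2, 10.5, 10.8)    # hypothetical fuzzy upper specification limit
LSL = (9.0, 9.3, 9.5)       # hypothetical fuzzy lower specification limit
sigma = 0.15                # estimated process standard deviation

fuzzy_cp = tfn_scale(tfn_sub(USL, LSL), 6.0 * sigma)   # fuzzy Cp = (USL - LSL) / (6 sigma)
print(tuple(round(v, 3) for v in fuzzy_cp))
# The spread of the resulting TFN reflects the imprecision of the limits,
# which is what allows the more flexible capability analysis described above.
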


Mohammad Saber Fallah Nezhad, Vida Golbafian, Hasan Rasay, Yusef Shamstabar,
Volume 28, Issue 3 (9-2017)
Abstract

The CCC-r control chart is a monitoring technique for high-yield processes. It is based on analyzing the number of inspected items until a specified number of defective items is observed. One of the assumptions in implementing the CCC-r chart that has a significant effect on its design is that the inspection is perfect. In reality, however, inspection is prone to errors for multiple reasons. In this paper, we study the economic-statistical design of CCC-r charts when the inspection is imperfect. Minimization of the average cost per produced item is considered as the objective function. The economic objective function, the modified consumer risk, and the modified producer risk are considered simultaneously, and the optimal value of the r parameter is then selected.
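
For orientation, probability limits of a basic CCC-r chart can be obtained from the negative binomial distribution as sketched below; the in-control fraction nonconforming, r, and false-alarm rate are illustrative, and the sketch does not reproduce the economic-statistical design or the imperfect-inspection adjustments.

from scipy.stats import nbinom

p0, r, alpha = 0.0005, 3, 0.0027     # in-control fraction nonconforming, r, false-alarm rate

# scipy's nbinom counts failures before the r-th success; the plotted
# statistic is that count plus r (total items inspected until the r-th defective)
lcl = nbinom.ppf(alpha / 2, r, p0) + r
ucl = nbinom.ppf(1 - alpha / 2, r, p0) + r
print(int(lcl), int(ucl))
# A point below the LCL suggests deterioration (defectives arriving too quickly);
# a point above the UCL suggests quality improvement.
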


Rassoul Noorossana, Mahnam Najafi,
Volume 28, Issue 4 (11-2017)
Abstract

Change point estimation is an effective method for identifying the time of a change in production and service processes. In most of the statistical quality control literature, it is usually assumed that the quality characteristic of interest is independently and identically distributed over time. This assumption can easily be violated in practice. In this paper, we use the maximum likelihood estimation method to estimate when a step change has occurred in a high-yield process, allowing for serial correlation between observations. Monte Carlo simulation is used to evaluate the performance of the proposed method. Results indicate satisfactory performance for the proposed method.
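
A minimal sketch of maximum likelihood change point estimation for a step change in the fraction nonconforming, assuming independent Bernoulli observations (the paper additionally allows serial correlation); all parameter values are illustrative.

import numpy as np

rng = np.random.default_rng(42)
p0, p1, tau, T = 0.005, 0.02, 300, 500        # assumed in-control rate, shifted rate, change point, signal time
x = np.concatenate([rng.binomial(1, p0, tau), rng.binomial(1, p1, T - tau)])

def loglik(t):
    # in-control part uses the known p0; the out-of-control rate is replaced by its MLE
    after = x[t:]
    p1_hat = min(max(after.mean(), 1e-9), 1 - 1e-9)
    ll = np.sum(x[:t] * np.log(p0) + (1 - x[:t]) * np.log(1 - p0))
    ll += np.sum(after * np.log(p1_hat) + (1 - after) * np.log(1 - p1_hat))
    return ll

tau_hat = max(range(1, T), key=loglik)
print(tau_hat)     # estimated time of the step change
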


Mohammad Saber Fallah Nezhad, Samrad Jafarian-Namin, Alireza Faraz,
Volume 30, Issue 4 (12-2019)
Abstract

The number of nonconforming items in a sample is monitored using the np-chart, which is based on the fraction defective. The performance of the np-chart in Phase II depends on the accuracy of the parameter estimated in Phase I. Although taking large sample sizes ensures the accuracy of the estimated parameter, this can be impractical for attribute data in some cases. Recently, the traditional c-chart and the np-chart have been studied with adjustments that guarantee the in-control performance. Due to technological progress, researchers face high-quality processes with very low rates of nonconformity, for which traditional control charts are inadequate. To overcome this inaccuracy, this study develops a new method for designing the np-chart such that the in-control performance is guaranteed with a pre-defined probability. The proposed method uses Cornish-Fisher expansions and the bootstrap method to guarantee the desired conditional in-control average run length. Through a simulation study, we show that the proposed adjustments improve the np-chart's in-control performance.
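
The sketch below shows only the bootstrap-adjustment idea, widening the np-chart limit so that a target in-control ARL is met with high probability over the Phase I estimation error; it does not implement the Cornish-Fisher expansion, and all values are assumptions.

import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(7)
n, m = 100, 50                             # subgroup size, number of Phase I subgroups
ARL0, gamma, B = 370.0, 0.10, 2000         # ARL target, allowed failure probability, bootstrap size
phase1 = rng.binomial(n, 0.01, size=m)     # assumed Phase I nonconforming counts
p_hat = phase1.sum() / (n * m)

# parametric bootstrap of the Phase I estimate
boot_p = rng.binomial(n * m, p_hat, size=B) / (n * m)

for ucl in range(1, n + 1):
    alpha = binom.sf(ucl, n, boot_p)       # false-alarm rate in each bootstrap world
    ok = np.mean(alpha <= 1.0 / ARL0)      # equivalent to in-control ARL >= ARL0
    if ok >= 1 - gamma:
        print("adjusted UCL:", ucl, "coverage:", round(float(ok), 3))
        break
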


Rassoul Noorossana, Somayeh Khalili,
Volume 32, Issue 1 (1-2021)
Abstract

In the last few decades, profile monitoring in univariate and multivariate environments has drawn considerable attention in the area of statistical process control. In multivariate profile monitoring, more than one response variable must be related to one or more explanatory variables. In this paper, the multivariate multiple linear profile monitoring problem is addressed under the assumption of autocorrelation among observations. A multivariate linear mixed model (MLMM) is proposed to account for the autocorrelation between profiles. Two control charts, in addition to a combined method, are then applied to monitor the profiles in phase II. Finally, the performance of the presented method is assessed in terms of the average run length (ARL). The simulation results demonstrate that the proposed control charts perform appropriately in signaling out-of-control conditions.
 
Muhammad Asim Siddique,
Volume 34, Issue 4 (12-2023)
Abstract

 
Software testing is the process of assessing the functionality of a software program. It checks for inaccuracies and gaps and verifies whether the application outcome matches the desired expectations before the software is installed and goes into production. In large organizations, the development team normally allocates a high portion of the estimated development time and cost to regression testing in order to assure software quality. The quality of the developed software relies upon three factors: time, efficiency, and the testing technique used for regression testing. Regression testing is an important component of software testing and maintenance, taking up a significant share of the total testing time and resources that organizations spend on testing. The key to successful regression testing using Test Case Prioritization (TCP), Test Case Selection (TCS), and Test Case Minimization (TCM) is maximizing the effectiveness of the test cases while considering the limited resources available. Numerous TCP, TCS, and TCM techniques have been introduced to maximize efficiency based on the Average Percentage of Faults Detected (APFD). In recent studies, TCP and TCS techniques have achieved the highest APFD scores; however, each TCP and TCS approach shows limitations, such as high execution cost and time, reduced efficiency, and lack of information. TCP and TCS approaches that can cover multiple test suite variables (time, cost, efficiency) have remained inefficient. Thus, a hybrid TCP and TCS technique needs to be developed that gives a high APFD score while providing good coverage of test cases with respect to cost and execution time, thereby improving efficiency. The proposed hybrid test case selection and prioritization technique excludes similar and faulty test cases to reduce the test suite size. The proposed hybrid technique has several advantages, including reduced execution time and improved fault detection ability. The proposed hybrid Enhanced Test Case Selection and Prioritization Algorithm (ETCSA) is a promising approach that selects only modified test suites to improve efficiency. However, the efficiency of the proposed technique may depend on the specific criteria for selecting only modified test cases and on the characteristics of the software. The hybrid technique aims to define an ideal ranking order of test cases, allowing for higher coverage and early fault detection with a reduced test suite size. This study reviews hybrid TCP and TCS techniques to reduce testing time and cost and to improve efficiency in regression testing. Each TCS and TCP technique in regression testing has identified apparent standards, benefits, and restrictions.
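
For reference, the Average Percentage of Faults Detected (APFD) measure mentioned above can be computed for a given test ordering and fault matrix as in the sketch below; the fault matrix is hypothetical.

def apfd(order, fault_matrix):
    # order: test case indices in execution order
    # fault_matrix[t][f] == 1 if test t detects fault f
    n, m = len(order), len(fault_matrix[0])
    first_pos = []
    for f in range(m):
        # position (1-based) of the first test in the ordering that detects fault f
        pos = next(i + 1 for i, t in enumerate(order) if fault_matrix[t][f])
        first_pos.append(pos)
    return 1.0 - sum(first_pos) / (n * m) + 1.0 / (2 * n)

faults = [[1, 0, 0],    # test 0 detects fault 0
          [0, 1, 0],    # test 1 detects fault 1
          [1, 1, 1]]    # test 2 detects all three faults
print(apfd([2, 0, 1], faults))   # prioritizing the strongest test first: higher APFD
print(apfd([0, 1, 2], faults))   # unprioritized order: lower APFD
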

