Search published articles


Showing 16 results for Subject: Statistical Process Control or Quality Control

Mehdi Kabiri Naeini, Mohammad Saleh Owlia, Mohammad Saber Fallahnezhad,
Volume 23, Issue 3 (9-2012)
Abstract

In this research, an iterative approach is employed to recognize and classify control chart patterns. As new observations on the quality characteristic under consideration are taken, the maximum likelihood estimator of the pattern parameters is first obtained and the probability of each pattern is determined. Using Bayes’ rule, these probabilities are then updated recursively. Finally, when one of the updated statistics falls outside the calculated control interval, a pattern recognition signal is issued. The advantage of this approach compared with other existing CCP recognition methods is that it requires no training. Simulation results show the effectiveness and accuracy of the new method in detecting abnormal patterns, as well as satisfactory results in the estimation of pattern parameters.
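The recursive Bayes updating step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the two candidate patterns, their parameters, and the observation sequence are all assumed purely for the example.

```python
import math

def normal_pdf(x, mu, sigma):
    """Gaussian likelihood of a single observation."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_update(priors, likelihoods):
    """One recursive Bayes step: posterior is prior times likelihood, renormalized."""
    post = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(post)
    return [p / total for p in post]

# Illustrative patterns: in-control mean 0 vs. an upward mean shift of 1.5 sigma.
patterns = [{"mu": 0.0}, {"mu": 1.5}]
probs = [0.5, 0.5]                       # equal prior pattern probabilities
for x in [0.2, 1.4, 1.7, 1.3, 1.6]:      # new observations on the characteristic
    like = [normal_pdf(x, p["mu"], 1.0) for p in patterns]
    probs = bayes_update(probs, like)
# After several shifted observations, the shift pattern's posterior dominates,
# which is the point at which a recognition signal would be issued.
```

In a full implementation the pattern parameters themselves would also be re-estimated by maximum likelihood at each step, as the abstract describes.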
Rassoul Noorossana, Abbas Saghaei, Hamidreza Izadbakhsh, Omid Aghababaei,
Volume 24, Issue 2 (6-2013)
Abstract

In certain statistical process control applications, the quality of a process or product can be characterized by a function commonly referred to as a profile. Potential applications of profile monitoring include cases where the quality characteristic of interest is modelled using binary, multinomial, or ordinal variables. In this paper, profiles with multinomial response are studied. For this purpose, multinomial logit regression (MLR) is considered as the basis. The MLR model is then converted to a Poisson GLM with a log link. Two methods, based on the multivariate exponentially weighted moving average (MEWMA) statistic and the likelihood ratio test (LRT) statistic, are proposed to monitor MLR profiles in Phase II. The performance of these methods is evaluated using the average run length (ARL) criterion. A case study from an alloy fastener manufacturing process is used to illustrate the implementation of the proposed approach. Results indicate satisfactory performance for the proposed methods.
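The MEWMA statistic used for Phase II monitoring can be sketched in its textbook form. This is not the paper's MLR-profile implementation; the dimension, smoothing constant, and simulated data are assumptions for illustration.

```python
import numpy as np

def mewma(X, Sigma, lam=0.2):
    """MEWMA T^2 statistics for a sequence of p-dimensional deviation vectors.
    X: (n, p) array, Sigma: in-control covariance matrix."""
    n, p = X.shape
    Z = np.zeros(p)
    T2 = []
    for i in range(1, n + 1):
        Z = lam * X[i - 1] + (1 - lam) * Z
        # Exact covariance of Z_i, which converges to lam/(2-lam) * Sigma.
        Sz = (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * i)) * Sigma
        T2.append(float(Z @ np.linalg.solve(Sz, Z)))
    return np.array(T2)

rng = np.random.default_rng(0)
Sigma = np.eye(3)
incontrol = rng.multivariate_normal(np.zeros(3), Sigma, size=50)
shifted = rng.multivariate_normal(np.full(3, 1.0), Sigma, size=20)
stats = mewma(np.vstack([incontrol, shifted]), Sigma)
# The T^2 sequence rises sharply after the mean shift at observation 50.
```

A signal is declared when T² exceeds a control limit chosen to give the desired in-control ARL.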
Alireza Sharafi, Majid Aminnayeri, Amirhossein Amiri, Mohsen Rasouli,
Volume 24, Issue 2 (6-2013)
Abstract

Identifying the real time of a change in a process when an out-of-control signal is present is important. It may reduce the cost of defective products as well as the time spent exploring and fixing the cause of defects. Another popular topic in Statistical Process Control (SPC) is profile monitoring, where the distribution of one or more quality characteristics may not be adequate for describing the quality of processes or products; instead, a relationship between a response variable and one or more explanatory variables is used for this purpose. In this paper, the Maximum Likelihood Estimator (MLE) is applied to estimate the change point in binary profiles when the type of change is a drift. Simulation studies are provided to evaluate the effectiveness of the change point estimator.
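The MLE change point idea can be illustrated on a deliberately simplified case. The paper estimates a drift change point in binary profiles; the sketch below instead estimates a step change point in a plain Bernoulli sequence, which shows the same split-likelihood mechanics without the profile structure.

```python
import math

def loglik(xs, p):
    """Bernoulli log-likelihood, clipping p away from 0 and 1."""
    p = min(max(p, 1e-12), 1 - 1e-12)
    s = sum(xs)
    return s * math.log(p) + (len(xs) - s) * math.log(1 - p)

def mle_change_point(x):
    """Return tau maximizing the split likelihood, i.e. the MLE of the
    index of the last observation generated before the change."""
    best_tau, best_ll = None, -float("inf")
    for tau in range(1, len(x)):
        before, after = x[:tau], x[tau:]
        ll = (loglik(before, sum(before) / len(before))
              + loglik(after, sum(after) / len(after)))
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau

# Noise-free stand-in: success probability steps from 0 to 1 after sample 30.
x = [0] * 30 + [1] * 30
tau = mle_change_point(x)   # recovers 30
```

For a drift, the post-change segment would be fitted with a trend in the success probability rather than a constant, but the maximization over candidate change points is the same.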
Moharram Habibnejad Korayem, Arastoo Azimi, Ali Mohammad Shafei,
Volume 24, Issue 3 (9-2013)
Abstract

In this research, a sensitivity analysis of geometric parameters such as the length, thickness, and width of a single-link flexible manipulator with respect to the maximum deflection (MD) of the end effector and the vibration energy (VE) of that point is conducted. The equation of motion of the system is developed based on the Gibbs-Appell (G-A) formulation, and the elastic properties of the system are modelled using the assumed modes method (AMM). Two theories are used to obtain the end-point MD and VE of the end effector: first, Timoshenko beam theory (TBT) is applied to account for the effects of shear and rotational inertia; then, Euler-Bernoulli beam theory (EBBT) is used. Sobol’s sensitivity analysis method is then applied to determine how VE and end-point MD are influenced by these geometric parameters. Finally, the results of the two theories are compared.
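Sobol's variance-based sensitivity analysis attributes output variance to each input through first-order indices S_i = Var(E[Y|X_i]) / Var(Y). The sketch below is a didactic brute-force estimator, not the authors' implementation (and far less efficient than standard Saltelli sampling), applied to an assumed additive test function with one inert input.

```python
import numpy as np

def first_order_sobol(f, dims=3, n_outer=2000, n_inner=200, seed=0):
    """Brute-force first-order Sobol indices S_i = Var(E[Y|X_i]) / Var(Y)
    for independent U(0,1) inputs: fix X_i, average Y over the rest, then
    take the variance of those conditional means."""
    rng = np.random.default_rng(seed)
    var_y = f(rng.random((n_outer * n_inner, dims))).var()
    S = []
    for i in range(dims):
        inner_means = []
        for v in rng.random(n_outer):       # fixed values of X_i
            pts = rng.random((n_inner, dims))
            pts[:, i] = v                    # condition on X_i = v
            inner_means.append(f(pts).mean())
        S.append(np.var(inner_means) / var_y)
    return np.array(S)

# Additive test function Y = X1 + 0.5*X2; X3 is inert.
# Variance shares are (1/12)/(1.25/12) = 0.8, 0.2, and 0.
f = lambda X: X[:, 0] + 0.5 * X[:, 1]
S = first_order_sobol(f)
```

In the paper's setting, f would be the beam model mapping length, thickness, and width to MD or VE.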
Mr. Virender Narula , Dr. Sandeep Grover,
Volume 26, Issue 1 (3-2015)
Abstract

A considerable number of papers has been published on Six Sigma applications in manufacturing and service organizations. However, very few studies have reviewed the Six Sigma literature across all areas, including manufacturing, construction, education, financial services, BPOs, and healthcare. Considering the contribution of Six Sigma in recent times, a more comprehensive review is presented in this paper. The authors have reviewed the Six Sigma literature in a way that helps academic researchers and practitioners take a closer look at the growth, development, and applications of this technique. Various journal papers are reviewed and different classification schemes are suggested. In addition, certain gap areas are identified that would help researchers in further research.
Ebrahim Mazrae Farahani, Reza Baradaran Kazemzade, Amir Albadvi, Babak Teimourpour,
Volume 29, Issue 3 (9-2018)
Abstract

Studying social networks plays a significant role in everyone’s life. Recent studies show the importance of, and increasing interest in, modeling and monitoring the communications between network members as longitudinal data. The tendency to model social networks by considering the dependency of an outcome variable on covariates has grown recently; however, such studies fail to consider the possible correlation between responses. Our study uses generalized linear mixed models (GLMMs), also referred to as random effects models, to model the social network according to the attributes of nodes, in which the nodes take the role of a random (hidden) effect. The likelihood ratio test (LRT) statistic is implemented to detect change points in the simulated network streams. In the simulation studies, the root mean square error (RMSE) and standard deviation criteria are applied to choose an appropriate distribution for the simulated data. Our simulation studies also demonstrate an improvement in the average run length (ARL) index in comparison with previous studies.
 
Mahdi Imanian, Aazam Ghassemi, Mahdi Karbasian,
Volume 31, Issue 1 (3-2020)
Abstract

This work used two methods for monitoring and controlling autocorrelated processes based on time series modeling. The first method was the simultaneous monitoring of common and assignable causes. It consisted of five steps applied to all non-stationary process observations: data gathering, normality testing, autocorrelation testing, model selection, and control chart selection. The second method was a novel one for the separate monitoring and control of common and assignable causes, in which the process was divided into parts with and without assignable causes.
The first method remained largely non-stationary because common and assignable causes were not separated, which left the common causes hidden in the process. The novel method for the separate monitoring of common and assignable causes could turn the process into a stationary one, making it possible to identify, monitor, and control common causes without any interference from the assignable causes. The results showed that, unlike the first method, the second method was very sensitive to the common causes; it could, therefore, suitably monitor, identify, and control both assignable and common causes.
The current work aimed to use control charts to monitor and control the bottomhole pressure during the drilling operation.
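A common way to realize time-series-based monitoring of an autocorrelated process is to fit an ARMA-type model and chart the residuals, which are approximately independent once the common-cause (autocorrelated) structure is removed. The sketch below is a minimal illustration, not the paper's procedure: it assumes an AR(1) process with one injected assignable-cause spike.

```python
import numpy as np

def ar1_residual_chart(x, L=3.0):
    """Fit AR(1) via the lag-1 autocorrelation, then apply Shewhart-style
    limits to the one-step-ahead residuals, which are roughly independent
    once the common-cause structure is removed."""
    x = np.asarray(x, float)
    xc = x - x.mean()
    phi = (xc[1:] @ xc[:-1]) / (xc @ xc)   # lag-1 autocorrelation estimate
    resid = xc[1:] - phi * xc[:-1]         # one-step-ahead residuals
    sigma = resid.std(ddof=1)
    signals = np.flatnonzero(np.abs(resid) > L * sigma)
    return phi, resid, signals

rng = np.random.default_rng(42)
x = np.zeros(200)
for t in range(1, 200):                    # AR(1) common-cause structure
    x[t] = 0.7 * x[t - 1] + rng.normal()
x[150] += 8.0                              # injected assignable-cause spike
phi, resid, signals = ar1_residual_chart(x)
# The spike at observation 150 shows up at residual index 149.
```

Charting raw observations here would violate the independence assumption of standard limits; charting residuals is the textbook remedy.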
 
Rassoul Noorossana, Mahdi Shayganmanesh, Farhad Pazhuheian, Mohammad Hosein Rahimi,
Volume 31, Issue 3 (9-2020)
Abstract

Laser marking is an advanced material processing technology that has a permanent effect on materials. In laser engraving, material is removed layer by layer along the laser path through melt displacement, ablation, and evaporation. Al-SiC is a metal matrix composite widely used in aerospace, automobile manufacturing, and electronic packaging. Accumulative roll bonding (ARB) is one of the newest manufacturing processes for metal matrix composites, producing materials with high strength, low weight, and great environmental compatibility. In this paper, we present the laser engraving of Al-SiC composite samples, produced through the ARB process, using a Q-switched Nd:YAG laser. A 2^k factorial design is used to analyze the effect of factors, including assist gas flow, distance of the sample from the beam focus location (distance), pulse repetition frequency, and pumping current, on the qualitative characteristics of the engraved zone (width, depth, and contrast). The desirability function is used for optimization. Results based on experimental data indicate the optimal setting of the input factors that leads to the pre-specified target values of the responses.
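Effect estimation in a 2^k factorial design reduces to differences of averages between the high and low coded levels of each factor. A minimal sketch with three hypothetical factors and a synthetic noise-free response; the underlying model is assumed purely for illustration and is not the paper's data.

```python
import numpy as np
from itertools import product

# Full 2^3 design matrix in coded units (-1/+1); the three factors stand in
# for, e.g., assist gas flow (A), focus distance (B), and pumping current (C).
design = np.array(list(product([-1, 1], repeat=3)), float)

def main_effects(design, y):
    """Main effect of factor j = mean response at +1 minus mean at -1."""
    return np.array([y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
                     for j in range(design.shape[1])])

# Synthetic noise-free response from an assumed model y = 10 + 3A - 2B + 0C;
# each recovered main effect equals twice the model coefficient: [6, -4, 0].
y = 10 + 3 * design[:, 0] - 2 * design[:, 1]
effects = main_effects(design, y)
```

With real replicated data, these effects would feed an ANOVA, and the significant factors would then enter the desirability-function optimization.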
 
Fatemeh Elhambakhsh, Mohammad Saidi- Mehrabad,
Volume 32, Issue 1 (1-2021)
Abstract

Statistical monitoring of dynamic networks is a major topic of interest in complex social systems, and much research has been conducted on modeling and monitoring dynamic social networks. This article proposes a new methodology for modeling and monitoring dynamic social networks to quickly detect temporal anomalies in network structures using latent variables. The key idea behind the proposed methodology is to determine the importance of latent variables, as well as observed covariates, in creating edges between nodes. First, a latent space model (LSM) is used to model the dynamic networks. The vector of LSM parameters is then monitored through multivariate control charts in order to detect changes across different network sizes. Experiments on simulated social network monitoring demonstrate that the surveillance strategy can effectively detect abrupt changes between actors in dynamic networks using latent variables.
Samrad Jafarian-Namin, Mohammad Saber Fallahnezhad, Reza Tavakkoli-Moghaddam, Ali Salmasnia, Mohammad Hossein Abooei,
Volume 32, Issue 4 (12-2021)
Abstract

In recent years, it has been shown that integrating statistical process control, maintenance policy, and production planning can bring more benefits to entire production systems. In the literature on such triple-concept integrated models, it has generally been assumed that observations are independent. However, correlated structures in some practical applications put traditional control charts in trouble. The mixed EWMA-CUSUM (MEC) control chart and the ARMA control chart are effective tools for monitoring the mean of autocorrelated processes. This paper proposes an integrated model, subject to some constraints, for determining the decision variables of the three concepts in the presence of autocorrelated data. Three types of autocorrelated processes are investigated to study their effects on the results, and the results of the MEC and ARMA charts are compared. Due to the complexity of the model, a particle swarm optimization (PSO) algorithm is applied to select the optimal decision variables. An industrial example and extensive comparisons are provided.
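The core idea of the mixed EWMA-CUSUM chart (smoothing observations with an EWMA and then accumulating the smoothed deviations CUSUM-style) can be sketched in simplified form. The smoothing constant, reference value, and noise-free data below are assumptions, and the published MEC chart additionally uses time-varying control limits not shown here.

```python
def mec_chart(x, target=0.0, lam=0.2, k=0.5):
    """Simplified mixed EWMA-CUSUM: smooth with an EWMA, then run upper and
    lower CUSUM recursions on the smoothed deviations from target."""
    z = target
    cp = cm = 0.0
    upper, lower = [], []
    for xi in x:
        z = lam * xi + (1 - lam) * z          # EWMA smoothing step
        cp = max(0.0, cp + (z - target) - k)  # upper CUSUM of the EWMA
        cm = max(0.0, cm - (z - target) - k)  # lower CUSUM of the EWMA
        upper.append(cp)
        lower.append(cm)
    return upper, lower

# A sustained small upward shift accumulates in the upper statistic.
data = [0.0] * 20 + [1.0] * 20    # noise-free stand-in for a 1-sigma mean shift
up, low = mec_chart(data)
```

The double accumulation is what makes the MEC chart sensitive to small sustained shifts, which is exactly the regime where autocorrelation-aware monitoring matters in the integrated model.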
Bhagwan Kumar Mishra, Anupam Das,
Volume 32, Issue 4 (12-2021)
Abstract

The article presents the development of a non-Gaussian process monitoring strategy for a Steel Billet Manufacturing Unit (SBMU). The proposed strategy is based on modified Independent Component Analysis (ICA), a variant of the widely employed conventional ICA. The independent components (ICs) extracted by the modified ICA technique are ordered by variance explained, akin to Principal Component Analysis (PCA); in conventional ICA, by contrast, the variance explained by the ICs is unknown, which hinders the selection of influential ICs for building the nominal model of the ensuing monitoring strategy. A Hotelling T2 control chart based on the modified ICA scores was used to detect faults, with its control limit estimated via a bootstrap procedure owing to the non-Gaussian distribution of the underlying data. Detected faults were diagnosed using a fault diagnostic statistic, which determines the contribution of the responsible process and feedstock characteristics. The non-Gaussian strategy thus devised was able to correctly detect and satisfactorily diagnose the faults.
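Bootstrap estimation of a control limit for a non-Gaussian monitoring statistic can be sketched generically: resample the in-control statistics with replacement and average the empirical (1 − α) quantiles, avoiding any Gaussian distributional assumption. The gamma-distributed stand-in data below are an assumption for illustration, not the SBMU scores.

```python
import numpy as np

def bootstrap_control_limit(stats, alpha=0.01, B=2000, seed=0):
    """Estimate the (1 - alpha) upper control limit of a monitoring statistic
    by averaging empirical quantiles over B bootstrap resamples."""
    rng = np.random.default_rng(seed)
    n = len(stats)
    limits = [np.quantile(rng.choice(stats, n, replace=True), 1 - alpha)
              for _ in range(B)]
    return float(np.mean(limits))

# Skewed (non-Gaussian) in-control statistics, e.g. T^2 scores computed from
# modified ICA components; a gamma sample stands in for them here.
rng = np.random.default_rng(1)
incontrol = rng.gamma(shape=2.0, scale=1.5, size=500)
ucl = bootstrap_control_limit(incontrol)
```

A parametric (chi-square based) limit would be miscalibrated for such skewed scores, which is why the bootstrap route is taken.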
Elaheh Bakhshizadeh, Hossein Aliasghari, Rassoul Noorossana, Rouzbeh Ghousi,
Volume 33, Issue 1 (3-2022)
Abstract

Organizations have used Customer Lifetime Value (CLV) as an appropriate pattern for classifying their customers, and data mining techniques have enabled them to analyze customer behavior more quantitatively. This research clusters customers based on the factors of the CLV model, namely length, recency, frequency, and monetary (LRFM), through data mining. Based on LRFM, the transaction data of 1865 customers of a software company have been analyzed following the CRISP-DM method and the research roadmap. The four CLV factors have been developed based on a feature selection algorithm and prepared for clustering using the quintile method. To determine the optimal number of clusters, the silhouette and SSE indexes have been evaluated, and the k-means algorithm has been applied to cluster the customers. CLV amounts have then been evaluated and the clusters ranked. The results show that customers fall into four groups: high-value loyal customers, uncertain lost customers, uncertain new customers, and high-consumption-cost customers. The first cluster, with the highest number of customers and the highest CLV, contains the most valuable customers, while the fourth, third, and second clusters rank second, third, and fourth respectively. The attributes of the customers in each cluster have been analyzed, and marketing strategies have been proposed for each group.
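The clustering step can be sketched with a plain k-means on quintile-scaled LRFM vectors. This is a generic toy illustration, not the study's data or tooling: the six customer vectors and the hand-picked initial centroids are assumptions.

```python
import numpy as np

def kmeans(X, k, init, iters=100):
    """Plain k-means: start from the given centroid indices, then alternate
    nearest-centroid assignment and mean-update steps until stable."""
    centers = X[init].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Toy LRFM vectors (length, recency, frequency, monetary), quintile-scaled 1-5:
# two high-value, two lapsed, and two mid-range customers.
X = np.array([[5, 5, 5, 5], [5, 4, 5, 5], [1, 1, 1, 2], [1, 2, 1, 1],
              [3, 5, 2, 2], [3, 4, 2, 3]], float)
# Initial centroids chosen by hand for this toy example; real use would rely
# on k-means++ or multiple random restarts.
labels, centers = kmeans(X, 3, init=[0, 2, 4])
# Customers with similar LRFM profiles end up sharing a cluster label.
```

Ranking the resulting clusters by their centroid CLV then yields the ordering the study reports.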
Ahmad Hakimi, Hiwa Farughi, Amirhossein Amiri, Jamal Arkat,
Volume 33, Issue 1 (3-2022)
Abstract

In some statistical process monitoring (SPM) applications, the relationship between two or more ordinal factors is shown by an ordinal contingency table (OCT) and described by an ordinal log-linear model (OLLM). The Newton-Raphson algorithm is used to estimate the parameters of the log-linear model. In this paper, OLLM-based processes are monitored using MR and likelihood ratio test (LRT) approaches in Phase I. Simulation studies evaluate the performance of the proposed approaches in terms of the probability of a signal under step shifts, drifts, and the presence of outliers. Results show that, for small and moderate shifts in the ordinal log-linear model parameters, the MR statistic performs better than the LRT. In addition, a real case study of dissolution testing in the pharmaceutical industry shows the application of the proposed control charts in Phase I.

Fatemeh Elhambakhsh, Kamyar Sabri-Laghaie,
Volume 33, Issue 1 (3-2022)
Abstract

The fourth industrial revolution has changed our lives by enabling everyone to be interconnected virtually, and a trustworthy system is required to secure the large volumes of data stored in IoT-based devices. Blockchain technology allows data to be transferred and stored safely. With this in mind, blockchain-based cryptocurrencies have gained considerable popularity because of their potential for financial transactions, and monitoring transaction networks is very fruitful for finding users’ abnormal behaviors. In this research, a novel procedure is used to monitor a blockchain cryptocurrency transaction network. A random binary graph model is used to simulate the transactions between users, and a SCAN method is used to detect abnormal behaviors in the simulated model. A multivariate exponentially weighted moving average (MEWMA) control chart is also used to monitor centrality measures. The probability of a signal is used to assess the performance of the SCAN method and of the MEWMA control chart in distinguishing abnormalities. The procedure is then applied to a Bitcoin transaction dataset.
Rakesh Kumar Pattanaik, Muhammad Sarfraz, Mihir Narayan Mohanty,
Volume 33, Issue 4 (12-2022)
Abstract

Developing a system for a specific purpose requires estimating its parameters (parameterization), which is useful in fields such as engineering and industry. In this work, the authors use an adaptive algorithm to model a system applicable to industrial control. The adaptive model is nonlinear, and its estimation is based on a kernel least-mean-square (LMS) algorithm with polynomial and Gaussian kernels. Because the system is nonlinear, the polynomial kernel-based algorithm fails to prove its efficacy, even though it is a low-complexity approach. The Gaussian kernel-based algorithm performs better for nonlinear system control than the polynomial kernel; furthermore, its complexity is reduced for faster performance. The results report performance in terms of MSE, MAE, and RMSE for identification and control, which is very useful in industrial applications.
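The Gaussian-kernel LMS update can be sketched in its standard form: each new sample adds a kernel unit centered at the input, weighted by the step size times the current prediction error. The target nonlinearity, step size, and kernel width below are assumptions for illustration, not the authors' system.

```python
import math

def gauss_kernel(u, v, sigma=0.5):
    """Gaussian (RBF) kernel between two scalar inputs."""
    return math.exp(-((u - v) ** 2) / (2 * sigma ** 2))

def klms_train(inputs, desired, eta=0.5, sigma=0.5):
    """Kernel LMS: grow a dictionary of centers; every sample contributes a
    kernel unit weighted by eta times its prediction error."""
    centers, coeffs, errors = [], [], []
    for u, d in zip(inputs, desired):
        y = sum(a * gauss_kernel(u, c, sigma) for a, c in zip(coeffs, centers))
        e = d - y
        centers.append(u)
        coeffs.append(eta * e)
        errors.append(e)
    return centers, coeffs, errors

# Identify the static nonlinearity d = tanh(u) from five passes over a grid
# (a noise-free stand-in for an industrial input-output record).
inputs = [-1 + 0.1 * i for i in range(21)] * 5
desired = [math.tanh(u) for u in inputs]
centers, coeffs, errors = klms_train(inputs, desired)
mse_first = sum(e * e for e in errors[:21]) / 21   # first pass, untrained model
mse_last = sum(e * e for e in errors[-21:]) / 21   # fifth pass, adapted model
```

The growing dictionary is what makes plain KLMS costly, which motivates the complexity-reduction step the abstract mentions (e.g., sparsifying the set of stored centers).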
 
Ali Salmasnia, Mohammad Reza Maleki, Esmaeil Safikhani,
Volume 34, Issue 2 (6-2023)
Abstract

In some applications, the number of quality characteristics is larger than the number of observations within subgroups. Common multivariate control charts for monitoring the variability of such high-dimensional processes are unsuitable because the sample covariance matrix is not positive definite and hence not invertible. Moreover, the impact of gauge imprecision on the detection capability of multivariate control charts in high-dimensional settings has clearly been neglected in the literature. To overcome these shortcomings, this paper develops a ridge penalized likelihood ratio chart for Phase II monitoring of high-dimensional processes in the presence of measurement system errors. The developed control chart departs from the assumption of sparse variability shifts, in which the assignable cause affects only a few elements of the covariance matrix. Then, to compensate for the adverse impact of gauge imprecision, the developed chart is extended by employing multiple measurements on each sampled item. Simulation studies examine the impact of imprecise measurements on the detectability of the developed monitoring scheme under different shift patterns. The results show that gauge imprecision negatively affects the run-length distribution of the developed control chart. It is also found that the extended chart, under a multiple measurements strategy, can effectively reduce the impact of these errors.
