Table of Contents

Journal of Industrial Engineering International - Volume: 9, Issue: 1, Autumn 2013

  • Publication date: 1392/09/10
  • Number of titles: 39
  • Mirbahador Gholi Arianezhad, Ahmad Makuie, Saeed Khayatmoghadam * Page 1

    In this research, a new two-echelon model is presented to control the inventory of perishable goods. The model operates within a supply chain and is based on real conditions and data. Its main purpose is to minimize the maintenance cost of the entire chain; moreover, if a good perishes before reaching the customer (its expiration date passes), the resulting cost is added to the transportation, production, and maintenance costs in the objective function. To reflect real conditions, limitations such as production time, storage capacity, inventory level, transportation methods, and shelf life are considered in the model. Because of the model's complexity, a genetic algorithm implemented in MATLAB is used to solve it and to confirm the accuracy of its performance; tuning the algorithm's parameters resolves the difficulty of reaching the optimum point. The model was applied, with the same approach, to real data from a food production facility, and the obtained results confirm its accuracy.

    Keywords: Two-echelon inventory control, Genetic Algorithm, Supply Chain, Perishable good
  • Nima Zoraghi *, Maghsoud Amiri, Golnaz Talebi, Mahdi Zowghi Page 2

    This paper presents a fuzzy multi-criteria decision-making (FMCDM) model that integrates both subjective and objective weights for ranking and evaluating service quality in hotels. The objective method selects criterion weights through mathematical calculation, while the subjective method uses the judgments of decision makers. We use a combination of the weights obtained by both approaches to evaluate service quality in the hotel industry. A real case study ranking five hotels is illustrated, and examples are shown to indicate the capabilities of the proposed method.

    Keywords: Service Quality, Hotel ranking, FMCDM, Objective weight, Subjective weight
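The integration of subjective and objective weights described in this abstract is often realised as a convex combination followed by renormalisation; the sketch below illustrates that idea only (the weights, number of criteria, and blend parameter `alpha` are illustrative assumptions, not the paper's data or its exact aggregation rule):

```python
def combine_weights(subjective, objective, alpha=0.5):
    """Blend judgment-based (subjective) and data-driven (objective)
    criterion weights; alpha sets the emphasis on judgment."""
    combined = [alpha * s + (1 - alpha) * o
                for s, o in zip(subjective, objective)]
    total = sum(combined)               # renormalise so the weights sum to 1
    return [w / total for w in combined]

# Three criteria: decision-maker weights vs. mathematically derived weights
w_subj = [0.5, 0.3, 0.2]
w_obj = [0.2, 0.4, 0.4]
print([round(w, 2) for w in combine_weights(w_subj, w_obj)])  # [0.35, 0.35, 0.3]
```

Papers in this area typically also derive the objective weights themselves (for example, by an entropy-style method); here they are simply given.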
  • GVSS Sharma *, P Srinivasa Rao Page 3

    Statistical process control is an excellent quality assurance tool that improves the quality of manufacture and, ultimately, end-customer satisfaction. SPC uses process monitoring charts to record the key quality characteristics (KQCs) of the component in manufacture. This paper elaborates on one such KQC in the manufacture of a connecting rod for an internal combustion engine. The journey to attain process potential capability index (Cp) and process performance capability index (Cpk) values greater than 1.33 is described by identifying the root cause through quality control tools such as the cause-and-effect diagram and examining each cause in turn. The define-measure-analyze-improve-control (DMAIC) approach is employed. The define phase starts with process mapping and identification of the KQC. The measure phase comprises the cause-and-effect diagram and data collection of KQC measurements. In the analyze phase, the process potential and performance capability indices are calculated, followed by analysis of variance (ANOVA) of the mean values. Finally, process monitoring charts are used to control the process and prevent deviations. Using this DMAIC approach, the standard deviation is reduced from 0.48 to 0.048, the Cp value is improved from 0.12 to 1.72, and the Cpk value from 0.12 to 1.37.

    Keywords: Key quality characteristic, Cause-and-effect diagram, Statistical Process Control, Process monitoring charts, Failure modes and effects analysis, Analysis of variance
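The capability indices at the heart of this abstract have standard closed forms: Cp = (USL - LSL)/(6*sigma) and Cpk = min(USL - mean, mean - LSL)/(3*sigma). A minimal sketch with illustrative numbers (not the paper's connecting-rod data):

```python
def process_capability(mean, sigma, lsl, usl):
    """Cp ignores centring of the process; Cpk penalises a mean
    that has drifted towards one specification limit."""
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative specification limits and process statistics
cp, cpk = process_capability(mean=50.2, sigma=0.1, lsl=49.5, usl=50.5)
print(round(cp, 2), round(cpk, 2))   # 1.67 1.0
```

Note how Cpk < Cp here because the process mean sits off-centre; a capable and centred process has Cpk close to Cp, which is why the paper targets both indices above 1.33.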
  • Madhu Jain, Richa Sharma *, Gokul Chandra Sharma Page 4

    This paper studies the operating characteristics of an M^X/H_k/1 queueing system under a multiple vacation policy. It is assumed that the server goes on vacation as soon as the system becomes empty. When the server returns from a vacation and one or more customers are waiting in the queue, it serves these customers until the system becomes empty again; otherwise, it goes on another vacation. The breakdown and repair times of the server are assumed to follow a negative exponential distribution. Using a generating function, we derive various performance indices, and we obtain approximate formulas for the probability distribution of the waiting time of customers in the system using the maximum entropy principle (MEP). This approach is accurate enough for practical purposes and is a useful method for solving complex queueing systems. A sensitivity analysis is carried out on a numerical illustration.

    Keywords: Batch arrival, k-type hyper-exponential distribution, State-dependent rates, Maximum entropy principle, Long-run probabilities, Unreliable server, Queue length
  • Mehran Khalaj *, Fereshteh Khalaj, Amineh Khalaj Page 5

    Risk analysis of a production system performed without actual and appropriate data leads to wrong prediction of system parameters and wrong decision making. Under uncertainty, there are no appropriate measures for decision making; under epistemic uncertainty in particular, we are confronted with a lack of data. Therefore, in calculating system risk we encounter vagueness, and we must turn to methods that remain efficient for decision making. In this research, using the Dempster-Shafer method and a risk assessment diagram, the researchers develop a better method of calculating tool failure risk. Traditional statistical methods for characterizing and evaluating systems are not always appropriate, especially when enough data are not available; the goal of this research was therefore to present a more modern, applied method for real-world organizations. The findings were applied in a case study, and an appropriate framework and constraints for tool risk were provided. The research presents a promising approach to calculating the risk of production systems, and its results show that under uncertainty, or in the absence of knowledge, selecting an appropriate method facilitates the decision-making process.

    Keywords: Dempster-Shafer theory, Epistemic uncertainty, Risk analysis, Risk assessment diagram
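Dempster's rule of combination, which underlies the Dempster-Shafer method cited here, fuses two bodies of evidence by multiplying the masses of intersecting focal sets and renormalising away the conflicting mass K. A small sketch with hypothetical failure-cause evidence (the frames and mass values are illustrative, not the case study's):

```python
def dempster_combine(m1, m2):
    """Dempster's rule: multiply masses of intersecting focal sets and
    renormalise by 1 - K, where K is the total conflicting mass."""
    combined, conflict = {}, 0.0
    for a, p in m1.items():
        for b, q in m2.items():
            inter = frozenset(a) & frozenset(b)
            if inter:
                combined[inter] = combined.get(inter, 0.0) + p * q
            else:
                conflict += p * q        # mass assigned to disjoint sets
    return {k: v / (1 - conflict) for k, v in combined.items()}

# Two evidence sources assign belief mass over failure causes
m1 = {('wear',): 0.6, ('wear', 'fatigue'): 0.4}
m2 = {('fatigue',): 0.5, ('wear', 'fatigue'): 0.5}
print(dempster_combine(m1, m2))
```

The combined masses again sum to one; the rule breaks down as K approaches 1 (total conflict), which is one reason the abstract stresses choosing the method to fit the uncertainty at hand.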
  • Hans J Thamhain Page 6

    The ability to evaluate project proposals, assessing future success and organizational value, is critical to overall business performance for most enterprises. Yet predicting project success is difficult and often unreliable. A four-year field study shows that the effectiveness of available methods for evaluating and selecting large, complex projects depends on the specific project type, organizational culture, and managerial skills. This paper examines the strengths and limitations of various evaluation methods. It also shows that, especially in complex project situations, the decision-making process has to go beyond purely analytical methods and incorporate both quantitative and qualitative measures into a combined rational-judgmental evaluation process. Equally important, the evaluation process must be effectively linked among functional support groups and with senior management in order to strategically align the project proposal and to unify the evaluation team and stakeholder community behind the mission objectives. All of this requires leadership and managerial skill in planning, organizing, and communicating. The paper suggests specific leadership actions, organizational conditions, and managerial processes for evaluating complex project proposals toward future value and success.

    Keywords: Project evaluation, selection, Project Management, team leadership, Technology, decision making, Rational, Judgmental
  • Suchismita Satapathy *, Pravudatta Mishra Page 7

    Competition in the electric service industry is highlighting the importance of a number of issues affecting the nature and quality of customer service. The quality of service provided to electricity customers may be enhanced by competition if doing so offers service suppliers a competitive advantage. On the other hand, the service quality offered to some consumers could decline if utilities focus their attention on the customers most likely to exercise choice while reducing the effort and investment devoted to customers less likely to choose alternatives. Service quality is defined as the way in which the utility interacts with and responds to the needs of its customers. To maximize consumer satisfaction with electricity service, this paper designs a framework using quality function deployment (QFD), measures the service quality of the electricity utility sector with an artificial neural network (ANN), and finds the interrelationships between the design requirements using interpretive structural modeling (ISM).

    Keywords: Service Quality, ANN, QFD, ISM, Electricity utility, Consumer Satisfaction
  • Hassan Assareh *, Rassoul Noorossana, Kerrie L Mengersen Page 8

    Precise identification of the time when a process has changed enables process engineers to search for a potential special cause more effectively. In this paper, we develop change point estimation methods for a Poisson process in a Bayesian framework. We apply Bayesian hierarchical models to formulate the change point where there exists a step change, a linear trend, or a known multiple number of changes in the Poisson rate. Markov chain Monte Carlo is used to obtain posterior distributions of the change point parameters and the corresponding probabilistic intervals and inferences. The performance of the Bayesian estimator is investigated through simulations, and the results show that precise estimates can be obtained when it is used in conjunction with the well-known c-chart, Poisson exponentially weighted moving average (EWMA), and Poisson cumulative sum (CUSUM) control charts for different change-type scenarios. We also apply the Deviance Information Criterion as a model selection criterion in the Bayesian context to find the best change point model for a given dataset when there is no prior knowledge about the change type in the process. In comparison with the built-in estimators of EWMA and CUSUM charts and maximum likelihood based estimators, the Bayesian estimator performs reasonably well and remains a strong alternative. These advantages are enhanced when the probability quantification, flexibility, and generalizability of the Bayesian change point detection model are also considered.

    Keywords: Bayesian hierarchical model, Change point, control charts, Markov chain Monte Carlo, Poisson Process
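For the step-change case, the core Bayesian idea can be sketched without MCMC: with a uniform prior on the change point tau and (for simplicity) known before/after rates, the posterior over tau is just the normalised likelihood on a grid. This is a minimal approximation of the abstract's approach, not its hierarchical model; the counts and rates below are made up:

```python
import math

def poisson_changepoint_posterior(counts, lam1, lam2):
    """Posterior over the step-change point tau (uniform prior), assuming
    rate lam1 up to tau and lam2 afterwards; computed on the log scale.
    The log(c!) term is constant in tau and cancels after normalisation."""
    n = len(counts)
    log_post = []
    for tau in range(1, n):            # change occurs after observation tau
        ll = sum(c * math.log(lam1) - lam1 for c in counts[:tau])
        ll += sum(c * math.log(lam2) - lam2 for c in counts[tau:])
        log_post.append(ll)
    m = max(log_post)                  # subtract the max for numerical stability
    w = [math.exp(v - m) for v in log_post]
    z = sum(w)
    return [x / z for x in w]          # p(tau = 1 .. n-1 | data)

counts = [2, 3, 1, 2, 7, 8, 6, 9]      # the rate appears to jump after obs 4
post = poisson_changepoint_posterior(counts, lam1=2.0, lam2=7.5)
print(max(range(len(post)), key=post.__getitem__) + 1)   # most likely tau: 4
```

The paper instead places priors on the unknown rates and change point jointly and samples them with MCMC, which also yields the probabilistic intervals mentioned in the abstract.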
  • Ümit Yüceer * Page 9

    An employee transportation problem is described and a set partitioning model is developed. An investigation of the model leads to a knapsack problem as a surrogate problem, and finding a partition corresponding to the knapsack solution provides a solution to the original problem. An exact algorithm is proposed to obtain such a partition (subset-vehicle combination), but it requires testing and matching too many alternatives. The sweep algorithm is therefore implemented to obtain a partition in an efficient manner. Illustrations are provided to show how the algorithms obtain solutions.

    Keywords: Employee transportation, Set partitioning, Knapsack problem, Sweep Algorithm
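The 0/1 knapsack surrogate mentioned in the abstract has a standard dynamic-programming solution; the toy instance below is illustrative only (the values, weights, and capacity are assumptions, not the paper's data):

```python
def knapsack(values, weights, capacity):
    """Classic 0/1 knapsack dynamic programme: best[c] holds the maximum
    value achievable with total weight at most c."""
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # iterate capacities backwards so each item is used at most once
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Toy instance: vehicle capacity 10, candidate employee subsets with value/size
print(knapsack(values=[6, 10, 12], weights=[3, 4, 6], capacity=10))  # 22
```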
  • Mahdi Bashiri *, Amir Farshbaf-Geranmayeh, Hamed Mogouie Page 10

    In this paper, a new method is proposed to optimize a multi-response optimization problem based on the Taguchi method for processes in which the controllable factors are smaller-the-better (STB)-type variables and the analyzer desires an optimal solution with smaller values of the controllable factors. In such processes, the overall output quality of the product should be maximized while the usage of the process inputs, the controllable factors, should be minimized. Since all possible combinations of factor levels are not considered in the Taguchi method, the response values of the unpracticed treatments are estimated using an artificial neural network (ANN). The neural network is tuned by a central composite design (CCD) and a genetic algorithm (GA). Then, data envelopment analysis (DEA) is applied to determine the efficiency of each treatment. The philosophy of DEA, maximization of outputs versus minimization of inputs, is central to its implementation, yet it has been neglected in previous similar studies of multi-response problems. Finally, the most efficient treatment is determined using the maximin weight model approach. The performance of the proposed method is verified in a plastic molding process. Moreover, a sensitivity analysis is performed with an efficiency-estimator neural network. The results show the efficiency of the proposed approach.

    Keywords: Multiple response optimization, Artificial Neural Networks, Data envelopment analysis, Smaller-the-better-type controllable factors
  • Zeinab Hosseini, Reza Ghasemy Yaghin, Maryam Esmaeili * Page 11

    The integration of marketing and demand with logistics and inventories (the supply side of companies) may yield multiple improvements; it can revolutionize revenue management for rental companies, hotels, and airlines. In this paper, we develop a multi-objective pricing-inventory model for a retailer. Maximizing the retailer's profit and the service level are the objectives, and shortage is allowed. We present the model under stochastic lead time with uniform and exponential distributions. Since pricing is important and influences demand, demand is considered a general function of price. The multiple-objective optimization model is solved using the weighting method as well as the L-P metric method. Given the nonlinearity of the model, a genetic algorithm is employed to find the optimal solutions for the selling price, lot size, and reorder point. Finally, numerical examples with sensitivity analysis of the key parameters are provided.

    Keywords: Multi-objective nonlinear optimization, Pricing, Stochastic lead time, L-P metric method, Genetic Algorithm
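The weighting method named in the abstract scalarises the two objectives (profit and service level) into one score before optimization. A minimal sketch with hypothetical policy evaluations; the weights and the `profit_scale` used to make the two terms commensurate are assumptions, not the paper's values:

```python
def weighted_sum(profit, service_level, w_profit=0.7, w_service=0.3,
                 profit_scale=1000.0):
    """Scalarise two objectives into one score; profit is rescaled so the
    two terms are comparable in magnitude before weighting."""
    return w_profit * (profit / profit_scale) + w_service * service_level

# Two candidate (price, lot size, reorder point) policies, already evaluated
a = weighted_sum(profit=820.0, service_level=0.90)
b = weighted_sum(profit=760.0, service_level=0.99)
print(a > b)   # True: policy A wins under this particular weighting
```

The L-P metric method mentioned alongside it instead minimises a weighted p-norm of the distances from each objective's individual optimum, which can select a different compromise solution.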
  • Kamlesh Kumar *, Madhu Jain Page 12

  • Maryam Alimardani *, Fariborz Jolai, Hamed Rafiei Page 13

    In this paper, we apply a continuous review (S-1, S) policy for inventory control in a three-echelon supply chain (SC) including r identical retailers, a central warehouse with limited storage space, and two independent manufacturing plants which offer two kinds of product to the customer. The warehouse follows an M/M/1 queue model in which customer demands follow a Poisson probability distribution and customer serving time is an exponentially distributed random variable. To evaluate the effect of the bi-product formulation, the solution of the developed model is compared with that of two M/M/1 queue models developed separately for each product. Moreover, to cope with the computational complexity of the developed model, a particle swarm optimization algorithm is adopted. The conducted numerical experiments show that the total profit of the SC is significantly enhanced using the developed model.

    Keywords: Bi-product multi-echelon inventory planning, Markov Model, Backordering, Exponential lead time, particle swarm optimization
  • Ali Roozitalab *, Ezzatollah Asgharizadeh Page 14

    Warranty is now an integral part of each product. Since the length of the warranty is directly related to the cost of production, it should be set so as to maximize revenue generation and customer satisfaction. Furthermore, based on customer behavior, it is assumed that increasing the warranty period to earn the trust of more customers leads to more sales until the market is saturated. We should bear in mind that different groups of consumers have different consumption behaviors, and that the performance of the product has a direct impact on the failure rate over its life; therefore, the optimum duration for every group is different. Since, in practice, we cannot offer different warranty periods to different customer groups, we use the cuckoo metaheuristic optimization algorithm to find a common period for the entire population. The results, with high convergence, offer a term length that maximizes the aforementioned goals simultaneously. The study was tested using real data from an appliance company. The results indicate a significant increase in sales when the optimization approach was applied: the longer warranty increased selling revenue and, rather than reducing profit margins, increased them.

    Keywords: Free replacement warranty policy, Cuckoo optimization, Heterogeneous population, Warranty period
  • Wu-Lin Chen *, Chin-Yin Huang, Ching-Ya Huang Page 15

    Product quality in the plastic injection molding process is highly related to the settings of its process parameters. Additionally, product quality is not based on a single quality index but on multiple interrelated quality indices. Finding settings of the process parameters such that the multiple quality indices are simultaneously optimized has become a research issue, now known as finding the efficient frontier of the process parameters. This study considers three quality indices in plastic injection molding: warpage, shrinkage, and volumetric shrinkage at ejection. A digital camera thin cover is taken as an example to show the method of finding the efficient frontier. Solidworks and Moldflow are utilized to create the part's geometry and to simulate the injection molding process, respectively. Nine process parameters are considered in this research: injection time, injection pressure, packing time, packing pressure, cooling time, cooling temperature, mold open time, melt temperature, and mold temperature. Taguchi's orthogonal array L27 is applied to run the experiments, and analysis of variance is then used to find the significant process factors at the 0.05 significance level. In the example case, four process factors are found significant. These four factors are further used to generate 3^4 experiments by complete factorial design, each of which is run in Moldflow. The collected experimental data, with three quality indices and four process factors, are used to fit three multiple regression equations, one for each quality index. The three regression equations are then applied to generate 1,225 theoretical datasets. Finally, data envelopment analysis is adopted to find the efficient frontier of the 1,225 theoretical datasets; the datasets on the efficient frontier have optimal quality. The process parameters of the efficient frontier are further validated by Moldflow. This study demonstrates that the developed procedure is a useful optimization procedure that can be applied in practice to the injection molding process.

    Keywords: Injection molding, Taguchi’s orthogonal array, Multiple regression analysis, Data envelopment analysis, Optimization
  • Arvind K Lal, Manwinder Kaur *, Sneh Lata Page 16

    Pistons play a vital role in almost all types of vehicles. The present study discusses the behavioral study of a piston manufacturing plant. Manufacturing plants are complex repairable systems, which makes their performance difficult to evaluate; the stochastic model is an efficient performance evaluator for such repairable systems. In this paper, two stochastic models of the piston manufacturing plant and a computation algorithm are illustrated using state-space transition diagrams, and the availability parameter is used for the behavioral study. Finally, conclusions are drawn from the resulting computations.

    Keywords: Piston manufacturing plant, Behavioral study, Stochastic model, Time-dependent availability
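For a single repairable unit, the simplest state-space model behind the time-dependent availability mentioned here is the two-state Markov model, with point availability A(t) = mu/(lam+mu) + (lam/(lam+mu)) * exp(-(lam+mu)t). The rates below are illustrative; the paper's plant models have larger state spaces:

```python
import math

def availability(t, lam, mu):
    """Point availability of one repairable unit (failure rate lam,
    repair rate mu) from the two-state Markov model."""
    steady = mu / (lam + mu)
    return steady + (lam / (lam + mu)) * math.exp(-(lam + mu) * t)

lam, mu = 0.01, 0.5          # failures and repairs per hour (illustrative)
print(round(availability(0.0, lam, mu), 4))    # 1.0 at t = 0 (starts working)
print(round(availability(1e6, lam, mu), 4))    # steady state: 0.9804
```

The transient term decays at rate lam + mu, so availability settles quickly to its steady-state value mu/(lam + mu).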
  • Pandian Pitchipoo *, Ponnusamy Venkumar, Sivaprakasam Rajakarunakaran Page 17

    This paper presents the development of a model-based decision support system with a case study on solving the supplier selection problem in a chemical processing industry. For the evaluation and selection of suppliers, the analytic hierarchy process (AHP) and grey relational analysis (GRA) were used. The intention of the study is to propose an appropriate platform for process industries in selecting suppliers, which was tested with an electroplating industry during the course of development. A sensitivity analysis was performed to improve the robustness of the results with regard to the relative importance of the evaluation criteria and the parameters of the evaluation process. Finally, a practical implementation study was carried out to reveal the procedure of the proposed system and to identify the suitable supplier, with detailed discussion of the benefits and limitations.

    Keywords: Supplier evaluation, Supplier selection, Model base, Decision support system, analytical hierarchy process, Grey Relational Analysis
  • Analysis of unreliable bulk queue with state-dependent arrivals
    Charan Jeet Singh, Madhu Jain, Binay Kumar Page 18

    In this paper, we investigate a single-server Poisson-input queueing model wherein units arrive in bulk. The arrival rate of the units is state dependent, and the service time is arbitrarily distributed. It is also assumed that the system is subject to breakdown, and the failed server immediately joins the repair facility, which takes a constant duration to repair the server. Using the supplementary variable technique, we obtain the probability generating function of the number of units in the system, which is further used to establish performance indices such as the mean number of units in the system, the mean waiting time, etc. Special cases are also discussed. To obtain approximate values of the system state probabilities, the principle of maximum entropy is employed. Numerical results are presented to validate the analytical formulae.

    Keywords: Bulk queue, Arbitrary service time, Breakdown, Supplementary variable, Queue size
  • Arokiasamy Mariajayaprakash *, Thiyagarajan Senthilvelan, Krishnapillai Ponnambal Vivekananthan Page 19

    The various process parameters affecting the quality characteristics of the shock absorber were identified using the Ishikawa diagram and failure mode and effect analysis. The identified process parameters are welding parameters (squeeze, heat control, wheel speed, and air pressure), damper sealing parameters (load, hydraulic pressure, air pressure, and fixture height), washing parameters (total alkalinity, temperature, pH value of rinsing water, and timing), and painting parameters (flowability, coating thickness, pointage, and temperature). In this paper, the painting and washing process parameters are optimized by the Taguchi method. Though defects are reasonably minimized by the Taguchi method, a genetic algorithm is applied to the Taguchi-optimized parameters in order to achieve zero defects during the processes.

    Keywords: Ishikawa diagram, FMEA, Taguchi Method, Genetic Algorithm
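The Taguchi method ranks candidate factor levels by a signal-to-noise ratio; for smaller-the-better characteristics such as defects, the standard form is SN = -10 * log10(mean(y^2)), with larger SN better. A minimal sketch with hypothetical defect measurements (not the shock absorber data):

```python
import math

def sn_smaller_the_better(ys):
    """Taguchi signal-to-noise ratio for smaller-the-better responses:
    SN = -10 * log10(mean(y^2)); a larger SN indicates a better level."""
    return -10 * math.log10(sum(y * y for y in ys) / len(ys))

# Defect measures observed at two candidate parameter levels (illustrative)
level_a = [2.0, 3.0, 2.5]
level_b = [1.0, 1.2, 0.9]
print(sn_smaller_the_better(level_a) < sn_smaller_the_better(level_b))  # True
```

Level B, with uniformly smaller responses, gets the higher SN ratio and would be preferred; the GA step described in the abstract then searches around such Taguchi-selected settings.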
  • Saeed Yaghoubi *, Siamak Noori, Mohammad Mahdavi Mazdeh Page 20

    This investigation presents a heuristic method for the consumable resource allocation problem in multi-class dynamic Project Evaluation and Review Technique (PERT) networks, where new projects from different classes (types) arrive at the system according to independent Poisson processes with different arrival rates. Each activity of a project is performed at a dedicated service station located in a node of the network, with exponentially distributed service time according to its class. Each project arrives at the first service station and continues its routing according to the precedence network of its class. Such a system can be represented as a queueing network with a first-come, first-served queue discipline. In the presented method, the multi-class system is decomposed into several single-class dynamic PERT networks, where each class is considered separately as a minisystem. In modeling the single-class dynamic PERT network, we use a Markov process and the multi-objective model investigated by Azaron and Tavakkoli-Moghaddam in 2007. Then, after obtaining the resources allocated to the service stations in every minisystem, the final resources allocated to activities are calculated by the proposed method.

    Keywords: Project Management, Multi-class dynamic PERT network, Queuing
  • Ghasem Tohidi, Maryam Khodadadi * Page 21

    The formulas for the cost and allocative efficiencies of decision making units (DMUs) with positive data cannot be used for DMUs with negative data. On the other hand, these formulas are needed to analyze the productivity and performance of DMUs with negative data. To this end, this study introduces the cost and allocative efficiencies of DMUs with negative data and demonstrates that the introduced cost efficiency is equal to the product of the allocative and range directional measure (RDM) efficiencies. The study then extends the definition of the above efficiencies to DMUs with negative data and different unit costs. Finally, two numerical examples are given to illustrate the proposed methods.

    JEL classification: C6, D2

    Keywords: DEA, Cost efficiency, Negative data, Allocative Efficiency, RDM
  • Mohammad Ali Saniee Monfared *, Mahsa Safi Page 22

    As governmental subsidies to universities have declined in recent years, sustaining excellence in academic performance and making more efficient use of resources have become important issues for university stakeholders. To assess academic performance and the utilization of resources, two important issues need to be addressed: a capable methodology and a set of good performance indicators. In this paper, we propose a set of performance indicators to enable efficiency analysis of academic activities and apply a novel network DEA structure to account for subfunctional efficiencies, such as teaching quality and research productivity, as well as the overall efficiency. We tested our approach on the efficiency analysis of academic colleges at Alzahra University in Iran.

    Keywords: Data envelopment analysis, Performance Indicators, Academic efficiency, Network DEA
  • Saeed Mehrjoo, Mahdi Bashiri * Page 23

    Production planning and control (PPC) systems have to deal with rising complexity and dynamics. The complexity of planning tasks is due to multiple variables and dynamic factors deriving from the uncertainties surrounding PPC. Although the literature on exact scheduling algorithms, simulation approaches, and heuristic methods for production planning is extensive, these methods seem inefficient in the face of daily fluctuations in real factories. Decision support systems can provide productive tools for production planners, offering feasible and prompt decisions for effective and robust production planning. In this paper, we propose a robust decision support tool for detailed production planning based on statistical multivariate methods, including principal component analysis and logistic regression. The proposed approach has been used in a real case in the Iranian automotive industry. In the presence of multisource uncertainties, the results of applying the proposed method in the selected case show that the accuracy of daily production planning increases in comparison with the existing method.

    Keywords: principal component analysis, Logistic regression, Production planning control, Decision support system
  • Babak H Tabrizi *, Jafar Razmi Page 24

    Supply chain management is an inseparable component of satisfying customers' requirements. This paper deals with the distribution network design (DND) problem, a critical issue in achieving supply chain success, since a capable DND can guarantee the performance of the entire network. However, with respect to short-term planning, many factors can cause fluctuations in the input data that determine market behavior; with respect to long-term planning, network performance may be threatened by changes that take place within the practicing periods. Thus, in order to bring both kinds of change under control, we consider a new multi-period, multi-commodity, multi-source DND problem in circumstances where the network encounters uncertain demands. Fuzzy logic is applied here as an efficient tool for controlling the risk of the potential customers' demand. The defuzzifying framework lets practitioners and decision-makers interact with the solution procedure continuously. The fuzzy model is then validated by a sensitivity analysis test, and a typical problem is solved to illustrate the implementation steps. Finally, the formulation is tested on problems of different sizes to show its overall performance.

    Keywords: Supply Chain, Distribution network design, Uncertain demand, fuzzy logic
  • A Narvand, P Soleimani, Sadigh Raissi * Page 25

    In many circumstances, the quality of a process or product is best characterized by a mathematical function between a response variable and one or more explanatory variables, typically referred to as a profile. There have been several investigations of monitoring autocorrelated linear and nonlinear profiles in recent years. In the present paper, we use linear mixed models to account for autocorrelation within observations gathered in Phase II of the monitoring process. We assume that the structure of the correlated linear profiles simultaneously has both random and fixed effects. The work employs a Hotelling's T² statistic, a multivariate exponentially weighted moving average (MEWMA) chart, and a multivariate cumulative sum (MCUSUM) control chart to monitor the process. We also compared their performance in terms of the average run length criterion and showed that the proposed control chart schemes can effectively detect shifts in the process parameters. Finally, the results are applied to a real case study in an agricultural field.

    Keywords: Profile monitoring, linear mixed model, MCUSUM, MEWMA, Hotelling’s T², autocorrelation, Average run length
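The Hotelling's T² statistic used in this abstract is the squared Mahalanobis distance of a vector observation from the in-control mean; an observation signals when T² exceeds a control limit. A dependency-free bivariate sketch with illustrative in-control parameters (the 2x2 covariance is inverted in closed form):

```python
def hotelling_t2(x, mean, cov):
    """Hotelling's T^2 for one bivariate observation: the squared
    Mahalanobis distance (x - mean)' inv(cov) (x - mean)."""
    (a, b), (c, d) = cov
    det = a * d - b * c                      # closed-form 2x2 inverse
    inv = ((d / det, -b / det), (-c / det, a / det))
    dx = (x[0] - mean[0], x[1] - mean[1])
    y = (inv[0][0] * dx[0] + inv[0][1] * dx[1],
         inv[1][0] * dx[0] + inv[1][1] * dx[1])
    return dx[0] * y[0] + dx[1] * y[1]

# In-control parameters (illustrative): unit variances, no correlation
t2 = hotelling_t2(x=(1.0, 2.0), mean=(0.0, 0.0), cov=((1.0, 0.0), (0.0, 1.0)))
print(t2)   # 5.0 (= 1^2 + 2^2 under an identity covariance)
```

In profile monitoring, `x` would be the vector of estimated profile parameters for each sampled profile rather than a raw measurement pair.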
  • Ali Mohammad I Nasrabadi, Mohammad Hossein Hosseinpour, Sadoullah Ebrahimnejad * Page 26

    In competitive markets, market segmentation is a critical point of business and can be used as a generic strategy. In each segment, strategies lead companies to their targets; thus, segment selection and the application of appropriate strategies over time are very important for successful business. This paper aims to model a strategy-aligned fuzzy approach to market segment evaluation and selection. A modular decision support system (DSS) is developed to select an optimum segment with its appropriate strategies. The suggested DSS has two main modules. The first is a SPACE matrix, which indicates the risk of each segment and determines the long-term strategies. The second module finds the most preferred segment-strategies over time; the dynamic network process is applied to prioritize segment-strategies according to five competitive force factors. The vagueness in the pairwise comparisons has been modeled using fuzzy concepts. As illustration, an example is given from a case study in Iran's coffee market. The results show that the success possibilities of segments differ, and choosing the best ones helps companies develop their business with confidence. Moreover, the change in the priority of strategies over time indicates the importance of long-term planning, a fact supported by a case study on the difference in strategic priorities between short- and long-term considerations.

    Keywords: Market segmentation, Decision support system (DSS), Dynamic network process, fuzzy logic, Risk
  • Mitra Bokaei Hosseini, Mohammad Jafar Tarokh * Page 27

    Most decision-making methods used to evaluate a system or identify its strengths and weaknesses are based on fuzzy sets and evaluate the criteria with words modeled as fuzzy sets. These methods do not consider the ambiguity and vagueness of words or people's differing perceptions of them. For this reason, decision-making methods that consider the perceptions of decision makers are desirable. Perceptual computing is a subjective judgment method built on the premise that words mean different things to different people. It models words with interval type-2 fuzzy sets, which capture the uncertainty of words. In addition, real-world decision-making criteria exhibit interrelations and dependencies; therefore, methods that cannot consider these relations are infeasible in some situations. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method considers the interrelations between decision-making criteria. The current study combines DEMATEL and perceptual computing in order to improve decision-making methods. To this end, the fuzzy DEMATEL method is extended to type-2 fuzzy sets in order to obtain the weights of dependent criteria based on words. The application of the proposed method is presented for knowledge management evaluation criteria.

    Keywords: DEMATEL, Perceptual computing, decision making, Interval type-2 fuzzy sets (IT2 FSs)
  • Kandukuri Narayana Rao *, Kambagowni Venkata Subbaiah, Ganja Veera Pratap Singh Page 28

    Nowadays, customer expectations are increasing and organizations must operate in an uncertain environment. Under such uncertainty, the ultimate success of the firm depends on its ability to integrate business processes among supply chain partners. Supply chain management emphasizes cross-functional links to improve the competitive strategy of organizations. Companies are now moving from decoupled decision processes towards more integrated design and control of their components to achieve strategic fit. In this paper, a new approach is developed to design a multi-echelon, multi-facility, and multi-product supply chain in a fuzzy environment. At the strategic level, a mixed-integer programming problem is formulated through fuzzy goal programming, with supply chain cost and volume flexibility as fuzzy goals; these fuzzy goals are aggregated using the minimum operator. At the tactical level, a continuous review policy for controlling raw material inventories in the supplier echelon and finished product inventories in the plant and distribution center echelons is considered as fuzzy goals, and a non-linear programming model is formulated through fuzzy goal programming using the minimum operator. The proposed approach is illustrated with a numerical example.

    Keywords: Supply Chain, Fuzzy Goal Programming, Performance vector, Continuous review policy, Strategic level, Tactical level
  • Reza Kia *, Hossein Shirazi, Nikbakhsh Javadian, Reza Tavakkoli-Moghaddam Page 29

    This paper presents a multi-objective mixed-integer nonlinear programming model for designing the group layout of a cellular manufacturing system in a dynamic environment, in which the number of cells to be formed is variable. Cell formation (CF) and group layout (GL) decisions are made concurrently in a dynamic environment by the integrated model, which incorporates an extensive coverage of important manufacturing features used in the design of CMSs. Several features distinguish the presented model from previous studies: (1) a variable number of cells; (2) integrated CF and GL decisions in a dynamic environment via a multi-objective mathematical model; and (3) two conflicting objectives, namely minimizing the total costs (i.e., the costs of intra- and inter-cell material handling, machine relocation, purchasing new machines, machine overhead, machine processing, and forming cells) and minimizing the workload imbalance among cells. Furthermore, the presented model considers limitations such as machine capability, machine capacity, part demand satisfaction, cell size, material flow conservation, and location assignment. Four numerical examples are solved with the GAMS software to illustrate the promising results obtained by the incorporated features.

    Keywords: dynamic cellular manufacturing systems, Multi-objective model, Cell formation, group layout
  • Mahdi Bashiri *, Amir Moslemi Page 30

    A robust approach should be considered when estimating regression coefficients in multi-response problems. Many models are derived from the least squares method, but because the presence of outlier data is unavoidable in most real cases and the least squares method is sensitive to such points, robust regression appears to be a more reliable and suitable approach to this problem. Additionally, in many problems more than one response must be analyzed, so multi-response problems have wider application. The robust regression approach used in this paper is based on M-estimator methods, and one of the most widely used weighting functions in such estimation is Huber's function. In multi-response surfaces, estimating each response individually can cause problems in subsequent inference because of separate outlier detection schemes. To address this obstacle, a simultaneous independent multi-response iterative reweighting (SIMIR) approach is suggested. The performance of the proposed method is illustrated by introducing a coincident outlier index (COI) criterion while considering a realistic number of outliers in a multi-response problem. Two well-known cases from the literature are presented as numerical examples. The results show that the proposed approach performs better than classic estimation, and the proposed index demonstrates the efficiency of the proposed approach.

    Keywords: Multi-response problem, Robust regression, Outliers, M-estimator
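
The Huber weighting mentioned above can be sketched for a single response via iteratively reweighted least squares. This is an illustrative single-response version that applies the conventional tuning constant directly to raw residuals (assuming roughly unit residual scale); it is not the paper's SIMIR scheme, which reweights multiple responses simultaneously.

```python
import numpy as np

def huber_weights(residuals, k=1.345):
    """Huber weights: 1 inside [-k, k], k/|r| outside.

    k = 1.345 is the conventional tuning constant for unit-scale residuals.
    """
    r = np.abs(np.asarray(residuals, float))
    return np.where(r <= k, 1.0, k / np.maximum(r, 1e-12))

def irls_fit(X, y, iters=25, k=1.345):
    """Single-response M-estimation via iteratively reweighted least squares."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS starting point
    for _ in range(iters):
        w = huber_weights(y - X @ beta, k)
        # Weighted least squares step: (X'WX) beta = X'Wy
        beta = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * y))
    return beta

# Line y = 2x with one gross outlier: the robust slope stays near 2,
# while the plain OLS slope is pulled upward by the contaminated point.
x = np.arange(10.0)
y = 2.0 * x
y[-1] += 30.0
X = np.column_stack([np.ones_like(x), x])
ols = np.linalg.lstsq(X, y, rcond=None)[0]
rob = irls_fit(X, y)
```

Here `rob[1]` (the robust slope) lands much closer to the true value 2 than `ols[1]`, because the outlier's residual stays large across iterations and its weight shrinks toward k/|r|.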
  • Sanjay Kumar *, Sunil Luthra, Abid Haleem Page 31

    The role of customers in green supply chain management needs to be identified and recognized as an important research area. This paper explores the involvement of customers in greening the supply chain (SC). An empirical research approach has been used to collect primary data and rank the variables relevant to effective customer involvement in implementing the green concept in the SC. An interpretive structural model has been presented, and the variables have been classified using matrice d'impacts croisés-multiplication appliquée à un classement (MICMAC) analysis. Contextual relationships among the variables have been established using experts' opinions. The research may help practicing managers understand the interaction among the variables affecting customer involvement, and this understanding may in turn help in framing policies and strategies for greening the SC. Analyzing the interaction among variables for effective customer involvement in greening the SC, and developing the structural model from the Indian perspective, is an effort towards promoting environmental consciousness.

    Keywords: Supply Chain, Green Supply Chain Management, Green distribution, Interpretive Structural Modeling (ISM), MICMAC Analysis
  • Payel Ghosh *, Tapan Kumar Roy Page 32

    Goal programming is a very useful multi-objective technique with many variants, such as weighted goal programming, min-max goal programming, and lexicographic goal programming. In this paper, weighted goal programming is reformulated as goal programming with logarithmic deviation variables, and a comparison of the proposed method with goal programming using the weighted sum method is presented. A numerical example and applications to two industrial problems also enrich this paper.

    Keywords: goal programming, Geometric programming, Pareto optimality, Nonlinear Programming
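
Classic weighted goal programming (the baseline the paper compares against) can be sketched as a linear program with under- and over-achievement deviation variables. The two goals below are invented for illustration; the paper's logarithmic-deviation reformulation (which is nonlinear) is not reproduced here.

```python
from scipy.optimize import linprog

# Variables: x1, x2 >= 0, plus deviations (d1-, d1+, d2-, d2+).
# Goal 1: x1 + x2   -> target 10
# Goal 2: 2*x1 + x2 -> target 12
# Minimize the equally weighted sum of all deviation variables.
c = [0, 0, 1, 1, 1, 1]
A_eq = [[1, 1, 1, -1, 0, 0],   # x1 + x2   + d1- - d1+ = 10
        [2, 1, 0, 0, 1, -1]]   # 2x1 + x2  + d2- - d2+ = 12
b_eq = [10, 12]
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)

# Both goals are attainable exactly here (x1 = 2, x2 = 8), so the
# optimal total deviation is zero.
print(res.x[:2], res.fun)
```

Unequal goal weights simply scale the corresponding entries of `c`; the logarithmic variant replaces this linear deviation penalty with a logarithmic one.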
  • Vinod Kumar Mishra *, Lal Sahab Singh, Rakesh Kumar Page 33

    In this paper, we consider a deterministic inventory model with time-dependent demand and time-varying holding cost in which deterioration is proportional to time. The model allows for shortages, and the demand is partially backlogged. The model is solved analytically by minimizing the total inventory cost, and the result is illustrated with a numerical example. The model can be applied to optimize the total inventory cost for business enterprises in which both the holding cost and the deterioration rate are time dependent.

    Keywords: Inventory model, deteriorating items, Shortage, Time-dependent demand, Time-varying holding cost
  • Seyed Taghi Akhavan Niaki *, Majid Khedmati Page 34

    In this paper, a new control chart to monitor multi-binomial processes is first proposed based on a transformation method. Then, maximum likelihood estimators of the change point are derived for both step changes and linear-trend disturbances. Finally, the performances of the proposed change-point estimators are evaluated and compared through Monte Carlo simulation experiments in which the real change type present in the process is either a step change or a linear-trend disturbance. According to the results, the change-point estimator designed for step changes outperforms the one designed for linear-trend disturbances when the real change is a step change; conversely, the estimator designed for linear-trend disturbances outperforms the one designed for step changes when the real change is a linear-trend disturbance.

    Keywords: Multi-binomial processes, Maximum Likelihood Estimator, Multi-attribute processes, Step change, Linear-trend disturbance, Root, power transformation
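
To make the step-change estimator concrete, the sketch below profiles the log-likelihood over candidate change points for a single binomial count stream: observations up to the change point follow Bin(n, p0), and the post-change proportion is estimated from the tail. This is a one-attribute illustration only; the paper's estimators work on the transformed multi-binomial statistic.

```python
import math

def step_change_mle(counts, n, p0):
    """MLE of the step-change point tau for one binomial stream.

    counts[t] ~ Bin(n, p0) for t <= tau and Bin(n, p1) for t > tau,
    with p1 replaced by the post-tau average (profile likelihood).
    Returns the tau maximizing the log-likelihood; binomial
    coefficients are constant across tau and therefore omitted.
    """
    T = len(counts)
    best_tau, best_ll = 0, -math.inf
    for tau in range(T - 1):                 # change occurs after index tau
        tail = counts[tau + 1:]
        p1 = min(max(sum(tail) / (n * len(tail)), 1e-9), 1 - 1e-9)
        ll = sum(x * math.log(p0) + (n - x) * math.log(1 - p0)
                 for x in counts[:tau + 1])
        ll += sum(x * math.log(p1) + (n - x) * math.log(1 - p1)
                  for x in tail)
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau

# Ten in-control samples (expected count 5 at n=50, p0=0.1) followed by
# ten shifted samples: the estimator recovers the change after index 9.
print(step_change_mle([5] * 10 + [15] * 10, 50, 0.1))  # 9
```

The linear-trend counterpart replaces the constant post-change proportion with one that drifts linearly in t, which is why the two estimators behave differently depending on the true change type.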
  • Mahdi Bashiri *, Amir Moslemi Page 35

    In this paper, the main idea is to compute a robust regression model, derived from experimentation, that minimizes the effects of outliers and achieves constant variation across different experimental runs. Both outliers and unequal residual variation can affect response surface parameter estimation. The common way to estimate regression model coefficients is the ordinary least squares method, whose weakness is its sensitivity to outliers and to specific residual behavior; we therefore pursue a modified robust method to solve this problem. Many papers have proposed robust methods to decrease the effect of outliers, but trends in residual behavior pose another important issue that should be taken into account. Trends in the residuals can cause faulty estimation and thus faulty future decisions and outcomes, so in this paper an iterative weighting method is used to downweight both the outliers and the residuals that follow abnormal trends in variation, such as descending or ascending trends, so that they have less effect on the coefficient estimation. Finally, a numerical example illustrates the proposed approach.

    Keywords: Coefficient estimation, Response Surface, Ordinary Least Squares, Outliers, Robust model
  • Seyed Taghi Akhavan Niaki *, Saeid Hoseinzade Page 36

    The main objective of this research is to forecast the daily direction of the Standard & Poor's 500 (S&P 500) index using an artificial neural network (ANN). In order to select the most influential features (factors) of the proposed ANN that affect the daily direction of the S&P 500 (the response), design of experiments is conducted to determine the statistically significant factors among 27 potential financial and economic variables, along with a feature defined as the number of nodes of the ANN. The results of employing the proposed methodology show that the ANN using the most influential features forecasts the daily direction of the S&P 500 significantly better than the traditional logit model. Furthermore, experimental results of employing the proposed ANN on trades in a test period indicate that the ANN could significantly improve the trading profit compared with the buy-and-hold strategy.

    Keywords: S&P 500 index, Financial time series, Artificial Neural Networks, Design of Experiments, analysis of variance, logit model
  • Reza Tavakkoli-Moghaddam *, Fateme Forouzanfar, Sadoullah Ebrahimnejad Page 37

    This paper considers a single-sourcing network design problem for a three-level supply chain. For the first time, a novel mathematical model is presented that simultaneously considers risk pooling, inventory at distribution centers (DCs) under demand uncertainty, several alternatives for transporting the product between facilities, and the routing of vehicles from distribution centers to customers in a stochastic supply chain system. The problem is formulated as a bi-objective stochastic mixed-integer nonlinear programming model. The model determines the number of distribution centers to open, their locations and capacity levels, and the allocation of customers to distribution centers and of distribution centers to suppliers. It also determines the inventory control decisions on the amount of ordered products and the safety stock at each opened DC, and selects a type of vehicle for transportation. Moreover, it determines routing decisions, such as the vehicle routes starting from an opened distribution center, serving its allocated customers, and returning to that distribution center. All of this is done so that the total system cost and the total transportation time are minimized. The Lingo software is used to solve the presented model, and the computational results are illustrated in this paper.

    Keywords: Stochastic supply chain, Inventory control, Risk-pooling, Uncertainty, Capacity levels
  • Fatemeh Fardis, Afagh Zandi, Vahidreza Ghezavati * Page 38

    Clustering parts and machines into part families and machine cells is a major decision in the design of cellular manufacturing systems, referred to as cell formation. This paper presents a non-linear mixed-integer programming model for designing cellular manufacturing systems in which the arrival rates of parts into cells and the machine service rates are stochastic parameters described by exponential distributions. Such uncertainty may create a queue behind each machine; therefore, we consider the average waiting time of parts behind each machine in order to obtain an efficient system. The objective function minimizes the sum of the machine idleness cost, the sub-contracting cost for exceptional parts, the non-utilized machine cost, and the holding cost of parts in the cells. Finally, the linearized model is solved with the Cplex solver of GAMS, and a sensitivity analysis is performed to illustrate the effects of the parameters.

    Keywords: Cellular manufacturing system, Stochastic arrival rate, service rate, Average Waiting Time, Queuing theory
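
Under the exponential arrival and service assumptions stated above, a machine treated as an M/M/1 station has the standard mean queue waiting time Wq = lambda / (mu * (mu - lambda)). The rates below are illustrative, not from the paper.

```python
def mm1_wq(lam, mu):
    """Mean time a part waits in queue at an M/M/1 machine (requires lam < mu)."""
    if lam >= mu:
        raise ValueError("arrival rate must be below service rate")
    rho = lam / mu                 # machine utilization
    return rho / (mu - lam)        # equals lam / (mu * (mu - lam))

# A machine serving mu = 4 parts/hour with lam = 2 arrivals/hour:
# parts wait 0.25 hours on average before service begins.
print(mm1_wq(2.0, 4.0))  # 0.25
```

The waiting time blows up as utilization approaches 1, which is why the model penalizes both idleness (low utilization) and holding cost (queues at high utilization).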
  • Seyed Ali Hadighi, Navid Sahebjamnia, Iraj Mahdavi *, Hadi Asadollahpour, Hosna Shafieian Page 39

    The increasing complexity of decision making in a highly dynamic competitive environment has urged wise managers to develop relevant strategic plans for their firms. Strategy is formulated not from one criterion but from multiple criteria identified in environmental scanning, and considering all of them is often not possible. A list of criteria was selected using the Delphi method in consultation with company experts. Based on a review of the literature and strategy experts' proposals, the list was then classified into five categories: human resources, equipment, market, supply chain, and rules. Since not all criteria may be necessary for the decision process (traditionally, some are eliminated at an early stage), it is important to identify the prime set of criteria, a subset of the original criteria that affects decision making. Utilizing these criteria, a Mahalanobis-Taguchi System-based tool was developed to facilitate the selection of this prime set, ensuring that only ineffective subcriteria are eliminated and that the conditions are prepared for relevant strategy formulation. The Mahalanobis distance was used to build a measurement scale that distinguishes ineffective subcriteria from significant ones in the environmental scanning stage, and the principles of the Taguchi method were used to screen the important criteria and generate the prime set of criteria for each category. These criteria can then be used within each category, instead of all criteria, to identify a suitable training institution. To validate the proposed approach, a case study was conducted on 38 educational institutions in Iran; the results demonstrated the usefulness of the proposed approach.

    Keywords: Mahalanobis distance, Measurement Scale, Service institution, Strategy Formulation
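
The measurement scale mentioned above starts from the Mahalanobis distance of an observation from a reference ("normal") group. A minimal sketch with toy two-dimensional data follows; the orthogonal-array screening step of the Mahalanobis-Taguchi System is not reproduced.

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of x from a group with the given mean/covariance."""
    d = np.asarray(x, float) - np.asarray(mean, float)
    return float(np.sqrt(d @ np.linalg.solve(np.asarray(cov, float), d)))

# With the identity covariance the distance reduces to the Euclidean one:
# the point (3, 4) sits 5 units from the origin.
print(mahalanobis([3.0, 4.0], [0.0, 0.0], np.eye(2)))  # 5.0
```

With a non-identity covariance estimated from the reference group, the distance rescales each direction by its variability, which is what makes it usable as a unit-free measurement scale for screening subcriteria.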