OPSEARCH is the official journal of the Operational Research Society of India (ORSI), publishing research papers in operations research and related fields. It is a quarterly publication (March, June, September and December).

  • official publication of the prestigious Operational Research Society of India
  • premier Indian journal in the field of Operational Research

The journal OPSEARCH, published by the Operational Research Society of India (ORSI), is a national forum set up with the objective of promoting the education and application of Operational Research (OR) in the day-to-day environment of business, industry and other organizations.

Related subjects: Business & Management - Mathematics - Operations Research & Decision Theory


The latest content available from Springer
  1. The editor has retracted this article [1] because it has significant overlap with a work published by Sudhesh and Azhagappan [2] and is therefore redundant. The authors do not agree to this retraction.

  2. Abstract

    This paper examines a continuous review inventory model for perishable items with two demand classes. Demands of both classes occur according to Poisson processes. The items in inventory are perishable products with exponential lifetimes, and the lead time after placing an order is exponentially distributed. When the on-hand inventory drops to a pre-specified level s, only priority customer demands are met, whereas demands from ordinary customers are lost; demands occurring during stock-out periods are also lost. The inventory system is modeled as a continuous-time Markov process and the steady-state probabilities are derived. The expected cost function is formulated and a numerical study is provided for optimization.

  3. Abstract

    We study the sensitivity of some optimality criteria based on progressively type-II right-censored order statistics to changes in the censoring scheme, and explain how sensitivity analysis helps to find optimal censoring schemes. We find that determining an optimal censoring plan among a class of one-step censoring schemes is not always recommended. We treat the optimality criteria as the model output of a sensitivity analysis problem and quantify how this model depends on its input factors and the censoring scheme, using local and global sensitivity methods. Finally, we propose a simple method to find the optimal scheme among all possible censoring schemes.

  4. Abstract

    Vogel’s Approximation Method (VAM) is known as the best algorithm for generating an efficient initial feasible solution to the transportation problem. We demonstrate that VAM has some limitations and computational blunders. To overcome these limitations we develop an Improved Vogel’s Approximation Method (IVAM) by correcting these blunders. It is compared with VAM on the initial feasible solutions obtained for a numerical example problem, where IVAM reduces the total transportation cost over VAM by 2.27%. Besides, we have compared IVAM with each of twelve previously developed methods, including VAM, on solutions to numerical problems: IVAM yields the minimal total cost solution for seven, better solutions for four, and the same solution for the remaining one. Finally, a statistical analysis has been performed over the results of 1500 randomly generated transportation problems of fifteen distinct dimensions, with 100 problem instances each. This analysis demonstrates the better performance of IVAM over VAM, reducing the total transportation cost in 71.8% of the solved problems, especially for large problems. Thus IVAM outperforms VAM in providing better initial feasible solutions to the transportation problem.
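As a point of reference for the limitations discussed above, the textbook VAM procedure (not the authors' IVAM corrections) can be sketched as follows; the function name and tie-breaking choices here are illustrative:

```python
def vam(cost, supply, demand):
    """Classic Vogel's Approximation Method for a balanced
    transportation problem; returns an initial feasible allocation."""
    supply, demand = supply[:], demand[:]
    m, n = len(cost), len(cost[0])
    alloc = [[0] * n for _ in range(m)]
    rows, cols = set(range(m)), set(range(n))

    def penalty(vals):
        # difference of the two smallest costs (the cost itself if one cell left)
        s = sorted(vals)
        return s[1] - s[0] if len(s) > 1 else s[0]

    while rows and cols:
        row_pen = {i: penalty([cost[i][j] for j in cols]) for i in rows}
        col_pen = {j: penalty([cost[i][j] for i in rows]) for j in cols}
        i_best = max(row_pen, key=row_pen.get)
        j_best = max(col_pen, key=col_pen.get)
        if row_pen[i_best] >= col_pen[j_best]:
            i = i_best
            j = min(cols, key=lambda c: cost[i][c])   # cheapest cell in that row
        else:
            j = j_best
            i = min(rows, key=lambda r: cost[r][j])   # cheapest cell in that column
        q = min(supply[i], demand[j])                 # ship as much as possible
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            rows.discard(i)
        if demand[j] == 0:
            cols.discard(j)
    return alloc
```

On the classic 3x4 textbook instance below, this version produces an initial solution of total cost 779; tie-breaking conventions (row vs. column, penalty of a single remaining cell) are exactly the kind of detail where implementations diverge.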

  5. Abstract

    In this paper we present a simple heuristic algorithm to find high-quality, feasible solutions for the traveling salesman problem (TSP). We hypothesize that the quality of the initial solution provided by the proposed heuristic will improve the performance of a subsequent algorithm in terms of the number of iterations required to reach a certain quality of TSP solution. The proposed heuristic does not attempt to compete with known TSP algorithms and heuristics; instead, it should be considered a “pre-processor”. The method provides a simple framework for testing new node selection and neighborhood rules. The cost matrix of origin and destination pairs is processed in a systematic way, starting from a principal diagonal matrix element, to find a feasible TSP tour. The matrix reduction, systematic moves in rows and columns, systematic elimination of rows and columns from further consideration, and the “reserved” column declaration assure that the resulting sequence of nodes and edges forms a complete TSP tour. The process can be repeated from each principal diagonal element. The best TSP tour found can then be used, for example, as an input to another algorithm (e.g. tabu search, simulated annealing, ant colony optimization, nearest neighbor, or another heuristic) to attempt to improve the tour further. It should be noted that the proposed technique can also be used to test for the presence of cycles in a solution proposed by another algorithm. While the goal of the heuristic algorithm is to find the optimum tour, optimality cannot be guaranteed.
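The matrix-based construction itself is not fully specified in the abstract, but the general idea of a cheap construction tour fed into an improvement heuristic can be illustrated with the standard nearest-neighbor rule (a stand-in for illustration, not the authors' method):

```python
def nearest_neighbor_tour(cost, start=0):
    """Build a feasible TSP tour greedily: from the current city,
    always move to the cheapest unvisited city."""
    n = len(cost)
    unvisited = set(range(n)) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: cost[tour[-1]][j])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(cost, tour):
    """Total cost of the closed tour (returning to the start city)."""
    return sum(cost[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))
```

A tour produced this way is feasible by construction (each city appears exactly once), which is precisely the property a pre-processor must guarantee before handing the tour to tabu search, simulated annealing, or another improvement scheme.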

  6. Abstract

    Queueing systems encountered in real life are often influenced by negative arrivals, which are independent of the service process and cause the elimination of jobs from the system. Such a scenario occurs in computer networks and telecommunication systems, where an attack by a malicious virus results in the removal of some or all data files from the system. Along this direction many authors have proposed various killing processes in the past. This paper unifies different killing mechanisms in the classical single-server queue of infinite capacity, where arrivals occur according to a renewal process and service times are exponentially distributed. The system is assumed to be affected by negative customers as well as disasters. The model is investigated in steady state in a simple and elegant way by means of the supplementary variable and difference equation techniques. The distribution of the system content of positive customers is derived in explicit form at pre-arrival and random epochs. The influence of different parameters on the system performance is also examined.
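For intuition about the two killing mechanisms, the dynamics can be simulated. The sketch below simplifies the paper's renewal input to Poisson arrivals (so an M/M/1 queue in which negative customers remove one job and disasters clear the whole system); all names and rates are illustrative, not the paper's analysis:

```python
import random

def simulate_queue(lam, mu, neg, dis, horizon, seed=1):
    """Continuous-time simulation of an M/M/1 queue subject to
    negative customers (remove one job, rate `neg`) and disasters
    (clear the whole system, rate `dis`).
    Returns the time-average number in system."""
    rng = random.Random(seed)
    t, n, area = 0.0, 0, 0.0
    while t < horizon:
        rate = lam + (mu + neg if n > 0 else 0.0) + dis
        dt = rng.expovariate(rate)     # time to the next event
        area += n * dt                 # accumulate number-in-system over time
        t += dt
        u = rng.random() * rate        # pick which event fired
        if u < lam:
            n += 1                     # positive arrival
        elif n > 0 and u < lam + mu + neg:
            n -= 1                     # service completion or negative customer
        else:
            n = 0                      # disaster wipes out all jobs
    return area / t
```

Since both killing mechanisms only remove work, the time-average system content stays below that of the corresponding plain M/M/1 queue (rho/(1-rho) = 1 for lam=1, mu=2).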

  7. Abstract

    The Karush–Kuhn–Tucker (KKT) optimality conditions are necessary and sufficient for a convex programming problem under suitable constraint qualification. Recently, several papers (Dutta and Lalitha in Optim Lett 7(2):221–229, 2013; Lasserre in Optim Lett 4(1):1–5, 2010; Suneja et al. in Am J Oper Res 3(6):536–541, 2013) have appeared wherein the convexity of the constraint function is replaced by convexity of the feasible set. Further, Ho (Optim Lett 11(1):41–46, 2017) studied the nonlinear programming problem with a non-convex feasible set. We use this modified approach in the present paper to study vector optimization problems over cones. The KKT optimality conditions are proved by replacing convexity of the objective function with convexity of its strict level sets; convexity of the feasible set is replaced by a weaker condition, and no condition is assumed on the constraint function. We also formulate a Mond–Weir type dual and prove duality results in the modified setting. Our results directly extend the work of Ho (2017), Suneja et al. (2013) and Lasserre (2010).
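For reference, the classical KKT conditions these papers start from, for a convex program \(\min f(x)\) subject to \(g_i(x) \le 0\), \(i = 1,\dots,m\), read:

```latex
\begin{aligned}
&\nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*) = 0
  && \text{(stationarity)}\\
&g_i(x^*) \le 0, \quad i = 1,\dots,m && \text{(primal feasibility)}\\
&\lambda_i \ge 0, \quad i = 1,\dots,m && \text{(dual feasibility)}\\
&\lambda_i\, g_i(x^*) = 0, \quad i = 1,\dots,m && \text{(complementary slackness)}
\end{aligned}
```

The works cited above weaken the assumptions under which these conditions remain necessary and sufficient, rather than changing the conditions themselves.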

  8. Abstract

    This paper discusses a priority-based unbalanced time minimization assignment problem which deals with the allocation of n jobs to \(m~(<n)\) persons such that the project is executed in two stages, viz. Stage-I and Stage-II. Stage-I is composed of \(n_1(<m)\) primary jobs and Stage-II is composed of the remaining \((n-n_1)\) secondary jobs, which are commenced only after the Stage-I jobs are completed. Each person has to do at least one job, whereas each job is to be assigned to exactly one person. It is assumed that the nature of the primary jobs is such that one person can perform exactly one job, whereas a person is free to perform more than one job in Stage-II. Also, persons assigned to primary jobs cannot perform secondary jobs. In a particular stage, all persons start performing their jobs simultaneously; however, if a person performs more than one job, he does them one after the other. The objective of the proposed study is to find the feasible assignment that minimizes the overall completion time (i.e., the sum of the Stage-I and Stage-II times) for the two-stage implementation of the project. An iterative algorithm is proposed that solves a constrained unbalanced time minimization assignment problem at each iteration and generates a pair of Stage-I and Stage-II times, and a solution strategy for this constrained problem is developed. An alternative combination-based method to solve the priority-based unbalanced problem is also analysed and a comparative study is carried out. Numerical demonstrations are provided in support of the theory.

  9. Abstract

    In today’s technology-driven world, despite efficient planning of manufacturing systems and the development of refined production technologies and control systems, the items produced in a manufacturing system may contain some fraction of defectives. Thus, inspection of a lot is essential to separate perfect from imperfect products. Every business also carries hidden costs, such as additional managerial costs, which must be accounted for if the business is to run smoothly. This work considers an entropic order quantity model with selling-price-dependent demand and screening to separate products of imperfect quality. An important observation about the effect of the entropy cost on profit maximization is that the entropy cost behaves similarly to the selling price of the product. Our findings provide insight into the entropic order inventory model and enrich the inventory-modeling literature. Finally, a hypothetical numerical example is given to validate the model, and a sensitivity analysis is performed to study the impact of various parameters on the optimal solution.

  10. Abstract

    In this paper, an EPQ model for delayed deteriorating items is presented, where the demand before deterioration sets in is assumed to be a quadratic function of time and the holding (carrying) cost is assumed to be linearly dependent on time. Three stages are considered: (1) the production build-up period, (2) the period before deterioration starts, and (3) the period after deterioration sets in. There is no demand during the production build-up period, the demand before deterioration begins is quadratic in time, and the demand after deterioration sets in is constant. It is also assumed that shortages are not allowed. The purpose of this paper is to investigate the optimal set of production rates that minimizes the total inventory cost per unit time, the best cycle length and the economic production quantity. A numerical example is given to illustrate the applicability of the model, and a sensitivity analysis is carried out on the example to see the effect of changes in some system parameters.

  11. Abstract

    In optimization models based on stochastic programming, we often face the problem of representing expectations in a suitable form, a task known as scenario generation. With advances in computational power, a number of methods, from simple Monte Carlo to dedicated approaches such as moment matching and scenario reduction, are being used for multistage scenario generation. Recently, several variations of the moment-matching approach have been proposed with the aim of reducing computational time and producing better outputs. In this paper, we describe a methodology to speed up moment-matching-based multistage scenario generation by using principal component analysis. Our proposal is to pre-process the data with dimensionality-reduction techniques instead of using the returns directly as variables in the moment-matching problem. We also propose a hybrid multistage extension of a heuristic-based moment-matching algorithm and compare it with other variants of the moment-matching algorithm. Computational results using non-normal and correlated returns show that the proposed approach provides a better approximation of the returns distribution in less time.
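The pre-processing step described above, projecting the return series onto its leading principal components before moment matching, can be sketched as follows; the function name and the eigen-decomposition route are illustrative choices, not the authors' implementation:

```python
import numpy as np

def pca_reduce(returns, k):
    """Project centered asset returns onto their top-k principal
    components; returns the reduced scores and the loading matrix."""
    X = returns - returns.mean(axis=0)        # center each asset series
    cov = np.cov(X, rowvar=False)             # asset covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigh: eigenvalues ascending
    order = np.argsort(eigvals)[::-1][:k]     # indices of the k largest
    loadings = eigvecs[:, order]
    return X @ loadings, loadings             # scores (T x k), loadings (d x k)
```

The moment-matching problem is then solved over the k component scores instead of the original d return series, and scenarios are mapped back via the loading matrix; with k = d the projection is lossless.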

  12. Abstract

    Human resources management (HRM) helps an organization to assess organizational and environmental changes related to its activities at minimum cost; moreover, HRM ensures unity and coherence of personnel activities. Therefore, human resource strategies are nowadays considered main components of organizational improvement. It is imperative to consider these strategies in order to increase the effectiveness and efficiency of management activities and to develop the abilities of employees. In this regard, successful organizations are those in which managers and employees are in constant, dynamic competition for innovation and creativity on the basis of the organization's strategies, and in which such thinking has become a habit and a duty among the workforce. The present study tries to identify and prioritize human resource strategies with an approach based on creativity among the employees of documents and properties registration departments. The analysis was conducted with the strengths, weaknesses, opportunities, and threats (SWOT) matrix; the analytic network process, a multi-criteria decision-making method, was then applied to prioritize the strategies. The strategy for the maintenance of human resources, based on the expanded creativity criterion, emerged as the first priority.

  13. Abstract

    This paper deals with a single-server queueing system with limited waiting space in which the server serves customers in batches. The arrival process is of renewal type, and the services are correlated, represented by a continuous-time batch Markovian service process (C-BMSP). The distribution of the system length at the pre-arrival instant of a customer and at an arbitrary epoch is determined for this queueing system. These probability distributions are used to obtain the blocking probability of an arbitrary customer, the expected system length, the expected waiting time of an arbitrary customer in the system, and several other important performance measures. This model may find application in queueing systems involving inventory, where delay in meeting demand may lead to goods perishing because of a long wait in the system. A profit function is also derived for this queueing model in order to maximize the profit of the system for certain model parameters. Finally, assuming that the inter-arrival time follows a phase-type distribution, a few numerical examples are presented in the form of graphs and tables.

  14. Abstract

    The present study analyzes a production-inventory system under a hybrid carbon regulation policy, a combination of carbon tax and cap-and-trade policies. It considers a single item that can be produced in different qualities; the production cost, setup cost, amount of emissions and demand rate depend on the quality, and the demand rate for each quality is price sensitive. Emissions occur from three sources: setup, the production process and stock holding. The firm can invest in green technologies for each emission source separately to reduce emissions. The model follows a profit maximization policy. The managerial problem is to select the profit-maximizing quality for production and to find the optimum values of the production run time, the green investments and the selling price. An algorithm is provided to solve the model, which is illustrated by a numerical example; a sensitivity analysis is also performed.

  15. Abstract

    This paper studies a finite Markovian queue with customer interjections. Arriving customers are divided into normal customers and interjecting customers: normal customers join the queue at the end, while interjecting customers try to cut in line and occupy a position as close to the head of the queue as possible. The behavior of interjecting customers is described by the percentage of customers interjecting and the tolerance level of interjection of the individual customers already waiting in the queue; customers in different positions in the waiting line have different tolerance levels. Using the theory of the exponential distribution, the Laplace–Stieltjes transform and the law of total probability, we obtain the mean and variance of the waiting time of a customer in position n, of a normal customer and of an interjecting customer. Numerical results show that these means and variances of waiting times increase with the arrival rate and are sensitive to the percentage of customers interjecting and to the tolerance level of interjection. It is found that eliminating customer interjection would be beneficial to normal customers.

  16. Abstract

    This article presents a model to plan the annual production of pulleys and gray-iron bushings in an engineering company. The proposed model considers sales forecasts and uses finite mixture distributions to determine the behavior of class A sales items and the best service levels. The forecast model parameters and the finite mixture distribution of sales are calculated via Monte Carlo simulation. The model provides the annual production plan and inventory, minimizing fixed and variable production costs, inventory holding costs and penalty costs for noncompliance. The design is implemented with historical class A sales items, determining that the best service level is 95%; the model is expected to yield an additional annual profit of approximately $277,000 Mexican pesos.
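The Monte-Carlo treatment of a finite mixture can be made concrete with a generic sampler: draw a mixture component with its weight, then sample from that component. The weights and component samplers below are hypothetical placeholders, not the paper's calibrated distributions:

```python
import random

def sample_mixture(weights, samplers, n, seed=7):
    """Draw n Monte-Carlo samples from a finite mixture distribution.
    weights: component probabilities (summing to 1);
    samplers: callables taking an RNG and returning one draw."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        r, acc = rng.random(), 0.0
        for w, s in zip(weights, samplers):
            acc += w
            if r < acc:
                break                 # component chosen by its weight
        out.append(s(rng))            # falls through to the last component
    return out
```

Service levels can then be estimated empirically from such samples, e.g. as the demand quantile covered by a given stock level.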

  17. Abstract

    The mixed-model assembly line is a special case of a production line on which a variety of product models with similar product characteristics are assembled. This article addresses available-to-promise (ATP) in mixed-model assembly line sequencing problems in a make-to-order environment in two stages. First, customers are prioritized based on their corresponding profit values, and a decision support system for ATP-based order acceptance/rejection is developed; by implementing this concept and developing a mathematical model, delivery quantities and dates in a planning horizon are determined from the inventories in stock. The second stage solves a mixed binary mathematical model to sequence the accepted orders according to their due dates, guaranteeing that orders are released neither too late nor too early. The problem simultaneously considers the following objectives: minimizing the total tardiness and earliness costs based on the determined priority of orders, and minimizing the utility work and idle time of workers on the production line. An algorithm based on Lagrangian relaxation is developed for the problem and tested in terms of solution quality and computational efficiency. To validate the performance of the proposed algorithm, various small test problems are solved with the CPLEX solver and compared with the Lagrangian relaxation method; the proposed model is then solved on large problems to analyze its performance. The drawback of CPLEX is that it cannot solve large problem instances in reasonable time. For the small-sized problems, the Lagrangian relaxation method has a duality gap of approximately 1%; for the large-sized problems, the maximum duality gap always stays below 4% while the average computing time remains very reasonable. According to the results obtained from the test problems, the developed Lagrangian relaxation method therefore proves to be a suitable method for this problem.

  18. Abstract

    There has been extensive scheduling research that uses existing dispatching rules, with or without new dispatching rules, and compares their performance in job-shop, flow-shop, open-shop, flexible manufacturing system, and unit-capacity single-machine environments under various scheduling objectives. However, there appears to be no comparative study of dispatching rules for scheduling a bottleneck batch processing machine in discrete parts manufacturing, particularly the diffusion furnace (DF) in semiconductor manufacturing (SM). This study addresses this research issue. It first proposes mathematical models for the dynamic scheduling (DS) of the DF to optimize due-date-based scheduling objectives: total weighted tardiness, on-time delivery rate, total earliness/lateness, and maximum lateness. Because each of the proposed mathematical models is computationally intractable for large-scale problems, the study proposes a greedy heuristic algorithm (GHA) based on due-date-based dispatching rules (DDR), since dispatching rules are widely used in the SM industry. Accordingly, twenty variants of GHA-DDR are proposed, considering various due-date-based dispatching rules, to compare their effects in the DS of the DF. From the series of computational analyses carried out in this study, it is observed empirically that the proposed GHA variants based on the apparent tardiness cost (ATC) and batch ATC (BATC) dispatching rules consistently yield better solutions for most of the scheduling objectives considered. This observation is further verified by statistical analysis: the Friedman test and the Nemenyi multiple comparison test.
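The apparent-tardiness-cost rule mentioned above has a standard priority index combining weighted shortest processing time with a due-date look-ahead. A minimal sketch (job data and the look-ahead parameter k are illustrative; the batch variant BATC aggregates such indices over the jobs in a batch):

```python
import math

def atc_index(jobs, t, k, p_bar):
    """Apparent Tardiness Cost (ATC) dispatching index at time t.
    jobs: {name: (p, d, w)} with processing time p, due date d, weight w;
    p_bar is the average processing time, k a look-ahead parameter.
    The job with the largest index is dispatched next."""
    return {
        name: (w / p) * math.exp(-max(d - p - t, 0.0) / (k * p_bar))
        for name, (p, d, w) in jobs.items()
    }
```

A job with a tight due date and high weight per unit of processing time receives a large index and is scheduled early, which is why ATC-type rules track weighted-tardiness objectives well.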

  19. Abstract

    The purpose of this study is to measure the financial efficiency of firms considering both input and procurement capital. We propose a new method called the three-dimensional data envelopment analysis model and conduct an efficiency analysis of 33 auto-parts manufacturing companies in Korea. The results are summarized as follows. Even if the business performance generated by a company is efficient relative to its input assets, many companies are inefficient relative to their procurement capital. Conversely, some companies whose management performance is inefficient relative to input assets are efficient in terms of procurement capital. Therefore, when analyzing the efficiency of a company, an efficiency methodology and measurement values that consider both input and procurement capital are needed. This study presents a new measurement methodology based on this point, analyzes the current financial situation of each decision-making unit through returns-to-scale analysis, and suggests directions for financial improvement.

  20. Abstract

    In this paper, the Staff Transfer Problem (STP) in Human Resource Management is addressed as a stable matching problem. Earlier formulations of this problem were of scheduling/allocation type; the stable matching formulation is a completely new and more practical approach. The new formulation involves two preference lists: the first contains the offices/locations preferred by the employees undergoing transfer, and the second contains the employees preferred by the employer of an office/location to which those employees want to be transferred. The capacity of an office/location acts as a hard constraint. While matching these two lists, the objective is to maximize the number of transfers and at the same time stabilize the matching, i.e., minimize the number of blocking pairs. The resulting STP instance is an instance of Maximum Size Minimum Blocking Pair Stable Matching with incomplete preference lists (MAX SIZE MIN BP SMI), which is proved in this paper to be NP-hard. As the formulation is new, no previous work, method or result is available, and there was no preference in selecting meta-heuristics; among the many existing meta-heuristics, some of the most widely used, namely Simulated Annealing, Genetic Algorithms, Tabu Search and some variants of them, have been chosen. Based on them, four meta-heuristic approaches have been proposed, namely btSA_match, gtSA_match, GA_match and TS_match. The variants btSA_match and gtSA_match are obtained from modifications of Simulated Annealing, while GA_match and TS_match are based on modified Genetic Algorithms and Tabu Search respectively. As there is no previous result in the existing literature, performance has been compared among these four methods. It is observed that the variants of Simulated Annealing (SA) outperform the others w.r.t. the performance metrics; the greedy SA variant incorporating a tabu list (gtSA_match) shows the best results on the basis of statistical analysis.
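The blocking-pair objective that these meta-heuristics minimize can be made concrete with a small checker for the one-to-one case (a simplification: the paper's offices have capacities, and the data layout below is illustrative). A pair blocks a matching when both sides find each other acceptable and each prefers the other to its current partner:

```python
def blocking_pairs(match, pref_e, pref_o):
    """Count blocking pairs in a one-to-one matching with incomplete
    preference lists (SMI).  match maps employee -> office;
    pref_e / pref_o are the two preference lists from the formulation."""
    def prefers(pref, new, cur):
        # `new` is acceptable and better than the current partner `cur`
        if new not in pref:
            return False
        return cur is None or cur not in pref or pref.index(new) < pref.index(cur)

    count = 0
    for e, prefs in pref_e.items():
        for o in prefs:
            if match.get(e) == o:
                continue                      # already matched to each other
            cur_o = next((x for x, y in match.items() if y == o), None)
            if prefers(prefs, o, match.get(e)) and prefers(pref_o[o], e, cur_o):
                count += 1
    return count
```

A candidate solution produced by any of the four meta-heuristics can be scored with such a counter: a stable matching yields zero, and the search minimizes this count while maximizing the number of transfers.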