Record Details

Two-stage flexible flow shop scheduling with blocking constraint and batching machines

ScholarsArchive at Oregon State University

Title: Two-stage flexible flow shop scheduling with blocking constraint and batching machines
Names: Panahi, Hadi (creator); Logendran, Rasaratnam (advisor)
Date Issued: 2014-06-06 (iso8601)
Note: Graduation date: 2015
Abstract: In this dissertation, a two-stage flexible flow shop scheduling problem is studied in which stage 1 consists of identical batch processors (burnout ovens) and stage 2 consists of identical discrete processors (furnaces). Batch processors perform a pre-heating process on a batch of jobs at a time, and discrete processors perform a casting process on only one job at a time. It is assumed that no intermediate buffer is allowed between the two stages, which imposes a blocking constraint between them. A batch of jobs is loaded into a batch processor based on predefined patterns. Batch processors have a capacity limit, so only a limited number of jobs can be loaded at a time; this number is determined by the burnout oven size, the size and shape of the jobs, and the layout by which they are loaded into the batch processor. It is assumed that the processing times of the batch processors are fixed and independent of the number of jobs loaded. In stage 2, jobs may require a setup time before loading; this setup depends on the job processed last and the job to be processed next. In stage 2, pots are used to pour the molten metal. Larger jobs require larger pots because of their higher molten metal volume. Jobs may also differ in alloy type, and when two consecutive jobs have different alloy types, the pot currently in use must be washed before it is used for the next job. Furthermore, each pot can be used for a predefined number of pourings; when this maximum is reached, it must be replaced with a new pot. It is assumed that there are different orders of jobs from customers and that each customer’s order is associated with a due date. In this problem, several goals have to be attained while producing high-quality products that conform to the established production plan. An ideal production plan reduces the blocking times on the batch processors, reduces the number of pot changes and improves pot efficiency, reduces the maximum weighted completion time of all jobs, and at the same time satisfies customers’ due dates. To attain all of these goals in one production plan, the maximum weighted completion time of all jobs is minimized, with the weights defined based on customers’ due dates.
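As a concrete illustration of the objective described above (not code from the dissertation), the following minimal Python sketch evaluates the maximum weighted completion time of a fixed schedule. The job data and the specific weighting rule, with weights taken as the reciprocal of the due dates, are assumptions for illustration only; the abstract states merely that weights are defined based on customers’ due dates.

def max_weighted_completion_time(completion_times, due_dates):
    # Assumed weighting rule (illustrative): tighter due dates get larger weights.
    weights = [1.0 / d for d in due_dates]
    # Objective value: maximum over jobs of weight * completion time.
    return max(w * c for w, c in zip(weights, completion_times))

# Hypothetical example: three jobs finishing at times 10, 14, 20 with due dates 12, 25, 30.
print(max_weighted_completion_time([10, 14, 20], [12, 25, 30]))  # -> 0.833...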
To solve this problem and obtain optimal solutions, several mixed-integer programming models are developed and solved using CPLEX and GUROBI. For small-size problems it is possible to find an optimal solution, but as the size of the problem increases, these commercial solvers cannot prove the optimality of the feasible solutions they find. For large-size problems, they cannot even find a feasible solution.
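For readers unfamiliar with how a min-max objective of this kind is commonly linearized in a mixed-integer program, the sketch below builds a deliberately tiny toy model with Gurobi's Python interface (gurobipy). It is not one of the dissertation's models: the two-job data, the single-machine disjunctive constraints, and the weights are all assumptions made only to show the modeling pattern in which an auxiliary variable z bounds every weighted completion time and is then minimized.

import gurobipy as gp
from gurobipy import GRB

# Toy instance (assumed data): two jobs on one machine.
p = [4, 6]        # processing times
w = [2.0, 1.0]    # weights (assumed to encode due-date priority)
M = sum(p)        # big-M for the disjunctive constraints

m = gp.Model("minmax_weighted_completion")
C = m.addVars(2, lb=0.0, name="C")         # completion times
y = m.addVar(vtype=GRB.BINARY, name="y")   # y = 1 if job 0 precedes job 1
z = m.addVar(lb=0.0, name="z")             # auxiliary variable for the max

m.addConstrs(C[j] >= p[j] for j in range(2))
# Either-or sequencing constraints on the single machine:
m.addConstr(C[1] >= C[0] + p[1] - M * (1 - y))
m.addConstr(C[0] >= C[1] + p[0] - M * y)
# Linearization of the min-max objective: z >= w_j * C_j for every job j.
m.addConstrs(z >= w[j] * C[j] for j in range(2))
m.setObjective(z, GRB.MINIMIZE)
m.optimize()
print([C[j].X for j in range(2)], z.X)

The same pattern scales to the full problem, where the batching, blocking, and setup constraints are far more involved.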
The problem is inspired by a real industry problem, and to solve it on an industrial scale, exact approaches cannot be employed because of their inefficiency on large-size problems. To obtain at least high-quality near-optimal solutions, three different search algorithms, namely TS, LTM_MAX, and LTM_MIN, all based on Tabu search, are designed and implemented (a generic skeleton is sketched below).
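The following generic tabu search skeleton is purely illustrative: the actual neighborhoods, tabu tenures, and the long-term memory strategies that distinguish TS, LTM_MAX, and LTM_MIN are specific to the dissertation and are not reproduced here. The toy cost function and swap neighborhood in the usage example are assumptions.

def tabu_search(initial, neighbors, cost, tenure=7, iterations=200):
    """Generic tabu search skeleton (illustrative only).

    neighbors(s) must yield (move, new_solution) pairs; a tabu move is
    skipped unless it improves the best known cost (aspiration criterion).
    """
    current, best = initial, initial
    best_cost = cost(initial)
    tabu = {}  # move -> iteration at which it stops being tabu
    for it in range(iterations):
        candidates = []
        for move, sol in neighbors(current):
            c = cost(sol)
            if tabu.get(move, -1) > it and c >= best_cost:
                continue  # tabu and does not satisfy the aspiration criterion
            candidates.append((c, move, sol))
        if not candidates:
            break
        c, move, current = min(candidates, key=lambda t: t[0])
        tabu[move] = it + tenure
        if c < best_cost:
            best, best_cost = current, c
    return best, best_cost

# Toy usage (assumed data): minimize the maximum weighted completion time
# of a single-machine permutation by swapping adjacent jobs.
p = [4, 6, 3]
w = [2.0, 1.0, 3.0]

def seq_cost(seq):
    t, worst = 0, 0.0
    for j in seq:
        t += p[j]
        worst = max(worst, w[j] * t)
    return worst

def swap_neighbors(seq):
    for i in range(len(seq) - 1):
        s = list(seq)
        s[i], s[i + 1] = s[i + 1], s[i]
        yield ("swap", i), s

print(tabu_search([0, 1, 2], swap_neighbors, seq_cost))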
The quality of the solutions obtained by the search algorithms has to be evaluated against a valid measure. While mathematical modeling is a good tool for obtaining such valid measures (optimal solutions), it is not reliable for large-size problems; in this research it is therefore used to evaluate the performance of the search algorithms only on small-size problems. For large-size problems, a different methodology, based on the identification of valid lower bounds, is used to investigate the performance of the search algorithms.
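For concreteness, the percentage gaps reported in the results can be read, under one common convention (the dissertation may define them slightly differently), as the relative distance between a heuristic objective value Z and a reference value R (an optimal solution or a lower bound):

\[ \text{gap}(\%) = 100 \times \frac{Z - R}{R} \]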
Two lower-bounding mechanisms are used to generate lower bounds, which are then compared against the solutions found by the search algorithms. The first methodology developed in this research is an iterative selective linear relaxation: it starts by solving the problem with some of the variables relaxed and relaxes further selected sets of variables in subsequent iterations. The second methodology developed in this research is based on the branch-and-price (B&P) algorithm. B&P can be viewed as an extension of branch-and-bound in which a column generation algorithm is developed and implemented to solve each node of the branch-and-bound tree, thus enabling higher-quality lower bounds to be identified efficiently for the original problem.
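To illustrate only the first lower-bounding idea (the iterative selective relaxation), the gurobipy sketch below solves a deliberately tiny covering-style minimization model while relaxing integrality on a growing, selected subset of binary variables. Every such partially relaxed problem is a relaxation of the original MIP, so its optimum is a valid lower bound; relaxing more variables typically weakens the bound but makes the problem easier to solve. The model, the data, and the choice of which variables to relax at each iteration are assumptions for illustration; the column generation machinery inside B&P is not sketched here.

import gurobipy as gp
from gurobipy import GRB

# Toy minimization MIP (assumed data): select items to cover a demand at minimum cost.
cost = [5, 4, 3, 6, 2]
size = [3, 2, 2, 4, 1]
demand = 7
n = len(cost)

def build_model(relaxed):
    m = gp.Model("selective_relaxation")
    m.Params.OutputFlag = 0
    x = m.addVars(n, ub=1.0, vtype=GRB.BINARY, name="x")
    for j in relaxed:
        x[j].vtype = GRB.CONTINUOUS  # relax integrality on selected variables
    m.addConstr(gp.quicksum(size[j] * x[j] for j in range(n)) >= demand)
    m.setObjective(gp.quicksum(cost[j] * x[j] for j in range(n)), GRB.MINIMIZE)
    return m

# Relax a growing set of selected variables; each solve yields a valid
# lower bound on the original all-binary problem.
relaxed = set()
for batch in ([0, 1], [2, 3], [4]):
    relaxed.update(batch)
    model = build_model(relaxed)
    model.optimize()
    print("relaxed:", sorted(relaxed), "lower bound:", model.ObjVal)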
A comprehensive set of experiments is performed to evaluate the efficiency and efficacy of the search algorithms. For these experiments, numerous test problems that are representative of the real problem have been generated. Several important factors, such as the structure of the problem, the ratio of processing times to setup times, and the types of setup times, are considered in generating the test problems. Since some of these factors are known to be “hard to change”, a split-plot design is used for the design of experiments in this research.
The experiments reveal that LTM_MIN outperforms the other two algorithms in terms of solution quality. On average, TS and LTM_MAX deviate from LTM_MIN by 6.51% and 3.44%, respectively, across small- and large-size problems. The average gap of LTM_MIN from the optimal solution in small-size problems is 2.81%, which is very promising. The average gaps of LTM_MIN from the best available lower bound and from the B&P lower bound are 31.10% and 39.24%, respectively. In large-size problems, the average gap of the lower bound obtained by B&P from that obtained by B&B is -14.07%, confirming that B&P outperforms B&B on large-size problems. Finally, in large-size problems, the average gap of the best of the three algorithms from the best available lower bound is 14.2%.
Genre: Thesis/Dissertation
Access Condition: http://creativecommons.org/licenses/by-nc-nd/3.0/us/
Topic: Scheduling
Identifier: http://hdl.handle.net/1957/50149
