Workshop on Computational Optimization, Modelling and Simulation (COMS) Session 1

Time and Date: 10:35 - 12:15 on 12th June 2017

Room: HG D 5.2

Chair: Xin-She Yang

203 Global Convergence Analysis of the Flower Pollination Algorithm: A Discrete-Time Markov Chain Approach [abstract]
Abstract: The flower pollination algorithm is a recent metaheuristic for solving nonlinear global optimization problems. The algorithm has also been extended to multiobjective optimization with promising results. In this work, we analyze the algorithm mathematically and prove its convergence properties using Markov chain theory. By constructing an appropriate transition probability for a population of flower pollen and using the homogeneity property, we show that the constructed stochastic sequences converge to the optimal set. It is proved that the simplified flower algorithm satisfies the two conditions required for convergence, and thus the global convergence of the algorithm is guaranteed. Numerical experiments demonstrate that the flower pollination algorithm converges quickly and can thus achieve global optimality efficiently.
Xingshi He, Xin-She Yang and Mehmet Karamanoglu
217 Memetic Simulated Annealing for Data Approximation with Local-Support Curves [abstract]
Abstract: This paper introduces a new memetic optimization algorithm called MeSA (Memetic Simulated Annealing) to address the data fitting problem with local-support free-form curves. The proposed method hybridizes simulated annealing with the COBYLA local search method. This approach is further combined with centripetal parameterization and the Bayesian information criterion to compute all free variables of the B-spline curve reconstruction problem. The performance of our approach is evaluated by applying it to four different shapes with local deformations and varying degrees of noise and data-point density. MeSA has also been compared to the non-memetic version of SA. Our results show that MeSA is able to reconstruct the underlying shape of the data even in the presence of noise and low-density point clouds. It also outperforms SA on all the examples in this paper.
Carlos Loucera, Andres Iglesias Prieto and Akemi Galvez-Tomida
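The SA-plus-local-search hybridization described in the abstract above can be sketched compactly. The snippet below is an illustrative simplification, not the authors' implementation: the COBYLA step is replaced by a simple derivative-free pattern search so the example needs only the standard library, and the curve-fitting objective is reduced to a one-parameter toy cost.

```python
import math
import random

def local_search(f, x, step=0.1, iters=50):
    """Derivative-free local refinement (a stand-in for COBYLA):
    pattern search that halves the step when neither move improves."""
    fx = f(x)
    for _ in range(iters):
        improved = False
        for dx in (step, -step):
            if f(x + dx) < fx:
                x, fx = x + dx, f(x + dx)
                improved = True
        if not improved:
            step *= 0.5
    return x, fx

def memetic_sa(f, x0, t0=1.0, cooling=0.95, n_iter=200, seed=0):
    """Memetic simulated annealing: the classic SA acceptance rule,
    with a local search applied to every candidate (the memetic step)."""
    rng = random.Random(seed)
    x, fx = local_search(f, x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(n_iter):
        cand = x + rng.gauss(0.0, 1.0)       # global SA move
        cand, fc = local_search(f, cand)     # memetic refinement
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                         # geometric cooling schedule
    return best_x, best_f

# Toy one-parameter fitting objective: squared error around 3.0.
x, fval = memetic_sa(lambda v: (v - 3.0) ** 2, x0=-5.0)
```

The memetic element is the `local_search` call applied to every SA candidate; dropping it recovers plain SA, which is the comparison the abstract reports.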
326 A Matheuristic Approach for Solving the Dynamic Facility Layout Problem [abstract]
Abstract: The Dynamic Facility Layout Problem (DFLP) is the problem of designing a facility over a multi-period planning horizon in which the interdepartmental material flows change from one period to the next due to changes in product demands. The DFLP arises when designing manufacturing and logistics facilities over multiple planning periods; however, it is a very challenging nonlinear optimization problem. In this paper, a zone-based block layout is used to design such facilities. A zone-based block layout inherently includes possible aisle structures, which can easily be adapted to different material handling systems. The unequal-area DFLP is modeled and solved using a zone-based structure in which the dimensions of the departments are decision variables and the departments are assigned to flexible zones with a pre-structured positioning. A matheuristic approach, which combines concepts from Tabu Search (TS) and mathematical programming, is proposed to solve the zone-based DFLP on the continuous plane with unequal-area departments. The TS determines the relative locations of departments and their assignments to zones, while their exact locations and shapes are calculated by mathematical programming. Numerical results for a set of test problems from the literature show that the proposed matheuristic approach is promising.
Sadan Kulturel-Konak
64 Job-flow Anticipation Scheduling in Grid [abstract]
Abstract: In this paper, a heuristic user job-flow scheduling approach for Grid virtual organizations with non-dedicated resources is discussed. The preferences of users and resource providers, the internal policies of the virtual organization, and the geographical distribution of resources, along with local private utilization, impose specific requirements for efficient scheduling according to different, usually conflicting, criteria. As the resource utilization level increases, the set of available resources and the corresponding decision space shrink, which further complicates the task of efficient scheduling. In order to improve overall scheduling efficiency, we propose a heuristic anticipation scheduling approach. It initially generates a near-optimal but infeasible scheduling solution, which is then used as a reference for efficient resource allocation.
Victor V. Toporkov, Dmitry Yemelyanov and Alexander Bobchenkov

Workshop on Computational Optimization, Modelling and Simulation (COMS) Session 2

Time and Date: 15:45 - 17:25 on 12th June 2017

Room: HG D 5.2

Chair: Xin-She Yang

572 A Hybrid Heuristic in GPU-CPU Based on Scatter Search for the Generalized Assignment Problem [abstract]
Abstract: In the Generalized Assignment Problem (GAP), tasks must be allocated to machines with limited resources in order to minimize processing costs. This problem has several industrial applications and often appears as a substructure in other combinatorial optimization problems. We propose a hybrid method, inspired by the Scatter Search metaheuristic, that efficiently generates a pool of solutions using a Tabu list criterion and an Ejection Chain mechanism. Common characteristics are extracted from the pool, and solutions are combined by exploring a restricted search space as a Binary Programming (BP) model. The method was implemented as a parallel approach to run on a Graphics Processing Unit (GPU). Experimental results show that the proposed method is very competitive with the algorithms found in the literature: on average, a gap of 0.09% with respect to lower bounds is obtained over a set of 45 instances. Thanks to the integration of the method with an exact BP solver, it was able to prove optimality for small instances and to find new best known solutions for 21 instances. In terms of computational time, the proposed method runs on average 8 times faster than methods from the literature, indicating that the approach is scalable and robust for practical applications.
Danilo Santos Souza, Haroldo Gambini Santos and Igor Machado Coelho
314 An Exact Resolution for the Probabilistic Traveling Salesman Problem under the A Priori Strategy [abstract]
Abstract: The Probabilistic Traveling Salesman Problem (PTSP) is an extension of the classical Traveling Salesman Problem (TSP). The main difference is the stochastic presence of the customers: each customer is present with a given probability, so the number of customers to be visited in each realization is a random variable. Solving the problem consists in finding an a priori tour visiting all customers that minimizes the expected length over all realizations. In this paper, we propose an exact algorithm for the solution of the PTSP. In this context, we derive new expressions for the lower bound and for the transitional and permanent evaluations, as well as the separation mechanism. The numerical results demonstrate the advantage of the proposed evaluations.
Mohamed Abdellahi Amar, Walid Khaznaji and Monia Bellalouna
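The a priori objective mentioned above (expected tour length over all realizations) can be evaluated exactly by brute force for small instances. The sketch below is illustrative only and is not the paper's exact algorithm; it assumes a depot-free toy model in which realizations with fewer than two present customers contribute zero length.

```python
import itertools
import math

def tour_length(points, order):
    """Length of the closed tour visiting `points` in the given order."""
    n = len(order)
    return sum(math.dist(points[order[i]], points[order[(i + 1) % n]])
               for i in range(n))

def expected_tour_length(points, order, prob):
    """Exact expected length of an a priori PTSP tour: enumerate all 2^n
    presence patterns, skip absent customers, and keep the a priori order."""
    n = len(order)
    total = 0.0
    for pattern in itertools.product((0, 1), repeat=n):
        p = 1.0
        for bit, c in zip(pattern, order):
            p *= prob[c] if bit else (1.0 - prob[c])
        present = [c for bit, c in zip(pattern, order) if bit]
        if len(present) >= 2:
            total += p * tour_length(points, present)
    return total

# Four customers on a unit square, a priori tour around the perimeter.
pts = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
tour = [0, 1, 2, 3]

full = expected_tour_length(pts, tour, [1.0, 1.0, 1.0, 1.0])
partial = expected_tour_length(pts, tour, [1.0, 1.0, 1.0, 0.5])
```

With all probabilities equal to one the expectation reduces to the deterministic tour length (4.0 here); making one customer uncertain lowers the expectation, since skipping a customer shortens the tour. An exact method avoids this exponential enumeration via closed-form evaluations, which is what the paper's new expressions address.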
97 A Matrix Approach to DC Railway Electrification Verification [abstract]
Abstract: The network managed by the Spanish Railway Infrastructure Authority (ADIF) has specific rules for its 3,000 V DC electrification. As far as we know, the correctness of the installations is nowadays checked manually by an expert, as used to be done for expert systems years ago. We propose a computer tool that aids the expert in checking that the positioning of the insulators, circuit breakers, load disconnectors, feeders, ... fulfills the established requirements. The tool allows the expert to automatically check the sections fed in the different scenarios proposed in the requirements.
Eugenio Roanes-Lozano and Ruben Gonzalez-Martin
284 A Multi-Objective Approach to the Competitive Facility Location Problem [abstract]
Abstract: We introduce a novel approach for a competitive facility location problem in which multiple competitors aim to gain market share. The problem, called the Competitive Maximal Covering Location Problem (CMCLP), is based on the classical Maximal Covering Location Problem. Typically, the CMCLP is modeled as a Stackelberg game in which one player, and then the other, locates a fixed number of facilities. The present work, in contrast, considers multiple competitors, and the objective is to discover a set of the competitors' decision tuples that are not dominated by any other decision tuples in the solution space. Our modeling paradigm thereby aims to help competing firms understand tradeoffs when they engage in negotiations. We introduce a mathematical model for the CMCLP with two competitors and a genetic algorithm to solve problems with multiple competitors. Computational experiments demonstrate that the proposed algorithm is able to approximate the true Pareto front.
Abdullah Konak, Sadan Kulturel-Konak and Lawrence Snyder
578 Multi-objective optimisation in scientific workflows [abstract]
Abstract: Engineering design is typically a complex process that involves finding a set of designs satisfying various performance criteria. As a result, optimisation algorithms that handle only a single objective are not sufficient for many real-life problems. Meanwhile, scientific workflows have been shown to be an effective technology for automating and encapsulating scientific processes. While optimisation algorithms have been integrated into workflow tools, they are generally single-objective. This paper first presents our latest development to incorporate multi-objective optimisation algorithms into scientific workflows. We demonstrate the efficacy of these capabilities with the formulation of a three-objective aerodynamics optimisation problem. We aim to improve the aerodynamic characteristics of a typical 2D airfoil profile, also considering the laminar-turbulent transition location for a more accurate estimation of the total drag. We deploy two different heuristic optimisation algorithms and compare the preliminary results.
Hoang Nguyen, Zane van Iperen, Sreekanth Raghunath, David Abramson, Timoleon Kipouros and Sandeep Somasekharan

Workshop on Computational Optimization, Modelling and Simulation (COMS) Session 3

Time and Date: 10:15 - 11:55 on 13th June 2017

Room: HG D 5.2

Chair: Slawomir Koziel

247 Pareto Ranking Bisection Algorithm for EM-Driven Multi-Objective Design of Antennas in Highly-Dimensional Parameter Spaces [abstract]
Abstract: A deterministic technique for fast surrogate-assisted multi-objective design optimization of antennas in high-dimensional parameter spaces is discussed. In this two-stage approach, the initial approximation of the Pareto set, representing the best compromise between conflicting objectives, is obtained using a bisection algorithm that finds new Pareto-optimal designs by dividing the line segments interconnecting previously found optimal points and executing a poll-type search that involves Pareto ranking. The initial Pareto front is generated at the level of a coarsely-discretized EM model of the antenna. In the second stage of the algorithm, the high-fidelity Pareto designs are obtained through optimization of corrected local-approximation models. The considered optimization method is verified using a 17-variable uniplanar antenna operating in the ultra-wideband frequency range, and is compared to three state-of-the-art surrogate-assisted multi-objective optimization algorithms.
Adrian Bekasiewicz, Slawomir Koziel, Leifur Leifsson and Xiaosong Du
218 Accelerating Parallel Multicriterial Optimization Methods Based on Intensive Using of Search Information [abstract]
Abstract: In the present paper, we propose an efficient parallel method for solving complex multicriterial optimization problems in which the optimality criteria can be multiextremal and computing the criteria values can require a large amount of computation. The proposed approach is based on reducing the multicriterial problems to global optimization ones using the minimax convolution of the partial criteria, on dimensionality reduction with Peano space-filling curves, and on the application of efficient parallel information-statistical global optimization methods. The search information obtained in the course of the computations is used intensively throughout. The results of the computational experiments demonstrate that this approach can reduce the computational cost of solving multicriterial optimization problems substantially, by tens to hundreds of times.
Victor Gergel and Evgeniy Kozinov
368 A Surrogate Model Based On Mixtures Of Taylor Expansions For Trust Region Based Methods [abstract]
Abstract: In this paper, we propose a surrogate model based on mixtures of linear Taylor polynomials for Trust Region methods. The main objective of this model is to reduce the myopia of surrogate models based on a single low-order Taylor expansion, which can increase the number of iterations required by Trust Region based methods. The proposed model is built as follows: points are sampled from the search space; at each sampled point a surrogate model of the cost function is built using a linear Taylor polynomial; and the cost function is then locally approximated via a convex combination of these surrogate models. The Trust Region framework is utilized to validate the quality of the proposed model. Moreover, the model is proven to be fully linear, which guarantees the global convergence of Trust Region methods to local optimum solutions. Experimental tests are performed using the three-dimensional variational optimization problem from data assimilation with an atmospheric general circulation model. The results reveal that the proposed surrogate model can improve the quality of the local approximations and, moreover, can decrease the number of iterations needed to obtain accurate solutions.
Elias D. Nino Ruiz, Carlos Ardila, Alfonso Mancilla and Jesus Estrada
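The model construction described above can be illustrated with a minimal one-dimensional sketch (not the paper's code): each sampled center contributes a linear Taylor model, and the surrogate is a convex combination of these models. The normalized inverse-squared-distance weighting used here is an assumption for illustration; the weights are nonnegative and sum to one, so the combination is convex.

```python
import math

def taylor_mixture(f, df, centers):
    """Surrogate built as a convex combination of first-order Taylor
    expansions around several sampled centers. Weights are normalized
    inverse-squared distances to the centers (assumed scheme)."""
    models = [(c, f(c), df(c)) for c in centers]
    def m(x):
        terms = []
        for c, fc, gc in models:
            d2 = (x - c) ** 2
            if d2 == 0.0:
                return fc                      # exact hit on a center
            terms.append((1.0 / d2, fc + gc * (x - c)))
        total = sum(w for w, _ in terms)
        return sum(w * t for w, t in terms) / total
    return m

# Toy cost function and its derivative.
f  = lambda x: math.cos(x)
df = lambda x: -math.sin(x)

mix    = taylor_mixture(f, df, centers=[0.0, 1.0, 2.0, 3.0])
single = lambda x: f(0.0) + df(0.0) * x        # one Taylor model at 0

x0 = 2.5
err_mix    = abs(mix(x0)    - f(x0))
err_single = abs(single(x0) - f(x0))
```

On this toy example the mixture tracks cos(x) at x = 2.5 far better than the single Taylor model centered at 0, which is the myopia-reduction effect the abstract describes.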
319 Expedite Design of Variable-Topology Broadband Hybrid Couplers for Size Reduction Using Surrogate-Based Optimization and Co-Simulation Coarse Models [abstract]
Abstract: In this paper, we discuss a computationally efficient approach to expedite the design optimization of broadband hybrid couplers occupying a minimized substrate area. Structure size reduction is achieved by decomposing an original coupler circuit into low- and high-impedance components and replacing them with electrically equivalent slow-wave lines of reduced physical dimensions. The main challenge is the reliable design of computationally demanding low-impedance slow-wave structures that feature a quasi-periodic circuit topology for wideband operation. Our goal is to determine an adequate number of recurrent unit elements and to adjust their designable parameters so that the coupler footprint area is minimal. The proposed method uses surrogate-based optimization with a reconfigurable co-simulation coarse model as the key component enabling design process acceleration. The latter model is composed in the Keysight ADS circuit simulator from multiple EM-evaluated data blocks of the slow-wave unit element and theory-based feeding line models. The embedded optimization algorithm is a trust-region-based gradient search with coarse model Jacobian estimation. We exploit a penalty function approach to ensure that the electrical conditions for the slow-wave lines are satisfied, in addition to explicitly minimizing the area of the coupler. The effectiveness of the proposed technique is demonstrated through the design of a two-section 3-dB branch-line coupler. For this example, we obtain nine circuit design solutions corresponding to compact couplers whose multi-element slow-wave lines are composed of between two and ten unit cells.
Piotr Kurgan, Slawomir Koziel, Leifur Leifsson and Xiaosong Du
323 Airfoil Design Under Uncertainty Using Nonintrusive Polynomial Chaos Theory and Utility Functions [abstract]
Abstract: Fast and accurate airfoil design under uncertainty using nonintrusive polynomial chaos (NIPC) expansions and utility functions is proposed. The NIPC expansions provide a means to efficiently and accurately compute statistical information for a given set of input variables with associated probability distributions. Utility functions provide a way to rigorously formulate the design problem. In this work, these two methods are integrated for the design of airfoil shapes under uncertainty. The proposed approach is illustrated on a numerical example of lift-constrained airfoil drag minimization in transonic viscous flow with the Mach number as an uncertain variable. The results show that, compared with the standard problem formulation, the proposed approach yields more robust designs. In other words, the designs obtained by the proposed approach are less sensitive to variations in the uncertain variables than those obtained with the standard problem formulation.
Xiaosong Du, Leifur Leifsson, Slawomir Koziel and Adrian Bekasiewicz

Workshop on Computational Optimization, Modelling and Simulation (COMS) Session 4

Time and Date: 14:10 - 15:50 on 13th June 2017

Room: HG D 5.2

Chair: Slawomir Koziel

566 Improving HPLC Analysis of Vitamin A and E: Use of Statistical Experimental Design [abstract]
Abstract: Analyses of vitamins A and E in food samples are performed mostly with high-performance liquid chromatography (HPLC). In the majority of cases, the sample preparation preceding HPLC involves saponification, a step that is critical because of the heat sensitivity of the analytes. The saponification method is clearly defined by ISO standards; however, two important factors, the temperature and time of saponification, are only given as value ranges instead of exact settings. To resolve this deficiency, and with the promise of eliminating time- and cost-consuming experimental probes, statistical experimental design (SED) is introduced to find the optimum temperature and time settings for the best recovery of vitamin supplements in food samples. Finding the optimum settings in SED was supported with Statsoft Statistica 7.0. SED is illustrated on margarine samples supplemented with vitamins A and E.
Lőrinc Garai
271 A model for optimal fleet composition of vessels for offshore wind farm maintenance [abstract]
Abstract: We present a discrete optimisation model that chooses an optimal fleet of vessels to support maintenance operations at Offshore Wind Farms (OWFs). The model is presented as a bi-level problem: on the first (tactical) level, decisions are made on the fleet composition for a certain time horizon; on the second (operational) level, the fleet is used to optimise the schedule of operations needed at the OWF, given failure events and weather conditions.
Alejandro Gutierrez-Alcoba, Gloria Ortega, Eligius Hendrix, Elin E. Halvorsen-Weare and Dag Haugland
231 Prostate cancer focal brachytherapy: Improving treatment plan robustness using a convolved dose rate model [abstract]
Abstract: Low-risk prostate cancer can be treated by focal brachytherapy, wherein small radioactive seeds are implanted directly into the prostate. This clinical procedure has reduced side effects compared to conventional radiotherapy treatments that target the entire gland. The planning of such treatment is complicated by post-operative displacement of the seeds from their intended locations. This reduces the planned treatment dose and increases the dose to surrounding tissue such as the urethra and rectum, potentially causing harmful side effects. Current treatment planning methods do not explicitly incorporate the effect of post-operative seed displacement. To address this, we modify the radiation dose rate function used during planning to reflect this displacement using convolution. This new dose rate model enables plans to be produced automatically and efficiently. Simulation experiments show that treatment plans made using the convolved dose rate function are more robust to seed displacement than those using the original unconvolved dose, preserving treatment efficacy while giving increased protection to surrounding tissue.
John Betts, Christopher Mears, Hayley Reynolds, Martin Ebert and Annette Haworth
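The core modelling idea above, convolving the planned dose with the seed-displacement distribution, can be illustrated on a 1-D grid. The snippet below is a toy sketch, not the clinical model: the dose profile, grid, and displacement probabilities are all invented for illustration (and since the kernel is symmetric, correlation and convolution coincide).

```python
def convolve_dose(dose, kernel):
    """'Same'-length discrete convolution of a 1-D dose profile with a
    displacement probability kernel (kernel length must be odd)."""
    half = len(kernel) // 2
    out = []
    for i in range(len(dose)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = i + k - half
            if 0 <= j < len(dose):
                acc += w * dose[j]
        out.append(acc)
    return out

# Idealized point-seed dose profile on a 1-D grid (arbitrary units).
dose = [0.0] * 5 + [10.0] + [0.0] * 5

# Hypothetical displacement distribution: seed stays put with
# probability 0.5, or shifts one grid cell either way with 0.25 each.
kernel = [0.25, 0.5, 0.25]

planned = convolve_dose(dose, kernel)
```

The dose spike is smeared over neighbouring cells while the total dose is preserved, illustrating the smoothing that makes plans robust to post-operative displacement.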
375 Implementation and Use of Coarse-grained Parallel Branch-and-bound in Everest Distributed Environment [abstract]
Abstract: This paper examines a coarse-grained approach to parallelization of the branch-and-bound (B&B) algorithm in a distributed computing environment. The approach is based on a preliminary decomposition of the feasible domain of a mixed-integer programming problem into a set of subproblems. The resulting subproblems are solved in parallel by a distributed pool of standalone B&B solvers. The incumbent values found by individual solvers are intercepted and propagated to the other solvers to speed up the traversal of the B&B search tree. The presented implementation is based on SCIP, a non-commercial MINLP solver, and Everest, a web-based distributed computing platform. The implementation was tested on several mixed-integer programming problems, and a noticeable speedup was achieved. Results of a number of experiments with the Traveling Salesman Problem are presented in the paper.
Vladimir Voloshinov, Sergey Smirnov and Oleg Sukhoroslov
376 Model-Driven Choice of Numerical Methods for the Solution of the Linear Advection Equation [abstract]
Abstract: Designing a partial differential equation solver is a complex task that involves making choices about the solution algorithm and its parameters. Such choices are usually made on the basis of personal preference or numerical experiments, which can introduce significant bias into the selection process. In this work we develop a methodology to drive this selection towards the optimal choices by modelling the accuracy and the performance of the solution algorithm. We show how this methodology can be successfully applied to the linear advection problem. As a result, the selection can be performed optimally with a much lower investment in the development of high-performance versions of the solvers and without using the target architecture for numerical experiments.
Andrea Arteaga, Oliver Fuhrer, Torsten Hoefler and Thomas Schulthess
21 3D Drape Reconstruction and Parameterization Based on Smartphone Video and Elliptical Fourier Analysis [abstract]
Abstract: In this paper, 3D fabric drape is reconstructed from video recorded with a smartphone. Elliptical Fourier Analysis (EFA) and Principal Component Analysis (PCA) are used to parameterize the 3D drape and reveal shape parameters. A cluster analysis of various 3D drapes was carried out to verify the proposed method. Experimental results demonstrate that the 3D drape can be reconstructed and parameterized with a mean error of 0.52 mm when the harmonic number of the EFA equals 25. The clustering results indicate that the new features detected by our method are useful for classifying different drapes, which provides a novel approach to 3D drape analysis.
Ge Wu, Zhicai Yu, Azmat Hussain and Yueqi Zhong