RSUSSH 2020

Material Parameter Estimation using Multiobjective Optimization and Genetic Algorithms

Assoc. Prof. Dr. Kent Salomonsson
Department of Materials & Manufacturing, School of Engineering Jönköping University, SWEDEN.
Invited speaker G1-1

Abstract

Multi-material components are increasingly common in industry, especially in the automotive sector. These components are cumbersome to assemble and join due to their dissimilar materials, and adhesives have become the preferred joining method because of their several benefits over traditional techniques. Moreover, when multi-material components are used in assemblies of, e.g., complete cars or airplanes, an additional obstacle arises, namely how the newly developed component will behave during its lifecycle. Furthermore, increased demands from industry on the accuracy of predicting the behavior of components and their corresponding assemblies push the limits of both computational power and the material models needed to meet these demands. For this reason, it is necessary to develop new and more efficient methodologies to determine the material behavior of, e.g., an adhesive joining two dissimilar materials. In the present study, a methodology is developed to increase the accuracy of determining the material properties of a structural adhesive used in the automotive industry. Simulations are combined with experimental data to find the material properties of individual phases in the adhesive. Material parameters determined in a previous study are used as input and correlation for an optimization strategy based on the Strength Pareto Evolutionary Algorithm (SPEA2). Two conflicting objective functions are evaluated, and 9 model parameters are used in total. In the previous study, the two objective functions were combined into a single objective function, which resulted in one optimal solution. In this study, an estimated Pareto front is generated using the SPEA2 algorithm. Thus, more insight into the material model, the objective functions, the optimal solutions, and the decision space is acquired.
The developed methodology shows promising results: it has been shown that it is in fact possible to increase the accuracy of previously assumed optimal solutions. The results also show that cloud computing can be used effectively to estimate and determine material properties, thereby contributing to more innovative and cost-effective solutions for the industry of tomorrow.
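The core ingredients of the abstract's approach, i.e. two L2-norm misfit objectives and the selection of non-dominated parameter sets, can be illustrated with a minimal sketch. All function names here are hypothetical helpers for illustration, not the study's actual implementation, and the dominance filter below is the generic Pareto test rather than SPEA2's full strength/density-based fitness assignment:

```python
import math

def l2_objective(simulated, experimental, weights=None):
    """Weighted L2-norm misfit between a simulated and an experimental
    curve (hypothetical helper, not the authors' exact objective)."""
    if weights is None:
        weights = [1.0] * len(simulated)
    return math.sqrt(sum(w * (s - e) ** 2
                         for w, s, e in zip(weights, simulated, experimental)))

def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b (minimization):
    no worse in every objective and strictly better in at least one."""
    return (all(a <= b for a, b in zip(f_a, f_b))
            and any(a < b for a, b in zip(f_a, f_b)))

def pareto_front(points):
    """Keep only the non-dominated objective vectors, i.e. the
    estimated Pareto front of the candidate population."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Example: three candidate parameter sets scored on two objectives.
candidates = [(1.0, 3.0), (2.0, 2.0), (3.0, 3.5)]
front = pareto_front(candidates)  # (3.0, 3.5) is dominated and dropped
```

Converting the two objectives to one (as in the earlier study) collapses this front to a single point; keeping both exposes the trade-off curve the abstract refers to.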


QUESTIONS & ANSWERS

Dr. Jamie A. O'Reilly (Participant)

Thank you for your presentation, Dr. Kent. A semantic point: it appears that you have used two weighted-average L2-norm metrics as "fitness" functions, and these are minimized to determine which parameter sets are propagated to the next generation. I wonder if perhaps a revised choice of terminology would be appropriate, given that fitness is usually considered desirable to maximize.

What were your two stopping criteria for the algorithm? These were evidently satisfied by the 54th generation, which produced the optimal combination of 9 parameter values. How long did this algorithm take to run? Did you compare running it on an individual computer vs. cloud computing to measure the performance gain from adding parallelism? 

@30 Apr 20, 04:02 PM
Prof. Kent Salomonsson (Visitor)

Thank you for your comment and suggestion, Dr. Jamie; I see how the choice of wording can be confusing. The wording is meant in the sense of how closely the simulated curve fits the experimental data. Nevertheless, it would also be straightforward to change the objective into a maximization problem.

To answer your questions, I will start with how long it took to run. In total, 40 × 54 simulations were run, with an average runtime of about 2 hours, meaning it would take about 180 days to run on an individual computer. This is why we had to use cloud computing to run as many simulations as possible simultaneously, which reduced the total time to about 6 weeks. It was really a matter of how many computers and licenses we had access to at that time.

The stopping criterion was actually a tolerance that we put on the difference between the best combined "fitness" values for two consecutive generations.
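A criterion of this kind can be sketched as follows; the function name and the default tolerance are illustrative placeholders, not the study's actual settings:

```python
def should_stop(best_fitness_history, tol=1e-6):
    """Stop once the best combined fitness changes by less than `tol`
    between two consecutive generations (illustrative tolerance)."""
    if len(best_fitness_history) < 2:
        return False
    return abs(best_fitness_history[-1] - best_fitness_history[-2]) < tol

# e.g. should_stop([0.9, 0.5]) is False (still improving),
# while should_stop([0.5, 0.5]) is True (converged within tol).
```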

@1 May 20, 12:29 PM
Dr. Jamie A. O'Reilly (Participant)

Thank you, Dr. Kent. Wow, that is a long runtime! And a worthwhile application of cloud computing. I appreciate that the stopping criterion was a threshold applied to the fitness function; my question was concerned with the value of it.

@1 May 20, 04:52 PM
Dr. Jamie A. O'Reilly (Participant)

Please excuse my confusion; now I understand that the stopping criterion was based on the performance improvement between two successive generations. (I am still curious what the value was.)

@1 May 20, 05:28 PM
Prof. Kent Salomonsson (Visitor)

Dr. Jamie, OK, so I managed to extract the data for you. The difference between the two evolutions for the final set was ~2.822e-7, which is roughly 0.3% of the objective function value. You also have to consider that the ranges for each of the 9 variables in the initial population were chosen based on an extensive number of trial-and-error simulations (over 2 years) performed by my colleague and me almost 20 years ago. So, with that, we had already targeted the span of each variable, which reduced the number of evolutions significantly.

Thanks again for your interest in this study and I hope to see you around at RSU soon.

Best regards,

Kent    

@3 May 20, 02:23 PM