18th August 2016 by Mike Heskitt
This guest contribution was first published on Innovation Intelligence and was written by Andy MacKrell, Product Development at MultiMechanics. MultiMech, a composites modeling and analysis tool, is available through the Altair Partner Alliance.
The number of objects an average human can hold in working memory is 7 ± 2.
This tenet was the topic of a famous psychology paper written by psychologist George Miller and is often referred to as Miller's Law. The theory is germane to design engineering, because as more complex engineering systems are developed, designing and optimizing purely by engineering judgment becomes increasingly difficult. Our brains simply can't juggle all of the available design parameters and evaluate the results efficiently…we don't have enough RAM.
One area where this cognitive limitation is apparent is the design and implementation of composite parts and materials. Composites are complex, highly variable systems whose behavior has proven difficult to predict – and thus difficult to design with. Common variables include fiber/matrix mechanical properties, fiber volume fraction, matrix rate dependency, debond strengths, and geometrical arrangements.
While many virtual testing techniques have been developed to help predict the behavior of composite parts, the majority of these methods end up "smearing" the system variables into a single number. Further, the process of finding that smeared number often requires time-consuming reverse-engineering of destructively tested coupons.
This process may satisfy our natural instinct to simplify complex systems, but it simultaneously eliminates our ability to optimize the variables of a composite system and reduces our ability to accurately predict system behavior. The question then becomes, is there a way we can avoid excess physical testing and instead use virtual automation tools to understand our materials and improve our end-products?
Piece of Cake
To use an analogy, let’s say you want to understand and predict the science behind baking a good cake.
There are a number of variables that define a good cake, like the amount of water, the quality of the flour, and the convection of your oven. You could bake 10 cakes and laboriously come up with an empirical formula to predict how various inputs affect the resultant cake. Or you could understand the cake's ingredients well enough to predict how a change in those ingredients will influence the outcome.
If you can define these inputs, you can start to understand which ingredients or processes contribute to favorable or unfavorable cake characteristics. If you can do the latter, then this opens the door to the true power of computers, the ability to iterate and optimize, such that for any given variation of your ingredients, you can reasonably predict how well that cake is going to turn out.
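To make the analogy concrete, here is a toy sketch of "iterate and optimize" over cake ingredients. The quality function, ingredient ranges, and ideal values are all hypothetical placeholders invented for illustration:

```python
import itertools

# Hypothetical quality model: quality peaks at an ideal water amount (ml)
# and oven temperature (C), and falls off quadratically away from them.
def cake_quality(water_ml, temp_c):
    return 100 - 0.01 * (water_ml - 250) ** 2 - 0.02 * (temp_c - 180) ** 2

def best_recipe(water_options, temp_options):
    """Exhaustively evaluate every ingredient combination and keep the best."""
    return max(itertools.product(water_options, temp_options),
               key=lambda combo: cake_quality(*combo))

# Sweep water from 200-300 ml and temperature from 160-200 C
water, temp = best_recipe(range(200, 301, 10), range(160, 201, 5))
```

A computer can run this kind of sweep thousands of times faster than baking ten real cakes, which is the whole point of virtual iteration.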
There are a number of factors that can be modified to potentially improve the properties of a composite material. Conversely, there are variables that should be strictly controlled because their presence degrades a part's performance – variables like inclusion misalignment and the existence of voids in a matrix.
A list of the variables present in composites can be seen in the table below. It should be noted that the multiscale analysis capabilities offered by MultiMech are able to model all of these distinct variables and predict their effects on system behavior, using FEA.
Typically, engineers are modifying these variables while juggling a number of different output constraints like ideal stiffness, strengths, and costs. Because of the number of variables and their interconnectivity, it becomes difficult to manually design an ideal composite.
Let’s look at how to obtain ideal parameters while adhering to Miller’s Law.
HyperStudy is a leading optimization tool that offers a wide range of design, optimization, and data analysis features. Boiled down, optimization tools like HyperStudy operate under the following conditions:
INPUTS – Provide a set of variable parameters and their upper and lower bounds
SOLUTION – Perform some operation using those inputs that generates a single result
MATCH – Try and match that result to a set of pre-defined target values, or try and minimize/maximize any number of result values
ITERATION – Iterate (using a number of smart parameter selection techniques) until that solution converges
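The four steps above can be sketched as a generic loop. The random-search "solver" and the stiffness function below are stand-ins invented for illustration – real tools like HyperStudy use far smarter parameter-selection strategies:

```python
import random

def optimize(solve, target, bounds, iterations=2000, seed=0):
    """Minimal INPUTS -> SOLUTION -> MATCH -> ITERATION loop."""
    rng = random.Random(seed)
    best_x, best_err = None, float("inf")
    for _ in range(iterations):
        # INPUTS: sample each parameter within its upper and lower bounds
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        # SOLUTION: run the (possibly expensive) solver on those inputs
        result = solve(x)
        # MATCH: measure distance from the pre-defined target value
        err = abs(result - target)
        # ITERATION: keep the best candidate seen so far
        if err < best_err:
            best_x, best_err = x, err
    return best_x, best_err

# Hypothetical model: stiffness grows linearly with fiber volume fraction x[0]
best, err = optimize(lambda x: 30 + 200 * x[0], target=80, bounds=[(0.0, 0.6)])
```

Swapping the lambda for a real simulation run is all it takes to plug a physics solver into this loop, which is why solver speed dominates the total optimization time.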
Just as it’s important to deconstruct cake ingredients, it’s wise to look at the key pieces of the optimization process. For the parameter selection, this step dictates that you need an input paradigm flexible enough to take in and work with numerous variables. If your tool requires that your inputs are simplified or smeared, then your outputs will be equally unrevealing. The flexible Finite Element (FE) based approach of MultiMech allows users to specify a wide range of composite input parameters, seen in the table above.
The other important and rate-limiting component is the "iteration" step time. This is the key ingredient to all optimization tools. The time a solver takes to arrive at a solution must be significantly shorter than the time it would take to find an optimal solution manually, because an optimization tool may need 500-5000 iterations before it finds a suitable solution. Thus, a common weakness of composites analysis tools in this space is their inability to quickly generate solutions to complex problems. MultiMech, on the other hand, has developed proprietary approaches to compute the homogenized properties of a virtual composite microstructure and exchange data between multiple scales, making a structural analysis efficient enough for most composite design workflows.
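MultiMech's homogenization approach is proprietary, but the classical rule of mixtures illustrates the basic idea of collapsing fiber and matrix properties into effective stiffnesses for a given volume fraction. The carbon/epoxy values below are typical textbook figures, not data from any specific material system:

```python
def longitudinal_modulus(E_fiber, E_matrix, vf):
    """Voigt (parallel) estimate: E1 = Vf*Ef + (1 - Vf)*Em."""
    return vf * E_fiber + (1 - vf) * E_matrix

def transverse_modulus(E_fiber, E_matrix, vf):
    """Reuss (series) estimate: 1/E2 = Vf/Ef + (1 - Vf)/Em."""
    return 1.0 / (vf / E_fiber + (1 - vf) / E_matrix)

# Typical carbon fiber (230 GPa) in epoxy (3.5 GPa) at 60% volume fraction
E1 = longitudinal_modulus(230.0, 3.5, 0.6)  # fiber-dominated direction
E2 = transverse_modulus(230.0, 3.5, 0.6)    # matrix-dominated direction
```

Even this crude estimate shows why direction matters: the longitudinal stiffness comes out more than an order of magnitude higher than the transverse stiffness, entirely from how the same two constituents are arranged.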
Optimization Use Cases
Since isolating variables is often a wise approach, typical optimization use cases found in the composites industry are as follows.
Fiber manufacturers – Find the ideal length of fibers to meet target strength and weight, while minimizing for cost. Evaluate the ratio of glass-to-carbon fiber in a hybrid reinforcement bundle versus other key mechanical properties
Resin Manufacturers – Understand how particles within the matrix (size, composition, adhesive strength) will affect the properties of your matrix, and how it interacts with other inclusions
Proprietors of woven composites – Find the ideal weave geometry to hit a certain strength target
Designers of mining technologies – Find optimal placement of explosives to promote ideal crack propagation within a heterogeneous medium (like coal or shale rock)
Part manufacturers – Find optimal adhesion characteristics of fiber/resin, which can be modified by the introduction of surface treatments and coatings
3D printers – Print optimized material microstructures in the same part, all with specific properties targeted for that part region.
These are just suggestions, but there are many more potential use cases for optimizing heterogeneous materials. For context, the ultimate closed-loop optimization workflow for an injection-molded composite part would be the following:
Given all possible variables
And the costs to modify each of them:
Cost to control defects
Costs of different materials
Costs of different manufacturing processes
Costs to "model" various geometric features
Find the lowest-cost option that hits a given set of targets.
High Powered Optimization Tool (HyperStudy)
Manufacturing simulation tool (Moldex3D, FiberGraphix, FiberSim, etc.). Moldex3D in particular is adept at various forms of optimization.
Structural / Thermal Composite Analysis (MultiMech) tool capable of:
Ingesting manufacturing Inputs
Using inputs from various sources to drive automated pre-processing at multiple scales
Efficiently using manufacturing inputs to minimize computational costs
Intelligently notifying optimization engine when a manufacturing input yields sub-par results
Outputting simulation results in useful and consolidated manner
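The closed-loop idea above can be sketched as a toy search: enumerate discrete design choices, keep only those that hit the strength target, and pick the cheapest. Every material, process, cost, and strength value below is a hypothetical placeholder, not real supplier data:

```python
import itertools

# Hypothetical catalogs: material -> (strength MPa-equivalent, cost per kg),
# process -> added cost factor. Real workflows would pull these from
# manufacturing and supplier databases.
materials = {"glass": (35, 2.0), "carbon": (90, 9.0)}
processes = {"injection": 1.0, "compression": 2.5}

def cheapest_design(target_strength):
    """Return (total cost, material, process) of the cheapest feasible combo."""
    feasible = []
    for (mat, (strength, mat_cost)), (proc, proc_cost) in itertools.product(
            materials.items(), processes.items()):
        if strength >= target_strength:   # MATCH: meets the target set
            feasible.append((mat_cost + proc_cost, mat, proc))
    return min(feasible) if feasible else None
```

In practice the search space is continuous and far larger, which is why the enumeration is replaced by an optimization engine like HyperStudy driving the manufacturing and structural solvers.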
The workflow for the optimization of a discontinuous fiber reinforced part, using available software tools, can be seen in the following schematic.
In composites engineering, the list of variables is long and interrelated. Whenever a problem has more input variables than favorable outputs (and the stakes are relatively high), you find that each group controlling one variable will claim that their variable is the most important and that they have perfected control of it. Often you encounter guesswork and speculation (best case) or snake oil (worst case).
It's like the sugar producer or oven manufacturer claiming they have engineered their product to solve the 'most pressing' challenge in cake engineering without understanding how the other ingredients work alongside it. In reality, it's up to the baker to understand ALL the ingredients and know how they come together to make something the end user wants to eat. To learn more about how MultiMech or HyperStudy can help, drop us a line.