By Thomas Schliesch, Head of Research & Development | Max Baermann GmbH
Personal Experiences from Past to Present
I started with these subjects at the end of the 1980s at a company that was considered small scale at the time. For organizations like that, and even for bigger ones, FEM and other numerical methods were more or less exotic. At the end of the 80s, hardware resources were generally poor and very expensive. I started on an Intel 286 computer with a 40 MB hard drive. Today it often happens in my daily work that a single analysis writes around 10 GB to my hard disk. What a difference!
In those days I did only two-dimensional analyses using FDM, the Finite Difference Method, which seems to have more or less vanished from the software market nowadays. Its disadvantage was that space had to be meshed with regular grids consisting of straight, continuous lines. For systems using Cartesian coordinates, the mesh looked like the pages of an arithmetic exercise book, but with adjustable sizes for the individual rows and columns. The edges of geometric entities such as magnet or iron regions were only allowed to cross the resulting quad elements diagonally or to coincide with the quad edges. As the total number of nodes was restricted to 5,000, the task was to create a grid that included all needed entities adequately on the one hand, and was fine enough to provide reliable magnetic flux density results on the other.
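To give a flavor of what such a regular-grid finite-difference computation looks like, here is a minimal sketch: a 2D Poisson solve for the magnetic vector potential on a Cartesian grid, relaxed by Gauss-Seidel iteration. The geometry, current density, and grid size are invented for illustration only; real problems of the era were capped at roughly 5,000 nodes.

```python
# Minimal regular-grid FDM sketch: solve nabla^2 A = -mu0 * J for the
# vector potential A_z on an n x n Cartesian grid with A = 0 on the
# boundary, using Gauss-Seidel relaxation. All values are illustrative.

MU0 = 4e-7 * 3.141592653589793  # vacuum permeability (H/m)

def solve_poisson(n=41, h=1e-3, J=1e6, iters=800):
    A = [[0.0] * n for _ in range(n)]  # vector potential grid

    def source(i, j):
        # a small square current-carrying region in the grid center
        return J if n // 3 <= i < 2 * n // 3 and n // 3 <= j < 2 * n // 3 else 0.0

    for _ in range(iters):
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                # five-point stencil update for the Poisson equation
                A[i][j] = 0.25 * (A[i-1][j] + A[i+1][j] + A[i][j-1] + A[i][j+1]
                                  + h * h * MU0 * source(i, j))
    return A

A = solve_poisson()
print(f"A_z at grid center: {A[20][20]:.3e}")
```

The rigid row/column structure of the array `A` is exactly the constraint described above: every grid line runs unbroken across the whole domain.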
Managing the meshing forced me to improve my programming skills, as I had to write my own small Fortran programs for every second analysis. Creating an FDM mesh ad hoc was nearly impossible in many cases under the constraints mentioned above, and this step was the most time-consuming one. The other time sink was the solution process itself, which could take up to three days for a nonlinear magnetostatic problem. Sometimes I started the solver on Friday night, and when I came back to my office on Monday the problem was still not solved. The disappointment was certainly big, and the start of the new week already spoiled.
How good were my results? Well, the first magnet I analyzed was an injection-molded ring magnet with a few poles on its outer circumference. And I was caught by the same trap many young (and not so young) people ran into then and still run into nowadays: my assumption about the distribution of magnetization from pole to pole led to completely wrong results, with a deviation of 30 percent between predicted and measured fields near the magnet's outer circumference. How come? I had assumed a homogeneous magnetization within each single pole. The picture below shows what a model much closer to reality predicts for the magnetic polarization, instead of my crude assumption of 1989.
After four years we bought a license of a well-known FEM (Finite Element Method) package. What a relief two-dimensional analyses were then! The main advancement (besides a much faster computer) was that the grid no longer needed to be regular: elements could change their orientation and shape from location to location, quads and triangles could be mixed, and the need for a grid of continuous lines was gone. Programming skills were now demanded for other things, such as the scripting language used to tell the software what to do. A whole run from problem definition to post-processing could be controlled by a single batch file. And to be honest: I still like this method best even today and use it whenever possible.
When trying 3D analyses there was again some disappointment. Error message after error message flooded my screen, mostly saying that this or that volume could not be meshed and that I should try other element sizes, shapes, and so on. Colleagues from other companies told me about spending weeks just meshing specific 3D problems.
This lasted for about one or two years, until the newest version arrived, which was really much more stable. Even nowadays meshing errors can happen, but they are more or less rare.
Another problem was obtaining realistic shapes of the inherent parameter distributions of in-mold-magnetized injection-molded magnets, as in the picture above. I was finally forced to tackle this issue when I needed to predict the torque characteristics of a brushless DC motor for a customer. I subdivided the magnet with a relatively coarse mesh and calculated the resulting orientation and vector sum of magnetization by hand within each single finite element. But I did this only once and swore never to do it manually again. Scripts to automate the procedure were soon developed, and from then on distributions like the one in the picture above could be calculated for pole-oriented magnets.
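The per-element bookkeeping described above can be sketched as follows: subdivide the ring into coarse elements and assign each centroid a magnetization vector from an assumed orientation field. The sinusoidal radial model used here is purely a hypothetical stand-in, not the actual distribution developed in practice.

```python
# Sketch: element-wise magnetization assignment for a multipole ring
# magnet. The sinusoidal pole shape is an assumed, illustrative model.

from math import cos, sin, pi

def element_magnetization(poles=8, n_elems=48, M=1.0):
    """Return (phi, Mx, My) at each element centroid on the mid-radius."""
    out = []
    for k in range(n_elems):
        phi = 2 * pi * (k + 0.5) / n_elems     # centroid angle
        m_r = M * cos(poles / 2 * phi)         # assumed sinusoidal pole shape
        # radial direction resolved into Cartesian components
        out.append((phi, m_r * cos(phi), m_r * sin(phi)))
    return out

mags = element_magnetization()
net_x = sum(m[1] for m in mags)
print(f"net in-plane moment (vanishes by symmetry): {net_x:.2e}")
```

The same loop, fed with a measured or simulated orientation field instead of the cosine model, is essentially what the automation scripts had to do for every element.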
Magnets magnetized by a pulse magnetization process demanded the same information about their distribution of polarization as the in-mold-magnetized ones. This was especially the case for isotropic rare-earth magnets. It required treating the magnetizing coils together with the feeding electrical circuit. In those times, a coupling of spatial FEM elements with electrical circuits was not implemented in the software, but with macro programming it was more or less easy to solve the differential equations of the circuit stepwise in time. The change of flux within the coil had to be recalculated at each time step as well, to update the time-varying inductance of the whole system. An article by Jewell et al. [1] was very helpful for implementing this. I still like to use those old scripts nowadays when I do 2D analyses, because one has full and easy access to every single parameter. For 3D analyses I now prefer the so-called electrical circuit elements, which were introduced later. Very popular at present is the use of system simulation software to model feeding circuits, but for capacitor discharge problems it is not needed, due to their simplicity.
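The stepwise circuit solution mentioned above can be illustrated with a small time-stepping loop for a capacitor discharging into a magnetizing coil. In the real workflow the coil flux, and hence the inductance, came back from the FEM model at every step; here a hypothetical saturation-style function `L(i)` stands in for that, and all component values are invented for the sketch.

```python
# Sketch: stepwise integration of a capacitor-discharge magnetizing
# circuit:  d(psi)/dt + R*i = v_c,   dv_c/dt = -i/C,   psi = L(i)*i.
# L(i) is a made-up stand-in for the FEM-recomputed coil inductance.

def simulate_discharge(C=2e-3, R=0.05, V0=400.0, dt=1e-6, t_end=5e-3):
    def L(i):
        # assumed saturation-like drop of inductance with current
        return 1e-4 / (1.0 + abs(i) / 5e3)

    t, i, v_c, psi = 0.0, 0.0, V0, 0.0
    i_peak = 0.0
    while t < t_end:
        psi += (v_c - R * i) * dt   # flux-linkage update (circuit eq.)
        i = psi / L(i)              # recover current from flux linkage
        v_c -= i / C * dt           # capacitor voltage update
        i_peak = max(i_peak, i)
        t += dt
    return i_peak

print(f"peak magnetizing current ~ {simulate_discharge():.0f} A")
```

Replacing the `L(i)` stub with a call back into the field solver at each step gives exactly the coupling scheme the old macros implemented.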
Once the basis for predicting the internal and external behavior of even complicated magnetic systems had been established, many innovative things could be developed, regarding both the magnets themselves and their manufacturing methods. The small scale company mentioned above is not small anymore and has grown a lot since then. In some part, simulation has made this come true.
In fact, the use of numerical simulation has decreased a bit in my organization during the last years. One reason is that for many new projects one can reuse the results of prior projects and is not forced to reinvent the wheel again and again. Another reason is that a couple of years ago I decided to write software for analytical calculations of permanent magnetic systems. Actually, I did the programming at home and started it only to improve my C++ skills, not to earn any economic success. But it has since become economically successful, and many licenses have been sold so far. The main advantage of such software is that permanent magnetic systems can be calculated easily within a few seconds, and the problem definition takes at most one or two minutes. As a result, around 70 percent of all customer requests can nowadays be answered directly over the phone by our sales people and do not demand any dull FEM simulation.
What has happened with the numerical methods since the start of the new millennium? The manufacturer of the FEM software we currently use has slowed its pace of innovation for electromagnetic simulation considerably. This is at least my personal impression. One reason may be that for many years they have been committed to programming a new GUI. Really new simulation capabilities for electromagnetic calculations have been introduced only sluggishly. But the new GUI looks fine. The contour and vector plots from that software are beautiful and could even win art contests. The new GUI can be combined with the old scripting language, so both worlds meet well.
If one looks at modern FEM software, one thing is remarkable: at first glance most commercially available packages look very similar. On the left side of the screen there is always a kind of object tree, and below or beside the tree input and information dialogs pop up. The rest of the screen is filled with a big window showing the system model, plus a few symbol and menu bars. Defining a problem mainly starts from a CAD module or from external CAD input. The rest of the work is done by specifying the objects on the left side of the screen, for preprocessing, solution, and post-processing alike.
Another observation is that special features, which in former times had to be implemented by the users themselves, are more and more built into the simulation packages. This has its pros and cons. On the one hand, the user does not have to care about theory and does not need any programming or math skills; the respective information is available by simply pushing some buttons or filling a few figures into a dialog mask. On the other hand, the user often has no access to the details of the methods in use and is not even able to judge their adequacy. Examples are calculations depending on hysteresis models or the analysis of demagnetizing effects.
Relying on inadequate assumptions about methods hidden from the user may lead to wrong results in daily work. One way to prevent this, at least partially, are so-called plausibility analyses. For example, when facing a specific type of analysis one is not familiar with, it is advisable to start with a very small simulation that is comparable to the real model in its physical principle. Best is to use a problem that has already been treated somewhere in the literature or that can be handled analytically.
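One classic analytically tractable reference case for such a plausibility check is the on-axis flux density of an axially magnetized cylinder, which has a well-known closed-form solution. The dimensions and remanence below are arbitrary example values; comparing an unfamiliar solver's output against this curve quickly reveals gross modeling errors.

```python
# Plausibility-check reference: closed-form on-axis B_z of an axially
# magnetized cylinder (length L, radius R, remanence Br, SI units),
# evaluated at distance z from the pole face. Example dimensions only.

from math import sqrt

def b_axis(z, Br=1.2, L=0.010, R=0.005):
    return 0.5 * Br * ((z + L) / sqrt((z + L)**2 + R**2)
                       - z / sqrt(z**2 + R**2))

for z_mm in (0.0, 1.0, 2.0, 5.0):
    print(f"z = {z_mm:4.1f} mm -> B_z = {b_axis(z_mm / 1000):6.4f} T")
```

A numerical model of the same cylinder should reproduce these values closely; if it does not, the mesh, boundary conditions, or material definition deserve a second look before the method is trusted on the real problem.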
Constraining the software to a rigid frame with many push buttons and colorful menus and pictures is, on the one hand, fine and impressive. The major argument of a few companies about the easy access to their software has certainly won them a lot of customers. On the other hand, the resources that let users be innovative and walk their own paths should be kept by all means. Luckily, the FEM software I use has not given all those resources away. They still slumber under the hood of the new GUI and can be activated whenever the user wants. Maybe I'm just a bit romantic, and maybe a bit old-fashioned, but as already mentioned above: I still love to run a whole simulation with a batch file of my own writing instead of clicking through object trees.
[1] G. W. Jewell, D. Howe, T. S. Birch, "Simulation of Capacitor Discharge Magnetization," IEEE Transactions on Magnetics, vol. 26, no. 5, p. 1638, Sept. 1990.
About the Author
Thomas Schliesch is the head of Research & Development at Max Baermann GmbH. He graduated in Physics from the University of Hamburg in 1988 and joined Max Baermann GmbH, a well-known German manufacturer of bonded magnets, in 1989. Since 1993 he has been Head of Research & Development there. He has devoted a major part of his work to electromagnetic design and the development of specific methods for bonded permanent magnets. Thomas can be reached at email@example.com.