IBM Release 1.93 manual Optimization, Merit Functions

3.6. Optimization

There are powerful techniques for finding the minimum of smooth functions in a few function evaluations, an important consideration when the merit function requires minutes to hours for each iteration. There are robust techniques for finding the minimum of functions that are noisy, discontinuous, or otherwise badly behaved, which our simulation results tend to be. There are intelligent techniques that have a good prospect of finding a global minimum of a function whose geometry is poorly known, which ours usually are. Unfortunately for us, the intersection of those three groups is the null set.

Because 3-D electromagnetic simulations tend to be slow and noisy, and to produce merit-function landscapes full of multidimensional cliffs and canyons, we have to be realistic about what optimization can give us. Given reasonable computing resources and time to run, POEMS will take your initial guess and improve it automatically. It will explore the immediate neighbourhood, and if after a couple of restarts it converges back to the same place, you can be reasonably sure that there isn’t a significantly better design in the immediate vicinity. If you run out of time and have to stop the optimizer somewhere, the best point in the current simplex is always the best point evaluated so far.

The POEMS optimizer is more like an automatic design-of-experiments machine than an elegant numerical thoroughbred, but it is a reasonable choice for the problem at hand. Work is underway to find a better one, with emphasis on response-surface optimizers, in which the merit function values go into a statistically fitted parameterized surface, and the minimum of that surface is used as the next guess. Pruning the set of points used and choosing the right class of surfaces are the tricky parts.
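The response-surface idea can be sketched in one dimension: fit a quadratic to the merit values sampled so far and take its vertex as the next guess. This is a minimal illustration, not the POEMS implementation; the parameter `w`, the merit values, and the noise level are all made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar design parameter and noisy merit evaluations
# (a parabola with its minimum at w = 1.25, plus simulation "noise").
w = np.linspace(0.8, 1.6, 9)
merit = (w - 1.25) ** 2 + rng.normal(scale=0.01, size=w.size)

# Fit a quadratic response surface to the sampled points...
a, b, c = np.polyfit(w, merit, 2)

# ...and use its minimum (the vertex of the parabola) as the next guess.
w_next = -b / (2.0 * a)
```

Because the fit averages over all the samples, the next guess is far less sensitive to noise in any single merit evaluation than a direct-search step would be; the hard parts, as noted above, are deciding which old points to keep and what class of surface to fit in many dimensions.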

The Nelder-Mead downhill simplex method does not always converge to the minimum, even of smooth functions. When this happens, the reason is that the simplex has closed up on itself in some dimension, so that its hypervolume becomes anomalously small and the search space becomes defective. For this reason, once it has converged, it’s usually a good idea to restart it using the best point and a few random values. The POEMS optimizer does this if requested. On average, Nelder-Mead seems to be the most efficient direct-search algorithm available for the problem at hand—its occasional failure to converge without restarting is more than balanced by its use of only 1 or 2 function evaluations per step (except for shrink steps).
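The restart strategy described above can be sketched with SciPy's Nelder-Mead as a stand-in for the POEMS optimizer. The merit function here is the Rosenbrock function, a classic narrow-valley test case, not a simulation; the restart builds a fresh simplex from the best point plus randomly perturbed vertices so that it spans all dimensions again.

```python
import numpy as np
from scipy.optimize import minimize

def merit(x):
    # Toy stand-in for a simulation-derived merit function.
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

rng = np.random.default_rng(1)
x = np.array([-1.2, 1.0])          # initial guess
for restart in range(3):
    # Initial simplex: best point so far plus two randomly perturbed
    # vertices, so the simplex has nonzero hypervolume in every dimension.
    sim = np.vstack([x, x + rng.normal(scale=0.1, size=(2, 2))])
    res = minimize(merit, x, method="Nelder-Mead",
                   options={"initial_simplex": sim})
    if np.linalg.norm(res.x - x) < 1e-8:
        break                      # reconverged to the same point: accept it
    x = res.x
```

If successive restarts keep returning to the same point, that point is as good a local minimum as the method can find; otherwise each restart rescues a simplex that had collapsed in some dimension.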

3.6.1. Merit Functions

For the optimizer to do anything useful, you have to specify a merit function (or penalty function, for pessimists) that depends on the simulation output. POEMS provides enough flexibility to optimize nearly anything technologically relevant, except that it has no direct support for nonlinear optics (though it may in the future). A few examples are useful.

a. Waveguide Loss

Loss in a waveguide device can be calculated from the flux integrals in the input and output planes:

POSTPROCESS

...
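The POSTPROCESS script above is truncated here, but the underlying arithmetic is just the insertion loss computed from the two flux integrals. A minimal sketch, with made-up flux values standing in for the postprocessor's output:

```python
import math

# Hypothetical Poynting-flux integrals from the simulation output
# (values invented for illustration).
P_in = 1.00e-3    # W, flux through the input plane
P_out = 0.82e-3   # W, flux through the output plane

# Insertion loss in dB; minimizing this merit value minimizes the loss.
loss_dB = -10.0 * math.log10(P_out / P_in)
```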
