Optimization means finding the best solution, which mathematically amounts to locating the global minimum of some cost function. Optimization is traditionally automated with digital or quantum computers, each having its limitations and neither guaranteeing an optimal solution. Here, we conceive a principle behind optimization based on delay-induced bifurcations, which is potentially implementable in non-quantum analog devices. Optimization techniques are often interpreted via a particle moving in a multi-well energy landscape, and to prevent confinement to a non-global minimum they must incorporate mechanisms to overcome the barriers between the minima. In particular, simulated annealing digitally emulates pushing a fictitious particle over a barrier by random noise, whereas quantum computers utilize tunneling through barriers. In our principle, the barriers are effectively destroyed by delay-induced bifurcations. Although bifurcation scenarios in nonlinear delay-differential equations can be very complex and are notoriously difficult to predict, we hypothesize, verify, and utilize the finding that they become considerably more predictable in dynamical systems whose right-hand side depends only on the delayed variable and represents the gradient of some potential energy function. By tuning the delay introduced into the gradient descent setting, one could force the system to spontaneously wander around all minima, thanks to global bifurcations destroying the local attractors. This would be similar to noise-induced behavior in simulated annealing, but achieved deterministically. Ideally, a slow increase and then decrease of the delay should automatically push the system toward the global minimum. We explore the possibility of this scenario and formulate some prerequisites for it.
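In schematic form (the notation below is ours and only illustrates the setting described above, not a formula quoted from the article), the delayed gradient flow at the heart of this principle reads

dx(t)/dt = −∇U(x(t − τ)),

where U is the cost function whose global minimum is sought and τ is the tunable delay. For τ = 0 this is ordinary gradient descent, which halts at the nearest local minimum; increasing τ triggers the bifurcations that allow the state to leave that minimum and visit the basins of the other minima.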
Can optimization problems be solved without either relatively slow digital computers or fast but costly and difficult-to-make quantum computers? Instead of employing complex algorithms or expensive technologies, could we rely on a fast analog device operating in a semi-automatic manner? We conceive and demonstrate a principle behind optimization that uses bifurcations caused by time delay in nonlinear dynamical systems and could potentially be implemented in analog circuits. We hypothesize and verify that, in a special class of delay equations involving the gradient of the cost function, an increase of the delay induces a chain of global bifurcations, which effectively remove the barriers between the minima and enable the system to “explore” the vicinities of all minima. Thus, we propose and demonstrate an alternative way for a candidate optimizer to overcome the barriers. The delay plays the role of random noise in standard algorithm-based optimization settings such as simulated annealing, or of the tunneling effect in quantum computers, and could be promising for obtaining the global minimum. We consider various configurations of a cost function with five minima, together with slightly different forms of the barrier-breaking mechanism, and demonstrate that in many cases the system converges to the global minimum as desired. Testing with real-life problems and hardware implementation of the proposed principle are tasks for the future. Given that none of the existing optimization approaches can fully guarantee the optimal solution, that digital computers can be too slow, and that quantum computers require expensive technologies, our approach could represent an attractive alternative deserving further exploration.
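A minimal numerical sketch of this idea is given below. All names, the five-well toy cost function, and the delay schedule are our own illustrative assumptions, not the configurations or parameters used in the article; the sketch simply integrates the delayed gradient flow with an Euler scheme while the delay is slowly ramped up and back down.

import numpy as np

# Toy one-dimensional cost function with five wells (centres, depths and
# widths are our own illustrative choices, not those studied in the article),
# minimised by the delayed gradient flow x'(t) = -U'(x(t - tau)).
centres = np.array([-4.0, -2.0, 0.0, 2.0, 4.0])
depths  = np.array([0.7, 1.2, 0.8, 0.6, 0.5])   # deepest well is near x = -2
width2  = 0.36                                   # squared width of each well

def dU(x):
    """Derivative of U(x) = 0.05*x**2 - sum_i depths_i*exp(-(x - c_i)**2 / (2*width2))."""
    g = np.exp(-(x - centres) ** 2 / (2.0 * width2))
    return 0.1 * x + np.sum(depths * (x - centres) / width2 * g)

dt, T = 0.01, 300.0
n = int(T / dt)

# Delay schedule: slow increase to tau_max, then slow decrease back to zero,
# playing the role that a cooling schedule plays in simulated annealing.
tau_max = 2.0
tau = np.concatenate([np.linspace(0.0, tau_max, n // 2),
                      np.linspace(tau_max, 0.0, n - n // 2)])

x = np.empty(n)
x[0] = 4.0          # start near the shallowest, non-global minimum

for k in range(1, n):
    d = int(round(tau[k] / dt))           # current delay measured in steps
    xd = x[max(k - 1 - d, 0)]             # delayed state (constant prehistory)
    x[k] = x[k - 1] - dt * dU(xd)         # Euler step of x'(t) = -U'(x(t - tau))

print(f"final state: {x[-1]:.3f}  (global minimum is near x = -2)")

With the delay large enough, none of the wells remains a stable attractor and the state wanders across the landscape; as the delay is reduced again, the minima regain stability and the state settles into one of them. Whether it settles into the global one depends on the landscape and on how slowly the delay is varied, which is exactly the kind of prerequisite the article sets out to formulate.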
Funding
DTA - Loughborough University
Engineering and Physical Sciences Research Council
This is an Open Access Article. It is published by AIP Publishing under the Creative Commons Attribution 4.0 International Licence (CC BY 4.0). Full details of this licence are available at: https://creativecommons.org/licenses/by/4.0/