Published in

Physical Review E: Statistical, Nonlinear, and Soft Matter Physics (American Physical Society), 90(5), 2014

DOI: 10.1103/physreve.90.052908

Delay-induced Turing instability in reaction-diffusion equations

Journal article published in 2014 by Tonghua Zhang and Hong Zang
This paper is available in a repository.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

Time delays are commonly used in modeling biological systems and can significantly change their dynamics. A number of works have focused on the effect of small delays on pattern formation in biological systems. In this paper, we investigate the effect of a delay of arbitrary magnitude on the formation of Turing patterns in reaction-diffusion equations. First, for a delay system in general form, we propose a technique for calculating the critical value of the time delay above which a Turing instability occurs. We then apply the technique to a predator-prey model and study the pattern formation induced by the delay. For the model in question, we find that when the time delay is small the system exhibits a uniform steady state or irregular patterns, which are not of Turing type; in the presence of a large delay, however, we find spiral patterns of Turing type. For this model, we also find that the critical delay is a decreasing function of the ratio of the carrying capacity to the half saturation of the prey density.