Published in

Elsevier, Pattern Recognition Letters, vol. 116, pp. 8–14

DOI: 10.1016/j.patrec.2018.09.006

Dual Rectified Linear Units (DReLUs): A replacement for tanh activation functions in Quasi-Recurrent Neural Networks

Journal article published in 2018 by Fréderic Godin, Jonas Degrave, Joni Dambre, and Wesley De Neve
This paper was not found in any repository, but could be made available legally by the author.
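The title describes the DReLU as a replacement for the tanh activation in Quasi-Recurrent Neural Networks. Below is a minimal sketch of a dual rectified linear unit, assuming it combines two pre-activation streams as max(0, a) − max(0, b); the function name and example values are illustrative, not taken from the authors' code.

```python
import numpy as np

def drelu(a, b):
    """Dual Rectified Linear Unit (sketch).

    Combines two pre-activation streams element-wise. Unlike a single
    ReLU, the result can be negative (when the second stream dominates)
    and can be exactly zero, which makes it a candidate replacement for
    tanh, whose output also spans negative, zero, and positive values.
    """
    return np.maximum(a, 0.0) - np.maximum(b, 0.0)

# Element-wise over two pre-activation vectors:
a = np.array([1.5, -0.3, 0.0])
b = np.array([0.5, 0.2, -1.0])
print(drelu(a, b))  # positive, negative, and exactly-zero outputs
```

Like tanh, the output is unbounded in sign but, unlike tanh, it is not squashed to (−1, 1), which is the trade-off the paper's title alludes to.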

Full text: Unavailable

Preprint: archiving allowed
Postprint: archiving forbidden
Published version: archiving forbidden
Data provided by SHERPA/RoMEO