
Swish: A Self-Gated Activation Function

2 points by goberoi over 7 years ago

1 comment

goberoi over 7 years ago
Why is this interesting? In short: a great new activation function that may challenge the dominance of ReLU.

Longer story:

Today, ReLU is the most popular activation function for deep networks (along with its variants like leaky ReLU or parametric ReLU).

This paper from the Google Brain team is ~2 weeks old, and shows that SWISH, a new activation function, "improves top-1 classification accuracy on ImageNet by 0.9% for Mobile NASNet-A and 0.6% for Inception-ResNet-v2" by simply replacing ReLU with SWISH.

SWISH is equal to x * sigmoid(x), so not that much harder to compute either.
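
To make the formula concrete, here is a minimal sketch comparing Swish and ReLU (assuming NumPy; the helper names are mine, not from the paper):

    import numpy as np

    def sigmoid(x):
        # Standard logistic function
        return 1.0 / (1.0 + np.exp(-x))

    def swish(x):
        # Swish as described in the paper: x * sigmoid(x)
        return x * sigmoid(x)

    def relu(x):
        # Baseline activation for comparison
        return np.maximum(0.0, x)

    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(swish(x))  # smooth, dips slightly below zero for negative inputs
    print(relu(x))   # hard zero for all negative inputs

Unlike ReLU, Swish is smooth and non-monotonic: it goes slightly negative for small negative inputs before approaching zero, which is part of what the paper credits for the accuracy gains.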