Deep Residual Shrinkage Network: An Artificial Intelligence Method for Highly Noisy Data

The Deep Residual Shrinkage Network is an improved variant of the deep residual network. In essence, it integrates deep residual networks, attention mechanisms, and soft thresholding functions.

The working principle of the Deep Residual Shrinkage Network can be summarized in three steps. First, the network uses attention mechanisms to notice unimportant features. Next, it uses soft thresholding functions to set those unimportant features to zero. Conversely, it notices important features and preserves them. This strengthens the deep neural network's ability to extract useful features from signals that contain noise.

1. Research Motivation

First, when an algorithm classifies samples, noise is unavoidable; examples include Gaussian noise, pink noise, and Laplacian noise. Second, samples often contain a great deal of information that is irrelevant to the classification task, and this irrelevant information can also be regarded as noise. Noise degrades classification performance. (Soft thresholding is a key step in many signal denoising algorithms.)

For example, consider a conversation by the roadside. The audio inevitably contains the sounds of car horns and wheels. When speech recognition is performed on these signals, the background sounds interfere with the results. From a deep learning perspective, the deep neural network should eliminate the features corresponding to the horns and wheels, so that they do not affect the speech recognition results.

Moreover, the amount of noise often varies from sample to sample, even within the same dataset. (This situation is analogous to attention mechanisms. Take an image dataset as an example: the location of the target object varies across images, and an attention mechanism can focus on the specific location of the target object in each image.)

For example, when training a cat-and-dog classifier, suppose five images are labeled "dog". Image 1 contains a dog and a mouse; Image 2 a dog and a goose; Image 3 a dog and a chicken; Image 4 a dog and a donkey; Image 5 a dog and a duck. During training, the irrelevant objects (the mouse, goose, chicken, donkey, and duck) interfere with the classifier and reduce its accuracy. If we can notice these irrelevant objects and eliminate their features, we can improve the accuracy of the cat-and-dog classifier.

2. Soft Thresholding

Soft thresholding is a core step in many signal denoising algorithms. It sets features to zero when their absolute values are smaller than a threshold, and shrinks features toward zero when their absolute values are larger than the threshold. It can be implemented with the following formula:

\[y = \begin{cases} x - \tau & x > \tau \\ 0 & -\tau \le x \le \tau \\ x + \tau & x < -\tau \end{cases}\]
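As a minimal illustration, the piecewise formula above can be written as a small Python function (a sketch, not the authors' implementation):

```python
def soft_threshold(x, tau):
    """Soft thresholding: zero out inputs with |x| <= tau, shrink the rest toward zero by tau."""
    if x > tau:
        return x - tau
    if x < -tau:
        return x + tau
    return 0.0
```

For example, with tau = 1.0, an input of 2.5 shrinks to 1.5, while an input of 0.5 falls inside the dead zone and becomes zero.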

The derivative of the soft thresholding output with respect to its input is:

\[\frac{\partial y}{\partial x} = \begin{cases} 1 & x > \tau \\ 0 & -\tau \le x \le \tau \\ 1 & x < -\tau \end{cases}\]

The formula above shows that the derivative of soft thresholding is either 1 or 0, the same property as the ReLU activation function. Therefore, soft thresholding can also reduce the risk of vanishing and exploding gradients in deep learning algorithms.
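The 1-or-0 derivative, and its parallel with ReLU, can be sketched as (illustrative only):

```python
def soft_threshold_grad(x, tau):
    """Derivative of soft thresholding w.r.t. its input: 1 outside [-tau, tau], 0 inside."""
    return 1.0 if abs(x) > tau else 0.0

def relu_grad(x):
    """Derivative of ReLU: likewise either 1 or 0."""
    return 1.0 if x > 0 else 0.0
```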

In the soft thresholding function, the threshold must satisfy two conditions. First, the threshold must be a positive number. Second, the threshold must not be larger than the maximum absolute value of the input signal; otherwise, the output would be all zeros.
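The second condition can be checked numerically: if the threshold exceeds the largest absolute input value, every element falls into the middle branch and the entire output collapses to zero (a toy example with made-up numbers):

```python
signal = [0.5, -1.2, 2.0, -0.3]
tau = 2.5  # larger than max(|x|) = 2.0, so the second condition is violated
# Apply the piecewise soft thresholding rule to each element.
denoised = [x - tau if x > tau else (x + tau if x < -tau else 0.0) for x in signal]
# denoised is [0.0, 0.0, 0.0, 0.0]: all information in the signal is destroyed
```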

Additionally, the threshold should ideally satisfy a third condition: each sample should have its own independent threshold, according to its noise content.

This is because the noise content varies from sample to sample. For example, within the same dataset, Sample A may contain little noise while Sample B contains heavy noise. In that case, Sample A should use a smaller threshold during soft thresholding, and Sample B a larger one. In deep neural networks, features and thresholds no longer have explicit physical meanings, but the basic underlying logic is the same: each sample should have an independent threshold, determined by its specific noise content.

3. Attention Mechanism

Attention mechanisms are easiest to understand in the field of computer vision. The visual system of an animal can locate a target by quickly scanning an entire area, and then focus its attention on the target object to extract more detail while suppressing irrelevant information. For further details, refer to the literature on attention mechanisms.

The Squeeze-and-Excitation Network (SENet) is a relatively new deep learning method that uses attention mechanisms. In different samples, different feature channels contribute differently to the classification task. SENet uses a small sub-network to learn a set of weights, and then multiplies these weights with the features of the corresponding channels, adjusting the magnitude of the features in each channel. This process can be understood as applying a weighting to each feature channel.

(Figure: Squeeze-and-Excitation Network)

In this way, each sample has its own independent set of weights; in other words, the weights of any two samples can differ. In SENet, the path for obtaining the weights is: "Global Pooling → Fully Connected Layer → ReLU Function → Fully Connected Layer → Sigmoid Function."
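The "Global Pooling → FC → ReLU → FC → Sigmoid" path can be sketched in plain NumPy; the weight matrices `w1`, `w2` and biases `b1`, `b2` here are hypothetical stand-ins for learned parameters:

```python
import numpy as np

def se_block(feature_map, w1, b1, w2, b2):
    """Squeeze-and-Excitation sketch for one sample. feature_map has shape (C, H, W)."""
    squeezed = feature_map.mean(axis=(1, 2))              # global average pooling -> (C,)
    hidden = np.maximum(0.0, w1 @ squeezed + b1)          # fully connected layer + ReLU
    weights = 1.0 / (1.0 + np.exp(-(w2 @ hidden + b2)))   # fully connected layer + Sigmoid
    return feature_map * weights[:, None, None]           # reweight each feature channel
```

Because the weights are computed from the sample's own pooled features, two different samples generally receive two different sets of weights.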

(Figure: Squeeze-and-Excitation Network)

4. Soft Thresholding with Deep Attention Mechanism

The Deep Residual Shrinkage Network borrows the sub-network structure of SENet to implement soft thresholding under a deep attention mechanism. The sub-network (the boxed part of the figure) learns a set of thresholds, and the network then applies soft thresholding to each feature channel using those thresholds.

(Figure: Deep Residual Shrinkage Network)

In this sub-network, the absolute values of all the features in the input feature map are computed first. Then, after global average pooling and averaging, a feature denoted A is obtained. In the other path, the feature map after global average pooling is fed into a small fully connected network. This fully connected network uses a Sigmoid function as its final layer, which normalizes the output to between 0 and 1, yielding a coefficient α. The final threshold is then expressed as α × A. In other words, the threshold is the product of two numbers: a number between 0 and 1, and the average of the absolute values of the feature map. This ensures that the threshold is positive and never too large.
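Assuming per-channel thresholds, the α × A computation might look like the following NumPy sketch (`w1`, `b1`, `w2`, `b2` are hypothetical learned parameters of the small fully connected network):

```python
import numpy as np

def drsn_thresholds(feature_map, w1, b1, w2, b2):
    """Per-channel thresholds tau = alpha * A for one sample. feature_map: (C, H, W)."""
    A = np.abs(feature_map).mean(axis=(1, 2))             # average absolute value per channel
    hidden = np.maximum(0.0, w1 @ A + b1)                 # small fully connected layer + ReLU
    alpha = 1.0 / (1.0 + np.exp(-(w2 @ hidden + b2)))     # Sigmoid -> alpha in (0, 1)
    return alpha * A                                      # 0 < tau < A: positive, never too large
```

With `alpha` bounded in (0, 1), the threshold can never exceed the average absolute value of the channel, satisfying both conditions discussed in Section 2.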

Moreover, different samples yield different thresholds. This method can therefore be understood as a specialized attention mechanism: it notices features that are irrelevant to the current task, transforms them to values close to zero through two convolutional layers, and then sets them to zero with soft thresholding; conversely, it notices features that are relevant to the current task, transforms them to values far from zero through the two convolutional layers, and thereby preserves them.

Finally, stacking many of these basic modules, together with convolutional layers, batch normalization, activation functions, global average pooling, and a fully connected output layer, yields the complete Deep Residual Shrinkage Network.
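Putting the pieces together, one basic module (a residual shrinkage building unit) can be sketched as follows; `conv1`, `conv2`, and `thr_net` are stand-in callables for the learned convolutional layers and the threshold sub-network, and batch normalization and activations are omitted for brevity:

```python
import numpy as np

def soft_threshold(x, tau):
    """Elementwise soft thresholding with per-channel thresholds tau of shape (C,)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau[:, None, None], 0.0)

def shrinkage_unit(x, conv1, conv2, thr_net):
    """One residual shrinkage building unit with an identity shortcut (simplified sketch)."""
    h = conv2(conv1(x))                  # two convolutional layers (stand-in callables)
    tau = thr_net(h)                     # per-channel thresholds from the attention sub-network
    return x + soft_threshold(h, tau)    # soft thresholding, then the residual addition
```

The full network would stack many such units before the global average pooling and fully connected output layers.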

(Figure: Deep Residual Shrinkage Network)

5. Generalization Capability

The Deep Residual Shrinkage Network is in fact a general-purpose method for feature learning. This is because, in many feature learning tasks, samples contain noise as well as irrelevant information, and both degrade feature learning performance. For example:

Consider image classification. A single image may contain many other objects, which can be regarded as "noise". The Deep Residual Shrinkage Network can use its attention mechanism to notice this "noise" and then employ soft thresholding to set the corresponding features to zero, thereby improving image classification accuracy.

Consider speech recognition, especially in noisy environments such as a roadside or a factory workshop. The Deep Residual Shrinkage Network may improve speech recognition accuracy, or at the very least offers a methodology that could improve it.

Reference

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Michael Pecht, Deep residual shrinkage networks for fault diagnosis, IEEE Transactions on Industrial Informatics, 2020, 16(7): 4681-4690.

https://ieeexplore.ieee.org/document/8850096

BibTeX

@article{Zhao2020,
  author    = {Minghang Zhao and Shisheng Zhong and Xuyun Fu and Baoping Tang and Michael Pecht},
  title     = {Deep Residual Shrinkage Networks for Fault Diagnosis},
  journal   = {IEEE Transactions on Industrial Informatics},
  year      = {2020},
  volume    = {16},
  number    = {7},
  pages     = {4681-4690},
  doi       = {10.1109/TII.2019.2943898}
}

Academic Impact

The paper has been cited more than 1,400 times on Google Scholar.

Although complete statistics are unavailable, researchers have applied the Deep Residual Shrinkage Network (DRSN) in more than 1,000 publications and studies, spanning many fields, including mechanical engineering, electrical power, vision, healthcare, speech, text, radar, and remote sensing.