Deep Residual Shrinkage Network: An Artificial Intelligence Method for Highly Noisy Data

The Deep Residual Shrinkage Network is an improved variant of the Deep Residual Network. In essence, it is an integration of the Deep Residual Network, attention mechanisms, and soft thresholding functions.

Its working principle can be understood as follows: it uses attention mechanisms to notice unimportant features and sets them to zero with soft thresholding functions; conversely, it notices important features and retains them. In this way, the ability of the deep neural network to extract useful features from noisy signals is improved.

1. Research Motivation

First, when classifying samples, it is often impossible to avoid noise such as Gaussian noise, pink noise, and Laplacian noise. Broadly speaking, samples often contain information that is irrelevant to the current classification task, and this information can also be regarded as noise. Such noise can adversely affect classification performance. (Soft thresholding is a key step in many signal denoising algorithms.)

For example, when talking by the roadside, the audio may contain the sounds of car horns and tires. When performing speech recognition on such signals, the result will inevitably be disturbed by these sounds. From a deep learning perspective, the features corresponding to the horn and tire sounds should be eliminated inside the deep neural network so that they do not affect the speech recognition result.

Second, even within the same dataset, the amount of noise often varies from sample to sample. (This is analogous to attention mechanisms: taking an image dataset as an example, the location of the target object may differ across images, and an attention mechanism can focus on the location of the target object in each individual image.)

For example, when training a cat-and-dog classifier, suppose we have 5 images labeled "dog". The first image may contain a dog and a mouse, the second a dog and a goose, the third a dog and a chicken, the fourth a dog and a donkey, and the fifth a dog and a duck. During training, the classifier will inevitably be disturbed by these irrelevant objects (the mouse, goose, chicken, donkey, and duck), which lowers classification accuracy. If we could notice these irrelevant objects and delete the features they produce, the accuracy of the cat-and-dog classifier could be improved.

2. Soft Thresholding

Soft thresholding is a key step in many signal denoising algorithms. It eliminates features whose absolute values are smaller than a certain threshold and shrinks features whose absolute values are larger than that threshold toward zero. It can be implemented with the following formula:

\[y = \begin{cases} x - \tau & x > \tau \\ 0 & -\tau \le x \le \tau \\ x + \tau & x < -\tau \end{cases}\]

The derivative of the soft thresholding output with respect to its input is:

\[\frac{\partial y}{\partial x} = \begin{cases} 1 & x > \tau \\ 0 & -\tau \le x \le \tau \\ 1 & x < -\tau \end{cases}\]

As shown above, the derivative of soft thresholding is either 1 or 0. This property is the same as that of the ReLU activation function. Therefore, soft thresholding can also reduce the risk of gradient vanishing and gradient exploding in deep learning algorithms.
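To make the formula concrete, here is a minimal NumPy sketch of soft thresholding and its derivative; the function names and test values are illustrative, not from the paper:

```python
import numpy as np

def soft_threshold(x, tau):
    """Shrink x toward zero by tau; values with |x| <= tau become zero."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def soft_threshold_grad(x, tau):
    """Derivative with respect to the input: 1 where |x| > tau, else 0."""
    return (np.abs(x) > tau).astype(float)

x = np.array([-2.0, -0.5, 0.0, 0.3, 1.5])
print(soft_threshold(x, 1.0))       # [-1. -0.  0.  0.  0.5]
print(soft_threshold_grad(x, 1.0))  # [1. 0. 0. 0. 1.]
```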

In the soft thresholding function, the setting of the threshold must satisfy two conditions: first, the threshold must be a positive number; second, the threshold must not be larger than the maximum absolute value of the input signal, otherwise the output would be entirely zero.

Moreover, the threshold should preferably satisfy a third condition: each sample should have its own independent threshold according to its noise content.

The reason is that the noise content often differs across samples. For example, within the same dataset, Sample A may contain little noise while Sample B contains a lot. In that case, when applying soft thresholding in a denoising algorithm, Sample A should use a smaller threshold and Sample B a larger one. Although the features and thresholds inside a deep neural network no longer have a clear physical meaning, the underlying principle remains the same: each sample should have its own independent threshold according to its noise content.

3. Attention Mechanism

Attention mechanisms are relatively easy to understand in the field of computer vision. The visual system of an animal can quickly scan an entire scene, identify the target objects, then focus attention on the target objects to extract more details while suppressing irrelevant information. For more details, please refer to articles on attention mechanisms.

The Squeeze-and-Excitation Network (SENet) is a relatively new deep learning method built on attention mechanisms. For different samples, different feature channels contribute differently to the classification task. SENet uses a small sub-network to learn a set of weights, then multiplies these weights with the features of the corresponding channels to adjust the magnitude of the features in each channel. This process can be understood as applying a weighting to each feature channel.

[Figure: Squeeze-and-Excitation Network]

In this approach, each sample has its own independent set of weights. In other words, the weights of any two samples may differ. In SENet, the path for obtaining the weights is: "Global Pooling → Fully Connected Layer → ReLU Function → Fully Connected Layer → Sigmoid Function".

[Figure: Squeeze-and-Excitation Network]
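As an illustration, a minimal PyTorch sketch of this weight path might look as follows; the 2-D feature shape and the reduction ratio of 16 are assumptions for the example, not part of the description above:

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Sketch of SENet's weight path: Global Pooling -> FC -> ReLU ->
    FC -> Sigmoid, followed by channel-wise re-weighting."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                 # x: (batch, channels, height, width)
        w = x.mean(dim=(2, 3))            # global average pooling -> (batch, channels)
        w = self.fc(w)                    # per-sample, per-channel weights in (0, 1)
        return x * w[:, :, None, None]    # apply the weighting to each feature channel
```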

4. Soft Thresholding with Deep Attention Mechanism

The Deep Residual Shrinkage Network borrows the sub-network structure of SENet to implement soft thresholding under a deep attention mechanism. Through this sub-network (shown in the red box of the figure below), it learns a set of thresholds and applies soft thresholding to each feature channel.

[Figure: Deep Residual Shrinkage Network]

In this sub-network, the absolute values of all features in the input feature map are computed first. Then, after global average pooling and averaging, a single feature is obtained, denoted A. In the other path, the feature map after global average pooling is fed into a small fully connected network. This fully connected network uses a Sigmoid function as its final layer to scale the output to the range between 0 and 1, yielding a coefficient denoted α. The final threshold is α × A. In this way, the threshold is the product of a number between 0 and 1 and the average of the absolute values of the feature map, which ensures that the threshold is positive and not too large.

Moreover, different samples get different thresholds. This can therefore be understood as a special attention mechanism: it notices the features that are irrelevant to the current task, transforms them through the two convolutional layers into values close to zero, and sets them to zero with soft thresholding; or it notices the features that are relevant to the current task, transforms them through the two convolutional layers into values far from zero, and retains them.
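A minimal PyTorch sketch of this threshold sub-network, combined with the soft thresholding formula from Section 2, might look as follows; the layer sizes and reduction ratio are illustrative assumptions, not the paper's exact configuration:

```python
import torch
import torch.nn as nn

class Shrinkage(nn.Module):
    """Sketch of the DRSN sub-network: learn a per-sample, per-channel
    threshold alpha * A, then apply soft thresholding."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                        # x: (batch, channels, H, W)
        abs_mean = x.abs().mean(dim=(2, 3))      # A: average of absolute values
        alpha = self.fc(abs_mean)                # alpha in (0, 1) from FC + Sigmoid
        tau = (alpha * abs_mean)[:, :, None, None]   # threshold = alpha * A
        return torch.sign(x) * torch.relu(x.abs() - tau)  # soft thresholding
```

Note that a threshold of the form α × A automatically satisfies the first two conditions from Section 2: it is always positive, and it cannot exceed the average absolute value of the features.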

Finally, by stacking many of these basic modules together with convolutional layers, batch normalization, activation functions, global average pooling, and a fully connected output layer, the complete Deep Residual Shrinkage Network is constructed.

[Figure: Deep Residual Shrinkage Network]
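A hypothetical sketch of one such basic module and a small stacked network, reusing the Shrinkage module from the previous sketch (channel counts, depth, and kernel sizes are made up for illustration):

```python
class RSBU(nn.Module):
    """Basic residual shrinkage building unit: batch normalization,
    ReLU, two convolutional layers, shrinkage, and an identity shortcut."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            Shrinkage(channels),
        )

    def forward(self, x):
        return x + self.body(x)  # residual shortcut

# Full network: stacked basic modules, then global average pooling
# and a fully connected output layer (here for 10 classes).
net = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    RSBU(16), RSBU(16), RSBU(16),
    nn.BatchNorm2d(16), nn.ReLU(inplace=True),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 10),
)
```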

5. Generalization Capability

The Deep Residual Shrinkage Network is in fact a general feature learning method. This is because, in many feature learning tasks, the samples contain more or less noise and irrelevant information, and this noise and irrelevant information can degrade feature learning performance. For example:

In image classification, if an image contains many other objects, these objects can be regarded as "noise"; the Deep Residual Shrinkage Network may be able to notice this "noise" with its attention mechanism and set the features corresponding to the "noise" to zero with soft thresholding, thereby improving image classification accuracy.

In speech recognition, in noisy environments such as a roadside or a factory, the Deep Residual Shrinkage Network may improve speech recognition accuracy, or at least offer a methodology for doing so.

Reference

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Michael Pecht, Deep residual shrinkage networks for fault diagnosis, IEEE Transactions on Industrial Informatics, 2020, 16(7): 4681-4690.

https://ieeexplore.ieee.org/document/8850096

BibTeX

@article{Zhao2020,
  author    = {Minghang Zhao and Shisheng Zhong and Xuyun Fu and Baoping Tang and Michael Pecht},
  title     = {Deep Residual Shrinkage Networks for Fault Diagnosis},
  journal   = {IEEE Transactions on Industrial Informatics},
  year      = {2020},
  volume    = {16},
  number    = {7},
  pages     = {4681--4690},
  doi       = {10.1109/TII.2019.2943898}
}

Academic Impact

According to Google Scholar, this paper has been cited more than 1400 times.

According to incomplete statistics, the Deep Residual Shrinkage Network (DRSN) has been applied in more than 1000 publications/studies, covering fields such as mechanical engineering, electrical power, vision, healthcare, speech, text, radar, and remote sensing.