Deep Residual Shrinkage Network: An Artificial Intelligence Method for Highly Noisy Data


The Deep Residual Shrinkage Network is an improved variant of the Deep Residual Network. In essence, the Deep Residual Shrinkage Network integrates the Deep Residual Network, attention mechanisms, and soft thresholding functions (the "shrinkage" functions).

The working principle of the Deep Residual Shrinkage Network can be understood as follows. First, the network uses attention mechanisms to identify unimportant features. Then, it uses soft thresholding functions to set these unimportant features to zero. Conversely, it identifies the important features and retains them. This process strengthens the ability of the deep neural network to extract useful features from noisy signals.

1. Research Motivation

First, noise is natural and unavoidable when an algorithm classifies samples. Examples of such noise include Gaussian noise, pink noise, and Laplacian noise. More broadly, samples often contain information that is unrelated to the current classification task. We can regard this irrelevant information as noise as well, and it may degrade classification performance. (Note that soft thresholding is a key step in many signal denoising algorithms.)

For example, imagine a conversation at the roadside. The audio will likely contain the sounds of horns and wheels. Suppose we run speech recognition on these signals. The background sounds will certainly affect the result. From a deep learning perspective, the deep neural network should eliminate the features originating from the horns and wheels, so that these features do not degrade the speech recognition results.

Second, the amount of noise often differs from sample to sample, even within the same dataset. (This variation has similarities with attention mechanisms. Take an image dataset as an example. The location of the target object may differ from picture to picture. Attention mechanisms can focus on the specific location of the target object in each image.)

For instance, imagine we train a cat-and-dog classifier using five images labeled "dog." Image 1 may contain a dog and a mouse. Image 2 contains a dog and a goose. Image 3 contains a dog and a chicken. Image 4 contains a dog and a donkey. Image 5 contains a dog and a duck. During training, these irrelevant objects (the mouse, goose, chicken, donkey, and duck) will confuse the classifier and reduce classification accuracy. If we can identify these irrelevant objects and then eliminate the features associated with them, we can help improve the accuracy of the cat-and-dog classifier.

2. Soft Thresholding

Soft thresholding is a core step in many signal denoising algorithms. The algorithm eliminates features whose absolute values are lower than a specific threshold, and shrinks the features whose absolute values are higher than the threshold toward zero. Researchers implement soft thresholding with the following formula:

\[y = \begin{cases} x - \tau & x > \tau \\ 0 & -\tau \le x \le \tau \\ x + \tau & x < -\tau \end{cases}\]
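As a quick sanity check, the piecewise formula above can be written in a few lines of NumPy (a minimal sketch of ours; the function name and test values are not from the paper):

```python
import numpy as np

def soft_threshold(x, tau):
    # Shrink every value toward zero by tau; anything whose
    # magnitude is below tau is set exactly to zero.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

x = np.array([-2.0, -0.5, 0.3, 1.5])
y = soft_threshold(x, 1.0)  # equals [-1.0, 0.0, 0.0, 0.5]
```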

The derivative of the soft thresholding output with respect to the input is:

\[\frac{\partial y}{\partial x} = \begin{cases} 1 & x > \tau \\ 0 & -\tau \le x \le \tau \\ 1 & x < -\tau \end{cases}\]

The formula above shows that the derivative of soft thresholding is either 1 or 0. This property matches that of the ReLU activation function. Therefore, soft thresholding also reduces the risk of vanishing and exploding gradients in deep learning algorithms.
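This 0-or-1 behavior can be verified numerically with central differences, evaluated away from the kinks at ±τ (a small sketch of ours):

```python
import numpy as np

def soft_threshold(x, tau):
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def numeric_derivative(x, tau, h=1e-6):
    # Central-difference approximation of dy/dx
    return (soft_threshold(x + h, tau) - soft_threshold(x - h, tau)) / (2 * h)

tau = 1.0
outside = numeric_derivative(2.0, tau)  # ~1.0: the gradient passes through
inside = numeric_derivative(0.5, tau)   # ~0.0: the feature is zeroed out
```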

In the soft thresholding function, two conditions apply when setting the threshold. First, the threshold must be a positive number. Second, the threshold must not exceed the maximum absolute value of the input signal; otherwise, the output would be all zeros.

Beyond these, it is better if a third condition is also satisfied: each sample should have its own independent threshold, based on that sample's noise content.

The reason is that the noise content often differs between samples. For example, within the same dataset, Sample A may contain little noise while Sample B contains a lot. In this situation, Sample A should use a smaller threshold during soft thresholding, and Sample B should use a larger one. Although features and thresholds lose their explicit physical definitions inside deep neural networks, the basic logic remains the same: each sample should have an independent threshold, determined by its specific noise content.

3. Attention Mechanism

Researchers understand attention mechanisms most readily in the field of computer vision. The visual systems of animals can distinguish targets by quickly scanning the whole area. The visual system then focuses on the target object, acquiring more detail about it while suppressing irrelevant information. For specifics, please refer to the literature on attention mechanisms.

The Squeeze-and-Excitation Network (SENet) is a relatively recent deep learning method that uses attention mechanisms. For different samples, different feature channels contribute differently to the classification task. SENet uses a small sub-network to obtain a set of weights (learn a set of weights), then multiplies these weights with the features of the respective channels (apply weighting to each feature channel). This operation adjusts the magnitude of the features in each channel. We can think of it as applying different levels of attention to different feature channels.

Squeeze-and-Excitation Network

In this approach, each sample has its own independent set of weights. In other words, the weights of any two samples are different. In SENet, the specific path for obtaining the weights is "Global Pooling → Fully Connected Layer → ReLU Function → Fully Connected Layer → Sigmoid Function".
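The weight path just described can be sketched in NumPy. This is a minimal illustration under our own assumptions: `W1` and `W2` stand for the two fully connected layers (random here, learned in practice), and the channel-reduction ratio is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(0)

def senet_weights(feature_map, W1, W2):
    # Global Pooling -> FC -> ReLU -> FC -> Sigmoid
    squeezed = feature_map.mean(axis=(1, 2))     # one value per channel
    hidden = np.maximum(W1 @ squeezed, 0.0)      # FC + ReLU
    return 1.0 / (1.0 + np.exp(-(W2 @ hidden)))  # FC + Sigmoid, in (0, 1)

channels, reduced = 8, 2
W1 = rng.standard_normal((reduced, channels))  # hypothetical learned weights
W2 = rng.standard_normal((channels, reduced))
fmap = rng.standard_normal((channels, 6, 6))

w = senet_weights(fmap, W1, W2)         # one weight per channel
recalibrated = fmap * w[:, None, None]  # apply weighting to each feature channel
```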

Squeeze-and-Excitation Network

4. Soft Thresholding with Deep Attention Mechanism

The Deep Residual Shrinkage Network borrows the structure of the SENet sub-network to implement soft thresholding under a deep attention mechanism. This sub-network (the part marked with the red box) learns a set of thresholds (learn a set of thresholds), and the network then applies soft thresholding to each feature channel using these thresholds.

Deep Residual Shrinkage Network

In this sub-network, the system first computes the absolute values of all features in the input feature map. It then applies global average pooling (averaging) to obtain a feature, which we call A. Along another path, the system feeds the feature map after global average pooling into a small fully connected network. This fully connected network uses a Sigmoid function as its final layer, which normalizes the output to between 0 and 1, producing a coefficient we call α. The final threshold can then be written as α × A. Thus, the threshold is the product of two numbers: one lies between 0 and 1, and the other is the average of the absolute values of the feature map. This method ensures that the threshold is positive and not too large.
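The threshold computation described above can be sketched as follows (our own simplified NumPy illustration; `W1` and `W2` stand in for the small fully connected network, with random rather than learned values):

```python
import numpy as np

rng = np.random.default_rng(1)

def soft_threshold(x, tau):
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def drsn_thresholds(fmap, W1, W2):
    # A: global average of the absolute values, per channel
    A = np.abs(fmap).mean(axis=(1, 2))
    # alpha: output of a small FC network, squashed to (0, 1) by a sigmoid
    hidden = np.maximum(W1 @ A, 0.0)
    alpha = 1.0 / (1.0 + np.exp(-(W2 @ hidden)))
    # threshold = alpha * A: always positive, never larger than A
    return alpha * A

channels = 4
W1 = rng.standard_normal((channels, channels))  # hypothetical learned weights
W2 = rng.standard_normal((channels, channels))
fmap = rng.standard_normal((channels, 5, 5))

tau = drsn_thresholds(fmap, W1, W2)  # one threshold per channel
denoised = soft_threshold(fmap, tau[:, None, None])
```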

Moreover, different samples yield different thresholds. As a result, we can interpret this method as a specialized attention mechanism: it identifies the features irrelevant to the current task, transforms them toward values close to zero through the two convolutional layers, and then sets them to zero with soft thresholding. Conversely, it identifies the features relevant to the current task, transforms them toward values far from zero through the two convolutional layers, and finally preserves them.

Finally, we stack a certain number of these basic modules (stack many basic modules), together with convolutional layers, batch normalization, activation functions, global average pooling, and a fully connected output layer. This process builds the complete Deep Residual Shrinkage Network (including the identity path).
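The stacking step can be sketched roughly as follows. This is a heavily simplified illustration of ours: the two convolutional layers are replaced by 1×1 channel-mixing matrix multiplies, batch normalization is omitted, and all weights are random rather than learned:

```python
import numpy as np

rng = np.random.default_rng(2)

def soft_threshold(x, tau):
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def channel_thresholds(fmap, W1, W2):
    # threshold = alpha * A, as in the sub-network described above
    A = np.abs(fmap).mean(axis=(1, 2))
    hidden = np.maximum(W1 @ A, 0.0)
    alpha = 1.0 / (1.0 + np.exp(-(W2 @ hidden)))
    return alpha * A

def shrinkage_module(x, Wc1, Wc2, Wt1, Wt2):
    # One basic module: two 1x1 "convolutions" (channel-mixing matmuls
    # standing in for real conv layers), channel-wise soft thresholding
    # with learned thresholds, plus the identity shortcut.
    h = np.maximum(np.einsum('oc,chw->ohw', Wc1, x), 0.0)
    h = np.einsum('oc,chw->ohw', Wc2, h)
    tau = channel_thresholds(h, Wt1, Wt2)
    return x + soft_threshold(h, tau[:, None, None])

channels = 4
params = [tuple(rng.standard_normal((channels, channels)) * 0.1 for _ in range(4))
          for _ in range(3)]

# Stack several basic modules; the real network would add batch norm,
# global average pooling, and a fully connected output layer on top.
out = rng.standard_normal((channels, 5, 5))
for Wc1, Wc2, Wt1, Wt2 in params:
    out = shrinkage_module(out, Wc1, Wc2, Wt1, Wt2)
```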

Deep Residual Shrinkage Network

5. Generalization Capability

The Deep Residual Shrinkage Network is a general method for feature learning. The reason is that, in many feature learning tasks, samples often contain noise as well as irrelevant information, and this noise and irrelevant information can affect the performance of feature learning. For example:

Consider image classification. A single picture may contain many different objects, which we can regard as "noise." The Deep Residual Shrinkage Network might use its attention mechanism to notice this "noise," and then use soft thresholding to set the features corresponding to it to zero. This may improve image classification accuracy.

Consider also speech recognition, particularly in fairly noisy environments such as a roadside conversation or a factory workshop. The Deep Residual Shrinkage Network may improve speech recognition accuracy there. At the very least, the network offers a methodology capable of improving speech recognition accuracy.

6. Academic Impact

This paper has received more than 1,400 citations on Google Scholar.

According to incomplete statistics, researchers have applied the Deep Residual Shrinkage Network (DRSN) in more than 1,000 publications and studies. These applications cover a wide range of fields, including mechanical engineering, electrical power, vision, healthcare, speech, text, radar, and remote sensing.

Reference

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Michael Pecht, Deep residual shrinkage networks for fault diagnosis, IEEE Transactions on Industrial Informatics, 2020, 16(7): 4681-4690.

https://ieeexplore.ieee.org/document/8850096

BibTeX

@article{Zhao2020,
  author    = {Minghang Zhao and Shisheng Zhong and Xuyun Fu and Baoping Tang and Michael Pecht},
  title     = {Deep Residual Shrinkage Networks for Fault Diagnosis},
  journal   = {IEEE Transactions on Industrial Informatics},
  year      = {2020},
  volume    = {16},
  number    = {7},
  pages     = {4681-4690},
  doi       = {10.1109/TII.2019.2943898}
}