March 23, 2018
Slide overview
Deep Learning JP: http://deeplearning.jp/seminar-2/
DL paper-reading group (輪読会) material
DEEP LEARNING JP [DL Papers]
Squeeze-and-Excitation Networks
Koichiro Tamura, Matsuo Lab
http://deeplearning.jp/
Paper Information
• Squeeze-and-Excitation Networks (Jie Hu, Li Shen, Gang Sun)
• arXiv:1709.01507 (2017-09-05); winner of the ILSVRC 2017 classification task
Outline
• Abstract
• ILSVRC
• Introduction
• Related Work
• Techniques
• SENet / Squeeze-and-Excitation Blocks
• Result
• Implementation
Abstract
• Proposes the Squeeze-and-Excitation (SE) block, which adaptively recalibrates channel-wise feature responses by explicitly modeling interdependencies between channels
• SE blocks can be added to existing CNN architectures at small additional cost
• SENet won 1st place in the ILSVRC 2017 classification task, setting a new state of the art (SOTA)
ILSVRC
• ImageNet Large Scale Visual Recognition Challenge: an annual competition for image classification and related tasks
• SENet won the 2017 classification task
• Overview: http://image-net.org/challenges/talks_2017/ILSVRC2017_overview.pdf
Introduction
• Much prior work strengthens CNNs by improving spatial encoding: deeper networks (ResNet), multi-scale processing (Inception), denser connectivity (DenseNet)
• SENet instead focuses on the relationship between channels: it learns to emphasize informative feature channels and suppress less useful ones
• This channel-wise recalibration is implemented as a lightweight SE block that can be inserted into existing modules (Inception, residual, etc.)
Related Work: timeline of CNN architecture modules (2014–2018)
• Inception-v1 Module (2014-09-17) → v2 → v3 → v4
• Residual Module (2015-12-10) → Pre-act ResNet (2016-03-16)
• DenseNet (2016-08-25)
• Xception Module (2016-10-07), Pyramid Net (2016-10-10)
• ResNeXt Module (2016-11-16)
• Residual-Attention Network (2017-04-23), DPN (2017-07-06)
• SENet (2017-09-05)
• Shake Drop (2018-02-15)
Inception Module
• Introduced in GoogLeNet (Inception-v1)
• Runs 1×1, 3×3, and 5×5 convolutions and 3×3 max pooling in parallel and concatenates their outputs along the channel axis
• 1×1 convolutions reduce the channel dimension before the larger convolutions, keeping computation tractable
• Image: https://github.com/leetenki/googlenet_chainer/raw/master/sample_images/2.jpg
Residual Module
• Learns a residual function F(x) and outputs F(x) + x through an identity skip connection
• The skip connection eases optimization of very deep networks by letting gradients flow through the identity path
• A typical block is Conv → BN → ReLU → Conv → BN, followed by the addition of the identity
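As a minimal sketch (not the paper's code), the forward pass of a residual block can be written in NumPy; `transform` here is a stand-in for the block's learned layers F(x):

```python
import numpy as np

def residual_block(x, transform):
    """Forward pass of a residual block: output = F(x) + x.

    `transform` stands in for the learned layers (Conv-BN-ReLU-Conv-BN);
    the identity skip connection adds the input back unchanged.
    """
    return transform(x) + x

# Toy example: F(x) is a fixed linear map instead of learned convolutions.
x = np.ones(4)
out = residual_block(x, lambda v: 0.5 * v)  # each element: 0.5 + 1.0 = 1.5
```

Even if `transform` learns nothing (outputs zero), the block reduces to the identity, which is what makes very deep stacks trainable.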
DenseNet
• Within a dense block, each layer receives the concatenated feature maps of all preceding layers as input
• Concatenation (rather than addition) encourages feature reuse, improves gradient flow, and keeps parameter counts small
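A minimal sketch of the dense-connectivity pattern (illustrative only; real DenseNet layers are BN-ReLU-Conv operating on 4-D tensors):

```python
import numpy as np

def dense_block(x, layers):
    """Dense connectivity: each layer sees the concatenation of all
    previous feature maps (along the channel axis) and appends its own."""
    features = [x]
    for layer in layers:
        out = layer(np.concatenate(features, axis=0))  # channel axis = 0 here
        features.append(out)
    return np.concatenate(features, axis=0)

x = np.ones(2)                                # 2 input "channels"
layers = [lambda f: f[:1], lambda f: f[:1]]   # each toy layer emits 1 "channel"
y = dense_block(x, layers)                    # 2 + 1 + 1 = 4 output "channels"
```

The channel count grows by each layer's output size (the "growth rate"), which is why DenseNet keeps per-layer outputs narrow.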
Attention Module (Residual Attention Network)
• Stacks attention modules that generate soft masks used to weight the feature maps of a trunk branch
• Reference: https://www.slideshare.net/DeepLearningJP2016/dl-residual-attention-network-for-image-classification
Techniques: ResNet
• Review of the residual (skip) connection, the building block SENet most commonly extends
• Reference: https://deepage.net/deep_learning/2016/11/30/resnet.html
Techniques: Normalization
• Batch Normalization: normalizes each feature over the mini-batch, stabilizing and accelerating training
  Reference: https://deepage.net/deep_learning/2016/10/26/batch_normalization.html
• Layer Normalization: normalizes over the features of each individual sample, so it is independent of batch size
  Reference: https://www.slideshare.net/KeigoNishida/layer-normalizationnips
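Training-time batch normalization can be sketched in a few lines of NumPy (omitting the running statistics used at inference):

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Training-time batch normalization for a (batch, features) array:
    normalize each feature over the batch, then scale and shift."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0], [3.0, 4.0]])
y = batch_norm(x)
# Each column now has (approximately) zero mean and unit variance.
```

Layer normalization is the same computation with `axis=1` (statistics per sample instead of per feature), which is why it works for batch size 1.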
Techniques: Regularization
• Deep Networks with Stochastic Depth: randomly drops entire residual blocks during training and uses the full network (with scaled branches) at test time
• Shake-shake regularization: combines parallel residual branches with random convex weights during training
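A minimal sketch of the stochastic-depth idea described above (assumed simplified behavior, not the authors' code): during training the residual branch survives with probability p; at test time it is always used but scaled by p:

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_depth_block(x, transform, survival_prob, training):
    """Residual block with stochastic depth.

    Training: keep the residual branch with probability `survival_prob`,
    otherwise pass the identity through unchanged.
    Test: always use the branch, scaled by `survival_prob`.
    """
    if training:
        if rng.random() < survival_prob:
            return x + transform(x)
        return x
    return x + survival_prob * transform(x)

x = np.ones(3)
y_test = stochastic_depth_block(x, lambda v: v, survival_prob=0.8, training=False)
# At test time each element is 1 + 0.8 * 1 = 1.8.
```

Shake-shake goes further by also randomizing the branch weights differently in the forward and backward passes, which requires autograd and is not shown here.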
Techniques: Shake-Shake (forward/backward)
• The random branch weights differ between the forward pass (α) and the backward pass (β), which injects noise into the gradients as well as the activations
• Reference: https://qiita.com/yu4u/items/a9fc529c85534eca11e5#fn1
SENet (proposed method)
SENet
• A network built by stacking SE blocks on a strong base architecture (the ILSVRC entry used a modified ResNeXt)
• Achieved a 2.251% top-5 error on the ILSVRC 2017 classification test set, a ~25% relative improvement over the 2016 winning entry
Squeeze-and-Excitation Blocks
• Given a transformation F_tr : X → U (e.g. a convolution) producing features U ∈ R^{C×H×W}, the SE block recalibrates U channel by channel
• Squeeze: aggregate the spatial information of each channel into a single descriptor
• Excitation: compute a weight per channel from the descriptor and rescale U with it
Squeeze: Global Information Embedding
• Global average pooling compresses each H×W feature map into one scalar per channel:
  z_c = (1 / (H·W)) Σ_{i=1}^{H} Σ_{j=1}^{W} u_c(i, j)
Excitation: Adaptive Recalibration
• A bottleneck of two fully connected layers maps the descriptor z to per-channel weights: s = σ(W₂ δ(W₁ z)), where δ is ReLU and σ is the sigmoid
• W₁ ∈ R^{(C/r)×C} reduces the channel dimension by the reduction ratio r (r = 16 in the paper) and W₂ ∈ R^{C×(C/r)} restores it
• The block output rescales each channel: x̃_c = s_c · u_c
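The squeeze and excitation steps above can be sketched in NumPy (random illustrative weights, not trained parameters; toy sizes with r = 2, whereas the paper uses r = 16):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def se_block(u, w1, w2):
    """Squeeze-and-Excitation on a (C, H, W) feature map.

    Squeeze:    global average pooling -> channel descriptor z of shape (C,).
    Excitation: s = sigmoid(W2 @ relu(W1 @ z)), one weight in (0, 1) per channel.
    Output:     each channel of u rescaled by its weight s_c.
    """
    z = u.mean(axis=(1, 2))                     # squeeze: (C,)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))   # excitation: (C,)
    return u * s[:, None, None]                 # channel-wise rescaling

C, H, W, r = 4, 2, 2, 2                         # toy sizes; the paper uses r = 16
rng = np.random.default_rng(0)
u = rng.normal(size=(C, H, W))
w1 = rng.normal(size=(C // r, C))               # bottleneck: C -> C/r
w2 = rng.normal(size=(C, C // r))               # restore:    C/r -> C
out = se_block(u, w1, w2)                       # same shape as u
```

Because the sigmoid keeps every weight in (0, 1), the block can only attenuate channels relative to each other; the network learns which channels to preserve.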
Squeeze-and-Excitation Blocks: integration
• SE-Inception: the SE block is applied to the output of each Inception module
• SE-ResNet: the SE block is applied to the residual branch before the identity addition
Result
• Adding SE blocks consistently lowers ImageNet error across base architectures (ResNet, ResNeXt, Inception, and others)
• The gains come at small additional cost: SE-ResNet-50 needs only ~0.26% more GFLOPs than ResNet-50
• The final SENet ensemble reached a 2.251% top-5 error in ILSVRC 2017
Implementation
• An SE block is simple to implement: global average pooling, two fully connected layers (ReLU, then sigmoid), and a channel-wise multiplication
• The added FC layers are tiny relative to the convolutions, so the extra parameters and computation are small
• Reference (ImageNet training tips, in Japanese): http://iwiwi.hatenadiary.jp/entry/2016/12/31/162059
Discussion
• The SE block is a simple, architecture-agnostic mechanism for channel-wise feature recalibration
• It improves accuracy at small additional cost and integrates cleanly with existing modules
References
• Jie Hu, Li Shen, Gang Sun. Squeeze-and-Excitation Networks. arXiv:1709.01507
• Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger. Densely Connected Convolutional Networks
• Fei Wang et al. Residual Attention Network for Image Classification
• https://www.slideshare.net/DeepLearningJP2016/dl-residual-attention-network-for-image-classification
• https://deepage.net/deep_learning/2016/11/30/resnet.html
• https://deepage.net/deep_learning/2016/10/26/batch_normalization.html
• https://www.slideshare.net/KeigoNishida/layer-normalizationnips
• Gao Huang et al. Deep Networks with Stochastic Depth
• Xavier Gastaldi. Shake-Shake regularization
• Yoshihiro Yamada et al. ShakeDrop regularization
• https://qiita.com/yu4u/items/a9fc529c85534eca11e5
• http://iwiwi.hatenadiary.jp/entry/2016/12/31/162059