Column labels below are partly inferred: the fourth column matches each keyword's character count exactly, while the remaining labels are assumed to follow the common keyword-export order (cost per click, competition index, search volume, overall score), as the original header row was not preserved.

| Keyword | CPC ($) | Competition | Search volume | Length (chars) | Score |
|---|---|---|---|---|---|
| rectified linear unit relu | 1.75 | 0.5 | 5688 | 26 | 42 |
| rectified linear unit relu function | 0.34 | 0.5 | 8284 | 35 | 50 |
| rectified linear unit relu : | 1.52 | 0.5 | 7681 | 28 | 49 |
| rectified linear unit relu layer | 0.16 | 0.4 | 6255 | 32 | 97 |
| rectified linear unit relu in deep learning | 0.66 | 0.1 | 4255 | 43 | 5 |
| rectified linear unit relu activation | 0.93 | 0.2 | 6436 | 37 | 4 |
| leaky rectified linear unit relu | 1.14 | 0.5 | 7353 | 32 | 37 |
| rectified linear unit function | 0.51 | 0.9 | 2207 | 30 | 23 |
| rectified linear units relus | 1.9 | 0.6 | 1957 | 28 | 27 |
| the rectified linear unit | 0.76 | 0.5 | 4513 | 25 | 60 |
| rectified linear unit activation function | 1.4 | 0.6 | 136 | 41 | 76 |
| rectified linear unit improve restricted | 1.37 | 0.6 | 131 | 40 | 31 |
| rectified linear units improve | 1.89 | 0.2 | 8473 | 30 | 93 |
| rectified linear unit activation | 0.91 | 0.2 | 5197 | 32 | 61 |
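For context, all of these queries refer to the rectified linear unit activation used in deep networks, ReLU(x) = max(0, x); the leaky variant in the table keeps a small slope on negative inputs rather than zeroing them. A minimal sketch of both, assuming NumPy is available (the function names here are illustrative, not from any particular library):

```python
import numpy as np

def relu(x):
    # Rectified linear unit: element-wise max(0, x).
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small slope `alpha` on negative inputs instead of zero.
    return np.where(x >= 0, x, alpha * x)

# Usage example
x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.  0.  0.  1.5]
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.5]
```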