20170315 deeplearning from_scratch_ch01

TRANSCRIPT

  1. Tomomi Research Inc. Ch01. Python. 2017/03/13 (Mon), Seong-Hun Choe
  2. Tomomi Research Inc. 2017/9/11
  3. Tomomi Research Inc. Class 2017/9/11

     class Man:
         """"""
         def __init__(self, name):
             self.name = name
             print("Initialized!")

         def hello(self):
             print("Hello " + self.name + "!")

         def goodbye(self):
             print("Good-bye " + self.name + "!")

     m = Man("David")
     m.hello()
     m.goodbye()

     C:/Users/SChoe672007022/Dropbox/Python/deep-learning-from-scratch-master/ch01/man.py

     Initialized!
     Hello David!
     Good-bye David!
     Process finished with exit code 0
  4. Tomomi Research Inc. Perceptron -> Neural Network 2017/9/11 (figure label: Hidden Layer)
  5. Tomomi Research Inc. Review: Perceptron 2017/9/11
  6. Tomomi Research Inc. Review: Perceptron: Activation Function 2017/9/11
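     The perceptron reviewed on this slide computes a weighted sum of its inputs plus a bias and fires when the result exceeds zero. A minimal sketch in Python/NumPy; the weight and bias values (0.5, 0.5, -0.7) are illustrative choices that happen to realize an AND gate, not values taken from the slide:

         import numpy as np

         def perceptron(x1, x2):
             """Two-input perceptron: output 1 when w.x + b > 0, else 0."""
             x = np.array([x1, x2])
             w = np.array([0.5, 0.5])   # illustrative weights
             b = -0.7                   # illustrative bias (acts as a negative threshold)
             return 1 if np.sum(w * x) + b > 0 else 0

         # With these parameters the perceptron behaves like an AND gate:
         print([perceptron(i, j) for i, j in [(0, 0), (1, 0), (0, 1), (1, 1)]])  # [0, 0, 0, 1]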
  7. Tomomi Research Inc. Activation Function 2017/9/11: 1. Step Function, 2. Sigmoid Function
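     A minimal sketch of the two activation functions named on this slide, written with NumPy so they accept arrays; the function names are the conventional ones, not taken verbatim from the slide:

         import numpy as np

         def step_function(x):
             """Step function: 1 where x > 0, otherwise 0."""
             return (x > 0).astype(np.int64)

         def sigmoid(x):
             """Sigmoid: squashes any real input into the open interval (0, 1)."""
             return 1 / (1 + np.exp(-x))

         x = np.array([-1.0, 0.5, 2.0])
         print(step_function(x))   # [0 1 1]
         print(sigmoid(x))         # approx. [0.269 0.622 0.881]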
  8. Tomomi Research Inc. Activation Function: The New Star 2017/9/11: 3. ReLU Function (ReLU: Rectified Linear Unit)
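     The ReLU function from this slide is simply max(0, x); a one-line sketch with NumPy:

         import numpy as np

         def relu(x):
             """ReLU: pass positive values through, clip negatives to 0."""
             return np.maximum(0, x)

         print(relu(np.array([-2.0, 0.0, 3.0])))   # [0. 0. 3.]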
  9. Tomomi Research Inc. Matrix Multiplication 2017/9/11
  10. Tomomi Research Inc. Neural Network 2017/9/11

      [X][W] = [Y]:

          ( x_1  x_2 )  ( 1  3  5 )  =  ( y_1  y_2  y_3 )
                        ( 2  4  6 )
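      The product on this slide can be checked with np.dot. The 2x3 weight matrix is the one shown above; the concrete input X = (1, 2) is an assumption for illustration:

          import numpy as np

          X = np.array([1, 2])              # 1x2 input row vector (values assumed)
          W = np.array([[1, 3, 5],
                        [2, 4, 6]])         # 2x3 weight matrix from the slide
          Y = np.dot(X, W)                  # each y_j = x_1 * w_1j + x_2 * w_2j
          print(Y)                          # [ 5 11 17]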
  11. Tomomi Research Inc. 3 Layer Neural Network 2017/9/11: input layer (0th layer), 1st hidden layer (1st layer), 2nd hidden layer (2nd layer), output layer (3rd layer)
  12. Tomomi Research Inc. 3 Layer Neural Network 2017/9/11: original form, then adding a bias, then adding an activation function
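      A minimal sketch of the 3-layer forward pass described on slides 11 and 12: each layer is an affine transform (weights plus bias) followed by an activation function. The layer sizes (2-3-2-2), the weight and bias values, and the identity output activation are assumptions for illustration; sigmoid is used in the hidden layers as on the earlier slides:

          import numpy as np

          def sigmoid(x):
              return 1 / (1 + np.exp(-x))

          def forward(x):
              """Forward pass through an illustrative 2-3-2-2 network."""
              W1 = np.array([[0.1, 0.3, 0.5], [0.2, 0.4, 0.6]]); b1 = np.array([0.1, 0.2, 0.3])
              W2 = np.array([[0.1, 0.4], [0.2, 0.5], [0.3, 0.6]]); b2 = np.array([0.1, 0.2])
              W3 = np.array([[0.1, 0.3], [0.2, 0.4]]);             b3 = np.array([0.1, 0.2])

              z1 = sigmoid(np.dot(x, W1) + b1)    # 1st hidden layer: affine transform + activation
              z2 = sigmoid(np.dot(z1, W2) + b2)   # 2nd hidden layer
              return np.dot(z2, W3) + b3          # output layer (identity activation)

          print(forward(np.array([1.0, 0.5])))    # approx. [0.317 0.696]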
  13. Tomomi Research Inc. Activation function on the output layer 2017/9/11: 1. Regression, 2. Classification
  14. Tomomi Research Inc. Activation function on the output layer 2017/9/11: regression example, predicting a continuous value such as 54.7 kg
  15. Tomomi Research Inc. Activation function on the output layer 2017/9/11: classification example, class probabilities of 95%, 1%, 0.5%, and 3.5%
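      Slides 13-15 contrast the two usual output-layer activations: an identity function for regression (predicting a continuous value such as 54.7 kg) and softmax for classification (producing class probabilities such as 95%, 1%, 0.5%, 3.5%). A minimal sketch; subtracting the maximum before exponentiating is a standard numerical-stability trick rather than something stated on the slide, and the input scores are chosen only so the output roughly matches the percentages above:

          import numpy as np

          def identity_function(a):
              """Regression output: return the raw score unchanged."""
              return a

          def softmax(a):
              """Classification output: turn scores into probabilities that sum to 1."""
              a = a - np.max(a)          # for numerical stability; does not change the result
              exp_a = np.exp(a)
              return exp_a / np.sum(exp_a)

          scores = np.array([3.0, -1.5, -2.2, -0.2])
          print(softmax(scores))         # approx. [0.946 0.011 0.005 0.039], summing to 1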
  16. Tomomi Research Inc. Number of neurons in the output layer 2017/9/11: for reading handwritten digits, the number of output neurons is 10 (0, 1, 2, ..., 9)
  17. Tomomi Research Inc. MNIST 2017/9/11: handwritten digits, forward propagation in a neural network. https://rstudio.github.io/tensorflow/tutorial_mnist_beginners.html (*MNIST: Mixed National Institute of Standards and Technology database)
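      For MNIST the forward pass has 784 inputs (28x28 pixels) and 10 outputs (digits 0-9), and the predicted digit is the index of the largest softmax output. The sketch below uses randomly initialized weights and a random input purely to show the shapes and the argmax step; the hidden-layer sizes (50 and 100) and all parameter values are assumptions, not trained weights:

          import numpy as np

          def sigmoid(x):
              return 1 / (1 + np.exp(-x))

          def softmax(a):
              a = a - np.max(a)
              e = np.exp(a)
              return e / np.sum(e)

          rng = np.random.default_rng(0)
          # Randomly initialized 784-50-100-10 network (shapes only; real use needs trained weights).
          W1, b1 = 0.01 * rng.normal(size=(784, 50)),  np.zeros(50)
          W2, b2 = 0.01 * rng.normal(size=(50, 100)),  np.zeros(100)
          W3, b3 = 0.01 * rng.normal(size=(100, 10)),  np.zeros(10)

          x = rng.random(784)                    # stand-in for one flattened 28x28 image
          z1 = sigmoid(np.dot(x, W1) + b1)
          z2 = sigmoid(np.dot(z1, W2) + b2)
          y = softmax(np.dot(z2, W3) + b3)       # 10 class probabilities
          print(y.shape, int(np.argmax(y)))      # (10,) and the predicted digit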
  18. Tomomi Research Inc. Training with Python 2017/9/11
  19. Tomomi Research Inc. Summary 2017/9/11
      1. Activation functions in a neural network: the sigmoid and ReLU functions
      2.
      3.
      4.
      5. MNIST
