[Questions about Machine Learning] Chapter IV: Fundamentals of Deep Learning

Q. What is a neural network?

Q. What is a perceptron? What is a multi-layer perceptron, a.k.a. MLP?

A.

Q. What are the commonly used neural networks?

A. [Figure: commonly used neural network architectures]

For more information, please visit the Asimov Institute.

Q. There are so many deep learning frameworks; which one should I choose?

Q. Why do we need deep neural networks? What are they?

Q. Why is it so hard to train a deep neural network?

Q. What are the differences between machine learning and deep learning?

Q. What are forward propagation and backward propagation, a.k.a. FP and BP?

Q. Still unclear? Are there more examples?

Q. How do you calculate the output of a neural network?

Q. What are hyperparameters?

Q. How do you find the best values for hyperparameters?

Q. Generally, what are the steps to find good hyperparameters?

Q. What is an activation function? Why do we need it?

Q. What are the commonly used activation functions?

A. Sigmoid, tanh, ReLU, Leaky ReLU, softplus, and softmax are some commonly used activation functions.
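
A minimal NumPy sketch of these functions is shown below, assuming their standard textbook definitions; the Leaky ReLU slope `alpha=0.01` and the test vector are illustrative choices only, not values given in the text.

```python
import numpy as np

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + e^-x), squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # tanh squashes inputs into (-1, 1) and is zero-centered
    return np.tanh(x)

def relu(x):
    # ReLU(x) = max(0, x)
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope alpha for negative inputs
    return np.where(x > 0, x, alpha * x)

def softplus(x):
    # softplus(x) = ln(1 + e^x), a smooth approximation of ReLU
    return np.log1p(np.exp(x))

def softmax(x):
    # Exponentiates and normalizes a vector so its entries sum to 1
    z = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return z / np.sum(z)

if __name__ == "__main__":
    x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
    print("sigmoid   :", sigmoid(x))
    print("tanh      :", tanh(x))
    print("ReLU      :", relu(x))
    print("Leaky ReLU:", leaky_relu(x))
    print("softplus  :", softplus(x))
    print("softmax   :", softmax(x))
```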

Q. What are the derivatives of those activation functions?

Q. What properties do these activation functions have?

Q. How do you choose a proper activation function?

Q. What are the advantages of ReLU? Why is it so popular?

Q. Why can softmax be used for multi-class classification?

Q. Why does tanh have a higher convergence rate?

Q. What is batch size? Why do we need it?

Q. What is normalization? Why do we need it?

Q. What is batch normalization? Why do we need it?

Q. What is fine tuning?
