For the second week, I continued to study neural networks, moving on to multi-layer networks and sigmoid classification.
I looked at algorithms for MLPs and different output activation functions such as the softmax function, as well as how that choice changes the error (loss) function. Furthermore, I focused on the differences between sequential and batch training, and considered minibatches (which I will use) and stochastic gradient descent. One important lesson was understanding when to stop learning so that the network neither overfits nor underfits.
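To make the ideas above concrete, here is a small sketch of a softmax output function and a minibatch iterator in NumPy. This is my own illustrative example, not code from the course material; the batch size and data shapes are arbitrary assumptions.

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max for numerical stability, then normalize.
    e = np.exp(z - np.max(z, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)

def minibatches(X, y, batch_size, rng):
    # Shuffle once per epoch, then yield successive minibatches.
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))
y = rng.integers(0, 2, size=10)

probs = softmax(X)  # each row is a probability distribution summing to 1
for xb, yb in minibatches(X, y, batch_size=4, rng=rng):
    pass            # one stochastic gradient step per minibatch would go here
```

With 10 samples and a batch size of 4, this yields batches of 4, 4, and 2 samples, which is the usual way the last partial minibatch is handled.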
I also started practicing with NumPy, which I will use for the matrix/array operations and other computations involved in neural networks.
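As an example of the kind of NumPy practice this involves, a single dense-layer forward pass is just a matrix product plus broadcasting. The shapes below are illustrative assumptions of mine, not values from the write-up.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))   # 5 samples, 4 input features
W = rng.normal(size=(4, 3))   # weight matrix: 4 inputs -> 3 outputs
b = np.zeros(3)               # bias vector, broadcast across the batch

z = X @ W + b                 # matrix multiplication plus broadcast add
a = 1.0 / (1.0 + np.exp(-z))  # elementwise sigmoid activation
print(a.shape)                # shape of the layer's activations
```

The `@` operator and NumPy's broadcasting rules do most of the work, which is why the library is such a natural fit for neural network code.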
I also began familiarizing myself with Keras, a machine learning library that will help with constructing neural networks.
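In Keras, the multi-layer networks discussed above can be declared quite compactly. This is a minimal sketch under my own assumptions (the layer sizes and input dimension are made up for illustration), not a model from the project itself.

```python
from tensorflow import keras

# A small MLP: one sigmoid hidden layer and a softmax output layer.
# All sizes here are illustrative assumptions.
model = keras.Sequential([
    keras.layers.Input(shape=(4,)),               # 4 input features
    keras.layers.Dense(8, activation="sigmoid"),  # hidden layer
    keras.layers.Dense(3, activation="softmax"),  # 3-class output
])
model.compile(optimizer="sgd", loss="sparse_categorical_crossentropy")
```

Training with `model.fit(X, y, batch_size=..., epochs=...)` would then use minibatch stochastic gradient descent, matching the training scheme described earlier.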
Finally, I started looking into deep learning, which I hope to finish covering next week, and I am excited to start finding research ideas for my project.