Week Two – Multi-Layer Networks and NumPy

For the second week, I continued studying neural networks, moving on to multi-layer networks and sigmoid-based classification.

I looked at training algorithms for MLPs and at different output activation functions, such as the softmax function, as well as how the choice of output changes the error function. Further, I focused on the differences between sequential and batch training, and considered minibatches (which I will use) and stochastic gradient descent. One important lesson was understanding when to stop training so the network neither overfits nor underfits.
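To make the softmax and minibatch ideas concrete, here is a minimal sketch I put together of softmax classification trained with minibatch stochastic gradient descent. The toy data, the class boundaries, and all variable names are my own illustration, not from any particular course material; early stopping is omitted to keep it short.

```python
import numpy as np

def softmax(z):
    # subtract the row-wise max for numerical stability before exponentiating
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)

# toy data: 300 points with 2 features, split into 3 linear bands (classes 0..2)
X = rng.normal(size=(300, 2))
y = np.digitize(X[:, 0] + 2 * X[:, 1], [-0.5, 0.5])

W = np.zeros((2, 3))   # weights: 2 inputs -> 3 class scores
b = np.zeros(3)
lr, batch_size = 0.1, 32

for epoch in range(100):
    order = rng.permutation(len(X))            # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]  # one minibatch of indices
        xb, yb = X[idx], y[idx]
        grad = softmax(xb @ W + b)
        # gradient of cross-entropy w.r.t. the scores is (probs - one_hot(y))
        grad[np.arange(len(idx)), yb] -= 1.0
        W -= lr * xb.T @ grad / len(idx)
        b -= lr * grad.mean(axis=0)

acc = (softmax(X @ W + b).argmax(axis=1) == y).mean()
```

Shuffling before each epoch is what makes the minibatch updates stochastic rather than a fixed sweep over the data.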


I also started practicing with NumPy, which I will use for the matrix and array computations involved in neural networks.
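As a small NumPy exercise along these lines, a forward pass through one hidden layer is just matrix products plus broadcast bias vectors. The layer sizes and names below are arbitrary examples of my own:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(42)

# one hidden layer: 4 inputs -> 5 hidden units -> 3 outputs
W1 = rng.normal(scale=0.1, size=(4, 5))
b1 = np.zeros(5)
W2 = rng.normal(scale=0.1, size=(5, 3))
b2 = np.zeros(3)

X = rng.normal(size=(8, 4))        # a batch of 8 examples
hidden = sigmoid(X @ W1 + b1)      # shape (8, 5): product + broadcast bias
output = hidden @ W2 + b2          # shape (8, 3): one row of scores per example
```

The whole batch goes through in two matrix multiplications, with no explicit loop over examples, which is exactly why NumPy is handy here.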

I also started familiarizing myself with Keras, a machine learning library that will help with constructing neural networks.
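For a first feel of the library, a small multi-layer network can be declared in a few lines with Keras's Sequential API (this assumes TensorFlow's bundled Keras; the layer sizes are placeholder choices of mine):

```python
from tensorflow import keras

# a small MLP: 20-dimensional inputs, one hidden layer, 10-class softmax output
model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(16, activation="sigmoid"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

Compared with the raw NumPy version, the weights, biases, and gradient updates are all managed by the library once the layers are declared.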

Finally, I started looking into deep learning, which I hope to finish next week, and, with much anticipation, to find research ideas for my project.
