Week 6 – A breakthrough!

  As I had desperately hoped, the temporary setbacks that made up last week's major obstacles shed some light on how to move forward with the project after all. After realizing that those problems largely stemmed from the fact that TensorFlow operations return new tensors rather than modifying them in place, I set about combing through the code to … Continue reading Week 6 – A breakthrough!
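For readers unfamiliar with that behavior: TensorFlow tensors are immutable, so an operation like tf.add returns a new tensor instead of modifying its input, and mutable state has to live in a tf.Variable. The snippet below is a minimal illustration of that distinction, not code taken from the project itself.

```python
import tensorflow as tf

# Plain tensors are immutable: an op returns a new tensor rather than
# changing its input, so the result has to be captured explicitly.
x = tf.constant([1.0, 2.0, 3.0])
tf.add(x, 1.0)       # returned tensor is discarded; x is unchanged
y = tf.add(x, 1.0)   # y is a new tensor holding x + 1

# Mutable state belongs in tf.Variable and is updated via assign ops.
w = tf.Variable([1.0, 2.0, 3.0])
w.assign_add(tf.ones_like(w))  # updates the variable's value in place
```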

Week 5 – one step forward, two steps back, one step… forward?

  Before the experiences this summer has presented me with, I would likely have treated initial failure at a particular task as a sign of incompetence. Now, with a renewed desire to do genuinely progressive research, I can honestly say that not only does such initial failure come with the … Continue reading Week 5 – one step forward, two steps back, one step… forward?

What if… it didn’t have to forget? – Week 3

  Featured image by Terence Broad, from http://terencebroad.com/convnetvis/vis.html. This week we went from understanding existing implementations of continual learning with neural networks to identifying, and seeking a solution to, the problems that arose while empirically evaluating those implementations. We have zeroed in on a particular limitation of elastic weight consolidation as a method for allowing … Continue reading What if… it didn’t have to forget? – Week 3
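As background, elastic weight consolidation adds a quadratic penalty that anchors weights deemed important for earlier tasks (as measured by the diagonal of the Fisher information) near their old values. The sketch below is a generic illustration of that penalty; the function name ewc_penalty, the fisher argument, and the lam value are placeholders, and this is not the implementation being evaluated in the post.

```python
import tensorflow as tf

def ewc_penalty(params, old_params, fisher, lam=0.4):
    """Quadratic EWC term: (lam / 2) * sum_i F_i * (theta_i - theta_star_i)**2.

    params     -- current trainable variables
    old_params -- snapshot of the variables after the previous task
    fisher     -- diagonal Fisher information estimates (same shapes as params)
    lam        -- consolidation strength (placeholder value)
    """
    penalty = tf.constant(0.0)
    for p, p_old, f in zip(params, old_params, fisher):
        penalty += tf.reduce_sum(f * tf.square(p - p_old))
    return 0.5 * lam * penalty

# The loss on a new task would then combine the task loss with this penalty:
# total_loss = task_loss + ewc_penalty(model.trainable_variables, old_params, fisher)
```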

Even Deep Neural Networks Forget Sometimes – Week 1

  Thus far, it's been a "hit-the-ground-running" kind of week to kick off my summer research experience, but the deeper I delve into the material, the more excited I become about continuing to do so – and the more I learn about... how to learn. This research is, perhaps more than anything else, an opportunity to become … Continue reading Even Deep Neural Networks Forget Sometimes – Week 1