- Took my first look at Machine Learning for Beginners: An Introduction to Neural Networks by Victor Zhou
- I also began reading Chapter 1 of Neural Networks- A Systematic Introduction.
To-Do For Next Time:
- I would like to begin reading Chapter 2 of Neural Networks
- I also want to begin looking over math concepts needed for machine learning algorithms. Mathematics for Machine Learning should be a good starting place.
- I finished reading over Chapter 2 of Neural Networks. The first half of the material involved a lot of boolean functions and introduced the McCulloch-Pitts neural net model (a quick sketch of that kind of unit is below).
- I also started looking over math for machine learning. I plan on looking over some extra resources to learn some of the multivariable calculus and linear algebra involved. The book assumes a good understanding of some of these topics so it might be useful to watch some videos on them.
- Finally, I started watching the 3Blue1Brown series on Neural Networks. After watching both the first and second videos, I think I will take a look at the book that goes alongside the series to help supplement my learning of concepts such as gradient descent and backpropagation. These two concepts seem to hold a lot of weight in the world of neural networks.
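- To check my understanding of the McCulloch-Pitts unit, here is a minimal sketch of the idea as I currently read it (the function name and threshold values are my own illustration, not code from the book): the unit fires when the sum of its binary inputs reaches a threshold, which is enough to compute boolean functions like AND and OR.

```python
def mcculloch_pitts(inputs, threshold):
    """Fire (return 1) when the sum of binary inputs reaches the threshold."""
    return 1 if sum(inputs) >= threshold else 0

# AND of two inputs: both must be on, so the threshold is 2.
print(mcculloch_pitts([1, 1], threshold=2))  # 1
print(mcculloch_pitts([1, 0], threshold=2))  # 0

# OR of two inputs: any single input is enough, so the threshold is 1.
print(mcculloch_pitts([0, 1], threshold=1))  # 1
```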
To-Do For Next Time:
- Watch episode 3 of 3Blue1Brown and do supplemental reading
- Read chapter 3 of Neural Networks
- Sharpen up on math before going deeper into the Mathematics for Machine Learning book.
- I looked over Chapter 3 of Neural Networks. This chapter introduced the idea of perceptron learning (I sketched my understanding of the update rule below).
- I finished looking at Machine Learning for Beginners: An Introduction to Neural Networks
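- To make the perceptron learning idea from Chapter 3 concrete, here is a minimal sketch of how I understand the update rule (the toy OR dataset, learning rate, and function name are my own, not the book's): whenever a point is misclassified, nudge the weights toward it and repeat until everything is classified correctly.

```python
def perceptron_train(data, labels, lr=1.0, epochs=20):
    """Tiny perceptron learning loop on 2-D points (my own toy sketch)."""
    weights = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for (x1, x2), target in zip(data, labels):
            prediction = 1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0
            error = target - prediction          # -1, 0, or +1
            weights[0] += lr * error * x1        # nudge the boundary toward
            weights[1] += lr * error * x2        # the misclassified point
            bias += lr * error
    return weights, bias

# Toy linearly separable data: the OR function.
data = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 1, 1, 1]
print(perceptron_train(data, labels))
```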
To-Do For Next Time:
- Watch 3Blue1Brown Episode 3
- Continue reading the Ebook material found in 3Blue1Brown's neural net series that will help in coding a basic neural network.
- I finished watching the 3rd episode of 3Blue1Brown's series on neural networks. This video introduces the backpropagation algorithm for deep learning.
- I also read the supplemental Ebook material 3Blue1Brown included in the description of the video series.
To-Do For Next Time:
- Watch 3Blue1Brown Episode 4 and finish ebook material relevant to backpropagation.
- Look over deep learning video series that will help in understanding the concepts in Neural Networks- A Systematic Introduction. Found one series that roughly follows the outline of the book.
- I finished watching the last episode of 3Blue1Brown's neural net series. This video introduced some of the calculus required for backpropagation.
- I also began looking over a Deep Learning video course that follows the outline of Neural Networks- A Systematic Introduction. These videos condense a lot of the information into more manageable chunks that are applicable to our project and our vision for how we want to build the neural net.
To-Do For Next Time:
- Review some of the calculus concepts brought up in 3Blue1Brown's video series.
- Finish Ebook reading and continue watching some of the deep learning videos
- I looked over some calculus concepts such as gradients, partial derivatives, and multivariable functions to better understand the background of gradient descent and backpropagation algorithms. My Calculus III textbook helped quite a bit. (A small gradient descent example is sketched after this list.)
- Finished the Ebook supplemental reading for 3Blue1Brown's series and continued watching some of the deep learning videos up until the perceptron learning video.
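- To check the calculus review against the gradient descent idea, here is a tiny example I wrote myself (not from any of the readings): the gradient of f(x, y) = x^2 + y^2 is (2x, 2y), and repeatedly stepping against it walks toward the minimum at (0, 0).

```python
# Gradient descent on f(x, y) = x^2 + y^2 (my own toy example).
# The gradient is (df/dx, df/dy) = (2x, 2y); stepping in the opposite
# direction moves (x, y) toward the minimum at (0, 0).
x, y = 3.0, -2.0
learning_rate = 0.1
for step in range(50):
    grad_x, grad_y = 2 * x, 2 * y
    x -= learning_rate * grad_x
    y -= learning_rate * grad_y
print(x, y)  # both should be very close to 0
```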
To-Do For Next Time:
- Continue reading Chapter 4 of the Neural Networks book and watch supplemental videos from the deep learning series.
- I watched the first "3Blue1Brown" video giving a general overview of neural nets using a number classification example. Among other concepts, the video covered hidden layers and weights.
- I also read through the first chapter of Machine Learning for Beginners: An Introduction to Neural Networks by Victor Zhou which covered how to assign weights and basic training of a neural net model.
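- As a quick check of the weights idea from Zhou's first chapter, here is a rough sketch of a single neuron feeding forward: a weighted sum of the inputs plus a bias, squashed by a sigmoid. The specific weights, bias, and inputs below are made-up numbers, not necessarily the article's example.

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs plus bias, then the sigmoid activation.
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(total)

print(neuron(inputs=[2.0, 3.0], weights=[0.5, -0.5], bias=1.0))
```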
To-Do For Next Time:
- I plan to read Chapter 1 of Neural Networks: A Systematic Introduction and write out any questions I have
- I read through chapter 1 of Rojas' Neural Networks: A Systematic Introduction which went through the foundation and theory of Neural Networks.
To-Do For Next Time:
- I plan to read Chapter 2 of Neural Networks: A Systematic Introduction and write out any questions I have
- I plan to watch the second video of 3Blue1Brown
- I read through the supplemental reading for 3Blue1Brown's series and looked over the example they reference throughout the video series.
To-Do For Next Time:
- Continue watching the 3Blue1Brown series
- I watched the rest of the 3Blue1Brown videos (2-4), which covered a lot of the math that goes into the neural net model. This includes a lot of calculus, such as the chain rule and derivatives. It also went over gradient descent and hidden layers.
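- To make the chain rule part concrete for myself, here is a tiny example of my own (not from the videos) of how the derivative of the loss with respect to one weight factors into pieces, which seems to be the core idea behind backpropagation.

```python
import math

# One sigmoid neuron, one weight, one training example (all numbers made up).
# Loss L = (sigmoid(w * x) - y)^2, and the chain rule splits dL/dw into
# dL/da * da/dz * dz/dw, where z = w * x and a = sigmoid(z).
def sigmoid(z):
    return 1 / (1 + math.exp(-z))

x, y = 1.5, 0.0   # input and target
w = 0.8           # current weight

z = w * x
a = sigmoid(z)
dL_da = 2 * (a - y)            # derivative of the squared error
da_dz = a * (1 - a)            # derivative of the sigmoid
dz_dw = x                      # derivative of z = w * x with respect to w
dL_dw = dL_da * da_dz * dz_dw  # chain rule: multiply the pieces

print(dL_dw)  # the gradient a gradient descent step would use to update w
```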
To-Do For Next Time:
- Investigate some of the math that goes into the model.
- Skimmed Victor Zhou's Machine Learning for Beginners: An Introduction to Neural Networks
- Started reading the first chapter of the Neural Networks- A Systematic Introduction book we found online
- Found a cool website that has Deep Learning courses that seem very interesting
To-Do For Next Time:
- Figure out if I actually need all the math included in Victor's post and reread it more in-depth
- Finish Chapter 1 of Neural Networks- A Systematic Introduction
- Look into the courses offered in Datacamp and pick one to practice
- Read Machine Learning for Beginners: An Introduction to Neural Networks thoroughly. He mentions experimenting with ML libraries such as Keras, and Datacamp has a course in this.
- Read through the descriptions of the Deep Learning courses I found and decided to get started with Introduction to Deep Learning in Python (a rough guess at what the Keras code will look like is sketched below)
- Finished Chapter 1 of Neural Networks- A Systematic Introduction
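- Since Zhou mentions Keras and the Datacamp course teaches it, here is my rough guess at what a minimal Keras model will look like before I actually reach that part of the course (the layer sizes and the random placeholder data are mine, not the course's).

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Placeholder data: 100 samples with 4 features each and one target value.
X = np.random.rand(100, 4)
y = np.random.rand(100)

# A small fully connected network: two hidden layers and one output node.
model = Sequential()
model.add(Dense(8, activation='relu', input_shape=(4,)))
model.add(Dense(8, activation='relu'))
model.add(Dense(1))

model.compile(optimizer='adam', loss='mean_squared_error')
model.fit(X, y, epochs=10, verbose=0)
```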
Things I need to learn more about based on today's reading
- Partial derivatives and derivatives in general
- Chain rule as it seems to be very important for training the model to have a lower loss
- Gradient Descent (maybe?)
- Taylor series?
To-Do For Next Time
- Continue with the Datacamp course
- Get started on Chapter 2 of Neural Networks- A Systematic Introduction
- Find a good calc crash course
- Watch the 3Blue1Brown videos
- I watched 2.5 videos from the 3Blue1Brown channel on Neural Networks. I now have a better understanding of them since the book does not have many images
- Started Chapter 2 of Neural Networks- A Systematic Introduction but I will need to continue reading later today
To-Do For Next Time (ideally tonight)
- Finish Chapter 2 of Neural Networks- A Systematic Introduction
- Finish watching the 3Blue1Brown videos
- Get started learning calculus
- Continue with the course and make sure to save my progress this time!
- I finished watching the videos from the 3Blue1Brown channel on Neural Networks
Notes about today
- The calculus behind backpropagation seems doable but is still very confusing without a background in calculus
To-Do For Next Time
- Check out the resources mentioned in his videos about linear algebra and calculus and see if they are helpful
- Ask around and see if these math concepts are a must in order to work with neural networks at the level we want to and in the time we have left
- Check out the code that the guy from 3Blue1Brown mentioned in his second video
- I finished reading Chapter 2 of Neural Networks- A Systematic Introduction; turns out discrete math is very important, glad I paid attention in that class
- I'm halfway done with the Datacamp course Introduction to Deep Learning in Python (2 more hours to go according to their time estimate)
Notes and questions about today
- I'm really enjoying the Datacamp course. It reminds me a lot of the videos from 3Blue1Brown, but I enjoy the questions they give after every video and the coding practice for each concept
- Do I need to research more about threshold logic?
- "Usually the integration function g is the addition function" I think this refers to the sum() function I've been using in the Datacamp course code
- Seeing some integrals? derivatives? in 2.5.1, I foresee some nightmares in the next weeks
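- If I'm reading the integration function note right, g is just the step where a node adds up its weighted inputs before the activation function is applied; something like this (my own toy numbers):

```python
# My reading of "the integration function g is the addition function":
# each node sums its weighted inputs before applying an activation.
inputs = [0.5, -1.0, 2.0]
weights = [0.1, 0.4, 0.3]

node_input = sum(w * x for w, x in zip(weights, inputs))  # the "g" step
node_output = max(0, node_input)                          # e.g. a relu activation
print(node_input, node_output)
```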
To-Do For Next Time
- Do at least 1 hour of the Datacamp course
- Start watching math videos and brush up on algebra and precalc before attempting to learn calculus
- I'm now 75% done with Introduction to Deep Learning in Python. I have been spending more time on this than their predicted time since I'm taking notes and researching the functions and libraries that I've seen in the practice code
- Started reading Neural Networks and Deep Learning by Michael Nielsen; the section on perceptrons looks very similar to Chapter 2 of Neural Networks- A Systematic Introduction, but I prefer this explanation as it is clearer and uses the logic gates that we worked with in MAT 310.
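- Since Nielsen's perceptron section uses logic gates, I tried writing one out. If I copied the numbers right, a perceptron with weights of -2 and a bias of 3 behaves like a NAND gate (worth double-checking against the book).

```python
# A perceptron acting as a NAND gate (weights -2, -2 and bias 3,
# as I remember the example from Nielsen's perceptron section).
def perceptron(x1, x2, w1=-2, w2=-2, bias=3):
    return 1 if w1 * x1 + w2 * x2 + bias > 0 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, perceptron(x1, x2))
# Expected outputs: 1, 1, 1, 0 (the NAND truth table)
```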
To-Do For Next Time
- Finish the Datacamp course
- Start working through the code provided in Nielsen's book
- Start reading about Sigmoid neurons on Neural Networks and Deep Learning
- I began looking into the first two chapters of Neural Networks: A Systematic Introduction on the biological paradigm and threshold logic. These chapters covered why neural networks are designed the way they are, with the biological inspiration, and introduced the types of computing units used to build neural networks.
- I also began watching the video series on neural networks by 3Blue1Brown. The videos provided a very conceptual overview of what neural networks are and how they work.
To-Do For Next Time:
- I plan on reading chapter 2 of Neural Networks: A Systematic Introduction more in depth and attempt to finish some of the exercises listed at the end of the chapter. I also plan on beginning to skim through chapter 3 to get an overview of the next topic: weighted networks and the perceptron.
- I began looking into some of the higher math concepts needed to better understand neural networks. This includes partial derivatives and how to take derivatives in 3D space (a small numerical check is sketched after this list).
- I also began reviewing some of the matrix math concepts and applied discrete concepts described in chapter 2 of the Neural Networks: A Systematic Introduction book to make sure that the math makes sense.
- I have skimmed through chapter 3 of our textbook to get a cursory understanding of how networks use weights to help decide which functions to use.
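- As a quick sanity check on the partial derivative review mentioned above, here is a small numerical estimate I wrote for myself: nudge one variable while holding the other fixed and compare against the formula.

```python
# Numerical check of a partial derivative (my own example):
# for f(x, y) = x**2 * y, the partial derivative with respect to x is 2*x*y.
def f(x, y):
    return x ** 2 * y

x, y, h = 2.0, 3.0, 1e-6
numeric = (f(x + h, y) - f(x, y)) / h   # nudge x only, hold y fixed
exact = 2 * x * y
print(numeric, exact)  # both should be close to 12
```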
To-Do For Next Time:
- I plan on reading chapter 3 in depth and understanding more of the concepts described.
- I also plan on looking into the code described in the second 3Blue1Brown video to look at their number recognition network.