Thankfully, the deep learning course professor graciously allowed us to utilize Google Colab for all of the Jupyter notebook assignments instead of the proposed full-week session on setting everything up on GCP (Google Cloud Platform). The course has moved quite rapidly since that first assignment (setting up the environment and a broad overview of AI/ML/DL). The second assignment included demonstrating the differences and similarities between biological neural networks and artificial neural networks. It also required having a basic understanding of linear algebra (matrices, vectors, etc.) and then applying it to several problems. Finally, we were introduced to TensorFlow (an open-source library created by the Google Brain team that is used for large-scale machine learning and deep learning projects). The assignment required utilizing basic TensorFlow in a Jupyter notebook (Google Colab). After that assignment, I decided to take a quick crash course on TensorFlow (TensorFlow: Practical Skills in Constructing, Training, and Optimizing Models) to help support my understanding of this library.
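
Just to give a sense of what "basic TensorFlow" means here, this is a tiny sketch I put together myself (not the actual assignment code) of creating tensors and running a couple of operations:

```python
import tensorflow as tf

# Create constant tensors: a 2x2 matrix and a vector
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([5.0, 6.0])

# Basic operations: matrix-vector multiplication and element-wise squaring
product = tf.linalg.matvec(a, b)   # [17., 39.]
squared = tf.square(b)             # [25., 36.]

print(product.numpy())
print(squared.numpy())
```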

For the next assignment we were introduced to Keras (another open-source, user-friendly Python library used for building and experimenting with deep neural networks; it is known for its simplicity and readability). The assignment required demonstrating an understanding of one-hot encoding and then applying it (using Keras) to the Iris dataset.
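
To illustrate (this is my own minimal sketch, not the assignment solution), one-hot encoding the Iris class labels with Keras looks roughly like this:

```python
from sklearn.datasets import load_iris
from tensorflow.keras.utils import to_categorical

# Load the Iris dataset; y holds integer class labels 0, 1, 2
X, y = load_iris(return_X_y=True)

# One-hot encode the labels: 0 -> [1,0,0], 1 -> [0,1,0], 2 -> [0,0,1]
y_onehot = to_categorical(y, num_classes=3)
print(y[:3], "->", y_onehot[:3])
```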
Next, we had to design an MLP (a fully connected neural network, as covered in the lecture material) using Keras on the Iris dataset. We imported libraries, set the seed (ensuring the random processes in the code are reproducible), loaded the data, obtained basic info on the dataset, performed a train-test split, applied one-hot encoding, ran the model, plotted the results, and, finally, scored the model. I was definitely learning a lot, but the bulk of the code was provided to us and there was a lot of support as we went along, which was good and helpful (I'd be lost otherwise!). It's a good starting point, but I look forward to continuing to apply these skills to other datasets on my own.
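
For my own notes, here's a minimal sketch of that workflow. It's my reconstruction rather than the course-provided code, and the layer sizes, epochs, and other hyperparameters are just assumptions:

```python
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense

# Set seeds so initialization and shuffling are reproducible
np.random.seed(42)
tf.random.set_seed(42)

# Load the data and one-hot encode the three class labels
X, y = load_iris(return_X_y=True)
y = to_categorical(y, num_classes=3)

# Train-test split
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Small fully connected (MLP) network: 4 inputs -> hidden layer -> 3-class softmax
model = Sequential([
    Input(shape=(4,)),
    Dense(16, activation="relu"),
    Dense(3, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# Train, plot the training loss, then score on the held-out test set
history = model.fit(X_train, y_train, epochs=100, batch_size=8, verbose=0)
plt.plot(history.history["loss"])
plt.xlabel("epoch")
plt.ylabel("training loss")
plt.show()

loss, accuracy = model.evaluate(X_test, y_test, verbose=0)
print(f"Test accuracy: {accuracy:.3f}")
```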
The midterm exam covered both theory (open-ended questions demonstrating understanding of the history and concepts of AI, ML, and DL) and application (build, train, and evaluate a deep neural network, an MLP with two layers, using Keras and the Pima diabetes dataset; a sketch of what that looks like is below). So far, this course has quickly covered some difficult and heavy concepts and material (linear algebra, artificial neural networks, etc.). It's an 8-week course on deep learning! I have been very grateful to have NotebookLM to help me summarize the material and make it accessible in a way that helps me better grasp these concepts, and being able to ask as many clarifying questions as I need when I'm not understanding something is super helpful. The most fun has been listening to a podcast of the lecture and materials on my drive to work; it's helped me review the material several times over. Since the model works only with the material I upload (class slides, lecture transcripts, etc.), it stays grounded and is much less likely to hallucinate. The potential of something like NotebookLM for all of K-12 education is revolutionary and transformational!
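Here's roughly what that midterm application task looks like as code. Again, this is my own sketch, and the CSV file name, column name, layer sizes, and training settings are assumptions, not the exam solution:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense

# Load the Pima diabetes data (assumed CSV: 8 numeric feature columns + "Outcome" label)
data = pd.read_csv("pima-indians-diabetes.csv")
X = data.drop(columns=["Outcome"]).values
y = data["Outcome"].values

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Two-layer MLP: one hidden layer plus a sigmoid output for binary classification
model = Sequential([
    Input(shape=(8,)),
    Dense(12, activation="relu"),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

model.fit(X_train, y_train, epochs=100, batch_size=16, verbose=0)
loss, accuracy = model.evaluate(X_test, y_test, verbose=0)
print(f"Test accuracy: {accuracy:.3f}")
```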

