Data Science
Spring 2020 course site

April 17

Questions or comments on anything?

Please do keep working on your own data analysis projects - that's what I think you'll get the most out of at this point.

For today, I figured we'd spend one more session looking at the abstractions that libraries like tensorflow and pytorch - and the text's chapter 19 - use for "deep learning".

Next week I suggest we look at chapter 17, decision trees.

deep learning APIs and concepts

Chapter 19 in the text, "deep learning", recursively implements an object-oriented abstraction of the neural net notions that we talked about last time.

The resulting scratch.deep_learning module is patterned after the ideas in libraries like tensorflow and pytorch.
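To give a feel for the shape of that abstraction, here is a minimal sketch of the kind of classes such a library defines. The names and method signatures below are illustrative - roughly the pattern the book and the big libraries follow - not code copied from either.

```python
from typing import List

Tensor = list   # the book's "tensors" are just (possibly nested) Python lists


class Layer:
    """One piece of a network: pushes values forward, gradients backward,
    and exposes its tunable parameters and their gradients."""
    def forward(self, inputs: Tensor) -> Tensor: ...
    def backward(self, gradient: Tensor) -> Tensor: ...
    def params(self) -> List[Tensor]: return []
    def grads(self) -> List[Tensor]: return []


class Loss:
    """Measures how wrong a prediction is, and the gradient of that
    error with respect to the prediction."""
    def loss(self, predicted: Tensor, actual: Tensor) -> float: ...
    def gradient(self, predicted: Tensor, actual: Tensor) -> Tensor: ...


class GradientDescent:
    """Optimizer: after each backward pass, move every parameter a small
    step in the direction that reduces the loss."""
    def __init__(self, learning_rate: float = 0.1) -> None:
        self.lr = learning_rate

    def step(self, layer: Layer) -> None:
        # simplification: assumes each parameter tensor is a flat list
        for param, grad in zip(layer.params(), layer.grads()):
            for i in range(len(param)):
                param[i] -= self.lr * grad[i]
```

Training is then just a loop: run the layers forward, measure the loss, push the gradients backward, and let the optimizer nudge the parameters.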

With just a little intuition for what's going on underneath, you can (mostly) use these libraries as black boxes to do this kind of modeling.

I'll walk through the text some, show my jupyter notebook that does a simple example and adds a convergence plot, and point towards some tutorials you can look at.
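My notebook uses the book's scratch code, but the black-box version of the same workflow in pytorch looks roughly like this. The XOR data, layer sizes, and learning rate here are made-up choices for illustration; the convergence plot is just the loss recorded at each epoch.

```python
import torch
import torch.nn as nn
import matplotlib.pyplot as plt

# a tiny made-up dataset: the XOR function on two inputs
X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [1.], [1.], [0.]])

# the library as a black box: layers, a loss, an optimizer
model = nn.Sequential(nn.Linear(2, 4), nn.Sigmoid(),
                      nn.Linear(4, 1), nn.Sigmoid())
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1.0)

losses = []
for epoch in range(5000):
    optimizer.zero_grad()           # clear old gradients
    loss = loss_fn(model(X), y)     # forward pass + error
    loss.backward()                 # backward pass (gradients)
    optimizer.step()                # adjust the weights
    losses.append(loss.item())

# convergence plot: the loss should fall towards zero as training proceeds
plt.plot(losses)
plt.xlabel("epoch")
plt.ylabel("loss")
plt.show()
```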

And I've tried to collect some of the code object names and math buzzwords here in one place for you.

deep learning libraries

"scratch" code objects (i.e. abstraction of neural net model)

some data manipulation functions (see the sketch after this list)

optimizer "step" methods that modify the network's weights

loss (i.e. error) functions
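
Since the book represents tensors as nested Python lists, its data manipulation helpers are small recursive functions. Here are my own illustrative versions of two of them, not necessarily the book's exact code:

```python
from typing import Callable, List

Tensor = list   # a tensor here is just a (possibly nested) Python list

def shape(tensor: Tensor) -> List[int]:
    """E.g. shape([[1, 2, 3], [4, 5, 6]]) == [2, 3]."""
    sizes = []
    while isinstance(tensor, list):
        sizes.append(len(tensor))
        tensor = tensor[0]
    return sizes

def tensor_apply(f: Callable[[float], float], tensor: Tensor) -> Tensor:
    """Apply f to every element, keeping the nesting structure."""
    if isinstance(tensor[0], list):
        return [tensor_apply(f, row) for row in tensor]
    return [f(x) for x in tensor]

# example: square every entry of a 2 x 3 tensor
print(tensor_apply(lambda x: x * x, [[1, 2, 3], [4, 5, 6]]))   # [[1, 4, 9], [16, 25, 36]]
```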

my stuff

(BTW, the slides for that talk were done with a jupyter notebook ... google "jupyter slides" if you want to know how.)

https://cs.marlboro.college/cours/spring2020/data/notes/apr17
last modified Sat December 21 2024 4:24 pm