#garfield Logs
Oct 16 2023
02:09 AM polprog: rue_shop3, do you know derivatives?
02:21 AM polprog: df/dx and so on
02:21 AM polprog: i have some notes and examples from the NN class on backprop
02:21 AM polprog: im in class right now actually :D
07:03 AM polprog: so i finally learnt what back propagation is
07:03 AM polprog: basically, you take your error function and, using its derivatives, you adjust the coefficients
07:04 AM polprog: we did it on one neuron for now, fitting a line to a cloud of points
07:04 AM polprog: i have notes and code, ill copy them to a text file for you this evening
11:33 AM polprog: not really
11:33 AM polprog: you have no function to begin with
11:33 AM polprog: you have a series of points that resemble a line
11:33 AM polprog: but they are not on the same line, they are close
11:34 AM polprog: and you have to pick a line that fits
11:34 AM polprog: linear regression
11:34 AM polprog: so there are various tricks to find the parameters of that line (y=ax+b, find a and b) and one of the ways is to use a gradient method
11:34 AM polprog: which is what the single neuron does
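[editor's note: the gradient method polprog describes, fitting y = ax + b to a point cloud with a single linear neuron, can be sketched as below. This is an assumption about the approach, not the actual code from polprog's notebook; the line y = 2x + 1, the learning rate, and the epoch count are all illustrative choices.]

```python
# Minimal sketch: one "neuron" y = a*x + b, trained by gradient descent
# on the mean squared error over a noisy point cloud.
import random

random.seed(0)

# Cloud of points near the (assumed) line y = 2x + 1, with small noise
xs = [i / 10 for i in range(50)]
ys = [2 * x + 1 + random.uniform(-0.1, 0.1) for x in xs]

a, b = 0.0, 0.0   # coefficients to learn
lr = 0.05         # learning rate
n = len(xs)

for epoch in range(2000):
    # E      = (1/n) * sum (a*x + b - y)^2
    # dE/da  = (2/n) * sum (a*x + b - y) * x
    # dE/db  = (2/n) * sum (a*x + b - y)
    da = sum((a * x + b - y) * x for x, y in zip(xs, ys)) * 2 / n
    db = sum((a * x + b - y) for x, y in zip(xs, ys)) * 2 / n
    # step against the gradient
    a -= lr * da
    b -= lr * db

print(a, b)  # converges close to a=2, b=1
```

This is exactly "backprop on one neuron": the error's derivatives with respect to each coefficient tell you which way to nudge a and b.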
02:53 PM polprog: Rue you gotta brush up on derivatives :)
03:12 PM polprog: https://polprog.net/rozne1/ircjunk/nn/01_backprop.html
03:12 PM polprog: https://polprog.net/rozne1/ircjunk/nn/backprop.ipynb python notebook, you can run it in jupyter
07:02 PM rue_mohr: ah thats right, a tensor is an array of equations
07:03 PM rue_mohr: but if you have arrays A*B=C where A is the input and C is the output
07:03 PM rue_mohr: you only need two to work out the 3rd
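[editor's note: rue_mohr's point, knowing two of the three arrays in A*B=C lets you work out the third, can be illustrated as below. The matrices are made-up examples; recovering B this way assumes A is square and invertible.]

```python
# If A @ B = C and A is invertible, B can be recovered from A and C.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[1.0],
              [4.0]])
C = A @ B                    # forward: compute the output from the input

# Recover B by solving the linear system A @ B_rec = C
B_rec = np.linalg.solve(A, C)
print(B_rec)                 # matches the original B
```

If A is not square (or is singular) there is no unique solution, so in general this only works up to a least-squares fit.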
07:04 PM rue_mohr: do you remember "longchip is long"? the super long SMT package
07:04 PM rue_mohr: ?