#garfield Logs
Oct 06 2018
12:11 AM zhanx: rue my feeder mod to hold the filament is working awesome
12:14 AM zhanx: its not jumping off the hobbed bolt anymore
12:19 AM rue_mohr: excellent
12:29 AM zhanx: fyi i did print the base for the 3d scanner, its done if it works in 40 minutes, i have not touched the printer once
12:29 AM zhanx: its been running fir 2 hours 18 minutes
12:30 AM rue_mohr: it should just get better
12:30 AM zhanx: we can hope so
12:31 AM zhanx: yes i did the washers as a z thing
12:31 AM zhanx: and the 100mm square etc
12:31 AM zhanx: its close to being spot on
12:32 AM zhanx: if anything i can make the layers smaller
12:33 AM rue_mohr: 0.21 is what I use
12:33 AM rue_mohr: carefull, if a flat surface lands on a multiple of the slice height, slic3r will crash
12:34 AM rue_mohr: its got a colinear grievence
12:34 AM zhanx: i am at .25 so not that far off
12:34 AM rue_mohr: if it crashes trying to slice soemthing, add like 0.001 to the slice height BEFORE loading the model
12:34 AM zhanx: k
12:34 AM rue_mohr: 0.21 seems to be odd enough to work for my models
12:35 AM zhanx: if i can get to .22 or .20 i will be happy
12:36 AM rue_mohr: there is something simple here I' not grasping
12:36 AM rue_mohr: and deleting that gypsie kings song will help
12:37 AM rue_mohr: aha, yes, thats it
12:38 AM rue_mohr: only active input neurons are adjusted
12:38 AM rue_mohr: thats what I was looking for
12:38 AM rue_mohr: now I just need to recall why that was a trigger detail for something really important
12:38 AM rue_mohr: GIVE ME MY BRAIN BACK
12:39 AM zhanx: i dont have it
12:39 AM rue_mohr: this is simple stuff and I'm having a hell of a time putting it togethor
12:40 AM rue_mohr: I can only see about 1/4 of the picture I need to
12:40 AM zhanx: your box of thinking is too tight
12:40 AM rue_mohr: its shrunk in the last 8 years
12:41 AM rue_mohr: my ability to se a whole problem is shriveling up
12:41 AM zhanx: you need to remove it
12:41 AM zhanx: shit rue_mohr today on that T-1 line fix, i was clueless
12:41 AM zhanx: I still fixed it
12:42 AM rue_mohr: they opps a polarity or soemthing?
12:42 AM rue_mohr: oops one?
12:42 AM zhanx: no
12:42 AM zhanx: they remodelled and cut it
12:43 AM rue_mohr: ah
12:43 AM zhanx: no one took a toner to it
12:43 AM rue_mohr: workign on a reno where they decided to lop off the 25x that fed the unit
12:44 AM rue_mohr: }:| wtf you just cutting wires for!?
12:44 AM zhanx: cause they like too
12:44 AM rue_mohr: the provider wants a $5000 "donation" to redo that cable
12:44 AM rue_mohr: I can put an end back on it, but I hope to hell I spooked him
12:44 AM zhanx: this hospital was a shit hole of wiring
12:44 AM rue_mohr: bet they run cheap
12:45 AM zhanx: no they paid good because was contracted in to fix it
12:45 AM zhanx: i was there an hour and a half and it was up
12:46 AM zhanx: they have 19 on staff for it
12:46 AM rue_mohr: are there patch cables that go from computer up into the tbar and down the hall to a network switch on a bookshelf thats fed from another switch on the floor thats fed by a cable running thru a hole in the wall to the next room where the router, serving as a table-leg-shim is ?
12:46 AM zhanx: you would swear that
12:47 AM rue_mohr: if not, stop complaining :)
12:47 AM zhanx: i counted 30 open hanging patch cables on one rack
12:47 AM rue_mohr: ok I'm rally trying to focus here
12:47 AM rue_mohr: I want to ace this neural network thing
12:47 AM rue_mohr: and my weekend just got taken away
12:47 AM zhanx: good luck
12:47 AM rue_mohr: after my sleep was ruined by 4 phonecalls
12:48 AM rue_mohr: you know how when you shake a bucket of sand, the larger rocks congrigate at the edge?
12:49 AM zhanx: yhep
12:49 AM rue_mohr: thats what I reffer to as 'organized chaos'
12:49 AM rue_mohr: and THATS the mechanism you use to properly train a neural network
12:50 AM rue_mohr: now simply code that :)
12:50 AM zhanx: if you can remember this print is 5% infill
12:50 AM rue_mohr: the training function should apply equally to all neurons...
12:50 AM rue_mohr: I dip out at about 20%
12:51 AM zhanx: is that too low?
12:51 AM rue_mohr: dunno
12:51 AM zhanx: i almost did hollow
12:52 AM rue_mohr: see, its funny they adjust the activ links, cause int eh same way , you can adjust the inactive ones
12:52 AM rue_mohr: the code for this starts to achive what I'm looking for
12:53 AM rue_mohr: the system is so corrective, the method and values almost dont matter
12:54 AM rue_mohr: the brain sections should be a function of the number of senses available to it
12:54 AM rue_mohr: cant put that togethor right now
12:55 AM rue_mohr: that means that for virtual hidden layers, the weights array needs to be adjusted as the result is processed
12:56 AM rue_mohr: ok I need to watch videos on training networks that have a few hidden layers
12:57 AM rue_mohr: I need to think about calculation butterflies while I watch it
12:57 AM rue_mohr: that another thing that doesn't matter, the structure of the network
12:57 AM rue_mohr: and all the nn stuff uses orderly layers of neurons
12:58 AM rue_mohr: but it can be a chaotic mess of neural links
12:58 AM rue_mohr: tho I need to figure out trainign techniques
12:58 AM rue_mohr: what NN can I implement on an avr to do something musing?
12:59 AM rue_mohr: not much without hidden layers
12:59 AM rue_mohr: ok
12:59 AM rue_mohr: what is an acceptable trainign period for a nural network
12:59 AM rue_mohr: depends if its repeating to itself or not
01:00 AM rue_mohr: what do you want for supper?
01:00 AM rue_mohr: I WANT TO BE A ROBOT!
01:00 AM rue_mohr: *sigh*
01:01 AM rue_mohr: there is the issue of turning training on and off too
01:02 AM rue_mohr: oh, thats why sleep resets the trainign rate
01:04 AM rue_mohr: so, frustration is exhausting cause the brain fails to be able to latch on to its target and is in a failing training cycle
01:06 AM rue_mohr: happy is the sucessfull nural results
01:06 AM rue_mohr: this explains why a child can break down and not be able to do anyhting
01:06 AM rue_mohr: learning mode needs to be reset
01:08 AM rue_mohr: I wonder if blood PH goes up when someone gets frustrated
01:10 AM rue_mohr: yup
01:10 AM rue_mohr: huh, that didn't even take long to find
01:11 AM rue_mohr: the reason it goes up, is because the body is trying to dissolve waste tissue
01:11 AM rue_mohr: and thats why asa helps when I'm tired
01:11 AM rue_mohr: artificially bumps up the reset mechanisms
01:12 AM rue_mohr: you need dialog?
01:12 AM rue_mohr: yes, please
01:12 AM rue_mohr: ok, lets go get some supper, give it some reset time
01:12 AM rue_mohr: yes, let it all sink in a bit
01:13 AM rue_mohr: yea, let the changes harden
01:17 AM rue_mohr: you dont need a fied trainign value, any random petrubation to the values you wan tto adjust will work
01:17 AM rue_mohr: right?
01:23 AM rue_mohr: but the mechnism of controling who is preturbed needs to be in place
01:23 AM rue_mohr: the definition of a stable system is actaully rather wide
01:24 AM rue_mohr: we need to watch videos on hidden layer network learning
01:24 AM rue_mohr: ok
01:24 AM rue_mohr: but the livingroom is cold
01:24 AM rue_mohr: turn on the heat?
01:24 AM rue_mohr: just before leaving for the weekend?
01:24 AM rue_mohr: wel I dunno curl up with a heater for all Icare
08:01 PM furrywolf: https://www.snopes.com/ap/2018/10/06/banksy-artwork-self-destructs-just-1-4-million-sale/ LOL
11:16 PM zhanx: well printer is doing good rue_mohr
11:16 PM zhanx: i need an upgrade for the y axis