#garfield Logs
Apr 12 2019
09:13 PM moonymoon: rue_bed: squeaky clean?
09:24 PM rue_bed: mmm tired
09:24 PM rue_bed: too
09:24 PM rue_bed: so, what's up
09:25 PM zhanx: rue_bed, sleeping again
09:25 PM rue_bed: no
09:25 PM rue_bed: might drift off tho
09:26 PM rue_bed: I'm definitely hearing voices
09:26 PM zhanx: ever work with those 9 dof chips
09:26 PM rue_bed: not any imu yet
09:26 PM rue_bed: I have a bunch I want to play with
09:26 PM rue_bed: my understanding is that all the work is in filtering the readings
09:27 PM zhanx: k, i am starting on it tomorrow
09:27 PM rue_bed: like you need a ______ filter to get anything meaningful
09:27 PM rue_bed: what that filter called
09:27 PM rue_bed: arg
09:27 PM rue_bed: kalman
09:27 PM zhanx: kalman
09:27 PM zhanx: too slow
09:28 PM rue_bed: and I don't know how to do those yet
09:28 PM zhanx: neither do I
09:28 PM rue_bed: you made one tho
09:29 PM zhanx: i did but not knowing the how or the why
09:29 PM zhanx: it worked
09:29 PM rue_bed: yea
09:29 PM zhanx: but, i am learning more now
09:29 PM zhanx: to me it was dumb luck the first time
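
A minimal one-dimensional Kalman filter sketch in C, for the filter rue_bed is trying to name above. The noise constants q and r and the sample values are placeholder assumptions, not values from the channel.

    /* Minimal 1-D Kalman filter: smooths a noisy scalar reading
     * (e.g. one gyro or accelerometer axis).  q and r below are
     * made-up tuning constants. */
    #include <stdio.h>

    typedef struct {
        double x;   /* current state estimate          */
        double p;   /* estimate uncertainty (variance) */
        double q;   /* process noise variance          */
        double r;   /* measurement noise variance      */
    } kalman1d_t;

    void kalman1d_init(kalman1d_t *k, double x0, double p0, double q, double r)
    {
        k->x = x0; k->p = p0; k->q = q; k->r = r;
    }

    double kalman1d_update(kalman1d_t *k, double measurement)
    {
        /* predict: uncertainty grows by the process noise */
        k->p += k->q;

        /* update: blend prediction and measurement by the Kalman gain */
        double gain = k->p / (k->p + k->r);
        k->x += gain * (measurement - k->x);
        k->p *= (1.0 - gain);

        return k->x;
    }

    int main(void)
    {
        kalman1d_t k;
        kalman1d_init(&k, 0.0, 1.0, 0.01, 0.5);

        /* fake noisy readings around 1.0 */
        double samples[] = {1.2, 0.8, 1.1, 0.9, 1.05, 0.95};
        for (int i = 0; i < 6; i++)
            printf("raw %.2f -> filtered %.3f\n", samples[i],
                   kalman1d_update(&k, samples[i]));
        return 0;
    }

The same idea scales to the 9-DOF case by replacing the scalars x and p with a state vector and covariance matrix, which is where most of the work and the cost zhanx calls "too slow" comes from.
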
09:29 PM rue_bed: I've nodded off twice now
09:30 PM rue_bed: k, what did you do
09:30 PM zhanx: i took readings and weighted them over time in an array
09:31 PM zhanx: the more i saw it the more it moved up
09:31 PM zhanx: so i had array (data 1, data 2, time 1)
09:31 PM zhanx: then a sort on it
09:32 PM zhanx: wait, that array was data 1, data 2, time, score, level
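
A rough C sketch of the scoring scheme zhanx describes: each entry holds data 1, data 2, the time it was last seen, a score and a level; repeated readings bump the score, and a sort pushes the most-seen readings to the top. The match tolerance, the array size, and reading "duplicate" as "within tolerance" are assumptions, not details from the log.

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    #define MAX_ENTRIES 32
    #define TOLERANCE   0.05    /* assumed: readings this close count as the same */

    typedef struct {
        double data1;
        double data2;
        unsigned long time_ms;  /* when this reading was last seen */
        int score;              /* how many times we have seen it  */
        int level;              /* rank after sorting              */
    } entry_t;

    static entry_t table[MAX_ENTRIES];
    static int n_entries = 0;

    static int by_score_desc(const void *a, const void *b)
    {
        return ((const entry_t *)b)->score - ((const entry_t *)a)->score;
    }

    void add_reading(double d1, double d2, unsigned long now_ms)
    {
        int found = 0;

        /* if we've already seen (roughly) this reading, bump its score */
        for (int i = 0; i < n_entries && !found; i++) {
            if (fabs(table[i].data1 - d1) < TOLERANCE &&
                fabs(table[i].data2 - d2) < TOLERANCE) {
                table[i].score++;
                table[i].time_ms = now_ms;
                found = 1;
            }
        }

        /* otherwise store it as a new entry (dropped if the table is full) */
        if (!found && n_entries < MAX_ENTRIES)
            table[n_entries++] = (entry_t){ d1, d2, now_ms, 1, 0 };

        /* sort so the most-often-seen readings float to the top */
        qsort(table, n_entries, sizeof(entry_t), by_score_desc);
        for (int i = 0; i < n_entries; i++)
            table[i].level = i;
    }

    int main(void)
    {
        add_reading(1.00, 2.00, 0);     /* seen once         */
        add_reading(3.00, 4.00, 20);    /* seen once         */
        add_reading(1.02, 1.98, 40);    /* matches the first */
        printf("top: %.2f %.2f score %d level %d\n",
               table[0].data1, table[0].data2, table[0].score, table[0].level);
        return 0;
    }

Compared with a Kalman filter this is roughly a histogram vote: it rewards readings that keep recurring instead of modelling the noise.
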
09:34 PM rue_bed: don't totally get it
09:34 PM rue_bed: running averages?
09:34 PM zhanx: averages taken at the time
09:34 PM zhanx: say i sample 20ms of data, average it, store it, check it for duplicates
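
A sketch of that "sample 20 ms, average, store, check for duplicates" step. read_sensor() and millis() are hypothetical hardware hooks, stubbed here so the sketch compiles on a PC; the duplicate threshold is also an assumption.

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    #define WINDOW_MS   20
    #define HISTORY_LEN 64
    #define DUP_EPSILON 0.01   /* assumed: averages this close count as duplicates */

    /* stub hardware hooks; on a real board these would come from the SDK */
    static unsigned long fake_clock_ms;
    static unsigned long millis(void) { return fake_clock_ms++; }
    static double read_sensor(void)   { return 1.0 + 0.01 * (rand() % 10); }

    static double history[HISTORY_LEN];
    static int n_history;

    /* average every reading taken during one 20 ms window */
    static double sample_window(void)
    {
        unsigned long start = millis();
        double sum = 0.0;
        int count = 0;
        while (millis() - start < WINDOW_MS) {
            sum += read_sensor();
            count++;
        }
        return count ? sum / count : 0.0;
    }

    /* store the average unless an (almost) identical one is already stored */
    static int store_if_new(double avg)
    {
        for (int i = 0; i < n_history; i++)
            if (fabs(history[i] - avg) < DUP_EPSILON)
                return 0;                  /* duplicate: skip it */
        if (n_history < HISTORY_LEN)
            history[n_history++] = avg;
        return 1;
    }

    int main(void)
    {
        for (int w = 0; w < 5; w++) {
            double avg = sample_window();
            printf("window %d: avg %.3f %s\n", w, avg,
                   store_if_new(avg) ? "stored" : "duplicate");
        }
        return 0;
    }

Presumably the stored window averages are what feed the scored array described earlier, though the log doesn't spell that out.
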
09:37 PM zhanx: the more i get into it, the more i realize what I didn't account for