#robotics Logs

Apr 19 2017


01:18 AM rue_house: why is this on my head?
01:20 AM -!- #robotics mode set to -o by rue_house
01:20 AM rue_house: I'm working on part 2 of the servo course
01:20 AM rue_house: today I put legs on a tiny13, but cant find my carrier board for it
01:22 AM rue_house: but it's ok, I'm way too stressed out at work to actually get sick
01:22 AM Anniepoo: I used to have a roomie. Her kid had a toy truck, on a wired controller
01:22 AM z64555: lol
01:22 AM rue_house: Anniepoo, ok...
01:22 AM Anniepoo: she'd lay down and have her husband drive the truck back and forth on her back
01:23 AM rue_house: heh
01:23 AM rue_house: and why didn't she do it?
01:23 AM rue_house: I want to build an inversion board
01:23 AM gottaname|wurk: ne ne Anniepoo
01:23 AM gottaname|wurk: why do wimmenz not do robotics
01:23 AM Anniepoo: probably couldn't see it
01:23 AM Anniepoo: I dunno
01:24 AM gottaname|wurk: also why are there separate chess games for men and women
01:24 AM rue_house: some legs, a board, 74ls14, some connectors...
01:24 AM orlock: gottaname|wurk: Not here there isnt?
01:24 AM z64555: rue, did you see the comment I made earlier about the piezo's
01:24 AM gottaname|wurk: orlock, weird
01:25 AM rue_house: z64555, not necessarily
01:25 AM rue_house: the freq would change based on the pressure
01:26 AM z64555: eh... I wouldn't be too sure of that. You'll get some weird electromechanical feedback between the sensed force and the signal on the crystal
01:27 AM rue_house: I'd like to play with it
01:27 AM z64555: take a tuning fork and mash it up against a solid surface, does the frequency change a whole bunch?
01:27 AM rue_house: yea
01:27 AM rue_house: damping changes resonant freq
01:28 AM z64555: try the same pressure but with different surface sizes
01:34 AM rue_house: gottaname|wurk, you offended anniepoo, did you mean anything offensive there?
01:35 AM gottaname|wurk: not really
01:35 AM * orlock reads back
01:36 AM orlock: well, he's wrong anyway
01:36 AM orlock: plenty of females doing robotics here
01:36 AM orlock: at this internationally recognised company that builds robots
01:36 AM gottaname|wurk: oh, well that's interesting
01:36 AM z64555: I find it boring, really
01:37 AM gottaname|wurk: guys like us don't think robots look creepy and evil with all the wires right?
01:37 AM gottaname|wurk: maybe women can make them look less.. intimidating
01:37 AM orlock: Build better robots?
01:37 AM gottaname|wurk: cute robots!
01:37 AM gottaname|wurk: build CUTER robots
01:37 AM orlock: Ours usually just look like big off-site cabinets
01:38 AM z64555: nah, not really
01:38 AM gottaname|wurk: because we can!
01:38 AM gottaname|wurk: :P
01:38 AM gottaname|wurk: like tachikomas.
01:38 AM z64555: a mess of wires is just messy wires, there's nothing evil or creepy about it
01:38 AM orlock: gottaname|wurk: Our robots apply chemicals to human tissue and blood, it's not supposed to be cute
01:38 AM orlock: And theres no mess of wires
01:39 AM z64555: dried gore and other biological material that's accumulated in odd spots... THAT's evil and creepy
01:40 AM Snert: Wiring should always be totally hidden except when it's not. And even then, the feng shui and the orderly visual flow of the wiring should prevail.
01:41 AM z64555: bundle it!
01:41 AM z64555: smooth curves, no kinks, no nonsense twists
01:41 AM Snert: that's right.
01:41 AM Snert: it should draw the eye towards something more important.
01:42 AM Snert: not be a fucking mess.
01:42 AM z64555: If the wires must be twisted, the twist must be all throughout the bundle
01:42 AM z64555: only exception is right at the connector, which often there's nothin you can do about
01:42 AM gottaname|wurk: it should be beautiful!
01:42 AM gottaname|wurk: neat, orderly, clean :P
01:43 AM Snert: and it should follow the IEEE electronics color code.
01:43 AM gottaname|wurk: and that too
01:43 AM z64555: all black and red! :D
01:43 AM Snert: Orange is 3. If it ain't D3, A3 or such then it ain't orange.
01:44 AM orlock: i'm amazed by the looms they make here
01:44 AM orlock: i always grab them from the bin
01:45 AM Snert: separate red and separate black is always juice.
01:45 AM Snert: but if it's red and black in a bundle then it's 1 an2 of something.
01:45 AM Snert: 0 and 1 I meant
01:46 AM z64555: hm? so its different from the resistor color code
01:46 AM z64555: I'll have to look it up
01:46 AM gottaname|wurk: https://www.youtube.com/watch?v=Dwk0PqG0xg0
01:46 AM Snert: no...I mistyped twice.
01:46 AM gottaname|wurk: ^ cute
01:46 AM Snert: black is 0 and brown is 1.
01:47 AM z64555: resistor color code is Bl Br Rd Or Yl Gr Blu Vi Wh
01:47 AM z64555: err, missing grey
01:47 AM Snert: yea same thing. Grey is 8
01:47 AM z64555: Vi Gr Wh, Go Si
01:47 AM Snert: 9 is white.
01:48 AM z64555: ok, so it's the same as the resistor color code, great! makes it easy to remember
01:48 AM Snert: yep.
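The scheme Snert and z64555 converge on is just the standard resistor digit code applied to wire colors. A minimal Python sketch of that lookup (the list is the standard 0-9 color sequence; the function name is my own):

```python
# Standard resistor/wire color digit code: index = digit, value = color.
DIGIT_COLORS = [
    "black",   # 0
    "brown",   # 1
    "red",     # 2
    "orange",  # 3
    "yellow",  # 4
    "green",   # 5
    "blue",    # 6
    "violet",  # 7
    "grey",    # 8
    "white",   # 9
]

def color_for_digit(d: int) -> str:
    """Return the standard color for a single digit 0-9."""
    return DIGIT_COLORS[d]

print(color_for_digit(3))  # orange, so a D3/A3 wire is orange
```

So Snert's examples check out: orange is 3, grey is 8, white is 9.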
01:48 AM rue_house: gottaname|wurk, have you ever watched the crocodile dundee movies?
01:48 AM gottaname|wurk: yes I do
01:48 AM gottaname|wurk: why
01:49 AM rue_house: nevermind...
01:49 AM z64555: hm.
01:52 AM orlock: hah
01:52 AM * orlock gets that reference
01:52 AM orlock: "I see you've played Knifey Spoony before!"
02:00 AM z64555: hm, I don't think I actually have an orange spool of hook-up wire
02:01 AM z64555: or brown, for that matter
02:01 AM z64555: I may have to restock
02:12 AM Anniepoo: so, sadly, I'm being pushed into an ugly choice here. Either I tolerate what went on this evening, or I leave. Neither's very acceptable.
02:15 AM Anniepoo: I know that certainly tonight, this channel's NOT where I need to be
03:28 AM Jak_o_Shadows: I thought we were talking about the wussy little 12V kid cars. Then I clicked on links.
03:28 AM Jak_o_Shadows: Also, people! Geez.
03:33 AM gottaname|wurk: Jak_o_Shadows, in b4 one kid rams your shins with it
06:46 AM z64555: was somebody PM'ing anniepoo?
07:47 AM jandor: what the hell?
08:10 AM deshipu: I wonder what the "P" in "PM" might mean
08:12 AM Snert: Post Meridian as opposed to Ante Meridian
08:13 AM deshipu: so anniepoo has been postmeridianed?
08:13 AM Jak_o_Shadows: ffs
08:14 AM Snert: If it happened at night, I suppose so :P
08:16 AM SpeedEvil: Asshats.
08:19 AM jandor: just why
09:26 AM rue_house: anniepoo was really upset by gottaname|wurk's comment last night
09:35 AM SpeedEvil: asshats are real.
09:35 AM SpeedEvil: Anniepoo: hey. Snail progress?
09:38 AM Anniepoo: hey speed
09:40 AM Anniepoo: It's still morning here
09:42 AM SpeedEvil: ah
09:42 AM SpeedEvil: I just got some aluminium scaffolding poles to play with for other things, and they're remarkably stiff.
09:42 AM Anniepoo: radius of gyration
09:42 AM Anniepoo: 8cD
09:42 AM SpeedEvil: ~15mm deflection in the middle, under its own weight, when held at the ends
09:45 AM cnnx: SpeedEvil: hi
09:46 AM cnnx: SpeedEvil: just went to mail a package back to amazon on foot and my back hurts
09:46 AM cnnx: havent been out in a while
09:46 AM cnnx: Anniepoo: hows your snail coming
09:46 AM rue_house: one of my buddies was at the shop last night trying to bend some cast aluminum
09:47 AM SpeedEvil: rue_house: did he have a cracking time?
09:48 AM Anniepoo: cast aluminum is different stuff than extruded aluminum
09:48 AM rue_house: ~~phonecall~~
09:48 AM SpeedEvil: yeah. I wonder what happens if I try to cast these poles.
09:48 AM Anniepoo: it's morning, I haven't gotten back to the shop
09:48 AM cnnx: cool
09:49 AM cnnx: where's the shop? out in the yard?
09:49 AM rue_house: so, yea, trying to bend cast aluminum strip
09:49 AM Anniepoo: no. The shop's http://theelginworks.com
09:49 AM rue_house: when I saw it break on him I was like !!@?!? that breaks like it's cast
09:50 AM rue_house: which is when he told me the bar said right on it it'd been cast
09:50 AM SpeedEvil: warming it might help.
09:50 AM SpeedEvil: Or it might not
09:52 AM Anniepoo: no - casting alters the grain structure
09:52 AM SpeedEvil: yeah, I know, though there are additives which can help somewhat
09:52 AM Anniepoo: and aluminum doesn't really work harden much anyway
09:52 AM SpeedEvil: It really does quite a lot
09:53 AM rue_house: yea, even with aluminum you can bend, you can bend it once
09:53 AM SpeedEvil: Purer aluminium tends to be better
09:53 AM SpeedEvil: Well. 'better'
09:53 AM Anniepoo: ok, that's relative - I'm thinking about metal spinning
09:53 AM SpeedEvil: it's very weak
09:53 AM SpeedEvil: yeah - spinning is rather different
09:53 AM rue_house: damn i'm late to go for work!
09:53 AM SpeedEvil: and is lots and lots and lots of bending
09:54 AM rue_house: arg, where is shaver
09:54 AM Anniepoo: yes, ok, I'll believe you on this. now that I think about bending - the grains do grow
09:55 AM * rue_house tries to not shave off eyebrows
09:55 AM Anniepoo: rue, are you feeling better this morning?
09:55 AM rue_house: a bit
09:56 AM Anniepoo: lol - I'm on the dorkbot site. Got a laugh from this: "Hebocon is a robot sumo-wrestling competition for those who are not technically gifted. It is a competition where crappy robots that can just barely move gather and somehow manage to engage in odd, awkward battles."
09:56 AM Anniepoo: https://dorkbotpdx.org/
09:56 AM cnnx: my legs hurt
10:00 AM rue_house: hahha retro shack
10:00 AM rue_house: :( I'm gonna be in a mud filled ditch all day
10:04 AM Anniepoo: ooh, mud filled ditch.
10:04 AM Anniepoo: 8cD <-- going to work on snail some today
10:07 AM SpeedEvil: I should play with these poles - to use them as support for a stressed membrane
10:11 AM Anniepoo: Speed, now Im curious - do they have a material code stamped on them? (or otherwise)
10:11 AM SpeedEvil: 6082
10:12 AM Anniepoo: We're about to cast a bunch of aluminum from the frame of a greenhouse that fell down.
10:12 AM SpeedEvil: http://www.scaffolding-direct.co.uk/alloy-tube-48-3mm-x-6m-length-for-tube-clamps-and-scaffolding.aspx
10:13 AM Anniepoo: not sure how I'd set that experiment up
10:14 AM SpeedEvil: what experiment
10:14 AM Anniepoo: I'd like to try measuring toughness for the same metal after it's been cast
10:15 AM Anniepoo: just outta nerdishness
10:15 AM SpeedEvil: A hardened steel cone or othe shape, pressed or impacted onto the surface
10:15 AM SpeedEvil: ball bearing can also work
10:15 AM Anniepoo: isn't that hardness?
10:15 AM SpeedEvil: http://www.ebay.co.uk/itm/HRC-3-Steel-Diamond-Indenter-Penetrator-For-Rockwell-Hardness-Testing-Tester-/222068170765 if you want to get fancy
10:16 AM SpeedEvil: yes
10:16 AM Anniepoo: toughness - for steel, I know the test is, essentially, put it in a vice, bend it with a ram, see how far it bends
10:16 AM SpeedEvil: I think that'd work too
10:17 AM SpeedEvil: at least indicatively
10:17 AM SpeedEvil: perhaps even notching it
10:17 AM Anniepoo: hang on
10:18 AM Anniepoo: Ah! toughness is defined in a kind of neat way
10:18 AM Anniepoo: it's the area under the stress-strain curve
10:19 AM Anniepoo: ductility is the term I should have been using, I guess
10:19 AM Anniepoo: 8cD best thing about robotics is all the weird areas you end up in
10:20 AM Anniepoo: and it's even worse - ductility = ability to deform under tensile stress.
10:21 AM Anniepoo: malleability = ability to deform under compressive stress
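Anniepoo's definition of toughness as the area under the stress-strain curve is easy to sketch numerically with the trapezoid rule. The data points below are invented for illustration, not measurements of any real alloy:

```python
def toughness(strains, stresses):
    """Approximate toughness as the area under a stress-strain curve,
    via the trapezoid rule. With stress in Pa and strain dimensionless,
    the result is energy absorbed per unit volume (J/m^3)."""
    area = 0.0
    for i in range(1, len(strains)):
        width = strains[i] - strains[i - 1]
        area += 0.5 * (stresses[i] + stresses[i - 1]) * width
    return area

# Invented illustrative points up to fracture:
strain = [0.0, 0.001, 0.002, 0.004]
stress = [0.0, 70e6, 120e6, 140e6]
print(toughness(strain, stress))  # ~390000 J/m^3
```

The same measurement before and after casting the metal would quantify the grain-structure change the chat is arguing about.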
10:21 AM SpeedEvil: It also depends.
10:21 AM SpeedEvil: for much of robotics, you never want to approach deformation
10:22 AM Anniepoo: every loaded material deforms
10:22 AM Anniepoo: if you don't want permanent change, you have to stay within the elastic limit
10:24 AM SpeedEvil: ^plastic deformation
10:25 AM Anniepoo: right, if you exceed the elastic limit you get plastic deformation, and then failure.
12:14 PM MarkusDBX: is this a channel for robotics engineering?
12:15 PM MarkusDBX: like actual robots?
12:22 PM robopal: yes, our masters
12:27 PM DagoRed: future... if we can ever calibrate correctly.
12:33 PM robopal: MarkusDBX, are you a student?
12:33 PM robopal: of engineering?
12:36 PM mumptai: re
12:50 PM solol: deep-learning has been a thing for 70 years
12:51 PM solol: lol @ our masters. robopal what kind of robots do you work on?
12:51 PM solol: anyone working on something like an exoskeleton ?
12:51 PM robopal: none, I am safe
12:51 PM robopal: for now at least
12:52 PM solol: yeah but something like an exoskeleton should be real for a lot of people. i'm wondering if anyone has thought about them
12:53 PM solol: they make the news occasionally but its been a while since i've noticed them maybe 2 years or so
12:54 PM solol: one of the big aerospace guys was prototyping them for working class people last i knew
12:55 PM solol: basically it was a thing that goes up the leg, around the waist, and up the side of your body-down the arm
01:01 PM solol: anyone want to break down the deep-learning technique described https://phys.org/news/2017-04-neural-networks.html
01:09 PM SpeedEvil: exoskeletons really aren't hard.
01:09 PM SpeedEvil: The problem is getting them cheap and usable
01:10 PM SpeedEvil: https://en.wikipedia.org/wiki/Atmospheric_diving_suit + lots of leetle harmonic drives
01:11 PM anniepoo: solol, what do you want to know about them?
01:11 PM solol: i don't get what they meant basically at all
01:12 PM anniepoo: do you understand neural networks?
01:13 PM solol: no i guess not
01:13 PM anniepoo: there are some youtube videos that explain them pretty well
01:15 PM solol: yeah maybe i should look at one or two
01:15 PM solol: i don't get what they do to 'compute any function that a digital computer could'
01:15 PM anniepoo: looking for one I liked
01:16 PM anniepoo: they're simply a way of making a function - you put inputs in, get an output
01:17 PM solol: yeah i don't understand that
01:17 PM anniepoo: so that's 'any function that a digital computer', which is, modulo memory limitations, what a computer can compute
01:18 PM anniepoo: 8cD any recursively enumerable function
01:18 PM * anniepoo shushes the little mathematician who sits on her shoulder and cringes
01:19 PM anniepoo: so, you have some data - lots of cases of bunch of inputs, and outputs
01:19 PM anniepoo: like,
01:19 PM anniepoo: in out
01:19 PM anniepoo: 1,2 3
01:19 PM anniepoo: 4,5 9
01:20 PM anniepoo: 3,3 6
01:20 PM anniepoo: and if you look at this a while, and maybe if I type more lines,
01:20 PM anniepoo: you begin to strongly suspect out is just the sum of the two in parameters
01:21 PM anniepoo: but for a computer, that sort of inductive reasoning is hard
01:23 PM anniepoo: so,
01:23 PM anniepoo: a neural net is a way of making a function like this
01:23 PM solol: you're gonna lose me i think
01:23 PM anniepoo: from a bunch of data
01:24 PM anniepoo: yup, youtube videos are going to be clearer than me
01:24 PM solol: what's a weight or threshold
01:24 PM solol: or layers
01:24 PM solol: and how does it compute
01:24 PM solol: i don't get how it would make a function out of just #'s
01:25 PM anniepoo: https://www.youtube.com/watch?v=bxe2T-V8XRs
01:25 PM anniepoo: I found this helpful
01:25 PM anniepoo: YMMV
01:26 PM solol: brb
01:26 PM solol: nvm false alarm
01:26 PM solol: i'll check that series out and get back to you
01:26 PM solol: thanks
01:30 PM solol: yeah i don't really understand layers
01:30 PM solol: he didn't connect every neuron or whatever
01:30 PM solol: just kinda made some of them connect
01:31 PM solol: oh i suppose they all go forward one maybe
01:31 PM solol: yeah that's what they did
01:32 PM solol: lost me making the matrices
01:34 PM solol: yeah lost me in the forward propagation
01:34 PM solol: not sure why they add weights and shit either
01:39 PM z64555: neural nets mentioned... no mention of fuzzy logic :(
01:40 PM solol: fuzzy logic?
01:41 PM z64555: neural nets are a specialized case of fuzzy logic inference engines
01:42 PM z64555: With fuzzy logic, you have a number of curves that describe the incoming influence of a value or signal
01:42 PM z64555: Which are then called a "fuzzified" value of the input
01:43 PM solol: where does the # of curves come into existence
01:43 PM solol: and what is incoming influence
01:43 PM solol: just input part of the function?
01:43 PM z64555: These values can then be added, multiplied, etc. with other fuzzified values from other curves, which can be working on the same input or another input
01:44 PM solol: what are these curves
01:44 PM z64555: It's a function, that maps an input, x, to another value, y
01:45 PM z64555: For example, a unit step is often considered a "curve"
01:45 PM z64555: (a unit step's y is 1 when x >= 0, and y = 0 when x < 0)
01:45 PM * anniepoo has painted on lightweight spackling on the shell plug i messed up. Is waiting for it to dry
01:46 PM z64555: another "curve" often used is a line
01:47 PM z64555: y = mx + b
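The two "curves" z64555 names, the unit step and the line, are just mapping functions from an input x to an output y. A minimal sketch (function names are my own):

```python
def unit_step(x: float) -> float:
    """Unit step 'curve' as described: y is 1 when x >= 0, else 0."""
    return 1.0 if x >= 0 else 0.0

def line(x: float, m: float = 1.0, b: float = 0.0) -> float:
    """A line y = m*x + b, another common mapping 'curve'."""
    return m * x + b

print(unit_step(-0.5), unit_step(2.0))  # 0.0 1.0
```

Whether you call these "curves" is semantics, as the chat goes on to argue; the point is only that each maps an input to an output.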
01:47 PM anniepoo: @jandor - I think for now I'm not going to bother with simulator, will just write blender plugin to interact with physical robot
01:47 PM solol: lines aren't curves
01:47 PM solol: so i don't get where you're coming from
01:48 PM z64555: you're being too hung up on semantics
01:48 PM solol: at least i've never heard of someone calling a line a curve
01:48 PM solol: a unit step
01:48 PM solol: which is what in a neural network
01:48 PM z64555: It's a foundation which you seem to have completely missed
01:49 PM solol: they didn't call anything a unit step i don't think
01:49 PM z64555: You're trying to swim in the ocean before swimming in the kiddy pool
01:49 PM solol: what do they use the deep learning for?
01:50 PM z64555: and I'm trying to bring you back to the kiddy pool for a bit but you seem to be objecting
01:50 PM solol: im not sure where unit steps come into play
01:50 PM solol: you take an input
01:50 PM solol: then what
01:50 PM z64555: it gives an output, depending on what the input's value is
01:50 PM solol: so you already know the result of any given input
01:51 PM z64555: for a given curve yes
01:51 PM z64555: But.
01:51 PM z64555: If you have a bunch of different curves working on the same data
01:51 PM solol: what do you mean by curve.. input = output is not even a line
01:51 PM z64555: blended together, you might not know the exact result without
01:52 PM solol: it's just plots
01:52 PM z64555: curve = line function.
01:52 PM z64555: no it's not just plots
01:52 PM z64555: it's a mapping function
01:52 PM z64555: it **maps** an input signal to a different signal
01:53 PM solol: so you take multiple inputs and outputs and create lines?
01:53 PM z64555: a signal is a train or sequence of inputs
01:53 PM z64555: You're getting confused here
01:55 PM solol: yeah i think a line requires more than one point
01:55 PM z64555: If you plot the values out, yes, you would get a line/curve.
01:55 PM z64555: But that's not the point
01:57 PM z64555: You generally don't run a single value into a fuzzy logic engine, or a neural net for that matter
01:58 PM jandor: anniepoo, ok
01:59 PM z64555: https://en.wikipedia.org/wiki/Fuzzy_logic
01:59 PM anniepoo: and I'm pretty much decided on YARP
02:00 PM z64555: Read that, that's a bit simpler than what I was trying to get
02:00 PM solol: that's a lot of reading
02:00 PM z64555: "simpler" in the sense that it is focusing on logic values instead of signals
02:01 PM solol: what are you trying to solve for in a fuzzy logic engine
02:01 PM solol: values between 0 and 1
02:01 PM solol: but what do they represent
02:01 PM z64555: no
02:01 PM z64555: not "values between 0 and 1"
02:01 PM z64555: its whatever values you want them to represent
02:02 PM z64555: Look at the temperature example
02:02 PM solol: yea i worded that wrong anyway
02:02 PM z64555: This one's got 3 logic values, cold, warm, and hot
02:02 PM solol: ok im looking at it
02:03 PM z64555: the vertical scale is from 0 to 1, with 0 being "not" that particular logic value, and 1 being "definitely" that logic value
02:04 PM solol: yea it says that
02:04 PM solol: that's about all it says though
02:04 PM z64555: well it assumed you read the first few paragraphs :P
02:04 PM solol: goes onto conceptual info again after that
02:05 PM z64555: anyway
02:05 PM z64555: So you can have a number of logic values
02:05 PM solol: i still don't get how it graphs them
02:05 PM z64555: which are nonzero and not-one
02:05 PM solol: i guess inequality expressions
02:05 PM z64555: You set the trapezoid curves
02:06 PM solol: yeah how do you set the infinite # during the slope?
02:06 PM z64555: what infinite number?
02:06 PM solol: going from 0 to 1 or 1 to 0
02:07 PM solol: the slope
02:07 PM z64555: ok now you're confusing me
02:07 PM z64555: The trapezoids are the curves, the mapping functions
02:07 PM z64555: You give it an x, it gives you a y
02:08 PM z64555: In reality these curves are often lookup tables, but their purpose is the same
02:08 PM z64555: You give it an x
02:08 PM z64555: it gives you a y
02:08 PM z64555: Those three trapezoids operate on the same input
02:09 PM z64555: so you give them an x, and you get three y's
02:09 PM z64555: Here, it uses temperature
02:09 PM cnnx: SpeedEvil: you mad?
02:10 PM z64555: So, if the temperature is like 10 Celsius, then it's definitely in the "Cold" curve
02:10 PM z64555: and not in any of the others
02:10 PM z64555: so your y's would be 1, 0, 0
02:11 PM solol: they're not always 1 or 0 though
02:11 PM z64555: Right, that all depends on what the input value is
02:11 PM z64555: So if your x was like 20 Celsius
02:12 PM solol: how do you take into account the infinite range of y's for the values that aren't 1 or 0
02:12 PM z64555: they don't exist
02:12 PM z64555: here, 0, 1 is associated with the curve
02:12 PM z64555: they are "weights" for each of the functions
02:13 PM z64555: so if it's 1 cold, that means it's 100% cold, 0% warm, and 0% hot
02:13 PM solol: warm could be cold or hot though
02:13 PM z64555: no
02:13 PM solol: it sure looks like it
02:13 PM z64555: you're thinking about it backwards
02:14 PM solol: yeah im totally thinking it can be values between 0 and 1
02:14 PM z64555: the temperature can be both on the warm and cold curves, or both on the warm and hot curves
02:14 PM solol: what is a weight
02:14 PM solol: 1 or 0?
02:15 PM z64555: 1, 0, or anything in between. it's a multiplicative value
02:16 PM z64555: Ok, so
02:16 PM z64555: If your temperature is 20C
02:16 PM z64555: You'll end up with a value that's both on the Cold and Warm curves
02:16 PM solol: is that suit really considered an exoskeleton SpeedEvil ?
02:16 PM solol: ok so it's just a 1 for both then
02:17 PM z64555: lets say its in the middle, so its halfway cold (0.5 Cold) and halfway warm (0.5)
02:17 PM z64555: no, those are trapezoids
02:17 PM z64555: If they were squares or rectangles, then the only mapped values you'd get are 1's and 0's
02:19 PM solol: so explain it further i'm following you now as good as im going to be able to
02:19 PM z64555: So you'll end up with 3 y's, 0.5 Cold, 0.5 Warm, and 0 Hot
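The fuzzification step described above, one input in, three truth values out, can be sketched directly. The breakpoint temperatures here are my own invented values chosen so that 20 C comes out halfway cold and halfway warm, matching the chat's example:

```python
def cold(t):
    """Truth value for 'cold': 1 at/below 15 C, falling to 0 at 25 C."""
    if t <= 15:
        return 1.0
    if t >= 25:
        return 0.0
    return (25 - t) / 10.0

def warm(t):
    """Trapezoid for 'warm': rises 15-25 C, flat 25-30 C, falls 30-40 C."""
    if t <= 15 or t >= 40:
        return 0.0
    if 25 <= t <= 30:
        return 1.0
    if t < 25:
        return (t - 15) / 10.0
    return (40 - t) / 10.0

def hot(t):
    """Truth value for 'hot': 0 at/below 30 C, rising to 1 at 40 C."""
    if t <= 30:
        return 0.0
    if t >= 40:
        return 1.0
    return (t - 30) / 10.0

def fuzzify(t):
    """One input x, three y's: the truth value on each curve."""
    return cold(t), warm(t), hot(t)

print(fuzzify(20.0))  # (0.5, 0.5, 0.0): halfway cold, halfway warm, not hot
```

With rectangles instead of trapezoids you would only ever get 0 or 1, which is exactly z64555's point about the sloped edges.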
02:19 PM solol: ok so you make a matrix with those?
02:21 PM z64555: Each curve/line/trapezoid/shape has their own function that describes them
02:21 PM z64555: You can combine them into a matrix, which then can give a vector that has the truth values for each line
02:22 PM solol: what are weightings
02:22 PM z64555: The "weights" are the truth values
02:23 PM z64555: later on, you have another set of "weights" that can further scale the truth values
02:23 PM z64555: but don't jump ahead
02:23 PM z64555: focus
02:24 PM z64555: We're still on the fuzzification phase of fuzzy logic, but you keep trying to jump around to other topics
02:25 PM z64555: To recap
02:25 PM z64555: We've got 3 functions that map an input to a value
02:25 PM z64555: a truth value that represents a % of that logic state
02:27 PM solol: yeah
02:27 PM z64555: Now that you've got these truth values, you can apply them to a set of rules to control something
02:27 PM z64555: with the temperature example, it'll control a heater
02:27 PM z64555: or an air conditioner
02:27 PM z64555: Here's a rule table:
02:28 PM z64555: "If it is cold, turn on the heater;"
02:28 PM z64555: "If it is hot, turn on the air conditioner;"
02:28 PM z64555: "If it is warm, turn off the heater and air conditioner"
02:29 PM z64555: but wait, we just got through saying that we don't have boolean states
02:30 PM z64555: so how can this rule table apply boolean logic to non-boolean values
02:30 PM solol: good point
02:30 PM z64555: It blends it.
02:30 PM solol: how so
02:30 PM z64555: So, if the input value is 0.5 cold and 0.5 warm
02:31 PM z64555: we both want the heater on, and off
02:31 PM z64555: 0.5 on, 0.5 off
02:31 PM solol: why wouldn't it be only true for values of 1 or 0
02:32 PM z64555: what?
02:34 PM z64555: Anyway. lemme refine the rule table a bit
02:35 PM z64555: "IF COLD, HEATER ON" OR
02:35 PM z64555: "IF HOT, AIR CONDITIONER ON" OR
02:35 PM z64555: "IF WARM, HEATER OFF AND AIRCONDITIONER OFF"
02:35 PM z64555: note the usage of OR at the end of the statement
02:36 PM z64555: that tells us how to blend the values together
02:36 PM z64555: From boolean algebra, logical OR is an addition, and logical AND is a multiplication
02:37 PM solol: didn't know they could use boolean as addition/multiplication
02:37 PM z64555: what's 1 * 0
02:38 PM z64555: what's 1 AND 0
02:38 PM z64555: they're both 0
02:38 PM z64555: likewise, 1 + 0 = 1, 1 OR 0 = 1
02:39 PM z64555: 1 + 1 = 2, but in boolean logic 2 is the same as 1
02:39 PM z64555: because its "not 0"
02:40 PM z64555: Ok, so we have our rule table
02:40 PM z64555: we have values that have been put into them
02:41 PM z64555: The rule table says the heater should be 0.5 on, and 0.5 off
02:42 PM z64555: So we output to the heater a value of 0.5, which can then be scaled up/down w/e to adequately power it
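The rule evaluation just described can be sketched end to end. The membership breakpoints are my own invented values, and for simplicity the "warm" rule is expressed implicitly (each appliance is simply driven by the truth value of its own rule, so when nothing is cold or hot, both outputs are 0):

```python
def climate_control(t):
    """Evaluate the chat's fuzzy rule table on a temperature t (Celsius):
    'IF COLD, HEATER ON' / 'IF HOT, AIR CONDITIONER ON'.
    Returns (heater_drive, aircon_drive), each in [0, 1]."""
    # Fuzzify: clamped linear 'curves' with invented breakpoints.
    cold = max(0.0, min(1.0, (25.0 - t) / 10.0))  # 1 at <=15 C, 0 at >=25 C
    hot = max(0.0, min(1.0, (t - 30.0) / 10.0))   # 0 at <=30 C, 1 at >=40 C
    heater = cold   # IF COLD, HEATER ON
    aircon = hot    # IF HOT, AIR CONDITIONER ON
    return heater, aircon

print(climate_control(20.0))  # (0.5, 0.0): half cold, so heater at half power
```

This is the blending z64555 describes: a 0.5-cold reading doesn't toggle the heater, it drives it at 0.5, which downstream hardware scales to an actual power level.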
02:43 PM z64555: Now here's where those weights I was talking about earlier come into play
02:43 PM z64555: You can weight each rule in the table to have greater, or less influence
02:44 PM solol: i don't get how 1 and 0 is 0
02:44 PM z64555: https://en.wikipedia.org/wiki/Truth_table#Logical_conjunction_.28AND.29
02:44 PM z64555: :/
02:45 PM z64555: that's not a great truth table. lemme find another
02:45 PM z64555: https://en.wikipedia.org/wiki/AND_gate
02:45 PM z64555: Just look at the table on the right hand side by the intro
02:46 PM z64555: that's boolean logic, which is also digital logic
02:46 PM z64555: back to weights.
02:47 PM z64555: As I said, you can weight each rule to give them a greater or less influence over the other rules
02:47 PM z64555: Weights are values that are multiplied to the output of the rule
02:47 PM z64555: a weight of 1 means no difference, a weight of 1.2 means 20% more influence
02:48 PM z64555: and a weight of 0.8 means 20% less influence
02:48 PM solol: i thought they were multiplied to the input
02:48 PM z64555: no
02:48 PM z64555: that's the curves
02:48 PM z64555: a different sort of weight
02:49 PM solol: so in a neural net why is there a known input and output already
02:49 PM z64555: "weight" is a general term for a factor that's multiplied into the value of interest
02:49 PM solol: where does deep-learning get its functionality from
02:50 PM z64555: there you go jumping into the ocean again
02:50 PM solol: but you're saying the weight number matters when you're picking one out
02:50 PM solol: and do you use the same weight # across the board or new ones for any given output
02:50 PM z64555: it's a different value
02:51 PM z64555: like a car is a vehicle, and a truck is a vehicle, but a car is not a truck
02:52 PM z64555: OK. now.
02:52 PM z64555: Remember I said that the weights are applied on the rule set values?
02:53 PM z64555: This is after the input has been fuzzified on the curves
02:53 PM z64555: and has just run through the rule sets
02:53 PM z64555: and hasn't been blended together just yet
02:54 PM z64555: The neural net has the ability to change these weights
02:55 PM z64555: A neural net is a collection of inter-connected neurons
02:55 PM z64555: Each neuron is a fuzzy logic inference engine, each with their own set of fuzzification curves and rules
02:56 PM z64555: For simplicity's sake, all of the curves and rules are the same, so you just have different instances of them
02:56 PM z64555: So like you have 5 apples that look the same, but they're not the same apple
02:56 PM z64555: You can poke or slice into one apple, and the others won't be affected
02:57 PM solol: sure
02:57 PM z64555: hm, apples aren't that great of an example
02:57 PM z64555: Anyway
02:57 PM solol: lol
02:57 PM z64555: With the neurons, you link them up sequentially
02:57 PM z64555: or have several neurons tied into one input neuron
02:58 PM z64555: so one input neuron feeds like 2 or 3 neurons, and they feed 2 or 3 more, etc.
02:58 PM solol: so basically you have 5 dots on top of each other, then you connect those 5 dots to one dot to the right
02:58 PM solol: why are they feeding
02:59 PM solol: i didn't watch that part of the video
02:59 PM z64555: "feed" meaning they send their output to the other neuron as an input
02:59 PM solol: ok and what does the engine at each stage really do
03:00 PM z64555: This is still a feed forward system, so nothing really exciting other than spread out the input value to different others
03:00 PM z64555: in a single line of neurons, it applies the same function to the input
03:01 PM z64555: So if a neuron was programmed to, say, multiply a value by 2
03:01 PM z64555: then each neuron in the line will multiply by 2
03:01 PM solol: why would you want more than one line
03:01 PM z64555: so x * 2 * 2 * 2
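The feed-forward chain z64555 describes is just repeated function application. A toy sketch (the doubling neuron and function names are from the chat's example, not any real framework):

```python
def neuron(x):
    """A toy neuron that just doubles its input."""
    return 2 * x

def feed_forward(x, depth):
    """Feed x through a chain of `depth` identical neurons,
    each sending its output to the next one as input."""
    for _ in range(depth):
        x = neuron(x)
    return x

print(feed_forward(3, 3))  # 3 * 2 * 2 * 2 = 24
```

Splitting the input across paths of different lengths, as described next, is what lets one input produce several different intermediate values that can later be recombined.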
03:01 PM solol: err more than one neuron
03:02 PM solol: and was i right that you connected all the neurons to the same neuron which then feeds to 2 or 3 a couple times
03:02 PM solol: what do you feed to multiple neurons for
03:02 PM z64555: So you can do multiple things with one input
03:03 PM z64555: One path could be shorter and give a smaller value
03:03 PM z64555: another path could be longer and give a larger value
03:03 PM z64555: and the paths could intertwine and give an interesting combination
03:03 PM solol: i dont get what things are happening
03:03 PM solol: like how it would move on a chess board or something
03:04 PM solol: how is that figured into a neural net
03:04 PM z64555: you're still in the ocean
03:04 PM z64555: Aside from connecting to other neurons, you can have a neuron feed back to itself
03:04 PM z64555: but instead of feeding it as an input, you can feed it as a weight
03:05 PM z64555: So that future inputs to that neuron will behave slightly differently than before
03:06 PM z64555: And, instead of directly feeding back to itself, you can have a chain or net of neurons to control their inputs
03:06 PM z64555: the weights can thus be seen as a sort of memory
03:06 PM solol: how
03:07 PM solol: they're just a random #
03:07 PM z64555: how what
03:07 PM solol: so you remember what
03:07 PM z64555: The weights "remember" previous inputs
03:07 PM z64555: because, as I just said, a neuron can feed back its output into its weights
03:08 PM z64555: Here's an example formula
03:08 PM z64555: x = 2 + w;
03:08 PM z64555: err, wait, no, not that
03:08 PM z64555: y = 2x + w;
03:08 PM z64555: at time = 0, w = 0;
03:09 PM z64555: then, w = y;
03:09 PM z64555: that's an example of memory, it's recursive
03:09 PM z64555: More applicable is this one:
03:09 PM z64555: y = x * w; w0 = 1
03:09 PM z64555: w = 0.25y;
03:10 PM z64555: then w get back into y
03:11 PM z64555: If x is from unique input, then nothing really happens with w
03:11 PM z64555: But, if x was input from another loop
03:11 PM z64555: then you'll see changes in w
03:11 PM z64555: which thus changes y
03:11 PM z64555: and thus changes x
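The weight-as-memory formulas above (y = x * w, then w = 0.25y, starting from w = 1) can be run as a loop to see the effect: feed the same input in repeatedly and the output keeps changing, because the weight remembers what came before. A sketch under those exact formulas:

```python
def run_feedback(xs, w=1.0):
    """Toy neuron whose output feeds back into its own weight:
    y = x * w, then w = 0.25 * y. Returns the outputs and final weight.
    The weight carries information about past inputs, i.e. memory."""
    ys = []
    for x in xs:
        y = x * w
        w = 0.25 * y  # output fed back as the next weight
        ys.append(y)
    return ys, w

print(run_feedback([2.0, 2.0, 2.0]))  # ([2.0, 1.0, 0.5], 0.125)
```

Identical inputs, three different outputs: that is the "changes over time" behavior solol is asking about.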
03:12 PM SpeedEvil: solol: no, but add motors, and it is
03:12 PM z64555: They're not random numbers, they're procedural
03:13 PM solol: what'd you do here with y = x * w; w0 = 1; w = 0.25y;
03:13 PM solol: why is there two values
03:13 PM solol: well 3 technically
03:13 PM solol: for w
03:13 PM z64555: w0 is the state of w at the beginning
03:14 PM solol: is w weight?
03:14 PM z64555: that 0 is supposed to be an underscore
03:14 PM z64555: yes, w is weight
03:15 PM solol: ok so why do you have two different values of w then
03:15 PM solol: w = .25y and y = x * w
03:15 PM z64555: because w changes over time
03:15 PM solol: changes over time wtf
03:16 PM z64555: So you calculate y first
03:16 PM z64555: using an initial w of 1
03:16 PM z64555: then you calculate a new w from y
03:16 PM solol: why wouldn't you skip that step
03:16 PM z64555: ?
03:16 PM solol: multiplying by 1 is always the same
03:16 PM z64555: would not skip which step?
03:17 PM solol: why don't you skip w0
03:17 PM solol: those would just be apples
03:17 PM solol: or whatever
03:17 PM z64555: Because it's a not-thought-out example
03:17 PM z64555: I don't have these things sitting on a bookshelf, ready to use, ya know
03:18 PM z64555: Anyway
03:18 PM solol: just make me understand how and why values change
03:18 PM z64555: Well, I can't do that
03:18 PM z64555: :P
03:18 PM solol: why do you use random weight values and neurons
03:18 PM z64555: Only you can do that, I just try to explain it
03:19 PM solol: another question i guess would be why does a neural network take inputs and outputs and allow you to guess what other output values would be?
03:19 PM z64555: there you go jumping in the ocean again
03:20 PM solol: well at least compare it to like a chess board or something
03:20 PM z64555: Neural networks can be set up to do quite a number of different things
03:20 PM z64555: a chess board is an insanely more complex level of neural net than you can handle right now
03:20 PM solol: 'can compute any function that a digital computer could'
03:21 PM solol: it's not complex
03:21 PM solol: 1 per piece type
03:21 PM solol: then some for the game
03:21 PM z64555: yes, it is complex
03:21 PM solol: or something similar
03:22 PM z64555: Neural networks have memory in them
03:22 PM z64555: and they can modify that memory
03:23 PM z64555: from a stream of changing inputs
03:23 PM solol: where do they go from 1 to many to 1
03:24 PM z64555: that's just a slice of what all goes on
03:24 PM z64555: It shows that a single neuron can feed multiple neurons
03:24 PM z64555: and multiple neurons can feed into one neuron
03:24 PM solol: that's the big slice it seems to me
03:24 PM z64555: nope
03:24 PM solol: that's where all the memory is etc..
03:24 PM z64555: nope
03:24 PM solol: what do you mean nope
03:24 PM solol: what is happening
03:25 PM z64555: I just told you several paragraphs ago
03:26 PM z64555: value goes in, value goes out, value goes in and modifies the function, which modifies the future values going out
03:26 PM solol: why would you modify though
03:26 PM solol: what are the memories telling us
03:26 PM z64555: because otherwise you wouldn't have a memory
03:26 PM SpeedEvil: And be unable to learn.
03:27 PM z64555: If you had the same weights, then the same inputs will always leave the same outputs
03:27 PM solol: it would only make sense if you calculated somehow what the memory was before just making blind changes
03:27 PM z64555: the network is whats making the changes
03:27 PM z64555: you just give it rules
03:27 PM z64555: and it does its thing
03:27 PM solol: i don't get how to make logic out of it though
03:28 PM solol: how you pick where points would be on a screen or something
03:29 PM z64555: sometimes you just make a blind guess of what the rules and values would be, then you run the network to train its memory for a few hours to see what you get
03:29 PM z64555: then you tweak the weight values a tiny bit
03:30 PM z64555: run it again, etc.
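The guess-and-tweak loop described above can be mocked up as a crude random-search trainer (purely illustrative; real training usually uses gradient descent rather than random tweaks, and the target mapping y = 3x here is an assumption for the demo):

```python
import random

# Toy "network": a single weight, output = w * x.
# We want it to learn the target mapping y = 3 * x by blind
# guessing plus tiny tweaks, keeping any tweak that helps.
def train(samples, steps=1000, seed=0):
    rng = random.Random(seed)
    w = rng.uniform(-1, 1)                          # blind initial guess
    best_err = sum((w * x - y) ** 2 for x, y in samples)
    for _ in range(steps):
        cand = w + rng.uniform(-0.1, 0.1)           # tweak a tiny bit
        err = sum((cand * x - y) ** 2 for x, y in samples)
        if err < best_err:                          # keep it if it helps
            w, best_err = cand, err
    return w

samples = [(1, 3), (2, 6), (3, 9)]
print(round(train(samples), 2))  # converges close to 3.0
```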
03:30 PM solol: yeah i don't get it any more than when i started
03:30 PM z64555: look into PID control systems, that might help, too
03:30 PM solol: but i still want an exoskeleton
03:31 PM z64555: point being, it's not magic, or voodoo, it's just sequential math and logic
03:31 PM z64555: a TON of it.
03:31 PM solol: so they do use logic
03:32 PM solol: like if-then
03:32 PM solol: not sure how you'd program something to form a neural net though
03:33 PM solol: should be like a chess piece can move one place then it forms a memory there and its next moves depend on some function or death
03:33 PM solol: i think a chess board explanation would have been a lot better than your botched attempt z64555
03:33 PM solol: lol
03:34 PM SpeedEvil: Programming neural nets is not understood.
03:34 PM solol: maybe checkers,
03:34 PM SpeedEvil: It's unclear if it will ever be well understood.
03:35 PM solol: the article said it has been around for 70 years
03:35 PM solol: why wouldn't it be well understood?
03:35 PM z64555: Because it's a complex thing that requires a diverse knowledge
03:35 PM SpeedEvil: In that you can start from a network organisation and a problem that is large enough to be non-trivial for the network, and come out with a solution that just works.
03:36 PM SpeedEvil: And requires things that may not be soluble.
03:36 PM z64555: the chessboard is insufficient to explain how a neural net works, because there's a whole lot of things that go on
03:37 PM z64555: If you get a neural net of any decent size, the mathematical formula needed to accurately describe the entire thing would take up a room
03:38 PM z64555: so we instead break it down to the neurons
03:39 PM SpeedEvil: In many ways it's analogous to water flow.
03:40 PM SpeedEvil: At very low information content (change in velocity over distance), it's smooth and its behaviour is well predictable.
03:40 PM SpeedEvil: This is analogous to a neural network with way more neurons than input/outputs, so you can simply code a neuron to an input/output combination, and there is no complexity
03:40 PM z64555: hm, might help if I had my coffee
03:41 PM SpeedEvil: as you increase the information density - the number of things each neuron has to care about from well below 1, the network gets turbulent, with its design for a specific problem getting intractable and hard to predict.
03:42 PM solol: i don't get where all the computing happens
03:43 PM z64555: it happens for each cell. sequentially.
03:43 PM solol: i just know they take a one to many to one approach
03:43 PM solol: where do they get the many side of things working to get everything equal to the output
03:44 PM z64555: at the neuron those many connections go into
03:45 PM z64555: It can be as simple as a sum
03:45 PM z64555: where it just adds all of the values together
03:45 PM z64555: or it can apply a weight to each of the values from their respective neurons (a constant offset added to the sum is called a bias)
03:45 PM z64555: and then sum them
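That weighted-sum-plus-bias combination is all a basic neuron does, and fits in a couple of lines (a minimal sketch; the step-function activation and the example values are assumptions for illustration):

```python
# A neuron combining many incoming values: each input gets its own
# weight, the weighted values are summed, a bias is added, and an
# activation (here a simple step function) decides the output.
def neuron(inputs, weights, bias=0.0):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0   # fire / don't fire

# Three upstream neurons feeding into one:
print(neuron([0.5, -1.0, 2.0], [1.0, 0.5, 0.25]))  # 0.5 - 0.5 + 0.5 > 0 -> 1
```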
03:46 PM solol: yeah i'd have to read a book on it or something
03:46 PM z64555: You'd have to read several
03:46 PM solol: something that puts it all into proper perspective
03:49 PM solol: so a neuron does a calculation for what reason exactly
03:49 PM solol: do you form proper formulas at each neuron or do you just multiply them by random shit
03:50 PM z64555: the formulas are in each of the neurons already
03:50 PM z64555: The neurons can all have the same formula, or different
03:50 PM z64555: for simplicity, stick with the same formula
03:51 PM solol: you would think they use different formulas or the same neuron
03:51 PM SpeedEvil: The formulas are not aimed at the problem.
03:51 PM SpeedEvil: They are aimed at being capable of learning a solution to the problem, given specific inputs.
03:51 PM solol: yeah so what the hell happens
03:52 PM z64555: I tried to tell you on an intrinsic level, but we all know how well that went :P
03:52 PM SpeedEvil: solol: what is your actual question.
03:52 PM SpeedEvil: solol: what answer would be an example of what you want
03:52 PM solol: why does it learn
03:52 PM solol: and what does it learn
03:53 PM SpeedEvil: It learns because it's fed specific training input, and because it's capable of learning that due to its configuration.
03:53 PM SpeedEvil: it learns weights of connections on the neurons that result in a solution to the problem.
03:54 PM solol: making no sense
03:54 PM z64555: alrighty, to the psychology route
03:54 PM z64555: Do you know about pavlov's dogs?
03:55 PM * SpeedEvil drools.
03:55 PM z64555: Lol, did I ring a bell there?
03:56 PM z64555: Anyway, Pavlov was doing some experiments with dogs for something else
03:56 PM z64555: Each time he or his assistants fed them, they'd first ring a bell to signal it was dinner time
03:57 PM z64555: The dogs learned that the bell was the signal to dinner time, and that they were going to get food. So they started to drool at the sound of the dinner bell in expectation of food
03:58 PM z64555: Here, the dogs' brains are neural nets
03:58 PM solol: this still isn't really a neural net
03:58 PM solol: stimulus and response maybe
03:58 PM z64555: It's a macro explanation of neural nets
03:58 PM solol: it's not really
03:58 PM z64555: it is, really
03:59 PM z64555: The training phase of the neural net gives stimuli to the net
04:00 PM z64555: the net makes various associations between the stimuli
04:00 PM z64555: In the case of the dogs, the bell was associated with food
04:01 PM MarkusDBX: I'm looking for an open platform robotic vacuum, is there such a thing?
04:01 PM MarkusDBX: Or in general a home automation robotics open platform?
04:01 PM z64555: Roomba is the first thing that comes to mind
04:01 PM MarkusDBX: well is it open?
04:01 PM z64555: not open platform, but easily obtainable and can be hacked to death
04:02 PM MarkusDBX: do people hack it?
04:02 PM MarkusDBX: easy to hack?
04:02 PM z64555: just google it :P
04:02 PM MarkusDBX: hehe
04:03 PM MarkusDBX: Anyways.. some wifi connected vacuum, with some kind of an interface you could use would be fun.
04:04 PM z64555: solol: So, neural nets are trained with stimuli to get some desired response. Once trained, they can blend stimuli, or cope with missing stimuli, and still get the correct response
04:04 PM z64555: ugh I need coffee
04:04 PM SpeedEvil: z64555: Also, they can think they need coffee.
04:04 PM MarkusDBX: neural nets are also trained by all the humans using facebook =) and such services.
04:04 PM z64555: Pavlov's dogs stopped drooling after awhile when the bells were rung, but no food arrived
04:05 PM MarkusDBX: I bet in the future when people lose everything, they will find that giving up all their behaviour data for free was too low a price.
04:06 PM z64555: meh
04:06 PM MarkusDBX: well without any kind of tinfoil, the future kind of extrapolates into that currently.
04:07 PM z64555: most of human behavior data is junk, anyway
04:08 PM MarkusDBX: sure, but that junk get sorted out eventually
04:09 PM z64555: solol: once a neural net has been trained to do some task, they often cut out most of the feedback loops to make it easier to process
04:09 PM z64555: Similar to what Boston Dynamics does with its walking robots
04:11 PM z64555: each neuron is composed of a transfer function and a modifiable variable, the weight, which is used in the transfer function
04:12 PM z64555: This simple version of neural nets has the outputs summed at the inputs of other neurons
04:12 PM z64555: because the weighting is already done in the neuron itself
04:14 PM z64555: feedback paths are paths from a neuron to itself, either as another input or as a modifier to its weight
04:15 PM z64555: Memory in the neural network is thus in the form of the weight values
04:15 PM solol: i dont get how it takes on random values and makes a learning process
04:15 PM solol: where do you plug in optimal values or something
04:17 PM z64555: You have a monitor system, which is either a person, some logic rules, or another network, that monitors the input and output of the network under test
04:17 PM z64555: and semi-randomly adjusts weights here and there to see their effect on the output
04:18 PM z64555: This is a similar process to PID controller tuning
04:18 PM solol: where does random work though
04:19 PM z64555: Could you rephrase?
04:21 PM z64555: Random weight values are injected into the system when it is not known which direction the output is going to go
04:22 PM z64555: This is before a feedback system, this is a still a feed-forward system that you're more familiar with
04:23 PM z64555: Some neurons or a sub-net of neurons can form a feedback path, where they can train themselves to a small degree
04:24 PM z64555: The monitor system still has authority over these feedback paths, however
04:24 PM z64555: Since you so love the chessboard example
04:25 PM z64555: There's several things going on: an X movement, a Y movement, a Z movement
04:26 PM z64555: You'll need a network that can achieve the kinematics of moving the piece in sequence
04:27 PM z64555: and you'll need a network that can move the arm that's moving the chess piece
04:27 PM z64555: and you'll need a network that can determine where the chess piece should go
04:28 PM z64555: So you work your way up from the easy things that can do feed-forward paths, to the hard things that need a feed-back path
04:29 PM z64555: First is the piece movement rules
04:30 PM z64555: a pawn may move forward one square at any point on the board
04:30 PM z64555: Except if it's at the opponent's side, and cannot move any further
04:30 PM z64555: or if it is blocked by another piece
04:31 PM z64555: So you train the neural net to recognize "Pawn" and bring up a set of valid rules
04:33 PM z64555: The ruleset isn't necessarily within the net itself, it can be an outside subroutine
04:33 PM solol: what is train
04:34 PM solol: you recognize the board
04:34 PM z64555: train. verb. The act of teaching a subject some topic or another
04:35 PM z64555: example "I had to train solol on how to use a calculator"
04:36 PM z64555: Piece and board recognition is at the high level network
04:36 PM z64555: Presumably a network that's working from a camera sensor
04:38 PM z64555: Training that one, you point the camera at a chess piece and tell it it's a pawn, while moving the chess piece through different angles
04:38 PM solol: you're gettin crazy
04:38 PM solol: that's a different neural network entirely
04:38 PM solol: that's not the game, that's the camera
04:38 PM solol: but you could have examined a neural net from a camera or computer screen
04:38 PM z64555: I just said that
04:39 PM z64555: "Piece and board recognition is at the high level network"
04:41 PM z64555: Ok, so you have your pieces, and their movement rules programmed in as a set of rules
04:41 PM z64555: and the neural net has been trained on which one is what
04:42 PM z64555: Next is to train the net on the goals of the game, which is to checkmate the opponent's king
04:43 PM solol: yeah you're skipping some stuff
04:43 PM z64555: So you have just the king on the board, and the net has a full set of pieces
04:43 PM solol: you can quit man
04:43 PM z64555: what am I skipping
04:44 PM z64555: "How do you train a network?"
04:44 PM z64555: You give it a stimulus, then modify its weights until the output matches the desired result
04:46 PM z64555: You can have a network train itself by making it adjust its own weights on a simple yes/no feedback from a monitor system
04:47 PM SpeedEvil: http://solarprobe.jhuapl.edu/ is neat. Solar probe to 9.5 solar radii. 1400C at the front
04:47 PM z64555: Monitor: Here is an apple.
04:47 PM z64555: Network: this is an orange.
04:47 PM z64555: Monitor: No.
04:47 PM z64555: Monitor: Here is an apple.
04:47 PM z64555: Network: this is a fruit.
04:47 PM z64555: Monitor: Yes.
04:47 PM z64555: Monitor: Here is an apple.
04:47 PM z64555: Network: This is an apple.
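The monitor/network exchange above can be mocked up as a tiny guessing loop (purely illustrative; the "network" here is just a dict of label scores nudged by yes/no feedback, standing in for real weight updates):

```python
# Toy monitor/network loop: the "network" scores each candidate label
# and answers with its highest-scoring one; the monitor answers yes/no,
# and a "no" demotes that label's score so a different answer wins next.
def train_labels(truth, labels, rounds=10):
    scores = {name: 1.0 for name in labels}   # initial guesses all equal
    history = []
    for _ in range(rounds):
        guess = max(scores, key=scores.get)
        history.append(guess)
        if guess == truth:                    # Monitor: "Yes."
            break
        scores[guess] -= 0.5                  # Monitor: "No." -> demote
    return history

# Mirrors the apple dialogue: wrong guesses get demoted until it learns.
print(train_labels("apple", ["orange", "fruit", "apple"]))
```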
04:49 PM z64555: At any rate, you're having a difficult time extrapolating what I'm telling you
04:49 PM z64555: and I'm not awake enough anyway :P
04:52 PM solol: you're training by already knowing
04:52 PM solol: yet it learns
04:52 PM solol: i don't get it
04:53 PM Snert_: I'm hungry
04:58 PM z64555: it doesn't already know it
04:58 PM z64555: You or the monitor knows it, but the network doesn't
04:58 PM z64555: Because it hasn't built up its memory yet
05:21 PM jandor: can anyone tell me about ros navigation stack?
05:21 PM jandor: I can read about it for sure, just want to talk to someone
07:57 PM cnnx: hi
08:08 PM ace4016: hi
08:15 PM cnnx: how are you ace4016 ?
08:16 PM cnnx: do you own any sbc?
08:16 PM ace4016: single board computers?
08:16 PM cnnx: yes
08:16 PM ace4016: yea
08:16 PM cnnx: which ones?
08:17 PM ace4016: beaglebone black
08:17 PM cnnx: how does it compare to rpi
08:17 PM ace4016: i guess i do have some launchpads if those count
08:18 PM ace4016: no clue
08:20 PM ace4016: RPI is Cortex A7 based; beaglebone is A8 based...that's about all i can see :P
08:20 PM cnnx: i'd like to find someone on earth who wants to develop a device with me that i can connect wirelessly to
08:20 PM cnnx: like a communication device
08:21 PM cnnx: so lets say you are in the usa
08:21 PM cnnx: and im in canada
08:21 PM cnnx: or europe
08:21 PM cnnx: or whereever
08:21 PM cnnx: find a way to develop a technology to do so
08:21 PM cnnx: that would be a fun challenge :)
08:22 PM ace4016: you mean connect like a HAM radio across the globe or...something else?
08:22 PM cnnx: yeah but without ham
08:22 PM cnnx: like we decide our own technology
08:23 PM cnnx: and we can't need a license for something we develop
08:23 PM cnnx: since it's not regulated/doesnt exist yet
08:23 PM ace4016: so something in the ISM band :P
08:24 PM cnnx: what about use a public satellite in space to relay to/from
08:25 PM ace4016: not sure there are publicly available channels. satcom time is expensive :P
08:26 PM cnnx: well i own an iridium 9555 handheld
08:26 PM cnnx: and i know i need to pay to register on the network
08:26 PM cnnx: with a sim card
08:26 PM cnnx: wondering if you can register without paying
08:26 PM cnnx: legally
08:27 PM cnnx: another option, which is not in my reach yet, is to build a cubesat with a transceiver and send it to LEO
08:27 PM cnnx: but the launch is about 10-15k
08:27 PM cnnx: and will fall out of orbit in a few months after
08:28 PM ace4016: but it'll be cool to say you had a satellite in space :P
08:28 PM cnnx: yes
08:29 PM cnnx: are you in north america?
08:30 PM ace4016: yes
08:30 PM cnnx: there's something called lora
08:30 PM cnnx: it stands for long range communications
08:31 PM cnnx: its hardware made for something like this
08:31 PM cnnx: but i think it requires a line of sight
08:31 PM ace4016: i've got no interest in creating a communications system though
08:31 PM cnnx: okay
08:31 PM cnnx: sorry to bother you
08:31 PM ace4016: most communication is line of sight
08:31 PM cnnx: what are your interests?
08:32 PM ace4016: satellites happen to have a large sight range :P
08:32 PM ace4016: aviation, robotics (figures eh?), material science
08:33 PM cnnx: ace4016: any interest in flight simulators? like x-plane?
08:33 PM cnnx: or vatsim network?
08:35 PM ace4016: i like flight sims :P
08:36 PM cnnx: cool
08:36 PM cnnx: i built a nitro 50 size rc helicopter
08:36 PM cnnx: Vibe 50 NEX
08:36 PM cnnx: by JR
09:13 PM z64555: y no broad spectrum HF
09:15 PM z64555: blah, I forget the name of it
10:51 PM anniepoo: boing
10:53 PM mrdata: z64555, what is the fraction of a wavelength that represents the band width of broad spectrum signals in the microwave region, say?
10:53 PM mrdata: let's choose 2.4GHz
10:53 PM mrdata: and use the bandwidth of typical wifi
10:56 PM anniepoo: mrdata, that was directed at z, he doesn't seem to be answering
10:57 PM anniepoo: would you like an answer?
10:57 PM z64555: hmmm, I forget. I remember that the sinc function is used to represent the influence of a signal across frequencies, but I forget the simplified fraction
10:57 PM anniepoo: nvm - hi Z!
10:57 PM z64555: yeah I'm here, sometimes
10:58 PM z64555: the bandwidth of a single signal is also affected by the quality of the transmitter
10:58 PM z64555: poorer quality transmitters have larger bandwidths
10:59 PM mrdata: ok so 2.4 GHz is 2.4 x 10^9 cycles per second; and 11 Mbps is 11 million bits per second, so to get 1 bit thru, you consume around 218 cycles
10:59 PM mrdata: now translate that down to HF
10:59 PM z64555: ah
11:01 PM mrdata: let's say we're on 1 MHz, (broadcast radio); then a similar fraction would be 4.6 kbps
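mrdata's scaling arithmetic checks out and can be written out directly (the 218-cycles-per-bit figure is just the 2.4 GHz / 11 Mbps ratio from 802.11b Wi-Fi, carried down to a 1 MHz carrier):

```python
# Carrier cycles consumed per bit at 2.4 GHz Wi-Fi speed (802.11b,
# 11 Mbps), then the data rate the same cycles-per-bit ratio would
# give on a 1 MHz broadcast-band carrier.
carrier_hz = 2.4e9
bitrate_bps = 11e6
cycles_per_bit = carrier_hz / bitrate_bps
print(round(cycles_per_bit))        # ~218 cycles per bit

hf_carrier_hz = 1e6                 # broadcast-radio example
hf_bitrate = hf_carrier_hz / cycles_per_bit
print(round(hf_bitrate / 1000, 1))  # ~4.6 kbps
```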
11:01 PM z64555: See, GHz radios don't fully utilize the spectrum because they can't switch fast enough to generate the signal
11:01 PM z64555: The same can be said for the HF and VHF, but it's not the same formula
11:01 PM z64555: You can get more bps per cycle at the lower frequencies
11:02 PM z64555: That's not saying you can get 1 bit per cycle, though
11:02 PM z64555: nor is it saying you can get a whole lot of data through
11:03 PM mrdata: sure; the different modes have different formulae
11:03 PM mrdata: but the overall spectrum is narrower, too
11:04 PM mrdata: at HF than it is at microwave
11:05 PM mrdata: but then,
11:05 PM mrdata: look at the deep space communications network
11:05 PM mrdata: DSN is very low data rate but microwave frequency
11:05 PM mrdata: very large dishes
11:06 PM mrdata: long distances
11:07 PM z64555: and a ton of error checking algorithms
11:08 PM z64555: Since the receiver is in deep space, there needs to be some way of distinguishing a signal from cosmic radiation and other polluting signals
11:09 PM z64555: I sadly haven't been keeping up with that tech, though
11:10 PM z64555: Now back to the HF broad spectrum idea. I was in no way suggesting you'd get something like a Mbps connection with HF
11:11 PM z64555: Even if you swamped the entire HF band you'd be lucky to get a data exchange of 28Kbps
11:12 PM z64555: well, maybe not that drastic
11:12 PM z64555: I mean, 28Kbps is pretty good for audio carrier
11:12 PM z64555: You're going to make me do math, aren't you :P
11:19 PM * anniepoo pokes z with a stick and makes him do math
11:20 PM rue_bed: http://www.gripinit.com/wp-content/uploads/2015/02/cyborg_gaming_mouse_robot_technology_hd-wallpaper-1215936.jpg
11:20 PM rue_bed: huh, has a nose
11:23 PM anniepoo: social robotics feature.
11:23 PM anniepoo: robot to the left of me has a nose
11:23 PM anniepoo: (big hugs Elmo)
11:37 PM gottaname|wurk: anniepoo, cute robots!
11:42 PM z64555: ah, here we go, nyquist rate
11:42 PM z64555: theoretical maximum number of symbols that can be transferred
11:42 PM anniepoo: testing
11:43 PM z64555: hi
11:45 PM * anniepoo wonders if this is working
11:45 PM anniepoo: 8cD
11:45 PM * anniepoo dances around the Nyquist noise floor
11:45 PM z64555: I can see your text, anniepoo
11:46 PM anniepoo: thanks
11:46 PM anniepoo: for some reason it wasn't sending slash command
11:47 PM z64555: If the essential frequency range is limited to B cycles per second, then 2B is the maximum number of symbols per second that could be sent and received, assuming peak interference is less than half a quantum step
11:47 PM z64555: I think the quantum step is the period of one cycle
11:51 PM z64555: ohh, that might be why the DSN has a slow data transfer, needing time to energize the element to a point where it can distinguish a 1 and a 0