#robotics | Logs for 2017-01-27

[00:01:22] <rue_house> If I live to 8000 and meet alien races I will tell your story.
[00:02:51] <rue_house> ... can't assure you what I'll remember at 8000, but I'll try to make it good
[00:03:55] * rue_house piles supper into finned-mounds to try to get it to cool faster
[00:04:12] <rue_house> "IS SUPPER COLD __YET__???"
[00:06:17] <anniepoo> well, I have video streaming in a web page on my desktop
[00:06:30] <anniepoo> but haven't got it forwarding out to the world yet
[00:06:36] <rue_house> how's the latency?
[00:06:58] <anniepoo> well, I'm on the local gigabit network
[00:07:16] <rue_house> encoder and decoder
[00:07:27] <rue_house> between them, you can get as many as 100 frame buffers
[00:07:31] <anniepoo> it's mostly driven by the speed of the crummy javascript thing grabbing jpgs
[00:07:39] <rue_house> ah
[00:07:51] <theBear> grabbing jpegs don't really qualify as streaming video <grin>
[00:07:52] <anniepoo> it's about 100ms I'd guess
[00:08:08] <anniepoo> mjpg is just a series of jpgs
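[That "series of jpgs" has a simple wire format: each frame is one part in a `multipart/x-mixed-replace` HTTP response. A minimal sketch of the framing, assuming the frames are already JPEG-encoded bytes; the boundary string and function name are made up for illustration:]

```python
# Hypothetical MJPEG framing helper: wraps one JPEG frame in the
# multipart/x-mixed-replace chunk a browser <img> tag can consume.
BOUNDARY = b"frameboundary"  # illustrative boundary string

def mjpeg_chunk(jpeg_bytes: bytes) -> bytes:
    """Return one multipart part containing a single JPEG frame."""
    return (b"--" + BOUNDARY + b"\r\n"
            b"Content-Type: image/jpeg\r\n"
            b"Content-Length: " + str(len(jpeg_bytes)).encode() + b"\r\n"
            b"\r\n" + jpeg_bytes + b"\r\n")
```

[A server just sends a `Content-Type: multipart/x-mixed-replace; boundary=frameboundary` header once, then emits one such chunk per captured frame.]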
[00:08:18] <rue_house> that's pretty good, I wasn't able to get under 2 seconds
[00:08:37] <anniepoo> bet it gets much worse if I go to the outside world
[00:08:52] <anniepoo> I don't really want to have a webbot
[00:09:09] <theBear> i know what mjpg looks like, i was talkin more about the grabbing side of things tho, and broad terms ain't much practical use anyway :)
[00:09:39] <theBear> i don't really wanna have a ouchey elbow, but we don't all get everything we don't want
[00:10:06] <anniepoo> hmm.... basically, I'm just poking around with the video streaming
[00:10:17] <anniepoo> in the end, I want video
[00:11:00] <anniepoo> picamera --> python --lan--> prolog --> opencv
[00:11:31] <anniepoo> 8cD my day job's writing Prolog
[00:11:43] <theBear> why the middle 3 ?
[00:12:00] <theBear> and why are 2 of them such strange choices ?
[00:12:40] <anniepoo> well, picamera is a library I don't intend to rewrite, and it's in python, so that's the first two
[00:12:46] <theBear> and doesn't opencv kinda negate the need for prolog
[00:12:50] <anniepoo> the lan is the lan
[00:12:58] <theBear> oh, i thought that was the name of some tiny camera
[00:13:17] <anniepoo> no, it's the camera lib for python on the pi
[00:13:28] <anniepoo> prolog's not a 'strange choice' to me
[00:13:47] <anniepoo> and allows me to reason about what expensive opencv operations are needed this frame
[00:13:58] <theBear> hmmm... way i see it, if you using opencv, you just wanna get the raw camera data to it as directly and simply as possible
[00:14:07] <anniepoo> true
[00:14:26] <theBear> mmm, and that last bit, that makes a lotta sense when it isn't in a text based 1d diagram :)
[00:14:53] <anniepoo> I could, I suppose, write some server in C++, but that sounds like some degree of pain
[00:15:25] <theBear> and i dunno what that picam lib does, but python is SLOW, and that probly ain't what you want
[00:15:51] <anniepoo> this is an IO bound situation
[00:15:56] <anniepoo> 8cD
[00:16:35] <theBear> wtf ? how slow is your cam interface, or is the output silly-fast + hi-res ?
[00:16:55] <anniepoo> so I'm not worried about the python interpreter
[00:17:25] <theBear> mmm, you know that io bound still doesn't excuse excessive latency
[00:18:30] <theBear> and as far as getting video from one place to another, if the native/inherent bits that come with the cam driver/api etc aren't enough, there are a bunch of super simple and efficient/fast things that can handle that unmodified
[00:19:49] <anniepoo> I'll preprocess on the pi to reduce the size of data. I'm sending a foveal image and a reduced sized peripheral image
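[The foveal/peripheral split described above can be sketched in plain Python: a full-resolution crop around a point of interest, plus a decimated copy of the whole frame. Names, the radius, and the decimation step are illustrative; a real version would crop and downscale with picamera/OpenCV before sending:]

```python
def split_foveal(frame, cx, cy, r, step=4):
    """frame: 2D list of pixel values (rows of columns).
    Returns (fovea, periphery): a full-res crop of radius r around
    (cx, cy), and a reduced-size copy keeping every `step`-th pixel."""
    fovea = [row[max(cx - r, 0):cx + r] for row in frame[max(cy - r, 0):cy + r]]
    periphery = [row[::step] for row in frame[::step]]
    return fovea, periphery
```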
[00:20:33] <theBear> hmm, you know a lot of words i don't know
[00:20:35] <anniepoo> still deciding exactly what to send
[00:21:07] <anniepoo> the Bayer data is available. That's very tempting. I'll try sending it, and if the link keeps up, I'll do that
[00:21:40] <anniepoo> then I can control the color processing
[00:21:48] <theBear> does opencv have to be on another host ?
[00:21:58] <anniepoo> yes, I think so
[00:22:58] <anniepoo> I'm wanting to do some Haar classification
[00:23:33] <theBear> hmm, now you just trying to make me feel dumb <grin>
[00:23:58] <anniepoo> I did the CV chain for the R-25
[00:24:27] <anniepoo> http://www.robokindrobots.com/
[00:24:48] <anniepoo> and now do other stuff that's related, but I can't talk about
[00:26:35] <anniepoo> 8cD I'm a committer on SWI-Prolog
[00:26:51] <anniepoo> so that's a natural way for me to go for the server
[00:27:05] <anniepoo> and I don't NEED every bit of the CV chain running every frame
[00:27:45] <anniepoo> eg if I identify Annie in a frame, and find a face in the next frame with a more rapid algorithm
[00:27:56] <anniepoo> in about the same place
[00:28:00] <anniepoo> it's probably Annie
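[That reuse-the-identity idea — run the expensive classifier once, then let a cheap detector plus a proximity check carry the label forward — can be sketched as a simple gate. The box format (x, y, w, h) and the shift threshold are assumptions for illustration:]

```python
def same_track(prev_box, new_box, max_shift=30):
    """Cheap temporal-coherence gate: if the fast detector's box is
    near last frame's identified box, reuse that identity instead of
    rerunning the expensive classifier. Boxes are (x, y, w, h)."""
    (px, py, pw, ph), (nx, ny, nw, nh) = prev_box, new_box
    pcx, pcy = px + pw / 2, py + ph / 2  # previous box center
    ncx, ncy = nx + nw / 2, ny + nh / 2  # new box center
    return abs(pcx - ncx) <= max_shift and abs(pcy - ncy) <= max_shift
```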
[00:28:36] <anniepoo> I mostly want to find obstacles, objects and people
[00:28:58] <anniepoo> I've been fiddling with a PIR detector of the 'motion detector' sort
[00:29:26] <theBear> yeah, i might not know the modern fancy words for it, but i always been good at normalisation and that kinda stuff in programming land, and i'm on top of this kinda thing
[00:29:34] <theBear> conceptually anyway :)
[00:29:42] <anniepoo> I'm pretty convinced putting one behind an aperture and restricting its FOV
[00:29:55] <anniepoo> and scanning is a more reliable way to find humans
[00:30:17] <anniepoo> and a range sensor is a better way to find obstacles
[00:30:46] <anniepoo> and most object location is gamut detection
[00:31:34] <anniepoo> for localization I may end up using some fiducials
[00:31:59] <theBear> for proper/good navigating/getting-around a combination of things like range finding/radar-ing and cv will generally get a much better combo of efficiency and effectiveness
[00:32:39] <theBear> you just kinda lean on the strengths of all your options/sources and use those to cover/avoid the weaknesses of others
[00:32:54] <anniepoo> There's a cool paper on creating fiducials that also read as normal drawings
[00:33:07] <theBear> heh, fuzzy logic it up a bit, late 80's buzzword style
[00:33:07] <anniepoo> yes, this is sensor fusion
[00:33:39] <theBear> i've had some pretty kickass asian/jap-fusion a couple times over the years
[01:12:54] <anniepoo> OK, hmm....
[01:13:17] <anniepoo> rather than guess at latencies and performance issues a priori
[01:13:54] <anniepoo> I'm thinking I'll make a bunch of small servers that do individual tasks
[01:14:18] <anniepoo> I can run them on whichever machine, and reconfigure things pretty fast
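[One way to keep such small task servers relocatable is a single service map that clients resolve at connect time, so moving a task to another machine is a one-line config change. Hosts, ports, and names below are placeholders:]

```python
# Hypothetical service map: which host:port runs each small task server.
# Editing this one dict is all it takes to rehome a task.
SERVICES = {
    "camera": "pi.local:5001",       # placeholder host/port
    "cv":     "desktop.local:5002",  # placeholder host/port
}

def endpoint(name, services=SERVICES):
    """Resolve a task name to a (host, port) pair a client can connect to."""
    host, port = services[name].rsplit(":", 1)
    return host, int(port)
```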
[01:19:19] <rue_bed> arg, I need to start making heads for me
[01:19:31] <rue_bed> I need head-making experience
[01:20:19] <rue_bed> I need something stern but not too serious
[01:20:39] <rue_bed> hmmmm
[01:22:14] <rue_bed> I suppose if I put all the 3d printers in the back livingroom, I can put the cnc machines in the shop
[01:22:35] <rue_bed> I should start assembling the 3rd 3d printer soon
[01:27:53] <anniepoo> heads for printers or human heads?
[01:32:00] <anniepoo> 8c/ I gotta commit to a sensor suite
[01:47:47] <rue_bed> cyborg/droid heads
[01:48:02] <rue_bed> I suspect there will be a cyborg stage
[01:48:10] <rue_bed> lets hope its brief
[03:26:31] <Jak_o_Shadows> sup
[03:36:29] <theBear> rue_bed, feel free to base a head on mine, it won't be too serious, but not very stern either, perhaps not at all :)
[03:37:58] <theBear> then again, i should mention that i could never in good conscience encourage anyone to even get a taste of what it's like inside my head... sure i grown quite fond of it in here, but it ain't for general consumption, and i think even "acquired taste" is far from accurate to describe it
[03:38:14] <theBear> it does look pretty damned good tho
[03:39:05] <theBear> plus it's extremely trauma and tear resistant, which may prove useful during the cyborg stage of the revolution :)
[07:42:12] <tsglove> o/
[09:28:48] <rue_house> maybe something more motorbike helmet
[09:29:17] <rue_house> arg, I been fighting off a cold for a while now, but I think last night it got me
[09:39:38] * z64555 makes sure to take his vitamins
[11:47:09] <veverak> damn it
[11:47:13] <veverak> conferences are tiring
[11:57:03] <deshipu> devconf?
[11:57:12] <veverak> yep
[11:57:19] <deshipu> that's just the first day
[11:57:19] <veverak> deshipu: got booth for our hackerspace here
[11:57:26] <veverak> exactly
[11:57:34] <deshipu> but the party will be fun :)
[11:57:35] * veverak has had a really tiring week
[11:57:37] <veverak> hope so ;)
[11:57:45] <veverak> deshipu: a lot of people like my Tote
[11:58:15] <deshipu> yeah, give a programmer something physical
[11:59:03] <veverak> :D
[12:09:19] <veverak> wonder if it's worth it trying to force filesystem on 512k micropython
[12:22:31] <deshipu> probably not
[12:22:43] <deshipu> you can just replace the memory chip on that module
[12:22:52] <deshipu> they are a couple of cents a piece
[12:24:03] <veverak> O_o
[12:26:00] <veverak> hmm
[12:26:07] <veverak> for esp 07 it means opening the cae?
[12:26:08] <veverak> *case
[12:31:59] <deshipu> ah, if it has a shield, then yes
[12:32:15] <deshipu> s/shield/shielding
[18:18:39] <rue_house> haha I implemented a stack in BASIC before I even knew what a stack was
[18:19:50] <rue_house> oo I'm gonna try to write recursive code
[18:20:03] <rue_house> I hate temporal inversion
[20:14:33] <anniepoo> rue, pocketsphinx or ship it offboard?
[20:14:40] <anniepoo> RPi3
[20:16:56] <anniepoo> board's doing a bunch of other stuff- notably preprocessing of camera data to compact before moving offboard
[20:18:06] <anniepoo> I could move the raw voice data offboard
[20:18:19] <anniepoo> I could also use stand off sensing with a couple kinects
[20:20:38] <anniepoo> howdy
[20:24:55] <anniepoo> 8cD doing architectural work on the snail software
[20:41:42] <lonecrow> yo Rue we are meeting in Sechelt at 6:30
[20:43:11] <anniepoo> 8cD whats the meeting?
[23:34:56] * rue_bed is ill