#robotics | Logs for 2016-05-31

[00:01:12] <wildmage> Tom_itx, how are things?
[00:01:27] <wildmage> Tom_itx, i haven't been on in a long while
[00:01:51] <wildmage> Tom_itx, i've not been doing robot stuff for a while
[00:10:12] <Tom_itx> me either but stuck around here
[00:18:13] <rue_shop3> mrdata, microcontroller outputs
[00:18:40] <rue_shop3> wildmage, howdy
[00:19:02] <wildmage> rue_shop3, hi
[00:19:17] <wildmage> rue_shop3, what's going on at the rue compound?
[00:19:21] <rue_shop3> sorry, channel is in late stages of life
[00:19:28] <rue_shop3> robots for faire
[00:19:54] <rue_shop3> https://www.youtube.com/watch?v=j4uz_yha9uI
[00:20:22] <rue_shop3> it loses about 2% of the marbles
[00:20:29] <rue_shop3> randomly of course
[00:24:16] <wildmage> rue_shop3, is this a maker faire?
[00:24:53] <wildmage> what sensors are you using to detect the marble?
[00:25:20] <wildmage> that's a nice custom arm
[00:36:36] <rue_house> its arm8
[00:36:45] <rue_house> it was succeeded by arm9
[00:36:54] <rue_house> yup maker faire
[00:37:02] <rue_house> I'll be at VMMF in 2 weeks
[00:37:08] <rue_house> I need to get a show together
[00:37:10] <rue_house> :)
[00:37:13] <Anniepoo> that's actually a nice way to test an arm
[00:37:27] <rue_house> yar, it kinda wore out
[00:37:35] <rue_house> arm9 is gonna do that for the next faire
[00:37:45] <rue_house> and arm8 needs a new SG90 on its gripper
[00:38:03] <rue_house> one of the synchro belts, I THINK, slipped a tooth
[00:38:12] <rue_house> and a gear drive is coming loose
[00:38:26] <rue_house> a faire is like 400x the normal amount of use those get
[01:19:50] <[VEGETA]> http://answers.ros.org/question/235639/point-cloud-object-tracking/
[01:28:15] <theBear> fairies don't have teeth, they collect them
[01:29:18] <theBear> that's you ? course there is, 2 cameras with some amount of distance between where they looking from, kinda like the glasballs in your face right now, then pick an object and pythagorus that shiz up, yo
[01:34:26] <z64555> figuring out what is an object just from comparing two images is tough. eyeballs also have the ability to change convergence
[01:34:41] <z64555> which really helps figuring distance
[02:13:54] <theBear> cameras can be moved
[02:14:35] <theBear> and if something is moving, you effectively got a ton more data to math with instantly as well
[02:51:38] <deshipu> I still think the most foolproof way is to just bump into it
[02:55:52] <rue_house> drive by feel, thats what the boss says
[04:37:23] <[VEGETA]> back now..
[05:50:22] <SpeedEvil> what? changing convergence doesn't help at all in determining distance.
[05:50:44] <SpeedEvil> Only if you can't multiply a frame by another frame cheaply.
[05:51:45] <SpeedEvil> offsetting in x pixels the images does exactly the same as altering convergence.
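The disparity-offset point above comes down to the standard stereo depth relation: for rectified cameras, depth falls out of focal length, baseline, and pixel disparity. A minimal sketch (the function name and units are illustrative, not from the discussion):

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from the pixel disparity between two rectified views.

    focal_px:     camera focal length, in pixels
    baseline_m:   distance between the two camera centres, in metres
    disparity_px: horizontal pixel offset of the same feature in both images
    """
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the views")
    return focal_px * baseline_m / disparity_px
```

Shifting one image sideways by k pixels just subtracts k from every disparity, which is why it is equivalent to changing convergence.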
[05:52:11] <SpeedEvil> The ebay '3d' lenses for smartphones look interesting.
[05:52:23] <SpeedEvil> Little dual periscopes - to avoid another cam and sync issues
[05:53:48] <[VEGETA]> well...
[05:54:13] <[VEGETA]> this is my question: http://answers.ros.org/question/235639/point-cloud-object-tracking/
[05:54:30] <[VEGETA]> speedevil, are you responding to me?
[06:08:30] <theBear> SpeedEvil, there was an htc phone with both "3d camera" and "3d screen" a long time ago now, well, ya know, in android terms.. it was surprisingly convincing... never did get around to finding out how the screen worked
[06:48:29] <SpeedEvil> I was responding to the silly comment from <z64555> figuring out what is an object just from comparing two images is tough. eyeballs also have the ability to change convergence
[06:48:39] <SpeedEvil> [VEGETA]:
[06:48:45] <SpeedEvil> theBear: yeah.
[06:49:00] <[VEGETA]> ?
[06:49:14] <[VEGETA]> speedevil, your line is empty :(
[06:49:37] <SpeedEvil> theBear: I was more meaning that instead of trying to work out how to get two cameras on your robot, frame-locked so movement doesn't screw you, you could just buy a six quid 'periscope' from ebay and attach to the pi camera, for example.
[06:49:52] <SpeedEvil> [VEGETA]: It is a common way to refer someone to the lines immediately above
[06:50:48] <[VEGETA]> speedevil, aha xD. I thought you destroyed my idea *__*
[06:53:28] <SpeedEvil> [VEGETA]: If you are asking specific ways to accomplish a task on a platform, you're going to get way fewer responses than asking general questions.
[06:53:50] <[VEGETA]> speedevil, what do you think about determining dynamic obstacles by analysing 3d point clouds to segment related clusters
[06:54:32] <[VEGETA]> then updating the 3d map and 2d occupancy map accordingly
[06:54:38] <SpeedEvil> For flying objects, that is the easiest way
[06:54:54] <SpeedEvil> For flying objects, there is no concern about detaching them from the environment
[06:54:54] <[VEGETA]> no, for ground objects mainly
[06:55:14] <SpeedEvil> For ground objects, you need to work out where to cut from the ground clutter, which can cause issues.
[06:55:43] <[VEGETA]> so basically i need to segment ground first
[06:55:53] <[VEGETA]> then segment dynamic obstacles
[06:55:56] <[VEGETA]> right?
[06:56:04] <SpeedEvil> Something like that.
[06:56:26] <SpeedEvil> Or if you have a map - correlate the map to the point-cloud to work out ground.
[06:56:30] <[VEGETA]> I saw a very simple code example in a good ROS book that isolates ground
[06:57:18] <SpeedEvil> Isolating ground is easy if you can say things about it.
[06:58:03] <SpeedEvil> For example 'radius of curvature exceeds 3m' AND 'no more than 20 degrees from horizontal'
[06:58:19] <[VEGETA]> it is in "learning ros for robotic programming - 2nd edition" book... chapter about point cloud
[07:02:02] <[VEGETA]> here is what the book said
[07:02:15] <[VEGETA]> In this example, we are going to show how to perform model-based segmentation of
[07:02:15] <[VEGETA]> a point cloud. We are going to constrain ourselves to a planar model, which is one of
[07:02:15] <[VEGETA]> the most common mathematical models you can usually fit to a point cloud. For this
[07:02:15] <[VEGETA]> example, we will also perform the model estimation using a widespread algorithm
[07:02:15] <[VEGETA]> called RANdom SAmple Consensus (RANSAC), which is an iterative algorithm
[07:02:15] <[VEGETA]> capable of performing accurate estimations even in the presence of outliers.
[07:02:20] <[VEGETA]> __________
[07:04:08] <[VEGETA]> speedevil, here is the code: https://github.com/AaronMR/Learning_ROS_for_Robotics_Programming_2nd_edition/blob/hydro-devel/chapter6_tutorials/src/pcl_planar_segmentation.cpp
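The book's example is C++/PCL; the same RANSAC plane-segmentation idea can be sketched in plain NumPy (all names and thresholds here are illustrative, not the book's code):

```python
import numpy as np

def ransac_plane(points, threshold=0.02, iterations=200, seed=0):
    """Fit a plane (normal, d) with normal.p + d = 0 to a point cloud via RANSAC.

    Repeatedly fits a plane to 3 random points and keeps the model
    with the most points within `threshold` of it.
    """
    rng = np.random.default_rng(seed)
    points = np.asarray(points, dtype=float)
    n = len(points)
    best_model, best_inliers = None, None
    for _ in range(iterations):
        sample = points[rng.choice(n, 3, replace=False)]
        # plane normal from the two edge vectors of the sampled triangle
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (collinear) sample, try again
            continue
        normal /= norm
        d = -normal.dot(sample[0])
        inliers = np.abs(points @ normal + d) < threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_model, best_inliers = (normal, d), inliers
    return best_model, best_inliers
```

Because the winning plane only needs one all-inlier 3-point sample, it stays accurate even with a fair fraction of outliers, which is the property being described above.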
[07:05:36] <SpeedEvil> 'to a planar model' - yes - fitting to a plane is really, really simple.
[07:05:56] <SpeedEvil> Not all ground surfaces are flat though, which considerably makes the problem harder
[07:06:31] <[VEGETA]> aha
[07:06:47] <[VEGETA]> but indoor environments are nearly planar
[07:07:40] <[VEGETA]> if we assume ground is not planar, what can be done?
[07:08:57] <deshipu> we are doomed
[07:11:20] <SpeedEvil> [VEGETA]: you either correlate it to your known map, or you segment the world based on rules about what you can do with it.
[07:11:34] <SpeedEvil> For example 'radius of curvature exceeds 3m' AND 'no more than 20 degrees from horizontal'
[07:11:56] <SpeedEvil> This would get you smooth surfaces that can be driven around in a wheelchair, for example.
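The slope half of that rule is easy to sketch, assuming per-point surface normals have already been estimated (e.g. with PCL's normal estimation); the function name and 20-degree default are illustrative:

```python
import numpy as np

def slope_ok(normals, up=(0.0, 0.0, 1.0), max_tilt_deg=20.0):
    """Mask of points whose local surface normal is within max_tilt_deg of 'up'.

    normals: (N, 3) array of surface normals, one per point
    """
    normals = np.asarray(normals, dtype=float)
    up = np.asarray(up, dtype=float)
    up /= np.linalg.norm(up)
    # take |cos| so a flipped normal (pointing downward) is treated the same
    cos_tilt = np.abs(normals @ up) / np.linalg.norm(normals, axis=1)
    return cos_tilt >= np.cos(np.radians(max_tilt_deg))
```

Points passing this mask (and a curvature check) would be candidate drivable ground; everything else is clutter or obstacle.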
[07:12:39] <[VEGETA]> so you mean by your example is to make any 20 degree surface as if it is 0 degree = ground
[07:12:58] <theBear> SpeedEvil, oh i know, if you go back even more you see me casually "answer the whole thing" right at the beginning
[07:12:59] <SpeedEvil> No, from horizontal as measured by onboard sensors
[07:13:08] <SpeedEvil> theBear: ah
[07:13:41] <[VEGETA]> speedevil, yes i mean 20 degrees taken from the horizon.
[07:13:53] <theBear> SpeedEvil, i suspect like you, when these things are simple, i feel no need to make them seem complex :)
[07:13:56] <[VEGETA]> cuz 0 degrees is actually the pure ground
[07:14:50] <theBear> the umm, have we mixed up ground and height, or er, is our robot just woke up from passed out with his eyeballs center leveled to the floor ?
[07:14:52] <[VEGETA]> but what is "radius of curvature"?
[07:15:20] <theBear> portion of a cicle
[07:15:22] <theBear> cirle
[07:15:25] <theBear> dammit
[07:15:27] <theBear> circle
[07:15:40] <SpeedEvil> [VEGETA]: no, not from the horizon. From accelerometers, for example.
[07:15:42] <theBear> or radius of it, depending context that i too rushing to look at
[07:16:04] <SpeedEvil> [VEGETA]: how curved the ground is.
[07:16:20] <[VEGETA]> speedevil, well, i won't use accelerometers... only kinect sensor with its readings
[07:16:33] <[VEGETA]> that builds the 3d map via point clouds
[07:17:39] <[VEGETA]> you know, like octomap for example. i've seen videos of it constructing 3d map
[07:18:34] <deshipu> then you have no way of knowing which way down is
[07:18:58] <deshipu> even humans have accelerometers in their ears
[07:19:11] <[VEGETA]> deshipu, why is that? the robot will have odometry
[07:19:27] <deshipu> [VEGETA]: how would you know?
[07:19:31] <[VEGETA]> kinect can generate visual odometry
[07:19:43] <deshipu> so what
[07:19:54] <deshipu> you may be in a room that is slanted slightly as a whole
[07:20:02] <deshipu> no visual information will tell you where down is
[07:20:09] <deshipu> unless you observe something falling
[07:20:31] <[VEGETA]> but not all robots have IMU, how do they know ground?
[07:21:06] <deshipu> most just assume that whatever they stand on is the ground
[07:21:11] <[VEGETA]> isn't 3d map data enough?
[07:21:28] <deshipu> no, because you can rotate it arbitrarily
[07:21:32] <[VEGETA]> ^ yes, that assumption works in indoor environments
[07:21:57] <SpeedEvil> ^not built by Escher
[07:22:26] <deshipu> I remember a nasty practical joke, that involved gluing furniture to the ceiling or a wall in a room where a person is sleeping
[07:22:41] <SpeedEvil> That's not very nasty.
[07:22:43] <deshipu> when they wake up, they make funny movements to avoid falling down
[07:23:40] <deshipu> [VEGETA]: in particular, if your robot stands on a ramp, it doesn't know if the ramp is level and all the rest of the floor is angled, or the other way around
[07:24:25] <deshipu> you can of course make assumptions -- assume that the largest flat surface is level, for instance
[07:24:51] <deshipu> but it doesn't always work
[07:24:54] <[VEGETA]> so basically using an accelerometer makes it easier in general and more accurate
[07:25:16] <deshipu> well, if you have accelerometer, you know which way the gravity points
[07:25:31] <deshipu> of course, that only works when you are on the surface of a planet
[07:25:43] <[VEGETA]> lol where else will i be
[07:25:52] <deshipu> we use robots in space, you know
[07:25:57] <deshipu> we sent some to Mars
[07:25:59] <deshipu> etc.
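Getting the tilt deshipu mentions out of a static accelerometer reading is a one-liner: at rest the sensor reads pure gravity, so the angle between the reading and the sensor's z axis is the tilt. A minimal sketch (names are illustrative):

```python
import math

def tilt_from_accel(ax: float, ay: float, az: float) -> float:
    """Tilt in degrees between the sensed gravity vector and the sensor z axis.

    Only valid when the robot is at rest (or moving at constant velocity),
    so the accelerometer reading is gravity alone.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        raise ValueError("no acceleration sensed (free fall or bad reading)")
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))
```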
[07:27:03] <[VEGETA]> so if accelerometer says it is 15 degrees... and radius is 2m -> it is ground or walkable
[07:27:51] <deshipu> unless it's very slippery
[07:28:05] <deshipu> or water
[07:28:21] <deshipu> or long grass
[07:28:47] <[VEGETA]> fine, I won't need that since it is in indoor
[07:29:00] <deshipu> there are indoor pools :P
[07:29:10] <deshipu> and soft carpets
[07:29:21] <[VEGETA]> now, the degree from horizontal is known.. what is radius of curvature here?
[07:29:51] <deshipu> I think you can just focus on the angle
[07:30:31] <[VEGETA]> ok, assuming the robot will always or most times will be on 0 degree.
[07:30:40] <[VEGETA]> what else should be done here?
[07:30:57] <[VEGETA]> radius of curvature thing... what does it represent in our situation?
[07:32:39] <deshipu> I assume that SpeedEvil meant the case where you have curved floor, and you can ride on it even if the angle exceeds whatever limit you have -- like rollercoaster or Sonic in the game
[07:32:52] <deshipu> if only the radius is big enough
[07:33:10] <deshipu> but I think that in practice you can forget that
[07:33:27] <deshipu> what you need is to make sure the robot will fit
[07:33:40] <[VEGETA]> well, I would appreciate a simple picture :)
[07:33:47] <deshipu> so that the ramp or corridor is wide enough
[07:33:58] <[VEGETA]> yes, in practice the robot should fit
[07:35:21] <SpeedEvil> Quite. One definition of 'ground' is 'that which I can drive my robot over'
[07:35:22] <[VEGETA]> aha so he meant "width"?
[07:36:08] <SpeedEvil> This may be quite different for a tiny robot with weak motors meant to only move over smooth perfectly flat surfaces, and a six wheel 'all terrain' type vehicle
[07:36:34] <SpeedEvil> For the first, a horizontal surface with short grass is a complete obstacle.
[07:36:48] <SpeedEvil> The other might be able to drive over small shrubs on a 30 degree slope.
[07:36:51] <[VEGETA]> my robot will be Ackerman type
[07:37:10] <deshipu> right, if an obstacle is smaller than the radius of your wheel, and you have enough motor power to lift your robot, you can probably ride over it
[07:37:43] <deshipu> assuming no parts of the chassis sticking out
[07:38:39] <[VEGETA]> aha so that is what you meant by radius of curvature? the width of a certain obstacle?
[07:46:15] <[VEGETA]> with that done, I guess segmenting ground is possible right?
[07:46:31] <Polymorphism> http://www.cableorganizer.com/igus/cf10-tpe-control-cables/
[07:46:33] <Polymorphism> this wire is pure sex
[07:46:36] <Polymorphism> enjoy
[08:00:11] <[VEGETA]> I guess this is what should be used to detect objects: http://pointclouds.org/documentation/tutorials/cluster_extraction.php#cluster-extraction
[08:06:21] <[VEGETA]> but how they determine values for the types of points they want?
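The parameters in that PCL tutorial are basically a neighbour distance (cluster tolerance) and min/max cluster sizes, usually tuned to the sensor resolution and the object sizes expected. The idea can be sketched brute-force in NumPy (PCL's EuclideanClusterExtraction does the same neighbour search with a kd-tree; names and defaults here are illustrative):

```python
import numpy as np

def euclidean_clusters(points, tol=0.1, min_size=1):
    """Group points so that any two points closer than `tol` share a cluster.

    Brute-force O(n^2) sketch of Euclidean cluster extraction:
    build an adjacency matrix, then flood-fill connected components.
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    # adj[i, j] is True when points i and j are within tol of each other
    adj = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2) <= tol
    labels = np.full(n, -1, dtype=int)
    next_label = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        stack = [seed]
        labels[seed] = next_label
        while stack:                      # flood-fill one connected component
            i = stack.pop()
            for j in np.flatnonzero(adj[i] & (labels == -1)):
                labels[j] = next_label
                stack.append(j)
        next_label += 1
    clusters = [np.flatnonzero(labels == c) for c in range(next_label)]
    return [c for c in clusters if len(c) >= min_size]
```

Run after ground removal, each surviving cluster is an object candidate; too-small clusters are typically sensor noise, which is what the min-size parameter filters.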
[08:14:31] <SpeedEvil> [VEGETA]: Radius of curvature is simply how curved the environment is.
[08:15:00] <SpeedEvil> [VEGETA]: Take the top of a sphere, at some point your robot can't drive over it as the chassis grounds out.
[08:15:21] <SpeedEvil> For example, no common passenger vehicle can drive over a 24" diameter sphere.
[08:15:49] <[VEGETA]> ok, i understand
[08:15:52] <SpeedEvil> All of them can drive over a 12000km diameter sphere
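SpeedEvil's sphere examples can be turned into a concrete check: with wheels at both ends of the wheelbase, a rigid chassis grounds out when the bump rises above the wheel-to-wheel chord by more than the ground clearance. A sketch under those simplifying assumptions (rigid chassis, point-contact wheels; all numbers illustrative):

```python
import math

def can_crest(radius_m: float, wheelbase_m: float, clearance_m: float) -> bool:
    """Can a rigid vehicle drive over a convex bump of this radius of curvature?

    The chassis grounds out when the bump's rise above the chord between
    the wheel contact points (the 'sag') exceeds the ground clearance.
    """
    half = wheelbase_m / 2.0
    if radius_m <= half:
        return False  # bump is too small for the wheelbase to straddle
    sag = radius_m - math.sqrt(radius_m ** 2 - half ** 2)
    return sag <= clearance_m
```

With a typical car (say 2.7 m wheelbase, 0.15 m clearance) this rejects a 24-inch sphere and accepts a planet-sized one, matching the two examples above.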
[08:15:59] <[VEGETA]> now the issue is how to implement that
[08:17:39] <[VEGETA]> how to know the points that belong to a certain object
[08:17:57] <[VEGETA]> yet knowing if it is dynamic or static
[08:22:30] <[VEGETA]> speedevil, brb, back in one hour.
[08:47:34] <rue_bed> I wonder if he will have it master the 50 point turn
[08:48:10] <SpeedEvil> :)
[08:48:17] <SpeedEvil> Handbrake turn.
[08:48:26] <veverak> hmmm
[08:48:28] <veverak> wonder wonder
[08:48:36] * veverak is thinking about using ansible for his robot project
[08:48:47] <veverak> to setup environment on linux machine for the robot and its app
[08:49:00] <veverak> apart from writing down everything I want to setup
[08:49:31] <veverak> I am curious where it's wise to ask if it's good idea
[08:49:36] <veverak> or, how would people like it
[08:49:57] <veverak> maybe stackoverflow?
[09:27:01] <rue_house> if you want to make it overly complex with really expensive parts, you need to find a university site
[09:27:53] <veverak> not really
[09:27:56] <veverak> :)
[09:31:03] <rue_house> by univerity wire, I mean a place where the university folk who use $54k software and put $10k into a robot, gather to chat.
[09:31:15] <rue_house> wire? site...
[09:31:32] <rue_house> similar stroke pattern, interesting
[09:32:44] <veverak> not entirely my goal, again
[09:32:49] <veverak> it's designed to be cheap
[09:33:10] <veverak> it's just that I am thinking that ansible will be a better approach than writing a long manual about how the environment should be set up
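What veverak describes might look something like the playbook below; every host name, package, repository URL, and path is a made-up placeholder, not anything from the discussion:

```yaml
# Hypothetical minimal playbook for provisioning a robot's Linux machine.
- hosts: robot
  become: true
  tasks:
    - name: Install the robot app's system dependencies
      apt:
        name: [python3, python3-pip, git]
        state: present

    - name: Check out the robot app
      git:
        repo: https://example.com/robot-app.git   # placeholder URL
        dest: /opt/robot-app

    - name: Install the app's Python requirements
      pip:
        requirements: /opt/robot-app/requirements.txt
```

The trade-off raised later in the log still applies: a playbook is executable documentation, but plain notes may be easier for others to follow by hand.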
[20:30:38] <Snert> I just take notes on what I had to do to get all the pieces and parts of software configured.
[20:30:54] <Snert> and offer my notes to anyone that wants them.
[20:31:24] <Snert> It winds up being pretty much a step-by-step how-to.
[20:35:41] <Snert> ansible seems like a poor way to go.
[20:36:00] <Snert> bit-n-pieces of reality being the better thing.