Wednesday, December 20, 2017

The Mysterious Teal Cloud!

Found something interesting yesterday: after color-coding the particle weights, it appears that in the timesteps following kidnapping events (and certain actions, like turning a corner), the particle cloud's weights all map to a bright lime green.


SIDEBAR:
One thing I'm wondering about: do the particle weights/weight distributions vary depending on things like the map or the route? I would want whatever threshold I come up with to be based on a specific rule or calculation, not a hard-coded number.
SIDEBAR OVER.

Anyway, back to the Mysterious Teal Cloud:
There's a beautiful shade of teal that ONLY surfaces in the particle clouds in the 5-8 seconds after a kidnapping incident.  It occurs in 12/13 of my datasets.  Hallelujah!

Observe:  Kidnapping event at 45 seconds.  At 52 seconds, there's this glorious swath of teal in the particle cloud.  In the timestep following it (53 seconds), we see the lime green:






In the "Normal", non-kidnapping datasets, the green cloud is preceded by a rainbow cloud:




Another variable I'd like to track is the time difference between consecutive AMCL message arrivals.


TODO List:
-See if there's a noticeable difference in the AMCL message frequency between kidnapping and non-kidnapping examples
-Fix colormap scale
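For the first TODO item, the raw ingredient is just the deltas between consecutive AMCL message timestamps - a quick sketch (the timestamps below are made up):

```python
def arrival_gaps(times):
    # Deltas between consecutive AMCL message timestamps, in seconds
    return [b - a for a, b in zip(times, times[1:])]

print(arrival_gaps([45.0, 46.5, 48.0, 52.0]))  # [1.5, 1.5, 4.0]
```

If kidnapping events change AMCL's message frequency, the gaps list should show it directly.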


P.S. Part of the "teaching philosophy" I developed in the SLU CUTS program is the benefit of including reflection as part of the learning process. Writing this blog post forced me to explain what I think is going on, made me re-think a couple of things, and left me with a better understanding of the data.

Sunday, December 17, 2017

Plot Comparison

Finally learned how to plot (x, y, angle) data in Matlab with the quiver() function, while also setting the arrows' colors individually to correspond with the particle weight.  I re-visited the joys of for-loop vs. matrix-operation performance, learned about the num2cell() and cellfun() functions, and questioned my sanity over figure holds and resets.

Turns out, Matlab has built-in color schemes (colormaps), and I used one to denote each particle's weight.

Dark Blue = low weight
Green = high weight
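The color mapping itself is just a normalization of each weight into [0, 1] followed by interpolation between the endpoint colors.  A minimal Python sketch (the RGB endpoints are my stand-ins, not the actual values of Matlab's colormap):

```python
def weight_to_rgb(w, w_min, w_max):
    """Linearly map a particle weight onto a dark-blue-to-green color ramp."""
    # Normalize the weight into [0, 1]; guard against a constant-weight cloud
    t = 0.0 if w_max == w_min else (w - w_min) / (w_max - w_min)
    dark_blue, green = (0, 0, 139), (0, 255, 0)  # assumed endpoint colors
    return tuple(round(a + t * (b - a)) for a, b in zip(dark_blue, green))
```

Fixing the colormap scale (TODO above) would mean holding w_min and w_max constant across timesteps instead of re-normalizing per frame.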


Here is what a "normal" drive looks like over time. 





Here is what the drive looks like when there is a "major" kidnapping event as the robot approaches (0, -2) - moves right to left from (1, -2):


Wednesday, November 29, 2017

aggregate()

Now that I'm getting back into R, the things it can do remind me of things I used to do with SAS as a supply chain intern.  For example, grouping data by a timestamp and taking the first/last element of each group with a keyword like "last" or "first".  Fun stuff!

I'm looking at particle cloud weights today. I've plotted the cloud weights for each timestamp in Excel by creating a scatter plot with the timestamps on the X axis and the individual particles' weights at each timestamp on the Y axis.  The "normal" plots and the "kidnapped" plots are easily differentiated:

Kidnapped:

Not Kidnapped:


Next, I looked at using R to group the particles by timestep and analyze the timesteps.  Here's a script for getting the mean of each timestep's particle weights:

1. Import the CSV (it's tab-delimited, hence sep="\t"):
mytable <- read.csv(<CSVpath>, header=TRUE, sep="\t")

2. Convert to a data frame (read.csv actually returns one already, but this makes it explicit):
values <- data.frame(mytable)

The pièce de résistance: the aggregate() function, with Time specified as the column by which I want the data grouped:
3. xyz <- aggregate(x=values, by=list(unique.values=values$Time), FUN=mean)

Voilà!  Though, I don't know what the 4th column from the left is about.  (One quirk worth noting: since x=values passes the whole data frame, aggregate() applies mean to every column - Time and ParticleNumber get averaged right along with ParticleWeight.)
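For readers without R handy, the same group-by-timestamp-and-average step can be sketched in plain Python (the (time, weight) pairs below are made up):

```python
from itertools import groupby

def mean_weight_by_time(rows):
    """Average the particle weights within each timestamp.
    rows: iterable of (time, weight) pairs."""
    rows = sorted(rows, key=lambda r: r[0])  # groupby requires sorted input
    means = {}
    for t, group in groupby(rows, key=lambda r: r[0]):
        weights = [w for _, w in group]
        means[t] = sum(weights) / len(weights)
    return means

data = [(26.15, 0.5), (26.15, 1.5), (26.39, 2.0)]
print(mean_weight_by_time(data))  # {26.15: 1.0, 26.39: 2.0}
```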

Normal:
    unique.values   Time ParticleNumber ParticleWeight
1           18.87  18.87          999.5    0.000500000
2           26.15  26.15          250.0    0.001996008
3           26.39  26.39          250.0    0.001996008
4           26.62  26.62          250.0    0.001996008
5           27.14  27.14          250.0    0.001996008
6           30.28  30.28          250.0    0.001996008
7           31.71  31.71          250.0    0.001996008
8           33.60  33.60          250.0    0.001996008
9           34.48  34.48          250.0    0.001996008
10          35.82  35.82          250.0    0.001996008
11          36.20  36.20          250.0    0.001996008
12          36.42  36.42          250.0    0.001996008
13          37.60  37.60          250.0    0.001996008
14          39.22  39.22          250.0    0.001996008
15          40.80  40.80          250.0    0.001996008
16          42.29  42.29          250.0    0.001996008
17          43.54  43.54          250.0    0.001996008
18          44.79  44.79          250.0    0.001996008
19          45.16  45.16          250.0    0.001996008
20          46.39  46.39          250.0    0.001996008
21          48.29  48.29          250.0    0.001996008
22          49.11  49.11          250.0    0.001996008
23          49.39  49.39          250.0    0.001996008
24          50.24  50.24          250.0    0.001996008
25          50.77  50.77          250.0    0.001996008
26          51.29  51.29          250.0    0.001996008
27          52.99  52.99          250.0    0.001996008
28          54.71  54.71          250.0    0.001996008
29          58.20  58.20          250.0    0.001996008
30          60.42  60.42          250.0    0.001996008
31          61.28  61.28          250.0    0.001996008
32          61.85  61.85          250.0    0.001996008
33          63.77  63.77          250.0    0.001996008
34          65.99  65.99          250.0    0.001996008
35          66.43  66.43          250.0    0.001996008
36          67.50  67.50          250.0    0.001996008
37          68.43  68.43          250.0    0.001996008
38          68.50  68.50          250.0    0.001996008
39          68.72  68.72          250.0    0.001996008
40          68.90  68.90          250.0    0.001996008
41          69.35  69.35          250.0    0.001996008
42          70.96  70.96          250.0    0.001996008
43          71.17  71.17          250.0    0.001996008
44          71.36  71.36          250.0    0.001996008
45          71.69  71.69          250.0    0.001996008
46          75.19  75.19          250.0    0.001996008
47          75.94  75.94          250.0    0.001996008
48          77.48  77.48          250.0    0.001996008
49          79.60  79.60          250.0    0.001996008
50          80.53  80.53          250.0    0.001996008
51          80.81  80.81          250.0    0.001996008
52          81.26  81.26          250.0    0.001996008
53          81.31  81.31          250.0    0.001996008
54          83.78  83.78          250.0    0.001996008
55          85.54  85.54          250.0    0.001996008
56          85.72  85.72          250.0    0.001996008
57          86.10  86.10          250.0    0.001996008
58          86.72  86.72          250.0    0.001996008
59          88.28  88.28          250.0    0.001996008
60          88.75  88.75          250.0    0.001996008
61          88.84  88.84          250.0    0.001996008
62          89.12  89.12          250.0    0.001996008
63          89.34  89.34          250.0    0.001996008
64          89.76  89.76          250.0    0.001996008
65          90.40  90.40          250.0    0.001996008
66          90.44  90.44          250.0    0.001996008
67          90.71  90.71          250.0    0.001996008
68          91.15  91.15          250.0    0.001996008
69          91.40  91.40          250.0    0.001996008
70          92.65  92.65          250.0    0.001996008
71          92.84  92.84          250.0    0.001996008
72          94.27  94.27          250.0    0.001996008
73          95.67  95.67          250.0    0.001996008
74          96.96  96.96          250.0    0.001996008
75          98.51  98.51          250.0    0.001996008
76         101.10 101.10          250.0    0.001996008
77         101.90 101.90          250.0    0.001996008
78         101.99 101.99          250.0    0.001996008
79         102.57 102.57          250.0    0.001996008
80         103.00 103.00          250.0    0.001996008
81         104.61 104.61          250.0    0.001996008
82         105.99 105.99          250.0    0.001996008
83         106.55 106.55          250.0    0.001996008
84         106.72 106.72          250.0    0.001996008
85         107.74 107.74          250.0    0.001996008
86         108.20 108.20          250.0    0.001996008
87         108.40 108.40          250.0    0.001996008
88         108.85 108.85          250.0    0.001996008
89         109.31 109.31          250.0    0.001996008
90         111.11 111.11          250.0    0.001996008
91         112.32 112.32          250.0    0.001996008
92         112.50 112.50          250.0    0.001996008
93         114.44 114.44          250.0    0.001996008
94         116.50 116.50          250.0    0.001996008
95         117.14 117.14          250.0    0.001996008
96         117.29 117.29          250.0    0.001996008
97         117.64 117.64          250.0    0.001996008
98         118.40 118.40          250.0    0.001996008
99         118.66 118.66          250.0    0.001996008
100        118.70 118.70          250.0    0.001996008
101        120.13 120.13          250.0    0.001996008
102        120.86 120.86          250.0    0.001996008
103        121.40 121.40          250.0    0.001996008
104        122.85 122.85          250.0    0.001996008
105        124.50 124.50          250.0    0.001996008
106        124.86 124.86          250.0    0.001996008
107        125.22 125.22          250.0    0.001996008
108        126.61 126.61          250.0    0.001996008
109        128.70 128.70          250.0    0.001996008
110        128.76 128.76          250.0    0.001996008
111        128.87 128.87          250.0    0.001996008
112        129.17 129.17          250.0    0.001996008
113        129.42 129.42          250.0    0.001996008
114        129.63 129.63          250.0    0.001996008
115        130.26 130.26          250.0    0.001996008
116        130.46 130.46          250.0    0.001996008
117        130.60 130.60          250.0    0.001996008
118        130.88 130.88          250.0    0.001996008
119        132.27 132.27          250.0    0.001996008
120        132.44 132.44          250.0    0.001996008
121        132.62 132.62          250.0    0.001996008
122        132.98 132.98          250.0    0.001996008
123        133.35 133.35          250.0    0.001996008
124        133.50 133.50          250.0    0.001996008
125        134.17 134.17          250.0    0.001996008
126        134.74 134.74          250.0    0.001996008
127        135.36 135.36          250.0    0.001996008
128        135.60 135.60          250.0    0.001996008


Unfortunately, the mean of the cloud weights from the major kidnapping incident looks identical:
    unique.values   Time ParticleNumber ParticleWeight
1           19.57  19.57          999.5    0.000500000
2           28.58  28.58          250.0    0.001996008
3           28.76  28.76          250.0    0.001996008
4           28.95  28.95          250.0    0.001996008
5           29.24  29.24          250.0    0.001996008
6           31.79  31.79          250.0    0.001996008
7           33.14  33.14          250.0    0.001996008
8           34.52  34.52          250.0    0.001996008
9           35.79  35.79          250.0    0.001996008
10          37.20  37.20          250.0    0.001996008
11          38.53  38.53          250.0    0.001996008
12          39.42  39.42          250.0    0.001996008
13          39.53  39.53          250.0    0.001996008
14          39.96  39.96          250.0    0.001996008
15          40.28  40.28          250.0    0.001996008
16          43.80  43.80          250.0    0.001996008
17          44.95  44.95          250.0    0.001996008
18          46.43  46.43          250.0    0.001996008
19          47.78  47.78          250.0    0.001996008
20          48.81  48.81          250.0    0.001996008
21          49.18  49.18          250.0    0.001996008
22          51.19  51.19          250.0    0.001996008
23          53.23  53.23          250.0    0.001996008
24          53.82  53.82          250.0    0.001996008
25          53.90  53.90          250.0    0.001996008
26          55.38  55.38          250.0    0.001996008
27          56.65  56.65          250.0    0.001996008
28          58.33  58.33          250.0    0.001996008
29          59.97  59.97          250.0    0.001996008
30          62.59  62.59          250.0    0.001996008
31          64.49  64.49          250.0    0.001996008
32          65.53  65.53          250.0    0.001996008
33          66.70  66.70          250.0    0.001996008
34          67.93  67.93          250.0    0.001996008
35          69.70  69.70          250.0    0.001996008
36          69.85  69.85          250.0    0.001996008
37          70.14  70.14          250.0    0.001996008
38          70.55  70.55          250.0    0.001996008
39          71.66  71.66          250.0    0.001996008
40          71.94  71.94          250.0    0.001996008
41          72.27  72.27          250.0    0.001996008
42          72.56  72.56          250.0    0.001996008
43          72.96  72.96          250.0    0.001996008
44          74.51  74.51          250.0    0.001996008
45          74.93  74.93          250.0    0.001996008
46          75.36  75.36          250.0    0.001996008
47          75.50  75.50          250.0    0.001996008
48          75.71  75.71          250.0    0.001996008
49          78.99  78.99          250.0    0.001996008
50          80.40  80.40          250.0    0.001996008
51          81.76  81.76          250.0    0.001996008
52          83.46  83.46          250.0    0.001996008
53          84.67  84.67          250.0    0.001996008
54          84.99  84.99          250.0    0.001996008
55          85.32  85.32          250.0    0.001996008
56          85.86  85.86          250.0    0.001996008
57          87.74  87.74          250.0    0.001996008
58          89.19  89.19          250.0    0.001996008
59          89.39  89.39          250.0    0.001996008
60          89.61  89.61          250.0    0.001996008
61          90.11  90.11          250.0    0.001996008
62          90.88  90.88          250.0    0.001996008
63          92.79  92.79          250.0    0.001996008
64          93.37  93.37          250.0    0.001996008
65          93.85  93.85          250.0    0.001996008
66          94.31  94.31          250.0    0.001996008
67          94.67  94.67          250.0    0.001996008
68          94.86  94.86          250.0    0.001996008
69          95.21  95.21          250.0    0.001996008
70          95.77  95.77          250.0    0.001996008
71          96.94  96.94          250.0    0.001996008
72          97.24  97.24          250.0    0.001996008
73          98.72  98.72          250.0    0.001996008
74         100.32 100.32          250.0    0.001996008
75         102.60 102.60          250.0    0.001996008
76         102.72 102.72          250.0    0.001996008
77         105.42 105.42          250.0    0.001996008
78         105.60 105.60          250.0    0.001996008
79         105.79 105.79          250.0    0.001996008
80         106.21 106.21          250.0    0.001996008
81         106.55 106.55          250.0    0.001996008
82         106.87 106.87          250.0    0.001996008
83         108.58 108.58          250.0    0.001996008
84         110.10 110.10          250.0    0.001996008
85         110.11 110.11          250.0    0.001996008
86         110.47 110.47          250.0    0.001996008
87         111.41 111.41          250.0    0.001996008
88         111.76 111.76          250.0    0.001996008
89         112.12 112.12          250.0    0.001996008
90         112.29 112.29          250.0    0.001996008
91         113.31 113.31          250.0    0.001996008
92         113.90 113.90          250.0    0.001996008
93         115.19 115.19          250.0    0.001996008
94         115.64 115.64          250.0    0.001996008
95         115.99 115.99          250.0    0.001996008
96         118.30 118.30          250.0    0.001996008
97         119.62 119.62          250.0    0.001996008
98         120.46 120.46          250.0    0.001996008
99         120.66 120.66          250.0    0.001996008
100        120.99 120.99          250.0    0.001996008
101        121.24 121.24          250.0    0.001996008
102        121.64 121.64          250.0    0.001996008
103        122.23 122.23          250.0    0.001996008
104        122.50 122.50          250.0    0.001996008
105        123.95 123.95          250.0    0.001996008
106        124.70 124.70          250.0    0.001996008
107        125.12 125.12          250.0    0.001996008
108        125.20 125.20          250.0    0.001996008
109        125.92 125.92          250.0    0.001996008
110        127.14 127.14          250.0    0.001996008
111        128.39 128.39          250.0    0.001996008
112        129.26 129.26          250.0    0.001996008
113        129.72 129.72          250.0    0.001996008
114        131.23 131.23          250.0    0.001996008
115        132.64 132.64          250.0    0.001996008
116        133.30 133.30          250.0    0.001996008
117        133.49 133.49          250.0    0.001996008
118        133.90 133.90          250.0    0.001996008
119        134.53 134.53          250.0    0.001996008
120        134.67 134.67          250.0    0.001996008
121        134.80 134.80          250.0    0.001996008
122        135.00 135.00          250.0    0.001996008
123        136.33 136.33          250.0    0.001996008
124        136.46 136.46          250.0    0.001996008
125        136.80 136.80          250.0    0.001996008
126        137.10 137.10          250.0    0.001996008
127        137.29 137.29          250.0    0.001996008
128        137.62 137.62          250.0    0.001996008
129        137.82 137.82          250.0    0.001996008
130        139.17 139.17          250.0    0.001996008
131        139.46 139.46          250.0    0.001996008
132        139.72 139.72          250.0    0.001996008
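In hindsight, identical means are exactly what normalized weights guarantee: if each timestep's weights sum to 1, the mean is pinned at 1/N no matter what the cloud is doing (0.001996008 is 1/501, and 0.0005 is 1/2000).  A spread statistic should be more informative.  Here's a plain-Python sketch of two candidates - variance, and the effective sample size n_eff = 1/Σw², a standard particle-filter degeneracy measure (the 4-particle examples are made up for illustration):

```python
def weight_stats(weights):
    """Summary statistics for one timestep's normalized particle weights."""
    n = len(weights)
    mean = sum(weights) / n                    # always 1/n if weights sum to 1
    var = sum((w - mean) ** 2 for w in weights) / n
    n_eff = 1.0 / sum(w * w for w in weights)  # effective sample size
    return mean, var, n_eff

# A healthy, evenly weighted cloud vs. a degenerate one:
print(weight_stats([0.25, 0.25, 0.25, 0.25]))    # n_eff == 4: every particle counts
print(weight_stats([0.97, 0.01, 0.01, 0.01]))    # n_eff near 1: one particle dominates
```

A rule like "flag when n_eff drops below some fraction of N" would also satisfy the sidebar's wish for a threshold based on a calculation rather than a hard-coded number.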


R is not magic

Realization:
As cool as R is, you still have to do the work of choosing the best ways to arrange your data and the statistical tests to use on it.

Sunday, November 19, 2017

I'm Becoming A Fan...

...A fan of R!  I have all these datasets for different AMCL/Gazebo variables, and in the past, I'd written little scripts for merging them based on their closest timestamp matches.  My method was cumbersome and broke easily.  Turns out, there's a variety of ways to do the same thing with R.  Woot woot!

Getting Rolling With R

Today, I've written a little script that parses bag files into CSVs of the topics' messages.  Now I have to match them up and put them together into a combined dataset.  The way I described this situation earlier was,

"A bag file stores message traffic from recorded ROS topics. Messages arrive regularly, but at different frequencies."

As I tried to think of the best way to organize my datasets, I considered putting together a database.  I balked when I realized I'd have to define tables and declare columns with names and the appropriate datatypes (yuck).  I needed SQL functionality with the simplicity of a CSV file.  Lo and behold, I googled for 'r join datasets', and found this page:

"We often encounter situations where we have data in multiple files, at different frequencies and on different subsets of observations, but we would like to match them to one another as completely and systematically as possible. In R, the merge() command is a great way to match two data frames together."

https://www.r-bloggers.com/merging-multiple-data-files-into-one-data-frame/

Sounds like what I need!
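Out of curiosity, here's roughly what my old hand-rolled closest-timestamp matching boiled down to in Python (record layouts and names are hypothetical) - merge() should make this obsolete:

```python
import bisect

def nearest_join(left, right):
    """For each (timestamp, value) record in left, attach the right-hand
    record whose timestamp is closest.  Both lists must be sorted by
    timestamp, and right must be non-empty."""
    times = [t for t, _ in right]
    out = []
    for t, v in left:
        i = bisect.bisect_left(times, t)
        # The nearest neighbor is either just before or just after index i
        best = min((j for j in (i - 1, i) if 0 <= j < len(right)),
                   key=lambda j: abs(times[j] - t))
        out.append((t, v, right[best][1]))
    return out
```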

Mistakes Were Made, and Corrections Were, Too.

I did some extra work yesterday to get the robot's Gazebo poses publishing as a ROS message, and I'm glad I did.  I hadn't been sure it would be a good use of time, since I already have a script that prints the poses to a text file, but at the time, I did it just to make things ship-shape.  It led to an important discovery, though:

In all of my simulations, I've had the starting AMCL pose adjusted for the wrong map.  Yikes!  I suspect this was going on in the previous round of simulations, too. 

I've corrected this by forcing the AMCL pose to (0,0) through a command line parameter added to my Bash script (another reason I'm loving the Bash scripts for starting ROS nodes).  Another addition to the start_driving_robot Bash script is a couple commands that execute after the playback of the driving route is finished:

rosnode kill -a
Purpose: Kills everything - specifically, the rosrecord node - so that I can walk away from the simulation after I start it up, and I won't get data from the robot sitting still at the end.

killall -9 gzserver
Purpose: rosnode kill apparently wasn't shutting Gazebo down cleanly, and when I would immediately start a new instance of Gazebo, it would fail because there was already a gzserver process running.  This step seems to prevent that.

Really feeling the benefits of my improvements to the research infrastructure.

Saturday, November 18, 2017

Plugin for Publishing Robot Pose to ROS from Gazebo

This morning, I wanted to have a Gazebo plugin that publishes the robot's pose in Gazebo.  Right now, I've got a python script that prints to a text file, but it would be easier for analysis to just have all the data in the same bag file. 


It took an hour, but now I've got the robot's pose published as a ROS topic.

******
NOTE! The pose messages published this new way match the Gazebo print statements of the get_model_state call on the mobile_base (in the world frame) to within 0.01 meters.
******


Here's how I did it:
https://answers.ros.org/question/222033/how-do-i-publish-gazebo-position-of-a-robot-model-on-odometry-topic/

It recommends adding something like this to the URDF (robot specification):

<plugin name="p3d_base_controller" filename="libgazebo_ros_p3d.so">
  <alwaysOn>true</alwaysOn>
  <updateRate>50.0</updateRate>
  <bodyName>base_link</bodyName>
  <topicName>gazebo_robot_pose</topicName>
  <gaussianNoise>0.01</gaussianNoise>
  <frameName>world</frameName>
  <xyzOffsets>0 0 0</xyzOffsets>
  <rpyOffsets>0 0 0</rpyOffsets>
</plugin>


Notes:

The difficult part was finding which file to use.  Here's what I did:
1. I went to the turtlebot_gazebo package and looked at the turtlebot_world.launch file to figure out where the turtlebot URDF files are.

2. Turns out, there's a turtlebot_description package with relevant files.  I opened turtlebot_description/urdf and found a file called turtlebot_gazebo.urdf.xacro

3. I added a new <gazebo> tag and put the plugin section within the new <gazebo> section.

**Originally, I used "mobile_base" for the <bodyName>, but I got an error that mobile_base wasn't a valid bodyName value. I switched back to using base_link, and the results are working fine.



This was a helpful example of where to put the plugin code - note that the plugin is within its own <gazebo> tag in the xacro:
https://github.com/tu-darmstadt-ros-pkg/taurob_tracker_common/blob/master/taurob_tracker_description/urdf/tracker_chassis.gazebo.xacro.xml#L44


The source of that .so file is described here:

What is libgazebo_ros_p3d.so?  You can find the source code in the gazebo_ros_pkgs repository, under gazebo_plugins/include/gazebo_plugins/gazebo_ros_p3d.h:
/*
Brief Note:
Compute relative position of [bodyName] frame with respect to [frameName] frame,
and publish it as [topicName] at [updateRate].
It seems that it does not have physical control of the model.
*/
https://github.com/jaejunlee0538/ua_ros_p3dx

Simulations

I'm back to collecting data, now that I've got the AMCL particle cloud weights printed out.

See my ROS Answers forum question: https://answers.ros.org/question/275574/amcl-particle-weights-are-always-equal-why/

In the past,  I had a couple different driving methods and they each had their own problems:

1. turtlebot_teleop keyboard_teleop - driving the robot around the room on roughly the same route with keyboard controls.
Problem: Time-consuming, and didn't follow the exact route each time.

2. Wrote a Python script to drive the robot around a series of locations in the Gazebo world.
Problem: Changing direction relied on rotating the robot in place (sometimes a full circle or two).  AMCL really didn't like this.

3. Using the ROS Navigation path planner to specify a route
Problem: I must have been using a different path planner than the one that RViz uses (which is GREAT), because this one had a lot of rotation in it, too.

This time around, I decided to record a BAG file of the output from the turtlebot_teleop keyboard_teleop node, on the /cmd_vel_mux/input/teleop topic.  This topic has the translational and rotational changes specified by the keyboard commands, so I drove around the room once and recorded these messages so that I could replay them for every simulation.

The difficulty this week was getting a solid driving route - apparently, there's a lot of variation in what the robot actually executes compared to what you told it to do.  I'd record a clean driving route, and when I replayed it, the robot would over- or under-shoot the specified motion at some timestep along the way, and start bumping into the furniture in the simulation.  I finally learned to steer WAY clear of the furniture and was able to record a replayable route.

I had been aiming for a 5-minute route, but due to the difficulty in getting a driving route in place, I ended up sticking to a 2:20 driving sample.  As long as I schedule kidnapping events within the first 45 seconds, I think it will still give us enough information about AMCL's long-term post-kidnapping behavior.

Anyway, here are the scripts I've been using to record trials:

bash start_gazebo.sh
bash start_amcl.sh
bash start_recording_messages.sh
bash start_recording_amcl_gazebo.sh
bash start_driving_robot.sh

I LOVE that I adopted bash scripts instead of having to remember what to type each time - I'd highly recommend this step, as it's saved me a lot of time and prevented frustration.  I also made them foolproof by saving the data files from within the script, and by printing a usage error message when I don't supply a filename.  It's been working great.


Here are the specifics of each:
bash start_gazebo.sh
roslaunch turtlebot_gazebo turtlebot_world.launch world_file:=/opt/ros/indigo/share/turtlebot_gazebo/worlds/playground.world


bash start_amcl.sh
source /home/ebrenna8/amcl_overlay/devel/setup.bash;
roslaunch turtlebot_gazebo amcl_demo.launch map_file:=/opt/ros/indigo/share/turtlebot_gazebo/maps/playground.yaml

bash start_recording_messages.sh - this one takes a file name
rosbag record -O $1 /cluster_count /delta_pose /filter_covariance /max_weight_hyp /numsamples /particlecloud /test_pub /amcl_pose /cloud_weight /particlecloudPreResample /odom /cmd_vel_mux/input/teleop

bash start_driving_robot.sh
rosbag play /home/ebrenna8/Turtlebot_World_Driving.bag

bash start_recording_amcl_gazebo.sh - this one takes a file name
python ./print_amcl_gazebo_poses.py > $1


Monday, November 6, 2017

Debugging in ROS

In the past few days, I have been investigating ways to better understand what's going on with the AMCL code and I've turned to improving my debugging capabilities.

Turns out, I've been missing out on the ROS_DEBUG messages already being printed by the AMCL node.  I have used rqt before to view bag files' contents, but it turns out that if you open the tool with some special options, it lets you view the message channels, too.

More info here:
http://wiki.ros.org/ROS/Tutorials/UsingRqtconsoleRoslaunch


Another thing I started reading about was getting an IDE around ROS code so that I can put in breakpoints and step through the code.  There are some tools that are new to me, and tools that I recognized, like Visual Studio Code or Eclipse. Unfortunately, they ALL look really complicated. 

More info here:
http://wiki.ros.org/IDEs

Saturday, October 28, 2017

Publishing Point Cloud Weights

After some confusion with ROS messages and C++ float/double and array/vector properties, I've finally put the code together that publishes the weights of each particle in the particle filter as the robot drives around.  I had to use a custom ROS message type I had originally created for the filter's covariance, because for some reason the ROS weights message I made was not being copied to the proper dependency folder in the build process.  I suspect this is related to the CMakeLists file getting blown away.

The code that I wrote is just a few lines added to amcl_node.cpp:

cloud_weight_pub_ = nh_.advertise<filter_covariance_msg>("cloud_weight", 2, true);
...
    if (!m_force_update) {
      geometry_msgs::PoseArray cloud_msg;

      filter_covariance_msg weights_msg;
      std::vector<float> weights(set->sample_count);

      cloud_msg.header.stamp = ros::Time::now();
      cloud_msg.header.frame_id = global_frame_id_;
      cloud_msg.poses.resize(set->sample_count);
      for (int i = 0; i < set->sample_count; i++)
      {
        tf::poseTFToMsg(tf::Pose(tf::createQuaternionFromYaw(set->samples[i].pose.v[2]),
                                 tf::Vector3(set->samples[i].pose.v[0],
                                             set->samples[i].pose.v[1], 0)),
                        cloud_msg.poses[i]);
        // Stash each particle's weight alongside its pose
        weights[i] = set->samples[i].weight;
      }
      particlecloud_pub_.publish(cloud_msg);
      weights_msg.cov = weights;
      cloud_weight_pub_.publish(weights_msg);
    }
  }


And the result is that there's now a /cloud_weight topic published by my AMCL overlay with data that looks like this:


From here, I'm going to run trials and collect data. There are a few new details I'm going to incorporate:

-Data routes should be 5 minutes long
-Drive with teleop and record/playback the teleop commands to replicate the exact route
-Script anything I can (for reproducibility)
-For every route, try the same route but with kidnapping occurrences at different times (locations) to see what happens.

The ultimate direction is that I'm going to add this particle weight data into my Matlab print-outs and color the particles by weight. I'm crossing my fingers that this will reveal something.

Monday, October 16, 2017

Small Victories

It looks like there's a problem with the cloud_weight_msg ROS message that I made a while back in order to publish the particle cloud's particle weights.  Even though the file declares the message type as float64[], rosmsg show cloud_weight_msg prints the message type as int32[].  Not sure what the deal is.

Fortunately, I had an existing message of type float32[], so I'm using that one for the cloud_weights message declaration.  The only difference is that my cloud weights are stored as doubles, and this float32[] will require them to be converted to floats.  I don't think it will cause a problem, so I should be good to go.
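To convince myself the double-to-float32 narrowing is harmless, it can be checked outside ROS with the struct module - for weights on the order of 1e-3, single precision keeps about 7 significant digits:

```python
import struct

def to_float32(x):
    # Round-trip a Python double through IEEE-754 single precision
    return struct.unpack('f', struct.pack('f', x))[0]

w = 0.001996008                   # a typical particle weight
err = abs(to_float32(w) - w) / w  # relative rounding error, bounded by 2**-24
```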

Work Plan:
- Put code in for publishing particle cloud weights tonight.
- Tomorrow: collect a bunch of data, over 5-minute trials and kidnapping events at different timesteps.

Wednesday, October 11, 2017

Maybe ROS Isn't So Bad

I'm still waiting for the other shoe to drop, but I ran catkin_make in my amcl_overlay workspace, and I was able to successfully compile the code and start the amcl_demo code with the overlaid version.  Then, I went over to the turtlebot plugin workspace, and ran catkin_make, and everything compiled.

I may have been gunshy about changing the code in order to get the particle cloud weights for no reason at all.  Whoops.

Graduate Research Seminar Speaker

Yesterday I had the opportunity to be the speaker for the weekly Graduate Research Seminar at Parks College.  There were about 15 students in attendance, spread out among all levels and corners of the 200-seat auditorium.  I spoke for about 40 minutes about my research - in particular, the steps we had to take to simulate robot fault.  While I was preparing the presentation, I was thinking, "You know, this robot fault thing could really be a paper or something."  Unfortunately, a ROSCon conference proposal was not accepted over the summer, which makes me think either the topic isn't as original as I think, or I didn't write a good proposal.

The most enjoyable part of the talk was the last 2 minutes, when I took questions.  I got questions about automating the tests, whether I'll be able to differentiate between major and minor kidnapping event classes, and how to improve the Roomba's ability to make it around the whole room like it's supposed to.  Getting to free-style and apply my knowledge was fun.

Sunday, September 24, 2017

Daily Goals

To add the Gazebo poses, here's what I've found:
*Major Kidnapping Trial 4 is missing a print_amcl_gazebo_poses.txt file. Booo.
*The angle for Gazebo is a quaternion. Need to write a python script to convert that to radians.
I'm in instant-gratification mode now, though, so I'm going to just see if I can get X,Y plotting for the Gazebo pose to work on Major Kidnapping Trial 5, and then I'll add angle.
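When I do get to the angle, the quaternion-to-yaw conversion for a planar robot only needs the z/w components (roll and pitch are ~0). A Python sketch of the script I'd write (function name is mine):

```python
import math

def quaternion_to_yaw(x, y, z, w):
    """Extract the heading (yaw, in radians) from a unit quaternion.
    For a ground robot, roll and pitch are ~0, so yaw is all we need."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

# A rotation of 90 degrees about the vertical axis:
q = (0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
print(quaternion_to_yaw(*q))  # close to pi/2
```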

*****************

Update! Got the Gazebo poses added to the graph. It's enlightening - the blue dot is the robot's ground-truth pose in Gazebo:

Simulation starts at (0,0) and AMCL is started at (0,0), but the particle cloud is pretty big:



Cloud shrinks down as robot moves.  (Pose estimates are off by a quarter of a meter in each direction...)




BOOM! Kidnapped.


Plotting Mean and AMCL Poses

Plotted particle cloud + AMCL Pose + Mean X/Y/Theta all together!  Here are a couple samples:

Robot's not lost:


Robot's lost:




 Do you notice a difference?  (I don't...)

Structure



One of the biggest take-aways from Paul Silvia's "How To Write A Lot: A Practical Guide to Productive Academic Writing" is that you HAVE to make a schedule for when you're going to do your research writing.  I actually found it helpful that he's aiming at PhD students and professors, because besides writing up their research, they have a lot of other responsibilities - teaching, administration, grading homework - and research can get pushed aside by all of them.  It's akin to my situation working full-time, so his advice to JUST MAKE TIME FOR YOUR WORK - you already make time every Thursday night to watch a half-hour of Donnybrook - hit home.

And, I have good news: now that I'm making time - an hour after I get off work every day, getting up early on the weekend - I'm actually getting stuff done (see the recent blog posts!).

Another principle from the book is goal-setting: identifying project goals, as well as the daily steps to implementing them.  I haven't really been doing that.  Therefore, here goes:

My current project goal is to find a method of arranging data so that the Neural Network can find a really-tight function approximation for the data.

Daily goals:
TODAY: I need visualizations of the following:
* Particle Cloud Poses + AMCL Pose + Gazebo Pose + Mean X/MeanY/MeanTheta Pose
* Revisit the mean/stdev dataset and see whether you can tell if there's a function to be found there.


Another method from the book is tracking your productivity with a spreadsheet of dates and words written (or lines of code, etc.).  Then you use it like a fitness tracker: Are you on track for this week? Can you beat last week's total?  Might be good to implement.

Friday, September 22, 2017

Woot woot! Got My Arrows

Turns out I wasn't converting the AMCL theta into [u,v] components before plotting the arrow. Once I added that step, my plots have the AMCL pose mapped!


        [u,v] = pol2cart(points(1,6),0.1);
        hold on, fig = quiver(points(1,4),points(1,5),u,v,'color',[0 0 1]);


Isn't my little AMCL arrow cute?



Wish List

As I'm formatting my data for review, here's what I want:
-Particle cloud arrows plotted at each timestep (got this)
-AMCL's position estimate plotted at each timestep as a big red arrow (having trouble getting this)
-Gazebo position plotted at each timestep as a big green arrow (don't have any infrastructure for this)
-Plot the meanX and meanY positions on the particle cloud as dots, too.

Also, format the mean/stddev files for review.

Wednesday, September 20, 2017

Generating images of particle clouds

%import data file:
InputFileName = <>
SaveFilesToFolder = <>
TimesParticleCloudPoints = readtable(InputFileName,'Delimiter', '\t');
TimesParticleCloudPoints = table2array(TimesParticleCloudPoints);
times = unique(TimesParticleCloudPoints(:,1)); %get distinct time periods
counter = 0;
for t = 1:numel(times)
    if times(t) > 70
        disp('KIDNAPPING: ')
        disp(t)
    end

    counter = counter + 1;
    %select every row belonging to this timestep (logical indexing
    %instead of an inner loop over all rows)
    rows = TimesParticleCloudPoints(:,1) == times(t);
    points = TimesParticleCloudPoints(rows, 3:4);
    fig = scatter(points(:,1), points(:,2));

    if times(t) >= 70
        header = sprintf('Post-Kidnapped. T = %s', num2str(times(t)));
    else
        header = sprintf('T = %s', num2str(times(t)));
    end

    title(header);
    saveas(fig, strcat(SaveFilesToFolder, int2str(counter), '.png'));
end

Expanding to draw arrows. This link seems helpful.  My angles are in the dataset as quaternions...

https://stackoverflow.com/questions/1803043/how-do-i-display-an-arrow-positioned-at-a-specific-angle-in-matlab
   

Update

New ideas that came out of conversation yesterday:

-Package up the point cloud datasets - a few examples of each, with kidnapping events identified - and send it out for second opinion.

-Try plotting the mean/stdev datasets from before; if you can't identify a trend or cutoff point, the NN won't be able to, either.

-Figure out what the covariance matrix represents, and why the first element has been so useful. Might give insight on how to study it.
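For context on that last item: in amcl_pose the covariance comes flattened as a 6x6 row-major matrix, so (if I'm reading it right) covariance[0] is the (x, x) entry, i.e. the variance of the x estimate - which would explain why it balloons when the cloud spreads out. A numpy sketch of the same quantity computed straight from a particle cloud (the cloud data here is synthetic, not from my trials):

```python
import numpy as np

# Hypothetical particle clouds: columns are x, y. One tight, one spread out.
tight_cloud = np.random.default_rng(0).normal([2.0, 3.0], 0.05, size=(500, 2))
spread_cloud = np.random.default_rng(0).normal([2.0, 3.0], 1.0, size=(500, 2))

def x_variance(cloud):
    """covariance[0] analogue: the (x, x) entry of the pose covariance."""
    return np.cov(cloud, rowvar=False)[0, 0]

print(x_variance(tight_cloud), x_variance(spread_cloud))
```

If that reading is right, the spread-out (kidnapped-looking) cloud should show an x-variance orders of magnitude larger than the localized one.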

Sunday, September 10, 2017

Same Topic, Various Guises

I'm reading "Writing Your Dissertation in 15 Minutes A Day" by Joan Bolker, and I liked one of the parts enough to blog about it:

"Some people seem always to have known what they want to write their dissertations about. They are the lucky ones...Some, like me, have written their way through the same topic in various guises often enough so they know it's theirs for life."

The second sentence there stuck out to me, because whether it's robots or people, pulling information from messy data sets appears to be kind of my schtick when it comes to research.  For one of my final projects in college (I had 2 - one for Applied Math, one for CompSci), I obtained several decades' worth of U.S. government census data for each of the Saint Louis Metropolitan Area's counties and created a model for the population in-flows and out-flows from each county (spoiler: STL City and County had serious out-flows to St Charles and Jefferson County).  I was really proud of the big Excel spreadsheets I made - for the time span I was looking at, Census data was spread out among a couple websites.  Each source had its own file format, so it took a lot of data cleaning to get a pretty dataset with all the information I needed.  It felt easier to write the programs that manipulated the datasets and distilled their information into a single end result, but I remember that getting the model right was a challenge, too.

There are parallels to this in my current thesis project for finishing my M.S. - again, it took forever to put together the infrastructure that collects robot data.  The thing that makes this more complicated is instead of working with a finite dataset, I'm constantly finding that I need A) more examples of the driving route, and B) different routes to compare against. 

Unfortunately, I haven't found a good way to automate the ROS/Python/bash scripts so that all the ROS code I need runs in parallel, and that means there's a lot of set-up each time I want to collect more driving data.  The good part is that I have some solid bash/Python scripts that automate the data cleaning and formatting part after the data has been collected.

Conclusion: It might be worth spending a little more time on getting the data collection infrastructure automated, if that would speed up the rest of the process.

Friday, September 8, 2017

Coding For Slope

This is a covariance[0] plot for a no-kidnapping data collection trial:





I first implemented what I talked about in the last post: I took 3 windows of 3 prior points apiece and averaged each window into a number.  If the previous window's average was bigger, I coded this as a 0; otherwise, it was a 1.

This didn't work so well - there were more 1's than I wanted to see in a normal, no-kidnapping dataset. 

The next thing I tried was taking the same 9 previous points and comparing them consecutively - if prev_point9 > prev_point8, it's a 0, and otherwise a 1.  That resulted in the dataset below. 


I like this one because it captures both short-lived spikes and longer-term increases.

Weaknesses of this modeling approach:
1. Doesn't represent the intensity of the increase (magnitude of the slope). I think it's mostly the degree of the slope that differentiates a kidnapping instance's covariance spike from a regular localization covariance spike. 
2. I'd like a way of identifying "This timestep and the 5 previous timesteps were ALL 1's" - somehow, that needs to make it into the model.
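Both encodings above can be sketched in a few lines of Python (function names are mine):

```python
def slope_code(history):
    """Consecutive-comparison encoding: over the 9 most recent covariance[0]
    values, bit k is 0 if the older point is larger than the newer, else 1."""
    w = history[-9:]
    return [0 if w[i] > w[i + 1] else 1 for i in range(8)]

def window_code(history):
    """Windowed encoding: average three windows of three prior points apiece
    and compare adjacent window means (0 if the older mean is larger, else 1)."""
    w = history[-9:]
    means = [sum(w[i:i + 3]) / 3.0 for i in (0, 3, 6)]
    return [0 if means[i] > means[i + 1] else 1 for i in range(2)]

# A steady fall followed by a steady rise:
print(slope_code([5, 4, 3, 2, 1, 2, 3, 4, 5]))  # [0, 0, 0, 0, 1, 1, 1, 1]
```

One way to address weakness #1 would be to keep the raw differences w[i+1] - w[i] instead of thresholding them to bits, so the magnitude of the slope survives into the model.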


Time/Covariance[0]  t9>t8  t8>t7  t7>t6  t6>t5  t5>t4  t4>t3  t3>t2  t2>t1
32.68 0 0 0 0 0 0 0 0
43 0 0 0 0 0 0 0 1
43.36 0 0 0 0 0 0 1 0
43.64 0 0 0 0 0 1 0 0
44.18 0 0 0 0 1 0 0 0
50.3 0 0 0 1 0 0 0 0
50.57 0 0 1 0 0 0 0 0
54 0 1 0 0 0 0 0 0
56.17 1 0 0 0 0 0 0 0
57.52 0 0 0 0 0 0 0 0
58.44 0 0 0 0 0 0 0 0
61.6 0 0 0 0 0 0 0 0
62.4 0 0 0 0 0 0 0 0
65.14 0 0 0 0 0 0 0 0
69.98 0 0 0 0 0 0 0 1
73.63 0 0 0 0 0 0 1 0
73.93 0 0 0 0 0 1 0 0
74.33 0 0 0 0 1 0 0 0
77.55 0 0 0 1 0 0 0 0
80.56 0 0 1 0 0 0 0 1
81.55 0 1 0 0 0 0 1 1
85.61 1 0 0 0 0 1 1 1
86.2 0 0 0 0 1 1 1 1
88.1 0 0 0 1 1 1 1 1
89.54 0 0 1 1 1 1 1 0
91.81 0 1 1 1 1 1 0 0
93.91 1 1 1 1 1 0 0 1
94.47 1 1 1 1 0 0 1 1
95.91 1 1 1 0 0 1 1 0
97.32 1 1 0 0 1 1 0 0
102.73 1 0 0 1 1 0 0 1
104.1 0 0 1 1 0 0 1 1
104.61 0 1 1 0 0 1 1 1
105.72 1 1 0 0 1 1 1 1
107.29 1 0 0 1 1 1 1 0
108.7 0 0 1 1 1 1 0 1
110.17 0 1 1 1 1 0 1 1
111.98 1 1 1 1 0 1 1 1
113.45 1 1 1 0 1 1 1 0
114.23 1 1 0 1 1 1 0 0
114.82 1 0 1 1 1 0 0 0
118.75 0 1 1 1 0 0 0 0
120.1 1 1 1 0 0 0 0 0
123.97 1 1 0 0 0 0 0 0
126.51 1 0 0 0 0 0 0 0
129.75 0 0 0 0 0 0 0 1
130 0 0 0 0 0 0 1 1
130.29 0 0 0 0 0 1 1 0
130.56 0 0 0 0 1 1 0 0
131.8 0 0 0 1 1 0 0 0
131.28 0 0 1 1 0 0 0 1
131.59 0 1 1 0 0 0 1 1
132.75 1 1 0 0 0 1 1 0
133.5 1 0 0 0 1 1 0 0
134.2 0 0 0 1 1 0 0 1
134.76 0 0 1 1 0 0 1 0
135.62 0 1 1 0 0 1 0 1
136.23 1 1 0 0 1 0 1 0
137.76 1 0 0 1 0 1 0 0
138.89 0 0 1 0 1 0 0 1
140.61 0 1 0 1 0 0 1 1
141.88 1 0 1 0 0 1 1 1
143.37 0 1 0 0 1 1 1 1
144.71 1 0 0 1 1 1 1 1