Sunday, September 24, 2017

Daily Goals

To add the Gazebo poses, here's what I've found:
*Major Kidnapping Trial 4 is missing a print_amcl_gazebo_poses.txt file. Booo.
*The angle for Gazebo is a quaternion. Need to write a Python script to convert that to radians.
I'm in instant-gratification mode now, though, so I'm going to just see if I can get X,Y plotting for the Gazebo pose working on Major Kidnapping Trial 5, and then I'll add the angle.
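
For when I get around to the angle: the conversion itself is tiny, since a ground robot's quaternion only encodes yaw. A minimal sketch (in MATLAB here, though the same atan2 math drops straight into a Python script; qx, qy, qz, qw are assumed names for the quaternion components in the Gazebo log):

    % Sketch: yaw (radians) from a quaternion, for a planar robot.
    % qx, qy, qz, qw are assumed names - the log doesn't have these columns yet.
    yaw = atan2(2*(qw*qz + qx*qy), 1 - 2*(qy^2 + qz^2));
    % for a robot flat on the ground (qx = qy = 0) this reduces to 2*atan2(qz, qw)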

*****************

Update! Got the Gazebo poses added to the graph. It's enlightening - the blue dot is the robot's ground-truth pose in Gazebo:

Simulation starts at (0,0) and AMCL is started at (0,0), but the particle cloud is pretty big:

Cloud shrinks down as robot moves.  (Pose estimates are off by a quarter of a meter in each direction...)

BOOM! Kidnapped.


Plotting Mean and AMCL Poses

Plotted particle cloud + AMCL Pose + Mean X/Y/Theta all together!  Here are a couple samples:

Robot's not lost:


Robot's lost:

Do you notice a difference? (I don't...)
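
For the record, each overlay frame boils down to a few plotting calls. Here's a minimal sketch (every variable name in it is mine, not the script's); the one subtlety I want to remember is that MeanTheta needs a circular mean, since a plain mean() of angles breaks near +/-pi:

    % Minimal sketch of one overlay frame - all variable names are assumed.
    scatter(cloudX, cloudY, 8, [0.5 0.5 0.5]);          % particle cloud
    hold on
    [au, av] = pol2cart(amclTheta, 0.1);
    quiver(amclX, amclY, au, av, 0, 'color', [1 0 0]);  % AMCL's pose estimate
    % circular mean of the particle headings, instead of a plain mean()
    meanTheta = atan2(mean(sin(cloudTheta)), mean(cos(cloudTheta)));
    [mu, mv] = pol2cart(meanTheta, 0.1);
    quiver(mean(cloudX), mean(cloudY), mu, mv, 0, 'color', [0 0.6 0]);
    hold off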

Structure



One of the biggest take-aways from Paul Silvia's "How To Write A Lot: A Practical Guide to Productive Academic Writing" is that you HAVE to make a schedule for when you're going to do your research writing.  I actually found it helpful that he's aiming at PhD students and professors: besides writing up their research, they have a lot of other responsibilities - teaching, administration, grading homework - and research can get pushed aside by all of them.  It's akin to my situation working full-time, so his advice to JUST MAKE TIME FOR YOUR WORK - the same way you make time every Thursday night to watch a half-hour of Donnybrook - hit home.

And I have good news: now that I'm making time - an hour after I get off work every day, getting up early on the weekend - I'm actually getting stuff done (see the recent blog posts!).

Another principle from the book is goal-setting: identifying project goals, as well as the daily steps to implement them.  I haven't really been doing that.  Therefore, here goes:

My current project goal is to find a method of arranging data so that the Neural Network can find a really tight function approximation for the data.

Daily goals:
TODAY: I need visualizations of the following:
* Particle Cloud Poses + AMCL Pose + Gazebo Pose + MeanX/MeanY/MeanTheta Pose
* Revisit the mean/stdev dataset and somehow see if you can tell if there's a function to find there.


Another method from the book is keeping track of your productivity: make a spreadsheet tracking date and words written (or lines of code, etc.), then use it like a fitness tracker - are you on track for this week? Can you beat last week's total? Might be good to implement.

Friday, September 22, 2017

Woot woot! Got My Arrows

Turns out I wasn't converting the AMCL Theta into [u,v] coordinates before plotting the arrow. Once I added that step, my plots have the AMCL pose mapped!


        [u,v] = pol2cart(points(1,6),0.1);   % theta -> a 0.1 m [u,v] direction vector
        hold on, fig = quiver(points(1,4),points(1,5),u,v,'color',[0 0 1]);


Isn't my little AMCL arrow cute?



Wish List

As I'm formatting my data for review, here's what I want:
-Particle cloud arrows plotted at each timestep (got this)
-AMCL's position estimate plotted at each timestep as a big red arrow (having trouble getting this)
-Gazebo position plotted at each timestep as a big green arrow (don't have any infrastructure for this)
-Plot the meanX and meanY positions on the particle cloud as dots, too.

Also, format the mean/stddev files for review.

Wednesday, September 20, 2017

Generating images of particle clouds

%import data file:
InputFileName = <>                 % tab-delimited particle cloud log
SaveFilesToFolder = <>             % folder the .png frames get saved into
TimesParticleCloudPoints = readtable(InputFileName,'Delimiter', '\t');
TimesParticleCloudPoints = table2array(TimesParticleCloudPoints);
times = unique(TimesParticleCloudPoints(:,1)); %get distinct time periods
for t = 1:numel(times)
    if times(t) >= 70              % the kidnapping event is at t = 70 in this trial
        disp('KIDNAPPING: ')
        disp(t)
    end

    % every particle logged at this timestep; columns 3 and 4 are X and Y
    rows = TimesParticleCloudPoints(:,1) == times(t);
    points = TimesParticleCloudPoints(rows, 3:4);
    scatter(points(:,1), points(:,2));

    if times(t) >= 70
        header = sprintf('Post-Kidnapped. T = %g', times(t));
    else
        header = sprintf('T = %g', times(t));
    end
    title(header);

    saveas(gcf, strcat(SaveFilesToFolder, int2str(t), '.png'));
end

Expanding this to draw arrows - this link seems helpful. My angles are in the dataset as quaternions, so they'll need converting to headings first (sketch below the link).

https://stackoverflow.com/questions/1803043/how-do-i-display-an-arrow-positioned-at-a-specific-angle-in-matlab
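
Here's a minimal sketch of where I think this is headed - per-particle arrows via a quaternion-to-yaw conversion plus quiver. All the variable names are mine (X, Y for particle positions, qx through qw for their orientation quaternions), not what's actually in the file:

    % Sketch: per-particle arrows. X, Y, qx, qy, qz, qw are assumed column vectors.
    yaw = atan2(2*(qw.*qz + qx.*qy), 1 - 2*(qy.^2 + qz.^2)); % planar yaw, radians
    [u, v] = pol2cart(yaw, 0.1);               % headings -> 0.1 m [u,v] vectors
    hold on
    quiver(X, Y, u, v, 0, 'color', [0 0 1]);   % scale 0 = draw u,v as given
    hold off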
   

Update

New ideas that came out of conversation yesterday:

-Package up the point cloud datasets - a few examples of each, with kidnapping events identified - and send them out for a second opinion.

-Try plotting the mean/stdev datasets from before; if you can't identify a trend or cutoff point, the NN won't be able to, either.

-Figure out what the covariance matrix represents, and why the first element has been so useful. Might give insight on how to study it.
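
One piece of the covariance question I can already answer from the message definition: amcl's pose comes wrapped in a geometry_msgs/PoseWithCovariance, whose covariance field is a row-major 6x6 matrix over (x, y, z, rotation about X, rotation about Y, rotation about Z). So covariance[0] is the variance of the x estimate, which would explain why it balloons when the cloud spreads out. A sketch of unflattening it (covRow is an assumed name for the 36 logged values):

    % Sketch: unflatten 36 row-major covariance values into a 6x6 matrix.
    % covRow is an assumed 1x36 vector pulled from the log.
    C = reshape(covRow, 6, 6)';  % transpose because MATLAB reshapes column-major
    varX   = C(1,1);             % covariance[0]: variance of the x estimate
    varY   = C(2,2);
    varYaw = C(6,6);             % heading (rotation about Z) variance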

Sunday, September 10, 2017

Same Topic, Various Guises

I'm reading "Writing Your Dissertation in 15 Minutes A Day" by Joan Bolker, and I liked one of the parts enough to blog about it:

"Some people seem always to have known what they want to write their dissertations about. They are the lucky ones...Some, like me, have written their way through the same topic in various guises often enough so they know it's theirs for life."

The second sentence there stuck out to me, because whether it's robots or people, pulling information from messy data sets appears to be kind of my schtick when it comes to research.  For one of my final projects in college (I had 2 - one for Applied Math, one for CompSci), I obtained several decades' worth of U.S. government census data for each of the Saint Louis Metropolitan Area's counties and created a model for the population in-flows and out-flows from each county (spoiler: STL City and County had serious out-flows to St Charles and Jefferson County).  I was really proud of the big Excel spreadsheets I made - for the time span I was looking at, Census data was spread out among a couple websites.  Each source had its own file format, so it took a lot of data cleaning to get a pretty dataset with all the information I needed.  It felt easier to write the programs that manipulated the datasets and distilled their information into a single end result, but I remember that getting the model right was a challenge, too.

There are parallels to this in my current thesis project for finishing my M.S. - again, it took forever to put together the infrastructure that collects robot data. What makes this more complicated is that instead of working with a finite dataset, I'm constantly finding that I need A) more examples of the driving route, and B) different routes to compare against.

Unfortunately, I haven't found a good way to automate the ROS/Python/bash scripts so that all the ROS code I need runs in parallel, and that means there's a lot of set-up each time I want to collect more driving data.  The good part is that I have some solid bash/Python scripts that automate the data cleaning and formatting part after the data has been collected.

Conclusion: It might be worth spending a little more time on getting the data collection infrastructure automated, if that would speed up the rest of the process.

Friday, September 8, 2017

Coding For Slope

This is a covariance[0] plot for a no-kidnapping data collection trial:

I first implemented what I talked about in the last post: I took 3 windows of 3 prior points apiece and averaged each window into a single number.  If the previous window's average was bigger, I coded this as a 0. Otherwise, it was a 1.

This didn't work so well - there were more 1's than I wanted to see in a normal, no-kidnapping dataset. 

The next thing I tried was taking the same 9 previous points and comparing them consecutively - if prev_point9 > prev_point8, it's a 0; otherwise, a 1.  That resulted in the dataset below.


I like this one because it represents short-lived spikes and longer-term increases.
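
To pin down exactly what the two coding schemes do, here's a minimal sketch of both (cov0 is an assumed name for the covariance[0] time series):

    % Sketch of both 0/1 coding schemes over a covariance[0] series cov0 (Nx1).
    N = numel(cov0);
    codes    = zeros(N-9, 8);   % scheme 2: 8 consecutive-comparison flags per row
    winCodes = zeros(N-9, 2);   % scheme 1: 2 window-average flags per row
    for i = 10:N
        prev = cov0(i-9:i-1);   % the 9 points before timestep i
        % scheme 2: 1 unless the older point of each consecutive pair was bigger
        codes(i-9, :) = (diff(prev) >= 0)';
        % scheme 1 (the noisier one): average three 3-point windows, compare means
        w = [mean(prev(1:3)), mean(prev(4:6)), mean(prev(7:9))];
        winCodes(i-9, :) = (diff(w) >= 0);
    end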

Weaknesses of this modeling approach:
1. Doesn't represent the intensity of the increase (magnitude of the slope). I think it's mostly the degree of the slope that differentiates a kidnapping instance's covariance spike from a regular localization covariance spike. 
2. I'd like a way of identifying "This timestep and the 5 previous timesteps were ALL 1's" - somehow, that needs to make it into the model.


Each comparison column below is over consecutive Covariance[0] values:
Time  t9>t8  t8>t7  t7>t6  t6>t5  t5>t4  t4>t3  t3>t2  t2>t1
32.68 0 0 0 0 0 0 0 0
43 0 0 0 0 0 0 0 1
43.36 0 0 0 0 0 0 1 0
43.64 0 0 0 0 0 1 0 0
44.18 0 0 0 0 1 0 0 0
50.3 0 0 0 1 0 0 0 0
50.57 0 0 1 0 0 0 0 0
54 0 1 0 0 0 0 0 0
56.17 1 0 0 0 0 0 0 0
57.52 0 0 0 0 0 0 0 0
58.44 0 0 0 0 0 0 0 0
61.6 0 0 0 0 0 0 0 0
62.4 0 0 0 0 0 0 0 0
65.14 0 0 0 0 0 0 0 0
69.98 0 0 0 0 0 0 0 1
73.63 0 0 0 0 0 0 1 0
73.93 0 0 0 0 0 1 0 0
74.33 0 0 0 0 1 0 0 0
77.55 0 0 0 1 0 0 0 0
80.56 0 0 1 0 0 0 0 1
81.55 0 1 0 0 0 0 1 1
85.61 1 0 0 0 0 1 1 1
86.2 0 0 0 0 1 1 1 1
88.1 0 0 0 1 1 1 1 1
89.54 0 0 1 1 1 1 1 0
91.81 0 1 1 1 1 1 0 0
93.91 1 1 1 1 1 0 0 1
94.47 1 1 1 1 0 0 1 1
95.91 1 1 1 0 0 1 1 0
97.32 1 1 0 0 1 1 0 0
102.73 1 0 0 1 1 0 0 1
104.1 0 0 1 1 0 0 1 1
104.61 0 1 1 0 0 1 1 1
105.72 1 1 0 0 1 1 1 1
107.29 1 0 0 1 1 1 1 0
108.7 0 0 1 1 1 1 0 1
110.17 0 1 1 1 1 0 1 1
111.98 1 1 1 1 0 1 1 1
113.45 1 1 1 0 1 1 1 0
114.23 1 1 0 1 1 1 0 0
114.82 1 0 1 1 1 0 0 0
118.75 0 1 1 1 0 0 0 0
120.1 1 1 1 0 0 0 0 0
123.97 1 1 0 0 0 0 0 0
126.51 1 0 0 0 0 0 0 0
129.75 0 0 0 0 0 0 0 1
130 0 0 0 0 0 0 1 1
130.29 0 0 0 0 0 1 1 0
130.56 0 0 0 0 1 1 0 0
131.8 0 0 0 1 1 0 0 0
131.28 0 0 1 1 0 0 0 1
131.59 0 1 1 0 0 0 1 1
132.75 1 1 0 0 0 1 1 0
133.5 1 0 0 0 1 1 0 0
134.2 0 0 0 1 1 0 0 1
134.76 0 0 1 1 0 0 1 0
135.62 0 1 1 0 0 1 0 1
136.23 1 1 0 0 1 0 1 0
137.76 1 0 0 1 0 1 0 0
138.89 0 0 1 0 1 0 0 1
140.61 0 1 0 1 0 0 1 1
141.88 1 0 1 0 0 1 1 1
143.37 0 1 0 0 1 1 1 1
144.71 1 0 0 1 1 1 1 1