Elastic Fast Marching Learning from Demonstrations.

In this repository we present the code for a novel method for learning skills from human demonstrations. Our method, Elastic Fast Marching Learning (EFML), combines ideas from Elastic Maps--a Learning from Demonstration (LfD) method based on a mesh of springs--and Fast Marching Learning--an LfD method based on velocity fields. These two methods each use real-world physical phenomena, spring energy and light velocity, to find reproductions with desirable properties such as the capability to be trained using single or multiple demonstrations, meeting any number of initial, final, or via-point constraints, and finding smooth reproductions. The algorithm displays advantages in terms of precision, smoothness, and speed. We validate our method in several simulated and real-world environments, compare against Elastic Maps and Fast Marching Learning as well as other contemporary methods, and perform demonstrations and capture reproductions on a real-world robot.

Installation

To use EFML on your device, follow the installation steps below.

Requirements:

  • Matlab 2020b or higher

Install from source

You can clone the repository onto your system:

git clone https://github.com/AdrianPrados/ElasticFastMarchingLearning.git

Folder structure

The algorithm is a combination of two methods, allowing users to take demonstrations of a specific task and generate a smooth solution that generalizes that task. The code is divided into different folders:

The LASA_dataset folder contains the LASA handwriting human motion dataset used for the 2D experiments.

The RAIL folder contains the dataset generated by RAIL for 3D manipulation tasks. The folder is divided into 4 tasks. For each task there is also a version adapted to the ADAM robot (for example, PressingAdam).

The RealExperiment folder contains the data used and generated in the real experiments on the ADAM robot.

The cvx-a64 folder contains the CVX package, which solves convex optimization problems efficiently. CVX turns MATLAB into a modeling language, allowing constraints and objectives to be specified using standard MATLAB expression syntax (a short sketch is given after this list).

The algorithms and fm2tools folders contain the scripts needed for the Fast Marching Learning method to work properly.

The elmap folder contains the Elastic Map implementation.

The test folder contains tests to check that everything works correctly.
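
As a quick reference for what the CVX modeling syntax mentioned above looks like, here is a minimal, self-contained sketch (the matrix A and vector b are invented purely for illustration and are unrelated to EFML):

A = randn(5,3); b = randn(5,1); % small random least-squares problem, just for illustration
cvx_begin quiet
    variable x(3)                % decision variable declared with CVX syntax
    minimize( norm(A*x - b) )    % objective written as a standard MATLAB expression
    subject to
        x >= 0                   % constraint written the same way
cvx_end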

Executing program

The EFML method is applicable in both 2D and 3D environments. For that purpose, two different scripts have been created that allow the user to run the algorithm in each case.

2D Environments

To work in 2D, the script to launch is:

ElasticFML_2D.m

This script can work with synthetic data generated by a specific equation or with real data captured by the user.

N = 1000;
t = linspace(0, 10, N);
t = reshape(t, [N, 1]);
% Create your own data (uncomment if you want to generate your own data)
n=1; % Number of demonstrations to be acquired (can be modified to more than 1)
data = [];
[x1,constrains] = CaptureData(n);

for j=1:n
    indices_originales{j} = linspace(1, size(x1{1,j}, 2), N);
    traj = interp1(x1{1,j}', indices_originales{1,j});
    data{j} = traj;
end

% Simulated data (uncomment if you want specific synthetic data)
x1 = -0.005*abs((t-5).^3) + 0.1 * sin(t) - 0.5;
n=1;
traj = [t, x1];
traj = abs(traj * 100) + 1;
data = traj;
constrains = [1, 100; 1000 100];

For real data, the user captures the demonstration with the CaptureData function, which records points clicked with the mouse. The user clicks on the initial point, the final point, and the via points; the number of via points can be modified inside CaptureData. The captured data is stored in the x1 variable and the constraints in the constrains variable, a matrix whose first column is the time index and whose second column is the position.
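
For example, following the format above, a constrains matrix that pins the reproduction at the start, at a via point halfway through the N = 1000 samples, and at the end could look like this (the position values are purely illustrative):

% Each row is one constraint: [time index, position]
constrains = [1, 100;     % initial point (sample 1)
              500, 80;    % via point halfway through (illustrative)
              1000, 100]; % final point (sample 1000)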

3D Environments

To work in 3D, the script to launch is:

ElasticFML_3D.m

For 3D environments, it is first necessary to create a matrix of a specified size in which Fast Marching Learning can work. This array can contain the objects that are considered obstacles in the environment.

%% Loading the base map and the obstacles
% Follows the next structure: Wo(Y,X,Z)
Wo_size = [180,180,180]; % Matrix size for our robot ADAM, it can be changed
Wo=ones(Wo_size); % 1 = free space, 0 = obstacle
% Workspace borders (set them to 0 instead if the borders should act as walls)
Wo(1,:,:)=1;
Wo(Wo_size(1),:,:)=1;
Wo(:,1,:)=1;
Wo(:,Wo_size(2),:)=1;
Wo(:,:,1)=1;
Wo(:,:,Wo_size(3))=1;

%% Obstacles can be added to the environment
% Wo(78.5:99.5, 41:66, 75:95)=0; % Box obstacle
Wo(50:90,16:41,1:100)=0; % Robot body (always necessary to avoid self-collisions)
% Wo(10:130,41:120,1:75)=0; % Table

The demonstrations in this case are acquired kinaesthetically using the robot's own arm, or taken from a previously chosen dataset:

matFiles = dir('/home/adrian/Escritorio/ElasticMaps/ElMapMatlab/RAIL/PressingAdam/Pressing5/*.mat'); % Filtered data to improve the speed of FML
matFiles2 = dir('/home/adrian/Escritorio/ElasticMaps/ElMapMatlab/RAIL/PressingAdam/Pressing5NoFilter/*.mat'); % Unfiltered data for comparison
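
As a hedged sketch of how the listed files could then be loaded (the paths above are the author's local ones, and the variable name stored inside each .mat file depends on the RAIL dataset, so adapt the field access to your files):

demos = cell(1, numel(matFiles));
for k = 1:numel(matFiles)
    S = load(fullfile(matFiles(k).folder, matFiles(k).name)); % load one demonstration file
    fn = fieldnames(S);   % assumption: each .mat file stores a single trajectory variable
    demos{k} = S.(fn{1}); % keep the trajectory for the learning step
end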

The algorithm allows the user to generate paths with different constraints. To use them, it is necessary to provide the index of each constraint and its location (the 3D coordinate).

% Constraints: the initial point of the FML path, a via point (ConstraintX/Y/Z) and the final point of the FML path
constrains = [pathFML(1,1) pathFML(2,1) pathFML(3,1); ConstraintX,ConstraintY,ConstraintZ; pathFML(1,end) pathFML(2,end) pathFML(3,end)];
% Elastic Map refinement of the FML path, constrained at indices 1, constraintIndex and the last sample
nodes = ElasticMap3D(data, wei, stretch, bend, [1, constraintIndex, length(pathFML)], constrains, n, pathFML', 1000);
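
For reference, the via-point variables used above could be filled in as follows (these values are purely illustrative; any cell inside Wo and any valid index along pathFML can be chosen):

constraintIndex = round(length(pathFML)/2); % illustrative: constrain the middle of the FML path
ConstraintX = 90; ConstraintY = 60; ConstraintZ = 80; % illustrative coordinates inside the 180x180x180 workspace Wo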

The code was written and tested with MATLAB 2021b on Ubuntu 20.04. The final pick-and-place task using the ADAM robot in a real environment is presented in this video.

Citation

This work has been done by Adrián Prados and Brendan Hertel. Both authors have contributed equally.

If you have any questions about the algorithm presented, please contact Adrián Prados ([email protected]) or Brendan Hertel ([email protected]). If you use this code or the data, please cite the following papers:

Paper for Fast Marching Learning

@article{prados2023kinesthetic,
  title={Kinesthetic Learning Based on Fast Marching Square Method for Manipulation},
  author={Prados, Adri{\'a}n and Mora, Alicia and L{\'o}pez, Blanca and Mu{\~n}oz, Javier and Garrido, Santiago and Barber, Ram{\'o}n},
  journal={Applied Sciences},
  volume={13},
  number={4},
  pages={2028},
  year={2023},
  publisher={MDPI}
}

Paper for Elastic Maps

@inproceedings{hertel2022robot,
  title={Robot Learning from Demonstration Using Elastic Maps},
  author={Hertel, Brendan and Pelland, Matthew and Ahmadzadeh, S Reza},
  booktitle={2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  pages={7407--7413},
  year={2022},
  organization={IEEE}
}

Paper for the Elastic Fast Marching Learning

In construction 👷
