KinectPython

Human-Robot Collaboration for fabric folding using RGB-D sensing and Kalman filters.

Table of Contents

  1. Description
  2. Installation
  3. Usage
  4. Examples

Description

This algorithm is designed to utilize the Kinect for Windows v2 RGB-D sensor, the KUKA LWR IV+ industrial robot, the ATI Gamma Force/Torque sensor, the Reflex One gripper and non-linear Kalman filter estimation to enable human-robot collaboration for fabric folding. This algorithm is an extension of this publication. First, the algorithm identifies the laid-out fabric using background subtraction and maps the fabric's corner points to the corresponding world-space coordinates. After the fabric is located, the operator can enter the collaborative space and grab a fabric corner. The decision model then computes the robot's starting point and commands the robot to approach and grasp the fabric using the appropriate grasping model. After the robot has grabbed the fabric, it follows the operator's movement to properly fold the fabric. The constructed framework and the hardware used can be seen in the figure below.

Hardware Specifications
CPU: Intel Core i3-3110M, 2.4 GHz, 2 cores / 4 threads
GPU: NVIDIA GeForce GT 710M, 1 GB GPU memory
RAM: 8 GB DDR3, 1600 MHz
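
The fabric-detection step described above follows the classic background-subtraction idea. A minimal sketch with plain OpenCV is shown below; the image file names and threshold value are illustrative assumptions, not the repository's actual code or parameters.

    import cv2

    # Background frame captured beforehand and a frame with the fabric laid out
    # (hypothetical file names used for illustration).
    background = cv2.imread('background.png')
    current = cv2.imread('fabric_on_table.png')

    # Absolute difference against the stored background, then a binary threshold.
    diff = cv2.cvtColor(cv2.absdiff(current, background), cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)

    # Keep the largest contour as the fabric and approximate its corner points.
    # ([-2] keeps this compatible with both OpenCV 3 and 4 return signatures.)
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    fabric = max(contours, key=cv2.contourArea)
    corners = cv2.approxPolyDP(fabric, 0.02 * cv2.arcLength(fabric, True), True)
    print('Fabric corner pixels:', corners.reshape(-1, 2))

The detected corner pixels are then mapped to world-space coordinates through the Kinect's color-to-camera-space mapping, which the repository's code handles internally.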

Installation

Kinect Setup

To install the drivers for the Kinect for Windows v2, download and install the Kinect for Windows SDK 2.0.

Python Environment Installation

The constructed architecture works only on Windows and has been tested with Python 3.6. First, create a fresh conda virtual environment with Anaconda that uses Python 3.6 by following these steps:

  1. Download and Install Anaconda for Windows using this link.

  2. Create a new virtual env with Python 3.6. Open the Anaconda Prompt and type the following command.

    conda create -n name_of_your_environment python=3.6
    
  3. Activate the constructed environment.

    conda activate name_of_your_environment
    
  4. Install all requirements from requirements.txt using the following command.

    pip install -r requirements.txt
    
  5. Download all the files using git clone or the .zip option and place them in a folder of your choice.

  6. Open the Anaconda Prompt and type the following commands to find the directory of the Python installed in the conda environment.

    conda activate name_of_your_environment
    where python
    
  7. Navigate to the displayed Python directory, for example:

    C:\Users\UserName\.conda\envs\name_of_your_environment
    
  8. Navigate inside the installed pykinect2 library of that Python:

    C:\Users\UserName\.conda\envs\name_of_your_environment\Lib\site-packages\pykinect2
    
  9. Replace all the files inside the installed pykinect2 library with the files located in the pykinect2_original folder of the repository's downloaded files (a quick import check is sketched after this list).

  10. Add the Python directory to the system's PATH environment variable.

    • Search for "Edit the system environment variables".
    • In the Advanced tab, click on Environment Variables.
    • Under System variables, scroll down, select Path and click Edit.
    • Click New and add the following paths.
      C:\Users\UserName\.conda\envs\name_of_your_environment
      C:\Users\UserName\.conda\envs\name_of_your_environment\python.exe
      C:\Users\UserName\.conda\envs\name_of_your_environment\Library\bin
      C:\Users\UserName\.conda\envs\name_of_your_environment\Scripts
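
With the pykinect2 files replaced and the paths added, a quick sanity check like the following (not part of the repository; it assumes the Kinect is connected and the SDK is installed) confirms that the patched library is picked up by the new environment:

    # Run inside the activated conda environment.
    from pykinect2 import PyKinectV2, PyKinectRuntime

    # Opening the color and depth streams fails if the SDK or the replaced
    # pykinect2 files are not set up correctly.
    kinect = PyKinectRuntime.PyKinectRuntime(
        PyKinectV2.FrameSourceTypes_Color | PyKinectV2.FrameSourceTypes_Depth)

    print('Color frame size:', kinect.color_frame_desc.Width, 'x',
          kinect.color_frame_desc.Height)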
      

RoboDK Installation

  1. To configure RoboDK, download and install the latest version of RoboDK using this link.

  2. After downloading and installing RoboDK, load all the 3D models from the Models/ folder and place them in the correct positions.

  3. The RoboDK/KUKA/KUKA.rdk file contains the constructed workspace of our laboratory, including the Kinect, the robot and the table, and can be loaded in RoboDK.

  4. After constructing the collaborative space, establish a connection to the real KUKA robot using the instructions in the RoboDK/KUKA_2_ROBODK_COMMUNICATION/Instructions.txt file (a minimal connection check is sketched after this list).

  5. After loading the files and connecting to the robot, leave RoboDK open and connected.
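
As a quick check of the robot connection, a short script along the following lines can verify that RoboDK is running and the robot item is reachable through the API. The item name is an assumption; use the name shown in your loaded .rdk station.

    from robolink import Robolink, ITEM_TYPE_ROBOT

    RDK = Robolink()  # attaches to the already running RoboDK instance
    robot = RDK.Item('KUKA LWR IV+', ITEM_TYPE_ROBOT)  # hypothetical item name
    if not robot.Valid():
        raise RuntimeError('Robot item not found in the loaded .rdk station')

    # Try to connect to the real controller using the IP configured in
    # RoboDK's connection panel; skip this to stay in simulation only.
    status = robot.Connect()
    print('Connect status:', status)
    print('Current joints:', robot.Joints())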

Reflex One Gripper Installation

The Reflex One Gripper works only with ROS Jade on Ubuntu 14.04 LTS. To install and configure the Reflex One software, follow the instructions in the Gripper/instructions.txt file.

ATI Gamma FT Sensor Installation

The ATI Gamma FT sensor works on both Linux and Windows with Python 2 or 3. To set up the sensor, follow the instructions in the ATI_FT/instructions.txt file.
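
Both the gripper and the force/torque sensor are exposed to the main application through small UDP servers (see the Usage section). Assuming the ATI_FT/ati_ft_sensor.py server is already running on the default localhost:10000 used later in track_v3.py and that encryption is disabled, a plain UDP probe like the sketch below can confirm it is reachable; the greeting payload is a placeholder assumption, not the actual protocol.

    import socket

    ATI_FT_IP, ATI_FT_PORT = 'localhost', 10000

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(2.0)
    sock.sendto(b'hello', (ATI_FT_IP, ATI_FT_PORT))  # placeholder handshake payload
    try:
        data, _ = sock.recvfrom(1024)
        print('Server replied:', data.decode(errors='replace'))
    except socket.timeout:
        print('No reply: check that ati_ft_sensor.py is running and the port matches')
    finally:
        sock.close()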

Usage

If the installation is complete and everything works, follow these steps to use the code.

  1. Start the KUKA Controller, select the RoboDKsync35.src file, run it in an automatic loop from the teach pendant and lower the robot speed for safety.

  2. Open RoboDK, load the workstation file, connect to the robot and leave it open.

  3. Connect the Gripper and start the ROS API by running the ros_server.sh bash file.

  4. Power On the ATI controller, connect the ATI FT sensor via USB and run the ATI_FT/ati_ft_sensor.py file.

  5. Run the test_view.py file to adjust the Kinect's view and position.

  6. Capture the background by running the background_photo.py file.

  7. Then lay down the fabric and make sure it is unfolded properly.

  8. Connect the Kinect via USB to the computer.

  9. Open the track_v3.py file and change the following flags according to what you want to use:

    dim = True  # Flag for finding the fabric's dimensions
    cal = False  # Flag for calibrating the camera
    Sim = True  # Flag for starting RoboDK
    RealMovement = True  # Flag to move the real robot with RoboDK
    gestureInit = True  # Flag for custom Gesture classifier
    gripperInit = True  # Flag to connect to gripper
    sensorInit = True  # Flag to connect to the ATI FT Sensor
    kalmanInit = True  # Flag for drawing kalman on screen
    skeletonInit = True  # Flag for drawing kinect's skeleton tracking on screen
    cloudInit = False  # Flag for Cloud Skeleton Visualize in RoboDK
    cloudPointInit = False  # Flag to import the workspace as a pointCloud in RoboDK
    full_screen = False  # flag to open pygame in fullscreen
    

    Then change the following parameters to match your own configuration. Specifically, the robot controller's IP and port:

    """======================== ROBOT CONFIGS ==========================="""
    ROBOT_IP = '169.254.98.120'  # KRC2 LAN IP
    ROBOT_PORT = 7000  # KRC2 LAN port
    

    The gripper ROS API IP, port and encryption flag (the encryption flag must match the server's value):

    """============================= Gripper Configs ==========================="""
    VM_IP = '192.168.56.2'  # Vm with Ubuntu Host only Static IP
    VM_PORT = 20000  # Port to communicate with Ubuntu running ROS
    VM_SERVER_ENCRYPTION = True
    

    The ATI controller server IP, port, encryption flag (must match the server's value) and the COM port that the ATI FT controller is connected to:

    """========================== ATI FT Sensor Configs ====================="""
    ATI_FT_IP = 'localhost'
    ATI_FT_PORT = 10000
    ATI_FT_SERVER_ENCRYPTION = True
    ATI_FT_COM_PORT_WINDOWS = 'COM1'  # Port that the DAQ ATI is connected to the windows computer
    ATI_FT_COM_PORT_LINUX = '/dev/ttyUSB0'  # Port that the ATI Controller FT is connected to the linux computer
    
  10. Save and run the track_v3.py file.

  11. If everything is correct, you will see output like the following on the screen:

    +-----------------------------+
    [MAIN] Elapsed Time: seconds
    [MAIN] Loaded: 100%
    [MAIN] Starting...
    +-----------------------------+
    [ATI FT CLIENT]: Message from Server: Hello UDP Client
    [ATI FT CLIENT]: Message from Server: Hello UDP Client
    [ATI FT CLIENT]: Message from Server: Started ATI FT... You can grab...
    +-------------------+
    Connecting to Gripper Server
    [GRIPPER CLIENT]: Message from Server: Hello UDP Client
    [GRIPPER CLIENT]: Message from Server: Hello UDP Client
    [GRIPPER CLIENT]: Message from Server: Started ROS... You can publish...
    [GRIPPER CLIENT]: Message from Server: Hello UDP Client
    [GRIPPER CLIENT]: Message from Server: Opened Gripper
    [ROBODK]: Connection Successful
    [ROBODK]: Robot Status: Connected
    +-------------------+
    Fabric Detected
    Width (mm):
    Height (mm):
    World Center Point (XYZ in mm):
    Color2World Fabric Points (mm):
    ISO Speed Calculated
    +-------------------+
    +-------------------+
    Starting Tracking
    +-------------------+

When the Starting Tracking message shows up, the operator can enter the collaborative space and grab a fabric corner. The decision model then computes the robot's starting point and commands the robot to approach and grasp the fabric using the appropriate grasping model. After the robot has grabbed the fabric, it follows the operator's movement to properly fold the fabric.
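
The operator's hand positions reported by the Kinect are noisy, which is why the tracking relies on Kalman filtering. The sketch below illustrates the general idea with a constant-velocity filter over 3D hand positions; the model and noise values are illustrative assumptions, not the repository's actual estimator.

    import numpy as np

    dt = 1 / 30.0                                  # Kinect body frames arrive at ~30 Hz
    F = np.block([[np.eye(3), dt * np.eye(3)],     # constant-velocity state transition
                  [np.zeros((3, 3)), np.eye(3)]])
    H = np.hstack([np.eye(3), np.zeros((3, 3))])   # only the 3D position is measured
    Q = 1e-4 * np.eye(6)                           # process noise (tuning assumption)
    R = 1e-2 * np.eye(3)                           # measurement noise (tuning assumption)

    x = np.zeros(6)                                # state: [x, y, z, vx, vy, vz]
    P = np.eye(6)

    def kalman_step(z):
        """Predict and update with one 3D hand-position measurement z."""
        global x, P
        x_pred = F @ x                             # predict
        P_pred = F @ P @ F.T + Q
        S = H @ P_pred @ H.T + R                   # update
        K = P_pred @ H.T @ np.linalg.inv(S)
        x = x_pred + K @ (z - H @ x_pred)
        P = (np.eye(6) - K @ H) @ P_pred
        return x[:3]                               # smoothed position for the robot target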

Examples

The constructed collaborative space can be seen inside the RoboDK simulation space:

The framework was tested by performing two folds in two different directions. Image examples from the first fold:
Fold 1: Start Fold 1: End

Image examples from the second fold:
Fold 2: Start Fold 2: End
