empanada-napari

Important: New Version Announcement!

  • New modules
    • Morph labels - applies morphological operations to labels
    • Count labels - counts and lists the label IDs within the dataset
    • Filter labels - removes small pixel/voxel area labels or labels touching the image boundaries
    • Export and import a model - export or import locally saved model files to use within empanada-napari
  • Updated modules
    • Export segmentations - now allows 3D segmentations to be exported as a single .tiff image
    • Pick and save finetune/training patches - now accepts paired grayscale and label mask images for creating training patches
    • Split label - now allows users to specify new label IDs
  • Updated documentation
    • Check out the updated documentation here!

The paper describing this work is now available in Cell Systems.

Documentation for the plugin, including more detailed installation instructions, can be found here.

empanada is a tool for deep learning-based panoptic segmentation of 2D and 3D electron microscopy images of cells. This plugin allows panoptic segmentation models trained in empanada to be run within napari. For help with this plugin, please open an issue; for issues with napari itself, raise an issue here instead.

Implemented Models

  • MitoNet: A generalist mitochondrial instance segmentation model.

Example Datasets

Volume EM datasets for benchmarking mitochondrial instance segmentation are available from EMPIAR-10982.

Installation

New Users

If you've previously installed and used conda, it's recommended (but optional) to create a new virtual environment to avoid dependency conflicts.

empanada-napari works with Python 3.9 or lower.
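
For example, a minimal sketch of that setup (the environment name empanada is an arbitrary choice) might look like:

conda create -n empanada python=3.9
conda activate empanada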

It's recommended to install napari through conda. Then install this plugin:

pip install empanada-napari==1.1.1

Launch napari:

napari

Look for empanada-napari under the "Plugins" menu.

Returning Users

If you installed napari into a virtual environment as suggested in the original release documentation, be sure to activate it and uninstall the old version of empanada-napari:

pip uninstall empanada-napari

Then install the newest version:

pip install empanada-napari==1.1.1


GPU Support

Note: Macs don't support NVIDIA GPUs; this section only applies to Windows and Linux systems.

As with any deep learning model, having a GPU installed on your system will significantly increase model throughput (although CPU-optimized versions of all models ship with the plugin).

This plugin relies on torch for running models. If a GPU is found on your system, the "Use GPU" checkbox will be checked by default in the "2D Inference" and "3D Inference" plugin widgets. Conversely, if you see a "Using CPU" message in the terminal while running inference, a GPU is not being used.
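
You can also ask torch directly whether it sees a usable GPU; torch.cuda.is_available() returns True only when a CUDA-enabled build of torch finds a working driver:

python -c "import torch; print(torch.cuda.is_available())"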

Make sure that the GPU drivers are correctly installed. In a terminal or command prompt, run:

nvidia-smi

If this returns "command not found", then you need to install the driver from NVIDIA. If instead the driver is installed correctly, you may need to switch to the GPU-enabled version of torch.

First, uninstall the current version of torch:

pip uninstall torch

Then install torch >= 1.10 using conda for your system. This command should work:

conda install pytorch cudatoolkit=11.3 -c pytorch
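
Afterwards, you can confirm that the CUDA build of torch is active (the exact CUDA version printed depends on the build you installed):

python -c "import torch; print(torch.cuda.is_available(), torch.version.cuda)"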

Citing this work

If you use results generated by this plugin in a publication, please cite:

@article{Conrad2023,
    author = {Conrad, Ryan and Narayan, Kedar},
    title = {Instance segmentation of mitochondria in electron microscopy images with a generalist deep learning model trained on a diverse dataset},
    journal = {Cell Systems},
    year = {2023},
    month = {Jan},
    day = {18},
    publisher = {Elsevier},
    volume = {14},
    number = {1},
    pages = {58-71.e5},
    issn = {2405-4712},
    doi = {10.1016/j.cels.2022.12.006},
    url = {https://doi.org/10.1016/j.cels.2022.12.006}
}