
Folders and files

NameName
Last commit message
Last commit date

Latest commit

 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 

Repository files navigation

RBC3D Banner

RBC3D

Spectral boundary integral solver for cell-scale flows

Authors: S. H. Bryngelson, H. Zhao, A. Isfahani, J. B. Freund

RBC3D is a flow solver for soft capsules and cells that implements the methods discussed in Zhao et al., JCP (2010), among others. It solves the boundary integral form of the Stokes equations with an algorithm tailored for cell-scale simulations:

  • Spectrally accurate spherical harmonics represent the deforming cell surfaces
  • A modified Green’s function approximation handles near-range interactions
  • Electrostatic-like repulsion prevents cells from intersecting
  • A weak formulation imposes no-slip boundary conditions (e.g., at vessel walls)

These features keep simulations robust, and parallel communication via MPI enables large simulations, such as model vascular networks.

Installation

To install on macOS from the cloned repository, run:

brew install gcc mpich gfortran pkg-config wget cmake
./rbc.sh install-mac

and then, from the RBC3D root directory, run the following commands, replacing .zshrc with the file where you store environment variables:

rootdir=`pwd`
echo -e "export PETSC_DIR=$rootdir/packages/petsc-3.21.3 \nexport PETSC_ARCH=arch-darwin-c-opt" >> ~/.zshrc
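As a sanity check, the two variables can also be set and inspected directly in the current shell before re-sourcing your startup file. This is a minimal sketch run from the RBC3D root, assuming the packages/petsc-3.21.3 directory created by the install step above:

```shell
# Set the PETSc variables for the current shell session (sketch; assumes the
# packages/petsc-3.21.3 directory created by `./rbc.sh install-mac` exists).
export PETSC_DIR="$PWD/packages/petsc-3.21.3"
export PETSC_ARCH=arch-darwin-c-opt

# CMake will look for the PETSc build under $PETSC_DIR/$PETSC_ARCH
echo "$PETSC_DIR/$PETSC_ARCH"
```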

Then, to build and run a case:

mkdir build
cd build
cmake ..
make -j 8 minicase # or just `make` to make common and all the cases
cd minicase
mpiexec -n 1 ./minit
mpiexec -n 2 ./mtube # the number of MPI ranks can be changed

This generates output files in build/minicase/D. To keep output files in examples/minicase/D and use the input files in examples/minicase/Input, you can instead run the following once the executables are built in the build directory (replace `case` with the case name, e.g. `minicase`):

cd examples/case
mpiexec -n 1 ../../build/case/minit
mpiexec -n 2 ../../build/case/mtube
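The two launch commands can be wrapped in a small helper so that every case is run the same way. This is a sketch under the directory layout above; the helper name `run_case` and the example arguments (`minicase`, `2`) are assumptions, not part of the repository:

```shell
# Hypothetical helper: run a built case from examples/<name> so that output
# stays in examples/<name>/D and input is read from examples/<name>/Input.
run_case() {
  name=$1    # case name, e.g. minicase
  ranks=$2   # number of MPI ranks for the main solver
  cd "examples/$name" || return 1
  mpiexec -n 1 "../../build/$name/minit"        # generate the initial condition
  mpiexec -n "$ranks" "../../build/$name/mtube" # run the solver
}

# usage, after `make minicase` in the build directory:
#   run_case minicase 2
```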

To run a case with more cells and MPI ranks, you should use a supercomputing cluster. Instructions for building RBC3D on a cluster are available here.

Papers that use RBC3D

This is an attempt to document the papers that make use of RBC3D.

License

MIT.