rordenlab/brainchop



Frontend For Neuroimaging. Open Source

brainchop.org | Updates | Doc | News! | Cite | v3


Brainchop brings automatic 3D MRI volumetric segmentation to neuroimaging by running a lightweight deep learning model (e.g., MeshNet) in the web browser, so inference happens entirely on the user's side.

We make the implementation of Brainchop freely available, releasing its pure JavaScript code as open source. The user interface (UI) provides a web-based, end-to-end solution for 3D MRI segmentation. The NiiVue viewer is integrated for MRI visualization. For more information about Brainchop, please refer to this detailed Wiki and this Blog.
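As a rough illustration of what in-browser inference with TensorFlow.js looks like, the sketch below loads a pretrained tfjs layers model and produces a per-voxel label map for a preprocessed volume. The model URL, the fixed 256x256x256 input shape, and the function name are illustrative assumptions, not Brainchop's actual pipeline:

// Minimal sketch of in-browser 3D segmentation inference with TensorFlow.js.
// The model URL and the 256x256x256 input shape are illustrative assumptions.
import * as tf from '@tensorflow/tfjs';

async function segmentVolume(voxels) {
  // voxels: Float32Array of length 256*256*256, intensities scaled to [0, 1]
  const model = await tf.loadLayersModel('https://example.com/model/model.json'); // hypothetical URL
  const labelsTensor = tf.tidy(() => {
    // Shape the flat voxel buffer as [batch, depth, height, width, channels].
    const input = tf.tensor(voxels, [1, 256, 256, 256, 1]);
    // The network outputs per-voxel class scores; argMax picks the label per voxel.
    return model.predict(input).argMax(-1);
  });
  const labels = await labelsTensor.data(); // typed array of per-voxel labels
  labelsTensor.dispose();
  model.dispose();
  return labels;
}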

For questions or to share ideas, please refer to our Discussions board.

Brainchop high-level architecture

MeshNet deep learning architecture used for inference with Brainchop (MeshNet paper)

MeshNet Example

This basic example provides an overview of the training pipeline for the MeshNet model.
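For readers who prefer code to diagrams, here is a hedged sketch of a MeshNet-style network in TensorFlow.js: a short stack of 3x3x3 dilated 3D convolutions that preserves the full volume resolution, followed by a 1x1x1 classification layer. The filter count, dilation schedule, and input shape below are illustrative placeholders; consult the MeshNet paper and the linked example for the actual configuration:

// Illustrative MeshNet-style model: dilated 3D convolutions at full resolution.
// Filter counts, dilation rates, and input shape are assumptions, not the
// published configuration. Dilated conv3d support may vary by tfjs version/backend.
import * as tf from '@tensorflow/tfjs';

function buildMeshNetLike(numClasses, filters = 21) {
  const model = tf.sequential();
  const dilations = [1, 1, 1, 2, 4, 8, 1]; // example dilation schedule
  dilations.forEach((rate, i) => {
    model.add(tf.layers.conv3d({
      filters,
      kernelSize: 3,
      dilationRate: rate,
      padding: 'same', // keep the 3D volume shape unchanged layer to layer
      activation: 'relu',
      ...(i === 0 ? { inputShape: [256, 256, 256, 1] } : {}),
    }));
  });
  // 1x1x1 convolution maps the features to per-voxel class probabilities.
  model.add(tf.layers.conv3d({ filters: numClasses, kernelSize: 1, activation: 'softmax' }));
  return model;
}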


Live Demo

To see Brainchop v4 in action, please click here.


For v3, click here.


Updates

Brainchop v4 with NiiVue viewer


Brainchop v3 with more robust models


Brainchop v1.4.0 - v3.4.0 rendering an MRI NIfTI file in 3D

Brainchop v1.3.0 - v3.4.0 rendering segmentation output in 3D

News!

  • The Brainchop v2.2.0 paper was accepted at the 21st IEEE International Symposium on Biomedical Imaging (ISBI 2024). An extended arXiv version can be found here.


  • The Brainchop paper was published in the Journal of Open Source Software (JOSS) on March 28, 2023.


  • A Brainchop abstract was accepted for poster presentation at the 2023 OHBM Annual Meeting.


  • A Brainchop one-page abstract and poster were accepted at the 20th IEEE International Symposium on Biomedical Imaging (ISBI 2023).


  • Brainchop received the Google TensorFlow Community Spotlight award (Sept 2022), announced on LinkedIn and Twitter.


  • Brainchop was invited to the PyTorch flagship conference, New Orleans, Louisiana (Dec 2022).


  • Brainchop was invited to TensorFlow.js Show & Tell episode #7 (Jul 2022).

Citation

The Brainchop (v2.1.0) paper was published on March 28, 2023, in the Journal of Open Source Software (JOSS), DOI: 10.21105/joss.05098.


For APA style, the paper can be cited as:

Masoud, M., Hu, F., & Plis, S. (2023). Brainchop: In-browser MRI volumetric segmentation and rendering. Journal of Open Source Software, 8(83), 5098. https://doi.org/10.21105/joss.05098


For the BibTeX format used by some publishers, please use:

@article{Masoud2023, 
  doi = {10.21105/joss.05098}, 
  url = {https://doi.org/10.21105/joss.05098}, 
  year = {2023}, 
  publisher = {The Open Journal}, 
  volume = {8}, 
  number = {83}, 
  pages = {5098}, 
  author = {Mohamed Masoud and Farfalla Hu and Sergey Plis}, 
  title = {Brainchop: In-browser MRI volumetric segmentation and rendering}, 
  journal = {Journal of Open Source Software} 
} 

For MLA style:

Masoud, Mohamed, Farfalla Hu, and Sergey Plis. ‘Brainchop: In-Browser MRI Volumetric Segmentation and Rendering’. Journal of Open Source Software, vol. 8, no. 83, The Open Journal, 2023, p. 5098, https://doi.org/10.21105/joss.05098.


For IEEE style:

M. Masoud, F. Hu, and S. Plis, ‘Brainchop: In-browser MRI volumetric segmentation and rendering’, Journal of Open Source Software, vol. 8, no. 83, p. 5098, 2023. doi:10.21105/joss.05098


Funding

This work was funded by NIH grant RF1MH121885, with additional support from NIH grants R01MH123610 and R01EB006841 and NSF grant 2112455.


Mohamed Masoud - Sergey Plis - 2024
