
This project was developed during the 2021 CSE REU at Washington University in St. Louis with Professor Tao Ju, in collaboration with the Danforth Plant Science Center. We worked to create interactive annotation software with a GUI for biomedical imaging and plant root segmentation.

Above is a demo video of the prototype, and below is a screenshot of it. The image data consists of 2D slices of a 3D CT scan of a corn plant. The 3D window shows the viewing volume (representing our stack of images), 3D annotations built from the user's 2D annotations, and the viewing plane corresponding to our 2D window. In the 2D window, colored dots mark where the viewing plane intersects the 3D annotations. Interactive features help the user annotate and navigate.

[Images: the prototype's 2D and 3D windows]
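The colored dots described above amount to a plane–segment intersection test: each skeleton branch is a 3D polyline, and wherever the current viewing plane crosses one of its segments, a dot is drawn in the 2D window. A minimal sketch of that computation is below; the function names and the use of NumPy are illustrative assumptions, not the project's actual code.

```python
import numpy as np

def plane_segment_intersection(p0, p1, plane_point, plane_normal, eps=1e-9):
    """Return the point where segment p0->p1 crosses the plane, or None."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    n = np.asarray(plane_normal, float)
    d = p1 - p0
    denom = n.dot(d)
    if abs(denom) < eps:              # segment is parallel to the plane
        return None
    t = n.dot(np.asarray(plane_point, float) - p0) / denom
    if 0.0 <= t <= 1.0:               # intersection lies within the segment
        return p0 + t * d
    return None

def branch_dots(polyline, plane_point, plane_normal):
    """Collect every point where the viewing plane crosses a skeleton branch.

    Each returned point becomes one colored dot in the 2D window.
    (A shared endpoint lying exactly on the plane could be reported twice;
    a real implementation would deduplicate.)
    """
    return [pt for a, b in zip(polyline, polyline[1:])
            if (pt := plane_segment_intersection(a, b, plane_point, plane_normal)) is not None]
```

For example, a vertical branch from (0, 0, -1) to (0, 0, 1) crossed by the plane z = 0 yields a single dot at the origin.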

 

The 3D annotations consist of the skeleton, polygon cross-sections, cube-shaped junction points where child branches meet their parent branches, and numeric indices for annotations and branches. The image below shows skeleton mode turned on.

[Image: the prototype with skeleton mode turned on]
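One way to picture how these pieces fit together is as a small hierarchy: each branch carries its index, its skeleton polyline, one cross-section polygon per skeleton point, an optional junction where it meets its parent, and its child branches. The sketch below is only an illustration of that structure; every name in it is an assumption, not the project's actual code.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class Branch:
    index: int                              # numeric index shown in the 3D window
    skeleton: List[Point3D]                 # ordered centerline points along the root
    cross_sections: List[List[Point3D]]     # one polygon per skeleton point
    junction: Optional[Point3D] = None      # point where this branch meets its parent
    children: List["Branch"] = field(default_factory=list)

@dataclass
class Annotation:
    index: int                              # numeric index of the whole annotation
    branches: List[Branch] = field(default_factory=list)

# Example: a parent root with one child branch attached at a junction point.
parent = Branch(index=0, skeleton=[(0, 0, 0), (0, 0, 1)], cross_sections=[[], []])
child = Branch(index=1, skeleton=[(0, 0, 1), (1, 0, 1)], cross_sections=[[], []],
               junction=(0, 0, 1))
parent.children.append(child)
root_annotation = Annotation(index=0, branches=[parent])
```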

Background:

 

Why is this project important? Roots are essential to plants' growth and health, and those plants play a crucial role in feeding our growing population. By growing corn in artificial soil in a pot, we can collect data as a 3D CT scan, viewed as a stack of 2D slices. Biologists then need to annotate the slices to mark where the roots are.

[Image: one CT-scan slice (left) and the same slice with roots highlighted in red (right)]

The process of annotation and navigation is tedious and cumbersome, and it currently takes biologists eight weeks to complete. Ideally, a machine learning program would find the roots automatically; however, training such a program requires ground-truth data, so manual annotation is still necessary. Our goal, then, is a tool that makes it easier and faster for biologists to find and annotate the roots. We believe that traveling orthogonally to the root allows for more efficient navigation and annotation.

The image on the left is one slice of the 3D CT scan; on the right is the same slice with the roots highlighted in red. The large grey circles are the artificial soil the roots grow in. The corn roots themselves are tiny and difficult to find.

Results:

[Images: my annotations made with the prototype (left) and the ground-truth annotations (right)]

The left image shows the annotations I made with our prototype on the plant root data. On the right, I turned on ground-truth mode, which renders the annotations provided by the Danforth Plant Science Center for this data set. The results are very similar.

[Image: close-up of a branch present in my annotations but not in the ground truth]

Looking closely at a branch that I annotated but a biologist did not, it appears to be a root the biologist missed. Using the tool's navigation features, we can examine the data from different perspectives and see that it looks like a root rather than noise.

In total, it took me 5 hours to annotate this data set. Because this is a prototype, the data is about one tenth of the full data set; extrapolating, it would take someone roughly 50 hours to annotate the whole corn root system, far less than the eight weeks it currently takes biologists.
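The extrapolation above is simple proportional arithmetic; the comparison below additionally assumes a 40-hour work week for the eight-week figure, which the text does not state.

```python
# Sanity-checking the extrapolation in the text.
prototype_hours = 5          # time to annotate the prototype subset
subset_fraction = 1 / 10     # the subset is about 1/10th of the full data set
full_hours = prototype_hours / subset_fraction

# Assumed: a 40-hour annotation work week for the current eight-week process.
current_hours = 8 * 40

print(full_hours)            # 50.0
print(current_hours)         # 320
```

Under that assumption, the prototype's projected 50 hours is less than a sixth of the current effort.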
