Hansol Bae, Magnus Paludan, Jan Knoblauch, Kaare H. Jensen, Neural networks and robotic microneedles enable autonomous extraction of plant metabolites, Plant Physiology, Volume 186, Issue 3, July 2021, Pages 1435–1441, https://doi.org/10.1093/plphys/kiab178
Abstract
Plant metabolites comprise a wide range of extremely important chemicals. In many cases, like savory spices, they combine distinctive functional properties—deterrence against herbivory—with an unmistakable flavor. Others have remarkable therapeutic qualities, for instance, the malaria drug artemisinin, or mechanical properties, for example, natural rubber. We present a breakthrough in plant metabolite extraction technology. Using a neural network, we teach a computer how to recognize metabolite-rich cells of the herbal plant rosemary (Rosmarinus officinalis) and automatically extract the chemicals using a microrobot while leaving the rest of the plant undisturbed. Our approach obviates the need for chemical and mechanical separation and enables the extraction of plant metabolites that currently lack proper methods for efficient biomass use. Computer code required to train the neural network, identify regions of interest, and control the micromanipulator is available as part of the Supplementary Material.
Introduction
Plant specialized metabolites comprise a wide range of chemicals, including terpenes, phenolic compounds, and alkaloids (Osbourn and Lanzotti, 2009). Many of these natural products have significant commercial value, for example, flavors, fragrances, pharmaceuticals, and biofuels (Paddon and Keasling, 2014; Mewalal et al., 2017; Tetali, 2019). Most plant metabolites are located inside individual cells, and extraction thus significantly disrupts the tissue (Wilken and Nikolov, 2012). The selection of a proper release methodology is therefore critical as it affects both product purity and yield. Many well-established protocols are available for extraction and processing of plant-derived products in highly pure form: some involve grinding, centrifugation, and filtration to remove cell debris (Buyel and Fischer, 2014). This, however, results in significant contamination, contributing to the high downstream processing costs (Menkhaus et al., 2004). Non-mechanical methods such as secretion (Borisjuk et al., 1999; Komarnytsky et al., 2000), enzymatic lysis (Blanco-Pascual et al., 2014), sonication (Hassan et al., 2008), and freeze–thaw cycles (Wang and Zhang, 2012) have been considered as alternatives but are not yet feasible for industry-scale deployment.
While the difficulties in extracting molecules are evident, the unique architecture of plants may hold clues to a new and different approach. Plants have evolved to store specialized metabolites in small distinctive structures, such as glandular trichomes (GT) and laticifers (Aziz et al., 2005; Ramos et al., 2019). These structures contain enriched metabolites used in, for example, defense responses (Farmer, 2014), which appear advantageous as extraction targets compared with whole-plant harvesting. First, selective disruption of glands and laticifers does limited harm to the organism; the extraction process can thus continue almost indefinitely as new tissue emerges. Second, utilizing a targeted release strategy obviates the need for whole-plant harvesting, thus significantly enhancing purity while reducing processing costs. However, both glands and laticifers are relatively small (approximately 0.1 mm), and tissue identification and extraction currently require highly skilled labor (Gao et al., 2019). The strong potential of selective extraction of plant metabolites provides the impetus for the development of an automated system for targeted metabolite harvesting.
Automating the identification and extraction of compounds from plant cells requires (1) accurate control of probes to physically access the target cells and (2) a robust system for visual identification of target cells. Manually controlled microneedles are routinely used to manipulate plant cells (Paul et al., 2019, 2020); however, distinguishing cell types is a complex task. Remarkable progress in computer vision aided by machine learning (ML) promises faster and more reliable feature detection. In particular, convolutional neural networks (CNNs) are remarkably efficient at this task (LeCun et al., 1998). The technology has been widely adopted in cancer diagnostics (Bychkov et al., 2018; Kather et al., 2019) and is also used for phenotypic analysis and disease detection in crop plants (Singh et al., 2018; Atanbori et al., 2019). However, the utilization of ML for tagging and subsequent physical manipulation of single plant cells is an underexplored field.
We present an advance in the automated manipulation of plants to address the challenges of metabolite harvesting. In this report, we first demonstrate that CNNs are able to accurately detect GT in micrographs of plant leaves. We then engineer microneedles that are able to penetrate target structures and extract cellular fluids. Finally, we mount the microneedles on a robotic arm and demonstrate the scalable extraction of highly concentrated plant metabolites.
Results
We present an integrated computer-vision and robotic microneedle system for the autonomous extraction of plant metabolites. The concept combines (1) an image analysis algorithm that uses a neural network to identify target cells from micrographs of plant surfaces (Figure 1, a–c) and (2) automatic extraction of the metabolite contents using a microcapillary needle mounted on a computer-controlled micromanipulator (Figure 1, d and e). We demonstrate the technology on GT of the herbal plant rosemary (Rosmarinus officinalis); however, since the structure of the target tissue is highly conserved, we expect the method to readily translate to other species. The target cells (GT) contain volatile oils and comprise 4–18 fluid-filled cells (Wilken and Nikolov, 2012), with a total diameter of approximately 120 µm and a volume of approximately 1 nL. The precise chemical composition and potential medical uses of rosemary extracts have been widely studied (Begum et al., 2013). Component analysis has shown the presence of terpenes and terpenoids, of which the dominant constituents are camphor, 1,8-cineole, alpha-pinene, camphene, alpha-terpineol, and borneol (Barreto et al., 2014).
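As a quick geometric consistency check (our own, not from the paper), approximating the ~120 µm gland as a sphere reproduces the quoted ~1 nL volume:

```python
import math

# Approximate the glandular trichome as a sphere of diameter 120 µm and
# check that its volume is consistent with the stated ~1 nL.
diameter_m = 120e-6
radius_m = diameter_m / 2
volume_m3 = (4.0 / 3.0) * math.pi * radius_m ** 3  # sphere volume
volume_nl = volume_m3 * 1e12                       # 1 m^3 = 1e12 nL
# volume_nl is approximately 0.9 nL, i.e. ~1 nL as stated
```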

Methodology for autonomous extraction of plant metabolites. a, Metabolites are frequently localized to GT at the surface of plant leaves, such as rosemary shown in (b). We use a neural network to identify the target cells and a computer-controlled microcapillary needle to extract the metabolites. c, A top-view micrograph of the leaf surface reveals the relative abundance of glands (dashed circles). Arrows indicate out-of-focus glands. d, To detect the target cells, smaller sections of the micrographs are fed into a neural network, which upon training can distinguish between the target and background cells. e, This is used to locate the center (x,y) coordinates of the target tissue (red crosses) which are fed to the microcontroller. The needle subsequently pierces the target cell and extracts the metabolites. The process occurs without human intervention. See also Supplemental Videos S1, S2.
To detect the target cells, we applied transfer learning to the image-classification CNN GoogLeNet (Szegedy et al., 2015). The detailed training process is described in the “Computer vision” section. Briefly, 2000 micrographs of leaf surface segments were acquired using a microscope. The images were manually labeled (approximately 50% contained a GT) and used for transfer learning. This allowed the (x,y)-coordinates of the target cells to be detected with approximately 90% accuracy (Figure 2). The remaining 10% comprises both false positives and false negatives. The mislabeled regions, however, had minimal impact on the collection process because the needle tip (see below) moves near the focal plane of the microscope. In the case of a false positive, the needle would therefore not interact with the lower epidermal/pavement cells.

Training the neural network to identify target cells. To train the neural network, the micrographs were first subdivided into smaller domains (Figure 1, d and e). These were labeled as containing target or background cells and were used for transfer learning on the GoogLeNet neural network (Szegedy et al., 2015). Both the training (orange line) and validation (dashed blue line) accuracy reached approximately 90% after four epochs. Training took 169 s on a single RTX2080Ti GPU. See additional details in the “Computer vision” section.
An appropriately designed microneedle is required to efficiently extract the metabolites. Needle design has been widely researched, see, for example, Brown et al. (2008) and Pereira et al. (2016). In general, minimizing tissue damage favors a relatively narrow needle tip. However, the target cellular liquid is highly viscous, since it contains a concentrated mixture of sugars, amino acids, choline, and organic acids (Choi et al., 2011). Overcoming viscous flow resistance thus mandates a relatively wide tip. To determine an appropriate compromise between these two seemingly antagonistic design criteria, we tested both wide and narrow needles with either blunt or beveled tips (Figure 3; the “Micropipette fabrication and metabolite extraction” section). A microcapillary needle was produced using a Flaming/Brown micropipette puller and the tip was subsequently polished. It was mounted post-fabrication on a micromanipulator which allowed for either manual or computer control of the tip position. The base of the needle was connected to a syringe pump which provided continuous fluid suction (flow rate: 5 µL/h), and the approach angle was set to at least 45° to avoid collisions between the needle and obstacles on the leaf surface.
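The penalty for a narrow tip follows from the steep radius dependence of viscous flow resistance. A back-of-the-envelope Python sketch using the Hagen–Poiseuille law illustrates the point; the fluid viscosity and tip length below are assumed, illustrative values, not measurements from this work:

```python
import math

def poiseuille_resistance(mu, length, radius):
    """Hydraulic resistance R = 8*mu*L/(pi*r^4) of a cylindrical conduit."""
    return 8.0 * mu * length / (math.pi * radius ** 4)

# Illustrative values (assumptions, not from the paper):
mu = 0.1           # Pa*s, a viscous cellular fluid (~100x water)
length = 1e-3      # m, effective length of the narrow tip section
r_narrow = 0.5e-6  # radius for the ~1 µm entrance width
r_wide = 22.5e-6   # radius for the ~45 µm entrance width

# The r^-4 scaling means the narrow tip resists flow (45)^4 times more,
# i.e. by a factor of about four million.
ratio = (poiseuille_resistance(mu, length, r_narrow)
         / poiseuille_resistance(mu, length, r_wide))
```

This scaling is consistent with the observation below that the blunt-narrow tips harvested negligible volumes.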

Optimal design of the microcapillary needles. The metabolite extraction rate is limited by the needle shape: accuracy favors a narrow tip while maximizing flow requires a wide conduit. a, Varying the needle design, we compare blunt and beveled tips. b, Extraction efficiency (volume extracted per target cell) clearly favored the beveled-wide design. The entrance widths were: blunt-narrow, 1 μm (N = 54); blunt-wide, 25 μm (N = 88); beveled-narrow, 25 μm (N = 122); and beveled-wide, 45 μm (N = 190). The histogram indicates the mean volume extracted from N samples. Error bars indicate standard deviation.
To determine the optimum tip design, we first quantified the extracted metabolite volume as a function of needle shape while manually controlling the tip position. Our experiments showed that negligible volumes were harvested using both the blunt-narrow and blunt-wide tips, due to either viscous flow resistance or cell debris clogging the tip (Figure 3 and Supplemental Video S2). In contrast, needles with beveled tips showed superior performance and allowed us to extract almost the entire target volume, especially when the entrance was at least 45 µm wide (Figure 3). In summary, superior results were obtained using beveled tips, which combine width and sharpness to provide an appropriate compromise between the two limiting effects (Figure 3).
Having identified the optimal beveled-wide tip design, we transitioned to using the micromanipulator in the computer-controlled mode (Figure 4). The optimized needle was mounted on the micromanipulator which took the (x,y)-coordinates of the target cells as input and subsequently pierced the target cells (see the “Micropipette calibration and computer-controlled movement” section). A MATLAB script provided a convenient control interface between the microscope and micromanipulator (see the “Data availability statement” and “Code availability” section).

Workflow and efficiency of the automatic extraction process. Our system brings together image analysis and automated micromanipulation. a, A micrograph is acquired and the target tissue locations are identified (red crosses) by passing the image through the neural network. The computer-controlled micromanipulator moves to each of the target regions while suction is applied to extract the cellular fluids. b, The yield per target is approximately 60% of that achieved by an expert human operator. See also Supplemental Video S2.
During the automated experiments, the precise position of the capillary tip was first determined by a calibration (see the “Micropipette calibration and computer-controlled movement” section). A micrograph of the leaf surface was then acquired and subsequently passed through the neural network, which determined the center (x,y)-coordinates of the targets. Then, the micromanipulator moved the needle tip to the first target and began extracting the cellular fluid. During the 10-s extraction step, the tip moved randomly in a 20 µm × 20 µm window around the center coordinates to enhance the yield. These steps were repeated for each of the remaining targets. Finally, the leaf surface was moved using a second micromanipulator and the whole process restarted. The yield in the automatic process was 0.48 ± 0.08 nL/gland, corresponding to approximately 60% of the manual extraction yield (Figure 4, b).
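The extraction loop just described can be sketched in a few lines of Python. The actual control code is the authors' MATLAB script; here `move_to` and `apply_suction` are hypothetical stand-ins for the micromanipulator and syringe-pump interfaces, and the dither sampling is our own illustration:

```python
import random

EXTRACTION_TIME_S = 10  # per-gland suction time used in the experiments
DITHER_UM = 20          # tip wanders within a 20 µm x 20 µm window

def dither_positions(cx, cy, n_steps, rng):
    """Random tip positions inside the dither window around a target centre."""
    half = DITHER_UM / 2
    return [(cx + rng.uniform(-half, half), cy + rng.uniform(-half, half))
            for _ in range(n_steps)]

def extract_all(targets, move_to, apply_suction, rng=None):
    """Visit every (x, y) target returned by the network and extract it.

    `move_to` and `apply_suction` are hypothetical hardware stubs; in the
    real setup suction runs continuously while the tip dithers.
    """
    rng = rng or random.Random(0)
    for cx, cy in targets:
        for pos in dither_positions(cx, cy, n_steps=EXTRACTION_TIME_S, rng=rng):
            move_to(pos)                  # reposition the needle tip
        apply_suction(EXTRACTION_TIME_S)  # suction applied over the dither
```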
The automated system substantially accelerated the liquid collection process. Using our setup, manually moving from gland-to-gland took approximately 30 s for a skilled human operator. (We note that performing the extraction work over an extended period of time is a repetitive and straining task.) Adding to this the liquid collection time (10 s) leads to a total of 40 s per gland. The extraction time can likely be reduced, but we found that 10 s was sufficient in the majority of cases. By contrast, the computer-controlled micromanipulator performed the same task in just 10 s per gland, typically using 1 s to transition from one gland to the next. In conclusion, the four-fold increase in collection speed implies that the efficiency of the automated process is equivalent to or better than the manual collection process.
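The throughput comparison above reduces to a short calculation, combining the timings with the ~60% relative yield per gland:

```python
# Worked comparison of manual vs automated throughput, using the timings
# and relative yield quoted in the text.
manual_s_per_gland = 30 + 10  # gland-to-gland repositioning + 10 s extraction
auto_s_per_gland = 10         # automated handling per gland

speedup = manual_s_per_gland / auto_s_per_gland  # four-fold faster
relative_yield = 0.6                             # automated vs manual, per gland

# Net volume collected per unit time, relative to manual operation:
volume_rate_vs_manual = speedup * relative_yield  # ~2.4x
```

The automated process thus collects roughly 2.4 times more volume per unit time than manual operation, despite the lower per-gland yield.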
The computer code necessary to train the neural network, identify regions of interest, and control the micromanipulator is available for download (see the “Data availability statement” and the “Code availability” section).
Discussion and conclusion
Automating the extraction of metabolites without disrupting the whole plant is a promising technology for sustainable phytochemical production, and a relatively complete picture of the limiting processes has emerged. We have successfully demonstrated that a combination of neural network-based image analysis and microrobotics can meet the challenge with near-human accuracy. Of particular importance is the quality of the training image data and the shape of the microcapillary needle tip.
Future applications of this method will likely target cell types with characteristics that differ from GT. Automating the manipulation of smaller cells, in particular, will be challenging. First, it will require an approximately 10-fold reduction in the needle tip dimensions and a concomitant increase in the precision of the needle position control. Second, our method relies on the fact that the leaf surface is relatively flat; that is, features taller than the glands are rare. In the simplest case, this obviates the need for detailed programming of the vertical z-axis motion. It is worth mentioning that the approach angle of the capillary needle used here was approximately 45°. This implies that the capillary may miss the glands when an obstacle is located along its path. Therefore, if the capillary could move vertically, along the camera axis, the efficiency of access and extraction could be improved.
Although the aforementioned limitations exist, they are not insurmountable. Smaller needles and positioning hardware are commercially available, and a 3D model of the leaf surface can be generated using, for instance, a confocal laser scanning microscope, or simply by sweeping the focal plane of a conventional optical microscope, thus identifying the vertical position of each gland. We note that accessing the three-dimensional space is likely to further enhance yield: in our setup, around 45% of the glands identified by the neural network were located below the needle plane and were thus left untouched. This information could, potentially, be used to further refine the performance of the detection system using reinforcement learning (Sallab et al., 2017).
Future work will also address challenges related to accessing internal tissue such as the phloem, and to use the technology to unravel the function of genes and proteins (Jinturkar et al., 2011) by providing single-cell stimulus at a specific time and place. Currently, delivering materials to individual cells is highly laborious and sample sizes are often small (Gao et al., 2019). We expect our methodology to provide a scalable platform for these applications.
Materials and methods
In this section we present (1) the materials and (2) the methods used in autonomous plant metabolite extraction. The materials include the imaging setup, the plant material, and the micropipette fabrication and metabolite extraction. The methods include computer vision and computer-controlled movement of the micropipette.
Imaging setup
The microscope camera (DMC2900, Leica, Germany) mounted on the microscope (DM2500, Leica, Germany) was connected to a computer via USB. The micromanipulator control unit (ROE-200, Sutter Instruments, USA) and micromanipulator (MPC-200, Sutter Instruments, USA) were also connected to the computer via USB.
Plant material
The herbal plant rosemary (R. officinalis) grown at 24°C with 16-h/8-h light–dark cycle was used for the metabolite extraction experiments. The abaxial (lower) leaf side with abundant GT was mounted face-up on a microscope slide. The leaf was held in place by adhesive tape.
Micropipette fabrication and metabolite extraction
The micropipettes were produced using a micropipette puller (Model P-1000, Sutter Instruments, USA). A thin aluminosilicate glass capillary with outer and inner diameters of 1.00 mm and 0.64 mm, respectively (AF100-64-10, Sutter Instruments), was placed on the puller and run with the following program parameters: heat, ramp test value minus 10; pull, 0; velocity, 25; time, 250; and pressure, 500. To reduce mechanical stress and avoid clogging during the extraction, the tip of the micropipette was broken by gently dragging it across a Kimwipe (Kimberly-Clark Professional, USA). The micropipette was used as is, or polished (see the “Polishing of the micropipette” section), and was placed in a holder on the micromanipulator connected to a gastight syringe (#81220, Hamilton, USA) by an oil-filled polymer tube. The syringe was mounted on a syringe pump (78-9200C, KD Scientific, USA) and suction (typically 5 μL/h) was applied.
Polishing of the micropipette
The micropipette tip was polished using a microelectrode beveler (BV-10, Sutter Instruments, USA).
Computer vision
To detect the (x,y)-coordinates of target cells, micrographs of the leaf surface were recorded (see the “Imaging setup” section) and targets were identified using a transfer-learned GoogLeNet CNN (Szegedy et al., 2015). The training data for the network were derived from 34 full-resolution 4096 × 2160 × 3 microscope images at 100× magnification, collected by dividing the full-resolution images into 224 × 224 × 3 sub-images (Figure 1, c–e). The full-resolution images contain targets in various numbers and topological complexities. To avoid detecting targets that were not in the focal plane, the sub-images were manually assigned binary labels (target or background). In total, 2000 training images were generated, approximately 50% of which contained a target.
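The tiling step can be sketched as follows. How the authors' code handles the partial tiles at the image edges is not stated, so discarding them here is an assumption:

```python
def tile_origins(height, width, tile=224):
    """Top-left (row, col) corners of non-overlapping tile x tile sub-images.

    Partial tiles at the right/bottom edges are simply discarded
    (an assumption about the original implementation).
    """
    return [(r, c)
            for r in range(0, height - tile + 1, tile)
            for c in range(0, width - tile + 1, tile)]

# A 4096 x 2160 micrograph yields 18 x 9 = 162 full 224 x 224 sub-images.
origins = tile_origins(2160, 4096)
```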
Transfer learning was applied to a pre-trained GoogLeNet CNN for four epochs, with a mini-batch size of 10, shuffling every epoch, and an initial learning rate of 10⁻⁴. The final fully connected and classification layers were replaced with two new layers to adapt the network to the new training data, using the protocol described in Szegedy et al. (2015). During training, we minimized the cross-entropy using a stochastic gradient descent with momentum (sgdm) optimizer (Szegedy et al., 2015). The validation data were chosen randomly as 30% of the training data. The transfer learning took 169 s on a single Nvidia RTX2080Ti GPU (NVIDIA Corporation, USA) and resulted in a validation accuracy of approximately 90% for target detection (Figure 2). In experiments, full-resolution images were divided into sub-images, passed through the transfer-learned network, and a full-resolution binary map was reconstructed from the binary-labeled sub-images. The (x,y)-target coordinates were then found as the center of mass (Figure 1, e) of regions where the sub-images were labeled as containing a target.
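The center-of-mass step can be sketched in Python on the grid of per-tile labels. Grouping adjacent positive tiles by 4-connectivity is our assumption about the original implementation:

```python
def target_centres(labels, tile=224):
    """Centre (x, y) pixel coordinates of groups of adjacent positive tiles.

    `labels` is a 2D list of 0/1 tile classifications from the network.
    A sketch of the centre-of-mass step; 4-connectivity grouping is an
    assumption about the authors' code.
    """
    rows, cols = len(labels), len(labels[0])
    seen = [[False] * cols for _ in range(rows)]
    centres = []
    for r0 in range(rows):
        for c0 in range(cols):
            if labels[r0][c0] and not seen[r0][c0]:
                stack, comp = [(r0, c0)], []   # flood-fill one component
                seen[r0][c0] = True
                while stack:
                    r, c = stack.pop()
                    comp.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < rows and 0 <= cc < cols
                                and labels[rr][cc] and not seen[rr][cc]):
                            seen[rr][cc] = True
                            stack.append((rr, cc))
                # centre of mass of the component, in full-image pixels
                x = sum(c for _, c in comp) / len(comp) * tile + tile / 2
                y = sum(r for r, _ in comp) / len(comp) * tile + tile / 2
                centres.append((x, y))
    return centres
```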
Micropipette calibration and computer-controlled movement
The automated metabolite extraction system relies on a microcapillary needle mounted on a computer-controlled micromanipulator. The needle is observed through an optical microscope, and we denote the tip position (x1,y1) on the micrographs. (These coordinates correspond to the image pixel coordinates of the tip.) The position of the microcontroller (ROE-200, Sutter Instruments, USA) was controlled through serial communication from MATLAB (v. 2019, MathWorks, USA). The tip position in the micromanipulator frame of reference is (x2,y2) measured in meters relative to an arbitrary origin in the laboratory frame. As the micromanipulator stage moves, the position of the microcapillary needle on the micrograph changes (see Figure 1 and Supplemental Videos S1, S2).
Establishing the link between the two sets of coordinates (x1,y1) and (x2,y2) is of critical importance to our methodology (Figure 5). Specifically, a region of interest detected in the (x1,y1) image frame must be translated into a micromanipulator coordinate set (x2,y2) before the needle can perform the movement and extraction steps.

Calibration of the needle position. a, Micrograph of a microcapillary needle (outlined) mounted on a micromanipulator. The needle tip (orange circle) has center coordinates (x2,y2) in the micromanipulator coordinate frame (yellow), and center coordinates (x1,y1) in the microscope coordinate frame (red). b, Set of images used to determine the coordinate transformation parameters (Eq. 1).
The two sets of coordinates are related by the similarity transformation
x1 = k(x2 cos θ − y2 sin θ) + t1, y1 = k(x2 sin θ + y2 cos θ) + t2. (1)
Here, (t1,t2) is the translation vector, θ is the rotation angle, and k is the scaling factor. It is possible to also include the third (i.e., z) dimension, but we shall not do that here. To determine the parameter values, we acquired N micrographs of the needle. (Typically, N = 6 was sufficient.) We extracted the tip coordinates (x1,y1) by visual inspection, and these were paired with the micromanipulator readings (x2,y2). Finally, we performed a least-squares fit to estimate the parameter values from the N pairs of coordinates.
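The least-squares fit admits a closed-form solution for a similarity transform. The sketch below is an illustrative Python version (the paper's implementation is in MATLAB), and the closed-form parameterization is our own construction:

```python
import math

def fit_similarity(manip_pts, image_pts):
    """Least-squares estimate of scaling k, rotation theta, and translation
    (t1, t2) mapping micromanipulator coordinates (x2, y2) onto image
    coordinates (x1, y1). Closed-form fit for a 2D similarity transform;
    the authors' MATLAB code may parameterize the fit differently.
    """
    n = len(manip_pts)
    mx = sum(p[0] for p in manip_pts) / n
    my = sum(p[1] for p in manip_pts) / n
    ix = sum(q[0] for q in image_pts) / n
    iy = sum(q[1] for q in image_pts) / n
    num_a = num_b = den = 0.0
    for (px, py), (qx, qy) in zip(manip_pts, image_pts):
        x, y = px - mx, py - my  # centred manipulator coordinates
        u, v = qx - ix, qy - iy  # centred image coordinates
        num_a += x * u + y * v
        num_b += x * v - y * u
        den += x * x + y * y
    a, b = num_a / den, num_b / den  # a = k cos(theta), b = k sin(theta)
    k = math.hypot(a, b)
    theta = math.atan2(b, a)
    t1 = ix - (a * mx - b * my)
    t2 = iy - (b * mx + a * my)
    return k, theta, (t1, t2)
```

Inverting the fitted transform then converts any detected image coordinate into the micromanipulator frame.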
With the parameters of the coordinate transformation known, it was then possible to move the micropipette tip to any image position automatically, by applying the reverse coordinate transformation to the image coordinates and moving the microcontroller to the resulting coordinates. Thus, combining the methods in “Computer vision” and “Micropipette calibration and computer-controlled movement” allowed autonomous penetration of identified targets (Figure 1, d).
Code availability
The computer code necessary to train the neural network, identify regions of interest, and control the micromanipulator is available at https://github.com/Jensen-Lab/AUTOPLANT.
Data availability statement
Data that support the plots within this paper and other findings of this study are available from the corresponding author upon reasonable request. See also the “Code availability” section.
Supplemental data
Supplemental Video S1. Side view of the needle penetration process.
Supplemental Video S2. Top view of the autonomous metabolite extraction process.
Funding
This work was supported by a research grant (17587) from VILLUM FONDEN.
Conflict of interest statement. None declared.
K.H.J. and H.B. designed the research and wrote the manuscript. H.B., M.P., and J.K. performed the research. M.P. contributed new analytic and computational tools. H.B. analyzed the data.
The author responsible for distribution of materials integral to the findings presented in this article in accordance with the policy described in the Instructions for Authors (https://academic.oup.com/plphys/pages/general-instructions) is: Kaare H. Jensen ([email protected]).
References
Author notes
Senior author.