
Inside IALR
Inside IALR explores the ways that the Institute for Advanced Learning and Research (IALR) catalyzes economic transformation. Listen for a behind-the-scenes view of how our programs, people and partnerships are impacting Southern Virginia and beyond. Host Caleb Ayers and Producer Daniel Dalton interview someone new every episode, introducing listeners to IALR leaders and partners, promoting programs and highlighting opportunities to connect with us.
New episodes are published every other Monday.
Tracking Plant Growth Pixel by Pixel
What if you could track a plant’s health and growth every 15 minutes, all automatically and without ever touching it?
In this episode of Inside IALR, Dr. Scott Lowman, Vice President of Applied Research at the Institute for Advanced Learning and Research, explores the SMART Platform—IALR’s Spatially and Mechanically Accurate Robotic Table system. These high-tech tables combine robotics, precision imaging and automation to capture tens of thousands of data points per experiment, helping researchers analyze plant growth, stress response and even subtle movements in real time.
Learn how the SMART Platforms allow for entire plant life-cycle testing for beneficial microbes and enable real-time monitoring of plant health. You’ll hear about how interns have played a central role in coding and refining the system, how companies can contract research on the tables and how this technology is helping lay the groundwork for more sustainable agriculture.
Whether you’re into agtech, robotics, plant biology or data science, this episode connects it all. Plus, you’ll hear how this one-of-a-kind platform is opening doors for students and researchers alike.
🔍 Topics Covered:
- What makes SMART Platforms unique
- How 80,000+ images become meaningful plant health data
- Intern-driven innovation in Python and computer vision
- Industry collaboration and commercialization opportunities
- The future of AI in agriculture and early stress detection
The Institute for Advanced Learning and Research serves as a regional catalyst for economic transformation in Southern Virginia. Our services, programs and offerings are diverse, impactful and far reaching.
Caleb Ayers: Welcome to another episode of Inside IALR. We've had a few weeks off, or I guess more than a month off at this point, with the summer holidays and vacations and all of those sorts of things, but we are back and ready to keep telling the stories of what's going on here in Southern Virginia and here at the Institute for Advanced Learning and Research. Today we have Dr Scott Lowman, our Vice President of Applied Research. Dr Lowman, thanks for being here.
Scott Lowman: Thank you.
Caleb Ayers: You are by far one of our most persistent podcast guests.
Caleb Ayers: I don't even know what number time this is for you, but you have been an excellent guest every time you have been here. So the main thing I wanted to talk to you about today is our plant imaging platforms, which you have helped run for, I think, 10 years at this point. I know they're called SMART platforms. We love acronyms here. Tell us what a SMART platform is.
Scott Lowman: So our SMART tables, or SMART platforms, the acronym stands for Spatially and Mechanically Accurate Robotic Tables, and what that means is they are extremely precise in their location. The tables are five feet by ten feet, and if we program it to go to a certain place, say we have 20 plants on the table, we can program it to go to each one of those plants.
Caleb Ayers: And when you say "it," you mean a camera?
Scott Lowman:Yeah, I'm sorry. Yes, the head is called a gantry robot. We can program that gantry head of the robot to go precisely to each of the 20 plants. The head has a camera mounted to it. It can be changed, it can be different types of cameras and it captures that image. The really defining factor about the tables is that it's accurate to within about a thousandth of an inch, so it goes almost exactly to the same spot each time and what that gives us the ability to do is to capture that image from the same spot and eventually we can time-lapse those images. It looks like there's a camera above the plant the whole time, so it seems like there's 20 plants and 20 cameras on that table.
Caleb Ayers:But it's just the one that's going back and forth and taking the pictures of each one.
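The capture loop described here, one gantry camera visiting each programmed plant position in turn and photographing it from the same spot, can be sketched in Python. The grid layout, function names and drivers below are illustrative stand-ins, not the table's actual control code:

```python
# Hypothetical sketch of the gantry's capture round: visit each
# programmed plant position in order and take one photo per plant.
positions = [(x, y) for y in range(4) for x in range(5)]  # 20 plants, 4x5 grid

def capture_round(move_to, take_photo):
    """Visit every plant position; move_to and take_photo stand in
    for the real motion-control and camera drivers."""
    shots = []
    for plant_id, (x, y) in enumerate(positions):
        move_to(x, y)  # on the real table, repeatable to ~0.001 inch
        shots.append((plant_id, take_photo()))
    return shots

log = []
shots = capture_round(lambda x, y: log.append((x, y)), lambda: "image")
print(len(shots))  # -> 20
```

Because the gantry returns to the same coordinates each round, the images for any one plant stack into a time lapse as if a dedicated camera had been fixed above it.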
Scott Lowman: Yep, and it can do that every 15 minutes. What that does is give us lots and lots of data for plant science experiments. In the old days, you'd plant the plant, you'd water it, you'd maybe take some measurements with rulers or tape measures, and then at the end of the experiment you would pull the plant up and weigh it. This gives us a lot more data in between. It really gives a lot of insight into how the plant grows and behaves under different conditions.
Caleb Ayers: So where did this idea come from?
Scott Lowman:The idea initially started at Virginia Tech with Dr Jerzy Novak and Al Wicks from the Mechanical Engineering Department.
Scott Lowman: Al Wicks specifically had worked with imaging for the military, and they used those tables to image different things the military was working on. So he and Jerzy Novak, a former head of the Department of Horticulture and an important member of the original team that set up the facility here at the Institute, saw an opportunity to use them in plant science. The former director here at the Institute, Barry Flynn, was involved in that as well, and those tables were brought down about the time I finished my PhD work here at the Institute. So I was fortunate enough to develop them through my postdoc work and to keep developing them since. The initial concept was to put them into greenhouses, and we found out pretty quickly that they weren't really meant for high-humidity environments, so they started to rust, among other problems. About six years ago we completely redesigned them, mainly in aluminum, rubber and stainless steel, so they aren't susceptible to those same humidity challenges.
Caleb Ayers:So you mentioned at the beginning you know it's this camera's going through every 15 minutes taking a picture of every plant. Tell me about kind of what a typical experiment looks like. What kind of data is it collecting, how much data is it collecting, what's happening with that and then what are you doing with that information?
Scott Lowman: Many of our experiments are focused on plant biostimulants, looking at which types of microorganisms can increase plant growth. That type of experiment starts with a lot of little plants. We pick plants that are uniform in size because we want them all to start off pretty similar. So say it's 80 plants, we may start 160 plants and pick the 80 that are almost exactly the same size.
Caleb Ayers: And because the table can handle up to 80, right?
Scott Lowman: Even more in reality, depending on the type of system. Back when we first started, we were using floating tobacco trays, where a one-foot-by-two-foot tray could hold about 200 plants.
Caleb Ayers:Oh, wow.
Scott Lowman: At the time, that meant we could get about 2,400 plants per table. But today we mainly focus on individual plants. Each plant gets an image of itself, and that makes each plant a sort of experimental unit, so we can take averages from five different plants, put them together and do statistics on them. That's harder to do when you have populations.
Caleb Ayers: Right, that makes sense.
Caleb Ayers: So you're collecting that data over the experiment. So then what data are you collecting? What's going on there?
Scott Lowman: Yeah, and I should have finished this last time, I apologize. After we plant the plants on the table, we program the table to go above each plant. The neat thing about the tables is that when we start the table and it goes to the first plant, it creates a folder for that plant. So the image of that first plant creates the first folder, and then all through the experiment that plant has its own folder on your desktop, and its images go into it. Then the tables run, the plants grow, an image is captured of every plant every 15 minutes, and at some point we can introduce an experimental variable. It might be a biostimulant, a microbe that may increase plant growth, or something else. It may be taking water away, or introducing some nutrient that wasn't there before, or taking a nutrient away.
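The folder-per-plant bookkeeping described here is straightforward to sketch in Python. The folder-naming scheme and timestamp format below are illustrative assumptions, not the actual system's conventions:

```python
import tempfile
from pathlib import Path
from datetime import datetime

def save_capture(root, plant_id, image_bytes, when):
    """One folder per plant, one timestamped image file per capture,
    mimicking the table's per-plant filing described above."""
    folder = Path(root) / f"plant_{plant_id:03d}"
    folder.mkdir(parents=True, exist_ok=True)  # create on first capture
    path = folder / when.strftime("%Y%m%d_%H%M%S.jpg")
    path.write_bytes(image_bytes)
    return path

root = tempfile.mkdtemp()  # stand-in for the desktop data directory
p = save_capture(root, 7, b"fake-jpeg-bytes", datetime(2025, 6, 1, 8, 15))
print(p.parent.name, p.name)  # -> plant_007 20250601_081500.jpg
```

Sorting any one plant's folder by filename then yields its images in capture order, which is what makes the per-plant time lapses possible.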
Scott Lowman: What happens is that makes it dynamic. The plants are growing, they're happy. You've got a population of 80 plants, all very happy and growing, and then you introduce something to about half of them, say 40 plants, and we're able to see precisely what happens with those 40 plants versus the plants that don't have that variable. That gives us a lot of power, because we can tell almost immediately what's happening.
Caleb Ayers:Yeah, and as you said, with traditional measurements you're more focusing on what's happening at the end, whereas with this you're getting that almost real-time data every 15 minutes to show when those effects are taking place. I imagine that's a lot of what you're looking at.
Scott Lowman: It is, and a lot of valuable information can be contained in that data set. For example, we've had some biostimulants that really promote a lot of plant growth, and as those plants get larger, they require more water. Typically we don't adjust for that water need; we keep giving them the same amount of water. But imagine the larger plant starting to run out of water. It'll stop growing as fast. So when you go back and look at the plotted growth curves, you can see it right away: the plant gets really big really fast, but then it stops growing, comes down, and by the end of the experiment it's the same size as the rest of the plants. Now, what's important is that if you were just weighing it at the end, you would think nothing happened. But with all this data we can see that something did happen, and we need to go back and focus on that component of it.
Caleb Ayers:So you get data that before was missed, right, and so you mentioned you know that each plant has its own folder on the computer. So you're talking about if you're running 80 plants, that's 80 separate folders. You're talking about a picture of each every 15 minutes. How long is a typical experiment? Several weeks, right? Most experiments are three or four weeks, okay. So if you're talking about an image every 15 minutes over the course of four weeks, that's hundreds and hundreds of pictures for each plant.
Scott Lowman:Yep, yeah, some experiments generate about 80,000 images.
Caleb Ayers:Wow, okay, so 80,000 images. I know you guys don't have time to manually sort through 80,000 images, so tell me kind of what happens with all of those images, with that data that makes that usable information.
Scott Lowman: Yep. So we use Python and its computer vision libraries, and sometimes we also use an NIH platform called ImageJ, to analyze the plants. The software was developed in-house, mainly by interns. It goes into each folder, looks at each image, separates the green plant from the background, counts the number of pixels the plant occupies in the picture, and then puts that data into an Excel spreadsheet, so we can tell exactly, down to the pixel, how fast the plant is growing. So it's not a measurement in inches; it's the number of pixels. The more pixels, the bigger the plant. And we've gone back and done experiments to show and prove that, yes, plants with more pixels are bigger plants, and they weigh more.
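The core step described here, separating the green plant from the background and counting its pixels, can be sketched with a simple color rule. This is a minimal stand-in for the in-house segmentation, not its actual code; real images would need calibrated thresholds:

```python
import numpy as np

def count_plant_pixels(rgb):
    """Count pixels where green dominates red and blue, a crude
    proxy for the green-vs-background segmentation described above."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mask = (g > r) & (g > b)  # "plant" pixels
    return int(mask.sum())

# Tiny synthetic "image": a 4x4 frame with a 2x2 green patch.
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[1:3, 1:3] = [20, 200, 30]  # green-ish pixels
print(count_plant_pixels(img))  # -> 4
```

Run per image per folder, this count becomes one row per timestamp, which is the pixel-area growth curve the spreadsheet holds.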
Caleb Ayers: So that baseline has already been proven, you're saying, that this technology works to measure plants this way.
Scott Lowman: Yeah, for the specific plants we look at. There are some plants that are not as appropriate. For plants like grasses, which grow straight up and are slender, it probably doesn't capture data as well. But most other plants have leaves, and when they grow they're looking for the sun. Their whole purpose in life is to expose as much leaf surface to the sun as possible, so that's a good measure of plant size, right?
Caleb Ayers:You mentioned interns. Tell me kind of about the work that interns have done. I know I mean you bring in interns every summer to kind of tweak and improve and work on these things. So tell me about some of the enhancements, the improvements that interns have worked on on these platforms over the summers.
Scott Lowman: I wish I could take a lot of credit for developing the software side, and even the control side, but almost all of that credit should go to the interns and other students we've had working on those tables. I'm an older person, you know. I grew up in the 70s. I'm not scared of computers; I had a computer back then. But really diving into a language like Python, that's often something younger adults find easier to grasp and use.
Scott Lowman: So over the years, the control system, the sorting of images into folders for each plant, the graphical user interface, what we call a GUI, that we've created to run the tables: that's all straightforward now, it's easy to use. This summer we have two interns working on making even the analysis part easy to use. Before, you really had to program; you had to put your changes into the code itself. Now there's going to be a graphical user interface where you just pick your folders and it does all the analysis itself. So practically all of the software side has been developed by interns or students.
Caleb Ayers: Yeah, that's really cool, because I know that's a meaningful experience for them, that they can walk away saying they contributed to a very cool project. I'm sure every intern who walks in there and learns about it is excited about what they get to work on. So you mentioned that a lot of this goes with our biostimulant and biocontrol agent research, where we're looking at how the microorganisms we have in our plant endophyte research center impact plants, not just at the end but over their whole life cycle. But I know we can also run experiments for companies when they have products they want to test. So tell me about that side of it. If a company is interested in commissioning an experiment on our SMART platforms, what does that look like?
Scott Lowman: Well, first of all, I can tell you that almost all companies that come through the center and are interested in contracting us for plant science research are interested in those tables, because it's such a unique platform. Not only does it provide specific data all through the growing process, it also gives you lots of great images to use in marketing and sales. Imagine being able to show a new customer not only the data or charts proving your product makes the plant larger, but also time-lapse images of the plants getting bigger versus the control plants.
Caleb Ayers:It really is a rich platform for generating data, both on the solid data side, but also data and images, and stuff for marketing, and you mentioned a few variables, obviously, if you're giving a different treatment or putting a different product in the plant, reducing or increasing water, what other kind of variables can you test with with these?
Scott Lowman: Mostly it's nutrients and the different types of biostimulants we use. We use some biostimulants that increase hormone levels in the plants to make them larger. Some of them fix atmospheric nitrogen. Some of them provide other nutrients to the plant, like phosphates. Those are the main ones so far. We haven't really gotten into lighting yet, changing the different wavelengths of light, but we could do that.
Caleb Ayers: You could do that for, like, different sections of the table? You could change the lighting?
Scott Lowman: No, what we would do is change the light table by table, so different tables would have different lights on them, and we'd look at how the plants grow.
Scott Lowman: There are other things too. The range of things we could do is almost infinite, because you can think of the different types of cameras you could use. You can even think of experiments like, people say plants grow better when they have music playing, so we could look at that if we wanted to. Most interestingly, though, we've been looking at plant movement. As plants grow, they kind of move back and forth. We hypothesize that that's a natural way for them to find light and shade; most plants, when they're young, are in shade of some type, so they're always looking for light. And we've been able to connect that movement to plant health. A healthy plant that's growing more moves more. If we introduce something to that plant, like a drought or a pathogen, the movement stops almost immediately. So it's a great indicator of plant health. Even before we can see the plant turning yellow or shriveling up, we can see that something's going on with that plant and we need to fix it.
Caleb Ayers: And if you're tracking plant movement as opposed to plant size, that's a whole different thing that the computer's looking for at that point, right? A whole different software set.
Scott Lowman: It's a whole different code that's been developed to do that, and we've done it multiple different ways. One way is just comparing two subsequent images and seeing how many pixels are different. Another way we've been looking at more recently is tracking the tips of the plant. As the tips move around, we can quantify that, and of course, plants that are moving less, their tips move less.
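The first movement metric mentioned, comparing two subsequent images and counting changed pixels, can be sketched as a frame difference. The threshold and score definition here are illustrative assumptions, not the system's actual parameters:

```python
import numpy as np

def movement_score(prev, curr, thresh=30):
    """Fraction of pixels whose grayscale intensity changed by more
    than `thresh` between two consecutive captures. Higher score
    means more plant movement between frames."""
    diff = np.abs(curr.astype(int) - prev.astype(int))
    return float((diff > thresh).mean())

# Two synthetic 10x10 grayscale frames: a "leaf" moves into 4 pixels.
a = np.zeros((10, 10), dtype=np.uint8)
b = a.copy()
b[2:4, 2:4] = 255
print(movement_score(a, b))  # -> 0.04
```

Computed for every 15-minute pair in a plant's folder, this score forms a movement time series; the near-immediate drop after a drought or pathogen is what makes it an early stress indicator.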
Caleb Ayers: Yeah, that's fascinating. We've already touched on this some, but as far as the SMART platforms themselves and the types of information they give you: you said with traditional research you're generally looking at a few hand measurements throughout the process and then the weight at the end, whereas here you're looking at a whole life cycle's worth of data. What other insights and analysis are you able to do once you have that data set? Obviously the computer spits out how many pixels are in each image, but what are you able to do with that information?
Scott Lowman: We're able to plot it. That's the main thing. We can plot it and determine at what point there are statistical differences. Again, in the old days you would just have the weight at the end, and you would determine whether that weight was different from the control group, whether the plants were larger or smaller or whatever. This way, we can tell when that difference emerges during the plant's growth. So it's not just at the end; you can tell that on day 32 those plants became significantly different. We can tell how fast the response is.
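The per-timepoint comparison described here, scanning the growth curves to find the day the treated group diverges from the control, can be sketched with a simple rule. The divergence criterion below (treated mean exceeding the control mean by two control standard deviations) is an illustrative stand-in for the actual statistical tests used:

```python
import statistics as st

def first_divergence_day(control, treated, k=2.0):
    """Scan day-by-day lists of pixel counts and return the first
    day on which the treated-group mean exceeds the control mean
    by more than k control standard deviations, or None."""
    for day, (c, t) in enumerate(zip(control, treated), start=1):
        mu, sd = st.mean(c), st.stdev(c)
        if st.mean(t) > mu + k * sd:
            return day
    return None

# Toy pixel counts for 3 plants per group over 3 days; the
# treatment takes effect on day 3.
control = [[100, 102, 98], [110, 111, 109], [120, 119, 121]]
treated = [[101, 99, 100], [112, 113, 110], [140, 142, 141]]
print(first_divergence_day(control, treated))  # -> 3
```

A production analysis would use proper hypothesis tests at each timepoint, but the shape of the answer is the same: not just "did the treatment work" but "on which day did it start working."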
Scott Lowman: Again, that's a dynamic type of measurement happening over time. We can start experiments with all plants the same and introduce something to them, or start with all plants the same and take something away, whether it be water or nutrients. So it gives a ton of data that's really nicely illustrated, and everybody we present it to says it's unique. So it's not just us saying it's unique, it's others saying it, Virginia Tech among them. We've got one table being ordered now by a community college partner. We've been in talks with another institute of technology, I don't want to say which one specifically, but they are interested in the table as well. And I think, as we continue to refine the table to make it easier to operate, we're going to have more of these opportunities. What that does is give us a network of collaborators that we can then leverage for grants and other things as well.
Caleb Ayers: And I know you recently went to Georgia Tech to talk about this. When you go to these types of events to talk about this plant imaging platform specifically, what's your pitch as far as what makes it unique? Why would companies, educational institutions or technology institutions want to get in on this?
Scott Lowman: Yep. So the main thing, and one of the unique things about the Institute and our research program, is that we're multifaceted. We have plant scientists, and we're also working with robotics and computer vision. Many of the researchers I interact with are either engineers, computer scientists or plant scientists. The plant scientists don't do a lot of robotics. The engineers don't have the plant science expertise.
Scott Lowman: So when I go to these types of conferences, I see engineers who have these just wonderful ways to image plants and gather data. It's really incredible. And then you mix in artificial intelligence, and it really takes it to another level. However, they don't have well-controlled plant science experiments of the kind a plant scientist would know how to design. We're able to do both. We can run really good, controlled experiments, we can use statistics like randomized complete block design, we can really do a lot on the data side, and then we have all those images to go with it. And what we find is that those computer scientists and engineers are very interested in those data sets. So it's something we can use and leverage for mutually beneficial collaboration.
Caleb Ayers: Yeah, that's really cool. And as you were talking about the difference between the plant science side and the engineering side, and you all being able to bridge that gap and have both, I think that speaks to the Institute for Advanced Learning and Research as a whole. We see ourselves as being able to bridge the gap between a lot of different types of organizations and industries, public and private sector, and help connect things that usually are a little more separate. That's all the questions I have.
Caleb Ayers:Like I said, I think these things are very cool. Every time I go in one of the labs and look at them and watch, you know, watch the camera roll and look at the, look at those time lapses that you all put together. It's just really really cool piece of technology. But I mean, is there anything else that that you would want to add or think it's important that people know? It's always important to to.
Scott Lowman:To finish with the big picture, and the big picture is, as we're developing these tables, we're developing the computer vision and, of course, we can feed those into artificial intelligence models eventually, as our program continues to grow. Just think about, in the future, a robot being able to go through a field, sit still for a few moments and determine if a plant is healthy or not, based on its movement or based on how fast it's growing per minute, something that, while farmers are terrific, the human eye can't really detect. That we have to be able to have a computer be able to condense that, to have that information. So it's really a pathway to early detection of plant stress, and if you can detect plant stress early, you can fix it and then you don't lose yield.
Scott Lowman: We have tremendous challenges, both here in our country and worldwide, with feeding the population that's coming. By 2050, they estimate the world population is going to be about 10 billion people, and to feed them we need to almost double agricultural production per acre. So it's all hands on deck in agriculture, because 2050 is only about 25 years away, and in the world of agriculture 25 years is not that long. This is just another tool to help with sustainability and to increase yield to feed the global population.
Caleb Ayers: Yeah, well, again, it's all about impact. I appreciate you bringing that in. So thanks for being here.
Scott Lowman: Appreciate it. Thank you.