Bear Flag Robotics
An interview with
Co-founders Igino Cafiero and Aubrey Donnellan
Ali Tabibian: Welcome, welcome, welcome everyone, to this episode of Tech. Cars. Machines. Once again, it's going to be Tech. Cars. Robots, and in fact, we could also say it's Tech. Cars. Agriculture, and even Tech. Cars. Y Combinator. In fact, we have a little bit of a double header for you, and the reason for that is Y Combinator.
Y Combinator is probably the best known pure entrepreneur incubator in Silicon Valley. About twice a year, they [00:00:30] do a major event where they introduce some of their graduates to a group of investors. Their event in late March is called the Y Combinator Winter Demo Day, and about 100 companies each presented for two minutes.
A couple of us from GTK were present as investors, and it was so inspirational to be around so much energy, we decided to give a couple of the presenters that fall into [00:01:00] the Tech. Cars. Machines category a platform on our podcast. So we have a doubleheader. One episode about Beanstalk, an indoor farming startup, and in this episode we're going to be talking about Bear Flag Robotics.
What they do is automate tractors. Not the massive, [00:02:00] two-story-high, green, typically John Deere tractors that you see going back and forth across a golden field of wheat, typically heading into a setting sun, or some other suitably dramatic lighting backdrop. In the case of Bear Flag Robotics, we're talking about smaller tractors for orchards, where they're not really used for harvesting, [00:03:00] they're used for spraying and mowing between the orchard rows. Bear Flag is going off to these growers and looking to automate their tractors, and make their tractors basically able to go up and down these rows all on their own.
It's fun to talk about agriculture in general, by the way, because that field is pretty much where human mechanization began, and it's had spectacular results. Here's your fun fact for this episode: nearly 170 years ago, in 1850, 83% of the U.S. population worked in agriculture [00:03:30] and, more or less, barely fed itself. That number is probably 1% these days. 1% of the population works in agriculture, and that group of people, thanks to mechanization and other advances, feeds a multiple of the North American population of 320 million or so.
In this episode, we are going to be talking to the founders, two incredibly sharp and brilliant people, very nice, very intense individuals: Igino Cafiero and Aubrey Donnellan. Bear Flag is a prototypical robotics company [00:04:00] in the sense that much of their staff comes from Carnegie Mellon University, which is a celebrated university when it comes to many advanced technologies, and in particular, robotics.
Let me explain a few terms to you that you'll hear in this episode. First, the Red Team. Red Team is a famous student team at Carnegie Mellon, with some external collaborators as well, which, for many years, has been entering [00:04:30] vehicles into a competition called the DARPA Grand Challenge, the U.S. Defense Department's major event for autonomous vehicle prototypes, and the Red Team at Carnegie Mellon typically does really quite well.
Another term you will hear is RTK GPS. RTK stands for Real Time Kinematic. GPS, you're all familiar with; it runs the navigation in your car, and it's called the Global Positioning System, if you [00:05:00] really want to know what the acronym stands for. RTK GPS is simply a more accurate version of the GPS system in your car or phone. It works by adding some equipment locally. Sometimes you'll see a white, frisbee-sized flying saucer sitting on a tripod on the edge of a field, and that's part of this RTK GPS system [00:05:30].
You'll hear the phrase Sensor Fusion, which simply means that rather than relying just on a camera system, or just on a laser radar system, the LIDAR system that you've heard about on this podcast, Bear Flag combines inputs from [00:06:00] all sorts of different sensors to achieve its result.
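To make Sensor Fusion concrete, here is a toy one-dimensional sketch of the idea: a complementary filter that blends a noisy but drift-free absolute fix (think GPS) with smooth but drifting dead reckoning (think wheel odometry or an IMU). This is purely illustrative; it is not Bear Flag's algorithm, and the function names and constants are invented for the example.

```python
# Toy sensor fusion: a 1-D complementary filter blending a drift-free but
# noisy position fix (GPS-like) with smooth but drifting dead reckoning
# (odometry/IMU-like). Illustrative only, not Bear Flag's implementation.

def fuse(gps_pos, odom_delta, fused_prev, alpha=0.9):
    """Blend the dead-reckoned prediction with the absolute fix.

    alpha close to 1 trusts the smooth odometry short-term; the
    (1 - alpha) share of the GPS fix corrects long-term drift.
    """
    prediction = fused_prev + odom_delta   # propagate previous estimate
    return alpha * prediction + (1 - alpha) * gps_pos

# Simulated drive: the true position advances 1 m per step.
fused = 0.0
for step in range(1, 101):
    true_pos = float(step)
    gps_pos = true_pos + ((-1) ** step) * 0.5   # noisy but unbiased fix
    odom_delta = 1.02                           # smooth but 2% biased
    fused = fuse(gps_pos, odom_delta, fused)

# Raw odometry alone has drifted to about 102 m after 100 steps;
# the fused estimate stays close to the true 100 m.
print(round(fused, 1))
```

The same blend-a-prediction-with-a-correction structure underlies more sophisticated estimators such as Kalman filters, which weight each source by its estimated uncertainty instead of a fixed alpha.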
Now, without further ado, let's head to Bear Flag Robotics and hear from the two great founders.
Voiceover: Tech. Cars. Machines. Subscribe here, or at TechCarsMachines.com, and GTKPartners.com.
Ali Tabibian: Great. We're here today with Bear Flag Robotics. Why don't you two go ahead and introduce yourselves in whatever order you'd like, so people get used to what your voice sounds like, [00:06:30] and then people will hear less and less of me, and more and more of you.
Igino Cafiero: Thank you very much. I'm Igino, a founder of Bear Flag Robotics, and I'm here to talk with you.
Aubrey Donnellan: Hey, I'm Aubrey, also a founder. Thanks for having us here.
Ali Tabibian: It's my pleasure. Don't be shy. Give us your last names, too.
Igino Cafiero: Of course. I'm Igino Cafiero.
Aubrey Donnellan: Aubrey Donnellan.
Ali Tabibian: Great. We first heard about Bear Flag through one of our good venture investing colleagues, who [00:07:00] mentioned that they had a robotics investment. As some of our listeners know, we have an affiliate that's a major land owner in Central California, lots of orchards. Therefore, Bear Flag came up as potentially another one of the technologies that we've brought to and introduced to our farms, starting with drones, some of you may recall.
We actually saw them and met them again at the Y Combinator Winter Demo Day. That was a couple of weeks ago. Maybe it would be fun for our listeners for you to basically give us the Y Combinator pitch. It used to be six [00:07:30] minutes. I don't know if you know that, but it's down to two minutes. That's a perfect way of starting.
Aubrey Donnellan: I'm Aubrey Donnellan. My co-founder, Igino Cafiero. We're Bear Flag Robotics, and we are building autonomous tractors. Contrary to popular belief, no equipment OEM has released a fully autonomous tractor that's available to the market today. That's why we're here. We're planning to be the first ones to do it.
A little bit of background ... In the U.S., machine operating labor on farms accounts [00:08:00] for a whopping 28% of growers' operating costs. These laborers are sitting on tractors all day long, driving the same exact routes every single season, day in and day out. This type of labor is really tough. Obviously, you're out in inclement conditions. It's pretty monotonous.
With that, by replacing that labor with our autonomous tractors, growers can obviously save a lot of money, but also increase the job satisfaction of the people they have working on their farms. On top of the [00:08:30] obvious labor savings when you introduce autonomy, we also bring other efficiencies that you can gain by using the sensors that we're using.
For instance, we can reduce the inputs that are used on their crops, like herbicides, pesticides, and fungicides, by up to 20%, using our precision application software and sensors. In addition, we're really excited to implement some data analytics to start monitoring crop yields over time, being able to make recommendations for application and treatment [00:09:00] throughout the season, to help them increase their yields.
That's just upside, beyond the labor. Looking at ... And I said this during the Y ... Sorry, I'm not pitching as well anymore ... But what does that look like for our customers? For our first one, they're north of Sacramento, and they own a 2,000 acre walnut orchard. If we implemented between three and four tractors on their farm, they could increase their profit 10% every single year.
That's obviously a really big game changer. Yeah, [00:09:30] we'll skip over the market since we decided that we're not going to talk about market, but basically, the progress of our team has been outstanding over the last six months. As you saw, back in the garage we have two fully autonomous vehicles, a side-by-side and a tractor. We're out in the field every week collecting hours and miles on these vehicles, and we've done it all in the last six months. The progress has been amazing.
In summary, the YC template, we're Bear Flag Robotics, we're building [00:10:00] autonomous tractors that are on real farms in California, we save our customers on labor and increase their yields. Our vision is to automate the world's farms.
Ali Tabibian: Great. Thank you for giving us that Y Combinator pitch. Now the world knows. This is a great idea. Why is today the right time to implement the idea? Is it that nobody else had come up with the idea, or is it that the technology is ready?
Igino Cafiero: When I think about that, I see two main drivers for why now. The first is certainly market- [00:10:30] driven. The cost of labor is going up substantially. When we go out to talk to growers, we hear continually that the economics their farms are built on just don't work with this rising cost of labor. Furthermore, as we discussed earlier, as foreign markets come online and the price of these commodities comes down, these farmers are just getting squeezed in the middle. They're facing an existential threat.
Unless they can either lower their operational expense, or increase their [00:11:00] yields, they have a grim future. [inaudible 00:11:02] with both of those. The second part of "why now" is that technology has just made massive strides. As Aubrey and I were reflecting earlier as well, when we were at Carnegie Mellon 10, 15 years ago, Red Team was just starting to use [inaudible 00:11:16] LIDAR, and those LIDARs cost hundreds of thousands of dollars.
The perception system that we have on the tractor right now is many [inaudible 00:11:23] magnitude less expensive. On top of that, there's the research in all these areas. The kinds of people we're talking to, the people on our team, [00:11:30] have been researching this technology in universities for the last decade, and now it's finally getting ready to hit the market.
Now it's this perfect intersection of both demand and technology [inaudible 00:11:40] that allows us to produce something really cool today.
Ali Tabibian: Great. So that's the market and technology side of it. It sounds like you two met at Carnegie Mellon. Is that right? What brought you together recently?
Igino Cafiero: Yeah, that's-
Aubrey Donnellan: It's a long story.
Igino Cafiero: Yeah. Aubrey and I met at Carnegie Mellon in 2003. I was in electrical and computer engineering, Aubrey in mechanical. Dave, another [00:12:00] member of our team, was also an engineer there, and one of our advisors as well [inaudible 00:12:04] engineer there, too. It's a little bit of a Carnegie Mellon team, but I [inaudible 00:12:08] software. Also, on the side, I love building things. When I was in Southern California, I built race trucks to go race in the desert, and back in Northern California, brought these cars and a machine truck into the garage.
This combination of [inaudible 00:12:24] for software, and also fabrication, has really led me to industrial automation. It's something that I'm really [00:12:30] passionate about, and I'm really pleased that all of a sudden this demand and this need from customers [inaudible 00:12:36] skill sets that we have.
Ali Tabibian: When we talk about a farmer or orchard, describe to them what that looks like. How long are the rows? What spacing do they have? How tall are the trees, typically? What's growing on those trees?
Aubrey Donnellan: Obviously, it varies across the board. We're actually testing in tree nut orchards today, and we hope to be in citrus and fruit orchards later this year. We're also in [00:13:00] row crops, as well. Obviously, it's a wide range. Right now, our first customer, walnuts, they have 1,800 acres to be exact. I believe it takes us ... Well, in miles ... What, 15 miles to do just a small portion of one of their plots?
You're going pretty slow in these tractors when you're spraying and mowing, these operational tasks that they do in the orchards every single month during growing season. It will take them an entire week, with a team [00:13:30] of people, to do one of these operations.
Ali Tabibian: What size of tractor are we talking about here?
Aubrey Donnellan: It really depends. We're early stage. We can technically go on any size right now, with our conversion.
Ali Tabibian: For an orchard though, right?
Aubrey Donnellan: But for an orchard, you would only see typically between 50 and 100 horsepower tractors, or a narrow body tractor that's built for orchards and vineyards.
Ali Tabibian: So pretty much what we saw back there ... Like a big SUV, maybe a little taller and a couple of feet wider in the back. That's pretty much what you're focused on right now.
Igino Cafiero: [00:14:00] It's an operational constraint of how these orchards are grown. Clearly, the trees themselves need distance between each other in order to grow in a healthier way. The shape and the size of the tractors that they drive between these rows is then dictated by the size of those trees. Anything much larger would not be conducive [inaudible 00:14:18], and anything smaller would not be effective. So, it's a natural form factor based on the size of these trees.
Ali Tabibian: Super. So now that we know what the tractor looks like, start doing the surgery. What are the mechanical interventions you make, what pieces do you add, [00:14:30] and what do those pieces do?
Igino Cafiero: Absolutely. The first task we have when we think about how to automate a tractor is what we call perception. We use cameras and LIDARs, laser radars, so that we can get a really detailed description of the world around us. As we drive around, we can then process that image and know both where to go, and where not to go.
We use these cameras and this LIDAR, as well as things like IMUs, which measure the position of the tractor in space, and tell us if [00:15:00] the tractor is rotating, or pivoting, or its heading is changing.
Ali Tabibian: Sort of like the stuff you'd have in an aircraft, pitch, roll, [crosstalk 00:15:06]-
Igino Cafiero: Exactly.
Ali Tabibian: Okay. All right.
Igino Cafiero: Then we also use GPS as well. I'll note that one of the tricky parts about these orchards, one of the environmental challenges, is that the canopy of the trees gets so dense that it actually occludes the GPS signal needed for precision [inaudible 00:15:24] today. When you think about what precision [inaudible 00:15:26] does today, it uses what's called an RTK GPS. This [00:15:30] is how farmers can maintain a straight line across the field, when they have a clear view of the sky.
Ali Tabibian: That's like your wheat field example.
Igino Cafiero: Exactly.
Aubrey Donnellan: Mm-hmm (affirmative).
Ali Tabibian: Right. Big sky in Montana type of stuff, right?
Igino Cafiero: Precisely.
Aubrey Donnellan: Yep.
Igino Cafiero: In orchards, with these dense canopies of branches and leaves over every row, that technology just doesn't work. We've used these sensors in our algorithms to be able to drive in what's known as GPS-denied environments. A lot of our capabilities, and a lot of our research, have been around being able to navigate these GPS- [00:16:00] denied environments.
We then use a computer to process all of this information coming in, and then we use actuators so we're able to control the clutch, the brake, the steering wheel, and the gas, and then implement control as well. On these tractors, there are three primary ways you control the implements, and we're able to manipulate all of those as needed as well.
Ali Tabibian: How close to reality-
Igino Cafiero: We haven't actually tested the full capabilities, frankly. Usually, we test corner cases, areas where it's bound to not [00:16:30] work properly, and concentrate on those. The other day we went out and ran it for five hours, and logged 15 miles, only stopping to switch drivers. We still keep a driver in it for now, in case something does go awry.
It ran, like I said, for five hours, 15 miles, with no interruptions where we would have needed to stop it.
Ali Tabibian: Interesting, so the auto companies are required to report interventions to the state. The analogy here is that you're basically saying that during that timeframe, there were no interventions?
Igino Cafiero: Correct.
Ali Tabibian: Wow.
Igino Cafiero: Yeah.
Ali Tabibian: Wow. That's incredible.
Igino Cafiero: [00:17:00] Yeah.
Ali Tabibian: That's incredible. How much of that is a function of you having trained the tractor on that specific route versus how much the tractor is able to infer its new environment automatically? And, what is the trade off? Maybe for this type of-
Igino Cafiero: Both ways are really interesting. When we first started last fall, we actually developed algorithms so that you could park the tractor at the start of any random row, and it would use its own perception to navigate the entire plot. It would go down rows, it would see where [00:17:30] trees are missing, it would see where trees are overgrown, or maybe twisted into a weird shape. It detects what we call inlier and outlier trees, trees that are maybe not supposed to be there, and then detects the end of the row, navigates the U-turn, identifies the proper way to re-enter the row, and comes back.
Technically speaking, it was a major accomplishment to be able to watch the tractor drive up and down these rows without being trained. More recently, we've moved to a model ... Maybe a more user-friendly model, where the farmer himself will drive the tractor [00:18:00] up and down the rows and teach the tractor what to do. Then we're able to replay that back.
We have both ways. Right now, we're focusing ... As we go to market, we're focusing more on this teach and replay, but we have the capabilities, which is really cool.
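The teach-and-replay approach described above can be sketched in a few lines: record poses while a human drives, down-sample them into waypoints, then steer toward the next waypoint on replay. This is an illustrative skeleton only; the function names are hypothetical, and a real system layers perception, obstacle handling, and speed control on top.

```python
import math

# Toy "teach and replay" navigation: record the human-driven route as
# waypoints, then steer toward the next waypoint on each control tick.
# Illustrative geometry only, not Bear Flag's actual software.

def teach(driven_poses, spacing=1.0):
    """Down-sample the human-driven path into waypoints at least `spacing` apart."""
    waypoints = [driven_poses[0]]
    for pose in driven_poses[1:]:
        if math.dist(pose, waypoints[-1]) >= spacing:
            waypoints.append(pose)
    return waypoints

def replay_step(position, waypoints, target_idx, capture_radius=0.5):
    """Return (heading_command, next_target_idx) for one control tick."""
    # Advance the target when the current waypoint has been reached.
    while (target_idx < len(waypoints) - 1
           and math.dist(position, waypoints[target_idx]) < capture_radius):
        target_idx += 1
    tx, ty = waypoints[target_idx]
    heading = math.atan2(ty - position[1], tx - position[0])
    return heading, target_idx

# Teach: a straight 10 m pass down one orchard row, recorded every 0.25 m.
driven = [(0.25 * i, 0.0) for i in range(41)]
wps = teach(driven)

# Replay from slightly off the taught line: the heading command steers
# the vehicle back toward the recorded row.
heading, idx = replay_step((0.0, 0.3), wps, target_idx=1)
print(len(wps), idx, round(heading, 2))
```

The down-sampling step keeps the replayed path compact; the capture radius controls how aggressively the tractor snaps back onto the taught route.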
Ali Tabibian: Okay, that's great. How much of a mechanical engineering challenge is this versus an electronics challenge?
Aubrey Donnellan: We're lucky on a couple of different fronts there, in that we're on an already-manufactured machine that's been stress tested. If we're on a Case, for example, they make great tractors, and we [00:18:30] get to benefit from that and their test cycles. Frankly, given the electronics and software algorithms that our team has built, it hasn't been a huge mechanical engineering problem or challenge.
We have a couple of areas in controls where we do have mechanical solutions, but they're minor compared to the complexity and problem solving that we've had to do on the Sensor Fusion, guidance, and navigation front.
Ali Tabibian: What has been both a surprising and an expected challenge in terms [00:19:00] of this integration? Where does it require the most work and adaptation to this particular solution?
Igino Cafiero: As we think through it, the challenge is the breadth of the problem in this new space. A lot of solutions exist for your Tesla to drive down the road and sense the rows and sense the lanes. There hasn't been a lot of research, a lot of work done in these unstructured off-highway environments.
Ali Tabibian: I see. They have an expectation that their path is marked with a white line or a yellow line, and you got nothing.
Igino Cafiero: Or, it's been modeled extensively [00:19:30] already, whereas we're either building those models on the fly, or re-driving paths for models that we've already made. There's a lot of environmental variability there. Things like twigs stick out, and naturally we need to drive through twigs, but not through other obstacles that could be damaged, or damage the vehicle itself, and distinguishing between those sorts of things has been a challenge.
Ali Tabibian: Let me actually maybe break that up a little bit, because you've done a lot of work around perceiving the environment. Why take that approach [00:20:00] when a little bit of infrastructure, for example, might allow you to have a perfect understanding of where you are in that environment?
Aubrey Donnellan: There's a couple points on this.
Igino Cafiero: Yeah.
Aubrey Donnellan: To your point, we thought through this a lot. What small infrastructural changes, or additions to the environment, could we make that would make us more precise without our very fancy Sensor Fusion approach, from putting reflective tape on the trees, to having beacons on the plot? All those are great.
If you look at collegiate [00:20:30] teams that are doing robotics, they capitalize on those approaches. The problem for us is scale, and to provide the user experience that we want to provide, we know we have to be better than that and not have this huge upfront infrastructure change for people to adopt us. So yeah, we shot for better than that.
Ali Tabibian: It's also a rugged environment.
Igino Cafiero: The very nature of the trees: they're living things, they grow and change. Even something as basic as figuring out where we are, using traditional methodology, is hard; we don't have a lot of flat surfaces like building [inaudible 00:21:01]. We've had to play other tricks.
Even things like branches falling down during a storm. It's impractical to stop for every obstacle in the way, so when there are obstacles, we need to drive over them as a human would, and identifying and classifying those is a big part of the capability we're developing.
Ali Tabibian: One of the things that people do generally for autonomy is they send a vehicle through with a LIDAR and basically map the environment to a high resolution. What they're really doing is, essentially, three-dimensional edge detection. [00:21:30] A building will have a particular profile; the problem is when it snows, for example, that profile actually changes enough that what's in your memory suddenly isn't matching, closely enough, what's actually in the live environment.
That's one of the reasons that things like snow, and until recently rain, would cause a problem for this environment. What you're saying is, at least those buildings don't really change very often, and at least you know when it snowed. But in a living environment, that contour [00:22:00] is constantly changing. In that sense, I guess you're constantly remapping-
Igino Cafiero: [inaudible 00:22:05]
Ali Tabibian: Did I get that right?
Igino Cafiero: That's exactly right.
Ali Tabibian: All right.
Aubrey Donnellan: Yeah, we don't have to remap every single ... You know, there is a level of static. If you have a tree at the beginning of a season, for years that tree will hopefully stay in the same place. There are markers in these environments, but if you think about it, trees look very similar and very different at the same exact time.
If you're using computer vision, yeah, they look kind of the same. They have a canopy, and they have a stump, [00:22:30] and there are features that you can key off of, but they also aren't perfectly the same. That makes the processing of your point clouds and detection less definitive for actually localizing in the environment based on those maps, if that makes sense.
Ali Tabibian: It does make a lot of sense. As you think about deploying these systems, the question of safety always comes up.
Igino Cafiero: Absolutely. We think about this a lot, of course. We do have the natural advantage that we're not in highly populated areas, [00:23:00] and we can make some rules about how you interact with these machines. But certainly, we can't depend on those rules. It makes the likelihood of unforeseen obstacles less, but it doesn't eliminate it, so we need to deal with it.
A lot of what we do is also redundancy, both in hardware and software, to remedy single points of failure. Our detection system is a completely different loop than our decision-making system, so that if one system fails, the other will catch it. Then, we use different sensors for far-out detection, middle- [00:23:30] range detection, [inaudible 00:23:33] detection, and different reactive things we can do.
We sense either obstacles, or even failures in the system. It's a whole different category of bad things happening when one sensor fails and is sending bad data; then the [inaudible 00:23:46] system has to recognize that, and we have [inaudible 00:23:48].
We do have this happy benefit, which, as you mentioned earlier: for an 18-wheeler going down a highway, perhaps, slamming on the brakes is more dangerous than trying to avoid the obstacle, or doing a third thing. For us, [00:24:00] the happy side effect is that we can afford false positives; slamming on the brakes of a tractor, we believe, doesn't have very negative side effects. So, we can afford to be more cautious.
The only negative side effect is the user experience, so of course we work to lower the number of interrupts to an acceptable amount. We're never afraid to stop the tractor if we see something.
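The "never afraid to stop" philosophy above can be sketched as an independent safety monitor: a loop, separate from the decision-making system, that commands a stop whenever any sensor reports an obstacle, goes stale, or flags itself unhealthy. Everything here (field names, the staleness threshold) is invented for illustration and is not Bear Flag's actual interface.

```python
import time

# Toy safety watchdog biased toward false positives: any doubt means stop.
# Field names and thresholds are hypothetical, for illustration only.

STALE_AFTER_S = 0.2  # a reading older than this is treated as a failure

def safe_to_drive(readings, now):
    """readings: list of dicts with 'timestamp', 'healthy', 'obstacle' keys."""
    for r in readings:
        if now - r["timestamp"] > STALE_AFTER_S:   # sensor stopped reporting
            return False
        if not r["healthy"]:                       # sensor self-reported a fault
            return False
        if r["obstacle"]:                          # something is in the path
            return False
    return True

now = time.monotonic()
ok = [
    {"timestamp": now, "healthy": True, "obstacle": False},   # camera
    {"timestamp": now, "healthy": True, "obstacle": False},   # LIDAR
]
stale = [{"timestamp": now - 1.0, "healthy": True, "obstacle": False}]

# Fresh, healthy, obstacle-free readings allow driving; a stale sensor
# forces a stop even though it reported no obstacle.
print(safe_to_drive(ok, now), safe_to_drive(stale, now))
```

Because the monitor runs independently of the planner, a failure in either loop is caught by the other, matching the redundancy described above.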
Ali Tabibian: What's been your experience so far in the field? What's been kind of interesting, big surprises, less welcome surprises?
Igino Cafiero: Lots of snakes in the Central Valley.
Ali Tabibian: Oh, is that right? Are [00:24:30] they poisonous?
Igino Cafiero: No.
Aubrey Donnellan: We don't know.
Igino Cafiero: Yeah.
Ali Tabibian: Oh, that's always a relief.
Igino Cafiero: We joke about that often, especially at our first customer. They deal with a number of snakes, so we joke about showing up on tractors and stuff. I'll take this-
Aubrey Donnellan: Yeah, absolutely.
Igino Cafiero: One of the most fun experiences I've had: we'd come out to the Central Valley, and we're a team from Silicon Valley. These farmers have been growing this land for generations. They know best how to farm it. We do face some skepticism [00:25:00] as we come in. Growers are very knowledgeable and very polite, but they're also skeptical about new things.
They haven't survived this long by chasing trends. That was certainly the experience with our first customer. I came and talked to them once and explained how we were thinking about things, and then I came back a month or two later with our first demo prototype, the side-by-side; it's in the garage right now. There was a moment where the ranch supervisor, who was very polite but skeptical of technology, was standing right next to me.
I put him in the driver's seat, and we rode in [00:25:30] the side-by-side up the side of the road; he's in the driver's seat on the left, I'm on the right. He's sitting very rigid, and he has his arms crossed. I won't do that because of the microphone. He's staring ahead, and he's waiting to see what happens.
The side-by-side starts to drive down the road. The first thing that happens is the accelerator pedal goes in, [inaudible 00:25:48] pulled it in, and then the steering wheel starts making minute adjustments down the road. You could see, during the course of the drive down this row, the transformation in his body language.
First, you started to see the smile curl on his [00:26:00] lips, and his arms relaxed. By the end of the row, he was just settling back, and he was like, "We could use it for this. We could use it for that. This is incredible. I see how this will transform our operational ability."
Moments like that are especially encouraging, when we're able to show farmers how what we're working on can help.
Ali Tabibian: In some ways, does that worry you in terms of competition? If it takes a few months to get to a prototype, does it worry you that it could allow easy entry for others as well?
Aubrey Donnellan: We would [00:26:30] love to see more entrants in this space. We haven't seen many. The people that we read about doing this on the startup side, we're super impressed by, and hope that they move up. But we also don't know what's going on at the major OEMs. We don't know what they're investing in outside of the M&A deals that we read about. It's tough to tell. It's not easy, though.
In the six months that we've had PhDs working on some of this Sensor Fusion technology, [00:27:00] we've seen it's not a trivial problem. I wouldn't fault anyone for not being able to come up to speed on it quickly.
Ali Tabibian: That's a point that Tesla has made for a long time as well. They always said that they would benefit if more people knew people with electric vehicles that they were happy with.
Igino Cafiero: Entirely so. Educating growers about the tools that could potentially be available to them to help them survive, this is a conversation that we need to be having. Absolutely. Another thing to add on, too, is this big chasm between a prototype that works, and something that you can actually [inaudible 00:27:30]. [00:27:30] And that's what we're tackling now.
Frankly, [inaudible 00:27:32] videos of the tractor are great. Like I said, putting in redundancies, having some communications systems, having an interface where a grower can tell the tractor specifically what he wants the tractor to do without being frustrated. These are the next level challenges that [inaudible 00:27:46].
Ali Tabibian: From what I've seen at other companies, their discovery is that the more advanced the technologies you're deploying, especially from a computational perspective, the more closely they need to be aware, in detail, of the workflow. [00:28:00] The learnings matter a lot, and therefore that's, to some extent, the advantage of being first and being ahead.
Igino Cafiero: Yeah-
Aubrey Donnellan: Couldn't agree more with that.
Igino Cafiero: We spend so much time with growers-
Aubrey Donnellan: On tractors as well.
Igino Cafiero: Yeah, coming up to speed on their operational constraints has been P1 for us, absolutely. I attend conferences all over the state ... I'm a member of Western Growers in Salinas, which has been incredible for us, too, but then also just going out and talking to growers in their fields and understanding how they do work and how they think [00:28:30] about it. It's not standard across all farms, or even types of farms. Understanding what the commonalities are, and how we can help the most people.
Ali Tabibian: Slight change of subject here. What did people think, your loved ones, family members, etc., when you decided to attempt this? What's their view now, a year later, after you've had prototypes, financing success, and a nice Y Combinator Demo Day?
Igino Cafiero: Just to be clear, we've certainly come a long way. We're still alive, but we're far from successful. Our vision is very massive, and we [00:29:00] sit here thinking constantly that we're not moving quickly enough. Just to be clear on that. Yeah, I think it's across the range. There's sort of ... On one hand, there's an inevitability that tractors will someday be automated, and then you sort of dive down into the weeds about, okay, what does that look like?
Sometimes people lose interest, and sometimes they dig in. Yes, many [inaudible 00:29:22] human sense [inaudible 00:29:22] our life is the same. She tolerates the long hours, and I'm certainly not at my finest when [00:29:30] I walk in the door at the end of the day. She still loves me, and I'm grateful for that.
Aubrey Donnellan: Looking back at my parents ... I come from a family of real estate salespeople. I was the first engineer in my whole family, so they don't know half the stuff that I've been working on since I was 22. Again, they're like, "Aren't those things automated already?" Nope, not yet. They were healthily skeptical, but I think people have [00:30:00] just rallied behind keeping these growers in business, and getting them the same thing that Detroit had.
When you can fundamentally change an industry with automation, it really opens up a lot of doors [inaudible 00:30:13] people are behind it.
Ali Tabibian: What are the milestones over the next several months, couple of years, however you tend to measure it?
Aubrey Donnellan: Customers, customers, customers. We have [inaudible 00:30:22] pilots going on right now, but we want to bring on a few more by the end of this year, to expose this to more [00:30:30] growers, or orchard owners specifically, in California, and to drive interrupts in the field ... I know we talked about that before. But driving interrupts in the field down to just a minimal one to two throughout a whole day, so that the customers aren't going out there and rescuing machines that are in certain situations and just don't know what to do.
Yeah, making the user experience better, and moving toward a state where we can have one supervisor supervising a fleet of autonomous tractors. [00:31:00] We're not trying to eliminate the human from the entire operational loop, but being able to see the savings of converting your fleet to an autonomous fleet, that's our goal in the next 12 to 18 months, depending on our business model.
Ali Tabibian: How do you entice a farmer to sort of go along with what you're doing, and let you test there? Do they get an advance ... Is it like Kickstarter, where they get an advance copy of the tractor? What happens?
Aubrey Donnellan: Well, I've been extremely surprised and just super delighted that the people [00:31:30] that we've spoken to, growers across California, and I can't say this is the entire population, but surely, every single person that we've been put in touch with loves what we're doing, and has said, "Come to my farm. You're welcome to test at our farm. I may not be outsourcing my operations to an early-stage startup today, but I really like where you're going with this. And you guys are moving quickly." They're behind us.
I mean, we've had offers from so many different farms to come demo, come pilot. Really, everybody we talk to [00:32:00] has that open-arms attitude.
Igino Cafiero: I completely agree, and certainly we have that advantage being close to the area. But keep in mind, growers don't adopt technology for technology's sake. Just because we have a cool gizmo, in fact probably because we have a cool gizmo, they're going to be much more skeptical.
It's on us to show that we can help their bottom line, and help them survive. They don't give us the benefit of the doubt; that's something we need to earn. It gives me security, too, because I know that if we can deliver something that helps them in these material [00:32:30] ways, we've done something good.
Ali Tabibian: How do people get in touch with you?
Igino Cafiero: We have a website: BearFlagRobotics.com. There's an email there that also [inaudible 00:32:39].
Ali Tabibian: Did we forget to mention something?
Igino Cafiero: If you know any engineers that are looking for a good challenge, tell them to reach out. I'm always talking to people that are motivated to help.
Aubrey Donnellan: Any connections to growers who might be interested in connecting with us.
Ali Tabibian: Great. Thank you so much.
Aubrey Donnellan: Thank you so much.
Ali Tabibian: Thanks for taking the time. Bye-bye.
Voiceover: Haven't we made you feel, like, really smart? [00:33:00] Let us elevate you to Very Stable Genius. Click subscribe, or visit us at GTKPartners.com, where our subscribe buttons are much bigger.