When It Comes to Farm Data, How Good Is ‘Good Enough’?

Ron Farrell, a frequent sounding board for my wacky ideas and a source of ongoing encouragement to stay the course in this fast-changing industry, once told me to spend an appreciable amount of my time thinking about the business.


As leader of the editorial group, it was, in fact, my duty to scan the horizon, look for challenges and opportunities, and stick a finger in the eye of the status quo, if necessary.

So, it was with this in mind a few years ago that I viewed Farmers Edge and its move into the U.S. market on the heels of its success in Canada. My perception of this “off the shelf” solution for turning data into agronomic recommendations was that, while boots on the ground were important, a key aspect of its functionality was the algorithm it employed to generate those recommendations.

I wondered at the time whether such an offering, combined with what I expected to be a more rapid consolidation of farming operations, could result in a greater reliance on algorithms among these consolidated farms.

Follow me here … farm size can only increase as a farm manager’s comfort level with all the variables they manage increases. Could agronomy that a grower considers merely “good enough” be an acceptable trade-off for rapid growth in acres controlled?

So, fast forward to last week, when I got another mind-bending call from Jeremy Wilson, my friend and precision agriculture warrior at CropIMS. He threw out the same question: “Paul, is ‘good enough’ good enough?”

What got him thinking about this was the final mandate from government regulators clearing the way for the Bayer-Monsanto deal: that Bayer divest its Bayer Digital Farming initiative to a third party, which ultimately became BASF.

Were regulators seeing something here? Jeremy saw it as a gigantic acknowledgment of the growing importance and power of the data being collected by big agriculture companies.

It frankly surprised me a bit, although I have been pretty jaded about data and the industry’s ability to turn it into anything meaningful. Of course, then came the high-profile Facebook hearings and the revelations (though no one should be surprised) that data is being harvested to create profits for big companies. Perhaps regulators saw this coming? Perhaps this was a factor in keeping two giant databases from becoming one?

Which gets me back to “good enough,” and whether such a massive accumulation of farm data could serve to improve algorithms to that “good enough” threshold.

I would say “no” out of hand if we had a high percentage of farmers utilizing precision data through engagement with service providers. But with so many farms untethered to a robust precision program, and consolidation probably an inevitable evolution over the next decade, could “good enough” be an acceptable alternative?

Putting aside the fact that producers should be very concerned about their data’s whereabouts, regulators, food retailers, and the consuming public will have a lot to say about what we grow and how we grow it in the years ahead. In the meantime, we need to keep working on demonstrating service value and encouraging continuous improvement.

