A Framework for Analytics in Agronomy
Innovative data aggregators, organizations, and scientists are applying a range of analytic techniques, including investigative data discovery, descriptive data aggregation, outcome-focused predictive analytics, and prescriptive techniques. Figure 1 shows a framework for analytics in agronomy. The framework is designed to enable architecturally significant use cases, including:
- Reporting. Farming operations require daily and periodic reports on a number of topics, including weather, grower and subcontractor results, and the information needed to compare planned versus actual activities. Such reports support regular farm operations and help with planning future activities. Farms also have to maintain reports and data on seeds and on applications of pesticides, herbicides, insecticides, and other treatments. Such reports are needed for cost and yield analysis and may also be required for inspection by federal and state farming regulatory authorities.
- Dashboards. The other major reporting capability is to enable dashboards for visualization and analysis. This includes dashboards for planning activities like crop planting and diagnostics of factors that could impact the quality and yield performance. Data and images gathered from satellites, drones, and sensors can also be visualized against field-level coordinates to observe the progress of crop growth and plan any required course corrections. Farmers may also require the ability to extract and transmit such data to agronomists and other advisors.
- Discovery. Support for predictive analytics is another major capability being designed into the agronomy framework to enable diagnostics, search, and data exploration. Such predictive analytics require historical data to observe variance between recommended and actual yields and other limiting factors. For example, analysis of the data may highlight a farm plot that consistently yields better results than others in the vicinity that don't receive similar inputs. The farmer and agronomists can then drill down and review other factors regarding the plot to understand this positive variance and whether it can be replicated across the farm.
Figure 1 — A framework for analytics in agronomy.
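The discovery use case above can be sketched as a simple variance analysis over historical yields. The following is a minimal, hypothetical Python example; the plot names, sample figures, and the 5% margin are assumptions for illustration, not real farm data:

```python
# Hypothetical sketch: flag plots whose yield consistently beats the
# neighborhood average, as a starting point for agronomist drill-down.
from statistics import mean

# Assumed input: per-plot yields (tonnes/hectare) over several seasons.
yields = {
    "plot_a": [9.8, 10.1, 9.9],
    "plot_b": [8.2, 8.5, 8.1],
    "plot_c": [8.4, 8.3, 8.6],
}

def consistent_outperformers(yields, margin=0.05):
    """Return plots that beat the per-season neighborhood mean every season."""
    seasons = len(next(iter(yields.values())))
    season_means = [mean(y[s] for y in yields.values()) for s in range(seasons)]
    return [
        plot
        for plot, series in yields.items()
        if all(v > m * (1 + margin) for v, m in zip(series, season_means))
    ]

print(consistent_outperformers(yields))  # plot_a stands out in this sample
```

In practice the same comparison would run against aggregated field records, with the flagged plots handed to an agronomist to investigate soil, drainage, or treatment differences.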
Dashboards enabled by predictive analytics are already starting to pay dividends in farming operations. For instance, a recent Reuters article quotes farmer Juergen Schwarzensteiner, who rotates corn, potatoes, and grains at a 970-hectare farm in Bavaria using satellite maps and analytics software: “This plot has had top yields consistently over the years, [and] I used to just say, that’s great.... Then we got the digital maps, and differences became apparent that were not clear to the eye before.” Using digital dashboards, farmers like Schwarzensteiner are able to view color-patterned digital maps that highlight discrepancies in plants growing in plots across fields even half a mile apart. Such dashboards “aim to provide farmers with individualized prescriptions on how to work each field down to a fraction of an acre, using data they have collected on soil and weather conditions.”
At the core of the framework are structured and unstructured data sources. Agribusiness organizations, government agencies, and other research organizations generate reports and transactional data in formats that can be stored in and retrieved from relational, structured databases. Such transactional and reference data may exist in databases within software applications running commercially developed databases like IBM's DB2, Microsoft's SQL Server, or Oracle. Such data can be cataloged, indexed, and queried using well-understood tools and techniques.
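As a sketch of how such structured treatment records can be queried with well-understood tools, here is an illustrative example using Python's built-in sqlite3; the table and column names are assumptions for the example, not a reference schema:

```python
import sqlite3

# Hypothetical schema for treatment records kept for cost analysis and
# regulatory inspection; names and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE applications (
        field_id TEXT, product TEXT, category TEXT,
        applied_on TEXT, quantity_l REAL
    )
""")
conn.executemany(
    "INSERT INTO applications VALUES (?, ?, ?, ?, ?)",
    [
        ("field_1", "herbicide_x", "herbicide", "2017-04-02", 12.5),
        ("field_1", "insecticide_y", "insecticide", "2017-05-11", 4.0),
        ("field_2", "herbicide_x", "herbicide", "2017-04-03", 9.0),
    ],
)

# A periodic report: total quantity applied per field and category.
rows = conn.execute("""
    SELECT field_id, category, SUM(quantity_l)
    FROM applications
    GROUP BY field_id, category
    ORDER BY field_id, category
""").fetchall()
for row in rows:
    print(row)
```

The same GROUP BY pattern serves both cost/yield analysis and the application reports regulators may ask to inspect.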
Social media, satellites, drones, and sensors also generate vast amounts of unstructured and big data that may include images, text, and other data structures. Emerging big data analytic techniques are being applied to make sense of this data. Traditionally, farmers have applied new techniques — such as new seeds, pesticides, herbicides, and so forth — to a small plot to observe optimal yields. Instead of such empirical analysis, which takes time, farmers are also embracing results from analysis of large, real-world data sets from public sources. Analysis of such big data can produce reliable recommendations much more quickly.
An article from the International Center for Tropical Agriculture illustrates how scientists at the center have applied big data tools to pinpoint strategies that work for small-scale farmers in a changing climate:
Big Data on Colombian rice comes from commercial fields in ambient weather, a stark departure from other research.... With good analytics, this produces nuanced, reliable recommendations much more quickly. It also gives rice breeders the feedback they need to develop climate-adapted lines.
Data aggregated from different sources needs to be analyzed, visualized, and used for reporting. Such data may be designed to persist in reporting tools temporarily or in a fit-for-purpose data warehouse designed for agronomic reporting. For instance, a large farming operation with hundreds of acres of land spread over a large geographic area may benefit from analysis of aggregated data, especially if trends from one field can predict outcomes in other fields. The aggregated data also needs to be localized. Rick Murdock, head of Ag Connections, a wholly owned subsidiary of Syngenta, explains: “We believe spatial agronomic recommendations are local and need to be driven by local retail agronomists, consultants, or grower agronomists: We know crops grow best when they are seen by the agronomist!”
Many of the techniques and solutions highlighted in the framework, including tools for data analysis, reporting, visualization, and aggregating big data with data across disparate sources, are already available. These are being used in other industries and to solve individual agronomy problems. The challenge is to aggregate data from disparate sources in different formats to draw inferences. Among the difficulties in aggregating data from different sources is the need to clean the data. A few key assumptions on data cleansing include:
- Data is generally not cleansed at the source or during the data collection stage. It is assumed that such data from different organizations and source systems is formatted and referenced according to its individual requirements. The data may not be designed with a common taxonomy, and even the metadata and units of measure may be different.
- Data cleansing can be partially automated, but it still requires some manual effort and standardization of business rules. Data analysts need to understand the sources of the data, data definitions, and metadata; based on that understanding, they can plan to translate and cleanse the data after retrieval.
- Cleansed data can be stored as published data for various visualization or analytical purposes. Such cleansed data may have to be reviewed periodically, as it might go stale.
[For more from the author on this topic, see "Enabling Agronomy Data and Analytical Modeling: A Journey."]