Report 1999-04: Surveying Farmers: A Research Note
Joost M.E. Pennings, Scott H. Irwin and Darrel Good
Copyright 1999 by Joost M.E. Pennings, Scott H. Irwin and Darrel Good. All
rights reserved. Readers may make verbatim copies of this document
for non-commercial purposes by any means, provided that this copyright
notice appears on all such copies.
"Too detailed, bad time; got too much to do; not a day goes by
without getting something; just worn me out."
"... I spend even 1 minute? What is the benefit to me?"
"... as simple as possible. Go after 3 or 4 points at most. Do not
ask questions that would require farmers to go to their records."
---Responses of farmers about mail surveys (August 1999)
Agricultural economists have long
used mail surveys as a data collection instrument. Recent examples
include Hayenga, Hobbs, and Thilmany. The widespread use of mail
surveys can be attributed partly to the advantages of economy
and convenience inherent in such surveys. Surveying farmers
through the mail on a nation-wide basis can be cost efficient
when the surveys effectively generate a representative response.
These economies, however, are negated when researchers fail
to consider factors that stimulate response rates and completeness.
Although mail surveys are widely used in agricultural economic
research, the problem of low response rates has seldom been addressed.
A low response rate affects the mail survey's
ability to produce high quality data. A common problem is the
lack of a representative sample due to a low response. Related
to this is respondent self-selection: farmers who are interested
in the subject of the questionnaire may respond more often than
those who are not. In that case, non-respondents differ systematically
from respondents, resulting in biased survey results.
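This self-selection effect can be illustrated with a small simulation. The numbers here are purely hypothetical, not data from the study: a 20% population share and response probabilities of 30% versus 5% are assumed for illustration only.

```python
import random

random.seed(42)

# Hypothetical population: 20% of farmers use an advisory service
# (the survey topic); users are assumed far more likely to respond.
population = [{"uses_service": random.random() < 0.20} for _ in range(100_000)]

def responds(farmer):
    # Assumed response probabilities: 30% if interested in the topic, 5% if not.
    p = 0.30 if farmer["uses_service"] else 0.05
    return random.random() < p

respondents = [f for f in population if responds(f)]

true_share = sum(f["uses_service"] for f in population) / len(population)
observed_share = sum(f["uses_service"] for f in respondents) / len(respondents)

print(f"true share of users:     {true_share:.2f}")    # ~0.20
print(f"share among respondents: {observed_share:.2f}")  # ~0.60, badly biased
```

Even though only one in five farmers in this sketch uses a service, roughly three in five respondents do, because interested farmers answer at six times the rate of the rest.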
This research note describes an
exploration of factors influencing response rates of mail surveys
sent to US farmers. First, the mail survey as a technique to obtain
primary data is briefly discussed, followed by a short literature
review of techniques used to increase mail survey response rate.
Next, the research design is described and results are discussed.
Finally, results are summarized and some recommendations are made
for improving response rates of mail surveys sent to farmers.
Factors Influencing Response Rate
In agricultural economic research,
theory is often tested using secondary data, that is, data that
have been gathered for some other purpose but are applicable to
the study. The primary advantage of secondary data is the low
cost. Moreover, much of the secondary data are instantly available since they
already exist and merely need to be discovered. On the other hand,
the collection of primary data (i.e., data that originate with
the specific research undertaken) can take a long time and can
be very expensive. However, the advantages of secondary data over
primary data come at a cost. Secondary data might not fit the
researcher's study because of differences in definitions. Furthermore,
secondary data may not be available, particularly for research
that involves farmers' opinions, perceptions, and attitudes.
Collecting primary data to validate
theoretical models and concepts can be done in four ways: 1) by
questioning farmers in a mail survey, 2) by personal interviews,
3) by observing their selected activities, and 4) by conducting
experiments with them. In this research note, we focus on the
method most widely used in agricultural economics: the mail survey.
Surveys can be conducted through personal interviews, telephone
interviews, or mail questionnaires. Table
1 provides a short overview of the advantages and disadvantages
of mail surveys.
In the marketing and psychology
research literature several factors have been identified that
influence the response rate of mail surveys. Childers and Ferrell
found that response rates decrease with an increase in the questionnaire's
length. Moreover, they found that questionnaire length, as
perceived by the respondent, is multidimensional: a function
of the number of questions, the number of pages, and the size
of the pages. These findings are
in line with the finding of Harvey that layout is important. A
cramped layout with little space on the page is less attractive
than a longer one which has ample space for responses. Hornik
and Brennan found a positive relation between response rate and
direct rewards (monetary and non-monetary). Hansen indicated that
although a monetary inducement improves the response rate, it
does not necessarily improve the accuracy of the results. Buse
(1973) and Wolfe and Treiman showed a large positive effect of
persistence (repeated contacts) on response rate. Childers, Pride,
and Ferrell showed that emphasizing how the person's input will
help others in the cover letter raises the response rate. Jones
and Lang showed that hiding the identification of the sponsor
increases the response rate. Interestingly, this
finding was based on commercial sponsors. One might argue that
public and not-for-profit sponsors would have a positive
influence on the response rate. Heberlein and Baumgartner found
support for this claim. Other factors that are related to response
rate are preliminary notification, provision of return envelope,
personalization (e.g., hand-addressed envelope and personal signature),
promise of anonymity, and specification of a deadline for returning
(Yammarino, Skinner and Childers). Heberlein and Baumgartner
were able to explain 51% of the variance in final response through
the number of respondent contacts (preliminary and follow up)
and the saliency of the survey topic. Recently, Yammarino, Skinner
and Childers found that the type of subjects sampled moderates
the effect of the above-mentioned factors (e.g., consumers vs. managers).
The studies mentioned above focus
almost exclusively on consumers. An exception is Buse (1973), who reports that
a personal letter and persistence in the form of several follow-ups
resulted in a high response rate of farmers in Wisconsin.
In this study, the primary interest
is farmers. The influence of the factors mentioned above
might differ for farmers, and other factors, such as the
time of year the questionnaire is sent, might play a role. One
would expect that the response rate of farmers, in particular
crop farmers, is lower in times when there is fieldwork to be
done than when there is relatively less work.
In the studies mentioned above,
the response rate was the dependent variable and the independent
variables were the manipulated factors designed to affect response
rates. Rather than manipulating mail survey designs to investigate
the response rate, this study asked farmers directly what features
of mail questionnaires are related to their willingness to complete
them. These questions were asked in a telephone interview of farmers
who did not respond to a mail questionnaire they had received a few
weeks earlier.
Research Design and Results
A mail survey was developed that
dealt with how farmers choose among market advisory services and
how they use these services. The mail survey was part of a project
that was motivated by the expansion in the use of market advisory
services by farmers in the US. Previously, information about how
market advisory services perform was limited. A research program
was developed to provide information about the performance of
these market advisory services.
The questionnaire was designed
taking into account the insights of the survey literature. That
is, an in-person pre-test was done with a group of 15 farmers,
in which they were asked to complete a questionnaire and to indicate
any ambiguity or other difficulty they experienced in responding
to the questions, as well as any suggestions they deemed appropriate.
Based on the feedback received from the farmers, some questions
were eliminated, others were modified, and additional items were added.
After the pre-test, a survey was
designed based on the literature reviewed above. Farmers who returned
the survey were eligible to win a $100 cash prize. The envelope
revealed that it was a questionnaire from a University, and a
postage-paid return envelope was included. The cover letter was personalized,
printed with University letterhead, and indicated that it was
a University study about agricultural market advisory services
that should require about 20 minutes for completion. The questionnaires
were printed in booklet form with 12 letter-sized pages containing
47 questions. The cover letter indicated that the information
provided was strictly confidential and that respondents could
call one of the researchers if they had any questions about the
survey (the researchersí names and telephone numbers were given
in the cover letter).
The questionnaires were sent in
the second week of June 1999 to 100 randomly drawn crop farmers
across the Midwest, Great Plains, and Southeast regions of the
US. The sample was drawn from directories kept by a US firm that
delivers agricultural market information and advisory services
via satellite. In general, the customers of this firm represent
relatively large-scale commercial farmers. After two weeks, a
reminder was sent to the non-respondents, including a new copy
of the questionnaire. The survey period closed in the
second week of July 1999. By that time, only 12 questionnaires
had been returned, a response rate of 12%. This was well below the typical response
rates of 20% to 30% reported in the survey literature (Yammarino,
Skinner and Childers).
In order to gain insight into why
farmers did not respond, a telephone interview was conducted with
all 88 non-responding farmers in the first week of August 1999.
Of these 88 farmers, 55 completed the telephone interview. Of
the 33 farmers who did not complete the telephone interview, 15
refused to participate and 18 were not available. Tables
2 and 3 summarize the questions asked in the telephone interview
and the responses.
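The counts reported above imply the following rates (a trivial sketch using only the figures in the text):

```python
# Counts reported in the study.
surveys_sent = 100
surveys_returned = 12

non_respondents = surveys_sent - surveys_returned  # 88 farmers were called
completed_calls = 55
refused, unavailable = 15, 18

mail_response_rate = surveys_returned / surveys_sent
call_completion_rate = completed_calls / non_respondents

print(f"mail response rate:    {mail_response_rate:.0%}")    # 12%
print(f"phone completion rate: {call_completion_rate:.1%}")  # 62.5%

# Sanity check: the three call outcomes account for every non-respondent.
assert completed_calls + refused + unavailable == non_respondents
```

Notably, the telephone follow-up of non-respondents achieved a far higher completion rate (62.5%) than the mail survey itself (12%).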
As shown in Table
2, a large percentage of the farmers did not scan or read
the mail questionnaire. The fact that only 25% of the farmers
interviewed scanned or read the mail survey can be partially attributed to
the time of mailing the survey. June is one of the worst months
for receiving a mail survey, with January and February being the
preferred months for receiving questionnaires. This timing preference
is overwhelming, with 63% of the farmers indicating that January
or February are the best months to complete a survey. The next
best month for completing a survey, December, was cited by only
8% of the respondents.
Results for the amount of time
respondents are willing to spend on a survey are shown in Tables
3 and 4. The telephone interview indicated that farmers are
willing to spend, on average, a maximum of 13 minutes completing
a mail survey. Forty-five percent of the respondents were not
willing to spend more than 10 minutes and 35% no more than 5 minutes
completing a mail survey. The mail survey used in this study required
about 20 minutes to complete, contributing to the low response rate.
Compensation results also are reported
in Tables 3 and 4. About half of
the farmers interviewed expected to be compensated for completing
a survey. Money was the preferred compensation, followed by gifts
and coupons. The appropriate compensation varied between $1 and
$50, with an average of $15 and a median of $10. One-third of
the farmers identified a compensation of $15 or more.
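Because the reported mean ($15) exceeds the median ($10), the distribution of expected compensation is right-skewed: a few farmers naming large amounts pull the mean up. A minimal sketch with hypothetical amounts (the individual responses are not published in the note) shows the pattern:

```python
import statistics

# Hypothetical amounts chosen to be consistent with the reported summary
# statistics (range $1-$50, mean $15, median $10); these are NOT the
# actual survey responses, which the note does not report individually.
amounts = [1, 5, 5, 10, 10, 10, 15, 20, 24, 50]

print("mean:  ", statistics.mean(amounts))    # 15
print("median:", statistics.median(amounts))  # 10
```

With a skewed distribution like this, the median ($10) is the more representative figure for what a typical farmer expects, while the mean is inflated by the few large amounts.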
The interview results indicate
that the appropriate compensation depends on the length of the
survey and the organization that conducts the survey. Farmers
did not expect to receive a (high) compensation from a University
or government organization, but would expect compensation from
private companies. These results refine the findings of Jones
and Lang, who suggested that hiding the sponsor's identity increases
the response rate.
During the telephone interview
the farmers had the opportunity to provide suggestions that would
make mail surveys more attractive to them. A suggestion that was
often mentioned was that mail surveys should not include questions
that require farmers to consult their records. Surveys that consist
of questions that require rating and checking boxes are preferred
over open-ended questions.
Summary and Implications
The results of this study have
important implications for survey research in agricultural economics
and related fields. First, a relatively brief time window exists
for effectively conducting mail surveys of crop farmers. About
two-thirds of the farmers indicated that the best months are limited
to January and February. While this accords with common sense,
the restrictive nature of this time window raises serious questions
about the usefulness of surveying crop farmers outside of this
time period. If researchers have the choice, a survey should be
targeted for delivery in early winter. If researchers must conduct
a survey in other months, a lower response rate should be expected
and researchers need to carefully consider the bias this may inject
into survey results. Alternatives for dealing with the lower response
rate may need to be considered, such as monetary compensation.
Second, farmers are willing to
spend relatively little time completing mail surveys. Without
compensation, the majority of crop farmers will not spend more
than about ten minutes. Over one-third are unwilling to spend
more than five minutes! This strongly suggests that lengthy surveys
(sent without compensation, as is the typical practice in agricultural
economics) will result in low response rates and may be plagued
by related response biases. Anecdotal evidence suggests lengthy
surveys are the norm in agricultural economics survey research.
Surveys need to be short and tightly focused if they are to be
effective instruments in measuring the intended constructs.
Third, cash compensation may be
required in order to assure desired response rates from crop farmers.
The requirement for compensation is related to the length of the
questionnaire and whether it is conducted by a private or public
entity. If cash compensation is included, researchers should expect
to pay on the order of $10-$15 per completed survey. While this
is a small amount viewed on an individual basis, it could wreak
havoc with project budgets in a large nationwide survey.
Fourth, the results of this study
are instructive with regard to the negative aspects of the information
revolution. While advances in computer and communication technology
foster the production and analysis of data, there is still a basic
constraint on the production of that data. Many of the comments
by farmers in the telephone interview appeared to be a plea for
relief from the flood of surveys that inundate them on a daily
basis. In the future, researchers need to carefully consider this
issue when designing research projects requiring survey data.
Finally, the survey results suggest
that crop farmers are more willing to answer questions not requiring
them to consult records for factual information. This places a
clear restriction on the type of data that might be successfully
solicited from crop farmers. Data gathering procedures that combine
secondary accounting data (e.g., farm records already available)
with survey data seem an interesting avenue to explore in future research.
Brennan, M. The Effect of Monetary
Incentive on Mail Survey Response Rates: New Data, Journal
of the Market Research Society 34 (May 1992): 173-177.
Brooks, R.M., V. D. Ryan, B.F.
Blake and J.R. Gordon. Increasing Response Rates in Mailed Questionnaires:
Comment. American Journal of Agricultural Economics 57
(August 1975): 517-519.
Buse, R.C. Increasing Response Rates
in Mailed Questionnaires. American Journal of Agricultural
Economics 55 (August 1973): 503-508.
Buse, R.C. Increasing Response
Rates in Mailed Questionnaires: Reply. American Journal of Agricultural
Economics 57 (August 1975): 520-521.
Childers, T.L. and O.C. Ferrell.
Response Rates and Perceived Questionnaire Length in Mail Surveys.
Journal of Marketing Research 16 (August 1979): 429-431.
Childers, T., W.M. Pride and O.C.
Ferrell. A Reassessment of the Effects of Appeals as Responses
to Mail Surveys. Journal of Marketing Research 17 (August 1980).
Hansen, R. A Self-Perception Interpretation
of the Effects of Monetary and Nonmonetary Incentives on Mail
Survey Respondent Behavior. Journal of Marketing Research 17 (February 1980).
Harvey, L. A Research Note on the
Impact of Class-of-Mail on Response Rates to Mailed Questionnaires.
Journal of the Market Research Society 29 (July 1987).
Hayenga, M.L. Cost Structures of
Pork Slaughter and Processing Firms: Behavioral and Performance
Implications. Review of Agricultural Economics 20 (Fall-Winter 1998).
Heberlein, T. and R. Baumgartner.
Factors Affecting Response Rates to Mailed Questionnaires: A Quantitative
Analysis of the Published Literature. American Sociological
Review 43 (1978): 447-462.
Hobbs, J.E. Measuring the Importance
of Transaction Costs in Cattle Marketing. American Journal
of Agricultural Economics 79 (November 1997): 1083-1095.
Hornik, J. Time Cue and Time Perception
Effect on Response to Mail Surveys. Journal of Marketing Research
18 (May 1981): 243-248.
Jones, W.H. and J.R. Lang. Sample
Composition Bias and Response Bias in a Mail Survey: A Comparison
of Inducement Methods. Journal of Marketing Research 17
(February 1980): 69-76.
Thilmany, D.D. FLC Usage among
California Growers under IRCA: An Empirical Analysis of Farm Labor
Market Risk Management. American Journal of Agricultural Economics
78 (November 1996): 946-960.
Wolfe, A. and B. Treiman. Postage
Types and Response Rates on Mail Surveys. Journal of Advertising
Research 19 (February 1979): 43-48.
Yammarino, F.J., S.J. Skinner and
T.L. Childers. Understanding Mail Survey Response Behavior: A
Meta-Analysis. Public Opinion Quarterly 55 (Winter 1991).
Joost M.E. Pennings is a visiting scholar at the Office for Futures and Options
Research at the University of Illinois at Urbana-Champaign and
an associate professor at Wageningen Agricultural University in
the Netherlands. Scott H. Irwin and Darrel Good are professors
in the Department of Agricultural and Consumer Economics at the
University of Illinois at Urbana-Champaign. Corresponding author:
Joost M.E. Pennings, University of Illinois at Urbana-Champaign,
Office for Futures and Options Research, 326 Mumford Hall, MC-710,
1301 W. Gregory Drive, Urbana, IL 61801. Phone: 217-333-8442,
Fax: 217-333-5538, e-mail: firstname.lastname@example.org.
Funding for this research was provided by the Risk Management
Agency, US Department of Agriculture, and the Illinois Council on
Food and Agricultural Research.
 A notable exception
is the work of Buse (1973).
See Brooks et al. and Buse (1975) for a further discussion on
personalization and persistence.
 More information
about this project can be found at the homepage of the Agricultural
Market Advisory Project (AgMas) at http://www.aces.uiuc.edu/~agmas/