Benjamin Küfner, Joseph W Sakshaug, Stefan Zins, Claudia Globisch, The Impact of Mail, Web, and Mixed-Mode Data Collection on Participation in Establishment Surveys, Journal of Survey Statistics and Methodology, Volume 13, Issue 1, February 2025, Pages 66–99, https://doi.org/10.1093/jssam/smae033
Abstract
Over the past 30 years, self-administered establishment surveys have increasingly transitioned away from using mail to more online and mixed-mode data collection. To examine the potential impact of this transition on survey participation, we evaluate several mail and web single- and mixed-mode designs implemented experimentally in a large-scale job vacancy survey. We find that neither response rates nor nonresponse bias significantly differed between the alternative designs. Subgroup analyses revealed that establishments of all size classes showed a preference for the mail mode in the concurrent mixed-mode design, but larger establishments were more likely to participate via web than mail in the single-mode designs. Potential cost savings (over 50 percent per respondent) were evident when utilizing the web mode in either a single- or sequential mixed-mode design. Qualitative follow-up interviews indicated a general preference for the web mode due to easier handling, smoother collaboration between colleagues, avoidance of a cumbersome mail return, and its perception as a modern, sustainable solution.
Establishment surveys are increasingly transitioning from mail to web and mixed-mode data collection. However, the establishment survey literature offers little evidence on how these different data collection strategies affect response rates, nonresponse bias, subgroup participation, and costs. Our study helps to bridge these gaps by evaluating a large survey experiment that compares mail and web single-mode and mixed-mode designs using linked administrative data and qualitative interviews. Our findings are informative for establishment survey practitioners and shed light on the effects of implementing different self-administered mode designs.
1. INTRODUCTION
Establishment surveys are a crucial source of official statistics around the world and form the basis for research in various scientific fields, such as business, economics, and sociology. However, establishment surveys face several challenges (Bavdaž et al. 2020), including declining rates of voluntary participation and increased risk of nonresponse bias (e.g., König et al. 2021; Küfner et al. 2022; U.S. Bureau of Labor Statistics 2023), and rising data collection costs. To address these challenges, efforts have focused on using more cost-effective web-based data collection (e.g., Cohen et al. 2006; Erikson 2007; Snijkers et al. 2011; Thompson et al. 2015), a trend which accelerated due to the COVID-19 pandemic (e.g., Jones et al. 2023). Many self-administered establishment surveys have also shifted to using a mix of web and mail modes, deployed either in a concurrent or sequential mixed-mode design (e.g., Snijkers et al. 2011; Snijkers and Jones 2013; Thompson et al. 2015; Dillman 2017).
One example of a high-profile voluntary self-administered establishment survey is the IAB Job Vacancy Survey (IAB-JVS) conducted by the Institute for Employment Research (IAB) in Germany. Since 2002, the IAB-JVS has implemented a concurrent mail–web mixed-mode design, where paper questionnaires are mailed to establishments with the option of online completion (Bossler et al. 2020). However, this design has become increasingly expensive due to declining response rates and the sample size increases needed to meet the data reporting requirements of the European Commission (Eurostat) (Bossler et al. 2022). To evaluate the impacts of adopting an alternative mode design on response rates, nonresponse bias, and costs in the IAB-JVS, three different self-administered mode designs were experimentally implemented and compared to the standard concurrent design: a single-mode web design, a single-mode mail design, and a sequential web-to-mail mixed-mode design.
To our knowledge, no study to date has experimentally examined the impact of these self-administered data collection mode designs on response rates, nonresponse bias, and survey costs in a voluntary establishment survey. Such information is useful for survey practitioners, as previous experimental evidence tends to be based on special populations (e.g., small employers) rather than more general ones (Bremner 2011; Hardigan et al. 2012; Harris-Kojetin et al. 2013; Ellis et al. 2013). In addition, the majority of these studies are pre-2013 and do not reflect recent changes in Internet availability and usage by businesses (OECD 2023), nor do they reflect the effects that the pandemic has had on increased working from home. Thus, up-to-date experimental evidence is lacking on the impact of various self-administered mode designs on survey participation and costs in voluntary surveys of the general establishment population.
To address these research gaps, we report the results of the IAB-JVS mode design experiments conducted in the fourth quarter of 2020 during the pandemic. This study goes beyond the analysis of response rates and costs by utilizing extensive administrative data to study nonresponse bias and correlates of survey participation. To gain further insights into the perceived impact of the mail and web modes on establishment survey participation, results from 46 short qualitative interviews and 12 in-depth qualitative interviews are analyzed and presented. In short, the following research questions are addressed:
RQ1: Do response rates differ between web and mail single-mode and mixed-mode designs in an establishment survey?
RQ2: To what extent do web and mail single- and mixed-mode designs affect nonresponse bias?
RQ3: Are certain types of establishments more likely to participate via mail or web modes?
RQ4: What is the impact of the various self-administered mode designs on survey costs?
RQ5: How do mail and web modes influence the decision to participate in an establishment survey, and how do establishments perceive the advantages and disadvantages of both modes?
The remainder of the article is structured as follows: section 2 reviews the existing literature on self-administered mode designs in establishment surveys and section 3 introduces some hypotheses. Section 4 describes the experimental design, data sources used, and analysis plan. Section 5 presents the results of the experiments. Section 6 summarizes insights drawn from qualitative interviews conducted with several establishments, and section 7 concludes with a summary of the main findings and implications for survey practice.
2. BACKGROUND
2.1 Web and Mail Modes in Establishment Surveys
Historically, self-administered establishment surveys were often conducted via mail (Christianson and Tortora 1995). However, due to technological advances, mail establishment surveys have gradually been replaced by web surveys. For example, the US Survey of Occupational Injuries and Illnesses, conducted by the Bureau of Labor Statistics, has stopped using mail as its primary mode in favor of online data collection (U.S. Bureau of Labor Statistics 2020). Statistics Netherlands has also adopted a strategy that encourages online completion, providing mail questionnaires only upon request or for nonresponse follow-ups (Snijkers et al. 2018). The US Economic Census was exclusively carried out online in 2017 (U.S. Census Bureau 2023a) and 2022 (U.S. Census Bureau 2023b). These are just three examples of the global trend to replace (or significantly reduce) mail surveys with web surveys (see also Buiten et al. 2018). The upward trend of web-based data collection was further intensified during the COVID-19 pandemic, where face-to-face interviewing was partially replaced by online data collection (e.g., Jones et al. 2023) and new high-frequency online panel surveys emerged (e.g., Office for National Statistics 2022; U.S. Census Bureau 2022b). In Germany, however, many long-running establishment surveys rely to a large extent on traditional modes such as mail, CATI, or face-to-face and have not completed (or begun) the transition to web modes (see Bossler et al. 2020; Gensicke et al. 2022; Egeln et al. 2023).
The growing popularity of web surveys is largely driven by costs. For example, the costs associated with printing, mailing, and data entry for thousands of paper questionnaires can be avoided with web surveys, though web surveys do come with fixed costs (e.g., software, programming) and it is still often necessary to print and mail invitation/reminder letters for general populations. Other advantages of web surveys include automated question filtering, real-time plausibility checks, and interactive features (e.g., video clips), which are not possible in mail surveys. Nevertheless, there are still many advantages to using mail surveys. For example, in a world where surveys are primarily conducted online, receiving a mail questionnaire could garner special attention, highlighting the integrity and legitimacy of the survey. Moreover, some establishments might find web surveys burdensome or infeasible due to Internet or IT security restrictions, would prefer to work with a paper questionnaire, and may even come to expect paper questionnaires in government surveys. Those that are accustomed to the tradition of responding to mail surveys as part of their normal routine may be especially reluctant to switch over to an online platform (Haraldsen et al. 2011). In addition, establishments may use mail questionnaires to screen the questions (e.g., Giesen 2007) or prepare their answers, even if using web as the final reporting tool.
2.2 Mixed-Mode Approaches
Given that some establishments may be less likely to participate in one mode than in a different mode, a common strategy is to implement multiple modes, as in a mixed-mode design, which should appeal to more establishments (e.g., De Leeuw 2005; Dillman et al. 2009; Dillman and Messer 2010). Here, we make a distinction between mixed-mode data collection designs, where multiple modes are used to collect the survey data, and mixed-mode contact strategies, which use multiple modes to contact and recruit units, but may use a different mode design to collect the survey data (e.g., Langeland 2019; Sakshaug et al. 2019). Our focus is on the former. Two types of mixed-mode (data collection) designs can be distinguished: concurrent and sequential (De Leeuw 2005, 2018). In a concurrent design, at least two modes of data collection are offered to establishments in parallel. The main advantage of a concurrent design is that establishments are offered the full range of response options from the outset and are free to choose their preferred mode without influence from the survey institute. A disadvantage is that the choice itself may be considered a burdensome task. In a meta-analysis of mode design experiments in the context of voluntary household surveys, Medway and Fulton (2012) found that concurrent mail–web mixed-mode designs produce lower response rates than single-mode mail designs. Thus, giving people the freedom to choose between modes might deter them from participating at all. Recent evidence, however, has found no difference in response rates between concurrent mail–web mixed-mode surveys and mail-only surveys (Olson et al. 2021). But whether this finding applies to voluntary establishment surveys—a question we explore in the present study—is largely unknown. A second disadvantage of concurrent designs is that many respondents may opt for one of the more expensive mode alternatives, which can drive up survey costs (e.g., Hardigan et al. 2012; Ellis et al. 2013).
In a sequential design, a single mode is offered initially (i.e., the starting mode) followed by a secondary mode for nonresponse follow-up, and possibly a tertiary mode for remaining nonrespondents. A typical sequential design consists of deploying the most cost-effective mode first, followed by a costlier secondary mode (e.g., Snijkers and Jones 2013). Given that only one mode is offered at the outset, there is no burden of choosing between different modes. Web is a common starting mode given its relatively low cost. Implementing such a web-first strategy (e.g., Dillman 2017) can therefore lead to potential cost savings if a significant proportion of establishments take up the web starting mode, thus reducing the number of costlier follow-ups (e.g., Ellis et al. 2013; Gleiser et al. 2022). However, a potential downside of all mixed-mode surveys is that differential measurement error can occur if respondents answer differently depending on which mode they use (De Leeuw 2018). The risk of measurement mode effects tends to be higher when mixing self-administered and interviewer-administered modes (Klausch et al. 2013). Although the present study does not investigate measurement mode effects, we note that this is an understudied topic in the establishment survey literature and warrants further research.
2.3 Experiments in Self-Administered Establishment Surveys
Most research on self-administered mode designs comes from the household survey literature, though there are some studies that have experimented with web and mail single- and mixed-mode designs in the establishment survey context (Willimack and McCarthy 2019). Erikson (2007) compared a sequential web-to-mail design to a concurrent mail–web design in a Statistics Sweden survey and found that the web take-up rate was significantly higher in the sequential design (46.1 percent versus 5.2 percent), but that the concurrent design yielded an overall higher response rate before the nonresponse follow-up phase started (full results not reported). Bremner (2011) showed that a single-mode web design resulted in a lower response rate (59 percent) compared to a concurrent mixed-mode design (89 percent) offering mail, telephone data entry, and a web response option in a mandatory United Kingdom survey of small employers, with only 9 percent of the concurrent sample responding online. Ellis et al. (2013) compared a sequential web-to-mail design with a concurrent mail–web mixed-mode design in a voluntary US survey of local prison and jail administrators, showing that the response rate in the first field phase was slightly lower in the web starting mode (73 percent) than in the concurrent mixed-mode design (78 percent), but the difference became negligible after the second phase. Similar results were reported by Harris-Kojetin et al. (2013) in a voluntary survey of long-term care providers. Hardigan et al. (2012) compared a concurrent mail–web mixed-mode design with a single-mode mail design, both using mail contacts, and a single-mode web design using e-mail contacts in a voluntary survey of practicing dentists in Florida. The response rates of the concurrent (25 percent) and single-mode mail designs (26 percent) exceeded that of the single-mode web design (11 percent), though only 2 percent of the concurrent sample selected the web mode.
Downey et al. (2007) analyzed multiple single- and mixed-mode designs in a mandatory US survey of occupational injuries and illnesses, finding the highest response rate for a concurrent mail–web design (78.4 percent), followed by a sequential web-to-mail design (73.5 percent), and single-mode web designs with or without the explicit option to request a mailed questionnaire (71.1 percent and 71.3 percent, respectively). The web take-up rate was significantly lower in the concurrent design (21.7 percent) compared to the single-mode designs (with and without the explicit option to request a mailed questionnaire) and the sequential mode design (46.1 percent, 49.5 percent, and 47.2 percent, respectively). Similar results were found in a replication study conducted 1 year later. Millar et al. (2018) report the effects of transitioning from a single-mode mail design to a sequential web-to-mail design in the first and second waves of the US Emergency Medical Services for Children Program’s Performance Measures Survey. The response rate increased by 13.1 percentage points compared to the first wave. However, because of other adjustments made to the survey design (e.g., the contact strategy), the increase cannot be attributed solely to the web-first design. Haas et al. (2021) compared a single-mode web, a single-mode mail, and a concurrent mail–web mixed-mode design in a voluntary establishment survey in Germany. Although they experimented with two different questionnaire topics, the results clearly indicated that the single-mode web design yielded a lower response rate (6.2 percent and 5.6 percent) compared to the single-mode mail (13.9 percent and 11.7 percent) and concurrent designs (13.7 percent and 11.8 percent). These results provide the most recent evidence of the advantages to using mailed questionnaires in self-administered establishment surveys.
In summary, there is strong evidence that mixed-mode designs (concurrent and sequential) yield higher response rates compared to single-mode designs, and especially single-mode web, in self-administered establishment surveys, and that web take-up rates tend to be higher in sequential web-to-mail and single-mode web designs, relative to concurrent mail–web mixed-mode designs (Downey et al. 2007; Erikson 2007; Bremner 2011; Hardigan et al. 2012; Millar et al. 2018; Haas et al. 2021). These effects appear to be consistent across both voluntary and mandatory surveys. The empirical evidence is mixed with respect to whether a concurrent or sequential mixed-mode design yields higher response rates for establishment surveys, with studies finding either no substantial difference (Ellis et al. 2013) or slightly higher response rates in concurrent mixed-mode designs (Downey et al. 2007; Erikson 2007; Harris-Kojetin et al. 2013). What is lacking from the literature are more recent studies that experiment with the full range of self-administered mode designs, including single-mode and mixed-mode, on general establishment populations, and that also analyze the effects of these mode designs on nonresponse bias. The present study addresses these research gaps.
3. HYPOTHESES
Based on the above literature review and other theoretical considerations (e.g., Bavdaž 2010; Willimack and Nichols 2010), we derive several hypotheses regarding the impact of the mode designs on response rates, subgroup participation, and costs, described below.
3.1 Hypotheses on the Effects of Mode Design on Response Rates and Costs
As discussed in the survey literature (e.g., De Leeuw 2005, 2018), offering multiple modes either sequentially or concurrently can facilitate participation for establishments that may be unable or less willing to participate in a particular mode. Because the evidence in the establishment literature shows higher response rates for sequential and concurrent mixed-mode designs compared to single-mode designs, particularly web-only designs, we hypothesize that:
M1: A concurrent mail–web mixed-mode design leads to a higher response rate than a single-mode mail or web design.
M2: A sequential web-to-mail mixed-mode design leads to a higher response rate than a single-mode mail or web design.
The recent household survey literature is inconclusive about what type of mixed-mode design, concurrent or sequential, is optimal for maximizing response rates (e.g., De Leeuw 2018; Olson et al. 2019; Wolf et al. 2021). The establishment survey literature suggests, if anything, a slight advantage to concurrent mixed-mode designs (see section 2.3). In line with this literature, we hypothesize that a concurrent mail–web mixed-mode design increases the response rate compared to a sequential web-to-mail mixed-mode design:
M3: A concurrent mail–web mixed-mode design leads to a higher response rate than a sequential web-to-mail mixed-mode design.
The literature review indicated an increasing trend of establishments participating in web surveys compared to mail surveys (see section 2.1), which may reflect the advantages of online surveys described above. In keeping with this trend, we expect that establishments will respond at a higher rate in the single-mode web design compared to the single-mode mail design:
M4: A single-mode web design leads to a higher response rate than a single-mode mail design.
Because we pursue push-to-web approaches with the single-mode web and the sequential web-to-mail mixed-mode designs, we expect a higher web take-up rate in these designs compared to the concurrent mail–web mixed-mode design. This expectation is also based on the literature review (see section 2.3), which shows that web take-up rates are increased by using push-to-web strategies. Conversely, the same logic applies to the single-mode mail design, where we expect a higher mail take-up rate than in the concurrent mail–web mixed-mode design:
M5: The web take-up rate will be highest for the single-mode web design, followed by the sequential web-to-mail and concurrent mail–web mixed-mode designs.
M6: The mail take-up rate for the single-mode mail design will be higher than for the concurrent mail–web mixed-mode design.
Since web surveys have lower variable costs for postage, printing, and data entry than mail surveys, the higher the proportion of web respondents, the lower the per-respondent (variable) costs should be. In addition, increased response rates should have a positive impact on per-respondent costs. Our focus is on variable costs (rather than fixed costs), which leads us to the following hypothesis:
M7: The per-respondent costs are highest for the single-mode mail design, followed by the concurrent mail–web mixed-mode design, the sequential web-to-mail mixed-mode design, and the single-mode web design.
3.2 Hypotheses on the Effects of Mail and Web Modes on Survey Participation
Pertinent to research question 3 (RQ3), we test several hypotheses regarding which establishment subgroups are more likely to participate via web or mail modes. Here, we are interested in looking at response mode preferences in single-mode and concurrent mixed-mode design presentations. This analysis can inform survey practitioners what to expect if they move from a concurrent mixed-mode design to a single-mode design (or vice versa) and wish to know whether establishments are more likely to use the same mode in the new design that they used in the previous design. Such knowledge is especially useful to inform survey budgets and the expected number of mail or web questionnaires that will need to be processed. We test the following hypotheses using only the two single-mode designs and the concurrent mail–web mixed-mode design, as they allow us to cleanly study response mode preferences for different establishment subgroups. We do not analyze the sequential mixed-mode design here because response mode preferences are confounded by selection effects due to the sequential deployment of mail as a nonresponse follow-up mode.
3.2.1 Establishment size
The literature finds that larger establishments are more likely to respond via web than mail (e.g., Kaiser 2001; Dickey and Riberas 2007; Jones and Phipps 2010; Thompson et al. 2015). This finding could be driven by at least two factors. First, larger establishments might have more trouble routing the paper questionnaire to the responsible person(s) (Haraldsen et al. 2011). This may have been made even more difficult due to the pandemic with a larger share of employees working from home. In contrast, login information for a web survey can be shared quite easily via email. Furthermore, some establishment surveys (e.g., those run via the respondent portal by the U.S. Census Bureau (2022a)) also have built-in delegation functions that facilitate collaboration among different employees within establishments, though this is not yet the case with the IAB-JVS. Second, establishment size is likely correlated with the quality of IT infrastructure and PC skills of employees (OECD 2023). In extreme cases, very small establishments might run their business operations completely without a computer. In short, we expect larger establishments to be more likely to participate via web compared to their smaller counterparts:
H1: Larger establishments are more likely to (a) participate in a single-mode web than single-mode mail design, and (b) participate via web than mail in a concurrent mail–web mixed-mode design.
3.2.2 Industry
Establishments in some industries might differ in their likelihood to participate in mail or web surveys depending on their employees’ level of interaction with computers and the Internet in their daily work. For example, establishments in the agricultural and construction industries may use computers less in their daily work and therefore be less familiar with web applications compared to establishments in the information/communication and finance/insurance industries. These latter establishments are more likely to have highly developed IT infrastructure systems and benefit from the advantages of web surveys, such as easy transfer from digital documents to the online questionnaire, compared to those in the agricultural and construction industries (see also Kaiser 2001; Dickey and Riberas 2007). Another industry that might have a higher likelihood of participation via the mail mode is public administration. As the visual design of the IAB-JVS questionnaire mimics official forms, which are often still paper-based in Germany, establishments in the public administration industry should be familiar with this kind of questionnaire and data requirements. For this reason, a mail survey could increase the likelihood of participation compared to a web survey. All of these considerations lead us to the following hypotheses:
H2: Establishments in the agricultural industry are less likely to (a) participate in a single-mode web than single-mode mail design, and (b) participate via web than mail in a concurrent mail–web mixed-mode design.
H3: Establishments in the construction industry are less likely to (a) participate in a single-mode web than single-mode mail design, and (b) participate via web than mail in a concurrent mail–web mixed-mode design.
H4: Establishments in the public administration industry are less likely to (a) participate in a single-mode web than single-mode mail design, and (b) participate via web than mail in a concurrent mail–web mixed-mode design.
H5: Establishments in the information/communication industry are more likely to (a) participate in a single-mode web than single-mode mail design, and (b) participate via web than mail in a concurrent mail–web mixed-mode design.
H6: Establishments in the finance/insurance industry are more likely to (a) participate in a single-mode web than single-mode mail design, and (b) participate via web than mail in a concurrent mail–web mixed-mode design.
4. DATA AND METHODS
4.1 Data
4.1.1 IAB job vacancy survey
Since 1989, the IAB-JVS has collected data on labor demand and particularly job vacancies and recruiting processes in Germany (Bossler et al. 2020, 2022). It is a voluntary, annual, and cross-sectional establishment survey consisting of a stratified (by region, industry, and size) random sample drawn from the population of all establishments with at least one employee contributing to social security in Germany. The IAB-JVS forms the basis for regular reporting of survey estimates on job vacancies to Eurostat, which compiles European-wide vacancy statistics. The data used in this study are available from the Research Data Centre of the Federal Employment Agency in Germany. Restrictions apply to the availability of these data, which are not publicly available. For more information on data access, see https://fdz.iab.de/en.aspx.
The IAB-JVS started as a single-mode mail survey and has been conducted in the fourth quarter of every year since 1990. A concurrent web survey option was introduced in 2002 and included with the mailed questionnaire. Since then, establishments have been able to choose either mode of participation. In 2006, short follow-up telephone surveys were introduced in the three following quarters to update the number of job vacancies. However, this study focuses exclusively on the fourth quarter survey. Establishments in the IAB-JVS receive up to two mailings. The first mailing includes a package with the invitation letter, paper questionnaire, login information to the web survey, notice of the participation deadline (ca. October 31), prepaid return envelope, and an additional document with survey instructions and item definitions. This first mailing is sent at the end of September each year. The second mailing is designed as a post-due-date reminder for nonresponding establishments sent after the survey deadline containing the same package of materials (with adjustments to wording and deadline) as the first mailing and is sent in mid-November. The post-due-date reminder gives a new deadline of December 23. Thus, strictly speaking, the IAB-JVS uses a concurrent mail–web mixed-mode design with one invitation letter and one post-due-date reminder.
4.1.2 Experimental design
To examine the effects of using alternative mode designs, an experiment was conducted in the fourth quarter of the 2020 IAB-JVS. The full sample consisted of 132,433 establishments, which were randomly assigned to four experimental groups taking into account collapsed establishment size and industry classes (see Supplement C in the supplementary data online for an overview of summary statistics for each experimental group). Figure 1 depicts the experimental design. The field period started on September 26 and officially ended on January 6, 2021 (65 questionnaires were received after January 6, but these are treated as nonrespondents in the analysis). The post-due-date reminder for all nonresponding establishments, regardless of mode design, was sent on November 16, 2020.

The first experimental group (i.e., control group) was conducted using the standard IAB-JVS mode design, that is, a concurrent mail–web mixed-mode design with paper questionnaire and login information to the web questionnaire included in both contact attempts (invitation and post-due-date reminder). The majority of establishments (N = 109,924) were allocated to this group.
The other three experimental groups consisted of a sequential web-to-mail mixed-mode design, a single-mode web design, and a single-mode mail design. For the sequential web-to-mail design, the invitation letter referred to the web survey, whereas the post-due-date reminder offered establishments the additional option of responding via the enclosed mailed questionnaire. For the single-mode mail design, both the invitation and post-due-date reminder contained a printed version of the questionnaire without the possibility of web completion. Except for the experimental manipulations, all field procedures were administered identically in all groups.
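The allocation step described above (random assignment to experimental groups within strata defined by collapsed size and industry classes) can be sketched as follows. This is a minimal stdlib illustration under simplifying assumptions: it allocates equal group shares within each stratum, whereas the actual IAB-JVS allocation was unequal (most establishments were assigned to the control group), and the stratum labels and group names are hypothetical.

```python
import random
from collections import defaultdict

def assign_groups(units, strata, groups, seed=2020):
    """Randomly assign units to experimental groups within each
    stratum (e.g., a collapsed size x industry class), so group
    shares are balanced across strata. Equal allocation is a
    simplification of the actual unequal IAB-JVS allocation."""
    rng = random.Random(seed)
    by_stratum = defaultdict(list)
    for unit, stratum in zip(units, strata):
        by_stratum[stratum].append(unit)

    assignment = {}
    for members in by_stratum.values():
        rng.shuffle(members)  # random order within the stratum
        for i, unit in enumerate(members):
            # cycle through the group labels within the shuffled stratum
            assignment[unit] = groups[i % len(groups)]
    return assignment
```

With stratum sizes divisible by the number of groups, every design receives the same share of each stratum, which is what makes the groups comparable on size and industry by construction.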
4.1.3 Establishment history panel
The forthcoming analysis uses data from the Establishment History Panel (BHP), an administrative database of all establishments in Germany, to investigate (proxy) nonresponse bias and correlates of survey participation (Ganzer et al. 2022). The BHP is an annual and cross-sectional aggregation of employee records to the reference date (June 30). As we use the 2020 BHP, we have administrative data for respondents and nonrespondents about one quarter before the IAB-JVS field period started. The response indicator from the IAB-JVS can be linked to the BHP via a unique establishment identifier for 127,338 establishments. Because of bankruptcies, changes in ownership or legal form, or mergers and splits, the linkage was unsuccessful in 3.85 percent of cases. These observations are not included in the analysis of nonresponse bias (RQ2) and survey participation (RQ3) but are part of the response rate (RQ1) and cost (RQ4) analyses.
The BHP contains variables regarding basic establishment characteristics, including establishment size, industry, region, and various (aggregate) employee characteristics, such as the average age of employees, the share of fixed-term employees, and the average wage of employees. Since these variables are likely to be correlated with several IAB-JVS survey variables, such as the number of newly hired employees, number of vacancies, and number of hirings with fixed-term contracts, nonresponse bias in the BHP variables can serve as a reasonable proxy for nonresponse bias in the survey variables. This strategy of estimating proxy nonresponse biases using administrative data has been applied in many other methodological studies (e.g., Eckman and Haas 2017; Kreuter et al. 2010; Küfner et al. 2022). All BHP variables used are categorized into roughly equal-sized groups, with the exception of binary variables. The variables do not have any missing values. Supplements B and C in the supplementary data online provide an overview of the BHP variables and summary statistics for each administrative variable by mode design group.
4.2 Methods
4.2.1 Response rates
To address the first research question (RQ1), we report and compare the response rate of each mode design. In this analysis, a respondent is defined as an establishment that answers at least two essential Eurostat questions on the number of employees and job vacancies and submits their answers by clicking on the “submit” button in the web survey or by mailing the paper questionnaire back to the survey institute. Both essential questions are placed at the beginning of the questionnaire (see Supplement A.2 in the supplementary data online for the exact wording of these questions). Response rates are computed using the AAPOR RR1 definition (American Association for Public Opinion Research 2016), which is simply the number of respondents divided by the full sample (see Supplement D.1 in the supplementary data online for the corresponding formula).
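The weighted RR1 computation can be sketched in a few lines. This is a minimal illustration only; the column names and toy data below are invented for the example, not the IAB-JVS variables:

```python
import pandas as pd

def rr1(frame: pd.DataFrame, resp_col: str = "respondent",
        w_col: str = "design_weight") -> float:
    """AAPOR RR1: weighted number of respondents divided by the weighted full sample."""
    return (frame[resp_col] * frame[w_col]).sum() / frame[w_col].sum()

# Toy sample: two respondents out of five units, one with a larger design weight.
sample = pd.DataFrame({
    "respondent":    [1, 0, 0, 1, 0],
    "design_weight": [2.0, 1.0, 1.0, 1.0, 1.0],
})
print(round(rr1(sample), 3))  # (2 + 1) / 6 = 0.5
```

The unweighted rate follows from setting all weights to one; the design-weighted version is what figure 2 reports.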
4.2.2 Nonresponse bias
The average absolute nonresponse bias measure is computed separately across all statistics of interest for the establishment characteristic variables, the (aggregate) employee characteristic variables, and across all BHP variables (see also table S3 in the supplementary data online for an overview). To prevent the lower sampling variance of the concurrent mail–web mixed-mode design, which stems from its disproportionately larger sample size, from influencing the average absolute bias results, we use a repeated downsampling approach that analyzes this design with the same sample size as the other mode designs (around 7,500). The resulting resamples are also used to compute confidence intervals for the average absolute nonresponse bias estimates in this design. To estimate confidence intervals for the other mode designs, we use bootstrapped standard errors based on 500 replicates and a normal approximation. For both the downsampling and the bootstrapping approach, we account for industry and establishment size as strata. We compute only overall estimates of average nonresponse bias for each mode design group, not mode-specific estimates for the individual modes used in the mixed-mode designs. As a sensitivity check, we also compute the median absolute nonresponse bias for the mode design groups.
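The bias measure and its stratified bootstrap confidence interval can be sketched as follows. This is a simplified numpy/pandas illustration, not the authors' code (the study's computations were done in Stata); the variable, weight, and stratum names are invented, and binary variables are assumed so that biases are differences in weighted shares:

```python
import numpy as np
import pandas as pd

def avg_abs_bias(df, variables, resp="respondent", w="design_weight"):
    """Mean over variables of |weighted respondent share - weighted full-sample share|."""
    r = df[df[resp] == 1]
    biases = [abs(np.average(r[v], weights=r[w]) - np.average(df[v], weights=df[w]))
              for v in variables]
    return float(np.mean(biases))

def stratified_bootstrap_ci(df, variables, strata="stratum", reps=500, seed=1):
    """Normal-approximation 95% CI from bootstrap resamples drawn within strata."""
    rng = np.random.default_rng(seed)
    stats = [
        avg_abs_bias(
            df.groupby(strata, group_keys=False).apply(
                lambda g: g.sample(len(g), replace=True, random_state=rng)),
            variables)
        for _ in range(reps)
    ]
    point, se = avg_abs_bias(df, variables), float(np.std(stats, ddof=1))
    return point, (point - 1.96 * se, point + 1.96 * se)

# Toy data: 'small' flags establishments with fewer than 10 employees.
demo = pd.DataFrame({
    "respondent":    [1, 0] * 20,
    "design_weight": [1.0] * 40,
    "small":         [1.0, 1.0, 1.0, 0.0] * 10,
    "stratum":       ["a"] * 20 + ["b"] * 20,
})
print(round(avg_abs_bias(demo, ["small"]), 3))  # 0.25
```

In the actual analysis the resampling additionally downsamples the concurrent design to the size of the other groups before the bias is computed; the sketch covers only the bootstrap step.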
4.2.3 Modeling survey participation
We model subgroup participation in multiple ways. First, we use logistic regression to model the likelihood of participation by establishment subgroup (size and industry) for each of the four mode designs. This analysis shows whether there are subgroup differences in overall response propensities between the different mode designs. For the mixed-mode designs, we make no distinction between web and mail participation and consider only the overall response propensity of each design. No hypotheses are tested with this analysis.
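As an illustration of this modeling step, the sketch below fits a logistic participation model by Newton-Raphson on simulated data and derives predicted probabilities per size class. It is self-contained for clarity, not the study's implementation: the actual analysis was run in Stata 17 with design weights and linearized standard errors, which this sketch omits, and the size classes and coefficients here are made up:

```python
import numpy as np

def fit_logit(X, y, iters=25):
    """Minimal logistic regression via Newton-Raphson; X includes an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (y - p)                               # score vector
        hess = X.T @ (X * (p * (1.0 - p))[:, None])        # observed information
        beta += np.linalg.solve(hess, grad)
    return beta

def predict_prob(x, beta):
    return 1.0 / (1.0 + np.exp(-x @ beta))

# Simulated sample with dummy-coded size classes (reference: 1-9 employees).
rng = np.random.default_rng(0)
n = 500
size = rng.integers(0, 4, n)  # 0 = 1-9, 1 = 10-19, 2 = 20-49, 3 = 50+
X = np.column_stack([np.ones(n)] + [(size == k).astype(float) for k in (1, 2, 3)])
y = (rng.random(n) < predict_prob(X, np.array([-1.8, 0.2, 0.4, 0.6]))).astype(float)

beta_hat = fit_logit(X, y)
# Predicted participation probability for each size class.
probs = [float(predict_prob(np.array([1.0, k == 1, k == 2, k == 3], dtype=float),
                            beta_hat))
         for k in range(4)]
```

The multinomial model used later for the concurrent design extends the same idea to three outcomes (web, mail, nonresponse).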
In both models, we include categorized foundation year as a control variable to account for the tenure of the establishment.
Design weights are incorporated into the analysis of response rates, nonresponse bias, and survey participation to account for unequal probabilities of selection. We also account for stratification when estimating linearized standard errors for the response rate comparison and survey participation models. All computations were conducted in Stata 17 (StataCorp 2021).
4.2.4 Survey costs
To assess the impact of mode designs on survey costs (RQ4), we consider the following variable costs: postage, printing of invitation and reminder letters and paper questionnaires, envelopes, and data entry. Since exact costs are not available or cannot be published due to contractual regulations, the analysis is based on assumed costs derived from consultations with the survey institute, online research, and experiences from other surveys. Variable costs account for only a portion of each design's total costs and are typically higher for mail modes. Costs related to survey management, set-up, and data processing are not included (see table S54 in the supplementary data online for a list of mode-related fixed costs). Nevertheless, variable costs are an important component of survey costs and provide some indication of the cost-effectiveness of the different mode designs. In the forthcoming analysis, we report the per-respondent costs for each experimental group.
5. RESULTS
5.1 Response Rates
Figure 2 presents the design-weighted response rates for each experimental group. A tabular version with absolute numbers and unweighted response rates is shown in Supplement D in the supplementary data online. Comparing the mode designs, there is neither a substantial nor a statistically significant difference between the response rates of the concurrent mail–web mixed-mode design (15.0 percent), the sequential web-to-mail mixed-mode design (14.5 percent), the single-mode web design (14.6 percent), and the single-mode mail design (13.5 percent). Hence, there is no support for hypotheses M1, M2, M3, and M4. Both the response rate of the single-mode web design (14.6 percent) and the web take-up rate in the sequential web-to-mail mixed-mode design (7.9 percent) are higher than the web take-up rate of the concurrent mixed-mode design (4.9 percent), which lends further support to the performance of “push-to-web” strategies in establishment surveys and supports M5. In line with M6, the response rate of the single-mode mail design (13.5 percent) exceeds the mail take-up rate of the concurrent mixed-mode design (10.1 percent).

Figure 2. Response Rate (weighted) and 95 percent Confidence Interval, by Mode Design. Source: IAB-JVS 2020.
5.2 Nonresponse Bias
Figure 3 shows the average absolute nonresponse bias across all BHP administrative variables. Tables and figures for the average absolute nonresponse bias and the median absolute nonresponse bias are shown separately for the establishment and employee characteristic variable groups in Supplement E in the supplementary data online. Moreover, nonresponse biases for individual variables are presented in Supplement E.3 in the supplementary data online.

Figure 3. Average Absolute Nonresponse Bias Estimates and 95 percent Confidence Intervals, by Mode Design, for All BHP Administrative Variables.
Overall, the results show rather low levels of aggregate nonresponse bias. The average absolute nonresponse bias across all BHP administrative variables is less than 3 percent for all mode designs. Further, there are no statistically significant differences between the mode designs. Similarly, there are no substantial differences in average absolute bias between the mode designs with respect to the establishment characteristic variables (e.g., size, industry, region) or the (aggregate) employee characteristic variables (e.g., average age of employees, share of female employees) when both variable groups are examined separately. Thus, the different mode designs yield respondents that are generally comparable with respect to establishment and workforce characteristics. Similar conclusions hold for the median absolute nonresponse bias results.
For individual variables, a few notable biases can be observed. The nonresponse biases for the establishment size categories are particularly large in the single-mode mail design, with the largest bias occurring for the smallest size group: establishments with fewer than 10 employees are overrepresented by 7.25 percentage points, more than in any other mode design group. The single-mode web (–1.79 percentage points) and single-mode mail (–2.08 percentage points) designs show smaller negative biases for the service industry than the sequential web-to-mail (–7.94) and concurrent mail–web mixed-mode (–5.32) designs, indicating that the service industry is more accurately represented by respondents in the single-mode designs. We also observe a strong and significant nonresponse bias for establishments founded after 2010 in the mail-only group (–9.12 percentage points), meaning that these younger establishments are underrepresented in the respondent pool. Moreover, participating establishments with the highest proportion of highly educated employees are overrepresented by 6.58 percentage points in the sequential web-to-mail mixed-mode design.
In summary, the results do not show strong differences between the mode designs with respect to aggregate nonresponse bias across all administrative variables. The biases of individual variables show only a few meaningful differences between the mode designs; for instance, the single-mode mail design overrepresents the smallest establishments and underrepresents the youngest establishments to a greater extent than the other mode designs.
5.3 Predictors of Survey Participation by Mode
Figure 4 shows the predicted probabilities of survey participation for each of the four mode designs by establishment characteristic: number of employees (panel a) and industry (panel b), based on a logistic regression model of survey participation. We do not observe any substantial differences in response propensities between the mode designs with respect to these characteristics, suggesting a similar pattern of survey participation regardless of which survey design is used.

Figure 4. Predicted Probabilities and 95 percent Confidence Intervals of Participation, by Mode Design. Source: IAB-JVS 2020.
To test our mode-specific hypotheses, we take a closer look at the single-mode web and mail designs. Figure 5 shows the predicted probabilities of survey participation in the single-mode web and mail designs for the establishment characteristics, based on the logistic regression model of survey participation. With respect to establishment size, larger establishments (except for the largest establishments with more than 250 employees) are more likely than smaller establishments to participate in the web survey relative to the mail survey. This difference is statistically significant and thus supports H1a. The agriculture, construction, public administration, information/communication, and finance/insurance industries show no reliable relationship with either mode, yielding no support for H2a, H3a, H4a, H5a, and H6a, respectively.

Figure 5. Predicted Probabilities and 95 percent Confidence Intervals of Survey Participation in Single-Mode Web and Single-Mode Mail Designs, by Establishment Characteristics. Source: IAB-JVS 2020. Notes: The additional column shows the difference between the predicted probability of mail and web participation and the corresponding result of a Wald test. Significance Levels: ***p < 0.001; **p < 0.01; *p < 0.05.
Figure 6 shows the predicted probabilities and confidence intervals from the multinomial regression model of participation in the web and mail modes of the concurrent mixed-mode design, with nonresponse as the reference category. The corresponding regression tables are provided in Supplement F in the supplementary data online. The results show that the predicted probabilities for mail participation are higher than those for web participation across the establishment characteristics (except for the information/communication industry). For instance, larger establishments are more likely to participate by mail than by web in the concurrent mixed-mode design, which contradicts H1b. However, the difference between the predicted probabilities of web and mail participation shrinks with increasing establishment size, from 6.0 percentage points (mail: 11.0 percent; web: 5.0 percent) for the smallest establishments to 0.7 percentage points (mail: 2.8 percent; web: 2.1 percent) for the largest establishments. This implies that smaller establishments are more likely to participate by mail, while larger establishments are almost equally likely to participate by web or mail.

Figure 6. Predicted Probabilities and 95 percent Confidence Intervals of Survey Participation by Mail and Web in the Concurrent Mail–Web Mixed-Mode Design, by Establishment Characteristics. Source: IAB-JVS 2020. Notes: The additional column shows the difference between the predicted probability of mail and web participation and the corresponding result of a Wald test. Significance Levels: ***p < 0.001; **p < 0.01; *p < 0.05.
With respect to industry participation, establishments in the agricultural and construction industries have a higher probability of participating by mail than by web, yielding support for H2b and H3b, respectively. There is no statistically significant difference in the predicted probabilities of web and mail participation for establishments in the public administration, information/communication, and the finance/insurance industries. Hence, there is no support for hypotheses H4b, H5b, and H6b. However, it is interesting that these three industries are the only ones where the predicted probabilities of mail participation are not significantly higher than web participation. This could be a sign that these industries are more open to choosing the web mode compared to other industries.
5.4 Respondent Composition by Mode and Mode Design
As a side analysis, table 1 shows the composition of respondents by mode of participation within each mode design. Here, we can explore whether the compositions of web and mail respondents differ from each other conditional on the mode design and whether that depends on whether two modes are offered or just one. We point out a few notable patterns and speculate on their causes. For the establishment size subgroups, the smallest establishments are overrepresented among mail respondents in all three designs that offer a mail mode, indicating that the mail mode brings in more of these respondents. Conversely, the web mode tends to bring in larger establishments at a greater rate than mail.
| | Mail–web (all) | Mail–web (web) | Mail–web (mail) | Web-to-mail (all) | Web-to-mail (web) | Web-to-mail (mail) | Single-mode (web) | Single-mode (mail) | Full sample |
|---|---|---|---|---|---|---|---|---|---|
Number of employees | |||||||||
1–9 | 72.9 (71.8, 73.9) | 69.7 (68.3, 71.2) | 74.4 (73.0, 75.7) | 73.6 (70.7, 76.3) | 69.3 (66.7, 71.8) | 78.9 (72.5, 84.2) | 72.1 (68.8, 75.1) | 77.0 (73.3, 80.3) | 69.9 (69.0, 70.8) |
10–19 | 14.9 (13.8, 15.9) | 14.5 (13.2, 16.1) | 15.0 (13.7, 16.4) | 12.9 (10.2, 16.1) | 12.9 (10.5, 15.8) | 12.8 (8.0, 19.9) | 13.5 (10.6, 17.1) | 12.3 (9.1, 16.3) | 14.9 (14.2, 15.6) |
20–49 | 8.2 (7.9, 8.4) | 10.1 (9.5, 10.6) | 7.2 (6.9, 7.6) | 8.5 (7.7, 9.5) | 11.5 (10.0, 13.1) | 4.9 (3.7, 6.4) | 10.2 (9.1, 11.3) | 7.4 (6.6, 8.2) | 9.3 (9.0, 9.7) |
50+ | 4.1 (4.0, 4.2) | 5.7 (5.5, 5.8) | 3.4 (3.3, 3.5) | 5.0 (4.6, 5.5) | 6.3 (5.6, 7.1) | 3.4 (2.6, 4.3) | 4.2 (3.9, 4.6) | 3.3 (3.0, 3.7) | 5.9 (5.7, 6.1) |
Industry | |||||||||
Agriculture/forestry | 3.5 (3.5, 3.5) | 3.1 (3.1, 3.1) | 3.7 (3.7, 3.7) | 2.8 (2.6, 3.0) | 1.9 (1.5, 2.4) | 3.8 (3.7, 4.0) | 2.2 (2.2, 2.3) | 4.6 (4.6, 4.6) | 2.7 (2.4, 3.0) |
Mining/energy/waste | 0.7 (0.7, 0.7) | 0.9 (0.9, 0.9) | 0.6 (0.6, 0.6) | 0.6 (0.5, 0.6) | 0.7 (0.6, 0.8) | 0.4 (0.4, 0.5) | 0.8 (0.7, 0.8) | 0.6 (0.6, 0.7) | 0.9 (0.8, 0.9) |
Manufacturing | 8.5 (8.4, 8.5) | 9.2 (9.0, 9.3) | 8.1 (8.1, 8.2) | 9.0 (9.0, 9.1) | 9.7 (9.3, 10.2) | 8.1 (7.8, 8.5) | 9.1 (8.9, 9.2) | 8.1 (8.0, 8.2) | 8.2 (7.9, 8.5) |
Construction | 12.7 (12.5, 12.8) | 11.0 (10.9, 11.1) | 13.5 (13.3, 13.7) | 11.9 (11.9, 12.0) | 14.7 (14.1, 15.3) | 8.5 (6.8, 10.5) | 11.8 (11.8, 11.9) | 11.6 (11.6, 11.7) | 11.0 (10.3, 11.8) |
Trade/car-repair | 15.0 (15.0, 15.0) | 14.3 (14.2, 14.4) | 15.3 (15.3, 15.4) | 11.2 (11.2, 11.2) | 5.2 (3.1, 8.9) | 18.7 (17.8, 19.6) | 16.7 (16.7, 16.8) | 14.0 (13.9, 14.0) | 18.9 (17.8, 20.0) |
Transportation/storage | 3.6 (3.6, 3.6) | 3.4 (3.4, 3.4) | 3.6 (3.6, 3.7) | 4.9 (4.9, 4.9) | 5.3 (4.8, 5.8) | 4.5 (4.3, 4.7) | 3.9 (3.9, 3.9) | 5.2 (5.2, 5.3) | 3.8 (3.5, 4.0) |
Information/communication | 2.5 (2.5, 2.6) | 4.2 (4.2, 4.2) | 1.7 (1.7, 1.7) | 2.6 (2.6, 2.6) | 2.7 (2.5, 2.8) | 2.5 (2.4, 2.6) | 2.0 (2.0, 2.0) | 2.2 (2.2, 2.2) | 3.1 (2.9, 3.3) |
Finance/insurance | 2.5 (2.5, 2.5) | 3.2 (3.2, 3.2) | 2.1 (2.1, 2.1) | 1.4 (1.4, 1.4) | 1.6 (1.5, 1.6) | 1.2 (0.4, 3.2) | 1.0 (0.9, 1.3) | 3.3 (3.2, 3.3) | 3.0 (2.8, 3.2) |
Business-related services | 20.1 (20.0, 20.3) | 21.8 (21.8, 21.9) | 19.3 (19.1, 19.6) | 21.9 (21.9, 22.0) | 29.5 (28.4, 30.7) | 12.4 (10.8, 14.1) | 22.0 (21.9, 22.1) | 21.9 (21.7, 22.0) | 18.7 (17.9, 19.5) |
Other services | 29.4 (29.4, 29.5) | 26.5 (26.4, 26.5) | 30.9 (30.8, 31.0) | 32.3 (32.2, 32.4) | 27.3 (25.3, 29.5) | 38.5 (36.9, 40.1) | 29.0 (28.9, 29.2) | 27.3 (27.1, 27.5) | 28.4 (27.5, 29.4) |
Public administration | 1.4 (1.4, 1.5) | 2.3 (2.3, 2.3) | 1.0 (1.0, 1.0) | 1.4 (1.3, 1.4) | 1.3 (1.0, 1.7) | 1.5 (1.2, 1.8) | 1.4 (1.4, 1.5) | 1.1 (1.1, 1.2) | 1.4 (1.3, 1.5) |
Source.—IAB-JVS 2020.
Regarding the industry subgroups, the mail mode brings in a higher percentage of construction sector respondents than web in the concurrent mixed-mode design, but the opposite is true in the sequential mixed-mode design, suggesting that this subgroup prefers mail when given a choice but will otherwise participate in whichever mode is offered first. This is also reflected in the two single-mode designs, which bring in similar shares of construction respondents. The opposite pattern is seen for the information/communication sector: the concurrent design brings in more web than mail respondents, while the sequential and single-mode designs bring in similar shares of respondents in both modes.
Interestingly, the mail and web modes bring in relatively similar proportions of trade/car-repair respondents in the single-mode and concurrent designs, but this group is strongly underrepresented in the web mode of the sequential mixed-mode design. This may be driven by late respondents, who are automatically allocated to the mail nonresponse follow-up phase, rather than a strong reluctance to participate in the web mode. A similar pattern occurs for the business-related services sector, where there is a strong overrepresentation of web respondents in the sequential design, but comparable representation between both modes in the other designs. In contrast to the previous example, this could be driven by an early respondent effect, where these respondents are highly cooperative and answer promptly regardless of which mode is offered. However, these are only speculations as it is difficult to draw definitive conclusions between the sequential and non-sequential designs due to the additional selection step in the former.
5.5 Survey Costs
Table 2 summarizes the analyzed costs per mode design, and table S55 in the supplementary data online provides more detail on the specific costs associated with each design. The analysis shows that the single-mode mail design leads to the highest costs per respondent (35.20 €), which include the costs of each contact attempt and data entry. It is followed by the concurrent mail–web mixed-mode design (29.69 €), which includes costs for postage and printing but lower data entry costs. Due to the higher number of web respondents and less expensive mailings, the sequential web-to-mail mixed-mode (21.55 €) and single-mode web (13.93 €) designs are the least expensive. These results support hypothesis M7 and highlight the potential cost savings of switching to a push-to-web design in establishment surveys.
| | Mail–web | Web-to-mail | Web-only | Mail-only |
|---|---|---|---|---|
Invitation | 192,367.00 € | 6,608.80 € | 6,593.84 € | 13,135.50 € |
Post-due-date reminder | 174,217.15 € | 12,181.75 € | 6,123.92 € | 11,877.25 € |
Re-postage mail | 12,389.15 € | 492.90 € | 0.00 € | 1,124.50 € |
Data entry | 15,816.11 € | 631.03 € | 0.00 € | 1,567.66 € |
Total costs | 394,835.01 € | 19,914.48 € | 12,717.76 € | 27,804.91 € |
Costs per sampled unit | 3.59 € | 2.65 € | 1.70 € | 3.70 € |
Costs per respondent unit | 29.69 € | 21.55 € | 13.93 € | 35.20 € |
Note.—Postage mail package: 1.55 €; postage web package: 0.80 €; print mail package: 0.20 €; print web package: 0.08 €; postage for return mail respondent: 1.55 €; data entry mail respondent: 1.98 €; data entry web respondent: 0.00 €.
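The per-respondent figure is simple arithmetic: total variable costs divided by the number of responding establishments. The sketch below uses the web-package unit prices from the table note (postage 0.80 €, printing 0.08 €) and two contact attempts; the sample and respondent counts are illustrative, not the study's:

```python
def per_respondent_cost(total_variable_costs: float, n_respondents: int) -> float:
    """Variable costs divided by the number of responding establishments."""
    return total_variable_costs / n_respondents

# Hypothetical web-only design: 7,500 sampled units, 2 contacts each, 1,000 respondents.
n_sample, n_resp, contacts = 7_500, 1_000, 2
total = n_sample * contacts * (0.80 + 0.08)  # postage + printing per web package
print(round(per_respondent_cost(total, n_resp), 2))  # 13.2
```

Per-sampled-unit costs follow analogously by dividing by the sample size, which is why designs with identical mailings can still differ sharply per respondent when their response rates diverge.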
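The relative cost differences discussed later in the article can be verified directly from the per-respondent figures in the table above. The following minimal sketch (plain Python; the euro amounts are taken from the table, all other names are illustrative) computes each design's saving relative to the concurrent mail–web baseline:

```python
# Per-respondent costs (in euros) as reported in the cost table above.
cost_per_respondent = {
    "mail-web (concurrent)": 29.69,
    "web-to-mail (sequential)": 21.55,
    "web-only": 13.93,
    "mail-only": 35.20,
}

BASELINE = cost_per_respondent["mail-web (concurrent)"]

def saving_vs_baseline(cost: float) -> float:
    """Percentage saving relative to the concurrent mail-web design.

    Negative values indicate higher costs than the baseline.
    """
    return (BASELINE - cost) / BASELINE * 100

for design, cost in cost_per_respondent.items():
    print(f"{design}: {saving_vs_baseline(cost):+.1f}%")
```

Running this reproduces the headline figures: web-only saves roughly 53 percent and web-to-mail roughly 27 percent per respondent, while mail-only costs about 19 percent more than the concurrent design.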
6. QUALITATIVE INSIGHTS
To augment the quantitative results, qualitative interviews were conducted to understand how establishments view web and mail modes when deciding whether to participate in a voluntary survey. To this end, we conducted 46 short structured interviews and 12 semi-structured in-depth qualitative interviews with interviewees recruited from participants and non-participants of the 2020 fourth quarter mode design experiment discussed above. The short structured interviews aimed to gather information on establishment mode preferences and the perceived advantages and disadvantages of web, mail, and telephone modes. Selected establishments were balanced across establishment size, industry, region, and experimental groups (see table S56 in the supplementary data online for an overview of the sample characteristics). Human resources representatives and managers within the establishments responsible for responding to the IAB-JVS served as interviewees. These interviews were embedded in routine questionnaire pretests and carried out via telephone from February to May 2022 by trained interviewers.
The in-depth interviews were conducted to understand the impact of mode on response processes and the decision to participate. The sample, consisting of eight interviews with respondents and four with nonrespondents, was balanced in terms of establishment size, industry, region, and experimental group as outlined in table S56 in the supplementary data online. The interviews, conducted by the authors of this article between March and May 2022, lasted between 31 and 55 minutes. Using a semi-structured interview guide (see Supplement H.4 in the supplementary data online for the complete interview guidelines), sessions were held via video or telephone. To counteract potential recall issues concerning specific response decisions and processes from the IAB-JVS conducted 1.5 years earlier, we reintroduced the relevant mode design scenarios when needed. More methodological details are provided in Supplement H in the supplementary data online.
In both qualitative study arms, most respondents (26 of 46 short interviews, and 11 of 12 in-depth interviews) either preferred the web mode over the mail mode or were indifferent. Interestingly, four establishments that responded by mail in the mixed-mode groups stated a preference for web surveys in the qualitative interviews. A possible explanation for this paradoxical preference is that these establishments see the advantages of participating by web, but in a real-world situation, it is simply easy for them to grab a pen and fill in the paper questionnaire, or in the words of one interviewee: “When I have this questionnaire in front of me on paper, I tend to be the person who fills it out on paper. If I had received the questionnaire by e-mail via a link, I probably wouldn’t have printed it out and filled it in, but would have submitted it online.” (see table S59, Quote No. 1 in the supplementary data online).
We identified two channels through which establishments stated that the web or mail mode could influence their decision to participate. First, the appearance and length of a mailed questionnaire can affect the participation decision positively (e.g., formal and reputable) or negatively (e.g., overwhelming or too long). Length, in particular, was reported as having a negative effect on participation in mail surveys because it is more salient than in a web survey. As one interviewee put it: “Length is a deterrent, yes, of course. That means that if I have a twenty-page questionnaire somewhere, […] the will to drop out suddenly increases very exponentially.” (see table S57, Quote No. 1 in the supplementary data online). Second, respondents generally perceive responding to a web survey as less burdensome than responding to a mail survey (“It [the web questionnaire] would go faster and would be easier, easy in terms of effort”; see table S57, Quote No. 2 in the supplementary data online).
The most frequently cited advantage of web questionnaires was the flexibility to complete the questionnaire at one’s convenience. Flexibility was mentioned for both mail and web modes, but was more strongly associated with the web mode: “You can just organize it yourself. […] Online I can say: O.k. I’ll put that aside now and take it at 4:00 p.m. and work on it then.” (see table S57, Quote No. 3 in the supplementary data online). In both qualitative study arms, web questionnaires were reported to be faster to complete than mail questionnaires. Respondents provided three main reasons for this. First, web questionnaires do not require a cumbersome postal return: “I think the general willingness to participate is generally higher with an online survey, because you simply save yourself the trouble of sending it back and so on.” (see table S57, Quote No. 4 in the supplementary data online). Second, the internal routing of the mail questionnaire can be replaced with a brief email, and other kinds of cooperation are facilitated (e.g., screen-sharing): “But also—as I said—the internal back and forth, you’re quicker at it [with web questionnaires]. And with that, there is also acceptance [for the survey request], somehow. Because anything that requires less effort within the company has great advantages in terms of getting results.” (see table S57, Quote No. 5 in the supplementary data online).
Third, because using computers and web apps is part of many HR managers’ regular routine, the entire response process is regarded as quicker: “Yes, online just goes quickly. […] I log in, that’s what I do [working with web applications] most of the time, the threshold to participate there is relatively low.” (see table S57, Quote No. 6 in the supplementary data online). Moreover, the response process of a web survey is seen as easier, because answers can be corrected more readily and there are no worries about unreadable handwriting: “No, actually [online] is much better for me […] I always doubt that you can read my handwriting then.” (see table S57, Quote No. 7 in the supplementary data online). Another cited advantage of the web questionnaire is the facilitated use of internal documents, making it easier to search, copy, and paste from internal management systems. In addition, web surveys are seen as more modern and more sustainable than mail surveys. This is nicely summarized by the following statement: “Especially with the sustainability mindset that is overtaking us all, online is the most efficient, cost-effective, and easiest method.” (see table S57, Quote No. 8 in the supplementary data online).
Regarding the advantages of mail surveys, some respondents of the short interviews noted that having a paper questionnaire on the desk has a reminding effect and could thereby increase the likelihood of participating. Other reported advantages of mail surveys were that establishments get an easy overview of all questions before starting the answering process and could discuss the questionnaire more easily in a team meeting. Some respondents also noted the advantage of writing notes on a paper questionnaire and considering their answers based on those notes: “I’m actually also more of a haptic person. I have to be able to take notes all the time, assess questions and answers, and think things through.” (see table S58, Quote No. 1 in the supplementary data online). Establishments also appreciate that the mail questionnaire can be copied and filed in their records for future reference: “In addition, I can make a copy of it—and I often do this […]—and file it in our correspondence. […] I know that I can’t do that with an online survey.” (see table S58, Quote No. 2 in the supplementary data online). One cited disadvantage, common to both modes, is the burden of having to proactively contact the survey institute in the case of misunderstandings or ambiguities. Further cited advantages and disadvantages of the web and mail modes from the short interviews are summarized in table 3. Additional quotes from the in-depth interviews are provided in Supplement H.3 in the supplementary data online.
Web | Mail | |
---|---|---|---|
Advantages | Disadvantages | Advantages | Disadvantages
[Cell contents of table 3 (the lists of cited advantages and disadvantages) were not preserved in this version.]
Note.—Number of mentions in parentheses. Establishments were asked about their mode preference and the perceived advantages and disadvantages of web, mail, and telephone interviews. Telephone interviews are not the focus of this article and hence are not displayed here.
Source.—46 qualitative interviews, 2022.
7. DISCUSSION
This study evaluated the impact of an experiment comparing a concurrent mail–web mixed-mode, a sequential web-to-mail mixed-mode, a single-mode web, and a single-mode mail design on survey participation in a large-scale establishment survey. The findings are useful for survey practitioners seeking to maximize participation rates and minimize nonresponse bias and costs in voluntary establishment surveys. The main findings can be summarized as follows. The experiment did not reveal any substantial differences in response rates between the four mode designs. Similarly, the four mode designs did not show meaningful differences with respect to aggregate (proxy) nonresponse bias. However, there were a few differences in subgroup participation between the mode designs. In line with the literature (Kaiser 2001; Dickey and Riberas 2007; Jones and Phipps 2010; Thompson et al. 2015), larger establishments were more likely to participate in the single-mode web survey design than in the single-mode mail survey design. In the concurrent mixed-mode design, all establishment size classes were more likely to choose the mail mode, but the difference in the likelihood of participation between the web and mail modes was smallest for the largest establishments. We found establishments in the agriculture/forestry and construction industries to be more likely to participate via mail in the concurrent mail–web mixed-mode design, but this result did not appear in the single-mode design comparison. Against our expectations, there was no strong evidence that establishments in the information/communication, finance/insurance, and public administration industries have a stronger preference for participating via web than mail (or vice versa) in a concurrent mail–web mixed-mode or in a single-mode design.
Lastly, the cost analysis showed that push-to-web designs, such as a single-mode web or a sequential web-to-mail design, can achieve substantial per-respondent cost savings of more than 50 and 25 percent, respectively, compared to a concurrent mail–web mixed-mode design. A single-mode mail design, in contrast, resulted in 19 percent higher costs per respondent than the concurrent mail–web mixed-mode design. However, we note that these cost results account for variable costs only, which are typically more pronounced in mail modes than in web modes. Future research could examine whether cost savings can be achieved when fixed costs, including software and programming the web instrument, are considered.
The accompanying qualitative study provided additional insights into how establishments weigh the pros and cons of web and mail modes when deciding whether to participate in a voluntary survey. In general, the participants preferred web over mail. While most participants did not explicitly state that mode has an impact on their willingness to participate, they did repeatedly mention that web surveys require less effort to respond to than mail surveys. Other frequently cited advantages of the web mode included sustainability, modernity, and easier handling of the questionnaire, which were important factors for the establishments. The fact that participants qualitatively preferred the web mode contrasts with the quantitative findings, where the mail mode was chosen at a higher rate than web in the concurrent mixed-mode design. This conflicting finding could be driven by at least two factors. First, the qualitative interviews were conducted 1.5 years after the IAB-JVS mode design experiments and overlapped with the COVID-19 pandemic, when establishments were becoming more accustomed to working online. Thus, attitudes toward online surveys may have changed during this time, which may explain why the reported mode preference of some establishments differed from their actual mode choice. Second, the paper questionnaire was sent along with the invitation in the concurrent mixed-mode design. As one qualitative participant noted, it was perhaps easier for many establishments to simply fill out the paper questionnaire on the spot rather than expend the effort of logging into the website, or they possibly perceived the paper questionnaire to be the priority mode given that they were contacted by postal mail rather than by email.
This may also explain why there were similar discrepancies in the quantitative study, where some establishment subgroups were more likely to participate via mail than web in the concurrent design, but more likely to participate via web than mail in the single-mode designs. We suspect that had we reversed the delivery mode used in the concurrent design by emailing the survey invitation with a link to the web survey and attaching a PDF version of the questionnaire, which the establishment could optionally print and mail-back to the survey institute, this would have resulted in a higher web take-up rate compared to the mail mode and made the results more consistent across quantitative and qualitative studies. However, we can only speculate on the causes of these discrepancies and their remedies and suggest that they be investigated further in future work.
Forgoing mail questionnaires entirely and adopting a web-only mode design, as some National Statistical Institutes have done (see section 2.1), did not yield any negative impacts in our study. We attribute this result to several factors. First, the transformative impact of the COVID-19 pandemic on establishment practices, such as the prevalence of remote and hybrid working arrangements, has likely increased the burden associated with sharing paper questionnaires within establishments. Second, the pandemic has stimulated advances in IT infrastructure and skills among establishments and employees, potentially fostering a greater willingness among previously reluctant establishments to respond online. And lastly, ongoing improvements to web questionnaires (e.g., delegation functions) are reducing the perceived burden of responding online.
This study fills important gaps in the literature regarding the impact of self-administered mode designs on voluntary establishment survey participation. The lack of recent experimental evidence in this area is particularly notable given recent changes in Internet availability and usage by businesses (OECD 2023), but also increased work-from-home and flexible working patterns adopted by establishments in response to (and, in many cases, continuing beyond) the COVID-19 pandemic. Thus, we believe our findings are applicable to the post-pandemic situation, though further research is needed to confirm. Our study further adds to the literature by shining light on the effectiveness of various self-administered mode designs in a general population sample of establishments, which complements previous studies that have focused on more specific establishment populations (Erikson 2007; Downey et al. 2007; Bremner 2011; Harris-Kojetin et al. 2013; Ellis et al. 2013; Millar et al. 2018). An additional strength of the study is the comprehensive examination of several outcomes of interest: response rates, nonresponse bias, subgroup participation, and survey costs across different mode designs, which is rare in the establishment literature. The utilization of a mixed-methods approach combining quantitative experiments with qualitative interviews is another unique feature of this study, as it allowed for an examination of the pros and cons of using web and mail modes from the establishments’ perspective and thus shed light on the reasons for their participation decisions. Future methodological research might consider using a similar mixed-methods approach to analyze the drivers of establishment survey participation more generally and examine them from different perspectives.
In conclusion, the findings suggest using “push-to-web” designs for self-administered establishment surveys, implemented as either a single-mode web survey or a sequential web-to-mail mixed-mode survey. These designs can yield similar response rates and levels of nonresponse bias compared to concurrent mail–web mixed-mode designs, but with significantly lower per-respondent costs. This can also be viewed as an endorsement of the growing shift by researchers and statistical agencies toward using web surveys as a supplement to, or replacement for, mail surveys.
Supplementary Materials
Supplementary materials are available online at academic.oup.com/jssam.
We thank the “Data- and IT-Management” Department at the IAB for data provision and the colleagues from Pro-IAB for recruiting and conducting the qualitative interviews. We are grateful to Economix Research & Consulting for implementing the experiment. We thank Mojca Bavdaž, Mario Bossler, Nicole Gürtzgen, Alexander Kubis, Martin Popp, and Marieke Volkert for helpful comments and suggestions. We thank Franka Vetter for valuable research assistance. We are thankful for comments by participants at the 2021 European Survey Research Association Conference, the 2021 European Sociological Association Conference, the 2022 Federal Computer Assisted Survey Information Collection Workshop, the 2022 European Conference on Quality in Official Statistics, the 2022 Business Data Collection Methodology Workshop, the 2022 German Online Conference, the 2023 workshop of the Method Section of the German Sociological Association, and an internal IAB seminar. This study design and analysis was not preregistered.