Creating a Representative, Rigorous, Insightful Multinational Survey: From Conceptualization to Data Collection

Surveys are ubiquitous today, from political predictions to simple polls about the best time for a meeting. They’re quick, cheap, and easy. So why not just whip up a survey, send it out, and let the data flow in? Because when it comes to gathering rigorous, representative data that offers valuable insights—especially across multiple countries—things get far more complicated. In this blog post, I’ll summarize the 14-month journey of developing a multinational survey for the Consumer Demand for Circular Urban Living (CDCUL) project, an initiative funded by the European Commission’s Driving Urban Transitions programme.

The CDCUL project explores consumers’ willingness to pay for shared residential spaces like kitchens, offices, workshops, and even cargo bikes, with the ultimate goal of making the design and development of shared-access spaces easier for housing developers. Here’s how we built a survey that would go out to 3,000 people in Sweden, Slovenia, and the Netherlands. The process involved structured brainstorming, idea testing, survey assembly, survey testing, translation, and finalization. I detail each of these steps below. At the bottom, I offer a link to a demo survey so you can test it yourself.

Step 1: Structured Brainstorming (Oct 2023 – Feb 2024)

Our first step was to collect feedback from the entire CDCUL Consortium, which consists of researchers, architects, housing developers, and public housing authority representatives. We kicked off the process by completing a “variable feedback form” to help us think through key themes like the types of shared services (e.g., kitchens, workshops, bikes), housing variables (e.g., space size, type of housing), and sociodemographic questions (e.g., age, income).

Through multiple brainstorming sessions, we identified five services to test:

  1. Shared kitchen-and-dining space
  2. Shared office space
  3. Shared workshop (for activities like painting and repairing)
  4. Shared toolbox
  5. Shared cargo bicycle

We focused on services that were realistic yet uncommon, simple to describe, and would appeal to a broad demographic, while also supporting sustainable living.

Step 2: Idea Testing (Mar – Apr 2024)

With our list of shared services ready, it was time to test these ideas with real people. We conducted focus groups across all three countries to gauge participants’ reactions. Did they find these services appealing? Were there any “deal-breakers” or must-have features?

The responses were mixed but insightful:

  • The kitchen-and-dining space resonated with people who hosted family dinners, while the shared office space appealed to those in small apartments with children.
  • Concerns about maintenance were frequent, especially around the shared toolbox and workshop.
  • The management style (voluntary vs. professional) and accessibility (how close the shared space was to home) also emerged as key factors that could influence willingness to pay.

Based on this feedback, we selected the services to be included in the survey for each country. All countries would test the “spaces”: shared kitchens, shared office spaces, and shared workshops. In addition, Sweden would test the shared cargo bike, while Slovenia and the Netherlands would test the shared toolbox.

Step 3: Survey Assembly (Apr – Jun 2024)

Next came the drafting of the survey. We worked in English, the consortium’s operating language, and spent weeks going back and forth to get the phrasing just right. This included crafting a “welcome” page to explain how the choice experiment would work, writing data protection statements (with the help of our legal team), and designing visuals to accompany each shared service.

An interesting challenge was randomizing how choices were presented. We didn’t just randomize the options in each question but also randomized entire sections of the survey. This complexity took time to perfect but was essential for ensuring the survey’s statistical integrity.
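To make the idea concrete, here is a minimal sketch of how that two-level randomization might work. The section and option names are hypothetical placeholders, not the actual survey content, and this is an illustration of the general technique rather than our production setup:

```python
import random

# Hypothetical survey structure: one section per shared service,
# each with answer options whose order we also want to randomize.
survey_sections = {
    "shared_kitchen": ["Option A", "Option B", "Option C"],
    "shared_office": ["Option A", "Option B", "Option C"],
    "shared_workshop": ["Option A", "Option B", "Option C"],
}

def randomize_survey(sections, seed):
    """Return a per-respondent ordering of sections and their options.

    Seeding with a respondent ID makes each ordering reproducible,
    which helps when auditing or re-rendering a session.
    """
    rng = random.Random(seed)
    section_order = list(sections)
    rng.shuffle(section_order)  # randomize entire sections...
    return [
        # ...and, independently, the options within each section
        (name, rng.sample(sections[name], k=len(sections[name])))
        for name in section_order
    ]

# Each respondent (seed) sees a different, but reproducible, ordering.
print(randomize_survey(survey_sections, seed=42))
```

Randomizing at both levels helps cancel out order effects: without it, whichever service or option happens to appear first could systematically attract more attention.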

After assembling the first draft, we met in Maastricht in June for an in-person review. The group discussed everything from how to price these services to whether the descriptions were clear and engaging. This was a moment of both validation and realization—there was still a lot of work to be done!

Step 4: Survey Testing (Aug – Sep 2024)

Once the first draft of the survey was ready, it was time for testing. We began by inviting colleagues to take the survey, asking them to note any issues or confusion. This feedback helped us refine the language, tweak questions for clarity, and fix logistical problems.

One significant change came after hearing from test participants: We reworded how we described the pricing of shared services. Initially, we referred to services as “per use,” but feedback showed that “per hour” was clearer and easier to conceptualize.

With these adjustments, we tested the survey on a small sample of anonymous participants, which confirmed that it took around 10 minutes to complete—much shorter than the 20 minutes we had initially anticipated. The anonymous respondents also appeared to complete the entire survey and to answer in reasonable ways. Moving forward!

Step 5: Translation & Reverse Translation (Sep – Oct 2024)

Next, we translated the survey into Swedish, Slovenian, and Dutch. This process was fascinating because even small wording differences can change the meaning of a question. Each language version was translated by two native speakers, who then met to reconcile differences. We followed this with “reverse translation,” where the translated versions were translated back into English to ensure accuracy.

This process revealed a few minor inconsistencies, but after fixing them, we ended up with six finalized versions: three in the national languages and three corresponding English versions.
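A simple way to triage a reverse-translation pass is to compare each back-translated item against the original and flag pairs that diverge. The sketch below, with made-up item text, uses a rough character-level similarity score; it is only a screening aid, since flagged pairs still need a human to judge whether the meaning actually changed:

```python
from difflib import SequenceMatcher

# Hypothetical pairs of (original English item, back-translated English item).
items = [
    ("How much would you pay per hour to use a shared kitchen?",
     "How much would you pay per hour to use a shared kitchen?"),
    ("The workshop is managed by volunteers.",
     "Volunteers take care of the workshop."),
]

def flag_divergent_items(pairs, threshold=0.8):
    """Flag back-translated items that drift too far from the original.

    A low similarity ratio is only a rough signal; a reworded item can
    preserve meaning perfectly, so flagged pairs go to human review.
    """
    flagged = []
    for original, back_translated in pairs:
        ratio = SequenceMatcher(None, original.lower(), back_translated.lower()).ratio()
        if ratio < threshold:
            flagged.append((original, back_translated, round(ratio, 2)))
    return flagged

for original, back, ratio in flag_divergent_items(items):
    print(f"{ratio}: '{original}' vs '{back}'")
```

In this toy example, the identical pair passes untouched while the reworded pair is surfaced for review, which mirrors how our reverse-translation meetings focused discussion on the handful of items that had drifted.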

Step 6: Finalization (Nov – Dec 2024)

In the final stages, we did technical testing to make sure everything worked as expected across different devices like mobile phones, tablets, and computers. We also secured a data collection firm with access to a representative sample of participants in each country. After confirming that the survey was properly set up and tested, we launched data collection in mid-December 2024.

By early 2025, we had completed the survey deployment in all three countries and were ready to move on to analyzing the results.

Takeaway

Designing a multinational survey is no small feat. It involves months of collaboration, testing, and refining, with a focus on ensuring that the data collected is both representative and rigorous. This process took our team on a journey that included brainstorming, testing, assembly, translation, and fine-tuning—each stage crucial for the success of the final product. In reality, the process involved multiple small qualitative research projects embedded in a larger mixed-methods study.

If you’re curious about how we did it, or if you’d like to try the survey yourself, click here for the demo version!

Stay tuned for an upcoming post where we’ll dive into how we analyzed the data and what we learned about consumer demand for circular urban living.
