The rapid growth of global online fieldwork has provided a range of opportunities to researchers. Online fieldwork enables fast data collection, opens up access to hard-to-reach respondents and provides a cost-effective fieldwork solution.
However, this method of data collection also raises a number of ethical and technical issues.
ESOMAR’s guidelines on conducting market and opinion research using the internet are designed to provide advice on these issues. This document serves as an official response to ESOMAR’s “28 questions to help research buyers of online samples” and provides detailed insight into the operational integrity and panel management practices employed by Panelbase.
In order to ensure a credible and robust research panel, Panelbase has embraced these guidelines alongside strong data management principles, sound business ethics, and an overall integrity that underpins the development of its research community.
What experience does your company have with providing online samples for market research?
Panelbase is a division of Dipsticks Research Limited – a full-service research agency that has been operating since 1997. Since 2004, the company has provided online research services and conducted fieldwork in the UK, Europe and further afield. The Panelbase research community has evolved since 2004 and provided sample for thousands of online and offline projects (CATI, mobile surveys and focus groups).
Prior to 2008, Panelbase was used primarily as an internal resource supplying sample to each research division within Dipsticks Research Limited. Since 2008, and due to the rapid growth of Panelbase, we have provided sample to external clients, including research agencies, other panels, PR companies and end clients seeking highly targeted and responsive UK sample.
Please describe and explain the type(s) of online sample sources from which you get respondents. Are these databases? Actively managed research panels? Direct marketing lists? Social networks? Web intercept (also known as river) samples?
Panelbase sources respondents mainly from its actively managed panel. For projects requiring sample beyond the scope of Panelbase’s sampling capabilities we occasionally source sample from trusted partners who themselves only use proprietary actively managed panels. Such partnering is only ever done with the advance and explicit consent of our end clients. We also conduct many projects using client-supplied data lists for projects such as employee surveys or customer satisfaction surveys. We do not use river sample.
If you provide samples from more than one source: How are the different sample sources blended together to ensure validity? How can this be replicated over time to provide reliability? How do you deal with the possibility of duplication of respondents across sources?
Any blending of sources is done with client consent and to meet clearly defined quota specifications; the external source is acquired to fill a specific shortfall or niche requirement. Where a study comprises multiple waves or is longitudinal by design, sample planning at the start of the project maps our partners’ contributions to ensure consistent sample composition across all waves. We use proprietary scripts to identify and filter potential overlap between sources.
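In outline, a de-duplication pass of this kind might look like the following. This is a minimal sketch only: the salted-hash matching and the field names are illustrative assumptions, not a description of Panelbase’s actual scripts.

```python
import hashlib

def dedupe_sources(primary, partner):
    """Filter partner respondents who already appear in the primary panel.

    Both inputs are lists of dicts with an 'email' field (assumed here);
    matching is done on a salted hash so raw addresses need not be
    shared between sources.
    """
    salt = b"project-specific-salt"  # hypothetical per-project salt

    def key(rec):
        norm = rec["email"].strip().lower().encode()
        return hashlib.sha256(salt + norm).hexdigest()

    seen = {key(r) for r in primary}
    return [r for r in partner if key(r) not in seen]

primary = [{"email": "a@example.com"}, {"email": "b@example.com"}]
partner = [{"email": "B@example.com "}, {"email": "c@example.com"}]
print(dedupe_sources(primary, partner))  # only c@example.com survives
```

Because matching is done on normalised, hashed identifiers, the same member registered with differently cased or padded addresses is still caught as a duplicate.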
Are your sample source(s) used solely for market research? If not, what other purposes are they used for?
Panelbase only supplies sample for market research purposes.
How do you source groups that may be hard to reach on the internet?
We employ a broad range of recruitment techniques, including offline, to maximise the representation of hard-to-reach and minority groups. As our panel offers research opportunities both online and offline, ongoing engagement in research opportunities is not solely reliant on internet access. Our panellists also engage in telephone surveys, product testing, postal surveys and focus groups.
If, on a particular project, you need to supplement your sample(s) with sample(s) from other providers, how do you select those partners? Is it your policy to notify a client in advance when using a third party provider?
The size and responsiveness of the Panelbase membership means that we are almost self-sufficient when providing sample for online projects in the UK. In over 95% of cases we do not require sample contributions from external partners, which simplifies the research process and aids overall data integrity. On occasion, we call upon external partners to provide sample for international projects or extremely low incidence UK-based projects where the reach of Panelbase is insufficient to support a project’s requirements.

We have vetted many external partners over the years and have established solid relationships with carefully selected partners who are available to support our requirements. All of our preferred partners maintain the same levels of data integrity that we adopt and provide assurances of compliance with prevailing codes of conduct and law.

When designing projects with our clients, we always ensure maximum transparency regarding the use of external partners. Usually the requirement for external assistance will become evident at the feasibility assessment stage and would form part of our project proposal, so the client is aware of this before commissioning the project. In the majority of cases, partnering requirements stem from complex, deeply profiled or international sample being fundamental to the project objectives. In such instances, client expectations and project briefs often acknowledge a partnering requirement in order for successful delivery of the project. Where partners are engaged, all sample is de-duplicated to ensure multiple participation in the survey is not possible.
What steps do you take to achieve a representative sample of the target population?
Sample selections are aligned with the target specification, taking into account all demographic and other attributes of the target population. Detailed sample selection plans interlock these criteria at the point of extraction from our database and deployment of invitations. We also calculate likely responsiveness per respondent, using their historical survey activity, to ensure correctly balanced sample deployments and a steady throughput of sample on entry to each survey.
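Filling interlocked quota cells from a panel can be sketched roughly as below. The two-way gender-by-age interlock, the field names and the random draw within each matching pool are illustrative assumptions, not Panelbase’s actual selection engine.

```python
import random
from collections import Counter

def draw_interlocked_sample(panel, quotas, seed=None):
    """Draw respondents to fill interlocked quota cells.

    panel:  list of dicts, each with 'gender' and 'age_band' fields
            (assumed attribute names for this sketch).
    quotas: dict mapping (gender, age_band) cells to target counts.
    """
    rng = random.Random(seed)
    shuffled = panel[:]
    rng.shuffle(shuffled)           # random draw order within each pool
    filled = Counter()
    selected = []
    for person in shuffled:
        cell = (person["gender"], person["age_band"])
        if cell in quotas and filled[cell] < quotas[cell]:
            selected.append(person)
            filled[cell] += 1
    return selected

panel = ([{"gender": "F", "age_band": "18-34"}] * 3
         + [{"gender": "M", "age_band": "18-34"}] * 3)
quotas = {("F", "18-34"): 2, ("M", "18-34"): 2}
print(len(draw_interlocked_sample(panel, quotas, seed=1)))  # 4
```

A production system would additionally weight the draw by each member’s historical response rate, as described above, rather than selecting uniformly at random.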
Do you employ a survey router?
If you use a router: Please describe the allocation process within your router. How do you decide which surveys might be considered for a respondent? On what priority basis are respondents allocated to surveys?
If you use a router: What measures do you take to guard against, or mitigate, any bias arising from employing a router? How do you measure and report any bias?
If you use a router: Who in your company sets the parameters of the router? Is it a dedicated team or individual project managers?
What profiling data is held on respondents? How is it done? How does this differ across sample sources? How is it kept up-to-date? If no relevant profiling data is held, how are low incidence projects dealt with?
All members are invited to complete 18 profile sections to tell us more about themselves and to assist with pre-selecting them for surveys that are relevant to their personal profile and interests. In total, each member can provide information on more than 800 fields; however, this is not mandatory. All members are prompted to complete their profiles, or any sections which are out of date by more than 6 months. This helps to ensure maximum accuracy at all times and assists with feasibility and sampling. Where no relevant profiling is held, we make use of our mini poll tool, which allows questions to be launched within a matter of seconds and can help to gauge incidence or act as a pre-screening mechanism for targeting niche sample profiles. We also regularly script bespoke pre-screening projects to accommodate niche or complex sample specifications which cannot be easily serviced using core profiling alone.
Please describe your survey invitation process. What is the proposition that people are offered to take part in individual surveys? What information about the project itself is given in the process? Apart from direct invitations to specific surveys (or to a router), what other means of invitation to surveys are respondents exposed to? You should note that not all invitations to participate take the form of emails.
Sample selection is driven by the profile requirements of each individual survey, and also takes into account additional factors such as available fieldwork time and likely response rates. Feasibility and incidence are always established during the conception of a project in order to determine the most effective means of achieving sample selection. Exclusions can be implemented based on survey subject matter, frequency of participation, or any other criteria, as required.
When selecting or strategically excluding sample, our systems automatically extract at random those members who meet all profiling requirements. This process is subjected to quality assurance checks in order to verify that the correct sampling requirements and expectations are met before allowing our systems to engage in the mass deployment of survey invitations.
Survey invitations are typically deployed via email as well as being added dynamically to each invitee’s home page within the member website. We also have the capability to deploy SMS alerts to those members who have provided and double opted-in their mobile number. Additional notification channels which use prevailing and relevant technologies are currently in development and will be reflected in future revisions of this document.
Please describe the incentives that respondents are offered for taking part in your surveys. How does this differ by sample source, by interview length, by respondent characteristics?
Upon completion of the double opt-in process, all new members automatically receive £3 in their Panelbase account. We believe that Panelbase members should be rewarded fairly for their time when participating in research studies. Therefore, all survey invitations are accompanied by a nominal financial reward and/or entry into a project-specific prize draw (e.g. for £100 high-street vouchers). In addition, to ensure that members who are screened out are also fairly compensated for their time, all non-qualifying respondents are automatically entered into a monthly prize draw (182 prizes each month, ranging between £1 and £50 cash).
The reward offered for participation and successful completion of surveys is linked to the duration and complexity of each survey, as well as the methodology and sample profile. Online surveys usually attract a reward of between £0.10 and £10.00 for surveys lasting between 1 minute and 90 minutes, whereas focus groups can offer rewards of up to £50. Other, more complex studies that require greater input from a panel member (e.g. diary activities that require online submission of activity over multiple days, or follow-up face-to-face depth interviews) can attract higher rewards. All rewards are designed to offer fair recompense for the time required to participate in the research study.
What information about a project do you need in order to give an accurate estimate of feasibility using your own resources?
The basic parameters are: length of survey, subject matter, sample size, respondent profile(s) required, survey quotas, any technical detail that might impact fieldwork, and the client’s preferred timings.
Do you measure respondent satisfaction? Is this information made available to clients?
We periodically research how our respondents feel about the surveys and services that we offer them. This information informs the ongoing growth and development of our panel. Survey-specific feedback is often fed back to clients; however, this does not happen routinely on all projects. We also encourage our members to rate us using Trustpilot and have consistently scored 5 stars, with over 1,700 reviews at the time of updating this document.
What information do you provide to debrief your client after the project has finished?
Typical feedback includes information relating to incidence rates, volumes of quota-full and screen-out activity, and any anecdotal feedback from respondents, which we tend to receive for most surveys.
Who is responsible for data quality checks? If it is you, do you have in place procedures to reduce or eliminate undesired within survey behaviours, such as (a) random responding, (b) Illogical or inconsistent responding, (c) overuse of item non-response (e.g. “Don’t Know”) or (d) speeding (too rapid survey completion)? Please describe these procedures.
Where Panelbase is responsible for scripting and hosting a survey, we include proprietary algorithms to automatically identify any potential rogue respondent activity, such as straight-lining, speeding and poor verbatim responses. All such instances are subjected to manual review and, where these checks do not meet our required standards, the respondent’s account is flagged and a notification is sent advising them that they have not met the required standard for the survey. They are also reminded of the importance of being diligent and following instructions to provide the best quality of considered response at all times. Multiple failures of our quality control processes may result in the removal of a panellist account.
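Checks of this kind can be sketched as follows. The thresholds, field meanings and flag names are illustrative assumptions for the sketch, not Panelbase’s proprietary algorithms.

```python
def flag_suspect_response(answers, duration_secs, median_duration_secs,
                          speed_ratio=0.33, straightline_ratio=0.9):
    """Flag a completed interview for manual review.

    answers: list of grid-question answer codes for one respondent.
    Flags 'speeding' if the interview took under a third of the median
    completion time, and 'straight-lining' if (almost) every grid
    answer is identical. Thresholds here are purely illustrative.
    """
    flags = []
    if duration_secs < median_duration_secs * speed_ratio:
        flags.append("speeding")
    if answers:
        most_common = max(set(answers), key=answers.count)
        if answers.count(most_common) / len(answers) >= straightline_ratio:
            flags.append("straight-lining")
    return flags

# A 90-second completion against a 10-minute median, answering '3' to
# every grid item, trips both checks:
print(flag_suspect_response([3, 3, 3, 3, 3, 3], 90, 600))
# ['speeding', 'straight-lining']
```

Flagged interviews go to a human reviewer rather than being discarded automatically, which matches the manual-review step described above.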
How often can the same individual be contacted to take part in a survey within a specified period whether they respond to the contact or not? How does this vary across your sample sources?
Panellists receive an average of 5 – 10 survey invitations each month. Invitations to an individual survey are usually limited to the original email invitation and one reminder email, however the latter is often not required.
How often can the same individual take part in a survey within a specified period? How does this vary across your sample sources? How do you manage this within categories and/or time periods?
We do not enforce a hard limit on the volume of surveys within a given period; however, we are also conscious of not over-inviting our members. For longitudinal studies, we usually implement a lock-out mechanism of 3 months so that a respondent cannot take part in multiple waves of a study within this period. Some projects require shorter or longer lock-out periods, which are implemented on a case-by-case basis. Similarly, lock-outs can be applied based on topic or exposure to certain content within a given time frame.
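A lock-out of this kind reduces to a simple date comparison per panellist. The sketch below assumes a default 90-day window and a stored last-participation timestamp; both are illustrative.

```python
from datetime import datetime, timedelta

DEFAULT_LOCKOUT = timedelta(days=90)  # illustrative ~3-month lock-out

def eligible_for_wave(last_participation, now, lockout=DEFAULT_LOCKOUT):
    """Return True if the panellist may be invited to another wave.

    last_participation: datetime of the member's last completed wave,
    or None if they have never taken part in this study.
    """
    if last_participation is None:
        return True
    return now - last_participation >= lockout

print(eligible_for_wave(datetime(2024, 1, 1), now=datetime(2024, 2, 1)))  # False
print(eligible_for_wave(datetime(2024, 1, 1), now=datetime(2024, 6, 1)))  # True
```

Topic- or content-based lock-outs work the same way, keyed on the category of the last survey seen rather than the study itself.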
Do you maintain individual level data such as recent participation history, date of entry, source, etc., on your survey respondents? Are you able to supply your client with a project analysis of such individual level data?
We store all survey participation information at an individual panel member level. Every email sent, click-through to survey, entry into survey, exit from survey, and source of panellist is recorded so that this can be used for quality assurance and reporting purposes.
If clients require reporting of this information in relation to their projects we are able to supply this on demand.
Do you have a confirmation of respondent identity procedure? Do you have procedures to detect fraudulent respondents? Please describe these procedures as they are implemented at sample source registration and/or at the point of entry to a survey or router. If you offer B2B samples what are the procedures there, if any?
Every online survey that we host, irrespective of whether we also provide the sample, is subjected to stringent data integrity processes. All survey data that falls outside of our data integrity requirements is eliminated from the survey results. Furthermore, where panellists are found to provide unusable data on three occasions, their accounts are automatically removed so that they are not invited to future surveys. The payment of rewards to such respondents is rejected and they are unable to redeem any rewards held in their account. We also engage multiple anti-fraud detection systems at the point of registration, survey entry and in other areas to actively identify any potential rogue respondents and to remove them immediately from the panel. Sense-checking of profile data against survey responses is an additional measure which we engage to help to filter out suspicious activity.
Please describe the ‘opt-in for market research’ processes for all your online sample sources.
The online registration process is only completed following a double opt-in confirmation, which verifies that the registrant’s email address is correct and confirms their acceptance of our terms and conditions of use. Every member who opts in for SMS or mobile surveys must complete a separate double opt-in process using their mobile phone. A validation code, specific to their opt-in for SMS/mobile surveys, is sent to their mobile phone and must then be entered into their online account. This confirms receipt of the validation code, verifies that their number is correct, and re-affirms their consent to receive surveys or associated notifications on their mobile phone.
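The SMS validation-code step can be sketched as below. The code format, the HMAC-based storage and the server-side key are illustrative assumptions; the only point taken from the text above is that a code sent to the phone must be entered back into the online account.

```python
import hashlib
import hmac
import secrets

SERVER_SECRET = b"server-side-secret"  # hypothetical key, not a real value

def issue_sms_code():
    """Generate a short numeric validation code and a server-side digest.

    The code is sent to the member's phone; only the digest is stored,
    so a database leak does not expose live codes.
    """
    code = f"{secrets.randbelow(1_000_000):06d}"  # e.g. '048213'
    digest = hmac.new(SERVER_SECRET, code.encode(), hashlib.sha256).hexdigest()
    return code, digest

def confirm_sms_code(entered, stored_digest):
    """Compare the code the member typed in against the stored digest."""
    candidate = hmac.new(SERVER_SECRET, entered.encode(),
                         hashlib.sha256).hexdigest()
    return hmac.compare_digest(candidate, stored_digest)

code, digest = issue_sms_code()
print(confirm_sms_code(code, digest))  # True: number and consent confirmed
```

`hmac.compare_digest` is used for the comparison so that timing differences do not leak information about the stored value.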
Please describe the measures you take to ensure data protection and data security.
All data and project materials provided by Panelbase members and clients are stored on secure servers to which only authorised personnel have access, and only for the purpose of administering Panelbase member accounts and surveys. All data submitted by members via the Panelbase website is transmitted using Extended Validation SSL technology, which encrypts the contents of the browser session and ensures the integrity of the data transaction between their internet browser and our systems.
The premises within which Panelbase servers are located are secured from public or unauthorised access, both physically and electronically, using the latest technologies and security systems, including but not limited to: firewalls, data encryption, IP-based permissions, CCTV and swipe-entry access control. Data back-ups are subject to the same levels of physical security and authorised access.
What practices do you follow to decide whether online research should be used to present commercially sensitive client data or materials to survey respondents?
Whilst it is almost impossible to prevent disclosure of survey content, we instruct all respondents of the need for confidentiality regarding the material they are exposed to. We also take certain measures to reduce the ease with which material can be ‘grabbed’ from a survey (e.g. encrypting file paths); however, this only deters the less knowledgeable individual, and anyone determined to capture survey content can do so on any device that displays such information. Trust and goodwill are integral to the market research process and something that we communicate to our members at the entry to our surveys.
Are you certified to any specific quality system? If so, which one(s)?
Our company is ISO 9001:2008 certified (previously BS EN ISO 9001:2000) and has held certification each year since 2002. We are also working towards ISO 20252:2006, the market research quality standard, and ISO 27001:2005 for information security management systems.
Our ISO-approved quality management systems are built on the principles of effective data storage, security and management. These systems are subject to continual review and change, and therefore maintain compliance with external auditing requirements at all times.
Do you conduct online surveys with children and young people? If so, do you adhere to the standards that ESOMAR provides? What other rules or standards, for example COPPA in the United States, do you comply with?
We have successfully conducted hundreds of online and offline projects with children aged between 6 and 15, drawing from our sub-panel of over 50,000 children under the age of 16. In the case of offline projects, we only engage DBS-checked personnel on such projects and parental supervision of the research process is mandatory.
Online surveys with children are also subject to parental consent, and at no point do we communicate with children directly. Surveys for children are designed in accordance with ICC/ESOMAR and MRS guidelines. In the same way that we protect the identity of our panel members and the responses they provide to our surveys, all data provided by children is handled with the same levels of care and integrity.
Any questions regarding the content of this document should be addressed to:
Angus Webb (Founder & Director)
Hexham Business Park
Alternatively, contact can be made using the following methods:
Telephone: 01434 611164
“We have used Panelbase for many years and find their panel to be very well maintained, resulting in higher quality responses. The team are great to work with and always easy to contact. They always take great care with each and every project, resulting in high quality data.”
Taylor McKenzie | Research Director