Type of market monitoring tool: Phone surveys
Sub-type: Computer-assisted telephone interview
Authority: Central Bank of Kenya
Sector: Digital credit
Tool: Phone surveys (computer-assisted telephone interviews [CATI])
What is the tool used for? To understand customer experience with digital credit, uses of digital credit, and emerging risks
Third parties: CGAP, Financial Sector Deepening Kenya (FSDK), Ipsos (vendor), Kenya National Bureau of Statistics (KNBS)
Estimated cost*: US$80,000
Year(s) of usage: 2017–present
Keywords: CATI, demand-side surveys, digital credit, digital lending, digital loans, Ipsos, Kenya, multiple borrowing, overindebtedness, phone surveys
*Does not include indirect costs such as staff time at CBK, CGAP, FSDK, and KNBS.
- In 2017, CGAP and FinAccess Management (FAM, a collaboration between CBK, FSDK, and KNBS) conducted large-scale phone surveys of digital credit users and non-users in Kenya to learn more about their experience with digital credit.1 As the first large-scale demand-side survey dedicated to the topic, the study emphasized identifying potential risks emerging from these new credit sources.
- In Kenya, general consumer protection regulation applies to digital credit products offered by or with a regulated financial institution. The rules, however, do not apply to nonregulated digital credit products, including those from app-based lenders or from partnerships between mobile network operators (MNOs) and unregulated lenders. The result is an uneven playing field, with regulated and unregulated lenders offering the same products but operating under different rules.
- Regulatory and supervisory powers. Central Bank of Kenya (CBK) is the prudential supervisor of Kenya’s financial sector and provides oversight of payment, clearing, and settlement systems.
- Consumer protection supervision role. There is no mandate (or unit) for market conduct supervision at CBK. However, in May 2018, Kenya’s National Treasury proposed a draft Financial Markets Conduct Bill that would establish a Financial Sector Ombudsman and a Financial Sector Tribunal, plus a Financial Markets Conduct Authority which would supervise digital lenders. While the 2018 bill has not been passed, the Central Bank of Kenya (Amendment) Bill 2021 seeks to license, regulate, and supervise the business of digital lenders and to safeguard the interests of consumers of digital lending products.
- Credit reporting bureaus. Since 2014, CBK has required regulated banks and microfinance institutions in Kenya to report “full file” credit histories (i.e., positive as well as negative items). Compliance and enforcement have been inconsistent, with one of the largest lenders initially reporting only negative information. This impeded other lenders from accurately assessing potential customer risk profiles and narrowed borrower options. The rules on credit reporting do not apply to nonregulated lenders. In April 2020, CBK withdrew the approval it had granted unregulated digital lenders to submit information to credit bureaus.
Purpose and incentives
- What is the tool used for? To measure customer experience with digital credit, uses of digital credit, customer risks associated with digital credit (including overindebtedness), market penetration of digital credit, and the demographics of digital credit users.
- Incentives for tool development. FSDK already had an existing collaboration with CBK and KNBS through its management of the FinAccess Household Survey (FinAccess),2 and wanted to test a more cost-effective approach to collecting in-depth data on financial inclusion and economic vulnerability on a regular basis. Aside from cost concerns, the choice to use a phone survey over an in-person survey (like FinAccess) was made easier by the fact that mobile penetration is high in Kenya, with over 90 percent of FinAccess participants having mobile numbers. In addition, the subject of the research—digital credit—required the use of a mobile phone.
Technical methodology and data ecosystem
- Sampling. The study targeted two sets of respondents, with fieldwork in Kenya conducted by the research vendor, Ipsos. The first set used 2016 FinAccess respondents as a sampling frame; 85 percent of those interviewed had given consent to be contacted for follow-up surveys. The phone survey sample was randomly selected from a pool of these respondents and weighted by KNBS to be representative of adult phone owners. To gain a deeper understanding of the market, FSDK also partnered with some of Kenya’s largest digital credit providers (e.g., KCB M-PESA, M-Shwari, Branch, Equitel) to interview and analyze an additional sample referred to as the “Boost sample.” This second set of respondents comprised approximately 5,500 recent digital credit borrowers drawn from lenders’ customer databases. Insights from this additional research and from discussions with providers were used to improve interpretation of the findings and to identify relevant implications for policy makers, industry players, and researchers.
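The kind of weighting KNBS applied can be illustrated with a minimal post-stratification sketch. This is not KNBS's actual method or frame; the strata and counts below are hypothetical, chosen only to show how sample weights rebalance a phone survey toward known population shares.

```python
# Minimal post-stratification sketch. Strata and counts are hypothetical,
# not KNBS's actual sampling frame.

def poststratification_weights(sample_counts, population_counts):
    """Weight each stratum so the weighted sample matches population shares."""
    n_sample = sum(sample_counts.values())
    n_pop = sum(population_counts.values())
    weights = {}
    for stratum, n_s in sample_counts.items():
        pop_share = population_counts[stratum] / n_pop
        sample_share = n_s / n_sample
        weights[stratum] = pop_share / sample_share
    return weights

# Hypothetical strata of adult phone owners: urban vs. rural.
sample = {"urban": 1800, "rural": 1350}                  # interviews completed
population = {"urban": 9_000_000, "rural": 13_500_000}   # adult phone owners

w = poststratification_weights(sample, population)
# Rural respondents are underrepresented in this toy sample,
# so they receive a weight above 1 (urban respondents below 1).
```

Multiplying each respondent's record by the weight for their stratum makes weighted totals representative of the target population, which is the role the weighting step plays in the design described above.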
- Survey instrument. The research team, led by CGAP and FAM, developed two survey questionnaires: (i) the FinAccess sample questionnaire and (ii) the Boost sample questionnaire. For the main FinAccess sample questionnaire, respondents who had never used digital credit answered a set of brief questions on demographics and the use of financial services, while respondents who previously had used digital credit answered the full questionnaire about their use of and experiences with digital credit. The Boost sample questionnaire differed slightly from the main questionnaire by including a few additional questions specific to digital credit use.
- Instrument review and translation. Ipsos provided feedback on both sets of questions, which, once finalized, were translated into Kiswahili and several other local languages.
- Questionnaire scripting. Both questionnaires were translated into electronic (scripted) format in the Ipsos CATI platform for use during data collection. The CATI platform supported pre-entry of demographic data about respondents, which was available from the databases provided.
- Training the data collection team and instruments piloting. Ipsos recruited and trained a data collection team, a quality control team, and coding personnel. Two pilot exercises were also conducted with small samples of respondents (30–35 completed questionnaires) to test for interview length and questions that could be problematic. The research team adjusted the survey instrument based on feedback from the pilots, which resulted in shorter interviews.
- Data collection activities. To prepare for the data collection process, Ipsos sent a bulk SMS alert inviting all respondents to participate in the study. The notification helped improve the response rate and was an important trigger, given rising reports of scammers contacting the public to defraud them. Data collection for the FinAccess sample was conducted over approximately one month, with multiple attempts to reach respondents made at different times and on different days. The FinAccess sample had an initial target of 4,500 respondents, 3,150 of whom were successfully interviewed, a success rate of 70 percent. The Boost sample had a target of 5,427 (a number revised upward to make up for unreachable FinAccess sample respondents); a total of 5,445 interviews were completed, a success rate of more than 100 percent. This exceptionally high rate was likely due to the fact that Boost sample respondents were already active contacts that financial services providers (FSPs) regularly engaged to access financial services via mobile phone.
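The success rates can be checked directly from the reported counts; any small discrepancy with a published percentage would reflect rounding or interim target revisions during fieldwork.

```python
# Arithmetic check of the completion rates implied by the reported counts.

def completion_rate(completed, target):
    """Completed interviews as a percentage of the fieldwork target."""
    return 100 * completed / target

finaccess = completion_rate(3_150, 4_500)   # FinAccess sample
boost = completion_rate(5_445, 5_427)       # Boost sample (exceeded target)

print(f"FinAccess: {finaccess:.1f}%")   # 70.0%
print(f"Boost: {boost:.1f}%")           # 100.3%
```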
- Data monitoring and cleaning. Data was monitored on a daily basis, and completed interviews were verified and confirmed. Use of the CATI platform ensured that data collected were relatively clean, given that data cleaning protocols such as skip routines, single and multiple coding responses to critical questions, and logic control were established during the questionnaire scripting processes. The data were checked for accuracy and consistency, with outliers sent back to the data collection team and resolved through respondent call-backs. This helped to identify problematic sections of the survey instrument, as well as interviewers who required monitoring or retraining.
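The skip routines and logic controls built into the CATI script can be sketched as simple per-record validation rules. The field names and rules below are hypothetical, intended only to show the kind of checks a scripted questionnaire enforces.

```python
# Hypothetical sketch of CATI-style skip-routine and logic checks.
# Field names and rules are illustrative, not the actual instrument.

def validate_record(record):
    """Return a list of logic errors found in one interview record."""
    errors = []
    # Skip routine: respondents who never used digital credit should not
    # have answers recorded for the digital credit module.
    if record.get("ever_used_digital_credit") == "no" and "loan_amount" in record:
        errors.append("non-user answered digital credit module")
    # Range check: the survey targets adults only.
    if not (18 <= record.get("age", 0) <= 99):
        errors.append("age out of range")
    return errors

clean = {"ever_used_digital_credit": "yes", "loan_amount": 2000, "age": 34}
flagged = {"ever_used_digital_credit": "no", "loan_amount": 500, "age": 17}

assert validate_record(clean) == []       # passes all checks
assert len(validate_record(flagged)) == 2 # both rules violated
```

Records failing such checks are the "outliers" sent back to the data collection team for resolution through respondent call-backs, as described above.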
- Coding and translation of open-ended responses. The coding frame was developed after the interviews were completed. Codes specified in the questionnaires were used automatically for closed-ended questions. For open-ended questions, a code frame based on verbatim responses was developed in English. The final clean data sets were delivered to FSDK in SPSS and Stata formats for analysis.
- Quality control processes. Ipsos used several processes to ensure quality control, including recruitment and training of a highly qualified data collection team, spot checks in which members of the technical team listened in on some interviews, use of the CATI platform to minimize human error, and daily debriefing sessions with the data collection team.
Staff, expertise, and other requirements
- Survey instrument development. CGAP and FSDK led development of the survey instrument, with input from CBK.
- General oversight. CBK provided study oversight, with input into the questionnaires, data analysis, and research objectives in collaboration with FSDK, which provided Ipsos with funding to conduct the phone survey.
- Sampling expertise. KNBS weighted the FinAccess sample to be representative of adult phone owners and provided technical assistance in ensuring data quality and robust statistical analysis.
- Data collection. Ipsos hired a data collection team organized into two shifts. This maximized the efficiency of reaching survey respondents, since some preferred to be called in the early morning while others preferred the evening. Each shift had nine interviewers allocated to a single supervisor, for a total of two supervisors and 18 interviewers. Interviewers were recruited and organized according to the language competencies anticipated to be necessary for the interviews. Ipsos also engaged two CATI managers who led the implementation of quality control mechanisms during calls.
- Project management. Ipsos assigned a project manager to oversee the overall execution and management of the data collection process and an assistant project manager to conduct quality checks and spearhead collation of field insights for preparation of the field report.
- Data analysis. A small team at FSDK analyzed the data Ipsos submitted, extracting meaningful insights on customers’ experience with digital credit, uses of digital credit, and emerging risks. Findings from the data formed the basis of a joint research publication between FSDK and CGAP on digital credit in Kenya and Tanzania.
Vendor selection and cost
FSDK selected Ipsos as the vendor to carry out the phone survey, with a contract of US$80,000. Ipsos’s main roles included reviewing the survey instrument provided, piloting the instrument, recruiting and training the data collection team, collecting data, processing data, providing project updates, and delivering clean data once data collection processes were completed.
The terms of reference issued by FSDK required the vendor to exhibit the following skills and experience:
- A track record of successful implementation of major phone-based, nationally representative surveys in Kenya within the last five years
- Adequate Kenya-based research capacity to conduct phone-based surveys across all socioeconomic groups
- Proven ability to pre-code specified information onto a phone-based interview script
- Ability to create and work with panel samples
- Strong quantitative and qualitative skills
- Commitment to producing the required deliverables within a specified timeframe
Benefits and impact
- Cost-effective. CATI-enabled phone surveys are cheaper and quicker than in-person surveys, yet can still reach many respondents in a short timeframe.
- Work in low-literacy environments. Phone surveys do not require respondents to be literate, unlike SMS or web-based surveys. Interviews can be conducted in various languages to accommodate respondent preferences.
- Rich survey data improved understanding of a new sector. Analysis of results from the phone survey revealed customer experiences with digital credit, a new sector where not only banks but also unregulated providers were operating. In this environment, little information was available and digital borrowers faced potential risks, including poor transparency, overindebtedness, and multiple borrowing.
- The availability of gender-disaggregated survey data revealed gender gaps. The phone survey identified a significant gender gap in access to digital credit (55 percent of borrowers were male and 45 percent female), as well as differences by gender in the frequency and amounts of digital credit.
- Research informed proposals on regulation and supervision of digital credit. Following discussions on survey results, the National Treasury proposed a draft Financial Markets Conduct Bill in 2018 to establish a market conduct authority that would supervise all digital lenders. Although the bill has yet to pass, in 2020 CBK drafted a bill aiming to regulate and supervise digital lenders and safeguard the interests of digital borrowers. Further, in April 2020, CBK mandated financial institutions not to share with credit bureaus any defaulted loans with a balance of about US$9 or less, which covers a large share of digital loans to first-time borrowers.
Limitations and implementation challenges
- Out of service contacts. A significant number of FinAccess sample respondents were unreachable, which posed a challenge to achieving the targeted number of interviews. To make up for “non-responses” in the FinAccess sample, the Boost sample target was increased.
- Respondent fatigue. Some respondents had been invited to participate in other studies in the past and, due to survey fatigue, refused to participate in interviews. To manage this challenge, respondents were called at times that were convenient for them, including evenings and weekends.
- Suspicion of fraud. Due to the prevalence of scam calls and fraud attempts via mobile phone in Kenya, some respondents were suspicious of interviewers and did not respond to all questions. In addition, some respondents from the Boost sample were uneasy that they had not received notification of the study from their FSP.
- Questionnaire restructuring. After testing questionnaires during the first pilot, it was reported that interviews took too long. This necessitated deleting some questions and restructuring others to avoid repetitiveness and for ease of administration. These changes affected the scripts, which had to be redone, taking more time than anticipated.
- Non-response for certain sections of the questionnaires. During data collection, several respondents refused to answer questions over the phone that they perceived to be sensitive, such as sources of income, savings, and expenditures. Considerable persuasion was needed to obtain responses, and in some cases respondents refused to answer at all; probing tended to agitate certain respondents. Responses to income, expenditure, and savings questions were often inconsistent, or respondents gave answers they felt were socially acceptable. To mitigate these issues, interviewers reassured respondents of the confidentiality of their responses and the purpose of the study.
Future plans for the tool
Since the 2017 digital credit phone survey, CBK, in conjunction with KNBS and FSDK, has conducted two additional phone surveys using the 2019 FinAccess survey sample. The first assessed the impact of COVID-19 on business owners. The second, a SmartSensing survey, aimed to understand smartphone use among Kenyans in order to provide policy recommendations on data privacy. CBK also is planning to conduct a phone-based survey on remittances. FSDK has been working with CBK on efforts to make better use of the central bank’s data warehouse, improving supervision templates and visualization.
Key insights and lessons learned
- The phone survey helped improve understanding of a new market with unregulated providers and little available information, including by gathering data on the experiences of and risks faced by different consumer segments, especially women.
- Posing questions over the phone tended to raise respondents’ suspicions. A longer warm-up session may have made respondents feel more comfortable.
- Sending bulk SMS to alert respondents of the digital credit study was essential and helped to reduce suspicion. When contacts were sourced from specific FSPs, such as those in the Boost sample, it would have been ideal for the providers themselves to notify potential respondents, which would have lent the study more credibility.
- There is risk in using samples of respondents from previous surveys as some respondents may change or stop using certain phone numbers.
- Short questionnaires that do not take too much time to administer help avoid respondent fatigue (which can, in turn, lead to the termination of an interview).