No time for reading? The real reason why book reading is declining in South Africa

Written by: Alexandra Cronberg

Put on Sabina’s hat. Wait for the train in sweltering heat with Ifemelu. Sit down with Nathan Zuckerman and talk about the inevitability of getting people wrong. Or enter the world of the characters in any other novel. Whatever the book, the benefits of reading are numerous. It helps build empathy, imagination, and critical thinking. These traits not only enrich personal lives but can also contribute to social cohesion and innovation. Reading has also been shown, time and again, to be a strong predictor of educational attainment and academic success. It can therefore help to reduce social and economic inequalities in a country.

Against this backdrop, the South African Book Development Council (SABDC) is leading an initiative to promote reading, in particular among disadvantaged groups. As part of this work, they commissioned Kantar Public (at the time operating as part of TNS) to gather information on South Africans’ reading habits, and to segment the population based on those habits and willingness to read books[1]. The study applied tools developed for market research purposes, and shows how such tools can gainfully be used in the context of social research. Two tools in particular were used: First, a market segmentation grouped the population with respect to reading habits and inclination to read books. This segmentation helped answer questions such as ‘Who are the people loath to ever open the covers of a book, and how common are they in the population?’ Or, ‘Who are the people with books on their bedside table, gathering more dust than delight?’[2] Second, the ConversionModel was used to estimate how much free time the different segments dedicate to reading, and how much time they would ideally spend if there were no barriers to getting lost in a book. What’s the discrepancy, if any?
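The study’s actual segmentation algorithm isn’t described in this post, but the basic logic of assigning respondents to segments can be illustrated with a toy rule-based classifier. Everything here – the input questions, the cut-offs, and the mapping to labels – is a hypothetical sketch loosely echoing the segments the study reports, not the model Kantar Public actually used.

```python
def reading_segment(read_last_month: bool, hours_per_week: float,
                    wants_to_read_more: bool) -> str:
    """Toy rule-based segmentation of a survey respondent.

    The inputs and cut-offs are illustrative only; the real study derived
    its segments from a fuller set of survey questions.
    """
    if read_last_month and hours_per_week >= 4:
        return "committed reader"
    if read_last_month:
        return "less committed reader"
    if wants_to_read_more:
        return "open"
    return "low potential"

# A frequent reader lands in the 'committed' segment; a non-reader with
# no interest in reading lands in 'low potential'.
print(reading_segment(True, 5, False))
print(reading_segment(False, 0, False))
```

In a real segmentation, rules like these would be replaced by a statistical clustering of many survey variables, but the output is the same kind of object: a segment label per respondent that strategies can then be tailored to.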

The study showed there is much that competes for the time and attention of South Africans. Listening to the radio, watching TV or movies, going to the mall, and hanging out with family or friends are all more popular activities than reading. As for printed books, only four in ten households have a book in the house. South African readers spend, on average, four hours per week reading, though not necessarily books. Compared with a previous survey, reading has also dropped in popularity as a leisure activity: in 2006, 65% of South Africans reported having read in the past month. That figure was down to 43% a decade later.

Turning to the results of the segmentation, the study showed that almost three-quarters (73%) of the population are ‘low potential’ printed book readers, that is, this segment not only prefers to watch TV, listen to the radio, or go to the mall, but would probably prefer dusting the shelves too rather than reading. The other segments are ‘committed readers’ (14%), ‘less committed readers’ (10%), and those who are ‘open’ (3%).

The value of this analysis lies in the tailored strategies that the SABDC can develop for each of the segments, and the targeted level of effort involved. Some people may only need a bit of encouragement to pick up that book waiting on the bedside table, whereas others need to find new occasions to take up reading. For yet another group, readership starts from a blank page, so to speak.

Moreover, the results from the ConversionModel showed that in South Africa there is generally only a small discrepancy between the actual time spent reading and the ideal amount of time dedicated to this pastime. Hence, the falling readership figures are probably not due to increasingly busy lives, but to shifting activities and preferences. It might not have been what the SABDC wanted to hear, but it nevertheless helps inform their strategies and initiatives.

So, the SABDC is bound to stay busy for a while, working to get South Africans to pick up those books waiting to be tickled with the turn of a page. More than that, much effort is needed to get people to visit the library or bookstore in the first place. Yet the right information to aid the design of their programmes and initiatives makes their task easier: the study findings mean they can specifically target the groups with the greatest potential. As a result, there may be more people who will put on Sabina’s hat, wait for the train with Ifemelu, or sit down in a bistro with Nathan Zuckerman, but more importantly, step into any story or book.


[1] National survey into the reading and book reading behaviour of adult South Africans 2016. The study was a nationally representative household survey (n=4000).

[2] The questions may have been worded slightly differently in the project report.

What works? Reflections on the International Summit on Social and Behaviour Change Communication

Written by: Alexandra Cronberg

Whether you know it or not, you are probably subject to social and behaviour change interventions in daily life. If you live in South Africa, perhaps you have heard radio ads or seen ads on Twitter or Facebook, talking about the importance of wearing seatbelts in the car. If you live in India, perhaps you have seen billboards with the slogan “Drink whisky, drive risky”. In many countries in Europe, cigarette packages now display graphic pictures of cancerous organs, and smoking is banned in public places. In countries all over the world, employees who are given the choice of joining a pension scheme often have the box ticked by default.

As these examples illustrate, behaviour change interventions can take many forms, including communication campaigns, taxes, legislation, and ‘nudges’.

These types of interventions are all part of the field of Social and Behaviour Change (SBC). Sometimes it’s referred to, more or less interchangeably, as Social and Behaviour Change Communication (SBCC) or Communication for Development (C4D). Whichever term is used, social and behaviour change is an umbrella field consisting of specialists in subjects such as communications, behavioural economics, anthropology, sociology, and Human Centred Design, to mention but a few.

The evolving nature and growing popularity of the field were evident at the International Summit on Social and Behaviour Change Communication, held in Bali a couple of weeks ago. Over 1200 participants attended, three times as many as at the preceding summit in 2016.

The summit posed the pertinent question: What works?

To address it, the conference agenda offered a vast range of sessions on interventions, approaches, and measurement. The interventions ranged from promoting family planning by reaching mothers-in-law, to radio dramas addressing gender-based violence by challenging prevailing social norms, to television ads normalising HIV testing among gay men. Presenters also talked more broadly about measuring and understanding social norms and networks, successfully scaling up interventions, strengthening measurement, and innovative new research methods.

It certainly made for interesting content on What Works? The question of why things work was, however, more scantily answered. Admittedly, that question was not part of the summit title. Yet it is also an important one if we want a degree of predictability in these matters. While the complexity of human behaviour, and the often non-linear nature of change, means there is no simple answer or single model for addressing behaviour change, that very complexity makes it essential to use – and gain – insights into the conscious and unconscious drivers of behaviour. Only with such insights can the design and effectiveness of these types of interventions be maximised and further advanced.

So what should be next for SBC? Arguably the challenge is how practitioners in the field – the behavioural scientists, economists, sociologists, communication experts – share knowledge and collaborate to answer not only the question of What Works, but also Why?

Offering a forum to discuss such gaps and potential future actions, the SBCC summit included a small but valuable working session on ‘What does the research agenda for social and behaviour change need to address?’. Attended by a dozen or so academics, researchers, and practitioners, the session promised a good start to bringing these groups together, sharing knowledge, and building a joint research agenda. The conversation is continuing online, with actions to follow. Other points raised included the need for better understanding and sharing of innovations, the ethics of interventions, sustainability, cost analysis, and a practical conceptual model of influencers.

Hopefully this collaborative initiative can help share knowledge, build on existing insights, and improve the effectiveness and predictability of social and behaviour change interventions, and so contribute to the further development of the field.

What’s more, hopefully the initiative will succeed without the need for any behaviour change intervention of its own.

Functional Literacy: A Better Way of Assessing Reading Ability?

Written by: Alexandra Cronberg

When I lived in Nigeria, my driver, a young man in his 20s, told me he had gone to school for six years. Yet he struggled to read and write. Once when taking me to the airport, he almost missed the turn for ‘Departures’. I realised he couldn’t read the sign. Other times he sent me text messages containing scrambled letters and words that I deciphered with a smile and a bit of sadness. I later learnt that he was going to school again to improve his literacy. The thing is, he was also a boxer who competed internationally. He said it was difficult for him to travel without being able to read. That ‘Departures’ sign was indeed important for his own life too.

Literacy is clearly key to getting on in life, whether you are well off and taking it for granted, or disadvantaged and struggling to read. Without the ability to read and write, you might miss out on opportunities to learn, adopt new practices, or indeed get by in everyday life. For organisations and governments working to improve the situation for poorer people in Africa and Asia in particular, it is essential to know what the level of literacy is and what the gaps are. As illustrated by my driver, the level of schooling is often not a good measure. Literacy needs to be measured specifically.

There are several ways in which this can be done. Literacy measures at population level normally involve a quantitative household survey[1]. The usefulness and resource intensity of the measures vary, however. Data are usually collected face-to-face, though the simpler measures can be applied in other modes as well. Here I will briefly discuss the pros and cons of the main approaches, and also highlight the method of ‘functional literacy’, which has been developed and implemented by IBOPE Inteligência (associated with Kantar Public in Brazil), Instituto Paulo Montenegro (the social arm of IBOPE), and Ação Educativa, a non-governmental organisation focused on education in Brazil.

African children from the Samburu tribe during an English language class under an acacia tree in a remote village, Kenya, East Africa.

In this blog post I will focus on ways of measuring reading ability, but similar approaches can be applied for writing ability and basic numeracy. Moving on, then, to the main approaches:

  1. Asking about reading ability directly. For example “How well can you read?” or “How well can you read a newspaper?” Response options may be “Very well”, “Somewhat well”, and “Not at all”.

Clearly this approach relies entirely on respondents’ subjective opinion of how well they can read, and may also be subject to social desirability bias. It may be influenced by reading ability among people around them, and their own rose-tinted self-perception. Perhaps a respondent can easily read her brother’s text message – better than anyone else in the household – but she might struggle to read more complicated texts. She would like to say she can read very well. What will she respond?

Having said that, there are times when self-perceived ability is what matters, for example where one wishes people to put themselves forward for adult education. Another advantage of this otherwise quite limited approach is that it is a very short question that can fit into even SMS questionnaires. Moreover, the version of the question that simply asks how well respondents can read avoids the issue of defining the language. While this may be a drawback if more in-depth information is required, the question can serve to give a general sense of literacy level.

Asking specifically about newspaper reading means a reference point-of-sorts is introduced. However, it also raises the issue of language. What if most newspapers are published in, say, English rather than local languages? Which language should the question refer to?

Finally, it is worth mentioning that the literacy questions above are sometimes asked with respect to other people in the household rather than the respondent. This avoids potential social desirability bias, but it means links with other factors cannot be analysed so straightforwardly.

  2. Asking the respondent to read a sentence out loud, e.g. ‘Parents love their children’ (from the Demographic and Health Survey, as referenced in the 2006 UNESCO paper).

This approach moves closer to assessing actual ability in an objective manner, rather than relying on self-reported answers. Responses are normally coded along the lines of ability to read the ‘full sentence’, ‘partial sentence’, or ‘not at all’. While this approach is generally an improvement on self-reported measures, the sentence is usually a very simple one and provides a rather crude tool for assessment. Also, responses may not reflect actual comprehension. Few respondents succeed in reading only ‘part of the sentence’ – usually they can either read all of it or nothing, meaning it is not a very nuanced measure even for what it is trying to assess.

  3. Giving the respondent a brief text to read and then assessing their comprehension.

Giving respondents a brief text to read and then asking questions to assess their comprehension provides a better assessment of literacy than just asking them to read a sentence out loud. The example below is taken from an Education Impact Evaluation survey in Ghana (2003), again as referenced in the UNESCO paper.

“John is a small boy. He lives in a village with his brothers and sisters. He goes to school every week. In his school there are five teachers. John is learning to read at school. He likes to read very much. His father is a teacher, and his parents want him to become a school teacher too.”

The respondent is then asked questions such as ‘Who is John?’, ‘Where does John live?’, ‘What does John do every week?’ etc. Often the responses are provided in multiple choice format.

Responses are grouped into categories based on the number of correct answers. This approach provides more reliable and nuanced results than the measures above, but it arguably doesn’t capture an adequate range of literacy levels reflecting how well people can function in the real world.

  4. Functional literacy: giving the respondent a test to assess literacy based on a series of everyday activities.

This approach takes the literacy assessment a step further by incorporating a number of different tasks reflecting everyday life in the context of a given society. It thus provides a much richer measure of literacy: it specifically measures ‘functional literacy’. The test was developed in Brazil and covers things like reading a magazine, instruction manuals, and health-related information. It contains about 20 questions. For example, respondents are asked to look at a magazine and indicate where on the cover the title is located, or to link the headings on the cover with the relevant articles. Other test questions relate to instructions on how to clean a water tank, information on who is eligible for vaccinations, and information on how to pay for a TV in instalments. The level of difficulty increases as the test progresses. The responses are then scored using Item Response Theory, meaning the increasing level of difficulty is taken into account in the weighting of responses. Respondents are categorised into one of four groups reflecting their level of functional literacy: 1) Illiterate, 2) Rudimentary, 3) Basic, and 4) Fully literate.
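To give a feel for how Item Response Theory weights answers by item difficulty, here is a minimal sketch of the simplest IRT variant, the one-parameter (Rasch) model. The item difficulties, the ability grid, and the cut-offs between the four bands below are all hypothetical illustrations; the actual scoring parameters used in the Brazilian test are not given in this post.

```python
import math

def rasch_prob(ability: float, difficulty: float) -> float:
    """Probability of a correct answer under the one-parameter (Rasch) model:
    a logistic function of the gap between respondent ability and item difficulty."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# Hypothetical item difficulties, increasing as the test progresses.
DIFFICULTIES = [-2.0, -1.0, 0.0, 1.0, 2.0]

def estimate_ability(responses: list) -> float:
    """Maximum-likelihood ability estimate via a simple grid search.
    A correct answer on a hard item pulls the estimate up more than one
    on an easy item, which is how difficulty 'weights' the responses."""
    best_theta, best_ll = 0.0, float("-inf")
    for step in range(-400, 401):            # grid over [-4.0, 4.0]
        theta = step / 100.0
        ll = 0.0
        for correct, d in zip(responses, DIFFICULTIES):
            p = rasch_prob(theta, d)
            ll += math.log(p if correct else 1.0 - p)
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta

def literacy_level(theta: float) -> str:
    """Bucket an ability estimate into four illustrative bands
    (the real cut-offs are set from the calibrated test, not these)."""
    if theta < -1.5:
        return "Illiterate"
    if theta < 0.0:
        return "Rudimentary"
    if theta < 1.5:
        return "Basic"
    return "Fully literate"

# A respondent who answers the three easiest items correctly but misses
# the two hardest ones gets a mid-range ability estimate.
theta = estimate_ability([True, True, True, False, False])
print(round(theta, 2), literacy_level(theta))
```

The key design point is that two respondents with the same raw score can receive different ability estimates if they got different items right, which is exactly the nuance that simple count-the-correct-answers scoring misses.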

As mentioned above, this approach was developed by our Kantar Public team in Brazil in partnership with Instituto Paulo Montenegro and Ação Educativa. It now provides official literacy statistics over time for the country. In principle, the assessment can be incorporated into any questionnaire and could be adapted for other countries. The downside, however, is that it can take a bit of time. While a person who reads well only needs about 15 minutes to complete the tasks, it often takes much longer for someone with a lower level of literacy, not least because respondents often do not wish to give up. The other limitation is that, as far as I am aware, it has so far only been developed for the Brazilian context. It would be extremely useful to adapt it to other languages and societies too, which indeed I hope we will get a chance to do.

On that note, I will end this blog post. Hopefully the continued measurement and development of global literacy indicators will help direct resources to those who need them the most. The adoption of functional literacy measures in other countries would be a step in the right direction.

Hopefully better measures and improved literacy will contribute to a future where no one is held back because they struggle to locate the ‘Departures’ sign, and people like my Nigerian driver can take off in their boxing careers, or in any other ambition or aspiration they may have.

[1] For a comprehensive discussion of the first three approaches described in this blog post, see the UNESCO paper ‘Measuring literacy in developing country household surveys: issues and evidence’ (2006).