This month I started working on a new project with colleagues from Leeds Beckett University and The University of Leeds, which will use self-tracked data to assess demographic differences in activity patterns.
This research is funded by the Economic and Social Research Council (ESRC) and the Consumer Data Research Centre (CDRC). We will be using data held by the CDRC, gathered by an app called Bounts. This app connects to other fitness-tracking apps and rewards activity with points that can be exchanged for prizes and discounts. The data generated by the app (which include the types of activity engaged in, distance, steps and the source of the data, as well as basic demographic data on the users) will be combined with consumer profiling datasets and categories (also made available through the CDRC).
Access to these data should help us to understand who is using activity-tracking apps (in relation to gender, income, location and other proxies for social class), in what ways they are using them and whether their use is associated with higher levels of activity.
It should also help us to investigate whether the economic and social environment in which an individual lives is associated with activity (or inactivity) levels, whether technology use is socially patterned, and whether it reduces or increases existing social inequalities.
A second strand of the project will assess the effectiveness of a national community weight management intervention, using similar matching with consumer datasets. This should also help us to investigate whether engagement with weight management interventions is socially patterned.
I think the efficacy of self-tracking approaches to behaviour change is largely taken for granted, with many individuals choosing to use step-tracking apps and fitness bands (such as Fitbit). These are also increasingly being promoted through public health campaigns such as the NHS “Couch to 5k” app and Sheffield City Council’s “Move More” (which uses an app to facilitate competitions between workplaces and schools). Workplaces, too, are increasingly using self-tracking systems to encourage more activity and better engagement at work through competitions and challenges.
While there is some evidence to suggest that these kinds of apps and devices might be effective for behaviour change, it is not clear to what extent their use (and effectiveness) is socially patterned. If, for instance, these interventions are not effective for those most in need of them, perhaps they need to be redesigned, or entirely different approaches are needed.
I am also excited about this project on a methodological level, as we will be using commercial datasets and categorisations of populations. Sociologists (including me) have been critical of the collection of these kinds of data and the categorisations they have enabled. Consequently, they are often quite squeamish about using such data themselves. This is exacerbated by a broader antipathy towards quantitative methods amongst sociologists (at least in the UK), who often see them as incompatible with a “critical” approach to analysis. Mark Carrigan has written very well on this recently.
My thinking here has been influenced by David Beer’s work on the use of “digital by-product” data and “commercial data”, and his position that using these data can enable critical social science. Perhaps sociologists even have a responsibility not to leave the analysis of such data to commercial enterprises. Big data can be used to control people or to sell products, but they might also highlight social inequalities and challenge or improve social policy interventions.