In his introduction, Nova writes:
“While ethnography generally draws on qualitative data, that does not mean quantitative approaches shouldn’t be employed in the research process. Combining the two leads to a “mixed-method approach” that can take various forms: data collection and analysis can be either separated or addressed together, and each can be used in service of the other. Of course, this isn’t new in academic circles and corporate ethnography, but there seems to be a renewed interest in this topic lately.
One of the driving forces of this renewed interest is the huge amount of information produced by people, things, spaces and their interactions — what some have called “Big Data”. The large data sets created by people’s activity on digital devices have indeed led to a surge of “traces” from smartphone apps, computer programs and environmental sensors. Such information is currently expected to transform how we study human behavior and culture, with, as usual, utopian hopes, dystopian fears and *critical sighs* from pundits.
Although most work on Big Data has focused on quantitative analysis, it is interesting to observe how ethnographers relate to it. Some offer a critical perspective, while others see it as an opportunity to create innovative methodologies that benefit from this situation.”
Aside from Rebekah Rousi’s post (featured here yesterday), EthnographyMatters will feature various case studies and perspectives on the implications of mixed-methods approaches, including Fabien Girardin (on how he used sensor data to yield field observations in a study for Le Louvre in Paris), Alex Leavitt (discussing his research on Tumblr from a computational ethnography perspective), Tricia Wang (sharing her thoughts about the opposite of Big Data, in what she calls “thick data”) and David Ayman Shamma from Yahoo! Research (describing his personal perspective on the topic).