Digital technologies today are indispensable tools used in almost every facet of our daily lives. Especially in the developing world, mobile phones have transformed the lives and livelihoods of average citizens. Yet, two decades ago, when there were more phone lines in Manhattan than in most of Sub-Saharan Africa, only a few visionary institutions could have imagined that computers, the Internet and mobile phones would become so prominent in poverty-stricken environments. Information and communications technologies (ICTs) began to emerge as an issue in the field of development at a time when the concepts of sustainable development, biodiversity, economic growth and services for all dominated the landscape. These discourses did not consider the introduction of technology as a way to address development challenges; technology was perceived as a luxury item rather than an indispensable building block for social and economic development. The rhetorical question, "Which is more important, hospital beds or computers?" was a common dismissive response to the suggestion that digital tools had a place in international development programming.
Despite this early skepticism, a few institutions and players in the world of development were prepared to argue that, alongside basic-needs infrastructure, access to ICTs was also essential. These early advocates for exploring the use of technology in the Global South assumed that, by supporting and researching the ways in which ICTs could be used for development purposes, they could help overcome a range of developmental barriers, such as access and performance in the education, health, political and community sectors of life.