Running an international research project that involves many researchers from different institutions may seem like a daunting task. COVID-19 has not made things any easier lately.
While many of us have relied on email in the past to run our projects, there are more efficient ways to keep everybody in the loop and, at the same time, keep project-related information well organized.
In previous projects, I’ve set up bespoke versions of Moodle running on our own server. Why Moodle? Well, it was easy to translate Moodle’s “course”-oriented interface into a Work Package-driven structure. This way, project members could seamlessly find relevant information under work packages or topics.
This has been a great option for some 15 years now, although I must admit that not every university or organization will be happy with you running things like Moodle or WordPress on their servers, on the grounds of security vulnerabilities. Moodle takes these issues very seriously and maintains a dedicated page on how to protect your data. However, the widespread use of mobile devices has gradually made Moodle less and less attractive.
Since 2017 I have used Slack to collaborate with colleagues, first in technology-related projects and then pretty much everywhere, including my own area, the Arts and Humanities. According to Slack, the app encrypts data at rest and in transit for both free and paying customers, and it complies with the EU General Data Protection Regulation (GDPR).
Slack can make your project management easier in at least five areas. The following is based on my own experience.
1. Creating topic-driven conversations
Research projects are typically divided into Work Packages or deliverables. Every WP or deliverable can be turned into a channel. Channels can thus be seen as containers where conversations and information sharing take place. Creating channels is easy and intuitive.
Project members can use predefined channels (for example, one per work package) or create new channels as they see fit.
Channels are displayed in the menu on the left, and their use is intuitive. If a channel name is in bold, you have new messages to check out; if it isn’t, you’re on top of the messages posted there. Every message is posted in a #channel but, interestingly, you can also tag it with the hashtags of other channels, with people, ideas and so on. This way you increase the context of your message.
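For completeness: channel setup can also be scripted through Slack’s Web API, which may be handy if you want to create one channel per work package in bulk rather than clicking through the interface. Below is a minimal sketch using the official Python SDK (slack_sdk); the token environment variable and the channel names are placeholders of my own, and the bot token would need the relevant channel-management scopes.

```python
import os

from slack_sdk import WebClient
from slack_sdk.errors import SlackApiError

# Placeholder: export SLACK_BOT_TOKEN=xoxb-... before running.
client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])

# One public channel per work package; Slack channel names must be
# lowercase, without spaces or periods.
work_packages = ["wp1-data-collection", "wp2-analysis", "wp3-dissemination"]

for name in work_packages:
    try:
        client.conversations_create(name=name)
    except SlackApiError as e:
        print(f"Could not create #{name}: {e.response['error']}")

# A private channel, e.g. one restricted to the PIs.
client.conversations_create(name="pis-only", is_private=True)
```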
2. Find anything, anytime, the easy way
Searching for stuff in Slack is easy and surprisingly useful. Search results are contextualized and provide information about people, channels, messages and files.
The results give you plenty of context for interpreting your search term. This may seem irrelevant if you’re searching for a document or message you checked out only a few days ago, but it is incredibly useful in a long project spanning two, three or four years of work, several work packages and many colleagues.
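The same search is also exposed through the Web API, in case you ever want to pull results into a script. Here is a minimal sketch with the Python SDK, assuming a user token with the search scope; the query string is an invented example.

```python
import os

from slack_sdk import WebClient

# Placeholder: a user token (xoxp-...) with the search:read scope;
# search.messages does not accept bot tokens.
client = WebClient(token=os.environ["SLACK_USER_TOKEN"])

# Invented query: messages mentioning deliverable D2.1 in #wp2-analysis.
results = client.search_messages(query="D2.1 in:#wp2-analysis", count=20)

for match in results["messages"]["matches"]:
    # Each hit comes with its channel, author and permalink, i.e. the
    # context that makes the result easy to interpret later on.
    print(match["channel"]["name"], match["username"], match["permalink"])
```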
3. Notifications
Slack gives you total control over how you want to be notified. You can choose to get notified 24/7 or to receive notifications only on weekdays between, say, 9 and 5. You decide.
4. Sharing files (and pretty much anything)
Sharing files is easy, and Slack offers integrations with services such as Google Drive and Dropbox.
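And if you ever want to push files from a script instead of dragging them in (say, posting a nightly build of a deliverable), the Web API covers that too. A minimal sketch, assuming a reasonably recent slack_sdk version; the file name, channel ID and comment are invented.

```python
import os

from slack_sdk import WebClient

# Placeholder bot token with the files:write scope.
client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])

# Upload a (hypothetical) report to a work-package channel with a short comment.
client.files_upload_v2(
    channel="C0123456789",  # channel ID of, say, #wp2-analysis
    file="deliverable_d2_1.pdf",
    title="Deliverable D2.1 draft",
    initial_comment="Latest draft for comments.",
)
```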
5. Channels again: a great way to organize information and categories
Messages and threads can be tagged under channels using a hashtag. Different channels can include different group members (i.e. not everyone may be involved in every single work package or deliverable), and private channels can be set up so that, for example, only the PIs can access them.
Slack can be downloaded as a desktop app or a mobile app, or used directly in a web browser.
I find this text so compelling that I’d like to preserve a copy of these words.
The copyright to this text belongs to Jan Blommaert, a sociolinguist and discourse analyst.
Two of my maîtres à penser died relatively young. Michel Foucault was 57, Erving Goffman was 60. It is highly likely that I shall die relatively young as well. I’m 58 now, and I was diagnosed with stage 4 cancer in mid-March 2020. Since there is suddenly very little future left to plan, speculate or dream about, one tends to use such landmark moments as a prompt to reflect on the past. The guiding question in this – quite an obvious one – is: what was important?
I will restrict my reflections to the professional parts of my life. This is, of course, an artificial segmentation, and readers must keep in mind that the professional part of life was always intertwined with the nonprofessional parts, often in uneasy or poorly balanced ways. Perhaps that story should be told elsewhere. For now, I will focus on the part of me that was called “academic”.
*****
Let me briefly preface what follows by reviewing what was not important.
What was not important was competition and its attributes of behavioral and relational competitiveness, the desire or urge to be the best, to win contests, to be seen as the champ, to proceed tactically, to forge strategic alliances and what not. I did not have a sense that I had to be part of a specific clique or network, and I don’t think I ever made great efforts to get close to people considered to be important. If I was a member of such networks, it was rather by accident than by design – it happened to me.
I never self-imagined as a genius, individually measured against others, and individually responsible for the production of superb stuff that everyone should read, know, quote and assign to students. Quite the contrary: I saw myself as unexceptional, and as someone who would always need a good team around me in order to achieve anything. Given that academic life, in my case, was not a thing I had actively desired and sought, but a gift I received from others, I felt a duty to be good, as good as I could be, and better tomorrow than today. So I worked hard, essentially taking my cues from others – the literature of course (a community of others often overlooked when we talk about academic achievement), but also contacts and friends with whom teams could be formed. Discussion and brainstorm were my favorite activities; they were in the most literal sense the ludic, fun, pleasure dimensions of academic life. What I did alone, usually, was the slow and careful analysis of data. But that’s the only thing that’s really individual in a range of activities that were collective and involved intense sharing, exchange and generosity. And even that thing – the data analysis – was usually submitted to the judgment of others before it could be publicly shown. So much for being the lone, unique and autonomous genius researcher.
In such contexts of collective sharing, conditioned by maximum generosity, changing one’s mind is self-evident. The very point of having a discussion or brainstorm – an “exchange of ideas” – is that ideas can be exchanged and changed, and that one leaves the session with better things in one’s head than before the session. Learning is the key there, and if I would be ready to pin one label onto myself, it’s the label of an eternal, insatiable learner.
Which is why I read massively all through my life. And while part of that reading was “just” reading, another part was studying. Most of my career, I was involved in some kind of study, collecting and selecting writings from which I wanted to draw advanced insights, useful for the research projects I was engaged in. I studied, for instance (and the list is not complete), structuralism, existentialism, phenomenology, arcane things such as the works of Rudy Botha on Chomsky and the Functional Grammar attempts of Simon Dik, Talmy Givon and M.A.K. Halliday; but also the entire oeuvre (or, at least, most of what I could get) of Michel Foucault, Carlo Ginzburg, Bakhtin, Freud, Durkheim, Simmel, Parsons, Eric Hobsbawm, E.P. Thompson, Pierre Bourdieu, Charles Goodwin, Dell Hymes, Michael Silverstein, Erving Goffman, Aaron Cicourel, Harold Garfinkel, Anne Rawls, Fernand Braudel, J.K. Galbraith, Immanuel Wallerstein, Arjun Appadurai and several others. I studied Marx and Marxism in its very diverse varieties, Rational Choice, Machiavelli, Darwin, G.H. Mead’s work and influence, Dewey, Paulo Freire, Ngugi wa Thiong’o, Okot p’Bitek, Walter Rodney, Issa Shivji and quite a bit of African political theory from the 1950s, 1960s and 1970s. In order to understand a lot of that, I had to study the works of Mao Zedong and the history of the Cultural Revolution in China. And so on, and so forth.
If I have regrets now, it is about the fact that some of those studies will remain unfinished. I took great pleasure from them.
I disliked and dislike – intensely – the development of academic industrial culture that I was witness to throughout my career, with almost-totalized individualization of academic work and performance measurement, with constant inter-individual competition driving young and vulnerable colleagues to extreme and dangerous levels of stress and investment in work rather than life, and with managers emphasizing – without any burden of evidence – that the “single-authored journal paper” (published, evidently, behind a huge paywall) is the pinnacle of academic performance and the gold standard for measuring the “quality” of an individual researcher. Added to this – and this, too, I was a witness of – is the growth of a veritable celebrity culture in academia, in which mega-conferences take the shape of pop festivals with rockstar headliners bringing their greatest hits in front of an audience of poorly paid struggling academics who spent their personal holiday budgets purchasing a ticket for such events. Little truly valuable intellectual work is going on there. And identical to pop festivals, the carbon footprint of such academic rock concerts is scandalous.
Frankly, all of this is in its simplest and most elementary form anti-academic and anti-intellectual. It’s the recipe for bad science, not for innovation and improvement. I participated in all of it, for all of it became “new” while I was active – it was the culture that defined my career. That culture defined me as one of these rockstars for a while, and thus placed me quite consistently in the company of a small coterie of similar rockstars. It is not a thing I shall miss, for it was invariably awkward and alienating, and very often incredibly boring. And this new culture took away and delegitimized a previous culture, one of collegial dialogue, collaboration, slowness, time to think, to reflect and to doubt, periods of invisibility and absence from public stages – because one was doing some serious bit of research, for instance. And a culture in which one would write something whenever, and because something new had to be reported, not because one needed to achieve one’s annual output quotum or another “top” paper in order to be eligible for promotion, tenure or appointment.
A footnote: another part of that defining culture was university reorganizations, managerialization and budget cuts, with an increasing rat race for jobs (for which the intellectual world pays a terrible price), “customer-oriented” academic programs that had to be checked by the marketing guys as to their merits in a market of academic products, the decline of vital academic “support staff” and the almost-complete commodification of academic output – see the point about “single-authored journal papers” above, and one can add the metrics and impact mania to it. Academic publishing, as an industry, has become a disgrace and is an obstacle to science, not a facilitator (let alone an indispensable actor). Publishing has become a form of terror for young scholars, while it should be an instrument for liberation, for finding their voice and feet in the business. Burnout has now become an endemic professional hazard in academia, much like depression, unhappy human relationships and unhealthy lifestyles. It’s become a highly unattractive environment for human creativity, while it should be an environment, a specialized one, ideally tailored to precisely that.
*****
So that was unimportant. The important things can be summarized in a few keywords: to give, to educate, to inspire. I will add a fourth keyword later.
As I said earlier, my academic life was a gift I received from others. It was unexpected as a gift, and I was unprepared for it. When I received my first academic job in 1988, I mainly looked at people I considered bad examples, and I decided to not do things the way they did it. I essentially decided to be the kind of academic I myself would like to encounter if I were a student. If I had to teach, I should teach the kind of class I myself would love to attend as a student. And if I had to write, I should write texts I myself would enjoy reading. It’s a simple discipline I maintained throughout my career: it’s never about me, it’s always about the student, and my role is to give the student tools and resources useful and valuable for that student, not for me.
I realized early on that my role in the lives of the young people who were my students was that of an educator, not just a lecturer or a teacher. And once I realized that, I took it very seriously. I meticulously prepared every course I ever taught (and there were many), and I always rehearsed every lecture. I never walked into a lecture hall without a fully developed story and a script in mind for how to deliver it. If you have to teach, teach, and do that in a no-nonsense way. Make every minute of the class a moment worth attending for students, and make sure that they learn something in each of your classes. That sounds simple and straightforward, but it isn’t. It’s actually quite a tall order.
It starts from a refusal to underestimate your students. Many of my former students will remember that I would start a course by announcing that I would aim just one inch above their heads, so that they would have to stretch a bit in order to keep up with the pace and content of the course. I always did that: I gave students readings, contents and assignments often judged by colleagues to be too demanding or “above their level” – first-year students would have to read a book by Foucault, for instance. Well, the fact is that they did, and they learned massively from it. So what precisely “their level” is, usually and preferably remains to be determined after the process of learning, not prior to it. Prior to it, no one is “ready” for specific chunks of knowledge; they become ready through the work of learning. Not understanding this elementary fact, and assuming that students “have” a particular level that we, teachers, need to adjust to, is a dramatic error. In my career I have seen very often how this error leads to the infantilization of exceptionally talented young people, and to learning achievements that were a fraction of what could have been achieved. Please never underestimate your students.
Instead, give them the best you have to give. That means: don’t give your students old and pedestrian information, but give them your most recent and most advanced insights and thoughts. Draw them into the world of your current research, expose them to the most advanced issues and discussions in the field, show them complex and demanding data, and allow them into your kitchen, not just into your shop. For large parts of my career, I had a huge teaching load. I could only keep classes interesting for students and for myself by establishing direct and immediate links between my ongoing research and my teaching. I would take half-finished analyses of new data into the classroom, and finish the analysis there, with my students, allowing them to see how I made mistakes, had to return to earlier points, skip some particularly tough bits, and so forth. The good thing was: my frequent classes did not entirely eat away my research time, they were research time, and students were exposed to a researcher talking about a concrete and new problem that demanded a solution.
*****
It is at this point, I believe, that “teaching” turns into “education”. As teachers, we do not “transfer knowledge” and we’re not, in that sense, a sophisticated or awkward kind of bulldozer or forklift by means of which a particular amount of resources is taken from one place (ours) to another (the students’ minds). This is how contemporary academic managerialism prefers to see us. I have already rejected it above.
No. Whether we like it or not, we are much, much more than that for our students, and we have to be. All of us still remember many of our teachers, from kindergarten all the way to university. Some of our memories of them may gradually fade, and some of the teachers may only survive in our memories as vague and superficial sketches attached to particular moments in life. But some of these teachers are actually quite important in the stories we build of ourselves; and of such teachers, we sometimes have extraordinarily extensive and detailed memories. Even more: some of these teachers served (and serve) as role-models or as people who defined our trajectories and identities at critical moments in life. And when people talk about such teachers, we notice how closely they observed and critically monitored even the smallest aspects of behavior of their teachers; their actual words and how, when and why they were spoken; particular gestures made or faces pulled; pranks or surprises they created, and so forth.
I became very aware of the fact that, as a teacher, I will be remembered by my students. I knew, at every moment of interaction with students, that this moment would leave a trace in their development and would often be given a degree of importance it never could have for me. In sum, I realized that, as a teacher, every moment in which I interacted with students would be a moment of education, of the formation of a person, using materials I would be offering to them during that specific moment of interaction. My entire behavior towards them would potentially be educational material in that sense. And my entire behavior towards them, consequently, needed to be organized in that sense. I should allow students to get to know me – at least, get to know a version of me that could be remembered as someone who positively contributed to their development as adult human beings. Respect, courtesy, integrity, professional correctness, empathy, reliability, trustworthiness, commitment: all of these words stand for behavioral scripts that demand constant enactment in order to be real.
Several times in my career, students told me what could best be called “secrets”, highly delicate personal things usually communicated only to members of a small circle of intimi. Twice, young female students came into my office in deep distress, announcing that they had been raped – and I was the first person they called upon for help. While such moments were of course disorienting and caught me cold, they taught me that as a teacher I was very much part of students’ lives, in ways and to degrees I never properly realized. And it taught me the huge responsibilities that came with it: we are so much more than “academics” for these young people; we are fully-formed human beings whose behavior can be helpful, important, even decisive for them. We should act accordingly, and not run away from this broader educational role we have.
*****
The third keyword is “to inspire”, and I need to take a step back now. I mentioned the delight I always took in studying. The real pleasure I took from it was inspiration – other scholars and their works inspired me to think in particular directions, to think things I hadn’t been able to think before, to do things in particular ways, to explore techniques, methods, lines of argument, and so forth. Let me be emphatic about this. I can’t remember ever studying things in order to follow them the way a disciple follows the dictates of a master or an apprentice follows the rules of a trade – or at least, I remember that each attempt in that direction was a dismal failure. I was never able to absorb an orthodoxy, and to become, for instance, someone happy to carry the label of – say – critical discourse analyst or conversation analyst.
Whenever I studied, I wanted to be inspired by what I was studying, and I described inspiration above: it’s the force that suddenly opens areas and directions of thought, shows the embryo of an idea, offers a particular formulation capable of replacing most others, and so forth. Inspiration is about thinking, it is the force that kickstarts thinking and that takes us towards the key element of intellectual life: ideas. And science without ideas is not science, but a rule-governed game in which “success” is defined by the degree of non-creativity one can display in one’s work. The exact opposite, in other words, of what science ought to be. Science can never be submissive, never be a matter of “following a procedure” or “framework”. It is about constructing procedures and frameworks.
There were many moments in my career when graduate students would introduce their work to me, and preface it by saying things such as “I am using Halliday as my framework”. Usually, my response to that was a question: “how did Halliday become a framework?” And the answer is, of course, by constructing his own framework and refusing to follow those designed by others. People who “became a framework”, so to speak, took the essential freedom that research must include and rejected the constraints often mistaken for “scientific practice”. The essential freedom of research is the freedom to unthink what is taken to be true, self-evident and well-known and to re-search it, literally, as in “search again”. It is the freedom of dissidence – a thing we often hide, in our institutionalized discourses, behind the phrase “critical thinking”. I see dissidence as a duty in research, and as one of its most attractive aspects. I believe it is exactly this aspect that still persuades people to choose a career in research.
Inspiration draws its importance from the duty to unthink, re-search, and question, which I see as the core of research. We can make the work of unthinking and re-searching easier (and more productive, I am convinced) when we allow ourselves to draw inspiration from that enormous volume of existing work and the zillions of useful ideas it contains, as well as from interactions with friends, colleagues, students, peers – allow them to affect our own views, to shape new ones, to help us change our minds about things. And in our own practices, we should perhaps also try, consciously and intentionally, to inspire others. I mean by that: we should not offer others our own doctrines and orthodoxies. We should offer them our ideas – even if they are rough around the edges, unfinished and half-substantiated – and explain how such ideas might fertilize – not replace – what is already there.
I have quite consistently tried to inspire others, and to transmit to them the importance I attached to inspiration as a habitus in work and in life. In my writings, I very often sought to take my readers to the limits of my own knowledge and give them a glimpse of what lies beyond, of the open terrain for which my writings offered no road map, but which my writings could help them to detect as open for exploration. This has made parts of my work “controversial” and/or “provocative” – qualifications that are usually intended to be negative but inevitably also articulate a degree of relevance and suggest a degree of innovation. I was usually quite happy to receive these attributions, and they never irritated me. It also never irritated me when I found out that someone I engaged with in conversation did not know me well, had not read my work and did not pretend to have read it. Usually, those were among the more pleasant encounters.
*****
These three things were definitely important to me in the professional part of my life: making a habit of giving, sharing and being generous in engaging with others; being aware of my duty to educate others and of the responsibilities that come with that, and to take that duty very seriously; and taking inspiration as a central instrument and goal of academic and intellectual practice. I can say that I have tried to apply and implement these three aspects throughout my career; I cannot claim to have done so faultlessly and perfectly – there is no doubt that I made every mistake known to humanity, and I am not speaking as a saint here. But the three elements I discussed here were – now that I can look back with greater detachment – always important, always guiding principles, and always benchmarks for evaluating my own actions and conduct.
*****
I now need to add a fourth keyword: to be democratic. It’s of a slightly different order.
I grew up and studied in the welfare-state educational system of Belgium, and given the modest socio-economic status of my family, I would probably never have received higher education in other, fee-paying systems. I’m very much a product of a big and structural collective effort performed by people who did not know me – taxpayers – and regardless of who I was. I am a product of a democratic society.
I remained extremely conscious of that fact throughout my adult life, and my political stance as a professional academic has consistently been that I, along with the science I produce, am a resource for society, and should give back to society what society has invested in me. “Society”, in this view, includes everyone and not just a segment of it. It is necessarily an inclusive concept. And science in this view has to be a commons, a valuable resource available to everyone, an asset for humanity. Practicing this principle became increasingly difficult because of the developments I already mentioned above: the rapid and pervasive commodification of the academic industry during my career. Academic institutions, and academic work, became and have become extraordinarily exclusive and elitist commodities, and academic work that refuses the limitations commensurate with this commodification is, generally speaking and understated here, not encouraged. I’ll return to this below, but I need to continue an auto-historical narrative first.
Working a lot in Africa and with Africans throughout my career, no one needed to tell me that knowledge, surely in its academic form, was not available to everyone, and that a large part of humanity was offered access only to hand-me-downs from the more privileged parts. One can take this literally: many of the school books used in the early and mid-1980s in Tanzania were books taken off the syllabus in the UK and shipped – as waste products, in effect, but under the flattering epithet of educational development assistance – to Tanzania. And almost any student or academic I met at the University of Dar es Salaam (which became my second home for quite a while in the early stages of my career) would answer “books, journals” to the question “what is it you lack most here at the university?” Bookshelves in departments were indeed near-empty (even in so-called “reading rooms”), and the small collections of books privately held by academics (usually collected while doing graduate work abroad) were cherished, protected and rarely made available to others. In the University bookshop on campus, shelves were also empty, supplies were dismal and most of the collection on offer was dated. (Its most abandoned and dusty corner, however, became a treasure trove for me, for that was where cheap editions of the works of Marx, Lenin and Mao Zedong could be found, donated long ago by the governments of the USSR, the GDR and China.) My own working library at home – the working library of a PhD student – was several times larger than some of the departmental collections I had seen in Dar es Salaam. To the extent that “white privilege” has any meaning, I had a pretty sharp awareness of it from very early in my career.
Inequality became the central theme in my work and academic practice from the first moment I embarked on it. And I never abandoned it. I wanted to understand why understanding itself is an object of inequality. Concretely, I wanted to understand why the story of an African asylum applicant was systematically misunderstood and disqualified by asylum officials in Belgium and elsewhere; why the stories of particular witnesses in the South African Truth and Reconciliation Commission were seen as “memorable” while others were forgotten or never taken seriously; why so many stories from the margins are considered not even worth the effort of listening to, let alone to record and examine; why some groups of people are not recognized as interlocutors, as legitimate voices that demand respect and attention, and so forth. This general concern took me, during my entire career, to the margins of societies I inhabited and worked in, and made confrontations with racism, sexism and other structural forms of inequality inevitable.
It also led to various practical decisions about how I organized my work. I will highlight three such decisions.
One. My experiences in African universities made me very much aware of the existence of several academic worlds, not the idealized one “academic community” sometimes invoked as a trope. And I decided to spend a lot of my efforts working with, and for the benefit of, what is now called the Global South. I am proud of official work I did with the University of the Western Cape in 2003-2008, where I coordinated a very big academic collaboration project on behalf of the Flemish Inter-University Council. UWC is a historically non-white university, and it still bore the scars of apartheid in 2003: the university was severely under-resourced and lacked the infrastructure as well as experience for building a contemporary research culture. Working in very close concert with the local university leadership – the most inspiring and energizing team of academic leaders I had ever met, and lifelong friends since – I believe we were able to turn the ship around. In the process I got to know a large community of amazing people who taught me a lot about what real commitment is – from Chancellor Desmond Tutu down to Allister, the man who acted as my fixer and driver whenever I was in Cape Town.
Informally, I did my best to work with and for scholars and institutions in the Global South, slowly building networks of contacts in several countries and trying to be of assistance in a variety of ways. The people I encountered through these networks usually didn’t have the money to travel to conferences where I appeared, nor the money needed to purchase my books. And this takes me to a second decision.
Two. I wanted to make my work available in open access and to create genuinely democratic mechanisms of circulation and distribution. Remember what I said earlier about science as a commons: I take that seriously. So, from very early on, I started series of working papers that enabled published high-quality material to circumvent the paywalls of commercial publishers. And as soon as the web became a factor of importance in our trade, I used it as a forum for circulation and distribution. Everything I write is first posted on a blog (this blog), and then usually moves to a working paper format in the Tilburg Papers in Culture Studies, before it finds its way into expensive journals or books. I also became an early mover on academic sharing platforms such as Academia.edu and ResearchGate. And I am proud to see that a large segment of those who read and download my materials are scholars from the Global South – those who can’t afford the commercial versions of my work.
But my obsession with open access is not restricted to the issue of Global South readerships. My own students, working with me at a well-resourced university in an affluent country, cannot afford to buy my books. As I said earlier, the academic publishing business has become a disgrace, and it excludes growing numbers of people who absolutely need access to its products. I saw it as part of my duty to subvert that system, to share and distribute things usually not free to be shared and distributed, and to do so early on with recent material. For making old texts widely available is good and useful, but the real need for scholars in very large parts of the world is to gain access to the most recent material, to become part of ongoing debates, to align their own research with that which is cutting-edge elsewhere. And the academic publishing industry does brilliant, truly majestic efforts to prevent exactly that.
We should not be part of that industry, we should not be its advocates and we should not feel obliged to serve that industry’s interests. We are its labor force, and we provide free, unpaid labor to it. We sign contracts with them – non-negotiable ones, usually – in which all rights to our own work are handed over, appropriated and privatized – in return for a doi number and a pdf. We are exploited by that industry to an extent that most other sane people find ridiculous. While, if we do a little bit of creative work, we don’t need that industry any longer. As academics, we have an idea of the audiences for our work out there that is far more precise than that of any marketing officer in an academic publishing firm. We also have a very good idea of who might be knowledgeable and reliable reviewers of our work. And we just need a website to post our work when it’s ready for publication – offering it free of charge and without constraints on sharing to anyone interested in it, not to all those who have paid a certain amount of cash for it.
Three. Throughout my career, I never stopped addressing non-academic audiences. I gave literally hundreds of lectures, workshops, training sessions and public debates for professionals and activists in a range of fields – education, social work, care, law, policing, antiracism, feminism, support to refugees, youth organizations, trade unions and political parties. As a rule I did so without charging a fee (see what I said earlier about giving things back to society), and the default answer to invitations was “yes”. I always found such activities rewarding, and the audiences I met through such activities were often extraordinarily energizing ones. I also continued to write materials in Dutch. Over a dozen books, if I am not mistaken, and piles of articles – all written for lay audiences, often based on my ongoing research, and often used in professional training programs. It was my way of trying to bring recent science to a broader public forum quickly. For social workers or teachers in multilingual classrooms should not be given information that was valid a decade ago; they should get the most advanced insights and understandings available and take these into their practices.
I used a label for the things I mentioned in this section. I called it “knowledge activism”. In a world in which knowledge is at once more widely available than ever before, and more exclusive and elitist than ever before, knowledge is a battlefield and those professionally involved in it must be aware of that. Speaking for myself: a neutral stance towards knowledge is impossible, for it would make knowledge anodyne, powerless, of little significance in the eyes of those exposed to it. Which is why we need an activist attitude, one in which the battle for power-through-knowledge is engaged, in which knowledge is activated as a key instrument for the liberation of people, and as a central tool underpinning any effort to arrive at a more just and equitable society. I have been a knowledge professional, indeed. But understanding what I have done as a professional is easier when one realizes the activism which, at least for me, made it worthwhile being a professional.
*****
I will stop here. I have reviewed four things that I found important, looking back at a career as an academic that started in 1988 and is about to end. As I said, one should not read this review of important principles as the autobiography of a saint. I was evidently not perfect, made loads of errors, have been unjust to people, have made errors of judgment, have indulged in a culture of academic stardom and overachievement which I should have identified, right from the start, as superficial and irrelevant; I have been impossible to work with at times, grumpy and unpleasant at even more times, and so on. I am an ordinary person. But I do believe that I can say that I tried really hard to organize the professional part of my life according to the four points discussed here, and that the attempt, modest as it was, made that part of my life valuable to me. The satisfaction I draw from that is sufficient to end that part of my life without remorse, and without a sense of having missed out or of having been short-changed by others. I am happy to stop here.
A collection of conversations on research methods in action designed to demystify how we know what we ‘know’.
The resource has been developed by Sarah Lageson, PhD, an Assistant Professor at Rutgers University-Newark, School of Criminal Justice, and Kyle Green, PhD, an Assistant Professor at Utica College, Department of Sociology and Anthropology.
Source: Melissa Bond, Olaf Zawacki-Richter & Mark Nichols. 2019. Revisiting five decades of educational technology research: A content and authorship analysis of the British Journal of Educational Technology. British Journal of Educational Technology, 50(1), 12–63.
OASIS summaries are one-page descriptions of research articles on language learning, language teaching, and multilingualism that have been published in peer-reviewed journals listed on the Social Science Citation Index. The summaries provide information about the study’s goals, how it was conducted, and what was found, and are written in non-technical language. Where relevant, they also highlight findings that may be of particular interest to language educators, although the initiative is not solely aimed at research with immediate practical implications. The summaries are generally approved, and often (co-)written, by the author(s) of the original journal article.