
Four Questions for Teaching in the Real World

[Image: several globes]

by Nathan Loewen, Department of Religious Studies

I think educators have responsibilities to equip their students for the real world (rather than so-called “jobs of the future”). One of the most real aspects of our students’ worlds is their data streams. Since I began teaching in 2005, I have never presumed students would avoid going online to complete their assignments. Today’s students don’t just “google.” They use platforms like Chegg Study and GroupMe to find quick answers to assignments. They collaborate to complete their work on cloud-based services. Sometimes their professors ask them to use social media. They are constantly creating data streams. I want to reconsider my academic responsibilities as a teacher in the real world, where the data students generate may have a greater impact than their transcripts.

Data streams have very real outcomes in the short, medium, and long term. I am suggesting that there is a new form of academic responsibility related to students’ data streams.

Nearly all my course assignments assume students use digital platforms. By “digital platforms,” I mean web-based services and tools: my institution’s LMS, cloud-based storage and word processing, library databases, internet search engines, and social media. After taking workshops with Chris Gilliard, Bill Fitzgerald, and Jade Davis,1 I am rethinking how my assignments anticipate students’ use of digital platforms in the real world.

Here are four questions asked during these workshops:

  1. What games are we playing with digital platforms?
  2. And what are the rules?
  3. Am I able to effectively communicate what I know about these games and their rules?
  4. How will I verify my students’ knowledge about these games and their rules?

I might not be prepared to teach for the real world if I cannot satisfactorily answer these questions. The experience of higher education today includes students learning skills to advance their interests while navigating a complex world. And, as with most learning experiences, students do not necessarily know yet what those interests are. For example, as Joel Reidenberg et al. reported in Privacy and Cloud Computing in Public Schools, the degree to which students will sign away their rights and give away their data is astounding. They may not be as aware as they should be of the permanence of their data streams on the web. Chris Gilliard makes this point about pedagogy and the logic of platforms:

Students are often surprised (and even angered) to learn the degree to which they are digitally redlined, surveilled, and profiled on the web and to find out that educational systems are looking to replicate many of those worst practices in the name of “efficiency,” “engagement,” or “improved outcomes.” Students don’t know any other web—or, for that matter, have any notion of a web that would be different from the one we have now.

If I don’t know how the web and the internet work (i.e., the ways platforms filter information and establish user data streams), I am likely unable to frame my teaching in ways that avoid potential harm to myself or my students. Learners need to be able to identify and value their data streams.
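One way to make a data stream visible is a small classroom demonstration. Here is a minimal sketch in Python (an illustration only, using nothing beyond the standard library; the port and the message are arbitrary placeholders, not tied to any particular course or platform): a tiny local web server that prints what a single page view hands over.

    # Classroom demo: run this locally, then visit http://localhost:8000 in a browser.
    # The terminal shows what that single page view reveals to the server:
    # the raw material of a data stream.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class DataStreamDemo(BaseHTTPRequestHandler):
        def do_GET(self):
            # Log what the browser just handed over with one request.
            print("Visitor address:", self.client_address[0])
            print("User-Agent:     ", self.headers.get("User-Agent", "(none)"))
            print("Referer:        ", self.headers.get("Referer", "(none)"))
            print("Cookies:        ", self.headers.get("Cookie", "(none)"))
            print("-" * 40)
            self.send_response(200)
            self.send_header("Content-Type", "text/plain; charset=utf-8")
            self.end_headers()
            self.wfile.write(b"Check the terminal: that page view just became part of a data stream.\n")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), DataStreamDemo).serve_forever()

Even this toy example reveals a visitor’s address, the browser’s identifying details, and the page they came from. Commercial platforms collect far more, and far more systematically.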

No digital platform is neutral. We know this because digital platforms are designed.

[Image: protester holding a sign that reads "error 155: democracy not found"]

As Nick Srnicek explains, some people somewhere designed each platform with their own very specific interests in mind. Tarleton Gillespie asks readers to understand that these “platforms” are campaigns that advertise, make promises, promote values, and make bids for leadership.2 I suspect I am doing a disservice to my students if I cannot begin to explain, in general terms, what is at stake in these games and their rules. My hunch is that I have a responsibility to ensure that students understand not only their rights and responsibilities but, even more, the liabilities and varying future outcomes of participating on digital platforms.

The point here is not to dive into ways to end the corporate creation, use, and reproduction of people’s data streams; the internet would stop working without them. My objective is to understand clearly how to teach students about their data streams, and how to narrow those streams while they are completing assignments for my courses.

[Image: data stream simulation]

Here are some concrete steps I am taking:

  • Assess the data stream risks related to each of my assignments and then broach these considerations with the class (one way to start is sketched just after this list).
  • Talk with students about whether and why they care what their friends see, and then extend their considerations into the future.
  • Discuss with students the relevant policies for online platforms at their institution.
  • Practice critical reading of a platform’s Terms of Service with the students.
  • Introduce students to the idea of a Student Collaborators’ Bill of Rights with regard to their participation in digital projects at their institution.
  • Review internet browsers with the class and compare their privacy options with those of Mozilla Firefox.
  • Explore how internet searches work in order to introduce the privacy features of DuckDuckGo.
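For the first step on this list, a possible starting point is the short Python sketch below (an illustration only; it uses the standard library, and the URL is a stand-in to be replaced with a page an assignment actually requires). It lists the third-party hosts that a single page asks the browser to contact, each of which receives a request, and often a cookie, the moment the page loads.

    # Sketch: list the third-party hosts that one web page asks a browser to contact.
    # Each of those hosts is a tributary of a student's data stream.
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    PAGE = "https://example.com/"  # stand-in; replace with a page an assignment requires

    class ResourceCollector(HTMLParser):
        """Collect the src/href URLs of scripts, images, iframes, and stylesheets."""
        def __init__(self):
            super().__init__()
            self.resources = []

        def handle_starttag(self, tag, attrs):
            if tag in ("script", "img", "iframe", "link"):
                for name, value in attrs:
                    if name in ("src", "href") and value:
                        self.resources.append(urljoin(PAGE, value))

    html = urlopen(PAGE).read().decode("utf-8", errors="replace")
    collector = ResourceCollector()
    collector.feed(html)

    first_party = urlparse(PAGE).hostname
    third_parties = sorted({urlparse(u).hostname for u in collector.resources
                            if urlparse(u).hostname and urlparse(u).hostname != first_party})

    print(f"{PAGE} asks the browser to contact {len(third_parties)} other hosts:")
    for host in third_parties:
        print(" ", host)

Running it against a few of the platforms students already use often produces a list of names worth looking up together, which connects this first step to the later items about browsers and search engines.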

What do you think?

FOOTNOTES

1. My attendance at Digital Pedagogy Lab 2018 was graciously supported by Dr. Christine Taylor (Vice President/Associate Provost for Diversity, Equity, and Inclusion) and Dr. Tricia McElroy (Associate Dean of Humanities, Arts and Sciences).

2. See Tarleton Gillespie, “The Politics of ‘Platforms’,” New Media & Society 12, no. 3 (2010): 347–364: “A term like ‘platform’ does not drop from the sky, or emerge in some organic, unfettered way from the public discussion. It is drawn from the available cultural vocabulary by stakeholders with specific aims, and carefully massaged so as to have particular resonance for particular audiences inside particular discourses. These are efforts not only to sell, convince, persuade, protect, triumph or condemn, but to make claims about what these technologies are and are not, and what should and should not be expected of them. In other words, they represent an attempt to establish the very criteria by which these technologies will be judged, built directly into the terms by which we know them. The degree to which these terms take root in the popular imagination, whether in the rhetoric of the industry or in the vocabulary of the law, is partly the result of this discursive work (Berland, 2000; Gillespie, 2006; Pfaffenberger, 1992).”