Optimize algorithms to support kids online, not exploit them

Part of my work as a researcher and educator involves identifying innovative, authentic ways to embed technology in instruction. In this work, I’m excited when I work with educators who thoughtfully use technology to support learning objectives. I’m perplexed when I work with individuals who don’t see the positive opportunities that can come with tech use. And I’m always concerned by the darker, negative side of technology.

An example is my use (and promotion) of the tools in the Google Apps for Education (GAFE) suite: Google Docs, Google Slides, Google Sheets, and more, all linked together in Google Drive. This suite, together with Chromebooks in classrooms from elementary school on up, gives educators and students a good way to read, write, and connect in digital spaces. In many ways, it also cedes the future use of these texts and tools to Google. It makes these students brand users now…and in the future.

Future users of the Internet

These early users of technology are a big market for big tech. As they get used to the specific offerings of companies in this space (Google, Apple, Amazon, Facebook, Spotify, Twitter), they may prefer to stick with those products as they grow older.

More importantly, these companies (and many other unseen entities) are busy collecting data on children and using it to train machine learning engines and algorithms. They gather the digital breadcrumbs children leave behind as they interact with digital content.

All of this puts our children at risk now…and in the future.

This latest post by Joi Ito is part of his series on young people and screens. It thoughtfully addresses many of the challenges I think about daily.

Children are exposed to risks at churches, schools, malls, parks, and anywhere adults and children interact. Even when harms and abuses happen, we don’t talk about shutting down parks and churches, and we don’t exclude young people from these intergenerational spaces. We also don’t ask parents to evaluate the risks and give written permission every time their kid walks into an open commercial space like a mall or grocery store. We hold the leadership of these institutions accountable, pushing them to establish positive norms and punish abuse. As a society, we know the benefits of these institutions outweigh the harms.

I do not believe that we can entirely ban these spaces, tools, and practices from our children…nor would we want to. But we need to identify and promote our expectations for what access to “meaningful online relationships and knowledge” means now, and in their future.

SOURCE: Wired

About the author

Ian O'Byrne

Dr. W. Ian O’Byrne is an educator, researcher, & speaker. His work centers on teaching, learning, and technology. He investigates the literacy practices of individuals as they read, write, and communicate in online & hybrid spaces.


2 Comments

  • Consent is such an interesting topic, Ian. With the world ever changing, I am not sure how we best account for what we may be inadvertently giving up. Banning does not feel like the answer, but throwing our hands in the air and giving up is not much better. I aspire to a ‘good enough’ (if that is the right term) digital mindfulness, where I think I understand, but am also aware that there is most likely a lot occurring outside of my peripheral vision.

    • Hey Aaron. I really appreciate the idea of being “mindful” of technology. We cannot address all of these complexities, but we must recognize them and not put our “heads in the sand.” We need to be responsible, mindful users.
