Ethical OS and constructing the future

Written by aoconnor1

November 15, 2025

Blog Post 6: Ethical OS and constructing the future

Zone 2: Addiction and the Dopamine Economy

Risk Zone 2 calls for tech designers, as well as users, to consider and understand how technology affects human interaction and our physical and mental selves. My mind first goes back to watching “The Social Dilemma” and the youth in the documentary acknowledging how much they rely on social media in their day-to-day and social lives; for them it was a given and a coveted part of life. I also think of the bullying and its real-life consequences: the child who was being bullied, his mother describing the shift in his behavior, and those who died by suicide as a result. Secondly, the poem “It Was The Damn Phones” by Kori Jane Spaulding highlighted addiction and this risk zone through her own reckoning with cell phones and doomscrolling. Both works point to our increasing use of technology, how it shapes the way we interact, and the risk of overuse.

Zone 4: Machine Ethics and Algorithmic Biases

Risk Zone 4 calls for acknowledging the biases we bring into technology, which influence both what gets built and what media the algorithm serves us. This references the “Can AI Go Rogue?” podcast episode, which discusses how AI is not inherently “bad” but operates off what is input and adapts to the new information users give it, returning more of what fits the user. It also connects to Craig’s “Adapting Clinical Skills to Telehealth: Applications of Affirmative Cognitive‑Behavioral Therapy with LGBTQ+ Youth” and how technology can be adapted to meet people where they are through telehealth appointments. It is the role of social workers to acknowledge the bias we bring into our own work and to ask for transparency from tech designers.

Zone 6: Data Control and Monetization

This risk zone requires tech designers, as well as ourselves, to look clearly at how our data is gathered, controlled, and accessed, and at how it is ultimately used. In the “Your Undivided Attention” podcast episode, the guest from Hinge made an effort to explain that, yes, the app is using your data, but to help daters get “off the app”; ultimately, though, the app has a monetary interest in attracting users. In Reamer’s article “Artificial Intelligence in Social Work: Emerging Ethical Issues,” he addresses clients’ autonomy and the need for explicit consent, which highlights this risk zone and the social worker’s role.

Zone 7: Implicit Trust and User Understanding

Zone 7 asks whether designers and users share a universal code, with our implicit trust too often taken as a given. In the “Your Undivided Attention” podcast episode, the guest Sherry Turkle wraps up with advocacy for “the rules of the road” for AI and tech designers, which is an example of such a universal code. It is also highlighted in “Digital Transformation in Social Work: Integrating Technology for Enhanced Practice, Outreach, and Education,” with its attention to informed consent and to being literate in the digital world.



1 Comment

  1. LizP

    First, I feel like your photo really encompasses the blog post! Second, “Risk Zone 4 calls for acknowledging the biases we bring into technology, which influence both what gets built and what media the algorithm serves us,” is such a scary bit of information. I have learned so much in this class about creator bias! I actually never even considered it before because I thought that everything was simply driven by entering code to produce data. I just didn’t think about how a program could reflect the beliefs, assumptions, and perspectives of the person designing and building the technology.
