Blog Post 6: Economics

Written by LizP

November 15, 2025

Zone 1: Truth, Disinformation, and Propaganda

I feel like a Zone 1 connection could be found throughout our modules. However, I related ideas between Zone 1 and the Module 4 article, “The Digital Divide Is a Human Rights Issue: Advancing Social Inclusion Through Social Work Advocacy” (Sanders & Scanlon, 2020). Zone 1 focuses on how technology shapes the flow of information, which includes truth, misinformation, and propaganda. The article argues that the digital divide constitutes a human rights violation because it prevents marginalized communities from fully participating in society, accessing resources, and advocating for themselves. Another connection is that propaganda typically thrives where digital literacy is low. We need equal access to digital training so that people can safely AND critically engage online. Improving digital literacy is a core strategy for resisting propaganda and developing better critical thinking skills.

Zone 2: Addiction and the Dopamine Economy

“It Was the Damn Phones” stood out to me on a very personal level because I am a Gen X parent to a Gen Z teenager. My experience growing up was entirely different from my daughter’s. “They didn’t know what it was like, having the world at the tip of our fingers. We scroll through the trash so much, we have news headlines tattooed on our skin. Wires for veins. AI for a brain. And they may not have understood. But they were right. It was the damn phones” (Spaulding, 2025). These lines were like a gut punch; my daughter has even referred to herself as a computer in the way she acts and breaks down information. Her generation is dealing with addiction and dopamine in a way that mine didn’t have to. As a second connection, Zone 2 explains how digital tools are designed to keep users returning, seeking out dopamine from every like, message, or screen movement. The poem echoes this by describing how phones have become irresistible and intrusive. Constant checking, the magnetic pull, the inability to disconnect: all reflect addiction patterns shaped by dopamine.

Zone 4: Machine Ethics and Algorithmic Biases

Going back to our previous modules, I connected Zone 4 to Frederic Reamer’s “Artificial Intelligence in Social Work: Emerging Ethical Issues” (2023) from Module 3. Starting with machine ethics, my mind goes straight to the fact that AI has no morals. When AI is used to make recommendations in social work, such as risk assessments or interventions, it relies on calculated predictors that reflect both the machine and the machine’s designer. Reamer raises concerns about misdiagnosis, unfairness, and misrepresentation that align with this thought: AI is definitely doing more than just calculating, and those decisions have a moral impact on our clients. There is also a connection between Zone 4 and this article in how AI itself is designed. Machine ethics isn’t just about whether AI is used ethically, but also about how the technology is built. “Social workers who use or are contemplating using AI should draw on these prominent guidelines and address a number of key ethical considerations related to informed consent and client autonomy; privacy and confidentiality; transparency; client misdiagnosis; client abandonment; client surveillance; plagiarism, dishonesty, fraud, and misrepresentation; algorithmic bias and unfairness; and use of evidence-based AI tools. These key ethics concepts should be reflected in ethics-informed protocols guiding social workers’ use of AI” (Reamer, 2023).

Zone 5: Surveillance State

I think Zone 5 is one of the more problematic issues for me when it comes to the use of AI. I am uneasy when I think about my own child telling their deepest secrets and issues to a bot. “We are entering a new phase in how young people relate to technology, and, as with social media, we don’t yet know how these social AI companions will shape emotional development. But we do know a few things. We know what happens when companies are given unregulated access to children without parental knowledge or consent. We know that business models centered around maximizing engagement lead to widespread addiction. We know what kids need to thrive: lots of time with friends in-person, without screens, and without adults. And we know how tech optimism and a focus on benefits today can blind people to devastating long-term harms, especially for children as they go through puberty” (McLean, 2025). I am also concerned about the connection between Zone 5 and AI companies. How are companies using your data?
“Most chatbots are not therapeutic tools. They are not designed by licensed mental health professionals, and they are not held to any clinical or ethical standards of care. There is no therapist-client confidentiality, no duty to protect users from harm, and no coverage under HIPAA, the federal law that protects health information in medical and mental health settings” (McLean, 2025). If that wasn’t disturbing enough, the article goes on: “That means anything you say to an AI companion is not legally protected. Your data may be stored, reviewed, analyzed, used to train future models, and sold through affiliates or advertisers. For example, Replika’s privacy policy keeps the door wide-open on retention: data stays ‘for only as long as necessary to fulfill the purposes we collected it for.’ And Character.ai’s privacy policy says, ‘We may disclose personal information to advertising and analytics providers in connection with the provision of tailored advertising,’ and ‘We disclose information to our affiliates and subsidiaries, who may use the information we disclose in a manner consistent with this Policy’” (McLean, 2025). I find this information absolutely horrifying, yet I also find myself in a position where I see that technology can be extremely helpful in social work.

McLean, M. (2025, August 6). First we gave AI our tasks. Now we’re giving it our hearts. AfterBabel. https://www.afterbabel.com/p/ai-emotional-offloading

Reamer, F. G. (2023). Artificial intelligence in social work: Emerging ethical issues. Journal of Social Work Values and Ethics, 20(2), 47–60. https://jswve.org/download/2023-20-2-reamer-ai/

Sanders, C. K., & Scanlon, E. (2020). The digital divide is a human rights issue: Advancing social inclusion through social work advocacy. Journal of Human Rights and Social Work, 6(2), 130–143. https://doi.org/10.1007/s41134-020-00147-9

Spaulding, K. J. [Kori Jane Spaulding]. (2025, June 11). It was the damn phones [Video]. YouTube. https://www.youtube.com/watch?v=ouxed-5uxDM&t=5s

2 Comments

  1. KimBee

    This was a great post! I really like how you use the example of propaganda thriving in areas that lack digital literacy. Literacy, awareness, and fact-checking seem to be very important in this digital era, especially for social workers. Your reflections effectively connect the Ethical OS risk zones to core social work concerns and assigned readings, showing how these abstract technology risks have a personal and practical presence in the social work field. As you have highlighted, marginalized communities hit by the digital divide often miss out on opportunities and become vulnerable to manipulation by propaganda. The powerful lines from Spaulding’s poem capture how “wired” and compulsively engaged today’s youth are, often at the cost of mental well-being and social connection. I can relate to cell phones and social media too, and I do not even have a teenager. I am trying to safeguard my daughter and prevent the impact that this digital society could have on her. The risk of algorithmic unfairness and misdiagnosis in social work interventions means practitioners must not “outsource” ethical judgment. Your concerns regarding Zone 5 (Surveillance State) and the lack of confidentiality with AI “companions” speak to a critical discomfort many social workers share. There is no moral grounding or human aspect in an AI “companion.” Across all these zones, your post summarizes why social workers must develop not only technical and digital skills but also critical thinking skills and critical habits of mind.

  2. allisonganz

    Hi Liz! I wrote about Zone 5 as well and I completely agree that the surveillance piece is honestly the most unsettling part of all this. I get really uneasy too, thinking about kids using chatbots with basically no real guardrails in place. And you’re so right, the scariest part is that mental health professionals had absolutely no role in designing these tools, yet kids are turning to them for support. They’re not held to any ethical standards, and on top of that, these companies can pretty much use the data however they want. It’s a huge red flag, and it really shows why we need social workers involved in shaping policies around this stuff.
