User:Halfak (WMF)/Human tech


To me, human tech means developing advanced digital technologies in support of human processes, without the naive attempt to replace humans entirely. Human tech acknowledges the complex and intertwined socio-technical nature of our environment and embraces it. When we consider introducing a new digital technology into a human-led process, we ought to ask a series of questions to make sure that the technology will actually be useful.

Does it address a systemic need?
Often we can see a clear way to address an individual's need or an imagined need (we can be very creative), but understanding a real infrastructural gap in a larger system is very difficult and requires a lot of in-context exploration. Take, for example, new page patrol on English Wikipedia. In order to understand what is going on with this curation infrastructure and what it would take to perform a successful technological intervention, you'd need to understand the history and the people involved in reviewing new pages. IMO, building exactly what the patrollers want would be either impossible or inherently problematic. It's only in exploring other parts of the content creation system that the roles of WikiProject organizers and subject matter experts become apparent. Through analysis, the basic assumptions of the dominant group of patrollers can be challenged. Through interviews with community organizers, their potential role in new page curation becomes clear. I could go on and on about the type of gap represented here. In the long run, we need to build theory about the system, where and how gaps form, and how digital technologies can be used as tools by real people to fill those gaps.
Does it shift burden?
Our volunteers are overloaded. If we can take some of the load off of them, they'll have more time to work on other things. While many do find the basic tasks of maintaining a large content repository rewarding, others want to be able to focus on more directly productive work. Further, there are costs to doing many types of tasks slowly. Backlogs have input and output speeds: if the input speed is greater than the output speed, a backlog may spiral out of control. This is a very stressful situation and results in bad feelings all around. There's an incentive in this situation to re-adjust the burden. But by making the work of the primary curators faster (through some simplification), we may cause serious problems for other parts of the system/community. E.g., when Wikipedians used machine learning models and templates to make their quality control practices more efficient, they shifted the burden of dealing with good-faith newcomers. While it originally appeared as though all was well and the system was more efficient, it took years to discover that a more serious problem related to retention and diversity was growing unchecked. Understanding the system surrounding a specific process can help us make sure that we're not just offloading work that humans in group A did onto humans in group B.
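To make the backlog dynamics concrete, here's a minimal sketch. The rates are made up for illustration; they are not measured new page patrol figures:

```python
# Minimal sketch of backlog dynamics with illustrative (made-up) rates.
# If input_rate > output_rate, the backlog grows without bound.

def project_backlog(initial, input_rate, output_rate, days):
    """Project a backlog's size given per-day input and output rates."""
    backlog = initial
    for _ in range(days):
        backlog = max(0, backlog + input_rate - output_rate)
    return backlog

# Hypothetical numbers: 700 pages/day created, 650 pages/day reviewed --
# a modest 50 page/day deficit.
print(project_backlog(initial=1000, input_rate=700, output_rate=650, days=90))
# -> 5500: a small, steady deficit compounds into a crushing queue.
```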
What bias is perpetuated and how do we mitigate it?
Bias and inequalities creep into everything that we do. English as the primary language, with everything else a translation? Non-native English speakers are disempowered. High system requirements? People who can't afford the latest hardware will not be able to work as effectively. Training a model on past behavior? Whatever peculiarities were present are now reinforced by the decision support that the model provides. Worse, in the case of advanced technologies based on a sufficiently complex modeling process, heuristics aren't enough to make sure we don't introduce problematic biases. Instead, we should see our users as an opportunity: they can discover, describe, and help address the biases that we, the developers, cannot see.
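One concrete practice that supports this: routinely compare a model's error rates across user groups rather than trusting a single aggregate accuracy number. Here's a minimal sketch with entirely synthetic records; the group labels and outcomes are hypothetical:

```python
# Sketch: compare a model's false-positive rate across user groups.
# The records here are synthetic; in practice they would come from
# labeled data gathered with the community's help.

from collections import defaultdict

# (group, model_flagged_as_damaging, actually_damaging)
records = [
    ("anonymous", True, False), ("anonymous", True, True),
    ("anonymous", True, False), ("anonymous", False, False),
    ("registered", False, False), ("registered", True, True),
    ("registered", False, False), ("registered", False, False),
]

false_positives = defaultdict(int)
negatives = defaultdict(int)
for group, flagged, damaging in records:
    if not damaging:  # only good edits can be falsely flagged
        negatives[group] += 1
        if flagged:
            false_positives[group] += 1

for group in negatives:
    rate = false_positives[group] / negatives[group]
    print(f"{group}: false positive rate = {rate:.0%}")
# anonymous: false positive rate = 67%
# registered: false positive rate = 0%
# A gap like this is exactly the kind of bias an aggregate metric hides.
```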
How does the technology adjust?
We operate in a computer-mediated space. When we release a technology, we're turning a set of rules into a hard reality for those who occupy that space. In our world, the wiki world, rules need to change. Consensus changes. How will the technology adapt to the new needs of the community of people who use it and are affected by it? We can approach this in several ways.
  • Adaptation at the application level (tool developers): By providing APIs, datasets, and other basic, machine-readable resources to volunteer tool developers, we enable them to experiment with new technologies (new rules) for supporting collaborative work. While this is imperfect due to limitations on how much volunteers can be expected to iterate upon and maintain tools that become critical infrastructure, it has historically been an under-appreciated source of innovation. Tool developers are often users themselves, and they have deeper insight into the immediate infrastructural needs of our community than we could ever have. (See the first sketch after this list.)
  • Curated artifacts as part of the technology: By allowing the technology to adjust (to be configured in a basic way) based on changes to a commonly editable artifact, we allow users to make substantial adjustments to that technology as needs and consensus change. See AbuseFilter and the community of people who maintain its rules. See also the work that the Scoring Platform team is doing on modeling "draft topic", using the WikiProject Directory (a community-edited artifact) as the basis for the target classes. (See the second sketch after this list.)
  • Sustained product team to make adjustments: Sometimes, the best way forward is to maintain an expert product team who can adjust/extend a technology on demand. This needs to be a "product team" because identifying and prioritizing necessary changes to the technology is part of a product development process that requires the varied expertise that product teams have.
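As a first sketch, here's roughly what the application-level approach looks like from a volunteer tool developer's seat: pulling machine-readable predictions from the Scoring Platform team's ORES service. The revision ID is made up, and the response structure shown is an assumption; consult the ORES documentation for the live endpoint and schema:

```python
# Sketch: a volunteer tool pulling machine-readable edit quality scores
# from the ORES web service. The revision ID is hypothetical, and the
# response structure is an assumption -- check the ORES docs.

import requests

def damaging_probability(wiki, rev_id):
    """Ask ORES how likely a revision is to be damaging."""
    url = f"https://ores.wikimedia.org/v3/scores/{wiki}/{rev_id}/damaging"
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    score = response.json()[wiki]["scores"][str(rev_id)]["damaging"]["score"]
    return score["probability"]["true"]

# A patrolling tool might use this to triage its review queue:
print(damaging_probability("enwiki", 123456789))  # hypothetical revision ID
```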
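And as a second sketch, the curated-artifact approach: a tool re-reads a community-editable wiki page as its configuration, so editors can change its behavior with an ordinary edit rather than a deployment. The page title and one-rule-per-line format here are hypothetical; real systems such as AbuseFilter define their own rule languages and storage:

```python
# Sketch: treat a community-editable wiki page as the tool's configuration.
# The page title and line format are hypothetical; real systems such as
# AbuseFilter define their own rule languages and storage.

import requests

def load_config_page(title):
    """Fetch the current wikitext of a page via the MediaWiki API."""
    response = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={
            "action": "query", "prop": "revisions", "rvprop": "content",
            "rvslots": "main", "titles": title,
            "format": "json", "formatversion": "2",
        },
        timeout=10,
    )
    response.raise_for_status()
    page = response.json()["query"]["pages"][0]
    return page["revisions"][0]["slots"]["main"]["content"]

# One rule per "*"-prefixed line; consensus changes land as ordinary edits.
wikitext = load_config_page("Project:Example tool/Rules")  # hypothetical page
rules = [line[1:].strip() for line in wikitext.splitlines() if line.startswith("*")]
```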

Human tech is about building tools that help humans collaborate better. A tool does not remove agency from the humans who use it, but it does strongly direct their behavior. The design of tools that support and are refined by humans is an attempt to make technology work in a human context rather than trying to make humans work in a technological context. Wikimedia is uniquely situated to take advantage of the potential of human/machine collaboration: (1) we have curation problems at great scale, (2) we have a mature, collaborative community, and (3) we have hired some of the foremost experts on wiki processes with experience developing advanced technologies.