Included is a link to an episode of the "Your Undivided Attention" podcast where the guest is Taiwan's Digital Minister, Audrey Tang. She discusses the progress and successes that Taiwan has made in producing *digital democracy* software, implementing *radically open transparency* at the government level, and building suggestion engines whose underlying dynamics combat *political polarization*. Since the discussions up until this point have been desperately lacking in working examples, I hope to move the discussion out of the speculative with this offering: https://your-undivided-attention.simplecast.com/episodes/the-listening-society-yZ1PBlPF
A year or so ago, Nick and I were discussing what could be done to incentivize individuals to engage others across their ideological/political boundaries. While Nick favored implementing a *richer* notion of *moderator* into debate platforms, I aimed to change the underlying dynamics of our suggestion engines. Today's suggestion engines are known to facilitate silo-ing by identifying others like oneself and offering the individual more of the same, and a good deal of the literature supports the observation that iterating on this process leads to dense, delta-like concentrations of what an individual tolerates or believes. An approach I find actionable is to extend the suggestion process to model individual tolerances, suggest content at individual tolerance boundaries, and incentivize the extension of those boundaries.

Tang explains that in Taiwan they built suggestion engines that promote content more when it is agreed upon by individuals who typically disagree. While I cannot speak to the efficacy of this approach, I am happy to see similarly dynamical attempts to solve the problem. For those who sat through the 3-hour antitrust Senate hearings, it is clear that without such a sophisticated approach, the government will attempt to solve the problem by demanding *case-by-case* that *such-and-such* result be *more fair*. Additionally, for those of us in the tech business, it is clear that such platforms are capable of what they do exactly because they are automated. Hiring 100,000 individuals to moderate Facebook is simply not a solution, and to Nick's point, especially not a solution under the current poverty-stricken conception of *moderation*. This means that solutions will need to be implemented at a systems level, by studying the dynamics which arise from a platform's actionable behaviors and policies.
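The promote-what-opponents-agree-on dynamic Tang describes can be sketched in a few lines. What follows is my own toy illustration, not Taiwan's actual implementation (which Tang attributes to the Polis platform): score an item higher when the users who endorse it normally vote differently from one another. The `votes` structure, the function names, and the data are all hypothetical.

```python
from itertools import combinations

def similarity(a, b):
    """Fraction of co-voted items on which two users cast the same vote."""
    shared = [i for i in a if i in b]
    if not shared:
        return 0.0
    return sum(a[i] == b[i] for i in shared) / len(shared)

def bridging_score(item, votes):
    """Score an item by how much its supporters normally disagree.

    votes maps user -> {item: +1 (approve) or -1 (reject)}.  An item
    endorsed only by like-minded users scores near 0; an item endorsed
    by users who vote differently elsewhere scores near 1.
    """
    supporters = [u for u, v in votes.items() if v.get(item) == 1]
    if len(supporters) < 2:
        return 0.0
    pairs = list(combinations(supporters, 2))
    # Weight each supporting pair by how dissimilar the two users are overall.
    return sum(1 - similarity(votes[a], votes[b]) for a, b in pairs) / len(pairs)

# Toy data: A and B disagree on items "1" and "2" but both approve "x";
# C and D agree on everything, including "y".
votes = {
    "A": {"1": 1, "2": 1, "x": 1},
    "B": {"1": -1, "2": -1, "x": 1},
    "C": {"1": 1, "2": 1, "y": 1},
    "D": {"1": 1, "2": 1, "y": 1},
}
# bridging_score("x", votes) > bridging_score("y", votes): "x" is the
# cross-cutting item, so a ranker like this would promote it first.
```

Note the contrast with a conventional engagement ranker, which counts approvals without asking who they come from: here "x" and "y" have the same number of supporters, but only "x" bridges a divide.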
I am thankful that some nations are taking the problem seriously enough to take action and that soon we may have working examples of *digital democracy* at scale.

--
FRIAM Applied Complexity Group listserv
Zoom Fridays 9:30a-12p Mtn GMT-6 bit.ly/virtualfriam
un/subscribe http://redfish.com/mailman/listinfo/friam_redfish.com
archives: http://friam.471366.n2.nabble.com/
FRIAM-COMIC http://friam-comic.blogspot.com/
