A Proposal to Jack & Zuck: Chris Anderson (ex-WIRED) & James Currier on Regulating Social Media
In this episode, James Currier and Chris Anderson delve into the complexities of regulating social media, discussing the role of likes, extreme content, and the potential impact of nationalization on large companies like Facebook. They also explore the challenges faced by rapidly growing firms like Robinhood and address root algorithmic causes in tech companies.
Key Points
- Regulating large tech companies requires understanding their network effects and the societal impact of their algorithms, which traditional methods of governance like antitrust laws may not effectively address.
- An ombudsman model, adapted from traditional media, could provide an independent, proactive oversight mechanism for tech companies, ensuring they address the societal implications of their content and algorithms.
- Tech company leaders need to evolve from focusing solely on growth and profit to embracing stewardship and responsibility for the social and public implications of their platforms.
The usual Silicon Valley response to regulation is to offer self-regulation instead, arguing that only tech companies have the skills and speed to fix what they broke. Yet those same companies have proven unable to manage the influence over the public sphere they attained almost by accident.
Unless the few who know how to actually build these tech networks speak out, it won't just be technology that suffers - it will be the entire world. Former Editor-in-Chief of WIRED magazine Chris Anderson and NFX Partner James Currier, who has built, funded, and analyzed "unicorn" social networks for 25 years, formulate a solution that resurrects a concept from the golden age of newspapers - the ombudsman.
The ombudsman is an important node that proved itself in traditional media networks and is still missing from today's social networks. Listen as Chris and James outline why these network problems require network solutions, here on the NFX Podcast.