Ten Rules for 2025: Lessons from a Very Weird Year

Dec 6, 2024

2024 showed us exactly who writes the rules of our digital future – and exactly why we need to tear up their playbook. While the tech industry was busy optimising humanity out of existence, something more interesting was happening: communities were building power. 

The most expensive mistakes of 2024 weren't technical failures – they were failures to understand how resistance shapes markets, how solidarity drives innovation, and how power flows through technical systems. From the OpenAI crisis to campus protests, from API wars to picket lines, one thing became clear: tomorrow's competitive advantage isn't about having the best tools, it's about understanding how technology and collective action reshape each other. 

What follows is a short list of some of the lessons I've learned in 2024, and how I want those lessons to shape 2025. I wrote most of them in the middle of the night, sat on a beach in California about 2 minutes after I took the photo above. They might not mean anything to you, and that's OK. These rules are not for everyone; they're a personal strategic framework to help me navigate a landscape where technical innovation and political resistance have become inseparable. In essence, they are how I plan to survive and thrive in the aftermath of Silicon Valley's most hubristic year – and in the process try to build something better from its ruins.

1. Less data. More knowledge.

2024 showed us the limits of data-driven everything. The OpenAI leadership crisis and subsequent questions about AI hallucinations demonstrated that more data doesn't equal better understanding. ChatGPT's confident misstatements highlighted how raw information without theoretical frameworks and critical thinking can actively mislead. This rule is a direct response to the "data is the new oil" mentality that reached its apex with Silicon Valley's race to train ever-larger language models. More and more, it seems clear to me that real conversations with real humans are the only safe way to gain genuine insight and to really understand why things are the way they are.
This year I will: Develop frameworks before collecting data, and refuse to mistake quantification for understanding.

2. There is no problem that a library card can't solve.

2024 showed us the devastating power of knowledge destruction as a weapon of war. As Israel systematically destroyed Gaza's universities and libraries, and Western institutions silenced Palestinian academics, we witnessed how access to knowledge remains fundamentally political. Meanwhile, academic platforms like Sci-Hub faced increased legal pressure, AI companies began charging premium rates for access to their models, and the Internet Archive's controlled digital lending program collapsed. Against this backdrop, traditional libraries emerged not just as crucial democratic spaces, but as sites of resistance. From campus protests defending academic freedom to grassroots efforts to preserve and share threatened knowledge, 2024 made clear that free access to learning isn't just politically radical – it's essential for survival.
This year I will: Actively support and amplify threatened knowledge communities, starting with Palestinian academics and extending to all those fighting for open access.

3. Join a union.

2024 was the year tech workers truly mobilised. From AI researchers demanding ethical guidelines to content moderators fighting for better conditions, collective action became unavoidable. The mass layoffs at Google, Meta, and Microsoft, coupled with the acceleration of AI deployment, made it clear: individual excellence won't protect you from structural change. The WGA strike's focus on AI rights showed how crucial collective bargaining will be in shaping our technological future.
This year I will: Put my union membership at the center of my professional identity, not as an afterthought but as a core part of how I engage with clients and colleagues.

4. Be radically disciplined in the embrace of antidisciplinarity.

The limitations of pure computer science in addressing AI ethics, coupled with the growing recognition of AI's social impacts, validated the importance of crossing disciplinary boundaries. The debate around the AI pause letter highlighted how technical expertise alone can't address socio-technical challenges. We've seen how anthropologists, sociologists, and philosophers became crucial voices in tech policy discussions.
This year I will: Build research methodologies that deliberately transgress disciplinary boundaries, bringing critical theory into board rooms and ethnographic methods into tech development.

5. Decode the infrastructure, not just the interface.

2024 revealed how surface-level analysis repeatedly failed to capture deeper structural shifts. While media fixated on ChatGPT's conversational abilities, the real story lay in the massive computational infrastructure and labour networks powering it. The OpenAI board crisis wasn't just corporate drama but a fundamental clash of ideologies about AI governance. The semiconductor wars weren't just about supply chains but about who controls the physical layer of our digital future. As technical systems became more opaque, the ability to analyse underlying structures – from training data politics to hardware dependencies – became not just an academic exercise but a survival skill.
This year I will: Make infrastructure analysis a core part of every project, forcing conversations about what lies beneath the interface in every client engagement.

6. Refuse algorithmic determinism; embrace human messiness.

The rise of the e/acc movement and its clash with AI safety advocates exposed the limitations of seeing technological development as inevitable or deterministic. The growing backlash against workplace surveillance and productivity monitoring tools demonstrated the human cost of optimisation culture.
This year I will: Build failure, friction, and human judgment into my methodologies, explicitly rejecting the myth of algorithmic objectivity. 

7. Your attention is labour. Bill accordingly.

The monetisation of everything reached new heights – from Threads' launch trying to capture our social connections to AI companies harvesting creative work for training data. The creator economy's continued precarity showed why we need to be more intentional about how our attention and engagement create value for platforms.
This year I will: Create clear boundaries around my attention, implementing a pricing structure that reflects the true cost of cognitive labor.

8. Citation is a radical act of care.

As AI made proper attribution more crucial yet more challenging, intentional citation became political. The lawsuits against AI companies for uncredited training data use highlighted the importance of acknowledging intellectual lineage. The ongoing debates about AI-generated content and academic integrity made citation not just ethical but essential.
This year I will: Build comprehensive citation practices into all my work, making visible the often invisible labor of marginalized knowledge workers.

9. Choose your platforms like you choose your politics.

2024's platform upheavals – from X's ideological transformation to Reddit's commodification of community labour – showed that every platform choice is inherently political. When Meta scrubbed news from feeds in response to journalism regulations, when Discord shifted toward AI integration, when Mastodon communities debated governance models, we saw how platform architectures encode specific worldviews and power relations. Choosing where and how to engage isn't just about features or user experience – it's about actively aligning our digital presence with our political values and understanding the broader implications of our platform participation.
This year I will: Audit my platform usage against my political principles, making explicit choices about where and how I participate in digital spaces.

10. Make solidarity your metric of success.

As personal brand building reached saturation point and the creator economy showed more cracks, collective approaches gained traction. The success of digital unions, mutual aid networks, and community-owned platforms offered alternatives to individualistic metrics of success. The ongoing crisis of trust in institutional knowledge highlighted the importance of building genuine communities rather than mere audiences.
This year I will: Develop new metrics for my practice that prioritize community impact over individual visibility, measuring success through collective advancement rather than personal gain.

These rules emerge from a year that showed us how interconnected our challenges are – from AI governance to labour rights, from knowledge access to platform politics. They're not just guidelines for individual success but principles for collective navigation of an increasingly complex digital landscape.

HELLO@SUDOCULTURE.COM

THERE IS NO PROBLEM THAT A LIBRARY CARD CAN'T SOLVE.

© 2024
