The Lies of Empire are "neither neutral, nor coincidental". Log off and Organize, baby.
We've reached an undeniable point in our social awareness: our understanding of computer hackers is as good an avatar as any for how our media culture has failed our civilization.
“Most of the evil in history is perpetrated not by lunatics or monsters but by individuals of responsibility and commitment, whose most unsettling aspect is the apparent normality of their deportment,” wrote Michael Parenti in History as Mystery. We see this play out all the time, along with a seeming willingness to treat any nuanced doubt of the official narrative as conspiratorial or somehow nefarious. This holds from the provable, such as the CIA’s involvement in worldwide regime change, the corruption at high levels of world governments, or the extreme autocracy of multinational corporations, to the easily understood but difficult (artificially impossible, in some cases) requirements of proof in social justice around things like police brutality, imperialism, and armed occupation. The reasons any society lays down for authoritarianism are well established, and the idea of an Overton Window is well known, but more interesting are the fantasies of what average citizens wish they could do about it. In the modern era, one of the more common narratives is the mythology of hacking. From Sneakers to Hackers to biopics like Underground: The Julian Assange Story, the idea persists that a sufficiently motivated citizen can expose the status quo for the fraud that it is. Yet when it plays out in real life, as it has in the case of Assange (whose own conduct is worthy of scrutiny, and who was possibly his own worst enemy in fighting a PR battle the status quo was infinitely more prepared to fight), his actions draw the ire of those who worship the impoverished, jingoistic imagery upholding the belief that if the United States, major banks, etc. had anything to hide, they probably had a good reason.
I am not going to make a case for the character of Julian Assange himself, because, frankly, it’s immaterial in considering what his prosecution represents; it should not be playing out this way in any rational world, given the duality of responses from media to reality. The prosecution of a journalist is a human rights violation by many measures, it asserts that the state and military can act unaccountably in pursuit of violent, repressive occupation many times over, and the media has played a role in centering his character rather than the import of those revelations. What is important, however, is how and why this is happening, and there’s no starker example of this, as I’ve said, than how our culture consumes its understanding of technology and its deep entanglement with the state.
“[O]rthodoxy can rest on its own unstated axioms and mystifications, remaining heedless of marginalized critics who are denied a means of reaching mass audiences. Orthodoxy promotes its views through the unexamined repetition that comes with monopoly control of the major communication and educational systems. In sum, while dissidents can make mistakes of their own, they are less likely to go unchallenged for it.” says Parenti, and this is something to keep in mind: why does empire get the benefit of the doubt? Why does the official narrative, despite debunking and well-sourced counternarratives much of the time, remain canonical beyond the first unchallenged telling? We can look at this a few different ways, but the dynamic is dramatically on display in how we portray cybercrime: something well understood conceptually, but so heavily politicized that it’s easy to fail to corroborate even the most openly bogus narratives, relying instead on the exploitable, preconceived biases of an audience of critical observers who are qualified but have been gaslit into believing they are not (i.e., about their own technical proficiency).
Computers are not infallible, but they are a tool wielded only as responsibly and effectively as the wielder’s understanding allows. Most “hacking” (I’ll use this informal, popularly understood misnomer, in its security context, for the sake of clarity) is terribly mundane and rarely fast-paced; it’s a reflection of one’s ability to understand and apply a cognitive framework, like any other analytical discipline. You might spend a few minutes writing a script, for example, to surface some potential vector for exploitation, while sufficiently randomizing those requests to avoid tripping some kind of alarm about the number of failed login attempts from a given location or against a given protocol. The script itself would take hours to run unattended, but it demonstrates the sort of thinking involved in a real-world application of these skills. I mention this because of how most leaks occur: by accident, through random probing, through misconfigurations. However it happens, a lot of the time it’s left to chance. In the film Underground, the teenage Assange (Mendax) and his two friends penetrate the United States military network, and much of the manhunt for the trio centers not on the material he surfaces proving Gulf War-era civilian targeting, nor on whether any of this is bad for society, but on the real security threat as the AFP saw it: that he could leak it.
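The mundane, patient scripting described above can be sketched in a few lines. This is a minimal illustration only, under assumptions of my own: the `probe` and `jittered_delays` names are hypothetical, and the abstract `check` callable stands in for whatever test you'd run against a target; there are no real network calls here, just the pacing-and-randomization pattern itself.

```python
import random
import time

def jittered_delays(n, base_delay=5.0, jitter=3.0, seed=None):
    """Inter-request delays of base_delay plus random jitter, so attempts
    don't arrive at a fixed cadence that a rate-limit alarm could fingerprint."""
    rng = random.Random(seed)
    return [base_delay + rng.uniform(0.0, jitter) for _ in range(n)]

def probe(targets, check, base_delay=5.0, jitter=3.0, seed=None):
    """Run `check` (any callable returning True on a hit) against each target
    in shuffled order, sleeping a jittered interval between attempts.
    Hypothetical helper for illustration; `check` does the actual work."""
    order = list(targets)
    random.Random(seed).shuffle(order)  # randomize order, not just timing
    hits = []
    for target, delay in zip(order, jittered_delays(len(order), base_delay, jitter, seed)):
        if check(target):
            hits.append(target)
        time.sleep(delay)
    return hits
```

The point is how unglamorous this is: a few minutes of writing, hours of unattended running, and the interesting part is entirely the thinking that chose what `check` tests and why the pacing matters.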
Whatever your opinion of Assange, the real-world response to Wikileaks (I believe the film elected to be less than unrelentingly straightforward about his teenage activity in order to tell a larger truth about what would come later) is an example of exactly this dynamic: they could prove wrongdoing, but the only wrongdoing that got prosecuted was that it was leaked at all, and the question now litigated is whether Assange, in his capacity as a journalist, had any right to inform the public of what a supposedly democratic coalition was doing in the countries it occupies. In the realm of fiction, we see this play out as it might rationally: in Hackers, The Plague and his accomplice are held accountable for the fraud (and potential ecological disaster used as leverage) Zero Cool & Co. expose; in Sneakers, Robert Redford’s band of security professionals takes on the NSA. These are situations where a party assumes accountability, and the public becomes aware of a wrong that has been, in part, righted. In the real world, the lines between the moral, the legal, and the correct are blurred to the point of indistinguishability, if you ask the average citizen.
This is, perhaps, understandable: ours is a society built on the deep propagandization of even the most educated. It’s conditioning, it’s willful, and getting out from under it is a challenge, as it’s meant to be; even those most willing to step outside of this and challenge their biases still maintain deep blind spots. (I suspect this is, for example, why Assange’s defense is so hard to develop: there are either true believers, who refuse to see wrongdoing in any aspect of his life given the ostensible but unrelated good, or those who vilify him as a traitor regardless of what he was alleged to have leaked.) This is the same with men like Edward Snowden, whose own whistleblowing was the product of poking around in the NSA’s Sharepoint and eventually being exposed to the troubling information with which we are acquainted today. The problem? Despite the proof, many either deny its import or deny its accuracy, and impugn his character and motives for having done so, despite the documents being objective, documentary evidence of the claims. Does it matter if he, himself, had an ulterior motive? I don’t believe he did, but I also don’t believe it would matter if he did. Whether or not he was allowed to do these things, some sort of limited hangout, is beside the point as well: it’s out there, and the public’s response was an effective gauge of how effective any resistance to the state might be.
The flip side to this, of course, is the “useful idiots”: they mean well, but ultimately their definition of hacktivism isn’t to make social justice from the carcass of an exposed failed state, but to reform the status quo and become a part of its power superstructure in some (ultimately contradictory) anodyne way. Joseph Menn’s book about the Cult of the Dead Cow comes to mind: the hackers in question have good intentions, but ultimately, in collaborating with the state, they bear some responsibility for the information culture that made the 2016 election a nexus for the worst of Internet culture, weaponized against the public. Anti-patterns became the norm, and this was because these well-intentioned but ultimately arrogant hackers got played by the global elite class into proliferating conditions favorable only to that class. Their response? Well, to work with and for the feds, after becoming the problem (almost all the principals in the story are now big tech leaders, many in FAANG), after first warning of the perilous future.
Guys like Assange are hardly any cleaner, but that’s beside the point: the nature of the contribution matters, and the results here speak for themselves. We see these men as judicious for working with what is, by their own admission, a corrupted and arrogant state, and they arrogantly believe that they themselves can technocratically reform it. Their approach is rank libertarianism; it seeks to portray itself as benevolent, but just as these men had to learn that there are no benevolent corporations, a corporatist technocracy cannot be truly benevolent either. Post-2016, their response (that of tech writ large; I don’t pretend to think the actions of these individuals exist in a vacuum, or that the institutional malice of technology companies is shared by these, again, probably well-meaning people out of their depth ideologically) has been to take the same approach to solving the problem that they took to causing it: not grassroots organizing, but working across the class prerogative of now comparatively wealthy and influential peers.
I had the occasion to watch The Social Dilemma recently, and absolutely hated it. It is the most extreme version of the phenomenon I’ve described above, with the added vile element of seeking to wash one’s hands of responsibility for what social media (as an extension of the technology used to propagate it) has wrought, which far exceeds the technologists’ intellectual grasp of how thoroughly exploitable it is by the truly evil. My review of the film was as follows:
10 minutes into “The Social Dilemma”: “the problem” is capitalism, and the reason that, with no absence of ideas, nothing happened at Google as a result of this “revolution” is that workers in tech aren’t unionizing. This is weird propaganda trying to pin hopes on benevolent “good technologists”.
It correctly identifies surveillance capitalism but doesn’t acknowledge this was always the logical outcome of all these behaviors and tolerances for it from entrepreneurs and bogus libertarian ideals in tech. Doesn’t acknowledge we’re also labor being alienated from our labor.
This mindset is exemplified when he makes a good metaphor for who succeeds in tech while speaking about himself: he talks about astonishing PhDs as a 5-year-old magician, now a grown man who never considered that he was being humored by adults and maybe wasn’t the delicate genius he considered himself. He comments that bicycles didn’t ruin society, but they absolutely did generate that kind of hysteria, for example; social media technology isn’t so singular an innovation, nor are its engineers so singular, that existing labor understanding stops applying. This is propagandistic.
It goes on to make this case based on the (demonstrably false) infinite complexity of these FAANG datacenters, the implication being that only they are capable of regulating themselves. These are EXECUTIVES washing their hands of their role in SOFT AI, not algorithms run amok. It’s THEIR bias.
One of the talking heads, a FB investor, keeps saying it’d be an effective tool for a dictator, but we’re ALREADY living under corporatist authoritarianism in the US without tech, and now worldwide as a function of imperialism. Tech CEOs ARE the dictators of these nation-states. Jaron Lanier is the only interviewee even remotely getting close to the point of any of this, and also the only one without a stake in these companies’ success.
Ends with a compelling case for regulation, falling short of nationalization, bc the capitalists realize it’s the end; this is the last chance to protect their interests lest we delineate further between billionaire/not and realize there are ultimately no protections for the working class. The problem is NOT that this is an inhumane application of a benevolent system; it’s an inhumane system that, once corrupted, has these untenable consequences. It’s a feature, and this documentary is intent on asserting otherwise to protect the VC class interest of Silicon Valley.
As Menn says in the book on CdC, “If the combination of mindless, profit-seeking algorithms, dedicated geopolitical adversaries, and corrupt US opportunists over the past few years has taught us anything, it is that serious applied thinking is a form of critical infrastructure. The best hackers are masters of applied thinking, and we cannot afford to ignore them.” This is a beseechment, in practical terms, to the hackers as much as to society: they don’t need to be your technocrats, but they don’t need to be marginalized either. It is, whether Menn intends it this way or not, the best framework for forging a more skeptical, analytical world: using these tools, not being personified (or, as is more often the case, dehumanized) through them. The solution isn’t the tech, or reforming it, but what role it can play in a solution. More and more, the question being asked is whether there’s a role for it at all. The conceit of The Social Dilemma is precisely what I’m describing: those most guilty for our present state of affairs, empowered by the prevailing corporatism of our economic system to be the most influential while being the least productive and demonstrably less successful than entrepreneurs in other areas, believe they are entitled to a second chance at fixing it.
What all of these narratives teach us is precisely this: it’s a means, not the end, to look toward technology to solve (or to ascribe as the cause, rather than a particularly virulent symptom of) these systemic problems. A call to civic duty is a good impulse, and a desire to join the state in protecting the public good is also a good impulse, but what we’ve seen play out in reality versus fiction is that the public good is defined by the state despite demonstrable harm to the public; those who act in the interest of the public are ostracized as terroristic threats, and those who embolden state violence are compensated by our economic system and propelled upwards. As Dug Song says in the CdC book, “Security is about how you configure power, and who has access to what.” For a narrative that misses this point so badly, that is an incredibly powerful insight. The solution they arrive at, sadly, is to look among their own alumni ranks: then-US Senate candidate Beto O’Rourke (who the book reveals was a teenage script kiddie in the group) as their man inside the government, and then tapping their own professional networks, which is already problematic for the same reasons I detailed in the The Social Dilemma review. It’s arrogantly flirting with a model of government they’ve already done entirely too much for by accident; I shudder to think how bad things could be if they did it on purpose, when you consider that their political spectrum seems to range from vapid libertarianism to neoconservative-lite Democratic Party politicians wokescolding radicals into reining it in.
In the meantime, Julian Assange and Edward Snowden, two of the men responsible for large disclosures revealing true and world-harming corruption, live (isolated in a UK prison, Belmarsh, known for its poor conditions and, particularly in the last couple of years, inmate deaths, and in exile in Russia, respectively) in a state of perpetual disavowal, even by their “own kind”: perhaps not in words, but in action, willing to assist the state, willing to reinvigorate the corporatism that has plagued the United States for decades and that Big Tech™, FAANG, etc. has only accelerated. We’re asked to consider their motives, which may or may not be pure or without self-interest (whatever your opinion of them, and of similar whistleblower political prisoners like Reality Winner), without considering the objective merit of the material, which clearly served neither of their interests to release, only that of the public interest. Michael Parenti in Dirty Truths suggests, “Often the term ‘conspiracy’ is applied dismissively whenever one suggests that people who occupy positions of political and economic power are consciously dedicated to advancing their elite interests. Even when they openly profess their designs, there are those who deny that intent is involved.” That is as good an explanation as any for how, amidst capitalism-driven corporatist rule of the government, we rationalize, as a society, the morality of a fictional narrative, but castigate, without challenging our biases, its real-world equivalent, often facing the same ethical concern. What was just in fiction is conspiratorial and indefensible in reality, and this division of reactions is partially explained by Parenti’s observation; of course, those who speak to this truth are accused of “imagining a conspiracy because [they] ascribed self-interested collusion to powerful people.”
Again, in Against Empire, Parenti identifies the primary issue at play here, about the state aggression against the public in asserting a Republic when the corporatist authoritarian wing of capitalists (in this case, in tech, among the most unjustly wealthy) is not only in control, but controlling the narratives as well: “The diseconomies of capitalism are treated as the public's responsibility. Corporate America skims the cream and leaves the bill for us to pay, then boasts about how productive and efficient it is and complains about our wasteful government.” Consider how media like these are pushed on us as objective documentary evidence, how the corporate media treats proclamations from tech companies, people of perceived intellectual import (consider, even, the problems of trusting Elon Musk as any kind of authority), as irrefutable, benevolently shared fact.
The problem in finding a solution is actually quite simple, and the thing hackers, in fiction and reality, get right is that there needs to be organization, collective action, and a coherent ideology. Technologists working for these companies must likewise organize, and there is quite a bit to “unlearn” about our world that, again, arrogance prevailing, many technologists do not believe applies to them as workers. A common trope among businesses, but tech businesses in particular, is that of the philosopher king: the genius who built the thing, and now is a billionaire for having built the thing. The reality is that very few such people, notably among them guys like Elon Musk or the rest of the so-called PayPal Mafia, the founders of Apple, Microsoft, etc., were foundational in building the products that enriched them. Even if they had been, justifying vast wealth, in the billions, while workers either make a minuscule fraction of it or work in horrifying conditions, while still, today, doing all of the labor, is textbook alienation of workers from the product of their labor.
There’s a persistent belief in technology that perks and benefits are the same thing: that free meals (which, in actuality, exist to encourage more time spent at the office, rather than being, as the pretense goes, an exchange or a form of compensation), for example, are the same as comprehensive health care, or as PTO policies that are not punitively enforced (if there is no limit, the limit is actually very low, and the cultural cost of using it very high). Another persistent belief is that because engineers, for example, the minority of employees at these vast organizations, are well paid, tech workers have no reason to organize in their own interests. This is, again, not true.
A common issue I’ve increasingly seen become a point of contention between technologists and their employers in recent years, and one that is eminently, demonstrably solvable through organizing, is the notion of ethical application of technology. Consider the example of GitHub’s contract with ICE: employees have been left with the decision to do the work or leave, despite nominally being heard by management, which has since doubled down on its intention to fulfill the contract despite all we know about what goes on in ICE detention centers and what ICE/CBP’s practices are out in the field. The limited hangout of Google’s Project Maven, halted by leadership after workers protested, similarly had the effect of showing that organizing works, but also that without formalizing that organization, the company can proceed down other avenues to continue assisting law enforcement, irrespective of the ethical implications. Ditto Microsoft, ditto SpaceX, ditto any tech company that meaningfully services the goals of an imperialist, fascistic state. If it is not your product itself that is at issue (a difference from a company like Palantir, which makes a product many argue exists only to aid state violence), it’s how it is being used. These workers get part of the way there by organizing around these causes, but fail the larger social responsibility to the ecology of our civilization by not organizing themselves as the laborers, as distinct from the owners who profit from their labor, alienated from both the profit of their labor and how their labor contributes to the moral calculus of our world.
There seems to be no trouble identifying problems, and a solution is readily at hand, but where we’ve seen efficacy, we’ve also seen an unwillingness to take the logical next step toward a more robust society. The limits are not so low, and thankfully this discussion is now underway, with many technologists doing the hard work of beginning to form and join unions. As, once more, Parenti says: “You don’t know you’re wearing a leash if you sit by the peg all day.”
Recent things I’ve read, listened to, or watched that I am now recommending: