The overall critique of education is its absorption within, and continued reproduction of, capitalism. So this is not the 60s call for relevance, meaning, and authenticity. It’s more brutal. There’s no safety net of welfare to catch it; the result of education in contemporary conditions is prostitution and debt.
As I read in other places about the glee the precariat (rebranded as “Generation Flux” by Fast Company) should feel at their new, flexible lives, critiques like Undressing the Academy (helpfully summarized by Jodi Dean above) appear in a clearer context. We are seeing (and really, have been seeing for most of the last quarter century) one of the key tensions of modernity revived. In his book All That Is Solid Melts into Air, Marshall Berman draws two distinctions between the nineteenth-century modernists - like Goethe, Marx, and Nietzsche - and the futurists of the twentieth century (from the Italian futurists of the interwar period to people like Toffler and McLuhan). The first distinction is that the nineteenth-century modernists were able to consider the exhilarating changes happening around them; they could
rejoice in the destruction of traditional structures; but they knew the human costs of this progress, and knew that modernity would have a long way to go before its wounds could be healed.
In contrast, futurists of the twentieth century were almost totally enamored with the relentless march of technological progress, paying little heed to the social effects of its grinding gears. He begins this trajectory with the Italian futurists who served as the ideological shock troops for Mussolini’s modernization efforts.
The futurists carried the celebration of modern technology to a grotesque and self-destructive extreme, which ensured their extravagances would never be repeated. But their uncritical romance of machines, fused with their utter remoteness from people, would be reincarnated in modes that would be less bizarre and longer-lived.
McLuhan’s technological determinism serves as a more recent iteration of this ideological position. It also serves as a useful example of the other distinction between nineteenth and twentieth century modernism. The human costs or social disruption (to use a fashionable term) don’t matter to McLuhan because, from within his framework, they simply can’t. History is moving inexorably forward, technological change is inevitable, so fretting about human costs is like being critical of the effects of the weather. As Berman puts it,
the problem of all modernisms in the futurist tradition is that, with brilliant machines and mechanical systems playing all the leading roles […] there is precious little for modern man to do except plug in.
This dehumanized vision of the future - where humans of the future delight in the earthly pleasures of technology as if the latter were manifested in some Garden of Eden by a supernatural force - takes the messiness of history in the past and streamlines it. In this sense it is apolitical, ahistorical, and, despite its apparent celebration of human innovation, desperately unimaginative in the possibilities available for future human agents (and likewise uninformed about how we arrived at the present state of affairs).
all the other great nineteenth-century critics also understood the ways in which modern technology and social organization determined man’s fate. But they all believed that modern individuals had the capacity both to understand this fate and, once they understood it, to fight it. Hence, even in the midst of a wretched present, they could imagine an open future.
In the Fast Company cover story (and its supporting profiles), there are glimpses of both visions, but the overwhelming message is one of the cynical modernism of the futurist tradition. What is unique is that, like the boosters of globalization and digitization before them, they have perfected a pitch wherein their cynical modernism ventriloquizes the more nuanced modernism of the past. The common element of both these post-1989 traditions is that they see market-driven solutions as the only viable options. Agency, therefore, is limited exclusively to atomistic, individual “career management,” and trying to work for a different future (or resisting the more degrading elements of the one proffered) gets you branded a “pessimist.” There is no real room, as in the nineteenth-century critics, to fight the inevitable future: only to find a way to “thrive” in it. Perhaps more insidious, the people who resist this aren’t seen as principled, socially minded, or even historically well-informed: they are simply too emotionally attached to the way things used to be. So Fast Company editor Robert Safian riffs off a quote from danah boyd:
“There are all kinds of reasons to be afraid of this economy,” says Microsoft Research’s boyd. “Technology forces disruption, and not all of the change will be good. Optimists look to all the excitement. Pessimists look to all that gets lost. They’re both right. How you react depends on what you have to gain versus what you have to lose.” Yet while pessimists may be emotionally calmed by their fretting, it will not aid them practically. The pragmatic course is not to hide from the change, but to approach it head-on.
Safian ends his article with a sincere reference to social Darwinism:
To flourish requires a new kind of openness. More than 150 years ago, Charles Darwin foreshadowed this era in his description of natural selection: “It is not the strongest of the species that survives; nor the most intelligent that survives. It is the one that is most adaptable to change.” As we traverse this treacherous, exciting bridge to tomorrow, there is no clearer message than that.
There are many tacks I could take in relation to this article, most of which could be (and have been) expanded into full books or even research archives. But I will highlight only a handful. My purpose in doing so is not to chide him or Fast Company or any of the other techno-fetishist futurists among us for their overly optimistic vision of the future: I have no interest in being pessimistic and I don’t gain any emotional comfort from longing for the “good old days.” Instead, my interest is in articulating a clear understanding of the past and the present as a conscious creation of human action; to separate the precariousness of our employment conditions from the enthusiasm we rightly feel for the possibilities of networked, social media; and to try to establish what, if anything, an education at the university level should do to prepare future human agents (i.e. students) for their possible role in this future society.
My problem with the Fast Company article is that it largely believes its own hype. This makes it unable to understand what is actually happening (and has actually happened) and unable to imagine the breadth of possibilities for the future, even in the near term. None of this is to say that it isn’t correct (or isn’t correctly representing) a dominant or emergent cultural trend. Obviously there are people for whom this conception of agency works very well. This doesn’t mean that it is somehow “natural” or “inevitable.” Nor, for that matter, is it at all novel. Take this passage from Safian’s manifesto:
What’s “a bitch” for companies can be terror for individuals. The idea of taking risks, of branching out into this ambiguous future, is scary at a moment when the economy is in no hurry to emerge from the doldrums and when unemployment is a national crisis. The security of the 40-year career of the man in the gray-flannel suit may have been overstated, but at least he had a path, a ladder. The new reality is multiple gigs, some of them supershort (see “The Four-Year Career”), with constant pressure to learn new things and adapt to new work situations, and no guarantee that you’ll stay in a single industry. It can be daunting. It can be exhausting. It can also be exhilarating.
I am all for this as a career option, so long as it is just that, an option. In the words of Andrew Ross,
The kind of self-employment that is so pervasive in the creative and knowledge industries has proven to be a powerful draw in spite of its disposition for self-exploitation. There are many people who contribute to these industries who prefer the independent or intermittent work life, and who choose it, despite all of its built-in risks and pitfalls because it gives them a degree of freedom, mobility, and control over their time. Neoliberalism produced so many converts because it exploits the credo that individuals have power over their economic destinies. Yet this belief is not the exclusive property of market fundamentalists nor should it be regarded as such. It can be espoused by individuals in more democratic kinds of work environment - ones that are just and vibrant but are also well protected from market overexposure. Another way of putting this is to say that it is not just employers who want freedom and flexibility. Everyone should have a right to choose their own balance of freedom and security in employment, and we should craft policies to ensure that making the wrong choices does not prove catastrophic to people.
Fast Company and our culture at large would like for all but this last sentence to be true. The “exhilarating” benefits of this new world of work are not just available to all who would embrace them: they are forced on everyone as a means of livelihood. This reveals the real mechanism of the change, which has little to do with technology per se and everything to do with the social system in which this technology is embedded, namely, the full market society of what C. B. Macpherson long ago termed “possessive individualism.” Macpherson’s book looks at the philosophies of Locke, Harrington, and the Levellers, but it begins with Hobbes, and it is in Hobbes that Fast Company’s ontology finds its forerunner, even if the latter completely ignores the role of the Leviathan in shaping that ontology.
For Hobbes, as we all recall, the state was said to be necessary because, without it, life would be “nasty, brutish and short” due to the “war of all against all” that resulted from the interaction of beings with a deeply ingrained “human nature” to fight with each other over scarce resources. Macpherson is skeptical about this “nature” and instead posits that what Hobbes attributes to human nature is actually the product of a particular historical and social set of circumstances. As he summarizes it:
All men in society (and in the hypothetical state of nature as well) do seek ever more power, but not because they all have an innate desire to do so. The innately moderate man in society must seek more power simply to protect his present level. (41)
Put another way:
[The postulates which produce] a market relationship in labor as a commodity [which] produce the essential requirement of Hobbes’s society, namely, the mechanism by which those who want more power or delight than they have can engage in continual, non-violent competition for the power of others, which compels the others to enter the competition. (61)
The Hobbesian vision, as we usually see it, is that of a war of all against all. But what Macpherson zeroes in on is the fact that this does not arise from some innate desire to take from others in order to have more for ourselves. Many people might be satisfied with what they have; they might be okay, for instance, with a comfortable job serving some crucial social function (teaching, for instance, or making food) where they are a productive member of society, paid a living wage, and provided with a decent life for themselves and their families. In our current economic environment, with austerity as the only item on the menu, what Fast Company is saying is that precarity is something that these people not only CAN choose, but MUST choose. As Macpherson said of Milton Friedman (in an essay titled “Elegant Tombstones”), this undermines the meaning of the word “choose.”
The key difference, therefore, between Ross’s understanding of this new world of work and that of Fast Company is that the latter cannot imagine a way for this new, imaginative, exciting world of work to take place under conditions where, to paraphrase Ross, making the wrong choice doesn’t prove catastrophic. Instead, we should embrace this Darwinian struggle as a contest, a competition with ourselves and others to see how long we can surf a career trajectory, but to then accept our failure to “thrive” in this environment as just and natural - even if it ends in destitution, hunger, homelessness, illness, or death.
This echoes the narrow, futurist brand of modernism Berman chides in his book. It sees resistance as futile pessimism that wastes precious time - time which could be spent, in Berman’s words, “plugging in” to the great technological interface that stands before us. The critique of the role of the academy (to return to the quote above), insofar as it prepares students ONLY for this kind of self-entrepreneurship, is quite sensible. It is, in Dean’s words, “a cry for education free from capitalism, free from exploitation, competition, and the miserable conditions that leave students with no time to think and faculty with no time to teach.” From a perspective that sees the future as inevitable, this “cry” may seem like an overly emotional, nostalgic screed against what will, in any case, occur. Here, Safian helpfully provides a short history lesson to bolster his claim.
At the turn of the 20th century, as cities grew to be the center of American culture, those accustomed to the agrarian clock of sunrise-sunset and the pace of the growing season were forced to learn the faster ways of the urban-manufacturing world. There was widespread uneasiness about the future, about what a job would be, about what a community would be. Fringe political groups and popular movements gave expression to that anxiety. Yet from those days of ambiguity emerged a century of tremendous progress.
This account, innocent as it is of any actual understanding of the struggles that went on in the twentieth century, helps highlight the actual alternatives available. After all, the “Fringe political groups and popular movements” were instrumental in crafting from those ambiguous days a century of progress. The labor movement, the progressive movement, the civil rights movement, the women’s movement, and just about every other social movement in the twentieth century can be found in utero in those dawning days. And, contrary to Safian and neoliberal extremists like Richard Epstein, the only way that progress was won in a widespread fashion (and therefore, the only way that the mass consumer comfort of the postwar era was possible) was through the principled, sometimes violent, struggles in which “Pessimists” of the time engaged. And, as difficult as it might be to conceive for the permanent “Me” generation, they did this not because it would “aid them practically” (which is the only reason anyone should do anything in GenFlux); instead, they did it so that people a century or so later could have the freedoms and material gains Safian now takes for granted.
This is the full, critical understanding of modernity where, as Berman says, “modern individuals had the capacity both to understand this fate and, once they understood it, to fight it.” Thus today, we can take the good things on offer from the Fast Company menu of precarious effervescence, cast aside those of the “wretched present” and “imagine an open future.” Ironically, it is the source of the “pessimists” quote that provides one of the more nuanced understandings of our present condition.
When danah boyd speaks of there being both optimists and pessimists, she retains a twinge of the crude, materialist solipsism that guides the overarching article (“They’re both right. How you react depends on what you have to gain versus what you have to lose.”). But in her longer comments she recognizes the narrow field in which this new ethos can productively (or at least potentially productively) operate:
There will always be some business people who take risks for the rest of their lives. But I’m not sure society writ large can handle that kind of instability. All we want is certainty.
While it is hard to tell exactly what boyd thinks about this, she at least recognizes it as a basic human reaction, rather than some atavistic emotional weakness. And this, in turn, demonstrates her seemingly sincere recognition that there are human costs to this process. Demonstrating again that she is, by Fitzgerald’s index anyway, a “first-rate intelligence,” she is able to allow for the possibility that, yes, this thing we’re going through is amazing, but it is also seriously destructive. And there is every reason for those who are on the receiving end of the destruction to push back. This will take a variety of forms - political, economic, and cultural. Her profile ends with a version of this observation:
Any time there’s a radical disruption, everyone rushes to reform a new power structure, to restabilize things. Some people win in this process and some people lose. Tech forces a disruption, but who gets to win? Maybe Steve Jobs is so embraced as a hero because we have this anxiety over control.
boyd’s observations - which may join a wealth of others, possibly even more nuanced, that aren’t quoted by Safian in her profile - demonstrate that she feels at least somewhat ambivalent about the neoliberal future Fast Company predicts. Her comment about Jobs is clearly tongue in cheek, but her casting it as a question of winning is striking. Taken with her other comment about there being a very specific area of society where this can apply, “winning” means something, again, very specific.
In relation to the overall debate about modernism, “winning” is a strange concept. It joins “certainty” and “control” as keywords in this debate. Modernization, after all, is supposed to be the elimination of the uncertain elements that have threatened human life for centuries: modernization is supposed to make it easier to live so that we can concentrate on the more creative, educated, enlightened issues that confront us. For those of us lucky enough to live in areas with running water, paved streets, public schools, and the other amenities of modern life, a good proportion of our time is freed up by modernization. Water is cleaner, so we don’t have to spend time boiling it. We can ride on trains or drive on roads, which makes everyday travel less of a chore. Specialists learn deep, meaningful things about the world around us (sometimes through - gasp! - decades of experience in a single field!). And workers of all kinds build, cook, create, plant, harvest, generate, mine, drill, and carry away the refuse (solid, liquid, etc.) to make this system livable for all of us.
These crucial social functions haven’t gone away. We still need people to think about education and perform it on a daily basis. We still need people to monitor infrastructure to make sure the rest of us don’t drive off a crumbling bridge or drink water with an unacceptable level of fecal matter in it. We still need people, to speak to Fast Company’s interests, to build out the US’s fragmented and pitifully inadequate broadband services, a task that private industry is fully able but completely unwilling to perform. All of these tasks require dedication, attention, and lifelong learning in a fairly narrow field.
Some of their results are mundane. The people who work in them may appear (to the latest social technology pundits) to do uncreative, dispirited, boring work. But I assure you the people who find creative ways to get the shit out of our water are far more essential than the people who find creative ways to put shit on our cell phones. And their work is creative: as Louis CK said, everything’s amazing! But while Fast Company is clearly happy about the cutting edge stuff, it takes for granted all these mundane tasks (though everybody thinks they can be an educator these days). Or, more accurately, it celebrates the fact that the (supposedly) Darwinian rules of the finance industry are now becoming more widespread - as if we will all benefit from rapid turnover in educators, novice chemists managing the water supply, and negligible institutional memory in virtually every social institution - save the revolving door between finance and the regulatory system.
This, after all, is what people are “pessimistic” about. For what Fast Company and the more general neoliberal ethos of austerity demands is not only that these social functions are performed, but that they are performed in a dramatically profitable way, measured in short-term financial gains, repeated regularly. I realize that this is not explicitly what the article says. But it hooks into a general ethos of how “value” is measured - and the expansiveness of this definition in terms of the social totality. It is one thing to marvel at Facebook or Google or Legal Zoom or Zipcar. It is another to argue that the industrial turnover this creates can, and even should, be accepted as a general social precarity. All jobs, all workers, all industries may be subject to disruptive innovation by investors or startups: therefore all jobs, all workers, all industries must brace themselves for battle. The stakes of this battle are life and death; you go into the ring a lone gladiator, and who knows if you will emerge on the other side. This is true, again, whether you perform a basic service to society or you are entering the crowded market of building fart apps.
To be very clear, my problem isn’t with innovation or disruption per se. I am a product of this high tech innovation and, like Safian, am amazed at how quickly industries and entire realms of social and cultural practice can be completely transformed. For instance, last night my wife and I watched a movie on our TV. We didn’t use an antenna (though for broadcast TV that’s our system) or cable or DVD, or even stream it from Netflix, Vudu, Amazon, or any of the other companies that you can pay to show you movies. Instead we watched a film on Youtube called Life in a Day.
It is truly an amazing snapshot of humanity. It is partial, and professionally crafted (by the director of Gladiator himself). But it is the product of hundreds of people - many of them obviously talented with the aesthetics and narrative tools of filmmaking. People from 192 countries sent 4,500 hours of video documenting their lives on July 24, 2010. Scott and other editors and directors sorted through 80,000 clips and stitched them together into a comprehensive patchwork of human experience - or at least as comprehensive as it could be in 90 minutes or so.
This is the product of an astounding human endeavor almost unimaginable just a decade ago. I watched it on a whim - for free - in my living room.
On the one hand, this is certainly a disruptive innovation (a concept I don’t fully embrace). I like movies. I like the craft of storytelling and the aesthetics of a good cinematographer. I like that there are people like Ridley Scott or Woody Allen or Spike Lee who have been doing it a while and have the instincts, connections, and tacit knowledge to make it happen. I would like for there to be people like this for a very long time, supported in some fashion so that their trials will eventually lead to success. In that sense, in principle, I like the fact that it is now more possible for more people to do this, to have a chance to try their hand and tell their story; to distribute it to all of us and possibly speak to us in a fundamental way.
I can also see the benefit of something like this innovation creeping into more and more of our lives, shaking up the normal order and possibly improving how we do things. I joke about water treatment, but education is clearly ripe for this kind of disruption. I’ll speak to that at the end. For now, the point I’d like to make is that, as much as I’d like for more people to be able to do this, I have no interest in making everyone’s lives contingent on their succeeding in making the right choice at the right time. The romantic bohemian lifestyle is supposed to be a choice, not the pinnacle of modernity.
In a sense, making this Darwinian struggle for survival the pinnacle of modernity actually reverses the very goal of modernization: instead of having fewer concerns about our basic survival and focusing our time and energy on higher concerns, those higher concerns are turned into the stakes of the reinvigorated Darwinian struggle. We are forced through an artificial scarcity to act as if, in this land of plenty, we will suddenly be cast into the abyss if we don’t find a way to please the arbitrary, shifting will of the market. The “creative destruction” of the market realm colonizes our lifeworld, and we are supposed to start making cost-benefit analyses of even our children and marriages. It makes the nuanced, detailed, labor-intensive product of Life in a Day useless: we might as well string together webcams from all the major stock exchanges if that is all that matters in life.
This gets us back to the notion of certainty. Carrie Lane, in her book A Company of One, looks at some elements of the ideology that pervades Fast Company and the high tech fields in general. But she finds it in an unusual place: speaking to unemployed tech workers. As an anthropologist, her goal is to find out how they see themselves, how they understand their situation, and how that relates to larger trends and ideologies. What she finds is the notion of “career management.” This ideology is identical to that of Fast Company’s celebration of GenFlux: “At core, career management entails seeing oneself not as an employee, even when traditionally employed, but as an independent entrepreneur, as, in the words of one job seeker, ‘a company of one.’” (45) She writes that this belief was “expressed, with varying degrees of fervor, by every job seeker I interviewed.” Among these (potential) workers, this ideology was more than an explanation of their (potential) employment; it served “simultaneously as a mental model of work and the labor market, a set of behavioral guidelines, and a badge of identification for its loyal adherents.”
It was, in this sense, a completely coherent, completely totalizing discursive regime through which they made sense of the world and their place in it. As in Fast Company’s description, it is through this regime of practice - managing one’s career - that one is able to have agency in the world. And, as in the order described above, it is explicitly contrasted with certainty. Certainty is understood as certainty of employment, but since employment is central to survival in a social system dominated by capitalism and commodification, its meaning is overdetermined. Still, as in Safian’s article, the desire for certainty is seen as emotional, feminine, weak, irrational, and nostalgic.
The claim of heroic self-reliance is again used to establish a masculine agency, while victimhood is defined not by losing a job or experiencing professional or financial hardship but by the very act of seeking secure employment. (46)
Lane points out (along with Boltanski and Chiapello) that, far from this being the product of the tech industry or its particular kind of insecure, continuously disrupted work structures,
the idea that workers must function as companies of one has its roots in management culture and is intimately connected to a decades-long corporate campaign to ‘prevent emotional commitments to the firm, and make sure that employees, when shown the door, don’t seek the window.’ (49)
On a side note, this makes boyd’s observation about Steve Jobs ironic, since Apple remains unable to prevent its Chinese employees from jumping out of windows. The upshot here is that, as corporate America engaged in its blatant attack on workers and labor in general over the past three decades, it has fostered a fashionable garb for this process, such that corporations have no social responsibility and workers should expect to have no job security no matter how many years of their lives they have put towards making that corporation successful.
This rhetorical reframing of insecurity neatly transfers the burden of maintaining the social contract to the shoulders of the individual worker, who is expected not only to bear that burden but to welcome it as an opportunity for self-reliance.
Or, as Quentin Tarantino might put it, “this reminds me of a joke.”
Cultural institutions like Fast Company have helped to make this material, political and economic transformation into not only a natural phenomenon, but one we should be happy about. Pessimism will get you nowhere: embrace your precarious future!
Unmentioned by Lane is the other possible result of this transformation (and the other individual response) to a predominant, acceptable feeling of precarity and insecurity - or, more likely, the other response the corporate ideology was trying to stave off. As Mark Ames vividly argues, the string of US workplace rage shootings throughout the 1990s was not the result of lone psychopaths, but a relatively reasonable response to this abrupt reversal of the social contract:
These weren’t the works of psychopaths — they were people fighting against something intolerable that many of us know is there, but hasn’t been named yet. There isn’t a Marx to give a name to post-Reagan middle-class pain. How do you fight against something horrible, oppressive, and debilitating before it even has a name? Especially when everyone, especially middle-class people, sneer at it and refuse to believe it’s valid. When you’re too deep in the culture, you start to think that the most horrible/mundane aspects are normal and just the way things are. When you’re outside of it for awhile [as was Ames, then an American expat living in Russia] it’s a little easier to see the insanity and brutality for what it is.
As it turns out, there is a name for something like this feeling. Predictably, it comes from one of those nineteenth century thinkers who was witnessing the Janus-faced process of modernization for what it truly was. Émile Durkheim, in his study of suicide, termed it anomie. It is, in essence, a sort of disease of unbridled capitalism. Social bonds are torn apart, norms destroyed, and individuals with particular sets of skills, necessary for one form of production, suddenly find themselves devalued as workers and even as humans because their skills no longer fit with the times. In the early twentieth century, largely in response to mass demonstrations of unemployed workers, economists like Keynes discussed the strange phenomenon of the “involuntarily unemployed” (people who want work but cannot find it) and how this fit into the larger issue of “structural unemployment” (where there is a mismatch between the skills and locations of workers and the possible places of employment). There is no need to go into the details of these theories, except to say that the consensus at the time was that it was best - politically, economically, culturally, and socially - to try to cushion these forms of “frictional unemployment” at the individual level. As Joan Robinson, who was no fan of the welfare state, helpfully describes the situation:
The worst part of heavy unemployment was not the waste of potential wealth (and, as we shall argue in a moment, its removal has not been achieved mainly by avoiding waste) but the rotting of individual lives, the damaged self-respect, the desperate egoism and cringing fear on one side and the smug self-deception on the other. Certainly we live now in a cleaner, more human country.
In contrast to realizing (through what we might call “smug self-deception”) that the waste of individual lives is a horrible (and largely avoidable) consequence of the present order, Fast Company and much of the business culture of the US sees this as the natural Darwinian process of capitalism. Like Rick Santelli’s (staged, h/t Ames, again) rant, they see no reason to bail out the losers. The goal of the economy from this perspective is not widespread growth and stability, but to “reward people that could carry the water, instead of drink(ing) the water.” The “smug self-deception” this inspires is that, if you are rewarded (through whatever means necessary) it must mean you are one of the water carriers. If you are not rewarded (i.e. you are not employed), then that must be justified: losing in this economic battle makes you a loser in an ontological sense. This should carry all the Darwinian punishments it implies: losers become extinct.
There are several alternatives to this model of society, at both the macro and micro levels. But in general the main thrust of these alternatives is to counter the atomistic individual - and especially the reward structure that favors individuals alone. In many ways, this alternative is integral to the exciting new world Fast Company enthuses about: non-hierarchical structures of sharing and interaction facilitated by new technologies and cultural practices are the keystones of this economy. Unfortunately, the reward structures are still such that amazing collaborative projects are only funded piecemeal, with the individuals involved either collecting no income or working at several other, often unrelated jobs in order to do this creative, productive, unpaid work. As we move forward, this trend will likely only be exacerbated, with more of the productive work created in cooperatives, but with many creative, innovative people unable to find an entry point into actual paid work.
In fact, what is unique about many of the people they profile is that they are somewhat unconstrained by monetary concerns and can therefore work within a sort of gift economy, taking lower-paying jobs to get skills or going back to school if necessary. This is, indeed, far more common for the average middle-class citizen than in the past. If there is an ethos that emerges out of this, it is not that one need only be a career manager, but that a variety of social institutions and methods of economic support are necessary to incubate innovative ideas. One of their paradigmatic GenFluxers “has worked in academia, in government, in big public companies, and in startups.”
What these trends point to is not that we should prize the serial autonomy of the Fluxer, but that we should, in looking at how things are shifting, figure out more creative ways to support human life than through the inefficient industrial artifice of wage labor. Imagine, for instance, if every US citizen had the quality education (and likely, the quality health care, preschool, prenatal care, and nutritional sustenance) that the most successful of these Fluxers did. Imagine the untapped human creativity that would suddenly come online if all the job seekers grinding away their lives (resume, resume, job fair, resume, resume, networking meeting…) trying to fit their round peg into a board with no peg holes (the EPI reported this week that we’ve had nearly three years of a job-seekers-to-job-openings ratio of more than 4:1) were instead given the opportunity for several years of paid apprenticeship retraining in a field we need covered as a society. Or, hell, just give them some money. If it’s essential to have people work at jobs where they are paid, then mandate a shorter workweek so there is enough paid work to go around. Or, put another way, if the problem is that the average CEO makes more in a single day than the average worker makes all year, there’s no need to tax away that CEO’s wealth; just make them work less so that more people can be paid out of that salary. It sounds crazy right now, but my main point is that there must be alternative ways to reward the social production of value beyond the direct wage relation, and many of them may involve the collective efforts of cooperatives, collectives, or even - gasp! - the state.
Developing these other forms of social and economic solidarity would lessen the effects of the “creative destruction” the dominant ideology places squarely at the feet of the individual. But the ancillary effects on innovation - economic, social, and cultural - would likely be quite astounding. In a recent essay for the World Policy Institute, sci-fi author Neal Stephenson speculates on the reasons for what he sees as “Innovation Starvation.” I’ve written more about this essay elsewhere. But the nugget I’d like to take from him is, as he says, “Innovation can’t happen without accepting the risk that it might fail.” He sees risk-averse institutions as the primary cause of this stagnation: corporations and academia are unwilling to make dramatic improvements because the payoff isn’t worth the investment. Despite the shades of Christensen’s disruptive innovation here, what I would highlight is not the Darwinian need for these institutions to innovate or die (which, again, is only half the story, even in Christensen’s examples, where some incumbents use state policy or market monopoly to absorb competitors quite easily). Instead it is to do the opposite of what the austerity hawks recommend: take people’s skin out of the game.
That is to say, innovations will be more likely to come - particularly in large-scale social institutions like education, law, or health care - if there is a sense that, to again quote Ross, making the wrong choice won’t prove catastrophic. Ross means this on an individual level, but we could project it onto the social level as well, where at present making the wrong choice is catastrophic. In some cases this doesn’t even require new resources - just preventing the wasteful expropriation of existing resources for questionable forms of innovation. For instance, a recent Nation article discusses the national campaign of venture capitalists to get state funds for virtual charter schools - and in some cases, like Florida, to create state policies that force students to take classes online, thereby creating a captive audience for those venture capital firms. It is hardly news that the same trend is occurring in higher education, where for-profit firms like University of Phoenix and Kaplan are sopping up an enormous amount of federal student aid money in order to offer classes online. This is clearly a form of disruption that has the full support of the US government (therefore making it a predictable subject for Christensen to examine).
What also hardly needs saying is that, while boosters of this innovation highlight the technology behind it, its convenience, and its easy scalability (i.e. you don’t have to build schools), the key innovation from a business perspective is the ability to force fewer teachers to serve more students. In other words, it is the injection of the “generation flux” finance-friendly labor system into a previously secure form of public service.
The easy response - and the predictable one, as we see from finance-friendly folks like Bloomberg, Arne Duncan, and even the President - is that these teachers simply need to suck it up and get over their childish, feminine belief that they should have jobs for life. This would make sense if that were all that was at stake in the struggle. Most teachers who oppose this see an ineffective, untested, completely mercenary system of education being implemented with short-term interests in mind, and balk at its long-term pedagogical effects. In other words, they fear the complete destruction of the social institution of education brought about by the neoliberal belief that austerity requires cuts and markets bring efficiency (both beliefs are arguable, to say the least).
The unfortunate byproduct of this battle is to pit the people who know something about education (i.e. educators) against the “innovators” who see technology primarily as a labor-saving device that will increase the bottom line. You can’t speak of the pedagogical benefits or potential of more technology in the classroom, of unbundling the university, or of structuring blended or distance learning in pedagogically effective ways without seeming to give energy to the mercenary forces bent on destroying the very institutions that have traditionally valued those questions and fostered research on their productive implementation. On the other side, while you can pay lip service to the principles of pedagogy and likely find willing shills to push your agenda, if you appear to over-value the actual process of teaching, or the experience usually necessary to do it well, it becomes hard to sell a service whose sole principle is actually to eviscerate those institutions.
This makes it virtually impossible to foresee how these two sides could work together. Instead of any mutual agreement or socially beneficial consensus, there will be, in boyd’s terms, winners and losers. In this case - as in many others - the real winners and losers are not actually involved in the fight. At stake is the future of our society, the future generations who will be educated (or not) for the next century, on whom we will all depend for the next wave of innovation. But because of the market-driven channel of the struggle, it is far more likely that we will all lose, even if we can be convinced that everything driven by the market is an automatic win (“this reminds me of a joke…”).
If the VC firms who are angling for this market get their way, they may very well succeed in capturing the education market for the vast majority of underserved communities in the country (make no mistake: the venture capitalists themselves will be sending their kids to schools where they get a significant amount of face time with well educated, less exploited faculty). As with charters and vouchers before them, this will siphon more public money into the private sector and away from building successful, sustainable public institutions with sound, innovative pedagogy. Unlike public institutions, however, these virtual charters will create no infrastructure, no institutional memory, no wealth of experiential, tacit learning - or, at least, this will not be their goal. After the venture capitalists realize that, like housing and other easy government-funded grifts, making this work will actually take, well, work, they will get out, taking their easy winnings to the next venture and leaving the rest of us worse off than we started. Maybe it will work out differently, but I’m doubtful. When the rule of the land is “make it profitable or kick it to the curb,” the fact that you are dealing with a major public institution is only so much dust on the scales.
And yet - AND YET - it is clear that education could do with a shake-up, and that technology and networked culture are crucial drivers in making education more accessible, economically and culturally, to more people. As someone who works and has worked in that field, I can also say that there are many people very energized about the ways these technologies can help facilitate learning, many people who would love to take a shot at helping move this innovation forward, but also many people exhausted by the dehumanizing forces that have already infected education - especially higher education. Their skin is collectively rubbed raw by testing and assessment measures in K-12 and the rapid casualization of labor throughout higher education. An increasingly opaque and disconnected administrative apparatus makes decisions based on cost-benefit analysis and the “prioritizing” of institutional resources. Tenure is under attack at every level, and for the current generation of high school students, going into education would likely seem like a foolish choice: better to look at finance. It is in this context that we should read danah boyd’s observation:
“Building new connections is a critical part of building a new economy,” boyd says. “The American education system, as flawed as it is, is great for the creative class because of the way it mixes up networks.”
The open question is whether things will remain this way, whether the relatively thin umbrella held over the heads of students and teachers as they try to manifest this elusive process of learning will be stripped away in favor of the Gingrich vision of working-class education: where students work as janitors in order to pay for their education. This would make the “prostitution and debt” of contemporary education explicit from the outset.
The alternative is to take the skin out of the game. To provide ample support to these and other social institutions, valuing the knowledge they have, but also asking them to take an innovative tack. More importantly, in a twist on the Rawlsian imperative of justice, giving everyone involved in the institution a basic minimum income, or promising to find some place for them in whatever institution emerges from the innovation, would take their economic (and institutional) self-interest out of the equation. This would release their intellectual energy toward collaborating on innovative forms of education which might or might not succeed, but whose failure would not result in the complete dissolution of the institution or its most experienced constituents.
In other words, the goal would be to embrace the essential insecurity of this new age, to encourage people to innovate and find new ways of working, thinking, collaborating, and sharing, but to give people some baseline support so that their failure won’t be catastrophic - either to themselves or to society at large.
My feeling is that many of the current political movements are focused precisely on making something like this possible. Reforming education in some fashion, making it more responsive as an institution to the social and technological changes around us, and making it more affordable and available are clearly central elements of these struggles. But so is the ability to apply the skills and intelligence people have acquired through that process to a meaningful form of labor for which they receive some basic ability to sustain themselves. In a sense this is exactly what the promise of modernity entails. As absurd as something like a guaranteed minimum income might sound, from this perspective what is absurd is that we wouldn’t have it. On the other hand, it is not something that we will just get: as rational as it may be, it will take lots of those productive “pessimist” activists to bring it about. As I mention above, this has happened before, so it can likely happen again. We just have to be ready to fight for it.
The economic crisis has already destroyed an enormous amount of our collective economic capital, be it in pension funds or housing values; we shouldn’t allow our response to this crisis to be the destruction of our collective human capital - and we should hardly rejoice that a handful of fortunate workers will be able to ride this wave (or at least stay afloat). Their success will likely be short-lived if the shore they land on is a wasteland of “rotting individual lives.”