The basic model is descriptive, not predictive -- that is to say, it provides a vocabulary for describing what has happened, but cannot predict what will or will not be "disruptive," except perhaps in the way that recurrent patterns in history provide a way of anticipating what will likely happen in the future. As Jill Lepore, writing for The New Yorker, put it, "disruptive innovation as a theory of change is meant to serve both as a chronicle of the past (this has happened) and as a model for the future (it will keep happening). The strength of a prediction made from a model depends on the quality of the historical evidence and on the reliability of the methods used to gather and interpret it. Historical analysis proceeds from certain conditions regarding proof." She doesn't feel those conditions have been met. Regardless, it's questionable whether the model could ever become predictive, except in very limited cases, because one simply cannot anticipate all the variables. The "disruptive" innovation, whatever it might be, enters into a "status quo" that is enormously complex and intricately interconnected. Consider another "fringe" item -- e.g. the fully "electric" car. If ubiquitous, it would "disrupt" not only the whole network of "filling stations," but all the networks that serve the "filling stations." Insofar as many filling stations are also "convenience stores," it would likely have an impact too on soft drink and beer distributors, and what about the stoner sitting behind the counter overcharging for cheese balls? Hard to say, dude.
The notion of disruptive innovation is popular among conservatives, in part because it seems to validate a particular virtue of American mythology -- the self-reliance of the self-made man. The practical example offered by Wikipedia is a case in point. Bear with me. I'll quote it in its entirety:
In the practical world, the popularization of personal computers illustrates how knowledge contributes to the ongoing technology innovation. The original centralized concept (one computer, many persons) is a knowledge-defying idea of the prehistory of computing, and its inadequacies and failures have become clearly apparent. The era of personal computing brought powerful computers "on every desk" (one person, one computer). This short transitional period was necessary for getting used to the new computing environment, but was inadequate from the vantage point of producing knowledge. Adequate knowledge creation and management come mainly from networking and distributed computing (one person, many computers). Each person's computer must form an access point to the entire computing landscape or ecology through the Internet of other computers, databases, and mainframes, as well as production, distribution, and retailing facilities, and the like. For the first time, technology empowers individuals rather than external hierarchies. It transfers influence and power where it optimally belongs: at the loci of the useful knowledge. Even though hierarchies and bureaucracies do not innovate, free and empowered individuals do; knowledge, innovation, spontaneity, and self-reliance are becoming increasingly valued and promoted.
In the beginning, we had catholic "main-frames," and that one computer served the needs of many people. That "status quo" was disrupted by the likes of Bill Gates and Steve Jobs, both of whom saw the possibilities of the "personal computer," and, not unlike Ford, were able to transcend a programming mind-set and make the computer easily accessible to the masses. Having said this, I very much doubt that their motives were quite as techno-utopian as Wikipedia implies. The creation of the personal computer itself, for example, may well have begun with the protestant idea of individual empowerment over against an external hierarchy -- and those individuals may have demonstrated a certain level of "knowledge, innovation, spontaneity, and self-reliance" in the creation of their products. Bill Gates' and Steve Jobs' "disruption" of IBM, however, resulted in Microsoft and Apple. I'm pretty sure the employees of both Microsoft and Apple do not feel particularly empowered, and both companies harbor hierarchies and bureaucracies with any number of "dilbertian" foibles that quash innovation, spontaneity, and self-reliance even as their leadership self-reflexively "promotes" the attributes that ostensibly made them rich. In both cases, the innovation "disrupted" an established hierarchy and bureaucracy, but simply replaced it with another established hierarchy and bureaucracy. Meanwhile, Bill Gates and Steve Jobs have exercised considerable "elite" power, in part because they were simply the right men, in the right place, at the right time. They, in effect, won the lottery, and more power to them.
So far as the networking of the personal computer is concerned, I doubt that anyone thought, "hmmm, from the vantage of knowledge production, the personal computer itself is inadequate." It's much more likely that someone thought, "hey, wouldn't it be cool if we could leave messages for one another on the computer?" and that someone else thought, "hey, wouldn't it be cool if we could sell our porn on the computer instead of in sleazy bookstores downtown?" So on and so forth -- thousands of independent decision points, if not "self-interested," per se, then at least "self-involved." Hence, the ongoing evolution of "networked computers" in a "marketing environment." The ubiquity of email creates considerable convenience, and good riddance to the jangling interruptions of the unwanted phone call, but it comes with a privacy downside as well. Clinton could no doubt explain that in some detail. Internet porn may have put the sleazy bookstore out of business, and good riddance, but the easy accessibility of internet porn comes with a social downside that I probably don't need to explain. I haven't even touched on the different sort of ubiquity that Mark Zuckerberg created with Facebook or Jeff Bezos created with Amazon. I'm suggesting, of course, that the particular "innovators" in computing set out with a limited intent, for good or for ill, but I sincerely doubt that their intent was anything like the techno-utopian notion of "knowledge creation and management." Some were able to leverage the public's desire for fingertip convenience and instant gratification into wealth and a position within the economic elite, and they may feel themselves to be "free and empowered individuals," but the rest of us, not so much. There was no "over-arching" intent to make the personal computer "an access point to the entire computing landscape." Even assuming the "disruptors" had that goal and their "disruptions" accomplished it, we should remember that "access to" does not mean the "entire computing landscape" is surveyed by any individual.
And what has been the social impact of the technologies created by the economic "disruptions" of the self-made men, the new "elites" of the knowledge economy? We're still ferreting that out, of course, but I could point out a couple of strands of common wisdom. Technology has not ushered in a new protestant utopia of self-reliant, knowledge-enabled men and women blessed with wealth by the invisible hand of Adam Smith's free market, at least not in any sense that Emerson would recognize. To be self-reliant is one thing, to be self-obsessed quite another, and while social media promises connectivity to the world, it leaves most staring at a computer screen hoping that their newest "selfie" will generate some likes. The internet has promised a world of ideas, but one need only read the comment sections of many on-line publications to give some credence to Hobbes' darker view of humankind -- that the more active and passionate participants are motivated, not by rational insight into actual knowledge or a clarity of innate good conscience, but rather by a "shared" ignorance and fear amplified in the echo chambers of social media. As for the invisible hand, one wonders where it might be hiding. As is always the case, some have succeeded within the technology-driven "disruptions" of our social and economic networks, but as many and perhaps more have been left behind. As the New York Times editorial board put it, "increased automation and the offshoring of jobs [enabled by improved communication and transportation technologies] have hit men with less than a college education particularly hard," and these men have become the "disposables" of the new age. I am referring to an article entitled "Millions of Men Are Missing From the Job Market," which in turn refers to a working paper by Alan Krueger, a Princeton economist, which "casts light on this population, which grew during the recession that started in 2007." The Times summarizes, suggesting that, "as of last month, 11.4 percent of men between the ages of 25 and 54 — or about seven million people — were not in the labor force, which means that they were not employed and were not seeking a job. This percentage has been rising for decades (it was less than 4 percent in the 1950s), but the trend accelerated in the last 20 years," a period corresponding to the widespread investment in personal computing. Correlation is not causation, but from an employer's standpoint, there is another piece of common wisdom as well -- that the sole economic justification of technology is to limit cost and improve efficiency, which means in effect either limiting the number of costly employees needed to do a particular task or eliminating those costly employees altogether.
Of course, it's not quite that simple, so let me touch on obvious objections up front. There are other "costs" that can be eliminated by the advent of a "new" technology, and limiting or eliminating those "costs" often serves as its primary justification. Then too, as the land of opportunity, we can expect the disruptive technology to create a range of new "opportunities" for those able to capitalize on them. Consider again the "disruptive" technology of the electric car. If it were to become truly ubiquitous, it would limit to some degree our dependence on fossil fuels, at the same time limiting some "costs" to the environment. Such "environmental costs" are a bit more difficult to quantify. We would need to increase capacity on a power grid already strapped in some locations, and we would need to consider the environmental effects of the batteries themselves, both of which carry environmental costs of their own. Nevertheless, I'm sure we could estimate with some degree of accuracy the net impact of the electric car on the environment, and that alone may ultimately justify its adoption despite its "disruption" of the very broad network of those employed in fossil fuel production and distribution. And then too, the electric car will create new opportunities for employment. To increase capacity on the grid, there may emerge new opportunities not only in nuclear power generation, but in renewables like solar and wind as well. To deal with the toxic content of most batteries and nuclear power plants, there may emerge new opportunities in the recovery and recycling of waste metals. There too, I'm sure we could estimate the net effect of the electric car on employment, and the creation of new, well-paid jobs might also justify its adoption despite its "disruption" of the existing "status quo."
Even assuming that the introduction of the "disruptive" technology of the electric car is justifiable on environmental, social, and economic grounds, that still leaves us with the problem of the "disposables," those pushed aside by the introduction of the new technologies. And to all appearances it is a growing problem. The Times also reports that "while it’s hard to generalize across a large group of people, it’s clear that job market changes can have significant health effects on the labor force," not least opioid addiction. "The connection between chronic joblessness and painkiller dependency is hard to quantify," the Times admits, and "Mr. Krueger and other experts cannot say which came first: the men’s health problems or their absence from the labor force." Nevertheless, "some experts suspect that frequent use of painkillers is a result of being out of work, because people who have no job prospects are more likely to be depressed, become addicted to drugs and alcohol and have other mental health problems. Only about 2 percent of the men say they receive workers’ compensation benefits for job-related injuries. Some 25 percent are on Social Security disability; 31 percent of those receiving benefits have mental disorders and the rest have other ailments, according to an analysis by the Urban Institute." Some of this is also, no doubt, "gaming the system." The connection between bureaucratic process and opioid addiction would also be hard to quantify, but anecdotally, my wife, who has a slowly degenerative arthritis of the spinal column and has had three reconstructive surgeries, made an initial application for disability. She does suffer from chronic pain, and has developed any number of workarounds to manage that pain, but she avoids opioids. She was told, by counsel, that her application lacked standing because she was NOT currently using opioid pain medications. She was then encouraged, by counsel, to seek a prescription EVEN IF she did not actually use the drugs.
The responses to the growing problem are, for the most part, predictable. On the conservative side, it's essentially "that's the breaks," along with various forms of corporate welfare and tax cuts. It's not surprising, in part, because the conservative party is the party of the great American myth that it requires only knowledge, innovation, spontaneity, and self-reliance to succeed, and the greatest of these is self-reliance. Pluck, tenacity, grit -- choose your term -- are all that is REALLY needed to succeed in this land of opportunity. For those who DO succeed, the message is compelling, reinforcing the belief that they are, themselves, more worthy. Having said this, however, almost all the research tends to indicate -- surprise! -- that the rich get richer while the poor get poorer. The Atlantic, for example, recently ran an article suggesting that "America is even less socially mobile than economists thought." They write that "scads of reports have documented how parents’ income dictates how financially successful someone will go on to be." In other words, "the amount of money one makes can be roughly predicted by how much money one’s parents made," and the farther up the income spectrum one moves, the truer that assertion becomes. "Children born to 90th-percentile earners," those at the high end of the scale, "are typically on track to make three times more than the children of 10th-percentile earners," those at the low end of the scale. Again, even the children of the rich get richer, while the children of the poor get poorer, if not in absolute terms, then in relative terms.
Not all areas of the country are created equal, however, and a group of Harvard and UC-Berkeley researchers (aka liberal elitists) have noted that "intergenerational mobility varies substantially across areas within the U.S. For example, the probability that a child reaches the top quintile of the national income distribution starting from a family in the bottom quintile is 4.4% in Charlotte but 12.9% in San Jose." These numbers are encouraging for the great American myth of upward mobility in one respect. It is POSSIBLE to scale the income ladder, and the 4.4% in Charlotte and the 12.9% in San Jose prove the possibility. They are also encouraging in another respect. San Jose is the epicenter of the "disruptive" technologies that are changing the American economic landscape, and it is perhaps not surprising that it demonstrates greater upward mobility than Charlotte. The principal industry in Charlotte is banking, and, measured by control of assets, Charlotte is the second largest banking headquarters in the United States, after New York City. While the characterization might be a bit unfair, it is the difference between a city dedicated to the growth of wealth through "disruptive" technical innovation and a city dedicated to the preservation of acquired wealth, where what growth of wealth there is comes through various forms of rent-seeking interest and investment. Finally, it is encouraging in another respect as well. The researchers noted that, when they explored "the factors correlated with upward mobility," they found that high mobility areas, like San Jose, "have (1) less residential segregation, (2) less income inequality, (3) better primary schools, (4) greater social capital, and (5) greater family stability." These are decidedly "liberal" issues and the left-of-center "results" most liberals would like to see, and so again, while the characterization might be a bit unfair, particularly since we cannot equate correlation with causation, we find greater social mobility in liberal California than in conservative North Carolina.
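It may help to pin down what a figure like "4.4% in Charlotte" actually measures. Below is a minimal sketch, in Python, of the shape of the calculation: rank parents and children into national income quintiles, then take the share of bottom-quintile parents whose children land in the top quintile. Everything here -- the cohort, the BETA persistence parameter, the distribution parameters -- is a hypothetical assumption invented for illustration; this is the logic behind the statistic, not the researchers' actual data or method.

```python
# Sketch of an upward-mobility statistic on made-up data: the share of
# bottom-quintile parents whose children reach the top income quintile.
# All numbers are hypothetical assumptions, chosen only for illustration.
import bisect
import math
import random

random.seed(0)

BETA = 0.5  # assumed persistence of parental income (hypothetical)

# Hypothetical cohort of (parent_income, child_income) pairs, where a
# child's log income is partly inherited and partly luck.
cohort = []
for _ in range(10_000):
    log_parent = random.gauss(10.0, 0.75)
    log_child = BETA * log_parent + random.gauss(5.0, 0.65)
    cohort.append((math.exp(log_parent), math.exp(log_child)))

def quintile(value, sorted_values):
    """Return the income quintile of value (1 = bottom, 5 = top)."""
    rank = bisect.bisect_left(sorted_values, value)
    return min(5, rank * 5 // len(sorted_values) + 1)

parent_incomes = sorted(p for p, _ in cohort)
child_incomes = sorted(c for _, c in cohort)

bottom = [(p, c) for p, c in cohort if quintile(p, parent_incomes) == 1]
risers = [(p, c) for p, c in bottom if quintile(c, child_incomes) == 5]

print(f"P(child in top quintile | parent in bottom quintile) "
      f"= {len(risers) / len(bottom):.1%}")
```

Push BETA toward zero and the printed probability drifts toward the 20% you would expect if parental income didn't matter at all; push it toward one and it collapses toward zero, which is the intuition behind reading these percentages as a measure of mobility.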
As a side note, the characterization might be a bit unfair on another scale. The researchers found that "the probability that a child from the lowest quintile of parental income rises to the top quintile is 10.8% in Salt Lake City," and Utah is anything but a liberal bastion. Having said this, however, one would need to take into account the effect of "the church," and the dominance of the LDS faith throughout the state, which creates, in some respects, the "liberal" results. Though Salt Lake's minority populations are growing, not unlike those of the US as a whole, and Salt Lake City itself is experiencing "white flight" from the core city to the outlying suburbs of Ogden and Provo, Utah remains a remarkably homogeneous state both racially and religiously. The church itself provides considerable social capital, and its value scheme places great emphasis on both large and stable family structures. Because of this, it is a very "young" state, with an emphasis on "good schooling" for the children, and while there is some revolt against "public" schooling typical of conservative politics, church members have traditionally supplemented public schooling with private religious instruction. Next to my campus at Salt Lake, for example, there was the LDS Institute, which served an instructional mission relative to the LDS faith as well as an extended social network. Moreover, there is a supplemental "tax" on income through tithing, which provides for church activity, but also for a social safety net. Finally, there is a considerable entrepreneurial culture among the LDS, so much so that it is sometimes difficult for a non-Mormon, like me, to distinguish between the missionary and the marketer. In short, while it is possible for non-governmental structures to provide for the so-called liberal agenda, and Salt Lake stands as a clear example, it is unlikely that the conditions prevailing in Salt Lake could prevail throughout the country. If not "the church," then the next best bet is "the government," and so you see a "liberal" polity that emphasizes less racial and ethnic discrimination, income redistribution from the top down aimed at greater income equality, increased funding for schooling from pre-school through college, and more emphasis on social safety nets.
Having said all this, I have been leading up to a string of propositions:
- First, if one looks at the successes of the emerging economy, it glows with all the promise of the newest iPhone. Having said this, however, the technical advance often comes at the expense of specific groups of people, the "disposables." Just as the technical advances in robotics have decreased the need for workers on the factory floor, advances in natural gas extraction have decreased the reliance on the miners who extract coal. We can expect the advances in artificial intelligence to displace the human intelligence behind the wheel of the local taxis and delivery trucks when Google perfects its driverless car and Amazon perfects its drone delivery systems. There are any number of other forces at work to create "disposables," not least the globalization of labor markets for unskilled and semi-skilled workers, but it's wishful thinking to believe that we can just "bring those jobs home," not without other political and social disruptions best enumerated elsewhere. The introduction of "disruptive" technologies, in short, changes the fundamental distribution of labor, and it does so forever. Our social structures do not have an "undo" button. The "disruptive" technologies of past centuries, among other things, "disposed" of the need for agricultural workers. Rural America will never again need the number of agricultural workers that it needed in the past. The industrialization and corresponding urbanization of the nation absorbed many of the "disposable" farm workers, and that worked for a while, but now the "disruptive" technologies of the present, among other things, have "disposed" of the need for industrial workers. Urban America will never again need the number of factory workers that it needed in the past.
- Second, we really do not know how to deal with the growing number of "disposables." It is unreasonable to expect that the "disposables" will simply find other opportunities. Unlike the rural workers of the past, today's displaced workers cannot simply follow the jobs and move to the city. What is the displaced textile worker to do? Move to Bangladesh? Nor can the displaced assembly worker just "become" a robotics or CNC technician, not without a significant personal and financial investment in education, and even if they could make that investment, the number of displaced assembly workers far exceeds the number of robotics or CNC technicians needed. While there is job growth, for the unskilled or semi-skilled factory worker, taking one of those burgeoning "service industry" jobs usually means a step down the economic ladder into work that, without union protections, is poorly paid, part-time, and without benefits.
- Third, it is not a far leap from being "disposable" to being "deplorable." Who is to blame? If we ask that question -- and it is almost inevitable, as human beings, that we WOULD ask that question -- it is virtually impossible to make an emotionally satisfying case for the more or less random and ultimately impersonal economic forces at work. It is absolutely impossible to reduce such a case to a Twitter feed. If I buy into the mythos of self-reliance -- if I believe, really believe, that success is the result of individual intelligence and character -- then what am I to make of my own economic failure? If I believe, really believe, the failure is a result of my own stupidity and laziness, it shouldn't surprise us that disability claims and the slow suicide of drug addiction are on the rise, not to mention outright suicide. If I cannot bring myself to believe the failure is a result of my own shortcomings, then there must be nefarious forces at work. Likewise, it really shouldn't surprise us that conspiratorial animosities aimed at blacks, Hispanics, Muslims, whites, cops, the one percent -- choose your loathing -- are on the rise. One's own historical circumstances will, of course, influence the particular nature and shape of one's populist loathing, but at bottom it all serves the same emotional purpose -- deflecting responsibility for one's condition onto others.
- Fourth, as a member of the educated elite -- Brown PhD -- I can say with smug assurance that more or less random evolutionary forces are at work economically and socially. Though it is POSSIBLE to lift oneself out of poverty by sheer entrepreneurial pluck, for the vast majority of Americans, upwards of 90%, it remains highly UNLIKELY. For the 90%, where they were born and to whom they were born has a greater impact on their future than their individual characteristics. We don't choose our parents, and in that sense, we do not choose our likely fate. For those lucky enough to be born into a stable family sufficiently wealthy to live in a neighborhood with good schools and a broad network of contacts well integrated into licit economic and social structures, good on you. The impoverished are just, well, ratf**ked from the outset by disrupted families, bad schools, and a broad range of social contacts more familiar with the criminal justice system, the intricacies of gang hierarchies, and the illicit drug distribution systems, all of which serve to perpetuate their condition.
- Fifth, and finally, as a member of the educated elite, I can say with smug assurance that our democracy will most assuredly not be brought low by the moral degradation of "gay marriage" as a punishment from god for our embrace of sodomy. As we "dispose" of more and more people, it will be brought low by the slow decline from ideology into idiocracy, the identity politics of hate, and the demagoguery that goes with it. The growing disparities, and the despair that accompanies such disparities, are not an "individual" issue but a "social" and "generational" issue, and they must be handled as such, but we seem incapable of doing so. For those who have theirs, it is much more satisfying to believe, really believe, that it is the result of their individual superiority. It is much more satisfying to believe, really believe, that the disposables are simply deplorable and deserve their fate. For the increasing number of "disposables," particularly those who have been recently "disposed," however, it is much more satisfying to believe, really believe, that their condition is the result of nefarious activity, a vast conspiracy of -- choose your loathing -- directed at "us" and our kindred. It is much more satisfying to believe, really believe, that "they" -- choose your loathing -- are the great satan that must be destroyed.
So, what to do? I can give another string of "what not to do" propositions. We cannot arrest technical development. We should not devolve into idiocracy or succumb to deplorability. We should not, in other words, elect Trump, expecting Trump to push the magic button and irradiate the hated other. If Trump's candidacy has revealed anything, it is that the vaunted "white anger" fueling it marks an emergent category of "disposables." We should elect Clinton, but we should do so with the full understanding that her election will simply be an ideological stay against an uncertain future.