Platt Perspective on Business and Technology

There are no low value playing cards: some thoughts regarding Yuval Harari’s vision of our collective future

I find myself writing about hands-on, non-managerial employees and what they do as an ongoing narrative thread that runs throughout this blog. And I write of the value and importance of these employees, for how they carry out so much of the actual day-to-day activity that brings a business – any business or organization – to life. I write about management and managers too, and about leadership in a business, and I write about what good, effective management and leadership can mean in that vastly complex and far-ranging context. Have I simply been focusing on the workplace of the past in all of this? I write that thinking back to Plato’s Republic and how it can be viewed as a swan song reminiscence of classical Greece and its city states, from when they still held sway as the supreme source and voice of power, of order, and of culture and civilization. That state of being was already slipping into an unrecoverable past, even as Plato composed his treatise on the Greek city state. (For a full text English language translation of The Republic, follow this link, and you can of course find this blog in its entirety at this WordPress link.)

I write this blog as an ongoing narrative: a single, multi-threaded, interconnected treatise on businesses and technology, with a goal of addressing as full a range of scales of impact and involvement as possible in that. And my goal in all of this has been both to describe and discuss our current here and now, as of this writing, and to predictively project forward from that, for what is at least likely to come.

In one sense, my goal for this posting is that it offer at least one more point of connection that might be drawn between the two seemingly disparate topics of labor and management that I began this note with. And I do so against a rapidly emerging backdrop of general employee marginalization, with that playing out as even basic job security vaporizes for so many, and as executive suite compensation increases for the few to levels that have no real historical parallel. And I particularly note here the very public, ongoing awarding of hundreds of millions of dollars and more to executives as golden parachutes, even as so many who receive such largess have been revealed as abusers through the #MeToo movement, or have otherwise proven themselves unworthy of any severance package at all. This public awarding of such excessive severance benefits holds even as senior executives who have held positions of trust and responsibility are forced out of office at those corporations for egregious malfeasance.

• I have written of this disparity in this blog in more strictly economic terms, citing the growing skew in the Gini coefficient (or Gini index, as it is also called) as it is measured across national economies and their private sectors – both in countries such as the United States that claim to hew to free market policy, and in countries such as the People’s Republic of China that are overtly state run, and one-Party run at that, even as they offer what are described (there) as private sector opportunities. (See the short illustrative sketch after this list for how that coefficient is calculated.)
• And I have written of this in the context of automation and the increasingly widespread advent of at least artificial specialized intelligence agents that are capable of taking over an increasing range of job descriptions and jobs held.
• And I have written of this in terms of job relocation, with that largely meaning the exporting of work performed to locales that offer lower salaries and benefits – and, I have to add, lower workplace safety standards too: reducing personnel-related costs, which have traditionally been among most businesses’ largest cost centers.
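For readers less familiar with the Gini coefficient cited in the first of those bullet points, the short Python sketch below shows one standard way of computing it from an income distribution, using the sorted-index formulation. The income figures and the gini function name here are purely hypothetical and illustrative, not drawn from any actual data set.

```python
# A minimal sketch, using hypothetical income figures, of how the Gini
# coefficient mentioned above can be computed for a set of incomes.

def gini(incomes):
    """Return the Gini coefficient for a list of non-negative incomes:
    0.0 means perfect equality; values near 1.0 mean extreme concentration."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard sorted-index formula:
    # G = (2 * sum(i * x_i)) / (n * sum(x)) - (n + 1) / n, with i = 1..n
    weighted_sum = sum(i * x for i, x in enumerate(xs, start=1))
    return (2 * weighted_sum) / (n * total) - (n + 1) / n

# Hypothetical example: a relatively flat distribution versus a heavily skewed one.
print(gini([40_000, 45_000, 50_000, 55_000, 60_000]))     # roughly 0.08
print(gini([20_000, 25_000, 30_000, 35_000, 5_000_000]))  # roughly 0.78
```

A value near 0 indicates a comparatively flat income distribution; values approaching 1 indicate the kind of concentration at the top that the growing skew noted above points to.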

All three of these views of this trend and its emerging consequences, along with others that I have also at least touched upon in this blog that fit into them, hinge on an at least implicit presumption that most workers and potential workers have limited and diminishing overall value in the workplace and in the workforce as a whole. And that brings me to Yuval Harari’s best-selling books, and to two of them in particular:

• Harari, Y.N. (2015) Sapiens: a brief history of humankind. Harper Collins, and
• Harari, Y.N. (2017) Homo Deus: a brief history of tomorrow. Harper Collins.

And I focus in what follows here on the second of those two books, as Harari seeks to map out what he sees as possible and even likely in the years to come – in his case projecting much farther forward into the future than I do in my writings.

I begin addressing those books, and his second in particular, by noting what I can only see as the irony implicit in its title. Harari writes in Homo Deus of the death of traditionally conceived gods, of the rise of humanism as a religion, and of the rise of what he calls dataism – the observance of data, and of big data in particular, as if it were an omnipotent, omniscient, omnipresent being. And in the same narrative progression, he writes of Homo sapiens transcending our species’ biologically evolved and (up until now) technologically constrained limitations, to become Homo Deus: mankind as god-kind. And to complete that circle, he relegates Homo sapiens turned Homo Deus to what amounts to the same dustbin of history that he had just consigned the gods of traditional religions to.

• What would supplant humanity, and in ways that would make at least most people irrelevant?
• And how and in what sense would humans at least for the most part become irrelevant?
• And what change agents would drive this?

I have already touched upon the answers to those questions in my above-offered bullet points, when mentioning automation and artificial intelligence. And I only need to add big data – the god of Harari’s dataism – to that set to round out his answers to them, at least as offered in his books. There, big data is viewed as essential to meaningfully living in the world as a valued, actively contributing member of society, where “big” has long since meant “beyond mere human understanding and capability,” both for scale and volume and for the ever-shortening timing of reasoned response and action that big data and its effective use demand.

Harari writes of the marginalization of humanity and of humans – and certainly unaugmented ones – on the grounds that people will increasingly be unable to keep up with what our tools, and our smart tools in particular, can do. In principle at least, people, or at least the most skilled of them, might be required for initially developing artificial specialized intelligence agents, and certainly up to the point where those agents would self-learn and self-evolve on their own from there. Development of a true artificial general intelligence might very well take people out of the loop even for that, and even the smartest and most skilled of them.

I have recently found myself writing here about artificial general intelligence and what, at least categorically, would go into its functional design and development. See, for example, my series: Reconsidering Information Systems Infrastructure, as can be found appended to the end of Section I in my Reexamining the Fundamentals directory page. As risky as longer-range predictions can be for suggesting when specific technologies will arise and how they will mature, I do expect that humanity and its artificial specialized intelligence agents will in fact arrive at at least a meaningful threshold level of artificial general intelligence before the end of the 21st century, and quite possibly before its midpoint. Yes, computer and information science professionals have been making that type of next-few-decades prediction since the dawn of the electronic computer, so I am keeping good (and also inaccurate) company there. But conservatively speaking, and certainly given the possible consequences of its happening, prudence would dictate that humanity be prepared for it, and from a presumption that it will definitely arise soon. And that brings me to the last few pages of Homo Deus, and Harari’s statement that all that he had been offering in the preceding pages and chapters there need not happen – if, that is, people decide to actively work towards the development and emergence of a more human-included, and even a more human-centric, 22nd century and beyond.

• I see that as one of the great essential goals of the 21st century, one that people, and the businesses that they form and work in, and the governments that they live under, should all be working on. And we need to start working on this before, for example, we have locked in some specific understanding of what artificial general intelligence even is, with that already effectively set in stone by what has already been developed for it, and to the point where it would simply be taken for granted.

I end this posting by citing a second series that I am also currently developing and adding to this blog: Moore’s Law, Software Design Lock-In, and the Constraints Faced when Evolving Artificial Intelligence (as can be found at Reexamining the Fundamentals 2 as its Section VIII). And with that in mind, I make explicit note of the title of this thought piece and its opening phrase: “there are no low value playing cards.”

A poker hand built from a well-selected set of what would generally be presumed to be low value number cards (e.g. three 2’s) can beat a set of nominally higher valued cards (e.g. a pair of kings) any day. My point here is that nominal value and routine expectation can be misleading, and they can even be very specifically wrong, certainly in the right contexts. Realistically, who offers more positive value to a business that is leading a repudiated executive officer to the exit, dismissing them for egregious behavior that makes the entire business look bad? Is it that now-former executive as they are walked out of the door with their golden parachute in hand, or is it the janitor who will clean out their now-former office there for further use by others?

Turning back to reconsider Harari and his message, we have to rethink, and perhaps even just think through as if for the first time, what really holds value for people and in people. I would suggest starting that with the simpler here-and-now exercise of better dealing with those golden parachutes and who gets them, and then expanding out from there to deal with the challenges raised by AI and automation and the rest, as Harari warns us to do.

I am certain to have more to add to this topic, or rather this complex set of them, in future writings. Meanwhile, you can find this essay at Reexamining the Fundamentals 2, as appended to the end of its Section VIII series (and also see Page 1 of that directory). And I also include this in my Ubiquitous Computing and Communications – everywhere all the time 3 directory page; also see Page 1 and Page 2 of that directory for further related material.
