The Dark Side of Digital

Four years ago, I wrote a post reflecting on my 50 years in IT and the pursuit of value from the use of IT. I described the changes that had occurred over that time since I started my working life as a computer operator on an IBM 1401, which had a processing speed (not really published as such) close to 10 million times slower than today’s (2013) microprocessors, 8k of storage (later upgraded, with an additional unit, to 16k), no solid state or hard drive, no displays or communication capability, and no operating system (that was me!). Weighing in at around 4 tons, it needed a fully air-conditioned room, with a raised floor, approximately twice the size of my living room.

I described how my world in 2013 compared with that time. I had powerful technology in my small home office, wirelessly connected within my house, and to the world beyond through the internet. I had access to an ever-growing body of knowledge that could answer almost any question I had, and which enabled me to manage my banking, pay bills, check medical lab test results, organize travel, shop, read books, listen to music, watch videos, play games, organize, edit and enhance photographs and videos, and perform a myriad of other tasks.

I went on to describe how, beyond my individual world, at the enterprise level, the technology model was changing from computing – the technology in and of itself – to consumption – how individuals and organizations use technology in ways that can create value for them and, in the case of organizations, their stakeholders. I discussed the extent to which technology, and how it was being used, was continuing to change at an ever-increasing rate, including:

  • increasing adoption of the Cloud;
  • Software (and just about anything else) as a Service;
  • the explosion of “Big Data” and, along with it, analytics and data visualization;
  • mobility, consumerization and BYOD, which fundamentally change how, where and when we interact with technology and access information;
  • the “internet of things” (IOT), bringing with it unprecedented challenges in security, data privacy, safety, governance and trust; and
  • robotics and algorithmic computing which have considerable potential to change the nature of work.

I closed by talking about what hadn’t changed, and what needed to change. Putting my value lens on, I lamented that, 15 years after The Information Paradox – in which I described the challenge of getting value from so-called “IT projects” – was first published, the track record remained dismal, and realizing the value promised by IT remained elusive. I attributed this situation to several factors, the primary ones being:

  • a continued, often blind focus on the technology itself, rather than the change – increasingly significant and complex change – that technology both shapes and enables;
  • the unwillingness of business leaders to get engaged in, and take ownership of, this change – electing to abdicate their accountability to the IT function; and
  • failure to inclusively and continually involve the stakeholders affected by the change, without whose understanding and “buy in” failure is pretty much a foregone conclusion.

 

What a difference four years makes

OK – that’s (probably more than) enough of a recap. I’m now going to fast-forward some 4 years (a lifetime in the digital world) to today, 2017. While the challenge of creating and sustaining value from our use of technology described above is still real, our failure to address it, along with an almost total failure of leadership – technical, business and government – has brought us into an increasingly dark place – one that I think few of us saw coming, certainly not unfolding as it is. I call this place “the Dark Side of Digital”. I alluded to it in 2013 in discussing IOT, robotics and algorithmic computing, when I said that they brought with them “unprecedented challenges in security, data privacy, safety, governance and trust…(and) have considerable potential to change the nature of work”. I would now revise and add to the latter, saying “…have considerable potential to fundamentally impact the future of work and, indeed, the future of society”.

The elements of this dark side fall into three main categories:

  1. Cybersecurity: This is the most traditional category – one that, albeit not so named, has been with us since the advent of computers, when cards, tapes or other media could be lost or stolen. However, as our connectedness continues to increase, so does our susceptibility to cybersecurity attacks, with a growing number of such threats arising out of machine-to-machine learning and the Internet of Things. There are nearly 7 billion connected devices in use this year, and this is expected to jump to a whopping 20 billion over the next four years. Most cybercriminals are now operating with increasing levels of skill and professionalism. As a result, the adverse effects of cyber-breaches, hacks and attacks, including the use of ransomware and phishing, continue to escalate, resulting in increased physical loss and theft of media, eroded competitive advantage and shareholder value, and severely damaged reputations. More severe attacks can seriously disrupt regular business operations and governmental functions, resulting in the temporary outage of critical services and the compromise of sensitive data. In the case of nation-state supported actors, attacks have the potential to cause complete paralysis and/or destruction of critical systems and infrastructure, with significant destruction of property and/or loss of life. Under such circumstances, regular business operations and/or government functions cease, and data confidentiality, integrity, and availability are completely compromised for extended periods.
  2. The Future of Work: The fear that technology will eliminate jobs has been with us pretty much since the advent of the first commercial computers but, until the last few years, the argument that new jobs will appear to replace the old has largely held true. Now, however, the revolutionary pace and breadth of technological change is such that we are experiencing the situation recently described by the Governor of the Bank of England, Mark Carney:

“Alongside great benefits, every technological revolution mercilessly destroys jobs & livelihoods well before new ones emerge.”

Early AI and IOT systems are already augmenting human work and changing management structures across labor sectors. We are seeing, and can expect to continue to see, uneven distribution of the impact of AI across sectors, job types, wage levels, skills and education. It’s very hard to predict which jobs will be most affected by AI-driven automation.

While, traditionally, low-skill jobs have been at the greatest risk of replacement from automation, as Stephen Hawking says, the “rise of artificial intelligence is likely to extend job destruction deep into the middle classes, with only the most caring, creative or supervisory roles remaining.” He goes on to say that “we are at the most dangerous moment in the development of humanity”.

  3. The Future of Society: On the societal front, a paradigm shift is underway in how we work and communicate, as well as how we express, inform and entertain ourselves. Equally, governments and institutions are being reshaped, as are systems of education, healthcare and transportation, among many others.

AI and automated decision-making systems are often deployed as background processes, unknown and unseen by those they impact. Even when they are seen, they may provide assessments and guide decisions without being fully understood or evaluated. Visible or not, as AI systems proliferate through social domains, there are few established means to validate AI systems’ fairness, and to contest and rectify wrong or harmful decisions or impacts. Professional codes of ethics, where they exist, don’t currently reflect the social and economic complexities of deploying AI systems within critical social domains like healthcare, law enforcement, criminal justice, and labor. Similarly, technical curricula at major universities, while recently emphasizing ethics, rarely integrate these principles into their core training at a practical level[1]. As Mike Ananny and Taylor Owen said in a recent Globe and Mail article[2], there is “a troubling disconnect between the rapid development of AI technologies and the static nature of our governance institutions. It is difficult to imagine how governments will regulate the social implications of an AI that adapts in real time, based on flows of data that technologists don’t foresee or understand. It is equally challenging for governments to design safeguards that anticipate human-machine action, and that can trace consequences across multiple systems, data-sets, and institutions.” This disconnect further adds to the erosion of trust in our institutions that we have been seeing over several decades.

Adding to the threats to society is the proliferation of the internet and social media. In a world where we can all be publishers, we see shades of Orwell’s 1984 in a post-truth world of alternate facts and fake news. Rather than becoming a more open and collaborative society, we see society fracturing into siloed echo chambers of alternate reality, built on confirmation bias, and fed by self-serving populist leaders proposing dangerously simplistic solutions – sometimes in tweets of 140 characters or less – to poorly understood and increasingly complex issues.

 

So, what do we need to do?

The complexity of these challenges, and their interconnectedness across sectors make it a critical responsibility of all stakeholders of global society – governments, business, academia, and civil society – to work together to better understand the emerging trends.

If business leaders expect to harness the latest technology advances to the benefit of their customers, business and society at large, there are two primary challenges they need to address now.

  1. As companies amass vast amounts of personal data used to develop products and services, they must own the responsibility for the ethical use and security of that information. Ethical and security guidelines for how data is collected, controlled and ultimately used are of paramount concern to customers, and rightfully so. To gain the trust of customers, companies must be transparent and prove they employ strong ethical guidelines and security standards.
  2. It is incumbent on organizations to act responsibly toward their employees and make it possible for them to succeed in the rapidly changing work environment. That means clearly defining the company vision and strategies, enabling shifting roles through specialized training, and redefining processes to empower people to innovate and implement new ways of doing business to successfully navigate this new and ever-changing environment.

As a society, if we are to avoid sleepwalking into a dystopian future – described in 2013 by internet pioneer Nicco Mele as one “inconsistent with the hard-won democratic values on which our modern society is based… a chaotic, uncontrollable, and potentially even catastrophic future” – we must recognize that technology is not destiny; institutions and policies are critical. Policy plays a large role in shaping the direction and effects of technological change. “Given appropriate attention and the right policy and institutional responses, advanced automation can be compatible with productivity, high levels of employment, and more broadly shared prosperity.”

The challenge is eloquently described by WEF founder and executive chairman Dr. Klaus Schwab:

“Shaping the fourth industrial revolution to ensure that it is empowering and human-centred, rather than divisive and dehumanizing, is not a task for any single stakeholder or sector or for any one region, industry or culture. The fundamental and global nature of this revolution means it will affect and be influenced by all countries, economies, sectors and people. It is, therefore, critical that we invest attention and energy in multi-stakeholder cooperation across academic, social, political, national and industry boundaries. These interactions and collaborations are needed to create positive, common and hope-filled narratives, enabling individuals and groups from all parts of the world to participate in, and benefit from, the ongoing transformations.”

 

A call to action!

We need, as Dr. Schwab goes on to say, to “…take dramatic technological change as an invitation to reflect about who we are and how we see the world. The more we think about how to harness the technology revolution, the more we will examine ourselves and the underlying social models that these technologies embody and enable, and the more we will have an opportunity to shape the revolution in a manner that improves the state of the world.”[3]

We cannot wait for “them” to do this – as individuals, we can and must all play a leadership role as advocates in our organizations and communities to increase the awareness and understanding of the changes ahead, and to shape those changes such that, as Dr. Schwab says, they are empowering and human-centred, rather than divisive and dehumanizing.

[1] Source: The AI Now Report, The Social and Economic Implications of Artificial Intelligence Technologies in the Near-Term, a summary of the AI Now public symposium, hosted by the White House and New York University’s Information Law Institute, July 7th, 2016

[2] Ethics and governance are getting lost in the AI frenzy, The Globe and Mail, March 20, 2017

[3] Source: The Fourth Industrial Revolution: Risks and Benefits, Wall Street Journal, Feb 24, 2017

The Digital Economy and the IT Value Standoff

The emerging digital economy, and the promise and challenges that it brings, including the need to shift focus beyond reducing cost to creating value, are adding fuel to the seemingly never-ending discussion about the role of the IT function and the CIO. There is questioning of the very need for, and/or the name of, the position and the function it leads. Discussions around the need for a CDO, the so-called battle between the CMO and the CIO for the “IT budget”, and other similar topics proliferate ad nauseam. Unfortunately, most, although not all, of these discussions appear to be about the technology itself, along with the associated budgets, power and egos, within a traditional siloed organizational context. This is akin to shuffling the deck chairs on the Titanic, or putting lipstick on a pig – it’s way past time for that! As technology becomes embedded in and across everything we do, and we are increasingly becoming embedded in everything technology does, we have to acknowledge that the way we have managed technology in the past will be a huge impediment to delivering on the promise of the Digital Economy. Indeed, it has proven woefully inadequate to deliver on the promise of technology for decades.

Recent illustrations of this include failed, or significantly challenged, healthcare projects in the U.S., Australia and the U.K., as well as disastrous payroll implementations in Queensland, New Zealand and California (you would really think that we should be able to get payroll right). And this situation is certainly not unique to the public sector, although public-sector failures tend to be more visible. In the private sector, a large number of organizations continue to experience similar problems, particularly around large, complicated ERP, CRM and Supply Chain systems.

All too often, these situations are described as “IT project” failures. While there may have been some technology issues, in most cases this is rubbish. As I and others have said many times before, the ubiquitous use of the term “IT project” is a symptom of the root cause of the problem. Labelling and managing investments in IT-enabled business change as “IT projects”, and abdicating accountability for them to the CIO, is a root cause of the failure of so many to generate the expected payoff. Business value does not come from technology alone – in fact, technology in and of itself is simply a cost. Business value comes from the business change that technology increasingly shapes and enables – change of which technology is only one part, and increasingly often only a small part. Technology only contributes to business value when complementary changes are made to the business, including increasingly complex changes to the organizational culture, the business model and the operating model, as well as to relationships with customers and suppliers, business processes and work practices, staff skills and competencies, reward systems, organizational structures, physical facilities, etc.

From my many previous rants about our failure to unlock the real value of IT-enabled change, regular visitors to this blog will know that I am particularly hard on non-IT business leaders, starting with Boards and CEOs, for not stepping up to the plate. When it comes to IT, the rest of the business, from the executive leadership down, has expected the IT function to deliver what the business asks for, assuming little or no responsibility itself – until it comes time to assign blame when the technology doesn’t do what was hoped for. The business change that IT both shapes and enables must be owned by business leaders, and they must accept accountability, and be held accountable, for creating and sustaining business value from that change. This cannot be abdicated to the IT function.

However, having spent quite a lot of time over the last few months speaking with CIOs and other IT managers, I have had it brought home to me that some, possibly many, of them are just as much at fault. There appear to be a number of different scenarios, including CIOs who:

  1. “Get it” and are already seen as a valued member of the executive team, providing leadership in the emerging digital economy;
  2. “Get it”, but have been unable – and, in some cases, have given up trying – to get the rest of the executive team to step up to the plate;
  3. Sort of “get it”, but don’t know how to have the conversation with the executive team;
  4. May “get it”, but are quite happy to remain passive “order-takers”; or
  5. Don’t “get it”, still believing that IT is the answer to the world’s problems, and don’t want to “give up control”.

The result, in all too many cases, is a standoff where the business doesn’t want to take ownership, and the IT function doesn’t know how, or doesn’t want to give up control. As Jonathan Feldman said in a recent InformationWeek post, “…enterprise IT, like government IT, believes in the big lie of total control. The thought process goes: If something lives in our datacenter and it’s supplied by our current suppliers, all will be well…my observation is that the datacenter unions at enterprises want “the cloud” to look exactly like what they have today, factored for infrastructure staff’s convenience, not the rest of the supply chain’s.” Until this standoff is resolved, the “train wrecks” will continue, and we will continue to fail to come anywhere near realizing the full economic, social and individual value that can be delivered from IT-enabled change.

At the root of all this is what I described in an earlier post as The real alignment challenge – a serious mis-alignment between enterprises, which are becoming increasingly complex organisms, and leaders who, lacking an ecosystem mindset, adopt mechanistic solutions to change them. But it’s also more than this – in a recent strategy+business post, Susan Cramm talked about “the inability of large organizations to reshape their values, distribution of power, skills, processes, and jobs”. The sad fact is that, as organizations get bigger, an increasing amount of attention is spent looking inward, playing the “organizational game”, with inadequate attention paid to the organizations’ raison d’être, their customers, or their employees. As Tom Waterman said, “eventually, time, size and success results in something that doesn’t quite work.” Increasingly today, it results in something that is, or will soon be, quite broken.

Most of the focus of the conversation about the digital economy today is on improving the customer experience, as indeed it should be – although we have been saying the same for decades with, at best, mixed success. We will come nowhere close to achieving that success unless we put equal focus on our people, and on rethinking how we govern, manage and organize for the digital economy such that we maximize the return on our information and our people.

This will require that leaders truly lead – moving beyond tactical leadership, aka managing, to strategic and transformational leadership. That we move from a cult of individual leadership – “the leader” – to a culture of pervasive leadership, enabling and truly empowering leadership throughout the organization, putting meaning to that much-abused term “empowerment”. That we break the competitive, hierarchical, siloed view and move to a more collaborative, organic, enterprise-wide view. The technology exists to support this today – what is lacking is the leadership mindset, will and capability to make the change. As Ron Ashkenas said in a 2013 HBR blog, “The content of change management is reasonably correct, but the managerial capacity to implement it has been woefully underdeveloped”.

I am not saying that this will be easy to do – it isn’t; very little involving organization, people and power is. And somehow, throwing in technology seems to elevate complexity to a new dimension. And we certainly don’t make it any easier with the ever-growing proliferation of books, frameworks, methods, techniques and tools around the topic – many of which have evolved out of the IT world and are, as a result, while intellectually correct, often over-engineered and bewilderingly complex to the executives and business managers who need to “get this”.

So, let’s get back to the basics – governance is about what decisions need to be made, who gets to make them, how they are made, and the supporting management processes, structures, information and tools needed to ensure that those decisions are effectively implemented, complied with, and achieving the desired levels of performance. It’s not about process for process’s sake, analysis paralysis, endless meetings, or stifling bureaucracy – it’s about making better decisions by finding the right balance between intellectual rigour and individual judgement. In a previous post, Back to the Basics – the Four “Ares”, I introduced the four questions that should be the foundation for that decision-making:

  1. Are we doing the right things?
  2. Are we doing them the right way?
  3. Are we getting them done well?
  4. Are we getting the benefits?

A common reaction to the four “ares” is that they are common sense. Indeed they are but, unfortunately, they are far from common practice! If business leadership is to move beyond words in addressing the challenge of creating and sustaining value from investments in enterprise computing, social media, mobility, big data and analytics, the cloud, etc., emphasis must be placed on action – on engagement and involvement at every level of the enterprise, with clearly defined structures, roles and accountabilities for all stakeholders related to creating and sustaining value. The four “ares” are a good place to start!

 

2012 – A Perfect Storm in IT!

One consequence of a 3-month hiatus, forced initially by surgery and concluded more voluntarily with much-needed relaxation in Hawaii, is that I have had time to actually read and digest much of the material that, all too often, I only have time to quickly scan – and then rarely get back to. Amongst all this material was a considerable amount of prognostication on 2012 trends. In many ways, little of this was new but, collectively, it does amount to a “perfect storm” that challenges the way we as individuals, societies, and enterprises – small and large, public and private – look at, use and manage technology, including both the demand and supply side and, probably most importantly, where they intersect. In this post, I will briefly discuss the elements of this “perfect storm”, add the one element that I find to be conspicuously missing from the dialogue, and discuss the implications of both.

  1. The “cloud” – the dream of the “information utility” has been around for decades and, with the “cloud”, it is now closer to being a reality, although there are still significant governance, security and privacy issues to work through (some real, some “noise”).
  2. The data explosion, “big data” – I read recently that 90% of the data in the world today was created in the last 2 years. This exponential growth of data is creating both enormous challenges and great opportunities – on the technology side, developments include the rise of Hadoop and the recent announcements of DynamoDB from Amazon and Big Data Appliance from Oracle, as well as the growing need for new data visualization and “data scientist” skills.
  3. Analytics, particularly real-time analytics – some of the technologies mentioned above, and indeed those below, are fundamentally changing the analytics landscape. Huge amounts of data – structured or unstructured – can now be analyzed quickly, and data can increasingly be captured and analyzed in real time. The challenge here is to resist the temptation to succumb to analysis paralysis – to know what information is both relevant and important, what questions to ask, and to think ahead to what actions might need to be taken based on the answers to those questions.
  4. Mobility – services can now be accessed, data captured, information found, and transactions performed from almost anywhere – other work locations, coffee shops, restaurants and bars, at home, in other countries, in taxis, trains or buses, on airplanes or even on a cruise ship – limitations of distance and time have been virtually eliminated. The challenge here, apart from the security and privacy issues that are common to most of these points, is to be able to find the “off” button in an increasingly always-on, 24/7 world. On an individual basis we need to maintain a work-life balance, and from a business perspective, “burn out” seriously erodes the effectiveness and value of an enterprise’s most critical resource – people.
  5. Consumerization, including BYOD and “apps” – while it could be argued that these 3 could each merit their own category, I have chosen to “lump” them together as, collectively, they represent a further significant shift from the traditional “technology push” world, with the IT function in a control mode as the gatekeeper, to the “user tool pull” world, with IT, potentially – if they get it right – in a facilitation role as a service broker.
  6. Social Media – this is, to some extent, simply one “flavour” of the previous 2 elements, but a very significant one, with potentially huge implications. While much of the attention to date has been on controlling social media, enterprises are increasingly using it as a communication channel and, beyond that, tapping into it to find out what their customers and employees are thinking. Here, one challenge/opportunity that I see is how we can use social media to improve performance by tapping into the collective knowledge within organizations – “crowd sourcing” input into decision-making and, as a result, making better-informed decisions, and having employees feel more connected with, and empowered by, their organizations.

In all the discussion around the elements of this “perfect storm”, much if not most of the focus has been on the IT function needing to respond more quickly to deliver and/or support capabilities in these areas. There has been much less discussion of how the use of these technologies will lead to positive outcomes – creating and/or sustaining individual, societal and enterprise value – or of the changes that will be needed in the behaviour of individuals, societies and enterprises if that value is to be realized. If we as individuals and societies are not to become “the tools of our tools”, and enterprises are not to continue the increasingly expensive and value-destructive litany of IT failures, we need to shift our focus from the technology to how we manage and use the capabilities that the technologies provide to increase the value of our lives, our societies and our enterprises.

I don’t make these comments as a latter-day “Luddite” – rather, my focus on value is driven by many decades of frustration at our being nowhere near to realizing the individual, societal and business value that intelligent and appropriate use of technology can create. We will not close that gap until we – as individuals, or as leaders in society or business – take “ownership” of how we use technology, based on the outcomes that are important to us, and the value that we seek to create and sustain! In the enterprise world, this has fundamental implications for the roles and accountabilities of business executives and line-of-business managers, and for the role of the IT function, as discussed in 2 earlier posts, The Future of IT, and Value from IT – There is a Better Way!

In closing, in the context of individual and societal value, 2 areas that I have long had an interest in, and that I will be watching closely this year, are healthcare and education. While we shouldn’t expect seismic shifts in either to happen quickly – it’s just not the nature of the beasts – the ground is starting to move. In healthcare, much of this is driven by the funding crunch, with increased focus on eHealth, based largely on “meaningful use” of EHRs, as well as an increasing number of apps such as the Philips Vital Signs Camera for the iPad. In education, with some exceptions, it is still somewhat more of a grass-roots movement, although Apple’s recent iBooks 2 and iTunes U announcements, and organizations such as Curriki, may well be changing this. What I believe we will see here, over time, is an evolution beyond eHealth and eLearning to iHealth and iLearning, with individuals taking increasing “ownership” of their own health and education.