A professor explains how you can control the future of work

As companies increasingly look to team machines, such as AI and autonomous technologies, with humans in the workforce, company values have never been more important.

Technology, no matter how smart it may seem, is not capable of understanding the nuances of the decisions it makes or recommends. Only humans can do this. And so ensuring all employees truly understand company values and what ethical lines cannot be crossed will be crucial to consistently drive the right decisions.

Values and innovations impacting the Gen Z workforce

Values are incredibly important to Millennials and Gen Z. “We increasingly understand that younger generations can look at the same world as the older generations and see something different,” explains Julie Jenson Bennett, director of the Generation Poetry project, which studies the children of the 21st century and how they communicate. “Where we worry about technology and robots and AI taking our jobs and being better than us, they worry about the people building those technologies and the ethics they’re using to make them.

“They know that technology is not an objective force that can be trusted more than humans to tell the truth and be right, but just another way that humans express themselves, including lying, cheating, and stealing as well as loving, learning, and laughing together.”

These issues are also becoming increasingly important to everyone, from investors to government to academia to the media. Companies that allow themselves to be ruled by technology and not by values will not succeed in the future. Over the next 10 years, a constellation of technologies is going to radically change how we live and work. No industry will be left unaffected nor any part of our personal lives. I call this the coming age of Sentient Tools.

These tools are a mix of artificial intelligence (AI), the Internet of Things (IoT), the Industrial Internet of Things (IIoT), smart cities, distributed computational intelligence, robotics, and autonomy on land, at sea, and in the air. The age of Sentient Tools is approaching quickly and advancing at ever-increasing speed. We know it will have massive effects on the labor force, as well as on how we educate and train the next generation of workers.


How you can control the future of work

While this can sound scary, I am not pessimistic about these coming technologies. Understanding the impact they may have on the future means that we can be active participants in deciding how we want to work with these technologies, rather than letting them dictate the future.

Too often people and organizations imbue technology with too much intelligence. There are many examples and anecdotes of AI-powered facial recognition producing ridiculous results at best, and at worst, results that are flat-out racist or sexist. And watch a robot struggle to open a door, and it’s clear that robotic overlords are not coming to take over anytime soon.

As we begin to bring these Sentient Tools into our companies and into our lives we must understand that while they do have intelligence, it’s all artificial intelligence — and AI is not human-level intelligence.

Building a high-tech human hammer

No matter the platform, these technologies are tools, and tools should be designed to aid humans. A hammer is just a hammer. Alone, it’s not very interesting. What makes a hammer interesting is when you use it to build a house and make somebody’s life better. This is how we must judge these coming technologies. We must ask ourselves how we are using them to enhance our companies and make people healthier, happier, more productive, better connected, more secure, or, simply, make them laugh a little.

But it is also true that all technology can be used to hurt people, whether intentionally or unintentionally. The reality of these tools is that there is peril. One cannot design a hammer capable of building a house that is not also capable of injuring or even killing someone. That’s where humans come in with culture and ethics. We have laws that say injuring someone with a hammer is illegal, and we have norms that say, “it’s just not cool.”

As we move into the future of Sentient Tools, you must embrace the idea that you are in control and that it is up to you to set the ethics. But in truth, it’s not a matter of ethics. If we ask, “How do we make ethical autonomous technologies?” then we are asking the wrong question, and imbuing technology with too much intelligence and power. The end state we are ultimately striving for is ethically compliant machines: technologies that adhere to our rules, ethics, and norms.

Playing the Sentient Tools — and playing to our ethics

Once you have decided upon these rules, ethics, and norms for your company and for society as a whole, instantiating them inside of technology is relatively simple. With the right tools, team structures and trained professionals — think engineers, ethicists, social scientists, and managers — it can become a natural part of doing business.

The real question, then, becomes not if we’re going to work side by side with the machines, but how we will work side by side. It is up to us as humans and as business leaders to decide and document our organization’s ethics. When it comes to adopting these amazing technologies that will transform nearly every part of our lives, you will need to ask yourself: what is the future we want, and what is the future we want to avoid?

Every organization needs to decide the future it wants. You decide as an organization made up of humans. What are the values and ethics that you will support and reward? This is not only how culture is made inside a company; it’s also how we set technological safeguards.

Be ruled by values, not by technology

“We must be careful that in a hyper-accelerated, hyper-connected future of merged human/algorithm systems, we don’t minimize human values to maximize shareholder value,” warns Renny Gleeson, Managing Director of the Business Innovation Group at Wieden+Kennedy, Ad Age’s 2019 Agency of the Year. “It would be too easy and too sad, and we must be better than that.”

We know that the coming age of Sentient Tools is going to change our organizations. The question now in front of you is how you will establish and communicate the effects of these technologies to your employees and customers. If you don’t prioritize this, you’ll not only turn off tomorrow’s Gen Z workforce and customers but also leave yourself unprotected from ethical issues in the world at large.

Brian David Johnson is a Professor of Practice at Arizona State University’s School for the Future of Innovation in Society, and a Futurist and Fellow at Frost & Sullivan, a visionary innovation company focused on growth. He also works with governments, militaries, trade organizations, and startups to help them envision their future. He holds over 30 patents and is the author of a number of books of fiction and nonfiction.