How to safely realise the enormous potential of AI – Interview with Juliette Powell and Art Kleiner



Today’s interview is with Juliette Powell and Art Kleiner. Juliette is an author, entrepreneur, technologist, and strategist who works at the intersection of culture, data science, and ethics; Art is a writer, editor, consultant/facilitator, and entrepreneur with a background in technology, business culture, scenario thinking, and organizational learning. They are both faculty members at New York University’s Interactive Telecommunications and Interactive Media Arts programs and have recently co-authored a new book called The AI Dilemma: 7 Principles for Responsible Technology. Juliette and Art join me on the podcast to talk about the book, the four challenges we face in regaining control of our personal data, the five steps businesses can take to build AI accountability, the regulation of AI, and the three things teams need to consider when developing or using systems that generate text, images, or other content, amongst a whole bunch of other things.

This interview follows on from my recent interview – The science behind repairing trust – Interview with Professor Peter Kim – and is number 484 in the series of interviews with authors and business leaders who are doing great things, providing valuable insights, helping businesses innovate and delivering great service and experience to both their customers and their employees.

Here are the highlights of my conversation with Juliette and Art:

  • In the right hands, AI is beneficial to all of us, but it could also cause serious harm, such as treating certain groups of people differently from others.
  • We are our personal data. That’s the thing that people see first. They see it before they see our body language, before we show up in person. We look each other up, and we know more about each other than we arguably ever have before.
  • The four challenges that we face to regain control of personal data:
    • Digital trust: do people really believe that their data is going to be held with the same respect that we would give it ourselves?
    • Digital infrastructure: do we have the wherewithal to manage our personal data? Do we have the electricity and the internet to manage it?
    • Digital access: are some groups able to control their data more than other groups?
    • Data literacy: do we have the wisdom and knowledge (the technical knowledge and the human knowledge) to use it responsibly?
  • Many of the biases that we have in our systems are essentially biases that we as humans have.
  • The more languages you use on an AI system, the better it is.
  • There’s so much oral history that is not on the internet at all.
  • AI takes people’s existing biases and automates and amplifies them – just as it does with the things we do like about human nature.
  • The five steps businesses can take to build their AI accountability:
    • Education – Be aware of the consequences of what you do. How do you do that? Good judgment comes from experience and experience comes from poor judgment. So you have to have education that allows people to experiment and see what the outcomes are without really doing harm.
    • Governance of data – Where are we collecting our data from? Facial recognition data, for example, systematically underrepresents people with darker skin and women. Are we taking precautions that the data covers the range of people that we’re interested in? If it’s a system about people, are we aware of how representative the data we’re gathering is of the problems we’re trying to solve?
    • Governance of the model – Can the model be queried if it does something unfair? Can the organization that created it be queried? Can the organization balance its structural need for secrecy, its trade secrets, and its competitive advantage against the legitimate need that people have to find out what’s really going on with the system?
    • Intellectual property – Who owns the material? If you are generating new work out of old work then is the new work respecting the same kind of IP ownership that the old work had?
    • Open-sourcing – Like we did with the internet, can we settle on a few basic common approaches to things? Will those approaches lead us to a more responsible way of using the technology?
  • Just about every tech company that I spoke to said that they were still playing catch-up when it came to GDPR and that they were in no way ready for the Artificial Intelligence Act coming out of the EU.
  • I am a big believer in self-regulation. Self-regulation for all of us as humans: now that we have generative AI in the palm of our hands, I think it’s up to us to be responsible for it.
  • I’m shocked sometimes when I listen to some of my students who genuinely are asking a chatbot whether they should quit their job, whether they should leave their partner, whether they should move to the other side of the world, and they trust the response from the chatbot more than they do from their friends and family, saying that the system has more data on them and therefore knows them better.
  • The three things that teams should consider in developing systems that generate text, images, and other content:
    • Actors – Who are the actors who are going to be using what you develop and how are they going to use it?
    • Behavior – what are the things people can do with what you produce?
    • Content – Aristotle talked about the good, the beautiful and the true. There’s going to be a lot more beautiful. And hopefully a lot of good. And then how much is going to be true? And how do we know?
  • It’s easy to change data and technology. The thing that’s really hard to change is ourselves, because that requires us to do things differently, develop new habits and behaviors. That’s the thing that we don’t necessarily pay as much attention to as we should. But it’s the thing that we should lean into if we’re going to get the most and the best out of this technology.
  • You don’t really learn about a complex system until you get into it, and really study it.
  • Juliette’s Punk CX word(s): The right to experience
  • Art’s Punk CX word(s): (Creative) friction
  • Juliette’s Punk XL brand: Virgin Atlantic.
  • Art’s Punk XL brand: The independent proprietors.

About Juliette

Juliette Powell is an entrepreneur, technologist, and strategist who works at the intersection of culture, data science, and ethics. Her consulting services focus on global strategy and scenarios related to AI and data, banking, mobile, retail, social gaming, and responsible technology. She has delivered live commentary on Bloomberg, BNN, NBC, CNN, ABC, and BBC and presentations at institutions like The Economist, Harvard, and MIT. She works with such organizations as YPO, Reuters, the United Nations, Warner Brothers, Cirque du Soleil, IBM, and the World Bank Group.

She is the author of the best-seller, 33 Million People in the Room: How to Create, Influence, and Run a Successful Business with Social Networking (Financial Times Press, 2009). She was a cofounder with Intel Labs of the research network WeTheData. Her newest title, The AI Dilemma is based in part on her research conducted at Columbia University and through The Gathering Think Tank. Juliette is a faculty member at New York University’s Interactive Telecommunications Program and the founding partner of Kleiner Powell International (KPI).

Say Hi to Juliette on the social media network formerly known as Twitter @juliettepowell, and feel free to connect with Juliette on LinkedIn here.

About Art

Art Kleiner is a writer, editor, consultant/facilitator, and entrepreneur with a background in technology, business culture, scenario thinking, and organizational learning. He has long been ahead of the curve in describing what would happen next – for instance, in the late 1970s he described how the internet would personalize magazine articles. He is a faculty member at New York University’s Interactive Telecommunications and Interactive Media Arts programs, where he has taught classes on the future of media since the 1990s. Previous books include The Age of Heretics: A History of the Radical Thinkers Who Reinvented Corporate Management (Warren Bennis Books/Jossey-Bass, 1996 and 2018); Who Really Matters: The Core Group Theory of Power, Privilege and Success (Random House, 2003); and (with Jeffrey Schwartz and Josie Thomson) The Wise Advocate: The Inner Voice of Strategic Leadership (Columbia University Business Press, 2019). His newest title, co-authored with Juliette Powell, is The AI Dilemma: 7 Principles for Responsible Technology.

Between 2005 and 2020, Kleiner was editor-in-chief of the award-winning management magazine Strategy+Business and a director at PwC. He was editorial director of the best-selling Fifth Discipline Fieldbook series with Peter Senge and an editor at the Whole Earth Catalog. He has a degree in journalism from the University of California at Berkeley.

Feel free to connect with Art on LinkedIn here.

Finally, do grab a copy of Juliette and Art’s new book: The AI Dilemma.

Credit: Image by Gerd Altmann from Pixabay

Republished with author's permission from original post.

Adrian Swinscoe
Adrian Swinscoe brings over 25 years’ experience to helping companies large and small develop and implement customer-focused, sustainable growth strategies.

