
The Responsible Innovation Framework For The Age Of AI


This year, with facial recognition, privacy, and ethics gaining more attention in the marketplace, technology leaders are increasingly thinking about AI’s impact on people’s lives. But with mixed reality and 5G wireless converging with artificial intelligence, the focus is shifting to Responsible Innovation.

Alka Roy founded the Responsible Innovation Project to start a dialogue with technology leaders across industries about what it means to develop technologies responsibly.

Her framework seeks to create norms in the marketplace where law and market forces drive each other to propel innovation forward in a human-centered, sustainable way.

She recently spoke at the Newton Series at UC Berkeley, where her vision resonated with several hundred students. They saw the need to bring a responsible lens to the technology culture and the technology ecosystems we are creating.

Alka Roy says, “Ask yourself, how much do you trust the people who come in and try to sell you something called responsible AI—a package of sorts. What does responsible AI even look like? Is there such a thing? It’s a hard problem to solve. It takes work. When people build AI or come with toolkits or solutions, I trust them more when I think they're being responsible. When they are willing to show up with transparency about the shortcomings and value. When they can talk about how they hold themselves accountable.”

Roy’s framework puts responsible innovation at the center of developing technology that is dependable & inclusive, delightful & trusted, and open & safe. It accounts for how law, market forces, and technology can work together to create norms within the ecosystem.

Roy says, “This framework is not rocket science. It’s a reframing. The external variables or stakeholders are defined, not as pressures as we are taught in business schools, but as influences. You have to be viable economically for you to sustain and prosper, yes. You interpret and answer to the legal and policy requirements. And then there is self-regulation, the norms, the culture. This is what I am really focusing on. With innovation and technology, this is the part that is most fluid and critical. Because we are constantly in motion, making choices and decisions.”

Cultural Norms Come From Interactions Within The Innovation Ecosystem

Our innovation ecosystem is composed of many different products, companies, and industries, where people develop, consume, and care about the technologies they use. It’s the interactions among these products, companies, and technologies that naturally drive the set of principles that can become the cultural norms of Responsible Innovation.

Roy says, “Every digital experience or interaction we have, like a search engine interaction or a mapping product, is a place where a slew of people, many companies, and many technologies have come together to create this complex little hive. What if everybody followed a set of principles with transparency about values and milestones, and it was part of the process and discipline to assess risks and understand the impacts, not only to the people we usually call stakeholders but to the wider set of stakeholders? If you couldn’t answer that question, there would be others to help, and it would be ok to ask for help. Think of the scientific process or the manufacturing industry, which has built processes to navigate higher risks and the need for precision. Sometimes just understanding who and what we are excluding and including helps. And this can be based on the risk, built into our decision-making tools and best practices, with this other lens of how it will impact the world.”
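Roy does not prescribe a specific tool, but her idea of building risk and inclusion questions directly into decision-making can be sketched in code. The Python fragment below is a minimal, hypothetical illustration, not part of any published framework: the ImpactAssessment fields and the ready_to_ship gate are invented names meant only to show how “who are we excluding?” can become an explicit variable in a release decision.

```python
from dataclasses import dataclass

@dataclass
class ImpactAssessment:
    """Hypothetical record of the questions Roy describes: who is
    considered, who is left out, and which risks remain unresolved."""
    feature: str
    stakeholders_included: list[str]
    stakeholders_excluded: list[str]   # who the team knowingly leaves out
    risks: dict[str, str]              # risk -> mitigation, or "unresolved"
    asked_for_help: bool = False       # asking for help is part of the process

def ready_to_ship(a: ImpactAssessment) -> bool:
    """Release gate: unresolved risks block launch unless help was sought,
    and skipping the exclusion analysis blocks launch outright."""
    unresolved = [r for r, m in a.risks.items() if m == "unresolved"]
    if unresolved and not a.asked_for_help:
        print(f"Blocked: unresolved risks {unresolved}; ask for help.")
        return False
    if not a.stakeholders_excluded:
        print("Blocked: no analysis of who is excluded.")
        return False
    return True

review = ImpactAssessment(
    feature="face-match search",
    stakeholders_included=["end users", "advertisers"],
    stakeholders_excluded=["people photographed without consent"],
    risks={"misidentification": "unresolved"},
)
print(ready_to_ship(review))  # False: the risk is neither mitigated nor escalated
```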

Often, when we think about societal issues created by technology, we tend to pit humans against machines. This kind of thinking creates an atmosphere of animosity toward technology that ultimately does not help us solve the underlying issues to move the needle toward sustainable innovation.

Roy says, “I just want people to understand that when these discussions become framed as people against machines, it's problematic because it's not really humans against machines. It's really a bunch of humans and a bunch of machines and a bunch of processes and a bunch of cultures trying to get as much out of a user as they can. It's really a many-to-one correlation. If you study modeling or game theory or behavioral economics, it gets confusing and complex fast. We're definitely not going to solve the problem correctly if we don't even understand it correctly.”

Understanding The Problem And Making Incremental Shifts

Last month, Alka Roy convened a roundtable of industry leaders to identify the main issues related to Responsible Innovation. A report produced after the roundtable consolidated everyone’s feedback; it is a starting point for understanding the layers of responsible innovation.

Roy says, “What I've been doing is what many others have been doing for a long time—just peeling the layers. Peeling the layers for myself and with my colleagues and really anyone I talk to. The messy part, though, is that you have to arrive at it for yourself. It’s a collective and individual journey. Large companies, meaning the people in those companies, are doing what they are doing because it has worked for a long time. It's habitual. First, they have to slow down and admit that there are problems or opportunities. Then they have to be willing to give something up that they are currently getting. You know, with their current process and habits. Then, they have to know what to fix and how to fix it. They have so many competing interests. And what if they're not even measured on building technology and AI with care and values—you know, responsibly? If they aren’t rewarded, internally or in external communities, or even for thinking about it, will it be worth it?”

Humans create technology, but relationships underlie both the creation and the use of these technologies. Relationships are formed from everyday interactions: between consumers and technology, between developers and the products they build, and between the businesses that bring technologies to market. Each of these interactions can represent an incremental shift in cultural norms toward a more responsible innovation paradigm.

Roy says, “Relationships determine interactions, and they both evolve. They are constantly evolving. So, it’s hard for me to prescribe a definite way—the right or wrong way. That is why I stay away from shaming or even taking discrete ethical stands unless it’s an extreme or clear case. I’m an engineer and an artist. To me, everything is about discovery and incremental shifts. And we have a responsibility to teach our students how to do that responsibly, our children, and the machines that we create, especially if we build autonomy into them.”

Technology Leadership And The Mindset Shift

In the last few decades, as technology has been developed and deployed at an unprecedented pace, we have also seen the problems that companies and industries consistently grapple with. Issues such as privacy and equality were debated and negotiated within the free-market system. These negotiations often start at lines of code and propagate through systems to become industry norms.

Roy says, “What I have been researching is how we are making decisions with technology, be it neural networks or AI or a brain-computer interface (BCI), or even simple decisions like what comments should I put next to these three lines of code so that someone else can understand it. How do we understand our own agency, our power, influence? Do we feel a sense of responsibility for what we're putting out into the world in a complex way? Not trying to please someone else, but tapping into our own self-regulation, our own power. Whenever I talk to a startup founder or a tech leader about that, if they allow themselves to go there, there is a shift of 1% maybe, but then they make decisions differently. We have to educate our humans, our people differently, have them tap into their agency to make those responsible decisions under time pressures.”

When we educate the next generation of technology leaders differently, there’s tremendous power in creating these mindset shifts toward responsibility. Underlying every technology are relationships that flourish on trust. For instance, hiring in technology historically focused on finding unicorns. But that is changing now. We are more focused on hiring technologists with complementary skills and building creative, multidisciplinary teams to execute innovations. In these teams, there are opportunities to build a trusting culture that enables each individual to think more about the impact of the technology being created.

Roy says, “Ok, so that feels natural to me. It’s messy but also filled with potential for innovation—like our mind and social systems and relationships. We build our relationships in layers, scaffolding after scaffolding. When we make business or technology decisions today, we do that. We use decision trees, mental models, algorithmic models about risks, rewards, and probability. What we need to do is to make some of these variables transparent, add variables that include impact—leverage inclusive design practices, good engineering models. Apply and see how it works. The stakes are high. It matters how we choose the test or training or synthetic data-sets. What do we know about their source, or their lineage? Who can they harm? Who do they leave out? What rigor was used in making and testing the model we are using? How much agency does our design take from other people?”
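One concrete way to make those data-set variables transparent, in the spirit of the “datasheets for datasets” idea, is to record Roy’s questions as explicit fields and flag whichever ones still lack a real answer. The sketch below is hypothetical; DatasetCard and missing_answers are illustrative names, not an existing tool.

```python
from dataclasses import dataclass

@dataclass
class DatasetCard:
    """A hypothetical datasheet capturing the questions Roy raises about
    any training, test, or synthetic data-set."""
    name: str
    source: str              # where the data came from
    lineage: str             # how it was collected, filtered, or generated
    known_gaps: list[str]    # who or what the data leaves out
    potential_harms: list[str]
    test_rigor: str          # how the resulting model was validated

def missing_answers(card: DatasetCard) -> list[str]:
    """Return the questions that still have no real answer, so gaps become
    visible variables instead of silent assumptions."""
    gaps = []
    if card.lineage in ("", "unknown"):
        gaps.append(f"{card.name}: lineage is unknown")
    if not card.known_gaps:
        gaps.append(f"{card.name}: no analysis of who is left out")
    if not card.potential_harms:
        gaps.append(f"{card.name}: no analysis of who could be harmed")
    return gaps

card = DatasetCard(
    name="synthetic-faces-v2",
    source="vendor X",
    lineage="unknown",
    known_gaps=[],
    potential_harms=["misuse for surveillance"],
    test_rigor="holdout accuracy only",
)
for gap in missing_answers(card):
    print(gap)
```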

Constant Evolution Means Constant Redemption

Similar to the way that innovation evolves, cultural norms of responsible innovation, once defined in the industry, are adopted by companies incrementally. Behind every decision, there’s a chance to improve that decision for a better outcome.

Roy says, “We all have a chance of redemption with every decision. This is what I told the startup founder I talked to this morning: if you felt kind of icky talking to that investor, you're going to have another chance to talk differently to a different person. Be willing to say yes and no for yourself. This stuff is not easy, but what is? There is so much clutter that we often can’t hear ourselves. Learn from each thing. If there’s a part of you judging yourself, see if you can redirect it to the next decision. In my experience, self-righteousness and judgments restrict rather than expand...”

As technologies become increasingly complex, our understanding of the products we develop, and of how their use evolves in the marketplace, has to change. On any given day, our assumptions may be limited. As the marketplace changes, there’s always a need to evolve our thinking to keep up with these technologies. Therefore, there’s always room to think about the impact of our technology and to make incremental shifts toward responsibility.

Roy says, “We have mental models, which we're trying to obviously put into computers, right? We have various mental models that we constantly navigate, and they're competing mental models. We program many of these mental models into machines. The reason we get many of them wrong is because we are often trying to make decisions about our future based on what we know about the past. And honestly, it’s hard to pinpoint and code the fluidity of even slightly complex and interdependent decision making. We have to ask the question: Why are we designing technology the way we are designing it? And what needs to evolve? Get curious about what we think we know and see if we can see it or understand it in a novel way.”
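Her point that machines encode the past can be shown with a toy example: a trend fitted on historical data keeps predicting the old regime after the world changes. This is only an illustration of the idea, not anything from Roy’s framework.

```python
import numpy as np

rng = np.random.default_rng(0)

# "The past": fifty observations from an old regime with slope 2.
past_x = np.arange(50)
past_y = 2.0 * past_x + rng.normal(0.0, 1.0, 50)

# The "mental model" the machine learns is a summary of that past.
slope, intercept = np.polyfit(past_x, past_y, 1)

# "The future": the regime changes, but the fitted model cannot know that.
future_x = np.arange(50, 60)
future_y = 0.5 * future_x + 75.0   # continuous at x = 50, then a new slope

predictions = slope * future_x + intercept
print("mean absolute error on the future:",
      round(float(np.mean(np.abs(predictions - future_y))), 2))
```

The error grows with every step into the new regime, which is exactly the fluidity of interdependent decision making that Roy says is hard to pinpoint and code.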
