Ethics in Tech Education: Designing to Provide Opportunity for All

Mariah Hay, Vice President of Product, Pluralsight

We know that ability is equally distributed among humans, but opportunity is not. As the need for skilled technologists grows, so must our ability to empower individuals with accessible tech training. The data that can be gathered about an individual’s learning patterns can help inform the ultimate personalized educational experience, accelerating the cycle from novice to master, or it could be weaponized: used to judge an individual and block opportunities for jobs and advancement. As we design experiences and systems, we become the ethical stewards of the impact we could have on millions of lives. It’s up to us to make the right, and often hard, decisions. Hear from Mariah Hay, VP of Product at Pluralsight, about her experience designing products for tech education, the choices her teams have made to avoid weaponization, and how human-centered design can inform the ethical underpinnings of our missions, our companies, and our bottom lines.

Mariah is a dedicated product leader who is kept up at night by the thought of failing to support her team and her customers. She is a self-described human-centered design nerd, and her team is made up of problem finders and solvers: you have to find the right problem to solve, and then you have to be able to execute on the solution. We have a cool, very powerful job, perhaps unlike any other in the history of the world. It is frightening that we have that kind of power. If we’re not careful, we can be problem creators. With great power comes great responsibility.

We have to understand ethics in the context of technology: it means dealing with good and bad “with moral duty and obligation”. That sense of duty comes from our industry and from our social circles. Since the 5th century, professional groups have crafted ethical codes. The earliest physician ethics required doctors to expand their knowledge by consulting with other physicians. Medicine has paved the way with six principles:

[Slide: the six principles of medical ethics]

Engineering has a code of ethics, too, with six main tenets:

[Slide: the six main tenets of the engineering code of ethics]

The engineering community provides examples of how to uphold each one of these, including what you might encounter in the real world and what to do about it. Engineers are given an iron ring to wear on their pinky finger as a reminder.

What happens when it’s not so life-or-death as in medicine or engineering? When we create problems, what are our ethical responsibilities? She recently read a book called Tragic Design, which provided an example of how the Epic medical records system led to the death of a cancer patient because critical information wasn’t available on the primary screen seen by the nurses.

Johns Hopkins researchers found that medical errors may be the third leading cause of death in the US. This has to do with communication and workflow, things that could be addressed by software. What would you do if that was your software? Volkswagen was charged with being deceptive about its emissions, which resulted in prison terms and fines for the engineer and his general manager, and more recently charges against the CEO.

Cambridge Analytica is not a pure data science company; it is a “full service propaganda machine”. Mariah shared an interview with one of the founders, in which he describes their work as a “grossly unethical experiment” and admits that they were experimenting with the “psychology of an entire nation”. If you were asked to design a system like this, what would you do? These issues stem from a lack of ethical accountability. Some of it is lack of awareness; some is blatant disregard for the law. That lack of empathy, that apathy, allows people to just shrug their shoulders.

She works in technology education. Education should improve people’s lives, and ultimately the human condition. Some time ago, Pluralsight acquired Smarterer. Pluralsight had 7,000 courses, and the Smarterer algorithm helped assess people’s skills and guide them to courses: what did they already know, and what courses should they take next? Learners loved it, and they didn’t mind that Pluralsight was using their data to show progress, benchmark, and so on. However, Pluralsight also sells directly to enterprises, for their employees. So now they had to think about what service they should be providing to those companies without betraying their learners. It raised all kinds of ethical questions. Would it discourage people from taking assessments whose results might affect them? Their first ethical tenet is “don’t weaponize the product”. Their second non-negotiable is “find your blind spots”. To address the latter, they met with other practitioners to make sure their research was sufficiently thorough.

In fact, it became clear from their research that learners were worried about some of this data. They feared being ranked against other people. Learners receive a score from 1 to 300, along with a proficiency level (which they found useful), and they worried about how the scores would affect raises and promotions. So Pluralsight made the decision to share the proficiency level but not the score itself. Companies submitted feature requests that included many of the things learners didn’t want, and many of those suggestions would have weaponized the product. So the team worked to understand the pain points behind those requests in order to design alternative solutions.
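A minimal sketch of that decision in code might look like the following. The function names, tier names, and score thresholds here are invented purely for illustration (the talk only mentions a 1–300 score and a proficiency level); this is not Pluralsight’s actual implementation:

    # Hypothetical sketch: share a coarse proficiency level with an
    # organization while withholding the learner's raw 1-300 score.
    # Tier names and boundaries are invented for illustration only.

    def proficiency_level(raw_score: int) -> str:
        """Map a raw assessment score (1-300) to a proficiency tier."""
        if not 1 <= raw_score <= 300:
            raise ValueError("raw score must be between 1 and 300")
        if raw_score < 100:
            return "Novice"
        if raw_score < 200:
            return "Proficient"
        return "Expert"

    def organization_report(assessment: dict) -> dict:
        """Build the view shared with an employer: proficiency only,
        deliberately omitting the underlying score."""
        return {
            "learner": assessment["learner"],
            "skill": assessment["skill"],
            "proficiency": proficiency_level(assessment["raw_score"]),
            # no "raw_score" key: the number stays with the learner
        }

    print(organization_report(
        {"learner": "A. Learner", "skill": "Python", "raw_score": 172}
    ))
    # -> {'learner': 'A. Learner', 'skill': 'Python', 'proficiency': 'Proficient'}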

The main point here is the pause for proactive, accountable ethics: they didn’t just put all of the requested features in the backlog. This is a big deal. If they let companies use these assessments instead of interviews, those companies won’t engage in a complete conversation with candidates. And so on.

Their third ethical tenet is taking “200% accountability” for their decisions: 100% accountable for themselves, and 100% accountable for the people around them.

They continued to strive to meet the needs of both organizations and learners, but in a way that maintained the trust of their learners. In the end, this can be a critical competitive advantage, one that is bigger than the product itself. It also has really cool side effects:

[Slide: the side effects of operating on human-centered ethics]

Operating on human-centered ethics pays off in both the near term and the long term. Pluralsight experienced 40% year-over-year growth and successfully IPOed in May of this year.

Mariah recapped three key things that drive her team’s behavior and decision-making:

  • They are committed to finding their blind spots, which includes being honest about and aware of their personal abilities and limitations.
  • They are fierce about not weaponizing the product, making sure that a solution or experience does not have a negative downstream impact on anyone.
  • The team is 200% accountable: 100% accountable for the things they do personally, and 100% accountable for those around them.

Her challenge to all of us is to act with courage. These challenges won’t go away. We have the potential to harm, even kill, more people than we thought possible three, five, or ten years ago.

We can affect people’s lives, and we have a choice about where we spend our time and energy. So please, act with courage. We have to consciously decide to participate. Companies aren’t inherently unethical. But they aren’t people, so they can’t have ethics – those have to live in the practitioner.

It is the doctor, not the hospital, that takes the Hippocratic oath. That is why the engineer, and not their company, wears that iron ring.