Ross, a system built on the back of IBM’s Watson, claims it can interpret the questions lawyers ask it and read “through the entire body of law,” returning “a cited answer and topical readings from legislation, case law and secondary sources to get you up-to-speed quickly.”

But the first thing I noticed about Ross wasn’t how many legal documents it can search at once, or how accurate it claims to be. It was the name: Ross. It’s an ordinary name for a human, but it stands out among other AIs. Siri, Cortana, Alexa, Ross: one of these things is not like the others. Siri, Cortana, and Alexa are digital assistants—they help you find your coffee meeting, manage your calendar, play your music. Ross is a lawyer.

Put another way, if we want women, men, young people, older people, people of color or any other demographic to use a cybersecurity product, they should be part of designing that product. That’s a basic tenet of Silicon Valley, enshrined as a key principle of user experience design, the backbone of technology product development. For the most part, this mindset has yet to penetrate the cybersecurity field.

You need only look as far as Siri and other digital assistants to see the pernicious effects that homogeneity can have on technology: A recent study shows that Siri, Cortana, and the rest of the gang can easily help users who report they’ve had a heart attack—but are flummoxed after hearing “I’ve been raped.” On the flip side, when women are at the design and engineering table, they are more likely to know that heart-attack symptoms in women often differ markedly from those in men, and to build a different response for when a female voice asks Siri what a heart attack feels like.