The MIT Media Lab controversy and getting back to ‘radical courage’, with Media Lab student Arwa Mboya – gpgmail


People win prestigious prizes in tech all the time, but there is something different about The Bold Prize. Unless you’ve been living under a literal or proverbial rock, you’ve probably heard something about the late Jeffrey Epstein, a notorious child molester and human trafficker who also happened to be a billionaire philanthropist and managed to become a ubiquitous figure in certain elite science and tech circles.

And if you’re involved in tech, the rock you’ve been living under would have had to be fully insulated from the internet to avoid reading about Epstein’s connections with MIT’s Media Lab, a leading destination for the world’s most brilliant technological minds, also known as “the future factory.” 

This past week, conversations around the Media Lab were hotter than the fuel rods at Fukushima, as The New Yorker’s Ronan Farrow, perhaps the most feared and famous investigative journalist in America today, blasted out what for some were new revelations that Bill Gates, among others, had given millions of dollars to the Media Lab at Jeffrey (no fucking relation, thank you very much!) Epstein’s behest. Hours after Farrow’s piece was published, Joi Ito, the legendary but now embattled Media Lab director, resigned.

But well before Farrow weighed in or Ito stepped away, students, faculty, and other leaders at MIT and far beyond were already on full alert about this story, thanks in large part to Arwa Michelle Mboya, a graduate student at the Media Lab, from Kenya by way of college at Yale, where she studied economics and filmmaking and learned to create virtual reality. Mboya, 25, was among the first public voices (arguably the very first) to forcefully and thoughtfully call on Ito to step down from his position.

Imagine: you’re heading into the second year of your first graduate degree, and you find yourself taking on a man who, when Barack Obama took over Wired magazine for an issue as guest editor, was one of just a couple of people the then sitting President of the United States asked to personally interview. And imagine that man was the director of your graduate program, and the reason you decided to study in it in the first place.

Imagine the pressure involved, the courage required. And imagine, soon thereafter, being completely vindicated and celebrated for your actions. 

Arwa Mboya. Image via MIT Media Lab

That is precisely the journey that Arwa Mboya has been on these past few weeks, including when human rights technologist Sabrina Hersi Issa decided to crowd-fund the Bold Prize to honor Mboya’s courage, which has now brought in over $10,000 to support her ongoing work (full disclosure: I am among the over 120 contributors to the prize).

Mboya’s advocacy was never about Joi Ito personally. If you get to know her through the interview below, in fact, you’ll see she doesn’t wish him ill.

As she wrote in MIT’s The Tech nine days before Farrow’s essay and ten before Ito’s resignation, “This is not an MIT issue, and this is not a Joi Ito issue. This is an international issue where a global network of powerful individuals have used their influence to secure their privilege at the expense of women’s bodies and lives. The MIT Media Lab was nicknamed ‘The Future Factory’ on CBS’s 60 Minutes. We are supposed to reflect the future, not just of technology but of society. When I call for Ito’s resignation, I’m fighting for the future of women.”

From the moment I read it, I thought this was a beautiful and truly bold statement by a student leader who is an inspiring example of the extraordinary caliber of student that the Media Lab draws.

But in getting to know her a bit since reading it, I’ve learned that her message is also about even more. It’s about the fact that the women and men who called for a new direction in light of Jeffrey Epstein’s abuses and other leaders’ complicity did so in pursuit of their own inspiring dreams for a better world.

Arwa, as you’ll see below, spoke out at MIT because of her passion to use tech to inspire radical imagination among potentially millions of African youth. As she discusses both the Media Lab and her broader vision, I believe she’s already beginning to provide that inspiration. 

Greg Epstein: You have had a few of the most dramatic weeks of any student I’ve met in 15 years as a chaplain at two universities. How are you doing right now?

Arwa Mboya: I’m actually pretty good. I’m not saying that for the sake of saying. I have a great support network. I’m in a lab where everyone is amazing. I’m very tired, I’ll say that. I’ve been traveling a lot and dealing with this while still trying to focus on writing a thesis. If anything, it’s more like overwhelmed and exhausted as opposed to not doing well in and of itself.

Epstein: Looking at your writing — you’ve got a great Medium blog that you started long before MIT and maintained while you’ve been here — it struck me that in speaking your mind and heart about this Media Lab issue, you’ve done exactly what you set out to do when you came here. You set out to be brave, to live life, as the Helen Keller quote on your website says, as either a great adventure or nothing. 

Also, when you came to the Media Lab, you were the best-case scenario for anyone who works on publicizing this place. You spoke and wrote about the Lab as your absolute dream. When you were in Africa, or Australia, or at Yale, how did you come to see this as the best place in the world for you to express the creative and civic dreams that you had?

Mboya: That’s a good question — what drew me here? The Media Lab is amazing. I read Whiplash, which is Joi Ito’s book about the nine principles of the Media Lab, and it really resonated with me. It was a place for misfits. It was a place for people who are curious and who just want to explore and experiment and mix different fields, which is exactly what I’ve been doing before.

From high school, I was very narrow in my focus; at Yale I did Econ and film, so that had a little more edge. After I graduated I insisted on not taking a more conventional path many students from Yale take, so [I] moved back to Kenya and worked on many different projects, got into adventure sports, got into travel more.

Epstein: Your website is full of pictures of you flipping over, skydiving, gymnastics — things that require both strength and courage. 

Mboya: I’d always been an athlete, loved the outdoors.

I remember being in Vietnam; I’d never done a backflip. I was like, “Okay, I’m going to learn how to do this.” But it’s really scary jumping backwards; the fear is, you can’t see where you’re going. I remember telling myself, “Okay, just jump over the fear. Just shut it off and do it. Your body will follow.” I did and I was like, “Oh, that was easy.” It’s not complicated. Most people could do it if they just said, “Okay, I’ll jump.”

It really stuck with me. A lot of decisions I’ve [since] made, that I’m scared of, I think, “Okay, just jump, and your body will follow.” The Media Lab was like that as well.

I really wanted to go there; I just didn’t think there was a place for me. It was like, I’m not techie enough, I’m not anything enough. Applying was ‘just jump’; you never know what will happen.


Image from Arwa Mboya

Epstein: Back when you were applying, you wrote about experiencing what applicants to elite schools often call “imposter syndrome.” This is where I want to be, but will they want me?

Mboya: Exactly.





How the Valley can get philanthropy right with former Hewlett Foundation president Paul Brest – gpgmail


Paul Brest didn’t set out to transform philanthropy. A constitutional law scholar who clerked for Supreme Court Justice John Harlan and is credited with coining the term “originalism,” Brest spent twelve years as dean of Stanford Law School.

But when he was named president of the William & Flora Hewlett Foundation, one of the country’s largest non-profit funders, Brest applied the rigor of a legal scholar not just to his own institution’s practices but to those of the philanthropy field at large. He hired experts to study the practice of philanthropy and helped to launch Stanford’s Center for Philanthropy and Civil Society, where he still teaches.

Now, Brest has turned his attention to advising Silicon Valley’s next generation of donors.

From Stanford to the Hewlett Foundation

Photo by David Madison / Getty Images

Scott Bade: Your background is in constitutional law. How did you make the shift from being dean at Stanford to running the Hewlett Foundation as president?

Paul Brest: I came into the Hewlett Foundation largely by accident. I really didn’t know anything about philanthropy, but I had been teaching courses on problem-solving and decision making. I think I got the job because a number of people on the board knew me, both from Stanford Law School, but also from playing chamber music with Walter and Esther Hewlett.

Bade: When was this?

Brest: I started there in 2000. Bill Hewlett died the year after I came. Walter Hewlett, Bill’s son, was chair of the board during the entire time I was president. But it’s not a family foundation.

Bade: What were your initial impressions of the foundation and the broader philanthropic space?

Brest: Not having come from the non-profit sector, it took me a year or so to really understand what it [meant] to use our assets in each area in a strategic way.  The [Hewlett] Foundation had very good values in terms of the areas it was supporting — the environment, education, population, women’s reproductive rights. It had good philanthropic practices, but it was not very strategically focused. It turned out that not very many foundations were strategic.

Paul’s framework for thinking about philanthropy


Photo provided by Paul Brest

Bade: What do you mean by ‘strategic’?

Brest: What I mean [by] strategic is having clear goals and having an evidence-based, evidence-informed strategy for achieving them. Big foundations tend to be conglomerates with different programs trying to achieve different goals.

[Being strategic means] monitoring progress as you work towards those goals. Then evaluating in advance whether the strategy is going to be plausible, and then whether you’re actually achieving the outcomes you’re trying to achieve, so that you can make course corrections if you’re not.

[For example,] the likelihood that the roughly billion dollars or more that have been spent or committed to climate advocacy are going to have any effect is quite low. The place where metrics come in is just having an expected-return mindset where yes, the chances of success are low, but we know that the importance of success — or, putting it differently, the effects of failure — are going to be catastrophic.

What a strategic mindset does here is say: it’s worth taking huge bets even where the margins of error of the likelihood of success are very hard to measure when the results are huge.
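To make that expected-return arithmetic concrete, here is a minimal sketch in Python with entirely invented numbers; the probabilities and dollar figures below are illustrative assumptions, not figures from Brest or the Hewlett Foundation.

```python
# Hypothetical expected-return comparison; all numbers are invented for illustration.
def expected_return(prob_success: float, value_if_successful: float, cost: float) -> float:
    """Probability-weighted net benefit of a philanthropic bet."""
    return prob_success * value_if_successful - cost

# A "safe" grant: very likely to work, modest payoff.
safe_bet = expected_return(prob_success=0.9, value_if_successful=5_000_000, cost=1_000_000)

# A long-shot climate-advocacy bet: low odds, enormous payoff if it works.
long_shot = expected_return(prob_success=0.02, value_if_successful=1_000_000_000_000,
                            cost=1_000_000_000)

print(f"Safe grant expected return: ${safe_bet:,.0f}")    # $3,500,000
print(f"Long-shot expected return:  ${long_shot:,.0f}")   # $19,000,000,000
# Even at 2% odds, the long shot dominates because the stakes are so large.
```

This is the sense in which a strategic funder can justify a huge bet whose odds are hard to pin down: even coarse estimates show the expected value dwarfing the cost.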

I don’t want to say the [Hewlett] Foundation was anti-strategic, or totally unstrategic, but it really had not developed [this kind of] systematic framework for doing those things.

Bade: You’re known in the philanthropic community for putting an emphasis on defining, achieving, and measuring impact. Have those sorts of technocratic practices made philanthropy better?

Brest: I think you have to start by asking, what would it mean for philanthropy to be good? From my point of view, philanthropy is good when I like the goals it chooses. Then, given a good goal, when it is effective in achieving that goal. Strategy really has nothing to say about what the goals are, but only how effective it is.

My guess is that 90-plus percent of philanthropy is intended to achieve goals that most of us think are good goals. There are occasions when you have direct conflicts of goals, as you do with, say, the anti-abortion and pro-choice movements, or gun control and the NRA. Those are important arguments.

But most philanthropy is trying to improve education or improve the lives of the poor. My view is that philanthropy is good when it is effective in achieving those goals, and trying to do no harm in the process.

Current debates on philanthropy



Teaching ethics in computer science the right way with Georgia Tech’s Charles Isbell – gpgmail


The new fall semester is upon us, and at elite private colleges and universities, it’s hard to find a trendier major than Computer Science. It’s also becoming more common for such institutions to prioritize integrating ethics into their CS studies, so students don’t just learn how to build software, but whether or not they should build it in the first place. Of course, this raises questions about whether the ethics lessons such prestigious schools are teaching actually make a positive impression on students.

But at a time when demand for qualified computer scientists is skyrocketing around the world and far exceeds supply, another kind of question might be even more important: Can computer science be transformed from a field largely led by elites into a profession that empowers vastly more working people, and one that trains them in a way that promotes ethics and an awareness of their impact on the world around them?

Enter Charles Isbell of Georgia Tech, a humble and unassuming star of inclusive and ethical computer science. Isbell, a longtime CS professor at Georgia Tech, enters this fall as the new Dean and John P. Imlay Chair of Georgia Tech’s rapidly expanding College of Computing.

Isbell’s role is especially significant given Georgia Tech’s approximately 9,000 online graduate students in Computer Science. This astronomical number is the result of a philosophical decision made at the university to create an online CS master’s degree treated as completely equal to its on-campus counterpart.

Another counterintuitive philosophical decision made at Georgia Tech — for which Isbell proudly evangelized while speaking at conferences like the MIT Technology Review’s EmTech Next, where I met him in June — is to admit every student who has the potential to earn a degree, rather than making any attempt at “exclusivity” by rejecting worthy candidates. In the coming years all of this may lead, Isbell projected at EmTech Next, to a situation in which up to one in eight of all people in the US who hold a graduate degree in CS will have earned it at Georgia Tech.


Isbell speaks to Gideon Lichfield, Editor-in-chief of the MIT Technology Review, at its EmTech Next conference in June. Image via MIT Technology Review.

“What they’ve done is pretty remarkable,” said Casey Fiesler, a recent three-time graduate of Georgia Tech and a founding faculty member and CS professor at the University of Colorado’s College of Media, Communication, and Information.

And it’s promising that Fiesler, who has become known in the tech ethics field for her comparative study of curricula and teaching approaches, told me, “ethics can be integrated into online [CS] courses just as easily as it can be into face to face courses.”

Still, it is as daunting as it is impressive to think about how one public school like Georgia Tech might be able to successfully and ethically educate such an enormous percentage of the students in arguably the most influential academic field in the world today. So I was glad to be able to speak to Isbell, an expert on statistical machine learning and artificial intelligence, for this gpgmail series on the ethics of technology.

Our conversation below covers the difference between equality and equity, cultural issues around women in American CS, and what it would look like for ethics to be so integrated into the discussion of computing that students and practitioners wouldn’t even think of it as ethics.

Greg Epstein: Around 1/8 of Computer Science graduate degrees will be delivered by your school in the coming years; you’re thinking inclusively about providing a relatively huge number of opportunities for people who would not otherwise get the opportunity to become computer scientists. How have you achieved that?

Charles Isbell: There’s an old joke about organizations: don’t tell me what your values are, show me your budget and then I’ll tell you what your values are. Because you spend money on the things that you care about.



How ‘ghost work’ in Silicon Valley pressures the workforce, with Mary Gray – gpgmail


The phrase “pull yourself up by your own bootstraps” was originally meant sarcastically.

It’s not actually physically possible to do (especially while wearing Allbirds, having just fallen off a Bird scooter in downtown San Francisco), but I should get to my point.

This week, Ken Cuccinelli, the acting director of United States Citizenship and Immigration Services, repeatedly referred to the notion of bootstraps in announcing shifts in immigration policy, even going so far as to change the words to Emma Lazarus’s famous poem “The New Colossus”: no longer “give me your tired, your poor, your huddled masses yearning to breathe free,” but “give me your tired and your poor who can stand on their own two feet, and who will not become a public charge.”

We’ve come to expect “alternative facts” from this administration, but who could have foreseen alternative poems?

Still, the concept of ‘bootstrapping’ is far from limited to the rhetorical territory of the welfare state and social safety net. It’s also a favorite term of art in Silicon Valley tech and venture capital circles: see for example this excellent (and scary) recent piece by my editor Danny Crichton, in which young VC firms attempt to overcome a lack of the startup capital that is essential to their business model by creating, as perhaps an even more essential feature of their model, impossible working conditions for most everyone involved. Often with predictably disastrous results.

It is in this context of unrealistic expectations about people’s labor that I want to introduce my most recent interviewee in this series of in-depth conversations about ethics and technology.

Mary L. Gray is a Fellow at Harvard University’s Berkman Klein Center for Internet and Society and a Senior Researcher at Microsoft Research. One of the world’s leading experts in the emerging field of ethics in AI, Mary is also an anthropologist who maintains a faculty position at Indiana University. With her co-author Siddharth Suri (a computer scientist), Gray coined the term “ghost work,” as in the title of their extraordinarily important 2019 book, Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. 

Image via Mary L. Gray / Ghostwork / Adrianne Mathiowetz Photography

Ghost Work is a name for a rising new category of employment that involves people scheduling, managing, shipping, billing, etc. “through some combination of an application programming interface, APIs, the internet and maybe a sprinkle of artificial intelligence,” Gray told me earlier this summer. But what really distinguishes ghost work (and makes Mary’s scholarship around it so important) is the way it is presented and sold to the end consumer as artificial intelligence and the magic of computation.
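To illustrate the pattern Gray describes (this is a hypothetical sketch, not any real company’s system), here is a minimal Python example of how a task advertised as ‘AI-powered’ can quietly fall back to an on-demand pool of human workers; every function name and threshold below is invented.

```python
import queue

# Hypothetical queue standing in for a crowdwork platform's task pool.
human_task_queue = queue.Queue()

def classify_with_model(image_url):
    """Stand-in for an ML model; returns (label, confidence)."""
    return "cat", 0.62  # pretend the model is unsure about this image

def human_label(image_url):
    """Stand-in for a person, available on demand, picking up the task."""
    return "cat"  # in reality, a worker somewhere answers within seconds

def classify_image(image_url):
    """What the customer experiences as 'AI-powered' image labeling."""
    label, confidence = classify_with_model(image_url)
    if confidence >= 0.95:
        return label  # the model really did the work
    # Below the confidence threshold, the task is silently routed to a human.
    human_task_queue.put(image_url)
    return human_label(image_url)

print(classify_image("https://example.com/photo.jpg"))
```

The point of the sketch is Gray’s: the person in the fallback path is doing real, on-demand work, but the interface makes that labor invisible to the end consumer.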

In other words, just as we have long enjoyed telling ourselves that it’s possible to hoist ourselves up in life without help from anyone else (I like to think anyone who talks seriously about “bootstrapping” should be legally required to rephrase as “raising oneself from infancy”), we now attempt to convince ourselves and others that it’s possible, at scale, to get computers and robots to do work that only humans can actually do.

Ghost Work’s purpose, as I understand it, is to elevate the value of what the computers are doing (a minority of the work) and make us forget, as much as possible, about the actual messy human beings contributing to the services we use. Well, except for the founders, and maybe the occasional COO.

Facebook now has far more employees than Harvard has students, but many of us still talk about it as if it were little more than Mark Zuckerberg, Sheryl Sandberg, and a bunch of circuit boards.

But if working people are supposed to be ghosts, then when they speak up or otherwise make themselves visible, they are “haunting” us. And maybe it can be haunting to be reminded that you didn’t “bootstrap” yourself to billions or even to hundreds of thousands of dollars of net worth.

Sure, you worked hard. Sure, your circumstances may well have stunk. Most people’s do.

But none of us rise without help, without cooperation, without goodwill, both from those who look and think like us and those who do not. Not to mention dumb luck, even if only our incredible good fortune of being born with a relatively healthy mind and body, in a position to learn and grow, here on this planet, fourteen billion years or so after the Big Bang.

I’ll now turn to the conversation I recently had with Gray, which turned out to be surprisingly more hopeful than perhaps this introduction has made it seem.

Greg Epstein: One of the most central and least understood features of ghost work is the way it revolves around people constantly making themselves available to do it.

Mary Gray: Yes, [what Siddharth Suri and I call ghost work] values having a supply of people available, literally on demand. Their contributions are collective contributions.

It’s not one person you’re hiring to take you to the airport every day, or to confirm the identity of the driver, or to clean that data set. Unless we’re valuing that availability of a person, to participate in the moment of need, it can quickly slip into ghost work conditions.



Tech leaders condemn tech’s role in elevating white supremacy – gpgmail


A group of tech leaders has banded together to speak out against white supremacy and rampant hate speech on tech platforms. The group, Build Tech We Trust, refers to itself as a collective of tech CEOs, activists, changemakers and workers who are committed to countering hate and terrorism.

In a public letter published today, Project Include CEO Ellen Pao, Code 2040 CEO Karla Monterroso, ReadySet CEO Y-Vonne Hutchinson, Project Include Founding Member Erica Baker, Block Party CEO Tracy Chou and others make a call to hold tech platforms accountable and “build tech we trust.”

Despite platitudes by tech CEOs that their respective platforms are designed to bring the world together and foster connection, these platforms too often cause harm and “are radicalizing and fragmenting communities by providing an unprecedented ability to coordinate attacks and amplify hate,” the letter states.

That’s not to say that tech companies have done nothing to try to combat hate speech and white supremacy, but what they’ve done just hasn’t been enough. In June, former ACLU Washington Director Laura Murphy said Facebook’s white supremacy policy, despite some changes, was still too narrow. Meanwhile, stories have recently emerged regarding how people become radicalized on YouTube.

The letter comes shortly after the mass shootings in El Paso, Texas, and Dayton, Ohio, where many of the victims were either Latinx or black. Tech leaders in the letter also note other shootings where people were targeted because of their race, sexuality and/or religion, like the Pulse nightclub shooting and the Charleston church massacre.

“White supremacist terrorism and violence, fueled by racism and misogyny, and empowered by technology, is on the rise,” they write. “They’ve moved beyond their white robes and hoods to social media and public rallies where they radicalize and fund their growing membership. Our government leaders at the highest levels encourage and spread it. Our industry leaders enable and profit from it. Four of the five worst gun massacres in modern history have taken place over the past two years. Evidence shows that many of these shooters are inspired by white supremacist ideology and targeting marginalized people.”

The aim of the letter is to serve as a call to action to encourage their fellow technologists to build ethical and responsible tech platforms.

“Whether it be a walkout, refusing to build or buy tech that accelerates hate, calling out unfair anti-abuse policies that silence marginalized voices, or continuing to demand answers from those in positions of power, the time to act is now,” the leaders write.

You can read the full letter over on Build Tech We Trust. I’ve reached out to Monterroso and will update this story when I hear back.



Why AI needs more social workers, with Columbia University’s Desmond Patton – gpgmail


Sometimes it does seem the entire tech industry could use someone to talk to, like a good therapist or social worker. That might sound like an insult, but I mean it mostly earnestly: I am a chaplain who has spent 15 years talking with students, faculty, and other leaders at Harvard (and more recently MIT as well), mostly nonreligious and skeptical people like me, about their struggles to figure out what it means to build a meaningful career and a satisfying life, in a world full of insecurity, instability, and divisiveness of every kind.

In related news, I recently took a year-long paid sabbatical from my work at Harvard and MIT, to spend 2019-20 investigating the ethics of technology and business (including by writing this column at gpgmail). I doubt it will shock you to hear I’ve encountered a lot of amoral behavior in tech, thus far.

A less expected and perhaps more profound finding, however, has been what the introspective founder Prayag Narula of LeadGenius tweeted at me recently: that behind the hubris and Machiavellianism one can find in tech companies is a constant struggle with anxiety and an abiding feeling of inadequacy among tech leaders.

In tech, just like at places like Harvard and MIT, people are stressed. They’re hurting, whether or not they even realize it.

So when Harvard’s Berkman Klein Center for Internet and Society recently posted an article whose headline began, “Why AI Needs Social Workers…”, it caught my eye.

The article, it turns out, was written by Columbia University Professor Desmond Patton. Patton is a Public Interest Technologist and a pioneer in the use of social media and artificial intelligence in the study of gun violence. He is the founding Director of Columbia’s SAFElab and an Associate Professor of Social Work, Sociology and Data Science at Columbia.


Desmond Patton. Image via Desmond Patton / Stern Strategy Group

A trained social worker and decorated social work scholar, Patton has also become a big name in AI circles in recent years. If Big Tech ever decided to hire a Chief Social Work Officer, he’d be a sought-after candidate.

It further turns out that Patton’s expertise — in online violence and its relationship to violent acts in the real world — has been all too “hot” a topic this past week, with mass murderers in both El Paso, Texas, and Dayton, Ohio, having been deeply immersed in online worlds of hatred that seemingly helped lead to their violent acts.

Fortunately, we have Patton to help us understand all of these issues. Here is my conversation with him: on violence and trauma in tech on and offline, and how social workers could help; on deadly hip-hop beefs and “Internet Banging” (a term Patton coined); hiring formerly gang-involved youth as “domain experts” to improve AI; how to think about the likely growing phenomenon of white supremacists live-streaming barbaric acts; and on the economics of inclusion across tech.

Greg Epstein: How did you end up working in both social work and tech?

Desmond Patton: At the heart of my work is an interest in root causes of community-based violence, so I’ve always identified as a social worker that does violence-based research. [At the University of Chicago] my dissertation focused on how young African American men navigated violence in their community on the west side of the city while remaining active in their school environment.

[From that work] I learned more about the role of social media in their lives. This was around 2011, 2012, and one of the things that kept coming through in interviews with these young men was how social media was an important tool for navigating both safe and unsafe locations, but also an environment that allowed them to project a multitude of selves. To be a school self, to be a community self, to be who they really wanted to be, to try out new identities.



Inside the history of Silicon Valley labor, with Louis Hyman – gpgmail


As I wrote for gpgmail recently, immigration is not an issue always associated with tech — not even when thinking about the ethics of technology, as I do here.

So when I was moved to tears a few weeks ago, watching footage of groups of 18 Jewish protestors linking arms to block the entrances to ICE detention facilities, bearing banners reading “Never Again” in reference to the Holocaust — these mostly young women risking their physical freedom and safety to try to help the children this country’s immigration service is placing in concentration camps today — one of my first thoughts was: I can’t cover that for my gpgmail column. It’s about ethics, of course, but not about tech.

It turns out that wasn’t correct. Immigration is a tech issue. In fact, companies such as Wayfair (furniture), Amazon (web services), and Palantir (the software used to track undocumented immigrants) have borne heavy criticism for their support of and partnership with ICE’s efforts under the current administration.

And as I discussed earlier this month with Jaclyn Friedman, a leading sex ethics expert and one of the ICE protestors arrested in a major demonstration in Boston, social media technology has been instrumental in building and amplifying those protests.

But there’s more. IBM, for example, has an unfortunate and dark history of support for Nazi extermination efforts, and many recent commentators have drawn parallels between what IBM did during the Holocaust and what companies like Palantir are beginning to do now.

Dozens of protestors huddle in the rain outside Palantir HQ.

I say “companies,” plural, with intention: immigrant advocacy organization Mijente recently released news that Anduril, the company founded by Palmer Luckey and composed of Palantir veterans, now has a $13.5 million contract with the Marine Corps for its autonomous surveillance “Lattice” towers at four different USMC bases, including one border base. Documents procured via the Freedom of Information Act show the Marines mention “the intrusion dilemma” in their justification for choosing Anduril.

So now it seems the kinds of surveillance tech we know are badly biased at best — facial recognition, Panopticon-style observation, algorithms of various other kinds — will be put to work by the most powerful fighting force ever designed, for expanded intervention into our immigration system.

Will the Silicon Valley elite say “no”? To what extent will new protests emerge, where the sorts of people likely to be reading this writing might draw a line and make work more difficult for their peers at places like Anduril?

Maybe the problem, however, is that most of us think of immigration ethics as an issue that might touch on a small handful of particularly libertarian-leaning tech companies, but surely it doesn’t go beyond that, right? Can’t the average techie in San Francisco or elsewhere safely and accurately say these problems don’t actually implicate them?

Turns out that’s not right either.

Which is why I had to speak this week with Cornell University historian Louis Hyman. Hyman is a Professor at Cornell’s School of Industrial and Labor Relations, and Director of the ILR’s Institute for Workplace Studies, in New York. In our conversation, Hyman and I dig into Silicon Valley’s history with labor rights, startup work structures and the role of immigration in the US tech ecosystem. Beyond that, I’ll let him introduce himself and his extraordinary work below.


Louis Hyman. (Image by Jesse Winter)

Greg Epstein: I discovered your work via a piece you wrote in the Washington Post, which drew from your 2018 book, Temp: How American Work, American Business, and the American Dream Became Temporary. In it, you wrote, “Undocumented workers have been foundational to the rise of our most vaunted hub of innovative capitalism: Silicon Valley.”

And in the book itself, you write at one point, “To understand the electronics industry is simple: every time someone says “robot,” simply picture a woman of color. Instead of self-aware robots, workers—all women, mostly immigrants, sometimes undocumented—hunched over tables with magnifying glasses assembling parts, sometimes on a factory line and sometimes on a kitchen table. Though it paid a lot of lip service to automation, Silicon Valley truly relied upon a transient workforce of workers outside of traditional labor relations.”

Can you just give us a brief introduction to the historical context behind these kinds of comments?

Louis Hyman: Sure. One of the key questions all of us ask is why there is only one Silicon Valley. There are different answers to that.





Ethics in the age of autonomous vehicles – gpgmail


Earlier this month, gpgmail held its inaugural Mobility Sessions event, where leading mobility-focused auto companies, startups, executives and thought leaders joined us to discuss all things autonomous vehicle technology, micromobility and electric vehicles.

Extra Crunch is offering members access to full transcripts of key panels and conversations from the event, such as Megan Rose Dickey’s chat with Voyage CEO and co-founder Oliver Cameron and Uber’s prediction team lead Clark Haynes on the ethical considerations for autonomous vehicles.

Megan, Oliver and Clark talk through how companies should be thinking about ethics when building out the self-driving ecosystem, while also diving into the technical aspects of actually building an ethical transportation product. The panelists also discuss how their respective organizations handle ethics, representation and access internally, and how their approaches have benefited their offerings.

Clark Haynes: So we as human drivers, we’re naturally what’s called foveate. Our eyes go forward and we have some mirrors that help us get some situational awareness. Self-driving cars don’t have that problem. Self-driving cars are designed with 360-degree sensors. They can see everything around them.

But the interesting problem is not everything around you is important. And so you need to be thinking through what are the things, the people, the actors in the world that you might be interacting with, and then really, really think through possible outcomes there.

I work on the prediction problem of what’s everyone doing? Certainly, you need to know that someone behind you is moving in a certain way in a certain direction. But maybe that thing that you’re not really certain what it is that’s up in front of you, that’s the thing where you need to be rolling out 10, 20 different scenarios of what might happen and make certain that you can kind of hedge your bets against all of those.
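As a rough illustration of the “roll out 10, 20 different scenarios and hedge your bets” idea Haynes describes (a toy sketch, not Uber’s actual prediction system), here is a minimal Python example that scores a few candidate actions against several predicted behaviors of an uncertain actor and picks the action with the lowest expected risk; the scenarios, probabilities, and risk scores are all invented.

```python
# Hypothetical multi-hypothesis prediction: for each candidate action, score it
# against several possible behaviors of an uncertain actor and pick the action
# with the lowest probability-weighted risk. All numbers are invented.

# Possible behaviors of the unidentified object ahead, with rough probabilities.
scenarios = [
    {"name": "stays put",        "prob": 0.70},
    {"name": "drifts into lane", "prob": 0.25},
    {"name": "darts across",     "prob": 0.05},
]

# Risk of each candidate action under each scenario (higher = worse outcome).
risk = {
    "maintain speed": {"stays put": 0.0, "drifts into lane": 6.0, "darts across": 10.0},
    "slow down":      {"stays put": 0.5, "drifts into lane": 1.0, "darts across": 3.0},
    "change lanes":   {"stays put": 1.0, "drifts into lane": 0.5, "darts across": 5.0},
}

def expected_risk(action):
    """Probability-weighted risk of an action across all predicted scenarios."""
    return sum(s["prob"] * risk[action][s["name"]] for s in scenarios)

best_action = min(risk, key=expected_risk)
for action in risk:
    print(f"{action:15s} expected risk = {expected_risk(action):.2f}")
print(f"Chosen action: {best_action}")
```

The hedge shows up in the result: “slow down” wins not because any single scenario demands it, but because it stays acceptable across all of them.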

For access to the full transcription below and for the opportunity to read through additional event transcripts and recaps, become a member of Extra Crunch. Learn more and try it for free. 

Megan Rose Dickey: Ready to talk some ethics?

Oliver Cameron: Born ready.

Clark Haynes: Absolutely.

Rose Dickey: I’m here with Oliver Cameron of Voyage, a self-driving car company that operates in communities, like retirement communities, for example. And with Clark Haynes of Uber, he’s on the prediction team for autonomous vehicles.

So some of you in the audience may remember, it was last October that MIT came out with something called the Moral Machine. And it essentially laid out 13 different scenarios involving self-driving cars where someone had to die. It was either the old person or the young person, the black person or the white person, three people versus one person. I’m sure you guys saw that, too.

So why is that not exactly the right way to be thinking about self-driving cars and ethics?

Haynes: This is the often-overused trolley problem of, “You can only do A or B; choose one.” The big thing there is that if you’re actually faced with that as the hardest problem you’re solving right now, you’ve already failed.

You should have been working harder to make certain you never ended up in a situation where you’re just choosing A or B. You should actually have been, a long time ago, looking at A, B, C, D, E, F, G, and like thinking through all possible outcomes as far as what your self-driving car could do, in low probability outcomes that might be happening.

Rose Dickey: Oliver, I remember actually, it was maybe a few months ago, you tweeted something about the trolley problem and how much you hate it.

Cameron: I think it’s one of those questions that doesn’t have an ideal answer today, because no one’s got self-driving cars deployed to tens of thousands of people experiencing these sorts of issues on the road. If we did an experiment, how many people here have ever faced that conundrum? Where they have to choose between a mother pushing a stroller with a child and a regular, normal person that’s just crossing the road?

Rose Dickey: We could have a quick show of hands. Has anyone been in that situation?



