Transcript
Miley: I'll give you just a little bit of my background, because it's non-traditional. I didn't start out thinking I would be in tech. I didn't go to a tech school. I didn't study at a prominent university. I grew up in a fairly lower middle-class neighborhood, what some people call working class. It was in Silicon Valley, so I really had an opportunity to watch tech grow during the '80s and '90s. It's just fascinating to see where we're at today, with things like ChatGPT, which is getting more mindshare than anyone ever thought it would. Generative AI is taking over the zeitgeist: people are generating images, they're having conversations with it, and they're calling it sentient. They're building relationships with AI online, which is just crazy.
There's something that really caused me to think about this talk, which is that there are two things that are related, even though they don't seem related: the bias in AI and the sustainability of what we're trying to do with AI. Why I wanted to do this particular talk is because I grew up in an area, and currently live in an area, where transformative technologies and transformative industries have an impact on the communities they exist in. Then they have an impact on communities around the world. Sometimes those impacts are good, and oftentimes they're not. I wanted to just set that stage. Once again, set the expectations: this is probably going to be rough, and I'm probably going to get a lot of things wrong. I'm not going to dive too deep into the technical details, because there are people who are better at that than I am. I do want to dive into some of the social and cultural details. Because that's where I think, aside from the technology, the biggest impact is going to be felt.
Background
Just a little bit about myself: Google, Apple, Twitter, Slack. The CTO for the Obama Foundation, which is one of the biggest honors of my life, to be able to work directly with the former president of the United States. I never thought someone who didn't even finish secondary school would actually be able to work with the president of the United States. That's where I was, and it was fun. It was really exciting. I had two stints at Google, which were really awesome. I'm currently an advisor in the office of the CTO at Microsoft. I've done testing, DevOps, product development, and now I'm a technical advisor.
The Dirty Secret of AI
AI has a dirty secret. It's dirty. Generative AI is amazingly energy intensive, even more so than normal cloud services. As you can tell, and we're hearing about it, it's booming, but so is its carbon footprint. Data centers are being built everywhere. The energy that they're taking is sometimes green, but most of the time it's not clean. Google and Meta and Microsoft are all doing their level best to buy green energy, to buy carbon credits. The fact of the matter is, there's not going to be enough. The fact of the matter is, as we continue to build out our data centers, as we continue to invest in this infrastructure, it's going to continue to emit CO2.
For those of you who have been watching the climate crisis unfold: I'm from California, I was born and raised in California, and I live in California. This has been the second rainiest year in recorded history in California. Hundreds of inches of snow, almost 700 inches of snow, at the higher elevations. Flooding in places that haven't flooded, ever. We're watching the impact of the climate crisis play out on our screens every day. Part of this is because we're a carbon-intensive economy, and generative AI is going to continue that, and it's only going to get bigger. One of the things I find fascinating is that human beings have this great ability to solve their problems with more complexity. As we come up with generative AI, as we push this out to people, we're like, we know it's going to eat up carbon, but we're going to make it more efficient. We know it's going to emit more carbon; we're going to make it more efficient. I think that's great, but we're creating the problem, then turning around and saying we're going to solve the problem. As we solve the problem, the problem hits people's communities. It floods the Central Valley, where a great deal of the fruits and vegetables in the United States are actually grown.
The people who pick those vegetables, who process that food, are migrants, are lower on the socio-economic ladder, and they are impacted by this. Their homes are flooded. Their income is gone, because the fields are flooded. We are just watching this. Part of this is because we create these systems. We create these technologies that push more CO2 into the atmosphere, and then we say we're going to fix it later. How do you fix someone who doesn't have a home anymore because of the climate crisis? How do you fix someone who doesn't have a job to go to anymore? These are part of the problems.
Generative AI Will Need Different Infrastructure
The problem with this is that generative AI is going to need a different type of infrastructure. The demands mean there are going to be different data center designs, which means digging up more ground, which means removing trees and the carbon they can sequester, which means disturbing and uncovering the soil, removing its ability to absorb more CO2. The more that we build these data centers, 100,000, 200,000, 300,000 square foot data centers, the more carbon we actually emit, and the less carbon the environment can sequester. This is a problem that's not going to go away. Maybe it goes away if you put your data center in a building in the middle of New York, but that just doesn't seem to be cost effective for a lot of different reasons. We also have a cooling system problem. I was shocked in the research I was doing, and I thought I knew this. The newer data centers, the hyperscale data centers that are coming up, take anywhere from 10 million to 19 million liters of water per day to cool them. That's per day. That's water that, in many cases, can't be returned to potable water. It's evaporated. It becomes wastewater and gets discharged. I think about this, and I think about communities that don't have access to water, or have limited access to water, or don't have clean water. We have data centers that are taking clean water and using tens of millions of liters of it on a daily basis. We're doing that so that you can generate pictures, so that you can have a conversation with an AI, so you can play around. It's shocking to me. The intense nature of this is going to lead to a huge, I call it another data center boom. Meta just recently stopped development on several data centers in Europe, on the continent, to do a redesign because of AI. They're probably going to build a bigger data center, a different type of data center, that's going to take more energy, that's going to take more land.
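To put those cooling numbers in rough perspective, here is a back-of-the-envelope sketch. The household figure is my own assumption (the EPA's commonly cited estimate of roughly 300 gallons, about 1,100 liters, per US household per day), not something from the talk.

```python
# Rough scale comparison for hyperscale data-center water use.
# Assumption (mine, not from the talk): an average US household uses
# about 1,100 liters of water per day (~300 gallons, a commonly cited
# EPA estimate).
LITERS_PER_HOUSEHOLD_PER_DAY = 1_100

def household_equivalent(dc_liters_per_day: float) -> float:
    """How many households' daily water use a data center consumes."""
    return dc_liters_per_day / LITERS_PER_HOUSEHOLD_PER_DAY

# The 10-19 million liters/day range mentioned in the talk:
for usage in (10_000_000, 19_000_000):
    print(f"{usage:,} L/day ≈ {household_equivalent(usage):,.0f} households")
```

Under that assumption, a single hyperscale facility's daily cooling draw lands in the range of roughly 9,000 to 17,000 households' worth of water, every day.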
It's crazy to me that this is happening.
HyperScale Data Centers to the Rescue?
Hyperscale data centers, maybe they're coming to the rescue. The data will move around faster, and they'll have their own energy sources. I've seen solar. I've seen wind. I've seen geothermal, which I think is great. They're highly automated, with skilled jobs, and they are eco-friendlier, whatever that means. I don't know how a data center could be eco-friendlier: you tear up the ground, you plow, you take out trees. I don't understand how we can call that eco-friendly.
The American Interstate Highway System
I want to take a detour, because I want to talk about infrastructure. We're building the infrastructure of the future. This is what we build to do what we do. We are all part of this. Every one of you is going to have some part of this new infrastructure. Whether it's the data center, whether it's the software, whether it's the telecommunications aspect of it, we're all a part of this. Building this infrastructure is no small task. I tried to think: what infrastructure in the last 100 years has been built that could be this transformative? I was like, skyscrapers, maybe not. Mega cities, maybe not. I did come up with one. The interstate highway system in the United States, it's actually pretty awesome. This came up about 75 years ago, under the Eisenhower administration in the 1950s, to connect all the major cities with roads, as part of the National Interstate and Defense Highways Act, so that troops and equipment could be moved, but also just to improve the overall infrastructure. It's really amazing when you look at it. It connected all the major cities. Hundreds of thousands of jobs were created. Forty-six thousand miles of road were built inside of 15 years, which, up until that point, no one had ever seen anything like. It transformed a continent. My father grew up during this time. I just remember him talking about this, and talking about just how fascinating it was to see these huge two-lane roads running for miles. He actually drove the interstate highway system from the East Coast of Florida to where I was born and raised in California. That's how he got to California, on this major system, which is one of the reasons I'm here today. I put this up there. I do think it's one of the greatest public works projects in history. It changed America and it influenced infrastructure all over the world.
Roads for the Win
This is not necessarily a bad thing. There were a lot of plaudits about it. I'm going to put these up here, and leave them up here for you to read. Then maybe talk a little bit about what I saw, and what this really meant for a generation of people. It meant that people could leave their areas and move elsewhere very easily. Before, it was a lot harder. There's something in the United States called The Great Migration, the largest single migration of people in the history of the world up to that point in time. The Great Migration was when millions of African Americans left the South, the home of slavery, and spread out throughout the country: New York, Chicago, Detroit, St. Louis, Oakland, California. Just millions of people packing up and leaving. This migration went from the early 1900s into the 1970s, a 60-year migration of people. It accelerated because of the interstate highway system. It accelerated because people didn't have to take trains, which was difficult. Because if you were living in Mississippi, and you were black, and you wanted to leave this area that had this history of oppression, people would actually stop you from getting on a train, but they couldn't stop you from getting in a car. They could stop you from getting on a bus, but they couldn't stop you from getting in a car. You just get in your car and drive. This was huge for my culture in the United States. It was huge for the United States, because it helped all of us move to different places to try to have a better life.
I spent a little bit of time on that because you see that someone at some point said this would happen, and they didn't think it was going to happen just for a race or a group of people. They thought it was going to be good for the country as a whole. It ended up being that way. There's one thing that's interesting on here, and I want to come back to it: the interstate highway system has been one of the most important economic development strategies of the federal government, along with the GI Bill. The GI Bill was designed after World War II to give returning service members loans to get homes, loans to go to school, loans to essentially improve their lives.
Roads with Bad Intentions
These are roads with bad intentions. We ask, how can highways be racist? Highways can be racist because they cut through neighborhoods. This is where I had this epiphany that what we're doing with AI, what we're doing with the infrastructure, is much like what was done with the infrastructure for the federal highway system in the United States. We're building roads for new industry, for new commerce, for new ways to communicate, for new ways to work. When you do that, you have to build it with intentions. Hopefully, good intentions, not bad intentions. Because the federal highway system in the United States, unfortunately, was built with racist intentions. It was built through neighborhoods that were predominantly black or low income. It cut neighborhoods in half. Eminent domain was used to take people's property at below market value or no value; you just had to move. Infrastructure was built that would deny people access to certain areas. This happened in New York. I didn't know about this until about a year ago. It was astounding. New York, with the money they got from the federal government, built bridges too low for the buses that would take black and Puerto Rican residents to the beach, so they couldn't go. I thought about that, and I was like, that's crazy. How do you make something like that racist? This is infrastructure. Then I was like, no, infrastructure can be racist. Infrastructure can be discriminatory. This is the problem that I see with AI and sustainability.
Data centers suck up huge amounts of water. They take that water from communities, and they don't give anything back for that. Data centers create heat islands that change the local environment. No one thinks about that until after the fact. Data centers create noise. I didn't know this until a friend of mine in Chandler, Arizona was like, "I keep hearing this buzzing and it took us months to figure out it was the data center." Low frequency sound of all these servers, all these chillers, all these coolers, these are the impact this infrastructure is having on people. All of us are using this and not even thinking about it. Just like we drive on the highway system in the United States and don't think about the impact it had on the communities and still have on the communities.
I'll give you an example of that, where I live in San Francisco. I'm in San Francisco in a neighborhood called Bayview. On one side are two freeways, the 280 and the 101, that were built with the funds from the federal interstate highway system: pollution, particles, noise, everything. Asthma rates are two-and-a-half times higher in my neighborhood. There's low birth weight. Kids are not developing intellectually, because of the amount of pollution. On the other side is a shipyard that was used to decontaminate the ships that were used in the nuclear testing in the South Pacific. In my neighborhood, I'm between a freeway throwing off dust and pollution, nitrogen oxides, and everything else, and a bunch of nuclear waste that is buried and/or lost somewhere. We don't think about this infrastructure as we drive through it. We don't think about the impact it has on people. I think that's part of the problem. When you look at some of this data that I've put up here, these are all papers and/or research that's been done on how they've impacted minority communities. This is not just in the United States; this happens everywhere.
We were talking earlier, and I just had this epiphany: capitalism is consistent. It consistently extracts as much as it can from a system, with no regard for the health of that system. Then it moves on. The federal highway system built all these roads, put them in, and then it moved on. Neighborhoods were destroyed. Communities were upset. Social order was changed. We all drive on these roads, and we don't care. I spend so much time on this, because we're doing that today with AI. We know that AI is biased. We know the data we train it with is biased. What do we say? We'll fix it later. When is later, when you've made your money? When is later, when you've had your exit? Is that later? The time is now. The time is now to look at the data you're training with. The time is now to look at where you're running your models. What data center it's running in. What region it's running in. Where it's getting its energy. And to make a choice, not just because it's expedient, but because it's the right thing to do.
I grew up in Silicon Valley. My nightly news, my 6:00 news, my newspapers were Steve Jobs and Gordon Moore. I watched them build the future. There's an impact to that. The chips that run these data centers, the company that creates the GPUs that are so in demand, NVIDIA, is in Santa Clara, in Sunnyvale. Along Central Expressway in Silicon Valley is the highest concentration of polluted Superfund sites in the United States. You can't drink the water from this area. You can't grow anything in the ground, but you can run your GPUs and you can train your models. I keep harping on this, because we are at this moment in time where we all get to make these choices. If we make the same choices, if we do things the way we've always done them, if we let capitalism continue to be consistent, this is what will happen. We will degrade environments. We will destroy neighborhoods. We will destroy communities. We will impact the development of children, particularly children who are already disenfranchised, who already have a terrible time of it, and we'll try to fix it later. I don't do tech because it's cool or because of the money; I do it because I thought it would help people. I really did want to be in tech, not just to change the world, but to change the world for the better. This is the choice that all of us get to make every day. As we learn more about how to deploy these models, and train them, and use training data, we have a responsibility to not just do it, but to do it intentionally, and do it with good intentions, not bad intentions.
Infrastructure Impact on Communities
This is something that I was talking about. This is the infrastructure impact on communities. This is all fairly recent. The one that got me is: Biden moves forward with a mining project that will obliterate a sacred Apache religious site. The Native Americans, the indigenous Americans in North America, haven't had it bad enough, and now we're doing this. They're mining for copper: copper for your electric cars, copper that is going to be in the chips that you need to run your models. Chandler, Arizona is actually putting in an ordinance for noise. This one was really difficult for me: a historic black cemetery was moved for a Microsoft data center in Virginia. Are we still doing this? It's 2023. Are we still letting capitalism destroy communities, destroy history? Yes, we are. I really hope that all of you look at this and decide to do a little more investigation. As you're learning more, and as you are starting to deploy, as you grow companies, grow the next Meta, grow the next Google, instead of fixing it on the back end, you try to incorporate it on the front end.
What About Bias in Generative AI?
I had a lot of slides and a lot of data, because I wanted to talk about bias in generative AI, then this popped up. You look at this, and you're like, what? You're not even trying at this point. You're just saying the quiet thing very out loud, which is, we really don't care about diversity; what we really care about is the appearance of it. We're going to use the latest models. We're going to use generative AI to solve this problem for us. I didn't even know what to say to that. I threw all the rest of my slides out, because I just wanted to have a conversation at this point. People threw some tweets out there that I thought were hilarious, and also sad. This is the crux of the problem. If a company as big as Levi's is going to use the technology that we're creating to essentially disenfranchise groups of people, to such an extent that all they have to do is type in, show me a black model, or show me a Latinx model, or show me someone from Sudan or Ethiopia, what else are they going to do?
One of the examples that I was coming up with even before I saw this was, Hollywood loves diversity these days. I think they love it for diversity's sake. Someone's going to do something like this. Sooner or later, we're going to see generative AI actors. Some company is going to generate actors with AI. I call it cultural appropriation via prompt engineering. Because people will be able to type in, it's like, I want a black actor, or I want a Latinx actor woman for this role that speaks like Viola Davis, but make her more black. Make her really intelligent or make her really dumb. Make them really promiscuous, or make them really pious. This technology has the potential to not just incorporate our biases, but to amplify them, and amplify them on a stage we've never seen before. People are talking about deepfakes of politicians. People are talking about generative voice AI that can now copy your voice completely, like we haven't had people doing that already.
When I saw this, I was like, this is already starting. It is already starting to look like companies are just going to continue to be consistent with capitalism, because now they don't have to pay somebody. They don't have to do the work. They don't have to sit in front of me, or someone who looks like me or somebody who looks like her, they can now just get what they want. They don't have to learn. They can continue to amplify their biases. They don't have to pay them, taking away money from artists, taking away money out of communities, continuing to amplify the stereotypes that currently exist. We will just drive through this infrastructure on our way to wherever we're going. That's something that makes me want to quit tech and just go live on a farm. My mom said this to me, and I just felt really bad. She's like, haven't you all screwed up enough? She's like, look at what Meta has done. Look at Twitter. She just goes on. My mom also was in tech for a while. She says these things to me, and I have to agree with her. She's like, when are you going to get better? Aren't you doing that at Microsoft? You're in the office of the CTO, you're working on AI, you're working on sustainability, aren't you doing that? I'm like, I'm trying mom, but capitalism is consistent. It's hard to stop this train, because everyone wants to get paid. Everyone wants their little piece.
I want to leave this up there, because if you think that it's not a problem, this will tell you it's a problem. If you think it's not a problem, look at how much water is being used. Look at how much energy is being used. Look at crypto. Crypto just took off. People threw money at it. All of a sudden, you had coal plants coming online that hadn't been online for years. You had companies rushing to take advantage of this, and billions of dollars thrown into it, most of which was fraud. CO2 emissions shot through the roof, because of all the mining being done. I try to keep things non-political when I'm up here, for the most part. China, I think, was smart and right to ban mining. There's no benefit. There's no societal benefit. There's no social benefit. There's no cultural benefit to doing that. I start to think, is there a cultural benefit to this? Is there a societal benefit to generative AI? Is there a societal benefit to these models? I don't know. I say that from inside the house, because I did a lot of this research using ChatGPT. I was like, "ChatGPT, write my talk," while I'm on this plane. The irony is that I'm on a plane spilling CO2 out the back, using ChatGPT, probably emitting even more CO2. I'm like, this is what we do.
If you don't think this is a problem, if you don't think that it's something that you can do something about, I think you need to maybe not do this as a living. I say that because it's not going to get better unless we make it better. Unless we make these choices. If we are living with our heads in the sand, thinking that we're not going to build the infrastructure, like the interstate highway system that is damaging the most vulnerable communities, then we're wrong. Because that's what we always do. You cannot name a technology that has been transformative that has not damaged the most vulnerable parts of society. That have not damaged cultures. That have not destroyed communities. All of us here for the first time, actually can say, we can do something about it before it gets out of hand. Before there's 46,000 miles of road built. Before there are another 5000 data centers built. We can vote with our models. We can vote with our jobs. We can vote where we decide to run these. We can vote how we run them. I implore each of you to do this. Because if you don't, these conferences are going to start looking really strange. Because no one will show up, everybody will just have their avatar online that they designed with ChatGPT, or Midjourney, or DALL·E 2, and go from there.
Mitigation Strategies
Everybody says I should have a slide of mitigation strategies. I have a slide of mitigation strategies. What can you do? Smaller models are often better; bigger models are not necessarily better. Think about that. Think about how you can break your models up. When you're doing training, there's something called societal context. There's a group at Google, I think it's called SCOUTS, that is actually adding societal context, so you can train your model with societal context. What does that mean? You can actually train a model that knows about sexism, that knows about anti-trans viewpoints, that knows about our own biases, and can help you mitigate them in your models. When it comes to CO2 emissions, you can't manage what you don't measure. I just threw some of the big ones up here: the Emissions Impact Dashboard for Azure, Carbon Footprint for GCP, and the AWS Customer Carbon Footprint Tool. There's something in the financial world called KYC, Know Your Customer. I came up with something a little different: KYD, Know Your Data, and KYDC, Know Your Data Center. Know where your jobs are running. Know where that energy is coming from. Understand the companies that built them, and how they're deploying them around the world, and whether or not they are doing it in a socially and culturally responsible way. We talk about the environment, yes, but it's about being socially and culturally responsible as well.
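As a minimal sketch of the KYDC idea, the arithmetic is simple: a job's emissions are roughly its energy use times the carbon intensity of the grid feeding the data center. The region names and gCO2e/kWh figures below are hypothetical placeholders, not real provider data; in practice you would pull the numbers from the provider tools named above.

```python
# Illustrative only: estimate job emissions as energy x grid carbon
# intensity. Region names and gCO2e/kWh values are placeholder
# assumptions, not real provider data -- use the cloud providers'
# carbon footprint tools for actual numbers.
GRID_INTENSITY_G_PER_KWH = {
    "region-hydro": 30,   # hypothetical low-carbon grid
    "region-mixed": 400,  # hypothetical average grid
    "region-coal": 800,   # hypothetical coal-heavy grid
}

def job_emissions_kg(energy_kwh: float, region: str) -> float:
    """kg CO2e for a job drawing `energy_kwh` in a given region."""
    return energy_kwh * GRID_INTENSITY_G_PER_KWH[region] / 1_000

# The same hypothetical 10,000 kWh training run, placed in each region:
for region in GRID_INTENSITY_G_PER_KWH:
    print(f"{region}: {job_emissions_kg(10_000, region):,.0f} kg CO2e")
```

Under these made-up intensities, the identical job's footprint varies by more than an order of magnitude purely by where it runs, which is exactly why knowing your data center matters.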
Resources
To learn more about bias in AI, these are the two people you should read: Dr. Timnit Gebru, who is doing amazing work at the DAIR Institute, and Joy Buolamwini at the Algorithmic Justice League, who is doing really good work in this. I can't speak to their work, because they know more about it than I will ever know. Definitely do your research there before you do any major work.
Conclusion
I think we're at a seminal moment in technology again, and I try to think about other moments in technology that are like this. Maybe it was the advent of the World Wide Web. I remember driving somewhere in Silicon Valley, probably the mid-90s, and seeing on a billboard a movie poster that had www. whatever the movie name was .com. It was the first time I saw an advertisement that had a website. This is probably '96, maybe '95. It's very early. At that moment, I knew something was different. I knew something was different when I first saw ChatGPT. I was like, this is going to be crazy big, because for the first time, people can actually interact with technology, and the technology is meeting people where they're at. I really do think that this technology is meeting us where we're at, and we have an obligation to make sure that we are meeting it with compassion. That we are meeting it with humility. That we are trying to understand the social and the cultural impacts before we do it. Because we're only going to get one chance to do this. If we do it wrong, and we build this out, and we continue to degrade the environment, I don't know what the world's going to look like. Because right now it's starting to not look like what I expected it to, what I grew up expecting it to. This is our way to stop it.