I suspect a lot of people do. Probably a massive number. They just silently derive utility out of it. Those who hate it are probably more vocal.
Not sure about love, but I like it at least; it's useful to me. But it's like a frozen TV dinner: not something worth bringing up.
This.
Not everyone has the same wants and needs. For everything that some people love, there are others who don't.
I recently saw someone making the following comparison:
- The transition from paper maps to Waze.
- The transition from unassisted to AI-assisted.
At first, this comparison made me incredibly uncomfortable, but the more I think about it, the more I believe it to be accurate.
I mean, whoever made the comparison had no idea. He was comparing devs to obsolete cab drivers.
But the analogy holds. And replacing "AI" with "Waze" seems to be a good litmus test.
- "Waze Specialist"
- "My company helps people use Waze to increase productivity"
- "I was a pioneer Waze user"
- "I've spent most of my life dreaming of something like Waze"
See how it sounds? It's perfect. We expect this technology to be as ubiquitous as GPS-assisted navigation. Maybe we should start treating it as such: a commodity.
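Mechanically, the litmus check above is just a word substitution. A toy sketch (the example phrases are my own, not from any survey):

```python
import re

def litmus(claim: str) -> str:
    """Swap 'AI' (as a whole word) for 'Waze' to see if the claim still sounds impressive."""
    return re.sub(r"\bAI\b", "Waze", claim)

# Breathless claims about AI, run through the reality check:
for claim in [
    "AI Specialist",
    "My company helps people use AI to increase productivity",
    "I was a pioneer AI user",
]:
    print(litmus(claim))
```

The word boundary (`\b`) keeps the substitution from mangling words that merely contain "AI".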
You have to separate the technology from who is controlling it, what resources it is using up, and how companies envision utilizing it. People aren't afraid of the technology; they're afraid of Silicon Valley and corporate America.
I can get on board with most of that, but part of the issue here is that the battle over "what resources it is using up" focuses on the wrong resource.
Assume unabated growth of AI data centers in the USA: in general, you'll stop being able to afford the electricity to run your AC and your freezer before feeling pain from the extra stress on water supplies.
The water issues in the USA have many causes and do exist, but blaming data centers for them, or expecting their removal to fix anything, is like blaming climate change on residential driveways and trying to reverse it by planting a few lawns in their place.
Credit. Right now it's using up credit and starving parts of the economy of capital. Credit is a resource in the sense that it's a finite pool being tapped.
I don't have any intuition for how credit works at the scale of an entire economy; I'm willing to just take what you say on faith… though really, I have no choice but to.
However, from watching public discussions about money, I think I'm a lot more clued in to how it works than most people. I get the impression most people couldn't even take what you just wrote on faith: the idea that the total supply is limited, rather than that what any given person can get is limited only by their own creditworthiness. Which means I think they'd find it really hard to care about credit running out.
> "I don't have any intuition for how credit works at the scale of an entire economy"
> "I think I'm a lot more clued in to how it works than most people."
Which one is it? Can you be clued in enough if you don't get the macro role of credit?
> I think they'd find it really hard to care about credit running out.
True, but nobody said that's what you should try to get people to care about; it's a separate issue.
There's no contradiction there; it's both.
"Better than most people" is a very low bar.
(Also, it's not the macro role of credit; it's the fact that it can run out even when everyone is individually creditworthy.)
This is the root of it: the control, paired with the current state of society.
Technological advancements in modern times appear to benefit only a very small number of individuals. The excitement of the 1960s came from imagining that everyone would benefit--not necessarily equally, but at least with life not getting worse! Now we just see ourselves being replaced and struggling more, with the "lucky" ones in precarious employment.
Whilst wealth inequality is already creeping towards record levels in the West, with potential wars on the horizon, civil war being encouraged, and tech stocks appearing to be in a bubble--with P/E ratios at the kinds of highs we saw before other financial disasters--it's hard for anyone to be excited about the near future.
A lot of our best science and technology emerged from bad intentions--war, accidents, control, private power struggles. A lot of our worst science and technology emerged from good intentions--charity, humanity, heroism, public works. It's a strange world. We have to contend with edge cases, the butterfly effect, chaotic dynamic systems, etc. These are some of the reasons the "road to hell" ends up paved with good intentions.
The state of AI is unfortunately tragic to me. I've always loved, learned, used, designed, and then built technology. My life has largely revolved around it, and I'm witnessing another technological revolution. Yet the more I've learned about how humans use and control technology, the more I've realized the truth:
Technology =/= progress.
We should all think extremely carefully about what we're contributing to. Progress is inevitable, but as designers, builders, and preachers of the latest technologies, we have a small amount of power in society to help ensure resources cannot be secretly utilized against us. That small power lies in our choice of whether to use current offerings or encourage others to, and whether we work for those who wish to control us.
An excellent comment, much like everything zerosizedweasle wrote here.
However, I can't believe you wrote these:
> A lot of our best science and technology emerged from bad intentions--war, accidents, control, private power struggles.
> A lot of our worst science and technology emerged from good intentions--charity, humanity, heroism, public works.
And then concluded:
> It's a strange world.
I'd suggest drawing a conclusion from a deeper analysis, starting with the question "Why?"... There's nothing strange about it. For starters, aren't wars always sold with good intentions? As AI is now.
> We should all think extremely carefully about what we're contributing to.
What about whom you're contributing to?
> Progress is inevitable, but as designers, builders and preachers of the latest technologies, we have a small amount of power.
How true; it's small indeed. For the sake of clarity, I'll put it differently:
Don't bring design skills to a political fight - that's definitely not the way to success, big or small.
I don't know if you've seen the movie Elysium, but the premise is that most of humanity lives an impoverished, filthy existence on Earth, while a tiny elite class lives on a space station in orbit. This is what people in the United States fear about AI. It's why the United States has the highest rate of AI loathing of any country in the world. They're afraid of that inequality.
People fear that which they don’t understand.
In AI's case, the other people love what they don't understand.
It isn't about the tools or using them; it's about the scale. The scale of impact is immense, and we're not ready to handle it in a multitude of areas, because of all the areas technology touches. Millions of jobs erased with no clear replacement? The value of creative work diminished, leading to more opportunities erased? The scale of 'bad' actors abusing the tools, impacting a whole range of spheres, from information dispersal to creative industries? That's not even getting into the environmental and land-use impacts of data centers on towns and open spaces (again, it's the scale that gets ya). And for what? Removing a huge chunk of human activity and expression, for what?