29 Comments
skybrian's avatar

Aren't there better examples? Elon Musk is in a class by himself, but might be better described as a tech industry executive with some technical expertise. (How much is debated.) He isn't a software engineer and didn't gain power by writing code. He has minions who are software engineers.

To drill down a bit, I think one of the keys to SpaceX's success was the ability to attract rocket engineers to work long hours on something new and exciting that made sense to them from a technical perspective. They were attracted by the promise of being able to execute faster than was typical at NASA contractors. Funding of course matters too.

Google had this attraction at the beginning, too, for software engineers. And there was a time when Tesla was pretty exciting for car guys.

The ability to attract *many* talented nerds, along with funding, to work together on a common cause, is more about technical leadership than about writing code. Lone coders don't accomplish all that much by themselves, and many companies have management that doesn't listen to non-managers very much.

It's also possible to attract unpaid nerds to work on inspiring projects, or to pay people to work on uninspiring projects, but it's harder to get good people without having both an exciting project and funding to pay them. (A strong possibility of getting rich certainly helps too.)

Ke Zhang's avatar

I think the tech elite has so far failed to provide a positive future for society, because recent technological progress has had both potent positive and negative effects on our lives. The effort to use technology to improve the physical world has largely been a positive force. I count Amazon, Tesla, and SpaceX as net positive, and Uber, Airbnb, and Doordash as mixed but still valuable. The effort on the virtual world has had such strong positive and negative effects, stretching our minds so much, that it has probably left us worse off. The promise that we would be better off if we were all connected / if information were free has not turned out to be true. We are overwhelmed by too many connections and have turned passive. We are lost with too much information swirling around. I think AI has the potential to make things much worse.

It is commonly lamented that most modern science fiction is dystopian. Tech people commonly accuse the public of being insufficiently optimistic about technology. But I think the dystopian vision is just a natural extrapolation of current trends -- the forces that are tearing us apart are stronger than the forces that are making the world better. I think the tech leaders -- Andreessen in particular -- have failed to even acknowledge the problem, and are therefore definitely not noble.

Mike Taylor's avatar

Slightly snarky counter-theory: software engineers are selling a highly-addictive product that their customers' brains haven't evolved to resist well. They've successfully eaten the world for the same reason 19th century purveyors of opium successfully ate China. But at least China's government tried to protect its citizenry.

Xpym's avatar

So, this "wisdom" thing seems pretty crucial to the whole enterprise, do you intend to say more about it? You say that these days nobody is trained in wisdom, but could such training be made available, in principle? Are there untapped "wisdom experts"? Would they teach largely the same things that were taught a century ago, or does the curriculum need substantial updates?

David Chapman's avatar

These are good and difficult questions!

I think the answer is "yes, this is possible, in practice as well as in principle." How, concretely, to do it is not clear. It's not clear what the curriculum would be, because yes it does need substantial updates.

It's also not clear what should be the institutional structure for it. Universities are a non-option, probably. And this is dense and difficult enough material that lightweight alternatives such as a self-paced online video course wouldn't be sufficient. (Although, maybe better than nothing?)

I sketched a possible curriculum and institutional structure at the end of "Ofermōd," intermediate in "weight" between the two. (Did you read that?)

Doug Bates's avatar

What do you think about the Stoicism revival movement as a vehicle for teaching wisdom?

David Chapman's avatar

I don't know enough to have a meaningful opinion. I guess, when I've looked casually at Stoicism, I've thought "this doesn't seem promising," but I'm mainly ignorant of it.

Doug Bates's avatar

I suppose you already ruled out Buddhism as a vehicle for teaching wisdom. An article on that subject would be interesting.

David Chapman's avatar

Thanks! In *some* sense everything I write is that. Or tries to be! But I'm working on doing it more explicitly now. Upcoming, maybe!

Doug Bates's avatar

I suspect one issue is that our cultural idea of wisdom, particularly the kind needed for nobility, is different from prajna. One reason I brought up the Stoicism revival movement is that Marcus Aurelius likely exhibited the nobility you have in mind.

Quite a few notable Western Buddhist teachers have acted in ways that are widely considered to have been unwise. Some of them have shown themselves to have so little nobility as to be unable to run a retreat center well.

Xpym's avatar

No, sorry, that wasn't in the free portion.

Frank Miroslav's avatar

I think one thing you missed is that the activists and moralists managed to convince (some of) the software engineers to embrace their ideals, and that this caused a reaction from tech elites.

See Charity writing on corporate DEI as partially driven by bottom-up persuasion and awareness campaigns:

> I think social media explains a lot about why awareness suddenly exploded in the 2010s. People who might never have intentionally clicked a link about racism or sexism were nevertheless exposed to a lot of compelling stories and arguments, via retweets and stuff ending up in their feed. I know this, because I was one of them.

>

> The 2010s were a ferment of commentary and consciousness-raising in tech. A lot of brave people started speaking up and sharing their experiences with harassment, abuse, employer retaliation, unfair wage practices, blatant discrimination, racism, predators.. you name it. People were comparing notes with each other and realizing how common some of these experiences were, and developing new vocabulary to identify them — “missing stair”, “sandpaper feminism”, etc.

https://charity.wtf/2025/02/10/corporate-dei-is-an-imperfect-vehicle-for-deeply-meaningful-ideals/

And see Andreessen being interviewed by the NYT:

> The unifying thread here is, I believe it’s the children of the elites. The most privileged people in society, the most successful, send their kids to the most politically radical institutions, which teach them how to be America-hating communists.

>

> They fan out into the professions, and our companies hire a lot of kids out of the top universities, of course. And then, by the way, a lot of them go into government, and so we’re not only talking about a wave of new arrivals into the tech companies. …

>

> By 2013, the median newly arrived Harvard kid was like: “[expletive] it. We’re burning the system down. You are all evil. White people are evil. All men are evil. Capitalism is evil. Tech is evil.”

https://www.nytimes.com/2025/01/17/opinion/marc-andreessen-trump-silicon-valley.html

I think there's a degree of straightforward class conflict going on that gets missed because it doesn't map neatly onto traditional leftist models of what class conflict is supposed to look like.

David Chapman's avatar

Thank you very much for this!

The Andreessen/Douthat interview was fascinating. I knew most of this, but it was great getting the details from Andreessen directly, so to speak.

I really like both those guys, although we differ sharply on a lot of object-level issues.

Back in November, I posted the same skepticism Douthat expressed in January:

> “An honest politician is one who stays bought.”

>

> I suspect some people who believe they bought a politician will be disappointed.

https://x.com/Meaningness/status/1862868244295508433

It makes me feel good about my judgement reading Douthat saying in January "Elon is wildly overconfident, he doesn't know how any of this works, he's going to fail, and you (Marc) are overestimating how aligned Trump is with what you want."

I guess we can't be sure yet, but that seems to be what has happened. Ofermōd...

Max Clark's avatar

it's hard to believe we've ever really lived in something like a society where people had representation and a voice in their own governance. essentially, the presumption that we live in a democratic society does not seem true.

the complaint that we didn't elect elon or the other billionaires doesn't bother me much, because i don't feel like i elected trump or even my city's mayor, to be honest.

elections themselves weaponize our own rationality against us: who's going to spend time running for office if they're not rich or "well-connected"? who's going to spend hundreds of hours researching every ballot initiative, if their vote is 1 out of millions? who's going to pick a noble outsider, when one of the two main candidates is a "lesser evil"?

in all these cases, elections fail us. we need sortition (would love a post breaking down the theory of sortition!). only with sortition could i describe a society as being democratic. it may end up being ruled poorly, in which case, rule by elite might be our best bet, after all.

mtraven's avatar

This series on nobility is generating many thoughts; I'm assembling them into a page of my own (which may or may not end up published) so as not to overburden your comments. But the more you touch on actual politics, the more angry and upset I get. I'm trying to deal with that; I am not at all sure how to express it or whether I should express it. I have no standing to lecture you on matters of morality or politics, but feel compelled to nonetheless.

You are, I think, trying to avoid base-level partisan politics, for good reasons (it is often dull, emotional, and ignoble). Yet this is what is happening in the world, and you can't avoid invoking it when you mention figures like Marc Andreessen, a very visible and vocal Trump supporter.

In this base world, we are in the midst of a fascist takeover which threatens the fundamental structures of a free and open society -- the rational systems by which we operate. The defunding of Harvard may be the most salient recent example but there are many others. This is real! This is what power in the real world is doing right now! And it will be stopped only by an opposing power. This is the real political situation and if nobility is the wise use of power then it has to figure out what side of this battle it is on.

I may have a different interpretation of nobility than you, but in my mind it is incompatible with fascism. Fascism is definitionally not for the benefit of all; it is for the benefit of a particular group at the expense of outsiders. The Trump regime is about hurting people – trans people, immigrants, foreigners, government workers, scientists. It likes to pretend to nobility, though. Donald Trump in particular is, in his person, an avatar of ignobility, lacking all aspects of the requisite wisdom and character. But he likes gold leaf and parades.

In my mind, a minimum qualification for nobility in our current real-world situation is being opposed to this authoritarian takeover. Andreessen, whatever his merits as a software engineer, investor, or writer of manifestos, vocally supports Trump and so is off the nobility list.

These considerations block (for me) consideration of the more interesting point of this post, about the possible role of software engineers in exercising wise political power. Andreessen is a singularly bad example. Aside from his support of Trump, he and his firm are heavy backers of cryptocurrency, which I'm guessing you would agree is not a particularly beneficial use of software.

Sorry to be so negative, but you can't talk about politics without dealing with this kind of stuff. I like that you mention Stewart Brand, whom (while not a software person) I think of as an avatar of nobility in the sense of hacking systems for the greater good. Who are some software engineers who exhibit nobility? Terry Winograd comes to mind. A weirder example: Richard Stallman. Not a particularly pleasant or attractive person, but he actually did lead a major software policy hack with great positive (IMO) social implications.

David Chapman's avatar

Hmm, I'm confused by this. I think I've made it clear in this series that I strongly disapprove of the Trump administration's actions, and of Musk's DOGE actions. And that while Andreessen's manifestos contain aspects I like, they're missing a key chunk of understanding what the right use of power is. I'm writing the series now as a sort of emergency measure. Maybe this was somehow not explicit enough for you to pick up on it?

I don't think there's any point in my writing generic "Trump is bad" takes, or diving into the details of "this thing he did yesterday is especially bad." Literally millions of other people are doing that, and I have nothing distinctive to add.

I think that the tech right made a huge strategic error in backing Trump. Their reasoning was "he can't possibly be worse for us than the Biden administration has been; and if we back him, we can influence him." I thought, before the election, that they were wrong on both counts; and tweeted that just after the election; and it's looking like I was right.

I wrote this comment reply about that:

https://open.substack.com/pub/meaningness/p/software-engineers-are-eating-the?r=1cnfx&utm_campaign=comment-list-share-cta&utm_medium=web&comments=true&commentId=120463347

mtraven's avatar

Well, apologies if I've misinterpreted or gone off inappropriately (and certainly not asking you to write generic "Trump is bad" posts, god forbid). I'm glad you are basically on the side of sanity. But I still need to argue (sorry). You disapprove of Trump and Musk, but you also say:

> I think that the tech right made a huge strategic error in backing Trump

See I have a different interpretation than you. I don't think the "tech right" made an error, they were quite clear-minded about choosing to align themselves with Trump out of their perceived self-interest, because that is who they are. Thiel, the central figure of the tech right, has been a major Trump backer for a decade or more. This is what they want, this is who they are, there's no error here. (And if they regret their support, I haven't heard much about that). If they want to be leaders they have to own their decisions, their alliances and the consequences thereof.

If it was in fact an error, then it illustrates that they have terrible judgement and should never be in charge of anything. It should not have taken much wisdom to know that aligning with Trump was a bad idea. A very tiny amount of nobility would have sufficed. How much metarationality does it take to not entrust the country to a convicted conman? One thing you can say about Trump, he is pretty much exactly as advertised, so the people who support him are not doing it out of mistaken belief, they know what they are getting.

At any rate, we agree at least that these powerful people made some very bad decisions, due to a lack of wisdom perhaps, although we might disagree about the particular nature of their deficiencies. Whatever. They went wrong somewhere along the line.

You, I think, propose to address the problem by teaching them (and software engineers in general) how to deploy power more wisely -- to help correct their mistakes. An admirable goal, but my own reaction is different – they do not need to be taught to wield their power more wisely; they instead need to be opposed, defeated, their power checked by countervailing power.

I will shut up now and let you get on with your program. I apologize for going off on my own tangent; this stuff is intensely interesting and personal and urgent and confusing and resonant for me.

David Chapman's avatar

Thanks! Some clarifications fwiw:

> I don't think the "tech right" made an error, they were quite clear-minded about choosing to align themselves with Trump out of their perceived self-interest

I do recommend the Andreessen/Douthat interview: https://www.nytimes.com/2025/01/17/opinion/marc-andreessen-trump-silicon-valley.html

In it, Andreessen says they did as you suggest: aligned with Trump for business reasons. (And, apparently, contra their personal social values in his case at least.)

But, as Douthat suggested, and as I suggested back in November, it seems that they were naive in trusting that Trump would return the favor.

> “An honest politician is one who stays bought.”

>

> I suspect some people who believe they bought a politician will be disappointed.

https://x.com/Meaningness/status/1862868244295508433

> entrust the country to a convicted conman?

As far as I can tell, they figured he would be mainly ineffective, repeating the pattern of his first term. Then the Republican Congress would deliver business-friendly but within-bounds-of-sanity policy while Trump was ranting impotently on TV. If so, that was a mistake.

> propose to address the problem by teaching them

Geez, I don't propose anything like that. This problem is WAY beyond MY reach. I hope to influence a few people on the margin in a positive direction. If so, I expect that to be on a comparatively tiny scale. Small amounts of good are still good, though! And sometimes a snowball rolls downhill; who knows.

> their power checked by countervailing power

Well I don't see any such power. The Democratic Party has discredited itself. ("Less bad than the Other Leading Brand," even if true, is inadequate to gain the power for that.) Maybe something can rise from the ashes of either or both Parties, somehow? Or there will be a new, effective, saner power? 🤞

mtraven's avatar

I did read that interview; it was very revelatory about A's thought processes, and not in a good way, but let's not get into that.

This discussion reminded me of some old posts I wrote on Scott Alexander's idea of conflict theorists vs mistake theorists: https://omniorthogonal.blogspot.com/2018/03/conflict-theory.html Your comment about support of Trump being an error makes you a mistake guy, whereas I am a conflict guy (not fair to classify you based on a single remark, of course). Anyway, I hope that is meta enough and relevant enough to be interesting and make up for dragging your comments into base-level politics (in more than one sense).

Jonathan Weil's avatar

>What we should fear is not [artificial] intelligence as such, but sudden massive shifts of power to agents who may be hostile or callously indifferent. Technological acceleration can do that; but a new sort of AI is neither necessary nor sufficient to cause acceleration.

This puts me in mind of Eddie Izzard’s line in response to “Guns don’t kill people; people kill people”: “ok sure, but guns definitely help.” Wouldn’t “a new sort of AI”, one that was (a) truly agentic and (b) highly capable, be very likely indeed to cause acceleration (this is what its would-be developers are hoping for)? Maybe not necessary, but the things that would prevent it from being sufficient (a lack of will, and/or regulatory obstacles) seem pretty much absent. As for power flowing to callously indifferent actors, maybe I’ve been reading too many rationalists, but agentic AGI without a *very particular* form of alignment seems, in expectation, sufficient to cause this?

David Chapman's avatar

Yes, apparently this was less clear than I expected! My point was that "AI" is dangerous, possibly catastrophic, even if it is not "agentic." It's dangerous to whatever extent it confers power on *some* entity, whether human or artificial.

Jonathan Weil's avatar

This makes sense, and was mostly clear at first reading. It was the “nor sufficient” that I think maybe misled me a little as to your central point (or, quite possibly, I am just being a bit dense today!)

Dave's avatar

Loving this path. Today's reading made me think of old RAW (Robert Anton Wilson) and his 'model agnosticism', and then all of the various models - some my own, most by others - that I've delved into as a software engineer for so many years... 'Oooh, that's how you are thinking about it?!?'

nope's avatar

I will argue that the most "metarational" discipline is actually electrical engineering (possibly because that's what I studied in college, though I do actually work in software now). Or rather, I don't think modern software engineering teaches you metarationality as much as it used to.

From what I understand, metarationality is about applying different models/views/schemas to phenomena. Software involves doing this at lower levels, but modern development is now largely about having the details abstracted away: you just engage with a clean interface that you don't look beneath. This is a modernist approach and does everyone a disservice, especially when performance and flexibility count. As software engineers progress in their careers they are eventually forced to look beneath the layers, or at least to switch abstractions, and that's where many either take the journey forward or flounder.

On the other hand, studying electrical engineering is still primarily a college endeavor where you are required to learn a ton of math and basic circuit and physics concepts that work with said math. Much of the basic work of electronics involves deciding which mathematical and physics abstractions are right for which situations, with the outcome of one's design being directly helped or hurt by those choices. Is a wire an ideal conductor, or does it have impedance? When can we treat our voltage as AC or DC, and what does that do for our analysis? Which region of the transistor are we operating in, and how do we maintain its presence there? Will thermal aspects be relevant, and how? Etc. This is all initially overwhelming, but if you get the hang of it I think you end up being pretty metarational.
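A minimal sketch of the first of those questions, with made-up illustrative numbers (not from the comment): whether a wire can be treated as an ideal conductor depends on how its resistance compares with the load, here modeled as a simple series-resistance voltage divider in Python.

```python
# Toy illustration (made-up numbers): does it matter whether we model a
# wire as an ideal conductor or as a small series resistance?

def load_voltage(v_source, r_load, r_wire=0.0):
    """Voltage across the load when the wire is modeled as a series
    resistance r_wire; r_wire = 0.0 is the 'ideal conductor' abstraction."""
    return v_source * r_load / (r_load + r_wire)

V_SUPPLY = 5.0   # supply voltage, volts
R_WIRE = 2.0     # wire resistance in the non-ideal model, ohms

# Driving a 100-ohm load: the two abstractions barely disagree.
print(load_voltage(V_SUPPLY, 100.0))           # 5.00 V (ideal wire)
print(load_voltage(V_SUPPLY, 100.0, R_WIRE))   # ~4.90 V (lossy wire)

# Driving a 1-ohm load: the 'ideal wire' abstraction breaks down badly.
print(load_voltage(V_SUPPLY, 1.0))             # 5.00 V (ideal wire)
print(load_voltage(V_SUPPLY, 1.0, R_WIRE))     # ~1.67 V (lossy wire)
```

Which model is "right" isn't fixed in advance; it depends on the situation, which is exactly the kind of judgment described above.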

David Chapman's avatar

Cool, thanks, good points!

nope's avatar

Also Bezos has an EE degree in addition to Comp Sci, and another famous EE billionaire is Mike Bloomberg.

Phssthpok's avatar

You are high on your own supply, mate.

Mike H's avatar

The engineers are not in charge of any of this and never have been. All the shots are called by a narrow financial elite who understand technology only insofar as it is a useful tool for seizing more money and power.
