It's pretty wild that software is the only "engineering" discipline without any concept of professional ethics or loyalty to human safety that supersedes the whims of the employer.
Why do people think that is? Have there been any attempts to change this from the inside over the past decade? Where are professional associations like the ACM in all of this? It's a shameful state of affairs and reflects poorly on the whole discipline.
People who design bridges and vehicles have real responsibilities and standards they are held to, yet somehow the software that actually runs these things is exempt.
This is how Boeing negligently murdered hundreds of people with MCAS: by taking responsibility for safety away from actual engineers and handing it to people who write software.
My ethics curriculum while getting my CS degree basically highlighted the last decade-plus of big-name tech companies as "things that should not be done for obvious reasons". So I avoided all of them, looking for the most boring, banal tech jobs imaginable: applications of computers to clerical work.
Imagine my surprise when even most of those jobs were founded on goals, ambitions, and central risks that were also profound ethical lapses. I've gotten to the point where I'm honestly wondering whether humanity is even ready for ubiquitous computing, and lately the answer seems to trend toward "hell no". Great thing to only realize after 10 years in the industry.
Point being, ethics is self-enforced, and we've taken great pains to ensure there is no professional licensure body or anything else around software, in particular because of the fundamental asymmetry that licensure would create by locking that fire away from mortals behind a bunch of barriers to entry. Those with tech will have an advantage over those without; this is a certainty. The price of the alternative (intentionally opening access to expertise for anyone curious) is what we have now. The tech that gets implemented is a reflection of the collective soul of humanity. If Greed is God, and the Deadline and the Sale take priority over Safety, Fitness for Use, Quality, and Characterization of the System-Under-Scrutiny, then the previously mentioned ethical lapses in computer systems are what we're gonna get, and the conscientious objector will just be walked out the door while the next practitioner, less afflicted by scruples, is walked in instead.
I'm open, as I've always been, to putting a thumb on the scales through greater organization of active practitioners, to create an actual means of ensuring some subset of systems never gets implemented, but I'm not sure that's the right answer. The right answer is to improve ourselves and our non-computerized ways of life so there is no damn incentive to make the Torment Nexus.
Easy to say, right? But that execution... Oofta.
When I was in college majoring in CS in the early ’00s I took several classes about ethics and technology. I also took plenty of liberal arts and literature classes throughout my entire education, as well as reading many sci-fi novels for pleasure. I learned the lesson of not building the torment nexus early and often. Thus far I have refused employment at any places doing so. I have an ethical and moral code that comes before anything an employer asks of me.
I think the problem is that clearly not everyone took the same path or got the same moral education. I'm not religious at all, but religions do play a part in morally educating people. Not only are fewer people religious, but more of the religions that are popular are outright immoral in their teachings.
For-profit companies do not act morally when not required to by law. Remember when Google hired some AI ethicists, and then fired them? Heck, they couldn't even maintain their elementary-school ethical code of "don't be evil." Companies that engineer bridges, medical devices, aircraft, etc. are regulated by law. They don't follow ethical engineering practices of their own volition. They do it only because we force their hand.
We're starting to see now what it looks like to regulate tech with some of the policies in the EU. While those regulations are flawed in several ways, they are also working. The GDPR does have a positive impact on privacy. The Digital Markets Act is forcing Apple to allow side-loading.
We can defeat the torment nexus by simply outlawing it.
I took the same professional ethics course as you in school (Hi from another RIT anime nerd), and I found that even by the mid-2000s, most people coming into the program openly mocked the lessons of Therac-25 and the Hyatt Regency walkway collapse as heavy-handed, but a disturbing few viewed these as lessons in how you could cut corners and avoid getting charged.
Google used to have ethicists and a culture of "don't be evil." They were fired by the AI ethicists, who for the most part were among that latter group: Rationalists who were supremely good at rationalizing their choices rather than making good ones.
I don't think that any of the current efforts at regulation are going to work. The latter culture has taken over the institutions, and that's why you see so many baying for the institutions to be taken down.
> Where are professional associations like the ACM in all of this?
https://www.acm.org/code-of-ethics
However, the ACM continues to accept money from companies that routinely and systematically violate this code (for example, the privacy provisions in section 1.6) and seems to have little interest in sanctioning them.
See also:
> We want you to actively contribute to the discussion of ethics and computing (through the comment section or our survey (https://tinyurl.com/CACM-ethics)) because participation is part of the computing profession.
https://cacm.acm.org/opinion/the-future-of-professional-ethi...
> People who design bridges and vehicles have real responsibilities and standards they are held to, yet somehow the software that actually runs these things is exempt.
The most impressive accomplishment of the computing industry is its avoidance of any kind of liability.
Because there was a very conscious attempt to destroy cybernetics, resulting in MIT etc. becoming a school that builds human calculators for Harvard/biz grads to utilize as literal machines without souls. (albertm)
I think we need our own variant of the "iron ring" at this point. I wear a similar steel ring on my pinky as an oath reminder.
It's a lot of work to grass-roots something like that, and I don't have the charisma for it.
I think other engineering disciplines grew up in an era when workers' rights and public safety were becoming a big focus of government. Software engineering grew up in a time when we took government for granted, even demonized it.
I agree with the crux of your point, but commonly held engineering ethics are still compatible with building weapons that directly attack other humans. The ethics basically focus on doing right by the employer/society you're working for. And for the surveillance industry, that "society" is the obscenely rich elites, and us plebs are all just the targets.
> commonly held engineering ethics are still compatible with building weapons that directly attack other humans
"Once the rockets are up, Who cares where they come down? That's not my department," Says Wernher von Braun.
https://news.ycombinator.com/item?id=44702782
If you build the Torment Nexus, it might even be ever so slightly less tormentful[1] than it otherwise could have been because you're a good person, perhaps even slightly better than most.
This thing you eventually come to believe, of course, is exactly what one must believe for the Torment Nexus to be made.
Sure, there are structural and economic incentives that tilt the playing field, but this belief is easier to accept than turning down the high pay, dealing with the hassles of insurance (or the lack of it), or having to face the immigration authority.
1. https://en.wiktionary.org/wiki/tormentful
Hannah Arendt considered this argument in her essay "Personal Responsibility Under Dictatorship":
> In the tumultuous discussion of moral issues which has been going on since the defeat of Nazi Germany, and the disclosure of the total complicity in crimes of all ranks of official society, that is, of the total collapse of normal moral standards, the following argument has been raised in endless variations: We who appear guilty today are in fact those who stayed on the job in order to prevent worse things from happening; only those who remained inside had a chance to mitigate things and to help at least some people; we gave the devil his due without selling our soul to him, whereas those who did nothing shirked all responsibilities and thought only of themselves, of the salvation of their precious souls… (p 34)
But, ultimately, she finds the excuse lacking:
> Politically, the weakness of the argument has always been that those who choose the lesser evil forget very quickly that they chose evil. (p 36)
https://grattoncourses.wordpress.com/wp-content/uploads/2016...
> ultimately, she finds the excuse lacking
Oh absolutely! It's my hope that detailing the psychological mechanism at play is in some way an antidote to its acceptance. If anything, it feels a bit naive when faced with the possibility that there exists a group who gleefully implements the most heinous practices, so that this line of thinking never even enters the picture.
I have a friend who's working at the torment nexus factory, and whilst he knows he's helping kids getting blown to literal pieces, material conditions "dictate" that he has few other choices besides closing his eyes and hardening his heart. Or quitting and working for someone else, but that's a tough argument to make.
It's telling how little discussion this post is generating on HN. Perhaps a little too uncomfortable for a large portion of the crowd?
I believe that, especially in software, we don't ask ethical questions up front and just start solving puzzles.
If and when people start asking themselves these questions, their only options are to rationalize away the harm their work has done entirely, engage in moral relativism, or consider "the big picture" at a zoom-level where nothing ever really matters.
I think it's just the oversimplification (not of the tradeoffs between type of work and one's own health insurance, but of classifying some jobs as pure evil), and also the nowadays-tepid LessWrong writing style.
And "no gods, no masters" is the sort of thing you might hear someone say right before they become the leader, ban religion, and cause mass starvation.
Feels like we're getting into "contractors working on the Death Star" territory (a surprisingly lucid and concise refrain on the subject from Smith et al.). The job itself isn't "evil", but anyone capable of seeing how it fits into a bigger picture will think of your death as expected collateral.
Usually when crazy people get on the metro and start yelling their political opinions at no one, we all know it’s best to ignore them.
Perhaps those of us voicing no opinion at all are the crazy ones.
This isn't the metro, but rather a discussion forum full of your peers. And the article isn't crazy disconnected ranting, but well-founded and well-reasoned arguments.
Grammar nit, but it highlights how steeped in passivity we all are:
> he's helping kids getting blown to literal pieces
Helping kids [who are] getting blown to bits would be a noble endeavor. What you presumably mean is that he is helping to blow kids to literal pieces.
>But surely the real problem is a systemic one and you can’t blame the workers for participating. After all, they are just trying to feed themselves and get healthcare. And, that is correct, of course. What’s also correct is that systems are powered by people. They rely on the labor of people to function
That statement is pivotal and one which people really need to be called out on when they use it as a crutch.
"Don't hate the player, hate the game"... Motherfucker there's only a game because of the players!
Reminds me of the feelings remote combat drone pilots speak of, where they're also often forced into difficult ethical positions. It's literally the job, but there is clearly also harm being done. A terrorist that turns out not to be a terrorist, etc.
> Tech was a sector that made us think of humankind moving forward, possibly into some happy Star Trek like future where no one needed money and pie magically appeared in your wall if you said “Magic wall, pie me!”
Am I too naive in thinking that the AIs we're working on are a prerequisite step on the way towards Star Trek's post-scarcity future?
No, whether this tech could lead to a post-scarcity future is debatable. The naive part is thinking that it will lead to it, given the way we use it and the power structures that profit from it.
I personally feel like we're more on the path to an Altered Carbon-esque cyberpunk dystopia, only it will be clean and pretty and white and gleaming, and only the "elect" will actually have full independent cognitive function.
Everyone else will be Animal Farm-brainwashed into caring more about celebrities and sports and war and arguing online with strangers about pointless topics and politics, while they eat their slop and work their meaningless jobs to keep the engine of civilization moving forward in a way that benefits the wealthiest. And AI will ultimately become the old-school Andy Griffith cop that seems all nice and kind and helpful, but will T-1000 sell us out to its true masters without remorse or hesitation, no matter what.
Is it just me or is the dam starting to break on all kinds of truth-telling that were long taboo in this sort of space? I don’t think this post would have been so well-received on HN even a couple of years ago.
> Your soul will not remain intact while you hoover up artists’ work to train theft-engines that poison the water of communities in need.
This sentence sent the article's credibility into a nosedive.
Intellectual property and the enforcement thereof is in and of itself a Torment Nexus. The belief that thoughts and ideas and words and images and sounds can be "stolen", and that such "theft" is somehow a bad thing (instead of the sort of free exchange of ideas that has benefited humanity for its entire recorded history) is itself mutually exclusive with having an intact soul.
Yes, artists deserve to be able to earn a living making art (absent a universal basic income that renders the notion of "earning a living" moot). Yes, it's understandable that they choose to do so by wielding IP law, because that's the most straightforward option they have in a capitalist system that actively rewards Torment-Nexus-enforced rent-seeking. No, that doesn't make them any less complicit in the perpetuation of that Torment Nexus.
These are the same laws that enable Disney to sue the pants off of parents who dare to decorate their dead children's coffins after said children's favorite fictional characters. These are the same laws that rob other creatives of their creative autonomy lest their works "infringe" on the "rights" of richer creatives who can afford better lawyers. These are the same laws that normalized shipping rootkits with creative works for the sake of "digital 'rights' management". These are the same laws that actively hinder the preservation of creative works for historical posterity, putting those works at risk of being lost forever.
Intellectual property has done vastly more harm than good, and AI throwing a wrench in the ability to meaningfully enforce it is one of the exceedingly few good outcomes of AI proliferation.
Your soul will not remain intact while you parrot MPAA/RIAA "yOu WoUlDn'T dOwNlOaD a CaR" talking points in defense of collecting royalties until 70 years after you die.
I share your view of imaginary property. But pushing in the right direction is much more important than being 100% correct about everything.
Demonizing the one good outcome of AI is the opposite of pushing in the right direction, though. It's like the author missed his own point, self-awarewolf style.
I think you're assuming a lot to call it a "good outcome". I foresee hefty regulatory capture and compensation deals made with the big copyright businesses, but no real increase in freedoms for individuals.
That would be the outcome of the Torment Nexus succeeding at kneecapping the first thing in decades with any hope of destroying it, not the outcome of the thing in question succeeding at destroying the Torment Nexus.
I don't really understand what you mean here, because I don't really know what you specifically mean by Torment Nexus. It wasn't bad as a rhetorical device for pointing out the incentive-attractor(s) we're already suffering under (Mammon, the orphan grinder, etc.), but the term doesn't really work for analyzing specifics unless you spell it out.
In general, now that the pump has been fully primed for capital to flow into developing "AI", I do not see how copyright law is going to make much of a dent in that trend. Nor do I see how "AI" companies are going to make a dent in copyright law for anyone but themselves. I foresee large "AI" companies being essentially unbound when training on small owners' copyrighted works, upstart "AI" companies needing to pay into a hefty protection racket, and individuals still bound by imaginary-property laws, whether directly (old-fashioned piracy) or when using common genAI ("sorry Dave, I can't do that").
I just ran into a situation where ChatGPT refused to quote me the relevant bit of the electrical code for my state (supposedly binding law), because those laws were created by wholesale importing the copyrighted "National Electrical Code". At best, the situation is an open legal question. And yet, de facto, there is still a restriction that prevents me from using the tool to engage with the law in good faith.
> I don't really understand what you mean here, because I don't really know what you specifically mean by Torment Nexus.
I thought I made that pretty clear when I wrote in my original comment that "[i]ntellectual property and the enforcement thereof is in and of itself a Torment Nexus."
> In general, now that the pump has been fully primed for capital to flow into developing "AI", I do not see how copyright law is going to make much of a dent in that trend. Nor do I see how "AI" companies are going to make a dent in copyright law for anyone but themselves.
"AI" exists outside of the various corporations hosting LLMs on The Cloud™. The corporate-hosted LLMs get undue emphasis largely as yet another result of the Torment Nexus that is intellectual property.
> [i]ntellectual property and the enforcement thereof is in and of itself a Torment Nexus.
That does not make it clear. Saying that A is an instance of B does not define what B is.
> "AI" exists outside of the various corporations hosting LLMs on The Cloud™
You're speaking obliquely here, so I'm left guessing what you mean. I think you're just referring to how individuals can train/download/modify/run models locally. I don't see how that affects copyright, as it seems to fit in the exact same place as piracy of the source material (unfortunately). Downloading a model that has gotten the attention of corpos for "infringing" will be treated exactly the way torrenting the original works is treated now.