It seems no matter where you go, there's this talk about "AI" taking over. Since DALL-E 2 is recently the hottest thing, the focus has been on the visual arts. As art is one of my pastimes, I can't really avoid hearing about it. And I did notice something sinister in the talk. It's not me being mad that the digital Rembrandt is here, putting the human one in shadow and shame for eternity. It's more that no matter how you think about it, what is happening is theft.
Think of how these AI systems work. Some programmer creates a model, you feed it data, and then it can create output based on that. So far so good, but there are a few issues with this. One being that producing the AI network is 1% the programmer's work and 99% the work of all the artists who produce its data. The issue is, the programmer is paid; the artists are not. Think of how weird this is in human terms. It's as if you lived in a town where artists put their art in front of their houses for everyone to see. So you go around with a camera, snap pictures here and there of artworks (since the content is digital, a copy is one-to-one the same thing as the object you are copying), and return home. Then you apply a few filters to the artworks and sell them back without credit.
If someone did that once, we all would probably feel a bit bad but think of it as a one-time nuisance. Let it go. But if someone went on like that for days upon days, consuming people's artworks and then selling them, we would rightfully get mad and consider it a scam. You know, how weird is it that someone just steals your work and then sells it back to you?
So let's take a closer look at the business model of any of these companies. I'm going to take Instagram, since, being an app made for visual media, it comes close to what we talked about before. What makes it valuable? Software-wise, Instagram is just a database, a few operations to create and delete pictures (nowadays videos too), and a few buttons to do it. I hate to say it, but it's not that different from the site you are reading this on. It's the exact same thing. Sure, there is a bit of work in scaling it to its massive size, and there's analytics software to give you feed recommendations based on what you have already seen. But really, if you sold Instagram as software, it wouldn't be that expensive. I know it's a crude metric, but Xenforo forum software, which I have seen used on two pretty good online forums, with all the abilities Instagram has and more, costs just 160 dollars. So the software can't really be the meat of it.
So what is valuable about it? It's pretty obviously the user data. Instagram could disappear from the earth, and if someone wanted, the software could be recreated in a week in nearly the same state as it is now. The code would probably even be better, because it wouldn't have a decade of legacy code to slow it down. But if we lost the data, from the pictures themselves to their metadata, we would have lost billions.
The irony of this shouldn't be lost on the reader. With Instagram being free, it has to fund itself by something other than direct payment. And the fun thing is, it's funded by YOU. This applies to all social media companies. It's the monetization of your leisure. In your free time, you chat, take pictures, upload videos. You are contributing to the value of Instagram directly.
So the idea that the web as it is now is free is rubbish. You do pay for it, but by directly working for Instagram. Some might say this is preferable because it keeps access free for everyone: by not paying for these services, a 13-year-old kid with a crappy phone has the same access as someone well off. Once again, I have a few issues with that.
With the service being free, it funds itself by selling your data to advertisers. That alone should feel off. The machinery for getting information out of people nowadays is larger and more developed than what the spying agencies of the Cold War had. Some say this doesn't matter because Instagram will never be the KGB. Which I can get behind, but leaving the infrastructure ready for a spy-and-control network on levels the KGB could only dream of, and then trusting it will always be in good hands, is naive.
There is harm and manipulation being done through these services as we speak. It's just for profit. And it's not just in one way.
With that said, if you tell this to someone, it's not that they disagree with you. No! They will tell you that you are right. But they see it as the only possibility, basically saying this is the only possible world. Well, if I wanted to be snarky, I would say the manipulation worked and you already lack the ability to see potential futures and choose between them. But if we take the point as valid, I need to present some alternatives. Those you can read at the end; I first want to see how AI changes this discussion.
This is a bit tricky, but I'm going to first look at this from the point of ownership. Since the AI is owned by some big centralized entity, it would be very weird for that entity to be allowed to do what it wants with others' data, just because it's a bad idea by itself. It's a truism that power shouldn't be concentrated in a small group of people, because that encourages abuse. So by that logic, it's not a stretch to say that an AI company owned by the few, using the data of the many, is wrong. There's a sentence, originally about parliament, that applies here as well: "what touches all should be approved by all".
So yes, I think the majority of my issues would be gone if the AI became a public utility used by everyone for free. But once we go that way, why not think of making the internet a public utility too? With some imagination we could see it being self-funded as well. It really doesn't cost that much nowadays to run a server where all your data could be hosted. If you gave 10 dollars or euros per month, which is what Spotify Premium costs, you would pay what I pay to run two websites: one which receives daily usage, and this one, which contains a good chunk of my writing and a few art pieces and photographs.
Another point to make is that modern "AI" doesn't have what it takes to be an individual. "Individual" is a pretty vague definition (like all definitions) that's easy to abuse; after all, what allowed slavery in many ancient societies was the denial of seeing slaves as human. But it has a point. AI is way closer to a machine than it is to a man, and no one thinks anything from sewing machines to laptops should be treated as an individual.
It doesn't really think the way we think. What the AI does is see our work, see the patterns in it, and then weave the patterns back together, based on prompts, to create the desired results. Some might say that is just what humans do. Like nearly all arguments that try to say the mind is no different from a machine, this one relies on figuring out how to do something with a machine, and then simply assuming the mind does it the exact same way, no matter how different the internal process you experienced when creating something was. One issue with that is that it's just speculation. No one really knows how the mind works; if they did, we wouldn't be left with a handful of big scientific and philosophical questions about it.
One might just take perception itself. What you see in front of you, you can grasp and understand in your mind. You have an internal account of it, and the vision you receive is held in your consciousness. It's not the eye's work; the eye doesn't perceive. Neither is it like the display you see on the monitor. After all, the display doesn't understand anything outside of the most mechanical terms (think of something like: the window of your browser is at pixel [x: 0, y: 350]). Something has to be conscious of the display. There's an argument made by Leibniz about this. Imagine a machine that actually could perceive and be conscious. We could scale the machine up to the size of a building to see its inner workings. But no matter how we looked at the cogs moving inside it, they couldn't explain perception. That to me is enough to show that computation, or mechanical thought, is not enough to explain what makes us human. And this is not jumping into some mysticism. I by all accounts am a materialist, although not a vulgar one. It's just an unknown we have, but one that's special. After all, if we showed a caveman a light bulb, he would think we were mystics too. No?
But with that said, I'm instead going to ask a question. Why do we need data at all, then? Whatever language you use will be different a century from now and was different a century ago. Unless you have active people developing it and translating it to other languages, the currently trained AI model would become useless. Same with art: we never did the same art for centuries. Styles change, innovations happen. With the statistical method of the AI, you can't rely on it being useful a few decades from now without feeding it data. That implies it needs human labor to keep turning.
And it's not just art or translation. The same applies to voice acting, design, driving, cooking, programming, and probably quite a bit more. Saying that AI makes these things is like saying that the telephone produces human speech. One fun thing to note is how much this resembles the argument of exploitation made by Marx.
One of the first big economic theories was that all economic activity depended on labor, and labor itself was seen as borderline sacred (it is to this day, just not as openly). The value of a commodity (think rubber bath ducks if you don't have an imagination) depended on how many, let's say, hours of labor were expended on it. Marx then came and argued that what this theory proves is that people are being exploited. In a factory, many work in production, yet of the sum of profit the enterprise earns, the workers get back the minority. Meanwhile the owner of the factory, even if you believe he is important, earns a majority well beyond what he should.
An argument against that is that a business owner takes risks and is crucial to the operation. I'm going to just agree with it, but even then the wealth difference is still ridiculous. After all, no man, no matter how skilled, can run an international company by himself, since he is a mere human. So let's think of some of the best-paid workers one can think of: programmers in Silicon Valley. The good ones, top of the cream, earn from 100k up to millions monthly. But the people who own these companies have net worths in the billions. To be so successful as to have a billion in your bank, someone making a million per month (which would be the top of the cream) would need to wait 83 years. And that's for one billion; the business owners are multi-billionaires.
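The arithmetic above can be checked in a few lines. To be clear, the salary and net-worth figures are the essay's own rough illustrative assumptions, not real compensation data:

```python
# Sanity check: how long would a top-paid worker need to save an
# entire million-per-month salary to reach one billion dollars?
monthly_salary = 1_000_000        # dollars/month, "top of the cream" (assumed)
target_net_worth = 1_000_000_000  # one billion dollars

months = target_net_worth / monthly_salary  # 1000 months
years = months / 12                         # a bit over 83 years

print(round(years, 1))  # → 83.3
```

And that assumes every cent of the salary is saved, with no spending at all, which only makes the gap look smaller than it is.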
Of course, I feel this is silly talk for the average person. Most people earning 9k monthly would consider themselves well off, especially if you take people in the third world into account. This is not an argument against people being rich. I'm fine with multi-millionaires existing, provided they really do the heavy lifting, but no one will tell me any person can be important enough to "deserve" billions. Unless he is some super genius who cured cancer and invented hyper-drives so that we can fly to other galaxies in a span of minutes. But the ones who are so rich nowadays aren't.
And I think that's the issue with AI. You can't tell me that the owners of AI companies have done enough work to keep the fruit of collective labor for themselves. It just doesn't register with me. What risk do they even have? All of these AI companies are funded by corporations which are "too big to fail", so the worst case that could happen is that they don't come up with clever enough algorithms to actually do the work. They essentially have endless room for failure, so the idea of there being risk here is laughable.
Some people believe this data collection is temporary, and that afterwards the AI will forever be able to do everything on its own. I hope that's not the case, because it would probably be an age where work is even more derivative and boring than it is now. But assuming that happens, the AI still needs to feed on human data to know what to actually sell. We can imagine fully AI-generated movies, for instance, but someone still needs to watch and rate them, which means data will still be collected even if we assume full automation. You can't really have this modern AI system without data and without stealing people's work.
Some reader: Gee! For so much talk you sure didn't give any solution!
Alright, to put my money where my mouth is, here are some possible solutions to this problem.
Make AI services a public utility. And this isn't some communist plot to take anything away. Public libraries exist and have functioned well. The internet made them obsolete, but their model might be something worth emulating. After all, they were used by countless people as a service without being abused or failing in any meaningful way.
The other one is... just own the platform. It's a radical position to take, but imagine a social media platform in which you had a stake. You paid a bit to host your own place on it. And if people used what you produced, be it data such as text or outright finished works (let's say YouTube videos), you just got paid for it. We can think of many ways this could work: either through a distributed payment network, or by everyone holding a dividend in some social media company. Sure, it's something rather unusual, but if we were so opposed to unusual ideas we would still be living in caves.
I would even say you would be encouraged to produce quality. After all, if someone pays for something, they presumably consider it of enough quality to put money on it. Such a platform, if implemented well (I can already see some crypto scammer creaming their pants before hearing this out) and without predatory practices, could produce quality work, instead of just the quantity we expect from modern platforms.
As a fun rumor that I don't know to be true or not: a huge part of the internet was built at Xerox PARC, a research facility owned by a company that exists to make printers. Apparently the internet wasn't supposed to have a copy function, because why copy anything when you can just look it up on the network? But the nature of Xerox made the copying function slide into the net.
Of course there are more arguments against it, and I don't fully believe that theory. But it has one important point: production is done by the many, yet its profits are taken by the few. I might write more in depth some other time about how it all fits together.