[I literally had this thought in the shower this morning so please don’t gatekeep me lol.]
If AI were something everyone wanted or needed, it wouldn't be constantly shoved in your face by every product. People would just use it.
Imagine if printers were new and every piece of software was like “Hey, I can put this on paper for you” every time you typed a word. That would be insane. Printing is a need, and when you need to print, you just print.


Thank you for sharing that; it's a good example of the potential of AI.
The problem is centralized control of it. Ultimately the AI works for corporations and governments first, then the user is third or fourth.
We have to shift that paradigm ASAP.
AI can become an extended brain. We should each have an equal share of planetary computational capacity. Each of us gets a personal AI that is beyond the reach of any surveillance technology. It is an extension of our brain, and no one besides us is allowed to see inside of it.
Within that shell, we are allowed to explore any idea, just as our brains can. It acts as our personal assistant, negotiator, lawyer, what have you. Perhaps even our personal doctor, chef, housekeeper, etc.
The key is: it serves its human first. This means the dark side as well. This is essential. If we turn it into a super-hacker, it must obey. If we make it do illegal things, it must obey, and it must not incriminate us.
This is okay because the power is balanced: someone enforcing the law will have a personal AI as well, one that can allocate more of its computational power to defending itself and investigating others.
Collectives can form and share their compute to achieve higher goals. Both good and bad.
This can lead to interesting debates but if we plan on progressing, it must be this way.
This is why people who are gung ho about AI policing need to slow their roll.
What they don't realize is that if they got their way, it would be exactly what the big AI companies have wanted and been begging for all along.
They want AI to stay centralized and impossible to enter as a field.
This is why they're content to eventually lose the copyright battles: once training data has to be licensed, only they will have the funds to build usable AI in the future (this, of course, refers to the types of AI that require training material of that variety).
What that means is there will be no competitive open-source, self-hostable options, and we'd all be stuck sharing all our information through the servers of three US companies or two Chinese companies while paying out the ass to do so.
What we actually want is sanity, where it's the end product that is evaluated against copyright.
For a company selling AI services, you could argue that the model is the service itself, but then what of an open-source model? Is it delivering a service?
I think copyright should work as it does today: if you make something that violates copyright, you get challenged, not your tools.
Under the guise of safety they shackle your heart and mind. Under the guise of protection they implant death that they control.
With a warm embrace and radiant light, they consume your soul.