
Will outsourcing to AI eliminate original thought?

I saw the comment "It's not about solving for speed but solving for depth" under an AI discussion on LinkedIn about outsourcing one's brain to generative artificial intelligence, and figured it was a cool topic to think through for a bit and write about.
There's a growing fear that humans are giving up, or will give up, much of their desire to think to artificial intelligence.
As usual, I find this fear comical, because people already don't like thinking for themselves. A lot of people out there are walking around with ideas and beliefs they gathered from listening to or reading others.
One person says or writes something and it gets passed around until 10,000 people say or believe the same thing.
But that's not an excuse
Despite this reality, I still believe it is absolutely necessary to assess this possibility and solve for the risks associated with it.
I think above all else, the most important area to focus on would be politics.
If people were to form a habit of jumping into an AI chatbot to ask "Is president X evil?" and consider its response "truth," that would create an incentive for political bodies to leverage this tech to influence narratives or public perspectives on matters.
Translation: not good!
But let's put a pin in that for now. Let's try to understand something else.
Why would people outsource to AI?
Usually, every problem traces back to a smaller problem, often termed the "root cause." So in the case of outsourcing to AI, what fundamental problem makes the concept appealing?
While "outsourcing" can mean many things, and not just giving away the desire to think, it all comes back to the same thing: cognitive load.
If I outsource my design tasks, I'm running away from the cognitive load of independently thinking through every piece of data relevant to completing the task.
If I outsource finding the best finance book online, I'm running from the cognitive load required to read through numerous reviews and determine the best one.
If I outsource writing an email, same thing.
It all comes down to not wanting to apply any mental effort.
Unfortunately, this isn't a problem we can solve by pressing a button, because that would be like asking people to choose suffering when there's a quick fix right in front of them.
Making risks greater than rewards
I think the solution to this growing fear is making the potential damages greater than the rewards.
This is obviously already the case for the average person who'd completely outsource their thinking to AI, so this isn't about them but about the companies behind the tech.
This is essentially regulation, a reality many have agreed will be difficult to achieve but necessary to keep pushing for.
It has to be potentially more damaging than rewarding for AI companies to capitalize, in an unethical way, on people's reliance on their chatbots.
Because to answer the question: yes, outsourcing to AI will eliminate original thought, to a greater extent than we already see today.
In literally seconds, people will believe and act on things completely new to them, and that reality must come with consequences great enough to keep the companies behind this tech from indulging in activities that make the greater fears a reality.
In addition to this, I think efforts still have to be made toward educating people to use AI to seek depth, not to chase speed of execution, as the earlier quote highlighted.
Achieving greater depth generally increases execution speed in the long run. People need to understand that AI doesn't make them better if they don't use it accordingly, and the savings in cognitive load they're chasing can turn into something more damaging.
Posted Using INLEO