Is this how you see human intelligence? Is human intelligence made without the input of other humans? I understand that even babies have some sort of spark before they learn anything from other people, but don't they have the human DNA input from their human parents? Why should AI intelligence be required to arise with no human input when even human intelligence seemingly requires human input to be made?
Sorry, lots of questions, just food for thought I suppose.
Yeah, to be clear, I'm not arguing that current LLMs are as creative and intelligent as people.
I am saying that even before babies get human language input, they still get input from people just to be made: the baby's algorithm that produces that spark is modeled on previous humans via the human data that is DNA. Future intelligent AIs will also be made from data that humans make. Even our current LLMs are not purely human language input; they also have an algorithm doing something with that data, which is how they show us the (admittedly relatively weak) "intelligent spark" they had before they got all that human language input.
Chatbots are not new. They started around 1965. Objectively, GPT-4 is more creative than the chatbots of 1965. The two are not equally able to create. This is an ongoing change: future AI will be more creative than today's most creative AIs. AI will most likely continue on its trajectory, and someday, if we don't all get destroyed, it will be more intelligent and creative than humans.
I would love to hear a rebuttal to this that doesn't just base its argument on the fact that AI needs human language input. A baby and its spark are not impressively intelligent on their own. What makes that baby intelligent is its initial algorithm plus the fact that it gets human language data. Demanding that AI do what the baby does, but without the human language data that babies get, makes no sense to me as a requirement.