The importance of foundational skills.
When using AI, it's really important that we get good at prompt engineering, which is a fancy term for “giving instructions in a clear way”. I see two big issues with this. One, we're actually pretty terrible at communicating. And two, we don't actually know what we want the agent to do.

The first one is self-explanatory. We use big, vague words that don't clearly state what we want or what makes an output good versus bad. I'm guilty of this all the time, but honestly, it's a lot of work to write a detailed prompt (especially when all you want is to find the cheapest place to buy groceries).

The second one is more important, though. I've caught myself wanting AI to write good copy, then realizing I've never actually defined what "good copy" means. I don't have enough experience to even understand the nuances that separate a strong piece from a weak one. I'm using copywriting as the example here because marketing and lead gen are where I'm struggling in business right now, but this applies to anything.

If you're having trouble with writing and you try to get AI to write something, of course it's not going to understand what makes it sound human versus not human. You haven't taken the time to explain it. Same thing with AI-generated images and video. We know it can produce good results, but it also produces bad results a lot of the time because the defaults are things like waxy skin and unnatural body compositions. The issue is that we haven't told it the specifics (things like "use a light skin tone with some blemishes and natural imperfections").

These models are exceptional at what they can do. But a lot of the time when we don't get the output we want, it's user error, not a model problem. TLDR: Bad AI output is a skill issue.
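One practical fix for both issues is to force yourself to write down the definition of "good" before you ask. Here's a minimal sketch of that habit as code: it contrasts a vague prompt with one assembled from explicit criteria. The criteria, audience, and helper function are hypothetical examples to show the shape, not a definitive checklist for good copy.

```python
# Hypothetical sketch: turning "write good copy" into an explicit spec.

VAGUE_PROMPT = "Write good copy for my landing page."


def build_prompt(task: str, audience: str, tone: str,
                 good_criteria: list[str], bad_criteria: list[str]) -> str:
    """Assemble a prompt that spells out what 'good' means up front."""
    lines = [
        f"Task: {task}",
        f"Audience: {audience}",
        f"Tone: {tone}",
        "A good result:",
        *[f"- {c}" for c in good_criteria],
        "Avoid:",
        *[f"- {c}" for c in bad_criteria],
    ]
    return "\n".join(lines)


SPECIFIC_PROMPT = build_prompt(
    task="Write landing-page copy for a local lead-gen service.",
    audience="Small-business owners with no marketing background.",
    tone="Plain, conversational, no jargon.",
    good_criteria=[
        "Leads with the customer's problem, not the product.",
        "Uses short sentences and concrete details.",
    ],
    bad_criteria=[
        "Buzzwords like 'synergy' or 'revolutionary'.",
        "Claims we can't back up.",
    ],
)

print(SPECIFIC_PROMPT)
```

The point isn't the helper function; it's that filling in those five fields forces you to answer the questions you were skipping when you typed the vague version.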