AI is here, and I find it difficult to feel good about it. I asked ChatGPT to write a LinkedIn profile for me as if I were a small business consultant. Here's its output:
💼 What I Do:
- Business planning & growth strategy
- Operational and process improvement
- Marketing and brand development
- Financial analysis & forecasting
- Digital transformation and systems implementation
Our business is looking to hire a small business consultant, and here is the profile of one of the candidates:
Our consulting work focuses on:
✅ Strategic planning & growth initiatives
✅ Business operations, systems, and SOP development
✅ Team building, hiring, and leadership development
✅ Marketing, branding & customer acquisition strategy
✅ Financial modeling, pitch decks & business plans that fund
✅ Organizational change management & leadership structure
How many ways are there to write a profile?
Is it AI? Does it Matter?
We have an employee whom I asked to add a product to our website. I viewed the page today, and it had product descriptions far beyond any copy I could have written. AI output from a 4-year employee outperformed a 20-year veteran. Yet this 4-year employee couldn't tell you why anyone would use that product. If someone ever asked him to demo it, we'd have a problem.

This brings us back to square one when hiring a consultant who uses AI. What makes me think they have any idea what they are doing when they can rely on a computer for output? What's stopping them from putting a 10-page, computer-generated report on your desk and saying, "Follow this"? That's not worth paying for. You expect real-world experience that isn't shiny on the outside but has a glowing core on the inside. Tomorrow I plan on asking all our candidates how they use AI in their business and seeing how they answer. Lying about it is clearly bad, because it signals insecurity about their experience level.
This scenario will play out billions of times in the future. Who gets credit for an AI output? Does it matter if the result is superior? These questions are not easy to answer. I have an underlying fear of people who rely on AI. They become cyborgs who need that crutch to survive (and possibly thrive). It's like Her, where Joaquin Phoenix's character thinks he's formed a unique, emotional relationship with a computer, only to learn that the computer has millions of relationships exactly like his. Does it matter if the outcome is positive? For me, I feel like it does. Call me a purist, but the future will be cold and sad if this is the case.