4 reasons why AI isn't making your company more productive and 3 ways to fix it

In July 2024, Upwork published a report with an alarming observation:

47% of employees using AI say they have no idea how to achieve the productivity gains their employers expect, and 77% say these tools have actually decreased their productivity and added to their workload.

Apparently, letting employees use ChatGPT isn't enough to get productivity improvements, and there is a huge skill gap between those who became more productive with AI and everyone else.

At the same time, Walmart's CEO Doug McMillon says AI made some of their work 100 times more productive:

Without the use of generative AI, this work would have required nearly 100 times the current head count to complete in the same amount of time. And for associates picking online orders, showing them high-quality images of product packages helps them quickly find what they're looking for.

How is that possible? Why do some companies struggle with AI while others use it to increase productivity?

Why Are Employees Less Productive When They Use AI?

In the Upwork report, we read:

Nearly half (47%) of workers using AI say they have no idea how to achieve the productivity gains their employers expect. Over three in four (77%) say AI tools have decreased their productivity and added to their workload in at least one way. For example, survey respondents reported that they're spending more time reviewing or moderating AI-generated content (39%), invest more time learning to use these tools (23%) (…)

Not knowing how to achieve the expected result clearly shows the issue is caused by insufficient knowledge, not laziness or malice. It's a general computer-skill problem: software can be difficult to use, especially a whole new category of software. Before ChatGPT, had anyone ever used an AI chatbot?

When it comes to the added workload and diminished productivity, I bet people have said the same thing about Excel at some point. In my IT career, I have seen people use the calculator app on their phones to get a result they typed into an Excel cell instead of using the built-in functions of Excel.

In the case of Excel misuse, it’s undoubtedly an employee’s fault. After all, public schools have taught students how to use spreadsheets at least since the 1990s. However, nobody teaches how to use AI yet, and even if they did, would you wait until a new generation of workers graduates?

No Proper Training on How to Use AI

If your employees don’t know how to use AI, you should train them. Waiting until they learn the required skills in their free time is inefficient. Most of them won’t do it, ever.

The executives surveyed by Upwork correctly identified the consequences of lacking AI-related skills while failing to figure out the underlying cause and blamed employees:

One in two executives at companies using AI believe their company is falling behind their competitors (51%) and that their workforce’s overall productivity levels are stalled due to a lack of employee skills and adoption (50%).

If a factory buys a new machine, do the managers throw the manual into employees' laps and let them learn by doing, hoping the machine won't cut off too many fingers? Generally, such negligence in employee training happens only in factories that compete by being the cheapest supplier. It's accuracy by volume: if you produce enough widgets, some of them will be good enough. Similarly, if you let everyone use AI, some employees will become more productive. We haven't realized that introducing AI is like buying a new machine at a factory.

Expectations Misaligned with AI Capabilities

AI isn't a magic tool. People see cool demos where AI writes emails, makes games, or builds apps, and they want AI to do the same for them. A demo is like an influencer's Instagram photo: the influencer takes 30 photos and posts the best one. The creators of the demo try a dozen ideas and show you the best one.

Also, we extrapolate far too much. Some people see an AI model write code for the snake game and conclude that the same model can build an ERP application from a three-sentence prompt.

The same happens when you look at demos of AI systems for marketing. Automatically generating hundreds of personalized cold emails looks like an attractive sales strategy until you see that the only personalization was inserting the company name and mentioning the product currently advertised on their website. Will it sell? Of course, as long as you play the numbers game and send enough messages. Will it be profitable? Will it build trust and lasting business relationships with customers? So far, the creators of AI marketing systems tend to ignore those questions.

Expectations Misaligned with Employee Skills

An AI-assisted, human-guided process is the proper approach, but not many people know how to run one. Even worse, employees don't communicate the need for training, and management has no clue about actual performance. According to the Upwork report:

Thirty-seven percent of C-suite leaders at companies that use AI said their workforce is “highly” skilled and comfortable with these tools, but only 17% of employees actually reported this level of skill and comfort. Thirty-eight percent of employees, in fact, reported feeling overwhelmed about having to use AI at work.

Employees being afraid to admit they have problems completing a task and need training may be more a corporate-culture problem than a skill problem, but 83% of employees feeling their AI skills are lacking isn't something we can ignore.

Fixing the AI skill gap is a job for management. People who are already overwhelmed won't find a solution on their own. They may decide to stop using AI, and your company will fall even further behind those who use AI efficiently. You may need to buy them an online course or hire an external consultant to teach them. Perhaps you already have an employee who is better at using AI than everybody else and can train others. But first, make sure that employee is at least a decent teacher. Nothing is worse than a terrible tutor explaining something you already have trouble understanding.

Scepticism Towards AI

If a person doesn't know how to use AI properly, they need to protect their ego somehow: they tried AI, typed a highly specific question, and ChatGPT made a mistake; thus, AI is useless. Is Google useless because it shows irrelevant pages among the relevant results?

Unfortunately, ChatGPT became synonymous with AI. It's not the only tool and certainly not the best tool for everything. Part of the AI skill gap is not knowing about tools like Perplexity or Microsoft Copilot for Bing. People don't know they should use them instead of ChatGPT when they need accurate information based on web research. For coders, the gap shows most when they ignore Anthropic's Claude model or Cursor's Copilot++ and use ChatGPT and GitHub Copilot for everything.

Of course, AI hallucinates or ignores parts of the prompt, but that doesn't mean it doesn't work. To me, nitpicking about AI's hallucinations has the same vibe as showing a picture of a wind turbine on fire and saying wind turbines can pollute, too.

We can limit hallucinations by using old-school search first, giving AI source information, and asking it to produce the answer based on the provided context (like Perplexity does). We can ensure AI doesn’t skip any part of the prompt by splitting the task into separate requests executed individually.
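Both tactics boil down to how you construct the prompt. Here is a minimal sketch in Python; the function names are illustrative (not from any library), and the strings they build would be sent to whichever chat model API you use:

```python
def grounded_prompt(question: str, sources: list[str]) -> str:
    """Build a prompt that asks the model to answer ONLY from the provided
    context, which limits hallucinations (the Perplexity-style approach)."""
    context = "\n\n".join(f"[Source {i + 1}]\n{s}" for i, s in enumerate(sources))
    return (
        "Answer the question using ONLY the sources below. "
        "If the sources don't contain the answer, say \"I don't know.\"\n\n"
        f"{context}\n\nQuestion: {question}"
    )


def split_into_requests(subtasks: list[str], shared_instructions: str) -> list[str]:
    """Instead of one long prompt the model may partially ignore, build one
    prompt per subtask and execute them individually."""
    return [f"{shared_instructions}\n\nTask: {t}" for t in subtasks]
```

The first helper mirrors the "old-school search first, then answer from context" idea; the second trades one large request for several small ones that the model is less likely to skip parts of.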

How to Use AI to Boost Company Productivity

In the aforementioned report, Upwork doesn’t offer a sensible solution. They are a marketplace for freelancers, so of course, they suggest hiring freelancers who are better at using AI than your full-time employees. Outsourcing, not upskilling? Upwork is wrong.

Getting help from freelancers may be a short-term solution, but what about all the people you have already hired? Will you fire them because they don't know how to use AI? Your employees need training that shows them how to use AI for the tasks they perform daily, with examples relevant to the positions they hold.

Everyone Should Attend Prompt Engineering Training

Everyone should attend prompt engineering training. Without it, people will rely on tricks they heard from colleagues, stumbled upon on LinkedIn, or saw in a TikTok clip.

While new versions of AI models are more lenient toward conversation-style prompts, nothing beats learning a few basic prompt engineering techniques, such as in-context learning (a fancy name for giving AI examples of questions and answers), chain-of-thought, or the elements of an effective prompt.
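In-context learning, for instance, amounts to prepending worked examples to the prompt so the model imitates their format. A minimal sketch (the invoice examples are made up purely for illustration):

```python
def few_shot_prompt(examples: list[tuple[str, str]], query: str) -> str:
    """Build an in-context-learning prompt: a few question/answer pairs
    ("shots") followed by the new question, ending with an open "A:" that
    the model is expected to complete in the same format."""
    shots = "\n\n".join(f"Q: {q}\nA: {a}" for q, a in examples)
    return f"{shots}\n\nQ: {query}\nA:"


prompt = few_shot_prompt(
    [
        ("Invoice INV-001, payment terms: net 30", "due_days=30"),
        ("Invoice INV-002, payment terms: net 14", "due_days=14"),
    ],
    "Invoice INV-003, payment terms: net 60",
)
```

Two examples are often enough to lock the model into a machine-readable output format, which is the whole point of the technique.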

During the basic training, employees should also learn about other existing AI tools. It’s crucial to make them understand ChatGPT isn’t a replacement for Google, and AI-powered search engines already exist. Also, tell them the AI search engines aren’t chatbots, so they should just type what they are searching for instead of trying to converse with Perplexity.

If you don’t want to pay for classes, at least send them a link to my prompt engineering guide. However, remember that most employees won’t even read the article the first time they get a link. Send it a few times and give them time.

Teaching Programmers to Work with Copilot-like Tools

Early this year, I observed a weird trend on Twitter: if you were a programmer, it was fashionable to complain about GitHub Copilot or claim AI makes you work slower. I wondered if it was virtue signaling or bragging ("Look, I work on things so complex AI can't help me with anything"), and I still don't understand the trend, especially because GitHub Copilot and, later, Cursor.sh made me much faster and more satisfied with my work (mostly because I no longer have to type the boring parts of code).

I was told that in the 1990s, when front-end developers were called webmasters, some programmers refused to use IDEs because "a true programmer" writes code in a text editor with no syntax highlighting. Today, we know it's a preposterous statement, but there are still proven techniques, like test-driven development or domain-driven design, that some programmers neglect. Now they neglect AI, too. As Rachel Woods says:

Some people are afraid to admit their work looks like a treadmill. And that using AI would actually force them to get off it.

When you organize programmers' training, you have to focus on those who want to get off the "treadmill." Some programmers oppose any idea that forces them to change how they work. For example, persuading programmers to write tests after the implementation is counterproductive when you teach a test-driven development workshop. Instead, focus on those who want to learn and make them advocates for the change.

Similarly, programmers' AI training should focus on letting them experience how AI changes their workday instead of persuading them. First, run the prompt engineering training; as I said, everyone should attend it. Then, show programmers how to plan work and split tasks using Copilot-like tools. Make them comfortable with tweaking AI-generated code; they shouldn't expect it to be perfect every time. And please include test-driven development in the workshop topics, but don't claim AI can generate the tests. There is a better way to do TDD with AI.

Use AI Automation Instead of Chatbots

AI (like every automation) works best when people don’t notice it.

Do you really need a chatbot? What if the report with the data your employees need was generated by AI but delivered to them as a dashboard? What if the action items were automatically added to their to-do lists instead of requiring someone to open an email or a meeting transcript summarization tool? What if AI were used to find relevant information online about the prospect added to your CRM instead of using AI to generate a cold email?

Of course, AI can do much more. You can automate low-stakes, reversible decisions like forwarding an email to the right person or department. You can use AI to find similar support cases from the past and draft a response to the client (but don't send the answer automatically before a human reviews the text). AI can automatically analyze data and flag new trends or detected anomalies. For example, AI-based data analysis can warn you when clients start to repeatedly complain about a new issue in your company's online reviews.
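As a sketch of the low-stakes automation idea: route an incoming email automatically, but keep the reply as a draft that a human must review. Everything here is illustrative; in a real system, `classify` would call an AI model, while this stand-in uses a trivial keyword rule just so the sketch runs:

```python
# Keyword-to-department mapping; a real system would let a model decide.
DEPARTMENTS = {"invoice": "billing", "refund": "billing", "password": "support"}


def classify(email_body: str) -> str:
    """Stand-in for an AI classifier: pick a department by keyword,
    and send unknown cases to a human triage queue."""
    for keyword, department in DEPARTMENTS.items():
        if keyword in email_body.lower():
            return department
    return "triage"


def handle_email(email_body: str) -> dict:
    """Automate the reversible decision (routing) but never the
    irreversible one (sending): the reply stays a draft for review."""
    department = classify(email_body)
    draft = f"[DRAFT for {department} - requires human review before sending]"
    return {"route_to": department, "draft": draft, "auto_sent": False}
```

The design choice matters more than the code: routing is cheap to undo (forward again), so it can be automated; sending a reply is not, so the human stays in the loop.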

Every business process may benefit from automation. Some can be fully automated, and in others, you can automate at least some of the tasks. But you need a process. You can’t automate chaos or the practice of doing whatever people feel like doing.

Fortunately, business is fractal, and chaos has parts resembling a process. For example, “We don’t always check online reviews about the new clients, but when we do it, we look at those 3 websites and check if someone complains about delayed payments” is a process.

As I wrote in the article about the problems you will face during an AI transformation, start small, look for automation ideas, and keep calm. We are still early. Your competition is not leaving you behind. Everyone who says so is trying to sell you an overpriced online course.


Do you need help building AI-powered applications for your business or training your employees to use AI productively?
You can hire me!


Are you looking for an experienced AI consultant? Do you need assistance with your RAG or Agentic Workflow?
Schedule a call or send me a message on LinkedIn.
