As is the case when truly innovative technologies emerge, providers and users of these tools and platforms clamor for productivity gains. Classic productivity means producing more output with the same or fewer resources. Generative AI clearly falls into this category, given its ability to search, extract and create information outputs in a fraction of the time required by existing, more labor-intensive processes.
Is It Really Productivity?
A problem with claiming productivity gains, however, is that true, tangible productivity metrics are difficult to estimate, especially in the early stages of an innovative rollout. Adjusting organizational work processes, absorbing learning-curve effects and strategizing over the best areas of application all require extensive resource re-allocation, time and investment before output gains materialize.
For Gen AI, organizations of all types have to allocate personnel for training and prompting LLMs. They have to identify high-value applications in their work processes. Then there is the issue of optimizing internal data resources to feed LLMs. So when companies claim they are creating content with a 30% reduction in employee resources (e.g. time or headcount), that is simply the output effect, without considering the changes in inputs required to achieve it.
An additional element that has created more work when implementing Gen AI is the requirement of editing and qualifying the output of LLMs. Verifying accuracy, currency, copyright compliance and ethics, and general troubleshooting for hallucinations, is a critical element of operationalizing Gen AI for a production environment.
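This verification step can be sketched as a simple review gate that routes LLM drafts to a human editor unless they pass basic checks. The sketch below is illustrative only and not any vendor's API: it assumes a hypothetical pipeline in which each draft carries the sources the model cited and a confidence score, and the check names and thresholds are invented for the example.

```python
# Illustrative review gate for LLM-generated drafts (hypothetical pipeline).
# Field names, thresholds and checks are assumptions, not a real product API.

from dataclasses import dataclass, field

@dataclass
class Draft:
    text: str
    cited_sources: list = field(default_factory=list)  # sources the model cited
    model_confidence: float = 0.0                      # assumed pipeline score

def review_checklist(draft: Draft) -> list:
    """Return the reasons a draft needs human review (empty list = auto-pass)."""
    reasons = []
    if not draft.cited_sources:
        reasons.append("no sources cited: accuracy unverifiable")
    if draft.model_confidence < 0.8:
        reasons.append("low model confidence: possible hallucination")
    if len(draft.text.split()) < 20:
        reasons.append("output too short to qualify")
    return reasons

# A draft with no citations and low confidence is flagged for an editor.
draft = Draft(text="Quarterly revenue rose 12%.", model_confidence=0.55)
print(review_checklist(draft))
```

The point of the sketch is the workflow, not the specific checks: automated rules can only flag drafts, while the accuracy, currency and copyright judgments described above still fall to a knowledgeable human reviewer.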
The Productivity Pump
There is a bright spot for productivity in this complex technology adoption process, and it involves an ironic element that exists in all organizations: "domain knowledge". The concept entails employee knowledge of organizational issues (processes, strategy, etc.) from an internal, operational perspective, along with knowledge of factors external to the organization. Consider the following elements involved in implementing Gen AI.
- Strategizing over the best areas of Gen AI application
- Prompting to produce the desired quality of output
- Editing and analyzing the content produced by LLMs
These factors require extensive company and industry knowledge in order to best deploy and optimize the power of Gen AI. Senior managers, SMEs and, simply, employees with expert knowledge are the sweet spot for achieving productivity.
To clarify the concept further, consider a number of areas in which Gen AI has been applied.
- Examining publicly filed reports (10K and annual reports) by companies to extract critical information. (HBR)
- Marketing effectiveness (emails, digital strategies)
- Real Estate (client advisory services)
- Call Center (customer support)
- Insurance (matching policy content to client needs)
- Coding (creation of code and reverse engineering of legacy systems)
- Online Retail (matching product attributes to customer orders)
- Legal Research
Applying Gen AI in these and other areas requires an expert knowledge base to achieve positive productivity. That means senior management, SMEs or, simply, employees with extensive understanding of the processes, policies and history of the business are required to apply Gen AI and monitor the output it creates. The applications may involve a myriad of work processes, where domain expert knowledge is key to identifying those areas, monitoring how LLMs create content and ultimately editing that content to validate its quality.
An Example of Domain Knowledge: Re-engineering Legacy Systems
An insightful application of Gen AI is the creation of code (programs, algorithms). A new twist involves reverse-engineering legacy, nearly obsolete computer programs into more current, flexible and powerful coding languages. What is needed are coders who fully understand complete systems that were built years ago in near-obsolete programming languages and still run today's operations, along with expertise in more current languages (e.g. Java, Python). This domain knowledge helps verify the accuracy of the reverse-engineered legacy systems so they can be set into production with current code.
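One way such verification might look in practice is a regression harness: outputs captured from the legacy system become test cases that the Gen-AI-produced rewrite must reproduce exactly before cutover. Below is a minimal sketch assuming a hypothetical legacy interest-calculation routine; the function, its assumed semantics (simple monthly interest, rounded half-up to cents) and the captured values are all invented for illustration.

```python
# Regression harness sketch: verify a reverse-engineered routine against
# input/output pairs captured from a hypothetical legacy system.

from decimal import Decimal, ROUND_HALF_UP

def monthly_interest(balance: Decimal, annual_rate: Decimal) -> Decimal:
    """Candidate Python rewrite of the legacy routine (assumed semantics:
    simple monthly interest, rounded half-up to cents)."""
    interest = balance * annual_rate / Decimal("12")
    return interest.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# (balance, annual_rate, expected output) captured from the legacy system;
# the values here are hypothetical.
legacy_cases = [
    (Decimal("1000.00"), Decimal("0.06"), Decimal("5.00")),
    (Decimal("2500.50"), Decimal("0.045"), Decimal("9.38")),
    (Decimal("0.00"), Decimal("0.06"), Decimal("0.00")),
]

def verify(cases) -> list:
    """Return the cases where the rewrite disagrees with the legacy output."""
    return [(b, r, expected, monthly_interest(b, r))
            for b, r, expected in cases
            if monthly_interest(b, r) != expected]

mismatches = verify(legacy_cases)
print("all cases match" if not mismatches else f"mismatches: {mismatches}")
```

Choosing which cases to capture, and judging whether a mismatch is a rewrite bug or a legacy quirk worth preserving, is exactly where the domain knowledge described above comes in.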
Individual or Production Platforms (Productivity Differentiations)
The productivity paradigm here rests on a few major issues. The first relates to the number of users of Gen AI; the second, the type of data that will be processed. As a productivity tool for an individual using open source data that exists outside an organization, productivity is clear and quickly achieved. It is comparable to using spreadsheets versus handheld calculators for analytics, where speed and time savings are evident.
However, when seeking to operationalize Gen AI, rolling out LLMs to staff within a company and leveraging internal data, productivity gains become more complex. Factors to consider include:
- Did the creation of the LLMs fully optimize Gen AI to produce the best models to roll out?
- What data is needed and how will it be integrated to feed Gen AI?
- Do the users have the required knowledge to verify the content created?
- How quickly do the data inputs behind production LLMs change, requiring the models to be re-optimized?
All of these factors require extensive resources before time savings in content creation materialize, so productivity gains are not so clear. More specifically, setting LLMs into production entails data engineers' time to adapt internal data resources, training on how to prompt and build models, and an added layer of personnel to verify the output released as part of routine operations.
The Importance of the Human Element
As AI continues to evolve, fear of its ability to displace the human element has risen. Gen AI will no doubt render some employment functions obsolete; however, the real driver behind its successful implementation remains the knowledge base of the workers who direct its use toward areas of value, guide users on how to produce the required content and ultimately verify its output. The irony remains: there is a notion that AI will replace the need for human input because it can out-think people, yet it is this human knowledge base that is guiding its success. At this juncture, the knowledge developed by workers remains the sweet spot for fully leveraging AI.
Tags: Artificial Intelligence, Business Transformation, Employee Experience