An article in the Economist this week suggested AI is a major threat to consultants, whose primary business is the provision of expertise. I remain unworried. Consultants make their living providing specialized expertise to cover rare situations. However, most consultants aren't actually consultants, but contractors: you pay them to get a job done that you can't (or don't want to) do yourself. The name is simply a matter of prestige: more like BCG, less like the local plumber.
Of course, every contractor-consultant aspires to be a proper consultant by providing "thought leadership", which is simply a way of saying "I have thought about this a great deal and have useful advice to offer." The consultant-contractor dichotomy is blurred in the other direction too: while some of the BCG folks do 'strategy', rather a lot more do 'implementation', where they build you a new organizational structure rather than a new bathroom.
Working on something, I realized "you can only write what you know about", and so a great deal of consulting is knowing about things. Which requires rather a lot of learning about things, and doing research on them. Consultants get accused of "let me Google that for you", but that rather misses the point: you could Google it for yourself, but since your consultant is really a contractor, you really are paying for someone to Google it for you.
An aside on that: Googling something requires ever more wading through an SEO-optimized and enshittified web, but also the ability to extract content from our society's least enshittified corpus: publicly available PDFs. This category includes peer-reviewed research, think tank reports, local government documents, for-profit and non-profit thinkpieces, etc. It's an advantage I strongly suspect will remain, because while much web content is open for the scraping under a generous fair-use doctrine, a PDF has a much better (more litigable) claim to being 'published' and copyrighted than a blog post.
So, in the context of AI, an increasingly valuable portion of the service a consultant provides (compared to an AI) is the ability to discriminate between high-value and low-value content, in terms of both quality and relevance. Of course, all consultants use AI, but I find arguments against AI use tediously similar to childhood injunctions against using spell-check. My (unassisted) spelling is indubitably worse, but the productivity of my time is vastly greater. It should tell you something that we no longer employ whole armies of copy-editors in document production. The real question is whether your consultant is making efficient use of AI (to produce analysis tools and draft documents) or is a scam artist (using AI to produce analysis and to review documents).
If it's something I could task a junior analyst with, it's suitable for AI (to the very real peril of the entire class of junior analysts, who of all people should focus on using AI to become much more productive). Which raises the issue of the use of AI by junior analysts: if they give you something an AI could have produced, what is their value? But that only suggests a misuse of analyst time and capacity, like paying someone to spell-check a document. Avoiding that misuse requires senior analysts to understand the capabilities of AI and how to apply it well, and that capacity is sadly lacking. It's easier to spurn and disparage AI tools than to learn to use them (and teach others to use them). But it's a hard time--AI is not widely adopted, and where it is adopted, it's not necessarily well used. Things are changing so fast that there really aren't best practices in the professional use of AI--merely cautionary tales about its misuse. But I expect that's also a source of competitive advantage: everyone claims expertise in AI and AI use, but it will take rather a bit longer for consulting clients to be able to assess actual facility with it.
As an aside, if you can't afford to hire someone competent to assess and manage a consultant's (or contractor's) work, you have no business contracting the work out. The risk of buying a 'pig in a poke' is simply too great, and fly-by-night consultants feeding on the credulity of the ignorant have been a risk since the days of court sorcerers. It's an interesting parable for AI: if you can't assess the quality of what your AI produces, you have no business using an AI. Which is perhaps the fundamental skill senior analysts should be teaching their juniors.