I have written in the past about the high demand for AI talent and the low supply of highly qualified people to meet it. The void left by big companies eating up all of the talent is driving up the cost of AI development, just when you would expect prices to fall, given all the awesome free software out there. To be fair, I like working on projects from such a broad range of industries, and I'm happy to just keep scaling what I'm doing right now.

This article is about the pressure on consultants like me to maximize revenue in this feeding frenzy for AI, and the ugly things that spawn from this temporary imbalance between supply and demand for what I will call AI solution architects.
The Marshmallow Test is a famous study of delayed gratification in children. In short, the longer a kid can wait to eat the marshmallow, the more self-control they have, and the follow-up results suggest that delayed gratification is a good trait to have. In the AI field, I am experiencing a reverse marshmallow effect: financial incentives are pushing me to NOT delay gratification. In this case, gratification means money.
AI is really cool. Here is the problem: I want to make money. Cash money. The pull to maximize revenue distorts the longer-term value of doing cool things by pushing consultants like me to bid on projects I know I can finish quickly. The incentive works against innovation and risk-taking.
Usually, in the fable of the tortoise and the hare, the tortoise comes out on top. In this ultra-noncompetitive AI field, the hare wins. He hops home with a bag of gold every day while the tortoise builds his startup one brick at a time, eventually either cashing out his stock options or using them as toilet paper. This is a bit insensitive to the innovators, but a bird in the hand is worth two in the bush. And let's face it: a standard AI project makes more money, with less risk, than a project that carries a non-zero chance of total failure. Sadly, I had to let not one but two burning-hot leads die this week because the requirements were too risky to quote at a fixed price, and we couldn't agree on an hourly rate. One wants high precision and recall (hard to deliver), and the other has a messy dataset. I probably shouldn't, but I feel awful about walking away.
The cynical viewpoint/counterpunch on contemporary AI is one I heard from a colleague at a party today: "People who get A's stay in school. People who get B's end up working for the people who got C's." Sorry, I don't know the original source of the quote; please add it in the comments if you find it. The idea is that middling engineers like Dilbert work for stupid bosses in consulting firms, while the best and brightest are hidden away in the ivory tower cooking up the next big thing. I think this is a bit harsh. As a consultant, I guess I am the Dilbert and the stupid boss at the same time? A CEO I met last week called my approach "living hand to mouth." Maybe there is some ugly truth to that.
I have a more realistic, and perhaps optimistic, viewpoint on all this. I remember when IoT and big data were the biggest things ever. That passed. First, I think the demand for AI will be met by a torrent of well-trained engineers, as always happens when there is a supply-demand mismatch in the workforce. The invisible hand will (should?) do its thing. Second, consultants like me who chase development opportunities rather than solving one overarching big problem will eventually either hit on a cool solution and focus on that, or risk becoming one among many small firms, swept away by the ever-shifting sands of the high-tech landscape.
Am I crazy to chase revenue on short contracts (and licensing deals) over longer-term (stable) full-time jobs and long-term (high-return) equity arrangements? Let me know what you think.
Happy Coding!
-Daniel [email protected] ← Say hi. Lemay.ai 1(855)LEMAY-AI