Amplified Human Intelligence

Generative AI

If clients measure the value of what you produce by piecework metrics such as word count or lines of code, then generative AI threatens your business model. But what if AI is not Artificial Intelligence, but Amplified (human) Intelligence? That’s our POV on AI, and this issue is focused on amplified human intelligence.

👍👍 The Good Stuff 👍👍

(Potentially) hype-free AI news sources:

So-called “awesome” lists on GitHub tend to be useful, fat-free directories of… (sometimes) awesome resources.

Strategists and analysts are fun to listen to, but practitioners like Tom here often create way more value when they share hard-won learnings.

Tips for being the human in the loop with AI-powered coding assistants.

🌱🌱 Idea Seeds 🌱🌱

If you are developing internal LLM-powered tooling (or will be soon), you’ll face the choice between prompt-stuffing with context retrieved via RAG and fine-tuning a foundation model. Fine-tuning seems superior, but it is costly. Papers like this one keep calling fine-tuning’s real-world superiority into question versus cheap-but-effective RAG.
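To make the cheaper option concrete, here is a minimal sketch of the “prompt-stuffing” step: retrieve the most relevant snippets for a query, then pack them into the prompt sent to a frozen foundation model. The corpus, the keyword-overlap scorer (a stand-in for real vector search), and the prompt template are all illustrative assumptions, not any specific product’s API.

```python
# Toy RAG "prompt-stuffing" step. The scoring function is a deliberately
# crude stand-in for embedding-based similarity search.

def score(query: str, doc: str) -> int:
    """Count how many query terms appear in the document (toy relevance score)."""
    terms = set(query.lower().split())
    return sum(1 for t in terms if t in doc.lower())

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the top-k documents by overlap score."""
    return sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Stuff the retrieved context into a prompt for a frozen foundation model."""
    context = "\n".join(f"- {d}" for d in retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical internal knowledge base for illustration.
corpus = [
    "Refunds are processed within 5 business days.",
    "Our office is closed on public holidays.",
    "Refund requests must include the original order number.",
]
print(build_prompt("How do refunds work?", corpus))
```

In a real system, the retrieval step would use an embedding index rather than keyword overlap, but the trade-off the papers study is exactly this: the base model stays untouched, and all domain knowledge rides in through the prompt.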

“AI won’t threaten my job; my work is too difficult/important/complex/etc for AI to replace.” This paper suggests such denial might be pure cope. For example:

…do not find evidence that high-quality service, measured by their past performance and employment, moderates the adverse effects on employment. In fact, we find suggestive evidence that top freelancers are disproportionately affected by AI. These results suggest that in the short term generative AI reduces overall demand for knowledge workers of all types…

Hui, Reshef, and Zhuo
