The Case for JavaScript Directives
When Next.js introduced "use cache" as a replacement for the previous unstable_cache wrapper function, it was already raising eyebrows. A string literal instead of an imported function? But then Vercel announced the Workflow SDK with "use workflow" and "use step", and the reaction went from skepticism to outright mockery. The internet was flooded with drama and hypothetical directives. "use carefully", "use ai", "use coffee" - the jokes wrote themselves.
Seeing "use cache" at the top of a function does look strange. My first instinct was to reach for a wrapper function instead, something I can import, something with type signatures, something I can configure. That felt more natural to me until I actually thought through what directives are doing here.

Directives are not new
The reaction to "use client" and "use server" in React, and now "use cache" and "use workflow", treats directives as if they're some reckless invention. They're not. JavaScript has had a directive since ECMAScript 5 introduced "use strict" in 2009: a string literal at the top of a file or function that changes how the runtime interprets the code below it, without an import or a function call.
The difference is that "use strict" was standardized and runtime-enforced as a language feature. The new directives are framework-level. They only work inside specific build pipelines. That's the legitimate criticism, and I think it's worth taking seriously. But the mechanism itself, a string that tells the system to treat this code differently, has been part of JavaScript for over 15 years.
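To see the mechanism without any framework involved, here's a small sketch contrasting sloppy and strict semantics. The `Function` constructor compiles its body independently of the surrounding module, so the directive string alone decides the behavior:

```javascript
// Directives change runtime semantics for the code below them.
// Function-constructor bodies are sloppy by default, so the
// "use strict" string alone flips the behavior of `delete`.
const frozen = Object.freeze({ id: 1 })

// Sloppy mode: deleting a non-configurable property fails silently
const sloppyDelete = new Function('obj', 'delete obj.id; return obj.id')

// Strict mode: the exact same delete throws a TypeError
const strictDelete = new Function('obj', '"use strict"; delete obj.id; return obj.id')

console.log(sloppyDelete(frozen)) // 1 — the delete was silently ignored

try {
  strictDelete(frozen)
} catch (err) {
  console.log(err.name) // "TypeError"
}
```

Same code, different semantics, and the only difference is a string literal. That's exactly the shape of the new framework directives.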
// You can even use the cache directive at the file level in Next.js
'use cache'

export default async function Page() {
  // ...
}
The case against
The arguments against directives are valid. They're not type-safe. You can't pass options to them. They're not part of any JavaScript specification. They look like language features but only work inside specific build pipelines. Composability in packages is poor because bundlers might strip them out until their tooling learns to preserve the new directives.
And there's the lock-in concern, which is more nuanced than it first appears. Not all directives are equal here. "use client" and "use server" are standard React pragmas, supported across frameworks that implement React Server Components. They define execution boundaries that any compatible framework can interpret. "use cache" is different. It's a Next.js framework feature. The semantics, what "cached" means, how revalidation works, are tied to Next.js and its hosting model. Writing "use cache" everywhere means coupling your caching boundaries to one framework's semantics.
These are fair points. I don't dismiss any of them.
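For reference, the portable React pragmas mark a boundary rather than behavior. A minimal sketch of a server-side entry point (the function name and `db` data layer are illustrative placeholders):

```javascript
// actions.js — the "use server" pragma marks these exports as
// server-side entry points that client components can invoke
'use server'

export async function saveNote(formData) {
  // Runs on the server even when triggered from client code.
  // `db` stands in for whatever data layer you use.
  await db.notes.insert({ text: formData.get('text') })
}
```

Any framework implementing React Server Components can interpret this boundary; that's what separates it from the framework-specific directives.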
Why I changed my mind on wrapper functions
My initial reaction was: just use functions. Import a cache or workflow wrapper from a package, pass your function into it, add options. That's how I used to do it with the Upstash Workflow SDK: you import from @upstash/workflow, define your steps explicitly, and everything is type-safe and composable. That pattern felt more natural to me.
// Upstash Workflow SDK uses a higher-order function
// to define the workflow
import { serve } from "@upstash/workflow/nextjs"

export const { POST } = serve(
  async (context) => {
    await context.run("initial-step", () => {
      console.log("initial step ran")
    })
    await context.run("second-step", () => {
      console.log("second step ran")
    })
  }
)
But then I started thinking about what these wrappers actually do. A normal higher-order function changes inputs and outputs. You pass a value in, you get a transformed value out. That's function composition. A workflow() wrapper that makes your inner function durable, that retries failed steps, that persists state across restarts, that replays execution after a deploy, does something fundamentally different: it changes where and how every line inside it runs.
For example, new Date() inside a durable workflow doesn't return the current time on replay, it returns the original timestamp. Math.random() is deterministic. Every await is a potential suspension point that might resume on a completely different machine, days later. A wrapper function that does all of this is just as much compiler magic as a directive. The difference is that the wrapper pretends to be a regular function call. The directive doesn't pretend to be anything.
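A toy model shows why this is more than function composition. A durable runtime persists each step's result in a log, and on replay it returns the recorded value instead of re-executing, which is how non-deterministic calls become deterministic. All names here are illustrative; this is not the Workflow SDK's actual API:

```javascript
// A toy durable-execution log: step results are persisted on first
// run and returned verbatim on replay, so non-deterministic calls
// like Date.now() or Math.random() yield their original values.
function createRuntime(log = new Map()) {
  return {
    async step(name, fn) {
      if (log.has(name)) return log.get(name) // replay: skip execution
      const result = await fn()
      log.set(name, result) // first run: persist the result
      return result
    },
  }
}

async function workflow(rt) {
  const startedAt = await rt.step('started-at', () => Date.now())
  const token = await rt.step('token', () => Math.random())
  return { startedAt, token }
}

async function main() {
  // First execution records results; a "replay" on a fresh runtime
  // sharing the same log reproduces them exactly.
  const log = new Map()
  const first = await workflow(createRuntime(log))
  const replayed = await workflow(createRuntime(log))
  console.log(first.token === replayed.token) // true
}
main()
```

The inner function's code never changed, yet Math.random() stopped being random on the second run. No ordinary higher-order function behaves that way.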
Directives as infrastructure boundaries
At some point during this inner debate I realized I was thinking about directives wrong. I was evaluating them as an API design choice, comparing syntax ergonomics. But directives aren't an API, they are markers for infrastructure boundaries.
"use cache" doesn't add caching logic to your function. It tells the build system: this function's return value should be cached, across requests, with automatic (pre-defined) revalidation. That's not something a function wrapper can honestly represent. It's a deployment decision, not a runtime behavior.
// The Next.js cache directive
import { cacheTag, cacheLife } from 'next/cache'

async function getProducts() {
  'use cache'
  cacheTag('products') // on-demand cache invalidation
  cacheLife('hours') // customize cache duration
  const res = await fetch('/api/products')
  return res.json() // the cached value must be serializable
}
There's a catch, though. The directive alone only defines the boundary. It says "this is cacheable" but not how long, or how to invalidate it, or where to store it. For that, you're back to framework functions: cacheTag("products") to attach tags, cacheLife("hours") to set TTL, updateTag("products") to invalidate. All imported from next/cache. The directive defines semantics and constraints, the framework APIs define policy and implementation. That second layer is inherently vendor-specific because it depends on the hosting model, the cache store, and Next.js build-time vs. runtime behavior. So even with directives as infrastructure markers, you don't escape framework imports entirely. You just separate the "what" from the "how."
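The invalidation side of that policy layer might look like the sketch below, using the updateTag function mentioned above (the file path and action name are illustrative):

```javascript
// app/actions.js — a server action that invalidates the cached boundary
'use server'

import { updateTag } from 'next/cache'

export async function refreshProducts() {
  // Expires every cached function tagged 'products',
  // e.g. the getProducts() example above
  updateTag('products')
}
```

The directive marked getProducts() as cacheable; this import decides when that cache dies. Two layers, two mechanisms.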
The same applies to "use workflow". It doesn't wrap your function in retry logic, but tells the runtime: this function runs in a durable execution environment where each step is persisted, replayed, and retried independently. The entire execution model changes.
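Sketched with directives, a workflow keeps the shape of plain functions. This follows the general pattern Vercel has shown, with illustrative function names:

```javascript
// The directive version: plain async functions, annotated
export async function handleSignup(email) {
  'use workflow'
  // Each step below is persisted, replayed, and retried
  // independently by the durable runtime
  await sendWelcomeEmail(email)
  await provisionAccount(email)
}

async function sendWelcomeEmail(email) {
  'use step'
  // ...
}

async function provisionAccount(email) {
  'use step'
  // ...
}
```

No context object threaded through, no serve() wrapper; the directives alone mark which code crosses into the durable execution model.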
When a new developer opens a file and sees "use cache" at the top, they know immediately that something unusual is happening. It's visually distinct from every function call around it. Compare that to const handler = cache(async () => { ... }). To someone unfamiliar with the codebase, that looks like any other higher-order function. The fact that it fundamentally changes how execution works is not visible.
Where I land
Directives aren't perfect. The lack of configurability is real. The composability problem in packages is real. The lock-in risk exists, though Vercel has made the Workflow SDK open-source with no hard dependency on their platform and introduced the "World" abstraction that allows workflows to run anywhere.
Before "use cache" and "use workflow", if you needed durable execution, retries, or caching that survives server restarts, you had to bring in an SDK via a package: Temporal, Inngest, Upstash Workflow, Trigger.dev, or build your own orchestration layer. New dependencies, new syntax, new mental models, and often your logic moved out of your codebase entirely.
I'd rather have a syntax that's honest about what it does. "use cache" tells me: this function is not a regular function. The execution model is different from what you'd expect in normal JavaScript. That's useful information. A wrapper function hides that same information behind something that looks ordinary.
Different infrastructure should mean different syntax. Directives deliver on that, even if they feel messy at first.