Imagine a scenario where one of your routes' loaders needs to retrieve some data that for one reason or another is quite slow. For example, let's say you're showing the user the location of a package that's being delivered to their home:
```jsx
import { json, useLoaderData } from "react-router-dom";
import { getPackageLocation } from "./api/packages";

async function loader({ params }) {
const packageLocation = await getPackageLocation(
params.packageId
);
return json({ packageLocation });
}
function PackageRoute() {
const data = useLoaderData();
const { packageLocation } = data;
return (
<main>
<h1>Let's locate your package</h1>
<p>
Your package is at {packageLocation.latitude} lat
and {packageLocation.longitude} long.
</p>
</main>
);
}
```
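For context, these route APIs assume a data router. A rough sketch of how the loader and component above might be wired up could look like this (the `/packages/:packageId` path is an assumption for this illustration):

```jsx
import { createBrowserRouter, RouterProvider } from "react-router-dom";

// `loader` and `PackageRoute` are the functions defined above.
const router = createBrowserRouter([
  {
    path: "/packages/:packageId", // hypothetical path for this example
    element: <PackageRoute />,
    loader,
  },
]);

export default function App() {
  return <RouterProvider router={router} />;
}
```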
We'll assume that `getPackageLocation` is slow. This will cause initial page loads and transitions to that route to take as long as the slowest bit of data. There are a few things you can do to optimize this and improve the user experience: speed up the slow thing itself, parallelize data loading with `Promise.all` (we have nothing to parallelize in our example, but it might help a bit in other situations), or show a global spinner or localized skeleton UI while the data loads.

If these approaches don't work well, then you may feel forced to move the slow data out of the `loader` into a component fetch (and show a skeleton fallback UI while loading). In this case you'd render the fallback UI on mount and fire off the fetch for the data. This is actually not so terrible from a DX standpoint thanks to `useFetcher`. And from a UX standpoint this improves the loading experience for both client-side transitions and the initial page load. So it does seem to solve the problem.
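To make that concrete, a minimal sketch of the component-fetch approach with `useFetcher` might look like this (the `/packages/:packageId/location` resource route it loads from is an assumption for this illustration):

```jsx
import { useEffect } from "react";
import { useFetcher, useParams } from "react-router-dom";

function PackageLocation() {
  const { packageId } = useParams();
  const fetcher = useFetcher();

  // Fire off the fetch on mount and show the skeleton until it resolves.
  useEffect(() => {
    if (fetcher.state === "idle" && !fetcher.data) {
      fetcher.load(`/packages/${packageId}/location`);
    }
  }, [fetcher, packageId]);

  if (!fetcher.data) {
    return <p>Loading package location...</p>;
  }

  const { packageLocation } = fetcher.data;
  return (
    <p>
      Your package is at {packageLocation.latitude} lat and{" "}
      {packageLocation.longitude} long.
    </p>
  );
}
```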
But it's still suboptimal in most cases (especially if you're code-splitting route components) for two reasons:

1. Client-side fetching puts your data request at the end of a waterfall: document → JavaScript → lazy-loaded route → data fetch.
2. Your code can't easily switch between component fetching and route fetching.

React Router takes advantage of React 18's Suspense for data fetching using the `defer` Response utility and the `<Await />` component / `useAsyncValue` hook. By using these APIs, you can solve both of these problems:

1. Your data is no longer at the end of a waterfall: the route code and its data load in parallel.
2. You can easily switch between rendering a fallback and waiting for the data.
Let's take a dive into how to accomplish this.
## Using `defer`

Start by adding `<Await />` for your slow data requests where you'd rather render a fallback UI. Let's do that for our example above:
```jsx
import * as React from "react";
import { Await, defer, useLoaderData } from "react-router-dom";
import { getPackageLocation } from "./api/packages";

async function loader({ params }) {
const packageLocationPromise = getPackageLocation(
params.packageId
);
return defer({
packageLocation: packageLocationPromise,
});
}
export default function PackageRoute() {
const data = useLoaderData();
return (
<main>
<h1>Let's locate your package</h1>
<React.Suspense
fallback={<p>Loading package location...</p>}
>
<Await
resolve={data.packageLocation}
errorElement={
<p>Error loading package location!</p>
}
>
{(packageLocation) => (
<p>
Your package is at {packageLocation.latitude}{" "}
lat and {packageLocation.longitude} long.
</p>
)}
</Await>
</React.Suspense>
</main>
);
}
```
If you're not jazzed about bringing back render props, you can use a hook, but you'll have to break things out into another component:
```jsx
import * as React from "react";
import { Await, useAsyncValue, useLoaderData } from "react-router-dom";

export default function PackageRoute() {
const data = useLoaderData();
return (
<main>
<h1>Let's locate your package</h1>
<React.Suspense
fallback={<p>Loading package location...</p>}
>
<Await
resolve={data.packageLocation}
errorElement={
<p>Error loading package location!</p>
}
>
<PackageLocation />
</Await>
</React.Suspense>
</main>
);
}
function PackageLocation() {
const packageLocation = useAsyncValue();
return (
<p>
Your package is at {packageLocation.latitude} lat and{" "}
{packageLocation.longitude} long.
</p>
);
}
```
So rather than waiting for the component to mount before we can trigger the fetch request, we start the request for the slow data as soon as the user starts the transition to the new route. This can significantly improve the experience on slower networks.
Additionally, the API that React Router exposes for this is extremely ergonomic. You can literally switch between deferring something or not based on whether you include the `await` keyword:
```jsx
return defer({
// not deferred:
packageLocation: await packageLocationPromise,
// deferred:
packageLocation: packageLocationPromise,
});
```
Because of this, you can A/B test deferring, or even determine whether to defer based on the user or data being requested:
```jsx
async function loader({ request, params }) {
const packageLocationPromise = getPackageLocation(
params.packageId
);
const shouldDefer = shouldDeferPackageLocation(
request,
params.packageId
);
return defer({
packageLocation: shouldDefer
? packageLocationPromise
: await packageLocationPromise,
});
}
```
That `shouldDeferPackageLocation` could be implemented to check the user making the request, whether the package location data is in a cache, the status of an A/B test, or whatever else you want. This is pretty sweet 🍭
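As a sketch, assuming a hypothetical cache module and A/B-test helper (neither is part of React Router), it might look something like this:

```jsx
// Hypothetical helpers; swap in whatever cache / experiment
// infrastructure you actually use.
import { locationCache } from "./cache";
import { isInExperimentGroup } from "./ab-tests";

function shouldDeferPackageLocation(request, packageId) {
  // If the location is already cached, awaiting it is cheap,
  // so there's no reason to defer and show a fallback.
  if (locationCache.has(packageId)) {
    return false;
  }

  // Otherwise, only defer for users in the experiment group.
  return isInExperimentGroup(request, "defer-package-location");
}
```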
The React Router `defer` API is another lever React Router offers to give you a nice way to choose between trade-offs. Do you want the page to render more quickly? Defer stuff. Do you want a lower CLS (Cumulative Layout Shift)? Don't defer stuff. Do you want a faster render but also a lower CLS? Defer just the slow and unimportant stuff.
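For example, a loader along these lines awaits the fast, important data and defers only the slow, non-critical bit (`getPackageDetails` is an assumed helper standing in for your critical data):

```jsx
import { defer } from "react-router-dom";
import {
  getPackageLocation,
  getPackageDetails, // hypothetical fast, critical data
} from "./api/packages";

async function loader({ params }) {
  // Kick off the slow request immediately, but don't await it.
  const packageLocationPromise = getPackageLocation(
    params.packageId
  );

  // Await the fast, important data so it renders without layout shift.
  const packageDetails = await getPackageDetails(
    params.packageId
  );

  return defer({
    packageDetails, // awaited: available on first render
    packageLocation: packageLocationPromise, // deferred: behind a fallback
  });
}
```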
It's all trade-offs, and what's neat about the API design is that it's well suited for you to do easy experimentation to see which trade-offs lead to better results for your real-world key indicators.
## When does the `<Suspense/>` fallback render?

The `<Await />` component will only throw the promise up to the `<Suspense>` boundary on the initial render of the `<Await />` component with an unsettled promise. It will not re-render the fallback if props change. Effectively, this means that you will not get a fallback rendered when a user submits a form and loader data is revalidated. You will get a fallback rendered when the user navigates to the same route with different params (in the context of our above example, if the user selects from a list of packages on the left to find their location on the right).
This may feel counter-intuitive at first, but stay with us, we really thought this through and it's important that it works this way. Let's imagine a world without the deferred API. For those scenarios you're probably going to want to implement Optimistic UI for form submissions/revalidation.
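As a rough illustration of that kind of optimistic UI (the rename form and `nickname` field here are assumptions, not part of the example above):

```jsx
import {
  Form,
  useLoaderData,
  useNavigation,
} from "react-router-dom";

function PackageNickname() {
  const { nickname } = useLoaderData();
  const navigation = useNavigation();

  // While the submission/revalidation is in flight, show the value the
  // user just submitted instead of any loading state.
  const optimisticNickname =
    navigation.formData?.get("nickname") ?? nickname;

  return (
    <Form method="post">
      <p>Package nickname: {optimisticNickname}</p>
      <input name="nickname" />
      <button type="submit">Rename</button>
    </Form>
  );
}
```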
When you decide you'd like to try the trade-offs of `defer`, we don't want you to have to change or remove those optimizations, because we want you to be able to easily switch between deferring some data and not deferring it. So, we ensure that your existing optimistic states work the same way. If we didn't do this, then you could experience what we call "Popcorn UI", where submissions of data trigger the fallback loading state instead of the optimistic UI you'd worked hard on.
So just keep this in mind: Deferred is 100% only about the initial load of a route and its params.