React Performance: A Production Checklist
Performance optimisation in React is not a single technique — it is a layered strategy. Applying a trick without understanding where your bottleneck actually is wastes time and sometimes makes things worse. This checklist covers what enterprise teams measure, fix, and monitor.
One more tool worth knowing is React Scan, which visualises and detects performance issues in a React app:
https://github.com/aidenybai/react-scan
Measure First
Every optimisation decision should follow a measurement. Guessing produces placebo fixes.
Tools:
- React DevTools → Profiler — shows exactly which components re-render and how long each render takes
- Chrome DevTools → Performance tab — records the full main-thread timeline, including JavaScript execution, layout, and paint
- Lighthouse — audits LCP, FCP, TBT, CLS, and TTI in one run
- Web Vitals — tracks real-user field data (LCP, INP, CLS) in production
Run the profiler before and after every change to confirm improvement is real.
Prevent Unnecessary Re-renders
This is the most common and impactful React performance issue.
React.memo
Wraps a component so it only re-renders when its props change:
const UserCard = React.memo(function UserCard({ user }) {
return <div>{user.name}</div>
})
Use when the component renders frequently and its props are stable. Skip it for trivial components — memoization has its own overhead.
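React.memo's default bail-out is a shallow comparison of the previous and next props. A simplified sketch of that check, runnable in plain JavaScript (React's internal version differs in details):

```javascript
// Approximation of the shallow props comparison React.memo performs.
// Two props objects are "equal" when they have the same keys and every
// value is identical by Object.is; nested objects are NOT compared deeply.
function shallowEqual(prevProps, nextProps) {
  if (Object.is(prevProps, nextProps)) return true
  const prevKeys = Object.keys(prevProps)
  const nextKeys = Object.keys(nextProps)
  if (prevKeys.length !== nextKeys.length) return false
  return prevKeys.every(
    (key) =>
      Object.prototype.hasOwnProperty.call(nextProps, key) &&
      Object.is(prevProps[key], nextProps[key]),
  )
}

// Same primitive values → skip re-render
shallowEqual({ id: 1, name: 'Ada' }, { id: 1, name: 'Ada' }) // true

// A fresh object literal breaks equality, even with identical contents
shallowEqual({ user: { id: 1 } }, { user: { id: 1 } }) // false
```

This is why inline object props defeat React.memo: the comparison is by reference, not by value.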
useMemo
Caches the result of an expensive computation:
const sortedUsers = useMemo(
() => [...users].sort((a, b) => a.name.localeCompare(b.name)),
[users],
)
Use for large list processing, heavy filters, or derived data structures. Do not use it for cheap operations — it adds memory and comparison cost without benefit.
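The caching contract is easy to see outside React. A hypothetical helper (the `createMemo` name is illustrative, not a React API) that mimics useMemo's dependency comparison:

```javascript
// Hypothetical stand-in for useMemo's cache: recompute only when a
// dependency changes, compared with Object.is as React does.
function createMemo() {
  let lastDeps = null
  let lastValue
  return (compute, deps) => {
    const changed =
      lastDeps === null ||
      deps.length !== lastDeps.length ||
      deps.some((d, i) => !Object.is(d, lastDeps[i]))
    if (changed) {
      lastValue = compute()
      lastDeps = deps
    }
    return lastValue
  }
}

let sortCount = 0
const memo = createMemo()
const users = [{ name: 'b' }, { name: 'a' }]

const sortUsers = () =>
  memo(() => {
    sortCount++
    return [...users].sort((a, b) => a.name.localeCompare(b.name))
  }, [users])

sortUsers()
sortUsers() // same deps, so the sort runs once and the result is reused
```

The cost side is visible too: every call still allocates the deps array and runs the comparison, which is why cheap computations gain nothing.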
useCallback
Returns a stable function reference across renders:
const handleDelete = useCallback((id) => {
setUsers((prev) => prev.filter((u) => u.id !== id))
}, [])
This matters when passing the function to a React.memo-wrapped child. Without it, the child re-renders anyway because the function reference is new every render.
Avoid inline object and array literals in JSX
// ❌ Creates a new object reference on every render
<Component style={{ margin: 10 }} />
// ✅ Stable reference
const style = { margin: 10 }
<Component style={style} />
The same applies to inline arrays passed as props or useEffect dependencies.
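The underlying rule is plain JavaScript reference identity: two structurally identical literals are still different values.

```javascript
// Every render evaluates JSX props from scratch, so literals produce
// brand-new references each time, which is exactly what shallow
// comparison rejects.
const a = { margin: 10 }
const b = { margin: 10 }

console.log(a === b) // false: different objects
console.log(a === a) // true: same reference

// Functions behave the same way, which is why useCallback exists
const makeHandler = () => () => {}
console.log(makeHandler() === makeHandler()) // false
```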
Memoize Context values
A very common oversight:
// ❌ New object every render — all consumers re-render
<AuthContext.Provider value={{ user, setUser }}>
// ✅ Stable reference — consumers only re-render when user changes
const value = useMemo(() => ({ user, setUser }), [user])
<AuthContext.Provider value={value}>
Component Architecture
Keep components small and focused
A 400-line component is difficult to memoize, hard to profile, and almost guaranteed to re-render too broadly. Split by UI section or logical responsibility.
Keep state as local as possible
Lifting state higher than necessary means re-rendering more of the tree on every change. A modal's open/closed state belongs in the component that owns the modal, not in global state.
// ❌ Global state for local UI
const [isModalOpen, setIsModalOpen] = useGlobalStore()
// ✅ Local state
const [isOpen, setIsOpen] = useState(false)
Context re-renders every consumer. At scale, a single context update can cascade through an entire subtree.
Large List Rendering
Rendering 1 000+ rows in the DOM is one of the fastest ways to freeze a React app. The DOM itself is the bottleneck — not React.
Virtualisation renders only the rows visible in the viewport:
import { FixedSizeList as List } from 'react-window'
function UserList({ users }) {
return (
<List height={600} itemCount={users.length} itemSize={50} width="100%">
{({ index, style }) => <div style={style}>{users[index].name}</div>}
</List>
)
}
Libraries: react-window (lightweight), react-virtualized (more features), @tanstack/virtual (framework-agnostic).
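Under the hood, fixed-size virtualisation is simple index arithmetic over the scroll offset. A minimal sketch of what these libraries compute (the `visibleRange` name and shape are illustrative, not a real API):

```javascript
// Given a scroll position, work out which rows are visible plus a small
// overscan buffer, so only those rows are mounted in the DOM.
function visibleRange({ scrollTop, viewportHeight, itemSize, itemCount, overscan = 3 }) {
  const first = Math.floor(scrollTop / itemSize)
  const last = Math.ceil((scrollTop + viewportHeight) / itemSize) - 1
  return {
    start: Math.max(0, first - overscan),
    end: Math.min(itemCount - 1, last + overscan),
  }
}

// 10,000 rows, 50px each, 600px viewport, scrolled to row 100
visibleRange({ scrollTop: 5000, viewportHeight: 600, itemSize: 50, itemCount: 10000 })
// → { start: 97, end: 114 } — roughly 18 rows rendered instead of 10,000
```

Each rendered row is absolutely positioned inside a container sized to `itemCount * itemSize`, which keeps the scrollbar correct.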
Always use a stable, unique key — never array index when the list can be reordered or filtered:
// ❌ Index key breaks reconciliation on reorder/filter
users.map((u, i) => <Row key={i} user={u} />)
// ✅ Stable identity
users.map((u) => <Row key={u.id} user={u} />)
Bundle Size
Users download your JavaScript. Smaller bundles mean faster first loads.
Code splitting with React.lazy
const AdminPanel = React.lazy(() => import('./AdminPanel'))
const Dashboard = React.lazy(() => import('./pages/Dashboard'))
// Route-level splitting in React Router
<Route path="/admin" element={
<Suspense fallback={<Spinner />}>
<AdminPanel />
</Suspense>
} />
Split at route boundaries, and also for heavy components like chart libraries, rich-text editors, and map widgets.
Import only what you use
// ❌ Imports entire lodash (~70KB parsed)
import _ from 'lodash'
// ✅ Imports only the function you need
import debounce from 'lodash/debounce'
// Or use lodash-es for tree-shakeable builds
import { debounce } from 'lodash-es'
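Before reaching for lodash at all, check whether a native method covers the case. A few common utilities are one-liners now (sketches, not drop-in replacements for every lodash option):

```javascript
const users = [
  { id: 1, name: 'Ada' },
  { id: 2, name: 'Grace' },
  { id: 2, name: 'Grace' }, // duplicate
]

// _.uniqBy(users, 'id'): a Map keeps one entry per key
const uniqueUsers = [...new Map(users.map((u) => [u.id, u])).values()]

// _.pick(obj, keys) as a one-liner
const pick = (obj, keys) =>
  Object.fromEntries(keys.filter((k) => k in obj).map((k) => [k, obj[k]]))

// _.flattenDeep(nested)
const flattened = [1, [2, [3, [4]]]].flat(Infinity)
```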
Analyse the bundle
- Vite: vite-bundle-visualizer
- webpack: webpack-bundle-analyzer
Run it once. You will almost always find a large library pulled in unnecessarily, or the same library duplicated at two versions.
Prefer smaller alternatives
| Heavy | Lighter alternative |
|---|---|
| moment (67KB) | date-fns (tree-shakeable) or the Temporal API |
| Full lodash | Individual imports or native array methods |
| axios | Native fetch + a thin wrapper |
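For the axios row, a thin wrapper is often all that is needed. A sketch assuming a JSON API (the `api` name and error message shape are illustrative):

```javascript
// Minimal fetch wrapper: JSON parsing plus an error for non-2xx responses,
// which fetch does not throw on its own.
async function api(url, options = {}) {
  const response = await fetch(url, {
    ...options,
    headers: { 'Content-Type': 'application/json', ...options.headers },
  })
  if (!response.ok) {
    throw new Error(`HTTP ${response.status} for ${url}`)
  }
  return response.json()
}
```

Base URLs, auth headers, and retries can be layered onto the same function as the app needs them.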
Data Fetching
Using useEffect for data fetching is functional but carries real costs: no caching, no request deduplication, awkward cleanup, and the data only arrives after render.
React Query (@tanstack/react-query) or SWR solve all of this:
const { data: users, isLoading } = useQuery({
queryKey: ['users'],
queryFn: () => fetch('/api/users').then((r) => r.json()),
staleTime: 60_000, // treat data as fresh for 60 seconds
})
Benefits over useEffect:
- Caching — same query across components shares one request
- Deduplication — concurrent calls for the same key fire once
- Background refetch — stale data refreshes silently
- Automatic retry — transient network failures are handled
- Request cancellation — unmounting cleans up in-flight requests
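The deduplication idea is small enough to sketch: concurrent callers for the same key share one in-flight promise (illustrative, not React Query's actual implementation):

```javascript
// Share one in-flight request per query key; clear the slot when it
// settles so a later call triggers a fresh fetch.
const inflight = new Map()

function dedupedFetch(key, fetcher) {
  if (!inflight.has(key)) {
    const promise = fetcher().finally(() => inflight.delete(key))
    inflight.set(key, promise)
  }
  return inflight.get(key)
}

let calls = 0
const fetchUsers = () => {
  calls++
  return Promise.resolve([{ id: 1 }])
}

// Two components mount at once and ask for the same data…
Promise.all([
  dedupedFetch('users', fetchUsers),
  dedupedFetch('users', fetchUsers),
]).then(([a, b]) => {
  console.log(calls)   // 1: the fetcher ran once
  console.log(a === b) // true: both received the same result
})
```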
Debounce Expensive Events
Typing into a search field should not fire an API call on every keystroke:
// `search` must be a stable reference (defined outside the component or
// wrapped in useCallback); otherwise add it to the dependency array
const debouncedSearch = useMemo(() => debounce(search, 300), [])
useEffect(() => {
return () => debouncedSearch.cancel()
}, [debouncedSearch])
The cleanup cancels any pending call on unmount to prevent state updates on a dead component.
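If lodash is only in the bundle for debounce, a minimal version with `cancel` is a few lines (a sketch covering the trailing-edge case, not every lodash option):

```javascript
// Trailing-edge debounce: run fn only after `wait` ms of silence.
function debounce(fn, wait) {
  let timer = null
  const debounced = (...args) => {
    clearTimeout(timer)
    timer = setTimeout(() => fn(...args), wait)
  }
  debounced.cancel = () => clearTimeout(timer)
  return debounced
}

let searches = 0
const search = debounce(() => searches++, 50)

// Three rapid keystrokes collapse into one call after the pause
search(); search(); search()
setTimeout(() => console.log(searches), 100) // 1
```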
Avoid Memory Leaks
Unclean side effects cause setState calls after a component unmounts, producing console warnings in development and subtle bugs in production.
useEffect(() => {
const id = setInterval(tick, 1000)
return () => clearInterval(id)
}, [])
useEffect(() => {
const controller = new AbortController()
fetch('/api/data', { signal: controller.signal })
return () => controller.abort()
}, [])
React StrictMode mounts, unmounts, and remounts every component in development, deliberately exercising cleanup functions so missing ones are caught early.
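AbortController works outside fetch too; the mechanics can be seen in isolation (runnable in Node, which ships the same API):

```javascript
// An AbortController exposes a signal that flips to aborted exactly once
// and notifies listeners synchronously; fetch uses this to cancel the
// in-flight network request.
const controller = new AbortController()

let cleanedUp = false
controller.signal.addEventListener('abort', () => {
  cleanedUp = true // where fetch would tear down the request
})

console.log(controller.signal.aborted) // false
controller.abort()
console.log(controller.signal.aborted) // true
console.log(cleanedUp)                 // true
```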
Anonymous Components in JSX
// ❌ Defined during render: a new component type every render, so React
// unmounts and remounts the subtree instead of updating it
const DashboardPage = () => <Dashboard />
<Route element={<DashboardPage />} />
// ✅ Stable component reference
<Route element={<Dashboard />} />
Defining a component during render means React sees a different component type each time, so it destroys and recreates the DOM subtree instead of updating it.
Production Build
Development mode is significantly slower — it includes extra validation, dev-only warnings, and StrictMode double-invocation. Always benchmark against the production build:
npm run build
npm run preview # Vite
# or
npx serve dist # any static server
Never report performance numbers from dev mode.
Production Monitoring
Optimization is not a one-time task. Real users on real devices find issues your laptop never will.
- Sentry — runtime error tracking with component stack traces
- Web Vitals API — measure LCP, INP, and CLS in real user sessions
- Performance monitoring (Datadog, New Relic) — long-term trend visibility
Priority Order
If you have limited time, the following order produces the most value per hour of work:
- Prevent unnecessary re-renders — React.memo, useMemo, useCallback, stable context values
- Virtualise large lists — react-window or @tanstack/virtual
- Code split at route boundaries — React.lazy + Suspense
- Replace useEffect fetching with React Query or SWR
- Analyse and reduce bundle size
- Architecture — local state, small components, avoid Context for high-frequency updates
Everything else (Web Workers, SSR, image CDN, CSS purging) is high value in specific contexts but rarely where you should start.
The Core Principle
Measure. Fix what you can prove is slow. Measure again.
Optimising unmeasured code is the most common way to spend two days making a fast component slightly faster while the real bottleneck — a 2MB unoptimised bundle or a 1 000-row unvirtualised list — goes untouched.