Miłosz Orzeł

.net, js, html, arduino, java... no rants or clickbaits.

Vanilla div performance

INTRO

I've recently checked whether using a Box in Material UI 5, instead of a plain div, has a significant performance impact. I don't think it would matter much in practice, but read the post if you are interested in the details... Doing this check got me thinking: how much faster would making the little square divs be if we dropped React and did the DOM manipulation with plain JS? Let's find out!

 

DISCLAIMER

I'm not suggesting that you should ditch the comforts of React/MUI and write your programs with vanilla JS just because it's faster to create tens of thousands of DOM elements this way. It's best to avoid making too many elements (think of paging and virtualization), and if you must, try to memoize to avoid repeating expensive operations...

 

STRESS TEST APP

Just like the program from the previous post, the test app for this post makes lots of small squares with varying colors (through randomized opacity). It was made with the Vite Vanilla + TypeScript template (vite: 5.4.1, typescript: 5.5.3).

Live demo is here: https://morzel85.github.io/blog-post-vanilla-div-performance

The code is here: https://github.com/morzel85/blog-post-vanilla-div-performance

Div stress test app... Click to enlarge...

 

The app has an input for choosing the amount of squares to create inside a container div (flexbox with wrapping), a few buttons that create the divs in different ways, and a button for clearing all squares. Clearing is done by setting container.innerHTML to an empty string and it runs before the squares are made, so items don't accumulate between clicks.
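
The clear() helper isn't shown in the listings below; a minimal sketch of it (assuming containerDiv and clearButton were already grabbed with querySelector, the exact implementation is in the repo) could look like this:

const clear = () => {
  containerDiv.innerHTML = '';
  clearButton.disabled = true;
};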

The Make: createElement (inline) + appendChild button calls this function (inline styles are used here, just like in the previous app):

const makeCreateElementInline = () => {
  clear();

  for (let i = 0; i < amount; i++) {
    const div = document.createElement('div');

    div.style.backgroundColor = 'darkorange';
    div.style.opacity = Math.random().toString();
    div.style.margin = '1px';
    div.style.width = '15px';
    div.style.height = '15px';

    containerDiv.appendChild(div);
  }

  clearButton.disabled = false;
};

The Make: createElement + appendChild button calls this function:

const makeCreateElement = () => {
  clear();

  for (let i = 0; i < amount; i++) {
    const div = document.createElement('div');
    div.className = 'item';
    div.style.opacity = Math.random().toString();
    containerDiv.appendChild(div);
  }

  clearButton.disabled = false;
};

This time a CSS class is used to apply repetitive styles (all except opacity):

.item {
  background-color: darkorange;
  margin: 1px;
  width: 15px;
  height: 15px;
}

The functions above use an old-school for loop to add items, while in the previous post I used Array.from({ length: amount }, (_, i) => ... ) to generate them. Array.from feels more natural in a functional component but is slower. It is still fast enough that it just doesn't matter here: around 200 ops/s for Array.from vs 1,300 ops/s for the classic for loop on a test that fills an array with 100K items.
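
The exact harness behind those ops/s numbers isn't shown here, but a rough console.time sketch of the same comparison (hypothetical helper names) could look like this:

const withArrayFrom = (n: number) => Array.from({ length: n }, (_, i) => i);

const withForLoop = (n: number) => {
  const result: number[] = new Array(n);
  for (let i = 0; i < n; i++) {
    result[i] = i;
  }
  return result;
};

console.time('Array.from');
withArrayFrom(100_000);
console.timeEnd('Array.from');

console.time('for loop');
withForLoop(100_000);
console.timeEnd('for loop');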

Both functions use createElement and appendChild from DOM API.

The Make: string + innerHTML button calls this function:

const makeInnerHTML = () => {
  clear();

  let s = '';

  for (let i = 0; i < amount; i++) {
    const d = `<div class="item" style="opacity: ${Math.random()}"></div>`;
    s += d;
  }

  containerDiv.innerHTML = s;

  clearButton.disabled = false;
};

If you have a C#/Java background like me, you may think that the code above contains a classic blunder: not using a string builder. Well, it turns out that JS engines are pretty good at string concatenation. Check out this great post.
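
For the curious, here's a rough sketch (hypothetical helpers, not part of the test app) comparing plain += concatenation with the array-join approach; on modern engines both are fast enough for this use case:

const buildWithConcat = (n: number) => {
  let s = '';
  for (let i = 0; i < n; i++) {
    s += `<div class="item" style="opacity: ${Math.random()}"></div>`;
  }
  return s;
};

const buildWithJoin = (n: number) => {
  const parts: string[] = [];
  for (let i = 0; i < n; i++) {
    parts.push(`<div class="item" style="opacity: ${Math.random()}"></div>`);
  }
  return parts.join('');
};

console.time('concat');
buildWithConcat(100_000);
console.timeEnd('concat');

console.time('join');
buildWithJoin(100_000);
console.timeEnd('join');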

The final Make: createDocumentFragment + appendChild button calls this function:

const makeCreateDocumentFragment = () => {
  clear();

  const fragment = document.createDocumentFragment();

  for (let i = 0; i < amount; i++) {
    const div = document.createElement('div');
    div.className = 'item';
    div.style.opacity = Math.random().toString();
    fragment.appendChild(div);
  }

  containerDiv.appendChild(fragment);

  clearButton.disabled = false;
};

The idea here is to first append all items to a fragment (obtained from createDocumentFragment) that is not part of the DOM tree and, only when that's done, append the fragment to the DOM. It used to help in older browsers, just like the innerHTML solution shown before...

I've included a couple of different methods of adding elements, but the results below show only the behavior of Make: createElement (inline) + appendChild to stay close to the React version and to prevent this post from getting too long.

I've clicked around a bit and, to be honest, I haven't noticed a significant difference that would justify switching from the simplest createElement + appendChild to innerHTML or createDocumentFragment... But the buttons are there for you to try it yourself.

 

DESKTOP RESULTS

Just like before, I'm running the test on a 7-year-old PC with an Intel Core i7-7700K, 16 GB RAM and an MSI GTX 1080 Ti, using Chrome 128 on Ubuntu 20.04.

Here's what my eyes could see (time between the button press and the items appearing/page becoming responsive):

  • 500, 1K, and 2K: items appear instantly.
  • Barely perceptible lag starts to appear around 4K items, but even at 8K it could pass as instant.
  • For 15K: about 0.3s.
  • For 50K: around a second.

Here's the performance trace for 10K items (just mind that DevTools instrumentation is not free):

Performance report for 10K items. Click to enlarge...

 

Not bad, right? 

OK, let's check 25K:

Performance report for 25K items. Click to enlarge...

 

Half a second, sweet!

How about a 100K?

Performance report for 100K items. Click to enlarge...

OK, finally we got Chrome to sweat a bit, but still... 2.3s for one hundred thousand divs! Surely the browser is smart enough not to paint the squares that are far off-screen, but the divs are there (check the Nodes count on the screenshot; document.querySelectorAll('*').length reports more than 100K too).

One thing worth mentioning here is the scrolling performance: scrolling through the absurd block of 100K squares could get really slow; it could freeze the browser for a few seconds depending on the distance. For the more "reasonable" 20K it was pretty smooth...

Quick note about Firefox: I've run the abusive 100K check in Firefox 130 and div making was even a bit faster than in Chrome, plus there was no scrolling issue! The flip side: getting back to a tab with this many elements was a bit laggy.

 

MOBILE RESULTS

Like before, tests were done on my 3+ year-old, non-flagship Samsung Galaxy A52 on Android 14 (this time in Chrome 128 instead of 127).

This is the perceived performance I got (time between clicking a button and the page with generated items becoming responsive):

  • 500, 1K, 2K items: instant.
  • 3K: barely perceptible delay.
  • 5K: about 0.2s.
  • 10K: about 0.4s.
  • 50K: about 1.2s.
  • 100K: about 3s.

Honestly, I'm shocked how fast it is on a phone worth about 250 USD (1000 PLN) - I mean a new one; mine fell a few too many times without a case to be worth that much ;)

There's not much difference between the phone and my (old) PC in div-making speed. My greatest surprise is the scrolling: it didn't freeze the browser even with 100K elements! The worst I've seen was empty space shown for a while, but the scrollbar stayed responsive all the time (it actually performed way better than desktop Chrome)!

 

Material UI 5 Box performance

INTRO

I've been relying on Material UI Box components quite a lot, because doing so allows use of the theme-aware sx property and common attributes such as display or gap. It makes the code more consistent with other uses of MUI components.

The Box output is lightweight (it's just a div), but I was wondering how making plenty of these can impact performance, so I've built a test app to (ab)use the Box...

TL;DR: It's quite unlikely that Box vs div performance will become an issue in a real application.

 

MUI v6

Just as I was writing this text, the MUI team released v6.0.0 of Material UI. The release announcement blog post mentions runtime performance improvements and experimental availability of Pigment CSS (a zero-runtime CSS-in-JS solution that will eventually replace the use of Emotion and allow the sx property on plain divs).

I'm keeping this post focused on v5 (to be precise: v5.16.7, which was released less than 3 weeks ago). There are many projects that use v5 and will stay with it for a while. Plus, it might be useful to compare the two major versions in the future...

 

STRESS TEST APP

I've made a small application (vite: 5.4.1, react: 18.3.1, @mui/material: 5.16.7, typescript: 5.5.3) to generate lots of squares with random colors (either by rendering plain div elements or by using the MUI Box).

Live demo is here: https://morzel85.github.io/blog-post-mui-5-box-performance

The code is here: https://github.com/morzel85/blog-post-mui-5-box-performance

Box stress test app... Click to enlarge...

 

When the MAKE button is pressed, the app generates the chosen amount of items. Use the CLEAR button to remove all items. Toggling between the Plain div and MUI Box options rerenders all the items. Divs are green, Boxes are purple. Simple.

Here's how the divs are created: 

import { memo } from 'react';

export const PlainDivs = memo(({ amount }: { amount: number }) =>
  Array.from({ length: amount }, (_, i) => (
    <div
      key={i}
      style={{
        background: 'darkgreen',
        opacity: Math.random(),
        margin: '1px',
        width: '15px',
        height: '15px'
      }}
    />
  ))
);

Yeah, inline styles are used even for static properties. These could be extracted to a single class, but they're kept inline to make this closer to the Box/sx version.

This is how Box items are done:

import { memo } from 'react';
import { Box } from '@mui/material';

export const MuiBoxes = memo(({ amount }: { amount: number }) =>
  Array.from({ length: amount }, (_, i) => (
    <Box
      key={i}
      sx={{
        background: 'purple',
        opacity: Math.random(),
        margin: '1px',
        width: '15px',
        height: '15px'
      }}
    />
  ))
);

Notice that the random opacity rule is quite unfavorable for MUI/Emotion, as it will generate a lot of different CSS classes that must be injected into the page at runtime! A generated CSS rule might look like this:

.css-s5s1br {
    background: purple;
    opacity: 0.846957;
    margin: 1px;
    width: 15px;
    height: 15px;
}

 

DESKTOP RESULTS

Here are a couple of results from my 7-year-old PC with an Intel Core i7-7700K and 16 GB RAM (MSI GTX 1080 Ti still going strong!), running Chrome 128 on Ubuntu 20.04.

  • For the default 500 items, generating feels practically instant for both Plain div and MUI Box. Same for 1K.
  • Let's go for 5K: divs are about 0.3s, Boxes about 0.4s.
  • How about 15K? OK, now there's about 1.2s of lag for the div version and maybe 1.4s for the Box version.
  • Well... 50K? 5.5s for divs and about 6.5s for Boxes.
    Quick sanity check: document.querySelectorAll('*').length -> 50056. It really created all those elements. Nice job React, nice job MUI! Aren't modern browsers a marvel of engineering? I remember the times when we had to worry about not putting too many JS validations on a form...

The above highly scientific results were collected with my eyeballs and a stopwatch (time between pressing the MAKE button and the page becoming responsive).

If you want something more precise, here's a performance trace for 10K items:

Plain div vs Box performance report... Click to enlarge...

 

Speed-wise there's not that much difference between the plain div and Box versions, although you can see that the Box version uses about 3.5x more memory. Sounds like a lot, but 10K Boxes (with an unnaturally large number of unique classes) took about 30 MB.

Watch out for tests with too many items (especially if you open the Elements tab), DevTools might choke a bit...

 

MOBILE RESULTS

OK, how about a phone? This is how my 3+ year-old, non-flagship Samsung Galaxy A52 performs (Chrome 127 on Android 14):

  • 500 and 1K items: instant for both divs and Boxes. 
  • 5K: about 0.5s for divs and 0.9s for Boxes.
  • 15K: about 1.7s for divs and 3.3s for Boxes.
  • and finally the absurd 50K: about 15s for divs and 22s for Boxes (hope you never have to render this many elements on a desktop, let alone mobile)...

Speaking of DOM size, Lighthouse provides warnings and errors for excessive DOM size (as of this writing the thresholds are about 800 and 1,400 elements respectively). It also reports DOM depth, which is a performance factor too (my little app doesn't check it, but the Box doesn't increase it). The largest size I've seen in practice was about 25K elements. When something like this happens, it's usually caused by a data grid with complex cell renderers and a lack of virtualization (column virtualization is important too).
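
If you want a quick DOM depth check without running Lighthouse, a rough console sketch (not part of the test app) could be:

// Walks the tree and returns the deepest nesting level found.
const maxDepth = (node: Element, depth = 1): number => {
  let max = depth;
  for (const child of Array.from(node.children)) {
    max = Math.max(max, maxDepth(child, depth + 1));
  }
  return max;
};

console.log('Max DOM depth:', maxDepth(document.documentElement));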

 

BONUS INFO

When the application is running in release mode (the result of npm run build), MUI/Emotion doesn't create individual style elements for each class.

When you click on <style> to see where the CSS rule is defined:

Finding style source... Click to enlarge...

you will land on a style element that appears empty:

Emotion style element... Click to enlarge...

 

This is a bit confusing - where are the classes? Emotion uses the insertRule API, which is very fast, but the disadvantage is the lack of DevTools support (check this GitHub issue and this answer in particular).
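
The rules injected with insertRule are still reachable through document.styleSheets, so you can list them from the console. A small sketch (the data-emotion attribute filter is an assumption about how Emotion marks its style tags; dropping the filter and listing all sheets works too):

for (const sheet of Array.from(document.styleSheets)) {
  const owner = sheet.ownerNode;
  if (owner instanceof HTMLStyleElement && owner.dataset.emotion !== undefined) {
    for (const rule of Array.from(sheet.cssRules)) {
      console.log(rule.cssText);
    }
  }
}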

 

Update 2024-09-08:
I have a follow-up post that checks div performance without React: https://en.morzel.net/post/vanilla-div-performance

Accessing state in Redux Thunks

INTRO

Redux thunks offer a good way of separating logic from UI components. Within a thunk you can access the entire state, dispatch actions and other thunks, and generate side effects (unlike in reducers)... Redux Toolkit contains the redux-thunk middleware and adds it during standard store configuration, nice.

While dispatching actions and accessing state in thunks is easy (every thunk receives dispatch and getState functions as arguments), it's also easy to misuse the API and operate on outdated state, especially when async operations are involved.

In this post I'll show you two possible mistakes: one about referencing a stale state object and another related to the order of async operations.

 

CODE SAMPLE

Full code is available in this GitHub repository (vite 5.2.0, react 18.2.0, react-redux 9.1.0, @reduxjs/toolkit 2.2.3, typescript 5.2.2): https://github.com/morzel85/blog-post-redux-thunk-state-access

Here's the Redux Toolkit slice that I'll use to discuss the issues (with the thunks skipped as these will be shown later).

import type { PayloadAction } from '@reduxjs/toolkit';
import type { AppThunk } from './store';
import { createSlice } from '@reduxjs/toolkit';

export type ExampleState = {
  count: number;
  text: string;
};

const initialState: ExampleState = {
  count: 0,
  text: ''
};

// Thunks skipped...

export const exampleSlice = createSlice({
  name: 'example',
  initialState,
  reducers: {
    resetCount: state => {
      console.log('Reducer resetting count');
      state.count = 0;
    },
    resetText: state => {
      console.log('Reducer resetting text');
      state.text = '';
    },
    increment: state => {
      console.log(`Reducer incrementing count ${state.count}`);
      state.count += 1;
    },
    appendText: (state, action: PayloadAction<string>) => {
      console.log(`Reducer appending text ${action.payload}`);
      state.text += action.payload;
    }
  }
});

export const { resetCount, resetText, increment, appendText } = exampleSlice.actions;

export default exampleSlice.reducer;

As you can see, the slice is really simple: it manages one number and one string.
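
The thunks below are typed with AppThunk imported from ./store. That file isn't the focus of this post, but a minimal sketch of it (assuming the standard Redux Toolkit pattern; the ./exampleSlice path is illustrative) could look like this:

import { configureStore } from '@reduxjs/toolkit';
import type { Action, ThunkAction } from '@reduxjs/toolkit';
import exampleReducer from './exampleSlice';

export const store = configureStore({
  reducer: {
    example: exampleReducer
  }
});

export type RootState = ReturnType<typeof store.getState>;
export type AppDispatch = typeof store.dispatch;

// Thunk signature: return value, full state shape, extra argument (unused), action type.
export type AppThunk<ReturnType = void> = ThunkAction<ReturnType, RootState, unknown, Action>;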

 

STALE STATE

The second argument passed to a thunk is the getState function. It gives you access to the entire state, which you can use directly or through selectors...

In case you need to access the state multiple times, it's tempting to grab it with a single getState call, but don't do it. If you dispatch an action in a thunk, Redux will run the reducers and make the updated state available later in the thunk code, but to get it you must invoke getState again.

Take a look at this thunk:

export const thunkWithStateAccess = (): AppThunk => (dispatch, getState) => {
  console.log('thunkWithStateAccess START');
  dispatch(resetCount());

  // This value will be stale!
  const state = getState();

  console.log(`state.example === getState().example: ${state.example === getState().example}`); // true

  dispatch(increment());
  console.log(`count in thunk with state: ${state.example.count}`); // 0
  console.log(`count in thunk getState(): ${getState().example.count}`); // 1

  console.log(`state.example === getState().example: ${state.example === getState().example}`); // false

  dispatch(increment());
  console.log(`count in thunk with state: ${state.example.count}`); // 0
  console.log(`count in thunk getState(): ${getState().example.count}`); // 2

  console.log('thunkWithStateAccess END');
};

This is the console output:

thunkWithStateAccess START
Reducer resetting count
state.example === getState().example: true
Reducer incrementing count 0
count in thunk with state: 0
count in thunk getState(): 1
state.example === getState().example: false
Reducer incrementing count 1
count in thunk with state: 0
count in thunk getState(): 2
thunkWithStateAccess END

Notice that initially state.example and getState().example reference the same object. This changes after the increment action is dispatched. The reducer modifies the state and the latest value is no longer available under the state const (it doesn't matter that it's declared as const; the same would happen with let).

 

ASYNC THUNKS

Now let's see what the state access situation looks like when async thunks are involved (here I'm talking about asynchronous thunks in the general sense, not about createAsyncThunk from RTK):

export const thunkThatAppendsB = (): AppThunk => dispatch => {
  dispatch(appendText('B'));
};

export const asyncThunkThatAppendsCWithDelay = (): AppThunk => async dispatch => {
  const value = await new Promise<string>(resolve => setTimeout(resolve, 1000, 'C'));
  dispatch(appendText(value));
};

export const asyncThunkThatAppendsDWithoutDelay = (): AppThunk => async dispatch => {
  const value = await Promise.resolve('D');
  dispatch(appendText(value));
};

export const thunkWithoutAwait = (): AppThunk => async dispatch => {
  console.log('thunkWithoutAwait START');
  dispatch(resetText());

  dispatch(appendText('A'));
  dispatch(thunkThatAppendsB());
  dispatch(asyncThunkThatAppendsCWithDelay()); // No await: E and D will be first!
  dispatch(asyncThunkThatAppendsDWithoutDelay()); // No await: E will be first!
  dispatch(appendText('E'));

  console.log('thunkWithoutAwait END'); // This will happen before D and C are appended!
};

This is the console output of thunkWithoutAwait:

thunkWithoutAwait START
Reducer resetting text
Reducer appending text A
Reducer appending text B
Reducer appending text E
thunkWithoutAwait END
Reducer appending text D
Reducer appending text C

The text state ends up as: "ABEDC", what a mess! 

Here the letter A is appended simply by dispatching the appendText action. The letter B is added by dispatching a synchronous thunk that internally dispatches the appendText action. So far so good, the state contains the text "AB".

Problems start when asynchronous thunks are dispatched without await. Notice that E is appended right after B and that "thunkWithoutAwait END" is logged before the async thunks can modify the state. This happens even for the async thunk that resolves without any delay.

The important thing is that both asyncThunkThatAppendsCWithDelay and asyncThunkThatAppendsDWithoutDelay are awaiting promise resolution, and this is enough to let the invoking function (thunkWithoutAwait) continue. It has nothing to do with Redux; this is how JavaScript promises work (even if you were to switch from await to a classic then call, the effect would be the same).
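
Here's a tiny standalone sketch (no Redux involved) showing the same behavior - as soon as the inner async function hits await, control returns to the caller and the rest of the inner function runs later as a microtask:

const inner = async () => {
  await Promise.resolve('already resolved');
  console.log('inner: after await');
};

console.log('outer: before inner()');
inner(); // not awaited
console.log('outer: after inner()');

// Output:
// outer: before inner()
// outer: after inner()
// inner: after await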

Fortunately the solution is simple: don't forget the await if you expect serialized behavior of your thunks:

export const thunkWithAwait = (): AppThunk => async dispatch => {
  console.log('thunkWithAwait START');
  dispatch(resetText());

  dispatch(appendText('A'));
  dispatch(thunkThatAppendsB());
  await dispatch(asyncThunkThatAppendsCWithDelay());
  await dispatch(asyncThunkThatAppendsDWithoutDelay());
  dispatch(appendText('E'));

  console.log('thunkWithAwait END');
};

Now the console output is:

thunkWithAwait START
Reducer resetting text
Reducer appending text A
Reducer appending text B
Reducer appending text C
Reducer appending text D
Reducer appending text E
thunkWithAwait END

and the text state ends up as sane: "ABCDE".

That's it, just don't get too crazy and remember that thunks are not meant to dispatch tens of actions or handle persistent connections... 

Controlling CSS module class names in Vite

INTRO

A couple of months ago, I migrated my vim.morzel.net hobby project from Create React App to Vite and noted down a few things that required attention. Recently, while adding dark mode to the same project, I spotted another difference between CRA and Vite: class names for CSS modules were built differently.

For example, this is what I had in CRA for a file named DrillChoiceScreen.module.css and a class named Content:

Class names with CRA... Click to enlarge...

 

and this is what I got for the same thing in Vite:

Class names with Vite... Click to enlarge...

 

Notice that the file name prefix is missing. I preferred having it, as it makes it obvious which .module.css file contains the styling.

I've spent some time experimenting with the Vite config for CSS modules and logged a few observations in case I need to tweak it again in the future, or you need to do it now :)

 

CONFIG FILE

As usual with Vite, the configuration happens in vite.config.ts, which is automatically added when you start a new project with the npm create vite@latest command. This is what the fresh file looks like (at least for React/TypeScript+SWC in Vite 5.2.0, with semicolons added - cause I like 'em):

import { defineConfig } from "vite";
import react from "@vitejs/plugin-react-swc";

// https://vitejs.dev/config/
export default defineConfig({
  plugins: [react()],
});

And this is how it looks after adding a css.modules configuration to make the file name appear in generated CSS module class names:

import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react-swc';

// https://vitejs.dev/config/
export default defineConfig({
  plugins: [react()],
  css: {
    modules: {
      generateScopedName: '[name]_[local]_[hash:base64:5]'
    }
  }
});

 

STRING PATTERN

Providing a string pattern to the generateScopedName property (as shown above) is the easiest way to affect generated class names. Vite uses postcss-modules (by default; there's also experimental support for Lightning CSS, configured differently). The interpolated string supports multiple placeholders, which are described here.

While using the string was easy, the [name] pattern produces the full file name. I name my CSS modules this way: for a component Example.tsx the styling goes to Example.module.css. Having "module" in the class name was a bit annoying...

Fortunately, generateScopedName can also take a function, which gives a lot of control over the generated names.


CUSTOM FUNCTION

Below you can see an example of a generateScopedName function config that creates class names containing: a custom m_ prefix, the file name, the class name and a line number.

This is based on an example from the postcss-modules docs, but tweaked slightly to give more descriptive parameter names and to handle both Unix (LF) and Windows (CRLF) line endings.

For an Example.module.css file and a MyClass class from line 10, it produces m_Example_MyClass_10 as the generated class name.

import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react-swc';
import path from 'path';

// https://vitejs.dev/config/
export default defineConfig({
  plugins: [react()],
  css: {
    modules: {
      generateScopedName: (className, filePath, css) => {
        const classNameIndex = css.indexOf(`.${className}`);
        const lineNumber = css.substr(0, classNameIndex).split(/\r?\n/).length;
        const fileName = path.basename(filePath, '.module.css');
        const prefix = 'm_';

        return `${prefix}${fileName}_${className}_${lineNumber}`;
      },
    }
  }
});

When deciding on a class name, it's important to remember that it should not start with a digit. It's best to always include some prefix (it can be as simple as an underscore). Watch out especially if you would like the name to start with a content hash (nothing worse than flaky issues caused by some hashes starting with a digit).

Speaking of prefixes and hashes: css.modules has a hashPrefix property, but mind that it doesn't control what gets prepended to the class name. It affects the computed hash value, which you can include in the string version of generateScopedName with the [hash] pattern.
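
A minimal sketch showing where hashPrefix fits (the value is illustrative); it only feeds into the computed [hash], nothing gets prepended to the class name itself:

import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react-swc';

// https://vitejs.dev/config/
export default defineConfig({
  plugins: [react()],
  css: {
    modules: {
      hashPrefix: 'my-project',
      generateScopedName: '[name]_[local]_[hash:base64:5]'
    }
  }
});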

This is what a function that includes a hash of the CSS (the full module file content) could look like (provided that import crypto from 'crypto'; is added):

generateScopedName: (className, filePath, css) => {
  const classNameIndex = css.indexOf(`.${className}`);
  const lineNumber = css.substr(0, classNameIndex).split(/\r?\n/).length;
  const fileName = path.basename(filePath, '.module.css');
  const prefix = 'm_';

  const hash = crypto.createHash('sha1').update(css).digest('hex').substring(0, 4);

  return `${prefix}${fileName}_${className}_${lineNumber}_${hash}`;
}

The crypto module contains tons of options, but for the purpose of a content hash suffix, simply taking a few characters from the SHA-1 hex representation should be OK.

 

DEV AND RELEASE NAMES

As shown earlier in this post, the defineConfig function can take an object that describes the configuration, but it can also take a function. Vite docs call it conditional config. The function receives an object with a couple of properties, one of them being command. Using this property you can distinguish between running the app locally while developing (npm run dev) and building for release (npm run build).

Here's an example of a conditional config that uses hashed class names for build and much longer, more descriptive names for development:

export default defineConfig(({ command }) => {
  return {
    plugins: [
      react(),
    ],
    css: {
      modules: {
        generateScopedName: (className, file, css) => {
          const classNameIndex = css.indexOf(`.${className}`);
          const lineNumber = css.substr(0, classNameIndex).split(/\r?\n/).length;
          const fileName = path.basename(file, '.module.css');
          const prefix = 'm_';

          let result;

          if (command === 'build') {
            const hash = crypto
              .createHash('sha1')
              .update(className + lineNumber + css)
              .digest('hex')
              .substring(0, 8);

            result = `${prefix}${hash}`;
          } else {
            const hash = crypto.createHash('sha1').update(css).digest('hex').substring(0, 4);
            const combined = `${prefix}${fileName}_${className}_${lineNumber}_${hash}`;

            result = combined.replace(/(?<=^.+_)Top_\d+_/, '');
          }

          return result;
        }
      }
    }
  };
});

Notice that the hashed value for command === 'build' combines the class name, line number and CSS. In the else branch (development) just the CSS is hashed, but there the hash is only an addon to a much more informative name.

In development mode you can also notice a replace call with a regular expression. It may look a bit complicated because it uses a lookbehind assertion, but it's there to shorten names such as Example_Top_1_Something_5 to Example_Something_5 (I use Top as the "main" class name for a component that uses a CSS module).

Notes from migrating React app from CRA to Vite

INTRO

I've recently made the decision to migrate my vim.morzel.net pet-project from Create React App to Vite. To be honest, I was quite content with how CRA (with a bit of React App Rewired) functioned, and updating might not have been necessary. However, I wanted to use this migration as practice before possibly employing Vite on something more serious (where the speed and active development of Vite might prove a blessing).

Here's a summary of the front-end part of vim.morzel.net as reported by cloc on src directory:

-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
TypeScript                      94            535             88           3382
CSS                             32            268             10           1613
JavaScript                       3             71             26            364
Markdown                         1             28              0             58
JSON                             1              0              0             15
SVG                              1              0              0              1
-------------------------------------------------------------------------------
SUM:                           133            902            124           5433
-------------------------------------------------------------------------------

The project is done mostly with TypeScript and CSS (modules). JS lines come primarily from glue code for the WebAssembly module for drills (which is done in Rust in a separate repository).

The most notable dependencies are Redux (with Toolkit), React Syntax Highlighter, Fontsource and React Testing Library. A full list with links is at the bottom of this page.

As you can see, the project is small, but nonetheless useful as a testing ground for migration to Vite.

The application is deployed on a DigitalOcean droplet with Ubuntu 20.04, NGINX 1.18.0, Node.js 17.3.0 and PM2 5.10.

In this post I'll note my approach for the migration (which worked quite well) and the issues I ran into. Hopefully you will find some useful information here in case you plan to switch to Vite too.

 

PREPARING FOR MIGRATION

Switching dev/build tooling sounded like a good occasion to upgrade the React version too (from 16.9.0 to 18.2.0). This went surprisingly smoothly; I just needed to update to the new application root (details here).
Once on the latest React, I went for the newest Jest and React Testing Library without any blockers (I wasn't sure yet about trying Vitest).

 

A CLEAN SLATE

Initially the plan was to replace CRA features with Vite as described in this article or this one, but then I thought that I was too much of a Vite noob and I would rather start with a nice and clean Vite project and move the screens to that fresh setup. So I created a branch for the Vite version and put there only the results of running the Vite scaffolding (v4.3.9) for React with TypeScript and SWC. Then I checked out the CRA version (master branch) to another folder so I could easily move files around and have the two applications running side by side for comparison. Moving content from the old version to the new one was quite easy: basically copy & paste of folders with features, utils and overall app setup (like the Redux store). Then I copied some style and config files, the favicon, etc. A bit more attention was needed when putting chunks of code into index.html, src/main.tsx and src/App.tsx. I of course had to install a couple of dependencies that the app needed but were not part of the Vite scaffolding...

 

SOME HURDLES

There were a couple of things that required adjustments, here they are in random order - I might have forgotten some, sorry :)

Port configuration

By default Vite runs the dev server on port 5173, while CRA uses 3000. Since I had a couple of bookmarks with :3000, I wanted to keep it. It's easy to change: just add this to your vite.config.ts:

server: {
  port: 3000,
  open: true
}

The open flag will automatically launch the browser when the server starts (just like CRA does by default).

You can also add a config for running the production build locally with npm run preview (no need to install serve like in CRA):

preview: {
  port: 3000,
  open: true
}

Accessing config from .env files

The CRA version used .env files (like .env.development or .env.production) with the help of the env-cmd package. Chosen values (prefixed with REACT_APP_) were automatically included in the UI and accessed in code like this: process.env.REACT_APP_SOMETHING. Fortunately, Vite has built-in support for .env files (thanks to dotenv), but exposed values should be prefixed with VITE_. Access to the values is slightly different: there is no process.env, you should use import.meta.env instead (for example: import.meta.env.VITE_SOMETHING).
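
A quick before/after sketch with a hypothetical API_URL setting:

// CRA (.env: REACT_APP_API_URL=https://example.com/api):
const apiUrlCra = process.env.REACT_APP_API_URL;

// Vite (.env: VITE_API_URL=https://example.com/api):
const apiUrlVite = import.meta.env.VITE_API_URL;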

DEV or PROD?

In CRA one could check the process.env.NODE_ENV value to see if the application was built in development or production mode. In Vite this should be changed to a check of the import.meta.env.DEV or import.meta.env.PROD boolean properties.
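
For example (a trivial sketch):

// CRA:
if (process.env.NODE_ENV === 'development') {
  console.log('Running a development build');
}

// Vite:
if (import.meta.env.DEV) {
  console.log('Running a development build');
}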

No need for %PUBLIC_URL%

While copying some things into the new index.html file from the CRA version, I forgot to remove the %PUBLIC_URL% placeholders; these should not be present in the Vite version.

Injecting values into views

In CRA I used preval.macro to inject a build timestamp into a diagnostic feature. Maybe it was possible to make it work in Vite too, but I went with a source transform in vite.config.ts instead:

plugins: [
  react(),
  {
    name: 'build-timestamp-placeholder',
    transform(src, id) {
      if (id.endsWith('Footer.tsx') && src.includes('BUILD-TIMESTAMP-PLACEHOLDER')) {
        const date = new Date();
        return src.replace('BUILD-TIMESTAMP-PLACEHOLDER', date.toISOString() + date.getTimezoneOffset());
      }
    }
  }
]

Update (2024-01-28): You can also define a global constant replacement to get a build timestamp. This Stack Overflow answer describes it nicely; it looks like a better solution. I'm leaving the part above because it won't hurt to know how to write a source transform :)
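
A minimal sketch of the define-based approach (the __BUILD_TIMESTAMP__ constant name is illustrative; for TypeScript you also need to declare it, e.g. in vite-env.d.ts):

import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react-swc';

// https://vitejs.dev/config/
export default defineConfig({
  plugins: [react()],
  define: {
    // Statically replaced in the bundled code at build time.
    __BUILD_TIMESTAMP__: JSON.stringify(new Date().toISOString())
  }
});

// vite-env.d.ts
// declare const __BUILD_TIMESTAMP__: string;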

Update (2024-05-02): There's one more thing you might want to tweak in the config: generated CSS module class names.

Missing robots

When I ran Lighthouse on the Vite version, it told me that the app was missing a robots.txt file (bad from a search engine perspective). Indeed, the file was missing, so I copied it to the public folder from the CRA version and the problem went away.

Missing caching on static files

In the CRA version, the static files of the built application (*.js, *.css, *.wasm, *.woff2...) were kept in a folder named static. In Vite they go to a folder named assets, which made Lighthouse rightfully angry about a missed opportunity for caching static resources. Since I run my app on NGINX, I had to update the rule that sets caching headers so it points at the assets folder instead of static.

This is what traffic looked like before enabling caching on static files:

Traffic without caching... Click to enlarge...

and this is with caching:

Traffic with caching... Click to enlarge...

Build output directory

When you run a build in CRA, the results go into a directory named build; in Vite they go to dist. I had a few scripts that assumed build, so these had to switch to dist.

Precise file extensions

Vite doesn't like it when a file has a *.js or *.ts extension but contains JSX. It turned out that I had one rogue file like that; renamed, fixed.

Tweaking lint rules

The default ESLint rules in Vite were a bit too harsh for my toy project, so I had to add a few exceptions to the .eslintrc.cjs config (like '@typescript-eslint/no-non-null-assertion': 'off').

Switching to Vitest

I've heard some good things about Vitest, and since full integration of Jest into Vite is currently a bit complicated, I decided to give Vitest a try.

This is the test config I have in vite.config.ts:

test: {
  globals: true,
  environment: 'jsdom',
  setupFiles: './src/test/setup.ts',
  // you might want to disable it, if you don't have tests that rely on CSS
  // since parsing CSS is slow
  css: true
},

and this is the entire content of src/test/setup.ts:

import '@testing-library/jest-dom';

Aside from the config, I had to change usage of jest.fn() to vi.fn() to make the tests that use function mocks work.
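
A minimal sketch of that change (the onClick mock is illustrative):

import { expect, test, vi } from 'vitest';

test('calls the handler', () => {
  // Before (Jest): const onClick = jest.fn();
  const onClick = vi.fn();

  onClick('hello');

  expect(onClick).toHaveBeenCalledWith('hello');
});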

Oh, one more thing: in CRA/Jest the test run complained about one of the files generated for the WebAssembly module (maybe it could've been fixed with transformIgnorePatterns in the Jest config), but in Vite/Vitest I don't see this problem anymore.

 

VITE ON MASTER

Once I was happy with how the app was working on the branch created for the migration, it was time to put Vite on the main (master) branch. This was very easy with the use of merge --strategy=ours (details) because I had refrained from making any changes on master while working on the migration. After the merge, I have a clean Vite setup while the Git history of files that existed before the migration is preserved. Nice.


RESULTS

On localhost

Here's some comparison between the CRA and Vite versions while working on localhost (both after migration to React 18.2.0):

  • Running the production build (average of 3 runs, duration as reported by the real line in the Linux time command output): CRA: 9s, Vite: 4.5s.
  • Starting the production build locally with serve -s build in CRA and npm run preview in Vite (average of 3 runs, with time counted from running the command to a functional app appearing in a new Chrome tab): CRA: 3s, Vite: 0.8s.
  • Observing a change in a screen while running in DEV mode (hot module reload): CRA: 0.5s, Vite: practically instant :)
  • Running tests: well, frankly speaking, I have (currently, yeah) very few tests in that hobby project, so I can't offer very meaningful numbers; I can only say that Vite/Vitest looks about 2.5x faster than CRA/Jest.

To sum up: Vite is noticeably faster, but that doesn't mean that the CRA version is slow (the difference will start to be significant in a larger application).

On server

What about the deployed application? Unfortunately, I don't have data from CRA after updating React, so here are the Lighthouse (navigation/desktop with forced clear storage) scores from CRA on React 16.9.0 (left) and Vite on React 18.2.0 (right):

Lighthouse scores for CRA vs Vite (navigation, desktop, with forced cache clear)... Click to enlarge...


These are the results for Vite on mobile (sorry, no data for the CRA version):

Lighthouse scores for Vite (navigation, mobile, with forced cache clear)... Click to enlarge...


This is the result for mobile but without forced clear storage (so this simulates a returning visitor and benefits from the static assets cache):

Lighthouse scores for Vite (navigation, mobile, without forced cache clear)... Click to enlarge...