2024 State of JS

The 2024 State of JS came out not too long ago, so I'm going to dive into some of the things that caught my eye.

Import Statements

Examples for all of the following can be found on the GitHub repo

There are two ways of importing a module in JS: statically or dynamically.

// static
import defaultModule from "module";
import { namedModule } from "module";

// dynamic
const dynamicModule = await import("module");

Both satisfy the same requirement of moving code from point A to B, but there are some differences I wasn't totally aware of.

Execution

As you can tell from the syntax above, unlike static imports, dynamic imports resolve to a Promise. This ability to dynamically import a module opens up some interesting use cases, especially if said module is slow, resource-intensive, and/or only conditionally needed.
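For instance, a heavy module can be fetched only at the moment it's needed. A minimal sketch, where "./heavy-charts.js" is a hypothetical module:

// Only pull in the (hypothetical) charting module when the user
// actually opens the analytics view.
async function showAnalytics() {
  const { renderChart } = await import("./heavy-charts.js");
  renderChart(document.querySelector("#chart"));
}

Since modules are cached in the module map after the first load, calling this repeatedly won't re-fetch the file.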

I'm guessing this is how React.lazy works, since both share the same signature.
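At least, the shapes line up. A sketch (the "./SlowChart" path is made up):

import { lazy, Suspense } from "react";

// React.lazy takes a function that returns the promise from a dynamic import.
const SlowChart = lazy(() => import("./SlowChart"));

export default function Dashboard() {
  return (
    <Suspense fallback={<p>Loading chart...</p>}>
      <SlowChart />
    </Suspense>
  );
}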

Live Bindings

Live bindings are something I've never experienced first-hand, but, honestly, my first thought is that they set us up for some nasty bugs 😳. Static imports return a Binding, which is a JS entity that is reactive. Meaning, it's read-only on the importing side, yet its value can still change, because the source module retains the ability to update it.

Probably best explained through code, so take this contrived example plucked from MDN:

// module.ts
export let value = "first value";
setTimeout(() => (value = "second value"), 500);

// main.ts
import { value } from "./module.ts";

console.log(value); // "first value"
setTimeout(
  () => console.log(value), // "second value"
  1500
);

Import Attributes

Import Attributes are useful when importing something that isn't JavaScript, such as a JSON file. Browsers enforce strict MIME type checking for module scripts and expect a JavaScript MIME type, so fetching a JSON file as a module results in the following error:

Failed to load module script: Expected a JavaScript module script but the server
responded with a MIME type of "application/json". Strict MIME type checking is
enforced for module scripts per HTML spec.

The solution is adding a { with: { type: "json" } } object as the second argument.

// fails
import("/data.json");
// succeeds
import("/data.json", { with: { type: "json" } });

WebSockets

I've always used a library to handle WebSockets, so most of this was abstracted away from me, but here's what I learned when trying to roll my own.

All code can be found here

Handshake

A WebSocket is initiated on the client by sending an HTTP GET request asking to upgrade the connection, where the browser sets some distinctive headers:

Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
Sec-WebSocket-Version: 13
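On the browser side, all of that is triggered just by constructing a WebSocket (the URL here is a placeholder):

const ws = new WebSocket("ws://localhost:8080");

ws.addEventListener("open", () => ws.send("hello"));
ws.addEventListener("message", (event) => console.log(event.data));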

The server responds by generating a Sec-WebSocket-Accept value from the Sec-WebSocket-Key header and returning a 101:

HTTP/1.1 101 Switching Protocols
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Accept: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=

There's a very specific formula for generating the Accept header. From what I can tell, it exists to prove the server actually understood the WebSocket handshake rather than blindly echoing back a cached response.

  1. Concatenate Key header with the magical string: "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"
  2. Take the SHA-1 hash of step 1
  3. Base64 the hash

Bada bing bada boom, a socket is established.
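In Node.js, the whole formula fits in a few lines. A sketch, checked against the sample key/accept pair from the handshake above:

import { createHash } from "node:crypto";

const MAGIC = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11";

// Steps 1-3: concatenate the key with the magic string, SHA-1 it, base64 it.
function acceptValue(secWebSocketKey) {
  return createHash("sha1")
    .update(secWebSocketKey + MAGIC)
    .digest("base64");
}

console.log(acceptValue("dGhlIHNhbXBsZSBub25jZQ==")); // "s3pPLMBiTxaQ9kYGzzhZRbK+xOo="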

Data Frames

All code can be found here

Establishing a connection is fairly straightforward. The complexity goes through the roof when extracting information from the messages, which take the form of "data frames". The general idea is that each frame is broken into four parts:

Part one contains the FIN and opcode fields. I see this as metadata for the data frame. It contains information such as the type of data (e.g., text, binary) and whether it's multipart (i.e., does this frame contain all the information, or will there be more to come?).

Part two contains the payload length. In other words, this informs our reader when to stop processing the data.

Part three is the "mask". For all client-to-server communication, the data is XOR-masked. This 4-byte mask serves as the key to decode the payload.

The final part is the payload.
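Putting those four parts together, here's a minimal sketch of decoding a single masked text frame, assuming the payload is under 126 bytes (longer payloads use extended length fields):

// buf is a Uint8Array containing one complete frame from the socket.
// Assumes the MASK bit is set, as it always is for client-to-server frames.
function parseFrame(buf) {
  const fin = (buf[0] & 0b10000000) !== 0; // part 1: is this the final frame?
  const opcode = buf[0] & 0b00001111;      // part 1: data type (0x1 = text)
  const length = buf[1] & 0b01111111;      // part 2: payload length (< 126)
  const mask = buf.subarray(2, 6);         // part 3: 4-byte masking key
  const payload = buf.subarray(6, 6 + length); // part 4: the payload itself
  // XOR each payload byte against the mask to decode it.
  const decoded = payload.map((byte, i) => byte ^ mask[i % 4]);
  return { fin, opcode, text: new TextDecoder().decode(decoded) };
}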

Astro

I've been sleeping on Astro, but it's f-ing cool. It's moving the industry in the direction I'm personally a big fan of: more SSR with more partial renders.

Astro achieves this partially-hydrate-on-the-client style of rendering through its islands architecture. Next.js has something similar with PPR, but the HTTP implementation differs. Astro sends over a payload to the client, then, within a JS module, makes another request for the deferred component. Next.js, on the other hand, creates a long-lived stream that doesn't close until all the deferred components have resolved.

Syntactically, they don't feel too different and I'd argue the DX is comparable:

/** ASTRO */

// components/slowComponent.astro
---
await new Promise((resolve) => setTimeout(resolve, 5000));
---

<div>Long awaited content :)</div>

// pages/index.astro
---
import SlowComponent from '@/components/slowComponent.astro'
export const prerender = false
---

<h1>Static Title</h1>

<SlowComponent server:defer>
  <div slot="fallback">Loading...</div>
</SlowComponent>

<p>More static content here</p>

/** NEXT.JS */

// components/slowComponent.tsx
export default async function SlowComponent() {
  await new Promise((resolve) => setTimeout(resolve, 5000));

  return <div>Long awaited content :)</div>;
}

// app/page.tsx
import { Suspense } from 'react'
import SlowComponent from '@/components/slowComponent'

export default function Home() {
  return (
    <>
      <h1>Static Title</h1>

      <Suspense fallback={<div>Loading...</div>}>
        <SlowComponent />
      </Suspense>

      <p>More static content here</p>
    </>
  );
}

Other Stuff

That's pretty much all I got. There are a couple of other features, such as Promise.allSettled, Rollup, and Astro + React, that I dinked around with in the GH repo. Take a look if you're interested.

As always, thank you for reading and let me know what you think!