View accompanying code on GitHub
The 2024 State of JS came out not too long ago, so I'm going to dive into some of the things that caught my eye.
Examples for all of the following can be found on the GitHub repo
There are 2 ways of importing a module in JS: statically or dynamically.
Both satisfy the same requirement of moving code from point A to B, but there are some differences I wasn't totally aware of.
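As a rough sketch (the module paths here are made up), the two forms look like this:

```js
// Static import: hoisted and loaded up front with the rest of the module.
import { parse } from "./parser.js";

const button = document.querySelector("#load-chart");

// Dynamic import: evaluated on demand and resolves to a Promise of the module's namespace.
button.addEventListener("click", async () => {
  const { renderChart } = await import("./chart.js"); // only fetched when this runs
  renderChart(parse("some,csv,data"));
});
```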
As you can tell from the above syntax, unlike static imports, dynamic imports resolve to a Promise. This ability to dynamically import a module opens up some interesting use cases, especially if said module is slow, resource intensive, and/or only conditionally needed. I'm guessing this is how React.lazy works, since both share the same signature.
Live bindings are something I've never experienced first-hand, but, honestly, my first thought is that they set us up for some nasty bugs 😳. Static imports give you a binding: a live reference to the exported value. It's read-only on the importing side, but its value updates whenever the source module reassigns it.
Probably best explained through code, so take this contrived example plucked from MDN:
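Paraphrasing the idea (not MDN's exact code, and the file names are hypothetical):

```js
// counter.js
export let count = 0;
export function increment() {
  count++; // the exporting module is free to reassign its own export
}
```

```js
// main.js
import { count, increment } from "./counter.js";

console.log(count); // 0
increment();
console.log(count); // 1 — the imported binding reflects the new value
// count = 5;       // TypeError: imported bindings are read-only on this side
```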
Import Attributes are useful when a response's Content-Type value is set incorrectly. For example, when fetching a JSON file, the browser is going to parse it as JS, which would result in the following error:
The solution is adding a { with: { type: "json" } } object as the second argument.
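For reference, both forms (the file names are placeholders):

```js
// Static form
import config from "./config.json" with { type: "json" };

// Dynamic form — the attributes object is the second argument
const { default: data } = await import("./data.json", { with: { type: "json" } });
```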
I've always used a library to handle WebSockets, so most of this was abstracted away from me, but here's what I've learned when trying to roll my own.
All code can be found here
A WebSocket is initiated on the client by sending an HTTP upgrade request, where the browser will set some distinct headers:
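The upgrade request looks roughly like this (the key below is the sample nonce from RFC 6455):

```
GET /chat HTTP/1.1
Host: example.com
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
Sec-WebSocket-Version: 13
```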
The server responds by generating a Sec-WebSocket-Accept value from the Sec-WebSocket-Key header and returning a 101:
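Roughly (this Accept value corresponds to the sample key above):

```
HTTP/1.1 101 Switching Protocols
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Accept: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
```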
I'm not sure why, but there's a very specific formula for generating the Accept header: concatenate the Sec-WebSocket-Key header with the magical string "258EAFA5-E914-47DA-95CA-C5AB0DC85B11", take the SHA-1 hash of the result, and base64 encode it.
Bada bing bada boom, a socket is established.
All code can be found here
Establishing a connection is fairly straightforward. The complexity goes through the roof when extracting information from the messages, which take the form of data frames. The general idea is that each frame is broken into 4 parts:
Part one contains the FIN and opcode fields. I see this as metadata for the data frame: it contains information such as the type of data (i.e., text, binary, etc.) and whether it's multipart (i.e., does this frame contain all the information or will there be more to come).
Part two contains the payload length. In other words, this informs our reader when to stop processing the data.
Part three is the "mask". For all client-to-server communication, the payload is XOR-masked, and this mask serves as the key to decode it.
The final part is the payload.
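To make that concrete, here's a minimal sketch of decoding a single, unfragmented, masked text frame in Node (payloads over 125 bytes need the extended length fields, which I'm skipping):

```js
// buffer is the raw frame as received from the client socket
function decodeTextFrame(buffer) {
  const fin = (buffer[0] & 0b10000000) !== 0;     // part 1: is this the final fragment?
  const opcode = buffer[0] & 0b00001111;          // part 1: 0x1 means "text"
  const length = buffer[1] & 0b01111111;          // part 2: payload length (<= 125 here)
  const mask = buffer.subarray(2, 6);             // part 3: 4-byte masking key
  const payload = buffer.subarray(6, 6 + length); // part 4: the masked payload

  const decoded = Buffer.alloc(length);
  for (let i = 0; i < length; i++) {
    decoded[i] = payload[i] ^ mask[i % 4];        // unmask byte-by-byte with XOR
  }

  return { fin, opcode, text: decoded.toString("utf8") };
}
```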
I've been sleeping on Astro, but it's f-ing cool. It's moving the industry in the direction I'm personally a big fan of: more SSR with more partial renders.
Astro achieves this partially-hydrate-on-the-client style of rendering through its islands architecture. Next.js has something similar with PPR, but the HTTP implementation differs. Astro sends over a payload to the client, then, within a JS module, makes another request for the deferred component. Next.js, on the other hand, creates a long-lived stream that doesn't close until all the delayed components have resolved.
Syntactically, they don't feel too different and I'd argue the DX is comparable:
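Something like this, using a hypothetical <code>Counter</code> client component (don't hold me to the exact APIs):

```jsx
// Astro (src/pages/index.astro): the island only hydrates when it scrolls into view
//   <Counter client:visible />

// Next.js with PPR: the dynamic part lives behind a Suspense boundary in a static shell
import { Suspense } from "react";
import Counter from "./Counter"; // hypothetical client component

export default function Page() {
  return (
    <Suspense fallback={<p>Loading…</p>}>
      <Counter />
    </Suspense>
  );
}
```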
That's pretty much all I got. There are a couple of other features, such as Promise.allSettled, Rollup, and Astro + React, that I dinked around with in the GH repo. Take a look if you're interested.
As always, thank you for reading and let me know what you think!