JavaScript to Machine Language

Bowen
Published in Nerd For Tech
Jul 9, 2021


First of all, I have recently read a portion of The Hellfire Club by Jake Tapper and have come to understand the taboo of uncited works. While I will do my best to include citations, I trust you (the reader) to make mature and educated decisions when reading the following. If you have any questions, I encourage you to do your own research; if you have any complaints, do the former first, then leave a comment with your improved perspective.

Without further ado, I introduce a subject that many modern engineers might find moot, yet one that is fundamental to how we all operate.

A rough idea of how your computer handles JavaScript (e.g. this article you’re viewing)

So, that is a lot to handle, and I think everyone would appreciate some definitions.

Interpreted Language - Real-time translation from one language to another, but not all the way to the base language (think of a translator translating for a foreign dignitary at the G2 Conference)
Transpiled - Batch translation from one language to another (think of translating one language to another in Google Translate... before it reaches the base language [please reference: here])
Compiled - Batch translation from one language to another (same as transpiled, except the batch translation reaches the base language without detouring)
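To make "transpiled" a little more concrete, here is a rough sketch (not tied to any particular tool, and not exactly what a real transpiler emits) of a modern JavaScript arrow function being translated down to older JavaScript. Note the result is still JavaScript, not the machine's base language:

// Modern JavaScript (ES2015+):
const greet = (name) => `Hello, ${name}!`;

// Roughly what a transpiler might emit as older (ES5) JavaScript,
// which is still human-readable JavaScript, not machine language:
var greetES5 = function (name) {
  return 'Hello, ' + name + '!';
};

console.log(greet('reader'));    // "Hello, reader!"
console.log(greetES5('reader')); // "Hello, reader!"

Compiling, by contrast, would carry the code all the way down to machine instructions rather than stopping at another human-readable language.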

With those definitions along with the diagram, you can tell that for JavaScript to be understood by the computer, it takes several steps to reach the point at which the computer executes the necessary code. However, this is only the fundamental idea of how you run web pages, and it only accounts for the “Your Computer” box in the diagram below.

The idea of how you access webpages (note: “access” & not “run”)

Although the main point is how JavaScript runs, it is still important to understand how you receive the digital packets containing code, assets, and files to actually render a page in your browser (the diagram above is a 10,000-foot view of how that particular part works).
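As a tiny illustration of that request/response dance (the URL here is only a placeholder), this is roughly how a page's HTML is fetched before the browser ever starts rendering:

// A minimal sketch: requesting a page's HTML the same way your browser
// requests documents, scripts, and other assets. The URL is illustrative.
fetch('https://example.com/')
  .then((response) => response.text())
  .then((html) => {
    // The browser parses HTML like this, then requests any scripts,
    // stylesheets, and images it references before rendering the page.
    console.log(html.slice(0, 200));
  })
  .catch((error) => console.error('Request failed:', error));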

Because your internet access only delivers the raw materials to your personal computer, performance matters for more than just servers; modern personal computers do a lot of the work themselves. Luckily, JavaScript jumping through all of the hoops in the first diagram of this article is not much of a problem these days. To be honest, your visits to Facebook, Wikipedia, etcetera are simple tasks for today's hardware. More network-intensive tasks such as YouTube, Netflix, & other nefarious ventures have more to do with your cable provider than your CPU. The only software that worries about micro-efficiency these days powers autonomous vehicles, rockets, Nvidia Ray-Tracing (via hardware-level mods), & a few other exceptional cases. All of which rely, first and foremost, on low-level languages.

The next most important thing, before we actually engage in the conversion of low-level / assembly into machine language, is to understand the computer a bit better (pun intended).

Arguably the calculator is the first “computer.” Others will argue that the first computer was the machine that decoded the Enigma (or the Enigma itself). Another cohort will argue it was the original textile machines. However, while I refuse to deny the influence of such intriguing contributions, I subscribe to the belief that the logic gate, built out of transistors or diodes, is the origin of the modern computer (image below; also essential to early calculators).

Fancy Image, circa 1948 (Click here for an explanation that doesn’t make your head hurt)

Anyways, if you clicked on the four previous links and were diligent enough to Google the original calculator, I feel it is safe to assume that you have now picked up some basic knowledge of how modern-day computers work.

Aside from quantum computers (which, to be honest, are still in early development), modern computers are the culmination of improved manufacturing processes. A “micro-processor” is simply a microscopic engineering feat. For example, a microprocessor might contain hundreds of millions of logic gates, built from billions of transistors (a rough estimate that is surely already outdated, since transistor counts have historically doubled roughly every two years, per Moore's law).
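To show just how simple those components are individually, here is a toy sketch (plain JavaScript booleans standing in for wires, not a real hardware description) of a one-bit half adder built from two basic gates:

// Basic gates modelled as boolean functions.
const AND = (a, b) => a && b;
const XOR = (a, b) => a !== b;

// A half adder: adds two single bits, producing a sum bit and a carry bit.
function halfAdder(a, b) {
  return {
    sum: XOR(a, b),   // 1 when exactly one input is 1
    carry: AND(a, b), // 1 when both inputs are 1
  };
}

console.log(halfAdder(true, true)); // { sum: false, carry: true } -> binary 10, i.e. 1 + 1 = 2

Chain enough of these together and you get the arithmetic circuits sitting inside every CPU.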

With the new understanding that computers are impressively orchestrated using simple components, it’s time to move on to how the conversion of low-level / assembly into machine language works.

Previously I have written about my abstract opinion of Binary (Link A, Link B), which is machine language at its core. While my writing might be esoteric, it does help an individual unfamiliar with Computer Science understand the basic principles. Below is an image I created for Link A that describes how a human (not a machine) might interpret (see the earlier note on real-time “interpreting”) binary.

Human Readable Binary

Unfortunately, that is not how Binary actually works. Binary is much more “stupid” (in the non-offensive way alluded to in Link B). However, before I go into how binary might render this webpage, or inversely how this webpage’s code might cause your computer to display the page itself, I wish to dissect the computer in the diagram below.

Diagram of Main Computer Components
CPU - Central Processing Unit
GPU - Graphics Processing Unit
Memory - Quick-access working storage (RAM)
Storage - Think of it like an old-school CD-ROM or floppy disk (or, these days, an SSD or hard drive)

Code can sit in storage, but it is loaded into memory and turned into machine instructions before the CPU can run it. The compilation process uses an assembly language as a stepping stone to translate the higher-level code into machine language.
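As a loose illustration (this is not actual output from V8 or any real compiler, just a sketch of the idea), here is a trivial JavaScript function alongside the kind of machine-level instructions a compiler might eventually reduce its integer case to:

// A trivial JavaScript function:
function add(a, b) {
  return a + b;
}

// Conceptually, for plain integers, a compiler might boil this down to
// a few x86-64 instructions along these lines (illustrative only):
//   mov eax, edi   ; copy the first argument into a register
//   add eax, esi   ; add the second argument to it
//   ret            ; return, leaving the result in eax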

The cool part, which I guess is a neat knick-knack, is that developers in the 70s didn’t actually need to write raw binary. In fact, there were a couple of ways to reduce the complexity of simply “true & false”. These shortcuts are mostly attributed to Decimal (base 10), Octal (base 8), & Hexadecimal (base 16) notations used in assembly language.

For example, here is how the decimal number 1232 translates to binary.
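Worked by hand, you divide by 2 repeatedly and collect the remainders, with a one-liner in JavaScript to check the result:

// 1232 / 2 = 616 remainder 0
//  616 / 2 = 308 remainder 0
//  308 / 2 = 154 remainder 0
//  154 / 2 =  77 remainder 0
//   77 / 2 =  38 remainder 1
//   38 / 2 =  19 remainder 0
//   19 / 2 =   9 remainder 1
//    9 / 2 =   4 remainder 1
//    4 / 2 =   2 remainder 0
//    2 / 2 =   1 remainder 0
//    1 / 2 =   0 remainder 1
// Reading the remainders from bottom to top: 10011010000

console.log((1232).toString(2)); // "10011010000"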

The same process applies for octal & hexadecimal, aside from the fact that the target base changes from 2 to 8, & then 16, with hexadecimal using the characters A-F for the values 10 through 15 (A -> 10; F -> 15).
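Here is the same number in each of those bases, using JavaScript's built-in conversions as a quick check:

const n = 1232;
console.log(n.toString(2));  // "10011010000"  (binary, base 2)
console.log(n.toString(8));  // "2320"         (octal, base 8)
console.log(n.toString(16)); // "4d0"          (hexadecimal, base 16; d = 13)

// And back again:
console.log(parseInt('4d0', 16)); // 1232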

Combine the integer-based instructions, memory locations, & the integers themselves, and the only thing missing is text. Well, fortunately, there are text encodings for the basic English alphabet & far beyond (such as UTF-8, UTF-16, UTF-32, utf8mb4, UCS-2, UCS-4, ASCII, etc.), & even emoji encodings.

Encodings don’t reinvent the process used by their three counterparts (memory locations, integer instructions, & the integers themselves); for example, the “8” in UTF-8 refers to its 8-bit code units. UTF-8 character codes are conventionally written in hexadecimal, which maps straight onto binary.

UTF-8
abcd = 01100001 01100010 01100011 01100100
where each character takes 8 bits;
a = 01100001
b = 01100010
UTF-8 working with hexadecimal
alphabet (english) = abcdefghijklmnopqrstuvwxyz
alphabet (hexadecimal) = 0x61 0x62 0x63 0x64 0x65 0x66 0x67 0x68 0x69 0x6a 0x6b 0x6c 0x6d 0x6e 0x6f 0x70 0x71 0x72 0x73 0x74 0x75 0x76 0x77 0x78 0x79 0x7a
Neat Trick (8, 4, 2, 1)
a = 0x61 (6 = 0 1 1 0, 1 = 0 0 0 1) = (8-bit binary 01100001)
b = 0x62 (6 = 0 1 1 0, 2 = 0 0 1 0) = (8-bit binary 01100010)
...
z = 0x7a (7 = 0 1 1 1, a (or 10) = 1 0 1 0) = (8-bit binary 01111010)
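You can poke at the same mapping from JavaScript; for plain ASCII letters the UTF-8 byte is simply the character's code, and TextEncoder reveals the raw bytes (including the multi-byte sequences emoji require):

// Character -> hexadecimal -> 8-bit binary, for ASCII letters:
for (const ch of 'abz') {
  const code = ch.charCodeAt(0);
  console.log(ch, '0x' + code.toString(16), code.toString(2).padStart(8, '0'));
}
// a 0x61 01100001
// b 0x62 01100010
// z 0x7a 01111010

// TextEncoder shows the actual UTF-8 bytes; an emoji takes four of them:
console.log(new TextEncoder().encode('a'));  // Uint8Array [ 97 ]
console.log(new TextEncoder().encode('😀')); // Uint8Array [ 240, 159, 152, 128 ]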

Finally, how does all this turn into the article you’re viewing right now?

Well, it’s a near-infinite multitude of simple steps that orchestrate complex feats at staggering speed. The thing you should be wondering is: “if my parents had to go to the library to look through the encyclopedia, what will my kids be looking through?”

Interestingly, however tangential it may be, an article I wrote on Micro-Optimizations in Video Games relates to this. While it covers a completely different subject, it does illustrate that when a large cohort of contributors builds off of each other's previous iterations, remarkable feats can be achieved. Both computers and video games are guilty of improving exponentially.

Enjoy the read?

Leave a clap or comment

Share with friends or on your favourite social platform
