diff --git a/docs/1. Charles Babbage and Ada Lovelace/index.html b/docs/1. Charles Babbage and Ada Lovelace/index.html index 15fe5716..cbc192d1 100644 --- a/docs/1. Charles Babbage and Ada Lovelace/index.html +++ b/docs/1. Charles Babbage and Ada Lovelace/index.html @@ -40,100 +40,110 @@
Human beings have made a number of tools to make math calculations more convenient and accurate. One of these tools, the abacus, was used by several ancient civilizations.
-
In Korea, the abacus was introduced from China around the 1400s, and it was used by individuals as well as banks until the 1980s. After that, computers replaced it in banks, and it is now rarely used even for personal calculations.
-
In the West, in the 17th century, Pascal and Leibniz made mechanical calculators using gears.
-
In 1822, the British mathematician Charles Babbage designed the Difference Engine, which, like the computers we use now, consisted of memory, an arithmetic unit, and input/output devices, but worked mechanically.
-
Ada Lovelace, born in 1815, was the daughter of George Gordon Byron, a leading Romantic poet in England. She grew up with a single mother because her father had abandoned the family early on. Since her mother was afraid that her daughter would take after her father, she allowed Ada to learn only mathematics and science, not literature.
-
"Why does my mom only want me to learn mathematics?"
Famous scientists tutored Ada at the time, including De Morgan, who is famous for De Morgan’s laws, and he recognized her great talent for mathematics.
-
"This allows the logical sum to be a logical product."
"This is De Morgan's law!"
Ada saw the difference engine created by Charles Babbage by chance when she was seventeen years old.
-
"This is the difference engine!"
"Wow, I would like to participate in your research."
To demonstrate the Analytical Engine, she created an algorithm for computing Bernoulli numbers, which is considered to be the first computer program.
-
"A good example is needed to explain the difference machine."
In her algorithm, Ada was the first to introduce loops, goto, and conditional control statements, concepts that are still central to programming languages. This is why she is called the world’s first programmer. There is also a programming language called Ada, named after her.
-
"Loop? Goto? If?"
However, Charles Babbage was not able to complete the Difference Engine or the Analytical Engine due to technical limitations, so Ada never got to run her algorithm.
-
diff --git a/docs/2. Alan Turing and Von Neumann/index.html b/docs/2. Alan Turing and Von Neumann/index.html index fa143933..e421c3f4 100644 --- a/docs/2. Alan Turing and Von Neumann/index.html +++ b/docs/2. Alan Turing and Von Neumann/index.html @@ -39,139 +39,152 @@
"When can I run my algorithm on the machine?"
"Well, I’m not sure if it will work.z"
-
Who first made the type of computer we use today? After the Second World War, many countries tried to develop computers, but it was Alan Turing, an English mathematician, who first proposed a theoretical model of the computer.
-
"I will find a way to prove Gödel's incompleteness theorems."
At the time, the famous mathematician David Hilbert had posed the question of whether there is an algorithm (a machine) that could solve every mathematical problem. Gödel's incompleteness theorems, proposed by the Austro-Hungarian mathematician, logician and philosopher Kurt Gödel, are widely interpreted as showing that Hilbert's program to find a complete and consistent set of axioms for all mathematics is impossible [5]. Turing wanted to approach these results in his own way, by designing a machine. Finally, in 1937, he published a paper, "On Computable Numbers, with an Application to the Entscheidungsproblem," which introduced the Turing machine. It is an abstract machine that processes the symbols written on a strip of tape according to a table of rules, and it defines a mathematical model of the computer [1]. The symbols written on the tape can be thought of as software.
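To make the tape-and-rule-table idea concrete, here is a minimal C sketch of a Turing-style machine. The three-rule table below, which simply flips 0s and 1s and halts at the first blank, is invented for illustration and is not taken from Turing's paper.

```c
#include <stdio.h>

/* One table entry: in (state, read) -> write a symbol, move the head, change state. */
struct rule { int state; char read; char write; int move; int next; };

int main(void)
{
    /* A made-up 1-state machine: scan right, flip 0<->1, halt (state -1) on a blank. */
    struct rule rules[] = {
        { 0, '0', '1', +1,  0 },
        { 0, '1', '0', +1,  0 },
        { 0, ' ', ' ',  0, -1 },
    };
    char tape[32] = "1011      ";      /* the "software": symbols on the tape */
    int state = 0, head = 0;

    while (state != -1) {
        for (size_t i = 0; i < sizeof rules / sizeof rules[0]; i++) {
            if (rules[i].state == state && rules[i].read == tape[head]) {
                tape[head] = rules[i].write;   /* write */
                head += rules[i].move;         /* move the head */
                state = rules[i].next;         /* change state */
                break;
            }
        }
    }
    printf("final tape: %s\n", tape);          /* prints "0100" plus blanks */
    return 0;
}
```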
-
“Wow, rewriting this document by hand is like copying the program.”
Alan Turing contributed to the Allied victory by creating a device called the Bombe, designed to break German ciphers during the Second World War [2]. At the time, computers were built for specific purposes. By the end of the war, the United States began building a general-purpose computer called ENIAC (Electronic Numerical Integrator and Computer). John Mauchly and his team began to develop ENIAC in 1943 and completed it in 1946 at the University of Pennsylvania. After that, the US military used ENIAC to calculate ballistics tables for missiles.
In the 1940s, programming was totally different from today’s. You had to set each instruction by connecting switches and replace the circuit boards in order to run another program. In addition, ENIAC was very heavy (30 tons) and consumed a great deal of energy, about 200 kW of electricity, because it used 18,000 vacuum tubes [3].
-
“Is this true coding?”
"It's just the beginning"
Later, the team that made ENIAC began to develop EDVAC, the world's first stored-program computer, which was delivered to the US Ballistic Research Laboratory in 1949. Von Neumann was involved in the development of EDVAC as a consultant and wrote the first draft of a report on it. He proposed a computer architecture in which a program and its data are stored in the same memory.
-
“EDVAC adopted binary representation for the first time.”
Today, computers still use the architecture he proposed. We call it the von Neumann architecture.
-
(From Wikipedia)
As you can see in the picture above, the von Neumann architecture consists of a CPU (Central Processing Unit), a memory unit, and input and output devices. The processing unit contains an arithmetic logic unit (ALU) and processor registers. The control unit contains an instruction register and a program counter. The memory unit stores data and instructions together. In general, the control unit executes an instruction by fetching it from memory, using the ALU to perform the operation, and then storing the result back in memory [6].
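As a rough illustration of that fetch-decode-execute cycle, here is a small C sketch of a toy stored-program machine in which instructions and data share one memory array. The three-instruction set, the opcode numbers, and the memory layout are invented for the example; they are not taken from EDVAC or any real machine.

```c
#include <stdio.h>

int main(void)
{
    /* Toy instruction set (made up): 1 = LOAD addr, 2 = ADD addr, 3 = STORE addr, 0 = HALT.
       Program and data live in the same memory, as in the von Neumann architecture. */
    int mem[16] = {
        1, 10,  2, 11,  3, 12,  0, 0,  0, 0,   /* program */
        7,  5,  0,  0,  0,  0                  /* data    */
    };
    int pc = 0;    /* program counter */
    int acc = 0;   /* accumulator     */

    for (;;) {
        int op  = mem[pc];                     /* fetch */
        int arg = mem[pc + 1];
        pc += 2;
        switch (op) {                          /* decode and execute */
        case 1: acc = mem[arg];  break;        /* LOAD  */
        case 2: acc += mem[arg]; break;        /* ADD   */
        case 3: mem[arg] = acc;  break;        /* STORE */
        default:
            printf("mem[12] = %d\n", mem[12]); /* 7 + 5 = 12 */
            return 0;                          /* HALT  */
        }
    }
}
```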
British government officials eventually found out about the EDVAC computer made in the United States and the report written by Von Neumann.
-
“The US already made a stored-program computer!”
They asked Alan Turing to build a stored-program computer like EDVAC. Starting in 1945 at the National Physical Laboratory, he finally had the opportunity to develop one, drawing on the ideas behind his Turing machine. The machine was named ACE (Automatic Computing Engine).
-
“We need a computer like EDVAC”
“I also have a good idea”
His report, published in 1946, was written later than von Neumann's EDVAC report, but it also included a detailed design for a stored-program computer. Moreover, the hardware was kept minimal, with arithmetic handled in software, the same design philosophy behind today's RISC CPUs. However, the development of ACE was delayed by his wartime work and the secrecy it required, and he returned to Cambridge for a sabbatical year in 1947 without having finished ACE [4].
-
"I already have a design. Why haven't they made a decision yet?"
Eventually, the Mathematical Laboratory at the University of Cambridge built the Electronic Delay Storage Automatic Calculator (EDSAC) in 1949, following the proven von Neumann design.
-
Alan Turing was ahead in the idea and design of the stored-program computer, but the actual implementation was led by the United States. In fact, since Turing spent a few years working on his Ph.D. at Princeton starting in 1937, while von Neumann was there as a professor of mathematics, von Neumann might have gotten ideas from Turing. However, von Neumann did not mention this in his report.
-
"Alan, can you explain your Turing machine a bit more?"
"Yes, sir."
They might have talked to each other like this.
In fact, Britain made the first computer, Colossus, in order to break German ciphers during World War II, but the machines and related information were destroyed after the war as part of the effort to maintain the secrecy of the project. Germany also built their own computers during World War II, but their research could not continue due to their defeat.
-
“We can use the machine for other purposes”
“No, we can’t. They would be destroyed because of confidentiality”
In the United States, immigrants such as Von Neumann contributed greatly to the development of early computers and continued to develop commercial computers. Finally, the computer era began with their efforts.
-
diff --git a/docs/3. The Era of Commercial Computers/index.html b/docs/3. The Era of Commercial Computers/index.html index c6d45404..bb7dbeec 100644 --- a/docs/3. The Era of Commercial Computers/index.html +++ b/docs/3. The Era of Commercial Computers/index.html @@ -39,100 +39,109 @@
Von Neumann, a Hungarian immigrant, worked on the design of the stored-program computer.
Alan Turing obtained a Ph.D. from Princeton University and proposed a theoretical model of the computer.
Kurt Gödel was born in Austria-Hungary and later became an Austrian, and eventually an American, citizen. His research on the incompleteness theorems led to the birth of the Turing machine.
-
Computers were developed in earnest during World War II and were mainly used for military purposes: breaking German ciphers and calculating the ballistic range of missiles.
-
"Has the ballistics calculation been completed?"
"I have not gotten the results yet."
Some engineers involved in early computer development foresaw the commercial potential of computers.
-
"Can we sell EDVAC to government agencies other than the military?"
"That’s a good idea; why not start a company for this opportunity?"
In 1947, John Eckert and John Mauchly, who had developed ENIAC and EDVAC, founded the world's first computer manufacturing company, the Eckert-Mauchly Computer Corporation (EMCC). After that, they developed UNIVAC, a successor to EDVAC, and delivered it to the U.S. Census Bureau.
-
“Are you getting the average life expectancy?” “Wait a minute.”
The company was then expected to supply UNIVAC via contracts with the Army, Navy, and Air Force. However, those contracts were eventually cancelled in 1950 after some employees were suspected of being communists during the McCarthy era.
-
“How could communists be developing a computer for the US military?”
“That’s a misunderstanding. We do not have any communist employees."
Mauchly was also suspected and forced to leave the company, and it took him two years to get back to work. In the meantime, the company was in financial difficulty and it was eventually sold to Remington Rand in early 1950.
-
“Get Bolsheviks out!”
In fact, the idea behind the von Neumann architecture originated with Mauchly and Eckert, but they were not credited in von Neumann's paper, First Draft of a Report on the EDVAC, dated June 30, 1945. Their business also struggled, partly for political reasons.
In the 1950s, a number of companies began to make commercial computers. IBM produced punch card systems and also released its first computer, the IBM 701, in 1952.
-
“Hmm, this machine is still using vacuum tubes and doesn’t have a monitor and keyboard yet. The memory size is only 36 bits x 2048.”
In particular, Fortran was developed by IBM for the 701's successor, the IBM 704, released in 1954, and LISP was also developed for that machine.
-
“I proposed Fortran in 1953 and completed its development in 1957. This was the first compiler with optimization.”
“I was surprised to learn that there were compilers in the 1950s.”
Note: John Backus is the creator of the Fortran programming language.
In 1953, IBM introduced the IBM 650, the first mass-produced computer. It used a magnetic drum to store programs, which provided faster access than punched-card storage. This model was relatively inexpensive and became popular in universities, so many students started to learn computer programming on this machine.
-
diff --git a/docs/4. how did people write code in the early days of computing?/index.html b/docs/4. how did people write code in the early days of computing/index.html similarity index 62% rename from docs/4. how did people write code in the early days of computing?/index.html rename to docs/4. how did people write code in the early days of computing/index.html index ad22292b..dcbd5972 100644 --- a/docs/4. how did people write code in the early days of computing?/index.html +++ b/docs/4. how did people write code in the early days of computing/index.html @@ -38,102 +38,123 @@
“I also started computer programming with IBM 650 for the first time.”
When computers were first introduced commercially, how did people write code for them? In fact, there was no software similar to what we have today. All functionality was implemented as functional units in circuits. ENIAC was able to run various programs by wiring its functional units together, and it used punch cards as a storage device [1].
-
+"This is a computer for arithmetic operations"
-
“That is a computer for breaking German ciphers”
"Can I make a computer for ballistics calculation?"
"How many relays and vacuum tubes are needed for this?"Actual computer programming has been possible since stored-program computers such as EDVAC and EDSAC were introduced in 1949. Basically, the program can be executed after it is loaded into memory.
-A program is made up of instructions that the machine can understand, which is called machine code. The machine language is difficult for humans to understand because it consists only of binary numbers such as 0 and 1.
+
Actual computer programming has been possible since stored-program computers such as EDVAC and EDSAC were introduced in 1949. Basically, the program can be executed after it is loaded into memory.
+A program is made up of instructions that the machine can understand, which is called machine code. The machine language is difficult for humans to understand because it consists only of binary numbers such as 0 and 1.
-
That is why assembly language was born in the early age of computer programming. For example, users wrote executable code on EDSAC using an assembly language called Initial Orders. Below is an example of the assembly language used on EDSAC [2].
-
All machine language instructions used in EDSAC are 17 bits long. The first field is the operation code; the second field, a single bit, is unused; the third field is the operand, representing an address; and the last bit indicates whether the instruction is 17 or 35 bits long.
What does the mnemonic T 0 S mean? It stores the value of the accumulator A into m[0] and then clears the accumulator. The second instruction, H 2 S, loads the multiplier register R with the data in m[2].
As you can see, machine code is just binary, so it is difficult for humans to understand and remember. Therefore, assembly language represents each low-level instruction, or opcode, with a mnemonic. Converting assembly language into machine code is called assembling.
-
Assembly language == assembling ==> machine code
In the early days, assembling was done by hand, so the term hand assembling was used. Assembly language has been around since the 1950s. Early programmers had to know assembly language because there were no high-level languages at the time.
If there is no assembly tool, you still have to assemble the code by hand, looking up each mnemonic in a conversion table.
“I’m writing code”
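The C sketch below shows what such a mnemonic conversion table might look like for a 17-bit order of the kind described above: a 5-bit opcode, one unused bit, a 10-bit address, and a final length bit. The mnemonics follow the examples in the text, but the opcode bit patterns are placeholders, not the real EDSAC encoding.

```c
#include <stdio.h>
#include <string.h>

/* Toy hand-assembler in the spirit of EDSAC's Initial Orders.
   Layout assumed from the text: [5-bit opcode][1 unused bit][10-bit address][1 length bit].
   The numeric opcodes below are placeholders for illustration only. */
struct mnemonic { const char *name; unsigned opcode; };

static const struct mnemonic table[] = {
    { "T", 0x05 },   /* transfer accumulator to memory and clear it (placeholder code) */
    { "H", 0x15 },   /* load the multiplier register                (placeholder code) */
};

static unsigned assemble(const char *op, unsigned addr, unsigned longbit)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        if (strcmp(table[i].name, op) == 0)
            return (table[i].opcode << 12) | ((addr & 0x3FF) << 1) | (longbit & 1);
    return 0;        /* unknown mnemonic */
}

int main(void)
{
    printf("T 0 -> %05x\n", assemble("T", 0, 0));
    printf("H 2 -> %05x\n", assemble("H", 2, 0));
    return 0;
}
```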
However, keyboards and monitors did not become commercially available until the 1960s. The first system with a monitor and keyboard was Multics, jointly developed by Bell Labs and MIT in 1964 [3], and by the 1970s most computers had a terminal with a screen and keyboard. So how could programmers write code and check their results without a monitor and keyboard in the 1960s?
-
"Finally, I got a computer with a keyboard and a monitor"
Early programmers used punched cards to write code. Punched cards had been used for data storage since the late 19th century, when the US Census Bureau used them for the census. Think of them as something like today's OMR answer sheets.
IBM had developed punch card systems and was supplying them worldwide, so punch cards became an essential part of early programming.
-
Punch card for Fortran programming
For example, early programmers used punch cards as a development tool. First, the programmer wrote assembly code on paper, then debugged it by running the code in their head. Once they were convinced there were no more errors in the code, it was hand-assembled into machine code and punched onto cards line by line.
-
“I need to write the code on the punch card”
+
Programmers took the punch cards to the operator of the computer room. The operator fed the cards into the punch card reader, and the computer executed the code loaded from the reader. In reality, the programmer had to wait in line to hand the cards to the operator and then wait a long time for the execution results. If something was wrong with the result, they had to repeat the whole process until they got the result they wanted.
“Can you take a look at my code?”
“Next person”
The interesting thing is that copying a punch card is the same as copying a program, so at the time it was possible to copy a program by hand.
-
References
[1] Celebrating Penn Engineering History: ENIAC
[2] EDSAC Initial Orders and Squares Program
[3] History of the Computer Keyboard
Until the 1960s, hardware was the main focus of computer engineering. Computers and peripheral devices were large machines that filled entire rooms; they were costly and required significant manpower to operate. Unlike today, software was not yet a separate field of engineering or science.
-
Mathematicians were involved in the invention of early computers, so in the early days it was natural for students majoring in mathematics to take up computer programming.
-
People who majored in science or engineering learned programming because computers let them do mathematical calculations and experiments that had previously been done by hand. Some of them fell in love with programming and changed careers to become programmers.
-
Dennis Ritchie, the creator of C and co-creator of Unix, originally studied physics and applied mathematics at university.
Margaret Hamilton participated in the Apollo 11 mission in the 1960s and developed the lunar lander software. Notably, she observed that software development was not taken as seriously as engineering and was not even considered a science.
After studying mathematics in university, she began working as a programmer at MIT to support her husband's Ph.D. At that time, without any proper software development training, she joined the development of a program to predict the weather.
-
Following this, she became a systems programming expert and participated in the Apollo 11 mission. At the time, the Apollo 11 mission included no budget or schedule for software development, and there was no mention of software in the requirements [1]. However, the software was important enough to control the flight of the spacecraft and the lunar lander. Margaret took her daughter to the office on weekends so she could develop more reliable software.
-
Eventually, in 1968, as many as 400 people participated in the development of the lunar lander software, and the Apollo 11 mission was successfully completed.
-
In particular, she first used the term “software engineering” at the beginning of the Apollo missions so that software could be recognized as an independent domain like hardware. She also helped create coursework on computer programming at MIT [4].
An interesting fact is that until the 1960s many software developers were women, a striking contrast to today, when men dominate the programming field. At the time, software development was considered less important than hardware development, so it was mainly assigned to women, who were also paid less than men [2]. If you look for black-and-white photos of early computers on the Internet, you will see many women in front of the machines.
-
All of the people who programmed ENIAC by wiring its functional unit boards were women [3].
-
Grace Hopper, who created the first compiler and is credited with the first computer debugging, was also a woman, with a doctorate in mathematics. In 1947, Hopper was programming the Mark II when, one day, her punch card input stopped working.
-
Eventually, she found out that the computer wasn't working properly because of a dead moth inside a relay.
-
She taped the moth into her logbook and left a note saying that it was the first computer bug.
-
In the early days of computing, women played a pioneering role in software development and laid the foundation for software engineering.
Note: These comics were written with a reference to “A Brief History of Hackerdom” by Eric S. Raymond.
-
-
-
If you visit the Wikipedia page on the History of Programming Languages, you will find a list of the programming languages that were developed in the early days of computing. You may encounter some languages on the list that you've never heard of before - these are likely to be ancient, obsolete languages that are no longer in use.
Of course, we must also acknowledge the contributions of female programmers during this time period.
This culture of “real programmers” advanced computing and networks. In addition, it has evolved into the open source hacker culture of today [1].
-
The early hacker culture began when MIT received its first PDP-1 computer in 1961 [1]. Some students who came across the PDP-1 developed the first video game, Spacewar!, for fun, and also made a text editor and a chess game. They were also the first to play computer music.
-
In the YouTube video below, you can see the music playing and Spacewar! running on PDP-1.
-
At the time, the MIT Artificial Intelligence Lab was the birthplace of hacker culture. In the late 1960s, they developed a time-sharing operating system called ITS (Incompatible Timesharing System) that ran on the PDP-10, and a language called LISP, both of which were made available free of charge to other universities and corporations. After the early form of the Internet, the ARPAnet, connected these sites, the hacker culture spread to other universities and institutions along with ITS.
-
The hacker culture started with the Tech Model Railroad Club at MIT. This club built model railroads and studied how to keep the moving trains from colliding with each other [1].
-
For reference, the video below shows why a computer is needed to control a moving model train.
-
The PDP models made by DEC contributed greatly to the hacker culture and the birth of free software. They were minicomputers sold at relatively low prices and were particularly popular in universities. For reference, DEC donated a PDP-1 to MIT in 1961 [2].
-
After that, students interested in train control in the Tech Model Railroad Club spent more and more time with the PDP-1.
-
And, for fun, the students created the first ever video game called Spacewar!
-
At the time, the MIT Artificial Intelligence Lab (AI Lab) was the birthplace of hacker culture, and it was developing an operating system called Multics together with GE and Bell Labs. However, due to disagreements over the direction of the operating system, the lab began developing its own operating system, ITS (Incompatible Timesharing System), in the late 1960s.
-
MIT hacker Tom Knight (right) developed the first kernel for ITS.
-
Actual development started on the PDP-6, and it was all written in assembly language.
-
At the time, the ITS operating system had a user environment unlike anything found today. In the early days, anyone could log in to the system without a password, and all files, including documentation and source code, could be edited by anyone.
-
In addition, it was possible to access ITS not only inside MIT but also from other institutions and schools through the ARPAnet. The wide-open ITS philosophy and its collaborative online community had a great impact on hacker culture, and also on the free/open source and wiki movements [3].
-
Richard Stallman, who later started the free software movement, also participated in the development of the ITS operating system as a member of the community while working at the MIT AI Lab from 1971, where he was influenced by the hacker culture.
-
-
+
In the 1960s, the Incompatible Timesharing System (ITS) was being heavily developed at MIT. Meanwhile, at another location on the east coast of the United States, there was another lab with the same hacker spirit: AT&T Bell Laboratories.
There, the groundbreaking Unix operating system and the C language, which would go on to change the world, were being developed.
-
Here is MIT. Meanwhile, at Bell Labs...
“I feel the force somewhere…”
Coincidentally, people who had worked on Multics went on to work on ITS and Unix, including Ken Thompson and Dennis Ritchie at Bell Labs.
-
"I think we need to step away from the Multics project now."
“Yeah, the development time has become way longer than we expected."
The Multics project began in 1964, but due to the large code size and complexity, the schedule fell far behind Bell Labs' expectations.
-
In 1969, Bell Labs pulled out of the development of Multics.
-
Based on his experience developing Multics, Ken Thompson created a new operating system by himself at Bell Labs.
-
Ken Thompson reimplemented in Unix many of the key features he had developed for Multics.
-
He adapted the file system he had already implemented for Multics to Unix on the PDP-7, and Dennis Ritchie joined him in the development. Once the development was well underway, a team was organized, and they began implementing, for the first time on the PDP-7, the operating system features we still use today, such as the file system, the process model, device files, and a command-line interpreter.
-
Then the PDP-11 was introduced, whose CPU instruction set differed from the PDP-7's.
-
-
-
The B language was also developed by Ken Thompson, with contributions from Dennis Ritchie, in 1969 for early Unix.
-
In 1971, Dennis Ritchie added a character type to the B language and rewrote the compiler to generate PDP-11 machine code [3].
-
-
In 1973, the basic functionality was complete, and the language was called C, simply the successor to B.
-
Dennis Ritchie began rewriting Unix in C that same year.
-
Dennis added the structure type to the C language to define the user's custom data. Now, the C language is powerful enough to write Unix kernels.
-
Although Unix and C were created in a short period of time by Ken Thompson and Dannis Richie, most computers, including cell phones, still run on OS based on Unix today. In addition, operating system kernels are still developed in C today.
-
These comics offer a brief introduction to the history of Free and Open Source Software (FOSS), including early computer and software developments. Originally written in Korean at https://joone.net, they will be published as a book in 2024.
If you find any errors or have suggestions, please leave a comment on the comic or submit a pull request on GitHub.
Your support will contribute to maintaining the domain name, upgrading devices and gear, and purchasing reference books. You can sponsor this comic project by following this link. diff --git a/docs/index.html b/docs/index.html index 76dc6c10..52d9484a 100644 --- a/docs/index.html +++ b/docs/index.html @@ -51,7 +51,7 @@