
What is Computer Science?

An Introduction to Computer Science as a Branch of Science


In the early days of computers, at least in the sense of what a computer is widely understood to be today, a computer scientist wouldn't have called themselves a computer scientist. In fact, the first university computer science degree program was offered in 1953, at the University of Cambridge. Going back to the mid-19th century, about a century before that first degree was offered, Charles Babbage, an English mathematician, designed what is considered the world's first general-purpose computer, the Analytical Engine. The machine looked very different from modern-day computers, but it shared some of their key characteristics. While Babbage was regarded as a mathematician, among many other titles, today he could easily be considered a computer scientist.

Today, computer science and computer engineering are two separate fields of study at institutions of higher learning. While a person studying one field may well be knowledgeable in both, it makes sense to distinguish the two. Computer engineering tends to focus on hardware, software and network design. A computer engineer should be well-versed in physics and electrical engineering in order to understand the limitations of computing technology, how to work efficiently and effectively within those limitations, and how to push past them. Computer science, on the other hand, tends to focus on software, web technologies, databases, algorithms and other topics built around the many use cases of computer hardware. A computer scientist should be strong in mathematics and critical thinking, as maximizing the potential of computers requires a solid understanding of computational efficiency. Here, we will mainly focus on computer science. However, since computer science and computer engineering overlap so strongly, some engineering topics will also be discussed.

The foundation of computer science is mathematics. That's not to say that studying computer science means wrestling with complex equations whose results trail off into infinity. However, the concept of a function, sending input through a process to produce a particular output, is a common task in writing code. Similarly, comparing equality or inequality, a routine task in mathematics, is just as routine when making logical decisions in programming. A solid understanding of mathematics, particularly algebra and calculus, aids in developing a solid understanding of the foundations of computer science. It also helps with the more advanced areas of computer science, such as image recognition, which requires an understanding of differential equations. At their core, the logic that mathematics and computer science are built on is what makes them such interconnected fields.
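As a minimal sketch of that parallel, the short Python snippet below (the function and values are made up for illustration) shows a mathematical-style function mapping input to output, and a comparison driving a logical decision:

```python
# A function in code mirrors a function in mathematics:
# it maps an input through a fixed process to an output.
def f(x):
    return 3 * x + 2          # f(x) = 3x + 2

# Comparisons, like equalities and inequalities in algebra,
# drive the logical decisions a program makes.
value = f(4)                  # value == 14
if value > 10:
    print("f(4) is greater than 10:", value)
else:
    print("f(4) is at most 10:", value)
```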

It is hard to say exactly when the first computer was conceived or constructed, which would subsequently tell us who the first computer scientists were or give us a concrete history of computer science as a branch of science. However, there is enough evidence to provide a good idea of how modern-day computers were conceived and of the important individuals (often working as part of teams) involved in the process.

The abacus is known to have been a tool of ancient Babylonian and Sumerian societies over 4,000 years ago. In its time, the tool would have been used to perform various calculations. The importance and value of the abacus was so great that variations of the tool were used all around the world for centuries after the ancient Babylonians and Sumerians. Its main purpose, efficiently performing computations, is what allows it to be regarded as the earliest non-human computer in documented history.

Fast forward a few thousand years and we have Leonardo da Vinci. Among the numerous achievements in his lifetime, da Vinci managed to create the blueprints for a mechanical calculator, something that did not exist at the time. While it's hard to say what da Vinci did with the blueprints after making them, the blueprints would have produced a working mechanical calculator. IBM decided to prove this in 1968 when they successfully coordinated a project to do so.

About a century after da Vinci, the 1600s produced some notable events and individuals that helped create the field of computer science as we know it today. In 1614, John Napier published Mirifici Logarithmorum Canonis Descriptio (Description of the Wonderful Canon of Logarithms). This book introduced the world to the mathematical concept of logarithms. With its 90 pages of tables of trigonometric functions and their natural logarithms, the book aided computer scientists and engineers for many years after its publication. Around the same time, Edmund Gunter, an English mathematician and astronomer, was working on logarithm tables of his own while also producing Gunter's rule, a tool used, of course, to perform calculations. Gunter's rule led to the development of another form of logarithmic rule, the adjustable logarithmic slide rule. This new rule, developed by William Oughtred, saved time and added precision to the calculations of the day.
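To see why logarithm tables and slide rules were such time-savers, recall the identity log(a·b) = log(a) + log(b): a slow multiplication can be replaced by two table look-ups and an addition. A rough Python illustration of the idea (the numbers are arbitrary):

```python
import math

a, b = 347.0, 29.0

# Multiplying directly:
direct = a * b

# The slide-rule idea: look up the logarithms, add them,
# then convert back with the antilogarithm (exponentiation).
log_sum = math.log10(a) + math.log10(b)
via_logs = 10 ** log_sum

print(direct)     # 10063.0
print(via_logs)   # ~10063.0, up to floating-point rounding
```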

From these inventions came the first mechanical calculators, consisting of rotating wheels that allowed calculations to be performed at the turn of a dial. Wilhelm Schickard is regarded as having produced possibly the first mechanical calculator, in 1624. However, his attempt was not a great success and the calculator he built often broke. In 1642, Blaise Pascal was working on his own version of a mechanical calculator that could add and subtract. His attempts were much more successful than Schickard's. In fact, Pascal's device worked so well that he produced a number of them and was able to sell them. A few decades later, in the 1670s, Gottfried Wilhelm Leibniz began work on the stepped reckoner, an improvement on Pascal's device that could multiply and divide in addition to adding and subtracting. While this device was never produced for sale and only two were made, its capabilities were ahead of its time. In addition to the stepped reckoner, Leibniz also wrote an article titled Explication de l'Arithmétique Binaire (Explanation of Binary Arithmetic), which introduced the binary system of 1s and 0s to the masses. This work, the combination of mathematical advancement and mechanical engineering, helped lay the foundations of the technology that would later be used to build the machines we all know as computers today.
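Leibniz's binary system is the same one digital computers use today. As a quick illustration in Python (the specific numbers are just examples), any decimal number can be written using only 1s and 0s, and arithmetic carries over to that notation directly:

```python
# Decimal numbers expressed in Leibniz's binary notation.
a, b = 13, 6
print(bin(a))          # 0b1101
print(bin(b))          # 0b110

# Arithmetic behaves the same regardless of the base used
# to write the numbers down.
total = a + b
print(bin(total))      # 0b10011, i.e. 19 in decimal
print(int("10011", 2)) # 19, converting binary back to decimal
```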

Building on what had been established, Joseph Marie Jacquard found a way to automate the production of fabric in the early 1800s. Drawing on the binary ideas Leibniz had laid out, Jacquard used cards with holes punched in specific slots, simulating an on/off system, to efficiently and repeatedly produce fabrics with various patterns. This invention opened the world up to the idea of a complex machine being run on a binary system of 0s and 1s.

Some time in the 1820s, Charles Babbage, a mathematician and mechanical engineer, built the Difference Engine, another variety of mechanical calculator, which could tabulate polynomial functions, a mathematical process performed regularly in science and engineering. As it sounds, tabulating polynomial functions by hand can be a daunting task; Babbage found a way to automate it. While the portions of the machine that were built worked, Babbage struggled to find the financing to complete the project. Toward the end of his time working on the Difference Engine, Babbage realized he could make a better machine using a more generalized design. This machine became the Analytical Engine. Like the Jacquard loom, the Analytical Engine used punch cards to operate. Augusta Ada King, Countess of Lovelace, wrote programs for Babbage's machine and is therefore regarded as the first computer programmer. This was the first design for a computing machine that could perform all arithmetic operations on any set of input in order to produce a consistent output based on the input the machine was given.
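The trick the Difference Engine mechanized is the method of finite differences: once the first few values of a polynomial are known, every later value can be produced with additions alone, no multiplication required. A small Python sketch of the idea (the polynomial below is just an example):

```python
# Tabulate p(x) = x**2 + x + 1 for x = 0, 1, 2, ... using only
# additions, the way the Difference Engine did mechanically.
def p(x):
    return x * x + x + 1

# Seed the "machine" with the first value and the first differences.
value = p(0)                               # p(0) = 1
first_diff = p(1) - p(0)                   # 2
second_diff = (p(2) - p(1)) - first_diff   # constant 2 for a quadratic

table = []
for x in range(8):
    table.append(value)
    value += first_diff          # addition only
    first_diff += second_diff    # addition only

print(table)                     # [1, 3, 7, 13, 21, 31, 43, 57]
print([p(x) for x in range(8)])  # matches the directly computed values
```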

In 1854, George Boole published An Investigation of the Laws of Thought on Which Are Founded the Mathematical Theories of Logic and Probabilities. This book contained the foundations of Boolean algebra, a form of algebra in which any logical process can be represented mathematically. A few decades after Boole's discoveries, in 1886, Charles Sanders Peirce showed that these logical operations could be carried out by electrical circuits. The beginning of electronic computing technology had finally arrived.
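Boolean algebra reduces logic to a small set of operations, AND, OR and NOT, over the two values true and false, which is exactly what programming languages expose as boolean expressions. A brief illustration in Python (the variable names are made up):

```python
# Boolean algebra in code: two truth values and three core operations.
raining = True
have_umbrella = False

stay_dry = (not raining) or have_umbrella   # NOT and OR
get_wet = raining and (not have_umbrella)   # AND and NOT

print(stay_dry)   # False
print(get_wet)    # True

# Every row of a truth table can be enumerated the same way.
for a in (False, True):
    for b in (False, True):
        print(a, b, a and b, a or b, not a)
```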

The advancement of computing technologies began to really take hold just in time for the 20th century. Herman Hollerith implemented a punch card system to help automate the tabulation of the 1890 United States census. Hollerith's company would go on to become part of the Computing-Tabulating-Recording Company in 1911, which would later change its name to IBM. A few decades later, Alan Turing was engineering an electromechanical machine, the Bombe, to help Britain break the German Enigma cipher during WWII. The decryption of German communications it enabled is recognized as a crucial component of the Allies' victory over the Axis powers. Around that same period, a number of computing devices began to be produced. John Vincent Atanasoff, with funding from his school, Iowa State College, began work in 1937 on what became the Atanasoff-Berry Computer. That same year, Howard Aiken began planning his Mark series of computing devices. A couple of years later, William Hewlett and David Packard founded Hewlett-Packard (HP) in Palo Alto, California.

A number of historic computing devices were produced in this period, such as the Z3 and Z4, built by Konrad Zuse; the ENIAC, designed by John Mauchly and J. Presper Eckert for the US Army; and the Electronic Delay Storage Automatic Calculator (EDSAC), built at the University of Cambridge. With all these devices, tools for programming them were needed. Grace Hopper, a Ph.D. mathematician trained at Yale, helped here with her development of one of the first compilers, the A-0 system, in 1952 while serving in the US Navy Reserve; her later work went on to shape COBOL. It was around this time, in 1953, that the first computer science degree program was offered at the University of Cambridge. The foundations of computer science had been built by minds specializing in fields such as mathematics, mechanical engineering and electrical engineering. Early computers were built to make mathematical computations more efficient. Advancements in mathematics, among other fields, continue to drive the advancement of the computing devices we use today. While the computers we interact with most today are digital devices, operating on electrical signals alone, some computers are analog or hybrid machines that use mechanical as well as electronic components. As it was in the early days of the field, a major goal of computer science remains the simplification of tasks, and the means of achieving that goal is the application of computers.

There are many types of computers in the world and many applications for them. As a result, there are various fields of computer science, all essentially revolving around the idea of simplifying or optimizing tasks. If you're writing software, you're creating instructions for a machine that can be used repeatedly, yet only have to be written once. Maybe you're building a website that can be accessed by people all around the world, yet you can run it from your bedroom (kind of). Maybe you're obsessed with data and decide to collect the data a computer or computer program produces and study it; you can create a software tool that helps analyze that data and extract valuable information from it. You can even spend time developing a new programming language that improves the software development process and experience. These are just a few examples of the different paths that can be taken when deciding to study computer science. The exact technologies used in each example may differ, but one thing remains the same: each is focused on simplifying and optimizing tasks through the use of computing hardware and software. To do this efficiently and effectively, a good understanding of computing technologies is needed. So, while you may not need to be a master of machine language to write good code in a high-level language, having a decent understanding of underlying technologies like machine language can make you a master at writing code in a higher-level language like Java or Python. Understanding the limits and potential of various computing systems and technologies allows for optimal use of the capabilities they offer.
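As one small illustration of looking beneath a high-level language, Python ships a `dis` module that prints the bytecode instructions the interpreter actually executes for a function. The example function here is made up, and the exact instruction names vary between Python versions:

```python
import dis

def average(numbers):
    """Return the arithmetic mean of a list of numbers."""
    return sum(numbers) / len(numbers)

# Disassemble the function to see the lower-level instructions
# (loads, calls, a divide, a return) behind one line of Python.
dis.dis(average)
```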

Therefore, computer science is not only the study of the technologies that computer hardware and software make possible, it is also the study of the underlying mechanisms that make those technologies work. The world of computers has grown drastically and continues to grow, and the number of fields that fall under computer science as a branch of science continues to grow with it. Blockchain and artificial intelligence, for example, are attractive fields that have gained a lot of steam recently. Network architecture and database administration, on the other hand, aren't as glamorous to most but are extremely valuable fields in which to become an expert. Whichever path one chooses in the world of computer science, the knowledge gained can only benefit the digital planet Earth has become.