The story of computer science

--

These days, hardly an hour goes by without a computer getting involved in your life somehow, and that has only been amplified since the arrival of COVID-19.
Getting some eggs? Check.
Talking with friends? Check.
Showing your homework? Check.
Watching Black Widow beat guys up? Check.

Family of three watching something on a phone with a laptop in front of them.

Every piece of tech you use goes through many stages of design. Someone lays out the circuits; someone chooses which electronic components to use; someone wraps all those wires and chips in a good-looking cover; and finally, nerdy people sitting in front of computers install the software so you can shoot down your opponents in Call of Duty. It’s pretty obvious that those tasks require very different skills. People like you and me study to specialize in those fields so that they can design your laptop and make it work in a way that pleasantly impresses you. And when all those aspects, from drawing circuit boards to coding, come together, you get the broad field we call computer science.

Computers were in use long before electricity was ever harnessed. To compute simply means to calculate, so the very first computer recorded by historians appeared even before the birth of Christ: the abacus.

A model of a Chinese abacus (image from Sciencing).

In 1936, Alan Turing, the father of computing, described the Turing machine, which demonstrated the raw computational power that could, in principle, be achieved. This, together with the stored-program concept of John von Neumann, showcased what computers could make possible. The term “computer science” was first used by George Forsythe in the early 1960s, a computer science department was created at Purdue University in 1962, and the wheels were set in motion by power-hungry nations and scientists alike. Ever since, the field has been advancing at an astonishing rate.

Since 1991, the Association for Computing Machinery (ACM), the IEEE Computer Society, and the Association for Information Systems have collaborated to maintain and update a taxonomy of the five major computing disciplines. They are:

  • Computer engineering
  • Computer science
  • Information systems
  • Information technology
  • Software engineering

According to Britannica…

Computer science is the study of computers and computing as well as their theoretical and practical applications. Computer science applies the principles of mathematics, engineering, and logic to a plethora of functions, including algorithm formulation, software and hardware development, and artificial intelligence.


The roots of computer science are set deep in mathematics and, naturally, engineering. Turn your attention to the most basic levels of computer science and you will realize that you can actually predict which problems computers can solve and which they cannot, and that is made possible with math, believe it or not; Turing’s halting problem is the classic example of a question no program can decide. Binary arithmetic, discrete mathematics, linear algebra, statistics, and calculus are some of the most influential areas of math when it comes to computers.
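To make that claim concrete, here is a minimal sketch of Turing’s halting-problem argument. The halts() oracle below is hypothetical (Turing proved that no general implementation can exist), and the function names are invented purely for this illustration:

```python
# Sketch of the halting-problem argument. halts() is a HYPOTHETICAL
# oracle; Turing proved in 1936 that no general version can be written.

def halts(program, input_data):
    """Pretend oracle: True if program(input_data) eventually finishes."""
    raise NotImplementedError("undecidable in general (Turing, 1936)")

def contrarian(program):
    """Do the opposite of whatever the oracle predicts about
    running `program` on itself."""
    if halts(program, program):
        while True:          # oracle said "halts" -> loop forever
            pass
    return "halted"          # oracle said "loops" -> halt at once

# Asking whether contrarian(contrarian) halts contradicts the oracle's
# answer either way, which is exactly why halts() cannot exist.
```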

Electrical engineering also plays a profound role in the inner workings of every computer. Were the development of electrical engineering to hit a brick wall, the field of computer science itself would crumble to nothing within a few years. The camera in your smartphone, the GPU that lets you play games, and the processors that toil day and night to keep a nation’s economy afloat have all been perfected by engineers who chose the best ways to build circuits and the best materials for the tiniest electronic components. They are also the ones to thank for the storage devices we use today, whether it be a pen drive, a hard disk, or even a floppy drive.

Photo of a typical computer circuit by Umberto from Unsplash

Creating big, fast storage media isn’t enough, though. A big library full of books would be of no value to students if the books were poorly organized and it took half a day to find one. Similarly, efficient ways had to be invented to store, sort, search, and retrieve the needed information in the blink of an eye. User-friendly graphical user interfaces also had to be built so that human operators could manage information more easily and with fewer mistakes. All of this came under the management of information systems, which was quite a big deal back in the day.
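As a toy illustration of that library analogy (the catalog and titles here are invented for the example), a sorted index lets a program find an item in a handful of comparisons instead of scanning every entry:

```python
from bisect import bisect_left

def find_book(catalog, title):
    """Binary search over a sorted catalog: roughly log2(n) comparisons
    instead of checking every shelf one by one."""
    i = bisect_left(catalog, title)          # first position >= title
    if i < len(catalog) and catalog[i] == title:
        return i
    return -1                                # not in the catalog

catalog = sorted(["Algorithms", "Compilers", "Databases", "Networks"])
print(find_book(catalog, "Databases"))       # 2
print(find_book(catalog, "Astrology"))       # -1
```

The same idea, scaled up and refined, is what database indexes and search engines are built on.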

With the combined effort of personnel from those and other related fields, computers, and the field of computer science, grew by the day. In the 1960s, computers were being used for business purposes as well as for scientific computing. And astonishingly, the experts of the era managed to write programs in machine language: pure 1s and 0s, aka binary. They soon realized this wasn’t sustainable, not only because of the ever-increasing demand for sophisticated programs, but also because machine language is hard to read and humans are error-prone. Out came assembly language, which replaced raw binary with readable mnemonics. A few years later, high-level languages were born. Almost as simple as good old English, and general enough that a program no longer had to be rewritten for each and every computer it ran on, they were the Holy Grail of computer programming and gave the potential of computers a growth spurt. The most widely used at the time were COBOL and FORTRAN, for business programming and scientific computing respectively.
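To get a feel for the gap between those layers, here is a small modern analogy: Python bytecode stands in for the machine and assembly languages of the era (the actual languages were of course different), showing how one readable line hides several lower-level instructions.

```python
import dis

def pay(hours, rate):
    return hours * rate      # one readable, high-level line

# dis.dis() prints the lower-level instructions the interpreter really
# runs: load hours, load rate, multiply, return. A faint echo of the
# binary-vs-assembly-vs-high-level ladder described above.
dis.dis(pay)
```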

And so the field of computer science marched forward, with the addition of operating systems, graphical user interfaces, better and more compact storage devices, and, last but not least, the Internet. All of these shaped computer science and moulded it in ways no one had imagined. With the invention and adoption of new technologies, new disciplines branched off, fueled by the hunger for money and power. Thanks to international bodies such as the IEEE and IDA, however, they have always been cast in the right mold and turned into a boon.

As of today, for the average person, all that matters about computer science is their social media accounts, their usual shopping sites, some apps, their personal pieces of technology (usually a laptop, a phone, and the occasional camera) and, of course, the thrilling hype around AI. While you could say people are more tech-savvy than they have ever been, most of us take it all for granted and never marvel at the array of computers and networks that make it possible. That leaves many users oblivious to the ethical issues surrounding almost every piece of tech worth thinking about, not to mention the notorious malware attacks that have become a part of technology too.

Photo by Sajjad Hussain M from Burst

Computer science is an unstoppable force that only grows stronger by the day. Whether we are going to use it to make the world a better place, or to marmalize it, is something everyone should take a minute to think about.
