Friday, November 15, 2013

A Brief History of Computer Science | PART ONE: Calcular Wars


I could herald past efforts and speak of the innovation that took computers from the abacus to modern-day smartphones and laptops, but I won’t. Instead I’ll focus on the things I find important to remember about Computer Science and modern-day development. That does involve a lot of mathematics and technology, so ignore my comment and bring on the abacus!

A bunch of smart people refined logic – people like Gottfried Leibniz, who paved the way for modeling logic in binary form in the early 1700s. A bit over one hundred years later, Charles Babbage described the first general-purpose computer, the Analytical Engine, and in 1854 George Boole created an algebra of rules that turned simple true/false statements into computational processes. Programmable calculating machines became a reality; Ada Lovelace developed an algorithm for Babbage’s engine and predicted that symbolic computation would be possible in the future.
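Boole’s big idea is easy to play with today. As a hedged sketch (in modern Python, obviously not anything Boole wrote), here is a half adder – one bit of arithmetic built from nothing but true/false statements, which is exactly the trick every computer since has relied on:

```python
# Boole's insight in miniature: arithmetic out of pure true/false logic.
# A half adder produces a one-bit sum and a carry using only AND and XOR.
def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    carry = a and b   # AND: true only when both inputs are true
    total = a != b    # XOR: true when exactly one input is true
    return carry, total

# Walk the whole truth table: every case is just a true/false statement.
for a in (False, True):
    for b in (False, True):
        carry, total = half_adder(a, b)
        print(int(a), int(b), "->", int(carry), int(total))
```

Chain enough of these together and you can add numbers of any size – computation from logic alone.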

It blows my mind every time I think that less than two hundred years ago, we were just beginning to figure out how to design calculators, let alone computers and mobile devices. I’m old enough to have seen the rise of Apple and Microsoft, the browser and file sharing wars, and many of the developments impacting the languages and operating systems we use today.

Stuff happens: “cybernetics” and “bugs” are coined by magazine submissions and a moth in a Navy circuit, respectively. Important stuff, hey! How else would we get our Neuromancer and Electric Sheep? Next up, we go back in time to the nineteen-forties! Expanding on Konrad Zuse’s mechanical computers, John von Neumann describes his famous von Neumann architecture, built around a small instruction set that tells the machine to “make it so.” These computers use memory, the ALU (arithmetic logic unit), and the control unit working together to fetch, interpret, and manipulate data using registers. This was a huge step, and it led to computing as we know it. Memory got better, microprogramming (Maurice Wilkes) and compilers (Grace Murray Hopper) were developed, and transistors started being built smaller and smaller.
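The stored-program cycle described above can be sketched in a few lines. This is a toy machine of my own invention (the opcode names and memory layout are made up for illustration, not von Neumann’s actual instruction set), but it shows the key idea: instructions and data live in the same memory, and one loop fetches, decodes, and executes them:

```python
# A toy stored-program machine: code and data share one memory, and a
# tiny loop performs the fetch-decode-execute cycle with two registers
# (an accumulator and a program counter).
def run(memory):
    acc = 0   # accumulator register: holds the value being worked on
    pc = 0    # program counter: address of the next instruction
    while True:
        op, arg = memory[pc]      # fetch the instruction at the PC
        pc += 1
        if op == "LOAD":          # decode + execute
            acc = memory[arg]     # copy a memory cell into the accumulator
        elif op == "ADD":
            acc += memory[arg]    # add a memory cell to the accumulator
        elif op == "STORE":
            memory[arg] = acc     # write the accumulator back to memory
        elif op == "HALT":
            return memory

# Addresses 0-3 hold the program, 4-5 hold the data (2 and 3),
# and address 6 will receive the result.
program = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(program)[6])  # -> 5
```

The punchline is that the program is just more data in memory – which is precisely why the same hardware can run any software we care to write.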

A Timeline of Computers (CLICK ME!)


2 comments:

  1. It took us tens of thousands of years to stand upright and only a couple of hundred to hunch back down over a computer. There have been so many developments advancing the computer science field over the last 50 years alone, and there are so many more yet to come. I wish I were old enough to have seen the rise of the PC and the internet, but at the same time I will hopefully get to see some crazy innovations of my own, maybe even be part of them! The blog was well written, fun to read, and nicely edited.

  2. Hi,

    It is so amazing to see how people developed smarter technologies decade after decade. When I started researching computer history, I came across some interesting facts about the earlier machines that were developed and how clever those engineers were. That ingenuity is what has led to such fast-growing technology. Even when we rack our grey cells, it is hard to fathom the transition from the Stone Age to the current era.

    Your blog is a testament to thorough research. Great work!
