Brown Dogs and Barbers

About this ebook

Computers are everywhere, running our lives, handling our social interactions, serving as the backbone of every business.

And yet, how well do we understand them? How much do we know about their rise to ubiquity? We take computers for granted, but there is a fascinating wealth of ideas waiting to be explored, a rich trail of information explaining how we got to where we are now. That trail includes grand dreams, intricate puzzles, mind-stretching concepts and a cast of colourful characters.

Brown Dogs and Barbers is a story about computer science. Join me on a journey through the story of computing, discover just what makes the machines tick, learn why computers work the way they do and meet the cast of characters responsible for it all.

Language: English
Publisher: Karl Beecher
Release date: Sep 12, 2014
ISBN: 9783000470769
Author

Karl Beecher

Dr. Karl Beecher was born and educated in the UK. His career has taken many twists and turns, but always involved his central passion: computing. He worked as a software engineer before moving into academia to research software evolution and open source software. He was awarded a PhD in computer science from the University of Lincoln in 2009. Soon after, he took up a research post at the Free University of Berlin, Germany. Today, Karl still lives in Berlin and is a co-founder of Endocode, a software startup company. He blogs at www.computerfloss.com and tweets under the name @karlbeecher.


    Book preview


    Karl Beecher

    BROWN DOGS & BARBERS

    What’s Computer Science All About?

    Copyright © 2014 Karl Beecher

    All rights reserved.

    The author’s permission is required for any reproduction or duplication.

    Ebooks are not transferable. Reselling or giving away this ebook violates copyright.

    All trademarks referenced herein are the properties of their respective owners.

    1st edition

    ISBN 978-3-00-047076-9

    For Jennifer.

    Chapter 0.

    Why Do Programmers Start Counting at Zero?

    An Introduction to Who and What This Book Is All About

    I’d like to begin by asking you about your toaster. If I asked you to tell me how your toaster worked, I bet you’d have no trouble coming up with a decent explanation. Initially, you might claim to have no idea, but I’m sure a moment’s thought would yield a good description. Even in the worst case, you could actually look inside a toaster and deduce what was happening. Perhaps then you’d be able to tell me how electricity causes the filaments to heat up and radiate that heat onto the bread or the crumpet or whatever, causing it to cook.

    If I were to ask you how a car worked, that might be more challenging. Again, you might instinctively feel that a car’s workings are a mystery. But even then, if you stop and think about it, you might recall a few vague terms that help out. Perhaps you could tell me how petrol is stored in the car’s tank and when you press the accelerator, fuel is drawn into the engine where it’s ignited. Then you’d tell me that this drives the pistons...or something...and they turn the...I think it’s called the crankshaft...which is connected to the wheels and makes them turn. That’s what I would say and I know virtually nothing about how cars actually work.

    I’m guessing all this without even knowing you, your occupation or your interests. True, you might be an engineer or a physicist for all I know and able to give better explanations, but chances are you’re not. My point is, even if you have only the merest passing interest in science and technology, I’m confident that you know enough about toasters and cars to give a half-decent explanation of how they work. Understanding things like these comes partly from school learning where, even if you were dozing off during physics lessons, you still picked up some of that stuff about electricity and internal combustion engines. And let’s not underestimate how ingrained in our popular consciousness these concepts are. People around us talk about the workings of everyday technical items all the time, so some of it is bound to stick whether we realise it or not.

    But computers are different. Many of us haven’t got the first clue how computers work. Think about it. Could you tell me how the individual components in your computer work together? Could you even name any of the components? I’m certain some of you could, but I’m just as sure that a lot more people couldn’t begin to explain a computer. To some, it’s a magic box that sits under the desk and somehow draws letters and images onto the monitor screen at breathtaking speed.

    Let’s get one thing straight: I wouldn’t blame you for being unable to explain the workings of a computer. There are good reasons why you shouldn’t be expected to know this stuff. One very important reason, again, is schooling. In many countries, computer science is not taught as part of the general curriculum. In my own country of birth (the United Kingdom), computing education has for many years meant nothing more than learning how to use word processors and spreadsheets. These are important skills, to be sure, but they are definitely not computer science, a subject that studies, at a fundamental level, how to use mathematical principles to solve problems. The majority of children leave school having learned, at most, to be passive users of computers, and many people are now asking why such an important area of knowledge is absent from the curriculum.

    The mystery surrounding computers is a problem that’s becoming worse over time. When computers first arrived, they were monstrous things bigger than a family-sized fridge and kept in huge, environmentally controlled rooms. Their job was usually to do boring tasks like processing tax returns and payrolls: tasks that could be done by hand, albeit much more slowly. Computers had banks of flickering lights that lit up when the machines were thinking and spools of tape mounted on the front that spun around when the computer was looking in its databank. Some were even partly mechanical, clicking and tapping noisily when the numbers were being crunched. Yes, they were mysterious then—but today it’s even worse.

    Computers are no longer just mysterious—they’re magical.

    Today’s computers are a million light years ahead of their early ancestors. Nowadays they’re small, sometimes able to fit into the palm of your hand. How can something so tiny do such impressive things? And they’re also ubiquitous. Computers have gone far beyond their original, humble number-crunching duties and now organise every aspect of our lives. As a result, they’ve become utterly unknowable. Today’s computer is an impersonal black box that gives no hint as to its workings. Of course, there’s a user interface that allows us mere humans to operate the computer, but one main purpose of a modern user interface is actually to hide the internal workings of the machine as much as possible. There are few external indicators about what’s really happening inside. Without moving parts (apart from the cooling fan, which I assure you performs no calculations) and with internal components that give no visible clue as to what they’re doing, it’s become impossible to figure out how a computer works by examining it. So advanced and unknowable have computers become, they may as well operate on principles of magic.

    But there are genuinely knowable principles upon which computers operate. We find things that pump, rotate or burn easier to understand, because physical principles are more intuitive to us. In contrast, the driving principles behind computers are mathematical, and thinking in those terms comes less naturally to us. There are some physical principles involved, of course. Your computer contains various things—circuit boards, wires and chips—which all function according to good old-fashioned physics. But (and I don’t mean this to sound dismissive) those are merely the computer’s hardware. In computer science, there is a sharp and critical distinction between the physical machinery that performs the work (the hardware) and the mathematical principles that allow it to do anything meaningful. Those principles make up the field of computer science. In theory, you can build computers out of all sorts of weird and wonderful parts, be they mechanical, electronic or even water-powered. Yet, however a computer is implemented, it must work according to the principles of computer science, in the same way that all internal combustion engines, varied as they are, work according to the relevant laws of physics.

    Hardware often gets mixed up with the field of computer science. I’m pretty laid back about that, but some purists like to emphasise the strict division between the machinery and the principles. Roughly speaking, this corresponds to a separation between hardware and software. Software, a word I’m sure you’ve heard before, is the collection of programs which computers run, and the concept of a program goes to the heart of computer science. Unfortunately, programs are hard to define, but rest assured that you’ll come to understand what a program is over the course of this book. What makes programs tough to penetrate is that they’re nebulous, abstract things rooted in mathematics, the parent subject of computer science. Programs have numerous legacies by virtue of this inheritance. Like mathematics, programs don’t exist in a physical sense. They’re conceptual things: ideas which exist in programmers’ minds and have substance only once they’re written down.

    This legacy from mathematics explains many things. It explains why programs look like jumbles of mathematical formulae. It explains why computer science attracts so many nerdy folks who are good with numbers. And it explains why programmers count up from 0 instead of 1 like the rest of the human race. Maybe you’ve noticed that? You might look through some of the programs on your computer and find a new one labelled version 1.0. Why 1.0?

    OK, you might say, when a program is updated, the author appends a number to identify the version. After the initial version is updated several times, we progress through versions like 1.4 to 1.5 to 1.6 and so on. I get that. But why start at 1.0? Why not 1.1? And why, when I upgrade to the second version, is that called version 1.1?

    You’d also find this peculiarity were you to read through the contents of a computer program. If you watched a race on TV, you’d say that the winner came in position 1, the runner-up in position 2 and so on. But if you asked a programmer to write a program for processing the race, the results would begin with the winner assigned position 0 instead, and the runner-up in position 1. To a programmer, the hero is a zero.

    Counting up from zero, which feels somewhat unnatural, actually simplifies matters when you deal with lists of things. In these cases, counting up from 1 can cause confusion. For instance, have you ever stopped to think why the years of the twentieth century all began with 19 and not 20? It’s something that often trips up little kids (and occasionally big ones too!). Why was the year 1066 part of the eleventh century and not the tenth?

    Figure 1. Two buildings with different numbering schemes

    Let’s look at an example of counting up from 0, because we all do it occasionally whether we realise it or not. In some parts of the world, the bottom floor of a building is called the ground floor and the next one up is the first floor. In this case, the ground floor could just as easily be called the zeroth floor. Similarly, when programmers refer to specific items in a list (which they do a heck of a lot), they often need to calculate the position of an item in that list by offsetting it from a base position. This base item is labelled number 0. Working out a position when a list is arranged like the floors in a building makes things a little simpler. Floor 3 (or the third item) is three above the ground floor (or zeroth item). If the ground floor were floor 1, then the third floor would be two above the ground floor as shown in Figure 1. We count centuries in a manner similar to the left-hand building in the figure. Because we count centuries up from one (the years 1 to 100 were the first century, not the zeroth century), we have to remember that centuries don’t match the years within them. It’s only a small confusion, but working out positions in a list is done so often that little hiccups like this can actually cause more problems than you think.
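    To see that offset arithmetic in action, here is a small sketch in Python (my choice of language purely for illustration; the race and the names are invented for the example). Python, like most programming languages, numbers the items of a list from zero:

        # A list of race finishers. To a programmer, the winner sits at position 0.
        finishers = ["Anna", "Ben", "Carol", "Dev"]

        # Zero-based indexing: an item's number is its offset from the base item,
        # just like floors counted up from a ground floor labelled 0.
        winner = finishers[0]        # "Anna": zero places from the start
        three_up = finishers[3]      # "Dev": three places above the base item

        # Counting from 1 instead forces the same "minus one" correction
        # that makes centuries awkward to match with their years.
        everyday_position = 4                          # "the fourth finisher"
        same_item = finishers[everyday_position - 1]   # also "Dev"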

    With this explanation, you’ve hopefully learned something new about computer science. I know it’s trivial, but nevertheless it shows you something about the subject and explains why that something is the way it is. This example is just the tip of the iceberg. There’s much more complex and interesting stuff still to come. Computers are complex things, more so than any other machine we’re likely to use on a daily basis. Unfortunately, they remain mysterious to many people. For many of us, our relationship with computers is one of bemusement, amusement, frustration, and fascination, all experienced at arm’s length. We sometimes even find ourselves as the subservient member in the relationship, desperately reacting to the unfathomable whims of our computer as we try to make it happy. This is not an ideal state of affairs if we’re going to be so reliant on computers in our everyday lives. It doesn’t have to be this way. If our relationship with computers is sullied by their mysteriousness, the answer is simple: learn more about them. And I don’t mean learn how to make spreadsheets.

    To understand what’s going on in that magic box beneath your desk, we’ll need to look at the science behind it.

    This book will present you with the core concepts of computer science. You will learn about the subject’s history, its fundamentals and a few things about its most pertinent characters. Understanding these concepts will demystify the machine. Each chapter can be read as a stand-alone unit, but they’ve been written in such a way that reading them from start to finish is like reading a story: the chapters loosely follow a chronology, and each one builds on the preceding ones. It’s your choice.

    However you choose to read it, this book will take you from the earliest beginnings of mechanical computation and show you how we arrived at today’s world of the magical and ubiquitous electronic computer. You will also learn of the monumental problems that faced computer scientists at every stage. You will see how they developed ingenious solutions that allowed the field to progress. And you will observe how progress led to both new opportunities and new problems.

    Part I.

    Fundamental Questions

    When all is said and done, the only thing computers can do for us is to manipulate symbols and produce results of such manipulations.

    Edsger Dijkstra (1930–2002)

    What is computer science? What does a computer scientist actually do? These are difficult questions to answer, but if we hope to learn anything about the subject then I suppose we’d better deal with them.

    Looking for the definition of computer science in a dictionary won’t help, because there are as many different definitions as there are dictionaries. In fact, even computer scientists don’t tend to agree on the definition of their subject, so what chance have the dictionary writers? What’s more, the subject has developed a huge array of sub-fields over the years and, at first glance, they seem absurdly diverse. For example, computer vision specialists look at how computers deal with images; network experts concern themselves with how to get computers talking to each other; and information theorists don’t deal with computers at all, instead spending their time worrying about how to process and quantify information. Given all this, how could I possibly discuss computer science in a way that covers the whole discipline?

    But physics is also a widely diverse subject area. Yet physicists can collectively claim that they study the fundamental nature of matter and how the universe behaves, whether it’s sub-atomic particles or whole families of galaxies. Surely then, we can also sum up computer science in a nice, tidy phrase. That’s one of the things I’ll do in the first part of this book. I’ll show that there is a way to address computer science collectively, and in so doing I’ll show that all its practitioners share a stock-in-trade: studying how to compute.

    No subject area is born in a vacuum. Every field of science has branched off from some predecessor, taking a handful of ideas with it and using them to form the core of a new discipline. To demonstrate the kind of concepts essential to computer science, the first part of the book will explain a few of the concepts that predate computer science but nevertheless lie at its heart.

    Chapter 1.

    Inputs, Processes and Outputs

    The definition of computer science

    Computer scientists study the science of computation. Yes, I admit it seems embarrassingly obvious to say that...after all it’s right there in the name. Nevertheless, I’m not being flippant; it’s a useful thing to say, but it needs some explanation. Ask yourself: what does it mean to compute? In particular, what possible meaning of compute could apply to all the diverse fields of computer science?

    Figure 1.1. What it means to compute

    In its most general form, computation is as simple a concept as that shown in Figure 1.1. It involves inputting some data, processing it in some way, and generating some output. Simple as that. It’s like a conveyor belt that carries raw materials into a machine, whereupon the machine thrashes around doing its magic and eventually pushes the finished product out the other end. As a model of computation, it’s widely applicable. From the smallest operation to the biggest computer task imaginable, computing always involves taking some input, doing some work with it and returning some output.
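    To make the conveyor-belt picture concrete, here is a deliberately tiny sketch in Python (the summing task is an invented stand-in for whatever processing a real program might do):

        # Computation as input -> process -> output.
        def process(numbers):
            # The machine in the middle: this one simply totals
            # the raw materials fed in on the conveyor belt.
            return sum(numbers)

        data = [3, 1, 4, 1, 5]    # input: some data goes in
        result = process(data)    # process: the machine does its work
        print(result)             # output: the finished product, 14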

    Computation describes all the things you do when you use your computer, including the simplest things
