With four episodes to go, Vic chose a topic that was never on the list - the history of programming. We look at five key people in the evolution of programming.
Welcome to Pragmatic. Pragmatic is a weekly discussion show contemplating the practical application of technology. Exploring the real-world trade-offs, we look at how great ideas are transformed into products and services that can change our lives. Nothing is as simple as it seems. This episode is sponsored by Many Tricks, makers of helpful apps for the Mac. Visit ManyTricks, all one word, .com/pragmatic for more information about their apps: Butler, Keymo, Leech, Desktop Curtain, Time Sink, Usher, Moom, Name Mangler, and Witch. And if you visit that URL, you can use the coupon code PRAGMATIC25, that's Pragmatic the word and 25 the numbers, in the shopping cart to save 25% off any Many Tricks product. This episode is also sponsored by our friends at lynda.com. lynda.com is the easy and affordable way to learn. You can instantly stream thousands of courses created by experts in their fields of business, software, web development, graphic design, and lots more. Maybe you wanna master Excel, learn negotiation tactics, build a website, or tweak your Photoshop skills. Well, visit lynda.com/pragmatic, feed your curious mind, and get a free 10-day trial. There's something for everyone there. So if you ever want to learn something new, what are you waiting for? We'll talk more about both of them during the show. I'm your host, John Chidgey, and I'm joined today by my co-host, Vic Hudson. How's it going, Vic? It is good, John. How are you? I am exceptionally exceptional. Oh, right. Oh right. Well, you know, I mean, I'm alright. I just like to play it up sometimes. Today's episode is going to be a bit different, and different insofar as this has never been a topic on the list exactly. In fact, this is a topic that I asked you to pick. This is the fourth last episode of Pragmatic and I thought it might be nice to let you pick a topic for once. Like, completely whatever you wanted. Anything goes. So... Woohoo! Yeah, I know. Rockin'. So what'd you pick? I picked the evolution of programming. Well, as a programmer, um, as you are, um, I guess I should have seen that coming. [laughter] Alas, I had... It is a subject that is a bit near and dear to my heart. Alas, I did not see it coming, so I had nothing prepared and I have been crazy busy at work and I wish I had more time, because there is so much to cover in this. I'm going to do the best I can with the time that I've got, which honestly seems to be a relatively good sentiment for life. Well, I think if we did it truly proper, it would probably be like a 10 hour episode. So I think it's probably suffice to say it would have to be a highlight reel no matter what. I'm with you on that. Okay, then, so the best way to start is to begin, or... something like that. So, okay. Line 10. Line 10. Oh, yes. Start at line 10. Yeah, well, like, yes, but then go to line 20 and then line 30. And then when you realize you left something out, you can insert it at line 15. Okay, look: 10 GOTO 10 and you're done. Okay. No, that's a loop, John. That's a loop that never ends. Except this show, which is going to. Okay, evolution of programming. Programming, and these are definitions that... I didn't actually look up the definition of programming, maybe I should have, but here's what I think programming is. I think programming is the act of instructing a machine how to behave based on a reconfigurable set of instructions. What do you think? I call that a decent definition. Cool, I didn't pull that out of anywhere except my, um, head. Head, head. Anyway.
A machine whose primary function is to process information, I would refer to that as a computer. So, the takeaway from doing a bit of research on this and my own personal knowledge that I've accumulated over years and years of listening to the history of computers and programming is that, honestly, there are a very great number of people around the world over the course of the last 150 years in particular, although even before that, that have contributed big things to the development of what we know of as the computer and of programming those computers today. Lots of people have contributed, and a lot of people overstate just how much specific individuals actually did contribute, and there's a lot of misunderstanding around what they contributed and what they didn't, and hopefully, yeah, I mean, hopefully we'll straighten some of that out. You know? - Good deal. - Alternatively, we'll make people that are Turing or von Neumann fanboys very angry and they're gonna, like, Twitter bomb me and stuff and hate me forever. So apologies if I offend anybody, if I get some of the history wrong, because the problem is I've done quite a bit of research into different people in the past. And honestly, yeah, maybe some of this will surprise you. Maybe it won't, we'll see. So, I want to tackle this one again differently. And not just because you picked this topic, Vic, but because you also were very keen for me to look at some of the historical slant. So, I'm going to talk about five of the bigger names, bigger in the sense of people talk about them a lot more, not necessarily bigger because they contributed more or less, because that's a matter of opinion, you know? And a lot of the things that they said- - Well, there's a lot of conflicting reports too. - Oh, sure. I mean, history is written by the victor, right? So, and this is one of the things that I found with the patent system, is that someone that invents something but doesn't patent it seldom gets credit. The person that patented it, their name is on the patent. - They get all the glory. - They get the glory, they get the attribution. The reality is usually, not always, but usually there's more to the story. And in my experience, it is never safe to assume that the name on the patent is the person that actually invented the thing. So, and people do invent things in parallel. That's a real thing. And saying who did it first, it's kind of a little bit ridiculous. The point is that we are where we are as a result of a lot of people's work and a lot of people's great ideas. So anyway, five people, right? That is nowhere near enough. I know that, but this podcast would go on for an infinite, well, not infinite, but it would eventually run out. Infinite's a really big number. So like really, really big. Anyway, so if I leave anyone out that you feel was pivotal in the creation of the computer or programming or its evolution, then I apologize. Feel free to yell at me. People do sometimes, that's fine. You know, and I got to draw a line somewhere. So, okay. With those caveats, I'm going to jump in to some more caveats, because it's such a big topic. If I don't put a boundary box on it, I'll be in trouble. So next, I'm not going to talk about every programming language ever devised. Not happening. What? Nope. Not happening. I got a list. Good. Feel free to- It's quite long. Are you talking about the notes or have you got your own private... No, no, no. I'm just playing with you. You're messing with me again.
I would be afraid of how long a list of every programming language ever devised really is. You know, I'll tell you the truth. I started making a list and it scared me too. So I stopped. I just started doing a list of programming languages that I programmed in in my lifetime. and I'm 38 and there are programmers out there that are older than me that have programmed in stuff I mean I know what COBOL is but I haven't actually written a line of code in COBOL but I know other engineers that have and the list would just go on and on so anyway, got to draw a line, not doing that I'll cover some of the basics, some of the big names but that's it And I'm also not going to cover every single sub-method of program structure, okay? I'm not going to start debating the evolution of object-oriented programming, multi-threaded programming, anything like that. I'm not interested in synchronization, asynchronous programming. That's exciting stuff, John. I know it is, but that is not part of the evolution of programming that I want to cover because again, I'd be here forever. Yeah, I'm just trolling you a little bit. This is going to be the Vicks topic, Vick trolling episode, isn't it? Is that what this is turning into? No, no, no. I mean, go right ahead. It's fine. It's your show. As you were, as you were. That's your show. It is now you just take ownership of it right there. All right. Uh oh. Uh oh. Yeah, now be afraid. So, okay. So yes, finally, finally, finally, finally, there were often disputes about who covered what. So, hopefully I'll try and provide both sides of the story and you can make up your own mind. Okay. The very first... People that feel we make mistakes or get it wrong should feel very free to interact with us on Twitter and tell us about that. If they want to, you know, or they could just shake their fist briefly, turn their head away, talk to their loved one, and then realize they completely forgot about what John just said that had made them so angry. And they won't go back 30 seconds to, you know, reignite that rage. That's cool. You know, go Zen. OK, where do we start? And you know what? I absolutely have to start with the computational machine. It's called an abacus. Did you know, did you know that there is actually no definitive time at which anyone can say for certain when the abacus was created? It's a long, long, long, long, long time ago. Right. Because of course, people were counting with pebbles and groups and stuff like that for a long time ago. And let's be honest, abacuses can only really count. They're a counting machine, you know. To say that they're programmable is somewhat of a stretch. You can't actually put a program into an abacus and have it compute something by turning a handle, pressing a button or blowing it or whatever, you know, just it's not going to happen. Another little interesting bump along the road bump, positive bump though that I have to mention is the slide rule. Who doesn't love a slide rule? Have you used a slide rule, Vic? I've actually never used a slide rule. I've never seen a slide rule. I've heard of a slide rule. I have an idea of what they're all about, but they were a little before my time. You are missing out. I have my father's slide rule. We actually had one class in high school. It was Math C, which was extension maths that you were encouraged to take if you wanted to go into engineering or physics, which is what I was hoping to do at university. So I did it in high school with a slide rule. 
And we had one lesson where we did two or three equations. We solved the numbers using slide rules. So, and I have my father's slide rule from when he went to university and he was doing teaching. And I think they're absolutely brilliant. I looked at them the first time, numbers everywhere, and I'm like, how the hell does this work? And anyway, so interesting history: John Napier, he's a Scotsman. 1617, actually, he invented logarithms. And the thing about logarithms is so cool, and you can tell the kind of mathematician kind of weirdness that I've got. I can't help it. I do think logarithms are cool, because you can multiply by adding, and that's cool. It's weird, but it's actually true. Anyway, so the slide rule was based on Napier's foundation of logarithms. So that's how slide rules rule. And in 1632 came the first slide rule. And that was still in use quite famously by NASA in the 1960s during the development of the Mercury, the Gemini and the Apollo programs. Not everything, by the time they got to Apollo, not everything was done with a slide rule, but a lot of it was. Like examples like the SR-71 Blackbird also: a lot of its design, all the calculations, well, not all of them, but a lot of the calculations were done using slide rules. So, you know, by the time it got to the mid to late 70s, though, you know, most engineering shops and universities were switching to computers and calculators for the heavy calculations. So there you go. Anyway, technically the slide rule is a computational device. You program an input and it gives you a result. So there you go. And I also have to have a quick mention, tip of the hat to Mr. Pascal. Yes, thank you. Or Pascal, if you prefer. But I'm sorry, mate, your machine can only add and that's just too boring for me. So yep, tip of the hat, but it just added, mate. That's it. Anyway, I know he's not a listener. That's not possible. So I think-- - I think you can safely assume that. - I can safely assume this is true. Okay, all right. So we have to start, and I wanna say, I said how I was gonna do this differently. I'm gonna talk about the people, their contributions, and then we'll talk about some of the technicalities between people. So I'm gonna start off with two first. And the very first person, the person that is widely considered to be the father of programmable computers, is of course Charles Babbage, and he was born on Boxing Day in 1791, so that's the 26th of December, for those that don't know that Boxing Day is the 26th of December. And it's interesting, you know, doing the research on these people, a lot of these people were born in December, so there you go, popular month, or rather nine months prior was a popular month, I guess. Anyhow, he was a mechanical engineer, an English guy. And the funniest thing about Babbage, I think, the insane thing, is that many of his designs were never physically built. Or I'll say in his lifetime, they weren't physically built. And it's like, OK, I'm getting ahead of myself. He had a lot of money for the difference engine. So there were two things he designed. The first one, well, he did more than that, but there were two big ones that he was known for: the Difference Engine, of which there were two full designs, a first and second revision. And the second design he did was something called the Analytical Engine, which was a programmable computer. But these aren't computers like you would think of. These are completely mechanical devices.
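As a quick aside on that "multiply by adding" point before getting into how the difference engine actually worked: here's a minimal sketch, in Python purely for illustration (there's obviously no code in a slide rule), of what lining up the scales on a slide rule is doing mathematically. The lengths you add along the scales are logarithms, so adding them and converting back off the scale gives you the product.

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply two positive numbers the way a slide rule does: add their logarithms.

    Lining up the scales on a slide rule physically adds two lengths; because
    the scales are logarithmic, adding log(a) + log(b) and reading the result
    back off the scale gives a * b.
    """
    summed_lengths = math.log10(a) + math.log10(b)  # the two lengths you add on the scales
    return 10 ** summed_lengths                      # reading the answer back off the scale

print(slide_rule_multiply(3.0, 7.0))  # ~21.0, limited only by precision (or eyesight)
```

Anyway, back to Babbage and the difference engine.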
So the idea is that there are sets of wheels that you would put into a certain position in a program on the difference engine and the physical cranking and turning would then produce a result. And because of the positioning of all of the different dials and all of the gears and interlocks and everything, that would perform the calculation. And it seems crazy, but you're using mechanical energy to do those calculations. But he never finished it. He got some of the way through building the first revision and he had plenty of money. He just had a falling out with the guy building it and said, "Yeah, forget it." So, he was all interested in the second revision of the difference engine at that point. Anyway, it wasn't until the late 20th century that they actually decided to build a complete functioning difference engine. And the original design was built by a multi-millionaire and the other one was built in England. And both of the machines built to his specifications work perfectly. It's just that he never built them in his lifetime. It's kind of like... it makes me think a little bit of Van Gogh. It's like you've gone through his whole life thinking he was a horrible painter, he had all sorts of mental issues with depression and lost an ear through suspicious means. The point is, you know, and now he's considered to have been an artistic genius, you know, and Babbage was, you know, I wouldn't say he's quite that extreme. Certainly people recognized his work, but he never built them. But the analytical engine was the one that he is most remembered for in relation to programming. And that's of course the point of this episode. And that is... Did you give a year on that? On which one? The analytical engine? Uh huh. It was 1840... No, it was 1850, I think it was 1852. I apologize if the date is wrong. Yeah. So... I just thought people might be interested to know that it went that far back. Yeah, it does. And the analytical engine was basically designed to be programmed, rather than using the dials and everything that he'd had on his original difference engine, but using a punch card. And punch cards were a quote-unquote new idea at the time, and we'll talk about punch cards a little bit more shortly. So, the designs were brilliant, okay, but in essence the takeaway is poor Charles Babbage, his ideas and concepts were well ahead of the time that he lived in. They were expensive to manufacture for the computational power that they provided, they were enormous. But they were also the first truly programmable, the analytical engine was the first actual design that could work for a programmable computer. And that is why Charles Babbage's contribution is so important. Truth be told, though, his machines translated electronically don't bear a great resemblance to what we think of now as computers. But they had the virtue of being the first designs that actually could work. Which is what makes this contribution interesting. So the next person to talk about is actually closely related to Babbage and also hotly debated, and that is of course Ada Lovelace. Now Ada Lovelace is not actually her name. Her name was Augusta Ada Byron, and she was Countess of Lovelace after marriage, and she came to be referred to as Ada Lovelace. The truth is though that Augusta Ada Byron was the daughter of Lord Byron, the famous poet, also English. And the thing is that she was born, again in December, on the 10th of December in 1815.
So, you know, almost two decades thereabouts after Charles Babbage. So a bit younger at the time, but she was a mathematician. Now obviously, you know, being of a better background, she could afford to be educated, but you know, I was talking to Kirsten about this before, and as she pointed out, in that day and age, you know, it was unusual for there to be very many female mathematicians. So that in and of itself is notable, especially considering that her father was a poet. It's an interesting choice. So, in any case, Ada was a mathematician and, I think it was about the age of 19, she became very fascinated with Babbage's work. The thing is, she is widely considered to be the first person to have written a computer program. So the first computer program. And it's kind of accurate. Technically, it was an algorithm, but people kind of attribute that to being a program, right? Yeah, but I mean, it's okay. So let's explore that a minute. I think it's sort of accurate, but it's only kind of close to the truth, partly for what you said, but partly for other reasons. So, all right. So essentially, Babbage's notes about how to program his analytical engine, which, let's face it, didn't physically exist in either of their lifetimes, let's say they weren't really the best. OK. Yeah. It was pretty shocking. Now, he gave a seminar. I mean, the design was fine. The blueprints were OK, but how you programmed it, the detail about how useful it could be, what it could do for you and how to program this thing, was pretty, pretty light on. So he gave a seminar in Italy about his machine and how you could program it. And a young Italian engineer wrote up that lecture. And in one of those crazy things, I don't understand. English guy gives presentation in Italy. Italian engineer writes it up in French. I don't know. So anyway, maybe someone can explain to me why on earth that unusual combination came to be. But irrespective, perhaps French was a more widely spoken language than English in Europe at that point. That's possibly the reason. Irrespective. This is Luigi Menabrea? That's him. Yes. So, and that sort of... I didn't mention him by name because that was the end of his contribution. So, if you can call it much... He took notes in the lecture, okay? And wrote it up as a paper. So, that's... But it wasn't that paper that gained popularity. Because Babbage's friend, Charles Wheatstone... And for those electrical engineers out there, Charles Wheatstone, that's the same Wheatstone who's famous for his involvement in what's become known as the Wheatstone Bridge, which I built during university and so on. So not going into that, but yeah. So those people, they sort of, they stuck together, just like people still do, collaborating, people, experts in their fields tend to flock together a bit. So anyway, so Babbage was friends with Wheatstone, and Wheatstone commissioned Ada to translate the French paper into English. But Ada didn't just translate it. She actually took the time to learn about the machine and she sat with Babbage extensively. It took her nine months, so the better part of a year, for her to actually do that translation. It wasn't a huge paper, but what she did is she expanded considerably on the notes in the paper and very clearly discussed the uses, the usefulness, and what the machine, the analytical engine, could be used for, and future capabilities, which eventually would prove to be correct.
So, it's the sort of thing where Babbage, I always got the feeling that Babbage was kind of like, this is a really cool idea, we should do this, it's awesome, it's wonderful, it's everything, but he wasn't big on the execution and wasn't big on the selling of it. Whereas Ada was very good at explaining it and selling it. So the bit where there's the dispute, though, is over whether or not she actually wrote, quote unquote, the punch card program for the analytical engine, which I think was the Fibonacci... no, I think, the Bernoulli numbers. Anyway, there was an algorithm, like you were saying. And the problem is, if you look at the correspondence, the letters that went back and forth between Ada and Babbage, Charles Babbage, it almost suggests that Charles Babbage wrote the algorithms and that Ada Lovelace actually created the program from those algorithms. But at one point where he had attempted to assist her in creating the punch card program, he'd made an error, which Ada picked up on, and Ada corrected his mistake. So, there's no question that she understood how the machine should be programmed. And there's no question that she helped popularize it and she understood the value of the machine and helped to explain it to the masses. And there's no question that that inspired a lot of people to become more involved in computational machines, which would then go on to become... Yeah. So, she also famously dismissed the idea of artificial intelligence. That's... yeah, well, yes. On that note, I'd like to talk about our first sponsor, who does not write artificial intelligence software, but some of the software they write is incredibly cool, and that's Many Tricks. Now they're a great software development company whose apps do many tricks. It's in the name. Can you tell? Their apps include Butler, Keymo, Leech, Desktop Curtain, Time Sink, Usher, Moom, Name Mangler and Witch. As in a witch. Not which way am I going. Now the thing is that there's so much to talk about for each of their apps. I can't go into each of them here, so I'm just going to highlight four of them. We're going to start with my personal favorite and that's Moom. Now Moom makes it easy to move any of your windows to whatever positions on the screen that you like, halves, corners, edges, fractions of the screen, and you can even save and recall favourite window arrangements with a special auto-arrange feature when you connect or disconnect an external display. Usher can access any video stored in iTunes, Aperture, iPhoto and on any connected hard drives on your Mac, and it allows you to easily group and sort and tag them in one place. If you install Perian or Flip4Mac, there's no need to convert anything into an iTunes H.264 format. So you don't have to handbrake your afternoons away just in order to get into iTunes to watch your videos anymore, because you can do that in Usher. You can watch them all in Usher. So if you've got a video collection that's scattered across different programs and drives in different formats, then Usher can help you straighten it all out. It's very handy. Now, Name Mangler's the next one. You've got a whole bunch of files and you need to rename them quickly, efficiently and in large numbers. Well, Name Mangler can help you with that.
It can extract the metadata from the files, it can use that to rename the files, and obviously it's got search and replace, but you can create staged renaming sequences, and the best part of it is if you get something wrong and you mess it all up, you can just go back to where you started and have another go. No damage, works really well. So, Witch. I mentioned Witch before. You should think of Witch as a supercharger for command-tab app switching. Now Witch is great, and it's very popular with ex-Windows people like me. If you've got three or four documents open at once in any one app, then Witch's beautifully simple pop-up will let you pick exactly the one that you are looking for. It's really, really handy. Now, that's just four of their great apps. There's still five more to check out. And there's a new one coming as well. All of these apps have free trials that you can download from manytricks.com/pragmatic, that's ManyTricks, all one word, and you can try them before you buy them. They're available to buy from their respective pages on the site or through the Mac App Store if you prefer to get it through there. But if you visit that URL, once again they've extended it because they're awesome, you can take advantage of a special discount off their very helpful apps exclusively for Pragmatic listeners. Use the code PRAGMATIC25, that's Pragmatic the word and 25 the numbers, in the discount code box in the shopping cart and you'll receive 25% off. This offer is only available to Pragmatic listeners for a limited time, because the show is going to end and so too will this offer, so take advantage of it while you can. There's only a few more weeks left. Thank you to Many Tricks once again for sponsoring the show for so long and always being there for us. Thank you so much. Okay, I've mentioned punch cards. I think it's time, I think it's time that we talk more about punch cards. Do you like punch cards, Vic? I have actually never seen or used or touched a punch card. You have missed out because I think... I think I probably have. The first time I saw a punch card, I looked at it and I tried to figure out what it was. I didn't know what it was. I was picking this thing up and it had all these little funny rectangular holes in it and numbers down the sides. I'm like, "What the hell is this for?" It looked a bit like a library index card to me, but obviously it wasn't. Because of course- Well, there's probably a lot of young whippersnappers like me that just are completely clueless about this. Well, not clueless that it exists, but clueless what it's about. This is the moment when I point out- They should keep in mind something like a standardized test answer sheet with little squares that you shade in with a pencil for multiple choice, except instead of shading it in, they would punch the holes. Exactly. But, Vic, you just said young whippersnappers like you who may not have seen these things, you're two years older than me. So, what's... So, what happened, man? I say this in reference to the grand scheme of computer programming. OK, well, OK, let me just tell you how I came across them first. OK, when I was doing my university degree, the university had a whole bunch of old antiquated stuff. So it was old tech when I started uni. I started... my first year in university was in 1994, no, 1993. Gosh, it was so long ago. Anyway, 20 plus years ago, way back in the dream time. And then they hadn't used that machine for at least 10 years.
So they were using the programmable punch card computers up until the end of the 70s, essentially. Anyway, as in, you know, my university was. OK, so let's talk a little bit about the history of the punch card. And the thing that's interesting is it comes back to something I was saying before about the first person to patent it gets a lot of the credit. So if you read Wikipedia, you will get that the first person who patented the punch card was a guy called Herman Hollerith, and that was in 1884, and he was using it for the purposes of storing data for the US Census. OK, Hollerith did not invent the punch card. We know this because, well, our good friend, Mr. Charles Babbage, designed a machine that used punch cards in the analytical engine several decades before that. So, Hollerith did not invent the punch card. No, not only that, Babbage also did not invent the punch card for anyone else that thinks that he did. He didn't. See, the punch card. Sorry. Is it safe to say that Herman Hollerith was the first patent troll? I don't know. Moving on. So the problem, I guess, with the history of the punch card is that data, the thing that annoys me is the difference between code and data. OK. And I realise that no matter how you slice it, code is data, but data isn't code. Yeah. You know, it's kind of like a one directional Venn diagram, if that makes any sense, because you could argue that a series of instructions about how a code should be executed can be interpreted as being a form of data. Right. But if I have data that is simply representative of information, that data does not therefore mean it can be used as an executable program series of instructions. So this is the problem that I've got with the distinction between, well, this punch card stores a program versus no, this punch card stores data. So it's not really a punch card. Well, the thing is that the patent was for storing data, not programs. So Herman Hollerith's thing was, it was about storing data for the census. It was not about storing a program, whereas Babbage's application was as a program. But Babbage was not the first person to use it as a form of programming, because if we go back to our definition, remember our definition at the beginning was "programming is the act of instructing a machine how to behave based on a reconfigurable set of instructions." If it's a fixed set of instructions, it's not programmable, then it's not programming. Right. If I decide to design a machine to do one thing and one thing only, it's not programmable. It's a custom machine. OK, so that's important to remember. So if you want to be more pedantic or perhaps more thorough, depends how you want to think about it, then why don't we go back to the Jacquard loom? Have you ever heard of the Jacquard loom? I have not. The 1801 was the first actual working model of a Jacquard loom. And it was, you know, created by Joseph Marie Jacquard, who, you know, was French. And the Jacquard loom punch card or punch cards were a series of punched cards, funnily enough. And what they would do is you would feed them into his loom and that would determine the pattern that the loom would... The weaving pattern. Yes. Now, it was not the first necessarily of its kind, but it was the first that was on a mass scale that could weave as the truly complex patterns. There were still machines before that that used punch cards in one way or another, but it's recognised as essentially the first mass manufacturing application using a punch card. 
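To make that "the card is the instruction set" idea concrete, here's a tiny sketch of it in Python. The card layout and the pattern here are invented purely for illustration, this is not Jacquard's actual card format: each row of the card is one pass of the shuttle, each hole lifts one warp thread, and swapping in a different card produces a different woven pattern without touching the loom itself.

```python
# A toy "Jacquard card": each row is one pass of the shuttle, each column is
# one warp thread. A hole (1) lifts the thread, no hole (0) leaves it down.
# The card layout is invented for illustration, not Jacquard's real format.
CARD = [
    [1, 0, 1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1, 0, 1],
    [1, 1, 0, 0, 1, 1, 0, 0],
    [0, 0, 1, 1, 0, 0, 1, 1],
]

def weave(card):
    """Render the pattern the card would produce: the card is the program,
    the loom just executes it row by row."""
    for row in card:
        print("".join("#" if hole else "." for hole in row))

weave(CARD)  # feed in a different card and the same loom weaves a different pattern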
And that punch card and weaving pattern, that is data. Okay. It's not, you can argue it kind of is a program, it kind of isn't. But you know what? I think it is a program because that loom and that weaving pattern on that loom depended upon the settings. You change the settings in the program, you change the end result. Therefore, you are programming that machine. Therefore, that is... It's an instruction set. It's an instruction set. Yes. Unique to that machine. An algorithm. Yes. And we're going to get to that in a minute, the whole uniqueness point. So, here's the next interesting thing that most people don't know is Mr. Charles Babbage was a huge fan of this guy. Well, we assume he was because he had kept a portrait of him on his wall, you know, surrounded by his Jacquard loom punch cards. So, it's pretty clear that the loom's punch cards were a direct influence on the analytical engine that Babbage created. And that's where he got the inspiration to use punch cards for programming analytical engine. Okay, so that's really cool. It is very cool. In the more modern parlance, of course in the IBM parlance, they came to be known as just, you know, punch cards and a group of punch cards, rather like cards in a deck, were called a deck. Which I sometimes wonder if that was some of the inspiration for the name DECK, Digital Equipment Corporation, like as a, you know, DECK replaced DECK of Cards. I don't know, whatever, probably not. But still, it's nice to think about it. That is not... It's not completely unreasonable. I think it would be cool if it was true, but I'm not sure it is true. Anyway, that's okay. I'll leave that one alone. This is This is not the bugle. Okay. Before we talk about our next big name, I think we need to go through some more detail about types of machines. Okay. So let's talk about different kinds of machines like manufacturing equipment. So the way I would define manufacturing equipment that can be programmed is like the loom, right? But it doesn't have to be a loom. I mean, you can have machines, you can program do all sorts of things. I mean, if you want to talk about advanced robotics, that's all programmable. The machines that can be programmed to weld different spots on a car as it's going on a production line or pick and place machines or CNC router machines that machine away material to create different designs. All of those manufacturing equipment, they're all programmable. And at different points in history have used different methods of programming. So that's a physical output from a, I guess it's a physical input in a sense, but it's a programmable input, it's more the point. Now, other types of machines are purely for calculation. And you've got two kinds of outputs. You've got a physical output from a calculation, either printed on a piece of paper or ticket tape, or, you know, whatever, or it's visually displayed. And of course, we've been focusing on the real basic stuff, you know, like calculation for the sake of calculation. Like here's a polynomial to the factor of God knows what, and here's the answer, it's 10.1 or whatever, or 10.10.1. Anyway, that wasn't even funny. The point is that, you know, I guess I'm talking about programming machines. Those are the ones I'm particularly gonna focus on. I'm trying to think of how many other machines don't fall into those categories. I'm sure there's some, but those are the bigger ones. So in terms of computers in the 20th, in the first half of the 20th century, there are essentially two kinds. 
There was what they referred to as a stored program, digital computer, and the other ones, the program controlled computer. And these are kind of broad, esoteric sort of categories. They're not something that most people aren't familiar with that terminology, and it's not commonly used. But I guess the point is that you had to try and separate the kind of computers that they were. They're both computers, but they are very different in how they all functioned or how you program them. So a stored program digital computer, it does what it says on the box. The program can be input by many different methods, but it can be in some instances, semi-automated. But the key point is that the program is stored somehow in the computer's own internal memory. And memory could be through multiple means. It could be a mechanical memory. It could be relay switch memory, or it could be, of course, the, you know, it could be thermionic valves, or of course it could be through these wonderful things called transistors, you know, and random access memory. Well, they're technically random access memory is all of those things I just mentioned. It's just that these days, random access memory is meant in the parlance of silicon. So, you know, so the pro, the computer itself stores the program that it then executes. So stored program, digital computer, as opposed to a program controlled computer, where they are programmed by either setting a series of switches and dials like a Babbage, you know, difference engine or an analytical machine, or of course you can insert patch leads to route data, you know, to different control signals between different functional units. And I also saw one of those, like an old analog computer, some people refer to those as analog computers. And, you know, the point is that you would literally patch if you wanted a gain block, you would patch with a lead from here to here and you would get 10 units of gain and so on and so forth. You could set your feedback resistors to do a voltage split, blah, blah, blah, blah, blah, all this sort of rubbish. But anyway, so that's all a form of programming. Of course, it's very labor intensive and that's a problem. And the labor means that every time you wanna run a program, it takes minutes, hours, days, weeks to set it up, depending on how big the program is and how complicated it is. And there's a lot of room for error. So those machines, thankfully, died because, well, they sucked. But you know what? That was still an important step on the journey. Let's be honest, that were a big step. Stored program, digital computers, that's where it's at, anyway. So I mentioned the mechanical settings. I mean, you could spin dials and wheels, like mechanical wheels to a certain position, you know? And what you would do, for example, in the Babbage system and you would turn a crank handle, you literally would turn a handle and crankity, crankity, crank, X number of spins and it rotates through all the different combinations to end up with a result. I'm not gonna go into how it does it. If you're interested, there's a link in the show notes, go and read up all about it. And as I said, they actually did build some of his machines and it's pretty damn cool, but it's also enormous for its computational power. So it's not exactly gonna fit in your backpack and well, let's face it, it really isn't all that powerful when you compare it to, I don't know, a basic wristwatch. - Yeah. - That's okay. 
However, in more recent times, using a tape or a card with pre-punched data to set the input conditions, that was far better because it reduced the setup time. And if you could store it by using a storage method, then obviously that's going to be a hell of a lot better 'cause it just has to go to the RAM to get the code and execute it. Yep. But you still got to put it in somehow. And remember, keyboards for computers, that hadn't happened yet, there were no screens at this point, none of that. Right. Okay. So, one of the first computers, the Colossus, which we'll talk about when we talk about Mr. Turing shortly, it photo-optically read a tape that had marks on it, and that was used to set wheel positions much more rapidly, which increased the setup speed of the machine for every calculation cycle. And I think you could call it a mass-produced computer of its day because they made 10 of them. It's like, "Yay, that's a lot!" Well, it's a booming industry. Booming industry for the 1940s computer industry. So yeah, you know, it was popular, quote-unquote popular. That was during the latter half of World War II. Anyway, so reading mechanisms, obviously, they were all improved and everything moved towards optical or magnetic. Inevitably though, for human inputs, we eventually progressed to keyboards. And in terms of saving programs, beyond random access memory, we wanna be able to import programs somehow and save them more efficiently than on a ticker tape or a punch card that could get folded, spindled or mutilated, as they would often have in warning text. Do not fold, spindle or mutilate, which used to be, of course, what they would say on an envelope for a physical letter. You have physical letters? - Yeah. - Yeah, those things. - I do, I remember those. - Crazy, crazy days. You put a stamp on them and you put them in a magic box and they show up on someone else's doorstep on the other side of the world, that's just crazy. Anyway, so, yes. So the problem with all of these methods though, oh, sorry, I forgot, sorry. I might talk about magnetic tape drives, disc drives, then we evolved to spinning platter hard drives, EPROMs, EEPROMs (E-squared PROMs), random access memories, and now of course, flash memory being where it's at. The bottom line is they're all just storing the program. The whole idea of you type it in, you save it to something, and then the computer recalls it later, that hasn't changed in a hundred years. Well, maybe not a hundred years, maybe about 80 years. Pretty much since they first had stored code in them. Just the method changes, that's all. So the programming language for a lot of these early machines, though, was custom. It was all semi-custom. It was highly customized. So if you had a punch card that worked-- sorry, a ticker tape that worked on a Colossus, it would only work on a Colossus. It wasn't going to work on anything else. Same with the punch cards. A punch card program for a certain brand of computer would not work for a different one. There was no common standard. There was no nothing. And there simply weren't enough computers out there to demand it. I guess that was the point. But, I mean, okay, you could argue that the basic building blocks, an AND gate and an OR gate and an exclusive OR gate, they all existed, but the representations were unique and not interchangeable. I guess that's the point. Yeah, so I've complained about this before in the PLC programming world, because all the damn PLCs are all subtly different and so on and so forth.
And the PLC program well it's no different you know their programs and that's programming as well and one of my early frustrations that I have mentioned previously is entering a program using a custom programmer it's literally a box a customized box with a numeric keypad and ABCD and an enter key on it and you plug it into the PLC and you would type in an instruction and you hit enter and it would go beep and it was a good beep or a bad beep and a bad beep would be oops start again you know and once you have finished you commit the code with a certain key sequence I forget what it was. And there you go, your PLC was programmed, you know. And that was that was like no screen. It was just an audible beep and a flashlight and a keypad made out of the bulkiest keyboard you could imagine. You know, like Commodore 64 thick keyboard, that kind of thickness. So anyway, I had one of those. Yeah, I had a VIC-20. Same physical dimensions pretty much except more sucky. Anyway, Never mind that. All the cool kids had Commodore 64s and I was stuck with my VIC-20. Anyway, I don't really, I'm not really all complaining that much. I think the VIC-20 was fantastic and it inspired me to get into computers and programming a lot more. So, you know, I was glad we were able to get one. Anyway, so before we go on any further, I'd like to talk about our second sponsor and that's lynda.com. So by the way, John von Neumann is next. So lynda.com is for problem solvers. Yeah, it's for the curious and for people that want to make things happen. You can instantly stream thousands of courses created by experts in their fields of business software, web development, audio, graphics design, and lots and lots more. Way too many to list. Now, they have an enormous library of titles that you can choose from. There are new courses added every day, and that makes sure their library is relevant and up to date. Lynda.com is used by millions of people around the world, has over 3,000 courses on topics like web development, photography, visual design, and business, as well as software, training software, like for Excel, WordPress, Photoshop. They work directly with experts from many different industries and software development companies as well to provide the timely training you need. Often, the exact same day the newest release becomes available on the market, so you know you've got the latest information the moment you are most likely to be needing it. Now, this is nothing at all like the homemade tutorial videos you're gonna find on YouTube. And they might tell you, if you're lucky, a little snippet, unindexed, buried deep, deep, deep somewhere inside at some insane time index, and they'll never tell you what it is. Anyway, lynda.com make high quality, easy to follow and well indexed video tutorials with transcripts that are broken down into very easily searchable sections. And this bite-sized piece approach makes it really, really easy to stop and pick up wherever you left off if you get interrupted. And you can do that whenever you need to. So you can learn at your own pace, in your own way, and in your own time. Now whether you're a complete beginner with absolutely no knowledge at all about a subject, or let's say you've got moderate or you consider yourself an advanced user and you're just looking to brush up on what's new in the latest version? Well, lynda.com has courses that span the entire range of those experience levels. You can learn on the go as well. 
lynda.com has iPhone, iPad and Android apps, and they also support playlists and provide certificates as evidence when you've completed courses. If you're on LinkedIn, you can publish those certificates directly to your profile. Now many, many years ago, I left Windows and I switched to a Mac, and I got stuck into lynda.com's Tiger: The Basics tutorial, and then the next year, Leopard New Features and Essential Training. And that was eight years ago. They're not a new thing. They have been around a long time for a good reason. They are that good. Now, some interesting courses available right now include Excel 2013 Power Shortcuts, always handy to know the shortcuts. If you're a regular listener, you'll know that I love Excel. That's okay. Please don't hate me. I love Excel. And another interesting automation-related course is called Up and Running with If This Then That. That's IFTTT. And, yeah, IFTTT is really cool, so that one is worth checking out. Now, there are also courses on WordPress, Photoshop, Google Drive, Google Sites. I mean, seriously, there really is something for everyone. So a lynda.com membership will give you unlimited access to training on hundreds of topics, all for one flat rate. Whether you're looking to become an industry expert, you're passionate about a hobby, or you just want to learn something new, you can visit lynda.com/pragmatic and sign up for a free 10-day trial. It's free to try. And once you do, you'll see exactly what I'm going on about and why I think it's so cool. Thank you once again to lynda.com for sponsoring Pragmatic. Okay. John von Neumann, or Newman, but let's go with Neumann. I say Newman, but I'm probably wrong. Yeah, I think you are. That's okay, though. Lots of people say Newman. Anyway, John von Neumann was born in Hungary, and that was on the 28th of December. Why are all these people born in December? They later- he later moved to America. Now, he has been- I don't get it. It's like if you're going to contribute to computer science, you had to be born in December. Okay, I'm screwed. Never mind. Very, very prolific. I missed the boat, too. Yeah, well, we're both screwed. There you go. So computer science, you'll have to live without, you know, the Hudson and Chidgey contributions. We were just born in the wrong month, man. What can we say? OK, moving on. So he was a very prolific contributor to science and engineering technology, and he is best known, I think, for his work on the Manhattan Project and the first hydrogen bombs. However, for the purposes of this discussion, I'm more interested in his contributions to the computer. One of his, I think, rather funny creations: he is credited with creating the first self-replicating computer program, in 1949. That makes it the world's first computer virus. Oh, yeah. Oh, I'm not sure if he should be proud of that, but there you go. This is the bit that I want to talk about, though, with Mr. von Neumann. He began writing, and it is said that it became accidentally widely released, the first draft report on the EDVAC. That was in 1945, and it described a computer architecture that is still in some small ways the basis of modern computers and is referred to as the von Neumann architecture. Now, that specific kind of machine came to be thought of as a stored program digital computer. It describes a common memory space that is used for storing both the program code and the data, and a control unit and an ALU, arithmetic logic unit.
Those two comprise what became known as a central processing unit, CPU. And that sits between the input to the machine and the output from the machine. Now, if you're with me so far, there is a problem. It's actually based on the work of Mr. J. Presper Eckert and John William Mauchly. I think that's how you pronounce it. They were actually the inventors of the ENIAC, I think that's how it's pronounced, ENIAC computer. And that was at the University of Pennsylvania. And that was happening at that time. Now, I guess the worst part is the unfinished draft paper invalidated the pending patent claims of the two people who had done the bulk of the work on that computer. It's true that von Neumann was a consultant at that time. However, it's not clear how much involvement he had. I have no doubt that the guy... There's another patent claim. I hate patents. Anyhow, never mind that. That is another whole discussion for another show. Stay tuned to see if it's on the final three. But anyway, also, I guess the bottom line is it's funny how names stick. They get associated with something. And remember, that was a first draft. That wasn't even a formally completed, finished report. It was, yeah, it wasn't. It was a draft. Now, why would you circulate a draft paper? I mean, think it through. I'm just saying, I'm not going to... You draw your own conclusions anyway. So the architecture has also given birth to another term that's thrown around, and that's the von Neumann bottleneck. And I kind of like the idea of a bottleneck, because the bottleneck is an idea that, well, if you've got a bottle, the neck of the bottle is the part of the bottle that narrows towards the top, towards the lid, the exit point and entry point of the bottle, such that if it's got a larger base than the neck and you turn it upside down, the liquid can only run out at the rate of the minimum diameter of the neck. And because it's a liquid, liquids are incompressible. So no, you can't put more pressure behind it to force it out, in case anyone was trying to be clever. Anyway. So the bottleneck is an idea. The idea is that something is restricted in the flow. And that of course is applicable to data. So the data flow is restricted. Why? Well, that design has a common bus that's used to fetch the instruction code and the data, 'cause it's in the same memory area. So you can't have simultaneous fetch requests. It's fundamentally restricted. And that's mainly the reason why we don't use the von Neumann architecture predominantly. What we do use is something called the Harvard architecture, and that overcomes the bottleneck by separating the buses, and that allows simultaneous read functionality. And in fact, if you want to be really pedantic, there's a... it's called the modified Harvard architecture. That's actually the most common structure these days in a modern computer. Some people say, oh, but the von Neumann architecture is still used in cache memory and blah blah blah, and okay, yeah, fine, it's a small part of a small part of the computer, but at the time it was a huge deal. But I think too many people say, oh, computers are now completely based on von Neumann's work. No, they're not, okay? I'm sorry. Was it a big contribution? Absolutely. Was it his contribution specifically? I'm not sold. Maybe, maybe not, but irrespective. If you want to go into more detail about each of those architectural differences between Harvard and modified Harvard, feel free to. There are lots of links in the show notes.
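If it helps to see the "code and data in one memory" idea in something runnable, here's a deliberately tiny sketch of a von Neumann-style stored-program machine in Python. The instruction set is invented purely for illustration and has nothing to do with the EDVAC itself; the point is just that instructions and data sit in the same memory and come over the same fetch path, which is where the bottleneck talk comes from, whereas a Harvard-style machine would give the program and the data separate memories it could read at the same time.

```python
# A toy stored-program machine: program and data share one memory array and
# one fetch path. The instruction set is invented purely for illustration.
memory = [
    ("LOAD", 8),     # 0: acc = memory[8]
    ("ADD", 9),      # 1: acc += memory[9]
    ("STORE", 10),   # 2: memory[10] = acc
    ("PRINT", 10),   # 3: print memory[10]
    ("HALT", None),  # 4: stop
    None, None, None,
    5,               # 8: data
    7,               # 9: data
    0,               # 10: result goes here
]

def run(memory):
    acc, pc = 0, 0
    while True:
        op, operand = memory[pc]   # every fetch, code *or* data, goes through the same memory
        pc += 1
        if op == "LOAD":
            acc = memory[operand]
        elif op == "ADD":
            acc += memory[operand]
        elif op == "STORE":
            memory[operand] = acc
        elif op == "PRINT":
            print(memory[operand])
        elif op == "HALT":
            return

run(memory)  # prints 12; the "program" is just more contents of memory
```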
No shortage of information out there on the net, because of course the net was predominantly written in its early days by computer geeks who found this stuff interesting. Plenty to read. Yes. Two more people to talk about. You've heard of this dude. His name is Alan Turing. Yeah, I think some vague references, maybe. Some vague references, maybe. I have always been curious about Turing, because the Turing test had fascinated me from a very young age. And Eliza, the artificial intelligence software, also fascinated me from a very young age. So I became aware of Turing almost before von Neumann, certainly before Babbage, Ada, and any of the others, or Pascal. I think he's probably definitely a little more well-known than those guys. Alan Turing was born on the 23rd of June, not December. Oh, yeah. Anyway, 1912. And he died. Maybe that's why things didn't work out too well for him. Don't look at it like that. Stop jumping to the damn end. All right. Sorry. Okay. He died because everyone dies. So that's not really a new story, but he died, quote unquote, from "accidental cyanide poisoning". And he was only 16 days shy of his 42nd birthday. Now, you may be thinking the first thought that comes through your mind, because it's the first thought that came through my mind: how do you die from accidental cyanide poisoning, considering cyanide is used as a suicide tool of choice for, well, at least that's what we're led to believe, for secret agents, and, you know, there's arsenic and cyanide, right? And of course there's that beautiful, memorable excerpt from Licence to Kill, 007, when the Hong Kong narcotics guy is busted by the drug kingpin Sanchez, and he sort of walks in and flips a tooth, crunches on it, and then he's dead inside like 5 seconds and white foamy stuff comes out of his mouth. It's all very dramatic. Well, how does one accidentally die from cyanide poisoning? Well, we'll get to that in a minute, but anyway, recently there was a movie released called The Imitation Game and it was a portrayal of Turing's life, and he was portrayed by Benedict Cumberbatch, who is famous for his portrayal of a modern Sherlock Holmes. And he is absolutely amazing and brilliant in that and I love that show. And he also was in the second Star Wars... Star Trek, good God, I'm going to get shot for that. Star Trek Into Darkness, I think it was, the second episode of the reboot, the second movie in the reboot. Yeah. And he played Khan Noonien Singh and did an absolutely amazing job of that as well. He's been in a few other things as well, and did well, not quite as well, but still. I have to admit that the movie, in terms of factual accuracy, could be considered to be woefully inaccurate in many details. So let's... I've heard that. I haven't seen it yet. Well, let's start with the details where it's wrong that relate to programming. OK, the name of the machine that actually broke the German Enigma code was called the Bombe, and it's B-O-M-B-E. He contributed to its development, but the first version, you know, a prototype, I guess you could call it a prototype, was actually called Victory. Which is not what they call it in the movie. I think it was something like Christopher or something. I forget. Anyway, it was not those names. So that's like, OK, the digital computer that Turing invented was actually called the Universal Turing Machine. But the first programmable digital computer was called Colossus, which I mentioned previously, and that was actually built by an engineer, Tommy Flowers, I think.
So, Turing didn't actually build a machine. It was based on some of his ideas, yes, but he didn't build it. Now, the Colossus is considered a program controlled computer. Not sure if I mentioned that previously, but the reason that Turing was credited so much is just because there are key elements in it that were ideas from his own machine. That's all. The accusations in the movie that he was a suspected Soviet spy at the end were complete BS, you know, but let's never let facts get in the way of a good story, I guess. So there's a few good biographies out there of Turing, there's a link to one in the show notes. Now that's been redone post... like, it was released well before the movie, then they re-released it after the movie was released and said, yo, hey, you should buy this because the movie was based on this, but it's like, yeah, well, apart from the new cover, not much else. It's, you know, anyway, so, so yeah, have a look at that if you're interested. Now, Turing died of supposed accidental poisoning. But the problem is, of course, that that was in, what, 1950-something. It's not like CSI was a big thing. Okay, this is the 50s. CSI. There was no CSI Miami, there was no CSI New York, and they weren't holidaying in the UK, and, you know, didn't lend a hand out to their English friends with all their... you know, this is not... yeah. So it was very badly investigated, and some have said because of his sexual preferences, basically there were lots of people that just jumped to conclusions and it was just not investigated and he was mistreated. I think there is definite evidence that that was the case and that sort of discrimination was very common, not just of Turing, but in that era. And frankly, it still happens, not to the point that they subjected people to 50 years ago, well, 60 years ago now, you know, certainly... Oh, back then it was still literally illegal. Yes. Things have changed. Things have come a long way. So anyway, rather than focus on that, let's talk about the supposed suicide. Now, Turing showed no signs, or rather none of the usual signs, of someone who's intending to commit suicide. And that of course is a sort of evidence that was provided, which is hardly definitive, but it's something. Now, he did keep cyanide in his house, but he used it because he was a tinkerer, he was an experimenter. He liked to run experiments on all sorts of crazy things because that's just the sort of guy he was. He liked experimenting on things and that's okay. Admittedly, I like experimenting on things too, but I don't carry cyanide in my house, so I don't know, whatever. So anyway, he kept cyanide in his house for some of his chemical experiments, but he was also known to be a little bit careless. There was one time where it was described that one of his experiments was an electrolysis experiment. Well, he needed some power for that experiment and he got it from a nearby light socket. Not kind of safe really, although admittedly back then everything was all fused wiring, there were hardly any circuit breakers, if any, in most houses at that point, because circuit breakers were too expensive. The other problem of course is there was no earth leakage. So it was just like, oh yeah, it was a fuse. So if you go over current, you blow a fuse. Easily long enough to kill you before a fuse blows.
So anyway, he had an experiment room, a lab room, and it's unclear whether he was in that room immediately prior to his death, but when they came to his house afterwards the room smelled of cyanide vapor. That suggests he had been running an experiment involving cyanide prior to his death, and the cyanide had vaporized as part of that experiment. It's also likely, based on the levels of cyanide found in his organs, that the cyanide was inhaled rather than ingested. And the thing to understand about that is that inhaled cyanide vapor takes a lot longer to kill you than direct ingestion. Direct ingestion isn't exactly instantaneous, but it's pretty damn close for a lot of people - it depends on the person and the concentration. The truth is that inhaling it takes longer, but it's still just as fatal. Now, there's plenty of reason to doubt that it was suicide. Even murder theories have been kicked around - that it was staged. Unlikely. If you were going to stage it, you'd make it a little more obvious. And if it was suicide, you'd expect there to have been more signs. So, you know, it's just fishy, and we'll never know, I think, because the evidence was so poorly collected. Anyway, back to programming - I thought that was an interesting aside. And as far as The Imitation Game the movie goes: don't let the facts get in the way of a good story, and don't let facts that can't be misinterpreted stop you from bending them to make a better story. Whatever. Thank you, Hollywood. All right. Turing, however, was also famous for that test I mentioned, the Turing test. I don't think I actually said what it is. So the standard interpretation of the Turing test is this: imagine three players, call them players, okay? Two of them are people, one of them is a computer, or a computer running a program. The concept is: imagine two rooms, or two doors, and in a third room you have the interrogator - call them player C if you like. Player A sits behind one door, player B sits behind the other. Either way, player C cannot see A or B; the only interaction they have is through a computer terminal. If, through that terminal, player C is unable to determine which of player A and player B is the computer and which is the person - if they can't pick - then the computer is said to have passed the Turing test for artificial intelligence, in the sense that the artificial intelligence can impersonate a human being convincingly. That's why I was so fascinated with Eliza. Eliza purported to be a therapist of sorts, someone you could talk to. Or was that Dr. Sbaitso? I can't remember. Anyway, very fascinating. Now, another term associated with Turing is "Turing complete", or "Turing equivalent". And I have to admit - I know I'm not a computer science major, I'm an electrical engineer, but I've done plenty of programming - I look at this term and I find it very fluffy and not particularly useful as a measurement of anything. I totally get the artificial intelligence angle; that makes sense. And a lot of his ideas about computation and random numbers, and how randomness can help you reach conclusions in less time - fascinating stuff, very interesting. Yeah, well, to some people, I guess.
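To make the setup of the test a bit more concrete, here's a rough sketch in Java for the show notes - the class names and the canned "machine" are invented purely for illustration, and the machine player is a deliberately naive Eliza-style reflector rather than anything resembling real AI:

import java.util.Scanner;

// A rough sketch of the test setup: the interrogator only ever sees text
// answers coming back through the "terminal"; they never see who or what
// is behind each door. All names here are made up for illustration.
public class ImitationGame {

    // Both players, human and machine, sit behind the same text-only interface.
    interface Player {
        String answer(String question);
    }

    // A player backed by a real person typing at the console.
    static class HumanPlayer implements Player {
        private final Scanner in = new Scanner(System.in);
        public String answer(String question) {
            System.out.println("(question for the human) " + question);
            return in.nextLine();
        }
    }

    // A deliberately naive machine player, in the spirit of Eliza:
    // it reflects the question back rather than understanding anything.
    static class MachinePlayer implements Player {
        public String answer(String question) {
            return "Why do you ask whether " + question.toLowerCase().replace("?", "") + "?";
        }
    }

    public static void main(String[] args) {
        // The interrogator does not know which door hides the machine.
        Player[] doors = { new HumanPlayer(), new MachinePlayer() };
        String question = "Do you ever dream?";
        for (int i = 0; i < doors.length; i++) {
            System.out.println("Door " + (i + 1) + ": " + doors[i].answer(question));
        }
        // If, over enough questions, the interrogator cannot reliably pick
        // which door is the machine, the machine is said to have passed.
    }
}

The shared Player interface is really the whole point of the test: the interrogator only ever sees text, never the thing that produced it.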
But the point is this whole idea of Turing complete - do you understand it, have you come across this one? - I've seen the reference, but I don't know a whole lot about it. - Okay, well, real quick, here's what it is: the concept is that a real-world general-purpose computer or computer language can approximately simulate the computational aspects of any other real-world general-purpose computer or computer language. Okay. Okay. So that's what Turing complete means. And I don't see why that's a particularly useful measure of anything, because if I can simulate the computational aspects of another device - beyond modern virtualization, what is that a measure of, exactly? It doesn't mean the device is more usable. Perhaps it means it's more flexible, maybe, but I don't get it beyond that. Maybe someone can explain why that's a big deal, because I don't get it. Anyway. All right. That's enough about Turing. Moving on to the next person, then we're going to talk about abstraction, and then we're going to wrap it up. Now, this one was your request - I threw this one in at your request - and it's a lady by the name of Margaret Hamilton. And this relates to NASA, initially to NASA. One of the things that I think people don't appreciate is NASA's role in advancing computer technology, because computational power meant that the Apollo program in particular was able to do far more advanced control than previous missions, with the computing it had not just on board but also on the ground. And the bottom line was that that involved writing a lot more software for a system that essentially relied on that software being absolutely reliable, bulletproof, dependable. Because when you are that far away from Earth, there is no fallback. In the past, you would only place that level of trust in a mechanical system, or at the very least an electrical system, but not a control system based on software. That was a newer idea, considered quite risky at the time - and even today it's sometimes considered risky. The mission and these people's lives depended upon it. Yeah. And I mean, you can take that extension into airplanes today. For the longest time, and even in most airplanes today, fly-by-wire systems have a hydraulic backup - an electromechanical servo backup system - such that if the computerized control system on the plane fails, the pilots can always resort to a more traditional method of controlling the plane. Because remember, before hydraulics it was all done by cable, and when planes got too big, the flight surfaces, the airfoils, became too large and heavy for the cable systems - too heavy for a pilot to physically move them. That was when Howard Hughes introduced hydraulics, and after a while hydraulics became accepted. At first it was "you're never going to get me on a plane with hydraulics, that's too dangerous, what if the hydraulics fail? The plane will crash, I won't be able to control it." That thinking prevailed for a few decades, then people got over it and began to trust the hydraulics, to the point where hydraulics is now your baseline, your entry level of controllability if all else fails, right? And now the same skepticism surrounds software.
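Circling back for a second to that Turing-complete definition before we get further into Hamilton and NASA: one concrete way to read it is that any general-purpose language can host an interpreter for any other. As a rough sketch - purely illustrative, not something discussed on the show - here's a minimal Java interpreter for Brainfuck, a famously tiny language that is itself Turing complete given unbounded memory:

// A minimal interpreter for Brainfuck: eight single-character instructions
// operating on a tape of cells, yet Turing complete given unbounded memory.
// One language (Java) simulating another's computation is the point here.
public class TinyInterpreter {

    static void run(String program) {
        byte[] tape = new byte[30000]; // the "unbounded" tape, truncated in practice
        int ptr = 0;                   // data pointer into the tape
        for (int pc = 0; pc < program.length(); pc++) {
            switch (program.charAt(pc)) {
                case '>': ptr++; break;       // move the pointer right
                case '<': ptr--; break;       // move the pointer left
                case '+': tape[ptr]++; break; // increment the current cell
                case '-': tape[ptr]--; break; // decrement the current cell
                case '.': System.out.print((char) tape[ptr]); break; // print cell as a character
                case '[': // if the cell is zero, skip forward past the matching ']'
                    if (tape[ptr] == 0) {
                        int depth = 1;
                        while (depth > 0) {
                            pc++;
                            if (program.charAt(pc) == '[') depth++;
                            if (program.charAt(pc) == ']') depth--;
                        }
                    }
                    break;
                case ']': // if the cell is non-zero, jump back to the matching '['
                    if (tape[ptr] != 0) {
                        int depth = 1;
                        while (depth > 0) {
                            pc--;
                            if (program.charAt(pc) == ']') depth++;
                            if (program.charAt(pc) == '[') depth--;
                        }
                    }
                    break;
                default: break; // anything else is treated as a comment
            }
        }
    }

    public static void main(String[] args) {
        // Builds 72 ('H') and then 105 ('i') in a cell using loops, printing "Hi".
        run("++++++++[>+++++++++<-]>.<+++[>+++++++++++<-]>.");
    }
}

Java simulating Brainfuck - and, in principle, the reverse - is exactly the kind of mutual simulation the definition asks for. Notably, it says nothing about whether either language is pleasant or productive to use, which is arguably the complaint being made here.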
So, with aircraft, the time will come when the software does all the flying - full autopilot will be the next thing people have to get used to. "I don't trust the autopilot", or "I don't trust cameras out the front of the plane because there are no windows anymore", that sort of thing. Okay. So, back to Margaret Hamilton. Margaret Hamilton was involved at NASA when the space race was in full swing during the 60s, and the computer systems made a lot of things possible with less weight. Obviously, though, no one had really written code before that had to be that reliable, that dependable. Margaret Hamilton was a mathematician from MIT, and she led a team that went on to develop a lot of the key building blocks of modern software engineering. The term "software engineering" itself is credited to her. Yes. Yes. Her team went on to create the Universal Systems Language (001 AXES) and Development Before the Fact (DBTF) - which is different from MTBF, if you're a reliability guy - a formal systems theory. They pioneered the concept of priority displays in software systems, where the software, in an emergency, can interrupt the user - in that case the astronauts - so they can rectify issues in real time. Now, I found that interesting because those concepts already existed in control panel construction and design. But that was back in the days when panels were annunciator lamps, push buttons, alarm strobes and alarm sirens. So some of the priority stuff - what gets a siren, what gets an annunciator, what gets a flashing light, what color the light is, how it's grouped on the panel - priority displays existed previously. That was not a new concept. What was new was how you handle priority displays in a software system. That was different and that was new: totally uncharted territory, and it comes, I'd like to add, with a different set of challenges. But I think her biggest contribution, in my opinion, beyond all of that, was the way code debugging was performed at a component level and an integration level at all stages of code assembly into the final system. Extensive code simulation, running through all conceivable situations at a system level, and using that to identify any issues and fix them before the code was released. In essence, they were making the code as bulletproof as possible. That level of thoroughness had, I think, never really been attempted with code of that complexity, and the methodologies they created and developed during that time were the blueprint for many that followed. Which is why she's considered to have not just coined the phrase "software engineering" but provided the basis for a lot of programming practices that have become common since then. - Yeah. She also pioneered a lot of concepts in asynchronous software. - Yes, she did. And I didn't want to go down the whole synchronous versus asynchronous thing - I excluded that at the beginning - but yes, that is... - Yeah, but it's worth noting. - It is worth noting, yes it is. But in any case. So before we wrap this up, I just want to talk a bit about abstraction. I hinted at this earlier on, and abstraction has changed the landscape of programming. The underlying problem is that each platform in the early days was unique. And, OK, in many ways each platform is still unique, right?
It has unique aspects. Consider the Babbage machines at the beginning: a custom set of wheels, a custom set of instructions, no common instruction set. Consider Colossus: again, a unique instruction set, a unique programming method. The early PLCs, same thing; analog computers, same thing. They were all unique, they were all low volume, and they all suffered the same problem: in order to use the machine, you had to learn how to program that machine specifically, and that knowledge applied only to that machine. One of the challenges in control system engineering is not learning how to program function block diagrams, ladder logic, sequential function charts, structured text or any of that stuff. The point is all the subtle differences, all the subtly different ways of programming it - it's not just the different ideas you come up against, it's the fact that every PLC has to be subtly different. Whereas on the computer platform, say I want to learn Java: I can write Java and it'll compile and run on a multitude of different platforms. In fact, almost all of the major platforms in the world support Java. So once you learn the one language, you largely don't have to worry about its interoperability on other platforms - not in the same way I have to when I'm dealing with PLCs. I'm envious, very envious, of what Java gives you. Java is not the only example, it's just the example I've chosen, but it illustrates the point, I think. - If you break it down, it's a bad example. - Sorry? - It's a bad example. - Well, I'm not holding it up as a shining light, the best programming language ever written; that's not what I mean. What I mean is that there are languages out there where it is possible to abstract everything away to the point at which you have a common programming language that works on a multitude of completely different devices: completely different inputs and outputs, completely different CPUs, operating systems, internal architectures, storage types - the list is endless. I can run Java on a laptop or a desktop made by completely different companies with completely different CPUs, different hardware configurations and different operating systems, and it'll work much the same. Yeah. That's amazing. That's what abstraction has done. Because think about it: you've got Intel versus ARM. That's just one example - there are others, but it's one of the big ones - completely different, completely incompatible instruction sets. And yet to program them, do you have to learn those instruction sets? No, we abstract those away. An operating system sits on top. - Yeah, it creates a bunch of function calls, you call those function calls, it instructs the CPU what to do. That's the operating system. - Got it. Okay. But all I've done now is say, okay, I've got an operating system. And those operating systems are, in and of themselves, also not compatible with each other, because the API function calls developed by Microsoft or Apple or anyone who builds a Linux distro - Red Hat, yeah - are all different. Every operating system is different. So I can't write code that compiles against the .NET framework and expect it to work on the other operating systems. There's a different set of API calls; some functionality is available in some and not in others.
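As a tiny illustration of the abstraction Java gives you - just a sketch for the notes, not something from the conversation, and the class name is made up - the same source below compiles to the same bytecode and runs unchanged whether the JVM underneath is sitting on Intel or ARM, on Windows, macOS or Linux; the only place the platform really shows through is a few system properties:

// The same source compiles to the same bytecode and runs unchanged on any
// platform with a JVM; the CPU and operating system only show through in a
// few read-only system properties. Class name is invented for this sketch.
public class WhereAmI {
    public static void main(String[] args) {
        System.out.println("OS:  " + System.getProperty("os.name"));
        System.out.println("CPU: " + System.getProperty("os.arch"));
        System.out.println("JVM: " + System.getProperty("java.version"));

        // The actual logic is identical regardless of the hardware underneath.
        int[] numbers = {3, 1, 4, 1, 5, 9, 2, 6};
        long sum = 0;
        for (int n : numbers) {
            sum += n;
        }
        System.out.println("Sum: " + sum);
    }
}

Compile it once with javac and, in principle, the same .class file runs on any machine with a compatible JVM - which is the "write once, run anywhere" pitch that comes up in a moment.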
That .NET code is not cross-platform, but something like Java is. And of course, web developers will argue that that's the whole point of the web: it is the ultimate cross-platform set of programming languages. Yeah. So anyway, what am I getting at? Sorry, another one: 32-bit and 64-bit is another example. At the compiled-code level they're not compatible, but if you abstract away from the compiled-code level, you go up a level to the operating system - you could argue that's a little bit higher level, if you will. - Yes, a higher level. - So go to a higher level, and if you keep abstracting away, eventually you get to a level that is truly cross-platform. And that's a big deal, because it means you can learn to program in one language and it'll work on all of them. If you could program a Colossus machine back in the 40s, that's great, but that is a completely non-transferable skill, unique to the ten machines they built. Yahoo, great. If you learned how to punch cards for a specific brand of computer back in the 60s, that's great too, but that's not transferable either; you couldn't even transfer it to a different kind of computer back in those days. So I guess you could consider Java an example of the holy grail of programming: I can instruct this machine, and a bunch of completely differently constructed machines, with the same instruction set to solve the same problem. Sounds like a dream, right? - And you've already ruined it by saying that Java sucks. (laughs) - It's an opinionated statement, and I'll preface it with that. - I'm sure there's plenty of people who would disagree with you. Okay, well, you tell me: why does it suck? Why do you think it sucks? - Well, it was built on this premise, often thrown about, that oftentimes turns a lot of geeks' stomachs, called "write once, run anywhere". - Yeah. - And in some respects they did deliver on that, I guess, but it's also pretty cumbersome, it's vulnerable to a lot of security problems and issues, and it's just not good. - Okay, I'm going to go a little more generic than that. That's very specific, but I'm going to go more generic, because I think Java is a good example that illustrates the problems; it's not the only problem child out there. As cross-platform languages go, it's just a good example. So, all right, here's my problem. In terms of the advancement of computational devices and computers and programming, standardizing languages and language operations cripples the development of new ideas, because you're restricting the current feature set - and the near-future and sometimes even the long-term capabilities - to whatever standards you set. - Because you have to make sure it'll still work everywhere. - Exactly. The input and output methods of devices are constantly changing. I mean, just look at what's happened with multi-touch in the last few years. Is Java truly capable of handling multi-touch input, or is it still based on the whole mouse-clicking thing because people are obsessed with desktops? I don't know the answer to that, but last time I checked, Java and pinch-to-zoom don't play so well together. You know what I'm saying? - It is. It's better than it was. - But you see my point, right? Input and output methods change. - They've made it work pretty well for Android. - Okay. Most of those applications are all written in Java. Maybe all of those applications are written in Java. I don't know.
I'm completely talking out of my mouth here because I've never done any Android development, but that's my understanding. Remember, I'm talking about across-the-board cross-platform, so I'm talking about iOS as well, I'm talking about Windows Phone as well. You know what I'm saying? And another thing: what about touch gestures on your trackpad? The Apple operating system, for example, has a whole bunch of extra things you can do. Is that going to get ported into Java? - No, I doubt it. - It's not available on all machines, I guess. So you get this divergence. So what are you doing? You're creating a system that is stuck with a common set of input and output methods. And that's the problem: you end up in a situation where you've got to have a keyboard, you've got to have a mouse. But what if I don't want a mouse? Well, I'll tap the screen and the cursor will move to where I tapped, and a tap will be a click - a left click. Why a left click? Because I said it's a left click. And it's like, okay, so now what we're doing is taking a paradigm that worked and shoehorning it into something else where it doesn't work so well. - Yeah, it's a pair of handcuffs. - Yeah. There's a reason that when Apple did iOS as a touch interface they didn't have a mouse cursor: there was no mouse. But something like Java is much slower to include that sort of technology. Anyway, all right. Standardization by committee. - And if they had wanted to put iOS on other people's machines and other devices, there are a lot of affordances - part of what makes iOS and the iPhone and the iPad so great - that they would never have been able to do. - Absolutely right. Yes, totally agree. So standardization by committee slows down evolution. That's just a fact; that's the way it goes. You get a whole bunch of parties, they go out and do their own thing - one company goes in one direction, another company goes in a different direction, one company copies the other one and so on - and once that happens and the market stabilizes, it's time to go for a standard and try to standardize what people are doing. Don't leave it too late, though, otherwise you end up in the VHS versus Betamax situation. The problem I have - and here's how I'd like to sum up the problem with trying to standardize a language and create a common language for a platform - is that the drive to a common, truly hardware-agnostic programming language cripples itself by the same mechanisms that drove its creation in the first place. You can never win. Unless, of course, you want to stop evolving. If your position is "I'm going to stop trying to make a better computer, I'm going to stop trying to make a better programming language", well, then you can suddenly succeed with a cross-platform, fully abstracted, truly generic programming language. So long as you want to evolve and advance, you can't. And stopping is never going to happen, because everyone always wants to try something new. Just look at all the extra rubbish that's added to PHP every nanosecond. Anyway - there's always a new library. So there will always be, by virtue of these... - There were probably 300 JavaScript frameworks created while we recorded this. - Exactly.
There will always be, by virtue of the above, many, many more machine-specific or platform-specific languages than there will ever be common, truly cross-platform languages. However, I think it is a wonderful thing that Linux and Java exist. Okay, I know they have their issues, but you know what? I think it's wonderful that they exist, because they provide a guaranteed lowest common denominator for programming without too many strings attached. They're mostly-kind-of-sort-of open source in parts - mostly, kind of, sort of - and they'll run on practically anything. And I just realized I said one of my pet peeves, and that's "lowest common denominator". While I'm on the topic, you know what? "Lowest common denominator" is such an annoying idiom, expression, whatever the heck you want to call it, because it doesn't mean what people think it means. Mathematically, the numerator is the number on the top of the fraction and the denominator is the one on the bottom, and the least common denominator comes up when you're adding two fractions with different denominators. So the lowest common denominator for, say, a third and a sixth is six, and for a third and a quarter it's twelve. So the literal extension of the mathematical definition - and "least common denominator" is the correct way to say it - is the smallest denominator over which two dissimilar fractions can be added. That's not what I mean when I say lowest common denominator, and it's not what people mean. - Yeah, it's been co-opted. - It has, yes. It has been badly co-opted. What people actually mean is something like "there's always a basic level of functionality you can fall back on", or "this has been built with the majority of implementations in mind". It's hard to see the connection between those concepts and the mathematical meaning. We're not adding anything together. Not even remotely. It's not what we're talking about. And I think that's a great note to end the show on. What do you think? - Sure. - Oh God. I've reached the end of my notes and my tether, so let's go there. If you want to talk more about this, you can reach me on Twitter @johnchidgey, and my writing, this podcast and others I've made are hosted at my site, techdistortion.com. If you'd like to get in touch with Vic, what's the best way for people to get in touch with you, Vic? - They can find me on Twitter @vichudson1. - That's right. Don't forget the 1, because Vic is number 1. And if you'd like to send any feedback, please use the feedback form on the website; that's also where you'll find the show notes for this episode, under Podcasts, Pragmatic. Don't forget that the show will be ending in a few weeks' time, and I have one last final vote listeners can participate in if they choose: go to techdistortion.com/pragmatic - there's also a link in the show notes - and you can vote on your favorite episodes of the show. I'll announce those on the final episode. It's anonymous if you want it to be - you don't have to put your name and email address in there if you don't want to - but I will be tallying the results for the final episode. As an incentive, though: those lovely, awesome Pragmatic stickers. And thank you to everyone who's bought one and has taken photos and shown me where they've stuck them. It's always interesting to find out what people do with the stickers.
And also for the shirts, thanks to everyone who's sent in those pictures. Anyway, I'm actually going to be giving away three stickers, one to each of three random entries from people who have submitted their favorite episodes in that list for the final episode. So please go ahead and vote on your favorite episodes - ones you like, ones you don't like, go for it, I don't mind, all good. So, top three get free stickers. Hey, everyone likes free stuff. Okay, so I'd also like to thank our sponsors. - Free is good. Free is always good. Free... Linux is good. Linux is free. - Anyway. - Oh no. - Java is free. Free is good. - I'm going to stop there. I'd personally like to thank ManyTricks for sponsoring Pragmatic. If you're looking for some Mac software that can do many, many tricks, remember to specifically visit the URL ManyTricks, all one word, .com/pragmatic for more information about their amazingly useful apps, and use the discount code pragmatic25 - that's Pragmatic the word and 25 the numbers - for 25% off the total price of your order. Hurry, it's only for a limited time, just like this episode, just like this podcast. I'd also like to thank lynda.com for sponsoring Pragmatic. If there's anything at all you'd like to learn and you're looking for an easy and affordable way to learn it, then lynda.com can help you out. Instantly stream thousands of courses created by experts in their fields of business, software, graphic design, web development and lots more. Visit lynda.com/pragmatic to feed your curious mind and get a free 10-day trial. There's something for everyone, so if you've ever wanted to learn something new, what are you waiting for? And that's it for this one. Oh yeah, you can follow Pragmatic Show on Twitter to see show announcements... all the way to the top. Did I say that? Oh dear. Anyway, thanks for listening, everybody, and I'll just unmuddle myself now. Thanks as always. - Thank you, John. [Music] Did you enjoy your own topic? - I did. - Was it everything you hoped? - I believe so, yeah. - You believe so? You better be. - Like I said, we'd have to do like a 20-hour podcast to really cover it all in detail, so I think we did a good job on the important highlights. - I slaved away for hours for you. - No, I think we did a good job on the important highlights. I'm not disappointed at all. I'm a little disappointed that you brought Java into it, but I'll let that pass. - You'll have to let it pass. It's been recorded now; I can't take it out. But no, I mean, seriously though, Java is the perfect example of a cross-platform abstraction. - It is. It's a good example for the idea. - Yes, for the idea. - It's a very bad implementation. - My point is you can never do it. This is my point. You name a good, truly cross-platform abstraction that's a programming language. - There isn't one. - Exactly. And that's why: there can't be. It defeats itself by virtue of the very reason you're trying to create it. It'll always be hamstrung. It'll always be behind. It'll never embrace new technologies, you know.