Machines start to learn

For as long as I can recall, I have been fascinated by machines and humans and how they engage and develop together. While at college in the early 1990s, studying for a graduate degree in cellular microbiology and genetics, I learnt how biological cells – the micro components of our existence – mutate and become different from what they started out as. We learnt how a concept known as ‘genetic engineering’, then in its infancy, could create new gene pairs and whole new characteristics. Drosophila melanogaster – better known as the common fruit fly – was our guinea pig in the labs. Even then we often wondered in class whether we could take this mechanism to the more inanimate machines, and what might in fact happen if we did. Somehow the concept of artificial intelligence did not occur to our non-computer-addled minds. Computers were by then assisting the whole process of gene sequencing – understanding, as it were, the way life came about – and the introduction of computer software substantially decreased the time it took for the complete human genome to be sequenced.

Anyone who knows his computers and recognizes how viruses work will not in the least be surprised by this biological reference, which may seem out of place in the cold hard world of bits and bytes. Trojans, worms and viruses – what are otherwise just bits of software code – are known to mutate and reproduce (send out copies of themselves) in the wild and crazy world of the ‘botnet’ and the various other dark and sallow breeding pits of the IRC channels, which turn into distributed denial of service attacks. That is a lot of tech-geek speak with biological overtones; what it essentially means is that computer code infects your computer, makes it a ‘zombie’ and then uses it to attack servers across the world. And thus the future comes about, and it is amongst us as we speak – and what projects may come, if the mind be let to wander. What would you then say about processors and motherboards that can reorganize what they are depending upon the software (firmware) that has been loaded onto them? These are known in technology speak as FPGAs – field programmable gate arrays – and, as the name suggests, out there in the field you can program them to think differently from how they started out!

That degree in genetics did not help much; I ended up selling computers and software and other such paraphernalia, and even got good at it. That is also when I realized I wished I had been an über geek with a degree in computer engineering and not in the life sciences. But it was too late, because I could not program. I had in fact done some work in BASIC, including a certificate course in 1984 while at junior high school, but I never really took it further, and today, to tell you the truth, though I know what they are, I cannot do VB or Perl or Python, or even basic C++ or C#, or even JavaScript. I have told myself I should do something about it and hope maybe I will. But I have been busy selling HPC – high performance computing – machines since the late 1990s, and I know that those men in the jump suits at Intel or elsewhere can bring out a mean processor for the rest of those coder dudes to build their dreams on.

The first ever computer I saw was something called an IBM-compatible PC/XT running DOS, and the first ever video game I played on it was called ‘Pong’. This was way back when computers were not supposed to be used by ordinary mortals and Bill H. Gates the 3rd was still trying to get his iron grip over the world. Today, as anyone will tell you, the average computer does transactions a million times faster than the ones in 1984 did. Intel announced a few days back a quad-core processor – 4 individual cores – that could perhaps be clocked to 3.3 GHz, and here they were only following what their competition, AMD, did some time back. The new processors have really geeky names, but that is, as they say, just the tip of the iceberg. Recently Thom Sawicki, technology strategist for the Intel Communications Technology Lab, discussed the future, and guess what they have come up with: an 80-core announcement! Specifically, it isn’t just that someone has put 80 cores on a single chip and called it a "processor"; it is about the larger implications of massively multicore processors for system- and network-level architecture. In his words: "Once you jump to terascale, you have to ask and need to ask ‘what are the implications for everything?’ The platform of the future when you get to terascale will look different and act different. It won’t be a CPU of 80 cores surrounded by a chipset and some peripherals. You’ll see a much tighter, more integrated organization." The picture that Sawicki paints is of a ‘server-room-on-a-chip’: a single piece of silicon that uses many cores and virtualization to do the kind of work that currently takes multiple networked servers. Sawicki gave the example of a hypothetical multicore chip that could run a high-volume e-commerce solution on a single piece of silicon. Instead of a web server box that takes orders and then sends them over the network to another machine for processing, you could use two separate cores for these tasks, with each core running a virtual server.
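If you like to see ideas rather than just read about them, Sawicki’s two-cores-instead-of-two-servers picture can be sketched as a toy in Python – with threads standing in for the cores and an in-memory queue replacing the network hop between the two boxes. All the names here are made up for illustration; this is just the shape of the idea, not anyone’s actual implementation:

```python
import queue
import threading

orders = queue.Queue()   # in-memory channel replacing the network hop
processed = []           # what the second "core" has fulfilled

def web_server():
    # "core 1": the front end that takes orders from customers
    for order_id in range(3):
        orders.put(order_id)
    orders.put(None)     # sentinel: no more orders are coming

def order_processor():
    # "core 2": the back end that fulfils each order
    while True:
        order = orders.get()
        if order is None:
            break
        processed.append(f"order {order} fulfilled")

t1 = threading.Thread(target=web_server)
t2 = threading.Thread(target=order_processor)
t1.start(); t2.start()
t1.join(); t2.join()
print(processed)
```

The point of the sketch is the hand-off: when both workers live on the same silicon, passing an order is a cheap queue operation instead of a trip across the network.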

Ah man, what all that geek speak means is that they are coming out with even faster and better and perhaps even hotter processors. And we are only speaking of Intel – we have not yet even started talking about what Texas Instruments, Motorola, Philips or Samsung are doing. And just when you thought "ah, that is a lot of computing", this student–professor duo at the University of Oslo – Kyrre Glette and professor Jim Tørresen – have written software that makes hardware imitate evolution at runtime, with all changes to the existing hardware made through software. What their hardware does is pair up "genes" in the hardware to find the design that is most effective at the tasks at hand. Just like in the real world, it can take 20 to 30 thousand generations before the system finds the perfect design to solve the problem, but this happens in just a few seconds, compared to the 800,000–900,000 years it took humans to go through the same number of generations. This team first started to use evolution back in 2004 when they made a chicken robot, "Henriette" – yes, a chicken. The chicken robot used evolution, this time software based, to learn how to walk on its own. Evolution solves a lot of problems that programmers can't: a programmer cannot think of every problem that might occur if, say, a robot was sent to Mars and fell into a hole, but through evolution that robot could learn how to climb out of the hole without the interference of humans. The team now wants to make a robot designed to help in the installation of oil pipes and other oil-related equipment at 2,000 metres depth; such depths make it almost impossible to communicate with a robot – you would either have to have 2–3 kilometres of wires or communicate through echo signals, which in turn gives a multi-second delay. Their research paper can be accessed here (PDF).
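That "pairing up of genes over thousands of generations" is, at its heart, a genetic algorithm, and a minimal one fits in a few lines of Python. The sketch below evolves a 20-bit string towards an all-ones "perfect design" – my own toy stand-in for a fitness target, nothing to do with the Oslo team’s actual hardware representation:

```python
import random

random.seed(42)            # fixed seed so the run is repeatable

TARGET = [1] * 20          # toy "perfect design": all bits set
POP_SIZE = 30
MUTATION_RATE = 0.02

def fitness(genome):
    # how many bits already match the target design
    return sum(g == t for g, t in zip(genome, TARGET))

def crossover(a, b):
    # pair up "genes" from two parents at a random cut point
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(genome):
    # occasionally flip a bit, like a random mutation in a cell
    return [1 - g if random.random() < MUTATION_RATE else g
            for g in genome]

def evolve(generations=500):
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]
    for gen in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TARGET):
            return pop[0], gen              # perfect design found
        parents = pop[:POP_SIZE // 2]       # keep the fitter half
        pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                         for _ in range(POP_SIZE - len(parents))]
    return pop[0], generations

best, gens = evolve()
print("best fitness:", fitness(best), "of", len(TARGET))
```

Tens of thousands of simulated generations take seconds precisely because each "generation" here is just a sort, a slice and some list comprehensions – which is why evolution-in-software can outpace biology by such an absurd margin.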
And they were not even the first in this: Paul Layzell and Jon Bird at the University of Sussex in Brighton applied such a program to a simple arrangement of transistors and found that an oscillating output ‘evolved’. When they looked more closely they found that, despite producing an oscillating signal, the circuit itself was not actually an oscillator. Instead, it was behaving more like a radio receiver, picking up a signal from a nearby computer and delivering it as an output! The machine had indeed learnt!

And where that will take us is for us to imagine!

About Soumya
A technology enthusiast, forever enamored by all that it hath wrought and of course here is an attempt at making sense of it all and perhaps simplifying it!
