Owning the client!

Twitter recently changed the terms of service for API access for building applications on its feed and, in one swift move, dropped a hammer (as one writer noted) on the entire ecosystem. Citing the need for a consistent user experience, they have just about banned every competing client for their service out there. Twitter platform lead Ryan Sarver recently laid it out clearly in a developer forum: “Developers have told us that they’d like more guidance from us about the best opportunities to build on Twitter. More specifically, developers ask us if they should build client apps that mimic or reproduce the mainstream Twitter consumer client experience. The answer is no.” Sarver notes that according to Twitter’s own data, some 90% of active Twitter users now use official Twitter apps on a monthly basis to access the service. Ever since Twitter bought Tweetie and turned it into their own native iPhone app, third-party developers have been wondering where this would leave them, and before long Twitter moved into Android, iPad, Mac, Windows Phone, BlackBerry and other spaces, which only compounded that fear. Sarver highlights Twitter’s “diverse ecosystem” of more than 750,000 registered apps, and that ecosystem just got altered a bit. People are once again unhappy that a company is seen to be taking action against its developer community. But at the core this is just simple monetization. Twitter targeted UberMedia, a company that had been quickly buying up a significant part of the Twitter ecosystem, possibly to figure out a way to monetize it. UberMedia’s third-party Twitter clients, UberTwitter and Twidroyd, were shut down by Twitter in mid-February 2011. The ban was later revoked once UberMedia agreed to make changes, but the damage was done.

The importance of a tightly knit client for accessing a service is not something that can be taken lightly. Facebook had not done much on the client side until recently, when it bought the Israeli startup Snaptu, a very useful and light client for accessing Twitter, Facebook and a host of other services. My personal experience of this tool has been, let us say, the most positive, and I was using it on a Nokia E90 running Symbian, which perhaps gives you an idea of how cross-platform it actually is. I mean, does anyone even make apps for Symbian any more? And what does that say about me, now that you realize I just admitted to using a Nokia!


The year so far in tech!

It has been a while since I last blogged, and this is apparently in keeping with trends – bloggers seem to drop off at an alarming rate. Recent trends and a research paper by the Pew Research Center indicate that blogging has declined among several segments. This is probably because people feel the medium is too word-heavy – long, meaningful sentences and all – and somehow feel pressured to be meaningful. Social media has taken up the slack, and people prefer to let out small tweet bursts! But even then, most tweets, they say, are pretty meaningless anyway and have been designated ‘pointless babble’ – so no help there, and we just need to revisit the medium to let loose some thoughts!


It has been a long time indeed, and my favorite space – technology – has seen some interesting changes, moves, counter-moves and trends by the many players there! One of the key developments, I think, was the Apple vs. Adobe fracas – a battle over the better tool for video on the web. Apple felt it was not Adobe’s Flash but the new standard, HTML5. Steve Jobs weighed in on the issue in April, and some commentators have called his arguments disingenuous – just Jobs’s way of taking and keeping control over the user experience. Which may not be such a bad thing, since Apple sold so many iPads (without Adobe and Flash on them) and iPhone 4s, and gave everyone from Adobe to Nokia and RIM some pretty sleepless nights. The latest reports on this battle seem to indicate that Jobs may have been right: there are issues with Flash, and Google’s latest mobile OS, Android, has been facing some of them – the platform is prone to crashes and is a resource hog. The developers, however, were not too happy, since they liked the code-once, run-anywhere approach for their apps. Apple’s insistence that everything be coded specifically for the device, to take advantage of all that it has got, makes for more work – and sometimes not the best work, since there are not that many people, or rather not that many skilled people, who can code in Apple’s Xcode environment. However, it looks like Apple won this round – Flash does seem to have problems – and developers develop where the money is, which in the mobile apps space currently means Apple.

There were some interesting buys in the large corporate IT space, the latest being McAfee, a vendor of antivirus and internet security products, being gobbled up by Intel, a chip maker. Everyone opined on what this means, but the best explanation is perhaps that Intel wants, and needs, to be in the security business. Ask any CTO or CIO what keeps them up at night and most will say security, and Intel wanted in on that – which might make for safer computers. The security landscape has changed drastically enough in the last few years for vendors to wake up and take some decisions about where they want to go with security on their products. Intel has declared that McAfee will function as an independent subsidiary, and that may well be a good thing, but in the longer term it is the integration of the technology that will make this a sure-shot success and something that consumers and enterprises will look to.

Acquisitions were everywhere, and HP raised the ante in the stack game with strategic buys of 3Com and Palm. They now have servers, networks and a mobile OS (but no CEO!) and compete directly with Cisco, which has expanded into blade servers and storage through Acadia, a joint venture with EMC. Cisco is positioned for global dominance, of course, and has been for a while now, so no stopping them anyway! Their acquisitions in the video space, Tandberg and Flip, seem really interesting to watch, as they seem to be going with the philosophy that video will be the new traffic driver. But then Cisco does more or less everything these days – security with ScanSafe, mobile, unified communications, WAN optimization, teleconferencing – and all of this to sell more routers, remember!

Somewhere along the line Facebook – something that always seemed like a juvenile pastime – became the biggest website in the world, an alternate internet in itself, in a way. It also has amazing ‘stickiness’ – the number of people who stay on the site. Not surprisingly, they are also the ones fighting for the next increment in the LAN, 100 Gigabit Ethernet, and they say what they really need is terabit Ethernet. There was even speculation that they would be looking at ARM for their new datacenter at Prineville, Oregon, but this was denied as a rumor, which probably saved a lot of people at Intel and AMD a heart attack or two!

Oracle’s purchase of Sun (and with it Solaris) has also dealt the open source business a bit of a heartache, and one of the first things Oracle did was go after Google for patent infringement, claiming that Dalvik, the new mobile stack Google developed, infringed on the Java patents Oracle acquired with Sun; the case went to court. Not content with that, Oracle also caused the OpenSolaris Governing Board to quit, since Oracle has no plans to stay open any longer. But then Oracle had already shown that they were going to be different and that there were going to be changes in the way things were done. Among the first victims were Sun’s channel partners, who now had to become partners of Oracle; they were dragged over the coals, and even today, across the world, the channel is on tenterhooks. But then Oracle, and Larry Ellison, and the way they work the world have always been different from most. Some say they are now becoming the new Microsoft – so what has happened to the old Microsoft? Maybe they became benign and all? Among other things, Oracle also got sued by the US Department of Justice, thanks to whistleblowers reporting that the government had been overcharged on GSA deals. The investigation is on, but if true it confirms the old thought that vendors, if they can, will gouge on price – even with the American government, with whom they have a written agreement to provide the lowest prices. Oracle’s litigious ways with the patents it got access to through the purchase of Sun remind me of SCO (once upon a time the Santa Cruz Operation) and its shenanigans in the American courts over its claims on Unix – and though we all know what that came to, this one will be something interesting to watch.


Microsoft, the other big daddy – or rather the original big daddy – of the IT business has had an interesting year in several ways. Their consumer-facing products, like the cell phone they released – the Kin – fell tremendously short. In other businesses they kept their lead and moved to the cloud to take on the challenge from Google, with on-demand licensing and their core productivity suite available as a service. This August marks 15 years since the launch of Win95, at the time a big move up in the OS sweepstakes. But who would have guessed or projected that 15 years later the OS battleground would look so different, with access to information now coming from multiple internet-connected devices – most of which leave Microsoft with no role, or at best a rudimentary one. In the internet world it is rarely about being first and more about being better. Anyone recall Lycos or AltaVista, or for that matter the Rio or Nomad MP3 players? There is really no such thing as first-mover advantage here – it is all about the user interface and the experience; that is always the magic sauce that puts it all together. Friendster, MySpace and SixDegrees were possibly among the earliest on the social networking scene – where are they now? Tablet computing, or at least the Microsoft vision of what that means, has also been around for a long time now – it never struck anyone to remove the stylus and make it totally tactile, and today we have the iPad, which has sold more than all tablets ever made, and that too in a fairly short time.


Technology, they say, is a hard mistress, and the word for staying ahead, clichéd as it may sound, has always been innovation – and vision. A lot of people make smartphones, and those that do make many models. Apple makes one model and yet beats them all. It gives customers virtually no choice, believing perhaps in the old adage – my way or the highway – and yet manages to get as many customers as it does, along with hordes of more or less rabid fans. There are many explanations: a strong control of the user experience, some say, right from the unboxing of the product; a tight integration, say others; innovation, say still others. The point, though, and what I think, is that they have learnt from their mistakes and decided to stay ahead. That, and as a wise man once said – make a better mousetrap and the world will beat a path to your door!


Enterprise security and the new threat landscape.

Up until a few years back, companies did not have any role called CISO – chief information security officer – and a lot of companies still do not today. Most think all of this is part of the CTO’s lookout. That is one of the more succinct indications that things have changed and it is not business as usual. Ever since the first self-replicating viruses came onto the scene several years back, information security has been a concept whose need is today recognized rather more widely than anyone would have thought. A large part of the requirement is now mandated by law and is part of the huge compliance bill that companies need to pay. The cost of data breaches and the ensuing mayhem has reached considerable numbers, and it seems only natural that governments have enacted regulations that keep data security on everyone’s minds and in their budgetary allocations. It wasn’t long before the ever useful though often embattled CTO found it necessary to have a CISO on his team. One of the key reasons was the ever-changing threat landscape and the constantly maturing and developing technologies, with the security concerns they create. It hasn’t been all that long since virtualization was the cause célèbre in the IT town, and yet today the cause for concern is the new vulnerabilities emerging in virtual systems and the attack opportunities they represent. Before you know it, every system and subsystem that forms an intrinsic part of the life cycle of industrially manufactured products and the customer experience has the inherent capacity to become a security breach.


The risks today lie in infecting and exploiting just about any everyday mechanism that has been relegated to the background of our existence: elevators, video cameras out in the street, escalators can be remotely accessed, hacked and changed to work on a logic other than what they were meant for. In the industrialized nations of the West, for example, even a task as simple as opening and starting your car can be controlled over a network by people at different locations through something called LoJack – essentially a telemetry setup that enables M2M (machine-to-machine) communications and control through satellite and cellular networks. Up until a few years back, virus outbreaks involved code that would use your mail servers as spam servers to further distribute the infected email, a wide and easy-to-replicate vector. Things started getting complicated with malware and worms carrying messier payloads that could replicate and mutate to foil the best antivirus engine’s threat-detection mechanisms, with command-and-control structures operating large, widely distributed networks of infected computers called botnets. Your average IT guy was soon swamped trying to stay ahead of an ever-changing, evolving threat landscape that he could not make any sense of unless he spent hours, and years, staying abreast of the technologies the attackers used and the defensive mechanisms at the corporate defender’s disposal. Things got complicated pretty fast, and the bad guys were no longer content with just erasing your hard drive or making it unusable; they focused on how to steal important financial information such as credit card logins and identification data.


Companies storing key identification details of their customers, and financial information that could be misused, soon became the targets of smart hackers who were always looking for a chink in the IT armor to exploit, to steal that important and sensitive data. Soon markets developed around vulnerabilities in systems, which could be sold to the highest bidder, who would then find means of exploiting and monetizing them further! White papers have been published looking at the size of this business and the economic impact it has on enterprises. The new kind of internet security attack is aimed, like all attacking maneuvers, at disabling and fatally striking down an enemy’s core systems. A few years back the state of Lithuania was blockaded in just this way by an orchestrated distributed denial-of-service attack that slowed and brought down their internet access – and therefore their ability to conduct any kind of business on the internet, including disbursing financial credit through cards. Enterprises operate in similarly war-like circumstances, with the added ‘advantage’ (cynically speaking) of a vast supply of internal threats and weak spots – the users and internal employees of the company, guests, contractors and so on who come onto the network and often create in their wake their own breaches, or infections from poorly patched and maintained machines. One of the worst attacks of the last few months, and one that has not yet totally abated, is the Conficker worm, which spread via nothing more than the autorun.inf mechanism enabled by default for all USB drives. Poorly patched machines and disregard for the software upgrade process further fuelled the pandemic-style infection cycle that helped it grow to the gargantuan proportions it has today.
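To make the autorun.inf vector concrete, here is an illustrative sketch of the kind of file a Conficker-style worm would drop in the root of a USB drive. This is not an actual worm sample – the payload path is a made-up placeholder – but the directives shown are the standard autorun.inf keys that Windows honored by default at the time:

```ini
; Illustrative only -- NOT an actual worm sample.
; "RECYCLER\payload.exe" is a hypothetical placeholder name.
[autorun]
open=RECYCLER\payload.exe                  ; launch the dropped binary on insert
shellexecute=RECYCLER\payload.exe          ; fallback launch directive
icon=%SystemRoot%\system32\shell32.dll,4   ; borrow the stock folder icon
action=Open folder to view files           ; spoof the normal Explorer prompt
```

Because Windows executed these directives automatically when a drive was inserted, simply disabling autorun – as Microsoft eventually did through updates – closed this particular vector.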


Corporate networks operate under two rather different sets of circumstances: they have to be partially open to allow for the influx of humans who will be using the systems, networks and resources, and at the same time they are repositories of information whose breach or compromise represents a very large, disproportionate risk. Study after study has pointed to the weakest link in corporate networks being one major vulnerability – the human element, which, whether through plain stupidity and carelessness or willfully and mischievously, has caused and has the capacity to cause the most harm and disruption. Today, just securing your network against viruses is not the end of the story for a lot of technology organizations; they need to consider another layer of security with DLP, or data loss prevention, systems. This is the change in the threat landscape: no single solution or silver bullet exists to put everything under one manageable control. The nature of today’s threats is so varied and so constantly changing as to effectively defy capture and detection through signature-based mechanisms. Just as one vulnerability gets patched, another is added more or less simultaneously, and to the uninitiated it seems like a never-ending arms race – which is probably what it is! But organizations cannot just roll over and play dead; they have to – sometimes because of legal requirements and acts of law, at other times because of the unimaginable harm such breaches can cause – pursue an attempt to secure and guard against breaches. More often than not, though, business is full of organizations that thought this was a one-time effort that could fix it all, guard and secure every door, wall and window. But information technology security is rarely like that, or as easy as physical-world security: here one is often trying to guard against vulnerabilities that have not yet been found, using only tools and signature-based heuristics built from known ones.
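The signature-based detection described above, and its weakness, can be sketched in a few lines. The signatures here are invented byte patterns purely for illustration; real antivirus engines use far larger databases plus heuristics, but the core limitation is the same – any mutation of the pattern evades an exact match:

```python
# Hypothetical signature database: name -> byte pattern (invented for illustration).
SIGNATURES = {
    "demo-worm": b"\xde\xad\xbe\xef",
    "demo-trojan": b"EVIL-MARKER",
}

def scan(payload: bytes) -> list:
    """Return the names of all signatures found in the payload."""
    return [name for name, pattern in SIGNATURES.items() if pattern in payload]

# An exact copy of the malicious pattern is caught...
print(scan(b"header" + b"\xde\xad\xbe\xef" + b"trailer"))  # ['demo-worm']

# ...but flipping a single byte, as polymorphic malware does automatically,
# slips straight past the scanner.
print(scan(b"header" + b"\xde\xad\xbe\xfe" + b"trailer"))  # []
```

This is why the post argues for layered defenses: a scanner built only from known patterns can never catch what it has never seen.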


A certain class of paranoid, or security-super-conscious, organizations take things to the next level and have their own army of certified ethical hackers perpetually and manually trying to find the next hack or chink in the software application. Deeper integration – shared platforms, address books and core databases, telecom applications meeting email applications and unified communications – has also added new ways for corporate systems and networks to extend their list of vulnerabilities. A few years back a hack was discovered that could allow ordinary Polycom video conferencing systems connected to the corporate LAN for videoconferencing over IP (at that time the new technology glimmer and must-have product) to become an entry point to the LAN and wreak havoc if need be. Vendors in that space, including one I was working with at the time, immediately moved to change their platforms – operating systems, board architecture, chip designs and so on – to accommodate more and more security, which in an already heavy codec only added overhead. These hardware refresh programs by the videoconferencing makers in the middle of 2005 actually brought out systems able to execute several more instructions per second, which vastly helped performance and also allowed a steady growth plan to accommodate more features, higher definition and higher security.


Security is expensive in more ways than one, and in several systems it is as much a function of the hardware as of the software applications that provide it. Modern software development tools focus on the amount of security and self-healing capability they can bring to a system, and these need faster and better hardware and chip I/O to work with. Vendors are pushing their envelopes and bringing bigger and better to market, and today a lot of that bus bandwidth and those chipset cores go towards enabling an ever higher level of built-in security, and the authentication systems that now need to be enabled, from biometrics to simple mathematical passkeys. Even now, computers get by with simple alphanumeric passwords as the identity management tool of choice, but soon this will become more and more irrelevant, and face recognition, biometrics and mathematical passkey generators will be the new level of sophistication, bringing more simplicity and a far higher degree of tamper resistance to the entire authentication process. For most organizations concerned with information security, or where information security is a government-mandated compliance issue, two- or three-factor, in-band and out-of-band authentication is today the norm. The Reserve Bank of India recently implemented PGP passkey-based multi-factor authentication for transactions in their organization. Vendors and analysts have called layered defense the only truly credible defense strategy one can expect to have in the current corporate network security threat landscape. The multiple layers mean multiple vendors and a passing around of all the goodies; they add complexity and drive more sales of hardware and software, because you can never be sure you have all the hatches closed.
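As a concrete illustration of the ‘mathematical passkey generators’ mentioned above, here is a minimal sketch of an HMAC-based one-time password – the HOTP construction from RFC 4226, which also underlies the time-based codes many two-factor tokens generate. This is a teaching sketch of the standard algorithm, not any particular bank’s implementation:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """One-time password from a shared secret and a moving counter (RFC 4226)."""
    msg = struct.pack(">Q", counter)  # counter as an 8-byte big-endian value
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F        # dynamic truncation: low nibble picks the offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Both sides share the secret; the counter advances with each use,
# so an intercepted code is useless for the next login.
print(hotp(b"12345678901234567890", 0))  # RFC 4226 test vector: 755224
print(hotp(b"12345678901234567890", 1))  # RFC 4226 test vector: 287082
```

Time-based variants simply replace the counter with the current 30-second interval, which is why a token and a server that have never spoken can still agree on the code.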


Bureaus, standards institutes and certifications exist that companies can strive to be lauded with after fulfilling several requirements and after sizeable cost and compliance exercises, but as these bodies themselves say, this is an ongoing activity – there is no one shot, no one size that fits and ends it all. The strategic organization takes all the standards and fine-tunes a standard of its own, one that often enhances and exceeds what is available, hoping thereby to be a better standard; others create their own teams to comprehensively and continually test their defenses, and through this continual testing strengthen their networks.

Virtually there: videoconferencing & telepresence


Being able to see someone, and share with them what you have at your end, when far apart and perhaps on a telephone – to me this has always been an application looking for its technology. Transmitting real-time video images with audio, commonly called videoconferencing, has been around, surprisingly, about as long as Bob Dylan. AT&T introduced it to the public as the Picturephone at the World’s Fair in New York in 1964. While viewed as a fascinating curiosity, it never became popular and was too expensive to be practical for most consumers when it was offered at $160 a month in 1970. It wasn’t until Ericsson demonstrated the first trans-Atlantic LME video telephone call that companies saw real potential for success and profitability and began to refine their own videoconferencing technologies. Advancements such as the Network Video Protocol (NVP) in 1976 and the Packet Video Protocol (PVP) in 1981 both helped the maturation of videoconferencing, but neither was put into commercial use; both stayed in the laboratory or in private corporate usage. The Picturephone, meanwhile, was pulled by AT&T in 1974 after millions of dollars in development costs.

In 1976, Nippon Telegraph and Telephone established videoconferencing (VC) between Tokyo and Osaka for company use. IBM Japan followed suit in 1982, establishing VC running at 48,000 bps to link up with already established internal IBM videoconferencing links in the United States, so that they could hold weekly meetings. In October 2001, television reporters began using a portable satellite link and a videophone to broadcast live from Afghanistan during the war. The ability to communicate with the other end and see them, whether over satellite or through telephone lines (ISDN), has been something people have always wanted. Look at any self-respecting science fiction movie and you will see that people were already talking to TV screens and making sense of what went on!

Ever since Jack Kilby invented the integrated circuit, almost all technological innovation has been around the better ‘chip’, and as processors get smarter and better (and possibly hotter), the applications really do get better. A modern car from BMW or Mercedes, for example, has more than 300 ECUs – electronic control units. There is no reason that talking to a TV screen – videoconferencing as we knew it – should not get better, and today it has, to the extent that people can now talk to their mobile phones and see who they are speaking to. Though novel, this is not new: as early as 1992, computer operating systems like the Mac OS had a video chat application built in – audio came later, by 1994. Texas Instruments (where Prof. Kilby did some great work) recently showcased at their developer conference in Bangalore, among other things, a car with a chip that could read signboards, and high-definition videoconferencing with unique image clarity. That car and its signboards are some time away, but high-definition, true-to-life audio-visual communication has been around (some of it on TI’s chips) for a while now.

Communication happens at many levels, and today’s technology is capable of taking that feeling of being there to new vistas of virtualization; for business users the results have meant increased productivity. The government of India, for example, has been able to realize the potential of video communication through screens on their satellite networks, and has also deployed a sizeable ‘multiconference’ bridge, thereby taking communication among its officials and hierarchies to unprecedented levels. Unknown to many, and in the selfless way that only government can manage, the history of the videoconferencing division at the National Informatics Centre at New Delhi in fact mirrors the development of global videoconferencing as we have come to know it today.

One of the main developments, of course, has been pricing. In 1982, Compression Labs Incorporated (CLI) introduced their system to the world for $250,000, with lines at $1,000 an hour. The system was huge and used enormous resources, capable of tripping 15-amp circuit breakers. This was, however, the only working visual communication system available until PictureTel’s VC hit the market in 1986 – substantially cheaper at $80,000, with $100-per-hour lines! Today, at a tenth of that price, you can get yourself a high-definition-capable entry-level system that would do more than anything those earlier systems could. Besides the systems there are of course other costs – the network, the furniture, the displays, perhaps audio, and of course a nicely soundproofed room for the more quality-conscious, discerning seekers, or business users as the industry calls them. This adds to the total cost of deployment, but if you were in, say, downtown Mumbai or Hyderabad right now and really needed to see the other end when you speak to them, you could get yourself a place, within reasonable limits, at less than Rs 5,000 ($100) an hour, which even at today’s rates is a steal. Of course this is a ‘cyber cafe’, and the other end needs to have a system too, but that is the only limitation. When you consider that five thousand Indian rupees, or a hundred US dollars, is what the average test (SAT/GRE/GMAT) to qualify for American universities costs, you know it is not really a lot of money.

That, then, was the application that needed the technology to put it all together: the ability for American universities to see the people they were admitting, or for that matter for any company to interview a candidate far away without having to fly them over. Today the benefits are being realized by heads of state and their governments, and CEOs and their companies. And it cuts across lines: whether it is the American President G. W. Bush or President Luiz Inácio Lula da Silva of Brazil, video communication has brought them closer to their constituents. India’s own rocket-scientist president was in fact also an early adopter of this technology.

At the free end of the spectrum there are services of distinct repute like Skype, built in Estonia (and since purchased by the American free marketer eBay); or you could go for the enterprise option from the larger network equipment makers, whose latest cutting-edge product release is a ‘unified phone messenger’ built around a ‘presence server’ based on the Session Initiation Protocol. This protocol, under the IETF, has contributed tremendously to bringing business-quality audio-visual communication and collaboration from the corporate boardrooms and meeting rooms to the ubiquity of the enterprise desktop. The enterprise communication equipment makers – whether Alcatel-Lucent or Cisco, or Avaya, a company that traces its heritage back to AT&T – today have products that build in the capability to communicate over video.

In the early days of the technology it was about cameras and grainy, slow-moving pictures with no quality to speak of. Today, however, it is about presence, collaboration and integration with the desktop. Right now, technology is available that can connect a traveling business executive with his laptop and a webcam in an airport Wi-Fi lounge to, perhaps, a corporate boardroom equipped with a high-end room-based VC system for a video call. Surveys of communication methodologies have estimated that even with email – which one always thought was as good as it gets – only about 40% of the message gets across. Being really there is 100%, and video collaboration technology actually moves things up to 70–80% of the ‘message’ being conveyed and actually understood. The time for video and audio to travel those corporate networks has indeed come, and IT managers and their CIOs are beginning to see the value in this technology. Today, true-to-life video on high-definition displays is changing the way people communicate, and impacts the way people and organizations interact and engage. For the enterprise this translates into productivity gains that benefit the bottom line. Some of the benefits may be summarized:

  1. It accelerates the decision-making process, improves intra-company communications, increases flexibility and encourages collaboration across time zones and distances for the large enterprise.
  2. It can get you expert advice from remote sites; the benefits of this have been realized in distance learning and telemedicine projects that have proven their value across India and the world.
  3. And of course there is something to be said for the reduction in travel, time and expense, and the return on investment on this capital expenditure head once you really get around to doing it.

But in IT and telecom – serious businesses as they are – fads and mindsets prevail, and a lot of choices are made without thinking through and identifying what the key ‘customer’ needs and requirements in fact are. There is a difference when choosing standardized, and to an extent commoditized, components and applications like ERP, email or an RDBMS: there isn’t much you can go wrong with, and there aren’t that many vendor choices to be made either. In choosing technology for the communication network, however, it is often more sensible to find a vendor that can do it all today and also has something packed for tomorrow, even if they are slightly more expensive right now. Vendors who swear by interoperability, and who are standards-oriented at the core, will usually be able to deliver the best solution for the end customer. This kind of vendor usually has a growth plan for how the technology will unfold and where they will fit in as it evolves. For the customer this means a longer lifeline on the product, with fewer service requirements but clear and demonstrable benefits.

The right vendor usually has a technology road map for the way ahead. They may often not be the cheapest solution provider, but they will be flexible and will have a nuanced approach to ever-changing market realities. Accepting and expecting change is good, but the ability to cause change is how great vendors should be judged. If your IT and telecom vendor is an organization driven solely by sales quotas and smart salespeople, they are often not the ones who will solve the enterprise's problems. The ones who do are the people who understood that the receipt of the customer's purchase order was in fact the beginning of a relationship. After all, at a company no one ever buys just one phone, and vendors today realize that with conferencing, video, collaboration and telepresence this is exactly the way ahead, with the real value of this technology being realized only through wider deployments.

The importance of open source:

Drug companies have a great explanation as to why patents must persevere. They say they spend a lot of money developing new molecules and new medicines, and if they weren't allowed a legally enforceable way to recover that cost from consumers, there would be no motivation to develop new medicines and perhaps make the world a better place. But then drug companies say a lot of things: from 1898 through to 1910, for example, heroin was marketed as a non-addictive morphine substitute and cough medicine for children. Bayer marketed heroin as a cure for morphine addiction before it was discovered that heroin is converted to morphine when metabolized in the liver. The company was somewhat embarrassed by the finding, which became a historical blunder for Bayer. All that tells you, in my humble opinion, is that drug companies can be wrong, and the record shows they did not know better! Heroin notwithstanding, patents provide protection, and that protection keeps poor sub-Saharan patients away from AIDS medicine; the companies argue that this is a good thing. It is perhaps that great bane of the capitalist world that a lot of people have ranted about: the poor don't deserve anything! Developed nations' governments have perhaps realized this, and so America has Medicare – health insurance. Michael Moore, a man prone to making movies on the ills of American society, has made a documentary on the subject called Sicko, which speaks of the trouble the average American can face getting medical assistance in the land of the free and the brave. In India, however, we have a way of making things easier, and we hate to pay – something to do with the fact that we are not yet the richest people in the world (the fact that three of the top twenty-five richest people in the world are Indian notwithstanding!). In India we had patent protection of a different type: the process patent – processes could be patented, not the product itself.
India today has the largest number of US Food & Drug Administration (FDA) approved drug manufacturing facilities outside the US. In addition, the number of Drug Master Files (DMFs) filed by Indian companies with the FDA stands at 126 – higher than Spain, Italy, China and Israel put together. A DMF has to be approved by the FDA for a drug to enter the US market. This makes us the world's largest maker of bulk drugs. The Indian Patents Act of 1970, amended on March 22, 2005, marks the end of a protected era and signals a new phase in the integration of India into the global pharmaceutical market. The new amendment seeks to make the copying of post-1995 patented drugs illegal. Part Indian ingenuity and part regulation have meant that the poor Indian (and there are many of them out there) has access to some of the best medication in the world.


Software, however, is another ball game – it is not so much about patent regulations as the fact that for every software maker there has always been a pirate. But the big software makers have clamped down in a big way, and the difference is noticeable. I recall that in 1997 there was a little cramped alley on Brigade Road in Bangalore, opposite Rex Cinema, that hosted a multitude of small sellers selling pirated copies of every known piece of software in the world, from Microsoft Windows and applications to Oracle's RDBMS, antivirus, CAD/CAM tools, Adobe – the whole shebang. This was before they got busted by a zealous software industry and disappeared. They did disappear from Bangalore's Brigade Road, yes – but anyone who has the time and inclination to survey the footpath vendors of DN Road and near Hutatma Chowk (the erstwhile Flora Fountain) in Mumbai will be rewarded with a plethora of pirated software options at less than 400 rupees – $10. I bet Kolkata, Chennai, Jharkhand, Chhattisgarh and even Delhi have their own little places. But soon pirated software had its cost – it was a lot like selling drugs: if you got busted and you were mainstream, there was hell to pay. Soon the only people selling it were the non-lethal crook types. Sure, there were ways for the real digital cognoscenti to download it off warez sites and other such locations, but there was always a price to pay. Until, that is, Linus Torvalds and his ilk came onto the scene with his take on Unix, the operating system originally developed in 1969 by a group of AT&T employees at Bell Labs including Ken Thompson, Dennis Ritchie and Douglas McIlroy. Here was a strong OS with a penguin as a mascot, and best of all it was free. Linux is one of the most prominent examples of free software and open source development; its underlying source code can be freely modified, used, and redistributed by anyone.
The Linux kernel was first released to the public on 17 September 1991, for the Intel x86 PC architecture. The kernel was augmented with system utilities and libraries from the GNU project (an operating system composed entirely of free software; its name is a recursive acronym for GNU's Not Unix, chosen because its design is Unix-like but differs from Unix by being free software and by not containing any Unix code). GNU was founded by Richard Stallman and was the original focus of the Free Software Foundation (FSF), whose goal was to create a usable free operating system; the combination later led to the alternate term GNU/Linux. Linux is now packaged for different uses in Linux distributions, which contain the sometimes modified kernel along with a variety of other software packages tailored to different requirements. But this piece of code did not catch on at first – for one, it was just about as unwieldy and geeky as UNIX had been. That was until the African Ubuntu. This is a predominantly desktop-oriented Linux distribution, based on Debian GNU/Linux but with a stronger focus on usability, regular releases, and ease of installation. Ubuntu is sponsored by Canonical Ltd, owned by South African billionaire entrepreneur Mark Shuttleworth, who among other things went to space on the money he made by founding Thawte in 1995. That company specialized in digital certificates and Internet security and was sold to VeriSign in December 1999, earning Shuttleworth 3.5 billion rand (about 575 million US dollars at the time). The name of the distribution comes from the African concept of ubuntu, which may be rendered roughly as "humanity toward others", though other meanings have been suggested. The most recent version, Ubuntu 7.04 (Feisty Fawn), was released on April 19, 2007; version 7.10 (Gutsy Gibbon) is scheduled for release on October 18, 2007. Ubuntu aims to use only free software to provide an up-to-date yet stable operating system for the average user.


The best way to understand Ubuntu is perhaps to understand the African philosophy it came from. In the words of Nobel laureate Archbishop Desmond Tutu: 'A person with ubuntu is open and available to others, affirming of others, does not feel threatened that others are able and good, for he or she has a proper self-assurance that comes from knowing that he or she belongs in a greater whole and is diminished when others are humiliated or diminished, when others are tortured or oppressed.' While the Zulu maxim umuntu ngumuntu ngabantu ("a person is a person through (other) persons") may have no apparent religious connotations in the context of Western society, in an African context it suggests that the person one becomes by behaving with humanity is an ancestor worthy of respect or veneration; those who uphold the principle of ubuntu throughout their lives will, in death, achieve a unity with those still living. Nelson Mandela explained ubuntu as follows: 'A traveller through our country would stop at a village, and he didn't have to ask for food or for water. Once he stops, the people give him food, entertain him. That is one aspect of ubuntu, but ubuntu has various aspects. Ubuntu does not mean that people should not enrich themselves. The question therefore is: are you going to do so in order to enable the community around you to improve?' Philosophy aside, this is in fact the best way to explain what open source software is all about. Software and computers have driven the greatest development mankind has seen, and to keep them away from those who wouldn't or couldn't pay licensing fees is by far the worst kind of oppression there is.


Free software and open source, though not very familiar to the common man, have been the backbone of a lot of the world's great computing endeavours for quite some time. The logic of course is that you save on licensing costs and get something strong and reliable, with the world's population (the geek population at least) always working to support and improve it. Google is one of the greatest examples, with its servers running on open source and much of its code written in Python – an open source language. They buy plain-jane machines which they ramp up to supercomputer levels through the use of open source operating systems. Those who recall it will remember that the PARAM Padma – the Indian supercomputer made by CDAC – is essentially 248 processors (54 4-way SMP nodes and one 32-way Symmetric Multi-Processing node) running Linux (and AIX, IBM's version of UNIX). This cluster is ranked 171 among the top 500 supercomputer sites of the world. If anyone thinks that is not a good enough performance, remember that the top-performing supercomputer of the last few years, the Blue Gene/L, located at the Terascale Simulation Facility at Lawrence Livermore National Laboratory and used by scientists at Livermore, Los Alamos, and Sandia National Laboratories, runs Linux. The 360-teraFLOPS machine handles many challenging scientific simulations, including ab initio molecular dynamics; three-dimensional (3D) dislocation dynamics; and turbulence, shock, and instability phenomena in hydrodynamics. It is also a computational science research machine for evaluating advanced computer architectures.



So yes, this free operating system developed and maintained by the world's hobbyist geeks is in fact the best possible operating system for supercomputers – but what about you and me? For a long time there really was no way out of the hegemony of the Microsoft operating system, with its GUI-based ease of use and depth of applications. Sure, there were always security risks, and stories of application crashes were legendary – but the engineers at Seattle worked on the code and by the beginning of the new century were finally able to deliver a much more stable OS: Windows XP, built on the Windows NT kernel. The software was good, and the applications – the office productivity suite that MS makes most of its money on – sat neatly together. Of course there were court cases and Department of Justice subpoenas to be answered from the Netscape and antitrust hearings of the last century – but somehow things were finding a way of solving themselves when most needed. Needless to add, the OS was one of their best selling ones, and open source – Ubuntu or otherwise – really did not have a chance. But that is where the arrogance of a leader comes in: they launched Vista, and that was, in my humble opinion, the beginning of the end. Like all new operating systems (from Microsoft) it was buggy and needed enormous storage, with Seagate (the storage maker) even insisting that Vista needed between 250 GB and 1 TB (terabyte) to be really useful, and calling Vista the poster child of storage!


But it wasn't just storage that was the problem – before long Vista was attacked by a 13-year-old virus called Stoned.Angelina! Soon corporate IT managers were hesitant about the move to Vista and did not seem as excited as they should have been; even after the Service Pack 1 installation, corporate IS managers have said that they will probably stay with XP for up to three more years! Microsoft Corp. says it has sold 42 million volume licenses of Windows since it released Windows Vista to enterprise customers last November, but the company claims to have no statistics on how many of the corporate users who are eligible to move to Vista have actually done so. As part of its efforts to encourage organizations to take the plunge, Microsoft late last month announced that it will ship the first service pack update of bug fixes and functionality tweaks for Vista during next year's first quarter. The SP1 release will be accompanied by a third and final service pack for Windows XP, Vista's six-year-old predecessor. Microsoft, not to be outdone, declared that it would stop support for XP, and all new computer manufacturers were forced to go the 'Wow' route and pre-install Vista. This was the final sign of arrogance, and the opinion makers of the IT business were soon out for blood. First Google sued them over issues with search on Vista, then the press was outing the low adoption numbers, and before long the penguin lovers were all over the story, stating that Ubuntu had won thanks to Vista, with tips on how to get your grandma onto open source; there is also a list of the five ways that Linux is better than Vista – better security; less hardware and storage needed; no DRM and limitations; a growing set of applications; and no license fee!



The giant from Seattle capitulated and confirmed the extension of XP as the press asked, with some cheer, why Microsoft must abandon Vista to save itself. And as a final insult, while Microsoft is still pushing Vista hard, the company is quietly allowing PC makers to offer a "downgrade" option to buyers who get machines with the new operating system but want to switch to Windows XP. Microsoft has in the meantime launched Halo 3, a video game with lots of effects and eager early adopters (and note that this is version 3 – just as Windows was at version 3 when it finally succeeded), and might perhaps even change tack to become a gaming company. And why not – if Apple Computer could become a digital music player and phone maker, I am sure MS can also reinvent itself. So where does that leave open source? In my opinion and experience – since I already have it on my desktop – it is probably coming soon to a computer near you. Delhi-based computer resellers recently held a workshop on Linux, and that makes me think that the days of pirated software on the 'grey market' (white box) PC are indeed over.



In Europe Linux is expected to soon displace Windows, and by 2009 Russia expects to move totally to a Russian OS based on Linux to decrease dependence on foreign software. India, never to be left behind, has already taken the lead at various state governments, with Tamilnadu and Kerala having rejected MS Windows for Linux; they have also voted against Microsoft's OOXML initiative at the ISO. At Allahabad, the courts have given a rousing reception to open source.


Indian Mobile telephony

The computer – the IBM PC as it was known then – turned 25 last year, and while many feted that birthday, a lot of forward-looking people predicted the demise of this technology in the not too distant future. Handheld computers and smartphones are expected to take its place, and the name of the game is ubiquitous computing. The global PC market is estimated at $200 billion a year, and though growth rates are not what they used to be, 2007 was in fact one of the better years, with numbers between 12% and 17% growth in Q2 2007. A large part of the growth was attributed to portable (laptop) computers. A lot more interesting facts about the computer can be read here. This was also the year that Apple – hitherto a bit player in the computer industry – saw shipments of its computer line grow, thanks to what analysts call a 'halo' effect.


Global PC shipments in Q2 2007

[Table: global PC shipments, Q2 2007 vs Q2 2006 – the vendor rows and unit figures were lost in extraction. Source: IDC]

Gartner has actually put a higher number on the growth than IDC, and ranks Apple fourth in the United States, though not in the top five internationally. But even at its best the PC is still a laggard compared with what mobile phones have done in India. Reuters reported yesterday (23rd August 2007) that India is in fact Nokia's second largest market – beating, hold your breath, the US, though well after China, which is its largest. Nokia shipped 60 million handsets from its factory near the southern Indian city of Chennai in the 18 months to August, and CEO and President Olli-Pekka Kallasvuo expects demand to remain strong as India's user base surges.


The mobile phone, it appears, has truly and completely upstaged the PC, and in India this is brought home by recent reports that most people now access the internet through mobile phones rather than computers, as a recent report in the Economic Times mentions. This has a lot of significance for a developing nation such as ours: as Robert Jensen, professor of Public Policy at Harvard's John F. Kennedy School of Government, has shown through his research on the fishermen of Kerala, a simple technology like mobile telephony increases their earnings substantially, with a commensurate increase in their quality of life, healthcare and so on. The report can be accessed from the NCAER (National Council of Applied Economic Research) here as a PDF. Mobile telephony is indeed going to be the way forward for our country – the device that will finally unlock the true potential of the vast network out there, which earlier seemed limited to computer owners.


This is a fact recognized by the phone makers, and the Wharton School of Business, University of Pennsylvania, had a good analysis of what Nokia has done in India as business policy to take the lead here. A lot of it has to do with making a good product. The made-for-India and made-in-India Nokia 1100 is estimated to be the best selling electronic product of all time, at 200 million units, compared with 100 million iPods and 150 million PlayStation 2s, and it shows where this market could go and the potential there. The fact is also borne out by another crude measure – the amount of PE and VC money following startups in India in the mobile telephony application space. Another interesting item in the news was a rumor that Google will launch its G-phone – a mobile phone (and service) – in India first in the next few days.



India at 60 and our internet – part 2

(continued from last entry…..) Besides the tastes in porn and job sites and, well, organ sizes, there are also differences in the way people in India use the internet. To quote Om Malik: '….The PC usage patterns are such that people don't spend too much time surfing, but instead focus on specific tasks and actions, like sending email, trading stocks, checking job listings or matrimonial listings. Think transaction-based, task-oriented Internet usage!…..' (http://gigaom.com/2007/02/16/india-internet-start-ups). This is in fact borne out by the comScore study cited above, which also measures the average monthly hours spent online. India is not in the top 15 – which may not be too surprising considering that the USA also does not figure in that list.


Top 15 Countries by Average Monthly Hours Online per Unique Visitor Among Visitors Age 15+* March 2006 Total Worldwide – All Locations Source: comScore World Metrix 

 * Excludes traffic from public computers such as internet cafes or access from mobile phones or PDAs.

[Table: average hours per visitor, March 2006 – most country rows and figures were lost in extraction; of the entries that survive, South Korea and Hong Kong appear among the leaders.]
If we look at the top 100 sites visited by Indians, this is also borne out by the presence of a large sprinkling of bank sites (net banking) and stock trading sites, besides travel (ticket booking) sites, with the Indian Railways, as expected, coming in really high. The other big thing is the great fat Indian marriage act, with a lot of 'holy' matrimony sites present in the top 100 listing by Alexa.


So 60 years after we got free, Indians are about looking for a good job; looking to get married (and perhaps to meet people outside marriage too, what with the prevalence of social networking sites in the top 3 visited by Indians); stocks; beating long queues for ticket bookings; and porn – the usual suspect! Though the data here may be open to argument and perhaps in need of a bit more validation, the largest online regional community seems to be the Telugus of Andhra Pradesh, with a site devoted to Telugu cinema coming in at 62, followed closely by Ramoji Rao's Eenadu, the largest circulated Telugu daily; Tamilians perhaps come in second, with Dina Malar at 86.




India at 60 and the internet

India in 2007 – that is 60 years now that we have been free – and the day was spent like last year, with me lazing around and watching all the feel-good television stuff about the great progress we have made and all that is well and good with the great Indian dream run we have been having. I spent some time feeling good about the Indian internet revolution that has come about in my time – from the days of the slow and moody dial-up connections to the fairly passable broadband pretenders. And yes, the state-owned telephone company (PTT) declared today the day it launched IPTV for all. This is the cutting edge of course – there was a CDN (Content Delivery Network) tender out there some time back from the state-owned telcos, and the roll-out is now final and is called IOL Broadband (http://www.iolbroadband.com). It is early days yet, and I guess there will be early hiccups and things will slowly get better before they are good, but the ball has been set rolling. In traditional television delivery, all programming is broadcast simultaneously; the available program signals flow downstream and the viewer selects which program he wants to watch by changing the channel. IPTV, by contrast, sends only one program at a time: content remains on the service provider's network and only the program the customer selects is sent to the home. When a viewer changes the channel, a new stream is transmitted from the provider's server directly to the viewer. Like cable TV, IPTV requires a set-top box.
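The unicast delivery model described above can be sketched in a few lines of Python. This is a toy illustration of the idea, not any real IPTV middleware; the class and channel names are all made up:

```python
class IPTVHeadend:
    """Toy model of unicast IPTV delivery: content stays on the provider's
    network, and only the programme a subscriber selects is streamed out."""

    def __init__(self, channels):
        self.channels = channels      # channel name -> programme stream
        self.active_streams = {}      # subscriber -> channel currently being sent

    def change_channel(self, subscriber, channel):
        # A channel change tears down the old stream and starts a new one from
        # the server - unlike broadcast TV, where every channel is always on the wire.
        if channel not in self.channels:
            raise KeyError(f"no such channel: {channel}")
        self.active_streams[subscriber] = channel
        return self.channels[channel]  # the single stream sent to this home

headend = IPTVHeadend({"news": "news-stream", "sports": "sports-stream"})
print(headend.change_channel("set-top-box-1", "sports"))  # sports-stream
```

The key contrast with broadcast is visible in `active_streams`: at any moment the provider tracks exactly one stream per subscriber, rather than pushing every channel downstream at once.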

The growth in Indian Internet usage has been the stuff of legend, with growth rates of 30–35% among users, and depending on who you ask this is a country of between 30 and 40 million users; a report from JuxtConsult (http://www.juxtconsult.com), a Delhi-based online research and consultancy, puts the number at slightly over 30 million users (http://www.marketingvox.com/archives/2007/06/14/indias-internet-users-reach-30-million).

Another report, by the IAMAI – the Internet and Mobile Association of India (http://www.imai.in) – puts the Internet population of India at 42 million users in 2006, a minuscule 3.6% of the total population (http://www.internetworldstats.com/asia/in.htm). This data could be wrong, as there seems to be a decrease in the population between '05 and '06, but then three different sources of information (the ITU, the C.I. Almanac and the IAMAI) are being considered here, which may explain the discrepancy. Whatever the final numbers, it would appear that there is a fairly large number of Internet users in India – not perhaps large as a share of the total population, but definitely significant.

Internet Usage and Population Statistics:

[Table: India's Internet penetration over successive years – the year and user-count columns were lost in extraction; the surviving penetration figures run 0.1%, 0.3%, 0.5%, 0.7%, 1.6%, 2.1%, 3.6%, 4.5% and 3.6%, with the C.I. Almanac among the cited usage sources.]
Whatever the actual figures may finally be, India does seem to rank among the top 10 nations by Internet population, as per this report from comScore – a Reston, Virginia based global Internet information provider 'to which leading companies turn for consumer behavior insight that drives successful marketing, sales and trading strategies' (http://www.comscore.com). They put the global Internet user universe at nearly 700 million users (http://www.comscore.com/press/release.asp?press=849).


Top 15 Online Populations by Country, Among Visitors Age 15+* March 2006, Total Worldwide – All Locations, Unique Visitors (000). Source: comScore World Metrix

 * Excludes traffic from public computers such as Internet cafes, and access from mobile phones or PDAs.

[Table: top 15 online populations by country, March 2006 – most rows and figures were lost in extraction; of the entries that survive, the list was headed by the Worldwide Total and included the United States, the United Kingdom and South Korea.]
60 years of free India, and we have done a lot and shown the world we have quite an appetite for the internet; the announcement of IPTV will only make things better and broaden the base. But a qualitative look at what Indians do on the internet provides some key insights. Using the data from Alexa, a web usage company owned by Amazon.com, and forgetting for a minute the potential for bias (http://forums.seochat.com/alexa-ranking-49/how-accurate-is-alexa-4536.html), we get some interesting observations. The first Indian site in the top 100 sites in India is Rediff – making one think that their valuations were not so off the mark. Orkut is the second most popular site in India, narrowly behind Yahoo, the global leader. What does that mean – social networking rules here? Facebook is at 21, MySpace at 65 and LinkedIn at 75 in the top 100 listing – for more details see http://www.alexa.com/site/ds/top_sites?cc=IN&ts_mode=country&lang=none

The second most popular Indian site in the top 100 listing is the job portal Naukri.com, coming in at 9th place. That means people are looking for jobs like never before (HR managers beware) and career mobility is in; it also proves that a strong advertising message – the Hari Sadu and 'guess who has heard from us' spots on Indian TV – has worked. The next best Indian site is Indiatimes.com from Bennett, Coleman & Co. – mainstream media's online presence – coming in at 13th.


Sex sells, like anything else and anywhere else, and the top adult site in India is Debonairblog – coming in, though, at a lowly 18 out of the top 100. This site hosts pictures and video downloadable from other sites like Rapidshare – which itself comes in at 11. Most of the people who visit the job site Naukri, it appears, also visit another job portal, JobsAhead – and penis extension sites. What that might mean is that Indians do not have the jobs they want, and perhaps not the right-sized tools to do them with! http://news.bbc.co.uk/2/hi/south_asia/6161691.stm


Computing everywhere…


I read a piece recently where a disgruntled user wrote to Steve Jobs after a bad experience with his new Apple laptop, and the CEO himself (through his Executive Assistant, that is) apparently saw to it that the complainant got a new computer. This was a great (though much debated, in that little space of the blogosphere it inhabited) event, and references may be found at http://www.consumerist.com. Others, like me, admitted that this showed he was god – a few snickers, though, did suggest that it was in fact a PR exercise. And well, I knew he had to do it; in the background, Rupert Murdoch – and, it is speculated, Google too – made a bid for Dow Jones, and not to be left behind, Microsoft made its bid for Yahoo! A recent reference to the Rockefeller crucible in an investment letter spoke about the way Standard Oil in America increased its monopoly perfectly legally, and how CEOs of companies following the same philosophy today are returning fantastic shareholder value. It is not so much monopoly as – I think the right word is – oligopoly, and to quote as usual the wiki: '…. is a common market form. As a quantitative description of oligopoly, the four-firm concentration ratio is often utilized. This measure expresses the market share of the four largest firms in an industry as a percentage. Using this measure, an oligopoly is defined as a market in which the four-firm concentration ratio is above 40%. For example, the four-firm concentration ratio of the supermarket industry in the United Kingdom is over 70%; the British brewing industry has a staggering 85% ratio. In the U.S.A, oligopolistic industries include accounting & audit services, tobacco, beer, aircraft, military equipment, motor vehicle, film and music recording industries…..' (http://en.wikipedia.org/wiki/Oligopoly). A large part of where the world is going will be controlled by a few people; it already is, to a great extent.
In those times you just hope that the people in charge have some benevolence around them and believe in their talk and would do their bit for the little guy out there.
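The four-firm concentration ratio the wiki quote describes is simple arithmetic, and a short sketch makes it concrete. The individual firm shares below are made-up illustrative numbers; only the 85% UK brewing total and the 40% threshold come from the quote:

```python
def four_firm_concentration_ratio(market_shares):
    """CR4: the combined market share (in percent) of the four largest firms."""
    return sum(sorted(market_shares, reverse=True)[:4])

def is_oligopoly(market_shares, threshold=40.0):
    """By the rule of thumb quoted above, a CR4 above 40% marks an oligopoly."""
    return four_firm_concentration_ratio(market_shares) > threshold

# Hypothetical shares summing the four biggest brewers to the quoted 85%.
uk_brewing = [30.0, 25.0, 18.0, 12.0, 8.0, 4.0, 3.0]
print(four_firm_concentration_ratio(uk_brewing))  # 85.0
print(is_oligopoly(uk_brewing))                   # True
```

A perfectly fragmented market of ten firms at 10% each would have a CR4 of exactly 40% and so, on this definition, just miss being an oligopoly.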


In 1996 I read Nicholas Negroponte's Being Digital, where the future was to be not so much about atoms and molecules as about bits and bytes. If you follow courier and logistics company stocks, I am certain you will point out that these people seem not to be going out of business but, quite on the contrary, seem to be moving that much more matter every year. Which is in fact true: goods will have to move around, and you cannot really imagine (yet) wearing electronic Nike shoes. But anyone who has seen Star Trek, the TV show, will remember '…Scotty, beam me up….' – and god bless that actor's soul; his ashes were a few days back sent to deep space, as he had requested. There are people working along those lines, I am sure, and as many a black hole or other such deep space-time conundrum opens up at the particle accelerators in France and Switzerland, we will find a way to get things across. You must of course not miss the irony that the protocol that took this giant network of networks out of the labs to the world wide web was invented at CERN – le Conseil Européen pour la Recherche Nucléaire (the European Council for Nuclear Research) – which might today also be the place of the next big discovery that makes a mockery of time and space and media as we know them now! Companies understand about labs, and that is why when they buy, they take the engineers and the tech staff – the scientists, as it were. In the early part of the 20th century AT&T understood that to maintain its grip on the market it needed to push the envelope of technology to levels that university labs would not themselves be able to reach. Yet the question often comes up: had HTTP been invented at a private lab, wouldn't royalties perhaps have curbed what it has now been able to do?


I think computing should be about free, and the Ministry of HRD in India recently announced that it has taken up a target of getting a laptop PC down to $10 a unit. It had earlier turned down Mr. Negroponte's offer of the OLPC at $100 a unit. This is ambitious, but knowing Indian ingenuity – and the fact that we achieved teraflops of computing with our own home-made machines – it may just be possible. The PARAM Padma from CDAC at Pune has a heritage of being in the top 500 supercomputer lists of the world, and the stated objective there was to beat a price and performance point. There are many such examples and options for getting prices down, and we have not yet begun on what the Chinese can do once they set their mind to the task! The OLPC is in fact made in Taiwan, and one of the main reasons that per-port costs of many telecom solutions and systems have come down is the rise of the Chinese and Indian consumer, who would knock prices down to derive value. In 1998 there was a privately owned telephone company here in India that provided cellular telephony to its market (the government-defined cell phone circles of Tamilnadu and Kerala) at less than two and a half American cents a minute. Indian mobile telephony is actually an industry with the lowest price point to the end customer and the highest entry cost for the operators, and yet it represents a huge opportunity. Also, Tamilnadu and Kerala are the two states in India that have taken the most aggressive steps to move to open source software.


I think computing is going that way – like it or not, the future will have to be about little mobile private hand-held devices that are small but with strong, deeply embedded technology to enhance connectivity and productivity, and at the lowest end of this device spectrum they will probably be free. I think it is more likely than ever that we will use mobile phones for more than just to speak into; just take a look at Nokia’s latest product, the N95, as a harbinger – ‘it is not one thing, it is many’ (http://nds1.nokia.com/tutorials/support/apac/phones/n95/apac_english/index.html). Dr. Schmidt at Google calls it the dawn of ‘cloud computing’. I think that is true – hard drives are soon going to congeal into a ‘cloud’ of disk arrays and SANs and NASes, if they do not first go the way a recent announcement from Dell showcased: a flash-based laptop, its storage built entirely from semiconductor elements that remove the need for the fast-spinning magnetic platters as we know them now. The flip side is that things are then going to need, I would imagine, a lot of cooling, and let us not even get started on what that may do for global warming. The new class of Penryn and other such exotically named processors, and the road maps of the chip makers, seem to indicate that the future will be about more than 8 ‘cores’ – a server on a chip would then be possible. With data exchange rates reaching terabytes, why not have the applications all resident in the server farms – which may in the future store more data per square inch than the present-day hard-drive-based ones – and have them pulled down to hand-held devices by users? Thin client computing as in the past, but in a mighty different shape, form factor and application density. A company called Microvision is in fact developing miniature displays – projectors that can be connected to mobiles and the like to show their content on larger surfaces.
The possibilities are mind-blowing, and though they are only venture-capital backed at the moment, the direction they show is that things are going to be about more immersive computing, if not ubiquitous computing, and the chances are the battle of the freeware / open source initiative against the corporate guys will have become kind of irrelevant, as both will coexist, possibly in discrete spheres of influence. The servers will run the paid versions, and the client devices will possibly work with a mix of free and paid software – but prices will be very low, $1 perhaps, which will be relevant since they will mostly be one-time-use types. The personal computer will possibly not die – there is always a requirement for a massive storage device with a display and a logic processor to be your backup, and I think that is the space it will take, with ever more layers of backups like iPods and other storage-based computing and perhaps gaming equipment. And media will also probably become free – no more DMCA and RIAA and attempts at controlling the storage media that a user pays for, but more Creative Commons (http://creativecommons.org) – the spiritus mundi – where it is not about the economics any longer.

Machines start to learn

For as long as I can recall I have been fascinated with machines and humans and how they engage together and develop. While at college during the early 1990s, studying for a graduate degree in cellular microbiology and genetics, I learnt how biological cells – the micro components of our existence – mutate and become different from when they start out. We learnt how a concept known as ‘genetic engineering’, then in its infancy, can create new gene pairs and whole new characteristics. Drosophila melanogaster – better known as the common fruit fly – was our guinea pig in the labs. Even then, in our classes, we often wondered if we could take this mechanism to the more inanimate machines, and what could in fact happen if we did. Somehow the concept of artificial intelligence did not then occur to our non-computer-addled minds. Computers then were in fact assisting the whole process of gene sequencing – understanding, as it were, the way life came about – and the introduction of computer software substantially decreased the time it took for the complete human genome to be sequenced.

Anyone who knows his computers and recognizes how viruses work will not in the least be surprised by this biological reference, which may seem out of place in the cold hard world of bits and bytes. Trojans, worms and viruses – which are in fact just bits of software code – are known to mutate and reproduce (send out copies of themselves) in the wild and crazy world of the ‘botnet’ and the various other dark and sallow breeding pits of the IRC channels, which turn into distributed denial-of-service attacks. That is a lot of tech geek speak with biological overtones; what it essentially means is that computer code infects your computer, makes it a ‘zombie’, and then uses it to attack servers across the world. And thus the future comes about, and it is amongst us as we speak, and what projects may come if the mind be let to wander. What would you then say about motherboards and microprocessors that can reorganize what they are depending upon the software (firmware) that has been loaded onto them? These are known in technology speak as FPGAs – field-programmable gate arrays – and as the name suggests, out there in the field you could program them to think differently from how they started out!

That degree in genetics did not help much; I landed up selling computers and software and other such paraphernalia, and even got good at it. That is also when I realized I wished I had been an über geek with a degree in computer engineering and not in the life sciences. But it was too late, and I could not be one because I could not program. I had in fact done some work in BASIC, including a certificate course in 1984 while at junior high school, but I never really took it further, and today, to tell you the truth, though I know what they are, I cannot do VB or Perl or Python or even C++ or C# or even JavaScript. I have told myself I should do something about it and hope maybe I can. But I have been busy selling HPC – high performance computing machines – since the late 1990s, and I know that those men in the jump suits at Intel or elsewhere can bring out a mean processor for the rest of those coder dudes to build their dreams on.

The first computer I ever saw was something called an IBM-compatible PC/XT running DOS, and the first video game I ever played on it was ‘Pong’. This was way back when computers were not supposed to be used by ordinary mortals and Bill H. Gates the 3rd was still trying to get his iron grip over the world. Today, as anyone will tell you, the average computer does transactions a million times more than what the ones in 1984 did. Intel announced a few days back a processor that could perhaps be clocked to 3.3 GHz and would be a quad core – 4 individual cores – and here they were only following what their competition, AMD, did some time back. The new processors have really geeky names, but that is, as they say, just the tip of the iceberg. Recently Thom Sawicki, technology strategist for the Intel Communications Technology Lab, discussed the future, and guess what they have come up with: an 80-core announcement! Specifically, it isn’t just that someone has put 80 cores on a single chip and called it a "processor"; it is about the larger implications of massively multicore processors for system- and network-level architecture. In his words: "Once you jump to terascale, you have to ask and need to ask ‘what are the implications for everything?’ The platform of the future when you get to terascale will look different and act different. It won’t be a CPU of 80 cores surrounded by a chipset and some peripherals. You’ll see a much tighter, more integrated organization." The picture that Sawicki paints is of a ‘server-room-on-a-chip’: a single piece of silicon that uses many cores and virtualization to do the kind of work that currently takes multiple networked servers. Sawicki gave the example of a hypothetical multicore chip that can run a high-volume e-commerce solution on a single piece of silicon.
Instead of a web server box that takes orders and then sends them over the network to another machine for processing, you could use two separate cores for these tasks, with each core running a virtual server.
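A rough way to picture that split with today’s tools is two processes standing in for the two cores, handing work to each other through an in-memory queue instead of over a network. This is only an illustrative sketch of the idea; the order fields, prices and function names are all made up, not anything from Intel.

```python
from multiprocessing import Process, Queue

def take_orders(queue, n_orders):
    # Front-end "core": accepts orders and hands them over the queue.
    # Each order here is a made-up dict standing in for a real request.
    for order_id in range(n_orders):
        queue.put({"id": order_id, "items": 2, "price": 9.99})
    queue.put(None)  # sentinel: no more orders

def process_orders(queue, results):
    # Back-end "core": fulfils each order as it arrives and tallies revenue.
    total = 0.0
    while True:
        order = queue.get()
        if order is None:
            break
        total += order["items"] * order["price"]
    results.put(total)

if __name__ == "__main__":
    orders, results = Queue(), Queue()
    front = Process(target=take_orders, args=(orders, 5))
    back = Process(target=process_orders, args=(orders, results))
    front.start(); back.start()
    front.join(); back.join()
    print("revenue processed:", results.get())
```

The point of the sketch is only the shape of the thing: the "network hop" between two boxes collapses into a queue between two cores on the same silicon.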

Ah, man, what all that geek speak means is that they are coming out with even faster and better and perhaps even hotter processors. And we are only speaking of Intel – we have not yet even started talking about what Texas Instruments, Motorola, Philips or Samsung are doing. And just when you thought, ‘ah, that is a lot of computing’, this student–professor duo at the University of Oslo, Kyrre Glette and professor Jim Tørresen, have developed – or rather written – software that makes hardware imitate evolution at runtime, with all changes to the existing hardware made through software. What their hardware does is pair up “genes” in the hardware to find the hardware design that is the most effective at accomplishing the tasks at hand. Just like in the real world, it can take 20 to 30 thousand generations before the system finds the perfect design to solve the problem, but this happens in just a few seconds, compared to the 800,000–900,000 years it took humans to go through the same number of generations. This team first started to use evolution back in 2004, when they made a chicken robot, “Henriette” – yes, a chicken. The chicken robot used evolution, this time software-based, to learn how to walk on its own. Evolution solves a lot of problems that programmers can’t: a programmer cannot think of every problem that might occur if, say, a robot sent to Mars fell into a hole; through evolution that robot could learn how to climb out of the hole without the interference of humans. The team now wants to make a robot designed to help in the installation of oil pipes and other oil-related equipment at 2,000 metres’ depth; these depths make it almost impossible to communicate with a robot – you would either have to have 2–3 kilometres of wires or communicate through echo signals, which in turn would give a multi-second delay. Their research paper can be accessed here (PDF).
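To get a feel for the mechanism – pairing up “genes”, mutating them, and letting the fitter designs breed over thousands of generations – here is a toy genetic algorithm in the same spirit. It has nothing to do with the Oslo team’s actual hardware; the bit-string “design”, the population size and the mutation rate are all arbitrary choices for illustration.

```python
import random

# A "design" is just a bit string; fitness is how many bits match a target.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
POP_SIZE = 40
MUTATION_RATE = 0.02

def fitness(genome):
    # Count how many "gene" positions agree with the target design.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    # Flip each bit with a small probability, mimicking random mutation.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def crossover(a, b):
    # Pair up genes from two parents at a random cut point.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(max_generations=10_000):
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]
    for generation in range(max_generations):
        population.sort(key=fitness, reverse=True)
        best = population[0]
        if fitness(best) == len(TARGET):
            return generation, best  # perfect design found
        # Keep the fitter half as parents; carry the best forward unchanged
        # (elitism) and breed the rest of the next generation.
        parents = population[: POP_SIZE // 2]
        population = [best] + [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(POP_SIZE - 1)
        ]
    return max_generations, population[0]

generations, best = evolve()
print(f"perfect design found after {generations} generations")
```

On a problem this small the “perfect design” turns up in a few dozen generations and a fraction of a second, which is exactly the speed-up over biological evolution the Oslo work trades on – though their search space, evolving actual circuit configurations, is vastly larger than 16 bits.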
And they were not even the first: Paul Layzell and Jon Bird at the University of Sussex in Brighton applied an evolutionary program to a simple arrangement of transistors and found that an oscillating output ‘evolved’. When they looked more closely they found that, despite producing an oscillating signal, the circuit itself was not actually an oscillator. Instead, it was behaving more like a radio receiver, picking up a signal from a nearby computer and delivering it as an output! The machine had indeed learnt!

And where that will take us is for us to imagine!