How far can A.I. go?

Academically, A.I. is the simulation of human behavior by computer technology. Philosophically, it is an attempt to create an alternative consciousness, an imitation of our natural selves. Metaphysically, the observer and the Universe are inseparable: you cannot have a reality unless and until there is someone to observe it. That is the rule of the game, just as you cannot have a video game unless there is someone to play it, or expect a video game unless the player is willing to play. A.I. is like creating an alternative playground for the mind.
So what is holding back the revolutionary breakthrough in A.I.? What brings life to a machine is machine learning. Before we can pump consciousness into the hardware, we have to set some basic rules for the machine to follow: axioms on which to build a theory. Axioms cannot be debated, as they are the very foundation of the theory, and their validity is not argued over. Once machine learning is enabled on a device, it has to crunch a humongous amount of data to develop a consensus on an issue, which means the machine has to run a lot of iterations to reach a logical conclusion.

So after machine learning we need an evolution in Big Data. The problem now is physical: we need ever-faster processing capabilities to crunch that humongous amount of data. Quantum computing holds the key to this challenge. Once we usher in the era of quantum computing, we may see data processed under Big Data at a scale that lets machine learning evolve and A.I. revolutionize our existence.

Beyond this, we need a common machine language for communicating with the machine, which has to understand some 6,500 human languages spoken on the planet. NLP is much like text mining. A huge amount of text data is generated on the internet, only about 21% of which is structured; the rest is unstructured. To draw meaningful inferences from it, NLP has to evolve to break down and understand human languages.
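That text-mining view of NLP is easy to see in code. Below is a minimal Python sketch, with the sample text and stop-word list invented purely for illustration: it breaks unstructured text into tokens and tallies the words that carry meaning, which is the first step any real NLP pipeline takes.

```python
import re
from collections import Counter

# A tiny sample of "unstructured" text -- invented for illustration.
raw_text = """
Machine learning needs data. Data on the internet is mostly
unstructured, and NLP has to break it down before a machine
can draw any meaningful inference from it.
"""

# Words that carry little meaning on their own (a toy stop-word list).
STOP_WORDS = {"a", "an", "and", "the", "is", "it", "to", "on", "has",
              "any", "from", "can", "before", "mostly"}

def tokenize(text: str) -> list[str]:
    """Break raw text into lowercase word tokens."""
    return re.findall(r"[a-z]+", text.lower())

tokens = tokenize(raw_text)
content_words = [t for t in tokens if t not in STOP_WORDS]

# The "structured" result mined out of unstructured text:
# a frequency table of the words that actually carry meaning.
for word, count in Counter(content_words).most_common(5):
    print(f"{word}: {count}")
```

Real NLP goes far beyond word counts, of course, but turning free text into something a machine can tally and compare is where all of it begins.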

With the advancement of edge processing, and with IoT becoming mainstream, we will never face a dearth of data. For any Big Data model to be meaningful we need a sizeable number of data points. We now have tools and gadgets that can be embedded at the end point and communicate seamlessly with a central hub, thanks to cloud computing architecture, which has broken down the barriers. Once we have adequate data, machine learning (ML) can interpret and analyse it to draw logical conclusions.

The applications are innumerable: health, medical care, population science. You name it and we can work out an application. For example, if we want to search for black holes among millions of spectral images collected by telescopes, we can build an A.I. model; the results come faster and more accurately. If we want to understand the effect of weather on crop yield, we can build a model that assembles data from weather services and from IoT devices installed on farms. The sky is the limit when it comes to A.I.
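To make the crop-yield example concrete, here is a hedged Python sketch. The sensor readings are invented, and the model is the simplest one possible, a straight line fitted by gradient descent; a real system would stream live data from weather services and farm IoT devices, but the loop below also shows the "lot of iterations" machine learning needs to converge on a conclusion.

```python
# Hypothetical readings: (rainfall in mm, crop yield in tonnes/hectare).
# In a real deployment these would stream in from weather services and
# IoT sensors on the farm; here they are invented for illustration.
data = [(50, 1.9), (80, 2.8), (110, 3.9), (140, 5.1), (170, 6.0)]

# Fit yield = w * rainfall + b with plain gradient descent.
w, b = 0.0, 0.0
learning_rate = 1e-5
n = len(data)

for step in range(20000):  # the "many iterations" ML needs to converge
    grad_w = grad_b = 0.0
    for rainfall, crop_yield in data:
        error = (w * rainfall + b) - crop_yield
        grad_w += 2 * error * rainfall / n
        grad_b += 2 * error / n
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"fitted model: yield = {w:.4f} * rainfall + {b:.4f}")
print(f"predicted yield at 125 mm rainfall: {w * 125 + b:.2f} t/ha")
```

A production model would use a proper ML library and many more features (temperature, soil moisture, sunlight), but the principle is the same: feed the data in, iterate, and converge on a conclusion.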
