The secrets of GoogleBrain and the RankBrain algorithm, revealed. This article discusses their impact on digital marketing, including SEO, mobile, PPC and social media. It gives a historical overview of GoogleBrain, analyses the patterns, and concludes with findings about the current situation and future changes in the search engine algorithm.
Back in 2006 there was some interest in implementing artificial intelligence in Google's search engine algorithm. The GoogleBrain project took shape a few years later, and in 2014 Google acquired DeepMind, a British artificial intelligence company founded in 2010. DeepMind had worked on teaching machines to play video games using machine learning and artificial neural networks (ANNs). This wave of artificial intelligence can recognize patterns in digital representations of sounds, images and data.
Intelligence from Big Data
What is deep learning? It involves iterative algorithms, learning at different levels of abstraction, non-linear transforms and, typically, neural nets.
An iterative algorithm is a simple way of solving a problem by repeating a computation until it converges. How iterative algorithms work internally belongs to the subject of data structures and algorithms; here the focus is Google's RankBrain algorithm and its impact on search engine optimization and business.
Learning at different levels of abstraction happens simultaneously.
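As a minimal sketch of these ingredients (not Google's actual code), here is a two-layer network in Python with NumPy: each layer applies a linear map followed by a non-linear transform, so each successive layer operates at a higher level of abstraction. All weights and inputs below are random placeholders.

```python
import numpy as np

# Two stacked layers, each a linear map plus a non-linear transform (tanh).
# Layer 1 plays the role of low-level feature detectors; layer 2 builds
# more abstract features on top of them. All values here are illustrative.
rng = np.random.default_rng(0)

x = rng.normal(size=(4, 3))   # 4 samples, 3 raw input features
W1 = rng.normal(size=(3, 5))  # layer 1 weights
W2 = rng.normal(size=(5, 2))  # layer 2 weights

h1 = np.tanh(x @ W1)          # first level of abstraction
h2 = np.tanh(h1 @ W2)         # second, more abstract level

print(h2.shape)               # (4, 2)
```

In real training the weights would be adjusted iteratively, which is where the "iterative algorithm" ingredient comes in.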
Big data comes from the Internet and from metadata: tags, translations, Mechanical Turk annotations. Big data on its own is not a useful thing; it is just a pile of information unless you apply a methodology to make use of it.
You can't understand big data by reading it. None of us would go and read a phone book; on its own it's useless. Big data makes sense only if you make use of it. For example, in the past none of the search engine programmers looked at data in order to produce an output such as "house", "dog" or "cat", because tags were not yet in use among users. Today my grandmother knows how to use hashtags and tags, but back then it was not popular. Sometimes she says: "Maria, remember to add tags to my photo when you post it on Instagram or the blog."
The algorithmic advances are unlabeled data, unsupervised pre-training, structured neural networks (feature detectors) and successive layers of learning. Today people look for a variety of ways to work with unlabeled data.
Before that, videos related to cats were not labeled with the name "cat". The system just looked at random images, and the concept of a cat emerged out of them. When I studied computer engineering, we had several courses related to artificial intelligence, and I learned that in deep learning a system can learn at multiple levels of abstraction. I used this pattern in my PhD project, "Implications for Upgrading Accelerated Learning Practices in Educational Systems".
Back to the Google search engine. GoogleBrain had about 1 billion synapses, while an adult human brain has around 100 trillion synapses and an infant's brain has about 1 quadrillion.
The number of synapses per neuron grows as a power law (a term borrowed from biology) of brain mass.
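The claimed relationship can be written as a simple formula. The sketch below is purely illustrative: the constants c and k are placeholders, not measured biological values.

```python
# Power-law sketch: synapses per neuron s as a function of brain mass m,
# s = c * m**k. The constants c and k here are illustrative placeholders,
# not measured biological values.
def synapses_per_neuron(m, c=1.0, k=0.5):
    return c * m ** k

# With k = 0.5, quadrupling the mass doubles the synapses per neuron.
print(synapses_per_neuron(4.0))  # 2.0
```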
The deep learning methodology still needs structural improvement.
"We actually think quantum machine learning may provide the most creative problem-solving process under the known laws of physics." Google Research Blog, 2013
Now their machines are 50 times bigger than when they made that claim.
Google claims its D-Wave quantum computer is 10 times faster than conventional hardware.
D-Wave native sampling for deep learning:
Below is D-Wave's placement in the process:
Input data → D-Wave → classification
When such an algorithm runs, tags with similar meanings can be fetched in around 80 milliseconds, such as: car, BMW, Porsche; or keywords such as kids, children, clown, fun and color, alongside similar images.
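As a toy illustration of serving related tags, here is a lookup over precomputed groupings. The tag groups are the article's own examples; the index structure and function name are invented for this sketch.

```python
# Precomputed tag groupings; in a real system these would come from a
# learned similarity model, not a hand-written list.
TAG_GROUPS = [
    {"car", "BMW", "Porsche"},
    {"kids", "children", "clown", "fun", "color"},
]

def related_tags(tag):
    """Return the tags that share a group with `tag` (empty set if unknown)."""
    for group in TAG_GROUPS:
        if tag in group:
            return group - {tag}
    return set()

print(sorted(related_tags("BMW")))  # ['Porsche', 'car']
```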
Machine learning is a particular approach to a computational task. When algorithm designers write an algorithm, their job is not finished: they have to push their data through several processes to get the algorithm to work properly.
For example, if the machine is asked to fetch data about "Lamborghini" and returns a duck, a search engine programmer has to penalize it for the wrong output. In machine learning, it is all about training the algorithm.
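The "penalize the wrong output" idea is essentially error-driven training. Here is a minimal sketch using a perceptron on invented toy data; RankBrain itself is far more complex, and this only illustrates the principle of nudging a model whenever its prediction is wrong.

```python
# A perceptron that is "penalized" (its weights are corrected) whenever
# its prediction disagrees with the label. Data and labels are toy values.
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = y - pred            # non-zero means "penalize"
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

# Toy task: output 1 only when both inputs are 1 (logical AND).
data = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]
w, b = train_perceptron(data, labels)
print(1 if w[0] + w[1] + b > 0 else 0)  # prediction for (1, 1): 1
```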
Historical Overview of Neural networks: 1980s – 2009 were the dark ages
According to the ImageNet results, the computational error rate for recognizing images was up to 79% in 2010 but fell to about 20% by 2014, which was like adding a new primary sense to the machine.
The human brain uses about 20 W of energy while your laptop uses about 50 W, yet your brain performs far more computation than your laptop does.
So what's missing? The machine needs to perform many operations at once in order to adapt to change. Some of you may have guessed where I am going with this statement about Google RankBrain.
Google has the algorithm and the data to make the algorithm succeed. Google really needs to scale things up the way biology has during years of evolution. Google RankBrain is not finished; it takes far more time than many claim and presume online.
Google has the algorithm to make this work and the machines to process it. Google engineers can build new hardware and take computer processing in a whole new direction, but that takes time.
Why do we need big data?
We need data to make it work. The question is: how is this algorithm going to help businesses and big corporations solve their problems? The truth is that most businesses don't care about deep learning. They care about what comes out of the box.
Artificial intelligence (AI) in a networking context is like building capabilities upon capabilities: the ability to understand languages and images, and the ability to ask questions, or to be told what question you should be thinking about.
Organizations should democratize technology in order to gain users' trust, increase their company's share of wallet and provide a good user experience.
In the search engine optimization context, the Google RankBrain algorithm is in its infancy. Here is why: we see a trend from focusing entirely on human-labeled training data towards unsupervised approaches. There is a pattern in everything; people don't need teachers to teach them things all the time. The reliance on human-labeled data is a scalability challenge for this technology. If we solve this and can use all the data out there, it is going to open up a new world.
Unsupervised learning is a type of machine learning algorithm used to draw inferences from datasets consisting of input data without labeled responses. Cluster analysis involves applying one or more clustering algorithms with the goal of finding hidden patterns or groupings in a dataset. Clustering algorithms form groupings or clusters in such a way that data within a cluster have a higher measure of similarity than data in any other cluster.
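Here is a minimal k-means sketch, one instance of the clustering algorithms described above, with invented 2-D points; real cluster analysis would typically use a tested library such as scikit-learn.

```python
import numpy as np

# Minimal k-means: group unlabeled points so that points within a cluster
# are more similar to each other than to points in other clusters.
def kmeans(points, k, iters=10, seed=0):
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        dists = np.linalg.norm(points[:, None] - centers[None, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

# Two obvious groupings: points near (0, 0) and points near (5, 5).
pts = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [5.2, 4.9]])
labels, centers = kmeans(pts, k=2)
print(labels)  # first two points share one label, last two the other
```

No labeled responses are used anywhere: the groupings emerge from the data alone, which is exactly what "unsupervised" means.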
A hidden Markov model (HMM) is one in which you observe a sequence of emissions, but do not know the sequence of states the model went through to generate the emissions. Analyses of hidden Markov models seek to recover the sequence of states from the observed data.
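Recovering the hidden state sequence from observed emissions is commonly done with the Viterbi algorithm. Below is a hedged sketch: the weather/activity states and all probabilities are textbook-style toy values, not anything from Google.

```python
# Viterbi: find the most likely hidden-state sequence for a sequence of
# observed emissions, given start, transition and emission probabilities.
def viterbi(obs, states, start_p, trans_p, emit_p):
    # best[t][s] = (probability, previous state) of the best path ending in s.
    best = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for o in obs[1:]:
        prev = best[-1]
        row = {}
        for s in states:
            p, ps = max((prev[q][0] * trans_p[q][s] * emit_p[s][o], q)
                        for q in states)
            row[s] = (p, ps)
        best.append(row)
    # Backtrack from the most probable final state.
    state = max(best[-1], key=lambda s: best[-1][s][0])
    path = [state]
    for row in reversed(best[1:]):
        state = row[state][1]
        path.append(state)
    return list(reversed(path))

# Toy model: hidden weather states emit observable activities.
states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
print(viterbi(["walk", "shop", "clean"], states, start_p, trans_p, emit_p))
# ['Sunny', 'Rainy', 'Rainy']
```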
If we look at neural networks, they contain two parts: the neural network itself and the learning principle. If we build more general computers that process neural networks, we still need to supervise them in order for them to progress.
There has not been sufficient progress on unsupervised learning, so you do not need to worry about RankBrain's impact on your website's ranking yet. Google has not built an unsupervised algorithm that outperforms its supervised algorithms. It does have the algorithms and the machines to process the information; however, Google still has to come up with the learning principles.
Where is the money in deep learning?
Companies should see the use of it. They are already making changes and big investments in deep learning. When it comes to digital marketing, you should follow your digital marketing processes, such as search engine optimization, mobile marketing, content marketing and PPC. Your job is to test what works for your business and not rely on assumptions.
For more information about how to become the market leader, read my book Multilingual Digital Marketing, 4th edition.