
How has Google's "Hummingbird" algorithm made the e-commerce landscape more cyber secure?

Search Engine Optimization (SEO) is one of the most important investments an e-commerce business makes to increase revenue and profits. E-commerce stores thrive on analytics data and KPIs such as the number of new and returning visitors, time on site, and bounce rate. Many businesses also track Search Engine Marketing and Social Media Marketing KPIs to increase profit and reach. It is therefore crucial for SEO professionals to understand how social media platforms and search engines work. Since Google's ranking algorithm is not public, many software engineers are hired to reverse engineer it and draw insights for optimizing a site. Being an SEO professional is a full-time job that involves working as, or with, a web developer and designer, which also explains my work as a computer engineering student in an e-commerce business.

Search Engine Optimization can be divided into Black Hat and White Hat SEO. Black Hat SEO involves tricking the search engine into ranking a website higher, while White Hat SEO involves following the best practices and guidelines published by Google and other search engines for website ranking.

When Google updated its search engine to the Hummingbird algorithm in 2013, it caught e-commerce businesses off guard. Websites lost their ranking in search results and went from the first page to the seventh page of Google. It was described as the most significant rewrite of the core algorithm since 2001!

So, what is the Hummingbird algorithm update?

The Hummingbird update is a major revision of Google's search algorithm focusing on website content, uniqueness, and originality. It interprets natural-language queries and the relationships between words in a search string to deliver more accurate results. It was also more user-friendly and prioritized user safety. There had been earlier updates named Penguin and Panda, but the Hummingbird algorithm was a game changer.

As a beginner in cybersecurity, I see this algorithm as a change that made the internet more secure than it used to be, because the update essentially sought to penalize poor content and manipulative, black-hat SEO tactics such as cloaking, private blog networks, and link schemes.

The algorithm that penalized Black Hat SEO practitioners

1. From keywords to unique content

As SEO professionals, one of our main tasks was keyword research: placing the chosen keywords in each page's meta tags and keeping the site's robots.txt file tuned for crawlers. As a web developer, you know how essential robots.txt is for web crawlers.
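For readers who have not worked with it, a minimal robots.txt might look like this (an illustrative sketch with made-up paths; note that keywords themselves live in a page's HTML meta tags, not in robots.txt):

```
# robots.txt — served at https://www.example.com/robots.txt
User-agent: *
Disallow: /cart/
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
```

Crawlers such as Googlebot read this file before indexing a site, which is also why manipulating it (as discussed later) can hide content from search engines.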

Hummingbird changed how Google understands the language used in a searcher's query. The update was based on semantic search, or searching by meaning rather than matching up words or phrases. Previously, Google looked at the individual keywords in a query and returned results with the same or similar keywords on a page. After the change, that approach no longer worked on its own.

2. From having more links to having more context

The Hummingbird update addressed the fact that businesses were buying domains in bulk and trading links for a price so their sites would appear in search results. A backlink is a link from another website to a web resource; that resource may be a website, web page, or web directory. A backlink is a reference comparable to a citation.

The PageRank algorithm gave a lot of weight to links and backlinks: if your website was cited by many other websites, your ranking would go up. This changed with the new algorithm.
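The idea behind PageRank can be sketched in a few lines of Python. This is a simplified illustration of the published algorithm, not Google's production code, and the page names are made up:

```python
# Minimal PageRank sketch: each page splits its score among the pages it
# links to; a damping factor models a surfer who sometimes jumps to a
# random page instead of following a link.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its score evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A page that many others link to ("home") ends up with the highest score,
# which is exactly why buying backlinks once paid off.
graph = {
    "home":    ["about"],
    "about":   ["home"],
    "blog":    ["home"],
    "contact": ["home"],
}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # -> home
```

The example makes the incentive clear: every extra site linking to "home" pushes its score up, regardless of content quality, which is the loophole link schemes exploited.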

E-commerce businesses used to buy multiple domains and use this strategy to manipulate search rankings. Back in 2013, "Black Hat SEO" was not seen as something illegal. Instead, it was seen as playing by the rules of the game!

3. A check on cloaking to an extent

Cloaking is a black hat SEO technique in which a server is programmed to behave differently when it detects a search engine robot visiting the site. It is called "cloaking" because it hides the content people see and substitutes content designed to rank higher in the search engine results pages (SERPs): the website shows one version of a URL, page, or piece of content to search engines for ranking purposes while showing another to its actual visitors.
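In its simplest form, user-agent cloaking is just a branch on the request's User-Agent header. The sketch below shows the pattern so defenders can recognize it; the file names and bot signatures are illustrative:

```python
# How user-agent cloaking works on the server side (simplified sketch,
# shown so the pattern can be recognized, not as a how-to).

BOT_SIGNATURES = ("Googlebot", "Bingbot", "DuckDuckBot")

def choose_page(user_agent: str) -> str:
    """Return which page version a cloaking server would serve."""
    if any(sig in user_agent for sig in BOT_SIGNATURES):
        # Crawler detected: serve keyword-stuffed content built to rank.
        return "seo_optimized_page.html"
    # Real visitor: serve the page people actually see.
    return "visitor_page.html"

print(choose_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
# -> seo_optimized_page.html
print(choose_page("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))
# -> visitor_page.html
```

Because the branch happens server-side, neither the visitor nor a casual inspection of the page source reveals that two versions exist.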

Cloaking is still a cyber concern, as many websites use DNS cloaking and malware/spyware cloaking. However, the algorithm did keep this in check to an extent.

Threats still exist!

Several threats still exist, and SEO professionals and website developers are asked to keep a check on them. Some of these threats include the following:

  1. Using robots.txt manipulation to hide or dodge content

  2. CNAME cloaking is still an issue and can be used for spyware operations

  3. Images can carry steganographic payloads. Although the Hummingbird algorithm changed this practice and images are now analyzed much like text, cybercriminals and black hat SEO practitioners are always upping their game.
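One basic way a defender can probe for user-agent cloaking is to fetch the same URL while presenting a browser identity and a crawler identity, then compare the responses. This is a rough sketch using Python's standard library; the user-agent strings are illustrative, and a difference is only a lead, since dynamic sites vary content legitimately:

```python
import urllib.request

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0"
BOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_as(url: str, user_agent: str) -> bytes:
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def responses_differ(browser_body: bytes, bot_body: bytes) -> bool:
    """Flag a possible cloak when the two responses are not identical.
    Treat a hit as a lead to investigate, not as proof of cloaking."""
    return browser_body != bot_body

# Usage (requires network access):
#   cloaked = responses_differ(fetch_as(url, BROWSER_UA),
#                              fetch_as(url, BOT_UA))
```

A real audit would also fetch from an IP range the site associates with crawlers, since sophisticated cloaks check the client's IP address as well as its User-Agent.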

If you need any Search Engine Optimisation services or website design and development services, you can contact us at

About the Author's Experience on the Subject

"While as a Search Engine Optimiser & Web Developer for Apex Flower Mart during my sophomore years of Computer Engineering, my work involved:

  • Tracking SEO algorithms.

  • Using social media marketing to increase website traffic.

  • Working on making the website rank at the top of Google and other search engines like Yahoo.

As part of a company that was a member of Google Business Group, my main task involved learning about tools and techniques that could be incorporated into the website. As an SEO specialist, the books "How to Get to the Top of Google" by Tim Kitchen and "The Google Story" by David A. Vise were my bible. In this blog, I want to share insights on Black Hat and White Hat Search Engine Optimisation, and why website developers, marketers, and SEO experts must have basic cybersecurity know-how to protect their customers."
