SEO Glossary 2020: Essential Terminology
To work confidently with SEO, you need to know the meaning of the most useful SEO terms, phrases, and jargon in the industry. This glossary, with 269 terms and their definitions, is designed to serve one primary purpose: helping you learn SEO. Before digging deep into search engine optimization, you should be familiar with these terms and their importance in SEO. Since SEO best practice is continuously changing and evolving, I have excluded the terms and phrases that are no longer in use. My primary focus is on terms that are valid at present and can help you learn SEO strategies and techniques. To keep the glossary usable, I update it regularly.
A 301 redirect is an HTTP status code that indicates a web page has moved permanently from one URL to another. A 301 redirect sends site visitors and search engines to a different URL than the one they originally typed into their browser or selected from a search engine results page. If you don’t redirect permanently moved URLs, the server will return a ‘404 Not Found’ error, which can adversely affect the ranking of your website.
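As a sketch of the idea, a permanent redirect can be expressed as a small lookup in application code; real sites usually configure redirects in the web server itself, and the paths below are hypothetical examples:

```python
# A minimal sketch of 301 handling; MOVED_URLS and the paths in it
# are hypothetical, not real routes.
MOVED_URLS = {
    "/old-page": "/new-page",
    "/2019-guide": "/2020-guide",
}

def handle_request(path):
    """Return an HTTP status line and headers for a requested path."""
    if path in MOVED_URLS:
        # 301 tells browsers and search engines the move is permanent.
        return "301 Moved Permanently", {"Location": MOVED_URLS[path]}
    return "200 OK", {}

print(handle_request("/old-page"))  # ('301 Moved Permanently', {'Location': '/new-page'})
```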
A 302 HTTP status code indicates that a web page has been found at, or moved temporarily to, a different URL. 302 redirects are used when a webmaster places a URL at a different location for temporary reasons.
404 Not Found
A ‘404 – Not Found’ is a status code that a browser receives from the server when the webpage or URL a user is looking for cannot be found. This status code is displayed as an error message on the user’s browser screen. Prominent reasons for 404 errors are:
- the URL a user is looking for has been moved to another URL without proper redirection, or
- the URL has been deleted permanently, or
- there is an error in the URL.
See: Split Testing
Above the Fold
‘Above the fold’ is the top portion of a webpage that appears before the user scrolls. Google’s Page Layout algorithm lowers the rankings of websites featuring too many ads in this space.
An algorithm is a process or a set of rules followed to solve a problem in an automated way. Search engines use algorithms to discover pages on the internet and rank them in the most appropriate order for search queries. Google’s algorithm takes more than 200 criteria into account when determining the rank of a web page for relevant search queries.
Alt Text (Alternative Text)
Alt text, or alternative text, is a description of a graphic or image that can be inserted as an attribute in an HTML document to tell search engines the nature or contents of the image. Alt text helps search engines understand what each image means and how the information it conveys fits with the rest of the content on the page. Images with alt text rank better in Google Image search.
An Accelerated Mobile Page (AMP) is a Google initiative to build fast loading pages for mobile users. AMPs are designed to load quickly in slow networks. The pages are powered by the AMP HTML framework. Implementing AMP for a blog or website is an SEO best practice.
Anchor text is the visible, clickable text of a hyperlink. It can also be used to let users jump directly to a particular section of a long page instead of scrolling down; such anchor links are commonly used in menus and tables of contents, where the menu or table links to different sections of the page. The words contained in anchor text can influence the rank of the linked page on search engines.
Also known as: Link Label or Link Text
Artificial Intelligence (AI)
AI is the science of making computers, computer-controlled robots, or software think intelligently, the way humans do. Rather than following a set of programmed rules (like an algorithm), an AI system can learn from experience in specific contexts and situations. In contrast to a programmed or pre-instructed algorithm, an AI can make and carry out decisions without human intervention.
An authority site is a very high-quality website that is trusted and respected by industry experts, other websites, and search engines. Such websites usually have many incoming links from other hub or expert sites. Authority sites are ranked highly by search engines, and backlinks from authority sites also improve the ranking of the sites they link to. Wikipedia is an example of an authority site.
Also known as: Trusted Site
A backlink is a hyperlink to a page or site from another page or site. A page with a lot of quality backlinks tends to rank higher on search engines. Backlinks from high-quality websites tell search engines that the linked-to website is also trustworthy and of high quality.
Also known as: Inbound Link or Incoming Link
Baidu is the most popular search engine in China. It was founded in 2000.
A bad neighborhood, in the SEO context, is a site with low-quality and/or illegal content. Links to bad neighborhoods adversely affect the rank of a website.
Bing is the search engine by Microsoft. It replaced Microsoft Live Search (previously MSN Search and Windows Live Search) in 2009. Since 2010, Bing has been powering Yahoo Search.
In the context of SEO, a black box is a complex computer program that can be viewed in terms of its inputs, outputs, and the relation between the two, but whose internal workings cannot be observed. Due to its confidential nature, there is no access to its processes. Google’s algorithm is an example of a black box.
Black Hat SEO
Black hat SEO refers to a number of aggressive SEO tactics that do not follow search engine guidelines. These unethical practices focus only on search engines and not on real audiences. Black hat practices include cloaking, keyword stuffing, and using private blog networks. Black hat strategies may deliver short-term SEO gains but, in the long run, they drastically harm a website’s rank on search engines. At worst, such practices can lead to a manual action that removes a website from a search engine’s index.
A blog (or weblog) is a website, or part of a website, that contains informational writing on a specific subject, usually in an informal or personal style; it may otherwise be defined as a writing and reading space on the internet. A blogger is a writer on the internet with specialized knowledge in a certain field of interest. Blogs help improve the SEO of websites.
Bounce rate refers to the percentage of users who visit a website and then leave without viewing any other pages. A high bounce rate can negatively affect a website’s ranking.
In web terminology, breadcrumbs are a horizontal bar above the main content that helps users understand where they are on the site and how to get back to the root areas. Breadcrumbs help make a website user friendly, and they also help in SEO: Google considers breadcrumbs an enhanced SEO feature.
A branded keyword is a search phrase that includes a company’s brand name, exactly or in variation. Examples: Google Search Console, Moz SEO, Samsung mobile, etc.
Also known as Brand Keyword.
In contrast to an exact match, a broad match keyword setting instructs search engines to include all variations and synonyms of a keyword in search results. For example, if the defined keyword is ‘homes in Delhi’ and broad match is set, the search engine might include matches for ‘real estate in Delhi’, ‘house for rent in Delhi’, ‘home in NCR’, ‘New Delhi flats for sale’, or even ‘apartment in Rohini for sale’.
A broken link is a link (external or internal) on a web page that no longer works. Improper URL settings, removing the destination webpage, or changing the destination of a URL without implementing a proper redirect (such as a 301 redirect) are the main causes of broken links. Broken links can hurt SEO drastically.
Cache or web cache is a mechanism that temporarily stores web content in a system to reduce future page loading times.
In SEO, a cached page is a snapshot of a webpage as it appeared when a search engine last visited it. Any update made after the search engine’s last crawl will not be visible in the cached page.
When a single page is accessible through many URLs, the URL preferred for search engine indexing is termed the canonical URL. A canonical URL is useful for avoiding the indexing of duplicate content, and thereby avoiding search engine penalties.
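To illustrate why one URL should be chosen, here is a simplified normalization sketch that collapses URL variants into one form. The rules used (lowercasing the host, dropping query strings and trailing slashes) are assumptions for illustration, not universal; in practice the canonical URL is declared with a link rel="canonical" tag.

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Map URL variants to one canonical form (simplified sketch):
    lowercase the host, drop the query string, drop trailing slashes."""
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, parts.netloc.lower(), path, "", ""))

# Hypothetical variants that all serve the same page content.
variants = [
    "https://Example.com/shoes/",
    "https://example.com/shoes?utm_source=mail",
]
print({canonicalize(u) for u in variants})  # {'https://example.com/shoes'}
```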
ccTLD stands for country-code top-level domain. For instance, a company based in India can have a domain like: www.example.in, where .in is the ccTLD.
Clickbait is a piece of online content with an intentionally overpromising or misleading headline, typically designed to entice people to click or visit a website in order to sell advertising. Clickbait generally captures users with sensational, snappy headlines such as ‘you won’t believe this’ or ‘learn how this 9-year-old girl could earn a million dollars within a month’. Search engines and social media treat clickbait as an unethical practice.
Click Through Rate
See: CTR
Cloaking is a black hat SEO practice, used unethically for higher page ranking, in which the content presented to a user is different from the content presented to search engine crawlers. Since cloaking misleads search engine crawlers, it is regarded as a serious violation, and it can lead to a website being banned from a search engine’s index.
CMS stands for Content Management System, a web-based application that lets people create good-quality websites with little knowledge of coding. WordPress, Drupal, and Joomla are examples of CMSs. WordPress is the most popular CMS and powers more than 34% of websites worldwide.
As far as SEO is concerned, co-citation can be defined as the frequency with which two websites are mentioned together by a third-party website, even if those two websites don’t link to each other. In other words, co-citation occurs when two separate websites are linked from another website. This is one way search engines might establish a relation between the two.
Also see Co-Occurrence.
In SEO, co-occurrence is a loosely used term describing the idea that if certain search terms or phrases frequently occur together across numerous searches, a webpage containing none of the keywords in a query might still rank for it because of the semantic proximity of its content to the search. For example, a webpage about a ‘women’s apparel store’ might rank in Google for ‘boutique shop’, even without the words ‘boutique’ or ‘shop’, mainly because ‘women’s apparel’ is usually combined with ‘boutique shop’ in numerous search queries. This term, along with ‘co-citation’, was used by Rand Fishkin of Moz in November 2012 to predict the future of SEO; he perhaps derived it from linguistics. The concept is still debatable.
Comment spam consists of poorly written comments on blog posts or in forums, often off-topic and mostly self-promotional, posted by spambots basically for the purpose of getting free links to the spammer’s website and unsolicited advertising.
Also see: link spam.
In the context of SEO, content is the medium of information and communication, in a number of forms, directed towards an audience or end user. Blog posts, articles, white papers, images, infographics, podcasts, and videos are some examples of content, or specifically web content. Content is meant to be consumed and distributed by an audience, and it is one of the most important search ranking factors. Content is often credited as the king, or even as the entire kingdom.
In online marketing, a conversion occurs when a user completes a desired action on a website. Examples of conversions include making a purchase, adding an item to a cart, and subscribing to an email newsletter. Conversion is the ultimate goal of an SEO strategy.
Conversion rate is a metric of the success of an online marketing or SEO effort, defined as the percentage of total website users who complete a desired action.
Conversion Rate Optimization (CRO)
Conversion rate optimization (CRO) is the systematic process of improving the conversion rate (i.e., the percentage of users completing a desired action) on a website, both in quantity and quality.
Correlation is a systematic approach to studying the extent of the relationship between two or more variables. Since most search engine ranking factors are hidden within a black box, SEO professionals often rely on correlation research to discover new SEO tactics. Note that correlation does not mean one thing happened because of another; it only measures how the variables vary together.
Crawl budget refers to the total number of URLs a search engine can and wants to crawl on a website on a given day. This number may vary slightly from day to day, but overall it is relatively stable.
Crawl errors are issues encountered by search engines as they try to access certain web pages. These errors prevent search engine bots from reading the content and indexing the pages. A crawl error occurs either because of an issue with the entire site or because of a problem with a particular URL. Hence, Google separately specifies site errors and page errors.
A crawler is an automated program that systematically browses the internet for new web pages and updates. This process is known as web crawling, web-indexing or web-spidering.
Also known as Search Engine Robot or simply Bot, Spider and Web Crawler.
Crawling is the process of gathering information, using a crawler, from the billions of public webpages to update, add, and organize webpages in a search engine’s index.
CSS stands for Cascading Style Sheets, a mechanism for describing how HTML elements (e.g., colors, fonts, spacing) should appear on webpages and adapt when viewed on different devices.
CTR stands for click-through rate. It is a performance metric, expressed as a percentage, that gives the ratio of the number of times a link in an organic search result, paid ad, or email is clicked to the number of impressions, i.e., the number of times the organic search result or paid ad is viewed. For example, if a search result is viewed 200 times and clicked 50 times, the CTR is 25%.
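The calculation in the example above can be sketched in a few lines:

```python
def click_through_rate(clicks, impressions):
    """CTR: the percentage of impressions that resulted in a click."""
    if impressions == 0:
        return 0.0  # avoid division by zero when nothing was shown
    return clicks / impressions * 100

# The figures from the definition: viewed 200 times, clicked 50 times.
print(click_through_rate(50, 200))  # 25.0
```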
The customer journey, or purchase path, is the entire set of touchpoints, interactions, and experiences that a prospect has with a brand over the entire life cycle.
As far as SEO is concerned, data is the numerical representation of information about target audiences that is needed to make informed decisions about SEO strategies and practice.
A dead-end page is a webpage that links to no other webpages, internally or externally. Once a user or bot arrives on such a page, there is no place to move forward. Dead-end pages are not good for SEO.
A deep link is a hyperlink that points to a webpage other than the homepage, or a link that points to content inside a mobile app. ‘Deep’ refers to the depth of the link in the hierarchical structure of webpages or content.
De-indexing is the act of removing a website or webpage from a search engine’s index, either temporarily or permanently. De-indexing may be a voluntary action by the webmaster, or a penalty imposed by a search engine, in the form of a manual action, for violation of its guidelines.
Disavow simply means to ignore. Google’s Disavow tool allows webmasters to tell Google to ignore low-quality and spammy backlinks. The tool is helpful when you are unable to remove spammy links to your website because the linking sites are not under your control. Utmost care is required while using the Disavow tool, as it may also remove useful links.
A domain name is the address of a website, typically ending in an extension like .com or .org, where people can find it on the internet. For example, businesskrafts.com is the domain of this website. Each domain name represents an IP (Internet Protocol) address.
Domain authority refers to the overall strength of a domain, which can help webpages within that domain rank quickly. It reflects the quality links a domain has earned over time and represents the website’s overall quality profile. It is also the name of a metric, developed by the SEO software company Moz, that predicts the ability of a website or domain to rank in search engines on a 0 to 100 logarithmic scale.
Doorway pages are low-quality webpages created to manipulate search engine rankings for specific keywords, only for the purpose of redirecting users who click on them to a different website. Although a doorway page is not exactly the same as cloaking, the effect is similar: users and search engines are served different content.
Also known as gateway pages, jump pages, portal pages, link pages and by few other names as well.
DuckDuckGo is an internet privacy company as well as a search engine that strongly emphasizes user privacy and avoiding filter bubbles (search personalization). It was founded on September 28, 2008.
Duplicate content refers to a significant amount of content that appears in more than one place on the internet, either on the same website or on different websites. Google may penalize duplicate content severely, up to and including de-indexing a website.
Dwell time refers to the amount of time that passes between when a user clicks on a search result and when the user returns to the SERP from the referred website. Dwell time is believed to affect page rank: a short dwell time can signal low-quality content to search engines.
E-A-T stands for Expertise, Authoritativeness, and Trustworthiness. It is one of Google’s top ranking considerations. Pages with a high level of expertise, authoritativeness, and trustworthiness are considered high-quality pages. Webpages with adequate and clear main content (MC), a good reputation, enough auxiliary and supplementary information, easy navigation, and regular updating are considered high-quality pages.
For further information, you may read Google’s Quality Guidelines.
E-commerce refers to buying and selling of products or services on the internet and all involving procedures, such as, transmission of data and money online.
An editorial link is a link, basically in the form of a citation, placed in the content of another website to specify an authoritative and trustworthy source. Such links are naturally earned links that indicate the credibility of a webpage to search engines. Natural links are the best-quality backlinks.
Also known as Natural Link, Organic Link.
Ego bait is a link-building strategy of creating content that features or flatters influential people or websites in the hope that they will link to it or share it. (The related practice of asking a webmaster for a link in exchange for a backlink from your own website is known as a link exchange, a tactic that search engines discourage.)
Engagement metrics are methods of measuring how users interact with webpages and content. Top engagement metrics include:
- Total numbers of users
- Click-through rate
- Conversion rate
- Bounce rate
- Dwell time
- Time on page/site
- New vs. returning visitors
- Frequency and recency
- Screen flow
The entities of a knowledge graph are the people, demographics, locations, devices, organizations, websites, events, groups, and other facts in Google’s Search Analytics that help webmasters analyze traffic flow.
In opposition to broad match, exact match is the keyword match type that allows advertisers (in Google Ads or Bing Ads) to reach prospects searching for exactly the defined keyword or search phrase. Exact match keywords do not include variations or synonyms. For instance, if the search string is ‘homes in Delhi’, the results will include only those pages where the string ‘homes in Delhi’ is present and won’t include matches for ‘homes in New Delhi’, ‘Delhi home’, ‘real estate in NCR’, etc. Exact matches are useful when targeting a competitive keyword in paid search engine marketing because they filter the results, rather than delivering millions of broad matches.
An expert document is an unaffiliated web document (such as an e-book, white paper, or expert blog post) with links from numerous hub pages. Google’s Hilltop algorithm uses expert documents to determine relevance and ranking.
A featured snippet is a highlighted summary of a webpage placed at the top of Google’s SERP for a certain query, usually as the answer to a question such as what, where, when, who, or how. It includes the page title and URL. It is also known as a rich result, direct answer, or position zero. Websites practicing the best SEO have a chance of being featured as a rich result by Google.
A feed is content that is delivered to users via special websites or programs such as news aggregators. Google’s FeedBurner is an example of a feed program.
An FFA (Free For All), also termed a link farm, is a website, or a page within one, with many outgoing links to unrelated websites and little unique content. Such pages are intended only for search engines and have little value to real users; thus, they are ignored or penalized by search engines.
Findability refers to how easily the content on a website can be discovered, both internally (by users) and externally (by search engines). Well-structured websites with defined sitemaps have better findability, which helps in gaining better search rankings.
Flash is an interactive media technology that makes sites more interesting. At the same time, Flash can kill your search rankings because search engines can’t index Flash content directly.
Footer links are the links that appear in the footer (bottom) section of a website. Because they are shared across all pages, footer links can serve an SEO purpose, though search engines typically give them less weight than in-content links.
Frames display more than one document on a single screen of a website, designed to appear simultaneously. Frames are not good for SEO because spiders seldom navigate them correctly. Users generally dislike frames too, because the effect is almost like having many tiny monitors on one screen, none of which shows a full page at a time.
See Doorway Page.
Geotargeting is a technique of delivering different web content to different visitors based on geographic signals such as country, state, city, PIN code, or IP address. Geotargeting can be used for local search, when your business is interested in traffic from a particular location only.
A ghost blogger is a person who writes blog posts for others without public credit, in exchange for payment. His or her name does not appear with the blog post or article; typically, the credit goes to the person who pays for the writing. Ghostwriting is common in the SEO industry as well as in other sectors, such as the celebrity world.
Gizmos (aka gadgets or widgets) are small applications used on web pages to provide specific functions, such as a hit counter or IP address display. Gizmos can make good link bait.
As of 10 April 2020, Google is the most popular and powerful search engine in the world: more than 73% of searches are powered by Google. Founded in September 1998 by Larry Page and Sergey Brin, Google quickly moved beyond human-edited web directories to web-crawling technology and a complex algorithm that analyzes hyperlinking patterns to rank websites. Google is also the world’s biggest internet company, with a market capitalization of around $510 billion.
Google Analytics is a free but feature-rich web analytics service from Google that helps webmasters gather and analyze data about website usage, such as audience behavior, traffic acquisition sources, content performance, and trends over time. It is an essential tool for tracking SEO and digital marketing performance.
Google bombing is an unethical SEO practice intended to make a website rank higher in Google Search for an irrelevant, off-topic, unrelated, surprising, or controversial search. It was accomplished by having a large number of websites link to a certain webpage with specific anchor text to help it rank for that term. The practice is also known as Google washing.
Googlebot is Google’s web crawling program; it works autonomously to find new websites and webpages and add them to Google’s index so they can be shown for relevant search queries. Such bots are also known as robots, crawlers, or spiders.
Google dance is a slang term, coined around 2002, for the volatile ranking changes that occurred when Google updated its search index. The term is now outdated.
Hummingbird is the official name of a major search algorithm change at Google that was officially announced in September 2013. The name was derived from the speed and accuracy of the hummingbird. The main goal of Hummingbird was to provide better search results by understanding the context of the query, rather than returning results for certain keywords. It was the most significant change to Google’s search algorithm since 2001.
Panda is the official name of a significant change to Google’s search algorithm that was officially launched in February 2011 and followed by a series of updates. The objective of this algorithm was to reduce low-quality, thin content in search results and reward high-quality webpages. In 2016, Panda became part of Google’s core search algorithm.
Penguin is the official name of a major change to Google’s search algorithm that was officially announced in April 2012. The purpose of Penguin was to penalize spammy pages, keyword-stuffed content, and low-quality link building practices, as well as to reward high-quality white hat SEO practice. Penguin became part of Google’s core search algorithm in 2016.
Based on distance and location ranking parameters, Google Pigeon improved the relevance and accuracy of results for local search queries. This major algorithmic update was launched in July 2014. Pigeon is not the official name of the algorithm; the name was given by the SEO industry. The Pigeon updates positively affected results for both normal local search and Google Maps positioning.
Google RankBrain is a sophisticated artificial intelligence (AI) based search algorithm, officially introduced in October 2015, that helps Google understand search queries. It is not a hand-programmed algorithm; rather, RankBrain is based on a machine learning system that trains itself and upgrades its behavior through experience.
Google Sandbox is a partially mythical filter that many SEO professionals believe prevents new websites from ranking well for any search query, even with SEO best practices. Google has never confirmed its existence. In practice, a new website typically cannot gain adequate domain authority within a few months, regardless of the quality of its content, and therefore cannot rank well for competitive queries; since Google considers far more than the quality of a few pieces of content and a handful of outbound links, this is quite normal. Overnight SEO success is unrealistic. For example, BusinessKrafts.com took no less than eight months to rank well for its targeted keywords, which, for us, is normal.
Google Search Console
Google Search Console is a completely free service from Google for webmasters with several helpful features, including submitting sitemaps for indexing, fixing index issues, inspecting URL indexing, acting on errors and warnings, observing performance for search queries, improving search ranking, checking the mobile usability of a page, validating AMP, monitoring outbound links, and more. It is an essential tool for SEO best practice. The earlier version of Google Search Console was known as Google Webmaster Tools. If you are not using Search Console yet, sign up today.
Google Trends is a website at https://trends.google.com that provides free tools to webmasters so that they can observe search trends for a topic or term over time, in different countries or language-based regions, find comparative statistics, and choose the best keywords for their websites. Google describes it as ‘explore what the world is searching’. It is highly helpful in SEO best practice and well recommended for all SEO practitioners.
Google Webmaster Guidelines
Google’s webmaster guidelines are meant for developing and designing websites that are friendly to Google. These guidelines help in developing quality content that can rank well in Google Search and in building qualified links that can improve ranking. They provide clear instructions for improving websites and web content in line with the preferences of Google’s complex algorithm. Webmasters who follow the guidelines have a better chance of ranking well for relevant search queries. The basic message of Google’s guidelines is to make valuable, useful, and engaging websites, webpages, and content for human users, and not to use tactics aimed only at Google Search.
Google Webmaster Central Blog
This is Google’s official blog for webmasters and SEO professionals. It is highly helpful, as all updates and news regarding Google Search are available here. If you are a serious SEO professional, you should subscribe to this blog.
Gray Hat SEO
Gray hat SEO sits between white hat and black hat practice. While Google and other search engines like Bing discourage gray hat SEO, many SEO experts, including Neil Patel, advocate for it, as there is less fear of a search engine penalty and more opportunity for better ranking. In our experience, however, gray hat SEO should equally be avoided in order to achieve long-term SEO success.
Guest blogging (guest posting) refers to the practice of writing for others’ websites in exchange for a backlink to one’s own website or blog. It is widely used by bloggers as an SEO practice as well as to create brand awareness.
Heading tags are the HTML elements of a webpage that define the title of the document or page (H1), main headings or sections (H2), and sub-headings (H3 to H6), as well as separating content into sections. In the context of SEO, H1 is the most important heading tag and H6 the least. Heading tags should be used naturally, on the basis of structural arrangement, and should incorporate target keywords where relevant.
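As a sketch, the heading structure of a page can be extracted with Python’s standard html.parser module; the sample page below is a made-up fragment:

```python
from html.parser import HTMLParser

class HeadingExtractor(HTMLParser):
    """Collect (tag, text) pairs for H1-H6 elements in an HTML page."""
    def __init__(self):
        super().__init__()
        self.headings = []
        self._current = None

    def handle_starttag(self, tag, attrs):
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.headings.append((self._current, data.strip()))

# A hypothetical page fragment for illustration.
page = "<h1>SEO Glossary</h1><h2>301 Redirect</h2><h2>404 Not Found</h2>"
parser = HeadingExtractor()
parser.feed(page)
print(parser.headings)
# [('h1', 'SEO Glossary'), ('h2', '301 Redirect'), ('h2', '404 Not Found')]
```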
A head term is a popular keyword with high search volume that is usually difficult to rank for.
Also known as a head keyword or short-tail keyword.
Hidden text is text or a link on a webpage that cannot be seen or read by a user, normally intended to manipulate search rankings by loading webpages with keywords. This is considered a spammy practice that goes against Google’s Webmaster Guidelines and can result in a heavy penalty.
Also known as Content Cloaking.
The Hilltop algorithm is a Google search algorithm that decides how relevant a document is for a certain search query, determined on the basis of references from expert webpages to an authority webpage. The algorithm was created by Krishna Bharat and George A. Mihăilă and acquired by Google in 2003.
HITS stands for Hyperlink-Induced Topic Search. It is a link analysis algorithm that rates webpages on the basis of both authorities (inbound links) and hubs (outbound links). The HITS algorithm was developed by Jon Kleinberg.
A home page is the default front page of a website, the one that loads first when an internet user enters the domain name of a website in a browser. Typically, home pages are designed so that a user can get a bird’s-eye view of the website as well as navigate easily to its important pages and sections.
The hreflang tag attribute is used when a website has similar content in different languages, or in variants of a single language. This attribute tells Google which language is used for a certain page so that the search engine can serve the content to users of that language. If you have a multilingual site, you can use hreflang tags in your XML sitemaps to signal language variations to Google, and Google will show the appropriate version according to the language of the search query. The following is a sample hreflang tag:
<link rel="alternate" href="http://example.com" hreflang="en-us" />
.htaccess is a hidden server configuration file used by web servers running Apache. It can be used to rewrite and redirect URLs.
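As an illustrative sketch (the URLs are hypothetical), a 301 redirect rule in .htaccess might look like this:

```apache
# Permanently redirect an old page to its new location
Redirect 301 /old-page.html https://www.example.com/new-page/
```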
HTML stands for Hypertext Markup Language, the standard markup language used for creating webpages and web applications. It defines the meaning and structure of web content.
An HTML sitemap is a list of pages on a website, meant to help website users navigate it. Since such sitemaps remain static unless manually updated, they are otherwise known as static sitemaps.
Also see XML sitemap.
An HTML tag is a set of characters that constitutes a markup command for a webpage, such as a heading, paragraph, or link element. HTML tags are the primary building blocks of a webpage, and certain tag elements can be used to improve the effectiveness of SEO for webpages and websites.
HTTP (Hypertext Transfer Protocol) is the underlying protocol of the World Wide Web. It defines how data is transferred from a web server to a browser and what actions web servers and browsers should take.
HTTPS (Hypertext Transfer Protocol Secure) uses a Secure Sockets Layer (SSL) to encrypt data transferred between a website and a web browser. HTTPS is a minor Google ranking factor.
HTTP Status Code
An HTTP status code is a server’s response to a browser’s request. When a URL is entered into a browser’s address bar, the browser requests the files at that URL from the server where they are hosted. The server responds with a three-digit status code, such as 201 or 404. Different status codes have different meanings: for example, a 301 code indicates that the URL has been permanently moved to a different URL, and a 404 code indicates that the URL was not found. Knowing the meaning of status codes can help you diagnose site errors and improve your search ranking. You can find a list of status codes and their meanings on Wikipedia.
A hub page is a central resource (e.g., a page or article) dedicated to a specific topic or keyword. It is continually updated and linked to, and it also links out to topically relevant webpages. Hub pages are generally ranked highly by search engines, and links from hub pages help improve the SEO of a website or blog. Pages with backlinks from hub pages are considered authority pages by most search engines.
A hyperlink is a link from one page to another, or from one place on a page to another place on the same page. Hyperlinks can be inbound or outbound; those that start and end on the same site are called internal hyperlinks. Hyperlinks are important SEO factors.
In the context of SEO, an impression is counted each time a link to your page is displayed on a SERP, whether or not it is clicked.
See Back Link.
In the context of SEO, index is the database search engines use to store and retrieve information regarding web-pages, posts and media gathered during the crawling process.
Indexing is the action by which search engines include webpages in their database through web crawling.
Indexability refers to how easily a search engine bot can understand and add a webpage to its index.
An indexed page is a webpage that has been discovered by a crawler, added to a search engine’s index, and made eligible to appear in search results for relevant queries, along with any enhancements.
Information architecture refers to how a website is organized and where various content and navigational elements are located on webpages.
Information retrieval is the process of searching for information or files (e.g., text, images, video) in a large database and presenting the most relevant results to an end user.
Internal links are hyperlinks within a website that connect its pages and sections with each other. Though such links are commonly used for main navigation, they are also useful for establishing an information hierarchy within a website and helping users find the most valuable pages. Proper internal linking can boost SEO.
See also Website Navigation.
An Internet Protocol (IP) address is a unique string of numbers, separated by dots, assigned to each device connected to a computer network so that devices can identify and communicate with each other. IP addresses are the core of any network on the internet. An IP address serves two basic purposes: identifying the network the address belongs to and the exact location of the device on that network.
A keyword, in the context of SEO, is a word or phrase that meaningfully tells search engines about the significant content of a webpage. For the purposes of search engine optimization and marketing, keywords are highly significant because they can help a webpage rank for certain search queries. The keywords used on webpages help search engines determine which pages are the most relevant to show in organic results, as well as in paid search advertisements, for a given search query. Keywords usually represent topics, ideas, or questions, and are typically meant to help search engines understand the content in its proper context.
Also known as Keyphrase.
Keyword cannibalization is self-competition among webpages within a website, which occurs when multiple pages target the same keywords, so that several pages from the website compete to rank for the same search query on a SERP. This phenomenon is not good for SEO and can adversely affect authority, CTR, and conversion rate.
Keyword density is measured either as a percentage or as a ratio, based on how often a keyword or phrase appears within the content of a webpage relative to the total number of words on that page. For example, a keyword used 5 times on a 500-word page has a density of 1%. It is sometimes believed that higher keyword density leads to better search engine rankings, but there is no evidence for this. Rather, a high density of keywords in content may be considered keyword stuffing or keyword spam and can result in a penalty.
Keyword research is one of the most important SEO tasks. It helps SEO professionals discover the alternative terms and phrases that searchers enter into search engines for a particular topic, as well as the search volume and competition level of those terms. This task can be performed with a number of free and paid tools available on the internet; Google Trends and Google Keyword Planner are two prominent keyword research tools.
See keyword stuffing.
Keyword stuffing refers to the spam practice of increasing keyword density, adding irrelevant keywords, or repeating keywords unnaturally on a webpage in the hope of increasing search rank. This tactic is against Google’s Webmaster Guidelines and can result in a manual action.
The Knowledge Graph is a knowledge base used by Google to collect and present facts and information about entities (people, places, and things, and their connections). This information is placed in an info-box in the top right panel of the SERP for relevant search queries; on mobile devices, it appears at the top.
The Knowledge Panel is the info-box that appears at the top, or at the top right, of the first Google SERP for relevant queries. The panel is based on the Knowledge Graph and contains facts and information about people, places, and things, along with links to related websites.
KPI stands for key performance indicator, the measurement method businesses use to evaluate success of marketing and business activities. In SEO, KPI is used to measure the success of strategies and techniques.
A landing page is a specially designed standalone page on a website with a certain ‘call to action’ feature. SEO and digital marketing professionals aim for the landing page to be the first page opened when a user clicks on a search result or promotional link.
A lead is a person who requires your product or service and/or is interested in it. A lead is also termed a potential customer or prospect. A lead is confirmed when she/he shares contact details and other information relevant to a business deal.
In the context of SEO and web technology, a link (short for hyperlink) is an HTML object that makes a connection between different websites, between different pages within a website, or between different sections within a page. In SEO terminology, there are two primary categories of links: internal links and external links.
An internal link enables users to navigate within a website, while external links make connections between different websites or apps. Links play a critical role in search engines’ evaluation and ranking of websites.
Link bait is an SEO technique, or art, of creating content so interesting that other web content producers are encouraged to link to it. The objective of link bait is to improve the search rank of a website or webpage by attracting backlinks. Link bait can be misused in many ways, for example by creating intentionally provocative content.
Link building is the process of generating quality inbound links to a website from other websites, with the primary goal of improving the search engine ranking of the website or its webpages. It is often considered the second most important part of SEO, after quality content.
Link condom refers to any of a number of methods of avoiding passing link love to another page, avoiding the possible detrimental results of endorsing a bad site by way of an outgoing link, or discouraging link spam in user-generated content.
A (web) link directory is simply an online directory or catalogue of websites, usually organized by category and maintained either by a human or by an automated program. Inclusion in a directory may be free or paid, depending on the policy of the directory owner. Link directories have been widely misused for search engine ranking, and search engines have adapted to prevent such misuse. This means that simply being listed in a directory will not gain your website additional search engine ranking.
A linkerati is a person who is influential on the internet, by personality or position, and who is a likely target of a link bait campaign. The linkerati include social media influencers, social taggers, forum posters, resource maintainers, bloggers, and other content creators who are able to create incoming links and/or link-generating traffic.
Link exchange is a reciprocal linking scheme, often facilitated by a site devoted to directory pages. Link exchanges usually allow links to sites of low or no quality and add no value themselves; quality directories, by contrast, are usually human-edited for quality assurance.
Link equity is a search engine ranking factor that measures the power of an inbound link in terms of relevance, authority, and trust. An inbound link from an authority site is considered more valuable and can further help search engine ranking. In effect, search engines treat links from other websites as votes: links from authoritative or trusted websites count as worthy votes, while links from pointless link farms count as insignificant votes. Link equity plays a major role in SEO.
A link farm is a group of websites that link to each other, usually using automated programs, in the hope of artificially increasing search rankings. It is a spam tactic.
Also known as Link Network, Blog Network etc.
Link juice is a slang term, often used by SEO professionals, for the power of a link that passes value, equity, and authority from one page to another within a website, or from one website to another. It is based on the idea that links from authority sites pass more juice to a site than links from lowly valued link farms. The purely professional equivalent of this slang is link equity.
Link love is an outgoing link that passes trust, unencumbered by any kind of link condom. Link love passes link juice.
A link swap (link exchange, reciprocal linking) occurs when two sites link to each other. Search engines usually don’t see these as high-value links, because of their reciprocal nature.
A link profile is the complete set of links that point to a particular website. The quality of a website’s link profile can vary widely, depending on how the links were acquired and the anchor text used.
Link popularity is a measure of the value of a site based upon the number and quality of sites that link to it.
Link spam refers to the activity of posting unwanted links through user-generated content, such as blog comments. It is also known as comment spam.
Link text is the user-visible text of a link. Search engines use link text as an indicator of the relevance of both the referring page and the content on the landing page; ideally, the link text, the referring page, and the landing page will share some keywords in common. Link text is also known as anchor text or jump text.
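For example, in the following hypothetical link, ‘SEO glossary’ is the link text that search engines evaluate:

```html
<a href="https://www.example.com/seo-glossary/">SEO glossary</a>
```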
Link velocity refers to how quickly (or slowly) a website accumulates links. A sudden increase in link velocity could potentially be a sign of spamming, or could be due to viral marketing or doing something newsworthy (either intentionally or unintentionally).
Local SEO refers to the process of optimizing local businesses for Google Maps, Bing Places, etc. For local SEO, your business should have a physical location that can be placed on the map. You do not necessarily need a website for this purpose.
A log file is a file that records users’ information, such as IP addresses, browser type, Internet Service Provider (ISP), date/time stamps, referring/exit pages, and number of clicks.
Log File Analysis
Log file analysis is the process of exploring the data contained in a log file to identify trends, administer the site, track users’ movement around the site, gather demographic information, and understand how search bots are crawling the website.
Long-tail keywords are longer, highly specific phrases of more than two words that are less competitive in SEO and SEM. Such keywords can help a site rank better in highly competitive niches.
LSI stands for Latent Semantic Indexing, a technology invented in 1988 at Bell Labs by a team of engineers to retrieve computer information on the basis of hidden semantic relationships between terms. Some SEO professionals believe that Google and other search engines use LSI to rank webpages; however, there is no evidence for this, and the technology is outdated.
Machine learning refers to the ability of a system to learn from experience without human interference. It is a subset, or application, of artificial intelligence (AI) that enables a system to learn from data and improve its algorithm so that it can perform a complex task accurately without explicit programming.
A manual action is a Google penalty applied by human reviewers (Google’s manual webspam team) for a considerably high degree of violation of the Webmaster Guidelines. When Google’s algorithms cannot resolve an issue of severe search engine spam, Google’s team manually reviews the website to confirm whether it has failed to comply with the guidelines. In most manual-action cases, the penalized website is demoted; in rare cases of multiple spam violations, the entire website is removed from Google’s search index. Manual actions can be taken against an entire website or just certain webpages.
Hacked sites, unnatural links, pure spam, thin content, cloaking, sneaky redirection, spammy structured markup, keyword stuffing, hidden text, and user-generated spam are the most common reasons for manual actions.
Meta description is an HTML tag that provides a summary of a webpage, typically within 160 characters, and can appear as the snippet on the SERP when the search phrase, fully or partially, matches the description. Though it plays no role in search engine ranking, a relevant and catchy description can help increase click-through rate (CTR). In the absence of a suitable meta description, search engines automatically create a snippet from the text on the page that matches the search query.
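A meta description is placed inside the page’s head element; the content value below is a hypothetical example:

```html
<meta name="description" content="A glossary of essential SEO terms, phrases, and jargon with their meanings.">
```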
Meta keywords are HTML tags that contain the most important keywords of a page. Though the practice of adding meta keywords to a web document persists, most search engines, including Google, ignore them in order to prevent keyword stuffing.
Meta tags are tags placed in the HTML source code of a webpage to describe its contents to search engines; they are not visible on the webpage itself. The three common types are title tags, meta description tags, and keyword tags. While the title tag is fully relevant to search engines, the meta description is partially relevant and keyword tags have no relevance.
In the context of SEO, metrics refer to the various measures, available on the internet for free or for a fee, of the success of SEO efforts. Google’s PageRank, Alexa’s traffic rank, and Moz’s domain authority are three widely used SEO metrics. Google Analytics provides a number of metrics, for free, to measure the overall performance of a website, such as bounce rate, average session duration, and conversion rate.
In the context of SEO, MFA stands for Made For Advertisements: websites that are created and designed mostly for advertising, with little or thin useful content. Such websites usually give little value to users; however, with sensational content, they manage to get traffic, mostly from social media and paid marketing. Search engines do not give them any preference.
A mirror site is an identical site at a different URL. Creating a mirror site is not good for SEO, because search engines penalize duplicate content, potentially up to the level of a manual action.
Monetizing refers to earning from a blog or video channel by placing advertisements and affiliate links within content. Blogs with adequate engaging content and SEO best practices can monetize better.
Natural links are backlinks that your website gains naturally from other websites or blogs, because the webmasters or bloggers of those sites think the link is useful for their readers or is required to cite a proper source. Typically, search engines love natural links.
Negative SEO refers to the use of unethical methods to harm the ranking of a competitor’s website. Such methods include hacking the website, creating unnatural or suspicious backlinks, removing quality backlinks, creating links with illegal product names, content spam, etc.
A niche is a specialized area of interest consisting of a small group of highly-passionate people.
The noarchive tag is a meta robots tag that tells search engines not to store a cached copy of a specific page. This tag prevents search engines from showing the cached link for the page on the SERP.
The nofollow tag is a meta robots tag that tells search engines not to follow a specific outbound link, either because the website doesn’t want to pass authority to the linked webpage or because it’s a paid link.
The noindex tag is a meta robots tag that tells search engines not to include a specific webpage in their index. This is done when the page is private or when its content could negatively affect the ranking of the website.
The nosnippet tag is a meta robots tag that tells search engines not to show a description with your listing. In this case, though Google may rank your page and show it on the SERP, in place of a description it will return a note such as ‘No information is available for this page.’
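The meta robots directives described in the entries above are written in a page’s head section like this (an illustrative sketch; a real page would normally use only the directives it needs):

```html
<meta name="robots" content="noarchive">   <!-- no cached copy -->
<meta name="robots" content="noindex">     <!-- keep the page out of the index -->
<meta name="robots" content="nosnippet">   <!-- no description in the listing -->
```

A nofollow directive for a single outbound link is applied with the rel attribute instead:

```html
<a href="https://example.com/" rel="nofollow">A link that passes no authority</a>
```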
Off-page SEO refers to all the SEO practices that take place outside of a website. Besides the prominent link building techniques, social media marketing and social bookmarking are commonly used off-page SEO tactics. Off-page SEO helps search engines determine the quality of web content: more backlinks and references, and more social sharing and bookmarking, indicate that the page has better content.
On-page SEO refers to all the SEO practices that take place within a website. On-page SEO includes publishing quality content, optimizing the website for all device types, optimizing HTML tags, improving internal link placement and website navigation, improving information architecture, eliminating code errors, fixing bugs, etc.
Organic Search Result
Organic search results, in contrast to paid search results, are the natural, unpaid results that appear on the SERP. To ensure that your webpages appear as organic search results, you have to follow SEO best practices.
An orphan page is a webpage that is not linked to from any other page on its website. Orphan pages have a low chance of being indexed and ranked well.
An outbound link is a link that directs visitors from one webpage to another, opening either in the same window or in a separate window, as configured by the link builder.
PageRank is a Google Search algorithm that helps determine a page’s position on the SERP for a given search query on the basis of the quantity and quality of links to the webpage.
Page speed refers to how much time it takes for a webpage to load completely.
Page view refers to the number of times a webpage is loaded in a browser by a visitor’s action of viewing it. A reload counts as another page view by the same visitor.
A paid link is a backlink to a website that has been purchased. Selling and buying links is a serious violation of Google’s Webmaster Guidelines, so Google can penalize it severely, including with a manual action.
Paid Search Ads
Paid search ads are the pay-per-click (PPC) advertisements that appear with the organic results on SERP.
PBN stands for Private Blog Network: a group of websites that link to each other, all of which ultimately link to a target site, typically known as the money site. A PBN uses expired domains that already have good backlink profiles, so they can serve as pre-built authority sites. These sites link to each other to build further authority, and all of them finally link to the main business site, or money site, passing their authority to it. This is purely a black hat SEO practice, and Google can take severe action against a PBN.
Also known as: Private Link Network (PLN)
A Google penalty is the demotion of the rank of a webpage or website, either through manual action by the Google Webspam Team or automatically by an algorithm update. The ultimate objective of penalties is to control webspam and black hat SEO practices.
A persona is a fictional character that represents an ideal website visitor or customer, based on actual research data such as demographics, behavior, needs, motivations, and goals. Creating personas helps marketers understand users’ perspectives and market segments so that they can work on a specific strategy in accordance with market demand.
In the context of SEO, personalization refers to the automated functions of search engines that create a set of search results tailored to a specific user on the basis of cached personal records, such as search history, web browsing history, interests, online behavior, and location. Personalization is also used for the placement of PPC ads.
PHP stands for Hypertext Preprocessor, a widely used open-source general-purpose server-side scripting language mostly used in website and web application development. It can be embedded in HTML. Initially, it was developed for web development only and stood for Personal Home Page.
Piracy is, in general, infringement of copyright. In the context of SEO, if Google finds any piracy of web content, it takes action immediately.
Pogo-sticking is when a visitor returns to the SERP immediately after landing on the target page. This happens when they don’t find what they are searching for on the page. Pogo-sticking is extremely bad for SEO.
Position, in SEO, is the rank of a webpage for a certain search query.
PPC (Pay Per Click)
Pay-per-click (PPC) marketing is a popular form of advertising in which the advertiser is charged a certain amount each time a user clicks on the ad. PPC ads are placed on the SERP for relevant queries, or on a search engine’s partner network on the basis of cached data about the user’s interests. The advertiser pays nothing if no one clicks on the ad. How much the advertiser pays per click is determined by the bid amount, competition, ad quality, and the quality of the landing page. PPC strategy and SEO strategy correlate with each other.
In the context of SEO, proximity is a search metric that measures how close the words in a search query are to the keywords in a page’s content for the page to be eligible to appear on the SERP for that query. For example, for the search query ‘interior designer in Delhi’, the search engine may return a webpage containing the phrase ‘your best choice for interior design in Delhi and NCR’.
In SEO, QDF stands for ‘query deserves freshness’: a Google search algorithm by which the search engine determines whether a search query calls for newer or up-to-date content and ranks webpages accordingly. For example, if your query is ‘essential SEO terminology in 2019’, Google might rank this page #1.
Summing up the Google Webmaster Guidelines, quality content can be defined as unique, original, valuable, and engaging content that is meant for real users and not for search engines. By contrast, automatically generated, scraped, thin, and duplicate content, as well as content produced with black hat SEO techniques, does not qualify as quality content. Quality content is a vital requirement for lasting SEO advantages.
A quality link is simply a backlink that originates from an authoritative, relevant, or trusted website. Quality links are highly crucial for better search ranking; in order to get them, you have to create high-quality content.
A query is the word, words, or phrase that a user enters into the search box of a search engine in order to find relevant results.
Also known as: A search
See: URL Parameter
Rank, in SEO, refers to the position of a webpage on the SERP for a certain search query. Ranking depends on a number of factors and is determined by a complex series of search engine algorithms.
Ranking factors are the criteria applied by search engines to determine the rank or position of a webpage for a certain search query. Google has more than 200 unique ranking factors, most of them undisclosed and a few declared, that determine the ranking of a given webpage. The most common ranking factors are website security and accessibility, page speed, quality of content, mobile usability, content optimization, navigation architecture, quality of backlinks, user experience, social signals, and real business information.
Also known as: Ranking Signal
Reciprocal links are hyperlinks that two or more websites provide to each other on the basis of mutual agreement or a business relationship. The main objective of sharing reciprocal links is to help users find relevant information easily. However, reciprocal linking can be misused to generate irrelevant links as a black hat SEO practice.
In web technology and SEO, a redirect is a technique that sends users and search engines to a different webpage instead of the one they requested, generally because the requested page is unavailable for some reason, such as being permanently or temporarily moved to a new URL or completely deleted. The 301 redirect, for permanent moves, and the 302 redirect, for temporary moves, are the two most widely used redirect techniques.
A referrer is a website that sends visitors to another website via a link; in other words, it is a source that refers to your webpage for certain useful information. Most web analytics programs, including Google Analytics, provide webmasters with valuable referrer information that can be used to further improve digital marketing and SEO.
Regional Long Tail (RLT)
A regional long tail (RLT) keyword is a multi-word keyword that contains a city or region name, especially useful for service businesses serving a local population only. Such keywords have a high tendency to rank well for local search queries.
Reinclusion is the process of requesting a search engine to re-index a website or webpage after de-indexing.
Relevance is the way search engines determine how closely the content of a webpage matches the context of a search query.
Reputation management, or online reputation management (ORM), is the practice of shaping the perception of a brand or person by influencing online information about the entity on social media, websites, and search results, maximizing the visibility of positive mentions and minimizing negative ones.
A responsive website is a website designed to automatically render well on different devices and screen sizes, whether a desktop computer or a smartphone. Designing a responsive website is an SEO best practice.
Structured data markup can be added to the HTML of a webpage to provide contextual information to search engines. This information can be displayed on the SERP as an enhanced listing, which is known as a rich snippet.
robots.txt is a text file that webmasters create at the root of a website to instruct search engine robots not to crawl certain webpages. It is also known as the Robots Exclusion Protocol (or Standard).
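A minimal robots.txt (with hypothetical paths) that blocks one directory for all crawlers might look like this:

```txt
User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
```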
ROI stands for return on investment, a percentage metric that measures the performance of marketing activities, including SEO. It is calculated by dividing the revenue earned via the activity by the cost of the total investment and then multiplying by 100. For example, $500 of revenue on a $200 investment gives an ROI of 250%.
A schema is an outline or blueprint that represents the organizational structure of a database. In an XML document, a schema defines the elements and structure of the document. Valid schema markup on a webpage helps search engines show rich results or featured snippets to search users.
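For example, schema.org vocabulary is commonly embedded in a page’s HTML as JSON-LD; the values below are hypothetical:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "SEO Glossary 2020: Essential Terminology",
  "datePublished": "2020-01-01"
}
</script>
```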
A search engine is a program-based system that allows internet users to search for and retrieve information on the World Wide Web. Search engines systematically crawl and store webpages and files in a database known as the search index. Bots (aka web crawlers or web spiders) are used to build and update the search index, and a series of complex algorithms analyzes and ranks webpages for search queries.
At present, the most popular search engine is Google. Bing, Yahoo, Yandex and Baidu are some other popular search engines.
Search History
Search engines track every search users conduct (text and voice), every webpage visited, and every ad clicked. Search engines may use this data to personalize results for signed-in users.
Also known as: Web Browsing History.
Search engine marketing (SEM) is a popular form of digital marketing that involves gaining visibility of, and traffic to, a website through a search engine like Google, typically via paid advertising. Search engine marketing is alternatively known as PPC (pay-per-click) marketing; it includes placing paid advertisements on SERPs, web networks, web apps, and video channels.
Search engine optimization (SEO) refers to the complete set of technical and strategic practices that improve the visibility of a website and its content in search engines’ organic (unpaid, or natural) results. Effective SEO practices help increase the quantity and quality of traffic flowing to a website. Since SEO involves a number of aspects and processes, understanding each aspect, process, and related term is a primary requirement for SEO best practice. This glossary is, in fact, an entry point.
SERP stands for search engine results page: the page of results that a search engine displays to users in response to a search query. Result format and inclusion vary according to the query. A typical SERP contains:
- Organic search results with page title, link and brief description.
- Organic featured snippet with page title, link and long description (at the top).
- Organic results with rich snippets such as images, ratings, videos, etc.
- Knowledge panels with additional information (at top right panel).
- Maps of local businesses with images and useful information (at the top or top right panel).
- Paid search result with or without extension.
- Videos with titles.
- Shopping ads.
- Related questions.
- Related search.
- Other result as relevant.
Sitelinks are links to sub-pages of a website, shown below a top-ranked organic search result or a paid search result so that the user can navigate to any sub-page directly from the SERP. For paid search, you can set the sitelinks while creating the ad; for organic results, sitelinks depend entirely on the search engine’s algorithm, largely on the basis of site authority. A maximum of six sitelinks can be shown with a search result.
A sitemap is a list of pages on a website, often organized by category. There are two types of sitemaps:
- HTML sitemaps are meant for site users, to help them navigate the website.
- XML sitemaps are meant for crawlers, to help them crawl and index all the pages of the website.
Sitewide links are links placed in a website's header, footer, or sidebar so that they appear on every page.
Skyscraping is an SEO technique of writing highly engaging content that other webmasters will want to link to.
Also known as: Skyscraper SEO
Social signals refer to engagement activities on social platforms, such as shares, comments, likes, and pins, that can indirectly benefit the search engine ranking of web content. Google has publicly stated that there is no direct relation between social engagement and page ranking, but there is significant evidence supporting the social-signal theory.
Spambots are autonomous computer programs designed to send bulk spam via email and via comments on forums and posts. They are a type of web crawler that gathers email addresses and post URLs around the internet in order to spam your inbox and posts.
Also see: Comment Spam
Split testing is performed by creating two or more variations of the same content to measure click-through rate (CTR) and conversions and to determine which version works better. Though split testing is useful for search engine marketing (SEM), it can be bad for SEO, since duplicate content is involved.
Also known as: A/B Testing
SSL stands for Secure Sockets Layer. An SSL certificate uses SSL/TLS technology to encrypt data sent to the server and thus ensures the security, authentication, and identity of a website. A website with an SSL certificate uses the HTTPS protocol instead of plain HTTP.
See: HTTP Status Code
Stop words are the words used most frequently in a given language, for example: a, an, in, of, for, the. In the past, search engines ignored such words; with the gradual development of search engines, however, these words are now often treated as meaningful.
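As a minimal sketch (the stop-word list here is illustrative, not any search engine's actual list), filtering stop words from a query might look like this:

```python
# Illustrative stop-word list; real systems use far larger,
# language-specific lists (when they drop stop words at all).
STOP_WORDS = {"a", "an", "in", "of", "for", "the", "to", "and"}

def remove_stop_words(query):
    """Return the query's terms with stop words filtered out."""
    return [word for word in query.lower().split() if word not in STOP_WORDS]

print(remove_stop_words("The History of the Internet"))  # ['history', 'internet']
```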
A subdomain is a separate section that exists within a main domain. For example – subdomain.example.com is a subdomain of example.com.
Taxonomy is the practice and science of classification. In web design and SEO, it refers to organizing and categorizing a website's content to maximize findability and enhance user experience. Taxonomy is an important SEO factor.
Time on Page
Time on page refers to how much time a user, on average, spends on a webpage. It is also a metric of how well your content is performing; search engines may use it to evaluate content quality.
A title tag is an HTML tag that acts as the title of a webpage; it appears in search results as a clickable heading as well as in the title bar (or tab) of a web browser when the page is open. Since a title tag is primarily meant for search engines and search users, it should be written carefully and strategically. Most search engines show a maximum of about 65 characters of a title.
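As a rough, hypothetical sketch of that limit (real SERP truncation is based on pixel width, not a fixed character count), you could preview how an overlong title might be cut:

```python
MAX_TITLE_CHARS = 65  # approximate; actual SERP truncation is pixel-based

def preview_title(title, limit=MAX_TITLE_CHARS):
    """Return the title roughly as a SERP might display it."""
    if len(title) <= limit:
        return title
    # Cut at the last whole word that fits, then add an ellipsis.
    return title[:limit].rsplit(" ", 1)[0] + " ..."

print(preview_title("SEO Glossary 2020: Essential Terminology"))
```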
Top-Level Domain (TLD)
In the hierarchical domain name system, the highest-level domain extension of a domain name is known as the top-level domain. For example, in businesskrafts.com, the top-level domain is .com. Common top-level domain extensions include .com, .org, .net, and .info; there are also country-specific and industry-specific options.
Also known as: Generic Top-Level Domain (gTLD)
Traffic collectively refers to the visitors to a website.
For search engines, trust is the overall reputation and authority a domain builds over time through quality, natural backlinks.
TrustRank is a link analysis technique used by Google to separate useful pages from web spam.
User Generated Content (UGC)
User generated content (UGC) is content created by online users on social media, forums, wikis, or even blogs, in the form of posts, answers, queries, feedback, etc. For SEO purposes, quality UGC plays an important role for a website or blog.
Universal search refers to search results on a SERP that include images, videos, blogs, shopping results, and maps together, as search engines pull data from different specialty databases.
Also known as: Blended Search.
An unnatural link is a link that Google identifies as suspicious, deceptive or manipulative. An unnatural link can result in manual action.
URL stands for uniform resource locator, a specific string of characters that leads to a resource on the web. It is the text-based web address (e.g., www.businesskrafts.com) entered into a browser to access a webpage.
URL parameters (also known as query strings) are attribute-value pairs added to URLs in order to:
- track the traffic source to a landing page,
- dynamically change the content of a webpage, and
- indicate how search engines should handle parts of your webpages.
A typical URL parameter consists of a key and a value separated by an equals sign (=); multiple parameters are joined by ampersands (&), and the parameter string comes after a question mark (?). For example: www.example.com/apparel?category=jeans&color=blue. This parameterized URL corresponds to www.example.com/apparel/jeans/blue.
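A URL like the example above can be taken apart with Python's standard library (the URL itself is hypothetical):

```python
from urllib.parse import urlparse, parse_qs

url = "https://www.example.com/apparel?category=jeans&color=blue"
parsed = urlparse(url)

# parse_qs maps each key to a list of values, since a key may repeat.
params = parse_qs(parsed.query)

print(parsed.path)  # /apparel
print(params)       # {'category': ['jeans'], 'color': ['blue']}
```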
Usability refers to the ease of using a website. Site design, browser compatibility, accessibility enhancements, menu placement, and other factors play a role in improving a website's usability. More usable websites have better chances of ranking well in search engines.
A user agent is software that acts on behalf of the user. In the context of SEO, it usually refers to a web-crawling program.
User Experience (UX)
User experience refers to the overall feeling users are left with after interacting with a brand, its products, and its online presence. In the context of SEO, it is the overall feeling a web user has about your website and its content.
A vertical search is a specialized search performed by people interested in a particular area. For example, people searching on Amazon are interested in shopping, and those on Google Scholar are interested in scholarly papers and articles.
Vertical Search Engine
Vertical search engines are specialized search engines meant for people interested in a particular area, such as Amazon for shopping, YouTube for video, and Google Scholar for scholarly papers.
A virtual assistant is a software-based digital assistant that can understand natural language and voice commands and perform tasks as the user requests. Examples of virtual assistants include Amazon's Alexa, Apple's Siri, Microsoft's Cortana, and Google Assistant.
(SEO) visibility is the prominence and the positions a website or webpage occupies within the organic search results for relevant queries. All webmasters want their websites to gain better visibility on search engines, and quality SEO practices improve it.
Voice search is a voice-activated technology that allows users to speak into a device (usually a smartphone) to ask questions or conduct an online search and get corresponding results on the SERP. Nowadays, voice searches are becoming more popular.
A webmaster is a person who is responsible for managing and maintaining a website.
A webpage is a page within a website with a unique URL. It can also be defined as a document on the World Wide Web that can be viewed in a web browser and displayed on the screen of a computer or mobile device.
Web scraping is a technique used by search engines to gather and copy data from websites so that webpages can be stored in a searchable index. A bot or web crawler is a piece of web-scraping software. Scraping a webpage involves fetching (downloading) it and extracting data from it.
A website is a set of related web resources, such as webpages and multimedia, hosted together on a server, identified by a common domain name, and viewable in a web browser.
Website navigation may be defined as the roadmap within a website's structure that allows visitors to seamlessly explore and visit its useful pages, sections, and information. Placing menus, sub-menus, page links, and footer links in the most user-friendly structure helps improve navigation. The primary objective of good website navigation is to improve user experience, and both well-structured navigation and good user experience affect SEO positively.
Also known as: Internal Links and Site Architecture.
Webspam is content that exists solely to deceive or manipulate search engine algorithms with thin and/or irrelevant material that is not useful for real (human) users. Webspamming may bring temporary ranking improvements, but you may face severe penalties, including de-indexing of your website, for such unethical practice. Spamming should be avoided with great care.
Also known as: Black Hat SEO, Spam, Spamdexing, Search Spam, etc.
White Hat SEO
White hat SEO refers to the use of optimization strategies and techniques that focus on human users instead of search engines and fully comply with search engines' guidelines, terms, and conditions. It is the universally accepted, ethical practice of SEO. Google rewards white hat practice generously, though results may take time.
Word count refers to the total number of words that appear within a piece of content. Too little (thin) content can signal low quality to search engines, and longer content can perform better, but the rule of thumb is relevance in every case.
WordPress is the most popular free, open-source blogging and website content management system (CMS), based on PHP and the MySQL database. WordPress powers around 32% of websites worldwide. The platform can be used to build and maintain every aspect of a website with little coding knowledge, which is why most bloggers use it for their blogs.
XML, or Extensible Markup Language, is a markup language search engines use to understand a website's data, structure, and hierarchy.
An XML (Extensible Markup Language) sitemap is an XML file that lists the URLs of a website. It ensures that search engines can find and crawl them all and can understand the structure of the website. It allows webmasters to include additional information about each URL, such as when it was last updated, how often it changes, and how important it is in relation to other URLs on the site. XML sitemaps are machine-readable only.
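As a minimal sketch (the URLs and dates are hypothetical), a basic XML sitemap can be generated with Python's standard library:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """Build a minimal XML sitemap from (loc, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    ("https://www.example.com/", "2020-01-15"),
    ("https://www.example.com/about", "2020-01-10"),
])
print(xml)
```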
Born in April 1994, Yahoo was the most popular search engine in the '90s. Until June 2000, when a then-unknown search engine called Google began powering Yahoo's organic search results, Yahoo search was mostly human-powered. That deal continued until 2004. Since 2010, Yahoo's organic search results have been powered by Microsoft's search engine, Bing.
Yandex is the most popular search engine in Russia. It was founded on September 23, 1997, by Arkady Volozh and Ilya Segalovich.
YMYL stands for 'Your Money or Your Life'. It is a quality guideline in Google's search evaluation for pages that offer advice on finance, happiness, health, parenting, or nutrition. Such advice can have a major impact on an individual's life, and bad information can have irreparable consequences. That is why Google is very strict about YMYL pages and judges them harshly, insisting on high-quality, reliable information to protect users.
Also See: E-A-T
Yoast SEO is a popular SEO plugin for WordPress blogs and websites.
YouTube, owned by Google, is the biggest and most popular video-sharing service in the world. It is also the second most used search engine after Google, with more than 1.9 billion monthly active users. YouTube video blogging is used extensively for SEO purposes.