300+ SEO Tips and Tricks
We bring you 300+ SEO tips & tricks from various online sources and SEO experts who understand the search engine ecosystem. You are about to read more advice than a book offers – for FREE!

[Updated 2022] 300+ SEO Tips and Tricks

  1. The words or phrases you use as internal link anchor text are crucially significant. Link to internal pages using the words you want them to rank for – that anchor text can carry as much weight as the target page's own content.

  2. The Meta description does not directly improve how well your page ranks in Google. However, it can influence how many people click on your result, thus increasing traffic to your website.

  3. Use search operators to obtain more information about your website. E.g. search site:yourdomain.com and Google will tell you roughly how many pages it has indexed for your domain, listed in rough order of importance. This lets you see which pages Google prefers on your site.

  4. NEVER use the Remove URLs tool in GSC on your old website when moving all or part of your site. It will not make the move happen faster; it only affects what is visible in Search, and it could hurt your site in the short term.

  5. If your SEO strategy is producing two blog posts of 500 words every week, I guarantee you are wasting your money and resources.

  6. No one can tell you the specific keyword a searcher typed into Google before landing on your website from organic search, regardless of what some tools claim to be able to do.

  7. Tools like Majestic’s “Clique Hunter” can help you find easy link opportunities. Enter a few competitor websites and you will receive a list of the links your competitors have and you don’t. This helps close the gap in the places where your competitors are mentioned and you’re not.

  8. If your web page URLs work both with and without a trailing slash (/), search engines will likely assume you have two identical copies of the page online. Choose one form, then set up permanent 301 redirects to it. Without the 301 redirect, you end up with ‘two’ pages competing with each other in the search rankings.
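As a rough illustration, on an Apache server a trailing-slash redirect can be enforced in .htaccess. This is only a sketch under the assumption that your non-file URLs should all end in a slash; test it on a staging copy first.

```apacheconf
# Sketch: force a trailing slash on all non-file URLs
# with a single permanent (301) redirect.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*)$ /$1/ [R=301,L]
```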

  9. Curious where you rank? Even in incognito mode, you will not see a fair representation of your rankings. GSC gives you average rankings, but only for the queries it chooses. SEMrush is affordable and can give you plenty of keyword ranking data for your website, the specific terms you want to track, and your competitors.

  10. The Chrome plugin “Ayima Redirect Path” lets you see live which redirects (JS, 301, 302, etc.) are occurring.
  1. The Chrome plugin “Keywords Everywhere” shows search volumes and suggestions alongside each search you perform. Keep it on all the time and you will build a good ‘feel’ for how users search and for search competitiveness.

  2. Remember, page speed results vary every time you run a test, depending on many factors. If you use page speed tools, run several tests on multiple pages and average the results.

  3. Google can index PDF documents just fine and renders them as HTML. That means links inside a PDF count as normal web links – plus, PDFs are easy to share.

  4. Google has consistently denied that posts on social media platforms like Facebook, or their engagement metrics (“likes”, shares), have a direct impact on your rankings, and there is no good evidence that they do. If someone keeps insisting otherwise, you could be dealing with a non-savvy SEO agency.

  5. The first words on your page are extremely important. If you want to place your brand name in the page title, it should come after the descriptive part of the title.

  6. Posting longer content does not guarantee a better ranking. Some studies suggest it does, but when you check the source data, it’s simply that the content is better (and longer content tends to have had more effort invested in it). The Internet is not short of content quantity – it’s short of content quality. Quality surpasses quantity.

  7. Content doesn’t just mean text. As the saying goes, a picture speaks a thousand words; likewise for video. Google learns the type of content that best fits each query, and you can gather clues about the kind of content to make by looking at what currently ranks. For example, search “How to cook rice” and you’ll see videos ranking at the top.

  8. Don’t believe anyone who says Googlebot processes JavaScript as it crawls. JavaScript rendering happens later – sometimes weeks later – and can cause problems for client-side rendered sites.

  9. MYTH: Writing new content equates to “fresh” content. It does not universally apply. Some searches need freshness, while others don’t. Do not add or write new content for the sake of being fresh.

  10. Even if your website does not collect information or sell anything, it should be served over HTTPS, not HTTP. Why? Because it protects all users’ privacy, and Google treats it as a positive ranking signal.
  1. Google ignores the Meta keywords tag and has done so for years. Do not waste time and energy listing keywords in your CMS.

  2. A free extension like “User-Agent Switcher” lets you identify yourself as Googlebot to the sites you visit. Interestingly, some websites give you a different experience when they think you are Google. It can also help uncover the cause if you ever get a warning like “This site may be hacked.”

  3. Are competitors copying your website content? You can file a DMCA notice directly with Google to help remove your competitor’s copied content from the search results. Click here to do so: https://www.google.com/webmasters/tools/dmca-notice?pli=1

  4. An older domain tends to be better: the longer a domain has been live and earning links, the more reputation it has built up, which affects how well its pages can rank.

  5. An agency may say your website is slow when, in fact, it isn’t. Test site speed (and more) yourself using audit tools like GTmetrix, Pingdom, and Lighthouse.

  6. If you have paginated content (page 1, 2, etc.), you can use the “prev” and “next” markup to help search engines understand what is happening. Note that Google announced in 2019 that it no longer uses this markup for indexing, though other search engines may still. Find out more here: https://webmasters.googleblog.com/2011/09/pagination-with-relnext-and-relprev.html
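For reference, the markup looks like this in the head of a hypothetical page 2 of a product listing (URLs are placeholders):

```html
<!-- Sketch: pagination hints on page 2 of /shoes/ -->
<link rel="prev" href="https://example.com/shoes/?page=1">
<link rel="next" href="https://example.com/shoes/?page=3">
```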

  7. You can use the “Visual Ping” service to monitor any website for changes. This is handy for keeping up with Google’s Webmaster Guidelines page: you will receive an email whenever Google updates it.

  8. Don’t use the Disavow Tool in GSC unless you have to: it will probably do more harm than good. The tool is for disavowing links when you have a manual penalty, or when you know paid or blackhat links exist and wish to remove them proactively. In about 99% of cases, ‘spammy’ links should simply be left alone – if Google thinks they are spammy, it will ignore them. Focus on building positive things instead.

  9. Adding pages to robots.txt doesn’t prevent them from being indexed. Use the noindex tag if you want to stop a page from being indexed. If you need more details about noindex, click here: https://support.google.com/webmasters/answer/93710?hl=en
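Concretely, noindex can be applied in the page itself; a sketch (remember not to also block the page in robots.txt, or the crawler will never see the tag):

```html
<!-- Sketch: keep this page out of the index via a robots meta tag
     in the page's <head>. For non-HTML files (e.g. PDFs), the
     equivalent is an "X-Robots-Tag: noindex" HTTP response header. -->
<meta name="robots" content="noindex">
```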

  10. You can outrank others for seasonal terms like “best cny hamper deal 2021” by keeping the content on the same URL each year (e.g. /best-cny-hamper-deal) and just changing the year in the content. However, if you wish to retain the old content, move it to a new URL (e.g. /best-cny-hamper-deal-2021).
  1. Every unique URL counts as a separate page to a search engine. For example, if both the non-www and www versions are accessible, Google sees them as separate pages. Where the same content is available through various URLs, use 301 redirects to avoid duplicate content.

  2. Google claims to treat sub-folders and subdomains the same when it comes to search ranking. However, some SEO experts have examples where subfolders have out-performed subdomains. So before you create a subdomain for your website, ask yourself whether you really need one and what the benefits would be.

  3. Go back to the basics. Create a free Google Search Console (GSC) account if you don’t have one yet. It can give you much diagnostic information from Google on your website, notify you of problems, hacks, penalties, and show your average website rankings, including keywords that your website is ranking for.

  4. Do not do your redirects at the “database level” – i.e. in your CMS backend. If you handle redirects “higher up,” such as in the .htaccess file, your website will be faster. A faster website benefits users and ranks better at the same time.

  5. Use the canonical tag if you want to test various page designs on live URLs; it avoids confusing search engines with duplicate content while the test is live.

  6. Using a VPN is generally a good idea and helpful for SEO. It lets you switch countries and see what the search results look like elsewhere.

  7. COMMON MISTAKE: using robots.txt to block access to your website’s theme files/CSS. Google needs access to these to understand your content and render your website accurately.

  8. Google’s “mobile-first” indexing means Google looks at your website as if using a smartphone. If your “mobile version” has less content than the desktop version, that extra content is unlikely to be seen by Google.

  9. Generally, interstitials and pop-ups annoy users. Since January 2017, Google has specifically said that sites that obscure their content this way are likely not to rank as well.

  10. Add a self-referential canonical tag to all of your webpages. This makes it clear to Google which version to credit if someone scrapes your content and reposts it elsewhere.
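A sketch of what this looks like in the head of a hypothetical product page (the URL is a placeholder for the page's own canonical address):

```html
<!-- Sketch: a self-referential canonical tag -->
<link rel="canonical" href="https://example.com/products/blue-widget/">
```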
  1. Heard about the rumors going around that 50% of the searches by 2020 will be through voice search? Do not believe it.

  2. If you want to rank well on Google, you need other websites linking to your webpages. If you treat SEO as a one-off optimization, you are unlikely to succeed. SEO is more than technical optimization – technical optimization only gives you a base to build on, not a total solution.

  3. Modifying page content using JavaScript with Google Tag Manager should be the absolute final resort as it will take a long time for Google to index.

  4. Don’t overthink linking out to other sites that are useful and relevant to your users. It is absolutely fine – that is how the web works!

  5. You can estimate how much your organic traffic is worth by checking what it would cost to buy that traffic via paid search. The CPC (cost-per-click) of a keyword is set by demand.

  6. It can be useful to examine your raw server log files. You can see how Googlebot interacts with your website and whether it gets stuck or receives error responses.
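As a minimal sketch of this kind of log analysis, the snippet below filters hypothetical Apache "combined"-format log lines for a Googlebot user-agent and tallies the response status codes (the sample lines and field layout are assumptions; adapt the regex to your server's log format):

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache "combined" log format.
SAMPLE_LOG = [
    '66.249.66.1 - - [10/Oct/2021:13:55:36 +0000] "GET /products/ HTTP/1.1" 200 5316 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Oct/2021:13:55:40 +0000] "GET /old-page/ HTTP/1.1" 404 209 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Oct/2021:13:56:01 +0000] "GET / HTTP/1.1" 200 1043 "-" "Mozilla/5.0"',
]

# Pull out the request, status code, and user-agent from each line.
LOG_RE = re.compile(r'"(?P<method>\w+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$')

def googlebot_status_counts(lines):
    """Count response status codes for requests whose user-agent mentions Googlebot."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group("agent"):
            counts[m.group("status")] += 1
    return counts

print(googlebot_status_counts(SAMPLE_LOG))
```

A spike in 404 or 5xx counts here is exactly the kind of signal this tip is about: Googlebot wasting crawl budget on broken URLs.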

  7. Do not obsess over the details of algorithm updates. Updates mainly represent Google overcoming technical challenges on the way to the same end goal: surfacing the most helpful results.

  8. What is cannibalization? It is when more than one URL targets the same key phrase or intent. It is one of the core reasons technically optimized websites with adequate content still rank poorly.

  9. If you discontinue a popular product or model on your e-commerce website, do not delete the page. Update it to explain that the product is being removed and link to the closest alternative products. Why? Because it is more convenient for users, and it prevents organic traffic loss at the same time.

  10. A canonical tag is a hint, not an instruction. Avoid using it across unrelated pages, because Google will simply ignore it.
  1. When migrating your site, also migrate the URLs that are not within your site’s internal link structure. This is often overlooked and can cause a drop in rankings after migration.

  2. Use an img tag within the HTML if you want your images to be indexed.
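For illustration, a plain img tag with descriptive alt text (the paths here are placeholders); images loaded purely as CSS backgrounds are generally not picked up for image indexing:

```html
<!-- Sketch: an indexable image with descriptive alt text -->
<img src="/images/blue-widget.jpg" alt="Blue widget with steel handle"
     width="800" height="600">
```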

  3. Want a better chance of your videos appearing in search results? Create video sitemaps: they give search engines more detail about the videos hosted on your pages and help them rank.
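A minimal sketch of a video sitemap entry (all URLs and titles are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://example.com/how-to-cook-rice/</loc>
    <video:video>
      <video:thumbnail_loc>https://example.com/thumbs/rice.jpg</video:thumbnail_loc>
      <video:title>How to cook rice</video:title>
      <video:description>A short guide to cooking fluffy rice.</video:description>
      <video:content_loc>https://example.com/videos/rice.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```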

  4. Register a Google My Business (GMB) account for free. It will help you begin ranking in the local map pack.

  5. “Google Trends” is useful for observing weekly, monthly, and yearly search trends. Is a particular trend going up or down? How much do they differ?

  6. Google does not use UX engagement metrics (such as dwell time, CTR, etc.) directly in its algorithm. They have said so consistently for years; Gary Illyes of Google has called such theories “made up crap.” The same likely goes for direct “social engagement metrics.”

  7. In Google Analytics, look at the last 12 months to see if you have content pages that have no traffic. It could be the best time to consider redoing, removing, or consolidating those pages.

  8. Make use of schema; this is crucial.

  9. It is against Google’s ‘link schemes’ guidelines to send someone a free product in exchange for a review with a followed link. There is a penalty for doing so.

  10. Domain age plays a part in ranking. It is almost impossible to rank a brand-new site for a competitive term.
  1. If you are using the Screaming Frog tool and hitting 403 errors while crawling your website, it may be due to a Web Application Firewall or similar service, like Cloudflare, which blocks crawlers by default.

  2. Preferably, have only one h1 on a webpage, and make it descriptive of the page content. Your h1 and page title should also be broadly similar.

  3. If you want content kept out of the index, do not block it in robots.txt: the crawler won’t reach the page to discover your noindex tag.

  4. If you want content to rank well for months or years, link to it from ‘high up’ in your site hierarchy. It is generally unwise to post evergreen content in a chronological blog, where it sinks deeper into your website over time. Relevant, evergreen content must stay continuously prominent.

  5. Don’t waste time on a particular ‘keyword density’ – it is not a thing, and text analysis goes far beyond it (and beyond tf-idf). Chasing such scores means you are writing for robots rather than humans, and missing the point: the algorithm is always working to determine what’s most helpful for humans.

  6. Your website’s core functionality, including all crucial pages, should be usable without JavaScript. Turn JavaScript off, then navigate and click around your website. If parts are missing or broken, that can cause huge problems for Googlebot.

  7. The factors that affect ranking in the local map pack are different from those in ‘normal’ Google search.

  8. Stuck with creating quality content ideas? Use a broad subject and put it on AnswerThePublic to know the types of questions people are searching for in Google.

  9. As a last resort, when the development queue is blocked and you are sinking into technical debt, you can modify canonical tags or page titles via JavaScript with Google Tag Manager (GTM). It can take a while to be indexed, but it works.

  10. Google’s PageSpeed Insights tool will show “Field data” if your website has sufficient traffic. Here it is: https://developers.google.com/speed/pagespeed/insights/
  1. 1 out of 5 searches on Google is unique and has never occurred before, and the large majority of searches are terms with fewer than 10 searches each month. If you choose key phrases purely by volume from “keyword research,” you are missing a large share of traffic – and making things harder for yourself, since everyone else is doing the same.

  2. The key phrases on Google My Business reviews can help your company to rank well for those words.

  3. If you serve many countries from a single website, sub-folders are generally better than separate TLDs or sub-domains. That is: yourdomain.com/sg/ and yourdomain.com/my/ are better than sg.yourdomain.com and my.yourdomain.com.

  4. Use Google Alerts for basic brand monitoring, for free. It is a chance to do ‘link reclamation’ – finding sites that mention your website or brand but do not link to you. Start a pleasant conversation, offer more detail, value, and insight, and then ask for the link.

  5. The tool Screaming Frog has a free but limited version that can help you crawl your webpages quickly to uncover issues like duplicate page titles and 404s.

  6. Make use of YouTube as part of your SEO plan. SEO is not only about Google search.

  7. Google previously had a bug that de-indexed millions of web pages at random – at times, even the home pages of big companies. The bug has been fixed, so don’t panic if it affected your websites: those URLs should be re-indexed automatically. If you are in a rush, speed up re-indexing by submitting the de-indexed URLs through your Google Search Console account.

  8. Google Trends can also chart YouTube searches – a commonly overlooked feature.

  9. SEO GOLDEN RULE: there is no ‘SEO change’ worth making to your website that causes a poor user experience. None, period.

  10. If you want to see a quick view of your site’s content structure and discover its problems, create a visual crawl map. You can use a tool called Sitebulb. It has a trial version that you can check out.
  1. The Google site: operator can reveal major indexing issues – for example, a 5,000-page website with only 20 pages indexed. However, it does not give an accurate count of the number of pages in the index.

  2. If you are performing a site migration, change as little as possible at once. For example, if you can move from HTTP to HTTPS on its own first, do that. It makes any issues far easier to diagnose and troubleshoot.

  3. To avoid a typical error when targeting multiple languages/countries, use ‘x-default’ hreflang on your region/language selector page.
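A sketch of hreflang annotations for a hypothetical site serving Singapore and Malaysia, with x-default pointing at the country selector (domains and paths are placeholders; the same set must appear on every alternate page):

```html
<link rel="alternate" hreflang="en-sg" href="https://example.com/sg/">
<link rel="alternate" hreflang="en-my" href="https://example.com/my/">
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```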

  4. Bounce rate is not a ranking factor. In some cases a high bounce rate can even be a good sign for your website; it must be read in the context of searcher intent.

  5. Before JavaScript renders, your webpage is cached based on the First Meaningful Paint. Pages with loading elements or splash screens that last too long may be cached in that state, and Googlebot will not understand what is on the page.

  6. You don’t need to worry about TF-IDF if you focus on your audience, their intent, and knowing the subject when writing your content.

  7. There are two types of competitors: “business competitors,” who you are already aware of, and “search competitors,” who outrank you for the keywords and terms you want. The latter are the ones you compete with in SEO.

  8. URLs are case sensitive: search engines treat yourwebsite.com/PageOne and yourwebsite.com/pageone as separate pages. Stick to lowercase for your main, indexable, navigable URLs to make ranking and sharing easier.

  9. A penalty and an algorithm update are not the same thing. If you lose a large amount of traffic and rankings after an algorithm update, that is not a penalty, and you may not be able to “fix” it.

  10. SEO considerations should start the moment you are building a new website. What schema will you utilize? How can you manage the migration? Do you know which content is chronological and which ones are evergreen? Do you know how to avoid cannibalization? Don’t be too confident in thinking that you can simply implement SEO once your website is built.
  1. A “duplicate content penalty” does not exist. Unless your website is outright spam, there is no harm if somebody copies your page or you have some duplicated content. The content may be filtered out of the search results, but your website will not be penalized.

  2. If you need someone to write content for you, do not hire generalist copywriters. Google and users are looking for genuine expertise and insight, not a rework of articles that already exist – competition on the internet is fierce.

  3. You can be penalized for stuffing keywords into your Google My Business name.

  4. If you see “not part of property” errors or missing data in your Google Search Console reports, be aware that Google treats the www, non-www, http, and https versions of your website as separate properties. Add all of them to Search Console and redirect to the one you prefer.

  5. It is crucial to correctly categorize your business or brand in Google My Business to appear for searches that are generic map-based.

  6. If you use schema, do not leave the snippets fragmented. Bind them together with @id: for example, this blog post belongs to this WebSite, written by this Author from this Organization, which owns the WebSite!
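A sketch of linking entities together with @id in JSON-LD (all URLs, names, and identifiers are placeholders):

```json
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "@id": "https://example.com/#org",
      "name": "Example Co"
    },
    {
      "@type": "WebSite",
      "@id": "https://example.com/#website",
      "publisher": { "@id": "https://example.com/#org" }
    },
    {
      "@type": "BlogPosting",
      "@id": "https://example.com/blog/post/#article",
      "isPartOf": { "@id": "https://example.com/#website" },
      "author": { "@type": "Person", "name": "Jane Doe" }
    }
  ]
}
```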

  7. Excellent advice from SEO and content marketing specialist Stacey MacNaught: content comes before format. You do not necessarily need an infographic and a video – think of the content idea first, then choose how to frame it.

  8. You cannot optimize for RankBrain. RankBrain is one component of Google search that uses AI to examine and interpret the intent of queries Google has not seen before.

  9. It is worth looking at the previous 12 months of analytics data to find pages that get no traffic. It shows the weak spots in your content – what needs improving, revising, or at times simply removing.

  10. Keyword cannibalization means more than one of your webpages ranks for the same keywords, which can radically affect your site’s rankings. Here’s a free resource for checking your site for cannibalization: https://strategiq.co/how-to-identify-keyword-cannibalisation/
  1. You can obtain the historical URLs of a website by substituting ‘example.sg’ with your domain in this URL: https://web.archive.org/cdx/search/cdx?url=example.sg&matchType=domain&fl=original&collapse=urlkey&limit=500000

  2. If you want to measure organic traffic after completing a site migration, remember to count only unbranded traffic: it is highly unlikely you would lose branded traffic in a migration.

  3. The Google ‘diversity’ update limits the number of organic results a single website can have, normally to 2, not counting ‘special’ results like Google News and rich snippets. Consider what other angles you can use to dominate the SERP real estate.

  4. It is a great idea to have an all-secure website (HTTPS) through SSL/TLS as it is one of the ranking factors. Secure websites actually rank better.

  5. To dominate Google, get your content published in multiple places, not just on your own websites.

  6. Perform broken link reclamation by checking your server logs, or using tools like Majestic, to find websites linking to malformed URLs. Set up 301 redirects to reclaim those links and the traffic they bring.

  7. ‘Build your own site’ platforms like Wix and SquareSpace are not ideal for SEO. Even Shopify and other bigger platforms have historically restricted things like editing the robots.txt file. However, they can be great for starters.

  8. Just choose between non-www and www, and set up a 301 redirect from one to the other. Google treats URLs with and without www as duplicate pages.

  9. It is almost impossible to rank your website in a competitive sector without a strategy for getting people to link to it.

  10. Google has announced both Search and Assistant support for FAQ and How-to structured data.
  1. You may read that you can noindex pages in robots.txt with Noindex: /page-dont-want-indexed/. Google never officially supported this, and announced in 2019 that it would stop working entirely. Use a noindex meta tag or X-Robots-Tag header instead.

  2. It is inexpensive to have a 360° photo taken for your Google My Business profile. It can help attract more visits to your physical store.

  3. If you have a ‘voucher code’ box as the final step of your checkout, it can harm your conversion rate: you send people off on a wild chase to find a code.

  4. SEO rarely has ‘quick wins,’ but focusing on content that already ranks in positions 3 to 10 is the fastest way to gain additional traffic.

  5. At times, instinctively following the advice of Google is not in your best interest, at least in the short term.

  6. Name, Address, Phone (NAP) citations are crucial for your local SEO and ranking in the Google map-pack.

  7. Paying for Google Ads (or pay-per-clicks) will not boost your organic ranking on Google.

  8. If you audit a website with the Lighthouse Chrome extension, do it in an incognito tab: other extensions can affect the results.

  9. Links to your website from your posts on social media platforms like Facebook and LinkedIn do not directly help with boosting your ranking in Google.

  10. Do not underestimate Google Images ranking. Make sure that your image assets are well-optimized and marked up.
  1. The hreflang tag can be used cross-domain. If you have a .com.sg site or other-language websites, it helps Google recognize the relationship and rank them better in the SERPs.

  2. Your homepage has the same ranking capability as any other page. It is not ‘special’ and has no extra power to rank well; it just tends to pick up the majority of the links, which is why it ranks easily – nothing more.

  3. Intent outshines content length. Remember that content length is not a ranking factor. There are some correlations, largely because longer content usually has more effort behind it, earns more links, and offers more opportunities to match long-tail queries. But do not make content longer just for the sake of it.

  4. You can update page content, like the page title, with JavaScript – though it may take several weeks for the JavaScript version to be processed and indexed.

  5. Do not set an arbitrary deadline for a website migration. Check your Analytics data first, then plan the launch for your quietest period. It helps minimize possible traffic losses.

  6. Nofollow links can still contribute to rankings, particularly in local packs. Google may use nofollow as a ranking hint, so do not avoid nofollow links.

  7. If you want to know the amount of traffic you are getting from Google Images, go to Google Search Console, choose Performance, click on Search Results, and change the ‘Search Type‘ to Images.

  8. Providing Google with rich snippet results does not have an impact on your standard 1-10 rankings.

  9. When auditing a site, crawl it both with and without JavaScript, and with various user-agents, and compare the differences.

  10. If you are a beginner in SEO, spend more time learning how search engines operate and what they are trying to achieve than on particular SEO tactics. Knowing the foundations gives you a solid framework for making better decisions.
  1. It is almost always beneficial to search engines and users to have a “view all” page on product sites. Clicking through many paginated pages is slower overall than loading one single page, giving your users a worse site experience.

  2. Did a competitor duplicate your content, and Google is ranking their copy instead of your original? You can simply notify Google using the DMCA form.

  3. Google has confirmed that internal rel=”nofollow” links will continue to be treated as nofollow, even after its update declaring that rel=”nofollow” may be interpreted as a hint. That means they can still be useful for managing things like internal faceted navigation.

  4. Google now supports additional attributes to identify types of links: for example, “sponsored” for paid links and “ugc” for user-generated content links.
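A sketch of qualifying outbound links with these attributes (the URLs are placeholders):

```html
<a href="https://example.com/partner-offer" rel="sponsored">Partner offer</a>
<a href="https://example.com/forum-post" rel="ugc">Forum comment link</a>
<a href="https://example.com/untrusted" rel="nofollow">Unvouched link</a>
```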

  5. To give yourself a head start, get a domain with a good-quality backlink history.

  6. Search engine results pages can include featured snippets. Use a tool like SEMrush to keep tabs on which SERP feature types are showing in your niche.

  7. For a quick keyword cannibalization check, search Google for: site:yourdomain.com intitle:”key phrase to rank for”. It returns your pages that contain that phrase in the title. If you see multiple pages, you may be confusing the search engine about which page you want to rank for that key phrase. Consider redirecting or canonicalizing to the correct page.

  8. If you know you have backlinks that break Google’s Webmaster Guidelines, or you have received a manual penalty, you can submit a disavow file listing the domains and links you would like Google to ignore.
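For reference, a disavow file is a plain .txt file with one entry per line; a sketch with placeholder domains:

```text
# Lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Disavow links from a single page:
https://blog.example/paid-links-page.html
```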

  9. If a company offers Gold, Silver, and Bronze SEO packages, it’s probably rubbish. Does a Bronze package mean your website will rank more slowly than with a Gold one?

  10. If a company guarantees its recommended changes will put you in the first spot in the SERPs, find someone else. Success is not best measured by monitoring individual keywords, and no one can predict future algorithm changes or what your competitors will do once you start climbing. Like many things in life and business, if it seems too good to be true, it probably is.
  1. Google disregards anything after a hash (#) in URLs. Do not use # in your URLs for loading new content – only for skipping to anchor points.

  2. Ensure redirects go to the canonical version of a webpage. A common mistake is redirects going to a page, which then has a canonical tag telling Google that another page is the canonical version.

  3. A blog is a terrible spot for hosting ‘evergreen’ content like how-to guides. If your news/blog section is chronological, the content sinks down your website’s hierarchy: it becomes harder for users to find, more clicks away, and is eventually treated by search engines as less valuable.

  4. Do not give up on your outreach even if you get no response when trying to earn links from newspapers. Most newspapers have several journalists covering related topics, so try finding another contact.

  5. Server-Side Rendering (SSR) is crucial; do not leave it to Google to try to process critical JavaScript.

  6. Do not use the noindex tag if you want to run A/B tests, instead use the rel=canonical tag.

  7. Pages blocked by robots.txt cannot be crawled by Googlebot. However, if other pages link to a disallowed page, Google may still decide it is worth indexing even though it cannot crawl it.
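
For reference, a rule like the following blocks crawling but not necessarily indexing (the paths are hypothetical):

```
User-agent: *
Disallow: /internal-search/
Disallow: /admin/
```

To keep a page out of the index reliably, use a noindex directive instead and let it be crawled.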

  8. Remember to noindex your staging and dev environments so they aren't exposed and don't end up hurting your live website's rankings later. Remember to make the site indexable again when it goes live!

  9. When it comes to categorization in Google My Business and Local SEO, less is MORE. You will get better results with fewer but more specific business categorizations rather than covering everything.

  10. UX factors such as the absence of intrusive popups, mobile-friendliness, and site speed all affect Google rankings.
  1. A research tool called "AnswerThePublic" can show you the common questions people are searching Google for around a certain topic. It is an awesome way to start generating topics for your quality-content plan.

  2. Google can sometimes ignore your meta description and use any on-page content it finds, believing it is more relevant for the searcher. This is usually a good thing, as dynamic meta descriptions can give better CTR.

  3. Google now allows you to manage your site snippets displayed on search. You can find all the details over here: https://webmasters.googleblog.com/2019/09/more-controls-on-search.html

  4. To get a fuller view of your backlink profile, use a selection of tools when performing a backlink analysis, such as Majestic, SEMrush, Ahrefs, Moz, and even the free Google Search Console.

  5. Generally, you should apply the noindex tag to your internal search pages. It's poor UX for a visitor to go from a search results page to another search page.
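
A minimal way to apply this, assuming you can edit the search template's head, is a robots meta tag:

```html
<!-- In the <head> of internal search result pages -->
<meta name="robots" content="noindex, follow">
```

The `follow` value keeps crawlers following the links on the page even though the page itself stays out of the index.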

  6. You do not need an HTML sitemap if your website has well-thought-out internal links and navigation that search engines can easily crawl.

  7. Use the site: operator in Google to know all the webpages that have been indexed.

  8. Google does not dominate everywhere. To target Russian-speaking markets, you must rank in Yandex; likewise, to target the Chinese market, you must rank in Baidu. Their rankings can differ from Google's.

  9. An SEO audit is likely to have almost no measurable value for a small business unless: (1) you have the means to implement the recommended changes, and (2) you will invest in continued SEO effort. Generally, a technical audit only has an immediate effect if the website is already well established (i.e. it has an adequate backlink profile). For small businesses with almost no links, making technical adjustments will have little impact. It's like tuning an engine that has no fuel.

  10. The links listed in a disavow file are only disavowed while listed. You can update and resubmit the file if you think you made a mistake.
  1. It is generally accepted that sub-folders perform better than sub-domains because they share in the equity of the main domain. If you are told you "need" to place part of your website on a subdomain, you can still present it to users as a subfolder by using a reverse proxy.

  2. One way to help users find the information that they need quickly is to use anchor jump points in your content. However, you can also get more links within the SERPs to help with the CTR.

  3. If you are using a desktop tool like Screaming Frog on a large website, you don't need to crawl every page to find the main technical issues. Problems usually exist at the template level, so auditing a sample can give you insight into what you need to fix quickly.

  4. Don't let a 'zero' monthly search volume put you off producing content. Over 90% of the key phrases in Ahrefs' database have <20 searches per month; this is the long tail. The important bit is this, though: you only see volume for that exact phrase. If you write the content well, there are a few hundred variations on most phrases that suddenly make them a lot more appealing, none of which will initially look appealing through volume data.

  5. Producing content to acquire links and coverage comes with a test. When you have the data, story, and headline, ask yourself: "So what?" Why would others care? If you have a good answer, move on to the next phase.

  6. In the 'Coverage' section of Google Search Console, make sure you review the 'crawl anomalies' report. While GSC is quick to surface 404 and 5xx errors, crawl anomalies are usually overlooked, yet they can expose serious problems, such as repeated timeouts that can prevent ranking and indexing.

  7. If you’re trying to measure organic performance, especially at this time of year, you need to look at Year on Year (YoY) figures. If you’re running an e-commerce site and you’ve seen an organic uplift in clicks/impressions over the last 30-90 days compared to the previous 30-90 days, this should not come as a surprise – it’s coming to Black Friday, Cyber Monday and Christmas shopping season! If you really need to do this type of analysis, use Google Trends data to normalise your traffic to see if there is uplift after adjustment.

  8. Search intent sometimes shifts at certain times of the year. For example, 'Halloween' changes from an informational to a commercial term as Halloween approaches. When this happens, Google will switch rankings drastically to match the intent. If you notice fluctuations in your rankings around seasonal events, nothing may be wrong with your site; your content's intent may simply not be the best fit at that time.

  9. When making SEO decisions, consider whether a ranking factor is query-dependent or query-independent. For instance, PageRank is query-independent, so it applies alike across all searches. Content freshness, by contrast, is query-dependent, because only some searches deserve freshness. This means internal linking is crucial for every website, but content freshness may not be.

  10. There are still many companies offering 'SEO ad platforms,' which are primarily advertorials or adverts that 'pass SEO benefit.' Using them puts your website at substantial risk of being penalized.
  1. If you don't know where to start link building, take your top-ranking competitors and put their sites into the Majestic link tool, then click 'Backlinks.' You will see a list of their best links and can work out the strategies they used to get them.

  2. Collation and curation can help you get more mileage and links from your website content. If you have content answering questions about products or product categories, you could combine all the details into a single guide or page to use in outreach efforts; consolidated resources are easier to acquire links for.

  3. If you want pages to rank well for higher volume terms, you should link them ‘high up’ within the hierarchy of your website, like the homepage or main menu.

  4. When setting up keyword tracking, it is useful to use multiple tags on keywords. This allows you to view how well you rank for a specific topic, product or service. This information can help form your SEO strategy, determining what you need to do around specific topics to get rankings. For instance, if Google ‘likes’ your site for a specific topic, just building new content means you’ll likely rank for it off the bat. If you’re struggling on a topic, you’ll need to gain more authority, which a lot of the time comes down to getting links.

  5. Your target keywords can at times be close to impenetrable. Rather than wasting resources on reaching the top with no guarantee, consider other phrases with the same intent but lower search volume. Compare the data on search volumes, Google Trends, and cost per click.

  6. Canonical tags must be placed in the head of the page; canonical tags in the body are ignored.
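
A correct placement looks like this (the URL is hypothetical):

```html
<head>
  <link rel="canonical" href="https://example.com/products/blue-widget/">
</head>
```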

  7. Link quality is much more important than quantity. Setting targets on link volume does not work well and is an outdated SEO approach.

  8. There are many reasons why people may get different search results, but the impact of personalization is generally exaggerated. Personalization is minimal, while factors such as device and geography change results far more. Where personalization does occur, it is usually within a session, such as when Google interprets a run of related queries. Beyond that, very little in the organic results is 'personal' to you.

  9. Prioritize your technical SEO audit recommendations. One of the most crucial factors to consider is the cost and difficulty of executing each change. It is a waste of time to push for changes with negligible impact, especially if it means battling significant technical debt to get them shipped; there are other things you can look at and focus on.

  10. Search Engine Optimization is not only about Google. Bing also drives good traffic for B2B queries, particularly from organizations whose IT is locked down, leaving users no choice but older browsers that default to Bing.
  1. Two links from two separate domains are worth more than two links from the same domain.

  2. Did you know that a redirect chain can cause issues? There is no reason for your website's internal links to go through redirects. If you notice an internal link that redirects, update it so it points directly at the final destination.
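
One way to find the final destination of each chain, sketched in Python over a hypothetical source-to-target map (as exported from a crawler):

```python
def flatten_redirects(redirects):
    """Map every source URL to its final destination so internal
    links can be updated to skip the intermediate hops."""
    flat = {}
    for src in redirects:
        seen = {src}
        current = src
        while current in redirects:
            current = redirects[current]
            if current in seen:      # guard against redirect loops
                break
            seen.add(current)
        flat[src] = current
    return flat

chains = {"/old": "/newer", "/newer": "/newest"}
print(flatten_redirects(chains))  # {'/old': '/newest', '/newer': '/newest'}
```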

  3. There is a Google algorithm update called ‘BERT’, and there is nothing you can do to ‘optimize’ for it, the same way that you can’t ‘optimize’ for RankBrain. Focus on producing better content for your site visitors and concentrate on how you can satisfy the intent and maximize user experience at the same time.

  4. If you are doing a crawl on your website, and you are getting things such as HTTP 504: Timeout error, then the website might be timing out for Google as well, and this is bad. There is no reason that most sites should not be run via a service like Cloudflare.

  5. Make sure broken pages return a "404: Not Found" header rather than a "200 OK" header; a 200 on a missing page is known as a 'soft 404.' Returning a success code instead of 404/410 tells search engines that a real page exists at that URL. The result can be a non-existent page listed in the search results, with crawlers continuing to revisit that URL instead of your real pages.
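
A rough heuristic for spotting soft 404s in crawl data, assuming you have each page's status code and body text (the marker phrases are illustrative, not exhaustive):

```python
ERROR_MARKERS = ("page not found", "no longer available", "sorry, we can")

def classify(status_code, body):
    """Flag 200 responses whose body looks like an error page."""
    if status_code in (404, 410):
        return "hard not-found"       # correct behaviour for missing pages
    if status_code == 200 and any(m in body.lower() for m in ERROR_MARKERS):
        return "soft 404"             # should return 404/410 instead
    return "ok"

print(classify(200, "<h1>Page Not Found</h1>"))  # soft 404
```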

  6. You can use Google's Natural Language API to see how Google understands topics and entities. It is a good way to benchmark your content and highlight opportunities you have missed.

  7. You can try a new keyword research tool called AlsoAsked.com. It can mine Google simultaneously for “people also ask” queries and help you group them into topics.

  8. An SEO agency's ongoing focus should be on actions that drive results. There is no such thing as a 'monthly audit.' Audits can be a good place to start because they help you identify gaps, opportunities, and issues and define strategies and roadmaps, but within a reasonable timeframe they are finite. Ongoing website monitoring, particularly on big websites, is crucial, but you can automate it. Reporting and benchmarking are important too, but you can mostly automate them. If someone offers a 'monthly audit,' there is a good chance it will not represent good value.

  9. If JavaScript is being used to render the DOM, clicking 'view source' will not show you what is actually happening on the page. 'View Rendered Source' is an awesome Chrome extension for comparing the non-rendered and rendered source. It is handy for making quick progress on technical SEO audits.

  10. Google disregards crawl-delay stipulated in robots.txt, but you can make changes in the Google Search Console.
  1. Here’s a little technical tip that not a lot of people know. If you have an e-commerce website and you want rich results but are not able to get the schema on your website, you can accomplish it by submitting a product data feed in Google Merchant Centre. You don’t need to spend money on ads. The structured data you provide is what Google can use to boost your search result.

  2. It’s that time of year when you’ll be doing sales, so make sure if you’ve got products or categories in special /sale/ URLs that you have canonical tags set up to the original pages and you 301 redirect them once the sale is over instead of killing the pages – sales attract links!

  3. If a website takes input from many different teams, it is best to set up cloud monitoring, because it helps a lot in spotting problems and preventing errors.

  4. There is no tool that can tell you the specific organic key phrase a person searched for and clicked on. Only Google has that data, and they do not give it out.

  5. Don’t hesitate to link to other good sites, especially if it’s helpful. However, linking out does not help in boosting your rank directly.

  6. During a website migration, it is best to alter page titles as little as possible so you get a clear picture of any site issues.

  7. If you really want to show users something and not search engines, it’s worth keeping in mind that Google ignores everything after a ‘#’ in a URL… Has some “interesting” uses.

  8. For spot-checking, cloud-based tools beat desktop ones. A cloud-based tool makes it easy to check your website for accidental changes, such as broken links or robots.txt updates. It is very common during an initial audit to find a lot of broken links. Remember: if the users are unhappy, so are the bots.

  9. It’s not a bad idea to have the http to https 301 redirects before the HSTS to ensure that everything is working.

  10. As of now, only US search results are affected by Google's BERT algorithm. An intriguing side note: Bing has been using BERT (a Transformer model) for years.
  1. The benefit of 302 (temporary) redirects is that they keep the redirected URL in Google's index. This makes things easier on your end if you change URLs regularly, and it avoids having to set up chains of redirects.

  2. The URL Removal Tool in Google Search Console doesn't do what it says. It only temporarily hides the URL from Google's search results. If you want a result removed from Google permanently, you need to mark that page as noindex.

  3. Automated SEO audits are 'free' because they offer little value.

  4. Google ended support for data-vocabulary markup on April 6th, 2020. If you had not switched to schema.org by then, you have lost your rich snippets.

  5. If you do use an automated audit, be aware they don’t account for the size of your site, which will often dictate the magnitude of the problem. As an analogy, if you have a leaky tap in your bathroom, this isn’t a huge problem – but if your house has 500,000 bathrooms and there is a leaky tap in every one of them, it’s a big problem!

  6. Do you plan on discontinuing products permanently on your e-commerce website? Consider redirecting their links to alternative products. Otherwise, remove the page with an HTTP 410 status code.

  7. If you get a Status Code '0' while running Screaming Frog, it means Screaming Frog's bot is timing out before the page responds (not the server or website). Screaming Frog's default timeout is 20 seconds.

  8. You'd save yourself a lot of wasted time by learning which pieces of SEO advice are myths and misconceptions.

  9. The Chrome User Experience Report (CrUX) is some of the best data you can get: real-world data sent by users' browsers showing how long your site takes to load. If you have enough visitors, you can see this report in Google Search Console.

  10. If you mark items as 'out of stock' or disable checkout while putting your business on pause, make sure to update all related schema. Schema that doesn't match the on-page details can trigger a schema penalty in Google, which could lead Google to ignore your markup altogether.
  1. Google is going to focus heavily on mobile-first indexing this coming May. It’s a good time to check that your content and UX on mobile is matching up to your desktop.

  2. Google is now treating “rel=nofollow” as hints, which means it can disregard nofollows on external websites.

  3. Links on pages marked 'noindex' will eventually be treated as 'nofollow' links.

  4. SHOPIFY HACK: Shopify will not let you make changes to the robots.txt file. However, you can create a 301 redirect from /robots.txt to a file you control elsewhere. Google will still pick this up, and it will function as your robots.txt.

  5. There is a Removals tool in Google Search Console. It does three things: first, it lets you temporarily hide URLs from Google Search; second, it lets you report 'outdated content' that no longer exists but still appears in results; third, it tells you which of your URLs have been filtered by Google's SafeSearch adult filter.

  6. An automated SEO audit report produced by an online tool has almost no value unless it is put into context for your brand or business by a person who understands both it and Search Engine Optimization. Such tools seldom provide a good action plan and usually include false positives.

  7. Some websites serve different content or experiences depending on the user agent. There is a Chrome extension that lets you switch to any user agent, including Googlebot. It is a helpful diagnostic tool when dealing with websites that need things like server-side rendering.

  8. You can use the cache: command in a search to get an idea of how Google currently views the content of another website. Try this: cache:domain.com/your-page-url/ and you will see how Google processes the page. If navigation or a lot of content is missing, the site might have an issue.

  9. When getting an SEO site audit, generally the larger the site, the more value technical fixes hold; the smaller the site, the more value content suggestions carry. Here is a beautiful graph to demonstrate this.

  10. Image optimization is usually overlooked on e-commerce websites. Many people start their purchase journey in Google Images rather than the main search page. Product schema is also now visible on some Google Images results, which is an opportunity for a lot of retailers.
  1. You can use a Chrome extension that can highlight nofollow links automatically on any page you will visit. It is very helpful to see how others are utilizing it and see that you do not miss anything when auditing websites.

  2. Getting an overview of how your competitors are ranking and what they are doing can be invaluable.

  3. Do not just take down or disable your website if you need to pause your business online during this pandemic. If you remove the pages, Google may drop them, and you will struggle to rank when you come back.

  4. Keyword research should account for the terms that your customers are using, whether or not they are ‘technically correct’. A related example I found today: We are hearing the word ‘unprecedented’ being used a lot in relation to Coronavirus. Google Trends suggests that there has been a significant amount of people that for whatever reason, have had to Google the meaning of the word. Definitely worth thinking about in a wider marketing and comms sense too!

  5. If you want to redirect old URLs that have links, for example, from a site migration, you must put the redirects in place and refrain from deleting them unless you can update the source link.

  6. It is very much possible to break Google's webmaster guidelines and trick it into ranking your website better than it actually deserves. But no blackhat SEO is permanent: you are exploiting a gap in the algorithm, that gap is constantly shrinking, and you will get caught eventually.

  7. A simple SEO test for you that a lot of businesses can’t do: Choose a key phrase you wish to rank for and then ask yourself: which page do I want to send the users to when they search for this key phrase, and does the page reflect that query or intent? “Homepage” is normally not the answer and if “it could be this page or this one” then you have some problems to solve!

  8. Google uses DMCA actions in its ranking algorithm. All the more reason to report competitors that are scraping and copying your content.

  9. Google does not "favour" long content. More precisely: longer content is more likely to be better content, and better content ranks better, with no regard for the actual word count. This means "this should be a 1,000-word article" or "this webpage should be 500 words" is completely meaningless. Logically, there is no reason for word count to be a ranking factor, and Google has said this on multiple occasions.

  10. Lighthouse's 'SEO audit' score means almost nothing. You may get a score of 100 and still have massive technical problems. The Lighthouse tool checks only the individual page, and it does so with no context about your business or what you rank for.
  1. One of the keys to creating good content is answering people's questions. Length is not crucial in itself, but it usually correlates with better content. Understand the intent and approach content from a topic, rather than a keyword, point of view.

  2. Google is doing its best to comprehend "things" (i.e. people, companies, brands) and to identify how these entities are connected in its knowledge graph.

  3. Links are important, and you can get easy links by using a tool like Majestic to find broken incoming links: sites that link to you but whose links currently land on broken pages. You can either get them to update the link (to where it was meant to go) or simply set up a 301 redirect on your end. To do this, log in to Majestic, enter your domain, then go to Pages > Crawl Status > 404 and you'll get a full list of broken incoming links. Easy!

  4. If your Screaming Frog crawls as Googlebot are being blocked (server doing reverse IP check), you can usually get around this by setting a custom HTTP Header X-Forwarded-For with a known Google IP.

  5. Always use multiple tools to confirm the results when doing SEO audits. You may soon realize that these tools are giving you different results.

  6. If your e-commerce site has a faceted navigation/filter that leads to a product sub-category with search volume, it is good practice to make sure this page is accessible via standard links (i.e. not checkboxes) and has its own URL so it can rank.

  7. Google has started displaying PDF thumbnails in search. If you have a lot of PDF content, it is worth tracking interactions, because click-through rates will most probably improve.

  8. Many recommend having an XML sitemap because it gives you an easily auditable list of canonical/indexable URLs. It makes it easier to spot when rogue URLs are present or being linked to.

  9. Consumer behavior is shifting heavily toward online and e-commerce. If competitors are slowing down, consider "doubling down" on your SEO. Websites will always need content, and with more and more people spending time on the Internet, the opportunity is bigger than ever.

  10. As people spend more time online, intent and search demand are shifting quickly, and there is much more to consider than before when doing digital marketing.
  1. LCP, or Largest Contentful Paint, is a new metric in the Lighthouse performance report. It marks the point when the "largest" or primary content has loaded and become visible to the user. LCP is a crucial complement to FCP (First Contentful Paint), which only captures the very start of the loading experience. LCP signals to developers how quickly a user can see the page content. Aim for an LCP below 2.5 seconds.

  2. Descriptive anchor text within your website helps users as well as search engines figure out how to rank webpages. Review the key pages on your website and check that internal link anchors roughly match what you want those pages to rank for. For a start, replace "learn more" and "click here" anchor text with something descriptive.

  3. E-A-T, or Expertise, Authority, Trust: you will hear many SEO people talk about it. To be clear, it is not a ranking factor; it is the idea of combining various factors and metrics to estimate these qualities. For example, old-fashioned incoming links to your site can contribute to E-A-T. It is a helpful concept, but there is nothing unusually new about it.

  4. While the cache: operator provides a helpful hint about what Googlebot has observed, it is not really reliable, and Google offers better tools for this task. Use the URL Inspection tool in Search Console or the Mobile-Friendly Test instead.

  5. Use the 'noindex' directive via the X-Robots-Tag header if you want to prevent resources from being indexed when you cannot update the HTML, such as a PDF. Remember not to use nofollow or robots.txt to stop things from getting indexed.
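
Assuming an Apache server with mod_headers enabled, one common way to apply the header to all PDFs is:

```
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Equivalent configuration exists for other servers; the point is that the directive travels in the HTTP response header rather than in the document itself.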

  6. You can use Screaming Frog to validate structured data you’ve added by Javascript by enabling JS rendering (Config > Spider > Rendering) and then checking ‘Google Validation’ in the Spider Configuration

  7. The impact of a “ranking factor” can change massively by industry or niche, even with time of year. Don’t take it for granted that things working well for others will necessarily translate into success for your website.

  8. The default Googlebot timeout is 180 seconds.

  9. Google scores webpages on both 'query dependent' and 'query independent' signals. Examples of query-dependent signals: synonyms, keyword hits, and proximity. Examples of query-independent signals: language, PageRank, and mobile-friendliness.

  10. Although an EMD (Exact Match Domain) may not be as powerful as it used to be, it can prompt Google to treat a search phrase as a "navigational" query (i.e. the searcher looking for that specific site). Because of this, exact match domains still manage to punch above their weight in rankings.
  1. While it’s nice to have one H1 on a page to be clear about the subject, having multiple H1s in a template isn’t going to cause you any SEO issues. It is highly likely there will be other things that are more valuable to spend your time on.

  2. The ‘correct’ title tag length is not based on a number of characters. It’s the length required to uniquely and succinctly describe your page content.

  3. Not redirecting non-canonical URLs is one of the most common mistakes in website migrations, and it can cause a loss in search traffic. Non-canonical URLs, such as those containing tracking parameters, also require redirecting, especially if they receive traffic or links.

  4. If your website has an internal search bar, you can determine if there is a need to make more pages or modify your keyword targeting. Look at what people are searching for. Also, double-check that you have pages that have been optimized for this intent.

  5. The SVG format is supported in schema for things such as the LocalBusiness image.

  6. SEO and PPC work together in a lot of ways. You can use the data from your PPC for your SEO works. It can be very beneficial to buy traffic for keywords because it lets you “scout them out” and check whether they convert or not for you before putting a lot of effort into ranking them.

  7. You can speed up a webpage's deindexing by using a 410 instead of a 404. The difference between the two status codes: 410 means "Gone," while 404 means "Not Found." The 410 "Gone" signal is much stronger for search engines, showing that the URL was removed on purpose, so it will be deindexed faster.
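
Assuming an nginx server, a hypothetical discontinued product page could be handled like this:

```
location = /products/discontinued-widget/ {
    return 410;
}
```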

  8. Featured snippets are awesome, however, they are not that stable compared to ‘traditional’ rankings because they can change and rotate instantly. Google is now testing featured snippets that are not in the #1 rank.

  9. It is sometimes hard to predict the impact of technical SEO changes on large websites. Therefore, it would be best to do SEO A/B testing.

  10. A site: query can help you find pages that are missing from Google's index.
  1. Real-time monitoring of what your competitors are ranking for can be useful, so they don't get a lead on you with good ideas. You can use a tool like Ahrefs to get an alert when a competitor's site starts ranking for a new keyword. In Ahrefs: Alerts > New keywords > Add alert > enter the competitor's domain > set the report frequency > Add.

  2. Shopify is not SEO-friendly "out of the box". It generates multiple Collection URLs for a single product (plus a URL without the collection). It tries to handle this through canonicalization; however, most of the internal link structure still points to non-canonical URLs, which is bad. You can fix this by editing collection-template.liquid and removing the collection reference from where the hrefs are generated. Easy.

  3. When you hear others discuss "Google's Algorithm," it is not one huge algorithm. It is made up of many different algorithms: some are core and others are not, some run at Google's end ahead of time, and others run at the moment of search.

  4. Google is not using data from Google Analytics as a basis to rank or index your website. They do not use “bounce rate” as one of the ranking factors.

  5. If you want to get an overview of your website’s index coverage, the most reliable way is to utilize Google Search Console, view Coverage -> Valid. The site: command can return changing results and return pages that are not in Google’s index.

  6. Many people noindex "tag" pages, because nobody likes landing on a tag page from search: it forces them to perform another search action to get what they want. Tag pages are usually thin/low quality and do not provide a good user experience.

  7. Ensure that all your important pages are easily accessible via normal, clickable links. It may sound silly, but relying on functionality like dropdown boxes can make pages hard for search engines to discover.

  8. Just because a page is crawled and discovered by Google does not mean it will be indexed and appear in search.

  9. Hreflang tags declared for pages in different languages must be reciprocal. If the English version of a page says "the Chinese version is here", the Chinese version must have a tag referring back to the English version. If the tags are not reciprocal, they will be ignored.
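
A small sketch of that reciprocity check, over a hypothetical map of page URL to the hreflang annotations found on that page:

```python
def non_reciprocal(pages):
    """Return (source, target) pairs whose hreflang annotation is not
    mirrored by the target page and will therefore likely be ignored."""
    problems = []
    for url, annotations in pages.items():
        for target in annotations.values():
            if target == url:
                continue                      # self-reference is fine
            if url not in pages.get(target, {}).values():
                problems.append((url, target))
    return problems

pages = {
    "https://example.com/en/": {"en": "https://example.com/en/",
                                "zh": "https://example.com/zh/"},
    # The Chinese page forgot to annotate the English version:
    "https://example.com/zh/": {"zh": "https://example.com/zh/"},
}
print(non_reciprocal(pages))
```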

  10. Google places many SERP verticals (images, news etc) based on their understanding of searcher intent. In my experience, how high up Google shows “People Also Asked” results will give you a good indication to the breadth of intent in a query. For queries where the “People Also Asked” questions are at the top, it means Google is trying to clarify the intent. If you’re trying to rank for these terms, this should be reflected in your content.
  1. If you want Google to index an updated page quickly, use the Google Search Console function > URL Inspection -> Enter the URL -> Request Indexing.

  2. In 2021, the Core Web Vitals metrics became ranking signals, in addition to being crucial for users. The three web vitals – Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift – can be measured on any webpage you land on: right-click -> Inspect -> Lighthouse -> Generate report.

  3. Your site’s CSS can affect how much PageRank passes through a link.

  4. When linking internally, try to link to the “canonical” version of a webpage. Google can ignore your canonical tag and rank the ‘wrong’ webpage if you link to non-canonical variants.

  5. Do not simply delete content that is not earning links or driving traffic. There is a process you can go through: (1) Make sure it is intent/keyword targeted in the first place. (2) Check whether the page is cannibalized by other content on the website. (3) Decide whether the content can be merged into another article, especially if cannibalization issues are present. (4) Can you build on, expand, or improve the content? (5) Can you present the content in a different format or rewrite it with more impact?

  6. If your website has HSTS enabled, browsers will not try to access the non-https version, and you will only see a “307” redirect (not a real server redirect). This can be tricky when diagnosing issues. Use Screaming Frog and disable the “Respect HSTS” setting to examine what’s going on behind the 307 redirects: go to Configuration > Spider > Advanced.

  7. How long should you leave redirects in place? Google’s official answer is “as long as possible,” or at least a year. Ideally, permanently.

  8. If Googlebot encounters noindex, it skips rendering and JavaScript execution. This means any Javascript on a noindex page will never be rendered.

  9. There are two ways to work out how much SEO is worth to your business. #1 From PPC data, you can find the cost per click for a particular key phrase; that gives you an idea of the market cost if you wanted to buy that quantity and quality of traffic. #2 How much is the traffic worth to you? For example, if you can get X visitors for a particular key phrase, what percentage can you convert into a desired commercial action, and how much is that action worth? If #2 is much lower than #1, competitors with a different revenue model or better revenue efficiency can extract more value from the same traffic. It also means they can outbid you on PPC, or even out-invest you on SEO.
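
The two valuation methods above reduce to simple arithmetic. All the numbers below are illustrative assumptions, not benchmarks:

```python
# Worked example of the two valuation methods (illustrative numbers only).

monthly_visitors = 5_000        # organic visitors for the key phrase
cpc = 2.50                      # cost per click for the same phrase in PPC
conversion_rate = 0.02          # share of visitors completing the action
value_per_action = 80.00        # revenue per completed action

# Method 1: what buying this traffic would cost.
ppc_equivalent = monthly_visitors * cpc

# Method 2: what the traffic is worth to your business.
traffic_value = monthly_visitors * conversion_rate * value_per_action

print(f"PPC equivalent cost: ${ppc_equivalent:,.2f}")   # $12,500.00
print(f"Value to the business: ${traffic_value:,.2f}")  # $8,000.00
# Here traffic_value < ppc_equivalent: a competitor with a better revenue
# model can outbid you on PPC and out-invest you on SEO.
```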

  10. At times, it can be hard to get an overview of where a particular page is linking to when you are trying to track down and remove page problems. Use Screaming Frog and configure it to see the links on a single page: go to Configuration -> Spider -> Limits and set the crawl depth to “1”. Then enter the exact URL you want to check in the top bar, and you will receive a clear list of the resources and links on that one page.
  1. Every time we use a website, the page moves around as it loads. Google has picked a metric to measure how much this movement affects your user experience: “Cumulative Layout Shift” (CLS). CLS measures how much the visible content unexpectedly shifts while the page loads new content; a poor score signals a frustrating visitor experience and counts against you in the page-experience signals.

  2. If you are working on a huge site, attempting to optimize titles and Meta descriptions manually for every page is ineffective and gives a low return on effort. At the very least, start with an ‘optimized’ template that can be generated automatically. You can then concentrate on optimizing the critical pages, or move on to higher-priority tasks.
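
A templated approach can be as simple as the sketch below. The field names, copy, and length limits are assumptions; adapt them to your own catalogue data and hand-tune only the critical pages.

```python
# Minimal sketch: auto-generate 'good enough' titles and meta descriptions
# at scale. Field names and copy are assumptions, not a standard.

def seo_meta(product, category, brand="Example Store"):
    title = f"{product} | {category} | {brand}"
    description = (f"Shop {product} in our {category} range at {brand}. "
                   f"Free delivery and easy returns.")
    # Trim to typical SERP display limits (approximate, not official caps).
    return title[:60], description[:155]

title, desc = seo_meta("Trail Running Shoes", "Footwear")
print(title)  # Trail Running Shoes | Footwear | Example Store
print(desc)
```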

  3. Every part of a web address can influence ranking. Protocol: Google favors https over http. Sub-domain: content in a sub-folder often ranks better than on a sub-domain. Domain name: an Exact Match Domain (EMD) can prompt Google to think that a generic query phrase holds navigational intent. Top-level domain: country-code TLDs can make your site easier to rank in a specific geographic location.
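
The parts listed above map directly onto the components of a URL, which you can pull apart with Python’s standard library (the URL itself is a made-up example):

```python
# Splitting a URL into the ranking-relevant parts named above.
from urllib.parse import urlsplit

url = "https://shop.example.co.uk/shoes/trail-running?page=2"
parts = urlsplit(url)

print(parts.scheme)    # 'https'  (protocol)
print(parts.hostname)  # 'shop.example.co.uk' (sub-domain + domain + ccTLD)
print(parts.path)      # '/shoes/trail-running' (sub-folder structure)
```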

  4. Shopify is guilty of linking to non-canonical URLs, and canonical tags only act as a hint. Sitebulb offers an objective way to assess the resulting problems: connected to Google Analytics data, it can surface traffic landing on the “wrong” URLs.

  5. As long as your code functions, it does not need to be W3C compliant – this is not a “ranking factor” as you see on some audits. To quote Google: “As long as it can be rendered and SD extracted: validation pretty much doesn’t matter.”

  6. Big site? You’ll get more pages indexed with lots of small sitemaps, rather than 50k URL sitemaps.
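
Splitting a URL list into many small sitemap files is a straightforward chunking job. The file-name pattern and chunk size below are assumptions; the sitemap protocol only caps files at 50,000 URLs.

```python
# Sketch: split a large URL list into many small sitemap files instead of
# a few maxed-out 50k-URL files. Names and chunk size are assumptions.

def chunk_urls(urls, size=1000):
    """Yield (filename, batch) pairs of at most `size` URLs each."""
    for i in range(0, len(urls), size):
        yield f"sitemap-{i // size + 1}.xml", urls[i:i + size]

urls = [f"https://example.com/page-{n}" for n in range(2500)]
for name, batch in chunk_urls(urls):
    print(name, len(batch))
# sitemap-1.xml 1000
# sitemap-2.xml 1000
# sitemap-3.xml 500
```

Smaller files also make index-coverage problems easier to localize: Search Console reports indexing per sitemap, so a problem section of the site shows up as one under-performing file.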

  7. Did the user reformulate or modify their query? Bing has specifically listed this as a ranking factor: changing a query is a strong signal that the results did not suit the searcher’s intent. So there is a first-mover advantage, especially on “broader” search phrases. Look at the “people also asked” and “people also searched” data to see whether you can develop your content to cover those reformulations, since in the long term the rankings for the original query will likely shift towards them.

  8. Domains that already have links and have existed for a long time are very valuable.

  9. If a newspaper is trying to sell you online advertorial links “with SEO value” then they are potentially putting you at risk.

  10. Don’t disavow links just because they have low Domain Authority (DA) or Trust Flow (TF). These are third-party metrics, not signals Google uses.
  1. Google still uses “non-supported” schema types to better understand page content. So if you’re adding schema to your page, it is best to do a “full” job rather than only using the schema types for which Google can generate a special result.

  2. Have an SEO specialist work with you at the beginning if you are planning on building a new website. You’ll get more value in preventing potential issues from happening and identifying more opportunities like schema or other details that you did not include in keyword research. SEO is a continual process and not something that you implement and forget.

  3. Whether a link is ‘counted’ by Google is a decision made in the context of the rest of your link profile. Google has stated that if you have “on the fence” links it can’t classify, a good link profile may mean you get the benefit of the doubt on those links, while a bad link profile may mean all those links are discounted too. Much like if someone you trust and someone you don’t trust both tell you a hard-to-believe story, you are more likely to give the trustworthy person the benefit of the doubt!

  4. No TLD you buy carries inherently more or less “trust.” For instance, Google does not trust a .SG domain over a .TK domain. Some TLDs are indeed used more for spam (and there can be a perception issue among users), but algorithmically you start on equal footing. This is strictly about trust; ccTLDs can, of course, still influence geotargeting.

  5. You can use Google’s Indexing API to “directly notify Google when pages are added or removed. This allows Google to schedule pages for a fresh crawl, which can lead to higher quality user traffic.” Google says only to use it on Job Ads / Streaming pages, but it actually works on any page.
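
The notification itself is a small JSON POST to the documented v3 endpoint. The sketch below only builds the request body; authentication (an OAuth 2.0 service-account token) and the HTTP call are omitted and assumed to be handled elsewhere.

```python
# Sketch of a Google Indexing API notification body. Endpoint and payload
# shape follow the documented v3 API; auth and the HTTP POST are omitted.
import json

ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url, deleted=False):
    """Build the JSON body for a publish request."""
    return json.dumps({
        "url": url,
        "type": "URL_DELETED" if deleted else "URL_UPDATED",
    })

body = build_notification("https://example.com/jobs/new-listing")
print(body)
```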

  6. You can improve your CLS score on infinite load pages by removing any footer with content in it that continues to get pushed down. If you think about it, users won’t be able to access it anyway, so it’s likely just frustrating them.

  7. If you have to merge or change URLs, including pages that rank or show things like featured snippets, it is always a good idea to copy over verbatim the paragraphs that are ranking when you set up the redirect, to try to maintain those positions. I have just moved a page that was ranking #1 with a featured snippet to another URL and maintained the snippet.

  8. If you’re stuck in a rut for content ideas, a tool like BuzzSumo can quickly show you which content is popular and being shared around a topic. It’s a great way to kickstart your ideation process.

  9. Many automated SEO audit tools will report “duplicate content” issues for URLs that correctly use hreflang tags. If, for instance, you have almost identical pages for English (UK) and English (AU) that correctly use hreflang tags, you don’t need to worry about “duplicate content”. It is one of many examples where automated tools can give false positives.

  10. When SEOs talk about “crawl budget”, they mean the number of pages on your website that Google may crawl. It is usually only something that sites with many pages need to worry about – but if you have a website with hundreds of thousands of pages, it is worth considering where you want to send the robots.
  1. If you’re running multiple locales over sub-folders, it’s really helpful to set up separate properties within Google Search Console. This gives you a super quick way to get insights into each locale without having to worry about filtering data – a huge time saver.

  2. Do not be bothered if you notice different rankings for the same search term on two separate computers. Even on the same IP, computer, and location, signed out (or in), you may still see the same website ranking in two different positions. Google’s serving infrastructure is huge and constantly shifting; there is no single instance of “the index”, only a norm that everything converges towards.

  3. Guest posting, even paid posts, can get websites to rank. However, it is not a recommended long-term strategy, as there are better things to invest in – SEO is all about the long term. There are affiliate sites ranking on nothing but paid posts, but the difference is that if an affiliate website drops in ranking or gets a penalty, it’s pretty easy to take the money and move on, which is not true for brands and businesses that have equity.

  4. Google My Business has a “Posts” feature that lets you post COVID-19 updates/support, updates, offers, and events, straight to the SERP. It’s an easy way to control the search result, expand the real estate you are taking up and control the message you want to deliver at that moment.

  5. If you’re using on-page hreflang tags, you can audit them with Screaming Frog by selecting ‘Crawl’ and ‘Store’ Hreflang under ‘Config > Spider > Crawl’. This will help you quickly identify where you have issues.

  6. Google will index and rank content that is hidden, such as in tabs. The only difference is that content hidden when the page first loads won’t be shown by Google in snippets.

  7. Google Search Console now supports regular expressions (regex). This means you can pull more specific data more quickly on things like non-brand versus brand search queries.
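
The same brand/non-brand split that a regex filter does inside Search Console can be sketched against an exported query list. The brand pattern below is a made-up example; include common misspellings of your own brand name.

```python
# Sketch: split queries into brand vs non-brand with a regex, mirroring the
# regex filters now available in Search Console. Pattern is an assumption.
import re

BRAND = re.compile(r"acme|ackme|ac me", re.IGNORECASE)

queries = ["acme running shoes", "best trail shoes",
           "ackme returns", "shoe sizing guide"]
brand = [q for q in queries if BRAND.search(q)]
non_brand = [q for q in queries if not BRAND.search(q)]

print(brand)      # ['acme running shoes', 'ackme returns']
print(non_brand)  # ['best trail shoes', 'shoe sizing guide']
```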

  8. Keep in mind: if you get a manual action for links and then clear the penalty, you cannot go back to the position you held before. The links considered manipulative will be either discounted or removed, which means the popularity, trust, and link equity they carried must be replaced. This is why manipulative link building should not be considered an “investment,” as many SEOs treat it.

  9. Using an estimated global “click-through rate” is not that helpful, because depending on the search intent, and therefore the SERP layout, the CTR for position #1 in Google varies between 13.7% and 46.9%.

  10. It’s important to accept that all “link building” carries some kind of risk. The only way to guarantee links is to place or pay for them, which risks breaking Google’s guidelines. At the other end of the scale, you can create content and do outreach, which is not guaranteed to earn links, so you risk wasting time and effort. There are no guarantees in marketing.
  1. Google Web Stories are an AMP-based format and another avenue to explore for more search traffic. If you’re running a WordPress site, Google recently released an official plugin to help you make them.

  2. The ‘News’ filter in Google Search Console only reports clicks that occur in the Google Search “News” tab. It does not report news clicks from Top Stories in the “All” results.

  3. Page speed is now a ranking factor, but not a huge one. Page speed is super important for loads of other reasons; however, you’re not going to directly lose rankings because a page takes 5s instead of 3s to load.

  4. When you’re pushing a new site live, do a quick scan to make sure no leftover noindex tags or robots.txt rules are blocking crawling and indexing – otherwise, it’s not going to be a great start to the site launch.

  5. W3C validation is not a ranking factor. It’s a red flag if you are told to check W3C validation “for SEO.” Validation helps you avoid errors, and badly broken HTML can cause problems, but strict validation in itself will not affect rankings.

  6. Ranking fluctuations are normal. You will notice positions rising and falling daily or weekly even if you change nothing on your website. There are thousands of algorithm adjustments a year, many moving parts, changes in the link graph that can boost sites, and competitors altering things. Do not be quick to attribute such small changes to your actions or inactions. It is the drastic trends that persist over months that you should act upon.

  7. It’s possible to serve different web experiences based on user-agents, meaning if your site has specific issues with bots, you can serve a “bot-friendly” version to Google, Facebook and the like.

  8. KD, or Keyword Difficulty, is a proprietary metric calculated differently by each tool vendor, so be very cautious about basing decisions on it.

  9. The “Auto-redirect to Base URL” option in Magento automatically redirects users to your base URL (for example, from www to non-www). However, the default redirect is a temporary 302, so make sure you change it to a permanent 301.

  10. If you have the same content on two different URLs and the second URL serves no purpose (it is not, for instance, a parameter that filters or sorts), use a 301 redirect rather than combining them with a canonical tag. For example, if you have the non-www and www versions of your website, choose one and permanently 301-redirect the other to it.
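
The host-normalization logic behind such a redirect rule can be sketched as below. The choice of https and the non-www host is an assumption; pick whichever variant you want as canonical.

```python
# Minimal sketch: compute the 301 Location that normalizes every request
# onto one canonical host. Canonical host choice is an assumption.
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "example.com"

def redirect_target(url):
    """Return the 301 target for a non-canonical URL, or None if already canonical."""
    parts = urlsplit(url)
    if parts.scheme == "https" and parts.netloc == CANONICAL_HOST:
        return None  # already canonical, no redirect needed
    return urlunsplit(("https", CANONICAL_HOST, parts.path, parts.query, parts.fragment))

print(redirect_target("http://www.example.com/shoes?page=2"))
# -> https://example.com/shoes?page=2
print(redirect_target("https://example.com/shoes"))  # -> None
```

In practice you would implement this as a server-level rule (e.g. in your web server or CDN configuration) returning status 301, not in application code, but the mapping is the same.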
