
    • The first concept to consider when writing content for search engines is to make the content available to search engines within the first 100 lines of the page's source code.
    • Each page of content should have a clear and unique theme.


      • Once you've chosen (or developed) a crawling tool, you need to configure it to behave like your favorite search engine crawler (Googlebot, Bingbot, etc.). First, set the crawler's user agent to an appropriate string; a minimal sketch of doing this by hand follows the list below.

         
          Popular Search Engine User Agents: 
           
        •   Googlebot - "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
        •   Bingbot - "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
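
        As a rough illustration, here is how you might fetch a page while identifying as a search engine crawler. This is a minimal sketch assuming Python and the third-party requests package; the URL is a placeholder.

            # Fetch a page while identifying as Googlebot (placeholder URL).
            import requests

            GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

            response = requests.get(
                "https://www.example.com/",           # hypothetical site to audit
                headers={"User-Agent": GOOGLEBOT_UA}, # crawl as the search engine would
                timeout=10,
            )
            print(response.status_code)
            print(response.headers.get("Content-Type"))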
    • Before we can diagnose problems with the site, we have to know exactly what we're dealing with. Therefore, the first (and most important) preparation step is to crawl the entire website.

       

        Crawling Tools

       

        I've written custom crawling and analysis code for my audits, but if you want to avoid coding, I recommend using Screaming Frog's SEO Spider to perform the site crawl (it's free for the first 500 URIs and £99/year after that).

       

        Alternatively, if you want a truly free tool, you can use Xenu's Link Sleuth; however, be forewarned that this tool was designed to crawl a site to find broken links. It displays a site's page titles and meta descriptions, but it was not created to perform the level of analysis we're going to discuss.

       

        For more information about these crawling tools, read Dr. Pete's Crawler Face-off: Xenu vs. Screaming Frog.
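
        If you'd rather write your own crawl along those lines, the sketch below shows one possible shape: a same-domain crawl that records each page's title and meta description. It is only a sketch, assuming the third-party requests and beautifulsoup4 packages; the start URL is a placeholder, and robots.txt handling, politeness delays, and error handling are left out.

            # Crawl one domain breadth-first and collect title + meta description.
            from collections import deque
            from urllib.parse import urljoin, urlparse

            import requests
            from bs4 import BeautifulSoup

            START_URL = "https://www.example.com/"  # hypothetical site to audit
            MAX_PAGES = 500                         # mirrors Screaming Frog's free limit
            USER_AGENT = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

            seen, queue, results = set(), deque([START_URL]), []
            domain = urlparse(START_URL).netloc

            while queue and len(seen) < MAX_PAGES:
                url = queue.popleft()
                if url in seen:
                    continue
                seen.add(url)
                resp = requests.get(url, headers={"User-Agent": USER_AGENT}, timeout=10)
                if "text/html" not in resp.headers.get("Content-Type", ""):
                    continue
                soup = BeautifulSoup(resp.text, "html.parser")
                title = soup.title.string.strip() if soup.title and soup.title.string else ""
                meta = soup.find("meta", attrs={"name": "description"})
                description = meta["content"].strip() if meta and meta.has_attr("content") else ""
                results.append((url, title, description))
                # Queue same-domain links for later visits.
                for link in soup.find_all("a", href=True):
                    target = urljoin(url, link["href"]).split("#")[0]
                    if urlparse(target).netloc == domain:
                        queue.append(target)

            for url, title, description in results:
                print(url, title, description, sep="\t")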

  • Aug 06, 12

    SEO tool for diagnosing problems with backlinks and analyzing SERPs

  • Aug 06, 12

    "Sometimes SEO experts go too far in their desire to push their clients' sites to top positions and resort to questionable practices, like keyword stuffing. Keyword stuffing is considered an unethical practice because what you actually do is use the keyword in question throughout the text suspiciously often. Having in mind that the recommended keyword density is from 3 to 7%, anything above this, say 10% density starts to look very much like keyword stuffing and it is likely that will not get unnoticed by search engines. A text with 10% keyword density can hardly make sense, if read by a human. Some time ago Google implemented the so called “Florida Update” and essentially imposed a penalty for pages that are keyword-stuffed and over-optimized in general. "

    • Generally, keyword density in the title, the headings, and the first paragraphs matters more. Needless to say, you should be especially careful not to stuff these areas. Try the Keyword Density Cloud tool to check whether your keyword density is within acceptable limits, especially in the above-mentioned places (a rough density check also appears after this list). If you have a high density percentage for a frequently used keyword, consider replacing some occurrences of the keyword with synonyms. Also, words in bold and/or italic are generally considered more important by search engines, but if every occurrence of the target keyword is bolded or italicized, this also looks unnatural, and at best it will not push your page up.
    • Another common keyword scam is doorway pages. Before Google introduced the PageRank algorithm, doorways were a common practice, and there was a time when they were not even considered an illegitimate optimization technique. A doorway page is a page made especially for search engines: it has no meaning for humans but is used to get high positions in search results and to trick users into coming to the site. Although keywords are still very important, today keywords alone have less effect on a site's position in search results, so doorway pages no longer bring much traffic; if you use them anyway, don't be surprised when Google penalizes you.
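
    To make the density percentages above concrete, here is a rough single-keyword density check (the sample text and keyword are placeholders; multi-word phrases would need a slightly different counting approach).

        # Keyword occurrences as a percentage of all words in the text.
        import re

        def keyword_density(text: str, keyword: str) -> float:
            words = re.findall(r"[a-z0-9']+", text.lower())
            if not words:
                return 0.0
            hits = sum(1 for word in words if word == keyword.lower())
            return 100.0 * hits / len(words)

        sample = "Cheap widgets for sale. Our widgets are the best widgets online."
        print(f"{keyword_density(sample, 'widgets'):.1f}%")  # ~27%, clearly stuffed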

       


    • A lot of developers try to impress their visitors by implementing heavy Ajax features (particularly for navigation), but did you know that this is a big SEO mistake? Because Ajax content is loaded dynamically, it is not spiderable or indexable by search engines (see the quick check after this list).
    • Another disadvantage of Ajax: since the URL in the address bar doesn't change, your visitors cannot send a link to the current page to their friends.
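
    One crude way to see this from a crawler's point of view is to fetch the page's raw HTML (with no JavaScript executed) and check whether a phrase that should be indexable is actually there; the URL and phrase below are placeholders.

        # Does the static HTML (what a crawler fetches) contain the key content?
        import requests

        URL = "https://www.example.com/products"  # hypothetical Ajax-heavy page
        PHRASE = "Spring catalogue"               # content that should be indexable

        html = requests.get(URL, timeout=10).text  # raw source; no JavaScript runs
        if PHRASE.lower() in html.lower():
            print("Phrase found in the static HTML - crawlers can see it.")
        else:
            print("Phrase missing from the static HTML - likely loaded via Ajax.")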

