Monday, June 8, 2009

Spider Traps & Search Engine Optimization

Search engine optimization is one of the most challenging and important parts of web site design. Did you know that your own web site can set hidden traps for search engine spiders?


What is a spider trap?

A spider trap is anything that prevents a search engine from crawling your website. Spider traps include dynamic pages, pages whose text is embedded in images, links without anchor text, password-protected sections of your site, and frame-based sites.
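One of the traps listed above, links with no anchor text, can be detected automatically. The sketch below uses Python's standard-library HTML parser to count links whose visible content is an image rather than text; the class name and thresholds are illustrative assumptions, not part of any SEO tool.

```python
from html.parser import HTMLParser

class LinkTextChecker(HTMLParser):
    """Hypothetical helper: flags <a> tags that contain no visible
    text (e.g. image-only links), one of the trap types listed above."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.link_has_text = False
        self.image_only_links = 0
        self.total_links = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.link_has_text = False
            self.total_links += 1

    def handle_data(self, data):
        # Any non-whitespace text inside the link counts as anchor text.
        if self.in_link and data.strip():
            self.link_has_text = True

    def handle_endtag(self, tag):
        if tag == "a":
            if not self.link_has_text:
                self.image_only_links += 1
            self.in_link = False

checker = LinkTextChecker()
checker.feed('<a href="/home"><img src="home.png"></a> <a href="/about">About us</a>')
print(checker.image_only_links, "of", checker.total_links, "links have no anchor text")
# → 1 of 2 links have no anchor text
```

A real audit would fetch each page and run a check like this across the whole site, but the idea is the same: a spider sees the second link's text, and nothing for the first.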


How to identify them?

You can identify spider traps by analyzing source code, URL names, URL parameters, variable names, image names, file names, user logins, and session restrictions.
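The URL-level checks above can be sketched as a few simple heuristics. This is a minimal illustration in Python; the length threshold, the regex, and the list of session-parameter names are assumptions chosen for the example, not established rules.

```python
import re
from urllib.parse import urlparse, parse_qs

# Assumed list of common session-ID query parameters.
SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid"}

def looks_like_trap(url: str) -> list:
    """Return a list of suspected spider-trap indicators for a URL."""
    findings = []
    parsed = urlparse(url)
    if len(url) > 100:                           # overly long URL (assumed threshold)
        findings.append("url too long")
    if re.search(r"/\d{8,}(/|$)", parsed.path):  # long numeric/random path segment
        findings.append("numeric/random path segment")
    params = {k.lower() for k in parse_qs(parsed.query)}
    if params & SESSION_PARAMS:                  # session ID exposed in the query string
        findings.append("session id in query string")
    return findings

print(looks_like_trap("http://www.lankaemarketing.com/33343423453453"))
# → ['numeric/random path segment']
```

Running every internal URL of a site through a filter like this is one quick way to surface the session-ID and random-URL traps described in the list below.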


Some common spider traps



  • URLs containing a session ID or random number (e.g. http://www.lankaemarketing.com/33343423453453)

  • Dynamically generated random URLs

  • Calendars built from dynamic pages whose links continually point to the next day or year.

  • Infinitely deep directory structures, often created unintentionally (this can be a web folder structure or a link path), e.g. http://lankaemarketing.com/Portfolio/customers/webpage/website/customer/lanka/web/....

  • Complex navigation structures (a menu structure with hundreds of URLs where individual pages lack proper back links).

  • Pages filled with a very large number of characters, crashing the lexical analyzer that parses the page. (Lexical analysis is the process of converting a sequence of characters into a sequence of tokens; programs performing lexical analysis are called lexical analyzers or lexers.) This can apply to either the source code or the page text.

  • SEO-friendly URLs that still contain a session ID or random number.

  • Overly long URLs (more than 60-100 characters).

  • Unprofessional menu structures (built entirely with JavaScript, Flash, or images instead of plain HTML links).
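The infinite-directory trap in the list above also lends itself to a simple check: a URL that is either very deep or keeps repeating the same path segments is a strong signal of a crawler loop. The function below is a rough sketch; both thresholds are assumptions for the example.

```python
from collections import Counter
from urllib.parse import urlparse

def too_deep(url: str, max_depth: int = 8, max_repeats: int = 2) -> bool:
    """Rough heuristic for the infinite-directory trap: flag a URL
    whose path is deeper than max_depth, or where any single segment
    repeats more than max_repeats times (both limits are assumptions)."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    if len(segments) > max_depth:
        return True
    return any(count > max_repeats for count in Counter(segments).values())

# A looping calendar-style path trips both checks; a normal page trips neither.
print(too_deep("http://example.com/cal/2009/06/cal/2009/06/cal/2009/06"))  # → True
print(too_deep("http://example.com/portfolio/customers"))                  # → False
```

A crawler (or a site audit script) can drop URLs that fail this test before following them, which stops the spider from wandering into an endless link path.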

