There are software programs that generate hundreds of virtually identical pages, each targeting different keywords, without any human intervention. I have seen such pages. They are ugly and don't make any sense.
I recommend you avoid using such programs. Search engines consider these types of pages spam and will catch up with you sooner or later.
According to the search engines, 30-40% of all webpages on the internet fall into the category of spam. I bet a large percentage of these are doorway pages created by software programs. Now you see why search engines don't like them.
Search engines can, and do, detect spam doorway pages and are actively removing them from their index.
So what do the search engine optimizers who use software-generated doorway pages do when they're removed from the search engines? They go back to their clients and tell them that the search engines have dropped all their pages and that they will need to spend more money to create and optimize new pages. Talk about getting ripped off!
On a final note, if you create a doorway page for a specific search engine, you do not want other search engines to find and index that page. If a search engine indexes both the page created specifically for it and a near-identical page intended for a different engine, the two virtually identical pages could set off the spam alarm. So how do you avoid this predicament?
The answer is to use a robots.txt file to instruct each search engine spider which directories and pages it is allowed to crawl and index. Read the What Is A Robots.txt File? section for more information.
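As a sketch, suppose the doorway pages for each engine live in their own directories (the paths here are hypothetical). A robots.txt at the root of your site could tell Google's spider (Googlebot) and Yahoo's spider (Slurp) to stay out of the directories not meant for them:

```text
# Hypothetical directory layout: /doorways/google/, /doorways/yahoo/, etc.

User-agent: Googlebot
Disallow: /doorways/yahoo/

User-agent: Slurp
Disallow: /doorways/google/
```

Each spider reads only the block addressed to it, so every engine sees just the doorway pages created for it.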
Hallway pages are similar to site maps, except that they only contain links to doorway pages. Hallway pages are so-called because they are like hallways leading to many doors. Do not include links to your doorway pages on your site map. Use a hallway page instead.
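A hallway page needs nothing fancy; it is just an ordinary page whose body is a list of links to your doorway pages. A minimal sketch (filenames are hypothetical):

```html
<html>
<head><title>More About Our Widgets</title></head>
<body>
<!-- Each link below is a "door" leading to one doorway page -->
<ul>
  <li><a href="/doorways/cheap-widgets.html">Cheap Widgets</a></li>
  <li><a href="/doorways/quality-widgets.html">Quality Widgets</a></li>
  <li><a href="/doorways/custom-widgets.html">Custom Widgets</a></li>
</ul>
</body>
</html>
```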
Dynamic pages are database-driven pages created "on the fly" (in real time), depending on the user's interest or request. Dynamically generated pages are commonly created using programming languages such as ASP (Active Server Pages), Cold Fusion, or Perl (CGI).
Anyone who surfs the World Wide Web has come across a dynamically generated website. For example, every search engine dynamically generates its page of search results based on the search terms you enter.
There are three reasons why search engines do not index dynamically generated pages:
The best way to turn dynamically generated webpages into search engine friendly pages is to use a system that converts your database into static pages whenever it is updated. Alternatively, you may use a software program to convert an entire database-driven site into a static one.
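The idea behind such a system can be sketched in a few lines: walk the database and write one static HTML file per record. This is a minimal illustration, not any particular product; the table name, columns, and file naming scheme are all hypothetical.

```python
# Sketch: export each row of a (hypothetical) "products" table in a
# SQLite database as its own static, crawlable HTML page.
import os
import sqlite3

def export_static_pages(db_path, out_dir):
    """Write one HTML file per product row; return the list of files created."""
    os.makedirs(out_dir, exist_ok=True)
    conn = sqlite3.connect(db_path)
    pages = []
    for rowid, title, body in conn.execute("SELECT id, title, body FROM products"):
        filename = os.path.join(out_dir, f"product-{rowid}.html")
        with open(filename, "w") as f:
            # Plain static HTML: nothing here requires a database at serve time.
            f.write(f"<html><head><title>{title}</title></head>"
                    f"<body><h1>{title}</h1><p>{body}</p></body></html>")
        pages.append(filename)
    conn.close()
    return pages
```

Rerun the export after each database update so the static copy never falls out of date.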
Use software fixes to convert ASP, Cold Fusion, or Perl generated URLs into ones that search engines can crawl and index. Note that these software fixes do not convert ASP, Cold Fusion, or Perl generated pages into static HTML pages. They simply rewrite the URLs into HTML-style versions, so that search engines will index the contents properly.
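One common way to do this on an Apache server is URL rewriting with mod_rewrite: the spider sees a static-looking `.html` address, and the server silently maps it back to the dynamic script. A minimal sketch, with hypothetical paths and parameter names:

```apache
# Turn on the rewrite engine for this site.
RewriteEngine On

# Map the static-looking URL /products/42.html to the real
# dynamic URL /products.asp?id=42 behind the scenes.
RewriteRule ^products/([0-9]+)\.html$ /products.asp?id=$1 [L]
```

The page is still generated dynamically on every request; only the URL the search engine sees has changed.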
Here is a tool that converts databases into static HTML files: