Hi guys
We have just redesigned our site, which is built with ASP.NET. The original site used plain text links, which Googlebot managed to follow to a limited extent; I think the reason it could not follow all of them was the way the URLs are constructed, e.g. http://www.???.com/dir.aspx?cat=14340. The new design uses three combo boxes that are dynamically populated from our database. I am concerned that Googlebot will not be able to crawl through these, so I wonder if anyone could help? My idea is to add a site map page with links to all our dynamically created pages, and link to it from every page. I would really appreciate some help here, as optimisation is our lifeline :-)
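To make the site map idea concrete, here is a minimal sketch of what such a page might look like. The point is that crawlers follow plain `<a href>` links, whereas navigation driven by combo-box postbacks generally won't be followed. The category IDs and names below are made up for illustration; in practice the list would be generated server-side from the same database that populates the combo boxes:

```html
<!-- sitemap.aspx: a plain list of anchor links that Googlebot can follow -->
<html>
  <head><title>Site Map</title></head>
  <body>
    <h1>Site Map</h1>
    <ul>
      <!-- one plain link per dynamically created page -->
      <li><a href="dir.aspx?cat=14340">Category 14340 (example)</a></li>
      <li><a href="dir.aspx?cat=14341">Category 14341 (example)</a></li>
      <!-- ...and so on for every category in the database... -->
    </ul>
  </body>
</html>
```

A link to this page from the footer of every page would then give the crawler a static path into all the dynamically created pages, regardless of the combo boxes.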
Cheers J