Web Programming help, one for the Pros.

Google Sitemaps.

An XML file listing all the pages on your website, used by Google to index your entire product range rather than just your homepage. Easy enough to do, but which XML editor is free to download and easy to use?
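To give an idea of the format, a minimal one-page file looks like this (the URL and date are made up; the namespace is the 0.84 one from Google's published spec):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
   <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2005-06-01</lastmod>
      <changefreq>daily</changefreq>
      <priority>1.0</priority>
   </url>
</urlset>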

Alternatively, if you know how to make Google Sitemaps in Dreamweaver MX then please share... cheers

GG, any ideas on this one?
 
I've been thinking about doing it for a while now, but haven't had time.

Google have published the file specification, haven't they?

I was going to have a PHP page populate the file automatically for my forums site, by pulling out a list of the threads, and then writing the XML manually.

I have written an automatic news feed. I'll post the code in here, and it should hopefully get you started. It shows you the key step of sending the "text/xml" header, and then how to intersperse the data you want into the file. Hope it's of use...

PHP:
<?php

// prepare HTML text for use as UTF-8 character data in XML
function cleanText($intext) {
    return utf8_encode(htmlspecialchars(stripslashes($intext)));
}

// set the file's content type and character set
// this must be called before any output
header("Content-Type: text/xml;charset=utf-8");
require("misc/opendb.php");

// fetch the ten most recent front-page articles
$query = "select * from FrontIndex order by Created desc limit 10";
$feedcontent = mysql_query($query);
$na = mysql_num_rows($feedcontent);
$phpversion = phpversion();

// display RSS 2.0 channel information
echo "<?xml version=\"1.0\" encoding=\"utf-8\"?>";
?>

<rss version="2.0">
   <channel>
      <title>giles-guthrie.com's Latest Articles</title>
      <link>http://giles-guthrie.com</link>
      <description>The latest additions to the giles-guthrie.com experience.</description>
      <language>en-us</language>
      <docs>http://backend.userland.com/rss</docs>
      <generator>giles-guthrie.com, powered by PHP/<?php echo $phpversion; ?></generator>
<?php
// loop through the result set, pulling database fields for each item
for ($i = 0; $i < $na; $i++) {
   $row = mysql_fetch_array($feedcontent);
?>
      <item>
         <title><?php echo cleanText($row["Headline"]); ?></title>
         <link><?php echo cleanText($row["Link"]); ?></link>
         <description><?php echo cleanText($row["Text"]); ?></description>
         <pubDate><?php echo cleanText($row["Created"]); ?></pubDate>
         <guid isPermaLink="true"><?php echo cleanText($row["Link"]); ?></guid>
      </item>
<?php
}
?>
   </channel>
</rss>
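For the sitemap itself, the same pattern should carry over almost unchanged. This is an untested sketch of what I had in mind, assuming a made-up Pages table with Url and Modified columns; swap in your own query and URL scheme:

PHP:
<?php

// send the XML content type before any output
header("Content-Type: text/xml;charset=utf-8");
require("misc/opendb.php");

// pull every public page from the database
// (the table and column names here are placeholders)
$result = mysql_query("select Url, Modified from Pages order by Modified desc");

echo "<?xml version=\"1.0\" encoding=\"utf-8\"?>";
?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
<?php
while ($row = mysql_fetch_array($result)) {
?>
   <url>
      <loc><?php echo htmlspecialchars($row["Url"]); ?></loc>
      <lastmod><?php echo date("Y-m-d", strtotime($row["Modified"])); ?></lastmod>
      <changefreq>weekly</changefreq>
      <priority>0.5</priority>
   </url>
<?php
}
?>
</urlset>

From memory, the spec allows up to 50,000 URLs per file, so 1000+ pages will fit comfortably in a single file.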
 
Thanks Giles,

I would be very interested in a PHP XML parser if I could get my hands on one. I look after a number of commercial websites, and the one I am generating a site map for has 1000+ pages. Obviously this is a database firing out templates of product pages, so creating the site map manually would be very time-consuming. I did find a free one on the web, but like most 'free' things there is a price to pay: a truncated file with only 30 URLs.

We are the number one ranked site out of about 14 million in Yahoo, MSN etc. but sit at 300-ish on Google. We have tried everything to push it up the rankings, but it is a very competitive business. Only 8% of our visitors come from Google, and if we were in the top 10 that should rise to 80% (5,000 uniques per day).

I had an old XML map on there for about a year, but it doesn't meet Google's specification and was ineffective.

cheers TS
GilesGuthrie
I have written an automatic news feed. I'll post the code in here, and it should hopefully get you started.
 
Thanks, I'll try that out.

I don't run any forums... it's e-commerce, shopping system integration, payment gateways etc. I am a PayPal applications developer and database designer using the MySQL platform.

I could PM you a couple of the site URLs if you wish; I like to keep my client information private.

Impreza04
I found this quite useful for Google Sitemaps:

http://www.auditmypc.com/free-sitemap-generator.asp

It gets everything and follows robots.txt.

If you have a forum it helps more if you identify your IP as a robot, log out (important, as I found it is recognised as your account if you don't) and run it.

For large sites, it's best to leave it running overnight :P
 
Then you shouldn't really have a problem, as long as you don't have session IDs. And I imagine you wouldn't have a lot of public pages :)
 
Impreza04
Then you shouldn't really have a problem, as long as you don't have session IDs. And I imagine you wouldn't have a lot of public pages :)

Plenty of session IDs: over 4,000 items in people's shopping carts globally at any one time... 1000+ public pages.

I want at least some of them listed in Google.
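One thing I can do is keep PHP from appending session IDs to URLs, so crawlers see clean links. Something like this at the top of each page (untested, and it assumes real shoppers have cookies enabled):

PHP:
<?php
// keep session IDs out of URLs so crawlers see clean links
ini_set("session.use_only_cookies", "1");
ini_set("session.use_trans_sid", "0");

// don't start a session at all for known crawlers
// (crude user-agent sniffing, but good enough here)
$agent = isset($_SERVER["HTTP_USER_AGENT"]) ? $_SERVER["HTTP_USER_AGENT"] : "";
if (!preg_match("/Googlebot|Slurp|msnbot/i", $agent)) {
    session_start();
}
?>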

Thanks again.
 