Science Forums

Posted

I put up my website with one of the hosting companies a week ago. Now, a week later, when I type my website into a search engine (Google, AltaVista, Yahoo), it does not come up. What's the deal? Some of you IT buffs :) must know how this works. Where could the error be? Help :confused:

Posted

A few things...

 

Do you have a sitemap.xml and robots.txt file in your root directory? If not, you should create them. Go to Create your Google Sitemap Online - XML Sitemaps Generator to get one automatically generated. For the robots file, you can create one in Google Webmaster Tools: https://www.google.com/webmasters/tools/home?hl=en

From there, you can verify and submit your site to Google for free. Other search engines have similar tools. Do a search.
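A sitemap.xml file is just an XML list of your pages' URLs in the sitemaps.org format. If you'd rather see what the online generator produces, here is a minimal Python sketch (the example.com URLs are placeholders, not real pages):

```python
# Minimal sitemap.xml generator -- a rough sketch, not the XML-Sitemaps.com tool.
# The URLs below are placeholders; substitute your site's real pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap.xml document (as a string) listing the given URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    pages = ["http://www.example.com/", "http://www.example.com/services.html"]
    print(build_sitemap(pages))
```

Save the output as sitemap.xml in your root directory, next to robots.txt.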

 

Something to keep in mind is that sometimes it takes a while for search engines to index a site. Following the steps above should help speed things up though.

 

Once it's listed, the next step is to do some SEO, but that's a whole different game...

Posted

Freezy, thanks a bunch. Here is what happened: The hosting site has an automatic verification button/page for all newly signed-up domains in the control panel, to verify their domain with Google Webmaster. I was having problems getting the verification from Google through the host's webpage--my control panel. I suppose this verification enables the spider to map the website. Let me know if you think that's incorrect.

 

Anyway, after I read your post I went to Google Webmaster's webpage and registered my domains under my Google account. To do that, Google generated a META verification tag for me to place in my HTML. I did that and was able to verify the domain with Google through Google Webmaster in my Google account, but not in the host's control panel.
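For reference, the META verification tag Google generates looks roughly like this once pasted into a page's head section (the content value here is a placeholder; Google issues a unique token per account):

```html
<head>
  <title>My Site</title>
  <!-- Placeholder token; Google generates the real content value for your account -->
  <meta name="google-site-verification" content="YOUR-TOKEN-HERE" />
</head>
```

Google's crawler fetches the page, sees the token, and that proves you control the site.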

 

Now, do I just have to wait for the spider to crawl it? Does this sound right?

Posted

It doesn't have to be sitemap.xml; a sitemap.htm or .html works too (with your sitemap page linking to all the pages on your site).
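A hand-made sitemap.html can be as simple as a page of links; for example (the page names here are just placeholders for your own pages):

```html
<!-- sitemap.html: one link per page on the site; filenames are placeholders -->
<html>
<head><title>Site Map</title></head>
<body>
  <h1>Site Map</h1>
  <ul>
    <li><a href="index.html">Home</a></li>
    <li><a href="services.html">Services</a></li>
    <li><a href="contact.html">Contact</a></li>
  </ul>
</body>
</html>
```

As long as the spider can reach this page, it can follow the links to everything else.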

 

I think the number of pages helps too; something like a minimum of 10 pages. I'm not sure exactly where the top and bottom ends are.

 

My pages were last cached by Google on 2/9/10. Google seems to re-cache about every two weeks, sometimes only once a month.

 

The more changes you make to a site, the more often Google updates it.

 

Links on other websites back to yours boost search results.

 

2 years ago a keyword search on a primary aspect of my site put me on page 11 of google results. Now I come in on the first page of google results.

 

Wait until you start going through your log files :confused:

Posted
Freezy, thanks a bunch. Here is what happened: The hosting site has an automatic verification button/page for all newly signed-up domains in the control panel, to verify their domain with Google Webmaster. I was having problems getting the verification from Google through the host's webpage--my control panel. I suppose this verification enables the spider to map the website. Let me know if you think that's incorrect.

 

It seems like that feature basically automates the verification process that you have to do through Google so that they can ensure that you are the owner of the site.

Anyway, after I read your post I went to Google Webmaster's webpage and registered my domains under my Google account. To do that, Google generated a META verification tag for me to place in my HTML. I did that and was able to verify the domain with Google through Google Webmaster in my Google account, but not in the host's control panel.

 

Now, do I just have to wait for the spider to crawl it? Does this sound right?

Yeah, it's a waiting game.

 

I would definitely check your robots.txt file as that is even more important if the search engines can't see you. Basically, the robots file sets the permissions for search engine bots to crawl and index your site. If it is set with a "no" permission, then the bots will not crawl your site. Just generate it from the Google webmaster tools site and that will work for all the other search engines as well. :confused:

Posted

Freez, I am unclear about the function of robots.txt. Google explains that one should be generated only if I want to limit the access to some pages in my domain. So I am unsure why I should do this to improve searches. It seems inconsistent with what Google suggests.

 

Here is the message from Google: "If your site has content you don't want Google or other search engines to access, use a robots.txt file to specify how search engines should crawl your site's content."

Posted

Make sure someone else links to it, too. Google needs to find it via the web and see that it's not an isolated island.

 

It takes time to get listed, sometimes weeks and months. Have patience while you improve your website. :confused:

Posted
Make sure someone else links to it, too. Google needs to find it via the web and see that it's not an isolated island.

 

It takes time to get listed, sometimes weeks and months. Have patience while you improve your website. :confused:

 

Thanks for the encouragement, Tormod. My site is merely an engineering services website. Nothing fancy. I am not sure I can really cross-link it much, other than maybe subscribing to a few of the trade listing services that are available.

Posted
Freez, I am unclear about the function of robots.txt. Google explains that one should be generated only if I want to limit the access to some pages in my domain. So I am unsure why I should do this to improve searches. It seems inconsistent with what Google suggests.

 

Here is the message from Google: "If your site has content you don't want Google or other search engines to access, use a robots.txt file to specify how search engines should crawl your site's content."

 

Right, and you just want to make sure that your host hasn't generated a robots file for you with permissions denied. If you don't have one already, then you don't need to worry. If you do have a robots file, delete it or overwrite it with one that Google generates with all access allowed.

 

Here's what one of my robots.txt from a website I want fully indexed looks like:

 

```
User-agent: *
Allow: /
```

 

If you don't want the spiders to crawl certain pages (admin pages, proof pages, etc.), the robots file can be configured to allow access only to the pages you want.
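For example, to keep spiders out of a couple of private directories while leaving everything else crawlable (the directory names here are hypothetical):

```
User-agent: *
Disallow: /admin/
Disallow: /proofs/
```

Anything not matched by a Disallow line is crawlable by default, so you don't need to list every public page.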

 

In my experience, once a sitemap is submitted to Google, it takes less than 48 hours for it to show up in search results, but, ymmv.

 

Make sure you don't have any crawl errors in your Google webmasters tools dashboard. It should just show dashes all the way down (no errors).

Posted

No prob. In the meantime, set up a Google Analytics account so you can track your traffic. It takes a day or two to start generating data, but once it does, you can be dead sure that Google has your site indexed.
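When you set up the Analytics account, it hands you a tracking snippet to paste into every page; it looks roughly like the current asynchronous version below (UA-XXXXX-X is a placeholder for your own property ID, which Analytics generates for you):

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);  // placeholder: your property ID goes here
  _gaq.push(['_trackPageview']);
  (function() {
    // Load ga.js asynchronously so it doesn't block page rendering
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```

Use the exact snippet from your own Analytics dashboard rather than copying this one, since it comes pre-filled with your ID.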

Posted

I just put up my first (multi page, commercial) website this week, but made sure I did at least some homework on SEO type tasking before publishing.

 

All that is in this thread seems like sound advice.

 

The one that stands out to me as most vital in the early going is establishing quality links across the Web back to your site. It both creates marketing buzz and makes the search robots very happy/hungry for your site.

 

Adding dynamic/syndicated data is a close second, but less applicable if the site is static. If it's at all possible to include a blog in the site, that can help.
