Evan Goldin

So you’ve set up your Google Webmaster Tools — and verified your network, as we described recently — and now it’s time to take advantage of all the cool new features you have!

Once you have verified your network, log in to the Webmaster Tools Dashboard and click on your site’s URL. From here, you can see when Google’s robot, the Googlebot, last visited and indexed your page. Google does this periodically, and these crawls are what generate the content you see when searching. Next to “Index Status:”, you can also confirm that the Googlebot is able to index your network’s pages.


More after the jump. And remember, your network must be completely public to be verified by Google.


From there, let’s find out what people are typing to get to your site! On the Overview page, click on the “Top Queries” link right in the middle of the page.

[Screenshot: Top Queries]

This can provide a wealth of information. In the list on the left, you can see what search queries your site is appearing in. In the right column, Google tells you what search queries are actually being clicked on and bringing people to your site. Knowing what people are looking for when they make it to your network can truly help you reshape your network to drive traffic.

Perhaps you’re running an NFL-oriented network called Pigskin Central, and your Google Webmaster data tells you that 90 percent of your clicks are coming from people who have searched for “football videos”. Yet you don’t have any videos. Well, it could be time to re-evaluate that decision!

You can also further refine this data, if there’s a particular category of searches you’re looking to discover:

– You can change the search type, so that you’re purely examining web queries or blog queries or image queries.
– You can change the location, so that you’re only seeing searches on Google.co.uk or Google.com.au or Google.com.
– Just mouse over the timeline to change the time period you’re looking at. This can help you learn how searches and query clicks are changing over time.

But this is just scratching the surface of the data. Let’s head over to “What Googlebot sees”!


On this page, you can see what text other sites are using to link to you. This can be very insightful, because you can learn what other sites think your network is about. And that matters, because it can greatly impact your traffic and your placement in search results.

If you designed your network to be a place for New York Giants fans to come together, but most of your links say “San Francisco Giants,” you’ve got communication work to do!

Next, it’s on to the “Links” section. Click on “Pages with external links”.

[Screenshot: Pages with external links]

This page is also rife with valuable data. From here, you can find out which pages are being linked to by external sites and other networks. If 90 percent of the links point toward your Forum, you know which section you want to work on improving. By seeing where outside traffic is coming from (and which pages or sections are the first viewed by potential members), you can really know how to prioritize improvements and additions to your network.

“Pages with internal links” can be similarly insightful. You can find out which pages and sections inside your network are being linked to. This is better at helping you understand what pages are helpful to and popular with your current members.

The last of the major features we’ll go over is using robots.txt and “Remove URLs” to block Google’s access to parts or pages of your network. Click on “Tools,” then “Analyze robots.txt”.

Robots.txt is a text file that you can place on your site to prevent cooperating search engines from visiting (and adding to their search results) specific pages in your network or, if you want, your entire network! Obviously, this isn’t a way to drive up traffic, but it can be a very useful tool.
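To make that concrete, here’s a minimal robots.txt sketch. The paths below are purely hypothetical placeholders — substitute whatever pages or sections of your own network you actually want to hide:

User-agent: *
Disallow: /page/private-stuff
Disallow: /forum/topics/members-only

A single “Disallow: /” line would tell cooperating crawlers to skip your entire network, while leaving the Disallow value blank allows everything.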

Unless you’re ahead of the game, the “Status” on this page should read “404 (Not Found)”. If you don’t already have a robots.txt file, open up a text editor and head over to the Web Robots page. Figure out what you want to block and how to block it, then save the file as robots.txt. Upload it to your site using the Site Editor by visiting:

https://www.ning.com/?view=apps&appUrl=yournetworksubdomain&op=edit

Where yournetworksubdomain is replaced by the subdomain of your network’s address. Make sure you’re looking at the root (top level) of your network before uploading.

Once you’ve set up your robots.txt file, Google Webmaster Tools’ “Analyze robots.txt” page will check whether it is working and what effect it’s having. To expedite the removal of a page, add it to robots.txt and then click “Remove URLs” in Google Webmaster Tools.
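If you’d like to sanity-check a rule yourself before asking Google to remove anything, a short script can fetch your live robots.txt and report whether a given page would be blocked. This is just a rough sketch using Python’s standard library; the network address and path are placeholders, not real URLs:

# Rough check of a robots.txt rule (the addresses below are placeholders).
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://yournetworksubdomain.ning.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt file

# False means cooperating crawlers such as the Googlebot should skip this page.
print(rp.can_fetch("Googlebot", "http://yournetworksubdomain.ning.com/page/private-stuff"))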

Those are just the major features! Give Google Webmaster a try for yourself and watch that traffic grow!