Tuesday, 14 April 2009

Alexa Measures Page Views Inaccurately?

For this site there is a massive difference between the page view data provided by Alexa, the page view data provided by Google Analytics, and the data provided by SBI, the company that hosts the main Pictures-of-cats.org website. This site is a sub-domain of http://www.pictures-of-cats.org. I say that Alexa measures page views inaccurately.

Here is a comparison between these three sets of data:

SBI page views
This shows the SBI data. There was a drop of about 10% in visitors on 11th April, but page views went up (12,560). The figures are pretty stable, almost a straight line. These figures are for the www.pictures-of-cats.org site. This Blogger site gets about 3,500 page views per day.

Alexa page views
Sorry, this image is hard to read, but importantly you can see that "yesterday" the page views figure is 0.000004% while for the 3 months it is 0.00005%. The daily figure, the first figure, is therefore only 8% of the average figure (0.000004 ÷ 0.00005 = 0.08), so page views dropped by 92%!! That is clearly completely incorrect. Page views are very stable for this site, and the figures are surely large enough to be measured reasonably accurately.

People will say that you have to be in Alexa's top 100,000 for the figures to be accurate, but being just outside that mark should, I would have thought, allow Alexa to provide better accuracy than this, particularly as the figures are so important to webmasters who want to progress and eventually attract some good advertising.

Google page views
This shows a 10% Easter drop in page views on Google Analytics. The actual figures are higher, as a number of pages don't have the Analytics urchin code on them and so are not counted. The point is that the drop is a tiny fraction of that measured by Alexa.

What is upsetting is that the Alexa figures are the ones that count! Don't get me wrong, I like and am addicted to Alexa! But isn't it possible for Alexa to work with other companies to make the figures more accurate for those sites that are on the margins of the 100,000 mark or worse? These are often very good sites with great content. After all, to get to an Alexa rank of 118,000 (the ranking at present for this site) takes a hell of a lot of work. Don't people who work that hard deserve some degree of accuracy? The Alexa page view figures are ruining the Alexa ranking. Because of the inaccurate page view figures, the Alexa rank for yesterday was 666,000 or so, a dramatic fall that will hurt the 3-month average (note: things have changed one month later. The Alexa ranking is now under 100,000 and getting better for the time being, but what I describe above did occur, and it is indicative that something was wrong, at least at that time).

If Alexa can't make things accurate, people won't use them. OK, the people who have benefited from the recent changes will love Alexa, but the bottom line is that Alexa must be built on accuracy, otherwise the whole thing lacks meaning. Is it just for the big boys, who represent a tiny fraction of one percent of the total number of internet users? That can't be right, surely?

Update: things have changed for the better for the Pictures of Cats org website! I am not sure why, but the Alexa ranking is climbing and page views are too. I have always had difficulty with Alexa, but maybe things are improving. If so, good on Alexa.




Monday, 13 April 2009

Change the title tags for your blog

I just changed the title tags for this blog as described by Blogger Buster and it didn't work properly. It may have been me (not sure), but the tag <title> was showing in the title as viewed by the browser! The idea is to improve SEO by presenting, right at the top of the page above the browser menu (top left), the title of the article (the post) and only the title of the post, not the name of the blog as well. i.e. at the top of the page and in the Google search listing you get this:
  • "Change the title tag for your blog" and not:
  • "ABOUT CATS AND OTHER THINGS Change the title tag for your blog"
By the way, if you use custom search on your site, once you have made these changes the search results are easier to read because they only show the title and not the blog's name before the title. Here is a picture of the Google listing for this post. You can see that the blog's name is missing. Interestingly, Blogger Buster has retained the website's name, but I thought she said she had made these changes - not sure.
title tag change
The process is easy; you just swap some code. With the new Blogger templates this process isn't necessary, apparently, but for the older templates it is. Anyway, you can check by following the first part of the following process: if the code searched for is there, then it can be changed. You go to Edit HTML and use Ctrl+F to bring up a search box (bottom left of the screen), into which you type or paste: <data:blog.pageTitle/>. That immediately highlights the code in green. This code is very near the top of the template, by the way, just above the template author information and what I call the definitions. You swap the code for this:

<b:if cond='data:blog.pageType == &quot;index&quot;'>
<title><data:blog.title/></title>
<b:else/>
<title><data:blog.pageName/></title>
</b:if>

As I said, I found that doing this left the title tag, <title>, in the title. What I mean is that the actual title tag was showing at the top of the page and on the browser tab. So what I did was to swap not only the code <data:blog.pageTitle/> for the new code, but also the tags <title> and </title> on either side of it. That worked fine.
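In other words, if your older template has the same line as mine did, the full swap looks something like this (a sketch based on my template only, so check what yours actually contains before changing anything). The original line near the top of the template is:

<title><data:blog.pageTitle/></title>

and that whole line, <title> and </title> tags included, gets replaced by the conditional block shown above.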

Thanks, Blogger Buster. I am not sure how effective it will all be though! Wait and see. One last thing: as the experts always say, save the template before starting. You can download it easily and then upload it just as easily. In this instance a preview does not allow you to check the change, so you have to save the new code, which makes saving a copy of the template beforehand all the more important.




What Are Keywords

What are keywords? The answer to that question is simple. But the answer to the question of how important they are today, and how to use them, is more complicated.

The word ,"Keyword" is actually a misdescription as it includes single words but nearly always means 2, 3 and 4 or more word phrases too. Keywords and key phrases are words and phrases that people enter into search engines to find information about a certain subject.

For example, say I am a cat lover, which I am, and I want to find information on the British Shorthair cat. I might enter "british shorthair cat" into the Google search bar (capitals are irrelevant in search engine use). Or I might search for "british shorthair" or "Brit SH" or "shorthaired british cat". The possibilities are wide. From a webmaster's point of view we like to know two things about keywords:
  • the "demand" for keywords in respect of the subject matter searched for
  • the "supply" for the same keywords that are demanded
"Demand" means the popularity of the various keywords in relation to the subject matter in question. For example, people might prefer or use "british shorthair" more often than "british shorthair cat".

Webmasters will prefer to use keywords with the best demand, because if they can satisfy a high demand they get more traffic. But, and this is important, if the supply for a keyword is also high, meaning there are a lot of websites supplying the demand for information on that particular subject (keyword), then there is high competition and your site will not get onto the first page or two of search results, so no one will find it. Although there might be millions of people searching for information using a very popular keyword (lots of potential traffic), you won't get the traffic because you are invisible on page 10 or so of a Google search.

The ideal keyword, then, is one with high demand and low supply. This equates to lots of people wanting to buy a special mobile phone, for example, but it is only available at one shop in the country. What do you think would happen? The shop would be very busy indeed - plenty of traffic. What webmasters do, therefore, is use programs to search for relevant keywords that have high demand (popular) and low supply (few competing websites). The best known keyword provider is Wordtracker.

Once a keyword is selected it is used as the title to the article and used within the article a little more than usual. One complication is that keywords that are three words long can be hard to fit into an article without things becoming artificial. Take the keyword for this post, "What are Keywords". That is very difficult to work in. I just worked it in once and at the beginning so that makes twice. But to work in "what are keywords" three times is hard - I just did.

That means that single-word or two-word keywords are easier to use, but they are usually not that good because there is high demand and high supply, putting you in the highly competitive zone, and it is depressing to build great content and be invisible. The key is to go for keywords that are lower in demand but correspondingly very low in supply, say 20 websites supplying information on that keyword. In theory that should get your site onto the first two pages at worst, as there are 10 listings on each page.

So the answer to the question, "What are keywords?" is that they are words or phrases that are used to search for information using one of the search engines.

The next question is, are they still useful? Yes and no. Google has got so sophisticated that keywords are less important. Things are less mechanical these days. And you can overdo keywords, which the search engines don't like and can spot. That will go against you. One last thing. The keyword should also be used in the first 90 characters of the article and in a link within the article. Images should have "alt" tags which should ideally also use the keyword, and the photo captions should use the keyword too (see the rough sketch below).
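To make that concrete, here is a rough sketch of what those placements might look like in the HTML of a post. The keyword, file names and link target are made up purely for illustration:

<!-- keyword in the title and within the first 90 characters of the text -->
<h1>British Shorthair Cat</h1>
<p>The british shorthair cat is a calm, sturdy companion and a favourite with families.</p>

<!-- keyword in a link within the article (the target page is just a placeholder) -->
<p>See more <a href="british-shorthair-cat-pictures.html">british shorthair cat pictures</a>.</p>

<!-- keyword in the image "alt" tag and in the photo caption -->
<img src="british-shorthair.jpg" alt="british shorthair cat"/>
<p>A british shorthair cat photographed at a cat show.</p>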




Sunday, 12 April 2009

Alexa Algorithm has Changed

I am sure that the Alexa algorithm was changed at the same time as the website was updated. There was a decision to upgrade everything, both what you see and what you get. Some sites have benefited massively and some have fallen. This site has fallen, despite my building a lot of content over the last 4 months or so.

The Google Analytics figures are more or less stable for this Blogger subdomain and the main site. There has been a small fall-off in page views, but the Analytics figures show a 0.03% drop over months, so this is infinitesimal. Also, my web hosting company shows very little change. But Alexa on one day showed that this site had a traffic ranking of 1.4 million, when for months and months it was at the 115K mark. That represents a drop by a factor of more than 10! Alexa is also rating my page views much lower, and I say it is inaccurate. And I know the site has to be below 100,000 for accuracy, but why? Surely that is a problem that Alexa needs to address. Also, this site is ranked 118K, close to 100K. But it is falling...! The traffic rank for Sunday was severely depressed:

Alexa traffic rank for pictures-of-cats.org, showing the downward trend.
This is not seasonal but sudden, and at about the time the Alexa website changed.


On the basis of stable traffic figures but a falling Alexa ranking (it is now around the 180K mark, a big drop), I conclude that the mathematical formula used by Alexa has been changed. And it seems that the change favors tech sites, blogging and social networking sites, as the buzz seems to be that these are more modern and useful to companies like Alexa.

This may be a reaction to the credit crunch, the financial crisis. It is seen as a symptom of the "old way", the bad way, and we need to move forward and out of that world. We need new people to manage the banks and, basically, business. The Alexa people probably reflected on all this and decided that the future was young, modern sites and that the algorithm had to reflect that. That is my guess, of course. I am probably off the mark a bit or a lot, but it feels as though the changes are based on that kind of mentality.

Chart of the percentage of internet users worldwide. Published under the Wikimedia Creative Commons Attribution-ShareAlike License. Author: Kozuch.

I also think it pays to stand back and look at the big picture to find answers to what Alexa's intentions are (note: this is just me speculating). Alexa are owned by Amazon, and Amazon are in the internet business. The more people in the world who can use the internet, the more business they will do. Globally, including the developed and the developing world, 22% of people use the internet (see chart above). There is plenty of room for expansion. To achieve that, Amazon need to encourage and facilitate internet use. This can be achieved by encouraging tech sites to flourish to educate people, and social sites to spread the word. To achieve that, I argue, Alexa have changed the algorithm to rank these sorts of sites and businesses, including new social networking sites, more highly. Let's not forget a site can be ranked anywhere by Alexa. They control the rules, and it is not just based on how many visitors the site gets. Google have become the world's most powerful business, I believe, on the back of encouraging internet use by providing free software and products. It is the classic "preselling" technique so talked about on internet marketing sites.

It seems most of the cat sites are affected negatively. i-love-cats, however, is affected positively. Why? Well, it has been around a long time with lots of inbound links, and it does have a forum. That might be a factor. I don't think anyone has actually figured out the underlying changes, but I think the points I have made above are relevant.

Update: Well, there is no doubt that Alexa have made changes to their algorithm, as there has been too much disturbance to the traffic rankings. The dust has been kicked up. I say this having checked a number of sites. Some tech sites have improved their rankings by large numbers. However, the day after I reported the above, the traffic ranking has, at least for one day, returned to normal for my site, so right now all seems to be kinda OK.

One factor in benefiting would seem to be if your site has a lot of inbound links, basically a decent page rank. It seems this is more of a factor in their algorithm. For example, Blogger Buster has improved dramatically, while enviroman's Blogger Tips and Tricks has dropped because he basically screwed up. His site was hosted by Blogger, but he wanted to go fancy and upmarket and have a "proper" URL, so he bought a domain name and redirected his domain from Blogger to the other hosting company. The URL looks nicer, but as his URL changed he lost his inbound links and his PageRank, and now his Alexa ranking too, all for a nice URL. Never do this! The URL is unimportant. The content is important. Also, it would have made sense to talk about SEO on Blogger sites if you yourself use a Blogger site.

Further update 14-4-09: The downward trend stopped yesterday but that is probably due to the massive effort I have put in recently!

16-4-09: There have been a lot more sites mentioning this since I first made the post. I guess that confirms it. It's history already! In hindsight I don't think the changes to the algorithm have been big (you couldn't do that as it would undermine the whole thing) but there have been changes and when they were made there were initial glitches and now it is more settled.

27-4-09: Things have returned to normal. Traffic is up. But why? For me, there was strong evidence that the Alexa algorithm had changed, because there was too much disturbance in traffic rankings at the time this post was first made, but maybe they changed it back in a panic when they saw the drastic effect it had on the rankings. I can only speculate.

2-5-09: Yes, another update. Traffic is up for this site and the Alexa ranking is currently getting better. So, although changes certainly took place, it is not clear what happened subsequently. I mentioned the website i-love-cats.com, which improved dramatically in the Alexa rankings after the change. That site has now gone back to "normal". It almost seems that they made changes and then changed back, but it is not clear.




Saturday, 11 April 2009

Solid Cat Coats

The world "solid" as imagined refers to continuous color. As expected, solid cat coats come in a wide range of colors both in high density color and diluted (see diluted cat coats). In the cat fancy, the full colors are black and red. Dilute black makes blue and dilute red creates a cream solid color. Naturally, there are fine shading differences from cat to cat in the same color. It is not unusual for faint tabby markings to be present despite the fact that we are talking about "solid" (and therefore continuous and unbroken color). These are described as, "ghost markings" and are most commonly present in the cream and red cats.

Here are some pictures of the solid cat coats:

solid cat coats

solid cat coats

The pictures are copyright Helmi Flick and they are reproduced from the www.seregiontica.org website with the express permission of the owner. Please respect copyright always - thank you.



