Showing posts with the label SEO.

Thursday, 14 March 2024

Google Search Console Core Web Vitals: Cumulative Layout Shift (CLS) can be caused by a website's menu


This is a rapid note! I have thought this for a while: the Cumulative Layout Shift (CLS) reported in Google Search Console's Core Web Vitals can be caused by the menu on a website.

That sounds odd, but a menu that is quite heavy, with lots of submenu items, can cause CLS issues that a very light one does not.

This will have a negative impact on the Google Search Console assessment and it may have a slightly negative impact on Google search engine results.

I'm just flagging up this point for anybody who is struggling with CLS as I did for a while. I have resolved the problem on my WordPress website with a couple of plug-ins and by reducing the menu to a very simple state where there are no submenus. It looks weak and rather simplistic but it works.
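For anyone curious about what the CLS number actually is: it is accumulated from "layout-shift" entries that the browser reports (on a live page they come from a PerformanceObserver). Here is a minimal sketch in JavaScript of how those entries add up; note this is a simplification, as the current CLS definition groups shifts into session windows rather than summing everything:

```javascript
// Simplified sketch: accumulate a CLS-style score from layout-shift
// entries. On a real page these entries come from a PerformanceObserver
// watching "layout-shift" entries; here they are plain objects.
// Shifts that happen just after user input are excluded, as in the
// real metric.
function cumulativeLayoutShift(entries) {
  return entries
    .filter((e) => !e.hadRecentInput)
    .reduce((total, e) => total + e.value, 0);
}

// Example: a heavy menu rendering late produces a large shift score.
const sampleEntries = [
  { value: 0.02, hadRecentInput: false }, // late-loading image
  { value: 0.15, hadRecentInput: false }, // submenu pushes content down
  { value: 0.30, hadRecentInput: true },  // user-triggered, not counted
];
console.log(cumulativeLayoutShift(sampleEntries).toFixed(2)); // "0.17"
```

A score above 0.1 is where Google starts flagging CLS as needing improvement, which is why a single heavy menu can tip a page over.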

There may be compounding factors, but Core Web Vitals issues can be complicated and hard to fix - almost impossible sometimes - so a simple fix is welcome.

I don't think menus are that important anyway. How many visitors click on menu items? Very few, I would say, based on my experience.

-------

P.S. please forgive the occasional typo. These articles are written at breakneck speed using Dragon Dictate. I have to prepare them in around 20 mins.

Monday, 26 May 2014

Internet Writers Waffle For SEO Reasons

It can be difficult to write a concise, compact article for a website because it might be too short for search engine optimisation (SEO) purposes. Sometimes a writer might simply wish to answer a question in an article. The answer to the question may take one line of text. That would not be enough for SEO purposes.

Sometimes the nub of a news story can be stated in a couple of lines or in a paragraph of text. But a paragraph of text amounts to about 50 or so words, and that is not enough for SEO purposes.

Classically, the average internet article is somewhere between 300 and 1,500 words, averaging perhaps 500 to 700 words. Writers feel compelled, or are told by their editors, to write articles of around 500+ words.

If the topic that they're writing about really demands far fewer words, then a lot of what they write will be padding - what I refer to as waffle.

A lot of internet readers want to find the answer to a question that they have, and they want to find it quickly, in the first paragraph of text. But often internet writers waffle their way through about three initial paragraphs before even beginning to answer the question or provide relevant information.

I would hope that the search engines become more refined so that a person can write even one word as an article and that that article is found successfully by search engines and ranked highly provided the single word completely addresses the topic at hand.


Thursday, 8 May 2014

Google Blogger Is Better Than WordPress

Yes, Google Blogger is better than WordPress in my opinion. I've used Google Blogger for about 7 years. I have used WordPress for about 2 years. I've also had my website hosted by SiteSell for about 5 years. Don't use SiteSell.

Note: I am referring here not to websites hosted by WordPress but to sites using the WordPress software (the code) while the site is hosted on a commercial server like Hostgator -- you see, it is bloody confusing ;). That is the first point. WordPress = confusing; Blogger = simple.

Why do I think Google Blogger is better than WordPress?  Firstly, Blogger is more reliable.  It is way more reliable.  I have a clear sense that Google engineers are better than WordPress engineers. That makes sense because Google is an enormous company worth many billions of dollars with thousands of engineers writing the very best modern code.

A recent problem highlights what I mean. Every so often WordPress issues an update to the core code, and the latest update was 3.9. It proved to be incompatible with certain plug-ins and themes, the visual editor does not work properly for a lot of people, and image manipulation and positioning is a lot worse than in the previous version, which is causing a lot of problems for a lot of people. I needn't go on, but the truth is you don't want to update your WordPress software the moment they say you have to, because you're bound to get a problem somewhere when, for example, a bloody plug-in does not work properly after the update.

There are thousands of plug-ins for WordPress, but many of them - if not all of them - run scripts, and they may slow down the load time of your site: script code placed at the top of the page blocks rendering while the browser loads the webpage. Just another glitch. This is important, though, because search rankings (SEO) partly depend on load times.

WordPress is more flexible than Google Blogger. I will concede that, but you don't need flexibility beyond a certain point, and Google Blogger has a perfectly adequate amount of flexibility to create a website with real visual appeal.

I almost forgot. Google Blogger is entirely free. If you want a unique URL (a custom domain) you have to pay for that, but domains are normally very cheap. The actual hosting of the website is entirely free.

As for WordPress, there will be a fee to host the website of between $100 and $2,000 per annum, perhaps even more, depending on what sort of hosting you choose.

The most important benefit of using Google Blogger is reliability. You have peace of mind, and most people these days simply want to focus on writing and not be bogged down with coding. With WordPress you really do need to know a bit about code, because not infrequently you are involved in patching up incompatibility issues or problems here and there.

And the code is complicated. WordPress code isn't just straightforward HTML. It does depend on what version of WordPress you are using, but unless you're comfortable with coding you might be intimidated. In addition, even simply running the control panels of plug-ins, and your website generally, can be complicated. Google Blogger is much more straightforward, which allows you to focus on what matters: content.

Lastly, as Google Blogger is a Google product, if you consistently write good (excellent) content on your Google blog, Google will like it and it will be found in search results. So the last good point about Google Blogger, and one reason why it is better than WordPress, is that it is inherently better from an SEO point of view, and you don't have to do any fancy SEO work to achieve that (unlike WordPress).

If you have questions please leave a comment.

Sunday, 4 May 2014

Cat Websites: Success Not Dependent upon Quality

The Internet is not what visitors to websites think it is.  The quality of a website is not the major factor in how many visits a website gets.  Quality of content is surprisingly a secondary factor. The major factor is how well it is advertised. And one of the best ways to advertise a website is via Facebook. 

A huge number of website owners advertise their website through Facebook. In other words, they create an article on their website and they put a link to that article on their Facebook webpage (it is very easy to create a Facebook webpage). Then they build a massive audience for their Facebook webpage. They do this by paying for "likes" through businesses that charge a low fee.

Most of these businesses are in Asia, where they can pay homeworkers a very small amount of money to "like" any webpage you want, including a Facebook webpage.

There are very many Internet businesses who will guarantee to get for you thousands of "Facebook Likes" to your Facebook webpage.

When the website owner puts a link to their website on their Facebook webpage, everyone who likes the Facebook webpage receives a notification containing the link. Then they click on the link and go to the website. The result is that the website gets more hits, and this is a form of advertising.

That is the reality of the internet. I doubt whether anyone is interested but they should be because the websites with the best content should get the most hits.

What is the best content? That is another story but it is not really the best content....


Thursday, 15 March 2012

New ways to present information on the internet

We need to look for new ways to present information on the internet. We needn't be stuck in the rut of simply writing about it. I am referring to mapping on this occasion. Google Maps have advanced quite dramatically, recently. We (independent website builders) owe it to ourselves to use Google's free software to our advantage as we are so painfully dependent on this internet giant that dominates us. And it might not be free indefinitely.

Almost any information can be presented on a map because almost all information can be referenced to a place. Take cats, my pet subject! I have mapped USA animal rescue, UK animal rescue sanctuaries, tiger reserves and more. A lot of information can be presented more effectively on a map and there are hidden SEO benefits as well, which I touch on below.

The map below shows animal rescue sanctuaries in the UK:

[Embedded Google Map: UK animal rescue sanctuaries]

There are a lot of directory websites that list businesses and other organisations. Some of them provide directions to the listed business. And some provide a map below the address. The mapping and directions are secondary to the written information - the address etc. We can turn that upside down and make the location the primary information and the other details secondary. Where the location of a business is of primary importance this is a better way of presenting the information. Google Maps allows us to do that.

An example, in the world of cats, is boarding catteries (cat hotels). The location of the cattery is very important. You need to minimise travel time to the place (to reduce stress for you and your cat) and you need to see the place beforehand. Information about boarding catteries is best presented on a map. This principle applies to many different sorts of information. You can use your own imagination.

SEO

There is an unexpected SEO benefit from web pages that contain an embedded Google map. Visitors will stay on the site longer as they explore the map, zooming in and out and clicking on the place markers to see the information contained inside the markers. The Alexa website ranking is based on a number of criteria, one of which is the amount of time spent on the site. You will probably find that your Alexa ranking will improve if you have some prominent and well produced maps on your site.

Mapping as a way of presenting information is an emerging trend and I recommend that you join the trend now to get ahead.

Google Maps - ways to map

There are essentially two ways to use Google maps that I know of:
  1. Manual input. You find the location using Google maps or by yourself and then you place the marker at that location and add the details.
  2. Automatic input using Google Fusion Tables. This is a beta program in development. You create a spreadsheet containing information in columns. One column contains information that allows Google to map the organisation that you have listed. A good address can suffice. Google allows other source information to be used. Using Fusion Tables you have to trust Google to map accurately, so it is important to provide good information so as not to mislead or confuse Google Maps. Fusion Tables is the only practical way to map large amounts of information, as manual mapping of, say, 4,000 businesses would take about six months full-time!

Thursday, 26 January 2012

Google Plus One under fire already

26th Jan 2012  - Europeans will be given the right to delete information that has been collected by Google under proposed legislation. This was announced by the European Commission on 25th Jan 2012.

We are told that Google and Facebook will be subject to tight limits on the amount and kind of information that they can hold on their members.

The legislation is set to come into law in about two years' time, provided of course that it is passed by the European Parliament after being ratified by member states. It makes sense, though, because there is little or no regulation of the internet. This allows the biggest internet sites to do as they please, to their advantage. This worries ordinary people and chimes with the recent Occupy protests across Europe and America.

The existence and success of Google Plus One (+1) depends on the collection of personal information, the likes and dislikes, of individuals who go online. This news must cast doubt on the long term viability of Google's recently introduced +1 search and social media programme.

I hope that you have noticed (many have not) that there are now two kinds of Google search: a search based on your Google +1 preferences and socialisation and secondly a conventional search that is more objective. The first finds information that is personalised for you and the second finds information that is meant to be the best on the internet. I say "meant" because even god almighty Google gets it wrong too frequently. It favours the big sites and these are far from correct every time.

The sole reason why Google changed its search algorithm and search method was to force people to use Google Plus One. People were not doing it voluntarily. Have you noticed the low +1 numbers compared to the high Facebook "Like" numbers?

This was a dangerous ploy by Google because their success as a company is built on finding the best sites and web pages on the internet. To connect that objective to one that forces people to use Google as a social media network must compromise Google's ability to provide the best search results.

You can turn off the social media search element if you want to and if you can find the right button to push. Most people won't know that personalized search is on never mind know how to turn it off. Google are relying on that.

Sunday, 11 December 2011

How fast should a web page load?

ANSWER: as fast as you can turn the pages of a book. This is an important question these days as Google has decided that the next thing that they are going to improve on the internet is page load time. Google can dictate to all the rest of us. They manage the internet. They feel duty bound to ensure that the internet improves and expands because it benefits Google, financially. They have to govern the internet because no one else is or can.

If they can improve the internet and make sure that it is used more, they will make more money. There is still a lot to do to improve the internet. There are still a lot of people who don't use the internet. And there are still a lot of internet providers who provide a slow service. I am guessing, but Google is probably thinking of the important emerging markets: South America, India and China (when they stop censorship).

In order to ensure that pages load at a reasonable speed across the globe, Google has changed their algorithm to force people who are internet publishers to lighten their webpages so that they load fast.

To assist in this task they have a website that allows you to check the load speed of your webpage by their standards. These are their standards, please note. They assess the page, provide a mark out of 100 and offer advice. Follow it.

Google have demonstrated their keen desire to improve the internet in terms of page load speed by rejigging AdSense load methods so that it does not delay page load. You will see this. The AdSense loads after the page.

CONCLUSION: Make sure that your pages load fast. Redo the images. Make them lighter. Is a GIF a better format? Do you need all those third-party scripts? Is the page too long? They all slow load times. The modern idea is a process of simplification: less science and more practicality. Don't make a page too good by making it too long. Think concise.
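To put "load fast" into rough numbers, here is a back-of-envelope sketch in JavaScript. The bandwidth figure is my own assumption for a slow connection, not anything Google publishes:

```javascript
// Back-of-envelope estimate of page load time from total page weight.
// Purely illustrative: real load time also depends on latency, caching,
// render-blocking scripts and so on. The 1.5 Mbit/s default is an
// assumed slow-connection bandwidth.
function estimateLoadSeconds(totalBytes, bandwidthBitsPerSec = 1500000) {
  return (totalBytes * 8) / bandwidthBitsPerSec;
}

// A 1.5 MB page on a 1.5 Mbit/s connection: roughly 8 seconds just to
// transfer the bytes -- far slower than turning the page of a book.
console.log(estimateLoadSeconds(1500000)); // 8
```

Halving page weight halves this transfer time, which is why lightening images and trimming scripts pays off so directly.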

Wednesday, 9 November 2011

Is SEO Dying?

The answer is YES. For people who don't know, "SEO" means search engine optimization. In practice it means creating a webpage that has a better chance of being found and listed highly by Google and the other search engines.

A lot of website builders use SEO. A lot of people teach SEO. It is big business, and big business wants SEO to remain important, so they will talk it up. But it is artificial. Google doesn't like artificial methods that get a webpage noticed. It simply wants the best content and webpages of any kind to be on the first page of a search, irrespective of how good the SEO is, because Google is a business and its purpose is to find the best on the internet without qualification. I guess that is common sense, except for one modern exception: Google has been accused of biasing its search results to favour those that make Google more money! As at August 2020, this has resulted in an inquiry by the US government.

SEO is dead because it allows bad sites to get good Google listings. Google does not like that. If you use Google Blogger (Blogspot), the free website building application, you can ignore SEO completely and it has no detrimental effect on Google's search listing, and that is all that matters.

Recently Google downgraded websites that were not from branded names (e.g. Yahoo, About and Wikipedia) and which were created using the best SEO techniques. This is another sign that Google is bypassing the SEO sites. It wants the best not the best SEO.

Yes, SEO is dying and the businesses built around SEO, of which there are hundreds of thousands, are dying too - gradually. It is the beginning of the process. Website owners are still propositioned by SEO businesses but they are dying out too. They realise that their time is up. Google has always punished those who overdo SEO. It is called black hat SEO.

Google is the most important element of the internet. Please the company. Create a Blogger site. Work well. Write well. And make good content consistently and you can forget bloody stupid SEO!

I feel I have to add an update. For many years now I have completely ignored SEO on the websites that I write for and manage. There has been no negative impact in terms of Google search results. In fact, my experience tells me that if you overdo it, as I mention above, it has a negative impact on search results. It also slows down your writing and makes articles less natural and more artificial, because you have to knit the keyword into the article. At best, you can add a little bit of SEO if you feel like tweaking the article, but by and large I would ignore SEO in 2020. In fact, it has been pretty much redundant for many years. Google has become too refined and sophisticated to be fooled by SEO. It doesn't need it. In the past it did, but we're talking about 13 years ago. These are my observations based upon first-hand experience.

Saturday, 9 May 2009

Delete Images with Caution

When amending or upgrading a page of your website delete images with caution. It takes a considerable amount of time for Google image search to favour your images; perhaps 18 months for an image to appear on page one of an image search results page.

If you are thinking of changing the image on the page that you are working on it is advisable to do a search in Google for images under the name of the html file of the page that you are working on.

For example, I am working on my “Cat Facts” page on which there are several images. On a Google images search for “cat facts” I noticed that three images from the current page are on the first 2 pages of results.

I will make sure that those images stay put and I intend to work around them, deleting text around the photo and rebuilding that way to preserve the hard earned success of the current situation.


Tuesday, 5 May 2009

Google Image Search builds Traffic

Google image search builds traffic. I run this Blogger site which is part of the main site: http://www.pictures-of-cats.org. Pictures of Cats.org (PoC) is about pictures of cats, for sure. But it is about much more than that. There is a ton of information on the site; lots of words. More words, in fact, than pictures. Yet most of my traffic to PoC comes from Google Image Search. This is such an important traffic builder, but I sense that it is a somewhat forgotten, second-string thing; something that just works in the background to add an extra bit of traffic.

As the internet gets faster with improved broadband speeds (and in the long term it will get even faster) the old clichés about using small picture files to keep page load times quick are becoming redundant. We were advised to upload pictures not larger than 15,000 bytes. This is a small image. I use larger sizes, up to about 30,000 bytes. If there is only one image on the page I go up to 80,000 bytes. However, image file size needs to be considered every time to ensure decent load times.
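My rule of thumb above can be written down as a tiny helper. This is just my own codification of the figures in this post, not an official guideline:

```javascript
// The author's personal rule of thumb for image file sizes:
// up to ~80,000 bytes when the image is the only one on the page,
// otherwise up to ~30,000 bytes each. (The old advice was a maximum
// of 15,000 bytes per image.)
function maxImageBytes(imagesOnPage) {
  return imagesOnPage === 1 ? 80000 : 30000;
}

console.log(maxImageBytes(1)); // 80000
console.log(maxImageBytes(4)); // 30000
```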

Here is an image of the top 5 referrers to PoC (referrers means those sites and pages from which traffic to PoC arrives) for May 2009 (to day 5):

[Image: top 5 referrers to PoC, May 2009]

The important thing to note is that the No.1 referrer is Images.google with www.google a close second. In the top 20 referrers 8 are image searches. As said, Google image search builds traffic.

I don’t do anything special to get the images seen by Google. I simply ensure there is an “alt” tag. Even when there is no alt tag Google finds them. I find it is best to leave the pictures alone once on the page. Rotating pictures to freshen up the page is not a good idea as Google does take its time to find and list images. This is not because of something Google is doing but because of the enormous number of images on the internet. It is wiser to refresh with new text.

I tend also to make sure the image and alt tag chime with the subject matter of the post to ensure that the post or article is focused and coordinated. Please don’t forget that Google image search builds traffic. And when using Blogger it is best for SEO purposes to have a Picasa Web account and to upload the pictures using the compose mode image uploader. These images are saved on Google servers and listed in your Picasa Web album. Google image search lists Picasa Web album images so make them public if you can.


Wednesday, 29 April 2009

How to Get Traffic to Your Blog

How to get traffic to your blog. This is what most of us think about. There is no point, really, in writing a blog unless someone is reading it and the more the better if you are trying to make a bit of money at the same time. These are my points in getting traffic to your blog:
  1. Content and patience are king. Keep writing genuinely useful content that people want to read and look at. Have the patience to allow Google and the other search engines to find and list your blog. It will take months to get any real traffic. This is a kind of test for us. Google weeds out the people who do not have staying power. I am sure that part of the very complex Google algorithm is a formula which asks whether the blog has been around for "x" amount of time and whether new posts have been added over that time. Google is far bigger than the others, so you'll need to be found and listed by Google in a search. That is one reason why I use Google Blogger. Google understands its own products, and I say it will tend to favour Blogger very slightly over, say, WordPress.
  2. Keep posts fairly snappy but content-rich. The attention span of modern visitors is short!
  3. Use Keywords (What are Keywords). These tell you what people are searching for and you can research supply and demand of keywords. Supply in this context means the number of websites that provide information on a particular subject and demand in this context means the number of people looking for information on a particular subject. You might try Wordtracker. I use SBI (Sitesell) Brainstorm It!
  4. Although less important today, the title to the post should be a selected keyword that has high demand and low supply.
  5. The keyword referred to at 4 above should be used in the post (but not over used) and as “alt” tags on photos.
  6. The keyword should be used in the first line or two of the post.
  7. The keyword should be in a link in the post.
  8. You should get inbound links to your blog to improve PageRank. This helps to get your blog listed higher by the search engines. This will come in time if the blog is good, as people will want to link to it, but initially it means pushing things along by (a) submitting to article sites, (b) making comments on other blogs and leaving the URL of your blog, (c) getting your blog listed in directories and (d) participating in link exchange agreements.
  9. Get friends to link to your blog if you can.
  10. Join and participate in forums, where your blog is relevant. Include the URL of your blog when appropriate.
  11. Search engines may index your blog using the site feed so make sure it is activated.
  12. Submit your blog to the search engines, for indexing.
  13. Make frequent posts, say one or two a day. As mentioned make the posts as good as possible. Take your time and think long term.
It is hardly ever possible to achieve success on the internet without patience, persistence and determination. Here is another post on the same subject: How to Publicise Your Blog.
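The supply-and-demand idea in point 3 can be made concrete. One common score is Wordtracker's KEI (Keyword Effectiveness Index), usually computed as demand squared divided by supply, so that high demand and low supply give a high score. The keyword data below is made up for illustration:

```javascript
// KEI (Keyword Effectiveness Index), a metric popularised by
// Wordtracker: demand squared divided by supply. Higher is better.
function kei(demand, supply) {
  return (demand * demand) / supply;
}

// Hypothetical keyword research data, for illustration only.
const candidates = [
  { phrase: "cat facts", demand: 900, supply: 40500 },      // KEI = 20
  { phrase: "maine coon size", demand: 300, supply: 1200 }, // KEI = 75
];

// Sort best-first by KEI: the niche phrase wins despite lower demand.
candidates.sort((a, b) => kei(b.demand, b.supply) - kei(a.demand, a.supply));
console.log(candidates[0].phrase); // "maine coon size"
```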




Tuesday, 14 April 2009

How Useful Are Keywords?

How Useful Are Keywords? Keywords used to be the be-all and end-all of website building, as they gave the webmaster a tool to target customers and find the best words or phrases around which a search engine optimized web page could be built. It meant being seen by the search engines rather than being invisible, and God, there is no joy in being invisible. There is no point in doing it if we are invisible.

If you are not sure what keywords are, you might like to read this post: What are Keywords?

Things are changing or have changed, though. For example, this Blogger site tells us what we want to know as to whether keywords are useful or not. In Blogger you can totally ignore them and still get found by Google provided you have a decent site with decent content and inbound links. The last is very important and is a consequence of the first (i.e. good content). I have built nearly all my pages on keywords but it is becoming increasingly irrelevant. Even the guys at Wordtracker said you can ignore them! (Wordtracker is the number one keyword search website).

So you don't have to knit keywords into articles with the same kind of earnestness as before. But they are still a great market research tool. Keywords tell us as surely as a market research company what people are interested in. That I guess is obvious. What people search for tells us what interests people. We should supply what is demanded. Currently one area that is demanded perhaps more than any other is information on internet marketing and by that, in this instance, I mean how to get your site to be effective and actually do something. Only a very tiny percentage of websites actually do something worthwhile.

If a site gets 30 visitors a day, what is the point, except to have some fun building it? And as competition hardens day by day, it gets harder to be seen. So in response to "How Useful Are Keywords?", the answer is they are useful, indeed essential, at least at the initial stages of building a site and particularly with a non-Blogger site. Over time the old rules apply. If what you are selling is good, eventually people will buy it. In other words, if your content is good, in the end people will find it. It just takes longer if the pages are not well SEOed. And keywords form the backbone of SEO work.

I use keywords like this:
  1. I have an idea for a topic.
  2. I check the keywords for that topic. SBI provide that service or you can use Wordtracker for instance.
  3. I pick a keyword that (a) has high demand and low supply (b) can be used in the article - some are simply unsuitable to be used in an article.
  4. I use the keyword in the article (a) as a title (b) in the first line (c) a little more than average in the article (d) in a link (e) perhaps as an alt tag for a picture.
But as I said, Google is more flexible, particularly with Blogger sites, so using keywords in a formulaic way is less important but still useful.
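The four steps above can be sketched as a small check. This is my own illustrative helper, not an SEO tool; all the names and data are made up:

```javascript
// Rough check of the keyword-placement rules listed above: keyword in
// the title, in the first line, counted in the body, and in link text.
// Purely a sketch of the author's checklist.
function checkKeywordUsage(keyword, article) {
  const kw = keyword.toLowerCase();
  const body = article.body.toLowerCase();
  return {
    inTitle: article.title.toLowerCase().includes(kw),
    inFirstLine: body.split("\n")[0].includes(kw),
    timesInBody: body.split(kw).length - 1, // count occurrences
    inLinkText: article.linkTexts.some((t) => t.toLowerCase().includes(kw)),
  };
}

const report = checkKeywordUsage("cat facts", {
  title: "Cat Facts",
  body: "Cat facts are fun.\nHere are more cat facts for you.",
  linkTexts: ["more cat facts"],
});
console.log(report); // { inTitle: true, inFirstLine: true, timesInBody: 2, inLinkText: true }
```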




Alexa Measures Page Views Inaccurately?

For this site there is a massive difference between the page view data provided by Alexa, the page view data provided by Google Analytics and that provided by SBI, the company who host the main Pictures-of-cats.org website. This site is a sub-domain of http://www.pictures-of-cats.org. I say that Alexa measures page views inaccurately.

Here is a comparison between these three sets of data:

[Image: SBI page view data]
This shows the SBI data. There was a drop on the 11th April of about 10% in visitors, but page views went up (12,560). The figures are pretty stable, almost a straight line. These figures are for the www.pictures-of-cats.org site. This Blogger site gets about 3,500 page views per day.

[Image: Alexa page view data]
Sorry, this image is hard to read, but importantly you can see that "yesterday" the page view figure is .000004% while for the 3 months it is .00005%. The daily figure, the first figure, is 8% of the average figure. So page views dropped by 92%!! That is clearly completely incorrect. Page views are very stable for this site, and the figures are large enough, surely, for Alexa to make them reasonably accurate.
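The arithmetic behind those percentages, for anyone checking:

```javascript
// Checking the Alexa arithmetic above: the "yesterday" page view
// figure (.000004%) against the 3-month average (.00005%).
const daily = 0.000004;
const average = 0.00005;
const ratio = daily / average; // daily figure as a share of the average
console.log(Math.round(ratio * 100));       // 8  (the daily figure is 8% of the average)
console.log(Math.round((1 - ratio) * 100)); // 92 (a 92% drop)
```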

People will say that you have to be in the Alexa top 100,000 for the figures to be accurate, but being just outside that mark should, I would have thought, allow Alexa to provide better accuracy than this, particularly as the figures are so important to webmasters who want to progress and eventually get some good advertising.

[Image: Google Analytics page view data]
This shows a 10% Easter drop in page views on Google Analytics. The actual figures are higher as a number of pages don't have the analytics urchin code on them so are not counted. The point is the drop is a tiny fraction of that measured by Alexa.

What is upsetting is that the Alexa figures are the ones that count! Don't get me wrong, I like and am addicted to Alexa! But isn't it possible for Alexa to work with other companies to make figures more accurate for those sites that on the margins of the 100,000 mark or worse? These are often very good sites with great content. After all to get to Alexa 118,000 (the ranking at present for this site) takes a hell of a lot of work. Don't people who work that hard deserve some degree of accuracy? The Alexa page view figures are ruining the Alexa ranking. Because of the inaccurate page view figures the Alexa rank for yesterday was 666,000 or so, a dramatic fall that will hurt the 3 month average (note: things have changed one month later. Alexa ranking is now under 100,000 and getting better for the time being, but what I say above did occur and it is indicative that something was wrong at least at that time).

If Alexa can't make things accurate, people won't use them. OK, the people who have benefited from the recent changes will love Alexa, but the bottom line is that Alexa must be built on accuracy, otherwise the whole thing lacks meaning. Is it just for the big boys, who represent a tiny fraction of one percent of the total number of internet users? That can't be correct, surely?

Update: things have changed for the Pictures of Cats org website for the better! Not sure why but the Alexa ranking is climbing and page views are too. I have always had difficulty with Alexa but maybe things are improving. If so good on Alexa.




Monday, 13 April 2009

Change the title tags for your blog

I just changed the title tags for this blog as described by Blogger Buster and it didn't work properly. It may have been my mistake, I'm not sure, but the tag <title> was showing in the title as viewed by the browser! The idea is to improve SEO by presenting, at the very top of the page (above the browser menu, top left) and in the Google search listing, only the title of the article (the post) and not the name of the blog as well. In other words, at the top of the page and in the Google search result you get this:
  • "Change the title tag for your blog" and not:
  • "ABOUT CATS AND OTHER THINGS Change the title tag for your blog"
By the way, if you use custom search on your site, once you have made these changes the search results are easier to read because they show only the title, and not the blog's name before the title. Here is a picture of the Google listing for this post. You can see that the blog's name is missing. Interestingly, Blogger Buster has retained the website's name, though I thought she said she had made these changes; I am not sure.
[Image: title tag change in the Google listing]
The process is easy: you just swap some code. With the new Blogger templates this process isn't necessary, apparently, but for the older templates it is. You can check by following the first part of the process below; if the code you search for is there, it can be changed. Go to Edit HTML and use Ctrl+F to bring up a search box (bottom left of the screen), into which you type or paste: <data:blog.pagetitle/>. That immediately highlights the code in green. This code is very near the top of the template, by the way, just above the template author information and what I call the definitions. You swap that code for this:

<b:if cond='data:blog.pageType == &quot;index&quot;'>
<title><data:blog.title/></title>
<b:else/>
<title><data:blog.pageName/></title>
</b:if>

As I said, I found that doing this left the title tag <title> showing. What I mean is that the actual title tag was visible at the top of the page and on the browser tab. So what I did was to swap not only the code <data:blog.pagetitle/> for the new code but also the tags <title> and </title> on either side of it. That worked fine.
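To make the swap clearer, here is a before-and-after sketch of what I ended up doing. This is illustrative only; the surrounding lines in your template will differ, and newer templates may not have this code at all:

```xml
<!-- Before: the default title line in older Blogger templates.
     Search for it with Ctrl+F in Edit HTML. -->
<title><data:blog.pagetitle/></title>

<!-- After: replace the whole line above, <title> tags included,
     with this conditional block. The index (home) page keeps the
     blog title; individual posts show only the post name. -->
<b:if cond='data:blog.pageType == &quot;index&quot;'>
<title><data:blog.title/></title>
<b:else/>
<title><data:blog.pageName/></title>
</b:if>
```

Note the &quot; entities: Blogger templates are XML, so the quotation marks inside the condition attribute must be escaped this way.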

Thanks, Blogger Buster. I am not sure how effective it will all be, though! Wait and see. One last thing: as the experts always say, save the template before starting. You can download it easily and then upload it just as easily. In this instance a preview does not let you check the result, so you have to save the new code, which makes saving a copy beforehand all the more important.



From Change the title tags for your blog to Home Page

Sunday, 12 April 2009

Alexa Algorithm has Changed

I am sure that the Alexa algorithm was changed at the same time as the website was updated. There was a decision to upgrade everything: what you see and what you get. Some sites have benefited massively and some have fallen. This site has fallen, despite my building a lot of content over the last four months or so.

The Google Analytics figures are more or less stable for this Blogger subdomain and for the main site. There has been a small fall-off in page views, but the Analytics figures show a 0.03% drop over months, which is infinitesimal. Also, my web hosting company shows very little change. But Alexa on one day showed that this site had a traffic ranking of 1.4 million when for months and months it was at the 115K mark. That represents a drop by a factor of more than ten! Alexa is also rating my page views much lower, and I say it is inaccurate. I know a site has to be below 100,000 for the figures to be accurate, but why? Surely that is a problem Alexa needs to address. Also, this site is ranked 118K, close to 100K. But it is falling...! Traffic rank for Sunday was severely depressed:

[Chart: Alexa traffic rank for pictures-of-cats.org, showing the downward trend.
This is not seasonal but sudden, and at about the time the Alexa website changed.]


On the basis of stable traffic figures but a falling Alexa ranking (it is now around the 180K mark, a big drop), I conclude that the mathematical formula used by Alexa has been changed. And it seems that the change favours tech, blogging and social networking sites, as the buzz seems to be that these are more modern and useful to companies like Alexa.

This may be a reaction to the credit crunch, the financial crisis. The old way of doing things is seen as the bad way, and we need to move forward and out of that world. We need new people to manage the banks and business generally. The Alexa people probably reflected on all this and decided that the future was young, modern sites, and that the algorithm had to reflect that. That is my guess, of course. I am probably off the mark a bit, or a lot, but it feels as if the changes are based on that kind of mentality.

[Chart: percentage of world internet users. Published under the Wikimedia Creative Commons Attribution-ShareAlike licence. Author: Kozuch.]

I also think it pays to stand back and look at the big picture to work out what Alexa's intentions are (note: this is just me speculating). Alexa are owned by Amazon, and Amazon are in the internet business. The more people in the world who can use the internet, the more business they will do. Globally, including the developed and developing world, 22% of people use the internet (see chart above). There is plenty of room for expansion. To achieve that, Amazon need to encourage and facilitate internet use. This can be done by encouraging tech sites to flourish, to educate people, and social sites to spread the word. To that end, I argue, Alexa have changed the algorithm to rank these sorts of sites and businesses, including new social networking sites, more highly. Let's not forget that a site can be ranked anywhere by Alexa. They control the rules, and it is not just based on how many visitors the site gets. Google have become the world's most powerful business, I believe, on the back of encouraging internet use by providing free software and products. It is the classic "preselling" technique so talked about on internet marketing sites.

It seems most of the cat sites are affected negatively. i-love-cats, however, is affected positively. Why? Well, it has been around a long time with lots of inbound links, and it does have a forum. That might be a factor. I don't think anyone has actually figured out the underlying changes, but I think what I have said above is relevant.

Update: Well, there is no doubt that Alexa have made changes to their algorithm, as there has been too much disturbance to the traffic rankings. The dust has been kicked up. I say this having checked a number of sites. Some tech sites have improved their rankings by large amounts. However, the day after I reported the above, the traffic ranking, at least for one day, returned to normal for my site, so right now all seems to be kinda OK.

One factor in benefiting would seem to be having a lot of inbound links, basically a decent PageRank. It seems this is more of a factor in their algorithm. For example, Blogger Buster has improved dramatically, while Enviroman's Blogger Tips and Tricks has dropped because he basically screwed up. His site was hosted by Blogger, but he wanted to go fancy and upmarket and have a "proper" URL, so he bought a domain name and redirected his domain from Blogger to another hosting company. The URL looks nicer, but because his URL changed he lost his inbound links, his PageRank, and now his Alexa ranking too, all for a nice URL. Never do this! The URL is unimportant; the content is important. Also, it would have made sense to talk about SEO on Blogger sites if you yourself use a Blogger site.

Further update 14-4-09: The downward trend stopped yesterday but that is probably due to the massive effort I have put in recently!

16-4-09: A lot more sites have mentioned this since I first made the post. I guess that confirms it. It's history already! In hindsight I don't think the changes to the algorithm were big (you couldn't do that, as it would undermine the whole thing), but there were changes, and when they were made there were initial glitches; now things are more settled.

27-4-09: Things have returned to normal. Traffic is up. But why? For me, there was strong evidence that the Alexa algorithm had changed because there was too much change in traffic rankings at the time this post was first made but maybe they changed it back in a panic when they saw the drastic change it had on traffic rankings. I can only speculate.

2-5-09: Yes, another update. Traffic is up for this site and the Alexa ranking is currently improving. So, although changes certainly took place, it is not clear what happened subsequently. I mentioned the website i-love-cats.com, which improved dramatically in the Alexa rankings after the change. That site has now gone back to "normal". It almost seems they made changes and then changed back, but it is not clear.



From Alexa Algorithm has Changed to Home Page
