
Google+ Ranks Second among the Largest Social Networking Platforms

The popularity of social networking sites needs no introduction, and the launch of Google+ shows excellent promise: it is proving a tough competitor to the other major social networks. For those who were skeptical about Google+, data from Global Web Index, a company specializing in internet analytics, may come as a surprise. The research shows that Google+ moved ahead of Twitter in the fourth quarter of 2012, securing its position as the second-largest social networking platform.

According to the report, Facebook still enjoys a clear lead over the competition. The largest social networking site, it has over a billion registered members, 693 million of whom are active, and its active user base was reported to have grown 33%. Using information compiled from 31 countries, which together account for ninety percent of the global internet population, Global Web Index counted 343 million active users for Google+ and 288 million for Twitter. The data also showed that Google+ achieved a growth rate of 33 percent in active users, while Twitter grew 40 percent.

The report also compared the sites by the share of the worldwide online population using each actively on a monthly basis: 25 percent for Google+, 51 percent for Facebook, and around 21 percent each for YouTube and Twitter. Google's video-sharing service continues to be appreciated, winning it some percentage points: YouTube moved ahead of Twitter to become the third-largest social network. In the top-ten list of largest social networks, LinkedIn took the 10th spot with over 100 million active users and a ten percent growth rate in active usage.

The 5th to 9th positions in the list belonged to social networking sites that are popular in China, where internet censorship is prominent. Qzone, Sina Weibo, Tencent, Youku, and Tudou are among the Chinese social networks on the list.

Looking Back at Google's Highlights of 2012

The year witnessed some standout changes from Google that had a great influence on SEO, and SEO experts and consultants kept themselves informed about the various updates. Google continued to lay emphasis on quality and standards as the basis for ranking websites, heightening the significance of trust, credibility, and reputation. It also grew more assertive in responding to irresponsible web practices and poor quality, underscoring the high bar Google sets for a website's content and design.

April introduced Penguin, which penalized websites built on artificial link schemes. Google took a strong stand against detaching poor-quality, artificial links: well-planned and well-implemented steps intended to highlight websites with good standards. Giving weight to the credibility of links, Google also disregards links that are unreliable, which gives an idea of how it reacts to websites that rely on untrustworthy links.

Besides penalizing poor quality, Google also spots and rewards high quality, which has certainly proved encouraging to websites. In April, Google announced a crucial algorithm change aimed at reducing the rankings of low-quality sites. The algorithm focused on several bad practices, such as keyword overuse and poor content.

  • The practice of over-optimization also came to light: misusing relevance by stuffing too many keywords onto a page, or by participating in numerous link exchanges.
  • Google expanded the ability of its crawlers to execute JavaScript and AJAX.
  • Site-wide backlinks were also brought to attention and weighed against the keyword counts Google uses.
  • Pages carrying an excessive number of static advertisements above the fold were assessed, and those crowding out content were penalized.
  • Google also set out to identify topics that call for freshness and quality, ensuring that the algorithm surfaces recent, updated information.
  • In September, Google announced that low-quality exact-match domains would be targeted.
  • In the same month, Google released an update intended to increase the diversity of domains shown in search results.
  • The Panda algorithm was designed to focus on websites with poor-quality content. It has seen thirteen updates since November 2011 and has been effective in motivating websites to clean up substandard content and invest in well-planned, well-written content.

Websites also need to concentrate on tags and on using them correctly. Google strives to improve the quality of websites and to set new benchmarks so that they serve online users effectively while building their revenues. Such standards and updates reflect the high level of quality and efficiency that Google demands.

Google Clears the Air and Web Space with the Disavow Links Tool

If you own a website, chances are you have been plagued by bad links – placed either by someone who considers you competition or by automated link spam. Such links can create problems with search engines, because your site could very well be blacklisted.

In such a scenario it is essential to remove all such links, but sometimes you find yourself helpless because you are unable to do so.

With the launch of Google's new disavow links tool, there was an air of cheer among everyone being bogged down by bad links on their sites. While some bad links are placed manually, others make their way onto a site with malicious intent.

While the tool is meant to handle bad links that people find difficult to remove from their sites, it is definitely not meant as an easy way out.

Google representatives cleared the air: the tool is not for people who simply want to list the bad links on their site and expect all of them to be cleared away. Rather, it is meant only for links that website owners are unable to remove even after repeated efforts.

So, if you are also facing trouble with non-removable links, you might want to get in touch with the Google team and try out their disavow links tool.
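For reference, the file the disavow tool accepts is plain text, one entry per line: either a full URL, or a `domain:` prefix to disavow every link from that domain, with lines starting with `#` treated as comments. The domains below are hypothetical placeholders:

```
# Outreach on 2012-11-01 failed; the site owners did not respond
domain:spammy-directory.example
# A single page we could not get removed
http://link-network.example/widgets/page1.html
```

A file like this is uploaded through the disavow tool in Webmaster Tools for the affected site.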

Top Heavy 2: Google Updated the Page Layout Algorithm

Alongside the updates to Google's search algorithm, Google announced a refresh of its Page Layout filter, first discussed in January and also referred to as the 'Top Heavy' update. A brief glance over the preceding updates shows that Panda Update 20 and the EMD Update rolled out on the 27th and 28th of September 2012, and Penguin Update 3 and Top Heavy 2 on the 5th and 9th of October 2012.

Described as a 'minor' update, it still had a noticeable effect, touching 0.7% of English-language queries. It responds to feedback from users who report that when they click a search result they find it difficult to locate the actual content and are dissatisfied with the overall experience: users prefer to see the content immediately rather than scrolling past various ads. The update can therefore affect websites that lack relevant "above the fold" content. If clicking through to a website reveals that a significant part of the initial page is dedicated to ads, or that little content is visible above the fold, the website may not rank highly, owing to the poor browsing experience.

Although formally a "Page Layout" algorithm update, it has been aptly nicknamed "Top Heavy" because it targets web pages whose top portions are heavy with ads. The first release, Top Heavy 1, on the 19th of January 2012, affected less than 1 percent of English queries, while Top Heavy 2, released on the 9th of October 2012, affected 0.7 percent. As the statistics suggest, sites that made the changes Google recommended appear to be free of the effects of the Top Heavy updates.

Various services have analyzed the impact of Google updates such as Top Heavy on search results; their graphs and illustrative charts give a better picture of the update's impact.

Google Panda Updates: A Look at the Road Travelled

A name well known in the field of search engine optimization, Google Panda was brought forth to target poor-quality sites and lower their rankings. Since its first rollout, Panda has continued to be updated, with the focus on improving the overall quality of websites. The first update, from Panda 1 to Panda 2, had a huge impact. The updates are not usually announced in advance; sometimes Google confirms them only shortly after the fact.

Following a series of updates described as minor, one of the significant revisions was the rollout to international languages beyond English. After a belated statement from Google describing one such 'minor' update as major, Panda 3 had also arrived. The Panda 3.5 update went largely unnoticed because it occurred at nearly the same time as the Penguin update: the ranking changes that would indicate a Panda update were cloaked by Penguin.

Monthly refreshes of the Panda algorithm, targeted at poor-quality pages, can be so minor that they go unobserved. Many note that if the major/minor point system were discarded and the serial number simply incremented with each update, the count would already stand at Panda 19. So, can Panda 20 be expected any time soon?

The algorithm's continuing evolution is designed to target web pages of low standard and lower their rankings in the search results.

Guidelines for Page Quality Gain Significance

The internet offers a convenient platform for bringing online businesses and their audiences closer. The Google Panda algorithm update continues to leave its mark, emphasizing high-quality web pages. The responsibilities of search quality raters have increased, as Google now focuses on rating the quality of each landing page as well. This has added extra pages on page-quality rating procedures to the guidelines handed to manual raters. The concept of page quality has become essential.

Well-written content that is informative, enjoyable, and neatly organized characterizes a good-quality web page; in contrast, poor-quality page content is confusing, unreliable, and often messy. Each landing page receives an overall grade from the raters. Main content, layout, relevancy of the content, and purpose are among the factors to be graded, with Lowest, Low, Medium, High, and Highest as the options available to the rater.

The Panda algorithm update is aimed at removing poor-quality pages from the index. Frequent, reliable updating of a social networking profile with posts, comments, and links can improve its quality. The meticulous explanation of the rating details also helps raters answer correctly the question about the status of the website behind the page under review. Raters additionally perform side-by-side tasks, evaluating two sets of results for the same search query, which helps weigh the present search results against those produced after an update. With so much emphasis on web page quality, it is essential to turn the spotlight on overall page quality, content included, so that websites can score high in these evaluations.

Are Google’s New Search Trends Useful?

Previously, Google matched a searcher's query against the words on web pages, and pages were ranked on the strength of their external links. With the advent of new technology, Google's search functionality has become smarter and more advanced. Its algorithms have evolved in several ways that directly affect search results; whether this benefits or hurts searchers is a big question in itself.

For a better understanding, consider an example. Search Google for the band U2 by name and you will find executive news editor Matt McGee's website in third place on the first page, with images or videos of the band in the fifth and sixth positions.

The reason U2's famous videos sit at fifth or sixth is that searchers have not combined the keyword U2 with 'video' or searched for 'U2 songs'. If Google started showing more U2 videos in its results and users did not click on them, Google might well stop showing them; Google holds several patents around this idea. To understand searcher intent, Google looks at what previous searchers typed and clicked on for a particular query, and then integrates those signals into its website rankings.

Google's spelling-correction system loads dictionaries of correctly spelled words along with slightly misspelled versions of those words. It can build these because searchers correct their queries and click on the differently spelled version. Google uses this data to suggest the query with the corrected spelling and to handle the misspelling behind the scenes.
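As a toy illustration of the idea (not Google's actual system – the dictionary here is a handful of invented terms), a misspelling can be mapped back to a known word by fuzzy matching:

```python
import difflib

# Tiny stand-in for a spelling dictionary (hypothetical data).
KNOWN_TERMS = ["guitar", "google", "algorithm", "penguin", "panda"]

def correct(term: str) -> str:
    """Return the closest known spelling, or the term unchanged if none is close."""
    matches = difflib.get_close_matches(term.lower(), KNOWN_TERMS, n=1, cutoff=0.8)
    return matches[0] if matches else term

print(correct("algorythm"))  # -> algorithm
print(correct("xyzzy"))      # unchanged: no close match
```

A production system would also weight candidates by query-log frequency, which is where the click data described above comes in.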

Google has moved beyond word matching and shows results based on searcher intent. Google's experts say that keywords do not have to appear in their original form; the team does a great deal of research on synonyms so that it can find premium-quality web pages that do not use the same words searchers type into their queries.
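A minimal sketch of that principle – matching a query to a document through a synonym table rather than exact words. The synonym table and the sample strings are invented for the example:

```python
# Hypothetical synonym table; real systems learn these relations from data.
SYNONYMS = {
    "cheap": {"cheap", "inexpensive", "affordable"},
    "photo": {"photo", "photograph", "picture", "image"},
}

def expand(term: str) -> set:
    """Return the term together with any known synonyms."""
    return SYNONYMS.get(term, {term})

def matches(query: str, document: str) -> bool:
    """True if every query term, or a synonym of it, appears in the document."""
    doc_words = set(document.lower().split())
    return all(expand(term) & doc_words for term in query.lower().split())

print(matches("cheap photo prints", "affordable picture prints and frames"))  # True
print(matches("cheap photo prints", "expensive oil paintings"))               # False
```

The document contains none of the literal query words "cheap" or "photo", yet it still matches through the synonym sets.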

Google Focusing on Site Clustering, Sitelinks Changes, and Page Quality

Google has once again made announcements regarding search quality and algorithm changes – a significant report the company forgot to publish in July. The biggest announcement of the year so far covers 86 changes from June and July. The main issues discussed in the report relate to site clustering, sitelinks changes, and page quality. Google has shown concern about site clustering, whereby some sites completely dominate the first 8-9 positions in Google search.

Google announced through its blog post that it is working on the site-clustering project to make its web results better and simpler, and that it has made its algorithm for clustering web results from the same path or site more consistent. Google has tried several tactics over the years to control how many links to the same website are displayed on the first page. Now that it has announced multiple projects to limit the rankings of a single website, users can expect to see a few more changes in the coming months.

Google has not described its big changes in detail, but chances are that half of them relate to 'page quality'. Google said in its blog post that the project launch would help users land on high-quality content from trusted sources, and that new data has been incorporated into the Panda algorithm to detect high-quality websites and pages.

The Google post also included four changes to the way sitelinks work. First is 'Gas station', which removes boilerplate text from sitelinks titles while keeping the information useful to users. Second is 'Manzana2', used to improve the clustering and ranking of links in the expanded sitelinks feature. Third is 'Yoyo', introduced to detect more useful text in sitelinks. Fourth is 'Challenger', which helps get rid of generic boilerplate text in web result titles, especially for sitelinks.

Google Analytics' Multi-Channel Funnels Reporting API

Analysing how promotional activities and marketing endeavours drive conversion is a complex matter, complicated by the multiple marketing channels customers interact with before converting or purchasing. The recently launched Multi-Channel Funnels feature in Google Analytics draws attention to the full path users follow to conversion, rather than relying on the last click, which gives a blurred picture of what actually moves customers. Now the data is available through an API, enabling developers to extend and automate conversion use cases with it.

The API (Application Programming Interface) allows developers to query metrics such as first-interaction conversions, last-interaction conversions, and assisted conversions. It also supports top paths, path length, and time lag, so conversion-path data can be integrated into applications. Important use cases include combining conversion-path data with other sources such as cost data, creating new visualisations, and using the data to drive automated processes.
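A sketch of what a query against the v3 Multi-Channel Funnels Reporting API looks like. The profile ID is a placeholder, and a real request additionally needs an OAuth 2.0 access token in its Authorization header; this only assembles the request URL:

```python
from urllib.parse import urlencode

MCF_ENDPOINT = "https://www.googleapis.com/analytics/v3/data/mcf"

def build_mcf_query(profile_id: str, start: str, end: str) -> str:
    """Build an MCF Reporting API URL for conversion metrics by source path."""
    params = {
        "ids": "ga:" + profile_id,      # Analytics profile (view) ID
        "start-date": start,            # e.g. "2012-09-01"
        "end-date": end,
        "metrics": "mcf:firstInteractionConversions,mcf:assistedConversions",
        "dimensions": "mcf:sourcePath", # the channel path behind each conversion
    }
    return MCF_ENDPOINT + "?" + urlencode(params)

# Hypothetical profile ID and date range.
url = build_mcf_query("12345678", "2012-09-01", "2012-09-30")
print(url)
```

Fetching that URL with a valid token returns the conversion-path rows as JSON, ready to join against cost data or feed into a visualisation.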

Multi-channel analysis covers the interactions that take place across the internet through different digital media. The Multi-Channel Funnels reports show how different channels work together to convert a purchase decision into a sale. The tool is built mainly to help people make important marketing decisions about advertising and promotion at both macro and micro levels.

The five Multi-Channel Funnels reports show how customers interacted during the 30 days before making a purchase decision. Metrics drawn from the reports show the impact of different channels on conversion; the time lag, which helps explain how long conversion takes; the path length, which shows the number of interactions that took place prior to purchase; and the top conversion paths customers take.

Confusion: Google Sends Warnings, Then Says to Ignore Them

For the last few months, Google has been vehemently sending warnings and notices to publishers: either act on its guidelines about bad links or get ready to be penalised. Then, on 20th July, Google sent word that publishers could relax about the latest warnings – is that what's called transparency? It has created a lot of confusion among experts, and deciphering what Google wants to say requires scrutiny of its earlier, quite similar posts.

From the last weeks of March to the early weeks of April, Google sent serious warnings related to fake and unnatural links. The Google search quality team notified website owners that it had detected pages on their sites built with techniques that do not comply with Google's Webmaster Guidelines, and warned that it had detected artificial or unnatural links pointing to their sites, used to manipulate PageRank. These messages created confusion among recipients, as it was not clear whether Google was penalising them for the links or merely informing them without taking any negative action.

In the last weeks of April, Google's Penguin created havoc among publishers, as it was specifically designed to detect spam and penalise publishers engaged in bad linking practices. Other Google advice, such as 'get rid of bad links' and 'do not ignore link warnings', has been read and followed by many publishers. Google constantly sends warnings and advice; whether to follow them is entirely up to the publisher. Anyone who ignores such warnings will, in the end, face lower rankings. As friendly advice: better to follow the warnings than to ignore them.