Top SEO News, 2017

    Google will keep the number of its search quality algorithms secret

    Oct 08/2017

    How many search quality algorithms does Google use? This question was put to John Mueller, a Google employee, during the latest video conference with webmasters.
    The question was:
    'When you mention Google's quality algorithms, how many algorithms do you use?'
    Mueller responded as follows:
    'Usually we do not talk about how many algorithms we use. We publicly state that we have 200 factors when it comes to crawling, indexing and ranking.
    Generally, the number of algorithms is somewhat arbitrary. For instance, one algorithm can be used just to display a letter on the search results page. Therefore, we believe that counting the exact number of algorithms that Google uses is not something that is really useful [for optimizers].
    From this point of view, I can't tell you how many algorithms are involved in Google search.'

    Gary Illyes shares his view on the importance of link audits

    Oct 08/2017

    At the Brighton SEO event that took place last week, Google representative Gary Illyes shared his opinion on the importance of auditing a website's link profile. This was reported by Jennifer Slagg on the TheSEMPost blog.
    Since Google Penguin became a real-time update and started ignoring spam links instead of imposing sanctions on websites, the value of auditing external links has decreased.
    According to Gary Illyes, link audits are not currently necessary for all websites.
    'I talked to a lot of SEO specialists from big enterprises about their business and their answers differed. These companies have different opinions on the reason why they reject links.
    I don't think that holding too many audits makes sense, because, as you noted, we successfully ignore the links, and if we see that the links are of an organic nature, it is highly unlikely that we will apply manual sanctions to a website.
    If your links are ignored by Penguin, there is nothing to worry about.
    I've got my own website, which receives about 100,000 visits a week. I've had it for 4 years already and I do not have a disavow file. I do not even know who is linking to me.'
    Thus, if a website owner previously bought links or used other prohibited link-building methods, auditing the link profile and disavowing unnatural links is necessary to avoid future manual sanctions. It is important to remember that disavowing links can lower a site's positions in search results, since webmasters often disavow links that actually help the website rather than harm it.
    Therefore, link audits are needed only if there were violations in the site's history. For most website owners they are unnecessary, and that time is better spent improving the website itself, says Slagg.
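    As an illustration (not from the article itself): the disavow file mentioned above is a plain UTF-8 text file in which each line is either a full URL or a domain: entry, and lines starting with # are comments. A minimal Python sketch that writes such a file, with invented example domains, might look like this:

        # Minimal sketch of writing a disavow file in the plain-text format
        # accepted by Google's disavow tool. The domains below are invented examples.
        entries = [
            "# Links from a spammy directory we never requested",
            "domain:spammy-directory.example",
            "# A single unnatural link",
            "http://blog.example/paid-links-page.html",
        ]

        with open("disavow.txt", "w", encoding="utf-8") as f:
            f.write("\n".join(entries) + "\n")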

    Googlebot still refuses to crawl HTTP/2

    Oct 08/2017

    During the latest video conference with webmasters, Google representative John Mueller said that Googlebot still does not crawl over HTTP/2.
    The reason is that the crawler already fetches content fast enough, so the benefits a browser gets from the protocol (reduced page load time) are not that important for it.
    'No, at the moment we do not crawl over HTTP/2. We are still investigating what we can do about it. In general, the difficult part is that Googlebot is not a browser, so it does not get the same speed benefits that a browser sees when implementing HTTP/2. We can cache data and make requests in a different way than a regular browser. Therefore, we do not see the full benefits of crawling over HTTP/2.
    But as more websites implement server push, Googlebot's developers are considering adding HTTP/2 support in the future.'
    It should be recalled that in April 2016 John Mueller said that using the HTTP/2 protocol on a website does not directly affect ranking in Google, but it improves user experience thanks to faster page loading. Therefore, if you have the chance, it is recommended to move to this protocol.
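    As a quick illustration (not part of the original news item), one way to check which protocol version a given server actually negotiates is the third-party Python library httpx, which supports HTTP/2 when installed with its http2 extra (pip install "httpx[http2]"). The URL below is a placeholder:

        import httpx  # third-party: pip install "httpx[http2]"

        def negotiated_http_version(url: str) -> str:
            # Allow the client to negotiate HTTP/2 via ALPN; it falls back to
            # HTTP/1.1 automatically if the server does not support HTTP/2.
            with httpx.Client(http2=True) as client:
                response = client.get(url)
                # httpx reports the protocol actually used, e.g. "HTTP/2" or "HTTP/1.1".
                return response.http_version

        if __name__ == "__main__":
            print(negotiated_http_version("https://www.example.com/"))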

    Google does not check all spam reports manually

    Oct 08/2017

    During the latest video conference with webmasters, Google employee John Mueller stated that the search team does not check all spam reports manually.
    The question put to Mueller was the following:
    'Some time ago we sent a spam report, but still have not seen any changes. Do you check each and every report manually?'
    The answer was:
    'No, we do not check all spam reports manually.'
    Later Mueller added:
    'We try to determine which spam reports have the greatest impact; it is on them that we focus our attention, and it is these that the anti-spam team checks manually, processes and, if necessary, applies manual sanctions for. Most of the other reports that come to us are just information that we collect and can use to improve our algorithms in the future.'
    At the same time, he noted that small reports about violations on a single page are a lower priority for Google. But when the information can be applied to a number of pages, these reports become more valuable and are checked first.
    As for processing time, Mueller explained that taking action may take 'some time', rather than a day or two.
    It should be recalled that in 2016 Google received about 35,000 spam reports from users every month, and about 65% of them led to manual sanctions.

    Google Image Search loses market share to Amazon and Facebook

    Aug 14/2017

    Google's share of the search market grew from 58.84% in October last year to 64.8% in March 2017. At the same time, the share of Google Image Search fell to 21.8% in favor of Amazon and Facebook. This information comes from analysts at the American company Jumpshot, working in partnership with Moz co-founder Rand Fishkin.
    During the research, they analyzed search data in Google Search, Images, Maps, YouTube, Yahoo, Bing, Amazon, Facebook, Reddit and Wikipedia for the period from October 2016 to May 2017, with the sole purpose of determining which resources accounted for the largest number of search sessions and the most traffic.
    Overall, during this period Amazon's share went up from 0.4% to 2.30%, and Facebook's from 0.8% to 1.5%. Bing and Yahoo both showed growth of up to 2.4%, while Google Maps reached 1.2%. Google Search, Bing, Amazon and Facebook showed growth in activity, while Google Images, YouTube, Yahoo and Google Maps lost ground.
    The report also included data on search volumes and CTR in the US. The number of search sessions in Google exceeded 30 billion a month (as of October 2016). By May 2017, the growth trend remained at 10-15% compared to the previous year.
    Organic search results hit their lowest point in 2016: in December they stood at 54%, even though in January and February of the same year they were at 57% and 56%, respectively, and taking into account the traditional drop in activity after the winter holidays.
    November 2016 saw the highest rate of searches without clicks, at 45.5%, while the lowest rate was in October, at just 40.3%.
    According to Jumpshot, the largest share of traffic is generated by Google: about 63% in May 2017, up from about 60% in October 2016. Over this period YouTube also improved, rising by 0.2%, while Amazon rose by 0.1%. Traffic from Facebook, Yahoo, Reddit, Imgur and Bing fell sharply, and only Wikipedia remained at the same level.

    Google tests a new search results format with ready-made answers

    July 11/2017

    English-speaking users have noticed that Google is testing a new search results format that includes ready-made answers.
    From now on, the website whose content was used to generate the answer is no longer displayed among the regular search results; the link to it appears only in the answer block.
    'Google removed from the search results page the result that was already shown in the answer block for this query. Now the answer block is the only result for that page on the specific request,' says The SEM Post blog.
    It is noted that the new feature is currently available to many, but not all, users, which could mean either large-scale testing or a gradual rollout.

    Google updates the guidelines for assessors for the third time this year

    Aug 05/2017

    This is the third time this year that Google has updated its guidelines for assessors (the experts who assess the quality of search results and of the pages displayed in them). This time the changes are even smaller than in the previous version of the document, which was published in May 2017.
    The latest changes will mainly interest SEO specialists who work with non-English pages.
    For instance, the details on pseudoscientific and fake content have been clarified, the comments on pornographic ads displayed on websites that do not contain adult content have been removed, new examples of pages of the lowest quality have been added, and a completely new section on displaying English-language results for non-English-speaking locales has been introduced.
    Some changes are purely stylistic: for example, italics have been removed from certain words. In the section on using the Foreign Language label, the example of pages in a foreign language such as Ukrainian and Russian has been replaced with an example of Catalan and Spanish.
    The complete Google assessors' guide is a 160-page document.
    It should be recalled that the Google assessors' guide was already updated in March and May this year. The main changes in March were aimed at combating dubious content in search results. The largest May updates concerned the assessment of the quality of news websites, in particular the use of the 'Upsetting-Offensive' label introduced in March.

    Google adds tags for recipes, videos and products in the image search

    Aug 03/2017

    Google has added badges for recipes, videos, products and GIFs to image search results. Now, when searching for images, users will immediately see what type of content individual results relate to.
    A Google representative commented on the new feature as follows:
    'These badges will help you find those images that involve additional actions or contain more detailed information.'
    To have a badge displayed for a page, the appropriate structured data markup should be added: for recipes, products or videos. Google's algorithms recognize and label GIF images automatically, so no markup is needed for them. As with rich snippets, the new badges will not always be displayed; filling in the recommended properties of the markup increases the chances of getting them (a minimal markup sketch follows at the end of this section).
    Google has also updated its structured data testing tool; it now processes markup for images.
    It should be recalled that Google began showing videos and recipes in image search results last month.
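    As an illustration only (not from the article), recipe markup of the kind described above is commonly added as JSON-LD using the public schema.org Recipe type. The sketch below generates such a block in Python; the recipe values are invented for the example:

        import json

        # Invented recipe data, used purely to illustrate the schema.org Recipe type.
        recipe = {
            "@context": "https://schema.org",
            "@type": "Recipe",
            "name": "Simple Pancakes",
            "image": "https://www.example.com/images/pancakes.jpg",
            "author": {"@type": "Person", "name": "Example Author"},
            "recipeIngredient": ["1 cup flour", "1 egg", "1 cup milk"],
            "recipeInstructions": "Mix the ingredients and fry in a hot pan.",
        }

        # The JSON-LD is embedded in the page inside a <script> tag.
        print('<script type="application/ld+json">')
        print(json.dumps(recipe, indent=2))
        print("</script>")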

    Google Drive will become a backup tool

    June 17/2017

    Google plans to turn its cloud service into a backup tool. Soon it will be able to track and archive files inside any folder the user specifies, including the contents of an entire hard disk or the Documents folder.
    The backup function will be available from June 28, after the release of the new Backup and Sync application, which is the latest version of Google Drive for Mac/PC.
    It is assumed that users will be able to open and edit files located in the cloud. It is still not clear whether they will be able to synchronize information between multiple PCs using Drive as an intermediary.
    Since an automatic update to Backup and Sync is not planned, the company recommends installing the new application as soon as it is released.
    The new feature is primarily targeted at corporate Google Drive users.

    Google keeps ignoring the Last-Modified meta tag

    Aug 14/2017

    Google still ignores the Last-Modified meta tag in search. This was stated by Google employee John Mueller in response to a question from one of the webmasters on Twitter.
    The question was:
    'In 2011 you said that Google does not use the http-equiv="last-modified" tag for crawling. Is that still the case?'
    Mueller replied:
    'Yep, we still do not use it.'
    - John ☆ .o (≧ ▽ ≦) o. ☆ (@JohnMu) August 11, 2017
    The tag was originally intended to alert crawlers that a page had been updated, or to specify the date on which the page was last modified.
    In 2011, John Mueller posted on the Webmaster Central Help forum that Google does not use the Last-Modified meta tag for crawling, indexing or ranking. The tag is also not included in the list of meta tags that Google understands. That said, other search engines may still use it.
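    For illustration (not from the article): the distinction here is between a meta tag embedded in the HTML, which Google says it ignores, and the standard Last-Modified HTTP response header, which is sent by the server alongside the page. A small Python sketch of both, using an invented timestamp:

        from datetime import datetime, timezone
        from email.utils import format_datetime

        # Invented "last modified" timestamp, used purely for illustration.
        last_modified = datetime(2017, 8, 11, 12, 0, tzinfo=timezone.utc)
        http_date = format_datetime(last_modified, usegmt=True)  # RFC 7231 format, e.g. "Fri, 11 Aug 2017 12:00:00 GMT"

        # The meta tag discussed above, which Google says it does not use:
        meta_tag = f'<meta http-equiv="last-modified" content="{http_date}">'

        # The equivalent HTTP response header, sent by the server with the page:
        response_header = f"Last-Modified: {http_date}"

        print(meta_tag)
        print(response_header)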