SEO techniques and Google's algorithm have been updated again in 2017, as they are every year. Here are some new SEO techniques that can help you earn higher rankings as well as more revenue from your website.
Using a schema markup is becoming increasingly important with changing Google and user trends. Schema makes it easier for search engines to understand your site, thereby helping to ensure that it is displayed correctly. Schema can also be particularly helpful when Google decides to display rich answers, such as Quick Answers or a Rich Card. Google likes to display answers that make it easier for users to find what they are looking for.
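As an illustration, schema markup is often added as a JSON-LD block in the page. The snippet below is a minimal sketch using the schema.org Article type; all names, dates, and URLs are placeholders, not real site data.

```html
<!-- Hypothetical example: schema.org Article markup in JSON-LD.
     Every value below is a placeholder. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "SEO Techniques for 2017",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2017-01-15",
  "image": "https://example.com/images/seo-2017.jpg"
}
</script>
```

A block like this goes in the page's head or body, and Google's Structured Data Testing Tool can be used to validate it.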
Hybridization and breaking down barriers
As users become increasingly sophisticated online and the demands of digital marketing draw professionals closer together, it is clear that brands maturing in modern marketing are breaking down the silos that separate their digital marketing departments.
Mobile users access their email messages, desktop users redeem social media coupons, and those clicking on your PPC ads expect an experience consistent with the one they had when they landed on your site through organic search.
To reach these customers, an estimated 80% of digital marketers worldwide expect to be running hybrid campaigns, and professionals need to be prepared for these changes.
To make sure your team is on board:
- Host trainings where you help members of different teams get to know each other’s goals and strategies
- Create collaborative projects where members of different teams come together for joint goals
- Develop common documents between the different teams that define vocabulary, expectations, and roles so that everyone can communicate effectively
Changes on the SERPs
Google has been experimenting with the SERPs this past year. Specifically, it has been increasing the number of characters allowed in some meta descriptions and titles.
This trend can be a challenge for marketers to take advantage of, because the changes have not been rolled out to all websites, nor has Google announced that they are permanent.
For the sites that do receive the extra real estate, however, there are great opportunities for including more keywords and more compelling descriptions to help attract people to the website.
To take advantage of these developments, you should consider:
- Continuing to use your main keyword at the beginning of your title and meta description in case you are restricted to the original character limits
- Using the extra space to expand your description
- If your meta descriptions are fewer than 100 characters, lengthening them so your description doesn't get buried under the new, longer limits
SEM alignment and intent signals
Since searches with commercial intent display, on average, more ads at the top of the page than other searches, click-through rates for organic results are lower on these searches than on those with fewer top-of-the-page ads.
Knowing which terms have organic search results above the fold is critical to prioritizing efforts.
For these topics, organic and paid search teams should work together in targeting these terms to boost ROI for both paid and organic efforts.
It’s also important for these teams to understand what content is currently ranking for these buying terms and to align their content strategy and planning to create web pages and assets that map to what searchers are looking for.
Conversely, for discovery topics where search results have fewer ads, organic search teams should take the lead in identifying content needs and developing high-performing online experiences that attract and convert more customers.
Mobile and speed
Since Google first introduced the AMP project at the end of 2015, the importance of speed, particularly on mobile devices, has grown.
Google has always known that slow load times hurt the user experience; a one-second delay can result in a 7% reduction in conversions. With AMP, Google can amplify the importance of speed.
AMP was initially created for news sites. The format strips away all the extras of a website, helping it load faster. According to Google, using AMP can improve loading speeds by 15 to 85%.
Mobile has also seen a growing emphasis on the importance of micro moments.
Micro moments describe the reflex of turning to a device to answer an immediate need.
These needs fall under one of the following categories:
- The I-want-to-know moments
- The I-want-to-go moments
- The I-want-to-buy moments
- The I-want-to-do moments
When brands are able to answer these needs, they are able to provide the optimal mobile experience and improve their reputation.
These AMP pages serve the I-want-to-know moments in particular. Recently, however, AMP has begun spreading beyond this niche industry to the I-want-to-buy moments, with eBay optimizing millions of pages for ecommerce users.
As speed, due to the AMP optimization, begins to dominate more of the mobile digital ecosystem, it will become even more important for brands to shorten their loading times. Even brands that do not use AMP will still be impacted by the increasing expectations of customers for fast sites.
To make sure your brand is ready, consider:
- Avoiding unnecessary images and scaling down the ones you keep
- Skipping images so complex that they require long load times
- Keeping only the essential cookies
- Compressing your website's assets where possible
SEO in 2017 is likely to be just as surprising and exciting as it has been in the past. Brands need to use their time now, however, to start strengthening their sites for the trends of the future.
Article Courtesy: SearchEngineWatch.Com
What is Google Crawl Budget?
The number of times a search engine spider crawls your website in a given time allotment is what we call your “crawl budget.”
Ensure Your Pages Are Crawlable
Your page is crawlable if search engine spiders can find and follow links within your website, so you’ll have to configure your .htaccess and robots.txt so that they don’t block your site’s critical pages.
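As a sketch of what this looks like in practice, a robots.txt file along these lines blocks only low-value areas while leaving critical pages open to crawlers. The directory names and sitemap URL below are hypothetical examples.

```
# Hypothetical robots.txt sketch: block only low-value areas,
# never the pages you want crawled and ranked.
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://example.com/sitemap.xml
```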
Of course, the opposite is true if you do want to prevent a page from showing up in search results. However, it's not enough to simply set your robots.txt to "Disallow" if you want to stop a page from being indexed. According to Google: "Robots.txt Disallow does not guarantee that a page will not appear in results."
If external information (e.g. incoming links) continues to direct traffic to the page that you've disallowed, Google may decide the page is still relevant. In this case, you'll need to manually block the page from being indexed by using the noindex robots meta tag or the X-Robots-Tag HTTP header.
noindex meta tag: Place the following meta tag in the <head> section of your page to prevent most web crawlers from indexing your page.
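The standard form of that tag is:

```html
<meta name="robots" content="noindex">
```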
X-Robots-Tag: Place the following in your HTTP header response to tell crawlers not to index a page.
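The raw header is simply `X-Robots-Tag: noindex`. As one hedged example of sending it, an Apache configuration can attach the header to a class of files (the PDF pattern here is illustrative):

```
# Apache example: tell crawlers not to index any PDF files
<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex"
</Files>
```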
Use Rich Media Files Cautiously
Even if Google can read most of your rich media files, other search engines may not be able to, which means you should use these files judiciously, and you probably want to avoid them entirely on the pages you want ranked.
Avoid Redirect Chains
Each URL you redirect to wastes a little of your crawl budget. When your website has long redirect chains, i.e. a large number of 301 and 302 redirects in a row, spiders such as Googlebot may drop off before they reach your destination page, which means that page won’t be indexed. Best practice with redirects is to have as few as possible on your website, and no more than two in a row.
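The chain-length check can be sketched in code. This is a minimal offline model, not a real crawler: the `redirects` dictionary below is a hypothetical map of source-to-target redirects (as you might export from a site crawl), and the function simply follows it to count hops.

```python
# Sketch: resolving a redirect chain offline from a crawl's redirect map.
# The `redirects` dict below is hypothetical example data, not a real site.

def redirect_chain(url, redirects, max_hops=10):
    """Follow `url` through a {source: target} redirect map and
    return the full chain, ending at the final destination."""
    chain = [url]
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        chain.append(url)
    return chain

redirects = {
    "/old-page": "/new-page",        # 301
    "/new-page": "/newer-page",      # 301 -- a second hop
    "/newer-page": "/final-page",    # 302 -- a third hop: too many
}

chain = redirect_chain("/old-page", redirects)
hops = len(chain) - 1
# Best practice from above: no more than two redirects in a row.
too_long = hops > 2
```

Here the chain is three redirects long, so it violates the two-in-a-row guideline and should be collapsed into a single redirect to the final page.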
Fix Your Broken Links ASAP
If what Google's John Mueller says is true, this is one of the fundamental differences between SEO and Googlebot optimization: it would mean that broken links do not play a substantial role in rankings, even though they greatly impede Googlebot's ability to index and rank your website.
That said, you should take Mueller’s advice with a grain of salt – Google’s algorithm has improved substantially over the years, and anything that affects user experience is likely to impact SERPs.
Set Parameters on Dynamic URLs
Spiders treat dynamic URLs that lead to the same page as separate pages, which means you may be unnecessarily squandering your crawl budget. You can manage your URL parameters by going to your Google Search Console and clicking Crawl > URL Parameters. From here, you can let Googlebot know if your CMS adds parameters to your URLs that don't change a page's content.
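The underlying idea can be sketched in a few lines of Python: strip out parameters that don't change a page's content, so URLs that differ only in tracking or session parameters collapse to one canonical form. The parameter names below (sessionid, utm_*) are common examples, not a definitive list.

```python
# Sketch: collapsing dynamic URLs that differ only in tracking/session
# parameters, so duplicates don't eat crawl budget. IGNORED_PARAMS is
# an assumed example set, not an official list.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORED_PARAMS = {"sessionid", "utm_source", "utm_medium", "utm_campaign"}

def canonical_url(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(sorted(kept)), ""))

a = canonical_url("https://example.com/shoes?utm_source=mail&color=red")
b = canonical_url("https://example.com/shoes?color=red&sessionid=42")
# Both normalize to the same URL, so they count as one page.
```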
Clean Up Your Sitemap
XML sitemaps help both your users and spider bots alike, by making your content better organized and easier to find. Try to keep your sitemap up-to-date and purge it of any clutter that may harm your site’s usability, including 400-level pages, unnecessary redirects, non-canonical pages, and blocked pages.
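For reference, a minimal XML sitemap following the sitemaps.org protocol looks like this; the URL and date are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-01-15</lastmod>
  </url>
</urlset>
```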
The easiest way to clean up your sitemap is to use a tool like Website Auditor (disclaimer: my tool). You can use Website Auditor’s XML sitemap generator to create a clean sitemap that excludes all pages blocked from indexing. Plus, by going to Site Audit, you can easily find and fix all 4xx status pages, 301 and 302 redirects, and non-canonical pages.
Make Use of Feeds
Feeds, such as RSS, XML, and Atom, allow websites to deliver content to users even when they’re not browsing your website. This allows users to subscribe to their favorite sites and receive regular updates whenever new content is published.
While RSS feeds have long been a good way to boost your readership and engagement, they're also among the most visited resources for Googlebot. When your website receives an update (e.g. new products, a blog post, a site change), submit it to Google's FeedBurner so you can be sure it's properly indexed.
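A minimal RSS 2.0 feed has this shape; the titles, links, and dates below are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>https://example.com/</link>
    <description>Updates from Example Blog</description>
    <item>
      <title>New product launched</title>
      <link>https://example.com/new-product</link>
      <pubDate>Sun, 15 Jan 2017 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```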
Build External Links
Now, in addition to Crowe’s excellent point, we also have evidence from Yauhen Khutarniuk’s experiment that external links closely correlate with the number of spider visits your website receives.
In his experiment, he used our tools to measure all of the internal and external links pointing to every page on 11 different sites. He then analyzed crawl stats on each page and compared the results.
Maintain Internal Link Integrity
While Khutarniuk's experiment suggested that internal link building doesn't play a substantial role in crawl rate, that doesn't mean you can disregard it altogether. A well-maintained site structure makes your content easily discoverable by search bots without wasting your crawl budget.
A well-organized internal linking structure may also improve user experience – especially if users can reach any area of your website within three clicks. Making everything more easily accessible in general means visitors will linger longer, which may improve your SERPs.
The Google AdWords community is scratching their heads this morning after finding that the Google AdWords Keyword Planner only gives back data ranges when there is no campaign set up.
Previously, you did not need to set up and run a campaign to get fairly accurate search volume metrics, but something changed this morning.
Article Courtesy: SEJ
SEO is the process of increasing targeted traffic to your web pages from popular search engines, such as Google, Bing, and Yahoo, via organic search results. The higher a website or webpage appears in the SERP (search engine results page), the more traffic it will receive from the search engine. In the SERP, we may target different kinds of search, including image search, local search, video search, and so on.
Here are some quick tips for getting the best rank in the SERP.
- Content is the best optimization. Any type of optimization is effective only in combination with high quality and unique content. Ideally, the text contains keywords that highlight the overall theme of the post. The content should be understandable and readable. Use your keywords wisely. Text should be written primarily for visitors, not for search engines.
- Exchange backlinks with other relevant websites relating to your topics and with high authority websites (websites with high page ranking). Also link between pages within your website, but do this with caution and only if it’s relevant. Don’t link every page to each other.
- Think of a good name for your image files. Google also factors images into its ranking calculation, and it displays both text and images on its search results page, drawing on relevant images contained in the post itself, which are linked to the corresponding page. Include keywords for the specific subject in your file name: instead of DSC93948.jpg, use Tennis-Andre-Agassi-New-York.jpg if you have a picture of Andre Agassi in New York, for example. Don't forget to use the alt and title attributes for your images, which also provide better usability and can help your search ranking.
- Use short, understandable permalinks that include keywords. Instead of http://yourwebsite.com/page-id?495/ use http://yourwebsite.com/andre-agassi-new-york/. Note that the first few words in a permalink are generally considered to carry the most weight; words further along matter much less.
- Use search engine optimized themes. Some themes use a lot of code to create the layout and design, and Google must wade through that source code to find the relevant content. The more code you have, the lower your content-to-code ratio and keyword density. Fast-performing themes are also good for your Google ranking, since the performance of a website is one of the roughly two hundred criteria that determine your ranking.
- Create a sitemap in XML format. With WordPress and plugins like Google XML Sitemaps, this is a very easy task. Google has a variety of other useful tools, too, such as Website Optimizer and Webmaster Central.
- Highlight informative content or keywords on your website. Use h1-h6 to highlight headings or subtitles of an article or important sentences. Use the strong tag, too. But please don’t overdo it!
- Add your posts to social networks. But please don’t only promote yourself. Promote other posts and websites you like. People will honor your kindness and link back to you.
- Don't use black hat techniques or mirrored HTML sites. If you try to trick Google, they will find out. Promise. Recently, some pretty high-profile portfolio sites were punished by Google for such practices. Being sneaky will do more harm than good.
- Don't use Flash. Flash is virtually invisible to search engines. Use WordPress. Period.
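Two of the tips above, descriptive image markup and proper heading structure, can be sketched in HTML. The file name reuses the Andre Agassi example from the list; everything here is illustrative.

```html
<!-- Descriptive file name plus alt and title attributes -->
<img src="tennis-andre-agassi-new-york.jpg"
     alt="Andre Agassi playing tennis in New York"
     title="Andre Agassi at a tournament in New York">

<!-- Heading hierarchy with sparing use of <strong> -->
<h1>Main article title</h1>
<h2>Section subtitle</h2>
<p>Body text with one <strong>important phrase</strong> highlighted.</p>
```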
The final 64 factors have been compiled from over 200 ranking factors confirmed by Google, drawn from the top of the following 10 respective categories:
- Domain factors
- On-page SEO factors
- Site-level optimization
- Link-Building Strategies
- User Engagement
- Social Media Integration / “Social Signals”
- Brand-Specific Keyword Ranking Factors
- Website SPAM Prevention Factors (Panda)
- Off-Page Webspam Factors (Penguin + Panda)
- New Rules Specific to Google algorithm or Hummingbird updates
Vote on the Finalists
Have an idea of which Google SEO tactics work best? Vote for the finalists or let us know your comments in the box below.