Indexable and Non-Indexable Pages
We examine the website structure and weigh up which pages should be open to indexing and which should be closed from it. This is especially important for online stores and media businesses, where the website has plenty of category/rubric pages, search filter pages and so on. The choice of which categories to index can be reconsidered in more depth while developing the search engine marketing (SEM) strategy, since the SEO specialist decides which categories are worth SEO-adapting only after the semantic analysis. However, before SEM we often face a situation where the website has a huge number of duplicate pages, which split the ranking signal among them. So, if there are 100+ duplicated pages, the rank spreads between them and, as a result, every page ranks low.
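As a rough illustration, here is a minimal Python sketch (using the third-party requests and beautifulsoup4 packages) of checking whether a single page is currently open for indexing; the example URL is hypothetical and the check only covers the X-Robots-Tag header and the robots meta tag.

```python
# Minimal sketch: is a page open for indexing? Assumes requests and
# beautifulsoup4 are installed; the URL below is hypothetical.
import requests
from bs4 import BeautifulSoup

def is_indexable(url: str) -> bool:
    resp = requests.get(url, timeout=10)
    # A noindex directive can arrive in an HTTP header...
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return False
    # ...or in a <meta name="robots"> tag in the page markup.
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    if meta and "noindex" in meta.get("content", "").lower():
        return False
    return True

print(is_indexable("https://example.com/catalog/filters?color=red"))
```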
User-Friendly URLs for Indexable Pages
We review the URL structure and rewrite URLs that contain dynamically generated parameters, numbers, unreadable symbols, underscores and uppercase characters. Only the URLs of pages intended for search engine optimization need to be improved.
If a page has a user-unfriendly URL and is not important for SEO promotion, we close it from indexing. If it is important, we create a user-friendly URL and set a 301 redirect from the previous URL to the new one to preserve the link equity.
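For illustration, a minimal Python sketch of turning a raw page title into a readable, lowercase, hyphen-separated slug; the sample title is hypothetical and the rules are only the common ones described above (no underscores, uppercase or stray symbols).

```python
# Minimal sketch: build a user-friendly URL slug from a page title.
# Standard library only; the sample input is hypothetical.
import re
import unicodedata

def slugify(title: str) -> str:
    # Strip accents, lowercase, and replace every run of
    # non-alphanumeric characters with a single hyphen.
    normalized = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    return re.sub(r"[^a-z0-9]+", "-", normalized.lower()).strip("-")

print(slugify("Winter_Boots For WOMEN (size 38)"))  # winter-boots-for-women-size-38
```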
Inner Broken Anchor Links
We check the pages of your website for anchor links that return a 404 status, internal links that still use the old insecure HTTP protocol, and links that go through redirects. All links to external resources, except for related external domains, should be marked nofollow.
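A rough sketch of that check in Python (requests and beautifulsoup4 assumed, start URL hypothetical): collect the anchor links from one page, flag plain-HTTP targets, and report broken or redirected responses.

```python
# Minimal sketch: audit the anchor links found on a single page.
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

page = "https://example.com/catalog/"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    link = urljoin(page, a["href"])
    if link.startswith("http://"):
        print("insecure HTTP:", link)
    resp = requests.head(link, allow_redirects=False, timeout=10)
    if resp.status_code >= 400:
        print("broken:", resp.status_code, link)
    elif resp.status_code in (301, 302, 307, 308):
        print("redirects:", link, "->", resp.headers.get("Location"))
```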
Meta Tags Availability
Every page of the site must have exactly one title tag and one description meta tag, each with an appropriate length and a recognizable structure. As a best SEO practice, meta tags should contain a relevant keyword; the keyword strategy for the promoted pages is defined within SEM services.
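A small Python sketch of such a check (requests and beautifulsoup4 assumed): verify that a page has exactly one title and one meta description. The length limits below are common rules of thumb, not figures from this checklist, and the URL is hypothetical.

```python
# Minimal sketch: check title and description tags on one page.
import requests
from bs4 import BeautifulSoup

def check_meta(url: str) -> None:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    titles = soup.find_all("title")
    descriptions = soup.find_all("meta", attrs={"name": "description"})
    if len(titles) != 1:
        print(f"{url}: expected 1 title tag, found {len(titles)}")
    elif not 10 <= len(titles[0].get_text(strip=True)) <= 60:
        print(f"{url}: title length {len(titles[0].get_text(strip=True))} looks off")
    if len(descriptions) != 1:
        print(f"{url}: expected 1 meta description, found {len(descriptions)}")

check_meta("https://example.com/")
```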
Open Graph Tags
We add Open Graph tags to the code so that users' future social media posts display correctly. With these tags in place, links to your pages are shown with a title, description and image whenever they are shared on social networks, so they look presentable.
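As an illustration, a minimal Python sketch that renders the basic Open Graph tags for a page; the field values are hypothetical placeholders.

```python
# Minimal sketch: render the basic Open Graph meta tags for one page.
from html import escape

def og_tags(title: str, description: str, image_url: str, page_url: str) -> str:
    fields = {
        "og:title": title,
        "og:description": description,
        "og:image": image_url,
        "og:url": page_url,
        "og:type": "website",
    }
    return "\n".join(
        f'<meta property="{prop}" content="{escape(value, quote=True)}">'
        for prop, value in fields.items()
    )

print(og_tags("Winter Boots", "Warm boots for cold seasons.",
              "https://example.com/img/boots.jpg",
              "https://example.com/winter-boots/"))
```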
Hreflang
If the site has several language versions, hreflang tags should be set up. When many pages contain similar content written for different regions, hreflang allows Google Search to show only the pages relevant to the visitor's language or region.
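A minimal Python sketch of building the hreflang annotations for a page that exists in several language versions; the URLs are hypothetical.

```python
# Minimal sketch: build hreflang <link> elements for one page's
# language versions. Every version of the page should carry the same
# block, including an x-default fallback for unmatched visitors.
def hreflang_links(versions: dict[str, str]) -> str:
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}">'
        for lang, url in versions.items()
    )

print(hreflang_links({
    "en": "https://example.com/en/winter-boots/",
    "de": "https://example.com/de/winterstiefel/",
    "x-default": "https://example.com/en/winter-boots/",
}))
```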
Redirects
We make sure there are no redirect chains or loops, no HTTP protocol in the final URL, and no mirrors or other variations of the homepage. When the website is being restructured or migrated to another domain, we prepare a table of 301 redirects. If the site is not being developed from scratch and its pages are being restructured, we should pass the link equity from the previous pages to the new ones. To do this, we identify the site's pages that external backlinks point to, gather them in a table, filter out duplicates and set 301 redirects, so the link equity reaches all relevant pages of the site and the Google ranking does not drop. If a redirected page has lost its previous content, we find a proper alternative by choosing a page of a higher-level category. If a redirected page had gained solid link equity and ranked well in search but no longer exists, it is far better to create an alternative page with related content, similar meta tags and keywords, and set a 301 redirect to it.
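A rough Python sketch (requests assumed, start URL hypothetical) that follows Location headers manually to surface redirect chains and loops:

```python
# Minimal sketch: trace the redirect hops starting from one URL.
from urllib.parse import urljoin
import requests

def trace_redirects(url: str, max_hops: int = 10) -> list[str]:
    hops = [url]
    while len(hops) <= max_hops:
        resp = requests.head(hops[-1], allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 307, 308):
            break
        target = urljoin(hops[-1], resp.headers["Location"])
        hops.append(target)
        if hops.count(target) > 1:  # loop detected
            break
    return hops

chain = trace_redirects("http://example.com/old-category/")
if len(chain) > 2:
    print("redirect chain:", " -> ".join(chain))
```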
Microdata
Semantic markup improves how the site's snippet is displayed in Google Search. Search robots crawl a page whose elements are marked up with Microdata, and the page is then shown with this additional data in the search results. For example, a price, product image, rating or number of reviews marked up in the code will also appear in the snippet. With such a snippet, a user can immediately see the important information about products and services without even visiting the site, and an attractive, informative snippet encourages users to click the link. Microdata thereby makes traffic more targeted and the site more customer-oriented, both for search engines and for Internet users. An increased snippet CTR is one of the behavioral factors taken into account by search engines and directly affects page ranking.
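A small Python sketch (requests and beautifulsoup4 assumed, URL hypothetical) that lists the Microdata items found in a page's markup, i.e. the itemscope/itemtype/itemprop attributes a search robot would pick up for a rich snippet:

```python
# Minimal sketch: list schema.org Microdata items present in a page.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/winter-boots/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for item in soup.find_all(attrs={"itemscope": True}):
    print("item type:", item.get("itemtype", "(none)"))
    for prop in item.find_all(attrs={"itemprop": True}):
        value = prop.get("content") or prop.get_text(strip=True)
        print("  ", prop["itemprop"], "=", value)
```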
Sitemap
Every site (apart from one-page sites) should have a generated Sitemap.xml file in its root folder. The Sitemap.xml is generated once the structure of all site pages has been determined, right before the launch.
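For illustration, a minimal Python sketch that generates a Sitemap.xml for a fixed list of indexable URLs using only the standard library; the URLs are hypothetical.

```python
# Minimal sketch: write a basic sitemap.xml for a known list of URLs.
import xml.etree.ElementTree as ET

urls = [
    "https://example.com/",
    "https://example.com/winter-boots/",
    "https://example.com/contacts/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = loc

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```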
Robots.txt
Every site must have a robots.txt file in its root folder, where the SEO specialist sets indexing rules for particular search engines or for all of them (Google, Bing, Yandex, Yahoo). Robots.txt gives search bots guidance on how to index the website and which pages to keep out of the index. In addition, it is critical to create a correct robots.txt, since its correctness improves how search bots assess the website and generally improves ranking.
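A minimal Python sketch that checks which pages a live robots.txt allows a given crawler to fetch, using the standard library parser; the domain and the paths being tested are hypothetical.

```python
# Minimal sketch: test URLs against a site's robots.txt rules.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()

for path in ("https://example.com/winter-boots/",
             "https://example.com/search?q=boots"):
    allowed = parser.can_fetch("Googlebot", path)
    print(path, "->", "allowed" if allowed else "blocked")
```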