
It’s important to note that a website can have multiple robots files if you’re using subdomains. For example, if you have a blog on domain.com, then you’d have a robots.txt file for just the root domain. But you might also have an ecommerce store that lives on store.domain.com, so you could have a separate robots file for your online store. That means crawlers could be given two different sets of rules depending on the domain they’re trying to crawl.

Now, the rules are created using something called directives. While you probably don’t need to know what all of them are or what they do, there are two that you should know about from an indexing standpoint. The first is user-agent, which defines the crawler that the rules apply to. The value for this directive will be the name of the crawler; for example, Google’s user agent is named Googlebot. The second directive is disallow. This is a page or directory on your domain that you don’t want the user agent to crawl. For example, if you set the user agent to Googlebot and the disallow value to a slash, you’re telling Google not to crawl any pages on your site. Not good. Now, if you were to set the user agent to an asterisk, that means your rules should apply to all crawlers, so pairing an asterisk user agent with a disallow value of slash tells every crawler not to crawl any pages on your site. While this might sound like something you would never use, there are times when it makes sense to block certain parts of your site or to block certain crawlers. For example, if you have a WordPress website and you don’t want your wp-admin folder to be crawled, then you can simply set the user agent to all crawlers and set the disallow value to /wp-admin (there’s a short robots.txt sketch a little further down). Now, if you’re a beginner, I wouldn’t worry too much about your robots file, but if you run into any indexing issues that need troubleshooting, robots.txt is one of the first places I check.

Alright, the next thing to discuss is sitemaps. Sitemaps are usually XML files, and they list the important URLs on your website, so these can be pages, images, videos, and other files. Sitemaps help search engines like Google to more intelligently crawl your site. Now, creating an XML file can be complicated if you don’t know how to code, and it’s almost impossible to maintain manually. But if you’re using a CMS like WordPress, there are plugins like Yoast and Rank Math which will automatically generate sitemaps for you (a sample sitemap is sketched further down). To help search engines find your sitemap, you can use the sitemap directive in your robots file and also submit it in Google Search Console.

Next up are redirects. A redirect takes visitors and bots from one URL to another, and its purpose is to consolidate signals. For example, let’s say you have two pages on your website on the best golf balls: an old one at domain.com/best-golf-balls-2018 and another at domain.com/best-golf-balls. Seeing as these are highly relevant to one another, it would make sense to redirect the 2018 version to the current version. By consolidating these pages, you’re telling search engines to pass the signals from the redirected URL to the destination URL (one way to set this up is sketched below).

The last point I want to talk about is the canonical tag. A canonical tag is a short snippet of HTML code (you’ll see one in the examples below). Its purpose is to tell search engines what the preferred URL is for a page, and this helps to solve duplicate content issues.
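To tie the robots.txt directives above together, here is a minimal sketch of the WordPress example from the lesson; the sitemap URL is just a placeholder.

    # Apply the rules below to every crawler
    User-agent: *
    # Keep crawlers out of the WordPress admin folder
    Disallow: /wp-admin/

    # Note: a bare "Disallow: /" here would block the entire site
    # Point crawlers at your sitemap (placeholder URL)
    Sitemap: https://domain.com/sitemap.xml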
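The lesson doesn’t show a sitemap itself, but a bare-bones XML sitemap of the kind plugins like Yoast and Rank Math generate looks roughly like this; the URL and date are placeholders.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- one <url> entry for each important page on the site -->
      <url>
        <loc>https://domain.com/best-golf-balls</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
    </urlset>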
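How you implement a redirect depends on your server or CMS, which the lesson doesn’t get into. As one possibility, assuming an Apache server, you could add a permanent (301) redirect to your .htaccess file like this, using the hypothetical golf-ball URLs from above.

    # Permanently redirect the outdated 2018 post to the current version
    Redirect 301 /best-golf-balls-2018 https://domain.com/best-golf-balls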
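The canonical snippet shown on screen isn’t in the transcript, but the tag itself is a single line of HTML placed in the page’s head; for a site whose preferred version is the secure HTTPS homepage, as in the example that follows, it would look something like this.

    <head>
      <!-- Tell search engines the preferred URL for this page -->
      <link rel="canonical" href="https://yourdomain.com/" />
    </head>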
For example, let’s say your website is accessible at both http://yourdomain.com and https://yourdomain.com, and for whatever reason you weren’t able to use a redirect. These would be exact duplicates, but by setting a canonical URL, you’re telling search engines that there’s a preferred version of the page. As a result, they’ll pass signals such as links to the canonical URL so they’re not diluted across two different pages. Now, it’s important to note that Google may choose to ignore your canonical tag. Looking back at the previous example, if we set the canonical tag to the unsecure HTTP page, Google would probably choose the secure HTTPS version instead. Now, if you’re running a simple WordPress site, you shouldn’t have to worry about this too much. CMSs are pretty good out of the box and will handle a lot of these basic technical issues for you.

So these are some of the foundational things that are good to know when it comes to indexing, which is arguably the most important part of SEO, because again, if your pages aren’t getting indexed, nothing else really matters. Now, we won’t dig deeper into this, because you’ll probably only have to worry about indexing issues if and when you run into problems. Instead, we’ll be focusing on technical SEO best practices to keep your website in good health.

Hey, it’s Sam Oh, and welcome to the final lesson in this module. Actually, it’s the last lesson in Ahrefs’ SEO course for beginners. In this lesson, we’re going to go through some technical SEO best practices so you can keep your site in good health. Let’s get started.

So the first thing you should do is ensure that your site structure follows a logical hierarchy. Site structure is simply the way you organize content on your website. You can think of it like a mind map: at the top you’d have your home page, then you’d probably have main topics that branch out from your home page, like your services page, your blog, and your about page. From these main topics, you’d probably have even more branches to other pages. These branches represent internal links, which are just links from one page on your site to another, and they help search engines understand the relationship between these pages. Site structure also helps search engines to crawl your pages more efficiently, which is why having a logical hierarchy is important. Now, what we’ve talked about is pretty basic stuff, and you may already be doing this, but it can get more complex as you add more pages to your site, like blog posts, category pages, or product pages. We have a full video on how to use internal links to rank higher on Google, so I’ll link that up for you in the description.

Alright, the second thing is to ensure your pages don’t load slowly. As you may know, page speed has been a confirmed ranking factor for desktop search since 2010, and in 2018 Google announced that they’d be using page speed in mobile search rankings. Now, you don’t need to obsess over every millisecond it takes for your page to load. Google says that the “Speed Update,” as they call it, will only affect pages that deliver the slowest experience to users and will only affect a small percentage of queries. So, bottom line: you don’t want your pages to load slowly, and there are two very basic things that I think every website should do. The first is to cache your website’s content. Caching is basically a way to temporarily store copies of files so they can be delivered to visitors in a more efficient way, and most web hosting companies that I’ve come across have caching features.
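The lesson leaves the details of caching to your host or a caching plugin, but as a rough sketch of what typically happens underneath, browsers are told to keep local copies of static files via Cache-Control response headers. On an Apache server, an .htaccess rule along these lines would do it; the file types and one-year lifetime here are just illustrative.

    <IfModule mod_headers.c>
      # Let browsers cache static assets for up to one year
      <FilesMatch "\.(css|js|jpg|jpeg|png|gif|svg)$">
        Header set Cache-Control "max-age=31536000, public"
      </FilesMatch>
    </IfModule>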