Hello everyone, welcome to a new case study episode in this series. In this case study we will be focusing on Debt.com, an SEO project that had been losing traffic for the last two and a half years. It was heavily affected by the Helpful Content Update, and we will be explaining the main problems on this website and how we helped it reverse its negative ranking state: for the first time in two and a half years it actually started to increase its rankings again, while its main competitor is losing rankings. Since the domain here is an exact match domain, we will also be focusing on the EMD nature of the project, and since we already have a holistic case study on the exact match domain world and its methodologies, I won't be repeating the same information. That's why I suggest reading that case study and also checking some of the other episodes in the case study playlist on our channel. It will be really helpful for you to understand why search engines tend to rank exact or partial match domains, and what happens when there are multiple exact match domains with the same name. We will also be examining the August 15 broad core algorithm update, what it is trying to do, and how we helped this website reverse its negative ranking state. Again, every SEO case study here focuses on unique and different suggestions and subjects of SEO science for that specific case, which means I won't be repeating some of the obvious things; if you want to understand them, I strongly suggest watching the previous videos as well.

So I will be showing it from Ahrefs first. These are Debt.com's two-year and five-year metrics. You will see that for over two years it was losing rankings most of the time, and in this case, with the August 15 update, it started to increase its queries together with its rankings as well. One main thing here is that after winning the August 15 core update, the website, like many project owners, stopped publishing once the rankings started to increase. If you watched just the previous video, which explains a similar finance source, you will see the 2% Discovery hits percentage in the Crawl Stats analysis there, and you will see why that website stopped being prioritized for further crawling and for ranking for new types of queries. Once it started to be crawled less and published less, and the crawl hits sank lower, it also started to lose some rankings, not in terms of traffic but in terms of indexation, because the total query numbers were decreasing. If you watch that episode, you will see the dedicated suggestions for this, so I won't be focusing on that angle one more time.

With that said, about the main competitor, Debt.org: the main thing is that since 2019, in every SEO case study I publish, I mainly focus on the core updates. If you lost a core update or any major Google update, until the next update you won't be seeing any recovery; you won't be able to neutralize most of the negative ranking state, you can only slow it down. So you should always focus on the core update dates. If you have a win during a core update, you will see sudden increases or sudden reversals, and here we see that on August 15 this specific website started to increase its rankings. After that, in this area after August, it was able to go from, let's say, 7,000 to 12,000, with the query counts improving as well. But one more time, since it stopped publishing at that moment, the search engine started to decrease the prioritization of the source again.
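As a side note, the Discovery-hit share mentioned above can be checked with a few lines of code. This is a minimal sketch assuming a hypothetical export of crawl hits labeled by purpose, similar to the "By purpose" breakdown (Discovery vs. Refresh) in Search Console's Crawl Stats report; the URLs and counts below are invented for illustration.

```python
from collections import Counter

# Hypothetical crawl-log rows as (url, purpose) pairs, mimicking the
# Discovery / Refresh split in Search Console's Crawl Stats report.
crawl_hits = [("/new-debt-guide", "Discovery")]
crawl_hits += [(f"/old-page-{i}", "Refresh") for i in range(49)]

def discovery_share(hits):
    """Percentage of crawl hits whose purpose is 'Discovery' (new URLs)."""
    counts = Counter(purpose for _, purpose in hits)
    total = sum(counts.values())
    return 100.0 * counts["Discovery"] / total if total else 0.0

print(discovery_share(crawl_hits))  # 2.0
```

A share this low suggests the crawler spends almost all of its budget refreshing known URLs rather than discovering new ones, which matches the deprioritization described above.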
When it comes to the main competitor, Debt.org: after we won the main core update, it strongly started to lose rankings. One main thing here is that once you see this type of change alongside your competitor, I strongly suggest you try to see how these websites reverse-correlate with each other as much as possible. For instance, this section here and this section here are clearly moving slightly inversely to each other, because both of them are exact match domains, and a search engine can hardly rank multiple exact match domains for a phrase like "debt"; it is really hard for it to trust multiple websites with the same brand name. This goes back to a concept from the old Google days, or let's say old-school SEO: Google bombing. Basically, it means directly creating multiple websites with the same name and then filling the entire first three pages of the results with exact match domains. After that point, Google started to decrease the ranking chances of exact match domains, and usually it chooses one representative, which means these two are survival twins: while Debt.org is increasing in the rankings, Debt.com won't be able to return to its old days. So the topical map created in this area initially has to focus on competing with Debt.org for its rankings, and the same goes for the backlink processes as well; we will be focusing on PageRank distribution in this project too.

So one main problem in the source is stopping the publications after winning the core update, because they think it will continue like that. But in normal conditions, as I explained in the previous finance case study, these are called opportunity windows against your competitor. While your competitor is losing rankings and you are gaining, it means that until the next core update you will be more rankable and they will be less rankable, and this is a perfect opportunity to show the search engine that you are covering as many topics as they are. We have 66,000 queries in this case, or let's say 6,000 from the United States, and the traffic compared to the competitor is really low, which means the website doesn't have any problem in terms of relevance; the main problem is more about responsiveness and overall comparative authority. In this case, if this source keeps publishing new documents and completes its topical map, by the next core update it can directly replace its survival twin and rank way higher. So this is the second point: if you are in an exact match domain project, you should always calculate how many different projects exist with the same name and whether you can actually exceed their authority. Here we have a similar comparison between the .com and .org versions of the same name; that's why I'm telling you I won't repeat the same suggestions, read that case study.

So, returning to our context: to be able to reflect your authority, you need proper PageRank distribution for these old projects, because this website was created around 1999; it's a very old website and it has a really good amount of links. The main problem here is that this website has a research section, but the research section doesn't internally link anywhere. For instance, the credit card survey has a really
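The "reverse correlation" check between the two exact match domains can be made concrete with a quick Pearson correlation over their visibility series. A minimal sketch; the weekly query counts below are invented for illustration, not the real Debt.com or Debt.org data.

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient. Values near -1 mean the two
    series move in opposite directions: the 'survival twin' pattern."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented weekly query counts around a core update window.
site_com = [7000, 7500, 8200, 9400, 11000, 12000]  # the winner
site_org = [9000, 8600, 8100, 7200, 6300, 5800]    # the twin losing ground

r = pearson(site_com, site_org)
print(round(r, 3))  # strongly negative: the two EMDs trade rankings
```

A strongly negative coefficient over update windows is consistent with the idea that the search engine is choosing one representative among same-name domains rather than ranking both.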
good amount of links, as you can see in this area. Pages like the medical debt survey and some other surveys or statistics all earn links really well, but the problem is that until very recently, if you clicked one of these, you would see that they still don't have proper internal links through the main content. That means you get all of your PageRank to this page (all the external links just go here), but from here it is not distributed with contextual relevance to the other subsections of the website. So you should always understand the concept of PageRank sculpting and how you should be distributing your internal and external PageRank. This was one of the problems, and to solve it I created a really different type of topical map. Usually people think that a topical map is only for semantics and not related to PageRank, but that's not true: the semantics live in the content, in the images, and in the links as well. In this case, if we have a survey page for the medical debt survey, you should be linking to medical debt consolidation, which is directly a monetization page in the core section of the topical map, and then to other related concepts like medical billing or mental health, because these are surveys about debt: mental health and money surveys are connected to debt stress, which is a psychological topic, and connected one more time to the medical, let's say health, context, in the context of debt, debt consolidation, or debt relief. That's why, in these specific content briefs, you will see that we are distributing PageRank by taking it from external sources and spreading it through internal links. Soon I'll be adding new lectures to the course, especially for expired domains, explaining how you can revive the PageRank from twenty-year-old backlinks and properly distribute it to your internally newly created subsections, because usually people buy, let's say, a research journal or research magazine and then turn it into an affiliate or e-commerce source. Since the website's previous state and current state are too different from each other, sometimes even if the search engine initially ranks these EMDs or expired domains well, after a point it rebalances their relevance. To get a permanent increase there, we try to adjust this relevance flow as naturally as possible. That's why I suggest you try to join the course, join our community, read these case studies, or join our cohort systems, meet with people, and try to understand how they are implementing these topical maps. Out of respect for the client, I won't be opening the initial briefs; this is only about the third time I'm showing some of the initial files.

This brings us to another subject, which is web decay. One problem in this type of project is that since they are really old, they suffer from web decay, which means everything in this specific source has already decayed. We can try to understand it like this: if I search for "types of interest rate", you will see that Google crawls the entire web and then tries to extract the definitions of the different interest rate types, but some of these things are actually wrong. That's normal, because open information extraction is not easy. And if you look at the results, we have Investopedia with 40 million clicks, around 6 million keywords, and half a million referring domains; the second one with 12 million traffic, 1.5 million keywords, and 14,000 referring domains; and the fourth-ranking one with 4 million clicks, 1.
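To make the PageRank sculpting point above concrete, here is a toy sketch (not the actual tooling used in the project; the page names are illustrative) comparing two internal-link layouts: one where a link-magnet research page hoards its PageRank, and one where it links out to a monetization page, as the content briefs describe.

```python
def pagerank(graph, damping=0.85, iters=100):
    """Basic power-iteration PageRank over {page: [internal outlinks]}.
    Dangling pages (no outlinks) spread their score evenly, as in the
    classic formulation."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in graph.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling: the earned PageRank leaks away evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

hoarding = {  # the survey page earns links but passes nothing on
    "/": ["/medical-debt-survey", "/medical-debt-consolidation"],
    "/medical-debt-survey": [],
    "/medical-debt-consolidation": ["/"],
}
# Same site, but the survey now links to the monetization page.
sculpted = dict(hoarding, **{"/medical-debt-survey": ["/medical-debt-consolidation"]})

money_before = pagerank(hoarding)["/medical-debt-consolidation"]
money_after = pagerank(sculpted)["/medical-debt-consolidation"]
print(money_before < money_after)  # the monetization page gains PageRank
```

The point isn't the exact numbers but the mechanism: without contextual outlinks, the research page's externally earned PageRank is dissipated instead of flowing to the monetization pages that need it.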