Hello everyone, welcome to another Blue Hat SEO series episode. In this episode we'll be focusing on a new SEO case study that I have published. First of all, this SEO case study is mainly published to help small and medium-sized businesses, especially health-related ones. Normally I don't actually call health an "industry," but let's say health-related businesses or services: hospitals, doctors, and any kind of clinic can benefit from this specific case study.

I would have liked to share these things earlier; this is actually from last month, but lots of things are happening inside and outside the agency. As you know, we performed the first Holistic SEO Mastermind directly in our country, in Kuşadası, Turkey, where I also moved; that's why my background looks different, because it takes time to fill a giant villa with furniture. So we already performed our first private, inclusive, invite-only Mastermind, and we had really good people who joined us, like the owner of Search Atlas, the creator of Edward Tools, the CEO and owner of Cloudways, and many other business people; they were directly here. We had three days of Mastermind: one for SEO, one for automation, one for business. We also performed a mini conference and two different touristic visits. This is my way of announcing it, since it was a private event that we don't promote. Next year too we will keep doing it: one conference in Istanbul and one private, invite-only Mastermind in Kuşadası, every year. This was a fun way of announcing that.

In this specific newsletter I also announced this new SEO case study, and one from this channel already: this was the last Blue Hat SEO video, but that video was recorded nearly one and a half years ago. I believe it still keeps its validity, so I suggest you check it. It is a long one, not like a typical Blue Hat video; it is like a mini lecture, a mini course, mainly for programmatic SEO, and this new one is mainly for healthcare.

Besides that, of course, we keep publishing new success shares from our community. Our friend Patrick Stolp shared today that he designed a semantic content network for a B2B business, which has directly seen an increase in the rankings with really exponential growth; he thanks us and we thank him as well. There are 14 more success shares, numbers 246 to 260 from the last one and a half years, that I still need to complete, but I don't have time to announce all of them on social media.

Today I will mainly be focusing on this case study. I will transfer it into a simple video form so that you can understand some simple, practical checklists that you can always use for local SEO, technical SEO, and image or visual-related metrics. Then I would like to focus on the concept of the cost of retrieval and how we can evaluate it, because when I say "the cost of retrieval," people usually focus on the technical aspects, the page speed, or the page size, but that's not what we actually mean. We mainly mean that, according to your source context, the content will be evaluated in a different way; the algorithms, or the large language models that will be used on your website, will differ according to the prominence of your website. First, they can focus on question-answering algorithms for your site, or they can focus on topic modeling.
All these specific NLP tasks have different types of costs; we will explain that in the last, bottom part. There you will see four different websites. As you know, wello.com is a case study and success story that I published way earlier; Adonis Hakim, the owner of the site, has been asking me to publish something like a case study. The first section I kept very simple for people; the last part is slightly more detailed, with practical examples mainly about the semantics. So it is a kind of unification of a Blue Hat, simple-step case study with a small Holistic SEO case study with advanced methods as well. There is a simple example of topical map creation here, and also the difference between a raw and a processed topical map. The video will probably be around 20-25 minutes, so stay with me; let's do some advanced stuff and also some simple checklists. You can also read the article, of course.

The article has been divided into three sections. The first one mainly focuses on local businesses. In this section you will see a simple to-do list, just 12 steps, for your local business listings. Some of them are really simple, but some have a few nuances. For instance, many businesses don't use the questions-and-answers section in their Google Business Profile. You can take all the Reddit threads that are related to your industry, and the threads that also take traffic from Reddit, and then use these Reddit threads as questions in your Google Business Profile; you can also write answers for them (a small sketch of this follows below). Sometimes these rank directly in the People Also Ask questions or similar features as well. It also helps you justify your relevance for certain aspects: if you're able to use certain n-grams, or let's say contextual phrases, in these questions and answers in your Google Business Profile, along with the reviews and the answers to the reviews, it will be really helpful for increasing your entity's relevance to the topic you are trying to rank for.

Another part is the service area. Even if you have a physical address, you should always define some service areas for yourself, and try to choose the areas you are actually targeting. You might have a physical address in Turkey for health tourism, but you should still have at least one virtual office in, let's say, Ireland, because you might be getting your clients from there, or you should mark it as your service area directly, as much as possible.

One of the websites here is oscarwylee.com.au, and for the first time in its history it exceeded the rankings of opsm.com.au, its main competitor, by just using semantics for the last two years; we didn't use any links or anything like that. I will also show you another website which uses links too. As you know, we are not against links; we just use different methods according to different conditions. That's the definition of our approach: anything that works for ranking.

There are some other parts here. Since I wrote this article quite a while ago, I also say in it to activate the messaging feature of your Google Business Profile, but around July, I believe, Google actually deactivated it. Still, you can use Google's own Chat API to integrate chatting, a chat service, or customer service into your website; it will be helpful for doing that.
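Going back to the Reddit-questions idea in that checklist: here is a minimal Python sketch using Reddit's public JSON search endpoint. The subreddit, query, and question-filtering heuristic are my own placeholders for illustration, not something from the case study, and the endpoint is rate-limited, so send a descriptive User-Agent.

```python
# Hypothetical sketch: collect Reddit thread titles as candidate
# Google Business Profile questions. Subreddit and query are placeholders.
import requests

HEADERS = {"User-Agent": "gbp-question-research/0.1 (contact: you@example.com)"}

def reddit_question_candidates(subreddit: str, query: str, limit: int = 25) -> list[str]:
    """Fetch thread titles from Reddit's public JSON search endpoint."""
    url = f"https://www.reddit.com/r/{subreddit}/search.json"
    params = {"q": query, "restrict_sr": 1, "sort": "top", "limit": limit}
    data = requests.get(url, headers=HEADERS, params=params, timeout=10).json()
    titles = [child["data"]["title"] for child in data["data"]["children"]]
    # Keep only titles phrased as questions, since those map naturally
    # onto the Q&A section of a Business Profile.
    starters = ("how", "what", "why", "can", "is", "does", "should")
    return [t for t in titles
            if t.strip().endswith("?") or t.lower().startswith(starters)]

if __name__ == "__main__":
    for q in reddit_question_candidates("AskDocs", "back pain"):
        print(q)
```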
There are many other things here, like using QR codes to get more reviews for your business. Another one: if you have more photos in your business listing, it will get more impressions and rankings, and you should also try to use EXIF data for these images if you can. And I don't suggest anyone use fake reviews; lately, as you know, Google got more aggressive, and they can suspend Google Business Profiles. It isn't worth losing your three years of Google Business listing just because of fake reviews. But, as you know, no one can really prove them either, because if you were able to get a Google Business Profile shut down just based on fake reviews, you could also do it to your competitor; that's the thing that always holds Google back, as long as people don't play with their patience. Sometimes, though, you can try to support yourself if there is a negative-review attack on you. I didn't write this in the article, but as an unwritten suggestion: if people attack your business profile with negative reviews, try to have some Level 9 Local Guide accounts. Thanks to these accounts you can report and remove these negative reviews; if you have a few high-level, experienced accounts in your, let's say, tool list, a Level 9 Local Guide account among them, you can report the reviews, and it will help you remove these types of negative reviews as well.

One more thing is the review answers: try to use contextual answers as much as possible, aligning them with the title tags on your website, because your website and your Google Business Profile are a pair: one of them is trying to rank on the web, the other is trying to rank on the map. If you're able to align them in an entity-oriented way, it will amplify your ranking possibility even further.

With that said, the second section mainly focuses on technical optimization in healthcare SEO. There are different types of user agents here, and the one I want to mention is GoogleOther. No one has really focused on this, but it is a new user agent, and it is mainly used for research and development. There are many websites that I need to mention in this case study; for instance, doktorsitesi.com is one of them, but it will be a future article. This website had been losing rankings, and in just the last five months we deleted many pages, added some new ones, and were able to reverse the ranking state gradually and slowly; this is the first time in the last two and a half years that the website gained new queries and new rankings. The important thing here is that if your website won the core update from the 15th of August, you can see this GoogleOther agent a lot. If you see it, it means your website has been used for internal research, and if it takes more than 30% of the crawl allocation, it means your site is even more important; check this section. Also, if these hits are landing on 404 URLs, make sure they are redirected somewhere important. If you have too many 404s, they will remove your website from these tests, and it's better to be tested for these new AI features as much as possible rather than being delayed for that part.
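To make that GoogleOther check concrete, here is a rough Python sketch against a standard combined-format access log. The regex, the file layout, and reading 30% as the notable threshold are illustrative assumptions, and real Googlebot verification (reverse DNS of the hitting IPs) is left out.

```python
# Hypothetical sketch: measure how much of the Google crawl goes to
# GoogleOther, and which 404 URLs it keeps hitting (301 candidates).
import re
from collections import Counter

LINE = re.compile(
    r'"(?:GET|POST) (?P<url>\S+) HTTP/[^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def crawl_report(log_path: str) -> None:
    hits = Counter()        # hits per Google bot family
    not_found = Counter()   # 404 URLs hit by GoogleOther
    total = 0
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LINE.search(line)
            if not m:
                continue
            ua = m.group("ua")
            bot = ("GoogleOther" if "GoogleOther" in ua
                   else "Googlebot" if "Googlebot" in ua
                   else None)
            if bot is None:
                continue
            total += 1
            hits[bot] += 1
            if bot == "GoogleOther" and m.group("status") == "404":
                not_found[m.group("url")] += 1
    share = hits["GoogleOther"] / total if total else 0.0
    print(f"GoogleOther share of Google crawl: {share:.1%}")  # >30% = notable
    for url, n in not_found.most_common(10):
        print(f"404 hit {n}x by GoogleOther -> consider a 301: {url}")
```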
Again, when you come to some other sections, like Google-Extended: usually people try to block this in robots.txt, but I suggest you try to be more tolerant, because if your content can be used through the Gemini side and you're able to satisfy users, Google will try to understand your website better. Since they focus on AI, if you allow your site to be tested and used for internal research and AI-type features, it will help you gain further prioritization in certain types of, let's say, value understanding for your website.

Again, there are some metrics I would like to mention here. One of them is response time optimization: as I always say, try to aim for under 100 milliseconds. For DOM size, try to aim for under 900 elements. For semantic HTML: always use semantic HTML, especially for your parent elements. For user agent analysis, I will probably do a separate case study just for user agent log analysis, but the point is to see whether Google is coming as Googlebot, GoogleOther, Google-Extended, or from some other area; try to focus mainly on Googlebot Smartphone and GoogleOther right now, it will be more helpful. Try to turn any 302, 404, or 410 into a 301 that makes sense.

For log file analysis, you might have a very large number of unnecessary URL hits; this is something I always check. Let's say your website has just 10,000 products, but in your log files we see 800,000 different URLs, and only, let's assume, 5,000 of them are from your product pages. In this case the search engine probably remembers many old URLs, and you should try to clean them out of your logs; there is a need for a new case study for that too, because it is a very detailed, different type of case study. I usually try to put at most 1,000 URLs into a sitemap; if you are able to chunk your sitemaps, it will be better for optimizing your rankings.

There are many other things here that I suggest you check, for web security optimization, DNS prefetch, or, let's say, preconnect. For instance: if it is not about the content, use dns-prefetch; if it is about the content, use preconnect. These are two different things, and I suggest you check the difference between them to understand it further. Never skip security optimization either, at least with your response headers. Always activate a PWA for your website; it will help you get further and more frequent Google hits. When it comes to structured data, I have given some structured data samples as well, and I have also given some simple examples for dynamic rendering.

To go a little deeper, let me explain a few concepts here. CSS refactoring means you are trying to decrease the number of CSS selectors on your website. You can even use ChatGPT right now; I have created many GPT agents for my community, and one of them focuses on CSS refactoring: you give it your HTML, and from the HTML it gets the list of CSS selectors; you give it your CSS, it checks which selectors are used, and it cleans the CSS for that page type (there is a small sketch of this below). If you have certain types of product pages, you can create a product CSS, and the product CSS doesn't have to carry the unused selectors in this case. JavaScript tree shaking means you are trying to remove unused functions and unused variables; it is also really helpful for increasing your crawl efficiency. Anything that is not relevant to the content should also be deferred, especially for JS.

When it comes to dynamic rendering, it is a long story, but basically: imagine you are able to show a clean version of your website by just checking the user agents. You say that if the user agent contains Googlebot, or another search engine crawling system's user agent, you serve a different HTML for it, and this HTML version can be much cleaner than the other one.
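As a minimal sketch of that user-agent check, assuming a Flask server and a hypothetical snapshots/ directory of pre-rendered pages (neither comes from the case study):

```python
# Minimal dynamic rendering sketch: crawler user agents get a
# pre-rendered, cleaner HTML snapshot; humans get the normal JS app.
from flask import Flask, request, send_file

app = Flask(__name__)

CRAWLER_TOKENS = ("Googlebot", "GoogleOther", "bingbot", "DuckDuckBot")

def is_crawler(user_agent: str) -> bool:
    return any(token in user_agent for token in CRAWLER_TOKENS)

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def serve(path: str):
    ua = request.headers.get("User-Agent", "")
    if is_crawler(ua):
        # Pre-rendered snapshots live in ./snapshots (hypothetical path).
        # A real setup must sanitize `path` against traversal attacks.
        return send_file(f"snapshots/{path or 'index'}.html")
    return send_file("app/index.html")  # normal client-rendered shell
```

The important design constraint is that the two versions stay content-equivalent: the snapshot may be lighter in CSS and JS, but if the visible content diverges, this stops being rendering help and starts being cloaking.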
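And going back to the CSS refactoring workflow described a moment ago, here is a naive sketch of the same idea with BeautifulSoup. A production tool would need a real CSS parser, since this split breaks on @media blocks and comments, and selectors that BeautifulSoup cannot evaluate are conservatively kept.

```python
# Sketch of CSS refactoring: list the selectors in a stylesheet, test
# each against a page's HTML, keep only rules that match something.
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def used_css(html: str, css: str) -> str:
    soup = BeautifulSoup(html, "html.parser")
    kept_rules = []
    # Naive rule split; fails on @media blocks and comments.
    for rule in css.split("}"):
        if "{" not in rule:
            continue
        selectors, body = rule.split("{", 1)
        for selector in selectors.split(","):
            selector = selector.strip()
            if not selector:
                continue
            try:
                matched = bool(soup.select(selector))
            except Exception:
                matched = True  # pseudo-classes etc. -> keep to be safe
            if matched:
                kept_rules.append(f"{selectors.strip()} {{{body.strip()}}}")
                break  # one matching selector is enough to keep the rule
    return "\n".join(kept_rules)
```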
Again, there are other types of canonicalization or, let's say, caching suggestions, and of course always try to mainly use the picture element with srcset, because this will allow you to save a really good amount of bandwidth. One other trick here: put AVIF as the first source and JPG as the second, because AVIF can't carry the EXIF data or the license of the image, but JPG can. This way you show Google the JPG for EXIF and the AVIF for speed. I don't use WebP, since it creates some indexation misunderstanding for some of the search engine crawling hits.

When it comes to the semantics, the most important part: until here it was just like a Blue Hat case with simple suggestions, but after this point I wanted to focus on one of the things I hadn't explained in detail before, at least with a certain level of detail. I took these two sentences as an example: "The weather is nice today." and "I am planning to go for a walk in the park." Two simple sentences; one of them has six tokens, the other has twelve tokens, which means that for NLP processes, or natural language processing related tasks, our cost for these two sentences won't be the same. When we say the cost of retrieval, it is mainly about how the index can be refined further by understanding the topicality, and also how different types of knowledge representations can be created, whether triples, frame semantics, or semantic role labels; and the process is not always the same. That's why in this area I have tried to give a certain kind of cost understanding. I didn't show the CPU cost here in an exact way, because to show any cost at all, the number would have to be really long after the decimal point; that's why it says zero, but it's not zero, it's more like a 0.00001 type of number.
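To make the token arithmetic tangible, here is a toy calculation with the same two sentences. The tokenizer and the per-task costs are invented for illustration only; they are not real search engine numbers.

```python
# Toy illustration of the token-cost point: the same per-token price
# multiplies across every NLP task a search engine runs on your text.
import re

def tokens(text: str) -> list[str]:
    # Crude tokenizer: words and punctuation marks.
    return re.findall(r"\w+|[^\w\s]", text)

SENTENCES = [
    "The weather is nice today.",                  # 6 tokens
    "I am planning to go for a walk in the park.", # 12 tokens
]

# Hypothetical relative costs per token for a few NLP tasks.
TASK_COST = {"pos_tagging": 1.0, "named_entity_recognition": 2.5,
             "coreference_resolution": 4.0, "entity_linking": 6.0}

for s in SENTENCES:
    n = len(tokens(s))
    total = n * sum(TASK_COST.values())
    print(f"{n:2d} tokens | relative processing cost {total:6.1f} | {s}")
```

Doubling the tokens doubles the cost of every downstream task at once, which is the whole argument against contextless or unnecessary words.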
But the main point here is that, according to the token number, as you can see, the cost, especially the time cost, will increase. Whenever you use a contextless or unnecessary word, or whenever you structure your sentences in the wrong way, you are causing the search engine to tire. Let's say your website has 600 pages and 6 million sentences: to process these 6 million sentences and create an information graph from just your website, all these NLP-related tasks have to be performed, whether machine translation, part-of-speech tagging, let's say coreference resolution, entity linking, or other types of NLP tasks. Here I have given different types of aggregated, estimated costs for different tasks: for named entity recognition this might be the cost, and for POS tagging this might be the cost. What the search engine will decide first is whether you are worth understanding, worth being processed, or not. That's why you sometimes see that some websites suddenly get all the featured snippets, or get used everywhere: the search engine finds them more efficient, more accurate, and cheaper to rank, cheaper to understand, than other sources on the web. So here we explain this concept slightly.

Then I focus on how to create a topical map. In this case I have also created a simple table: let's say you are an e-commerce health website, or a dentist, optometrist, dietician, hospital, clinic, doctor, or whatever; anything related to your source context will affect what types of pages you will have. According to the types of pages, and what functions these pages have, how you approach these topics will need to change. For instance, if you write about what the main cause of cancer is, we will of course give a certain type of answer here, but your answer has to be connected to what you do. You should always justify why you are ranking there; you shouldn't just chase the traffic. The search engine should know and understand your business, and it should create a relevance between your business entity and the topic. If your entity is not related to the topic, you will just be forcing the search engine based on relevance and PageRank, but eventually they will drop the website from there. This is one of the reasons many affiliate sources are losing traffic: they are "wall websites" in my terminology, which means you are looking at the wall and the wall is looking at you; it doesn't have a function, it's just text on a wall.

Another thing: the topical map is not a list of keywords; it's not a list of entities, nor a list of topics. It mainly explains which topics will be on the same page, in what format, and where they will link to; and, in this format, what types of functions, components, content types, and contexts will be processed in what order. It is a kind of coordinate structure and organization for our future content. In this case, if it is an e-commerce site, what you internally link from "the causes of cancer" will change according to your source: if you are an e-commerce site you will link to one thing; if you are, let's say, a hospital or clinic, you will link to something else. Your topical map will change according to that, and I am explaining these things here.
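As a way to visualize that "coordinate structure" point, here is a small, hypothetical data-structure sketch; the field names are my own illustration, not a formal part of the methodology.

```python
# A topical map node is not a keyword: it fixes which topics share a
# page, in what format, in what order, and where the page links.
from dataclasses import dataclass, field

@dataclass
class TopicalMapNode:
    page_topics: list[str]        # topics consolidated onto one page
    section: str                  # "core" or "outer"
    content_format: str           # e.g. "definition", "how-to", "listicle"
    component_order: list[str]    # e.g. ["short answer", "table", "faq"]
    internal_links_to: list[str] = field(default_factory=list)

cancer_causes = TopicalMapNode(
    page_topics=["main causes of cancer", "cancer risk factors"],
    section="outer",
    content_format="definition",
    component_order=["short answer", "expanded causes list", "faq"],
    # An e-commerce source and a clinic would link differently from here:
    internal_links_to=["/oncology-screening-appointment"],  # clinic version
)
```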
It is also explained in these specific sections what types of questions or components should follow what types of components. In the last part, I directly give the five main components of topical maps. Many SaaS owners have been trying to cooperate with me lately, and before cooperating with any type of SaaS I would like to train them on the truth about this terminology. I created this concept, I invented it, but lately it has started to become a buzzword; everyone tries to use it just for selling purposes, to increase the valuation of their SaaS while also taking a really good amount of investment from investors. But it should serve a purpose in a functioning way.

Here I also give some website examples, in a local SEO context too, because, as you know, Oscar Wylee is, for the first time, passing opsm.com.au. I have also given some examples of how you can read the SERP and create a kind of topical map just by reading the SERP a little.

Then I have also given an example, mainly for a chiropractor. This is a raw topical map. Let's say we first get certain types of entities: if you look at these, they are anatomy terms relevant to chiropractic, or chiropractors. Then we get commercial terms, again mainly for chiropractors, especially for localities. Then we bridge them to connect them to each other: these are directly about the chiropractor, these are about the location, and these are about the anatomy that always connects back to the chiropractors. So if you want to process these three different lists of entities to form a kind of topical map, you will need to understand which section can be connected to which other area.

When I say the outer section: if you read this part, you will see that every topical map has five components; one is the core, and the second one is the outer section. The outer section is not directly for monetization; it is mainly for historical data. Historical data means clicks, click-throughs, query sessions, or any type of query log session and search engine result page behavior that the search engines follow through the years. If you're able to get more impressions, if the search engine is confident enough to test your website, and if you're able to pass the quality tests and relevance tests, you will start to increase your rankings further and further. This is mainly about that: the outer section allows you to gather more historical data and flow your relevance to the core section, where you are trying to make money. That's why, in this methodology, you don't always need external links: you mainly use historical data, with user behaviors and query log sessions, to trigger this ranking. But if you use external PageRank, of course, this process will be faster, because to trust a website, search engines really have only two kinds of signals: user behaviors, or, again, users from a different angle, which means links, or your mentions, and the sentiment, meaning, or context around these links and mentions. So if you use only historical data, it will still work, but it will work somewhat more slowly.

Then I explain how you can merge these outer and core sections with each other by focusing on the attributes of the chiropractor, like certificates. After that we give certain types of entities again for that; then we explain a bit how you can use our agents and how you can start to create this topical map.
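As a toy illustration of that bridging step, assuming three placeholder entity lists rather than the case study's real ones:

```python
# Bridging the three entity lists of a raw topical map: service
# entities get localized pages (core, monetization), anatomy entities
# stay locality-agnostic (outer, historical data).
from itertools import product

services = ["back pain treatment", "spinal adjustment"]
localities = ["New York City", "Brooklyn"]
anatomy = ["lumbar spine", "sciatic nerve"]

core_pages = [f"{svc} {loc}" for svc, loc in product(services, localities)]
outer_pages = [f"{topic} anatomy and function" for topic in anatomy]

print("Core (monetization, localized):", core_pages)
print("Outer (historical data, locality-agnostic):", outer_pages)
```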
For instance, for the entity of back pain treatment, you will directly see that there are some sub-connections: acute back pain, chronic back pain, lower back pain, upper back pain, middle back pain, mechanical back pain, radicular pain, and many other things. So does it mean you have to open separate pages for every variant of back pain treatment? No, it doesn't. That's why I'm telling you that the topical map is the best form of knowledge representation for the highest possible relevance. In this case, you will need to understand that, for instance in this area (let me just find the section), "back pain causes and treatments" should be a locality-agnostic web page, which means it should be a page about the subject rather than a locality. To connect it to the core section of the topical map, for monetization purposes, a "back pain treatment New York City" page should be created. Then does it mean we should also open pages for "New York acute back pain treatment," "lower back pain treatment," and so on? No. You should always consolidate ranking signals as much as possible, which means you should open fewer pages. The intersection between the entities "acute back pain treatment New York City" and "back pain treatment New York City" doesn't have a sufficient level of documents or query search logs to construct a new index.

This is advanced. Watch the lecture from Prabhakar Raghavan, who is currently a vice president at Google; many people criticize him, but this speech is from about ten years ago. In it he explains semantics on the web and how the indexes are constructed based on document and query statistics: if there is not a sufficient level of documents on the web, a new index won't be created. And in this case we also see that the semantic distance between back pain and acute pain is very short, so they can coexist on a page without you losing relevance. They will go on the same page, which means that in the micro context, in the main content, you will have lower back pain treatment, and then, in the other sub-sections, you will have acute back pain and the other types of back pain; you will explain them, and you will use some internal links to the other sections to flow this relevance as much as possible. You will also link to your actual "back pain treatment New York City" localization page from that area.

I probably explain it somewhere here; let me read it for you: "Every topical map should stay dynamic and change according to the indexes of the search engines. This requires an always-on checking schedule and routine with a semantic SEO expert. Still, the acute back pain can be a separate page, but not alone; it has to be a single page together with lower back pain, because their semantic distance to each other is short. After you review all these combinations and possible intersections, you start designing the heading, video, image, paragraph, list, and table distribution in these content networks, to construct the most lasting, higher-quality, lower-cost semantic content network." Which means that, once you get this raw topical map, you will need to understand what type of knowledge representation you can create for your own local business.
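Here is a sketch of that consolidation decision, assuming phrase embeddings from the sentence-transformers library and an arbitrary similarity cutoff. The actual reasoning above relies on document and query statistics rather than this library, so treat it only as an approximation of "semantic distance."

```python
# If two candidate topics are semantically very close, they share one
# page instead of two. Model and threshold are illustrative assumptions.
from sentence_transformers import SentenceTransformer, util  # pip install sentence-transformers

model = SentenceTransformer("all-MiniLM-L6-v2")
MERGE_THRESHOLD = 0.75  # arbitrary illustrative cutoff

anchor = "back pain treatment"
candidates = ["acute back pain treatment",
              "lower back pain treatment",
              "back pain treatment New York City"]

anchor_vec = model.encode(anchor, convert_to_tensor=True)
for phrase in candidates:
    phrase_vec = model.encode(phrase, convert_to_tensor=True)
    sim = util.cos_sim(anchor_vec, phrase_vec).item()
    decision = "same page" if sim >= MERGE_THRESHOLD else "separate page"
    print(f"{sim:.2f}  {decision:13s}  {phrase}")
```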
In this case, here, as you see, this was not a local site actually; this is an e-commerce site, mainly for... sorry, I guess there is a problem with my payment system for the premium. Basically, this one focuses on 27 articles that we published, with which we directly outranked Medical News Today, Healthline, and WebMD. It was an e-commerce site just about luxury water, and Jamal Kish explained how we did it together, around six months ago.

There are other examples here. This is doktorsitesi.com, which means "doctor site": a site for booking appointments with doctors. This website was also losing rankings; then we deleted the unnecessary pages, and it started to increase its rankings. The good thing here is that in the 15th of August update its competitor lost, and we actually won. The thing is, usually in the core updates, if both of the main competitors win the core update, it is a cluster-to-cluster comparison; if your competitor loses and you win, it is an inside-the-cluster comparison. Different core updates produce different types of ranking volatility; you should understand this first. Sometimes all the affiliate sources can lose rankings, and when that happens, usually one representative stays there, like Forbes, which got a manual penalty lately, because demoting it algorithmically is very hard; it is a very high PageRank source.

Another one here is diamondrehabthailand.com. I published that success story earlier; it also won the core update, and after being hit by the helpful content update, it is coming back. The important thing here is that, even if you don't change anything, sometimes it might come back like this; it's not a problem at all. But it also means that, to be able to come back, if you keep refining your sentences and your topical maps always-on, you will increase your chance of ranking even further.

These are the results of wello right now; it's in its all-time-high section. And what happened here? If you look at the technical SEO related parts, you will again see that this website created a lot of unnecessary pages, and it forced the search engine to allocate more crawling hits. Initially it worked well; then they removed most of these pages after adding them. As you can see: 400,000 crawl hits a day, for the first time, ten times more. If you get this type of crawling allocation, you have to make it justified; if you don't, you will lose rankings. Right now we will focus on fixing these unnecessary pages and communicating with the search engine, because there are not 800,000 products here, which is a problem. Right now it is working, but in the future it won't be. That's why these types of things can be used carefully: if you make your website ten times bigger, it will create ranking gains at the beginning, but if the content is not good, it will cause you to lose the rankings again.

I will show you an e-commerce example, not health; the same problem happened here too, because the developer team duplicated the site two times and it became way bigger. Then we started to clean the extra URLs, and again it started to come back in a similar way. This is what I call, not a new concept, but one I use sometimes: search engine tolerance. They can tolerate you for a while, but after a point, if you keep repeating these mistakes, they will demote your source. And for that, this next one is a really good example: as you know, it is a new SEO case study that I published this year; it's a brand-new site, and it doesn't lose rankings. The content here and the content here are the same; if you look at my YouTube channel you will see it directly. But let's check the crawl stats: you will see that the crawl goes from around 3,000 crawl hits a day to 40,000. What do you think caused that? We have 400 pages indexed and 40,000 crawl hits. It means that, as I say, we didn't use links here at all, completely; the search engine started to gather impressions on the site.
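To turn that "justify your crawl allocation" idea into a quick check, here is a trivial sketch; the thresholds and the example numbers are placeholders, not real values from these sites.

```python
# Compare daily Google crawl hits with the number of real, canonical
# pages; extreme ratios in either direction deserve investigation.
def crawl_allocation_check(daily_crawl_hits: int, canonical_pages: int) -> str:
    ratio = daily_crawl_hits / max(canonical_pages, 1)
    if ratio > 100:
        return (f"{ratio:.0f} hits/page/day: either your content justifies heavy "
                "recrawling, or crawlers are stuck in unnecessary URLs.")
    if ratio < 0.1:
        return f"{ratio:.2f} hits/page/day: large parts of the site are rarely seen."
    return f"{ratio:.1f} hits/page/day looks within a normal band."

print(crawl_allocation_check(400_000, 800))  # placeholder spike scenario
print(crawl_allocation_check(40_000, 400))   # prioritized small-site scenario
```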
They started with minimal tests; then they performed all these NLP tasks and tests on the website; they liked the results; then, with a core update, they increased the rankings. After that, they assume you will keep the same quality; that's why they assign a high level of crawl hits to you, which means your website is being prioritized. Remember search engine tolerance: it's a concept I use to explain how tolerant the search engine will be with you, even if you make mistakes, like publishing a couple of unnecessary pages or doing slightly spammy things. The second one is search engine prioritization: they are prioritizing your website if they crawl a 400-page website 40,000 times.

Then what happened here? If you look here, I always keep an eye on this area, and I always say this will need to be at least 98-99%; and again, smartphone, as I say, is a good one. Discovery is 2%: they stopped publishing new content, and that was the main problem on this website; they lost the opportunity window here. As I say, there are three ranking states: negative, neutral, and positive. If you're in the positive ranking state, you should keep publishing. If you stop, and the search engine realizes it, they won't come back that frequently again, so indexation will be slower and new query rankings will be way harder. And if this website makes a mistake like that right now, if it mistakenly publishes 800,000 pages, the search engines will decrease its rankings really quickly; they won't tolerate it, because by then the tolerance will also have decreased.

So there are many things that I explained here. We mainly focused on health, of course, but there are many other contexts. To be honest, there are many case studies that I will be publishing this year, and I will be in Cyprus and Czechia; we will be doing multiple meetups in these areas, for instance BK book.