
Digital Decay: 38% Of Webpages from 2013 Have Vanished

Digital Decay

A new study by Pew Research Center reveals the fleeting nature of online information: 38% of webpages from 2013 are no longer accessible a decade later. Search Engine Journal is reporting that the analysis, conducted in October, examined broken links on government and news websites and in the “References” section of Wikipedia pages. The findings reveal that:

>>> 23% of news webpages and 21% of government webpages contain at least one broken link
>>> Local-level government webpages, particularly those belonging to city governments, are especially prone to broken links
>>> 54% of Wikipedia pages have at least one link in their “References” section pointing to a non-existent page

Social Media Not Immune To Content Disappearance

To investigate the impact of digital decay on social media, Pew Research collected a real-time sample of tweets on X and monitored them for three months. The study discovered that “nearly one-in-five tweets are no longer publicly visible on the site just months after being posted.” In 60% of these cases, the original posting account was made private, suspended, or deleted. In the remaining 40%, the account holder deleted the tweet, but the account still existed. Certain types of tweets are more likely to disappear than others, with more than 40% of tweets written in Turkish or Arabic no longer visible within three months of posting. Additionally, tweets from accounts with default profile settings are particularly susceptible to vanishing from public view.

Defining “Inaccessible” Links & Webpages

For the purpose of this report, Pew Research Center focused on pages that no longer exist when defining inaccessibility. Other definitions, such as changed content or accessibility issues for visually impaired users, were beyond the scope of the research. The study used a conservative approach, counting pages as inaccessible if they returned one of nine error codes, indicating that the page and/or its host server no longer exist or have become nonfunctional.

Pew Research Center’s study sheds light on the extent of this problem across various online spaces, from government and news websites to social media platforms. The high rate of link rot and disappearing webpages has implications for anyone who relies on the internet as a reliable source of information. It poses challenges for citing online sources, as the original content may no longer be accessible in the future.

What This Means For SEO Professionals

This study underscores the need to regularly audit and update old content, as well as consistently monitor broken links and resolve them promptly. SEO professionals should also consider the impact of digital decay on backlink profiles. As external links to a website become inaccessible, it can affect the site’s link equity and authority in the eyes of search engines. Monitoring and diversifying backlink sources can help mitigate the risk of losing valuable links to digital decay. Lastly, the study’s findings on social media content prove that SEO efforts should focus on driving users back to more stable, owned channels like websites and email lists.
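For site owners who want to get ahead of link rot in their own content, a simple crawler-style check can flag dead outbound links before readers find them. The sketch below is a minimal, illustrative Python example: the URL list is a placeholder, and the error-code set is an assumption, since the article does not list the nine codes Pew counted.

```python
# Minimal link-rot check: request each URL and flag error responses.
# The URL list and error-code set here are illustrative, not Pew's methodology.
import requests

URLS_TO_CHECK = [
    "https://example.com/old-article",
    "https://example.com/2013/archived-page",
]

ERROR_CODES = {400, 401, 403, 404, 410, 500, 502, 503, 504}  # assumed set

def find_broken_links(urls, timeout=10):
    broken = []
    for url in urls:
        try:
            response = requests.head(url, allow_redirects=True, timeout=timeout)
            if response.status_code in ERROR_CODES:
                broken.append((url, response.status_code))
        except requests.RequestException as exc:
            # DNS failures, timeouts and dead servers also count as inaccessible.
            broken.append((url, str(exc)))
    return broken

if __name__ == "__main__":
    for url, reason in find_broken_links(URLS_TO_CHECK):
        print(f"BROKEN: {url} -> {reason}")
```

Running a check like this on a schedule, and fixing or replacing the URLs it flags, is one practical way to act on the audit advice above.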

Google Confirms Links Are Not That Important

Links a Signal for Ranking

Google’s Gary Illyes confirmed at a recent search marketing conference that Google needs very few links, adding to the growing body of evidence that publishers need to focus on other factors. Gary tweeted confirmation that he indeed did say those words.

Links a Signal for Ranking

Links were discovered in the late 1990s to be a good signal for search engines to use for validating how authoritative a website is, and Google discovered soon after that anchor text could be used to provide semantic signals about what a webpage was about. One of the most important research papers was Authoritative Sources in a Hyperlinked Environment by Jon M. Kleinberg, published around 1998 (link to research paper at the end of the article). The main discovery of this research paper is that there were too many web pages and no objective way to filter search results for quality in order to rank web pages for a subjective idea of relevance. The author of the research paper discovered that links could be used as an objective filter for authoritativeness. Kleinberg wrote:

“To provide effective search methods under these conditions, one needs a way to filter, from among a huge collection of relevant pages, a small set of the most ‘authoritative’ or ‘definitive’ ones.”

This is the most influential research paper on links because it kick-started more research on ways to use links not just as an authority metric but as a subjective metric for relevance. Objective means something factual; subjective means something closer to an opinion. The founders of Google discovered how to use the subjective opinions of the Internet as a relevance metric for what to rank in the search results. What Larry Page and Sergey Brin discovered and shared in their research paper (The Anatomy of a Large-Scale Hypertextual Web Search Engine – link at end of this article) was that it was possible to harness the power of anchor text to determine the subjective opinion of relevance from actual humans. It was essentially crowdsourcing the opinions of millions of websites, expressed through the link structure between each webpage. A toy illustration of this idea appears at the end of this article.

What Did Gary Illyes Say About Links In 2024?

At a recent search conference in Bulgaria, Google’s Gary Illyes made a comment about how Google doesn’t really need that many links and how Google has made links less important. Patrick Stox tweeted about what he heard at the search conference: “‘We need very few links to rank pages… Over the years we’ve made links less important.’ @methode #serpconf2024” Google’s Gary Illyes tweeted a confirmation of that statement: “I shouldn’t have said that… I definitely shouldn’t have said that”

Why Links Matter Less

The initial state of anchor text when Google first used links for ranking purposes was absolutely non-spammy, which is why it was so useful. Hyperlinks were primarily used as a way to send traffic from one website to another. But by 2004 or 2005 Google was using statistical analysis to detect manipulated links. Around 2004, “powered-by” links in website footers stopped passing anchor text value; by 2006, links close to the word “advertising” stopped passing link value and links from directories stopped passing ranking value; and by 2012 Google deployed a massive link algorithm called Penguin that destroyed the rankings of likely millions of websites, many of which were using guest posting. The link signal eventually became so bad that Google decided in 2019 to selectively use nofollow links for ranking purposes.
Google’s Gary Illyes confirmed that the change to nofollow was made because of the link signal. In 2023, Google’s Gary Illyes shared at PubCon Austin that links were not even in the top three ranking factors. Then in March 2024, coinciding with the March 2024 Core Algorithm Update, Google updated its spam policies documentation to downplay the importance of links for ranking purposes. At the beginning of April, Google’s John Mueller advised that there are more useful SEO activities to engage in than link building. Mueller explained:

“There are more important things for websites nowadays, and over-focusing on links will often result in you wasting your time doing things that don’t make your website better overall”

Why Google Doesn’t Need Links

The reason why Google doesn’t need many links is likely the extent of AI and natural language understanding that Google uses in its algorithms. Google must be highly confident in its algorithm to be able to explicitly say that it doesn’t need links. Way back when Google implemented nofollow into the algorithm, there were many link builders who sold comment spam links and who continued to lie that comment spam still worked. As someone who started link building at the very beginning of modern SEO (I was the moderator of the link building forum at the #1 SEO forum of that time), I can say with confidence that links stopped playing much of a role in rankings several years ago, which is why I stopped building them about five or six years ago.
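To make the “links as an authority signal” idea from Kleinberg, Page and Brin concrete, here is a toy, purely illustrative Python sketch of PageRank-style power iteration over a three-page link graph. It is not Google’s algorithm, and, as Illyes’ comments above make clear, modern ranking relies on links far less than this simple model suggests.

```python
# Toy PageRank-style power iteration over a tiny link graph.
# Purely illustrative of "links as an authority signal"; production search
# systems are far more complex and weight links much less heavily today.
links = {
    "page_a": ["page_b", "page_c"],
    "page_b": ["page_c"],
    "page_c": ["page_a"],
}

def pagerank(graph, damping=0.85, iterations=50):
    n = len(graph)
    ranks = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        new_ranks = {}
        for page in graph:
            # Each inbound link passes a share of the linking page's score.
            inbound = sum(
                ranks[other] / len(outlinks)
                for other, outlinks in graph.items()
                if page in outlinks
            )
            new_ranks[page] = (1 - damping) / n + damping * inbound
        ranks = new_ranks
    return ranks

print(pagerank(links))
```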

Steps to Consider Before a Site Migration

Site Migration

Steps to Consider Before a Site Migration

Navigating a site migration is akin to traversing a treacherous landscape fraught with potential pitfalls. The stakes are undeniably high, as a misstep could lead to a catastrophic outcome for your website’s performance and visibility. However, armed with the right strategies and foresight, you can mitigate risks and ensure a successful transition. Here, we delve into five crucial tips to steer your site migration towards a favourable outcome.

1. Thorough Preparation: Safeguarding Your Assets

Before embarking on the migration journey, meticulous preparation is paramount. Begin by carefully documenting the existing website to safeguard crucial assets:

Database and File Backup: Download the website’s database and files, storing duplicates in multiple secure locations. This redundancy mitigates the risk of data loss or corruption.

Comprehensive Site Crawl: Conduct a comprehensive crawl of the website to capture its entirety. Save the crawl data and export it in a structured format such as CSV or XML for future reference.

Always verify the integrity of your backups to ensure they’re free from corruption. If delegating migration tasks to a third party, maintain a personal backup as a failsafe measure.

2. Exhaustive Website Crawl: Mapping the Terrain

A thorough website crawl serves as your compass in the migration process. After conducting the crawl, create backups of the data to safeguard against loss. Post-migration, utilise this crawl data to identify any anomalies such as missing URLs, faulty redirects, or misconfigured pages. Utilise tools like Screaming Frog’s list mode to crawl specific batches of URLs, facilitating targeted analysis and troubleshooting.

3. Migrating to a New Template: Navigating the Design Terrain

Transitioning to a new website template is a pivotal yet potentially perilous endeavour. Follow these guidelines to navigate the design landscape effectively:

Image and URL Consistency: Ensure consistency in image URLs, alt text, and titles, especially when incorporating new images.

Code Optimisation: Scrutinise the template for hardcoded heading elements, opting for CSS styling where feasible to maintain structural integrity.

Content and Analytics Integration: Prioritise consistency in URLs and ensure seamless integration of analytics and tracking codes.

Anticipate potential fluctuations in search engine evaluation post-migration, communicating this to stakeholders beforehand. Leverage tools like Screaming Frog for post-migration audits, facilitating comprehensive analysis of image sizes and heading tags.

4. Seamless Web Host Migration: Ensuring Continuity

Migrating to a new web host demands meticulous planning and execution. Follow these steps to ensure a seamless transition:

Maintain URL Structure: Preserve existing URL structures to minimise disruption and preserve link equity.

Implement 301 Redirects: Facilitate seamless redirection from old URLs to their new counterparts, preserving both user experience and search engine rankings (a verification sketch follows at the end of this article).

Optimise Performance: Prioritise performance optimisation, ensuring fast and reliable hosting to enhance user experience and search engine favourability.

Conduct a final walkthrough of the new site to verify functionality and address any discrepancies before the cutover.

5. Acknowledging Limitations: Adapting to Constraints

Embrace flexibility and adaptability in the face of client-imposed limitations.
Understand and accommodate external factors that may influence migration feasibility, such as legal restrictions or business imperatives. Maintain open communication channels with stakeholders and seek third-party validation when warranted, ensuring alignment between migration objectives and organisational constraints.

Navigating the Migration Landscape

Site migrations represent a formidable challenge, demanding meticulous planning and strategic execution. By adhering to these guidelines and embracing adaptability, you can navigate the migration landscape with confidence and secure a successful outcome. Remember, meticulous preparation, comprehensive analysis, and proactive adaptation are your allies in the journey towards a seamless site migration.
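As a practical aid for step 4, a script like the one below can verify after the cutover that every old URL returns a 301 to its intended new counterpart. This is a minimal sketch, not a complete migration tool: it assumes a hypothetical redirect_map.csv with old_url and new_url columns exported from your pre-migration crawl, and uses the Python requests library.

```python
# Verify post-migration 301 redirects using a hypothetical CSV mapping
# of old_url,new_url pairs exported from the pre-migration crawl.
import csv
import requests

def check_redirects(mapping_csv, timeout=10):
    problems = []
    with open(mapping_csv, newline="") as handle:
        for row in csv.DictReader(handle):
            old_url, new_url = row["old_url"], row["new_url"]
            response = requests.get(old_url, allow_redirects=False, timeout=timeout)
            location = response.headers.get("Location", "")
            # Flag anything that is not a 301 pointing at the expected target.
            if response.status_code != 301 or location.rstrip("/") != new_url.rstrip("/"):
                problems.append((old_url, response.status_code, location))
    return problems

if __name__ == "__main__":
    for old_url, status, location in check_redirects("redirect_map.csv"):
        print(f"CHECK: {old_url} -> {status} {location or '(no Location header)'}")
```

Running this against the full URL list from your crawl export gives you a quick, repeatable check that link equity is being passed where you intended.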

Amazing Paving of Sunderland

Amazing Paving

We at Direct Submit are now working with Amazing Paving of Sunderland to help establish and promote their range of landscaping and garden design services. They specialise in the design and construction of all types of landscaping work including driveways, patios, sitting areas, pathways, walling, relays, drainage, turfing, fencing and rockeries. With friendly and flexible service and over 20 years of experience, they are a family run business serving Sunderland and the North East. For more information and advice regarding your project, please contact them today on 07468 186 748 or visit their landscape gardening website for a free site visit and quotation.

If you are a business in the North East or the wider UK, then our SEO services can help you gain exposure to a whole new set of potential customers. As a leading UK-based Digital Marketing & SEO Agency, our experienced team is available to help your business get onto the first page of Google. We’ve helped loads of businesses across the North East and the UK improve their search ranking positions and we’d love to see if we can help you too. Contact us today on 01207 283878 or visit our Digital Marketing Services page and ask how we can apply our proven and cost-effective SEO process to help market your business on the Internet.

Core Web Vitals are Used by Google in their Ranking

Core Web Vitals

Core Web Vitals are Used by Google in their Ranking

Google updated their Page Experience documentation to make it explicitly clear that Core Web Vitals are used by their ranking systems. What’s curious about the change is that Google continues to not say that Core Web Vitals are a ranking factor.

Googlers And Statements About Ranking Factors

Something kind of weird about Googlers is that they seem to avoid the phrase “ranking factor.” For example, I did a site: search of former Google engineer Matt Cutts’ blog and I couldn’t find a single instance of him writing the phrase “ranking factor” in any of his posts. The same goes for his YouTube videos when talking about links, like here and here.

Google’s John Mueller on Core Web Vitals

John Mueller said three years ago on Reddit that Core Web Vitals (CWV) was a ranking factor, but Google’s Page Experience In Search Results explainer never explicitly says Core Web Vitals are ranking factors. Which brings us to Google’s SearchLiaison, who caused a stir in February 2024 when he tweeted that Google’s documentation didn’t say that Core Web Vitals (CWV) were a ranking factor or a signal. He tweeted:

“And people go ‘Well, what does ranking really mean. Maybe it’s signals? They didn’t say it’s not signals!’ So do we have a single page experience signal? No. That’s why we made a page that says ‘There is no single signal.’ Oh but wait, so you have multiple signals? Yes, we anticipated this question which is why we have on that same page ‘Our core ranking systems look at a variety of signals.’ Which leads to things like ‘So is CWV a signal and if I don’t meet those, am I doomed?’ Which is why that same page says ‘However, great page experience involves more than Core Web Vitals.’ We don’t list what is and isn’t a ranking signal on that page because things change. Maybe something was once; maybe it shifts but aligns with other things we might do to understand page experience. We’re trying to guide people toward some useful resources and things to think about with page experience but in the end — do whatever you think is providing a great experience for your visitors.”

And in another tweet on the following day he wrote (referring to the Page Experience In Search explainer):

“I didn’t say we have a page experience ‘ranking signal’ nor do we have some single signal like that. The page below specifically says we do NOT have something like that. ‘Is there a single “page experience signal” that Google Search uses for ranking? There is no single signal. Our core ranking systems look at a variety of signals that align with overall page experience.’ We don’t say there’s one particular thing people need to do, nor do we say if you don’t do a particular thing, you won’t rank. We say look across a range of things and try to provide a good page experience to your visitors”

SearchLiaison is right. The Page Experience In Search Results explainer document didn’t say that Core Web Vitals is a ranking factor, not even in 2022 when it was first published.

Google Almost Says CWV Is A Ranking Factor

After all the explaining without acknowledging Core Web Vitals as a ranking factor, and two years of opaqueness in the Page Experience In Search Results documentation about CWV in relation to ranking factors, Google changed its mind and updated the documentation to almost say that Core Web Vitals are a ranking factor.
This is the ambiguous part that was removed from the documentation:

“What aspects of page experience are used in rankings? There are many aspects to page experience, including some listed on this page. While not all aspects may be directly used to inform ranking, they do generally align with success in search ranking and are worth attention.”

The above passage was replaced with this new paragraph:

“What aspects of page experience are used in ranking? Core Web Vitals are used by our ranking systems. We recommend site owners achieve good Core Web Vitals for success with Search and to ensure a great user experience generally. Keep in mind that getting good results in reports like Search Console’s Core Web Vitals report or third-party tools doesn’t guarantee that your pages will rank at the top of Google Search results; there’s more to great page experience than Core Web Vitals scores alone. These scores are meant to help you to improve your site for your users overall, and trying to get a perfect score just for SEO reasons may not be the best use of your time.”

The new documentation doesn’t use the phrase “ranking factor” or “ranking signal” in reference to Core Web Vitals. But it now explicitly acknowledges that CWV is used by Google’s ranking systems, which is less ambiguous than the previous statement that good CWV scores are recommended for “success with Search.” Read Google’s updated documentation.
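If you want to monitor your own field Core Web Vitals programmatically, one option is Google’s PageSpeed Insights API, which returns Chrome UX Report data alongside the lab audit. The sketch below is illustrative rather than definitive: the metric key names reflect our reading of the v5 response format, so check the current API reference before relying on them, and the URL and API key are placeholders.

```python
# Pull field (CrUX) Core Web Vitals for a URL from the PageSpeed Insights v5 API.
# Metric key names are assumptions based on the v5 response format; verify them
# against the current API documentation before relying on this in production.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_core_web_vitals(url, api_key, strategy="mobile"):
    params = {"url": url, "key": api_key, "strategy": strategy}
    data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()
    metrics = data.get("loadingExperience", {}).get("metrics", {})
    wanted = (
        "LARGEST_CONTENTFUL_PAINT_MS",
        "INTERACTION_TO_NEXT_PAINT",
        "CUMULATIVE_LAYOUT_SHIFT_SCORE",
    )
    # Return the 75th-percentile value and the FAST/AVERAGE/SLOW category
    # for each metric that appears in the field data.
    return {
        name: {
            "percentile": metrics[name].get("percentile"),
            "category": metrics[name].get("category"),
        }
        for name in wanted
        if name in metrics
    }

if __name__ == "__main__":
    # Replace with your own URL and API key.
    print(fetch_core_web_vitals("https://www.example.com/", "YOUR_API_KEY"))
```

As the updated documentation stresses, chasing a perfect score is not the goal; a check like this is simply a way to keep an eye on whether real users are getting a good experience.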

Google Confirms: High Quality Content is Crawled More Often

Quality Content

Google Confirms: Quality Content Is Crawled More Often

Google’s Search Relations team reveals that high-quality, user-centric content is the key to increasing crawl demand, debunking crawl budget myths. SEO professionals have long discussed the concept of a “crawl budget,” which refers to the limited number of pages search engines can crawl daily. The assumption is that sites must stay within this allotted budget to get pages indexed. However, Search Engine Journal is reporting that in a recent podcast, Google’s Search Relations team debunked misconceptions about crawl budgets and explained how Google prioritises crawling.

How Googlebot Prioritises Crawling

Dave Smart, an SEO consultant and Google Product Expert, acknowledges the confusion surrounding crawl budget: “I think there’s a lot of myths out there about crawling, about what it is and what it isn’t. And things like crawl budgets and phrases you hear thrown around that may be quite confusing to people.” Gary Illyes answered Dave with a question: “All right. I will turn this around and I will ask you, if you operated a crawler, how would you decide what to fetch?” Smart responded: “You need to do it by looking at what’s known, finding somewhere to start, a starting point. And from that, you get the links and stuff, and then you would try and determine what’s important to go and fetch now, and maybe what can wait until later and maybe what’s not important at all.”

Gary Illyes expanded on how Google decides how much to crawl by explaining the role of search demand. This is what Gary said: “One is the scheduler, which basically says that I want to crawl this …But that’s also kind of controlled by some feedback from search. …if search demand goes down, then that also correlates to the crawl limit going down.”

Gary does not explain what the phrase “search demand” means, but the context of his entire statement is from Google’s perspective. So from Google’s perspective, “search demand” probably means search query demand. Search query demand makes sense because if nobody’s searching for Cabbage Patch Kids then Google doesn’t really have a reason to crawl websites about Cabbage Patch Kids. But again, Gary did not explain what “search demand” means, so we have to look at it from the context in which that phrase was spoken. Gary finishes his thought on that topic with this sentence: “So if you want to increase how much we crawl, then you somehow have to convince search that your stuff is worth fetching, which is basically what the scheduler is listening to.” Gary does not elaborate on what he means by “convince search that your stuff is worth fetching,” but one interpretation could be to make sure your content is relevant to user trends, which means keeping it up to date.

Focus On Quality & User Experience

So, what can websites do to ensure their pages get crawled and indexed efficiently? The answer lies in focusing on site quality. As Illyes puts it: “Scheduling is very dynamic. As soon as we get the signals back from search indexing that the quality of the content has increased across this many URLs, we would just start turning up demand.” By consistently improving page quality and the usefulness of your content to searchers, you can overcome any assumed limitations on crawling. The key is to analyse your site’s performance, identify areas for improvement, and focus on delivering the best possible experience to your target audience. A simple crawl-log analysis sketch appears at the end of this article.
Clarifying Crawling Decisions

Google’s recent insights clarify that a fixed “crawl budget” is largely a myth. Instead, the search engine’s crawling decisions are dynamic and driven by content quality and search demand. By prioritizing quality, relevance, and user experience, site owners can ensure that their valuable pages get discovered, crawled, and indexed by Google – without worrying about hitting an arbitrary limit.

Need some content writing ideas?
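One practical way to analyse your site’s performance on the crawling side is to look at how Googlebot actually spends its requests in your server logs. The rough Python sketch below counts Googlebot hits per URL path from a standard combined-format access log; the log filename is a placeholder, and matching on the user-agent string alone is a simplification, since a robust check would also verify the crawler by reverse DNS.

```python
# Rough count of Googlebot requests per URL path from an Apache/Nginx
# combined-format access log. Matching on the user-agent string alone is a
# simplification; production checks should also verify the crawler via reverse DNS.
import re
from collections import Counter

LOG_LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[^"]*"')

def googlebot_hits(log_path):
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as handle:
        for line in handle:
            if "Googlebot" not in line:
                continue
            match = LOG_LINE.search(line)
            if match:
                hits[match.group("path")] += 1
    return hits

if __name__ == "__main__":
    # "access.log" is a placeholder path for your server's access log.
    for path, count in googlebot_hits("access.log").most_common(20):
        print(f"{count:6d}  {path}")
```

If Googlebot is spending most of its requests on thin or duplicate URLs, that is often a better thing to fix than worrying about a notional budget.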

SEO Strategies & Benefiting Your Business

The Power of SEO: How Applying SEO Strategies Can Benefit Your Business

In today’s fluid business environment, where every business vies for attention and visibility, search engine optimisation (SEO) emerges as a beacon of hope. It’s not just about ranking higher on search engine results pages (SERPs) anymore; it’s about crafting a digital presence that resonates with your audience and drives tangible results for your business. In this guide, we’ll explore the intricacies of applying SEO strategies and delve into the myriad benefits it brings to businesses of all sizes.

Understanding Search Engine Optimisation (SEO)

Before diving into the benefits, let’s establish a clear understanding of what Search Engine Optimisation, or SEO, entails. SEO is the practice of optimising your website and online content to improve its visibility and ranking on search engine results pages. It involves various techniques, including keyword research, on-page optimisation, link building, and content creation, all aimed at enhancing your website’s relevance and authority in the eyes of search engines like Google, Bing, and Yahoo.

The Benefits of Applying SEO to Your Business

Enhanced Visibility and Increased Traffic: The primary goal of SEO is to improve your website’s visibility on search engines. By optimising your website and content for relevant keywords and phrases, you increase the likelihood of appearing in search results when users are looking for products or services related to your business. Higher visibility translates into more organic traffic to your website, as users are more likely to click on websites that appear on the first page of search results.

Targeted Traffic and Higher Conversion Rates: One of the key advantages of SEO is its ability to attract highly targeted traffic to your website. Unlike traditional advertising methods, which cast a wide net and target broad audiences, SEO allows you to reach users who are actively searching for information or solutions related to your business. This targeted traffic is more likely to convert into leads or customers, resulting in higher conversion rates and a greater return on investment (ROI) for your business.

Improved User Experience: User experience (UX) plays a crucial role in the success of any website, and SEO can significantly impact UX in a positive way. By optimising your website for search engines, you also improve its usability and accessibility for human users. This includes making your website mobile-friendly, improving page load times, and ensuring easy navigation and an intuitive site structure. A seamless user experience not only enhances customer satisfaction but also increases the likelihood of repeat visits and referrals.

Build Trust and Credibility: In the digital age, trust and credibility are paramount for businesses looking to establish a strong online presence. SEO can help build trust and credibility by improving your website’s authority and reputation in the eyes of both users and search engines. By consistently producing high-quality, relevant content and earning backlinks from reputable websites, you demonstrate expertise and authority in your industry, which instils confidence in potential customers and encourages them to engage with your brand.

Long-Term Sustainability and Cost-Effectiveness: Unlike paid advertising, which requires ongoing investment to maintain visibility and traffic, SEO offers long-term sustainability and cost-effectiveness.
While it may take time to see significant results from SEO efforts, the benefits can be long-lasting and continue to accrue over time. Once your website starts ranking well on search engines, you can enjoy a steady stream of organic traffic without having to constantly pay for clicks or impressions. Additionally, the cost of acquiring organic traffic through SEO is typically lower than that of paid advertising, making it a more cost-effective strategy in the long run.

Competitive Advantage: In today’s competitive marketplace, businesses that invest in SEO gain a significant competitive advantage over those that neglect it. By outranking competitors on search engine results pages, you can capture a larger share of organic traffic and attract more potential customers to your website. Moreover, by staying ahead of the curve and adapting to changes in search engine algorithms and consumer behavior, you can maintain your competitive edge and stay relevant in your industry.

SEO Strategies & Your Business

Applying SEO strategies to your business can yield a multitude of benefits, ranging from increased visibility and traffic to improved user experience and credibility. By investing in SEO, you not only enhance your digital presence but also drive tangible results for your business, including higher conversion rates, increased revenue, and long-term sustainability. As the digital landscape continues to evolve, SEO remains a cornerstone of any successful online marketing strategy, offering businesses the opportunity to thrive in an increasingly competitive environment.

Google March 2024 Core Update: Reducing Unhelpful Content

Google March 2024 Core Update

Google has announced a significant update to its search algorithms and policies to tackle spammy and low-quality content on its search engine. The extensive March 2024 Core Update tackles low-quality content and introduces new spam policies targeting manipulative practices. The update, which the company says is more extensive than its usual core updates, is now rolling out and includes algorithm changes to improve the quality of search results and reduce spam.

Improved Quality Ranking

One of the main focuses of the March 2024 Core Update is to enhance Google’s ranking systems. “We’re making algorithmic enhancements to our core ranking systems to ensure we surface the most helpful information on the web and reduce unoriginal content in search results,” said Elizabeth Tucker, Director of Product for Search at Google. The company has been working on reducing unhelpful and unoriginal content since 2022, and the March 2024 update builds on those efforts. The refined ranking systems will better understand whether webpages are unhelpful, have a poor user experience, or seem to be created primarily for search engines rather than people. Google expects that combining this update and its previous efforts will collectively reduce low-quality, unoriginal content in search results by 40%. Google states: “We believe these updates will reduce the amount of low-quality content in Search and send more traffic to helpful and high-quality sites. Based on our evaluations, we expect that the combination of this update and our previous efforts will collectively reduce low-quality, unoriginal content in search results by 40%.”

New Spam Policies

In addition to the ranking adjustments, Google is updating its spam policies to remove the “lowest-quality” content from search results. Google states: “We’ll take action on more types of these manipulative behaviors starting today. While our ranking systems keep many types of low-quality content from ranking highly on Search, these updates allow us to take more targeted action under our spam policies.”

Scaled Content Abuse

Google is strengthening its policy against using automation to generate low-quality or unoriginal content at scale to manipulate search rankings. The updated policy will focus on the abusive behaviour of producing content at scale to boost search ranking, regardless of whether automation, humans, or a combination of both are involved. Google states: “This will allow us to take action on more types of content with little to no value created at scale, like pages that pretend to have answers to popular searches but fail to deliver helpful content.”

Site Reputation Abuse

Google is addressing the issue of site reputation abuse, where trusted websites host low-quality, third-party content to capitalise on the hosting site’s strong reputation. Google provides the following example of site reputation abuse: “For example, a third party might publish payday loan reviews on a trusted educational website to gain ranking benefits from the site. Such content ranking highly in Search can confuse or mislead visitors who may have vastly different expectations for the content on a given website.” Google will now consider such content spam if it is produced primarily for ranking purposes and without close oversight from the website owner.
Expired Domain Abuse

Google’s updated spam policies will target expired domain abuse, where expired domains are purchased and repurposed to boost the search ranking of low-quality content. This practice can mislead users into thinking the new content is part of the older, trusted site.

Timeline

The March 2024 Core Update is starting to roll out now. Websites have a two-month window to comply with the new site reputation policy. The other changes come into effect this week. “Search helps people with billions of questions every day, but there will always be areas where we can improve,” Tucker stated. “We’ll continue to work hard at keeping low-quality content on Search to low levels and showing more information created to help people.” Google’s announcement emphasises the company’s ongoing commitment to improving the quality of its search results.

Google’s 5-Step Plan To Diagnose Ranking Drop

Google Ranking Factors

Google’s Plan To Diagnose A Ranking Drop

Google’s Search Liaison, Danny Sullivan, recently offered guidance on how to diagnose ranking declines. Sullivan provided the advice on X (formerly Twitter) to Wesley Copeland, owner of a gaming news website, who sought help after seeing a significant drop in traffic from Google searches. According to Copeland’s post, he’s been struggling to understand why his website’s status as the go-to source for Steam Deck guides has changed. He stated: “Hey Danny! Any chance you could take a look at my website please? We used to be the go-to for guides on Steam Deck but got hit pretty badly and I’m a bit lost as to why.”

A Five-Step Plan To Diagnose A Ranking Drop

Sullivan recommended several steps to diagnose and address potential issues with the website’s performance:

1. First, use Google Search Console to compare the site’s metrics over the past six months versus the prior period.
2. Next, sort the Queries report by click change to identify notable decreases.
3. Check if the site still ranks highly for those terms.
4. If so, the content quality and SEO may not be the problem.
5. Recognise that Google’s ranking algorithms evolve continually, so some volatility is expected.

“If you’re still ranking in the top results, there’s probably nothing fundamental you have to correct,” Sullivan assured. He elaborated that changes in traffic could be due to Google’s systems finding other content that could be deemed more useful at the time. A sketch of automating steps 1 and 2 with the Search Console API appears after this article.

Implications & Insights For SEO Professionals

Sullivan’s advice highlights the importance for SEO professionals of regularly analysing performance using tools like Google Search Console. His recommended approach can provide insights into traffic changes and identify areas to potentially optimise. High search rankings require aligning with Google’s evolving ranking criteria. Google continually improves its algorithms to deliver the most relevant content to users, so search ranking fluctuations are to be expected. Copeland’s experience demonstrates the volatile nature of SEO, showing that even well-established websites can be impacted by changes to Google’s ranking priorities. Sullivan’s final words offer a mix of assurance and the reality of SEO: “But you probably don’t have any fundamental issues, and it might be the mix of how we show content could change to help you over time.” The conversation between Copeland and Sullivan is a lesson in staying vigilant and responsive to the ever-evolving demands of Google’s algorithms.

Article Source: Search Engine Journal
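For those who prefer to automate steps 1 and 2 of Sullivan’s plan, the hedged sketch below pulls clicks by query from the Search Console API for two date ranges and surfaces the queries with the biggest drops. The property URL, date ranges and row limit are illustrative, and it assumes you already have authorised OAuth credentials for use with the google-api-python-client package.

```python
# Compare Search Console clicks by query across two date ranges, mirroring
# Sullivan's suggestion to sort the Queries report by click change.
# Assumes google-api-python-client is installed and `credentials` holds
# authorised OAuth credentials for the property (setup omitted here).
from googleapiclient.discovery import build

def clicks_by_query(service, site_url, start_date, end_date, row_limit=250):
    body = {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["query"],
        "rowLimit": row_limit,
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    return {row["keys"][0]: row["clicks"] for row in response.get("rows", [])}

def biggest_click_drops(credentials, site_url):
    service = build("searchconsole", "v1", credentials=credentials)
    # Illustrative date ranges: the prior period versus the most recent period.
    previous = clicks_by_query(service, site_url, "2023-09-01", "2024-02-29")
    current = clicks_by_query(service, site_url, "2024-03-01", "2024-08-31")
    changes = {query: current.get(query, 0) - clicks for query, clicks in previous.items()}
    # Most negative first: the queries that lost the most clicks.
    return sorted(changes.items(), key=lambda item: item[1])[:20]
```

Checking whether the site still ranks for the queries this surfaces (step 3) then tells you whether the drop reflects a fundamental problem or simply the normal shifting of Google’s results.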

Google Removes More Fake Reviews Due to New Algorithm

Fake Reviews

Google Removes More Fake Reviews Due to New Algorithm

Google has a new review algorithm that the search company says is better and faster at taking down fake reviews from local listings in Google Search and Google Maps. The online blog Search Engine Journal is reporting that “In 2023, this new algorithm helped us take down 45% more fake reviews than the year before,” Google announced. Google receives a lot of contributions to its Google Maps and local listings, including reviews, photos, updates to listings and more. In fact, Google said it receives “around 20 million contributions per day on Maps and Search.”

New Reviews Spam Algorithm

Google said it launched a new algorithm last year to better detect and remove fake reviews. Google explained that this is a “machine learning algorithm that detects questionable review patterns even faster.” It looks at “longer-term signals on a daily basis,” for example “if a reviewer leaves the same review on multiple businesses or if a business receives a sudden spike in 1 or 5-star reviews.” The algorithm works to catch both “one-off cases and broader attack patterns,” Google wrote. For example, a network of scammers falsely claimed that for a low fee they would connect people to high-paying online tasks, like writing fake reviews or clicking ads across the internet. Google said its algorithm “quickly identified this surge in suspicious reviews thanks to its ability to continuously analyze patterns, like whether an account had previously posted reviews.” Human investigators then analyzed reports from merchants who had recently seen a spike in fake 5-star reviews, and Google was able to use those patterns to refine the algorithm and remove more of these fake reviews. This led to Google removing 5 million fake review attempts related to this scam alone in just a few weeks.

Review Spam Fighting Metrics

As it does every year, Google released some metrics on how it fought fake reviews and contributions to the local results in Google Search and Maps. Here are some of those metrics:

>>> Google blocked or removed over 170 million policy-violating reviews in 2023 (up 45% from 2022)
>>> More than 12 million fake business profiles were removed or blocked
>>> 14 million policy-violating videos were removed in 2023 (7 million more than last year)
>>> Google blocked over 2 million attempts to claim Business Profiles that did not belong to the claimants

Last year, Google shared the number of policy-violating photos but omitted it this year. Google said the number was higher but that it only wanted to highlight the video figures this year. Google also changed what it reports for fake business profiles, from the creation of fake profiles to attempts to claim profiles.

Spam and fake information in search is a hassle we all see and have to deal with as marketers. None of us like to be hit with spam, and none of us like to have fake reviews left on our own or our clients’ business listings. Google is trying to reduce spam efforts but, as you can imagine, it is a cat-and-mouse game. As Google comes up with more ways and techniques to prevent and/or reduce spam, spammers find other ways around those efforts.