Search Engine Traffic Levels 2024

Search Engine Traffic

Search Engine Traffic Levels in the UK

The digital market in the United Kingdom is constantly changing, adapting to new technology and to how people use it. A key part of this market is search engine market share, which plays an important role in the UK's digital landscape. This blog post looks at how the UK search engine market changed in 2024: the trends, the major companies, and how mobile usage affects the market.

The UK search engine market is still led by Google Search. Google has held the largest share of user searches for many years, but its position is starting to shift. Although Google remains dominant, its market share has slipped slightly. That change encourages us to pay closer attention to new competitors and to understand what users really want, and it shows that the search market is becoming more competitive. AI, new search platforms, and the growth of mobile search are all changing how this market operates.

Comparing 2023 to 2024: Significant Changes and Trends

In 2024 there was a modest shift in UK search market share. Google saw a small drop in its overall share, but it still performs very well, especially in mobile search. Bing made notable gains, increasing its share mainly in desktop search as it competes with other engines such as Baidu. Bing's growth is largely due to its use of AI-powered search. This trend marks a real change in how people search for information: users no longer rely only on traditional search engines, but also turn to social platforms such as Instagram and TikTok to discover new products, and they search for products across multiple platforms. If you want detailed figures, a Statista account can help.
The table below shows the key changes in market share from 2023 to 2024:

Search Engine   2023 Share   2024 Share
Google          92.07%       90.48%
Bing            2.99%        3.93%
Yahoo           –            1.90%
DuckDuckGo      –            1.86%

The way people search for information is changing. It is no longer just about making your site work well for Google: a strong plan should cover multiple search platforms, and it is important to understand how users' search habits are evolving.

The Role of Emerging Search Engines in 2024

Google and Bing are the best-known search engines, but other engines and platforms deserve attention. DuckDuckGo, which focuses on privacy, is gaining users in the UK. Social networks such as Instagram, LinkedIn, and TikTok have become important search tools too: younger users often rely on these platforms to find products, check trends, and discover local businesses. This shift towards searching on social media represents a major change in behaviour, and it brings new challenges and opportunities for marketers. SEO strategies now need to adapt, working to be visible both on conventional search engines and on key social media platforms. By understanding the age and interests of users on different platforms, businesses can create better-targeted content and connect more effectively with their audience. Including social media optimisation in the overall SEO strategy is now a key part of a successful plan.

Impact of Mobile Usage on the UK's Search Engine Landscape

Mobile search is growing fast in Great Britain. People increasingly use their phones to find information, products, and services, and mobile search is now more popular than desktop search. Reliable statistics matter here: they help us understand these changes and the broader shift in user behaviour and expectations. Because more people are searching on smartphones, businesses should focus on mobile-first SEO, and websites need to work well on screens of every size.
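As a small illustration, the year-on-year movement quoted in the table can be computed directly. This is a sketch using only the figures above; the function name is mine, not from any SEO tool.

```python
# UK search engine market share, from the 2023/2024 figures in the table above.
SHARE_2023 = {"Google": 92.07, "Bing": 2.99}
SHARE_2024 = {"Google": 90.48, "Bing": 3.93, "Yahoo": 1.90, "DuckDuckGo": 1.86}

def share_change(engine):
    """Return the 2023-to-2024 change in percentage points,
    or None if no 2023 figure was reported for that engine."""
    if engine not in SHARE_2023 or engine not in SHARE_2024:
        return None
    return round(SHARE_2024[engine] - SHARE_2023[engine], 2)
```

For example, `share_change("Google")` gives -1.59 points, while `share_change("Yahoo")` returns None because no 2023 figure was listed.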
Page loading speed is very important for mobile search ranking. Understanding how people search on mobile devices is key for businesses that want to stay visible and connected in the digital world.

Mobile vs Desktop: Shifting Preferences in 2024

As the digital age progresses, more people are running their keyword searches on mobile devices. This shift is helping to raise mobile website traffic, and it is visible in the decline of desktop search market share. To keep up with this trend, businesses need to update their online strategies. The rise of mobile search is connected to the growth of social media: more people now use platforms like Facebook, Instagram, TikTok, and Twitter for quick searches, wanting instant information, reviews, and recommendations while out and about. This is why businesses must maintain a strong social media presence and make content easy to find and share. Ignoring the rise of mobile search can hurt. If a company's website is hard to use on mobile, users have a poor experience, which lowers page views, reduces interaction, and, in the long run, can mean fewer sales. A mobile-first strategy is now essential for businesses that want to succeed in the changing world of online search.

How Mobile Search Trends are Reshaping Market Shares

The rise of mobile search has changed the way search engines compete. This change is clear, especially in November. With fewer desktop searches and more people using mobile devices, businesses need to focus on mobile optimisation to get noticed online. To stay relevant, businesses need to adapt to the mobile search market; those that do not prioritise mobile optimisation risk losing search visibility, and with it customers, in a mobile-first world.
UK Search Engine Traffic Share

Knowing the UK search engine market share for 2024, including downloadable formats such as XLS, is very important. Understanding this information can help businesses improve their

Internal Linking Best Practices

Internal Linking

Importance of Internal Linking in SEO

Internal linking refers to the practice of hyperlinking one page of a website to another page on the same domain. These links are paramount in guiding users through your website and establishing a cohesive structure for search engines. By connecting related content, internal links improve the user's journey and signal to search engines what the site is about, enhancing SEO. Crucially, internal linking allows webmasters to strategically distribute PageRank (a metric used by Google to rank web pages) across the site.

How Internal Links Differ from External Links

While internal links connect pages within the same website, external links (also known as outbound links) direct users to a different domain. This distinction is crucial for SEO. Internal links help spread link equity to various pages on your website, whereas external links can help establish the authority and relevance of your content by linking to high-quality sources. Both types of links play unique roles in the overall SEO strategy: internal links optimise navigation and the distribution of link equity within a site, while external links build relationships and enhance credibility through external references.

The Role of Internal Linking in Website Architecture

Internal linking is integral to website architecture, as it outlines the hierarchical structure and flow of a website. Effective internal linking ensures that all important pages are easily accessible, which enhances user experience by providing coherent navigation paths. For search engines, a well-structured internal linking system facilitates efficient crawling and indexing of web pages. By creating strategic internal links, webmasters can highlight key pages and direct both users and search engines to the most valuable content. This approach not only improves the user experience but also helps maintain a logical and organised site structure.
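The PageRank distribution mentioned above can be sketched with the classic power-iteration algorithm over an internal-link graph. This is a minimal textbook sketch, not Google's actual implementation, and the page names and link graph are invented for illustration.

```python
# Minimal PageRank power iteration over a site's internal-link graph,
# illustrating how internal links spread ranking signal between pages.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}       # start with equal rank
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                  # dangling page: spread evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:                             # share rank across outlinks
                for target in outlinks:
                    new[target] += damping * rank[page] / len(outlinks)
        rank = new
    return rank

# Hypothetical three-page site: every page links back to "home",
# so "home" accumulates the most rank.
site = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(site)
```

Running this, "home" ends up with the highest score because both other pages link to it, which is exactly the effect of pointing many internal links at a key page.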
Recognising the significance of internal linking allows for a more robust, user-friendly, and search engine-friendly website, fundamentally increasing its likelihood of achieving higher rankings in search results. While comprehending internal linking is crucial, its strategic application, facilitated by carefully developed methods and practices, will ensure an optimised and effective site structure.

Benefits of Strategic Internal Linking

Improved Website Navigation and User Experience

Strategic internal linking enhances website navigation and the overall user experience by guiding visitors through your site logically and intuitively. When users can easily find related content, they are more likely to stay longer on your website, reducing bounce rates. Clear internal linking pathways allow users to discover valuable information that complements what they are currently viewing, which keeps them engaged and more likely to convert. Furthermore, a well-structured internal linking system enables users to understand the breadth and depth of your content offerings. By connecting pages through relevant links, you provide a seamless experience that mirrors users' interests and navigation patterns. This ease of access fosters a positive interaction with your site, potentially increasing the likelihood of return visits and repeat engagements.

Enhanced Crawlability and Indexing for Search Engines

For search engines, internal links act as a map to navigate through your site. They help search engine bots understand the structure of your website and efficiently index its content. Well-placed internal links ensure that all your essential pages are discoverable by search engines, including those deep within the site. This is crucial for the visibility of your content in search engine results. Internal linking facilitates a hierarchical site structure, which search engines can easily follow and understand.
When key pages are linked frequently across the site, it signals their importance to search engines, which can positively influence their ranking. Effective internal linking can also help distribute link equity more evenly, ensuring that even less prominent pages gain some level of visibility.

Better Distribution of Link Equity Throughout Your Website

Link equity, also known as link juice, is the value that a link passes from one page to another. Internal linking helps in the strategic distribution of link equity across your website. By doing so, you ensure that all pages, especially those that are newer or less authoritative, receive the boost needed to rank well in search engine results. Linking to key pages from highly authoritative ones can lift the visibility of these crucial pages, bolstering their SEO performance. Additionally, spreading link equity through balanced internal linking prevents the uneven accrual of authority on a few pages, leading to a more robust and uniformly strong website presence. Strategic internal linking not only benefits individual page performance but also reinforces the overall site architecture, making it a vital aspect of any SEO strategy. Next, we will delve into the different types of internal links and their applications for optimal website performance.

Types of Internal Links

Navigational Links

Navigational links are a fundamental part of a website's framework, providing users with a map to explore content. These links are primarily found in the menu, the footer, and breadcrumb trails.

Menu Links

Menu links form the backbone of your website's navigation. Typically located at the top of a webpage, they are crucial in enabling users to find the primary sections and categories of your site. Proper organisation and clear labelling in menu links can significantly improve user experience and engagement.

Footer Links

Footer links are often overlooked but are equally important.
They serve as a secondary navigation point, guiding users to important but less frequently accessed areas like contact information, privacy policies, and site maps. Effective footer links enhance usability and ensure vital information is easily accessible.

Breadcrumbs

Breadcrumb links provide a trail for users to follow back to the starting or previous pages. They are usually found near the top of a page and reflect the hierarchy of the site. Breadcrumbs improve user navigation by making it easier to understand and traverse the site's structure. They also benefit SEO by reinforcing the website's hierarchy for search engine crawlers.

Contextual Links

Contextual links are embedded within the content of your pages and direct users to related or supporting information. These links are pivotal for SEO as they help distribute link equity across various pages and

Blindly Following SEO Tool Guides


Blindly Following SEO Tool Recommendations

SEO tools can be invaluable for optimising your site, but if you blindly follow every recommendation they spit out, you may be doing more harm than good. Let's explore the biggest pitfalls of SEO tools and how to use them to genuinely benefit your site.

Why blindly following SEO tools can harm your site

SEO tools are a double-edged sword for anyone involved in content creation or digital marketing. On the one hand, they offer valuable insights that can guide your strategy, from keyword opportunities to technical optimisations. On the other hand, blindly following their recommendations can lead to serious problems. Overoptimised content, cosmetic reporting metrics and incorrect technical advice are just some of the pitfalls of overreliance on SEO tools. The online SEO blog Search Engine Land has reported on the problems that arise when site owners mistakenly try to optimise for these tool-specific metrics. This is something Google's John Mueller specifically commented on recently, urging bloggers not to take shortcuts with their SEO: "Many SEO tools have their own metrics that are tempting to optimize for (because you see a number), but ultimately, there's no shortcut."

I've worked with thousands of sites and have seen firsthand the damage that can be done when SEO tools are misused. My goal is to prevent that same damage from befalling you! This article details some of the worst recommendations from these tools based on my own experience, recommendations that not only contradict SEO best practices but can also harm your site's performance. The discussion will cover more than just popular tool deficiencies. We'll also explore how to use these tools correctly, making them a complement to your overall strategy rather than a crutch. Finally, I'll break down the common traps to avoid, like over-relying on automated suggestions or using data without proper context, so you can steer clear of the issues that often derail SEO efforts.
By the end, you'll have a clear understanding of how to get the most out of your SEO tools without falling victim to their limitations.

SEO tools never provide the full picture to bloggers

Without fail, I receive at least one panicked email a week from a blogger reporting a traffic drop. The conversation usually goes something like this:

Blogger: "Casey, my traffic is down 25% and I'm panicking here."
Me: "Sorry to hear this. Can you tell me where you saw the drop? Are you looking in Google Search Console? Google Analytics? A blog analytics dashboard? Where do you see the drop?"
Blogger: "Uh, no. I'm looking at the Visibility Graph in [Insert SEO Tool Name here] and it's showing a noticeable decline!"

This is a common response; I've had the same email from both novice and experienced bloggers. The issue is one of education. Visibility tools, in general, are horribly unreliable. These tools track a subset of keyword rankings as an aggregate, using best-guess traffic volume numbers, third-party clickstream data and their own proprietary algorithms. The result: these tools tend to conflate all keyword rankings into one visibility number. That's a problem if you suddenly lose a large set of keywords in, for example, positions 50-100, which lowers the overall visibility number for the entire domain. Those position 50-100+ keywords were likely not sending quality traffic in the first place. But because the blogger lost them, the visibility index has decreased, and boom, it looks like they suffered a noticeable traffic drop! Plenty of visibility tools and metrics exist in the SEO space, and many have value. They can and should be used as a quick way to pinpoint where actual SEO research should come into play when diagnosing problems. But as SEOs, we educate clients that these same tools should never be the final authority on matters as important as traffic drops or troubleshooting possible SEO issues.
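The conflation problem described above is easy to demonstrate with a toy calculation. The CTR curve and keyword data below are rough illustrative assumptions, not real clickstream figures, but they show how losing many position-50+ keywords barely affects clicks while still dragging down an aggregate visibility number.

```python
# Toy "visibility index": sum of (estimated CTR x search volume) per keyword.
def estimated_ctr(position):
    if position <= 3:
        return 0.20        # top positions get meaningful clicks
    if position <= 10:
        return 0.05
    return 0.001           # positions 11+ get almost none

def visibility(rankings, volumes):
    """rankings: keyword -> position; volumes: keyword -> monthly searches."""
    return sum(estimated_ctr(pos) * volumes[kw] for kw, pos in rankings.items())

# Two head keywords that actually drive traffic, plus fifty long-tail
# keywords stuck at position 60 (all names and numbers are invented).
head = {"best running shoes": 2, "trail shoes review": 8}
tail = {f"long tail query {i}": 60 for i in range(50)}
volumes = {kw: 1000 for kw in head} | {kw: 100 for kw in tail}

before = visibility(head | tail, volumes)
after = visibility(head, volumes)   # every position-60 keyword lost
```

Here the site loses over 90% of its tracked keywords, yet estimated clicks fall by only about 2%, which is precisely why a falling visibility graph alone is not evidence of a real traffic drop.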
When forming solid hypotheses and recommended action items, always prioritize first-party data in Google Analytics, Google Search Console, etc.

Questionable SEO tool recommendations from recent experience

It's not just these "visibility metrics" that give tools a bad name. Many of the most popular tools in the niche report outdated metrics that have been debunked as a waste of time for SEO prioritisation. One of those is the popular text-to-HTML ratio. Briefly defined, the metric compares the amount of text on a page to the HTML code required to display it, usually expressed as a percentage, with a higher percentage preferred because it signifies more text relative to code. Even though this has been repeatedly denied as a ranking factor, it is still a reported audit finding in most crawling programs and popular SEO tool suites.

The same can be said of toxic links and disavow files. Google has publicly communicated multiple times that "toxic links" are great for selling tools, and that you would be wise to ignore such reports as they do nothing for you. I can only speak to my experience, but I've only ever improved sites by removing disavow files. Unless you actually have a links-based manual penalty that requires you to disavow links (you shouldn't have acquired them in the first place), you should stay away from these files as well.

Finally, another "tool recommendation" to ignore is the purposeful non-pagination of comments. One of the simplest ways to increase page speed, reduce DOM nodes and improve a page's bottom-line UX is to paginate comments. For years, the most popular SEO plugin on the planet, Yoast, provided a Site Health warning that discouraged users from paginating comments. Fortunately, after much back-and-forth on GitHub, this was resolved.
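For the curious, the text-to-HTML ratio that audit tools report can be sketched in a few lines. As noted above, Google has said this is not a ranking factor; the sketch only shows what the number the tools display actually represents.

```python
# How audit tools typically compute a text-to-HTML ratio:
# visible-text bytes divided by total HTML bytes, as a percentage.
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collects the text nodes of an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def text_to_html_ratio(html):
    if not html:
        return 0.0
    parser = _TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return 100 * len(text) / len(html)

ratio = text_to_html_ratio("<html><body><p>Hello world</p></body></html>")
```

The 44-character document above contains 11 characters of visible text, so the tool would report 25%. A low number tells you the page carries a lot of markup per word; it does not tell you anything about how the page will rank.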
You'll still find this recommendation in many auditing tools and SEO plugins, even though it runs against Google's own pagination best practices. It's important to understand that the best tools have moved beyond antiquated lexical models like keyword density, word count, TF-IDF and basically "words" in general. Semantic search has been the order of the day for years, and you should invest in tools

Local SEO is important in 2024

Local SEO

Local SEO & Location Based Search Strategies

Local SEO is important in 2024, and this guide will show you how to grow your business using location-based search strategies. More people are using their phones to find quick solutions, so making your business easy to find through Local SEO is a great way to reach customers who are ready to buy nearby. This guide explains why Local SEO matters today, how it works, and the practical steps that will help your business show up better in local search results.

The Growth of "Near Me" Searches and Mobile Optimisation

Consumer behaviour has changed significantly as more people search on mobile, especially for local options. Searches containing phrases like "near me" have grown substantially, showing that people want fast, convenient choices. A recent study shows that 76% of people who search for something nearby on their phone visit a business within 24 hours. This trend underlines how important Local SEO is for businesses like restaurants, repair services, and retail stores: it helps them get noticed and bring in more customers by being there at the moment consumers need them. By focusing on Local SEO, businesses can benefit from the trend of location-based mobile search and connect with people who are looking for products and services nearby. Whether someone is searching for a "restaurant near me" or an "emergency plumber", businesses that show up in local searches are more likely to attract customers who are ready to make a purchase.

Higher Conversion Rates for Local Searches

Local searches lead to more sales than broader searches. When people look for a specific service, like "pizza delivery in [city]" or "emergency dentist near me", they usually want to act quickly. Google data shows that 28% of local searches end in a purchase, much higher than the average for general search queries.
This urgency and purpose make Local SEO a great way to attract eager customers. By focusing on Local SEO methods, businesses can boost their chances of winning sales through online bookings, phone calls, store visits, or purchases. For physical stores, Local SEO connects online presence with in-store earnings.

How Local SEO Works: Key Components for Success

Local SEO is not just about showing your business in search results. It helps search engines understand two important things about your business: where you are and what you provide. Google looks at several factors when it ranks businesses in local searches, and when these factors come together, Google is more likely to show your business in local results, helping you attract local customers and gain an advantage.

Types of Local Search Results: Local Pack and Regular Listings

Google shows two main types of results for local searches: the Local Pack (sometimes called the Map Pack) and organic listings. Each type of result raises a business's visibility in a different way.

Local Pack (Map Pack) Results

The Local Pack is an important part of Google's search results for local queries. It usually appears at the top of the page and lists the three businesses most relevant to the search term, with key details like the business name, its location on a map, contact information, hours of operation, and customer ratings. Because it sits at the top of the search results, the Local Pack helps users quickly find important information, making it very valuable for the businesses listed there.

Organic Search Results (the "Blue Links")

Below the Local Pack, Google shows regular search results, known as the "blue links". Unlike Local Pack results, which surface Google Business Profile details, organic results direct users to websites.
Ranking in organic search results alongside the Local Pack can improve your visibility and increase your chances of earning local traffic. For local businesses, showing up in both the Local Pack and organic results is a strong way to dominate the search page and draw in more customers. To optimise for both kinds of results, you need to know what factors Google weighs for each one.

Key Ranking Factors for Local SEO: Relevance, Distance, and Prominence

To decide which businesses appear in the Local Pack and Google Maps, Google uses three main criteria: relevance, distance, and prominence. For organic search results, which appear below the Local Pack, Google considers these alongside other factors. By focusing on these factors, Local SEO makes your business more visible in both the Local Pack and regular search results, connecting you with more potential customers nearby.

Rising Competition in Local Search

With more businesses seeing the benefits of Local SEO, competition for local search rankings is fierce. It's no longer enough to just set up a Google Business listing. To stand out, businesses must actively manage reviews, keep their NAP (Name, Address, Phone) information consistent, and regularly post content related to their location. Investing in a good Local SEO plan can make a big impact, especially in tough markets. By focusing on local keywords, earning good reviews, and making sure business information is correct, businesses can improve their local rankings and gain an advantage over competitors who are less well optimised.

Getting Started with Local SEO: Affordable and Effective Solutions

Local SEO services can be a cost-effective way to improve your business's online presence and reach more local customers. With a tailored Local SEO plan, your business can bring in more qualified visitors, helping you generate more leads, boost website traffic, and attract more customers to your physical locations.
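The NAP consistency mentioned above can even be checked programmatically across your directory listings. This is a minimal sketch with invented normalisation rules and sample data; real listings vary far more in formatting.

```python
# A minimal NAP (Name, Address, Phone) consistency check across listings.
import re

def normalise_nap(name, address, phone):
    """Reduce a listing to a comparable form: lowercase, collapsed
    whitespace, and digits-only phone numbers."""
    clean = lambda s: re.sub(r"\s+", " ", s).strip().lower()
    digits = re.sub(r"\D", "", phone)
    return clean(name), clean(address), digits

def listings_consistent(listings):
    """True if every listing normalises to the same (name, address, phone)."""
    forms = {normalise_nap(l["name"], l["address"], l["phone"]) for l in listings}
    return len(forms) == 1
```

Two listings that differ only in capitalisation, spacing, or phone formatting will pass this check; a listing with a genuinely different phone number or address will fail it, flagging the inconsistency search engines are known to penalise in local rankings.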
Whether you run a retail shop, a restaurant, or any kind of service, having a strong local presence on search engines like Google and Bing is very important.

Core Web Vitals Documentation

Core Web Vitals

Core Web Vitals & Interaction to Next Paint (INP) Scores

The official Core Web Vitals documentation was updated with new details on Interaction to Next Paint (INP) scores. As reported by Search Engine Journal, the official documentation for how Core Web Vitals are scored was recently updated with new insight into how the INP scoring thresholds were chosen, offering a better understanding of the metric.

Interaction to Next Paint (INP)

Interaction to Next Paint (INP) is a relatively new metric, officially becoming a Core Web Vital in the spring of 2024. It measures how long a site takes to respond to interactions like clicks, taps, and key presses (on a physical or onscreen keyboard). The official web.dev documentation defines it: "INP observes the latency of all interactions a user has made with the page, and reports a single value which all (or nearly all) interactions were beneath. A low INP means the page was consistently able to respond quickly to all—or the vast majority—of user interactions."

INP measures the latency of all the interactions on the page, which is different from the now-retired First Input Delay (FID) metric, which only measured the delay of the first interaction. INP is considered a better measurement than FID because it gives a more accurate picture of the actual user experience.

INP Core Web Vitals Score Thresholds

The main change to the documentation is an explanation of the performance thresholds that separate "good", "needs improvement" and "poor" scores. One key decision was how to handle device differences: it is easier to achieve good INP scores on a desktop than on a mobile device, because external factors like network speed and device capabilities heavily favor desktop environments. But the user experience is not device dependent, so rather than create different thresholds for different kinds of devices, the team settled on a single set of thresholds based on mobile devices.
The new documentation explains: "Mobile and desktop usage typically have very different characteristics as to device capabilities and network reliability. This heavily impacts the 'achievability' criteria and so suggests we should consider separate thresholds for each. However, users' expectations of a good or poor experience is not dependent on device, even if the achievability criteria is. For this reason the Core Web Vitals recommended thresholds are not segregated by device and the same threshold is used for both. This also has the added benefit of making the thresholds simpler to understand. Additionally, devices don't always fit nicely into one category. Should this be based on device form factor, processing power, or network conditions? Having the same thresholds has the side benefit of avoiding that complexity. The more constrained nature of mobile devices means that most of the thresholds are therefore set based on mobile achievability. They more likely represent mobile thresholds—rather than a true joint threshold across all device types. However, given that mobile is often the majority of traffic for most sites, this is less of a concern."

These are the scores Chrome settled on:

[Screenshot of an Interaction to Next Paint score]

Lower-End Devices Were Considered

Chrome was focused on choosing achievable metrics. That's why the thresholds for INP had to be realistic for lower-end mobile devices, because so many of them are used to access the internet. They explained: "We also spent extra attention looking at achievability of passing INP for lower-end mobile devices, where those formed a high proportion of visits to sites. This further confirmed the suitability of a 200 ms threshold.
Taking into consideration the 100 ms threshold supported by research into the quality of experience and the achievability criteria, we conclude that 200 ms is a reasonable threshold for good experiences."

Most Popular Sites Influenced INP Thresholds

Another interesting insight in the new documentation is that real-world achievability of the scores, measured in milliseconds (ms), was another consideration for the INP thresholds. The team examined the performance of the top 10,000 websites, because they make up the vast majority of website visits, in order to dial in the right threshold for poor scores. What they discovered is that the top 10,000 websites struggled to achieve performance scores of 300 ms: CrUX data, which reports real-world user experience, showed that 55% of visits to the most popular sites were at the 300 ms threshold. That meant the Chrome team had to choose a higher millisecond score that was achievable by the most popular sites. The new documentation explains: "When we look at the top 10,000 sites—which form the vast majority of internet browsing—we see a more complex picture emerge… On mobile, a 300 ms 'poor' threshold would classify the majority of popular sites as 'poor' stretching our achievability criteria, while 500 ms fits better in the range of 10-30% of sites. It should also be noted that the 200 ms 'good' threshold is also tougher for these sites, but with 23% of sites still passing this on mobile this still passes our 10% minimum pass rate criteria. For this reason we conclude a 200 ms is a reasonable 'good' threshold for most sites, and greater than 500 ms is a reasonable 'poor' threshold."

Barry Pollard, a Web Performance Developer Advocate on Google Chrome and a co-author of the documentation, added a comment in a LinkedIn discussion that offers more background: "We've made amazing strides on INP in the last year. Much more than we could have hoped for.
But less than 200ms is going to be very tough on low-end mobile devices for some time. While high-end mobile devices are absolute power horses now, the low-end is not increasing at anywhere near that rate…"

A Deeper Understanding of INP Scores

The new documentation offers a better understanding of how Chrome chooses achievable metrics and takes some of the mystery out of the relatively new INP Core Web Vitals metric.
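The thresholds discussed throughout this article, 200 ms or less for "good" and above 500 ms for "poor", applied identically to mobile and desktop, reduce to a small classifier:

```python
# INP score bands per the Core Web Vitals thresholds described above:
# good <= 200 ms, needs improvement 200-500 ms, poor > 500 ms.
def classify_inp(inp_ms):
    if inp_ms <= 200:
        return "good"
    if inp_ms <= 500:
        return "needs improvement"
    return "poor"
```

So a page whose INP is 180 ms is "good", 350 ms is "needs improvement", and 600 ms is "poor", regardless of whether the visit came from a phone or a desktop.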

Local SEO & Attracting New Business

Importance of Local SEO

The Importance of Local SEO

Local SEO (Search Engine Optimisation) is a targeted approach to optimising your online presence to attract more business from relevant local searches. This type of SEO focuses on optimising your business for a defined geographical area, ensuring that when potential customers search for products or services in their vicinity, your business appears prominently in the search results. As mobile searches become increasingly localised, understanding and implementing local SEO strategies is crucial for businesses looking to enhance their visibility and attract customers in their area.

Defining Local SEO

Local SEO is the practice of optimising your online content specifically for local markets, using techniques and strategies that increase your visibility in local search results. It encompasses a variety of tactics, including optimising your Google My Business listing, acquiring local backlinks, and ensuring your business information is consistent across all online platforms. By tailoring your SEO efforts to a specific location, you increase your chances of ranking higher on search engines when users search for services or products available near them.

Benefits of Local SEO for Businesses

Implementing a robust local SEO strategy can yield numerous benefits for businesses of all sizes. First and foremost, local SEO drives targeted traffic to your website and physical locations, significantly increasing the likelihood of conversion. For instance, studies show that 76% of people who search for something nearby visit a business within a day. Furthermore, local SEO builds trust and credibility, as users tend to prefer businesses that appear in local search results, often interpreting high rankings as an endorsement of quality and reliability.

Statistics Highlighting the Impact of Local SEO

Statistics underscore the importance of local SEO for businesses.
According to Google, 46% of all searches have local intent, indicating that users are looking for specific services or products in their area. Additionally, a study by BrightLocal found that 87% of consumers read online reviews for local businesses, demonstrating that reputation management is critical in local markets. Moreover, 88% of people trust online reviews as much as personal recommendations, highlighting the significant influence of customer feedback on consumer behaviour.

Working with Direct Submit SEO Agency

What to Look for in an SEO Agency

Choosing the right SEO agency is paramount to your success in local SEO. When evaluating potential agencies, consider their experience and expertise in local SEO specifically. Look for an agency with a track record of improving local search rankings for businesses like yours. Additionally, transparency in their processes and communication is essential; a reputable agency should be able to explain their strategies clearly and provide regular updates on progress. Lastly, client testimonials and case studies can provide insight into the agency’s effectiveness and reliability.

The Role of a Direct Submit SEO Agency

At Direct Submit, a leading UK digital marketing agency, we specialise in ensuring that your business is listed on pertinent local directories and search engines, often through direct submission methods. This involves not only submitting your business to popular directories but also optimising those listings to include relevant keywords, photos, and customer reviews. Our role also extends to monitoring your online reputation, managing reviews, and ensuring consistent information across platforms to enhance your credibility with both search engines and potential customers.

How to Collaborate Effectively with Your SEO Agency

Effective collaboration with your SEO agency is vital for achieving optimal results.
Start by clearly communicating your business goals, target audience, and any specific areas you want to focus on. Establish a timeline and agree on key performance indicators (KPIs) to measure success. Regular meetings help maintain alignment, allowing both parties to discuss progress and challenges and to adjust strategies when necessary. Providing feedback and sharing insights about your customer interactions can also help the agency tailor its strategies to better fit your business needs.

Strategies to Attract New Business

Optimising Your Google My Business Listing

One of the most effective strategies for improving your local search presence is optimising your Google My Business (GMB) listing. A fully optimised GMB profile includes accurate information such as your business name, address, phone number, and operating hours. Additionally, incorporating high-quality images of your business, products, and services can significantly enhance user engagement. Regularly updating your listing with posts about promotions, events, or new products keeps your customers informed and encourages them to visit your site or physical location.

Leveraging Local Citations and Reviews

Local citations (mentions of your business on other websites) play a crucial role in local SEO. Ensuring that your business is listed in reputable local directories not only improves your visibility but also helps search engines validate your business’s legitimacy. Furthermore, managing customer reviews is vital; positive reviews can significantly influence consumer decision-making. Encourage satisfied customers to leave reviews on platforms like Google, Yelp, and Facebook. Responding to reviews, both positive and negative, demonstrates your commitment to customer service and can enhance your brand’s reputation.

Creating Locally Relevant Content

Producing content that resonates with your local audience can further enhance your local SEO efforts.
This can include blog posts, articles, and social media content that focuses on local events, news, or community highlights. By incorporating local keywords and topics of interest, you can attract more local traffic to your site. Additionally, creating content that addresses the specific needs and preferences of your local audience can help position your business as an authority in your niche, further driving engagement and conversions.

Measuring Your Local SEO Success

Key Performance Indicators for Local SEO

To evaluate the effectiveness of your local SEO efforts, you need to identify and track key performance indicators (KPIs). Important KPIs include local search rankings, website traffic from local searches, conversion rates, and the number of calls or enquiries originating from local listings. Monitoring changes in these metrics over time can provide valuable insights into how well your local SEO strategy is performing and where adjustments may be necessary to improve results.

Tools to Track Local SEO Performance

There are

Google’s SEO Tip for Fixing Canonical URLs

Canonical URLs

Google have offered an SEO tip for making Google choose the correct canonical URL when it ranks the wrong one. They answered a question on LinkedIn about how Google chooses canonicals, offering advice on what SEOs and publishers can do to encourage Google to pick the right URL.

What Is A Canonical URL?

Where multiple URLs (the addresses of multiple web pages) have the same content, Google will choose one URL to be representative of all of the pages. The chosen page is referred to as the canonical URL. The Search Engine Journal has published an update from Google Search Central that explains how SEOs and publishers can communicate their preference for which URL to use. None of these methods force Google to choose the preferred URL; they mainly serve as a strong hint. There are three ways to indicate the canonical URL:

1. A rel=canonical link element in the page’s head section
2. Listing the preferred URL in the XML sitemap
3. A 301 redirect to the preferred URL

Some of Google’s canonicalization documentation incorrectly refers to the rel=canonical as a link element. The link tag, <link>, is the element; rel=canonical is an attribute of the link element. Google also calls rel=canonical an annotation, which might be an internal way Google refers to it, but it’s not the proper way to refer to rel=canonical (it’s an HTML attribute of the link element). There are two important things you need to know about HTML elements and attributes: HTML elements are the building blocks for creating a web page, and an HTML attribute is something that adds more information about that building block (the HTML element). The Mozilla Developer Network HTML documentation (an authoritative source for HTML specifications) notes that “link” is an HTML element and that “rel=” is an attribute of the link element.

After Reading The Manual

The person reading Google’s documentation, which lists the above ways to specify a canonical, still had questions, so he asked on LinkedIn.
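To make the markup concrete: a rel=canonical declaration is simply a link element in a page’s head, and it can be read programmatically when auditing pages. The short sketch below uses Python’s standard html.parser to pull the canonical URL out of a page; the example.com page is purely hypothetical.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of any <link rel="canonical"> tag encountered."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")

# Hypothetical page markup for illustration only.
html = """
<html><head>
  <title>Blue Widgets</title>
  <link rel="canonical" href="https://www.example.com/blue-widgets/">
</head><body>...</body></html>
"""

parser = CanonicalFinder()
parser.feed(html)
print(parser.canonical)  # https://www.example.com/blue-widgets/
```

Running this across a crawl of your own pages is a quick way to confirm that every duplicate declares the URL you actually prefer.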
He referred to the documentation as “doc” in his question:

“The mentioned doc suggests several ways to specify a canonical URL. So, if we consider only point 2 of the above. Which means the sitemap—Technically it contains all the canonical links of a website. Then why in some cases, a couple of the URLs in the sitemap throws: “Duplicate without user-selected canonical.” ?”

As pointed out above, Google’s documentation says that the sitemap is a weak signal.

Google Uses More Signals For Canonicalization

John Mueller’s answer reveals that Google uses more factors or signals than what is officially documented. He explained:

“If Google’s systems can tell that pages are similar enough that one of them could be focused on, then we use the factors listed in that document (and more) to try to determine which one to focus on.”

Internal Linking Is A Canonical Factor

Mueller next explained that internal links can be used to give Google a strong signal of which URL is the preferred one. This is how Mueller answered:

“If you have a strong preference, it’s best to make that preference very obvious, by making sure everything on your site expresses that preference – including the link-rel-canonical in the head, sitemaps, internal links, etc.”

He then followed up with:

“When it comes to search, which one of the pages Google’s systems focus on doesn’t matter so much, they’d all be shown similarly in search. The exact URL shown is mostly just a matter for the user (who might see it) and for the site-owner (who might want to monitor & track that URL).”

Important Takeaway to Note

In our experience it’s not uncommon that a large website contains old internal links that point to the wrong URL. Sometimes it’s not old internal links that are the cause, it’s 301 redirects from an old page to another URL that is not the preferred canonical. That can also lead to Google choosing a URL that is not preferred by the publisher.
If Google is choosing the wrong URL, it may be useful to crawl the entire site (for example with Screaming Frog) and then examine the internal linking patterns as well as the redirects, because forgotten internal links hidden deep within the website, or chained redirects to the wrong URL, may well be causing Google to choose the wrong URL. Google’s documentation also notes that external links to the wrong page can influence which page Google chooses as the canonical, so that is one more thing to check when debugging why the wrong URL is being ranked. The important takeaway here is that if the standard ways of specifying the canonical are not working, it is possible that external links, unintentional internal links, or a forgotten redirect are causing Google to choose the wrong URL. Or, as John Mueller suggested, increasing the number of internal links to the preferred URL may help Google to choose the preferred URL.
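One practical way to run the redirect part of this audit is to follow each redirect hop in a crawl export to its final destination and flag every chain that does not end at the preferred canonical. The sketch below is an illustrative helper under assumed inputs: the URLs and the redirects mapping are hypothetical stand-ins for data exported from a crawler, not output from any particular tool.

```python
def final_destination(url, redirects, max_hops=10):
    """Follow a chain of redirects (a url -> target mapping) to its end,
    stopping on loops or after max_hops to avoid infinite cycles."""
    seen = set()
    while url in redirects and url not in seen and len(seen) < max_hops:
        seen.add(url)
        url = redirects[url]
    return url

def flag_bad_chains(redirects, preferred):
    """Return every redirecting URL whose chain ends somewhere other
    than the preferred canonical URL."""
    return sorted(u for u in redirects
                  if final_destination(u, redirects) != preferred)

# Hypothetical crawl data: /old-page redirects through /interim to a
# non-preferred duplicate instead of the canonical /blue-widgets/.
redirects = {
    "/old-page": "/interim",
    "/interim": "/blue-widgets?ref=nav",   # chain ends at the wrong URL
    "/legacy": "/blue-widgets/",           # chain ends at the canonical
}
print(flag_bad_chains(redirects, "/blue-widgets/"))
# ['/interim', '/old-page']
```

Every URL the check flags is a candidate for updating the redirect target, exactly the kind of forgotten chain that can nudge Google toward the wrong canonical.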

Several Bugs within Bing Webmaster Tools 

Bing Webmaster Tools

Recently there were reportedly at least five bugs within the Bing Webmaster Tools API and the corresponding documentation. The bugs range from setting countries and regions to site move requests, plus issues with the overall API documentation. Ryan Siddle was the first to cover this, posting about it on LinkedIn. He wrote, “We’ve found 5 bugs in the Bing Webmaster Tools API and documentation.” He said the “Bing Webmaster Tools team have already confirmed 2 are now scheduled to be fixed,” and posted screenshots of Bing confirming a few of the issues. Ryan said the reason the bugs were found “is because we’re creating an open source wrapper written in Python (releasing very soon) and created various tests (destructive and non-destructive).” “I’ll have another exciting update in a week or two about how we’re now using BWT in various ways,” he added. Fabrice Canel from Microsoft confirmed, “Yes, team is looking at this. Thanks for reaching out Ryan, Barry.”

Google’s Answer on Ideal Content Length for SEO

Content Length for SEO

Google’s John Mueller answered a question on LinkedIn about the ideal content length for performing well on Google. Participants in the discussion pressed for specifics, raised concerns about being SERP-blocked by Reddit, and suggested that Search Console should offer content feedback. Mueller’s response challenged SEOs to rethink their approach to content.

What’s The Best Length Of Content For SEO?

The Search Engine Journal is reporting that the underlying problem is the question itself, which asks what should be done to make content better for Google, the opposite of what Google’s algorithms are set up to identify. Yet there is some merit to the question, because some people are new to publishing and may not really understand what the best length is for content. On the other hand, publishing content that is so long it veers off topic is a mistake that many people, regardless of experience level, commonly make. This is the question asked:

“Hi John, is there an ideal content length that performs better on Google search results? Should we focus on creating longer, in-depth articles, or can short-form content rank just as well if it’s concise and valuable?”

There are a lot of ideas about how to make content, so it’s understandable if someone is confused about it. Google’s John Mueller answered the question, and it was a good answer. However, others had concerns about the ranking choices Google makes that can block good content from ranking. Mueller answered:

“There is no universally ideal content length. Focus on bringing unique value to the web overall, which doesn’t mean just adding more words.”

Mueller’s suggestion to focus on bringing “unique value” with published content is good advice. Adding unique value doesn’t necessarily mean adding more images, more content, less content, more graphs, or step-by-steps.
All of those things could be helpful, but only if they are relevant to a user and their query. Yet, as someone pointed out in that discussion, a site with good content could still lose out in the SERPs due to Google’s “preference” for showing sites like Reddit. A person with the username SEOBot _ wrote that Google should offer more information and feedback about what “unique value” content means in relation to their own content. While it might seem strange that a publisher is unclear about what constitutes “unique value” content, the question calls attention to the confusion that some publishers feel about how sites are ranked by Google. This is the follow-up question asked by that person:

“…do you have any example of content on the website that follows this and is able to get the Google love. ‘Focus on bringing unique value to the web overall, which doesn’t mean just adding more words.’ This is a very vague and unrealistic ask if the GSC can start pinpointing this content/section as not making any sense or not adding any value. We really eager to learn and know how the content is actually generating value to the web. If all the value is being generated by top publishers/brands then what exactly the small publishers/niche site owners suppose to write to survive?”

Mueller responded:

“SEOBot _ If you’re looking for a mechanical recipe for how to make something useful, that will be futile – that’s just not how it works, neither online nor offline. When you think about the real-world businesses near you that are doing well, do you primarily think about which numbers they focus on, or do you think about the products / services that they provide?”

What Mueller seems to be saying is that focusing on site visitors, not Google, is the way to understand what “unique value” content is. I recently presented at a search marketing conference on the topic of seven things publishers can focus on to improve their content.
There’s a lot to say about optimising content, but publishers and SEOs can get pretty far by taking Mueller’s advice: think about how you would approach selling to people in an actual store, and focus on writing for people (like I’m doing right now). Others joined the conversation to essentially ask the same thing, looking for specifics on what Google is looking for in content. Mueller had said all there is to say about it. He advised:

“If you count the words in best seller books, average the count, and then write the same number of words in your own book, will it become a best seller? If you make a phone that has the same dimensions as a popular smartphone, will you sell as many as they do? I love spreadsheets, but numbers aren’t everything.”

What are We to Take from this Exchange?

If everything a person has learned about SEO centres around strategies for keywords, worrying about “entities”, and whether articles are interlinked with the right anchor text, then what Mueller is saying will sound confusing. We’ve been doing SEO for over 20 years and remember a time when SEO was about creating content and links for Google. But this isn’t 2004; it’s 2024, and we’ve reached a time with SEO where it’s increasingly not about creating content for Google.

The Role of Keywords in Online Visibility

Good Keyword Research & SEO

Introduction to Keyword Research

Keyword research is a fundamental process in which individuals systematically analyse and identify the words and phrases that people enter into search engines. This process helps to uncover the specific terms and queries that have a high probability of driving traffic to a website. Essentially, it acts as a blueprint for understanding the language that potential customers or readers use, thereby enabling content creators and businesses to tailor their online presence accordingly.

Role of Keywords in Online Visibility

Keywords play a pivotal role in a website’s online visibility. When search engines crawl and index websites, they look for relevant keywords to match queries initiated by users. If a website is optimised with pertinent keywords that align with user searches, it has a better chance of appearing in the search engine results pages (SERPs). This enhanced visibility is crucial, as a prominent position in search results significantly boosts organic traffic and, consequently, potential conversions. Moreover, incorporating relevant keywords effectively bridges the gap between what the target audience is looking for and what the website offers.

Importance of Keyword Research for Businesses and SEO

The importance of keyword research for businesses and Search Engine Optimisation (SEO) cannot be overstated. Through meticulous keyword research, businesses gain valuable insights into customer behaviour, preferences, and market trends. These insights guide the creation of content that resonates with the target audience, thereby improving user engagement and satisfaction. From an SEO perspective, keyword research is indispensable. It forms the backbone of any successful SEO strategy by ensuring that the content is not only relevant but also competitive. Effective keyword research helps in identifying high-value keywords that strike a balance between search volume, competition, and user intent.
Additionally, it aids in uncovering long-tail keywords, the more specific and less competitive phrases that can attract highly targeted traffic. Ultimately, keyword research underpins each of these benefits. The next step in harnessing the power of keyword research is to delve into its significance for SEO. Understanding how keyword research impacts search engine rankings and aligns with user intent will be key to mastering this essential skill.

Keyword Research for SEO

How Keyword Research Impacts Search Engine Rankings

Keyword research is pivotal in determining search engine rankings. Search engines like Google use complex algorithms that take into account how well a website aligns with users’ search queries. Effective keyword research identifies high-value terms that can be strategically embedded within a website’s content, dramatically improving its chances of ranking higher in search results. By using pertinent and popular keywords, a business can signal to search engines that its content is relevant, valuable, and authoritative.

Relationship Between Keywords and User Intent

Keywords are not just about search volume; they are integral to understanding user intent. Each search query represents a specific intent, whether informational, navigational, or transactional. Grasping this intent allows businesses to target potential customers more accurately. Recognising user intent helps businesses create content that directly addresses the needs of their target audience. This, in turn, improves user satisfaction and can increase conversion rates, as the provided content meets the specific needs of users.

Keyword Research as a Foundation for Content Strategy

Keyword research is the cornerstone of a robust content strategy. It offers insights into what potential customers are searching for, guiding the creation of relevant and engaging content.
By integrating target keywords across blog posts, product descriptions, and other online materials, businesses ensure that their content is both discoverable and aligned with user expectations. Moreover, keyword research aids in content optimisation, which involves placing keywords strategically in titles, headings, and meta descriptions. This not only improves search engine rankings but also enhances the readability and relevance of the content. A well-executed keyword strategy also ensures that the content remains competitive. By evaluating keyword trends and search volumes, businesses can adapt to changing market dynamics, maintaining their visibility and relevance over time. As businesses strive to harness the full potential of keyword research, understanding the different types of keywords and search queries becomes crucial. This knowledge will further refine their keyword strategies and enhance their online presence.

Types of Keywords and Search Queries

Short-tail vs. Long-tail Keywords

Keywords can be broadly categorised into short-tail and long-tail keywords based on their length and specificity.

Informational, Navigational, and Transactional Search Queries

Understanding the types of search queries is crucial for aligning your content with user intent.

LSI (Latent Semantic Indexing) Keywords and Their Importance

LSI (Latent Semantic Indexing) keywords add significant value to your SEO strategy. These are keywords and phrases that are semantically related to your primary keyword. For example, LSI keywords for “apple” can be “fruit,” “orchard,” or “granny smith” if the context is fruit, or “iPhone,” “MacBook,” and “iOS” if the context is technology. Understanding and employing these various types of keywords and search queries enables businesses to create more targeted, user-centric content, enhancing their online presence and search engine ranking.
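As a toy illustration of how the informational/navigational/transactional split can be put to work, the sketch below classifies queries with simple cue-word heuristics. The cue lists and example queries are illustrative assumptions only; real tooling infers intent from far richer signals such as SERP features and click behaviour.

```python
# Hypothetical cue words per intent category, checked in this order.
INTENT_CUES = {
    "transactional": ("buy", "price", "cheap", "deal", "order"),
    "navigational": ("login", "website", "facebook", "youtube"),
    "informational": ("how", "what", "why", "guide", "tips"),
}

def classify_intent(query: str) -> str:
    """Return the first intent whose cue words appear in the query."""
    words = query.lower().split()
    for intent, cues in INTENT_CUES.items():
        if any(cue in words for cue in cues):
            return intent
    # Queries with no cue words are treated as research queries.
    return "informational"

queries = ["how to fix canonical urls", "buy blue widgets", "facebook login"]
print({q: classify_intent(q) for q in queries})
```

Even a crude bucketing like this makes the strategic point concrete: transactional queries deserve product and landing pages, while informational queries are better served by guides and blog posts.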
Proper utilisation of short-tail, long-tail, and LSI keywords, along with an in-depth understanding of user intent, forms the backbone of a successful SEO strategy. This comprehensive approach to keyword research ensures that your content is not only visible but also relevant and engaging to your target audience.

Keyword Research Tools and Techniques

Overview of Popular Keyword Research Tools

Effective keyword research is facilitated by a variety of tools that provide valuable insights into search trends, keyword difficulty, and competition. These tools are essential for gaining a competitive edge and understanding the landscape of your target keywords.

Techniques for Identifying Valuable Keywords

Combining the use of these tools with strategic techniques can help identify the most valuable keywords for your business and ensures a comprehensive approach to keyword identification.

Analysing Keyword Difficulty and Search Volume

Understanding keyword difficulty and search volume is crucial for determining the best keywords to target. Balancing difficulty and volume while considering your site’s authority and resources is key to an effective keyword strategy. Transitioning to practical application,