
Thursday, February 29, 2024

How Google Search Generative Experience is impacting ads

The impact of Google’s Search Generative Experience on ads within Google’s SERPs has been explored in a new study.

Research conducted by SEO platform SE Ranking found that:

  • Regardless of SGE snippet presence, ads appear more often at the bottom of search results.
  • Ads at the top of the SERP accompany SGE snippets more often than shopping ads (carousels).
  • Shopping ads (carousels) usually appear above the SGE snippet.
  • Niches like Fashion and Beauty, and Ecommerce and Retail, are more likely to feature shopping ads.

Why we care. Understanding the impact SGE has on ads in Google SERPs is crucial for businesses and marketers so that they can make data-led decisions for more effective marketing.

The study. SE Ranking examined 100,000 keywords across 20 niches, each with different search intents and volumes. The analysis provided insights into how often Google offers AI-generated responses, the length of content in the Search Generative Experience (SGE), linking patterns, variations across niches, and ad placement.

The findings. When SGE snippets with text were present (18,455 instances), the study found:

  • Ads appeared at the top 4,280 times (23.19%).
  • Ads appeared at the bottom 6,499 times (35.21%).
  • Shopping ads appeared in the form of carousels (as shown in the screenshot below) 2,660 times (14.41%).
  • This means that in 5,016 instances (27.17%), no ad accompanied the SGE snippets with text.
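The arithmetic behind these shares is easy to sanity-check: each percentage is its count divided by the 18,455 snippet SERPs, and the four counts fully partition that total. A quick sketch using the study's figures:

```javascript
// The study's four placement counts partition the SERPs that showed a text
// SGE snippet, so they should sum to 18,455. Figures from the study; the
// helper just recomputes the shares.
const counts = { top: 4280, bottom: 6499, shoppingCarousel: 2660, none: 5016 };
const total = Object.values(counts).reduce((a, b) => a + b, 0);
console.log(total); // 18455, so the counts fully partition the snippet SERPs

const pct = (n) => (100 * n / total).toFixed(1) + '%';
console.log(pct(counts.bottom)); // "35.2%", bottom-of-SERP ads are most common
```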

Ad type. The study also looked at different ad types combined with text-based SGE snippets. Researchers found:

  • Both shopping ads and ads at the top of the SERP accompanied SGE responses in 1,018 cases. 
  • Shopping ads appeared alongside text-based SGE snippets in 2,660 cases.
  • In 3,262 cases, only ads at the top of the SERP appeared alongside text-based SGE snippets.

Carousel shopping ads. Carousel shopping ads can be positioned above, below, or in the sidebar of the SGE snippet.

Ad placement. The study also looked into the most popular location for shopping ads in the SGE SERP. The data suggests a clear preference for placing shopping ads above the SGE snippet:

  • Shopping ads appeared above the SGE snippet 2,969 times (80.72%). 
  • In 502 cases (13.65%), shopping ads were placed below the SGE snippet. 
  • Shopping ads appeared in the sidebar 207 times (5.63%), making this placement the least common.

What SE Ranking is saying. An SE Ranking spokesperson said in a statement:

  • “The main takeaway here is that we really need to stay on top of things. Changes seem to be happening practically every day.”
  • “This becomes extra clear when we compare these results to a previous study we did. When we ran our first round of SGE research back in late 2023, the results showed that only about 4% of keywords failed to trigger an SGE snippet. This time, 12.3% of keywords did not trigger an SGE response.”
  • “While this study has shed some light on Google’s AI-powered responses, much remains uncertain. We’ll be keeping a close eye on SGE’s latest developments and do more research to stay up-to-date.” 

What Google is saying. Philipp Schindler, SVP and CBO, Google, said during Alphabet’s 2023 Q4 earnings call:

  • “As we shared last quarter, Ads will continue to play an important role in the new Search experience, and we’ll continue to experiment with new formats native to SGE.”
  • “SGE is creating new opportunities for us to improve commercial journeys for people by showing relevant ads alongside search results.”
  • “We’ve also found that people are finding ads either above or below the AI-powered overview helpful, as they provide useful options for people to take action and connect with businesses.”

Get the daily newsletter search marketers rely on.


Deep dive. Read the SE Ranking report in full for more information.



from Search Engine Land https://ift.tt/viMH37S
via IFTTT

Becoming a world-class PPC ad buyer: 8 key lessons

In PPC advertising, the skills and knowledge of ad buyers directly impact campaign success and return on investment. 

So, what separates good ad buyers from the truly world-class?

The journey from a novice to an expert in ad buying is paved with trials, errors and invaluable lessons. 

This article delves into personal experiences and industry insights, offering a comprehensive guide to mastering the art of ad buying.

1. Invest your own money in ads

The first and most profound lesson in ad buying comes from spending money. The direct correlation between personal investment and the urgency to learn is undeniable. 

Firsthand experience lays a solid foundation for understanding the intricacies of ad buying.

When your own finances are at stake, you’re propelled to analyze, adjust and optimize your ads to get the most profitable results in the least amount of time. 

Nothing will teach you how to maximize what is possible faster than spending your own money and turning a profit from ad buying.

If you are managing the ad budgets on behalf of someone else, maintain campaigns for your own project on the side with an ad budget that is significant to you. Your ability to produce results will compound exponentially. 

2. Don’t just test, extract learnings

Testing in Google Ads is so easy today that you can take it for granted. You can test ad copy, bid strategies and campaign structures. The ability to run A/B experiments is practically endless.

The hazard lies in running lots of tests without extracting learnings from them: for example, finding a winner and moving straight on to the next idea to “beat the control.”

True mastery comes by forming hypotheses, testing them thoroughly and developing a true understanding of AI, user behavior and best practices.

Delve into the “why” behind each success or failure, and let these insights inform your strategies across all projects.
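One way to extract a learning rather than simply crown a winner is to check whether an observed lift is statistically meaningful before generalizing it. A minimal sketch using a two-proportion z-test (the click and impression figures below are made up for illustration):

```javascript
// Hedged sketch: quantify confidence in an A/B difference instead of just
// declaring a winner. Illustrative numbers only.
function twoProportionZ(clicksA, nA, clicksB, nB) {
  const pA = clicksA / nA;
  const pB = clicksB / nB;
  const pooled = (clicksA + clicksB) / (nA + nB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return (pB - pA) / se; // |z| > 1.96 is roughly significant at the 95% level
}

// Control: 100 clicks / 2,000 impressions; variant: 130 clicks / 2,000
const z = twoProportionZ(100, 2000, 130, 2000);
console.log(z.toFixed(2)); // "2.04", above 1.96, so likely a real lift
```

If the z-score had landed below 1.96, the honest learning would be "not enough evidence yet," which is itself worth recording.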

Dig deeper: A/B testing mistakes PPC marketers make and how to fix them

3. Understand industry nuances

As an expert, you will understand that ad buying is not a one-size-fits-all. Strategies that triumph in one industry may falter in another.

For example, an intent-based structured campaign may perform well for retail, but a themed-based structure may be best for finance. 

Recognizing and adapting to these differences is crucial. Whether it’s deciding to launch a YouTube campaign for conversions or what call-to-action to use for retargeting, each industry demands a tailored approach.

Stay attuned to nuances and leverage past successes while remaining flexible and innovative.

4. Master CRO and funnel dynamics

Ad buying is only half the battle. A well-optimized website and funnel can significantly impact the results of your ad campaigns. 

As a professional ad buyer, understanding the principles of conversion rate optimization (CRO) and the mechanics of successful funnels is essential. By immersing yourself in these areas, you can ensure that your ad buying is maximizing returns, reducing the likelihood of wasted spend. 

The more you can expose yourself to what works well, the better you can influence the make-or-break elements of your ad buys. And the better you can diagnose problems with conversion rates if they arise. 

PPC funnel example

Dig deeper: Boosting search conversions: 5 behavioral strategies to test




5. Build and engage with your community

What is the best way to expose yourself to CRO winners and successful funnels? Join communities where folks share and learn what is working in their companies. 

No ad buyer is an island. Making new friends with fellow marketers, entrepreneurs and other ad buyers can provide you with new knowledge and fresh perspectives. 

This is best done by attending workshops, joining conferences and participating in masterminds. The time and money investment pays lots of dividends.

Build your network as a support system and resource for collaborative learning and innovation.

6. Seek inspiration beyond ads

World-class ad buyers draw inspiration and learnings from a variety of sources.

Observing nature and analyzing different industries and everyday experiences can provide insights into consumer behavior and new marketing strategies. 

Take a statistics class to learn about Bayes’ Theorem. Apply this to your ad testing and buying decision-making. 
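Applied to ad buying, Bayes’ Theorem means updating a prior belief about performance as click data accumulates. A hedged sketch using a Beta-Binomial model (the prior and the traffic figures are hypothetical):

```javascript
// Illustrative sketch of Bayesian updating for ad decisions. With a
// Beta(a, b) prior on CTR, the posterior after `clicks` successes in
// `impressions` trials is Beta(a + clicks, b + impressions - clicks).
function posteriorCtr(priorA, priorB, clicks, impressions) {
  const a = priorA + clicks;
  const b = priorB + (impressions - clicks);
  return a / (a + b); // posterior mean CTR
}

// A Beta(2, 98) prior encodes a weak belief in a roughly 2% CTR.
// After 30 clicks in 1,000 impressions, the estimate shifts toward 3%:
console.log(posteriorCtr(2, 98, 30, 1000).toFixed(4)); // "0.0291"
```

The practical payoff: early, noisy data moves your estimate only a little, while a large sample dominates the prior, which guards against overreacting to a lucky first day.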

Walk into a casino and notice how everything from the carpet to the lack of windows is intentionally designed to get people to stay in the building. Use this insight and start thinking differently about traffic or how to combat ways ad platforms can set you up to lose money.

Nature is a great source of optimization inspiration. Ecosystems support life, and every contributor in them is vital.

Yet no individual contributor needs to be perfectly optimized; they all just need to be healthy for the environment to thrive. Conversely, if one area is suffering, everything is suffering.

Open your mind to the world and apply these diverse perspectives to your ad buying approach. Remember, innovation often comes from the most unexpected places.

Dig deeper: 5 essential PPC skills every agency pro must have

7. Develop your mind

As an ad buyer, you have opportunities to push yourself each and every month to brainstorm solutions to move your performance forward. 

It can be difficult to produce something impactful each month, year on end.

When you discipline yourself to do this every month, rain or shine, over the years, producing brand-new, never-tried-before ideas will get easier and easier, just as consistently lifting heavy weights makes your muscles stronger.

Push yourself to brainstorm solutions to move accounts forward every month, especially when it is hard and you feel there is no answer.

Similarly, push yourself to tackle complex analysis. Doing hard things over and over makes them easier, and eventually, you will get really good at it. You may even find it to be fun in the end.

8. Embrace continuous improvement

As you work to become a world-class ad buyer, never label yourself as bad at something. For example, “I’m not good at ad writing” or “I’m not good at running display campaigns.”

It’s important to understand that although everyone has their own strengths and weaknesses, not being good at something is usually because you haven’t done that thing enough times yet. 

Instead of using labels of inadequacy, adopt a growth mindset and view every inadequacy as an opportunity for learning. If you find certain ad buying aspects challenging, remember that proficiency comes with practice. Continuous improvement is the hallmark of a world-class ad buyer.

Dig deeper: 3 core skills that will help advance your SEM career

Invest, test, optimize, network and grow as a PPC ad buyer

Becoming a world-class ad buyer requires dedication, continuous learning and a willingness to experiment. By incorporating these valuable tips into your approach, you can develop the skills and knowledge necessary to deliver exceptional results consistently. 

Your growth is limitless and your potential is only bounded by the limits you set for yourself. Decide to begin your journey today and transform your ad buying skills to be extraordinary.



from Search Engine Land https://ift.tt/DVasfpd
via IFTTT

How to reach new audiences with multi-platform search advertising

For years, Google and Bing have dominated paid search. Brands have focused most of their PPC budgets on these two search titans.

However, the search landscape is evolving rapidly. Platforms like TikTok, Pinterest, YouTube, Reddit and Instagram offer exciting new opportunities for savvy advertisers looking to expand their reach.

These alternative platforms come packed with engaged users actively searching for products, inspiration and information. Each also enables targeting options using keywords, interests and more. This allows you to get your brand and products in front of massive new audiences beyond just Google and Bing.

This article explores how to leverage these additional search platforms to complement your existing PPC efforts and connect with customers throughout more touchpoints in their journey. 

Redefining search advertising: Other platforms to explore

Search advertising is the placement of ads within search engines when a user searches for a particular product or service. With this in mind, any platform with a “search” bar holds potential as part of a search advertising strategy.

This helps us redefine platforms we typically may not associate with search advertising. Let’s explore the platforms I deem to have the most potential to support and complement your search advertising through traditional means.

Dig deeper: Search universe analysis: A deep dive

TikTok

Known to most as a social media platform, TikTok has been making huge strides to disrupt Google’s search dominance.

The platform has carved out a niche in the market by offering the most potential to ecommerce brands through TikTok Shop. It is currently positioned as a disruptor, with an Adobe study showing that 41% of people surveyed have used it as a search engine.

TikTok’s growth as a search engine stems from a surge in usage among Gen Z, but the user demographic has been expanding, creating further opportunities for brands.

There are many reasons users are drawn to TikTok as a search engine, including:

  • Short-form video content, making information more digestible.
  • Personalized content experience.
  • Higher user reviews for products.
  • User-generated content (UGC), making the content more relatable.

To position itself further as a search engine, TikTok released a search volume tool, TikTok Keyword Insights, to help businesses understand trending keywords within their platform content.

TikTok Keyword Insights

From inspiration to impulse buys, TikTok Shop offers the most potential wins for ecommerce businesses on the platform. TikTok was the second-largest social media platform for purchases, behind only Instagram in 2023, per Statista.

Dig deeper: Is TikTok a search engine? Why meeting searchers’ needs matters more than semantics

Pinterest

Pinterest is a social and search platform. Users go to Pinterest to gain inspiration through searches and recommended content.

From an advertising perspective, brands can take advantage of keyword targeting. Those familiar with Google Ads will know these as the building blocks of search advertising.

Like Google Ads, Pinterest allows you to choose match types, including negative keywords. But unlike Google, the match types on Pinterest still work as their names suggest.

While Pinterest caters to user searches, advertisers should bear in mind that it does sit higher up the purchasing funnel but can be utilized effectively as part of a multi-touchpoint strategy.

Dig deeper: 8 new and updated Pinterest products for advertisers




YouTube

YouTube is the second largest search engine, behind its sibling engine, Google.

While Google is typically used for finding products or services, YouTube is more commonly used for inspiration, information and education.

The billions of views on YouTube offer advertisers an enormous opportunity to influence users at different points of their purchasing decisions. You can create the initial need and support post-purchase campaigns to build retention.

YouTube ads can be targeted using interests, demographics and keywords. 

Dig deeper: YouTube advertising: The ultimate guide

Reddit

Users go to Reddit to express opinions and seek advice from individuals and communities with experience in a given area.

This often incorporates product reviews and suggestions, providing an opportunity to reach an already engaged audience.

The caveat is that you want to ensure the community engages positively with your product or service. You also want to check whether an active community is talking about and sharing your product or service.

On Reddit, you can target specific communities, but in true search engine fashion, you can also target based on keywords.

Reddit - audience targeting

Dig deeper: 5 must-know Reddit Ads tactics for B2B marketers

Instagram

As TikTok gains popularity as a search engine, Instagram is repositioning itself as an alternative, which is evident in its recent ad campaign.

Instagram ad as a search engine

Instagram’s ad targeting options align with Facebook and other Meta platforms.

Despite this, there’s a chance to optimize your organic Instagram posts for better search visibility and boost ad delivery within search results and Discovery pages.

Dig deeper: Instagram Ad formats: Best practices for effective ad creative

How to incorporate more search engines in your PPC strategy

Changes in consumer behavior create opportunities for businesses to innovate in responding to user searches. This involves incorporating creative visuals to engage and inform potential customers more effectively.

To understand which platforms will benefit your business, consider:

  • Your sector.
  • Where you have an active community.
  • How they may respond to seeing your brand in each environment.

Tools such as KeywordTool and Glimpse offer insight into search volumes, helping you understand search intent and trends within each platform.

Glimpse works as an add-on to Google Trends.

Glimpse tool - Multi-platform search

Tailor your creative to each platform environment and user intent and launch any activity with a test budget. Develop a comprehensive measurement plan so that you can determine what success in each platform could look like.

Expanding your search advertising across multiple platforms widens your audience reach, boosts engagement with existing audiences and enhances brand awareness and recognition, ultimately capturing more demand for your product or service.

Dig deeper: How to optimize your social media pages for search



from Search Engine Land https://ift.tt/Jjq7nuh
via IFTTT

Wednesday, February 28, 2024

This day in search marketing history: February 29

Spreading Santorum drops on Google

In 2012, Spreading Santorum, the page defining “santorum” as a byproduct of anal sex, finally dropped from the top results on Google.

The related anti-Santorum blog, however, remained. And a page from Urban Dictionary kept the definition alive, more explicit than before.

Google told Search Engine Land this ranking change was related to its improved SafeSearch algorithm, which made irrelevant adult content less likely to show up for many queries.

Read all about it in “Spreading Santorum” Drops At Google; New Site Keeps Anal Sex Definition At Number One.


Also on this day


Bing’s “Search Wave” Showcases Search Volume For 2016 Presidential Candidates

2016: In preparation for Super Tuesday primaries, Bing rolled out a new tool that determines the top-searched candidates.


msnNOW Is Driving More Traffic To Bing, But Is It Artificially Inflating Searches?

2012: Experian Hitwise reported that downstream traffic from msnNOW to Bing jumped 21% between the first and second weeks since msnNOW’s launch.


New comScore Study Suggests 50 Percent Of Local-Mobile Search Happening In Apps

2012: 49% of smartphone and tablet owners were using apps to find local information.


Groupon Buys Travel Search Site Uptake Mostly For Headcount

2012: Uptake was an ambitious travel site that never quite broke through.


YouTube To Add Live Video This Year

2008: YouTube’s founder said they never had the resources to do it correctly.


Updated: IAC Ready To Drop Ask.com Search Technology & Partner With Google?

2008: Ask.com said the rumors were false and “our Teoma technology will continue to power search engine results on Ask.com.”


Search In Pictures: SMX Bowl, Google Jeans, Bart’s Chalk Board

2008: The latest images showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more.




Past contributions from Search Engine Land’s Subject Matter Experts (SMEs)

These columns are a snapshot in time and have not been updated since publishing, unless noted. Opinions expressed in these articles are those of the author and not necessarily Search Engine Land.





from Search Engine Land https://ift.tt/4kp1NjP
via IFTTT

Google faces $2.27 billion lawsuit by publishers over advertising practices

Google is facing a $2.27 billion lawsuit by 32 media groups claiming that the company’s digital advertising practices have led to financial losses.

The publishers, including Axel Springer and Schibsted, are based in various countries across Europe, such as Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, Hungary, Luxembourg, the Netherlands, Norway, Poland, Spain, and Sweden.

What the lawsuit is saying. A statement issued by the media groups’ lawyers Geradin Partners and Stek said per Reuters:

  • “The media companies involved have incurred losses due to a less competitive market, which is a direct result of Google’s misconduct.”
  • “Without Google’s abuse of its dominant position, the media companies would have received significantly higher revenues from advertising and paid lower fees for ad tech services. Crucially, these funds could have been reinvested into strengthening the European media landscape.”

What Google is saying. Google denies the allegations and has described them as “speculative and opportunistic.” Oliver Bethell, Legal Director, Google, told Search Engine Land in a statement:

  • “Google works constructively with publishers across Europe – our advertising tools, and those of our many adtech competitors, help millions of websites and apps fund their content, and enable businesses of all sizes to effectively reach new customers.”
  • “These services adapt and evolve in partnership with those same publishers. This lawsuit is speculative and opportunistic. We’ll oppose it vigorously and on the facts.”

Timing. This lawsuit follows the French competition authority imposing a $238 million fine on Google for its ad tech business in 2021, as well as the charges brought by the European Commission last year, both of which are referenced in the media groups’ claim.

Dutch court. The group chose to file the lawsuit in a Dutch court because the country is well-known for handling antitrust damages claims in Europe. This decision helps avoid dealing with multiple claims across different European countries.




Deep dive. Read our antitrust trial updates article for information on Google’s legal battles in the U.S., where the tech giant is being sued by the U.S. Justice Department.



from Search Engine Land https://ift.tt/hCGLmEO
via IFTTT

Google Analytics 4 launches default Google Ads report

Google Analytics 4 introduced a default Google Ads report, now available within your account’s performance reporting section.

To access this report, you need to link your GA4 property with your Google Ads account.

Why we care. The inclusion of this Google Ads report, formerly found in GA4’s predecessor Universal Analytics, simplifies data access within GA4. It helps you figure out what’s working well and what needs improvement, making it easier to optimize your campaigns effectively.

First spotted. The report was first spotted by Senior Performance Marketing Manager and Google Ads expert, Thomas Eccel, who shared a preview on X:

To locate this report, go to the Advertising section, navigate to “Performance”, and click on “Google Ads”.

What Google is saying. Google confirmed the new report to Search Engine Land, explaining it’s part of its Advertising workspace update. A spokesperson recently confirmed:

  • “If you currently run ads campaigns or monetize your properties with ads, make sure your ads accounts are linked to continue getting these actionable insights. If no account is linked, you will see a page that prompts you to link to an ads or publisher account.”



Deep dive. Read our GA4 Advertising workspace update report for more information.



from Search Engine Land https://ift.tt/dTUV9aq
via IFTTT

4 SEO tips to elevate the user experience

Brands and businesses must balance optimizing their online presence through SEO with providing an excellent customer experience.

This raises the question – can SEO redefine client experience, or does it risk overshadowing other important elements of the customer journey?

While SEO is key for visibility and accessibility, companies must be careful not to prioritize it over user experience or broader marketing strategy. 

Learn how to balance SEO with other efforts to build brand loyalty and meaningful customer relationships.

Balancing SEO with other crucial elements

While SEO is crucial in redefining your client experience, it’s important to find a balance and avoid overshadowing other crucial elements of the customer journey.

Be careful not to prioritize SEO metrics over user experience or other aspects of your marketing strategy. 

For example, as you optimize for search engine rankings, you shouldn’t sacrifice the authenticity and relevance of your content. 

Make sure that your SEO efforts align with broader marketing initiatives to cultivate brand loyalty and nurture meaningful customer relationships.

Dig deeper: SEO and UX: Finding the strategic balance for optimal outcomes

SEO best practices that improve the user experience

1. Treat your visitors to a great user interface

User interface (UI) plays a pivotal role in website performance and user engagement. Websites with shoddy user interfaces might fail to retain visitors’ attention.

Difficult navigation, cluttered layouts and lack of informative content contribute to higher bounce rates and shorter session durations, adversely affecting SEO metrics.

To address this:

  • Prioritize creating intuitive, user-friendly interfaces that captivate and retain visitors. 
  • Ensure easy navigation through clear menu structures, informative service pages and engaging multimedia elements such as images and videos. 
  • Align UI design principles with SEO best practices to create immersive digital environments that foster meaningful interactions and drive conversion.



2. Establish good linking habits

Linking strategies play a crucial role in SEO and user experience. 

Backlinking, the process of acquiring inbound links from external websites, can improve site accessibility and SEO rankings and enhance the credibility and authority of the website. 

Internal and external links serve as pathways for users to navigate through relevant content and access valuable information.

Strategic external linking to authoritative sites reinforces trust and legitimacy, enriching the user experience. 

Directing users to reputable sources and complementary content establishes your brand as a reliable source of information within your respective industries.

Additionally, internal linking structures guide users through the website’s hierarchy, facilitating seamless navigation and encouraging deeper engagement with the content.

3. Enhance accessibility for all users

To improve user experience and SEO performance, prioritize accessibility.

Ensure that all users, regardless of their abilities or disabilities, can navigate and engage with the website effectively. 

Accessibility encompasses various considerations, from accommodating users with visual or auditory impairments to those with motor or cognitive limitations.

From an SEO perspective, accessible websites perform better in search engine rankings as they provide a more inclusive and user-friendly experience. 

Here’s how to enhance accessibility to improve both SEO and user experience:

  • Semantic HTML and alt text: Use semantic HTML tags to structure content logically, making it easier for screen readers to interpret and navigate. Provide descriptive alt text for images to assist users who rely on text-to-speech technology or have images disabled.
  • Readable typography and contrast: Choose legible fonts and appropriate font sizes and ensure sufficient color contrast between text and background to enhance readability for all users, including those with visual impairments.
  • Testing with assistive technologies: Regularly test the website with assistive technologies like screen readers and voice recognition software to identify and address accessibility barriers. Conduct usability testing sessions with users of diverse abilities to gather feedback and improve accessibility features.
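As a small illustration of the alt-text point, an audit can start as simply as flagging image tags that lack the attribute. A rough sketch (a real audit would use the DOM or a dedicated accessibility tool such as a screen-reader test; the `imagesMissingAlt` helper is invented for illustration):

```javascript
// Minimal sketch: flag <img> tags in an HTML string that have no alt
// attribute. A regex scan is crude; production audits should parse the DOM
// or use a dedicated accessibility checker.
function imagesMissingAlt(html) {
  const imgs = html.match(/<img\b[^>]*>/gi) || [];
  return imgs.filter((tag) => !/\balt\s*=/i.test(tag));
}

const page = '<img src="a.png" alt="Chart of CTR"><img src="b.png">';
console.log(imagesMissingAlt(page)); // only the second tag lacks alt text
```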

Dig deeper: 10 internal linking best practices for accessibility

4. Optimize page load times

Page load times are critical factors that significantly impact user experience and SEO performance. Users expect instantaneous access to information, and slow load times can lead to frustration and abandonment. From a UX perspective, slow-loading websites diminish engagement and deter users from exploring further.

Optimizing site performance through technical audits, code enhancements and content optimization strategies improves website stability and enhances SEO rankings.

By reducing page load times and streamlining the browsing experience, you can create frictionless interactions that captivate users and drive organic traffic.

Dig deeper: Page speed and experience in SEO: 9 ways to eliminate issues

Crafting an engaging experience for customers with SEO

SEO strategies and user experience efforts should work in tandem to create an optimal customer journey. Prioritize user-centric design and accessibility while implementing SEO best practices.

Ensure that your focus on search engine optimization does not overshadow building meaningful relationships and enhancing overall brand value. 

With the right balance of SEO and UX considerations, you can gain visibility in search results while providing an engaging, seamless experience for customers. The key is integrating SEO seamlessly into your overall digital marketing and content strategy.



from Search Engine Land https://ift.tt/j2BADO7
via IFTTT

WalkerOS: A data collection alternative to gtag.js

Google’s popular gtag.js library makes collecting data for Google Analytics 4 and Google Ads straightforward.

However, it also ties you into Google’s ecosystem. You lose control and flexibility when tracking data.

Enter walkerOS. This new open-source tracking library from ElbWalker aims to give you customizable control back. It lets you send data wherever you want, not just to Google. It also claims better performance through a lightweight codebase.

This article explores whether walkerOS lives up to its promises. We’ll also:

  • Compare its features, flexibility and ease of use vs. the Google tag. 
  • Learn the cases where switching makes sense, along with the potential downsides.

What is gtag.js?

The Google tag, or gtag.js, is a JavaScript library by Google that tracks and collects data, serving as an all-encompassing link between your site and various Google services, including Google Ads and Google Analytics 4. 

As opposed to ga.js and analytics.js, which were limited to analytics, gtag.js provides a single solution. 

It achieves efficiency by using other libraries instead of handling analytics and conversion data capture directly, essentially acting as a framework for those libraries.

This makes it easier during the setup and integration processes while reducing the need for extensive code changes. 

Gtag.js combines multiple tracking tags into one, unlike Google Tag Manager. This simplifies user experience, allowing for easier event detection and cross-domain tracking. 

Overall, it provides detailed insights into visitor behavior and traffic sources more easily, improving its usefulness.
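That framework role is visible in the familiar install snippet: the page defines a `dataLayer` queue and a tiny `gtag()` helper, and the asynchronously loaded library later drains the queue. A minimal sketch (the measurement ID is a placeholder, and `globalThis` stands in for the browser’s `window` so the snippet runs anywhere):

```javascript
// The async loader <script> tag for
// https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX normally sits in
// <head> next to this bootstrap. Before the library loads, gtag() calls
// simply queue up in dataLayer.
globalThis.dataLayer = globalThis.dataLayer || [];
function gtag() { dataLayer.push(arguments); }

gtag('js', new Date());      // record when the library was loaded
gtag('config', 'G-XXXXXXX'); // route data to a GA4 property (placeholder ID)
gtag('event', 'sign_up', { method: 'email' }); // example event
```

Because the helper only pushes its arguments onto the queue, calls made before the library finishes loading are never lost, which is what makes the single-snippet setup safe.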

Dig deeper: Google releases simple, centralized tag solution

Why should you look for a gtag.js alternative?

While gtag.js is the industry standard for Google Analytics and Ads tracking, there are situations where alternatives are preferred. Reasons include privacy, lightweight libraries, server-side data collection and data ownership to avoid vendor lock-in.

Alternatives may provide better control over user data, aiding compliance with regulations such as GDPR and CCPA. They may offer features like data anonymization and selective data collection. This ensures data is managed in line with organizational privacy policies, reducing the risk of data sharing with third parties.

Page speed is vital, so optimizing for JavaScript library performance matters. While gtag.js is lightweight, using multiple libraries can slow down a site.

Smaller libraries improve load times, enhancing user experience, especially on mobile. Consider multi-destination libraries for better performance.

From a data security perspective:

  • Sensitive information can be kept more secure and the risk of being intercepted or manipulated on the client side is reduced.
  • Server-side data collection can bypass issues related to ad blockers or browsers that restrict tracking scripts, potentially offering more accurate analytics data.
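To illustrate the first-party idea behind that second point: the browser posts events to an endpoint on your own domain, so blockers that target third-party tracking domains don't apply. The endpoint path and payload shape below are illustrative assumptions, not any particular product's API.

```javascript
// Sketch of first-party collection: events go to your own endpoint
// (here '/collect', an illustrative path) instead of a third-party script.
function makeCollector(send) {
  return function collect(name, data) {
    // Serialize a minimal event payload; shape is an assumption.
    return send('/collect', JSON.stringify({ name, data, ts: Date.now() }));
  };
}

// In a browser you would pass a fetch-based sender; a stub shows the call:
const calls = [];
const collect = makeCollector((url, body) => calls.push({ url, body }));
collect('page_view', { path: '/pricing' });
```

Because the transport is injected, the same collector can be pointed at a tagging server, a queue or a test double.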

Exploring alternatives offers flexibility in data management, avoiding vendor lock-in and pricing constraints.

Owning your data enables seamless integration with various systems and custom analytics solutions. For instance, if consent for Google Analytics 4 is denied, your tagging server might not receive all data.

What is walkerOS?

Here’s where the walkerOS library comes into play. 

WalkerOS (implemented on the web via walker.js) offers a flexible data management system, allowing users to tailor data collection and processing to their needs. 

It’s designed to be versatile, from simple utilities to complex configurations. Its main objective is to ensure data is sent reliably to any chosen tool. 

Simply put, you can implement walker.js and send data to every analytics and advertising destination you need. There’s no need for a massive number of different tags. 

The walkerOS event model offers a unified framework to meet the demands of analytics, marketing, privacy and data science through an entity-action methodology. 

This approach, foundational to walkerOS, systematically categorizes interactions by identifying the “entity” involved and the “action” performed. This structured yet adaptable model ensures a thorough understanding of user behavior.
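A sketch of what one such entity-action event might look like as a plain object (field names follow the documented model; the values are illustrative, not real output):

```javascript
// Sketch of a walkerOS entity-action event. Field names follow the
// documented entity-action model; the values are illustrative.
const event = {
  event: 'product add',                        // "<entity> <action>"
  data: { name: 'Everyday Ruck', price: 89 }, // entity properties
  entity: 'product',
  action: 'add',
  trigger: 'click',                            // which trigger fired
};

// The fixed "entity action" naming makes downstream routing mechanical:
const [entity, action] = event.event.split(' ');
console.assert(entity === event.entity && action === event.action);
```

Because every event is named the same way, destinations can filter or map events without bespoke parsing per tool.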

WalkerOS stands out for its adaptability in event tracking, allowing customization based on specific business needs rather than conforming to preset analytical frameworks. 

The philosophy behind walkerOS is to make tracking intuitive and understandable for all stakeholders, enhancing data quality and utility within an organization.


Get the daily newsletter search marketers rely on.


Working with walker.js and what to look out for

Getting started requires some tech knowledge and understanding, but it isn’t as hard as it seems. The walker.js web client can be implemented directly via code, via Google Tag Manager (recommended) or via npm.

All events are then sent to the dataLayer, from which we can start tagging via Google Tag Manager.

In the tagging process, we define the events we want to capture and send: filter usage, ecommerce purchases, add-to-carts, item views and more. 

Walker.js supplies a good range of triggers we can use, including click, load, submit, hover and custom actions. You can also add destination tags and define where to send the captured data. 

WalkerOS event journey

Walker.js works with prebuilt destinations like Google Analytics 4, Google Ads, Google Tag Manager, Meta Pixel, Piwik PRO and Plausible Analytics. It also offers an API to send custom events to any destination that can receive them.
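The multi-destination idea can be sketched generically: one captured event fans out to every configured destination. The function names below are illustrative, not the actual walkerOS API.

```javascript
// Generic sketch of multi-destination fan-out (illustrative names,
// not the walkerOS interface): one event reaches every destination.
const destinations = [];
const received = [];

function addDestination(name, push) {
  destinations.push({ name, push });
}

function track(event) {
  destinations.forEach((d) => d.push(event)); // fan out once per destination
}

addDestination('ga4', (e) => received.push(['ga4', e.event]));
addDestination('plausible', (e) => received.push(['plausible', e.event]));

track({ event: 'product add' });
// Both destinations received the same event without duplicate tags.
```

This is the structural win over per-tool tags: capture once, deliver everywhere, and adding a destination never touches the capture code.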

I recommend using their demo page to play around with it.

Switching away from gtag.js: What to consider

Switching from gtag.js to an alternative like walker.js for tracking and data collection comes with considerations and potential drawbacks, depending on your specific needs and setup. Here are some of the main points to consider:

Integrating with Google products

In terms of integration, gtag.js is designed to work seamlessly with Google’s suite of products, including Google Analytics, Google Ads and more.

An alternative like walker.js does not offer the same level of native integration, potentially complicating the setup with these services. You also need technical understanding to implement and maintain it. 

Feature support and customization

Gtag.js supports a wide range of out-of-the-box features tailored to Google’s platforms. Walker.js may not support all these features directly or might require additional customization to achieve similar functionality.

Ease of implementation for Google users

Gtag.js provides a straightforward implementation process for those already using Google products. Users might find that walker.js requires more technical knowledge to customize and integrate effectively. 

Google’s extensive documentation and community support make troubleshooting and learning easier. Walker.js, being less widespread, may have more limited resources for support and guidance.

Exploring GA4 data collection and tracking options

The decision between using gtag.js or switching to an alternative like walker.js depends on your specific use case and needs. If you heavily rely on the Google ecosystem and want seamless integration, then gtag.js is likely the best choice.

However, for those needing greater control and flexibility with their data collection and usage across systems, walkerOS offers a lightweight, customizable tracking solution. 

While the setup requires more technical knowledge, the ability to own your data and reduce vendor lock-in provides strategic long-term benefits for many businesses.

Dig deeper: How to set up Google Analytics 4 using Google Tag Manager



from Search Engine Land https://ift.tt/JNRWe3g
via IFTTT

Tuesday, February 27, 2024

3 reasons not to block GPTBot from crawling your site

The next phase in ChatGPT’s meteoric rise is the adoption of GPTBot. This new iteration of OpenAI’s technology involves crawling webpages to deepen the output ChatGPT can provide. 

AI improvement seems positive, but it’s not so clear-cut. Legal and ethical issues surround the technology.

GPTBot’s arrival has highlighted these concerns, as many major brands are blocking it instead of leveraging its potential.
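For reference, blocking is done with a standard robots.txt rule; OpenAI documents GPTBot’s user agent token, so a site can disallow it entirely:

```
User-agent: GPTBot
Disallow: /
```

Swapping `Disallow: /` for path-specific rules lets a site keep most content crawlable while fencing off particular directories.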

Websites blocking GPTBot

But I truly believe there’s much more to gain than lose by fully (and responsibly) embracing GPTBot.

Why do AI bots like GPTBot crawl websites? 

Understanding why bots like GPTBot do what they do is the first step to embracing this technology and leveraging its potential.

Simply put, bots like GPTBot crawl websites to gather information. The main difference is that rather than an AI platform passively being fed data to learn from (the “training set,” if you will), a bot can actively pursue information on the web by crawling various pages. 

Large language models (LLMs) scour these websites in an attempt to understand the world around us. Google’s C4 data set makes up a large portion (15.7 million sites) of the learning body for these LLMs. They also crawl other authoritative, informative sites like Wikipedia and Reddit. 

The more sites these bots can crawl, the more they learn and the better they can become. Why, then, are companies blocking GPTBot from crawling?

Do brands that block GPTBot have valid fears?

When I first read about companies blocking GPTBot from crawling their websites, I was confused and surprised.

To me, it seemed incredibly short-sighted. But I figured there must be a lot to consider that I wasn’t thinking deeply enough about. 

After researching and talking to agency professionals with legal backgrounds, I found the biggest reasons.

Lack of compensation for their proprietary training data

Many brands block GPTBot from crawling their site because they don’t want their data used in training its models without compensation. While I can understand wanting a piece of their $1 billion pie, I think this is a short-sighted view. 

ChatGPT, much like Google and YouTube, is an answer engine for the world. Preventing your content from being crawled by GPTBot might limit your brand’s reach to a smaller set of internet users in the future.

Security concerns

Another reason behind the anti-GPTBot sentiment is security. While more valid than greedily hoarding data, it’s still a largely unfounded concern from my perspective. 

Top reasons organizations are banning ChatGPT

By now, all websites should be very secure. Not to mention, the content GPTBot is trying to access is public, non-sensitive content. The same stuff that Google, Bing, and other search engines are crawling daily. 

What caches of sensitive information do CIOs, CEOs, and other company leaders think GPTBot will access during its crawl? And with the right security measures, shouldn’t this be a non-issue?

From a legal standpoint, the argument is that any crawls done on a brand’s site must be covered by their privacy disclaimer. All websites should have a privacy disclaimer outlining how they use the data collected by their services. Attorneys say this language must also state that a generative AI third-party platform could crawl the data collected. 

If not, any personally identifiable information (PII) or customer data could still be “public” and expose brands to a Section 5 Federal Trade Commission (FTC) claim for unfair and deceptive trade practices.

I get this concern to some degree. If you’re the legal department of a big-name brand, one of your primary objectives is to keep your company out of hot water. But this legal concern applies more to what’s input into ChatGPT rather than what GPTBot crawls. 

Anything input into OpenAI’s platform becomes part of its data bank and has the potential to be shared with other users – leading to data leakage. However, this would likely only happen if users asked questions relative to stored information. 

This is another unwarranted concern to me because it can all be resolved by responsible internet usage. The same data principles we’ve used since the dawn of the web still ring true – don’t input any information you don’t want shared. 

An impulse to save humanity from AI advancement

I can’t help but think that leaders at some of these brands blocking GPTBot have a bias against the advancement of AI technology.

We often fear what we don’t understand, and some are frightened by the idea of artificial intelligence gaining too much knowledge and becoming too powerful.

While AI is evolving rapidly and beginning to “think” more deeply, humans are still largely in control. Additionally, legislation governing AI will grow alongside the technology.

When we finally reach a world of “autonomous” AI platforms, their functionality will be guided by years of human innovation and legislation. 


3 reasons not to block ChatGPT’s GPTBot

So why should you allow GPTBot to crawl your site? Let’s look on the bright side with these three primary benefits of embracing OpenAI’s bot technology.

1. 100 million people use ChatGPT each week

By not allowing GPTBot to crawl your site, you’re missing out on a 100 million-person audience for maximizing brand visibility. 

Sharing access to your website content can help ensure your brand is both factually and positively represented to ChatGPT users. 

This means there’s a higher chance that your brand will actually be recommended by ChatGPT, leading to more traffic and potential customers.  

Some brands report getting 5% of their overall leads, or $100,000 in monthly subscription revenue from ChatGPT. I know our agency has already gotten some leads from ChatGPT, too.

Another way to consider this is as a positive digital PR (DPR) play. You should leverage DPR strategies like brand mention campaigns in today’s landscape. 

Permitting GPTBot to crawl your site only adds to these efforts by allowing ChatGPT to access your brand information directly from the source and distribute it to 100 million users positively. 

2. Generative engine optimization (GEO)

Whether or not you have fears about AI, we can all agree that it’s changing the marketing landscape. Like all new technologies and trends in our industry, those slow to embrace AI as a conduit for new business and brand exposure will miss the proverbial boat. 

GEO is picking up steam as a sub-practice of SEO. You’ll miss a significant opportunity if you’re not targeting some of your marketing efforts at this marketplace. Competitors may pick it up after you let it slip through the cracks. 

We know it’s easy for brands to fall behind in today’s fractured and ever-growing marketing landscape. If your competitors spend years working on GEO, maximizing LLM visibility and developing skills and expertise in this area, they’ll be years ahead of you. 

Now, GEO reporting capabilities haven’t caught up to the value yet, which means it will be tough to measure an ROI, but that doesn’t mean it’s something to ignore and fall behind on.

Brands and marketers must start embracing LLMs like ChatGPT as an emerging acquisition channel that shouldn’t be ignored.

3. OpenAI’s pledge to minimize harm

A healthy distrust of AI technologies is important to its legal and ethical growth. But we also need to be open-minded and realize we can’t be effective as marketers if we resist change and choose not to grow and innovate along with it. 

OpenAI clearly states “minimize harm” as one of the guiding principles of their platform. They also have policies to respect copyright and intellectual property and have stated that GPTBot filters out sources violating their policies.

By allowing GPTBot to crawl your site’s content, you’re contributing to the clean and accurate training data OpenAI uses to enhance and improve its information accuracy.

As AI technology marches on, it can be easy to get caught up in skepticism, fear, and noise. Those struggling to embrace and maximize it will get left behind.



from Search Engine Land https://ift.tt/0e7bd1r
via IFTTT

7 reasons why your AI content sucks (and how to fix it)

BuzzFeed was one of the first major publishers to adopt heavy AI publishing. 

They drew scrutiny when a litany was plagiarized, copy-and-pasted, factually incorrect, awkward and simply poorly written.

Most recently, they’ve resorted to shutting down entire business units because of their inability to compete.

Sports Illustrated, also an early adopter, suffered from similar issues and is now also laying off staff to stem the bleeding.

Notice a pattern here?

Using AI to create content isn’t bad in and of itself.

But it often produces bad content. And that’s the problem. 

This article dissects AI-generated articles and contrasts them with one crafted by a human expert to illustrate the potential pitfalls of relying solely on AI-generated content.

Why brands are flocking to AI-written content

Look. It makes sense.

AI’s promise is incredibly seductive.

Who wouldn’t want to automate or streamline or replace inefficiency?! 

And I can’t think of a more inefficient process than sitting in front of a blank white screen and starting to type. 

As a red-blooded capitalist, I empathize. However, as a long-term brand builder, I can also recognize that AI content just simply isn’t good enough.

The juice ain’t worth the squeeze.

Too many fundamental problems and issues still don’t make it viable to use for any serious, ambitious brand in a competitive space.

In the future? Sure, who knows? We’ll probably serve robot masters one day.

But right now, the only potential use case we’ve seen that makes any possible sense is around extremely black-and-white stuff.

You know the classic SEO playbook: Glossaries. 

Straight plain, vanilla, top-of-funnel definitions.

Every SEO and their dog has heard about the “Great SEO Heist” – an infamously viral SEO story.

Now, I’m not going to kick someone while they’re down. 

But I am going to kick the $#!& out of their content ‘cause it’s just not any good. 

So let’s travel back in time for a second. 

Let’s look back at the warm, sunny days of Summer ‘23 when the brand-in-question ranked well using AI content. Then, let’s ignore the noise around it and rationally assess the content quality (or lack thereof).

Whoosh – top organic rankings from August ‘23:

What do you notice?

Tons of glossary-style, definition-based content.

Makes sense on the surface. The way LLMs work is by sucking in everything around them, understanding patterns and then regurgitating it back out. 

So it should, in theory, be able to do a passable job at vomiting up black-and-white information. 

Kinda hard to screw up. Right? 

Especially when you understandably lower the bar and do not have any expectations for true insight or expertise shining through. 

But here’s where it goes from bad to worse.

Problem 1: Top-of-the-funnel traffic doesn’t convert

This might sound like a trick question, but shouldn’t be:

Is the goal of SEO to drive eyeballs or buyers?

Ultimately, it’s both. You can’t drive buyers without eyeballs. 

And you often can’t rank for the most commercial terms in your space without having a big site to begin with. 

This Great SEO Catch-22 is why the Beachhead Principle is valuable. 

But if you had to pick one? You’d pick buyers. You ultimately need conversions to scale into eight, nine and 10-figure revenues.

Now. There is a time and place for expanding top-of-the-funnel content, especially when you’re in scale mode and trying to reach people earlier in the buying cycle.

However, as a general rule, extremely top-of-the-funnel work won’t convert. 

Like, ever. 

In B2C? In low-dollar amounts, impulse or transactional purchases? Possibly. But still unlikely. It’d require one helluva Black Friday discount.

But B2B? Or any other big decision that often requires complex, consultative sales cycles that naturally take weeks and months of actual persuasion and credibility? 

No chance. Here’s why.

Look up the Ahrefs example above, where one of the ranking keywords last summer was for “European Date Format.”

Now, let’s Google that query to see what we see:

That’s right, an instant answer! 

Exhibit A: Zero-click SERPs.

So, the searcher can get the answer they want without ever having to click on the webpage in question.

Kinda hard to convert visitors when they don’t even need to visit your website in the first place.

Think this pervasive problem will only get better when more people start using AI tools to sidestep or augment traditional Google searches?

Think again. 

Problem 2: Easy-to-come rankings are also easy-to-go

OK. Let’s look at another example.

The “shortcut to strikethrough” query was (at one point) the top traffic driver for this site.

So let’s dig deeper and unpack the competitiveness for a second.

All traditional measures of “keyword difficulty” are often biased toward the quality and quantity of referring domains to the individual pages ranking. 

They often neglect or gloss over or simply avoid measuring anything around a site’s overall domain strength, their existing topical authority, content quality and a host of other important considerations. 

(That’s why a balanced scorecard approach is more effective for judging ranking ability.)

But there are two big issues with the graph above:

Issue 1: Easy-to-rank queries are often easy to lose. 

All you need is a half-decent competitor worth their salt to actually publish something good and put out the minimum amount of distribution effort and you’ll lose that ranking ASAP.

Contrast this to a definition-style article we did with Robinhood waaaaaaay back in 2019, that’s still ranking well to this very day…  

… and that’s also competing against incredibly strong competitors, too:

Good rankings only matter if you can hold onto them for years, not weeks! 

Issue 2: Low-competition keywords are still low competition ‘cause there’s no $$$ in it!

Competition = money. The lack of competition in SEO, like in entrepreneurship, is usually a bad sign. Not a good one.

So, can you use AI content to pick up rankings for extremely top-of-the-funnel, low-competition keywords?

Technically, yes. 

But are you likely to hang on to that ranking over the long term, while also actually generating business value from it?

No. You’re not.

Problem 3: AI content is (and always will be) poorly written

Fine. I’ll say it.

Most people aren’t good at writing. It’s a skill and a craft. 

Sure, it’s subjective. But you learn some indisputable truths when you get good at it.

Here, I’ll give you one helpful tidbit to keep in the back of your mind. 

How do you spot “good” vs. “bad” writing online?

Specificity. 

Good writing is specific to the audience and, more importantly, the selected words and the context provided to bolster its claims.

Bad writing is generic. It’s surface level. It’s devoid of insight. 

It sounds like a freelance writer wrote it instead of a bona fide expert on the topic.

And that’s why AI content manufactured by LLMs will always struggle in its current iteration.

Again, let’s look at actual examples! (See? Specificity!)

That box in red above? 

Any half-decent editor would just remove the entire thing ASAP. And probably question why this person is writing for them in the first place.

It says a lot without saying anything at all. Pure fluff.

Flaccid, impotent writing at its finest.

And the box in yellow? Slightly better. Barely, though.

At least it gives some actual examples. However, the problem with this section is twofold.

Again, the examples are extremely surface-level at best and sloppy at worst. 

This is like when a teenager spouts off about something they just Googled two seconds ago, trying to make it sound like they know what they’re talking about now.

You know what it looks like when an amateur simply regurgitates what other people are saying vs. actually doing research and being knowledgeable about the subject they’re speaking on?

It looks exactly like that.

More importantly, while it mentions a few “advanced Excel formulas,” it fails to actually describe any “advanced Excel formulas.”

That’s a problem! Because it’s supposed to be the entire point of this section! 

Do you want to venture a guess as to why it’s failing to do that?

Because it doesn’t actually understand “advanced Excel formulas.” 

By definition, LLMs (and bad, amateur writers alike) don’t actually understand what they’re writing about. 

You can’t be specific about something if you don’t understand it in the first place.

AI content (and underlying LLMs) don’t understand how to associate different bits of knowledge and then expertly knit arguments together to form a coherent narrative.

Now, I know what you’re thinking:

“OK, Mr. Smarty Pants. Show me an example of good writing in a definition article, then?”

Fine. I’ll see your bet and raise you.

Here’s the counter-example, showcasing actual fact-checked research into the centuries-old evolution of “checks and balances” across multiple cultures and civilizations through time.

Even if you knew what “checks and balances” were going into this, you undoubtedly just learned something about its evolution and context and now possess a greater understanding of the subject before you started reading.

Specificity, FTW!


Problem 4: AI content isn’t optimized well enough for search, either  

Today, I have the privilege of working with smart, amazing brands. 

But ~15-odd years ago? It was the opposite. 

It used to drive me nuts when companies would think that SEO is this magical process where you come in at the very end of a new website or piece of content and sprinkle your SEO magic pixie dust on it, and all will be good. 

And yet, fast forward to today, AI content often falls foul of the same logic.

Good “SEO” content today is engineered to be properly “optimized” from the very beginning.

It takes into account everything, including:

  • The audience’s knowledge or pain points.
  • True search intent.
  • The overall structure and style of content.
  • The structure and headers.
  • Questions being answered.
  • Related topics.
  • Other relevant information on your site.

Exhibit C:

Robinhood article on checks and balances

Once again, this is difficult to do well because it requires several experts to work together to determine how the vision and structure and execution of a piece looks before a single word is ever written.

AI content, on the other hand? 

Sprinkle away!

Yes, you can prompt it. You can finesse it (kinda). You can try to add decent headers.

But then you’re often left with something that looks like this: 

Excel article content brief

Length is fine. Headers and overall structure of content (based on SERP layout) are also fine.

But on-page optimization kinda sucks:

  • Semantic keywords and related topics are slim and much lower than average competitors ranking for this query.
  • How about internal links to reinforce your clusters and create a dense site hierarchy, improving topical authority around these subjects for the long haul? Also nonexistent.
  • How about image alt attributes for accessibility? Oh wait, there’s no images. Nevermind.
  • Or, little-to-no questions being answered, pain points being addressed, problems solved or related People also ask questions that Google will show you. 

Like these:

Check and balance - Related topics

This is the problem with shortcuts. 

When you do things correctly, from the beginning, you can plan and be proactive and specifically structure things to provide yourself with the best possible chance to succeed.

But when you’re over-relying on the Ozempic of the content world (AI), you’re forced to take shortcuts because of the self-imposed limitations.

The output is worse for it.

Problem 5: AI content mansplains – good writing imparts understanding

Specificity is a hallmark of good writing because it lets the reader know they’re immediately understood and provides insight that actually informs how they think.

AI and poor writing, in general, mansplains. 

It offers up generic crap that readers already know.

And this simple difference is also why visuals make such a giant difference online.

You shouldn’t have images in an article because it’s a dumb requirement before publishing. 

Your checklist says “one image per 300 words.” Check, marked.

A generic stock image might as well not even be included.

No, the real reason images are critical is because they shape the actual narrative! 

All of these words I’m typing before and after each image add context to the examples being shown that bolster my claims.

That way, I gain credibility. (We’ll come back to this below.)

And because I can back up my claims, you know I’m not just spouting B.S. 

So once again, let’s look at this entirely text-only AI article (even when discussing a visual concept):

Text-only article on Excel shortcuts

Meh. The writing still sucks.

But more importantly, AI can’t weave a connection between images and text. ‘Cause that still requires nuance and context (which it entirely lacks).  

Let’s contrast and compare that with the takeaway below, which does three important things AI + LLMs can’t do:

  • Provides a unique or novel simile for what “checks and balances” are and how they work.
  • Gives readers a shorthand of sorts, a mental leap in logic, to help them immediately understand the dynamic nature and tension of this intangible concept.
  • Shows a visualization that backs up the simile so that the sum of this section is greater than its parts.
Robinhood article with relevant image - checks and balances

AI, by contrast, could only hope or dream of doing this – if it outright copied this exact article. 

This expertly brings us to the next point below.

Problem 6: AI content is basically plagiarism

I mean, this one should be obvious by now. 

Once again, LLMs – by definition – are essentially a form of “indirect” plagiarism. They just re-assemble the words that most often appear together.

Look up any of the current lawsuits to see why authors, for instance, might be upset that their copyrighted intellectual property is being used to train these models. 

Typically, you’ll find that even bad amateur writers aren’t often stupid enough to “directly” plagiarize something, just copying and pasting other sources and pretending it didn’t happen. 

But they’ll do what LLMs are doing, simply Googling the top few results and then rehashing or recycling what they see.

Let’s plug one of these articles into Grammarly then to see how it shakes out:

Excel formula article - Grammarly plagiarism checker

Not great. Not even good.

Yet again, the strengths of how LLMs work are also their greatest weakness, like some uber-nerdy form of jiu-jitsu.

This article in question kinda, sorta sounds like a bunch of other pre-existing academic journals – because the freaking model was trained on these same academic sources. 

“Good” SEO content should be:

  • Interesting.
  • Memorable.
  • Branded.
  • Useful.
  • Insightful.
  • Entertaining. 

Kinda hard to do that when you’re just recycling pre-existing content out there!

If a writer turned in an article to us with ~14%+ plagiarism, they’d be fired on the spot. 

How should Grammarly look when you check for plagiarism? Like this, clean as a whistle. 

Robinhood article - Grammarly plagiarism checker

Dig deeper: How to prevent AI from taking your content

Problem 7: Buyers buy from trusted brands, requiring credibility, something AI content lacks entirely

Like any good narrative, let’s finish where we started.

End at the beginning. (And yet another thing AI can’t do!)

Y’all know about E-E-A-T. We don’t need to retread old territory – no AI mansplaining necessary. 

Google has already warned/told you they value credibility.

But what if we back up a second?

  • Which sources will Alexa pluck from?
  • Who is OpenAI going to copy and paste, first?
  • Which Quora posts get the most upvotes?

That’s right. The best answers and the most thorough replies!

These are typically produced by some expert.

That’s ‘cause expertise builds credibility. And credibility or trust is ultimately why people decide to part with their hard-earned green with you vs. your competitors.

What are the hallmarks of credibility in content today that AI content completely lacks?

  • The actual writer themself is an expert writing from years of first-hand experience.
  • Expert quotes are sourced and used to bolster individual claims being made.
  • Third-party stats and links from reputable sources can either support arguments or provide counterexamples to uncover any potential bias and show the other side of an argument.
  • And the damn thing is fact-checked by at least a second (if not third) pair of eyes to actually prove the points are factual vs. falsehoods.

True credibility has nothing to do with putting a fake doctor’s byline on your AI article and calling it a day.

It’s like when your partner gets mad because you lied. Not because of what you said but because of what you didn’t. 

A lie by omission is still a lie, at least in adult land. 

The most successful, profitable companies today are run by adults working well together, pulling in the same direction over the years to build a memorable, differentiated, meaningful brand that will stand the test of time.

Not by grasping at straws, looking for shortcuts and silver bullets or phoning in with the bare minimum possible, then acting surprised when it doesn’t work, leading to entire teams being laid off or divisions shut down. 

Shortcuts might work over the short term. 

You might pick up a few rankings here or there for a few months. Maybe even a year or two.

But will it deliver sustainable growth five or 10 years from now? 

Just ask BuzzFeed or Sports Illustrated where a race to the bottom ultimately leads you.

Is SEO content an expense or an investment?

All of this begs the million-dollar question:

Is SEO content an “expense” or an “asset”?

Is “content” just an expense line on the P&L, to reduce it and minimize it as much as possible so it costs you the least?

Or, if done well, could it be an “asset” on the balance sheet, with a defined payback period, creating a defensible marketing moat that will produce a flywheel of future ROI that only grows exponentially over the long term?

Working with hundreds of brands over the past decade has shown me that there’s often a 50-50 split on this decision.

But it’s often also the one that is the best indicator of future SEO success.



from Search Engine Land https://ift.tt/sPjnpuO
via IFTTT