Google is having some technical difficulties this morning with its news engine. Google News, the news tab in Google Search, Google Discover, Google Trends and other Google services that rely on news publications are currently not returning results for many searchers.
The issue, I believe, started a couple of hours ago but is now getting worse, affecting more and more Google searchers.
What it looks like. Here are some screenshots of Google News taking down parts of Google Search, the Google News homepage, Google Discover feeds and more.
Why we care. This may result in less traffic to news publishers, while Google works to address these issues. Google News, the news tab in Google Search, Google Discover and Google Trends all send a significant amount of traffic to publishers.
If you notice a decline in traffic this morning – this may be why.
from Search Engine Land https://ift.tt/gF1s30C
via IFTTT
One size certainly does not fit all in PPC advertising. The strategies that drive success in ecommerce often differ from those required for effective lead generation.
Lead generation through PPC is more challenging than ecommerce.
There are more obstacles and numerous sales funnel strategies to navigate.
Additionally, major ad networks often favor ecommerce brands, with new features and educational resources frequently built with ecommerce in mind.
This article wasn’t written to complain but to explore the nuanced approaches needed for each, highlighting key strategies for lead generation such as fraud prevention, monitoring micro KPIs and navigating the complexities of revenue tracking.
The challenges of lead fraud
Lead generation campaigns face the persistent challenge of lead fraud, which can skew performance data and drain budgets. This can be true for display, video and cross-network campaigns.
Unlike ecommerce, where the end goal of a transaction requires a credit card payment, lead generation can be susceptible to fraudulent leads due to the naturally low barriers to entry.
Because placement targeting and bidding are mostly algorithmically based, shady content publishers have realized that generating fake conversions for an advertiser will cause the algorithms to serve more ads on their websites and YouTube channels and even bid higher.
For the advertiser, even if Google catches and credits the invalid click activity, fraudulent leads generate false signals, making it difficult to optimize and ascertain a campaign’s true performance.
Effective strategies to counteract fraudulent leads and keep your data clean include implementing advanced verification techniques. Using CAPTCHA should be a given.
To improve lead quality, consider adding more complex conversion criteria, manually excluding spammy placements and avoiding fraud hotbeds such as children’s apps and parked domain websites.
Run an analysis of placements where your ad showed on Google Ads and sort by conversion rate. Manually copy and paste the channels or websites into your browser. In lead gen, if the performance is too good to be true, it probably is.
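That manual placement review can be sketched as a short script. This is a minimal sketch assuming a placement export with clicks and conversions per row; the field names and the 50% conversion-rate threshold are illustrative assumptions, not Google Ads API fields:

```python
# Sketch: flag suspiciously high-converting placements for manual review.
# Row fields and thresholds are hypothetical, not actual Google Ads columns.

def flag_suspicious_placements(rows, min_clicks=30, max_conv_rate=0.5):
    """Return placements whose conversion rate is 'too good to be true'.

    rows: iterable of dicts with 'placement', 'clicks', 'conversions'.
    """
    flagged = []
    for row in rows:
        clicks = row["clicks"]
        if clicks < min_clicks:
            continue  # not enough data to judge this placement
        conv_rate = row["conversions"] / clicks
        if conv_rate > max_conv_rate:
            flagged.append((row["placement"], conv_rate))
    # Highest conversion rates first - paste these into a browser and inspect.
    return sorted(flagged, key=lambda p: p[1], reverse=True)

placements = [
    {"placement": "kids-game-app.example", "clicks": 120, "conversions": 95},
    {"placement": "industry-blog.example", "clicks": 200, "conversions": 12},
    {"placement": "parked-domain.example", "clicks": 40, "conversions": 31},
]
for name, rate in flag_suspicious_placements(placements):
    print(name, round(rate, 2))
```

The output is only a shortlist; the judgment call about whether a placement is genuinely fraudulent still happens in the browser.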
Additionally, despite Google’s claims otherwise, using video enhancements to extend your video’s formats has been found to lead to low-quality placements and leads. As a best practice, turn off this Google Ads setting for Video campaigns.
Ecommerce funnels are typically straightforward. A potential customer enters the funnel, browses products and adds to cart. The simplicity of this funnel allows for clear-cut strategies and optimizations.
In contrast, lead generation funnels are far more complex. They can include various lead capture tools, such as free downloads, assessments, webinars and form submissions, each industry requiring a different approach.
Whether in ecommerce or lead generation, the importance of email capture as a micro KPI cannot be overstated. Capturing an email address provides a crucial touchpoint for re-engagement and long-term relationship building, especially considering that most visitors will not convert on their first visit.
Micro KPIs, as defined here, are incremental goals leading to the ultimate goal (i.e., revenue in the bank). However, they are just as important because they lead you to the end result. Micro KPIs can include email captures, email-to-buy conversion rates, marketing qualified leads and add-to-carts.
In ecommerce, capturing an email can lead to purchases through follow-up marketing. Lead generation is the starting point for nurturing leads through the sales cycle, gradually building trust and moving leads closer to conversion.
Especially if your lead generation goal is in-depth, such as an application form or assessment, consider prioritizing an email capture or lead magnet to gain more potential for relationship building.
Whether ecommerce or lead generation, micro KPIs should be measured and campaign-focused. For example, a top-of-funnel campaign will contribute more effectively to the end goal of revenue generation if it is optimized toward a micro KPI versus purchases or sales-qualified leads.
For ecommerce, a top-of-funnel campaign optimized for add-to-carts will feed remarketing audiences and drive more total new customers. For lead gen, the conversion optimization goal for a top-of-funnel campaign should be email opt-ins.
If revenue is the objective at the end of the day, micro KPIs will get you there, but they will differ for ecommerce and lead gen.
One of the most significant advantages ecommerce holds over lead generation is the straightforwardness of revenue tracking. In ecommerce, the transaction of the purchase takes place online, making it easier to measure and make informed adjustments.
In contrast, lead generation involves multiple steps and interactions before a lead even potentially results in revenue. The sales cycle can be long and is often completed offline, which introduces a level of complexity in tracking and attributing revenue. Lead quality tracking is vital for adjusting strategies and ensuring that each lead generation effort is cost-effective.
The steps or integrations needed to incorporate your CRM and offline data into your PPC campaigns will pay dividends through improved insights. The performance analysis you gain can pinpoint where customers are coming from and, more importantly, where your ad buying is not resulting in leads.
Effective lead management systems and CRM integrations are essential for analyzing lead quality. They help bridge the gap between online marketing efforts and offline sales results, providing a clearer picture of campaign effectiveness and areas for improvement.
Adapting your approach to PPC success
PPC advertising for ecommerce and lead generation requires distinctly different strategies and mindsets.
While ecommerce benefits from straightforward transactional metrics, lead generation demands a deeper engagement with potential customers over a more extended period. Both require a keen understanding of the customer journey, a strategic approach to new customer development and meticulous attention to micro KPIs.
By mastering these elements, marketers can tailor their PPC campaigns to meet the unique demands of their targeted outcomes, whether immediate sales or cultivating potential leads for future revenue.
Site launches are arguably among the riskiest SEO endeavors. You can go from pure SEO bliss to catastrophe at the flip of a switch.
I’m sure any experienced SEO you ask could tell you a site launch horror story or two, whether it was their own doing due to a lack of knowledge or experience or something completely out of their hands.
My hope with today’s article is that you may learn something from the lessons I’ve learned launching websites over the past decade. At the very least, perhaps it’ll mean one less site launch horror story.
Below are some of the major lessons I’ve learned, in no particular order of importance.
1. Excluding low-traffic pages from redirects during a site launch is risky
I learned this lesson the hard way when working at an agency years ago.
For whatever reason, at the time, we only redirected the top 500 pages of a site based on traffic, links and overall page authority.
If it was a CMS page (like category pages), those also got redirected.
As for the rest, if we could find a way to mass redirect chunks of them beyond that, we did. If not, we cut it off there.
When managing an ecommerce site with thousands upon thousands of pages, low search traffic to a specific page or set of pages can still add up to hundreds or thousands of visits overall. This can lead to a significant decrease in total traffic after launch.
If you want to avoid any type of traffic loss, make sure you set up a time with your web team in advance to discuss a 301 redirect strategy so that you can redirect the majority, if not all, of your old pages to your new pages.
2. Issues with redirects can cause major technical problems and traffic loss after a site launch
Even the most perfect 301 redirect strategy can go awry.
Not only is it important that you have a good strategy and list, but it’s almost just as important to test that redirects are working properly immediately after launch.
You’ll want to check for things like:
Redirect chains.
Loops.
302 redirects.
To do this, I recommend using Screaming Frog’s list mode feature to audit your redirects by uploading your redirect list and checking the final destination of each URL. I also use a tool called WhereGoes to check for redirect loops and chains. To me, it’s never a bad thing to use more than one tool for testing.
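As a complement to those tools, the same chain and loop checks can be sketched in a few lines of code. This assumes you already have your redirect map as source → destination pairs (in practice you would build it from your redirect list or a crawl export):

```python
# Sketch: detect redirect chains and loops in a 301 redirect map.
# The redirect_map below is illustrative; build yours from your redirect
# list or a Screaming Frog export.

def trace_redirect(url, redirect_map, max_hops=10):
    """Follow a URL through the map; return (final_url, hops, is_loop)."""
    seen = [url]
    current = url
    while current in redirect_map:
        current = redirect_map[current]
        if current in seen:
            return current, len(seen), True  # loop detected
        seen.append(current)
        if len(seen) > max_hops:
            break  # give up on excessively long chains
    return current, len(seen) - 1, False

redirect_map = {
    "/old-page": "/interim-page",
    "/interim-page": "/new-page",  # chain: /old-page takes 2 hops
    "/page-a": "/page-b",
    "/page-b": "/page-a",          # loop
}

for src in ["/old-page", "/page-a"]:
    print(src, "->", trace_redirect(src, redirect_map))
```

Anything with more than one hop is a chain worth flattening to a direct 301, and anything flagged as a loop needs an immediate fix.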
3. Changing the user experience can tank your new site’s traffic for months
I once worked on a furniture site. All kinds of changes were being made – from the overall site design to the navigation, filters and the checkout experience.
Making these changes is not bad. However, without the proper testing, things can go wrong quickly.
After launch, we experienced traffic loss and conversion rate drops. Customers also had issues at the checkout. There were a lot of problems, to say the least.
As time went on, things got slightly better, but I’m not sure they ever fully recovered.
You can expect issues with any launch, but when the conversion rate tanks, that is a major sign of user experience issues. Especially when you roll out changes like navigation updates, different filters and a new design.
This experience highlighted the importance of having a skilled user experience team for a successful site launch. It can make or break your site!
4. Not involving SEO from the beginning is a recipe for disaster
I was initially brought in only to collaborate with an outside agency on the redirect strategy (and some other SEO details) for the furniture company site relaunch I mentioned earlier.
I wasn’t included in site launch meetings. I had no access to wireframes, designs or related materials.
In hindsight, I should have requested more information. I assumed those aspects were being handled.
What I’ve learned since: it’s in your best interest to proactively involve yourself in any area that might impact SEO.
People often overlook your work as an SEO. You’re better off assuming they don’t know what you know than assuming it’s being thought about or taken care of, because it likely isn’t.
If it is being thought about and being taken care of properly, that’s amazing. You’re one of the lucky few. At least you know you did your due diligence.
Here are things you should be involved in for a site launch:
Strategy meetings
Wireframe / information architecture reviews
Design / UX discussions
Keyword research for new pages
Staging site access and reviews
301 redirect strategy
Post-launch review
You can further improve this process through better cross-team collaboration and by communicating the importance of what you do and why you do it.
This can be done through better processes, documentation, training and simply educating other teams. Oftentimes, in SEO, you have to be your own advocate.
Site launches are inherently risky endeavors for SEO, but proper planning and cross-functional collaboration can help you avoid potential pitfalls.
While not an exhaustive guide, learning from the hard-earned lessons outlined here can spare you from common site launch blunders that set SEO efforts back months or even years.
from Search Engine Land https://ift.tt/9xfBO0c
via IFTTT
Google responded to the bad press surrounding its recently rolled out AI Overviews in a new blog post by its new head of Search, Liz Reid. Google explained how AI Overviews work, where the weird results came from, and the improvements it has made and will continue to make.
However, Google said searchers “have higher satisfaction with their search results, and they’re asking longer, more complex questions that they know Google can now help with,” and basically, these AI Overviews are not going anywhere.
As good as featured snippets. Google said AI Overviews are “highly effective” and, based on its internal testing, the “accuracy rate for AI Overviews is on par with featured snippets.” Featured snippets also use AI, Google noted numerous times.
No hallucinations. AI Overviews generally don’t hallucinate, Google’s Liz Reid wrote. AI Overviews don’t “make things up in the ways that other LLM products might,” she added. They typically only go wrong because of Google “misinterpreting queries, misinterpreting a nuance of language on the web, or not having a lot of great information available,” she wrote.
Why the “odd results.” Google explained that it tested AI Overviews “extensively” before launch and was comfortable releasing the feature. But Google said people then deliberately tried to get AI Overviews to return odd results. “We’ve also seen nonsensical new searches, seemingly aimed at producing erroneous results,” Google wrote.
Also, Google wrote that people faked many of the examples by manipulating screenshots to show fake AI responses. “Those AI Overviews never appeared,” Google said.
Some odd examples did come up, and Google will make improvements in those types of cases. Google will not manually adjust AI Overviews but rather improve the models so they work across many more queries. “We don’t simply ‘fix’ queries one by one, but we work on updates that can help broad sets of queries, including new ones that we haven’t seen yet,” Google wrote.
Google spoke about “data voids,” which we have covered numerous times here. The example, “How many rocks should I eat?” was a query almost no one had searched for before, and there was no real good content on it. Google explained, “However, in this case, there is satirical content on this topic … that also happened to be republished on a geological software provider’s website. So when someone put that question into Search, an AI Overview appeared that faithfully linked to one of the only websites that tackled the question.”
Improvements to AI Overviews. Google shared some of the improvements it has made to AI Overviews, explaining it will continue to make improvements going forward. Here is what Google said it has done so far:
Built better detection mechanisms for nonsensical queries that shouldn’t show an AI Overview, and limited the inclusion of satire and humor content.
Updated its systems to limit the use of user-generated content in responses that could offer misleading advice.
Added triggering restrictions for queries where AI Overviews were not proving to be as helpful.
For topics like news and health, Google said it already has strong guardrails in place. For example, Google said it aims not to show AI Overviews for hard news topics, where freshness and factuality are important. In the case of health, Google said it launched “additional triggering refinements to enhance our quality protections.”
Finally, “We’ll keep improving when and how we show AI Overviews and strengthening our protections, including for edge cases, and we’re very grateful for the ongoing feedback,” Reid concluded.
Why we care. It sounds like AI Overviews are not going anywhere and Google will continue to show them to searchers and roll them out to more countries and users in the future. You can expect them to get better over time, as Google continues to hear feedback and improve its systems.
Until then, I am sure we will find more examples of inaccurate and sometimes humorous AI Overviews, similar to what we saw when featured snippets initially launched.
from Search Engine Land https://ift.tt/Vqi4hec
via IFTTT
The Google Search Console link report may be broken, or maybe Google decided to report fewer links in that tool. Many are seeing huge declines in the number of links the tool is showing them compared to last week.
More details. Starting Monday, May 27th, many SEOs began to notice a decline in the number of links being reported. As of this morning, many are noticing massive declines, with some even seeing zero links reported in the tool.
I am seeing a 40% reduction in the links reported today when compared to just a few days ago.
Here is my link report from today:
Here is a screenshot from the same report from May 27th:
Bug or feature. Google has not yet commented on this issue, so either this is some sort of reporting glitch – like we saw yesterday with the product snippets report – or it is a feature, where Google is just showing fewer links.
Either way, you are not alone; everyone is seeing a huge decline in the number of links reported in Google Search Console.
Why we care. Truth is, I don’t think you need to care about this. Everyone is seeing the decline in link counts. This is just a report and it does not directly influence your rankings in Google Search.
So continue working on your website and making content that matters, and maybe focus less on these buggy reports?
from Search Engine Land https://ift.tt/LPzExec
via IFTTT
If you’ve been following SEO trends over the past two years, you’ve probably heard the term “AI” more than enough times already. While I’ll admit it is getting a tad bit repetitive, it’s not without good reason.
AI technology will set the trajectory of nearly every industry in the modern economy. Some industries are further along the adoption curve than others, but at some point, the world we see today will be seamlessly woven together with AI.
With that imminent future in mind, it’s important to discuss how we use generative AI effectively.
My agency has dealt with clients on both ends of the AI spectrum. Some wish to avoid it in their content production at all costs, while others think it can do everything for them.
Unfortunately, neither is true.
This begs the question, “How should we use AI for our SEO content production?”
Here are six principles to ensure you’re getting the most out of the technology while still producing effective, Google-friendly content.
1. Know your desired outcome
Whether your content is AI or human-drafted, the creation process should always begin with an understanding of why it’s being created.
I’ve encountered a wealth of useless content produced without a real end goal and no connection to a brand’s larger marketing objectives.
With the rapid adoption of AI, the number of brands mass-producing useless content will grow. Some are adopting the mantra of quantity over quality, using AI to simply produce larger volumes of content.
But more content isn’t always the answer if there’s no goal tied to that content.
Different forms of content have different goals, and each and every piece, whether streamlined with AI or not, should serve a purpose and have a supporting strategy.
Here is a breakdown of content types and the strategy behind them:
Core site content: The goal of this content type is to educate prospective customers about products or services.
Blogs and informational content: These pieces aim to reach new audiences related to your products or services. A blog topic that’s relevant to your audience, products and services and that has proper promotion behind it can expose your business to new market sectors.
Shareable content: Use these deliverables as a way to grow authority and reach through backlinks and social engagement. To be effective, this type of content usually approaches a trending topic in your industry from an interesting perspective or shares proprietary data in a unique or fun way.
Thought leadership content: Leverage this type of content to position a person or brand as a leader or pioneer in their space. You can strengthen the strategy behind this content by including subject matter expert (SME) quotes or interviews.
2. Know your audience
Your content creation process should always lean heavily on audience research and knowledge. Knowing why you’re creating content is critical, and knowing who you’re creating it for is just as critical, especially if you’re going to be leveraging AI to help.
Some methods for gaining insight into your audience are:
On-site surveys
CRM data
Data on “most valuable” customers and/or customers with the longest relationship with the brand
Data on brand new customers (became customers within the last week)
Data on customers who entered the funnel but did not convert
Analytics data (audience, demographics)
Social data
Client intel
Survey data of ideal target audiences: Look for commonalities beyond demographics – psychographics
Once you have data points from several areas, you can create a customer persona. This persona is a detailed description of a person who embodies your ideal customer.
Creating this customer profile and keeping it in mind as you write your content serves as a constant reminder of who you’re speaking to, keeping tone and voice accurate and effective.
When leveraging AI, this persona information should be tied directly into your prompts to help ensure the output is less generic and more closely tied to your customer persona.
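The persona-to-prompt tie-in can be sketched as a simple template. This is a minimal sketch; the persona fields, the example persona and the prompt wording are all illustrative assumptions:

```python
# Sketch: build a persona-grounded prompt so AI output is less generic.
# The persona and wording are hypothetical examples.

PERSONA = {
    "name": "Operations Olivia",
    "role": "operations manager at a mid-size logistics firm",
    "pain_points": "manual scheduling and poor shipment visibility",
    "tone": "practical and plain-spoken, with no hype",
}

def build_prompt(persona, topic, section_heading):
    """Fold persona attributes directly into the drafting prompt."""
    return (
        f"You are writing for {persona['name']}, a {persona['role']}. "
        f"Their main pain points are {persona['pain_points']}. "
        f"Write the section '{section_heading}' of a blog post about "
        f"{topic}. Keep the tone {persona['tone']}."
    )

print(build_prompt(PERSONA, "warehouse automation", "Where to start"))
```

Because the persona lives in one place, every prompt across the production process stays anchored to the same reader.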
To integrate your persona attributes with your content even further, you can have AI generate your personas entirely. Depending on your data, this may make more sense.
Most generative AI platforms are designed to comb through vast amounts of data. While crawling these large datasets, the algorithms look for patterns.
If you can work with a more sophisticated platform and feed it as much customer data as you have available, you may be left with a better persona than you ever dreamed of to start.
3. Get a paid account
Privacy and security should be a major consideration for anyone looking to incorporate AI into their daily lives. This is especially true for content creators who want to leverage AI.
As you can see above, knowing your audience and leveraging your proprietary data insights for content production are critical to your content’s performance.
But with the wrong plan type, what you’re inputting into your AI tool of choice could be leveraged as training data and made public. You can get around this by signing up for a plan such as ChatGPT Team, which costs $25/month.
At our agency, before using any AI platform, we thoroughly review the terms and conditions to ensure our use complies with them and that our data – and, by extension, our clients’ data – is not used to train the platform.
4. Don’t expect to find one silver-bullet prompt
Contrary to popular belief, there usually isn’t one “holy grail” of a prompt that will get AI to give you the desired results.
For the best results, your team should continuously prompt throughout each stage of your content production process.
The help AI provides may look different at each stage. For example, you can use AI prompts for content to inform your research during the content ideation and outlining process. These platforms can help with research, such as giving lists of keywords and suggesting headings and subheadings that might perform well in the SERPs.
AI can also be helpful at the drafting stage. Starting with your customer persona as an input, query a generative AI platform with an appropriate question for each section of your blog.
The output generated will serve as a great starting point for your writing. However, it’s important to make sure you fact-check and edit that content to give it the ever-necessary “human touch.”
Certain AI platforms, such as Claude and Gemini (from Google), will allow you to upload your outline as part of your prompt. Simply attach your outline file and frame your prompt in a way that instructs your platform to write based on the uploaded file. What you’ll get won’t be perfect, but it will be a solid foundation for you to work from.
What if you’re not generating new content?
If you need to edit and refresh existing content, you can and should still use AI. The right AI platform can trim fluff, bring more clarity to your content and identify what can be added to strengthen your content.
These refreshing tactics could help give your current content the facelift it needs to perform well in this new era of Google AI Overviews.
5. Enhance E-E-A-T
In a nutshell, Google wants to reward content from people who have and demonstrate first-hand subject matter expertise and experience.
Yes, we’re talking about E-E-A-T, which stands for experience, expertise, authoritativeness and trustworthiness.
Many companies are already using generative AI platforms to mass-produce blog content. So E-E-A-T could become even more important as we ride the wave of AI technology.
To set your content apart and provide a superior user experience to a competitor’s robot ramblings, work with:
Experienced authors
Hire an expert in the field to help with your content production or leverage an expert to review your content. If you work for an agency, ask your clients to provide a dedicated subject matter expert to formally review your content when needed.
If you’re in-house, ask who has the depth and experience needed to weigh in on the subjects you’re covering. You can also develop standard interview questions to ask your client about each topic, ensuring you have a bank of expert insight to improve content performance.
LinkedIn is also a great platform to identify and hire experts within certain fields to help write or professionally review your content.
SMEs
Strengthen your content with SME insights, quotes, resources or statistics.
If you can’t get in direct contact with an in-house SME or one on a client’s team, there are other ways to gain insights and quotes.
You can leverage sites like Connectively (formerly HARO) or respondent.io to source answers directly from industry experts. You may have to verify their credentials, but it’s an easy, low-cost way to add depth to your content.
Your client’s YouTube channel (if they have one) can also be a veritable gold mine of quotes.
Data
Leverage custom or proprietary data. Access to unique data in your space positions you with experience that Google will value.
Either internally or using a third-party data collection company, conduct a survey. Brainstorm a unique perspective on topics within your industry and develop a questionnaire.
6. Human touch is essential
AI can get you 60-70% of the way on specific tasks, but a human touch remains essential throughout the process. Although the technology is advancing, glitches like AI hallucinations mean human oversight is still required to produce stellar content.
Your aim should be to use AI to set your piece up for success, enhance the content and save yourself some time. But it’s critical to ensure you have the proper staff to check for plagiarism, optimize readability and edit for final spelling and grammar.
The important thing to remember is that AI isn’t perfect. It takes a human hand to oversee content, ensure it hits all the major benchmarks of good internet writing, like E-E-A-T, and polish the piece so it sounds conversational, stays on brand and provides the user experience your readers and the SERPs are after.
Take it upon yourself to be the AI pro on your team.
AI isn’t going anywhere. It’s growing more intelligent with every passing day.
Stay ahead of AI news and trends, become familiar with new platforms and continue to responsibly explore new and exciting ways to implement AI to transform your workflow.
from Search Engine Land https://ift.tt/daPVMhE
via IFTTT
Google officially rolled out the requirement for Consent Mode v2 for Google properties in the European Economic Area (EEA) in March to ensure its properties complied with the Digital Markets Act (DMA). Consent Mode v2 introduced two new parameters:
ad_user_data, which sets consent for sending user data related to advertising to Google.
ad_personalization, which sets consent for personalized advertising.
This launch caused a frenzy among PPC marketers to ensure compliance before the deadline.
However, a significant number of advertisers have not adopted consent mode, and their ad accounts are at risk of penalization.
Below are four ways to check your current Google consent mode configuration.
1. Check your consent mode configuration in Google Ads
The first and simplest place to start your checks is within Google Ads itself. Within your ad account, navigate to Tools and Settings > Measurement > Conversions > Diagnostics.
If consent mode is active, under Diagnostics, you will see the following widget:
What this tells you is that Google is:
Reading and recording consent statuses for users of your website.
Adjusting its tracking tags behavior based on the consent statuses it reads.
However, it doesn’t tell you whether the correct statuses are being passed.
Thus, further checks and tests are required.
Configuring consent mode via Google Ads
If the first step did not show you a consent mode active widget, then there is work to be done to get consent mode set up.
Google has some direct integrations that you can access within your Google Ads account:
Navigate to Tools and Settings > Setup > Data Manager > Google tag > Manage > Admin.
Under Google Tag Management, click Set up consent mode.
Then choose your banner type and follow the steps.
Once you select either your web platform or CMP (consent management platform), it will provide step-by-step instructions on how to set up consent mode.
If you don’t yet have a consent banner, it will also walk you through how to get started.
2. Checking consent status in Google Analytics 4
Like Google Ads, Google also released a feature within GA4 to check consent status.
Again, this lets you check if Google can read and record the consent choices made by users on your website.
To use this feature, follow these steps:
Navigate to your GA4 account.
Select Admin.
Under Data collection and modification, select Data streams.
Select your website data stream.
Click on the Consent settings drop-down.
The consent settings status in GA4 has three parts. It tells you whether measurement and personalization consent signals are active and allows you to verify how Google shares data between its services.
3. Checking consent configuration of tags in Google Tag Manager
Google released a feature in Google Tag Manager (GTM) to aid marketers in ensuring that the correct consent settings are applied to tags deployed through GTM.
The first thing you need to do is enable the consent overview setting in your GTM account:
Select your container, then click Admin > Container Settings.
Under Additional Settings, check the box to enable consent overview.
Once the setting is enabled, you can navigate back to your workspace and use it to check what consent settings have been applied to each tag.
To find this, switch from Overview to Tags; next to the blue New button, you will see a shield icon.
Once clicked, it will show you each tag in the account and whether consent settings have been configured. It also identifies any built-in consent settings within a tag. Google tags, for example, all have built-in consent within GTM.
4. Check consent status changes with Google Tag Assistant
You can use Google Tag Assistant to check what statuses are being passed and updated as users move through the website and interact with your cookie consent banner.
There are two places where you can run these checks: directly in Google Tag Assistant or via preview mode in Google Tag Manager.
Preview mode in GTM offers additional benefits: it lets you see which tags fire at each triggered event, helping you understand whether the tags are reading the correct consent choices and whether they are firing without any consent choices.
To check the consent updates on the page through both methods, input your URL so it loads in debug mode.
On the left-side navigation, you will see all the events that fire in Tag Assistant. You can click through each event and toggle your output between Tags and Consent to see what consent choices have been made and which tags fired.
The consent statuses will look similar to the screenshot below:
You will see the default status. This is usually denied for everything other than essential cookies required for website functionality.
Depending on which event you select, you will see the on-page update and current state.
When you toggle back to Tags, check that your advertising platform tags did not fire and drop cookies if ad cookies were denied.
One point to note is that Google tags will always appear as firing in preview mode as their consent settings are built in, and the tags adjust their behavior automatically.
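Under the hood, these default and updated statuses are set through the gtag consent API. Here is a minimal sketch, assuming a hypothetical banner callback named onBannerAccept; on a real page, gtag.js provides dataLayer and gtag for you:

```javascript
// Minimal stand-ins for what gtag.js provides on a real page.
const dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// 1. Default state, set before any tags load: deny all non-essential storage.
gtag('consent', 'default', {
  ad_storage: 'denied',
  ad_user_data: 'denied',
  ad_personalization: 'denied',
  analytics_storage: 'denied'
});

// 2. Hypothetical banner callback: when the user accepts, update the statuses.
function onBannerAccept() {
  gtag('consent', 'update', {
    ad_storage: 'granted',
    ad_user_data: 'granted',
    ad_personalization: 'granted',
    analytics_storage: 'granted'
  });
}

onBannerAccept();
```

Tags that honor consent mode read these commands from the data layer and adjust their behavior accordingly, which is why the default command must run before any tag fires.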
Verify your Google consent mode configuration now
Consent mode will be a crucial element of any comprehensive tracking setup moving forward, helping to keep your business in line with regulations.
To avoid ad account penalties or data issues, ensure that consent mode is enabled and configured correctly.
From fireside chats to expert-led sessions featuring digital leaders who are blazing a trail in the world of omnichannel marketing, this year’s Emarsys Omnichannel and AI Masterclass is packed full of insights that will leave you buzzing with new ideas.
In this article, we’re going to take a look at three key sessions from the likes of Home Depot, Colgate-Palmolive and Kellanova that you won’t want to miss. Let’s dive in.
More than commerce: How CPG brands are building DTC engagement
Speakers:
Don Brett, Podcast Host, CPG View
Jamie Schwab, VP Global Digital Commerce, Colgate-Palmolive
Jamie Decker, VP e-Commerce, Del Monte
Diana Macia, Director, Global Omnichannel Capabilities, Kellanova (Kellogg's)
What’s in store:
Driven by the modern consumers’ demand for convenience and internal pressures to build brand affinity, the last few years have seen a stampede of consumer products brands opening up direct-to-consumer offerings.
However, with multiple channels in play, the fast-moving world of e-commerce comes with its own set of challenges. Join Don Brett, CPG View Podcast host and an esteemed panel of CPG guests as they discuss:
The importance of a first-party data strategy in 2024.
What value exchanges are being created across the customer journey.
How brands are balancing personalization with consumer trust.
AI’s role across this whole ecosystem and its new applications.
How Home Depot engineers online experiences that ‘Get More Done’ during peak seasons
Speakers:
Mauricio Gonzalez, DFC Logistic Solutions Design Coordinator, Home Depot
Uncompromising quality and supportive staff make Home Depot a one-stop shop for all things DIY. Now, it's taking this to the next level with a brand transformation that delivers an unrivaled customer experience, giving customers what they need exactly when they need it.
In this session, Home Depot’s Online Experience Department Head shares:
Home Depot’s journey to omnichannel marketing mastery, detailing their transformation from in-store to online.
Key insights into how they prepare for Black Friday, their biggest sales event of the year.
How Replacements, Ltd. plates up traditional and digital marketing to serve a broad demographic
Speakers:
Kara Lewis, Lead Client Strategy Manager, Attentive
What’s in store:
With a 40-year legacy and a mastercraft service that replaces precious pieces of china, it shouldn’t come as a surprise that Replacements Ltd. attracts a more “senior” demographic. However, as they’ve worked to attract younger customers, they’ve seen an interesting change – more of their older customers are engaging on newer marketing channels.
The result? A fascinating intersection between traditional and transformative marketing that requires a unique strategy. Join Replacements Ltd.’s E-commerce Marketing Manager and Attentive’s Lead Client Strategy Manager, as they discuss how Replacements Ltd. is using cutting-edge marketing tech and customer data to bridge the gap between old and new.
These are just some of the brands and sessions that are now confirmed for the AI & Omnichannel Masterclass hosted by Emarsys.
Sign up now to get access to all sessions, live and online from 12 to 13 June.
from Search Engine Land https://ift.tt/RfihZkP
via IFTTT
The search community is still unpacking and processing the huge reveal of the Google Search ranking documents that were made public yesterday morning. Everyone has been asking why Google has not commented on the leak. Well, Google finally has – we spoke to a Google spokesperson about the data leak.
Google told us. Google said that many of the assumptions being published based on the data leak are taken out of context or incomplete, adding that search ranking signals are constantly changing. This is not to say Google's core ranking principles change – they do not – but the specific, individual signals that feed into Google's rankings do change, Google told us.
A Google spokesperson sent us the following statement:
“We would caution against making inaccurate assumptions about Search based on out-of-context, outdated, or incomplete information. We’ve shared extensive information about how Search works and the types of factors that our systems weigh, while also working to protect the integrity of our results from manipulation.”
Google, however, won't comment on the specific elements – which are accurate, which are invalid, which are currently being used, how they are being used and how strongly they are weighted. Google won't comment on specifics because it never comments on specifics when it comes to its ranking algorithm, a Google spokesperson told me. Google said that if it did, spammers and/or bad actors could use that information to manipulate its rankings.
Google also told us that it would be incorrect to assume that this data leak is comprehensive, fully-relevant or even provides up-to-date information on its Search rankings.
Did Google lie to us. That is hard to say for sure. The leaked documents specifically mention some ranking signals that Google historically told us it does not use. Of course, Google's statement says what is in the documents may never have been used, may have been tested for a period of time, may have changed over the years or may still be in use. Again, Google won't get into specifics.
Of course, a lot of folks in the SEO community have always felt Google has lied to us and that you should do your own testing to see what works in SEO and what does not work in SEO.
I, for one, trust people when they look me in the eye and tell me something. I do not believe the Google representatives I've spoken to over the years lied outright to me. Maybe it was about the semantics of language, maybe Google wasn't using a specific signal at that time or maybe I am super naive (which is very possible) and Google has lied.
Google communication. Google told me they are still committed to providing accurate information, but as I noted above, they will not do so in specific detail on a ranking signal-by-signal basis. Google also said that its ranking systems do change over time and it will continue to communicate information that it can to the community.
Does it matter. Either way, ultimately, these signals all point to the same thing. I believe Mike King, who was the first to dig into this document and help reveal the details, said that ultimately we need to build content and a website that people want to visit, spend time on, click over to and link to. The best way to do that is to build a website and content that people like and enjoy. So the job of an SEO is to continue to build great sites with great content. Yes, it is a boring answer – sorry.
What happened. As we covered, thousands of documents, which appear to come from Google's internal Content API Warehouse, were released March 13 on GitHub by an automated bot called yoshi-code-bot. These documents were shared with Rand Fishkin, SparkToro co-founder, earlier this month.
Why we care. As we reported earlier, we have been given a glimpse into how Google’s ranking algorithm may work, which is invaluable for SEOs who can understand what it all means. As a reminder, in 2023, we got an unprecedented look at Yandex Search ranking factors via a leak, which was one of the biggest stories of that year. This Google leak is likely going to be the story of the year – maybe of the century.
But what do we do with this information? Probably exactly what we have been doing without this information – build awesome sites with awesome content.
The articles. Here are the two main articles that broke this story on this Google data leak:
PayPal is building an advertising business that will leverage the troves of data it collects on consumer purchases and spending habits.
What’s happening? The digital payments giant plans to create an ad network that allows merchants and brands to target PayPal’s roughly 400 million users with personalized promotions and ads based on their transaction histories.
Why we care. Advertisers should be interested because PayPal has a vast amount of purchasing data from 400 million users. This could mean sophisticated targeting and advertising across multiple channels (as PayPal plans to serve ads beyond its own platforms) from one platform.
Key hires.
Mark Grether, formerly head of Uber’s ad business, hired as SVP/GM of PayPal’s new PayPal Ads division to develop ad formats and build out the sales team.
John Anderson, previously head of product/payments at Plaid, hired as SVP/GM of PayPal’s consumer group.
Details. PayPal already offers an “Advanced Offers” ad product that uses AI to serve PayPal users with targeted discounts from merchants whenever they make a purchase.
The company plans to expand this to sell ads to brands outside of its merchant network that could be displayed across the web and connected TV.
What they’re saying. PayPal says users can opt out of having their data included in the ad targeting.
“If you’re someone who’s buying products on the web, we know who is buying the products where, and we can leverage the data,” Grether told the WSJ.
Between the lines. The move follows other finance giants like JPMorgan Chase entering the retail media ad space by monetizing their customer data.
PayPal’s ad business is still nascent and may struggle to move the needle for the fintech company whose core payments processing business has higher profit margins.
The big picture. PayPal's ad ambitions come as the company aims to rebound from recent struggles, including major layoffs and a stock slide after it forecast muted profit growth this year.
from Search Engine Land https://ift.tt/A2zGl9b
via IFTTT
Understanding all the features of Google Analytics 4 (GA4) is essential to making the most of it. Doing so allows you to configure the tool to analyze data accurately and efficiently. It also lets you draw the best conclusions for designing, refocusing and defining your digital strategies.
GA4 users can configure many functionalities, including custom dimensions, which allow for more detailed and personalized data analysis.
What are dimensions in GA4?
Google defines a dimension as an attribute of your data. It describes your data, and it’s usually written in text rather than numbers.
An example of a dimension is source/medium, which shows how a user arrives at a website.
Another example is Event name, which shows the different events that happen on a website and how the user interacts with it.
When you create a GA4 account, the tool automatically preconfigures a wide list of dimensions by default.
However, if these are not enough for your analysis and you need to examine attributes in more detail, specific to your website's objectives, you can create custom dimensions.
Events and event parameters
To understand custom dimensions and how to create them, you must first understand some GA4 concepts – events and parameters.
Events allow you to measure specific user interactions on a website, such as loading a page, clicking on a link or submitting a form.
Event parameters are additional data that provide information about those events (i.e., additional detail about how users interact with a website).
There are two types of event parameters in GA4, depending on how they are captured by the tool:
Automatically collected parameters: These are preconfigured by GA4, which automatically captures a set of parameters (e.g., page_location, page_referrer or page_title). Google provides a list of all the event parameters collected automatically or via enhanced measurement.
Custom parameters: These allow you to collect information that is not captured by default. This applies to recommended events and custom events, where custom settings are required.
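To make the distinction concrete, here is a sketch of a recommended event sent via gtag.js that carries both standard and custom parameters. The lead_source parameter is a hypothetical custom parameter; GA4 will only surface it in reports once it is registered as a custom dimension:

```javascript
// Minimal stand-ins for what gtag.js provides on a real page.
const dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// generate_lead is one of Google's recommended events; currency and value
// are standard parameters, while lead_source is a hypothetical custom
// parameter added to capture information not collected by default.
gtag('event', 'generate_lead', {
  currency: 'USD',
  value: 150,
  lead_source: 'newsletter_form' // hypothetical custom parameter
});
```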
What are GA4 custom dimensions?
Custom dimensions are attributes that allow you to describe and collect data in a customized way. Essentially, they are parameters you create in GA4 to capture information that is not automatically tracked by the tool.
Types of custom dimensions
Depending on the information that you want to collect in a custom way, you can create different types of custom dimensions, as indicated by Google:
User-scoped custom dimensions let you analyze user-related attributes that describe groups of your user base, such as age, language or country.
Event-scoped custom dimensions refer to interactions that happen on your website. It could be any event parameter created for the recommended events or custom events, such as generate_lead or login.
When is it recommended to create custom dimensions?
Before creating custom dimensions to analyze data in more detail, it is advisable to check whether these attributes already exist as automatic events predefined by GA4 or as options within enhanced measurement events.
To determine if the data you want to analyze is already provided by automatic events, you can consult the list that Google Analytics offers under Automatically collected events. These events are collected automatically, so the user does not need to perform any additional actions.
This is not the case for enhanced measurement events; these must be activated within the GA4 account if you want to collect this information.
To activate these attributes, go to Admin > Data streams > Events > Enhanced measurement.
If the information you want to analyze is not included within these automatic events, it is recommended that you create it as a custom dimension.
Regardless of the type of custom dimension, it must be created via Google Tag Manager. Below is a step-by-step guide for configuring an event-scoped custom dimension.
Before getting started, make sure the GA4 and Google Tag Manager accounts are properly configured and linked.
Next, define which event parameter you want to capture and send it to GA4 as a custom dimension.
In this case, the following image shows how to create an event parameter to analyze the URL of the video a user plays on your website:
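In code terms, such an event parameter starts life as a data layer push from the site's video player, which a GTM GA4 event tag then forwards to GA4. The event and parameter names below are illustrative, not a fixed convention:

```javascript
// Illustrative sketch: the site's video player pushes an event to the
// data layer; a GTM GA4 event tag would then map video_url onto an
// event parameter sent to GA4.
const dataLayer = [];

function onVideoPlay(url) {
  dataLayer.push({
    event: 'video_play', // illustrative custom event name
    video_url: url       // illustrative custom event parameter
  });
}

onVideoPlay('https://example.com/videos/intro.mp4');
```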
As it is a manually configured event parameter only included in Google Tag Manager, it will not be enough for GA4 to include it in its reports automatically. You will have to notify GA4 about it by going to Admin > Data display > Custom definitions.
Then, click on Create custom dimension.
Create the custom definition with the information from your event parameter:
Now that your custom dimension is created, use DebugView to check if it is being collected correctly and is properly configured.
In parallel, within the Custom definitions section, under Quota information, you can monitor the total number of custom dimensions created in your GA4 account.
How many custom dimensions can I set up in my GA4 account?
The number of custom dimensions that a user can create is limited, although it is often difficult to reach this limit.
To ensure you create only the most useful dimensions, first define the highest-priority KPIs for your website or app and then create and configure only those dimensions that are truly of interest. To avoid exceeding this limit, use predefined dimensions and metrics whenever possible.
How to analyze custom dimensions
Custom dimensions will provide you with additional information about your data. This information can be analyzed within GA4.
In GA4, you can analyze custom dimensions through the tool's standard reports in a customized way – for example, in the traffic or events panels:
Custom dimensions can also provide more information about your data when using the explore section:
Creating custom dimensions is a powerful way to enrich your analytics with valuable insights for your business.
from Search Engine Land https://ift.tt/eOuQ5oU
via IFTTT