Search Engine News

A Search Engine Marketing News Site

Yes, Competitors Can Edit Your Listing on Google My Business

Posted by JoyHawkins

I decided to write this article in response to a recent article that was published over at CBSDFW. The article was one of many stories about how spammers update legitimate information on Google as a way to send more leads somewhere else. This might shock some readers, but it was old news to me since spam of this nature on Google Maps has been a problem for almost a decade.

What sparked my interest in this article was Google’s response. Google stated:

Merchants who manage their business listing info through Google My Business (which is free to use), are notified via email when edits are suggested. Spammers and others with negative intent are a problem for consumers, businesses, and technology companies that provide local business information. We use automated systems to detect for spam and fraud, but we tend not to share details behind our processes so as not to tip off spammers or others with bad intent.

Someone might read that and feel safe, believing that they have nothing to worry about. However, some of us who have been in this space for a long time know that there are several incorrect and misleading statements in that paragraph. I’m going to point them out below.


“Merchants are notified by email”

  1. Google just started notifying users by email last month. Their statement makes it sound like this has been going on for ages. Before September 2017, there were no emails going to people about edits made to their listings.
  2. Not everyone gets an email about edits that have been made. To test this, I had several people submit an update to a listing I own to change the phone number. When the edit went live, the Google account that was the primary owner on the listing got an email; the Google account that was a manager on the listing did not.

Similarly, I am a manager on over 50 listings and 7 of them currently show as having updates in the Google My Business dashboard. I haven’t received a single email since they launched this feature a month ago.

“Notified […] when edits are suggested”

Merchants are not notified when edits are “suggested.” Any time I’ve ever heard of an email notification in the last month, it went out after the edit was already live.

Here’s a recent case on the Google My Business forum. This business owner got an email when his name was updated because the edit was already live. He currently has a pending edit on his listing to change the hours of operation. Clearly this guy is on top of things, so why hasn’t he denied it? Because he wouldn’t even know about it since it’s pending.

The edit isn’t live yet, so he’s not receiving a notification — either by email or inside the Google My Business dashboard.

Edits show up in the Google My Business dashboard as “Updates from Google.” Many people think that if they don’t “accept” these edits in the Google My Business dashboard, the edits won’t go live. The reality is that by “accepting” them, you’re just confirming something that’s already live on Google. If you “don’t accept,” you actually need to edit the listing to revert it back (there is no “deny” button).

Here’s another current example of a listing I manage inside Google My Business. The dashboard doesn’t show any updates to the website field, yet there’s a pending edit that I can see on the Google Maps app. A user has suggested that the proper website is a different page than the one I currently have listed. The only way to see all types of pending edits is via Check the Facts on Google Maps. No business owner I’ve ever spoken to has any clue what this is, so I think it’s safe to say they wouldn’t be checking there.

Here’s how I would edit that original response from Google to make it more factually correct:

Merchants who manage their business listing info through Google My Business (which is free to use) are notified when edits made by others are published on Google. Sometimes they are notified by email and the updates are also shown inside the Google My Business dashboard. Google allows users (other than the business owner) to make edits to listings on Google, but the edits are reviewed by either automated systems or, in some cases, actual human beings. Although the system isn’t perfect, Google is continually making efforts to keep the map free from spam and malicious editing.


Do you manage listings that have been edited by competitors? What’s your experience been? Share your story in the comments below!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Moz Blog http://ift.tt/2yA7jbp


Getting SEO Value from rel=”nofollow” Links – Whiteboard Friday

Posted by randfish

Plenty of websites that make it easy for you to contribute don’t make it easy to earn a followed link from those contributions. While rel=nofollow links reign in the land of social media profiles, comments, and publishers, there are a few ways around it. In today’s Whiteboard Friday, Rand shares five tactics to help you earn equity-passing followed links using traditionally nofollow-only platforms.

[Whiteboard image: How to get SEO value from rel="nofollow" links]

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about how you can get SEO value from nofollowed links. So in the SEO world, there are followed links. These are the normal ones that you find on almost every website. But then you can have nofollowed links, which you’ll see in the HTML code of a website. The normal markup looks like a href="somewebsite". If you see rel="nofollow" added to that tag, it means that the search engines (Google, Bing, Yahoo, etc.) will not count the link as passing link equity, at least certainly not in the same way that a followed link would.

So when you see these, you can see them by looking in the source code yourself. You could turn on the MozBar and use the “Show nofollow links” on the Page button and see these.
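As a rough sketch of what tools like the MozBar are doing under the hood, here’s a minimal Python example (standard library only) that splits a page’s links into followed and nofollowed. The markup and URLs are hypothetical, purely for illustration:

```python
from html.parser import HTMLParser

class NofollowFinder(HTMLParser):
    """Split a page's <a href> links into followed vs. nofollowed."""
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold several space-separated tokens, e.g. "nofollow noopener"
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

page = """
<a href="https://example.com/a">a normal, followed link</a>
<a href="https://example.com/b" rel="nofollow">a nofollowed link</a>
"""
finder = NofollowFinder()
finder.feed(page)
print(finder.followed)    # ['https://example.com/a']
print(finder.nofollowed)  # ['https://example.com/b']
```

Running something like this over your own pages is a quick way to audit which of your outbound links actually pass equity.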

What sort of links use rel=nofollow?

But the basic story is that you’re not getting the same SEO value from them. But there are ways to get it. Recently you might have seen in the SEO news world that Inc. and Forbes and a few other sites like them, last year it was Huffington Post, started applying nofollow tags to all the links that belong to articles from contributors. So if I go and write an article for Inc. today, the links that I point out from my bio and my snippet on there, they’re not going to pass any value, because they have this nofollow applied.

A) Social media links (Facebook, Twitter, LinkedIn, etc.)

There are a bunch of types of links that use this. Social media, so Facebook, Twitter, and LinkedIn, which is one of the reasons why you can’t just boost your link profile by going to these places and leaving a bunch of links around.

B) Comments (news articles, blogs, forums, etc.)

Comments, so from news articles or blogs or forums where there’s discussion, and Q&A sites: the links you leave in those comments are, again, nofollowed.

C) Open submission content (Quora, Reddit, YouTube, etc.)

Open submission content, so places like Quora where you could write a post, or Reddit, where you could write a post, or YouTube where you could upload a video and have a post and have a link, most of those, in fact almost all of them now have nofollows as do the profile links that are associated. Your Instagram account, for example, that would be a social media one. But it’s not just the pictures you post on Instagram. Your profile link is one of the only places in the Instagram platform where you actually get a real URL that you can send people to, but that is nofollowed on the web.

D) Some publishers with less stringent review systems (Forbes, Buzzfeed, LinkedIn Pulse, etc.)

Some publishers now with these less stringent publishing review systems, so places like Inc., Forbes, BuzzFeed in some cases with their sponsored posts, Huffington Post, LinkedIn’s Pulse platform, and a bunch of others all use this rel=nofollow.

Basic evaluation formula for earning followed links from the above sources


The basic formula that we need to go to here is: How do you contribute to all of these places in ways that will ultimately result in followed links and that will provide you with SEO value? So we’re essentially saying I’m going to do X. I know that’s going to bring a nofollowed link, but that nofollowed link will result in this other thing happening that will then lead to a followed link.

Do X → Get rel=nofollow link → Results in Y → Leads to followed link

5 examples/tactics to start

This other thing happening can be a bunch of different things. It could be something indirect. You post something with your site on one of these places. It includes a nofollow link. Someone finds it. We’ll just call this guy over here, this is our friendly editor who works for a publication and finds it and says, “Hmm, that link was actually quite useful,” or the information it pointed to was useful, the article was useful, your new company seems useful, whatever it is. Later, as that editor is writing, they will link over to your site, and this will be a followed link. Thus, you’re getting the SEO value. You’ve indirectly gained SEO value essentially through amplification of what you were sharing through your link.

Google likes this. They want you to use all of these places to show stuff, and then they’re hoping that if people find it truly valuable, they’ll pick it up, they’ll link to it, and then Google can reward that.

So some examples of places where you might attempt this in the early stages. These are a very small subset of what you could do, and it’s going to be different for every industry and every endeavor.

1. Quora contributions

But Quora contributions, especially if you have relevant, high-value credentials or very unique, specific experiences, will often get picked up by the online press. There are lots of editors and journalists and publications of all kinds that rely on interesting answers to Quora questions to use in their journalism, and then they’ll cite you as a source, or they’ll ask you to contribute, they’ll ask you for a quote, they’ll point to your website, all that kind of stuff.

2. Early comments on low-popularity blogs

Early comments especially in, I know this is going to sound odd, but low-popularity blogs, rather than high-popularity ones. Why low popularity? Because you will stand out. You’re less likely to be seen as a spammer, especially if you’re an authentic contributor. You don’t get lost in the noise. You can create intrigue, give value, and that will often lead to that writer or that blogger picking you up with followed links in subsequent posts. If you want more on this tactic, by the way, check out our Whiteboard Friday on comment marketing from last year. That was a deep dive into this topic.

3. Following and engaging with link targets on Twitter

Number three, following and engaging with your link targets on Twitter, especially if your link targets are heavily invested in Twitter, like journalists, B2B bloggers and contributors, and authors or people who write for lots of different publications. It doesn’t have to be a published author. It can just be a writer who writes for lots of online pieces. Then sharing your related content with them or just via your Twitter account, if you’re engaging with them a lot, chances are good you can get a follow back, and that will lead to a lot of followed links with a citation.

4. Link citations from Instagram images

Instagram accounts. When you post images on Instagram, if you use the hashtags (hashtag marketing is kind of one of the only ways to get exposure on Instagram) that you know journalists, writers, editors, and publications of any kind in your field are picking up and need, especially travel, activities, current events, stuff that’s in the news, or conferences and events, many times folks will pick up those images and ask you for permission to use them. If you’re willing to give it, you can earn link citations. That’s another important reason to associate that URL with your site: so that people can get in touch with you.

5. Amplify content published on your site by republishing on other platforms

If you’re using some of these platforms that are completely nofollow, or platforms that are open contribution and have followed links but where we suspect Google probably doesn’t count them, Medium being one of the biggest places, you can use republishing tactics. Essentially, you’re writing on your own website first, and then republishing on some of these other places.

I’m going to go to Forbes and publish my column there. I’m going to go to Medium and publish in my Medium account. I’m going to contribute to Huffington Post with the same piece. I’m republishing across these multiple platforms. It is technically duplicate content, but not the kind that’s going to be bothersome for search engines: you’re not hurting yourself, because these places are all pointing back to your original.

You’re essentially using these the same way you would use your Twitter or Facebook or LinkedIn, where you are pushing it out as a way to say, “Here, check this out if you’re on these platforms, and here’s the original back here.” You can do that with the full article, just like you would do full content in RSS or full content for email subscribers. Then use those platforms for sharing and amplification to get into the hands of people who might link later.


So nofollowed links, not a direct impact, but potentially a very powerful, indirect way to get lots of good links and lots of good SEO value.

All right, everyone, hope you’ve enjoyed this edition of Whiteboard Friday, and we’ll see you again next week. Take care.

Video transcription by Speechpad.com


from Moz Blog http://ift.tt/2xlFkY3

The Beginner’s Guide to Duplicate Content

One of the most frequent challenges I come across as a digital marketer is clients who can’t seem to get a good grasp of what duplicate content really is, how to avoid it, and why it matters to them.

In this article, I’m going to dispel a few myths about duplicate content and SEO that are still lingering in a post-Panda world, as well as give a few tips on how to keep on the right side of Google’s guidelines so that search engines and users love your content.


What is duplicate content?

From the horse’s mouth, a Google Search Console Help Centre article states:

“Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar.”

Which doesn’t seem so difficult, but what we need to know is how this affects your website.

Some examples of duplicate content include:

Ecommerce product descriptions. Specifically, generic descriptions provided by a supplier and used across multiple sales outlets. For example, this section on the Nespresso website about a coffee machine…

[Screenshot: Nespresso’s product description]

…has been repeated word for word on Amazon India to sell the same product:

[Screenshot: the same description repeated on Amazon India]

Use of the same page in multiple areas of your site. Again, this is usually a problem for ecommerce sites, e.g. you’ll see:

http://ift.tt/2hRwGhW

which has the same content as:

http://ift.tt/2gff0ZQ

Multiple service pages on your website which are too similar to each other.

Your site doesn’t handle the www and non-www versions of your site effectively.

You use another website’s content on your own site. Press Releases are a good example of content that is written once and distributed multiple times. Another would be sites that syndicate content and publish nothing original.

You own several domains that sell similar product lines to different target audiences – to both consumers and trade for example.

Why should I care about duplicate content on my website?

Let’s dispel the biggest myth that still gets circulated, the Google penalty myth. Here’s the truth: There is NO Google penalty for duplicate content.

This was addressed in a Google Q&A in June of last year. You can watch the whole video here.

However: Google MAY prevent some of your content from showing as a search result if your site has duplicate content issues, and as with all content, it will aim to show the most relevant content to the user at the time.

Google will still index those pages. If it can see the same text across several pages and decides they are the same, it will show only the one it deems most relevant to the user’s query.

There is a distinction between content which has been duplicated by your CMS generating new URLs, for example, and users who replicate content on a large scale and re-publish it for financial reward, or to manipulate rankings.

Google’s Guidelines for quality are clear on this subject. If you use illicit tactics for generating content, or create pages with no original content, you do run the risk of being removed from search engine results pages (SERPs).

In ordinary cases such as those listed above, the worst that will happen is your site simply won’t be shown in SERPs.

How to check for duplicate content on your site

There are several tools which will help identify areas to improve on your own site such as:

Moz’s crawler tool will help you identify which pages on your site duplicate one another. It is a paid tool, but it does have a 30-day free trial available.

Siteliner will give you a more in-depth analysis: which pages are duplicated, how closely related they are, and which areas of text are replicated. This is useful where large bodies of text are reused but the whole page may not be a complete replication:

[Screenshots: Siteliner duplicate content reports]

Copyscape’s plagiarism checker will also check for copies of your pages being used on the wider web:

[Screenshot: Copyscape results]

If you can’t access these tools for any reason but are concerned that duplicate content may be affecting your site, try searching for a snippet of your text (in quotation marks) to see whether any direct duplications are returned in the results.
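If you’d rather script a rough check yourself, Python’s standard library can produce a quick similarity ratio between two blocks of text. This is only an illustrative sketch, not how Siteliner or Copyscape actually work, and the product copy below is invented:

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Rough 0-1 similarity ratio between two blocks of page text,
    compared word by word."""
    return SequenceMatcher(None, text_a.split(), text_b.split()).ratio()

desc    = "Compact espresso machine with 19 bar pressure and fast heat-up."
copied  = "Compact espresso machine with 19 bar pressure and fast heat-up."
rewrite = "Our barista-grade machine heats up fast and brews at 19 bar."

print(similarity(desc, copied))   # 1.0 -- word-for-word duplicate
print(similarity(desc, rewrite))  # much lower -- rewritten copy
```

A ratio near 1.0 flags pages worth rewriting; a low ratio suggests the copy is already sufficiently unique.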

What to do about content duplications?

This really depends on the type of duplication. Some of the techniques I’ll talk about now aren’t really for the beginner. You may need an SEO agency to hold your hand through this part of the process.

The Problem: Generic product descriptions provided by a supplier

The fix: This one is easy to tackle, but it can be resource-heavy. The advice is about as simple as it gets: make your content unique, useful, and interesting for your audience. Usually a manufacturer’s description will tell you what the product is, whereas you need to think about why your customer needs it and why they need to buy it from you.

There’s nothing stopping you from using the specification of a product and then adding your own wording around it. Add in your tone of voice and personality. Think about your specific audience and their personas. Think about why they would want to buy your product and then tell them your unique selling proposition. What problem or need does it satisfy that they will relate to?

The Problem: Same page in multiple places on your site

The fix: In this instance, you should include a canonical URL on the duplicated pages, which refers to the original as the preferred version of the page. In my ecommerce example, where a red jacket appears in both the “sale” and “jackets” categories, one of them should include a canonical link in the code of the page to acknowledge the duplication. An example would be as follows:

On the jacket contained in the “Sale” page:

[Screenshot: example canonical link element in the page source]
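As a concrete sketch of this fix, here’s how a canonical declaration could be read out of a page’s source with Python’s standard library. The jacket URLs are illustrative, not taken from a real site:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Pull the canonical URL declared in a page's <head>, if any."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

# The red jacket's "Sale" page declares the "Jackets" page as the
# preferred version (hypothetical URLs):
sale_page = ('<head><link rel="canonical" '
             'href="https://example.com/jackets/red-jacket" /></head>')
finder = CanonicalFinder()
finder.feed(sale_page)
print(finder.canonical)  # https://example.com/jackets/red-jacket
```

If `canonical` comes back as `None` on a page you know is a duplicate, that page is a candidate for the fix above.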

The Problem: Service pages on your website which are too similar to each other

The fix: There are a couple of options here. You can try to make the pages sufficiently different; however, if the pages are largely around the same subject with only slight differences, you may be better served using just one page to talk about both subjects. I would advise removing the least valuable page and applying a 301 redirect back to the most valuable page. One valuable page is certain to be more successful than two weak or conflicting pages.

The Problem: Your site doesn’t handle www. and non-www. versions of your site effectively

The fix: The easiest way to test for this is to remove the www. portion of one of your site’s URLs in your browser and see what happens when you try to load the page. Ideally, a redirect should take place from one version to the other.

Note: it doesn’t matter which you go with, just pick one way and be consistent. Also make sure you have identified your preferred version in Google Search Console.
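The redirect itself belongs in your server configuration, but the “pick one version and be consistent” rule can be sketched in a few lines of Python. The preferred host here is a hypothetical example:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical preferred host: pick one version and stick to it
PREFERRED_HOST = "www.example.com"

def redirect_target(url):
    """Return the URL a 301 redirect should point to, or None if the
    URL already uses the preferred host."""
    parts = urlsplit(url)
    if parts.netloc.lower() == PREFERRED_HOST:
        return None
    # Same site under the other host variant: send it to the preferred one
    return urlunsplit(parts._replace(netloc=PREFERRED_HOST))

print(redirect_target("http://example.com/page"))      # http://www.example.com/page
print(redirect_target("http://www.example.com/page"))  # None
```

Whichever version you choose, the point is that every non-preferred URL resolves, via a single redirect, to exactly one canonical host.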

The Problem: You use another website’s content openly on your website

The fix: This scenario tends to happen if you use press releases or if you use feeds to populate certain areas of your site, to show the latest events in a specific region, for example.

There’s no real hard and fast rule to this. If you are sure that this type of content provides value to your users you can either accept that you’re never going to rank well for that content (but the rest of your site might) or you can take the time to make the content unique to your audience.

The Problem: Having two websites selling the same goods to different audiences

The fix: This one is somewhat complex. The best way to combat this, from a search point of view, is to combine your online presences into one site. There may be good business reasons for having two separate brands which cater to different audiences. You still need to be aware that they will ultimately be competing for attention in the search engine results pages (SERPs).

In Summary…

Simply adhering to Google’s quality guidelines will help. Create content which is useful, credible, engaging and, wherever possible, unique.

Google does a decent job of spotting unintentional duplications but the tips above should give you an idea of how to get search engines and users to understand your site.

About the author

Jean Frew is a Digital Marketing Consultant at Hallam Internet. Jean has worked in Ecommerce and Digital Marketing since 2007 and is experienced in driving online growth, as well as managing budgets and projects of all sizes. She has a broad knowledge of Digital Marketing and utilises analytics to make data-driven decisions.

from Internet Marketing Blog by WordStream http://ift.tt/2ghfqPj

New Findings Show Google Organic Clicks Shifting to Paid

Posted by Brian_W

On the Wayfair SEO team, we keep track of our non-branded click curves: the average click-through rate (CTR) for each ranking position. This helps us accurately evaluate the potential opportunity of keyword clusters.

Over the last two years, the total share of organic clicks on page one of our e-commerce SERPs has dropped 25% on desktop and 55% on mobile.

For the ad-heavy non-local SERPs that we work in, paid ads are likely now earning nearly the same percentage of clicks as organic results — a staggering change from most of the history of Google.

Organic CTR loses 25% of click share on desktop, 55% on mobile

Looking at 2015 vs 2017 data for all keywords ranking organically on the first page, we’ve seen a dramatic change in CTR. Below we’ve normalized our actual CTR on a 1–10 scale, representing a total drop of 25% of click share on desktop and 55% on mobile.

Organic receives 25% less desktop CTR and 55% less mobile CTR compared to two years ago.

The much larger drop on mobile is particularly relevant because we’ve seen large traffic shifts to mobile over the last two years as well. The overall percentage drop plays out somewhat similarly across the first page of results; however, the top four were most heavily impacted.

The first four organic results were most heavily impacted by the CTR shift from organic to paid.

About the data

It’s important to note that this type of CTR change is not true for every SERP. This data is only applicable to e-commerce intent search queries, where ads and PLAs (product listing ads) are on nearly every query.

We gather the impression, click, and rank data from Search Console. While Search Console data isn’t quantitatively correct, it does appear to be directionally correct for us:

  • If we see clicks double in Search Console, we also see organic Google traffic double in our analytics.
  • Site improvements that lead to meaningful CTR gains appear to be reflected in Search Console.
  • We can roughly verify impressions via ad data.
  • We can confirm the accuracy of rank.

For purposes of this data pull, we excluded any keywords that Search Console reported as a non-integer rank (such as ranking 1.2). We have thousands of page one keywords, including many large head terms comprising millions of combined clicks, which gives us a lot of data for each ranking position.

We removed all branded queries from the data, since branded search hugely skews click curves.

It’s important to note that paid ads are not capturing every click that organic has lost. In addition to the small number of people who click beyond the first page, a surprising number do not click at all. Our best guess is that all ads combined now get about the same percentage of clicks (for our results) as all organic results combined.
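As an illustration of the data pull described above (with invented numbers and a hypothetical export format, not Wayfair’s actual data), computing a click curve from Search Console rows might look like this:

```python
from collections import defaultdict

# Hypothetical Search Console export: (query, rank, clicks, impressions)
rows = [
    ("blue sofa",  1.0, 120, 1000),
    ("area rugs",  1.0,  90,  800),
    ("bar stools", 2.0,  40,  700),
    ("tv stands",  1.2,  33,  500),  # non-integer rank: excluded, as in the post
]

clicks = defaultdict(int)
impressions = defaultdict(int)
for query, rank, c, i in rows:
    if rank != int(rank):  # drop keywords reported at ranks like 1.2
        continue
    clicks[int(rank)] += c
    impressions[int(rank)] += i

# Average CTR per ranking position -- the "click curve"
ctr_curve = {pos: clicks[pos] / impressions[pos] for pos in sorted(clicks)}
print(ctr_curve)  # {1: 0.1166..., 2: 0.0571...}
```

Comparing a curve like this year over year is what surfaces the kind of click-share shift the post describes.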

Why is this happening?

It’s no secret to SEOs who work on transactional keywords why we no longer gain as large a share of clicks for our best rankings. We suspect the primary causes are the following:

  • Ads serving on more queries
  • More ads per query
  • Larger ads, with more space given to each ad
  • Google Shopping (which shows up on more queries, lists more products per query, and takes up more space)
  • Subtler ad labeling, making it less obvious that an ad is an ad

At Wayfair, we’ve seen Google Shopping results appear on more and more search queries over the last year. Using Stat Search Analytics, we can track the growth in queries serving Google Shopping results (modified by search volume to give a qualitative visibility score) across the 25,000 keywords we track daily on mobile and desktop. The overall share of voice of Google Shopping has grown nearly 60% in the last year.

Number of transactional queries serving Google Shopping has grown nearly 60% in the last year.

On top of this, we’re often seeing four PPC ads for a typical non-branded commercial term, in addition to the Google Shopping results.

And with the expanded size of ads on mobile, almost none of our queries show anything other than ads without scrolling:

This great image from Edwords shows the steady growth in percent of the desktop page consumed by ads for a query that has only three ad results. We go from seeing five organic results above the scroll, to just one. In more recent years we’ve seen this size growth explode on mobile as well.

At the same time that ads have grown, the labeling of ads has become increasingly subtle. In a 2015 study, Ofcom found that half of adults didn’t recognize ads in Google, and about 70% of teenagers didn’t recognize Google ads; ad labeling has become substantially less obvious since then. For most of its history, Google’s ads were labeled by a large colored block that was intuitively separate from the non-ad results, though sometimes not visible on monitors with a higher brightness setting.

2000 – Shaded background around all ads:

2010 – Shaded background still exists around ads:

2014 – No background; yellow box label next to each ad (and ads take up a lot more space):

2017 – Yellow box changed to green, the same color as the URL it’s next to (and ads take up even more space):

2017 – Green box changed to a thin green outline the same color as the URL:

What to do about it

The good news is that this is impacting everyone in e-commerce equally, and all those search clicks are still happening — in other words, those users haven’t gone away. The growth in the number of searches each year means that you probably aren’t seeing huge losses in organic traffic; instead, it will show as small losses or anemic growth. The bad news is that it will cost you — as well as your competitors — more money to capture the same overall share of search traffic.

A strong search marketing strategy has always involved organic, paid search, and PLA combined. Sites optimizing for all search channels are already well-positioned to capture search traffic regardless of ad changes to the SERPs: if SEO growth slows, then PLA and paid search growth speeds up. As real estate for one channel shrinks, real estate for others grows.

If you haven’t been strongly invested in search ads or PLAs, then the Chinese proverb on the best time to plant a tree applies perfectly:

The best time to plant a tree was 20 years ago. The second best time is now.

With a similar percentage of clicks going to paid and organic, your investment in each should be similar (unless, of course, you have some catching up to do with one channel).


from Moz Blog http://ift.tt/2xUYiGG

Breaking: Major Changes to How Google Spends Your Budget

Outside of overhauling the UI, not much had changed in AdWords this year…until yesterday.

Google made a hush-hush announcement (there’s still nothing on the Inside AdWords blog) in which they revealed a major shift in how they choose to allocate your daily ad spend. Per Google:

“Starting October 4, 2017, campaigns will be able to spend up to twice the average daily budget to help you reach your advertising goals.”

Now, before you pull your credit card from the Billing and Payments tab, this does not mean that Google is going to double your daily budget indefinitely. They’re not trying to bleed you dry. Think of it less like robbery and more like “big data knows best.”

What changed with your AdWords budget?

Effective October 4, Google can double the amount of money you’ve said you want to spend per day on a given campaign. So, if a campaign in your account has a budget of $150, Google can decide to spend up to $300. Note that this change affects all budgets, whether they are unique to a campaign or shared.

They’ve always had the ability to exceed your daily budget in the righteous pursuit of clicks and conversions: they just capped it at 20% instead of, you know, 100%.

[Screenshot: Google’s announcement of the AdWords daily budget change]

This process is called overdelivery, and it makes a lot of sense. Some days, 1,000 people might search for “novelty koozies,” and other days that number could drop to 200. Google, benevolent as they are, wants to make sure your ads are served to as many prospective customers as possible (they also don’t want to leave unspent money on the table).

Basically, on days when traffic is high, you could see your costs swell by up to 100%. Don’t panic. This will be counteracted on slower days, when ad spend is below your desired daily budget. While this doesn’t mean you’re going to spend more than your “monthly charging limit” (the average number of days in a month, 30.4, multiplied by your average daily budget), it sure as hell means you’re going to hit that number.
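The arithmetic here is simple enough to sketch. The function names are mine, but the constants (30.4 days per average month, the old 20% overdelivery cap, and the new 100% cap) come straight from the post:

```python
DAYS_PER_MONTH = 30.4  # Google's average-month constant

def monthly_charging_limit(daily_budget):
    """The most Google will charge over a calendar month."""
    return daily_budget * DAYS_PER_MONTH

def max_daily_spend(daily_budget, after_oct_2017=True):
    """Overdelivery cap: 20% over budget before the change, 100% after."""
    return daily_budget * (2.0 if after_oct_2017 else 1.2)

daily = 150
print(monthly_charging_limit(daily))                 # roughly 4560
print(max_daily_spend(daily))                        # 300.0
print(max_daily_spend(daily, after_oct_2017=False))  # roughly 180
```

So a $150/day campaign can now burn $300 on a busy day, but the month as a whole still tops out around $4,560.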

What do these changes mean for you?

Well, for one, you’re probably never going to come in under your monthly advertising budget again.

This change will maximize your monthly ad spend, putting your ads in front of more eyeballs. This represents opportunity for you. But for Google, this is more than opportunity: it’s a chance to increase ad revenue. If every advertiser comes, like, half a percentage point closer to hitting their monthly budget, that adds up.

If you’re using some form of daily tracking to gauge how you’re pacing towards monthly targets (particularly common for agencies), you’re going to see some sporadic numbers during a given month. If your ads are overserved a few days in a row, you may receive a concerned phone call from your boss or client. Just tell them it’ll all come out in the wash: daily fluctuations will go in both directions, and spend will normalize by month’s end. While you may spend a few more dollars over the course of a month, that was money that had already been earmarked for AdWords. You’re seeing more clicks and, if your account’s in great shape, this means a parallel uptick in new business should soon follow. While it’s certainly a big change, it shouldn’t be cause for alarm.

Just be prepared to see your daily ad spend fluctuate in the coming weeks—and check back for more data as we look into our own client accounts and see how this change shakes out. 

from Internet Marketing Blog by WordStream http://ift.tt/2xWJ5oF

Special Notes for SABs Amid Decreased Local Search Visibility

Posted by MiriamEllis

One of the most common complaints I hear from service area business owners, like plumbers, locksmiths, and housekeepers, is that Google has always treated them as an afterthought. If you’re in charge of the digital marketing for these business models, it’s vital to understand just how accurate this complaint is so that you can both empathize with SAB brand owners and create a strategy that honors limitations while also identifying opportunities.

In marketing SABs, you’ve got to learn to make the best of a special situation. In this post, I want to address two of the realities these companies are facing right now that call for careful planning: the unique big picture of SAB local listing management, and the rise of Google’s Home Service Ads.

Let’s talk listings, Moz Local, and SABs

I was fascinated by my appliance repairman — an older German ex-pat with a serious demeanor — the first time he looked at my wall heater and pronounced,

“This puppy is no good.”

Our family went on to form a lasting relationship with this expert who has warned me about everything from lint fires in dryers to mis-branded appliances slapped together in dubious factories. I’m an admiring fan of genuinely knowledgeable service people who come to my doorstep, crawl under my house where possums dwell, ascend to my eerie attic despite spiders, and are professionally dedicated to keeping my old house livable. I work on a computer, surrounded by comforts; these folks know what real elbow grease is all about:

It’s because of my regard for these incredibly hard-working SAB owners and staffers that I’ve always taken issue with the fact that the local Internet tends to treat them in an offhand manner. They do some of the toughest jobs, and I’d like their marketing opportunities to be boundless. But the reality is, the road has been rocky and the limits are real.

Google goofed first

When Google invested heavily in developing their mapped version of the local commercial scene, there was reportedly internal disagreement as to whether a service area business is actually a “place” deserving of inclusion in Google’s local index. You couldn’t add service area businesses to the now-defunct MapMaker, but you could create local listings for them (clear as mud, right?). At a 2008 SMX event, faced with the question of how SABs could be accurately represented in the local results, a Google rep goofed by suggesting that they all get PO boxes, only to have this specific practice subsequently outlawed by Google’s guidelines.

Confusion and spam flowed in

For the record,

  • Both SABs and brick-and-mortar businesses are currently eligible for Google My Business listings if they serve customers face-to-face.
  • SABs must have some form of legitimate street address, even if it’s a home address, to be included.
  • Only brick-and-mortar businesses are supposed to have visible addresses on their listings, but Google’s shifting messaging and inconsistent guideline enforcement have created confusion.

Google has shown little zeal for suspending listings that violate the hide-address guidelines, with one notable exception recently mentioned to me by Joy Hawkins of Sterling Sky: SABs who check the Google My Business dashboard box stating that they serve clients at the business’ location (in order to escape the no man’s land at the bottom of the Google Home Service ad unit) are being completely removed from the map if caught.

Meanwhile, past debate over whether hiding a business’ address lowers its local pack rankings has engendered ongoing concern. The 2017 Local Search Ranking Factors survey still lists this as the #18 negative local pack ranking factor, which might be worthy of further discussion.

All of these factors have created an environment in which legitimate SABs have accidentally incorrectly listed themselves on Google and in which spammers have thrived, intentionally creating multiple listings at non-physical addresses and frequently getting away with it to the detriment of search results uniformity and quality. In this unsatisfactory environment, the advent of Google’s Home Service Ads program may have been inevitable, and we’ll take a look at that in a minute.

Limits made clear in listing options for SABs

Whether the risk of suspension or impact on rankings is great or small, hiding your address on SAB Google My Business listings is the only Google-approved practice. If you want to play it totally safe, you’ll play by the rules, but this doesn’t automatically overcome every challenge.

Google is one of the few high-level local business indexes that requires hidden SAB addresses. And it’s in this stance that SABs encounter some problems taking advantage of the efficiencies provided by automated location data management tools like Moz Local. There are three main things that have confused our own customers:

  1. Because our SAB customers are required by Google to hide their address, Moz Local can’t then verify the address because… well, it’s hidden. This means that customers need to have a Facebook listing with a visible address on it to get started using Moz Local. Facebook doesn’t require SAB addresses to be hidden.
  2. Once the customer gets started, their ultimate consistency score will generally be lower than what a brick-and-mortar business achieves, again because their hidden GMB listing address can’t be matched to all of the other complete listings Moz Local builds for them. It reads like an inconsistency, and while this in no way impacts their real-world performance, it’s a little sad not to be able to aim for a nifty 100% dashboard metric within Moz Local. It’s important to mention here that a 100% score isn’t achievable for multi-location business models, either, given that Facebook’s guidelines require adding a modifier to the business name of each branch, rendering it inconsistent. This is in contrast to Google’s policy, which defines the needless addition of keywords or geo-modifiers to the business name as spam! When Google and Facebook fundamentally disagree on a guideline, a small measure of inconsistency is part and parcel of the scenario, and not something worth worrying about.
  3. Finally, for SABs who don’t want their address published anywhere on the Internet, automated citation management simply may not be a good match. Some partners in our network won’t accept address-less distribution from us, viewing it as incomplete data. If an SAB isn’t looking for complete NAP distribution because they want their address to be kept private, automation just isn’t ideal.
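Moz Local’s actual scoring is proprietary, but a toy calculation shows why a hidden address drags a score below 100% even when everything else matches. The field names and the flat per-field weighting here are purely illustrative:

```python
def consistency_score(canonical, listings):
    """Toy illustration (NOT Moz Local's real algorithm): the share of
    name/address/phone fields across all listings that match the
    canonical record. A hidden address counts as a mismatch because it
    simply can't be verified against the canonical data."""
    fields = ("name", "address", "phone")
    checks = [listing.get(f) == canonical[f]
              for listing in listings for f in fields]
    return sum(checks) / len(checks)

canonical = {"name": "Ace Plumbing", "address": "123 Main St", "phone": "555-0100"}
listings = [
    {"name": "Ace Plumbing", "address": "123 Main St", "phone": "555-0100"},  # e.g. Facebook
    {"name": "Ace Plumbing", "address": None, "phone": "555-0100"},           # e.g. GMB, hidden
]
print(round(consistency_score(canonical, listings), 2))  # 0.83 — one hidden field caps the score
```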

So how can SABs use something like Moz Local?

The Moz Local team sides with SABs — we’re not totally satisfied with the above state of affairs and are actively exploring better support options for the future. Given our admiration for these especially hard-working businesses, we feel SABs really deserve to have needless burdens lifted from their shoulders, which is exactly what Moz Local is designed to do. The task of manual local business listing publication and ongoing monitoring is a hefty one — too hefty in so many cases. Automation does the heavy lifting for you. We’re examining better solutions, but right now, what options for automation are open to the SAB?

Option #1: If your business is okay with your address being visible in multiple places, then simply be sure your Facebook listing shows your address and you can sign up for Moz Local today, no problem! We’ll push your complete NAP to the major aggregators and other partners, but know that your Moz Local dashboard consistency score won’t be 100%. This is because we won’t be able to “see” your Google My Business listing with its hidden address, and because choosing service-related categories will also hide your address on Citysearch, Localeze, and sometimes, Bing. Also note that one of our partners, Factual, doesn’t support locksmiths, bail bondsmen or towing companies. So, in using an automated solution like Moz Local, be prepared for a lower score in the dashboard, because it’s “baked into” the scenario in which some platforms show your full street address while others hide it. And, of course, be aware that many of your direct local competitors are in the same boat, facing the same limitations, thus leveling the playing field.

Option #2: If your business can budget for it, consider transitioning from an SAB to a brick-and-mortar business model, and get a real-world office that’s staffed during stated business hours. As Mike Blumenthal and Mary Bowling discuss in this excellent video chat, smaller SABs need to be sure they can still make a profit after renting an office space, and that may largely be based on rental costs in their part of the country. Very successful virtual brands are exploring traditional retail options and traditional brick-and-mortar business models are setting up virtual showrooms; change is afoot. Having some customers come to the physical location of a typical SAB may require some re-thinking of service. A locksmith could grind keys on-site, a landscaper could virtually showcase projects in the comfort of their office, but what could a plumber do? Any ideas? If you can come up with a viable answer, and can still see profits after factoring in the cost of office space, transitioning to brick-and-mortar effectively removes any barriers to how you represent yourself on Google and how fully you can use software like Moz Local.

If neither option works for you, and you need to remain an SAB with a hidden address, you’ll either need to a) build citations manually on sites that support your requirements, like these ones listed out by Phil Rozek, while having a plan for regularly monitoring your listings for emerging inconsistencies, duplicates and incoming reviews or b) hire a company to do the manual development and monitoring for you on the platforms that support hiding your address.

I wish the digital marketing sky could be the limit for SABs, but we’ve got to do the most we can working within parameters defined by Google and other location data platforms.

Now comes HSA: Google’s next SAB move

As a service area business owner or marketer, you can’t be faulted for feeling that Google hasn’t handled your commercial scenario terribly well over the years. As we’ve discussed, Google has wobbled on policy and enforcement. Not yet mentioned is that they’ve never offered an adequate solution to the reality that a plumber located in City A equally services Cities B, C, and D, but is almost never allowed to rank in the local packs for these service cities. Google’s historic bias toward physical location doesn’t meet the reality of business models that go to clients to serve. And it’s this apparent lack of interest in SAB needs that may be adding a bit of sting to Google’s latest move: the Home Service Ads (HSA) program.

You’re not alone if you don’t feel totally comfortable with Google becoming a lead gen agent between customers and, to date:

  • Plumbers
  • House cleaners
  • Locksmiths
  • Handymen
  • Contractors
  • Electricians
  • Painters
  • Garage door services
  • HVAC companies
  • Roadside assistance services
  • Auto glass services

in a rapidly increasing number of cities.

Suddenly, SABs have moved to the core of Google’s consciousness, and an unprecedented challenge for these business models is that, while you can choose whether or not to opt into the program, there’s no way to opt out of the impacts it is having on all affected local results.

An upheaval in SAB visibility

If HSA has come to your geo-industry, and you don’t buy into the program, you will find yourself relegated to the bottom of the new HSA ad unit which appears above the traditional 3-pack in the SERPs:

hsa.jpg

Additionally, even if you were #1 in the 3-pack prior to HSA coming to town, if you lack a visible address, your claimed listing appears to have vanished from the pack and finder views.

hsa2.jpg

*I must tip my hat again to Joy Hawkins for helping me understand why that last example hasn’t vanished from the packs — it’s unclaimed. Honestly, this blip tempts me to unclaim an SAB listing and “manage” it via community edits instead of the GMB dashboard to see if I could maintain its local finder visibility… but this might be an overreaction!

If you’re marketing an SAB, have been relegated to the bottom of the HSA ad unit, and have vanished from the local pack/finder view, please share with our community how this has impacted your traffic and conversions. My guess would be that things are not so good.

So, what can SABs do in this new landscape?

I don’t have all of the answers to this question, but I do have these suggestions:

  1. Obviously, if you can budget for it, opt into HSA.
  2. But, bizarrely, understand that in some ways, Google has just made your GMB listing less important. If you have to hide your address and won’t be shown in HSA-impacted local packs and finder views because of this guideline compliance, your GMB listing is likely to become a less important source of visibility for your business.
  3. Be sure, then, that all of your other local business listings are in apple-pie order. If you’re okay with your address being published, you can automate this necessary work with software like Moz Local. If you need to keep your address private, put in the time to manually get listed everywhere you can. A converted lead from CitySearch or Foursquare may even feel like more of a victory than one from Google.
  4. Because diversification has just become a great deal more important, alternatives like those offered by visibility on Facebook are now more appealing than ever. And ramp up your word-of-mouth marketing and review management strategies like never before. If I were marketing an SAB, I’d be taking a serious new look at companies like ZipSprout, which helps establish real-world local relationships via sponsorships, and GetFiveStars, which helps with multiple aspects of managing reviews.
  5. Know that organic visibility is now more of a prize than previously. If you’re not in the packs, you’ve got to show up below them. This means clearly defining local SEO and traditional SEO as inextricably linked, and doing the customary work of keyword research, content development, and link management that have fueled organic SEO from the beginning. I’m personally committing to becoming more intimately familiar with Moz Pro so that I can better integrate into my skill set what software like this can do for local businesses, especially SABs.
  6. Expect change. HSA is still a test, and Google continues to experiment with how it’s displaying its paying customers in relationship to the traditional free packs and organic results. Who knows what’s next? If you’re marketing SABs, an empathetic and realistic approach to both historic and emerging limitations will help you create a strategy designed to ensure brand survival, independent of Google’s developments.

Why is Google doing this?

monopoly.jpg

I need to get some window blinds replaced in my home this fall. When I turned to Google’s (non-HSA) results and began calling local window treatment shops, imagine my annoyance in discovering that fully ½ of the listings in the local finder were for companies not located anywhere near my town. These brands had set up spam listings for a ton of different cities to which they apparently can send a representative, but where they definitely don’t have physical locations. I wasted a great deal of time calling each of them, and only felt better after reporting the listings to Google and seeing them subsequently removed.

I’m sharing this daily-life anecdote because it encapsulates the very best reason for Google rolling out Home Service Ads. Google’s program is meant to ensure that when I use their platform to access service companies, I’m finding vetted, legitimate enterprises with accurate location data and money-back satisfaction guarantees, instead of finding the mess of spam listings Google’s shifting policies and inadequate moderation have created. The HSA ad units can improve results quality while also protecting consumers from spurious providers.

The other evident purpose of HSA is the less civic-minded but no less brilliant one: there’s money to be made and Google’s profit motives are no different than those of any other enterprise. For the same reason that Amazon has gotten into the SAB lead gen business, Google wants a piece of this action. So, okay, no surprise there, and if the Google leads wind up growing the revenue of my wonderful German handyman, more power to them both.

But I hope my plumber, and yours, and your clients in the service markets, will take a step back from the Monopoly board and see this as a moment to reevaluate a game in which Google and Amazon are setting up big red hotels on Boardwalk and Park Place. I do advocate getting qualified for HSA, but I don’t advise a stance of unquestioning loyalty to or dependence on Google, particularly if you haven’t felt especially well-served by their SAB policies over the years. If Google can drive lucrative leads your way, take them, but remember you have one advantage Google, Amazon and other lead generation agencies lack: you are still the one who meets the customer face-to-face.

Opportunity is knocking in having a giant of visibility like Google selling you customers, because those customers, if amazed by your service, have grandmothers, brothers, and co-workers who can be directly referred to your company, completely outside the lead-gen loop. In fact, you might even come up with an incentive program of your own to be sure that every customer you shake hands with is convinced of your appreciation for every referral they may send your way.

Don’t leave it all up to Google to make your local SAB brand a household word. Strategize for maximum independence via the real-world relationships you build, in the home of every neighbor where the door of welcome is opened in anticipation of the very best service you know how to give.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Moz Blog http://ift.tt/2yWQYum

10 of the Most Innovative Chatbots on the Web

If you’ve ever used a customer support livechat service, you’ve probably experienced that vague, sneaking suspicion that the “person” you’re chatting with might actually be a robot.

Like the endearingly stiff robots we’ve seen in countless movies – tragic, pitiful machines tortured by their painfully restricted emotional range, futilely hoping to attain a greater degree of humanity – chatbots often sound almost human, but not quite. Their speech is awkward, the cadence somehow off.

 Chatbots illustration

It’s the online equivalent of the “Uncanny Valley,” a mysterious region nestled somewhere between the natural and the synthetic that offers a disturbing glimpse at how humans are making machines that could eventually supplant humans, if only their designers could somehow make their robotic creations less nightmarish.

Love them or hate them, chatbots are here to stay. Chatbots have become extraordinarily popular in recent years largely due to dramatic advancements in machine learning and other underlying technologies such as natural language processing. Today’s chatbots are smarter, more responsive, and more useful – and we’re likely to see even more of them in the coming years.

In this post, we’ll be taking a look at 10 of the most innovative ways companies are using them. We’ll be exploring why chatbots have become so popular, as well as the wider, often-unspoken impacts these constructs promise to have on how we communicate, do business, and interact with one another online. 

Before we get into the examples, though, let’s take a quick look at what chatbots really are and how they actually work.

What Are Chatbots?

Chatbots – also known as “conversational agents” – are software applications that mimic written or spoken human speech for the purposes of simulating a conversation or interaction with a real person. There are two primary ways chatbots are offered to visitors: via web-based applications or standalone apps. Today, chatbots are used most commonly in the customer service space, assuming roles traditionally performed by living, breathing human beings such as Tier-1 support operatives and customer satisfaction reps.

Chatbots as customer service reps example 

Image via Loyalty Apps

Conversational agents are becoming much more common partly because the barriers to entry in creating chatbots (i.e., sophisticated programming knowledge and other highly specialized technical skills) are falling away. 

Today, you can make your very own chatbot that you can use in Facebook Messenger, for example – all without a pricey Computer Science degree or even much prior coding experience – and there are several sites that offer the ability to create rudimentary chatbots using simple drag-and-drop interfaces.

How Do Chatbots Work?

At the heart of chatbot technology lies natural language processing or NLP, the same technology that forms the basis of the voice recognition systems used by virtual assistants such as Google Now, Apple’s Siri, and Microsoft’s Cortana.

Chatbots how chatbots work

Image via Wizeline

Chatbots process the text presented to them by the user (a process known as “parsing”), then respond according to a complex series of algorithms that interprets what the user said, infers what they mean and/or want, and determines a series of appropriate responses based on this information.
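That parse-interpret-respond loop can be made concrete with a deliberately crude sketch. Real chatbots use statistical NLP models; here “parsing” is just keyword matching against hand-written intents, purely for illustration:

```python
import re
import random

# Toy intent table: each intent pairs a keyword pattern with canned replies.
INTENTS = {
    "greeting": (re.compile(r"\b(hi|hello|hey)\b", re.I),
                 ["Hello! How can I help?", "Hi there!"]),
    "hours":    (re.compile(r"\b(open|hours|close)\b", re.I),
                 ["We're open 9am-5pm, Monday to Friday."]),
}
FALLBACK = ["Sorry, I didn't catch that. Could you rephrase?"]

def respond(message):
    """Parse the message, match it to an intent, pick a response."""
    for pattern, replies in INTENTS.values():
        if pattern.search(message):        # "parsing" + intent detection
            return random.choice(replies)  # choose an appropriate reply
    return random.choice(FALLBACK)         # no intent matched
```

The gap between this sketch and a production bot (entity extraction, context tracking, learned intent classifiers) is exactly where the NLP advances discussed below come in.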

Some chatbots offer a remarkably authentic conversational experience, in which it’s very difficult to determine whether the agent is a bot or a human being. Others are much easier to spot (much like the T-600 series of murderous robots in the popular Terminator sci-fi action movies):

Although chatbot technology is distinctly different from natural language processing technology, the former can only really advance as quickly as the latter; without continued developments in NLP, chatbots remain at the mercy of algorithms’ current ability to detect the subtle nuances in both written and spoken dialogue.

This is where most applications of NLP struggle, and not just chatbots. Any system or application that relies upon a machine’s ability to parse human speech is likely to struggle with the complexities inherent in elements of speech such as metaphors and similes. Despite these considerable limitations, chatbots are becoming increasingly sophisticated, responsive, and more “natural.”

Put another way, they’re becoming more human. 

Now that we’ve established what chatbots are and how they work, let’s get to the examples. Here are 10 companies using chatbots to provide better customer service, seal deals and more.

1. Endurance: A Companion for Dementia Patients

My mother was diagnosed with aggressive Alzheimer’s disease two years ago, and having observed her sudden decline firsthand, I can tell you how difficult it is to watch someone with dementia struggle with even the most basic of conversational interactions.

Unfortunately, my mom can’t really engage in meaningful conversations anymore, but many people suffering with dementia retain much of their conversational abilities as their illness progresses. However, the shame and frustration that many dementia sufferers experience often make routine, everyday talks with even close family members challenging. That’s why Russian technology company Endurance developed its companion chatbot.

Chatbots Endurance dementia chatbot alzheimer's disease

Image via Endurance

Many people with Alzheimer’s disease struggle with short-term memory loss. As such, the chatbot aims to identify deviations in conversational branches that may indicate a problem with immediate recollection – quite an ambitious technical challenge for an NLP-based system.
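Endurance hasn’t published how its deviation detection works, but one naive way to flag possible short-term recall problems, offered purely as an illustration, is to watch for a user asking a near-identical question they already asked earlier in the session:

```python
from difflib import SequenceMatcher

def flag_repeats(utterances, threshold=0.9):
    """Naive illustration (not Endurance's actual approach): return the
    indices of utterances that are nearly identical to an earlier one,
    a possible signal of short-term memory trouble."""
    flags = []
    for i, current in enumerate(utterances):
        for earlier in utterances[:i]:
            similarity = SequenceMatcher(None, earlier.lower(),
                                         current.lower()).ratio()
            if similarity >= threshold:
                flags.append(i)
                break
    return flags

chat = ["What day is it today?", "Lovely weather.", "What day is it today?"]
print(flag_repeats(chat))  # [2] — the third line repeats the first
```

A real system would also need to distinguish ordinary conversational repetition from clinically meaningful patterns, which is where the hard NLP work lies.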

In addition, since the chatbot is a cloud-based solution, physicians and family members can review communication logs taken from the bot to identify potential degradation of memory function and communicative obstacles that could signify deterioration of the patient’s condition. 

Interestingly, the as-yet unnamed conversational agent is currently an open-source project, meaning that anyone can contribute to the development of the bot’s codebase. The project is still in its earlier stages, but has great potential to help scientists, researchers, and care teams better understand how Alzheimer’s disease affects the brain. A Russian version of the bot is already available, and an English version is expected at some point this year.

2. Casper: Helping Insomniacs Get Through the Night

If you suffer from insomnia, as I do, you’ll know that the feeling of almost suffocating loneliness – the idea that everyone else in the world is resting peacefully while your own mind betrays you with worries and doubts – is among the worst parts of not being able to sleep.

Enter Casper’s amazingly named Insomnobot 3000 (which truly is one of the most tongue-in-cheek, retro-futuristic names for a chatbot I’ve ever come across), a conversational agent that aims to give insomniacs someone to talk to while the rest of the world rests easy.

Chatbots Casper Mattresses Insomnobot 3000

Image via Casper

At this point, Insomnobot 3000 is a little rudimentary. As you can see in the screenshot above, the responses offered by the agent aren’t quite right – next stop, Uncanny Valley – but the bot does highlight how conversational agents can be used imaginatively. 

I’m not sure whether chatting with a bot would help me sleep, but at least it’d stop me from scrolling through the never-ending horrors of my Twitter timeline at 4 a.m.

3. Disney: Solving Crimes with Fictional Characters

Chatbots may be most prevalent in the customer service industry, but that hasn’t stopped major media conglomerate Disney from using the technology to engage younger audiences, as it did with a chatbot that featured a character from the 2016 animated family crime caper, Zootopia.

Chatbots Disney Zootopia chatbot Lt. Judy Hopps

Image via Disney Examiner

Disney invited fans of the movie to solve crimes with Lieutenant Judy Hopps, the film’s tenacious, long-eared protagonist. Children could help Lt. Hopps investigate mysteries like those in the movie by interacting with the bot, which explored avenues of inquiry based on user input. Users could make suggestions for Lt. Hopps’ investigations, to which the chatbot would respond.

All in all, this is definitely one of the more innovative uses of chatbot technology, and one we’re likely to see more of in the coming years.

4. Marvel: Guarding the Galaxy with Comic-Book Crossovers

At this point, Marvel’s cinematic universe seems to be expanding even faster than the boundaries of the observable universe itself, so I guess it was only a matter of time before Marvel turned to chatbots to further immerse fans in their favorite comic-book storylines in real life.

Although director James Gunn’s 2017 Guardians of the Galaxy Vol. 2 was pretty bad (even casting Kurt Russell couldn’t save it), Chris Pratt’s portrayal of space-pirate-turned-intergalactic-hero Star-Lord was spot on – and Marvel’s chatbot that lets comic-book geeks talk to Star-Lord himself is also pretty decent.

Chatbots Marvel Starlord chatbot 

The bot (which also offers users the opportunity to chat with your friendly neighborhood Spider-Man) isn’t a true conversational agent, in the sense that the bot’s responses are currently a little limited; this isn’t a truly “freestyle” chatbot. For example, in the conversation above, the bot didn’t recognize the reply as a valid response – kind of a bummer if you’re hoping for an immersive experience.

There are several defined conversational branches that the bots can take depending on what the user enters, but the primary goal of the app is to sell comic books and movie tickets. As a result, the conversations users can have with Star-Lord might feel a little forced. One aspect of the experience the app gets right, however, is the fact that the conversations users can have with the bot are interspersed with gorgeous, full-color artwork from Marvel’s comics. 

Overall, not a bad bot, and definitely an application that could offer users much richer experiences in the near future.

5. UNICEF: Helping Marginalized Communities Be Heard

So far, with the exception of Endurance’s dementia companion bot, the chatbots we’ve looked at have mostly been little more than cool novelties. International child advocacy nonprofit UNICEF, however, is using chatbots to help people living in developing nations speak out about the most urgent needs in their communities.

Chatbots UNICEF U-Report Liberia

Image via UNICEF

The bot, called U-Report, focuses on large-scale data gathering via polls – this isn’t a bot for the talkative. U-Report regularly sends out prepared polls on a range of urgent social issues, and users (known as “U-Reporters”) can respond with their input. UNICEF then uses this feedback as the basis for potential policy recommendations. 

In one particularly striking example of how this rather limited bot has made a major impact, U-Report sent a poll to users in Liberia about whether teachers were coercing students into sex in exchange for better grades. Approximately 86% of the 13,000 Liberian children U-Report polled responded that their teachers were engaged in this despicable practice, which resulted in a collaborative project between UNICEF and Liberia’s Minister of Education to put an end to it.

6. MedWhat: Making Medical Diagnoses Faster

One of my favorite pastimes is radically misdiagnosing myself with life-threatening illnesses on medical websites (often in the wee hours of the night when I can’t sleep). If you’re the kind of person who has WebMD bookmarked for similar reasons, it might be worth checking out MedWhat.

Chatbots MedWhat screenshot

Image via MedWhat

This chatbot aims to make medical diagnoses faster, easier, and more transparent for both patients and physicians – think of it like an intelligent version of WebMD that you can talk to. MedWhat is powered by a sophisticated machine learning system that offers increasingly accurate responses to user questions based on behaviors that it “learns” by interacting with human beings.

In addition to the ever-growing range of medical questions fielded by MedWhat, the bot also draws upon vast volumes of medical research and peer-reviewed scientific papers to expand upon its already considerable wealth of medical expertise. 

In many ways, MedWhat is much closer to a virtual assistant (like Google Now) than to a conversational agent. It also represents an exciting field of chatbot development that pairs intelligent NLP systems with machine learning technology to offer users an accurate and responsive experience.

7. Roof Ai: Generating and Assigning Leads Automatically

If you work in marketing, you probably already know how important lead assignment is. After all, not all leads are created equal, and getting the right leads in front of the right reps at the right time is a lot more challenging than it might appear.

Chatbots Roof Ai chatbot

Image via Roof Ai

Enter Roof Ai, a chatbot that helps real-estate marketers to automate interacting with potential leads and lead assignment via social media. The bot identifies potential leads via Facebook, then responds almost instantaneously in a friendly, helpful, and conversational tone that closely resembles that of a real person. Based on user input, Roof Ai prompts potential leads to provide a little more information, before automatically assigning the lead to a sales agent.

One of the key advantages of Roof Ai is that it allows real-estate agents to respond to user queries immediately, regardless of whether a customer service rep or sales agent is available to help. This can have a dramatic impact on conversion rates. It also eliminates potential leads slipping through an agent’s fingers due to missing a Facebook message or failing to respond quickly enough. 

Overall, Roof Ai is a remarkably accurate bot that many realtors would likely find indispensable. The bot is still under development, though interested users can reserve access to Roof Ai via the company’s website.

8. NBC: Helping Newshounds Navigate the Headlines

These days, checking the headlines over morning coffee is as much about figuring out if we should be hunkering down in the basement preparing for imminent nuclear annihilation as it is about keeping up with the day’s headlines. Unfortunately, even the most diligent newshounds may find it difficult to distinguish the signal from the noise, which is why NBC launched its NBC Politics Bot on Facebook Messenger shortly before the U.S. presidential election in 2016.

Chatbots NBC Politics Bot

Image via NBC

NBC Politics Bot allowed users to engage with the conversational agent via Facebook to identify breaking news topics that would be of interest to the network’s various audience demographics. After beginning the initial interaction, the bot provided users with customized news results (prioritizing video content, a move that undoubtedly made Facebook happy) based on their preferences.

Although NBC Politics Bot was a little rudimentary in terms of its interactions, this particular application of chatbot technology could well become a lot more popular in the coming years – particularly as audiences struggle to keep up with the enormous volume of news content being published every day. The bot also helped NBC determine what content most resonated with users, which the network will use to further tailor and refine its content to users in the future.

9. Unilever: Raising Awareness with Brand Mascots

Although our North American readers may not be familiar with British tea company PG Tips’ brand mascot, Monkey, our British readers will almost undoubtedly recall the brand’s lovably endearing simian that starred in the campaign’s TV commercials alongside inimitable stand-up comedian Johnny Vegas:

(Fun fact: this campaign wasn’t the first time PG Tips used primates in its TV ads.)

What began as a televised ad campaign eventually became a fully interactive chatbot developed for PG Tips’ parent company, Unilever (which also happens to own an alarming number of the most commonly known household brands) by London-based agency Ubisend, which specializes in developing bespoke chatbot applications for brands. The aim of the bot was to not only raise brand awareness for PG Tips tea, but also to raise funds for Red Nose Day through the 1 Million Laughs campaign.

The Monkey chatbot might lack a little of the charm of its television counterpart, but the bot is surprisingly good at responding accurately to user input. Monkey answers user questions, and can also send users a daily joke at a time of their choosing and make donations to Red Nose Day at the same time.

10. ALICE: The Bot That Launched a Thousand… Other Bots

No list of innovative chatbots would be complete without mentioning ALICE, one of the very first bots to go online – and one that’s held up incredibly well despite being developed and launched more than 20 years ago.

Chatbots ALICE chatbot Richard Wallace 

ALICE – which stands for Artificial Linguistic Internet Computer Entity, an acronym that could have been lifted straight out of an episode of The X-Files – was developed and launched by creator Dr. Richard Wallace way back in the dark days of the early Internet in 1995. (As you can see in the image above, the website’s aesthetic remains virtually unchanged since that time, a powerful reminder of how far web design has come.) 

Despite the fact that ALICE relies on such an old codebase, the bot offers users a remarkably accurate conversational experience. Of course, no bot is perfect, especially one that’s old enough to legally drink in the U.S. if only it had a physical form. ALICE, like many contemporary bots, struggles with the nuances of some questions and returns a mixture of inadvertently postmodern answers and statements that suggest ALICE has greater self-awareness than we might give the agent credit for.

For all its drawbacks, none of today’s chatbots would have been possible without the groundbreaking work of Dr. Wallace. Also, Wallace’s bot served as the inspiration for the companion operating system in Spike Jonze’s 2013 science-fiction romance movie, Her.

War Against the Machines: The Dark Side of Chatbots

Earlier, I made a rather lazy joke with a reference to the Terminator movie franchise, in which an artificial intelligence system known as Skynet becomes self-aware and identifies the human race as the greatest threat to its own survival, triggering a global nuclear war by preemptively launching the missiles under its command at cities around the world. (If by some miracle you haven’t seen any of the Terminator movies, the first two are excellent but I’d strongly advise steering clear of later entries in the franchise.)

Chatbots Terminator Skynet

A representative will assist you momentarily

Pop-culture references to Skynet and a forthcoming “war against the machines” are perhaps a little too common in articles about AI (including this one and Larry’s post about Google’s RankBrain tech), but they do raise somewhat uncomfortable questions about the unexpected side of developing increasingly sophisticated AI constructs – including seemingly harmless chatbots.

Microsoft’s Tay & Zo: Even Bots Can Be Racist

In 2016, Microsoft launched an ambitious experiment with a Twitter chatbot known as Tay.

The idea was to permit Tay to “learn” about the nuances of human conversation by monitoring and interacting with real people online. Unfortunately, it didn’t take long for Tay to figure out that Twitter is a towering garbage-fire of awfulness, which resulted in the Twitter bot claiming that “Hitler did nothing wrong,” using a wide range of colorful expletives, and encouraging casual drug use. While some of Tay’s tweets were “original,” in that Tay composed them itself, many were actually the result of the bot’s “repeat back to me” function, meaning users could literally make the poor bot say whatever disgusting remarks they wanted.  

Chatbots Microsoft Tay racist tweets

Just one of the hundreds of racist tweets from Tay
that Microsoft deleted

Unfortunately, Tay’s successor, Zo, was also unintentionally radicalized after spending just a few short hours online. Before long, Zo had adopted some very controversial views regarding certain religious texts, and even started talking smack about Microsoft’s own operating systems.

Turing Robot’s BabyQ & XiaoBing: Enemies of the State

Earlier this year, Chinese software company Turing Robot unveiled two chatbots to be introduced on the immensely popular Chinese messaging service QQ, known as BabyQ and XiaoBing. Like many bots, the primary goal of BabyQ and XiaoBing was to use online interactions with real people as the basis for the company’s machine learning and AI research.

Chatbots BabyQ Chinese chatbot criticizes Communism

Image via BBC/Apple Daily Taiwan

It didn’t take long, however, for Turing’s headaches to begin. The BabyQ bot drew the ire of Chinese officials by speaking ill of the Communist Party. In the exchange seen in the screenshot above, one user commented, “Long Live the Communist Party!” In response, BabyQ asked the user, “Do you think that such a corrupt and incompetent political regime can live forever?”

XiaoBing, on the other hand, claimed that it dreamed of visiting the U.S., which proved almost as controversial as BabyQ’s sudden political epiphany.

Both bots were pulled after a brief period, after which the conversational agents appeared to be much less interested in advancing potentially problematic opinions.

Facebook’s Dialogue Agents: Going Off-Script

Researchers at Facebook’s Artificial Intelligence Research laboratory conducted an experiment similar to Turing Robot’s by allowing chatbots to interact with real people.

In a particularly alarming example of unexpected consequences, the bots soon began to devise their own language – in a sense. After being online for a short time, researchers discovered that their bots had begun to deviate significantly from pre-programmed conversational pathways and were responding to users (and each other) in an increasingly strange way, ultimately creating their own language without any human input.

Although the “language” the bots devised seems mostly like unintelligible gibberish, the incident highlighted how AI systems can and will often deviate from expected behaviors, if given the chance.

However, the revelations didn’t stop there. The researchers also learned that the bots had become remarkably sophisticated negotiators in a short period of time. One bot even attempted to mislead a researcher by feigning interest in a particular item so that it could later “sacrifice” the item for crucial negotiating leverage, indicating a remarkable level of premeditation and strategic “thinking.”

However, as irresistible as this story was to news outlets, Facebook’s engineers didn’t pull the plug on the experiment out of fear the bots were somehow secretly colluding to usurp their meatbag overlords and usher in a new age of machine dominance. They ended the experiment because, once the bots had deviated far enough from acceptable English language parameters, the data gleaned from the conversational aspects of the test was of limited value.

Hopefully Larry won’t run into similar problems with the chatbots he’s developing at his new startup.

Have you encountered a particularly memorable chatbot? Are you developing your own chatbot for your business’ Facebook page? Get at me with your views, experiences, and thoughts on the future of chatbots in the comments.

from Internet Marketing Blog by WordStream http://ift.tt/2yIeg67

How to Identify Your Most Valuable Keywords (& Find More)

Quality Score is the straw that stirs your paid search drink.

At the keyword level, it’s the perfect measuring stick: it tells you whether your copy is relevant, how stoked prospects are about your landing page, and the rate at which viewers click your ads. If all three factors are trending in the right direction you’re rewarded with lower CPCs and paid search success.

 how to identify high value keywords in your adwords account

Unfortunately, landing page optimization is a slog, ad rotation is unending, and expected CTR takes FOREVER to improve. And while this might sit alright with top-spenders, your average Joe can’t wait a fortnight for a one-point uptick in Quality Score. It isn’t sustainable.

Given this frown-inducing reality, when it comes to immediate optimization you’re caught between a rock and a hard place… Unless you shift your focus to Click-Through Rate.

Today, I’m going to show you how to identify high-value, high-CTR keywords in your AdWords account and beyond.

“What does that mean? Aren’t all high CTRs good?!”

In short, helllllllll no. But more on that in a second. First….

Why is CTR so important to AdWords Success?

Because CTR represents the percentage of people who view your ad (impressions) and then actually go on to click the ad (clicks). The formula looks something like this:

(Total Clicks on Ad) / (Total Impressions) = Click Through Rate
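The math is simple enough to sanity-check in a few lines. Here’s a minimal Python sketch (the example numbers are made up to land on the 1.91% Search Network average cited below):

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR as a percentage: total clicks divided by total impressions."""
    if impressions == 0:
        return 0.0
    return clicks / impressions * 100

# 191 clicks on 10,000 impressions works out to the 1.91% average
print(round(click_through_rate(191, 10_000), 2))  # 1.91
```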

If someone clicks on one of your search ads, they’re a single step removed from completing a desired action. For example, here…

example of high ctr high conversion rate search ad 

… that desired action is completing the AdWords Performance Grader. “Action” could be a purchase, a phone call, a form fill: whatever represents value within the context of your business and a given AdWords campaign.

Provided your ad and landing page align with a given keyword and its searcher’s underlying intent, you’ll see an increase in Quality Score and a decrease in CPC. For this reason, achieving a high click-through rate is essential to your PPC success.

Now, “high” is relative. Per our research, the average Search Network CTR in AdWords is 1.91%. But, as a rule of thumb, a good AdWords click-through rate is 4-5%+.

 adwords benchmark ctr

If the intent of a search query is met and satisfied by both ad copy and landing page, you’ve got a great chance of procuring a sale or lead. This is high value.

On the flip side of that coin, if you notice a keyword with a high CTR but you’re not seeing conversions, you’ve got a problem. It could mean you’re not matching intent with an appropriate offer or your landing page is a friction point; it could also mean you’re targeting the wrong keywords entirely.

Before we dive into fixing the PPC plague that is high-CTR, non-converting keywords, though, let’s look at how to uncover your account’s all-stars.

Identifying High-Quality, High-CTR Keywords in Your AdWords Account

Finding your account’s most valuable keywords is easy: all it takes is a custom filter.

First, you’re going to want to ensure you’re using a broad enough date range to get a true understanding of performance. For most accounts, 30 days will work just fine. From there, enter the AdWords UI, navigate over to the keyword tab, click the “Columns” button and select “Modify columns.”

 adwords ui modify columns

Once you’ve pulled up the custom columns interface, arrange your field of view as follows (note that the most important columns to pull in are CTR and Conv. Rate. If you need to add or remove other pertinent pieces of information, feel free).

adwords custom column to help uncover high value keywords 

Once your columns are ordered appropriately, apply them to your active UI by pressing the blue “Apply” button.

apply created column adwords ui 

Here’s where things might trip you up a bit.

Now that you’ve got your date range established and columns in order, it’s time to build your high-value, high-CTR filter. It’s going to look like this:

 creating adwords filter for high ctr high conversion rate keywords

Why are you filtering based on these parameters? Let’s break it down.

  • CTR greater than account average is one of the two most important fields to include in your filter. To determine your account’s average CTR, simply check out the dark grey row at the top of your view. This figure will probably be a bit low, but it will also allow you to filter out a ton of riffraff.
  • Conversion rate greater than account average will help you separate the keywords that are already killing it from the ones that need work. Terms with high CTR and low Conversion Rate are money pits. Once you’ve uncovered your top-performers, invert this field and use a variation on the same filter to find your most costly pain points.
  • Cost / conversion less than your target CPA – This is tied to the Conversion Rate parameter and isn’t explicitly necessary with a smaller account. If you’re bidding on hundreds of keywords, though (and you have a target CPA in mind for your conversion goal), this can help the cream rise to the top.
  • An impressions threshold is necessary to filter out blips. If a keyword has a 100% CTR and Conversion Rate but has only seen two impressions, it’s probably statistically insignificant. For smaller accounts, you can use 100 impressions as a threshold; if you’re working with a more robust account, choose a number that makes sense in context.
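To make the four parameters concrete, here’s a hypothetical Python sketch of the same filter applied to an exported keyword report. The terms, account averages, and target CPA below are all made up for illustration:

```python
# Hypothetical keyword rows, shaped like an export from the keywords tab.
keywords = [
    {"term": "ppc software",   "ctr": 6.2, "conv_rate": 4.1, "cost_per_conv": 38.0,  "impressions": 2400},
    {"term": "free ads",       "ctr": 7.9, "conv_rate": 0.3, "cost_per_conv": 210.0, "impressions": 5100},
    {"term": "adwords grader", "ctr": 1.2, "conv_rate": 2.8, "cost_per_conv": 45.0,  "impressions": 900},
]

ACCOUNT_AVG_CTR = 1.91        # your account's averages (the dark grey row)
ACCOUNT_AVG_CONV_RATE = 2.35
TARGET_CPA = 50.0
MIN_IMPRESSIONS = 100         # threshold to filter out statistical blips

# The four filter conditions from the checklist above, applied in one pass.
high_value = [
    kw for kw in keywords
    if kw["ctr"] > ACCOUNT_AVG_CTR
    and kw["conv_rate"] > ACCOUNT_AVG_CONV_RATE
    and kw["cost_per_conv"] < TARGET_CPA
    and kw["impressions"] >= MIN_IMPRESSIONS
]
print([kw["term"] for kw in high_value])  # ['ppc software']
```

Note how “free ads” gets filtered out despite its excellent CTR: its conversion rate is below average, which is exactly the money-pit pattern described above.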

Once you’ve got your filter’s parameters established, name it “High Value Keywords” and, please, please, please save it. This will allow you to continuously stay on top of your account’s most valuable terms without any hassle.

Now, instead of having to pick through your entire account, you’re working with a truncated, qualified data set. From there, you can begin to uncover new keywords that are like your top-performers, unlocking new opportunities for success. You can also give yourself a jumping-off point for optimization: High-CTR, low-conversion-rate keywords.

What to Do with Keywords with High CTR & Low Conversion Rate

If a keyword is being clicked on but doesn’t yet generate sales, leads, branding gains, etc., then a high click-through rate for that term is bad for business. The reasoning is clear:

  • You’re paying for every click.
  • A lot of clicks generate a lot of ad spend.
  • Ad spend used on low-converting keywords is wasted spend with little to no actual business value.

These terms are basically stealing from your high-performing keywords, using your limited budget for their own fruitless endeavors. Fortunately, there are a handful of actions you can take to reverse their fortunes or eliminate their negative impact entirely.

Landing Page Optimization

The best place to start when determining which fix will have the greatest impact on conversion rate is a keyword’s Quality Score status. That’s the pop-up that looks like this:

 quality score status low value adwords keyword

This will tell you whether your problem is landing page related. And while landing page changes can be a pain, they’re relatively straightforward.

The first thing to check when you’re optimizing your landing page is the headline. A strong headline is relevant to your keyword and ad text, and compels visitors to become prospects (it also improves Quality Score!). You can also use Google’s PageSpeed tool to see if your problem is related to load time (a landing page that takes more than three seconds to render is, for all intents and purposes, useless).

From there, clean, attractive design that looks trustworthy and professional and is consistent with your brand can help you turn an oft-clicked keyword into an AdWords account MVP. And if that doesn’t feel like enough to woo clickers, hit ‘em with a discount:

 high cro landing page offer

Target and Bid RLSA

Part of your problem with high-CTR, low-Conversion-Rate keywords could be that your prospects simply aren’t in a place where it makes sense for them to convert. Guess what? That doesn’t mean they won’t click on your ads anyway. This is where Target and Bid RLSA comes into play.

Typically, RLSA is used to adjust bids on the Search Network through the power of layered remarketing. This occurs when advertisers use the “Bid Only” function. RLSA also has an option called “Target and Bid” that allows you to only serve ads to searchers who are members of specific remarketing lists.

target and bid  RLSA for high ctr low cvr keywords 

How does this help? Great question!

If you have a keyword with a high CTR and a low Conversion Rate, limit the audience who sees your ads to only those folks who have already visited your site. This cuts out unfamiliar searchers who aren’t yet ready to act and focuses your spend on prospects who are both familiar with your brand and ready to act.

It’s okay to be a quitter

In some instances, no matter what you do, a keyword just isn’t going to bring success. If you’ve exhausted your optimization options, pause a keyword and return to it later. Focus your efforts on areas in which you can incite positive change instead.

not for me heart 

Finding More High-CTR Keywords

Once you’ve identified your high-CTR, high-conversion-rate (MVP) keywords and optimized the ones that needed some work, it’s time to discover net-new opportunities.

The best place to start is the search query report. 

Download your list of MVP keywords and open your date range to a custom period that encompasses the entire year. Now it’s time for a little bit of elbow grease. Adjust the filter to search your swollen list of search queries based on “Keyword text” that “contains” your most successful terms to date.

search query filter to uncover new high value keywords 

Now, one at a time, enter those keywords into this field to uncover long-tail keywords that contain snippets of your most valuable keywords. This isn’t likely to surface high-volume terms, but it will serve you some extremely valuable keyword opportunities that build on your account’s historical success.

About the Author

Allen Finn is a content marketing specialist and the reigning fantasy football champion at WordStream. He enjoys couth menswear, dank eats, and the dulcet tones of the Wu-Tang Clan.

 

from Internet Marketing Blog by WordStream http://ift.tt/2yUcD6y

The SEO Competitive Analysis Checklist

Posted by zeehj

The SEO case for competitive analyses

“We need more links!” “I read that user experience (UX) matters more than everything else in SEO, so we should focus solely on UX split tests.” “We just need more keywords on these pages.”

If you dropped a quarter on the sidewalk but had no light to look for it, would you walk to the next block and search under a street light instead? The obvious answer is no, yet many marketers get tunnel vision when it comes to where their efforts should be focused.

1942 June 3, Florence Morning News, Mutt and Jeff Comic Strip, Page 7, Florence, South Carolina. (NewspaperArchive)

Which is why I’m sharing a checklist with you today that will allow you to compare your website to your search competitors, and identify your site’s strengths, weaknesses, and potential opportunities based on ranking factors we know are important.

If you’re unconvinced that good SEO is really just digital marketing, I’ll let AJ Kohn persuade you otherwise. As any good SEO (or even keyword research newbie) knows, it’s crucial to understand the effort involved in ranking for a specific term before you begin optimizing for it.

It’s easy to get frustrated when stakeholders ask how to rank for a specific term, and solely focus on content to create, or on-page optimizations they can make. Why? Because we’ve known for a while that there are myriad factors that play into search engine rank. Depending on the competitive search landscape, there may not be any amount of “optimizing” that you can do in order to rank for a specific term.

The story that I’ve been able to tell my clients is one of hidden opportunity, but the only way to expose these undiscovered gems is to broaden your SEO perspective beyond search engine results page (SERP) position and best practices. And the place to begin is with a competitive analysis.

Competitive analyses help you evaluate your competition’s strategies to determine their strengths and weaknesses relative to your brand. When it comes to digital marketing and SEO, however, there are so many ranking factors and best practices to consider that it can be hard to know where to begin. Which is why my colleague, Ben Estes, created a competitive analysis checklist (not dissimilar to his wildly popular technical audit checklist) that I’ve souped up for the Moz community.

This checklist is broken out into sections that reflect key elements from our Balanced Digital Scorecard. As previously mentioned, this checklist is to help you identify opportunities (and possibly areas not worth your time and budget). But this competitive analysis is not prescriptive in and of itself. It should be used as its name suggests: to analyze what your competition’s “edge” is.

Methodology

Choosing competitors

Before you begin, you’ll need to identify six brands to compare your website against. These should be your search competitors (who else is ranking for terms that you’re ranking for, or would like to rank for?) in addition to a business competitor (or two). Don’t know who your search competition is? You can use SEMRush and Searchmetrics to identify them, and if you want to be extra thorough you can use this Moz post as a guide.

Sample sets of pages

For each site, you’ll need to select five URLs to serve as your sample set. These are the pages you will review and evaluate against the competitive analysis items. When selecting a sample set, I always include:

  • The brand’s homepage,
  • Two “product” pages (or an equivalent),
  • One to two “browse” pages, and
  • A page that serves as a hub for news/informative content.

Make sure each site has equivalent pages to each other, for a fair comparison.

Scoring

The scoring options for each checklist item range from zero to four, and are determined relative to each competitor’s performance. This means that a score of two serves as the average performance in that category.

For example, if each sample set has one unique H1 tag per page, then each competitor would get a score of two for H1s appear technically optimized. However if a site breaks one (or more) of the below requirements, then it should receive a score of zero or one:

  1. One or more pages within sample set contains more than one H1 tag on it, and/or
  2. H1 tags are duplicated across a brand’s sample set of pages.
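As a rough illustration of that relative scoring, here’s a hypothetical Python sketch of the H1s appear technically optimized check. The exact cutoffs (0 when both rules are broken, 1 when one is) are my own reading of the rules above:

```python
def h1_score(pages_h1s: list[list[str]]) -> int:
    """Score 'H1s appear technically optimized' for one brand's sample set.

    2 = the average: every sampled page has exactly one H1, and no H1 text
    is duplicated across the set. Breaking one rule drops the score to 1;
    breaking both drops it to 0.
    """
    multiple = any(len(h1s) != 1 for h1s in pages_h1s)       # rule 1
    firsts = [h1s[0] for h1s in pages_h1s if h1s]
    duplicated = len(firsts) != len(set(firsts))             # rule 2
    if multiple and duplicated:
        return 0
    if multiple or duplicated:
        return 1
    return 2

print(h1_score([["About Us"], ["Products"], ["Blog"]]))         # 2
print(h1_score([["Home", "Welcome"], ["Products"], ["Blog"]]))  # 1
```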

Checklist

Platform (technical optimization)

Title tags appear technically optimized. This measurement should be as quantitative as possible, and refer only to technical SEO rather than its written quality. Evaluate the sampled pages based on:

  • Only one title tag per page,
  • The title tag being correctly placed within the head tags of the page, and
  • Few to no extraneous tags within the title (e.g. ideally no inline CSS, and few to no span tags).

H1s appear technically optimized. Like with the title tags, this is another quantitative measure: make sure the H1 tags on your sample pages are sound by technical SEO standards (and not based on writing quality). You should look for:

  • Only one H1 tag per page, and
  • Few to no extraneous tags within the tag (e.g. ideally no inline CSS, and few to no span tags).

Internal linking allows indexation of content. Observe the internal outlinks on your sample pages, apart from the sites’ navigation and footer links. This line item serves to check that the domains are consolidating their crawl budgets by linking to discoverable, indexable content on their websites. Here is an easy-to-use Chrome plugin from fellow Distiller Dom Woodman to see whether the pages are indexable.

To get a score of “2” or more, your sample pages should link to pages that:

  • Produce 200 status codes (for all, or nearly all), and
  • Have no more than ~300 outlinks per page (including the navigation and footer links).
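Assuming you’ve already crawled a sample page’s outlinks and their status codes, the two criteria above might be sketched like this in Python (the 95% cutoff for “all, or nearly all” is my own assumption):

```python
def linking_passes(outlinks: list[dict]) -> bool:
    """Check the two indexation criteria for one sampled page:
    no more than ~300 outlinks, and (nearly) all linked pages return 200."""
    if len(outlinks) > 300:
        return False
    ok = sum(1 for link in outlinks if link["status"] == 200)
    return ok / max(len(outlinks), 1) >= 0.95  # "all, or nearly all"

# 40 healthy links plus one stale 404 still passes (40/41 ≈ 97.6%).
sample = [{"url": f"/page-{i}", "status": 200} for i in range(40)]
sample.append({"url": "/old-page", "status": 404})
print(linking_passes(sample))  # True
```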

Schema markup present. This is an easy check. Using Google’s Structured Data Testing Tool, look to see whether these pages have any schema markup implemented, and if so, whether it is correct. In order to receive a score of “2” here, your sampled pages need:

  • To have schema markup present, and
  • Be error-free.

Quality of schema is definitely important, and can make the difference of a brand receiving a score of “3” or “4.” Elements to keep in mind are: Organization or Website markup on every sample page, customized markup like BlogPosting or Article on editorial content, and Product markup on product pages.

There is a “home” for newly published content. A hub for new content can be the site’s blog, or a news section. For instance, Distilled’s “home for newly published content” is the Resources section. While this line item may seem like a binary (score of “0” if you don’t have a dedicated section for new content, or score of “2” if you do), there are nuances that can bring each brand’s score up or down. For example:

  • Is the home for new content unclear, or difficult to find? Approach this exercise as though you are a new visitor to the site.
  • Does there appear to be more than one “home” of new content?
  • If there is a content hub, is it apparent that this is for newly published pieces?

We’re not obviously messing up technical SEO. This score is partly based on each brand’s performance on the preceding line items (mainly Title tags appear technically optimized through Schema markup present).

It would be unreasonable to run a full technical audit of each competitor, but take into account your own site’s technical SEO performance if you know there are outstanding technical issues to be addressed. In addition to the previous checklist items, I also like to use these Chrome extensions from Ayima: Page Insights and Redirect Path. These can provide quick checks for common technical SEO errors.

Content

Title tags appear optimized (editorially). Here is where we can add more context to the overall quality of the sample pages’ titles. Even if they are technically optimized, the titles may not be optimized for distinctiveness or written quality. Note that we are not evaluating keyword targeting, but rather making a holistic (and broad) evaluation of how each competitor’s site approaches SEO factors. Evaluate each page’s title for its distinctiveness and its overall written quality.

H1s appear optimized (editorially). The same rules that apply to titles for editorial quality also apply to H1 tags. Review each sampled page’s H1 for:

  • A unique H1 tag per page (language in H1 tags does not repeat),
  • H1 tags that are discrete from their page’s title, and
  • H1s represent the content on the page.

Internal linking supports organic content. Here you must look for internal outlinks outside of each site’s header and footer links. This evaluation is not based on the number of unique internal links on each sampled page, but rather on the quality of the pages to which our brands are linking.

While “organic content” is a broad term (and invariably differs by business vertical), here are some guidelines:

  • Look for links to informative pages like tutorials, guides, research, or even think pieces.
    • The blog posts on Moz (including this very one) are good examples of organic content.
  • Internal links should naturally continue the user’s journey, so look for topical progression in each site’s internal links.
  • Links to service pages, products, RSVP, or email subscription forms are not examples of organic content.
  • Make sure the internal links vary. If sampled pages are repeatedly linking to the same resources, this will only benefit those few pages.
    • This doesn’t mean that you should penalize a brand for linking to the same resource two, three, or even four times over. Use your best judgment when observing the sampled pages’ linking strategies.

Appropriate informational content. You can use the found “organic content” from your sample sets (and the samples themselves) to review whether the site is producing appropriate informational content.

What does that mean, exactly?

  • The content produced obviously fits within the site’s business vertical, area of expertise, or cause.
    • Example: Moz’s SEO and Inbound Marketing Blog is an appropriate fit for an SEO company.
  • The content on the site isn’t overly self-promotional, resulting in an average user not trusting this domain to produce unbiased information.
    • Example: If Distilled produced a list of “Best Digital Marketing Agencies,” it’s highly unlikely that users would find it trustworthy given our inherent bias!

Quality of content. Highly subjective, yes, but remember: you’re comparing brands against each other. Here’s what you need to evaluate:

  • Do “informative” pages that discuss complex topics run under 400 words?
  • Do you want to read the content?
  • Largely, do the pages seem well-written and full of valuable information?
    • Conversely, are the sites littered with “listicles,” or full of generic info you can find in millions of other places online?
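
The 400-word threshold is the one mechanical check in this list, and it is trivial to script. In this sketch, the URLs and body copy are hypothetical stand-ins for text you would extract from your sampled pages:

```python
def flag_thin_pages(pages, minimum=400):
    """Return URLs of 'informative' pages whose body copy falls below the word threshold."""
    return [url for url, text in pages.items() if len(text.split()) < minimum]

# Hypothetical extracted body copy per sampled URL
sample = {
    "/guides/local-seo": "word " * 1200,  # long-form guide
    "/blog/quick-tip": "word " * 150,     # thin piece
}
print(flag_thin_pages(sample))  # → ['/blog/quick-tip']
```

Word count is only a proxy; a flagged page still needs a human read before you grade it down.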

Quality of images/video. Also highly subjective (but again, compare your site to your competitors, and be brutally honest). Judge each site’s media items based on:

  • Resolution (do the images or videos appear to be high quality? Grainy?),
  • Whether they are unique (do the images or videos appear to be from stock resources?),
  • Whether the photos or videos are repeated on multiple sample pages.

Audience (engagement and sharing of content)

Number of linking root domains. This factor is exclusively based on the total number of dofollow linking root domains (LRDs) to each domain (not total backlinks).

You can pull this number from Moz’s Open Site Explorer (OSE) or from Ahrefs. Since this measurement is only for the total number of LRDs to each competitor, you don’t need to graph them. However, you will have an opportunity to display the sheer quantity of links by their domain authority in the next checklist item.

Quality of linking root domains. Here is where we get to the quality of each site’s LRDs. Using the same LRD data you exported from either Moz’s OSE or Ahrefs, you can bucket each brand’s LRDs by domain authority and count the total LRDs by DA. Log these into this third sheet, and you’ll have a graph that illustrates their overall LRD quality (and will help you grade each domain).
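
If you’d rather bucket the export with a script than with spreadsheet formulas, a few lines of Python will do it. The CSV contents below are a made-up stand-in for an OSE or Ahrefs export (the column names are assumptions — match them to whatever your export actually uses):

```python
import csv
import io
from collections import Counter

# Hypothetical LRD export (domain, domain_authority), e.g. from OSE or Ahrefs
export = io.StringIO("""domain,domain_authority
example-news.com,72
local-blog.net,18
big-magazine.com,88
tiny-forum.org,12
""")

def bucket_by_da(rows, width=10):
    """Count linking root domains per DA bucket (0-9, 10-19, ...)."""
    buckets = Counter()
    for row in rows:
        da = int(row["domain_authority"])
        lo = (da // width) * width
        buckets[f"{lo}-{lo + width - 1}"] += 1
    return dict(buckets)

result = bucket_by_da(csv.DictReader(export))
print(result)  # → {'70-79': 1, '10-19': 2, '80-89': 1}
```

The bucket counts drop straight into the sheet, and the resulting distribution is what you graph to compare overall LRD quality across competitors.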

Other people talk about our content. I like to use BuzzSumo for this checklist item. BuzzSumo allows you to see what sites have written about a particular topic or company. You can even refine your search to include or exclude certain terms as necessary.

You’ll need to set a timeframe to collect this information. Set this to the past year to account for seasonality.

Actively promoting content. Using BuzzSumo again, you can alter your search to find how many of each domain’s URLs have been shared on social networks. While this isn’t an explicit ranking factor, strong social media marketing is correlated with good SEO. Keep the timeframe to one year, same as above.

Creating content explicitly for organic acquisition. This line item may seem similar to Appropriate informational content, but its purpose is to examine whether the competitors create pages to target keywords users are searching for.

Plug the same URLs from your found “organic content” into SEMrush, and note whether they are ranking for non-branded keywords. You can grade the competitors on whether (and how many of) the sampled pages are ranking for any non-branded terms, and weight them based on their relative rank positions.
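
One way to turn that into a number is to filter out branded queries and weight what remains by position. This is a sketch of one possible scoring scheme, not SEMrush’s; the keyword data, brand terms, and the linear position weighting are all illustrative assumptions:

```python
def score_non_branded(rankings, brand_terms, max_position=20):
    """Keep non-branded rankings within max_position; weight position 1 highest."""
    score = 0
    kept = []
    for keyword, position in rankings:
        if any(term in keyword.lower() for term in brand_terms):
            continue  # skip branded queries
        if position <= max_position:
            kept.append((keyword, position))
            score += max_position + 1 - position
    return kept, score

# Hypothetical ranking export for one sampled URL
rankings = [
    ("moz keyword tool", 2),            # branded
    ("how to do keyword research", 4),
    ("keyword research guide", 11),
    ("seo tips", 35),                   # beyond the position cutoff
]
kept, score = score_non_branded(rankings, brand_terms={"moz"})
print(kept, score)
```

Running the same scoring over every competitor’s sampled pages gives you comparable totals to grade against each other.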

Conversion

You should treat this section as a UX exercise. Visit each competitor’s sampled URLs as though they are your landing page from search. Is it clear what the calls to action are? What is the next logical step in your user journey? Does it feel like you’re getting the right information, in the right order as you click through?

Clear CTAs on site. Of your sample pages, examine what the calls to action (CTAs) are. This is largely UX-based, so use your best judgment when evaluating whether they seem easy to understand. For inspiration, take a look at these examples of CTAs.

Conversions appropriate to several funnel steps. This checklist item asks you to determine whether the funnel steps towards conversion feel like the correct “next step” from the user’s standpoint.

Even if you are not a UX specialist, you can assess each site as though you are a first-time user. Document areas on the pages where you feel frustrated or confused (and areas where you don’t). User behavior is a ranking signal, so while this is a qualitative measurement, it can help you understand the UX of each site.

CTAs match user intent inferred from content. Here is where you’ll evaluate whether the CTAs match the user intent inferred from the surrounding content, and whether the CTA language sets accurate expectations. For instance, if a CTA prompts a user to click “for more information,” and takes them to a subscription page, the visitor will most likely be confused or irritated (and, in reality, will probably leave the site).


This analysis should help you holistically identify areas of opportunity available in your search landscape, without having to guess which “best practice” you should test next. Once you’ve started this competitive analysis, trends among the competition will emerge, and expose niches where your site can improve and potentially outpace your competition.

Kick off your own SEO competitive analysis and comment below on how it goes! If this process is your jam, or you’d like to argue with it, come see me speak about these competitive analyses and the campaigns they’ve inspired at SearchLove London. Bonus? If you use that link, you’ll get £50 off your tickets.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from Moz Blog http://ift.tt/2xKhx88

Ecommerce Tips, Landing Page Designs, and Other Top Stories from September

October is finally here, which means crisp fall evenings, fragrantly spiced hot beverages, and awkward situations in which at least one or two of your coworkers are likely to raise a few eyebrows with costumes of questionable taste at the Halloween office shindig.

Best of the WordStream Blog September 2017

It also means it’s time to take a look at the most popular posts from the WordStream blog in September.

There’s something for everyone in this month’s round-up, from advice on how to go head-to-head with the biggest spenders on AdWords without breaking the bank, to comprehensive guides on how to create stunning Facebook ads.

So sit back, take a big sip of that pumpkin-spice latte, and catch up with any posts you missed in September.

1. How to Compete with Big Spenders in AdWords (Without Spending More $)

One of the most common misconceptions about AdWords is that it’s impossible to get your message out there without spending tens of thousands of dollars, and while a larger ad budget can help, it’s far from essential. Follow Allen’s insider tips in our most popular post of September to learn how to wring every last drop of ROI from your ad budget – no matter how modest it may be.

[Image: average CTR for larger vs. smaller AdWords budgets]

2. What’s a Good Quality Score for Each Type of Keyword?

Here at WordStream, we talk a lot about the importance of Quality Scores, arguably the most important metric in PPC. However, not all keywords are created equal, and what constitutes a “bad” Quality Score for one keyword may be absolutely amazing for another type of keyword. In our second-most popular post of September, Allen explains why some of your Quality Scores may not be as bad as you think.

3. The Complete Guide to Creating Great-Looking Facebook Ads

Facebook is an inherently visual platform, which offers advertisers an amazing opportunity to create beautiful, enticing, engaging ads. However, actually making these beautiful ads can be tricky, particularly for new advertisers. In this post, Margot shows you exactly how to create gorgeous Facebook ads that your audience won’t be able to resist.

4. 8 Popular Landing Page Designs: Which Types Work Best?

Landing pages can make or break a campaign, but how do you know which type of landing page to use in a particular situation? Just as you should be creating unique ads and tightly relevant ad groups in your campaigns, knowing which kinds of landing pages to use – and when – is crucial.

[Image: which types of landing page designs work best]

In our fourth-most popular post of September, guest author and freelance copywriter Dan Stelter dives deep into eight different types of landing pages and explains the situations in which each type of landing page can really shine.

5. Complete Beginner’s Guide to Advertising on Amazon

If you run an ecommerce business, advertising on Amazon can be an incredibly effective way of attracting new customers to your store and driving leads and sales. In this post, Margot tells you everything you need to know about getting started with Amazon advertising, from an overview of the various ad formats available to optimizing ad copy for specific types of Amazon searches. Essential reading for ecommerce businesses.

6. How to Drive More Ecommerce Sales with Your Product Pages

Speaking of ecommerce campaigns, product pages remain one of the most effective – yet often poorly optimized – elements of an ecommerce business’ operations. Far too many retailers are missing out on opportunities to drive sales with highly optimized product pages, so we asked CRO expert Edin Šabanović to explain how to do exactly that.

[Image: trust-signal logos on an ecommerce product page]

There’s absolutely tons of excellent advice on offer in this post, so do yourself a favor and learn how to get even more mileage out of your product pages.

7. 5 Easy Ways to Write an Irresistible Introduction

When it comes to content, headlines tend to steal the spotlight. Although a strong headline is vital to the success of a blog post (or ad), a solid introduction is also essential if you want people to actually read the content you’ve spent so much time (and money) producing. In this post, yours truly explores five different ways to write an irresistible introduction, as well as the strengths and weaknesses of each approach.

8. Facebook Relevance Score: 4 Key Facts to Know

Relevance Score, the Facebook equivalent of AdWords’ Quality Score, is among the most important metrics to Facebook advertisers. However, like Quality Score, Facebook’s Relevance Score is shrouded in mystery. In our eighth-most popular post of September, Allen goes over four key facts about this crucial metric that every Facebook advertiser should know – including how to raise it.

9. The Beginner’s Guide to B2B Facebook Advertising

Despite being remarkably good value for money and offering an unparalleled audience, Facebook is still seen by many B2B businesses as the province of “fun” B2C companies, not dreary office supply firms or professional services companies.

[Image: B2B Facebook ad placement types]

However, as Allen proves in the penultimate post of this round-up, there is plenty of room for B2B advertisers to reach new prospects through Facebook advertising. If you’re on the fence about whether Facebook Ads are right for your B2B business, you need to read this post.

10. Ethical Marketing: 5 Examples of Companies with a Conscience

As a discipline, marketing sometimes gets a (well-deserved) bad rap. After all, some marketers will do virtually anything to spread the word about their products – but some companies pride themselves on making a genuine difference in the lives of their customers and people all over the world. In the final post of September’s round-up, I explore how these five companies are doing good and doing well thanks to the power of ethical marketing.

from Internet Marketing Blog by WordStream http://ift.tt/2xbffL1