Friday, March 15, 2013

Why Google Analytics Tagging Matters - Whiteboard Friday

When Google Analytics doesn't know where a traffic source comes from, it assumes the traffic is direct and lumps it in with your direct visits. This happens frequently with social shares, as many of us make the mistake of not tagging our links accordingly.

In today's Whiteboard Friday, Rachael Gerson sheds some light on "dark social" and explains why tagging in Google Analytics improves the accuracy of your referrals. Take credit for the work that you're doing, and tag your links!


Video Transcription

"Hi, everyone. I'm Rachael Gerson. I'm the head of analytics at SEER Interactive. We're a digital marketing agency in Philadelphia, although we are growing and spreading across the world. Although we're primarily known for our SEO, we actually have an amazing paid search team and a really talented analytics team. I want to share our story with you. The timing on this story is actually really convenient because it ties with what I wanted to talk to you about.
My sister wrote a blog post last night. She has a new blog. No one ever goes to it. I think I may be the only person who knows it exists. She wrote the post. I read it this morning and went, "This is really good content. I'm going to share this." And I put it out on Twitter.
She saw me share it, and she put it on Facebook and thought, "Okay. Let's see what happens." In the last 8 hours, she's gotten 74,000 page views to this one blog post. I'm looking at the real-time traffic right now, down here. There are 1,500 people on the site. This thing is blowing up. It's going viral.
We can see it spreading through Twitter. We can see it spreading through Facebook. We can see it being referred by random sites, but we're also seeing a lot of traffic come in as direct. Since no one knows this blog exists, I highly doubt they're typing in the 40 plus characters of the URL to go directly to this page. They're not. It's being shared socially. This is the idea of dark social.
It's not a new idea, but it's a fascinating idea, and that's what I wanted to talk to you about today: this idea of dark social, that good content spreads socially, organically.
Dark social sounds like a bad thing. It's not. It's actually really awesome and really fun to dig into. Let's say that someone read this post earlier, and they shared it on Twitter, Facebook, whatever. We kind of know where that came from for the most part. They may have texted it to a friend or copied a link and sent it in chat. In both cases, when the person clicks on the link and goes to the site, they come in as direct.
Direct is Google Analytics' version of, "We have no idea what this is, so let's call it direct and throw it in that bucket." We know it's not direct. That's our dark, organic social. It's spreading organically in all different ways, and we're getting traffic because of it. It's pretty amazing.
I wanted to talk to you about the analysis I'm doing on the dark social side because it's really fun stuff. Unfortunately, in talking to a lot of people, I found they're not there yet.
Here's the problem. Direct is our catchall bucket, and we need to look at direct to get an idea of our dark social, organic social, whatever we want to call it. If things are not tagged properly, we can't dig into this dark social side. Actually, we can't do anything. If things aren't tagged properly, you're not taking credit for the work that you're doing.
For your paid search, for your social media, for email marketing, whatever it is, you have to tag your links. Otherwise, you're not getting credit for the work that you're doing.
You know what really sucks, by the way? When you work really hard on a project and, at the last second, your boss takes credit for it. That was your project. You did all the work for it. Why is he taking your credit? It sucks!
What we're talking about right now is the digital marketing version of that. It's the online version, where you're giving your credit away for the work that you're doing. Honestly, you need that credit to keep your budget, to keep your job, to get a promotion, to get any of these things. You need to prove your value.
When we talk about tagging, it's using UTM parameters. Dark social, organic social, that's really sexy. It's fun. We can dig into that. UTM parameters are not sexy. They're not fun, but they're necessary. If you're not doing this, you're wasting your time and you're wasting your money. Now that sucks.
How are you wasting your time? If you're not doing this, you're putting all kinds of time, hopefully, into analysis, if you're looking at what you're doing, but your analysis is based on data that's not accurate. You're putting your time into marketing efforts that may not actually be working as well as you think they are. You're putting your money into marketing efforts. You need to know that your stuff's actually working. Keep doing that. Make your well-informed decisions to help the business and drive it forward.
Again, time is money. You need to make sure you get all this stuff right, so you can do all the other stuff.
Let's talk about a few examples of where tagging actually matters. If we're looking at Twitter, if you don't tag your links, things will still come in. You'll see t.co showing up. In your real-time traffic, you'll see Twitter as social coming in, and you'll see some of that in your multi-channel funnels as well.
If you tag your links, you're going to always know it's Twitter. You're going to know which campaign it was. You're going to know all the information you put into it. You're also going to be protected from the other side of it. That's when people use Twitter apps. For example, HootSuite doesn't come in as Twitter unless you've tagged it. People clicking on an untagged link that you post through HootSuite will usually come in as a HootSuite referral.
If you posted on TweetDeck, they're coming in as direct. By the way, I'm still playing with all of this, and it all changes. I've played with stuff that's changed before. So if this is different by the time it comes out, I apologize. Just keep up with it all the time.
That's our Twitter side. On Facebook, if we don't tag our links, they'll come in as Facebook referral. It's nice and easy. It's clean. We know what it is. The exception to that is if someone's trying to open a link in Facebook, they click on the link, it doesn't load fast enough, they're probably going to click Open in Safari if they really care about it. Once they open in Safari, that's a direct visit. We just lost the Facebook tracking in it.
There's also a missing piece here: if you do tag this stuff, you get an extra level to your analysis. You can say, "This is all the same campaign. It's the same effort, same content." You can tie it together across all these different platforms, and that helps.
We get to email. If you're putting time and money into your email marketing, you want to take your credit for it. If you're not tagging your email, it's usually going to come in one of two ways: as a referral from one of the many webmail clients, or as direct.
At least with the mail referrals, where it says mail.yahoo.whatever, we know it's mail. We can't track it down to what you did versus what someone sent. We have some analysis on it. If it's direct, you lose everything. So tag your email.
Paid search. It's nice. AdWords actually makes it really easy for us to tag our paid search. We can connect Google Analytics and AdWords very easily, and they play really well together. It's awesome. The problem is when you don't tag your stuff. If you don't tag your paid search, either through AdWords or through your manual tracking parameters on other platforms as well, it comes in as organic.
This actually happened to us at SEER. One of our SEO clients, we were watching their traffic, and organic traffic spiked. The account manager went, "Hey, guys, this is awesome." To which the client responded, "Oh, we forgot to tell you we launched paid search," and the account manager discovered they weren't tagging their paid search. This paid search manager accidentally just gave away their credit. We don't want to have that happen.
Let's say you've actually tagged everything properly in your URLs. All this is done. These are just a few examples, but all of the other stuff is taken care of. Let's look at the tracking on the site itself. We see this happen pretty often with paid search landing pages, where we have to put this on our checklist that this is done immediately.
We'll create brand new landing pages that are optimized for paid search for conversion. They're different from the rest of the site. They're a totally new template, which means that if the Google Analytics code is in a template already for the site, it may not be in here. If we don't have someone add it back in, what's going to happen is paid search will drive all this traffic to the site, they'll get to that page, go to page two. Page two has the Google Analytics code, but they don't know where it came from. This is going to show up as direct. Paid search just gave away their credit. We can't have that happen. You worked too hard for that credit.
I've also seen it where people make little mistakes with the tracking on the site. Spotify did this a few months ago, and I sent them a message to help them out with it. They were tagging all of the links on their site with UTM parameters. When visitors would hit those different links, they'd reset the visit, and it would be a new visit with each one. Spotify, all their marketers were giving away their credit through that.
Let's say you've got all this other stuff right. Good job. That's awesome. There's still stuff that you can't control unfortunately. There are a lot of things that can cause traffic to come in as direct when it really isn't. I have a short list that people have been adding to at [bitly/direct-rome]. If you have others, keep adding them because I want to have a giant list of all the things we can tackle and fix, but the list just keeps growing.
If you look at mobile traffic, for example, iOS 6, we can't tell if it's search or if it's direct. That's a problem. For me, if I'm doing an analysis and I really need that part, or I really need to know that part for sure, I may cut that out so it's not throwing off my data. There are different ways to deal with that, and that's a whole other topic.
The point is control whatever you can. Where you control the spread of information, make sure you're doing your part. If you're sharing a link socially, tag your links. That way, if people want to share it or retweet it, the tracking is already in place there. If your posts on the site have social plugins, put the tracking in your social plugins too. It makes it easy if someone wants to hit the share on Facebook or to share on Twitter. It already has the tracking. It goes through, people get to the site, your tracking's in place, and you can breathe a sigh of relief.
Now once you've done everything else up here, your tagging is right on your URLs, your tracking is right on the site, there's nothing you messed up by accident, you've controlled everything you can with these other issues, you kind of have to accept what's left. You know that there's stuff that you can't account for. There's direct in there that may have been shared through a text, through a chat, through any other thing. You don't know where it actually came from.
First off, that gets a dark social. We can now start doing our awesome analysis, like dark social or other things, because we have confidence in our data. We can trust that we're making the right decisions for our business, and we can save our time and our money this way.
If you have questions or thoughts, hit me up on Twitter or in the comments below, because I love talking about this stuff. Maybe another time, we'll talk about this organic social idea."

Video transcription by Speechpad.com
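As a quick illustration of the tagging Rachael describes, here's a minimal Python sketch that appends UTM parameters to a link before you share it. The URL and campaign values are hypothetical; substitute your own.

```python
from urllib.parse import urlencode

def tag_link(url, source, medium, campaign):
    """Append Google Analytics UTM parameters to a URL."""
    params = urlencode({
        "utm_source": source,      # e.g. twitter, facebook, newsletter
        "utm_medium": medium,      # e.g. social, email, cpc
        "utm_campaign": campaign,  # your own campaign name
    })
    # Use '&' if the URL already carries a query string.
    sep = "&" if "?" in url else "?"
    return url + sep + params

tagged = tag_link("http://example.com/post", "twitter", "social", "spring-launch")
```

A link built this way will report as `twitter / social` in Google Analytics no matter which app or client the click came through.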



Thursday, March 14, 2013

Creativity, Serendipity, and Championing B2B: An Interview with Adriel Sanchez

I sat down to talk with Adriel Sanchez, Sr. Director of Demand Generation at SAP. Every day, he digs into how he can help B2B businesses and their marketers with database marketing, telemarketing, digital marketing, and campaign management services. Adriel currently leads a demand generation hub of 70+ people to support SAP's Latin America business. You can find him on Twitter @Adriel_S or blogging at Marketing'pfft!

What's inspired you lately?

Recently, I had the privilege of joining 60 other top marketing execs from Fortune 500 companies for a 1.5 day private event in NYC. If you can't get inspired with that many smart people in a room, you're either in the wrong business, or you're a zombie. We touched on a lot of topics, but all centered around driving a culture of creativity and innovation.

Some key takeaways for me? Too many people and resources can actually stifle creativity, and when innovating, think rapid prototyping and make failure an option. You need a lot of bad ideas to get to a good one.

As a champion of B2B, you often rally against the notion that B2B is "boring" compared to B2C. What are some of your favorite B2B companies doing great inbound marketing out there?

Our CMO at SAP, Jonathan Becher, likes to say that "Big glass buildings don't buy software. People do..." Tragically, there aren't a heck of a lot of B2B marketing examples out there that break the 'B to Boring' stigma. B2B marketers need to 'bring the sexy back.' (Though I'm not sure we ever had it.)

That said, I love Adobe's Metrics Not Myths campaign. Another classic example (though by a brand that's no longer in existence) is EDS' building planes in the sky ad. It also broke out of that B2B creative mold. SAP is doing quite a bit around sports and entertainment these days that is anything but 'boring.' We recently launched NBA.com/stats.

You wrote recently about humanizing your brand through kindness. What's a humanizing experience you've had with a brand or noticed from the outside?

I love what the current White House administration has done with their brand. Regardless of your political leanings, you can't deny that the administration focuses hard on managing that brand, from the President on down. Their response to the We The People's petition to build a Death Star was brilliant. And best of all, their approach is backed by data. The amount of testing that went into the 2012 campaign's email marketing program was unprecedented in any organization, public or private.

You moved in your career from being a direct marketer to a social media marketer. What do you want to bring from direct marketing into social media?

First, I wouldn't describe myself as a 'social media marketer.' Any successful marketer today needs to understand social and its impact on how people engage with each other and the companies they buy from.

But my biggest lesson from direct marketing was how to align my activities to business outcomes. In a world where only 3-5% of the people you contact actually buy something, you will lose a lot of money FAST unless you're laser-focused on business outcomes. I feel like some self-described 'digital' or 'social' marketers today lose sight of those business drivers.

Conversely, what direct marketing habit did you have to break to dive into social?

The majority of core direct marketing tenets remain true. That said, there are three areas where I've had to evolve as a marketer:

First, the 40/40/20 rule of list, offer, and creative now needs to include context. A perfectly good offer to a targeted audience may fail if it's presented in a way that breaks accepted norms for a particular channel.

Second, I've had to embrace Albert Einstein's quote, "Not everything that can be counted counts, and not everything that counts can be counted." This one isn't easy for any direct marketer, but social media permeates everything. Try to quantify its cost or how much money it's bringing in as a stand-alone activity and you'll wind up with an incomplete picture at best.

Lastly, while we know negative option offers will almost always outperform positive option in the short-term, in today's world we need to err on the side of explicit customer permission.

What's your favorite social media medium to engage in?

Definitely blogging. Writing thoughts 140 characters at a time is fun, but it just doesn't satisfy my appetite for writing. My favorite part about blogging is actually introspection. It helps me crystallize my point of view on a particular topic.

When you're brainstorming for great content ideas, what are some of your favorite research or creative flow sources?

Definitely non-traditional sources. Whether I'm watching a video on a cool new gadget or reading about a groundbreaking medical development, I'm always trying to tie it back to my day-to-day challenges. Creativity is often serendipitous. If you put yourself in an environment and mindset where creativity can flourish, you'll notice great ideas coming from the most unexpected places.

Whether it's Grumpy Cat or the Harlem Shake, crazy social trends have caught our eye, even in the B2B space. (Heck, at SEOmoz, we did our own Harlem Shake video.) But what's something you were shocked never caught on?

I love Axe body spray's "Nothing beats an astronaut" campaign and am pretty surprised the spoofs haven't come in droves. Would it kill someone to create a "Nothing beats a marketing executive" version?

What are some innovative ways that you've seen people get their entire staff involved in their social media efforts and content creation?

Finding people interested in social is the easy part. What's difficult is achieving a sustained commitment to contribute in a way that adds real value to the community. This stuff takes time. The "what's in it for me" principle is alive and well. Invest the time, and "I'll make ya famous." There's a company called EveryoneSocial with some interesting technology to help empower your entire workforce to be social media ambassadors.

If you're hiring for a social media manager, what are qualities that you'd look for?

Editorial background, above all else. I'd rather hire someone with a journalism degree for this than an MBA in marketing. Find someone who can write, with a 'punchy' attitude, and has their finger on the pulse of current trends, news, etc., and you have a winner. Social media best practices and the ins and outs of your company's products are easier to teach than these other core skills.

A lot of people want more metrics from social media. What are three of your favorite analytics tools and what do you use them for?

We use Netbase for social monitoring and listening. It's got excellent natural language processing that takes sentiment analysis beyond the basics. It also has really good multi-language capabilities that continue improving.

Then there's a great solution offered by NextPrinciples that allows us to audit our hundreds of social media accounts across the globe against key reach and engagement metrics. It's critical to controlling the proliferation of accounts that plagues most large companies.

Lastly, social media objectives must tie to your business KPIs. We use our own CRM to track leads and opportunities that flow from social.

If you could change one thing about the way we use social media, what would it be?

I recently heard a story about a client of a major consulting company who reviewed the Facebook profiles of the consultants being assigned to his business before approving them. The client asked that a few be replaced because of photos posted on their walls. This isn't just an anecdote anymore. What we share in social media is public by default. Every picture, post, point of view, opinion, indiscretion. You have to work really hard to keep what's private private. I think it's time to consider whether private by default is the better option. Google+ is headed in that direction.

Thank you so much, Adriel, for a look into your world. If you're interested in hearing more from him, he'll be talking at the upcoming SES NY conference on the Building the B2B Social Media Machine panel.



Wednesday, March 13, 2013

Behind the Scenes of Fresh Web Explorer

In each of the following sections, I'll talk about some of the hidden challenges in these seemingly innocuous stages of the pipeline, as well as the particular ways in which we've tackled them. To kick this process off, we begin with the primordial soup out of which this crawl emerges: our schedule of feeds to crawl.


Scheduling

Like you might expect on the web, a few domains are responsible for most of the feeds that we crawl. Domains like Feedburner and Blogspot come to mind, in particular. This becomes problematic in terms of balancing politeness with crawling in a reasonable timeframe. For some context, our goal is to crawl every feed in our index roughly every four hours, and yet some of these domains have hundreds of thousands of feeds. To make matters worse, this is a distributed crawl on several workers, and coordination between workers is severely detrimental to performance.

With job queues in general, it's important to strike a balance between too many jobs and jobs that take too long. Jobs sometimes fail and must be retried, but if a job represents too much work, a retry wastes a lot of it. Yet if there are too many jobs, the queueing system becomes inundated with the overhead of maintaining the state of the queues.

To allow crawlers to crawl independently and not have to coordinate page fetches with one another, we pack as many URLs from one domain as we can into a single job, subject to the constraint that it can be crawled in a reasonable amount of time (on the order of minutes, not hours). Fortunately, in the case of large domains, the intuition is that if they're sufficiently popular on the web, they can handle larger amounts of traffic. So we pack all these URLs into a handful of slightly larger-than-normal jobs to limit the parallelism, and as long as each worker obeys politeness rules, we get a close global approximation to politeness.
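The packing described above can be sketched roughly as follows. This is a simplified illustration, not our production scheduler; the fetch-time estimates and the `pack_jobs` helper are assumptions for the example.

```python
from collections import defaultdict
from itertools import islice
from urllib.parse import urlparse

def pack_jobs(urls, seconds_per_fetch=2.0, max_job_seconds=600):
    """Group URLs by domain, then split each domain's URLs into jobs
    small enough that one worker can crawl a job politely within
    max_job_seconds (minutes, not hours)."""
    per_job = max(1, int(max_job_seconds / seconds_per_fetch))
    by_domain = defaultdict(list)
    for url in urls:
        by_domain[urlparse(url).netloc].append(url)
    jobs = []
    for domain, domain_urls in by_domain.items():
        it = iter(domain_urls)
        while True:
            chunk = list(islice(it, per_job))
            if not chunk:
                break
            jobs.append((domain, chunk))
    return jobs
```

Because each job holds URLs from a single domain, workers never need to coordinate: one worker's per-domain politeness delay is the whole system's politeness for that domain, up to the handful of parallel jobs a very large domain gets.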

Deduping URLs

Suffice it to say, we're reluctant to recrawl URLs repeatedly. To that end, one of the stages of this pipeline is to keep track of and remove all the URLs that we've seen in the last few months. We intentionally kept the feed crawling stage simple and filter-free, and it just passes _every_ url it sees to the deduplication stage. As a result, we need to process hundreds of millions of URLs in a streaming fashion and filter as needed.

As you can imagine, simply storing a list of all the URLs we've seen (even normalized) would consume a lot of storage, and checking would be relatively slow. Even using an index would likely not be fast enough, or small enough, to fit on a few machines. Enter the bloom filter. Bloom filters are probabilistic data structures that allow you to relatively compactly store information about objects in a set (say, the set of URLs we've seen in the last week or month). You can't ask a bloom filter to list out all the members of the set, but it does allow you to add and query specific members.

Fortunately, we don't need to know all the URLs we've seen; we just need to answer the question: have we seen _this_ URL or _that_ URL? There are a couple of downsides to bloom filters: 1) they don't support deletions, and 2) they have a small false positive rate. The false positive rate can be controlled by allocating more space in memory, and we've limited ours to 1 in 100,000. In practice, it often turns out to be lower than that limit, but it's the highest rate we're comfortable with. To work around the inability to remove items from the set, we must resort to other tricks.
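To make the idea concrete, here's a minimal, self-contained bloom filter sketch (not our production code, which lives in Redis): k hash positions per item over a bit array, supporting add and membership queries but, as noted, no deletion.

```python
import hashlib

class BloomFilter:
    """Minimal bloom filter: each item sets num_hashes bits in a bit
    array. Membership queries can false-positive (all bits happen to
    be set), but never false-negative."""
    def __init__(self, num_bits=1 << 20, num_hashes=7):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8)

    def _positions(self, item):
        # Derive num_hashes independent positions by salting one hash.
        for i in range(self.num_hashes):
            digest = hashlib.md5(("%d:%s" % (i, item)).encode()).hexdigest()
            yield int(digest, 16) % self.num_bits

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))
```

The memory trade-off is visible in the constructor: more bits (and an appropriate number of hashes) means a lower false positive rate, which is how we hold ours under 1 in 100,000.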

We actually maintain several bloom filters; one for the current month, another for the previous month, and so on and so forth. We only add URLs that we've seen to the current month, but when filtering URLs out, we check each of the filters for the last _k_ months. In order to allow these operations to be distributed across a number of workers, we use an in-memory (but disk-backed) database called Redis and our own Python bindings for an in-Redis bloom filter, pyreBloom. This enables us to filter tens of thousands of URLs per second and thus, keep pace.
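The monthly rotation works roughly like this sketch. For clarity a plain Python `set` stands in for the real Redis-backed bloom filter (pyreBloom); the class and method names here are illustrative, not our actual API.

```python
class RotatingSeenSet:
    """One 'seen' filter per month: add to the current month's filter,
    query across the last k months, rotate monthly so the oldest month
    falls away. Rotation is the trick that works around bloom filters'
    lack of deletion."""
    def __init__(self, months=3, make_filter=set):
        self.make_filter = make_filter
        self.filters = [make_filter() for _ in range(months)]

    def add(self, url):
        # Only the current month's filter ever receives new URLs.
        self.filters[0].add(url)

    def seen(self, url):
        return any(url in f for f in self.filters)

    def rotate(self):
        # New month: start a fresh filter, drop the oldest one.
        self.filters = [self.make_filter()] + self.filters[:-1]
```

A URL therefore stays "seen" for k months after we last added it, and then quietly expires when its filter is rotated out.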

Crawling

We've gone through several iterations of a Python-based crawler, and we've learned a number of lessons in the process. This subject is enough to merit its own article, so if you're interested, keep an eye on the dev blog for an article on the subject.

The gist of it is that we need a way to efficiently fetch URLs from many sources in parallel. In practice for Fresh Web Explorer, this is hundreds or thousands of hosts at any one time, but at peak it's been on the order of tens of thousands. Your first instinct might be to reach for threads (and it's not a bad instinct), but they buy conceptual simplicity at the cost of a lot of inefficiency.

There are mechanisms for the ever-popular asynchronous I/O that are relatively well-known. Depending on what circles you travel in, you may have encountered some of them: Node.js, Twisted, Tornado, libev, libevent, etc. At their root, these all rely on two main kernel mechanisms: kqueue and epoll (depending on your system). The trouble is that these interfaces expose a callback style that can make it quite difficult to keep code concise and straightforward. A callback is a function you've written that you hand to a library to run when it's done with its processing. It's something along the lines of saying, 'fetch this page, and when you're done, run this function with the result.' While this doesn't always lead to convoluted code, it can all too easily lead to so-called 'callback hell.'

To our rescue comes threading's lesser-known cousin, coroutines, incarnated in gevent. We've tried a number of approaches, and in particular we've been burned by the aptly-named Twisted. Gevent has been the sword that cut the Gordian knot of crawling. Of course, it's not a panacea, and we've written a lot of code to make common crawling tasks easy: tasks like URL parsing and normalization, and robots.txt parsing. In fact, the Python bindings for qless even have a gevent-compatible mode, so we can keep our job code simple and still make full use of gevent's power.
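To show the coroutine style without callbacks, here's a sketch using the standard library's asyncio, which offers the same basic model as gevent (cooperative tasks that yield while waiting on I/O). The fetch is simulated with a sleep rather than a real HTTP request, and the function names are ours, not gevent's API.

```python
import asyncio

async def fetch(url, delay=0.01):
    # Stand-in for a real HTTP fetch; awaiting yields control to other
    # coroutines, just as waiting on a socket would.
    await asyncio.sleep(delay)
    return (url, 200)

async def crawl(urls, concurrency=100):
    # Bound how many fetches run at once, roughly like a gevent Pool.
    sem = asyncio.Semaphore(concurrency)

    async def bounded(url):
        async with sem:
            return await fetch(url)

    return await asyncio.gather(*(bounded(u) for u in urls))

results = asyncio.run(crawl(["http://example.com/%d" % i for i in range(500)]))
```

The code reads top-to-bottom like the threaded version would, yet a single thread multiplexes all 500 fetches; that straight-line readability is exactly what drew us to coroutines over callback-based frameworks.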

A few crawlers is actually all it takes to maintain steady state for us, but we've had periods where we wanted to accelerate crawling (for backlogs, or to recrawl when experimenting). By way of an example of the kind of power the coroutines offer, here are some of our crawl rates for various status codes scaled down to 10%. This graph is from a time when we were using 10 modestly-sized machines, and while maintaining politeness they sustain about 1250 URLs/second including parsing, which amounts to about 108 million URLs a day at a cost of about $1 per million. Of course, this step alone is just a portion of the work that goes into making Fresh Web Explorer.

Dechroming

There's a small amount of processing associated with our crawling (we parse the page, look at some headers, and so on), but the most interesting feature of this stage is the dechroming: trying to remove all the non-content markup in a page, from sidebars to headers to ads. It's a difficult task, and no solution will be perfect. Despite that, through numerous hours and great effort (the vast majority of which has been provided by our data scientist, Dr. Matt Peters), we have a reasonable approach.

Dechroming is an area of active research in certain fields, and there are certainly some promising approaches. Many of the earlier approaches (including that of blogscape from our tools section, Fresh Web Explorer's predecessor) relied on finding many examples from a given site, and then using that information to try to find the common groups of elements. This has the obvious downside of needing to be able to quickly and easily access other examples from any given site at any given time. Not only this, but it's quite sensitive to changes to website markup and changes in chrome.

Most current research focuses instead on differentiating chrome from content using a single example page. We actually began our work by implementing a couple of algorithms described in papers. Perhaps the easiest to understand conceptually is one that computes a distribution of the amount of text per block (which doesn't necessarily have a 1:1 correspondence with HTML tags) and then finds the clumps within that distribution. The intuition is that the main content tends to come in larger sequential blocks of text than, say, comments or sidebars. In the end, our approach ended up being a combination of several techniques, and you can find out more about it in our "dragnet" repo.
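As a toy version of that text-per-block intuition (much simpler than the published algorithms or what dragnet actually does), imagine you already have the text length of each block on a page; the content is roughly the longest run of text-heavy blocks. The function and threshold below are purely illustrative assumptions.

```python
def longest_content_run(block_lengths, threshold=40):
    """Toy dechroming heuristic: given the number of text characters in
    each successive block of a page, return the half-open (start, end)
    indices of the longest run of blocks above a length threshold.
    Short blocks (nav links, ads, sidebars) break the run."""
    best = (0, 0)
    start = None
    for i, n in enumerate(block_lengths):
        if n >= threshold:
            if start is None:
                start = i
            if (i + 1) - start > best[1] - best[0]:
                best = (start, i + 1)
        else:
            start = None
    return best
```

On a page whose blocks measure, say, a few short nav links, three long paragraphs, then scattered comment snippets, this picks out the paragraph run; real dechroming layers many more features (markup density, position, trained weights) on top of this idea.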


All told

Fresh Web Explorer has been in the works for a long while -- perhaps longer than I'd care to admit. It has been rife with obstacles overcome (both operational and algorithmic) and lessons learned. These lessons will be carried forward in subsequent iterations and future projects. There are many changes we'd like to make given this hindsight and of course we will. Refactoring and maintaining code is often more time-consuming than writing the original!

The feedback from our community has generally been positive so far, which is encouraging. Obviously we hope this is something that will not only be useful, but also enjoyable for our customers. The less-than-positive feedback has highlighted some issues of which we are aware, most of which are high on our priorities, and leaves us raring to go to make it better.

On many points here there are many equally valid approaches. While time and space don't permit us to present a complete picture, we've tried to pull out the most important parts. If there are particular questions you have about other aspects of this project or why we chose to tackle an issue one way or another, please comment! We're happy to field any thoughts you might have on the subject :)



Tuesday, March 12, 2013

The Guide to US Census Data for Local SEO

As tax time nears in the United States, it's hard not to wonder what exactly all that money is being spent on. Instead of getting into politics, I'd rather describe something our taxes do pay for and how it can help you plan an effective local SEO strategy.

During the daily grind, we can become accustomed to exclusive data available to us only through analytics platforms, Webmaster Tools accounts, and other resources requiring a username, a password, and a mother's maiden name. This private-access mentality makes it easy to overlook what is freely available to everyone, including our own Census. I'm Harris Schachter (you might know me better as OptimizePrime) and I'd like to show you not what you can do for your country, but what your country can do for you.

Uncle Sam's Data

All of the information and images presented from the US Census are free to reproduce unless otherwise noted.

Using Census Data When Planning Local Strategy

During the planning phase of a local strategy, you need to identify which specific localities will serve you best, whether it be local content, social media, community engagement (geographic community & company community), on-site optimization, off-site citation building, link building, or anything else that goes into local SEO.

By using census data, these viable hyper-local markets can be identified before you even publish a single tweet. You can plan micro-campaigns designed to match each of the various cities, counties, towns, or even city blocks in your selected location. This type of analysis is particularly important when considering where to open a new brick-and-mortar establishment.

Demographic data can guide everything from the language and reading level of your content, to the methods by which it should be distributed. Distinct personas for each of the geographic components can be made to help you visualize the potential customers within them. Once armed with this information, local strategies (including everything you are going to learn from GetListed+SEOmoz) can be applied with laser precision.

You can spend hours on Census.gov exploring the myriad databases and tables. It can be overwhelming, so I'll just demonstrate three of the most useful resources. If you're an international reader, let this guide serve to motivate you to seek out what is available through your government.

Since it's been cold lately, and Richmond has enough plaid and square glasses to rival Seattle, I'll use hipster snow boots as my example of a locally marketable product, targeting the 20-24 age group. I'll look for viable hyper-local markets in the Richmond area, since that's where I live and do most of my local SEO here at Dynamic Web Solutions. I'll go through each of the three Census tools using this scenario.

1. Interactive Population Map

First is the Interactive Population Map. With this interactive map, you're able to utilize population data at the most granular views. Currently the data is for 2010, but if you suspect a large population shift since the last data collection you can use proportions instead of volumes to make your observations. The image below shows counties, but you can view data at the following levels (from widest to most specific): national, Indian reservations, congressional districts, counties/municipalities, subdivisions, places, census tracts, census block groups, and census blocks (basically city blocks).

You can segment population data by age, race, ethnicity, and housing status, and compare these features to those of nearby locations.

How to use the Interactive Population Map:

  1. Head over to the map. Enter your location into the Find field, place your area of interest within the cross hairs, and use the on-screen controls to adjust the view and detail level.
  2. Choose any of the segmentation tabs, select a location, and click Compare.
  3. You can compare up to 4 locations to examine their demographics side by side. Once in the compare screen, you can flip between the tabs to view populations by age, race, ethnicity, or housing status for each of your chosen locations.

In my example, I chose Richmond City and the nearby counties of Henrico, Chesterfield, and Hanover. Since my hipster snow boots business isn't concerned with any specific ethnicity, race, or housing status, I'll flip over to age since I am primarily focused on the 20-24 age group.

From the table, I can see the city of Richmond has more people in my target demographic (20-24) than the three neighboring counties. Interesting.

2. County and Business Demographics Interactive Map

Next up is the County and Business Demographics Interactive Map (or CBD Map). This is similar to the interactive population map, but provides more robust information in addition to population, race, ethnicity, age/sex, and housing status. This map layers in three business demographics: industries, business patterns, and raw counts of establishments per industry.

Industries are the general market classifications, such as Accommodation and Food Services, Construction, Manufacturing, Health Care, Real Estate, etc. Business patterns contain data on annual payroll and employee counts (within a location or industry).

The CBD Map is limited to the county level, but the additional information makes it an essential tool to decide where to focus your marketing efforts. This map can display the number of establishments in each industry, in each location. The capability for local competitive analysis is priceless.

How to use the CBD Map:

  1. Head over to the map. Enter your city, state, or zip code into the Find field. It should automatically switch to the County view on the left (under Geographic Levels). Choose any of the top demographic tabs; any one will do for now.
  2. Select a location and click 'Compare' at the bottom of the window.
  3. In the new window that appears, click 'Add Topic' to choose your areas of interest.
  4. Once you have your topic areas chosen, go back to the map and select up to 4 more locations.

Going back to our cooler-than-snow snow boots business, I chose Retail Trade from Industries, 20-24 from Age/Sex, and Total Establishments from business patterns. In addition to Richmond City, I again picked the neighboring counties of Henrico, Chesterfield, and Hanover.

The goal while using the CBD map is to identify areas with large shares of your target demographic, but low business counts for your industry. This is a good indicator of areas with many potential customers, but low competition for them. Using the table, I can do some quick math to rank the four locations along these criteria. The comparison metric to use in this instance is number of (20-24 year old) people per retail trade establishment.

Richmond has 33, Chesterfield has 19, Henrico has 14, and Hanover has 15. Richmond has the greatest number of potential customers per establishment, suggesting comparatively low competition for retail store customers. Interesting.
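If you're comparing many locations, that quick math is easy to script. Here's a minimal Python sketch; the population and establishment counts are placeholders, chosen only so the ratios match the figures above (substitute the real counts from the CBD Map comparison table):

```python
# Rank locations by target-demographic population per retail establishment.
# NOTE: these counts are placeholder values, picked so the ratios match the
# article; pull the real figures from the CBD Map comparison table.
locations = {
    "Richmond":     {"pop_20_24": 33_000, "establishments": 1_000},
    "Chesterfield": {"pop_20_24": 19_000, "establishments": 1_000},
    "Henrico":      {"pop_20_24": 14_000, "establishments": 1_000},
    "Hanover":      {"pop_20_24": 15_000, "establishments": 1_000},
}

# Potential customers per competing establishment: higher suggests
# more demand and less competition.
ratios = {name: d["pop_20_24"] / d["establishments"] for name, d in locations.items()}

for name, ratio in sorted(ratios.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {ratio:.0f} people (20-24) per retail establishment")
```

The ranking, not the absolute numbers, is what matters here; any consistent pair of population and establishment figures will produce the same ordering.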

3. US Economic Census

The final data table is the Economic Census within the American FactFinder collection. This is the most powerful database of the three, and also the most complicated. The data contained here includes everything the interactive maps offer, but at a much more granular level. Specifically, industries can be further broken down by individual product or service, and by how many establishments offer them in any given area. This resource also contains a search bar, a familiar face in an unfamiliar environment.

The FactFinder database is rather complex, so I'll dive right into how to use it. Because this database is so detailed, the accessibility and recency of the data vary widely, so you may find more or less than I did.

How to use the Economic Census:

Step 1: Visit the American FactFinder database. Don't be tempted to use the search bar just yet.

Step 2: Program

  1. Choose the Topics tab.
  2. Select 'Economic Census.'

Step 3: Location

  1. Choose your location by selecting the Geographics tab. Use the 'Geographic type' dropdown, and pick your level of detail.
  2. Select State from the next dropdown. I've selected County and Virginia respectively.
  3. Pick your actual locations from the next dropdown.
  4. Use the 'Add to your selections' button to select your criteria.
  5. You'll see the chosen options in the left sidebar under 'Your Selections.'

Step 4: Industry

  1. Select your industry by finding the North American Industrial Classification System (NAICS) number under Industry Codes.
  2. Do this by using the search bar to find your business. This one is much more detailed than the industry selections of the Interactive Map, so try a few queries until you get a solid match.
  3. For my example, I first searched for 'boots' with no luck. I then tried 'shoes' and found the code 4482 for 'shoe stores.' Check off the applicable industry code and click 'Add.'
  4. Close this window to reveal your search results.

Step 5: Database Results

  1. First, review your selections in the left sidebar.
  2. Check off the source most applicable to you, and make sure it is the most recent version.
  3. Select View.

Step 6: Data!

Finally, we've got the goods. First of all, I should warn you not to use your browser's back button: all of your selections will be lost and the process will start over. Instead, take note of the 'Return to Advanced Search' button; use it if you want to go back to the search options.

Check out the data columns. Specifically, the most important are: geographic area, number of establishments, and sales.

Data Collected From the Economic Census

Due to the sheer number of search options, every research endeavor will be different. My results had three of the four locations in a data table, and the most recent data is from 2007. Immediately, we can see Chesterfield had 31 shoe stores, Henrico had 47, and the city of Richmond had 32. This is another good indicator for the city of Richmond, since it shows a relatively low number of shoe stores, and we already know it has the greatest volume of our target demographic.

Now let's look at the sales column for the total sales each location's shoe stores generated. Using our county population data from earlier, we can calculate how much the average person spends on shoes (customer value) in each location. Keep in mind the population numbers are from 2010 while the sales figures here are from 2007, but hey, we're just making estimations.

Sales, Population, and Businesses

Divide the sales figures by the total county population. I found the average person in Richmond to be worth $101, in Henrico $174, and in Chesterfield $85. Of the three locations, the average Henrico resident spends the most on shoes. But what about our target demographics of 20-24 year olds?

For this calculation, we'll take the percentage of each location's total population that falls within the target group and apply it to that location's sales. Although this assumes the different age groups purchase shoes at the same rate, it gives us an estimated percentage of sales contributed by our target demographic.

From the first analysis, we found the 20-24 year old group made up 13% of Richmond's population, 6% of Henrico's population, and 6% of Chesterfield's population. After applying these percentages to total shoe sales, we find our target demographic spending $2.7M in Richmond, $3.2M in Henrico, and $1.6M in Chesterfield.

At this point, it might seem wiser to go after Henrico County, since the target demographic spends the most on shoes there, in total. Given the sheer amount of money spent on shoes in that county, I might consider a separate strategy to attract Henrico's business.

However, keep in mind Henrico has 47 shoe stores, while Richmond has only 32 and Chesterfield has 31. Taking this competitive information into account, we can compute the sales generated by the target demographic for each store in each location. The data translates into $84k in sales per Richmond shoe store, $68k per Henrico store, and $52k per Chesterfield store. This suggests individual shoe stores in Richmond generate more sales from our target demographic than those in the other two nearby counties. In-ter-est-ing.
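To make that last step concrete, here's a small Python sketch using the estimated target-demographic spend and the store counts quoted above (the spend figures are themselves estimates derived from the Economic Census data):

```python
# Target-demographic (20-24) shoe spend, estimated earlier from the
# Economic Census sales figures, and shoe store counts per location.
target_spend = {"Richmond": 2_700_000, "Henrico": 3_200_000, "Chesterfield": 1_600_000}
shoe_stores = {"Richmond": 32, "Henrico": 47, "Chesterfield": 31}

# Sales generated by the target demographic per store in each location.
per_store = {city: target_spend[city] / shoe_stores[city] for city in target_spend}

for city, sales in per_store.items():
    # Richmond ~$84k, Henrico ~$68k, Chesterfield ~$52k
    print(f"{city}: ${sales / 1000:.0f}k per store")
```

Dividing by store count is what flips the ranking: Henrico has the biggest total spend, but spread across 47 stores it yields less per competitor than Richmond's 32.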

Analysis Results

After three rounds of analysis, Richmond looks like the ideal place to set up a shoe store (especially one that sells supa-fly snow boots to young adults).

So, what have we learned from all this? From the data available, I've found:

  1. Richmond has a greater volume of people in the target demographic than neighboring counties.
  2. Richmond has more potential customers within the target demographic per retail store than neighboring counties.
  3. A shoe store in Richmond generates more sales from the target demographic than a shoe store in a neighboring county.

Apply the Insights

Now that you've identified the most viable business locations, it's time to incorporate these findings into local strategies. Go after these promising localities by gaining relevancy and ranking through a variety of methods, including:

  • Hosting events in the chosen location to establish an audience.
  • Building inbound links from sites which rank well in/for the target area (and be sure to diversify these links).
  • Doing competitive analysis for the most visible websites in the locations uncovered by the analysis. Go through their backlink profiles for relevant links and try to attain them too.
  • Encouraging customers to leave reviews, with specific attention to people in the targeted areas. Include the reviewer's location in the review itself to gain more trust and influence among the potential customers.
  • Engaging with prospects in the identified locations through social media. Find them through various tools like Followerwonk's Twitter bio search and get the conversation going.
  • Creating content specific to the viable locations. Dedicate a section of your blog to things to do and see in the area and why you like doing business there, or interview citizens, government officials, and well-known residents. Publishing content about the area can gain you exposure well before your visitors are even looking for your products or services. Once aware of your business, they'll likely keep you in mind down the road.
  • Optimizing your content with traditional on-site methodologies for the locations uncovered in the analysis (but don't overdo it).
  • Developing press releases specifically for the target locations and distributing them to online sources like chambers of commerce, colleges and universities, local newspapers, free publications, etc.
  • Considering mobile users and making sure your site delivers a satisfactory experience for people in the targeted areas. Local and mobile go hand in hand.
  • Finally (and possibly the most effective in the long-run) is to consider opening a physical store within the location.
    • Claim all profiles and listings from data aggregators using consistent NAP citations.
    • Use consistent NAP citations on the website itself.
    • Consider including the name of the location in a brand name.
    • Utilize rich snippets to take full advantage of your new location in the SERPs.
    • Complete your Google+ Local page with proper categorization, and mentions of the location within the business description.
    • Modify social media profiles to include this new location.

I encourage you to explore Census.gov, and subscribe to the Census RSS feed to make sure you don't miss any of their interesting publications. They recently released mobile apps for the true geeks out there (I recommend the iPad app "America's Economy"). Also be sure to check out the data visualization gallery to learn something new or just to get some data vis inspiration.

So the next time a Census taker knocks on your door, answer it! You never know what type of product or business you'll be working with in the future, but chances are good that you'll have data for it.



Monday, March 11, 2013

Testing: Moving Our Industry Forward

Over the past few years, our industry has changed dramatically. We have seen some of the biggest removals of spam from the search results, and a growing number of people are starting to focus on building quality links rather than just building links. Companies are starting to really invest in content, and sites are building better category pages and are improving their product descriptions. Content marketing is now a "thing." A big thing.

However, while all these changes are great, it seems as if we have stopped testing in order to adopt new ideas. While I know there are many exceptions to this generalization, I see the trend too often. Most SEOs work off of best practices, and while this is good (who can argue with good page titles, headlines, copy, crawlable paths to content, and good links?), we need to continue to refine these areas for the best results.

A great example of this sort of refinement is ranking factors research. A few years back, SEOmoz did some testing around H1's vs. H2's and said that the H1 doesn't provide an added benefit. Whether or not you agree with this idea, this example shows how factors can (potentially) change over time.

Over the last few years, Google has rolled out updates that have had significant impact on search: the canonical tag, hreflang, and rich snippets/support for schema, just to name a few. While there have been tests on these updates, we need to continue to test and keep our knowledge current. Google is continually testing new things, and we need to rely on testing to keep up. For example, back when Search Quality Updates were a thing, Google would "share" the names and descriptions of updates and tests to the search algorithm. Frequently, there were 30-40 updates a month that they were rolling out or testing.

As you already know, this means there is huge potential for a high number of changes to the algorithm. We need to be testing (new things and old) to make sure we're staying current. 

Share your results

In addition to all the updates we are aware of, there is a lot that Google isn't telling us. This is what makes testing and sharing even more important. Barry Schwartz pointed out on Search Engine Roundtable that Google left some important items out of their August/September update. Further, there are updates that Google will deny. If it weren't for people carefully watching and analyzing the SERPs and then sharing their tools (like Dr. Pete's MozCast), we would probably be largely unaware of much of this activity.

If we don't share our observations after testing, we face two problems: first, we can't confirm and verify what we see (and believe), and second, we can't move our industry forward. While the SEO industry is evolving and SEO is gaining more widespread acceptance, it is still seen by many as a mystery and a dark art. By sharing our tests and results, we educate the industry as a whole and raise not only the bar, but also our collective reputation. If we can retire bad practices and low-value tactics, we bring more credibility to the industry.

Share your failures

We all want to conduct awesome, breakthrough tests; it's really exciting to learn new stuff. However, we have a tendency to share only our successes, rather than our failures. No one really wants to share failure, and it's natural to want to "save face" when your test doesn't go according to plan. But the fact remains that a test that "fails" isn't a failure.

There is so much we can learn from a test that doesn't go as expected (and sometimes we don't know what will happen). Further, sharing the "failed" results can lead to more ideas. Last week, I posted about 302s passing link equity. I began that test because my first test failed: I was trying to see if a page that was 302'd to another page would retain its rankings. It didn't work, and the page I was testing dropped out of the SERPs, but it was replaced with the page on the receiving end of the redirect. This result led me to test 302s against 301s. On top of that, there was a really good comment from Kane Jamison about further tests to run to gain a better understanding. If I hadn't shared my "failed" results, I would have never learned from my mistakes and gained knowledge where I least expected it.

Below are a few other tests I've run over the years that ended up with "failed" results. I hope you can learn as much from them as I did.

Keyword research with Adwords

For this test, I needed to provide a comparison of head vs. long-tail search volume related to tires. I had heard, at one point, that you could use AdWords impression data for keyword research. I decided to give it a try. I whipped up a rock-solid domain and set up a broad-match AdWords campaign.

Tires^4!

(People even signed up!)

It didn't work. While we got a lot of impressions, we couldn't access the data: nearly all of the impression data we wanted was lumped into a category called 'Other Search Terms.'

Lesson learned: Adwords impression data isn't great for keyword discovery, at least in the capacity that we tried to use it.

Keywords in H2 tags

A few years back, I wanted to see if there was any advantage to placing an H2 tag around keywords in the content. The keywords were styled to look the same as the normal text; the only difference was the H2 tag. I rolled this out on about 10,000 pages and watched the results for a few months.

What did I find? Nothing. Exactly the same as the control group. Still, lesson learned. 

Link title element

This failed test is actually one of Paddy Moogan's. He wanted to test the link title element to see if that passed any value. He set the title to 'k34343fkadljn3lj' and then checked to see if the site improved its ranking for that term.

There was no improvement.

Later, he found out that Craig's site was actually down, so it probably wouldn't be ranking regardless of how it was linked to. This brings up a really important point in testing: double check everything, even the small points. It can be really frustrating to run a test and then realize it was all for nothing. 

Your "failed" tests

We've all been there, so it's time to share your story. What have you recently tested that didn't turn out exactly how you planned? If we can all learn from the mistakes of others, we're in a better place. Drop a line in the comments and let us all know!



Saturday, March 9, 2013

6 Ways to Use Fresh Links & Mentions to Improve Your Marketing Efforts - Whiteboard Friday

So I wanted to take some time and talk through the value that marketers can get from Fresh Web Explorer. And not just from Fresh Web Explorer, because I realize it's one in a set of tools, but also from things like doing regular Google 24-hour searches to look for brand mentions and links, or using other tools like Radian6, uberVU, or the fresh links and fresh mentions section in Raven Tools. You can do a lot of these things with any of those tools.

I'm going to focus on Fresh Web Explorer for this part, but you can extrapolate out some ways to use this stuff in other tools too.

So number one, one of the most obvious ones is trying to find opportunities for your brand and your site to get coverage and press. That often leads to links that can help with SEO, and to co-occurrence citations of your brand name next to industry terms, which can also help with SEO, and could help with local for those of you who are doing local and have local businesses mentioned. It certainly can help with branding and brand growth, and a lot of times it helps with direct traffic too.

So, when I perform a search inside Fresh Web Explorer, I get a list of the URLs and the domains they're on, along with a feed authority score, and from that I can get all sorts of information. I can plug in my competitors and see who's linking to their sites; perhaps those are opportunities for me to get a press mention or a link. I can see links to industry sites. So, for example, it may not be a competitor, but anyone who's doing coverage in my space is probably interesting for me to reach out to and build a relationship with.

Mentions of industry terms. If I find, you know, whatever it is: print magazines that are on the web, or blogs, or forums, or news sites, feeds coming from places that are clearly talking about a lot of things relevant to my industry, relevant to my brand and to what our company's doing, that's probably an opportunity for a potential press mention.

Mentions of competitors brands. If a press outlet is covering, or a blog or whoever, is covering one of your competitors, chances are good that you have an opportunity to get coverage from that source as well, particularly if they try to be editorially balanced.

Mentions of industry brands. It could be that there are brands in your industry that you're not necessarily competitive with, but whose coverage you still want to find because they're relevant to your brand. So for example, for us this could include a brand like Gnip or a brand like HubSpot. We're not competitive with these brands; SEOmoz is not. But they are industry brands, and places that cover Gnip and HubSpot may indeed cover Moz as well.

Number two, I can find some content opportunities: opportunities to create content based on what I'm discovering from Fresh Web Explorer. So I plugged in "HTC One," the new phone from HTC, and I'm thinking maybe I can curate and aggregate some of the best content that's been produced around the HTC One. I can aggregate reviews and get really interesting information about what's coming out about the phone. I might even be able to discover information to share with my audience.

So, for example, we focus on SEO topics and on local topics. If we expect the HTC One to be big and we want to cover several different phones and how that's affecting the mobile search space, we can look at their default search providers, what sorts of things they do in terms of voice search versus web search, whether they have special contracts and deals with any providers to be tracking that data and who that might be going to, all those kinds of things, and we can relate it back to what we're doing in our industry.

You can also use Fresh Web Explorer to find the best time to share this type of information. So, for example, the HTC One comes out and maybe you're working for a mobile review site and you're like, "Oh, you know what? This has already been covered to death. Let's do something else this week, or let's cover some other stuff. Maybe we'll hit up the HTC One." Or, "Boy, you know what? This is just starting to get hot. Now is a great time to share. We can get on Techmeme and get the link from there. We can be mentioned in some of the other press coverage. We still have a shot at covering this new technology, this new trend, early in its life cycle."

Number three, we can track fresh brand and link growth versus our competitors. So a lot of the time one of the things that marketers are asking themselves, especially in the inbound field is, "How am I doing against my competition?" So I might be Fitbit, which is a Foundry cousin of ours. They're also funded by Foundry Group. They compete with the Nike FuelBand, and they might be curious about who's getting more press this week. We released a new version of the Fitbit, or we're about to, or whatever it is, and let's see how we're doing against the Nike FuelBand. Then when we have our press release, our launch, let's see how that compares to the coverage we're getting. Where are they getting covered that we are not getting covered? Where are we getting coverage where they are not?

We can then use things like the CSV Export feature, which is in the top right-hand corner of the Fresh Web Explorer, and we can look at CSV Export to do things like, "Oh, I want to filter out these types of sites. Or I only want a report on the high feed authority sites versus the low feed authority one. So I want to see only the places where my coverage is high."

A note on feed authority though. Be very careful here because remember that a great page on a great site might be discovered through a low quality feed. It could be that a relatively junky feed is linking to some high quality stuff. We'll discover it and report on the feed authority of the source where we discovered it. So you may want to try using metrics like page authority and domain authority to figure out where are you being mentioned and is that a high quality site, not just feed authority.

All right. Number four. Find fresh sources that link to or mention two or more of your competitors, but don't mention you. Now, this has been a classic tool. We've had a tool in our library at Moz similar to SEO Book's HubFinder; ours is called the Link Intersect tool. What you can do here is plug in something like some ice cream brands and see who writes about them. So "Full Tilt" and "Molly Moons" ice cream, and I actually want to put quotes around those brand names so that I don't get a mention popping in every time someone mentions the Moon and the name Molly; that wouldn't be ideal. And minus D'Ambrosio, which is the best Seattle ice cream shop, obviously. It's a gelateria. It's fantastic. Side note: it may be owned by my cousin-in-law, but shh, let's not tell anybody.

Okay, and then if I'm Marco over at D'Ambrosio Gelato, I can see where are Full Tilt and Molly Moons getting mentioned that aren't mentioning me. If it's, "Hey, there was an article in The Stranger about ice cream and they didn't cover us." And, "Hey the Capitol Hill blog didn't cover us." Maybe they don't know that we also have a Capitol Hill location. We should get in there and talk to those folks. We should mention, maybe leave a comment, maybe just tweet at the author of the post, whatever it is and tell them, "Hey, next time you cover ice cream, you should also write about us."

Number five. Compare sources' coverage. So this is actually a bit of a teaser, and I apologize for that. The site colon operator will not be available at launch. So when you're watching this video, you probably can't use the site: operator to see different sources and run a search like "CRO site:SEOmoz." However, it will be coming soon.

When it is, you'll be able to compare, hey is SEOmoz or is HubSpot more active in covering the CRO topic? Are there different sources out there that maybe don't have coverage of a topic and I could go and pitch them for a guest post? I could find those content opportunities. I could know if a topic is saturated or if it hasn't been covered enough. Maybe I find sites or blogs that might be interested in covering a topic that I would like them to write about. I can see who's covered and who hasn't using this site colon operator to figure out the source and the level of coverage that they might have or not.

The last one, number six, is really about reporting. Fresh Web Explorer is going to show you these great trends about how a particular term or phrase or link is doing: links to a site, mentions of a brand name, mentions of a phrase or an industry term, whatever it is. So I can plug in things like my brand, or SD, which is our link operator for seeing links to anything on the sub-domain. I can plug in my sub-domain, and then I can see how that's gone over the past 7 days or 30 days. I can screenshot that and put it in a report. I can download using the export functionality. I can download the CSV and then filter or scrub.

A lot of times, for example, PR companies, companies that help you with your press will do this type of work. They'll assemble this kind of reporting. In fact, at Moz we use a firm called Barokas here in Seattle. Every week they send us a report of here are all the places that you were mentioned, and here are places that mentioned industry terms and that kind of stuff, which is really nice, but you're oftentimes paying a lot of money to get that reporting. You can actually do that yourself if you don't have a PR company that you're already using for this type of work. Of course, if you are a PR company, this might be an option for you to do that type of reporting.

These six are only scratching the surface of what you can do with Fresh Web Explorer, and I don't doubt that there are hundreds of uses for the data inside it that I haven't thought of yet. I really look forward to seeing some cool, creative uses from you guys out there, and I hope that you are enjoying the product. If you would like, please give us feedback. I know the team would love to hear from you on this, and they're constantly working and iterating and updating and adding in things like the site colon operator. So very cool.

Thank you very much, and we will join you again next week for another edition of Whiteboard Friday. Take care."



Friday, March 8, 2013

The Google AdWords Landscape (Infographic)

We tend to think of AdWords as the domain of PPC specialists, but it's becoming clearer and clearer that Google's SERP advertising has a huge impact on the position and effectiveness of organic results. So, I wanted to ask a simple question: what does the AdWords 'landscape' actually look like in 2013? In other words, where are the ads, how many are there, what combinations occur in the 'wild', and how often do they show up? I'll dive into some details below, but the answer looks something like this (click the image for a full-sized view):

The Google AdWords Landscape

Embed this image:

The Methodology

We collected data from 10,000 page-one Google SERPs via Google.com on a weekday during normal business hours. Personalization was turned off, and the crawler emulated a logged-out Chrome browser. We parsed the major ad blocks (which have consistent DOM markers) and the links within those blocks. Keywords and categories were pulled from AdWords' keyword tools, with 500 keywords coming from each of 20 categories.

A Few Caveats

Naturally, keywords pulled from the AdWords research tools are more likely to have commercial intent than the 'average' keyword (if such a thing exists), so these percentages may not be indicative of the entire world of search queries. We did run these numbers at other time periods and on other days, and the results were fairly consistent.

These statistics were computed by unique queries, not by query volume. The results seem to be very similar, though. For example, we found ads on 85.2% of the queries crawled; if we weight those queries by Google's 'global' volume, we get ad penetration of 84.5%. The correlation between the presence of ads and query volume was virtually non-existent (r=-0.018). The correlation between the presence of ads and Google's competition metric was high (r=0.874). This is probably not surprising, since 'competition' is essentially defined by how many advertisers are vying for any given query.
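For readers who want to reproduce this kind of check on their own crawl data, here is a minimal sketch of the correlation computation described above. The numbers are hypothetical, not the study's data: `has_ads` is a 0/1 flag per query and `competition` is a per-query competition score.

```python
# Illustrative sketch (hypothetical data, not the study's): the Pearson
# correlation between ad presence (0/1) and another per-query metric,
# analogous to the r values quoted above.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-query rows: (ads shown?, competition score)
has_ads = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
competition = [0.9, 0.8, 0.1, 0.7, 0.2, 0.95, 0.6, 0.15, 0.85, 0.75]
print(round(pearson_r(has_ads, competition), 3))
```

With a binary variable on one side this is the point-biserial correlation, which is just Pearson's r applied to 0/1 data.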

The Changing Landscape

This is only a snapshot of a rapidly changing picture. For example, paid shopping results are still relatively new, but we discovered them on almost 20% of the queries we crawled. Unlike the traditional AdWords blocks, paid shopping can appear in multiple positions and forms, including the larger, upper-right format previously reserved for Knowledge Graph.

Even traditional top ads are evolving, with ads showing extensions, expanded site-links, lead generation forms, etc.  Expect Google to experiment with new formats on the top and right, and to blend advertising into the Knowledge Graph area to increase CTR. This changing landscape will impact the efforts of people in both paid and organic search, so keep your eyes open, and don't assume that this is something only the PPC team has to worry about.

I just wanted to thank Dawn Shepard for all her help putting together the infographic. I know it was probably a bit painful to hear "Make it kind of boring!" ;)



Announcing Fresh Web Explorer

Try Fresh Web Explorer

Why did we build Fresh Web Explorer?

One of the most challenging tasks as an online marketer is keeping track of all the latest blogs, forums, and news sites on the web that mention your brand or site. Many of the tools out there can be frustrating to use and don't have the metrics, scalability, or features that I need to effectively keep track of important links and mentions. Google Alerts can be hit or miss. Topsy is terrific, but it only covers social mentions. Trackur, Ubervu, Buzzstream, and SocialMention all offer a unique set of features, but I frequently rely on a number of different tools to provide me with an instant look into mentions of the sites and brands I track.

What's different about Fresh Web Explorer?

Fresh Web Explorer (FWE) functions a lot like Open Site Explorer, so the interface will be familiar to OSE users. However, the data is extremely recent, and rather than just show you links, we grab full text content of articles, blog posts, forum threads, user comments, and other web content. FWE doesn't just show you links, but all term, brand, or phrase mentions as well. 

Smooth Operator

Export FWE data to customize your reports

Getting agile with FWE

Try Fresh Web Explorer



Thursday, March 7, 2013

The Evolution of Roger

Hello, Moz community! My name is Derric Wise and I am the Art Director here at Moz. After three years here, I have often been asked the same two questions:

  1. How do you pronounce SEOmoz? (It's not SEEmoz or SUMOZE; it's simply pronounced 'S-E-Omoz.' You can find out more background on the company's name in one of Rand's blog posts here.)
  2. What's the story behind your mascot, Roger?

The second question has a longer answer, which is what this post is all about. Let me introduce you to Matthew Heilman who, as Creative Director in 2010, hatched the idea that eventually led to the creation of our beloved Roger.



Wednesday, March 6, 2013

When It Comes to Analytics, Are You Doing Enough?

We all know analytics are important. As marketers, we spend a great deal of time in the data. We all, hopefully, consider ourselves part analyst in many ways. At the foundation of a good marketing team, there is an accessible analytics platform that is set up to provide actionable insights. We should always feel that the data is just a login away. We should feel we have the data to make great recommendations, troubleshoot issues, and forecast our efforts accurately. We should all feel totally in control of our analytics, and use them daily.

But then unicorns jump out of pink clouds and fly around our heads, because that is simply not the case. Ever.

Maybe a handful of you work on teams that are doing all they can do as it relates to analytics. Maybe some of you have even staffed your team with a handful of full-time analysts. More likely, you may all be trying to use data in your jobs, but not doing it as thoroughly or as effectively as you wish you were.

So let's talk about that. Let's talk about the different types of analytics and common places to start with them. I believe the number one reason marketing teams aren't as data-driven as they should be is because data is intimidating. However, knowledge trumps intimidation. The more you know, the more comfortable you will be to put on that analyst hat. And analyst hats are cool. So let's jump in.


What are the different types of analytics?

The goal of all data analytics is to leave us more educated than before so we can perform better in the future. Sounds simple, right? Well, not really. A common misconception among marketers is that all analysis is equal, which isn't exactly the truth. There are actually three types of analytics: predictive, prescriptive, and descriptive. Most marketers spend the majority, if not all, of their time on only one of them: descriptive. As you can imagine, that leaves a lot of awesome data and innovation on the table.

Let's run through the three and talk through the differences...

Descriptive analytics:

Descriptive analytics is when we data mine our historical performance for insights. Often, we are just looking to get context or tell a story with the data. This is most certainly at the heart of what most marketers do on a daily basis, particularly in their web analytics. We look at how we are doing, and we try to understand what is happening and how that is affecting everything else.

Typical questions include: "How did that campaign do?" "What sort of performance did we see last quarter?" "How did that site's down time affect other performance KPIs?"

Predictive analytics: 

Predictive analytics takes that one step further. It's less about the questions, and more about the suggestions. It involves looking at your historical data, and coming up with predictions on what to expect next. This is most readily used in our industry when we try to predict how next month will perform based on this month's performance (month-over-month, or MoM, predictions). While it seems like an obvious next step for analysis, it's amazing to me just how many marketers stop at descriptive, and fail to push into this arena of predictive analytics. Often, it's because this involves predictive modeling which can, again, be very intimidating.

Typical statements include: "Based on the last few months of data and our consistent growth, we can expect to increase another 25%," or, "Knowing our seasonal drop trend, we can expect to slow down by 10% in the next 6 weeks."

Prescriptive analytics:

This is where things can get fun. Prescriptive analytics takes forecasting and predictions a step further. With prescriptive analytics, you automatically mine data sets, and apply business rules or machine learning so you can make predictions faster and subsequently prescribe a next move. Marketers tend not to think of this as "their responsibility." That is for someone else to think about and solve. I think that is a super dangerous mindset, given we are on the hook for hitting the company's business KPIs. Prescriptive analytics can be a very powerful catalyst for success at a company.

Typical questions include: "What if we could predict when customers leave us before they do, what could we surface prior to that to change their minds?" "What if we can predict when they are ripe for a second purchase and suggest it alongside other products?" "What if we can predict what they would be most likely to share with a friend, how would we surface that?"


So, are you doing enough?

I ask this because somewhere along the way, marketers began to believe that descriptive analytics was our job, and "that other stuff" was for someone else to figure out. At SEOmoz, we are working hard to have each team working on all three types of data analysis in a variety of capacities. It's not easy. There is a stereotype out there that you have to break through. Data can be fun. It can be accessible, and it can be part of everyone's job. In fact, it really should be.

Imagine this for a second: just think about how much could get done if every team felt empowered to tell a story with the data, make predictions off of it, and then brainstorm ways to operationalize that data to prescribe next steps for the biggest gains.

That is what being an analyst means, and I believe we are all becoming more of an analyst as this industry continues to evolve. The platforms out there make it easier than ever, and the competition is more intense than ever. Why not be part of something more than just telling a story with the data? Why not suggest the next move? Why not create crazy ways to use the data? I think it's time we all put our analyst hat back on and had a little fun with it.

Hopefully, breaking down the types of analytics above is a great reminder that there is more than just descriptive analytics. At the very least, you can share with your team to inspire them to do more with the data in front of them. Best of luck to you fellow data lovers!



Tuesday, March 5, 2013

Tips for Real-World Marketing from SearchLove and LinkLove

I want to tell you a story about one of our favourite sessions - Let's Get Real - where we have all our speakers on stage at once. In this post, I'm going to:

  • Highlight some of the incredible tips and tricks our speakers gave away at our conferences at the end of last year.
  • Give away free HD videos of Let's Get Real from the conferences at the end of 2012 [skip to the video giveaway].
  • Share all the details of our upcoming conferences in London and Boston, along with the video deal we have running for SEOmoz PRO members [skip to the conference details].

All the speakers on stage

Some of the earliest conferences I travelled to the US to attend were SMX West and Advanced. Back then, Danny used to have a session called Give it Up that was supposed to be more like the kind of tips, tricks and stories you would normally only hear at the bar (in exchange for delegates promising not to share the stories publicly for a month). Although the formatting is a bit broken, you can get a sense of the kind of topics covered in this Marketing Pilgrim write-up from 2007. I particularly like Matt Cutts' story:

"Alright, I'll tell you about my favorite spammer of '06... When you buy a domain, you own it for a year. Usually you get hosting, or park the domain... You set the name server to 'lamedelegation.org.' Millions of domains are marked this way. But some are marked 'lame-delegation.org' with a hyphen... This spammer... registered lame-delegation.org."

Parental advisory explicit content

I really liked the personal, conversational tone of the sessions and the glimpse behind the curtain. When we started running conferences, we used to end with similar sessions.

Over the years, we felt that the tips being shared weren't helping our delegates improve their marketing skills (Danny has done some similar soul-searching). They were still fun (and often funny), but they were increasingly less useful: not something you could go back to the office and implement.

As a result, we introduced the let's get real panel, where we invite all of our speakers on stage for a rapid-fire round of tips and ideas with one crucial difference: all of the tips should be the kind of thing delegates can go back to the office and use for themselves or their clients.

To give you an idea of the difference between a regular talk and let's get real, check out what Wil looks like on stage giving a formal presentation:

Wil Reynolds - Woah there

...and what he looks like rocking at let's get real:

Wil Reynolds - give it up

Anyway, in the run-up to our next set of conferences (in March in London and May in Boston) I thought I'd go back to last year's tips and share the most useful with all of you. Here we go!

These tips come from our most recent SearchLove conferences in London and Boston. If you'd like to watch them for yourselves, I'm giving the entire videos away for free at the end of this post. The credit for the tips goes to the individual speakers - though I've generally rephrased the tips in my own words - I've credited them as I go along:

Social Media

Beware management tools for Facebook -- Jen Lopez

Jen Lopez

If you routinely use tools like Hootsuite for managing your social presence between multiple team members and across multiple platforms, beware of the potential effect on the visibility of your Facebook posts. There are two big things to be aware of:

  • Posts made via external applications suffer in Edgerank terms and so have lower 'natural' visibility.
  • If you are unlucky enough to post at a similar time to others using the same application, the Facebook timeline will often group posts together under 'updates from Hootsuite.'

It's an ongoing challenge to manage multiple contributors across multiple platforms and the tools are a huge part of making that possible but it's worth experimenting to see how your reach is affected.

Check out G+ ripples to find influencers -- Jen Lopez

If you do a keyword search in Google+, the default ordering of results is heavily skewed towards heavily-shared content. By drilling into the ripples, you can find the influencers who are sharing content in any given space and who are having a particular influence on which pieces of content get widely shared.

Craig wrote an article on the power of building your filter bubble influence, and Jen's tip is a great place to get started working out who you need to influence.

Technical SEO

Get a sample of googlebot visits in log file format -- Richard Baxter

It can be tempting to spend all our time in graphical tools, but Richard pointed out one specific use-case that has brought old-school techniques back to prominence for him. As googlebot gets better at interpreting JavaScript and attempts to crawl more and more AJAX content, it also increasingly makes mistakes. He and his team saw a major publisher having huge numbers of non-existent URLs requested based on googlebot misidentifying slugs in the HTML as URLs [we've seen this as well] - and this led to him recommending that we get our client dev teams to provide us with samples of googlebot log file data.

Split test your SEO -- Mat Clayton

Mat is in the luxurious position of having complete control and authority over a massive site that gets loads of search visits, but nevertheless, I thought his stories were interesting and useful even if you're running smaller sites. He talked about applying the principles of conversion rate optimisation to SEO. Take user profile pages for example (they have millions of them over at mixcloud): split them into two buckets (A and B) and make a set of changes to B designed to improve their search visibility. Treat visitors from search as 'conversions' in a CRO sense and test to see if A or B is statistically better.
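Mat doesn't prescribe a particular statistical test, but a standard way to evaluate this kind of A/B split is a two-proportion z-test on the "conversion" (search-visit) rates of the two buckets. The sketch below uses hypothetical numbers; the function names and figures are illustrative, not from the talk.

```python
# Hedged sketch of evaluating an SEO split test like the one described:
# bucket A (control pages) vs bucket B (changed pages), treating search
# visits as "conversions". All counts below are hypothetical.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

def p_value(z):
    """Two-tailed p-value under the standard normal distribution."""
    return math.erfc(abs(z) / math.sqrt(2))

# e.g. 480 search visits across 10,000 bucket-A pageviews vs 560/10,000 in B
z = two_proportion_z(conv_a=480, n_a=10000, conv_b=560, n_b=10000)
print(round(z, 2), round(p_value(z), 4))
```

A p-value below your chosen threshold (commonly 0.05) suggests the bucket-B changes genuinely moved search visibility rather than being noise.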

Create site speed videos -- Annie Cushing

Annie Cushing

A short-but-sweet tip from Annie - check out webpagetest for creating videos of your website loading alongside those of your top competitors. If you have a speed problem, this is one of the most powerful tools for getting management on-side with the (often considerable) investment needed to achieve significant speed increases.

Clean your sitemap with Screaming Frog -- Annie Cushing

Remember Duane Forrester talking about how clean your sitemap should be? Annie suggests a simple way of checking (on small-to-medium-sized sites). Use the list mode of Screaming Frog to run through your XML sitemap and check the status code of the pages it contains.
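If you'd rather script the same check than run it through Screaming Frog, here is a minimal sketch: pull the URLs out of your XML sitemap and request each one's status code. The sample sitemap below is made up; point the parser at your real sitemap file or URL.

```python
# DIY alternative to the Screaming Frog list-mode check: extract <loc>
# URLs from an XML sitemap, then fetch each one's HTTP status code.
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract all <loc> URLs from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

def status_of(url):
    """Return the HTTP status code for a URL (follows redirects)."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Made-up sample sitemap; replace with the contents of your real one.
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/old-page</loc></url>
</urlset>"""

for url in sitemap_urls(sample):
    print(url)  # swap in print(url, status_of(url)) to hit the network
```

Anything that doesn't come back 200 (redirects, 404s, server errors) is a candidate for cleanup, per Duane's advice about keeping sitemaps clean.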

CRO - Conversion rate optimisation

What nearly stopped you buying? -- Stephen Pavlovich

Stephen Pavlovich

Stephen described a simple set of three questions they include on the confirmation page at his experience days startup:

  • What's the one thing that nearly stopped you buying from us today?
  • How could we make our website better?
  • Is there anything else you want to say?

It's important, he says, to make the answers free-form text areas. The freedom to write what they want is a critical part of the process of getting useful feedback. The idea then is that you can check in regularly and take actions to fix common issues.

Rank for your [<brand> voucher code] search -- Dave Peiris

Dave highlighted the example of Argos (a UK high-street retailer) who have a good example of an on-site page targeted to Argos voucher codes (in the US, I think 'coupon' or 'coupon codes' would be a more common search term). People are increasingly interrupting the checkout process to go and search for discount codes and the search results are typically terrible. If they fail to find anything relevant to your brand, they could easily be diverted to a competitor. By bringing them back to your own site, you reduce the drop-off of your checkout process.

Give your FAQ and T&C pages some love -- Hannah Smith

Hannah pointed out how close to converting someone is when they check out your FAQ or T&C pages. When was the last time you read those kinds of page for fun? And yet, so many of us make those pages impenetrable to humans, give them tiny font, even make the navigation non-standard so that it's hard to get back to the money pages. Don't do that, says Hannah, quite rightly. (While we're talking about it, I love the 500px terms and conditions - lawyer and human friendly.)

Email marketing

Encourage people to reply to your email marketing -- Patrick McKenzie

Patrick McKenzie

Not everyone knew Patrick at our conference - he's the second-from-top-ranked user on Hacker News under the username patio11. Although his presentation covered a wide range of tips for conversion improvement, it was his email tips that stuck with me and changed our campaigns - literally as soon as I got back to the office.

His top tip was to encourage people to reply to your email marketing. There's a temptation to think that this is a bad thing and some companies go so far as to send email marketing from a no-reply@ address. By simply ending with the line 'Hit reply if you have any questions - I read them all', you can increase engagement, sell more, get instant feedback and generally get closer to your community. I can vouch for this; we've been adding this to most of our emails since Patrick gave away this tip, and I can't count the number of positive reactions it's caused.

It's closely related to his second tip: to give customer services a name and a face. He relates the story of a specific customer services rep who has received three marriage proposals in the last year. No one's gone that far for me, but they have certainly seemed to appreciate the ability to chat 1:1.

Facebook retargeting with 'dirty' lists -- me

Everyone who's been kicking around for a while has a bunch of email addresses they can't use. The better you are at observing best practices for email list growth, the more you will find yourself with lists of email addresses for people who haven't opted in to hear from you.

With Facebook retargeting, you can put those email addresses to good use. Use your list of 'interested but not opted-in' to build your advertising presence.

Start your subject lines with 'RE:' -- Paul Madden

Paul's tip overlapped email marketing and outreach with a suggestion to test different beginnings for your subject lines. In particular, 'RE:' can garner much higher open rates by playing on the appearance of an ongoing conversation.

Send your competitors' email marketing to Evernote -- Stephen Pavlovich

Stephen has talked before about the power of Evernote for saving and browsing a swipe file. Since it offers the ability to add notes by email, he recommends subscribing to competitors' email lists and using gmail filters to direct their emails into your Evernote account. Do this well in advance of needing it of course, and then when a particularly significant time of year is approaching (Valentine's day for a flower retailer for example), you have a ready-made swipe file of all the things your competitors did this time last year.

Online advertising

Swap retargeting pixels -- me

When you have close partnerships with other companies whose audiences' interests overlap closely with those of your customers and clients, you can quickly grow your retargeting pool by including your pixel on their site. Add them into their own group so that you can run dedicated advertising to draw them into your own site and content.

Combine Facebook demographic targeting and retargeting -- Guy Levine

Guy Levine

The demographic targeting options for Facebook advertising are well known. By running tightly-targeted adverts driving visitors to your own landing pages, you can cookie those visitors with dedicated retargeting pixels that group them into buckets of people with similar interests. This gives you a powerful weapon for future content marketing (particularly at the agency level where having this kind of retargeting pool can be reused across multiple clients).

Drive reviews with retargeting -- Guy Levine

Don't think only of retargeting being for driving conversions; it can be useful post-conversion, as well. Guy advocated adding a retargeting pixel to your confirmation page so that you have a bucket of people who have bought from you. What should you do with this information? One example use-case Guy mentioned was to ask for reviews of the product purchased to drive rich content on your site.

Better content

Use HARO to solicit content input -- Wil Reynolds

You're all familiar with Help A Reporter Out (HARO), right? Realising that the content his clients produce is often journalistic, Wil saw that they could be the reporter as well as the user of HARO. He's had success soliciting content input from small business owners via HARO - especially photo / image-based content for inclusion in rich posts.

Screencast your interactive infographics -- Lexi Mills

As the technology underpinning our creative work has become more modern, we occasionally trip up against news rooms stuck using outdated operating systems and browsers. In these cases, they sometimes can't access fancy animated graphics, etc. Lexi recommended including a short screencast in your journalist pitches to make it easier to see on any platform.

Management

Individual contributor tracks -- Rand Fishkin

Rand Fishkin

Rand decided to cover some areas that are closer to the things that have been taking up his personal time recently, particularly on the management front. One of the things that he talked about was also something he has written about in the context of wider team structure; namely, the need for strong career opportunities in your company for 'individual contributors.' He pointed out the need for there always to be progression opportunities for your best people other than forcing them into management if that isn't their goal.

Reach out to your employees' heroes -- Rand Fishkin

Rand used the example of Avinash as being someone that many of his team look up to. Rand's relationship with Avinash means that he has a chance of getting him to share great things written by the SEOmoz team. By doing this with great content and in a transparent way ('it would mean the world to X to hear that you had read their stuff'), he cements both relationships.

Some general marketing/web tips

Build your personal brand by owning a topic -- Justin Briggs

Justin pointed out that, for the bigger conferences, if you pitch a session topic and that topic is chosen to be a panel, you are 99% certain to get asked to be involved. So pitch great topics with credibility. He ran through a personal example of writing an epic blog post and using it to pitch a competitor analysis panel at a major show. If you don't know Justin's background, you should read his personal post first time, every time, which explains just what an incredible journey his has been. It'll definitely make you think you can up your own game.

Run wpscan -- Paul Madden

In a lightning-quick tip, Paul recommended that if you run a WordPress site, you should run WPScan against your own site to check for any vulnerabilities. With the increase in hacking for SEO alongside exploits generally for all kinds of other reasons, it's going to be increasingly important to lock down your stuff.

Take screenshots of your competitors every day -- Mat Clayton

Mat and his team built a simple script to take a screenshot of the main pages of their competitors every day. He told a story about how they actually found it easier than their competitors to know which changes were working for them. I recommend reading about webkit2png and PhantomJS if you want to try this out for yourself.
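A sketch of what that daily script might look like, assuming PhantomJS and its bundled rasterize.js example are installed (the article only names webkit2png and PhantomJS; everything else here, including the competitor URLs, is hypothetical). The function builds date-stamped screenshot commands without executing them; in practice you would pass each one to subprocess.run from a daily cron job.

```python
# Sketch of the daily competitor-screenshot idea. Assumes PhantomJS with
# its rasterize.js example script on the PATH; URLs are placeholders.
# Builds the commands only - hand them to subprocess.run in a cron job.
import datetime
import pathlib

COMPETITORS = ["https://example.com", "https://example.org"]

def screenshot_commands(out_dir="screenshots", today=None):
    """Return one [phantomjs, rasterize.js, url, outfile] command per site."""
    today = today or datetime.date.today().isoformat()
    commands = []
    for url in COMPETITORS:
        host = url.split("//", 1)[-1]  # crude hostname for the filename
        out = pathlib.Path(out_dir) / today / f"{host}.png"
        commands.append(["phantomjs", "rasterize.js", url, str(out)])
    return commands

for cmd in screenshot_commands(today="2013-03-05"):
    print(" ".join(cmd))
```

Date-stamped folders give you a browsable visual history, which is what makes spotting competitors' changes easy.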

Put your best content on your about page -- Mark Johnstone

As we all get better at making 'big content' that is closely on-brand rather than just classic 'internet bait' (something I know Mark and his team have been working on a lot recently), it makes more and more sense to integrate that great content into your normal website. In particular, try putting your top-performing content on your about page for two reasons: you drive people to your about page where they learn about your company, and potential clients wanting to learn more about your company get treated to your absolute best content.

Link building and PR

Turn your link developers into content producers -- Lisa Myers

Lisa described the positive results they have seen from having link developers build out rich online profiles, with posts they've written, authorship information, photos, and biographical information. Outreach works so much better when it comes from people who are (and seem) real.

Build hack day projects on APIs and tell the API owners -- Rob Ousbey

Rob described a hackday project he built called Get Out Call. Based on the Twilio API, it is designed to let you send a text scheduling a call to your cell phone to get you out of sticky situations. The power of the API means that this was phenomenally easy to hack together but a big part of the PR value comes from the fact that it is built on a service provided by a hot startup. By letting them know that he had built it, he got their PR team to hook him (and Distilled) up with coverage.

Video marketing

Sign up for YouTube advertising -- Phil Nottingham

If you do any Google Display Network video advertising, you get to include overlay links on your YouTube videos directing people to your own website. If you have an active YouTube channel, you should sign up and spend a small amount before pausing your campaign; even after you have paused, you can continue to have a clickable area on your YouTube videos. You can see this in action on the Distilled YouTube channel where we have a DistilledU video that we used to run advertising for. Even now that we've stopped, there is still a clickable link to the Distilled website.

Local businesses

Leave useful comments on attractions in your local area -- David Mihm

David Mihm

David expanded on a tip Will Scott gives for businesses interacting on Facebook: where you can interact as a page (read: business) instead of as a person. Will talks about leaving useful comments on the stories of the local newspaper or other local entities. David expanded this tip to Google+. In the same way as with Facebook, an admin of a business page can choose to browse Google+ as that business. That means you can leave reviews as a business. This is even more useful than commenting on Facebook because it is less transient. Not only are there fewer reviews than comments, but they are on static pages and the most helpful reviews tend to rank towards the top all the time. The example he gave was that if you are a hotelier in Edinburgh, and you do a search for Edinburgh, you see Edinburgh Castle as one of the top places listed. By leaving a comment along the lines of 'the top 5 things my guests love about the castle,' you gain permanent mind share on the most prominent points of interest in your town.

Giving away the videos

We record all the sessions at our conferences and make them available to buy (as well as bundling them with DistilledU subscriptions). Although I've included many of the tips from the let's get real sessions above, I wanted to give you all the chance to see the whole sessions; I left out a few juicy tips for the interested reader to find and I think it's always great to watch the dynamic of people on stage.

So, I'm giving you all access to the videos of both London and Boston absolutely free.

The way our video hosting is set up means that the only way I can get you access is by giving you 100% discount codes to 'buy' them on our store. Just a heads-up:

  • (Free) registration is required on our site
  • You will be presented with a credit card form - but if you enter the code MOZREAL2013 you won't be charged anything, and you won't have to enter any credit card information

Incidentally, I've added full transcripts to both videos on our site thanks to SpeechPad.

London Let's Get Real

Get London Let's Get Real 2012 for free by registering for a free account and entering MOZREAL2013 at checkout.

London let's get real

Boston Let's Get Real

Similarly, get Boston's Let's Get Real 2012 by registering for a free account and entering MOZREAL2013 at checkout.

Let's get real - Boston

Get tickets to see us live in London or Boston

At this point, I'm obviously hoping that you are all so excited about the great content getting shared at these conferences that you simply can't wait to come to one.

Luckily, we have two conferences coming up (again, in London and Boston), and SEOmoz PRO members can use a PRO perk to get free videos added to any ticket purchases (see the bottom of that page).

London LinkLove, 15th March 2013

Check out the schedule and the speaker line-up and book your place here.

Buy tickets

Boston SearchLove, 20th & 21st May 2013

Check out the speaker line-up (the exact schedule will be announced soon) and book your place here.

Buy tickets

Interested in the west coast?

  • First, don't forget that Mozcon is coming up soon (I'm speaking!).
  • We are also hoping to bring SearchLove to the West Coast - you can register your interest here.

Just in case there's any lingering doubt in your mind, I'll leave you with a party photo :)

Searchlove party