Tuesday, January 31, 2017

Podcast Episode 64 – How To Scrape Domains For Your PBNs With Bluechip Backlinks

On this episode we talk about how you can scrape powerful domains using Bluechip Backlinks!

Check out Bluechip Backlinks here: http://ift.tt/1O6IFup

Ask your next SEO question here: hdho.semanticmastery.com

Music: Gramatik – Who Got Juice
All rights reserved



Do You Need To Use A Buffer Site Before Using Crowdsearch.Me For A Google Partner Site Like Diigo?

In episode 115 of Semantic Mastery’s weekly Hump Day Hangouts, one participant asked if a buffer site is needed before using crowdsearch.me campaigns for a Google partner site like Diigo.

The exact question was:

When using crowdsearch.me and wanting traffic to come into our site from a Google partner like Diigo, do we need to start at a kind of buffer site to start with the search and bring the traffic into the Google partner, then to our site (like use Tumblr to go to Diigo, then to the website)? Or can we start with Diigo directly? Thanks.



Google Search Console Reliability: Webmaster Tools on Trial

Posted by rjonesx.

There are a handful of data sources relied upon by nearly every search engine optimizer. Google Search Console (formerly Google Webmaster Tools) has perhaps become the most ubiquitous. There are simply some things you can do with GSC, like disavowing links, that cannot be accomplished anywhere else, so we are in some ways forced to rely upon it. But, like all sources of knowledge, we must put it to the test to determine its trustworthiness — can we stake our craft on its recommendations? Let's see if we can pull back the curtain on GSC data and determine, once and for all, how skeptical we should be of the data it provides.

Testing data sources

Before we dive in, I think it is worth having a quick discussion about how we might address this problem. There are basically two concepts that I want to introduce for the sake of this analysis: internal validity and external validity.

Internal validity refers to whether the data accurately represents what Google knows about your site.

External validity refers to whether the data accurately represents the web.

These two concepts are extremely important for our discussion. Depending upon the problem we are addressing as SEOs, we may care more about one or another. For example, let's assume that page speed was an incredibly important ranking factor and we wanted to help a customer. We would likely be concerned with the internal validity of GSC's "time spent downloading a page" metric because, regardless of what happens to a real user, if Google thinks the page is slow, we will lose rankings. We would rely on this metric insofar as we were confident it represented what Google believes about the customer's site. On the other hand, if we are trying to prevent Google from finding bad links, we would be concerned about the external validity of the "links to your site" section because, while Google might already know about some bad links, we want to make sure there aren't any others that Google could stumble upon. Thus, depending on how well GSC's sample links comprehensively describe the links across the web, we might reject that metric and use a combination of other sources (like Open Site Explorer, Majestic, and Ahrefs) which will give us greater coverage.

The point of this exercise is simply to say that we can judge GSC's data from multiple perspectives, and it is important to tease these out so we know when it is reasonable to rely upon GSC.

GSC Section 1: HTML Improvements

Of the many useful features in GSC, Google provides a list of some common HTML errors it discovered in the course of crawling your site. This section, located at Search Appearance > HTML Improvements, lists several potential issues, including duplicate titles and duplicate descriptions, along with other actionable recommendations. Fortunately, this first example gives us an opportunity to outline methods for testing both the internal and external validity of the data. As you can see in the screenshot below, GSC has found duplicate meta descriptions because a website has case-insensitive URLs and no canonical tag or redirect to fix it. Essentially, you can reach the page from either /Page.aspx or /page.aspx, and this is apparent because Googlebot had found the URL both with and without capitalization. Let's test Google's recommendation to see if it is externally and internally valid.

External Validity: In this case, the external validity is simply whether the data accurately reflects pages as they appear on the Internet. As one can imagine, the list of HTML improvements can be woefully out of date depending upon the crawl rate of your site. In this case, the site had previously repaired the issue with a 301 redirect.
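For what it's worth, this particular class of duplicate is easy to catch in your own crawl data before GSC ever flags it. Here's a minimal sketch in Python (the URLs are made up for illustration):

```python
from collections import defaultdict

def find_case_duplicates(urls):
    """Group crawled URLs that differ only by letter case."""
    groups = defaultdict(set)
    for url in urls:
        groups[url.lower()].add(url)
    # Keep only groups where more than one casing variant was crawled
    return {key: variants for key, variants in groups.items() if len(variants) > 1}

# Hypothetical crawl output
crawled = [
    "http://example.com/Page.aspx",
    "http://example.com/page.aspx",
    "http://example.com/about.aspx",
]
duplicates = find_case_duplicates(crawled)
# duplicates flags the Page.aspx / page.aspx pair as one duplicate group
```

Running this against a full site crawl (from Screaming Frog's export, for instance) tells you whether a canonical tag or redirect is still missing, independent of how stale GSC's report is.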

This really isn't terribly surprising. Google shouldn't be expected to update this section of GSC every time you apply a correction to your website. However, it does illustrate a common problem with GSC. Many of the issues GSC alerts you to may have already been fixed by you or your web developer. I don't think this is a fault with GSC by any stretch of the imagination, just a limitation that can only be addressed by more frequent, deliberate crawls like Moz Pro's Crawl Audit or a standalone tool like Screaming Frog.

Internal Validity: This is where things start to get interesting. While it is unsurprising that Google doesn't crawl your site so frequently as to capture updates to your site in real-time, it is reasonable to expect that what Google has crawled would be reflected accurately in GSC. This doesn't appear to be the case.

By executing an info:http://concerning-url query in Google with upper-case letters, we can determine some information about what Google knows about the URL. Google returns results for the lower-case version of the URL! This indicates that Google both knows about the 301 redirect correcting the problem and has corrected it in their search index. As you can imagine, this presents us with quite a problem. HTML Improvement recommendations in GSC not only may not reflect changes you made to your site, it might not even reflect corrections Google is already aware of. Given this difference, it almost always makes sense to crawl your site for these types of issues in addition to using GSC.

GSC Section 2: Index Status

The next metric we are going to tackle is Google's Index Status, which is supposed to provide you with an accurate number of pages Google has indexed from your site. This section is located at Google Index > Index Status. This particular metric can only be tested for internal validity since it is specifically providing us with information about Google itself. There are a couple of ways we could address this...

  1. We could compare the number provided in GSC to site: commands
  2. We could compare the number provided in GSC to the number of internal links to the homepage in the internal links section (assuming 1 link to homepage from every page on the site)
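Both checks reduce to simple arithmetic. Here's a rough sketch in Python using the figures from the example discussed below (1,587 indexed pages vs. 7,080 internal links to the home page); the one-link-per-page assumption is ours, not anything Google documents:

```python
def estimated_pages_from_links(homepage_internal_links, links_per_page=1):
    """Estimate indexed page count from GSC's internal-link count to the
    homepage, assuming each indexed page links home links_per_page times."""
    return homepage_internal_links / links_per_page

# Numbers from the example site discussed below
gsc_index_status = 1587   # pages indexed per GSC's Index Status
homepage_links = 7080     # internal links to the home page per GSC

# The naive one-link-per-page estimate badly overshoots Index Status...
naive_estimate = estimated_pages_from_links(homepage_links)
# ...and taking Index Status at face value implies ~4.5 homepage links
# on every single page, which the crawl did not support.
implied_links_per_page = homepage_links / gsc_index_status
```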

We opted for both. The biggest problem with this particular metric is being certain what it is measuring. Because GSC allows you to authorize the http, https, www, and non-www version of your site independently, it can be confusing as to what is included in the Index Status metric.

We found that when carefully applied to ensure no crossover of varying types (https vs http, www vs non-www), the Index Status metric seemed to be quite well correlated with the site:site.com query in Google, especially on smaller sites. The larger the site, the more fluctuation we saw in these numbers, but this could be accounted for by approximations performed by the site: command.

We found the link count method to be difficult to use, though. Consider the graphic above. The site in question has 1,587 pages indexed according to GSC, but the home page to that site has 7,080 internal links. This seems highly unrealistic, as we were unable to find a single page, much less the majority of pages, with 4 or more links back to the home page. However, given the consistency with the site: command and GSC's Index Status, I believe this is more of a problem with the way internal links are represented than with the Index Status metric.

I think it is safe to conclude that the Index Status metric is probably the most reliable one available to us in regards to the number of pages actually included in Google's index.

GSC Section 3: Internal Links

The Internal Links section found under Search Traffic > Internal Links seems to be rarely used, but can be quite insightful. If External Links tells Google what others think is important on your site, then Internal Links tell Google what you think is important on your site. This section once again serves as a useful example of knowing the difference between what Google believes about your site and what is actually true of your site.

Testing this metric was fairly straightforward. We took the internal links numbers provided by GSC and compared them to full site crawls. We could then determine whether Google's crawl was fairly representative of the actual site.

Generally speaking, the two were modestly correlated with some fairly significant deviation. As an SEO, I find this incredibly important. Google does not start at your home page and crawl your site in the same way that your standard site crawlers do (like the one included in Moz Pro). Googlebot approaches your site via a combination of external links, internal links, sitemaps, redirects, etc. that can give a very different picture. In fact, we found several examples where a full site crawl unearthed hundreds of internal links that Googlebot had missed. Navigational pages, like category pages in the blog, were crawled less frequently, so certain pages didn't accumulate nearly as many links in GSC as one would have expected having looked only at a traditional crawl.

As search marketers, in this case we must be concerned with internal validity, or what Google believes about our site. I highly recommend comparing Google's numbers to your own site crawl to determine if there is important content which Google determines you have ignored in your internal linking.

GSC Section 4: Links to Your Site

Link data is always one of the most sought-after metrics in our industry, and rightly so. External links continue to be the strongest predictive factor for rankings and Google has admitted as much time and time again. So how does GSC's link data measure up?

In this analysis, we compared the links presented to us by GSC to those presented by Ahrefs, Majestic, and Moz for whether those links are still live. To be fair to GSC, which provides only a sampling of links, we only used sites that had fewer than 1,000 total backlinks, increasing the likelihood that we get a full picture (or at least close to it) from GSC. The results are startling. GSC's lists, both "sample links" and "latest links," were the lowest-performing in terms of "live links" for every site we tested, never once beating out Moz, Majestic, or Ahrefs.
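For context, "live" here means the linking page still contains the link. A hedged sketch of that check in Python (the HTML and URLs are invented; a real test would fetch each linking page over HTTP and follow redirects):

```python
import re

def link_is_live(page_html, target_url):
    """Check whether a backlink to target_url still exists in the linking
    page's HTML. This only inspects href attributes in already-fetched
    HTML; a production check would also handle relative URLs, redirects,
    and nofollow attributes."""
    hrefs = re.findall(r'href=["\']([^"\']+)["\']', page_html, re.IGNORECASE)
    return any(href.rstrip("/") == target_url.rstrip("/") for href in hrefs)

# Hypothetical linking page that still carries the backlink
html = '<p>Great resource: <a href="https://example.com/page/">anchor</a></p>'
still_live = link_is_live(html, "https://example.com/page")
gone = link_is_live(html, "https://example.com/other")
```

Run a check like this over each index's reported backlinks and the share returning True gives you the "live links" rate compared across sources.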

I do want to be clear and upfront about Moz's performance in this particular test. Because Moz has a smaller total index, it is likely we only surface higher-quality, long-lasting links. Our out-performing Majestic and Ahrefs by just a couple of percentage points is likely a side effect of index size and not reflective of a substantial difference. However, the several percentage points which separate GSC from all 3 link indexes cannot be ignored. In terms of external validity — that is to say, how well this data reflects what is actually happening on the web — GSC is out-performed by third-party indexes.

But what about internal validity? Does GSC give us a fresh look at Google's actual backlink index? It does appear that the two are consistent insofar as rarely reporting links that Google is already aware are no longer in the index. We randomly selected hundreds of URLs which were "no longer found" according to our test to determine if Googlebot still had old versions cached and, uniformly, that was the case. While we can't be certain that it shows a complete set of Google's link index relative to your site, we can be confident that Google tends to show only results that are in accord with their latest data.

GSC Section 5: Search Analytics

Search Analytics is probably the most important and heavily utilized feature within Google Search Console, as it gives us some insight into the data lost with Google's "Not Provided" updates to Google Analytics. Many have rightfully questioned the accuracy of the data, so we decided to take a closer look.

Experimental analysis

The Search Analytics section gave us a unique opportunity to utilize an experimental design to determine the reliability of the data. Unlike some of the other metrics we tested, we could control reality by delivering clicks under certain circumstances to individual pages on a site. We developed a study that worked something like this:

  1. Create a series of nonsensical text pages.
  2. Link to them from internal sources to encourage indexation.
  3. Use volunteers to perform searches for the nonsensical terms, which inevitably reveal the exact-match nonsensical content we created.
  4. Vary the circumstances under which those volunteers search to determine if GSC tracks clicks and impressions only in certain environments.
  5. Use volunteers to click on those results.
  6. Record their actions.
  7. Compare to the data provided by GSC.

We decided to check 5 different environments for their reliability:

  1. User performs search logged into Google in Chrome
  2. User performs search logged out, incognito in Chrome
  3. User performs search from mobile
  4. User performs search logged out in Firefox
  5. User performs the same search 5 times over the course of a day

We hoped these variants would answer specific questions about the methods Google used to collect data for GSC. We were sorely and uniformly disappointed.

Experimental results

Method               Delivered   GSC Impressions   GSC Clicks
Logged In Chrome     11          0                 0
Incognito            11          0                 0
Mobile               11          0                 0
Logged Out Firefox   11          0                 0
5 Searches Each      40          2                 0

GSC recorded only 2 impressions out of 84, and absolutely 0 clicks. Given these results, I was immediately concerned about the experimental design. Perhaps Google wasn't recording data for these pages? Perhaps we didn't hit a minimum number necessary for recording data, only barely eclipsing that in the last study of 5 searches per person?

Unfortunately, neither of those explanations made much sense. In fact, several of the test pages picked up impressions by the hundreds for bizarre, low-ranking keywords that just happened to occur at random in the nonsensical tests. Moreover, many pages on the site recorded very low impressions and clicks, and when compared with Google Analytics data, did indeed have very few clicks. It is quite evident that GSC cannot be relied upon, regardless of user circumstance, for lightly searched terms. It is, by this account, not externally valid — that is to say, impressions and clicks in GSC do not reliably reflect impressions and clicks performed on Google.

As you can imagine, I was not satisfied with this result. Perhaps the experimental design had some unforeseen limitations which a standard comparative analysis would uncover.

Comparative analysis

The next step I undertook was comparing GSC data to other sources to see if we could find some relationship between the data presented and secondary measurements which might shed light on why the initial GSC experiment had reflected so poorly on the quality of data. The most straightforward comparison was that of GSC to Google Analytics. In theory, GSC's reporting of clicks should mirror Google Analytics's recording of organic clicks from Google, if not identically, at least proportionally. Because of concerns related to the scale of the experimental project, I decided to first try a set of larger sites.

Unfortunately, the results were wildly different. The first example site received around 6,000 clicks per day from Google Organic Search according to GA. Dozens of pages with hundreds of organic clicks per month, according to GA, received 0 clicks according to GSC. But, in this case, I was able to uncover a culprit, and it has to do with the way clicks are tracked.

GSC tracks a click based on the URL in the search results (let's say you click on /pageA.html). However, let's assume that /pageA.html redirects to /pagea.html because you were smart and decided to fix the casing issue discussed at the top of the page. If Googlebot hasn't picked up that fix, then Google Search will still have the old URL, but the click will be recorded in Google Analytics on the corrected URL, since that is the page where GA's code fires. It just so happened that enough cleanup had taken place recently on the first site I tested that GA and GSC had a correlation coefficient of just .52!

So, I went in search of other properties that might provide a clearer picture. After analyzing several properties that didn't share the first site's redirect problem, we identified a range of approximately .94 to .99 correlation between GSC and Google Analytics reporting on organic landing pages. This seems pretty strong.
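If you want to run the same sanity check on your own property, export organic landing-page clicks from both tools and compute a correlation coefficient. A self-contained Python sketch (the click counts are hypothetical):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-landing-page organic clicks from GSC and from GA
gsc_clicks = [120, 95, 430, 12, 260]
ga_clicks = [118, 99, 410, 15, 255]
r = pearson(gsc_clicks, ga_clicks)  # near 1.0 when the two sources agree
```

A value well below the .94-.99 range observed here would suggest unresolved redirects or some other URL mismatch between what Google Search serves and where GA's tracking code fires.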

Finally, we did one more type of comparative analytics to determine the trustworthiness of GSC's ranking data. In general, the number of clicks received by a site should be a function of the number of impressions it received and at what position in the SERP. While this is obviously an incomplete view of all the factors, it seems fair to say that we could compare the quality of two ranking sets if we know the number of impressions and the number of clicks. In theory, the rank tracking method which better predicts the clicks given the impressions is the better of the two.

Call me unsurprised, but this wasn't even close. Standard rank tracking methods performed far better at predicting the actual number of clicks than the rank as presented in Google Search Console. We know that GSC's rank data is an average position which almost certainly presents a false picture. There are many scenarios where this is true, but let me just explain one. Imagine you add new content and your keyword starts at position 80, then moves to 70, then 60, and eventually to #1. Now, imagine you create a different piece of content and it sits at position 40, never wavering. GSC will report both as having an average position of 40. The first, though, will receive considerable traffic for the time that it is in position 1, and the latter will never receive any. GSC's averaging method based on impression data obscures the underlying features too much to provide relevant projections. Until something changes explicitly in Google's method for collecting rank data for GSC, it will not be sufficient for getting at the truth of your site's current position.
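The averaging problem is easy to demonstrate with a toy model. In the Python sketch below, the CTR curve and impression counts are invented for illustration; the point is only that two very different ranking histories can share the same average position:

```python
def expected_clicks(positions, impressions_per_day=100):
    """Toy model of clicks earned over a series of daily rank positions.
    The CTR curve is illustrative, not an industry-measured one."""
    def ctr(pos):
        return 0.30 / pos if pos <= 10 else 0.0  # near-zero CTR beyond page 1
    return sum(impressions_per_day * ctr(pos) for pos in positions)

# Page A climbs from 79 all the way to #1; Page B sits at 40 throughout.
page_a = [79, 60, 40, 20, 1]
page_b = [40, 40, 40, 40, 40]

avg_a = sum(page_a) / len(page_a)  # average position 40
avg_b = sum(page_b) / len(page_b)  # average position 40
clicks_a = expected_clicks(page_a)  # earns clicks during its day at #1
clicks_b = expected_clicks(page_b)  # never earns a click
```

Both pages report the same average position, yet only one ever receives traffic, which is exactly why the averaged rank in GSC makes poor input for click projections.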

Reconciliation

So, how do we reconcile the experimental results with the comparative results, both the positives and negatives of GSC Search Analytics? Well, I think there are a couple of clear takeaways.

  1. Impression data is misleading at best, and simply false at worst: We can be certain that all impressions are not captured and are not accurately reflected in the GSC data.
  2. Click data is proportionally accurate: Clicks can be trusted as a proportional metric (i.e., they correlate with reality) but not as a specific data point.
  3. Click data is useful for telling you what URLs rank, but not what pages they actually land on.

Understanding this reconciliation can be quite valuable. For example, if you find your click data in GSC is not proportional to your Google Analytics data, there is a high probability that your site is utilizing redirects in a way that Googlebot has not yet discovered or applied. This could be indicative of an underlying problem which needs to be addressed.

Final thoughts

Google Search Console provides a great deal of invaluable data which smart webmasters rely upon to make data-driven marketing decisions. However, we should remain skeptical of this data, like any data source, and continue to test it for both internal and external validity. We should also pay careful attention to the appropriate manners in which we use the data, so as not to draw conclusions that are unsafe or unreliable where the data is weak. Perhaps most importantly: verify, verify, verify. If you have the means, use different tools and services to verify the data you find in Google Search Console, ensuring you and your team are working with reliable data. Also, there are lots of folks to thank here: Michael Cottam, Everett Sizemore, Marshall Simmonds, David Sottimano, Britney Muller, Rand Fishkin, Dr. Pete, and so many more. If I forgot you, let me know!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Monday, January 30, 2017

Weekly Digital Marketing Q&A – Hump Day Hangouts – Episode 116

Click on the video above to watch Episode 116 of the Semantic Mastery Hump Day Hangouts.

Full timestamps with topics and times can be found at the link above.

The latest upcoming free SEO Q&A Hump Day Hangout can be found at http://ift.tt/1NZu6N2.


Adam: Actually, [we're 00:00:01] live, so hey, everybody. Welcome to Hump Day Hangouts. Today is the 25th of January 2017. I'm going to stop wiggling in my chair and stay still while I do this. I just wanted to welcome everybody to Episode 116. We got people all over the place right now. Today, sadly, you're just going to have to make [do 00:00:18] with the three of us.

Let’s say hi real quick to everybody. Hernan, what’s up, man?

Hernan: Hey, guys. Hey, everyone. The three of us, we’re looking good. I think that we could start like a reality TV or something with these women [or gentling 00:00:31]. We’re looking good. That will be fun. Anyways, I’m really excited to be here guys. Still getting used to these [webinar jump thing 00:00:39] but I think it’s moving forward pretty nicely. Cool.

Adam: Bradley, how about yourself? How are things down in Virginia?

Bradley: Good. It’s actually like 67 degrees today down here again. It’s absolutely ridiculous. I don’t understand what the hell is going on with the weather around here but it’s nice, it’s sunny. I’m not going to complain. It’s just weird.

Adam: Yeah, [inaudible 00:00:58] we have like doing the weather update which I think is what we do now. It was like 50s over here at the weekend, blue skies, just beautiful. Then two days later, it was like 25 degrees and we got eight inches of snow and it’s just like, “Okay. That’s winter up here.”

Bradley: We should try to get Marco on because he’s a vacationing [as we speak 00:01:15] and we should get them on a Speedo.

Adam: The weather is so good he can’t even say how good it is live on Hump Day Hangouts [inaudible 00:01:23].

Male: Awesome.

Adam: We would lose our audience [quickly 00:01:28] [crosstalk 00:01:29]

Hernan: [crosstalk 00:01:29]

Adam: I think Chris is up in the mountains, so he wasn't able to make it either. As far as announcements, I think, Bradley, you've got some stuff about master class. Right?

Bradley: Yeah. I think that’s all we have. MasterClass is coming up at five o’clock guys, so in an hour. We had the local Kingpin update webinar last week and I announced the update, the second case study that we’re going to be adding. Hernan and I are tag teaming a case study for local Kingpin. We’re going to go over that briefly in master class today as well because master class and mastermind members are usually privy to our case studies as well. I just want to mention that we’re going to be going over that briefly today based upon what we started in local Kingpin last week. I’ve got some stuff together and already started working and building that case study out so I’m going to share some of that today.

Then obviously we're going to get into the other case study that we've been working on now for a couple months really, the affiliate project for the ketogenic diet. I mean, it's just crazy, the momentum that this project is picking up is just absolutely insane. I'm really, really excited about it, and I'm going to be laying out what my plan is for really monetizing this in a unique way, something that I learned from Ben Adkins actually. I'm going to be sharing that today during the master class as well. If you guys are in the masterclass, make sure you attend. Otherwise, you have to catch it on the replay. If you're not in yet, you should join.

Adam: Definitely. Definitely check it out. I put the link in there. You can see that. If you're not in IFTTT SEO Academy V2, you might want to hop in there first, but I'll post that link too. Also, which I guess we should mention, if you do join the master class and you're not in either, you get access to IFTTT SEO Academy.

Bradley: How did you guys switch, swap places. What the hell are you guys doing?

Adam: Are you serious?

Male: I don’t know.

Hernan: I didn’t touch anything.

Bradley: You guys [crosstalk 00:03:20] it’s like you just transposed your positions on the screen for me.

Adam: It’s just like moving around Bradley. Let’s get into the questions.

Bradley: Let me grab the screen, make sure you guys can see. Give them access to that, just so you’re aware of that. Can you guys see my screen all right?

Adam: Yeah, we’re good.

Hernan: Yes, sir.

Measuring Competitiveness Level Of A Keyword For A Local Video Marketing Campaign

Bradley: Okay. Cool. Jay is up first. He says, "Hi. I'm looking to start local video marketing for local businesses. From your experience, how do you measure how competitive a keyword is to rank for? Would a tool like Long Tail Pro, which analyzes the top 10 results for links, Trust Flow, Citation Flow, and other metrics and then gives you a competitive score, be a good guide?" Not really, Jay. Not really for local.

The reason I say that is because those tools are really going to be analyzing the results on the first page, which are most likely going to be websites, and so obviously they're different. It's really a whole different ballgame. It's a different animal. I mean, obviously there are some similarities, there's some overlap, but it's also going to depend on the keyword, Jay, because it's become more difficult to rank for local keywords. Not all of them. We've been noticing that it's been becoming more and more difficult for certain keywords, and it's just going to depend on the keywords.

That's why there are a ton of different poking software programs out there that will actually go out and test various keywords by uploading real short little spam videos to a channel for various keywords. Then they'll go out and track the rankings of those videos when they initially index and find out where they're located. If you can find some videos that were just uploaded to a channel that are on page one or close to page one, even page two somewhere, then those are typically keywords that you can end up ranking for. You'd end up deleting those poking videos, as they were purely for testing purposes, and then go out and create a good video, one that's optimized for conversions with a strong call-to-action, all that kind of stuff, and then do your normal SEO work on it, and you should be able to rank for those.

Now, again, even those poking tools don't guarantee that you're going to rank for those keywords, but they give you an indication or an idea of which keywords are going to be more likely to rank on page one, if that makes sense. In all reality, there's nothing I can say that's going to help you with this other than telling you to just test. That's the only thing you can do for video, guys, really for any keyword, but especially for local: you've got to test variations of your target keyword to find the ones that are likely to rank on page one, because some keywords are just too broad, or they're not keywords that videos are going to show for.

I know I've fought that, and I'm sure many of the people that watch our Hump Day Hangouts have seen or experienced the same thing. We've gone after a particular local term, and out of sheer brute force I've been able to rank it because I was determined to, no matter what, get the video ranked. What I'm trying to say is, some of those local keywords are just … Google does not like placing a video for that keyword, and so you can still jam it to page one with a lot of effort and a lot of work, but it usually doesn't stick anyway. Even if you get it to page one, it's typically not going to stick, and so it's not worth the effort.

What I’ve learned over the last couple years especially is to not waste my time trying to force videos for local keywords that are too difficult. That’s why I mentioned, use something like a poking software and test several variations of your keywords and then go after the ones that are going to be much easier to rank that you have a much greater chance of ranking on page one. Don’t even worry about the real competitive ones.

Unless it's absolutely critical to your campaign, in which case I would go after the longer tail, easier ones anyway, and then set up a YouTube silo and put your top-level term, the one that's the most competitive and going to be the most difficult to rank, at the top of the silo, and then use all the internal linking patterns that we discussed in YouTube Silo Academy to help with that. Because then all your longer tail videos, the videos targeting longer tail keywords, are going to help reinforce that silo and that keyword theme. That will help you to rank that more competitive word, but always start with the stuff that's going to generate some results for you the quickest and the easiest, because that'll give you the motivation to continue on, especially if it starts to actually generate calls or leads. In this case, with local stuff, I'm assuming you're trying to generate leads.

This is like, for example, when talking with potential clients and they say, "Well, I wanna rank for this keyword." I tell them, "Okay. Well, that's great. Most likely, we're not going to be able to rank for that keyword; here are some alternatives. Let me explain: if we can rank you for these three keywords, that will get you some traffic now, or we could attempt to rank for this one keyword that you want, and it could take three months with no guarantee that it's going to rank at all. Which would you prefer?" Typically, a savvy business owner is going to say, "Okay. I see your point. Let's go with the three that can generate traffic now." It's really just about how you position it, if that makes sense.

Again, I wish I could tell you, “Yes, you could use some tools that are going to tell you.” The only tools that I use for that now or it’s just I test by poking using one of the various tools that do it. There’s a bunch of them that do it. Peter Drew has one, Live Rank Sniper, Abs has one, Video Marketing Blitz. We just did a webinar for Megarray about two weeks ago. That’s a great, great tool. You can use that. There’s another one by Jeffrey Evans called a Video Instant Prospector I think or something like that, Buddy VIP. That’s another one. There’s a bunch of them out there. Just find one, pick one, and test that way.

Do you guys want to add to that at all?

Hernan: No, I think you did a great job explaining the methods that we are using. It's all about the risk-reward, the work-reward ratio. Some people are trying to rank for, I don't know, cheap flights or cheap credit cards. At the end of the day, that's not a smart business decision, because you need to factor in the resources that you will be investing to get to that point. Build up from there, gather some money and resources, and then you can run for the really competitive keywords with backlinks and everything else.

Bradley: As far as the different methods that you point out here in this little list, Jay, yeah, these are all things that I recommend doing. I can't say that if you use all of these you're going to rank for your keyword. I can't tell you that because, again, there are some keywords that you may not be able to rank for at all, period, no matter what you do. Then there are some that you may be able to rank for but they won't stick. Every single keyword is going to be a case in itself. It's unique, so there's no way for me to be able to promise that. However, these are all extremely good methods that you should be using, in my opinion. It's not just because they're our products. It's because they work.

This Stuff Works
YouTube siloing, that's absolutely critical in my opinion for ranking videos now, and it's so easy to do. There's really no reason you shouldn't be using YouTube silos. Channel optimization, that's a default. You should be doing that, and video optimization as well. Same thing with an IFTTT network. That's just a default, standard operating procedure. Then Video Powerhouse? Yes, that's an add-on service, but it's powerful and it's becoming more powerful by the day.

Persona Networks On Branded Network Blogs

Mark's up next. He says, "All these ways of saying hi. Have you ever put all persona networks on just the branded network before? What I mean is, instead of five two-tier networks, putting all personas on just the branded network blogs. I'm building a large [network 00:11:46]." I'm trying to visualize what he's asking here, guys, in my head, so forgive me if my reading is broken. "I am building a larger network for the home services niche and will be using it for blog post syndication for multiple sites. I'll be using related content feeds in the personas and will be using a lot of them: blogs, videos, Pinterest images, etc. I want a strong network so I can syndicate my maps as well as blog posts, but I will be sprinkling them through. We're not putting the blog posts through all networks, just a blog post here and there. Is this pointless, or have you tried something similar before?"

Okay, Mark. In my mind, I can't picture the setup here as to what it is that you're asking, unfortunately, so I can't really give you advice either way on this because I just don't understand what you're saying. Do you want to put all the networks on the branded network? I don't quite understand. If you could clarify your question, I'd be happy to answer it. If for whatever reason we don't get to it today, just post it in one of our Facebook groups or Google Plus groups and I'll jump in and answer it, because I would like to answer this question for you, but I'm not sure what the setup or configuration you're describing is. I just can't visualize it.

Hernan: Right. If I can add to this, Bradley, I was part of a blog once where they had the main branded syndication network, but all of these personas were contributors or authors to the blog. Each of them had their own, I don't know, whatever.

Bradley: Author profile.

Hernan: Contributor. Author profile, yeah, that's right. They would blog for the blog. They were all personas, and each of those personas would have their own articles syndicated out to their own individual networks. Does that make sense?

Bradley: That does make sense, yeah.

Hernan: The point is you would have all of these personas contributing to the same network but, on the other hand, each persona would have their own network to which only their articles would be syndicated out. That's easily done with /author/name-of-the-author/feed; you can get a feed out of pretty much anything from WordPress. That's the way we have been doing it. The reality is that, to be completely honest, I don't think it was worth the effort, because we had a ton of extra work for these little networks. We had to manage them all and build links to them all, and the results weren't up to the effort implied in building all of those networks. Hopefully that helps you out, Mark. If not, you can do what Bradley said and repost your question in our various Facebook groups.
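As an aside, the per-author feed URLs Hernan describes follow the standard WordPress pattern of /author/&lt;slug&gt;/feed/. Here's a minimal sketch of building those URLs; the domain and author slugs are made-up examples, not real sites:

```python
# Sketch: building WordPress per-author RSS feed URLs, as described above.
# The site domain and author slugs below are hypothetical placeholders.

def author_feed_url(site: str, author_slug: str) -> str:
    """WordPress exposes a per-author feed at /author/<slug>/feed/."""
    return f"{site.rstrip('/')}/author/{author_slug}/feed/"

site = "https://example-blog.com"
personas = ["jane-doe", "john-smith"]  # each persona gets its own feed
feeds = [author_feed_url(site, p) for p in personas]
for url in feeds:
    print(url)
```

Each of those feed URLs could then be plugged into a persona's own syndication network so only that author's posts go out.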

This Stuff Works
Bradley: Again, I like simplicity, guys. I've learned over the years not to overcomplicate stuff. Mark, I'm not saying you are, but I like to try to keep things as simple as possible. Maybe what you're referring to here is trying to simplify something that you think is complicated, and that may be the case. Again, without me understanding exactly the configuration that you're trying to describe, I can't give you advice specifically on this, but I can say: try to keep stuff as simple as possible. Only add complexity when necessary because, otherwise, projects can snowball out of control very, very quickly. I don't just mean snowball out of control in the amount of work, but in the amount of time and effort. The more complex they are, the more likely things are to go wrong, so my point is to just try to keep things as simple as possible.

Again, I'm not stating, Mark, that you aren't. I'm just saying this for everybody's benefit, because I've mentioned this before: people will send us a support ticket with a really elaborate graphic that took four hours to create, with all these silos and sub-silos and all these internal linking structures, everything drawn out. They do beautiful work on these graphics that you know took them a day or two literally to create, and they send it in and say, "What do you think of this silo setup? Do you think this will work?"

I usually take the wind right out of their sails because I say, "That's way too complex. It's a carpet-cleaning site for a small town. Are you kidding? You could have a flat site. You don't even need silo structure for that." Again, it's just a matter of trying to keep things as simple as possible. I think that comes with experience, but I'm trying to prevent some of you guys from overcomplicating stuff. I know, especially in the SEO industry, we often want to overcomplicate things. For a lot of local stuff, you don't need to do that anyway, unless you're in a really, really competitive market.

Best Practices When Using Serpspace Maps Powerhouse Service For Client Sites

All right, we're going to keep moving. Roger's got three in a row. We typically don't like that. Let's see. [inaudible 00:16:43] Looks like multiple questions per submission, so we might have to skip over a couple and come back to them, Roger, because typically we don't want to monopolize time. "Can you talk about the best practices for using the Serp Space Maps Powerhouse service?" I'm not going to comment on that yet. If Marco were here — Hernan or Adam, if you guys want to take a stab at that, I'd be happy to hear it.

Hernan: Yep.

Bradley: [crosstalk 00:17:08] [that to 00:17:09] Marco.

Hernan: Yeah. From listening to Marco and knowing what he's developing, I would say that, first, you need to have your maps on point: have everything optimized on your side of things before submitting your maps to Maps Powerhouse to be boosted and embedded. Keep in mind that maps can be embedded as an iframe, and an iframe passes a ton of juice. We are using the same approach as we are with Video Powerhouse. But again, if you do not have a properly siloed map or a properly siloed video, it won't work that well. That would be one of the best practices.

This Stuff Works
The other thing that you need to keep in mind is that your question really resembles what Jay asked because, in some cases, you need to be poking to see if your maps will be able to rank in the [3-pack 00:18:11] or in the organic rankings. Keep that in mind. I think the answer that we gave to Jay really applies to this because, at the end of the day, we are talking about the [3-pack 00:18:20], but we're also talking about embedding iframes. That's basically what's changing.

Bradley: He says, "Should these be used for clients?" I say yeah, you can, Roger. He says, "If so, how many embeds should you create for a client's site before it gets spammy?" Think of it this way: it's an iframe, a Google property iframe. All we're doing is syndicating Google properties to as many places as you want, so it's not a spam signal. It's the same as syndicating YouTube videos, guys. It's an iframe owned by Google. Syndicating a map embed is not a spam signal for the website, the money site, if that makes sense — at least it's not now.

I'm not saying that it's going to be that way forever, but as it stands today, it's not a spam signal. As many credits as you have in Serp Space, that's how many embeds you can do before it gets spammy. If you have unlimited credits, you can do unlimited embeds before it gets spammy. If you've only got a thousand credits, then it's a thousand embeds before it gets spammy. Does that make sense?
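To make the iframe point concrete: a map embed is just a snippet of Google-hosted iframe markup syndicated onto other pages. Here's a rough sketch of generating that markup — the embed URL is a made-up placeholder, not a real listing, and the attributes are just common iframe defaults:

```python
from html import escape

def map_embed_iframe(share_url: str, width: int = 600, height: int = 450) -> str:
    """Wrap a Google Maps embed URL in iframe markup for syndication."""
    return (f'<iframe src="{escape(share_url, quote=True)}" '
            f'width="{width}" height="{height}" frameborder="0"></iframe>')

# Hypothetical embed URL for illustration only.
snippet = map_embed_iframe("https://www.google.com/maps/embed?pb=EXAMPLE")
print(snippet)
```

The same snippet can be dropped onto any number of pages; each placement is an embed of the Google-owned property, which is why it reads as syndication rather than as a spam link.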

Keyword Relevancy

Number two, "Should you populate the keyword relevancy addition with as many relevant keywords as possible?" Again, I'd rather have Marco comment on that. That is not my area of expertise, but he's not here today, Roger. We could probably answer that in a support ticket, though. I'm sorry, guys, I'm not completely up to speed on the maps embed system yet, so forgive me.

Secondary Embeds

"Please explain the use of secondary embeds. What are the benefits? How do you use them? How much should you use them?" Secondary embeds are all the networks that have been created. They're all themed, they're aged. [Daddy 00:20:01] has been working on it. We've got over 500,000 [Web2's 00:20:03] now in various categories that are all themed and being posted to on a regular basis. By the end of February, the end of next month, our team is planning on having over a million [Web2's 00:20:14]. That's the goal. Those are really, really powerful. That's why they're expensive as far as credit usage goes, but they are really powerful. That's something I don't recommend unless you absolutely need it. Just go with standard tier one embeds to begin with. See what kind of results you get. If you need an additional push, then you can come back and order secondary embeds.

Hernan: What I wanted to say also to Roger and to all of the guys listening to this is: please don't come and ask us, "How many links do I need to get to page one?" or "How many embeds do I need to get into the [3-pack 00:20:48]?" It's impossible for us to say. There's literally no way we can tell you guys how many embeds you need, and whoever claims they can, it smells fishy. You should keep that in mind. You need to approach it with tests. Don't do it all at once, Roger. My suggestion, and Bradley's and the guys', is to just do a handful of embeds. Do 500, I don't know, do a thousand, and then see. Then do another thousand and see how it looks. Then you test and test.

That will allow you a few things. Number one, you see if your embeds and your techniques are working, because you're giving them the time buffer to work, which is super important. Number two, it will save you credits, and credits mean money, so you will actually be saving money and time by doing this. Number three, you don't overdo it, which I think is one of the main concerns — like, how much is too much? You never overdo it because once you rank, you rank. Boom. Why do you need more embeds if you're ranked? You know what I mean?

Bradley: That’s right.

Hernan: Take it step by step, Roger. That would be my approach to it. This has been our advice for everyone doing IFTTT networks too. Some people want to do a gazillion IFTTT networks and we say, "Okay, let's start with one." Take it slowly. Again, we cannot tell you how much is too much. You need to do the tests.

Bradley: Do the bare minimum. Do the bare minimum required to get results, guys. As Hernan said, when I go in and I order a map embed or a video embed, from Video Power House, either one, I usually start off with around 50 embeds and then I wait. I wait for a week or two and see what kind of results I get from that. If I need to, I’ll go in and order a second one. Now, those of you that are in a real hurry, which I know most SEOs are, you can go with more.

This Stuff Works
That's fine. My point is, if I can get results from only 50 embeds, why would I want to spend 200 credits on 200 embeds if I could do it with 50? Not only that, but if I do 50 and get some results and it starts to slip a little bit, I can do another 50 instead of blowing my load all at one time, so to speak. You know what I mean? That's the way I do it and that's how I've been doing it. Same thing with videos and maps — pretty much one and the same when it comes to that sort of strategy.

Next, we're going to try to run through these very quickly. The only reason I'm going to actually answer all these questions, Roger, is because they're about our services and this could be beneficial to others that may have similar questions. Otherwise, this is a lot of damn questions in three posts.

Best Practices In Using IFTTT Networks Service In Serp Space For Client Sites

"Can you talk about best practices for using IFTTT networks from the Serp Space marketplace for client sites? Number one, should you use them for client sites?" Yes, you should be using branded networks for absolutely everything online, in my opinion. Marco makes a joke that if his dog sits still long enough, he gets an IFTTT ring around his neck. I get that, because any sort of websites, properties, money sites, money channels — they should all have branded networks. That's part of the Semantic Web, so yes, you should use them on client sites.

"What would you do for a second one, et cetera? How much would you do and how often?" All of this is explained in the IFTTT training. I'm pretty sure it's explained when you order networks too, but at least when we deliver a network, we have a series of videos that explain best practices and what you should do next, that sort of thing. Quickly: whether you build a network yourself or buy one from us, whichever way you get them, once you have a network, start publishing content from whatever you're using to produce the content. If it's WordPress, start publishing posts to your blog. If it's YouTube, start uploading or live streaming videos.

Once your network is done, start publishing content. Once you start doing that, you start priming your network with posts, and it starts to theme the network through the content that you're syndicating. Then I always recommend getting links built to the network. We have an IFTTT link building boosting service, or you can use your own tools. If you have link building tools, that's fine — use them. The point is we usually end up building links to the network and then just continue posting content.

If you need an additional boost, then go and order additional links, but just do things slowly, one step at a time, like Hernan just said. Get the network created, start publishing content, get links built to it. If you need more after some time, continue publishing content. The number one most important factor, guys, for your networks to work is to continually publish content. Then, if you need an additional push, you can build additional links to them, as well as do other things like order done-for-you drive stacks or build your own drive stacks — but those are very labor-intensive, just like networks, so it's often better to hire it out.

DFY RYS Folder Stack

That leads us to the last part of Roger's questions, about DFY, or Done For You, RYS Folder Stacks: "Does this still work?" It works better now than it did when we first launched the service, and I mean that. It's freaking fabulous what it does. "Maybe walk us through the order form." I'm not going to do that; it would take too long. "Do you just need one stack for a client site or should you create more than one?" Start with one, Roger. If you get results, then you don't need any more.

This Stuff Works
"When does it become too spammy?" They're Google Drive files, guys. They're not spammy. At least, not yet. I mean, yeah, they're spammy, but Google doesn't think of them as spammy yet. They're Google Drive files, so you're linking from Google to your own properties. "Is the stack created under the client's account?" No. Typically, the stacks are going to be created under a persona account, but you can assign the client as a manager and transfer ownership of all the files and folders to the client. When you order the done-for-you service, we don't want your client's account information. We don't want it. In fact, we will reject the order if you send it. We will create it under a persona account, and then you can add your client's profile as a manager and even transfer ownership if you desire.

"Do you supply the content for the folder, or is that up to us?" There's really no content to supply. You supply some images and some keywords. That's it. Supply some images and graphics if you want them themed or branded well, and some keywords. That's all you need to do. What needs to be done after the stack is created? Build links to it. That's pretty much it.

"Are there any other ongoing tasks that need to be done?" Not really, not with drive stacks. Like I said, if you still need an additional push — if you didn't get where you wanted to be from the stack itself — order links to it. Build links to the stack: all of the files in the stack, the main folder. There are so many URLs from drive files, it's ridiculous. If you need to, you can also start siloing inside the stacks, guys. You can use drive folders to create silos. You can do all kinds of stuff inside Drive, and we teach all that in RYS Academy, or you can purchase it from Serp Space.

Roger, again, one step at a time, buddy. Order just one at a time. That’s all you should be doing, guys. Don’t waste your money. Don’t order three or four networks for one project. You don’t need that, or three or four done-for-you drive stacks. You don’t need that. Just do one at a time. See what kind of results you can get. I want to keep moving.

Passing Authority Into Silos When You Add Tier 1 On Each Subdomain And Tier 2 Stack On Top

Greg says, "Scenario: new client, 100 city subdomains. WordPress site is currently configured with canonicals passing authority to perfectly siloed categories on the root domain, all ranking solid in the top 10 of the SERPs for medium to difficult keywords." That sounds awesome, Greg. I'm going to plus one that, actually. That is an industrial-strength project there. "I want to add T1 on each subdomain with a tier two stack on top." That's going to be a lot of work, Greg.

“Suggestion on how best to pass authority in two silos with this configuration. Suggestion on anything else to root domain with stacks to subdomains. PS, you white guys need to get a tan.”

Hernan: Yes, we do.

Bradley: As for suggestions on how best to pass authority into the silos, it doesn't seem like you need any, because you've got the canonicals on the subdomain sites pointed to the categories on the root domain. You're reinforcing, so all the stuff that you do on the subdomains is basically pushing the credit back up to the category silo on the root site. I'm assuming what you're trying to do is rank the root site, and that seems like a great way to do it.

The only other thing that I would suggest, besides just using canonicals, is maybe creating some physical links as well — some actual links from your subdomain sites up — because that becomes very, very powerful too. The canonicals are going to push credit, but they don't push link equity. You still need an actual hyperlink for that.
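To illustrate the distinction being drawn here, below is a generic sketch of the two kinds of markup — a canonical tag, which consolidates credit to the root category page, versus an actual hyperlink, which is what passes link equity. The URLs and anchor text are placeholders, not anyone's actual setup:

```python
# Sketch: canonical tag vs. actual hyperlink, per the discussion above.
# All URLs and anchor text below are hypothetical examples.

def canonical_tag(target_url: str) -> str:
    """Canonical on a subdomain page, consolidating credit to the root category."""
    return f'<link rel="canonical" href="{target_url}"/>'

def html_link(target_url: str, anchor_text: str) -> str:
    """An actual <a> hyperlink, which is what passes link equity."""
    return f'<a href="{target_url}">{anchor_text}</a>'

category = "https://example.com/services/carpet-cleaning/"  # placeholder root category
canonical = canonical_tag(category)   # goes in the subdomain page's <head>
link = html_link(category, "Carpet Cleaning")  # goes in the page body
print(canonical)
print(link)
```

The canonical goes in the head of the subdomain page; the hyperlink sits in the body content, and it's the second one that carries link equity up to the category.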

I would play around with that, Greg. I wouldn't just go out and, on a hundred subdomains, create a hundred links up to the category page all of a sudden. What I would do is probably take a handful of them: select one category on the root domain to test against the others, or maybe select two, so that you have two tests against the control — the control being all of your silos the way they are configured now — and then test a couple of other internal linking strategies from the subdomains to the categories. That way you can test different configurations and see what kind of results you get.

This Stuff Works
If you get better results from one configuration than another, then repeat that configuration on another silo and see if you can repeat those results. If you can, then you’re onto something. At least when you’ve got a project that big, it looks like you have plenty of opportunities to test different configurations.

It sounds like you've got a pretty advanced project there, Greg. It's awesome. This is something we can get into, Greg, if you want, when we have the Mastermind tomorrow. I just talked about keeping things simple, but this is a big project, so that's different. For a big project, a graphic or diagram helps — but don't spend from now until Mastermind time tomorrow creating one, Greg. Don't do that. If you have a very simple diagram of what you have, or at least maybe one or two silos out of this structure, we can play around with that tomorrow on the Mastermind. We'll dig into that a little bit deeper.

Go ahead, [Mark 00:31:50].

Hernan: Also, Greg, if you want to order the networks from Semantic Mastery, [let us 00:31:54] know. We'll give you a [ball for it 00:31:56] because it's a ton of work, man.

Making Profits With Google Suggested Keywords Having Low Search Volume

Bradley: Yeah, no shit. No kidding, but we're going to charge you extra for the tan joke. "Hi, guys. I've been following your Hump Day Hangouts for a few weeks now and I love your content. Thanks for the valuable info. I will plus one that." Thank you for saying that. "I watched Bradley's keyword research videos recommended in a previous Hump Day Hangout and the info is great. Again, thanks."

You know what's crazy is that most of those keyword research videos were recorded three or four years ago and they're still relevant today, which is insane, because there are certain principles that just don't change. For anybody that doesn't know what I'm talking about: I set up a little affiliate funnel two, maybe three years ago now — it's been at least two years — at keywordsuggest.co. It's free. It's an opt-in form. You opt in and it redirects you to, or sends you in an email, the link to the training site, and it's just keyword research stuff. If you guys are interested in that, go opt in at keywordsuggest.co.

"My question: When I use this approach for affiliate keyword research, the long tail keywords I found in the end with this process didn't really have any search [volume 00:33:06] showing up in Google Keyword Planner, or very little, although they were extracted from Google Suggest. What's the point in going after such keywords if I can't get any traffic from them, or what am I missing here?"

Good question. What you're missing here is — guys, remember — the Google Keyword Planner is an AdWords tool. It is not an SEO tool. In order for a search phrase to show up in Google Suggest, there has to be some history of people actually searching for that term. Does that make sense? Google will not suggest search queries that don't have a history of being searched for by actual people typing that full query in. Google doesn't just arbitrarily suggest search queries to people as they start typing. It goes based upon its own historical data. If Google suggests it, it means there's traffic there.
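For the curious, those suggested phrases come from Google's autocomplete endpoint, which keyword tools build on. Here's a minimal sketch that just constructs the request URL; the endpoint and parameters reflect the commonly cited public autocomplete URL, so treat the details as an assumption, and no request is actually sent here:

```python
from urllib.parse import urlencode

def suggest_url(query: str, lang: str = "en") -> str:
    """Build a Google autocomplete request URL for a seed query."""
    params = urlencode({"client": "firefox", "hl": lang, "q": query})
    return f"https://suggestqueries.google.com/complete/search?{params}"

# Hypothetical seed keyword for illustration.
url = suggest_url("carpet cleaning")
print(url)
```

Fetching that URL would return a JSON list of suggestions, each of which, per the point above, carries some real search history behind it.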

We've been saying that for two years now, and that's why Power Suggest Pro is my absolute favorite keyword tool in the whole world. That's really what that keyword funnel I'm talking about is for, guys: to promote Power Suggest Pro. I talk about how you can use free tools for everything, but Power Suggest Pro costs $57. It's a one-time purchase, no update costs, no subscription. It's probably the best keyword tool I've ever used in my life. I love it for SEO.

This Stuff Works
What I'm trying to say is that AdWords won't even let you bid on long search queries or keyword phrases. It won't let you, right? It will tell you that there's low search volume, so it won't even display your ads. I think anything over 10 words it'll just flat out refuse, but Power Suggest Pro will sometimes show you suggested phrases that are 10 or 12 words — almost an entire sentence. That means people have searched for that before.

My point is that if you go to the AdWords keyword tool and you put in suggested phrases, you're going to get very, very little in the way of traffic results. It's not going to show any search volume, or very little if at all. That's fine, because people that are bidding on keywords aren't usually bidding on long phrases. They're bidding on words within that phrase using various match types — broad, phrase, exact, or modified broad — so they don't need to bid on the full phrase. They only need to bid on two or three keywords out of the phrase to get their ad to show.

My point is, don't use the Google Keyword Planner for SEO terms. You can use it to give you ideas or some benchmark numbers, but do not count on the Google Keyword Planner for bona fide, genuine search volume for SEO, because that's not what it is. It's an AdWords tool. Suggested phrases are much better for SEO because you can optimize for those long phrases in blog posts. You can specifically address those long phrases, you will start to generate traffic from them, and they're a hell of a lot less competitive too, by the way.

"So, in other words, do you think it's possible to make profits with an affiliate site by optimizing blog posts and pages for Google-suggested long tail keywords, or am I just wasting my time?" Absolutely. My suggestion is that you don't focus on the short tail keywords, at least not initially. Build your silo structure properly within your site: the short keywords that show search volume — the much broader, shorter phrases — are going to be at the top of your silos. You want to build your silos out and then populate them with articles and blog posts targeting these long tail phrases, and you will start generating traffic from those long tail phrases first. Over time, you will build authority and reinforce the theme of the silo.

Eventually, if you continue to produce content and get traffic, you should end up ranking for your top-level terms too. Does that make sense? Always go after the long tail stuff to get some results first. Again, like I said — I don't know if you guys are seeing my full screen, but I'm just about to open up Power Suggest right here. This is one of the simplest keyword tools I've worked with. One of the reasons I love it is because it's so simple. How many keyword tools out there do you need a freaking degree to work? With this one, you just punch in a keyword and hit Go, and it spits out a whole bunch of keyword phrases that are actually being searched for, whether the Keyword Planner shows it or not. Hopefully that helped.

Sending Links In Hebrew From An English Article To A Hebrew Website

[Asi 00:37:49] is up. He says, "I don't know where to start but I will try. I do SEO for a living in Israel, but in Israel there are limited possibilities for building links. My first question is whether there's a possible way to send links in Hebrew from an English article to a Hebrew website." Hernan, I'm going to let you answer this because you do the foreign language stuff. I don't.

Hernan: I'm trying to reread the question. Links in Hebrew from an English article to a [inaudible 00:38:19]. Tier one should be English, yeah, definitely. Is there an advantage to sending English links to a Hebrew website? The reality is, [Asi 00:38:32], and we've talked about this repeatedly: when you're Google, or any company for that matter, and you need to invest your resources — and by resources I mean AI, linguistics, whatever you want to call it — of course you will invest in the bigger market, which is Google.com.

That means that for the other languages — Spanish, for example, Portuguese, et cetera, and I haven't worked in Hebrew, but I think it will be pretty much the same as any other foreign language — things become much easier, in the sense that the algorithm lags behind, if you will. All of the things that are being deployed right now in English will take approximately a year, maybe a year and a half, to reach even the biggest other languages. I'm not talking about Hebrew; I'm talking about Spanish, for example, which is a big language in terms of the sheer number of people that use it to navigate online. I'm not saying Hebrew is a lesser language; it's just about the number of people using it.

Yeah, definitely. My point is that you can use PBNs in English as tier one. You can turn a website that was in English into Hebrew and Google will not de-index it. PBNs in English have a big [problem 00:39:59] with de-indexations. This is not the case with some other languages. I have PBNs in Norwegian, for example, that have been around forever. They are fucking spammy. My point is that, yeah, you can go to town with tier one in English, because I truly understand your pain: it's super hard to find links in the same language. Not only that — if you start looking deeper, like trying to find niche-related links in Hebrew, it's going to be impossible. Yeah, go ahead.

This Stuff Works
Bradley: Thanks, Hernan. He says, "The second one: I'm going to do SEO for a major one-million-searches-a-month word in English. Does aggregate SEO look spammy to Google on big keywords?" I don't follow that question, [Asi 00:40:55]. Sorry, buddy. I just don't understand what you're asking here. "Does aggregate RSS look spammy to Google on big keywords?" I'm not sure what you mean. I'm sure it's just lost in translation here somehow. If you want to try to clarify that question, I'll try to answer it. Sorry.

Sending All Links Built From Serp Space To An Indexer

Wong is up. He says, “Bradley, I have links built in Serp Space six months ago. Currently most of the Web 2.0s, excuse me, are found by Ahrefs, but Google is still not indexing them yet. Is it necessary to send all of them to an indexer?” A couple things: Number one, it’s funny that you’re seeing … If you had links built to your Web 2 network, you probably will start seeing some of your Web 2 links from your network showing up in Ahrefs and Majestic because they were powered up by the link building package.

Because we talk about this a lot, Wong, most of your Web 2 links you’re not going to see in Ahrefs or Majestic because they just don’t get picked up. They’re not significant enough to be picked up by Ahrefs or Majestic, because you’ve got to think about the computing power required to keep a fresh index of active and live links, guys. A lot of the Web 2 links are not going to show in those tools unless they’re particularly powerful links, which, if you build links to your IFTTT networks, naturally some of those links are going to become powerful enough that they will start to show in the link analysis tools.

However, just because you’re saying they’re not indexed in Google, and that may be true, you may have done a site colon operator or an info colon operator search and looked at some of your Web 2s and you’re not actually showing in the index, that doesn’t mean Google doesn’t know they’re there and isn’t counting them. That’s something that you can check if your site is connected to Search Console. You should be able to go to Search Console, then click in the left sidebar on one of the drop-downs; it’s Links to Your Site. You click on that and then you can expand or even download a list of links.

We’ve talked about this every week now for the last few weeks. Terry Kyle recently did a blog post and a test showing that even Google is starting to show fewer and fewer of those links. If you take a look at that Links to Your Site section inside Search Console, you’ll see a bunch of WordPress, Diigo, Tumblr, and Delicious links showing up, not the specific URLs where the link sits, but you’ll see the root domains, and most of your IFTTT properties will show in that list is what I’m saying.

Again, just because Google doesn’t have those, part of the reason why some of those links don’t show in the index, guys, like if you do a site colon operator search for the specific Web 2 links, is because it’s duplicate content. Don’t let me scare you guys away, but what I’m saying is it’s a republished, syndicated post from the original source. Google will put some of those in what they call the supplemental index. They won’t show, but that doesn’t mean Google doesn’t know they’re there. Does that make sense? Just keep that in mind. It’s not really necessary to have them show up in Majestic or Ahrefs. I like it when they don’t show. The reason why is because I don’t like competitors looking and seeing what the hell I’m doing. I prefer it when they don’t show, but if they do, that’s fine too.

Do you want to send them to an indexer? Yeah, I always do. I always at least send the profile homepage URLs, like the profile URLs, to an indexer. Typically, we don’t grab the post URLs, the syndicated post URLs, and send those to an indexer because there’s not really a way to automate that now. There used to be, but the plugin that did that, which was called Backlink Commando, is no longer being supported, so it’s not an automated method anymore. Yeah, if you want to send your profile URLs to an indexer, that’s not going to hurt anything.

This Stuff Works
You can send all your post links too if you want. It’s just a manual process. Unless somebody knows of an automated way, we don’t have one at the moment. The best indexer, and this comes straight from our link building manager’s mouth, and I trust his judgment, so if he says it is so, I believe it, is expressindexer.solutions. Right now, he says he’s getting 80% of his GSA links, which are spam links, indexed through this service. This is what I would recommend for indexing services right now.

Purchasing G+ Followers & Likes

We’re going to keep moving. We only got a few minutes left. Let’s see. Virginia [Surgeons 00:45:27], I think he’s Toby. Yeah, Toby. There we go. “Is purchasing G Plus followers and likes worth doing?” Not really. I think you’re better off if you can find a good social signal service and do it that way. I don’t mean just buying them; what I’m saying is Dan Anton has actually got one, and it’s a good service. It’s called Social Network Signals. That’s a good one, but typically just buying spam followers and Plus Ones is not a good idea.

When it first started it was okay, but I don’t recommend doing that now. I really don’t. I don’t recommend doing that at all because it just looks spammy. There’s less activity on Google Plus than there is on Facebook and Twitter as far as shares and that kind of stuff. If you have content out there that all of a sudden has an unusual amount of followers or likes, it’s just a real spam signal. I recommend just avoiding it. Your effort and money should be spent on stuff that’s going to be more productive.

Paul is up. “Hey, guys. Glad you posted a link to this page because last week it didn’t matter where I went, it would only go to the webinar registration page so I could not post a question. I want to share some results I’ve had in the short term with Megarray. Listen to this, guys. I’m crushing it with this program. Between 60 and 80% of my videos are on the first page, and the majority of them are number one. We aren’t talking about low competition keywords either. I’m ranking in niches like gifts, natural supplements, attorney sites, and storage sites. All are very competitive. I have covered around 600 cities now. This is definitely a game-changer.”

Guys, anybody that missed that Megarray webinar, we have a replay for it. I think it’s a very, very powerful tool when I said so on the webinar and that wasn’t bullshit. That was the truth. I know those guys, they got a full team of developers behind that tool that have been working on it for months because they were in contact with us about features and all kinds of stuff so I know they put a ton of development behind it. It’s a great, great tool.

Let’s see. “I put up two sites in the last two weeks. Only optimized them on page, branded network, of course, nothing else. I then did a video campaign on both sites and both came from nowhere in the rankings. One is on page four, the other went to page two. One site is an attorney site and the other is the storage site. Wow. One company I promoted the product for generated 11 leads and six sales. The sales averaged over a thousand dollars each in less than two weeks.” Paul, that’s amazing, man. Thanks for sharing that. “To think I almost didn’t go to the webinar because of your lack of enthusiasm, Bradley, about the product when you promoted it on the webinar. Thanks for promoting.”

This Stuff Works
Guys, my lack of enthusiasm in promoting it was misguided in that the only thing was we had recently promoted Video Marketing Blitz, which was Abs’ product, and Abs’ product is really, really great. I hired a VA and trained him how to use that tool specifically. We typically don’t like to promote a similar product to our list within a certain period of time. Since we had promoted Video Marketing Blitz, we almost hesitated promoting Megarray at all because we didn’t want to piss off our list, you know what I’m saying, by promoting a similar product from two different developers.

I believe in both products, though, and that’s why we did it, and so that lack of enthusiasm was really my hesitation and not wanting to piss you guys off by promoting another product. Hopefully you guys understand by now that we don’t just promote products for the sake of making money. We don’t do that, and hopefully you guys all agree with that. Anyways, again, I’m more enthusiastic about it now, Paul. Thanks.

Guys, go check it out. We have to wrap it up. We’ve got to get ready for the masterclass webinar, and it takes a few more minutes to get ready now that we have this new platform we’re working on. Unfortunately, I’m not going to be able to get to the rest of the questions, but thanks to everybody for being here. Masterclass starts in a few minutes. Be there. If you’re not there yet, you should probably join us because we’re [crossing it 00:49:25] over there with our live case studies [inaudible 00:49:27] now.

Hernan: Be there.

Adam: Awesome. See you, everybody. If you didn’t get your questions answered, remember it’s first-come, first-served. We get busy, so by all means get those questions on there when we send out those emails. We mean it, first-come, first-served. We do run out of time. We want to help everybody, but we have to cut it off at some point. Hopefully, everybody can get in earlier next time.

Bradley: We do have people actually putting questions in the chat box. That’s crazy. I thought we were using the event page.

Hernan: [inaudible 00:49:55] tomorrow on the podcast.

Bradley: That’s cool. We’ll work it out next week [and we’ll get 00:50:00] back.

This Stuff Works

Weekly Digital Marketing Q&A – Hump Day Hangouts – Episode 116 posted first on your-t1-blog-url

How Would You Setup YouTube Videos’ Silo Structure For Local Cities?

In episode 115 of our weekly Hump Day Hangouts, one viewer asked about the best way to set up the silo structure of YouTube videos to optimize for local cities.

The exact question was:

How would you recommend YT structure for local cities? Do you agree with “others” that say channel name = KW+USA, then playlist = KW+state, then videos titles = KW+Counties, then description to put in cities??? If not what channel and silo structure do you recommend? Finally, how many silos per site? Thanks!!! Happy New Year!

This Stuff Works

How Would You Setup YouTube Videos’ Silo Structure For Local Cities? posted first on your-t1-blog-url

Why You Should Steal My Daughter's Playbook for Effective Email Outreach

Posted by ronell-smith

During the holidays, my youngest daughter apparently had cabin fever after being in the house for a couple of days. While exiting the bedroom, my wife found the note below on the floor, after the former had slyly slid it under the door.

Though tired and not really feeling like leaving the house, we had to reward the youngster for her industriousness. And her charm.

Her effective "outreach" earned plaudits from my wife.

"At least she did it the right way," she remarked. "She cleaned her room, washed dishes, and read books all day, obviously part of an attempt to make it hard for us to say no. After all she did, though, she earned it."

Hmm...

She earned it.

Can you say as much about your outreach?

We're missing out on a great opportunity

Over the last few months, I've been thinking a lot about outreach, specifically email outreach.

It initially got my attention because I see it so badly handled, even by savvy marketers.

But I didn't fully appreciate the significance of the problem until I started thinking about the resulting impact of bad outreach, particularly since it remains one of the best, most viable means of garnering attention, traffic, and links to our websites.

What I see most commonly is a disregard of the needs of the person on the other end of the email.

Too often, it's all about the "heavy ask" as opposed to the warm touch.

  • Heavy ask: "Hi Ronell ... We haven't met. ... Could you share my article?"
  • Warm touch: "Hi Ronell ... I enjoyed your Moz post. ... We're employing similar tactics at my brand."

That's it.

You're likely saying to yourself, "But Ronell, the second person didn't get anything in return."

I beg to differ. The first person likely lost me, or whomever else they reach out to using similar tactics; the second person will remain on my radar.

Outreach is too important to be left to chance or poor etiquette. A few tweaks here and there can help our teams perform optimally.

#1: Build rapport: Be there in a personal way before you need them

The first no-no of effective outreach comes right out of PR 101: Don't let the first time I learn of you or your brand be when you need me. If the brand you hope to attain a link from is worthwhile, you should be on their radar well in advance of the ask.

Do your research to find out who the relevant parties are at the brand, then make it your business to learn about them, via social media and any shared contacts you might have.

Then reach out to them... to say hello. Nothing more.

This isn't the time to ask for anything. You're simply making yourself known, relevant, and potentially valuable down the road.

Notice how, in the example below, the person emailing me NEVER asks for anything?

The author did three things that played big. She...

  • Mentioned my work, which means she'd done her homework
  • Highlighted work she'd done to create a post
  • Didn't assume I would be interested in her content (we'll discuss in greater detail below)

Hiring managers like to say, "A person should never be surprised at getting fired," meaning they should have some prior warning.

Similarly, for outreach to be most effective, the person you're asking for a link from should know of you/your brand in advance.

Bonus: Always, always, always use "thank you" instead of "thanks." The former is far more thoughtful and sincere, while the latter can seem too casual and unfeeling.

#2: Be brief, be bold, be gone

One of my favorite lines from the Greek tragedy Antigone, by Sophocles, is "Tell me briefly — not in some lengthy speech."

If your pitch is more than three paragraphs, go back to the drawing board.

You're trying to pique their interest, to give them enough to comfortably go on, not bore them with every detail.

The best outreach messages steal a page from the PR playbook:

  • They respect the person's time
  • They show a knowledge of the person's brand, content, and interests with regard to coverage
  • They make the person's job easier (i.e., something the person would deem useful but not necessarily easily accessible)

We must do the same.

  • Be brief in highlighting the usefulness of what you offer and how it helps them in some meaningful way
  • Be bold in declaring your willingness to help their brand as much as your own
  • Be gone by exiting without spilling every single needless detail

Bonus: Be personal by using the person's name at least once in the text, since it fosters a greater level of personalization and thoughtfulness (most people enjoy hearing their names):

"I read your blog frequently, Jennifer."

#3: Understand that it's not about you

During my time as a newspaper business reporter and book reviewer, nothing irked me more than having people assume that because they valued what their brand offered, I must feel the same way.

They were wrong 99 percent of the time.

Outreach in our industry is rife with this if-it's-good-for-me-it's-good-for-you logic.

Instead of approaching a potential link opportunity from the perspective of "How do I influence this party to grant me a link," a better approach is to consider "What's obviously in it for them?"

(I emphasize "obviously" because we often pretend the benefit is obvious when it's typically anything but.)

Step back and consider all the things that'll be in play as they consider a link from you:

  • Relationship - Do they know you/know of you?
  • Brand - Is your brand reputable?
  • Content - Does your company create and share quality content?
  • This content - Is the content you're hoping for a link for high-quality and relevant?

In the best-case scenario, you should pass this test with flying colors. But at the very least, you should be able to successfully counter any of these potential objections.

#4: Don't assume anything

Things never go well when an outreach email begins "I knew you'd be interested in this."

Odds suggest you aren't prescient, which can only mean you're wrong.

What's more, if you did know I was interested in it, I should not be learning about the content after it was created. You should have involved me from the beginning.

Therefore, instead of assuming they'll find your content valuable, ensure that you're correct by enlisting their help during the content creation process:

  • Topic - Find out what they're working on or see as the biggest issues that deserve attention
  • Contribution - Ask if they'd like to be part of the content you create
  • Ask about others - Enlist their help to find other potential influencers for your content. Doing so gives your content and your outreach legs (we discuss in greater detail below)

#5: Build a network

Michael Michalowicz, via his 2012 book The Pumpkin Plan, shared an outreach tactic I've been using for years in my own work. Instead of asking customers to recommend other customers for a computer service company he formerly owned, he asked his customers to recommend other non-competing vendors.


Genius!

Whereas a customer is likely to recommend another customer or two, a vendor is likely able to recommend many dozens of customers who could use his service.

This is instructive for outreach.

Rather than asking the person you're outreaching to for recommendations of other marketers who could be involved in the project, a better idea might be to ask them "What are some of the other publications or blogs you've worked with?"

You could then conduct a site search, peruse the content the former has been a part of, then use this information as a future guide for the types and quality of content you should be producing to get on the radar for influencers and brands.

After all, for outreach to be sustainable and most effective, it must be scalable in an easy-to-replicate (by the internal team, at least) fashion.

Bonus: Optimally, your outreach should not be scalable for anyone but you/your team. That is, it's best to have a unique-to-your-brand process that's tough to achieve or acquire, which means it's far less likely others will know about it, copy it, or use one like it.

Awaken your inner child, er, PR person

Elements of the five tips shared above have been, singularly, on my mind for the better part of two years. However, they only coalesced after I read the note my daughter shared, primarily because her message delivered on each point so effectively.

She didn't wait until she needed something to get on our radar; never over-sold the message; was selfless in realizing we all likely needed to get out of the house; didn't assume we were on the same page; and activated her network by sharing the note with her sister first and, through her mom, me.

Now, the question we must all ask ourselves is whether the methods we employ are as effective.



Sunday, January 29, 2017

Do You Have Any Issues With Burner App On Phone?

In episode 115 of the weekly Hump Day Hangouts by Semantic Mastery, one viewer asked if the team had encountered problems with burner app on phone.

The exact question was:

Hi guys, couple quick things. First, I’m testing out Megarray- was impressed I put in a ticket and they called me on the phone. Pretty decent support! Will report back after some tests. Next, anyone have issues with burner app on phone sometimes not working with Google coming back and saying this number is not valid? Finally, I just finished watching the YT Silo course- GREAT STUFF!

This Stuff Works

Do You Have Any Issues With Burner App On Phone? posted first on your-t1-blog-url

Saturday, January 28, 2017

What Are Your Thoughts On Serplify?

In episode 115 of our weekly Hump Day Hangouts, one participant asked about the team’s insights on the SEO tool Serplify.

The exact question was:

Have you guys seen the new Serplify by Mo Miah it a serp shaker type software. Here is YT https://www.youtube.com/watch?v=ykc85-futKs

This Stuff Works

What Are Your Thoughts On Serplify? posted first on your-t1-blog-url

Friday, January 27, 2017

Google Pop Up Update With Example

So, you’ve probably heard about the Google announcement back in 2016 dealing with mobile user experience…right?

If not, you might not be aware that they announced that as of January 10, 2017, “pages where content is not easily accessible to a user on the transition from the mobile search results may not rank as highly”.

Hmmm…along with their images showing what they consider to be “bad” pop ups and interstitials, most people might want to check this out!

Luckily, Hernan made some changes on the Semantic Mastery website to make sure we were offering a good experience while complying with the new mobile search requirements – and he recorded a video showing you exactly what we did:

The good news?

It’s not that hard to reconfigure and most cases are very easy to deal with – either delaying a pop up based on user interaction via:

  • Timed delay, or
  • User movement of the page (i.e. waiting until the page has been scrolled)
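To make the two trigger styles above concrete, here is a minimal JavaScript sketch of a pop-up that waits for a timed delay or for the user to scroll before appearing. This is a generic illustration, not the actual Ninja Pop Ups configuration; the element id `newsletter-popup`, the delay, and the scroll threshold are made-up values for the example.

```javascript
// Illustrative values (not from the original post).
const POPUP_DELAY_MS = 8000;   // timed delay before the popup may appear
const SCROLL_TRIGGER_PX = 400; // or appear once the user has scrolled this far

let popupShown = false;

// Pure decision helper: show the popup once EITHER condition is met,
// but never more than once per page view.
function shouldShowPopup(elapsedMs, scrolledPx, alreadyShown) {
  if (alreadyShown) return false;
  return elapsedMs >= POPUP_DELAY_MS || scrolledPx >= SCROLL_TRIGGER_PX;
}

function showPopup() {
  popupShown = true;
  const el = document.getElementById("newsletter-popup");
  if (el) el.style.display = "block";
}

// Browser-only wiring: check both triggers against the decision helper.
if (typeof window !== "undefined") {
  const start = Date.now();
  setTimeout(() => {
    if (shouldShowPopup(Date.now() - start, window.scrollY, popupShown)) showPopup();
  }, POPUP_DELAY_MS);
  window.addEventListener("scroll", () => {
    if (shouldShowPopup(Date.now() - start, window.scrollY, popupShown)) showPopup();
  }, { passive: true });
}
```

The point of either trigger is that the interstitial never covers the content on the initial load from mobile search results, which is what the January 10 change targets.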

Hopefully this gets you moving in the right direction – providing a good user experience while also not suffering from a negative ranking signal!

You can see the original announcement on the Google Blog Here: Helping users easily access content on mobile

While this was done on our site using Ninja Pop Ups, the same applies no matter what WordPress plugin you are using, or if you are on a different CMS altogether.


Google Pop Up Update With Example posted first on your-t1-blog-url

How To Index Citations?

In episode 115 of Semantic Mastery’s weekly Hump Day Hangouts, one viewer asked how to index citation sites in Google.

The exact question was:

Hey guys hope you’re all starting the year off with a bang! I’m excited about this year and believe I will be able to finally join the Mastermind so looking forward to that!

My question is, How are you indexing your citations? I still have quite a few not indexed and they were built a while ago now. If I just copy and paste the url of the citation page into google and it doesnt show does that mean its not indexed or may my links still be indexed? Cheers

This Stuff Works

How To Index Citations? posted first on your-t1-blog-url