Tuesday 27 August 2013

Know When to Cut Your Losses: How To Switch to a New Domain


Over years of operation, websites pick up a lot of baggage in the links department. Google's Penguin algorithm seems to take no account of whether links were accumulated before such link building became an offence against Google's terms of service. If you innocently built links in the sure and certain knowledge that it was OK to do so, way back when, you might now be eyeballing a disastrous drop in traffic in recent months…
If you read the book that said to build deep links with keywords in the anchor text, then you are surely in Google's gun sights.
Panda, Penguin and Phantom repairs meet with mixed success
Some site owners solve their issues and are duly restored to the ranks of the faithful. Others struggle to make sustained ranking gains, with SERP profiles like a rollercoaster diagram after being hit by multiple algorithms and subsequent version releases.

There comes a point where the cost of fixing the problems greatly exceeds the cost of migration to a new domain. If your site has thousands of incoming links, and a close examination indicates the majority are potentially harmful, then you are at the point where the evaluation of domain migration should be made.
Efforts to clean up your link portfolio by trying to get low-quality links removed may not have sufficient impact. This scenario was expected when Google set out to tackle link manipulation by application of penalties.
Google's Matt Cutts has said publicly: "If you've cleaned and still don't recover, ultimately, you might need to start all over with a fresh site."
In some cases it may be the only recourse a site has, should the sheer volume of bad incoming links exceed a certain threshold.
In my experience with a recipes site carrying approximately 8,000 links generated by a previous owner, many were blogroll-type links from low-quality Blogger, Blogspot and WordPress.com cooking sites. Of those sites, less than half had any contact details whatsoever. Of those that did, many were fakes; no@way.com being an excellent example of one anonymous site owner's attitude to efforts at contact.
Of the e-mails sent to possibly valid addresses, a mere eight responses were received over a two-week period. One site owner wanted $5 per link to remove 23 links. Four indicated that they had removed all the links. One stated that the site was only a month old and had no outward links; the probability being that the domain had expired and been acquired by a new owner. Link removal campaigns can easily drag on for many months, regardless of how hard you work at them, or whether you use Disavow to assist.
Traffic had already declined from 240,000 per month in January 2012 to 70,000 in May 2013. A “Starting Over” domain migration made a lot more sense under such circumstances… That was completed by late June, with an immediate response in traffic growth.
  • The July traffic was 93,000.
  • As of Aug. 9, the trend continues, with a predicted 135,000 visitors for the month.
A similar pattern of results has occurred on several other sites I’ve worked on.
How Difficult is Starting Over?
Actually, it's not as difficult as it might at first appear… and it's certainly a lot less onerous than attempting to fix 11,000 bad links.
The presumption is that this is not a "churn and burn" site — it's a well-designed site with a serious investment in good content that has fallen into disrepute via inappropriate link building practices.
Pros of Switching Domains
If the original domain has great content, and the majority of bad links are directed at the domain name rather than internal pages, it should be possible to transfer significant value across to the new site. That’s done by transferring content to the new domain, and using the power of 301 redirects to transfer an individual page’s rankings and intrinsic value over to its counterpart on the new domain.
Change of Address Support — both Google and Bing Webmaster Tools provide ‘change of address’ procedures.
Loss of Links Issues — every poor link you leave behind improves your prospects in terms of rankings. Bear in mind that you may be getting penalized from multiple quarters:
  • Low quality of links — bad neighborhoods, wrong genre etc.
  • Deep links from low-quality sites.
  • Over-optimization of anchor link text, exact-match keywords in link titles.
  • Link acquisition speed of past quick link building efforts — too many, too fast.
Any combination of the above leaves a site completely crippled and on life-support. Leaving the mess behind you en masse may be more cost-effective than trying to rehabilitate it.
Many Good Links are Salvageable — in my experience, most good sites that already link to you are receptive to a polite request to edit your domain name and/or anchor text in the link to your old site. Right now, in Google’s Penguin equation, one good link is literally worth more than 11,000 bad ones.
301 Redirects will preserve all user bookmarks and deep links from other sites, but they DO transfer the stigma from suspect links. So, while you can preserve all of the residual page value in the existing site by adding a 301 redirect for every page, post, category and so on, some caution is required (see the sketch after these two points):
  • Identify internal pages with low-quality incoming links, especially those with keywords in the anchor text.
  • Either get those links removed or don't add a 301 redirect for the pages they point to. Instead, delete the content and leave a brief note with a standard hyperlink explaining where the content has moved.
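As a minimal sketch, selective redirects in an Apache .htaccess file might look like this (the domain and page names are hypothetical):

    # Redirect clean pages individually to their counterparts on the new domain
    Redirect 301 /recipes/apple-pie.html http://www.mywebsite.co/recipes/apple-pie.html
    Redirect 301 /recipes/banana-bread.html http://www.mywebsite.co/recipes/banana-bread.html

    # A page targeted by bad links gets NO redirect; serve 410 Gone instead
    Redirect gone /recipes/spam-magnet.html

Each clean page gets its own rule, while the tainted page is deliberately left behind.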
Speed — rankings start accumulating within a week on the destination domain, and expand rapidly.
Cost — given that ALL of the current site is transferred to a new location, it may take a day or so to:
  • Move it to a new domain.
  • Configure the 301 redirects.
  • Add the appropriate settings in Google and Bing webmaster tools.
Time needs to be allocated to asking owners of the good links to edit their links to point from the old domain to the new one — but speed is not crucial to a positive outcome.
Conversely, there are literally weeks of work in sorting out the bad links problem before any improvement to rankings and traffic will be discernible.
Cons in Domain Migration — Pretty minor, all things considered.
Potential Client Confusion — There may be the odd client who gets slightly confused if you change the domain name. However, because you need to keep the home page in place for at least six months, and all required internal pages will be redirected automatically when links are clicked or followed, it should be a minor issue. Use a domain name that is almost identical and your visitors will barely notice. For example, www.mywebsite.com becomes www.mywebsite.co, or some similar approach. If the design and layout remain similar, it should prove to be a seamless transition.
The Mechanics of Domain Change
The new domain must be selected, and this could be an opportunity to choose a better fit for your brand. Businesses evolve over time, so the domain name that suited the start-up might now bear little relevance to what you do. Alternatively, that aspect may be fine and you will just want to use a visually similar domain name to minimize confusion.
Consider Future-Proofing the New Site
As you are creating the new site from the old content, this could be a good opportunity to fix the known shortcomings of the old website. Examine items such as:
  • Navigation structure
  • Search-friendly URLs
  • Menu organization
  • Blog integration
  • Page load times
  • Mobile responsive design
  • Call to action and unique selling proposition
These will help ensure you make the most of the opportunities and get the best value from your relocation investment. The content must be transferred across — and if it’s static HTML, then getting it into a modern content management system makes the most sense.
Understanding Your Old Site’s Link Profile
Most links tend to point to the domain name rather than internal pages — both good and bad. Orchestrated link-building campaigns tend to target a range of primary internal pages, so reviewing previous worksheets detailing which links were built will give you a head start on determining which pages might be a problem.
To make informed decisions on how to handle content transfer and redirection, you need access to reliable data. A thorough understanding of WHO links to WHICH pages with WHAT anchor text is required. Armed with that information, you can then prioritize which pages need special care to prevent contamination of the new domain.
Google and Bing Link Data Accuracy
Both webmaster tools provide you with incoming link information, albeit with a slightly different perspective and emphasis. Combining both sets increases the overall count.
Data accuracy from both these sources is sub-optimal and makes the entire process twice as hard as it should be.
Both Google and Bing appear to cling desperately to historical link data that may literally be years old. Some of these links you may very well know are dead and gone due to your own efforts. Percentages vary, but sometimes more than 40 percent of the links are no longer functional. This really clutters up the data, because there can be a large number of false positives to sift through.
That also raises the question: “Is the data Google makes available to webmasters the SAME data that Penguin uses to slam sites for inappropriate links?”
If so, using substandard data to apply severe penalties would be an inexcusable state of affairs.
If the data differs, that’s equally inexcusable as it makes webmaster compliance with guidelines and terms of service unnecessarily difficult. The problem with Penguin is that it does not necessarily punish the guilty. The innocent also become collateral damage.
  • Many site-owners contracted out work to supposed professionals who took unapproved short cuts…
  • A site changes hands, and the new owner bears the burden of repairing the previous owner's damage…
  • Mr. Know-it-all next door gives the mom ’n pop site owner some seriously bad advice, and they take it.
  • A good directory (e.g. DMOZ) gets cloned and spawns dozens of copies.
Third Party Tools
I use LinkAssistant to help combine and sort out the messy data — importing CSV link data files into LinkAssistant is a simple task. Then I set the software to query each linking site via the “Update Backlink Data” option for:
  • Link Status — live, not found, nofollow, noindex, page not found, site not found
  • Backlink page, anchor text, anchor URL
That gives the best possible information to tackle the problem of page link contamination head-on. I do that via a series of steps after the backlink data is updated. The goal is to thin out the backlinks that are not part of the problem:
  • Copy all the data to a spreadsheet.
  • Sort by link status.
  • Remove all rows with nofollow or page not found status (link / site not found may only be a temporary status).
  • Sort by backlink page.
  • Remove all rows with original domain name as the backlink page (because we are leaving that behind).
  • Sort by domain name.
  • Remove all rows where the backlink is from a known good domain (dmoz.org, facebook.com, plus.google.com, etc.).
  • Go through and highlight the potentially damaging low-quality domains.
  • Sort by anchor text.
  • Remove all rows with domain name / website / click here anchor text — unless they are from a known bad site.
  • Highlight in red the potentially damaging “over-optimized” anchor texts.
  • Sort by Anchor URL.
Now you can examine the residue closely — it will show you who links to which pages with what anchor text. You can then decide which pages you might be best NOT 301 redirecting to the new version of the site on the new domain.
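For larger link profiles, the same triage can be scripted. Here is a minimal sketch in Python with pandas, assuming a CSV export with columns named Backlink Page, Anchor URL, Anchor Text and Link Status (the column names and domains are assumptions; adjust them to match your own export):

    import pandas as pd

    df = pd.read_csv("backlinks.csv")

    # 1. Drop links that pass no value. Keep "site not found" / "link not found"
    #    rows for a later re-check, as that status may be temporary.
    df = df[~df["Link Status"].str.lower().isin(["nofollow", "page not found"])]

    # 2. Drop rows where the backlink page sits on the old domain itself;
    #    those are internal links, and the old site is being left behind.
    OLD_DOMAIN = "www.mywebsite.com"  # hypothetical
    df = df[~df["Backlink Page"].str.contains(OLD_DOMAIN, regex=False, na=False)]

    # 3. Drop links from domains already known to be good.
    good = r"dmoz\.org|facebook\.com|plus\.google\.com"  # regex pattern
    df = df[~df["Backlink Page"].str.contains(good, na=False)]

    # 4. Drop harmless generic anchors (unless they come from a known bad site).
    generic = {"website", "click here", OLD_DOMAIN}
    df = df[~df["Anchor Text"].str.lower().isin(generic)]

    # What remains is the residue: who links to which page with what anchor text.
    df.sort_values(["Anchor URL", "Anchor Text"]).to_csv("residue.csv", index=False)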
The Disavow Tool
If you’ve attempted link removal and disavow options on the original site, this may translate into some value to the new site. It reduces potential damage from adding a 301 redirect on a tainted page to the respective page on the new domain.
It may also make sense to submit the same disavow request for the new site, in an effort to prevent any 301 redirects inadvertently transferring bad karma from the old site to the new site. The political correctness of doing this remains a mystery… comments on others' experiences are welcomed.
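For reference, Google's disavow file is just a plain text list with one domain or URL per line, plus # comments; a minimal sketch (the domains are hypothetical):

    # Blogroll links we could not get removed
    domain:spammy-cooking-blog.blogspot.com
    domain:another-bad-neighborhood.com
    # A single offending page rather than a whole domain
    http://example-directory.com/page-with-link.html

Bing's equivalent is managed through the Disavow Links page within Bing Webmaster Tools, rather than via an uploaded file.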
The effectiveness of the disavow tools at both Google and Bing seems very questionable indeed. Many people have yet to see any demonstrable impact from their use — positive or negative. The conspiracy theorists who hold that it's just Google's way of getting webmasters to help identify low-quality websites may indeed have a point.
The problem is that of time. If your site has gone off the rails into the ranking chasm, and traffic has been slashed by 60 percent or more, you could literally go out of business before the disavow tool works for you.
Seriously, it's more than likely going to take MONTHS before demonstrable improvements to rankings accrue via the disavow tool, or link removal for that matter.
Switching to a new domain is probably the single fastest way out of a serious predicament.
Change of Address at Google and Bing
Fiddling with Bing's address change option is problematic because it makes the fundamental assumption that you've 301'd the entire site at domain level. However, Bing does respond properly to the individual 301 redirects on pages on the original site.
Google sensibly assumes that you will apply 301 redirects to each individual page across the website being moved. Google also expects and recommends that the domain and its redirects are left in place for a minimum of 180 days (six months). That gives sufficient time to transfer all residual value to the new site.
Commonsense suggests that you might want to keep that old domain under your management for a long time to come, to prevent an unscrupulous competitor gaining control of it.
301 Redirects
Some experience with Excel spreadsheets helps with getting these sorted out. The point is that 301 redirects allow you to transfer the full value of every good page from the old site across to its equivalent page on the new site. Basically, you need a URL list, and there are several sites that allow you to produce such a list free, for up to 500 URLs.
Alternatively, there are free downloadable programs that will perform a similar task — I use a Windows app called Sitemap Generator.
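Once you have a URL list, generating the redirect rules can be automated. Here is a minimal sketch in Python (the file names and domains are hypothetical) that writes Apache-style rules and skips any pages you flagged as tainted:

    from urllib.parse import urlparse

    NEW_DOMAIN = "www.mywebsite.co"          # hypothetical new domain
    tainted = {"/recipes/spam-magnet.html"}  # pages you chose NOT to redirect

    with open("url-list.txt") as src, open("redirects.conf", "w") as out:
        for line in src:
            url = line.strip()
            if not url:
                continue  # skip blank lines
            path = urlparse(url).path or "/"
            if path in tainted:
                continue  # leave this page behind on the old domain
            out.write("Redirect 301 %s http://%s%s\n" % (path, NEW_DOMAIN, path))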
The Carrot or the Stick?
Google's focus from the outset of Penguin seems to have been more on punishment for breaching guidelines and terms of service than on encouraging compliance. There's a presumption of guilt and application of punishment — meted out in dictatorial circumstances:
  • By an entity that is investigator, prosecutor, jury, judge and executioner.
  • And one that changes the rules as it pleases, without any redress or oversight.
Why, that would be considered unjust pretty much anywhere except North Korea. Too much stick and not enough carrot, in my view…
Give Us a Carrot…  
Frankly, Google does NOT provide enough quality information (let alone assistance) for a ‘mere mortal’ webmaster to confidently get their website back on the right track after link penalties. Even seasoned professionals struggle with restoration of a site’s status after a Penguin penalty. It is little short of a nightmare situation, and what would help enormously is:
  • Accurate incoming link data with NO dead links included.
  • Indications of which are nofollow and therefore harmless.
  • Outline of which links are harmful and why.
  • A Disavow Tool that actually WORKS in a timely fashion.
  • Ignoring links that pre-date the change to Google’s Terms of Service.
Consider mom 'n pop, struggling to get a new venture off the ground, and studying the prevailing literature. The preponderance of readily available advice is now in direct contravention of many aspects of Google's new rules. Everyone wants to compete — across all levels — it's part of the human condition to strive to do things better, smarter and faster than your competitors…
Every pre-2012 book ever written in the history of the Internet emphasizes the importance of links to attain decent rankings.
  • Go down to any local library to do some research on how to improve your rankings and the book you borrow will place you in jeopardy.
  • Do a bit of online research and most of the guidance you find will be wrong.
  • Ask a local “expert” and they will be blissfully unaware that the SEO world suffered a paradigm shift in the past year.
  • Ask your web designer / developer and he or she will have minimal comprehension of the extent and severity of penalties for doing it wrong.
Google arbitrarily changed all the "old" rules… Overnight, the efforts you made to remain visible above the ever-growing crowd of websites five or more years ago now destroy your online business… And good luck with sorting it out, because Google thinks you are guilty of breaching its (new) terms of service and guidelines. Collateral damage is no concern of theirs, it seems.
A company the size of Google really ought to have some compassion, and an ethos that helps nurture the websites that make up the bulk of its SERPs content. Sure, identify errors of judgment, but provide:
• Comprehensible warnings.
• Accurate data.
• Tools that work to resolve issues.
It would be nice if that was done BEFORE a site owner has to face abandoning a much-loved domain to extricate his business from a mess that may not have been of his own making… Or, heaven forbid, go out of business as a direct consequence of draconian penalties.
Summary
The most important stage of the site migration process is gaining a thorough understanding of the potential problems lurking within internal pages. Leaving all the low-grade domain-level incoming links behind is a given as you switch to a new domain. However, you must exercise great care not to transfer any bad karma from internal pages loaded with bad links.
There’s no indication of Google providing us with a bunch of carrots anytime soon, and many more people will realize that they are between the proverbial rock and a hard place.
Migrating a site to a new domain is not rocket science – it just requires careful attention to detail. Do so, and in my experience, the results are invariably positive.
resource: http://www.sitepronews.com/2013/08/27/know-when-to-cut-your-losses-how-to-switch-to-a-new-domain/

Thursday 22 August 2013

Video SEO: Getting Your Video Ranked on Page 1 of Google


You have finally figured out how big a boost video marketing can be to your overall Internet marketing efforts. Let’s face it; people would rather watch videos nowadays than read an article. And a lot of businesses are now leveraging that trend by posting tons of video content on their YouTube channel designed to create brand awareness, introduce and sell new products and services, promote upcoming events and generate new business.
Now, posting great video content on YouTube is one thing; making it easy for your target audience to find the video is another. Yes, posting a video on YouTube will generate traffic organically from YouTube searches and possibly from social media shares, especially if the content is of high value. However, imagine how wonderful it would be if your videos appeared on the first page of Google for your target keywords. That would be just amazing, right? But the real question is: how do you do it?
Well, it’s not as difficult to get a YouTube video ranked on Google as you may think. All you need is a little video SEO and a bit of elbow grease and you should be able to get there.
Here are 3 key steps to help you along the way:
1. Optimize Your YouTube Video Page
The YouTube page where your video is posted is just like any landing page or website. It has the same on-page elements that Google uses to evaluate for search placement. So, like any landing page, you will want to optimize it for the keywords you are targeting. Here are the things you will need to do.
  • Title: Write a compelling title that contains your target keyword. As much as possible, don’t exceed 70 characters, and make sure to have your target keyword within the first 5 words of the title.
  • Description: Write a description of no less than 300 words. Insert your target keywords generously into the content. 4 to 5 exact match instances of the keywords you are targeting is a good rule to follow. The keyword should also appear in the first sentence of the description and in every paragraph thereafter. Including other relevant keywords in the description also helps. Finally, don’t forget to include links to your website and relevant links to other videos that can be found on your YouTube channel.
  • Tags: This portion should contain all the keywords that matter. Your target keywords as well as other relevant tags. For instance, if you are targeting the keyword “Video SEO” for that particular video page, consider adding tags like “video”, “video marketing”, “SEO”, and “search engine optimization”.
  • Transcript: If your video has a voice-over or features someone talking, then it will definitely be worth your while to upload a transcript of the video script. Now, this only becomes beneficial if the keyword is mentioned at least 2 times in the video. This is an optional optimization technique, since there have been cases where a video that didn't have voice-over or a transcript still ranked on Google. In any case, having this element on your video page helps a lot, so consider getting it done.
Additionally, the audience retention rate of a video plays a factor in Google and YouTube search placement. In order to maximize this, you will want to not only produce great video content but keep the length of the video below 5 minutes. Shorter videos tend to generate better audience retention rates.
2. Get Immediate Traction for Your Video
Once your YouTube video has been published and the video page optimized for search, the next thing you will need to do is to get instant traction for it. This is because YouTube tends to rank videos that have a good deal of views, likes, comments, and social shares high on searches. Eventually, those same factors will increase the video’s visibility on Google.
So, how do we do that? Here are some actionable steps you can take:
  • Embed the video on your website or blog. Doing this will not only create a high-value, relevant backlink to your YouTube video page, but will also provide the video with instant activity, especially if your website or blog gets a steady stream of web traffic.
  • Share the video via your social media accounts. Facebook, Twitter, LinkedIn, Pinterest, and Google+ are just some of the social networks you should share your video on. Ask your network of friends and followers to watch, comment on, like, and share the video. If you have a Facebook Fan Page or a Google+ business page, consider sharing or embedding the video there as well.
  • Email the video to your friends, colleagues, clients.
  • Try to ask everyone you know that has an active account on YouTube to create a video response to your video. Every video response your YouTube video gets increases the page’s trust factor and popularity, which will have a significant effect on search placement on both Google and YouTube.
  • Submit the video to social bookmarking sites such as StumbleUpon and Delicious.
3. Get Some Consistent Link Building Done
Again, Google treats a YouTube page like any other landing page. With that being said, link building will play a significant role in getting your video to the top of Google. This is especially true if the keywords you are targeting are highly competitive. A steady monthly link building campaign consisting of a variety of link types such as social bookmarks, blog articles, web 2.0 posts, article directory submissions, and video syndication, will greatly affect the video’s visibility on Google over time. Typically, with consistent link building, you can get a video ranking on page 1 of Google for your target search terms in less time than you think.
Resource: http://www.sitepronews.com/2013/08/22/video-seo-getting-your-video-ranked-on-page-1-of-google/

Monday 19 August 2013

The Best SEO Tactics for Bing


It’s no secret anymore that Google has some serious competition. Sure, no one says “Bing it” the way they say “Google it”, but popular slang isn’t the only measurement of search engine success.
In the handful of years Bing has been in the mix, it has only gone one direction: up. With the power of Microsoft behind its technology and marketing, that's not likely to let up any time soon.
Are all search engines created equal? Not a chance. As it’s shaping up, Google and Bing both have definite strengths and weaknesses. That’s good news for merchants who want to differentiate themselves, but it does add SEO complexity, as tactics are definitely not “one-size-fits-all.”
Bing and Google: Similarities and Differences
Bing and Google certainly have more in common than not, but the differences are what make them interesting. Likewise, it’s these standout traits that business owners need to take note of, because they help determine which search engine should be at the forefront of all SEO strategies. While it’s never advisable to leave either one out of your efforts, it’s important to choose which site the bulk of your time and dollars should be focused on. For the first time ever this year, that may very well be Bing.
Google has one thorough, dynamic, and tricky as can be algorithm. They excel at squashing spammers (they’re not perfect, but they do an amazing job of sweeping out the cheaters), keeping content results clean and consistent, and showcasing a simple, intuitive user-interface.
Bing, on the other hand, has become the social darling, as it is the only search engine that has successfully integrated Facebook into its rankings. This is pretty revolutionary because your Facebook friends’ likes and dislikes are integrated into the search results. Google does the same with Google+, but let’s face it, there’s little contest regarding which is more powerful.
What this has come to mean is that for local search results, Bing reigns supreme. For informational searches and web-wide research, Google is the go-to.
Are You a Google or a Bing Business?
For merchants, this is actually an easy decision. If you run a locally owned business, Bing’s results are increasingly superior.
Don’t believe me? Take a cue from the brilliant Bing-it-On Challenge marketing campaign and see for yourself.
Search for something like “Las Vegas hotels”. See if you don’t agree with the masses that Bing’s results are not only more graphically-rich, but also more targeted. Then, search for something like “Define: nebula” – and Google will likely kick Bing’s behind.
Who’s Winning the Popularity Contest?
There’s no denying Google still has the majority of the market share, but those numbers continue to slip. Furthermore, a lot of folks are running surveys to see which search engine users actually prefer, and many are giving top marks to Bing.
In a survey earlier this year, Search Engine Watch asked a gaggle of searchers which experience they preferred, specifically regarding the layouts of each site.
The results were split into several interesting categories, but the gist of the data was this: 63 percent of users preferred Bing’s layout as it related to social search and 53 percent preferred Google for universal search. More proof that if you’re a local business, it’s time to seriously consider putting your bang in Bing.
Top Tips for Excelling on Bing
If you want to hit the top of Bing’s rankings for your chosen keywords, here are the best ways to score a high ranking:
  • First and foremost, make sure that social tactics are a huge part of your marketing strategies. A heavy presence on Facebook, Pinterest, LinkedIn and related sites is a must.
  • Be a heavy Facebook user, as opposed to Google+. Leverage your fans, and encourage social signals (Likes, comments, etc.)
  • Test your keywords specifically against the Bing audience. What will get you high rankings in Google will likely be different in Bing. The smaller audience also means less competition for your keywords, so you’re more likely to do well using more popular selections on Bing than on Google.
  • Bing is a stickler for error-free XML sitemaps. Make sure yours have zero 404s, or the Bingbot might ignore the whole thing.
  • The Bingbot is also big on robots.txt files. If your site doesn't have one, it risks being completely ignored by Bing (a minimal example follows this list).
  • Bounce rates are also big deciding factors. If most visitors bounce off a page before spending a certain chunk of time, the entire website may suffer a ranking decrease.
  • Just like Google, content – good, quality, fresh, current, irresistible content – is essential to a high ranking.
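As flagged in the list above, even a wide-open robots.txt keeps the Bingbot happy. A minimal sketch (the sitemap URL is hypothetical):

    User-agent: *
    Disallow:

    Sitemap: http://www.example.com/sitemap.xml

An empty Disallow line permits everything, while the Sitemap line points crawlers at your XML sitemap.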
Whether you decide to go full throttle into a Bing approach, or stick to the tried-and-true Google, make sure you at least keep some strategies in mind for both. Bing may not yet be the homecoming queen, but it's earned enough votes to make the reigning royalty start quaking in the knees. That means you're very wise to not ignore this new kid on the search block.
resource: http://www.sitepronews.com/2013/08/19/the-best-seo-tactics-for-bing/


Saturday 17 August 2013

Hacking your Twitter Account to Go Viral


Promotional Twitter “hacks” have been in the news a lot recently. What began as brands truly getting hacked by strangers turned into a frenzy of companies trying to imitate the same thing. Why?
Because it enhances their social presence, gives them attention and can even help their promotions! All businesses want to increase the number of followers they gain in a day, and the possibility of your tweets going viral is tempting for anyone.
But aren’t traditional marketing tactics safer than a risky fake “hack” that could potentially damage your reputation and credibility? Chipotle, MTV, Jeep and Burger King are a few of the companies that have been hacked – either by themselves or by someone else – in the last year.
Learn from their mistakes and successes to see if Twitter “hacks” can really be promotional for your business!
“Hacks” That Can Benefit Your Business Promotions
Chipotle is a company that was open about the fact that they faked their own hack in July 2013. After posting a bunch of bizarre tweets, such as "twitter friends search bar", "twitter Find avocado store in Arv" and "Hi sweetie, can you please pick up some lime, salt, and onions? Twitter", let's examine how Chipotle thought their "hack" through:
  • They weren’t controversial. A Chipotle representative named Chris Arnold told Mashable “it was definitely thought out: we didn’t want it to be harmful or hateful or controversial.” While the tweets were bizarre, nobody was offended or upset.
  • The tweets coincided with their 20th anniversary promotion. Since their most recent promotion involved putting clues in various places, they figured adding complex clues to their social media presence would aid the promotion. People might rest easier knowing the “hack” was part of an overall promotional effort to engage with customers.
  • They wanted to get people talking. By tweeting non-controversial, mysterious phrases, they hoped their followers would pay attention and try to figure it out.
And because of these reasons, Chipotle’s “hack” may have been a success. They added about 4,000 followers the day of the “hack” when they typically only add about 250 followers a day.
The “hack” received mostly positive reactions, and this could be because of the careful, non-controversial way they approached it. It’s not a stunt they are likely to repeat frequently, but it gave them a moment of blaring attention and assisted their promotional efforts.
They’re even considering releasing a t-shirt with words from one of the most viral tweets during the “hack.” In this case, Chipotle seems to be using the attention they received from the “hack” in brilliant, marketable ways.
“Hacks” That Can Cause Negative Publicity
MTV is another brand that told followers their “hack” was a prank, by tweeting that users had been “Catfish-ed.” The hack was apparently an effort to promote their sister network, BET Experience.
They approached their “hack” differently than Chipotle by tweeting things about celebrities, using capital letters and asking followers questions. Here are the ways MTV’s “hack” differed from Chipotle’s:
  • They were raunchy. One tweet that may have provoked criticism was raunchy and directed at three celebrities. It said “Wowz. @SelenaGomez, @AshBenzo + @VanessaHudgens showed major skin n’ sideboob at the #SpringBreakers premiere.” While their intention for tweets like this is unclear, it could easily make followers upset and cause an uproar.
  • They seemed to follow the motto of “all news is good news.” This motto might work for some companies trying to gain publicity, but in MTV’s case, there was a lot of controversy surrounding their “hack” because they didn’t seem to think about the effects of what they were posting.
Even though MTV’s hack has been more widely-criticized than Chipotle’s, they’re still getting a lot of free coverage and attention. The question is, do they really want a bad track record? It’s up to you to weigh the costs and see if a marketing stunt like this is worth it.
You Don’t Have to “Hack” to Gain Followers
Once followers realized MTV’s “hack” was a hoax, Denny’s Diner quickly posted a tweet that cleverly made fun of MTV.
They tweeted “OMG we hacked ourselves because it’s the cool thing to do!” with a picture of pancakes. Their tweet went viral and proved that Denny’s didn’t have to hack their account to gain publicity.
They stayed on top of the Twitter stunt and used humor and timing to release a tweet that got the world talking. This type of social media tactic can also give you just as much, if not more, attention and followers!
What Happens if You Actually Get Hacked?
Burger King and Jeep are two companies that actually had their accounts hacked by strangers – and many believe the same person may have hacked both accounts.
The hackers posted offensive, controversial tweets about Jeep employees doing drugs and selling their company to Cadillac. For Burger King, their images and background were changed to McDonald’s logos – their biggest competitor.
Jeep dealt with the hack by removing the tweets and not publicly acknowledging the hack on Twitter. By being quiet and composed, followers could possibly feel sympathetic towards Jeep, and the backlash from the hack might not be as strong.
Either way, the hacks brought Burger King's and Jeep's names to the center of the media's attention. If you can handle a devastating nightmare like getting hacked in a professional way, there's a chance you can actually turn the tables and use the attention in positive ways.
Clearly, hacking your own account is a risky, controversial marketing tactic. It’s not for everyone, especially since the larger brands seem to be the only ones bold enough to try it.
However, it's definitely food for thought for large and small businesses alike. You may have a bunch of social media accounts, but are you thinking outside of the box to find creative social media marketing solutions that can help your promotions?
resource: http://www.sitepronews.com/2013/08/16/hacking-your-twitter-account-to-go-viral/

Friday 16 August 2013

Trsst to Offer Secure Twitter-Like Social Networking


Move over Twitter, there could soon be a new micro-blogging social network in town.
Trsst has launched a Kickstarter campaign in a bid to offer a secure platform for encrypted messaging with Twitter-like functionality.
To break it down, Trsst will supply completely secure person-to-person messages, and messages that are broadcast will be digitally signed. A Bitcoin-style database ensures no one has any control over your data.
"Think of Trsst as an RSS reader (and writer) that works like Twitter but built for the open Web," reads the description on the Kickstarter website. "The public stuff stays public and search-indexable, and the private stuff is encrypted and secured."
Created by Michael Powers — who is also the creator of Hotel Me and AppTap — Trsst is all about giving users back the power over their own data.
"Only you will hold your keys, so your hosting provider can't sell you out. Trsst sites can look and feel like Facebook or Twitter or Tumblr," Powers said. "And from day one, you can follow all your favorite sites and bloggers that post RSS feeds. And they'll be able to follow you."
Although Powers does not come out and say it, it is obvious Trsst was inspired by the recent revelations regarding the National Security Agency’s (NSA’s) wide-spread surveillance programs such as PRISM, which can force technology companies like Facebook and Google to hand over user data.
The following is an excerpt from Powers’ explanation of why a social network like Trsst has become essential:
You’ve read the news.  People are finally waking up to what the rest of us have known for some time: everything everyone does on the Internet is being collected and harvested for inevitably nefarious purpose.
Trsst mainstreams good crypto practices and usage among the general consumer audience to make global communications — on the whole and in the aggregate — more free, private, and secure.
At the end of the day, every company you trust — Google, Facebook, Twitter, Apple, or Baidu — is a corporation owned by shareholders and subject to governmental jurisdiction.
• At any time, the directors and shareholders of these companies may revoke their promises to you about privacy, and they may do so without even notifying you about it.
• At any time, the governments under which these companies operate may enact legislation that appropriates or nationalizes the data in their possession, including your personally identifying information and stored communications. 
• This may have already happened.  There is no company not under the jurisdiction of a government.  No place is safe.
The only hope we have is a decentralized cryptography-based messaging infrastructure that no government can control where no corporation need be trusted and all communications are encrypted and only you hold the decryption keys.  
We have the technology.  And the time is now.
Revolutions are started on social networks like Twitter and Facebook. Dissidents, informants, confidential sources, journalists, and those they trust all rely on these services.  Trsst will better preserve their causes, their freedom, their livelihood, and even their lives.
I hope you will agree that this needs to happen and back the project.
Powers is looking to raise $48,000 for his project and, with just 28 days left to achieve the goal, he still must raise $45,000.

Wednesday 14 August 2013

A Step-By-Step Website SEO Audit Guide




At some point SEOs need to audit a site to find out what is going wrong and what needs to be fixed. There might be a number of things preventing a website from reaching its full potential but finding those problems can be difficult. The outline below will help educate you and show you what you should be looking at and how to fix it.
If you are performing SEO on behalf of clients, especially new clients, you need to have their sites thoroughly examined for technical issues. Whether a site has crawling issues, indexing problems, or other issues that are inhibiting the site’s ability to rank, this process will find it.
It should be used when new clients sign on but could also be used as a sales tool. Free site audits can be compelling for showing your leads what is wrong with their sites, and shows them the route you would take to deal with those issues.
Most websites that you come across are going to have something wrong with them. Having a process in place to efficiently identify these issues is essential to maintaining site health and rankings.
Let’s get started.
Screaming Frog Spider
Your first task is to use a tool called Screaming Frog. It is a free application that can be used to crawl sites. When the client's site has been crawled, you can export the data to Excel and analyze it. The tool looks at every page of the site and reports back on the following:
  • Duplicate Pages – Identifies any pages where the content is the same as, or similar to, another page on the site.
  • Errors – Reports any client or server issues, such as 404 pages.
  • External Links – Shows all of the sites that you link out to.
  • Title Tags – Shows any missing, duplicate, short or long titles.
  • Description Tags – Shows any missing, duplicate, short or long descriptions.
  • URL Issues – Shows URLs with uppercase characters, dynamic URLs, and URLs that are too long or contain underscores.
  • Redirects – Shows any permanent or temporary redirects.
  • Headings – Shows information on any h1, h2 or h3 tags used on the site.
  • Meta Robots – Lets you know what you are allowing to be indexed, and what you aren’t.
  • Anchor Text – Identifies the anchor text you are using for any images or web pages.
  • Internal Links – Shows where you are linking to other pages on the same domain.
  • Follow & Nofollow – Shows you which links are follow links and which are nofollow. This can be useful to quickly find links that need the nofollow tag added to them, which will minimize link juice being passed to other sites.
  • Bot Crawling – Crawls the site as the Google, Bing or Yahoo bot, allowing you to see what the search engines see.
  • Images – Gives you information on the site images, the alt tags used, and where alt tags are missing.
  • Page Depth Level – Finds how many levels deep the search engines have to crawl to find all your content.
  • File Size – With smaller file sizes, your website will load faster. Identifies spots where you need to make files smaller.
When the site has been crawled, which will take a few minutes, you can export the data and refer back to it as needed.
Google Webmaster Tools and Analytics
Make sure the site is registered in Google Webmaster Tools and Google Analytics, and that you can access them both.
Through Webmaster Tools, you can see any crawl issues that Google is encountering, the general health of the site, and loads more. It is the easiest way to discover any problems with your site.
Webmaster Tools needs to be checked once a month for all clients at a bare minimum, to ensure that any problems that crop up are dealt with quickly.
Keyword Analysis
The information on title tags and description tags that you acquired from the Screaming Frog tool can be used to understand what the site is currently trying to rank for.
You can then combine that information with Google analytics to see if the site is actually getting traffic for the keywords you are targeting. You can then decide where keywords need to be changed and target keywords that will bring more quality, relevant traffic to the site.
URLs
The Screaming Frog report will also give you information on the URLs that are used across the site. Ideally, the URLs need to obey the following rules:
  • Must be static – Static URLs contain only numbers, letters and dashes. Search engines find it harder to crawl dynamic, parameter-laden URLs cleanly, and such URLs can create duplicate-content issues (see the example below).
  • Easy to remember – Make sure all URLs are user-friendly. Simpler URLs are ideal.
  • Under 100 characters – As a rule of thumb, make sure URLs aren’t longer than 100 characters.
If any of the URLs fail to meet these criteria, you may want to consider changing them. If you do, make sure you redirect the old URL to the new one to maintain any of the link juice flowing to that URL.
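For instance, a dynamic product URL like this (the URLs are hypothetical):

    http://www.example.com/product.php?id=117&cat=5

is better rewritten as a static, memorable one:

    http://www.example.com/products/blue-widgets/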
Title Tags
The data from the Screaming Frog tool will give you information on all of the title tags used across the site. Title tags are probably the single most important place to insert your target keywords. Make sure all of the title tags follow these guidelines:
  • Must be 50-70 characters in length.
  • Must be unique.
  • If possible, and if it makes sense, use the target keyword for that page twice.
  • Use the name of the city for the business if it makes sense to do so.
Description Tags
Descriptions don’t actually help with rankings, but they do substantially alter your click-through rates. Description tags need to be compelling, and give good reason for searchers to want to click through and visit the site. Follow these guidelines:
  • Make sure that every description is unique and relevant to that page.
  • Include a call to action.
  • No more than 160 characters in length, no less than 51; ideally, the closer to the upper limit the better.
  • Use the target keyword for that page where appropriate. It will appear bolded in the search results which will draw searchers’ eyes to your listing and improve click-through rate as a result.
  • Use the name of the city for the business, if appropriate.
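As a minimal sketch, here is how the title and description tags sit together in a page's head section (the business, city and keyword are hypothetical):

    <head>
      <title>Emergency Plumber Austin | 24/7 Emergency Plumber - Smith Plumbing</title>
      <meta name="description" content="Looking for an emergency plumber in Austin? Smith Plumbing is on call 24/7. Call today for a free quote.">
    </head>

Note the target keyword appears twice in the title, and the description stays under 160 characters while ending with a call to action.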
Headings
H1, H2 and H3 tags need to contain the target keywords for that page because they help a lot with ranking. Follow these guidelines:
  • Search engines treat heading text as a strong indication of what a page is about, and prominent headings carry more weight. Therefore, it is important to make the headings big, prominent and ideally the first thing that people see on the page.
  • They should only contain text – no images or logos.
  • Use an H1 tag at the top of the page and break up paragraphs of text with H2 or H3 tags. They work well from a usability standpoint and help with rankings, as in the sketch below.
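A minimal sketch of that heading structure (the keywords are hypothetical):

    <h1>Emergency Plumbing Services in Austin</h1>
    <p>Opening paragraph…</p>
    <h2>Burst Pipe Repair</h2>
    <p>…</p>
    <h2>Blocked Drains</h2>
    <p>…</p>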
Content
You need to make sure there is enough content across the pages on the site. You can’t use any of the tools to help you with this, but it should be easy enough to just scan the site and identify where more content needs to be added.
Each page should have a minimum of 300 words of content. If any of the pages don’t, add some more. But don’t add content for the sake of adding it; only add useful, engaging content.
It is more important to fulfill the needs of your actual visitors rather than that of the search engines. Large chunks of spammy, unnecessary content will just ruin the credibility of the entire site.
Make sure content is split up into bite size chunks as testing has shown that visitors skip over content if it is laid out in one massive paragraph.
Internal Linking
If you link to inner pages of the site where appropriate, search engines find it much easier to crawl the entire site. You can refer to both the Screaming Frog data and Webmaster tools for information on internal linking.
Where you can, ensure you don't have more than 100 links on any one page, and include around 2 or 3 internal links per page.
Using anchor text that is too rich with your target keywords can actually have a negative effect on rankings, so only use target keywords around 10%-30% of the time when linking internally. Use "click here" or similar generic text links instead.
Image Text
Every image must have an alt tag. This is for the benefit of the search engines and visually impaired users. The alt tag needs to describe the image and contain a keyword if that keyword is relevant to the image. Alt tags do have a slight impact on rankings so try and get the keyword in where possible.
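For example (the file name and keyword are hypothetical):

    <img src="apple-pie.jpg" alt="homemade apple pie recipe">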
Nofollow
If one page links to another, and Google sees the linked-to page as relevant, some of the link juice is passed through. When the nofollow tag is used, little or none of that link juice is passed.
Use the nofollow tag on sitewide external links, blog comments, and anywhere else where you don’t want to lose link juice.
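Adding the tag is a one-attribute change to the link (the URL is hypothetical):

    <a href="http://www.example.com/" rel="nofollow">Example Site</a>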
Excluding Pages
Search engines don’t like it when you have loads of pages with barely any content on them. Where you can’t add more content, you may wish to stop that page from being indexed altogether.
Simply add a robots meta tag set to noindex, within the head section, to any pages that you don't want to be indexed:
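    <meta name="robots" content="noindex, follow">

The noindex, follow combination keeps the page out of the index while still letting spiders follow its links.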
Sitemap
Sitemaps make it easier for search engines to index all of the pages on a site. Use an online sitemap generator to create one, and then submit it to Webmaster Tools, which will tell you how many submitted URLs are actually getting indexed.
Make sure you check Webmaster Tools regularly to ensure the sitemap is still working as it should.
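For reference, the XML format these generators produce is simple (the URLs and date are hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2013-08-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/about/</loc>
      </url>
    </urlset>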
Redirects
There are two types of redirects: a 301 redirect (permanent), and a 302 redirect (temporary). 302 redirects are a dead end for SEO, and don’t pass any link juice.
Unless the reason for a redirect is truly a temporary solution, such as a timed promotion, a 301 redirect should be used.
Take a look through the data from the Screaming Frog tool and make sure deleted pages or URLs that have changed are using a 301 redirect.
Duplicate Content
The search engines really don’t like duplicate content. Following the Panda update to Google’s algorithm, sites with duplicate content have been identified and penalized.
There are 3 main ways to deal with duplicate content:
  • Rewrite the content to make it unique.
  • Use URL redirects.
  • Use a rel canonical tag to specify the original page to the search engines. Place this tag within the head section of the duplicate page to point at the canonical web page (the URL below is hypothetical):
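    <link rel="canonical" href="http://www.example.com/original-page/">

Search engines will then consolidate ranking signals onto the URL given in the href.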
Broken Links
Broken links interrupt a spider's path through a site and waste crawl resources. If there are too many broken links on a site, it will be seen as providing a bad user experience, and rankings will suffer as a result.
You should use Webmaster Tools to check for broken links and you can also use the Xenu link sleuth tool. When you find broken links, use a 301 redirect.
Page Load Speed
Google recommends that sites load in 1.4 seconds or less. Any longer and rankings are not going to be as good as they could be.
Use Pingdom's speed tool to test the speed of the site, and then use the PageSpeed tool from Google to identify solutions. If the load time is longer than 1.4 seconds, there are a number of actions you can take.
  • Browser caching (see the example after this list)
  • CSS sprites for images
  • Reduce image file size
  • Combine CSS or javascript into fewer files
  • Install the W3 total cache plugin, if the site is using WordPress.
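For example, browser caching on Apache can be enabled with a few mod_expires rules in .htaccess (the cache lifetimes shown are hypothetical starting points):

    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/jpeg "access plus 1 month"
      ExpiresByType image/png "access plus 1 month"
      ExpiresByType text/css "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
    </IfModule>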
Incoming Links
Use Open Site Explorer to analyze inbound links. You will get a good idea of the type of link building activity that has been performed on the site in the past.
Thoroughly look through the link profiles and search for spammy or low quality links. It may be that certain links are having a negative effect on the site’s ability to rank, and you may need to use the disavow tool to discount certain links.
Domain Authority
Use SEOmoz's Domain Authority metric, which can be found in Open Site Explorer, to judge the overall authority of the site.
It scores the site from 0-100, and is currently the best way of working out how much authority the site has, which is going to seriously affect its ability to rank.
Compare the domain authority to that of competitors. If the site’s authority doesn’t stack up well against theirs, you’ll want to avoid targeting the same keywords as them.
As a rule of thumb, if the domain authority is lower than 30, you’ll want to adjust the competitiveness of the keywords that you target.
Once you build a site’s domain authority to a decent level, ideally above 40, you can begin to target more competitive keywords.
Social Signals
Social signals definitely have an impact on rankings, and it is important to assess the number of mentions the site has. Again, you can find this data within Open Site Explorer.
There are two simple ways to modify a site to increase the chances of acquiring social signals.
  • Integrate sharing buttons clearly across the site and on any blog posts.
  • Create content that is worthy of sharing, and reach out to people to ask for feedback.
That's it. Hopefully you learned a few things about auditing a site. Don't forget to bookmark this page so that you can come back as and when you need to. If you have any comments, concerns, or queries, leave them below and I'll get back to you.
resource: http://www.sitepronews.com/2013/08/14/a-step-by-step-website-seo-audit-guide/