9Tech Info

All new tech news and updates

YouTube for Android brings offline playback to India, Indonesia and the Philippines

Posted by Shankar
Along with a Material Design refresh and search filters, the latest update for YouTube’s Android app allows users in Indian, Indonesian and Filipino markets to save videos to their devices to watch offline.
The new feature is available for select content including movies, music videos, trailers and more, so that users in areas where high-speed mobile data connections aren’t easily accessible can enjoy buffer-free video.
Once videos are downloaded, they can be played back from the app’s Offline section without the need for an Internet connection for up to 48 hours. Users can also choose the quality of the videos they’re downloading, which helps manage available bandwidth and device storage space.
➤ YouTube [Android]

Is Amazon the New Google?

Posted by Shankar

Forget Bing, forget Yahoo – Google’s Executive Chairman and former CEO Eric Schmidt now knows who the biggest threat to his company’s search dominance is: Amazon.com.
In fact, Amazon and Google are going toe to toe in a myriad of industries, making their rivalry one of the biggest tech showdowns on the planet. And while search may not be the most obvious point of contention, it’s becoming one for Google, which has called out the Seattle giant as a formidable threat.
How Amazon has Become a Google Search Rival
What does the world’s biggest online marketplace know about search? When you’re as massive as Amazon, it has to be a core competency, and they’ve certainly done well to give the people what they want.
In Schmidt’s words, spoken at a recent visit to Native Instruments, it’s all about convenience. “But, really, our biggest search competitor is Amazon. People don’t think of Amazon as search, but if you are looking for something to buy, you are more often than not looking for it on Amazon.”
It’s not just products or books that people are searching for on Amazon, either. As Schmidt noted, users turn to Amazon for all kinds of answers by the thousands every day, showing cracks in Google’s search monopoly.
How much does Google still dominate the search stratosphere? The numbers, while still significant, are showing decline. Right now, Google receives around 233.1 million unique users each month, compared with Amazon’s 172 million. But that’s not a huge gap, and Amazon has been closing it for a long time.
Google Fires Back – Takes Aim at Amazon’s Core Business
Not to be left out of any online war, Google has started to encroach on Amazon’s ecommerce world too. They’ve launched same-day delivery services in San Francisco as a test run, looking to rival Amazon’s similar offerings. Both companies are also waging a PR battle in the drone space, each hoping to become the biggest player in this new approach to product deliveries.
Google has also proven quite successful in hardware, becoming a huge player in the mobile space with their Android platform, making Amazon’s Kindle play an even harder sell. And Google’s wearable technologies like Google Glass are more innovative and newsworthy than Amazon’s current related offerings.
While Amazon continues to be king in the cloud storage space, Google is not making it easy on them. Google has its own enterprise cloud storage and a clear winner in this battle is not yet obvious.
In short, these companies have a lot of businesses in common. Amazon wants consumers to subscribe to its Prime service and watch entertainment on their Fire TVs or Kindle Fires. Google would rather you do the same on your Nexus phone, or stream via Chromecast. Amazon wants you to store your music with its AMS service; Google woos customers with Google Music cloud storage. The list of similarities is indeed long.

Amazon’s Additional Attempts to One-Up Their Rival
Not to give you whiplash, but Amazon is looking to take more than just Google’s reign over search. In August, Amazon announced a massive $1 billion purchase of online-gaming giant Twitch, showing a clear intent to become mightier in the world of entertainment. Since Twitch had also been in talks with Google, this marks a double blow and a very telling sign for the future.
These two are even butting heads over physical office space. Not long ago, Google set up a new London office in King’s Cross. Amazon answered by breaking ground on a new 5,000-employee office just down the road from Google’s campus, which means the two will also be fighting over top engineering talent in the area.
And, finally, despite the millions Amazon spends on Google ad buys as an apparent business partner (in 2013 its ad buys exceeded $157 million, and that’s just domestically), it is also looking to steal some of Google’s thunder there. Rumor has it Amazon is building its own robust ad network platform. If Amazon can compete with Google in this arena, it would be the ultimate blow. Ads have long been Google’s biggest revenue generator, and losing its foothold in this area could be truly devastating.
There’s a big question looming over all these similarities – might the two become one in the not-so-distant future? It would be an epic deal for the ages, but it’s not out of the question.
We’ve already seen during the economic downturn that mighty conglomerates can indeed fall to pieces; Schmidt is certainly not being dramatic when he emphasizes how critical it is for his company to keep advancing. As he stated recently in Berlin, Google needs to stay on the pulse of innovation, or “someone else will innovate around us, leaving us obsolete over time.”

The Hidden Cost of Free: Five Ways Free, Low-Cost Website Templates Kill Your Business

Posted by Shankar
As a new business owner, it can be tempting to save a few bucks by setting up your website using free or low-cost website template services.
While it’s true you may be able to get up and running in a few minutes, there are some hidden costs to setting up your online presence using these website-building tools.
As an experienced professional, over the years my team and I have had to “rescue” many businesses from these so-called easy website solutions. Their frustration and lack of results finally reached the point where they picked up the phone and got help.
Do you use a free or cheap website building service? After reading this article, I’d love to hear about your experience in the comments section below.
Here are some compelling reasons why these free/low cost website building tools are costing you and why investing in a professional website is critically important:
1. Cheesy cookie cutter templates cheapen your brand
One of the most important elements of your online brand identity is your business look and feel. Having a logo and brand colors can help you to stand out and make you more memorable.
Truth is that most of the free and low-cost website building tools force you to choose one of their prefabricated templates.
This hurts you in several ways:
First, anyone else who uses that website template will have exactly the same look and feel as your business. This can cause brand confusion if another business appears identical to yours. Plus, it makes your business look “cheap,” as though you didn’t invest in unique branding.
Secondly, it forces you to choose from the selection of available templates. You may not find one that suits your business or properly attracts your ideal client.
Third, these templates often look old and outdated. This will make your business look amateurish and come across as cheap. No one wants to hire a business with that perception in mind, do they?
Hiring a professional web design company helps you to have website graphics that are unique and perfectly suited to your business. And to take that a step further, you can then apply your branding consistently across all your marketing for a unified, professional look.
2. Cheap website templates keep you stuck on their services forever
Think you will own your website? Think again. When you use free or low-cost services for your website, your content is locked into their unique proprietary systems and template.
This means you risk getting shut down with no warning at any time for a variety of reasons, such as being accused of spamming or otherwise breaking their terms of use.
Moreover, it is impossible to download your website files and upload them to a new website hosting service. If you ever want to change providers, or the one you are using shuts down, you are out of luck.
3. You can’t customize or add any frills
The dirty little secret these low-cost or free website services don’t tell you is you truly have to use the website as is. If you want to add any custom scripts or functionalities, you have no way to access the server to add these.
This can be very frustrating.
If you work with a professional Web developer, he or she can set up your custom website on a reputable hosting server that gives you the access needed to customize your website and add any functionality you can dream of. The sky’s the limit.
4. Difficult to optimize for search engines
While some free/low cost website templates say they offer tools for SEO, these tools pale in comparison to the ability a professional SEO service has to properly optimize your website and drive traffic.
Many of the website templates don’t give you access to the key elements needed to affect SEO such as meta tags, titles, and header tags.
Plus, website templates often use Flash technology, which can’t be read by search engines or viewed on most smartphones.
Bottom line, if you want to drive traffic, hiring a website design service ensures that you’ll be able to properly optimize your website to rank well in the search engines.
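For reference, here is a minimal, hypothetical sketch of the on-page elements a professional needs to be able to edit – the title, meta description, and header tags mentioned above (the business name and wording are made up for illustration):

    <!-- Hypothetical page head: elements many template builders won't let you edit -->
    <head>
      <title>Custom Wedding Cakes in Austin | Example Bakery</title>
      <meta name="description" content="Hand-made custom cakes for weddings and birthdays, baked fresh in Austin.">
    </head>
    <body>
      <!-- The header tag tells visitors and search engines what the page is about -->
      <h1>Custom Wedding Cakes, Baked Fresh in Austin</h1>
    </body>

If your website builder won’t let you edit these elements, there is nothing for a professional to optimize.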
5. Ugly Banners
Don’t cheapen your professionalism online by using a website template that adds free banner ads for other companies.
Not only is this ugly, but it drives visitors away from your website.
Building your own custom website means the only products and services you advertise are the ones you choose. That way, you keep visitors on your website (instead of sending them elsewhere when they click on those banners) and have a better shot at adding them to your mailing list or making a sale.
If you’ve been managing your website by yourself until now, remember that it’s hard to know everything about running a business, especially when it comes to your website. It may be time to bring in a team who can help you create a unique online presence that sets you apart and gives you the flexibility to meet your unique business needs.

How Google Manually Rates Your Website

Posted by Shankar

Over the years, Google has made reference to its stable of human raters used to determine the quality of sites, helping to improve the algorithms used for search. Those raters use the “Human Rater Handbook” to check for a wide variety of things Google has dubbed important, and they rate sites according to how well they fulfill those criteria.
Sounds like a valuable tool for SEOs to get their hands on, huh? Well, in July a copy of the handbook leaked out onto the Internet, and people have been trying to glean insights from it ever since, looking for the key to rising through the Google ranks.
The truth is, there’s no one magical key. This document doesn’t deliver any earth-shattering revelations about how Google decides its rankings. But it is a good reminder about what Google considers important, and for that reason it should be required reading for SEOs.
Of course, the document itself is also super long – 160 pages – so if you’re looking for the CliffsNotes version, read on for a quick primer on what we can learn from Google’s “Human Rater Handbook.”
The Role of Human Raters
But first, a quick summary of the role of human raters. Google uses algorithms to deliver what it believes are the most relevant results to people doing searches.
The human raters give input to ensure that these are, indeed, useful search results. Their job in a nutshell:
  • Determine the effectiveness of search results
  • Test changes to the algorithm
  • Analyze the quality of different websites
  • Assess a site’s reputation
  • Note a site’s supplementary content
What Are Some of the Most Important Points in the Quality Rater Guidelines?
The quality rater guidelines are long and very in-depth, touching on dozens of different things raters should keep an eye out for. Here is a summary of the most important and most relevant takeaways for SEOs.
1. High-Quality Content is King
What you have heard is true: content is indeed the make-or-break part of search. Google rates sites with the best content the highest. This means content that is not only useful to people searching it out but also written clearly and easily understood. The best thing your web site can do, based on these guidelines, is hire a professional writer to ensure you have the best content.

2. It’s Key to Link to Other High-Quality Sites
Google has its raters be on the lookout for links to other high-quality sources on the web. The logic is if your site is reputable, you will be linking to other reputable places and not to spammy sites that are just looking for a sale. High-quality sites include trusted resources such as The New York Times or The Wall Street Journal. Low-quality sites include link farms and places where anyone can give “expert” commentary, such as Yahoo Answers.

3. Updating Your Website Regularly is Critical
SEOs have been telling their clients for years to keep their sites updated with fresh content. These rater guidelines remind us why. Google puts a high value on information that is up to date, and for good reason. When you do a search on, say, “latest SEO guidelines,” you don’t want to read resources from two years ago. They are dated and unreliable. As a consumer, you want the fresh stuff. This is one thing SEOs and Google can agree on.

4. Including Contact Information Can Boost Your Site
Google hates spammers; this we all know. So anything that makes your site appear less-spammy is going to play well with the human raters. This includes putting contact information on your pages, so that people can get in touch with you. Leaving such information off is a hallmark of webspam. Include your information under a heading such as “contacts” or “how to get in touch.” This way it’s obvious to the Google team you are legit.
5. Positive Reviews Can Make a World of Difference
The Google human raters have been trained to be wary of sites that receive negative reviews. While they will not necessarily punish sites without any type of review, they will crack down on those that have complaints logged against them. And they will reward those with positive reviews. Monitoring your reputation online is thus all the more important, because not only do you not want negative reviews coming up in queries about your company, you also want to boost your search results.
6. Supplementary Content Is a Helpful Tool
The human raters at Google are always looking for ways that websites can help out their users. That is why Google rewards sites with supplementary content, such as resource pages or informative articles, with better search results. To give your site a lift, think about what other content might be useful to visitors.
7. Beware of Your Ad Placement
Advertising can be a hindrance when it comes to search rankings. Raters are told to keep an eye out for any disruptive advertising that distracts from the content of the page. Sites with lots of ads at the top, or ads that are hard to ignore, will be penalized.
In Google’s Own Words
What better way to figure out what Google is looking for in a web page than to hear it from the company itself? The good news is the leaked guidelines largely confirm that SEOs have been taking the right approach with most of their search tactics for years. If you are getting good results, there’s probably no reason to change what you are doing.

The Missing Ingredient in SEO Writing

Posted by Shankar

SEO writing is a specialized type of writing, where the author tries to develop content that will catch the eye of Google and result in a Web page being listed in Google’s search results.
Most writers who engage in this type of writing focus on getting the right keywords into the text to win the Google listing and to score a click from Google to their website or their client’s website.
I have done article marketing myself since 2000 and I know from personal experience this technique works. In 2010, before I closed down a couple of my less profitable domains, I was seeing three-quarters of a million visitors a year to my websites, with 6.5 million page views.
Thirty-five percent of my traffic was derived from Google: 265,000 visitors per year.
About 50 percent of my traffic came directly from the articles I had on the Internet that were promoting my websites: 375,000 visitors per year.
Even in the first 10 months of 2014, my current websites have seen 189,000 visitors, with 990,000 page views.
The bottom line is that article marketing still works, when you do it correctly.
The problem is that the way most SEO writers do it WILL NOT produce the same kinds of results I have seen.
Most freelance SEO writers are focused only on producing just enough words to get paid and sprinkling in enough keywords to make their services seem valuable to their webmaster clients.
Given how most SEO writers approach writing for the search engines, it is no wonder that Google’s Farmer update devastated the article directories a few years ago. EzineArticles, for example, saw a 90 percent decrease in its Google search results, which also led to a significant decline in EZA’s traffic.
Even after the devastation of the Farmer update and subsequent Google updates, most SEO writers have missed the boat on how to overcome the real problem.
When they took the hit, EzineArticles responded by telling authors to increase the average word count of articles from 300 words to a minimum of 400 words per article (and 600 words in “spam prone” niches, in 2013). In doing so, it helped perpetuate the myth that the problem was related to low word count.
When flying airplanes, all pilots know that if they begin a journey of 1,000 miles and are off by a single degree on their trajectory, they will end up 16.7 nautical miles from their intended destination. If their flight is off by five degrees, they will end up a full 83 miles off course.
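For the curious, those figures follow from the pilots’ “1-in-60” rule of thumb – roughly one mile off track per degree of error for every 60 miles flown:

    off-track distance ≈ (degrees of error ÷ 60) × distance flown
    1/60 × 1,000 ≈ 16.7 miles;   5/60 × 1,000 ≈ 83 miles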
If you are telling your SEO writers that the solution to the problem of how to overcome Google’s anti-spam technology is to write more words, then you are sending people down a path that will leave them miles from their intended destination.
The problem with SEO articles was NEVER low word counts…
The problem with SEO articles was that they never appealed to the people who were finding those articles in Google’s search listings.
Google was intent on improving their users’ search experience. To make sure that their users did not go somewhere else for search services, Google had to take steps to improve the quality of the materials available to them.
Google’s users do not want to read 600 words of gibberish, intended to fill a page with the words that will look good to Google’s search algorithm.
Instead, Google’s users wanted exactly what they have always wanted:
• Articles that answer a question they might have…
• Articles that help them solve a problem…
• Articles that tell a story they want to read…
• Articles that deliver the best quality information in an easy-to-understand and interesting manner…
When was the last time that you read an “SEO article” and found it informative and fun to read?
What is that you said?
Never?
Our future customers are selfish…
For some reason, people want to be able to read fun or interesting articles that answer their specific needs.
The one thing, above all other things, that SEO writers fail to do is to have empathy for their readers.
They fail to realize that people want to read articles that offer more than a jumble of words containing the keywords of interest.
They fail to give readers what they want.
Search engines like keywords… People want answers.

Here is the interesting aspect of this approach to writing articles…
If your article answers the reader’s questions in a way that the reader felt was interesting or useful, then people will share your article on social media.
And what is the key to getting Google to pay attention to your article? Links from external websites, including social media.
Finally, we can see the nuts and bolts behind creating SEO content that makes a real difference for the people using it…
Sure, keywords are important to get the attention of Google, but so is the quality of the story you tell.
Google needs two pieces of information to judge whether your content is useful to its users:
• Keywords – to understand the nature of the content;
• Links from Third-Party Websites – to understand the importance of the content.
Any SEO writer who fails to understand what it takes to create content that appeals to the end user is, in essence, stealing money from your marketing budget by selling you false hope and pretend rewards.
It has been my experience that you can hire freelance writers at pretty much any rate at which you want to pay for those services. You can hire writers for peanuts, or you can pay more to ensure a better quality article.
When you hire a freelance writer, it is your job to tell your writer what kind of content you want in return for your money.
If you tell your writer that you want SEO articles, you will most likely get 600 words of pure gibberish, dotted with a few keywords here and there.
If, however, you instruct your writer to create an article that answers a question for the reader, then you will get the kind of content you want written, and it will actually help with your SEO goals.
Remember that old saying… “He who has the most money makes the rules.”
If the freelance writer wants your money, they will follow your rules. So, don’t be afraid to tell people what kind of content you want them to create for you.

“The Missing Ingredient in SEO Writing” should be obvious by this point…
If you don’t give readers what they want, then Google cannot and will not give you what you want – more Google listings, more traffic and more sales…
Now that you know how to make your SEO writing shine, what are you going to do with this information?

What You Can Learn from KISSmetrics’ SEO Strategy

Posted by Shankar

In this post, I will provide a comprehensive technical inbound audit of KISSmetrics. This example is extremely detailed, and it covers a wide range of inbound topics. But before the inbound marketing audit, let’s cover the basics…
Who is KISSmetrics?
KISSmetrics is a Web analytics solution that helps companies make smarter business decisions and boost ROI.
Headquartered in San Francisco, KISSmetrics is backed by a syndicate of angels and early stage funds.
Why Audit KISSmetrics?
Like everyone working in SEO and inbound marketing, we all overlook or miss things (even basic things). The goal of this post is to help KISSmetrics, to help others learn from their inbound successes (they’ve done remarkable things in this arena), and to give a third-party perspective on a great example.
To be fair, “audit” seems like a terrible word, but in this case, think of “audit” as “being helpful.” The purpose of this audit is to give an outsider’s view of a great company. All in all, I hope this helps KISSmetrics. Neil and team are outstanding.
Disclaimer
Before I jump into the technical details of the inbound audit, it’s important to note that I have no affiliation with KISSmetrics (I am also not currently a user of their product – so you don’t have to worry about affiliate links). KISSmetrics also did not ask me to complete this audit. As a result, I don’t have access to any of the site’s analytics or webmaster tools accounts, and I don’t have access to the site’s content management system (CMS).
In a typical audit, I would begin by sifting through the data and then narrowing in on problem issues. So… if I make completely inaccurate observations, I blame my data from the third-party tools (i.e., Searchmetrics, Ahrefs, SEMrush, Moz, etc.). Don’t get me wrong – these tools are awesome, but this kind of analysis generally works better from inside the company.
The goal of this audit is to help KISSmetrics. The aim is never to critique in a negative, harmful way, but to help promote KISSmetrics through inbound marketing by giving them the perspective of an objective third party. With those disclaimers out of the way, let’s begin.
*If you have any questions: @elioverbey
Since this post is extremely long, here is a detailed outline to help you keep track of where you are throughout the audit. As you will see, this audit spends the majority of its time on optimization (top of funnel), but lower areas of the funnel are covered as well. The audit works through four main areas, as seen below: Optimization (Attract), Conversion, Customers, and Delight.
  • KISSmetrics Web Structure
  • Optimization / Attract
    • Search Visibility
    • Robots
    • Robots Meta Tag
    • Accessibility
    • Performance
    • Site Architecture
    • Authority Flow
    • Click Depth
    • On Page Factors
      • HTML Markup
      • Structured Data
      • Head Tags
      • Open Graph
      • Twitter Cards
      • Titles
      • Meta Descriptions
      • Images
      • URLs
      • Duplicate Content
      • External Links
    • Off Page Factors
      • Backlinks
      • Backlink Distribution
      • Backlink Source
      • Anchor Link Analysis
      • Image Links
  • Conversion / Leads
    • Calls to Action
    • Landing Pages
    • Forms
  • Close / Customers
    • Contacts / E-mail
  • Delight
    • Social Media
    • Engaging Content
  • Summary
Indexability / KISSmetrics Web Structure
In order to understand this audit, you need to understand how KISSmetrics is set up. At first glance, you’ll notice that KISSmetrics is constructed using quite a few subdomains:
  • kissmetrics.com/ – 18 pages
    • *tweets.kissmetrics.com – 2,587 pages
    • blog.kissmetrics.com – 1,038 pages
    • *status.kissmetrics.com – 70 pages
    • grow.kissmetrics.com – 7 pages
    • focus.kissmetrics.com – 10 pages
    • support.kissmetrics.com – 347 pages
    • middleman.kissmetrics.com – 1 page
    • styleguide.kissmetrics.com – 7 pages
    • uptime.kissmetrics.com – Redirects to status.kissmetrics.com
    • demo.kissmetrics.com – 100 pages
An * (asterisk) indicates that the subdomain is blocked by the robots.txt.
In total, KISSmetrics has 4,185 pages across 11 domains and subdomains.
As you glance at the site structure of KISSmetrics, you notice that their content is divided up into subdomains. Due to this structure, the inbound audit will focus solely on two areas of the site: the root domain (kissmetrics.com) and the blog (blog.kissmetrics.com). The other subdomains do not contain information relevant to this inbound audit (e.g., styleguide) and will not be included in the analysis.
Inbound Marketing
In its most basic form:
“Inbound marketing focuses on earning, not buying, a person’s attention.” – Brian Halligan
The principle of inbound marketing has been around for years, but the term was coined by HubSpot (well done, Dharmesh and Brian). Inbound marketing is all about creating great content that pulls people in.
Since HubSpot is the go-to on inbound, this audit will use their structure as a model. As this audit continues, you’ll notice how the funnel naturally works.
[Image: inbound marketing funnel – Attract stage]
Optimization / Attract
The first step in inbound marketing / SEO is attracting the right audience. Content is the single best way to attract new visitors to your website. In order to be found by the right prospective customers, KISSmetrics must not only create optimized content, but foster an environment in which content can thrive.

Search Visibility

In order to set the framework for the entire audit, it’s important to analyze the site’s performance. When looking at KISSmetrics’s traffic, there are 3 important questions:
  • Is the traffic growing?
  • Has the site been penalized?
  • What is the percentage of organic traffic?
To determine KISSmetrics’s traffic performance, I used two tools:
Searchmetrics Suite, and one of my new favorites, SimilarWeb. Finding accurate third-party data is incredibly difficult, but after using Google Analytics on a network of sites, I’ve found that these two programs provide the most accurate third-party data.
As you can see from the graph below, there are some red flags for KISSmetrics (the tools join all traffic from all subdomains together):
[Image: search visibility chart for KISSmetrics, showing a sharp drop in May 2014]
As the graph above shows, the site’s visibility decreased dramatically after May 18, 2014. This date is important because it also corresponds to when Google updated Panda 4.0.
Immediately, there is cause for concern on KISSmetrics’s site. As you begin to look over the third-party data, there is a strong correlation between the data (although third party) and Google’s Panda update…
There are three possible explanations for this drop in traffic:
  1. The data is inaccurate
  2. The site was algorithmically penalized due to duplicate content
  3. The site was algorithmically penalized due to guest bloggers
The first possibility: The data from the third party is inaccurate. In fact, it seems highly improbable that KISSmetrics was hit with an algorithmic penalty, right? Neil Patel (co-founder of KISSmetrics and SEO expert) wrote an article for Search Engine Journal about the Panda 4.0 update the day after Panda 4.0 was released.
In the post, Neil describes the update, who was affected, and how to fix it. Does it not seem like KISSmetrics (his company) would be prepared for Panda? Since he knew so much about Panda, he would have protected his assets.
Besides that, KISSmetrics writes lengthy (2,000+ word average), unique content that is well received (only 12 of their articles don’t have any comments – meaning, the other 1,000+ have engagement via comments).
That is one possibility. But the other possibilities are cause for more concern:
The second possibility: KISSmetrics was hit by Panda 4.0 and here is the probable reason:
For some odd reason, the KISSmetrics blog exists on two protocols: http and https. Meaning, the same content (although unique and engaging) can technically be seen as duplicate.
Let’s look at an example. As of today, the most recent post from KISSmetrics is: Website Testing Mistakes That Can Damage Your Business (this post was chosen due to recency, but this same problem exists throughout). You can find this article in two places:
  • http://blog.kissmetrics.com/website-testing-mistakes/
  • https://blog.kissmetrics.com/website-testing-mistakes/
As you can see from the image below, the same page exists in two places (unfortunately, this is duplicate content and cause for concern):
[Image: the same blog post loaded over both http and https]
You’ll notice that in the image above, on the left is the normal protocol (http://), and on the right is the secure protocol (https://). Google drops the http:// protocol in the browser on the left by default. I also checked the source code of this same page and found three problems that could have led to the drop in traffic on May 19th (Panda 4.0). I highlighted the three key problems in the diagram below:
[Image: source code of the post on both protocols, with three problems highlighted]
  1. Once again, I highlighted the protocol so you can see that the exact same page exists in two locations.
  2. Both of these pages are INDEX, FOLLOW. If one of them were NOINDEX, there wouldn’t be a huge problem, but KISSmetrics is telling the search engines to index both of these pages. This is another red flag.
  3. The third red flag is the canonical. The canonical tag tells search engines which URL should be treated as the original version of a page. In this case, each page canonicals to itself: one page says the canonical version is the http:// URL, and the other says it is the https:// URL. Basically, each page is claiming to be the original (see the sketch below).
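To make the conflict concrete, here is a simplified sketch of what the head of each protocol version contains, based on the red flags above (not a verbatim copy of KISSmetrics’s source):

    <!-- http://blog.kissmetrics.com/website-testing-mistakes/ -->
    <meta name="robots" content="index, follow">
    <link rel="canonical" href="http://blog.kissmetrics.com/website-testing-mistakes/">

    <!-- https://blog.kissmetrics.com/website-testing-mistakes/ -->
    <meta name="robots" content="index, follow">
    <link rel="canonical" href="https://blog.kissmetrics.com/website-testing-mistakes/">

If both pages pointed their canonical at a single preferred URL, search engines would know which version is the original.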
Finally, Google has noticed that the blog exists in two locations. As you can see below, a query for the blog brings back both protocols (http and https).
[Image: Google search results showing the blog indexed under both http and https]
This means that the KISSmetrics blog was probably hit by an algorithmic penalty on May 19th.
Even if the third-party data is wrong (and KISSmetrics was not hit by a penalty), they should fix these problems immediately. KISSmetrics should decide which URL is primary, and then 301 redirect or canonicalize to it.
Even more, as it currently stands, the site is splitting its link equity between https and http. If they fix this problem, they could see a rise in rankings as those links are consolidated.
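If they choose https as the primary version, a sitewide 301 redirect is one common way to consolidate. A minimal sketch, assuming an Apache server with mod_rewrite enabled (KISSmetrics’s actual server stack is not known to me):

    # Hypothetical .htaccess rules: send every http request to the https version of the same URL
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Equivalent rules exist for nginx and other servers; a canonical tag pointing at the https URLs would also work.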

The third possibility: KISSmetrics was hit by Panda 4.0 because of guest bloggers.
I am not implying that KISSmetrics was penalized because “they didn’t stick a fork in their guest bloggers.” But KISSmetrics could have been penalized for duplicate content from their guest bloggers (quite a few of KISSmetrics’ posts are written by guest bloggers). Guest writers could have written for KISSmetrics and then re-published that content on their own blogs.
The KISSmetrics publishing guidelines contain nothing stating that a writer cannot re-publish his or her own content. Although the duplicate content issue could be covered in other forms of communication with the writers (a contract, etc.), it remains a possibility.
Penalty Conclusion
Although I am hoping for the first possibility (I wouldn’t wish a penalty upon anyone), these are issues that KISSmetrics should thoughtfully consider. Again, I cannot stress enough the unreliability of third-party data, but in any case, these are issues that should be addressed immediately. Hopefully, Neil can chime in on this issue. I’d really love to know whether these have been problems for KISSmetrics (plus, Neil is extremely transparent and helpful).

Robots

A robots.txt file is used to restrict search engines from accessing specific sections of a site. Here is a copy of the robots.txt file for KISSmetrics’s root domain:
[Image: robots.txt for kissmetrics.com]
On their blog, KISSmetrics uses another robots.txt file:
[Image: robots.txt for blog.kissmetrics.com]
There are a few improvements that KISSmetrics could make in the construction of their robots.txt.
First, on the blog, KISSmetrics should consider blocking ‘/wp-content/plugins/’. Sometimes developers put links in their plugins.
Second, on the blog, KISSmetrics should rethink blocking ‘/wp-includes/’ in robots.txt. There are better ways to keep those pages out of the index than the robots.txt file (e.g., a NOINDEX meta tag).
Finally, KISSmetrics should add a working sitemap to their blog. The current sitemap (Sitemap: http://blog.kissmetrics.com/sitemap.xml.gz) does not work.
[Image: 404 error returned for the blog sitemap URL]
Fixing this will help the robots to index their pages.
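Putting those three suggestions together, a revised blog robots.txt might look something like this (a hypothetical sketch – the sitemap line assumes a working sitemap is published at that URL):

    # Hypothetical robots.txt for blog.kissmetrics.com
    User-agent: *
    Disallow: /wp-content/plugins/
    # /wp-includes/ is no longer blocked here; handle those pages with a noindex meta tag instead
    Sitemap: http://blog.kissmetrics.com/sitemap.xml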
Robots Meta Tag
Each page on a site can use a robots meta tag to tell search engine crawlers if they are allowed to index that page and follow its links.
WordPress is a great open source platform – one that KISSmetrics built their blog on – but duplicate content is one thing you have to be very mindful of. Using the robots meta tag will help prevent duplicate content. Content duplication issues include tags, categories, and archives.
Quite a few of KISSmetrics’s pages have a meta robots tag. In the case mentioned above (wp-includes), it would benefit KISSmetrics to include a “noindex” robots meta tag on a per page basis, rather than blocking an entire directory.
One thing KISSmetrics should consider NOINDEXing is blog subpages. As you can see below, the sub-pages are indexed:
[Image: blog sub-pages appearing in Google’s index]
Since these pages take up crawl bandwidth, don’t have unique content, and historically have a low CTR, KISSmetrics (based on their analytics) should consider “NOINDEX, FOLLOW” on these archive-based pages.
KISSmetrics has successfully implemented this change on subpages of categories, topics, and topic sub-pages, and should consider adding the same meta robots tag to the other archive-based structures. The “NOINDEX, FOLLOW” directive will remove the subpages from the index but will still allow links to pass.
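In practice, that means each archive sub-page (for example, page 2 of the blog index) would carry a robots meta tag like this minimal sketch:

    <!-- On archive sub-pages: keep the page out of the index, but let crawlers follow its links -->
    <meta name="robots" content="noindex, follow">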
Accessibility
This section covers best practices for both search engines and users. Many of the search engines’ accessibility issues were mentioned above under indexability; now we’ll cover accessibility mainly as it relates to users, with the benefits for robots in mind.
Performance
According to Google’s Webmaster Central Blog:
You may have heard that here at Google we’re obsessed with speed, in our products and on the web. As part of that effort, today we’re including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests… While site speed is a new signal, it doesn’t carry as much weight as the relevance of a page.
It’s tempting to dismiss site speed as an important SEO ranking factor, but if Google says it matters (even a small percentage), then it matters. Even if you dismiss speed as an optimization factor, it is also an inbound factor that can’t be ignored. Who likes a slow site?
KISSmetrics’s site speed is fairly slow, and KISSmetrics could easily make a few tweaks to make their site much more efficient. As you can see below (via GTMetrix), the site speed for the root domain is:
[Image: GTmetrix page speed scores for kissmetrics.com]
The problems on the root domain above are quick fixes. The blog (blog.kissmetrics.com) did much better, scoring an 88 percent and a 79 percent.
Reducing the number of files needed to load the site, and thereby reducing the number of HTTP requests, will make KISSmetrics’s site load more quickly. Currently, the root domain makes 64 requests when loading the page (which is somewhat surprising, considering the homepage is only a few images and includes less than 70 words.)
There are usually three parts to fixing this:
  • Reduce the number of JavaScript files
  • Reduce the number of CSS files
  • Reduce the number of images
On the homepage, the Time to First Byte is efficient (300 ms), but the page load is somewhat slow – anywhere from 1.95 seconds (Pingdom) to 2.95 seconds (GTmetrix). The page received a 55/100 on Google’s PageSpeed Insights (42 on mobile) and a 75/100 on Pingdom. KISSmetrics’s homepage has room for improvement.
Another concern is that their blog pages make over 100 HTTP requests, but surprisingly, even with all those requests, the page weight was excellent: 933 KB.
The site could be improved with the following:
  • Eliminate render-blocking JavaScript and CSS
  • Minimize HTTP requests – KISSmetrics’s pages will load more quickly with fewer requests. Minimizing requests means reducing the number of files that have to be loaded, such as JavaScript, CSS, and images.
  • Combine JavaScript and CSS into external files linked from the header. This allows the external files to be cached so they load faster (there are 10 CSS files and 4 JS files loaded separately on the blog).
  • Implement server-side / browser caching – this creates a static HTML page for a URL so that dynamic pages don’t have to be recreated each time the URL is requested.
  • Load JavaScript asynchronously (see the sketch after this list).
  • Use a CDN – such as Amazon’s. The CDN will allow users to download assets more quickly (as far as I can tell, they are not using a CDN).
  • Finally, use 301s only when necessary. A 301 forces an extra request to reach the new URL, which adds load time (93 of their pages use 301s).
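As a small illustration of the first and fifth points above, render-blocking assets can be combined and loaded asynchronously. A hypothetical sketch (the file names are made up):

    <head>
      <!-- One combined, cacheable stylesheet instead of ten separate CSS files -->
      <link rel="stylesheet" href="/assets/site.min.css">
    </head>
    <body>
      <!-- ...page content... -->
      <!-- One combined script, loaded asynchronously so it does not block rendering -->
      <script async src="/assets/site.min.js"></script>
    </body>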
Site Architecture
The site architecture defines the overall structure of a site, and it has a number of important SEO implications. For example, when a page receives external authority, the site architecture defines how that authority flows through the rest of the site.
Additionally, since search engine crawlers have a finite crawl budget for every site, the site architecture ultimately dictates how frequently pages are crawled (or if they’re crawled at all).
Authority Flow
To understand how authority flows through the site, I performed an analysis on the site’s internal links.
Based on that analysis, here is the distribution of the site’s links (these values have been rounded to the nearest percentage):
[Image: distribution of internal authority across KISSmetrics pages]
As you can see above, 96 percent of the pages have a value less than 0.1 (pages with authority values closer to 0 have the least authority and pages closer to 1 have highest authority), relative to the other pages. Most of the site’s internal authority is held by 3.8 percent of the pages.
The root cause of this distribution is the site’s navigation. All of the pages with large numbers of internal links (1,800+) are linked from the footer or header:
/infographics (1,834)
/marketing-guides (1,830)
/webinars (1,822)
/topics (1,820)
Since these links appear sitewide (i.e., on every page), these pages receive many internal links, while the other pages receive very few.
Even though the article categories on the site break down the articles into substantial sections, the internal links do not seem to pass through to individual articles (which is the case with almost every navigation).
The related posts section on each article works well to accomplish this purpose, but not every post includes the related posts widget.

Sony Posts $1.2B Loss for 2Q as Mobile Continues Slide

Posted by Shankar

Sony has reported a $1.2 billion net loss for its second quarter as the Japanese firm’s mobile division continues to struggle.
The loss was not as great as Wall Street had been predicting, primarily due to demand for PlayStation 4 consoles. Those sales helped offset the company’s writedown of its Xperia smartphone business. Thomson Reuters had predicted Sony would post an operating loss of $1.47 billion.
The company posted an operating loss of $785 million in the quarter, in spite of a 7.2 percent year-on-year revenue increase to $17.45 billion.
Sony has lost money in six of the last seven years and, lately, it has been because of its beleaguered mobile division, which suffered a loss of $1.58 billion this quarter.
Sony’s gaming unit is a boon for the company, however, helping offset mobile losses. Gaming revenue jumped 83 percent year-on-year thanks to the PlayStation 4. Sony’s devices business, meanwhile, which includes smartphone camera lenses and image sensors, posted $271 million in revenue, an increase of 187 percent compared to the year-ago quarter.

Apple CEO Says He’s ‘Proud to be Gay’

Posted by Shankar

Tim Cook is gay and proud of it.
The Apple CEO penned a missive for Bloomberg Businessweek, published today, in which, for the first time, he publicly acknowledged his sexual orientation.
Although Cook said he has never denied being gay in his day-to-day life, he has always aimed to keep his personal life private from society at large.
One of his favorite quotes – from Dr. Martin Luther King Jr. – however, convinced him he was not doing all he could as a public figure to ensure that gay people are not discriminated against.
“I believe deeply in the words of Dr. Martin Luther King, who said: “Life’s most persistent and urgent question is, ‘What are you doing for others?’ ” I often challenge myself with that question, and I’ve come to realize that my desire for personal privacy has been holding me back from doing something more important,” Cook wrote. “That’s what has led me to today… I’ve had the good fortune to work at a company that loves creativity and innovation and knows it can only flourish when you embrace people’s differences. Not everyone is so lucky.”
Cook said he is “proud to be gay,” adding that it has given him “a deeper understanding of what it means to be in the minority and provided a window into the challenges that people in other minority groups deal with every day.”
Cook said it has “been tough and uncomfortable at times,” but has given him confidence to be himself and the ability “to rise above adversity and bigotry. It’s also given me the skin of a rhinoceros, which comes in handy when you’re the CEO of Apple.”
“I don’t consider myself an activist, but I realize how much I’ve benefited from the sacrifice of others,” he added. “So if hearing that the CEO of Apple is gay can help someone struggling to come to terms with who he or she is, or bring comfort to anyone who feels alone, or inspire people to insist on their equality, then it’s worth the trade-off with my own privacy.”

About 9Tech Info

9TechInfo.blogspot.com is a blog for tech geeks and bloggers where we share fresh, updated information on the latest mobiles, automotive, Internet, social media, gadgets, blogging, how-to guides, SEO and much more related to technology. You can read about the admin of the blog here and check out our Sitemap.
