Informational keyword research is a subject that has been covered thousands of times across every SEO blog, publication, and web design company. However, with voice becoming a more prominent way of searching, it's important that it is now taken into consideration. With voice usage growing, marketers need to understand how their audience is using this technology and how they can adapt to it.

Keyword research has advanced dramatically over the last couple of years. Gone are the days of simply sorting by the highest search volume and creating a page; it comes down to much more than that. Semantics, categorisation, ranking difficulty vs reward, questions, featured snippets, People Also Ask: the list goes on. A straightforward task has become much more complex and time-consuming, and it's important to get it right the first time, as keyword research will tend to influence your strategy, projections and, in some cases, KPIs.

We'll be using a variety of paid and free tools within this guide. Even without the paid tools you will end up with a large dataset, though it will require a little more manual work.

Keyword set

We're going to assume you already have a website that is established in some form or another, meaning you already have some rankings which can be used as an initial starting point. If this isn't the case, you can simply skip these steps, although in some cases that could cause you to miss out on some smaller keywords.

Search Console

Underrated a lot of the time and critiqued heavily, Search Console is nevertheless a free tool that gives you lots of data; with recent updates you're able to obtain 16 months' worth. It can sometimes be slightly inaccurate, but that's still much more data than you will be able to get from many paid tools. To get the most out of it, try to filter down to the pages on your site which are informational – for example, your blog, a hub or guides.
This will give you a top list of URLs which you can then dig into, providing you with a list of keywords to expand on. We suggest taking a list of the main subjects you find here so you can dig into them further later on.

Autocomplete

Another free tool that we can take advantage of is Autocomplete. Autocomplete, or Google Suggest, has been around for 14 years now, which simply seems insane. The feature, created on a bus by Kevin Gibbs, has given us bundles of joy throughout the years, as well as causing controversy in other instances, but it makes a great research tool.

This is something we have created in-house tools to take care of, simply due to the nature of the task. However, it's something you can carry out manually by adding your keyword into the search box and grabbing the suggestions. It's worth noting that the monthly search volumes and CPC estimates here are all generated by the Keywords Everywhere tool, which is a must-have as it can give you instant feedback on how many people are searching for a keyword, as well as its commercial intent. This will work well for some industries but not all. Again, take down all of the keyword suggestions you find and add them to the list of ideas generated from Search Console.

Now, this is where we start to look at voice in more detail. We know that voice search is mainly used for asking questions – the whos, wheres and whats of the world. Having a list of these question modifiers allows you to take the Google Suggest keyword data even further.
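If you'd rather script the suggestion-grabbing than paste every modifier-plus-keyword combination into the search box by hand, a rough sketch is below. Note the assumptions: the modifier list and seed keyword are illustrative, and suggestqueries.google.com is an unofficial, undocumented Google endpoint that may change or be rate-limited at any time, so treat this as a sketch rather than a supported integration.

```python
from urllib.parse import urlencode

# Question modifiers for voice-style queries; this list and the seed
# keyword below are illustrative assumptions, not a canonical set.
QUESTION_MODIFIERS = ["who", "what", "where", "when", "why", "how"]

# Unofficial autocomplete endpoint (assumption: undocumented, not a
# supported Google API, and liable to change or be rate-limited).
SUGGEST_ENDPOINT = "https://suggestqueries.google.com/complete/search"

def modifier_queries(seed):
    """Pair every question modifier with the seed subject."""
    return [f"{mod} {seed}" for mod in QUESTION_MODIFIERS]

def suggest_url(keyword, hl="en"):
    """Build a request URL; the JSON response has the shape
    [query, [suggestion, suggestion, ...]]."""
    return SUGGEST_ENDPOINT + "?" + urlencode(
        {"client": "firefox", "hl": hl, "q": keyword}
    )

for query in modifier_queries("voice search"):
    print(suggest_url(query))
```

Each returned suggestion list can then be pasted straight into your keyword sheet, exactly as you would with the manual method.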
These question modifiers allow us to really dig down into the questions people are asking, as these may not always have a lot of search volume or be easily findable. By using wildcards in the search queries alongside the main subject, you will expand the keyword set even further. Some questions may not make sense and may not be keywords you want to go after. But doing this for all of the question modifiers, along with your key subjects, will give you a great starting position for your voice-focused keyword research.

Related queries & People Also Ask

Yes, we're still on the same page – there are more sections where we can continue to expand the keyword set. Related queries and People Also Ask are great free sources to further expand on what you already have. If you do this with all of the question queries you have obtained from the initial Search Console and suggested search data, you will then have a huge list of questions people are asking about your chosen areas. This gives you a great starting point for targeting people searching via voice, as well as normal typed searches.

Competitors

This isn't anything new, but it is an important step in making your dataset as fool-proof as possible. Entering your competitors into Ahrefs or SEMrush, filtering down by their informational areas – or simply anything with one of the question-related modifiers – and adding this to your list is a very quick and simple way of making sure you aren't missing anything the competition is doing.

Ahrefs

This is one of our favorite tools right now. It's proven to be one of the best at backlink exploration, but it's also great for keyword data, especially when building up a keyword set.

Creating a keyword list

First of all, you'll need to set up your keyword list based on the data you have gathered so far. This will then automatically pull through the search volumes, clicks, difficulty and many more data points.
It's important to note that some of the data may not have been updated in some time. If this is the case, you'll need to use your credits to re-run it and gather new data.

Questions

As we know, voice search is all about questions. This is where we can further expand the keyword set using Ahrefs' questions section, and it's going to be one of the most important areas of the keyword research. Ahrefs only takes the first 10 keywords from the keyword list, though, so it may be worth inputting the main categories to begin with, finding questions around those, and gradually digging deeper once you have the main questions. Depending on the niche, there are going to be a lot of long-tail questions with very little search volume. It's worth filtering through these to see if they are in fact useful. If so, keep them; if they don't make sense, get rid of them.

Other data gathering tools

As well as the 'normal' keyword research tools, there are many other places you can find data on the questions people are searching for. Again, depending on the niche, this may or may not be useful to you.

Quora is a great place for gathering information around voice-based searches, as it's a platform for exactly that – asking questions. (The example here is simply the top five results for a seed keyword in the search bar.) Search for some of your main topics, grab the lists of results and again feed them into your keyword set, either in Ahrefs or your Excel sheet.

The same can be said for Pinterest, which is great for lifestyle and retail websites as Pinterest is, of course, a very visual platform. A quick search gives another list of suggested searches, which you can then take, expand on using the variety of techniques already outlined, and add to the already extensive dataset we have collated. Pinterest is a great platform for finding retail keyword ideas, informing both offline and online strategies.
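Once you have suggestion lists from these different sources, you'll want to merge and de-duplicate them before they go into Ahrefs or your Excel sheet. A minimal sketch of that step is below; the sample lists are invented for illustration, standing in for your real Search Console, autocomplete, Quora and Pinterest exports.

```python
import csv
import io

# Invented sample output from the sources discussed above.
sources = {
    "search_console": ["How does voice search work", "voice search stats"],
    "autocomplete": ["how does voice search work", "what is voice search"],
    "pinterest": ["voice search ideas", "what is voice search"],
}

# Normalize case and whitespace so the same question isn't counted twice.
merged = sorted({kw.strip().lower() for kws in sources.values() for kw in kws})

# Write a one-column CSV ready to import into your keyword tool.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["keyword"])
writer.writerows([kw] for kw in merged)
print(buf.getvalue())
```

Running this against real exports keeps a single master list, so the volume, difficulty and click data only needs pulling once per unique keyword.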
Local intent keywords

Voice search is not only about content-driven informational terms. Sometimes it may be as simple as asking for opening times, contact details or business history. It's important that these are factored into your keyword sets. If you aren't a business with a lot of branded search volume, it may be difficult to find specific keywords with tools; however, data from GSC may still provide you with some great insight. We would suggest keeping a separate document to make sure that any questions around your specific brand are answered. This may mean making sure your structured data is up to scratch, your Google local listings are complete, or that the content on your site includes what people are looking for.

The final keyword set

Keyword research for voice is not much different from how you would normally carry out informational keyword research. However, there is much more of a focus on questions – not only high-volume questions, but ones which could have 0-10 searches a month simply due to their long-tail nature. Running through this process multiple times with your different seed keywords will help you build out an extensive list of questions and informational terms people are searching for. This will not only help influence your content strategy but your sales strategy too – knowing what people are looking for is the first step in understanding your user or customer base.

from https://searchenginewatch.com/2018/10/18/guide-voice-search-keyword-research/
The rise of voice search is no secret, and many companies are still wondering how to address it. When it comes to search data, how can we monitor which queries are from voice? In this article, Jason Tabeling shows how he finds insights into voice search from his own company's data.

Alpine.AI estimates that more than 1 billion voice searches will be completed in 2018. At this point I'm sure that everyone has seen or done a voice search, even if you just saw an example in a Google Home or Amazon Alexa ad. The power of what can be completed with a voice command is growing by the day. This trend is already having a massive impact on consumer behavior and therefore needs to be a consideration when monitoring or optimizing our search accounts.

Right now Google doesn't provide specific information on how a search was initiated – for example, whether it was a Google Home search, from the app, done via typing or via voice. However, I wanted to take a dive into our own search data from my company, BrandMuscle. I used Google Ads client data across all verticals, comparing the first nine months of last year with this year: 1/1/17–9/30/17 versus 1/1/18–9/30/18. The data revealed some interesting trends that give insight into how voice searches might be showing up in the data we currently have available.

Looking at the length and type of queries

To start, I looked at the length of search queries. The thought is that with voice search, consumers use more natural language. So instead of searching for "car insurance", consumers might search for "cheapest car insurance for Toyota Camry". Our data shows the average number of words in each query has increased for both mobile and desktop queries year over year. Desktop queries are up 7% year over year to 4.3 words per query. Mobile has increased 9% year over year to 4.14 words per query.
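Checks like these are easy to reproduce on your own search-terms report. The sketch below computes average words per query, and also the share of queries opening with a question word, over a small invented sample; in practice you would load the search-terms export from your own account.

```python
# Question modifiers that signal conversational, voice-style queries.
QUESTION_MODIFIERS = {"who", "what", "where", "when", "why", "how"}

# Invented sample; in practice, load the search-terms report you
# exported from your ads or analytics account.
queries = [
    "car insurance",
    "cheapest car insurance for toyota camry",
    "where to buy running shoes near me",
    "what are the best running shoes for flat feet",
]

# Average query length in words.
avg_words = sum(len(q.split()) for q in queries) / len(queries)

# Share of queries that open with a question modifier.
question_share = sum(
    1 for q in queries if q.split()[0] in QUESTION_MODIFIERS
) / len(queries)

print(f"average words per query: {avg_words:.2f}")
print(f"share starting with a question modifier: {question_share:.0%}")
```

Run the same two numbers over last year's export and this year's to see whether your account shows the same trends described here.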
With mobile increasing more rapidly, I think this shows the impact voice is having, as well as larger devices enabling consumers to input longer queries.

The other area I wanted to dig into was what types of search queries consumers are using. I think voice search has brought an increase in consumers asking questions in more natural language, so I looked into question modifiers like who, what, where, when, why, and how. The use of these terms indicates a more conversational tone. Our data has shown an increase in these terms of 118%, with a majority of that change coming on mobile phones, which are up 178%.

So what actions should you take with the increase in voice search? Here are three considerations:

1. Build experiences for question-based queries

One of the reasons search is such a great marketing tactic is that consumers are giving you specific clues as to what they want. This is even more true when a question modifier is included. For example, the implications behind what type of information a consumer is looking for when searching for "where" + running shoes vs. "what" + running shoes are very important. "Where" would indicate the consumer wants map or location information, whereas "what" indicates more research-based content is needed. Having ad copy and landing page experiences that meet these specific demands will help increase conversion rates.

2. Monitor your search query data

Some of you might be reading this and questioning whether this data matches your own search accounts. I bet it's directionally correct, but it might vary significantly. The only way to know is to dig into your own data and see what insights you can derive. Did your ad appear for any queries where it should not have? This should be a part of regular account maintenance as you update match types and negatives.

3. Test voice queries that are important to your business

Putting yourself in the shoes of one of your customers is still a very helpful lens for account managers.
We often take our eyes off the consumer experience, considering how much focus and attention we put into the details of a good search program. However, sometimes it's as simple as seeing what the search query results look like as a consumer. Try a search on Google Home. Does it provide an answer? Is it an answer that was taken from your organic listing? What opportunities can you uncover with this information?

Voice search is having a major impact on our lives as consumers and marketers. This change is happening quickly and will continue to evolve as the technology gets better and better. Following a few of these quick tips to keep an eye on your data will be very helpful in staying ahead of this trend.

from https://searchenginewatch.com/how-voice-queries-show-up-search-data

WordPress initially started out as a 'blog-only' platform, and now that it has grown into a full-fledged content management system, it remains a popular blogging platform. WordPress.com blogs have over 409 million monthly viewers, who looked at 22.4 billion pages per month this past year. This fact alone is enough to justify the popularity of WordPress as people's favorite blogging platform.

WordPress provides a lot of helpful features for blogging enthusiasts who are looking to start their own blogging website. However, inexperienced bloggers do commit some mistakes in spite of all the online help available. In this blog post, we will review the most common WordPress SEO mistakes that bloggers commit out of either ignorance or sheer carelessness. Regardless of the reason, these mistakes affect the search engine ranking of their blogs and even their online reputation. So, let's explore seven of the most common SEO mistakes made by WordPress bloggers.

1. Not using an SEO-optimized blogging theme

If you are new to blogging, you might have missed the information that WordPress offers SEO-optimized themes for your blogs, which are highly helpful in the quest for online rankings. If you are not using an SEO-optimized blogging theme, you are obviously a step behind the others who are relying on them. There are a lot of SEO-optimized blogging themes for WordPress that you could choose from, such as Divi, MagPlus and Jevelin.

2. Missing an SEO-optimized contact form

Even if your WordPress blog is in its initial phase, it needs to provide a point of contact for its followers, even if they are fewer in number than expected. A contact form serves this purpose just right. Your contact form is a conversion driver, and optimizing it for the right SEO keywords will help your visitors easily find your blog, amplifying traffic.

3. Not buying a domain

Are you running your free blog on WordPress with the default blog address you were allotted? If the answer is 'yes', you might not be pleased with what we are about to tell you. A blog, or even a website, runs well only when it runs according to the needs of its target audience. A proper domain name provides an identity for your blog and sets a path for visitors' expectations. Not buying a domain can damage your blog's traffic and hurt its overall search engine ranking.

4. Not optimizing blog images

A great blog comes into being only when relevant content is paired with original, high-quality images. However, a lot of WordPress blog and website owners forget to optimize these images. It is very important to optimize the images you use in your WordPress blog: it helps your site load faster and even improves your Google PageSpeed score. To optimize your blog images, you can seek help from WordPress image optimization plugins such as Smush It, EWWW Image Optimizer, and TinyPNG.
These plugins will help you compress your images without affecting their resolution, and also take care of their SEO optimization.

5. Choosing the wrong keyword

Your blog's reach depends heavily on the keywords chosen for its search engine optimization. Keyword research may be an extensive undertaking, but it can do wonders for your blog's SEO if done in the right manner. You have to work on an SEO keyword strategy that uses keywords which define the subject of your content and are low in competition, yet are commonly used by visitors to find the information they are looking for. Finding keywords that fit the bill for all these requirements can be quite a task and might overwhelm some users. As demanding as they might be, they require your focus and attention if you are looking to rank your blog well.

6. Not focusing on loading speed

Your blog's loading time will greatly affect its traffic, as well as the abandonment that follows if your blog takes a long time to load for its visitors. A loading time above 2-3 seconds can lead to a lot of visitors abandoning your blog. If you are serious about your blog's loading speed, you should get a caching plugin such as W3 Total Cache, WP Fastest Cache or WP Super Cache. These plugins are easy to use, and they make your WordPress blog speedy as well. You should also not refrain from investing in a reliable web hosting service, because hosts tackle your blogging website's server-side issues and contribute their fair share to your blog's overall performance and speed.

7. Not focusing on content and readability

Probably the most important aspect of your blog is the content you publish on it. It needs to be of top-notch quality if you are looking to commit no SEO mistakes in and around it. Make sure of the following things about your blog's content:
Conclusion

A lot of experienced blog owners commit technical and onsite SEO errors and then look to SEO agencies and content marketers to take care of their blog's SEO. However, the most common mistakes can be easily avoided by creating a checklist of the must-haves. Analyze your WordPress blog today and see if you are committing any of the mistakes mentioned above. Hopefully, you'll be able to tackle them and remove them from your blog at the earliest opportunity. Once you have a solid SEO content strategy and a clear plan of action for your blog's SEO, you will be able to refine and improve the overall SEO performance of your WordPress blog.

from https://searchenginewatch.com/common-seo-mistakes-wordpress-bloggers-make

Google has decided to shut down Google+ after discovering a data breach. How should we react to the news?

Not many of us were surprised to hear that Google+ will stop existing in a few months. The only surprise came in the way the news was revealed, with Google announcing a data breach that led them to this decision. Google published a blog post last week mentioning that they had discovered a bug in the Google+ API that allowed third-party developers to access the data of 500,000 users without authorization. What's interesting is that they didn't disclose the breach back in March when they discovered it; they only brought it to the public after The Wall Street Journal covered it. The story became so big that Google knew they had to respond to it. They've provided more details about the bug in their recent blog post:
They use the word 'bug', making it clear that there is no evidence that the data was misused. There was also an effort to reassure users with the launch of more granular Google Account permissions through individual dialog boxes. Still, it seemed like the best time to shut down Google+, one of their least popular products of the last few years.

The end of Google+

When was the last time you used Google+? Not many of us can remember the last time we had a meaningful interaction on Google+ or used it as part of our social media (or search) ROI. Google's attempt to launch its own social network was ambitious, but the problem was that it never clicked with its audience. The stats speak for themselves, and they come from Google's latest blog post: "The consumer version of Google+ currently has low usage and engagement: 90 percent of Google+ user sessions are less than five seconds." Thus, users are only accessing Google+ by mistake, or they simply find no reason to stay engaged. On the other hand, there seemed to be a fit for enterprises using Google+, and they might even find new features to benefit from:
Hence, the end of the consumer version may not necessarily mean the end for its enterprise users.

When will the platform shut down for consumers, then? Google mentioned that there will be a 10-month period during which you can still access the social network until it shuts down. This means that we will all say our final goodbye to Google+ at the end of August 2019.

What does all this mean?

Dr Ben Marder, Senior Lecturer in Marketing at the University of Edinburgh Business School, was skeptical about Google's endeavor from the beginning, and the problem started with the network's positioning:
You may be indifferent about Google+ as a user, but you may have used it in the past as part of your marketing or SEO strategy. There used to be a time when Google+ was still relevant for professional reasons, and it even brought some sort of ROI for some businesses, especially in niche industries and communities.

What does this change mean for marketers?

Chances are that you haven't used Google+ for at least a couple of years. However, it's still interesting to explore how the social media landscape is evolving. Even Google's power wasn't enough to convince users to use its social platform. It's a lesson for all of us not to rely on one platform for our marketing strategy, whether it's Facebook, Instagram, or YouTube, as you can't predict what the future holds. It's always a good idea to look ahead to ensure that your strategy is adapting to changing consumer habits.

Moreover, another data breach, whether it's called a breach or a bug, is a matter of concern for users, who are losing their trust in big tech giants. This is important when creating your next marketing campaigns, to ensure that your brand and your messages respect your audience's needs. Trust will become key on social networks, and we cannot ignore it anymore, especially after a year of multiple data breach scandals.

And what does this change mean for SEO professionals?

Social media and SEO can still make great allies. Google+ used to help companies boost their SEO, with social activity acting as a useful signal. Although it has never been an official ranking signal, it still contributed to online authority. However, as Google+ started losing its audience, its decline didn't affect SEO, even though it was Google's own network. YouTube can still impact your search rankings and, of course, your popularity on other social channels can still affect your position in the SERPs. Still, social media is not the most important factor in your SEO strategy, and Google+ certainly won't be missed in 2018.
If you want to check your Google+ data, you can visit Google Takeout, where you can download your data from Google's services. Google will also provide more details soon on how you can both download and migrate all your Google+ data.

from https://searchenginewatch.com/2018/10/14/the-end-of-google-after-a-data-breach-and-how-it-affects-us/

Your website's link profile is one of your biggest areas of concern. Given the kind of penalties levied by Google for any misconduct related to your link profile, the concern is well justified. The truth is that building a link profile is a time-consuming activity that requires judicious decisions, especially when you are pondering the thought of buying backlinks for your own website. However, one report found that in a sample of 750,000 well-shared articles, over 50% had zero external links. This points to widespread, shallow SEO knowledge, where website owners don't care enough to build and earn links for their website. And sometimes the ones who are doing well don't necessarily care about ethics.

A lot of website owners struggle with their site's search engine optimization for several reasons. Since backlinks are a quick and efficient way of boosting a site's traffic and ranking, many of them resort to unethical ways of getting these backlinks, i.e. buying them or exchanging them for other digital favors. The question is, does it actually work? Can you really fool Google's algorithms? Will your website ever be penalized for it? If you have always had these questions in your mind, this blog post will help you explore it all and better understand whether or not you can keep Google from knowing that you are buying links to enhance your site's SEO.

A basic backdrop on backlinks

If you are new to website building or SEO, you might ask, 'What is a backlink?' Well, a backlink is an incoming hyperlink from one web page to another website.
Having these backlinks pointing at a website increases the credibility of the website, as well as the business related to it. Backlinks from quality sites that have high authority can be an added advantage for your website. There is a whole lot of theory behind the right backlink practices, and before you get there, you need to know how a paid link can affect your website.

Paid/bought links vs. earned links

An ideal link profile is one that features earned backlinks rather than paid ones. Earned backlinks happen when other websites and blogs find your website's content genuinely interesting and useful, and choose to link to it. However, some website owners have relied on buying and selling links in order to survive the tough SEO war. Paid link building is when a website pays a third-party domain for a followed backlink that points back to their domain. This is strictly forbidden by search engines and can result in harsh penalties. It can benefit your website for a very short period, but the benefit is never there to stay. Hence, your website should stay away from such unethical practices, because sooner or later it will be penalized for indulging in such misconduct.

A rightfully earned backlink contributes to making websites more resourceful and easier to use for the online audience, instead of poaching SEO ranks. Earned backlinks point to content that is likable, resourceful, and qualified. Only when your website offers relevant, high-quality content will other websites want to point to it. Yes, that is a lot of work, and that is why many website owners try taking a shorter path, i.e. paid links.

Will you really be able to hide from Google that you are buying links?

Some website owners and inexperienced SEO enthusiasts operate on the opinion that they are smarter than Google, or that Google "would never know" that they are buying links.
What they seem to underestimate is the fact that Google's reach, and its ability to mine and interpret data, far exceeds our comprehension. If you think that using a dedicated IP VPN such as PureVPN can help you hide the traces, that is debatable. When irresponsible website owners sport these paid links on their websites, they forget that they always create a pattern, no matter how hard they try to avoid creating one. Patterns such as excessively sharing links that belong to another domain or industry that has nothing to do with their own, or targeting websites that indulge in such excessive link building activities, help Google flag the culprits.

People who think they can outsmart the algorithms may even be right. However, they also underestimate Google's human review team, which can one day take a look at their website and end up penalizing it heavily for such paid link building activity. It is always either an algorithm report, a tip-off by a competitor, or a manual reviewer's action that is going to take down such websites. So, there's no escape.

Why paid links don't work at all

The answer is short, loud, and clear: paid links don't work in the long run because they are unnatural, irrelevant, and deceptive, enough to get your website penalized. By now, we hope that we have managed to throw enough light on how bad paid links are, and on why earned backlinks are the good guys. Now that you know, here are some bonus link-building tips:
Conclusion

Link building is a wide topic to reckon with; it's 2018 and the competition for rankings is fierce. Even if your website is struggling to stay ahead and make a mark, never fall into the unethical practice of paid links or other black-hat SEO techniques. They are always figured out by Google eventually, and that can do a lot of harm to your website's reputation. With the right backlinking practices, you will be able to win over the other marketers, because you will know how to implement advanced link building techniques in the most ethical way. The progress might be slow, but all your efforts will be worth the wait when your site's search engine ranking improves.

from https://searchenginewatch.com/hiding-backlinks-google-possible

I don't usually go for drastic headlines, but it does seem like some tides have been turning of late. We've all followed the stories of data breaches, new regulations, fake news, hacks, and ever-rising privacy concerns. Not to mention this week's discovery that webmaster Google had a breach exposing private data from as many as 500,000 people, as a result of which they'll be shutting down Google+ for consumers.

Facebook and Google faced scandals of no small sort within months of each other. GDPR passed, and subsequent regulations are hedging their way into the US market. But perhaps most interesting of all, on September 29 Tim Berners-Lee surfaced to announce the next "one small step" for the web. I may not speak for the masses, but when Berners-Lee pipes up about something, I tend to lend my ear. Besides being best known as the person who invented the World Wide Web (how about adding that to your LinkedIn), he's been quite on-point in following its evolution.

Curious footnote: the WWW started as a memo

As he tells the story himself from a TED stage, "I wrote a memo suggesting the global hypertext system. Nobody really did anything with it.
But 18 months later — this is how innovation happens — 18 months later, my boss said I could do it on the side, as a sort of a play project…So I basically roughed out what HTML should look like: hypertext protocol, HTTP; the idea of URLs, these names for things which started with HTTP. I wrote the code and put it out there." And now look at us, running whole businesses on that one widely explosive memo.

Anyway. Almost 20 years after the original invention, Tim Berners-Lee appeared on the TED stage to thank people for all their work contributing to the web so far, and to ask for support in pushing the web into its next phase.

From documents to data

Reflecting on the collaborative effort the web had been thus far, in 2009 Berners-Lee said, "I asked everybody, more or less, 'Could you put your documents on this web thing?' And you did. Thanks. It's been a blast, hasn't it?" He likened that first evolution to the next: from documents to data. In that 2009 talk he asked people, governments, universities, the UN – anyone with large, unused, non-private data sets – to open them up on the web. Through data, we saw the magic of Hans Rosling showing us global development over time. We've seen data used to help in hurricane relief, to save a primeval forest, and of course to create entirely new industries, products, customer experiences, and interactions.

From one-way data to read-write data

Happily, we've seen troves of open data. But most of it has been one-way – for instance, government data that can be viewed but not interacted with. Which brings us back to: hey Berners-Lee, what have you been up to the last eight years? Besides teaching computer science at both Oxford and MIT (again, casual), he's apparently been working on a little side project called Solid, "an open-source project to restore the power and agency of individuals on the web." Built using the existing web, Solid is a platform that offers two primary benefits: data empowerment and data interactivity.
It gives users the power to decide where data is stored and who can access which parts of it. It lets users link, share, and collaborate on data with whomever they want.

Next: from power with digital giants to power with consumers?

All of this, of course, brings us back to the original question: have we reached the tipping point? Some point to the concept of "walled gardens," where internet, media, advertising, search, and data power are concentrated in the hands of primarily four digital giants: Google, Amazon, Apple, and Facebook. Those four companies continue creeping into our lives and homes in ways never before dreamed of. But trust is waning. Earlier this year, Edelman found a "37-point aggregate drop in trust across all institutions" — a steeper decline than in any other market. In the words of Berners-Lee, "For all the good we've achieved, the web has evolved into an engine of inequity and division; swayed by powerful forces who use it for their own agendas. Today, I believe we've reached a critical tipping point, and that powerful change for the better is possible — and necessary."

What does an internet in the hands of consumers look like?

Well, who's to say? Right now it still looks rather swayed by those powerful forces who use it for their own agendas. A platform like Solid, though, would upend that. It's at odds with the current value exchange. Instead of demanding users hand over personal data to digital giants in order to essentially use the web, Solid seeks to take one small step toward restoring the balance of the web as it was actually intended to be. We would each have control over our own data. Just as we all "put our documents on this web thing" and "it was a blast," a platform like Solid seeks data empowerment and data interactivity. Two things many of us struggle to imagine. But then again, as Berners-Lee ended his post, "The future is still so much bigger than the past." PS: in case I haven't already made this clear, it's a pretty worthwhile read.
This post also appeared on ClickZ.

from https://searchenginewatch.com/google-data-breach-berners-lee-solid-power-shifting

Google makes changes to its ranking algorithm almost every day. Sometimes we know about them, sometimes we don't. Some of them remain unnoticed, others turn the SERPs upside down. So, this cheat sheet contains the most important algorithm updates of recent years, alongside battle-proven advice on how to optimize for each of them.

Panda

It all started changing in 2011, when Google introduced its first ever Panda algorithm update, the purpose of which was to improve the quality of search results by down-ranking low-quality content. This is how Panda marked the beginning of Google's war against grey-hat SEO. For five long years it was a separate part of the wider search algorithm, until 2016, when Panda became part of Google's core algorithm. As stated by Google, this was done because the search engine doesn't expect to make major changes to it anymore.

Main Focus
Best Practice

The very first thing to focus your attention on is internally duplicated content. I recommend carrying out site audits on a regular basis to make sure there are no duplication issues on your site. External duplication is yet another Panda trigger, so it's a good idea to check suspected pages with Copyscape. There are, however, some industries (like online stores with numerous product pages) that simply cannot have 100% unique content. If that's the case, try to publish more original content and make your product descriptions as distinctive as you can. Another good solution is letting your customers do the talking by utilizing testimonials, product reviews, comments, etc. The next thing to do is to look for pages with thin content and fill them with new, original, and helpful information. Auditing your site for keyword stuffing is also an obligatory activity to keep Panda off your site. So, go through the keywords in your titles, meta description tags, body copy, and H1s to make sure you're not overusing keywords in any of these page elements.

Penguin

The Penguin update, launched in 2012, was Google's second step towards fighting spam. In a nutshell, the main purpose of this algorithm was (and still is) to down-rank sites whose links it deems manipulative. Just like Panda, Penguin became part of Google's core algorithm in 2016. So, it now works in real time, constantly examining your backlink profile to determine whether there's any link spam.

Main Focus
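The best practice below relies on Google's Disavow tool. For reference, a disavow file is a plain UTF-8 text file uploaded through the Disavow links tool in Search Console: one rule per line, lines starting with # are comments, and a domain: prefix disavows every link from that domain. The domains below are invented placeholders:

```text
# Spammy directory that did not respond to removal requests
domain:spammy-directory.example

# Individual toxic pages linking to us
https://link-farm.example/page1.html
https://link-farm.example/widgets/page2.html
```

Prefer the domain: form when a whole site is toxic, since link farms tend to link from many pages at once.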
Best Practice

The very first thing to do here is to identify harmful links. It's worth saying that different tools use different methods and formulas to determine how harmful a given link is; SEO SpyGlass's Penalty Risk metric, for example, aims to mirror the way Penguin evaluates links. After you've spotted the spammers, try to get the spammy links in your profile removed by contacting the webmasters of the sites that link to you. But if you're dealing with tons of harmful links, or if you don't hear back from the webmasters, the only option left is to disavow the links using Google's Disavow tool. Another thing I highly recommend making a habit is monitoring link profile growth: any unusual spike in your link profile can be a red flag that someone is pointing spam at your site. Most probably you won't be penalized for one or two spammy links, but a sudden influx of toxic backlinks can get you in trouble. On the whole, it pays to check all newly acquired links.

EMD

The Exact Match Domain update was introduced by Google in 2012 and does exactly what its name suggests. The intent behind this update was to target exact match domains that were also poor-quality sites with thin content. This was done because, back in the day, SEOs could skyrocket in search results by buying domains with exact match keyword phrases and building sites with extremely thin content.

Main Focus
Best Practice

There's nothing wrong with using an exact match domain. The only condition for staying on the safe side is having quality content on your website. What's more, I wouldn't advise removing low-quality pages entirely; try to improve your existing ones with new, original content instead. It's also a good idea to run link profile audits on a regular basis to identify spammy inbound links with low trust signals and sort them out. After that, it's only right and logical to start building quality links, as they are still the major trust and authority signals.

Pirate

You probably still remember the days when sites with pirated content ranked high in the search results and piracy was all over the Internet. Of course, that had to be stopped, and Google reacted with the Pirate update, rolled out in 2012, which aimed to penalize websites with a large number of copyright violations. Please note that the update cannot remove your website from the index; it can only penalize it with lower rankings.

Main Focus
Best Practice

There's not much to advise here. The best thing you can do is publish original content and not distribute others' content without the copyright owner's permission. As you may know, the war with piracy is still not won. So, if you've noticed that your competitors use pirated content, it's only fair to help Google and submit an official request using the Removing Content From Google tool. After that, your request will be handled by Google's legal team, who can make manual adjustments to indexed content or sites.

Hummingbird/RankBrain

Starting from 2013, Google set a course for better understanding of search intent, introducing Hummingbird that same year and then RankBrain in 2015. These two updates complement one another quite well, as they both serve to interpret the search intent behind a given query. However, Hummingbird and RankBrain do differ a bit. Hummingbird is a major algorithm update that deals with understanding search queries (especially long, conversational phrases rather than individual keywords) and providing search results that match search intent. RankBrain, meanwhile, is a machine learning system that works as an addition to Hummingbird. Based on historical data on previous user behavior, it helps Google process and answer unfamiliar, unique, and original queries.

Main Focus
Best Practice

It's a good idea to expand your keyword research, paying special attention to related searches and synonyms, in order to diversify your content. Like it or not, the days when you could rely solely on short-tail terms from Google AdWords are gone. What's more, with search engines' growing ability to process natural language, unnatural phrasing, especially in titles and meta descriptions, can become a problem. You can also optimize your content for relevance and comprehensiveness with the help of competitive analysis. There are a lot of tools out there that provide TF-IDF analysis, which can help a lot with discovering relevant terms and concepts used by a large number of your top-ranking competitors. Besides all the factors mentioned above, don't forget that it's crucial to work on improving user experience. It's a win-win activity, as you'll provide your users with a better experience and won't be down-ranked in the SERPs. Keep an eye on your pages' user experience metrics in Google Analytics, especially Bounce Rate and Session Duration.

Pigeon/Possum

Both Pigeon and Possum target local SEO and were made to improve the quality of local search results. The Pigeon update, rolled out in 2014, was designed to tie Google's local search algorithm closer to the main one, and location and distance started to be taken into consideration when ranking search results. This gave a significant ranking boost to local directory sites and created a much closer connection between Google Web search and Google Maps search. Two years later, when the Possum update launched, Google started to return more varied search results depending on the physical location of the searcher. Basically, the closer you are to a business's address, the more likely you are to see it among the local results. Even a tiny difference in the phrasing of a query now produces different results.
It's worth mentioning that Possum also gave a boost to businesses located outside the physical city limits.

Main Focus
Best Practice

Knowing that factors applicable to traditional SEO have become more important for local SEO, local business owners now need to focus their efforts on on-page optimization. To be included in Google's local index, make sure to create a Google My Business page for your local business. What's more, keep an eye on your NAP (name, address, phone number), as it needs to be consistent across all your local listings. Getting featured in relevant local directories is of the greatest importance as well: the Pigeon update resulted in a significant boost for local directories, so while it's always hard to rank in the top results, it's going to be much easier for you to get included in the business directories that will likely rank high. Now that the location from which you're checking your rankings heavily influences the results you receive, it's a good idea to carry out geo-specific rank tracking. You just need to set up a custom location to check positions from.
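One practical way to keep your NAP consistent and machine-readable is LocalBusiness structured data. This is a minimal sketch using schema.org's JSON-LD format; the business details are invented placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 High Street",
    "addressLocality": "Manchester",
    "postalCode": "M1 1AA",
    "addressCountry": "GB"
  },
  "telephone": "+44-161-555-0123",
  "url": "https://www.example.com"
}
</script>
```

The same name, address, and phone number should then appear, character for character, in your Google My Business listing and every local directory entry.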
Fred

Google Fred is the unofficial name of another Google update, one which down-ranked websites with overly aggressive monetization. The algorithm hunts for excessive ads, low-value content, and websites that offer very little user benefit. Websites that have no purpose other than driving revenue, rather than providing helpful information, are penalized the hardest.

Main Focus
Best Practice

It's totally fine to put ads on your website; just consider scaling back their quantity and rethinking their placement if they prevent users from reading your content. It would also be wise to go through Google's Search Quality Rater Guidelines to self-evaluate your website. As usual, go on a hunt for pages with thin content and fix them. And of course, continue working towards improving user experience.

Mobile Friendly Update/Mobile-first indexing

Google's Mobile Friendly Update (2015), also known as Mobilegeddon, was designed to ensure that pages optimized for mobile devices rank higher in mobile search, and to down-rank pages that are not mobile friendly. However, it soon became not enough simply to up- or down-rank sites according to their mobile friendliness. So, this year Google introduced mobile-first indexing, under which it crawls and indexes pages with the smartphone agent first. Websites that only have desktop versions still get indexed as well.

Main Focus
Best Practice

If you're curious whether or not your site has been migrated to mobile-first indexing, check your Search Console: if you didn't receive a notification, your website is not included in the mobile-first index yet. What's more, make sure that your robots.txt file doesn't restrict Googlebot from crawling your pages. If you still haven't adapted your website for mobile devices, the time to join the race is now. There are a few mobile website configurations to pick from, but Google's recommendation is responsive design. If your website is already adapted for mobile devices, run the mobile friendly test to see if it meets Google's criteria. If you use dynamic serving or separate URLs, make sure that your mobile site contains the same content as your desktop site, and that structured data as well as metadata are present on both versions. For more detailed information, consider the other recommendations on mobile-first indexing from SMX Munich 2018.

Page Speed Update

And now on to the Page Speed Update, which rolled out in July of this year and finally made page speed a ranking factor for mobile devices. According to this update, faster websites are supposed to rank higher in search results. In light of this, our team conducted an experiment to track the correlation between page speed and pages' positions in mobile SERPs after the update. It turned out that a page's Optimization Score had a strong correlation with its position in Google search results. More importantly, slow sites with a high Optimization Score were not hit by the update. That brings us to the conclusion that Optimization is exactly what needs to be improved and worked on in the first place.

Main Focus
Best Practice

There are now 9 factors, officially stated by Google, that influence Optimization Score. So, after you've analyzed your mobile website's speed and spotted its weak places (hopefully none), consider these 9 rules for Optimization Score improvement.
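As a reference for the responsive design recommendation above: a responsive page starts with a viewport meta tag in the <head>, plus CSS that adapts to the screen width. This is a minimal, generic sketch rather than Google's full checklist; the class names are placeholders:

```html
<head>
  <!-- Tell mobile browsers to use the device width instead of a zoomed-out desktop layout -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .products { display: flex; flex-wrap: wrap; }
    .product  { flex: 1 1 300px; } /* cards wrap into a single column on narrow screens */
  </style>
</head>
```

Because a responsive site serves the same HTML at the same URL to every device, the content Googlebot's smartphone agent sees under mobile-first indexing is identical to the desktop version.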
from https://searchenginewatch.com/cheat-sheet-google-algorithm-updates-2011-2018

International brands have their work cut out for them. Building a consistent brand experience across multiple continents, for audiences that speak different languages, is no easy task, and the process of translating individual pages from one language to another is time-consuming and resource-intensive. Unfortunately, much of this work can go to waste if the right steps aren't taken to help search engines understand how your site has been internationalized. To help you prevent this, we've collected a list of "Do's and Don'ts" to guide your internationalization efforts and ensure that your pages get properly indexed by search engines.

Do conduct language-specific keyword research

The direct translation of a keyword will not necessarily be what users are searching for in that language. Rather than simply taking the translation at face value, you will have more success if you take a look at your options in the Google Keyword Planner to see if there are other phrasings or synonyms that are a better fit. Remember to update your location and language settings within the planner, listed just above the "keyword ideas" field.

Don't index automatic translation

Automatic translation can be better than nothing as far as user experience goes in some circumstances, but users should be warned that the translation may not be reliable, and pages that have been automatically translated should be blocked from search engines in robots.txt. Automatic translations will typically look like spam to algorithms like Panda and could hurt the overall authority of your site.

Do use different URLs for different languages

In order to ensure that Google indexes alternate language versions of each page, you need to ensure that these pages are located at different URLs. Avoid using browser settings and cookies to change the content listed at the URL to a different language.
Doing so creates confusion about what content is located at that URL. Since Google's crawlers are typically located in the United States, they will typically only be able to access the US version of the content, meaning that the alternate language content will not get indexed. Again, Google needs a specific web address to identify a specific piece of content. While different language versions of a page may convey the same information, they do so for different audiences, meaning they serve different purposes, and Google needs to see them as separate entities in order to properly connect each audience to the proper page. We highly recommend using a pre-built platform, such as Shopify Plus for e-commerce or Polylang for WordPress, to ensure that your method for generating international URLs is consistent and systematic.

Don't canonicalize from one language to another

The canonical tag is meant to tell search engines that two or more different URLs represent the same page. This doesn't always mean the content is identical, since it could represent page alternates where the content has been sorted differently, where the thematic visuals are different, and other minor changes. Alternate language versions of a page, however, are not the same page. A user searching for the Dutch version of a page would be very disappointed to land on the English version. For this reason, you should never canonicalize one language alternate to another, even though the content on each page conveys the same information.

Do use "hreflang" for internationalization

You may be wondering how to tell search engines that two pages represent alternate language versions of the same content if you can't use canonicalization to do so. This is what "hreflang" is for: it explicitly tells search engines that two or more pages are alternates of one another. There are three ways to implement "hreflang": with HTML tags, with HTTP headers, and in your sitemap.

1. HTML Tags

Implementing "hreflang" with HTML tags is done in the <head> section, with code similar to this:

  <head>
    <title>Title tag of the page</title>
    <link rel="alternate" hreflang="en" href="https://example.com/page1/english-url" />
    <link rel="alternate" hreflang="es" href="https://example.com/page1/spanish-url" />
    <link rel="alternate" hreflang="it" href="https://example.com/page1/italian-url" />
  </head>

Here hreflang="en" tells search engines that the associated URL https://example.com/page1/english-url is the English alternate version of the page. URLs must be complete, including http or https and the domain name, not just the path. The two-letter string "en" is an ISO 639-1 code, which you can find a list of here. You can also set hreflang="x-default" for a page where the language is unspecified. Each alternate should list all of the other alternates, including itself, and the set of links should be the same on every page. Any two pages that don't both use hreflang to reference each other will not be considered alternates. This is because it's okay for alternates to be located on different domains, and sites you do not have ownership of shouldn't be able to claim themselves as an alternate of one of your pages. In addition to a language code, you can add an ISO 3166-1 alpha-2 country code. For example, for the UK English version of a page, you would use "en-GB" in place of "en". Google does advise having at least one version of the page without a country code, and you can apply multiple country codes and a country-agnostic hreflang to the same URL.

2. HTTP header

As an alternative to HTML implementation, your server can send an HTTP Link header. The syntax looks like this:

  Link: <https://example.com/page1/english-url>; rel="alternate"; hreflang="en",
        <https://example.com/page1/spanish-url>; rel="alternate"; hreflang="es",
        <https://example.com/page1/italian-url>; rel="alternate"; hreflang="it"

The rules regarding how to use them are otherwise the same.

3. Sitemap

Finally, you can use your XML sitemap to set alternates for each URL. The syntax for that is as follows:

  <url>
    <loc>https://example.com/page1/english-url</loc>
    <xhtml:link rel="alternate" hreflang="es" href="https://example.com/page1/spanish-url"/>
    <xhtml:link rel="alternate" hreflang="it" href="https://example.com/page1/italian-url"/>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/page1/english-url"/>
  </url>

Note that the English version of the page is listed both within the <loc> tag and as an alternate. Keep in mind that this example is not complete: you will also need separate <url> sections for the Spanish and Italian pages, each of them listing all of the other alternates as well.

Don't rely on the "lang" attribute or URL

Google explicitly does not use the lang attribute, the URL, or anything else in the code to determine the language of the page. The language is determined only by the language of the content itself. Needless to say, this means that your page content should be in the correct language. But it also means:
Do allow users to switch languages

For any international business, it's a good idea to allow users to switch languages, usually from the main navigation. Amazon, for example, allows users to switch languages from the top right corner of the site. Do not force the user to a specific language version of the page based on their location. Automatic redirection prevents both users and search engines from accessing the version of the site that they need to access. Google's bots will never be able to crawl alternate language versions of a page if they are always redirected to the US version of the site based on their location. Turning to Amazon for our example once again, we are not prevented from accessing amazon.co.jp, but we do have the option of switching to English.

Don't create duplicate content across multiple languages

While you should not canonicalize alternate language versions of one page to another, if you use alternate URLs for pages meant for different locations but the language and content are identical, you should use the canonical tag. For example, if the American and British versions of a page are identical, one should consistently canonicalize to the other. Use hreflang as discussed above to list them as alternates with the same language but for different locations.

Conclusion

Use these guidelines to make sure users from all of your target audiences will be able to find your pages in the search results, no matter where they are located or what language they speak.

from https://searchenginewatch.com/onsite-seo-international-brands-dos-donts

Last month I wrote about China's search market, how it is dominated by Baidu, and how that dominance is threatened by mobile-only disruptors such as Shenma.
While Shenma continues to build on its growth since launching back in 2014, there have been news reports in the past few months suggesting Google may also be set to re-enter the market after having its search property (and others) blocked by the Chinese state back in 2010. I want to use this post today to try and separate the facts from the speculation in regard to these recent reports. What can be corroborated? What is rumor? And what can we reasonably expect from Google in China over the short and long term?

'Project Dragonfly' officially exists

The most recent official statement from Google on this came on 26th September at a Senate hearing attended by the search engine's chief privacy officer, Keith Enright, and detailed at the South China Morning Post. "There is a Project Dragonfly," Enright said, but added that he was "not clear on the contours of what is in scope or out of the scope for that project". Enright also pointed out, "we're not close to launching a search product in China, and whether we eventually could, or would, remains unclear." He added that if Dragonfly were anywhere beyond the early phases of exploration and development, his team would be in the process of reviewing the product to ensure it adhered to Google's privacy values. So while we can be certain Google is working on a search 'project' for the Chinese market, the official word is that it is still very early days.

Google is establishing new partnerships in the region

As I highlighted in my last post about the Chinese market, Google has spent 2018 negotiating with key digital companies based in the country, including a patent cross-licensing agreement with Tencent and a partnership with e-commerce company JD.com. In July the company launched an AI-powered mini-game on WeChat (Tencent's IM and social media client). It has also invested in an AI centre in Beijing, as well as in a number of other domestic companies.
Dragonfly, too, is rumored to be a partnership between Google and another company

These partnerships between Google and companies in China are significant. The Intercept, a news site which specialises in privacy, security and politics, has been the primary reference on which other news outlets have based their stories about Dragonfly. Last month it reported that a leaked internal memo contained certain details about the project. No official statements have been made by Google as to what is or isn't accurate in The Intercept's reports about this memo. But as we will see, there are a number of elements which do seem trustworthy. One key detail is that the development of Dragonfly is a joint venture between Google and another, as yet unknown, company based in mainland China. This would be in keeping with Google's deal-making activities to date, and I wouldn't be surprised if this partner turned out to be one of the companies we have already mentioned.

Users will (according to The Intercept's memo) need to sign in to use the service

This is where Google's partner in the project is significant. Whoever they are, they will potentially have access to this sign-in data. It is also speculated that phone numbers, IP addresses, and even users' movements will be linked to searches. In the context of China as a mobile search market, personal and locational data will no doubt help Google compete with the for-mobile service Shenma, whose m-commerce and local search convenience are its key features. On the flipside, China being a proponent of 'cyber sovereignty' understandably makes a lot of people uneasy about such data potentially being accessed by the government. It is also possible that Google's partner in Dragonfly will have a hand in being able to edit and amend search results.
Dragonfly's SERPs would need to adhere to China's strict censorship laws, so if it is a partnership project this might make it easier for Google to run the project within the parameters set by the Chinese government. This might also make things easier in the eyes of users of the service: perhaps if the tool is presented not as a Google product at all, but rather with the subtle byline that it is 'powered by Google', the company can relieve itself of some of the responsibility on the censorship/privacy side of the service.

A number of Google employees are resigning over Dragonfly (one of whom is publicly verifiable)

The Intercept has also published reports that around seven Google employees are leaving the company over Dragonfly. One of these, Jack Poulson, has been named publicly, and his resignation letter is accessible here. Poulson himself was made aware of Dragonfly from reports in the press. His and his colleagues' subsequent actions certainly give credence to some of the speculation surrounding the data Google will gather via the search engine and whether it could end up in the hands of the Chinese government. "Project Dragonfly has, at the very least, involved us 'designing' technologies that violate the latter two of our four primary constraints," Poulson writes. These constraints (Google's own AI Ethics Principles, as set out in June 2018) state the areas in which the company will not "design or deploy" AI.

There is appetite among Chinese consumers for Google to return to the market

While there is mounting pressure, even from within Google, to drop the Dragonfly project altogether, there is a massive yearning among Chinese consumers for Google to have a search presence once more. As reported at The Drum, more than 72% of users of Weibo (one of the country's leading microblogging sites) would choose Google over Baidu et al. if it were to launch a new service.
Whether users would still be as enthusiastic for Dragonfly knowing that their personal data might be linked to searches and accessible to the government remains to be seen. But it is not surprising that the company is exploring any possible way it might be able to have a search presence in the market once more.

Chrome and other Google tools already have a significant presence to leverage

Another fact likely to push Google to explore every possible route back into the Chinese search market is that it will be able to leverage some of its other key products to help Dragonfly establish a significant footing. As evidenced by StatCounter data, Google's Chrome is the leading browser in the market, and Android is the leading mobile OS. In 2018 the company also launched the mobile cloud storage service Files Go. These successes, along with the company's new partnerships, ensure that the business is in a strong position to compete with Baidu and Shenma should it re-enter the search vertical.

Google is in a curious position where Dragonfly is concerned

The criticism levelled at the company from inside and outside its walls seems justified. The possibility that the data of its users may be subject to surveillance by the Chinese government, and that the tool itself would need to be censored well beyond what is in keeping with the company's founding philosophy, is a logical concern. After all, the service clearly aims to make use of personal data, and will likely need to, in order to offer a mobile search and shopping experience that can match Shenma and Baidu. Additionally, the Chinese government is unlikely to keep Dragonfly whitelisted if it doesn't operate within its policy of "cyber sovereignty". On the flipside, consumers in China are clearly quite desperate to have access to Google's world-leading search algorithm once more.
The Weibo survey, coupled with the popularity of Chrome, Android and other Google products, suggests that even if users knew the state was meddling with Dragonfly's SERPs, many would still be keen to use the product. And they, of course, deserve the right to choose whether or not to use it. Whether Google does or doesn't re-enter the Chinese search market, the company needs to weigh the gains of building the brand in the East against potentially damaging it in the West.

from https://searchenginewatch.com/2018/dragonfly-what-we-know-googles-plans-reenter-china

When it comes to optimizing website content, there's typically a lot of talk about the importance of choosing the right keywords and drafting compelling headlines, but the impact of meta descriptions on a website's on-page SEO too often goes overlooked. While you may think you have ticked all the boxes to boost the visibility of your ecommerce site, the absence of an accurate meta description might just be costing you traffic and sales. So, what are meta descriptions? They are short, unique snippets that describe a webpage. Think of what you would write if you had to advertise the webpage; that is exactly what a meta description needs to include. Many people tend to leave meta descriptions blank without realizing the effect this has on search results. A powerful meta description leads to a rise in click-through rate, which boosts the SEO ranking of your page. Simply put, meta descriptions form the first impression of your website, so you had better make it a good one. Here are 6 ways you can optimize meta descriptions to ensure clicks.

Answer questions

Whether it's seeking the nearest plumbing services or discovering the best hotels in the Maldives, everyone on Google has come looking for answers to a problem. Put yourself in a customer's shoes and think about what they could possibly ask to which your business can pose a solution.
Your meta description needs to answer their question and impart value to entice them to click on your website. While the meta description length has been extended to up to 300 characters in order to make them more "descriptive", it is always safest to stick to around 160 characters so that the description does not appear abrupt and incomplete, which can be rather frustrating for readers.

Evoke emotion

Let's face it – emotion sells. Whether it is arousing urgency, anger, joy, trust or curiosity, any piece of content that evokes emotion is likely to be more effective in persuading readers to act. The same applies when drafting meta descriptions. You need to identify the emotional benefits a customer will gain by considering your brand and leverage them to drive traffic. Use words such as 'attractive', 'enormous', 'powerful' and 'unparalleled' to strike a chord with your readers in just those two to three lines.

Use calls to action

Calls to action (CTAs) are powerful words to incorporate in the meta description because they communicate a clear purpose and urge readers to take a step forward. However, this is not the place for obvious calls to action such as 'read more' or 'shop now'. Instead, use words like 'save more', or state an offer or a tangible benefit such as free delivery or a free 30-day trial, to make better use of this space.

Incorporate keywords

Choosing relevant keywords is one of the most critical steps in SEO optimization. That said, stuffing your content with these keywords won't fetch you results. What's important is strategically placing them in sections that can make a difference, such as the meta description. When Google displays search results, the search words appear in bold, so it helps to have them in your website's meta description tag – this indicates relevancy and catches the reader's attention too. You are likely to have many keyword suggestions from research.
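The length and keyword advice above can be sketched as a quick programmatic check. This is a minimal illustration only: the helper name, the 160-character threshold and the sample description are invented for this sketch, not taken from the article.

```python
def check_meta_description(description, keyword, max_length=160):
    """Flag two common meta description issues discussed above:
    exceeding the soft character limit, and a missing focus keyword."""
    issues = []
    if len(description) > max_length:
        issues.append(
            f"too long: {len(description)} chars (aim for <= {max_length})"
        )
    if keyword.lower() not in description.lower():
        issues.append(f"focus keyword '{keyword}' not found")
    return issues


# Example: within the limit, contains the keyword, states a tangible benefit
desc = ("Free 30-day trial: unparalleled plumbing services in Calgary, "
        "booked online in minutes.")
print(check_meta_description(desc, "plumbing services"))  # → []
```

A check like this is easy to run over a whole sitemap's worth of descriptions before publishing, rather than eyeballing each one.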
In such cases, prioritize the keyword that has maximum impact for the particular webpage and use it in the meta description, instead of trying to fit them all into 160 characters.

Avoid duplication

Each page of your website is unique, so why should they all have the same description? If more than one page of your website shares the same description, those pages are competing with each other, because it means they are talking about the same thing. This results in Google pushing your website lower down the rankings. So, don't get lazy here and duplicate meta description content – that just sends the wrong signals to Google, which may deem the pages spammy or repeated content, and that can hurt your search results.

Use rich snippets

Have you noticed that some meta descriptions contain links, reviews, ratings and video images, among other things? Those are referred to as rich snippets. Contrary to normal snippets, rich snippets include structured data to give more detailed information to the search engine, helping people make a better decision before clicking through to websites. Rich snippets give users quicker access to information through their visually appealing formats, images and relevant information, which ultimately enhances click-through rates. So, whether it is adding a contact number, product reviews and ratings or direct links – consider incorporating rich snippets alongside your meta description for it to stand out and attract clicks.

Hence, even though meta descriptions don't directly affect page rankings, it is recommended to optimize them to drive clicks and generate traffic. If you are unsure how your meta description will appear, you can use this free tool to do a quick test before going live.

Adela Belin is the Head of Digital Marketing at Writers Per Hour. She creates content surrounding marketing with a focus on social media and digital marketing.

from https://searchenginewatch.com/2018/10/05/how-to-drive-clicks-effective-meta-descriptions