Ensure You Can Satisfy a Visitor’s Searching Intention
One of the major factors Google uses to determine how highly you’ll rank is whether or not your web page satisfies a visitor’s search query. Imagine someone searching for a “private math tutor in Manchester”. A website promising exactly that appears at rank #1. The googler clicks through and finds a skeleton page stuffed with keywords for “private math tutor in Manchester” yet without a single tutor available. Annoyed, the searcher hits the back button and clicks on another search result, this time the one at rank #2. This second website has a great selection of available tutors, so the visitor engages one of them and ends their search.
The next time this person carries out a search on Google, it will be for something unrelated (e.g., “how to build a giant catapult”). But all this while, Google has been listening with its ear to the ground. It knows that the searcher didn’t find what they were looking for on the first website—otherwise, why would that person have hit back and clicked another result for the same query? Google interprets this as dissatisfaction and responds by adjusting the two websites’ future rankings: the one that satisfied the searcher’s need gets nudged upwards while the lamer alternative gets pushed downwards.
As a consequence of all this, strategic webmasters should hide unsatisfactory pages or actively dissuade web surfers from visiting them. For example, the page for “private math tutor in Manchester” should never have been created and indexed in the first place.
Research What Currently Ranks
There’s nothing quite like imitation as a starting point in writing a web page intended to rank highly. Before writing the content for a given page, you are advised to study what currently ranks for that keyword. What format do these pages have—are they lists or essays or white papers? In what tone are they written? Who is the intended readership? What topics do they address?
Don’t limit your research to Google Search either—check out what gets upvoted on social bookmarking websites, shared on social media websites1, asked in question forums2, or written about in books on Amazon.
Place Your Keyword in the URLs
There are a bunch of on-page SEO signals that you can influence to help you rank. One of the most prominent of these is the page’s URL, and it is imperative that your URL contains your keyword.
Google’s algorithms aside, there is also a human basis for this practice: Potential visitors encounter links to your web page while browsing other websites. They quickly speed-read the URL and make a snap judgement about whether it justifies opening up a new tab. If you include your keywords in your URL, you send a message-in-a-bottle to these potential visitors, a subtle communication that you might just have what they are looking for. The end effect is that you are more likely to pique their interest and cause them to click the URL leading to your website.
(Doesn’t anchor text stop people from reading URLs? When available, yes. But anchor text is very often omitted (e.g., when someone copies and pastes a link into Twitter). In these cases, the raw URL is shown to the potential visitor instead of any anchor text.)
There is a further reason to place keywords in your URLs. Google applies a special optimisation bonus within Google Search whenever your URL matches all or part of the searcher’s query. They do this by bolding matching words in your URL, and this added salience increases the chances that potential visitors will notice your search entry amongst all the others, increasing the click-through rate from Google Search to your website.
Try to avoid using machine gobbledygook in your URLs. Imagine that the URL for the Easements Notes product pictured above was “…/search-db/GF38838383/easements#load-delay”. Now, this web page runs the risk of matching totally unrelated search queries, such as “search-db” or “load-delay”. Moreover, all this ugliness in the URL wastes a good opportunity to put in human language that is more amenable to ranking, such as the student level of the notes (“GDL”, a postgraduate legal degree) or the institution that the notes were created at (e.g., “University of Laws”).
I’d like to give some advice to those in control of websites where at least some of the URLs are generated by administrators or regular users: In these circumstances, there is the danger of amateurishly chosen URL names, URL deletions, and keyword cannibalisation.
In response to amateurishly chosen URL names, you might want to have a trained administrator touch up user-generated URLs before they go live. Alternatively, you might build a software module that automatically inserts important keywords into URLs, ensuring that they stay relevant.
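Such a module need not be complicated. Here is a minimal sketch in Python of the idea (the function name and example keywords are my own inventions, not part of any real system): it normalises a user-supplied title into a clean URL slug and appends any important keywords the user omitted.

```python
import re

def seo_slug(raw_title: str, keywords: list[str]) -> str:
    """Turn a user-supplied title into a clean, keyword-bearing URL slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", raw_title.lower()).strip("-")
    # Append any important keywords that the user's title omitted.
    for kw in keywords:
        kw_slug = re.sub(r"[^a-z0-9]+", "-", kw.lower()).strip("-")
        if kw_slug not in slug:
            slug = f"{slug}-{kw_slug}"
    return slug

print(seo_slug("GF38838383 easements", ["GDL", "easements"]))
# → "gf38838383-easements-gdl"
```

In a real system you would also strip database IDs like “GF38838383” out entirely; the sketch only shows the keyword-insertion half of the job.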
Regarding deletions, you might consider some safeguards that prevent a user from deleting or renaming a URL that once drew fantastic traffic. I personally have a system that keeps a history of old URL names and redirects requests for no-longer-existent URLs back to active URLs.
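The idea behind such a system can be sketched in a few lines of Python (the slugs below are invented for illustration): keep a table mapping every retired URL to the page’s current address, and answer requests for dead URLs with a 301 permanent redirect rather than a 404.

```python
# A minimal sketch of a URL-history table: whenever a page is renamed,
# the old slug is recorded so that requests for it can be redirected
# to the page's current address instead of dying as a 404.
url_history = {
    "maths-tutor-manchester": "private-math-tutor-manchester",
    "tutors-manchester-old": "private-math-tutor-manchester",
}

def resolve(slug: str, live_slugs: set[str]) -> tuple[int, str]:
    """Return an HTTP status and target slug for an incoming request."""
    if slug in live_slugs:
        return 200, slug               # page exists: serve it
    if slug in url_history:
        return 301, url_history[slug]  # renamed: permanent redirect
    return 404, slug                   # genuinely unknown

print(resolve("maths-tutor-manchester", {"private-math-tutor-manchester"}))
# → (301, 'private-math-tutor-manchester')
```

The 301 status matters: it tells Google to transfer the old URL’s accumulated ranking signals to the new address.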
With respect to keyword cannibalisation, the most basic (albeit underwhelming) precaution is a uniqueness check on URL names. A more comprehensive approach would involve the editorial touch of a human or a sophisticated NLP algorithm.
Place Your Keywords in the Title Tag
Google apportions heavy ranking weight to the title tag in your HTML, so one of the most basic points of SEO is to include your keyword(s) here.
As well as being used internally in Google’s ranking algorithm, your title tag also appears as the title of your web page’s search result snippet. For this reason, you should ensure that it communicates relevance to the human reader and entices them to click.
Due to the space constraints of a Google search result, your title tag is limited in length to approximately 55 characters. (An exact number cannot be given because Google varies the cut-off length depending on the device used to search.)
The title tag is typically used as the default text when someone shares your website on social media websites like Facebook or Twitter. By optimising it, you gain better control over how your page appears when shared virally.
The title tag also appears as the name of the tab/window in your browser. The marketing value of this little fact becomes apparent when we consider the visitor who gets distracted before buying and opens up Facebook in another tab, leaving your website aside for the time being. As they browse elsewhere, they will continue to see your title text out of the corner of their eye. This serves as a reminder of your existence, or as a subtle to-do.

There is perhaps a more subtle advantage to your peripherally present title. Psychology speaks of the “mere exposure” effect, whereby people develop preferences for things simply because they are familiar with them. This theory would predict that you can warm someone’s feelings towards your brand by presenting it to them in their peripheral vision. You might think that this sounds like psychobabble, but Bornstein (1989) demonstrated in a meta-analysis of 208 experiments that not only is the mere-exposure effect robust and reliable, but its effect is strongest when unfamiliar stimuli are presented briefly.
If you want to do something to actively grab attention after being relegated to a tab, consider updating your title tag every so often (e.g., Facebook adds a (1) to the title whenever someone receives a notification). This change in tab title grabs people’s attention, even from their peripheral vision, because the human eye is much more attuned to movement compared to stasis.
Google apparently weights the words at the start of your title more heavily than those at the end, so prioritise your keywords accordingly.3 This means that websites which wish to include their brand name should slap it at the end of the title instead of at the start.
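Putting these title-tag guidelines together, a page for the Manchester tutoring example might be headed like this (the brand name “Acme Tutors” is a placeholder invented for the illustration):

```html
<head>
  <!-- Keyword first, brand last, and within the ~55-character budget -->
  <title>Private Math Tutor in Manchester | Acme Tutors</title>
</head>
```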
Include Your Keyword Early in the HTML
There is an old SEO practice known as “keyword stuffing” which consists of cramming your keyword as many times as humanly possible into a web page’s body. This practice is now penalised by search engines, so it’s no longer a good idea to write “cheap hats” 10 times within the page. That said, Google’s algorithm does expect moderate mentioning of the keywords within the page’s body—after all, if they penalised a web page about Germany for using the word “Germany”, then the internet would become a giant cryptic crossword.
The latest search engine algorithms get around this problem by comparing a page’s keyword density with that which one would expect in such a web page, meaning that artificial and unnatural keyword stuffing will stand out and be penalised. Playing it safe, then, your best bet is to write a web page that earnestly attempts to inform, clarify, or sell in normal language.
When doing this, keep in mind that Google understands synonyms, meaning that you don’t need to repeat the same keyword too often. Google states that “cat breeder” will match for “kitten breeder”; “homicide” for “murder”; and “song words” for “lyrics”.4 Thus, feel free to use synonyms of your keyword throughout your page.
Remember too that the Googlebot approximates the importance of a term by how early within an HTML document that term occurs. On top of this, the Googlebot might only scrape the first X bytes of your page—so if the header or paragraph tags containing your keywords appear deep within your web page, they risk being underweighted or even missed.
Carefully Craft a Unique Meta Description
The meta-description tag suggests to Google what it ought to display in your Google Search preview snippet.
The meta-description does not bind Google to the text you set; sometimes they will bypass your recommendation and create their own meta-description by picking out content from the body of your text that they believe to be a good fit with the search query.
I consider the meta-description tag a sort of online advertisement. Google searchers see your search snippet alongside those of nine competitors on the page, and it’s the job of your meta-description to convince the searcher to visit your website instead of the others. That is the meta-description tag’s sole purpose in life; it has no influence on how high you rank, only on how effectively your snippet converts at its current rank.
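As a concrete illustration (the copy and site name are invented for the example), a meta-description written as a miniature advertisement might look like this:

```html
<head>
  <title>Private Math Tutor in Manchester | Acme Tutors</title>
  <meta name="description"
        content="Compare vetted private math tutors in Manchester, read
                 student reviews, and book your first lesson online.">
</head>
```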
Ensure Your Website Makes Sense to Googlebot
Google’s dealings with alternative content formats are hit-and-miss. PDFs are crawlable, except when they’re not; for this reason, it’s probably a good idea to convert them to HTML so as to enhance crawlability. Google cannot index video or audio content, so businesses that primarily produce these formats are advised to transcribe their work. Railscasts, a website that sells video tutorials about how to program, adds HTML transcriptions next to their videos. Mixergy, a website containing interviews with successful entrepreneurs, does the same, transforming their audio into text. (Sidenote: You need not do the transcription yourself, because there are services out there that will do it for you.)
If you don’t get around to converting content to HTML, the next best option is to optimise the text elements of these non-text formats. In the case of PDFs, this entails putting your keywords into the file’s metadata, author field, description field, etc.7
There’s a fairly common web design practice that’s a big no-no for SEO: the display of some website text exclusively through images, without a textual backup. Although the text images might look prettier—indeed, that’s usually why they are used in the first place over regular HTML—they have the downside of Google not being able to reliably extract and index the text within. Instead, you should display your text in HTML—and if you must use an image, be sure to attach an alt attribute or nearby transcription. (More on alt attributes in a later chapter.)
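To make the point concrete (the file name and wording here are invented), compare an image-only headline with one that gives Google a textual fallback:

```html
<!-- Risky: the headline text exists only as pixels inside the image -->
<img src="/img/headline.png">

<!-- Better: the alt attribute gives Google indexable text to fall back on -->
<img src="/img/headline.png"
     alt="Private math tuition in Manchester">
```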
Include Rich Snippets and Structured Data
Google defines snippets as “the few lines of text that appear under every search result…designed to give users a sense for what’s on the page and why it’s relevant to their query.” Rich snippets are simply search results that have been enhanced and enriched with nifty little extra titbits.
Google explains, “If Google understands the content on your pages, we can create rich snippets—detailed information intended to help users with specific queries. For example, the snippet for a restaurant might show the average review and price range; the snippet for a recipe page might show the total preparation time, a photo, and the recipe’s review rating; and the snippet for a music album could list songs along with a link to play each song. These rich snippets help users recognize when your site is relevant to their search, and may result in more clicks to your pages.”
At the very least, rich snippets make your search result entry look larger. All things being equal, the larger an entry, the more attention it commands—and the more attention something commands, the more likely it is to trigger clicks and visits.
Sold? Great! You can avail yourself of rich snippets by adding additional markup to your HTML that helps Google understand your content. Various formats are available for marking up your data, but Google recommends one called microdata, so you might as well follow their lead.8 At present, Google supports rich snippets for content types such as reviews, people, products, businesses, recipes, events, and music. For a full and up-to-date list of supported formats, refer to the listing on Schema.org, a joint initiative by Google, Bing, and Yahoo! aimed at creating a set of agreed-upon schemas for structured markup on pages. By following these guidelines, your markup will be supported in other search engines too—at least theoretically….
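By way of illustration (the product name, rating, and price are invented for the example), a product marked up with schema.org microdata might look like this; Google reads the itemprop values and can build a rich snippet from them:

```html
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Easements Notes (GDL)</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="https://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.7</span>/5
    from <span itemprop="reviewCount">31</span> reviews
  </div>
  <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    <span itemprop="price">9.99</span>
    <meta itemprop="priceCurrency" content="GBP">
  </div>
</div>
```

Note that the marked-up figures are the very text your visitors see; microdata annotates existing content rather than hiding data from humans.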
Another great bonus accruing to some lucky search result entries is what Google calls “sitelinks”. Underneath the entry for your search result, Google sometimes adds a collection of links to other pages of your website that they believe relevant.
So where do you sign up for your sitelinks? Unfortunately, you cannot explicitly direct Google Search to promote chosen pages as sitelinks. Instead, Google’s algorithm determines your sitelinks based on each page’s popularity with visitors and on how easily Google can infer your website’s structure. You can help Googlebot out in this last regard by using the breadcrumbs microdata schema.9 There’s another lever of influence in here too: If Google happens to surface an inappropriate, sensitive, or incorrect sitelink, there’s an option to demote individual links within Google Search Console.10
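A breadcrumb trail marked up with the schema.org BreadcrumbList vocabulary might look like this (the paths and names are invented for the example):

```html
<ol itemscope itemtype="https://schema.org/BreadcrumbList">
  <li itemprop="itemListElement" itemscope
      itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="/notes/"><span itemprop="name">Notes</span></a>
    <meta itemprop="position" content="1">
  </li>
  <li itemprop="itemListElement" itemscope
      itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="/notes/gdl/"><span itemprop="name">GDL</span></a>
    <meta itemprop="position" content="2">
  </li>
</ol>
```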
(Incidentally, you can explicitly set the sitelinks accompanying advertisements created with Google Adwords.)
Every website has one specialised content type that it focuses on, be that tech concerts, dentists, or digital cameras. If you don’t find your particular content type currently listed on Schema.org, Google recommends that you:
Use a less specific markup type. For example, Schema.org has no “Professor” type. However, if you have a directory of professors in a university department, you could use the Person type to mark up the information instead.
After adding the markup, you may be unsure whether your efforts resulted in validly formatted data. To allay any doubts, refer to Google’s Structured Data Testing Tool.12
Every Page Should Include a Picture
Researchers at OkDork.com analysed over 100 million articles and found that those which contained at least one image were twice as likely to be shared on Facebook as those with no images.13 Correlational though this finding is, it underlines the value of augmenting your web pages with visuals.
Write Longer Pages
There are findings both by evidence-based marketers in the SEO world14 and academic researchers15 that web pages with higher word counts are more likely to be shared. And because sharing generates links and links generate further links, this is research well worth heeding. With that in mind, aim for at least 2,500 words per page.
Buzzsumo is a great tool for researching social share data. ↩
WebpageFX’s faqfox shows you what questions people ask in various Q&A forums. ↩
Google Search Console > Crawl > Fetch as Google ↩