by Domenico La Tosa

Outpost24 is a Swedish vulnerability management company.

I worked on the Outpost24 website for almost a year, performing a wide range of tasks in cooperation with the whole management. I added many features, pages and graphics, improved its performance and successfully raised its Google ranking through a search engine optimization campaign.

SPOILER ALERT: after a few months of the campaign, the Outpost24 website jumped from page 30 to page 4 of the Google search results.

A little information about the case: the website was about 10 years old but still had a 0/10 PageRank. The company was already quite big, with about 50 offices around the world, several big international customers, a cutting-edge R&D department and many mentions in international vulnerability management magazines. Despite all these fertile starting conditions, the website was not performing as expected.

Given this picture, how did I proceed?

I describe below the main on-page aspects of my optimization.


The right information

In order to structure a good search optimization campaign, you need complete information about the website: all the access statistics (Analytics or other trackers), the platform used, the domain age, any sub-domains… everything! A full set of information about the company, its marketing strategy, its short-, mid- and long-term plans, the products, the main competitors, the targets and much more is required as well.

All this information goes on my (very large) project whiteboard: I prefer to be able to look at it whenever I need to.


The right keyword(s)

I handled the Outpost24 SEO/search engine campaign working on all the aspects listed in the Google guidelines and, generally, implementing a holistic approach. First, I studied the semantic trends in the vulnerability management industry: I found it is a tricky field because, as quite often happens, many keywords and long-tail phrases concerning vulnerability management are shared with other semantic families.

A valid example is the word “scanner”: for a security consultant it means exclusively a vulnerability scanner, which is a software that finds weaknesses in your corporate network. For everyone else in the world, a scanner is a device that digitizes documents and pictures. I had to find a way to use the word “scanner” while making Google “understand” that I was talking about “my” kind of scanner, not the common one.

Once you have defined a set of valid keywords, you should “test” your competitors’ performance on the same keywords you want to rank for. If they perform better than you, it is not necessarily a matter of budget.

Here are some of the techniques I used within this project.


The right tricks

With the semantic questions solved, I fixed all the available tags: TITLE and ALT for the images, and the TITLE attribute for the links as well – yes sir, when it comes to HTML5, even a hypertext link like:

<a href="/whatever-page/" title="Page about whatever">Whatever Page</a>

is different from

<a href="/whatever-page/">Whatever Page</a>

from many points of view. You can easily imagine how many links I had to fix manually – and then check later using software like Screaming Frog.
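A check like that is easy to automate. As a minimal sketch (an illustrative script, not how Screaming Frog itself works), Python’s standard html.parser is enough to flag the links that still lack a TITLE attribute:

```python
from html.parser import HTMLParser

class UntitledLinkFinder(HTMLParser):
    """Collects the href of every <a> tag that has no TITLE attribute."""

    def __init__(self):
        super().__init__()
        self.untitled = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            # A link counts as "to fix" if it has a destination but no title
            if "href" in attrs and "title" not in attrs:
                self.untitled.append(attrs["href"])

page = '''
<a href="/whatever-page/" title="Page about whatever">Whatever Page</a>
<a href="/other-page/">Other Page</a>
'''

finder = UntitledLinkFinder()
finder.feed(page)
print(finder.untitled)  # the links still left to fix
```

Feeding it a real downloaded page instead of the sample string would list every link left to fix on that page.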

As for the image tags, Google cannot really “see” images as we do, so it uses the picture’s file name and its TITLE and ALT attributes to “read” them. Of course, these tags must be coherent with each other, with the page content and with the overall SEO strategy.
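For instance, a coherently tagged image could look like this (file name, ALT and TITLE are invented for the example – the point is that all three describe the same subject):

```html
<img src="/images/vulnerability-scanner-dashboard.jpg"
     alt="Vulnerability scanner dashboard"
     title="Vulnerability scanner dashboard">
```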

On the basis of the semantic study, I also rewrote the page contents (titles, body, tags and so on) several times, based on gradual lexical, semantic and syntactical refinements – and, of course, the management’s taste. One of the tools I used, and always recommend for writing accessible content, is the Readability-Score text composer. The perfect website content must be super easy for everyone to read. You should always aim for content that an 11-year-old kid could read, especially from the syntactic point of view.

There are plenty of techniques to structure a good SEO-oriented content. I can mention here some that can help:

  1. write down a clear and easy semantic structure for your contents;
  2. use a minimum of 300 words for your contents – it is not really a golden rule but most of the time it has its effect;
  3. use the “right topic” structure for your contents (more details below);
  4. use LSI, Latent Semantic Indexing, a kind of quantitative approach to create a semantic ecosystem as diversified as possible;
  5. use the right headings: H2, H3, H4 and so on have an SEO effect as well as a graphical meaning.
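As a minimal illustration of points 1 and 5 together, a page skeleton might look like this (the headings are invented for the example):

```html
<h1>Vulnerability Management</h1>
<p>The core message of the page goes here, first.</p>

<h2>How a vulnerability scanner works</h2>
<p>Explanations and supporting details…</p>

<h3>Scanning your corporate network</h3>
<p>Deeper technical details…</p>
```

Each heading level narrows the topic of the one above it, so both readers and Google get the same semantic outline.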

One of these methods, #3, deserves a little extra explanation, which can be summarized in the following statement: a website is not a book. When you read “And Then There Were None (Ten Little Indians)” by Agatha Christie, you want to reach the end of the novel, but I presume you do it for the pleasure of reading, and you would really not appreciate a friend who suddenly, while you are still reading, reveals that the murderer is **********. Corporate-oriented page content should be structured exactly the reverse way: the “murderer” – the core information – comes first, then all the explanations, the details, the processes and/or the research that lead to that point. With this approach, your users will not scroll a page wondering whether there is something interesting for them: they know there is something interesting on the page, because they have already found it. Now they will look deeper to get the value out of your content. This technique helps focus the reader’s attention and improves the website’s first contact with the potential customer as well.

Another huge task I executed was rewriting the website taxonomies, which are basically the link paths. The internet is full of addresses like this (the URLs below are invented examples):

http://products.example.com/the_page_about_our_security_scanner

but Google hates unnecessary second-level domains, underscores, “stop words” and other things like these, and prefers:

http://example.com/security-scanner/

which, by the way, looks way clearer and tidier.

If you want to take a jump to the next level – and if it is not out of line – in our case we could use something like this:

http://example.com/vulnerability-management-scanner/

This strategy, of course, is not always applicable: adding keywords into the taxonomies can make them look very clumsy. Plus, if the taxonomies do not correlate with the content they point to, and with the anchor text they come from, they could bring more disadvantages than benefits. In my case, “vulnerability management” was at once a very hot long tail, the company tagline and the focus of their products, so adding it into some taxonomies was very appropriate and successful.

After all the contents had been rewritten, I worked on organizing them better across the whole website. I wanted to provide a better navigation experience for the users and a higher degree of “openness” toward them. The contents were originally organized in a too-logical order: they were structured almost “supposing” that the users had an advanced knowledge of vulnerability management benefits, tools and practices. Users were expected to pass through a certain number of “introduction” pages before getting a view of the products and services the company provides. This kind of behavior is as common as it is wrong.

A website is a kind of commercial. No one but a bunch of aficionados would buy a product that is introduced in an extremely technical and complex way. Yet this habit of keeping a high level of complexity when treating a topic, an issue, a product or a service happens very often: all the time, managers and copywriters mistake “technicality” for “quality”.


Content is King!

An SEO expert I know likes to say that “content is king”. This simple statement is full of meaning. With a complete and smart semantic study you can catch an existing trend and compete with those who work in the same field as you. But what about creating a trend?

You could write your contents based on all the recommended Google techniques, filling them with tons of highly selected keywords and so on. You would probably – especially in the short and mid term – see an increase in visits but a flat or decreasing average time spent by your visitors on your pages, and a stable ROI. What does that mean?

It means that one of the factors that determines your website’s position in Google is the time your users spend on your website, and on every page they visit. Google uses this information, together with much more math and statistics, to assess the quality and reliability of a piece of content. That’s why the practice described above, known as “keyword stuffing”, is progressively disappearing from the top 3-4 search result pages.

So, when you produce content, you should respect some fixed requirements but use them as a tool, not as a target. No matter the peculiarities of your industry or business: there is always a way to produce quality content, as the recent Bartlett’s guerrilla campaign demonstrates. Producing “quality content” means that your users should scramble to reach your corporate blog to use it, because you always provide cutting-edge, valuable and useful material. The opposite example is the typical self-referential corporate content: you can recognize it because users run away from it.


Layout, please

One more consideration about a professional SEO campaign concerns the page layout.

A set of quality contents is fundamental but, if they sit at the bottom of the page because there is a gigantic menu bar, then an enormous slider and maybe something more below… your users will probably not even read them. The topic is even hotter when it comes to e-commerce websites.

One of the things I worked on for Outpost24 is the customers section, which is now a one-page section. I designed a new layout that lets users see all the company’s customers “above the fold”, meaning that all the section’s contents are visible without scrolling down. This technique maximizes the user’s exposure to your contents/messages, whether they scroll down or not.


Website Speed/Performance

Another very important element of every website optimization is its speed and overall performance. It does not matter whether you have a fashion blog or a corporate website: neither Google nor users like pages that take a lifetime to download. I consider the speed assessment of a website a must for the whole web designer category but, too often, pure designers know very little about programming and the coding aspects of UX – and that’s why a professional SEO can’t simply be a copywriter or a programmer or a designer or a marketer, but must be the synthesis of all these profiles.

You can get a cross-assessment of any website speed issues using these very popular tools: PageSpeed Insights, Pingdom Website Speed Test and GTmetrix. They are quite precise and offer different points of view on your web page. PageSpeed Insights focuses mainly on a qualitative assessment of the code, the Pingdom tool highlights the download time of every single library on your page(s), and GTmetrix provides basically a synthesis of the two previous tests. Just paste your link and click “Analyze”.

Here is a list of suggestions to make your website faster:

  • activate Gzip: you don’t need to install a plugin or an extension for that, just add a few lines to your .htaccess file and enjoy ∼20% data compression;
  • don’t upload pictures bigger than full HD resolution: unless your users have plenty of very expensive 4K monitors, an image of 1920×1080 pixels will look crisp and perfect on all screens. Consider that, if you use Shutterstock or Colourbox, your source images are about 4x bigger than necessary. If you use Colourbox, I know for sure that they will soon release a special tool called “Web Picture Download” to provide full HD pictures natively and save your users bandwidth;
  • match the container size: if you are uploading the picture of a calendar for a box called “Online Booking”, make sure the image has the same resolution as the container that will host it. You will easily save thousands of pixels on every JPG and your users will save download time;
  • install an automatic image compressor: the human eye cannot perceive all the information within your JPG files. There are many image compressors for all the CMSes that can losslessly strip this “extra” data away from your pictures and shrink their size in KB (not in resolution);
  • minify CSS, HTML and JavaScript files: all these files include comments, empty spaces and many more characters that browsers do not use to render your website. Minifying them strips away these hundreds of “extra” KB of text, easily saving up to 30% of the file size;
  • combine your libraries or make them asynchronous: most websites are hosted on a cloud host, which means that if there are too many simultaneous requests, the server cannot answer all of them and must queue many, causing a delay in page rendering. Many themes for WordPress, OpenCart, Joomla and Magento load up to 40-50 different libraries, and so they are very slow. If you combine all the CSS and JS libraries, the server will have to answer fewer connections – hopefully no more than 10. This will speed up performance without buying a more expensive host. Another similar strategy is loading these libraries asynchronously, so they do not flood your server all at the same moment, which also improves performance;
  • install a CDN: a Content Delivery Network copies all your website files onto many servers around the world. Since the distance between your user and your physical server influences loading speed, a decentralized network of servers maximizes the chance of serving your website from the closest available server. Depending on the company budget, you can use a free-of-charge CDN or a super-performing, full-option one.
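As an example of the first point, on a typical Apache host Gzip can be activated with a few mod_deflate lines in the .htaccess file (a common sketch – the exact directives depend on your server configuration):

```apache
<IfModule mod_deflate.c>
  # Compress text-based assets before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/plain text/css
  AddOutputFilterByType DEFLATE application/javascript application/json image/svg+xml
</IfModule>
```

Binary formats like JPG are left out on purpose: they are already compressed, so Gzipping them wastes CPU for no gain.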

I do exist!

Install a dynamic sitemap and register your domain with the Google Search Console – previously known as Google Webmaster Tools.

A dynamic sitemap is a set of .xml files that contain all the taxonomies of your website: posts, static pages, portfolios, authors, categories, tags and much more. These XML files can update automatically every time you publish new content. This makes your website more likely to catch new trends and users in the search result pages.
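A single entry in such a sitemap file, following the sitemaps.org protocol, looks roughly like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/vulnerability-management/</loc>
    <lastmod>2016-02-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

A dynamic sitemap plugin simply regenerates this file, adding a new `<url>` block, every time you publish something.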

It is highly recommended to submit these XML files to the Google Search Console. There you can speed up the indexing process and make sure the indexed pages are always at the latest possible version.
