Where did that week go? It seems only yesterday I was writing about the importance of having a website as well as a Facebook page; security updates; and Google search strings. Although, to be fair, we have also practised what we preach and taken a good look at our own website maintenance.
We have been quite busy this week finishing off a website for a client – it’s nearly there and just needs the final tweaks before it can go live. It’s an interesting job as it’s effectively split into three independent sites, all under one administrative back end. This makes it easy for the client to update pictures and copy and keep the site fresh, but it also helps the rankings, as Google sees it as three separate sites.
The importance of regular website maintenance
We’ve spent a bit of time taking a fresh look at our own site and its keywords for search engine optimisation. For our clients we advocate a monthly audit of their site: check the keywords, analyse the rankings, see who uses the site and how, look for out-of-date copy and broken links, and give the site an overall health check. However, like the plumber with the dripping tap at home, we rarely do it for ourselves. So, we made a conscious effort yesterday to spend a couple of hours reviewing our site and giving it an MOT.
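For the broken-links part of that monthly audit, a short script can do the legwork. Here is a minimal sketch in Python using only the standard library (the function names and the example page are ours, purely illustrative – any link checker along these lines will do): it pulls the links out of a page’s HTML and reports any that fail to respond.

```python
from html.parser import HTMLParser
from urllib.request import urlopen, Request
from urllib.error import URLError

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag found in the HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all link targets found in an HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def check_link(url, timeout=10):
    """Return (url, True) if the URL responds, (url, False) otherwise."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout):
            return url, True
    except (URLError, ValueError):
        return url, False

if __name__ == "__main__":
    # Illustrative only: fetch a page, then test each absolute link on it.
    page = urlopen("http://www.example.com").read().decode("utf-8")
    for url in extract_links(page):
        if url.startswith("http"):
            print(check_link(url))
```

Run monthly against your own pages and anything that comes back False goes on the fix-it list.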
On the subject of site reviews, PC-Tablet has published a five-point checklist of SEO tools any website owner must have. I have to say that this is the second time this week that I have been recommended Screaming Frog, so watch this space as I will run it and review it for next week’s blog. It goes without saying that all website owners should review their websites in terms of the traffic hitting them and users’ behaviour once they are there. It makes total sense to install some form of analytics software – my host installs an analytics package as part of the service, so I get my statistics from there – as well as Google Analytics.
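If your host gives you raw access logs rather than a dashboard, even a few lines of scripting will tell you which pages get the traffic. A minimal sketch in Python (the sample log lines below are invented for illustration; real server logs follow the same Common Log Format):

```python
from collections import Counter

def top_pages(log_lines, n=5):
    """Count requests per path from Common Log Format lines.

    Each line looks like:
    1.2.3.4 - - [10/Oct/2015:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326
    """
    counts = Counter()
    for line in log_lines:
        try:
            request = line.split('"')[1]  # the quoted 'GET /path HTTP/1.1' part
            path = request.split()[1]     # the path is the second token
        except IndexError:
            continue                      # skip malformed lines
        counts[path] += 1
    return counts.most_common(n)

# Invented sample data, for illustration only
sample = [
    '1.2.3.4 - - [10/Oct/2015:13:55:36 +0000] "GET / HTTP/1.1" 200 512',
    '1.2.3.5 - - [10/Oct/2015:13:56:01 +0000] "GET /blog HTTP/1.1" 200 2048',
    '1.2.3.4 - - [10/Oct/2015:13:57:12 +0000] "GET /blog HTTP/1.1" 200 2048',
]
print(top_pages(sample))  # → [('/blog', 2), ('/', 1)]
```

It’s no substitute for a proper analytics package, but it’s a quick sanity check that the pages you care about are actually being visited.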
What package should I use to write my site in?
A question that often arises when designing websites for clients is what package to design them in. Although WordPress is one of the most commonly used, it does have restrictions and sometimes it simply doesn’t fulfil a client’s needs. An article on http://www.business2community.com addressed this. The crux of the article is that mistakes are made on websites that hinder their performance, and one of these is that the wrong software is used to design the site in the first place, limiting its functionality. Make sure your web developer has the skill and expertise to develop your site using the most appropriate software, and not just the one they know how to use.
The Financial Times published an article on how to make your website super-sticky. What they mean by this is that you really want clients to actively seek out your site and its content, because it is relevant and of interest. Sometimes the resources spent on SEO could be better utilised in blogging and creating relationships with other industry-relevant websites. This approach could boost your rankings far more expediently than “improving superficial facets of their flashy websites.”
Another article, in a similar vein, is in The Atlanta Jewish Times. The guest columnist discusses whether it’s all about the ‘user experience’. “You can have the most user-friendly site on the Internet, but if you don’t provide what your audience is seeking, you’ve missed the mark.” In the ‘not so good’ old days of websites, they were designed, and the content written, purely to provide information about the business. That is slowly changing. Marketeers and web developers have had to work together to provide sites that are informative; provide the user experience the client wants; and promote the company at the same time – no mean feat!
The last nugget of information this week was published in Information Week. Those who use Bing, Microsoft’s search engine, will have found that the information about specific websites returned in a listing is more detailed, hopefully limiting the number of rogue and fake sites that people visit. “The search engine will now tell users and webmasters about specific problems on each suspicious website.” The plan is that the information “helps users understand each security risk and the potentially harmful effects.”