
First Post

March 22, 2016 by Dan J Ford

So, after putting it off for ages and countless redesigns, I have finally finished my portfolio and blog website! :confetti_ball:

Although it is now 'finished', I will most likely be making many tweaks, as I have already found areas I would like to improve. For example, shortly before uploading this version I went through two different designs and also redesigned the blog post page, and as a result I don't think it really fits in with the design of the rest of the website.

I will be writing up various blog posts about the process I used to get to this stage and the features that I have implemented. One of the features I am most pleased with is the level of progressive enhancement: the website feels and loads like a Single Page Application, but works in a close to identical manner with JavaScript disabled.
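For anyone curious how that kind of enhancement works in general, here is a minimal, hypothetical sketch rather than the actual code behind this site (it assumes every page is fully server rendered and shares a `<main>` element): links are ordinary page loads by default, and JavaScript, when present, upgrades same-site navigation into a partial content swap.

```js
// Minimal sketch of progressively enhanced navigation (illustrative only).
// Without JavaScript the links behave as normal full page loads; with it,
// same-site navigation is upgraded to fetch the page and swap its content.
document.addEventListener('click', function (event) {
  var link = event.target.closest('a[href^="/"]');
  if (!link || event.metaKey || event.ctrlKey) {
    return; // let the browser handle it normally
  }

  event.preventDefault();
  fetch(link.href)
    .then(function (response) { return response.text(); })
    .then(function (html) {
      // Assumes each page shares a <main> element that holds the content.
      var doc = new DOMParser().parseFromString(html, 'text/html');
      document.querySelector('main').innerHTML = doc.querySelector('main').innerHTML;
      document.title = doc.title;
      history.pushState(null, '', link.href);
    })
    .catch(function () {
      window.location.href = link.href; // fall back to a normal page load
    });
});

// Keep the back/forward buttons working by falling back to a full reload.
window.addEventListener('popstate', function () {
  window.location.reload();
});
```

Without JavaScript none of that runs and the links simply work as normal, which is the whole point.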

I am glad that I implemented the website using progressive enhancement, as a few years back I was planning on building the entire website with Ember and using HTML snapshotting as a fall-back. That was at a time when many on the web would say that if you were doing progressive enhancement, you were what was wrong with the web, because you were catering to older users and holding browsers back. However, I am now happy to see that Google have deprecated that snapshotting approach in favour of making progressive enhancement the industry standard again.

Since the assumptions for our 2009 proposal are no longer valid, we recommend following the principles of progressive enhancement.

I have noticed that a few people are confused by the fact that Google has deprecated the Ajax Crawling specification and are questioning what they are meant to do now. However, in the following article Google have tried to explain what the deprecation notice means.

My interpretation of this article is that Google are now basically saying that, whereas previously their crawler was not capable enough to execute JavaScript and see the same content, it is now good enough to execute JavaScript to the same standard as a browser. This means that, technically, you shouldn't need to do anything extra to make the crawler see your content; it should execute the JavaScript and get the exact same content that a user does.

Today, as long as you're not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers.
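The practical requirement behind that quote is simply that your robots.txt must not block the assets Googlebot needs in order to render the page. A hypothetical example (the paths are purely illustrative) might look like this:

```
# Hypothetical robots.txt: the asset paths are illustrative; the point is
# that the JavaScript and CSS directories are not disallowed for crawlers.
User-agent: *
Disallow: /admin/
Allow: /js/
Allow: /css/
```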

Even so, I would consider it a poor move to use a JavaScript framework that doesn't leave room for progressive enhancement. Although Google's crawler is now good at understanding modern web applications, that will not be the case for the countless other crawlers out there. I also believe that progressive enhancement brings benefits such as a faster initial page load, through server-rendered templates for example. In some of my own projects I now use my own JavaScript framework, Lodestar-Ractive, which attempts to make it easy to use JavaScript frameworks and progressive enhancement together.
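To make the idea of "server rendered first, enhanced after" concrete, here is a rough, framework-agnostic sketch rather than Lodestar-Ractive's actual API; the `js-comment-form` class and `/comments` endpoint are purely illustrative.

```js
// The server has already rendered a working form, e.g.:
//   <form class="js-comment-form" action="/comments" method="post"> ... </form>
// It submits and reloads the page fine on its own; this script merely
// upgrades it to submit in place when JavaScript is available.
document.querySelectorAll('form.js-comment-form').forEach(function (form) {
  form.addEventListener('submit', function (event) {
    event.preventDefault();
    fetch(form.action, { method: 'POST', body: new FormData(form) })
      .then(function (response) { return response.text(); })
      .then(function (html) {
        form.outerHTML = html; // swap in the server-rendered result fragment
      })
      .catch(function () {
        form.submit(); // if anything goes wrong, fall back to a normal submission
      });
  });
});
```

The initial page load stays fast because the markup arrives ready to use; the script is purely additive.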

Well, that's it for my first blog post! However, I may add more to it later as things come to mind...

To keep up to date with what I'm writing, follow me on Twitter! I'll soon get around to writing up the various features I have implemented on this website. Another incentive to come back: if you are interested in Artificial Intelligence, I will be writing various posts about my dissertation, which involves several tree search algorithms and, if I have enough time, approaches such as Reinforcement Learning.
