As a web developer turned SEO type, I'm always happy to see Google using technical factors within their ranking algorithm.
When page speed came in, I immediately set out to make my clients' sites run faster than their competitors', and in pretty much every instance this was achieved - basically because you'll more than likely find that a lot of SEOs, for whatever reason, don't address the technical side of a website when it comes to optimising. More than likely the reason is that the developer is far too busy on World Of Warcraft to be concerned with things like download speed and the speeding up of...
We also saw some evidence, in the form of increased rankings, to show that taking the time to do so was worth the effort.
So, it again "excites" me in some ways to see Google's webmaster team talking about validation of CSS and HTML as a measurement of code quality, mainly because they inform us of their "goal of always shipping perfectly valid code".
This in no way means that the validity of HTML and CSS is, or ever will be, a ranking factor, but if it's important for Google to ensure its own pages validate, then maybe somewhere down the line they will reward sites that attempt to do the same - it's certainly something I've always hoped they would do.
For me, I will always endeavour to get valid HTML/CSS on all my sites - at least the bits that I code. I wouldn't suggest you obsess over 100% validation, though, because very often third-party plugins and the like will come along and spoil the party.
It's easy enough to check, too: just use the Web Developer plugin for Chrome/Firefox, which allows you to validate both live and local pages.
The number one issue you are most likely to face is the use of a raw ampersand ("&") within URLs and page titles - simply replace it with the &amp;amp; entity.
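To illustrate (the URL here is just a made-up example), a raw "&" separating query-string parameters will trip the validator, while the escaped entity passes:

```html
<!-- Invalid: raw ampersand inside an attribute value -->
<a href="/search?q=widgets&page=2">Next page</a>

<!-- Valid: the ampersand written as a character entity -->
<a href="/search?q=widgets&amp;page=2">Next page</a>

<!-- The same rule applies to page titles -->
<title>Fish &amp; Chips</title>
```

The browser still renders and requests the ampersand as normal; the entity only changes how it's written in the source.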
Another common issue I find is that YouTube's embed code doesn't validate - but with a tweak or two it's easily remedied.
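As a rough sketch of the sort of tweak I mean (VIDEO_ID is a placeholder, and exactly which attributes fail will depend on your doctype): the stock embed snippet tends to carry presentational attributes such as frameborder, which validators flag, so moving that styling into CSS usually does the trick.

```html
<!-- YouTube's default snippet: frameborder is likely to fail validation -->
<iframe width="560" height="315" src="https://www.youtube.com/embed/VIDEO_ID"
        frameborder="0" allowfullscreen></iframe>

<!-- Tweaked: drop the offending attribute and style the border via CSS -->
<iframe width="560" height="315" src="https://www.youtube.com/embed/VIDEO_ID"
        style="border:0" allowfullscreen></iframe>
```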
For me, this is best practice stuff that I do anyway, but this post certainly gives you something to think about moving forward - I think the more factors introduced which aren't as open to black hat methods the better (why would you try to manipulate code to be anything other than valid?).
But as I said, I wouldn't obsess about it just yet, at least not until Google get their own house in order:
And yes, I am aware that this page and the rest of the blog don't validate - I'll blame that on WordPress and the social media plugins...
The rest of the site validates just fine.
Since publication, this article has provoked a little discussion in my Twittersphere:
Let me point out that I don't think page speed and valid HTML should be more important ranking factors than quality content, links and so on - nor do I think a site should be penalised for not having valid code. But I do think that moving forward, particularly with mobile search growing rapidly, sites which aim to make mobile searching a better experience for the user - which valid HTML and page speed both do - should see some reward for making the effort.
And if you're still not convinced, then maybe this post about valid code on mobiles published on the Google Webmaster blog will get you thinking otherwise.