Welcome to our news section!
Check out what we are up to and see what is happening around us in related fields.

Social Media: Effects on SEO

Searching for Social Signals

Since the inception of SEO as an industry, web marketers have been chasing algorithmic updates and attempting to predict exactly what factors would come to play a leading role in organic search engine rankings. As the search engines' engineering teams expanded and became more sophisticated, the scalable (and indeed manipulatable) techniques of the first decade of the new millennium gave way to speculation that the algorithms could no longer rely on purely faceless and sometimes dollar-oriented metrics such as backlinks and on-page optimization.

While those factors still play, and most likely always will play, a large part in how sites are ranked, the search engines are now looking to augment the search experience by crowd-sourcing some of their work to social media sites such as Facebook, FourSquare, LinkedIn and Twitter... to name just a few. The thinking here is that many of the traditional SEO signals, like backlinks and keyword implementation, can be manipulated by web developers and optimizers, whereas people's honest opinions of products or services (when their name and reputation are on the line in a social environment) are much less likely to be manipulated.

As Stefan Weitz, Director of Search at Microsoft pointed out in a recent interview, this information was previously "trapped in people's heads" and unreadable by machines. Through the use of social sharing, customer reviews, and search activity, people are now committing their recommendations and endorsements to an open medium that the search engines can read, interpret and utilize in their rankings.

Raising the Question: Correlation or Causation

In December of 2011, Matt Cutts, the head of Google's web spam team, confirmed that social signals do in fact count "relatively lightly" in the rankings game and that Google was "studying how much sense it makes to use it a little more widely in [their] rankings." Notice that Google, already anticipating that social signals might be manipulated, warns that authority matters.

This sparked the search community to not only debate the issue, but to dig deeper into the effects of social media on rankings. For those analyzing the data, the key was to determine if there is a direct correlation between increased social signals and organic rankings or if the relationship is a byproduct of other causal factors. In other words: Are the number of tweets, mentions and shares directly affecting a site's rank, or are the inbound links they naturally create through people discussing them the reason for increased rankings? Either way, there was no disputing that sites with more social activity were ranking better than those without.

In June of 2011, SEOMoz's in-house data scientist, Dr. Matt Peters, crunched the numbers in an attempt to identify a direct correlation between the number of Facebook shares and increased organic rankings.

Graph of Correlation of Social Metrics

At the time, Dr. Peters concluded Google was not using Facebook share data directly in their rankings.

As we enter 2013, the question continues to be: just what is the correlation between social signals and rankings? Has Google used them "a little more widely" in its algorithms? While the evidence is varied, most conclude that the importance of social signals is increasing over time, whether that is a direct result of algorithmic updates or simply a byproduct of increased traditional SEO signals generated through social media.

Forget Rankings: Putting Social Media in Perspective

As we continue to try to make sense of social signals from a rankings perspective, there is no question of their marketing value. Here are several reasons to engage and develop a strong social presence as we head into 2013... regardless of search engine rankings.

  • Customer Service: Social media is a great way to leave the phone lines open and show current and potential customers you're devoted to answering questions and solving customer service related issues.
  • Increased Site Traffic
  • Marketing/Promotions: Offer coupon codes and special online promotions with directly measurable ROI.
  • Brand Awareness
  • Analytics: Learn what people are saying about your brand and who's saying it. Who is your target demographic, and how does that compare to who is actually engaging with your brand?
  • Networking: Connect with industry opinion leaders.
  • Inbound links: I know I said to forget rankings but the more people are talking about and engaging with your brand, the more links will naturally be built - increasing your traffic and rankings.

Social media and its impact on rankings are not going away any time soon. As the Nielsen Report on Social Media in 2012 illustrates, social media is playing a larger and larger role in our online experience, and even more so in the lives of adolescents and teenagers, the next generation of consumers, who will rely on these signals as endorsements of, or warnings against, products and services.

Recommended Web Development Tools Using Mac OSX

Recent projects and client requests have led us to reevaluate our workflow. This is wise to do periodically; sometimes it just helps to have a push!

Basic Physical Setup

  • Standup Desk - I suggest configuring the relatively inexpensive Connections Workstation into a custom desk tailored for your workflow.
  • Two Monitors - Set the second display to extend, not mirror your primary.


  • Sublime Text 2 - It takes extra effort to get in tune with the software, but it is well worth the time spent. The lack of a GUI for the application settings and of built-in FTP support initially prevented my full-time migration. But after spending several hours scouring the internet for tips and taking the free Nettuts+ course, I can definitively say Sublime Text 2 is the best editor I have ever used. If you are familiar with WordPress, think of the ST2 settings like "Child Theming," with extensive use of JSON to control app settings, package (plugin) settings, key bindings and visual color themes. Store your settings in the respective user "child" files, as the parent files will be overridden when updates are released. Take the great free Tuts+ Premium workflow course; Jeffrey Way does a great job with a series of succinct video tutorials and plugin recommendations. Here are a few plugins I suggest:
    • Package Control (Install this first!)
    • Sidebar Enhancements
    • Sublime SFTP
    • Nettuts+ Fetch
    • BracketHighlighter (Use tutorial link to dial in the colors to your taste)
    • Alignment
    • SublimeLinter
    • Emmet (Ex-ZenCoding)
    • Transmit DockSend (if using Transmit FTP app!)
    • AdvancedNewFile
    • PlainTasks
  • Transmit - The best FTP app out there; it can be used in conjunction with Sublime Text 2's "Transmit DockSend" plugin.
  • CodeKit - I use this primarily as a LESS preprocessor and browser-refresh tool. Whenever code is saved, the browser running on the second monitor refreshes to show the changes, without the carpal-tunnel-inducing cmd+tab, cmd+r routine! CodeKit does so much more; check it out.
  • Sequel Pro - An excellent tool for managing MySQL databases. Another great option is Navicat, which has a robust feature set including great backup and visual query tools.
  • Color Schemer Pro - Dial in your own color palettes or download others using the Color Lovers website.
  • Alfred - Launch this app with a keystroke and you have access to opening files, launching applications, performing internet searches, playing music and countless other things! This will likely be the highest productivity boost recommended in this article.
  • JiTouch - A great accompaniment to Alfred, but sort of its reciprocal. This amazing application extends gestures on your trackpad and allows you to, say, launch your text editor by "drawing the letter T"! When used in combination with the Mac's ability to assign keyboard shortcuts, you can perform many system-wide actions with a gesture of your choice!
  • Moom - Enabling window sizing and position controls with a keyboard shortcut can really speed things up, especially with Responsive Web Design using a second monitor and jiTouch.
  • Dropbox - Keep your files in sync across devices or access them on the go with Dropbox. Great for illegally sharing music ... just kidding.

There are certainly other excellent tools out there for correspondence, task management and collaboration, version control, backups, password management, and keeping current with ever-changing technologies and news feeds, but this is a good starter cheat sheet! If you find any of the above useful, please purchase a license or donate to the developers who have put so much effort into making these great tools.

Are Exact-Match Domains (EMDs) Going the Way of the Dodo?

EMDs, or Exact-Match Domains, are domain names that contain keywords matching the keywords of a particular search query. Picture Joe Fitness sitting at the computer shopping for new running shoes. Like most of us, he might mosey over to Google, type in the search phrase Best Running Shoes and be given a list of websites likely to sell or review running shoes: Nike, Men's Health, RunnersWorld.com, etc. But among that list of quality sites which might give Joe valuable reviews or purchasing options, he sees www.bestrunningshoes.com. A quick visit to the site reveals that, yes, they do in fact sell running shoes, but overall the site seems quite spammy and not nearly the quality of the other, non-EMD sites. In short:

  • www.bestrunningshoes.com is an exact match domain
  • www.nike.com is not

For years this has been a common occurrence and a cause of consternation for businesses and site owners alike. But why does this happen? Why have the search engines valued EMDs so highly over the years?

A search engine's primary objective is to return the highest-quality, most relevant websites to searchers for their given search term. They do so by looking at a variety of both 'onsite' and 'offsite' signals to determine a) what the site is about and b) its quality and trustworthiness before recommending it to their users. Among the thousands of ways to figure out what a site 'is about,' it is reasonable to look at the domain name its owners chose (one of the most fundamental marketing decisions a brand makes when establishing a web presence) and assume that www.best-running-shoes.com is a site about running shoes and not square-foot gardening. The key question as we move through 2012 is what weight search engines such as Google will place on exact-match domains (EMDs).

Evidence gathered by SEOMoz's Mozcast project has shown a decline in the value of exact-match domains' correlation to rankings:

A Chart Showing the Decline of Exact-match domains since 2010

Google has warned of the diminishing value of EMDs, or "keyword-laden domains," for a while now. Here's Matt Cutts talking about the issue back in March of 2011:

More recently Cutts confirmed what we already knew would eventually happen when he tweeted:

Matt went on to write that this change only "affects 0.6% of English-US queries to a noticeable degree." But it should be noted that this filter will be re-run on a periodic basis, so it's difficult to predict how many exact-match domains will be affected in the future. Those not hit now may well fall victim when the filter is next updated.

Some Advice Moving Forward

It's perfectly natural for many new companies to purchase an exact-match domain. Google's algorithm change targets poor-quality exact-match domains, not exact-match domains in general. If your company's website is ranking well solely on the strength of an EMD, we suggest getting started now on building out your onsite and offsite SEO signals, including link building and content development. If your EMD site has taken a recent hit in the rankings, we also recommend reviewing (or having someone else review) and diversifying your backlink anchor text, as it may target your domain's keywords too closely and be a casualty of Google's Penguin update targeting overly aggressive anchor text. Then wait a few months for the filter to be run again.

Not only are exact-match domains on precarious footing (for now, just poor-quality EMDs), but Google has a propensity for ranking brands, so if your new company is looking to purchase a domain we'd suggest you opt for a "branded" domain name. If an EMD makes sense from both a branding and an SEO standpoint, go with it, but either way be prepared for the long-term investment in brand recognition, rather than relying on short-term rankings from an EMD.

Bing Introduces Feature to “Disavow Links”

We recently published two articles on Google's "Penguin" algorithm update and Negative SEO, both of which focus on the perils of low-quality, or spammy, link building strategies. Whether your links were built intentionally, or through the unlikely occurrence that a competitor has purchased and pointed low-quality links at your site in an attempt to get you penalized, a way to reject or "disavow" those links has been on many SEOs' wish lists for quite some time.

Despite indications by Google that they would be adding this functionality to their Webmaster Tools in the coming months, Bing managed to 'beat them to the punch' by announcing this feature at the end of June. Webmasters can now log in to their Bing Webmaster Tools account and select Disavow Links from the Configure My Site menu.

Bing's Disavow Links Feature Screenshot

Look for Google to introduce this functionality into their Webmaster Tools within the next few months. This appears to be a victory against "negative SEO," but some SEOs are skeptical that, despite its benefits, the feature could be more of a data-collection technique and a means of publicly outing offenders within the SEO community.

An Overview of Content Management Systems

Anyone who has browsed through the articles featured in this news section may have noticed that they range from basic and easily digestible to jargon-rich, techie articles meant for those with a background in web languages. This article is of the former type and is intended for those with a basic understanding of the web looking to learn more about key concepts in current web design trends, or for potential customers doing their due diligence. Our apologies for the overabundance of analogies, but we find they can be useful in making our industry less arcane.

An Explanation of Content Management Systems

Content Management Systems (or CMS for short) are becoming more and more common for web design and maintenance. Some examples you may have heard of are WordPress, Drupal and Joomla, although there are many others.

CMSs are fantastic tools that allow changes to be made to your website easily and in a fraction of the time it can take to mark up and code a website from scratch. This benefits the owner of the website, who may know nothing about HTML, CSS or other web technologies but is now able to maintain the site without always relying on a professional web designer for every little change; it also allows the web designer to fast-track much of the design process. The website you are looking at now is built on the Expression Engine content management system, even though we are capable of building our own CMS from scratch: it's often a better choice to use a thoroughly tested and secure product instead of re-inventing the wheel.

Most CMSs offer an intuitive dashboard, much like a desktop application, to manage the site. Here are two popular dashboards:

The Expression Engine Control Panel "New Post" screenshot
The WordPress Dashboard screenshot

The underlying technology behind a site built through a CMS is still markup and code (as in the old days), but the visual editor of the CMS speeds up this process and allows for quicker revisions in the future. Although there may be some extra "up front" work in setting up the CMS, the long-term benefits are substantial and allow the site to easily grow and evolve. And because this underlying code powers the website built on a content management system, the way it looks and the things it can do are endlessly customizable. For example, there is a large market of designers offering custom built themes that can be swapped out with the click of a button to completely change the appearance of your website.

Most web surfers are probably getting pretty good at picking out a "WordPress" site, as they've come to subtly identify certain out-of-the-box layout commonalities. Because of this, small business owners often assume that a site built on WordPress will inevitably look like a "WordPress" site, and fear the design will not be able to properly capture their unique brand qualities without a 'templatized' feel. This is a fair but inaccurate concern: a good designer, provided the right marketing assets and a reasonable budget, will be able to customize a website built on WordPress to an astounding degree. The client is then left with a website that accurately portrays their brand image but is also scalable and easy to update.

The job of a good web designer is to understand the client's needs and objectives and choose the right tool for the job. Site owners who are more hands-off may not need a CMS and will rely heavily on the web designer for future changes once the site is built. Others may want to retain control over changing certain aspects of their site such as the contact number, the photos featured in the gallery or frequently adding content such as blog posts or news entries.

The Work Doesn't End There

Once the web design agency has built the website and passed it off to the client it is often assumed that the relationship ends there; the client takes their website under their arm and goes, so to speak. In some cases this may be true, but what often happens is that conflicts between certain software elements of the CMS arise and the webmaster is called back in to fix the problem. Why does this happen?

One of the greatest advantages of CMSs is that they're open source, and thousands of developers are coming up with tweaks and add-ons daily to sit on top of the core software. There are add-ons to pull in videos from YouTube or your Twitter feed, translate your website into a variety of languages, play mp3s, etc. Whatever you can think of, it probably already exists. But because developers of widely varying skill levels are releasing (but not always supporting) extensions, compatibility conflicts can arise.

Most CMS-powered websites are built using a combination of the following three elements: Core Software, Themes, and Plugins/Modules.

Think of the Core CMS Software as the chassis and engine of an automobile. It forms the framework for the rest of the car to sit on and powers the motion of the car. The Theme used could be looked at as the body style and paint job. It affects how the car looks to the outside viewer. The Plugins and Modules added to a site are equivalent to optional but very useful things like the car stereo, the AC or shiny alloy rims.

So far, so good. We have a complete new car with a powerful engine, a great-looking body and some nice available options. But what happens when the car gets a scratch and the paint manufacturer has discontinued that shade of blue, or the Bose stereo you're trying to fit into the dash requires a custom-made bracket, or the alloy rims are too large for the body? This happens all the time in the world of Content Management Systems. And most problematic of all, these elements are constantly being updated by their developers, and a lot needs to go right for everything to fall into place and keep working for any extended period of time.

Although Circulation Studio is proficient with various content management systems, we have developed our own WordPress "child" theme and identified a core set of robust and well-supported "go-to" plugins which together form a more stable and feature-rich foundation for our clients' websites. We have also taken into account the SEO prominence of certain elements for a sound structure. We hope to release a future article discussing many common misconceptions about CMSs and SEO.

Should conflicts arise with the site's Theme or Plugins, we are well equipped to communicate with the developer or choose a suitable alternative. We offer custom-tailored and affordable, no-hassle maintenance packages to clients who want to ensure that their site can grow and evolve with web standards and new technologies.

Full Disclosure

There are some great free or low-cost blogging tools out there that allow you to do some very sophisticated things with minimal computer skills or prior knowledge of coding. Google's Blogger platform and WordPress.com offer free, web-based content management systems with well-developed and supported feature sets and add-ons. Anyone can sign up and, within no time, upload a logo, integrate their social media accounts, create a contact form and start blogging! These are great tools and we recommend them to anyone looking to blog about their garden or their son's little league team.

So what happens when a small business owner comes to us looking for something "simple" and "quick, you know... a basic site package..." that they can manage and update themselves only to receive back a proposal suggesting they drop thousands of dollars? Understandably they're left scratching their head and we struggle through an over-the-top technical explanation of just what they're getting, and how it differs from their cousin's cooking blog with that great banner and font.

First, let's try to approach the question by way of analogy... You're a recent graduate of the Paul Mitchell Academy, have just received your cosmetology license and are looking at options to setup shop, develop a client base, get your name out there. The way you see it, your best options are either to rent a chair in an established salon (cheap, low maintenance and little responsibility) or open your own salon (expensive, and stressful but something you'd really love to do).

You start to think about the pros and cons of each scenario. Renting a station by the month is much cheaper, someone else is paying the rent and utilities; the chair itself is new and comfortable and the wash station more than adequate. They've given you a nice rolling countertop to display your cosmetology license and photos of your nephew and dog Arnold. A security guard patrols the premises at night and the shop itself is in a high-traffic part of town and receives a fair amount of walk-ins. This might work.

But you've grown tired of staring at wall art resembling the cover of Duran Duran's Rio and not having control of the radio. When you tell people where you work they think back to the horrible haircut Sebastian gave them the year before. You want to repaint the entire place, tear down a wall and no longer be known as Jane Graduate who cuts hair at a salon called Hair to the Throne. You are ready to take complete control of your brand and image and establish your reputation not just as a stylist in somebody else's salon but as a business owner and trendsetter.

As you might guess, the latter scenario represents building out your own website professionally. While hosted platforms such as Blogger are great, inexpensive ways to get started and probably quite adequate for many small enterprises, there are just too many restrictions and limitations for them to be a viable solution for business owners who really want to take control of their brand and future online presence. Many of these limitations come from the fact that the sites are hosted on Google's and WordPress' servers (the hair-cutting station is "in their salon," so to speak), so understandably, for security reasons, they can't just have people totally repainting the place and knocking down walls.

The same can be said, to a certain degree, about out-of-the-box solutions such as SquareSpace and other "build your company website in 10 minutes!" offerings. Again, these might be fine for small businesses that don't require custom branding or don't have the budget to hire a design agency. But in our opinion, everything worthwhile takes time and foresight, so whether you are hiring a design agency or set on using a low-cost, out-of-the-box solution such as the above, we recommend writing out your goals (both short- and long-term) and a complete online strategy that extends far beyond just getting a site up and liking how it "looks." Only then will you be able to develop a comprehensive web presence that drives traffic to your website and engages visitors with well-placed calls to action and measurable goals.

Link Networks and Negative SEO after Penguin

Due to Google’s recent algorithmic changes, their targeting of link networks and their pre-existing measures against low-quality links, there has been growing concern regarding Negative SEO. Because the web is so open, and because many of the signals search engines such as Google use to determine a site’s ranking take place off-site - without the participation of the site owner or webmaster - there are a number of techniques that could be used by competitors and hackers in an attempt to penalize another site or diminish its rankings.

We will discuss some of those techniques below but the common theme of negative SEO is misrepresentation, i.e., someone impersonating a shady and unscrupulous marketer within your organization in an attempt to entice Google to penalize the site. This reminds us of the Ali G Show skit in which Borat follows a local politician on the campaign trail informing potential voters that if elected, his candidate will be “...powerful, like Stalin!” You don’t want Borat helping you with your political campaign and you certainly don’t want your competitors helping you with your SEO.

The primary concern regarding Negative SEO at this point seems to be backlinks. Because anyone can point a link at anyone else’s site without their consent - here, I just randomly pointed a link to NPR - it naturally follows that a competitor could point links from low-quality sites that Google has identified as spam at your site. Low-quality links are cheaper to buy than ever, and with the April Penguin update specifically targeting sites with low-quality or paid links, Negative SEO - and especially negative linking - has become a growing topic of discussion in the search community.

Around mid-March, Google began taking further action against link networks. If you were participating, you may have seen your traffic drop, either because those links were devalued or because Google penalized your site. BuildMyRank.com was penalized so severely that it had to shut down its service and provide refunds to customers.

In this type of scenario it is important to distinguish between sites being penalized for low-quality links and Google simply not counting those links, which is in fact the case most of the time. In practical terms, a penalized site suffers directly at the hands of one of Google’s engineers for participating in practices falling outside of the Webmaster Guidelines, while a site whose links are merely devalued simply stops benefiting from something it never deserved in the first place.

Here, Google further explains and slightly comforts us on the second scenario:

Also, it should be noted that Google has ramped up its efforts to communicate with webmasters through the message center of Webmaster Tools, including warnings about low-quality backlinks. However, it’s unclear whether Google will ever create functionality for sites to “disavow” certain backlinks - basically to say, “I don’t know these guys and they don’t represent our company.”

However Negative SEO can extend beyond just pointing spammy links at your site. It can include security breaches in which hackers find a vulnerability in your FTP connection and place spam on your site or edit your robots.txt file to block Google from crawling the site, or introduce other malware. It can also include scenarios in which a competitor may report your site to the search engines for shady or “on-the-border” gray-hat type SEO you’ve conducted or build a number of quick and dirty sites to repost (copy and paste) your site’s content, effectively creating spammy, duplicate content.

Negative SEO isn’t a new concern but has so far proved to be rare. Google has said that it does an enormous amount to prevent one site from targeting another. In most cases large sites with quality structure, content and links have built up a natural immunity to these types of attacks. In other words, the good outweighs, and can often overcome, the bad.

Our advice is to be aware of and watch out for Negative SEO, but not to stress about it. Build backlinks naturally, monitor your site and backlink profile regularly, and avoid any techniques that may be considered spam. Implement known security measures and be ready with a “reconsideration request” should you suspect any wrongdoing. We’ll be following this debate closely, so feel free to contact us with any questions or comments.

Additional Sources

Search Engine Land's Article "Google Eliminates Another Link Network"
SEOMoz's always illuminating WhiteBoard Friday on Negative SEO

Build Your Own Responsive Content Slider Without JS Dependence

In conjunction with a responsive grid system, such as The Organic Grid, creating a responsive content slider is a snap. There are a couple of widely used methods to build a content slider, which typically involve a containing element styled with an overflow: hidden; CSS declaration. This container serves as a window to mask the "non-active" slides. This tutorial uses a different approach made possible by the CSS :target pseudo-class selector. Our goals are outlined as follows:

  • Progressive enhancement
  • Responsive slider that adapts to viewport size
  • Slides capable of containing images or HTML
  • Scalable
  • Easy to use
  • Default slide (when there is no :target URL fragment)
  • Integrate with a free jQuery Plugin that is well documented and supported
  • Slides viewed do not become part of the browser history (with JS enabled)

Progressive Enhancement

We firmly believe in web standards and progressive enhancement. This keeps our future selves sane and helps anyone else who works with or inherits our code. Even though relatively few of us browse the internet with JavaScript unavailable or disabled, there are several key reasons to create fallback measures that accommodate those without JavaScript... mainly, for those who are not human - meaning screen readers for those with disabilities, as well as search engine crawlers.

Search engine robots, like Googlebot, do not index certain content like Flash or JavaScript as readily as HTML. Creating the content first, then enhancing the user experience with unobtrusive JavaScript, is a good way to make sure your content is getting crawled and indexed (and validated) properly. We have seen many designers neglect to use alt text that properly describes images, even though alt is an attribute required by the W3C.

Separation Of Concerns

We are conscious of the Separation of Concerns. In our opinion, animation falls more into the behavioral "layer" of the web, which implies JavaScript should handle it (as opposed to the CSS presentation layer). But one could make a case for CSS handling animation, and it is wise to stay abreast of evolving technologies. Pure CSS animation is accomplished using keyframes; see the references section below for links to a few outstanding tutorials that teach and showcase CSS animations.
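For those curious, the keyframe technique mentioned above can be sketched in a few lines. This is a minimal, hypothetical example (the `fade-in` and `.slide-enter` names are ours for illustration, not part of the slider code in this tutorial):

```css
/* Hypothetical sketch: fading a slide in with pure CSS keyframes */
@-webkit-keyframes fade-in {
  from { opacity: 0; }
  to   { opacity: 1; }
}
@keyframes fade-in {
  from { opacity: 0; }
  to   { opacity: 1; }
}

/* Apply the animation to an element as it becomes visible */
.slide-enter {
  -webkit-animation: fade-in 0.5s ease-in-out;
  animation: fade-in 0.5s ease-in-out;
}
```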


This code uses a custom grid system we developed; for more information, read how we developed a responsive grid system. If you do not want to use a grid, you may need to include additional rules to constrain the images and their containing element. It is possible to reduce the "div-itis" that grid systems can suffer from by creating more semantic markup via LESS.
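As a rough sketch of that LESS approach, presentational grid classes can be folded into semantic selectors with a mixin. The `.column` mixin and `.slider-region` class here are hypothetical; a real grid system would supply its own mixins:

```less
// Hypothetical sketch: a simple grid mixin applied to a semantic class,
// so the markup does not need presentational class="col-12" wrappers
.column(@span; @total: 12) {
  float: left;
  width: percentage(@span / @total);
}

.slider-region {
  .column(12); // behaves like a col-12 element without the extra div
}
```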

<div class="row">
	<div class="col-12">

		<div class="slider flexslider">
			<ul class="slides">
				<li id="slide-01">
					<img src="assets/img/slide-01.jpg" alt="Slide 01 Description">
				</li>
				<li id="slide-02">
					<img src="assets/img/slide-02.jpg" alt="Slide 02 Description">
					<span class="temp">My Caption</span>
				</li>
				<li id="slide-04">
					<img src="assets/img/slide-blank.jpg" alt="Slide 03 Description">
					<span class="temp">My Other Caption</span>
				</li>
				<li id="slide-03">
					<img src="assets/img/slide-03.jpg" alt="Slide 04 Description">
					<p class="flex-caption">Winning combination.</p>
				</li>
			</ul>
		</div><!-- /.slider -->

	</div><!-- /.col-12 -->
</div><!-- /.row -->

<div class="row">
	<div class="col-12">

		<div class="pager">
			<ul>
				<li><a href="#slide-01"></a></li>
				<li><a href="#slide-02"></a></li>
				<li><a href="#slide-04"></a></li>
				<li><a href="#slide-03"></a></li>
			</ul>
		</div><!-- /.pager -->

	</div><!-- /.col-12 -->
</div><!-- /.row -->


Using newer CSS combinators, we improve our ability to target elements with Javascript-like precision to make the pure CSS and HTML slider function.

The core of the styling is founded on hiding the <li> elements, except for the default slide (the last list item in the markup) and active slide.

The min-height is not required if you are only displaying images. But because the slider can handle other HTML, such as blocks of text, it is best to set a minimum height (and adjust it in media queries) so that image and non-image slides stay the same height and the elements below the slider do not shift vertically when slide heights would otherwise vary. Alternatively, you could create a blank transparent slide as a placeholder; HTML elements could then be floated into place to overlay the pseudo-slide image.

@pagerHeight: 25px;
@slideMinHeight: 125px; // mobile viewport

.slider {
  width: 100%;
  background: #ccc;
  min-height: @slideMinHeight; // fine tune in mqs

  ul {
    padding: 0;
    margin: 0;
    list-style-type: none;
    width: 100%;

    li {
      width: 100%;
      display: none;
    }
    li:last-of-type {
      display: block;
    }
    li:target {
      display: block;
    }
    li:target ~ li:last-of-type {
      display: none;
    }
    li#slide-04 {
      background: #f7f7f7;
      width: 100%;
    }
  }
  img {
    display: block;
    width: 100%;
  }
}

.pager {

  ul {
    list-style-type: none;
    margin: 0;
    padding: 0;
    text-align: center;

    li {
      display: inline-block;
      text-align: center;
      margin: 0 .1em;
      padding: 0;
      width: @pagerHeight;
      height: @pagerHeight;

      a {
        background: #888;
        outline: 0;
        display: inline-block;
        width: 100%;
        height: 100%;
        .gradient(#ccc, #777);
        .box-shadow(inset .01 * @pagerHeight .05 * @pagerHeight .1 * @pagerHeight #777);
      }
      a.active {
        .gradient(#ccc, #aaa) !important;
      }
    }
  }
}
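
Since the base min-height targets the mobile viewport, larger viewports can be tuned with media queries. A minimal sketch (the breakpoint and height values here are placeholders, not from our demo):

```less
// Sketch: fine-tune the slide min-height per viewport.
// Breakpoints and heights below are hypothetical; tune to your design.
@media only screen and (min-width: 768px) {
  .slider {
    min-height: 250px; // tablet and up
  }
}
@media only screen and (min-width: 1024px) {
  .slider {
    min-height: 350px; // desktop
  }
}
```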

Javascript (Via jQuery)

With the HTML and CSS laid out, we can now enhance the slider with Javascript. Not wanting to re-invent the wheel, we decided to implement the Flexslider jQuery plugin from WooThemes. Flexslider is easy to use, includes key options, has decent documentation and it is free!

Flexslider is used to enhance the slideshow with "auto play" functionality and transition effects (fade or slide). The slider also supports swipe gestures, as well as keyboard and mouse wheel options to control the slides. The jQuery selector used here is window, as opposed to the typical document object. We have configured Flexslider to use our navigation; however, you can use the slider's built-in (default) functionality too. Our navigation needs to function without Javascript, so it makes sense to reuse what we have already written and to make styling the navigation buttons easier with CSS.

$(window).ready(function() {
  $('.flexslider').flexslider({
    animation: "fade",
    slideDirection: "horizontal",
    slideshow: true,
    slideshowSpeed: 5000,
    animationDuration: 900,
    directionNav: false,
    controlNav: true,
    keyboardNav: true,
    mousewheel: false,
    prevText: "Previous",
    nextText: "Next",
    pausePlay: false,
    pauseText: 'Pause',
    playText: 'Play',
    randomize: false,
    slideToStart: 0,
    animationLoop: true,
    pauseOnAction: true,
    pauseOnHover: true,
    controlsContainer: ".pager",
    manualControls: "ul li a",
    start: function(){},
    before: function(){},
    after: function(){},
    end: function(){}
  });
});



We had difficulty in particular sorting out the scope of the General Sibling Combinator. The conclusion we arrived at was to move the default slide to the end of the unordered list, as :target ~ element will not grab preceding elements (from what we understand, it only finds succeeding elements in the document flow). This was necessary for our solution to hide the default slide.
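
A minimal illustration of that scoping rule (hypothetical ids, not our slider markup):

```css
/* Given <li id="a">, <li id="b"> and <li id="c"> in that order: */
#b:target ~ li { display: none; } /* matches #c only, never #a   */
/* No CSS combinator selects a preceding sibling like #a, which  */
/* is why the default slide must be the last item in the list.   */
```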

Pay attention when using fluid layouts where the layout width can expand beyond the actual image width, as the image will keep expanding and lose resolution. Extremely large viewports are not our target audience, but the slider can be constrained by the grid system (column width), Javascript or a CSS media query if need be. It is also possible to have a hybrid layout where portions use the fixed grid and other portions use the fluid grid. In this case, the slider could be put in a fixed grid while the remaining content could be put into the fluid system. Regardless, all images should be optimized to reduce file size, thereby improving page load speed, especially since so many users access the web on mobile devices while in a rush or on limited data plans. Google also uses page load speed in their ranking algorithm.

Also worth mentioning: the descendant selectors used to style the slideshow may be overkill, and it is probably better to be only as specific as necessary. For example, you don't need to target an anchor using .pager ul li a when .pager a would suffice (we wrote it this way to convey the DOM hierarchy).
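
For example, both of these rules match the same anchors, but the second has lower specificity and is easier to override later:

```css
/* Over-specified: conveys the DOM hierarchy, but costly to override */
.pager ul li a { background: #888; }

/* Sufficient: matches the same elements with a simpler selector */
.pager a { background: #888; }
```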

Extending The Slider

The concepts presented here should allow for the creation of "tabbed" content as well. This is a great way to recycle available page real-estate that will function with or without Javascript.
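
A rough sketch of how the same :target technique could drive tabbed content (the class names and ids here are hypothetical); the panes would be hidden and shown with the same :last-of-type and :target rules used for the slides, with the last pane acting as the default:

```html
<div class="tabs">
	<ul class="tab-panes">
		<li id="tab-a"><p>Pane A content</p></li>
		<li id="tab-b"><p>Pane B content (the default, listed last)</p></li>
	</ul>
	<ul class="tab-nav">
		<li><a href="#tab-a">Tab A</a></li>
		<li><a href="#tab-b">Tab B</a></li>
	</ul>
</div>
```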

The Organic Grid System was also modified to include appending and prepending classes that add leading and trailing whitespace to a grid column by adding left and right padding respectively. For example, this is useful to create a smaller content slider that is centered on the page, by adding the classes col-8 prepend-2 append-2 (in our example).
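
Those appending and prepending mixins are not shown in this article, but the idea can be sketched like this (the mixin names are hypothetical, reusing the @colCombo variable from our grid write-up; with the default content-box model the padding adds to the column width, so col-8 plus prepend-2 and append-2 still fills a 12-column row):

```less
// Sketch: leading/trailing whitespace measured in grid columns.
.doPrepend(@span) {
  padding-left: 1px * (@span * @colCombo);
}
.doAppend(@span) {
  padding-right: 1px * (@span * @colCombo);
}

// Hypothetical usage: a centered 8-column slider.
.slider-column {
  .doCol(8);
  .doPrepend(2);
  .doAppend(2);
}
```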

It is also probably a good idea to help out older browsers with something like Modernizr to support HTML5 and CSS3.

Live Demo and Source Code Download

The outcome of this tutorial can be seen in our live demo and the source code is available for download. To understand what has been built, resize your browser window using a laptop or desktop computer (for each demo). Also, try disabling Javascript to see how the slider still functions without the autoplay and fade effects. The code may be updated periodically and differ slightly from what is presented in this article, so be sure to check the source if you have any questions. Let us know if you implement these techniques in your upcoming project! Thanks for reading.

Respect and Resources

  • WooThemes Flexslider
    One of the hundreds or thousands of slideshows or content sliders out there. We have put many hours of research into finding a great slider that does what we are after with little effort. We have used many sliders before, most notably the jQuery Cycle Plugin, but in the end we chose Flexslider for its simplicity and focus on responsive design.
  • Joshua Johnson at Design Shack wrote Build a Pure CSS Slideshow With Webkit Keyframes
    This site is awesome and a great source of inspiration. This particular tutorial uses CSS3 Keyframes to animate the slideshow which is interesting, but not widely supported by all modern browsers.
  • Another article written by Joshua What's the Deal With :Target in CSS?
    This article helped found the technique used in our tutorial.
  • Harry Roberts at CSS Wizardry wrote Fully fluid, responsive CSS carousel
    Nice (responsive) site with useful information conveyed using succinct explanations. We dig it. This tutorial has a nice visual of the magic CSS hidden pane "wrapper". The photos credited to Suze in the demo are pretty solid too. Nice work.
  • Robert McIntosh's Pure CSS3 Slideshow
    Great example of the Ken Burns effect that really showcases the new(er) power of CSS.
  • Alessio Atzeni wrote A Pure CSS3 Cycling Slideshow for Smashing Magazine
    This comprehensive tutorial includes animation, progress bar, pausing and tooltips.
  • Jeffrey Way instructs us about The 30 CSS Selectors You Must Memorize
    Another great reference from Nettuts; as usual, this article does not disappoint. A good bookmark.

Sphenisciphobia or the “Fear of Penguins.”

Google releases hundreds of new and updated algorithmic changes every year. In fact, they reported 516 changes in 2010 and for a more recent example, 53 in the month of April alone. This is standard for the world’s largest search engine and most of these changes go unnoticed and unpublicized. It’s when these updates are significant enough to warrant their own animal moniker, such as “Panda” or “Penguin” that website owners and those in the search industry start to take notice, and in some cases, panic.

On April 24th of 2012, Google launched what has come to be known as the ‘Penguin’ update. Originally called the “webspam update,” this change has created a wave of nervous webmasters and speculative SEOs - especially in the wake of 2011’s famous Panda/Farmer updates, which affected roughly 12% of search results and had major implications for a number of businesses.

Before we get into the exact details of what we know about the Penguin update, let’s take a look at search algorithm changes in general in an attempt to put a few things in perspective. A topical look at their methodology and objectives helps us to understand how search works and evolves, and how we can tailor our strategy to maximize exposure within the search engine results pages (SERPs).

Google has indexed billions of web pages for which it returns search results to the average web surfer. For this reason a manual, or human review of the quality and relevance of each website is simply a logistical impracticality for Google. Therefore, mathematical, computerized algorithms are introduced to streamline and automate this process. Google uses automated processes to “crawl” and “index” pages from the web, understand what they’re about and place them in a predetermined order for web searchers based on their relevance and authority in relationship to specific search terms people are using to find information on certain topics. Reasonable enough?

The Penguin update is one such change among many introduced in 2012 but unlike many of the minor changes, is noteworthy for several reasons of which we will now expand upon. But first, it is important to note that the Penguin update is an algorithmic change and not a manual change. In practical terms what this means is that Google is attempting to classify websites on a large, automated scale and does not have the time, nor energy to quibble about whether your site was affected by the change or not. There are no human eyes looking at your website and saying “Gee, I don’t think this site is trusted, or relevant in regards to this particular search term.” Rather, a computerized process is looking for signals within your website to determine just how applicable your content is to web surfers looking for information on a particular subject and it is the algorithm making the decision.

The implications of an algorithmic change are often related to the level of recourse that can be taken. In the case of the Penguin update, one cannot file a “reconsideration request” but rather must fill out a “feedback” form made available by Google. This also means you must be patient: make the necessary changes to your website and then wait for traffic to return.

What We Know So Far

Here’s what we know so far about the Penguin update, which according to Google has affected a little over 3% of English language search results.

  • Launched to combat keyword stuffing and cloaked pages.
  • Considered a “success” by Google.
  • In some cases could result in a penalty, or just a devaluation.
  • The Penguin update is a ‘filter’ that is run periodically, occurring outside of the main index (or in addition to day-to-day spam detecting processes.) So if you think your site has been hit by the Penguin update be sure to keep an eye out for updates to see if your rankings improve.
  • It is fully live, so look at traffic to your site from Google after launch to see if there was an increase or decrease.
  • Many were worried that the update was meant to penalize over-optimized sites, but rather the update seems to be targeting straight-up spam.
  • Driving force has not been on-page but rather back-links.
  • The update heavily targets comment spam with exact match anchor text. So, instead of using your name like you’re supposed to when commenting on blog posts, those manipulating anchor text have been affected.
  • The update also targets guest posts on sites setup to generate income from such posts.

What is really interesting about this update, as Rand Fishkin points out in SEOmoz’s Google+ Whiteboard Friday, is that the change is actually not meant to improve search results per se but rather to penalize those falling outside of Google’s Webmaster Guidelines. This is one of the rare updates where Google is flexing its muscles and, for lack of a better phrase, attempting to keep site owners ‘in check.’ The timing is noteworthy in that a company controlling an overwhelming majority of the search marketplace is strategically deciding to fire off a warning flare that could be interpreted as “we own search and you are going to play within our rules.” In effect, they are willing to sacrifice short-term search results in favor of a long-term culture change, i.e., more webmasters following Google’s Webmaster Guidelines.

The Key Takeaway

For every site that lost rankings, someone gained and we often won’t hear from those who gained from the Penguin update because they have nothing to complain about. From the anecdotal evidence bandied around the SEO community, the update targets mainly low-quality, or paid-for links which exploited anchor text. If you haven’t bought links with product-heavy anchor text you should be fine.

A (Rough) Troubleshooting Timeline

The following is meant to provide a rough timeline by which to troubleshoot potential issues related to algorithmic changes over the first several months of 2012. If, while checking your site’s analytics, you see a drop in search-related traffic:

  • Shortly after April 24th: You may have been affected by the Penguin update.
  • Between April 19th and April 24th: You may have been affected by the Panda 3.4 update.
  • Between April 17th and April 19th: It may be Google’s incorrect classification of parked domains to blame. Your traffic should have returned by now.
  • Additionally, Google mentioned 53 other algorithmic changes in April so it may be difficult to pinpoint the exact culprit but through a little investigation and experimentation you should be able to develop a plan for identifying what needs to be fixed.

Suggested Post-Penguin Actions for the Afflicted

  1. It’s probably a good excuse to conduct a complete site audit. Review your most recent keyword strategy, your rankings and analytics, and clean up your on-page SEO. Chances are you are probably sitting on under-performing ‘stuffed’ keywords that are muddying up the natural language progression of your page and can either be cleaned up, replaced or thrown out altogether!
  2. As mentioned above: stay on top-of Penguin updates.
  3. You can’t file a “reconsideration request” but if you feel like your site has suffered from the Penguin update you can file a report using this form.
  4. Adjust the anchor text you have control over to be more natural and diverse.
  5. Check messages in Google Webmaster Central for spam notices.
  6. If you’re totally screwed, start over with a new site.
  7. Remember: There are no quick fixes! Be patient.

Lastly, remember that the Penguin update is relatively new and most of the information we have on it is speculative and anecdotal. We should know more as time goes by and updates will be posted.

As always, if you’re still unsure or have questions call (714.485.9412) or contact us!

Rolling Your Own Responsive Grid System With LESS

We have been studying Responsive Web Design and the mobile first mentality as the statistics on mobile usage are crazy! Hundreds of devices now access the web, from the traditional desktops and laptops to smartphones, tablets, game systems and TVs. This poses interesting challenges for designers as the viewport sizes of these devices range dramatically.

One solution is to have one site adapt itself to the various viewports using the power of CSS3 media queries placed atop of a flexible grid system. Conceptually this is not too complicated, but implementation can be fairly tricky, in both the design phase and writing the actual code. There are many existing open source grid systems in the wild, but to really understand how they work we wanted to get our hands dirty and create our own.

This article aims to dissect our experience and provide a toolkit that will help enable you to do the same. It does not provide a step-by-step tutorial due to the prerequisites involved (and because we are feeling light-headed on Day 6 of The Master Cleanse detox!).

Let's start by outlining the goals of our grid system, dubbed The Organic Grid.

Our Goals

  • Use the LESS CSS preprocessor to handle math
  • Two grid modes, either fixed or fluid
  • Easily count columns to tailor the grid as opposed to thinking in pixels and percentages
  • Variable number of columns
  • Variable column width
  • Variable column margin
  • Offset (push and pull) columns
  • Easily integrate CSS3 media queries to manipulate the grid for different device viewports
  • Quickly generate base classes for non-semantic HTML (traditional usage)
  • Use LESS to generate project specific styles for semantic HTML or for less common circumstances such as nesting
  • Avoid the requirement of special classes that nudge the first and last columns into alignment
  • Keep the system light and modular to be used as a stand alone or integrated into a workflow or framework

Building The Organic Grid

We start by declaring a handful of variables that specify the grid structure. The variables are named intuitively to indicate the value they should contain.

  • @colCount is the total number of columns in the grid
  • @colWidth is the width of each column without the margin (gutter)
  • @colMargin is the margin of each column; half is applied to each side of a given column
  • @colCombo is the sum of the column width and margin and is not necessary per se, but simplifies the mixins by reducing the parentheses and operators
  • @gridWidth is the total width of the grid as specified in the above variables. This variable is also not a requirement, but improves readability of the mixins

Now let's assign the variables some typical values that create the common 960 pixel grid. We'll use this as a target to measure our development against, particularly to verify the LESS mixin calculations.

@colCount:  12;
@colWidth:  60;
@colMargin: 20;
@colCombo:  @colWidth + @colMargin;
@gridWidth: @colCount * @colCombo;

The value of @gridWidth at this point is 12 * (60 + 20) = 960. Although the units are in pixels, they are not specified (yet), as one requirement of this particular design is to function in both a fixed width pixel mode and a percentage mode.

Like most grids, The Organic Grid consists of a series of rows and columns. Every column is contained by a row, and the widths of all the columns in a given row must sum to the width of the row that contains them (except when using offsets). All this really entails is counting columns.

For example a row with a width of @colCount or 12 must contain some combination of columns that add to 12, such as, an 8 width column and a 4 width column (the Golden Ratio).

This is important, all rows are a certain width (measured in columns) and the width of the columns contained need to sum up to match the row width.
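
In markup, counting columns looks like this (non-semantic class names following the usual col-N convention):

```html
<div class="row">
	<div class="col-8">Main content spanning 8 columns</div>
	<div class="col-4">Sidebar spanning 4 columns</div>
</div><!-- 8 + 4 = 12, matching @colCount -->
```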

With the variables in place we are ready to develop our mixins that actually use the variable values. It is worth mentioning here that mixins, like creating functions in other languages, can be defined with parameters and these parameters may contain default values. Also, be aware of the variable's scope (local vs global).

We will create two versions of each mixin, one for fixed width pixel mode and the other for fluid percentage based mode. We considered creating one version of each mixin with an argument that would accept a unit of measurement (px or %), but the code becomes more difficult to understand and troubleshoot. So even though this may violate DRY principles, it made life easier!

We will detail the basics of the fixed width methodology here, slightly modified to ease the learning curve. Then we will follow up with a short summary of the logic behind the fluid width mixins.

.doRow() {
    margin: 0 (1px * -0.5 * @colMargin);
}

.doCol(@span) {
    width: 1px * ((@span * @colCombo) - @colMargin);
    margin: 0 (1px * 0.5 * @colMargin);
}

.doOffset(@span) {
    margin-left: 1px * ((@span * @colCombo) + (0.5 * @colMargin));
}

The .doRow() mixin sets a negative margin on each row that shifts the row left by half of the @colMargin value. This technique is used to avoid targeting the first and last columns and nudging them into alignment. The code is straightforward.

Next, the .doCol() mixin does two things. First it sets the width of each column as determined by the @span passed as a parameter. So .doCol(6) would create a column spanning 6, the equivalent of 460px: 6 * (60 + 20) totals 480, then subtracting the 20px margin leaves 460.

This mixin then sets the column margin on each side of it, which in this case would yield the equivalent of 10px on the left and 10px on the right as determined by @colMargin global variable being divided by two.

The final .doOffset() mixin completes the fixed width system by giving a column more or less left margin, creating whitespace or reversing element order for SEO. The offset also takes a @span parameter, but this time @span expects either a positive or negative integer to indicate how far and in which direction the column should move. Using our current example, .doOffset(3) would push a column the equivalent of 250px to the right, as determined by (3 * (60 + 20)) + (20 / 2). A negative value will pull the column to the left. Note that in either case this affects any sibling columns in the document flow.
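
With the example values above (60px columns, 20px gutters), the fixed-width mixins would compile to CSS roughly like this (class names are hypothetical):

```css
.row      { margin: 0 -10px; }              /* .doRow():     -0.5 * 20         */
.col-6    { width: 460px; margin: 0 10px; } /* .doCol(6):    6 * 80 - 20 = 460 */
.offset-3 { margin-left: 250px; }           /* .doOffset(3): 3 * 80 + 10 = 250 */
```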

The fluid grid needs to take the pixel widths and convert them to percentages. It does this using Ethan Marcotte's simply phrased formula: target divided by context equals result.

The concept is the same as above, except the parent container's width (in terms of number of columns) needs to be supplied to the mixins. Much of the time this will be 12, the value of @colCount in our example. Instead of providing detail, let's summarize by saying the main difference in the math is that the supplied @parentSpan is used in place of @colCount, and each mixin name has the word Fluid appended. Putting this process into words is probably more difficult than looking at the code.

.doRowFluid(@parentSpan: @colCount) {
    margin: 0 (-50% * @colMargin) / (@parentSpan * @colCombo);
}

.doColFluid(@span, @parentSpan: @colCount) {
    width: 100% * ((@span * @colCombo) - @colMargin) / (@parentSpan * @colCombo);
    margin: 0 (50% * @colMargin) / (@parentSpan * @colCombo);
}

.doOffsetFluid(@span, @parentSpan) {
    margin-left: 100% * ((@span * @colCombo) + (0.5 * @colMargin)) / (@parentSpan * @colCombo);
}
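
To sanity-check the fluid math outside of LESS, here is a small plain-Javascript sketch (the function name is ours, not part of the grid) that mirrors the .doColFluid() width calculation:

```javascript
// Mirrors .doColFluid(): 100% * ((span * combo) - margin) / (parentSpan * combo)
function fluidColWidth(span, parentSpan, colWidth, colMargin) {
  var colCombo = colWidth + colMargin; // 60 + 20 = 80 in our example
  return 100 * ((span * colCombo) - colMargin) / (parentSpan * colCombo);
}

// A 6-of-12 column in the 960 grid is (6 * 80 - 20) / 960 = 460 / 960,
// roughly 47.9167% - the fluid twin of the fixed grid's 460px column.
```

A full-width column (span 12 of 12) likewise works out to 940/960, matching the 940px column of the fixed grid.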

Nesting the grid within a grid simply consists of the same process of creating any ordinary row of columns, except the base reference of @colCount (12 columns in our example) becomes an ancestor, not a parent, of the nested rows and columns. Therefore, you need to pass its parent's column count as the @parentSpan in the .doRowFluid() mixin. As explained before, the widths of the nested columns will need to sum to match the parent's width.
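
For example, nesting two half-width columns inside an 8-column parent might look like this in LESS (the selector names are hypothetical):

```less
// Parent: an ordinary 8-of-12 column (@parentSpan defaults to @colCount).
.content { .doColFluid(8); }

// Nested row and columns: the context is now 8, not 12.
.content .row  { .doRowFluid(8); }
.content .half { .doColFluid(4, 8); } // 4 + 4 = 8 matches the parent's span
```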

Nesting is by far the most complex part of this process, but with a bit of practice you should be able to wrap your head around it.

The Organic Grid is an infant and yet to be battle tested, but serves as a starting point. There are browser rounding issues which will impact your design regardless of any percentage-based or sub-pixel grid system (as far as we know) because there is not a rounding standard applied by browser vendors. Opera suffers the worst of those we tested, but version 12 (alpha) greatly improves this.

Well, this concludes the write-up; hopefully you find the approach useful and learned a thing or two. The code is still being polished, but we wanted to put our collection of thoughts and notes together into something useful that may help others. Let us know if you use The Organic Grid in any of your projects or have taken it apart and modified it!

If nothing else check out LESS and CodeKit (Mac application) as they have drastically altered our workflow.

We may write a followup with details as to how to use The Organic Grid specifically. An RSS feed is setup if you would like to be notified when future news is released.

Respect And Resources

  • Thanks to Jussi Jokinen for feedback and ideas presented in the Centage! framework
  • Joshua Johnson of Design Shack has a great step by step tutorial to create a fixed width with LESS
  • The amazing publications from A Book Apart that succinctly explain modern web design in exciting ways!
  • Tyler Tate's work, The 1KB CSS Grid, is an excellent light weight grid with clever use of negative margin on the parent to align children. The Semantic Grid system, which as the name suggests, keeps purists happy by moving the presentational classes from HTML to CSS (LESS), which creates semantic and tidy HTML. If you decide to skip all this complexity and give the middle finger to the future, we suggest running with The 1KB CSS Grid.
  • Nathan Smith's old faithful, The 960 Grid System, deserves a shout out, especially for providing the Photoshop, OmniGraffle and many other templates. It has been our standard for some time.
  • Knight Digital Media Center has an excellent tutorial to create a grid in Photoshop for pre-production. Once your grid is created you can then save a cropped 1px version to be used as a background image that repeats on the y axis to underlay the grid elements and assist in alignment during development.
  • Also, we would not have attempted any of this without Alexis Sellier's LESS extension of CSS and Bryan Jones's LESS.app (Mac) and its successor CodeKit, which compiles LESS into CSS and, among other useful things, updates your CSS files and browser whenever a .less file changes, for more realtime debugging. The LESS.app is free, but we recommend getting CodeKit instead, which comes with a small price tag; it will help the developer keep both apps running as smoothly as they currently do!
  • A nice finishing touch is David Cochran's CSS3 Media Query Reporter, which appends the "pixel value range" of the browser's current size (specified in your media queries) to the bottom of the page. This is a pretty sweet tool to identify when a breakpoint has been triggered.
  • The folks at Twitter Bootstrap have made the code available to you and it is very impressive! We really liked the application of a bootstrap file in their architecture. This bootstrap file (commonly used in the Model, View, Controller design pattern) funnels action through one file and can be adopted into your site's CSS setup.
    Use LESS to import other LESS files before compiling them into one CSS file. This may have an advantage over pure CSS @import because each import requires a resource request from the server. Because LESS is compiled, the imports are resolved before the CSS file is generated, thereby saving requests, which again should improve site performance. But someone will have to double-check that!
  • Chris Coyier's snippet summarizing Media Query for Standard Devices.
  • A table summarizing media query browser support is found at When Can I Use, and using something like Modernizr helps older browsers handle newer features.
  • Of course a hat tip to Ethan Marcotte for helping pave our digital future!

These ideas are certainly not all original, we owe a lot to the great community out there sharing their work and ideas! Please see the code in the downloadable file for examples and checkout the live demo to view The Organic Grid in action!

Go Kings!

Google’s New Privacy Policy Launched March 1st

We are now nine days into Google’s new Privacy Policy, and some may say nine days closer to a more privatized version of Orwell’s 1984. Others are enthusiastic about the marketing opportunities and the ease of finding the information and products you are thought (probably quite accurately) to be searching for. Wherever you land on this issue, Google’s assertion in a blog post on January 24th that “This stuff matters...” is probably the best way of paraphrasing it.

Basically, what is happening is that Google is collecting and consolidating data from all the Google services you use to make an assumption, or an educated guess as to what kind of ads and search results to return to you based on that data. This includes Google products like your Google Calendar, Gmail account and contacts, and your Google Docs.

So, for example, an email you wrote to your cousin Jane from your Gmail account about how wonderful the trip to Joshua Tree was, might result in “Rock Climbing Adventure Packages” ads in the paid ads section of your Google search results. This is a simplified example of some very complicated data collection and utilization practices going on behind the scenes, and again, across all of your Google accounts. Below is Google’s own video explanation.

If this leaves you feeling a bit uneasy there are steps you can take to minimize the data that is collected such as turning off the setting that allows Google to record your search data (www.google.com/history), or make sure your Ads Preference Manager, if enabled, is configured the way you want it. Be aware of what you’re searching for and what is collected. Login only when necessary and be sure to logout when you are finished. A majority of Google products and services can be accessed without ever signing in.

In the end if you feel more comfortable using a different search engine with a privacy policy more to your liking you can use Google’s own Data Liberation Front to export your personal data from Google’s products (http://www.dataliberation.org/).

Leaving aside the controversial, albeit very relevant, aspects of this ongoing internet privacy debate, the impact on SEO (Search Engine Optimization) cannot be ignored. We believe this signals a noteworthy shift by Google towards including more SMM (Social Media Marketing) signals in their search algorithms. By collecting data from Google Plus, YouTube, Gmail and other social services, the results being returned to web surfers by the world’s largest (by far) search engine rely less heavily on on-page optimization and traditional link building (although those are still very important factors).

Companies looking to promote a product or service cannot ignore the impact of social, or viral marketing due to the ever-evolving “personalization” of search. So, again, the playing field shifts slightly and we will certainly be following this issue closely and reporting back to you as this recent development unfolds and its impact is assessed.
