Archive for August, 2011

Art Meets Science With A/B Testing – Part 2

Friday, August 26th, 2011

In Part 1 of this post, we introduced the new crop of cloud-based providers of A/B testing solutions. To recap quickly, Optimizely, Unbounce and other providers of such solutions do all the heavy lifting on the cloud, freeing up website designers to focus on the business side of optimizing their web pages. Designers don’t need to do any programming: by adding a single line of code to their websites, they trigger the A/B test and leave the rest to the service provider. Some providers, like Optimizely, offer many popular metrics out of the box, viz. engagement (the percentage of visitors who clicked any part of the experiment page) and signup (the percentage of visitors who triggered Sign Up).
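The engagement and signup metrics defined above are easy to picture as tallies over a per-visitor event log. Here is a minimal sketch in Python; the log and field names are hypothetical illustrations, not Optimizely’s actual data model:

```python
# Sketch: computing engagement and signup rates from per-visitor events.
# The event log below is hypothetical illustration data.

def metrics(visitors):
    """visitors: list of dicts with boolean 'clicked' and 'signed_up' flags."""
    n = len(visitors)
    engagement = sum(v["clicked"] for v in visitors) / n   # clicked anywhere on the page
    signup = sum(v["signed_up"] for v in visitors) / n     # triggered Sign Up
    return engagement, signup

log = [
    {"clicked": True,  "signed_up": True},
    {"clicked": True,  "signed_up": False},
    {"clicked": False, "signed_up": False},
    {"clicked": True,  "signed_up": False},
]
eng, sign = metrics(log)
print(f"engagement={eng:.0%}, signup={sign:.0%}")  # engagement=75%, signup=25%
```

With real traffic, the same tallies would simply run over thousands of visitor records for each version of the page.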

[Image: EMAIL360 Original Page]

We were recently curious to find out whether there was any scope for improvement in the text and color of the CTA (call to action) element on the homepage of our EMAIL360 website. As the screenshot on the right shows, the original page had the somewhat geeky “Get this widget!” text inside a somewhat staid gray button. This probably echoed the flawed assumption of the original design team that EMAIL360 would be used by programmers who knew what a widget meant and didn’t care about aesthetics. However, we had no intention of restricting EMAIL360 to the geek squad. In fact, it has more value for small and medium businesses that lack much IT support. Therefore, we started wondering whether changing the text to something less technical (e.g. “Sign up!”) and the color to something more attractive (orange / blue) would make a significant difference to conversion, i.e. the percentage of visitors who actually entered their details and clicked the button to collect the widget code. Since there was no definitive answer to this subjective question, we decided to carry out an A/B test.

Put off by Wingify / Visual Website Optimizer, which was the first vendor we contacted, we used Optimizely for our A/B test. Our test covered the original page and four variants of button text and color, i.e. five versions in total. To illustrate, Variant 4 matched the original page’s text (“Get this widget!”) but sported a more attractive blue color.

[Image: Original Page versus Variant 4]

As promised by Optimizely on its website, we didn’t have to do any programming to specify the different versions of the page. We used five metrics, including the most important one, namely sign-up. We ran the experiment for around two weeks, which was the time it took for the results to reach statistical significance.

[Image: A/B Test Results]

On the engagement metric (the percentage of visitors who clicked any part of the experiment page), all five versions performed fairly similarly. However, when we looked at the key signup metric (the percentage of visitors who clicked the “Get this widget!” button), we found a significant difference between the five versions. Our original page had a paltry 5.8% conversion, whereas Variant 4 delivered nearly 3X that at 16.7%.
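For readers wondering how such results are judged to be statistically significant, the standard tool is a two-proportion z-test. The sketch below assumes hypothetical visitor counts; only the 5.8% and 16.7% conversion rates come from our experiment:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two observed conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)                  # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))    # pooled standard error
    return (p_b - p_a) / se

# Hypothetical sample sizes; 29/500 = 5.8% and 84/503 = 16.7% match the rates above.
z = two_proportion_z(conv_a=29, n_a=500, conv_b=84, n_b=503)
print(f"z = {z:.2f}")  # |z| > 1.96 means significant at the 95% confidence level
```

A cloud testing service runs essentially this comparison continuously and tells you when the experiment has collected enough traffic to call a winner.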

The writing was on the wall. Thanks to Optimizely, we acquired a strong scientific basis on which to take the following decisions:

  • Don’t change the button text
  • Change the button color

We implemented the above change to the homepage of EMAIL360 and have since seen a sharp uplift in customers.

It’s worth noting at this point that the incremental customers we acquired after our A/B testing came with absolutely no increase in website traffic. In other words, A/B testing fits into the CONVERT – rather than ATTRACT – stage of inbound marketing and logically forms a part of our suite of Frictionless Online Interaction Solutions.

We’d like to take this opportunity to laud Optimizely for its excellent understanding of our situation and its prompt support at all crucial times during the experiment. The next time we intend to do A/B testing for ourselves or our customers, we won’t look anywhere else. For the value Optimizely delivers, its price is hardly a pocket-pincher. For vendors like Wingify / Visual Website Optimizer, who bungle at the very first base and seem to lack any go-to-market theme other than low cost, our experience should serve as an eye-opener: genuine customers will always pay for value, and won’t buy products that lack value even if their sellers give them away for free.

Innovations At A Click-And-Mortar Library

Friday, August 19th, 2011

I’ve traditionally been buying books regularly to feed my self-admittedly voracious reading habit. A few months ago, I started running into a severe space crunch in my bookshelf and therefore decided to borrow books instead of buying them.

However, my experience with book lending libraries hasn’t been anything to write home about. Over the last 12-18 months, I’ve tried and given up on two online libraries, Librarywala.com and BooksAtHome.in. I’ve written about my woes with the former in a previous post (in short, erratic logistics and q’jacking). While the latter has decent logistics and is insulated from q’jacking since it doesn’t use a queue, its collection of books simply sucks. It accepts recommendations for new books from its members but doesn’t seem to do anything about them for weeks. My plan entitles me to borrow two books at a time. When I recently found it impossible for the third time in a row to find the second book to add to my shopping cart – which is what its website uses in the absence of the queueing feature that’s common to leading online rental companies – I decided to bid farewell to BooksAtHome.in.

At about this time, I came across JustBooks. From its newspaper insert, this library seemed to blend the online and brick-and-mortar worlds in an innovative fashion. A trip to its nearest store confirmed my first impression.

Unlike the aforementioned online libraries, which deliver and pick up books from your home/office based on your online actions, you need to visit a physical JustBooks store to borrow and return books (unless you sign up for its AVID READER plan – more on that in a moment).

However, JustBooks makes excellent use of technology in its stores to deliver a superior customer experience. Its membership card and books have RFID chips embedded in them. To have new books issued or to return read books, you place your membership card and the stack of books on a kiosk located at the front of the store. The kiosk recognizes you from the RFID chip embedded in the membership card and automatically logs you into your account. You simply tap the ISSUE or the RETURN button. The kiosk reads the names of all the books in the stack in one go – no scanning one barcode at a time. You confirm the list, log out and off you go. That’s it.
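The batch issue/return flow described above boils down to very simple bookkeeping once the RFID reader has identified the member and the stack of books. The sketch below is a hypothetical illustration of that idea (all class, field and tag names are invented), not JustBooks’ actual software:

```python
# Hypothetical sketch of the kiosk flow: the card tag identifies the member,
# and every book tag in the stack is processed in one batch.

class Kiosk:
    def __init__(self, accounts):
        self.accounts = accounts  # member card tag -> set of borrowed book tags

    def issue(self, card_tag, stack_tags):
        """Batch-issue every book placed on the kiosk to the card holder."""
        self.accounts[card_tag] |= set(stack_tags)
        return sorted(self.accounts[card_tag])

    def return_books(self, card_tag, stack_tags):
        """Batch-return every book in the stack from the card holder's account."""
        self.accounts[card_tag] -= set(stack_tags)
        return sorted(self.accounts[card_tag])

kiosk = Kiosk({"member-42": set()})
kiosk.issue("member-42", ["rfid-book-1", "rfid-book-2"])
print(kiosk.return_books("member-42", ["rfid-book-1"]))  # ['rfid-book-2']
```

The point of the RFID design is exactly this batching: the reader hands the software a whole list of tags at once, so there is no per-book barcode scan.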

Now, if you sign up for JustBooks’ highest-end plan, called AVID READER, you get to experience “order online, ship from in-store inventory”, one of the most cutting-edge omnichannel retailing practices according to analysts like RSR Research. Customers on this plan can order a book on JustBooks’ website and have the nearest physical store deliver it to their homes. I asked the owner of a JustBooks franchise how it could afford the cost of home delivery, especially in the low-value book lending business. She told me that JustBooks hopes to recover the higher costs through substantially higher revenues. This is entirely possible considering that the store-to-home feature is restricted to JustBooks’ most expensive membership plan. Unlike other plans, where customers pay a small amount monthly and can walk out anytime after the initial three months’ lock-in period, the costliest plan locks in the customer with an upfront payment for one full year, which results in 20X greater revenues. As an aside, this example shows that it’s possible to adopt innovative practices that fulfill customers’ needs better and, in the process, upsell and boost the topline without necessarily sacrificing the bottomline.

It’s still early days for me with JustBooks. Much as I found their process and technology innovative, I wasn’t too impressed with their staff. One of them was perpetually grumpy, another was highly patronizing, and the third one couldn’t speak English and didn’t belong in a lending library for English books. They might need one more kiosk to handle the peak rush. I don’t like that they charge a 2% ‘convenience fee’ for paying monthly fees online. They keep whining about accepting credit card payment for monthly fees even though they confirmed to me at the time of signing up that I could always pay by credit card, and also prominently display a “credit card welcome” sticker on their door. Given my not-so-great experience with the two aforementioned online libraries, I’m keeping my fingers crossed with JustBooks. But, whether I stick around with JustBooks or eventually order another bookshelf, its novel business practice and innovative use of technology must be noted. Consider it done.

Art Meets Science With A/B Testing – Part 1

Thursday, August 11th, 2011
I’d read this post when it was published last year. I recently found the need to do A/B testing and happened to read a newspaper article about Wingify / VisualWebsiteOptimizer. The article had alluded to Optimizely as a high-cost competitor of VWO. I headed to VWO, found that it lacked the most elementary performance metrics, and quickly abandoned it. I got an email from one of their people, which showed absolutely no knowledge of my experiment or of why I couldn’t set it up even though I’d already done so on their website. I said as much and expected them to take the hint and get back to me. Instead, their response was, like, duh.
I then gave up on VWO and went to Optimizely on the rebound. Thank goodness I did. Whatever my experiment needed, I found immediately on Optimizely. Dan Siroker himself helped out at times. Only now am I learning of his pedigree – Google Chrome Product Manager, Director of Analytics for the Obama Campaign, etc. It’s great of such a person to personally get involved, quickly “get” my questions and revert with crystal clear answers. Amazing!
I’m in the midst of an experiment and, assuming it proves useful, I intend to sign up with Optimizely beyond the trial period. The $11 that Optimizely charges per month is nothing compared to the value they’re giving me. Optimizely is a great example from which Wingify / VWO and their ilk should understand that low cost is not all that matters. End of the day, it’s the value that counts.

Beneath their hoods, software applications and websites may be all science, but there’s a lot of art on their surface. There’s no right answer when it comes to screen layouts, color schemes, navigation flows and other design elements. While one designer might prefer an orange button on a black background, another would swear by a white background and a blue button. Some etailers might be satisfied with basic credit card validation, fearing that CVV, VbV and other stronger authentication methods would turn off the average online shopper, whereas others might want to minimize their fraud risk and let the shopper be ferried across third-party websites by electronic payment gateways before bringing them back to their own website for the order confirmation.

Traditionally, gut feeling has played a strong role in deciding what option to choose when it comes to the aforementioned design elements.

[Image: Design A versus Design B]

However, website owners, etailers and designers are aware that even small changes in design can have a major impact on the conversion of browsers into buyers, so they’re forever looking for scientific techniques to aid their decision making instead of leaving it entirely to their gut. Traditionally, they’ve relied on surveys and focus groups. However, in today’s world, such techniques are not really adequate. Witness, for example, the fate of BORDERS and Barnes & Noble, which continue to close stores across the nation even as they took the top two spots in Forrester’s annual Customer Experience Index, which is based on surveys.

What if designers had a way to test their alternative designs using live traffic instead of relying on the verdict of a bunch of community testers?

With online A/B testing, they do. Designers select one or more conversion actions on a given web page and model two versions of that page. They split their live traffic into two parts and route one through version 1 and the other, through version 2. By measuring the conversion metrics on each version of the page, they can scientifically conclude which version fares better.

In the above illustration, a click on the “Sign up!” button by a website visitor would be an important conversion action. The two versions of the web page are Design A and Design B, which have different background and button colors as shown in the diagram. If the A/B test indicates that (say) Design B converts better – that is, has a higher percentage of visitors who clicked the “Sign up!” button – the designer can scientifically decide that Design B is the way to go and ditch Design A.
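The split-and-measure procedure described above can be sketched in a few lines of Python. The variant names and conversion probabilities below are hypothetical, chosen only to illustrate the mechanics of routing live traffic and tallying conversions per version:

```python
import random

# Sketch of A/B traffic splitting: each visitor is randomly assigned to a
# version, and conversions are tallied per version. The "true" conversion
# probabilities are hypothetical and used only to simulate visitor behavior.
random.seed(42)
true_rates = {"Design A": 0.06, "Design B": 0.17}
visits = {v: 0 for v in true_rates}
conversions = {v: 0 for v in true_rates}

for _ in range(10_000):
    version = random.choice(list(true_rates))    # 50/50 traffic split
    visits[version] += 1
    if random.random() < true_rates[version]:    # did the visitor click "Sign up!"?
        conversions[version] += 1

for v in true_rates:
    print(f"{v}: {conversions[v] / visits[v]:.1%} conversion")
```

The measured rates converge on the underlying ones as traffic accumulates, which is exactly why the winning design can be declared scientifically rather than by gut feel.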

While A/B testing as a statistical tool has been around for a long time, its use in software design has been limited. Despite early adoption by the likes of Amazon, A/B testing never went past the realm of the big e-commerce players. The reason isn’t hard to figure out: while designing an A/B test and developing multiple versions of a page (or sets of pages) are quite simple, executing the test demands close coordination between the designer and the hosting provider in order to activate the right version of the page at the right time, manually record traffic and conversions for each version, and jump through a few more hoops.

Google tried to mitigate some of the pain involved in executing A/B testing with its Google Website Optimizer launched a few years ago. However, according to anecdotal evidence, GWO didn’t go far enough and A/B testing remained too difficult for the average website owner.

No longer.

A new crop of cloud-based solutions from Optimizely and a couple of other companies makes A/B testing easy. These companies do all the heavy lifting on the cloud and free up designers to focus on the business side of optimizing their pages.

[Image: EMAIL360 Original Page]

We recently wanted to carry out an A/B test to figure out the best possible button text and color for the “Get this widget” button on the home page of our EMAIL360 website.

At about this time, we’d read about Wingify / VisualWebsiteOptimizer in a local newspaper and signed up with it. Unfortunately, VWO couldn’t track even elementary conversion actions like a button click, so we decided to abandon it. We got an email from its founder a day later. Although we’d set up our experiment on Wingify’s website, the email showed no prior knowledge of what we’d attempted to do or why we’d failed. We replied, asking them to review our experiment and tell us how to proceed. Instead, their response was, like, duh. This set major alarm bells ringing in our minds. If Wingify fumbled the basics of A/B testing, we weren’t sure how it would handle the more complicated questions that would inevitably crop up as we went deeper into the execution of our experiment, so we decided to stop dealing with it.

We then decided to check out Optimizely, a startup we had been quite impressed with when we read about its launch around a year ago. Thank goodness we did. Whatever our experiment needed, we found easily on Optimizely. Its founder, Dan Siroker, himself stepped in to help at crucial times. It was commendable that a person of his pedigree – Director of Analytics for the Obama Campaign and Google Chrome Product Manager – was personally involved throughout the process, “getting” our questions immediately and reverting with crystal clear, actionable responses within a reasonable time.

In Part 2 of this post, we’ll share the findings of the A/B testing done on the EMAIL360 website using Optimizely (Spoiler Alert: We learned that the original design had a lot of scope for improvement!). Stay tuned.

Is CROSSWORD Heading Towards A ‘BORDERS Moment’?

Monday, August 1st, 2011
I was telling the manager of a brick-and-mortar store of @crossword_book that their prices – which are not discounted from the printed list prices – were simply too high compared to the 15-30% discounts offered by online booksellers like @Flipkart. He agreed and defended his prices on the basis of higher cost structure involved in a physical store. Apparently, Crossword pays a rent of INR 3-4 Lacs per month for a typical store.
I wonder if Crossword is nearing a ‘BORDERS’ moment.
While trying to find out its Twitter handle today, I discovered that Crossword also has an online store. In fact, its online prices are lower than store prices for many books I checked.
For a brief moment, I wondered why Crossword’s store manager never told me that I could get lower prices at Crossword’s own online store when I was comparing his store prices with pure online bookstores like Flipkart. But, the answer came to me quickly: Like BORDERS, the physical store and web are probably two different channels for Crossword, and the manager I spoke to was only bothered about his channel.
I  hope Crossword, Landmark and other physical bookstores in India realize that BORDERS got thrashed by Amazon not because it didn’t have an online presence, but because its store employees never promoted its online store even at the cost of losing business to Amazon and other online competitors.

Crossword, Landmark and other physical bookstores in India would surely be aware of the fate of BORDERS, the leading American bookseller that recently filed for bankruptcy protection and closed down its stores after years of getting thrashed by Amazon and other online bookstores.

I was recently complaining to the manager of a CROSSWORD store that its prices – no discount on the list prices printed on the back covers of books – were simply too high compared to the 15-30% discounts offered by various online booksellers like @Flipkart. He agreed with me and defended his prices on the basis of the higher cost structure of a brick-and-mortar store. Apparently, Crossword pays INR 3-5 Lacs (equivalent of US$ 6,666 – 11,111) per month just in rent for a typical store in an upscale retail location. Add salaries, utilities and other costs, and I was supposed to get the drift and happily fork out the extra cash. Thankfully, he probably realized that we were well into the second decade of the 21st century, and spared me the standard spiel that many others would’ve given in such a situation about how online shopping could never match the touch-and-feel experience afforded by a physical store.

I began wondering that day if Crossword was heading towards a ‘BORDERS moment’.

Today, I became convinced that it was, indeed.

While trying to find out its Twitter handle – it’s @crossword_book, by the way – I discovered today that Crossword has an online store. In fact, its online prices were substantially lower than its store prices for many books I checked. Besides, shipping was free.

For a brief moment, I wondered why Crossword’s store manager never told me that I could buy books cheaper at Crossword’s own online store when I was grilling him about the lower prices offered by its pure-play Internet competitors. I then realized the answer to that question in a flash: Like BORDERS, the physical store and web are probably two different channels for Crossword and never the twain shall meet.

While Crossword surely knows what happened to BORDERS, it might be enlightened to note that BORDERS faced a ‘BORDERS moment’ not because it didn’t have an online presence. Despite having a website that rivalled Amazon’s in variety and prices, BORDERS sank because it had never managed to integrate its online store into its organizational psyche. Employees of BORDERS’ stores rarely promoted its website even if that meant losing business to Amazon and other online competitors.

I see parallels at Crossword, which is why I’m convinced that it is hurtling towards a BORDERS moment. But, since e-commerce is not yet mainstream in India, Crossword can easily escape BORDERS’ fate.

How? By behaving in a channel-agnostic manner in front of customers.

Will that happen by itself? No chance.

Can Crossword make it happen immediately with the wave of a magic wand? Doubtful.

Will Crossword bite the bullet and bring about the required internal transformation to make sure it happens soon enough to escape the BORDERS moment? Probably. Only time will tell.