I was once asked a couple of questions on the back channels by a longtime acquaintance, one of them being: "How come you only have positive reviews, and no benchmark numbers?"

At SmallBizWindows.com and AbsolutelyWindows.com, we have a simple review policy: we test based on the usage scenarios we envision small and medium-sized businesses would have for the product being reviewed, and we only post reviews we feel would benefit our readership.

Based on feedback, our users are not interested in benchmarks, for they can get those from other sites. What they are interested in is how the products being reviewed would fit into, and enhance, their productivity and, for our after-hours reviews, their leisure.

We oblige by bringing them usage scenarios. While we may run synthetic benchmarks, those are for internal use only.

Based on this policy, our review program takes a minimum of 90 days from inception to completion. During that time, we have multiple users test the product in multiple locations.

Due to this lengthy timeline, we shall strive to do the following:

  1. Shiny New Thing: an announcement when a product comes in.
  2. OOBE: a brief summary of what the product is, plus our impressions of the packaging and the out-of-box experience.
  3. A Week With…: after a week, time permitting, a synopsis of our experience with the product so far.
  4. The SmallBizWindows Product Review: the full review, published at the end of the review period.

In between all these, we may post updates at our discretion.

We also review more products than we post, blog about, or publish in The Interlocutor. However, our policy is not to post a review of any product that does not meet our standards, thereby never giving the vendor a chance at free publicity.

However, when a product fails to the point of being counterproductive, we will call out both the product and the vendor as a public service.

At SmallBizWindows, we get quite a few products sent to us for review. We also solicit products that catch our eye and fancy, whether through the PR blurb or by word of mouth.

While we do solicit products, we also have in our possession, at any given time, quite a few prototypes in various stages of development from several vendors.

Depending on the owner of the product, we might observe total silence, or let it be known that we have the product here.

However, we keep to agreed-upon NDAs and embargoes for the products.

All our public reviews are only of products sent to us for that very purpose, since we require the product owners to “have some skin in the game”, so to speak. If we purchase a product for internal use or testing, we will not publish a full public review.

Depending on the product, it is reviewed personally at the Orbiting O’Odua, at MedikLabs (in NECO), at LogikLabs (in NoCal), or at the ‘day job’.

There, though, the niceness stops.

Fully fifty percent (50%) of the products we get for review fail our really simple criteria:

  1. It must be cost-efficient (OK, cheap) from an SMB perspective,
  2. It must have relevance for small businesses or businesspersons,
  3. It must be compatible with Windows: client, mobile, and server,
  4. It must meet simple price/performance metrics,
  5. It must meet a basic OOBE yardstick,
  6. It must be well supported, and
  7. It must arrive undamaged in original packaging.

For the remaining fifty percent, we winnow the pretenders from the real deal with our stringent testing, generally over an extended period. No, we do not do 24-hour ‘reviews’.

My simple premise is this:

  • If your product is installed at a client location and we have to roll out a service call for it, FAIL.
  • If your product does not improve on our current default, FAIL.
  • If I really have to RTFM in order to, ahem, connect your product to the mains or the network, FAIL.
  • A couple of STOP events? FAIL.

As a result, the only products that make it to the blog are Business Ready, and the only demarcation between them is the amount of satisfaction we got from using the product, and that you would, too.

In other words, if you read a review here, rest assured that it is a product we would purchase for our own enterprise.

Consequently, we stand by our product reviews.

One hundred percent.
