Organization Horsepower

Thinking Like a Motorcycle Racing Team


My New Gig – TiER1 Performance

I know a bunch of folks caught my new job title on LinkedIn the week before last. I very much appreciate the well wishes and congratulations I received. In case you missed it, I am now a Senior Solutions Consultant at TiER1 Performance. The company is based in Covington, KY (greater Cincinnati), but I will be working out of the Chicago office. This involves a relocation for my wife and me, and while I will greatly miss the Lake Michigan shores of my West Michigan home, Chicago is a vibrant city full of excitement and opportunity.

My new role is client facing, and I am really excited to get to know a whole new group of companies. In this position I will be concentrating on solution architecture and engagement management, and will be very hands-on. One of the advantages of being with a somewhat larger organization is fewer operational responsibilities, allowing me to focus on what I enjoy most and do best.

At the risk of sounding a bit commercial, TiER1 has a cool planning and organization tool called xMap that I am really enjoying. It lets you drag and drop learning or communication events and/or assets into time-based buckets so you can take a wide view of the experiences you are building for various audiences. Individuals can use it for free at www.xmap.com, and there is an enterprise version available.

If you are in the Chicago area, give me an excuse to come visit you; I need to figure out where I’m going, and it will be good practice.


Organization Horsepower Author Movie

Book Preview on SlideShare

I put up this teaser on SlideShare, if you’d like to learn a little more about the book:

Thoughts from SHARE 2012: Three Drivers that Transcend SharePoint

I just got back from the SHARE conference in Atlanta, and while it’s true that the conference is built around SharePoint as a core technology, the conference is really intended to focus on business applications of SharePoint. I’m not going to do a blow-by-blow conference report; Kristian Kalsing did a pretty good one here. Instead, I’m going to pick out for you the three drivers that are not only critical to the success of SharePoint-based solutions but really to any business solution.
  1. It’s about the business, not the technology. When is the last time you experienced or heard the tale of an executive who calls a meeting and says we need an “X”? Recent examples of “X” have included social networking, Khan Academy, and even Angry Birds. The point being: if you simply take the order and implement your company’s version of these tools, adoption is immediately in danger because nothing in the business is driving the initiative. For example, social networking in and of itself isn’t enough to merit attention, but “connecting the sales force to the engineering staff in near-real time or real time to shorten the sales cycle by 10 days” is something the organization and your leadership can get behind. The fact that it’s built in SharePoint or any other tool is inconsequential and may be detrimental depending on your organization’s prior experience with the tool (see #3). When it comes time to justify expense or measure ROI of your solution, having real business drivers will be critical.
  2. You’ll need a roadmap. Every project needs a plan, but a roadmap can be so much more. When attacking tough enterprise issues, you’ve got to be certain about what the real problem is and how you are going to measure your fix for the problem. Your roadmap can even include governance for the organization or project and accurate requirements gathering and analysis. Susan Hanley was one of the speakers at the conference on governance and is a great resource on governance issues. Sarah Haase from Best Buy is also a great corporate practitioner; her blog can be found here. Our own John Chapin also has a blog post on roadmap creation.
  3. Branding is important (don’t call it SharePoint)! Unless your company sells SharePoint or somehow derives income from marketing SharePoint, you won’t do yourself or your users any good by naming your solution after the technology it was built in. This goes for any branded technology. In fact, if your users have a negative impression of that tool, it may actually interfere with user acceptance of the tool. Let’s face it; sometimes users cringe when they hear SharePoint but won’t bat an eye when you call it the “Sales Efficiency Accelerator.” The trick is this: when users visit this mythical application, it can’t contain the elements of poor user experience that caused them to hate the tool in the first place. Branding will get your users to overlook the underlying technology once or twice, but good user experiences will keep them coming back.

Overall, I’m glad I went to the SHARE conference. It’s nice to see a group of people who are focused on the business applications of SharePoint and not just the stability and scalability of a corporate technology platform. After all, the tools we use are only as good as what we use them for.

Key Learning from the Gartner Portal, Content, and Collaboration Summit

Having just returned from the Gartner Summit, I thought a quick recap was in order. Besides, I need a break from my measurement series (Part I, Part II, Part III).

Things got started with Gartner VP and Analyst Whit Andrews taking the stage carrying a shovel. My first thought was here we go with another speech about breaking new ground, but I should have known better. His comment was “A Gartner guy with a shovel, never a good thing.” It turns out it was a prop for talking about Minecraft, an online game both he and his son play on a regular basis. Here’s a YouTube of him previewing the shovel theme. It set the tone to talk about gamification, but the rest of the conference was decidedly higher level and focused on building and maintaining higher orders and evolutions of portals and content management.

Side note: While researching Whit Andrews I found this hilarious text-to-speech YouTube diatribe.

Day 1: Portals & UXP

All of the sessions I attended were geared more towards the portal end of the spectrum. Within the realm of portals, the major focus areas were User Experience (UX) and mobile. Of course there is a ton of subtlety and there were secondary topics. I won’t cover every session here, just the key sessions and perspectives.

My first session was Gene Phifer talking about User Experience Platforms or UXP. Any reader of the Media 1 blog knows this is a huge focus for us right now. The UXP definitely has a place in the corporate landscape and is a critical piece for creating alignment between people, process, and technology. Gene’s original article on UXP is a great overview of the session. For more of Media 1’s vision for how UXP can drive performance, see Chris Willis’s blog post “Social, Mobile, Integrated…UXP and Your Future Workforce.”

“Using Generation 7 Portals to Attract and Engage Customers” with Jim Murphy was up next. While this session was geared somewhat towards customer-facing portals, the same principles apply to employee or internally-focused portals. With a lot of people I talk to, “portal” has become a bad word. In a lot of folks’ experience, portal technology has been purchased, implemented, and subsequently fallen short of expectations. What we have to realize is that portals have evolved and continue to evolve. The biggest shift I see in portals is that they are moving away from being company, department, or role specific and they are getting personal. When you put the individual at the center of the design and you surround that person with filtered and specific options, portals get a lot more compelling—and that’s the root of why Media 1 is bullish on UXP.

Jim Murphy describes the characteristics of Generation 7 portals as featuring:

  • Analytics
  • Portal-less Portals
  • Context aware
  • Portal ubiquity
  • Emerging UXP
  • Widgets dominate
  • Mobile dominates

Jim and Gene teamed up on another session later that afternoon called “Employee Portals: The Revenge of the Intranet” to address the employee-specific side of the portal equation, and they aptly drew the connection that the employee portal is the heart of the new intranet. The changing ways that we work have driven the corporate intranet from being an information portal to a knowledge portal, then to a process portal, and now to the latest generation of intranet portal that includes social functionality and mash-ups and combined applications, and has a mobile-enabled interface. Social is the connective tissue that ties people to process and information—or people, process, and technology in the Media 1 vernacular. More than social, the new portal-based intranet (SharePoint portals included) also enables information management and process management, driving alignment to corporate goals. Gartner defines the must-have characteristics of the next-generation intranet as:

Current

  • Social
  • Business process enabled
  • Analytics and optimization

Near future

  • Mobile
  • Context aware
  • Gamified

Wish list

  • Enable user content contribution
  • “App store” model

Day 2: Gamification & More

Day 2 of the conference started out with Jane McGonigal’s compelling presentation “Reality is Broken” on the role of game play in our society and consequently how we can use those principles when addressing the needs of our organizations. Jane’s presentation was based on her book Reality Is Broken: Why Games Make Us Better and How They Can Change the World. While I’m not always a fan of gamified approaches to development, I must admit to some changing perspectives based on this presentation. One of the key advantages of gaming is the resiliency gained from repeated trial and error—and sometimes subsequent failure. I think that’s why a lot of people find golf compelling, and certainly the incredible volume of Angry Birds play would also seem to reinforce that supposition. It seems to me that many corporate gamified approaches take a “failure is not an option” position, and with the absence of failure or the possibility of failure, you miss out on the resiliency benefit and ultimately the engagement that the “game” was intended to create. I’m going to do a future blog post on this topic alone. There are some very compelling distinctions between playing, competing, achievement, failure, and losing that deserve to be explored.

I spent the afternoon of Day 2 exploring sessions on UI/UX, people-centered strategy, Yammer and Mobile. Not that these sessions weren’t valuable or intriguing, but I have to assume you have something else planned for your day other than reading my blog.

I finished up the conference with a case study from GlaxoSmithKline where they talked about their cloud-based employee portal solution, which happened to be a SharePoint 2007 site. While many of their experiences and concerns aligned with my other clients’ SharePoint installations, the fact that the SharePoint farm existed in the cloud created very few if any new concerns. Instead, it appeared to deliver considerable value to GlaxoSmithKline along the lines of the value proposition of other cloud services.

The final keynote of the conference was from Seth Godin. It was engaging and inspirational to say the least. While many of the examples he used were about how to create compelling messaging, or “purple cows” as he calls them, it’s not much of a stretch to see how that relates to how we market performance improvement to our own organizations. It’s more than positioning; it’s the stories we tell.

In summary, the conference served as an excellent confirmation that I’m talking about the right things with my clients, and with the right priorities. It also taught me that I have a different perspective to bring to gamification and a new understanding of what it takes to make these games compelling. And finally, not that I needed any more convincing, we need to find ways of engaging employees on mobile platforms in meaningful ways. Our obstacles are just that: obstacles. Mobility is critical to the new work environment.

Measurement, Part III: Measurement as Evidence

The Legitimate Need for Measurement as Evidence

I keep coming back to a quote from President Clinton at Learning 2011:

“If you already know the truth, you don’t need the evidence.”

He was using that in the context of a political topic, but I really think it’s applicable to measurement. If trust isn’t the real issue, and if performance and business results are the “truth” we are seeking, and if we can prove those business-related results, do we really need “evidence” that training—either as learning events or as a continuous and integrated process—got us there?

If the purpose of our occupation is to make our companies better, to improve performance, then the primary measurement should be whether or not our businesses are in fact becoming better. In any case, our evidence should ultimately be based on the “proof” that our business objectives are being met.

While this is true in most situations, there is the possible exception of compliance training, in which there is a legitimate need to prove learner participation and present it as evidence. However, there is a real danger in perpetuating what I call pseudo-compliance courses, where compliance is mandated but not linked to any regulatory need or any real business drivers or goals.

Compliance vs. Pseudo-Compliance: What’s the Difference?

“But…,” you say. “I have course XYZ that I HAVE to make sure everyone takes.”

This is the classic compliance model. The notion here is:

  1. There are organizations that are legally mandated to provide a training event and must prove that employees did observe the event.
  2. There are organizations whose legal exposure will be unreasonably high if they cannot prove that their employees observed a training event.
  3. There is a strong feeling that training creates a real, actionable alignment between a body of knowledge and the day-to-day behavior of employees.

Clearly items 1 and 2 happen in real life and pass as legitimate reasons for measuring compliance. However, item 3 falls short since it is not linked to a measurable business goal or driver. It’s that simple. It doesn’t mean that it’s not important or that you shouldn’t do it, but you may not need the evidence to back it up.

The Danger of Pseudo-Compliance

The biggest danger in measuring compliance or gathering evidence of compliance comes from tracking things as “compliance” that do not meet the criteria. It’s really easy to incorrectly identify a training event as being either legally necessary or subject to unreasonable legal exposure. These pseudo-compliance courses or events, if allowed to persist, will:

  • waste your time and resources
  • perpetuate poor impressions of formal training
  • provide cost justifications for systems and processes that do not contribute to your company’s business objectives

It’s perfectly reasonable to set an expectation that employees participate in a pseudo-compliance course, but there are generally ZERO measurable returns on that activity or event. Measuring compliance does, however, have a measurable cost in terms of systems and labor.

The most common pseudo-compliance courses I see are built around philosophical topics. Sure, there are ethical issues with concrete actions and legal repercussions that are legitimate candidates for measuring compliance, but I’m talking about philosophy here in terms of asking or expecting an employee to believe or think a certain way. Topics like integrity or honesty. You can give examples of someone acting the way you want your employees to act, but it’s not measurable in the business. Lack of compliance with a mandate for honesty or integrity is typically grounds for dismissal of an employee. What does it matter if you have evidence of the training event when this type of mandate is violated?

Legal Compliance

Assuming that the training you wish to track is legally required or implied as such, it’s reasonable then to assume that the legislation that defines the requirement is strongly linked to the financial, personal, or civil liberties of persons who work with or for, or come into contact with, your corporation. The premise is that it is in the best interest of your company and the public to comply with the legislation. The rebel in me would love to argue against the idea that all legislated training is needed, but the fact remains that it is a reality of business that there are legal requirements that make compliance necessary.

Assuming for a second that legislation is good and there is a public interest or common good in our compliance, isn’t that something we should want to do regardless? After all, aren’t we as individuals party to the laws of our land? Therefore, the training we do should be such that we not only comply with the law, but also ensure that our behavior is such that we never violate the intent of the law or requirement.

It’s easy enough to leverage an LMS to prove 100% compliance in the eyes of a legal requirement, but the true measure of success is that we have zero violations in our business practice. Thus, our performance measurement is zero or our compliance measurement is 100%. Which measurement is more important?
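To make the distinction concrete, here’s a minimal sketch in TypeScript (the record shape and numbers are hypothetical, not pulled from any real LMS) of the two measurements side by side:

```typescript
// Hypothetical training record: completion is what the LMS can prove,
// violations are what the business actually cares about.
interface TrainingRecord {
  employeeId: string;
  completedTraining: boolean;
  violations: number;
}

const records: TrainingRecord[] = [
  { employeeId: "e1", completedTraining: true, violations: 0 },
  { employeeId: "e2", completedTraining: true, violations: 1 },
  { employeeId: "e3", completedTraining: true, violations: 0 },
];

// Compliance measurement: share of employees who completed the training.
const complianceRate =
  records.filter((r) => r.completedTraining).length / records.length;

// Performance measurement: violations that actually occurred in practice.
const totalViolations = records.reduce((sum, r) => sum + r.violations, 0);

console.log(`Compliance: ${complianceRate * 100}%`); // 100%
console.log(`Violations: ${totalViolations}`);       // 1; proof of compliance didn't get us to zero
```

In this toy data, the compliance measurement reads a perfect 100% while the performance measurement isn’t zero, which is exactly the gap the question above points at.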

By definition, legally mandated training is a cost center. We have a responsibility to manage expenditures and be efficient with our company’s spending, but that should never interfere with our performance measurement of meeting our legal obligations.

Legally-Compelled Compliance

Now I’d like to move on to the scenario in which we are legally compelled to provide a compliance measurement, but it is not legislatively mandated. This is a cost-avoidance mechanism. We are, in principle, agreeing to invest in learning in exchange for a reduced or minimized cost should legal action occur at an unknown future date. But let’s be honest with ourselves; legal liability occurs as a result of a tort. Under tort law, companies are held liable for the behavior or actions someone commits while acting as a representative of that company. The reasons for the tort action vary, but can generally be attributed to:

  1. Negligence—lack of knowledge or insight that an action performed on the part of the company could cause damage to another party
  2. Intent—purposeful gains realized by a person, persons, or the company itself at the expense of another party

Much like legally required compliance training, legally-compelled training may have a compliance measurement that could be used in defense of legal action, but the true measure of success is once again that zero actionable behaviors are committed by individuals acting on behalf of our company.

To avoid negligence, we must make sure that people know better, but more importantly, that their actions and behavior reflect that knowledge. When you know better and act in defiance of or without deference to that knowledge, that is the definition of intent. In either case, to truly realize the cost-avoidance measure, you must have evidence of compliance, yet your obligation doesn’t stop there. Ultimately, performance is the real measurement of success, not compliance.

What Not to Measure

The problem with cost avoidance as a measuring stick is that there is no guarantee that the expense you try to avoid would ever have materialized had you taken no action at all. It’s just a possible expense you may have incurred down the road. There is no direct link to sustainable profitability unless you can say with certainty that you had a consistent, if not fixed, expense that you incurred at a defined level that will no longer be incurred or, at least, will now be incurred at a reduced level. There is no real ROI—only an imagined or implied ROI.

Looking at compliance training as a whole, there is a real business requirement, if not a legal requirement, to measure compliance with prescribed formal training events. But that shouldn’t be our justification for creating, maintaining, or supporting those formal training events. And by no means should training compliance itself be a measure of effectiveness upstream in your organization.

At the end of the day, quarter, and fiscal year, the list of training events for which we gather evidence of participation should be as small as possible. This evidence has value, but only as a vague measurement of possible cost avoidance. If we want to actually measure the effectiveness of that training, then the measuring stick needs to be performance based and evidenced by a LACK of adverse occurrences.

 

Measurement, Part II: The Evolution of Systems

In my last post in this series, I wrote about trust (or a lack thereof) as a motivation for organizations producing and/or requiring measurements of training based on learner knowledge or volume of completions. In this post, we’ll take a look at the evolution of measurement systems and how it has led to our current state.

Evolution of Measurement

We are measured our entire lives, starting before we are even born. Height, weight, volume, girth, and length are all used as metrics or measurements for doctors and our parents to label us as “normal” when compared to a set of standards. For the most part, all of these measurements are well and good and can serve as indicators of our health.

Eventually, we get bundled up and sent off to school, where all of a sudden the measurements aren’t necessarily about our health, but rather serve as a comparative ranking of our ability to retain and occasionally apply knowledge—against a set of standards. These rankings go down on our “permanent record” and follow us as indicators of readiness and aptitude. For better or for worse, this measurement system is used throughout the duration of our education and is sometimes used as a factor in deciding whether or not we get a job.

And then a lot of it stops.

Corporations have little use for ranking the knowledge or knowledge capacity of the people who work there. People are brought in to do a job and achieve something that contributes to that company reaching its business objectives—making money.

What workers know is secondary to what they do.

The application of that knowledge to achieve real world results is what really counts.

However, no one really thinks that workers come ready-made with all the knowledge or skills they will ever need. So there has to be some kind of mechanism to supply that knowledge when it is missing. That’s what we fondly call a “learning gap.” Of course, personal and professional development is recognized as an irrefutable need, since there’s a high correlation between personal development and the likelihood of people being exemplary producers. When we find a learning gap, our knee-jerk reaction is to fill that gap with training and assume that knowing will equate to doing.

Filling the Learning Gap vs. Measuring Performance

The metaphorical issue with the term “learning gap” is that it describes an opportunity or need as a hole or chasm that needs to be crossed. Metaphorically, there are three ways to deal with a hole or chasm: fill it, build a bridge over it, or go around it. In a performance focused sense, none of the metaphorical solutions are the right answer to the problem. We don’t want to go over, around, or through; we want a behavior that clearly demonstrates that the opportunity or need no longer exists.

How do you measure something that doesn’t exist?

It’s much easier to measure how deep a hole is or how far it is across, so those are the kinds of systems we have developed to measure corporate learning. Since 1975 (or 1959, depending on how you measure it), the Kirkpatrick model has been the most accepted standard for measuring the effectiveness of these efforts with its four levels of measurement:

  1. Reaction
  2. Learning
  3. Behavior
  4. Results

However, recently there has been a groundswell toward rejecting the Kirkpatrick model as a sole methodology for measurement because it presumes a learning event as the starting point. These grumblings were heard recently at both the CLO Symposium and Learning 2011 conferences and in the writings of thought leader Dan Pontefract, who wrote what I consider the defining article on the Kirkpatrick model problem in the February 2011 Chief Learning Officer Magazine—a stance he further qualified in his blog a short time later. The basic premise is that effective learning is not an event and cannot be disconnected from on-the-job performance; therefore, it cannot be measured on its own outside of a performance system.

That’s not to say that the model has never had value. Level 4 of the model—the Results level—clearly links performance to learned behavior, but it ties those results and behavior to a measured learning event and not to the culmination of an experience, which should include the influence of factors beyond just the learning event. Even if we did apply the model to a grouping of formal learning events, it would do very little to help us evaluate the effectiveness of individual pieces or of the informal learning that takes place regardless of whether or not informal learning was a planned part of the experience. There are just too many other factors, in addition to learning, that contribute to an individual’s ability to achieve something of value to a business or an organization.

It would be easy at this point to form a rallying cry for new measurement standards—ones that are a true indicator of performance—but most organizations already have ways of measuring how they are performing; they just need to find ways to apply those measurements to individual contributors and tie doing things to measurable performance.

There are a select few legitimate needs to measure the delivery of training linked to legal requirements or legal exposure, which organizations often refer to as compliance training. However, it’s easy to fall into the trap of imagined compliance. In the next installment in this series on measurement, we’ll explore legitimate versus imagined compliance and how to differentiate between them.

Measurement, Part I: Trust

Every conference I’ve been to in the past year… scratch that. Every conference I’ve EVER been to has had a major focus on measurement. There have been various measurement trends through the years, but recently I’ve seen some shifts that make me hopeful that corporations may actually make some progress in making and taking measurements that actually matter.

This will be the first in a series of blog posts exploring different aspects of measurement—including the importance of trust, motivation, compliance, shifting to business-based measurement, individual measurement, and measurement and its role in budget negotiations.

First up: let’s talk about the importance of trust.

Measurement Part I: Trust

Far be it from me to hold back on how I really feel about something. So, here goes:

Measuring training as a justification for training is an utter waste of time.

It’s like giving style points to the 50-yard dash. It may be interesting, but the only thing that matters is who crossed the finish line first. In other words, the performance or result mattered; the style in which it was achieved is barely noteworthy. Yet, when you measure training in and of itself, that’s exactly what is happening.

I think Charles H. Green hits it on the head with this quote from his blog:

“The ubiquity of measurement inexorably leads people to mistake the measures themselves for the things they were intended to measure.”

Why do we keep using measures instead of actual performance as justification to ourselves and our organizations? The answer to that question in many cases is rooted in why we are asked to measure training in the first place… that is, to prove that it has some kind of meaningful, measurable impact on the organization’s results.

Many of our organizations do not believe that training as it is currently defined has a positive impact. Or they do not trust that you or your immediate organization can execute learning in an impactful way. The requirement for measurement comes from a place of distrust—not from a defined need to measure results. Consequently, measurement is demanded to “prove” training works. Trust is not impacted or improved through this exercise, but regardless, time and effort are spent generating measurements that don’t really tell us anything about the business.

It is not my intent to write a primer on the effects of trust in business. I think Stephen M.R. Covey has done a good job of that in his book The Speed of Trust and the follow-up Smart Trust. The point is that a lack of trust affects our relationships and results in demands for volume-based measurements that are intended to justify the existence of training in an organization. It’s a closed loop with no obvious business value. That’s why old-school training departments are usually viewed as cost centers, not as strategic business partners.

So how do we as learning and performance improvement professionals earn trust and show that learning systems are effective and worthwhile without volume (i.e. number of butts in seats) or knowledge-based metrics?

Before we go there, we need to understand how measurement evolved to this state and also how the systems that we maintain perpetuate meaningless measurements. I’ll leave that for the next blog post, so stay tuned.

Branding and User Acceptance of SharePoint Sites

In my last post, Pizza and SharePoint™—Branding and Design, I drew an analogy about presenting your best work to your customers while failing to present your best selves to your employees in terms of the systems and sites developed for internal use. But why is it so hard to gain user acceptance, and what sorts of things can we do to make it easier on ourselves? Why do we even care if our employees “accept” the sites we build for them?

It’s easy enough to operate from the perspective that there is certain information that employees “need” to do their job, and there is certain information that is “nice to have.” In corporate structures, critical information or the “need to have” information is often presented in the most expedient way possible. Very often expediency in design results in the employee having to jump through hoops to get the information. “It’s the best we could do, in the time we had.”

While we may have accomplished our basic goals for a site, it doesn’t mean we did a good job. In fact, if we aren’t careful, we may actually create new issues in the process. If we didn’t gain acceptance of the platform we used for the initiative, chances are we’ve:

  • Poisoned the platform for future use by leaving a negative first impression.
  • Used too much time ($) to achieve too few tangible results.
  • Sent a message that we don’t value our users.

While it is sometimes necessary to compromise good design for expediency, we pay a heavy price for failing to gain acceptance. When we do gain acceptance, we achieve our goals faster, cheaper, and we create repeat visits that give us a viable way to expand our goals and create something sustainable over time. AND, we send a message that we care enough to think things through and value our team.

So why, when it comes to SharePoint sites, is it so hard to design for acceptance?

When building informational or community sites, SharePoint acts as a content management system, or CMS, and allows us to present the data separately and in different contexts. This means the data or information is contained in a different technical structure than the look and feel, or branding, of the site. This is wonderful when it comes to keeping the content up to date, but it requires a little extra planning when designing page layouts that support content meant to be changed independently of the layout. That seems to be where many implementations fall short.
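To make that separation concrete, here’s a minimal sketch in TypeScript (the types and render function are illustrative, not SharePoint’s actual object model) of content stored independently of the template that presents it:

```typescript
// Content layer: the data lives on its own, with no styling attached.
interface ContentItem {
  title: string;
  body: string;
  modified: Date;
}

// Presentation layer: a page layout that can be redesigned
// without touching the content it displays.
function renderAnnouncement(item: ContentItem): string {
  return `
    <article class="announcement">
      <h2>${item.title}</h2>
      <p>${item.body}</p>
      <time>${item.modified.toISOString()}</time>
    </article>`;
}

// The same ContentItem could just as easily be fed to a different
// layout (mobile view, portal web part, email digest) with no content change.
const news: ContentItem = {
  title: "Quarterly All-Hands",
  body: "Join us Friday at 2 p.m. in the main conference room.",
  modified: new Date(),
};

console.log(renderAnnouncement(news));
```

The extra planning amounts to deciding up front which pieces live in the content layer and which live in the layout, so the two can change on independent schedules.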

I.T. departments are typically charged with implementing systems such as SharePoint, and while your mileage may vary, they generally do a very good job of implementing the functionality or data layer… and tend to pay very little attention to the presentation layer. The typical result is a perfectly functional data infrastructure with a bone-stock set of vanilla Microsoft page templates. Since SharePoint wasn’t designed to fulfill a specific need for a specific audience, not much care was taken with these stock templates. Frankly, I find them ugly and filled with usability issues, and I am not alone. Nonetheless, as SharePoint is rolled out, content owners are very often forced to use these templates, either expressly or because they aren’t informed that they have any control over the presentation and don’t have the knowledge of how to change it.

In many organizations, a user’s first exposure to SharePoint is an ugly, usability-challenged site, a “crew pie” to reference my previous post. They may need the information that the site contains, but they are often left frustrated and unimpressed. For organizations that recognize this failing, this typically results in a subsequent project to improve either:

  • Look and feel (branding)
  • Usability

The truth is you need to do both. If you fix the content organization and improve the usability but leave the look alone, it’s hard for the user to get past the ugly and truly engage with the site. If you fix the ugly but leave the usability out, you may get your users back briefly, but they will inevitably get frustrated again. User acceptance of a site means they accept both the way a site looks AND the way it works.

Usability is a topic for another article, but for organizations that have already fallen into the bad- or no-design trap, a good design can help them crawl out of the user acceptance hole. It sends a message that this site is worthwhile and important enough to warrant thoughtful design, and likewise the users of the site are important and valued enough to warrant the time and money spent on design. For those organizations that haven’t rolled out their first sites, let this serve as a tip:

Whether you call it branding, look and feel, or design, it’s a critical piece of user acceptance.

In the next entry, we’ll focus on usability some more, starting with setting realistic objectives and how to map those objectives to the functionality you design into your sites.

Pizza and SharePoint™—Branding and Design

Once upon a time, in a galaxy far, far away… I used to work for one of the giant pizza chains. As a learning professional, I took it upon myself to understand what it was like to work in a pizza store. You don’t have to be in a store for too long before a mistake happens. Wrong toppings, giant bubbles, or just plain ugly pizzas. Most operators had enough sense not to send these pizzas to the customer and would make a new pizza, but instead of wasting $3 in food cost and throwing out the mistake, these pizzas would become “crew pies” and would often sit boxed on top of the oven until someone had time for a break and would grab a slice or two.

Well, on one store trip, I noticed a sign on the wall that said “no crew pies.” My first reaction was that the store operator was sending a message about mistakes and not making them, but the company had all sorts of slogans and signs about making a quality product, and “no crew pies” was not one of them, so I had to ask.

Turns out the operator had much different reasons, and it wasn’t a slogan; it was a rule. He explained to me that he was in a war for good employees with the other restaurants in town. It was hard to find and keep people, and he felt that it sent the wrong message to serve the people that worked for him the worst product his store turned out. Besides, if his team thought that bad pizza was good enough for them, how far of a stretch is it for them to expect his customers to live with bad pizza?

Fast forward to today. I am in the privileged position of consulting with some of the world’s largest companies. Companies that are selling customers some of the most advanced systems, services, and technology available. However, all too often the internal sites these companies use to support their own employees are the internet equivalent of “crew pies.” Barely branded and poorly organized. This is especially true when it comes to SharePoint™ sites.

It’s not enough to just have the information out there. The person has to first want to use the site (acceptance) and then be able to use the site (usability). Newsflash: The default SharePoint™ page templates are not attractive and are not intuitively usable. Even if you are lucky enough to have an IT department that branded the default templates, it most likely is still not good enough. Chances are if you already have an existing SharePoint™ implementation, you’ve seen these default templates in action, as have your users. They have already formed a negative impression of what SharePoint™ is and have little or no vision of what its potential is.

I’m not suggesting that all of your internal sites become graphical Flash sites with splash pages, but I am saying that at a cursory glance, your internal sites need to:

  1. Not look like SharePoint™ default templates
  2. Reflect the importance of the people, business line, product or service it is intended to support

In SharePoint™ development circles, efforts towards user acceptance are often referred to as branding, but it’s more than that; it’s part of the overall design. The goal of design should be a positive, or at least transparent, user experience. There are two components of user experience: acceptance and usability. Acceptance is typically the result of good positioning and good visual design, whereas usability stems from information design.

If we go back to our “crew pie” example, mistake pizzas may in fact taste good, but the user experience is disrupted because user acceptance is compromised: the pizza is ugly, or its usability is challenged because it has the wrong stuff. That’s not to say the crew won’t eat it, but they may not like it.

The intangible message here is that our internal sites and systems set the tone for what our employees deliver to our customers or users, and it’s imperative that our customers’ user experience be flawless. Besides, in a war for talent, our valued employees deserve better than a “crew pie.”

In my next blog post, we’ll dive more into the user acceptance side of the equation and explore some strategies for designing and validating user acceptance as part of a branding, positioning, or graphic design effort.
