Author Archives: Brian Whaley

About Brian Whaley

Technologist, Digital Transformation Professional, User Experience Champion, Landscape & Macro Photographer, Avid World Traveler, Advanced Open Water Scuba Diver, Enthusiast of Home-Cooked Food

Usability Week 2009 – Day 3

Day 3 was the final day of the Usability In Practice 3 Day Camp. Today the presenters wrapped up how to report your findings by reviewing our homework. Reviewing the bad findings report was just as informative as seeing the good one. They also covered paper prototyping, field studies, how to finance usability studies, the cost-benefit analysis of your work, and successful usability programs.

Paper prototyping is a low-fidelity, cheap, and easy way to try lots of different ideas for new designs without spending a lot of time building them. It is also great for setting expectations that this is not a polished product, so the critique stays much more focused. Paper prototypes are great for reviewing navigation, work flow, layout, content, terminology, labels, and naming conventions. Testing runs very similarly, except your users will use a pen instead of a mouse and keyboard, and you will have an additional person in the room playing the part of the computer.

Field studies are a great way to get lots of information about your customers. These studies show how their environment affects how they use your product. Artifacts, or the things around your users, give you clues to their habits, tools, and distractions. Be sure to tell your subjects not to clean up before you arrive! And if you are going to observe someone like a call center employee, don't let their manager give you a demonstration instead. They will definitely use the application differently, if at all. Field studies will produce a lot more information, so be prepared to record all the data in different ways.

Jakob Nielsen presented some great numbers on usability testing. Setting aside employee time, the only direct cost of usability testing is the cost of incentives. Usability budgets for big projects at major corporations average around 10% of the project's budget. What you should expect to see is an increase in conversion rate, a decrease in bounce rate, an increase in community participation, and an increase in clickthrough rates.

Kara wrapped up the session with a discussion of usability programs. The focus was on building user-centered design and testing into your projects from the beginning. This means that usability needs to fit within the existing project structure within an organization. This can happen in centralized, decentralized, or matrix organizations. Having a centralized repository for all usability documentation will instill a culture of knowledge management and continuous improvement. Start small, make management your ally, and you will be great!

This was a really great session. I walked away with a lot of information on how to conduct usability tests, and how to introduce this continuous improvement methodology into our organization. It is nice to move on to a new topic, however. I think our lecturers did a great job, but they were getting tired of seeing the same faces day in and day out. A couple of us even noticed that they were cutting Q&A sessions short to avoid some of the participants who could not seem to stay on track. Unfortunately, it did limit other people from asking pertinent and intelligent questions. Looking back, there was a brief discussion on usability guidelines. I was hoping we would have spent more time on this. This left me looking forward to the sessions over the rest of the week.

Hopefully, some of my friends from Usability Week 2009 are reading my blog now. Any comments yet, usabilitists?

Usability Week 2009 – Day 2

Today started off reviewing our homework. We had to write an objective and 3 to 5 tasks to review the inmod web site. We spent the first half hour reviewing the tasks in small groups. I am always surprised when working in small groups how easy it is for people to take the group off track.

The big topic for session 2 was conducting the user test. You need to make sure that not only you, but everyone involved, is prepared. The steps of a user test session are Introductions, Run the test session, Debrief the user, and Prepare for the next session. You should prepare the participant for what to expect, and make them comfortable. Stay as neutral as possible during the session. Get any final feedback you may need from the user, and answer any questions they may have. Then reset the computer and get your notes ready for the next test.

After all the sessions, it is time to analyze all of your new data. You organize your data into Findings with supporting details, assign each of the findings a Priority, make Recommendations that are based on the findings, and then cycle your changes into the next development cycle. All of this information should be included in a Report of your Findings. You should try to provide a short, informal report within the first 24 hours, and a more detailed, formal report within two weeks. Your reports should focus on the positive (what worked well) as well as the negative (what needs improvement). You should also formally present your findings to your client. Keep your meetings short. Leverage the video you took and include quotes from the users.
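
Purely as an illustration of that structure (this is my own sketch, not something NNG handed out), a single finding might be modeled something like this, with the class, field, and priority names invented for the example:

```csharp
using System.Collections.Generic;

// Hypothetical sketch of how one usability finding could be organized for
// the report: the finding, its supporting evidence, a priority, and a
// recommendation. Names and priority levels are my own, not NNG's.
enum Priority { Low, Medium, High, Critical }

class Finding
{
    public string Description;              // what was observed during the sessions
    public List<string> SupportingDetails = // quotes, video timestamps, task data
        new List<string>();
    public Priority Priority;               // how urgent the fix is
    public string Recommendation;           // the change to cycle into the next release
    public bool Positive;                   // report what worked well, too
}
```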

Jakob presented variants to the user testing methodology. You can test more than one user at a time, if it makes sense (like husband and wife, co-workers, etc.). When you can't go to the user, and the user can't come to you, remote testing is one of the last possibilities. You can also test more than one site (either two separate designs, or competing sites). Sometimes you may want to follow users over an extended period of time, so diaries or videos can be used. Eye tracking is a new technology that is very useful with video recording.

Based on the studies that NNG has implemented, they have found that the ROI per user tested is maximized at 4 users. They recommend testing 5 users to be sure that your results are safe. Beyond 5 users, the number of new findings flattens out, but the cost continues to increase incrementally.
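
For the curious, the diminishing returns Jakob described follow a simple curve from Nielsen and Landauer's problem-discovery model. Here is a minimal sketch; the 31% figure is the commonly cited probability that a single user uncovers any given problem, not a number from the session.

```csharp
using System;

class ProblemDiscoveryCurve
{
    static void Main()
    {
        // Nielsen/Landauer model: the share of problems found by n users
        // is 1 - (1 - L)^n, where L is the chance that one user hits a
        // given problem (commonly cited as about 31%).
        const double L = 0.31;

        for (int users = 1; users <= 10; users++)
        {
            double found = 1 - Math.Pow(1 - L, users);
            Console.WriteLine($"{users} users -> ~{found:P0} of problems found");
        }
        // Around 5 users you are already near 85%, while each additional
        // participant costs roughly the same as the last -- which is why
        // the findings curve flattens but the budget does not.
    }
}
```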

You also need to be very careful when testing users with special needs. Disabled users and low-literacy users should be tested with simpler tasks and shorter sessions. Senior citizens love the attention and are overly polite, but should also be given fewer tasks, and expect more time for the introduction and wrap-up of the session. Testing children is also very different than testing adults. Testing in schools is ideal, if you can get permission. Shorter sessions and co-discovery methods make testing easier. International testing can be much more expensive, as it requires translators or much more travel. Hardware testing works similarly to software testing, but you need the real product to test.

Jakob closed this session with a discussion of the ethics of user research. You need to remember that these are people that you are testing. You need to treat them with respect and dignity. The rule of thumb is that you should treat them as you would want to be treated.

After today’s session, a few of us went to a sports bar to watch the North Carolina / Michigan State game. The three of us that went all had ties to Michigan. Shawn and his wife live there now, Rebecca has bounced back and forth between Detroit and Chicago, and I used to live there when I was very young. Naturally, we were all cheering for Michigan State. After 3 minutes of the game, Michigan State was down 15 points, and they never recovered.

Usability Week 2009 – Day 1

Sunday was the first day of Usability In Practice, a 3-day intensive boot camp on how to run user tests. I have been trying to keep up with my activities in Washington, D.C. by posting on Twitter as well as here on my blog.

Hoa Loranger kicked things off by covering the foundations of usability. She explained that you and your colleagues have a very different experience than your users, which makes it very difficult to predict their needs. This is the basis of user-centered design. She covered 5 dimensions of usability as a quality criterion – learnability, efficiency of use, memorability, errors, and subjective satisfaction. The relationship between design and usability is like the relationship between writers and editors. The Discount Usability method focuses on qualitative rather than quantitative tests. This provides faster methods with fewer resources.

Hoa then introduced the user testing methodology. This is a simple way to collect first-hand data from real users. It is a simple feedback loop – plan your user tests, conduct the tests, analyze your findings, present your findings, and finally modify your designs and retest.

Janelle Estes then took over and walked us through most of the methodology. She covered how to plan your study, how to recruit your participants, how to write your tasks, how to choose your location, and how to observe and take notes. When planning your study, you need to decide exactly what you will be testing, what metrics to collect, and how to identify your target users. When recruiting participants, don't underestimate the amount of time it takes to find them. A screener, or script of questions, is a great way to opt possible recruits in or out. Once chosen, send a confirmation letter to your users, and include information about their incentives to show up. Schedule your sessions with both their time and your time in mind. When writing your tasks, keep them focused on the goals of the test session. You can have first impression questions, exploratory tasks, and directed tasks. When choosing the location, you need to keep the user, the tester, and the observer in mind. Should it be in-house in a conference room, or in a usability lab? Be sure your setup can capture the screen, audio and video, timing, and notes. Pilot your test before running it with real users. Be sure that on the day of the tests you are ready to take your notes with a notebook or spreadsheet.

Jakob Nielsen then wrapped up the day presenting information on the Usability Toolbox. He discussed a number of different sources of data and techniques to improve your site or application. Improvement of your site can fit into any systems development lifecycle. Jakob also walked through Expert Review methods. The first method is a Heuristic Evaluation – a way for experts to examine the interface. The second is a Guidelines Inspection – a way to inspect the site relative to a list of guidelines. An interesting thing that he brought up was the expected vs. actual results of a Likert scale. When implementing subjective satisfaction surveys, keep in mind that on a Likert scale of 1 to 7, the observed mean is usually 4.9, not the nominal midpoint of 4. Human nature is to avoid giving a 1 or 2, which effectively turns the 1-to-7 scale into a 5-point scale (really 3 to 7). Very interesting.
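
To make that Likert point concrete, here is a small sketch with made-up response counts. The distribution is purely hypothetical; it just shows how avoiding the bottom of a 1-to-7 scale pushes the observed mean toward 4.9 rather than the nominal midpoint of 4.

```csharp
using System;
using System.Linq;

class LikertSkew
{
    static void Main()
    {
        // Hypothetical counts of responses for scores 1..7.
        // Hardly anyone picks 1 or 2, so answers cluster in the 3-7 range.
        int[] scores = { 1, 2, 3, 4, 5, 6, 7 };
        int[] counts = { 1, 2, 12, 22, 30, 22, 11 };

        double mean = scores.Zip(counts, (s, c) => (double)s * c).Sum()
                      / counts.Sum();

        Console.WriteLine($"Nominal midpoint: 4, observed mean: {mean:F1}");
        // With this skew the mean lands near 4.9 -- the scale behaves
        // more like 3-to-7 than 1-to-7.
    }
}
```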

So that wrapped up Day 1. Lots of great information. Tomorrow will cover conducting the tests, and analyzing, reporting, and presenting the results.

Usability Week, Washington D.C. – Day 0 – Cherry Blossom Festival

In preparation for the Usability Week conference, I checked into The Omni Shoreham Hotel the day before. To my pleasant surprise, Saturday was the same day as the Cherry Blossom Festival. It was a very windy day, it was late in the afternoon, and there were hundreds of thousands of people in town to enjoy the cherry blossoms. All of these factors did not lend themselves to a peaceful photo session down along the Tidal Basin. But, it was the last of the 4 day peak bloom period of the trees, so I decided to take the Metro down to the Smithsonian and walk along the Tidal Basin.

When I got off the Metro, I had to fight the tens of thousands of people trying to get back down into the subway. Once I got through the crowd, and made my way past the Smithsonian and the Washington Monument, I finally reached the Tidal Basin. The walkway along the tidal basin was absolutely packed with people. It reminded me of the WaterFire Festival I attended last October in Providence, Rhode Island, except there were more people here in Washington D.C.

I spent two to three hours walking along the basin, and took a lot of photos. I posted them on my Flickr account, and you can see my cherry blossom photos there.

Amit had recommended that I eat at Tono Sushi on Connecticut Avenue. It is only a block away from the Omni Hotel, so I decided to give it a try. The food was excellent, and the seafood was very fresh. I particularly enjoyed the Baked Salmon Roll, which was something new that I tried.

Outsourcing 102 – The New Team Members

The Plan

The initial plan was to train the liaison within my department, as there was an immediate need for assistance. Afterward, he would cycle through my peers' teams to learn about each of their departments. The liaison was to learn about the environment, gather documentation, and set up an environment offshore for development. As demand increased, we would slowly grow a team offshore. My focus was to start them off slow, as content managers, support staff, and graphic designers. As their experience grew, we would spread the offshore team into other, more complex areas, such as new development and project management.

Step One – Training and Environment

My team has always been ahead of the game in terms of documentation. We use a shared platform across all of our sites, so the structure of our projects is all similar. Each of the services and methods is documented with code samples on their usage. We use a project template to jump-start project development, with documentation on the functionality of each of the DLLs, aspx pages, and ashx handlers. We also have templates for project estimation, requirements documentation, and system testing. We have a checklist to guide the developer through the project process. All of our code is stored in VSS, all of our DDL and DML SQL is stored with the solution, and all the documentation is stored in our Documentum instance. Code promotions are handled through CruiseControl for .Net code, and SQL Navigator for the SQL code. All our projects are initiated through a centralized PMO, and our Client Facing team are the overall project managers. Providing all of this information about our code and our process was the easiest part of the transition.
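
For readers outside the .NET world, an ashx handler is just a lightweight endpoint that skips the full page lifecycle of an aspx page. The sketch below is a generic, illustrative example of the kind of component our project template documents; the handler name and payload are invented, and it is not code from our actual solution.

```csharp
using System.Web;

// A minimal HTTP handler of the sort documented in our project template.
// The endpoint and response here are purely illustrative.
public class PingHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Return a small plain-text payload without the aspx page lifecycle.
        context.Response.ContentType = "text/plain";
        context.Response.Write("OK");
    }

    // Stateless, so one instance can safely serve many requests.
    public bool IsReusable
    {
        get { return true; }
    }
}
```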

Getting the offshore team a working environment was not as easy. All of our servers are in the United States, behind our firewall. Working with our security team, we ruled out VPN access as too risky to provide without a more diligent legal contract. This left few options. The one that was chosen was a web-based Citrix solution. Tools such as Visual Studio, VSS, and SQL Navigator were provided, with access to the development, test, and production environments, through the Citrix Business Partner site. Some other tools were provided directly to the offshore team, such as local copies of Visual Studio and a local version of our database.

Step Two – Content

Some of the first tasks assigned to the offshore team were to help manage content for new projects. I figured that this was a simple set of tasks that would expose the outsourced developers to our development environment. An offshore developer was added to a new project, paired up with an existing onshore developer, and the work was divided. Each new page has to be registered in the database and assigned to the proper application, with all the right properties and URL rewriting information. Our page content comes from advertising agencies in the form of HTML. We take that HTML, slice it into reusable blocks and unique blocks, and insert them into our custom content management database. Each of the pages then has to be associated with each of its content blocks. All of this data is inserted into the database via Data Manipulation (DML) SQL scripts. Once inserted, the pages need to be tested with the solution. These activities were the responsibility of the offshore team members.
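
To give a feel for that page-and-block bookkeeping, here is a purely hypothetical sketch in C#. In reality this work is done with hand-written DML scripts against our custom content management database, and every table, column, and parameter name below is invented for illustration.

```csharp
using System.Data.SqlClient;

// Illustrative only: register a new page and associate it with its content
// blocks. The schema and connection details are made up for this sketch.
class PageRegistrationSketch
{
    static void RegisterPage(string connectionString, string url, int appId, int[] contentBlockIds)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Register the page against its application and capture the new id.
            var insertPage = new SqlCommand(
                "INSERT INTO Pages (Url, ApplicationId) VALUES (@url, @appId); " +
                "SELECT CAST(SCOPE_IDENTITY() AS int);", conn);
            insertPage.Parameters.AddWithValue("@url", url);
            insertPage.Parameters.AddWithValue("@appId", appId);
            int pageId = (int)insertPage.ExecuteScalar();

            // Associate the page with each reusable or unique content block.
            foreach (int blockId in contentBlockIds)
            {
                var link = new SqlCommand(
                    "INSERT INTO PageContentBlocks (PageId, ContentBlockId) " +
                    "VALUES (@pageId, @blockId);", conn);
                link.Parameters.AddWithValue("@pageId", pageId);
                link.Parameters.AddWithValue("@blockId", blockId);
                link.ExecuteNonQuery();
            }
        }
    }
}
```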

Step Three – Support

A significant part of our development activity is support. Each project allocates a 5-year budget to support that application, and we need to manage those activities just as well as projects, if not better. In fact, support tickets are like mini projects. They contain analysis, design, implementation, test, deployment, and stabilization phases just like projects do. The only real difference is that support activities last less than 15 days. This is another great way to get new staff familiar with a new environment. We set aside two people offshore to be responsible for support. The tickets were divided between them, and they had all the same resources as the project-based developers. These two very quickly got a cross section of all projects and sites, tools and technologies, and processes and procedures.

We had also implemented a role onshore that would be responsible for coordinating all support, both across technologies and across the globe. This person was responsible for support assignment, workload, quality, and process, all as it related to support. This was received well. Support functions now had a champion; a clear voice representing their perspective.

Next

In my next post, I will cover the results of our new outsourcing contract, and how some other teams were impacted by it. I will also cover how it became the most important decision of the year for the team.

Outsourcing 101 – Introduction

The World is Flat

Outsourcing and offshoring are mainstream business practices in today's economy. Companies turn to outsourcing and offshoring to find cost savings, find expertise outside of their core business, and provide a follow-the-sun workforce. The blended cost of an outsourced team is lower than that of a purely domestic team, thanks to lower resource costs in countries like India, Brazil, and the Philippines. Where I work is no exception.

The Story

I have decided to document the transformation of my department into a global organization that embraces outsourcing and offshoring. I am also hoping that those who went through this process with me and read my blog will provide comments of their own and keep me honest.

In The Beginning…

In the middle of 2007, my department decided to globalize our work force and find an outsourcing vendor. Other departments had experience with Satyam, Intelligroup, and Accenture. We decided to start with Satyam, as we had heard the most positive reviews of their performance.

We put together a brief meeting and walked through our objectives of blending a global team to drive down costs. Over the course of the next 4 months, we focused our interviewing efforts on a half dozen candidates for our first outsourced team member. The first candidate seemed to be a good fit for our team, so we made short work of extending an offer and getting a contract signed. But in the 24 hours between interview and offer, the candidate mysteriously became unavailable. Interview followed interview, and all the candidates seemed to fall short of our expectations. It reached a point where the resumes stopped coming in. With only two months left in the year, we resolved to try another outsourcing company.

After reaching out to Intelligroup, we held a brief meeting with them. The meeting turned into an ad-hoc interview for one of the attendees. He seemed strong in .Net technologies, had a solid background in software architecture, knew enough about web technologies, and had project management experience. We made an offer, signed a contract, and on December 3, 2007, we had officially taken the plunge into offshore outsourcing and hired our onshore liaison.

Next…

My next post will talk about how we expanded the team to include our first offshore resources located in India, how we integrated them into the team, and some of the bumps and bruises we experienced along the way.

Some Research on User Interface Standards

The Task

I have been asked to put together a working group to develop user interface standards. Initial discussions are that we will need to come up with different standards for different environments, like portal sites, websites, custom applications, mobile applications, and off-the-shelf applications. So… I have done some research on the areas of user interface standards, usability, and user experience.

Some Definitions

Wikipedia was a big help. Here is what I found there.

  • User Interface – also known as Human Computer Interface, the user interface is the aggregate of means by which people interact with a system. The user interface provides means of input (allowing the users to manipulate a system) and output (allowing the system to indicate the effects of the users' manipulation).
  • Usability – the ease with which people can employ a particular tool or other human-made object in order to achieve a particular goal
  • User Experience – a term used to describe the overarching experience a person has as a result of their interactions with a particular product or service
  • Human Interface –

Existing Standards

I know this is material that has been covered by other companies. Here is what I have found available that other groups have compiled.

Other Standards

These are some other user interface standards that I have heard mentioned in other articles online, but could not find any links to directly.

  • DIN 66234 part 8 standard – 1988
  • The Data Company’s standard
  • Motif™ style guide [OSF 1990] 167
  • OPEN LOOK™ [Sun Microsystems 1990] 404
  • Smith and Mosier [1986] guidelines 485

Conferences

These seem like two promising conferences about web usability.

Organizations

Here are two organizations I found that focus on Human Computer Interaction (HCI) and usability. I have just joined the UPA, and plan on re-joining the ACM.

Books

Amazon is a cornucopia of information on user interface standards, usability, and user experience. I have most of these books, and plan on getting the others soon.

Link Roll

Lots of good sites out there about usability and user experience.

If there are any sources that you use that I have not included, please leave a comment and let me know what it is.

Virtumonde is not your friend

I was the victim of a very annoying piece of malware. I have been avoiding the corporate install of Internet Explorer for months now, and I have been using Firefox 2 and 3 instead. I am sure I was doing something I should not have been, because for the last two weeks strange popups have been plaguing my Firefox browsers, and my machine has been running like there was taffy on my hard drive. I tried to remove the trojan with Spybot S&D, and that did not work. It did identify a Browser Helper Object (BHO) and some registry entries that I could not get rid of. That is when I knew it would be bad. Derek recommended that I try McAfee Avert Stinger. That was no help either. I tried HijackThis. That was informative, but not as helpful as I had hoped. So I did some more digging online, and an article recommended Malwarebytes' Anti-Malware (MBAM). That was a big step forward. It clearly identified my problem as the Virtumonde Trojan. There were 59 DLLs, BHOs, data files, and registry entries all over my computer from this one trojan. I used MBAM to remove all of them, but the BHO registry entry was stubborn. This meant there was still more. I did some research on Virtumonde, and found that a tool called ComboFix would wipe it out entirely. It took about 20 minutes to run, rebooted my machine, and took another 20 minutes to complete. But when it was all done, I was trojan free. No more popups when I use Firefox, and my machine is fast again. Now… if only I knew what I did that was so bad…

Techno-Christmas 2008

Well, another Christmas has come and gone, and we have all exchanged our gifts. Everyone in the family got new gadgets to alleviate their tech addiction.

Nicholas got his long-overdue Xbox 360. We bought him the Elite version, with the wireless remote and the 120GB hard drive. Can't get a new console without a shiny new game too, right? So we got him one of his favorites… the new NHL 2K9. He also got lots of gift cards, so that he could go out and get a game of his choice. He picked up Call of Duty 4, and another wireless remote, so that he can pwn me and his friends in NHL 2K9 or Call of Duty.

Mary Ann was light on the technology this year… she did get some CDs that would help her learn basic Dutch in the car on her long commute to work every morning.

I was burned by the HD-DVD fiasco last Christmas, so this year my wife bought me the Sony BDP-S350 Blu-Ray Disc Player. And, just like the Xbox, you can’t get a new Blu-ray player without getting a couple new Blu-ray movies. My parents bought me Iron Man and Wall-E.

The great thing about technology is the same as the problem with technology. It is always improving. The Harman Kardon receiver we had for 6 years or so has no HDMI inputs or outputs. I had been using direct component connections to the TV, but I ran out of those too, with all the different HD devices I have now. So I had to treat myself to a new Sony STR-DG820 A/V Receiver. Yay for 4 HDMI inputs! It only took 30 minutes to set up with my speakers and all the devices.

And with all that technology, I was able to watch Rutgers beat NC State 29-23 in the 2008 PapaJohns.com Bowl. RU Rah Rah!

Analytics Tool Wars – Dodge, Parry, Thrust, Spin!

On October 10, Yahoo! launched their new free analytics tool, Yahoo Web Analytics, a rebrand of IndexTools, which Yahoo purchased this past April. This isn't very different from Google's move to buy Urchin in 2005, refine it, and make it available free to the public. However, what is different between Yahoo's analytics tool and Google's is that Yahoo is not aggregating the data. This is important enough to say again… you are not analyzing aggregated data with Yahoo. They store all their data in its raw form, allowing for real-time reporting. This is why some think that the two products do not really compete against each other: they target different audiences.
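
The raw-versus-aggregated distinction is easier to see in code. Here is a minimal sketch, with invented event and report types, of the trade-off: pre-aggregated rollups are compact but frozen at whatever granularity you chose up front, while raw events can be re-sliced on demand.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Minimal sketch of the trade-off; the types and fields are invented.
class PageView
{
    public DateTime Timestamp;
    public string Page;
    public string Country;
}

class AnalyticsSketch
{
    static void Main()
    {
        var rawEvents = new List<PageView>
        {
            new PageView { Timestamp = DateTime.UtcNow, Page = "/home",  Country = "US" },
            new PageView { Timestamp = DateTime.UtcNow, Page = "/home",  Country = "NL" },
            new PageView { Timestamp = DateTime.UtcNow, Page = "/about", Country = "US" },
        };

        // Aggregated approach: roll events up into daily counts per page.
        // Compact, but any question not baked into the rollup (say, views
        // by country) can no longer be answered from it.
        var dailyRollup = rawEvents
            .GroupBy(e => new { Day = e.Timestamp.Date, e.Page })
            .ToDictionary(g => g.Key, g => g.Count());

        // Raw approach: keep every event and slice it however you like,
        // whenever you like -- which is what enables real-time, ad-hoc reports.
        var viewsByCountry = rawEvents
            .GroupBy(e => e.Country)
            .ToDictionary(g => g.Key, g => g.Count());

        Console.WriteLine($"Rollup buckets: {dailyRollup.Count}, countries seen: {viewsByCountry.Count}");
    }
}
```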

Not to be outdone, Google announced on October 22 that they were releasing an “Enterprise” feature upgrade to their product. This upgrade includes custom reports, advanced segmentation, an API for developers, updated interface, motion charts, and integration with Google AdSense.

Was Google resting on its laurels, and does it now feel threatened by the new Yahoo product? Did Google release these new features to combat the release of Yahoo Web Analytics? Could be. It would be interesting to track the number of users of each of these two products over time, just as we track the number of browser users and the number of search engine users.