Author Archives: Brian Whaley


About Brian Whaley

Technologist, Digital Transformation Professional, User Experience Champion, Landscape & Macro Photographer, Avid World Traveler, Advanced Open Water Scuba Diver, Enthusiast of Home-Cooked Food

Visual Thesaurus Bends and Stretches Your Way to Synonyms

I have stayed connected to the search industry ever since I was involved with the original launch of the Pravachol web site ten years ago. One of the ways I have stayed connected is through great online resources like Alt Search Engines. This week they covered a great new online tool that helps its users search for synonyms. Visual Thesaurus displays thesaurus entries graphically, separating them into individual entries through a tool called Thinkmap. This is very similar to the technology used in the TouchGraph Google Browser. Both of these technologies are similar to some of the social networking graphs used on Web 2.0 sites. Take a look at the new Visual Thesaurus and the TouchGraph Google Browser, and let me know what you think of the technology and what other ways you might like to see it used.

3 SEO Site Analysis Tools to Grade Your Site

I don’t usually do this, but this blog entry will be about an email I received from one of my readers. I got an email from Rachel, who works at a company called SEO Site Checkup. She asked me to take a look at their site. They have created a simple-to-use web site that will analyze your site against a series of SEO-based rules. All you do is put in your URL, submit, and let the site do its work. It will return a list of important fixes, recommended fixes, and successful checks. It provides a lot of information, and a long list of next steps to make your site more SEO friendly. In fact, it was good enough to point out a few changes that we will want to make to some of our major brand sites.

To be fair, there are two other tools that I use in the web site SEO analysis space. WebSite Grader is provided by HubSpot, a company focused on marketing for small businesses. I have also used a site called XinuReturns, which will help you “Find out how well your site is doing in popular search engines, social bookmarking and other site statistics.”

XinuReturns focuses more on aspects outside of your site, including inbound links, search engine results, and social bookmarking. WebSite Grader gives a high-level overview of many different aspects of your site’s SEO, both internal and external, and gives you an easy “grade” to compare against other sites. The strength of SEO Site Checkup over these other two sites is that it takes a deep dive into aspects of your web site that you can change to improve your search results. It analyzes your technology and your content, and gives you an action plan for improvement. All three tools are a great way to measure your site’s SEO, but SEO Site Checkup goes one step further and tells you how to improve those measurements.

I recommend using all three of these tools, in conjunction with analytics tools and other metrics, to monitor and improve your site. SEO Site Checkup is a great new tool to add to that arsenal.

Software Metrics

Being able to measure success for a software development group is a difficult thing. But not being able to show the success of your development group is a dangerous thing. Your management team will want to be able to measure quality, show improvement, and benchmark against industry standards. All the popular maturity models (if you put stock into them) emphasize the ability to measure, react, and improve your quality and processes. My department is no different. We try to remain as flexible and lightweight as possible, and add formality where improvement is necessary. Here are some of the metrics that we collect to measure our success and find areas of improvement.

Code Coverage

There will always be bugs. No matter how hard you try, there will always be bugs in your software. That does not mean that you cannot try to prevent them. One of the easiest ways to do that is to write automated unit tests to validate that your code is doing what is expected. You can write your unit tests in lots of different ways, and Steve Cornett’s Code Coverage Analysis paper describes several ways to break down code coverage. A great place to start is to aim for 50% coverage of all of your lines of code. And, as Tony Obermeit says in his Code Coverage Lessons paper, your ultimate goal, of course, is to hit 100% coverage. You will need to pick a code coverage tool to help measure your success. In my department, developing in a Visual Studio and C# environment lends itself to the nCover open source solution, which works well with our CruiseControl environment. Test Driven Development methodologies and mocking tools can help you get closer and closer to covering as much of your code as possible with automated tests.
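To make this concrete, here is a minimal sketch of the kind of NUnit test that a coverage tool like nCover measures against. The PriceCalculator class and all of its names are hypothetical stand-ins for real business logic:

```csharp
using System;
using NUnit.Framework;

// Hypothetical class under test -- a stand-in for real business logic.
public class PriceCalculator
{
    public decimal ApplyDiscount(decimal price, decimal percent)
    {
        if (percent < 0 || percent > 100)
            throw new ArgumentOutOfRangeException("percent");
        return price - (price * percent / 100);
    }
}

[TestFixture]
public class PriceCalculatorTests
{
    [Test]
    public void TenPercentDiscountReducesPrice()
    {
        PriceCalculator calc = new PriceCalculator();
        Assert.AreEqual(90m, calc.ApplyDiscount(100m, 10m));
    }

    // Cover the guard clause, too -- untested error paths are
    // exactly what a coverage report will flag.
    [Test]
    [ExpectedException(typeof(ArgumentOutOfRangeException))]
    public void NegativeDiscountThrows()
    {
        new PriceCalculator().ApplyDiscount(100m, -5m);
    }
}
```

Run under the coverage tool, the report shows which lines of ApplyDiscount the tests exercised, and the uncovered lines tell you where the next test should go.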

Defect Tracking

I use the words defects and bugs interchangeably. I am sure some people would disagree with that, but I think it is close enough: if it is not working as expected, then it is a defect, and it is a bug. Regardless, defects are identified in the development process, in system and user acceptance testing, and in the production environment. The objective is to minimize the number of bugs that are found in the live environments. To do that, you need to encourage the identification and mitigation of bugs in earlier and earlier stages. This sounds fundamental, but it is difficult to implement. There are lots of methods you can use to identify, solve, remove, and prevent bugs, and you must have a way to measure that these methods are improving your success rate. That means measuring the number of defects found in each environment: development, system test, user acceptance test, and production. The easiest set of numbers to get in a corporate environment is production defects; there is always a problem management system or help desk that tracks these kinds of things. But as a software development organization, you need to implement bug tracking throughout the entire lifecycle of your software. You can then trend the numbers and make sure you are finding more bugs before UAT, particularly in development. Tools like Bugzilla, an open source bug tracking tool, can help you track, trend, and manage defects in your software throughout its lifecycle.
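The trending itself is simple arithmetic. As a rough illustration (the counts below are made up, and in practice would come from your bug tracker), here is a C# sketch of a pre-production containment number you could compute each release:

```csharp
using System;
using System.Collections.Generic;

public class DefectTrend
{
    public static void Main()
    {
        // Illustrative counts for one release -- real numbers would
        // come out of your bug tracker (Bugzilla, etc.).
        Dictionary<string, int> found = new Dictionary<string, int>();
        found["Development"] = 120;
        found["System Test"] = 45;
        found["UAT"] = 12;
        found["Production"] = 5;

        int total = 0;
        foreach (int count in found.Values)
            total += count;

        // Containment: the share of all defects caught before go-live.
        // You want this number to rise release over release, with the
        // Development share growing relative to the later phases.
        double contained = (double)(total - found["Production"]) / total;
        Console.WriteLine("Pre-production containment: {0:P1}", contained);
        // Prints something like: Pre-production containment: 97.3%
    }
}
```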

Support Ticket Management

Software is not a static entity. It is always changing. Just think of all the patches, updates, service packs, and bug fixes that Microsoft releases for its suite of software. In a corporate environment, it is no different. Software management does not end once a product is released. Teams of developers will be constantly updating desktop, web based, and console applications based on new requirements and requests from their clients. Problem Management software can be used to help track and trend all of these requests by data points such as severity (Urgent, High, Medium, Low), priority (Urgent, High, Medium, Low), assessment (Customer Assistance, Required Modification, Elective Enhancement, Break Fix), difficulty (High, Medium, Low), etc. You can measure success against more complex metrics such as the number of tickets created, the number of open tickets, time to resolution, etc. All of these metrics will help you determine how fluid, stable, usable, sustainable, and maintainable your software is. Do not ignore your software or its users once it is released to production.
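As a sketch of what those calculations look like, here is a hypothetical C# fragment computing two of the simpler metrics from a list of tickets. The Ticket shape is illustrative, not from any particular problem management product:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical ticket record -- field names are illustrative.
public class Ticket
{
    public string Severity;        // Urgent, High, Medium, Low
    public DateTime Opened;
    public DateTime? Resolved;     // null while the ticket is still open
}

public static class TicketMetrics
{
    // Assumes at least one resolved ticket in the list.
    public static void Report(List<Ticket> tickets)
    {
        int stillOpen = tickets.Count(t => !t.Resolved.HasValue);

        // Mean time to resolution, over closed tickets only.
        double meanDays = tickets
            .Where(t => t.Resolved.HasValue)
            .Average(t => (t.Resolved.Value - t.Opened).TotalDays);

        Console.WriteLine("Open tickets: {0}", stillOpen);
        Console.WriteLine("Mean time to resolution: {0:F1} days", meanDays);
    }
}
```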

Analytics

Web Analytics tools can tell you how many users you have had on your site, how long they visited, where they came from, where they went, how they found your site, whether they reached your goal pages, whether they converted, and whether they returned. There are free web-based tools like Google Analytics, and off-the-shelf packages like WebTrends and CoreMetrics, that can help you measure site activity. Do not ignore these metrics; they help you assess current activity, make improvements to your site, track your new results, and continue to improve. They directly measure your clients’ interaction with your software, and can identify trends that, with simple changes, can vastly improve your software and development processes.

Conclusion

So… these are some of the ways that we track the success of our software. There are a host of other methods to measure software, such as function points, lines of code, complexity, interfaces, velocity, etc. How do you measure your software? How do you define success? What are your plans for the future?

Leave me a comment and let me know what you think.

Blogging Trends in the Pharmaceutical Industry

Blogging is now one of the easiest ways to get a message out to your audience. Readers can read and bookmark a blog and get content when they want it, or subscribe to your posts via an RSS feed and have content pushed to them when it is available. There are lots of free open-source solutions that give you the freedom to create, publish, and maintain your content any way you want. Blogs about the pharmaceutical industry abound; but pharmaceutical companies, with all their legal, regulatory, and FDA compliance concerns, have been apprehensive about embracing this fast-paced medium.

There are lots of blogs about the pharmaceutical industry. Pharmalot is a blog by The Star-Ledger’s Ed Silverman that keeps up with pharmaceutical industry news. RXBlog also tries to stay on top of pharma industry news. The Pharma Marketing Blog is an op-ed outlet for John Mack, the editor in chief of the Pharma Marketing News e-newsletter. CafePharma is another popular website targeting pharmaceutical sales professionals, and has a blog called Pharmagather that attempts to centralize pharma blog articles from all over the web. These are all great, but they are not blogs from the pharmaceutical industry. Pharma companies need to have their own presence in the blogosphere.

Nutra Pharma, a small biotech company, announced that it was re-launching its corporate blog at the end of February, 2008. Nutra Pharma’s blog has been around since 2006, but has not gotten much attention. Posts are infrequent, very brief, and cover a very narrow scope; the blog is buried within the corporate site; and, quite frankly, it is coming from a small biotech company.

Centocor, a company owned by Johnson & Johnson, is going through lots of transformations, both in its pipeline and in its organizational structure within its parent company. It has launched a blog, CNTO411, in an effort to stay closer to its patients, its partners, and the blogosphere. It was launched just this March, has gotten a lot of press, and is leading the way in pharma blogging.

GlaxoSmithKline has released alliconnect, a blog about its new OTC weight loss drug, alli. They are touting the blog as “a place for you to have a conversation with us about weight loss issues.” It is geared toward the drug, but also toward the disease state, and invites patients to comment freely on the posts.

Johnson & Johnson has also tried to harness the power of blogging. Earlier this month, J&J organized and held an event for blogging mothers called Camp Baby 2008. The event was designed to reach out to bloggers who had complained about J&J and its products in the blogosphere and to open a two-way dialog. The mothers were flown in for free, were fed at the 5-star restaurant “The Frog and the Peach”, and received lots of swag. There were lots of bumps and bruises along the way on both sides, as The Star-Ledger article describes, but dialog channels were opened, and J&J claims this as a positive event for all.

The blogosphere offers great benefits to pharma, biopharma, and biotech companies. The only barrier to entry is the aversion to risk. These four companies have taken the risk, and are seeing benefits on all different points of the continuum. But as the adage goes – No Risk, No Reward.

How to Redirect Like A Pro

If you have ever redesigned, moved, or migrated a web site, then you know how important 301 redirects are. You have worked hard at building up your page rank within all of the search engines. And you don’t want to lose it. Your users have bookmarked your pages, and your partners all have links to your pages. And you don’t want those to break either.

My team and I are currently in the middle of migrating our first major site from one platform to another, and if we are successful there will be many more to come. We need to handle redirects for all the old content, the media pages, the banner advertisements, the existing client side redirects, and the internal analytics tracking pages. Here are some of the resources we are using while managing all the redirects in the site.
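Underlying most of these resources is the same basic mechanic. In ASP.Net 2.0, Response.Redirect() only issues a temporary 302, so a permanent redirect has to be assembled by hand. Here is a minimal, hypothetical sketch of an HttpModule that does it; the old and new paths are just examples:

```csharp
using System;
using System.Web;

// A minimal sketch of an HttpModule that issues 301s for retired
// URLs; the old/new paths below are hypothetical examples.
public class RedirectModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate(object sender, EventArgs e)
        {
            HttpContext context = ((HttpApplication)sender).Context;
            string path = context.Request.Path.ToLowerInvariant();

            if (path == "/old-section/product.aspx")
            {
                // Response.Redirect() sends a 302; a permanent move
                // needs the status and Location header set explicitly.
                context.Response.Status = "301 Moved Permanently";
                context.Response.AddHeader("Location", "/products/product.aspx");
                context.Response.End();
            }
        };
    }

    public void Dispose() { }
}
```

The module gets wired up in the httpModules section of web.config; in practice the old-to-new mappings would live in configuration or a database table rather than in code.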

Are there other resources you use when dealing with 301 redirects? Do you have any lessons learned about page redirects when redesigning or migrating your site? Leave me some feedback and let me know what you think.

How Strong is Your SEO Kung Fu?

I am by no means an expert. I work on web sites every day, and work hard at making sure that those sites are optimized from a technical and content perspective. There are a lot of good things I do, but there are a lot of things still left to learn.

It is always a good idea to benchmark your skills against others. These articles on SEO lifecycle are a great way to understand the progression of your skills as an SEO professional. And the quizzes are a great way to compare your knowledge against others in your field. Take these with a grain of salt, however. There is always a slant or twist to throw you off track. And some of these are just for fun.

Sermo: It Takes a Village to Raise a Doctor

Daniel Palestrant, CEO of Sermo.com, came to Bristol-Myers Squibb to speak at the quarterly OMNI meeting. This meeting is targeted to the individual Brand Teams and intends to bring innovative ideas into the company. Daniel came to talk about Sermo and “Online Physician Communities – Salvation or Mirage?”

Direct from their web site, Sermo is “a practicing community of physicians who exchange clinical insights, observations, and review cases in real time — all the time.” Their objective, from my perspective, seems to be to connect doctors to each other, doctors to medical information, and doctors to medical services, all in one place.

Medicine is a cottage industry. Accessibility to doctors is decreasing: 18% of doctors are no-see, and that number is growing. This is due to lots of reasons: the trend away from in-patient care toward out-patient care, the introduction of hospitalists, the end of society / academic / association dominance, and script writers staying at the office to make ends meet instead of going to conferences. Key opinion leaders are becoming more polarized from practicing physicians, the number of pharmaceutical approvals is dropping, and the emergence of consumerism in the pharmaceutical industry is accelerating the growth in no-see doctors as well.

Detailing is becoming more and more expensive. It costs roughly between $250 and $450 to detail a physician, and when you start including some of the secondary costs, it can reach between $600 and $2000 per visit. E-detailing costs between $100 and $200 on average, but you will run into recruitment problems. Community-based e-detailing is estimated to cost between $35 and $65. And since you are “fishing where the fish are,” there are no recruitment costs. Doctors are already there. Enter Sermo.

The popularity of online communities has historically arced. They reach a certain point, and then the number of attendees, active users, and advertisers starts to drop off. Sermo needs to find a way to make sure its community doesn’t arc like the others. New media needs new rules. The way they plan to attack this is to build as much efficiency (or harmony, as they call it) into the community as possible. eBay is a good example of this: there are buyers and sellers, and the transaction is not over until everyone is happy.

Sermo plans to focus on the needs of the doctors on a vocational level. They will offer a virtual “water cooler” for the cottage industry – a place for doctors to share news, strange and insightful cases, and the opportunity for discussion amongst themselves. And they will collect and offer hard empirical data about what other doctors think about all of these.

Within Sermo’s postings, they provide Hot Spots that focus on Learnings and Earnings. These are small bubbles that appear throughout the Sermo interface when there is additional content the doctor may be interested in (about a particular drug, for example).

Another feature within Sermo is called AlphaMD. This is a way to collect real time market research from doctors within the community. Data will be collected within each article. Doctors can be targeted based on their surfing habits or their profile information. This research will cost 1/10 the usual amount for this kind of information, and reach 4 times the usual target audience.

Sermo also plans on growing its features, again to prevent the value of its community from arcing. Some of the upcoming features are: DrugCards, which will be like a real-time-updated Physicians’ Desk Reference; eDetailing, a frame through which doctors can schedule detailing sessions, which will integrate with your in-house detailing application; and RepSchedule, a form through which doctors can schedule a visit from a sales rep, which will integrate with your in-house CRM.

Sermo is working with the FDA and internal regulatory departments to connect doctors to hospitals, other healthcare professionals, and pharmaceutical companies, in ways that make sense for everyone.

This was a great presentation. The exciting part is that this opens up new doors for our sales and marketing teams. Thanks to Daniel for coming to speak with us, and to Bruce Levin for putting this together.

20 Reasons Why DHTML Layers are Bad

A bit of background before I dive in to the post… My team and I are responsible for developing and supporting the Brand web sites for Bristol-Myers Squibb.  The Brand Teams and external Marketing Agencies develop a concept for their site, and they deliver a fully functional version of the site in  HTML to us to implement.  We take that HTML, squeeze it into our custom content management system, and hook up all of our custom features.  This custom content platform that we call LaunchNet has built in registration management, site search, web analytics, SEO helpers, and a full suite of other tools. 

With an environment like this, managing expectations becomes essential. Sites need to be streamlined for industrial-strength campaigns involving thousands of concurrent users and possibly millions of site users per month. From this perspective, DHTML Layers are one of the banes of development. I have broken out the reasons why DHTML Layers make me lose my hair into 6 categories: Performance, Metrics & Analytics, Accessibility, Implementation, User Experience, and Search & SEO.

Performance

When using DHTML Layers, your users are now loading multiple pages combined into one, some of which they may never view, wasting download time and bandwidth. Pages are slower to download and slower to draw inside the browser. Processing is now heavier on the client side and heavily dependent on JavaScript, which is known to be a memory hog.

Metrics & Analytics

Layers are not pages.  This is a simple fact, but needs to be stated again for emphasis.  Layers are not pages.  This means that anything that is dependent on the construct of a page will break.  Google Analytics tags, which are designed to fire on page load, will need to be re-engineered to fire on layer loads instead of page loads. 

Accessibility

Mobile users on phones, PDAs, tablets, UMPCs, and other lightweight devices with web browsers will have difficulty.  These browsers are slimmed down versions of their bigger brothers, and do not have all the functionality needed to process JavaScript properly.  Cross Browser Compatibility is very difficult to implement and maintain with DHTML Layers.  You cannot bookmark a layer, either, so your users will not be able to come right back to where they were.  Popup blockers may block the use of DHTML layers, as this is a common delivery mechanism for advertising.  And, DHTML Layers could affect your site’s handicap accessibility.

Implementation

Layers on the site increase complexity and make maintenance more difficult. If JavaScript is turned off, any functionality to show or hide layers will not work, and your users will never see that content. Developers will need to spend lots of time making DHTML JavaScript work with content management systems, particularly when custom functionality is delivered this way. And if layers are big enough, scrolling can become an issue: the layers may run off the page, hiding content from view.

User Experience

User Experience is the biggest reason to implement DHTML Layers. They add a slick new interface to the hum-drum of static pages. But designers need to keep in mind that performance impacts user experience. This is an “I want it an hour ago” generation, and waiting even 10 seconds for a page to load will mean your users have left and gone somewhere else. Layers are not a standard UI convention for web development, and some users may be intimidated by the change in interface. And some folks may perceive layers as “popups”, which is bad for perception.

Search & SEO

Implementing site search while using DHTML Layers is very difficult. Most search products are page based, and as stated before, layers are not pages. Your content might not get crawled, or may be crawled incorrectly. Layers can also cause problems with public search engines: your pages could end up not being indexed, or not being indexed properly. Invisible content may also be viewed by search engine crawlers as “gaming the system” or a black hat SEO practice, and may negatively impact your page rank.

Conclusion

When implementing DHTML Layers, think twice about the impact on other aspects of your site. Ajax can do a lot of the same kinds of things that DHTML Layers can. Adobe’s Flash and Microsoft’s new Silverlight products can also deliver great new user experiences. All of these have benefits and drawbacks that need to be weighed before jumping in. You may be providing a slick new experience to your users, but you may be creating more problems than the new experience is worth. There are lots of other alternatives to explore.

Dynamic sitemap.xml Files in ASP.Net

I know this is not a new topic. It is not even a new topic for me. I have posted on defining what a sitemap.xml file is for, and on dynamic sitemap.xml files in C#. But my team is finally ready to start implementing this as part of our custom development platform for the external brand sites.

When you search for dynamic sitemap.xml creators in Google, you get back a plethora of sites. Some are code, and some are online tools. Since we are looking to create our file dynamically, from within the site, on demand, that helps narrow down our search. I have found a small number of code sources we can use to start with.

There is still the HTTP Handler from my original post. This project, ASP.Net Google Sitemap Provider by Bruce Chapman, is available on CodeProject. You can also read about it in a blog post on his iFinity site. It still looks like the most flexible solution.

There is a great looking solution on the ASP.Net site by Bertrand Le Roy called Google Sitemaps for ASP.NET 2.0. It has been ported over into the ASP.Net Futures July 2007 package. This solution is an HTTP Handler that uses the Web.sitemap file to generate a sitemap.xml file on the fly.

Another interesting idea I found in my searches was some code that shows a site map when a user gets a 404 error. This solution is also implemented as an HTTP Handler, but is only for 404 Page Not Found server errors. This code is also available on CodeProject in an article called Generate a Google Site Map Using the HTTP 404 Handler.

Here are some other sites of note to look at. They have similar solutions to the ones above, and it is always a good idea to see what other people have come up with.
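For reference, here is the bare-bones shape that these handler-based solutions share: an IHttpHandler that streams the sitemap XML on demand. This is a sketch, not the code from any of the projects above, and GetSiteUrls() is a hypothetical stand-in for however your platform enumerates its pages:

```csharp
using System;
using System.Web;
using System.Xml;

// A bare-bones sketch of an IHttpHandler that writes sitemap.xml
// on the fly; GetSiteUrls() is a hypothetical stand-in for however
// your CMS enumerates its pages.
public class SitemapHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/xml";
        XmlTextWriter writer = new XmlTextWriter(context.Response.Output);
        writer.Formatting = Formatting.Indented;

        writer.WriteStartDocument();
        writer.WriteStartElement("urlset",
            "http://www.sitemaps.org/schemas/sitemap/0.9");

        foreach (string url in GetSiteUrls())
        {
            writer.WriteStartElement("url");
            writer.WriteElementString("loc", url);
            writer.WriteElementString("changefreq", "weekly");
            writer.WriteEndElement();
        }

        writer.WriteEndElement();
        writer.WriteEndDocument();
        writer.Flush();
    }

    private string[] GetSiteUrls()
    {
        // In a real implementation this would walk the CMS page tree
        // or the Web.sitemap file; hard-coded here for illustration.
        return new string[] {
            "http://www.example.com/",
            "http://www.example.com/about.aspx"
        };
    }
}
```

The handler is then mapped to the sitemap.xml path in the httpHandlers section of web.config, so search engines requesting the file get a freshly generated document every time.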

If anyone has any additional resources, ideas, or suggestions, please leave me a comment and let me know what you think.

Mix08 – Session 10 – Application = Designers + Developers

This session is based on a big selling point that Microsoft has been driving home for Silverlight and WPF.  Designers and Developers who share the same source code can work on different aspects of the same project seamlessly without stepping on each other’s toes.  The session walked through two different development scenarios to demonstrate this point. 

The first demo was of a furniture design web site. The developer built the back end that integrated with the database, hooking up simple tabs and list boxes for dynamic content. The designer then picked up the XAML for the site and styled each of the page elements to make a slick-looking web site.

The second demonstration was of a Silverlight application called the Deep Zoom Composer.  This is an application that helps users add images inside of images inside of images, like the Keynote demo from The Hard Rock Cafe.  In the same fashion, the developer hooked up the interface to implement all the heavy lifting, and the designer modified the XAML to style the application any way he chose. 

This kind of interaction between development and design is extremely encouraging, and could cut a significant amount of time out of the construction and user acceptance testing of our external brand sites at BMS. I am hoping that we work with an agency in the near future that is just as excited about trying out this technology.