I was the victim of a very annoying piece of malware. I have been avoiding the corporate install of Internet Explorer for months now, and I have been using Firefox 2 and 3 instead. I am sure I was doing something I should not have been, because for the last two weeks strange popups have been plaguing my Firefox browsers, and my machine has been running like there was taffy on my hard drive.

I tried to remove the trojan with Spybot S&D, and that did not work. It did identify a Browser Helper Object (BHO) and some registry entries that I could not get rid of. That is when I knew it would be bad. Derek recommended that I try McAfee Avert Stinger. That was no help either. I tried HijackThis. That was informative, but not as helpful as I had hoped.

So I did some more digging online, and an article recommended Malwarebytes' Anti-Malware (MBAM). That was a big step forward. It clearly identified my problem as the Virtumonde Trojan. There were 59 DLLs, BHOs, data files, and registry entries all over my computer from this one trojan. I used MBAM to remove all of them, but the BHO registry entry was stubborn. This meant there was still more. I did some research on Virtumonde, and found that a tool called ComboFix will wipe it out entirely. It took about 20 minutes to run, rebooted my machine, and took another 20 minutes to complete. But when it was all done, I was trojan-free. No more popups when I use Firefox, and my machine is fast again. Now… if only I knew what I did that was so bad…
Techno-Christmas 2008
Well, another Christmas has come and gone, and we have all exchanged our gifts. Everyone in the family got new gadgets to alleviate their tech addiction.
Nicholas got his long-overdue Xbox 360. We bought him the Elite version, with the wireless remote and the 120GB hard drive. Can’t get a new console without a shiny new game too, right? So we got him one of his favorites… the new NHL 2K9. He also got lots of gift cards, so that he could go out and get a game of his choice. He picked up Call of Duty 4, and another wireless remote, so that he can pwn me and his friends up in NHL 2K9 or Call of Duty.
Mary Ann was light on the technology this year… she did get some CDs that would help her learn basic Dutch in the car on her long commute to work every morning.
I was burned by the HD-DVD fiasco last Christmas, so this year my wife bought me the Sony BDP-S350 Blu-ray Disc Player. And, just like the Xbox, you can't get a new Blu-ray player without getting a couple of new Blu-ray movies. My parents bought me Iron Man and Wall-E.
The great thing about technology is the same as the problem with technology. It is always improving. The Harman Kardon receiver we had for 6 years or so has no HDMI inputs or outputs. I had been using direct component connections to the TV, but I ran out of those too, with all the different HD devices I have now. So I had to treat myself to a new Sony STR-DG820 A/V Receiver. Yay for 4 HDMI inputs! It only took 30 minutes to set up with my speakers and all the devices.
And with all that technology, I was able to watch Rutgers beat NC State 29-23 in the 2008 PapaJohns.com Bowl. RU Rah Rah!
Visual Thesaurus Bends and Stretches Your Way to Synonyms
I have stayed connected to the search industry ever since I was involved with the original launch of the Pravachol web site ten years ago. One of the ways I have stayed connected is through great online resources like Alt Search Engines. This week they covered a great new online tool that helps its users search for synonyms. Visual Thesaurus displays entries in the thesaurus graphically and separates them into individual entries through a tool called Thinkmap. This is very similar to the technology used in the TouchGraph Google Browser. Both of these technologies are similar to some of the social networking graphs that are used in Web 2.0 sites. Take a look at the new Visual Thesaurus and the TouchGraph Google Browser, and let me know what you think of the way the technology is used, and what other ways you might like to see it applied.
Software Metrics
Being able to measure success for a software development group is a difficult thing. But not being able to show the success of your development group is a dangerous thing. Your management team will want to be able to measure quality, show improvement, and benchmark against industry standards. All the popular maturity models (if you put stock into them) emphasize the ability to measure, react, and improve your quality and processes. My department is no different. We try to remain as flexible and lightweight as possible, and add formality where improvement is necessary. Here are some of the metrics that we collect to measure our success and find areas of improvement.
Code Coverage
There will always be bugs. No matter how hard you try, there will always be bugs in your software. That does not mean that you cannot try to prevent them. One of the easiest ways to do that is to write automated unit tests to validate that your code is doing what is expected. You can write your unit tests in lots of different ways, and Steve Cornett's Code Coverage Analysis paper gives lots of different ways to break down code coverage. A great place to start is to aim for 50% coverage of all of your lines of code. And, as Tony Obermeit says in his Code Coverage Lessons paper, your ultimate goal of course is to hit 100% coverage. You will need to pick a code coverage tool to help measure your success. In my department, developing in a Visual Studio and C# environment lends itself to the nCover open source solution. This solution works well with our CruiseControl environment. Test Driven Development methodologies and mocking tools can help you get closer and closer to covering all of your code with automated tests.
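To make this concrete, here is a minimal sketch of the kind of unit test a coverage tool measures against. The DiscountCalculator class and its tests are hypothetical, and I am assuming NUnit as the test framework; a tool like nCover simply reports which of these lines the tests actually execute.

using System;
using NUnit.Framework;

// Hypothetical class under test.
public class DiscountCalculator
{
    public decimal GetDiscountRate(decimal orderTotal)
    {
        if (orderTotal < 0)
            throw new ArgumentOutOfRangeException("orderTotal");
        if (orderTotal >= 1000)
            return 0.10m;   // large orders get 10%
        if (orderTotal >= 100)
            return 0.05m;   // mid-size orders get 5%
        return 0m;          // small orders get no discount
    }
}

[TestFixture]
public class DiscountCalculatorTests
{
    [Test]
    public void LargeOrdersGetTenPercentOff()
    {
        Assert.AreEqual(0.10m, new DiscountCalculator().GetDiscountRate(2500m));
    }

    [Test]
    public void SmallOrdersGetNoDiscount()
    {
        Assert.AreEqual(0m, new DiscountCalculator().GetDiscountRate(50m));
    }

    // With only these two tests, a coverage report will flag the negative-total
    // and mid-size-order branches as unexercised. That gap is exactly what the metric exposes.
}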
Defect Tracking
I use the words defects and bugs interchangeably. I am sure some people would disagree with that, but I think it is close enough. If it is not working as expected, then it is a defect, and it is a bug. Regardless, defects are identified in the development process, in system and user acceptance testing, and in the production environment. The objective is to minimize the number of bugs that are found in the live environments. To do that, you need to encourage the identification and mitigation of bugs in earlier and earlier stages. This sounds fundamental, but it becomes difficult to implement. There are lots of methods you can use to identify, solve, remove, and prevent bugs. You must have a way to measure that these methods are improving your success rate. And that means measuring the number of defects found in each environment – development, system test, user acceptance test, and production. The easiest set of numbers to get in a corporate environment is production defects. There is always a problem management system or help desk that tracks these kinds of things. But as a software development organization, you need to implement bug tracking throughout the entire lifecycle of your software. You can trend the numbers, and make sure you are finding more bugs before UAT, particularly in development. Tools like Bugzilla, an open source bug tracking tool, can help you track, trend, and manage defects in your software throughout its lifecycle.
Support Ticket Management
Software is not a static entity. It is always changing. Just think of all the patches, updates, service packs, and bug fixes that Microsoft releases for its suite of software. In a corporate environment, it is no different. Software management does not end once it is released. Teams of developers will be constantly updating desktop, web-based, and console applications based on new requirements and requests from their clients. Problem Management software can be used to help track and trend all of these requests by data points such as severity (Urgent, High, Medium, Low), priority (Urgent, High, Medium, Low), assessment (Customer Assistance, Required Modification, Elective Enhancement, Break Fix), difficulty (High, Medium, Low), etc. You can measure success against more complex metrics such as the number of tickets created, number of open tickets, time to resolution, etc. All of these metrics will help you determine how fluid, stable, usable, sustainable, and maintainable your software is. Do not ignore your software or its users once it is released to production.
Analytics
Web Analytics tools can tell you how many users you have had on your site, how long they visited, where they came from, where they went, how they found your site, whether they reached your goal pages, whether they converted, and whether they returned. There are free web-based tools like Google Analytics, and off-the-shelf packages like WebTrends and CoreMetrics that can help you measure site activity. Do not ignore these metrics; use them to define current activity, make improvements to your site, track your new results, and continue to improve. They directly measure your clients' interaction with your software, and can identify trends that, with simple changes, can vastly improve your software and development processes.
Conclusion
So… these are some of the ways that we track the success of our software. There are a host of other methods to measure software, such as function points, lines of code, complexity, interfaces, velocity, etc. What ways do you measure your software? How do you define success? What are your plans for the future?
Leave me a comment and let me know what you think.
Sermo: It Takes a Village to Raise a Doctor
Daniel Palestrant, CEO of Sermo.com, has come to Bristol-Myers Squibb to speak at the quarterly OMNI meeting. This meeting is targeted at the individual Brand Teams and intends to bring innovative ideas into the company. Daniel came to talk about Sermo and Online Physician Communities – Salvation or Mirage?
Direct from their web site, Sermo is “a practicing community of physicians who exchange clinical insights, observations, and review cases in real time — all the time.” Their objective, from my perspective, seems to be to connect doctors to each other, doctors to medical information, and doctors to medical services, all in one place.
Medicine is a cottage industry. Access to doctors is decreasing – 18% of doctors are no-see, and that number is increasing. This is due to lots of reasons – the trend away from in-patient towards out-patient care, the introduction of hospitalists, the end of society / academic / association dominance, and script writers staying at the office to make ends meet instead of going to conferences. Key opinion leaders are becoming more polarized from practicing physicians, the number of pharmaceutical approvals is dropping, and the emergence of consumerism in the pharmaceutical industry is accelerating the number of no-see doctors as well.
Detailing is becoming more and more expensive. It costs roughly between $250 and $450 to detail a physician, and when you start including some of the secondary costs, it can reach between $600 and $2000 per visit. E-detailing costs between $100 and $200 on average, but you will run into recruitment problems. Community based e-detailing is estimated to cost between $35 and $65. And since you are "fishing where the fish are," there are no recruitment costs. Doctors are already there. Enter Sermo.
The popularity of online communities has historically arced. They reach a certain point, and then the number of attendees, active users, and advertisers starts to drop off. Sermo needs to find a way to make sure their community doesn't arc like the others. New media needs new rules. The way they plan to attack this is to make sure there is as much efficiency (or harmony, as they call it) within the community as possible. eBay is a good example of this – there are buyers and sellers, and the transaction is not over until everyone is happy.
Sermo plans to focus on the needs of the doctors on a vocational level. They will offer a virtual “water cooler” for the cottage industry – a place for doctors to share news, strange and insightful cases, and the opportunity for discussion amongst themselves. And they will collect and offer hard empirical data about what other doctors think about all of these.
Within Sermo's postings, they provide Hot Spots that focus on Learnings and Earnings. These are small bubbles that appear throughout the Sermo interface when there is additional content the doctor may be interested in (about a particular drug, for example).
Another feature within Sermo is called AlphaMD. This is a way to collect real time market research from doctors within the community. Data will be collected within each article. Doctors can be targeted based on their surfing habits or their profile information. This research will cost 1/10 the usual amount for this kind of information, and reach 4 times the usual target audience.
Sermo also plans on growing its features, again to prevent the value of its community from arcing. Some of the upcoming features are: DrugCards, which will be like a real-time updated Physicians Desk Reference; eDetailing, a frame through which doctors can schedule detailing, and which will integrate with your in-house detailing application; and RepSchedule, a form through which doctors can schedule a visit from a sales rep, and which will integrate with your in-house CRM.
Sermo is working with the FDA and internal regulatory departments to connect doctors to hospitals, other healthcare professionals, and pharmaceutical companies, in ways that make sense for everyone.
This was a great presentation. The exciting part is that this opens up new doors for our sales and marketing teams. Thanks to Daniel for coming to speak with us, and to Bruce Levin for putting this together.
20 Reasons Why DHTML Layers are Bad
A bit of background before I dive into the post… My team and I are responsible for developing and supporting the Brand web sites for Bristol-Myers Squibb. The Brand Teams and external Marketing Agencies develop a concept for their site, and they deliver a fully functional version of the site in HTML to us to implement. We take that HTML, squeeze it into our custom content management system, and hook up all of our custom features. This custom content platform, which we call LaunchNet, has built-in registration management, site search, web analytics, SEO helpers, and a full suite of other tools.
With an environment like this, managing expectations becomes essential. Sites need to be streamlined for industrial-strength campaigns involving thousands of concurrent users and possibly millions of site users per month. From this perspective, DHTML Layers are one of the banes of development. I have broken out why DHTML Layers make me lose my hair into 6 categories: Performance, Metrics and Analytics, Accessibility, Implementation, User Experience, and Search & SEO.
Performance
When using DHTML Layers, your users are now loading multiple pages combined into one, some of which they may not even view, wasting download time and bandwidth. Pages are slower to download, and slower to draw inside the browser. Processing is now heavier on the client side, and is heavily dependent on JavaScript, which is known to be a memory hog.
Metrics & Analytics
Layers are not pages. This is a simple fact, but needs to be stated again for emphasis. Layers are not pages. This means that anything that is dependent on the construct of a page will break. Google Analytics tags, which are designed to fire on page load, will need to be re-engineered to fire on layer loads instead of page loads.
Accessibility
Mobile users on phones, PDAs, tablets, UMPCs, and other lightweight devices with web browsers will have difficulty. These browsers are slimmed down versions of their bigger brothers, and do not have all the functionality needed to process JavaScript properly. Cross Browser Compatibility is very difficult to implement and maintain with DHTML Layers. You cannot bookmark a layer, either, so your users will not be able to come right back to where they were. Popup blockers may block the use of DHTML layers, as this is a common delivery mechanism for advertising. And, DHTML Layers could affect your site’s handicap accessibility.
Implementation
Layers on the site increase complexity, and make maintainability more difficult. If JavaScript is turned off, any functionality to show or hide layers will not work, so your users will not see it. Developers will need to spend lots of time to make DHTML JavaScript function with content management systems, particularly when custom functionality is delivered in this way. And, if layers are big enough, scrolling can become an issue, as the layers may run off the page, hiding content from view.
User Experience
User Experience is the biggest reason to implement DHTML Layers. They add a slick new interface to the hum-drum of static pages. But designers need to keep in mind that performance impacts user experience. This is an "I want it an hour ago" generation, and waiting even 10 seconds for a page to load will mean your users have left and gone somewhere else. Layers are not a standard UI convention for web development, and some users may be intimidated by the change in interface. And, some folks may perceive layers as "popups", which is bad for perception.
Search & SEO
Implementing site search while using DHTML Layers is very difficult. Most search products are page based, and as stated before, layers are not pages. Your content might not get crawled, or may be crawled incorrectly. Layers could also cause a problem with search engines. Your page could end up not getting indexed, or not indexed properly. Invisible content may also be viewed by search engine crawlers as “gaming the system” or a black hat SEO practice, and may negatively impact your page rank.
Conclusion
When implementing DHTML Layers, think twice about the impact on other aspects of your site. Ajax can do a lot of the same kinds of things that DHTML Layers can. Adobe's Flash and Microsoft's new Silverlight products can also deliver great new user experiences. All of these have benefits and drawbacks that need to be weighed before jumping in. You may be providing a slick new experience to your users, but you may be creating more problems than it is worth. There are lots of other alternatives to explore.
Dynamic sitemap.xml Files in ASP.Net
I know this is not a new topic. It is not even a new topic for me. I have posted on defining what a sitemap.xml file is for, and on dynamic sitemap.xml files in C#. But my team is finally ready to start implementing this as part of our custom development platform for the external brand sites.
When you search for dynamic sitemap.xml creators in Google, you get a plethora of sites back. Some are code, some are online tools. Since we are looking to create our file dynamically from within the site, on demand, that helps narrow down our search. I have found a small number of code sources we can use to start with.
There is still the HTTP Handler from my original post. This project, ASP.Net Google Sitemap Provider by Bruce Chapman, is available on CodeProject. You can also read about it in a blog post on his iFinity site. It still looks like the most flexible solution.
There is a great looking solution on the ASP.Net site by Bertrand Le Roy called Google Sitemaps for ASP.NET 2.0. It has been ported over into the ASP.Net Futures July 2007 package. This solution is an HTTP Handler that uses the Web.sitemap file to generate a sitemap.xml file on the fly.
Another interesting idea I found in my searches was some code that shows a site map when a user gets a 404 error. This solution is also implemented as an HTTP Handler, but is only for 404 Page Not Found server errors. This code is also available on CodeProject in an article called Generate a Google Site Map Using the HTTP 404 Handler.
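To give a flavor of the HTTP Handler approach that all of these solutions share, here is a minimal sketch. The class name, the hard-coded page list, and the web.config registration are hypothetical; in our case the URL list would really come out of the LaunchNet content database.

using System;
using System.Text;
using System.Web;
using System.Xml;

// Register against the sitemap.xml path in web.config, for example:
// <add verb="GET" path="sitemap.xml" type="SitemapHandler" />
public class SitemapHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/xml";

        XmlTextWriter writer = new XmlTextWriter(context.Response.OutputStream, Encoding.UTF8);
        writer.WriteStartDocument();
        writer.WriteStartElement("urlset", "http://www.sitemaps.org/schemas/sitemap/0.9");

        // Hypothetical page list - a real handler would pull this from the CMS.
        string[] pages = { "default.aspx", "about.aspx", "contact.aspx" };
        string root = "http://" + context.Request.Url.Host + "/";

        foreach (string page in pages)
        {
            writer.WriteStartElement("url");
            writer.WriteElementString("loc", root + page);
            writer.WriteElementString("lastmod", DateTime.Now.ToString("yyyy-MM-dd"));
            writer.WriteElementString("changefreq", "weekly");
            writer.WriteEndElement();
        }

        writer.WriteEndElement();
        writer.WriteEndDocument();
        writer.Flush();
    }
}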
Here are some other sites of note to look at. They have similar solutions to the ones above, and it is always a good idea to see what other people have come up with.
- http://digitalcolony.com/labels/sitemap.aspx
- http://www.bloggingdeveloper.com/post/Generate-Sitemaps-for-Google-MSN-Live-Yahoo-Ask-on-the-fly-using-an-ASPNET-HttpHandler.aspx
- http://james.newtonking.com/pages/sitemaps-net.aspx
- http://www.communitymx.com/content/article.cfm?page=5&cid=3DAB2
If anyone has any additional resources, ideas, or suggestions, please leave me a comment and let me know what you think.
Mix08 – Session 10 – Application = Designers + Developers
This session is based on a big selling point that Microsoft has been driving home for Silverlight and WPF. Designers and Developers who share the same source code can work on different aspects of the same project seamlessly without stepping on each other’s toes. The session walked through two different development scenarios to demonstrate this point.
The first demo was of a furniture design web site. The developer built the back end that integrated with the database. He hooked up simple tabs and list boxes to the database for dynamic content. The designer then picked up the XAML for the site, and styled each of the page elements to make a slick looking web site.
The second demonstration was of a Silverlight application called the Deep Zoom Composer. This is an application that helps users add images inside of images inside of images, like the Keynote demo from The Hard Rock Cafe. In the same fashion, the developer hooked up the interface to implement all the heavy lifting, and the designer modified the XAML to style the application any way he chose.
This kind of development and design interaction is extremely encouraging, and could cut down a significant amount of time we spend in construction and user acceptance testing of our external brand sites at BMS. I am hoping that we work with an agency in the near future that is just as excited about trying out this technology.
Mix08 – Session 9 – Silverlight and Web Analytics

This session was a panel discussion regarding Web Analytics. The panel was composed of members from WebTrends, Omniture, and Microsoft. I found this session very interesting, since most of the solutions to track analytics within Silverlight applications are very similar to the ones we implemented with our Flash based RIA sites.
Agenda
- Omniture – SiteCatalyst, hosted solution
- WebTrends – WebTrends Analytics, hosted solution
- Microsoft – AdCenter Analytics – Beta2 released March 1
- These products track information through page tags or beacons
- With Silverlight (and other RIA platforms like Flash, Ajax, etc.), you don’t change pages.
- You have to create and define pseudo-page views
- 4 Scenarios:
- Tracking Silverlight Installation
- Tracking User Interaction
- Tracking Media Drop-off
- Tracking Media Buffering
Silverlight Installation
- JavaScript file to put on site
- Silverlight.isInstalled method identifies if it is available
- Check for each version, give them an experience for that version
Tracking User Interaction
- Determine actions in your pipeline, funnel, etc.
- Add Event handlers for each action
- Event handlers map to page view equivalents
Tracking Media Drop-off
- Add invisible media markers every 5 seconds in the video
- Media Markers trigger events
- Events trigger page views
- You can then monitor drop-off in 5 second increments
Tracking Media Buffering
- Handle the MediaElement.CurrentStateChanged event
- When State goes to Buffering, trigger MediaBuffering page view
- Correlate bit rate, content, geography, etc.
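Here is a rough sketch of how the two media scenarios above might look in Silverlight code-behind. The media element name and the TrackPageView helper are hypothetical; the helper just stands in for whatever JavaScript bridge your analytics vendor's tag exposes on the host page.

using System.Windows;
using System.Windows.Browser;
using System.Windows.Controls;
using System.Windows.Media;

public partial class VideoPage : UserControl
{
    public VideoPage()
    {
        InitializeComponent();
        // "media" is assumed to be a MediaElement in the XAML, with timeline
        // markers added every 5 seconds when the video was encoded.
        media.MarkerReached += OnMarkerReached;
        media.CurrentStateChanged += OnCurrentStateChanged;
    }

    private void OnMarkerReached(object sender, TimelineMarkerRoutedEventArgs e)
    {
        // Each 5-second marker becomes a pseudo-page view, so the analytics
        // package can report drop-off in 5-second increments.
        TrackPageView("/video/marker/" + e.Marker.Time.TotalSeconds);
    }

    private void OnCurrentStateChanged(object sender, RoutedEventArgs e)
    {
        // When the player starts buffering, record a MediaBuffering pseudo-page view.
        if (media.CurrentState == MediaElementState.Buffering)
        {
            TrackPageView("/video/buffering");
        }
    }

    private void TrackPageView(string pseudoPage)
    {
        // Hypothetical bridge into the vendor's JavaScript tag on the host page;
        // the trackSilverlightPageView function is assumed to be defined there.
        HtmlPage.Window.Invoke("trackSilverlightPageView", pseudoPage);
    }
}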
Analytics can bring you to a single goal
- This is through A / B Testing
- Separation of design in XAML and code in JavaScript enables simple A / B design
- In JavaScript or on the server, for X% of visitors – show different XAML (a server-side sketch follows after this list)
- Use analytics service to track difference between results in variation
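As a minimal sketch of the server-side option mentioned above, bucketing could be as simple as the helper below. The class, cookie name, and XAML file names are all hypothetical; the point is just that each visitor gets a sticky bucket, and the analytics service then reports the two variations separately.

using System;
using System.Web;

public static class AbTest
{
    private static readonly Random random = new Random();

    // Puts each visitor into bucket A or B (sticky via a cookie) and returns
    // the XAML variant to serve for this request.
    public static string PickXaml(HttpContext context, int percentInVariantB)
    {
        HttpCookie cookie = context.Request.Cookies["ab-bucket"];
        string bucket = (cookie != null) ? cookie.Value : null;

        if (bucket == null)
        {
            bucket = (random.Next(100) < percentInVariantB) ? "B" : "A";
            context.Response.Cookies.Add(new HttpCookie("ab-bucket", bucket));
        }

        // VariantA.xaml and VariantB.xaml are hypothetical file names.
        return (bucket == "B") ? "VariantB.xaml" : "VariantA.xaml";
    }
}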
Wrap Up
- There is a Silverlight sample available – http://xmldocs.net/analytics
- Track Geolocation with these projects
- Akamai EdgeScape
- Windows Live
Mix08 – Session 8 – The Future of Advertising Technology
This session was very interesting. As a technology professional, the business side is not as transparent as it could be sometimes. This session opened the door to understanding how advertising, both online and traditional media, work today and could work in the future. Microsoft is investing in this vertical very heavily, and through some of these ideas is looking to become a major player.
Market Overview – Now and In The Future
- The advertising market is manual – media is purchased through phone calls & emails
- The advertising market is opaque – there is no pricing transparency
- The advertising market is inefficient – there is a lot of remnant inventory that drives prices lower
- Ad networks are the most efficient way to procure advertising
- They can buy them on a CPM basis (cost per thousand) and sell at CPC (cost per click) or CPA (cost per acquisition)
- Ad Exchanges – they add transparency, increase liquidity by letting advertisers bid & buy across all networks
- Today – advertiser & agencies come up with marketing goals, the agency will change the mix of ad media manually to match
- In The Future – advertisers and agencies will define a media plan, translate it into business rules, and through automated experiments, an optimization system can evaluate and adjust the advertising mix in real time
- Each impression’s value can be set in real time, adjusted, and shifted based on campaign objectives (awareness vs conversions, etc)
- In the future there will be very few analysts, and each of them will be dealing with millions of publishers through automated optimizers and exchanges
- Agencies and Advertisers (Buy Side) will have to be open
- Today – Premium sales force manages most ads, remnant sales force has less
- Future – Automated Systems will take away from premium and leave very low or valueless markets out
Optimizer Architecture
- Seamlessly add advertising into any content, application, or device
- Seamlessly leverage other Microsoft platforms
- Create ad-funded businesses all in one platform
The Brave New World of Advertising
- Nokia – leveraging the power of nanotechnology
- Hypertargeting, personalized advertising
- Personalized product offerings (like Nike – build your own shoes, custom Mini Coopers, Scions)
- Projectors, OLED, Disposable video
- Low resolution projectors are really cheap now
- OLED will become printable
- Siemens Printable Video Displays – with printable batteries
- Tracking & Measuring advertising offline like online
- RFID
- GPS Phones
- 2D Bar Codes
- Bluetooth, etc.
- Neural Scanning