Thursday, 15 November 2007

fxCop story

It is quite common for companies to possess their own “Rules of coding” document, like the one from Philips, which is more or less a compilation of documents available on the net. I strongly suggest also reading "Framework Design Guidelines", which is my bible in the area. It is one thing to write the paper and another to have it truly implemented. Some companies have an established process where no code may be checked into the repository without a code review performed by another developer, but it is an open issue whether this process ensures that all rules are always followed and that exceptions from the rules are registered. That is why it is always nice to have a cyber cop who will check once in a while whether the code in your repository follows the rules established in the written documents. The cop even has a name on his t-shirt and heeeeeaaaaar heeeeee issss - fxCop.

Each project is different, and each time you must select which rules will be used. I think the best way to deal with the issue is the following:

1. Create an fxCop project
Gather all the dlls and exes to be analyzed and add them in the Targets section. Analyze them and save the whole project.
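Gathering the target list can even be scripted, so it does not go stale as assemblies come and go. A minimal Python sketch (the `bin\Release` folder is just an example path; adapt it to your build output):

```python
from pathlib import Path

def find_targets(bin_dir):
    """Return sorted paths of all dlls and exes under bin_dir (recursively)."""
    root = Path(bin_dir)
    targets = [p for p in root.rglob("*") if p.suffix.lower() in (".dll", ".exe")]
    return sorted(str(p) for p in targets)

if __name__ == "__main__":
    # Print one /f: argument per target, ready to paste after fxcopcmd
    for t in find_targets(r"bin\Release"):
        print(f'/f:"{t}"')
```

The same list can be pasted into the Targets section of the fxCop project, or fed straight to fxcopcmd via /f: switches.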

2. Filter rule sets
For example, if you are not planning to localize your software, you do not need the Globalization Rules at all. Similarly, if you do not have a legacy problem, you probably do not need the Interoperability Rules.

3. Filter rules
Check each rule. The more errors a rule reports, the more closely you need to analyze what exactly the rule is about. In some situations fxCop provides you the link directly; in others you must look at the help or ask MSDN manually about the rule. You will quickly find that some rules are useless for you, like (for me):

  • EnumsShouldHaveZeroValue – we use enums just internally and they are not public
  • AvoidNamespacesWithFewTypes – the project is at an early stage and further extensions are planned shortly
  • MarkAssembliesWithClsCompliant – we do not need to be compliant with CLS

And some of them are golden ones (for me):

  • IdentifiersShouldBeCasedCorrectly – a priceless thing, which would usually take me a lot of time if performed manually
  • AvoidUncalledPrivateCode – it helps me quickly find unused code
  • DoNotCallPropertiesThatCloneValuesInLoops – this should never happen

Besides checking MSDN, open the code if necessary and check where the particular error happens. It will often turn out that just by looking at the code you can tell whether the problem is dummy or real.

Analyze the code again and check how much work it is to standardize the code. If there is more than you can afford, you need to repeat steps 2 and 3. If you can, talk with your developers to ensure that your point of view is valid.

4. Look at the error sets and ensure that they are valid
Forget for a second about fxCop and how cool it is, and try to put yourself in the developers' shoes. Are the listed errors really important; do the developers really need to spend time and risk the repair (which may cause additional errors)? Remember that you should implement the whole thing incrementally to convince people that it truly empowers their work. If you start with too many rules at the beginning, you will make people refactor for months, which will not make them or your boss happy ;) That is why it is good to “think big and start small”.

Filter, for the last time, the list of suggestions for developers.

5. Establish the process
The tests below were performed on a solution which consists of 62 projects and has around 30k LOC.

Option A
Run the report from the command line manually. The default XML report is rendered nicely by IE out of the box – options like expand/collapse work there without any configuration. You may generate a CSV in the same way, but in fact it is the same XML (just the extension changes). The disadvantage is that you need to run the whole process manually and keep updating the list of targets as outputs (dis)appear, but the plus is that it does not disturb the developers' daily work. A batch like:

fxcopcmd /p:"..\..\..\_fxCop\ESPO1v0.FxCop" /c /o:".\FxCopReport.xml"

took me about 1:21.
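If you want more than eyeballing FxCopReport.xml in IE, the per-rule totals can be summarized with a short script. A hedged sketch: real FxCop reports nest Message elements under targets and modules and the exact schema varies by version, so the sample document and the TypeName/CheckId attribute names below are assumptions to adapt to your actual report:

```python
import xml.etree.ElementTree as ET
from collections import Counter

def count_rules(report_xml):
    """Count FxCop messages per rule name, wherever they sit in the tree."""
    root = ET.fromstring(report_xml)
    counts = Counter()
    for msg in root.iter("Message"):
        counts[msg.get("TypeName", "Unknown")] += 1
    return counts

# Hand-made sample, mimicking the nesting of a real report:
SAMPLE = """<FxCopReport>
  <Targets><Target Name="ESPO.dll"><Modules><Module><Messages>
    <Message TypeName="IdentifiersShouldBeCasedCorrectly" CheckId="CA1709"/>
    <Message TypeName="IdentifiersShouldBeCasedCorrectly" CheckId="CA1709"/>
    <Message TypeName="AvoidUncalledPrivateCode" CheckId="CA1811"/>
  </Messages></Module></Modules></Target></Targets>
</FxCopReport>"""

for rule, n in count_rules(SAMPLE).most_common():
    print(rule, n)
```

Such a summary is handy when deciding in steps 2-4 which rules generate the most noise.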

Option B
The other idea is to run fxCop as a post-build step for the Release version of the product (“Adding FxCopCmd to Your Build Process”). This way, daily Debug builds stay quick, and when everything is ready, the key developer makes a Release build and immediately gets a report of things to repair. Unfortunately, post-build actions are available at the project level, not at the solution level, so you must either apply it to each project separately, or once to the project which is built last in the whole solution, if you are sure it will always be compiled last (Project Build Order). In this case, you must also be sure that each person:

  • has fxCop installed
  • has it installed in the same place, or has the path set properly
  • has the fxCop project downloaded to the same place on each PC (a shared folder is possible)

FxCopCmd /p:"D:\SVN\ESPO\fxCop\ESPO.FxCop" /c /o:"D:\SVN\ESPO\fxCop\FxCopReport.xml"

The other option, suggested here, is to use the VS 2005 add-in and link it with the MSBuild process.

Option C
An even more interesting option is to add the action as a post-build step in every project. You need to change the fxCop project and clear the Targets tab. You may arrange that the folder where fxCop is installed is added to the Path environment variable, along with a common folder holding the fxCop project and all the reports. You then need to ensure that the post-build action is added to each project to have the whole thing working:

Fxcopcmd /f:"$(TargetPath)" /p:"..\..\..\_fxCop\ESPO1v0.FxCop" /c /out:"..\..\..\_fxCop\$(ProjectName)-fxCopReport.xml"

You need to know that my sample solution compiled in 32-45 seconds without fxCop and in 8:10 with fxCop for option C (even with just 5 rule sets turned on)! Only if you don't have too many projects may this be a good option. In other cases, option A or B will probably be better.

Whichever option you choose, there has to be a negotiation stage (especially at the beginning, until the set of used rules stabilizes), when developers may give their opinion about the usefulness of particular rules or changes. Once the process is established and the set of rules is redefined, it must be double-checked after some time whether the errors have been repaired.
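That double check can also be automated by diffing the per-rule error counts from two report snapshots. A minimal sketch, where plain dicts (rule name to error count) stand in for the parsed reports:

```python
def diff_counts(old, new):
    """Compare per-rule error counts between two report snapshots.

    Returns (fixed, regressed): rules whose count dropped to zero,
    and rules whose count grew since the previous snapshot."""
    fixed = sorted(r for r, n in old.items() if n > 0 and new.get(r, 0) == 0)
    regressed = sorted(r for r in new if new[r] > old.get(r, 0))
    return fixed, regressed

# Example snapshots: counts per rule before and after a repair round.
before = {"IdentifiersShouldBeCasedCorrectly": 12, "AvoidUncalledPrivateCode": 4}
after = {"AvoidUncalledPrivateCode": 1, "DoNotCallPropertiesThatCloneValuesInLoops": 2}

fixed, regressed = diff_counts(before, after)
print("fixed:", fixed)          # fixed: ['IdentifiersShouldBeCasedCorrectly']
print("regressed:", regressed)  # regressed: ['DoNotCallPropertiesThatCloneValuesInLoops']
```

Running it after each review cycle makes it obvious whether the negotiated rules are actually being repaired or quietly ignored.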

At a later stage you may also consider writing your own customized rules, but you must always consider whether it really saves time ;)

Conclusion
You must spend a real chunk of time to make the whole thing REAL, but I strongly believe that the automated effect is usually worth it, especially if the project:

  • is long term
  • is open source, or you need to pass the rights to the source code to somebody else
  • has its code analyzed externally
  • is one to which others may write plug-ins
  • sees developers switch

Of course you must review the whole thing once in a while and redefine the set of used rules (making the tolerance narrower or wider), but even the mere knowledge about the post-build process will motivate developers to write better code.

Last but not least, it is important to note that the automated process is just a support for, and not a replacement of, manual code reviews.
