He's got it all wrong
I was masterfully lured into reading someone's blogpost today about the proper use of the 'var' keyword. This is a topic that stays hot, both in the community and in the workplace. The reason it stays hot is that there are basically two camps of developers: those who want to read what a bit of code is trying to do, and those who want to read how it does it.
Allow me to rant about this particular blogpost.
Well, there's probably more than one way to divide developers up. There are those who want to read code on paper and those who want to browse around it in their favorite development environment. Yes, people who want to immortalize a particular revision of the source code on paper for all eternity still exist. I hope they're close to retiring or changing their minds.
First of all, the claim that ReSharper "practically mandates" the use of var is completely false. You can set it either way: if your team decides that 'var' is the next sign of a disintegrating civilization, it can simply flip the setting so that ReSharper encourages replacing 'var' with the explicit type as you code. On to the first point:
Implicitly typed variables lose descriptiveness
That's a bold claim. Just because the type name provides "an extra layer of description" doesn't mean that layer is useful. The type name just states the obvious. The real description is in the naming of the variables and the methods that are used. Example:
// how is it clear to the reader that I can do this?
return individuals.Compute("MAX(Age)", String.Empty);
The answer? It's not relevant. The compiler decides what can and can't be done, and IntelliSense informs the user while that code is being typed. What is the reader trying to verify?
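To make that concrete, here is a small, self-contained sketch of the quoted example. The LoadIndividuals helper and the table layout are my own assumptions, since the blogpost only shows the Compute call; the point is that the variable name and the method call carry the meaning, while the compiler already knows the static type whether or not it's spelled out.

using System;
using System.Data;

class Program
{
    // Hypothetical helper standing in for wherever 'individuals' comes from in the original post.
    static DataTable LoadIndividuals()
    {
        var individuals = new DataTable();
        individuals.Columns.Add("Age", typeof(int));
        individuals.Rows.Add(42);
        individuals.Rows.Add(27);
        return individuals;
    }

    static void Main()
    {
        // 'var' or 'DataTable' up front makes no difference to the compiler or to IntelliSense;
        // the reader learns what this is from the name and from how it's used.
        var individuals = LoadIndividuals();
        var maxAge = individuals.Compute("MAX(Age)", String.Empty);
        Console.WriteLine(maxAge); // prints 42
    }
}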
'var' encourages Hungarian notation
I think this is absurd. If a developer really wants to show the type of a variable, he shouldn't be using 'var' at all.
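A quick sketch of what I mean (the Customer class and the variable names are just illustrative): if the type matters that much to the reader, write the type; 'var' doesn't force anyone to smuggle it into the variable name.

using System.Collections.Generic;

class Customer { public string Name; }

class NamingExamples
{
    void Demo()
    {
        // Hungarian-style naming that encodes the type in the name -- the habit being warned about:
        var lstCustomers = new List<Customer>();

        // If the reader really needs to see the type, just declare it explicitly instead:
        List<Customer> customers = new List<Customer>();

        // Or keep 'var' with a plain descriptive name; the type is one hover away:
        var activeCustomers = new List<Customer>();
    }
}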
Specificity vs Context
Same point, different paragraph.
// you can't blame the programmer for making this mistake
SomeMethodThatOperatesOnFruit(orange);
Yes, yes you can. You can blame him for submitting a piece of code that doesn't compile.
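A sketch, with made-up Fruit/Orange types since the blogpost doesn't show them: whether the declaration uses 'var' or not, the call simply doesn't compile unless Orange actually is a Fruit.

class Fruit { }
class Apple : Fruit { }
class Orange { } // deliberately not a Fruit in this sketch

class FruitExample
{
    static void SomeMethodThatOperatesOnFruit(Fruit fruit) { }

    static void Main()
    {
        var orange = new Orange(); // statically typed as Orange; 'var' changes nothing
        // SomeMethodThatOperatesOnFruit(orange); // error CS1503: cannot convert from 'Orange' to 'Fruit'
        SomeMethodThatOperatesOnFruit(new Apple()); // fine, Apple derives from Fruit
    }
}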
Increased reliance on IntelliSense
This might be a valid point. But then again, if I didn't have IntelliSense, I wouldn't be as productive as I am today. Heck, whenever I have to type up a piece of code in Notepad I cringe. I'm sure that using 'var' doesn't help, but I've got lots more to worry about when away from my IDE.
No backward compatibility
Well, for starters, C# 3.0 and .NET 2.0 are unrelated. One is a compiler version, the other is a framework version. You can write all your .NET 2.0-compatible code in C# 3.0 and use all the language features of that compiler; it will run just fine. Of course, if you want your code to compile on the older C# 2.0 compiler, then you're screwed. But in that case not using 'var' isn't going to help you either: no lambdas, no extension methods, no anonymous types, no initializers, and so forth. If you're going to target the C# 2.0 compiler, better plan for it from day one and think back to the good ol' year 2005, when LINQ was still a wet dream.
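To illustrate the compiler-versus-framework point, here is a snippet that, to the best of my knowledge, compiles with the C# 3.0 compiler while targeting .NET 2.0: 'var' and the collection initializer are purely compile-time sugar over ordinary .NET 2.0 types.

using System.Collections.Generic;

class CompatExample
{
    static void Main()
    {
        var numbers = new List<int> { 1, 2, 3 }; // C# 3.0 syntax over a .NET 2.0 type
        int sum = 0;
        foreach (var n in numbers) // compiles to the same IL as 'foreach (int n in numbers)'
        {
            sum += n;
        }
        System.Console.WriteLine(sum); // prints 6
    }
}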
So what does the use of 'var' buy us?
Less noise
No more "Dictionary<string, string> dictionary = new Dictionary<string, string>()".
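For what it's worth, a tiny sketch of the difference (the dictionary contents are just filler):

using System.Collections.Generic;

class NoiseExample
{
    void Demo()
    {
        // Without 'var': the same type spelled out twice on one line.
        Dictionary<string, string> headersExplicit = new Dictionary<string, string>();

        // With 'var': the same static type, stated once; nothing about the type is lost.
        var headers = new Dictionary<string, string>();

        headersExplicit["Accept"] = "text/html";
        headers["Accept"] = "text/html";
    }
}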
Less reliance on types
Without 'var', every time you decide that you need to abstract some portion of your code, add another layer of indirection, or do some refactoring, chances are you'll be replacing a lot of types all over your code. If you're lucky, your favorite VS add-in will guide you through it, but I've had more than my share of compiler errors because somehow, somewhere, a variable was still using the 'old' type instead of the new-and-improved one.
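Here's a contrived sketch of what I mean; Repository and GetNames are made-up names. Imagine GetNames originally returned List<string> and was later refactored to return IEnumerable<string>: the explicit declaration breaks, the 'var' one doesn't.

using System.Collections.Generic;
using System.Linq;

class Repository
{
    // Originally declared as List<string>; later refactored to the more abstract IEnumerable<string>.
    public IEnumerable<string> GetNames()
    {
        return new List<string> { "Ada", "Grace" };
    }
}

class RefactoringExample
{
    static void Main()
    {
        var repository = new Repository();

        // List<string> names = repository.GetNames(); // stopped compiling after the refactoring
        var names = repository.GetNames();              // unaffected by the change
        System.Console.WriteLine(names.Count());
    }
}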
This is where the dynamic language boys laugh in our faces. They claim that their language doesn't impose this sort of "red tape" and then move on to explain why unit testing is so important, while in a statically typed language the compiler removes half of the mistakes we make before we even run our code.
Closing remarks
That's it, I've burned through the rage that made me write up this rant. In short, I'd like to encourage you to embrace your compiler and use all the features it provides, so you can focus on writing the most concise and elegant code you can without sacrificing code quality. To me, that means removing all the cruft wherever possible and relying on the tools to provide the context needed to wade through my code.
Written on May 14, 2012