Monday, June 30, 2008

Code Conventions

And no, by code "conventions" I'm not talking about JavaOne or VSLive!

Code conventions are the standards you live by (or are forced by your employer to live by) that define, down to the nitpicking details, how your code should look. Code conventions commonly specify things like how you name your variables, functions, and classes; guidelines for how to organize the text of your code on-screen; and requirements for properly documented code. If you haven't had to comply with code conventions (and maybe even if you have), you may wonder why anyone would want or need such stifling-sounding constraints. The easiest way to answer that question is with another question: have you ever had to look at someone else's code and try to understand it?

I'm assuming almost everyone answered yes to that question. If you've ever had a hard time trying to figure out just what was going through the head of the person who wrote that code you've been staring at, you know why we need code conventions (unless it was actually a bad piece of code, not just badly written... there's plenty of that out there too, but that's another topic entirely). Code conventions help make it easier for others to read and understand your code. Or to put it more selfishly: code conventions make it easier for you to understand someone else's code!

Now are you ready to accept that code conventions just might be a good thing?

The main problem with this idea is the question "who makes the standards?" There are some common, widespread conventions out there... some good, some not-so-good (Hungarian notation, anyone?). Most likely, the standards you follow will be the ones set down by your employer... hopefully they're making you follow some good ones! Without any universal standards -- and the firepower to force everyone to follow them -- you'll still undoubtedly run into lots of marginally intelligible code, but if your company enforces any sort of code conventions at least you'll be able to read your coworker's code fairly easily after he quits to form what he thinks will be the next MySpace.

When I started at my current employer, there were no documented code conventions for VB development. We have a fairly small number of programmers so perhaps it was felt that there was no need for them, but all it takes is one sloppy programmer to make a big mess. Since some of our most complex systems were written by multiple contractors, the code we have to maintain is all over the place, stylistically speaking. It didn't take me too long to realize that even a small team needs some guidelines on how their source code should be written. And since I was a lead programmer, I got to dictate what those guidelines would be!

When I sat down to start writing my guidelines for coding conventions, I decided to focus on four main areas: documentation, code formatting, programming techniques, and source code organization. Each of these is a very large topic. Some of what I eventually developed came from my own experiences writing and maintaining code. A lot of the naming conventions I adopted come from Sun Microsystems' Java Code Conventions. Many of them I'm still developing as I go along.

Over the next several posts I hope to share the highlights in each of these areas. And since I know it's impossible for me to have thought of everything, I'd love to see what anyone else has to say on this topic as well!


Wednesday, June 25, 2008

Managing and Motivating Developers: Tips for Management Cluefulness

OK, I said I wasn't going to just post links to other content very often, but every now and then I run across something that states a concept better than I ever could. This is one of those.

Managing and Motivating Developers: Tips for Management Cluefulness

Monday, June 23, 2008

How the brain works: Illusions video and other Weird Science videos at 5min

OK... I normally try not to just post links to other pages, but I have to say I've always loved optical illusions and these are kind of freaky to watch.

How the brain works: Illusions video and other Weird Science videos at 5min


Wednesday, June 18, 2008

Change is Good (part 3)

The ever-growing (and changing) framework
As if .NET wasn't big enough already, I'm afraid to find out how many new classes were added between .NET 2.0 and .NET 3.5. With so many classes and types in the framework, can anyone really call themselves a .NET expert? I know I'm a long way away from knowing every class in the .NET 2.0 framework, and I seriously doubt there's anyone out there who's used all of them.

Let's face it... we're all specialists nowadays. As much as I like to consider myself a generalist, a multi-purpose programmer, a Renaissance man of code, if you will, I'm as much a specialist as anybody. I've specialized in the APIs that I've had to use to get my work done. Right now that means I've specialized mainly in .NET. And a subset of .NET at that. I've had no need for Remoting yet, for example, but I'm sure others use it all the time. I've used the classes in the Reflection namespace fairly often, but I'm sure there are plenty of .NET programmers who've never needed them. And as much as I love Java, I haven't written a line of Java code for at least three years now, so yeah, I'm a specialist.
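For anyone who hasn't needed reflection yet: the core idea is the same in .NET's System.Reflection namespace and in Java's java.lang.reflect package: inspect a type at runtime and invoke its members by name, without the compiler ever seeing the call. Here's a minimal sketch in Java (the names here are my own illustration, not from any particular project):

```java
import java.lang.reflect.Method;

public class ReflectionDemo {
    // Look up a no-argument method by name on the given object and invoke it.
    public static Object invokeByName(Object target, String methodName) throws Exception {
        Method method = target.getClass().getMethod(methodName);
        return method.invoke(target);
    }

    public static void main(String[] args) throws Exception {
        // "hello".length() invoked without a compile-time reference to length()
        Object result = invokeByName("hello", "length");
        System.out.println(result); // prints 5
    }
}
```

The method name is just a string, which is exactly what makes reflection handy for plugin loading and data binding, and exactly why mistakes only surface at runtime.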

I just happen to be a "specialist" who specializes in learning technologies as needed. ;-)

Seriously.

Each time I'm faced with a new problem, one of my first thoughts is "has Microsoft already done this for me?" If I can't find a class for what I need in the humongous .NET framework, then I wonder "has someone else already done this for me?" And if they have, then I ask myself "can I convince my company to buy it?" (assuming it's not open source) It's the lazy programmer's approach to code, but it works fairly well. Of course, if I find out that Microsoft or someone else has written a class or API that might fit what I need, I then need to learn how to use that! But isn't that still much easier than writing it yourself?

Besides, once you've gone through that learning curve, you've added another tool to your programmer's toolkit and become just a wee bit less specialized.

I'm currently being "lazy" by trying to learn about NHibernate, but that's a story (or several) for another time.

What if I don't care about .NET?
Well, I'm not going to say that you should! I'm certainly not a Microsoft advocate -- anyone who's ever been on the receiving end of my rants about the latest idiosyncrasy I've found in Visual Studio or the .NET framework can attest to that. But honestly, Java's not much different. Ten years ago, Java was a much smaller creature than it is today. And when you consider the various flavors of Java like J2SE, J2EE (especially J2EE!), and J2ME, your typical Java programmer has as much of a learning curve as your typical .NET programmer.

And hey, life's not any easier for you web programmers out there, you know. First, do you use ASP, ASP.NET, Perl, PHP, Python, Ruby on Rails, or some other language as the basis for your web apps (this decision probably isn't yours to make if you're not your own boss)? Add to that the fact that you'll probably need to get fairly good with JavaScript, HTML, DOM, etc. And what about AJAX? There's Atlas, Dojo, Google Web Toolkit (GWT), etc. And I haven't even mentioned Flash and AIR yet! (well, OK, I just did).

The point is, as each language grows and evolves, it becomes more complex. As each ecosystem (such as the world of web programming) grows, more languages spring up (sorry, Java folks, no pun intended) to fill the various ecological niches.

And I've just barely scratched the surface. Change is the name of the game for us programmers. We'd better enjoy the challenge, because like it or not, we're all along for the ride.


Tuesday, June 17, 2008

Congratulations, California newlyweds

'nuff said.


Friday, June 13, 2008

Change is Good (redux)

The three faces of .NET
Sorry for the long delay in this follow-up post... it's been a busy week. As I was saying last time, being a programmer means having to accept working in a field that's constantly changing: new technologies, new languages, changes to existing languages, etc. Even .NET is changing. The first .NET applications I maintained were written in .NET 1.1. While 1.1 was a perfectly fine platform, .NET 2.0 was just coming out. VB.NET 1.1 finally made Visual Basic a "real" object-oriented language with inheritance, polymorphism, and Java-style exceptions. And I was looking forward to spending some time getting to know it.

Then along comes .NET 2.0. .NET 1.1, I barely knew ya! With generics (one of my favorite additions to .NET and Java in years), anonymous delegates (a handy copy from Java, but only for those of you lucky enough to be programming in C#; VB.NET doesn't support them), and a whole slew of other features calling their siren song, I left .NET 1.1 at the altar and never looked back. Unless there's a compelling reason to do otherwise, every new project I've written has been in .NET 2.0. Mostly VB.NET since most people in our IT department are more comfortable with Visual Basic, but there are times when I've used C# instead because it was the better fit.
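Since generics arrived in both .NET 2.0 and Java 5 at around the same time, a quick Java sketch shows what all the fuss is about: before generics, a collection held Objects and every read needed a cast that could blow up at runtime; with generics, the element type is checked at compile time (the example itself is just my own illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class GenericsDemo {
    // The compiler guarantees every element is a String, so no casts
    // and no possibility of a ClassCastException at runtime.
    public static int totalLength(List<String> words) {
        int total = 0;
        for (String word : words) { // no cast needed
            total += word.length();
        }
        return total;
    }

    public static void main(String[] args) {
        List<String> words = new ArrayList<String>();
        words.add("change");
        words.add("is");
        words.add("good");
        System.out.println(totalLength(words)); // prints 12
    }
}
```

Try passing a List of Integers to that method and the program won't even compile, which is exactly the kind of error you want caught before your users find it.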

And along with .NET 2.0 came Visual Studio 2005. Can I stress enough how much better VS2005 is than VS2003? Just being able to edit code while debugging again is a compelling enough reason to upgrade! Trying to work in the VS2003 debugger almost, almost, made me want to switch back to VB6.

And now there's .NET 3.5. Well, .NET 3.0 came along between 2.0 and 3.5, but it got upgraded to 3.5 so fast I almost got whiplash watching it. With the additions of LINQ, lambda expressions (added, along with a bunch of other enhancements such as implicitly typed variables, anonymous types, and extension methods, mainly to make LINQ possible), Windows Presentation Foundation (WPF), Windows Communication Foundation (WCF), and Windows Workflow Foundation (WF), .NET 3.5 is already starting to make .NET 2.0 look a little stodgy. I'm already sure I'll be switching to .NET 3.5 for future development eventually... the question is just how soon I take the plunge.
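To see why lambdas and LINQ matter, it helps to write the equivalent the long way. In Java today the closest you can get is a hand-rolled predicate interface and an anonymous inner class; all of this boilerplate is what a LINQ query like "from n in numbers where n % 2 == 0 select n" collapses into one line. This is purely my own illustration, not any real API:

```java
import java.util.ArrayList;
import java.util.List;

public class FilterDemo {
    // A hand-rolled predicate: the boilerplate that lambda
    // expressions (and LINQ's Where) make unnecessary.
    interface Predicate<T> {
        boolean test(T value);
    }

    // Keep every element that satisfies the condition.
    public static <T> List<T> where(List<T> source, Predicate<T> condition) {
        List<T> result = new ArrayList<T>();
        for (T item : source) {
            if (condition.test(item)) {
                result.add(item);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<Integer> numbers = new ArrayList<Integer>();
        for (int i = 1; i <= 10; i++) numbers.add(i);

        // Filtering the long way, via an anonymous inner class
        List<Integer> evens = where(numbers, new Predicate<Integer>() {
            public boolean test(Integer n) { return n % 2 == 0; }
        });
        System.out.println(evens); // prints [2, 4, 6, 8, 10]
    }
}
```

Once you've written a few of these by hand, the appeal of having the compiler generate all that scaffolding for you is obvious.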

Oh, BTW, does anyone know why Windows Workflow Foundation got short shrift in the acronym department? The other .NET "foundations" were abbreviated as WPF and WCF... why not WWF? Was Microsoft afraid of getting sued by the World Wildlife Fund like the World Wrestling Federation (now World Wrestling Entertainment) was?

I'd probably switch over to .NET 3.5 right now if it weren't for one thing: my CI build server doesn't support it yet. Oh, I'm sure I can probably get it to work, I just haven't had the opportunity to look into it yet.

Tuesday, June 3, 2008

Change is Good

I think being a programmer means loving the challenge of always having to learn something new. Programming languages evolve, new languages appear, new paradigms emerge, and we as programmers need to learn, grow, and evolve along with them.

Still, even though new languages appear, the old ones never quite seem to go away. There are enough active programs out there written in older languages that still need to be maintained to support a whole fleet of programmers, and it's quite possible to have a long, healthy career without ever writing a program in C#, Java, or any object-oriented language. But for the rest of us, change is a part of our lives and we need to embrace it to survive.

So, what's changing?
Well, languages for one have certainly evolved, even since I started programming. I cut my teeth on computers by programming in BASIC on the Apple IIe and Commodore 64. In college, I learned Pascal and got a much better grasp of procedural programming. Later, I got my first introduction to object-oriented programming in C++ and never looked back. Granted, object-oriented programming existed long before I ever touched a keyboard. Simula was first developed in the 1960s and is considered the first object-oriented language. Smalltalk came along in the late 70s and helped spread the paradigm, but object-oriented programming didn't seem to become popular until the 1990s.

As is often the case in life, there have been detours along the way. My first programming assignments in the real world were writing VBA applications in Microsoft Access (starting back with Access 2.0) and maintaining an accounts-receivable application written in dBase II!

From VBA it was a small jump into Visual Basic and Microsoft's halfhearted attempt at slapping object-oriented features onto the procedural BASIC language. Still, it had classes (even if you couldn't truly inherit from them), so I did what I could to encapsulate my applications' behavior in objects and wired those objects together through events and message passing. My current employer finally took the plunge into .NET a couple of years ago, but there are still a number of VB6 applications that need support (or migration). I hope to eventually have them all migrated to a more modern platform, but by the time I'm done we may be on .NET 10.0!

As if that's not enough, .NET itself is undergoing a rapid evolution. Between .NET 1.1 and the current .NET 3.5 there have been a number of significant additions to both the VB and C# languages as well as the framework. But more on that next time...