Thursday, December 31, 2009

GNUstep with Eclipse CDT

Although GNUstep has its own IDE and editor, it is possible to use Eclipse for Objective-C projects using CDT, the C/C++ Development Tools.


If you are using Linux, the first thing you need to know about Eclipse is not to install the version packaged with your OS. Ubuntu, for example, is several releases behind on Eclipse. Instead, first make sure you have the latest Java installed (Eclipse needs Java), then go to the eclipse.org web site and download it. If you want to use it for both Java and C/C++, you can download the Java version and then add CDT. It's easiest to install it under your home directory simply by unzipping it there with a command something like...

tar xzvf eclipse-java-galileo-linux-gtk*.tar.gz

Unfortunately, CDT does not directly support GNUstep. You cannot easily import an existing project created by ProjectCenter or create a new GNUstep project from scratch within Eclipse. Here is one approach suggested on the help-gnustep mailing list by Anne van Rossum (I haven't yet tried this).

- Create sample.m file and GNUmakefile in ObjCProjectExample

- source <GNUstep root>/System/Library/Makefiles/GNUstep.sh

- make (that should go fine)

- start eclipse from same console (building/running should go fine)

- add new C project and import directory structure (from ObjCProjectExample)

- at Project - Properties - C/C++ Make Project - Environment tab, press the Select button

- select all environment variables with GNUstep in their names

- apply changes


(Re)Starting Eclipse from whatever location should be fine now. The GNUstep.sh shell script is no longer needed.
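Condensed, the console part of that recipe looks something like the following sketch (the GNUstep root path is an example and will vary by install):

```shell
# Load the GNUstep environment into the current shell (path is an example)
. /usr/GNUstep/System/Library/Makefiles/GNUstep.sh
make            # verify the project builds with gnustep-make
eclipse &       # start Eclipse from this console so it inherits the variables
```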

[Nicola Pero added...] By the way, with the current gnustep-make from trunk, you can also do the following:


* make sure the appropriate GNUstep directories are added to your PATH and to ld.so.conf [or your system's equivalent of ld.so.conf]


(i.e., something like PATH="$PATH:/usr/GNUstep/System/Tools:/usr/GNUstep/Local/Tools", and adding /usr/GNUstep/System/Library/Libraries and /usr/GNUstep/Local/Library/Libraries to your ld.so.conf and rerunning ldconfig)


* set the GNUSTEP_MAKEFILES environment variable (it should be set to something like /usr/GNUstep/System/Library/Makefiles/)

And then 'make' should work. ;-)

All the other variables are then read automatically by gnustep-make from
the GNUstep config file in /etc/GNUstep/GNUstep.conf.
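In shell terms, the setup he describes amounts to something like this sketch (the /usr/GNUstep prefix is his example and may differ on your system; the ld.so.conf step must still be done as root):

```shell
# Add the GNUstep tool directories to PATH (example prefix)
export PATH="$PATH:/usr/GNUstep/System/Tools:/usr/GNUstep/Local/Tools"

# Tell gnustep-make where the makefile fragments live
export GNUSTEP_MAKEFILES=/usr/GNUstep/System/Library/Makefiles

# The library directories (/usr/GNUstep/System/Library/Libraries and
# /usr/GNUstep/Local/Library/Libraries) go into /etc/ld.so.conf,
# followed by running ldconfig as root.
echo "$GNUSTEP_MAKEFILES"
```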

However, these steps actually seem overly complicated to me. I was able to import an existing GNUstep project into the latest version of Eclipse with the latest CDT with the following steps:

Create a new empty project...



Right click on the newly created project and select Import. Then select General... File System and click Next.


Browse to the project you created in Project Center, click Select All and Finish.

You should now see all your files in Eclipse and should be able to build and run the project.

One oddity is that when you open a .m file, it opens in gedit instead of the Eclipse editor. I assume this is because the Eclipse editor doesn't know Objective-C syntax. However, gedit seems to do the syntax highlighting just fine.

Friday, December 18, 2009

GNUstep Look-and-feel

OpenStep originally was supposed to fit into the look-and-feel of the host system. Consequently, the Solaris and Windows versions of OpenStep looked different, and when OpenStep became the basis for Cocoa on the Mac, the look-and-feel got a Macintosh face-lift. However, Linux has no standard look-and-feel. It all depends on whether you are using KDE or Gnome or whatever.

The GNUstep developers chose to keep the “retro” NeXT look. There is even a desktop on Linux called Window Maker (or wmaker) that is part of GNUstep (or used to be) and implements the old NeXT look-and-feel. However, few people would seriously consider using Window Maker on a regular basis instead of KDE or Gnome (my apologies to Window Maker fans). There is also something called GWorkspace which takes over your desktop and tries to make it look like NeXTstep, but it didn't work right for me when I tried it under KDE4 or Gnome.

By keeping this “retro” look, GNUstep unfortunately makes itself uninviting to most potential users. It also creates a confusion between GNUstep the development environment and GNUstep as a desktop environment.

One feature of GNUstep that can be annoying is the big application icons that appear in the bottom left of the screen when you run a GNUstep application. In Window Maker these would be organized neatly along the right side of the screen, but other window managers just don't know what to do with them.

To get rid of the big icons (I can't think of a reason they would be missed), open a terminal window and type the following command:
defaults write NSGlobalDomain GSSuppressAppIcon YES

You can also change the freestanding menus to be at the top of the screen like in MacOS X using the following command:
defaults write NSGlobalDomain NSMenuInterfaceStyle NSMacintoshInterfaceStyle

However, this will affect other things such as scrollbar style.

To change it back use:
defaults write NSGlobalDomain NSMenuInterfaceStyle NSNextStepInterfaceStyle

There is also an NSWindows95InterfaceStyle, but it won't make your menus appear inside the main window.

(I think if you set these for your application's name instead of NSGlobalDomain, it will only affect your application.)
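For example (an untested sketch; "PayPerView" is just a hypothetical application name used as the defaults domain):

```shell
# Scope the menu style to a single application instead of NSGlobalDomain
defaults write PayPerView NSMenuInterfaceStyle NSMacintoshInterfaceStyle
```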

In fact, these settings also exist in MacOS X in NSInterfaceStyle.h.

typedef enum {
  NSNoInterfaceStyle = 0,
  NSNextStepInterfaceStyle = 1,
  NSWindows95InterfaceStyle = 2,
  NSMacintoshInterfaceStyle = 3
} NSInterfaceStyle;

Apple and Microsoft don't have to support any look-and-feel besides their own. However, if you are producing a cross-platform development environment (like GNUstep), and especially if you are the underdog (as GNUstep is), your programs should behave as expected in whatever environment they are running in. This would be much easier if the GNUstep developers would just abandon the old retro look that simply doesn't appeal to most people anymore.

Tuesday, December 15, 2009

Creating a GNUstep Project

I got the book Developing Business Applications with OpenStep. It's not all I would have wished for. You don't get to your first sample application until Chapter 7, and wading through the first 6 chapters is a bit of a chore. I found Chapter 4 on the Application Kit contains no sample code at all, but it is good for insomnia. Anyway, now that I've reached Chapter 7, Building an Application, I am going to try creating the sample application (PayPerView), which is their only working example and which they add to later in the book when they introduce more advanced topics (this may be the last chapter I read in the book).

PayPerView has a class called ProgramController, which isn't really what it sounds like. The word “program” in this context refers to a pay-per-view tv program. The ProgramController keeps a list of Program objects and interacts with the UI. There is also an OrderController, which controls the process of a user placing an order for a Program. Why there isn't also an Order class I do not know. More classes are added later in the book when the application is expanded to persist to a database and to use distributed objects. (I am not actually interested in either of these at this time.)

The first thing to do is create the project. In GNUstep, we do this through ProjectCenter. Just like in the original Project Builder as described in the book, you choose New... from the Project menu. (Sorry. I am too lazy to paste screen shots in here right now.) I had already created a folder in my home directory called projects_objc. GNUstep creates a GNUstep folder in your home directory for its own configuration files, so it's better to keep your code separate from that. I'm going to call my application PayPerView and define it as type Application. The default project type in GNUstep is not Application but Aggregate. An aggregate project contains multiple sub-projects or “make targets”, meaning it will have multiple executable applications. (You can add new subprojects later even if you select Application at this time.)

On the project window there are noticeable differences between GNUstep's ProjectCenter (PC) and Project Builder. For example, the icons are different. (Still too lazy to do any screen shots, so you'll have to trust me until I feel like adding them.) PC has a screwdriver icon where Project Builder had a hammer. The remaining icons are different but still reminiscent of the originals: the Project Launcher (or Run) icon, the Loaded Files icon, the Project Finder icon, and the Project Inspector (or Attributes) icon. That's okay, because there were even significant differences between the NeXT Project Builder and the one that Sun created for Solaris (and, of course, Apple has made significant changes in Xcode, although it still has the familiar hammer icon).

BTW, at this point if you've never seen Xcode, it might be best just to ignore it for a while, because it looks so nice while GNUstep looks the same as it did 10 years ago. Just remember that GNUstep is free and not commercially supported.

When you click on Interfaces (as instructed in the book), you will see instead of a .nib file some files related to Gorm, GNUstep's Interface Builder. You get into Gorm by double-clicking on the .gorm file. (Sorry. I am still too lazy to paste screen shots in here right now.)

The format of .nib (NeXT Interface Builder) files was difficult to reproduce in GNUstep, because there is no real .nib file standard. A .nib file is actually just a file containing serialized objects. Duplicating it exactly would have been difficult and might have required reverse engineering, which probably would have created legal problems. However, in newer versions of Gorm, you can at least export to a .nib file by selecting Save As on the Documents menu in Gorm. (I don't think you can import a .nib file created in Apple's Xcode and expect to be able to use it in Gorm.)

Anyway, so I will continue later with using Gorm and maybe stop being lazy and add screen shots.

Wednesday, December 9, 2009

Unit Testing GNUstep Programs

As a Java developer who has been trained in test-driven development, I would like to have something similar to JUnit for GNUstep/Objective-C. Evidently, around 2005 Apple added a unit testing framework to Xcode 2.1 called OCUnit. OCUnit was not something created by Apple, but they chose to make it the de facto standard by including it in their packages. Unfortunately, this essentially killed off two other competing frameworks, UnitKit and ObjcUnit.

UnitKit is stuck at 1.1 and evidently no longer supported. The GNUstep port is still available from the Etoile web site.

http://cvs.gna.org/viewcvs/etoile/Etoile/Frameworks/UnitKit/
http://cvs.gna.org/viewcvs/etoile/Etoile/Services/Developer/UnitTests/

The GNUstep port of ObjcUnit, stuck where it was back in 2002 (v. 1.2), is at:

http://gentoo-portage.com/gnustep-libs/objcunit

Since Apple is distributing OCUnit and the other two are now unsupported, it would at first seem to make sense to try using OCUnit. It is available at...
http://www.sente.ch/software/ocunit/

Unfortunately, although it nominally supports GNUstep, it evidently works only on Mac OS X. Bad news for GNUstep on Linux.

So, we are stuck with using one of the two older, unsupported frameworks or simply doing unit testing with no framework.

Before the creation of JUnit it was quite common to test a Java class simply by putting test code in its main method, since each class can have a main method. That raises the question: how many main functions can we have in an Objective-C application? Is there one per class or one per application? The answer is one per application, because Objective-C is an extension of C, and there is one and only one main function in any C program (the same goes for C++).

I remember in my days as a C++ programmer, I knew people who did unit testing by creating a bunch of Unix shell scripts that called their program in different ways. However, most people didn't even do that. Back then unit tests were usually just a documented list of steps that had to be carried out manually along with expected results. Essentially, you ran the program, and if it worked without noticeable problems then it passed.

The first unit testing framework, the one all the others are based on, was SUnit, created by Kent Beck and described in his 1998 Guide to Better Smalltalk. So, of course, nobody was using these frameworks before then, which makes one wonder if they're really all that important. It has been my experience that most of the time developers don't do true "test first" development in the context of "extreme programming" principles. They write their JUnit tests after the fact and often only because they are required to by management. Also, unit testing frameworks are pretty good for testing back-end code, but not so much for testing the GUI part of an application.

Don't get me wrong. I like test driven development. However, all things taken into account, I'm not sure I want to spend time writing a bunch of unit tests for a GNUstep program if I can't use OCUnit and there are other ways to get around the testing problem without the need for a testing framework. That's especially true for a program which is only a prototype.

Friday, December 4, 2009

GNUstep Projects and Applications

It is entirely possible to create a project manually using a text editor by creating a .m file to contain the Objective-C code and creating a GNUmakefile containing something like the following:
include $(GNUSTEP_MAKEFILES)/common.make
TOOL_NAME = LogTest
LogTest_OBJC_FILES = source.m
include $(GNUSTEP_MAKEFILES)/tool.make

The above is also from the GNUstep manual.
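For a GUI application rather than a command-line tool, gnustep-make uses application.make and APP_NAME instead. Here is a sketch, assuming a hypothetical app named PayPerView with the usual generated source files:

```make
include $(GNUSTEP_MAKEFILES)/common.make
APP_NAME = PayPerView
PayPerView_OBJC_FILES = main.m AppController.m
include $(GNUSTEP_MAKEFILES)/application.make
```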

It is also possible to create an application in Gorm (the GNUstep interface builder) without using ProjectCenter.

However, the normal way to create a project would be with ProjectCenter. ProjectCenter will create several files in the project, including AppController.m. This file doesn't get created when you start an application in Gorm instead.

One of the tutorials I found on the internet did not show ProjectCenter creating this AppController.m file, but it was written way back in 2001, so evidently this feature was added since then. This is the tutorial linked from the gnustep.org web site, and it's quite disappointing that a more up-to-date one isn't available. Worse yet, there seems to be no documentation for ProjectCenter other than a FAQ list.

Another tutorial mentions the AppController, but then since it has you create the application in Gorm, bypassing ProjectCenter, you end up having to create AppController.h and AppController.m in a regular text editor.

There is another program called Project Manager which is supposed to be “an alternative Integrated Development Environment (IDE) for GNUstep”. However, I have noticed that when I'm in ProjectCenter and go to edit code it uses Project Manager. It doesn't seem to matter how I set the editor in preferences. It always uses Project Manager as the editor. I assume this has something to do with the way that GNUstep has been packaged for Ubuntu. Unfortunately, I also have had trouble figuring out how to make the font bigger in Project Manager's editor window, so it is kind of annoying. Of course, you should still be able to edit the files yourself outside the IDE using vim or kate or whatever.

GNUstep Renaissance is an alternative way to develop UIs. It allows you to create GUIs for both GNUstep and Cocoa, described by XML documents, instead of using Gorm or Interface Builder. I haven't explored this too much yet. It feels like a non-standard way of doing things, and it seems like one should learn the standard way first. Also, I don't think there's any way to develop iPhone applications using Renaissance. (You can't develop iPhone applications using GNUstep either. However, you could prototype applications in GNUstep and later port them.)

Installing and Setting Up GNUstep in Ubuntu

Under Ubuntu/Kubuntu, it is quite simple to install GNUstep using the provided packages, but it is another thing to get it to work. The applications will run, but when you try to compile anything it won't find the required make files.


Before you can compile anything, you have to source /usr/share/GNUstep/Makefiles/GNUstep.sh or /usr/share/GNUstep/Makefiles/GNUstep.csh. I think the easiest way to make sure that this always happens is to put a file in /etc/profile.d, call it GNUstep_profile.sh and add the following line to it (requires sudo privilege of course):

. /usr/share/GNUstep/Makefiles/GNUstep.sh
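One way to create that file (a sketch; requires sudo, and the path matches the Ubuntu packages discussed here):

```shell
# Create /etc/profile.d/GNUstep_profile.sh so login shells source GNUstep.sh
echo '. /usr/share/GNUstep/Makefiles/GNUstep.sh' | \
    sudo tee /etc/profile.d/GNUstep_profile.sh
```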


Note that the location of the GNUstep.sh may be different if you install GNUstep from source.




Objective-C

The choice of NeXT (and by extension Apple) to use Objective-C was largely a matter of timing. Java and C# are both derived from C++, but Objective-C has no relation to C++. Stroustrup published his first book on C++ in 1985, the same year Jobs left Apple to start NeXT. At the time, there was, of course, no way of knowing which object-oriented language would become dominant. C++ compilers were not widely available until about 1992.


Fortunately, since Objective-C is also an extension of ANSI C, the syntax is familiar until you get to the object-oriented extensions, at which point it becomes, in the words of one blogger, Alan Storm, “deeply, deeply weird”. The syntax for declaring a class and its “messages” (instead of methods or member functions) is one of the weird things. Here is an example from the GNUstep manual (with comments stripped out).


#include <stdio.h>

@interface Test
+ (const char *) classStringValue;
@end

@implementation Test
+ (const char *) classStringValue
{
  return "This is the string value of the Test class";
}
@end

int main(void)
{
  printf("%s\n", [Test classStringValue]);
  return 0;
}


(This code goes in a file source.m.)
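With a GNUmakefile like the one in the December 4 post below, a plain 'make' builds it. Alternatively, with GCC's Objective-C support installed (the gobjc package on Ubuntu), a direct compile is possible (a sketch; flags may vary by platform):

```shell
gcc -o test source.m -lobjc   # link against the GNU Objective-C runtime
./test                        # prints the class string value
```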


C++ basically duplicated the syntax of a struct in C and added member functions. In fact, if memory serves (it's been years since I've written any C++), you can have member functions in a C++ struct as if it were a class. Then Java got rid of structs and kept only classes. Objective-C took a different route entirely, adding syntax that frankly looks out of place with the rest of the language. The @ reminds me of annotations in Java.


Not only is the syntax different but the message passing mechanism behaves differently. “The Call vs. Message Sending semantics seem, on the surface, to be the same thing, but my book has promised me there are subtle, yet deeply important differences. The first I’ve encountered is, an object will accept messages that haven’t been defined (it silently ignores them) whereas Java/C#/PHP5 would yell at you for calling an undefined method.” [Alan Storm] Objective-C is also more forgiving if you try to use a null (nil) object.


Thursday, November 26, 2009

GNUstep vs. Cocoa

I recently became interested in the idea of writing iPhone apps when the client I was working for mentioned a possible future iPhone project. I wanted to start learning the skills needed for such a project, but didn't have a Macintosh. Then I discovered GNUstep.

GNUstep is a free implementation/extension of the OpenStep standard, which was developed by NeXT (bought by Apple around 1996) and Sun (recently bought by Oracle) based on NeXTstep. (I initially assumed that the NS that prefixes so many OpenStep class names stands for “NeXTstep”, but some people say it stands for "NeXT" and "Sun".) Cocoa under MacOS X is also an implementation/extension of OpenStep. Both use the Objective-C programming language, and both have an IDE and GUI builder. The main differences between the two are as follows:

  • Cocoa is for MacOS only, whereas GNUstep is a cross-platform system used primarily on Linux but also usable on the Mac or even under Windows.
  • GNUstep does not implement the Mac's Aqua look-and-feel for two reasons:
  1. the original intent was to emulate the look-and-feel of the NEXT computers, and some people supposedly still like the retro look (really?)
  2. implementing the Aqua look-and-feel under Linux might bring Apple's lawyers down on the Free Software Foundation
  • GNUstep does not implement the UIKit classes required for iPhone development (UIKit has many classes that are similar to AppKit classes, but the UIKit classes are named differently, starting with UI instead of NS, and there are other differences).
  • Cocoa is professionally and commercially developed and used by a fairly large community of developers, whereas GNUstep has not received much attention even from the Linux community.
  • Cocoa is, of course, well documented by Apple and in a number of books that can be purchased from any bookstore. GNUstep documentation tends to be years out of date and incomplete, so you have to be willing to spend some time figuring things out on your own.
In the posts to follow I plan to relate some of my experience as I start learning OpenStep in these different environments and move toward producing an iPhone application that I might be able to sell on iTunes.

Sunday, October 4, 2009

Microsoft Security Essentials

I hate paying for anti-virus software. The last time I paid for anti-virus software was back in the late 90's. My computer at home became infected when I took a floppy disk to a school computer lab and later used that floppy disk at home. So, I bought IBM Anti-virus. IBM Anti-virus eventually got bought by Symantec and became part of Norton Anti-virus. I got a free upgrade to Norton and used it for a couple of years until they changed their licensing. A couple of times I bought a new computer and got free anti-virus for a year each time.

However, a couple years ago I was finally forced to decide whether to buy anti-virus protection or use one of the free programs that are available. In the meantime, spyware came along and the malware problem became worse. I started using AVG Free and tried various free anti-spyware programs, including Windows Defender from Microsoft and Spybot Search and Destroy. However, I continued to have problems with spyware. Then one day my laptop became very slow and I discovered it was caused by AVG. So, I switched to Avast. Unfortunately, Avast is big and slow, and Spybot is also slow.

Today I discovered Microsoft Security Essentials. Apparently, it combines something called Windows Live OneCare with Windows Defender to provide both anti-virus and anti-spyware protection. Combine this with Windows Firewall and we are finally at the point where Windows provides full security protection for free.

Symantec's web site said MSE has one of the lowest anti-virus detection rates, and, of course, I know Windows Firewall is only a one-way firewall, not two way like Zonealarm and Comodo. However, I also get tired of the constant popups from Zonealarm, and I get tired of my computers being slowed down by the security programs, so I'm going to give MSE a shot. Maybe it will provide "good enough" protection without being so annoying.

Wednesday, September 23, 2009

SGI is Back

I missed this a few months ago. Silicon Graphics, the great company whose hardware generated the visuals for movies like Jurassic Park, is trading on NASDAQ again (as of May 2009). I just saw this on the cool site SiliconBunny.com. They are also the original creators of the OpenGL standard, a cross-platform graphics API which pre-dates and competes with Microsoft's DirectX.

Rackable Systems, Inc. (NASDAQ:RACK - News) announced today the completion of its legal name change to “Silicon Graphics International Corp.” The company also announced today that it will change its NASDAQ stock ticker symbol from “RACK” to “SGI.” The stock ticker change has gone into effect for the trading community on Monday, May 18, 2009.


And in September they released a new workstation, the Octane III, advertised to usher in "... a new era of personal innovation in strategic science, research, development and visualization." Whether the hype is true or not, it's nice to see the company trying to make a comeback.

Back in my days as a Unix admin in the late 90's, I worked on Unix workstations from Sun (both BSD-based SunOS and the newer System V-based Solaris), SGI (Silicon Graphics, running their IRIX OS), NeXT (Steve Jobs's company after he left Apple, which was later bought by Apple and whose technology melded into MacOS X), HP (HP-UX was the first Unix I ever used back in college), and AT&T (I had the "pleasure" of working on an AT&T 3B2). I can definitely say back then SGI was the coolest computer company around. Unix techies and scientists were well aware of SGI, but they only briefly managed to get much notice from the general populace, thanks to their computers being featured in Jurassic Park.

They had everything from personal Unix workstations designed for 3D graphics better than anything a standard PC could do at the time (this was back in the days when PC graphics cards were comparatively primitive) to "low end" super computers (they eventually bought out Cray, the original super computer company).

The new machine will not run IRIX. It will instead run "Red Hat or SUSE Linux (with SGI's excellent ProPack enhancements) or Windows HPC Server 2008". SGI actually abandoned IRIX on MIPS processors in favor of Linux on Intel a few years ago, so this is not really a surprise. I wish I had the money to buy one, but these are definitely high-end machines.

Oh, well. At least I can try fsv on my Linux boxes at home for that SGI/Jurassic Park nostalgia. If you're a Windows user, you might want to try StepTree.

Monday, September 21, 2009

The Totem Pole

I got an interesting e-mail. Actual company names have been changed.

"I had some interesting news you should keep somewhat confidential. The contractor rates at [Client X] went up again effective [date withheld]. But here's the real catch: only some contractors were given this benefit. Apparently, the managers went through some sort of vote on who should be allowed to have their rate back to the [original date] levels. I'm not certain how high level these managers were. My [Consulting Company A] recruiter estimated about 80% of our staff got the nod. I don't know about [Consulting Company B], but there's a few from [Consulting Company C] I had lunch with on Friday that had the same news to share. The consulting firms and [Client X] have to keep a lid on it, so we're not supposed to tell. I think it's pretty crappy the way they are playing favorites, and secrets like that never really stay buried anyway."

I can't be surprised about this sort of thing. The company in question has been having problems and has been trying to save money, but undoubtedly they lost some good consultants when they dropped their rates, maybe some of their best.

The decision process they mentioned reminds me of an old way of determining raises that was used at a company I worked for in the past. The process was called the Totem Pole. Employees were ranked by managers, supposedly based on performance. Their names were then placed on a graph as the x coordinates in order of this ranking, and their salaries became the y coordinates. They then used a least-squares fit to draw a line representing what salaries should be based on this ranking. If your salary was above the line, you were over-paid and didn't get a good raise (maybe cost of living). If you were below the line, you were under-paid compared to the other people and would get a raise.

The whole process of drawing a pretty graph was meant to make the whole thing seem more objective and scientific. The ranking was, of course, highly subjective. In any case, if you don't get a raise in such a system, you know that either they think you're already over-paid or you are at the bottom of the totem pole. Either way is not good. If they think you are over-paid, you can kiss any raises goodbye for the foreseeable future. If you're at the bottom of the totem pole, you not only can kiss your raises goodbye, but you may be on the short list for the next layoff.

Still, knowledge is power. Once you know you're at the bottom of the totem pole, you can try to improve your ranking somehow. If you don't think that's possible because you're already performing well, you can just hope things work out or you can start looking elsewhere.

Yeah, I know. This doesn't have much to do with programming, but it's just one aspect of life programmers have to deal with in the trenches.

Thursday, September 17, 2009

Test Code Generation

Yesterday I started evaluating a product that claims to be able to generate test code for you. As a Java developer, I am mainly interested in JUnit test code. I am a big believer in test driven development, but I often find myself inheriting code with low test coverage. So, the question is, are there tools out there that can help one quickly and painlessly generate test cases for under-tested code?

In a past job I used a product called AgitarOne that purported to do just this sort of thing. The tests that were generated, of course, captured the current functionality whether correct or not. It could not produce truly intelligent test code. Also, it was big and expensive and required a lot of resources. Worse yet, generated tests extended a proprietary class, so the more you used it the more you were locked into their product.

I tried to use a free tool called testgen4j, but I didn't have the patience to make it work. Its user interface is a Unix shell script. I tried running it under Cygwin, but found the script didn't like being run with JAVA_HOME set to something with "C:" in it. I may eventually try it again, but what I really want is something that runs as an Eclipse plugin.

With that in mind, I started googling, and I found something called CoView. This product offers a Community Edition and an inexpensive Premium License. I downloaded and installed the community license file and the Eclipse plugin. However, to my disappointment, it simply did not work. It got errors whenever I tried to go to the preference page to tell it about the license file.

With a little more googling, I found CodePro AnalytiX. This is a much more expensive product, but I thought I might as well try the 15 day evaluation. It turns out that AnalytiX is just what I would like to have. It runs as an Eclipse plugin, it does a fair job of generating test code, it uses EasyMock, and it does not rely on some proprietary base class.

Unfortunately, when I set the CodePro preference to always generate mocks for all interfaces, this caused it to generate some code that wouldn't compile due to duplicate local variables. Selective use of mock objects worked better.

I had a number of classes that extend JdbcDaoSupport. Even after enabling generation of Spring tests, CodePro couldn't generate usable tests for these classes. The generated tests all had comments that said: "An unexpected exception was thrown in user code while executing this test...".

I guess the bottom line is that these kinds of tools can be nice, but even the really expensive ones will never generate usable tests for all code, much less optimal, intelligent tests. They are largely a crutch for people who do not have the discipline to do proper test-driven development and, unfortunately, not a good solution to low coverage on legacy code.

Thursday, September 10, 2009

OS Wars and Plugin Woes

Personally, I'm a fan of Linux/Unix. I have to use Windows at work, because that's what they give me. Otherwise, I'd be using Linux as most anything I do on the job I can do just as well with Linux.

I do use Linux at home, but even there I am still sometimes forced to use Windows even though it is more vulnerable to virus attack and it boots slower (at least on my computers). The latter is in part because of all the extra anti-virus and anti-spyware software I have to run under Windows. I have found that because I have multiple people using my computers at home, including for games, I get a lot of malware. I have to have multiple scanners installed and occasionally do a registry clean in order to keep the machines running fairly well.

One reason I am forced to retain Windows at home is because the majority of PC games work only in Windows. There is other software that also only works in Windows, but for the most part it can be avoided by using open source alternatives. Games are different. Every game is unique, and most game developers don't develop for Linux.

Another reason I have recently been forced to run Windows is that I couldn't figure out how to get HDMI output working under Linux. This could be my own ignorance and deserves more research as I see other people on the internet who claim to have gotten this to work just fine.

However, increasingly the reason I am forced into using Windows is because of browser plugins. There are two important browser plugins that are not available under Linux. One is Shockwave and the other is Microsoft's Silverlight.

Personally, I could live without Shockwave, but my kids play games that use it. However, I recently became a Netflix user, and I've found that their "Watch Now" feature requires a browser with the Silverlight plugin.

In addition, I have found that the Linux version of Flash seems to be slower than the Windows version. So, I can watch Hulu videos under Linux, but the quality of the experience suffers.

Some day I may switch to Macintosh. With Apple's computers you get a Unix-like OS, better support from software developers and hardware vendors, and it's not a difficult sell to all the iPod fans who are already familiar with Apple. Of course, the downside to the Mac has always been that it's more expensive.

Linux, by contrast, is free and runs well even on older machines. I recently installed it on an old machine that had been gathering dust. I plan to use this as an experimental web server running Tomcat. I used Xubuntu Linux, and was able to easily install Tomcat. It seems to work fine.

Games and Netflix are certainly not important to everybody. If you want to get up and running on the internet just to stay connected with your friends and surf the web on a limited budget, an older machine running Linux will work quite nicely. I recommend some flavor of Ubuntu Linux for all Linux installs (I have tried several Linux distros and find Ubuntu to be the easiest to install and maintain). Personally, I like Kubuntu (which runs the KDE desktop, Linus Torvalds's favorite) for newer machines and Xubuntu (which runs the XFCE desktop) for older machines with less memory.

Friday, September 4, 2009

Scrum vs. XP

I have worked on both XP (extreme programming) and Scrum projects. There has been a lot written about the differences between Scrum and XP. Here are my personal observations.

First off, XP follows a simple set of rules and practices which can be summarized in a single page whereas Scrum is so complicated that there is a ScrumMaster certification.

There is a perception that XP does not scale well to large projects. XP projects should have one team maintaining a single code base, because "collective ownership" is one of the XP rules. Scrum projects, on the other hand, may have multiple teams working on a single application with several simultaneous branches working towards different releases. Developers are then only allowed to work on the branch that their team has been assigned.

I have found that in such large projects people are afraid to make changes for fear of creating new problems. So, opportunities for improvement are ignored. Although Scrum may scale to a larger team, it will not fix this problem. It should be obvious that a large team will be less agile no matter what methodology is being followed.

Scrum does not require TDD (test driven development) or pair programming. In point of fact, many developers don't want to do TDD or pair programming and will resist doing so even after they've been told to. The larger a team/project gets, the more trouble you are going to have convincing people to change the way they work.

I feel that TDD and pair programming should go hand-in-hand. Developers who don't want to write their tests first will not do so unless they are paired up with other developers who are in the habit.

Developers who aren't doing TDD will resent having to write unit tests, because it is an added task which makes it harder for them to get their work done on time. So, they will not put much effort into them. Without pair programming, nobody will know the difference until somebody later has to maintain code with crappy unit tests.

Developers who do TDD will be testing and coding simultaneously and finish both at the same time. So, the testing just becomes part of the process and is not an added step.
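To make that concrete, here is the smallest possible TDD round, sketched in plain Java (the names are invented for illustration; a real project would use JUnit, where the test can visibly fail before the production code exists):

```java
// The production code, written only after the test below demanded it.
class PriceCalculator {
    static int totalCents(int unitCents, int quantity) {
        return unitCents * quantity;
    }
}

public class PriceCalculatorTest {
    public static void main(String[] args) {
        // Step 1 of the round: this assertion was written first, and it
        // failed (would not even compile) until totalCents was added.
        if (PriceCalculator.totalCents(250, 3) != 750)
            throw new AssertionError("expected 750");
        System.out.println("test passed");
    }
}
```

The point is that the test is not an afterthought tacked on at the end; the code exists because the test asked for it.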

In addition to helping to encourage TDD, pair programming eliminates the need for separate peer code review, since the peer review is happening simultaneously with coding. By contrast, formal peer code review meetings tend toward nit-picking and make the developer feel that she is under the microscope instead of the code.

Unfortunately, most developers will resist pair programming as much as they resist TDD. One reason is that they may not like having two people crammed into a cubicle designed to fit one person. The ideal pair programming workstation will have two monitors and two sets of keyboards and mice. Another reason is that they cannot check their e-mails and such when they are at another person's desk.

So, TDD and pair programming require a real cultural shift and encouragement from the top down. Usually the project managers and architects will have to force the issue.

However, most people do not understand the benefits of these XP practices, and the tendency of managers is to think pair programming will require twice the number of programmers to accomplish the same amount of work. I would simply counter: if pair programming requires more developers, why is it that XP projects usually have fewer developers, per the argument that XP doesn't scale well to large projects?

The fact that XP requires team members to work more closely together should not be a cause for concern. The word "scrum" is a term borrowed from the sport of Rugby. Unfortunately, the way software development usually works with the Scrum methodology is that team members get together for the daily scrum, sprint planning meeting, etc. However, then they are sent out to work independently on their assigned tasks as if they are not part of a team. The result is that the success or failure of the iteration becomes dependent upon the weakest link.

I believe that XP teams can accomplish much more with far fewer people with less risk, and it is entirely possible to do both Scrum and XP so that you receive the benefits of both. Companies could save a lot of money if they would really embrace the XP practices, but in all honesty it is not the easiest path.

Wednesday, April 15, 2009

Is open source killing Sun?

Okay. I am going to ask some questions that might sound like heresy to fans of open source, but desperate times call for brutal honesty. We are talking here about a company which changed its stock symbol to the name of their popular but free language, JAVA.

I have always wondered, how does a company make money by producing something that they then give away for free? Of course, they do have their hardware business, but despite that, is it any wonder that Sun is going under when they give away so much for free?

Java: free programming language
NetBeans: free IDE
Star Office: not free, but Open Office version is free
Open Solaris: free Unix OS (which also has to compete against Linux, also free)

In fact, Dave Rosenberg of CNET News recently wrote an article titled "How Linux killed SGI (and is poised to kill Sun)". Amazingly, when talking about Sun, Rosenberg doesn't even mention Java. Java is like a 500 lb gorilla in the room that nobody seems to notice. However, he does say, "There is a vast array of Sun software that costs a lot to maintain but doesn't deliver much revenue. This is arguably the area in which Sun's strategy has been so off the mark." Could this be an indirect reference to Java?

Rosenberg then cites MySQL as Sun's best software, which is humorous since they acquired MySQL just over a year ago. He says, "With the exception of MySQL, there aren't many Sun software products that generate significant revenue." Say what? Sun's stock has been going down ever since they bought MySQL in early 2008 for $1 billion. Sun CEO and President Jonathan Schwartz was quoted at the time as saying, "MySQL was clearly the crown jewel of the open source marketplace. As far as we can see there are no higher value assets for us to be acquiring."

However, John C. Dvorak seems to have gotten it right when he wrote his opinion piece, "The Sun-MySQL deal stinks: Commentary: Oracle is the only winner in this deal." John said, "... Sun cannot actually afford to spend a $1 billion on a company producing a mere $60 million in revenue and working outside its core competencies." I guess at the time Sun thought they were doing pretty well with a 4th quarter 2007 profit of $320 million and could afford to expand, but these days they are in the red. (By the way, John also manages not to mention Java anywhere in his article.)

So, is open source killing Sun? Well, sort of. That and spending huge amounts of money on a stupid merger right before the economy tanked. On top of all this, it would seem that IBM recently offered to save them from the mess they got themselves into by purchasing them, but Sun screwed up that deal too.

Of course, IBM would love to be in control of Java and maybe that wouldn't be such a bad thing. In the wake of Sun falling apart, I can imagine a world where two or three different companies attempt to take over stewardship of Java, which Sun has made open source. It could be like the days when we had AT&T and BSD versions of Unix before Sun made the switch from BSD and everybody standardized on AT&T just before Linux came along to largely replace them all.

In fact, I am wondering... couldn't IBM save a lot of money by forking their own version of Java and letting Sun just die?

Meanwhile, Microsoft does have their own open source strategy, but as Sam Ramji, Microsoft’s Director of Platform Technology Strategy put it in a 2008 interview, "Our focus is getting OSS on top of Windows... And I’m focused on (providing) interoperability between the LAMP (Linux, Apache, MySQL, PHP) and Windows stacks." So, their priority is making sure that people continue using Windows even if they are also using Linux. They are not trying to make money off of open source. (Once again, no mention of Java.)

Honestly, I am not trying to bash (no pun intended) open source. I happen to be using Linux right now. Some companies like Red Hat have made good money selling support for Linux while contributing to its development. Red Hat is doing pretty well. They not only have their Linux but also have acquired JBoss, which they reportedly purchased for $350 Million. Was it really worth that much? I doubt it. However, Red Hat, as I point out, made their fortune on open source whereas Sun did not. So, maybe Red Hat knows what they are doing and won't make the same mistakes Sun has.

New Wine in Old Wine Skins: My First .NET Project

By "New Wine in Old Wine Skins" I’m not making a techie joke about the “Windows Emulator”. I’m thinking about new projects being created using old technology.

It seems to me there is sometimes an advantage to being new to something. You are not “set in your ways” and have not yet developed bad habits.

Of course, I am new to .NET, but certainly not new to development in general. I may have developed habits working with Java, C++, or Unix that could cause me problems in making the transition to .NET. On the other hand, since .NET is all new to me, using its newer features such as WPF, LINQ, and Ajax should not be any more difficult for me than learning the old way of doing things, which is also new as far as I am concerned.

However, I am helping a friend with a .NET project, and I have found that he was less than open to some of these newer technologies. For example, I saw that his database code relied on hard-coded bits of SQL. Having worked with Hibernate in the Java world, I suggested that he use ORM (object relational mapping).

After a little research on my own, I found it would be fairly easy to generate C# classes that are mapped directly to tables. I started out with the Access database my friend had provided me. To get this process to work, I first had to import the database into Sql Server. This turned out to be easy.

Then I generated DLinq mappings using SqlMetal. The result was an XML file and a matching .cs file. SqlMetal doesn't work with Access files, which is why I had to get the tables into Sql Server first. I ran the SqlMetal command below from a cmd prompt sitting in my App_Code folder:


"c:\Program Files\Microsoft SDKs\Windows\v6.0A\bin\SqlMetal.exe" /map:Mappings.xml /code:Mappings.cs "C:\Program Files\Microsoft SQL Server\MSSQL10.SQLEXPRESS\MSSQL\DATA\XXX_Data.mdf"


… where XXX was the name of the database.

For some reason, after importing I had to take down Sql Server and bring it back up before it would work. I am not sure if I was doing something wrong. Of course, the express version of Sql Server doesn't give you much of a UI to work with.

Also, when I tried to build the project after adding these files, I got the following error...


Error 1 The type or namespace name 'Linq' does not exist in the namespace 'System.Data' (are you missing an assembly reference?) C:\Documents and Settings\greg\My Documents\Visual Studio 2008\WebSites\FindEngine\App_Code\Mappings.cs 16 19 C:\...\FindEngine\


I found I could fix this by adding the following to web.config after the line for System.Data.DataSetExtensions...

<add assembly="System.Data.Linq, Version=3.5.0.0, Culture=neutral, PublicKeyToken=xxxxx"/>


where the public key token is the same as the one used to sign System.Data.DataSetExtensions.

I did not actually try using the mappings yet, but wanted to share my work with my friend. When I sent it to him he replied: “I admire your learning DLinq, and applying it here, but I'm pretty sure it won't work, since I'm storing column names as strings and using them to generate parameters for the queries. It's kind of moving the opposite way from LINQ, since you can't do strongly typed in-line queries that way.”

He then supplied some examples from his code. Here it is with some of the names changed…

Here is an example from WidgetObject.cs that I believe is incompatible with the LINQ concept:

    public void Delete(int ID)
    {
        Db.Execute("DELETE FROM " + _type + " WHERE ID = @ID", ID);
    }
Here's another example from WidgetStuff.cs:

    public override DataTable GetList(int ParentID)
    {
        string whereClause = "";
        List<Param> paramList = new List<Param>();
        if (ParentID > 0)
        {
            whereClause = " WHERE CategoryID = @ParentID";
            paramList.Add(Param.Int("ParentID", ParentID));
        }
        return Db.GetDataTable(string.Concat(
            "SELECT ID, '' AS ChildType, Name AS Name, Description",
            " FROM Stuff", whereClause, " ORDER BY Name"), paramList);
    }

(code formatted by http://formatmysourcecode.blogspot.com/)

My friend must be smarter than me, because I don’t pretend to understand exactly what he is doing. However, I get the impression that maybe he is getting carried away with trying to come up with a very general solution where something simpler would suffice. I think I will be puzzling over this for some time. If I figure it out, I will write another blog about it.
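For what it's worth, here is my rough understanding of his pattern, translated into Java terms (the class and method names here are mine, not his): the table name is held as plain data, so the compiler cannot check the query, which is exactly why it resists LINQ's strongly typed style.

```java
// Sketch of the string-driven DAO idea in Java. Because the table name is
// an ordinary string, nothing is checked at compile time -- the trade-off
// that makes this approach generic but incompatible with strongly typed
// queries. All names are illustrative.
public class GenericDao {
    private final String type; // table name held as data, not as a type

    public GenericDao(String type) { this.type = type; }

    // Builds the same shape of statement as his Delete(int ID) method.
    public String buildDelete() {
        return "DELETE FROM " + type + " WHERE ID = @ID";
    }
}
```

So `new GenericDao("Widget").buildDelete()` produces the DELETE statement for any table you name at runtime, with no table-specific class anywhere.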

Tuesday, April 14, 2009

Cross-platform .NET GUI development

Although both Microsoft and Sun are in business to make money and increase market share to make more money, it would seem that there is a fundamental philosophical difference between the two companies regarding software development. For reasons which should be fairly obvious to most software developers, Microsoft does not care much for Sun’s “write once, run anywhere” mantra and has designed .NET to run only under Windows.


But what if one wanted to write .NET applications that can run anywhere? Is it even possible?


As it turns out, there are a couple of open source projects which aim to make it possible. One of them is Mono. From their web site: "The Mono Project is an open development initiative sponsored by Novell to develop an open source, UNIX version of the Microsoft .NET development platform."


Mono cannot run WPF applications, but it can run Windows Forms (WinForms). In addition, Mono has its own Linux-oriented windowing layer called gtk#, which has been ported to Windows. Also, there is wx.NET, a CLI wrapper for wxWidgets.


Mono can also be used to run ASP.NET applications from Apache on a Linux box.


Of course, just because you can do a thing doesn't mean you should. Should one bother trying to write cross-platform .NET? I would say it depends on why you want to do so and how you go about it.


If you’re out to prove how great Linux is, that’s probably not a good reason to write .NET for Linux unless you happen to work for one of Microsoft’s competitors.


If you have an existing .NET application that you want to run under Linux for some other reason, that might be a good reason to use Mono.


It is not uncommon for Java programmers to develop under Windows and then deploy an application to a Linux server. So, why not go the other way? If you just prefer to use Linux but have some need to develop for Windows, that might be a good reason to use Mono.


If you want to use Apache to run ASP.NET applications without IIS, possibly under Linux, that might also be a good reason to use Mono. As of April, 2009 Apache has 45.95% of the server market compared to 29.27% for Microsoft IIS according to netcraft.com.


Unfortunately, as with any copy-cat technology, Mono will always be one or two steps behind the latest Microsoft innovations. I already mentioned that Mono cannot do WPF.


Personally, I would not recommend using gtk# (gtkSharp), because it is specific to Mono, and I would not recommend wx.NET because it is an obscure technology.


If you read my previous blogs you know that I don’t like the idea of using obscure technologies and would prefer to spend my time learning skills that are marketable. That’s because I’m a professional software developer. If you’re a hobby programmer, you can pretty much do whatever you want.


That leaves Windows Forms (or ASP.NET if you are writing a web application). However, as a software developer I want to be learning and using the latest technologies. So, that is one negative. I shouldn’t have to limit myself in order to achieve cross-platform compatibility.


Unfortunately, I found that installation of Mono and Monodevelop IDE (the Linux port of SharpDevelop) can be a real pain depending on what distribution of Linux you are using. The Mono and MonoDevelop projects directly support OpenSUSE, but not Ubuntu or Redhat Linux. I am sure that is because both Mono and OpenSUSE are sponsored by Novell. However, this seems like a mistake to me. I found it very difficult to install MonoDevelop on Ubuntu and ended up changing to OpenSUSE instead. However, some people may not have this option. This issue could certainly have a negative impact on the adoption of Mono. This is a glaring example of how Linux applications also do not adhere to Sun’s “write once, run anywhere” philosophy.


And speaking of “write once, run anywhere”, if you look on the discussion forums on the Mono project web site, you will see many complaints of .NET applications working differently under Mono than under the true Microsoft .NET. Not that this is a big surprise. Fortunately, the Mono project supplies a Migration Analyzer tool, but even it is not perfect. However, I am thinking that if you are doing new development targeted for Mono, it would be better to use MonoDevelop under Linux for all your development so you won’t run into a lot of surprises later down the road with minor incompatibilities.

Friday, April 10, 2009

Avoiding Fads and Obscure Technologies

Years ago I had a job working with some structural engineers. They had developed a prototype in Fortran and I had to convert it to C++. These guys had no idea how easy they had it, because the science they were working with had changed little in 50 years. They kept getting paid for doing the same job year after year, but they didn't have to learn much new technology.

By contrast, as a computer science grad my field of expertise is constantly changing. Computer hardware is constantly becoming more powerful and the software is constantly changing.

Although it is always going to be a challenge keeping up with the pace of change, isn't it often worse than it has to be? I mean, some people are constantly re-inventing the wheel and/or chasing the most recent fads.

I currently work for a company that has a habit of creating their own in-house solutions for common problems and/or using obscure technologies. One of those obscure technologies is SwiXML. SwiXML is one of a small crop of frameworks built on top of Java Swing. SwiXML allows you to specify a GUI in XML instead of Java.
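To give a flavor of what that looks like, here is a rough SwiXML-style descriptor (written from memory of the SwiXML docs, so treat the attribute details as approximate):

```xml
<!-- Approximate SwiXML descriptor: tags map onto Swing classes. -->
<frame title="Demo" size="300,100">
  <panel>
    <button text="Click me" action="submit"/>
  </panel>
</frame>
```

A Java class then renders this with something like `new SwingEngine(this).render("demo.xml")`, binding the action attribute to a field on that class.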

Why use SwiXML? I think the idea was to create a richer GUI than could be done with the in-house servlets framework they were using before and to do it in a way that did not require as much testing. Apparently, the architects had already written off Struts as not working well in an Agile environment, and Eclipse/SWT was suggested and also shot down. I am not sure what other alternatives they may have explored.

Unfortunately, SwiXML is so obscure that I wouldn't even call it a niche technology. Ruby and Groovy qualify as niche technologies. If I search on the internet for jobs that require Ruby or Groovy, I can actually find some. However, a recent search on SimplyHired.com turned up exactly zero jobs that asked for SwiXML.

Let's face it. Even if you like learning new stuff, there are only so many hours in a day, and only so many brain cells to devote to learning new things. Our brains aren't any smarter than the ones our stone age ancestors used when the most complicated thing they had to create was flint tools. Evolution is too slow to keep up with science and technology. If we're constantly having new technologies forced on us, isn't that bad for productivity? Won't it cause us to only have a surface knowledge of what we're doing?

I would love to settle comfortably into a technology that I could trust to be around until I retire, like those structural engineers I mentioned. Is Java going to be around that long? I think so, but there are constantly new open source frameworks and innovations to learn. Spring and Hibernate are a couple that have become just as important to learn as Sun's standard J2EE technologies.

Evidently, .NET developers have fewer technologies that they need to be proficient in. Learn C# or VB.NET and ASP.NET, and you know everything you need to qualify for most of the .NET jobs that are out there. More than likely, the only IDE you will need to learn is Visual Studio.

It seems that these days there are as many .NET jobs out there as Java jobs. A few years ago I heard that 70% of new development was in Java, but I don't think that's true anymore. So, I think .NET is worth exploring for those who want to avoid wasting energy on fads and obscure technologies.

You can download Visual Studio Express editions for free. If you are a Linux fan, it is now possible to run some .NET applications under Linux thanks to a project sponsored by Novell called Mono. If you decide to give that a try, I would recommend using OpenSuse Linux. Since it is also supported by Novell, Mono is designed to work well with OpenSuse.

I have recently started helping a friend with a small .NET project. I don't know if I will ever totally switch from Java to .NET, but there are some jobs that require both. So, in the coming weeks I will be blogging about .NET and related topics. Stay tuned.

The Top 10 Tech Skills?

Reading a recent issue of NETWORKWORLD, I was intrigued by an article called "Does a Computer Science Degree Matter Anymore?" In a nutshell, their conclusion was "it depends". Evidently, there are some people who feel that a lack of solid computer science grads could damage America's competitiveness in technical innovation, but, on the other hand, most of the time when companies hire techies, they are looking for specific skills rather than a degree. So, as a side article they included what they think are the current "Top 10 Tech Skills".

I am, by nature, a skeptic, and as one of my professors liked to say, quoting Mark Twain, "There are three kinds of lies: lies, damned lies, and statistics." So, I don't accept such a list as given by NETWORKWORLD or any other "expert source" on face value.

Their numero uno skill is "Business Process Modeling". They say that in the past year pay for these skills is up 10.3%. That's great, but how many people does this affect? Can you actually get one of these jobs? Searching indeed.com for BPM jobs near my zip code, I found a whopping 10 jobs. Woo hoo! By contrast, a search on java turned up 334 jobs and a search for .NET turned up 407. So, here is their complete list, with my addition of the number of jobs near my zip code using some related job searches (some terms are too broad to search on, such as "database"). Also, note that the numbers I list are the number of jobs for which indeed.com could determine some salary estimate.

1. BPM
"BPM": 10 jobs
"business process modeling": 6 jobs
2. Database
"Microsoft SQL Server": 42 jobs
"Oracle Developer Suite": 0 jobs
"Oracle DBA": 10 jobs
"DBA": 78 jobs
3. Messaging/Communications
"VoIP": 49 jobs
4. IT Architecture
"software architect": 101 jobs
5. IT Security
"it security": 25 jobs
"computer security": 7 jobs
"security analyst": 7 jobs
6. Project Management
"software project manager": 467 jobs
7. Data Mining
"Data Mining": 22 jobs
8. Web Development
"Web Developer": 36 jobs
"Web 2.0": 6 jobs
9. IT Optimization
"IT Optimization": 0 jobs
"virtualization": 27 jobs
"cloud computing": 0 jobs
10. Networking
"cisco certified": 11 jobs
"network engineer": 38 jobs

So, by what standard are these the top 10 skills if there are so few actual jobs in some of these areas? Project managers seem to be in high demand, but I think it would be debatable whether this even belongs in a list of "tech skills".

By contrast:
"java": 334 jobs
"j2ee": 108 jobs
"hibernate spring": 51 jobs
"java swing": 19 jobs
".NET": 407 jobs
"c#": 179 jobs
"asp.net": 131 jobs
"visual basic": 83 jobs
"c++": 194 jobs
"cobol": 45 jobs! (there are more jobs for Cobol than for BPM!)
"fortran": 75 jobs!
"groovy": 1 job

Pay is a whole separate issue. Salaries for BPM are going up while salaries for Cobol have gone down since last year. But if you can't even get a job with your skill, what's the point?