Archive for April, 2006


Yes, I am finally back from GLGDW (and vacation). I would like to tell you I walked back from Milwaukee and that is the reason I am blogging my recap three days after the closing session (and it was a fun closing session, wasn’t it? {bg}). The truth is I am extremely busy at the moment, both professionally and personally. This is a good thing. I spent the last couple of nights assembling some notes from the opening session to send to Whil, with all the feedback and links from the Professional Developers Toolkit part. It was the fastest I ever assembled a session whitepaper, and I hope the attendees find it beneficial. All this while catching up with my client work after a 10-day vacation before the conference. You know how this goes. I figure it will be another week of catching up on all the work. On top of that, it is playoff season for the Red Wings and Pistons. Fortunately, I have patient clients and a patient family.

As others have blogged and posted on various forums and list servers, the conference was a smash hit. The format of the conference was very comfortable for me. In fact, the entire conference was patterned to draw out participation from everyone in the room. This is exactly how I like my sessions to flow – throw out some ideas and let the discussion flow from everyone in attendance. It has always been my contention we all benefit from the knowledge of many instead of the knowledge of one. As one developer told me Saturday, you could not risk falling asleep because you would miss too much. He told me even the most energetic speaker normally only keeps his attention for 30 minutes, but at GLGDW 2006 he stayed awake for the sessions.

So thanks to everyone who helped me present the Professional Developers Toolkit and Best Practices on Error Handling. It was a true pleasure to be a part of this event.

I attended every session. Normally I miss one or two slots because I have something pressing I need to take care of at the conference. I learned a lot from every presentation. This is not unusual, and is one of the big reasons I go to conferences. The inspiration factor is a big deal to me. GLGDW did not disappoint.

I pretty much agree with everything Doug Hennig blogged about (especially about the Thai food we ate Saturday night).

Each of the presenters (both those in the front of the room and the others sitting comfortably in the seats) challenged my thinking in some way:

  • Marcia and her “must/should/could” approach to determining where the code goes gives me a different way to think when I struggle to decide where in the class hierarchy the properties and methods need to go.
  • Tamar keyed me in on some ideas on how my user interfaces can be better, especially around messaging users when saving and deleting data, and around accessibility. It was very apparent she is passionate about this topic.
  • Andy was the only speaker doing two sessions (local and remote data), and was the only speaker who made me open up my machine and test something I thought was incorrect (naturally I was wrong, but this was the idea – he challenged my thinking). I disagree with some of his best practices, especially on remote views, but respect his ideas and thought it was an awesome three-plus hours.

That was Saturday. Whew.

  • Nancy’s refactoring session kept bringing back memories of Steve Sawyer’s refactoring session several years ago in Milwaukee. Nancy shares Steve’s passion for this topic and did a great job. She brought out a number of “duh” moments and thanks to Doug, made me appreciate my development approach of not using WITH…ENDWITH.
  • Barbara tackled reporting, a topic given much exposure over the last couple of years because of the new reporting engine in VFP 9. The great thing about this session is it was not a how-to-use-the-Report-Designer session, but a look at how to approach reports and the query form generically.
  • Cathy may have had the most difficult topic to tackle with Project Management. This is a topic most developers either love or hate. There is no code, no jumping into VFP to show off this technique or that idea. Cathy provided a number of ideas and concepts which reminded me of all the good reasons project managers make good money.
  • The debugging session was packed with lots of good ideas from everyone. I happen to consider debugging one of the most important aspects of development because all developers spend a lot of time debugging software. I picked up a couple of tips.
  • Doug presented a great session on vertical market apps. This was one session I was very interested in personally because I am involved in building a couple of vertical market applications. Doug did not disappoint, despite struggling with a hardware failure getting his machine to push video to the projector. Rock solid.

Fun way to spend a Sunday for sure.

  • The last day of the conference is always the hardest, but Craig started it out with lots of best practices with respect to the middle tier and COM+. I have not done a lot of work with COM objects and the middle tier (distributed model) apps. I found the session interesting and learned plenty.
  • Rick Borup covered Deployment like no one else can. Obviously this is a topic near and dear to my heart. His approach showed developers that deployment is not something you consider at the end of the development project, but is part of the entire development life cycle. He provided some great best practices in each of the five stages. Top-shelf presentation from a top gun presenter.

I left Milwaukee exhausted and energized at the same time. I cannot wait to download the session materials (Whil has already sent out the message they are ready).

I liked the single threaded format and the single theme, but I know I would not want every conference to follow this format. It would get boring. Sometimes choice is good, and sometimes choice is bad. I know a mixture of formats will serve the conference organizers well.

My only complaint is the hotel Internet access. I had trouble keeping the wireless connected in my room, and tried dial-up, but that was even worse. Fortunately it was the weekend and there was not too much email traffic. If the wireless had been better I would have blogged live during the conference.

Thanks Whil and Alexis. You were great hosts and put on a terrific Fox event.

GLGDW was a great way to kick off the year of Fox conferences. If you missed this one there are several more coming up in the fall.

So what are you waiting for? Go get registered and take part in the inspiration. I figure I should be coming down from the adrenaline rush just about the same time I get caught up on all this work.


The Internet is barely usable over a dial up connection. I know, duh.

Last week I researched wireless hotspots and broadband connections in some areas I will be traveling to this year (both business and personal). I figured it is 2006 and wireless is so widely available that I would not have trouble getting connected when I was away from home. I was wrong.

I have enjoyed a broadband connection for the last seven years or so. It is now an everyday part of my life, and one thing I count on being available to me all the time. I know how spoiled I am by the reliable DSL connection in my home office, and the ones I use at clients, on the road in hotels and campgrounds, and even in some of the restaurants I eat at. These last two days with a dial-up connection have proven again the Internet is not designed for even casual use over a 56K modem. Don’t get me wrong, I am thankful Ameritech, oh sorry – SBC, um, I mean AT&T, have dial-up numbers for their DSL customers. It is perfect in situations where you are on the road and there is no wireless access.

Email has been fine except for one particularly large attachment I got this morning, which slowed the entire email download. This is not the part I struggle with today. Browsing Web sites has been a trip. A very slow trip. I feel like I am working in slow motion. I want my instant gratification. Today the Internet world is automatically a graphics-intensive experience. Companies feel the need to be “flashy” and artsy to stand out in their industry, and to differentiate themselves from their competition. I understand this need, but have the graphic artists and Web designers forgotten not all people are connecting through broadband channels?

I know it is easy to argue that most people are connecting to the Web these days via broadband. But I still know many lower-tech families who do not feel the need to pay US$30-50 a month for their Internet access. They don’t use it. In Michigan I have seen a slower DSL service offered for US$13 (cheaper than many dial-up only offerings), so the argument is getting weaker. I guess the dial-up folks have more patience than I do.

I have a different perspective of the Internet of late: mobile access. As you may know from previous posts I have a new Treo 700w. When I bought this remarkable phone I decided to try out the Verizon data service. It is a bit pricey at US$45 a month, but I wanted to try out the platform. I like it. I like the ability to review email when I am not at my PC, I like the fact I can look up business addresses (like where is the closest Panera Bread), and I like being able to Instant Message with clients and geeky friends. It has been an interesting experiment. But browsing the Internet has exposed me to another problem: the Web is not yet “mobilized”. Some sites are, but most sites are formatted and optimized for the general public using a full size browser. Pages are too big both in width and in content. Heck, even my Web sites are difficult to use on this platform. The question is: does this matter? I don’t have the answer yet, but I suspect it does not matter today. It might matter in a year or two. What do you think?

So I continue to enjoy my dial-up for the short term, and occasionally get my broadband fix at a local McDonalds when I need to send large fries (I mean files {g}), or want some instant speed when researching on the Web. I access this through the AT&T Freedomlink service. It is only US$2 a month, which is inexpensive and well worth it for me. Heck, last year I paid US$40 for a month of T-Mobile access at Kinko’s when I was traveling for a week. Freedomlink is only US$24 a year so I am way ahead.


Last week one of my clients sent me an email asking for some help parsing data. He had a bunch of documents used in his industry all stored in memo fields. The memos look like the contents of a standard Word document. They start out with a table of contents (complete with page numbers and leading periods). Each of the sections inside the memo has a header included in the table of contents.

My mission was to come up with the code to parse the memo into individual records for each of the sections inside the memo. If the memo has 53 entries in the table of contents, I should have 53 records in the resulting record set. The contents of each section vary. There may be no text, there may be several paragraphs of text, or something in between. If the table of contents has an entry, there will at minimum be a header in the text. The code needs to put the header entry in one column and the section contents in a memo field.

Want to guess how long it would take you to write this and how many lines of code? Go ahead, take a wild guess. My initial guess was 60 minutes. I did not guess how many lines of code.

I wrote the initial cut of the solution in less than an hour. I parsed out the table of contents using ALINES() and then parsed out the section headers from the table of contents, looking for all the text in front of the leading periods. I used the section headers and extracted all the text between them in the rest of the memo using STREXTRACT(). Unfortunately some of the words in the memo were duplicated in the table of contents, so I was getting the table of contents in the parsed text. It took a little while for me to work around the duplicates issue and clean up the extra spaces and carriage returns.
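For illustration only, here is a rough sketch of the approach. This is not the actual 61-line prototype; the variable and cursor names are made up, and it assumes the table of contents has already been split off into lcTocText, with the rest of the memo in lcBodyText and leading periods on every TOC line:

```foxpro
* Hypothetical sketch of the parsing approach, not the downloadable solution.
CREATE CURSOR curSections (cHeader C(100), mContents M)

LOCAL lnCount, lnI, lcHeader, lcNext, lcBody
LOCAL ARRAY laToc[1]

* Break the table of contents into one array element per line.
lnCount = ALINES(laToc, lcTocText)

FOR lnI = 1 TO lnCount
   * The header is all the text in front of the leading periods.
   lcHeader = ALLTRIM(LEFT(laToc[lnI], AT("..", laToc[lnI]) - 1))

   IF lnI < lnCount
      * Extract everything between this header and the next one.
      lcNext  = ALLTRIM(LEFT(laToc[lnI + 1], AT("..", laToc[lnI + 1]) - 1))
      lcBody  = STREXTRACT(lcBodyText, lcHeader, lcNext)
   ELSE
      * Last section: flag 2 makes the ending delimiter optional,
      * so the extraction runs to the end of the memo.
      lcBody  = STREXTRACT(lcBodyText, lcHeader, "", 1, 2)
   ENDIF

   INSERT INTO curSections VALUES (lcHeader, ALLTRIM(lcBody))
ENDFOR
```

The real program has the extra work described above: handling headers whose words are duplicated in the table of contents, and cleaning up stray spaces and carriage returns.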

The solution I prototyped for him can be downloaded here: ParseMemoViaTableOfContents.prg

All done in 61 lines of code (including minimal comments and some white space). VFP’s string parsing capabilities absolutely rock! Don’t you agree?

I knew the client was satisfied when he called it the “cat’s meow”. {g}


Saturday I drove out to Grand Rapids to rehearse my Best Practices for Error Handling session I will present at GLGDW 2006 in a couple of weeks. I usually have my sessions down pat before I head out to a user group to test the timing and material, but Saturday was a bit different. I have been working like a crazy developer the last couple of months and barely had time to go through my session once before I drove across the state this weekend. I have nightmares like this all the time: I am walking through the Milwaukee airport and realize I have not put my session together. This was about as close as I have ever come to living that nightmare in real life. Sure, the slides and white paper are done, but I had not looked at them since I sent the email to Whil.

Take it from me, it is all in the preparation. Rehearse, rehearse, rehearse! No matter how many conferences and user groups I have visited over the years, I still need to rehearse. I have been to many sessions where I can tell the presenters were giving the session for the first time. Mailing it in. I feel a little cheated when this happens, and there is no way I want to cheat developers spending their hard earned money to attend the conference.

All the great presenters rehearse. I learned this from some of the best in the business years ago before I broke into the speaking circuit.

Fortunately, the GRAFUGgers are a friendly and generous crowd and let this geek work out the kinks on the fly. The feedback is so valuable. They asked terrific questions, and forced me to think on my feet. We also proved I have about 120 minutes of presentation material that needs to be crammed into 90 minutes. I can either speak faster or trim some fat.

Regardless, the 100+ attendees at GLGDW will have some homework after the conference. There is no way I can present 50+ pages of material in 90 minutes, even if I turn up my game and talk as fast as Doug Hennig or the late Drew Speedie.

Looking forward to seeing everyone in a couple of weeks. I am already getting geeked.


Exactly who saves time when we change the clocks to Daylight Savings Time? Not me.

I spent 15 minutes changing the clocks in my life. Thank goodness for engineers who developed clocks that autosync with the atomic clocks and those that change the clocks on my computers twice a year.

Today I have several RSS feeds duplicated in FeedDemon because someone changed the clock on the server and now all the RSS entries are one hour different from the original posting. Maybe Nick Bradbury addresses this in the latest version. Note to self: get moving with the upgrade. Fifteen more minutes deleting the duplicate entries.

My body clock wakes me up without an alarm. This is something I like except for days like today and the next few where it will be completely screwed up. So I lost an hour of productivity this morning because I overslept.

So far I am down 2.5 hours if you include the time shift. Bummer. Short of moving to Indiana or Arizona, how can I avoid this problem? I know I will make some of it back in October, but who can wait that long?


Sorry Craig, your excuses are weak {g}.

I host my own Blogger blog on my own server, and all the comments posted by readers are on *my site*, not Blogger’s. I can download the HTML files for each post and they contain the comments too. This is how it works when you host your own blog. I can leave Blogger any day I want and I have all the content and all the comments. I understand your concern for sure, but I think it is unwarranted in this case. Use the tools and save yourself some time not building your own solution. Life is too short and there is too much to do. Think of all the positive blog entries you have time to post now. {bg}

I recently turned on Comment Moderation for the same reason you turned comments off. Sploggers are stupid. I figured my blog would be off the radar of these idiots, but I got a couple and decided to go the moderation route. Unfortunately there is a short delay between the post and my acceptance, but this is a small price to pay. I get notified right away and normally publish comments within minutes of the post. I will only block spam and foul language, not comments that disagree with my thinking. That is a promise.

By the way, I was not saying Doug should not expose his feed link. I think we both know Doug was going to do this. I was only suggesting this should not be a reason not to subscribe to his blog. Missing the good blog posts is silly when most good tools will find the feed link by themselves. Heck, I would drive to the local library to read Doug’s blog on a public computer if all 5 of my computers went down simultaneously.

Maybe we should just debate this at WhilFest over a couple of lemonades. My treat! Look forward to seeing you in a few weeks.


When I started White Light Computing two years ago I wrote down several things I knew I needed to do to make the company successful. Competitive advantages and things I have learned from mistakes in the past. One of those competitive advantages is taking regular company field trips. A company field trip is all about learning something to make me a better business person. It can be networking, it can be educational, or it can be technical. Most of the time it has nothing to do with technical growth. A couple of weeks ago I mixed it up a bit (technical, business, and networking all at once) and took a trip to visit the fine folks at TechSmith in Okemos, Michigan.

In case you are not familiar with TechSmith, they are a software company here in Michigan and the creators of SnagIt (the leading screen capturing program), Camtasia (the leading screencasting program), and Morae (usability testing). I use both SnagIt and Camtasia on a regular basis. They are best-of-class tools and, I can say, “must haves” in my professional toolkit.

SnagIt is something I have become dependent upon to author specifications, documentation, articles, session whitepapers, and Help files, and it has been a part of all four books I have written. There are plenty of competitors in the screen capturing utility category, and I have tried a bunch of them. SnagIt beats them all hands down. I use Camtasia to record training material, screencasts for marketing the WLC line of developer tools, and to show customers prototypes of features I am working on. Recently my clients have started using Camtasia to show me reproducible steps for bugs and even enhancements. I have only been using Camtasia for a little more than a year, and I quickly found it to be one of those products I wonder how I lived without.

Betsy Weber is the Chief Evangelist at TechSmith and writes the TechSmith Blog. Back in October she placed an open invitation for a VIP tour at the company’s world headquarters. I am not a VIP by any stretch of the imagination, but I was in the Lansing area to speak with the Mid-Michigan Fox User and Developer group so I sent Betsy an email to see if she was serious.

The tour was great. I was introduced to the program managers, product managers, developers, tech support people, the president, and the guy who ships their products all around the world from the middle of Michigan. Everyone was asking me how I used their products and what I would like improved. I provided some feedback and got to ask some questions. Nothing like getting a personal demonstration of the next version in the hallway from the SnagIt product manager on his TabletPC.

The facilities are nice and very conducive to development and collaboration. I heard stories about how one company used three Camtasia licenses to save twenty million dollars. Yes, twenty million. I would love it if my revenues hit five percent of their savings.

Betsy was very interested in my business, maybe even more than I was in hers. She wanted to understand how I used the TechSmith products to be successful and profitable. She was quite curious about Visual FoxPro and the Microsoft Most Valuable Professional program.

I told Betsy a story about how one of my clients tried the Camtasia trial on his own time, and built a screencast that trains the field on how to use a module in their app. I told her how it addressed one of their big support concerns. My client is a staff developer and got lots of kudos from the field, but his management would not spring for Camtasia because they felt it would distract him from his priorities. In fact, he finally convinced them to buy it after six months of pleading, and then they refused to give it to him when it arrived. Betsy set me up with some swag and door prizes for the MMFUDG meeting, and provided me some big help in getting a copy of Camtasia and SnagIt for my friend. Totally cool.

I have been invited to be on the TechSmith Advisory Board. I will be providing feedback to the development teams, taking more trips to TechSmith in the future, participating in their beta program, and having more geek fun than one person probably should.

It was a great field trip. I learned a lot during the tour about how larger software companies operate, which will be key as White Light Computing grows. I also made some new friends in our industry, and some excellent contacts, which will help out my clients in the future.