Archive for the ‘Uncategorized’ Category

May
07
(at the risk of a little self promotion, but definitely as a service to the Fox Community)

Have you read some of the blog or forum posts touting the sessions at GLGDW 2006, and kicked yourself for not attending? Wish you had a second chance? Well Whil is giving you a second chance by releasing the session whitepapers as a new e-book: Visual FoxPro Best Practices for the Next Ten Years

I just read through the book (a whopping 415 pages) and personally feel it is worth twice what Whil is charging. Heck, there are a few chapters in this book I would be willing to pay $75 for by themselves. Go steal a copy for yourself at Hentzenwerke Publishing.

May
05
A little humor for my day was provided in an email from Epson concerning a rebate they owe me. I got the email today, and here is the content:
 


Dear Richard Schummer,

Your Epson $20.00 rebate check has been sent via 1st class mail on 12/28/2006. Please allow adequate time for delivery.

Sincerely,

Epson Rebate Administrator

PLEASE NOTE: This is an automated email message. Please do not reply directly to this message.


Looks like they can predict the future, or they have released my notice too soon. I know rebates always seem to take their sweet time getting to me, but building in a 7 month delay is pretty ridiculous {g}. I also like how I cannot contact them with a reply to the email. Thanks Epson. I hope I get it sooner, and I hope their IT department realizes the bug on their own since I have no way to tell them.
 

May
02
Or as I should really say: not support.
 
I know I am not your top-of-the-line hardware dude, but even I have been known to change a toner cartridge in a laser printer. After all, it is not a fax machine, so even I should be able to handle it. But this weekend it was proven that I am even less competent at hardware than I thought. It started Friday evening when I was printing out a recipe off the web. Simple enough, but the printer is screaming at me to feed it toner. Sunday I stop out at OfficeMax and pick up a cartridge.
 
I get home and open up the printer and try to pull the old toner assembly out of the printer and it is stuck. Stuck hard. So I call Brother’s tech support number and get a message: hours of support are from 6am to 9pm PST, except holidays. April 30th is not a holiday, but the recorded message tells me to call back during working hours despite it being in the alleged support time frame. Idiots. Tell me you are also closed on the weekend.
 
So I called back today and was routed fairly quickly to a support professional. It was a good experience to start, as the wait time was less than a minute. All I had to do was provide my phone number; they told me my printer model and asked how they could help. I explained my situation: I am out of toner, replacing it for the first time, and it is jammed. The guy asked me to pull hard. I did not want to break the drum assembly, so I pulled as hard as I dared before I thought I would break something. He asked me to check whether there was a foreign object in the assembly, and I looked (again) to confirm nothing was in the way. He was kind enough to tell me I was the first person he had heard of who could not pull the toner out of the printer. As if I don’t have enough esteem problems with hardware. He figured it would be better for me to embarrass myself in front of a human and gave me the phone number of a local tech.
 
I talked to the local tech and he asked me: do you see the white plastic connector on the left side of the door? Sure. Is it connected to a piece of black plastic? No. I connect it and bingo, I hear the sound of a release, and sure enough the toner slides right out. Simple as pie. This is the first thing they should have asked me. The local tech guy notes he would have made $50 if I had brought it in. I offer to do so, but he declines. First-class help from someone knowledgeable.
 
Have I mentioned how much I hate hardware?
 

Apr
27

Yes, I am finally back from GLGDW (and vacation). I would like to tell you I walked back from Milwaukee and this is the reason I am blogging my recap three days after the closing session (and it was a fun closing session, wasn’t it {bg}). The truth is I am extremely busy at the moment, both professionally and personally. This is a good thing. In reality, I spent the last couple of nights assembling some notes from the opening session to send to Whil with all the feedback and links from the Professional Developers Toolkit part. It was the fastest I ever assembled a session whitepaper, and I hope the attendees find it beneficial. All this and catching up with my client work after a 10 day vacation before the conference. You know how this goes. I figure it will be another week of catching up on all the work. All this and it is playoff season for the Red Wings and Pistons. Fortunately, I have patient clients and a patient family.

As others have blogged and posted on various forums and list servers, the conference was a smash hit. The format of the conference was very comfortable for me. In fact, the entire conference was patterned to draw out participation from everyone in the room. This is exactly how I like my sessions to flow – throw out some ideas and let the discussion flow from everyone in attendance. It has always been my contention we all benefit from the knowledge of many instead of the knowledge of one. As one developer told me Saturday, you could not risk falling asleep because you would miss too much. He told me even the most energetic speaker normally only keeps his attention for 30 minutes, but at GLGDW 2006 he stayed awake for the sessions.

So thanks to everyone who helped me present the Professional Developers Toolkit and Best Practices on Error Handling. It was a true pleasure to be a part of this event.

I attended every session. Normally I miss one or two slots because I have something pressing I need to take care of at the conference. I learned a lot from every presentation. This is not unusual, and is one of the big reasons I go to conferences. The inspiration factor is a big deal to me. GLGDW did not disappoint.

I pretty much agree with everything Doug Hennig blogged about (especially about the Thai food we ate Saturday night).

Each of the presenters (both those in the front of the room and the others sitting comfortably in the seats) challenged my thinking in some way:

  • Marcia and her “must/should/could” approach to determining where the code goes gives me a different approach for when I struggle to decide where in the class hierarchy the properties and methods belong.
  • Tamar keyed in some ideas on how my user interfaces can be better, especially around messaging users when saving and deleting data, and around accessibility. It was very apparent she is passionate about this topic.
  • Andy was the only speaker doing two sessions (local and remote data), and was the only speaker who made me open up my machine and test something I thought was incorrect (naturally I was wrong, but this was the idea – he challenged my thinking). I disagree with some of his best practices, especially on remote views, but respect his ideas and thought it was an awesome three-plus hours.

That was Saturday. Whew.

  • Nancy’s refactoring session kept bringing back memories of Steve Sawyer’s refactoring session several years ago in Milwaukee. Nancy shares Steve’s passion for this topic and did a great job. She brought out a number of “duh” moments and thanks to Doug, made me appreciate my development approach of not using WITH…ENDWITH.
  • Barbara tackled reporting, a topic given much exposure over the last couple of years because of the new reporting engine in VFP 9. The great thing about this session is it was not a how-to-use-the-Report-Designer session, but covered how to approach reports and the query form generically.
  • Cathy may have had the most difficult topic to tackle with Project Management. This is a topic most developers either love or hate. There is no code, no jumping in to VFP to show off this technique or that idea. Cathy provided a number of ideas and concepts which reminded me of all the good reasons project managers make good money.
  • The debugging session was packed with lots of good ideas from everyone. I happen to consider debugging one of the most important aspects of development because all developers spend a lot of time debugging software. Picked up a couple of tips.
  • Doug presented a great session on vertical market apps. This was one session I was very interested in personally because I am involved in building a couple of vertical market applications. Doug did not disappoint, despite struggling with a hardware failure getting his machine to push video to the projector. Rock solid.

Fun way to spend a Sunday for sure.

  • The last day of the conference is always the hardest, but Craig started it out with lots of best practices with respect to the middle tier and COM+. I have not done a lot of work with COM objects and the middle tier (distributed model) apps. I found the session interesting and learned plenty.
  • Rick Borup covered Deployment like no one else can. Obviously this is a topic near and dear to my heart. His approach showed developers that deployment is not something you consider at the end of the development project, but is part of the entire development life cycle. He provided some great best practices in each of the five stages. Top-shelf presentation from a top gun presenter.

I left Milwaukee exhausted and energized at the same time. I cannot wait to download the session materials (Whil has already sent out the message they are ready).

I liked the single threaded format and the single theme, but I know I would not want every conference to follow this format. It would get boring. Sometimes choice is good, and sometimes choice is bad. I know a mixture of formats will serve the conference organizers well.

My only complaint is the hotel Internet access. I had trouble keeping the wireless connected in my room, and tried dial-up, but that was even worse. Fortunately it was the weekend and there was not too much email traffic. If the wireless was better I would have blogged live during the conference.

Thanks Whil and Alexis. You were great hosts and put on a terrific Fox event.

GLGDW was a great way to kick off the year of Fox conferences. If you missed this one there are several more coming up in the fall.

So what are you waiting for, go get registered and take part in the inspiration. I figure I should be coming down from the adrenaline rush just about the same time I get caught up on all this work.

Apr
13

The Internet is barely usable over a dial up connection. I know, duh.

Last week I researched wireless hotspots and broadband connections in some areas I will be traveling to this year (both business and personal). I figure this is 2006 and wireless is so widely available I would not have trouble getting connected when I was away from home. I was wrong.

I have enjoyed a broadband connection for the last seven years or so. It is now an everyday part of my life, and one thing I count on being available to me all the time. I know how spoiled I am by the reliable DSL connection in my home office, and the ones I use at clients, on the road in hotels and campgrounds, and even in some of the restaurants I eat at. These last two days with a dial up connection have proven again the Internet is not designed for even casual use over a 56K modem. Don’t get me wrong, I am thankful Ameritech, oh sorry – SBC, um, I mean AT&T have dial-up numbers for their DSL customers. It is perfect in situations where you are on the road and there is no wireless access.

Email has been fine except for one particularly large attachment I got this morning which slowed the entire email pass down. This is not the part I struggle with today. Browsing Web sites has been a trip. A very slow trip. I feel like I am working in slow motion. I want my instant gratification. Today the Internet world is automatically a graphics-intense experience. Companies feel the need to be “flashy” and artsy to stand out in their industry, and to differentiate themselves from their competition. I understand this need, but have the graphic artists and Web designers forgotten that not all people are connecting through broadband channels?

I know it is easy to argue that most people are connecting to the Web these days via broadband. But I still know many lower-tech families who do not feel the need to pay US$30-50 a month for their Internet access. They don’t use it. In Michigan I have seen a slower DSL service offered for US$13 (cheaper than many dial-up only offerings), so the argument is getting weaker. I guess the dial-up folks have more patience than I do.

I have a different perspective of the Internet of late: mobile access. As you may know from previous posts I have a new Treo 700w. When I bought this remarkable phone I decided to try out the Verizon data service. It is a bit pricey at US$45 a month, but I wanted to try out the platform. I like it. I like the ability to review email when I am not at my PC, I like the fact I can look up business addresses (like where is the closest Panera Bread), and I like being able to Instant Message with clients and geeky friends. It has been an interesting experiment. But browsing the Internet has exposed me to another problem: the Web is not yet “mobilized”. Some sites are, but most sites are formatted and optimized for the general public using a full size browser. Pages are too big both in width and in content. Heck, even my Web sites are difficult to use on this platform. The question is: does this matter? I don’t have the answer yet, but I suspect it does not matter today. It might matter in a year or two. What do you think?

So I continue to enjoy my dial-up for the short term, and occasionally get my broadband fix at a local McDonalds when I need to send large fries (I mean files {g}), or want some instant speed when researching on the Web. I access this through the AT&T Freedomlink service. It is only US$2 a month, which is inexpensive and well worth it for me. Heck, last year I paid US$40 for a month of T-Mobile access at Kinko’s when I was traveling for a week. Freedomlink is only US$24 a year so I am way ahead.

Apr
10

Last week one of my clients sent me an email asking for some help parsing data. He had a bunch of documents used in his industry all stored in memo fields. The memos look like the contents of a standard Word document. They start out with a table of contents (complete with page numbers and leading periods). Each of the sections inside the memo has a header included in the table of contents.

My mission was to come up with the code to parse the memo into individual records for each of the sections inside the memo. If the memo has 53 entries in the table of contents, I should have 53 records in the resulting record set. The contents of each section vary. There may be no text, there may be several paragraphs of text, or something in between. If the table of contents has an entry, there will minimally be a header in the text. The code needs to put the header entry in one column and the section contents in a memo field.

Want to guess how long it would take you to write this and how many lines of code? Go ahead, take a wild guess. My initial guess was 60 minutes. I did not guess how many lines of code.

I wrote the initial cut of the solution in less than an hour. I parsed out the table of contents using ALINES(), and then parsed out the section headers from the table of contents by looking for all the text in front of the leading periods. I used the section headers and extracted all the text between them in the rest of the memo using STREXTRACT(). Unfortunately, some of the words in the memo were duplicated in the table of contents, so I was getting the table of contents in the parsed text. It took me a little while to work around the duplicates issue and clean up the extra spaces and carriage returns.

The solution I prototyped for him can be downloaded here: ParseMemoViaTableOfContents.prg

All done in 61 lines of code (including minimal comments and some white space). VFP’s string parsing capabilities absolutely rock! Don’t you agree?
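For readers who don’t have VFP handy, here is a rough Python sketch of the same approach. This is not the actual prg – the function name, the assumed “Header....page” table-of-contents format, and the sample memo are all my own illustration; the real solution uses ALINES() and STREXTRACT().

```python
import re

def parse_memo(memo):
    """Split a memo into (header, body) pairs using its table of contents.

    Assumes each TOC line looks like 'Section Header....1' (header,
    a run of leading periods, then a page number).
    """
    # Section names come from the TOC lines: everything before the
    # run of periods and the trailing page number.
    headers = [h.strip() for h in
               re.findall(r'^(.+?)\.{3,}\s*\d+\s*$', memo, flags=re.MULTILINE)]

    # Skip past the TOC so words duplicated there are not matched again:
    # the body starts at the first occurrence of the first header
    # after the last dotted TOC line.
    body = memo[memo.find(headers[0], memo.rfind('...')):]

    # Walk the body with a moving cursor, slicing out the text
    # between each header and the next one.
    sections = []
    pos = 0
    for i, header in enumerate(headers):
        start = body.find(header, pos) + len(header)
        end = body.find(headers[i + 1], start) if i + 1 < len(headers) else len(body)
        sections.append((header, body[start:end].strip()))
        pos = end
    return sections
```

The moving cursor (`pos`) mirrors the duplicates workaround described above: each header is only searched for after the previous section ends, so repeated words cannot pull the parse backward into the table of contents.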

I knew the client was satisfied when he called it the “cat’s meow”. {g}

Apr
09

Saturday I drove out to Grand Rapids to rehearse the Best Practices for Error Handling session I will present at GLGDW 2006 in a couple of weeks. I usually have my sessions down pat before I head out to a user group to test the timing and material, but Saturday was a bit different. I have been working like a crazy developer the last couple of months and barely had time to go through my session once before I drove across the state this weekend. I have nightmares like this all the time: I am walking through the Milwaukee airport and realize I have not put my session together. This was about as close as I have ever come to realizing this in real life. Sure, the slides and white paper are done, but I have not looked at them since I sent the email to Whil.

Take it from me, it is all in the preparation. Rehearse, rehearse, rehearse! No matter how many conferences and user groups I have visited over the years, I still need to rehearse. I have been to many sessions where I can tell the presenters were giving the session for the first time. Mailing it in. I feel a little cheated when this happens, and there is no way I want to cheat developers spending their hard earned money to attend the conference.

All the great presenters rehearse. I learned this from some of the best in the business years ago before I broke into the speaking circuit.

Fortunately, the GRAFUGgers are a friendly and generous crowd and let this geek work out the kinks on the fly. The feedback is so valuable. They asked terrific questions, and forced me to think on my feet. We also proved I have about 120 minutes of presentation material that needs to be crammed into 90 minutes. I can either speak faster or trim some fat.

Regardless, the 100+ attendees at GLGDW will have some homework after the conference. There is no way I can present 50+ pages of material in 90 minutes, even if I turn up my game and talk as fast as Doug Hennig or the late Drew Speedie.

Looking forward to seeing everyone in a couple of weeks. I am already getting geeked.

Apr
03

Exactly who saves time when we change the clocks to Daylight Saving Time? Not me.

I spent 15 minutes changing the clocks in my life. Thank goodness for engineers who developed clocks that autosync with the atomic clocks and those that change the clocks on my computers twice a year.

Today I have several RSS feeds duplicated in FeedDemon because someone changed the clock on the server and now all the RSS entries are one hour different from the original posting. Maybe Nick Bradbury addresses this in the latest version. Note to self: get moving with the upgrade. Fifteen more minutes deleting the duplicate entries.

My body clock wakes me up without an alarm. This is something I like except for days like today and the next few where it will be completely screwed up. So I lost an hour of productivity this morning because I overslept.

So far I am down 2.5 hours if you include the time shift. Bummer. Short of moving to Indiana or Arizona, how can I avoid this problem? I know I will make some of it back in October, but who can wait that long?