Why We Race and Why We Code

[Photo: Andrew Aquilante and Jeff Feit studying live P1TS information during practice.]

What is it that draws us to racing or to creating software? After returning from Mazda Raceway Laguna Seca this past weekend with Phoenix American Motorsports, I reflected on the question.

Race Cars

On magazine pages, on websites, and in the paddock, they are sleek, caged animals at rest. We take our time studying their static beauty of form and function. We go to races because in motion they are the ultimate expression of dynamics. We immerse ourselves in their sounds, smells, touch, and even taste. The best-prepared race cars enchant their drivers with the lure of driving the perfect race.

Underneath the shiny exteriors lie many long hours of work. Before the cars are built, components are carefully evaluated. If none suffice, they are fabricated. The promise of speed is weighed against durability, cost, and time. There is struggle and trade-off behind every decision. Cars, drivers, and crew practice and test between races. After each race, we laugh, cry, rinse, and repeat the entire cycle again, always striving for improvement.

Code

Magical forms, buttons, numbers, and images appear and change on your laptop, tablet, and phone at your command.  The best programs and apps delight us with abilities we did not have before, giving us a sense of mastery and power.

Behind the glass, programmers, testers, and documentation writers spend hours creating something out of nothing, with bits out of the ether. We evaluate and then choose or build the tools and libraries that allow us to express what is in our minds. The promise of perfect user satisfaction is weighed against robustness, complexity, cost, and time to delivery. There is struggle and trade-off behind every decision. Developers code, test, and write between product deliveries. After each sprint, we repeat the cycle again, striving for improvement and delivering delight.

P1Software at Laguna Seca

The latest P1TS software was successfully run and used by the Phoenix American Motorsports team competing in the IMSA Continental SportsCar Challenge this past weekend. With it we were able to study and compare not only each of our drivers' sector performance in real time, but also that of other drivers.

[Photo: Preston Calvert strings together multiple purple sectors into a best rolling lap.]

Some of the pre-race software testing consists of running archived race data streams through the server continuously for 24 hours while simultaneously running multiple clients, testing for CPU, network, and memory problems. Even with all the pre-race testing, there were still minor surprises running P1TS at Laguna Seca…
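As a rough illustration of that burn-in setup, here is a minimal replay sketch in Java. The archive file name, port, and pacing are assumptions for illustration, not the actual P1TS test rig; it simply plays an archived feed back over TCP in a loop so the server and its clients can run against it for hours:

```java
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

// Hypothetical replay feed for burn-in testing; the file name, port,
// and pacing here are assumptions, not the real P1TS test harness.
public class ReplayFeed {
    public static void main(String[] args) throws Exception {
        List<String> archive = Files.readAllLines(Paths.get("laguna-feed.log"));
        long stopAt = System.currentTimeMillis() + 24L * 60 * 60 * 1000; // ~24-hour soak

        // Act as the timing system: accept one client (the server under
        // test) and stream the archived messages to it in a loop.
        try (ServerSocket listener = new ServerSocket(50000);
             Socket client = listener.accept();
             PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
            while (System.currentTimeMillis() < stopAt) {
                for (String line : archive) {
                    out.println(line); // one protocol message per line
                    Thread.sleep(50);  // crude pacing; a real replay would honor timestamps
                }
            }
        }
    }
}
```

Meanwhile, a separate process watches CPU, memory, and network use on the server and its clients for drift over the full run.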

  • IMSA timing and scoring extends the popular RMonitor protocol with additional line crossings and their associated messages, greatly enriching the data stream. You can see in the screen above that at Laguna Seca there are eight sectors. P1TS was running throughout most of the race weekend for live on-track burn-in testing. It is coded to be fault-tolerant and to keep processing even when it encounters corrupt or unexpected data streams (it happens), logging the anomalies. Well, it turns out some of the other race series were running pure RMonitor (only a start/finish loop), resulting in internally caught NullPointerExceptions and a pretty empty-looking Sectors panel. A bug fix was created, evaluated, and deployed at the track; a sketch of this kind of defensive parsing follows this list.
  • The mix of hardware using the P1TS system includes a server listening to the local IMSA Ethernet data stream and producing and serving an in-memory database of second-level derived and historical information (also sketched after this list). Surrounding it are client Toughbooks, laptops, iPads, and smartphones connecting to the server via our own WiFi router. One of the Toughbooks runs legacy non-P1Software programs like the RMonitor Windows program. All the systems were running well until mid-race, when the RMonitor program stopped responding. It turns out that a saved MiFi connection became active and the PC decided to switch to it, so it could no longer reach IMSA's local data server. Fortunately, after the first day of practice I had switched the server machine to an Ethernet connection to our router, as trusting WiFi for mission-critical software introduces additional risk.
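On the first surprise, here is a minimal sketch of that style of defensive parsing. The class name, field layout, and column index are illustrative assumptions, not the real P1TS internals or IMSA's actual extended message format; the point is to log the anomaly and return an empty result when sector data is absent or corrupt, rather than letting a NullPointerException escape:

```java
import java.util.logging.Logger;

// Illustrative sketch only: SectorParser, the field layout, and the
// column index are hypothetical, not the actual P1TS code.
public class SectorParser {
    private static final Logger LOG = Logger.getLogger(SectorParser.class.getName());

    /**
     * Returns the sector time in milliseconds, or null when the feed
     * carries no sector data (e.g., a pure RMonitor stream with only a
     * start/finish loop). Callers render an empty Sectors panel for
     * null instead of crashing.
     */
    public Long parseSectorMillis(String[] fields, int sectorColumn) {
        if (fields == null || sectorColumn >= fields.length
                || fields[sectorColumn].isEmpty()) {
            LOG.warning("No sector field present; feed may be pure RMonitor");
            return null;
        }
        try {
            return Long.parseLong(fields[sectorColumn].trim());
        } catch (NumberFormatException e) {
            LOG.warning("Corrupt sector value: " + fields[sectorColumn]);
            return null; // log the anomaly and keep processing
        }
    }
}
```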
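On the second, the ingest-and-serve shape of the server might look roughly like this, again with hypothetical names throughout: a single ingest thread consumes the wired feed and updates an in-memory map of derived state, which is handed out as snapshots to the WiFi-connected clients:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of the server's shape; the names and structure
// are assumptions, not the actual P1TS implementation.
public class DerivedStateStore {

    /** Latest derived info for one car (best lap, sector times, ...). */
    public static class CarState {
        public final long bestLapMillis;
        public final long[] sectorMillis;

        public CarState(long bestLapMillis, long[] sectorMillis) {
            this.bestLapMillis = bestLapMillis;
            this.sectorMillis = sectorMillis;
        }
    }

    // car number -> latest derived state; safe for one writer, many readers
    private final Map<String, CarState> byCar = new ConcurrentHashMap<>();

    /** Called by the single ingest thread reading the wired IMSA feed. */
    public void update(String carNumber, CarState latest) {
        byCar.put(carNumber, latest);
    }

    /** Called by client-serving threads; copying isolates clients from
     *  in-flight updates while they render. */
    public Map<String, CarState> snapshot() {
        return new HashMap<>(byCar);
    }
}
```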

Just as we must race to see what actually happens, we deploy software in real customer situations to see what unexpected things happen.

Let me know what you think of this article or if you have similar experiences to share.