Bad customer experience: Valve and Steam delivered games

I returned home from a week-long vacation out of state to find my internet connection down, which, after some calls to tech support, yields a technician coming to my house in three days. At any rate, as I'm a pretty avid gamer, I have a stash of single-player games which I typically save for these "rainy days".

Well, over time, it seems I have picked up a fair number of single-player games purchased and delivered through Valve's Steam system, of which I am a pretty big fan. The ability to just download and play the games you have already purchased is awesome!

Now normally, when your internet connection temporarily goes down, Steam supports an "Offline Mode" which allows you to play your games even though it cannot connect to the Valve servers to validate the purchase. And usually it works fine. Games which require internet access don't work so well, but you'd run into that with or without Steam, no doubt.

The problem I have run into, however, is one I would have to call a bug. As I mentioned, I had been away for a week and my machine was turned off. So after some period of time, the Steam client decides it needs to check for an update, which ultimately fails because the internet connection is down. That in itself is fine; the problem is that when an update fails, the Steam client will not allow anything else to happen and closes Steam. Even more frustrating, it pops up the dialog that lets you either cancel running Steam or start it in "Offline Mode"; however, when a Steam client update fails, it can only close Steam and will not allow "Offline Mode" to proceed.

Here are some screenshots of the process:

1) So far, so good: it starts up and tries to update.


2) It starts doing something; in fact it gets to 27%, pauses for a second, then retries. It will try three times in total, never getting past the original 27%. This is likely because the internet connection this machine is currently using is a tethered mobile phone on my provider's 3G network; I'm not blaming Steam for this. With absolutely no internet connection, this #2 shot never happens; it proceeds straight to #3.


3) After 3 failed attempts at completing the download and update, it errors out.


4) We are now presented with the option to Retry, Quit, or Start in Offline Mode. Retry repeats steps 1 through 3. Quit simply quits the Steam client. So we try "Start in Offline Mode" to get to our games list, or on to the game we started via a Steam-generated shortcut.


5) The Steam client update is apparently a MUST HAVE to proceed, and since it has failed (and in this case can never succeed, as no internet connection is available), we are denied starting in "Offline Mode".


6) Surprise, surprise, there is a problem with the internet connection… We knew that, but I am now being denied access to any of my PURCHASED ($$$) games delivered by Steam.



Surely this is normally a temporary inconvenience, but I can't play games I've paid for for four days, plus however long the internet connection was actually down while I was away… It's not Valve or Steam's fault, but it is their problem.


Typemock’s ASP.NET Bundle released for the masses (and there was much rejoicing!)

Unit Testing ASP.NET? ASP.NET unit testing has never been this easy.

Typemock is launching a new product for ASP.NET developers – the ASP.NET Bundle – and for the launch will be giving out FREE licenses to bloggers and their readers.

The ASP.NET Bundle is the ultimate ASP.NET unit testing solution, offering both Typemock Isolator, a unit test tool, and Ivonna, the Isolator add-on for ASP.NET unit testing, for a bargain price.
Typemock Isolator is a leading .NET unit testing tool (C# and VB.NET) for many 'hard to test' technologies such as SharePoint, ASP.NET, MVC, WCF, WPF, Silverlight and more. Note that for unit testing Silverlight there is an open source Isolator add-on called SilverUnit.

The first 60 bloggers who blog this text and tell us about it will get a free Isolator ASP.NET Bundle license (Typemock Isolator + Ivonna). If you post this on an ASP.NET-dedicated blog, you'll get a license automatically (even if more than 60 submit) during the first week of this announcement.
Also, 8 bloggers will each get an additional 2 licenses to give away to their readers/friends.
Go ahead, click the following link for more information on how to get your free license.

Thoughts on Unit Tests for Stored Procedures

So very recently I was approached by a team member to write some unit tests around a stored procedure that was being modified. I have never done any unit tests for a stored proc, so I was intrigued. I wasn't really sure where I was going when I started, but I opted to write a small console app in C# to prove it out.

For the sake of discussion, let's say the stored procedure takes a single parameter, a US zip code, and returns a record set with a single record containing five columns that describe time-zone information for that zip code. I really couldn't ask for a better test subject for a first pass: not too big, not too complicated.

My first inclination was to approach it like unit testing any other unit of code. So I figured I would need some variety of data store for my expected values and stored proc inputs. At this point it occurred to me, "Hey, I have a database full of inputs and expected results right here," so I crafted a quick query to get my expected info. From that, I called the stored proc for each zip and verified the results. Worked like a champ. However, I was unaware that the stored proc refactoring had already been done on the database I was working against. It was around this point that I started getting a sinking feeling that something wasn't right with the test. I attributed it to not being sure what to expect, as this was a new testing scenario for me (typically the database developers are in charge of the development and testing of their own code; while I can write SQL with the best of 'em, a DBA I am not), and moved on.

Well, OK: I now have the behavior of the new code, the "Actual" results in testing terms. Now I can simply aim my program at the old database with the old code base, get the expected results, and modify my app to compare them. And then I discovered that the underlying data structure for the old code was different. The old way, the SP selects the five columns I need by joining three tables and then limiting on the passed-in zip code. The new way, a nightly job populates a new "Time Zone" table, and the SP simply selects all columns from the new table, limiting by zip code.

So finally, it hits me… My test, as it stands, really only verifies that I get a result for every zip code I ask for, and that the SP selected the same row I got my expected result from. And maybe that's exactly what it should be testing; it does add some value… And yet it feels empty.

What I ended up doing was to retrieve the master list of zip codes from the old way's table, run both the new and old SP for each zip code, and compare the results. This test felt better: the underlying data structures driving the stored proc could change, and the test would stay isolated and continue to work as long as the SP's calling signature stayed the same.
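That comparison can be sketched in T-SQL itself. This is a minimal, hypothetical sketch, not the actual code: the procedure name (GetTimeZoneByZip), the database names (OldDb, NewDb), the ZipCode master table, and all five column names are stand-ins for whatever the real schema uses.

```sql
-- Hypothetical sketch: run the old and new versions of the stored proc
-- for every zip code and flag any rows where the results differ.
-- Column names are placeholders for the five real columns.
CREATE TABLE #OldResult (ZipCode CHAR(5), TimeZoneName VARCHAR(50),
                         UtcOffset INT, ObservesDst BIT, TimeZoneAbbrev CHAR(4));
CREATE TABLE #NewResult (ZipCode CHAR(5), TimeZoneName VARCHAR(50),
                         UtcOffset INT, ObservesDst BIT, TimeZoneAbbrev CHAR(4));

DECLARE @Zip CHAR(5);
DECLARE ZipCursor CURSOR FOR
    SELECT ZipCode FROM OldDb.dbo.ZipCode;  -- master list from the old structure

OPEN ZipCursor;
FETCH NEXT FROM ZipCursor INTO @Zip;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Capture each version's single-row result set.
    INSERT INTO #OldResult EXEC OldDb.dbo.GetTimeZoneByZip @Zip;
    INSERT INTO #NewResult EXEC NewDb.dbo.GetTimeZoneByZip @Zip;
    FETCH NEXT FROM ZipCursor INTO @Zip;
END
CLOSE ZipCursor;
DEALLOCATE ZipCursor;

-- Any zip whose old and new rows disagree shows up here; an empty
-- result means the two versions of the SP behave the same.
SELECT * FROM #OldResult
EXCEPT
SELECT * FROM #NewResult;
```

Because the comparison only touches the procedures' calling signatures and result sets, it keeps working even if either version's underlying tables change.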

The rub is, I could not have written my new test app without the new SP being developed. So what would I have been testing in its first iteration, had I written tests for it at the time? I would have written my original test, which ultimately grabs the expected data from the same database the stored proc does.

What I should have done was adapt my initial test to work the same way with the old data and verify the stored proc returned the right data in my five columns. Then I should add a test to verify the data in the new table matches the data from the old data structure. (This test would typically be the responsibility of the db job developers.) Then I should modify my test to get its expected data from the new (and tested) structures and continue to verify the SP results against that data.
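That middle step, checking that the nightly-populated table agrees with the old three-table join, is a set-based query. Again a hypothetical sketch: the table names, join path, and column names below are invented for illustration; the real three-table join would come from the old schema.

```sql
-- Hypothetical sketch: the nightly-populated TimeZone table should hold
-- exactly the rows produced by the old three-table join. Rows coming back
-- from either EXCEPT mean the two structures disagree.

-- Rows in the new table that the old join does not produce:
SELECT ZipCode, TimeZoneName, UtcOffset, ObservesDst, TimeZoneAbbrev
FROM dbo.TimeZone                                   -- new nightly-populated table
EXCEPT
SELECT z.ZipCode, t.TimeZoneName, t.UtcOffset, t.ObservesDst, t.TimeZoneAbbrev
FROM dbo.ZipCode AS z                               -- hypothetical old join path
JOIN dbo.CountyTimeZone AS c ON c.CountyId = z.CountyId
JOIN dbo.TimeZoneInfo   AS t ON t.TimeZoneId = c.TimeZoneId;

-- And the reverse direction, to catch rows missing from the new table:
SELECT z.ZipCode, t.TimeZoneName, t.UtcOffset, t.ObservesDst, t.TimeZoneAbbrev
FROM dbo.ZipCode AS z
JOIN dbo.CountyTimeZone AS c ON c.CountyId = z.CountyId
JOIN dbo.TimeZoneInfo   AS t ON t.TimeZoneId = c.TimeZoneId
EXCEPT
SELECT ZipCode, TimeZoneName, UtcOffset, ObservesDst, TimeZoneAbbrev
FROM dbo.TimeZone;
```

Both directions are needed: the first catches extra or changed rows in the new table, the second catches rows the nightly job failed to populate.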

When unit testing code, you tend to have to "mock" out the data calls to control your tests, so going from avoiding all database calls to requiring the database for everything is a bit of a context switch. But the constant in both is testing the discrete parts from the bottom up. In retrospect, I could have done the test completely in SQL, and suddenly the MS "Database Unit Test" makes a lot more sense.